Priors for DLM coefficients #15

Open
devincaughey opened this issue Mar 5, 2017 · 0 comments

At present, xi and gamma are drawn from a prior centered on their values in the previous period. This made sense when the DLM did not include theta_bar[t-1], because it implied a prior belief that the coefficients in the cross-sectional model predicting theta_bar[t] were stable over time. Now that theta_bar[t-1] is in the model, however, xi and gamma serve a different purpose: xi[t] captures CHANGES in average opinion from t-1, and the gammas capture how these changes differ across groups. Centering the prior for these coefficients at their previous values thus implies a belief that trends in opinion are likely to persist (e.g., if everyone got more liberal last year, they probably did again this year). In some cases this may be reasonable, but in others it may impose too much stability. I suspect this is related to the fact that our model sometimes estimates an oscillating pattern in opinion (i.e., a delta_tbar less than 0), which may be its way of compensating for the hierarchical model's overly rigid insistence on persistent trends. A comprehensive fix may require a fundamental change to the structure of the DLM, but here are two more targeted solutions:
(1) Add an option to stipulate that xi and gamma should be drawn independently in each period even if separate_t = 0.
(2) Add an option to drop XX * gamma[t] entirely after the first period—i.e., no hierarchical model, just a DLM (though keeping xi).
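To make the contrast concrete, here is a minimal simulation sketch of the implied prior on xi alone. The parameter values (sigma_xi, T) and the simplification delta_tbar = 1 are illustrative assumptions, not values from the model; it just shows that a random-walk prior on xi (the period-to-period change) builds persistent trends into theta_bar, while independent draws per period, as in option (1), do not.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 20          # number of periods (illustrative)
sigma_xi = 0.5  # prior scale for xi innovations (illustrative)

# Current behavior: xi[t] ~ N(xi[t-1], sigma_xi), a random walk.
# Since xi[t] is the period-t CHANGE in average opinion, a large change
# last period implies a similar change this period -- trends persist.
xi_rw = np.cumsum(rng.normal(0.0, sigma_xi, T))

# Option (1): xi[t] ~ N(0, sigma_xi) independently in each period,
# so last period's trend carries no information about this period's.
xi_ind = rng.normal(0.0, sigma_xi, T)

# Implied trajectories of average opinion, taking delta_tbar = 1 for
# simplicity: theta_bar[t] = theta_bar[t-1] + xi[t].
theta_rw = np.cumsum(xi_rw)
theta_ind = np.cumsum(xi_ind)
```

Plotting theta_rw against theta_ind for a few seeds makes the difference visible: under the random-walk prior, theta_bar drifts in long sustained runs, which is the kind of rigidity that a negative delta_tbar might be compensating for.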
