[R] generalized least squares with empirical error covariance matrix
Andrew Schuh
aschuh at atmos.colostate.edu
Wed May 9 22:09:34 CEST 2007
I have a Bayesian hierarchical normal regression model, in which the
regression coefficients are nested, which I've wrapped into one
regression framework, y = X %*% beta + e . I would like to run data
through the model in a filter style (Kalman-filter-like), updating the
regression coefficients at each step as new data are gathered. After
the first filter step, I will need to be able to feed the non-diagonal
posterior covariance in as the prior for the next step. "gls" and "glm"
seem to be set up to handle structured error covariances, whereas mine
is purely empirical, driven entirely by the data. Solving explicitly
with "solve" is very sensitive to small values in the covariance matrix,
and I've only been able to get reliable results at the first step by
using weighted regression with lm(). Am I missing an obvious function
for linear regression with a correlated prior on the errors for the
updating steps? Thanks in advance for any advice.
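
For concreteness, here is a rough sketch of the kind of update I have in
mind: whiten the system with the Cholesky factor of the error covariance
rather than calling solve() on it directly, and fold the Gaussian prior
on beta in as pseudo-observations. The function and variable names
(gls_chol, bayes_update, b0, V0, Sigma) are just placeholders, not
anything established:

## GLS by Cholesky whitening: Sigma = t(R) %*% R, so premultiplying the
## system by t(R)^{-1} turns it into ordinary least squares with iid errors.
gls_chol <- function(y, X, Sigma) {
  R   <- chol(Sigma)                        # upper-triangular factor of Sigma
  yw  <- backsolve(R, y, transpose = TRUE)  # whitened response
  Xw  <- backsolve(R, X, transpose = TRUE)  # whitened design matrix
  fit <- lm.fit(Xw, yw)                     # plain least squares on whitened data
  cov <- chol2inv(chol(crossprod(Xw)))      # (X' Sigma^{-1} X)^{-1}
  list(coef = fit$coefficients, cov = cov)
}

## One filter-style update: prior beta ~ N(b0, V0), errors e ~ N(0, Sigma).
## Stacking the prior as p extra pseudo-observations gives the usual
## conjugate posterior mean and covariance for beta.
bayes_update <- function(y, X, Sigma, b0, V0) {
  n <- length(y)
  p <- length(b0)
  y_aug <- c(y, b0)
  X_aug <- rbind(X, diag(p))
  S_aug <- matrix(0, n + p, n + p)          # block-diagonal covariance
  S_aug[1:n, 1:n] <- Sigma
  S_aug[(n + 1):(n + p), (n + 1):(n + p)] <- V0
  gls_chol(y_aug, X_aug, S_aug)
}

The $cov returned by one call could then be passed in as V0 (and $coef as
b0) for the next batch of data, which is the filter-style updating I'm
after; the backsolve() step avoids the explicit solve() of the covariance
that has been giving me trouble.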