[R] Time lag Regression and Standard Error
laro
l_rohner at gmx.ch
Sun Sep 1 20:34:44 CEST 2013
Hi R Team
I've got the following problem: I'd like to run a time series regression of the following form.
Regression1:
A_t = α + β1·B_t + β2·B_{t-1} + β3·[(B_{t-2} + B_{t-3} + B_{t-4})/3] + ε_t
The B's are the input values and the A's are the output values; the
subscripts denote the lag.
The overall ("real") beta of this regression is β_real = β1 + β2 + β3.
First: How can I run the regression without manually lagging the B's?
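Not from the original post, but as a sketch of one possible approach: base R's embed() builds a matrix of a series and its lags in a single call, so the lagged regressors never have to be typed out by hand. All data, coefficient values, and variable names (Bt, Bt1, Bavg) below are simulated and chosen purely for illustration:

```r
set.seed(1)
n <- 200
B <- rnorm(n)

# embed(B, 5): row t holds (B_t, B_{t-1}, B_{t-2}, B_{t-3}, B_{t-4})
X <- embed(B, 5)
dat <- data.frame(
  Bt   = X[, 1],
  Bt1  = X[, 2],
  Bavg = rowMeans(X[, 3:5])   # (B_{t-2} + B_{t-3} + B_{t-4}) / 3
)

# Simulated response with arbitrary "true" coefficients 2, 0.5, 0.3, 0.2
dat$At <- 2 + 0.5*dat$Bt + 0.3*dat$Bt1 + 0.2*dat$Bavg +
  rnorm(nrow(dat), sd = 0.1)

fit <- lm(At ~ Bt + Bt1 + Bavg, data = dat)
summary(fit)
```

Packages such as dynlm also offer a lag operator inside the model formula itself, which may be more convenient for longer lag structures.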
And second: I need the standard error for β_real. How can I calculate it from
the information returned by lm() for Regression1? (I read something about the
delta method?)
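On the delta-method question: since β_real = β1 + β2 + β3 is a linear combination of the coefficients, the delta method collapses to the exact formula SE(β_real) = sqrt(c' V c), where V = vcov(fit) and c is the weight vector (0, 1, 1, 1). A self-contained sketch; the simulated data and variable names are my own assumptions, not from the post:

```r
set.seed(1)
n <- 200
B <- rnorm(n)
X <- embed(B, 5)                      # columns: B_t, B_{t-1}, ..., B_{t-4}
dat <- data.frame(Bt = X[, 1], Bt1 = X[, 2], Bavg = rowMeans(X[, 3:5]))
dat$At <- 2 + 0.5*dat$Bt + 0.3*dat$Bt1 + 0.2*dat$Bavg +
  rnorm(nrow(dat), sd = 0.1)
fit <- lm(At ~ Bt + Bt1 + Bavg, data = dat)

cvec <- c(0, 1, 1, 1)                 # skip the intercept, sum the three slopes
beta_real <- sum(cvec * coef(fit))    # point estimate of beta1 + beta2 + beta3
se_real   <- sqrt(as.numeric(t(cvec) %*% vcov(fit) %*% cvec))
beta_real
se_real
```

msm::deltamethod(~ x2 + x3 + x4, coef(fit), vcov(fit)) should return the same number here, because the function being propagated is linear; the hand-rolled c'Vc form just avoids the extra package dependency.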
Thanks a lot!
Kind regards