[R] Heteroskedasticity and autocorrelation of residuals

araiss anas.raiss at gmail.com
Mon Dec 27 14:17:23 CET 2010


Hello everyone,
I'm currently working on a linear model Y = a0 + a1*X1 + ... + a7*X7 +
residuals, and I know that this model exhibits both heteroskedasticity
(detected with the Breusch-Pagan test and the White test) and autocorrelation
of the residuals (detected with the Durbin-Watson test). Since this model is
ultimately meant to be used for predictions, I would like to remove both the
heteroskedasticity and the residual autocorrelation.
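For reference, this is roughly how I run the diagnostics, using the lmtest
package and assuming a data frame dat with columns Y and X1..X7 (these names
are just placeholders for my actual data):

library(lmtest)

fit <- lm(Y ~ X1 + X2 + X3 + X4 + X5 + X6 + X7, data = dat)

bptest(fit)   # Breusch-Pagan test for heteroskedasticity
dwtest(fit)   # Durbin-Watson test for first-order residual autocorrelation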

What I've done so far (see the sketch after this list):
- I used the sandwich package (function vcovHAC) together with the coeftest
function and was able to compute corrected standard errors for my coefficient
estimates. However, the coefficients themselves remain the same; only the
standard errors change, and what I'm actually looking for is a way to obtain
a new set of coefficients.
- I tried another approach: taking the variance/covariance matrix returned by
vcovHAC and plugging it into the generalized least squares formula to compute
new coefficient estimates. This didn't work: the residuals were still
autocorrelated and the model still showed heteroskedasticity, although the
plot of residuals vs. fitted values looked slightly better.
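The first approach looks roughly like this (a sketch only; fit and dat are
the same placeholders as above):

library(sandwich)
library(lmtest)

# HAC-robust inference: the point estimates stay the same,
# only the standard errors (and the t / p values) change
coeftest(fit, vcov = vcovHAC(fit))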

Would any of you know how to:
- Determine a new set of coefficients for my linear model (possibly using
generalized least squares estimation)? I assume I need to choose a new
weighting for my observations. If so, how do you find it for a model with
7 variables? (A rough sketch of what I have in mind follows below.)
- Remove the autocorrelation of my residuals? I've also tried applying
several transformations to Y (log, power, ...), but that didn't solve the
problem either.
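For the GLS idea, something along these lines is what I have in mind, using
the nlme package (only a sketch; the AR(1) correlation structure and the
power-of-the-mean variance function are assumptions I have not verified for
my data):

library(nlme)

fit_gls <- gls(Y ~ X1 + X2 + X3 + X4 + X5 + X6 + X7,
               data = dat,
               correlation = corAR1(),  # first-order autocorrelation of errors
               weights = varPower())    # variance as a power of the fitted values
summary(fit_gls)                        # new set of coefficient estimates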

Thank you very much for your help!
A.