[R] lme, corARMA and large data sets
Peter Wandeler
p_wandeler at gmx.ch
Thu Apr 14 14:12:15 CEST 2005
I am currently trying to get an "lme" analysis running that corrects for the
non-independence of residuals (using e.g. corAR1 or corARMA) in a large data
set (>10000 obs), with one independent variable (lgeodisE) and one dependent
variable (gendis). Previous attempts using SAS failed. In addition, we were
told by SAS that our data set was too large to be handled by this procedure
anyway (!!).
SAS script:
proc mixed data=raw method=reml maxiter=1000;
model gendis=lgeodisE / solution;
repeated /subject=intercept type=arma(1,1);
run;
So I turned to R. Being a complete R newbie, I have not yet managed to fit
exactly the same model in R, even on a reduced data set.
R command line (using "dummy" as a constant dummy grouping variable, since
lme requires a random-effects grouping; data frame named as in the SAS step):

> model.ARMA <- lme(gendis ~ lgeodisE, data = raw,
                    correlation = corARMA(p = 1, q = 1),
                    random = ~ 1 | dummy)
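A possibly closer equivalent of the SAS model (which has no true random
effect, only the repeated-measures correlation) would be gls() from the nlme
package, which fits a correlation structure without needing a dummy grouping
variable. A sketch, assuming the data frame is called raw as in the SAS step:

library(nlme)
# generalized least squares with an ARMA(1,1) residual correlation,
# fitted by REML as in proc mixed
model.gls <- gls(gendis ~ lgeodisE, data = raw,
                 correlation = corARMA(p = 1, q = 1),
                 method = "REML")
summary(model.gls)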
Furthermore, memory allocation problems occurred again on my 1 GB RAM desktop
during some trials with larger data sets.
Can anybody help?
Cheers,
Peter