[R] memory problems in NLME

Douglas Bates bates at stat.wisc.edu
Sun Mar 30 22:13:46 CEST 2003


"Vumani Dlamini" <dvumani at hotmail.com> writes:

> I am trying to fit a random coefficient model with about 30 covariates
> with a random intercept and one random slope. The data set has 65000
> observations, and whenever I use lme I get the message that all memory
> has been used.
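
For concreteness, a call of the kind being described might look like the
sketch below (the variable and data names here are hypothetical, not
from the original post):

    library(nlme)
    ## random coefficient model: ~30 fixed covariates in practice,
    ## with a random intercept and one random slope per group
    fit <- lme(fixed = y ~ x1 + x2 + x3,
               random = ~ 1 + x1 | subject,
               data = bigdata)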

Do you know how many columns the model matrix for the fixed effects
will have?  You say you have 30 covariates, but if some of those are
factors, or if you take powers of continuous covariates or
interactions between terms, the number of columns in the model matrix
can be much larger than 30.  Given the dimensions you mention, the
fixed-effects model matrix alone is nearly 16 MB or larger: 65000
rows by 30 columns of 8-byte doubles is about 15.6 MB.  Evaluating
the log-likelihood requires 3 or 4 copies of matrices like this, plus
the original data frame and the memory used by other R objects.
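
One way to check this directly is to build the fixed-effects model
matrix by itself and inspect its dimensions and size; a minimal sketch
(the formula and data names are hypothetical):

    ## factors expand to one column per non-reference level
    X <- model.matrix(~ x1 + x2 + factor(region), data = bigdata)
    dim(X)                              # rows x columns
    print(object.size(X), units = "Mb") # storage for this one copy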

> I was wondering whether there is a more efficient way of fitting the model.

Saikat DebRoy and I are working on a new package for lme and related
functions using S4 classes.  That package, which we plan to release in
'snapshot' form shortly after R-1.7.0 comes out (scheduled for April
16, 2003), controls the number of copies of the model matrices that
are created.

I can run this example with both the old and the new package and
provide comparisons if you wish.  I use a machine with 1 GB of memory,
which should be enough.  Please contact me off-list.


