[R-sig-ME] Too big for lmer?

ONKELINX, Thierry Thierry.ONKELINX at inbo.be
Fri Aug 7 10:32:16 CEST 2009


Dear Kevin,

This is probably due to the huge number of levels of L, which you put in
the fixed effects. lmer() calculates the correlation between all fixed
effects, and that requires a huge dense matrix in this case.
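A quick back-of-the-envelope check (my own arithmetic, using the dimensions of the simulated data quoted below) shows where the allocation failure comes from:

```r
# A dense model matrix for ~6100 dummy columns of L over 24000 rows,
# stored as doubles (8 bytes each), needs about 1.1 GiB on its own.
rows  <- 24000
cols  <- 6101
bytes <- rows * cols * 8
bytes / 1024^3  # ~1.09 GiB, matching the "cannot allocate 1.1 Gb" error
```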

Since you have far fewer random effects than fixed effects, it seems
more appropriate to me to interchange them.

m0 <- lmer(y~H + (1|L), data = dat)

Another option is to use them both as crossed random effects.

m1 <- lmer(y~(1|H)+(1|L), data=dat)
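For what it's worth, here is a self-contained sketch of the second option, assuming lme4 is installed and re-using the simulation from your message below. VarCorr() then reports the estimated variance components, which you can compare with the simulated values (H: 40, L: 1200, residual: 100):

```r
library(lme4)

# Re-run the simulation from the quoted message
set.seed(496789)
dat <- data.frame(H = sample(1:51, 24000, replace = TRUE),
                  L = sample(1:6101, 24000, replace = TRUE))
Heff <- rnorm(51, sd = sqrt(40))
Leff <- rnorm(6101, sd = sqrt(1200))
dat$y <- 100 + Heff[dat$H] + Leff[dat$L] + rnorm(24000, sd = 10)
dat <- transform(dat, H = factor(H), L = factor(L))

# Both grouping factors as crossed random effects: the random-effects
# model matrix is stored sparsely, so 6101 levels are no problem.
m1 <- lmer(y ~ (1|H) + (1|L), data = dat)
print(VarCorr(m1), comp = "Variance")
```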

HTH,

Thierry

----------------------------------------------------------------------------
ir. Thierry Onkelinx
Instituut voor natuur- en bosonderzoek / Research Institute for Nature
and Forest
Cel biometrie, methodologie en kwaliteitszorg / Section biometrics,
methodology and quality assurance
Gaverstraat 4
9500 Geraardsbergen
Belgium
tel. + 32 54/436 185
Thierry.Onkelinx at inbo.be
www.inbo.be

To call in the statistician after the experiment is done may be no more
than asking him to perform a post-mortem examination: he may be able to
say what the experiment died of.
~ Sir Ronald Aylmer Fisher

The plural of anecdote is not data.
~ Roger Brinner

The combination of some data and an aching desire for an answer does not
ensure that a reasonable answer can be extracted from a given body of
data.
~ John Tukey

-----Original message-----
From: r-sig-mixed-models-bounces at r-project.org
[mailto:r-sig-mixed-models-bounces at r-project.org] On behalf of Kevin W
Sent: Thursday, 6 August 2009 20:57
To: r-sig-mixed-models at r-project.org
Subject: [R-sig-ME] Too big for lmer?

I have a simple model that appears to be too big for lmer (on a machine
with 2GB of memory).  I _can_ fit the model with asreml, but I would
like to make a comparison with lmer. Simulated data is used below, but
my real data causes the same problem.

set.seed(496789)
dat <- data.frame(H=sample(1:51, 24000, replace=TRUE),
                  L=sample(1:6101, 24000, replace=TRUE))
Heff <- rnorm(51, sd=sqrt(40))
Leff <- rnorm(6101, sd=sqrt(1200))
err <- rnorm(24000, sd=10)
dat$y <- 100 + Heff[dat$H] + Leff[dat$L] + err
dat <- transform(dat, H=factor(H), L=factor(L))
str(dat)
library(lattice)  # for bwplot()
bwplot(y~H, dat)  # Looks right

Using asreml recovers the variance components almost exactly:

m1 <- asreml(y~1, data=dat, sparse=~L, random=~H)

summary(m1)$varcomp
           component std.error   z.ratio constraint
H           50.96058 10.249266  4.972121   Positive
R!variance 100.07324  1.056039 94.762853   Positive

Now try lmer:

m0 <- lmer(y~1+L+(1|H), data=dat)

Error: cannot allocate vector of size 1.1 Gb
In addition: Warning messages:
1: In model.matrix.default(mt, mf, contrasts) :
  Reached total allocation of 1535Mb: see help(memory.size)

Am I pushing lmer past its limits (given the 2GB of memory) or is there
a way to make this fit?


Kevin Wright


_______________________________________________
R-sig-mixed-models at r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models

The views expressed in this message and any annex are purely those of
the writer and may not be regarded as stating an official position of
INBO, as long as the message is not confirmed by a duly signed document.



