Thanks for the responses. Some comments and further details
follow.
A couple of people suggested changing the fixed effect L (with 6101 levels)
to a random effect. However, the data represent only about 24000/(51*6101)
= 7.7% of all possible combinations of the H and L factors. With this many
missing combinations, changing L from a fixed effect to a random effect
results in shrinkage of the effect estimates. No good -- the estimates of
the L effects will be used in subsequent analyses that are sensitive to
shrinkage.
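To make the shrinkage concern concrete, here is a minimal sketch (toy data, not my actual data set) comparing unshrunken fixed-effect estimates from lm() with the shrunken BLUPs from lmer() -- the spread of the BLUPs is noticeably smaller:

```r
library(lme4)

set.seed(1)
# Toy data: 20 groups with only 3 observations each, so shrinkage is visible
g <- factor(rep(1:20, each = 3))
y <- rnorm(20, sd = 5)[g] + rnorm(60)

# Fixed-effect estimates (no shrinkage), centered for comparison
fix <- coef(lm(y ~ 0 + g))
fix <- fix - mean(fix)

# Random-effect predictions (BLUPs) are pulled toward zero
ran <- ranef(lmer(y ~ 1 + (1 | g)))$g[, 1]

# The BLUPs have smaller spread than the fixed-effect estimates
sd(ran) < sd(fix)
```

Downstream analyses that treat the level estimates as data would inherit that compression.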
Martin Maechler suggested using lmer2( , sparseX=TRUE) from the lme4a
package. I tried to find this package, but it does not appear to be here:
http://r-forge.r-project.org/src/contrib/
Perhaps there is a build problem on R-forge? (There is a subversion branch
labeled 'allcoef' -- is it that one?)
I found another way to fit this model. Using asreml I was able to specify L
as a random effect and fix its variance at a "large" value. The estimates
of L are then nearly identical to the estimates when L is fit as a fixed
effect. (This is a trick I learned from Arthur Gilmour). If lmer can fix
variances, this might work. I googled around for a while and found a couple
of similar questions, but no answers. Apparently it is not possible to fix
variances with lmer -- if it is possible, please show me how.
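For what it's worth, here is a sketch of how the fixed-variance trick might be emulated with lme4's modular interface (this assumes a version of lme4 that exports lFormula/mkLmerDevfun/mkMerMod; the toy data and the theta value 100 are arbitrary choices of mine, not anything from asreml):

```r
library(lme4)

# Sketch: build the lmer deviance function, then evaluate it at a fixed
# theta instead of optimizing over it. theta is the random-effect sd
# relative to the residual sd, so a "large" theta (here 100, i.e. a
# variance ratio of 1e4) makes the random effect behave almost like a
# fixed effect.
set.seed(2)
sim <- data.frame(g = factor(rep(1:10, each = 5)))
sim$y <- rnorm(10, sd = 3)[sim$g] + rnorm(50)

lf  <- lFormula(y ~ 1 + (1 | g), data = sim)
dev <- do.call(mkLmerDevfun, lf)
opt <- list(par = 100, fval = dev(100), conv = 0)  # theta held at 100
m   <- mkMerMod(environment(dev), opt, lf$reTrms, fr = lf$fr)
ranef(m)$g  # nearly unshrunken estimates of the g effects
```

With theta held at a large value, the BLUPs should be close to the fixed-effect estimates, which is exactly the behavior of the asreml trick.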
Several people suggested that I look into memory issues. However, since I
was able to fit the model with asreml on the same machine, the real question
is: why can lmer not fit a model that asreml can? Is there a trick to
fitting the model that I don't know? I'll try lmer2( , sparseX=TRUE) when it
becomes available.
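In the meantime, one memory-side workaround I have been considering (a sketch using the Matrix package, not something I have run against my real data) is to build the model matrix for the huge L factor in sparse form, since the dense model.matrix() call is what exhausts memory:

```r
library(Matrix)

# Sketch: a dense model matrix for a many-level factor is mostly zeros;
# the sparse version occupies kilobytes instead of gigabytes.
set.seed(3)
d <- data.frame(L = factor(sample(1:500, 5000, replace = TRUE)))
d$y <- rnorm(500, sd = 10)[d$L] + rnorm(5000)

X <- sparse.model.matrix(~ L, data = d)  # sparse, never densified
object.size(X)

# Sparse least squares for the fixed L effects via the normal equations
beta <- solve(crossprod(X), crossprod(X, d$y))
```

This only recovers the fixed L effects, of course -- it does not by itself give the mixed-model fit with H random.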
Kevin
On Thu, Aug 6, 2009 at 1:56 PM, Kevin W wrote:
> I have a simple model that appears to be too big for lmer (on a machine
> with 2GB of memory). I _can_ fit the model with asreml, but I would like
> to make a comparison with lmer. Simulated data is used below, but I have
> real data that causes the same problem.
>
> set.seed(496789)
> dat <- data.frame(H=sample(1:51, 24000, replace=TRUE),
> L=sample(1:6101, 24000, replace=TRUE))
> Heff <- rnorm(51, sd=sqrt(40))
> Leff <- rnorm(6101, sd=sqrt(1200))
> err <- rnorm(24000, sd=10)
> dat$y <- 100+Heff[dat$H] + Leff[dat$L] + err
> dat <- transform(dat, H=factor(H), L=factor(L))
> str(dat)
> library(lattice)  # for bwplot
> bwplot(y~H, dat)  # Looks right
>
> Using asreml recovers the variance components almost exactly:
>
> m1 <- asreml(y~1, data=dat, sparse=~L, random=~H)
>
> summary(m1)$varcomp
> component std.error z.ratio constraint
> H 50.96058 10.249266 4.972121 Positive
> R!variance 100.07324 1.056039 94.762853 Positive
>
> Now try lmer:
>
> m0 <- lmer(y~1+L+(1|H), data=dat)
>
> Error: cannot allocate vector of size 1.1 Gb
> In addition: Warning messages:
> 1: In model.matrix.default(mt, mf, contrasts) :
> Reached total allocation of 1535Mb: see help(memory.size)
>
> Am I pushing lmer past its limits (given the 2GB of memory) or is there a
> way to make this fit?
>
>
> Kevin Wright
>
>