[R-sig-ME] problems with allocate memory with lme4 package

cumuluss at web.de
Tue Dec 20 01:19:55 CET 2011


Hi Andrew,

thank you for your reply. I already tried the first two tips, but got the same error. The third is unfortunately not possible with my data structure.
Because of my initial difficulties with the forum registration, I unfortunately double-posted. In the other thread, Douglas Bates answered and also had an idea.

Thanks again.
Regards
Paul


------- Original Message --------
> Date: Mon, 19 Dec 2011 12:16:33 +0000
> From: Andrew Crowe <Andrew.Crowe at fera.gsi.gov.uk>
> To: "r-sig-mixed-models at r-project.org" <r-sig-mixed-models at r-project.org>
> Subject: Re: [R-sig-ME] problems with allocate memory with lme4 package

> Paul
>
> While I'm not an expert on the lmer function, you might want to try the
> following:
>
> 1) Check that you have a 64-bit install of R/lmer (not just 32-bit running
> on a 64-bit system)
> 2) Try extending the memory limit using memory.limit()
> 3) Even with a complex model, 3 million rows is a lot of data.
> Could this be reduced by subsampling, and then using the 'removed' data to
> validate the model?
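The three tips above can be sketched in R roughly as follows. Note that memory.limit() is Windows-only, and the data frame `dat` below is a small hypothetical stand-in for the real 3-million-row data:

```r
# Tip 1: confirm the R session itself is a 64-bit build
# (32-bit R on 64-bit Windows is capped at ~4 GB regardless of installed RAM).
# On a 64-bit build a pointer is 8 bytes.
print(.Machine$sizeof.pointer)   # 8 on 64-bit R, 4 on 32-bit R
print(R.version$arch)            # e.g. "x86_64" on a 64-bit build

# Tip 2 (Windows only): inspect and raise the per-session memory ceiling.
if (.Platform$OS.type == "windows") {
  print(memory.limit())          # current limit in MB
  # memory.limit(size = 16000)   # e.g. raise toward 16 GB if the OS allows
}

# Tip 3: fit on a random subsample, keep the rest for validation.
# `dat` is a hypothetical stand-in for the full data set.
dat <- data.frame(y = rbinom(1000, 1, 0.5), x = rnorm(1000))
set.seed(1)
idx <- sample(nrow(dat), size = floor(0.1 * nrow(dat)))  # 10% for fitting
fit_dat <- dat[idx, ]    # fit the model on this subset
val_dat <- dat[-idx, ]   # hold this out to validate the fitted model
```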
>
> Regards
>
> Andrew
>
> Dr Andrew Crowe
> Senior Land Use Change Scientist
> Food and Environment Research Agency
> Sand Hutton
> York
> UK
>
> Email. Andrew.Crowe at fera.gsi.gov.uk
>
>
> -----Original Message-----
> From: r-sig-mixed-models-bounces at r-project.org
> [mailto:r-sig-mixed-models-bounces at r-project.org] On Behalf Of cumuluss at web.de
> Sent: 19 December 2011 11:39
> To: r-sig-mixed-models at r-project.org
> Subject: [R-sig-ME] problems with allocate memory with lme4 package
>
> Hi everyone,
>  
> I have been trying to fit a GLMM with a binomial error structure using the
> lmer function. My model is somewhat complex: 8 continuous predictor
> variables (one of them entered as a nonlinear term) and 5 categorical
> predictor variables with some three-way interactions between them.
> Additionally, I have 3 random effects and one offset variable in the
> model. The number of observations is greater than 3 million.
> I'm working with the latest version of R (2.14.0) on a 64-bit Windows
> system with 8 GB of RAM.
> Nothing I tried (reducing model complexity, a different 64-bit PC with
> even more memory) leads to a fitted model; the error always occurs:
> cannot allocate vector of size 2GB.
> Is there anything I can do? I would be very grateful for any comments.
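A stripped-down sketch of the kind of model described might look like the following. Every variable name here is invented purely for illustration, and the data are simulated at a small size; note also that for a binomial family the lme4 call is glmer() rather than lmer():

```r
library(lme4)

# Hypothetical simulated data mirroring the described structure:
# continuous predictors (one entered nonlinearly via poly()),
# interacting factors, three random-intercept grouping factors,
# an offset, and a binary response.
set.seed(42)
n <- 2000
dat <- data.frame(
  x1 = rnorm(n), x2 = rnorm(n), x3 = rnorm(n),
  f1 = factor(sample(c("a", "b"), n, replace = TRUE)),
  f2 = factor(sample(c("u", "v"), n, replace = TRUE)),
  f3 = factor(sample(c("p", "q"), n, replace = TRUE)),
  g1 = factor(sample(1:20, n, replace = TRUE)),
  g2 = factor(sample(1:15, n, replace = TRUE)),
  g3 = factor(sample(1:10, n, replace = TRUE)),
  exposure = runif(n, 1, 10)
)
dat$y <- rbinom(n, 1, plogis(0.5 * dat$x1 - 0.3 * dat$x2))

# Binomial GLMM: poly() gives the nonlinear term, f1*f2*f3 the three-way
# interaction, and the three (1 | g) terms the random intercepts.
fit <- glmer(
  y ~ poly(x1, 2) + x2 + x3 + f1 * f2 * f3 +
    (1 | g1) + (1 | g2) + (1 | g3) + offset(log(exposure)),
  data = dat, family = binomial
)
```

The size of the reported allocation failure is plausible from simple arithmetic: the dense fixed-effects model matrix alone, with factor dummies and three-way interaction columns, can easily reach ~90 columns, and 3e6 rows × 90 columns × 8 bytes per double ≈ 2.2 GB, consistent with the "cannot allocate vector of size 2GB" error.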
>
> Paul T.
>
> _______________________________________________
> R-sig-mixed-models at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-sig-mixed-models
>



