[R] memory limits in Windows

johnston at stat.ubc.ca
Wed Mar 8 20:45:44 CET 2006


Hello,

I apologize in advance: I know variations of this question come up often, but I
have searched both the archives and the rw-FAQ without finding a solution to my
problem.  I am trying to fit some generalized additive mixed models (using the
mgcv package) to some large datasets, and I get error messages about memory
allocation.  A typical dataset has around 45,000 observations contributed by
about 2,200 individuals (so roughly 20 observations per individual), with
within-individual errors modeled as AR(1).  I can currently run the models on a
random subset of about 25% of the data in a reasonable amount of time, so I
believe memory is the only major obstacle.  A stripped-down sketch of the call
appears below.
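
For reference, the essential shape of what I am running is the following (the
data frame and variable names are placeholders for illustration, not my real
ones):

    library(mgcv)  # gamm(); mgcv loads nlme, which provides corAR1()

    ## 'dat' stands in for my real data frame, with columns:
    ##   y    - response
    ##   x    - covariate entering as a smooth
    ##   time - within-individual observation order
    ##   id   - individual identifier (factor, ~2200 levels)
    fit <- gamm(y ~ s(x),
                data = dat,
                correlation = corAR1(form = ~ time | id))

    ## The 25% subset that does fit in memory is drawn by individual:
    keep <- sample(levels(dat$id), round(0.25 * nlevels(dat$id)))
    fit.sub <- gamm(y ~ s(x),
                    data = dat[dat$id %in% keep, ],
                    correlation = corAR1(form = ~ time | id))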

I have used the "--max-mem-size" command-line option, set to the maximum
allowable, which is 3Gb (anything larger produces a message at startup saying
the value is too large and will be ignored).  I also run the R command
memory.limit(4095), again the maximum allowed without triggering the "don't be
silly" message.
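
Concretely, this is roughly my setup (the exact executable name may differ
depending on how R is launched).  I start R with

    Rgui.exe --max-mem-size=3Gb

and then, from within R, check and raise the limit:

    memory.limit(size = 4095)   # request the maximum; value is in Mb
    memory.limit()              # report the current limit
    memory.size(max = TRUE)     # maximum Mb obtained from the OS so far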

I am running this on a brand-new computer with 64-bit Windows XP and 4GB of RAM
(it was bought primarily to be able to handle these models!).  Do I have any
options for increasing the memory available to R?  Any advice would be hugely
appreciated.

Thanks so much for any input.

Karissa Johnston



