[R] memory limits in Windows
Prof Brian Ripley
ripley at stats.ox.ac.uk
Thu Mar 9 10:29:09 CET 2006
Is your OS set to allow a 3GB user address space for processes?
Since you are running a 32-bit executable under `Windows XP-64' I don't
know whether that is possible or how to do it, so please ask whoever
advised you to buy that OS.
You might want to send some sample code to the mgcv maintainer to see if
he has any suggestions for how to do this more frugally.
On Wed, 8 Mar 2006, johnston at stat.ubc.ca wrote:
> Hello,
>
> I apologize, since I know that variations of this question come up often, but I
> have searched both the archives and the rw-FAQ, and haven't had any luck solving
> my problem. I am trying to run some generalized additive mixed models (using
> the mgcv library) with some large datasets, and get error messages regarding
> memory allocation. An example dataset has around 45,000 points contributed by
> about 2200 individuals (so about 20 observations per individual). Errors within
> individuals are modeled as AR(1). Currently, I can run the models on a random
> subset of about 25% of the data in a reasonable amount of time, so I think that
> memory is the only major issue here.
>
> I have used the "--max-mem-size" command line option, and set it to the maximum
> allowable, which is 3GB (any larger and I get a message that it is too large and
> is ignored when I open R). I also run the R command memory.limit(4095), again
> the maximum allowable without receiving a "don't be silly" message.
>
> I am running this on a brand new computer with Windows XP-64 OS and 4GB RAM (it
> was bought primarily to be able to handle these models!) Do I have any options
> to increase memory? Any advice would be hugely appreciated.
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Professor of Applied Statistics, http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595