[R] running out of memory while running a VERY LARGE regression

Prof Brian Ripley ripley at stats.ox.ac.uk
Wed Nov 23 10:06:31 CET 2005


On Tue, 22 Nov 2005, t c wrote:

> I am running a VERY LARGE regression (many factors, many rows of data) 
> using lm.
>
>  I think I have my memory set as high as possible. ( I ran 
> "memory.limit(size = 4000)")
>
>  Is there anything I can do?  ( FYI, I "think" I have removed all data I 
> am not using, and I "think" I have only the data needed for the 
> regression loaded.) Thanks.

Since you mention memory.limit, I guess you are using Windows without 
telling us.  If so, have you set up the /3GB switch (see the rw-FAQ Q2.9) 
and modified the R executables?  (The modification is not necessary if you 
use the current R-patched available from CRAN.)
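As a quick check (these calls are Windows-only):

    memory.limit()            # current limit, in Mb
    memory.size(max = TRUE)   # maximum Mb R has obtained from the OS so far

If memory.limit() reports 4000 but allocations fail well below that, the 
constraint is the 32-bit process address space rather than R's setting, 
which is what the /3GB switch addresses.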

You will be able to save memory by using lm.fit rather than lm, perhaps 
running a session containing just the model matrix and the response.
(Unless of course you run out of memory forming the model matrix.)
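Something like this (a minimal sketch, assuming for illustration a data 
frame 'dat' with response 'y' and factor predictors 'f1' and 'f2'):

    X <- model.matrix(~ f1 + f2, data = dat)  # form the model matrix once
    y <- dat$y
    rm(dat); gc()                             # keep only X and y in the session
    fit <- lm.fit(X, y)                       # plain QR fit
    fit$coefficients

lm.fit skips the model-frame bookkeeping that lm stores with the fitted 
object, so the only large objects left in the session are X and y.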

The best answer is to use a 64-bit OS and a 64-bit build of R.

> 	[[alternative HTML version deleted]]

> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html

Please do as it asks: tell us your OS, do not send HTML mail, and report 
the exact problem with the error messages.

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595



