[R] memory limit

Ott Toomet otoomet at econ.dk
Fri Mar 14 09:21:55 CET 2003


Dear Kristina,

There are some general things one should always try when running into
memory problems.  Have you tried to:

1) check how much memory R really takes.  Look at gc() and
   object.size(), and at system tools which show how much memory a
   process is using (like top on Unix).  Try mem.limits(), which tells
   you whether any limits are currently in effect (see the first
   sketch below).

2) run your analysis on a subset of your data.  Perhaps what you
   really want to do is something less demanding?  This also gives you
   an idea of how much the memory consumption depends on the size of
   your data (see the second sketch below).

3) It may well be that the memory on your computer is getting
   fragmented.  Try another OS (Unix/Linux should behave better), or
   reboot your computer and try again.
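
As a minimal sketch of point 1 (the object `x' below is only a
stand-in for one of your own data objects, so substitute your real
data):

  gc()                # report how much memory R has allocated and used
  x <- rnorm(1e6)     # stand-in for a large object of your own
  object.size(x)      # size of a single object, in bytes
  mem.limits()        # any limits set via --max-vsize / --max-nsize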
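
And a sketch of point 2, assuming your data sit in a data frame called
`dat' (the name and the subset sizes are only examples):

  for (n in c(1000, 5000, 10000)) {
    sub <- dat[1:n, ]           # work on the first n rows only
    print(object.size(sub))     # watch how the size grows with n
    ## ... run your analysis on `sub' and keep an eye on gc()/top ...
  }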

Perhaps this helps.

Ott

 | From: "Kristina Hanspers" <khanspers at gladstone.ucsf.edu>
 | Date: Thu, 13 Mar 2003 14:11:59 -0800
 | 
 | Hi,
 | 
 | I get an error saying "Cannot allocate vector of size 71289kb".  So I
 | tried to increase memory with memory.limit(size=3000000000), and I also
 | tried other numbers.  Each time I get the message NULL, and then I still
 | get the same error as above.  I'm using Windows 2000.  The system has
 | 1 GB RAM and a 1.6 GHz processor.  I was only running R, trying to use a
 | BioConductor package.  I appreciate your answer.  Thanks,
 | 
 | Kristina


