[R] heap size trouble

Jim Lemon bitwrit at ozemail.com.au
Wed May 31 13:51:13 CEST 2000


karamian wrote:

...I want to load a file that contains 93 thousand rows and 22 columns of
data (essentially float)...

I just had to process over 199000 records, each with four numeric
values.  If I remember correctly, I used:

--vsize 30M  --nsize 500000

which pretty much ate all the RAM (64M) I had.  Don't forget to rm() big
data sets before you exit, or R will bomb when it next tries to restore
the saved workspace without the increased memory.  Just re-read them from
the data file when you need them again (and it helps to exit other apps
before starting R to avoid disk thrashing).
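
For example, a session might look like this (just a sketch; the file
name "bigdata.txt" and the read.table() arguments are placeholders for
your own data):

  # start R with an enlarged heap (R 1.x command-line options):
  #   R --vsize 30M --nsize 500000

  big <- read.table("bigdata.txt", header = TRUE)  # re-read each session
  summary(big)                                     # ... work with it ...
  rm(big)                                          # drop it before quitting
  q(save = "yes")                                  # keeps the saved workspace small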

Jim
