[R] size limits

Peter Dalgaard BSA p.dalgaard at biostat.ku.dk
Sun Jan 23 19:55:17 CET 2000

Jeff Miller <jdm at xnet.com> writes:

>     On the other hand, when I try to do the same on a PC (128M RAM,
>     400MHz) running Linux (Red Hat 6.1), on R version 0.90.0, I find
>     that it is impossible.  When I allocate (what I believe to be) the
>     maximum amount of vsize memory and a large amount of nsize memory,
>       R --vsize 200M --nsize 4000k,
>     and then try to read the file in using read.table() or scan(),
>       myData <- read.table(file = "mydata.dat")
>     or
>       myData <- scan(file = "myData.dat", what = list("",0,0,...,0))
>     (with 29 zeros), I get kicked out of R.
>      More worrisome, I did succeed in reading in a subset of the data with 30,000 rows.
>     However, when I tried to plot one of the columns, my monitor began blinking
>     wildly, and the machine crashed. I had to reboot.

You've probably come too close to the machine's capacity there. Linux
systems are often run without user limits on process size, so if you
eat too much memory, some random process will be killed, and with a
bit of bad luck it will be something critical such as your X
server... Notice that 200M vsize + 4000k nodes (20 bytes each) is
about 150M more than your physical memory, and with system processes
easily taking up 60M you'd need 200M of swap just to run. A quick
calculation suggests that your data alone take ~240M, so you really
need a bigger machine.
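As a rough sketch of that heap arithmetic (assuming 1M = 1024^2 bytes,
"4000k" = 4,000,000 cons cells, and the 20 bytes per cell mentioned
above; the exact multipliers R applies to the k/M suffixes may differ):

```python
# Back-of-the-envelope memory arithmetic for the quoted settings.
# Assumptions: 1M = 1024^2 bytes, "4000k" nodes = 4,000,000 cells,
# 20 bytes per cons cell (as stated in the post).
vsize = 200 * 1024**2        # --vsize 200M: the vector heap
ncells = 4000 * 1000         # --nsize 4000k: the cons-cell heap
heap = vsize + ncells * 20   # total R heap, in bytes

ram = 128 * 1024**2          # physical memory on the PC in question
print(f"R heap:       {heap / 1024**2:.0f}M")
print(f"beyond RAM:   {(heap - ram) / 1024**2:.0f}M")
```

This comes out at roughly 276M of heap, i.e. on the order of 150M more
than the 128M of physical memory, before counting the rest of the system.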

   O__  ---- Peter Dalgaard             Blegdamsvej 3  
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N   
 (*) \(*) -- University of Copenhagen   Denmark      Ph: (+45) 35327918
~~~~~~~~~~ - (p.dalgaard at biostat.ku.dk)             FAX: (+45) 35327907
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
