[R] increasing memory
Roger D. Peng
rpeng at jhsph.edu
Wed May 5 14:20:21 CEST 2004
Just a note: R 1.9.0 is the most recent version.
It sounds like your computer might just be running out of memory
and thrashing. 620MB is unfortunately not an enormous amount of
RAM to be using R with. These are the trade-offs. If you can't
read in your table using scan(), then we may be running out of R
options.
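For instance, something along these lines sometimes helps with a large
comma-separated file, if you know the column types in advance (the
column names and types below are only placeholders -- you'd adjust them
to match your file):

w <- scan("pedagogue.csv", sep = ",", skip = 1,   # skip the header row
          what = list(id = "", x1 = 0, x2 = 0, x3 = 0))  # "" = character, 0 = numeric
w <- as.data.frame(w)

Declaring the types up front spares R the type guessing that
read.table normally does, which is part of what makes read.table so
memory-hungry on big files.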
-roger
Janet Rosenbaum wrote:
>
>
>>If it actually crashes there is a bug, but I suspect that it stops with an
>>error message -- please do read the posting guide and tell us exactly what
>>happens.
>
>
> Sorry, I hadn't realized that "crash" means to give an error message on
> this mailing list.
>
> To me, "crash" means that the computer freezes entirely, or if I'm
> lucky it just runs for several hours without doing anything, and the
> process can't even be killed with -9, and the computer can't be
> shutdown, but has to be powercycled.
>
> For instance, I left it doing a read.table on a text-format file from this
> data (a few hundred megs) and eight hours later it was still "going".
> I watched the process with "top" for a while and the computer had plenty
> of free memory -- over 100 M this whole time, and R was using almost no
> CPU.
>
> I have tried all sorts of ways of reading in the data. It's best if I
> can read the xport file, since that has all the original labels, which
> don't make it into the text file, but read.xport actually freezes the
> computer.
>
> As I said, I am running R 1.8.1, which claims to be the most recent
> version (when I type is.RAqua.updated()), on an iBook G3/800 with 620 M
> RAM (the maximum) running Mac OS X 10.3.3.
>
> The command really doesn't matter much. These are totally normal files,
> and I can load the normal-sized files with the exact same
> commands.
>
>>w <- read.table("pedagogue.csv", header = TRUE, sep = ",")
>>library(foreign)
>>w <- read.xport("demagogue.xpt")
>
>
> The xpt files are up to 400 M, and the csv files are about 100 M.
>
> Janet
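One more thought: if you would rather stay with read.table, it tends to
be far less memory-hungry when you declare the column classes yourself
and give it an upper bound on the number of rows. A rough sketch,
assuming your read.table has the colClasses argument (the class vector
and the row count below are invented -- colClasses has to list one
class per column of your csv):

w <- read.table("pedagogue.csv", header = TRUE, sep = ",",
                comment.char = "",                  # don't scan for comments
                nrows = 600000,                     # any upper bound on the row count
                colClasses = c("character", rep("numeric", 20)))

With nrows known, R can allocate each column once instead of growing it
as it reads, and fixed colClasses avoid the character-to-type
conversion pass.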