[R] increasing memory
Liaw, Andy
andy_liaw at merck.com
Wed May 5 14:45:17 CEST 2004
> From: Roger D. Peng
>
> Just a note, R 1.9.0 is the most recent version.
>
> It sounds like your computer might just be running out of memory
> and thrashing. 620MB of RAM is unfortunately not an enormous
> amount for working with R on files this size. These are the
> trade-offs. If you can't read in your table using scan(), then
> we may be running out of R options.
`Easy' ones, anyway. One can always use `lower level' tools like
connections and readLines() to read the file a chunk at a time...
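Something along these lines, say (just a sketch -- the file name,
chunk size, and per-chunk processing here are made up):

  con <- file("bigfile.csv", open = "r")
  header <- readLines(con, n = 1)         # set the header line aside
  repeat {
      chunk <- readLines(con, n = 10000)  # up to 10000 lines per pass
      if (length(chunk) == 0) break       # end of file
      ## parse the lines here (e.g. with strsplit) and keep only
      ## the rows/columns actually needed before reading on
  }
  close(con)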
BTW (for Janet) `crash' and `freeze' aren't exactly the same (at least to
me). To me a `crash' is something like a segfault, the program exiting
abruptly, a BSOD, etc. If you can still run top, then the computer hasn't
really frozen, either. I suspect the reason you can't kill the process is
that the computer is running _really_ low on memory and thrashing, thus not
responding to `kill' in a timely manner. I have gotten into that situation
on a Linux box (w/ 2GB RAM) and couldn't even log in as root to try the
`kill'. There, too, the only option was to hit that big red (or maybe not
so `big' or `red') button...
We do need an accurate description of what happened to be more helpful,
including some basic description of the data file (e.g., how many
rows/columns, delimiters, column types, etc.).
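If the column types are known (or can be worked out), it also helps to
tell read.table about them up front via colClasses, rather than letting
it guess. A sketch, with made-up types for a comma-delimited file:

  ## assuming, say, two character columns followed by ten numeric ones
  w <- read.table("pedagogue.csv", header = TRUE, sep = ",",
                  colClasses = c(rep("character", 2), rep("numeric", 10)))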
Andy
> -roger
>
> Janet Rosenbaum wrote:
>
> >
> >
> >> If it actually crashes there is a bug, but I suspect that it stops
> >> with an error message -- please do read the posting guide and tell
> >> us exactly what happens.
> >
> >
> > Sorry, I hadn't realized that "crash" means to give an error
> > message on this mailing list.
> >
> > To me, "crash" means that the computer freezes entirely, or, if I'm
> > lucky, it just runs for several hours without doing anything, the
> > process can't be killed even with -9, and the computer can't be
> > shut down but has to be power-cycled.
> >
> > For instance, I left it doing a read.table on a text-format file
> > from this data (a few hundred megs) and eight hours later it was
> > still "going". I watched the process with "top" for a while and
> > the computer had plenty of free memory -- over 100 M the whole
> > time -- and R was using almost no CPU.
> >
> > I have tried all sorts of ways of reading in the data. It's best
> > if I can read the xport file, since that has all the original
> > labels, which don't make it into the text file, but read.xport
> > actually freezes the computer.
> >
> > As I said, I am running R 1.8.1, which claims to be the most recent
> > version (when I type is.RAqua.updated()), on an iBook G3/800 with
> > 620 M of RAM (the maximum) running 10.3.3.
> >
> > The command really doesn't much matter. These are totally normal
> > files and I can load in the normal-sized files with the exact
> > same commands.
> >
> > > w <- read.table("pedagogue.csv", header = TRUE, sep = ",")
> > > library(foreign)
> > > w <- read.xport("demagogue.xpt")
> >
> >
> > The xpt files are up to 400 M, and the csv files are about 100 M.
> >
> > Janet
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html
>
>