[R] increasing memory
Prof Brian Ripley
ripley at stats.ox.ac.uk
Wed May 5 08:05:53 CEST 2004
The commands do matter. Both ?read.table and the Data Import/Export
Manual tell you ways to speed up reading a table.
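For example, the usual advice from ?read.table looks like this (a sketch
only; the colClasses and nrows values are assumptions about your file,
not things I know about it):

  ## Tell read.table what to expect: colClasses and nrows are the big
  ## wins, and comment.char = "" skips the scan for comment characters.
  w <- read.table("pedagogue.csv", header = TRUE, sep = ",",
                  comment.char = "",
                  colClasses = "numeric",  # assumes every column is numeric
                  nrows = 500000)          # assumes roughly this many rows

  ## If the file really is one big numeric table, scan() is faster still:
  m <- matrix(scan("pedagogue.csv", sep = ",", skip = 1),
              ncol = 20, byrow = TRUE)     # ncol = 20 is illustrative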
However, there seems to be a problem with either MacOS X or perhaps your
hardware that is probably impossible to diagnose remotely. A Unix system
should be able to kill any process you own with kill -9 (unless the system
is out of other resources, e.g. has no free process slots in which to run
kill): that's not an R issue.
On Tue, 4 May 2004, Janet Rosenbaum wrote:
>
>
> > If it actually crashes there is a bug, but I suspect that it stops with an
> > error message -- please do read the posting guide and tell us exactly what
> > happens.
>
> Sorry, I hadn't realized that "crash" means to give an error message on
> this mailing list.
>
> To me, "crash" means that the computer freezes entirely, or, if I'm
> lucky, just runs for several hours without doing anything; the
> process can't even be killed with -9, and the computer can't be
> shut down but has to be power-cycled.
>
> For instance, I left it doing a read.table on a text-format file from this
> data (a few hundred megabytes) and eight hours later it was still "going".
> I watched the process with "top" for a while, and the computer had plenty
> of free memory -- over 100 MB the whole time -- while R was using almost no
> CPU.
It may still have been swapping.
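From inside R you can at least see what it thinks it is using (a sketch;
gc() reports R's own allocations, not system paging -- top's pageouts are
the thing to watch for swap):

  gc()             # cells in use and the current GC trigger sizes
  object.size(w)   # bytes occupied by one object, e.g. your data frame 'w'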
> I have tried all sorts of ways of reading in the data. It's best if I
> can read the xport file, since that has all the original labels, which
> don't make it into the text file, but read.xport actually freezes the
> computer.
>
> As I said, I am running R 1.8.1, which claims to be the most recent
> version (when I type is.RAqua.updated()), on an iBook G3/800 with 620 MB
> of RAM (the maximum) running 10.3.3.
>
> The command really doesn't much matter. These are totally normal files,
> and I can load the normal-sized files with the exact same
> commands.
> > w <- read.table("pedagogue.csv", header = TRUE, sep = ",")
> > library(foreign)
> > w <- read.xport("demagogue.xpt")
>
> The xpt files are up to 400 MB, and the csv files are about 100 MB.
>
> Janet
>
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595