R-beta: Memory Management in R-0.50-a4
ihaka at stat.auckland.ac.nz
Fri Nov 28 02:19:28 CET 1997
Ian Thurlbeck writes:
> Dear R users
> we're having a problem reading a largish data file using
> read.table(). The file consists of 175000 lines of 4
> floating pt numbers. Here's what happens:
"read.table" uses vast amounts of memory. First it reads everything
as as string and then converts to numbers. By using "scan" instead
you can cut down your memory demands. If you know you have exactly
175000 observations using something like
x <- scan(what=list(0,0,0,0), nmax=175000)
you will cut your memory demands to near the minimum.
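A slightly fuller sketch of that call, for illustration only (the file
name "data.txt" and the column names are assumptions, not from the
original post):

```r
# scan() reads the numbers directly into four numeric vectors,
# avoiding read.table()'s intermediate character storage.
x <- scan("data.txt", what = list(0, 0, 0, 0), nmax = 175000)

# Hypothetical column names; pick whatever suits your data.
names(x) <- c("v1", "v2", "v3", "v4")

# Convert to a data frame only if you need one.
df <- as.data.frame(x)
```

The list passed to "what" tells scan() the type of each field (here,
four numerics), so no character copies of the data are kept around.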
> What can I do? Is the answer a better memory management
> system ?
The memory management definitely needs some work (well actually it
needs a quick bullet between the eyes). When we got started on R we
didn't anticipate that it would see much use outside of teaching
here at Auckland.
I certainly have plans to revisit the memory management at some point
and to look at some of the scalability problems too. But don't hold
your breath ...
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !) To: r-help-request at stat.math.ethz.ch