[R] Huge data frames?
Magnus Lie Hetland
magnus at hetland.org
Wed Aug 28 05:38:50 CEST 2002
A friend of mine recently mentioned that he had painlessly imported a
data file with 8 columns and 500,000 rows into Matlab. When I tried
the same thing in R (both the Unix and Windows variants), I had little
success. The Windows version hung for a very long time until I
eventually more or less ran out of virtual memory; I tried to set
suitable memory allocations for the Unix version, but it never seemed
satisfied :]
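By my rough arithmetic, the data itself shouldn't be the problem; the
finished object is quite modest:

    ## 8 numeric columns x 500,000 rows of 8-byte doubles:
    8 * 500000 * 8 / 2^20   # ~30.5 MB for the finished data frame

If I understand the mechanics, read.table can transiently need several
times that, since the fields pass through character vectors (plus
intermediate copies) before being converted to numbers.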
I used read.table -- should I have used something else? Is it even
possible to work with files this large? I assume a memory-mapped
binary file would have been quite efficient (as opposed to an
in-memory parsed text file) -- is something like that even possible in
R?
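Would something along these lines be the way to go? An untested sketch
(file names made up, and assuming all eight columns are numeric,
whitespace-delimited, with no header line):

    ## scan() with explicit column types avoids read.table's guessing:
    cols <- scan("bigfile.txt", what = rep(list(0), 8))  # 8 double vectors
    names(cols) <- paste("V", 1:8, sep = "")
    dat <- as.data.frame(cols)

    ## One-time binary cache; later sessions reload without text parsing:
    writeBin(unlist(cols, use.names = FALSE), "bigfile.bin")
    v <- readBin("bigfile.bin", what = "double", n = 8 * 500000)
    dat2 <- as.data.frame(matrix(v, ncol = 8))

That still isn't memory-mapped -- everything ends up in RAM -- but at
least the binary reload skips the parsing stage. (scan() also takes
skip= and nlines= arguments, so one could presumably read the file in
chunks if it won't all fit at once.)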
--
Magnus Lie Hetland The Anygui Project
http://hetland.org http://anygui.org