[R] Huge data frames?
Ott Toomet
siim at obs.ee
Wed Aug 28 08:32:42 CEST 2002
Hi,
You should use scan() to read large ASCII tables. If you then save the
resulting data frame with save(), you get a binary file that loads quite
fast afterwards. A minimal sketch follows below.
Note that similar problems arise if you try to write big data frames back
out as ASCII; you may consider my package savetable
(http://www.obs.ee/~siim/savetable_0.1.0.tar.gz) for that.
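For example, something like this (the file name "data.txt", whitespace
separation, and the 8 all-numeric columns are assumptions for illustration):

    ## Describe the 8 columns up front so scan() does not have to
    ## guess types; here every column is assumed to be numeric.
    cols <- rep(list(numeric()), 8)
    names(cols) <- paste("V", 1:8, sep = "")

    ## Read the whitespace-separated text file into a list of vectors,
    ## then turn it into a data frame.
    x <- scan("data.txt", what = cols)
    df <- as.data.frame(x)

    ## Save once in R's binary format; reloading is much faster than
    ## re-parsing the text file every session.
    save(df, file = "data.RData")
    load("data.RData")   # restores 'df'

Giving scan() the column types in 'what' skips the type-guessing that
read.table() does, which is what tends to choke on half a million rows.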
Best wishes,
Ott
On Wed, 28 Aug 2002, Magnus Lie Hetland wrote:
|A friend of mine recently mentioned that he had painlessly imported a
|data file with 8 columns and 500,000 rows into matlab. When I tried
|the same thing in R (both Unix and Windows variants) I had little
|success. The Windows version hung for a very long time, until I
|eventually more or less ran out of virtual memory; I tried to set the
|proper memory allocations for the Unix version, but it never seemed
|satisfied :]
|
|I used read.table -- should I have used something else? Is it even
|possible to work with files this large? I assume a memory-mapped
|binary file would have been quite efficient (as opposed to an
|in-memory parsed text file) -- is something like that even possible in
|R?
|
|