R-beta: Memory Management in R-0.50-a4
Ian Thurlbeck
ian at stams.strath.ac.uk
Thu Nov 27 12:01:35 CET 1997
Dear R users,
we're having a problem reading a largish data file using
read.table(). The file consists of 175000 lines of 4
floating-point numbers. Here's what happens:
> dat_read.table('sst.dat')
Error: memory exhausted
(This is line 358 of src/main/memory.c).
Cutting down the file to around 15000 lines allows
read.table() to work OK.
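One workaround worth noting (a sketch of mine, not from the original post): for a file of plain numbers, scan() with a typed `what` list builds the numeric vectors directly, avoiding read.table()'s intermediate character vectors, so it needs far less memory. A tiny stand-in file is written here so the example is self-contained; the column names V1..V4 are assumptions.

```r
## Minimal sketch: scan() reads typed columns directly, skipping
## read.table()'s per-field character overhead.  Write a tiny
## stand-in for 'sst.dat' so the example runs on its own.
cat("1.1 2.2 3.3 4.4\n5.5 6.6 7.7 8.8\n", file = "sst.dat")

## A named list of zeros tells scan() to expect 4 numeric columns.
dat <- scan("sst.dat", what = list(V1 = 0, V2 = 0, V3 = 0, V4 = 0))

## Assemble the columns into a data frame afterwards if needed.
dat <- data.frame(dat)
```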
I edited the memory limits in Platform.h and recompiled,
and now read.table() can manage up to around 125000 lines:
#define R_VSIZE 30000000L /* 15 times original figure (Defn.h) */
#define R_NSIZE 1000000L /* 5 times original figure (Defn.h) */
#define R_PPSSIZE 100000L /* 10 times original figure (Defn.h) */
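For scale, a back-of-the-envelope estimate (my arithmetic, not from the original post): the parsed numeric data themselves are modest; it is read.table()'s transient character representation of every field, plus parsing scratch space, that exhausts the vector heap.

```r
## Rough estimate, assuming 8-byte doubles and ignoring the
## transient character copies read.table() makes while parsing.
rows <- 175000
cols <- 4
final_bytes <- rows * cols * 8   # storage for the parsed numbers
final_bytes                      # 5600000 bytes, i.e. about 5.3 Mb
## Peak usage during read.table() is several times this figure,
## since every field is first held as a string.
```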
Clearly I can keep upping these values until it works, but this
has the side-effect of making the running R binary pretty big.
What can I do? Is the answer a better memory-management
system?
Any help appreciated.
Yours
Ian
--
Ian Thurlbeck http://www.stams.strath.ac.uk/
Statistics and Modelling Science, University of Strathclyde
Livingstone Tower, 26 Richmond Street, Glasgow, UK, G1 1XH
Tel: +44 (0)141 548 3667 Fax: +44 (0)141 552 2079
-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject!) to: r-help-request at stat.math.ethz.ch
_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._