R-beta: Memory requirement in Win NT
Joe Mortzheim
jmortz at snake1.cr.usgs.gov
Tue Apr 7 13:07:42 CEST 1998
I generally use S+ but have been dismayed by its slow operation and memory hogging. I recently downloaded R and was hoping my memory problems were over. On the contrary, R seems to run out of memory almost immediately, particularly during read.table().
I tried changing the memory arguments when starting R, for example:
R -n200000 -v20
However, I get an error:
The instruction at "0x0008001f" referenced memory at "0x0008001f". The memory could not be "read"
I get this error regardless of the values I supply, and it appears whether I use the -n argument, the -v argument, or both.
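For what it's worth, a session started with the defaults comes up fine, and running gc() there reports the cons-cell (Ncells) and vector-heap (Vcells) figures that the -n and -v flags are supposed to raise; it is only the large read that fails (the file name below is just a placeholder):

gc()                              # reports Ncells (cons cells) and Vcells (heap) usage
dat <- read.table("mydata.txt")   # this is where R runs out of memory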
What gives? It's enough to make me switch to SAS or (God forbid!) begin longing for a Microsoft product.
I have a Pentium 166 with 64 MB of RAM and I'm running Windows NT. Surely this is enough computational power, if only I could tell R to use more resources than it starts with by default? I would rather not go through the hassle of breaking a read.table() command down into a bunch of scan() commands (sketched below) or splitting my dataset into multiple files. I can load the same dataset with read.table() using S+ on this machine; it just takes forever.
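Just to be concrete, the sort of workaround I am trying to avoid looks roughly like the sketch below, assuming a tab-delimited file with one numeric and one character column (the file name and column layout are made up):

# read each column with scan(), giving the type of every field explicitly
tmp <- scan("mydata.txt", what = list(x = 0, grp = ""), sep = "\t")
dat <- data.frame(tmp)            # then rebuild the data frame by hand

Doing that once is fine; doing it for every dataset is the hassle I would like to skip.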
I certainly appreciate any help. Please respond to jmortz at snake1.cr.usgs.gov as well as the mailing list.
Thanks in advance for the advice.