[R] heap size trouble
Faheem Mitha
faheem at email.unc.edu
Tue May 30 16:18:42 CEST 2000
You need to change R_VSIZE in .Renviron, I believe, at least if you are
using Unix. I.e.

R_VSIZE=somethingM

where something is a number larger than what you are currently using, and
M stands for megabytes, e.g.

R_VSIZE=35M
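For concreteness, here is a minimal sketch of what ~/.Renviron might
contain. The values are made up; adjust them to your data, and check
help(Memory) for the exact variable names your version of R uses:

# ~/.Renviron -- read when R starts up (hypothetical values)
# R_VSIZE sets the vector heap (the "Vcells" row of gc() output);
# R_NSIZE sets the number of cons cells (the "Ncells" row).
R_VSIZE=64M
R_NSIZE=1000000

I believe the same limits can also be given on the command line, e.g.
R --vsize=64M --nsize=1000000.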
You can use gc() to see your current memory situation. E.g., mine gives:

> gc()
          free   total (Mb)   <- currently available
Ncells  839332 1024000 19.6
Vcells 3258006 4587520 35.0
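As a back-of-the-envelope check (just a sketch; read.table() needs extra
working space on top of this while it parses the file), you can estimate
the vector heap your data alone will occupy:

> rows <- 93000; cols <- 22
> rows * cols * 8 / 2^20   # doubles take 8 bytes; result is in megabytes
[1] 15.60974

So the numbers alone need roughly 16 Mb of Vcells, which is why a small
default vector heap runs out while a file of this size is being read.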
This is all for Unix; it may differ on other platforms. In any case, look
at the section in the FAQ about memory, or do

> help(Memory)
I've found memory problems with large data sets in R a big pain. Sometimes
I just run out of memory and have to give up. I wish there were a magical
way around this, but there doesn't seem to be.
Yours, Faheem.
On Tue, 30 May 2000, karamian wrote:
> Hi,
>
> I've got trouble using R.
>
> When I want to load a file that contains 93 thousand rows and 22 columns
> of data (essentially floats),
> R shows me this error message:
>
> "heap size trouble"
>
> Could anyone tell me what parameter I should set before
> launching R in order to load my big file?
>
> Thanks a lot