[R] a quick Q about memory limit in R
Uwe Ligges
ligges at statistik.uni-dortmund.de
Wed May 21 08:47:59 CEST 2003
Yan Yu wrote:
> Hello, there,
> I got this error when I tried to run "data.kr <- surf.gls(2, expcov,
> data, d=0.7);":
>
> "Error: cannot allocate vector of size 382890 Kb
> Execution halted"
>
> My data is a 100x100 grid.
>
> the following is the summary of "data":
>
>>summary(data);
>
> x y z
> Min. : 1.00 Min. : 1.00 Min. :-1.0172
> 1st Qu.: 26.00 1st Qu.: 25.75 1st Qu.: 0.6550
> Median : 51.00 Median : 50.50 Median : 0.9657
> Mean : 50.99 Mean : 50.50 Mean : 1.0000
> 3rd Qu.: 76.00 3rd Qu.: 75.25 3rd Qu.: 1.2817
> Max. :100.00 Max. :100.00 Max. : 2.6501
>
> I have 2 Qs:
> (1) For a 100x100 grid, why does R try to allocate such a HUGE
> vector, 382890 Kb??
Because the fit performs calculations with the data that consume much
more memory than the data itself, e.g. by generating matrices,
temporary objects, and copies of the data. In particular, a generalized
least squares fit over n points typically involves an n x n covariance
matrix, and your grid has n = 10000 points; see the rough estimate
below.
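
A back-of-envelope estimate (a sketch only; that surf.gls() materializes
a full n x n matrix of doubles is an assumption about its internals, not
documented behaviour):

n <- 100 * 100        # number of grid points on a 100x100 grid
bytes <- n^2 * 8      # assumed n x n matrix, 8 bytes per double
bytes / 1024^2        # ~763 MB for a single such matrix

So a single full n x n matrix is already of the same order as the
382890 Kb allocation that failed, and intermediate copies only make
things worse.
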
> (2) What decides the memory limit in R? How can I increase it?
a) See ?Memory and R FAQ 7.1, for example. If you are on Windows, see
also the R for Windows FAQ 2.7. These are obvious places to look, aren't
they? A sketch of the relevant commands follows below.
b) By buying some more memory for your machine.
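
For illustration, a minimal sketch of the relevant commands
(memory.limit() and the --max-mem-size flag exist only in the Windows
build of R; the sizes here are example values, not recommendations):

gc()                       # report current memory use, run garbage collection
memory.limit()             # Windows only: current limit in MB
memory.limit(size = 1024)  # Windows only: request a higher limit (MB)

# Or start R for Windows with a larger limit from the command line:
#   Rgui.exe --max-mem-size=1024M
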
Uwe Ligges
> Many thanks,
> yan