[R] a quick Q about memory limit in R
Roger.Bivand at nhh.no
Wed May 21 10:38:52 CEST 2003
On Tue, 20 May 2003, Yan Yu wrote:
> Hello, there,
> I got this error when I tried to run "data.kr <- surf.gls(2, expcov,
> data, d=0.7)":
> "Error: cannot allocate vector of size 382890 Kb
> Execution halted"
> My data is 100x100 grid.
This is, I think, where the problem is. You have n=10000 observations, and
if you do debug(surf.gls) before running, you will probably find that it
fails at this call:
Z <- .C("VR_gls", as.double(x), as.double(y), as.double(z),
as.integer(n), as.integer(np), as.integer(npar), as.double(f),
l = double((n * (n + 1))/2), r = double((npar * (npar +
1))/2), beta = double(npar), wz = double(n), yy = double(n),
W = double(n), ifail = as.integer(0), l1f = double(n *
npar), PACKAGE = "spatial")
because (n * (n + 1))/2 in your case is 50,005,000, times 8 bytes per
double, so l is a very large object ("On output L contains the Cholesky
factor of the covariance matrix of the observations" - comment in
spatial/src/kr.c).
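You can check the size of that allocation directly; a minimal sketch (the
~390,000 Kb it gives is the same order of magnitude as the 382,890 Kb in
your error message, which may correspond to a slightly different internal
allocation):

```r
n <- 100 * 100              # a 100x100 grid gives n = 10000 observations
len <- n * (n + 1) / 2      # length of the lower-triangular factor l
len                         # 50,005,000 doubles
kb <- len * 8 / 1024        # 8 bytes per double, expressed in Kb
kb                          # roughly 390,000 Kb for this one vector alone
```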
Do you need to have so many observations? If so, perhaps you could
consider other packages that allow the kriging search area to be
restricted to near neighbours of each prediction point, so that the full
n x n covariance matrix is never formed.
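As one concrete option, a sketch using local kriging in the gstat package
(the variogram model here is purely hypothetical, and `data`/`grid` stand
for your observation and prediction data frames with x, y, z columns;
check `?krige` in your installed gstat version for the exact argument
layout). The `nmax` argument restricts each prediction to its nearest
observations:

```r
library(gstat)

## data: data frame with columns x, y, z (as in your summary above)
## grid: data frame of prediction locations with columns x, y
vgm.model <- vgm(psill = 1, model = "Exp", range = 5)  # hypothetical fit
data.kr <- krige(z ~ 1, locations = ~x + y, data = data, newdata = grid,
                 model = vgm.model,
                 nmax = 20)  # use only the 20 nearest observations each time
```

With nmax = 20, each local system is 20x20 rather than 10000x10000, which
keeps memory use modest at some cost in exactness near the search cutoff.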
> the following is the summary of "data":
> > summary(data);
> x y z
> Min. : 1.00 Min. : 1.00 Min. :-1.0172
> 1st Qu.: 26.00 1st Qu.: 25.75 1st Qu.: 0.6550
> Median : 51.00 Median : 50.50 Median : 0.9657
> Mean : 50.99 Mean : 50.50 Mean : 1.0000
> 3rd Qu.: 76.00 3rd Qu.: 75.25 3rd Qu.: 1.2817
> Max. :100.00 Max. :100.00 Max. : 2.6501
> I have 2 Qs:
> (1) for a 100x100 grid, why did R try to allocate such a HUGE vector,
> 382890 Kb??
> (2) what decides the memory limit in R, and how can I increase it?
> Many thanks,
Economic Geography Section, Department of Economics, Norwegian School of
Economics and Business Administration, Breiviksveien 40, N-5045 Bergen,
Norway. voice: +47 55 95 93 55; fax +47 55 95 93 93
e-mail: Roger.Bivand at nhh.no