[R] memory issues

Rubén Roa-Ureta rroa at udec.cl
Thu Apr 17 00:33:07 CEST 2008


I think any geostatistical program/R package would have trouble handling 
12000 observations on a PC. The empirical variogram is built from all 
combinations of 12000 observations taken 2 at a time, nearly 72 million 
pairs, and during kriging, if you didn't restrict the search 
neighbourhood, interpolation would involve solving a very large linear 
system, more so if you defined a fine prediction grid. Try restricting 
the search neighbourhood, if you didn't, with maxdist and nmax.
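
As a sketch of the suggestion above (the object names `obs.spdf`, 
`pred.grid` and the variogram model are placeholders, not taken from the 
original post), the search neighbourhood in gstat is restricted through 
the `nmax` and `maxdist` arguments of `krige()`:

```r
## Number of point pairs entering the empirical variogram:
choose(12000, 2)   # 71994000, i.e. nearly 72 million pairs

## Hypothetical local-kriging call; data objects and the variogram
## parameters below are illustrative, not from the original post.
library(sp)
library(gstat)

fit.uk <- krige(z ~ 1, locations = obs.spdf, newdata = pred.grid,
                model = vgm(psill = 1, model = "Sph", range = 900),
                nmax    = 40,    # use at most the 40 nearest observations
                maxdist = 1000)  # ignore observations beyond this distance
```

With a local neighbourhood each prediction solves a small system (here at 
most 40 x 40) instead of one involving all 12000 observations, which is 
what exhausts memory in the global case.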

Rubén


Prof Brian Ripley wrote:
> I think the clue is that the message you quote comes from gstat, which 
> does not use R's memory allocator.  It is gstat and not R that has failed 
> to allocate memory.
> 
> Try re-reading the help page for memory.size.  'max=T' does not indicate 
> the limit (that is the job of memory.limit()), but the maximum 'used' 
> (acquired from the OS) in that session.  So 19Mb looks reasonable for R's 
> usage.
> 
> I don't understand the message from memory.limit() (and the formatting is 
> odd).  memory.limit() does not call max() (it is entirely internal), so I 
> wonder if that really is the output from that command.  (If you can 
> reproduce it, please let us have precise reproduction instructions.)
> 
> There isn't much point in increasing the memory limit from the default 
> 1.5Gb on a 2Gb XP machine.  The problem is that the user address space 
> limit is 2Gb and fragmentation means that you are unlikely to be able to 
> get over 1.5Gb unless you have very many small objects, in which case R 
> will run very slowly.  In any case, that is not the issue here.
> 
> On Wed, 16 Apr 2008, Dave Depew wrote:
> 
>> Hi all,
>> I've read the R for windows FAQ and am a little confused re:
>> memory.limit and memory.size
>>
>> to start using R 2.6.2 on WinXP, 2GB RAM, I have the command line "sdi
>> --max-mem-size=2047M"
>> Once the Rgui is open, memory.limit() returns 2047, memory.size()
>> returns 11.315, and memory.size(max=T) returns 19.615
>>
>> Shouldn't memory.size(max=T) return 2047?
>>
>> Upon running several operations involving kriging (gstat package,
>> original data file 3 variables, 12000 observations)
>> the program runs out of memory
>>
>> "memory.c", line 57: can't allocate memory in function m_get()
>> Error in predict.gstat(fit.uk, newdata = EcoSAV.grid.clip.spxdf,
>> debug.level = -2,  :
>>  m_get
>>
>> Immediately following this,
>>
>> memory.limit() returns [1] -Inf
>>                        Warning message:
>>                            In memory.limit() : no non-missing arguments
>> to max; returning -Inf
>>
>> memory.size() returns 24.573.
>>
>> memory.size(max=T) returns 46.75
>>
>> To my untrained eye, it appears that R is not being allowed access to
>> the full memory limit specified in the cmd line....if this is the case,
>> how does one ensure that R is getting access to the full allotment of RAM?
>> Any insight is appreciated...
>>
>>
>>> sessionInfo()
>> R version 2.6.2 (2008-02-08)
>> i386-pc-mingw32
>>
>> locale:
>> LC_COLLATE=English_United States.1252;LC_CTYPE=English_United States.1252;
>> LC_MONETARY=English_United States.1252;LC_NUMERIC=C;
>> LC_TIME=English_United States.1252
>>
>> attached base packages:
>> [1] stats     graphics  grDevices datasets  tcltk     utils
>> methods   base
>>
>> other attached packages:
>> [1] maptools_0.7-7 foreign_0.8-23 gstat_0.9-43   rgdal_0.5-24
>> lattice_0.17-4 sp_0.9-23      svSocket_0.9-5 svIO_0.9-5
>> R2HTML_1.58    svMisc_0.9-5   svIDE_0.9-5
>>
>> loaded via a namespace (and not attached):
>> [1] grid_2.6.2  tools_2.6.2
>>
>> ______________________________________________
>> R-help at r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-help
>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>> and provide commented, minimal, self-contained, reproducible code.
>>
>
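
To restate the distinction Prof Ripley draws, these Windows-only helpers 
report different things (a sketch; the example values are taken from the 
session quoted above, not re-run):

```r
## Windows-only memory diagnostics in R 2.x:
memory.limit()            # the session's memory cap in Mb
                          # (e.g. 2047, matching --max-mem-size=2047M)

memory.size()             # Mb currently in use by R

memory.size(max = TRUE)   # the most R has *acquired from the OS* so far
                          # in this session -- not the limit -- so a small
                          # value early on (e.g. 19.6 Mb) is expected
```

Since the failed allocation here came from gstat's own allocator 
(`m_get()` in memory.c) rather than from R, none of these numbers need 
approach the limit when the error occurs.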
