[R-sig-Geo] Memory limit problems in R / import of maps

Edzer Pebesma edzer.pebesma at uni-muenster.de
Tue Apr 22 16:14:17 CEST 2008


Hi Tom,

Tomislav Hengl wrote:
> Should we simply give up on running spatial analysis using large grids (>10 million grids) in R?
>   
Yes, and I would be very interested to hear along which other path you 
then managed to finish the job.

Other options I can see are:
- buy a decent PC with 16 or 32 GB of memory, and use 64-bit Linux (have 
you checked how much this would cost, and compared it to the budget of 
your project?). There's nothing special about it; I run 64-bit Linux 
100% of my time on my 1.2 kg laptop (with much less RAM).
OR:
- don't go through the grid in a single pass, but process it by tiles, 
e.g. use rgdal to read part of the grid, and do that for 100 tiles; this 
should reduce memory needs by a factor of 100. Of course it takes a bit 
more effort in terms of administration (as Roger mentioned); see the 
first sketch after this list,
OR:
- rewrite the memory-hungry parts so that the bulky data is not first 
read into memory, but read directly from disk. Several attempts at this 
can be found in various packages; see the second sketch below.
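
To illustrate the tiled approach, here is a minimal sketch of reading a 
grid strip by strip with rgdal; the file name "biggrid.tif" and the 
choice of 100 strips are placeholders for your own setup, and the 
per-tile analysis is only indicated by a comment:

library(rgdal)

fname <- "biggrid.tif"               # placeholder: your grid file

info  <- GDALinfo(fname)             # grid dimensions, without reading data
nrows <- info[["rows"]]
ncols <- info[["columns"]]

ntiles <- 100                        # process in ~100 horizontal strips
strip  <- ceiling(nrows / ntiles)

for (i in seq_len(ntiles)) {
  off <- (i - 1) * strip
  nr  <- min(strip, nrows - off)
  if (nr <= 0) break                 # last strip may be empty
  tile <- readGDAL(fname, offset = c(off, 0),
                   region.dim = c(nr, ncols), silent = TRUE)
  ## ... analyse `tile` (a SpatialGridDataFrame), write results out;
  ## the tile can then be garbage-collected before the next pass
}

Each pass holds only one strip in memory, so peak use drops roughly by 
the number of strips (plus whatever your analysis itself allocates).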
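
And as a taste of reading directly from disk, rgdal itself lets you keep 
a GDAL dataset handle open and pull arbitrary windows with 
getRasterData(), which returns a plain array instead of first building a 
Spatial object; the file name is again a placeholder:

library(rgdal)

ds   <- GDAL.open("biggrid.tif")     # opens a handle, reads no data yet
dims <- dim(ds)                      # rows, columns (and bands, if > 1)

## pull one modest window straight from disk into a plain matrix
block <- getRasterData(ds, offset = c(0, 0),
                       region.dim = c(min(1000, dims[1]), dims[2]))
summary(as.vector(block))

GDAL.close(ds)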

I believe you don't mean it like that, but your question (above) sounds 
a bit like "you" want "us" to solve your problems. That's always a 
dangerous attitude on lists where help is given only voluntarily.

You haven't even told us how much memory your machine has, or which OS 
you run.

Best wishes,
--
Edzer



