[R-sig-Geo] Memory limit problems in R / import of maps

Tomislav Hengl hengl at science.uva.nl
Tue Apr 22 16:59:10 CEST 2008


Dear Edzer, Roger,

Thanks for the tips! BTW, I was using a Dell Latitude D630 laptop with 2 GB of RAM and a 2 GHz
Intel chip, running:

(a) Windows XP professional
(b) Linux Ubuntu

Under both OSes I get the same error message.

And yes, we would like to import about 25 maps of 1.5 million pixels each.

I will try to summarize your strategies:

(1) Import the data in tiles, e.g. via rgdal:

> info <- GDALinfo("dem50m.asc")
> gridmaps01 <- readGDAL("dem50m.asc", region.dim = round(c(info[["rows"]]/2, info[["columns"]])))
> gridmaps02 <- readGDAL("dem50m.asc", region.dim = round(c(info[["rows"]]/2, info[["columns"]])),
+   offset = round(c(info[["rows"]]/2, 0)))
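The two-tile split above generalizes to any number of row blocks. A minimal sketch, assuming rgdal is installed and "dem50m.asc" is in the working directory (the tile count of 10 is illustrative):

```r
# Read a large grid in N row blocks instead of one pass (sketch; assumes
# rgdal is installed and "dem50m.asc" exists in the working directory).
library(rgdal)

info <- GDALinfo("dem50m.asc")
n.tiles <- 10
rows.per.tile <- ceiling(info[["rows"]] / n.tiles)

for (i in seq_len(n.tiles)) {
  offset.row <- (i - 1) * rows.per.tile
  n.rows <- min(rows.per.tile, info[["rows"]] - offset.row)
  tile <- readGDAL("dem50m.asc",
                   offset = c(offset.row, 0),
                   region.dim = c(n.rows, info[["columns"]]))
  # ... process or summarize 'tile' here; it is garbage-collected on the
  # next iteration, so only one tile is ever held in memory.
}
```

With 10 tiles, peak memory use should drop to roughly a tenth of a whole-grid import, at the cost of some bookkeeping for edge effects between tiles.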

Or try reading the data directly from disk (is there any documentation on how to achieve this?).
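One way to read directly from disk with base R alone is to stream a plain-text grid through a connection in row blocks, so the full map never sits in memory. A sketch, using a small synthetic file as a stand-in for a large ASCII map:

```r
# Row-block streaming over a plain-text grid using base R connections only
# (the temp file here is a synthetic stand-in for a large ASCII map).
f <- tempfile(fileext = ".txt")
write(matrix(rnorm(1000), 100, 10), f, ncolumns = 10)  # fake 100 x 10 grid

con <- file(f, open = "r")
total <- 0
repeat {
  block <- scan(con, n = 10 * 25, quiet = TRUE)  # 25 rows x 10 cols per chunk
  if (length(block) == 0) break                  # end of file reached
  total <- total + sum(block)                    # running statistic over grid
}
close(con)
total
```

The same pattern works for any statistic that can be accumulated chunk by chunk (sums, counts, histograms); for binary formats, readBin() plays the role of scan().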

(2) Obtain a 64-bit OS/PC with >10 GB of RAM.

We would like to use the package adehabitat, which reads ArcInfo ASCII maps via 'import.asc'.
Unfortunately, I could not find a way to make it read the data in tiles, but we might try importing
via rgdal first and then converting to the 'kasc' class.

If anybody else has experience with, or a solution for, working with large maps, we would be
interested to hear their opinion.

Just one last thing: if R reports an error message, that does not necessarily mean that the
machine has hit a memory limit - shouldn't there be a way to implement memory handling in R more
efficiently?
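As a quick base-R check of whether the machine or R itself is the bottleneck, one can measure what a single band actually costs (a sketch; the 1.5-million figure matches the maps mentioned above):

```r
# How much RAM does one 1.5-million-pixel band actually take in R?
n <- 1.5e6
x <- rnorm(n)                        # one band stored as double precision
print(object.size(x), units = "Mb")  # 8 bytes per cell, about 11.4 Mb
gc()                                 # report current usage, trigger collection
rm(x); gc()                          # release the band again
```

Note that intermediate copies made during analysis can multiply this figure several times over, which is often what triggers the error well before physical RAM is exhausted.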

Thanks in any case,

Tom Hengl



-----Original Message-----
From: Edzer Pebesma [mailto:edzer.pebesma at uni-muenster.de] 
Sent: dinsdag 22 april 2008 16:14
To: Tomislav Hengl
Cc: r-sig-geo at stat.math.ethz.ch; 'Michalis Vardakis'
Subject: Re: [R-sig-Geo] Memory limit problems in R / import of maps

Hi Tom,

Tomislav Hengl wrote:
> Should we simply give up on running spatial analysis using large grids (>10 million cells) in R?
>   
Yes, and I would be very interested to hear which other path then allowed 
you to finish the job.

Other options I can see are:
- buy a decent PC with 16 or 32 GB of memory, and use 64-bit Linux (have 
you checked how much this would cost, compared to the budget of your 
project?). There's nothing special about it; I use it 100% of my time on 
my 1.2 kg laptop (with much less RAM).
OR:
- don't go through the grid in a single pass, but do it by tiles, e.g. 
use rgdal to read part of the grid and do that for 100 tiles; this should 
reduce memory needs by a factor of 100. Of course this takes a little 
more effort in terms of administration (as Roger mentioned),
OR:
- rewrite the memory-hungry parts such that the bulky data is not first 
read into memory, but read directly from disk. Several attempts can be 
found in various packages.

I believe you don't mean it like that, but your question (above) sounds 
a bit like "you" want "us" to solve your problems. That's always a 
dangerous attitude on lists where help is purely voluntary.

You haven't even told us how much memory your computer or OS has.

Best wishes,
--
Edzer



