[R-sig-Geo] Memory limit problems in R / import of maps

Tomislav Hengl hengl at science.uva.nl
Tue Apr 22 17:49:10 CEST 2008


Dylan,

Thanks for your note.

A student of mine would like to run a habitat suitability analysis (ecological-niche factor
analysis, ENFA; http://dx.doi.org/10.1890%2F0012-9658%282002%29083%5B2027%3AENFAHT%5D2.0.CO%3B2)
using the adehabitat package. I encouraged him to use R, for many reasons.
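For reference, a minimal sketch of the intended workflow with the current adehabitat, assuming the
250 m layers are available as ESRI ASCII grids; the file and layer names are placeholders, and the
exact enfa() call may differ between adehabitat versions:

library(adehabitat)
library(ade4)

## read the 250 m predictor grids (file names are placeholders)
elev  <- import.asc("elev_250m.asc")
slope <- import.asc("slope_250m.asc")
## ... and so on for the remaining layers

## glue the layers into one multi-layer 'kasc' object
maps <- as.kasc(list(elev = elev, slope = slope))

## keep only complete pixels and run the PCA that enfa() takes as input
tab <- kasc2df(maps)
pc  <- dudi.pca(tab$tab, scannf = FALSE)

## pr: species detections per retained pixel (e.g. via count.points());
## hs <- enfa(pc, pr, scannf = FALSE)

At the full 250 m resolution it is this gluing step (as.kasc/kasc2df) that runs into the memory
limit.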

At the moment, he is thinking of doing the whole thing in Matlab (or using the original Biomapper
software), because we would not like to give up on the original resolution (250 m).  

As a GIS person, I definitely do not see ~20 million pixels as a huge data set.

cheers,

Tom Hengl



-----Original Message-----
From: Dylan Beaudette [mailto:dylan.beaudette at gmail.com] 
Sent: Tuesday 22 April 2008 17:22
To: Tomislav Hengl
Cc: r-sig-geo at stat.math.ethz.ch; Michalis Vardakis
Subject: Re: [R-sig-Geo] Memory limit problems in R / import of maps

On Tue, Apr 22, 2008 at 6:49 AM, Tomislav Hengl <hengl at science.uva.nl> wrote:
>
>  Dear list,
>
>  I know that much has already been said about the memory limit problems. If there is any progress
>  about this problem, we would be interested to hear.
>
>  In our project, we are importing 24 maps/bands, each consisting of 1,450,000 pixels. We would
>  further like to glue all maps into a single data frame (e.g. the 'kasc' class in the adehabitat
>  package, or 'SpatialGridDataFrame' in the sp package), but this seems to be impossible.
>
>  We tried to run this under Windows (after following
>  http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
>  and setting --max-mem-size) and under Linux (Ubuntu), but we still get the same error message
>  (there seems to be no difference in memory use between the two operating systems):
>
>  "Error: Cannot allocate vector of size 11.1 Mb"
>
>  The R workspace with the 24 loaded grids is also quite small (18 MB), but any further gluing and
>  calculation is blocked by the vector size error message.
>
>  For comparison, in a GIS such as ArcGIS or SAGA/ILWIS (open source) we have no problems loading
>  and processing 3-4 times more grids.
>
>  Should we simply give up on running spatial analyses on large grids (>10 million cells) in R?

Hi,

What exactly were you hoping to do with such a massive data frame once
you overcame the initial memory problems associated with loading the
data? Any multivariate analysis, classification, or inference test on
the stack of grids would require at least as much memory again.
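A quick back-of-the-envelope check (assuming the grids are stored as doubles) puts numbers on this;
the failing 11.1 Mb allocation is about the size of a single full layer:

1450000 * 8 / 2^20        # one layer of doubles: ~11.1 Mb
24 * 1450000 * 8 / 2^20   # the full 24-layer stack: ~265 Mb

and R will typically make several temporary copies of a table that size during model fitting.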

Not knowing what the purpose of this operation is (although I would
guess something related to soil property or landscape modeling of some
sort), it is hard to suggest a better approach. For grids that size I
would use an algorithm that operates on strips or tiles; there are
several good starting points in the GRASS source code. Doing all of
the pre-processing, and possibly some aggregation to a larger support
size, in GRASS would also let you test any R-centric operations on a
coarser version of the original dataset.
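For instance, here is a minimal sketch of strip-wise processing in R with rgdal, assuming the 24
layers have been stacked into a single multi-band file; the file name, strip height and per-strip
calculation are only placeholders:

library(rgdal)

fname <- "predictors_250m.tif"           # placeholder multi-band stack
info  <- GDALinfo(fname)
nr    <- info[["rows"]]
nc    <- info[["columns"]]
strip <- 500                             # rows read per pass

res <- vector("list", ceiling(nr / strip))
for (i in seq_along(res)) {
  r0 <- (i - 1) * strip
  ## read one horizontal strip of all bands
  sgdf <- readGDAL(fname, offset = c(r0, 0),
                   region.dim = c(min(strip, nr - r0), nc))
  ## do the per-pixel work on this strip, keep only the (small) result
  res[[i]] <- rowMeans(as.matrix(sgdf@data), na.rm = TRUE)
}
out <- unlist(res)                       # one value per pixel

Only one strip of the stack is ever held in memory, so the peak footprint is set by the strip
height rather than by the full grid.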

Cheers,

Dylan



