[R-sig-Geo] virtual memory and raster package

Ariel Ortiz-Bobea aortizbobea at arec.umd.edu
Tue Jul 10 16:53:05 CEST 2012


Hello,

My general goal is to compute the share of cropland in each 0.125-degree grid
cell in the US. I have a large land cover raster for each state, which I
reclassify into 0 and 1 for non-crop/crop. The 0.125-degree grid corresponds
to another dataset I will be using, and I want to use the cropland
information for weighting.

To achieve this I:

1- rasterize the 0.125-degree polygon grid for each state so that it matches
the extent and resolution of the cropland raster. (I also break the polygons
into pieces and run rasterize() in parallel with mclapply() on a Mac; it's
about 60% faster.)

2- use the zonal() function with the stat "sum" to get the total number of
cropland pixels in each 0.125-degree zone (corresponding to the polygon grid
mentioned above). A sketch of both steps follows this list.
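To make the two steps concrete, here is a minimal sketch of what I'm doing.
The names are placeholders, not my actual script: crop01 is a state's land
cover raster already reclassified to 0/1, and grid_poly is the 0.125-degree
SpatialPolygonsDataFrame with an integer id field "zone".

    library(raster)
    library(sp)
    library(parallel)

    n_cores <- 4
    idx     <- splitIndices(nrow(grid_poly), n_cores)

    ## 1- rasterize chunks of the 0.125-deg grid in parallel, on the
    ##    cropland layer's extent and resolution
    parts <- mclapply(idx, function(i) {
      rasterize(grid_poly[i, ], crop01, field = "zone")
    }, mc.cores = n_cores)
    zones <- do.call(merge, unname(parts))

    ## 2- sum the 0/1 cropland pixels within each zone
    crop_sum <- zonal(crop01, zones, fun = "sum")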

It seems that this strategy is faster than using extract(), as I believe
Roger Bivand pointed out in a previous thread. However, when I run this code
(with Rscript from the console), I can see my hard disk space decrease
rapidly, and after a couple of states have been processed I'm out of disk
space, so the code crashes. When I reboot the computer, the free disk space
is restored to its initial level (~110 GB) and I'm able to run the code
again for a couple more states.

I would like to avoid this "leaking" so I can run the code in one go without
having to reboot the computer.
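
In case it clarifies what I mean by "leaking": I suspect the space is going
to the raster package's temporary grid files, which by default are written
under tempdir(). What I have in mind (untested; states and process_state()
are placeholders for my per-state loop) is something like:

    library(raster)

    ## point raster at a dedicated temp directory for this job
    rasterOptions(tmpdir = "/path/to/raster_tmp")

    for (st in states) {
      process_state(st)      # the rasterize/zonal steps sketched above
      removeTmpFiles(h = 0)  # delete all raster temp files, not just old ones
    }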

Any ideas or suggestions would be greatly appreciated!

Ariel


-----
Ariel Ortiz-Bobea
PhD Candidate in Agricultural & Resource Economics
University of Maryland - College Park