[R-sig-Geo] Alternative to zonal for large images

Oscar Perpiñán Lamigueiro oscar.perpinan at gmail.com
Thu Feb 14 08:00:06 CET 2013


> Oscar, I find your idea ingenious; however, I don't see how this method
> could still work with an out-of-memory raster (since you load the whole
> raster using getValues()). data.table storage is more efficient, so maybe
> it's the fact that you can shrink the space taken by the two layers that
> allows you to process rasters in memory? Or is canProcessInMemory() too
> conservative? I think it uses a 10% buffer.

Hello,

My last code focused only on speed improvement; we should add a few
lines of code to check memory usage. On the other hand, I am still
learning to use the amazing data.table package, so I cannot offer a
solution for out-of-memory problems. However, it seems that data.table
copes with them quite well:

http://stackoverflow.com/questions/11564775/exceeding-memory-limit-in-r-even-with-24gb-ram
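For what it's worth, here is a minimal, untested sketch (the layer
names 'zones' and 'vals' are invented) of how block-wise reading with
raster could be combined with data.table, so that only one chunk of the
rasters is in memory at a time:

library(raster)
library(data.table)

## Read the rasters chunk by chunk; blockSize() suggests row blocks
## that should fit comfortably in memory.
bs <- blockSize(vals)
acc <- vector("list", bs$n)

for (i in seq_len(bs$n)) {
    v <- getValues(vals,  row = bs$row[i], nrows = bs$nrows[i])
    z <- getValues(zones, row = bs$row[i], nrows = bs$nrows[i])
    dt <- data.table(zone = z, value = v)
    ## Partial sums and counts per zone for this chunk
    acc[[i]] <- dt[!is.na(value), list(s = sum(value), n = .N), by = zone]
}

## Merge the partial results into zonal means
res <- rbindlist(acc)[, list(mean = sum(s) / sum(n)), by = zone]

Each chunk only contributes a small table of per-zone sums and counts,
so the memory footprint is bounded by the block size rather than by the
full raster.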

Besides, I have just discovered the ":=" operator, which could be
useful when working with large datasets:

http://stackoverflow.com/questions/9508118/out-of-memory-when-modifying-a-big-r-dataframe
http://stackoverflow.com/questions/7029944/when-should-i-use-the-operator-in-data-table
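
As a toy illustration (the table and column names here are invented),
":=" adds or modifies a column by reference, so no copy of the possibly
huge table is made:

library(data.table)

DT <- data.table(zone  = sample(1:5, 1e6, replace = TRUE),
                 value = rnorm(1e6))

## Attach the per-zone mean in place; DT is modified by reference,
## not copied.
DT[, zoneMean := mean(value), by = zone]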

Best,

Oscar.


