[R-sig-Geo] Applying simple math to a big stack takes a long time. How to improve it?
Thiago V. dos Santos
thi_veloso at yahoo.com.br
Fri Feb 6 23:06:31 CET 2015
Hi all,
I am processing some GeoTIFF rasters, 60 files in total (might be more), which are relatively big: each is about 7800 x 7700 pixels at 30 m resolution and around 120 MB (Landsat 8 images). I am trying to apply a simple equation (please see the end of the code) to a stack of six of those images:
library(raster)

# create some artificial data
# this is used within a loop over all files (~60) in the directory
r <- raster(nrows=7801, ncols=7711)
r[] <- runif(ncell(r), 0, 65000)
s.dn <- stack(r, r, r, r, r, r)
# Define calibration factors
rad.mult.fact <- c(0.012852, 0.013161, 0.012128,
                   0.010227, 0.006258, 0.001556)
rad.add.fact <- c(-64.2618, -65.8048, -60.6386,
                  -51.1339, -31.2914)

# convert DN to TOA radiance
s.rad <- (s.dn * rad.mult.fact) + rad.add.fact
#write file
writeRaster(s.rad, filename='teste.tif', format="GTiff", overwrite=TRUE)
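
For reference, here is an equivalent per-band version of the same calibration (just a sketch; it assumes one gain and one offset per band, paired by position):

# sketch: apply the calibration band by band, so the pairing of
# each gain/offset with its band is explicit
# (assumes rad.mult.fact and rad.add.fact each have one value per band)
s.rad.list <- lapply(seq_len(nlayers(s.dn)), function(i) {
  s.dn[[i]] * rad.mult.fact[i] + rad.add.fact[i]
})
s.rad <- stack(s.rad.list)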
The calculation runs extremely slowly and takes about 10 minutes on my dual-core system with 8 GB RAM. Three 2.8 GB files are written to my temp directory during the calculation, and at the end I receive a warning:
Warning message:
In (s.img * rad.mult.fact) + rad.add.fact :
number of items to replace is not a multiple of replacement length
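
For what it's worth, a rough way to check whether raster expects to handle the whole stack in memory (a sketch, assuming the default rasterOptions()) would be:

# sketch: if this returns FALSE, raster processes the data in chunks and
# writes temporary files to its temp directory (assumes default rasterOptions())
canProcessInMemory(s.dn, n = 2)
rasterOptions()   # shows the chunksize and maxmemory settings in use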
I wonder if there is a way to optimise this script. Why is the calculation so slow in my case? And what is causing the warning message, and how can I fix it?
Many thanks,
--
Thiago V. dos Santos
PhD student
Land and Atmospheric Science
University of Minnesota
http://www.laas.umn.edu/CurrentStudents/MeettheStudents/ThiagodosSantos/index.htm
Phone: (612) 323 9898