[R-sig-Geo] Memory problems with dnearneigh in spdep
phytophthorasb at yahoo.com
Tue Apr 9 23:56:26 CEST 2013
Thanks so much, Roger, for your suggestions.
I should have realized it was taking kilometers rather than meters!
My summary(r) is:

Object of class SpatialPointsDataFrame
Coordinates:
                min       max
coords.x1 -17.52090  51.40407
coords.x2 -34.82918  37.53746
Is projected: TRUE
proj4string :
[+proj=aea +lat_1=20 +lat_2=-23 +lat_0=0 +lon_0=25 +x_0=0 +y_0=0 +ellps=WGS84
+datum=WGS84 +units=m +no_defs +towgs84=0,0,0]
Number of points: 36820887
Data attributes:
 Min.   :      1   Min.   :     0.00
 1st Qu.: 250000   1st Qu.:     0.28
 Median : 500000   Median :     2.57
 Mean   : 500000   Mean   :    26.43
 3rd Qu.: 750000   3rd Qu.:    10.51
 Max.   : 999999   Max.   :103104.70
There seem to be a lot of NAs (everything in the bounding rectangle outside the continent's borders should be NA, so that may be a reasonable number). The data looked right when I plotted it earlier.
I ran it with 9.333 as the distance band (I was going to try 1.5 next):
nb <- dnearneigh(r, 0, 9.333)
It used a constant amount of memory for the first ~90 minutes, then memory use grew at a constant rate for the next 20 hours until it was consuming virtually all the memory on the server and had to be stopped.
Because the administrator had to kill my job, I wasn't able to get session info while that session was open. I'm not sure whether this is helpful, but after restarting R, sessionInfo() gives:
R version 2.15.3 (2013-03-01)
Platform: x86_64-pc-linux-gnu (64-bit)
locale:
 LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C LC_TIME=C LC_COLLATE=C
 LC_MONETARY=C LC_MESSAGES=C LC_PAPER=C LC_NAME=C
 LC_ADDRESS=C LC_TELEPHONE=C LC_MEASUREMENT=C LC_IDENTIFICATION=C
attached base packages:
 stats graphics grDevices utils datasets methods base
loaded via a namespace (and not attached):
From: Roger Bivand <Roger.Bivand at nhh.no>
To: Empty Empty <phytophthorasb at yahoo.com>
Cc: "r-sig-geo at r-project.org" <r-sig-geo at r-project.org>
Sent: Sunday, April 7, 2013 12:15 PM
Subject: Re: [R-sig-Geo] Memory problems with dnearneigh in spdep
I do prefer an affiliation, a real name, and a motivation; a
yahoo or gmail address tells me nothing about what you might be expected
to know. If I (or a list helper) cannot "place" you, it is much harder to
know why you are having problems.
You are also in breach of list rules because you are posting in HTML rather than just plain text. Removing the HTML will shorten your posting radically.
On Sun, 7 Apr 2013, Empty Empty wrote:
> I'm running into memory problems calculating Gi* using dnearneigh and wonder if someone has any suggestions, either for my script or for another workaround.
> I have 1km data for Africa. My colleague calculated the appropriate distance band (9333m) using incremental spatial autocorrelation in ArcGIS, but she was only able to run a subset of the data, or the data aggregated to 5km, due to memory issues (fixed distance, Euclidean). I tried it in both ArcGIS (memory error after 45 hours) and R (within 2 hours it was using more than 245 GB on the server, and then it crashed). This is what I used:
Please always include the output of sessionInfo(). Here also include summary(r) output.
This is not needed.
> nb9333<- dnearneigh(coords, 0,9333,longlat = F)
nb9333 <- dnearneigh(r, 0, 9333)
would work, and would not override longlat=. In fact I assume that r is in longlat, and that you would then mean to use a km threshold of 9.333. If I'm right, the memory usage is being inflated by your including all observations as neighbours of each observation.
For a 1km grid, a threshold of 9333m means that each of the about 30M gridcells has 100 neighbours (rather strong smoothing). Is the threshold sensibly chosen? The neighbour object will be 30M by 100 by 4, 1.21e+10 (12G); the labels of the neighbour object at least the same; the weights 24G in addition, so nb9333.listw is about 50G. With a more sensible threshold (1.5km?), the sizes of the nb9333.listw weights object become more tractable, maybe about 5G.
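As a back-of-the-envelope check, the sizes above can be reproduced with simple arithmetic (a sketch; the cell and neighbour counts are the approximate figures quoted in the paragraph, not measured values):

```r
# Rough memory arithmetic for the neighbour and weights objects
# (assumed figures: ~30M grid cells, ~100 neighbours per cell,
# 4-byte integer neighbour indices, 8-byte double weights)
n_cells      <- 30e6
k_neighbours <- 100
nb_bytes     <- n_cells * k_neighbours * 4   # neighbour index storage
w_bytes      <- n_cells * k_neighbours * 8   # binary weights as doubles
nb_bytes / 1e9   # ~12 GB for the nb object alone
w_bytes  / 1e9   # ~24 GB for the weights
```

With a 1.5km threshold each cell has only a handful of neighbours, which is why the listw object then shrinks to a few gigabytes.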
Is any of the data observed on this grid? For localG* (or localG) to make any sense, the variable must be observed at this resolution (satellite imagery?) - if interpolated, all bets are off.
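At a small scale, the intended Gi* pipeline might look like the following (a sketch with simulated data; the point count of 100 and the 0.2 distance threshold are arbitrary choices for illustration):

```r
library(spdep)

set.seed(42)
coords <- cbind(runif(100), runif(100))  # simulated point locations
x <- rnorm(100)                          # simulated attribute values

nb <- dnearneigh(coords, 0, 0.2)         # distance-band neighbours
lw <- nb2listw(include.self(nb), style = "B")  # include self for Gi*
g_star <- localG(x, lw)                  # Gi* z-scores, one per point
summary(as.numeric(g_star))
```

The same steps applied to 30M cells are what produce the object sizes discussed above; the method itself is unchanged.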
Finally, the output involves so many multiple comparisons that no statistical inference is possible under normal assumptions. Where does the 3.886 come from - is it an attempt to adjust for multiple comparisons?
I'm unsure why anyone would do this, as you could use standard filtering techniques on imaging data. The raster package also has functions for computing the local Moran and Geary measures on RasterLayer objects.
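For example, the raster route avoids building a neighbour list entirely by working in a moving window over the grid (a minimal sketch; the toy 50x50 raster and random values stand in for the real 1km Africa data):

```r
library(raster)

# Toy grid standing in for the 1km Africa raster
r_small <- raster(nrows = 50, ncols = 50, xmn = 0, xmx = 50,
                  ymn = 0, ymx = 50)
values(r_small) <- rnorm(ncell(r_small))

# Local Moran's I computed cell-by-cell in a 3x3 moving window,
# so memory use stays proportional to the raster, not to n * k
lm_r <- MoranLocal(r_small)
plot(lm_r)
```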
Hope this clarifies,
> nb9333.listw <- nb2listw(include.self(nb9333), style="B")
> sig3 <- classIntervals(G9333, n=3, style="fixed",
>         fixedBreaks=c(min(G9333), -3.886, 3.886, max(G9333))) # I don't think 3.886 is right
> cs <- c('blue','white','red') # blue = below -3, white = intermediate, red = above 3
> sigcol <- findColours(sig3, cs)
> plot(r, col=sigcol, axes=TRUE)
-- Roger Bivand
Department of Economics, NHH Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; fax +47 55 95 95 43
e-mail: Roger.Bivand at nhh.no