[R-sig-Geo] dealing with large spatial data
Ozlem Yanmaz
ozlem.yanmaz at gmail.com
Fri Sep 24 18:54:25 CEST 2010
Dear fellow R users,
I am fairly new to spatial models. I have been using the "spdep" package
to model the spatial correlation between my data points, and the
"dnearneigh" and "knearneigh" functions to build the neighbourhood list.
I have no problem running these functions when the data set is small.
My problem is that the data set I will eventually be working on is
fairly large (more than 50,000 points), so I run into memory problems.
Is there a way to speed up this process, perhaps by creating the
neighbourhoods in clusters and combining them later into one large
weights matrix for the "spautolm" function?
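For reference, the workflow I am using looks roughly like this on a small
example; `coords` and `dat` are placeholder names standing in for my real
coordinates and data, and k = 4 is an arbitrary choice:

```r
library(spdep)

## placeholder data: 100 random points with a response and one covariate
set.seed(1)
coords <- cbind(runif(100), runif(100))
dat <- data.frame(y = rnorm(100), x = rnorm(100))

## k-nearest-neighbour list, symmetrised so spautolm accepts it
nb <- make.sym.nb(knn2nb(knearneigh(coords, k = 4)))

## row-standardised spatial weights
lw <- nb2listw(nb, style = "W")

## spatial autoregressive model
fit <- spautolm(y ~ x, data = dat, listw = lw)
summary(fit)
```

This runs fine at this size; it is the same sequence of calls applied to
50,000+ points that exhausts memory.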
Any ideas or suggestions are appreciated.
Thanks and regards,