[R-sig-Geo] Creating very large spatial weight matrix

Michael Sumner mdsumner at gmail.com
Fri Nov 19 00:39:30 CET 2010


In general you need at least twice the required memory, and it has to
be contiguous. Start a fresh instance of R and try to create a single
vector of that size; that will show whether you *could* do it at all.
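For example, a minimal sketch of that check (the 120,000 figure is
taken from your message below):

n <- 120000
n * n  # 1.44e10 elements for the full n x n distance matrix
# On R of this era the next line fails immediately, because no vector
# can hold more than 2^31 - 1 elements -- more RAM would not help:
try(vector("double", n * n))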

Otherwise, check out the ff package, and see other options in the High
Performance Computing Task View on CRAN.
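For example, a minimal sketch of the ff idea (assuming ff is
installed; shown with a small matrix, since whether ff can index the
full 120,000 x 120,000 case depends on its own limits -- see ?ff):

library(ff)
m <- ff(vmode = "double", dim = c(1000, 1000))  # backed by a file, not RAM
m[1, 2] <- 0.5  # chunks are paged in from disk on demand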

There may be other techniques you can use to attack the problem (one
sketch below), but those two suggestions are my direct answers to your
questions.
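For instance (a sketch only, and my own assumption about what you are
after): for a Moran test you may not need the dense matrix at all.
spdep can build a sparse distance-band neighbour list directly; dmax
(a cutoff in km) and x (the variable being tested) are placeholders
you would need to supply:

library(spdep)
coords <- cbind(Lon, Lat)                          # 120,000 x 2 matrix
nb <- dnearneigh(coords, 0, dmax, longlat = TRUE)  # neighbours within dmax km
lw <- nb2listw(nb, zero.policy = TRUE)             # allow no-neighbour points
moran.test(x, lw, zero.policy = TRUE)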

Cheers, Mike.

On Fri, Nov 19, 2010 at 10:28 AM, Aleksandr Andreev
<aleksandr.andreev at gmail.com> wrote:
> Hello list,
>
> I have 120,000 geocoded observations, for which I'm trying to create a
> distance-based spatial weighting matrix so that I can perform a Moran
> test.
>
> Each observation has Lat and Lon.
>
> Unfortunately, when I run
> dists <- as.matrix(dist(cbind(Lon, Lat)))
> I get the message:
> Error in vector("double", length) : vector size specified is too large
>
> Now I realize that 120,000^2 / 2 is about 7.2 billion entries, which
> at 8 bytes per double is on the order of 58 GB. However, I seem to be
> running into software limitations on the vector size before I hit RAM
> limitations. Also, in principle, it should be possible (though slow)
> to use hard disk space to store this matrix. Does anyone have ideas
> on how to do this in R?
>
> Thanks,
>
> ------------------------
> Aleksandr Andreev
> Graduate Student - Department of Economics
> University of North Carolina at Chapel Hill
>
> _______________________________________________
> R-sig-Geo mailing list
> R-sig-Geo at stat.math.ethz.ch
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>



-- 
Michael Sumner
Institute for Marine and Antarctic Studies, University of Tasmania
Hobart, Australia
e-mail: mdsumner at gmail.com


