[R-sig-Geo] error in errorsarlm
Roger Bivand
Roger.Bivand at nhh.no
Tue Jul 20 16:24:51 CEST 2010
On Tue, 20 Jul 2010, elaine kuo wrote:
> Hello,
>
> Thanks for your continued help.
> Answering the previous questions first.
>
>
>>> 1. distance threshold
>>>
>>
>> But 9 what, metres or degrees? Should your code be
>> cbind(datam$lon,datam$lat), as the normal ordering is eastings then
>> northings, not the opposite?
>
>
> => the unit is degrees, and the lat/lon ordering has been revised (thanks)
>
>>
>>
>>
>>>
>>> 9 is the cut-off chosen to avoid empty neighbour sets
>>>
>>
>> This is not a good reason, as graph neighbours can be sparse and link over
>> longer distances.
>>
>>
> => According to a previous message on similar questions, it was mentioned
> that including empty neighbour sets could be meaningless.
> So, if empty neighbour sets are accepted here, is there anything to be
> cautious about?
Just do not use distance as a criterion when the observations are very
irregularly spaced! Use perhaps a symmetricised k-nearest neighbour
definition. And make sure that you do use Great Circle distances - you are
not doing so now. With 200 neighbours, you have no viable spatial process,
really.
You can accept no-neighbour observations when there is no good reason to
suppose that they could interact, but then have to attend to your
zero.policy settings. It all depends on what processes are thought to be
causing the observations to influence each other.
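For example, something along these lines (a minimal sketch only; k=6 is an
arbitrary illustration, and longlat=TRUE makes spdep measure Great Circle
distances in kilometres from decimal-degree coordinates):

  library(spdep)
  coords <- as.matrix(cbind(datam$lon, datam$lat))
  # k-nearest neighbours, symmetricised so that i gains j as a neighbour
  # whenever j has i; no observation can end up with an empty set
  knn6 <- knearneigh(coords, k=6, longlat=TRUE)
  nb.k6 <- knn2nb(knn6, sym=TRUE)
  summary(card(nb.k6))   # inspect the neighbour counts
  lw.k6 <- nb2listw(nb.k6, style="W")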
>
>>
>>
>>> 2. dataset
>>> length(nb10) = 4873
>>> sum(card(nb10)) = 885478
>>>
>>>
>> This gives an average of 181.7 neighbours per observation, and is the root
>> of the problem. The Chebyshev method uses a set of low powers of the
>> weights, which fill in quickly for large average numbers of neighbours. Did
>> you try the Matrix method (your error message suggests that you may have
>> done, but the code does not)?
>>
>> => "Matrix" tried but failed. The warning and code as below..
>
>
> 1. change method to Matrix
>
> Error in solve.default(-(mat), tol.solve = tol.solve) :
>
> System calculation is specific, the condition is =5.14146e-17 (translated
> from Chinese)
>
> Warning messages:
The warnings are not a problem, but the error suggests scaling problems in the
covariance matrix of the coefficients. The X variables are most likely
measured in inappropriate metrics (some coefficients of size 1000, others
of size 0.0001, for example).
Please find local help to resolve the underlying problems, related to why
you want these weights, and why you are trying to fit this kind of model
(that is: is your research question sensible and what authority
(literature) do you have for your choices?). Solve all of the problems
first on a small subset of your data, and scale up afterwards.
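As an illustration only (a sketch assuming the nb9.w weights and the column
names used in the code quoted below; scale() is base R):

  # Centre and scale the explanatory variables so that the estimated
  # coefficients are on comparable scales before refitting
  vars <- c("topo_mean", "coast", "prec_max", "temp_min", "evi_min")
  datam[vars] <- scale(datam[vars])
  # Bare variable names in the formula let na.action do its job
  sem <- errorsarlm(WinterM_ratio ~ topo_mean + coast + prec_max +
                    temp_min + evi_min, data=datam, listw=nb9.w,
                    na.action=na.omit, method="Matrix", zero.policy=TRUE)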
Roger
>
> 1: In determinant(x, TRUE) : This version of the Matrix package returns
>    |determinant(L)| instead of determinant(A), i.e., a *DIFFERENT* value.
>    If still necessary, do change your code, following
>    http://matrix.r-forge.r-project.org
>
> 2: In powerWeights(W = W, rho = lambda, order = con$pWOrder, X = B, :
>    not converged within order iterations
>
> 3: In powerWeights(W = t(W), rho = lambda, order = con$pWOrder, X = C, :
>    not converged within order iterations
>
> Code:
>
> rm(list=ls())
>
> datam <- read.csv("c:/migration/M_R_2090707_winterM.csv", header=TRUE,
>                   row.names=1)
> dim(datam)
> datam[1,]
>
> library(ncf)
> library(spdep)
>
> # Define coordinates, neighbours, and spatial weights
> coords <- as.matrix(cbind(datam$lon, datam$lat))
>
> # Define neighbourhood (here distance 9)
> nb9 <- dnearneigh(coords, 0, 9)
> length(nb9)
> sum(card(nb9))
>
> # Spatial weights, illustrated with coding style "W" (row standardized)
> nb9.w <- nb2listw(nb9, glist=NULL, style="W", zero.policy=FALSE)
>
> # Spatial SAR error model
> sem.nb9.w <- errorsarlm(datam$WinterM_ratio ~ datam$topo_mean + datam$coast +
>                         datam$prec_max + datam$temp_min + datam$evi_min,
>                         data=datam, listw=nb9.w, na.action=na.omit,
>                         method="Matrix", zero.policy=TRUE, tol.solve=1.0e-17)
>
>>
>>> 3. memory size
>>> the memory limit is 1535 Mb, which is much larger than 121.8 Mb.
>>>
>>>
>> Windows (your OS?) memory handling is very poor for larger objects, and the
>> 121.8 Mb is the extra memory needed.
>
> => OS: Windows Vista Home
>
>> However, the main problem is your design of weights, which are far too
>> dense to make sense. Choose an alternative weights definition and try again.
>>
>
> => 2. Decrease the neighbour distance from 9 to 5:
>
> Warning messages:
>
> 1: In powerWeights(W = W, rho = lambda, order = con$pWOrder, X = B, :
>    not converged within order iterations
>
> 2: In powerWeights(W = t(W), rho = lambda, order = con$pWOrder, X = C, :
>    not converged within order iterations
>
> Code:
>
> rm(list=ls())
>
> datam <- read.csv("c:/migration/M_R_20100707_winterM.csv", header=TRUE,
>                   row.names=1)
> dim(datam)
> datam[1,]
>
> library(ncf)
> library(spdep)
>
> # Define coordinates, neighbours, and spatial weights
> coords <- as.matrix(cbind(datam$lon, datam$lat))
>
> # Define neighbourhood (here distance 5)
> nb5 <- dnearneigh(coords, 0, 5)
>
> # Spatial weights, illustrated with coding style "W" (row standardized)
> nb5.w <- nb2listw(nb5, glist=NULL, style="W", zero.policy=TRUE)
>
> # Spatial SAR error model
> sem.nb5.w <- errorsarlm(datam$WinterM_ratio ~ datam$topo_mean + datam$coast +
>                         datam$prec_max + datam$temp_min + datam$evi_min,
>                         data=datam, listw=nb5.w, na.action=na.omit,
>                         method="Chebyshev", zero.policy=TRUE, tol.solve=1.0e-17)
>
> Elaine
>
--
Roger Bivand
Economic Geography Section, Department of Economics, Norwegian School of
Economics and Business Administration, Helleveien 30, N-5045 Bergen,
Norway. voice: +47 55 95 93 55; fax +47 55 95 95 43
e-mail: Roger.Bivand at nhh.no