[R-sig-Geo] Time series in spatial regression model (spautolm)

Roger Bivand Roger.Bivand at nhh.no
Sat Oct 10 15:11:17 CEST 2015


You cross-posted to this list and r-help - never do that - it splinters 
any replies. I have chosen to ignore your r-help posting and hope that 
others will do the same.

On Fri, 9 Oct 2015, Tobias Rüttenauer wrote:

> Dear r-sig-geo team,
>
> I started working with spatial analysis some months ago, so I'm quite new
> (and inexperienced) in this field. However, my aim is to connect time series
> analysis with spatial analysis, which seems to be quite difficult (to me).
>
>

Why are you not using the splm package, which provides most of the models 
you might ever need to fit? Your manipulation of the weights matrix by 
years is a Kronecker product - used in that package. Study the plm package 
first, run your model aspatially with that, and move on to splm.
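A rough, untested sketch of both points (variable names are taken from your 
post; "pdata" is a hypothetical long-format data frame with id and year 
columns, so adjust to your data):

# The block-diagonal "same-year-only" weights you assembled by hand are
# kronecker(I_T, W), with W the single-year 402 x 402 matrix and T = 5 years
# (blocks in year-major order, matching your row labels):
library(spdep)
W <- nb2mat(data2.nb, style = "W", zero.policy = TRUE)
nT <- 5
big.lw <- mat2listw(kronecker(diag(nT), W), style = "W")

# Aspatial panel model first with plm, then a spatial error panel model
# with unit fixed effects ("within") using splm:
library(plm)
library(splm)
pmod <- plm(sqrt(fortzuege_gem) ~ v1 + v2, data = pdata,
            index = c("id", "year"))
smod <- spml(sqrt(fortzuege_gem) ~ v1 + v2, data = pdata,
             index = c("id", "year"), listw = nb2listw(data2.nb),
             model = "within", spatial.error = "b")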

Roger

> The dataset I am working with is a spatial polygons data frame containing
> 402 spatial polygons for the years 2007-2011.
>
> In a first step, I want to estimate a SAR model which accounts only for the
> spatial autoregressive error term within a year. So what I am basically
> trying to do is construct a weights list object containing only weights for
> neighbors in the same year (excluding the linkages to the "self" spatial
> unit in other years and to neighboring units in other years). What I tried
> was to expand the original weights matrix by duplicating the original matrix
> on the main diagonal while filling all the other linkages (e.g. linkages
> between 2007 spatial units and 2008 spatial units) with zero, using the
> following code:
>
>
> # Queens links:
>> data.nb<-poly2nb(data_subset.spdf)
>>
>> # Remove temporal links
>> data2.nb<-aggregate(data.nb, data_subset.spdf$id, remove.self = TRUE)
>>
>> # Get matrix
>> tmp.mat<-nb2mat(data2.nb)
>>
>> # Expand matrix
>> zero1.mat<-matrix(0, 402, 402)
>> zero2.mat<-matrix(0, 402, 402)
>> zero3.mat<-matrix(0, 402, 402)
>> zero4.mat<-matrix(0, 402, 402)
>> zero5.mat<-matrix(0, 402, 402)
>>
>> tmp2.mat<-tmp.mat
>> tmp3.mat<-tmp.mat
>> tmp4.mat<-tmp.mat
>> tmp5.mat<-tmp.mat
>>
>> row.names(zero1.mat)<-paste("2007", as.numeric(row.names(tmp.mat)), sep="_")
>> row.names(zero2.mat)<-paste("2008", as.numeric(row.names(tmp.mat)), sep="_")
>> row.names(zero3.mat)<-paste("2009", as.numeric(row.names(tmp.mat)), sep="_")
>> row.names(zero4.mat)<-paste("2010", as.numeric(row.names(tmp.mat)), sep="_")
>> row.names(zero5.mat)<-paste("2011", as.numeric(row.names(tmp.mat)), sep="_")
>>
>> row.names(tmp.mat)<-paste("2007", as.numeric(row.names(tmp2.mat)), sep="_")
>> row.names(tmp2.mat)<-paste("2008", as.numeric(row.names(tmp2.mat)), sep="_")
>> row.names(tmp3.mat)<-paste("2009", as.numeric(row.names(tmp3.mat)), sep="_")
>> row.names(tmp4.mat)<-paste("2010", as.numeric(row.names(tmp4.mat)), sep="_")
>> row.names(tmp5.mat)<-paste("2011", as.numeric(row.names(tmp5.mat)), sep="_")
>>
>> tmp1<-rbind(tmp.mat, zero2.mat, zero3.mat, zero4.mat, zero5.mat)
>> tmp2<-rbind(zero1.mat, tmp2.mat, zero3.mat, zero4.mat, zero5.mat)
>> tmp3<-rbind(zero1.mat, zero2.mat, tmp3.mat, zero4.mat, zero5.mat)
>> tmp4<-rbind(zero1.mat, zero2.mat, zero3.mat, tmp4.mat, zero5.mat)
>> tmp5<-rbind(zero1.mat, zero2.mat, zero3.mat, zero4.mat, tmp5.mat)
>>
>> nb.mat<-cbind(tmp1, tmp2, tmp3, tmp4, tmp5)
>>
>> data_sub.lw<-mat2listw(data.matrix(nb.mat))
>>
>> any(is.na(nb.mat))
> [1] FALSE
>
> So I get a weights list object with 2010 observations (5 years of 402
> observations each, matching the 2010 observations in the spatial polygons
> data frame), but after running a spautolm model, I get the following error
> message:
>
>> spreg.mod<-spautolm(sqrt(fortzuege_gem) ~ v1 + v2,
> +                 data=data_subset.spdf, listw=data_sub.lw, weights=area_sqkm,
> +                 zero.policy=TRUE)
> Error in subset.listw(listw, subset, zero.policy = zero.policy) :
>  Not yet able to subset general weights lists
>
> Elsewhere, it is mentioned that this error message occurs if the weights
> matrix contains missing values, but that's not the case here. I assume that
> there is some mistake in creating the weights matrix. Do you have any ideas?
>
>
> Another thing I was trying to estimate is a SAR model with unit fixed
> effects, just by including the id dummies in the equation (for this test I
> use the full weights list object, containing all linkages).
>
>> data_total.nb<-poly2nb(data_subset.spdf)
>> data_total.lw<-nb2listw(data_total.nb, style="W")
>>
>> spreg_false.mod<-spautolm(sqrt(fortzuege_gem) ~ id + v1 + v2,
> +                     data=data_subset.spdf, listw=data_total.lw, weights=area_sqkm,
> +                     zero.policy=TRUE)
> Error in solve.default(crossprod(X, as.matrix(IlW %*% X)), tol = tol.solve) :
>  system is computationally singular: reciprocal condition number = 1.01026e-16
>>
>
> So I assume there is some conflict between using ID dummies and a weights
> matrix in one model? Is there any way to solve this problem? This may be a
> stupid question for someone who is (totally) aware of the mathematics behind
> the model.
>
> I would be really happy about any help!
>
> Thank you in advance and best wishes,
> Tobi
>
> _______________________________________________
> R-sig-Geo mailing list
> R-sig-Geo at r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>

-- 
Roger Bivand
Department of Economics, Norwegian School of Economics,
Helleveien 30, N-5045 Bergen, Norway.
voice: +47 55 95 93 55; fax +47 55 95 91 00
e-mail: Roger.Bivand at nhh.no

