[R-sig-Geo] how to get weights for inverse distance weighting

Waichler, Scott R Scott.Waichler at pnnl.gov
Sat Apr 8 00:01:26 CEST 2017


Hello, I am trying to get inverse distance weights to estimate values on a regular grid from a set of data points, over a sequence of times.  The locations of the data points don't vary with time, but their values change with each time instance, and in general some are NA.  I want to determine the number of unique weight matrices needed to do the IDW estimation across time.  I'm looking at gstat and spdep but am still having trouble with some of the basics... I should be more fluent with sp, I know.  The weight matrix has grid points for rows and data points for columns.

x.grid <- y.grid <- seq(0, 100, by=10)  # regular grid, locations I want to estimate z at
N <- 10  # number of data points
x.data <- runif(N, min=0, max=100)  # set locations of the data points
y.data <- runif(N, min=0, max=100)

w <- list()  # list to hold one weight matrix per time instance
# Loop thru 20 instances (a sequence of months in my real problem)
for(i in 1:20) {
  # set the z.data values to something between 0 and 1
  z.data <- runif(N, min=0, max=1)
  # set some of the z.data values to NA
  ind.na <- sample.int(N, size=round(runif(1, 0, N)))
  z.data[ind.na] <- NA

  # For this time, I want to get the matrix of weights (>= 0) that would be used in an inverse
  # distance weighting interpolation.  I would like to specify parameters like nmax, nmin, omax,
  # maxdist, and force as used in gstat().  Rows are grid points, columns are data points.
  # (A sketch of one possible approach follows this loop.)
  #w[[i]] <- matrix(data= ????, ncol=N, byrow=T)
}
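
Since gstat's idw() does not appear to expose the weight matrix it uses, here is a minimal sketch of one way the placeholder above could be filled in by computing the weights directly from the grid-to-data distances.  It assumes plain Euclidean distances, inverse-distance-power weights (idp = 2), simple nmax and maxdist cutoffs, and that no grid point coincides exactly with a data point; the function idw.weights and its argument names are illustrative only, not part of gstat or sp.

# Sketch: build an IDW weight matrix by hand (rows = grid points, columns = data points)
idw.weights <- function(gx, gy, dx, dy, z, idp=2, nmax=Inf, maxdist=Inf) {
  # distance from every grid point (rows) to every data point (columns)
  d <- sqrt(outer(gx, dx, "-")^2 + outer(gy, dy, "-")^2)
  w <- 1 / d^idp                # raw inverse-distance weights
  w[d > maxdist] <- 0           # ignore data points beyond maxdist
  w[, is.na(z)] <- 0            # ignore data points whose value is missing
  if (is.finite(nmax)) {        # keep only the nmax nearest usable points per grid point
    for (g in seq_len(nrow(w))) {
      ord <- order(d[g, ])
      usable <- ord[w[g, ord] > 0]
      if (length(usable) > nmax) w[g, usable[-seq_len(nmax)]] <- 0
    }
  }
  # normalize so each row sums to 1; rows with no usable neighbours become NaN
  sweep(w, 1, rowSums(w), "/")
}

# Example use with the toy data above (last month's z.data from the loop);
# the IDW estimate at the grid points is then just weights %*% values
grid.xy <- expand.grid(x=x.grid, y=y.grid)   # full estimation grid
w.i <- idw.weights(grid.xy$x, grid.xy$y, x.data, y.data, z.data)
z.grid <- w.i %*% ifelse(is.na(z.data), 0, z.data)  # NA columns already have zero weight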

Finally, how can I determine the number of unique w[[i]] matrices?  I know for my toy problem it will
in general be 20, but in my real problem it will be less than the number of times.
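
For the counting, a minimal sketch assuming the weight matrices are stored in the list w built in the loop above: unique() works on lists and keeps one copy of each element that is exactly equal, so it counts matrices that match entry for entry.  Since the data locations never move, each month's weight matrix is determined entirely by its pattern of missing data (for fixed nmax/maxdist settings), so the count of distinct NA patterns across months gives the same answer.

# Sketch: count distinct weight matrices in the list w
n.unique <- length(unique(w))
n.unique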

Thank you,
Scott Waichler
Pacific Northwest National Laboratory
Richland, WA    USA


