[R-sig-Geo] time-weighted kernel density interpolation

stepen.condors at gmail.com
Thu Mar 31 05:30:54 CEST 2011


Many thanks for these references / library Clément - they are very useful.

Best 
Stepen

On 25/03/2011, at 7:59 PM, Clément Calenge wrote:

> I am not sure, but wouldn't the function kernelkcbase from the package adehabitatHR fit your needs? It implements the product kernel estimator over space and time: for a given date, it estimates the kernel density of the points, with the point weights depending on their associated dates (see the examples of this function). This approach is developed in detail on p. 91 and pp. 103 and following in:
> 
> @BOOK{Wand1995,
>  title = {Kernel smoothing},
>  publisher = {Chapman \& Hall/CRC},
>  year = {1995},
>  author = {Wand, M.P. and Jones, M.C.}
> }
> 
> It was introduced in ecology for the smoothing of space utilization by animals by:
> 
> @ARTICLE{Keating2009,
>  author = {Keating, K.A. and Cherry, S.},
>  title = {Modeling utilization distributions in space and time},
>  journal = {Ecology},
>  year = {2009},
>  volume = {90},
>  pages = {1971-1980}
> }
> 
> And has been used for the modelling of ring recoveries by:
> 
> @ARTICLE{Calenge2010,
>  author = {Calenge, C. and Guillemain, M. and Gauthier-Clerc, M. and Simon,
>    G.},
>  title = {{A new exploratory approach to the study of the spatio-temporal distribution
>    of ring recoveries: the example of Teal (Anas crecca) ringed in Camargue,
>    Southern France}},
>  journal = {Journal of Ornithology},
>  year = {2010},
>  pages = {1--6}
> }
> 
> HTH,
> 
> Clément Calenge
> 
> 
> On 03/25/2011 08:05 AM, stepen.condors at gmail.com wrote:
>> OK Michael, I will attempt to explain my requirements a little more thoroughly, along with some of the options I have started to explore.
>> I begin by outlining what I would like to do at a functional level.
>> 
>> 1. I have a set of observations, each with associated x and y co-ordinates and an age in days - i.e. this observation was made at this point 3 days ago, etc.
>> 2. I want to create a grid over the area in which the points are found. Each cell of this grid has an associated z-score.
>> 3. I now step through the point set; for each point I find which grid cell it lies in and then add some value to the z-score of that cell and of the cells within a finite distance of it. These intensity scores depend on the age of the observation and the distance of the grid cell from the point, and are looked up from some other data structure such as a matrix, e.g.:
>> 
>> 			Distance (m)
>> Age (days)	0-100	100-200	200-300	300-400
>> 0-1		0.8	0.7	0.6	0.5
>> 2-3		0.6	0.5	0.4	0.3
>> 4+		0.4	0.3	0.2	0.1
>> 
>> Thus, an observation that occurred today adds 0.8 to the z-score of the grid cell in which it lies, 0.7 to cells within 100-200 m, 0.6 to those within 200-300 m, and so on. On the other hand, an observation that is 5 days old adds only 0.4 to the cell in which it lies, 0.3 to those within 100-200 m, etc.
>> 
>> It is important that this z-score per grid cell is cumulative so that if multiple points are near one another some grid cells might have numerous values added to their z score.
>> 
>> Describing the above in pseudo code:
>> 
>> points = (x,y,age)
>> grid = makegrid(nrows,ncolumns)
>> 
>> for point in points
>> {
>> 	for cell in grid
>> 	{
>> 		if point iswithin cell
>> 		{
>> 			z of cell = z of cell + matrix[point.age,1]
>> 			
>> 			for nearcell within 100-200m of cell
>> 			{
>> 				z of nearcell = z of nearcell + matrix[point.age,2]
>> 			}	
>> 
>> 			for nearcell within 200-300m of cell
>> 			{
>> 				z of nearcell = z of nearcell + matrix[point.age,3]
>> 			}					
>> 			
>> 			for nearcell within 300-400m of cell
>> 			{
>> 				z of nearcell = z of nearcell + matrix[point.age,4]
>> 			}
>> 		}
>> 	}
>> }		
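The pseudocode above can be sketched directly in plain R without any spatial packages, using a matrix as the grid and a distance-band lookup for the weights. All names, the grid dimensions, and the point set below are illustrative assumptions, not part of the original question:

```r
## Minimal R sketch of the cumulative grid update described above.
set.seed(1)
pts <- data.frame(x   = runif(100, 0, 1000),
                  y   = runif(100, 0, 1000),
                  age = sample(0:6, 100, replace = TRUE))

cellsize <- 10
nx <- 100; ny <- 100
z  <- matrix(0, nrow = ny, ncol = nx)        # cumulative z-score per cell
cx <- (seq_len(nx) - 0.5) * cellsize         # cell-centre x coordinates
cy <- (seq_len(ny) - 0.5) * cellsize         # cell-centre y coordinates

wt <- rbind(c(0.8, 0.7, 0.6, 0.5),           # age 0-1 days
            c(0.6, 0.5, 0.4, 0.3),           # age 2-3 days
            c(0.4, 0.3, 0.2, 0.1))           # age 4+  days

for (i in seq_len(nrow(pts))) {
  arow <- min(pts$age[i] %/% 2 + 1, 3)       # age band: 0-1 / 2-3 / 4+
  ## distance from every cell centre to the point (ny x nx matrix)
  d    <- sqrt(outer((cy - pts$y[i])^2, (cx - pts$x[i])^2, "+"))
  band <- findInterval(d, c(0, 100, 200, 300, 400))  # 1..4 within 400 m
  near <- band <= 4
  z[near] <- z[near] + wt[arow, band[near]]  # cumulative update
}
```

Because the update is additive, cells near several points accumulate several contributions, as required. Vectorising over cells (rather than looping over them, as in the pseudocode) is what makes this feasible for large grids.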
>> 
>> On paper this doesn't seem overly complicated, but I am struggling to find the right libraries to do it efficiently.
>> 
>> 
>> I have attempted a few different methods - but unfortunately I seem to be hitting brick walls.
>> I will list a few options I have investigated below:
>> 
>> THE POINTS:
>> Say, for example, my data set is made up as follows:
>> 	
>> 	xs = sample(1:1000, 100, replace = TRUE)
>> 	ys = sample(1:1000, 100, replace = TRUE)
>> 	age = sample(1:30,100, replace = TRUE)
>> 
>> I can then generate a ppp as follows:
>> 
>> 	test.ppp = ppp(xs, ys, xrange=c(1,1000), yrange=c(1,1000))
>> 
>> However, this only associates the x,y co-ords with the points and not the age.
>> Maybe I can use age as the mark variable, as follows:
>> 	
>> 	test.ppp = ppp(xs, ys, marks = age, xrange=c(1,1000), yrange=c(1,1000))
>> 
>> However, this only seems to affect the visual display of the data.
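For what it is worth, marks can do more than change the display: density.ppp in spatstat accepts a weights argument, so a time-decay weight derived from the ages can be passed straight in, which is essentially the time-weighted density surface the thread is after. A minimal sketch; the exponential decay, the temporal scale tau and the bandwidth sigma are illustrative choices, not values from the thread:

```r
library(spatstat)

xs  <- sample(1:1000, 100, replace = TRUE)
ys  <- sample(1:1000, 100, replace = TRUE)
age <- sample(1:30, 100, replace = TRUE)

test.ppp <- ppp(xs, ys, xrange = c(1, 1000), yrange = c(1, 1000),
                marks = age)

## exponential time decay: recent observations get larger weights
tau <- 10                                # temporal scale in days (assumed)
w   <- exp(-marks(test.ppp) / tau)

## weighted kernel density surface; sigma is the spatial bandwidth
dens <- density(test.ppp, sigma = 100, weights = w)
plot(dens)
```

Replacing the exponential with a linear or step decay only changes the one line computing w.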
>> 
>> Perhaps I am better off creating a simple object as follows:
>> 	
>> 	observations<- cbind(xs,ys,age)
>> 
>> But then this has no spatial data associated with it.
>> 
>> THE GRID:
>> In making the grid I can do several things:
>> Using pixellate I can count the number of observations in a grid whose cell size I specify, as follows (100 × 100 unit cells):
>> 
>> 	Z<- pixellate(test.ppp, eps=100)
>> 
>> I can then convert the grid to a data.frame and view it:
>> 	
>> 	> as.data.frame(Z)
>> 	     x    y  value
>> 	1   50   50      0
>> 	2   50  150      3
>> 	3   50  250      0
>> 	4   50  350      2
>> 
>> However, the points are already aggregated as counts, whereas I need to know the age of each point in order to calculate a z-score for each cell. E.g. grid cell 2 has three points in it, and I need the age of each.
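One way to keep the ages rather than pre-aggregated counts is to bin the raw coordinates directly: assign each point a cell index with findInterval() and then collect the ages per cell with split(). A sketch, reusing the illustrative 1000 × 1000 window and 100-unit cells; the decay function in the last line is an assumption:

```r
xs  <- sample(1:1000, 100, replace = TRUE)
ys  <- sample(1:1000, 100, replace = TRUE)
age <- sample(1:30, 100, replace = TRUE)

eps  <- 100                                   # cell size
brks <- seq(0, 1000, by = eps)
ix   <- findInterval(xs, brks, rightmost.closed = TRUE)
iy   <- findInterval(ys, brks, rightmost.closed = TRUE)
cell <- paste(ix, iy, sep = "-")              # one id per grid cell

## ages of the points falling in each occupied cell
ages.by.cell <- split(age, cell)

## e.g. an age-based z-score per occupied cell (decay is illustrative)
z <- sapply(ages.by.cell, function(a) sum(exp(-a / 10)))
```

This keeps the full per-point information up to the point where it is summarised, which pixellate's count output does not.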
>> 
>> An alternative seems to be to create a polygon object for each grid cell, as follows (10 × 10 unit cells over a 0-1000 window, i.e. a 100 × 100 grid):
>> 
>>     for(x in 0:99)
>>     {
>>         for(y in 0:99)
>>         {
>>             cellX = x * 10
>>             cellY = y * 10
>>             # ring closed by repeating the first vertex
>>             grid_cell_dim = cbind(x = c(cellX, cellX + 10, cellX + 10, cellX, cellX),
>>                                   y = c(cellY + 10, cellY + 10, cellY, cellY, cellY + 10))
>>             polygon(grid_cell_dim)   # note: polygon() only draws on an open plot
>>         }
>>     }
>> 
>> Once I have created these polygons I am unsure how I can access them or associate a z-score with each. When I messed about with SpatialPolygonsDataFrame it kept telling me 'ring not closed' - presumably because each ring must repeat its first vertex as its last.
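Rather than building rings by hand, sp can produce the same regular grid directly as a SpatialGridDataFrame, which carries one z value per cell and involves no rings at all. A sketch under the same assumed 100 × 100 grid of 10-unit cells; note that getGridIndex() is an sp helper and the exact call below is my assumption about its usage, as is the flat 0.8 contribution per point:

```r
library(sp)

## regular 100 x 100 grid of 10-unit cells over (0,1000) x (0,1000)
gt  <- GridTopology(cellcentre.offset = c(5, 5),
                    cellsize          = c(10, 10),
                    cells.dim         = c(100, 100))
grd <- SpatialGrid(gt)

## one z-score slot per cell, initialised to zero
zdf <- SpatialGridDataFrame(grd, data.frame(z = numeric(100 * 100)))

## which cell does each point fall in?
xy  <- cbind(x = runif(50, 0, 1000), y = runif(50, 0, 1000))
idx <- getGridIndex(xy, gt)             # assumed usage of sp's helper

## cumulative update: tabulate() copes with several points per cell
zdf$z <- zdf$z + 0.8 * tabulate(idx, nbins = 100 * 100)
```

With the age-and-distance weight matrix from earlier in the thread, the flat 0.8 would be replaced by a per-point, per-band lookup as in the plain-matrix sketch above.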
>> 
>> 
>> Apologies for the long-winded message, but as you can see I am struggling to find the best course of action for what seems a relatively simple task. Any advice would therefore be greatly appreciated: are any of these options viable? Is there a much simpler solution?
>> 
>> Best Regards
>> Stepen
>> 
>> 
>> On 24/03/2011, at 6:14 AM, Michael Sumner wrote:
>> 
>>> I'm not sure if anyone could give advice here without knowing more about
>>> the data, but perhaps ?density.ppp or ?pixellate.ppp in spatstat would
>>> be useful.
>>> 
>>> Cheers, Mike.
>>> 
>>> On Tue, Mar 22, 2011 at 4:05 PM,<stepen.condors at gmail.com>  wrote:
>>>> Thanks for this Michael - I am investigating the trip library as we speak.
>>>> 
>>>> Another (potentially stupid) query:
>>>> 
>>>> I have a point pattern (as a ppp):
>>>> 
>>>> test.ppp = ppp(test.eastings, test.northings, xrange=c(1000000, 2000000), yrange=c(1000000,2000000))
>>>> 
>>>> I'd like to generate a grid over this area storing an associated intensity score with each grid cell - so that where points intersect that grid I can add some value to the intensity score of the cell they lie in (and surrounding ones ideally).
>>>> 
>>>> What grid method would you recommend? And how might you go about calculating the intersection of points and polygons?
>>>> 
>>>> Many Thanks
>>>> Stepen
>>>> 
>>>> 
>>>> On 21/03/2011, at 6:46 AM, Michael Sumner wrote:
>>>> 
>>>>> Hi Stepen,
>>>>> 
>>>>> The function tripGrid in package trip does something similar to this
>>>>> for the time interval between points ("time spent in area") - by using
>>>>> spatstat's density function on the line segments. The KDE approach is
>>>>> there for exploration though I think the straight grid approach is
>>>>> usually better (neither method has any systematic handling for
>>>>> location error).
>>>>> 
>>>>> I've long meant to generalize the tripGrid function so that it's not
>>>>> so tied to the time interval and the user could specify the
>>>>> "weighting" - but that always brings up the issue of whether it is the
>>>>> points or the implicit line segments between them that are of
>>>>> interest.
>>>>> 
>>>>> The density.ppp function in spatstat could be used to do this, though
>>>>> I wonder what kind of result you are expecting from this method?
>>>>> 
>>>>> Some of the approaches in adehabitat (and its new family of packages)
>>>>> could be useful too.
>>>>> 
>>>>> Cheers, Mike.
>>>>> 
>>>>> 
>>>>> 
>>>>> On Mon, Mar 21, 2011 at 3:05 AM,<stepen.condors at gmail.com>  wrote:
>>>>>> Hi all
>>>>>> I am currently looking into developing some sort of time-weighted kernel density interpolation in R. It is my aim to build something which allows me to do the following:
>>>>>> 
>>>>>> - Import a point pattern p with associated times for each event
>>>>>> - Plot a time weighted kernel density map of p - such that more recent events have a greater weighting than those that occurred earlier.
>>>>>> - Ideally it would be useful to specify both the spatial and temporal bandwidths and the decay function types i.e. linear, exponential
>>>>>> - Import new point data and assign an intensity score to each point from its location on time-weighted kernel density surface.
>>>>>> 
>>>>>> Is this something that is relatively easily doable in R or am I crazy?
>>>>>> I have developed something similar previously in both C++ and VB.NET (cough) interfacing directly with MapInfo so I am not afraid of code.
>>>>>> However, as I am still relatively new to R I imagined that there are likely better and worse ways to do this and/or libraries to look at. I have checked out the sp and spatstat libraries, but neither seems to have much to do with time; next I am considering trip. Where would you start? I would hate to waste time implementing something just because I was unaware of an existing solution.
>>>>>> 
>>>>>> Therefore any advice, suggestions, experiences or abuse concerning how I might accomplish this functionality would be greatly appreciated.
>>>>>> 
>>>>>> Best
>>>>>> Stepen
>>>>>> _______________________________________________
>>>>>> R-sig-Geo mailing list
>>>>>> R-sig-Geo at r-project.org
>>>>>> https://stat.ethz.ch/mailman/listinfo/r-sig-geo
>>>>>> 
>>>>> 
>>>>> 
>>>>> --
>>>>> Michael Sumner
>>>>> Institute for Marine and Antarctic Studies, University of Tasmania
>>>>> Hobart, Australia
>>>>> e-mail: mdsumner at gmail.com
>>> 
>>> 
>>> -- 
>>> Michael Sumner
>>> Institute for Marine and Antarctic Studies, University of Tasmania
>>> Hobart, Australia
>>> e-mail: mdsumner at gmail.com
>> 
>> 
> 
> 
> -- 
> Clément CALENGE
> Cellule d'appui à l'analyse de données
> Direction des Etudes et de la Recherche
> Office national de la chasse et de la faune sauvage
> Saint Benoist - 78610 Auffargis
> tel. (33) 01.30.46.54.14
> 


