[R] voronoi.mosaic chokes?
Andrew Pierce
adp179 at psu.edu
Wed May 9 06:20:58 CEST 2007
Hi all,
I am running R 2.5.0 under Windows XP Media Center Edition. Here's a
problem that's been stumping me for a few days now, and I can't find
anything useful in the archives. I am using voronoi.mosaic (tripack
package) to create proximity polygons for a study of vegetation
competition and dynamics. The point lists are read in from a file for
each plot, and then eight translated duplicates are placed around the
edges of the plot (toroidal edge correction). This is done with a
torus(...) function that I wrote myself.
VMuncorrected is the Voronoi mosaic that is not toroidally edge corrected.
VMcorrected is the Voronoi mosaic that is toroidally edge corrected.
> library(tripack)
> treemap = read.table('af1.txt', sep = "\t", header = TRUE)
> VMuncorrected = voronoi.mosaic(treemap$X, treemap$Y)
### Use the torus function to create 8 copies around the study region
> toroid = torus(treemap$X, treemap$Y, 25, 25)
> VMcorrected = voronoi.mosaic(toroid[,1], toroid[,2], duplicate = "remove")
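For those following along without the attachment, torus() does something
roughly like the sketch below. This is a simplified version rather than my
exact code (which I can send); it assumes the last two arguments are the
plot width and height.

torus <- function(x, y, xlen, ylen) {
    ## shifts of -len, 0, +len in each direction give the original plot
    ## plus its 8 translated copies (9 blocks of points in total)
    shifts <- expand.grid(dx = c(-xlen, 0, xlen), dy = c(-ylen, 0, ylen))
    cbind(rep(x, times = nrow(shifts)) + rep(shifts$dx, each = length(x)),
          rep(y, times = nrow(shifts)) + rep(shifts$dy, each = length(y)))
}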
Here's the problem. When I read in the points for the data file listed
above ('af1.txt'), both calls to voronoi.mosaic() work fine. (The
second one takes about 1.5 seconds because there are 1147 points in the
toroidally corrected set).
However, when I read in the points from the next file ('af2.txt'), the
first call to voronoi.mosaic() works. The next call (to torus()) also
works fine. But the second call to voronoi.mosaic() causes R to freeze
completely, requiring Ctrl-Alt-Del.
I have 10 sets of points and this problem happens for about 5 of them.
Factors I have ruled out:
-too many points in the call (one set had 1147 and worked fine while the
next set had 801 and froze R)
-duplicate points (taken care of by voronoi.mosaic(..., duplicate =
"remove") and also independently verified by exporting the data; there are
no duplicates in either the original or the toroidally corrected set. See
also the check after this list.)
-points too close together in space (the minimum distance between two
points in 'af1.txt' is 0.1414 and that set works fine; the minimum
distance in the second set, 'af2.txt', is 0.2236, and that set causes R
to freeze)
-not enough memory (each data set was run in a fresh R session, i.e. R
was quit between attempts)
-'flukiness' (the problem happens the same way every time for the
problem data sets, and the code runs fine every time for the non-problem
data sets)
-file formats (each text file has the same number of columns, all the
labels for the columns are identical, and the columns are always in the
same order)
-outdated versions (I am using R 2.5.0 and updated the tripack package
within the last week; I also update packages about once a month)
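For what it's worth, the duplicate and minimum-distance checks can also be
reproduced directly in R (treemap and toroid as in the code above); the
snippet below is just one way to do it.

## any exact duplicates in the original or the toroidally corrected set?
any(duplicated(data.frame(treemap$X, treemap$Y)))
any(duplicated(data.frame(toroid[,1], toroid[,2])))

## minimum pairwise distance between points in the original set
min(dist(cbind(treemap$X, treemap$Y)))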
This is a very frustrating problem because I get no errors pointing to
anything wrong with the data, and I have checked the data over and over
without finding any kind of error. If anyone has ANY helpful suggestions,
I would love to hear them.
Andrew
P.S. For those of you who are really intrigued, I can email you the
.txt files and the code for the torus() function.