[R-sig-Geo] reading PostGIS table into sp data frame

Edward Vanden Berghe evberghe at gmail.com
Wed Dec 19 14:48:29 CET 2012


I wanted to create a global map with squares in lat-lon. I have PostGIS tables to define these squares – but I haven’t been able to figure out an efficient way of reading those tables into R. The code I am using now is:

       library(RPostgreSQL)  # dbGetQuery()
       library(rgeos)        # readWKT()
       library(sp)           # CRS() and rbind methods for Spatial* objects

       crs <- CRS("+proj=longlat +ellps=WGS84")
       s <- "select id, st_astext(geom) as geom from geo.cs10d;"
       r <- dbGetQuery(con, s)
       p <- readWKT(r$geom[1], id = r$id[1], p4s = crs)
       for (i in 2:length(r$id)) {
              p <- rbind(p, readWKT(r$geom[i], id = r$id[i], p4s = crs))
       }

where geo.cs10d is the table with the squares, id is the table's primary key, and geom is the (binary) geometry column, converted to WKT with st_astext in the query.

The code above works fine for the larger squares, such as 10 degrees, of which I only need 648 to cover the globe. For finer resolutions it just takes too long, I assume because the rbind function rewrites the whole sp object each time it executes. I have seen other R scripts that initialise an empty data frame of the correct length to get around similar problems with rbind, but I haven't been able to find an equivalent for spatial polygons. How can I initialise an empty object with the right structure and the right length?
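One idea I have been toying with (just a sketch, untested, and assuming readWKT and sp's rbind method behave as I expect) is to read every geometry into a list first and combine them in a single call, instead of growing the object inside the loop:

       # Sketch only: read all the WKT strings first, then combine once.
       # Assumes r and crs are as in the code above; I have not timed this.
       polys <- lapply(seq_along(r$id), function(i) {
              readWKT(r$geom[i], id = r$id[i], p4s = crs)
       })
       p <- do.call(rbind, polys)

I don't know whether a single do.call(rbind, ...) actually avoids the repeated copying, though, which is why I ask about preallocation.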

Even better would be a single function that loads a complete PostGIS table, rather than having to read the polygons one by one in a loop. Is there such a function?
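The kind of thing I have in mind is rgdal's readOGR() using the OGR PostgreSQL driver, roughly as sketched below; but the connection parameters are placeholders for my own database, and I am not sure the PostgreSQL driver is included in the Windows build of rgdal:

       # Sketch, assuming the OGR PostgreSQL driver is available in rgdal;
       # dbname, host and user below are placeholders, not my real settings.
       library(rgdal)
       dsn <- "PG:dbname=mydb host=localhost user=me"
       squares <- readOGR(dsn, layer = "geo.cs10d")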

I’m using PostgreSQL 8.4, PostGIS 1.5, R 2.15.2, platform x86_64-w64-mingw32; IDE is StatET 3.0.1 plugin for Eclipse 3.7.2.

Any help would be much appreciated.

Edward 
