[R] Time and db precision
Mikkel Grum
mi2kelgrum at yahoo.com
Thu May 26 01:25:35 CEST 2011
I have a loop that regularly checks my database for new data to analyse. To make this possible, the table has a timestamp column recording when each row was inserted. Something like this:
while (....) {
    load("timetoken.Rdata")                    # restore the last timestamp processed
    df <- sqlQuery(con, paste("SELECT * FROM tabledf WHERE timestamp > '",
                              timetoken, "'", sep = ""))
    analyse(df)
    timetoken <- max(df$timestamp)             # newest row seen this round
    save(timetoken, file = "timetoken.Rdata")
    Sys.sleep(60)
}
This used to work fairly well with R and PostgreSQL on Windows, but on Ubuntu a lot of the same data is being pulled up again and again. I suspect that R is rounding the timestamp off to the nearest second, while PostgreSQL stores it at much higher precision, so if no new data has come in in the meantime, the chances are fairly high (50%?) that the PostgreSQL timestamp is still greater than the rounded-off version held by R, and the old rows get selected again.
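To illustrate what I think is happening, here is a small sketch (assuming RODBC hands the column back as POSIXct, which can carry fractional seconds):

    # R's default formatting drops the fractional seconds, so the literal
    # pasted back into the query can be earlier than the value actually
    # stored in the table:
    t <- as.POSIXct("2011-05-26 01:25:35.123456")
    paste("SELECT * FROM tabledf WHERE timestamp > '", t, "'", sep = "")
    # gives: "SELECT * FROM tabledf WHERE timestamp > '2011-05-26 01:25:35'"

    # %OS6 would format the seconds with six decimal places instead:
    format(t, "%Y-%m-%d %H:%M:%OS6")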
Is there any way of recording the timestamp in R exactly as it is in PostgreSQL? Or is there another way of dealing with this??
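For example, would it be sane to fetch the token as plain text, so that R never converts (and never rounds) it? Just a sketch, reusing con and tabledf from above, and assuming I've understood the as.is argument of sqlQuery correctly:

    # as.is = TRUE should stop RODBC converting the timestamp column to
    # POSIXct, so the token stays exactly as PostgreSQL formatted it:
    timetoken <- sqlQuery(con,
                          "SELECT max(timestamp) AS ts FROM tabledf",
                          as.is = TRUE)$ts
    df <- sqlQuery(con, paste("SELECT * FROM tabledf WHERE timestamp > '",
                              timetoken, "'", sep = ""))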
sessionInfo()
R version 2.11.1 (2010-05-31)
x86_64-pc-linux-gnu
locale:
[1] C
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] RODBC_1.3-1
All assistance greatly appreciated.
Mikkel