[R-sig-DB] postgresql - running out of connections
dinesh
d|ne@h@k@@om@n| sending from gmail.com
Thu Jun 9 16:15:09 CEST 2011
Hi,
I just updated my RpgSQL package and very quickly ran out of connections
on my PostgreSQL database. The database instance is on my notebook and is
not shared. Below is sample code:
> library(RpgSQL)
> con=dbConnect(pgSQL())
> dbGetQuery(con, "select count(*) from pg_stat_activity")
count
1 7
> dbDisconnect(con)
[1] TRUE
> dbGetQuery(con, "select count(*) from pg_stat_activity")
Error in .verify.JDBC.result(r, "Unable to retrieve JDBC result set for
", :
Unable to retrieve JDBC result set for select count(*) from
pg_stat_activity (An I/O error occured while sending to the backend.)
So far so good; the disconnect invalidates the connection object, as expected.
> con=dbConnect(pgSQL())
> dbGetQuery(con, "select count(*) from pg_stat_activity")
count
1 8
> dbDisconnect(con)
[1] TRUE
So now I am left with one stray connection on the server, albeit IDLE
(the count went from 7 to 8 across a connect/disconnect cycle). If I
repeat this process a few more times, I run out of free connection
slots on my database.
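As a way of watching the leak accumulate, the stray backends can be listed from a fresh session. This is only a sketch against PostgreSQL 8.4's `pg_stat_activity` layout (which uses the `procpid` and `current_query` columns, with `'<IDLE>'` marking idle backends); it assumes the same RpgSQL setup as above:

```r
# Sketch: list idle backend sessions left behind by earlier
# dbDisconnect() calls. Column names match PostgreSQL 8.4;
# newer servers rename these columns.
library(RpgSQL)
con <- dbConnect(pgSQL())
idle <- dbGetQuery(con,
  "SELECT procpid, usename, current_query
     FROM pg_stat_activity
    WHERE current_query = '<IDLE>'")
print(idle)
dbDisconnect(con)
```

If the leak is real, the number of `<IDLE>` rows should grow by one after each connect/disconnect cycle from R, even though `dbDisconnect()` returns TRUE.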
> for ( i in 1:100 ) { dbDisconnect( dbConnect( pgSQL() ) ) }
> for ( i in 1:200 ) { dbDisconnect( dbConnect( pgSQL() ) ) }
Error in is(object, Cl) :
error in evaluating the argument 'conn' in selecting a method for
function 'dbDisconnect': Error in .jcall(drv using jdrv,
"Ljava/sql/Connection;", "connect", as.character(url)[1], :
org.postgresql.util.PSQLException: FATAL: sorry, too many clients already
>
I am not sure whether the problem is in RJDBC, RpgSQL, some other layer
of the stack, or my own code. I am uploading thousands of CSV files into
the database, and, for what it's worth, the code closes and opens a new
connection at logical points in the name of modular software development.
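Until the leak is tracked down, one workaround sketch (not a fix for the underlying bug) is to open a single connection for the whole batch and guarantee it is closed, instead of connecting per file. The function name `upload_all`, the argument `csv_files`, and the table name `"mytable"` below are placeholders, and `dbWriteTable` support in RpgSQL is assumed:

```r
# Workaround sketch: one connection per batch, released via on.exit()
# even if an individual upload fails.
library(RpgSQL)
upload_all <- function(csv_files) {
  con <- dbConnect(pgSQL())
  on.exit(dbDisconnect(con))  # runs on normal exit and on error
  for (f in csv_files) {
    dat <- read.csv(f)
    dbWriteTable(con, "mytable", dat, append = TRUE, row.names = FALSE)
  }
}
```

This at most hides the symptom (one leaked backend per batch rather than per file), but it may keep the upload running while the RpgSQL/RJDBC side is investigated.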
This is a fresh install of R 2.13 on 64-bit Windows 7, with the latest
RpgSQL; the database is PostgreSQL 8.4. The same thing broke under
R 2.12 (which is what prompted me to upgrade R).
I will greatly appreciate your help.
Thanks and regards
Dinesh