[R-sig-DB] concurrent reading/writing in "chunks" with RSQLite (need some help troubleshooting)
Jonathan Greenberg
greenberg at ucdavis.edu
Mon Jun 28 22:26:33 CEST 2010
Folks:
I'm trying to cycle through a database: reading a chunk of data from
one table, processing that chunk, and then writing the output to
another table in the same database. I have to set the fetch(n=)
option because the table is too large to read into memory all at
once. Here's my sample code:
***
require(RSQLite)
mysqldb <- "test.sqlite"
fetch_n <- 3
# Create a data frame.
somedata <- data.frame(data1 = 1:10, data2 = (1:10) * 3)
# Make a table in our database with it.
m <- dbDriver("SQLite")
con <- dbConnect(m, dbname = mysqldb)
dbWriteTable(con, "TABLEA", somedata, overwrite = TRUE)
dbDisconnect(con)
# Now we want to read from TABLEA in "chunks" and write to TABLEB.
read_con <- dbConnect(m, dbname = mysqldb)
write_con <- dbConnect(m, dbname = mysqldb)
read_query <- dbSendQuery(read_con, "SELECT * FROM TABLEA")
while (!dbHasCompleted(read_query)) {
  read_chunk <- fetch(read_query, fetch_n)
  new_data <- data.frame(data3 = read_chunk$data1 * 4,
                         data4 = read_chunk$data2 * 4)
  if (!dbExistsTable(write_con, "TABLEB")) {
    dbWriteTable(write_con, "TABLEB", new_data)
  } else {
    dbWriteTable(write_con, "TABLEB", new_data, append = TRUE)
  }
}
dbDisconnect(read_con)
dbDisconnect(write_con)
***
The writes to TABLEB fail, and I get the following warnings:
Warning messages:
1: In sqliteWriteTable(conn, name, value, ...) :
RS_SQLite_exec: could not execute1: database is locked
2: In sqliteWriteTable(conn, name, value, ...) :
RS_SQLite_exec: could not execute1: database is locked
3: In sqliteWriteTable(conn, name, value, ...) :
RS_SQLite_exec: could not execute1: database is locked
> dbDisconnect(read_con)
[1] TRUE
Warning message:
In sqliteCloseConnection(conn, ...) :
RS-DBI driver warning: (closing pending result sets before closing
this connection)
> dbDisconnect(write_con)
[1] TRUE
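For what it's worth, here is an untested sketch of a workaround I've been considering: instead of holding a pending result set open on one connection while writing on another, each chunk is pulled with its own self-contained LIMIT/OFFSET query via dbGetQuery(), so the read is fully finished (and its lock released) before each write, and a single connection handles both sides. The table and column names match my example above; the chunking-by-OFFSET approach itself is just my assumption about what might avoid the lock.

```r
# Untested sketch: chunked read/process/write on ONE connection, using
# LIMIT/OFFSET so no pending SELECT holds a read lock during writes.
require(RSQLite)

m   <- dbDriver("SQLite")
con <- dbConnect(m, dbname = "test.sqlite")

# Same setup as in my example: a small source table to chunk through.
somedata <- data.frame(data1 = 1:10, data2 = (1:10) * 3)
dbWriteTable(con, "TABLEA", somedata, overwrite = TRUE)
if (dbExistsTable(con, "TABLEB")) dbRemoveTable(con, "TABLEB")

fetch_n <- 3
offset  <- 0
repeat {
  # dbGetQuery() fetches the whole chunk and closes its result set,
  # so nothing is left pending when the write below runs.
  read_chunk <- dbGetQuery(con, sprintf(
    "SELECT * FROM TABLEA LIMIT %d OFFSET %d", fetch_n, offset))
  if (nrow(read_chunk) == 0) break
  new_data <- data.frame(data3 = read_chunk$data1 * 4,
                         data4 = read_chunk$data2 * 4)
  if (!dbExistsTable(con, "TABLEB")) {
    dbWriteTable(con, "TABLEB", new_data)
  } else {
    dbWriteTable(con, "TABLEB", new_data, append = TRUE)
  }
  offset <- offset + fetch_n
}
dbDisconnect(con)
```

I haven't verified this scales to my real data, and OFFSET scans get slower on large tables, so I'd still prefer a way to make the two-connection version work.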
Any suggestions on how to fix this? Thanks!
--j