[R-sig-DB] concurrent reading/writing in "chunks" with RSQLite (need some help troubleshooting)

Seth Falcon seth at userprimary.net
Mon Jun 28 22:52:41 CEST 2010


Hi,

A couple of ideas below...  I will try to reproduce this later.

On Mon, Jun 28, 2010 at 1:26 PM, Jonathan Greenberg
<greenberg at ucdavis.edu> wrote:
> I'm trying to cycle through a database, reading a chunk of data from
> one table in the DB, processing this data chunk, and then writing the
> output to another table in the database.  I'm having to set the
> fetch(n=) option because the database is too large to read the entire
> thing into memory at once.  Here's my sample code:
>
> ***
>
> require(RSQLite)
> mysqldb='test.sqlite'
> fetch_n=3
>
> # Create a data table.
> somedata=data.frame(data1=1:10,data2=(1:10)*3)
>
> # Make a table in our database with it.
> m <- dbDriver("SQLite")
> con=dbConnect(m, dbname=mysqldb)
> dbWriteTable(con,"TABLEA",somedata,overwrite=TRUE)
> dbDisconnect(con)
>
> # Now we want to read from TABLEA in "chunks" and write to TABLEB.
> read_con=dbConnect(m, dbname=mysqldb)
> write_con=dbConnect(m, dbname=mysqldb)

SQLite locks the whole database file for a write, so the pending
SELECT on your read connection can block writes made through the
other connection.  You might try setting the flags on the read
connection to indicate that it is read only:

 read_con = dbConnect(m, dbname = mysqldb, flags = SQLITE_RO)

There are some details in ?sqliteSupport.

I wonder if it would make a difference if you created the destination
table (TABLEB) earlier, say before opening the read and write
connections; see the sketch below.
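
A minimal sketch (untested; the REAL column types are my guess from
the arithmetic in your loop):

 setup_con <- dbConnect(m, dbname = mysqldb)
 dbGetQuery(setup_con, "CREATE TABLE TABLEB (data3 REAL, data4 REAL)")
 dbDisconnect(setup_con)

The loop could then always call dbWriteTable with append = TRUE and
drop the dbExistsTable check.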

> read_query=dbSendQuery(read_con,"SELECT * FROM TABLEA")
>
> while(!dbHasCompleted(read_query))
> {
>        read_chunk=fetch(read_query,fetch_n)
>        new_data=data.frame(data3=read_chunk$data1*4,data4=read_chunk$data2*4)
>        if(!dbExistsTable(write_con, "TABLEB"))
>        {
>                dbWriteTable(write_con,"TABLEB",new_data)
>        } else
>        {
>                dbWriteTable(write_con,"TABLEB",new_data, append=TRUE)
>        }
> }
> dbDisconnect(read_con)
> dbDisconnect(write_con)
>
> ***
>
> I am getting this error:
>
> Warning messages:
> 1: In sqliteWriteTable(conn, name, value, ...) :
>  RS_SQLite_exec: could not execute1: database is locked
> 2: In sqliteWriteTable(conn, name, value, ...) :
>  RS_SQLite_exec: could not execute1: database is locked
> 3: In sqliteWriteTable(conn, name, value, ...) :
>  RS_SQLite_exec: could not execute1: database is locked
>> dbDisconnect(read_con)
> [1] TRUE
> Warning message:
> In sqliteCloseConnection(conn, ...) :
>  RS-DBI driver warning: (closing pending result sets before closing
> this connection)

You can get rid of this warning by calling dbClearResult on your query
result object.
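
For example, once the loop has finished (a sketch reusing your
variable names):

 dbClearResult(read_query)
 dbDisconnect(read_con)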

>> dbDisconnect(write_con)
> [1] TRUE
>
> Any suggestions on how to fix this?  Thanks!

I will try and have a closer look over the next couple of days.

+ seth

-- 
Seth Falcon | @sfalcon | http://userprimary.net/



