[R-sig-DB] RMySQL "lost connection" during dbWriteTable()
Paul Gilbert
pgilbert902 at gmail.com
Wed Oct 2 20:07:10 CEST 2013
It is possible, but not certain, that some server parameter settings
affect this. You might consider adjusting max_allowed_packet and maybe
some memory/cache sizes. As I recall, running the server in debug mode
in a terminal can give better error information. You could also check
whether you get a similar problem writing the data with the mysql
client program, or with Perl. Even after server adjustments I had a
similar problem with Perl, and had to break large inserts into groups
(of 100,000 records with about 6 fields). It seems unlikely to me that
you would need to break it as small as 100 records unless something
else is going on.
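
For example, a minimal sketch of checking and raising max_allowed_packet
from R (this assumes the RMySQL package and an account with the SUPER
privilege; all the connection details are placeholders):

    library(RMySQL)
    con <- dbConnect(MySQL(), user = "user", password = "pass",
                     host = "host", dbname = "db")
    ## Show the current limit, in bytes.
    dbGetQuery(con, "SHOW VARIABLES LIKE 'max_allowed_packet'")
    ## Raise it to 64 MB; note that existing connections keep the old value.
    dbSendQuery(con, "SET GLOBAL max_allowed_packet = 64 * 1024 * 1024")
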
Paul
On 13-10-01 10:31 AM, Krzysztof Sakrejda wrote:
> I've had this problem before when writing over a slow internet
> connection, and I believe it would be nice to have dbWriteTable chunk
> and retry the write itself... but I haven't had the chance to do
> anything useful about that, other than write this quick and dirty
> function for myself. The function carelessly calls dbConnect on every
> retry, which can exhaust the available connections, but it could be
> adapted to check first, as the appended dbReconnect function does.
> Hope that helps... Krzysztof
>
> dbRobustWriteTable <- function(conn, user, password, host, dbname,
>                                name, value, tries) {
>     ## Split the rows of 'value' into chunks of 100.
>     numFullChunks   <- nrow(value) %/% 100
>     lengthLastChunk <- nrow(value) %% 100
>     writeSeqAllChunks <- data.frame(Start = integer(0), Stop = integer(0))
>     if (numFullChunks >= 1) {
>         writeSeqAllChunks <- data.frame(
>             Start = (seq_len(numFullChunks) - 1) * 100 + 1,
>             Stop  = seq_len(numFullChunks) * 100)
>     }
>     if (lengthLastChunk > 0) {  # skip the last chunk when it is empty
>         writeSeqAllChunks <- rbind(writeSeqAllChunks,
>             data.frame(Start = numFullChunks * 100 + 1,
>                        Stop  = numFullChunks * 100 + lengthLastChunk))
>     }
>     ## Write each chunk; on failure, reconnect and retry up to 'tries' times.
>     for (i in seq_len(nrow(writeSeqAllChunks))) {
>         attempt <- 0
>         rowSeq <- writeSeqAllChunks$Start[i]:writeSeqAllChunks$Stop[i]
>         while (!dbWriteTable(conn = conn, name = name, value = value[rowSeq, ],
>                              overwrite = FALSE, append = TRUE) &&
>                attempt < tries) {
>             conn <- dbConnect(MySQL(), user = user, password = password,
>                               host = host, dbname = dbname)
>             attempt <- attempt + 1
>             print(paste("Fail number", attempt, "of", tries, "tries."))
>             if (attempt == tries) stop("EPIC FAIL")
>         }
>     }
> }
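>
> A hypothetical call, for illustration (every connection detail here is
> a placeholder):
>
> dbRobustWriteTable(conn, user = "user", password = "pass", host = "host",
>                    dbname = "db", name = "tablename", value = mydata,
>                    tries = 5)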
>
>
> dbReconnect <- function(conn = NULL, idFile = NULL, drv = MySQL()) {
>     if (is.null(idFile)) {
>         stop("Can't reconnect, no id.")
>     }
>     ## Close the old connection, if any; report but ignore errors.
>     if (!is.null(conn)) {
>         tryCatch(dbDisconnect(conn = conn),
>                  error = function(cond) print(cond))
>     }
>     ## Load user, pass, dbname, and host from the credentials file.
>     ev <- new.env()
>     load(file = idFile, envir = ev)
>     ev$drv <- drv
>     conn <- with(ev, dbConnect(drv = drv, user = user, password = pass,
>                                dbname = dbname, host = host))
>     rm(ev)
>     return(conn)
> }
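>
> The idFile is an RData file holding the credentials under the names the
> function expects (user, pass, dbname, host). A hypothetical way to
> create one, with placeholder values:
>
> user <- "user"; pass <- "secret"; dbname <- "db"; host <- "localhost"
> save(user, pass, dbname, host, file = "db_id.RData")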
>
>
>
> On Tue, Oct 1, 2013 at 8:45 AM, Magnus Thor Torfason
> <zulutime.net at gmail.com> wrote:
>> When doing dbWriteTable() to write my (admittedly large) data.frame to a
>> MySQL database I randomly get the following error:
>>
>>> dbWriteTable(conn, "tablename", data)
>> Error in mysqlExecStatement(conn, statement, ...) :
>> RS-DBI driver: (could not run statement: Lost connection to MySQL server
>> during query)
>>
>> Googling suggests that, with MySQL in general, this error can result
>> from sending too large a chunk of data to the server in a single statement.
>>
>> But the data.frame sits comfortably in memory and the table sits comfortably
>> in the database. So it is disappointing that what I thought was the
>> recommended way to achieve this task (dbWriteTable) should be a bottleneck.
>>
>> I suppose I could write a loop around dbWriteTable(), writing only a few
>> rows of the data.frame at a time, but that seems like something that
>> dbWriteTable() should be doing internally.
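>>
>> A minimal sketch of such a loop (the chunk size of 1000 is an arbitrary
>> choice, and conn, data, and "tablename" are placeholders):
>>
>> chunkSize <- 1000
>> for (s in seq(1, nrow(data), by = chunkSize)) {
>>     rows <- s:min(s + chunkSize - 1, nrow(data))
>>     dbWriteTable(conn, "tablename", data[rows, ],
>>                  append = TRUE, row.names = FALSE)
>> }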
>>
>> Are there any good suggestions on what I should do?
>>
>> Best,
>> Magnus
>>