[R] Passing database connections to functions

David Hutchinson mtb_dave at yahoo.ca
Thu Jan 26 15:37:05 CET 2017


Hi,

I have a series of functions which query various tables of an SQLite 
database. The database was developed by a government agency and is 
downloaded to a local user's computer (it is relatively large, about 1 GB). 
The functions I have developed typically take the following form:

getData <- function(con, id) {
     sqlString <- sprintf("SELECT * FROM TABLE WHERE STATION = '%s'", id)
     qryResult <- dbGetQuery(con, sqlString)
     return(qryResult)
}

where 'con' represents an open database connection derived from:

db.path <- "~/Documents/mydatabase.sqlite3" # this path is user-dependent
con <- dbConnect(RSQLite::SQLite(), db.path)
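
(As an aside, I believe DBI can also bind the id as a query parameter 
rather than interpolating it with sprintf; if I have understood the 
interface correctly, the same function would look roughly like this:)

getData <- function(con, id) {
     # sketch: let DBI/RSQLite bind the value instead of pasting it into the SQL
     qryResult <- dbGetQuery(con, "SELECT * FROM TABLE WHERE STATION = ?",
                             params = list(id))
     return(qryResult)
}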

I would like to expose these functions through an R package. I have two 
questions:
1. Is there a better way to handle passing database connections to 
functions? By opening the database connection first, users can access 
one or more of the functions to return data by passing in 'con' as an 
argument. I've had a hard time finding resources on best practices 
(a rough sketch of the kind of alternative I have in mind follows 
below the second question).
2. Any documented examples for the eventual package will have to be 
wrapped in \dontrun{} since I cannot guarantee where the database may 
reside on the user's computer (see the roxygen sketch below). What is 
the R community's take on developing a package for CRAN where documented 
methods have no working examples? I wish the underlying data were 
available through an HTTP request, but they are not.
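
To make question 1 concrete, the alternative I can imagine is having the 
package manage the connection itself, so users supply the file path once 
instead of passing 'con' to every call. A rough sketch (all names are 
hypothetical):

.pkg_env <- new.env(parent = emptyenv())

db_connect <- function(path) {
     # open the connection once and remember it for subsequent calls
     .pkg_env$con <- DBI::dbConnect(RSQLite::SQLite(), path)
     invisible(.pkg_env$con)
}

db_disconnect <- function() {
     # close and forget the cached connection, if any
     if (!is.null(.pkg_env$con)) {
         DBI::dbDisconnect(.pkg_env$con)
         .pkg_env$con <- NULL
     }
}

getData <- function(id, con = .pkg_env$con) {
     sqlString <- sprintf("SELECT * FROM TABLE WHERE STATION = '%s'", id)
     qryResult <- DBI::dbGetQuery(con, sqlString)
     return(qryResult)
}

Usage would then be roughly db_connect(db.path); getData("some_station_id"); 
db_disconnect(). But I don't know whether caching a connection in a package 
environment like this is considered good practice, which is really what I am 
asking.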
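
And for question 2, I imagine the documented examples would end up looking 
roughly like this (roxygen sketch; the path and station id are only 
illustrative):

#' Get data for a station
#'
#' @param con an open connection to the local copy of the database
#' @param id  a station identifier
#' @examples
#' \dontrun{
#' con <- dbConnect(RSQLite::SQLite(), "~/Documents/mydatabase.sqlite3")
#' getData(con, "some_station_id")
#' dbDisconnect(con)
#' }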

Thanks in advance,
Dave


