[Rd] Lightweight 'package' idea.
Barry Rowlingson
b.rowlingson at lancaster.ac.uk
Fri Aug 21 14:03:40 CEST 2009
I often want to develop functions whilst manipulating data, but I
don't want to end up with a .RData file full of mixed-up functions and
data. The functions may be re-usable, but not worth sticking in a
package.
So I've tried to come up with a paradigm for function development
that more closely follows the way Matlab and Python do it (partly
inspired by a confused Matlab convert over on R-help).
My requirements were thus:
* .R files as the master source for R functions
* Don't see the functions in ls()
* After editing the .R files, make it easy to update the definitions
visible to R (without having to rebuild and reload a package).
So I wrote these two functions in a few minutes:
loadDir <- function(dir){
  # attach an empty environment to the search path, named after the directory
  e = attach(NULL, name=dir)
  # stash the path in the environment so reloadDir() can find it again
  assign("__path__", dir, envir=e)
  reloadDir(e)
  e
}

reloadDir <- function(e){
  path = get("__path__", e)
  # note "\\.R$" (literal dot) -- a plain ".R$" would also match e.g. "barR"
  files = list.files(path, "\\.R$", full.names=TRUE,
                     recursive=TRUE, ignore.case=TRUE)
  for(f in files){
    sys.source(f, envir=e)
  }
}
Usage is something like:
lib1 = loadDir("/foo/bar/baz/lib1/")
- it creates a new environment on the search path and sources any .R
files it finds in that directory into that environment. If you edit
anything in the directory, just do reloadDir(lib1) and the updated
definitions are loaded. It's like Python's "import foo" and "reload(foo)".
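To make the ls() requirement concrete, a hypothetical session might
look like this (assuming the directory contains a foo.R that defines
myfun):

lib1 = loadDir("/foo/bar/baz/lib1/")
search()          # the directory now appears as an attached entry
ls()              # myfun is NOT in the global environment...
myfun()           # ...but is still found via the search path
reloadDir(lib1)   # after editing foo.R, pick up the new definition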
Sourcing everything on any change seems a bit wasteful, but until R
objects have timestamps I can't think of a better way. Hmm, maybe my
environment could keep a __timestamp__ object... Okay, this is getting
less simple now...
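Something like this might work, using file modification times from
file.info() as a stand-in for object timestamps (reloadDir2 and the
"__loaded__" marker are sketch names, not part of the code above):

reloadDir2 <- function(e){
  path = get("__path__", e)
  # time of the last (re)load, or the epoch if we have never loaded
  last = if(exists("__loaded__", envir=e, inherits=FALSE))
           get("__loaded__", e) else as.POSIXct(0, origin="1970-01-01")
  files = list.files(path, "\\.R$", full.names=TRUE,
                     recursive=TRUE, ignore.case=TRUE)
  # only re-source files modified since the last load
  changed = files[file.info(files)$mtime > last]
  for(f in changed) sys.source(f, envir=e)
  assign("__loaded__", Sys.time(), envir=e)
}

(Deleted files would still leave stale definitions behind, but the
full re-source has the same problem.)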
So anyway, have I done anything wrong or stupid here, or is it a
useful paradigm that seems so obvious someone else has probably done
it (better)?
Barry