[R] Running R Script on a Sequence of Files
Gabor Grothendieck
ggrothendieck at gmail.com
Fri Dec 5 20:26:47 CET 2008
Use dir to get the file names and then lapply over them with a
custom anonymous function; L will then be a list of the returned
values:
# assumes the file names are those in the
# current directory that end in .dat
filenames <- dir(pattern = "\\.dat$")
L <- lapply(filenames, function(x) {
  DF <- read.table(x, ...whatever...)
  somefunction(DF)
})
Now L is a list of the 900 returned values. Alternatively, you could
use a loop, as sketched below.
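A minimal sketch of the loop version; somefunction() stands in for your
analysis routine and the read.table arguments are assumptions you will
need to adjust:

filenames <- dir(pattern = "\\.dat$")
L <- vector("list", length(filenames))  # pre-allocate the result list
names(L) <- filenames                   # keep the informative file names
for (f in filenames) {
  DF <- read.table(f, header = TRUE)    # header = TRUE is an assumption
  L[[f]] <- somefunction(DF)            # somefunction() is a placeholder
}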
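To address point 2c below (appending the results), if each call happens
to return a one-row data frame, one way to stack everything and save it
is sketched here; the output file name and column layout are assumptions:

results <- do.call(rbind, L)            # stack the per-file results row-wise
results$file <- filenames               # keep the original names as a column
write.csv(results, "all_results.csv", row.names = FALSE)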
On Fri, Dec 5, 2008 at 1:01 PM, Chris Poliquin <poliquin at sas.upenn.edu> wrote:
> Hi,
>
> I have about 900 files that I need to run the same R script on. I looked
> over the R Data Import/Export Manual and couldn't come up with a way to
> read in a sequence of files.
>
> The files all have unique names and are in the same directory. What I want
> to do is:
> 1) Create a list of the file names in the directory (this is really what I
> need help with)
> 2) For each item in the list...
> a) open the file with read.table
> b) perform some analysis
> c) append some results to an array or save them to another file
> 3) Next File
>
> My initial instinct is to use Python to rename all the files with numbers
> 1:900 and then read them all, but the file names contain some information
> that I would like to keep intact, and keeping a separate database of
> original names and numbers seems inefficient. Is there a way to have R read
> all the files in a directory one at a time?
>
> - Chris
>