[R] Reading large files quickly

Gabor Grothendieck ggrothendieck at gmail.com
Sun May 10 00:10:36 CEST 2009


You could try it with sqldf and see whether that is any faster.
It uses RSQLite/SQLite to read the data into a database without
going through R, and from there it reads all of the data, or just
a specified portion, into R.  It takes a couple of lines of code
of the form:

library(sqldf)

f <- file("myfile.dat")
DF <- sqldf("select * from f", dbname = tempfile())

with appropriate modifications to specify the format of your file
and, possibly, to read in only a portion of it.  See example 6 on
the sqldf home page, http://sqldf.googlecode.com, and ?sqldf.
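
For instance, here is a sketch of what such a modification might
look like for a comma-separated file with a header row, reading
only the first 1000 rows.  The file name, separator, header
setting, and row limit are all illustrative assumptions, not
details from your post:

library(sqldf)

## Assumed: myfile.dat is comma-separated with a header row.
f <- file("myfile.dat")

## file.format describes the file's layout to sqldf; the SQL
## limit clause pulls back only the first 1000 rows.
DF <- sqldf("select * from f limit 1000",
            dbname = tempfile(),
            file.format = list(header = TRUE, sep = ","))

Note that dbname = tempfile() keeps the intermediate database on
disk rather than in memory, which matters for a file as large as
yours.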


On Sat, May 9, 2009 at 12:25 PM, Rob Steele
<freenx.10.robsteele at xoxy.net> wrote:
> I'm finding that readLines() and read.fwf() take nearly two hours to
> work through a 3.5 GB file, even when reading in large (100 MB) chunks.
> The Unix command wc, by contrast, processes the same file in three
> minutes.  Is there a faster way to read files in R?
>
> Thanks!
>