[R] efficient equivalent to read.csv / write.csv
statquant2
statquant at gmail.com
Tue Sep 28 23:02:53 CEST 2010
Hello all,
the test I provided was just to pinpoint that, for loading a big csv
file once, read.csv was quicker than read.csv.sql... I have already
"optimized" my calls to read.csv for my particular problem, but if a simple
call to read.csv was quicker than read.csv.sql, I doubt that specifying args
would invert the result by much...
Maybe I should outline my problem:
I am working on a powerful machine with 32GB or 64GB of RAM, so loading files
and keeping them in memory is not really an issue.
Those files (let's say 100) are shared by many users and are flat csv files
(which is to say that modifying them is out of the question).
Those files have lots of rows and between 10 and 20 columns, string and
numeric...
I basically need to be able to load these files as quickly as possible, and
then I will keep those data frames in memory...
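(One approach that fits this "parse once, reuse forever" pattern, sketched here as an assumption rather than something from the thread: pay the read.csv cost a single time per file and cache the parsed data frame as a native R binary with save(), which load() restores far faster than re-parsing text. The helper name and cache-file convention below are made up for illustration.)

```r
# Hypothetical helper: parse a csv once, cache the resulting data frame
# as a .RData binary next to it, and reuse the cache on later loads.
load.cached.csv <- function(csv.file,
                            cache.file = paste(csv.file, ".RData", sep = "")) {
  if (file.exists(cache.file)) {
    # load() restores the 'df' object saved below into a private environment
    env <- new.env()
    load(cache.file, envir = env)
    return(env$df)
  }
  df <- read.csv(csv.file)
  save(df, file = cache.file)  # binary cache: much faster to reload than csv
  df
}
```

Since the csv files themselves are read-only and shared, writing the cache to a private directory you control may be necessary instead of alongside the originals.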
So :
Should I write my own C++ function and call it from R?
Or is there an R way of improving read.csv drastically?
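(Before dropping to C++, one standard R-level tuning is worth trying: pre-declaring colClasses so read.csv skips per-column type guessing, and disabling comment scanning. A minimal sketch, assuming the column types are stable across the file; the function name and the 100-row sample size are arbitrary choices.)

```r
# Hypothetical wrapper: infer column classes from a small sample, then
# re-read the whole file with the classes fixed. Declaring colClasses
# lets read.csv avoid repeated type guessing; comment.char = "" skips
# comment scanning. Both typically speed up large reads.
fast.read.csv <- function(file) {
  sample <- read.csv(file, nrows = 100)         # small probe read
  classes <- sapply(sample, class)              # one class per column
  read.csv(file, colClasses = classes, comment.char = "")
}
```

If the number of rows is known in advance, passing nrows as well can also help R avoid growing its buffers while reading.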
Thanks a lot
--
Sent from the R help mailing list archive at Nabble.com.