[R] Reading big files in chunks-ff package
Jan van der Laan
rhelp at eoos.dds.nl
Sun Mar 25 12:24:42 CEST 2012
Your question is not completely clear. read.csv.ffdf automatically
reads the data in chunks; you don't have to do anything extra for that. You
can control the size of the chunks with the next.rows argument.
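A minimal sketch of what that call might look like, assuming a pipe-separated file with 9 integer columns as described below (the file name, chunk sizes, and colClasses are placeholders, not from the original post):

```r
library(ff)

# read.csv.ffdf imports the file chunk-wise and stores the result on disk,
# so the whole file never has to fit in RAM:
#   first.rows - rows read in the first chunk (also used to infer types)
#   next.rows  - rows read in each subsequent chunk
dat <- read.csv.ffdf(file = "file.csv", sep = "|", dec = ".",
                     header = TRUE,
                     colClasses = rep("integer", 9),  # assumed column spec
                     first.rows = 100000,
                     next.rows  = 500000)

nrow(dat)  # total number of rows; data lives on disk, not in memory
```

Larger values of next.rows mean fewer passes but more memory per chunk; the values above are only illustrative.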
Jan
On 03/24/2012 09:29 PM, Mav wrote:
> Hello!
> A question about reading large CSV files
>
> I need to analyse several files larger than 3 GB. Those files
> have more than 10 million rows (up to 25 million) and 9 columns. Since I
> don't have much RAM, I think the ff package can really help
> me. I am trying to use read.csv.ffdf but I have some questions:
>
> How can I read the files in several chunks, with an automatic way of
> calculating the number of rows to include in each chunk? (My problem is that
> the files have different numbers of rows.)
>
> For instance, I have used
> read.csv.ffdf(NULL, "file.csv", sep = "|", dec = ".", header = TRUE,
> row.names = NULL, colClasses = c(rep("integer", 3), rep("integer", 10),
> rep("integer", 6)))
> But this way I am reading the whole file at once. I would prefer to read it
> in chunks, but I don't know how.
>
> I have read the ff documentation but I am not good with R!
>
> Thanks in advance!
>
> --
> View this message in context: http://r.789695.n4.nabble.com/Reading-big-files-in-chunks-ff-package-tp4502070p4502070.html
> Sent from the R help mailing list archive at Nabble.com.
>