[R] big data?
Spencer Graves
spencer.graves at structuremonitoring.com
Tue Aug 5 19:20:10 CEST 2014
What tools do you like for working with tab-delimited text files
up to 1.5 GB (under Windows 7 with 8 GB RAM)?
Standard tools for smaller data sets sometimes grab all the available
RAM, after which CPU usage drops to 3% ;-)
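One partial workaround I know of is to tell read.table the column
classes and a generous row count up front, so it doesn't have to guess
types and repeatedly grow its buffers. An untested sketch; the file
name and column types below are made up:

## Untested sketch: giving read.table colClasses and an upper bound on
## the row count lets it allocate once instead of re-guessing and
## growing.  "big.tsv" and the three column types are placeholders.
dat <- read.table("big.tsv", header = TRUE, sep = "\t",
                  colClasses = c("character", "numeric", "integer"),
                  nrows = 2e6,           # generous upper bound on rows
                  quote = "", comment.char = "",
                  stringsAsFactors = FALSE)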
The "bigmemory" project won the 2010 John Chambers Award but "is
not available (for R version 3.1.0)".
findFn("big data", 999) downloaded 961 links in 437 packages.
Those results include tools for data in PostgreSQL and other formats,
but I couldn't find anything for large tab-delimited text files.
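For reading a big file in pieces, base R connections look workable.
An untested sketch (the file name and chunk size are made up):

## Untested sketch of chunked reading: read the header once, then pull
## a fixed number of rows at a time so memory use stays bounded.
con <- file("big.tsv", open = "r")
hdr <- strsplit(readLines(con, n = 1), "\t", fixed = TRUE)[[1]]
repeat {
  chunk <- tryCatch(
    read.table(con, sep = "\t", nrows = 10000, col.names = hdr,
               quote = "", comment.char = "", stringsAsFactors = FALSE),
    error = function(e) NULL)     # read.table errors when input is empty
  if (is.null(chunk)) break
  ## ... process or summarize 'chunk' here ...
  if (nrow(chunk) < 10000) break  # last, partial chunk
}
close(con)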
Absent a better idea, I plan to write a function getField to
extract a specific field from the data, then use that field to split
the data into 4 smaller files, each of which should be small enough
that I can do what I want.
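Something like the following, where the field number, output file
names, and hash rule are all placeholders:

## Untested sketch of getField plus a splitter: stream the file in
## chunks, extract one field per line, and append each line to one of
## 4 part files chosen from that field's value.
getField <- function(lines, field = 1)
  vapply(strsplit(lines, "\t", fixed = TRUE),
         function(x) x[field], character(1))

splitFile <- function(infile, field = 1, ngroups = 4, chunkSize = 10000) {
  con <- file(infile, open = "r")
  hdr <- readLines(con, n = 1)
  out <- lapply(seq_len(ngroups), function(i) {
    f <- file(sprintf("part%d.txt", i), open = "w")
    writeLines(hdr, f)            # repeat the header in every part
    f
  })
  repeat {
    lines <- readLines(con, n = chunkSize)
    if (length(lines) == 0L) break
    key <- getField(lines, field)
    ## deterministic hash so a given key always lands in the same part
    grp <- vapply(key, function(k) sum(utf8ToInt(k)), numeric(1)) %% ngroups + 1
    for (i in seq_len(ngroups))
      writeLines(lines[grp == i], out[[i]])
  }
  close(con)
  for (f in out) close(f)
}
## e.g., splitFile("big.tsv", field = 2)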
Thanks,
Spencer