[R] R & very large files
S Devriese
sdmaillist at gmail.com
Wed Dec 16 12:08:59 CET 2009
On 12/16/2009 11:59 AM, Albert-Jan Roskam wrote:
> Hi,
>
> I very recently started using R (as in: last week), and I was wondering if anyone could point me to websites with sample code for dealing with large datasets (large in length and/or breadth). I understand that R was never designed to work with datasets larger than, say, a couple of hundred MB. One approach, as I also read, is to let R work in conjunction with SQL; that's one interesting route I'd like to know more about. But I was also hoping there were also pure-R solutions for working with very large tables (was 'scan' designed for that?). In any case, a standard approach would be desirable.
>
> Thanks in advance.
>
> Cheers!!
> Albert-Jan
>
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> In the face of ambiguity, refuse the temptation to guess.
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>
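The pure-R route the question asks about can be sketched with `read.csv()` on an open connection: reading a fixed number of rows per pass means only one chunk sits in memory at a time (`scan()` accepts the same `nrows`/`skip` arguments). The file, chunk size, and column names below are invented for illustration:

```r
# Minimal sketch of chunked reading; file and column names are made up.
path <- tempfile(fileext = ".csv")
write.csv(data.frame(x = 1:1000, y = rnorm(1000)), path, row.names = FALSE)

con <- file(path, open = "r")
header <- readLines(con, n = 1)   # consume the header line once
chunk  <- 300                     # rows held in memory at any one time
total  <- 0
repeat {
  block <- read.csv(con, header = FALSE, nrows = chunk,
                    col.names = c("x", "y"))
  total <- total + sum(block$x)   # process the chunk, then let it go
  if (nrow(block) < chunk) break  # a short read means end of file
}
close(con)
total  # 500500, i.e. sum(1:1000)
```

One caveat: if the row count happens to be an exact multiple of `chunk`, the final `read.csv()` call finds no lines and throws an error, so real code should wrap the read in `tryCatch()`.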
See, for example, the "Large memory and out-of-memory data" section of the
task view "High-Performance and Parallel Computing with R"
(pick a mirror on http://cran.r-project.org/mirrors.html, then "Task
Views", then "HighPerformanceComputing").

Stephan
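As a sketch of the SQL route mentioned in the question: the DBI and RSQLite packages (both on CRAN, and assumed installed here) let R keep the data in a database file and pull in only query results, not the whole table. The table and column names are made up for illustration:

```r
# Sketch, assuming the DBI and RSQLite packages are installed.
library(DBI)

db <- dbConnect(RSQLite::SQLite(), tempfile(fileext = ".sqlite"))

# Stand-in for a table far too large to hold in R all at once.
dbWriteTable(db, "measurements", data.frame(x = 1:1000, y = rnorm(1000)))

# Only the aggregated result -- not the full table -- enters R's memory.
res <- dbGetQuery(db,
  "SELECT COUNT(*) AS n, AVG(x) AS mean_x FROM measurements")
dbDisconnect(db)

res$n       # 1000
res$mean_x  # 500.5
```

For genuinely large inputs one would bulk-load the file into the database first (e.g. SQLite's `.import`, or `dbWriteTable()` fed chunk by chunk) and do filtering and aggregation in SQL.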
More information about the R-help mailing list