[R] One critical question in R

Steve Lianoglou mailinglist.honeypot at gmail.com
Tue Aug 4 17:31:02 CEST 2009


Hi,

On Aug 4, 2009, at 11:20 AM, Hyo Karen Lee wrote:

> Hi,
> I have one critical question in using R.
> I am currently working on some research which involves a huge amount
> of data (about 15GB).
> I am trying to use R in this research rather than SAS or STATA.
> (The company where I am working right now is trying to switch from
> SAS/STATA to R.)
>
> As far as I know, the memory limit in R is 4GB;

While that might be true on Windows (or on 32-bit builds generally),
I'm quite sure it's not true on 64-bit Linux/OS X.

> However, I believe that there are ways to handle a large dataset.
> Most of my work in R would be something like cleaning the data or
> running a simple regression (OLS/Logit), though.

One place to look would be the bigmemory package:

http://cran.r-project.org/web/packages/bigmemory/
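
To give a rough (untested) idea of how it's used -- the file name,
column type, and backing-file names below are just made up:

library(bigmemory)

# read.big.matrix() memory-maps the file on disk instead of pulling it
# all into RAM, so the matrix can be much larger than your memory.
x <- read.big.matrix("data.csv", header = TRUE, type = "double",
                     backingfile = "data.bin",
                     descriptorfile = "data.desc")

dim(x)        # dimensions, without materializing the data
mean(x[, 1])  # subsets come back as ordinary R vectors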

You might also look through the other packages listed in the High
Performance Computing task view on CRAN:

http://cran.r-project.org/web/views/HighPerformanceComputing.html

Specifically, the "Large memory and out-of-memory data" section.
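
Since you mention OLS/logit specifically: the biglm package (also
listed in that section) fits linear and generalized linear models by
updating the fit one chunk at a time, so the full file never has to be
in memory at once. A rough sketch (again untested; the file name,
formula, and chunk size are all placeholders):

library(biglm)

con <- file("data.csv", open = "r")

# Fit on the first chunk, then fold in the remaining chunks until the
# file is exhausted.
chunk <- read.csv(con, nrows = 100000)
fit <- biglm(y ~ x1 + x2, data = chunk)
cols <- names(chunk)

repeat {
  chunk <- try(read.csv(con, header = FALSE, col.names = cols,
                        nrows = 100000), silent = TRUE)
  if (inherits(chunk, "try-error")) break  # no rows left to read
  fit <- update(fit, chunk)
}
close(con)

summary(fit)

For logit, bigglm() with family = binomial() works the same way.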

-steve

--
Steve Lianoglou
Graduate Student: Computational Systems Biology
   |  Memorial Sloan-Kettering Cancer Center
   |  Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact



