[R] R's memory limitation and Hadoop

John McKown john.archie.mckown at gmail.com
Tue Sep 16 14:01:38 CEST 2014


On Tue, Sep 16, 2014 at 6:40 AM, Barry King <barry.king at qlx.com> wrote:
> Is there a way to get around R’s in-memory limitation by interfacing
> with a Hadoop data store, or should I look at products like SAS or JMP to
> work with data that has hundreds of thousands of records? Any help is
> appreciated.
> __________________________
> *Barry E. King, Ph.D.*
> Analytics Modeler

Please change your email client to send plain text only, as the R-help
posting guide requests.

You might want to look at the bigmemory package, which keeps large
matrices in shared memory or in file-backed form on disk, so they do not
have to fit entirely in RAM:
http://cran.revolutionanalytics.com/web/packages/bigmemory/index.html
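
For example, a rough sketch along the lines below (the file names are
placeholders, and read.big.matrix assumes a purely numeric CSV) reads a
large file into a file-backed big.matrix, so only a small descriptor
object lives in R's memory:

library(bigmemory)

## Read a large numeric CSV into a file-backed big.matrix; the data
## stay on disk in "records.bin" rather than in RAM.
x <- read.big.matrix("records.csv", header = TRUE, type = "double",
                     backingfile = "records.bin",
                     descriptorfile = "records.desc")

dim(x)        # dimensions are available without loading the data
mean(x[, 1])  # individual columns can be pulled into RAM piecewise

## A later R session can re-attach the same backing file without
## re-reading the CSV:
## x <- attach.big.matrix("records.desc")

The companion packages biganalytics and bigtabulate offer summary and
modeling functions that operate on big.matrix objects directly.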


-- 
There is nothing more pleasant than traveling and meeting new people!
Genghis Khan

Maranatha! <><
John McKown


