[R] R's memory limitation and Hadoop
Hadley Wickham
h.wickham at gmail.com
Tue Sep 16 15:53:15 CEST 2014
Hundreds of thousands of records usually fit into memory just fine.
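
For a rough sense of scale (an illustrative sketch, not part of the
original reply; the 500,000-row, 20-column data frame is an assumed
example), you can estimate the in-memory footprint of a data set that
size with object.size():

    # Assumed example: 500,000 records with 20 numeric variables,
    # i.e. roughly 500000 * 20 * 8 bytes of data (~80 MB).
    n_rows <- 5e5
    n_cols <- 20
    df <- as.data.frame(matrix(rnorm(n_rows * n_cols), nrow = n_rows))
    print(object.size(df), units = "MB")  # ~76 MB: fits easily in RAM
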
Hadley
On Tue, Sep 16, 2014 at 12:40 PM, Barry King <barry.king at qlx.com> wrote:
> Is there a way to get around R’s in-memory limitation by interfacing
> with Hadoop, or should I look at products like SAS or JMP to work
> with data that has hundreds of thousands of records? Any help is
> appreciated.
>
> --
> __________________________
> *Barry E. King, Ph.D.*
> Analytics Modeler
> Qualex Consulting Services, Inc.
> Barry.King at qlx.com
> O: (317)940-5464
> M: (317)507-0661
> __________________________
>
--
http://had.co.nz/