[R-sig-eco] Using vegan with a large data set

Peter Solymos solymos at ualberta.ca
Tue Mar 6 01:11:49 CET 2012


Bier,

Solutions might depend on the OS and the 32/64-bit build of R that you are using.
For general info, have a look at the R for Windows FAQ:
http://cran.r-project.org/bin/windows/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
or read help("Memory-limits").

A 7000 x 50 matrix is usually not considered big data nowadays, but the
calculations can eat up memory regardless: adonis and betadisper work on
a sites-by-sites dissimilarity matrix (7000 x 7000 here), which is much
larger than the raw data. If brute-force memory allocation does not
solve your problem, you can randomly take subsets of the data and
combine the estimates in the end.
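A minimal base-R sketch of the subsample-and-combine idea; the matrix,
the subset sizes, and the per-subset statistic are all made up for
illustration. In practice you would replace dist() with vegan's
vegdist() and compute adonis()/betadisper() on each subset, then pool
the resulting estimates.

```r
set.seed(1)
## Made-up community matrix of the size described in the question
comm <- matrix(rpois(7000 * 50, lambda = 2), nrow = 7000, ncol = 50)

n_subsets   <- 10   # number of random subsets to draw
subset_size <- 700  # sites per subset; keeps each distance matrix small

stats <- replicate(n_subsets, {
  idx <- sample(nrow(comm), subset_size)
  d   <- dist(comm[idx, ])  # 700 x 700 instead of 7000 x 7000
  mean(d)                   # placeholder for the per-subset estimate
})

combined <- mean(stats)  # combine the per-subset estimates at the end
```

Each subset's dissimilarity matrix is 100 times smaller than the full
one (subset_size^2 vs. nrow(comm)^2 entries), which is what makes this
tractable in memory.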

HTH,

Peter

Péter Sólymos
Alberta Biodiversity Monitoring Institute
and Boreal Avian Modelling project
Department of Biological Sciences
CW 405, Biological Sciences Bldg
University of Alberta
Edmonton, Alberta, T6G 2E9, Canada
Phone: 780.492.8534
Fax: 780.492.7635
email <- paste("solymos", "ualberta.ca", sep = "@")
http://www.abmi.ca
http://www.borealbirds.ca
http://sites.google.com/site/psolymos



On Mon, Mar 5, 2012 at 4:42 PM, Bier Ekaphan Kraichak
<ekraichak at gmail.com> wrote:
> Dear list,
>
> I'm trying to perform the adonis and betadisper functions on a relatively
> large data set (7000 sites x 50 spp.) and found that R cannot allocate
> enough memory for the task (I got an error along the lines of "Cannot
> allocate vector of size xx Mb"). I have tried some of the high-performance
> computing packages mentioned in the CRAN Task Views (e.g. bigmemory), but
> it seems that functions in vegan cannot take the objects produced by these
> packages. Any suggestions on how to proceed with this type of analysis?
>
> Thank you,
> Bier Kraichak
>
>
> _______________________________________________
> R-sig-ecology mailing list
> R-sig-ecology at r-project.org
> https://stat.ethz.ch/mailman/listinfo/r-sig-ecology
>
