[R] Memory issues in R

Ben Bolker bolker at ufl.edu
Mon Apr 27 05:18:50 CEST 2009




Neotropical bat risk assessments wrote:
> 
> 
>    How do people deal with R and memory issues?
>    I have tried using gc() to see how much memory is used at each step.
>    Scanned Crawley's R Book and all other R books I have available, and
>    the FAQ on-line, but no help really found.
>    Running WinXP Pro (32 bit) with 4 GB RAM.
>    One SATA drive pair is in RAID 0 configuration with 10000 MB
>    allocated as virtual memory.
>    I do have another machine set up with Ubuntu but it only has 2 GB RAM
>    and have not been able to get R installed on that system.
>    I can run smaller sample data sets w/o problems and everything plots as
>    needed.
>    However I need to review large data sets.
>    Using latest R version 2.9.0 (2009-04-17)
>    My data is in CSV format with a header row and is a big data set with
>    1,200,240 rows!
>    E.g. below:
>    
> 

 Maybe not the general solution you're looking for, but would you get
reasonable results by either (1) subsampling the data or (2) reading the
data file in chunks and averaging the kernel densities you get from
each chunk?
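As a minimal sketch of idea (2): read the CSV in chunks using read.csv()'s
skip/nrows arguments, estimate density() for each chunk on a common grid,
and average the chunk-wise kernel density estimates. The function name,
chunk size, and the assumption of a numeric column are placeholders; adapt
to your actual file.

```r
## Sketch: chunked kernel density estimation for a large CSV.
## Assumes the file has a header row and `column` names a numeric column.
chunked_density <- function(file, column, chunk_size = 1e5,
                            from, to, n = 512) {
  header <- names(read.csv(file, nrows = 1))   # column names from header row
  y_sum <- numeric(n)
  n_chunks <- 0
  skip <- 1                                    # skip the header line first
  repeat {
    chunk <- tryCatch(
      read.csv(file, header = FALSE, skip = skip,
               nrows = chunk_size, col.names = header),
      error = function(e) NULL)                # no lines left -> stop
    if (is.null(chunk) || nrow(chunk) == 0) break
    ## density() on a fixed grid so chunk estimates can be averaged
    d <- density(chunk[[column]], from = from, to = to, n = n)
    y_sum <- y_sum + d$y
    n_chunks <- n_chunks + 1
    skip <- skip + nrow(chunk)
    if (nrow(chunk) < chunk_size) break        # last (short) chunk read
  }
  list(x = seq(from, to, length.out = n), y = y_sum / n_chunks)
}
```

For idea (1), one could instead keep a random fraction of each chunk as it
is read (e.g. chunk[sample(nrow(chunk), nrow(chunk) %/% 10), ]) and run
density() once on the accumulated subsample.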


-- 
View this message in context: http://www.nabble.com/Memory-issues-in-R-tp23243275p23249481.html
Sent from the R help mailing list archive at Nabble.com.



