[R] Problems with R memory usage on Linux

Prof Brian Ripley ripley at stats.ox.ac.uk
Wed Oct 15 19:51:36 CEST 2008


See ?"Memory-size"

On Wed, 15 Oct 2008, B. Bogart wrote:

> Hello all,
>
> I'm working with a large data set, and upgraded my RAM to 4 GB to help
> with memory use.
>
> I've got a 32-bit kernel with 64 GB high-memory (PAE) support compiled in.
>
> gnome-system-monitor and free both show the full 4 GB as available.
>
> In R I was doing some processing and got the following message when
> combining 100 data frames, each 307200 rows by 8 columns, into a
> single data frame for plotting:
>
> Error: cannot allocate vector of size 2.3 Mb
>
> So I checked the R memory usage:
>
> $ ps -C R -o size
>   SZ
> 3102548
>
> I tried removing some objects and running gc(); R then shows much less
> memory in use:
>
> $ ps -C R -o size
>   SZ
> 2732124
>
> That should give me an extra ~360 MB of headroom in R.
>
> I still get the same error about R being unable to allocate another 2.3 Mb.
>
> I deleted well over 2.3MB of objects...
>
> Any suggestions as to how to get around this?
>
> Is the only way to use all 4 GB in R to use a 64-bit kernel?
>
> Thanks all,
> B. Bogart
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
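
On why freeing ~360 MB does not help: gc() returns space to R's free
lists, but a failed 2.3 Mb allocation means no contiguous 2.3 Mb block
was left in the process's address space, which by then was fragmented
and near the roughly 3 GB per-process limit of a 32-bit build. A PAE
kernel lets the *kernel* address 4 GB and more of physical RAM, but it
does not enlarge any single process's virtual address space; to use
all 4 GB in one R session you need a 64-bit kernel *and* a 64-bit
build of R.

One way to lower the peak while combining the pieces is to bind them
once rather than growing the result repeatedly, since each incremental
rbind() copies everything accumulated so far. A sketch under those
assumptions (make_frame and the other names are hypothetical, not from
the post):

    ## Sketch (hypothetical names): bind once instead of growing the
    ## result in a loop, which copies the accumulated rows each time.
    make_frame <- function(i) data.frame(matrix(rnorm(307200 * 8), ncol = 8))
    frames   <- lapply(seq_len(100), make_frame)  # list of 100 frames
    combined <- do.call(rbind, frames)            # one rbind over the list
    rm(frames); gc()                              # drop the pieces

Note the combined object is still ~1.9 Gb of doubles, and holding the
list and the result together needs roughly twice that, so on a 32-bit
build this is at best marginal; the 64-bit route is the reliable fix.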

-- 
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595


