[R] R memory usage
Prof Brian Ripley
ripley at stats.ox.ac.uk
Thu Aug 9 09:45:24 CEST 2007
See
?gc
?"Memory-limits"
On Wed, 8 Aug 2007, Jun Ding wrote:
> Hi All,
>
> I have two questions about memory usage in R (sorry if
> the questions are naive, I am not familiar with this at
> all).
>
> 1) I am running R on a Linux cluster. From reading the R
> help pages, it seems there are no default upper limits
> for vsize or nsize. Is this right? Is there an upper
> limit on total memory usage? How can I find out the
> default in my specific Linux environment, and can I
> increase it?
See ?"Memory-limits", but that is principally a Linux question.
>
> 2) I use R to read in several big files (~200Mb each),
> and then I run:
>
> gc()
>
> I get:
>
>             used  (Mb) gc trigger   (Mb)  max used   (Mb)
> Ncells  23083130 616.4   51411332 1372.9  51411332 1372.9
> Vcells 106644603 813.7  240815267 1837.3 227550003 1736.1
>
> What do the columns "used", "gc trigger" and "max used"
> mean? It seems I have used 616.4 Mb of Ncells and
> 813.7 Mb of Vcells. Compared with the "max used" numbers,
> I should still have enough memory. But when I try
>
> object.size(area.results)  ## area.results is a big data.frame
>
> I get an error message:
>
> Error: cannot allocate vector of size 32768 Kb
>
> Why is that? It looks like I am running out of memory.
> Is there a way to solve this problem?
>
> Thank you very much!
>
> Jun
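The error means that one further allocation of 32 Mb failed on top of
what was already in use (see ?"Memory-limits"), not that only 32 Mb
were needed in total.  A sketch of freeing space before retrying (the
object names other than area.results are hypothetical):

    rm(tmp.matrix, old.results)   # drop objects no longer needed
                                  # (hypothetical names)
    gc()                          # collect, so the freed space can be reused
    object.size(area.results)     # then retry the call that failed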
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595