[R] Cleaning up the memory
Prof Brian Ripley
ripley at stats.ox.ac.uk
Fri Aug 10 19:28:07 CEST 2007
On Fri, 10 Aug 2007, Monica Pisica wrote:
>
> Hi,
>
> I have 4 huge tables on which I want to do a PCA analysis and a k-means
> clustering. If I run each table individually I have no problems, but if
> I run them in a for loop I exceed the memory allocation after the
> second table, even if I save the results as a CSV file and I clean up
> all the big objects with the rm command. To me it seems that even though
> I no longer have the objects, the memory these objects used to occupy is
> not cleared. Is there any way to clear that memory as well? I don't want
> to close R and start it up again. Also, I am running R under Windows.
See ?gc, which does the clearing.
However, unless you study the memory allocation in detail (which you
cannot do from R code), you don't actually know that this is the problem.
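For example, a rough sketch of the kind of loop described above, removing
the large objects and calling gc() explicitly after each table (the file
names, the assumption that all columns are numeric, and the analysis steps
are only placeholders):

    files <- c("table1.csv", "table2.csv", "table3.csv", "table4.csv")
    for (f in files) {
        dat <- read.csv(f)                   # one 'huge' table at a time
        pca <- prcomp(dat, scale. = TRUE)    # PCA on the numeric columns
        km  <- kmeans(pca$x[, 1:2], centers = 5)
        write.csv(cbind(pca$x[, 1:2], cluster = km$cluster),
                  file = paste("results_", f, sep = ""))
        rm(dat, pca, km)                     # drop the large objects ...
        gc()                                 # ... and run the garbage collector
    }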
More likely is that you have fragmentation of your 32-bit address space:
see ?"Memory-limits".
Without any idea what memory you have and what 'huge' means, we can only
make wild guesses. It might be worth raising the memory limit (the
--max-mem-size flag).
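As an illustration (the 2500 MB value is only an example, and these
functions apply to R on Windows only), the limit can be set when R is
started, or queried and raised from within the session:

    ## in the Windows shortcut or on the command line:
    ##   Rgui.exe --max-mem-size=2500M
    memory.limit()              # current limit in MB
    memory.size(max = TRUE)     # most memory obtained from the OS so far, in MB
    memory.limit(size = 2500)   # try to raise the limit to 2500 MB

On a 32-bit Windows build the usable address space per process is normally
around 2 GB in any case, which is why fragmentation can bite well before
the nominal limit is reached.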
>
> thanks,
>
> Monica
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel:  +44 1865 272861 (self)
1 South Parks Road,                     +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax:  +44 1865 272595