[R] Clearing out or reclaiming memory

gug guygreen at netvigator.com
Tue Jun 30 13:36:26 CEST 2009


Thanks - that's great.  A combination of "object.size", "rm" and "gc" seems
to be enough for me to work out what was causing the problem and then get
beyond it.

In particular, using "rm" on the result of the multiple regression seems to
make a big difference: it wasn't obvious to me before, but the memory taken
up by that result isn't freed just by running a new regression and assigning
it to the same output name.  Also not obvious to me was that it was the
fitted regression object itself that was taking up so much memory, as
opposed to just the underlying data.
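
In case it helps others finding this thread, here is a minimal sketch of the
sequence that worked for me (the object and column names are placeholders,
not my actual data):

  ## hypothetical data frame and fitted model
  dat <- data.frame(y = rnorm(1000), x1 = rnorm(1000), x2 = rnorm(1000))
  fit <- lm(y ~ x1 + x2, data = dat)

  object.size(fit)   # the fitted model can be much larger than 'dat' itself
  object.size(dat)

  rm(fit)            # drop the old result before (or instead of) refitting
  gc()               # ask R to return the freed memory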

I've been using "attach" because I was following one of the approaches
recommended in this "Basic Statistics and R" tutorial
(http://ehsan.karim.googlepages.com/lab251t3.pdf), in order to be able to
easily use the column headings within the regression formula.

I hadn't used "detach" at this point because I was aiming to use the same
data (i.e. the result of the rbind) in the later operations, with only the
factors being tested changing.  I will look into the "with" approach you
mention - thanks for that.
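
My understanding of the "with" alternative (placeholder names again, so
treat this as a sketch rather than exactly what I'll run) is roughly:

  ## instead of: attach(dat); lm(y ~ x1 + x2); detach(dat)
  fit <- with(dat, lm(y ~ x1 + x2))

  ## or, for lm specifically, pass the data frame directly
  fit <- lm(y ~ x1 + x2, data = dat)

  search()           # confirm nothing is left attached on the search path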

Thanks again,
Guy


jholtman wrote:
> 
> You can use 'rm' to remove objects.  Are you remembering to do a 'detach'
> after the 'attach'?  Why are you using 'attach' (I personally avoid it)?
> Think about using 'with'.  Take a look at the size of the objects you are
> working with (object.size) to understand where you might have problems.
> Use 'search' to see what still might be attached.  I think that as long as
> something is 'attached', memory is not freed up after 'rm' until you do
> the 'detach'.
> 
> -- 
> Jim Holtman
> Cincinnati, OH
> +1 513 646 9390
> 
> What is the problem that you are trying to solve?
> 
> 
> 
