[R] Memory usage of R on Windows XP

Petr Pikal petr.pikal at precheza.cz
Fri May 21 07:24:28 CEST 2004



On 20 May 2004 at 20:58, Peter Wilkinson wrote:

> 
> I am running R 1.8.1 on Windows XP. I have been using the 'apply' R
> function to run a short function against a matrix of 19000 rows x 340
> columns ... yes, it is a big matrix. Every item in the matrix is a
> float that can have a maximum value of 2^16 ~ 65k.

Hi

I would go with plain vectorized computation; it should be quicker.

(yourmatrix >= 256) * 1

will give you 1 wherever the value was greater than or equal to 256 and 0 
where it was lower.

I am not sure about the memory issues, but I would expect better performance 
than the if-else apply construction.
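As a sketch (using a hypothetical small matrix `m` in place of the real 19000 x 340 one), the vectorized version produces the same 0/1 matrix as the element-wise function:

```r
# Hypothetical small matrix standing in for the 19000 x 340 one
m <- matrix(c(10, 300, 256, 255), nrow = 2)

# Vectorized mask: the comparison yields a logical matrix, and * 1
# coerces it to 0/1 in a single pass, with no per-element function calls
masked <- (m >= 256) * 1

# The original element-wise function, for comparison
mask256 <- function(value) if (value < 256) 0 else 1

# Both give the same result; the vectorized form avoids the
# per-element call overhead of apply()
identical(masked, apply(m, c(1, 2), mask256))  # TRUE
```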

Cheers
Petr


> 
> The function:
> 
> mask256 <- function(value) {
>      if (value < 256) {
>         result = 0
>      }
>      else {
>         result = 1
>      }
>      result
> }
> 
> what happens is that the memory required for the session to run starts
> ballooning. The matrix with a few other objects starts at about 160M,
> but then quickly goes up to 750M, and stays there when the function
> has completed.
> 
> I am fairly new to R. Is there something I should know about writing
> functions, i.e. do I need to clean up at the end of the function? It
> seems R cannot release the memory once it has been used. When I close

I suppose R is doing its best, but Windows does not take the memory back.
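For what it is worth, within a session you can at least drop the reference and trigger a collection yourself; a minimal sketch (the object name `big` is made up here):

```r
# Hypothetical large object standing in for the 19000 x 340 result
big <- matrix(runif(1e6), nrow = 1000)
print(object.size(big))  # roughly 8 MB of doubles

rm(big)          # drop the only reference to the object
invisible(gc())  # gc() reclaims the space inside R; whether the OS
                 # reports it as freed depends on the platform allocator
```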

> the R application and open the R application again, then the memory is
> back down to what it is supposed to be: the size of the workspace,
> plus any new objects that I have created.
> 
> Does anybody know what is going on?
> 
> Peter
> 
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide!
> http://www.R-project.org/posting-guide.html

Petr Pikal
petr.pikal at precheza.cz



