[Rd] R scripts slowing down after repeated calls to compiled code
Vladimir Dergachev
vdergachev at rcgardis.com
Sat May 26 01:29:55 CEST 2007
On Friday 25 May 2007 7:12 pm, Michael Braun wrote:
> Thanks in advance to anyone who might be able to help me with this.
>
> Also, it is not just the compiled call that slows down. EVERYTHING
> slows down, even calls that consist only of standard R functions. The
> time for each of these function calls is roughly proportional to the
> time of the .Call to the C function.
>
> Another observation is that when I terminate the algorithm, do an
> rm(list=ls()), and then a gc(), not all of the memory is returned to the
> OS. It is not until I terminate the R session that I get all of the
> memory back. In my C code, I am not doing anything to de-allocate the
> SEXPs I create, relying on the PROTECT/UNPROTECT mechanism instead (is
> this right?).
>
> I spent most of the day thinking I had a memory leak, but that no
> longer appears to be the case. I tried using Rprof(), but that only
> gives me the aggregated relative time spent in each function (more than
> 80% of the time is spent in the .Call).
One possibility is that you are somehow creating a lot of R objects (say, by
calling assign() or by missing an UNPROTECT()), and this slows the garbage
collector down. The garbage collector's running time grows with the number of
objects you have; their total size does not have to be large.
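As a rough illustration (a minimal sketch; the object count is made up and the
timings will vary), filling an environment with many small objects makes a
subsequent gc() noticeably slower:

    ## Minimal sketch: gc() running time grows with the number of live
    ## objects, even though their total size stays small.
    e <- new.env()
    system.time(gc())        # baseline collection time
    for (i in seq_len(200000)) {
        ## each assign() adds one more small object the collector must walk
        assign(paste("x", i, sep = ""), i, envir = e)
    }
    system.time(gc())        # noticeably slower with 200000 extra live objects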
Could you try printing the numbers from a gc() call and checking whether the
number of allocated objects grows a lot?
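For example, something along these lines (a sketch; .Call("my_c_routine") is a
stand-in for your actual compiled call) would show whether the "used" cell
counts creep upward from iteration to iteration:

    before <- gc()[, "used"]
    for (k in 1:10) {
        ans <- .Call("my_c_routine")   # placeholder for your real .Call
        after <- gc()[, "used"]
        ## steadily growing Ncells counts suggest objects that are never
        ## released, e.g. an unbalanced PROTECT/UNPROTECT
        print(after - before)
        before <- after
    }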
best
Vladimir Dergachev
>
> So I'm stuck. Can anyone help?
>
> Thanks,
>
> Michael