[Rd] R scripts slowing down after repeated calls to compiled code

Dirk Eddelbuettel edd at debian.org
Sat May 26 03:40:18 CEST 2007


On 25 May 2007 at 19:12, Michael Braun wrote:
| So I'm stuck.  Can anyone help?

It sounds like a memory issue. Your memory may just be getting fragmented. One
tool that may help you find leaks is valgrind -- see the 'R Extensions' manual.
I can also recommend visualisers like kcachegrind (part of KDE).
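For reference, the invocation suggested in 'Writing R Extensions' is along the
lines of

    R -d "valgrind --leak-check=full" --vanilla < yourScript.R

where yourScript.R is just a stand-in for a small script exercising your
compiled code; valgrind slows things down considerably, so keep the test case
small.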

But it may not be a leak. I have found that R just doesn't cope well with many
large memory allocations and releases -- I often loop over data requests that I
subset and process. This drives my 'peak' memory use to 1.5 or 1.7gb on 32-bit
multicore machines with 4gb, 6gb or 8gb of RAM (32-bit implying the hard 3gb
per-process limit). And I just can't loop over many such tasks. So I now use
the littler frontend to script this, dump the processed chunks as Rdata files
and later re-read the pieces; a sketch follows below. That works reliably.
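Roughly, the pattern looks like this (nChunks, get.chunk() and process() are
just placeholders for whatever your own code does):

    ## stage one: process each chunk and save it; with littler, run
    ## this once per chunk so every chunk gets a fresh R process
    for (i in 1:nChunks) {
        chunk <- process(get.chunk(i))       # subset and crunch one piece
        save(chunk, file=sprintf("chunk_%03d.Rdata", i))
    }

    ## stage two, in a later session: re-read and combine the pieces
    pieces <- lapply(list.files(pattern="^chunk_.*\\.Rdata$"),
                     function(f) { load(f); chunk })
    result <- do.call(rbind, pieces)

The point of driving stage one from littler is that each chunk runs in its own
short-lived process, so peak memory never accumulates across chunks.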

So one thing you could try is to dump your data in a 'gsl ready' format from R,
quit R, leave it out of the equation, and then see what happens if you do the
iterations in only GSL and your code.
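As a minimal sketch of the dump step (assuming your data is a numeric matrix X
and that the C side reads it back with gsl_matrix_fread(), which expects raw
native-endian doubles in row-major order):

    con <- file("X.bin", "wb")
    writeBin(as.vector(t(X)), con, size=8)   # t() gives row-major order
    close(con)

A small standalone C program can then gsl_matrix_fread() the file and run the
iterations with R entirely out of the picture.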

Hth, Dirk

-- 
Hell, there are no rules here - we're trying to accomplish something. 
                                                  -- Thomas A. Edison


