[Rd] list_files() memory corruption?
Seth Falcon
seth at userprimary.net
Wed Mar 17 17:42:39 CET 2010
On 3/17/10 7:16 AM, Alistair Gee wrote:
> Yes. I had noticed that R occasionally segfaults (especially when I
> run many concurrent R processes), so I used valgrind to log every use
> of R. In the valgrind logs, I tracked the problem to list_files().
>
> I attached a patch to platform.c (for trunk). Unfortunately, I am
> having trouble building R from the subversion trunk--it is taking a
> very long time decompressing/installing the recommended packages--so I
> haven't been able to verify the fix yet. But my version of platform.c
> does compile, and it simplifies the code because count_files() is no
> longer needed.
Hmm, I see that you "grow" the vector containing filenames by calling
lengthgets and doubling the length. I don't see where you cleanup
before returning -- seems likely you will end up returning a vector that
is too long.
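
For reference, the usual shape of that grow-then-truncate pattern with
lengthgets looks roughly like the sketch below. This is a minimal
illustration, not your patch: collect_names() and the src array are
made-up stand-ins for the real directory traversal.

    #include <Rinternals.h>

    /* Grow a STRSXP by doubling, then shrink it to the number of
       elements actually filled before returning. */
    SEXP collect_names(const char **src, int n)
    {
        R_len_t used = 0, alloced = 16;
        PROTECT_INDEX ipx;
        SEXP ans;

        PROTECT_WITH_INDEX(ans = allocVector(STRSXP, alloced), &ipx);
        for (int i = 0; i < n; i++) {
            if (used == alloced) {
                /* lengthgets allocates a new vector and copies
                   the existing elements into it */
                alloced *= 2;
                REPROTECT(ans = lengthgets(ans, alloced), ipx);
            }
            SET_STRING_ELT(ans, used++, mkChar(src[i]));
        }
        /* the step at issue: truncate to the used length */
        if (used < alloced)
            REPROTECT(ans = lengthgets(ans, used), ipx);
        UNPROTECT(1);
        return ans;
    }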
And there are some performance characteristics to consider, in terms of
both run time and memory profile. Does making a single pass through the
files make up for the allocations and data copying that result from
lengthgets? Is it worth possibly requiring twice the memory in the
worst case?
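
To put numbers on that: doubling from an initial allocation of, say, 16
up to N entries copies at most 16 + 32 + ... + N/2 < N elements in
total, so the copying itself is amortized O(1) per file. But right
after the last doubling the vector can hold up to 2N entries, so until
you truncate it the worst-case over-allocation is a factor of two. The
two-pass count_files() approach allocates exactly once at the right
size, at the cost of reading the directory twice.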
+ seth