[R] How to clear R memory in a for loop

Dimitri Liakhovitski dimitri.liakhovitski at gmail.com
Wed Oct 22 21:37:09 CEST 2014


Thank you very much for looking into it, gentlemen!
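The per-iteration pattern discussed below can be sketched in R as follows. The folder name and the summary step are hypothetical; `readMP3()` is the tuneR reader the thread concerns, and keeping only a small summary per file (rather than the whole Wave object) is the key point:

```r
library(tuneR)  # provides readMP3(), which returns a Wave object

files <- list.files("mp3_folder", pattern = "\\.mp3$", full.names = TRUE)
results <- vector("list", length(files))

for (i in seq_along(files)) {
  w <- readMP3(files[i])          # large Wave object for this file
  results[[i]] <- length(w@left)  # keep only a small summary, not the Wave
  rm(w)                           # drop the big object before the next pass
  gc()                            # request a garbage collection
}
```

Note that `rm()` plus `gc()` only releases memory that nothing else still references; if each iteration appends the full Wave to a growing structure, memory use will climb regardless.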

On Tue, Oct 21, 2014 at 2:29 PM, Uwe Ligges
<ligges at statistik.tu-dortmund.de> wrote:
>
>
> On 21.10.2014 19:00, William Dunlap wrote:
>>
>> A few minutes with valgrind showed that output_pos was never
>> initialized, so the output array was not getting filled correctly.
>> The following fixes that problem
>>
>> diff -ru tuneR/src/readmp3.c /homes/bill/packages/tuneR/src/readmp3.c
>> --- tuneR/src/readmp3.c 2014-04-07 04:38:21.000000000 -0700
>> +++ /homes/bill/packages/tuneR/src/readmp3.c    2014-10-21
>> 09:54:19.351867000 -0700
>> @@ -96,6 +96,7 @@
>>     state.input = blob;
>>     state.input_size = n_blob;
>>     state.output_size = 0;
>> +  state.output_pos = 0;
>>     mad_decoder_init(&decoder, &state,
>>              mad_input_cb, mad_header_cb, NULL,
>>              NULL, NULL, NULL);
>
>
>
> Thanks, Bill!
> I haven't found the time to look at it.
> Now in the master sources; a bugfix release will follow shortly,
> Uwe
>
>
>
>> Bill Dunlap
>> TIBCO Software
>> wdunlap tibco.com
>>
>>
>> On Tue, Oct 21, 2014 at 7:51 AM, Prof Brian Ripley
>> <ripley at stats.ox.ac.uk> wrote:
>>>
>>> On 21/10/2014 15:47, Dimitri Liakhovitski wrote:
>>>>
>>>>
>>>> I will try with .wav files and report back.
>>>> So far, I am not sure I understood what could be done (if anything)
>>>> to fix it...
>>>
>>>
>>>
>>> This is nothing to do with my reply!
>>>
>>> The posting guide asked you to contact the tuneR maintainer *before
>>> posting*.  What did he say?
>>>
>>> Bill Dunlap's reply pointed to a bug in tuneR (or a library it uses).
>>>
>>>
>>>>
>>>> On Tue, Oct 21, 2014 at 2:26 AM, Prof Brian Ripley
>>>> <ripley at stats.ox.ac.uk> wrote:
>>>>>
>>>>>
>>>>> On 20/10/2014 17:53, John McKown wrote:
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Mon, Oct 20, 2014 at 10:30 AM, Dimitri Liakhovitski <
>>>>>> dimitri.liakhovitski at gmail.com> wrote:
>>>>>>
>>>>>>> Dear Rers,
>>>>>>>
>>>>>>> I am trying to run a for-loop in R.
>>>>>>> During each iteration I read in an mp3 file and do some basic
>>>>>>> processing.
>>>>>>> If I do what I need to do for each file one by one - it works fine.
>>>>>>> But once I start running a loop, it soon runs out of memory and
>>>>>>> says: can't allocate a vector of size...
>>>>>>> In each iteration of my loop I always overwrite the previously
>>>>>>> created object and do gc().
>>>>>>>
>>>>>>> Any hints on how to fight this?
>>>>>>>
>>>>>>> Thanks a lot!
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>> Please don't use HTML for messages.
>>>>>>
>>>>>> What occurs to me, from reading the other replies, is that perhaps
>>>>>> within the loop you are causing other objects to be allocated. And
>>>>>> that can happen just by doing a simple assignment, so it may not be
>>>>>> obvious. What this can do is cause what we called a "sand bar" in
>>>>>> the old days. That's where you allocate a big chunk of memory for
>>>>>> an object. Say this takes up 1/2 of your available space. You now
>>>>>> create a small object. This object is _probably_ right next to the
>>>>>> large object. You now release the large object. Your apparent free
>>>>>> space is now almost what it was at the beginning. But when you try
>>>>>> to allocate another large object which is, say, 2/3 of the maximum
>>>>>> space, you can't, because that small object is sitting right in the
>>>>>> middle of your memory space. So you _can_ allocate 2 large objects
>>>>>> which are each 1/3 of your free space, but not 1 object which is
>>>>>> 2/3 of the free space. Which can lead to your type of situation.
>>>>>>
>>>>>> This is just a SWAG based on some experience in other systems. Most
>>>>>> garbage collectors do _not_ do memory consolidation. I don't know
>>>>>> about R.
>>>>>>
>>>>>>
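[The "sand bar" above can be simulated with a toy first-fit allocator in R. The allocator and the sizes are purely illustrative, not how R's memory manager actually works:]

```r
# Toy first-fit allocator over a 90-unit address space. A small live
# object stranded next to a freed region blocks one big contiguous
# allocation even though plenty of space is free in total.
space <- rep(FALSE, 90)            # FALSE = free, TRUE = in use

first_fit <- function(space, size) {
  run <- 0
  for (i in seq_along(space)) {
    run <- if (space[i]) 0 else run + 1   # count consecutive free cells
    if (run == size) return((i - size + 1):i)
  }
  NULL                             # no contiguous run large enough
}

big   <- first_fit(space, 45); space[big]   <- TRUE  # ~1/2 of the space
small <- first_fit(space, 1);  space[small] <- TRUE  # lands right after it
space[big] <- FALSE                                  # release the big object

sum(!space)                        # 89 units free in total
is.null(first_fit(space, 60))      # TRUE: no single 60-unit hole exists
is.null(first_fit(space, 30))      # FALSE: a 30-unit object still fits
```

[The small object at position 46 splits the free space into a 45-unit and a 44-unit hole, so two 30-unit objects fit but one 60-unit object does not.]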
>>>>> That is true of R (except for the early days, which did have a
>>>>> moving garbage collector).
>>>>>
>>>>> However, 'your available space' is not the amount of RAM you have
>>>>> but the process address space.  The latter is enormous on any 64-bit
>>>>> OS, so 'memory fragmentation' (as this is termed) is a thing of the
>>>>> past except for those limited to many-years-old OSes.
>>>>>
>>>>>
>>>>> --
>>>>> Brian D. Ripley,                  ripley at stats.ox.ac.uk
>>>>> Emeritus Professor of Applied Statistics, University of Oxford
>>>>> 1 South Parks Road, Oxford OX1 3TG, UK
>>>
>>>
>>>
>>>
>>>
>>> --
>>> Brian D. Ripley,                  ripley at stats.ox.ac.uk
>>> Emeritus Professor of Applied Statistics, University of Oxford
>>> 1 South Parks Road, Oxford OX1 3TG, UK
>>>
>>> ______________________________________________
>>> R-help at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
>>> http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.



-- 
Dimitri Liakhovitski
