[R] [R-sig-hpc] recursion depth limitations

Rui Barradas ruipbarradas at sapo.pt
Thu Dec 13 23:21:36 CET 2012


Hello.

Inline.
On 13-12-2012 21:31, Suzen, Mehmet wrote:
> On 13 December 2012 17:52, Simon Urbanek <simon.urbanek at r-project.org> wrote:
>> Because it's far beyond what you can handle without changing a lot of other things. 500k expressions will require at least about 320Mb of stack (!) in the eval() chain alone -- compare that to the 8Mb stack size which is the default in most OSes, so you'll hit the wall way before that limit is reached.
>>
> Thank you for the explanation. Sorry to be a dummy about this, but why
> does one need a stack? I thought a function pointing to itself had no
> memory cost.

But it does: each recursive call pushes a new evaluation frame onto the 
call stack, with its own copy of the function's arguments and local 
variables. The cost can therefore become quite large, because all of 
that per-call state stays in memory until the recursion unwinds.

Hope this helps,

Rui Barradas
>   Is it about how compilers are designed,
> or about R being a dynamic language?
>
>>> Some algorithms are only feasibly solvable with recursion, and 500K
>>> does not sound like too much, e.g. for graph algorithms:
>>> dependency trees with large nodes easily reach that number.
>>>
>> I don't see how "large nodes" have anything to do with this since we are talking about expression depth, not about sizes of any kind.
>> Again, in any realistic example you'll hit other limits first anyway.
> I was thinking of a very big tree with large depth, so each recursion
> step may correspond to one leaf. I'm not sure what application
> would have a depth of a million; maybe a genetic algorithm.
>
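
For what it's worth, when the depth really cannot be reduced, the usual
workaround is to stop recursing and keep your own stack of nodes to
visit, so the call stack stays flat. A rough sketch -- the nested-list
"tree" is purely made up for the example:

count_leaves <- function(tree) {
  stack  <- list(tree)        # our own stack of nodes still to visit
  leaves <- 0L
  while (length(stack) > 0L) {
    node  <- stack[[length(stack)]]
    stack <- stack[-length(stack)]
    if (is.list(node) && length(node) > 0L) {
      stack <- c(stack, node) # push the children
    } else {
      leaves <- leaves + 1L
    }
  }
  leaves
}

## A list nested 100000 levels deep -- far beyond what plain recursion
## in R can handle:
deep <- 1
for (i in seq_len(100000)) deep <- list(deep)
count_leaves(deep)           # 1, and no "evaluation nested too deeply"
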
> Cheers,
> Mehmet
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



