[R] error heatmap and stack overflow

Ben Bolker bbolker at gmail.com
Sat Aug 21 15:42:13 CEST 2010


>    michy <m.simon <at> har.mrc.ac.uk> writes:

> 
> Hello,
> I'm trying to create a heatmap with a dataset (38 x 15037) but get the 
> error below:
> 
> Error: protect(): protection stack overflow
> Execution halted
> 
> or
> 
> Error: C stack usage is too close to the limit
> Execution halted
> 
> I tried to increase the stack size by changing:
> 
> extern uintptr_t R_CStackLimit
> 
>  but my systems manager said that R by default uses all the memory 
> available to it from the operating system.  Our machine has 128G.  I 
> also used options(expressions = 100000), but I still get the above 
> errors.  Can anyone help?  Am I trying to change the wrong thing, or is 
> there anything else I can do?

  Can you provide a reproducible example, and the results of
sessionInfo() ?
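
  For the C-stack error in particular, it would also help to see the
output of Cstack_info() (base R), which reports the C stack limit and
current usage -- that tells us whether it's really the C stack (rather
than R's heap) being exhausted:

  ## Both functions are in base R; paste their output into your reply.
  > sessionInfo()
  > Cstack_info()   # reports size, current usage, direction, eval_depth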

  I need to find a machine where I can get through this -- on
one machine (Ubuntu Lucid running under VMware) I get

  > set.seed(1001)
  > pdf("heatmap_test.pdf")
  > heatmap(matrix(runif(38*15037),ncol=38))
  Error: cannot allocate vector of size 862.5 Mb

   On the same machine, but on the non-virtual
Mac OS side (R64), it's still running
after 21 minutes, with stable memory usage of about 5G.
Both are with R 2.11.1.
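
  Incidentally, the 862.5 Mb in that error is just what you'd expect
for the lower-triangle distance matrix that heatmap()'s row clustering
has to build for 15037 rows -- a quick back-of-the-envelope check:

  ## dist() stores n*(n-1)/2 distances as 8-byte doubles
  > n <- 15037
  > n * (n - 1) / 2 * 8 / 2^20
  [1] 862.4935

so the allocation failure is coming from dist(), not from anything
exotic.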

 It does strike me that your problem is likely (?) to 
be an actual bug: most of the hits one gets on http://rseek.org
for "C stack usage" seem to be from people using embedded R,
running R from within Python, etc.

  Have you thought about using a less memory-intensive clustering
algorithm?
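
  For example (an untested sketch, with an arbitrary choice of 50
clusters): reduce the rows with kmeans() first and draw the heatmap of
the cluster centers. kmeans() never forms the full n x n distance
matrix, so memory stays modest even for 15037 rows:

  ## Cluster the 15037 rows into 50 groups, then heatmap the centers;
  ## avoids allocating the ~862 Mb dist object entirely.
  > set.seed(1001)
  > x  <- matrix(runif(38 * 15037), ncol = 38)
  > km <- kmeans(x, centers = 50, iter.max = 50)
  > heatmap(km$centers)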



More information about the R-help mailing list