[Rd] segfault when trying to allocate a large vector
Karl Millar
kmillar at google.com
Thu Dec 18 09:44:05 CET 2014
Hi Pierrick,
You're storing largevec on the stack, which is probably causing a stack
overflow. Allocate largevec on the heap with malloc or one of the R memory
allocation routines instead and it should work fine.
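
For example, here is a minimal (untested) sketch of the heap-based
version, using R_alloc so the memory is reclaimed automatically when
the .Call returns:

#include <string.h>
#include <Rinternals.h>
#include <R_ext/Memory.h>

SEXP test(void) {
    int size = 10000000;
    /* R_alloc takes the buffer off the C stack; R frees it
       automatically at the end of the .Call */
    double *largevec = (double *) R_alloc(size, sizeof(double));
    memset(largevec, 0, size * sizeof(double));
    return R_NilValue;
}

With plain malloc you would need a matching free() before every
return path, so R_alloc is usually the more convenient choice here.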
Karl
On Thu, Dec 18, 2014 at 12:00 AM, Pierrick Bruneau <pbruneau at gmail.com>
wrote:
>
> Dear R contributors,
>
> I'm running into trouble when trying to allocate some large (but in
> theory viable) vector in the context of C code bound to R through
> .Call(). Here is some sample code summarizing the problem:
>
> SEXP test() {
>     int size = 10000000;
>     double largevec[size];
>     memset(largevec, 0, size * sizeof(double));
>     return R_NilValue;
> }
>
> If size is small enough (up to 10^6), everything is fine. When it
> reaches 10^7 as above, I get a segfault. As far as I know, a double
> value is represented with 8 bytes, which would make largevec above
> approx. 80 MB -> this is certainly large for a single variable, but
> it should remain well below the limits of my machine... Also, doing
> a calloc for the same vector size leads to the same outcome.
>
> In my package, I would use large vectors that cannot be assumed to
> be sparse - so sparse-matrix utilities are not an option.
>
> I run R on 64-bit Ubuntu with 8 GB of RAM and a 64-bit R build
> (3.1.2). As my problem looks close to the one seen in
> http://r.789695.n4.nabble.com/allocMatrix-limits-td864864.html, and
> following what I had seen in ?"Memory-limits", I checked that ulimit
> -v returns "unlimited".
>
> I guess I must be missing something, like a contiguity issue or
> something else. Does anyone have a clue for me?
>
> Thanks in advance,
> Pierrick
>