[Rd] segfault when trying to allocate a large vector
Pierrick Bruneau
pbruneau at gmail.com
Thu Dec 18 09:00:26 CET 2014
Dear R contributors,
I'm running into trouble when trying to allocate a large (but, in
theory, viable) vector in C code bound to R through .Call(). Here is a
sample that summarizes the problem:
#include <string.h>
#include <Rinternals.h>

SEXP test(void) {
    int size = 10000000;
    double largevec[size];                      /* variable-length array on the stack */
    memset(largevec, 0, size * sizeof(double)); /* zero-fill the whole vector */
    return R_NilValue;
}
If size is small enough (up to 10^6), everything is fine. When it
reaches 10^7 as above, I get a segfault. As far as I know, a double is
represented with 8 bytes, which would make largevec above approximately
80 MB: this is certainly large for a single variable, but should remain
well below the limits of my machine. Also, doing a calloc for the same
vector size leads to the same outcome.
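For completeness, the calloc variant I tried is roughly along these
lines (just a sketch; the NULL check and the name test_calloc are only
for illustration here):

#include <stdlib.h>
#include <string.h>
#include <Rinternals.h>

SEXP test_calloc(void) {
    int size = 10000000;
    double *largevec = calloc(size, sizeof(double));  /* heap allocation */
    if (largevec == NULL)
        Rf_error("calloc failed for %d doubles", size);
    /* memset is redundant after calloc, which already zero-fills,
       but kept to mirror the original test */
    memset(largevec, 0, size * sizeof(double));
    free(largevec);
    return R_NilValue;
}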
In my package I need large vectors that cannot be assumed to be
sparse, so sparse-matrix utilities are not an option.
I run R on 64-bit Ubuntu with 8 GB of RAM, and a 64-bit R build (3.1.2).
As my problem looks similar to the one described in
http://r.789695.n4.nabble.com/allocMatrix-limits-td864864.html, and
following what I have seen in ?"Memory-limits", I checked that ulimit
-v returns "unlimited".
I guess I must be missing something, like contiguity issues or
something else. Does anyone have a clue for me?
Thanks in advance,
Pierrick