[R] R --nsize 2M runs havoc (under linux)

Martyn Plummer plummer at iarc.fr
Wed Oct 6 17:21:24 CEST 1999


Each cons cell is 16 bytes, so asking for a 10MB vector heap and 3 million
cons cells will give you close to 60MB of memory. Add a little extra
for the R program itself and your figure looks reasonable (remembering
that the reported figure is in kB, not MB, as Peter pointed out).
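For concreteness, a back-of-the-envelope check (a rough sketch, assuming
16-byte cons cells and decimal megabytes):

    nsize <- 3e6                  # --nsize 3M: 3 million cons cells
    vsize <- 10e6                 # --vsize 10M: 10MB vector heap
    (nsize * 16 + vsize) / 1e6    # 58, i.e. close to 60MB before R itself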

This is a lot of memory on a machine with 128MB of RAM. Even if R doesn't
crash, it will be very slow because it will be using the swap partition
(you do have one, and it is enabled, I hope). I suspect the reason you
can't reproduce the bug on Solaris is that your Sun has 4GB of memory, so
it probably won't be swapping at all.
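(On Linux you can confirm that swap is active with, for example, free or
cat /proc/swaps.)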

Do you really need that many cons cells? The default is 250,000.
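One way to check is to look at gc()'s report of cons cell usage after a
typical session (a minimal sketch; the exact layout of gc()'s output
differs between R versions):

    gc()           # the cons cell row shows cells in use vs. available
    gcinfo(TRUE)   # report usage at each garbage collection from now on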

Martyn

On 06-Oct-99 Joerg Kindermann wrote:
> Dear All,
> 
> I am running R version 0.65.0 under
> 
> a) SuSE Linux 6.1 and SuSE Linux 6.2, compiler gcc-2.95, CPUs: Pentium Pro
> 200 with 128MB, and Pentium II 450 with 128MB
> b) Solaris 5.7, compiler gcc-2.95, CPU: Sun SPARC, 4000MB
> 
> When I set --nsize to more than 1M, R's internal storage management wreaks
> havoc. gc() reports the requested sizes, but the overall process size is
> much too big: running R with --vsize 10M --nsize 3M will, for example,
> result in a process size of 63.276 MB! Using such an R process will lead
> to a segmentation fault sooner or later, usually inside R's storage
> allocation routine. However, I cannot reproduce the strange behavior
> under Solaris.
> 
> I would be glad to hear from anyone who has encountered and fixed(!) this
> problem.