Dear R-users,

I am currently facing what appears to be a strange thing (at least to my
humble understanding).

If I understood correctly, starting with version 1.2.3 R's memory
allocation is done dynamically, and there is no need to fiddle with the
--nsize and --vsize parameters any longer.
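
To make explicit what I mean (this is only my reading of the old and new
documentation, so the details may be off): before 1.2.x the heaps had to
be reserved up front at startup, along the lines of

  R --vsize=500M --nsize=2M

whereas now one simply starts R and lets the heaps grow on demand,
checking the usage from within R:

> gc()    # triggers a collection and reports the Ncells/Vcells usage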

So far everything seemed to work that way (I saw the size of my processes
grow when I was using big objects, and so on). However, I recently ran
into memory trouble: there seems to be a limit of about 1.2 GB, beyond
which R starts issuing memory allocation errors that are not consistent
with the memory still available (e.g. 'Error: cannot allocate vector of
size 125382 Kb', while there are still about 17 GB free).
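
To put those numbers in perspective (my arithmetic, so worth checking):
the vector R refuses to allocate is only about 122 MB, i.e. roughly 16
million doubles, so schematically the failure looks like

> 125382 / 1024             # size of the refused vector, in MB
[1] 122.4434
> 125382 * 1024 / 8         # the same, as a count of 8-byte doubles
[1] 16048896
> x <- numeric(16048896)    # a call of this kind is what fails
Error: cannot allocate vector of size 125382 Kb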

I thought default limits might have been set, but that does not seem to
be the case:

> mem.limits()
nsize vsize
   NA    NA
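
(If I read ?Memory correctly, the two NAs mean that no limit is imposed
at all, and an explicit cap could be set at startup with --max-nsize /
--max-vsize, or from within R, e.g.

> mem.limits(vsize = 2^30)    # my reading of the docs: cap the vector heap at 1 GB

so I would not expect allocations to stop near 1.2 GB.)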

Any idea? Where am I going wrong?

Laurent

PS: I am currently using R-1.3.0-patched, compiled on SGI IRIX 6.5 (I was
using 1.2.3 and had the same kind of problems; that is why I upgraded).

--
Laurent Gautier                  CBS, Building 208, DTU
PhD. Student                     D-2800 Lyngby, Denmark
tel: +45 45 25 24 85             http://www.cbs.dtu.dk/laurent