[Rd] R process killed when allocating too large matrix (Mac OS X)
Prof Brian Ripley
ripley at stats.ox.ac.uk
Thu May 5 11:39:37 CEST 2016
On 05/05/2016 10:11, Uwe Ligges wrote:
>
>
> On 05.05.2016 04:25, Marius Hofert wrote:
>> Hi Simon,
>>
>> ... all interesting (but quite a bit above my head). I only read
>> 'Linux' and want to throw in that this problem does not appear on
>> Linux (it seems). I talked about this with Martin Maechler and he
>> reported that the same example (on one of his machines; with NA_real_
>> instead of '0's in the matrix) gave:
>>
>> Error: cannot allocate vector of size 70.8 Gb
>> Timing stopped at: 144.79 41.619 202.019
>>
>> ... but no killer around...
>
> Well, with n=1. ;-)
>
> Actually this also happens under Linux and I had my R processes killed
> more than once (and much worse also other processes so that we had to
> reboot a server, essentially). That's why we use job scheduling on
> servers for R nowadays ...
Yes, Linux does not deal safely with running out of memory, although it
is better than it was. In my experience, only commercial Unices do that
gracefully.
Have you tried setting a (virtual) memory limit on the process using the
shell it is launched from? I have found that to be effective on most
OSes, at least in protecting other processes from being killed.
However, some things do reserve excessive amounts of VM that they do not
use and so cannot be run under a sensible limit.
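A minimal sketch of that approach for a Bourne-compatible shell (the 4 GB figure is illustrative; pick a value suited to your machine):

```shell
# Cap the virtual memory of this shell and its children at ~4 GB
# (ulimit -v takes the value in kilobytes).
ulimit -v 4194304

# Verify the limit took effect.
ulimit -v

# An R session launched from this shell inherits the cap, so an
# oversized allocation should fail with
# "Error: cannot allocate vector of size ..." instead of the
# process being killed by the OS, e.g.:
#   R --vanilla -e 'matrix(0, 1e5, 1e5)'
```

Note the limit applies per process, and as mentioned above, programs that reserve large amounts of address space they never touch may refuse to start under a limit that would otherwise be sensible.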
--
Brian D. Ripley, ripley at stats.ox.ac.uk
Emeritus Professor of Applied Statistics, University of Oxford