[R-sig-hpc] Managing R CPU/memory usage on Linux
Davor Cubranic
cubranic at stat.ubc.ca
Fri Nov 19 22:41:48 CET 2010
We have some Linux compute servers that our users occasionally overwhelm
with their R batch jobs to the point where the machines are completely
unresponsive and have to be rebooted. This seems to be happening more
often recently, and got me wondering what other people do to manage the
CPU/memory resources used by R on their servers.
We 'nice -19' the R process, but that doesn't seem to help. Are there
any other R options or OS settings that would be useful? Or should I
consider installing a queuing manager and closing the servers to
interactive logins? As far as I can tell, our users just run existing R
packages from CRAN, and there is no parallelization or distributed
computing going on.
The machines are dual-CPU 64-bit Intels with 4GB of RAM and running
Ubuntu 8.04. So they won't be making the TOP500 list any time soon, but
I would have hoped the kernel would be a little better at squelching
down CPU and memory hogs.
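The kernel's default behavior is to overcommit memory and only invoke the OOM killer once the machine is already thrashing. One knob that may help (a sketch, not a tested recommendation for this workload; the ratio value is an assumption) is tightening the overcommit policy so oversized allocations are refused up front:

```shell
# Refuse allocations beyond the commit limit instead of overcommitting.
sysctl -w vm.overcommit_memory=2

# Commit limit = swap + this percentage of physical RAM.
sysctl -w vm.overcommit_ratio=80
```

These require root and can be made persistent in /etc/sysctl.conf. The trade-off is that strict accounting can also make well-behaved programs fail early, so it is worth testing on one server first.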
Thanks,
Davor