[R] R on a cluster head node and memory limits
Sean Davis
sdavis2 at mail.nih.gov
Mon Sep 24 14:27:05 CEST 2012
On our local cluster, users routinely crash the head node by doing
simple and silly things that inadvertently eat all of its memory. I
have read a bit about memory limits, but I am still unclear as to
whether memory limits can be imposed at the R level under Linux.
Currently, I see this under help(Memory):
"R has a variable-sized workspace. Prior to R 2.15.0 there were
(rarely-used) command-line options to control its size, but it is now
sized automatically."
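If I understand correctly, R simply allocates its workspace from the
OS on demand, so an ordinary process-level limit set before launching
R should surface as an allocation failure inside R rather than
exhausting the machine. I have not tested this on our setup, but I
imagine something like the following (the sizes are illustrative):

  $ ulimit -v 1048576                   # cap this shell's address space at 1 GB (kB units)
  $ R --vanilla -e 'x <- numeric(5e8)'  # asks for roughly 4 GB
  Error: cannot allocate vector of size 3.7 Gb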
What approaches can I suggest to our cluster administrators to limit
R's memory usage (and only R's) on the head node (the only node with
an outside internet connection)? The goal is to allow a small R
instance on the head node for package installation, but nothing more
complicated than that; something like the wrapper sketched below is
the sort of thing I have in mind.
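A minimal sketch, assuming the admins are willing to rename the real
R binary and that a per-process address-space cap via ulimit is
acceptable (the path and the 4 GB figure are made-up, site-specific
values):

  #!/bin/sh
  # Hypothetical head-node wrapper installed as /usr/local/bin/R.
  # Caps the address space of this R process (and only R) before
  # exec'ing the real, renamed binary.
  ulimit -v 4194304          # 4194304 kB = 4 GB, illustrative value
  exec /usr/local/bin/R.real "$@"

Since the limit is set inside the wrapper's own shell, it applies
only to the exec'd R process and its children, leaving everything
else on the head node unconstrained.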
Thanks,
Sean