[Rd] Running R on dual/quad Opteron machines

Sean Davis sdavis2 at mail.nih.gov
Mon Mar 6 20:28:30 CET 2006




On 3/6/06 1:37 PM, "Thomas Lumley" <tlumley at u.washington.edu> wrote:

> On Mon, 6 Mar 2006, Simone Giannerini wrote:
> 
>> On 3/6/06, Thomas Lumley <tlumley at u.washington.edu> wrote:
>>> On Mon, 6 Mar 2006, Simone Giannerini wrote:
>>>> The environment will probably be either Unix/Linux or Solaris, and the
>>>> amount of RAM will be 8-16 GB, depending on the number of processors.
>>>> My main concerns are the following:
>>>> 
>>>> 1. How much does R benefit from moving from a one-processor machine to
>>>> a two- or four-processor machine? Consider that the typical intensive
>>>> use of the server will be simulation studies with many repeated loops.
>>> 
>>> The typical way that R is used on multiprocessor systems is to run more
>>> than one program, rather than to do parallel processing. If four people
>>> are using the computer, or if one person splits 10,000 iterations of a
>>> simulation into 4 sets of 2,500, you will be using all four processors.
>>> 
>> Many thanks. If I have understood correctly, in this case I would need
>> to run four separate instances of R, since a single thread cannot
>> exploit more than one CPU, am I correct?
>> 
> 
> You *can* exploit more than one CPU using, e.g., the "snow" package, but
> it's often easier to just run multiple instances of R, and on a shared
> computing system there are often multiple people each running one
> instance of R.
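
The multiple-instances approach quoted above can be sketched with a small
shell script. This is only illustrative: the script name `sim.R` and the
`chunk=` argument convention are hypothetical, and `sim.R` would be
expected to read its chunk number via `commandArgs(TRUE)`.

```shell
#!/bin/sh
# Launch four independent R batch jobs, one per CPU, each running
# 2,500 of the 10,000 simulation iterations.
for i in 1 2 3 4
do
    R CMD BATCH --no-save "--args chunk=$i" sim.R sim_$i.Rout &
done
wait  # block until all four background jobs have finished
```

Each job writes its own `.Rout` file, so the per-chunk results can be
saved separately and combined afterwards.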

And let me couch my earlier statements on snow/Rmpi by saying that we use
these tools on a relatively large Beowulf cluster (~200 nodes), which is
somewhat different from a single box with 2-4 processors, so they may not
be worth the trouble outside a cluster environment.  For example, we have
not moved to using Rmpi/snow on our dual-processor G5s because the speed
gain just isn't worth the extra installation trouble.
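
For completeness, a minimal sketch of the snow approach on a single
multi-CPU box, using the socket transport so no MPI installation is
needed. The one-iteration function `one_rep` and the cluster size of
four are illustrative placeholders, not anyone's actual simulation.

```r
library(snow)

## Hypothetical one-iteration simulation; replace with the real study.
one_rep <- function(i) mean(rnorm(1000))

## Start four R worker processes on the local machine.
cl <- makeCluster(4, type = "SOCK")

## Spread 10,000 iterations across the four workers.
res <- parSapply(cl, 1:10000, one_rep)

stopCluster(cl)
```

The socket transport avoids the Rmpi installation overhead discussed
above, though for a 2-4 CPU box the bookkeeping may still outweigh the
gain compared with simply launching separate R processes.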

Sean


