[R-sig-hpc] A question about Cores vs. tasks using Slurm for R

Norm Matloff nsmatloff at ucdavis.edu
Sun Sep 27 02:23:18 CEST 2015


(Apologies if this message is circulated twice.)

Erin, first: what specific machine (a single machine, not a cluster) are
you running this on, and which OS?  The number of cores detected can
depend on either or both.  See R's online help for 'detectCores' for a
brief discussion, as well as the material on hyperthreading in my book
(Ch.1).
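One quick check, assuming the standard 'parallel' package (where
detectCores() lives): the default counts *logical* CPUs, so
hyperthreading, or its absence, can make the number differ from what
you expect.

```r
library(parallel)

# Logical CPUs: on most OSes this includes hyperthreaded siblings
detectCores(logical = TRUE)

# Physical cores only; note this may return NA on some platforms
detectCores(logical = FALSE)
```

If the two numbers differ on your node, that alone can explain a
factor-of-two surprise.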

In addition, there are probably definitional issues regarding SLURM
here.  I'm not a SLURM user, but I'm sure others on this list can
comment on that.
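That said, my understanding -- to be corrected by the SLURM users here --
is that -n requests *tasks*, which the scheduler is free to spread
across several nodes, while -c requests CPUs per task on a single node.
An R process can only see the cores of the node it lands on, so -n 32
on 16-core nodes would still show 16.  An untested sketch of what a
single multicore R job might want instead:

```shell
#!/bin/bash
#SBATCH -N 1    # one node: detectCores() only ever sees one node
#SBATCH -n 1    # one task, i.e. the single R process
#SBATCH -c 32   # 32 CPUs for that task, if a node has that many

R CMD BATCH --slave ~/MutLinkPar2.R parb.txt
```

Again, this is a sketch of the tasks-vs-cores distinction, not a tested
script; the partition and time limits would need to be added back.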

Norm

On Sat, Sep 26, 2015 at 11:02:25PM +0000, Hodgess, Erin wrote:
> Hello!
> 
> I'm experimenting with some of the very interesting code in Prof. Matloff's new book "Parallel Computing for Data Science with Examples in R, C++ and CUDA".
> 
> I have access to a supercomputer and am using SLURM to run my batch jobs in R.
> 
> Here is what I have in my SBATCH file:
> 
> #!/bin/bash
> #SBATCH -t 00:05:00
> #SBATCH -n 32
> #SBATCH -p development
> #SBATCH -J myparb
> #SBATCH -o myparb.%j.out
> #SBATCH -e myparb.%j.err
> 
> module load intel/15.0.2 Rstats/3.2.1
> source ~/.profile_user
> 
> R CMD BATCH --slave ~/MutLinkPar2.R parb.txt
> 
> Fair enough.  Here is a little section of my R file:
> 
> 
> cores <- detectCores()
> cls <- initmc(cores)
> cat("cores", cores, "\n")
> sim(500, 500, cls)
> 
> However, the output for the "cat" statement shows the number of cores to be 16.
> 
> Why is this, please?  Shouldn't it be 32?
> 
> Thank you in advance for any help!
> 
> Note:  This is R version 3.2.1.
> 
> Sincerely,
> Erin Hodgess
> 


