[R-sig-hpc] A question about Cores vs. tasks using Slurm for R

Hodgess, Erin HodgessE at uhd.edu
Sun Sep 27 01:02:25 CEST 2015


Hello!

I'm experimenting with some of the very interesting code in Prof. Matloff's new book "Parallel Computing for Data Science with Examples in R, C++ and CUDA".

I have access to a supercomputer and am using SLURM to run my batch jobs in R.

Here is what I have in my SBATCH file:


#!/bin/bash

#SBATCH -t 00:05:00

#SBATCH -n 32

#SBATCH -p development

#SBATCH -J myparb

#SBATCH -o myparb.%j.out

#SBATCH -e myparb.%j.err




module load intel/15.0.2 Rstats/3.2.1


source ~/.profile_user


R CMD BATCH --slave ~/MutLinkPar2.R parb.txt
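(For comparison, here is a hypothetical variant of the resource request, not my actual file: Slurm's -n asks for tasks, which the scheduler may spread across several nodes, while -N pins the node count. On a machine with 16-core nodes, something like this would keep everything on one node.)

```
#SBATCH -N 1            # one node only (hypothetical variant)
#SBATCH -n 16           # 16 tasks, all on that node
```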


Fair enough.  Here is a little section of my R file:


library(parallel)      # provides detectCores()

cores <- detectCores()
cls <- initmc(cores)   # initmc() and sim() come from the book's example code
cat("cores", cores, "\n")
sim(500, 500, cls)



However, the output from the "cat" statement shows the number of cores to be 16.

Why is this, please?  Shouldn't it be 32?
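(One thing I tried, sketched below under the assumption that Slurm sets its usual environment variables inside the job: detectCores() only reports the cores on the node the R process lands on, whereas SLURM_NTASKS reflects what -n actually requested. Outside a Slurm job SLURM_NTASKS is unset, so the sketch falls back to NA.)

```r
library(parallel)

# What Slurm granted for the whole job (NA when not running under Slurm).
slurm_tasks <- as.integer(Sys.getenv("SLURM_NTASKS", unset = NA))

# What the local node reports; this never sees other nodes in the allocation.
local_cores <- detectCores()

cat("Slurm tasks :", slurm_tasks, "\n")
cat("Local cores :", local_cores, "\n")
```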

Thank you in advance for any help!

Note:  This is R version 3.2.1.

Sincerely,
Erin Hodgess



