[R-meta] Influential case diagnostics in a multivariate multilevel meta-analysis in metafor
Viechtbauer, Wolfgang (SP)
wolfgang.viechtbauer using maastrichtuniversity.nl
Fri Jan 18 18:15:31 CET 2019
Happy to hear that!
Best,
Wolfgang
-----Original Message-----
From: Yogev Kivity [mailto:yogev_k using yahoo.com]
Sent: Friday, 18 January, 2019 16:18
To: Viechtbauer, Wolfgang (SP)
Cc: R-sig-meta-analysis using r-project.org
Subject: Re: [R-meta] Influential case diagnostics in a multivariate multilevel meta-analysis in metafor
Hi Wolfgang,
Using the latest 'devel' version of metafor worked! The computation took about 10 minutes with 4 parallel cores (the number of cores was indeed determined using the 'parallel' package).
Thanks for all your help!
Yogev
--
Yogev Kivity, Ph.D.
Postdoctoral Fellow
Department of Psychology
The Pennsylvania State University
Bruce V. Moore Building
University Park, PA 16802
Office Phone: (814) 867-2330
On Thu, Jan 17, 2019 at 5:16 PM Viechtbauer, Wolfgang (SP) <wolfgang.viechtbauer using maastrichtuniversity.nl> wrote:
Hi Yogev,
Just to be safe, make sure you are using the latest 'devel' version of metafor. Run devtools::install_github("wviechtb/metafor") to be sure. Also, I would go with whatever detectCores(logical=FALSE) tells you for the number of cores. But even without that, things should finish in a few minutes. Beyond that, I really don't know what the issue could be. It certainly isn't an issue with metafor per se.
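In code, that would be something like this (a minimal sketch; pass whatever detectCores() reports as 'ncpus' to cooks.distance()):

# install the current development version of metafor from GitHub
# (requires the 'devtools' package)
devtools::install_github("wviechtb/metafor")

# number of physical (i.e., non-logical) cores on this machine;
# use this value for 'ncpus' in the cooks.distance() call
parallel::detectCores(logical=FALSE)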
Best,
Wolfgang
-----Original Message-----
From: Yogev Kivity [mailto:yogev_k using yahoo.com]
Sent: Thursday, 17 January, 2019 21:37
To: Viechtbauer, Wolfgang (SP)
Cc: Martineau, Roger (AAFC/AAC); R-sig-meta-analysis using r-project.org
Subject: Re: [R-meta] Influential case diagnostics in a multivariate multilevel meta-analysis in metafor
Hi Wolfgang,
Thanks for your detailed reply. Unfortunately, even after implementing your suggestions, I could not get the computation to terminate after letting it run overnight (with 4 logical cores).
I was going to suggest that the imbalance of my dataset relative to the konstantopoulos2011 data might have something to do with it (cluster sizes in my dataset range from 1 to 234 effect sizes, with a mean of 11 and a median of 5). However, when I ran the konstantopoulos2011 code, I got similar running times for fitting the models (using standard BLAS), yet the Cook's distances computation still did not terminate even after 2050 seconds, even with parallel processing on 4 logical cores. I used this code:
system.time(sav2 <- cooks.distance(res2, cluster=dat$group, reestimate=FALSE, parallel="snow", ncpus=4))
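(For reference, res2 is the multilevel model fitted to the konstantopoulos2011 data; a minimal sketch of that setup is below, where the district/school random-effects structure is the one from the dat.konstantopoulos2011 help page and is assumed here, as is the clustering variable in the call above.)

library(metafor)

# example data: effect sizes (schools) nested within districts
dat <- dat.konstantopoulos2011

# three-level model with random effects for districts and for schools within districts
res2 <- rma.mv(yi, vi, random = ~ 1 | district/school, data=dat)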
Any thoughts?
Thanks,
Yogev