[R-meta] Cross-Classified Random-Effects Model in rma.mv

Assink, Mark
Fri Jan 18 16:48:37 CET 2019

Dear Wolfgang and other members,

In a recent paper by Fernández-Castilla and colleagues (2018; https://doi.org/10.3758/s13428-018-1063-2), it is explained how a cross-classified random-effects model (CCREM) can be fitted in SAS. I was wondering whether and how CCREMs can be fitted in R using the rma.mv function of the metafor package.

Following the above-cited paper, suppose you have a meta-analytic structure in which effect sizes are nested within studies:

* Level 1 -> Variability in effect sizes due to sampling variance;
* Level 2 -> Variability in effect sizes extracted from the same studies (i.e., within-study variance);
* Level 3 -> Variability in effect sizes extracted from different studies (i.e., between-study variance).

Let's say that across primary studies, multiple different instruments were used to measure a specific outcome. For example, three effect sizes from study 1 were based on instruments 1, 2, and 3; two effect sizes from study 2 were based on instruments 1 and 4; three effect sizes from study 3 were based on instruments 2, 4, and 5; and so on. The variable "instrument" can therefore be regarded as a crossed factor.

To model the above structure using the rma.mv function, I would write:

rma.mv(yi, vi, random = list(~ 1 | study/effectsize, ~ 1 | instrument), data=data)

I assume that with this syntax, the clustering or dependency of effect sizes within studies is accounted for, while the variation in effect sizes based on the different instruments that are used (between-instrument variance) is also modeled.
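To make the intended structure concrete, here is a self-contained sketch of this fit. The data frame and all values in it are hypothetical, invented only to mirror the study/instrument pattern described above; with so few rows the variance components would be poorly estimated, so this illustrates the syntax rather than a realistic analysis:

```r
library(metafor)

# Hypothetical data mirroring the example: effect sizes (yi) with sampling
# variances (vi), nested within studies, and "instrument" crossed over studies.
dat <- data.frame(
  study      = c(1, 1, 1, 2, 2, 3, 3, 3),
  effectsize = 1:8,                       # unique ID for each effect size
  instrument = c(1, 2, 3, 1, 4, 2, 4, 5), # same instruments recur across studies
  yi         = c(0.32, 0.45, 0.28, 0.51, 0.39, 0.22, 0.30, 0.41),
  vi         = c(0.04, 0.05, 0.04, 0.06, 0.05, 0.03, 0.04, 0.05)
)

# study/effectsize gives the usual three-level (nested) structure;
# ~ 1 | instrument adds instrument as a separate, crossed random factor
# rather than nesting it within studies.
res <- rma.mv(yi, vi,
              random = list(~ 1 | study/effectsize, ~ 1 | instrument),
              data = dat)
summary(res)
```

Passing a list() to the random argument is how rma.mv combines independent random-effects terms, which is what makes the instrument factor crossed with, rather than nested in, the study factor.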

However, I am not sure whether this would be correct, as Fernández-Castilla et al. (2018) refer to "random factors nested within studies" in their appendices with SAS code. I'd say that a variable like "instrument" from the example above would not be nested within studies, because the same instruments are used across studies.

Are my reasoning and R syntax correct? I would highly appreciate any reflections, help, or suggestions.



More information about the R-sig-meta-analysis mailing list