[R-meta] Categorical mixed effect models and interpretation of results

Viechtbauer, Wolfgang (SP) wolfgang.viechtbauer at maastrichtuniversity.nl
Tue Jun 19 11:31:52 CEST 2018


Hi Alex,

Yes, in principle this is right. If 'Treatment' only has two levels, then the QM-test and the test of the treatment coefficient are identical, so you can also just look at the latter.
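
With your model as posted, you can see this directly (a minimal sketch, assuming 'mydata' contains the yi, vi, Treatment, and Study_id variables from your message):

library(metafor)

res <- rma.mv(yi, vi, mods = ~ factor(Treatment), random = ~ 1 | Study_id,
              data = mydata, method = "REML")

res$QM             # omnibus QM-test of the moderator (1 df here)
res$QMp            # its p-value
coef(summary(res)) # the row for the treatment coefficient gives the same
                   # p-value, since QM = zval^2 when the factor has two levels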

However, I think your random effects structure is too simple given what you wrote. At the least, it should be something like:

random = ~ 1 | Study_id / Exp_id

where Exp_id is, as the name implies, the experiment id. See:

http://www.metafor-project.org/doku.php/analyses:konstantopoulos2011

and esp. the "A Common Mistake in the Three-Level Model" section.
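
For example, assuming your data also contain an 'Exp_id' column identifying the experiment within each study (and the same variable names as above), the call would become:

res <- rma.mv(yi, vi, mods = ~ factor(Treatment),
              random = ~ 1 | Study_id / Exp_id,
              data = mydata, method = "REML")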

You might also want to consider adding random effects for species. For example:

random = list(~ 1 | Study_id / Exp_id, ~ 1 | Species)

would add species random effects (as a crossed random effect, not nested within study and/or experiment).
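
So the full call would then look something like this (assuming a 'Species' column in the data):

res <- rma.mv(yi, vi, mods = ~ factor(Treatment),
              random = list(~ 1 | Study_id / Exp_id, ~ 1 | Species),
              data = mydata, method = "REML")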

Best,
Wolfgang

-----Original Message-----
From: R-sig-meta-analysis [mailto:r-sig-meta-analysis-bounces using r-project.org] On Behalf Of Alexander Sullivan (BIO - Student)
Sent: Tuesday, 19 June, 2018 10:58
To: r-sig-meta-analysis using r-project.org
Subject: [R-meta] Categorical mixed effect models and interpretation of results

Dear all,

In my meta-analysis I have [say] 100 experiments, from 30 studies, across 12 species. These experiments can be equally divided into two treatment levels [low and high]. If I want to determine whether there is a significant difference in effect sizes between these two treatment levels, would this be the right code:

res <- rma.mv(yi, vi, method = "REML", data = mydata,
              mods = ~ factor(Treatment), random = ~ 1 | Study_id)

... and then, from the output of this model, if the QM statistic is significant, could I say there is a significant difference in effect sizes between the two treatments?

Thank you for your time,

Alex

MSc Ecology and Conservation student
University of East Anglia, UK
