[R-meta] Additional Info: Pairwise moderator testing in multilevel meta-analysis with CRVE / CIs

Röhl, Sebastian  sebastian.roehl using uni-tuebingen.de
Fri Feb 10 17:11:48 CET 2023


Hi James,

thank you very much! This helps me a lot!

Best regards,
Sebastian

-----Original Message-----
From: R-sig-meta-analysis <r-sig-meta-analysis-bounces using r-project.org> On behalf of James Pustejovsky via R-sig-meta-analysis
Sent: Friday, 10 February 2023 15:12
To: R Special Interest Group for Meta-Analysis <r-sig-meta-analysis using r-project.org>
Cc: James Pustejovsky <jepusto using gmail.com>
Subject: Re: [R-meta] Additional Info: Pairwise moderator testing in multilevel meta-analysis with CRVE / CIs

Hi Sebastian,

Pairwise tests are definitely possible when using CRVE. The issue is that overlap of confidence intervals is not generally a valid method for gauging statistical significance of differences.

When comparing the means of *independent* samples, judging significance by confidence interval overlap is conservative: non-overlapping CIs do imply a significant difference, but overlapping CIs do not imply that the difference is non-significant. See Schenker & Gentleman (2001; https://doi.org/10.1198/000313001317097960), Austin & Hux (2002; https://doi.org/10.1067/mva.2002.125015), or many others.
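
As a quick toy illustration (the numbers here are made up, just to show the point):

m1 <- 0; se1 <- 1   # mean and SE of group 1 (made-up values)
m2 <- 3; se2 <- 1   # mean and SE of group 2
# The 95% CIs overlap: the upper limit for group 1 (1.96) lies above
# the lower limit for group 2 (1.04)...
c(upper1 = m1 + 1.96 * se1, lower2 = m2 - 1.96 * se2)
# ...but the difference between the means is statistically significant:
se_diff <- sqrt(se1^2 + se2^2)       # ~1.41
z <- (m2 - m1) / se_diff             # ~2.12
2 * pnorm(-abs(z))                   # two-sided p ~ .03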

If the means are from *dependent* samples (as could be the case for your meta-regression results), there is no direct correspondence between CI overlap and statistical significance. This is because the SE for the difference in means depends not just on the SEs of the individual means but also on the sampling covariance between them. As a simple example, consider the confidence intervals for the means of A and B, based on a sample of N = 100 from a bivariate normal distribution where meanB = meanA + 0.1, sdA = sdB = 1, and cor(A, B) = 0.9. The confidence intervals will almost always overlap, but the difference in means will be fairly precisely estimated because the correlation is so high.
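
To make that concrete, here is a rough simulation of exactly that setup (the seed is arbitrary and only for illustration):

# Simulate N = 100 draws from a bivariate normal with
# meanB = meanA + 0.1, sdA = sdB = 1, and cor(A, B) = 0.9
library(MASS)
set.seed(42)                                    # arbitrary seed
N <- 100
Sigma <- matrix(c(1, 0.9, 0.9, 1), nrow = 2)
dat <- mvrnorm(N, mu = c(0, 0.1), Sigma = Sigma)
A <- dat[, 1]
B <- dat[, 2]

# The separate 95% CIs for the two means are each roughly +/- 0.2 wide,
# so they will almost always overlap...
t.test(A)$conf.int
t.test(B)$conf.int

# ...but the difference B - A is estimated far more precisely, because
# its SE depends on the (high) covariance between A and B:
t.test(B, A, paired = TRUE)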

James

On Fri, Feb 10, 2023 at 2:50 AM Röhl, Sebastian via R-sig-meta-analysis < r-sig-meta-analysis using r-project.org> wrote:

> Hi all,
>
> Just an addition to my question from yesterday:
> In addition to using the robust() and anova() functions, I also tried
> out Wald_test() from the clubSandwich package.
> The results are the same (with F instead of t statistics):
> > Wald_test(out_3, constraints = constrain_pairwise(1:3), vcov="CR2")
> $`out_acad - out_intg`
>  test Fstat df_num df_denom  p_val sig
>   HTZ  4.14      1     10.9 0.0669   .
>
> $`out_socem - out_intg`
>  test Fstat df_num df_denom p_val sig
>   HTZ 0.225      1     13.2 0.643
>
> $`out_socem - out_acad`
>  test Fstat df_num df_denom   p_val sig
>   HTZ  18.7      1      9.6 0.00165  **
>
> Can anybody help me?
>
> Thank you.
>
> All the best,
> Sebastian
>
> -----Original Message-----
> From: R-sig-meta-analysis <r-sig-meta-analysis-bounces using r-project.org>
> On behalf of Röhl, Sebastian via R-sig-meta-analysis
> Sent: Thursday, 9 February 2023 12:30
> To: r-sig-meta-analysis using r-project.org
> Cc: Röhl, Sebastian <sebastian.roehl using uni-tuebingen.de>
> Subject: [R-meta] Pairwise moderator testing in multilevel
> meta-analysis with CRVE / CIs
>
> Hi,
>
> I have the following problem:
> I am conducting a multilevel meta-analysis using metafor with
> cluster-robust variance estimation and want to test the moderating
> effect of different kinds of outcomes. Additionally, I want to test
> whether the outcome categories differ significantly from each other.
> Here is an example:
> out_3 <- rma.mv(zr, V = var, random = ~ 1 | Sample_ID / number,
>                 mods = ~ -1 + out_intg + out_acad + out_socem,
>                 data = data_int)
> out_3_rob <- robust(out_3, cluster = Sample_ID, clubSandwich = TRUE)
> anova(out_3_rob, X = rbind(c(-1, 1, 0), c(-1, 0, 1), c(0, -1, 1)))
>
> The robust model results show C.I.s that overlap.
> Model Results:
>
>            estimate     se¹   tval¹    df¹   pval¹  ci.lb¹  ci.ub¹
> out_intg     0.2302  0.0231  9.9484  30.84  <.0001  0.1830  0.2773  ***
> out_acad     0.1646  0.0220  7.4677  17.36  <.0001  0.1182  0.2111  ***
> out_socem    0.2458  0.0278  8.8510  22.27  <.0001  0.1882  0.3034  ***
>
> BUT the anova results show a significant difference between two of the outcomes:
>
> Hypotheses:
> 1: -out_intg + out_acad  = 0
> 2: -out_intg + out_socem = 0
> 3: -out_acad + out_socem = 0
>
> Results:
>    estimate     se    tval    df   pval
> 1:  -0.0655 0.0322 -2.0349 10.92 0.0669
> 2:   0.0157 0.0330  0.4742 13.21 0.6431
> 3:   0.0812 0.0188  4.3264  9.60 0.0016
>
> Do I have an error in my reasoning about the ANOVA, or is this kind of
> pairwise testing not possible with CRVE results?
> Or perhaps I am interpreting the C.I.s incorrectly? If I calculate the
> pairwise comparisons with the non-robust model, I also get a significant
> difference even though the non-robust C.I.s overlap as well.
>
> Thank you very much for your help!
> Best,
> Sebastian
>
> ****************************
> Dr. Sebastian Röhl
> Eberhard Karls Universität Tübingen
> Institute for Educational Science
> Tübingen School of Education (TüSE)
> Wilhelmstraße 31 / Room 302
> D-72074 Tübingen
> Germany
>
> Phone: +49 7071 29-75527
> Fax: +49 7071 29-35309
> Email: sebastian.roehl using uni-tuebingen.de
> Twitter: @sebastian_roehl  @ResTeacherEdu
>


_______________________________________________
R-sig-meta-analysis mailing list @ R-sig-meta-analysis using r-project.org
To manage your subscription to this mailing list, go to:
https://stat.ethz.ch/mailman/listinfo/r-sig-meta-analysis

