[R-meta] Should we adjust for (standardized) baseline in meta-regression like ANCOVA?

Zac Robinson zacrobinson2015 at gmail.com
Sun Mar 3 02:17:26 CET 2024


Dear All,


I am working through a conceptual issue that I can't seem to find any
resources on. Specifically, I am performing a multilevel dose-response
meta-regression using the 'metafor' package. My model includes effect sizes
calculated as a standardized mean change within each study arm and a
continuous moderator. So in R syntax:


((Post Mean - Pre Mean) / Pre SD) ~ Moderator
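
For concreteness, here is a minimal sketch of how I am setting this up in metafor (illustrative variable names, not my actual data; my understanding is that struct = "GEN" is needed when the inner term of ~ inner | outer is continuous, but please correct me if that's wrong):

library(metafor)

# standardized mean change with raw-score (pre SD) standardization:
# yi = (post mean - pre mean) / pre SD
dat <- escalc(measure = "SMCR",
              m1i = post_mean, m2i = pre_mean, sd1i = pre_sd,
              ni = n, ri = r_prepost, data = dat)

# multilevel dose-response meta-regression with the continuous moderator
res <- rma.mv(yi, vi,
              mods = ~ moderator,
              random = list(~ moderator | study, ~ 1 | arm, ~ 1 | es),
              struct = "GEN", data = dat)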


Whenever I write this out, I can’t help but notice the similarity to the
typical “change-score ANCOVA” that many people run for an RCT with a
continuous outcome, where the baseline mean is included as a covariate to
improve precision and account for things like regression to the mean:


(Post Mean - Pre Mean) ~ Treatment + Pre Mean
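
At the level of a single trial, that would be something along the lines of (toy column names):

# change-score ANCOVA on participant-level data (toy column names)
fit <- lm(post - pre ~ treatment + pre, data = trial_data)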


To me, it seems like the same rationale would apply to the meta-regression,
except that the pre-score would need to be standardized as well. In effect,
each side of the equation is divided by the Pre SD:


((Post Mean - Pre Mean) / Pre SD) ~ Moderator + (Pre Mean / Pre SD)
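
In metafor terms, that would amount to something like this (again just a sketch with placeholder names):

# standardized baseline computed per arm (placeholder column names)
dat$z_pre <- dat$pre_mean / dat$pre_sd

# same multilevel model with the standardized baseline added as a moderator
res2 <- rma.mv(yi, vi,
               mods = ~ moderator + z_pre,
               random = list(~ moderator | study, ~ 1 | arm, ~ 1 | es),
               struct = "GEN", data = dat)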


Am I totally off base with this? It seems to make sense to me, but I am
also very open to the possibility that I’m missing something and that the
approach I am proposing could introduce mathematical coupling that may be
misleading. It could also be something that is taken care of by my random
effects (i.e., list(~Moderator|study, ~1|arm, ~1|es)), although it seems
like that still wouldn't totally remedy the issue.


Also, I am aware that it is sometimes recommended to extract the adjusted
means from the ANCOVA of each individual study to get around this issue,
but in my case I only have access to the raw (unadjusted) means. Moreover,
this issue would seem more straightforward if I were able to keep my effect
sizes in raw units, but because I am including outcomes measured on
multiple scales, the effects need to be standardized.


Thank you in advance!


Zac
