[R-meta] Interpreting meta-regression results for dummy-coded variables
Acar, Selcuk
Selcuk.Acar at unt.edu
Sun Jun 19 00:14:02 CEST 2022
Hi,
I ran a meta-regression in the metafor package with both continuous and dummy-coded moderators. For some moderators, when only one dummy code was significant, we interpreted the moderator as a whole, with its several categories, as significant. For example, we had a "participant group" moderator consisting of the categories "elementary," "middle," "high," and "undergraduate," and used "undergraduate" as the reference group. We assumed this moderator would be significant whenever even one of the dummy codes (e.g., "undergraduate vs. elementary") was significant, without a separate test (linear hypothesis testing).
One of the reviewers provided the following feedback:
"Because of the dummy-coding, these coefficients are differences in Fisher-z-transformed correlations between the coded category and the reference category. Hence, the reporting could more accurately reflect this. In addition, these tests of coefficients are not a substitute for an overall test of the moderator. In other words, the fact that one coefficient related to a moderator is significant does not imply that the moderator is significant. For example, an overall test for Index of Creativity can be non-significant even when a single coefficient, such as the one for flexibility vs. fluency, is significant. Overall moderator tests could be done by means of linear hypothesis testing, for example."
In my opinion, running a separate test for each moderator defeats the purpose of a meta-regression; the meta-regression itself should be the basis of interpretation, including for these dummy-coded variables. I thought a categorical moderator would be significant whenever one of its dummy codes turned out significant.
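For what it's worth, the kind of omnibus moderator test the reviewer describes does not require leaving the meta-regression: metafor can jointly test all dummy codes of a factor with anova() and its btt argument. Below is a minimal sketch using the dat.bcg example data that ships with metafor (the 'alloc' factor stands in for a moderator like "participant group"; it is not the data from the post):

```r
# Sketch (assumed example data, not from the original analysis):
# omnibus test of a dummy-coded moderator in metafor.
library(metafor)

# compute log risk ratios from the built-in BCG vaccine data
dat <- escalc(measure = "RR", ai = tpos, bi = tneg,
              ci = cpos, di = cneg, data = dat.bcg)

# 'alloc' is a 3-level factor; rma() dummy-codes it against a reference level
res <- rma(yi, vi, mods = ~ alloc, data = dat)
summary(res)          # separate z-test for each dummy coefficient

# omnibus (Wald-type) test of the moderator as a whole:
# jointly test coefficients 2 and 3 (the two dummy codes)
anova(res, btt = 2:3)
```

The QM statistic from anova() tests whether all dummy codes are jointly zero; it can be non-significant even when one individual coefficient is significant, which is exactly the distinction the reviewer is drawing.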
Who is correct here? Is there a good source that I could cite?
I would appreciate input on this.
Selcuk Acar, Ph.D.
Associate Professor
Department of Educational Psychology
University of North Texas
More information about the R-sig-meta-analysis mailing list