[R-meta] Mean-adjustment for weighting
James Pustejovsky
jepusto at gmail.com
Tue Mar 27 23:16:33 CEST 2018
Vojtech,
I do not know enough about the performance of the adjustment to recommend
unequivocally for or against it. All the same, I will offer a couple of
observations in case they are useful to you:
1. The adjustments described by Doncaster & Spake are very similar to
methods proposed by Hunter & Schmidt in their book, Methods of
Meta-Analysis. So they are not entirely unknown.
2. This adjustment should only matter much if you are dealing with
exceedingly small sample sizes, which as Doncaster & Spake demonstrate are
not uncommon in ecology. If your sample sizes are much larger (say,
smallest total sample sizes are in the 20's, not the single digits), then
perhaps it is less of a concern.
3. The range of effect size estimates is also a consideration. In
psychology and education, I don't usually think about standardized mean
differences bigger than 1 or 1.5. For SMDs larger than 3, I often start to
wonder whether a different effect size metric might be more appropriate.
4. An ideal way to address your question about whether to use the
adjustment method would be to run some simulations that emulate the
conditions (sample sizes, ranges of effects, number of studies) you
observed in your meta-analysis. The authors provide R code for their
simulations, which could be modified to resemble the conditions in your
meta (see the first sketch below for a rough starting point). But of course
nobody has unlimited time and resources, so this might not be feasible.
5. I think it would be useful to also report standard errors/confidence
intervals based on other techniques, such as the Knapp-Hartung adjustment
or Sidik & Jonkman's robust standard errors. Reporting results based on
these other techniques would, I think, help to build the reader's
confidence that your ultimate findings are credible rather than being
contingent on use of an uncommon set of methods. The Knapp-Hartung
adjustment is available in metafor using test = "knha". Robust standard
errors can be calculated using robust() in metafor or coef_test() in the
clubSandwich package. In either case, you would specify a unique id
variable for the cluster = argument (see the second sketch below).
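
Regarding point 4, here is a rough sketch of the kind of simulation I have
in mind. It is not the authors' published code, and the number of studies,
per-group sample size, true effect, and heterogeneity below are placeholders
you would replace with the conditions seen in your own meta-analysis:

library(metafor)

set.seed(1)
k     <- 20    # number of studies (placeholder)
n     <- 5     # per-group sample size (placeholder; deliberately small)
delta <- 0.8   # true average standardized mean difference (placeholder)
tau   <- 0.3   # between-study standard deviation (placeholder)
nsim  <- 1000  # number of simulated meta-analyses

covered <- logical(nsim)
for (i in seq_len(nsim)) {
  theta <- rnorm(k, mean = delta, sd = tau)   # true study-specific effects
  sim <- t(sapply(theta, function(d) {
    y1 <- rnorm(n, mean = d, sd = 1)          # treatment group
    y2 <- rnorm(n, mean = 0, sd = 1)          # control group
    c(m1 = mean(y1), sd1 = sd(y1), n1 = n,
      m2 = mean(y2), sd2 = sd(y2), n2 = n)
  }))
  dat <- escalc(measure = "SMD", m1i = m1, sd1i = sd1, n1i = n1,
                m2i = m2, sd2i = sd2, n2i = n2, data = data.frame(sim))
  fit <- rma(yi, vi, data = dat)
  covered[i] <- fit$ci.lb <= delta && delta <= fit$ci.ub
}
mean(covered)   # empirical coverage of the nominal 95% CI

Comparing the empirical coverage to the nominal 95% level (with and without
whatever adjustment you are considering) would tell you whether the
unadjusted intervals are too narrow under conditions like yours.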
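And a minimal sketch of the models from point 5, assuming a data frame dat
with effect size estimates yi, sampling variances vi, and a unique study
identifier study (substitute your own variable names):

library(metafor)
library(clubSandwich)

res      <- rma(yi, vi, data = dat)                     # usual random-effects model
res_knha <- rma(yi, vi, data = dat, test = "knha")      # Knapp-Hartung adjustment

robust(res, cluster = dat$study)                        # robust SEs via metafor
coef_test(res, vcov = "CR2", cluster = dat$study)       # robust SEs via clubSandwich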
James
On Tue, Mar 27, 2018 at 6:56 AM, Vojtěch Brlík <vojtech.brlik at gmail.com>
wrote:
> Dear all,
>
> I have conducted a meta-analysis for my bachelor thesis (that means I am
> highly inexperienced) using the unbiased standardized mean difference
> (Hedges' g) as the measure of effect size. I recently noticed a published
> study (https://doi.org/10.1111/2041-210X.12927) suggesting an adjustment
> to the standard error calculation, because the weights of the effect sizes
> do not correspond symmetrically to their sample sizes. This inequality
> biases the estimates of the pooled effect size variance.
>
> I decided to use this adjustment, but it does not produce the same
> adjustment in all same-sized studies, as the differences between the
> adjusted and non-adjusted errors are not symmetric (see the plots below
> for the four categories of effect I want to recalculate, also attached).
> [inline figure doncaster_plot_forum.png scrubbed from the archive]
> Please write me in case you cannot see the figures.
>
> However, the effect sizes remain unchanged and the variances are wider,
> as Doncaster & Spake (2018) suggested.
>
> What is your opinion about this study? Do you recommend using the
> adjustment for the standard error calculation or not?
>
> Thank you for your advice and comments.
>
> With kind regards,
>
> Vojtech Brlik
>