[R-meta] negative reliability
lists sending from dewey at myzen.co.uk
Fri Apr 28 11:01:10 CEST 2023
You can check whether it was transmitted by going to the list archive,
where it appears.
The fact that you got no response may be because we are all struggling
with the idea of a negative test-retest or split-half reliability
estimate and what to do with it. Does it mean that people who scored high
the first time now score low? If it is split-half reliability, it
suggests that the hypothesis that the test measures a single construct
is false.
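For concreteness, here is a minimal sketch (mine, not from the thread) of the Spearman-Brown step-up formula and its inverse. The inverse is what "reverting" a corrected split-half coefficient to the raw half-test correlation amounts to, and the first function shows why a negative split-half correlation remains negative after the standard correction:

```python
def spearman_brown(r_half):
    """Step up a half-test correlation to an estimate of
    full-length test reliability: r_full = 2r / (1 + r)."""
    return 2 * r_half / (1 + r_half)


def invert_spearman_brown(r_full):
    """Recover the raw half-test correlation from a
    Spearman-Brown-corrected value: r = r_full / (2 - r_full)."""
    return r_full / (2 - r_full)


# A positive split-half correlation steps up as expected:
stepped = spearman_brown(0.6)            # 1.2 / 1.6 = 0.75
recovered = invert_spearman_brown(0.75)  # back to 0.6

# A negative split-half correlation stays negative (and grows in
# magnitude), e.g. 2 * (-0.2) / (1 - 0.2) = -0.5, which is why a
# negative estimate cannot be fixed by routine corrections alone.
negative = spearman_brown(-0.2)
```

The numbers here are illustrative only; the point is that the algebra offers no escape from a negative estimate, so the question is substantive, not computational.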
On 28/04/2023 01:45, Catia Oliveira via R-sig-meta-analysis wrote:
> Dear all,
> I apologise if I am spamming you, but I think you did not receive my
> previous email; at least I was not notified.
> I am running a meta-analysis on the reliability of a task (computed as a
> correlation between sessions or halves of the task depending on whether it
> is test-retest or split-half reliability) and I have come across one result
> that I am not sure how to handle. According to the authors, they found
> negative reliability and, because of that, they applied a correction
> suggested by Krus and Helmstadter (1993). Thus, I am wondering whether I should
> use the original correlation or the corrected one. When authors applied the
> Spearman-Brown correction I converted the coefficients back to the original
> correlation, but with this one I don't know whether such an approach is OK. My intuition would be to
> use the uncorrected measure since that's the most common approach in the
> sample and there isn't sufficient information to allow us to test the
> impact of these corrections. But I would appreciate your input on this.
> A second issue, somewhat in line with the previous one: what do you
> recommend one do when multiple approaches are used to compute the
> reliability of the task but only one converges with what was typically done
> by other authors? I would not be able to assess whether those decisions had
> an impact on the reliability, as it is only one study, but I also don't want to
> bias the findings with my selection (though I have to say the results are
> quite consistent across approaches).
> Thank you.
> Best wishes,
> R-sig-meta-analysis mailing list -- R-sig-meta-analysis at r-project.org