[R-meta] inter-rater agreement for multiple raters
Viechtbauer Wolfgang (SP)
wolfgang.viechtbauer at maastrichtuniversity.nl
Tue Aug 1 17:58:36 CEST 2017
I thought Krippendorff's alpha could handle this. See:
I see that package 'irr' has a kripp.alpha() function, but it may still do listwise deletion. If so, maybe look for other packages that can compute Krippendorff's alpha (e.g., 'rel' and 'DescTools').
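A minimal sketch of what the incomplete design might look like, assuming the irr package's kripp.alpha() accepts a raters-by-units matrix with NA for unscored abstracts (the example data and whether NAs are handled pairwise rather than listwise are assumptions to verify):

```r
## Hypothetical data: 5 raters x 8 abstracts, coded
## 0 = "not relevant", 1 = "possibly relevant",
## NA where a rater did not screen that abstract.
library(irr)

ratings <- rbind(
  r1 = c(1,  0,  1, NA, NA,  0,  1, NA),
  r2 = c(1,  0, NA,  1,  0, NA,  1,  0),
  r3 = c(NA, 0,  1,  1, NA,  0, NA,  0),
  r4 = c(1, NA,  1, NA,  0,  0,  1, NA),
  r5 = c(NA, NA, NA, 1,  0, NA, NA,  0)
)

## Krippendorff's alpha is defined over the coincidence matrix of all
## observed rating pairs, so incomplete designs are handled by
## construction -- but check that this implementation actually retains
## units rated by at least two raters rather than deleting them.
kripp.alpha(ratings, method = "nominal")
```

If kripp.alpha() turns out to drop incomplete units, the same matrix can be passed to DescTools::KrippAlpha() for comparison.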
Wolfgang Viechtbauer, Ph.D., Statistician | Department of Psychiatry and
Neuropsychology | Maastricht University | P.O. Box 616 (VIJV1) | 6200 MD
Maastricht, The Netherlands | +31 (43) 388-4170 | http://www.wvbauer.com
From: R-sig-meta-analysis [mailto:r-sig-meta-analysis-bounces at r-project.org] On Behalf Of James Pustejovsky
Sent: Tuesday, August 01, 2017 17:50
To: r-sig-meta-analysis at r-project.org
Subject: [R-meta] inter-rater agreement for multiple raters
This question pertains to a common issue in systematic reviews (although
not meta-analysis per se).
I have a group of 5 raters who screened abstracts for possible inclusion in
a systematic review. Each abstract was scored (dichotomously, as "not
relevant" or "possibly relevant") by at least two raters. However, only a
small percentage of abstracts were scored by all 5 raters.
Does anyone know of an R package that will compute inter-rater agreement
statistics (e.g., Fleiss's kappa or Light's kappa or something better that
I don't know about) for such a design? The functions that I know of (e.g.,
from the irr package) do listwise deletion, which means that the
statistics are computed based on a very small subset of the sample. I need
something that uses pairwise deletion.