Article ID: 1150962
Journal: Statistical Methodology
Published Year: 2012
Pages: 16
File Type: PDF
Abstract

Cohen’s unweighted kappa and weighted kappa are popular descriptive statistics for measuring agreement between two raters on a categorical scale. With m ≥ 3 raters, there are several views in the literature on how to define agreement. We consider a family of weighted kappas for multiple raters using the concept of g-agreement (g = 2, 3, …, m), which refers to the situation in which it is decided that there is agreement if g out of m raters assign an object to the same category. Given m raters, we may formulate m − 1 weighted kappas in this family, one for each type of g-agreement. We show that the m − 1 weighted kappas coincide if we use the weighting scheme proposed by Mielke et al. (2007) [31].
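
To make the g-agreement idea concrete, the following Python sketch shows one way such a statistic could be computed. It is an illustrative construction, not the paper's definition: the hypothetical helper g_agreement_kappa counts an object as an agreement when at least g of the m raters assign it to the same category, and chance-corrects the observed proportion against a Monte Carlo estimate of the expected proportion under independent rating with pooled category probabilities.

```python
import numpy as np

def g_agreement_kappa(ratings, g, n_sim=100_000, seed=0):
    """Chance-corrected g-agreement statistic (illustrative sketch).

    ratings : (n_objects, m_raters) array of integer category labels.
    g       : an object counts as an agreement when at least g of the
              m raters assign it to the same category.
    """
    ratings = np.asarray(ratings)
    n, m = ratings.shape
    categories = np.unique(ratings)

    # Observed proportion of objects with g-agreement: some category
    # is chosen by at least g of the m raters.
    counts = np.stack([(ratings == c).sum(axis=1) for c in categories], axis=1)
    p_obs = np.mean(counts.max(axis=1) >= g)

    # Pooled category proportions over all n * m ratings.
    p = np.array([(ratings == c).mean() for c in categories])

    # Expected g-agreement under chance: m raters rating independently
    # with the pooled proportions (Monte Carlo estimate).
    rng = np.random.default_rng(seed)
    sim = rng.choice(len(categories), size=(n_sim, m), p=p)
    sim_counts = np.stack([(sim == k).sum(axis=1)
                           for k in range(len(categories))], axis=1)
    p_exp = np.mean(sim_counts.max(axis=1) >= g)

    # Kappa-style chance correction.
    return (p_obs - p_exp) / (1.0 - p_exp)

# Example: 5 objects rated by m = 4 raters on 3 categories.
ratings = [[0, 0, 0, 1],
           [1, 1, 1, 1],
           [2, 2, 0, 1],
           [0, 1, 2, 0],
           [1, 1, 2, 1]]
for g in (2, 3, 4):  # the m - 1 types of g-agreement
    print(g, g_agreement_kappa(ratings, g))
```

With m = 4 raters this yields the m − 1 = 3 statistics (g = 2, 3, 4). The paper's result concerns a specific weighting scheme, that of Mielke et al. (2007) [31], under which the m − 1 weighted kappas coincide; the sketch above does not implement that scheme.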

Related Topics
Physical Sciences and Engineering; Mathematics; Statistics and Probability
Authors