New Interpretations of Cohen’s Kappa
Cohen’s kappa is a widely used association coefficient for summarizing interrater agreement on a nominal scale. Kappa reduces the ratings of the two observers to a single number. With three or more categories it is more informative to summarize the ratings by category coefficients that describe the...
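Since the abstract stops short, a quick illustration of the coefficient the article discusses may help. Cohen's kappa is defined as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance from the raters' marginal distributions. The sketch below computes it from a two-rater contingency table; the function name and the example counts are illustrative, not taken from the article.

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa from a square contingency table of two raters' counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_o = np.trace(table) / n                             # observed agreement
    p_e = (table.sum(axis=1) @ table.sum(axis=0)) / n**2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Two raters, three nominal categories; diagonal cells are agreements.
ratings = [[20, 5, 1],
           [4, 15, 3],
           [2, 3, 12]]
print(round(cohens_kappa(ratings), 3))  # ~0.578
```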
| Main Author: | Matthijs J. Warrens |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Hindawi Limited, 2014-01-01 |
| Series: | Journal of Mathematics |
| Online Access: | http://dx.doi.org/10.1155/2014/203907 |
Similar Items
- The Matthews Correlation Coefficient (MCC) is More Informative Than Cohen’s Kappa and Brier Score in Binary Classification Assessment
  by: Davide Chicco, et al.
  Published: (2021-01-01)
- Weighted Kappas for 3×3 Tables
  by: Matthijs J. Warrens
  Published: (2013-01-01)
- Why Cohen’s Kappa should be avoided as performance measure in classification
  by: Rosario Delgado, et al.
  Published: (2019-01-01)
- Measurement of Interobserver Disagreement: Correction of Cohen’s Kappa for Negative Values
  by: Tarald O. Kvålseth
  Published: (2015-01-01)
- A Variance Estimator for Cohen’s Kappa under a Clustered Sampling Design
  by: Abdel-Rasoul, Mahmoud Hisham
  Published: (2011)