Keywords: Cohen’s kappa statistic; Diagnostic agreement; Information measures; Inter-reader agreement; Multivalue ordered-categorical ratings

Beyond kappa: an informational index for diagnostic agreement in dichotomous and multivalue ordered-categorical ratings

Agreement measures are useful tools both to compare different evaluations of the same diagnostic outcomes and to validate new rating systems or devices. Cohen’s kappa (κ) is certainly the most popular measure of agreement between two raters, and has proved its …