Extending Information Agreement by Continuity

Abstract

Agreement measures are useful metrics both to compare different evaluations of the same diagnostic outcomes and to validate new rating systems or devices. While many of them have been proposed in the literature, Cohen's kappa is still the de facto standard for gauging agreement.

Information Agreement (IA) is a novel two-observer, information-theoretic metric introduced to overcome the limitations and alleged pitfalls of Cohen's kappa. It gives agreement an operative meaning: in both dichotomous and multi-valued ordered-categorical cases, it measures the information shared between two raters through the virtual diagnostic channel connecting them - the more information the raters exchange, the higher their agreement. Unfortunately, this measure can only handle agreement matrices whose entries are all strictly positive.

This work extends IA by also admitting 0 as a possible value for the entries of an agreement matrix. Moreover, a Python software library that computes the extended version of IA, together with some of the most widely used agreement measures, is presented and tested.
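The abstract describes IA as the information exchanged between two raters over the virtual diagnostic channel that connects them. The sketch below illustrates that general idea only: it computes a normalized mutual information over a two-rater agreement matrix, with zero cells handled by the convention 0 · log 0 = 0. The function name, the normalization by the smaller marginal entropy, and the example matrix are all assumptions for illustration; they are not the paper's exact IA definition or the API of the accompanying library.

```python
import numpy as np


def mutual_information_agreement(matrix):
    """Illustrative information-theoretic agreement score for a two-rater
    agreement (confusion) matrix. Zero entries are allowed and treated with
    the convention 0 * log(0) = 0. This is a sketch of the underlying idea,
    not the paper's exact IA formula."""
    counts = np.asarray(matrix, dtype=float)
    joint = counts / counts.sum()            # joint rating distribution
    row = joint.sum(axis=1, keepdims=True)   # marginal of rater A
    col = joint.sum(axis=0, keepdims=True)   # marginal of rater B

    # Mutual information I(A;B), skipping zero cells (0 * log 0 := 0).
    nonzero = joint > 0
    mi = np.sum(joint[nonzero]
                * np.log2(joint[nonzero] / (row @ col)[nonzero]))

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    # Normalize by the smaller marginal entropy so the score lies in [0, 1]
    # (one common normalization; the paper may use a different one).
    return mi / min(entropy(row.ravel()), entropy(col.ravel()))


if __name__ == "__main__":
    # Hypothetical dichotomous example with a zero cell in the matrix.
    print(mutual_information_agreement([[40, 0], [2, 58]]))
```

A score near 1 indicates that one rater's label almost determines the other's (high agreement), while a score near 0 indicates that the two ratings are close to statistically independent.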

Publication
2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
