Cohen's Kappa

Introduced to me when looking at inter-annotator agreement.

Resources

https://en.wikipedia.org/wiki/Cohen%27s_kappa

Cohen's kappa measures the agreement between two raters who each classify N items into C mutually exclusive categories. The definition of κ is

    κ = (p_o − p_e) / (1 − p_e)

where p_o is the relative observed agreement among raters (the proportion of items the two raters label identically), and p_e is the hypothetical probability of chance agreement, estimated from each rater's marginal category frequencies. κ = 1 means complete agreement; κ = 0 means no agreement beyond what chance alone would produce.
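
A minimal from-scratch sketch of the formula above, assuming two raters' labels arrive as plain Python lists (the function name cohens_kappa and the example labels are mine, for illustration). In practice sklearn.metrics.cohen_kappa_score computes the same statistic.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same N items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # p_o: relative observed agreement (fraction of items labelled identically)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # p_e: chance agreement from each rater's marginal category frequencies
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2

    # Undefined when p_e == 1 (both raters use a single identical category)
    return (p_o - p_e) / (1 - p_e)

# Two annotators labelling 10 items as "pos"/"neg":
a = ["pos", "pos", "neg", "pos", "neg", "pos", "pos", "neg", "neg", "pos"]
b = ["pos", "neg", "neg", "pos", "neg", "pos", "pos", "pos", "neg", "pos"]
print(cohens_kappa(a, b))  # p_o = 0.8, p_e = 0.52 -> kappa ~ 0.583
```

Note how the example separates raw agreement (0.8) from kappa (~0.583): both raters use "pos" 60% of the time, so a fair amount of their agreement is expected by chance, and kappa discounts it.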