Note on Cohen's Kappa

Abstract
Cohen's Kappa is a measure of the overall agreement between two raters classifying items into a given set of categories. This communication describes a simple computational method for determining the agreement on specific categories without the need to collapse the original data table, as required by the previous Kappa-based method. It is also pointed out that Kappa may be formulated in terms of certain distance metrics. The computational procedure for the specific agreement measure is exemplified using hypothetical data from psychological diagnoses.
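
As background to the abstract, the sketch below shows how the overall Kappa referred to here is conventionally computed from a two-rater contingency table: observed agreement corrected for chance agreement. The contingency table and category counts are hypothetical illustrations, and the paper's specific-category procedure is not reproduced, since the abstract does not give it.

```python
def cohens_kappa(table):
    """Overall Cohen's Kappa for a square contingency table.

    table[i][j] = number of items rater A placed in category i
    and rater B placed in category j.
    """
    k = len(table)
    n = sum(sum(row) for row in table)
    row_totals = [sum(table[i]) for i in range(k)]
    col_totals = [sum(table[i][j] for i in range(k)) for j in range(k)]

    # Observed proportion of agreement (diagonal cells).
    p_observed = sum(table[i][i] for i in range(k)) / n
    # Agreement expected by chance from the marginal totals.
    p_expected = sum(row_totals[i] * col_totals[i] for i in range(k)) / (n * n)

    return (p_observed - p_expected) / (1 - p_expected)


# Hypothetical data: two raters assigning 100 cases to three diagnostic categories.
table = [
    [25, 3, 2],
    [4, 20, 1],
    [3, 2, 40],
]
print(round(cohens_kappa(table), 3))  # prints 0.769
```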
