Abstract
Kappa on a single item (Ksi) is proposed as a measure of interrater agreement when a single item or object is rated by multiple raters. A statistical test and Monte Carlo simulations are provided for testing the statistical significance of Ksi beyond chance agreement.
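
The abstract does not give the formula for Ksi or the details of the significance test, so the sketch below is only a hypothetical illustration of how a kappa-type agreement statistic for one item rated by several raters, together with a Monte Carlo test of its significance, might be organized. The function names, the pairwise-agreement definition of observed agreement, and the uniform chance model are assumptions for illustration, not the authors' method.

    import numpy as np

    def single_item_kappa(ratings, n_categories, p_chance=None):
        """Kappa-type agreement for ONE item rated by several raters (illustrative, not the paper's Ksi).

        ratings      : sequence of category labels (0 .. n_categories-1), one per rater
        n_categories : number of possible categories
        p_chance     : chance probability of each category (uniform if None)
        """
        ratings = np.asarray(ratings)
        m = len(ratings)
        if p_chance is None:
            p_chance = np.full(n_categories, 1.0 / n_categories)

        # Observed agreement: proportion of rater pairs assigning the same category.
        counts = np.bincount(ratings, minlength=n_categories)
        agreeing_pairs = np.sum(counts * (counts - 1)) / 2
        p_obs = agreeing_pairs / (m * (m - 1) / 2)

        # Chance agreement: probability that two independent raters agree under p_chance.
        p_exp = np.sum(np.asarray(p_chance) ** 2)
        return (p_obs - p_exp) / (1.0 - p_exp)

    def monte_carlo_p_value(ratings, n_categories, n_sim=10000, p_chance=None, seed=0):
        """Estimate P(statistic >= observed) by simulating independent chance ratings."""
        rng = np.random.default_rng(seed)
        if p_chance is None:
            p_chance = np.full(n_categories, 1.0 / n_categories)
        observed = single_item_kappa(ratings, n_categories, p_chance)
        m = len(ratings)
        sims = np.empty(n_sim)
        for i in range(n_sim):
            simulated = rng.choice(n_categories, size=m, p=p_chance)
            sims[i] = single_item_kappa(simulated, n_categories, p_chance)
        return observed, np.mean(sims >= observed)

    # Example: 6 raters, 3 categories, 5 of 6 raters choose category 1.
    stat, p = monte_carlo_p_value([1, 1, 1, 1, 1, 2], n_categories=3)
    print(f"agreement statistic = {stat:.3f}, Monte Carlo p-value = {p:.4f}")

The Monte Carlo step mirrors the abstract's idea of testing agreement beyond chance: the observed statistic is compared against the distribution obtained when raters assign categories independently at random.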
