The Statistics of Agreement on a Single Item or Object by Multiple Raters
- 1 October 1993
- journal article
- Published by SAGE Publications in Perceptual and Motor Skills
- Vol. 77 (2), 377-378
- https://doi.org/10.2466/pms.1993.77.2.377
Abstract
Kappa on a single item, Ksi, is proposed as a measure of interrater agreement when a single item or object is rated by multiple raters. A statistical test and Monte Carlo simulations are provided for testing the statistical significance of Ksi beyond chance agreement.
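
The abstract does not reproduce the formula for Ksi, so the sketch below is only an assumption-based illustration of the general approach it describes: quantify agreement among multiple raters on one item, then use Monte Carlo simulation under a chance model to judge whether the observed agreement exceeds what random rating would produce. The pairwise-agreement statistic, the uniform-random null model, and the function names `pairwise_agreement` and `monte_carlo_p_value` are all assumptions for illustration, not the published Ksi statistic or test.

```python
import numpy as np

def pairwise_agreement(ratings):
    """Proportion of rater pairs that assigned the same category to the single item.
    (Illustrative statistic; not necessarily the published Ksi.)"""
    ratings = np.asarray(ratings)
    n = len(ratings)
    pairs = n * (n - 1) / 2
    agree = sum(
        1 for i in range(n) for j in range(i + 1, n) if ratings[i] == ratings[j]
    )
    return agree / pairs

def monte_carlo_p_value(ratings, n_categories, n_sim=10_000, seed=None):
    """Estimate how often chance alone (uniformly random ratings) matches or
    exceeds the observed agreement -- a simple Monte Carlo significance test."""
    rng = np.random.default_rng(seed)
    observed = pairwise_agreement(ratings)
    n_raters = len(ratings)
    sims = np.array([
        pairwise_agreement(rng.integers(0, n_categories, size=n_raters))
        for _ in range(n_sim)
    ])
    return observed, float((sims >= observed).mean())

# Hypothetical example: 6 raters classify one object into one of 4 categories.
obs, p = monte_carlo_p_value([2, 2, 2, 3, 2, 2], n_categories=4, seed=1)
print(f"observed agreement = {obs:.3f}, Monte Carlo p ~= {p:.4f}")
```

A small observed-agreement value or a large simulated p-value would indicate that the raters' concordance on the item could plausibly arise by chance; the published test presumably formalizes this comparison for the Ksi statistic itself.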