Kappa muddles together two sources of disagreement: Tetrachoric correlation is preferable
- 1 August 1993
- journal article
- Focus on Psychometrics
- Published by Wiley in Research in Nursing & Health
- Vol. 16 (4), 313-316
- https://doi.org/10.1002/nur.4770160410
Abstract
When assessing agreement between experts, it is important to distinguish between disagreements that can and disagreements that cannot be explained by different placing of the boundaries between categories. Cohen's kappa statistic is affected by both types of disagreement; the tetrachoric correlation is affected only by the second. © 1993 John Wiley & Sons, Inc.
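As an illustration of the abstract's point (not part of the article), here is a minimal Python sketch that computes both statistics for a 2x2 agreement table. The kappa formula is Cohen's (1960); the tetrachoric estimate uses a common approach, fixing each rater's threshold from the table margins and maximizing the multinomial likelihood over the latent correlation. The example table is invented: two raters who order cases almost identically but place the category boundary differently.

```python
# Sketch: Cohen's kappa vs. a tetrachoric correlation estimate for a
# 2x2 agreement table. Illustrative only; not code from the article.
import numpy as np
from scipy.stats import norm, multivariate_normal
from scipy.optimize import minimize_scalar

def cohens_kappa(table):
    """Cohen's kappa. table[i][j] = # of cases with rater1=i, rater2=j."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    po = np.trace(t) / n                      # observed agreement
    pe = (t.sum(1) * t.sum(0)).sum() / n**2   # agreement expected by chance
    return (po - pe) / (1 - pe)

def tetrachoric(table):
    """ML tetrachoric correlation, thresholds fixed from the margins."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    tau1 = norm.ppf(t[0].sum() / n)     # rater 1's boundary on the latent scale
    tau2 = norm.ppf(t[:, 0].sum() / n)  # rater 2's boundary

    def neg_loglik(rho):
        # Cell probabilities under a bivariate normal with correlation rho.
        bvn = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
        p00 = bvn.cdf([tau1, tau2])
        p01 = norm.cdf(tau1) - p00
        p10 = norm.cdf(tau2) - p00
        p11 = 1 - p00 - p01 - p10
        probs = np.array([[p00, p01], [p10, p11]]).clip(1e-12, None)
        return -(t * np.log(probs)).sum()

    return minimize_scalar(neg_loglik, bounds=(-0.999, 0.999),
                           method="bounded").x

# Invented example: the raters rank cases almost identically, but rater 1
# places the category boundary higher, producing asymmetric off-diagonals.
table = [[40, 20],   # rater1 = 0: counts for rater2 = 0 / 1
         [ 2, 38]]   # rater1 = 1
print(f"kappa       = {cohens_kappa(table):.3f}")
print(f"tetrachoric = {tetrachoric(table):.3f}")
```

On a table like this, kappa comes out well below the tetrachoric estimate: the boundary difference counts against kappa as disagreement, while the tetrachoric correlation, which models each rater's boundary separately, reflects only the residual (non-boundary) disagreement.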
This publication has 5 references indexed in Scilit:
- Focus on Psychometrics: The Kappa Statistic for Establishing Interrater Reliability in the Secondary Analysis of Qualitative Clinical Data. Research in Nursing & Health, 1992
- Misinterpretation and Misuse of the Kappa Statistic. American Journal of Epidemiology, 1987
- A Coefficient of Agreement for Nominal Scales. Educational and Psychological Measurement, 1960
- On the Scientific Evaluation of Diagnostic Procedures. Radiology, 1949
- On Theories of Association. Biometrika, 1913