Focus on psychometrics: The kappa statistic for establishing interrater reliability in the secondary analysis of qualitative clinical data

Abstract
Analysis of extant clinical records is receiving increased emphasis in nursing investigations. Appropriate use of this approach to patient research requires careful attention to data management, including assessment of reliability. Percent agreement, phi, and kappa all serve as estimates of interrater reliability in the analysis of such data. Kappa has particular merit as a measure of interrater reliability because it corrects for agreement expected by chance; it also poses some peculiar problems in implementation and interpretation. The nature and computation of kappa and its application in the analysis of clinical data are discussed.
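The computation the abstract refers to can be illustrated with a minimal sketch of Cohen's kappa for two raters. This is not code from the article itself; it simply implements the standard chance-corrected formula, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected from each rater's marginal category proportions.

```python
from collections import Counter

def cohen_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is the agreement expected by chance from the marginals.
    """
    assert len(rater1) == len(rater2) and rater1, "equal, nonempty ratings"
    n = len(rater1)
    # Observed agreement: proportion of items coded identically by both raters.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of each rater's marginal proportions, summed
    # over categories.
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of 4 records by two raters: they agree on 3 of 4
# (percent agreement 0.75), but kappa is lower after the chance correction.
k = cohen_kappa(["y", "y", "n", "n"], ["y", "n", "n", "n"])
```

With these ratings, p_o = 0.75 and p_e = 0.5, so kappa = 0.5, illustrating the abstract's point that kappa is a stricter estimate than raw percent agreement. Note that kappa is undefined when p_e = 1 (a single shared category), one of the interpretive problems the article addresses.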
