The Use of Chance Corrected Percentage of Agreement to Interpret the Results of a Discriminant Analysis
- 1 April 1978
- journal article
- Published by SAGE Publications in Educational and Psychological Measurement
- Vol. 38 (1) , 29-35
- https://doi.org/10.1177/001316447803800105
Abstract
Most programs for performing discriminant analysis provide a summary table of hits and misses in predicting group membership by using the discriminant function. The interpretation of such tables can be enhanced greatly by computing Cohen's kappa, κ, the chance-corrected percentage of agreement between actual and predicted group membership. The standard error of kappa can be used to set confidence limits for the accuracy of the discriminant prediction and to test the difference in predictive accuracy for two independent samples. These procedures are demonstrated using data previously published in a more preliminary form.
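The computation described in the abstract can be sketched as follows. This is a minimal illustration, not the article's own code: the 2×2 hits-and-misses table below is hypothetical, and the standard error uses the simple large-sample approximation commonly given for kappa, which may differ from the formula used in the article.

```python
import math

def kappa_with_se(table):
    """Cohen's kappa and a simple large-sample standard error for a
    square hits-and-misses table (rows = actual group, columns =
    predicted group from the discriminant function)."""
    n = sum(sum(row) for row in table)            # total cases
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n   # observed agreement (hit rate)
    rows = [sum(table[i]) for i in range(k)]
    cols = [sum(table[i][j] for i in range(k)) for j in range(k)]
    pe = sum(rows[i] * cols[i] for i in range(k)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)                  # chance-corrected agreement
    se = math.sqrt(po * (1 - po)) / ((1 - pe) * math.sqrt(n))
    return kappa, se

# Hypothetical 2-group classification table from a discriminant analysis
table = [[40, 10],
         [15, 35]]
kappa, se = kappa_with_se(table)
lo, hi = kappa - 1.96 * se, kappa + 1.96 * se    # 95% confidence limits

# To compare predictive accuracy across two independent samples,
# form z = (kappa1 - kappa2) / sqrt(se1**2 + se2**2) and refer it
# to the standard normal distribution.
```

Here the raw hit rate is 75%, but kappa is 0.50: half of the apparent accuracy beyond chance expectation, which is the interpretive point the article makes about reading such summary tables.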
References
- Cohen, J. (1968). Weighted kappa: Nominal scale agreement provision for scaled disagreement or partial credit. Psychological Bulletin.