A Revised Index of Interrater Agreement for Multi-Item Ratings of a Single Target
- 1 June 1999
- journal article
- Published by SAGE Publications in Applied Psychological Measurement
- Vol. 23(2), 127-135
- https://doi.org/10.1177/01466219922031257
Abstract
The commonly used form of $r_{wg(J)}$ can display irregular behavior, so four variants of this index were examined. An alternative index, $r^*_{wg(J)}$, is recommended. This index is an inverse linear function of the ratio of the average obtained variance to the variance of uniformly distributed random error. $r^*_{wg(J)}$ is superficially similar to Cronbach's α, but careful examination confirms that $r^*_{wg(J)}$ is an index of agreement, not reliability. Based on an examination of the small-sample behavior of $r_{wg}$ and $r^*_{wg(J)}$, sample sizes of 10 or more raters are recommended.
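For readers who want the computational form: the abstract's description (an inverse linear function of the ratio of the average obtained variance to the variance of uniformly distributed random error) corresponds to

$$r^*_{wg(J)} = 1 - \frac{\bar{s}^2_{x_j}}{\sigma^2_{EU}}, \qquad \sigma^2_{EU} = \frac{A^2 - 1}{12},$$

where $\bar{s}^2_{x_j}$ is the mean of the $J$ observed item variances and $\sigma^2_{EU}$ is the variance of a discrete uniform distribution over the $A$ response options. The $1 - \bar{s}^2/\sigma^2_{EU}$ form follows directly from the abstract's wording; the symbols here follow common usage in the $r_{wg}$ literature rather than being quoted from the article.

A minimal computational sketch under those assumptions, with ratings arranged as a raters-by-items array; the function name and the use of the sample ($n-1$) variance are illustrative choices, not taken from the article:

```python
import numpy as np

def r_star_wg_j(ratings: np.ndarray, n_options: int) -> float:
    """Agreement index: 1 minus (mean item variance / uniform-error variance).

    ratings   -- shape (n_raters, n_items), ratings on a 1..n_options scale
    n_options -- number of response options A on the rating scale
    """
    # Mean of the J observed item variances, each computed across raters
    mean_item_var = np.var(ratings, axis=0, ddof=1).mean()
    # Variance of a discrete uniform distribution over A response options
    sigma2_eu = (n_options ** 2 - 1) / 12.0
    return 1.0 - mean_item_var / sigma2_eu

# Example: 10 raters, 4 items, 5-point scale; values near 1 indicate agreement
rng = np.random.default_rng(0)
ratings = rng.integers(3, 5, size=(10, 4))  # raters mostly answer 3 or 4
print(r_star_wg_j(ratings, n_options=5))
```

Note that, unlike the original $r_{wg(J)}$, this form has no Spearman-Brown-style dependence on $J$ in both numerator and denominator, which is what makes it linear in the variance ratio.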
References
- Measuring Interrater Agreement for Ratings of a Single Target. Applied Psychological Measurement, 1997.
- $r_{wg}$: An assessment of within-group interrater agreement. Journal of Applied Psychology, 1993.
- A disagreement about within-group agreement: Disentangling issues of consistency versus consensus. Journal of Applied Psychology, 1992.
- Interrater reliability coefficients cannot be computed when only one stimulus is rated. Journal of Applied Psychology, 1989.
- Estimating within-group interrater reliability with and without response bias. Journal of Applied Psychology, 1984.
- Interrater reliability and agreement of subjective judgments. Journal of Counseling Psychology, 1975.
- Judgment of counseling process: Reliability, agreement, and error. Psychological Bulletin, 1972.
- A Note on Estimating the Reliability of Categorical Data. Educational and Psychological Measurement, 1970.
- Coefficient alpha and the internal structure of tests. Psychometrika, 1951.