Inter-observer Variation in Assessment of Undescended Testis: Analysis of Kappa Statistics as a Coefficient of Reliability
- 1 December 1989
- research article
- Published by Wiley in British Journal of Urology
- Vol. 64 (6), 644-648
- https://doi.org/10.1111/j.1464-410x.1989.tb05328.x
Abstract
Summary: In a prospective study, the inter-observer variation in the diagnosis of undescended testis was analysed. Two physicians independently assessed the position and mobility of the testes of 37 boys referred for undescended testis. The boys were examined in the supine and squatting positions. The observed agreement rate between the observers was 0.90 to 0.97. When kappa (κ) statistics were used to adjust these values for expected chance agreement, κ values between 0.47 and 0.81 were obtained, with slightly higher values for patients in the supine position. Complete agreement on all observations was reached in 13.5% of the patients. Inter-observer variation may be a substantial source of bias in diagnosing the undescended testis and one of the reasons for the varying results in studies of hormonal treatment of this condition. It may also explain why the number of orchiopexies in some countries exceeds the incidence of the condition.
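The chance-corrected agreement used here is Cohen's kappa, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement rate and p_e is the agreement expected by chance from the two raters' marginal frequencies. The sketch below is a minimal illustration, not the study's data: the 2×2 counts are hypothetical, chosen so that a raw agreement of 0.90 shrinks to κ ≈ 0.49 when one category dominates, mirroring the gap between the 0.90-0.97 agreement rates and the 0.47-0.81 κ values reported above.

```python
# Minimal sketch of Cohen's kappa for two raters and a binary finding.
# The counts are hypothetical (not taken from the paper): they are chosen
# so that observed agreement is 0.90 while kappa is only ~0.49.

def cohens_kappa(table):
    """table[i][j] = number of cases rater A scored category i and rater B scored j."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_o = sum(table[i][i] for i in range(k)) / n                  # observed agreement
    row = [sum(table[i][j] for j in range(k)) for i in range(k)]  # rater A marginals
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]  # rater B marginals
    p_e = sum(row[i] * col[i] for i in range(k)) / n ** 2         # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 2x2 table: rows = rater A, columns = rater B,
# category order = [undescended, descended].
table = [[6, 5],
         [5, 84]]
print(cohens_kappa(table))  # ~0.49: high raw agreement, only moderate kappa
```

Because most testes in such a sample are judged descended by both observers, the chance agreement p_e is already high (about 0.80 in this example), so only agreement beyond that baseline counts toward κ. This is why an apparently impressive 90% agreement rate can correspond to the moderate κ values the study reports.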