The reliability of 'expert judgement' in the marking of state nursing examinations
- 1 May 1983
- journal article
- Published by Wiley in Journal of Advanced Nursing
- Vol. 8 (3), 221-226
- https://doi.org/10.1111/j.1365-2648.1983.tb00317.x
Abstract
A pilot study was undertaken to investigate the reliability of the 'expert judgement' exercised in marking state examination papers. Estimates of inter-rater reliability had a median coefficient of 0.76. Intra-rater reliability was marginally higher overall, with a median coefficient of 0.81. While marker reliability is only one source of variation in the assessment of extended answers, it is important that its level be known. The study provides useful information and guidelines for improvement.
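The abstract does not state which statistic was used for its reliability estimates. As a minimal sketch only, assuming a Pearson product-moment correlation between two sets of marks for the same scripts (one common choice for such estimates), the coefficients could be computed as follows; the marker data below are invented for illustration.

```python
import numpy as np

def reliability_coefficient(marks_a, marks_b):
    """Correlation between two sets of marks awarded to the same scripts.

    For inter-rater reliability, marks_a and marks_b come from two different
    markers; for intra-rater reliability, they are the same marker's first
    and second marking of the scripts.
    """
    return float(np.corrcoef(marks_a, marks_b)[0, 1])

# Hypothetical scores awarded to ten scripts by two markers.
marker_1 = np.array([62, 55, 70, 48, 66, 59, 73, 51, 64, 58])
marker_2 = np.array([65, 50, 68, 52, 60, 61, 75, 49, 60, 55])

print(f"inter-rater reliability = {reliability_coefficient(marker_1, marker_2):.2f}")
```

A median of such coefficients across several marker pairs (or mark/re-mark pairs) would correspond to the summary figures reported in the abstract.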