Reliability and learning from the objective structured clinical examination
- 1 May 1990
- journal article
- Published by Wiley in Medical Education
- Vol. 24 (3), 219-223
- https://doi.org/10.1111/j.1365-2923.1990.tb00004.x
Abstract
The difficulties of measuring the clinical performance of students in the health professions are well known to educators. One innovative measure incorporated in several educational programmes at the Faculty of Health Sciences, McMaster University, Hamilton, Ontario, Canada, including the BSc in Nursing programme, is the objective structured clinical examination (OSCE). The purpose of this study was to determine the reliability of this evaluation method, both within and between stations.

One problem noted by users of the OSCE is that performance on individual stations is poorly correlated across stations, apparently regardless of the particular content of each station. A number of hypotheses have been advanced to explain this phenomenon: performance of any skill is sufficiently variable that the correlation is poor; different skills have so little common basis that there is no generalizability from one to another; or the reliability of assessment within any one station is low. To test these hypotheses, a study of test‐retest and interrater reliability was designed. Students undergoing a 10‐station OSCE repeated their starting OSCE station at the end of the examination circuit, and several stations were rated by more than one observer (interrater).

This study of 71 first‐year BScN students showed that interrater reliability was high (ICC = 0.80 to 0.99) and test‐retest reliability on the same station was good (ICC = 0.66 to 0.86); however, correlation across stations was low (α = 0.198). It is thus apparent that there is high consistency in repeated performance of a single skill but little consistency of performance across different skills.
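The cross-station figure reported above (α = 0.198) is a Cronbach's alpha, which treats each station as one "item" and asks how consistently stations rank the same students. The sketch below shows the standard alpha formula applied to a students-by-stations score matrix; the data are invented for illustration and are not the study's data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (students x stations) score matrix.

    alpha = k/(k-1) * (1 - sum of per-station variances / variance of totals)
    """
    k = scores.shape[1]                          # number of stations
    station_vars = scores.var(axis=0, ddof=1)    # variance of each station's scores
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - station_vars.sum() / total_var)

# Hypothetical scores for 5 students on 3 stations (illustrative only)
scores = np.array([
    [8, 6, 7],
    [5, 7, 4],
    [9, 5, 8],
    [4, 6, 5],
    [7, 8, 6],
])
print(round(cronbach_alpha(scores), 3))  # low alpha: stations rank students inconsistently
```

A low alpha, as in the study, indicates that a student's score on one station says little about their score on another, which is exactly the cross-station inconsistency the authors report.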