Comparison of an Aggregate Scoring Method With a Consensus Scoring Method in a Measure of Clinical Reasoning Capacity
- 1 July 2002
- journal article
- research article
- Published by Taylor & Francis in Teaching and Learning in Medicine
- Vol. 14 (3), 150-156
- https://doi.org/10.1207/s15328015tlm1403_3
Abstract
Background: Diversity in clinical reasoning among experts is well known. Nevertheless, in written clinical reasoning assessment, the common practice is to ask experts to reach a consensus on each item and to score students against a single "good answer."

Purpose: To explore the effects of taking the variability of experts' answers into account in a method of clinical reasoning assessment based on authentic tasks, the Script Concordance Test.

Methods: Two different methods were used to build answer keys. The first incorporated the variability within a group of experts (the criterion experts) through an aggregate scoring method. The second used the consensus answer obtained from the same group of criterion experts for each item. Scores obtained with the two methods by students and by another group of experts (the tested experts) were compared. The domain of assessment was gynecology-obstetrics clinical knowledge. The sample consisted of 150 clerkship students and 7 additional experts (the tested experts).

Results: In the context of authentic tasks, experts' answers to items varied substantially. Strikingly, 59% of the answers given individually by criterion-group experts differed from the answer they provided when asked, as a group, to agree on the "good answer" required of students. The aggregate scoring method showed several advantages and was more sensitive in detecting expertise.

Conclusions: The findings suggest that, in the assessment of complex performance in ill-defined situations, the usual practice of asking experts to reach a consensus on each item hinders the detection of expertise. If these results are confirmed by further research, this practice should be reconsidered.
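For readers unfamiliar with the two keying approaches, the following minimal Python sketch illustrates the contrast on a single hypothetical Script Concordance Test item. The panel responses and the -2 to +2 Likert scale are illustrative assumptions, not data from the study: an aggregate key grants partial credit in proportion to how many reference-panel experts chose each response (the modal response earning full credit), whereas a consensus key credits only the single agreed-upon answer.

```python
from collections import Counter

def aggregate_key(panel_answers):
    """Aggregate key for one item: each response earns credit
    proportional to the number of panel experts who chose it,
    normalized so the modal response is worth 1.0."""
    counts = Counter(panel_answers)
    modal = max(counts.values())
    return {answer: n / modal for answer, n in counts.items()}

def consensus_key(consensus_answer):
    """Consensus key: full credit for the single agreed-upon
    'good answer', nothing otherwise."""
    return {consensus_answer: 1.0}

def score(examinee_answer, key):
    """Credit an examinee's answer against a key (0 if unkeyed)."""
    return key.get(examinee_answer, 0.0)

# Hypothetical item: ten panel experts rate the effect of a new
# clinical finding on a diagnostic hypothesis (-2..+2 scale).
panel = [+1, +1, +1, +1, +1, +2, +2, +2, 0, 0]

agg = aggregate_key(panel)   # {+1: 1.0, +2: 0.6, 0: 0.4}
con = consensus_key(+1)      # {+1: 1.0}

print(score(+2, agg))  # 0.6 -> partial credit for a minority expert view
print(score(+2, con))  # 0.0 -> no credit under the consensus key
```

As the last two lines show, an examinee who sides with a minority of the expert panel receives partial credit under the aggregate key but none under the consensus key, which is the mechanism by which consensus keying can mask expertise.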