Different written assessment methods: what can be said about their strengths and weaknesses?
- 1 September 2004
- journal article
- Published by Wiley in Medical Education
- Vol. 38 (9), 974-979
- https://doi.org/10.1111/j.1365-2929.2004.01916.x
Abstract
Written assessment techniques can be subdivided according to their stimulus format (what the question asks) and their response format (how the answer is recorded). The former is more important than the latter in determining the type of competence being asked for; it is nevertheless important to consider both when selecting the most appropriate types. Major elements to consider when making such a selection are cueing effect, reliability, validity, educational impact and resource-intensiveness. Open-ended questions should be used solely to test aspects that cannot be tested with multiple-choice questions; in all other cases the loss of reliability and the higher resource-intensiveness represent a significant downside, and multiple-choice questions are no less valid than open-ended questions. When making this distinction, it is important to consider whether the question is embedded within a relevant case or context such that it cannot be answered without the case. Whether this is essential depends on what the question is intended to test. Context-rich questions test different cognitive skills than context-free questions do. If knowledge alone is the purpose of the test, context-free questions may be useful, but if the application of knowledge, or knowledge as part of problem solving, is being tested, then context is indispensable. Every format has its advantages and disadvantages, and a combination of formats based on rational selection is more useful than trying to find or develop a panacea. The response format is less important in this respect than the stimulus format.