Defining and Assessing Professional Competence
- 9 January 2002
- review article
- Published by American Medical Association (AMA) in JAMA
- Vol. 287 (2), 226-235
- https://doi.org/10.1001/jama.287.2.226
Abstract
Context
Current assessment formats for physicians and trainees reliably test core knowledge and basic skills. However, they may underemphasize some important domains of professional medical practice, including interpersonal skills, lifelong learning, professionalism, and integration of core knowledge into clinical practice.

Objectives
To propose a definition of professional competence, to review current means for assessing it, and to suggest new approaches to assessment.

Data Sources
We searched the MEDLINE database from 1966 to 2001 and reference lists of relevant articles for English-language studies of the reliability or validity of measures of competence of physicians, medical students, and residents.

Study Selection
We excluded articles of a purely descriptive nature, duplicate reports, reviews, and opinions and position statements, which yielded 195 relevant citations.

Data Extraction
Data were abstracted by 1 of us (R.M.E.). Quality criteria for inclusion were broad, given the heterogeneity of interventions, complexity of outcome measures, and paucity of randomized or longitudinal study designs.

Data Synthesis
We generated an inclusive definition of competence: the habitual and judicious use of communication, knowledge, technical skills, clinical reasoning, emotions, values, and reflection in daily practice for the benefit of the individual and the community being served. Aside from protecting the public and limiting access to advanced training, assessments should foster habits of learning and self-reflection and drive institutional change. Subjective, multiple-choice, and standardized patient assessments, although reliable, underemphasize important domains of professional competence: integration of knowledge and skills, context of care, information management, teamwork, health systems, and patient-physician relationships. Few assessments observe trainees in real-life situations, incorporate the perspectives of peers and patients, or use measures that predict clinical outcomes.

Conclusions
In addition to assessments of basic skills, new formats that assess clinical reasoning, expert judgment, management of ambiguity, professionalism, time management, learning strategies, and teamwork promise a multidimensional assessment while maintaining adequate reliability and validity. Institutional support, reflection, and mentoring must accompany the development of assessment programs.