Residency Program Director Evaluations Do Not Correlate With Performance on a Required 4th-Year Objective Structured Clinical Examination

Abstract
Assessment of resident performance is a complex task. Our objective was to correlate performance on a required 4th-year objective structured clinical examination (OSCE) with residency program director assessment, medical school class rank, and United States Medical Licensing Examination (USMLE) scores. We surveyed residency program directors about the performance of 50 graduates of our medical school, chosen to represent the 25 highest-scoring (OSCE-HI) and 25 lowest-scoring (OSCE-LO) performers on our required 4th-year OSCE. Program directors were unaware of the graduates' OSCE scores. OSCE scores did not correlate with Likert-scale ratings for any survey parameter studied (r < .23, p > .13 for all comparisons). Similarly, program director evaluations did not correlate with class rank or USMLE scores (r < .26, p > .09 for all comparisons). We conclude that program director evaluations of resident performance do not appear to correlate with objective tests of either clinical skills or knowledge taken during medical school. These findings suggest that more structured and objective evaluative tools might improve the assessment of trainees in postgraduate training programs.