Abstract
The authors tested the validity of an evaluation procedure designed to measure the effects of university-organized continuing medical education (CME) courses on participants' prescribing behavior. They sought to substantiate the participating physicians' self-reported intentions to change their behavior and self-reported behavior changes by comparing these with the physicians' actual behavior before and after the course. Sixty-one participants used automatic carbon prescription pads for six weeks before a CME course on therapeutics and for 16 weeks after the course. The copies of the prescriptions were analyzed to establish actual behavior, which was compared with the physicians' self-reports. There were no significant changes in the physicians' general prescribing behavior. However, once they stated formal intentions to change their behavior, the participants did change in the intended direction. When the analysis included only those prescribing areas in which intentions to change were stated, there was strong evidence that the self-reports of behavior change accurately represented the actual behavior change. Both results reached statistically significant levels. The use of self-reports of behavior change as a method of evaluation in CME courses therefore appears to the authors to be valid.