Abstract
The remarkable reduction of 50 percent or more in the age-adjusted mortality rate from coronary heart disease over the past four decades in the United States [1] has been driven by three parallel phenomena: an understanding of the risk factors that influence the likelihood and presentation of coronary events, the use of randomized trials to create an indisputable evidence base on which to make decisions, and the flexibility to change systems of care to incorporate new knowledge into practice. Despite the failure of some physicians to adopt proven interventions [2] and substantial disparities in the U.S. health care system, [3] the reduction in . . .