A Cause for Concern

Abstract
Percutaneous coronary intervention has become the most frequently used method of myocardial revascularization.1,2 The advent of coronary stenting led to a significant decrease in the complications seen after balloon angioplasty, resulting in improved patient outcomes.3,4 Yet stented angioplasty has been plagued from the outset by early stent thrombosis (<30 days after the index procedure) and late in-stent restenosis (ISR). Initially, stent thrombosis rates as high as 24% raised serious doubts about the viability of the therapy.5 With the combined prescription of thienopyridines and aspirin for 4 to 8 weeks,6,7 together with proper stent deployment techniques,8 early stent thrombosis rates decreased to what was considered an unavoidable and acceptable 1% to 1.5%. At the same time, efforts to reduce the 30% late ISR rate through systemic pharmacological approaches remained unsuccessful until local radiation, a strong antiproliferative therapy, was applied to prevent or treat ISR.9–13 Vascular brachytherapy provided the first illustration that delayed healing might carry an increased risk of thrombosis alongside the expected reduction in restenosis: stent thrombosis rates rose again, up to 5.3%, and the window of event occurrence extended beyond 1 year, so that the initial clinical benefit eroded over time.13,14 Today, first-generation drug-eluting stents (gen1-DES: Cypher, Cordis, Johnson & Johnson, Miami Lakes, Fla [sirolimus-eluting stent, SES], and Taxus, Boston Scientific Corp, Natick, Mass [paclitaxel-eluting stent, PES]), which release an antiproliferative compound (sirolimus or paclitaxel, respectively) via a nonbioerodible polymer, have been shown to reduce the incidence of ISR by up to 75%.15–39 Since the publication in 2002 of the first randomized trial16 comparing DES with bare metal stents (BMS) in highly selected patients and lesions, the use of DES in clinical practice has expanded to the majority of coronary lesion subsets …