Abstract
It has been demonstrated that systemic exposure to antiretroviral agents drives both the therapeutic and the toxic effects seen in clinical trials. Consequently, plasma concentrations of drugs and their metabolites (and possibly intracellular concentrations) need to be determined during the course of Phase 1 and 2 evaluations. It has also been demonstrated that precise mathematical relationships between exposure and response, and between exposure and toxicity, allow rational selection of drug doses for evaluation in larger Phase 2 and 3 clinical trials. This can be seen with AZT, where the exposure-efficacy relationship clearly plateaus by 600 mg/day, while the exposure-toxicity relationship predicts increasing toxicity at higher doses without any corresponding gain in therapeutic response, because the effect relationship has already plateaued. It is seen most clearly with ddI, where the exposure-efficacy relationship indicated that longer-term therapy at lower doses would ultimately be as efficacious as higher-dose therapy. Coupled with the long-term tolerance data for ddI, this led to a prospective prediction of the superiority of lower-dose therapy, which was subsequently validated. Finally, some problems with current Phase 1 and 2 trial design have been pointed out: bioavailability needs to be recognized explicitly as an important evaluation factor in oral antiretroviral therapy, and, perhaps most notably, the data of Drusano et al. suggest that the initial susceptibility of the virus to the study drug is important. Clearly, the microbiology of the patient needs to be taken into consideration when evaluating the results of Phase 1 and 2 antiretroviral trials.(ABSTRACT TRUNCATED AT 250 WORDS)
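The abstract refers to "precise mathematical relationships between exposure and response" without stating their form. A common choice for such relationships in antiretroviral pharmacodynamics is a sigmoid Emax (Hill) model; the sketch below is only an illustrative assumption, since the specific model, exposure metric, and parameter values underlying these analyses are not given in this abstract.

% Illustrative sigmoid E_max (Hill) exposure-response model (assumed form,
% not stated in the abstract). E = effect (or probability of toxicity),
% X = exposure metric (e.g., AUC), E_max = maximal effect,
% X_50 = exposure giving half-maximal effect, H = Hill coefficient.
\[
  E(X) \;=\; \frac{E_{\max}\, X^{H}}{X_{50}^{H} + X^{H}}
\]
% For X well above X_50, E(X) approaches E_max and the curve flattens,
% which is the plateau behavior described for the AZT exposure-efficacy data.

Under a model of this kind, rational dose selection amounts to identifying the exposure beyond which the efficacy curve is essentially flat while the predicted toxicity continues to rise, which is the reasoning the abstract describes for AZT and ddI.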