Antiretroviral therapy in the developed world has resulted in substantial reductions in HIV-associated morbidity and mortality, changing an HIV diagnosis from a likely death sentence into a manageable chronic infection (F. J. Palella, Jr., K. M. Delaney, A. C. Moorman, M. O. Loveless, J. Fuhrer, G. A. Satten, D. J. Aschman, and S. D. Holmberg, N. Engl. J. Med. 338:853-860, 1998). Several million years of life have been saved by effective anti-HIV treatment, although these successes should not obscure the magnitude of the ongoing worldwide HIV epidemic (R. P. Walensky, A. D. Paltiel, E. Losina, L. M. Mercincavage, B. R. Schackman, P. E. Sax, M. C. Weinstein, and K. A. Freedberg, J. Infect. Dis. 194:11-19, 2006). Readers of the Journal of Virology are doubtless aware of the fundamental advances in retrovirology that have made possible the development of potent inhibitors of HIV replication. In this review, we focus on the issues surrounding how these drugs and drug regimens are actually used in clinical settings. Their proper use requires detailed knowledge of the natural history of HIV infection, the pharmacology of the individual drugs, the complexities of drug-drug interactions, and the use of sophisticated molecular tests for monitoring of viral load, immunologic response, and drug resistance.
Currently, over 25 antiretroviral drugs and several fixed-dose drug combinations are available in most developed countries. Individual agents target many of the critical steps in the HIV replication cycle—entry, reverse transcription, integration, and proteolytic processing. Newer regimens offer greater convenience and less toxicity than those used previously, and emerging data suggest that antiretroviral therapy should be initiated earlier during the natural history of HIV infection than was previously recommended (54). Randomized comparative testing has demonstrated superior clinical, immunologic, and virologic outcomes with certain drug combinations, although the use of certain otherwise effective antiretroviral regimens may sometimes be limited by co-morbid illnesses and toxicities. This review focuses on the translation of insights gleaned from both bench and clinical research into the day-to-day care of HIV-infected patients. We discuss the practical issues of how to choose an antiretroviral regimen, when to start therapy, how to monitor the clinical response, and how to adjust therapy for treatment failure or drug-associated toxicities.
A list of approved antiretroviral drugs is shown in Table 1. Nucleoside and nucleotide reverse transcriptase inhibitors (NRTIs) were the first antiretrovirals to enter clinical use, but some of the early agents in this class (e.g., zalcitabine and stavudine) have largely been replaced in clinical practice by newer drugs with improved toxicity and tolerability profiles. All drugs in this class are analogs of native nucleotides, and almost all share a common structural motif: the lack of a 3′-OH group on the ribose ring, which prevents addition of the next nucleotide to the elongating proviral-DNA strand and thereby terminates proviral-DNA synthesis. The structural exception is tenofovir, which causes chain termination because it lacks an intact ribose moiety. Drugs in the NRTI class must be phosphorylated by intracellular kinases into their active triphosphate forms before they can effectively inhibit reverse transcriptase. Intracellular triphosphate forms have longer elimination half-lives ([t1/2] 3 to 50 h) than the parent drugs (1 to 10 h); excretion of NRTIs occurs predominantly via the kidney.
All NRTIs can also inhibit normal cellular DNA polymerases, although far less efficiently than they inhibit RT; the most clinically important target is the mitochondrial DNA (mtDNA) polymerase γ (pol-γ). This NRTI-associated inhibition of mitochondrial function may account for certain drug-specific adverse effects, e.g., elevated serum lactate and resulting lactic acidosis, as well as disorders of the liver, muscles, adipose tissue, and peripheral nerves. The dideoxynucleoside RT inhibitors exhibit the tightest binding to and the most inefficient exonucleolytic removal from DNA pol-γ, leading to the greatest degree of mtDNA synthesis inhibition via chain termination (4, 21, 31). Lamivudine, emtricitabine, abacavir, and tenofovir are the currently available NRTIs least likely to be associated with adverse drug effects resulting from mitochondrial dysfunction and the most likely NRTIs to be used in first-line regimens.
Tenofovir is now included in many preferred first-line antiretroviral regimens and can also be used in treatment-experienced patients whose virus lacks the K65R mutation. Most tenofovir is excreted unchanged in the urine via glomerular filtration, although the drug is also actively secreted across the renal tubule. There has been some concern about the cumulative nephrotoxic potential of tenofovir, given its structural similarity to the nephrotoxic nucleotides adefovir and cidofovir. Decreases in glomerular filtration rates (GFR), a clinical measure of renal function, in patients receiving tenofovir are modest for the first 6 months after starting therapy (−14 ml/min per 1.73 m2) and appear to stabilize after 18 months (a GFR decline of −19 ml/min per 1.73 m2); these findings are not associated with an increased rate of tenofovir discontinuation (17). Tenofovir, in combination with boosted protease inhibitors (PIs), is associated with greater declines in renal function than tenofovir in combination with nonnucleoside reverse transcriptase inhibitors (NNRTIs), although it is not always clear that observed decreases in GFR on tenofovir are clinically significant (15). Given the relatively short periods of follow-up in previous studies, long-term monitoring of renal function in patients receiving tenofovir is warranted.
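The renal monitoring discussed above rests on estimates of creatinine clearance or GFR. As a minimal sketch, the standard Cockcroft-Gault equation can be coded as follows; note that the `gfr_flag` helper and its 25% relative-decline threshold are illustrative conventions for this example, not guideline cutoffs:

```python
def cockcroft_gault_crcl(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Estimate creatinine clearance (ml/min) with the Cockcroft-Gault equation."""
    crcl = ((140 - age_years) * weight_kg) / (72.0 * serum_creatinine_mg_dl)
    # The standard 0.85 correction factor applies for female patients.
    return crcl * 0.85 if female else crcl

def gfr_flag(baseline_ml_min, current_ml_min, threshold=0.25):
    """Flag a relative decline in estimated renal function exceeding `threshold`.

    The 25% default is an arbitrary illustration of trend monitoring, not a
    clinical decision rule.
    """
    return (baseline_ml_min - current_ml_min) / baseline_ml_min > threshold
```

In practice, serial estimates taken before and during tenofovir therapy would be compared with such a helper to decide whether further renal evaluation is needed.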
The barriers to NRTI resistance vary among the available drugs. Some point mutations in RT occur commonly and may lead to high-level resistance (e.g., M184V and lamivudine or emtricitabine), whereas others inactivate an individual drug but are uncommon (e.g., K65R and tenofovir). NRTI resistance in the era when single or sequential drugs were used generally occurred by the accumulation of thymidine analog mutations (TAMs) in RT. Increasing numbers of TAMs correlated with higher-level and broader cross-resistance to multiple NRTIs.
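The drug-mutation pairings named above lend themselves to a simple lookup. The sketch below encodes only the two examples from the text (M184V and K65R); a real resistance-interpretation system, such as those behind clinical genotype reports, uses far larger rule sets and scoring:

```python
# Map from RT resistance mutations to the NRTIs they chiefly compromise.
# Entries are limited to the examples discussed in the text.
NRTI_RESISTANCE = {
    "M184V": {"lamivudine", "emtricitabine"},  # common; confers high-level resistance
    "K65R": {"tenofovir"},                     # uncommon, but inactivates the drug
}

def drugs_compromised(mutations):
    """Return the set of NRTIs compromised by the observed RT mutations."""
    hit = set()
    for m in mutations:
        hit |= NRTI_RESISTANCE.get(m, set())
    return hit
```

For instance, a genotype reporting both M184V and K65R would rule out lamivudine, emtricitabine, and tenofovir as fully active components of a new regimen.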
NNRTIs bind reverse transcriptase in a pocket far from the active site. Available NNRTIs have long half-lives (25 to 55 h), do not require phosphorylation, and are HIV-1 specific; they have no activity against HIV-2. NNRTIs are hepatically metabolized and are substrates for the cytochrome P450 (CYP) enzymes. CYPs make up an enzyme superfamily that metabolizes many therapeutic drugs of different classes. The potential for clinically relevant drug-drug interactions is, therefore, higher among NNRTIs than NRTIs.
The barrier to HIV-1 resistance is relatively low for available NNRTIs. Single-point mutations in RT can inactivate all members of this class, with the exception of etravirine. Given this low resistance barrier, NNRTIs are often used early in therapy when the probability of HIV resistance to these agents is lowest and the combined protective effect of three fully active drugs is optimal.
HIV relies on its aspartyl protease to cleave Gag and Gag-Pol polyproteins into their essential structural and enzymatic (RT and integrase) components. Many human monomeric aspartyl proteases (e.g., renin and pepsin) exist, but it is the homodimer structure of HIV-1 and HIV-2 protease that selectively binds to, and is inhibited by, protease inhibitors. Ritonavir is a PI that is poorly tolerated and unacceptably toxic at the doses required for optimal HIV inhibition; however, ritonavir is also a potent inhibitor of CYP3A4 metabolism. This property has been exploited to increase, or “boost,” plasma drug levels of other PIs by coadministering subtherapeutic and less-toxic doses of ritonavir. CYP3A4 inhibition can lead, however, to drug-drug interactions with other medication classes. PIs can be effective components of initial, second-line, and salvage antiretroviral regimens, although elevated blood cholesterol and triglyceride levels, also known as dyslipidemias, can be a problem with some PIs and may develop within weeks to months of starting PI-based therapy (56, 61, 66, 70). Although certain PI-based regimens increase the risk of insulin resistance and diabetes (3, 23, 46, 48), this is unlikely to be a class-wide PI effect and, in certain cases, may have more to do with a regimen's NRTI backbone (6, 9, 12, 28, 45, 47, 69). There is often a higher genetic barrier to resistance to protease inhibitors than to either NNRTIs or integrase inhibitors, and multiple mutations are typically required for protease inhibitors to lose substantial antiviral activity, although exceptions exist (e.g., saquinavir and nelfinavir). Boosted protease inhibitors select resistance mutations based on the major PI used, not the low-dose ritonavir. The genotypic-mutation patterns associated with PIs can be particularly complex and challenging to interpret; phenotypic-resistance testing can often help resolve these clinical ambiguities (19).
Limited data exist on PI resistance mutations selected during boosted PI therapy in treatment-naïve patients, primarily because when regimens fail in these patients, resistance develops to the NRTI backbone and not to the boosted PI (22).
HIV integrase catalyzes both viral-cDNA processing and integration into the cellular genome by strand transfer. Integrase strand transfer inhibitors (INSTI) block the strand transfer reaction, thereby inhibiting HIV-1 and HIV-2 replication (59). Raltegravir inhibits the catalytic activity of HIV integrase and is now approved for use in treatment-naïve and -experienced patients. Raltegravir is primarily metabolized by glucuronidation. There are, at present, insufficient data reported to determine if promoter polymorphisms in glucuronidation enzymes have any clinically relevant effect on the safety or activity profile of this drug. HIV resistance to raltegravir is conferred by amino acid substitutions that occur in proximity to the integrase catalytic residues (8, 59). The genetic barrier to resistance to integrase strand transfer inhibitors is considered to be low; single mutations (Q148H and N155H) confer roughly 10-fold decreases in sensitivity to raltegravir (35). Raltegravir may be used as a component in both first-line and salvage antiretroviral regimens.
Inhibitors of HIV entry have targeted the conformational rearrangement of gp41 (enfuvirtide) and the gp120-CCR5 interaction (maraviroc). The need for twice-daily injections with enfuvirtide, along with the local adverse effects that accompany those injections, has limited its clinical use. Resistance to enfuvirtide is conferred by amino acid substitutions in the heptad repeat 1 (HR1) region of gp41; HR2 mutations can also impact susceptibility to enfuvirtide. Maraviroc, the first approved CCR5 antagonist, has seen limited clinical use to date, in part because it is active only against CCR5-using viruses and thus requires an expensive test of blood coreceptor usage prior to use. In clinical trials, escape from maraviroc via mutation and selection has been uncommon compared with escape via selection of minority CXCR4-using viral populations that circulate below the detection limit of coreceptor usage assays. Maraviroc confers no virologic benefit in subjects with a CXCR4-using virus or with viruses that either use both receptors or exist as mixtures of CCR5- and CXCR4-using viruses (dual/mixed virus) (39). The selection of CXCR4-using viruses during treatment raises clinical concern, because in the natural history of HIV infection, the appearance of a CXCR4-using virus is often associated with a faster rate of CD4+ T-cell decline, more rapid disease progression, and an increased rate of development of AIDS and death (24, 33, 62, 65). In clinical trials to date, however, discontinuation of maraviroc has generally resulted in a loss of detectable CXCR4-using viruses and a reappearance of CCR5-using viral populations.
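The eligibility rule for maraviroc described above reduces to a single check on the coreceptor tropism result. The sketch below makes that explicit; the category labels ("R5", "X4", "dual/mixed") are illustrative, as commercial tropism assays report results in their own terms:

```python
def maraviroc_eligible(tropism_result):
    """Maraviroc confers virologic benefit only against purely CCR5-using virus.

    `tropism_result` is one of "R5", "X4", or "dual/mixed" (illustrative
    labels). X4-using and dual/mixed populations rule the drug out.
    """
    return tropism_result == "R5"
```

This is why an up-to-date coreceptor-usage assay is required before the drug is prescribed: only an "R5" result supports its use.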
Although over 25 drugs are now available for HIV therapy, existing problems with drug tolerability, toxicity, resistance, and cost necessitate continued research toward development of new antiretroviral drugs. Several new drugs from existing classes are in advanced stages of clinical trials, though drugs that act by novel antiviral mechanisms appear to be lagging far behind in their clinical development.
Recommendations regarding when to initiate antiretroviral therapy for HIV infection have evolved over the years. The pendulum, based on accumulating data, is now swinging toward earlier treatment for infected individuals. For all patients, regardless of the duration of infection or prior treatment experience, the goal of therapy is the reduction of plasma viral load to below detectable levels, currently 50 copies/ml. Most “highly active” antiretroviral therapy (ART) regimens can now achieve this goal. Guidelines from expert panels are periodically updated, and comprehensive online versions of these recommendations are available (14, 18, 54). The decision to begin antiretroviral therapy for any patient must balance the burden and toxicity of the drug regimen against the benefits of decreased HIV-related morbidities and increased life expectancy (Table 2). Although clinicians are most comfortable considering the deleterious consequences of HIV infection in discrete quanta of CD4 counts (e.g., <50, <200, >350, or >500 cells/μl), there is a continuum, without clear demarcation, in the risk of progression to clinical disease and death across the range of declining CD4 counts from ≥650 cells/μl to <50 cells/μl (11, 32, 40, 57).
As the risks associated with ART have decreased because of more-potent and -tolerable drug combinations, the risk-benefit ratio of initiating ART has shifted toward beginning therapy at higher CD4 cell counts (72). Increasing evidence suggests a reduction in death, opportunistic infections, and serious non-AIDS events and an increase in rates of maximal virologic suppression and CD4 cell counts in patients who initiate therapy with CD4 counts between 200 and 350 cells/μl (5, 10, 16, 20, 29, 30, 38, 43, 49, 53, 57, 67). Data from a large observational cohort recently demonstrated an increased risk of death in patients with CD4 counts of 351 to 500 cells/μl or >500 cells/μl who did not initiate antiretroviral therapy, relative to patients in those CD4 strata who did (26). The most recent U.S. Department of Health and Human Services (DHHS) guidelines suggest that treatment be initiated in all HIV-infected patients regardless of CD4 count unless specific clinical or patient circumstances warrant deferral (54). Although treatment guidelines play a useful role in helping physicians decide when to start therapy, patient willingness and readiness to start life-long therapy are critical, and the role of meticulous adherence in the success of ART is undeniable (1, 2, 55). The deferral of therapy until adherence can be maximized is preferable to suboptimal or incomplete therapy.
Several patient and virus factors need to be considered when choosing an initial regimen (Table 3). These include existing comorbidities (e.g., cardiovascular, renal, or psychiatric disease), potential adverse drug effects and interactions with other medications the patient may be receiving, pregnancy or pregnancy potential, convenience, and patient adherence. Determining the antiretroviral susceptibility of a patient's HIV isolate is also an important step in constructing an effective combination antiretroviral regimen. Based on the results of genotypic-resistance testing, a regimen should be constructed that maximizes the probability of virologic suppression while minimizing adverse effects, toxicities, and pill burden. Because of cost and the longer time required, phenotypic-resistance testing is generally reserved for more-complex cases where multiple PI resistance mutations are present.
Most preferred regimens for treatment initiation consist of a dual-NRTI backbone in combination with an NNRTI, a ritonavir-boosted PI, or an INSTI. The choice of whether to use an NNRTI, a boosted PI, or an INSTI as part of initial therapy needs to be individualized, based on issues including co-morbid conditions, likely adherence, dosing requirements, and pregnancy potential. A combination of tenofovir and emtricitabine has become the most commonly used dual-NRTI backbone in the developed world because of superior virologic outcomes, reduced drug resistance, and less toxicity than other NRTI regimens (13, 60, 64). Combination therapy without a dual-nucleoside backbone is not generally recommended for initial antiretroviral therapy, though several “nucleoside-sparing” regimens are under study.
NNRTI-, PI-, or INSTI-based therapies, in combination with a dual-NRTI backbone, provide effective suppression of HIV-1 replication and reconstitution of CD4 cell counts. A randomized prospective trial of 1,400 subjects demonstrated similar composite endpoints of death, AIDS-defining event, or CD4 count decline to <200 cells/μl in subjects receiving either an NNRTI- or PI-based regimen (34). A meta-analysis of clinical trials comparing NNRTI- and PI-based therapies suggested that NNRTI-based therapy was more effective than PI-based therapy for virologic suppression but was similar to PI-based therapy for clinical outcomes (7). The low genetic barrier to resistance to NNRTIs, where single-nucleotide substitutions may confer broad class-wide resistance (except to etravirine), provides an additional rationale for the first use of a boosted PI when adherence may be a problem. However, NNRTI-based regimens may have lower pill burdens, greater convenience (particularly the fixed-dose regimen of tenofovir/emtricitabine/efavirenz), and possibly more favorable lipid profiles than PI-based regimens. Although there is less experience with INSTI-based regimens, studies to date show virologic outcomes with raltegravir to be similar to those with efavirenz when either is combined with tenofovir and emtricitabine over 96 weeks of observation (37).
Several clinical trials have evaluated which PIs to use initially, and it is generally agreed that ritonavir-boosted PIs are preferred over unboosted PIs for first-line ART regimens unless patients are intolerant to ritonavir-associated side effects. Atazanavir/r, darunavir/r, fosamprenavir/r, and saquinavir/r have all demonstrated virologic outcomes similar to or better than those achieved with lopinavir/r (12, 42, 50, 74), and current guidelines favor boosted atazanavir or boosted darunavir as components of initial PI regimens because of efficacy, good tolerability, once-daily dosing, and low pill count (54). For an initial NNRTI-based regimen, efavirenz is preferred over nevirapine in most treatment-naïve patients because of less toxicity and possibly greater efficacy; these two drugs should not be used in combination (54, 71). There are situations, however, where nevirapine is preferred, particularly in pregnant women and women of child-bearing age who might become pregnant, where efavirenz is contraindicated because of teratogenicity.
Temporary discontinuations of antiretroviral treatment have been studied in patients with HIV infection as an experimental strategy to minimize drug toxicities and cost, decrease treatment fatigue, improve quality of life, stimulate HIV-specific immune responses, and minimize the emergence of drug-resistant viruses. To date, however, interruptions of therapy have been unsuccessful. The body of available evidence suggests a lack of benefit, and a large controlled study demonstrated potential harm with this approach (11, 51).
Blood CD4 cell counts and plasma viral loads are most widely used to monitor the success of antiretroviral therapy. Drug resistance testing, HLA-B*5701 typing, and viral tropism assays are also important laboratory tests that assist the clinician in designing the most effective and patient-specific antiretroviral regimen. When CCR5 antagonist therapy is being considered, coreceptor tropism testing is essential. Routine monitoring of liver and kidney function, along with serum lipids, fasting glucose, and hematologic parameters, should be undertaken when appropriate.
CD4 T-cell counts should be determined when an HIV diagnosis is made and should be monitored regularly thereafter to guide both antiretroviral therapy and prophylaxis against opportunistic infections. Once combination antiretroviral therapy is started, the CD4 count may reasonably be expected to increase by 50 to 150 cells/mm3 in the first year and by 50 to 100 cells/mm3 in the second year (25, 27). In patients initiating therapy when CD4 counts fall below 200 cells/mm3, only a minority reconstitute their CD4 counts to >500 cells/mm3 after 4 years (25). CD4 counts can often reach levels considered normal in patients who initiate therapy with CD4 counts of >350 cells/mm3 (41).
Prospective screening for the HLA-B*5701 allele reduces substantially, but does not eliminate, the risk of a hypersensitivity reaction to the NRTI abacavir (36, 63). All patients being considered for an abacavir-containing regimen should first undergo HLA-B*5701 testing, and those who test positive for this allele should not receive abacavir.
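The screening rule above is a simple gate, and coding it makes the asymmetry explicit: a positive HLA-B*5701 test contraindicates abacavir outright, whereas a negative test only reduces, and does not eliminate, hypersensitivity risk. The function name is our own shorthand:

```python
def abacavir_permitted(hla_b5701_positive):
    """Gate abacavir prescribing on prospective HLA-B*5701 screening.

    A positive allele test contraindicates abacavir. A negative test
    permits it but does not eliminate hypersensitivity risk, so clinical
    vigilance is still required.
    """
    return not hla_b5701_positive
```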
Viral-load monitoring is necessary to assess the response to antiretroviral therapy and the durability of virologic suppression. The goal of all antiretroviral therapy, whether in treatment-naïve or -experienced patients living in developed or developing countries, is suppression of plasma HIV RNA to undetectable levels (currently <50 copies/ml). Viral load should be measured before starting therapy, 2 to 8 weeks later, and then at 4- to 8-week intervals until HIV RNA is no longer detectable. At least a 1 log10 reduction in viral load should be expected at 4 weeks, with a decline in plasma HIV RNA to <50 copies/ml by 16 to 24 weeks of therapy. Viral-load reduction usually precedes significant CD4 count improvement. Viral blips, or transient viral-load increases to between 50 and 1,000 copies/ml in a patient with previously suppressed plasma HIV RNA, are occasionally seen but do not appear to be associated with eventual virologic failure and do not necessitate a change in therapy (44, 68). In some cases, blips may be the result of temporary lapses in patient adherence to antiretroviral regimens (58). True virologic failure is defined as a persistently detectable viral load in a patient with previously suppressed plasma HIV RNA or the inability to achieve an undetectable viral load after 24 to 48 weeks of therapy. Although multiple causes of virologic failure are possible, a detectable viral load in either scenario should prompt HIV drug resistance testing.
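The milestones above (a ≥1 log10 drop by week 4, suppression to <50 copies/ml by 16 to 24 weeks, and the 50 to 1,000 copies/ml blip range) can be sketched as simple arithmetic checks. The function names and return labels are illustrative, and the blip helper treats its upper bound as exclusive, which is one reasonable reading of "between 50 to 1,000":

```python
import math

def log10_reduction(baseline_copies_ml, current_copies_ml):
    """log10 drop in plasma HIV RNA from the pretreatment baseline."""
    return math.log10(baseline_copies_ml / current_copies_ml)

def classify_response(baseline, week4, week24, detection_limit=50):
    """Apply the text's milestones: >=1 log10 drop by week 4 and
    suppression below the detection limit by 16 to 24 weeks."""
    if log10_reduction(baseline, week4) < 1.0:
        return "poor early response"
    if week24 >= detection_limit:
        return "virologic failure"
    return "suppressed"

def is_blip(copies_ml, detection_limit=50, blip_ceiling=1000):
    """Transient detectable value in the 50-1,000 copies/ml blip range."""
    return detection_limit <= copies_ml < blip_ceiling
```

For example, a patient starting at 100,000 copies/ml who falls to 5,000 copies/ml at week 4 has achieved a 1.3 log10 reduction and is on track; a later isolated value of 400 copies/ml would count as a blip, not failure.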
Clinical assessment, together with measurement of HIV RNA levels and CD4 cell counts, should be used to assess the need to change therapy. If the patient experiences drug toxicity or intolerance or is unable to adhere to therapy, a change in regimens may be required. In these situations, it may be appropriate simply to replace the offending drug with another that is better tolerated and exhibits similar potency.
Clinical situations that should prompt consideration for changing an entire regimen include a poor early virologic response to therapy, failure to suppress plasma HIV-1 RNA to undetectable levels by 4 to 6 months after initiation, repeated detection of virus in plasma after initial suppression, a persistent and significant decline in the CD4 T-cell count, or clinical deterioration.
The selection of a new regimen in patients with virologic, immunologic, and clinical failure should involve consideration of the history of previous antiretroviral-drug exposure, current drug resistance patterns, other medications with the potential for drug-drug interactions, and individual co-morbid conditions. At least two, and preferably three, fully active drugs should be included in the new regimen, ideally using agents from at least one new class. With the multiple drugs available, the goal of the new regimen should always be to durably suppress plasma HIV RNA levels to below limits detectable by the most sensitive available assay.
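The rule of thumb above — at least two, preferably three, fully active drugs, ideally from at least one previously unused class — can be sketched as a validity check on a candidate regimen. The drug names, class map, and function name below are illustrative, and real regimen selection also weighs interactions, toxicities, and co-morbidities that this sketch ignores:

```python
def viable_salvage_regimen(candidate, active_drugs, drug_class, prior_classes):
    """Check the text's rule of thumb for a new regimen after failure.

    `candidate`     -- list of drug names in the proposed regimen
    `active_drugs`  -- set of drugs still fully active per resistance testing
    `drug_class`    -- map of drug name to drug class (e.g., "PI", "INSTI")
    `prior_classes` -- set of classes the patient has already been exposed to
    """
    active = [d for d in candidate if d in active_drugs]
    uses_new_class = any(drug_class[d] not in prior_classes for d in active)
    # Require at least two fully active drugs, with at least one from a new class.
    return len(active) >= 2 and uses_new_class
```

For a patient with NRTI and NNRTI experience, for instance, a candidate built around a still-active boosted PI plus an INSTI would pass this check, whereas recycling two compromised NRTIs would not.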
A better understanding of the HIV replicative cycle, mechanisms of viral drug resistance, viral pathogenetic mechanisms, and host responses to the virus has driven the development of successful combination antiretroviral strategies over the past 25 years. These efforts have been enormously successful, extending the healthy lives of millions of infected individuals. Many challenges remain for patients on antiretroviral therapy, however, including adherence to complex regimens, emergence of drug-resistant virus variants, and the development of complications of therapy, including drug toxicity. Close monitoring of patients on therapy, utilizing available laboratory tests such as CD4 cell counts and viral-load measurements to evaluate success or failure, remains essential. Progress is also being made in the development of new drugs and in the roll-out of antiretroviral therapy in the developing world, although major challenges remain because of the costs and infrastructure needs required for sustainable programs. As the roll-out and optimization of antiretroviral-treatment programs continue in the developing world, laboratory monitoring (e.g., monitoring of CD4 counts and plasma viral load and drug resistance testing) will be increasingly important to minimize the morbidity associated with suboptimal treatment regimens.
Published ahead of print on 24 February 2010.