The associations of acute HIV infection (AHI) and other predictors with transmitted drug resistance (TDR) prevalence were assessed in a cohort of HIV-infected, antiretroviral-naïve patients. AHI was defined as being seronegative with detectable HIV RNA. Binomial regression was used to estimate prevalence ratios and 95% confidence intervals (CIs) for associations with TDR. Among 43 AHI patients, TDR prevalence was 20.9%, compared with 8.6% among 677 chronically infected patients. AHI was associated with 1.9 times the prevalence of TDR (95% CI: 1.0, 3.6) in multivariable analysis. AHI patients may represent a vanguard group that portends increasing TDR in the future.
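As a rough illustration of the prevalence comparison above, a crude prevalence ratio with a 95% Wald confidence interval can be computed directly from the reported counts. This is a sketch, not the authors' analysis: the counts 9/43 and 58/677 are back-calculated from the reported percentages, and the crude ratio differs from the adjusted multivariable estimate of 1.9.

```python
import math

def prevalence_ratio(a, n1, b, n2):
    """Crude prevalence ratio (a/n1 vs b/n2) with a 95% Wald CI
    computed on the log scale."""
    pr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(pr) - 1.96 * se)
    hi = math.exp(math.log(pr) + 1.96 * se)
    return pr, lo, hi

# Counts implied by the abstract (hypothetical rounding):
# 9/43 TDR among AHI patients vs 58/677 among chronically infected
pr, lo, hi = prevalence_ratio(9, 43, 58, 677)
```

The crude ratio is unadjusted; the abstract's 1.9 reflects covariate adjustment in the binomial regression model.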
Transmitted drug resistance; HIV-1; Acute HIV Infection; Antiretroviral Resistance
The p16INK4a tumor suppressor gene is a mediator of cellular senescence and has been suggested to be a biomarker of ‘molecular’ age in several tissues including T-cells. To determine the association of both active and suppressed HIV infection with T-cell aging, T-cell p16INK4a expression was compared between 60 HIV+ suppressed subjects, 23 HIV+ untreated subjects, and 18 contemporaneously collected HIV-negative controls, as well as 148 HIV-negative historical samples. Expression did not correlate with chronologic age in untreated HIV+ patients, consistent with an effect of active HIV replication on p16INK4a expression. In patients on cART with suppressed viral loads, however, p16INK4a levels were similar to uninfected controls and correlated with chronologic age, with a trend toward an inverse correlation with CD4 count. These data show that p16INK4a is a reliable biomarker of T cell aging in HIV+ patients with suppressed viral loads and suggest that poor CD4 cell recovery on cART may be associated with increased T-cell expression of p16INK4a, a marker of cellular senescence.
Modeling clinical endpoints as a function of change in antiretroviral therapy (ART) attempts to answer one simple but very challenging question: was the change in ART beneficial or not? We conceive a similar scientific question of interest in the current manuscript, except that we are interested in modeling the time of ART regimen change rather than a comparison of two or more ART regimens. The answer to this scientific riddle is unknown and has been difficult to address clinically. Naturally, ART regimen change is left to a participant and his or her provider, and so the date of change depends on participant characteristics. There exists a vast literature on how to address potential confounding, and those techniques are vital to the success of the method here. A more substantial challenge is devising a systematic modeling strategy to overcome the missing time of regimen change for those participants who do not switch to second-line ART within the study period even after failing the initial ART. In this paper, we adopt and apply a statistical method that was originally proposed for modeling infusion trial data, where infusion length may be informatively censored, and argue that the same strategy may be employed here. Our application of this method to therapeutic HIV/AIDS studies is new and interesting. Using data from the AIDS Clinical Trials Group (ACTG) Study A5095, we model immunological endpoints as a polynomial function of a participant’s switching time to second-line ART for 182 participants who already failed the initial ART. In our analysis, we find that participants who switch early have somewhat better sustained suppression of HIV-1 RNA after virological failure than those who switch later. However, we also find that participants who switch very late, possibly censored due to the end of the study, have good HIV-1 RNA suppression, on average.
We believe our scientific conclusions contribute to the relevant HIV literature and hope that the basic modeling strategy outlined here would be useful to others contemplating similar analyses with partially missing treatment length data.
Causal inference; Informative Censoring; Observational data; Propensity score
To examine the association between early HIV viremia and mortality after HIV-associated lymphoma.
Multicenter observational cohort study.
Center for AIDS Research Network of Integrated Clinical Systems cohort.
HIV-infected patients with lymphoma diagnosed between 1996 and 2011, who were alive 6 months after lymphoma diagnosis and with ≥2 HIV RNA values during the 6 months after lymphoma diagnosis.
Cumulative HIV viremia during the 6 months after lymphoma diagnosis, expressed as viremia copy-6-months.
All-cause mortality between 6 months and 5 years after lymphoma diagnosis.
Of 224 included patients, 183 (82%) had non-Hodgkin lymphoma (NHL) and 41 (18%) had Hodgkin lymphoma (HL). At lymphoma diagnosis, 105 (47%) patients were on antiretroviral therapy (ART), median CD4 count was 148 cells/µL (IQR 54–322), and 33% had suppressed HIV RNA (<400 copies/mL). In adjusted analyses, mortality was associated with older age [adjusted hazard ratio (AHR) 1.37 per decade increase, 95% CI 1.03–1.83], lymphoma occurrence on ART (AHR 1.63, 95% CI 1.02–2.63), lower CD4 count (AHR 0.75 per 100 cells/µL increase, 95% CI 0.64–0.89), and higher early cumulative viremia (AHR 1.35 per log10 copies × 6-months/mL, 95% CI 1.11–1.65). The detrimental effect of early cumulative viremia was consistent across patient groups defined by ART status, CD4 count, and histology.
Each additional log10 copies × 6-months/mL of HIV RNA exposure during the 6 months after lymphoma diagnosis was associated with a 35% increase in subsequent mortality. These results suggest that early and effective ART during chemotherapy may improve survival.
AIDS; Burkitt lymphoma; diffuse large B-cell lymphoma; HIV; Hodgkin lymphoma; lymphoma; non-Hodgkin lymphoma
Despite prevention efforts, new HIV diagnoses continue in the Southern US, where the epidemic is characterized by significant racial/ethnic disparities. We integrated phylogenetic analyses with clinical data to reveal trends in local HIV transmission.
Cross-sectional analysis of 1671 HIV-infected individuals each with one B-subtype pol sequence obtained during chronic (82%; UNC Center for AIDS Research Clinical Cohort) or acute/recent (18%; Duke/UNC Acute HIV Consortium) infection.
Phylogenies were inferred using neighbor joining to select related sequences, then confirmed with Bayesian methods. We characterized transmission clusters (clades of ≥3 sequences supported by posterior probabilities of 1) by factors including race/ethnicity and transmission risk. Factors associated with cluster membership were evaluated for newly diagnosed patients.
Overall, 72% were male, 59% black and 39% MSM. A total of 557 (33%) sequences grouped in either 108 pairs (n=216) or 67 clusters (n=341). Clusters ranged in size from 3 to 36 members (median 4). Composition was delineated primarily by race, with 28% exclusively black, and to a lesser extent by risk group. Both MSM and heterosexuals formed discrete clusters, though substantial mixing was observed. In multivariable analysis, patients with age ≤30 years (P=0.009), acute infection (P=0.02), local residence (P=0.002), and transmitted drug resistance (P=0.02) were more likely to be cluster members, while Latinos were less likely (P<0.001).
Integration of molecular, clinical and demographic data offers a unique view into the structure of local transmission networks. Clustering by black race, youth, and TDR, together with the inability to identify Latino clusters, will inform prevention, testing and linkage-to-care strategies.
Molecular epidemiology; HIV transmission; Risk Factors; Acute Infection; Southeast US
Alcohol Drinking; HIV Seropositivity; Men who Have Sex with Men; Prospective Studies; Sexual Behavior
Acute HIV-1 infection causes a rapid total body depletion of CD4+ T cells in most individuals and HIV-1-specific CD8+ T cell expansion in response to viral replication. A numerically high CD8 T cell response may indicate limited T cell repertoire against HIV and rapid progression. We present a detailed evaluation of an acutely infected individual with a strong HIV-1-specific CD8 T cell response targeting multiple epitopes demonstrating that the upper limit of CD8 expansion in this setting may be much higher than previously reported and was likely driven by the narrow HIV-specific response.
HIV increases risk of non-Hodgkin lymphoma (NHL) and Hodgkin lymphoma (HL). The effect of HIV on presentation, treatment, and outcomes of NHL and HL in routine care in the combination antiretroviral therapy (cART) era merits further characterization. We performed a retrospective analysis of HIV-infected patients with NHL and HL receiving care at the University of North Carolina at Chapel Hill from January 1, 2000 until December 31, 2010. Statistical analyses were conducted using SAS, version 9.2 (SAS Institute Inc). Sixty-five HIV-infected patients with NHL and HL were identified. Patients with non-CNS NHL and HL presented with advanced disease (85% stage III or IV) and adverse prognostic features. Patients completed 87% of planned chemotherapy cycles, and 68% of patients completed stage-appropriate therapy. Dose reduction, interruption, and/or delay occurred during more than 25% of administered cycles in 64% of patients. Infectious complications, febrile neutropenia, and myelosuppression accounted for 78% of deviations from planned cumulative dose and dose intensity. Primary CNS lymphoma (PCNSL) was associated with poor prognosis, but 2-year overall survival was 66% for all non-CNS lymphoma. Among patients surviving at least 2 years, 75% had CD4 count >200 cells/μl and 79% had HIV viral load <400 copies/ml at last follow-up. Despite advanced disease and difficulty tolerating chemotherapy with optimal cumulative dose and dose intensity, most patients with non-CNS HIV-associated lymphoma survived more than 2 years after diagnosis, the majority with suppressed HIV RNA.
Racial differences in antiretroviral treatment responses remain incompletely explained and may be a consequence of differential pharmacokinetics (PK) associated with race. Raltegravir, an inhibitor of HIV-1 integrase, is commonly used in the treatment of HIV-infected patients, many of whom are African-American. However, there are few data regarding the PK of raltegravir in African-Americans. HIV-infected men and women, self-described as African-American and naive to antiretroviral therapy, were treated with raltegravir (RAL) at 400 mg twice a day, plus a fixed dose of tenofovir-emtricitabine (TDF/FTC) at 300 mg/200 mg once daily. Intensive PK sampling was conducted over 24 h at week 4. Drug concentrations at the two trough times of 12 and 24 h after dosing (C12 and C24), area under the concentration-time curve (AUC), maximum drug concentration (Cmax), and the time at which this concentration occurred (Tmax) in plasma were estimated with noncompartmental pharmacokinetic methods and compared to data from a subset of white subjects randomized to the RAL twice-a-day (plus TDF/FTC) arm of the QDMRK study, a phase III study of the safety and efficacy of once-daily versus twice-daily RAL in treatment-naive patients. A total of 38 African-American participants were enrolled (90% male) into the REAL cohort with the following median baseline characteristics: age of 36 years, body mass index (BMI) of 23 kg/m2, and a CD4 cell count of 339 cells/µl. Plasma HIV RNA levels were below 200 copies/ml in 95% of participants at week 4. The characteristics of the 16 white QDMRK study participants were similar, although fewer (69%) were male, the median age was higher (45 years), and BMI was lower (19 kg/m2). There was considerable interindividual variability in RAL concentrations in both cohorts. Median C12 in REAL was 91 ng/ml (range, 10 to 1,386) and in QDMRK participants was 128 ng/ml (range, 15 to 1,074).
The Cmax median concentration was 1,042 ng/ml (range, 196 to 10,092) for REAL and 1,360 ng/ml (range, 218 to 9,701) for QDMRK. There were no significant differences in any RAL PK parameter between these cohorts of African-American and white individuals. Based on plasma PK, and with similar adherence rates, the performance of RAL among HIV-infected African-Americans should be no different than that of infected patients who are white.
MicroRNAs (miRNAs) are stable, small non-coding RNAs that modulate many downstream target genes. Recently, circulating miRNAs have been detected in various body fluids and within exosomes, prompting their evaluation as candidate biomarkers of diseases, especially cancer. Kaposi's sarcoma (KS) is the most common AIDS-associated cancer and remains prevalent despite Highly Active Anti-Retroviral Therapy (HAART). KS is caused by KS-associated herpesvirus (KSHV), a gamma herpesvirus also associated with Primary Effusion Lymphoma (PEL). We sought to determine the host and viral circulating miRNAs in plasma, pleural fluid or serum from patients with the KSHV-associated malignancies KS and PEL and from two mouse models of KS. Both KSHV-encoded miRNAs and host miRNAs, including members of the miR-17–92 cluster, were detectable within patient exosomes and circulating miRNA profiles from KSHV mouse models. Further characterization revealed a subset of miRNAs that seemed to be preferentially incorporated into exosomes. Gene ontology analysis of signature exosomal miRNA targets revealed several signaling pathways that are known to be important in KSHV pathogenesis. Functional analysis of endothelial cells exposed to patient-derived exosomes demonstrated enhanced cell migration and IL-6 secretion. This suggests that exosomes derived from KSHV-associated malignancies are functional and contain a distinct subset of miRNAs. These could represent candidate biomarkers of disease and may contribute to the paracrine phenotypes that are a characteristic of KS.
Circulating microRNAs (miRNAs), such as those found in exosomes, have emerged as diagnostic tools and hold promise as minimally invasive, stable biomarkers. Transfer of tumor-derived exosomal miRNAs to surrounding cells may be an important form of cellular communication. Kaposi's sarcoma-associated herpesvirus (KSHV) is the etiological agent of Kaposi's sarcoma (KS), the most common AIDS-defining cancer worldwide. Here, we survey systemically circulating miRNAs and reveal potential biomarkers for KS and Primary Effusion Lymphoma (PEL). This expands previous tissue culture studies by profiling clinical samples and by using two new mouse models of KSHV tumorigenesis. Profiling of circulating miRNAs revealed that oncogenic and viral miRNAs were present in exosomes from KS patient plasma, pleural effusions and mouse models of KS. Analysis of human oncogenic miRNAs, including the well-known miR-17-92 cluster, revealed that several miRNAs were preferentially incorporated into exosomes in our KS mouse model. Gene ontology analysis of upregulated miRNAs showed that the majority of pathways affected were known targets of KSHV signaling pathways. Transfer of these oncogenic exosomes to immortalized hTERT-HUVEC cells enhanced cell migration and IL-6 secretion. These circulating miRNAs and KS-derived exosomes may therefore be part of the paracrine signaling mechanism that mediates KSHV pathogenesis.
Previous studies have revealed that HIV infected individuals possess circulating CD4+CD8+ (DP) T-cells specific for HIV antigens. In the present study, we analyzed the proliferation and functional profile of circulating DP T-cells from 30 acutely HIV infected individuals and 10 chronically HIV infected viral controllers. The acutely infected group had DP T-cells which showed more proliferative capability and multifunctionality than both their CD4+ and CD8+ T-cells. DP T-cells were found to exhibit greater proliferation and higher multifunctionality compared to CD4 T-cells in the viral controller group. The DP T-cell response represented 16% of the total anti-HIV proliferative response and greater than 70% of the anti-HIV multifunctional response in the acutely infected subjects. Proliferating DP T-cells of the acutely infected subjects responded to all HIV antigen pools with equal magnitude. Conversely, the multifunctional response was focused on the pool representing Nef, Rev, Tat, VPR and VPU. Meanwhile, the controllers’ DP T-cells focused on Gag and the Nef, Rev, Tat, VPR and VPU pool for both their proliferative and multifunctional responses. Finally, we show that the presence of proliferating DP T-cells following all HIV antigen stimulations is well correlated with proliferating CD4 T-cells while multifunctionality appears to be largely independent of multifunctionality in other T-cell compartments. Therefore, DP T-cells represent a highly reactive cell population during acute HIV infection, which responds independently from the traditional T-cell compartments.
In HIV-infected populations in developed countries, the most recent published cancer incidence trend analyses are only updated through 2008. We assessed changes in the distribution of cancer types and incidence trends among HIV-infected patients in North Carolina up until 2011.
We linked the University of North Carolina Center for AIDS Research HIV Clinical Cohort, an observational clinical cohort of 3141 HIV-infected patients, with the North Carolina Cancer registry. Cancer incidence rates were estimated across calendar years from 2000 to 2011. The distribution of cancer types was described. Incidence trends were assessed with linear regression.
Across 15,022 person-years of follow-up, 202 cancers were identified (incidence rate per 100,000 person-years [IR]: 1345; 95% confidence interval [CI]: 1166, 1544). The majority of cancers were virus-related (61%), including Kaposi sarcoma (N = 32) (IR: 213; 95%CI: 146, 301), non-Hodgkin lymphoma (N = 34) (IR: 226; 95%CI: 157, 316), and anal cancer (N = 16) (IR: 107; 95%CI: 61, 173). Non-Hodgkin lymphoma incidence decreased from 2000 to 2011 (decline of 15 cases per 100,000 person-years per calendar year, 95%CI: -27, -3). No other changes in incidence or incidence trends were observed for other cancers (all P > 0.20).
We observed a substantial burden of a variety of cancers in this population in the last decade. Kaposi sarcoma and non-Hodgkin lymphoma were consistently two of the greatest contributors to cancer burden across calendar time. Cancer rates appeared stable across calendar years, except for non-Hodgkin lymphoma, which appeared to decrease throughout the study period.
Kaposi sarcoma; AIDS; HIV; AIDS-associated malignancies; Cancer
The current goal of initial antiretroviral (ARV) therapy is suppression of plasma human immunodeficiency virus (HIV)-1 RNA levels to below 200 copies per milliliter. A proportion of HIV-infected patients who initiate antiretroviral therapy in clinical practice or antiretroviral clinical trials either fail to suppress HIV-1 RNA or have HIV-1 RNA levels rebound on therapy. Frequently, these patients have sustained CD4 cell count responses and limited or no clinical symptoms and, therefore, have potentially limited indications for altering therapy, which they may be tolerating well despite increased viral replication. On the other hand, increased viral replication on therapy leads to selection of resistance mutations to the antiretroviral agents comprising their therapy, and potentially cross-resistance to other agents in the same class, decreasing the likelihood of response to subsequent antiretroviral therapy. The optimal time to switch antiretroviral therapy to ensure sustained virologic suppression and prevent clinical events in patients who have rebound in their HIV-1 RNA, yet are stable, is not known. Randomized clinical trials to compare early versus delayed switching have been difficult to design and more difficult to enroll. In some clinical trials, such as the AIDS Clinical Trials Group (ACTG) Study A5095, patients randomized to initial antiretroviral treatment combinations who fail to suppress HIV-1 RNA, or who have a rebound of HIV-1 RNA on therapy, are allowed to switch from the initial ARV regimen to a new regimen, based on clinician and patient decisions. We delineate a statistical framework to estimate the effect of early versus late regimen change using data from ACTG A5095 in the context of two-stage designs.
In causal inference, a large class of doubly robust estimators is derived through semiparametric theory with applications to missing data problems. This class of estimators is motivated through geometric arguments and relies on large samples for good performance. By now, several authors have noted that a doubly robust estimator may be suboptimal when the outcome model is misspecified even if it is semiparametric efficient when the outcome regression model is correctly specified. Through auxiliary variables, two-stage designs, and within the contextual backdrop of our scientific problem and clinical study, we propose improved doubly robust, locally efficient estimators of a population mean and average causal effect for early versus delayed switching to second-line ARV treatment regimens. Our analysis of the ACTG A5095 data further demonstrates how methods that use auxiliary variables can improve over methods that ignore them. Using the methods developed here, we conclude that patients who switch within 8 weeks of virologic failure have better clinical outcomes, on average, than patients who delay switching to a new second-line ARV regimen after failing on the initial regimen. Ordinary statistical methods fail to find such differences. This article has online supplementary material.
Causal inference; Double robustness; Longitudinal data analysis; Missing data; Rubin causal model; Semiparametric efficient estimation
To examine long-term effects of antiretroviral therapy (ART) on kidney function, we evaluated the incidence and risk factors for chronic kidney disease (CKD) among ART-naive, HIV-infected adults and compared changes in estimated glomerular filtration rates (eGFR) before and after starting ART.
Multicenter observational cohort study of patients with at least one serum creatinine measurement before and after initiating ART. Cox proportional hazards models and marginal structural models examined CKD risk factors; mixed-effects linear models examined eGFR slopes.
Three thousand three hundred and twenty-nine patients met entry criteria, contributing 10,099 person-years of observation on ART. ART was associated with a significantly slower rate of eGFR decline (from −2.18 to −1.37 ml/min per 1.73 m2 per year; P = 0.02). The incidence of CKD defined by eGFR thresholds of 60, 45 and 30 ml/min per 1.73 m2 was 10.5, 3.4 and 1.6 per 1000 person-years, respectively. In adjusted analyses, black race, hepatitis C coinfection, lower time-varying CD4 cell count and higher time-varying viral load on ART were associated with higher CKD risk, and the magnitude of these risks increased with more severe CKD. Tenofovir combined with a ritonavir-boosted protease inhibitor (rPI) was also associated with higher CKD risk [hazard ratio for an eGFR threshold <60 ml/min per 1.73 m2: 3.35 (95% confidence interval (CI) = 1.40–8.02)], which developed in 5.7% of patients after 4 years of exposure to this regimen type.
ART was associated with reduced CKD risk, in association with CD4 cell restoration and plasma viral load suppression, despite an increased CKD risk associated with initial regimens that included tenofovir and an rPI.
antiretroviral therapy; chronic kidney disease; tenofovir
Many adults in the United States enter primary care late in the course of HIV infection, countering the clinical benefits of timely HIV services and missing opportunities for risk reduction. Our objective was to determine if perceived social support was associated with delay entering care after an HIV diagnosis. Two hundred sixteen patients receiving primary care at a large, university-based HIV outpatient clinic in North Carolina were included in the study. Dimensions of functional social support (emotional/informational, tangible, affectionate and positive social interaction) were quantified with a modified Medical Outcomes Study Social Support Scale and included in proportional hazard models to determine their effect on delays seeking care. The median delay between diagnosis and entry to primary care was 5.9 months. Levels of social support were high but only positive social interaction was moderately associated with delayed presentation in adjusted models. The effect of low perceived positive social interaction on the time to initiation of primary care differed by history of alcoholism (no history of alcoholism, hazard ratio (HR): 1.43, 95% confidence interval (CI): 0.88, 2.34; history of alcoholism, HR: 0.71, 95% CI: 0.40, 1.28). Ensuring timely access to HIV care remains a challenge in the southeastern United States. Affectionate, tangible, and emotional/informational social support were not associated with the time from diagnosis to care. The presence of positive social interaction may be an important factor influencing care seeking behavior after diagnosis.
HIV infection; social support; time factors; delivery of health care; southeastern United States
Prompted by 3 cases of resistance noted in the Pre-Exposure Prophylaxis Initiative and TDF2 trials, we examined literature on mutations elicited by antiretrovirals used for pre-exposure prophylaxis. We discuss signature mutations, how rapidly these emerge, and individual-level and public health consequences of antiretroviral resistance.
Pre-exposure prophylaxis (PrEP), the use of antiretrovirals (ARVs) by human immunodeficiency virus (HIV)–uninfected individuals to prevent acquisition of the virus during high-risk sexual encounters, enjoyed its first 2 major successes with the Centre for the AIDS Programme of Research in South Africa (CAPRISA) 004 and the Pre-Exposure Prophylaxis Initiative (iPrEx). These successes were buoyed by additional positive results from the TDF2 and Partners PrEP trials. Although no seroconverters in either arm of CAPRISA developed resistance to tenofovir, 2 participants in iPrEx with undetected, seronegative acute HIV infection were randomized to receive daily oral tenofovir-emtricitabine and resistance to emtricitabine was later discovered in both men. A similar case in the TDF2 study resulted in resistance to both ARVs. These cases prompted us to examine existing literature on the nature of resistance mutations elicited by ARVs used for PrEP. Here, we discuss the impact of signature mutations selected by PrEP, how rapidly these emerge with daily ARV exposure, and the individual-level and public health consequences of ARV resistance.
Background. Dolutegravir (DTG; S/GSK1349572), a human immunodeficiency virus type 1 (HIV-1) integrase inhibitor, has limited cross-resistance to raltegravir (RAL) and elvitegravir in vitro. This phase IIb study assessed the activity of DTG in HIV-1–infected subjects with genotypic evidence of RAL resistance.
Methods. Subjects received DTG 50 mg once daily (cohort I) or 50 mg twice daily (cohort II) while continuing a failing regimen (without RAL) through day 10, after which the background regimen was optimized, when feasible, for cohort I, and at least 1 fully active drug was mandated for cohort II. The primary efficacy end point was the proportion of subjects on day 11 in whom the plasma HIV-1 RNA load decreased by ≥0.7 log10 copies/mL from baseline or was <400 copies/mL.
Results. A rapid antiviral response was observed. More subjects achieved the primary end point in cohort II (23 of 24 [96%]) than in cohort I (21 of 27 [78%]) at day 11. At week 24, 41% and 75% of subjects had an HIV-1 RNA load of <50 copies/mL in cohorts I and II, respectively. Further integrase genotypic evolution was uncommon. Dolutegravir had a good safety profile that was similar with each dosing regimen.
Conclusion. Dolutegravir 50 mg twice daily with an optimized background provided greater and more durable benefit than the once-daily regimen. These data are the first clinical demonstration of the activity of any integrase inhibitor in subjects with HIV-1 resistant to RAL.
Dolutegravir; DTG; S/GSK1349572; integrase inhibitor; raltegravir resistance
To explore darunavir/ritonavir (DRV/r) plus raltegravir (RAL) combination therapy in antiretroviral-naive patients.
Phase IIb, single-arm, open-label, multicenter study.
One hundred and twelve antiretroviral-naive, HIV-1-infected patients received DRV/r 800/100 mg once daily and RAL 400 mg twice daily. Primary endpoint was virologic failure by week 24. Virologic failure was defined as confirmed viral load of 1000 copies/ml or more at week 12, or an increase of more than 0.5 log10 copies/ml in viral load from week 4 to 12, or a confirmed viral load of more than 50 copies/ml at or after week 24. Protease and integrase genes were sequenced in patients experiencing virologic failure.
Virologic failure rate was 16% [95% confidence interval (CI) 10–24] by week 24 and 26% (95% CI 19–36) by week 48 in an intent-to-treat analysis. Viral load at virologic failure was 51–200 copies/ml in 17/28 failures. Adjusting for age and sex, virologic failure was associated with baseline viral load of more than 100 000 copies/ml [hazard ratio 3.76 (95% CI 1.52–9.31), P = 0.004] and lower CD4 cell count [0.77 per 100 cells/μl increase (95% CI 0.61–0.98), P = 0.037]. When trough RAL concentrations were included as a time-varying covariate in the analysis, virologic failure remained associated with baseline viral load more than 100 000 copies/ml [hazard ratio = 4.67 (95% CI 1.93–11.25), P < 0.001], whereas RAL level below detection limit in plasma at one or more previous visits was associated with increased hazard [hazard ratio = 3.42 (95% CI 1.41–8.26), P = 0.006]. All five participants with integrase mutations during virologic failure had baseline viral load more than 100 000 copies/ml.
DRV/r plus RAL was effective and well tolerated in most patients, but virologic failure and integrase resistance were common, particularly in patients with baseline viral load more than 100 000 copies/ml.
antiretroviral therapy; darunavir; nucleoside sparing; raltegravir
QDMRK was a phase III clinical trial of raltegravir given once daily (QD) (800-mg dose) versus twice daily (BID) (400 mg per dose), each in combination with once-daily coformulated tenofovir-emtricitabine, in treatment-naive HIV-infected patients. Pharmacokinetic (PK) and pharmacokinetic/pharmacodynamic (PK/PD) analyses were conducted using a 2-step approach: individual non-model-based PK parameters from observed sparse concentration data were determined, followed by statistical analysis of potential relationships between PK and efficacy response parameters after 48 weeks of treatment. Sparse PK sampling was performed for all patients (QD, n = 380; BID, n = 384); selected sites performed an intensive PK evaluation at week 4 (QD, n = 22; BID, n = 20). In the intensive PK subgroup, daily exposures (area under the concentration-time curve from 0 to 24 h [AUC0–24]) were similar between the two regimens, but patients on 800 mg QD experienced ∼4-fold-higher maximum drug concentration in plasma (Cmax) values and ∼6-fold-lower trough drug concentration (Ctrough) values than those on 400 mg BID. Geometric mean (GM) Ctrough values were similarly lower in the sparse PK analysis. With BID dosing, there was no indication of any significant PK/PD association over the range of tested PK parameters. With QD dosing, Ctrough values correlated with the likelihood of virologic response. Failure to achieve an HIV RNA level of <50 copies/ml appeared predominantly at high baseline HIV RNA levels in both treatment arms and was associated with lower values of GM Ctrough in the 800-mg-QD arm, though other possible drivers of efficacy, such as time above a threshold concentration, could not be evaluated due to the sparse sampling scheme. Together, these findings emphasize the importance of the shape of the plasma concentration-versus-time curve for long-term efficacy.
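The noncompartmental parameters named above (Cmax, Tmax, and a trapezoidal AUC over the dosing interval) can be sketched from a concentration-time profile as follows. This is an illustrative implementation on a made-up profile, not the study's analysis code.

```python
def noncompartmental_params(times_h, conc_ng_ml):
    """Return (Cmax, Tmax, AUC0-t) from paired sampling times (h) and
    plasma concentrations (ng/mL), using the linear trapezoidal rule."""
    cmax = max(conc_ng_ml)
    tmax = times_h[conc_ng_ml.index(cmax)]
    auc = sum((times_h[i] - times_h[i - 1]) *
              (conc_ng_ml[i] + conc_ng_ml[i - 1]) / 2.0
              for i in range(1, len(times_h)))
    return cmax, tmax, auc

# Hypothetical sparse profile over a 12-h dosing interval (not study data)
cmax, tmax, auc = noncompartmental_params(
    [0.0, 1.0, 2.0, 4.0, 12.0],          # hours post-dose
    [0.0, 1000.0, 800.0, 400.0, 100.0])  # ng/mL
```

Here the trough (Ctrough) would simply be the concentration at the end of the dosing interval, i.e. the last sampled value.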
Persons with unrecognized HIV infection forgo timely clinical intervention and may unknowingly transmit HIV to partners. However, in the United States, unrecognized infection and late diagnosis are common. To understand barriers and facilitators to HIV testing and care, we conducted a qualitative study of 24 HIV-infected persons attending a Southeastern HIV clinic who presented with clinically advanced illness. The primary barrier to HIV testing prior to diagnosis was low perceived risk of infection; consequently, most participants were diagnosed only after the onset of clinical symptoms. While most patients were eager to initiate care rapidly after diagnosis, some felt frustrated by the passive process of connecting to specialty care. The first visit with an HIV care provider was identified as critical in the coping process for many patients. Implications for the implementation of recent CDC HIV routine screening guidelines are discussed.
HIV Infection; Voluntary Counseling and Testing; Delivery of Health Care; Southeastern United States; Social Support
Viremia copy-years predicted all-cause mortality independent of traditional, cross-sectional viral load measures and time-updated CD4+ T-lymphocyte count in antiretroviral therapy-treated patients, suggesting that cumulative human immunodeficiency virus replication causes harm independent of its effect on the degree of immunodeficiency.
Background. Cross-sectional plasma human immunodeficiency virus (HIV) viral load (VL) measures have proven invaluable for clinical and research purposes. However, cross-sectional VL measures fail to capture cumulative plasma HIV burden longitudinally. We evaluated the cumulative effect of exposure to HIV replication on mortality following initiation of combination antiretroviral therapy (ART).
Methods. We included treatment-naive HIV-infected patients starting ART from 2000 to 2008 at 8 Center for AIDS Research Network of Integrated Clinical Systems sites. Viremia copy-years, a time-varying measure of cumulative plasma HIV exposure, were determined for each patient using the area under the VL curve. Multivariable Cox models were used to evaluate the independent association of viremia copy-years with all-cause mortality.
Results. Among 2027 patients contributing 6579 person-years of follow-up, the median viremia copy-years was 5.3 log10 copy × y/mL (interquartile range: 4.9–6.3 log10 copy × y/mL), and 85 patients (4.2%) died. When evaluated separately, viremia copy-years (hazard ratio [HR] = 1.81 per log10 copy × y/mL; 95% confidence interval [CI], 1.51–2.18 per log10 copy × y/mL), 24-week VL (1.74 per log10 copies/mL; 95% CI, 1.48–2.04 per log10 copies/mL), and most recent VL (HR = 1.89 per log10 copies/mL; 95% CI: 1.63–2.20 per log10 copies/mL) were associated with increased mortality. When simultaneously evaluating VL measures and controlling for other covariates, viremia copy-years increased mortality risk (HR = 1.44 per log10 copy × y/mL; 95% CI, 1.07–1.94 per log10 copy × y/mL), whereas no cross-sectional VL measure was independently associated with mortality.
Conclusions. Viremia copy-years predicted all-cause mortality independent of traditional, cross-sectional VL measures and time-updated CD4+ T-lymphocyte count in ART-treated patients, suggesting cumulative HIV replication causes harm independent of its effect on the degree of immunodeficiency.
Among new patients entering HIV care from 1999 to 2009 in a North Carolina observational clinical cohort, Latinos initiated HIV care at lower CD4 cell counts and were more likely to have several specific AIDS-defining clinical conditions, compared with non-Latinos.
(See the Editorial Commentary by Rio on pages 488–489.)
Background. Late diagnosis of human immunodeficiency virus (HIV) infection remains common despite advances in therapy and prognosis. The southeastern United States is a rapidly growing Latino settlement area where ethnic disparities may contribute to late presentation to care.
Methods. We assessed demographic and clinical factors between racial/ethnic groups at the time of HIV care initiation in the University of North Carolina Center for AIDS Research Clinical Cohort. We identified independent predictors of late presentation, defined as a CD4+ T lymphocyte (CD4) count <350 cells/mm3 or an AIDS-defining event (ADE), using log-linear binomial regression.
Results. During the period 1999–2009, 853 patients initiated HIV care, of whom 11% were Latino, 28% were white, and 61% were black. Median initial CD4 counts were lower for Latino patients (186 cells/mm3) than white patients (292 cells/mm3; P = .006) and black patients (302 cells/mm3; P = .02). Latino persons were more likely to be late presenters than white or black persons (76% vs 58%; P < .001) and accounted for 86%, 75%, and 50% of all presenting cases of active tuberculosis, histoplasmosis, and toxoplasmosis, respectively. Latino ethnicity, older age, male sex, and earlier entry year were independently associated with late presentation (P < .05 for all). In multivariable analyses, Latino persons were 1.29 times more likely to present to care late than white or black persons (95% confidence interval, 1.15–1.45).
Conclusions. Latinos are more likely to initiate HIV care later in the course of illness than are black and white persons and account for a majority of several ADEs. Strategies to improve earlier HIV testing among Latinos in new settlement areas are needed.
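With a single binary exposure, the log-linear binomial regression used in the Methods reduces to the crude prevalence ratio with a Wald confidence interval, which can be sketched directly. The counts below are hypothetical, chosen only to mirror the structure of the analysis, not the actual cohort data.

```python
import math

# Sketch: prevalence ratio and Wald 95% CI for a binary exposure.
# Equivalent to a log-linear binomial regression with that exposure
# as the only covariate. Counts are hypothetical.

def prevalence_ratio(cases_exp, n_exp, cases_unexp, n_unexp):
    p1, p0 = cases_exp / n_exp, cases_unexp / n_unexp
    pr = p1 / p0
    se_log = math.sqrt((1 - p1) / (n_exp * p1) + (1 - p0) / (n_unexp * p0))
    lo = math.exp(math.log(pr) - 1.96 * se_log)
    hi = math.exp(math.log(pr) + 1.96 * se_log)
    return pr, (lo, hi)

# Hypothetical: 70/92 exposed vs 440/761 unexposed presenting late
pr, ci = prevalence_ratio(70, 92, 440, 761)
```

The regression framing matters once additional covariates (age, sex, entry year) enter the model, since the log link keeps the exponentiated coefficients interpretable as adjusted prevalence ratios rather than odds ratios.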
Population sequencing was performed for persons identified with persistent low-level viremia in 2 clinical trials. Persistent low-level viremia (defined as plasma HIV-1 RNA level >50 and <1000 copies/mL in at least 2 determinations over a 24-week period, after at least 24 weeks of antiretroviral therapy) was observed in 65 (5.6%) of 1158 patients at risk. New resistance mutations were detected during persistent low-level viremia in 37% of the 54 evaluable cases. The most common mutations were M184I/V (14 cases), K103N (9), and M230L (3). Detection of new mutations was associated with higher HIV-1 RNA levels during persistent low-level viremia.
HIV-1 protease inhibitors (PIs) have antimalarial activity in vitro and in murine models. The potential beneficial effect of HIV-1 PIs on malaria has not been studied in clinical settings. We used data from Adult AIDS Clinical Trials Group A5208 sites where malaria is endemic to compare the incidence of clinically diagnosed malaria among HIV-infected adult women randomized to either lopinavir/ritonavir (LPV/r)-based antiretroviral therapy (ART) or to nevirapine (NVP)-based ART. We calculated hazard ratios and 95% confidence intervals. We conducted a recurrent events analysis that included both first and second clinical malarial episodes and also conducted analyses to assess the sensitivity of results to outcome misclassification. Among the 445 women in this analysis, 137 (31%) received a clinical diagnosis of malaria at least once during follow-up. Of these 137, 72 (53%) were randomized to LPV/r-based ART. Assignment to the LPV/r treatment group (n = 226) was not consistent with a large decrease in the hazard of first clinical malarial episode (hazard ratio = 1.11 [0.79 to 1.56]). The results were similar in the recurrent events analysis. Sensitivity analyses indicated that the results were robust to reasonable levels of outcome misclassification. In this study, treatment with LPV/r compared with NVP had no apparent beneficial effect on the incidence of clinical malaria among HIV-infected adult women. Additional research concerning the effects of PI-based therapy on the incidence of malaria diagnosed by more specific criteria, and among groups at higher risk for severe disease, is warranted.
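As a simplified stand-in for the hazard ratio reported above, a crude incidence rate ratio with a Wald 95% CI can be computed from first-episode counts and person-time. The episode counts below echo the abstract, but the person-years per arm are assumed for illustration; the actual analysis used Cox proportional-hazards models, which additionally account for censoring times.

```python
import math

# Sketch: crude incidence rate ratio with 95% CI from event counts and
# person-time, as a simplified illustration of a hazard ratio. Episode
# counts follow the abstract; person-years are hypothetical.

def rate_ratio(events_a, pyears_a, events_b, pyears_b):
    rr = (events_a / pyears_a) / (events_b / pyears_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, (lo, hi)

# 72 first episodes in the LPV/r arm vs 65 in the NVP arm;
# per-arm person-years are assumed values for illustration.
rr, ci = rate_ratio(72, 280.0, 65, 285.0)
```

A CI spanning 1.0, as here, is what "not consistent with a large decrease in the hazard" describes: the data neither demonstrate benefit nor rule out modest effects in either direction.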