Transmission of HIV continues in the United States (US), despite prevention efforts aimed at education and treatment. Concurrently, drug resistance in HIV, particularly among newly infected patients, poses a threat to the continued success of treatment for HIV-positive individuals. In North Carolina, nearly one in five individuals with acute HIV infection (AHI) is infected with a drug-resistant strain, a phenomenon known as transmitted drug resistance (TDR). Few studies of AHI or TDR take into account both the spatial aspects of residence at time of infection and the genetic characteristics of the viruses, and questions remain about how viruses are transmitted across space and the rural-urban divide. Using AHI strains from North Carolina, we examined whether differences exist in the spatial patterns of AHI versus AHI with TDR, as well as whether the genetic characteristics of these HIV infections vary by rural-urban status and across Health Service Areas. The highest rates of TDR were detected in persons under age 30, African Americans, and men who have sex with men (MSM), similar to the populations in which the highest numbers of AHI without TDR are observed. Nearly a quarter of patients reside in rural areas, and there are no significant differences between rural and urban residence among individuals infected with drug-resistant or drug-susceptible viruses. We observe similar levels of genetic distance between HIV found in rural and urban areas, indicating that viruses are shared across the rural-urban divide. Genetic differences are observed, however, across Health Service Areas, suggesting that local areas are sites of genetic differentiation in viruses being transmitted to newly infected individuals. These results indicate that future efforts to prevent HIV transmission need to be spatially targeted, focusing on local-level transmission in at-risk populations, in addition to statewide anti-HIV efforts.
In HIV-1-infected patients receiving antiretroviral therapy (ART), the relationship between residual viremia and ex vivo recovery of infectious virus from latently-infected CD4 cells is uncertain.
We measured residual viremia (HIV-1 RNA copies/mL) by single-copy assay (SCA) and the latent reservoir by infectious virus recovery from resting memory CD4 cells (infectious units per million cells [IUPM]) in patients who initiated ART. We assessed immune activation by measuring CD38 expression on T cells.
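As a rough illustration of how IUPM values like those above are derived, the standard Poisson limiting-dilution calculation can be sketched in a few lines of Python. This single-dilution formula is a simplification (outgrowth assays of this kind typically use maximum-likelihood estimation across several cell dilutions), and the well counts below are invented for illustration.

```python
import math

def iupm_single_dilution(neg_wells: int, total_wells: int, cells_per_well: float) -> float:
    """Poisson limiting-dilution estimate of infectious units per million cells.

    Assumes infected cells are Poisson-distributed across replicate wells, so the
    fraction of negative wells is exp(-infectious units per well).
    """
    frac_neg = neg_wells / total_wells
    return -math.log(frac_neg) * 1e6 / cells_per_well

# Invented example: 8 of 12 replicate wells negative at 1 million resting CD4 cells per well
iupm = iupm_single_dilution(8, 12, 1e6)  # about 0.41 IUPM
```

Values are conventionally analyzed on the log10 scale, as in the paired analysis described below.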
Ten patients who initiated ART and maintained a plasma HIV-1 RNA level <200 copies/mL had residual viremia and IUPM measured every 24 weeks. Five of 10 patients had longitudinal IUPM measured at weeks 24–96; the remainder had IUPM measured 1–3 times over 24–72 weeks. Analyses of 29 paired measurements revealed a positive association between level of residual viremia and IUPM (0.56 higher log10 HIV-1 RNA copies/mL per 1 log10 higher IUPM, p=0.005). Residual viremia level was positively associated with CD38 density and percentage on CD8+ T-cells in concurrent samples and with pre-ART HIV-1 RNA levels.
In patients with HIV-1 RNA levels <200 copies/mL 24–96 weeks after initiating ART, the level of viremia is positively associated with infectious virus recovery from resting memory CD4 cells. Whether this association persists after longer-term suppressive ART needs to be determined. If additional studies show that residual viremia measured by SCA reflects the size of the latent reservoir in patients who have had virologic suppression for longer periods of time, this could facilitate testing of potentially curative strategies to reduce this important reservoir.
HIV-1; reservoir; residual viremia; single-copy assay
Drug-resistant HIV complicates management of HIV infection. Although an estimated 14% of all HIV-positive persons pass through a prison or jail in the United States each year, little is known about the overall prevalence of antiretroviral (ARV) resistance in incarcerated persons. All genotypic sequence data on HIV-positive prisoners in the North Carolina (NC) Department of Corrections (DOC) were obtained from LabCorp. Screening for major resistance mutations in protease (PI) and reverse transcriptase (NRTI and NNRTI) was done using Genosure and the Stanford HIV Database. For subjects with multiple genotype reports, each mutation was counted only once and considered present on all subsequent genotypes. Between October 2006 and February 2010, the NC DOC incarcerated 1,911 HIV+ individuals, of whom 19.2% (n=367) had at least one genotype performed. The overall prevalence of a major resistance mutation was 28.3% (95% CI 23.7, 33.0). Among prisoners ever exposed to an ARV during incarceration (n=329), prevalence of a major resistance mutation was 29.8% (95% CI 24.9, 34.7); resistance by class was 20.4% (95% CI 16.0, 24.7) for NRTIs, 19.8% (95% CI 15.5, 24.1) for NNRTIs, and 8.8% (95% CI 5.8, 11.9) for PIs. Single-class drug resistance was most prevalent at 14.2% (95% CI 10.2, 17.7), followed by dual-class at 12.5% (95% CI 8.9, 16.0) and triple-class at 3.3% (95% CI 1.4, 5.3). The three most prevalent mutations were K103N at 15.8% (95% CI 12.0, 20.2), M184V at 14.3% (95% CI 10.7, 18.5), and M41L at 4.9% (95% CI 2.8, 7.8). In the NC DOC, the prevalence of ARV resistance, including dual- and triple-class resistance, was moderate over the study period. Resistance to PIs was lower than to NNRTIs and NRTIs, likely reflecting higher usage of the latter two classes or their lower barrier to resistance.
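Binomial confidence intervals like those reported above can be reproduced with a normal-approximation (Wald) calculation. In the sketch below, the count of 104 resistant genotypes is back-calculated from the reported 28.3% of 367 and is an assumption consistent with rounding, not a figure taken from the study.

```python
import math

def wald_ci(events: int, n: int, z: float = 1.96) -> tuple:
    """Normal-approximation (Wald) 95% CI for a binomial proportion."""
    p = events / n
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Assumed count: 104/367 prisoners with a major resistance mutation (~28.3%)
lo, hi = wald_ci(104, 367)
print(f"95% CI: {lo:.1%}, {hi:.1%}")  # close to the reported (23.7, 33.0)
```

The slight discrepancy with the published upper bound could reflect a different exact count or a different interval method (e.g., Wilson or exact binomial).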
HIV-1-infected individuals with plasma RNA <50 copies/mL on antiretroviral therapy (ART) may have residual, low-level viremia detectable by PCR assays which can detect a single copy of viral RNA (single-copy assay, SCA). The clinical predictors of residual viremia in patients on long-term suppressive ART are incompletely understood.
We evaluated factors associated with residual viremia in patients on suppressive ART who underwent screening for a raltegravir intensification trial (ACTG A5244). The screened population was HIV-1-infected adults receiving ART for ≥12 months with pre-ART HIV-1 RNA >100,000 copies/mL and on-therapy RNA levels below detection limits of commercial assays for ≥6 months.
Of 103 patients eligible for analysis, the median age was 46 years and the median duration of viral suppression was 4.8 years. Sixty-two percent had detectable viremia (>0.2 copies/mL) by SCA (median 0.2 copies/mL; quartile [Q] 1, Q3 [<0.2, 1.8]). Younger patients had lower HIV-1 RNA levels than older individuals (r=0.27, p=0.005). Patients with virologic suppression on ART for 2 years or less had higher residual viremia than those with suppression for more than 2 years (median 2.3 vs. 0.2 copies/mL, p=0.016).
Among HIV-1-infected patients with pre-ART HIV-1 RNA >100,000 copies/mL, residual viremia was detectable in the majority (62%) despite many years of suppressive ART. Higher level viremia was associated with older age and less than 2 years of virologic suppression on ART. These findings should help in selection of candidates for clinical trials of interventions designed to eliminate residual viremia.
HIV-1; Single-copy assay; residual viremia
Valproic acid and intensified antiretroviral therapy may deplete resting CD4+ T-cell HIV infection. We tested the ability of valproic acid to deplete resting CD4+ T-cell infection in patients receiving standard antiretroviral therapy.
Resting CD4+ T-cell infection was measured in 11 stably aviremic volunteers twice prior to, and twice after Depakote ER 1000 mg was added to standard antiretroviral therapy. Resting CD4+ T-cell infection frequency was measured by outgrowth assay. Low-level viremia was quantitated by single copy plasma HIV RNA assay.
A decrease in resting CD4+ T-cell infection was observed in only four of the 11 patients. Levels of immune activation and HIV-specific T-cell response were low and stable. Valproic acid levels ranged from 26 to 96 μg/ml when measured near trough. The single-copy assay was performed in nine patients. In three patients with depletion of resting CD4+ T-cell infection following valproic acid, single-copy assay values ranged from <1 to 5 copies/ml. Continuous low-level viremia was observed in three patients with stable resting CD4+ T-cell infection (24–87, 8–87, and 1–7 copies/ml, respectively) in whom multiple samples were analyzed.
The prospective addition of valproic acid to stable antiretroviral therapy reduced the frequency of resting CD4+ T-cell infection in a minority of volunteers. In patients in whom resting CD4+ T-cell infection depletion was observed, viremia was rarely detectable by single copy assay.
antiretroviral therapy; HIV; latency; resting CD4+ T cells; valproic acid
HIV infections increased 48% among young, Black men who have sex with men (MSM) in the United States between 2006 and 2009. Incomplete understanding of this trend undermines prevention strategy development. We investigated a sexual network to characterize the risk environment in which young, Black MSM acquire HIV.
Persons reported to the state following diagnosis of HIV or syphilis were included, along with sexual partners. We used network mapping alongside descriptive and bivariate statistics to characterize network connections. Generalized linear models assessed predictors of having untraceable sex partners.
The network included 398 individuals and 419 sexual relationships. Three-quarters were Black (n=299); 92% were MSM. Median age at first network appearance was 26 years and decreased over time (P<0.001). HIV prevalence was at least 29% (n=117); serostatus was unknown for 47% of the network, either because individuals were untraceable (n=150) or refused HIV testing (n=39). One in 5 network members diagnosed with HIV had a subsequent incident sexually transmitted infection. In multivariable models, one-time encounters increased the risk of having an untraceable partner (risk ratio [RR] 4.51; 95% CI: 2.27, 8.97), while being acutely HIV infected at diagnosis reduced it (RR 0.27; 95% CI: 0.08, 0.89).
HIV prevalence in this sexual network of young, Black MSM rivals that of sub-Saharan Africa, reflecting dramatically increased risk of acquiring HIV from the moment one entered the network. Prevention efforts for this population must consider the effect of sexual networks on HIV risk, and find ways of leveraging network structure to reduce transmission.
HIV; African-American; men who have sex with men; sexual networks
The associations of acute HIV infection (AHI) and other predictors with transmitted drug resistance (TDR) prevalence were assessed in a cohort of HIV-infected, antiretroviral-naïve patients. AHI was defined as being seronegative with detectable HIV RNA. Binomial regression was used to estimate prevalence ratios and 95% confidence intervals (CIs) for associations with TDR. Among 43 AHI patients, TDR prevalence was 20.9%, while prevalence was 8.6% among 677 chronically-infected patients. AHI was associated with 1.9 times the prevalence of TDR (95% CI: 1.0, 3.6) in multivariable analysis. AHI patients may represent a vanguard group that portends increasing TDR in the future.
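A crude (unadjusted) prevalence ratio with a log-scale Wald confidence interval can be sketched as below. The counts 9/43 and 58/677 are back-calculated from the reported prevalences (20.9% and 8.6%) and are assumptions; the result is the crude PR (about 2.4), not the adjusted PR of 1.9 from the multivariable binomial regression.

```python
import math

def prevalence_ratio(x1: int, n1: int, x0: int, n0: int, z: float = 1.96) -> tuple:
    """Crude prevalence ratio with a log-scale Wald 95% CI."""
    p1, p0 = x1 / n1, x0 / n0
    pr = p1 / p0
    # Standard error of log(PR) for two independent binomial proportions
    se_log = math.sqrt((1 - p1) / x1 + (1 - p0) / x0)
    lo = math.exp(math.log(pr) - z * se_log)
    hi = math.exp(math.log(pr) + z * se_log)
    return pr, lo, hi

# Assumed counts: 9/43 AHI patients vs. 58/677 chronically infected patients with TDR
pr, lo, hi = prevalence_ratio(9, 43, 58, 677)  # crude PR ~2.44 (1.30, 4.59)
```

Adjustment for covariates in the regression model pulls the crude estimate toward the reported 1.9.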
Transmitted drug resistance; HIV-1; Acute HIV Infection; Antiretroviral Resistance
The p16INK4a tumor suppressor gene is a mediator of cellular senescence and has been suggested to be a biomarker of ‘molecular’ age in several tissues including T-cells. To determine the association of both active and suppressed HIV infection with T-cell aging, T-cell p16INK4a expression was compared between 60 HIV+ suppressed subjects, 23 HIV+ untreated subjects, and 18 contemporaneously collected HIV-negative controls, as well as 148 HIV-negative historical samples. Expression did not correlate with chronologic age in untreated HIV+ patients, consistent with an effect of active HIV replication on p16INK4a expression. In patients on cART with suppressed viral loads, however, p16INK4a levels were similar to uninfected controls and correlated with chronologic age, with a trend toward an inverse correlation with CD4 count. These data show that p16INK4a is a reliable biomarker of T cell aging in HIV+ patients with suppressed viral loads and suggest that poor CD4 cell recovery on cART may be associated with increased T-cell expression of p16INK4a, a marker of cellular senescence.
Modeling clinical endpoints as a function of change in antiretroviral therapy (ART) attempts to answer one simple but very challenging question: was the change in ART beneficial or not? We conceive of a similar scientific question in the current manuscript, except that we are interested in modeling the time of ART regimen change rather than comparing two or more ART regimens. The answer to this scientific riddle is unknown and has been difficult to address clinically. Naturally, ART regimen change is left to a participant and his or her provider, and so the date of change depends on participant characteristics. There exists a vast literature on how to address potential confounding, and those techniques are vital to the success of the method here. A more substantial challenge is devising a systematic modeling strategy to overcome the missing time of regimen change for those participants who do not switch to second-line ART within the study period even after failing the initial ART. In this paper, we adopt and apply a statistical method that was originally proposed for modeling infusion trial data, where infusion length may be informatively censored, and argue that the same strategy may be employed here. Our application of this method to therapeutic HIV/AIDS studies is new and interesting. Using data from the AIDS Clinical Trials Group (ACTG) Study A5095, we model immunological endpoints as a polynomial function of a participant's switching time to second-line ART for 182 participants who failed the initial ART. In our analysis, we find that participants who switch early have somewhat better sustained suppression of HIV-1 RNA after virologic failure than those who switch later. However, we also find that participants who switched very late, possibly because their switching times were censored at the end of the study, had good HIV-1 RNA suppression, on average.
We believe our scientific conclusions contribute to the relevant HIV literature and hope that the basic modeling strategy outlined here would be useful to others contemplating similar analyses with partially missing treatment length data.
Causal inference; Informative Censoring; Observational data; Propensity score
To examine the association between early HIV viremia and mortality after HIV-associated lymphoma.
Multicenter observational cohort study.
Center for AIDS Research Network of Integrated Clinical Systems cohort.
HIV-infected patients with lymphoma diagnosed between 1996 and 2011, who were alive 6 months after lymphoma diagnosis and with ≥2 HIV RNA values during the 6 months after lymphoma diagnosis.
Cumulative HIV viremia during the 6 months after lymphoma diagnosis, expressed as viremia copy-6-months.
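One common way to construct a cumulative viremia measure of this kind is trapezoidal area under the viral-load curve. The sketch below integrates log10 HIV RNA over time and rescales to 6-month units; whether viral load is averaged on the log or linear scale varies across studies, so treat this as an illustrative assumption rather than the exact definition used here.

```python
def viremia_copy_6_months(times_months: list, log10_vl: list) -> float:
    """Trapezoidal area under the log10 HIV RNA curve, scaled to 6-month units."""
    auc = 0.0
    for i in range(1, len(times_months)):
        dt = times_months[i] - times_months[i - 1]
        auc += (log10_vl[i] + log10_vl[i - 1]) / 2 * dt
    return auc / 6  # log10 copies x 6-months / mL

# Invented example: VL falls from 10^4 to 10^2 copies/mL over the first 3 months
cv = viremia_copy_6_months([0, 3, 6], [4.0, 2.0, 2.0])  # 2.5 copy-6-months
```

On this construction, each 1-unit difference in copy-6-months corresponds to a 10-fold difference in time-averaged viral load over the 6-month window.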
Main outcome measure
All-cause mortality between 6 months and 5 years after lymphoma diagnosis.
Of 224 included patients, 183 (82%) had non-Hodgkin lymphoma (NHL) and 41 (18%) had Hodgkin lymphoma (HL). At lymphoma diagnosis, 105 (47%) patients were on antiretroviral therapy (ART), median CD4 count was 148 cells/µL (IQR 54–322), and 33% had suppressed HIV RNA (<400 copies/mL). In adjusted analyses, mortality was associated with older age [adjusted hazard ratio (AHR) 1.37 per decade increase, 95% CI 1.03–1.83], lymphoma occurrence on ART (AHR 1.63, 95% CI 1.02–2.63), lower CD4 count (AHR 0.75 per 100 cells/µL increase, 95% CI 0.64–0.89), and higher early cumulative viremia (AHR 1.35 per log10 copies × 6-months/mL, 95% CI 1.11–1.65). The detrimental effect of early cumulative viremia was consistent across patient groups defined by ART status, CD4 count, and histology.
Each additional 1 log10 of cumulative HIV RNA exposure during the 6 months after lymphoma diagnosis was associated with a 35% increase in subsequent mortality. These results suggest that early and effective ART during chemotherapy may improve survival.
AIDS; Burkitt lymphoma; diffuse large B-cell lymphoma; HIV; Hodgkin lymphoma; lymphoma; non-Hodgkin lymphoma
Despite prevention efforts new HIV diagnoses continue in the Southern US, where the epidemic is characterized by significant racial/ethnic disparities. We integrated phylogenetic analyses with clinical data to reveal trends in local HIV transmission.
Cross-sectional analysis of 1671 HIV-infected individuals each with one B-subtype pol sequence obtained during chronic (82%; UNC Center for AIDS Research Clinical Cohort) or acute/recent (18%; Duke/UNC Acute HIV Consortium) infection.
Phylogenies were inferred using neighbor joining to select related sequences, then confirmed with Bayesian methods. We characterized transmission clusters (clades of n≥3 sequences supported by posterior probability = 1) by factors including race/ethnicity and transmission risk. Factors associated with cluster membership were evaluated for newly diagnosed patients.
Overall, 72% were male, 59% black, and 39% MSM. A total of 557 (33%) sequences grouped in either 108 pairs (n=216) or 67 clusters (n=341). Clusters ranged from 3 to 36 members (median 4). Composition was delineated primarily by race, with 28% of clusters exclusively black, and to a lesser extent by risk group. Both MSM and heterosexuals formed discrete clusters, though substantial mixing was observed. In multivariable analysis, patients with age ≤30 years (P=0.009), acute infection (P=0.02), local residence (P=0.002), and transmitted drug resistance (P=0.02) were more likely to be cluster members, while Latinos were less likely (P<0.001).
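Pairwise genetic distance, the raw ingredient behind neighbor-joining trees like those used here, can be illustrated with a simple p-distance function (the proportion of mismatched aligned sites, skipping gaps). The sequences below are invented toy examples, not study data.

```python
def p_distance(seq1: str, seq2: str) -> float:
    """Proportion of mismatched sites between two aligned sequences, ignoring gaps."""
    # Keep only positions where both sequences have a base (no '-' gap character)
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != '-' and b != '-']
    mismatches = sum(1 for a, b in pairs if a != b)
    return mismatches / len(pairs)

d = p_distance("ACGT", "ACGA")  # 1 mismatch over 4 sites -> 0.25
```

Real analyses correct p-distances with a substitution model (e.g., Kimura two-parameter) before tree building, but the raw proportion conveys the idea.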
Integration of molecular, clinical, and demographic data offers a unique view into the structure of local transmission networks. Clustering by black race, youth, and TDR, along with the inability to identify Latino clusters, will inform prevention, testing, and linkage-to-care strategies.
Molecular epidemiology; HIV transmission; Risk Factors; Acute Infection; Southeast US
Alcohol Drinking; HIV Seropositivity; Men who Have Sex with Men; Prospective Studies; Sexual Behavior
Acute HIV-1 infection causes a rapid total body depletion of CD4+ T cells in most individuals and HIV-1-specific CD8+ T cell expansion in response to viral replication. A numerically high CD8 T cell response may indicate limited T cell repertoire against HIV and rapid progression. We present a detailed evaluation of an acutely infected individual with a strong HIV-1-specific CD8 T cell response targeting multiple epitopes demonstrating that the upper limit of CD8 expansion in this setting may be much higher than previously reported and was likely driven by the narrow HIV-specific response.
HIV increases risk of non-Hodgkin lymphoma (NHL) and Hodgkin lymphoma (HL). The effect of HIV on presentation, treatment, and outcomes of NHL and HL in routine care in the combination antiretroviral therapy (cART) era merits further characterization. We performed a retrospective analysis of HIV-infected patients with NHL and HL receiving care at the University of North Carolina at Chapel Hill from January 1, 2000 until December 31, 2010. Statistical analyses were conducted using SAS, version 9.2 (SAS Institute Inc). Sixty-five HIV-infected patients with NHL or HL were identified. Patients with non-CNS NHL and HL presented with advanced disease (85% stage III or IV) and adverse prognostic features. Patients completed 87% of planned chemotherapy cycles, and 68% of patients completed stage-appropriate therapy. Dose reduction, interruption, and/or delay occurred during more than 25% of administered cycles in 64% of patients. Infectious complications, febrile neutropenia, and myelosuppression accounted for 78% of deviations from planned cumulative dose and dose intensity. Primary CNS lymphoma (PCNSL) was associated with poor prognosis, but 2-year overall survival was 66% for all non-CNS lymphoma. Among patients surviving at least 2 years, 75% had CD4 count >200 cells/μl and 79% had HIV viral load <400 copies/ml at last follow-up. Despite advanced disease and difficulty tolerating chemotherapy at optimal cumulative dose and dose intensity, most patients with non-CNS HIV-associated lymphoma survived more than 2 years after diagnosis, the majority with suppressed HIV RNA.
Racial differences in antiretroviral treatment responses remain incompletely explained and may be a consequence of differential pharmacokinetics (PK) associated with race. Raltegravir, an inhibitor of HIV-1 integrase, is commonly used in the treatment of HIV-infected patients, many of whom are African-American. However, there are few data regarding the PK of raltegravir in African-Americans. HIV-infected men and women, self-described as African-American and naive to antiretroviral therapy, were treated with raltegravir (RAL) at 400 mg twice a day, plus a fixed dose of tenofovir-emtricitabine (TDF/FTC) at 300 mg/200 mg once daily. Intensive PK sampling was conducted over 24 h at week 4. Trough concentrations at 12 and 24 h after dosing (C12 and C24), area under the concentration-time curve (AUC), maximum drug concentration (Cmax), and the time at which this concentration occurred (Tmax) in plasma were estimated with noncompartmental pharmacokinetic methods and compared to data from a subset of white subjects randomized to the RAL twice a day (plus TDF/FTC) arm of the QDMRK study, a phase III study of the safety and efficacy of once-daily versus twice-daily RAL in treatment-naive patients. A total of 38 African-American participants were enrolled (90% male) into the REAL cohort with the following median baseline characteristics: age of 36 years, body mass index (BMI) of 23 kg/m2, and a CD4 cell count of 339 cells/µl. Plasma HIV RNA levels were below 200 copies/ml in 95% of participants at week 4. The characteristics of the 16 white QDMRK study participants were similar, although fewer (69%) were male, the median age was higher (45 years), and BMI was lower (19 kg/m2). There was considerable interindividual variability in RAL concentrations in both cohorts. Median C12 in REAL was 91 ng/ml (range, 10 to 1,386) and in QDMRK participants was 128 ng/ml (range, 15 to 1,074).
The median Cmax was 1,042 ng/ml (range, 196 to 10,092) for REAL and 1,360 ng/ml (range, 218 to 9,701) for QDMRK. There were no significant differences in any RAL PK parameter between these cohorts of African-American and white individuals. Based on plasma PK, and assuming similar adherence rates, the performance of RAL among HIV-infected African-Americans should be no different from that among infected patients who are white.
MicroRNAs (miRNAs) are stable, small non-coding RNAs that modulate many downstream target genes. Recently, circulating miRNAs have been detected in various body fluids and within exosomes, prompting their evaluation as candidate biomarkers of diseases, especially cancer. Kaposi's sarcoma (KS) is the most common AIDS-associated cancer and remains prevalent despite Highly Active Anti-Retroviral Therapy (HAART). KS is caused by KS-associated herpesvirus (KSHV), a gamma herpesvirus also associated with Primary Effusion Lymphoma (PEL). We sought to determine the host and viral circulating miRNAs in plasma, pleural fluid or serum from patients with the KSHV-associated malignancies KS and PEL and from two mouse models of KS. Both KSHV-encoded miRNAs and host miRNAs, including members of the miR-17–92 cluster, were detectable within patient exosomes and circulating miRNA profiles from KSHV mouse models. Further characterization revealed a subset of miRNAs that seemed to be preferentially incorporated into exosomes. Gene ontology analysis of signature exosomal miRNA targets revealed several signaling pathways that are known to be important in KSHV pathogenesis. Functional analysis of endothelial cells exposed to patient-derived exosomes demonstrated enhanced cell migration and IL-6 secretion. This suggests that exosomes derived from KSHV-associated malignancies are functional and contain a distinct subset of miRNAs. These could represent candidate biomarkers of disease and may contribute to the paracrine phenotypes that are a characteristic of KS.
Circulating microRNAs (miRNAs), such as those found in exosomes, have emerged as diagnostic tools and hold promise as minimally invasive, stable biomarkers. Transfer of tumor-derived exosomal miRNAs to surrounding cells may be an important form of cellular communication. Kaposi's sarcoma-associated herpesvirus (KSHV) is the etiological agent of Kaposi's sarcoma (KS), the most common AIDS-defining cancer worldwide. Here, we survey systemically circulating miRNAs and reveal potential biomarkers for KS and Primary Effusion Lymphoma (PEL). This expands previous tissue culture studies by profiling clinical samples and by using two new mouse models of KSHV tumorigenesis. Profiling of circulating miRNAs revealed that oncogenic and viral miRNAs were present in exosomes from KS patient plasma, pleural effusions and mouse models of KS. Analysis of human oncogenic miRNAs, including the well-known miR-17-92 cluster, revealed that several miRNAs were preferentially incorporated into exosomes in our KS mouse model. Gene ontology analysis of upregulated miRNAs showed that the majority of pathways affected were known targets of KSHV signaling pathways. Transfer of these oncogenic exosomes to immortalized hTERT-HUVEC cells enhanced cell migration and IL-6 secretion. These circulating miRNAs and KS derived exosomes may therefore be part of the paracrine signaling mechanism that mediates KSHV pathogenesis.
Previous studies have revealed that HIV infected individuals possess circulating CD4+CD8+ (DP) T-cells specific for HIV antigens. In the present study, we analyzed the proliferation and functional profile of circulating DP T-cells from 30 acutely HIV infected individuals and 10 chronically HIV infected viral controllers. The acutely infected group had DP T-cells which showed more proliferative capability and multifunctionality than both their CD4+ and CD8+ T-cells. DP T-cells were found to exhibit greater proliferation and higher multifunctionality compared to CD4 T-cells in the viral controller group. The DP T-cell response represented 16% of the total anti-HIV proliferative response and greater than 70% of the anti-HIV multifunctional response in the acutely infected subjects. Proliferating DP T-cells of the acutely infected subjects responded to all HIV antigen pools with equal magnitude. Conversely, the multifunctional response was focused on the pool representing Nef, Rev, Tat, VPR and VPU. Meanwhile, the controllers’ DP T-cells focused on Gag and the Nef, Rev, Tat, VPR and VPU pool for both their proliferative and multifunctional responses. Finally, we show that the presence of proliferating DP T-cells following all HIV antigen stimulations is well correlated with proliferating CD4 T-cells while multifunctionality appears to be largely independent of multifunctionality in other T-cell compartments. Therefore, DP T-cells represent a highly reactive cell population during acute HIV infection, which responds independently from the traditional T-cell compartments.
In HIV-infected populations in developed countries, the most recent published cancer incidence trend analyses extend only through 2008. We assessed changes in the distribution of cancer types and in cancer incidence trends among HIV-infected patients in North Carolina through 2011.
We linked the University of North Carolina Center for AIDS Research HIV Clinical Cohort, an observational clinical cohort of 3141 HIV-infected patients, with the North Carolina Cancer registry. Cancer incidence rates were estimated across calendar years from 2000 to 2011. The distribution of cancer types was described. Incidence trends were assessed with linear regression.
Across 15,022 person-years of follow-up, 202 cancers were identified (incidence rate per 100,000 person-years [IR]: 1345; 95% CI: 1166, 1544). The majority of cancers were virus-related (61%), including Kaposi sarcoma (N = 32) (IR: 213; 95% CI: 146, 301), non-Hodgkin lymphoma (N = 34) (IR: 226; 95% CI: 157, 316), and anal cancer (N = 16) (IR: 107; 95% CI: 61, 173). Non-Hodgkin lymphoma incidence decreased from 2000 to 2011 (a decline of 15 cases per 100,000 person-years per calendar year, 95% CI: -27, -3). No other changes in incidence or in incidence trends were observed for the remaining cancers (all P > 0.20).
We observed a substantial burden of a variety of cancers in this population in the last decade. Kaposi sarcoma and non-Hodgkin lymphoma were consistently two of the greatest contributors to cancer burden across calendar time. Cancer rates appeared stable across calendar years, except for non-Hodgkin lymphoma, which appeared to decrease throughout the study period.
Kaposi sarcoma; AIDS; HIV; AIDS-associated malignancies; Cancer
The current goal of initial antiretroviral (ARV) therapy is suppression of plasma human immunodeficiency virus (HIV)-1 RNA levels to below 200 copies per milliliter. A proportion of HIV-infected patients who initiate antiretroviral therapy in clinical practice or antiretroviral clinical trials either fail to suppress HIV-1 RNA or have HIV-1 RNA levels rebound on therapy. Frequently, these patients have sustained CD4 cell count responses and limited or no clinical symptoms and, therefore, have potentially limited indications for altering a therapy that they may be tolerating well despite increased viral replication. On the other hand, increased viral replication on therapy leads to selection of resistance mutations to the antiretroviral agents comprising the regimen, and potentially cross-resistance to other agents in the same class, decreasing the likelihood of response to subsequent antiretroviral therapy. The optimal time to switch antiretroviral therapy to ensure sustained virologic suppression and prevent clinical events in patients who have rebound in their HIV-1 RNA, yet are clinically stable, is not known. Randomized clinical trials to compare early versus delayed switching have been difficult to design and more difficult to enroll. In some clinical trials, such as the AIDS Clinical Trials Group (ACTG) Study A5095, patients randomized to initial antiretroviral treatment combinations who fail to suppress HIV-1 RNA, or who have a rebound of HIV-1 RNA on therapy, are allowed to switch from the initial ARV regimen to a new regimen, based on clinician and patient decisions. We delineate a statistical framework to estimate the effect of early versus late regimen change using data from ACTG A5095 in the context of two-stage designs.
In causal inference, a large class of doubly robust estimators are derived through semiparametric theory with applications to missing data problems. This class of estimators is motivated through geometric arguments and relies on large samples for good performance. By now, several authors have noted that a doubly robust estimator may be suboptimal when the outcome model is misspecified even if it is semiparametric efficient when the outcome regression model is correctly specified. Through auxiliary variables, two-stage designs, and within the contextual backdrop of our scientific problem and clinical study, we propose improved doubly robust, locally efficient estimators of a population mean and average causal effect for early versus delayed switching to second-line ARV treatment regimens. Our analysis of the ACTG A5095 data further demonstrates how methods that use auxiliary variables can improve over methods that ignore them. Using the methods developed here, we conclude that patients who switch within 8 weeks of virologic failure have better clinical outcomes, on average, than patients who delay switching to a new second-line ARV regimen after failing on the initial regimen. Ordinary statistical methods fail to find such differences. This article has online supplementary material.
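The doubly robust construction can be illustrated with a minimal augmented inverse-probability-weighted (AIPW) estimator of the mean outcome under treatment. This sketch takes estimated propensity scores and outcome-model predictions as given inputs, and omits the auxiliary-variable and two-stage refinements the paper develops; the toy data are invented.

```python
def aipw_mean(y: list, a: list, ps: list, m1: list) -> float:
    """Augmented IPW (doubly robust) estimate of E[Y(1)], the mean outcome under treatment.

    y:  observed outcomes
    a:  treatment indicators (0/1)
    ps: estimated propensity scores P(A=1 | X)
    m1: outcome-model predictions E[Y | A=1, X]
    The estimate is consistent if either the propensity model or the outcome model is correct.
    """
    n = len(y)
    return sum(ai * yi / ei + (1 - ai / ei) * mi
               for yi, ai, ei, mi in zip(y, a, ps, m1)) / n

# Invented toy data: randomized 50/50 treatment, constant outcome predictions
est = aipw_mean([1, 0, 1, 0], [1, 1, 0, 0], [0.5] * 4, [0.5] * 4)  # 0.5
```

The augmentation term `(1 - a/ps) * m1` is what confers double robustness: it has mean zero when the propensity model is correct, and it corrects the IPW term when the outcome model is correct instead.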
Causal inference; Double robustness; Longitudinal data analysis; Missing data; Rubin causal model; Semiparametric efficient estimation
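The double robustness property discussed in the abstract above can be illustrated with the standard augmented inverse-probability-weighted (AIPW) estimator of a population mean under missingness at random. The sketch below is illustrative only and uses simulated data; the variable names, the simple parametric models, and the simulation setup are my assumptions, not the improved estimators proposed in the article.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical simulated data: X is a covariate, Y the outcome,
# R the observation indicator (1 = observed). True mean of Y is 2.0.
x = rng.normal(size=n)
y = 2.0 + x + rng.normal(size=n)
pi = 1.0 / (1.0 + np.exp(-(0.5 + x)))       # true missingness (propensity) model
r = rng.binomial(1, pi)

def aipw_mean(y, r, pi_hat, m_hat):
    """AIPW (doubly robust) estimator of E[Y]: consistent if EITHER the
    propensity model pi_hat OR the outcome model m_hat is correct."""
    y_obs = np.where(r == 1, y, 0.0)        # unobserved Y values are never used
    return np.mean(r * y_obs / pi_hat - (r - pi_hat) / pi_hat * m_hat)

m_hat = 2.0 + x                              # correctly specified outcome model
est = aipw_mean(y, r, pi, m_hat)             # both models correct

pi_wrong = np.full(n, 0.7)                   # deliberately misspecified propensity
est_dr = aipw_mean(y, r, pi_wrong, m_hat)    # still consistent: outcome model correct
```

With both models correct the estimator is semiparametric efficient; with a wrong propensity model but a correct outcome model (as in `est_dr`), it remains consistent, which is the "double robustness" the abstract refers to.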
To examine long-term effects of antiretroviral therapy (ART) on kidney function, we evaluated the incidence and risk factors for chronic kidney disease (CKD) among ART-naive, HIV-infected adults and compared changes in estimated glomerular filtration rates (eGFR) before and after starting ART.
Multicenter observational cohort study of patients with at least one serum creatinine measurement before and after initiating ART. Cox proportional hazards models and marginal structural models examined CKD risk factors; mixed-effects linear models examined eGFR slopes.
Three thousand three hundred and twenty-nine patients met entry criteria, contributing 10 099 person-years of observation on ART. ART was associated with a significantly slower rate of eGFR decline (from −2.18 to −1.37 ml/min per 1.73 m2 per year; P = 0.02). The incidence of CKD defined by eGFR thresholds of 60, 45 and 30 ml/min per 1.73 m2 was 10.5, 3.4 and 1.6 per 1000 person-years, respectively. In adjusted analyses, black race, hepatitis C coinfection, lower time-varying CD4 cell count and higher time-varying viral load on ART were associated with higher CKD risk, and the magnitude of these risks increased with more severe CKD. The combination of tenofovir and a ritonavir-boosted protease inhibitor (rPI) was also associated with higher CKD risk [hazard ratio for an eGFR threshold <60 ml/min per 1.73 m2: 3.35 (95% confidence interval (CI) = 1.40–8.02)], and CKD developed in 5.7% of patients after 4 years of exposure to this regimen type.
ART was associated with reduced CKD risk in association with CD4 cell restoration and plasma viral load suppression, despite an increased CKD risk that was associated with initial regimens that included tenofovir and rPI.
antiretroviral therapy; chronic kidney disease; tenofovir
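The incidence figures quoted above are rates per 1000 person-years, i.e., event counts divided by total follow-up time and rescaled. A minimal sketch of that arithmetic, using a hypothetical event count chosen only for illustration (the abstract reports the rate and the person-years, not the raw count):

```python
def incidence_per_1000_py(events: int, person_years: float) -> float:
    """Incidence rate per 1000 person-years of follow-up."""
    return 1000.0 * events / person_years

# Hypothetical example: 106 incident CKD events over 10_099 person-years
# would yield a rate of about 10.5 per 1000 person-years.
rate = incidence_per_1000_py(106, 10_099)
```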
Many adults in the United States enter primary care late in the course of HIV infection, countering the clinical benefits of timely HIV services and missing opportunities for risk reduction. Our objective was to determine whether perceived social support was associated with delay in entering care after an HIV diagnosis. Two hundred sixteen patients receiving primary care at a large, university-based HIV outpatient clinic in North Carolina were included in the study. Dimensions of functional social support (emotional/informational, tangible, affectionate, and positive social interaction) were quantified with a modified Medical Outcomes Study Social Support Scale and included in proportional hazards models to determine their effect on delays in seeking care. The median delay between diagnosis and entry to primary care was 5.9 months. Levels of social support were high, but only positive social interaction was moderately associated with delayed presentation in adjusted models. The effect of low perceived positive social interaction on the time to initiation of primary care differed by history of alcoholism (no history of alcoholism, hazard ratio (HR): 1.43, 95% confidence interval (CI): 0.88, 2.34; history of alcoholism, HR: 0.71, 95% CI: 0.40, 1.28). Ensuring timely access to HIV care remains a challenge in the southeastern United States. Affectionate, tangible, and emotional/informational social support were not associated with the time from diagnosis to care. The presence of positive social interaction may be an important factor influencing care-seeking behavior after diagnosis.
HIV infection; social support; time factors; delivery of health care; southeastern United States
Prompted by 3 cases of resistance noted in the Pre-Exposure Prophylaxis Initiative and TDF2 trials, we examined literature on mutations elicited by antiretrovirals used for pre-exposure prophylaxis. We discuss signature mutations, how rapidly these emerge, and individual-level and public health consequences of antiretroviral resistance.
Pre-exposure prophylaxis (PrEP), the use of antiretrovirals (ARVs) by human immunodeficiency virus (HIV)–uninfected individuals to prevent acquisition of the virus during high-risk sexual encounters, enjoyed its first 2 major successes with the Centre for the AIDS Programme of Research in South Africa (CAPRISA) 004 and the Pre-Exposure Prophylaxis Initiative (iPrEx) trials. These successes were buoyed by additional positive results from the TDF2 and Partners PrEP trials. Although no seroconverters in either arm of CAPRISA developed resistance to tenofovir, 2 participants in iPrEx with undetected, seronegative acute HIV infection were randomized to receive daily oral tenofovir-emtricitabine, and resistance to emtricitabine was later discovered in both men. A similar case in the TDF2 study resulted in resistance to both ARVs. These cases prompted us to examine existing literature on the nature of resistance mutations elicited by ARVs used for PrEP. Here, we discuss the impact of signature mutations selected by PrEP, how rapidly these emerge with daily ARV exposure, and the individual-level and public health consequences of ARV resistance.
Background. Dolutegravir (DTG; S/GSK1349572), a human immunodeficiency virus type 1 (HIV-1) integrase inhibitor, has limited cross-resistance to raltegravir (RAL) and elvitegravir in vitro. This phase IIb study assessed the activity of DTG in HIV-1–infected subjects with genotypic evidence of RAL resistance.
Methods. Subjects received DTG 50 mg once daily (cohort I) or 50 mg twice daily (cohort II) while continuing a failing regimen (without RAL) through day 10, after which the background regimen was optimized, when feasible, for cohort I, and at least 1 fully active drug was mandated for cohort II. The primary efficacy end point was the proportion of subjects on day 11 in whom the plasma HIV-1 RNA load decreased by ≥0.7 log10 copies/mL from baseline or was <400 copies/mL.
Results. A rapid antiviral response was observed. More subjects achieved the primary end point in cohort II (23 of 24 [96%]) than in cohort I (21 of 27 [78%]) at day 11. At week 24, 41% and 75% of subjects had an HIV-1 RNA load of <50 copies/mL in cohorts I and II, respectively. Further integrase genotypic evolution was uncommon. Dolutegravir had a good safety profile, which was similar with each dosing regimen.
Conclusion. Dolutegravir 50 mg twice daily with an optimized background provided greater and more durable benefit than the once-daily regimen. These data are the first clinical demonstration of the activity of any integrase inhibitor in subjects with HIV-1 resistant to RAL.
Dolutegravir; DTG; S/GSK1349572; integrase inhibitor; raltegravir resistance
To explore darunavir/ritonavir (DRV/r) plus raltegravir (RAL) combination therapy in antiretroviral-naive patients.
Phase IIb, single-arm, open-label, multicenter study.
One hundred and twelve antiretroviral-naive, HIV-1-infected patients received DRV/r 800/100 mg once daily and RAL 400 mg twice daily. Primary endpoint was virologic failure by week 24. Virologic failure was defined as confirmed viral load of 1000 copies/ml or more at week 12, or an increase of more than 0.5 log10 copies/ml in viral load from week 4 to 12, or a confirmed viral load of more than 50 copies/ml at or after week 24. Protease and integrase genes were sequenced in patients experiencing virologic failure.
Virologic failure rate was 16% [95% confidence interval (CI) 10–24] by week 24 and 26% (95% CI 19–36) by week 48 in an intent-to-treat analysis. Viral load at virologic failure was 51–200 copies/ml in 17/28 failures. Adjusting for age and sex, virologic failure was associated with baseline viral load of more than 100 000 copies/ml [hazard ratio 3.76, 95% CI 1.52–9.31, P = 0.004] and lower CD4 cell count [0.77 per 100 cells/μl increase, 95% CI 0.61–0.98, P = 0.037]. When trough RAL concentrations were included as a time-varying covariate in the analysis, virologic failure remained associated with baseline viral load more than 100 000 copies/ml [hazard ratio 4.67, 95% CI 1.93–11.25, P < 0.001], whereas a RAL level below the detection limit in plasma at one or more previous visits was associated with increased hazard [hazard ratio 3.42, 95% CI 1.41–8.26, P = 0.006]. All five participants with integrase mutations during virologic failure had baseline viral load more than 100 000 copies/ml.
DRV/r plus RAL was effective and well tolerated in most patients, but virologic failure and integrase resistance were common, particularly in patients with baseline viral load more than 100 000 copies/ml.
antiretroviral therapy; darunavir; nucleoside sparing; raltegravir
QDMRK was a phase III clinical trial of raltegravir given once daily (QD) (800-mg dose) versus twice daily (BID) (400 mg per dose), each in combination with once-daily coformulated tenofovir-emtricitabine, in treatment-naive HIV-infected patients. Pharmacokinetic (PK) and pharmacokinetic/pharmacodynamic (PK/PD) analyses were conducted using a 2-step approach: individual non-model-based PK parameters from observed sparse concentration data were determined, followed by statistical analysis of potential relationships between PK and efficacy response parameters after 48 weeks of treatment. Sparse PK sampling was performed for all patients (QD, n = 380; BID, n = 384); selected sites performed an intensive PK evaluation at week 4 (QD, n = 22; BID, n = 20). In the intensive PK subgroup, daily exposures (area under the concentration-time curve from 0 to 24 h [AUC0–24]) were similar between the two regimens, but patients on 800 mg QD experienced ∼4-fold-higher maximum drug concentration in plasma (Cmax) values and ∼6-fold-lower trough drug concentration (Ctrough) values than those on 400 mg BID. Geometric mean (GM) Ctrough values were similarly lower in the sparse PK analysis. With BID dosing, there was no indication of any significant PK/PD association over the range of tested PK parameters. With QD dosing, Ctrough values correlated with the likelihood of virologic response. Failure to achieve an HIV RNA level of <50 copies/ml appeared predominantly at high baseline HIV RNA levels in both treatment arms and was associated with lower values of GM Ctrough in the 800-mg-QD arm, though other possible drivers of efficacy, such as time above a threshold concentration, could not be evaluated due to the sparse sampling scheme. Together, these findings emphasize the importance of the shape of the plasma concentration-versus-time curve for long-term efficacy.
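The first step of the two-step approach described above, deriving non-model-based PK parameters from a subject's concentration-time profile, amounts to noncompartmental analysis: AUC0–24 by the trapezoidal rule, plus Cmax and Ctrough read directly from the profile. The sketch below is a generic illustration with a made-up concentration profile; the function name, units, and sampling times are assumptions, not the QDMRK analysis itself.

```python
import numpy as np

def nca_params(times_h, conc):
    """Noncompartmental PK parameters over one dosing interval:
    AUC by the linear trapezoidal rule, Cmax as the peak observed
    concentration, Ctrough as the last (pre-dose) concentration."""
    t = np.asarray(times_h, dtype=float)
    c = np.asarray(conc, dtype=float)
    auc = float(np.sum((c[1:] + c[:-1]) / 2.0 * np.diff(t)))
    return {"AUC": auc, "Cmax": float(c.max()), "Ctrough": float(c[-1])}

# Hypothetical sparse profile (time in h, concentration in ng/ml)
# over a 24-h once-daily dosing interval
params = nca_params([0, 1, 2, 4, 8, 12, 24],
                    [50, 4800, 3900, 2100, 900, 400, 60])
```

Comparing such parameters between regimens (e.g., similar AUC0–24 but very different Cmax and Ctrough for 800 mg QD versus 400 mg BID) is exactly the kind of contrast the intensive PK subgroup analysis reports.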