We estimated US Department of Health and Human Services (DHHS)–approved human immunodeficiency virus (HIV) indicators. Among patients, 71% were retained in care, 82% were prescribed treatment, and 78% had HIV RNA ≤200 copies/mL; younger adults, women, blacks, and injection drug users had poorer outcomes. Interventions are needed to reduce retention- and treatment-related disparities.
HIV; quality of care; retention in care; antiretroviral therapy; HIV RNA suppression
In the last decade, timely initiation of antiretroviral therapy and resulting virologic suppression have greatly improved in North America concurrent with the development of better-tolerated and more potent regimens, but significant barriers to treatment uptake remain.
Background. Since the mid-1990s, effective antiretroviral therapy (ART) regimens have improved in potency, tolerability, ease of use, and class diversity. We sought to examine trends in treatment initiation and resulting human immunodeficiency virus (HIV) virologic suppression in North America between 2001 and 2009, and demographic and geographic disparities in these outcomes.
Methods. We analyzed data on HIV-infected individuals newly clinically eligible for ART (ie, first reported CD4+ count <350 cells/µL or AIDS-defining illness, based on treatment guidelines during the study period) from 17 North American AIDS Cohort Collaboration on Research and Design cohorts. Outcomes included timely ART initiation (within 6 months of eligibility) and virologic suppression (≤500 copies/mL, within 1 year). We examined time trends and considered differences by geographic location, age, sex, transmission risk, race/ethnicity, CD4+ count, and viral load, and documented psychosocial barriers to ART initiation, including non–injection drug abuse, alcohol abuse, and mental illness.
Results. Among 10 692 HIV-infected individuals, the cumulative incidence of 6-month ART initiation increased from 51% in 2001 to 72% in 2009 (Ptrend < .001). The cumulative incidence of 1-year virologic suppression increased from 55% to 81%, and among ART initiators, from 84% to 93% (both Ptrend < .001). A greater number of psychosocial barriers were associated with decreased ART initiation, but not virologic suppression once ART was initiated. We found significant heterogeneity by state or province of residence (P < .001).
Conclusions. In the last decade, timely ART initiation and virologic suppression have greatly improved in North America concurrent with the development of better-tolerated and more potent regimens, but significant barriers to treatment uptake remain, both at the individual level and systemwide.
antiretroviral therapy; healthcare disparities; HIV; time factors; viral load
Background. The role of active hepatitis C virus (HCV) replication in chronic kidney disease (CKD) risk has not been clarified.
Methods. We compared CKD incidence in a large cohort of HIV-infected subjects who were HCV seronegative, HCV viremic (detectable HCV RNA), or HCV aviremic (HCV seropositive, undetectable HCV RNA). Stages 3 and 5 CKD were defined according to standard criteria. Progressive CKD was defined as a sustained 25% glomerular filtration rate (GFR) decrease from baseline to a GFR < 60 mL/min/1.73 m2. We used Cox models to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs).
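As one concrete reading of the progressive CKD definition above, a minimal sketch; treating "sustained" as two consecutive qualifying measurements is an assumption for illustration, since the abstract does not specify it.

```python
# Illustrative sketch (not the study's code): flags progressive CKD per the
# definition above -- a sustained >=25% eGFR decline from baseline reaching
# eGFR < 60 mL/min/1.73 m2. "Sustained" is operationalized here as two
# consecutive qualifying measurements, an assumption for illustration.
def progressive_ckd(egfr_series, baseline_egfr, n_consecutive=2):
    """egfr_series: chronologically ordered eGFR values after baseline."""
    run = 0
    for egfr in egfr_series:
        if egfr < 60 and egfr <= 0.75 * baseline_egfr:
            run += 1
            if run >= n_consecutive:
                return True
        else:
            run = 0
    return False

print(progressive_ckd([70, 55, 52], baseline_egfr=80))  # True
```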
Results. A total of 52 602 HCV seronegative, 9508 HCV viremic, and 913 HCV aviremic subjects were included. Compared with HCV seronegative subjects, HCV viremic subjects were at increased risk for stage 3 CKD (adjusted HR 1.36 [95% CI, 1.26, 1.46]), stage 5 CKD (1.95 [1.64, 2.31]), and progressive CKD (1.31 [1.19, 1.44]), while HCV aviremic subjects were also at increased risk for stage 3 CKD (1.19 [0.98, 1.45]), stage 5 CKD (1.69 [1.07, 2.65]), and progressive CKD (1.31 [1.02, 1.68]).
Conclusions. Compared with HIV-infected subjects who were HCV seronegative, both HCV viremic and HCV aviremic individuals were at increased risk for moderate and advanced CKD.
HIV; hepatitis C virus; chronic kidney disease; hepatitis C RNA; cohort study; glomerular filtration rate; injection drug use
Purpose of Review
HIV-infected individuals are living longer as a result of effective treatment. Age-related comorbidities now account for the majority of morbidity and mortality among treated HIV-infected adults. Previous findings regarding the age at, and risk of, these comorbidities have been mixed, sparking debate in the field. Discerning potential differences in the occurrence and burden of age-related comorbidities among treated HIV-infected adults as compared with uninfected adults of the same age requires careful selection of the appropriate uninfected comparison group.
Recent Findings
The validity of comparisons to HIV-uninfected populations is threatened when differences in demographic, clinical, and lifestyle characteristics between HIV-infected and uninfected adults are not considered. Identifying a pool of HIV-uninfected individuals from existing secondary data resources and employing selection methodologies may be a novel approach to reduce threats to internal validity. Issues related to identifying data sources, understanding inclusion criteria, determining measurement error, and threats to inference are discussed.
Summary
The development of clinical interventions targeting age-related comorbidities will rely on deriving valid inferences from appropriate comparison groups. The use of secondary data resources and selection methodology to create the appropriate uninfected comparison group is an attractive approach in the setting of finite resources, but is not without limitations.
HIV-uninfected; HIV infection; aging; harmonization; causal inference
To determine the impact of age and initial HAART regimen class on virologic and immunologic response within 24 months after initiation.
Pooled analysis of data from 19 prospective cohort studies in the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD).
A total of 12 196 antiretroviral-naive adults who initiated HAART between 1998 and 2008 with either a boosted protease inhibitor-based regimen or a nonnucleoside reverse transcriptase inhibitor (NNRTI)-based regimen were included in our study. Discrete time-to-event models estimated adjusted hazard odds ratios (aHOR) and 95% confidence intervals (CIs) for suppressed viral load (≤500 copies/mL) and, separately, for an increase of at least 100 cells/μL in CD4 cell count. Truncated, stabilized inverse probability weights accounted for selection biases from discontinuation of the initial regimen class.
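Weights of this kind are typically the cumulative product, over follow-up intervals, of a numerator model (conditioning on baseline covariates only) divided by a denominator model (also conditioning on time-varying covariates), truncated at extreme percentiles. A hedged sketch; the per-interval probabilities and the 1st/99th percentile truncation bounds are illustrative assumptions, not the study's specification.

```python
import pandas as pd

# Hedged sketch of truncated, stabilized inverse probability weights for
# remaining on the initial regimen class. Column names and the 1st/99th
# percentile truncation are assumptions for illustration.
def stabilized_weights(df):
    # df columns: id, interval,
    #   p_stay_num = P(stay on regimen | baseline covariates),
    #   p_stay_den = P(stay on regimen | baseline + time-varying covariates)
    df = df.sort_values(["id", "interval"]).copy()
    df["ratio"] = df["p_stay_num"] / df["p_stay_den"]
    df["sw"] = df.groupby("id")["ratio"].cumprod()  # cumulative product over intervals
    lo, hi = df["sw"].quantile([0.01, 0.99])
    df["sw_trunc"] = df["sw"].clip(lo, hi)          # truncate extreme weights
    return df
```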
Among 12 196 eligible participants (mean age = 42 years), 50% changed regimen classes after initiation (57% of those initiating protease inhibitor-based and 48% of those initiating NNRTI-based regimens). Mean CD4 cell count at initiation was similar by age. Virologic response to treatment was less likely in those initiating with a boosted protease inhibitor [aHOR = 0.77 (0.73, 0.82)], regardless of age. Immunologic response decreased with increasing age [18–<30: ref; 30–<40: aHOR = 0.92 (0.85, 1.00); 40–<50: aHOR = 0.85 (0.78, 0.92); 50–<60: aHOR = 0.82 (0.74, 0.90); ≥60: aHOR = 0.74 (0.65, 0.85)], regardless of initial regimen.
We found no evidence of an interaction between age and initial antiretroviral regimen on virologic or immunologic response to HAART; however, decreased immunologic response with increasing age may have implications for age-specific when-to-start guidelines.
age; CD4 lymphocyte count; HAART; HIV; viral load
Effective antiretroviral (ARV) therapy depends on adequate drug exposure, yet methods to assess ARV exposure are limited. Concentrations of ARVs in hair are a product of steady-state pharmacokinetic factors and longitudinal adherence. We investigated nevirapine (NVP) concentrations in hair as a predictor of treatment response in women receiving ARVs. In participants of the Women's Interagency HIV Study who reported NVP use for >1 month from 2003 to 2008, NVP concentrations in hair were measured via liquid chromatography–tandem mass spectrometry. The outcome was virologic suppression (plasma HIV RNA below the assay threshold) at the time of hair sampling, and the primary predictor was NVP concentration categorized into quartiles. We controlled for age, race/ethnicity, pretreatment HIV RNA, CD4 cell count, and self-reported adherence over the 6-month visit interval (categorized as ≤74%, 75%–94%, or ≥95%). We also assessed the relation of NVP concentration to changes in hepatic transaminase levels via multivariate random-intercept logistic regression and linear regression analyses. A total of 271 women contributed 1089 person-visits to the analysis (median of 3 semiannual visits). Viral suppression was least frequent in the lowest concentration quartile (86/178 [48.3%]) and increased across higher quartiles (to 158/204 [77.5%] in quartile 4). The odds of viral suppression in the highest concentration quartile were 9.17 times (95% CI 3.2–26, P < 0.0001) those in the lowest. African-American race was associated with lower rates of virologic suppression independent of NVP hair concentration. NVP concentration was not significantly associated with patterns of serum transaminases. Concentration of NVP in hair was a strong independent predictor of virologic suppression in women taking NVP, stronger than self-reported adherence, but did not appear to be strongly predictive of hepatotoxicity.
The burden of HIV disease has shifted from traditional AIDS-defining illnesses to serious non-AIDS-defining comorbid conditions. Research aimed at improving HIV-related comorbid disease outcomes requires well-defined, verified clinical endpoints. We developed methods to ascertain and verify end-stage renal disease (ESRD) and end-stage liver disease (ESLD) and validated screening algorithms within the largest HIV cohort collaboration in North America (NA-ACCORD). Individuals who screened positive among all participants in twelve cohorts enrolled between January 1996 and December 2009 underwent medical record review to verify incident ESRD or ESLD using standardized protocols. We randomly sampled 6% of contributing cohorts to determine the sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of the ESLD and ESRD screening algorithms in a validation subcohort. Among 43,433 patients screened for ESRD, 822 screened positive, of whom 620 met clinical criteria for ESRD. The algorithm had 100% sensitivity, 99% specificity, 82% PPV, and 100% NPV for ESRD. Among 41,463 patients screened for ESLD, 2,024 screened positive, of whom 645 met diagnostic criteria for ESLD. The algorithm had 100% sensitivity, 95% specificity, 27% PPV, and 100% NPV for ESLD. Our methods proved robust for ascertainment of ESRD and ESLD in persons infected with HIV.
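The reported sensitivity, specificity, PPV, and NPV follow the standard 2×2 confusion-matrix definitions; a minimal sketch with placeholder chart-review counts, not the study's data.

```python
# Standard validation metrics from chart-review counts (tp = screened
# positive and verified, fp = screened positive but not verified, etc.).
# The example counts below are placeholders, not the study's data.
def screening_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

print(screening_metrics(tp=62, fp=14, fn=0, tn=2400))
```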
Adults infected with HIV have increased atherosclerosis potentially associated with both HIV and non‐HIV associated factors. We characterized risk factors for atherosclerosis as measured by noninvasive vascular imaging.
Methods and Results
We used B-mode ultrasound to examine levels and correlates of echogenicity and vessel wall thickness of the carotid artery intima-media complex in 1282 HIV-infected and 510 HIV-uninfected women of the Women's Interagency HIV Study. Levels of gray scale median (GSM, a measure of echogenicity) did not vary between HIV infection groups. In both groups, smokers had increased GSM, whereas age, diabetes, elevated blood pressure, and high body mass index (BMI) were associated with lower (rather than higher) GSM. Each of these non-lipid cardiovascular disease (CVD) risk factors, especially age and blood pressure, was also associated with higher levels of carotid artery intima-media thickness (cIMT). Higher serum triglyceride levels were associated with lower GSM in both HIV-infected and HIV-uninfected groups. Additional lipid risk factors for low GSM, including high LDL cholesterol and low HDL cholesterol levels, were identified in HIV-uninfected but not in HIV-infected women. In contrast to the findings for GSM, among the lipid parameters only LDL cholesterol level was associated with cIMT, and only in the HIV-uninfected group.
Conclusions
Lipid and non-lipid risk factor associations with echolucency of the carotid artery and the thickness of the common carotid artery intima-media layer suggest that these measures capture different aspects of atherosclerosis.
carotid arteries; epidemiology; immune system; risk factors; ultrasonics
Tenofovir is used commonly in HIV treatment and prevention settings, but factors that correlate with tenofovir exposure in real-world settings are unknown.
Intensive pharmacokinetic (PK) studies of tenofovir over 24 hours at steady state were performed in a large, diverse cohort of HIV-infected women, and factors that influenced exposure (assessed by areas under the time-concentration curve, AUCs) were identified.
HIV-infected women (n=101) on tenofovir-based therapy underwent intensive 24-hour PK sampling. Data on race/ethnicity, age, exogenous steroid use, menstrual cycle phase, concomitant medications, recreational drug and/or tobacco use, hepatic and renal function, and weight and body mass index (BMI) were collected. Multivariable models using forward stepwise selection identified factors associated with AUC. Glomerular filtration rates (GFR) prior to starting tenofovir were estimated by the CKD-EPI equation using both creatinine and cystatin C measures.
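For reference, a sketch of the widely used 2009 CKD-EPI creatinine equation; the cystatin C-based CKD-EPI equation used alongside it has an analogous form with different constants. Serum creatinine is in mg/dL and eGFR is in mL/min/1.73 m2.

```python
# 2009 CKD-EPI creatinine equation (eGFR in mL/min/1.73 m2).
def ckd_epi_creatinine(scr, age, female, black):
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr / kappa, 1.0) ** alpha      # low-creatinine term
            * max(scr / kappa, 1.0) ** -1.209     # high-creatinine term
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(round(ckd_epi_creatinine(0.9, 45, female=True, black=False), 1))
```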
The median (range) of tenofovir AUCs was 3350 (1031–13,911) ng·h/mL. Higher AUCs were associated with concomitant ritonavir use (1.33-fold increase, p = 0.002), increasing age (1.21-fold increase per decade, p = 0.0007), and decreasing BMI (1.04-fold increase per 10% decrease in BMI). When GFR was calculated using cystatin C measures, mild renal insufficiency prior to tenofovir initiation was associated with higher subsequent exposure (1.35-fold increase when pre-tenofovir GFR <70 mL/min, p = 0.0075).
Concomitant ritonavir use, increasing age, decreasing BMI and lower GFR prior to tenofovir initiation as estimated by cystatin C were all associated with elevated tenofovir exposure in a diverse cohort of HIV-infected women. Clinicians treating HIV-infected women should be aware of common clinical conditions that affect tenofovir exposure when prescribing this medication.
Tenofovir; pharmacokinetics; HIV-infected women; diverse populations; GFR; cystatin C
We sought to quantify agreement between Institute of Medicine (IOM) and Department of Health and Human Services (DHHS) retention indicators, which have not been compared in the same population, and assess clinical retention within the largest HIV cohort collaboration in the U.S.
Observational study from 2008–2010, using clinical cohort data in the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD).
Retention definitions were based on HIV primary care visits. The IOM retention indicator was ≥2 visits, ≥90 days apart, in each calendar year; this was extended to a 2-year period, with retention requiring that the definition be met in both years. The DHHS retention indicator was ≥1 visit in each semester over 2 years, with visits ≥60 days apart. Kappa statistics measured agreement between the indicators, and C statistics (areas under receiver operating characteristic curves) from logistic regression analyses summarized how well DHHS indicator status discriminated retention according to the IOM indicator.
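To make the two definitions concrete, a minimal sketch; assuming calendar-semester boundaries and applying the ≥60-day spacing rule to consecutive visits is one plausible reading, not the study's code.

```python
from datetime import date

# Hedged sketch of the two indicators defined above. Semester boundaries and
# the reading of the >=60-day spacing rule are assumptions for illustration.
def iom_retained(visits, year):
    """IOM, single year: >=2 visits in the calendar year, >=90 days apart."""
    v = sorted(d for d in visits if d.year == year)
    return any((b - a).days >= 90 for a in v for b in v if b > a)

def iom_retained_2yr(visits, year1):
    """Two-year extension: the definition must be met in both years."""
    return iom_retained(visits, year1) and iom_retained(visits, year1 + 1)

def dhhs_retained(visits, year1):
    """DHHS: >=1 visit in each of the 4 semesters over 2 years, with
    consecutive visits >=60 days apart (one plausible reading)."""
    v = sorted(visits)
    semesters = []
    for y in (year1, year1 + 1):
        semesters += [(date(y, 1, 1), date(y, 6, 30)),
                      (date(y, 7, 1), date(y, 12, 31))]
    covered = all(any(lo <= d <= hi for d in v) for lo, hi in semesters)
    spaced = all((b - a).days >= 60 for a, b in zip(v, v[1:]))
    return covered and spaced

visits = [date(2008, 2, 1), date(2008, 9, 1), date(2009, 3, 1), date(2009, 8, 1)]
print(iom_retained_2yr(visits, 2008), dhhs_retained(visits, 2008))  # True True
```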
Among 36,769 patients in 2008–2009 and 34,017 in 2009–2010, higher percentages of participants were retained in care under the IOM indicator than under the DHHS indicator (80% vs. 75% in 2008–2009; 78% vs. 72% in 2009–2010) (p<0.01), a pattern that persisted across all demographic and clinical characteristics (p<0.01). There was high agreement between indicators overall (κ = 0.83 in 2008–2009; κ = 0.79 in 2009–2010, p<0.001), and C statistics revealed a very strong ability to predict retention according to the IOM indicator based on DHHS indicator status, even within characteristic strata.
Although the IOM indicator consistently reported higher retention in care compared with the DHHS indicator, there was strong agreement between IOM and DHHS retention indicators in a cohort demographically similar to persons living with HIV/AIDS in the U.S. Persons with poorer retention represent subgroups of interest for retention improvement programs nationally, particularly in light of the White House Executive Order on the HIV Care Continuum.
HIV-associated immune injury is hypothesized to increase the risk of preclinical disability and frailty via inflammatory pathways. We investigated the role of CD4+ T cell depletion and clinical AIDS on preclinical disability and frailty in HIV-positive women with a history of combination antiretroviral therapy (cART) and HIV-negative women.
This was a cross-sectional study nested within the Women's Interagency HIV Study (WIHS), a prospective cohort study initiated in 1994 across five U.S. cities. Questionnaires and tests were performed by 573 HIV-negative and 1206 HIV-positive women. Prevalence ratios were computed using regression models.
Severe CD4+ cell depletion was an independent predictor of slowness, weakness, and frailty in HIV-positive women compared with HIV-negative women. Women with CD4+ counts <100 cells/mm3 were 0.13 seconds slower to complete 4 meters (95% CI 0.06–0.21), were 1.25 kg weaker (95% CI −2.31 to −0.19), and had a 2.7 times higher prevalence of frailty (95% CI 1.46–5.01).
This study is one of the largest studies to administer performance-based tests to investigate disability and frailty in HIV-positive women. HIV-positive women with intact immune systems and without a history of clinical AIDS were no different from HIV-negative women on tests of slowness, weakness, and frailty phenotype.
Among 127 HIV-infected women, the magnitude of HDLc increases after HAART initiation predicted the magnitude of concurrent decreases in inflammation biomarkers. After HAART initiation, changes in LDLc and inflammation were unrelated. In the same population, predicted risk of coronary heart disease based upon levels of standard clinical risk factors was similar before and after HAART initiation. Thus, it remains unknown whether short-term treatment-related changes in standard risk factors may appreciably change the risk of cardiovascular disease (CVD).
lipids; HAART; HIV infection; inflammation
Natural history studies suggest increased risk for kidney function decline with HIV infection, but few studies have made comparisons with HIV-uninfected women. We examined whether HIV infection treated with highly active antiretroviral therapy (HAART) remains associated with faster kidney function decline in the Women's Interagency HIV Study. HIV-infected women initiating HAART with (n=105) or without (n=373) tenofovir (TDF) were matched to HIV-uninfected women on calendar time and length of follow-up, age, systolic blood pressure, hepatitis C antibody serostatus, and diabetes history. Linear mixed models were used to evaluate differences in annual estimated glomerular filtration rate (eGFR). Person-visits numbered 4,741 and 11,512 for the TDF-treated and non-TDF-treated analyses, respectively. Mean baseline eGFRs were higher among women initiated on TDF-containing HAART and lower among those on TDF-sparing HAART compared to their respective HIV-uninfected matches (p<0.05 for both). HIV-infected women had annual rates of eGFR change similar to their HIV-uninfected matches (p-interaction >0.05 for both). Adjusting for baseline eGFR, mean eGFRs at 1 and 3 years of follow-up among women initiated on TDF-containing HAART were lower than those of their uninfected matches (−4.98 and −4.26 mL/min/1.73 m2, respectively; p<0.05 for both). Mean eGFR of women initiated on TDF-sparing HAART was lower versus uninfected matches at 5 years (−2.19 mL/min/1.73 m2, p=0.03). HAART-treated HIV-infected women had lower mean eGFRs at follow-up but experienced rates of annual eGFR decline similar to HIV-uninfected women. Tenofovir use in HIV-infected women with normal kidney function did not accelerate long-term kidney function decline relative to HIV-uninfected women.
HIV infection and low CD4+ T-cell count are associated with an increased risk of persistent oncogenic human papillomavirus (HPV) infection, the major risk factor for cervical cancer. Few reported prospective cohort studies have characterized the incidence of invasive cervical cancer (ICC) in HIV-infected women.
Data were obtained from HIV-infected and -uninfected female participants in the NA-ACCORD with no history of ICC at enrollment. Participants were followed from study entry or January 1996 through ICC diagnosis, loss to follow-up, or December 2010. The relationship of HIV infection and CD4+ T-cell count with risk of ICC was assessed using age-adjusted Poisson regression models and standardized incidence ratios (SIRs). All cases were confirmed by cancer registry records and/or pathology reports. Cervical cytology screening history was assessed through medical record abstraction.
A total of 13,690 HIV-infected and 12,021 HIV-uninfected women contributed 66,249 and 70,815 person-years (pys) of observation, respectively. Incident ICC was diagnosed in 17 HIV-infected and 4 HIV-uninfected women (incidence rates of 26 and 6 per 100,000 pys, respectively). HIV-infected women with baseline CD4+ T-cell counts of ≥350, 200–349, and <200 cells/µL had 2.3-fold, 3.0-fold, and 7.7-fold increases in ICC incidence, respectively, compared with HIV-uninfected women (Ptrend = .001). Of the 17 HIV-infected cases, medical records for the 5 years prior to diagnosis showed that 6 had no documented screening, 5 had screening with low-grade or normal results, and 6 had high-grade results.
This study found elevated incidence of ICC in HIV-infected compared to -uninfected women, and these rates increased with immunosuppression.
human papillomavirus; HIV infection; invasive cervical cancer; immunosuppression
Proteinuria is associated with adverse clinical outcomes in HIV infection. Here we evaluated whether APOL1 risk alleles, previously associated with advanced kidney disease, are independently associated with proteinuria in HIV infection in a cross-sectional study of HIV-infected women in the Women's Interagency HIV Study. We estimated the percent difference in urine protein excretion and the odds of proteinuria (200 mg/g and higher) associated with two versus one or no APOL1 risk alleles using linear and logistic regression, respectively. Of 1285 women successfully genotyped, 379 carried one and 80 carried two risk alleles. Proteinuria was present in 124 women, 78 of whom had proteinuria confirmed on a second sample. In women without prior AIDS, two risk alleles were independently associated with a 69% higher urine protein excretion (95% CI: 36%, 108%) and 5-fold higher odds of proteinuria (95% CI: 2.45, 10.37) versus one or no risk allele. No association was found in women with prior AIDS. Analyses in which women with impaired kidney function were excluded and proteinuria was confirmed by a second urine sample yielded similar estimates. Thus, APOL1 risk alleles are associated with significant proteinuria in HIV-infected persons without prior clinical AIDS, independent of clinical factors traditionally associated with proteinuria. Trials are needed to determine whether APOL1 genotyping identifies individuals who could benefit from earlier intervention to prevent overt renal disease.
Retention in care is key to improving HIV outcomes. Our goal was to describe “churn” in patterns of entry, exit, and retention in HIV care in the US and Canada.
Adults contributing ≥1 CD4 count or HIV-1 RNA measurement (HIV-lab) from 2000–2008 in North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD) clinical cohorts were included. Incomplete retention was defined as lack of 2 HIV-labs (≥90 days apart) within 12 months, summarized by calendar year. We used beta-binomial regression models to estimate adjusted odds ratios (OR) and 95% confidence intervals (CI) of factors associated with incomplete retention.
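A minimal sketch of this incomplete-retention flag (illustrative only, not the study's code):

```python
from datetime import date

# Incomplete retention per the definition above: a calendar year lacking
# two HIV-lab dates >=90 days apart.
def incomplete_retention(lab_dates, year):
    v = sorted(d for d in lab_dates if d.year == year)
    return not any((b - a).days >= 90 for a in v for b in v if b > a)

print(incomplete_retention([date(2004, 2, 1), date(2004, 8, 15)], 2004))  # False (retained)
```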
Among 61,438 participants, the 15,360 (25%) with incomplete retention differed significantly in univariate analyses (p<0.001) from the 46,078 (75%) consistently retained with respect to age, race/ethnicity, HIV risk, CD4 count, ART use, and country of care (US vs. Canada). From 2000–2004, females (OR=0.82, CI:0.70–0.95), older individuals (OR=0.78, CI:0.74–0.83 per 10 years), and ART users (OR=0.61, CI:0.54–0.68 vs. all others) were less likely to have incomplete retention, while black individuals (OR=1.31, CI:1.16–1.49 vs. white), those with injection drug use (IDU) HIV risk (OR=1.68, CI:1.49–1.89 vs. non-IDU), and those in care longer (OR=1.09, CI:1.07–1.11 per year) were more likely to have incomplete retention. Results from 2005–2008 were similar.
From 2000 to 2008, 75% of the NA-ACCORD population was consistently retained in care with 25% experiencing some change in status, or churn. In addition to the programmatic and policy implications, our findings identify patient groups who may benefit from focused retention efforts.
retention; churn; HIV clinical care; North America; HRSA HAB; National HIV/AIDS Strategy
Human leukocyte antigen (HLA) genotype has been associated with probability of spontaneous clearance of hepatitis C virus (HCV). However, no prior studies have examined whether this relationship may be further characterized by grouping HLA alleles according to their supertypes, defined by their binding capacities. There is debate regarding the most appropriate method to define supertypes. Therefore, previously reported HLA supertypes (46 class I and 25 class II) were assessed for their relation with HCV clearance in a population of 758 HCV-seropositive women. Two HLA class II supertypes were significant in multivariable models that included: (i) supertypes with significant or borderline associations with HCV clearance after adjustment for multiple tests, and (ii) individual HLA alleles not part of these supertypes, but associated with HCV clearance in our prior study in this population. Specifically, supertype DRB3 (prevalence ratio [PR] = 0.4; p = 0.004) was associated with HCV persistence, while DR8 (PR = 1.8; p = 0.01) was associated with HCV clearance. Two individual alleles (B*57:01 and C*01:02) associated with HCV clearance in our prior study became nonsignificant in analyses that included supertypes, while B*57:03 (PR = 1.9; p = 0.008) and DRB1*07:01 (PR = 1.7; p = 0.005) retained significance. These data provide epidemiologic support for the significance of HLA supertypes in relation to HCV clearance.
hepatitis C virus; HLA; human leukocyte antigen; supertype
Statistical approaches for estimating and drawing inference on the correlation between two biomarkers which are repeatedly assessed over time and subject to left-censoring due to minimum detection levels are lacking. We propose a linear mixed-effects model and estimate the parameters with the Monte Carlo Expectation Maximization (MCEM) method. Inferences regarding the model parameters and the correlation between the biomarkers are performed by applying Louis's method and the delta method. Simulation studies were conducted to compare the proposed MCEM method with existing methods, including the MLE method, the multiple imputation (MI) method, and two widely used ad hoc approaches: replacing the censored values with the detection limit (DL) or with half of the detection limit (HDL). The results show that the performance of the MCEM method with respect to relative bias and coverage probability for the 95% confidence interval is superior to the DL and HDL approaches and exceeds that of the MI method at medium to high levels of censoring, and the standard error estimates from the MCEM method are close to ideal. The MLE method can estimate the parameters accurately; however, a non-positive definite information matrix can occur so that the variances are not estimable. These five methods are illustrated with data from a longitudinal HIV study to estimate and draw inference on the correlation between HIV RNA levels measured in plasma and in cervical secretions at multiple time points.
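To illustrate the ad hoc substitution approaches evaluated above, a toy simulation (invented parameters; not the paper's MCEM implementation) that left-censors two correlated lognormal biomarkers and re-estimates their log-scale correlation after DL and HDL substitution:

```python
import numpy as np

# Toy illustration of how DL and HDL (DL/2) substitution distort the
# estimated correlation of two left-censored, lognormal biomarkers.
# True log-scale correlation is 0.6; all parameters are invented.
rng = np.random.default_rng(0)
rho = 0.6
logs = rng.multivariate_normal([2.0, 2.0], [[1, rho], [rho, 1]], size=5000)
conc = np.exp(logs)                     # "true" concentrations
dl = np.quantile(conc, 0.40, axis=0)    # detection limits censoring ~40% per marker
for sub, label in [(dl, "DL"), (dl / 2, "HDL (DL/2)")]:
    filled = np.where(conc < dl, sub, conc)        # substitute censored values
    r = np.corrcoef(np.log(filled).T)[0, 1]
    print(f"{label}: estimated rho = {r:.3f} (true rho = {rho})")
```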
information matrix; longitudinal data; mixed-effects; Monte Carlo expectation maximization
Combination antiretroviral therapy (ART) has significantly increased survival among HIV-positive adults in the United States (U.S.) and Canada, but gains in life expectancy for this region have not been well characterized. We aim to estimate temporal changes in life expectancy among HIV-positive adults on ART from 2000–2007 in the U.S. and Canada.
Participants were from the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD), aged ≥20 years and on ART. Mortality rates were calculated using participants' person-time from January 1, 2000 or ART initiation until death, loss to follow-up, or administrative censoring December 31, 2007. Life expectancy at age 20, defined as the average number of additional years that a person of a specific age will live, provided the current age-specific mortality rates remain constant, was estimated using abridged life tables.
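A minimal sketch of the abridged life-table calculation named above, with placeholder age-band mortality rates rather than NA-ACCORD estimates:

```python
# Hedged sketch of an abridged life table; all rates are placeholders.
def life_expectancy(bands):
    """bands: (width_years, mortality_rate_per_person_year) tuples for
    successive age groups starting at the index age (here age 20). The last
    band's rate is reused for the open-ended final interval, a simplification."""
    radix = 100_000.0
    survivors, person_years = radix, 0.0
    for n, m in bands:
        q = n * m / (1 + n * m / 2)                  # rate -> death probability
        deaths = survivors * q
        person_years += n * (survivors - deaths / 2)  # years lived in the band
        survivors -= deaths
    person_years += survivors / m                     # open interval: l / m
    return person_years / radix                       # expected years beyond age 20

rates = [0.010, 0.012, 0.015, 0.020, 0.028, 0.040, 0.058, 0.085, 0.125, 0.185]
print(round(life_expectancy([(5, r) for r in rates]), 1))
```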
Among 22,937 individuals contributing 82,022 person-years, there were 1,622 deaths, yielding a crude mortality rate of 19.8/1,000 person-years. Life expectancy increased from 36.1 [standard error (SE) 0.5] years in 2000–2002 to 51.4 [SE 0.5] years in 2006–2007. Men and women had comparable life expectancies in all periods except the last (2006–2007). Life expectancy was lower for individuals with a history of injection drug use, for non-whites, and for patients with baseline CD4 counts <350 cells/mm3.
A 20-year-old HIV-positive adult on ART in the U.S. or Canada is expected to live into their early 70s, a life expectancy approaching that of the general population. Differences by sex, race, HIV transmission risk group, and CD4 count remain.
We examined serum lipids in association with carotid artery intima-media thickness (CIMT) in HIV-infected and HIV-uninfected women.
In 2003–2004, among 1827 Women's Interagency HIV Study participants, we measured CIMT and lipids (high-density lipoprotein cholesterol [HDL-c], low-density lipoprotein cholesterol [LDL-c], total cholesterol [TC], and non-HDL-c). A subset of 520 treated HIV-infected women had pre-1997 lipid measures. We used multivariable linear regression to examine associations between lipids and CIMT.
In HIV-uninfected women, higher TC, LDL-c, and non-HDL-c were associated with increased CIMT. Among HIV-infected women, associations of lipids with CIMT were observed in treated but not untreated women. Among the HIV-infected women treated in 2003–2004, CIMT was associated both with lipids measured a decade earlier in infection and with late lipid measurements.
Among HIV-infected women, hyperlipidemia is most strongly associated with subclinical atherosclerosis in treated women. Among treated women, the association appeared strongest early in the disease course.
cardiovascular diseases; carotid arteries; HAART; HIV; lipids
U.S. state AIDS Drug Assistance Programs (ADAPs) are federally funded to provide antiretroviral therapy (ART) as the payer of last resort to eligible persons with HIV infection. States differ regarding their financial contributions to and ways of implementing these programs, and it remains unclear how this interstate variability affects HIV treatment outcomes.
We analyzed data from HIV-infected individuals who were clinically eligible for ART between 2001 and 2009 (i.e., a first reported CD4+ count <350 cells/µL or AIDS-defining illness) from 14 U.S. cohorts of the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD). Using propensity score matching and Cox regression, we assessed ART initiation (within 6 months following eligibility) and virologic suppression (within 1 year) based on differences in two state ADAP features: the amount of state funding in annual ADAP budgets and the implementation of waiting lists. We performed an a priori subgroup analysis in persons with a history of injection drug use (IDU).
Among 8,874 persons, 56% initiated ART within six months following eligibility. Persons living in states with no additional state contribution to the ADAP budget initiated ART on a less timely basis (hazard ratio [HR] 0.73, 95% CI 0.60–0.88). Living in a state with an ADAP waiting list was not associated with less timely initiation (HR 1.12, 95% CI 0.87–1.45). Neither additional state contributions nor waiting lists were significantly associated with virologic suppression. Persons with an IDU history initiated ART on a less timely basis (HR 0.67, 95% CI 0.47–0.95).
We found that living in states that did not contribute additionally to the ADAP budget was associated with delayed ART initiation when treatment was clinically indicated. Given the changing healthcare environment, continued assessment of the role of ADAPs and their features that facilitate prompt treatment is needed.
Background. Efavirenz exhibits marked interindividual variability in plasma levels and toxicities. Prior pharmacogenetic studies usually measure exposure via single plasma levels, examine limited numbers of polymorphisms, and rarely model multiple contributors. We analyzed numerous genetic and nongenetic factors impacting short-term and long-term exposure in a large heterogeneous population of human immunodeficiency virus (HIV)–infected women.
Methods. We performed 24-hour intensive pharmacokinetic studies in 111 women receiving efavirenz under actual-use conditions and calculated the area-under-the-concentration-time curve (AUC) to assess short-term exposure; the efavirenz concentration in hair was measured to estimate long-term exposure. A total of 182 single-nucleotide polymorphisms (SNPs) and 45 haplotypes in 9 genes were analyzed in relationship to exposure by use of multivariate models that included a number of nongenetic factors.
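Intensive-sampling AUCs such as these are conventionally computed with the trapezoidal rule over the measured time points; a minimal sketch with invented times and concentrations (the abstract does not specify the exact method used):

```python
# Linear trapezoidal rule for AUC over sampled time points; the example
# values below are invented, not study data.
def auc_trapezoidal(times, concs):
    return sum((t2 - t1) * (c1 + c2) / 2
               for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))

times = [0, 1, 2, 4, 8, 12, 16, 24]                        # hours post-dose
concs = [1500, 3900, 4100, 3600, 2900, 2300, 1900, 1600]   # ng/mL
print(auc_trapezoidal(times, concs), "ng*h/mL")
```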
Results. Efavirenz AUCs increased 1.26-fold per doubling of the alanine aminotransferase level and 1.23-fold with orange and/or orange juice consumption. Individuals with the CYP2B6 516TT genotype displayed 3.5-fold increases in AUCs and 3.2-fold increases in hair concentrations, compared with individuals with the TG/GG genotype. Another SNP in CYP2B6 (983TT) and a p-glycoprotein haplotype affected AUCs without substantially altering long-term exposure.
Conclusions. This comprehensive pharmacogenomics study showed that individuals with the CYP2B6 516TT genotype displayed >3-fold increases in both short-term and long-term efavirenz exposure, signifying durable effects. Pharmacogenetic testing combined with monitoring of hair levels may improve efavirenz outcomes and reduce toxicities.
Background. Inflammation persists in treated human immunodeficiency virus (HIV) infection and may contribute to an increased risk for non–AIDS-related pathologies. We investigated the correlation of cytokine responses with changes in CD4 T-cell levels and coinfection with hepatitis C virus (HCV) during highly active antiretroviral treatment (HAART).
Methods. A total of 383 participants in the Women's Interagency HIV Study (212 with HIV monoinfection, 56 with HCV monoinfection, and 115 with HIV/HCV coinfection) were studied. HIV-infected women had <1000 HIV RNA copies/mL; 99.7% had >200 CD4 T cells/μL, and 98% were receiving HAART at baseline. Changes in CD4 T-cell count between baseline and 2–4 years later were calculated. Peripheral blood mononuclear cells (PBMCs) obtained at baseline were used to measure interleukin 1β (IL-1β), interleukin 6 (IL-6), interleukin 10 (IL-10), interleukin 12 (IL-12), and tumor necrosis factor α (TNF-α) responses to Toll-like receptor (TLR) 3 and TLR4 stimulation.
Results. Undetectable HIV RNA (<80 copies/mL) at baseline and secretion of IL-10 by PBMCs were positively associated with gains in CD4 T-cell counts at follow-up. Inflammatory cytokines (IL-1β, IL-6, IL-12, and TNF-α) were also produced in TLR-stimulated cultures, but only IL-10 was significantly associated with sustained increases in CD4 T-cell levels. This association was significant only in women with HIV monoinfection, indicating that HCV coinfection is an important factor limiting gains in CD4 T-cell counts, possibly by contributing to unbalanced persistent inflammation.
Conclusions. Secreted IL-10 from PBMCs may balance the inflammatory environment of HIV, resulting in CD4 T-cell stability.
To examine interstate variation in US HIV case-fatality rates and compare them with corresponding conventional HIV death rates.
Cross-sectional analysis using data on deaths due to HIV infection from the National Vital Statistics System and data on persons 15 years or older living with HIV infection in 2001–2007 in 37 U.S. states from the national HIV/AIDS Reporting System.
State rankings by age-adjusted HIV case-fatality rates (with HIV-infected population denominators) were compared with rankings by conventional death rates (with general population denominators). Negative binomial regression determined case-fatality rate ratios (RRs) among states, adjusted for age, sex, race/ethnicity, year, and state-level markers of late HIV diagnosis.
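The two measures differ only in their denominators, which is why a state can rank low on one and high on the other; a toy calculation with invented numbers:

```python
# Case-fatality rate: HIV deaths per person-year lived with HIV.
# Conventional death rate: the same deaths per general-population person-year.
# All numbers below are invented for illustration.
hiv_deaths = 200
py_with_hiv = 10_000         # person-years among persons living with HIV
py_general = 4_000_000       # person-years in the state's general population

case_fatality = hiv_deaths / py_with_hiv * 1_000    # per 1,000 PY with HIV
conventional = hiv_deaths / py_general * 100_000    # per 100,000 general PY
print(case_fatality, conventional)                  # 20.0 vs 5.0
```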
Based on 3,096,729 HIV-infected person-years, the overall HIV case-fatality rate was 20.6/1,000 person-years (95% confidence interval [CI], 20.3–20.9). Age-adjusted rates by state ranged from 9.6 (95% CI 6.8–12.4) in Idaho to 32.9 (95% CI 29.8–36.0) in Mississippi, demonstrating significant differences across states, even after adjusting for race/ethnicity (p<0.0001). Many states with low conventional death rates had high case-fatality rates. Nine of the ten states with the highest case-fatality rates were located in the U.S. South.
Case-fatality rates complement and are not entirely concordant with conventional death rates. Interstate differences in these rates may reflect differences in secondary and tertiary prevention of HIV-related mortality among infected persons. These data suggest that, without targeted interventions, state-specific contextual barriers to care may impede improvements in healthcare quality and reductions in healthcare disparities.
case fatality rate; geographic factors; healthcare disparities; excess mortality; mortality determinants; surveillance; United States
The parametric g-formula can be used to contrast the distribution of potential outcomes under arbitrary treatment regimes. Like g-estimation of structural nested models and inverse probability weighting of marginal structural models, the parametric g-formula can appropriately adjust for measured time-varying confounders that are affected by prior treatment. However, there have been few implementations of the parametric g-formula to date. Here, we apply the parametric g-formula to assess the impact of highly active antiretroviral therapy (HAART) on time to AIDS or death in two US-based HIV cohorts including 1,498 participants. These participants contributed approximately 7,300 person-years of follow-up, of which 49% were exposed to HAART; 382 events occurred, and 259 participants were censored due to dropout. Using the parametric g-formula, we estimated that antiretroviral therapy substantially reduces the hazard of AIDS or death (HR = 0.55; 95% confidence limits [CL]: 0.42, 0.71). This estimate was similar to one previously reported using a marginal structural model (HR = 0.54; 95% CL: 0.38, 0.78). The 6.5-year difference in risk of AIDS or death was 13% (95% CL: 8%, 18%). Results were robust to assumptions about the temporal ordering and the extent of history modeled for time-varying covariates. The parametric g-formula is a viable alternative to inverse probability weighting of marginal structural models and g-estimation of structural nested models for the analysis of complex longitudinal data.
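Schematically, the algorithm fits parametric models for each time-varying covariate and for the outcome, then Monte Carlo-simulates the cohort forward under each fixed regime and contrasts the resulting risks. A minimal sketch on invented data (two intervals, one binary confounder; all coefficients and model forms are assumptions for illustration, not the analysis described above):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Minimal parametric g-formula sketch: binary confounder Z, treatment A,
# outcome Y over two intervals. The data-generating model is invented.
rng = np.random.default_rng(1)
n = 20_000
expit = lambda x: 1 / (1 + np.exp(-x))

# --- invented observational data ---
z0 = rng.binomial(1, 0.5, n)
a0 = rng.binomial(1, expit(-1 + z0), n)
z1 = rng.binomial(1, expit(-0.5 + z0 - a0), n)      # Z affected by prior A
a1 = rng.binomial(1, expit(-1 + z1 + 2 * a0), n)
y = rng.binomial(1, expit(-2 + z1 - 0.8 * (a0 + a1)), n)

# Step 1: fit parametric models for Z1 | Z0, A0 and Y | Z0, A0, Z1, A1.
mz = LogisticRegression().fit(np.column_stack([z0, a0]), z1)
my = LogisticRegression().fit(np.column_stack([z0, a0, z1, a1]), y)

# Steps 2-3: Monte Carlo-simulate under "always treat" vs "never treat"
# and contrast the mean outcome risks.
def risk(a_fixed, m=100_000):
    z0s = rng.binomial(1, z0.mean(), m)             # resample baseline Z
    a = np.full(m, a_fixed)
    pz1 = mz.predict_proba(np.column_stack([z0s, a]))[:, 1]
    z1s = rng.binomial(1, pz1)                      # simulate confounder
    py = my.predict_proba(np.column_stack([z0s, a, z1s, a]))[:, 1]
    return py.mean()

print("risk difference (always vs never treat):", risk(1) - risk(0))
```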
Cohort study; Confounding; g-formula; HIV/AIDS; Monte Carlo methods