To examine whether altered levels of adipokines, adipose-derived peptides associated with myocardial infarction in the general population, may contribute to subclinical coronary atherosclerosis in HIV-infected persons.
Nested cohort study.
We studied HIV-infected (HIV+) and HIV-uninfected (HIV−) men in the Multicenter AIDS Cohort Study with noncontrast CT to measure coronary artery calcium and regional adiposity; 75% additionally underwent coronary CT angiography to measure plaque composition and stenosis. Adiponectin and leptin levels were assessed. Multiple regression models were used to assess associations between adipokine levels and HIV disease parameters, regional adiposity, and plaque, adjusted for age, race, HIV serostatus, and CVD risk factors (RFs).
Significant findings were limited to adiponectin. HIV+ men (n=493) had lower adiponectin levels than HIV− men (n=250) after adjusting for CVD RFs (p < 0.0001), which became non-significant after adjustment for abdominal visceral and thigh subcutaneous adipose tissue. Among HIV+ men, lower adiponectin levels were associated with higher CD4+ T cell counts (p = 0.004), longer duration of antiretroviral therapy (p = 0.006), and undetectable HIV RNA levels (p = 0.04) after adjusting for age, race, and CVD RFs; only CD4+ cell count remained significant after further adjustment for adipose tissue. In both groups, lower adiponectin levels were associated with increased odds of coronary stenosis > 50% (p < 0.007). Lower adiponectin levels were associated with increased extent of plaque in HIV+ and of mixed plaque in HIV− men.
Adiponectin levels were lower in HIV-infected men and related to the severity of subclinical atherosclerosis, independent of traditional CVD risk factors.
Adipokines; adiponectin; leptin; heart; subclinical coronary atherosclerosis; metabolic side effects of HIV infection; coronary CT angiography; cardiac CT
Cytokine stimulation of B-cell proliferation may be an important etiologic mechanism for acquired immunodeficiency syndrome (AIDS)-related non-Hodgkin lymphoma (NHL). The Epstein-Barr virus (EBV) may be a co-factor, particularly for primary central nervous system (CNS) tumors, which are uniformly EBV-positive in the setting of AIDS. Thus, we examined associations of genetic variation in IL10 and related cytokine signaling molecules (IL10RA, CXCL12, IL13, IL4, IL4R, CCL5 and BCL6) with AIDS-related NHL risk and evaluated differences between primary CNS and systemic tumors. We compared 160 Multicenter AIDS Cohort Study (MACS) participants with incident lymphomas, of which 90 followed another AIDS diagnosis, to HIV-1-seropositive controls matched on duration of lymphoma-free survival post-HIV-1 infection (N=160) or post-AIDS diagnosis (N=90). We fit conditional logistic regression models to estimate odds ratios (ORs) and 95% confidence intervals (95% CIs). Carriage of at least one copy of the T allele for IL10 rs1800871 (as compared to no copies) was associated with decreased AIDS-NHL risk specific to lymphomas arising in the CNS (CC vs. CT/TT: OR=0.3; 95% CI: 0.1, 0.7) but not systemically (CC vs. CT/TT: OR=1.0; 95% CI: 0.5, 1.9) (P for heterogeneity = 0.03). Carriage of the “low IL10” haplotype rs1800896_A/rs1800871_T/rs1800872_A was associated with decreased lymphoma risk that varied with the number of copies carried (P for trend = 0.02). None of the ORs for the other studied polymorphisms was significantly different from 1.0. Excessive IL10 response to HIV-1 infection may be associated with increased risk of NHL, particularly in the CNS. IL10 dysregulation may be an important etiologic pathway for EBV-related lymphomagenesis.
cytokine; SNPs; AIDS-related lymphoma
Formulae used to estimate glomerular filtration rate (GFR) underestimate higher GFRs and have not been well-studied in HIV-infected (HIV(+)) people; we evaluated the relationships of HIV infection and known or potential risk factors for kidney disease with directly measured GFR and the presence of chronic kidney disease (CKD).
Cross-sectional measurement of iohexol-based GFR (iGFR) in HIV(+) men (n = 455) receiving antiretroviral therapy, and HIV-uninfected (HIV(−)) men (n = 258) in the Multicenter AIDS Cohort Study.
iGFR was calculated from disappearance of infused iohexol from plasma. Determinants of GFR and the presence of CKD were compared using iGFR and GFR estimated by the CKD-Epi equation (eGFR).
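The plasma-disappearance calculation can be sketched as a one-compartment (slope-intercept) approximation: fit a log-linear decay to the timed plasma samples and take clearance as dose divided by the area under the concentration-time curve. This is a simplified illustration, not the study's exact protocol; the function name, sampling times, and dose below are hypothetical.

```python
import math

def igfr_single_compartment(dose_mg, times_min, conc_mg_per_l, bsa_m2):
    """One-compartment sketch of iohexol clearance (not the study's exact
    protocol): fit ln C(t) = ln C0 - k*t by least squares, then
    GFR = dose / AUC = dose * k / C0, indexed to 1.73 m^2 body surface area."""
    n = len(times_min)
    logc = [math.log(c) for c in conc_mg_per_l]
    mean_t = sum(times_min) / n
    mean_lc = sum(logc) / n
    sxx = sum((t - mean_t) ** 2 for t in times_min)
    sxy = sum((t - mean_t) * (lc - mean_lc) for t, lc in zip(times_min, logc))
    k = -sxy / sxx                                  # elimination rate constant (1/min)
    c0 = math.exp(mean_lc + k * mean_t)             # back-extrapolated C(0), mg/L
    clearance_ml_min = 1000.0 * dose_mg * k / c0    # dose/AUC: L/min -> ml/min
    return clearance_ml_min * 1.73 / bsa_m2         # ml/min/1.73 m^2
```

With exactly exponential data (e.g., a 3,235 mg dose distributing in 10 L, k = 0.01/min), the fit recovers a clearance of 100 ml/min/1.73 m^2.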
Median iGFR was higher among HIV(+) than HIV(−) men (109 vs. 106 ml/min/1.73 m2, respectively, p = .046), and was 7 ml/min higher than median eGFR. Mean iGFR was lower in men who were older, had chronic hepatitis C virus (HCV) infection, or had a history of AIDS. Low iGFR (≤90 ml/min/1.73 m2) was associated with these factors and with black race. Other than age, factors associated with low iGFR were not observed with low eGFR. CKD was more common in HIV(+) than HIV(−) men; predictors of CKD were similar using iGFR and eGFR.
iGFR was higher than eGFR in this population of HIV-infected and -uninfected men who have sex with men. Presence of CKD was predicted equally well by iGFR and eGFR, but associations of chronic HCV infection and history of clinically-defined AIDS with mildly decreased GFR were seen only with iGFR.
To compare neuropsychological test performance before and after HIV-1 seroconversion in order to identify possible acute changes in psychomotor speed, memory, attention, and concentration secondary to seroconversion.
Mixed effects models to examine longitudinal neuropsychological test data.
We conducted a nested cohort study of 362 male HIV-1 seroconverters enrolled in the Multicenter AIDS Cohort Study. We used linear mixed models with random subject effects to compare repeated neuropsychological test outcomes from 5 years before seroconversion to 2 years after seroconversion on the Trail Making Test (Parts A & B), Symbol-Digit Test, Grooved Pegboard (dominant and non-dominant hands), Stroop Color-Interference Test, Rey Auditory Verbal Learning Test, and the CalCAP Reaction Time Test.
We found no significant changes in the time-dependent score after seroconversion for the majority of neuropsychological tests used in the Multicenter AIDS Cohort Study. There was a significant change in time trend after seroconversion on part B of the Trail Making Test (p=0.042) but the difference only represented a 2% decrease in performance. We found the following characteristics to be associated with worse neuropsychological test performance: lower education levels, history of depression, older age, and no previous neurocognitive testing (p<.05).
Our results suggest that despite a 50% decrease in CD4 cell count immediately following infection, HIV-1 does not appear to have a measurable effect on psychomotor or complex cognitive processing for up to 2 years following infection, using this set of neurocognitive measures.
HIV; neuropsychological test; seroconversion; CD4; neurocognition
CXCL13 and CXCR5 are a chemokine and receptor pair whose interaction is critical for naïve B cell trafficking and activation within germinal centers. We sought to determine whether CXCL13 levels are elevated prior to HIV-associated non-Hodgkin B-cell lymphoma (AIDS-NHL), and whether polymorphisms in CXCL13 or CXCR5 are associated with AIDS-NHL risk and CXCL13 levels in a large cohort of HIV-infected men.
CXCL13 levels were measured in sera from 179 AIDS-NHL cases and 179 controls at three time-points. TagSNPs in CXCL13 (n=16) and CXCR5 (n=11) were genotyped in 183 AIDS-NHL cases and 533 controls. Odds ratios (OR) and 95% confidence intervals (CIs) for the associations between one unit increase in log CXCL13 levels and AIDS-NHL, as well as tagSNP genotypes and AIDS-NHL, were computed using logistic regression. Mixed linear regression was used to estimate mean ratios (MR) for the association between tagSNPs and CXCL13 levels.
CXCL13 levels were elevated >3 years (OR=3.24, 95% CI=1.90–5.54), 1–3 years (OR=3.39, 95% CI=1.94–5.94) and 0–1 year (OR=3.94, 95% CI=1.98–7.81) prior to an AIDS-NHL diagnosis. The minor allele of CXCL13 rs355689 was associated with reduced AIDS-NHL risk (OR for TC vs. TT = 0.65; 95% CI=0.45–0.96) and reduced CXCL13 levels (MR for CC vs. TT = 0.82, 95% CI=0.68–0.99). The minor allele of CXCR5 rs630923 was associated with increased CXCL13 levels (MR for AA vs. TT = 2.40, 95% CI=1.43–4.50).
CXCL13 levels were elevated preceding an AIDS-NHL diagnosis, genetic variation in CXCL13 may contribute to AIDS-NHL risk, and CXCL13 levels may be associated with genetic variation in CXCL13 and CXCR5.
CXCL13 may serve as a biomarker for early AIDS-NHL detection.
Non-Hodgkin Lymphoma; HIV; CXCL13; CXCR5; chemokine
Background. Accurate testing algorithms are needed for estimating human immunodeficiency virus (HIV) incidence from cross-sectional surveys.
Methods. We developed a multiassay algorithm (MAA) for HIV incidence that includes the BED capture enzyme immunoassay (BED-CEIA), an antibody avidity assay, HIV load, and CD4+ T-cell count. We analyzed 1782 samples from 709 individuals in the United States who had a known duration of HIV infection (range, 0 to >8 years). Logistic regression with cubic splines was used to compare the performance of the MAA to the BED-CEIA and to determine the window period of the MAA. We compared the annual incidence estimated with the MAA to the annual incidence based on HIV seroconversion in a longitudinal cohort.
Results. The MAA had a window period of 141 days (95% confidence interval [CI], 94–150) and a very low false-recent misclassification rate (only 0.4% of 1474 samples from subjects infected for >1 year were misclassified as indicative of recent infection). In a cohort study, annual incidence based on HIV seroconversion was 1.04% (95% CI, .70%–1.55%). The incidence estimate obtained using the MAA was essentially identical: 0.97% (95% CI, .51%–1.71%).
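The incidence estimate follows the standard cross-sectional ("snapshot") relation: the number of individuals classified as recently infected, divided by the number at risk, annualized by the assay's mean window period. A minimal sketch using the MAA's 141-day window (the survey counts below are hypothetical):

```python
def cross_sectional_incidence(n_recent, n_hiv_negative, window_days):
    """Annual HIV incidence from a single cross-sectional survey:
    recent infections per susceptible person, scaled by the mean
    window period of the recency assay."""
    window_years = window_days / 365.25
    return n_recent / (n_hiv_negative * window_years)
```

For example, 10 recent classifications among 2,500 HIV-negative participants with a 141-day window gives roughly 1% per year, the order of magnitude reported above.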
Conclusions. The MAA is as sensitive for detecting recent HIV infection as the BED-CEIA and has a very low rate of false-recent misclassification. It provides a powerful tool for cross-sectional HIV incidence determination.
HIV; incidence testing; United States; epidemiology
Parametric and semiparametric competing risks methods were used to estimate proportions, timing, and predictors of acquired immune deficiency syndrome (AIDS)-related and non-AIDS-related mortality among individuals both positive and negative for the human immunodeficiency virus (HIV) in the Multicenter AIDS Cohort Study (MACS) and Women's Interagency HIV Study (WIHS) from 1984 to 2008 and 1996 to 2008, respectively. Among HIV-positive MACS participants, the proportion of deaths unrelated to AIDS increased from 6% before the introduction of highly active antiretroviral therapy (HAART) (before 1996) to 53% in the HAART era (P < 0.01); the median age of persons who died from non-AIDS-related causes after age 35 years increased from 49.0 to 66.0 years (P < 0.01). In both cohorts during the HAART era, median ages at time of non-AIDS-related death were younger for HIV-positive individuals than for comparable HIV-negative individuals (8.7 years younger in MACS (P < 0.01) and 7.6 years younger in WIHS (P < 0.01)). In a multivariate proportional cause-specific hazards model, unemployment (for non-AIDS death, hazard ratio (HR) = 1.8; for AIDS death, HR = 2.3), depression (for non-AIDS death, HR = 1.4; for AIDS death, HR = 1.4), and hepatitis B or C infection (for non-AIDS death, HR = 1.8; for AIDS death, HR = 1.4) were significantly (P < 0.05) associated with higher hazards of both non-AIDS and AIDS mortality among HIV-positive individuals in the HAART era, independent of study cohort. The results illuminate the changing face of mortality among the growing population infected with HIV.
acquired immunodeficiency syndrome; antiretroviral therapy, highly active; cohort studies; competing risks; HIV; mixture model; mortality; proportional hazards models
Combination antiretroviral therapy (ART) has significantly increased survival among HIV-positive adults in the United States (U.S.) and Canada, but gains in life expectancy for this region have not been well characterized. We aimed to estimate temporal changes in life expectancy among HIV-positive adults on ART from 2000 to 2007 in the U.S. and Canada.
Participants were from the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD), aged ≥20 years and on ART. Mortality rates were calculated using participants' person-time from January 1, 2000 or ART initiation until death, loss to follow-up, or administrative censoring December 31, 2007. Life expectancy at age 20, defined as the average number of additional years that a person of a specific age will live, provided the current age-specific mortality rates remain constant, was estimated using abridged life tables.
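The abridged life-table computation described above can be sketched as follows, assuming deaths occur evenly within each age band; the function and the constant-rate check are illustrative, not NA-ACCORD's published code.

```python
def life_expectancy_at_20(rates, width=5):
    """Life expectancy at age 20 from age-specific mortality rates.

    `rates` holds deaths per person-year for successive `width`-year age
    bands starting at age 20; the last rate covers the open-ended final
    interval. Abridged life-table sketch assuming mid-interval deaths."""
    survivors = 1.0
    person_years = 0.0
    for m in rates[:-1]:
        q = width * m / (1.0 + width * m / 2.0)      # interval death probability
        deaths = survivors * q
        person_years += width * (survivors - deaths / 2.0)
        survivors -= deaths
    person_years += survivors / rates[-1]            # open-ended last interval
    return person_years
```

As a sanity check, a constant hazard m at every age yields a life expectancy of exactly 1/m under this scheme (e.g., 50 years for m = 0.02 deaths per person-year).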
Among 22,937 individuals contributing 82,022 person-years with 1,622 deaths, the crude mortality rate was 19.8/1,000 person-years. Life expectancy increased from 36.1 [standard error (SE) 0.5] to 51.4 [SE 0.5] years from 2000–2002 to 2006–2007. Men and women had comparable life expectancies in all periods except the last (2006–2007). Life expectancy was lower for individuals with a history of injection drug use, for non-whites, and for patients with baseline CD4 counts <350 cells/mm3.
A 20-year-old HIV-positive adult on ART in the U.S. or Canada is expected to live into their early 70s, a life expectancy approaching that of the general population. Differences by sex, race, HIV transmission risk group, and CD4 count remain.
Chronic kidney disease and HIV infection both independently increase the risk of anemia. It is not known whether individuals with both HIV infection and kidney dysfunction are at greater than expected risk of anemia resulting from the combined effect of these factors. Men from the Multicenter AIDS Cohort Study with AIDS-free time after 1996 were included in the analysis if they had an initial hemoglobin value greater than 13 g/dl and available serum creatinine measurements for the estimation of glomerular filtration rate. Hemoglobin data were fit parametrically using a linear mixed effects model, and effects of medication use on hemoglobin levels were removed using censoring methods. The effect of both HIV infection and glomerular filtration rate less than 60 ml/min/1.73 m2 on the mean hemoglobin value was assessed. The risk of having anemia (hemoglobin level falling below 13 g/dl) was estimated. There were 862 HIV-infected and 1,214 HIV-uninfected men who contributed to the analysis. Adjusting for age, hemoglobin values across all 17,341 person-visits were lower in HIV-infected AIDS-free men with impaired kidney function (difference −0.22 g/dl; 95% CI: −0.42, −0.03) compared to men with either HIV infection or impaired kidney function, but not both. HIV-infected AIDS-free men with impaired kidney function had a higher risk of anemia, by 1.2%, compared to HIV-uninfected men with normal kidney function. Comorbid conditions and medication use did not explain this increase in risk. HIV infection and impaired kidney function have a combined impact on lowering hemoglobin levels, resulting in a higher risk of anemia.
U.S. state AIDS Drug Assistance Programs (ADAPs) are federally funded to provide antiretroviral therapy (ART) as the payer of last resort to eligible persons with HIV infection. States differ regarding their financial contributions to and ways of implementing these programs, and it remains unclear how this interstate variability affects HIV treatment outcomes.
We analyzed data from HIV-infected individuals who were clinically eligible for ART between 2001 and 2009 (i.e., a first reported CD4+ count <350 cells/µL or an AIDS-defining illness) from 14 U.S. cohorts of the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD). Using propensity score matching and Cox regression, we assessed ART initiation (within 6 months following eligibility) and virologic suppression (within 1 year) based on differences in two state ADAP features: the amount of state funding in annual ADAP budgets and the implementation of waiting lists. We performed an a priori subgroup analysis in persons with a history of injection drug use (IDU).
Among 8,874 persons, 56% initiated ART within six months following eligibility. Persons living in states with no additional state contribution to the ADAP budget initiated ART on a less timely basis (hazard ratio [HR] 0.73, 95% CI 0.60–0.88). Living in a state with an ADAP waiting list was not associated with less timely initiation (HR 1.12, 95% CI 0.87–1.45). Neither additional state contributions nor waiting lists were significantly associated with virologic suppression. Persons with an IDU history initiated ART on a less timely basis (HR 0.67, 95% CI 0.47–0.95).
We found that living in states that did not contribute additionally to the ADAP budget was associated with delayed ART initiation when treatment was clinically indicated. Given the changing healthcare environment, continued assessment of the role of ADAPs and their features that facilitate prompt treatment is needed.
To study the incidence and pattern of neurologic disorders in a large cohort of HIV-positive men, compared with HIV-negative men, in the era of highly active antiretroviral therapy (HAART).
The Multicenter AIDS Cohort Study is a prospective study of men who have sex with men enrolled in 4 cities in the United States. We compared HIV-positive vs HIV-negative men for incidence and category of neurologic diagnoses in the HAART era (July 1, 1996, to last known follow-up or death, on or before July 1, 2011).
There were 3,945 participants alive during the HAART era (2,083 HIV negative, 1,776 HIV positive, and 86 who became infected with HIV during the study period) including 3,427 who were older than 40 years of age. Median age at first neurologic diagnosis among all participants alive in the HAART era was lower in HAART-treated HIV-positive vs HIV-negative men (48 vs 57 years of age, p < 0.001). Incidence of neurologic diagnoses was higher in HAART-treated HIV-positive vs HIV-negative men (younger than 40 years: 11.4 vs 0 diagnoses per 1,000 person-years [p < 0.001]; 40–49 years: 11.6 vs 2.0 [p < 0.001]; 50–60 years: 15.1 vs 3.0 [p < 0.001]; older than 60 years: 17.0 vs 5.7 [p < 0.01]). Excess neurologic disease was found in the categories of nervous system infections (p < 0.001), dementia (p < 0.001), seizures/epilepsy (p < 0.01), and peripheral nervous system disorders (p < 0.001), but not stroke (p = 0.60).
HIV-positive men receiving HAART have a higher burden of neurologic disease than HIV-negative men and develop neurologic disease at younger ages.
Too many people with HIV have left the job market permanently and those with reduced work capacity have been unable to keep their jobs. There is a need to examine the health effects of labor force participation in people with HIV. This study presents longitudinal data from 1,415 HIV-positive men who have sex with men taking part in the Multicenter AIDS Cohort Study. Generalized Estimating Equations show that employment is associated with better physical and mental health quality of life and suggests that there may be an adaptation process to the experience of unemployment. Post-hoc analyses also suggest that people who are more physically vulnerable may undergo steeper health declines due to job loss than those who are generally healthier. However, this may also be the result of a selection effect whereby poor physical health contributes to unemployment. Policies that promote labor force participation may not only increase employment rates but also improve the health of people living with HIV.
The neuropathogenesis of HIV-associated neurocognitive disorders (HAND) is unclear. Candidate gene studies have implicated genetic susceptibility loci within immune-related genes; however, these have not been reliably validated. Here we employed genome-wide association (GWA) methods to discover novel genetic susceptibility loci associated with HAND, and validate susceptibility loci implicated in prior candidate gene studies.
Data from 1287 participants enrolled in the Multicenter AIDS Cohort Study between 1985 and 2010 were used. Genotyping was conducted on the Illumina 1M, 1MDuo, or 550K platform. Linear mixed models determined subject-specific slopes for change over time in processing speed and executive functioning, considering all visits including baseline and the most recent study visit. Covariates modeled as fixed effects included: time since the first visit, depression severity, nadir CD4+ T-cell count, hepatitis C co-infection, substance use, and antiretroviral medication regimen. Prevalent HIV-associated dementia (HAD) and neurocognitive impairment (NCI) were also examined as neurocognitive phenotypes in a case-control analysis.
No genetic susceptibility loci were associated with decline in processing speed or executive functioning among almost 2.5 million single nucleotide polymorphisms (SNPs) directly genotyped or imputed. No associations between the SNPs and HAD or NCI were found. Previously reported associations between specific genetic susceptibility loci, HIV-associated neurocognitive impairment, and HAD were not validated.
In this first GWAS of HAND, no novel or previously identified genetic susceptibility loci were associated with any of the phenotypes examined. Due to the relatively small sample size, future collaborative efforts that incorporate this dataset may still yield important findings.
HIV; NeuroAIDS; HIV-associated neurocognitive disorder; genome-wide association; HIV-associated dementia
Highly active antiretroviral therapy (HAART) rapidly suppresses human immunodeficiency virus (HIV) viral replication and reduces circulating viral load, but the long-term effects of HAART on viral load remain unclear.
We evaluated HIV viral load trajectories over 8 years following HAART initiation in the Multicenter AIDS Cohort Study and the Women’s Interagency HIV Study. The study included 157 HIV-infected men and 199 HIV-infected women who were antiretroviral naïve and contributed 1311 and 1837 semiannual person-visits post-HAART, respectively. To account for within-subject correlation and the high proportion of left-censored viral loads, we used a segmental Bernoulli/lognormal random effects model.
Approximately 3 months (0.30 years for men and 0.22 years for women) after HAART initiation, HIV viral loads were optimally suppressed (ie, with very low HIV RNA) for 44% (95% confidence interval = 39%–49%) of men and 43% (38%–47%) of women, whereas the other 56% of men and 57% of women had on average 2.1 (1.5–2.6) and 3.0 (2.7–3.2) log10 copies/mL, respectively.
After 8 years on HAART, 75% of men and 80% of women had optimal suppression, whereas the rest of the men and women had suboptimal suppression with a median HIV RNA of 3.1 and 3.7 log10 copies/mL, respectively.
We assessed associations of herpes simplex virus types 1 and 2 (HSV-1 and -2), cytomegalovirus (CMV), and human herpesvirus 8 (HHV-8) infection with subclinical coronary atherosclerosis in 291 HIV-infected men in the Multicenter AIDS Cohort Study. Coronary artery calcium (CAC) was measured by non-contrast coronary CT imaging. Markers of herpesvirus infection were measured in frozen specimens collected 10–12 years prior to case identification. Multivariable logistic regression and ordinal logistic regression models were fit. HSV-2 seropositivity was associated with coronary atherosclerosis (adjusted odds ratio [AOR]=4.12, 95% confidence interval [CI]=1.58–10.85) after adjustment for age, race/ethnicity, cardiovascular risk factors, and HIV infection-related factors. Infection with a greater number of herpesviruses was associated with elevated CAC levels (AOR=1.58, 95% CI=1.06–2.36). Our findings suggest HSV-2 may be a risk factor for subclinical coronary atherosclerosis in HIV-infected men. Infection with multiple herpesviruses may contribute to the increased burden of atherosclerosis.
herpesvirus; HSV-2; atherosclerosis; HIV-1/AIDS; risk factors
The BED capture enzyme immunoassay (BED-CEIA) was developed for estimating HIV incidence from cross-sectional data. This assay misclassifies some individuals with nonrecent HIV infection as recently infected, leading to overestimation of HIV incidence. We analyzed factors associated with misclassification by the BED-CEIA. We analyzed samples from 383 men who were diagnosed with HIV infection less than 1 year after a negative HIV test (Multicenter AIDS Cohort Study). Samples were collected 2–8 years after HIV seroconversion, which was defined as the midpoint between the last negative and first positive HIV test. Samples were analyzed using the BED-CEIA with a cutoff of OD-n ≤0.8 for recent infection. Logistic regression was used to identify factors associated with misclassification. Ninety-one (15.1%) of 603 samples were misclassified. In multivariate models, misclassification was independently associated with highly active antiretroviral treatment (HAART) for >2 years, HIV RNA <400 copies/ml, and CD4 cell count <50 or <200 cells/mm3; adjusted odds ratios (OR) and 95% confidence intervals (CI) were 4.72 (1.35–16.5), 3.96 (1.53–10.3), 6.85 (2.71–17.4), and 11.5 (3.64–36.0), respectively. Among 220 men with paired samples, misclassification 2–4 years after seroconversion was significantly associated with misclassification 6–8 years after seroconversion [adjusted OR: 25.8 (95% CI: 8.17–81.5), p<0.001] after adjusting for race, CD4 cell count, HIV viral load, and HAART use. Low HIV viral load, low CD4 cell count, and >2 years of HAART were significantly associated with misclassification using the BED-CEIA. Some men were persistently misclassified as recently infected up to 8 years after HIV seroconversion.
The methodology for use of cardiac CT angiography (CTA) in low-risk populations is not well defined. To provide a reference for future studies, we describe the CTA methodology being used in an epidemiologic study, the Multicenter AIDS Cohort Study (MACS).
The Multicenter AIDS Cohort Study (MACS) is an ongoing multicenter, prospective, observational cohort study. The MACS Cardiovascular Disease substudy plans to enroll 800 men (n=575 HIV-seropositive and n=225 HIV-seronegative) aged 40–75 years for coronary atherosclerosis imaging using cardiac CTA. The protocol includes heart rate (HR) optimization with beta blockers; use of a proper field of view; scan length limitation; and prospective ECG-gating using the lowest beam voltage possible. All scans are evaluated for presence, extent, and composition of coronary atherosclerosis; left atrial volumes; left ventricular volume and mass; and non-coronary cardiac pathology.
The first 498 participants had an average radiation dose of 2.5±1.6 millisieverts (mSv) for the cardiac CTA study. Overall quality of scans was fair to excellent in 98.6% of studies. There were three significant adverse events: two allergic reactions to contrast and one subcutaneous contrast extravasation.
Cardiac CTA was safe, afforded a low effective radiation exposure to these asymptomatic research participants, and provided valuable cardiovascular endpoints for scientific analysis. The cardiac CTA methodology described here may serve as a reference for future epidemiologic studies aiming to assess coronary atherosclerosis and cardiac anatomy in low-risk populations while minimizing radiation exposure.
CT angiography; radiation dose; epidemiological study
In a large North American cohort study, anal cancer incidence rates were substantially higher for HIV-infected men who have sex with men, other men, and women compared with HIV-uninfected individuals. Rates increased from 1996–1999 to 2000–2003 but plateaued by 2004–2007.
Background. Anal cancer is one of the most common cancers affecting individuals infected with human immunodeficiency virus (HIV), although few have evaluated rates separately for men who have sex with men (MSM), other men, and women. There are also conflicting data regarding calendar trends.
Methods. In a study involving 13 cohorts from North America with follow-up between 1996 and 2007, we compared anal cancer incidence rates among 34,189 HIV-infected (55% MSM, 19% other men, 26% women) and 114,260 HIV-uninfected individuals (90% men).
Results. Among men, the unadjusted anal cancer incidence rates per 100,000 person-years were 131 for HIV-infected MSM, 46 for other HIV-infected men, and 2 for HIV-uninfected men, corresponding to demographically adjusted rate ratios (RRs) of 80.3 (95% confidence interval [CI], 42.7–151.1) for HIV-infected MSM and 26.7 (95% CI, 11.5–61.7) for other HIV-infected men compared with HIV-uninfected men. HIV-infected women had an anal cancer rate of 30/100,000 person-years, and no cases were observed for HIV-uninfected women. In a multivariable Poisson regression model, among HIV-infected individuals, the risk was higher for MSM compared with other men (RR, 3.3; 95% CI, 1.8–6.0), but no difference was observed comparing women with other men (RR, 1.0; 95% CI, 0.5–2.2). In comparison with the period 2000–2003, HIV-infected individuals had an adjusted RR of 0.5 (95% CI, .3–.9) in 1996–1999 and 0.9 (95% CI, .6–1.2) in 2004–2007.
Conclusions. Anal cancer rates were substantially higher for HIV-infected MSM, other men, and women compared with HIV-uninfected individuals, suggesting a need for universal prevention efforts. Rates increased after the early antiretroviral therapy era and then plateaued.
Plasma HIV-1 RNA was measured in 306 samples, collected from 273 highly active antiretroviral therapy (HAART)-experienced men, using both the Roche COBAS TaqMan (limit of detection [LD]=20 copies/mL) and Roche Amplicor (LD=50 copies/mL) assays. Mixtures of Gaussian distributions incorporating left-censored data were used in analyses. The more sensitive TaqMan assay estimated that 23% and 0.0003% of HIV-1 RNA values would be below 1 copy/mL and 1 copy/3L, respectively. This is in sharp contrast to the overestimation provided by the less sensitive Amplicor assay, whereby the corresponding predicted percentages were 51% and 1%. Both assays appropriately characterized suboptimal virologic response, as the rightmost peaks of both distributions provided an excellent fit to the observed data. Our results, based on a widely available assay sensitive to 20 copies/mL, reproduce those obtained using customized assays that quantified HIV-1 RNA values as low as 1 copy/mL.
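The predicted fraction of values below a given threshold follows directly from a fitted Gaussian mixture on the log10 scale: each component contributes its weight times a normal CDF evaluated at the threshold. A minimal sketch (the component parameters in the example are hypothetical, not the fitted MACS values):

```python
import math

def frac_below(threshold_log10, weights, mus, sigmas):
    """Fraction of log10 HIV-1 RNA values predicted below a threshold,
    from a fitted Gaussian mixture. Each component contributes
    weight * Phi((threshold - mu) / sigma), with Phi via math.erf."""
    total = 0.0
    for w, mu, s in zip(weights, mus, sigmas):
        z = (threshold_log10 - mu) / (s * math.sqrt(2.0))
        total += w * 0.5 * (1.0 + math.erf(z))
    return total
```

For a bimodal fit, the leftmost (suppressed) component dominates the mass below low thresholds while the rightmost peak captures suboptimal response.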
limit of detection; HIV-1 RNA; bimodal distribution; HAART
Although HLA-B*57 (B57) is associated with slow progression to disease following HIV-1 infection, B57 heterozygotes display a wide spectrum of outcomes, including rapid progression, viremic slow progression, and elite control. Efforts to identify differences between B57-positive (B57+) slow progressors and B57+ rapid progressors have largely focused on cytotoxic T lymphocyte (CTL) phenotypes and specificities during chronic stages of infection. Although CTL responses in the early months of infection are likely to be the most important for the long-term rate of HIV-1 disease progression, few data on the early CTL responses of eventual slow progressors have been available. Utilizing the Multicenter AIDS Cohort Study (MACS), we retrospectively examined the early HIV-1-specific CTL responses of 14 B57+ individuals whose time to development of disease ranged from 3.5 years to longer than 25 years after infection. In general, a greater breadth of targeting of epitopes from structural proteins, especially Gag, as well as of highly conserved epitopes from any HIV-1 protein, correlated with longer times until disease. The single elite controller in the cohort was an outlier on several correlations of CTL targeting and time until disease, consistent with reports that elite control is typically not achieved solely by protective HLA-mediated CTLs. When targeting of individual epitopes was analyzed, we found that early CTL responses to the IW9 (ISPRTLNAW) epitope of Gag, while generally subdominant, correlated with delayed progression to disease. This is the first study to identify early CTL responses to IW9 as a correlate of protection in persons with HLA-B*57.
The potential for changing HIV-1 virulence has significant implications for the AIDS epidemic, including changing HIV transmission rates, rapidity of disease progression, and timing of ART. Published data to date have provided conflicting results.
We conducted a meta-analysis of changes in baseline CD4+ T-cell counts and set point plasma viral RNA load over time in order to establish whether summary trends are consistent with changing HIV-1 virulence.
We searched PubMed for studies of trends in HIV-1 prognostic markers of disease progression and supplemented findings with publications referenced in epidemiological or virulence studies. We identified 12 studies of trends in baseline CD4+ T-cell counts (21 052 total individuals) and 8 studies of trends in set point viral loads (10 785 total individuals), spanning the years 1984–2010. Using random-effects meta-analysis, we estimated summary effect sizes for trends in HIV-1 plasma viral loads and CD4+ T-cell counts.
Baseline CD4+ T-cell counts showed a summary trend of decreasing cell counts [effect=−4.93 cells/µl per year, 95% confidence interval (CI) −6.53 to −3.3]. Set point viral loads showed a summary trend of increasing plasma viral RNA loads (effect=0.013 log10 copies/ml per year, 95% CI −0.001 to 0.03). The trend rates decelerated in recent years for both prognostic markers.
Our results are consistent with increased virulence of HIV-1 over the course of the epidemic. Extrapolating over the 30 years since the first description of AIDS, this represents a CD4+ T-cell loss of approximately 148 cells/µl and a gain of 0.39 log10 copies/ml of viral RNA measured during early infection. These effect sizes would predict increasing rates of disease progression and need for ART, as well as increasing transmission risk.
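The random-effects pooling described above can be sketched with the DerSimonian-Laird estimator; the per-study slopes and variances below are illustrative placeholders, not the paper's data.

```python
import numpy as np

def random_effects(effects, variances):
    """DerSimonian-Laird random-effects summary estimate with 95% CI."""
    w = 1.0 / variances                          # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * effects) / np.sum(w)
    Q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q heterogeneity statistic
    df = len(effects) - 1
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / C)                # between-study variance estimate
    w_star = 1.0 / (variances + tau2)            # weights inflated by heterogeneity
    summary = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return summary, (summary - 1.96 * se, summary + 1.96 * se)

# hypothetical per-study trends in baseline CD4+ count (cells/µl per year)
effects = np.array([-6.1, -4.2, -5.5, -3.0, -4.8])
variances = np.array([1.2, 0.8, 2.0, 1.5, 0.9])
est, (lo, hi) = random_effects(effects, variances)
print(round(est, 2), round(lo, 2), round(hi, 2))
```

A random-effects model is the natural choice here because the underlying studies span different cohorts, calendar periods, and assays, so the true per-study trends are unlikely to be identical.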
CD4 lymphocyte count; disease progression; HIV infections/epidemiology; HIV infections/virology; HIV pathogenicity; viral load/trends; virulence
To estimate the association of antiretroviral therapy initiation with incident acquired immunodeficiency syndrome (AIDS) or death while accounting for time-varying confounding in a cost-efficient manner, the authors combined a case-cohort study design with inverse probability-weighted estimation of a marginal structural Cox proportional hazards model. A total of 950 adults who were positive for human immunodeficiency virus type 1 were followed in 2 US cohort studies between 1995 and 2007. In the full cohort, 211 AIDS cases or deaths occurred during 4,456 person-years. In an illustrative 20% random subcohort of 190 participants, 41 AIDS cases or deaths occurred during 861 person-years. Accounting for measured confounders and determinants of dropout by inverse probability weighting, the full cohort hazard ratio was 0.41 (95% confidence interval: 0.26, 0.65) and the case-cohort hazard ratio was 0.47 (95% confidence interval: 0.26, 0.83). Standard multivariable-adjusted hazard ratios were closer to the null, regardless of study design. The precision lost with the case-cohort design was modest given the cost savings. Results from Monte Carlo simulations demonstrated that the proposed approach yields approximately unbiased estimates of the hazard ratio with appropriate confidence interval coverage. Marginal structural model analysis of case-cohort study designs provides a cost-efficient design coupled with an accurate analytic method for research settings in which there is time-varying confounding.
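The core of the inverse-probability-weighted analysis is the stabilized weight assigned to each person-period. A minimal sketch on simulated data follows; the treatment model here is known by construction, whereas in practice the denominator is an estimated model of therapy initiation given time-varying confounders, and the case-cohort design contributes an additional sampling weight.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
L = rng.normal(size=n)                               # time-varying confounder (e.g. CD4 count)
p_treat = 1.0 / (1.0 + np.exp(-(-0.5 + 0.8 * L)))    # true P(initiate therapy | L)
A = rng.binomial(1, p_treat)                         # therapy initiation indicator

# stabilized weight: marginal P(A = a) over conditional P(A = a | L)
p_marg = A.mean()
num = np.where(A == 1, p_marg, 1.0 - p_marg)
den = np.where(A == 1, p_treat, 1.0 - p_treat)
sw = num / den
print(sw.mean())  # stabilized weights average near 1 when the models are correct
```

Fitting a Cox model with these weights (times any dropout and case-cohort sampling weights) yields the marginal structural hazard ratio; weighting, unlike covariate adjustment, does not block the pathway by which past treatment affects future confounders, which is why the weighted estimates above moved further from the null than the standard multivariable-adjusted ones.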
acquired immunodeficiency syndrome; case-cohort studies; cohort studies; confounding bias; HIV; pharmacoepidemiology; selection bias
The U.S. National HIV/AIDS Strategy targets for 2015 include increasing access to care and improving health outcomes for persons living with HIV in the United States (PLWH-US).
To demonstrate the utility of the NA-ACCORD (North American AIDS Cohort Collaboration on Research and Design) for monitoring trends in the HIV epidemic in the United States and to present trends in HIV treatment and related health outcomes.
Trends from annual cross-sectional analyses comparing patients from pooled, multicenter, prospective, clinical HIV cohort studies with PLWH-US, as reported to national surveillance systems in 40 states.
U.S. HIV outpatient clinics.
HIV-infected adults with 1 or more HIV RNA plasma viral load (HIV VL) or CD4 T-lymphocyte (CD4) cell count measured in any calendar year from 1 January 2000 to 31 December 2008.
Annual rates of antiretroviral therapy use, HIV VL, and CD4 cell count at death.
45 529 HIV-infected persons received care in an NA-ACCORD–participating U.S. clinical cohort from 2000 to 2008. In 2008, the 26 030 NA-ACCORD participants in care and the 655 966 PLWH-US had qualitatively similar demographic characteristics. From 2000 to 2008, the proportion of participants prescribed highly active antiretroviral therapy increased by 9 percentage points to 83% (P < 0.001), whereas the proportion with suppressed HIV VL (≤2.7 log10 copies/mL) increased by 26 percentage points to 72% (P < 0.001). Median CD4 cell count at death more than tripled to 0.209 × 10⁹ cells/L (P < 0.001).
The usual limitations of observational data apply.
The NA-ACCORD is the largest cohort of HIV-infected adults in clinical care in the United States that is demographically similar to PLWH-US in 2008. From 2000 to 2008, increases were observed in the percentage of participants prescribed HAART, the percentage who achieved a suppressed HIV VL, and the median CD4 cell count at death.
Primary funding source: National Institutes of Health, Centers for Disease Control and Prevention, Canadian Institutes of Health Research, Canadian HIV Trials Network, and the government of British Columbia, Canada.
Epidemiological evidence has suggested that consumption of a diet rich in cruciferous vegetables reduces the risk of several types of cancers and chronic degenerative diseases. In particular, broccoli sprouts are a convenient and rich source of the glucosinolate, glucoraphanin, which can release the chemopreventive agent, sulforaphane, an inducer of glutathione S-transferases. Two broccoli sprout-derived beverages, one sulforaphane-rich (SFR) and the other glucoraphanin-rich (GRR), were evaluated for pharmacodynamic action in a crossover clinical trial design. Study participants were recruited from the farming community of He Zuo Township, Qidong, China, previously documented to have a high incidence of hepatocellular carcinoma with concomitant exposures to aflatoxin and more recently characterized with exposures to substantive levels of airborne pollutants. Fifty healthy participants were randomized into two treatment arms. The study protocol was as follows: a 5-day run-in period, a 7-day administration of one beverage, a 5-day washout period, and a 7-day administration of the opposite beverage. Urinary excretion of the mercapturic acids of acrolein, crotonaldehyde, ethylene oxide and benzene was measured both pre- and postintervention using liquid chromatography tandem mass spectrometry. Statistically significant increases of 20–50% in the levels of excretion of glutathione-derived conjugates of acrolein, crotonaldehyde and benzene were seen in individuals receiving SFR, GRR or both compared with their preintervention baseline values. No significant differences were seen between the effects of SFR versus GRR. Intervention with broccoli sprouts may enhance detoxication of airborne pollutants and attenuate their associated health risks.