In the last decade, timely initiation of antiretroviral therapy and resulting virologic suppression have greatly improved in North America concurrent with the development of better tolerated and more potent regimens, but significant barriers to treatment uptake remain.
Background. Since the mid-1990s, effective antiretroviral therapy (ART) regimens have improved in potency, tolerability, ease of use, and class diversity. We sought to examine trends in treatment initiation and resulting human immunodeficiency virus (HIV) virologic suppression in North America between 2001 and 2009, and demographic and geographic disparities in these outcomes.
Methods. We analyzed data on HIV-infected individuals newly clinically eligible for ART (ie, first reported CD4+ count <350 cells/µL or AIDS-defining illness, based on treatment guidelines during the study period) from 17 North American AIDS Cohort Collaboration on Research and Design cohorts. Outcomes included timely ART initiation (within 6 months of eligibility) and virologic suppression (≤500 copies/mL, within 1 year). We examined time trends and considered differences by geographic location, age, sex, transmission risk, race/ethnicity, CD4+ count, and viral load, and documented psychosocial barriers to ART initiation, including non–injection drug abuse, alcohol abuse, and mental illness.
Results. Among 10 692 HIV-infected individuals, the cumulative incidence of 6-month ART initiation increased from 51% in 2001 to 72% in 2009 (Ptrend < .001). The cumulative incidence of 1-year virologic suppression increased from 55% to 81%, and among ART initiators, from 84% to 93% (both Ptrend < .001). A greater number of psychosocial barriers were associated with decreased ART initiation, but not virologic suppression once ART was initiated. We found significant heterogeneity by state or province of residence (P < .001).
Conclusions. In the last decade, timely ART initiation and virologic suppression have greatly improved in North America concurrent with the development of better-tolerated and more potent regimens, but significant barriers to treatment uptake remain, both at the individual level and systemwide.
antiretroviral therapy; healthcare disparities; HIV; time factors; viral load
To investigate the impact of HAART-induced HIV suppression on levels of 24 serological biomarkers of inflammation and immune activation.
Prospective cohort study.
Biomarkers were measured with multiplex assays in centralized laboratories using stored serum samples contributed by 1,697 men during 8,903 person-visits in the Multicenter AIDS Cohort Study (MACS) from 1984–2009. Using generalized gamma models, we compared biomarker values across three groups, adjusting for possible confounders: HIV-uninfected (NEG); HIV+, HAART-naïve (NAI); and HAART-exposed with HIV RNA suppressed to <50 copies/mL plasma (SUP). We also estimated changes in biomarker levels associated with duration of HIV suppression, using splined generalized gamma regression with a knot at one year.
Most biomarkers in the SUP group were largely normalized relative to the NAI group; however, 12 biomarkers in the SUP group remained distinct (p<0.002) from NEG values: CXCL10, CRP, sCD14, sTNFR2, TNF-α, sCD27, sGP130, IL-8, CCL13, BAFF, GM-CSF, and IL-12p70. Thirteen biomarkers exhibited significant changes in the first year after viral suppression, but none changed significantly after that time.
Biomarkers of inflammation and immune activation moved toward HIV-negative levels within the first year after HAART-induced HIV suppression. Although several markers of T cell activation returned to levels present in HIV-negative men, residual immune activation, particularly monocyte/macrophage activation, was present. This residual immune activation may represent a therapeutic target to improve the prognosis of HIV-infected individuals receiving HAART.
Acquired Immunodeficiency Syndrome; Antiretroviral Therapy, Highly Active; Biological Markers; Inflammation; Prospective Studies; Male
To determine whether markers of systemic inflammation are associated with the presence of moderate-to-severe obstructive sleep apnea (OSA), and whether this association differs based on HIV and HIV treatment status.
HIV-uninfected men (HIV−; n=60), HIV-infected men receiving HAART (HIV+/HAART; n=58), and HIV-infected men not receiving HAART (HIV+/No HAART; n=41) underwent polysomnography and measurement of plasma levels of TNF-alpha, soluble TNF-alpha receptors I and II (sTNFRI and sTNFRII), and IL-6. The relationship between moderate-to-severe OSA (respiratory disturbance index ≥15 apnea/hypopnea events/hour) and inflammatory markers was assessed with multivariable regression models.
Compared to the HIV− men, HIV+/HAART men and HIV+/No HAART men had higher levels of TNF-alpha, sTNFRI, and sTNFRII, independent of age, race, smoking status, obstructive lung disease (OLD), and BMI. Moderate-to-severe OSA was present in 48% of the sample (HIV−:57%; HIV+/HAART: 41%; HIV+/No HAART: 44%). Among the HIV+/No HAART men, but not in the other groups, TNF-alpha, sTNFRII, and IL-6 levels were higher in those with moderate-severe OSA compared to men with no-to-mild OSA after adjustment for age, race, smoking status, OLD, and BMI. Within this group, the association of high TNF-alpha concentrations with moderate-severe OSA was also independent of CD4 cell count and plasma HIV RNA concentration.
Compared to HIV-infected men on HAART and HIV-uninfected men, markers of systemic inflammation were higher in HIV-infected men not receiving HAART. In these men, TNF-alpha was significantly related to obstructive sleep apnea, independent of HIV-related covariates.
Diabetes and hypertension, common conditions in antiretroviral therapy (ART)-treated HIV-infected individuals, are associated with glomerular hyperfiltration, which precedes the onset of proteinuria and accelerated kidney function decline. In the Multicenter AIDS Cohort Study, we examined the extent to which hyperfiltration is present and associated with metabolic, cardiovascular, HIV, and treatment risk factors among HIV-infected men.
Cross-sectional cohort using direct measurement of glomerular filtration rate (GFR) by iohexol plasma clearance for 367 HIV-infected men and 241 HIV-uninfected men who were free of CKD.
Hyperfiltration was defined as GFR >140 ml/min/1.73 m2, minus 1 ml/min/1.73 m2 for each year of age over 40. Multivariate logistic regression was used to estimate the odds ratios (OR) of prevalent hyperfiltration for metabolic, cardiovascular, HIV, and cumulative ART exposure factors.
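The age-adjusted definition above can be written as a small helper. This is an illustrative sketch only; it assumes the 1 ml/min/1.73 m2 deduction applies solely to years beyond age 40, with a flat cutoff of 140 below that age (the abstract does not address younger ages).

```python
def hyperfiltration_threshold(age_years):
    """Age-adjusted hyperfiltration cutoff in ml/min/1.73 m2:
    140, minus 1 for each year of age over 40.
    Assumption: no adjustment below age 40."""
    return 140 - max(0, age_years - 40)

def is_hyperfiltering(gfr, age_years):
    """True if the measured GFR exceeds the age-adjusted cutoff."""
    return gfr > hyperfiltration_threshold(age_years)
```

For example, a 55-year-old has a cutoff of 125 ml/min/1.73 m2, so a measured GFR of 130 would meet the hyperfiltration definition.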
Among subjects without CKD, the prevalence of hyperfiltration was higher for HIV-infected participants (25%) compared to uninfected participants (17%; p=0.01). HIV infection was associated with hyperfiltration (OR: 1.70, 95% CI: 1.11, 2.61) and modified the association between diabetes and hyperfiltration, such that the association among HIV-uninfected men (OR: 2.56, 95% CI: 1.33, 5.54) was not observed among HIV-infected men (OR: 1.19, 95% CI: 0.69, 2.05). These associations were independent of known risk factors for hyperfiltration. Indicators of hyperglycemia and hypertension were also associated with hyperfiltration, as was cumulative zidovudine exposure.
Hyperfiltration, a potential modifiable predictor of kidney disease progression, is common among ART-treated HIV-infected men. HIV infection is associated with significant odds of hyperfiltration in addition to known risk factors for kidney damage.
Glomerular hyperfiltration; glomerular filtration rate; HIV; antiretroviral therapy; iohexol
Like other members of the γ-herpesvirus family, human herpesvirus 8 (HHV-8), the etiologic agent of classic and HIV-related Kaposi’s sarcoma (HIV-KS), acquired and evolved several human genes with key immune-modulatory and cellular growth control functions. The encoded viral homologs substitute for their human counterparts but escape cellular regulation, leading to uncontrolled cell proliferation. We postulated that DNA variants in the human homologs of viral genes that potentially alter the expression or the binding of the encoded factors controlling the antiviral response may facilitate viral interference. To test whether cellular homologs are candidate susceptibility genes, we evaluated the association of DNA variants in 92 immune-related genes, including 7 cellular homologs, with the risk for HIV-KS in a matched case-control study nested in the Multicenter AIDS Cohort Study. Low- and high-risk gene-by-gene interactions were estimated by multifactor dimensionality reduction and used as predictors in conditional logistic models. Among the most significant gene interactions at risk (OR = 2.84–3.92; Bonferroni-adjusted p = 9.9×10−3 to 2.6×10−4), three comprised human homologs of two latently expressed viral genes, cyclin D1 (CCND1) and interleukin-6 (IL-6), in conjunction with angiogenic genes (VEGF, EDN-1, and EDNRB). At lower significance thresholds (adjusted p < 0.05), human homologs related to apoptosis (CFLAR) and chemotaxis (CCL2) emerged as candidates. This “proof of concept” study identified human homologs involved in the regulation of type I interferon-induced signaling, cell cycle, and apoptosis as potentially important determinants of HIV-KS.
Kaposi’s sarcoma; Immunodeficiency; Herpes Virus 8; Multifactor Dimensionality Reduction; Polymorphism; Genetic association
Antiretroviral regimen (ART) changes occur frequently among HIV-infected persons. The duration and type of initial highly active antiretroviral therapy (HAART) and factors associated with regimen switching were evaluated in the Multicenter AIDS Cohort Study.
Participants were classified according to the calendar period of HAART initiation: T1 (1996-2001), T2 (2002-2005), and T3 (2006-2009). Kaplan–Meier curves depicted time from HAART initiation to first regimen change within 5.5 years. Cox proportional hazards regression models were used to examine factors associated with time to switching.
Of 1009 participants, 796 changed regimen within 5.5 years after HAART initiation. The percentage of participants who switched declined from 85% during T1 to 49% in T3. The likelihood of switching in T3 decreased by 50% (p<0.01) compared to T1 after adjustment for pre-HAART ART use, age, race, and CD4 count. Incomplete HIV suppression decreased over time (p<0.01) but predicted switching across all time periods. Lower HAART adherence (≤95% of prescribed doses) was predictive of switching only in T1. In T2, central nervous system symptoms predicted switching (RH = 1.7, p = 0.012). Older age at HAART initiation was associated with increased switching in T1 (RH = 1.03 per year increase) and decreased switching in T2 (RH = 0.97 per year increase).
During the first 15 years of the HAART era, initial HAART regimen duration lengthened and regimen discontinuation rates diminished. Both HIV RNA non-suppression and poor adherence predicted switching prior to 2001 while side effects that were possibly ART-related were more prominent during T2.
Comprehensive neuropsychological assessments for youth with ADHD allow for thorough consideration of co-occurring disorders and provide targeted recommendations for treating ADHD and comorbid conditions. This study offers a preliminary evaluation of the added value (compared to routine care) associated with neuropsychological assessment in the identification and treatment of ADHD in youth ages 3-17 years. First, we describe a novel measure developed to evaluate broad-based outcomes for youth with ADHD following neuropsychological assessment. Next, we compare parent ratings of child symptoms and quality of life between two groups of youth with ADHD: those who had recently received neuropsychological assessments (NP+) and those who had not (NP−). Participants were surveyed again 5 months after baseline to assess changes in symptoms, quality of life, and service utilization. While both groups experienced significant improvements in behavioral/emotional symptoms, the NP+ group had greater initiation of parent behavior management training, special education services, and medication management over the follow-up period than the NP− group. Satisfaction with neuropsychological assessment was high overall but decreased slightly over the course of the follow-up period. The findings offer preliminary support for the incremental efficacy of neuropsychological evaluation in the diagnosis and management of ADHD.
ADHD; neuropsychological; testing; quality of life; utility; efficacy; outcome
Intra-subject variability (ISV) is the most consistent behavioral deficit in Attention Deficit Hyperactivity Disorder (ADHD). ISV may be associated with networks involved in sustaining task control (cingulo-opercular network: CON) and self-reflective lapses of attention (default mode network: DMN). The current study examined whether connectivity supporting attentional control is atypical in children with ADHD. Group differences in full-brain connection strength and brain–behavior associations with attentional control measures were examined for the late-developing CON and DMN in 50 children with ADHD and 50 typically-developing (TD) controls (ages 8–12 years).
Children with ADHD had hyper-connectivity both within the CON and within the DMN. Full-brain behavioral associations were found for a number of between-network connections. Across both groups, more anti-correlation between DMN and occipital cortex supported better attentional control. However, in the TD group, this brain–behavior association was stronger and occurred for a more extensive set of DMN–occipital connections. Differential support for attentional control between the two groups occurred with a number of CON–DMN connections. For all CON–DMN connections identified, increased between-network anti-correlation was associated with better attentional control for the ADHD group, but worse attentional control in the TD group. A number of between-network connections with the medial frontal cortex, in particular, showed this relationship. Follow-up analyses revealed that these associations were specific to attentional control and were not due to individual differences in working memory, IQ, motor control, age, or scan motion.
While CON–DMN anti-correlation is associated with improved attention in ADHD, other circuitry supports improved attention in TD children. Greater CON–DMN anti-correlation supported better attentional control in children with ADHD, but worse attentional control in TD children. On the other hand, greater DMN–occipital anti-correlation supported better attentional control in TD children.
• Children with ADHD are hyper-connected within both the CON and DMN.
• More DMN–Visual antagonism supports better attention, particularly in controls.
• More DMN–CON antagonism supports better attention only in children with ADHD.
• CON–DMN compensation for attention may be due to stimulant medication use.
ADHD; Intra-subject variability; Attention; Resting-state connectivity; Network; Default mode network
BACKGROUND AND OBJECTIVE:
Behavioral disorders are highly comorbid with childhood learning disabilities (LDs), and accurate identification of LDs is vital for guiding appropriate interventions. However, it is difficult to conduct comprehensive assessment of academic skills within the context of primary care visits, lending utility to screening of academic skills via informant reports. The current study evaluated the clinical utility of a parent-reported screening measure in identifying children with learning difficulties.
Participants included 440 children (66.7% male), ages 5.25 to 17.83 years (mean = 10.32 years, SD = 3.06 years), referred for neuropsychological assessment. Academic difficulties were screened by parent report using the Colorado Learning Difficulties Questionnaire (CLDQ). Reading and math skills were assessed via individually administered academic achievement measures. Sensitivity, specificity, classification accuracy, and conditional probabilities were calculated to evaluate the efficacy of the CLDQ in predicting academic impairment.
Correlations between the CLDQ reading scale and reading achievement measures ranged from −0.35 to −0.65 and from −0.24 to −0.46 between the CLDQ math scale and math achievement measures (all P < .01). Sensitivity was good for both reading and math scales, whereas specificity was low. Taking into account the high base rate of reading and math LDs within our sample, the conditional probability of true negatives (96.2% reading, 85.1% math) was higher than for true positives (40.5% reading, 37.9% math).
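The conditional probabilities reported above follow from sensitivity, specificity, and the sample base rate via Bayes' rule. A minimal sketch of that calculation follows; the numbers in the usage example are hypothetical and are not the study's actual CLDQ values.

```python
def predictive_values(sensitivity, specificity, base_rate):
    """Positive and negative predictive values (probability of a true
    positive given a positive screen, and of a true negative given a
    negative screen) from sensitivity, specificity, and prevalence."""
    tp = sensitivity * base_rate              # screen+, disorder present
    fp = (1 - specificity) * (1 - base_rate)  # screen+, disorder absent
    tn = specificity * (1 - base_rate)        # screen-, disorder absent
    fn = (1 - sensitivity) * base_rate        # screen-, disorder present
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical values: high sensitivity, low specificity, high base rate.
ppv, npv = predictive_values(0.90, 0.40, 0.50)  # → PPV 0.60, NPV 0.80
```

This illustrates the pattern in the study's results: with a high base rate and low specificity, a negative screen is more trustworthy (high NPV) than a positive one (modest PPV).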
Overall, the CLDQ may more accurately predict children without LDs than children with LDs. As such, the absence of parent-reported difficulties may be adequate to rule out an overt LD, whereas elevated scores likely indicate the need for more comprehensive assessment.
learning disorders; rating scales; ADHD; achievement testing; sensitivity; specificity
Children with Attention-Deficit/Hyperactivity Disorder (ADHD) demonstrate increased response variability compared with controls, which is thought to be associated with deficits in attention regulation and response control that subsequently affect performance of more cognitively demanding tasks, such as reading. The present study examined response variability during a computerized simple reaction time (RT) task in 67 children. Ex-Gaussian analyses separated the response time distribution into normal (mu and sigma) and exponential (tau) components; the association of each with reading fluency was examined. Children with ADHD had significantly slower, more variable, and more skewed RTs compared with controls. After controlling for ADHD symptom severity, tau (but not mu or mean RT) was significantly associated with reduced reading fluency, but not with single word reading accuracy. These data support the growing evidence that RT variability, but not simply slower mean response speed, is characteristic of youth with ADHD, and that longer response time latencies (tau) may be implicated in the poorer academic performance associated with ADHD.
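The mu/sigma/tau decomposition described above can be sketched with SciPy's ex-Gaussian (`exponnorm`) distribution, here fit to simulated response times rather than the study's data. Note that `exponnorm` uses the shape parameter K = tau/sigma, so tau is recovered as K times the scale.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated response times (seconds): a Gaussian component plus an
# exponential tail, i.e., an ex-Gaussian distribution.
mu_true, sigma_true, tau_true = 0.40, 0.05, 0.15
rts = rng.normal(mu_true, sigma_true, 2000) + rng.exponential(tau_true, 2000)

# SciPy parameterizes the ex-Gaussian as exponnorm(K, loc, scale),
# where mu = loc, sigma = scale, and tau = K * scale.
K, loc, scale = stats.exponnorm.fit(rts)
mu_hat, sigma_hat, tau_hat = loc, scale, K * scale
```

With enough trials, the fitted mu, sigma, and tau recover the simulated values, separating the skewed exponential tail (attentional lapses) from the central Gaussian component of the RT distribution.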
Attention; Dyslexia; Variability; Processing speed; Executive function; Ex-Gaussian analyses
Prospective cohort studies often quantify serum immune biomarkers at a single time point to determine risk of cancer and other chronic diseases that develop years later. Estimates of the within-person temporal stability of serum markers partly assess the utility of single biomarker measurements, and may have important implications for the design of prospective studies of chronic disease risk.
Using archived sera collected from 200 HIV-seronegative men at three visits spaced over approximately two years, concentrations of 14 biomarkers (ApoA1, sCD14, sgp130, sIL-6R, sIL-2Rα, sTNFR2, BAFF/BLyS, CXCL13, IFN-γ, IL-1β, IL-6, IL-8, IL-10, TNF-α) were measured in a single laboratory. Age- and ethnicity-adjusted intraclass correlation coefficients (ICC) were calculated for each biomarker, and mixed linear regression models were utilized to examine the influence of age, ethnicity, season, and study site on biomarker concentrations.
Across all three study visits, most biomarkers had ICC values indicating fair to excellent within-person stability. ApoA1 (ICC=0.88) and TNF-α (ICC=0.87) showed the greatest stability; IL-8 (ICC=0.33) was markedly less stable. The ICCs were similar when calculated between pairs of consecutive visits. The covariables did not influence biomarker levels or their temporal stability. All biomarkers showed moderate to strong pairwise correlations across visits.
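For reference, the unadjusted one-way random-effects ICC can be computed from an n-subjects by k-visits matrix as the share of total variance attributable to between-subject differences. This simplified sketch omits the age and ethnicity adjustment the study applied via mixed models.

```python
import numpy as np

def icc_oneway(X):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_visits)
    array: (MSB - MSW) / (MSB + (k - 1) * MSW)."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    grand_mean = X.mean()
    # Between-subject and within-subject mean squares.
    ms_between = k * ((X.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((X - X.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Perfect within-person stability across three visits gives ICC = 1.
print(icc_oneway([[1, 1, 1], [2, 2, 2], [3, 3, 3]]))  # → 1.0
```

An ICC near 1 means repeated measurements within a person are nearly identical relative to between-person spread; values near 0.33 (as for IL-8) mean visit-to-visit noise dominates.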
Serum concentrations of most evaluated immune biomarkers displayed acceptable to excellent within-person temporal reliability over a 2-year period. Further investigation may be required to clarify the stability of IL-8.
These findings lend support to using these serologic immune biomarkers in prospective studies investigating associations with chronic diseases.
Temporal stability; biomarkers; cytokines; soluble receptors; intraclass correlation coefficient
Background. The role of active hepatitis C virus (HCV) replication in chronic kidney disease (CKD) risk has not been clarified.
Methods. We compared CKD incidence in a large cohort of HIV-infected subjects who were HCV seronegative, HCV viremic (detectable HCV RNA), or HCV aviremic (HCV seropositive, undetectable HCV RNA). Stages 3 and 5 CKD were defined according to standard criteria. Progressive CKD was defined as a sustained 25% glomerular filtration rate (GFR) decrease from baseline to a GFR < 60 mL/min/1.73 m2. We used Cox models to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs).
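The progressive-CKD criterion can be sketched as a helper over a patient's serial GFR measurements. "Sustained" is interpreted here as the decline holding from some visit through the end of follow-up, which is an assumption the abstract does not spell out.

```python
def is_progressive_ckd(baseline_gfr, gfr_series):
    """Progressive CKD (simplified): a sustained >=25% GFR decline from
    baseline reaching GFR < 60 ml/min/1.73 m2. 'Sustained' is taken to
    mean the criterion holds from some visit onward (assumption)."""
    for i in range(len(gfr_series)):
        tail = gfr_series[i:]
        if all(g <= 0.75 * baseline_gfr and g < 60 for g in tail):
            return True
    return False
```

For a baseline GFR of 100, the series [90, 70, 55, 50] qualifies (the last two visits are both ≥25% below baseline and <60), whereas [90, 70, 65] does not.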
Results. A total of 52 602 HCV seronegative, 9508 HCV viremic, and 913 HCV aviremic subjects were included. Compared with HCV seronegative subjects, HCV viremic subjects were at increased risk for stage 3 CKD (adjusted HR 1.36 [95% CI, 1.26, 1.46]), stage 5 CKD (1.95 [1.64, 2.31]), and progressive CKD (1.31 [1.19, 1.44]), while HCV aviremic subjects were also at increased risk for stage 3 CKD (1.19 [0.98, 1.45]), stage 5 CKD (1.69 [1.07, 2.65]), and progressive CKD (1.31 [1.02, 1.68]).
Conclusions. Compared with HIV-infected subjects who were HCV seronegative, both HCV viremic and HCV aviremic individuals were at increased risk for moderate and advanced CKD.
HIV; hepatitis C virus; chronic kidney disease; hepatitis C RNA; cohort study; glomerular filtration rate; injection drug use
In the context of HIV, the initiation of effective antiretroviral therapy (ART) has been found to increase the risk of dyslipidemia in HIV-infected individuals, and dyslipidemia has been found to be a risk factor for kidney disease in the general population. Therefore, we examined changes in lipid profiles in HIV-infected men following ART initiation and the association with future kidney dysfunction. HIV-infected men from the Multicenter AIDS Cohort Study initiating ART between December 31, 1995 and September 30, 2011 with measured lipid and serum creatinine values pre-ART and post-ART were selected. The associations between changes in total cholesterol or high-density lipoprotein following ART initiation and the estimated change in glomerular filtration rate (eGFR) over time were assessed using piecewise linear mixed effects models. There were 365 HIV-infected men who contributed to the analysis. In the adjusted models, at 3 years post-ART, those with changes in total cholesterol >50 mg/dl had an average decrease in eGFR of 2.6 ml/min/1.73 m2 per year (p<0.001) and at 5 years post-ART, the average decrease was 2.4 ml/min/1.73 m2 per year (p=0.008). This decline contrasted with the estimates for those with changes in total cholesterol ≤50 mg/dl: 1.4 ml/min/1.73 m2 decrease per year (p<0.001) and 0.1 ml/min/1.73 m2 decrease per year (p=0.594) for the same time periods, respectively. Large decreases in high-density lipoprotein (a decline of greater than 5 mg/dl) were not associated with declines in eGFR. These results indicate that large ART-related increases in total cholesterol may be a risk factor for kidney function decline in HIV-infected men. Should these results be generalizable to the broader HIV population, monitoring cholesterol changes following the initiation of ART may be important in identifying HIV-infected persons at risk for kidney disease.
Primary liver cancer (PLC) is the third leading cause of cancer mortality globally. In endemic areas of sub-Saharan Africa and Asia, PLC largely arises from chronic infection with hepatitis B virus (HBV) and ingestion of aflatoxins. Synergistic interactions between these two risk factors have been observed in cohort studies in China; here we determined the impact on PLC primary prevention of agricultural reforms in the 1980s, which diminished maize consumption, and of the implementation of subsidized universal HBV vaccination in the 2000s. A population-based cancer registry was used to track PLC mortality in Qidong, China, and was compared to the timeline of HBV immunization. Randomly selected serum samples from archived cohort collections from the 1980s to the present were analyzed for aflatoxin biomarkers. Greater than 50% reductions in PLC mortality rates occurred across birth cohorts from the 1960s to the 1980s for Qidongese less than 35 years of age, although all were born before universal vaccination of newborns. Median levels of the aflatoxin biomarker decreased from 19.3 pg/mg albumin in 1989 to undetectable (<0.5 pg/mg) by 2009. A population attributable benefit of 65% for reduced PLC mortality was estimated from a government-facilitated switch of dietary staple from maize to rice; 83% of this benefit was in those infected with HBV. Food policy reforms in China resulted in a dramatic decrease in aflatoxin exposure, which, independent of HBV vaccination, reduced liver cancer risk. The extensive HBV vaccine coverage now in place augurs even greater risk reductions in the future.
aflatoxin; primary prevention; liver cancer
Coronary artery disease (CAD) has been associated with HIV infection; however, data are inconsistent.
We performed cardiac CT to determine whether HIV-infected men have more coronary atherosclerosis than uninfected men.
Cross-sectional study within the Multicenter AIDS Cohort Study (MACS).
HIV-infected (n=618) and HIV-uninfected (n=383) men who have sex with men (MSM) had non-contrast and contrast-enhanced cardiac CT if they were between 40 and 70 years of age, weighed <300 pounds, and had no history of coronary revascularization.
Presence and, among those with plaque, extent of coronary artery calcium (CAC) on non-contrast CT, and of any plaque, non-calcified, mixed, or calcified plaque and stenosis on CT angiography.
1001 men underwent non-contrast CT, of whom 759 had coronary CT angiography. After adjusting for age, race, center, and cohort, HIV-infected men had a greater prevalence of CAC [prevalence ratio (PR) = 1.21, 95% confidence interval (CI) 1.08–1.35, p=0.001] and any plaque [PR = 1.14 (1.05–1.24), p=0.001], including non-calcified plaque [PR = 1.28 (1.13–1.45), p<0.001] and mixed plaque [PR = 1.35 (1.10–1.65), p=0.004], than HIV-uninfected men. Associations between HIV infection and any plaque and non-calcified plaque remained significant (p<0.005) after CAD risk factor adjustment. HIV-infected men also had a greater extent of non-calcified plaque after CAD risk factor adjustment (p=0.026). HIV-infected men had a greater prevalence of coronary artery stenosis >50% than HIV-uninfected men [PR = 1.48 (1.06–2.07), p=0.020], but not after CAD risk factor adjustment. Longer duration of highly active antiretroviral therapy [PR = 1.09 (1.02–1.17), p=0.007, per year] and lower nadir CD4+ T-cell count [PR = 0.80 (0.69–0.94), p=0.005, per 100 cells] were associated with coronary stenosis >50%.
Coronary artery plaque, especially non-calcified plaque, is more prevalent and extensive in HIV-infected men, independent of CAD risk factors.
Cross-sectional observational study design and inclusion of only men.
Primary Funding Source. NHLBI and NIAID.
To compare the proportion, timing, and hazards of non-AIDS death and AIDS death among men and women who initiated HAART at different CD4+ cell counts with the mortality risks of HIV-uninfected persons with similar risk factors.
Prospective cohort studies.
We used parametric mixture models to compare proportions of AIDS and non-AIDS mortality and ages at death, and multivariable Cox models to compare cause-specific hazards of mortality, across levels of CD4+ cell count at HAART initiation (≤200 cells/μl: ‘late’, 201–350 cells/μl: ‘intermediate’, >350 cells/μl: ‘early’) and with HIV-uninfected individuals from the Multicenter AIDS Cohort Study and the Women’s Interagency HIV Study. We used multiple imputation methods to address lead-time bias in sensitivity analysis.
Earlier initiators were more likely to die of non-AIDS causes (early: 78%, intermediate: 74%, late: 49%) and at older ages (median ages 72, 69, and 66 years, respectively) than later initiators. Estimated median ages at non-AIDS death for each CD4+ cell count category were lower than that estimated for the HIV-uninfected group (75 years). In multivariable analysis, non-AIDS death hazard ratios relative to early initiators were 2.15 for late initiators (P < 0.01) and 1.66 for intermediate initiators (P = 0.01); AIDS death hazard ratios were 3.26 for late initiators (P < 0.01) and 1.20 for intermediate initiators (P = 0.28). Strikingly, the adjusted hazards for non-AIDS death among HIV-uninfected individuals and early initiators were nearly identical (hazard ratio 1.01). Inferences were unchanged after adjustment for lead-time bias.
Results suggest the possibility of reducing the risk of non-AIDS mortality among HIV-infected individuals to approximate that faced by comparable HIV-uninfected individuals.
antiretroviral therapy; bias; CD4+ cell count; cohort studies; competing risks; mortality; statistical
Successful implementation of functional self-care skills depends upon adequate executive functioning; however, many scales assessing adaptive skills do not address the inherent executive burden of these tasks. This omission is especially relevant for individuals with spina bifida, for whom medical self-care tasks impose a significant burden requiring initiation and prospective memory. The Kennedy Krieger Independence Scales–Spina Bifida Version (KKIS–SB) is a caregiver-reported measure designed to address this gap; it assesses skills for managing both typical and spina bifida-related daily self-care demands, with a focus on the timely and independent initiation of adaptive skills.
Parents of 100 youth and young adults with spina bifida completed the KKIS–SB. Exploratory factor analysis and Pearson's correlations were used to assess the factor structure, reliability, and construct validity of the KKIS–SB.
The scale demonstrates excellent internal consistency (Cronbach's alpha = .891). Exploratory factor analysis yielded four factors, explaining 65.1% of the total variance. Two primary subscales were created, initiation of routines and prospective memory, which provide meaningful clinical information regarding management of a variety of typical (e.g., get up on time, complete daily hygiene routines on time) and spina bifida-specific self-care tasks (e.g., begin self-catheterization on time, perform self-examination for pressure sores).
Based upon internal consistency estimates and correlations with measures of similar constructs, initial data suggest good preliminary reliability and validity of the KKIS–SB.
transition; adaptive functioning; validity; factor structure; executive function
We examined the implications of using the Full Scale Intelligence Quotient (FSIQ) versus the General Abilities Index (GAI) for determination of intellectual disability using the Wechsler Intelligence Scales for Children, fourth edition (WISC-IV).
Children referred for neuropsychological assessment (543 males, 290 females; mean age 10y 5mo, SD 2y 9mo, range 6–16y) were administered the WISC-IV and the Adaptive Behavior Assessment System, Second Edition (ABAS-II).
GAI and FSIQ were highly correlated; however, fewer children were identified as having intellectual disability using GAI (n=159) than when using FSIQ (n=196). Although the 44 children classified as having intellectual disability based upon FSIQ (but not GAI) had significantly higher adaptive functioning scores than those meeting intellectual disability criteria based upon both FSIQ and GAI, mean adaptive scores still fell within the impaired range. FSIQ and GAI were comparable in predicting impairments in adaptive functioning.
Using GAI rather than FSIQ in intellectual disability diagnostic decision making resulted in fewer individuals being diagnosed with intellectual disability; however, the mean GAI of the disqualified individuals was at the upper end of criteria for intellectual impairment (standard score 75), and these individuals remained adaptively impaired. As GAI and FSIQ were similarly predictive of overall adaptive functioning, the use of GAI for intellectual disability diagnostic decision making may be of limited value.
We investigated the association of HIV infection and highly active antiretroviral therapy (HAART) with sleep disordered breathing (SDB), fatigue, and sleepiness.
HIV-uninfected men (HIV−; n = 60), HIV-infected men using HAART (HIV+/HAART+; n = 58), and HIV-infected men not using HAART (HIV+/HAART−; n = 41) recruited from two sites of the Multicenter AIDS Cohort Study (MACS) underwent a nocturnal sleep study, anthropometric assessment, a fatigue questionnaire, and the Epworth Sleepiness Scale. The prevalence of SDB in HIV− men was compared to that in matched men from the Sleep Heart Health Study (SHHS).
The prevalence of SDB was unexpectedly high in all groups: 86.7% for HIV−, 70.7% for HIV+/HAART+, and 73.2% for HIV+/HAART−, despite lower body mass indices (BMIs) in the HIV+ groups. The higher prevalence in HIV− men was significant in univariate analyses but not after adjustment for BMI and other covariates. SDB was significantly more common in the HIV− men in this study than in the matched SHHS men, and was common even in participants with BMIs <25 kg/m². HIV+ men reported fatigue more frequently than HIV− men (25.5% vs. 6.7%; p = 0.003), but self-reported sleepiness did not differ among the three groups. Sleepiness, but not fatigue, was significantly associated with SDB.
SDB was highly prevalent in HIV− and HIV+ men, despite a normal or slightly elevated BMI. The high rate of SDB in men who have sex with men deserves further investigation. Sleepiness, but not fatigue, was related to the presence of SDB. Clinicians caring for HIV-infected patients should distinguish between fatigue and sleepiness when considering those at risk for SDB, especially in non-obese men.
We show, in human immunodeficiency virus–positive persons, that the effect of an unfavorable genetic background on coronary artery disease is comparable to that reported in previous studies of the general population, and similar in size to the effects of traditional risk factors and of antiretroviral regimens known to increase cardiovascular risk.
Background Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection.
Methods In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort.
Results A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9 × 10⁻⁴). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05–2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06–1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16–1.96), diabetes (OR = 1.66; 95% CI, 1.10–2.49), ≥1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06–1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17–2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD.
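The risk-score construction described above (summing risk alleles across the 23 SNPs and flagging the top quartile) can be sketched as follows. The genotype matrix is hypothetical illustration, not the study's Metabochip data, and the study's actual score may have weighted alleles by published effect sizes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical genotypes: risk-allele counts (0/1/2) at 23 CAD-associated
# SNPs for 8 illustrative individuals (not the study's Metabochip data).
genotypes = rng.integers(0, 3, size=(8, 23))

# Unweighted genetic risk score: total number of risk alleles carried.
scores = genotypes.sum(axis=1)

# "Unfavorable genetic background" = top quartile of the score distribution.
cutoff = np.quantile(scores, 0.75)
top_quartile = scores >= cutoff
print(scores.tolist(), cutoff, int(top_quartile.sum()))
```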
Conclusions In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
HIV infection; coronary artery disease; genetics; traditional risk factors; antiretroviral therapy
In the United States, incident hepatitis C among men who have sex with men has been ongoing since at least 1984. Risk factors included unprotected receptive anal intercourse with multiple partners, HIV infection, and lower CD4 T-cell count among HIV-infected men.
Background Prospective characterization of hepatitis C virus (HCV) transmission in both human immunodeficiency virus (HIV)–infected and –uninfected men who have sex with men (MSM) over the entire HIV epidemic has not been comprehensively conducted.
Methods To determine the trends in and risk factors associated with incident HCV in MSM since 1984, 5310 HCV antibody (anti-HCV)–negative MSM in the Multicenter AIDS Cohort Study were prospectively followed during 1984–2011 for anti-HCV seroconversion.
Results During 55 343 person-years (PYs) of follow-up, there were 115 incident HCV infections (incidence rate, 2.08/1000 PYs) scattered throughout the study period. In a multivariable analysis with time-varying covariates, older age (incidence rate ratio [IRR], 1.40/10 years, P < .001), enrollment in the later (2001–2003) recruitment period (IRR, 3.80, P = .001), HIV infection (IRR, 5.98, P < .001), drinking >13 alcoholic drinks per week (IRR, 1.68, P < .001), hepatitis B surface antigen positivity (IRR, 1.68, P < .001), syphilis (IRR, 2.95, P < .001), and unprotected receptive anal intercourse with >1 male partner (IRR, 3.37, P < .001) were independently associated with incident HCV. Among HIV-infected subjects, every 100 cell/mm3 increase in CD4 count was associated with a 7% (P = .002) decrease in the HCV incidence rate up to a CD4 count of 500 cells/mm3, whereas there was no association with highly active antiretroviral therapy.
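The crude incidence rate quoted above follows directly from the case count and person-time; a quick check of the arithmetic:

```python
# Reproducing the crude incidence-rate arithmetic from the Results:
# 115 incident HCV infections over 55,343 person-years of follow-up.
cases = 115
person_years = 55_343

rate_per_1000_py = cases / person_years * 1000
print(round(rate_per_1000_py, 2))  # matches the reported 2.08 per 1000 PYs
```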
Conclusions The spread of HCV among both HIV-infected and -uninfected MSM in the United States has been ongoing since the beginning of the HIV epidemic. In HIV-infected men with <500 CD4+ T cells, the HCV incidence rate was inversely proportional to CD4 T-cell count.
incident HCV; sexual transmission; MSM
Executive function (EF) skills play an important role in children’s cognitive and social functioning. These skills develop throughout childhood, concurrently with a number of developmental transitions and challenges. One of these challenges is the transition from elementary into middle-level schools, which has the potential to significantly disrupt children’s academic and social trajectories. However, little is known about the role of EF in children’s adjustment during this transition. This study investigated the relation between children’s EF skills, assessed both before and during elementary school, and sixth grade academic and social competence. In addition, the influences of the type of school setting attended in sixth grade on children’s academic and behavioral outcomes were examined. EF assessed prior to and during elementary school significantly predicted sixth grade competence, as rated by teachers and parents, in both academic and social domains, after controlling for background characteristics. The interactions between type of school setting and EF skills were significant: parents tended to report more behavioral problems and less regulatory control in children with weaker EF skills who were attending middle school. In contrast, teachers reported greater academic and behavioral difficulty in students with poorer EF attending elementary school settings. In conclusion, children’s performance-based EF skills significantly affect adjustment to the academic and behavioral demands of sixth grade, with parent report suggesting greater difficulty for children with poorer EF in settings that provide fewer external supports (e.g., middle school).
“Jitter” refers to the randomization of intervals between stimulus events. Compared with controls, individuals with ADHD demonstrate greater intrasubject variability (ISV) when performing tasks with fixed interstimulus intervals (ISIs). Because Gaussian curves mask the effect of extremely slow or fast response times (RTs), ex-Gaussian approaches have been applied to study ISV.
This study applied ex-Gaussian analysis to examine the effects of jitter on RT variability in children with and without ADHD. A total of 75 children, aged 9 to 14 years (44 ADHD, 31 controls), completed a go/no-go test with two conditions: fixed ISI and jittered ISI.
Children with ADHD showed greater variability, driven by elevations in the exponential (tau) but not the normal (sigma) component of the RT distribution. Jitter decreased tau in the ADHD group to levels not statistically different from those of controls, reducing the lapses in performance characteristic of impaired response control.
Jitter may provide a nonpharmacologic mechanism to facilitate readiness to respond and reduce lapses from sustained (controlled) performance.
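The ex-Gaussian decomposition used here models each RT distribution as a Normal(mu, sigma) component plus an Exponential(tau) tail, with tau capturing the slow "lapse" responses. A minimal sketch using scipy's `exponnorm` distribution (which parameterizes the ex-Gaussian by K = tau / sigma); the simulated parameters are illustrative, not values from the study.

```python
import numpy as np
from scipy import stats

# Simulate reaction times (ms) from an ex-Gaussian: Normal(mu, sigma)
# plus an Exponential(tau) tail. Parameters are illustrative only.
mu, sigma, tau = 400.0, 50.0, 150.0
rng = np.random.default_rng(42)
rts = rng.normal(mu, sigma, 5000) + rng.exponential(tau, 5000)

# scipy parameterizes the ex-Gaussian (exponnorm) by K = tau / sigma,
# so the fitted exponential component is recovered as K * scale.
K_hat, loc_hat, scale_hat = stats.exponnorm.fit(rts)
tau_hat = K_hat * scale_hat  # exponential ("lapse") component estimate
print(round(loc_hat), round(scale_hat), round(tau_hat))
```

Fitting the two conditions (fixed vs. jittered ISI) separately and comparing the tau estimates is the kind of contrast the Results describe.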
ADHD; executive function; attention; variability; response time; statistics; ex-Gaussian; jitter
Human genetic variation contributes to differences in susceptibility to HIV-1 infection. To search for novel host resistance factors, we performed a genome-wide association study (GWAS) in hemophilia patients highly exposed to potentially contaminated factor VIII infusions.
Individuals with hemophilia A and a documented history of factor VIII infusions before the introduction of viral inactivation procedures (1979–1984) were recruited from 36 hemophilia treatment centers (HTCs), and their genome-wide genetic variants were compared with those from matched HIV-infected individuals. Homozygous carriers of known CCR5 resistance mutations were excluded. Single nucleotide polymorphisms (SNPs) and inferred copy number variants (CNVs) were tested using logistic regression. In addition, we performed a pathway enrichment analysis, a heritability analysis, and a search for epistatic interactions with CCR5 Δ32 heterozygosity.
A total of 560 HIV-uninfected cases were recruited: 36 (6.4%) were homozygous for CCR5 Δ32 or m303. After quality control and SNP imputation, we tested 1 081 435 SNPs and 3686 CNVs for association with HIV-1 serostatus in 431 cases and 765 HIV-infected controls. No SNP or CNV reached genome-wide significance. The additional analyses did not reveal any strong genetic effect.
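"Genome-wide significance" accounts for the large number of markers tested. A common, simplified way to set the threshold is a Bonferroni correction over the marker count; the study's exact threshold is not stated in the abstract, so this is only an illustration of the idea.

```python
# Bonferroni-style significance threshold for the number of markers tested
# (a simplified stand-in for the study's genome-wide significance criterion).
n_snps = 1_081_435
n_cnvs = 3_686
alpha = 0.05

threshold = alpha / (n_snps + n_cnvs)
print(f"{threshold:.2e}")
```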
Highly exposed, yet uninfected hemophiliacs form an ideal study group to investigate host resistance factors. Using a genome-wide approach, we did not detect any significant associations between SNPs and HIV-1 susceptibility, indicating that common genetic variants of major effect are unlikely to explain the observed resistance phenotype in this population.
To examine the incidence of and risk factors for anal cancer among HIV-positive and HIV-negative men who have sex with men followed between 1984 and 2006 in the Multicenter AIDS Cohort Study (MACS).
Prospective analysis using Poisson regression and Cox proportional hazards models, and a nested case-control study using conditional logistic regression.
There were 28 cases of anal cancer among the 6,972 men who were evaluated. The incidence rate was significantly higher in HIV-positive men than in HIV-negative men (IR = 69 vs. 14 per 100,000 person-years). Among HIV-positive men, anal cancer incidence was higher in the HAART era than in the pre-HAART era (IR = 137 vs. 30 per 100,000 person-years). In multivariate analysis restricted to the HAART era, anal cancer risk increased significantly with HIV infection (RH = 4.7, 95% CI = 1.3–17) and with an increasing number of unprotected receptive anal sex partners at the first three study visits (p-trend = 0.03). Among HIV-positive men, current HAART use did not decrease anal cancer risk.
HIV-positive men had increased risk of anal cancer. Improved survival of HIV-positive individuals following HAART initiation may allow for sufficient time for human papillomavirus (HPV) associated anal dysplasias to develop into malignancies, thus explaining the increased incidence of anal cancer in the HAART era.
anal; rectal; cancer; incidence; MACS; sexual risk; HAART