Background. Since the mid-1990s, effective antiretroviral therapy (ART) regimens have improved in potency, tolerability, ease of use, and class diversity. We sought to examine trends in treatment initiation and resulting human immunodeficiency virus (HIV) virologic suppression in North America between 2001 and 2009, and demographic and geographic disparities in these outcomes.
Methods. We analyzed data on HIV-infected individuals newly clinically eligible for ART (ie, first reported CD4+ count <350 cells/µL or AIDS-defining illness, based on treatment guidelines during the study period) from 17 North American AIDS Cohort Collaboration on Research and Design cohorts. Outcomes included timely ART initiation (within 6 months of eligibility) and virologic suppression (≤500 copies/mL, within 1 year). We examined time trends and considered differences by geographic location, age, sex, transmission risk, race/ethnicity, CD4+ count, and viral load, and documented psychosocial barriers to ART initiation, including non–injection drug abuse, alcohol abuse, and mental illness.
Results. Among 10 692 HIV-infected individuals, the cumulative incidence of 6-month ART initiation increased from 51% in 2001 to 72% in 2009 (Ptrend < .001). The cumulative incidence of 1-year virologic suppression increased from 55% to 81%, and among ART initiators, from 84% to 93% (both Ptrend < .001). A greater number of psychosocial barriers were associated with decreased ART initiation, but not virologic suppression once ART was initiated. We found significant heterogeneity by state or province of residence (P < .001).
Conclusions. In the last decade, timely ART initiation and virologic suppression have greatly improved in North America concurrent with the development of better-tolerated and more potent regimens, but significant barriers to treatment uptake remain, both at the individual level and systemwide.
antiretroviral therapy; healthcare disparities; HIV; time factors; viral load
Broccoli sprouts are a convenient and rich source of the glucosinolate glucoraphanin, which can generate the chemopreventive agent sulforaphane, an inducer of glutathione S-transferases (GSTs) and other cytoprotective enzymes. A broccoli sprout-derived beverage providing daily doses of 600 μmol glucoraphanin and 40 μmol sulforaphane was evaluated for magnitude and duration of pharmacodynamic action in a 12-week randomized clinical trial. Two hundred and ninety-one study participants were recruited from the rural He-He Township, Qidong, in the Yangtze River delta region of China, an area characterized by exposures to substantial levels of airborne pollutants. Exposure to air pollution has been associated with lung cancer and cardiopulmonary diseases. Urinary excretion of the mercapturic acids of the pollutants benzene, acrolein, and crotonaldehyde was measured before and during the intervention using liquid chromatography tandem mass spectrometry. Rapid and sustained, statistically significant (p ≤ 0.01) increases in the levels of excretion of the glutathione-derived conjugates of benzene (61%) and acrolein (23%), but not crotonaldehyde, were found in those receiving broccoli sprout beverage compared with placebo. Excretion of the benzene-derived mercapturic acid was higher in participants who were GSTT1-positive compared to the null genotype, irrespective of study arm assignment. Measures of sulforaphane metabolites in urine indicated that bioavailability did not decline over the 12-week daily dosing period. Thus, intervention with broccoli sprouts enhances the detoxication of some airborne pollutants and may provide a frugal means to attenuate their associated long-term health risks.
Air pollution; broccoli; sulforaphane; benzene; chemoprevention
Previous studies demonstrated that blacks have less coronary artery calcification (CAC) than whites. We evaluated racial differences in plaque composition and stenosis in the Multicenter AIDS Cohort Study (MACS). HIV-positive and HIV-negative men completed non-contrast cardiac CT if they were 40–70 years of age, weighed <300 pounds, and had no prior history of cardiac surgery or revascularization, and, if eligible, coronary CT angiography (CTA). There were 1001 men who underwent CT scans, and 759 of these had CTA. We measured CAC on non-contrast CT; on CTA we measured total, non-calcified, calcified, and mixed plaque and identified coronary stenosis >50%. The association of presence and extent of plaque with race was determined after adjustment for HIV serostatus, cardiovascular risk factors, and measures of socioeconomic status. The prevalences of any plaque and of non-calcified plaque on CTA did not differ between black and white men; however, black men had lower prevalences of CAC (prevalence ratio [PR] = 0.79, p=0.01), calcified plaque (PR=0.69, p=0.002), and stenosis >50% (PR=0.59, p=0.009). There were no associations between black race and extent of plaque in fully adjusted models. Using log-linear regression, black race was associated with a lower extent of any plaque on CTA in HIV-positive men (estimate=−0.24, p=0.051) but not in HIV-negative men (0.12, p=0.50, HIV interaction p=0.005). In conclusion, a lower prevalence of CAC in black compared to white men appears to reflect less calcification of plaque and stenosis rather than a lower overall prevalence of plaque.
Epidemiology; plaque; coronary angiography; coronary artery disease; HIV
Cytokines released by epicardial fat are implicated in the pathogenesis of atherosclerosis. HIV infection and anti-retroviral therapy have been associated with changes in body fat distribution and coronary artery disease. We sought to determine if HIV infection is associated with greater epicardial fat and if epicardial fat is associated with subclinical coronary atherosclerosis.
We studied 579 HIV-infected and 353 HIV-uninfected men age 40 to 70 years with non-contrast computed tomography (CT) to measure epicardial adipose tissue volume (EAT) and coronary artery calcium (CAC). Total plaque score (TPS) and plaque subtypes (non-calcified, calcified, and mixed) were measured by coronary CT angiography in 706 men.
We evaluated the association between EAT and HIV serostatus, and the association of EAT with subclinical atherosclerosis, adjusting for age, race, and serostatus, and additionally for cardiovascular (CV) risk factors, and tested for modifying effects of HIV serostatus.
HIV-infected men had greater EAT than HIV-uninfected men (p=0.001). EAT was positively associated with duration of antiretroviral therapy (p=0.02), specifically AZT (p<0.05). EAT was associated with presence of any coronary artery plaque (p=0.006) and non-calcified plaque (p=0.001), adjusting for age, race, serostatus and CV risk factors. Among men with CAC, EAT was associated with CAC extent (p=0.006). HIV serostatus did not modify associations between EAT and either CAC extent or presence of plaque.
Greater epicardial fat volume in HIV-infected men and its association with coronary plaque and antiretroviral therapy duration suggest potential mechanisms that might lead to increased risk for cardiovascular disease in HIV.
Imaging; plaque; risk factors; HIV; ART
We estimated US Department of Health and Human Services (DHHS)–approved human immunodeficiency virus (HIV) indicators. Among patients, 71% were retained in care, 82% were prescribed treatment, and 78% had HIV RNA ≤200 copies/mL; younger adults, women, blacks, and injection drug users had poorer outcomes. Interventions are needed to reduce retention- and treatment-related disparities.
HIV; quality of care; retention in care; antiretroviral therapy; HIV RNA suppression
Human papillomavirus (HPV) types 16 and 18 cause invasive cervical cancer and most invasive anal cancers (IACs). Overall, IAC rates are highest among men who have sex with men (MSM), especially MSM with HIV infection. Testosterone is prescribed for men with hypogonadism and HIV-related wasting. Although testosterone has direct and indirect physiological effects in males, its role in anal HPV16/18 infection in men is unknown.
Free testosterone (FT) was measured in serum from 340 Multicenter AIDS Cohort Study (MACS) participants who were tested for anal HPV16/18-DNA approximately 36 months later. The effect of log10-transformed current FT level on anal HPV16/18 prevalence was modeled using Poisson regression with robust error variance. Multivariate models controlled for other HPV types, cumulative years of exogenous testosterone use, race, age, lifetime number of receptive anal intercourse partnerships, body mass index, tobacco smoking, HIV-infection and CD4+ T-cell counts among HIV-infected, and blood draw timing.
Participants were, on average, 60 (±5.4) years of age, White (86%), and HIV-uninfected (56%). Twenty-four percent tested positive for anal HPV16- and/or HPV18-DNA (HPV16 prevalence=17.1%, HPV18=9.1%). In adjusted analysis, each half-log10 increase of FT was associated with a 1.9-fold (95% confidence interval: 1.11, 3.24) higher HPV16/18 prevalence. Additionally, other Group 1 high-risk HPVs were associated with a 1.56-fold (1.03, 2.37) higher HPV16/18 prevalence. Traditional risk factors for HPV16/18 infection (age, tobacco smoking, lifetime number of sexual partners, including the number of receptive anal intercourse partnerships within the 24 months preceding HPV testing) were poorly correlated with one another and not statistically significantly associated with higher prevalence of HPV16/18 infection in unadjusted and adjusted analyses.
Higher free testosterone was associated with increased HPV16/18 prevalence measured approximately three years later, independent of sexual behavior and other potential confounders. The mechanisms underlying this association remain unclear and warrant further study.
Some individuals remain HIV-1 antibody and PCR negative after repeated exposures to the virus, and are referred to as HIV-exposed seronegatives (HESN). However, the causes of resistance to HIV-1 infection in cases other than those with a homozygous CCR5Δ32 deletion are unclear. We hypothesized that human p21WAF1/CIP1 (a cyclin-dependent kinase inhibitor) could play a role in resistance to HIV-1 infection in HESN, as p21 expression has been associated with suppression of HIV-1 in elite controllers and reported to block HIV-1 integration in cell culture. We measured p21 RNA expression in PBMC from 40 HESN and 40 low-exposure HIV-1 seroconverters (LESC) prior to their infection using a real-time PCR assay. Comparing the 20 HESN with the highest exposure risk (median = 111 partners/2.5 years prior) to the 20 LESC with the lowest exposure risk (median = 1 partner/2.5 years prior), p21 expression trended higher in HESN in only one of two experiments (P = 0.11 vs. P = 0.80). Additionally, comparison of p21 expression in the top 40 HESN (median = 73 partners/year) and lowest 40 LESC (median = 2 partners/year) showed no difference between the groups (P = 0.84). There was a weak linear trend between risk of infection after exposure and increasing p21 gene expression (R2 = 0.02, P = 0.12), but again only in one experiment. Hence, if p21 expression contributes to the resistance to viral infection in HESN, it likely plays a minor role evident only in those with extremely high levels of exposure to HIV-1.
The major histocompatibility complex (MHC) region on chromosome 6p21.3 is suspected to host susceptibility loci for HIV-related Kaposi’s sarcoma (HIV-KS). A nested case-control study in the Multicenter AIDS Cohort Study was designed to conduct fine genetic association mapping across central MHC. Individuals co-infected with HIV-1 and HHV-8 who later developed KS were defined as cases (n=354) and were matched 1:1 with co-infected KS-free controls.
We report data for new independent MHC class II and III susceptibility loci. In particular, class II HLA-DMB emerged as a strong candidate, with the intronic variant rs6902982 A>G associated with a 4-fold increase of risk (OR = 4.09; 95% CI: 1.90–8.80; p = 0.0003). A striking multiplicative effect on the estimated risk was associated with further carriage of two non-synonymous variants, rs1800453 A>G (Asp697Gly) and rs4148880 A>G (Ile393Val), in the linked TAP1 gene (OR = 10.5; 95% CI: 2.54–43.6; p = 0.0012). The class III susceptibility variant is moderately associated with HIV-KS and lies within a 120 kb-long haplotype (OR = 1.52; 95% CI: 1.01–2.28; p = 0.047) formed by rs7029 A>G (GPANK1 3’UTR), rs1065356 G>A (LY6G6C), rs3749953 A>G (MSH5-SAPCD1 readthrough) and rs707926 G>A (VARS). Our data suggest that antigen processing by MHC class II molecules is a target pathway in the pathogenesis of HIV-KS.
To investigate the impact of HAART-induced HIV suppression on levels of 24 serological biomarkers of inflammation and immune activation.
Prospective cohort study.
Biomarkers were measured with multiplex assays in centralized laboratories using stored serum samples contributed by 1,697 men during 8,903 person-visits in the Multicenter AIDS Cohort Study (MACS) from 1984–2009. Using generalized gamma models, we compared biomarker values across three groups, adjusting for possible confounders: HIV-uninfected (NEG); HIV+, HAART-naïve (NAI); and HAART-exposed with HIV RNA suppressed to <50 copies/mL plasma (SUP). We also estimated changes in biomarker levels associated with duration of HIV suppression, using splined generalized gamma regression with a knot at one year.
Most biomarkers in the SUP group were largely normalized relative to the NAI group; however, 12 biomarkers in the SUP group remained distinct (p<0.002) from NEG values: CXCL10, CRP, sCD14, sTNFR2, TNF-α, sCD27, sGP130, IL-8, CCL13, BAFF, GM-CSF, and IL-12p70. Thirteen biomarkers exhibited significant changes in the first year after viral suppression, but none changed significantly after that time.
Biomarkers of inflammation and immune activation moved toward HIV-negative levels within the first year after HAART-induced HIV suppression. Although several markers of T cell activation returned to levels present in HIV-negative men, residual immune activation, particularly monocyte/macrophage activation, was present. This residual immune activation may represent a therapeutic target to improve the prognosis of HIV-infected individuals receiving HAART.
Acquired Immunodeficiency Syndrome; Antiretroviral Therapy, Highly Active; Biological Markers; Inflammation; Prospective Studies; Male
To determine whether markers of systemic inflammation are associated with the presence of moderate-to-severe obstructive sleep apnea (OSA), and whether this association differs based on HIV and HIV treatment status.
HIV-uninfected men (HIV−; n=60), HIV-infected men receiving HAART (HIV+/HAART; n=58), and HIV-infected men not receiving HAART (HIV+/No HAART; n=41) underwent polysomnography and measurement of plasma levels of TNF-alpha, soluble TNF-alpha receptors I and II (sTNFRI and sTNFRII), and IL-6. The relationship between moderate-severe OSA (respiratory disturbance index ≥15 apnea/hypopnea events/hour) and inflammatory markers was assessed with multivariable regression models.
Compared to the HIV− men, HIV+/HAART men and HIV+/No HAART men had higher levels of TNF-alpha, sTNFRI, and sTNFRII, independent of age, race, smoking status, obstructive lung disease (OLD), and BMI. Moderate-to-severe OSA was present in 48% of the sample (HIV−:57%; HIV+/HAART: 41%; HIV+/No HAART: 44%). Among the HIV+/No HAART men, but not in the other groups, TNF-alpha, sTNFRII, and IL-6 levels were higher in those with moderate-severe OSA compared to men with no-to-mild OSA after adjustment for age, race, smoking status, OLD, and BMI. Within this group, the association of high TNF-alpha concentrations with moderate-severe OSA was also independent of CD4 cell count and plasma HIV RNA concentration.
Compared to HIV-infected men on HAART and HIV-uninfected men, markers of systemic inflammation were higher in HIV-infected men not receiving HAART. In these men, TNF-alpha was significantly related to obstructive sleep apnea, independent of HIV-related covariates.
Diabetes and hypertension, common conditions in antiretroviral therapy (ART)-treated HIV-infected individuals, are associated with glomerular hyperfiltration, which precedes the onset of proteinuria and accelerated kidney function decline. In the Multicenter AIDS Cohort Study, we examined the extent to which hyperfiltration is present and associated with metabolic, cardiovascular, HIV, and treatment risk factors among HIV-infected men.
Cross-sectional cohort using direct measurement of glomerular filtration rate (GFR) by iohexol plasma clearance for 367 HIV-infected men and 241 HIV-uninfected men who were free of CKD.
Hyperfiltration was defined as GFR >140 ml/min/1.73 m2, minus 1 ml/min/1.73 m2 for each year of age over 40. Multivariate logistic regression was used to estimate the odds ratios (OR) of prevalent hyperfiltration for metabolic, cardiovascular, HIV, and cumulative ART exposure factors.
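The age-adjusted cutoff above is simple arithmetic and can be sketched as follows; the function names are ours, introduced only for illustration:

```python
def hyperfiltration_threshold(age_years):
    """GFR cutoff in ml/min/1.73 m2: 140, minus 1 for each year of age over 40."""
    return 140 - max(0, age_years - 40)

def is_hyperfiltration(gfr, age_years):
    """True if a measured GFR exceeds the age-adjusted cutoff."""
    return gfr > hyperfiltration_threshold(age_years)
```

For example, a 55-year-old's cutoff is 125 ml/min/1.73 m2, so a measured GFR of 130 would be classified as hyperfiltration at that age but not at age 40.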
Among subjects without CKD, the prevalence of hyperfiltration was higher for HIV-infected participants (25%) compared to uninfected participants (17%; p=0.01). HIV infection was associated with hyperfiltration (OR: 1.70, 95%CI: 1.11, 2.61) and modified the association between diabetes and hyperfiltration, such that the association among HIV-uninfected men (OR: 2.56 95%CI: 1.33, 5.54) was not observed among HIV-infected men (OR: 1.19, 95%CI: 0.69, 2.05). These associations were independent of known risk factors for hyperfiltration. Indicators of hyperglycemia and hypertension were also associated with hyperfiltration as was cumulative zidovudine exposure.
Hyperfiltration, a potential modifiable predictor of kidney disease progression, is common among ART-treated HIV-infected men. HIV infection is associated with significant odds of hyperfiltration in addition to known risk factors for kidney damage.
Glomerular hyperfiltration; glomerular filtration rate; HIV; antiretroviral therapy; iohexol
Like other members of the γ-herpesvirus family, human herpes virus 8 (HHV-8), the etiologic agent of classic and HIV-related Kaposi’s sarcoma (HIV-KS), acquired and evolved several human genes with key immune modulatory and cellular growth control functions. The encoded viral homologs substitute for their human counterparts but escape cellular regulation, leading to uncontrolled cell proliferation. We postulated that DNA variants in the human homologs of viral genes that potentially alter the expression or the binding of the encoded factors controlling the antiviral response may facilitate viral interference. To test whether cellular homologs are candidate susceptibility genes, we evaluated the association of DNA variants in 92 immune-related genes, including 7 cellular homologs, with the risk for HIV-KS in a matched case-control study nested in the Multicenter AIDS Cohort Study. Low- and high-risk gene-by-gene interactions were estimated by multifactor dimensionality reduction and used as predictors in conditional logistic models. Among the most significant gene interactions at risk (OR=2.84–3.92; Bonferroni-adjusted p= 9.9×10−3−2.6×10−4), three comprised human homologs of two latently expressed viral genes, cyclin D1 (CCND1) and interleukin-6 (IL-6), in conjunction with angiogenic genes (VEGF, EDN-1 and EDNRB). At lower significance thresholds (adjusted p < 0.05), human homologs related to apoptosis (CFLAR) and chemotaxis (CCL2) emerged as candidates. This “proof of concept” study identified human homologs involved in the regulation of type I interferon-induced signaling, cell cycle, and apoptosis as potentially important determinants of HIV-KS.
Kaposi’s sarcoma; Immunodeficiency; Herpes Virus 8; Multifactor Dimensionality Reduction; Polymorphism; Genetic association
Antiretroviral regimen changes occur frequently among HIV-infected persons. Duration and type of initial highly active antiretroviral therapy (HAART) and factors associated with regimen switching were evaluated in the Multicenter AIDS Cohort Study.
Participants were classified according to the calendar period of HAART initiation: T1 (1996-2001), T2 (2002-2005), and T3 (2006-2009). Kaplan-Meier curves depicted time from HAART initiation to first regimen change within 5.5 years. Cox proportional hazards regression models were used to examine factors associated with time to switching.
Of 1009 participants, 796 changed regimen within 5.5 years after HAART initiation. The percentage of participants who switched declined from 85% during T1 to 49% in T3. The likelihood of switching in T3 decreased by 50% (p<0.01) compared to T1 after adjustment for pre-HAART ART use, age, race, and CD4 count. Incomplete HIV suppression decreased over time (p<0.01) but predicted switching across all time periods. Lower HAART adherence (≤95% of prescribed doses) was predictive of switching only in T1. In T2, central nervous system symptoms predicted switching (RH = 1.7, p = 0.012). Older age at HAART initiation was associated with increased switching in T1 (RH = 1.03 per year increase) and decreased switching in T2 (RH = 0.97 per year increase).
During the first 15 years of the HAART era, initial HAART regimen duration lengthened and regimen discontinuation rates diminished. Both HIV RNA non-suppression and poor adherence predicted switching prior to 2001, while side effects that were possibly ART-related were more prominent during T2.
Comprehensive neuropsychological assessments for youth with ADHD allow for thorough consideration of co-occurring disorders and provide targeted recommendations for treating ADHD and comorbid conditions. This study offers a preliminary evaluation of the added value (compared to routine care) associated with neuropsychological assessment in the identification and treatment of ADHD in youth ages 3-17 years. First, we describe a novel measure developed to evaluate broad-based outcomes for youth with ADHD following neuropsychological assessment. Next, we compare parent ratings of child symptoms and quality of life between two groups of youth with ADHD: those who have recently received neuropsychological assessments (NP+), and those who have not (NP−). Participants were surveyed again 5 months after baseline to assess changes in symptoms, quality of life, and service utilization. While both groups experienced significant improvements in behavioral/emotional symptoms, the NP+ group had greater initiation of parent behavior management training, special education services, and medication management over the follow-up period, compared with the NP− group. Satisfaction with neuropsychological assessment was high overall but slightly decreased over the course of the follow-up period. The findings offer preliminary support for the incremental efficacy of neuropsychological evaluation in the diagnosis and management of ADHD.
ADHD; neuropsychological; testing; quality of life; utility; efficacy; outcome
Intra-subject variability (ISV) is the most consistent behavioral deficit in Attention Deficit Hyperactivity Disorder (ADHD). ISV may be associated with networks involved in sustaining task control (cingulo-opercular network: CON) and self-reflective lapses of attention (default mode network: DMN). The current study examined whether connectivity supporting attentional control is atypical in children with ADHD. Group differences in full-brain connection strength and brain–behavior associations with attentional control measures were examined for the late-developing CON and DMN in 50 children with ADHD and 50 typically-developing (TD) controls (ages 8–12 years).
Children with ADHD had hyper-connectivity both within the CON and within the DMN. Full-brain behavioral associations were found for a number of between-network connections. Across both groups, more anti-correlation between DMN and occipital cortex supported better attentional control. However, in the TD group, this brain–behavior association was stronger and occurred for a more extensive set of DMN–occipital connections. Differential support for attentional control between the two groups occurred with a number of CON–DMN connections. For all CON–DMN connections identified, increased between-network anti-correlation was associated with better attentional control for the ADHD group, but worse attentional control in the TD group. A number of between-network connections with the medial frontal cortex, in particular, showed this relationship. Follow-up analyses revealed that these associations were specific to attentional control and were not due to individual differences in working memory, IQ, motor control, age, or scan motion.
While CON–DMN anti-correlation is associated with improved attention in ADHD, other circuitry supports improved attention in TD children: greater CON–DMN anti-correlation supported better attentional control in children with ADHD but worse attentional control in TD children, whereas greater DMN–occipital anti-correlation supported better attentional control in TD children.
•Children with ADHD are hyper-connected within both the CON and DMN.
•More DMN–Visual antagonism supports better attention, particularly in controls.
•More DMN–CON antagonism supports better attention only in children with ADHD.
•CON–DMN compensation for attention may be due to stimulant medication use.
ADHD; Intra-subject variability; Attention; Resting-state connectivity; Network; Default mode network
BACKGROUND AND OBJECTIVE:
Behavioral disorders are highly comorbid with childhood learning disabilities (LDs), and accurate identification of LDs is vital for guiding appropriate interventions. However, it is difficult to conduct comprehensive assessment of academic skills within the context of primary care visits, lending utility to screening of academic skills via informant reports. The current study evaluated the clinical utility of a parent-reported screening measure in identifying children with learning difficulties.
Participants included 440 children (66.7% male), ages 5.25 to 17.83 years (mean = 10.32 years, SD = 3.06 years), referred for neuropsychological assessment. Academic difficulties were screened by parent report using the Colorado Learning Difficulties Questionnaire (CLDQ). Reading and math skills were assessed via individually administered academic achievement measures. Sensitivity, specificity, classification accuracy, and conditional probabilities were calculated to evaluate the efficacy of the CLDQ in predicting academic impairment.
Correlations ranged from −0.35 to −0.65 between the CLDQ reading scale and reading achievement measures, and from −0.24 to −0.46 between the CLDQ math scale and math achievement measures (all P < .01). Sensitivity was good for both reading and math scales, whereas specificity was low. Taking into account the high base rate of reading and math LDs within our sample, the conditional probability of true negatives (96.2% reading, 85.1% math) was higher than for true positives (40.5% reading, 37.9% math).
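The screening metrics reported above follow directly from a 2x2 confusion table. A minimal sketch, using hypothetical counts rather than the CLDQ sample, is:

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and conditional probabilities from screening counts."""
    sensitivity = tp / (tp + fn)   # P(screen positive | LD present)
    specificity = tn / (tn + fp)   # P(screen negative | LD absent)
    ppv = tp / (tp + fp)           # conditional probability of a true positive
    npv = tn / (tn + fn)           # conditional probability of a true negative
    return sensitivity, specificity, ppv, npv

# Hypothetical counts illustrating high sensitivity with low specificity,
# the pattern described for the CLDQ scales.
sens, spec, ppv, npv = screening_metrics(tp=85, fp=120, fn=15, tn=80)
```

With these illustrative counts, sensitivity is high (0.85) and specificity low (0.40), and the negative predictive value exceeds the positive predictive value, which is why an unelevated score is more informative than an elevated one when the base rate of LD is high.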
Overall, the CLDQ may more accurately predict children without LDs than children with LDs. As such, the absence of parent-reported difficulties may be adequate to rule out an overt LD, whereas elevated scores likely indicate the need for more comprehensive assessment.
learning disorders; rating scales; ADHD; achievement testing; sensitivity; specificity
Children with Attention-Deficit/Hyperactivity Disorder (ADHD) demonstrate increased response variability compared with controls, which is thought to be associated with deficits in attention regulation and response control that subsequently affect performance of more cognitively demanding tasks, such as reading. The present study examined response variability during a computerized simple reaction time (RT) task in 67 children. Ex-Gaussian analyses separated the response time distribution into normal (mu and sigma) and exponential (tau) components; the association of each with reading fluency was examined. Children with ADHD had significantly slower, more variable, and more skewed RTs compared with controls. After controlling for ADHD symptom severity, tau (but not mu or mean RT) was significantly associated with reduced reading fluency, but not with single word reading accuracy. These data support the growing evidence that RT variability, but not simply slower mean response speed, is characteristic of youth with ADHD and that longer response time latencies (tau) may be implicated in the poorer academic performance associated with ADHD.
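The ex-Gaussian decomposition above can be sketched with scipy's exponentially modified normal distribution; the reaction times here are simulated, not study data, and the parameter mapping assumes scipy's (K, loc, scale) convention:

```python
import numpy as np
from scipy.stats import exponnorm

# Simulate RTs (ms) as a Gaussian component plus an exponential tail,
# the generative form assumed by the ex-Gaussian model.
rng = np.random.default_rng(1)
rts_ms = rng.normal(450, 50, 1000) + rng.exponential(120, 1000)

# scipy parameterizes the ex-Gaussian as exponnorm(K, loc, scale),
# where mu = loc, sigma = scale, and tau = K * scale.
K, loc, scale = exponnorm.fit(rts_ms)
mu, sigma, tau = loc, scale, K * scale
```

Fitting separates the central tendency (mu), Gaussian spread (sigma), and the slow-response tail (tau); it is the tail parameter tau that the study links to reduced reading fluency.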
Attention; Dyslexia; Variability; Processing speed; Executive function; Ex-Gaussian analyses
Prospective cohort studies often quantify serum immune biomarkers at a single time point to determine risk of cancer and other chronic diseases that develop years later. Estimates of the within-person temporal stability of serum markers partly assess the utility of single biomarker measurements, and may have important implications for the design of prospective studies of chronic disease risk.
Using archived sera collected from 200 HIV-seronegative men at three visits spaced over approximately two years, concentrations of 14 biomarkers (ApoA1, sCD14, sgp130, sIL-6R, sIL-2Rα, sTNFR2, BAFF/BLyS, CXCL13, IFN-γ, IL-1β, IL-6, IL-8, IL-10, TNF-α) were measured in a single laboratory. Age- and ethnicity-adjusted intraclass correlation coefficients (ICC) were calculated for each biomarker, and mixed linear regression models were used to examine the influence of age, ethnicity, season, and study site on biomarker concentrations.
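The ICC quantifies how much of the total variation in repeated biomarker measurements is between-person rather than within-person. A minimal one-way ICC sketch on simulated data (omitting the age and ethnicity adjustment the study applied) is:

```python
import numpy as np

def icc_oneway(data):
    """ICC(1) for an (n_subjects x n_visits) array: between-person variance
    as a share of total variance, from one-way ANOVA mean squares."""
    n, k = data.shape
    subject_means = data.mean(axis=1)
    ms_between = k * ((subject_means - data.mean()) ** 2).sum() / (n - 1)
    ms_within = ((data - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Simulated biomarker: stable person-level differences plus visit-to-visit noise.
rng = np.random.default_rng(2)
person_level = rng.normal(0, 3, (200, 1))
visits = person_level + rng.normal(0, 1, (200, 3))   # 200 men, 3 visits
icc = float(icc_oneway(visits))
```

Because the simulated between-person spread dominates the visit-to-visit noise, the ICC here comes out high, the pattern the abstract describes for ApoA1 and TNF-α; a noisier marker like IL-8 would show the reverse.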
Across all three study visits, most biomarkers had ICC values indicating fair to excellent within-person stability. ApoA1 (ICC=0.88) and TNF-α (ICC=0.87) showed the greatest stability; IL-8 (ICC=0.33) was markedly less stable. The ICCs were similar when calculated between pairs of consecutive visits. The covariables did not influence biomarker levels or their temporal stability. All biomarkers showed moderate to strong pairwise correlations across visits.
Serum concentrations of most evaluated immune biomarkers displayed acceptable to excellent within-person temporal reliability over a 2-year period. Further investigation may be required to clarify the stability of IL-8.
These findings lend support to using these serologic immune biomarkers in prospective studies investigating associations with chronic diseases.
Temporal stability; biomarkers; cytokines; soluble receptors; intraclass correlation coefficient
Background. The role of active hepatitis C virus (HCV) replication in chronic kidney disease (CKD) risk has not been clarified.
Methods. We compared CKD incidence in a large cohort of HIV-infected subjects who were HCV seronegative, HCV viremic (detectable HCV RNA), or HCV aviremic (HCV seropositive, undetectable HCV RNA). Stages 3 and 5 CKD were defined according to standard criteria. Progressive CKD was defined as a sustained 25% glomerular filtration rate (GFR) decrease from baseline to a GFR < 60 mL/min/1.73 m2. We used Cox models to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs).
Results. A total of 52 602 HCV seronegative, 9508 HCV viremic, and 913 HCV aviremic subjects were included. Compared with HCV seronegative subjects, HCV viremic subjects were at increased risk for stage 3 CKD (adjusted HR 1.36 [95% CI, 1.26, 1.46]), stage 5 CKD (1.95 [1.64, 2.31]), and progressive CKD (1.31 [1.19, 1.44]), while HCV aviremic subjects were also at increased risk for stage 3 CKD (1.19 [0.98, 1.45]), stage 5 CKD (1.69 [1.07, 2.65]), and progressive CKD (1.31 [1.02, 1.68]).
Conclusions. Compared with HIV-infected subjects who were HCV seronegative, both HCV viremic and HCV aviremic individuals were at increased risk for moderate and advanced CKD.
HIV; hepatitis C virus; chronic kidney disease; hepatitis C RNA; cohort study; glomerular filtration rate; injection drug use
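The outcome definitions in the abstract above can be made concrete with a small sketch. The function below is hypothetical and simplified: it applies standard (KDIGO-style) eGFR cutoffs for stages 3 and 5 and the ≥25% decline-to-<60 rule for progressive CKD at a single time point, whereas the cohort definition additionally requires the decline to be sustained.

```python
def classify_ckd(baseline_gfr, current_gfr):
    """Classify kidney outcomes from eGFR values (mL/min/1.73 m2).

    Illustrative snapshot classifier; the study's 'progressive CKD'
    definition also requires the decrease to be sustained over time.
    """
    outcomes = set()
    if current_gfr < 60:                         # stage 3 threshold
        outcomes.add("stage3")
    if current_gfr < 15:                         # stage 5 threshold
        outcomes.add("stage5")
    if current_gfr < 60 and current_gfr <= 0.75 * baseline_gfr:
        outcomes.add("progressive")              # >=25% drop to GFR < 60
    return outcomes
```

For example, a fall from 100 to 55 mL/min/1.73 m2 meets both the stage 3 and progressive criteria, while a fall from 70 to 55 meets only stage 3.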
Initiation of effective antiretroviral therapy (ART) has been found to increase the risk of dyslipidemia in HIV-infected individuals, and dyslipidemia is a risk factor for kidney disease in the general population. We therefore examined changes in lipid profiles in HIV-infected men following ART initiation and their association with subsequent kidney dysfunction. HIV-infected men from the Multicenter AIDS Cohort Study who initiated ART between December 31, 1995 and September 30, 2011 and had lipid and serum creatinine values measured both pre-ART and post-ART were selected. The associations between changes in total cholesterol or high-density lipoprotein following ART initiation and the change in estimated glomerular filtration rate (eGFR) over time were assessed using piecewise linear mixed effects models. There were 365 HIV-infected men who contributed to the analysis. In the adjusted models, at 3 years post-ART, those with increases in total cholesterol >50 mg/dl had an average decrease in eGFR of 2.6 ml/min/1.73 m2 per year (p<0.001), and at 5 years post-ART the average decrease was 2.4 ml/min/1.73 m2 per year (p=0.008). This decline contrasted with the estimates for those with changes in total cholesterol ≤50 mg/dl: decreases of 1.4 ml/min/1.73 m2 per year (p<0.001) and 0.1 ml/min/1.73 m2 per year (p=0.594) at the same time points, respectively. Large decreases in high-density lipoprotein (a decline of greater than 5 mg/dl) were not associated with declines in eGFR. These results indicate that large ART-related increases in total cholesterol may be a risk factor for kidney function decline in HIV-infected men. Should these results be generalizable to the broader HIV-infected population, monitoring cholesterol changes following ART initiation may be important for identifying persons at risk for kidney disease.
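The piecewise linear time trend underlying the mixed effects analysis above can be sketched as a design matrix with a single knot. Everything here, including the knot location, is illustrative rather than the study's actual model specification.

```python
import numpy as np

def piecewise_design(years_since_art, knot=3.0):
    """Design columns for a piecewise linear time trend with one knot.

    Columns are [intercept, t, (t - knot)_+]: the eGFR slope is b1
    before `knot` years post-ART and b1 + b2 afterwards. Illustrative
    only; the cohort analysis included additional covariates and
    random effects.
    """
    t = np.asarray(years_since_art, dtype=float)
    return np.column_stack([np.ones_like(t),            # intercept
                            t,                          # linear time
                            np.clip(t - knot, 0.0, None)])  # hinge term
```

The hinge column is zero before the knot, so the fitted slope can change at the knot without a jump in the trajectory itself.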
Primary liver cancer (PLC) is the third leading cause of cancer mortality globally. In endemic areas of sub-Saharan Africa and Asia, PLC largely arises from chronic infection with hepatitis B virus (HBV) and ingestion of aflatoxins. Synergistic interactions between these two risk factors have been observed in cohort studies in China; here we determined the impact on PLC primary prevention of agricultural reforms in the 1980s, which diminished maize consumption, and of subsidized universal vaccination against HBV in the 2000s. A population-based cancer registry was used to track PLC mortality in Qidong, China, which was compared with the timeline of HBV immunization. Randomly selected serum samples from archived cohort collections from the 1980s to the present were analyzed for aflatoxin biomarkers. Greater than 50% reductions in PLC mortality rates occurred across birth cohorts from the 1960s to the 1980s for Qidongese less than 35 years of age, although all were born before universal vaccination of newborns. Median levels of the aflatoxin biomarker decreased from 19.3 pg/mg albumin in 1989 to undetectable (<0.5 pg/mg) by 2009. A population attributable benefit of 65% for reduced PLC mortality was estimated from a government-facilitated switch of dietary staple from maize to rice; 83% of this benefit was in those infected with HBV. Food policy reforms in China resulted in a dramatic decrease in aflatoxin exposure, which, independent of HBV vaccination, reduced liver cancer risk. The extensive HBV vaccine coverage now in place augurs even greater risk reductions in the future.
aflatoxin; primary prevention; liver cancer
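The "population attributable benefit" above was estimated by the study's own methods. As a generic point of reference, the standard (Levin) population attributable fraction for a binary exposure can be computed as below; the inputs in the example are made up, not the study's data.

```python
def attributable_fraction(p_exposed, relative_risk):
    """Levin's formula: AF = p(RR - 1) / (1 + p(RR - 1)).

    p_exposed: prevalence of the exposure in the population (0 to 1).
    relative_risk: risk in the exposed relative to the unexposed.
    Illustrative only; not the estimator used in the Qidong analysis.
    """
    excess = p_exposed * (relative_risk - 1.0)
    return excess / (1.0 + excess)
```

For instance, if half the population is exposed to a factor carrying a threefold risk, the fraction of cases attributable to the exposure is 0.5.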
Coronary artery disease (CAD) has been associated with HIV infection; however, the data are inconsistent.
We performed cardiac CT to determine whether HIV-infected men have more coronary atherosclerosis than uninfected men.
Cross-sectional study within the Multicenter AIDS Cohort Study (MACS).
HIV-infected (n=618) and HIV-uninfected (n=383) men who have sex with men (MSM) underwent non-contrast and contrast-enhanced cardiac CT if they were between 40 and 70 years of age, weighed <300 pounds, and had no history of coronary revascularization.
Presence and extent (among those with plaque) of coronary artery calcium (CAC) on non-contrast CT, and of any plaque (non-calcified, mixed, or calcified) and stenosis on CT angiography.
1001 men underwent non-contrast CT, of whom 759 had coronary CT angiography. After adjusting for age, race, center, and cohort, HIV-infected men had a greater prevalence of CAC [prevalence ratio (PR) = 1.21, 95% confidence interval (CI) 1.08–1.35, p = 0.001] and any plaque [PR = 1.14 (1.05–1.24), p = 0.001], including non-calcified plaque [PR = 1.28 (1.13–1.45), p < 0.001] and mixed plaque [PR = 1.35 (1.10–1.65), p = 0.004], than HIV-uninfected men. Associations between HIV infection and any plaque and non-calcified plaque remained significant (p < 0.005) after CAD risk factor adjustment. HIV-infected men also had a greater extent of non-calcified plaque after CAD risk factor adjustment (p = 0.026). HIV-infected men had a greater prevalence of coronary artery stenosis >50% than HIV-uninfected men [PR = 1.48 (1.06–2.07), p = 0.020], but not after CAD risk factor adjustment. Longer duration of highly active antiretroviral therapy [PR = 1.09 (1.02–1.17), p = 0.007, per year] and lower nadir CD4+ T-cell count [PR = 0.80 (0.69–0.94), p = 0.005, per 100 cells] were associated with coronary stenosis >50%.
Coronary artery plaque, especially non-calcified plaque, is more prevalent and extensive in HIV-infected men, independent of CAD risk factors.
Limitations include the cross-sectional observational study design and the inclusion of only men.
Primary Funding Source: NHLBI and NIAID.
To compare the proportion, timing, and hazards of non-AIDS and AIDS deaths among men and women who initiated HAART at different CD4+ cell counts with the mortality risks of HIV-uninfected persons with similar risk factors.
Prospective cohort studies.
We used parametric mixture models to compare proportions of AIDS and non-AIDS mortality and ages at death, and multivariable Cox models to compare cause-specific hazards of mortality, across levels of CD4+ cell count at HAART initiation (≤200 cells/μl: ‘late’, 201–350 cells/μl: ‘intermediate’, >350 cells/μl: ‘early’) and with HIV-uninfected individuals from the Multicenter AIDS Cohort Study and the Women’s Interagency HIV Study. We used multiple imputation methods to address lead-time bias in sensitivity analysis.
Earlier initiators were more likely to die of non-AIDS causes (early: 78%, intermediate: 74%, late: 49%), and at older ages (median 72, 69, and 66 years, respectively), relative to later initiators. Estimated median ages at non-AIDS death for each CD4+ cell count category were lower than that estimated for the HIV-uninfected group (75 years). In multivariable analysis, non-AIDS death hazard ratios relative to early initiators were 2.15 for late initiators (P < 0.01) and 1.66 for intermediate initiators (P = 0.01); AIDS death hazard ratios were 3.26 for late initiators (P < 0.01) and 1.20 for intermediate initiators (P = 0.28). Strikingly, the adjusted hazards for non-AIDS death among HIV-uninfected individuals and early initiators were nearly identical (hazard ratio 1.01). Inferences were unchanged after adjustment for lead-time bias.
Results suggest the possibility of reducing the risk of non-AIDS mortality among HIV-infected individuals to approximate that faced by comparable HIV-uninfected individuals.
antiretroviral therapy; bias; CD4+ cell count; cohort studies; competing risks; mortality; statistical
Successful implementation of functional self-care skills depends upon adequate executive functioning; however, many scales assessing adaptive skills do not address the inherent executive burden of these tasks. This omission is especially relevant for individuals with spina bifida, for whom medical self-care tasks impose a significant burden requiring initiation and prospective memory. The Kennedy Krieger Independence Scales–Spina Bifida Version (KKIS–SB) is a caregiver-reported measure designed to address this gap; it assesses skills for managing both typical and spina bifida-related daily self-care demands, with a focus on the timely and independent initiation of adaptive skills.
Parents of 100 youth and young adults with spina bifida completed the KKIS–SB. Exploratory factor analysis and Pearson's correlations were used to assess the factor structure, reliability, and construct validity of the KKIS–SB.
The scale demonstrated excellent internal consistency (Cronbach's alpha = .891). Exploratory factor analysis yielded four factors, explaining 65.1% of the total variance. Two primary subscales were created, Initiation of Routines and Prospective Memory, which provide meaningful clinical information regarding management of a variety of typical (e.g., get up on time, complete daily hygiene routines on time) and spina bifida-specific self-care tasks (e.g., begin self-catheterization on time, perform self-examination for pressure sores).
Based upon internal consistency estimates and correlations with measures of similar constructs, initial data suggest good preliminary reliability and validity of the KKIS–SB.
transition; adaptive functioning; validity; factor structure; executive function
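The internal-consistency statistic reported above is computed from item-level scores. A minimal sketch, using hypothetical data rather than actual KKIS–SB items, is:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: (n_respondents, k_items) array of item scores.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total score
    return k / (k - 1) * (1 - item_vars / total_var)
```

Items that move together across respondents drive alpha toward 1, while uncorrelated items drive it toward 0.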
We examined the implications of using the Full Scale Intelligence Quotient (FSIQ) versus the General Abilities Index (GAI) for determination of intellectual disability using the Wechsler Intelligence Scales for Children, fourth edition (WISC-IV).
Children referred for neuropsychological assessment (543 males, 290 females; mean age 10y 5mo, SD 2y 9mo, range 6–16y) were administered the WISC-IV and the Adaptive Behavior Assessment System, Second Edition (ABAS-II).
GAI and FSIQ were highly correlated; however, fewer children were identified as having intellectual disability using GAI (n=159) than when using FSIQ (n=196). Although the 44 children classified as having intellectual disability based upon FSIQ (but not GAI) had significantly higher adaptive functioning scores than those meeting intellectual disability criteria based upon both FSIQ and GAI, mean adaptive scores still fell within the impaired range. FSIQ and GAI were comparable in predicting impairments in adaptive functioning.
Using GAI rather than FSIQ in intellectual disability diagnostic decision making resulted in fewer individuals being diagnosed with intellectual disability; however, the mean GAI of the disqualified individuals was at the upper end of criteria for intellectual impairment (standard score 75), and these individuals remained adaptively impaired. As GAI and FSIQ were similarly predictive of overall adaptive functioning, the use of GAI for intellectual disability diagnostic decision making may be of limited value.