We examined associations of stimulant use (methamphetamine and cocaine) and other substances (nicotine, marijuana, alcohol, inhaled nitrites) with immune function biomarkers among HIV-seropositive (HIV+) men using highly active antiretroviral therapy (ART) and HIV-seronegative (HIV−) men in the Multicenter AIDS Cohort Study (MACS). Among HIV+ men, cumulative adherence to ART (4.07; 95% CI: 3.52, 4.71, per 10 years of adherent ART use) and recent cohort enrollment (1.38; 95% CI: 1.24, 1.55) were multiplicatively associated with increases in CD4+/CD8+ ratios. Cumulative use of methamphetamine (0.93; 95% CI: 0.88, 0.98, per 10 use years), cumulative use of cocaine (0.93; 95% CI: 0.89, 0.96, per 10 use years), and cumulative medical visits (0.99; 95% CI: 0.98, 0.99, per 10 visit years) each showed small negative associations with CD4+/CD8+ ratios. Among HIV− men, cumulative medical visits (0.996; 95% CI: 0.993, 0.999), cumulative number of male sexual partners (0.999; 95% CI: 0.998, 0.9998, per 10 partner years), and cigarette pack years (1.10; 95% CI: 1.02, 1.18, per 10 pack years) were associated with CD4+/CD8+ ratios over the same period. ART adherence was positively associated with immune function independent of stimulant use, underscoring the influence of ART on immune health for HIV+ men who use stimulants.
HIV; men; methamphetamine; cocaine; CD4+/CD8+ ratio; antiretroviral therapy; adherence; Multicenter AIDS Cohort Study
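For readers unfamiliar with "multiplicative" associations of the kind reported above, they correspond to modeling the logarithm of the CD4+/CD8+ ratio and exponentiating the regression coefficients. The sketch below illustrates the idea with ordinary least squares on hypothetical data; the actual MACS analysis used repeated measures and many more covariates, so this is only a toy example with invented variable names.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical person-visit data; column names are illustrative, not MACS variables.
df = pd.DataFrame({
    "cd4_cd8_ratio":      [0.8, 1.1, 0.6, 1.4, 0.9, 1.2, 0.7, 1.0],
    "adherent_art_years": [2.0, 8.0, 1.0, 10.0, 5.0, 7.0, 0.5, 4.0],
    "meth_use_years":     [0.0, 1.0, 6.0, 0.0, 3.0, 0.0, 5.0, 2.0],
    "cocaine_use_years":  [1.0, 0.0, 4.0, 0.0, 2.0, 1.0, 3.0, 0.0],
})

# Regress log(ratio) on exposures scaled per 10 years; exponentiated
# coefficients are then multiplicative effects on the ratio itself.
fit = smf.ols(
    "np.log(cd4_cd8_ratio) ~ I(adherent_art_years / 10)"
    " + I(meth_use_years / 10) + I(cocaine_use_years / 10)",
    data=df,
).fit()

print(np.exp(fit.params))  # >1 = relative increase in the ratio, <1 = relative decrease
```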
To assess the impact of patient population characteristics on the accuracy of CT angiography (CTA) for detecting obstructive coronary artery disease (CAD).
The ability of CTA to exclude obstructive CAD in patients with different pretest probabilities and in the presence of coronary calcification remains uncertain.
In the CORE-64 study, 371 patients underwent CTA and cardiac catheterization for the detection of obstructive CAD, defined as 50% or greater luminal stenosis by quantitative coronary angiography (QCA). This analysis includes 80 patients with a calcium score ≥ 600 who had initially been excluded. The area under the receiver-operating-characteristic curve (AUC) was used to evaluate CTA diagnostic accuracy against QCA in patient subgroups defined by calcium score and pretest probability of CAD.
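As a rough illustration of this type of accuracy analysis, the sketch below computes a patient-level AUC for a quantitative CTA stenosis measure against a binary QCA reference, overall and within a high calcium-score stratum. All data, thresholds, and variable names are hypothetical; this is not the CORE-64 analysis code.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200

# Hypothetical per-patient data: QCA reference (1 = >=50% stenosis present)
# and the maximal per-patient stenosis estimate from quantitative CTA.
qca_obstructive = rng.integers(0, 2, size=n)
cta_percent_stenosis = np.clip(
    50 * qca_obstructive + rng.normal(30, 15, size=n), 0, 100)
calcium_score = rng.gamma(shape=1.5, scale=300, size=n)

# Overall patient-based diagnostic accuracy of CTA against the QCA reference.
print("AUC, all patients:",
      round(roc_auc_score(qca_obstructive, cta_percent_stenosis), 2))

# Accuracy within a high calcium-score stratum (e.g., Agatston >= 600),
# mirroring the subgroup comparison reported in the abstract.
high_cac = calcium_score >= 600
print("AUC, calcium score >= 600:",
      round(roc_auc_score(qca_obstructive[high_cac],
                          cta_percent_stenosis[high_cac]), 2))
```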
Patient-based analysis of quantitative CTA accuracy revealed an AUC of 0.93 (95% confidence interval [CI]: 0.90-0.95). The AUC remained 0.93 (0.90-0.96) after excluding patients with known CAD but decreased to 0.81 (0.71-0.89) in patients with a calcium score ≥ 600 (p=0.077). While AUCs were similar for patients with intermediate pretest probability, high pretest probability, and known CAD (0.93, 0.92, and 0.93, respectively), negative predictive values differed: 0.90, 0.83, and 0.50, respectively. The negative predictive value decreased from 0.93 in patients with a calcium score < 100 to 0.75 in those with a score ≥ 100 (p = 0.053).
Both pretest probability for CAD and coronary calcium scoring should be considered before using CTA for excluding obstructive CAD. CTA is less effective for this purpose in patients with calcium score ≥ 600 and in patients with a high pretest probability for obstructive CAD.
Coronary Disease; Imaging; Angiography
Survivors of acute lung injury (ALI) and their informal caregivers have difficulty coping with the physical and emotional challenges of recovery from critical illness. We aimed to develop and pilot test a telephone-based coping skills training intervention for this population.
A total of 58 participants were enrolled. Twenty-one patients and 23 caregivers participated in a cross-sectional study to assess coping and its association with psychological distress; this study also informed the development of an ALI coping skills training intervention in an iterative process involving content and methodological experts. The intervention was then evaluated in 7 patients and 7 caregivers in an uncontrolled, prospective, pre-post study. Outcomes included acceptability, feasibility, and symptoms of psychological distress measured with the Hospital Anxiety and Depression Scale (HADS) and the Post-Traumatic Symptom Scale (PTSS).
Survivors and their caregivers used adaptive coping infrequently, a pattern that was strongly associated with psychological distress. These findings informed the development of a 12-session intervention for acquiring, applying, and maintaining coping skills. In the evaluation phase, participants completed 77 (92%) of a possible 84 telephone sessions, and all (100%) reported that the intervention was useful in their daily routines. Mean change scores reflecting improvements in the HADS (7.8 units) and PTSS (10.3 units) were associated with adaptive coping (r=0.50–0.70) and high self-efficacy (r=0.67–0.79).
A novel telephone-based coping skills training intervention was acceptable, feasible, and may have been associated with a reduction in psychological distress among survivors of ALI and their informal caregivers. A randomized trial is needed to evaluate the intervention.
coping; acute lung injury; behavioral therapy; quality of life
Families and other surrogate decision-makers for chronically critically ill patients often lack information about patient prognosis or options for care. This study describes an approach to develop and validate a printed information brochure about chronic critical illness aimed at improving comprehension of the disease process and outcomes for patients’ families and other surrogate decision-makers.
Investigators reviewed existing literature to identify key domains of informational needs. Content of these domains was incorporated in a draft brochure that included graphics and a glossary of terms. Clinical sensibility, balance, and emotional sensitivity of the draft brochure were tested in a series of evaluations by cohorts of experienced clinicians (n=49) and clinical content experts (n=8), with revisions after each review. Cognitive testing of the brochure was performed through interviews of 10 representative family members of chronically critically ill patients with quantitative and qualitative analysis of responses.
Measurements and Main Results
Clinical sensibility and balance were rated in the two most favorable categories on a 5-point scale by more than two thirds of clinicians and content experts. After reviewing the brochure, family members described it as clear and readable and recommended that it be delivered to families by clinicians, followed by a discussion of its contents. They indicated that the glossary was useful and recommended supplementing it with additional lists of local resources. After reading the brochure, their prognostic estimates became more consistent with actual outcomes.
We have developed and validated a printed information brochure that may improve family comprehension of chronic critical illness and its outcomes. The structured process that is described can serve as a template for the development of other information aids for use with seriously ill populations.
critically ill; mechanical ventilation; communication; information sharing; tracheostomies; validation studies
Burkholderia pseudomallei and B. mallei are closely related Category B Select Agents of bioterrorism and the causative agents of the diseases melioidosis and glanders, respectively. Rapid phage-based diagnostic tools would greatly benefit early recognition and treatment of these diseases. There is extensive strain-to-strain variation in B. pseudomallei genome content due in part to the presence or absence of integrated prophages. Several phages have previously been isolated from B. pseudomallei lysogens, for example φK96243, φ1026b and φ52237.
We have isolated a P2-like bacteriophage, φX216, which infects 78% of all B. pseudomallei strains tested. φX216 also infects B. mallei, but not other Burkholderia species, including the closely related B. thailandensis and B. oklahomensis. The nature of the φX216 host receptor remains unclear, but evidence indicates that φX216 uses the lipopolysaccharide O-antigen as its receptor in B. mallei and a different receptor in B. pseudomallei. The 37,637 bp genome of φX216 encodes 47 predicted open reading frames and shares 99.8% pairwise identity and an identical strain host range with bacteriophage φ52237. Closely related P2-like prophages appear to be widely distributed among B. pseudomallei strains, but both φX216 and φ52237 readily infect prophage-carrying strains.
The broad strain infectivity and high specificity for B. pseudomallei and B. mallei indicate that φX216 will provide a good platform for the development of phage-based diagnostics for these bacteria.
Bacteriophage; Burkholderia pseudomallei; B. mallei; P2; Prophage distribution; Phage-based diagnostics
The association between methamphetamine use and HIV seroconversion for men who have sex with men (MSM) was examined using longitudinal data from the Multicenter AIDS Cohort Study.
HIV-seronegative men (n=4,003) enrolled in 1984–1985, 1987–1991, and 2001–2003 were identified. Recent methamphetamine and popper (inhaled nitrite) use was determined at either the current or the previous visit. Time to HIV seroconversion was the outcome of interest. Covariates included race/ethnicity, cohort, study site, educational level, number of sexual partners, number of unprotected insertive anal sex partners (UIAS), number of unprotected receptive anal sex partners (URAS), insertive rimming, cocaine use at either the current or last visit, ecstasy use at either the current or last visit, any needle use since the last visit, CES-D depression score > 16 since the last visit, and alcohol consumption.
After adjusting for covariates, methamphetamine use was independently associated with an approximately 1.46-fold increased relative hazard (HR) of HIV seroconversion. The HR associated with popper use was 2.1 [95% CI 1.63, 2.70]. The HR of HIV seroconversion increased with the number of URAS, ranging from 1.87 [95% CI 1.40, 2.51] for 1 partner to 9.32 [95% CI 6.20, 13.98] for 5+ partners. The joint HR for methamphetamine and popper use was 3.05 [95% CI 2.12, 4.37]. Most notably, there was a significant joint HR for methamphetamine use and URAS of 2.71 [95% CI 1.81, 4.04] for men with 1 unprotected receptive anal sex partner, which increased in a dose-dependent manner for more than 1 partner.
Further examination of the synergism of patterns of drug use and sexual risk behaviors on rates of HIV seroconversion will be necessary in order to develop new HIV prevention strategies for drug-using MSM.
Multicenter AIDS cohort study; methamphetamine; HIV seroconversion; MSM
Significant deficiencies exist in the communication of prognosis for patients requiring prolonged mechanical ventilation after acute illness, in part because of clinician uncertainty about long-term outcomes. We sought to refine a mortality prediction model for patients requiring prolonged ventilation using a multicentered study design.
Five geographically diverse tertiary care medical centers in the United States (California, Colorado, North Carolina, Pennsylvania, Washington).
Two hundred sixty adult patients who received at least 21 days of mechanical ventilation after acute illness.
Measurements and Main Results
For the probability model, we included age, platelet count, and requirement for vasopressors and/or hemodialysis, each measured on day 21 of mechanical ventilation, in a logistic regression model with 1-yr mortality as the outcome variable. We subsequently modified a simplified prognostic scoring rule (ProVent score) by categorizing the risk variables (age 18–49, 50–64, and >65 yrs; platelet count 0–150 and >150; vasopressors; hemodialysis) in another logistic regression model and assigning points to variables according to their β coefficient values. Overall mortality at 1 yr was 48%. The area under the receiver operating characteristic curve for the primary ProVent probability model was 0.79 (95% confidence interval, 0.75–0.81), and the p value for the Hosmer-Lemeshow goodness-of-fit statistic was .89. The area under the curve for the categorical model was 0.77, and the p value for the goodness-of-fit statistic was .34. The area under the curve for the ProVent score was 0.76, and the p value for the Hosmer-Lemeshow goodness-of-fit statistic was .60. For the 50 patients with a ProVent score >2, only one patient was discharged directly home, and 1-yr mortality was 86%.
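To make the scoring step concrete, the sketch below fits a logistic regression to categorized day-21 risk variables and converts the β coefficients into integer points, in the spirit of a ProVent-style rule. The data, categories, and resulting points are hypothetical and do not reproduce the published model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical day-21 data for patients ventilated >= 21 days; values are random.
rng = np.random.default_rng(1)
n = 260
df = pd.DataFrame({
    "died_1yr":     rng.integers(0, 2, n),
    "age_cat":      rng.choice(["18-49", "50-64", ">65"], n),  # categorized age
    "plt_low":      rng.integers(0, 2, n),                     # platelets 0-150
    "vasopressors": rng.integers(0, 2, n),
    "hemodialysis": rng.integers(0, 2, n),
})

# Logistic regression on the categorized risk variables.
fit = smf.logit(
    "died_1yr ~ C(age_cat, Treatment(reference='18-49'))"
    " + plt_low + vasopressors + hemodialysis",
    data=df,
).fit(disp=False)

# Convert betas to a simple integer point score (Sullivan-style scaling);
# the published ProVent score assigned points differently.
betas = fit.params.drop("Intercept")
points = (betas / betas.abs().max() * 3).round().astype(int)
print(points)
```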
The ProVent probability model is a simple and reproducible model that can accurately identify patients requiring prolonged mechanical ventilation who are at high risk of 1-yr mortality.
communication; critical care; mechanical ventilation; multiple organ failure; outcomes; prognosis
The rate of decline of glomerular filtration rate (GFR) in children with chronic kidney disease (CKD) can vary, even among those with similar diagnoses. Classic regression methods applied to the log-transformed GFR (i.e., lognormal) quantify only rigid shifts in a given outcome. The generalized gamma distribution offers an alternative approach for characterizing the heterogeneity of the effect of an exposure on a positive, continuous outcome. Using directly measured GFR longitudinally assessed between 2005 and 2010 in 529 children enrolled in the Chronic Kidney Disease in Children Study, the authors characterized the effect of glomerular versus nonglomerular CKD diagnoses on the outcome, measured as the annualized GFR ratio. Relative percentiles were used to characterize the heterogeneity of the effect of CKD diagnosis across the distribution of the outcome. The rigid shift assumed by the classic mixed models failed to capture the fact that the greatest difference between the annualized GFR ratios of the glomerular and nonglomerular diagnosis groups was in children who exhibited the fastest GFR declines. Although this difference was enhanced in children with an initial GFR of 45 mL/minute/1.73 m² or less, the effect of diagnosis on outcome was not significantly modified by GFR level. Generalized gamma models captured heterogeneity of effect more richly and provided a better fit to the data than did conventional lognormal models.
epidemiologic methods; glomerular filtration rate; pediatrics; prospective studies; regression analysis; renal insufficiency, chronic; statistical distributions
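As a minimal illustration of the distributional comparison described above, the sketch below fits lognormal and generalized gamma distributions to a hypothetical positive outcome (an annualized GFR ratio) and compares them by AIC; the generalized gamma contains the lognormal as a limiting case. The study itself used full regression models with covariates and relative percentiles, so this is only a toy analogue.

```python
import numpy as np
from scipy import stats

# Hypothetical positive outcome: annualized GFR ratio (follow-up GFR / baseline GFR).
rng = np.random.default_rng(2)
ratio = rng.lognormal(mean=-0.05, sigma=0.2, size=500)

# Fit lognormal and generalized gamma distributions with the location fixed at 0.
ln_params = stats.lognorm.fit(ratio, floc=0)
gg_params = stats.gengamma.fit(ratio, floc=0)

def aic(dist, params, x, n_fixed=1):
    k = len(params) - n_fixed            # loc was fixed, not estimated
    return 2 * k - 2 * np.sum(dist.logpdf(x, *params))

print("lognormal AIC        :", round(aic(stats.lognorm, ln_params, ratio), 1))
print("generalized gamma AIC:", round(aic(stats.gengamma, gg_params, ratio), 1))

# Percentiles of the fitted generalized gamma, analogous to the relative
# percentiles used to describe heterogeneity across the outcome distribution.
print(stats.gengamma.ppf([0.10, 0.50, 0.90], *gg_params))
```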
We examined whether MDCT improves the ability to define peri-infarct zone (PIZ) heterogeneity relative to MRI.
The PIZ as characterized by delayed enhanced (de) MRI identifies patients susceptible to ventricular arrhythmias and predicts outcome after myocardial infarction (MI).
Fifteen mini-pigs underwent coronary artery occlusion followed by reperfusion. MDCT and MRI were performed on the same day approximately 6 months after MI induction, followed by animal sacrifice and ex-vivo MRI (n=5). Signal density threshold algorithms were applied to MRI and MDCT data sets reconstructed at various slice thicknesses (1–8 mm) to define the PIZ and quantify partial volume effects.
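The exact signal density threshold algorithm is not specified in the abstract; one common family of schemes defines the infarct core and peri-infarct zone relative to the mean and standard deviation of remote (normal) myocardium. The sketch below uses such a scheme (+2 SD / +3 SD cutoffs) on simulated voxel intensities purely for illustration.

```python
import numpy as np

# Simulated delayed-enhancement signal intensities for myocardial voxels,
# with a mask marking remote (normal) myocardium.
rng = np.random.default_rng(3)
signal = np.concatenate([
    rng.normal(100, 10, 800),   # remote myocardium
    rng.normal(180, 15, 150),   # dense infarct core
    rng.normal(140, 20, 50),    # intermediate (peri-infarct) intensities
])
remote = np.zeros(signal.size, dtype=bool)
remote[:800] = True

# Illustrative thresholds (an assumption, not necessarily the study's algorithm):
# core > remote mean + 3 SD; peri-infarct zone between +2 SD and +3 SD.
mu, sd = signal[remote].mean(), signal[remote].std()
core = signal > mu + 3 * sd
piz = (signal > mu + 2 * sd) & ~core

voxel_volume_cc = 0.001   # depends on in-plane resolution and slice thickness
print("infarct core volume (cc):", round(core.sum() * voxel_volume_cc, 2))
print("peri-infarct zone volume (cc):", round(piz.sum() * voxel_volume_cc, 2))
```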
De-MDCT reconstructed at 8 mm slice thickness demonstrated excellent correlation of infarct size with postmortem pathology (r2=0.97; p<0.0001) and MRI (r2=0.92; p<0.0001). De-MDCT and de-MRI detected a PIZ in all animals, which corresponded to a mixture of viable and nonviable myocytes at the PIZ by histology. The ex-vivo de-MRI PIZ volume decreased with slice thickness, from 0.9 ± 0.2 cc at 8 mm to 0.2 ± 0.1 cc at 1 mm (p=0.01). PIZ volume/mass by de-MDCT increased with decreasing slice thickness owing to declining partial volume averaging in the PIZ, but was susceptible to increased image noise.
De-MDCT provides a more detailed assessment of the PIZ in chronic MI and is less susceptible to partial volume effects than MRI. This increased resolution best reflects the extent of tissue mixture by histopathology and has the potential to further enhance the ability to define the substrate of malignant arrhythmia in ischemic heart disease non-invasively.
MDCT; delayed enhancement; peri-infarct zone; MRI
Most HIV-seropositive subjects in western countries receive highly active antiretroviral therapy (HAART). Although many aspects of their health have been studied, little is known about their vestibular and balance function. The goals of this study were to determine the prevalences of vestibular and balance impairments among HIV-seropositive and comparable HIV-seronegative men and women and to determine whether those groups differed.
Standard screening tests of vestibular and balance function, including head thrusts, Dix-Hallpike maneuvers, and Romberg balance tests on compliant foam, were performed during semiannual study visits of participants enrolled at the Baltimore and Washington, D.C., sites of the Multicenter AIDS Cohort Study and the Women's Interagency HIV Study.
No significant differences by HIV status were found on most tests, but HIV-seropositive subjects who were using HAART had a lower frequency of abnormal Dix-Hallpike nystagmus than HIV-seronegative subjects. A significant number of nonclassical Dix-Hallpike responses were found. Age was associated with Romberg scores on foam with eyes closed. Sex was not associated with any of the test scores.
These findings suggest that HAART-treated HIV infection has no harmful association with vestibular function in community-dwelling, ambulatory men and women. The association with age was expected, but the lack of association with sex was unexpected. The presence of nonclassical Dix-Hallpike responses might be consistent with central nervous system lesions.
The purpose of the study was to investigate patient characteristics associated with image quality and their impact on the diagnostic accuracy of MDCT for the detection of coronary artery stenosis.
MATERIALS AND METHODS
Two hundred ninety-one patients with a coronary artery calcification (CAC) score of ≤ 600 Agatston units (214 men and 77 women; mean age, 59.3 ± 10.0 years [SD]) were analyzed. An overall image quality score was derived using an ordinal scale. The accuracy of quantitative MDCT in detecting significant (≥ 50%) stenoses was assessed against quantitative coronary angiography (QCA) per patient and per vessel using a modified 19-segment model. The effects of CAC, obesity, heart rate, and heart rate variability on image quality and accuracy were evaluated by multiple logistic regression. Image quality and accuracy were further analyzed in subgroups of significant predictor variables. Diagnostic accuracy was evaluated across image quality strata using receiver operating characteristic (ROC) curves.
Increasing body mass index (BMI) (odds ratio [OR] = 0.89, p < 0.001), increasing heart rate (OR = 0.90, p < 0.001), and the presence of breathing artifact (OR = 4.97, p ≤ 0.001) were associated with poorer image quality, whereas sex, CAC score, and heart rate variability were not. Compared with examinations of white patients, studies of black patients had significantly poorer image quality (OR = 0.58, p = 0.04). At the vessel level, CAC score (per 10 Agatston units; OR = 1.03, p = 0.012) and patient age (OR = 1.02, p = 0.04) were significantly associated with the diagnostic accuracy of quantitative MDCT compared with QCA. A trend was observed in differences in the areas under the ROC curves across image quality strata at the vessel level (p = 0.08).
Image quality is significantly associated with patient ethnicity, BMI, mean scan heart rate, and the presence of breathing artifact but not with CAC score at a patient level. At a vessel level, CAC score and age were associated with reduced diagnostic accuracy.
angiography; body mass index; CORE-64; coronary artery calcium; heart rate; hemodynamics; image quality; MDCT
To study the effect of in-utero alcohol exposure on the insulin-like growth factor (IGF) axis and leptin during infancy and childhood, considering that exposed children may exhibit pre- and postnatal growth retardation.
We prospectively identified heavily drinking pregnant women who consumed on average 4 or more drinks of ethanol per day (≥48 g/day) and assessed growth in 69 of their offspring and an unexposed control group of 83 children, measuring serum IGF-I (radioimmunoassay), IGF-II (immunoradiometric assay, IRMA), insulin-like growth factor-binding protein 3 (IGFBP-3) (IRMA) and leptin (IRMA) at 1 month and 1, 2, 3, 4, and 5 years of age.
IGF-II levels increased with age in both groups, but the rate of increase was significantly higher in exposed children, and levels were significantly higher in ethanol-exposed children at 3, 4, and 5 years of age. In exposed children, IGF-I levels were higher at 3 and 4 years and leptin levels were significantly lower at 1 and 2 years. Exposed subjects showed a much lower correlation between IGF-I and growth parameters than unexposed subjects.
Exposure to ethanol during pregnancy increases IGF-I and IGF-II and decreases leptin during early childhood. The increase in serum IGF-II concentrations in ethanol-exposed children suggests that this hormone should be explored as a potential marker for prenatal alcohol exposure.
Fetal alcohol syndrome; Pregnancy; Alcohol abuse; Insulin-like growth factor I; Insulin-like growth factor II
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its high sensitivity and negative predictive value, as found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios, suggestive of small-study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective CORE-64 trial (“Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography using 64 Detectors”). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography at nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental CORE-64 trial has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
Computed tomography; Coronary vessels; Multi-centre study; Methods; Design
The concentrations of cytokines in human serum and plasma can provide valuable information about in vivo immune status, but low concentrations often require high-sensitivity assays to permit detection. The recent development of multiplex assays, which can measure multiple cytokines in one small sample, holds great promise, especially for studies in which limited volumes of stored serum or plasma are available. Four high-sensitivity cytokine multiplex assays on a Luminex (Bio-Rad, BioSource, Linco) or electrochemiluminescence (Meso Scale Discovery) platform were evaluated for their ability to detect circulating concentrations of 13 cytokines, as well as for laboratory and lot variability. Assays were performed in six different laboratories utilizing archived serum from HIV-uninfected and -infected subjects from the Multicenter AIDS Cohort Study (MACS) and the Women's Interagency HIV Study (WIHS) and commercial plasma samples spanning initial HIV viremia. In a majority of serum samples, interleukin-6 (IL-6), IL-8, IL-10, and tumor necrosis factor alpha were detectable with at least three kits, while IL-1β was clearly detected with only one kit. No single multiplex panel detected all cytokines, and there were highly significant differences (P < 0.001) between laboratories and/or lots with all kits. Nevertheless, the kits generally detected similar patterns of cytokine perturbation during primary HIV viremia. This multisite comparison suggests that current multiplex assays vary in their ability to measure serum and/or plasma concentrations of cytokines and may not be sufficiently reproducible for repeated determinations over a long-term study or in multiple laboratories but may be useful for longitudinal studies in which relative, rather than absolute, changes in cytokines are important.
Maternal consumption of fish during gestation exposes the fetus both to nutrients, especially the long-chain polyunsaturated fatty acids (LCPUFAs) believed to be beneficial for fetal brain development, and to the neurotoxicant methylmercury (MeHg). We recently reported that nutrients present in fish may modify MeHg neurotoxicity. Understanding the apparent interaction of MeHg exposure and nutrients present in fish is complicated by the limitations of modeling methods. In this study we fit varying coefficient function models to data from the Seychelles Child Development Nutrition Study (SCDNS) cohort to assess the associations of dietary nutrients with children’s development. This cohort of mother-child pairs in the Republic of Seychelles had fish consumption averaging 9 meals per week. Maternal nutritional status was assessed for five nutritional components known to be present in fish (n-3 LCPUFA, n-6 LCPUFA, iron status, iodine status, and choline) and associated with children’s neurological development. We also included prenatal MeHg exposure (measured in maternal hair). We examined two child neurodevelopmental outcomes (the Bayley Scales of Infant Development-II (BSID-II) Mental Developmental Index (MDI) and Psychomotor Developmental Index (PDI)), each administered at 9 and at 30 months. The varying coefficient models allow the possible interaction between each nutritional component and MeHg to be modeled as a smoothly varying function of MeHg as an effect modifier. Iron, iodine, choline, and n-6 LCPUFA showed little or no observable modulation at different MeHg exposures. In contrast, the n-3 LCPUFA docosahexaenoic acid (DHA) had beneficial effects on the BSID-II PDI that were reduced or absent at higher MeHg exposures. This study presents a useful modeling method that can be brought to bear on questions involving interactions between covariates, and illustrates the continuing importance of viewing fish consumption during pregnancy as a case of multiple exposures to nutrients and to MeHg. The results encourage more emphasis on a holistic view of the risks and benefits of fish consumption as it relates to infant development.
Varying-coefficient function models; mercury exposure; neurodevelopment; interaction between nutritional status and toxic exposure
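One standard way to fit a varying-coefficient model of this kind is to let the nutrient's coefficient vary smoothly with MeHg by expanding it in a spline basis, which reduces the problem to a linear model with nutrient × spline(MeHg) interaction terms. The sketch below illustrates the idea with hypothetical data and variable names; it is not the SCDNS analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical mother-child data; names are illustrative, not the SCDNS variables.
rng = np.random.default_rng(4)
n = 300
df = pd.DataFrame({
    "pdi":  rng.normal(100, 12, n),                    # psychomotor index
    "dha":  rng.normal(4.0, 1.0, n),                   # maternal DHA status
    "mehg": rng.gamma(shape=2.0, scale=2.5, size=n),   # prenatal MeHg (hair)
})

# Varying-coefficient idea: PDI = b0(MeHg) + b1(MeHg) * DHA + error.
# Expanding b0 and b1 in a B-spline basis of MeHg turns this into an ordinary
# linear model with DHA x spline(MeHg) interaction terms.
fit = smf.ols("pdi ~ bs(mehg, df=4) * dha", data=df).fit()

# The estimated DHA effect at a given MeHg level, b1(MeHg), is the difference
# in predictions between dha=1 and dha=0 at that MeHg value.
grid = pd.DataFrame({"mehg": np.linspace(2, 10, 5), "dha": 1.0})
dha_effect = fit.predict(grid) - fit.predict(grid.assign(dha=0.0))
print(dha_effect)   # b1 evaluated over the MeHg grid
```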
Although advances in intensive care have enabled more patients to survive an acute critical illness, they also have created a large and growing population of chronically critically ill patients with prolonged dependence on mechanical ventilation and other intensive care therapies. Chronic critical illness is a devastating condition: mortality exceeds that for most malignancies, and functional dependence persists for most survivors. Costs of treating the chronically critically ill in the United States already exceed $20 billion and are increasing. In this article, we describe the constellation of clinical features that characterize chronic critical illness. We discuss the outcomes of this condition including ventilator liberation, mortality, and physical and cognitive function, noting that comparisons among cohorts are complicated by variation in defining criteria and care settings. We also address burdens for families of the chronically critically ill and the difficulties they face in decision-making about continuation of intensive therapies. Epidemiology and resource utilization issues are reviewed to highlight the impact of chronic critical illness on our health care system. Finally, we summarize the best available evidence for managing chronic critical illness, including ventilator weaning, nutritional support, rehabilitation, and palliative care, and emphasize the importance of efforts to prevent the transition from acute to chronic critical illness. As steps forward for the field, we suggest a specific definition of chronic critical illness, advocate for the creation of a research network encompassing a broad range of venues for care, and highlight areas for future study of the comparative effectiveness of different treatment venues and approaches.
respirator, artificial; critical illness; chronic disease; respiratory care units
To investigate apelin–APJ (angiotensin receptor-like 1) signalling in vascular remodelling, we examined the pathophysiological response to carotid ligation in apelin knockout mice.
Methods and results
Apelin-null animals, compared with wild-type mice, had significantly decreased neointimal lesion area (1.17 ± 0.17 vs. 3.33 ± 1.04 × 10⁴ μm², P < 0.05) and intima/media ratio (0.81 ± 0.23 vs. 1.49 ± 0.44, P < 0.05), averaged over four sites 0.5–2 mm from the ligation. Exogenous apelin infusion rescued the apelin-knockout phenotype, promoting neointima formation in the null animals. Apelin-null animals showed a decreased smooth muscle-positive area in the neointima (82.3 ± 2.4 vs. 63.9 ± 8.4, P < 0.05) and a smaller percentage of BrdU-positive cells in the neointima and media (11.06 ± 1.00 vs. 6.53 ± 0.86, P < 0.05). Apelin mRNA expression increased initially (5.2-fold, P < 0.01), followed by increased apelin receptor expression (10.1-fold, P < 0.05), in the ligated artery. Cytochemistry studies localized apelin expression to luminal endothelial cells and apelin receptor upregulation to smooth muscle cells (SMC) in the media and neointima. In vitro experiments with cultured rat aortic SMC revealed that apelin stimulation increased migration. In contrast to the increased expression of apelin and apelin receptor in carotid remodelling, expression was not upregulated in the apoE high-fat model, consistent with the known disease-inhibitory effect of apelin in this model.
These data suggest that increased apelin receptor expression by SMC provides a paracrine pathway in injured vessels that allows endothelial-derived apelin to stimulate their division and migration into the neointima.
Apelin; APJ; Vascular remodelling; Smooth muscle cell; Migration
Whole genome expression microarrays can be used to study gene expression in blood, where RNA derives in part from leukocytes, immature platelets, and red blood cells. Because these cells are important in the pathogenesis of stroke, blood RNA provides an index of their cellular responses to stroke. Our studies in rats have shown specific gene expression changes 24 hours after ischemic stroke, hemorrhage, status epilepticus, hypoxia, hypoglycemia, global ischemia, and following brief focal ischemia that simulated transient ischemic attacks in humans. Human studies show gene expression changes following ischemic stroke. These gene profiles predicted ischemic stroke in a second cohort with >90% sensitivity and specificity. Gene profiles for ischemic stroke caused by large-vessel atherosclerosis and cardioembolism have been described that distinguished these causes in a second cohort with >85% sensitivity and specificity. Atherosclerotic genes were associated with clotting, platelets, and monocytes, whereas cardioembolic genes were associated with inflammation, infection, and neutrophils. These gene profiles predicted the cause of stroke in 58% of cryptogenic patients. These studies will provide diagnostic, prognostic, and therapeutic markers, and will advance our understanding of stroke in humans. New techniques to measure all coding and noncoding RNAs, along with alternatively spliced transcripts, will markedly advance molecular studies of human stroke.
brain; gene expression profiling; ischemia; microarrays
New HIV infections continue to be observed among men who have sex with men. Understanding how risky sexual behaviors combine with stimulant and erectile dysfunction drug (EDD) use to increase the risk of HIV seroconversion may provide direction for focused intervention.
During the follow-up period (1998–2008), we identified 57 HIV seroconverters among 1,667 initially HIV-seronegative men. Time to seroconversion was modeled using Cox proportional hazards regression for 7 combinations of sex-drugs (inhaled nitrites or “poppers”, stimulants, and EDDs) used at the current or previous semi-annual visit, adjusting for other risk factors including sexual behavior, alcohol and other drug use, and depression. Model-based adjusted attributable risks were then calculated.
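A minimal sketch of this kind of time-to-seroconversion Cox regression is shown below using the lifelines package. The data frame, variable names, and single time-fixed drug indicators are hypothetical simplifications; the actual analysis used time-varying drug-combination covariates and additional adjustments, with attributable risks then derived from the fitted model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical person-level data (the real analysis used time-varying covariates).
rng = np.random.default_rng(5)
n = 1000
df = pd.DataFrame({
    "years_followed": rng.exponential(6.0, n).clip(0.1, 10.0),
    "seroconverted":  rng.integers(0, 2, n) * rng.integers(0, 2, n),  # sparse events
    "poppers":        rng.integers(0, 2, n),
    "stimulants":     rng.integers(0, 2, n),
    "edd":            rng.integers(0, 2, n),
    "urasp_partners": rng.poisson(1.0, n),
})

# Cox proportional hazards model for time to HIV seroconversion.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_followed", event_col="seroconverted")
print(cph.summary[["coef", "exp(coef)"]])   # exp(coef) = adjusted hazard ratio
```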
The risk of seroconversion increased linearly with the number of unprotected receptive anal sex partners (URASP), with hazard ratios (HR) ranging from 1.73 (95% confidence interval [CI]: 0.75, 4.01) for 1 partner, to 4.23 (95% CI: 1.76, 10.17) for 2–4 partners, to 14.21 (95% CI: 6.27, 32.20) for 5+ partners, independent of other risk factors. After adjustment, risks for seroconversion increased from 2.99 (95% CI: 1.02, 8.76) for men who reported using stimulants only (1 drug) to 8.45 (95% CI: 2.67, 26.71) for men who reported using all 3 sex-drugs. The use of any of the 7 possible sex-drug combinations accounted for 63% of the nine-year HIV seroincidence in the Multicenter AIDS Cohort Study (MACS). When contributions of increased URASP and combination drug use were analyzed together, the total attributable risk for HIV seroconversion was 74%, with 41% attributable to URASP alone and a residual of 33% due to other direct or indirect effects of sex-drug use.
Use of poppers, stimulants and EDDs increased risk for HIV seroconversion significantly in this cohort. These data reinforce the importance of implementing interventions that target drug-reduction as part of comprehensive and efficacious HIV prevention strategies.
Multicenter AIDS cohort study; MSM; stimulants; inhaled nitrites; erectile dysfunction drugs; HIV seroconversion; non-intravenous drug use
Older human immunodeficiency virus–seropositive (HIV+) individuals (older than 50 years) are twice as likely to develop HIV dementia as younger HIV+ individuals. The objective of this study was to examine the impact of both age and serostatus on longitudinal changes in psychomotor speed/executive functioning performance among HIV+ and HIV− individuals. Four hundred seventy-seven HIV+ and 799 HIV− individuals from the Multicenter AIDS Cohort Study (MACS) were subdivided into three age groups: (1) <40 years, (2) 40–50 years, and (3) >50 years. Psychomotor speed/executive functioning performance was measured with the Symbol Digit Modalities Test (SDMT) and the Trail Making (TM) Test Parts A and B. Changes in performance were compared among the three age groups for both HIV+ and HIV− individuals. Among HIV+ individuals, the younger group demonstrated improvement over time on the TM Test Part B (P = .007), whereas the older and middle age groups demonstrated decline over time (P = .041 and P = .030, respectively). The older group had a significantly different trajectory relative to the younger group (P = .046). Among HIV− individuals, there was no effect of age on longitudinal performance. In conclusion, older HIV+ individuals show greater decline over time than younger HIV+ individuals on the TM Test Part B. Our results suggest that HIV serostatus and age together may affect longitudinal performance on this test. Mild neurocognitive changes over time among older HIV+ individuals are likely to reflect age-associated pathophysiological mechanisms, including cerebrovascular risk factors.
age; dementia; HIV; neuropsychological assessment
In the survival analysis context, when an intervention either reduces a harmful exposure or introduces a beneficial treatment, it seems useful to quantify the gain in survival attributable to the intervention as an alternative to the reduction in risk. To accomplish this we introduce two new concepts, the attributable survival and attributable survival time, and study their properties. Our analysis includes comparison with the attributable risk function as well as hazard-based alternatives. We also extend the setting to the case where the intervention takes place at discrete points in time, and may either eliminate exposure or introduce a beneficial treatment in only a proportion of the available group. This generalization accommodates the more realistic situation where the treatment or exposure is dynamic. We apply these methods to assess the effect of introducing highly active antiretroviral therapy for the treatment of clinical AIDS at the population level.
attributable risk function; survival analysis; parametric models; generalized gamma distribution; product limit estimate
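For orientation, the attributable risk function referenced above is commonly written in terms of the observed cumulative risk F(t) and the counterfactual cumulative risk F*(t) that would obtain if the intervention were applied. A survival-scale analogue can be written in the same spirit; the exact form below (with S = 1 − F) is an illustrative assumption, not necessarily the definition adopted in the paper.

```latex
% F(t): observed cumulative risk; F^{*}(t): cumulative risk under the
% intervention (exposure removed or treatment introduced); S = 1 - F.
\[
  \mathrm{AR}(t) \;=\; \frac{F(t) - F^{*}(t)}{F(t)},
  \qquad
  \mathrm{AS}(t) \;=\; \frac{S^{*}(t) - S(t)}{S^{*}(t)} .
\]
```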
This study investigated the effect of ultraviolet B (UVB) light exposure on the risk of cortical cataract as a function of lens region and assessed the degree to which the lower nasal predominance of cortical cataract is a result of UVB exposure.
In studies of cortical cataract, a severity score representing the area covered by cataract is often used as the primary outcome. However, additional disease information may exist in the spatial distribution of opacities. Further, it has been hypothesized that the lower nasal region of the lens is the most susceptible to damage by environmental ultraviolet light exposure.
In a sample of 107 lens images from the Salisbury Eye Evaluation Study, a digital cortical cataract grading algorithm was used to capture the location of opacities in binary images. These images were used to estimate cataract severity in 16 regions around the lens. The effect of individual cumulative lifetime ocular UVB exposure, estimated using an empiric model together with baseline occupational and leisure activity data, on cortical cataract risk in each lens region was examined with a linear mixed-effects model.
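The regional analysis described above can be sketched as a linear mixed-effects model with a random intercept per subject and region-specific UVB terms; the code below uses statsmodels on simulated long-format data with hypothetical variable names, not the study's actual data or model specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format data: one row per lens region (16 per subject), with the
# subject's cumulative ocular UVB exposure repeated across that subject's rows.
rng = np.random.default_rng(6)
n_subjects, n_regions = 107, 16
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_regions),
    "region":  np.tile(np.arange(1, n_regions + 1), n_subjects),
    "uvb":     np.repeat(rng.lognormal(0.0, 0.5, n_subjects), n_regions),
})
df["severity"] = rng.gamma(2.0, 1.0, len(df))   # opacity severity per region (toy)

# Linear mixed-effects model: region-specific UVB effects, with a random
# intercept per subject to account for correlation among regions of one lens.
model = smf.mixedlm("severity ~ C(region) * uvb", data=df, groups=df["subject"])
fit = model.fit()
print(fit.params.filter(like="uvb"))   # UVB slope overall and by region
```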
The lower nasal regions had the highest cortical cataract severity in both the right and left eyes. In the combined data, region 9 (the lower nasal corner of the lens) was estimated to have the highest severity. In an assessment of the high- and low-exposure ultraviolet light groups (dichotomized at the median exposure level), higher exposure had the greatest effect in the lower regions of the lens.
These results indicate that there are regional lens differences in the association between cataract and exposure to ultraviolet light but that ultraviolet light may not entirely explain the variations in cortical cataract severity across the lens.
Since the introduction of highly active antiretroviral therapy (HAART) and the subsequent increase in life expectancy among HIV-infected persons, non-HIV–related diseases have become an important cause of morbidity and mortality. This cross-sectional study reports the prevalence of overweight and obesity, and sociodemographic, psychological, and substance use-related risk factors for elevated body mass index (BMI), among 2157 HIV-seropositive (HIV+) women in comparison to 730 HIV-seronegative (HIV−) participants in the Women's Interagency HIV Study (WIHS). Separate univariable and multivariate linear regression analyses were completed for HIV+ and HIV− women. Our study revealed a similar proportion of obesity (BMI ≥30) among HIV+ (33%) and HIV− women (29%) (p = 0.12), as well as comparable median BMI (HIV+: 26.1 versus HIV−: 26.7, p = 0.16). HIV+ women were significantly (p < 0.01) older than HIV− women (median 35.6 versus 32.5 years) but similar (p = 0.97) by race/ethnicity (57% African American, 28% Hispanic, and 15% white in both groups). In multivariate models for both HIV+ and HIV− women, African American race/ethnicity was significantly (p < 0.05) associated with higher BMI, while higher quality of life score and illicit hard drug use were associated with lower BMI. Additionally, smoking, alcohol use, markers of advanced HIV infection (AIDS diagnosis, elevated HIV viral load, low CD4 count), and a history of antiretroviral therapy (ART) use were associated with lower BMI among HIV+ women. In conclusion, risk factors for elevated BMI were similar for HIV+ and HIV− women in the WIHS. Among HIV+ women, all markers of advanced HIV infection and ART use were additionally associated with lower BMI.