To investigate the impact of neurologist care on Parkinson disease (PD)–related hospitalizations. Recent data indicate that neurologist treatment in PD may be associated with improved survival, yet is underutilized. Factors contributing to this improved survival remain unknown, but may be due in part to optimal disease treatment or avoidance of disease-related complications.
This was a retrospective cohort study of Medicare beneficiaries diagnosed with PD in 2002 and still living in 2006. Hospitalization for PD-related (neurodegenerative disease, psychosis, depression, urinary tract infection, and traumatic injury) and general medical (hypertension, diabetes, congestive heart failure, angina, and gastrointestinal obstruction) illnesses was compared by PD treating physician specialty using Cox proportional hazard models, adjusting for confounders. Secondary analyses included PD-related rehospitalization and cost stratified by frequency of neurologist care.
We identified 24,929 eligible incident PD cases; 13,489 had neurologist care. There were 9,112 PD-related hospitalizations, and these occurred and recurred less often among neurologist-treated patients. Neurologist PD care was associated with a lower adjusted hazard of both initial and repeat hospitalization for psychosis (hazard ratio [HR] 0.71, 95% confidence interval [CI] 0.59–0.86), urinary tract infection (HR 0.74, 0.63–0.87), and traumatic injury (HR 0.56, 0.40–0.78). PD-related outcomes improved with frequency of neurologist care in a stepwise manner. Risk of general illness hospitalization or rehospitalization did not differ by neurologist involvement.
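As a quick consistency check on estimates like these: a 95% CI for a hazard ratio is symmetric on the log scale, so the standard error of log(HR) can be recovered from the reported bounds. A minimal sketch (not part of the study's analysis) using the reported psychosis estimate, HR 0.71 (0.59–0.86):

```python
import math

def hr_ci_from_log(hr, se, z=1.96):
    """95% CI implied by a hazard ratio and the SE of log(HR)."""
    return math.exp(math.log(hr) - z * se), math.exp(math.log(hr) + z * se)

# Recover SE of log(HR) from the published bounds: SE = (ln(hi) - ln(lo)) / (2 * 1.96)
se = (math.log(0.86) - math.log(0.59)) / (2 * 1.96)
lo, hi = hr_ci_from_log(0.71, se)
print(round(lo, 2), round(hi, 2))  # 0.59 0.86
```

The round trip reproduces the published interval, confirming the bounds are log-symmetric around the point estimate.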
Regular neurologist care in PD is specifically associated with lower risk of hospitalization and rehospitalization for several PD-related illnesses. This may reflect an improved ability of neurologists to prevent, recognize, or treat PD complications.
To determine whether the absence of early epileptiform abnormalities predicts absence of later seizures on continuous EEG monitoring of hospitalized patients.
We retrospectively reviewed 242 consecutive patients without a prior generalized convulsive seizure or active epilepsy who underwent continuous EEG monitoring lasting at least 18 hours for detection of nonconvulsive seizures or evaluation of unexplained altered mental status. The findings on the initial 30-minute screening EEG, subsequent continuous EEG recordings, and baseline clinical data were analyzed. We identified early EEG findings associated with absence of seizures on subsequent continuous EEG.
Seizures were detected in 70 (29%) patients. A total of 52 patients had their first seizure within the initial 30 minutes of continuous EEG monitoring. Of the remaining 190 patients, 63 had epileptiform discharges on their initial EEG, 24 had triphasic waves, and 103 had no epileptiform abnormalities. Seizures were later detected in 22% (n = 14) of the studies with epileptiform discharges on the initial EEG, vs 3% (n = 3) of the studies without epileptiform abnormalities on the initial EEG (p < 0.001). In the 3 patients without epileptiform abnormalities on the initial EEG but with subsequent seizures, the first epileptiform discharge or electrographic seizure occurred within the first 4 hours of recording.
In patients without epileptiform abnormalities during the first 4 hours of recording, no seizures were subsequently detected. Therefore, EEG features early in the recording may indicate a low risk for seizures, and help determine whether extended monitoring is necessary.
Parkinson disease (PD), a devastating neurodegenerative disorder, affects motor abilities as well as cognition. It is not clear whether the proapoptotic protein Bid is involved in tumor necrosis factor death receptor I (TNFRI)–mediated destructive signal transduction pathways, such as cell dysfunction or neurodegeneration, in the temporal cortex of patients with PD.
Molecular and biochemical approaches were used to dissect mitochondria-related components of the destructive signaling pathway in the temporal cortex from rapidly autopsied brains (mean postmortem interval 2.6 hours). Brains from patients with PD (n = 15; mean age 81.4 years) were compared with age-matched control brains (n = 15; mean age 84.36 years).
TNFRI and its adaptor protein TRADD were not only present in the cytoplasm of the temporal cortex but also significantly elevated (by 42.3% and 136.1%, respectively) in PD brains compared with age-matched control brains. In the PD temporal cortex, Bid is further cleaved to tBid in the cytosol; tBid then translocates to the mitochondria, where cytochrome c is released and caspase-3 is subsequently activated.
Patients with PD have an activated Bid-mediated destructive signal pathway via TNFRI in the temporal cortex. Such deficits are pervasive, suggesting that they might contribute to cortex degeneration as PD manifests.
We describe temporal trends in stroke incidence stratified by age from our population-based stroke epidemiology study. We hypothesized that stroke incidence in younger adults (age 20–54) increased over time, most notably between 1999 and 2005.
The Greater Cincinnati/Northern Kentucky region includes an estimated population of 1.3 million. Strokes were ascertained in the population between July 1, 1993, and June 30, 1994, and in calendar years 1999 and 2005. Age-, race-, and gender-specific incidence rates with 95% confidence intervals were calculated assuming a Poisson distribution. We tested for differences in age trends over time using a mixed-model approach with appropriate link functions.
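The Poisson assumption above gives a simple route to the confidence intervals: the variance of the event count equals the count itself. A minimal sketch with made-up numbers (not the study's data), using the normal approximation:

```python
import math

def incidence_rate_ci(events, person_years, per=100_000, z=1.96):
    """Incidence rate per `per` person-years with a normal-approximation 95% CI,
    assuming the event count is Poisson (variance = count)."""
    rate = events / person_years * per
    se = math.sqrt(events) / person_years * per
    return rate, rate - z * se, rate + z * se

# Hypothetical example: 400 strokes observed over 1.3 million person-years.
rate, lo, hi = incidence_rate_ci(400, 1_300_000)
print(round(rate, 1), round(lo, 1), round(hi, 1))  # 30.8 27.8 33.8
```

Exact Poisson intervals (used when counts are small) would be slightly wider, but the approximation illustrates the calculation.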
The mean age at stroke significantly decreased from 71.2 years in 1993/1994 to 69.2 years in 2005 (p < 0.0001). The proportion of all strokes under age 55 increased from 12.9% in 1993/1994 to 18.6% in 2005. Regression modeling showed a significant change over time (p = 0.002), characterized as a shift to younger strokes in 2005 compared with earlier study periods. Stroke incidence rates in those 20–54 years of age were significantly increased in both black and white patients in 2005 compared to earlier periods.
We found trends toward increasing stroke incidence at younger ages. This is of great public health significance because strokes in younger patients carry the potential for greater lifetime burden of disability and because some potential contributors identified for this trend are modifiable.
Prominent growth failure typifies Rett syndrome (RTT). Our aims were to 1) develop RTT growth charts for clinical and research settings, 2) compare growth in children with RTT with that of unaffected children, and 3) compare growth patterns among RTT genotypes and phenotypes.
A cohort of the RTT Rare Diseases Clinical Research Network observational study participants was recruited, and cross-sectional and longitudinal growth data and comprehensive clinical information were collected. A reliability study confirmed interobserver consistency. Reference curves for height, weight, head circumference, and body mass index (BMI), generated using a semiparametric model with goodness-of-fit tests, were compared with normative values using Student's t test adjusted for multiple comparisons. Genotype and phenotype subgroups were compared using analysis of variance and linear regression.
Growth charts for classic and atypical RTT were created from 9,749 observations of 816 female participants. Mean growth in classic RTT decreased below that for the normative population at 1 month for head circumference, 6 months for weight, and 17 months for length. Mean BMI was similar in those with RTT and the normative population. Pubertal increases in height and weight were absent in classic RTT. Classic RTT was associated with more growth failure than atypical RTT. In classic RTT, poor growth was associated with worse development, higher disease severity, and certain MECP2 mutations (pre-C-terminal truncation, large deletion, T158M, R168X, R255X, and R270X).
RTT-specific growth references will allow effective screening for disease and treatment monitoring. Growth failure occurs less frequently in girls with RTT with better development, less morbidity typically associated with RTT, and late truncation mutations.
Florbetapir F 18 PET can image amyloid-β (Aβ) aggregates in the brains of living subjects. We prospectively evaluated the prognostic utility of detecting Aβ pathology using florbetapir PET in subjects at risk for progressive cognitive decline.
A total of 151 subjects who previously participated in a multicenter florbetapir PET imaging study were recruited for longitudinal assessment. Subjects included 51 with recently diagnosed mild cognitive impairment (MCI), 69 cognitively normal controls (CN), and 31 with clinically diagnosed Alzheimer disease dementia (AD). PET images were visually scored as positive (Aβ+) or negative (Aβ−) for pathologic levels of β-amyloid aggregation, blind to diagnostic classification. Cerebral to cerebellar standardized uptake value ratios (SUVr) were determined from the baseline PET images. Subjects were followed for 18 months to evaluate changes in cognition and diagnostic status. Analysis of covariance and correlation analyses were conducted to evaluate the association between baseline PET amyloid status and subsequent cognitive decline.
In both MCI and CN, baseline Aβ+ scans were associated with greater clinical worsening on the Alzheimer's Disease Assessment Scale–Cognitive subscale (ADAS-Cog) (p < 0.01) and the Clinical Dementia Rating–sum of boxes (CDR-SB) (p < 0.02). In MCI, Aβ+ scans were also associated with greater decline in memory, Digit Symbol Substitution (DSS), and Mini-Mental State Examination (MMSE) (p < 0.05). In MCI, higher baseline SUVr similarly correlated with greater subsequent decline on the ADAS-Cog (p < 0.01), CDR-SB (p < 0.03), a memory measure, DSS, and MMSE (p < 0.05). Aβ+ MCI subjects tended to convert to AD dementia at a higher rate than Aβ− subjects (p < 0.10).
Florbetapir PET may help identify individuals at increased risk for progressive cognitive decline.
The objective of this study was to examine the joint associations of estimated glomerular filtration rate (eGFR) and urinary albumin excretion with incident stroke in a large national cohort study.
Associations of urinary albumin to creatinine ratio (ACR) and eGFR with incident stroke were examined in 25,310 participants of the Reasons for Geographic and Racial Differences in Stroke (REGARDS) study, a prospective study of black and white US adults ≥45 years of age.
A total of 548 incident strokes were observed over a median of 4.7 years of follow-up. Higher ACR values were associated with lower stroke-free survival in both black and white participants. Among black participants, as compared to an ACR <10 mg/g, the hazard ratios of stroke associated with an ACR of 10–29.99, 30–300, and >300 mg/g were 1.41 (95% confidence interval [CI] 1.01–1.98), 2.10 (95% CI 1.48–2.99), and 2.70 (95% CI 1.58–4.61), respectively, in analyses adjusted for traditional stroke risk factors and eGFR. In contrast, the hazard ratios among white subjects were only modestly elevated and not statistically significant after adjustment for established stroke risk factors. eGFR <60 mL/min/1.73 m2 was not associated with incident stroke in black or white participants after adjustment for established stroke risk factors.
Higher ACR was independently associated with higher risk of stroke in black but not white participants from a national cohort. Elucidating the reasons for these findings may uncover novel mechanisms for persistent racial disparities in stroke.
To study 5-HT transport and 5-HT1A receptors in temporal lobe epilepsy (TLE) and depression.
Thirteen patients had PET with [11C]DASB for 5-HTT and [18F]FCWAY for 5-HT1A receptor binding, MRI, and psychiatric assessment. Sixteen healthy volunteers had [11C]DASB, 19 had [18F]FCWAY, and 6 had both PET studies. We used a reference tissue model to estimate [11C]DASB binding. [18F]FCWAY volume of distribution was corrected for plasma-free fraction. Images were normalized to common space. The main outcome was the regional asymmetry index; positive asymmetry indicates relatively reduced binding (reflecting transporter activity) ipsilateral to the epileptic focus.
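The abstract does not spell out the asymmetry-index formula. A common convention in PET laterality studies (an assumption here, not necessarily the authors' exact definition) is the contralateral–ipsilateral difference normalized by the regional mean:

```python
def asymmetry_index(contralateral, ipsilateral):
    """Percent asymmetry: positive when binding ipsilateral to the
    epileptic focus is relatively reduced (common convention; assumed)."""
    return 100.0 * (contralateral - ipsilateral) / ((contralateral + ipsilateral) / 2.0)

# Hypothetical binding values with lower uptake ipsilateral to the focus:
ai = asymmetry_index(1.2, 1.0)
print(round(ai, 1))  # 18.2 -> positive, i.e., reduced ipsilateral binding
```

Normalizing by the mean keeps the index comparable across regions with different absolute binding levels.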
Mean regional [11C]DASB binding and asymmetry did not differ between patients and controls. [18F]FCWAY asymmetry was significantly greater for patients than controls in hippocampus, amygdala, and fusiform gyrus. On analysis of variance with region as a repeated measure, depression diagnosis had a significant effect on [11C]DASB asymmetry, with significantly higher [11C]DASB asymmetry in insular cortex (trend for fusiform gyrus). In insular cortex, patients had a significant correlation between [18F]FCWAY asymmetry and [11C]DASB asymmetry.
Our study showed increased [11C]DASB asymmetry in insula and fusiform gyrus, and relatively reduced transporter activity, in subjects with both TLE and depression, as compared to subjects with TLE alone, implying reduced reuptake and thus increased synaptic 5-HT availability. This finding may represent a compensatory mechanism for 5-HT1A receptor loss. Altered serotonergic mechanisms have an important role in TLE and concomitant depression.
We sought to identify characteristics of individuals with mild cognitive impairment (MCI) that are associated with a relatively high probability of reverting back to normal cognition, and to estimate the risk of future cognitive decline among those who revert.
We first studied 3,020 individuals diagnosed with MCI on at least 1 visit to an Alzheimer's Disease Center in the United States. All underwent standardized Uniform Data Set evaluations at their first visit with an MCI diagnosis and on a subsequent visit, about 1 year later, at which cognitive status was reassessed. Multiple logistic regression was used to identify predictors of reverting from MCI back to normal cognition. We then estimated the risk of developing MCI or dementia over the next 3 years among those who had reverted, compared with individuals who had not had a study visit with MCI.
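Multiple logistic regression of this kind fits coefficients on the log-odds scale; exponentiating a coefficient gives the odds ratio for a one-unit change in that predictor. A minimal sketch with hypothetical coefficients (not the study's fitted model):

```python
import math

def predicted_probability(intercept, coefs, values):
    """Logistic model: P(revert) = 1 / (1 + exp(-(b0 + sum(b_i * x_i))))."""
    logit = intercept + sum(b * x for b, x in zip(coefs, values))
    return 1.0 / (1.0 + math.exp(-logit))

beta_mmse = 0.15                       # hypothetical coefficient per MMSE point
print(round(math.exp(beta_mmse), 2))   # 1.16 -> odds ratio per one-point change

# Hypothetical two-predictor model: MMSE (centered) and CDR sum of boxes.
p = predicted_probability(-1.7, [0.15, -0.9], [2.0, 0.5])
print(round(p, 3))  # 0.136
```

In the study itself, five baseline characteristics (MMSE, CDR, MCI type, FAQ, APOE ε4 status) each contributed such a term to the fitted model.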
About 16% of subjects diagnosed with MCI reverted back to normal or near-normal cognition approximately 1 year later. Five characteristics assessed at the first MCI visit contributed significantly to a model predicting a return to normal cognition: Mini-Mental State Examination (MMSE) score, Clinical Dementia Rating (CDR) score, MCI type, Functional Activities Questionnaire (FAQ) score, and APOE ϵ4 status. Survival analysis showed that the risk of retransitioning to MCI or dementia over the next 3 years was sharply elevated among those who had MCI and then improved, compared with individuals with no history of MCI.
Even in a cohort of patients seen at dementia research centers, reversion from MCI was fairly common. Nonetheless, those who reverted remained at increased risk for future cognitive decline.
We tested whether maternal hypertensive disorders in pregnancy predict age-related change in cognitive ability in the offspring up to old age.
Using mothers' blood pressure and urinary protein measurements from the maternity clinics and birth hospitals, we defined normotensive or hypertensive pregnancies in mothers of 398 men, who participated in the Helsinki Birth Cohort 1934–1944 Study. The men underwent the Finnish Defence Forces basic ability test twice: first during compulsory military service at age 20.1 (SD = 1.4) years and then in a retest at age 68.5 (SD = 2.9) years. The test yields a total score and subscores for tests measuring verbal, arithmetic, and visuospatial reasoning.
Men born after pregnancies complicated by a hypertensive disorder, compared with men born after normotensive pregnancies, scored 4.36 (95% confidence interval, 1.17–7.55) points lower on total cognitive ability at age 68.5 years and displayed a greater decline in total cognitive ability (2.88 points; 95% confidence interval, 0.07–5.06) between the two assessments. Of the subscores, associations were strongest for arithmetic reasoning.
Maternal hypertensive disorders in pregnancy predict lower cognitive ability and greater cognitive decline up to old age. A propensity to lower cognitive ability and decline up to old age may have prenatal origins.
Objectives and Methods:
The purpose of this study was to examine the incidence of mild cognitive impairment (MCI) and patterns of progression from incident MCI to dementia in 285 cognitively normal subjects (mean age, 78.9 years) in the Cardiovascular Health Study–Cognition Study from 1998–1999 to 2010–2011.
Two hundred (70%) of the participants progressed to MCI; the age-adjusted incidence of MCI was 111.09 (95% confidence interval, 88.13–142.95) per 1,000 person-years. A total of 107 (53.5%) of the incident MCI subjects progressed to dementia. The mean time from MCI to dementia was 2.8 ± 1.8 years. Forty (20%) of the incident MCI cases had an “unstable” course: 19 (9.5%) converted to MCI and later returned to normal; 10 (5%) converted to MCI, to normal, and later back to MCI; 7 (3.5%) converted to MCI, to normal, to MCI, and later to dementia; and 4 (2%) converted to MCI, to normal, and later to dementia. There was an increased mortality rate among the cognitively normal group (110.10 per 1,000 person-years) compared to those with incident MCI who converted to dementia (41.32 per 1,000 person-years).
The majority of the subjects aged >80 years developed an MCI syndrome, and half of them progressed to dementia. Once the MCI syndrome was present, the symptoms of dementia appeared within 2 to 3 years. Progression from normal to MCI or from normal to MCI to dementia is not always linear; subjects who developed MCI and later returned to normal can subsequently progress to dementia. Competing mortality and morbidity influence the study of incident MCI and dementia in population cohorts.
Secondary prevention trials in subjects with preclinical Alzheimer disease may require documentation of brain amyloidosis. The identification of inexpensive and noninvasive screening variables that can identify individuals who have significant amyloid accumulation would reduce screening costs.
A total of 483 cognitively normal (CN) individuals, aged 70–92 years, from the population-based Mayo Clinic Study of Aging, underwent Pittsburgh compound B (PiB)–PET imaging. Logistic regression determined whether age, sex, APOE genotype, family history, or cognitive performance was associated with odds of a PiB retention ratio >1.4 and >1.5. Area under the receiver operating characteristic curve (AUROC) evaluated the discrimination between PiB-positive and -negative subjects. For each characteristic, we determined the number needed to screen in each age group (70–79 and 80–89) to identify 100 participants with PiB >1.4 or >1.5.
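The number-needed-to-screen calculation reduces to dividing the enrollment target by the expected positivity rate in the screened group. A sketch using the cohort-wide 31% rate for PiB >1.5 and a hypothetical 50% rate in an APOE ε4–enriched group (the latter is an illustrative assumption, not a figure from the study):

```python
import math

def number_needed_to_screen(target_positives, positivity_rate):
    """Participants to screen to expect `target_positives` positives."""
    return math.ceil(target_positives / positivity_rate)

nns_all = number_needed_to_screen(100, 0.31)  # unselected screening, ~31% PiB+
nns_e4 = number_needed_to_screen(100, 0.50)   # hypothetical enriched e4 group
print(nns_all, nns_e4)                        # 323 200
print(round(1 - nns_e4 / nns_all, 2))         # 0.38 -> ~38% fewer screened
```

Any pre-screening variable that raises the positivity rate shrinks the screening burden proportionally, which is why even the modestly discriminating predictors above are useful.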
A total of 211 (44%) individuals had PiB >1.4 and 151 (31%) had PiB >1.5. In univariate and multivariate models, discrimination was modest (AUROC ∼0.6–0.7). In multivariate models, age and APOE genotype best predicted the odds of PiB >1.4 and >1.5. Subjective memory complaints were similar to cognitive test performance in predicting PiB >1.5. Indicators of PiB positivity varied with age. Screening APOE ε4 carriers alone reduced the number needed to screen to enroll 100 subjects with PiB >1.5 by 48% in persons aged 70–79 and 33% in those aged 80–89.
Age and APOE genotype are useful predictors of the likelihood of significant amyloid accumulation, but discrimination is modest. Nonetheless, these results suggest that inexpensive and noninvasive measures could significantly reduce the number of CN individuals needed to screen to enroll a given number of amyloid-positive subjects.
The purpose of the study was to test the hypothesis that a higher level of childhood adversity is associated with increased risk of cerebral infarction in old age.
Older participants in a longitudinal clinical–pathologic study rated adverse childhood experiences (e.g., emotional neglect, parental intimidation and violence) on a previously established 16-item scale. During a mean of 3.5 years of follow-up, there were 257 deaths, with 206 brain autopsies (80.2%). The number of chronic cerebral infarcts (gross plus microscopic; expressed as 0, 1, or >1) was determined in a uniform neuropathologic examination, which had been completed in 192 individuals at the time of these analyses.
Childhood adversity scores ranged from 0 to 31 (mean = 8.3, SD = 6.4). In an ordinal logistic regression model adjusted for age, sex, and education, higher adversity was associated with higher likelihood of chronic cerebral infarction. In analyses of childhood adversity subscales, only emotional neglect was associated with infarction (odds ratio [OR] = 1.097; 95% confidence interval [CI] 1.048–1.148). The likelihood of infarction was 2.8 times higher (95% CI 2.0–4.1) in those reporting a moderately high level of childhood emotional neglect (score = 6, 75th percentile) vs a moderately low level of neglect (score = 1, 25th percentile). Results were comparable in subsequent analyses that controlled for lifetime socioeconomic status, cardiovascular risk factors, and an anxiety-related trait.
Emotional neglect in childhood may be a risk factor for cerebral infarction in old age.
We aimed to investigate whether HIV latency in the CNS might have adverse molecular, pathologic, and clinical consequences.
This was a case-control comparison of HIV-1–seropositive (HIV+) patients with clinical and neuropathologic examination. Based on levels of HIV-1 DNA, RNA, and p24 in the brain, cases were classified as controls, latent HIV CNS infection, or HIV encephalitis (HIVE). Analysis of epigenetic markers (including BCL11B), neurodegeneration, and neuroinflammation was performed using immunoblot, confocal microscopy, immunochemistry/image analysis, and qPCR. Detailed antemortem neurocognitive data were available for 23 of the 32 cases.
HIV+ controls (n = 12) had no detectable HIV-1 DNA, RNA, or p24 in the CNS; latent HIV+ cases (n = 10) showed high levels of HIV-1 DNA but no HIV RNA or p24; and HIVE cases (n = 10) had high levels of HIV-1 DNA, RNA, and p24. Compared to HIV+ controls, the HIV+ latent cases displayed moderate cognitive impairment with neurodegenerative and neuroinflammatory alterations, although to a lesser extent than HIVE cases. Remarkably, HIV+ latent cases showed higher levels of BCL11B and other chromatin modifiers involved in silencing. Increased BCL11B was associated with deregulation of proinflammatory genes like interleukin-6, tumor necrosis factor–α, and CD74.
Persistence of latent HIV-1 infection in the CNS was associated with increased levels of chromatin modifiers, including BCL11B. Alteration of these epigenetic factors might result in abnormal transcriptomes, leading to inflammation, neurodegeneration, and neurocognitive impairment. BCL11B and other epigenetic factors involved in silencing might represent potential targets for HIV-1 involvement of the CNS.
To study the frequency and degree of deconditioning, clinical features, and relationship between deconditioning and autonomic parameters in patients with orthostatic intolerance.
We retrospectively studied all patients seen for orthostatic intolerance at Mayo Clinic between January 2006 and June 2011, who underwent both standardized autonomic and exercise testing.
A total of 184 patients (84 with postural orthostatic tachycardia syndrome [POTS] and 100 without orthostatic tachycardia) fulfilled the inclusion criteria. Of these, 89% were women, and the median age was 27.5 years (interquartile range [IQR] 22–37 years); median symptom duration was 4 years (IQR 2–7.8). Of the patients, 90% had deconditioning (reduced maximum oxygen uptake [VO2max%] <85%) during exercise testing. This finding was unrelated to age, gender, or duration of illness. The prevalence of deconditioning was similar between those with POTS (95%) and those with orthostatic intolerance without tachycardia (91%). VO2max% correlated weakly with a few autonomic and laboratory parameters, but adequate predictors of VO2max% could not be identified.
Reduced VO2max% consistent with deconditioning is present in almost all patients with orthostatic intolerance and may play a central role in pathophysiology. This finding provides a strong rationale for retraining in the treatment of orthostatic intolerance. None of the autonomic indices are reliable predictors of deconditioning.
To compare the cost-effectiveness of apixaban vs warfarin for secondary stroke prevention in patients with atrial fibrillation (AF).
Using standard methods, we created a Markov decision model based on the estimated cost of apixaban and data from the Apixaban for Reduction in Stroke and Other Thromboembolic Events in Atrial Fibrillation (ARISTOTLE) trial and other trials of warfarin therapy for AF. We quantified the cost and quality-adjusted life expectancy resulting from apixaban 5 mg twice daily compared with those from warfarin therapy targeted to an international normalized ratio of 2–3. Our base case population was a cohort of 70-year-old patients with no contraindication to anticoagulation and a history of stroke or TIA from nonvalvular AF.
Warfarin therapy resulted in a quality-adjusted life expectancy of 3.91 years at a cost of $378,500. In comparison, treatment with apixaban led to a quality-adjusted life expectancy of 4.19 years at a cost of $381,700. Therefore, apixaban provided a gain of 0.28 quality-adjusted life-years (QALYs) at an additional cost of $3,200, resulting in an incremental cost-effectiveness ratio of $11,400 per QALY. Our findings were robust in univariate sensitivity analyses varying model inputs across plausible ranges. In Monte Carlo analysis, apixaban was cost-effective in 62% of simulations using a threshold of $50,000 per QALY and 81% of simulations using a threshold of $100,000 per QALY.
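The incremental cost-effectiveness ratio reported above follows directly from the base-case model outputs: incremental cost divided by incremental quality-adjusted life expectancy. A minimal check using the figures in the text:

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Base case from the text: apixaban $381,700 / 4.19 QALYs vs
# warfarin $378,500 / 3.91 QALYs.
ratio = icer(381_700, 378_500, 4.19, 3.91)
print(round(ratio, -2))  # ~11400, i.e., about $11,400 per QALY
```

That places apixaban well under both the $50,000 and $100,000 per-QALY willingness-to-pay thresholds used in the Monte Carlo analysis.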
Apixaban appears to be cost-effective relative to warfarin for secondary stroke prevention in patients with AF, assuming that it is introduced at a price similar to that of dabigatran.
The role of the physician scientist in biomedical research is increasingly threatened. Despite a clear role in clinical advances in translational medicine, the percentage of physicians engaged in research has steadily declined. Several programmatic efforts have been initiated to address this problem by providing time and financial resources to the motivated resident or fellow. However, this decline in physician scientists is due not only to a lack of time and resources but also a reflection of the uncertain path in moving from residency or postdoctoral training toward junior faculty. This article is a practical guide to the milestones and barriers to successful faculty achievement after residency or fellowship training.
To evaluate the incidence, characteristics, and clinical consequences of delayed intraventricular hemorrhage (dIVH).
Patients with primary intracerebral hemorrhage (ICH) were enrolled into a prospective registry between December 2006 and February 2012. Patients were managed, and serial neuroimaging obtained, per a structured protocol. Initial and delayed IVH were identified on imaging, along with ICH volumes, with outcomes blinded. Multivariate models were developed to test whether the occurrence of dIVH was a predictor of functional outcomes independent of known predictors, including the ICH score elements and ICH growth.
A total of 216 patients were studied, and 104 (48%) had IVH on initial imaging. Of the 112 with no initial IVH, 23 (21%) subsequently developed IVH. Emergent surgical intervention, mostly ventriculostomy placement, was required after discovery of dIVH in 10 (43%) of these 23. In multivariate models adjusting for all elements of the ICH score and hematoma growth, dIVH was an independent predictor of death at 14 days (p = 0.015) and of higher modified Rankin Scale scores at 3 months (p = 0.037). The effect of dIVH remained significant in a secondary analysis that adjusted for all other variables significant in the univariate analysis.
Similar to hematoma expansion, dIVH is independently associated with death and poor outcome. Because dIVH is easily detected by serial neuroimaging and often requires emergent surgical intervention, monitoring for dIVH is recommended.