Diminished control of standing balance, traditionally indicated by greater postural sway magnitude and speed, is associated with falls in older adults. Tai Chi (TC) is a multisystem intervention that reduces fall risk, yet its impact on sway measures varies considerably. We hypothesized that TC improves the integrated function of multiple control systems influencing balance, quantifiable by the multi-scale “complexity” of postural sway fluctuations.
To evaluate both traditional and complexity-based measures of sway to characterize the short- and potential long-term effects of TC training on postural control and the relationships between sway measures and physical function in healthy older adults.
A cross-sectional comparison of standing postural sway in healthy TC-naïve and TC-expert (24.5±12 yrs experience) adults. TC-naïve participants then completed a 6-month, two-arm, wait-list randomized clinical trial of TC training. Postural sway was assessed before and after training during standing on a force plate with eyes open (EO) and eyes closed (EC). Anterior-posterior (AP) and medio-lateral (ML) sway speed, magnitude, and complexity (quantified by multiscale entropy) were calculated. Single-legged standing time and Timed Up-and-Go tests characterized physical function.
At baseline, compared to TC-naïve adults (n = 60, age 64.5±7.5 yrs), TC-experts (n = 27, age 62.8±7.5 yrs) exhibited greater complexity of sway in the AP EC (P = 0.023), ML EO (P<0.001), and ML EC (P<0.001) conditions. Traditional measures of sway speed and magnitude were not significantly lower among TC-experts. Intention-to-treat analyses indicated no significant effects of short-term TC training; however, increases in AP EC and ML EC complexity amongst those randomized to TC were positively correlated with practice hours (P = 0.044, P = 0.018). Long- and short-term TC training were positively associated with physical function.
Multiscale entropy offers a complementary approach to traditional COP measures for characterizing sway during quiet standing, and may be more sensitive to the effects of TC in healthy adults.
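Multiscale entropy, as used above, computes sample entropy on progressively coarse-grained copies of the center-of-pressure series; higher values across scales indicate greater complexity. A minimal sketch, assuming the common parameter conventions m = 2 and r = 0.15 × SD (the study's exact settings are not stated here):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts template matches of
    length m and A counts matches of length m+1, within tolerance r
    under the Chebyshev (max-abs) distance."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()

    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            c += int(np.sum(d <= r))
        return c

    B = count_matches(m)
    A = count_matches(m + 1)
    return np.inf if A == 0 or B == 0 else -np.log(A / B)

def multiscale_entropy(x, scales=range(1, 6), m=2, r=None):
    """Coarse-grain the series at each scale (non-overlapping window
    means), then compute sample entropy of each coarse-grained series.
    The tolerance r is fixed from the original series, per convention."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.15 * x.std()
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m=m, r=r))
    return out
```

A summary complexity index is often taken as the sum or mean of the entropy values across scales.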
To investigate predictors of trial start-up times, high attrition, and poor protocol adherence in amyotrophic lateral sclerosis (ALS) trials.
Retrospective analysis of start-up times, retention, and protocol adherence was performed on 5 clinical studies conducted by the Northeast ALS Consortium and 50 ALS clinical trials identified by PubMed search. Predictors of start-up times were estimated by accelerated failure time models with random effects. Predictors of retention and protocol deviations were estimated by mixed-model logistic regression.
Median times for contract execution and institutional review board (IRB) approval were 105 days and 125 days, respectively. Contract execution was faster at sites with more ongoing trials (p = 0.005), and more full-time (p = 0.006) and experienced (p < 0.001) coordinators. IRB approval was faster at sites with more ongoing trials (p = 0.010) and larger ALS clinics (p = 0.038). Site activation after IRB approval was faster at sites with more full-time (p = 0.038) and experienced (p < 0.001) coordinators. Twenty-two percent of surviving participants withdrew before completing the trial. Better participant functional score at baseline was an independent predictor of trial completion (odds ratio 1.29, p = 0.002) and fewer protocol deviations (odds ratio 0.86, p = 0.030).
Delays in IRB review contribute the most to prolonged trial start-up times, and these timelines are faster at sites with more experienced staff. Strategies to improve protocol adherence and participant retention may include enrolling people at early disease stages.
Diffusion-weighted MRI (DWI) is often used in acute stroke to identify irreversibly injured “core” tissue, though some propose using perfusion imaging, specifically cerebral blood volume (CBV) maps, in place of DWI. We examined whether CBV maps can reliably substitute for DWI, and assessed the effect of scan duration on calculated CBV.
Materials and Methods
We retrospectively identified 58 patients who underwent DWI and MR perfusion imaging within 12 hours of stroke onset. CBV in each DWI lesion's center was divided by CBV in the normal-appearing contralateral hemisphere to yield relative regional CBV (rrCBV). The proportion of lesions with decreased rrCBV was calculated. After using the full scan duration (110 sec after contrast injection), rrCBV was recalculated using simulated shorter scans. The effect of scan duration on rrCBV was tested with linear regression.
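The rrCBV ratio and the simulated shorter scans described above reduce to a division and a truncated integration of the contrast concentration-time curve. A minimal sketch (the gamma-variate bolus curve below is synthetic, for illustration only, not patient data):

```python
import numpy as np

def relative_regional_cbv(lesion_cbv, contralateral_cbv):
    """rrCBV: CBV at the DWI lesion center divided by CBV in the
    normal-appearing contralateral hemisphere (rrCBV < 1 = decreased)."""
    return lesion_cbv / contralateral_cbv

def cbv_from_curve(t_sec, concentration, scan_duration_sec):
    """CBV is proportional to the area under the contrast
    concentration-time curve; integrating only up to a shorter scan
    duration simulates an abbreviated acquisition."""
    mask = t_sec <= scan_duration_sec
    return np.trapz(concentration[mask], t_sec[mask])

# Synthetic gamma-variate bolus passage (illustrative only)
t = np.linspace(0, 110, 221)
c = (t ** 3) * np.exp(-t / 8.0)

full = cbv_from_curve(t, c, 110.0)   # full 110-sec acquisition
short = cbv_from_curve(t, c, 39.5)   # shortest simulated scan
```

Because the truncated integral omits the tail of the bolus curve, a shorter simulated scan yields a smaller calculated CBV, matching the scan-duration dependence reported above.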
Using the full scan duration (110 sec), rrCBV was increased in most DWI lesions (62%; 95% CI 48%-74%). rrCBV increased with increasing scan duration (P<0.001). Even with the shortest duration (39.5 sec), rrCBV was increased in 33% of lesions.
Because DWI lesions may have elevated or decreased CBV, CBV maps cannot reliably substitute for DWI in identifying the infarct core.
Stroke; Perfusion Weighted Imaging; Cerebral blood volume; Diffusion Weighted Imaging
To evaluate the safety, tolerability, and pharmacokinetics of an antisense oligonucleotide designed to inhibit SOD1 expression (ISIS 333611) following intrathecal administration in patients with SOD1-related familial amyotrophic lateral sclerosis (ALS).
Mutations in SOD1 cause 13% of familial ALS. In animal studies, ISIS 333611 delivered to the cerebrospinal fluid (CSF) distributed to the brain and spinal cord, decreased SOD1 mRNA and protein levels in spinal cord tissue, and prolonged survival in the SOD1G93A rat ALS model.
In a randomized, placebo-controlled phase 1 trial, ISIS 333611 was delivered by intrathecal infusion using an external pump over 11.5 hours at increasing doses to four cohorts of eight SOD1-positive ALS subjects (randomized 6 drug : 2 placebo per cohort). Subjects were allowed to re-enroll in subsequent cohorts. Safety and tolerability assessments were made during the infusion and periodically over the 28 days following the infusion. CSF and plasma drug levels were measured.
No dose-limiting toxicities were identified at doses up to 3.0 mg. No safety or tolerability concerns related to ISIS 333611 were identified. There were no serious adverse events (AEs) in ISIS 333611-treated subjects. Re-enrollment and re-dosing of subjects with ISIS 333611 was also well tolerated. Dose-dependent CSF and plasma concentrations were observed.
In this first clinical study to report intrathecal delivery of an antisense oligonucleotide, ISIS 333611 was well tolerated when administered as an intrathecal infusion in subjects with SOD1 familial ALS. CSF and plasma drug levels were consistent with levels predicted from preclinical studies. These results suggest that antisense oligonucleotide delivery to the central nervous system may be a feasible therapeutic strategy for neurological disorders.
Source of funding
ALS Association, Muscular Dystrophy Association, Isis Pharmaceuticals
Low serum folate levels previously have been associated with negative symptom risk in schizophrenia, as has the hypofunctional 677C>T variant of the MTHFR gene. This study examined whether other missense polymorphisms in folate-regulating enzymes, in concert with MTHFR, influence negative symptoms in schizophrenia, and whether total risk allele load interacts with serum folate status to further stratify negative symptom risk. Medicated outpatients with schizophrenia (n = 219), all of European origin and some included in a previous report, were rated with the Positive and Negative Syndrome Scale. A subset of 82 patients also underwent nonfasting serum folate testing. Patients were genotyped for the MTHFR 677C>T (rs1801133), MTHFR 1298A>C (rs1801131), MTR 2756A>G (rs1805087), MTRR 203A>G (rs1801394), FOLH1 484T>C (rs202676), RFC 80A>G (rs1051266), and COMT 675G>A (rs4680) polymorphisms. All genotypes were entered into a linear regression model to determine significant predictors of negative symptoms, and risk scores were calculated based on total risk allele dose. Four variants, MTHFR 677T, MTR 2756A, FOLH1 484C, and COMT 675A, emerged as significant independent predictors of negative symptom severity, accounting for significantly greater variance in negative symptoms than MTHFR 677C>T alone. Total allele dose across the 4 variants predicted negative symptom severity only among patients with low folate levels. These findings indicate that multiple genetic variants within the folate metabolic pathway contribute to negative symptoms of schizophrenia. A relationship between folate level and negative symptom severity among patients with greater genetic vulnerability is biologically plausible and suggests the utility of folate supplementation in these patients.
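The total risk allele dose described above is a per-patient count of risk-allele copies summed across the four independently predictive variants. A minimal sketch, with hypothetical genotype coding (0, 1, or 2 copies of each risk allele; the example patient is invented):

```python
# Risk alleles named in the abstract; the genotype values below are
# hypothetical, for illustration only.
RISK_VARIANTS = ["MTHFR_677T", "MTR_2756A", "FOLH1_484C", "COMT_675A"]

def risk_allele_score(genotypes):
    """Total risk allele dose: sum of risk-allele copies (0-2 each)
    across the four variants, giving a score from 0 to 8."""
    return sum(genotypes[v] for v in RISK_VARIANTS)

# Hypothetical patient: heterozygous at MTHFR and COMT,
# homozygous risk at MTR, no FOLH1 risk alleles.
patient = {"MTHFR_677T": 1, "MTR_2756A": 2, "FOLH1_484C": 0, "COMT_675A": 1}
```

In the analysis above, this summed score stratified negative symptom severity only among patients with low serum folate.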
gene-environment interaction; methylation; epigenetics; genetic risk score
Alzheimer’s disease (AD) trials initiated during or before the prodrome are costly and lengthy because patients are enrolled long before clinical symptoms are apparent, when disease progression is slow. We hypothesized that design of such trials could be improved by: (1) selecting individuals at moderate near-term risk of progression to AD dementia (the current clinical standard) and (2) using short-term surrogate endpoints that predict progression to AD dementia. We used a longitudinal cohort of older, initially non-demented, community-dwelling participants (n=358) to derive selection criteria and surrogate endpoints and tested them in an independent national data set (n=6,243). To identify a “mid-risk” subgroup, we applied conditional tree-based survival models to Clinical Dementia Rating (CDR) scale scores and common neuropsychological tests. In the validation cohort, a time-to-AD dementia trial applying these mid-risk selection criteria to a pool of all non-demented individuals could achieve equivalent power with 47% fewer participants than enrolling at random from that pool. We evaluated surrogate endpoints measurable over two years of follow-up based on cross-validated concordance between predictions from Cox models and observed time to AD dementia. The best performing surrogate, rate of change in CDR sum-of-boxes, did not reduce the trial duration required for equivalent power using estimates from the validation cohort, but alternative surrogates with better ability to predict time to AD dementia should be able to do so. The approach tested here might improve efficiency of prodromal AD trials using other potential measures and could be generalized to other diseases with long prodromal phases.
Clinical Trials as Topic; Surrogate Endpoint; Alzheimer’s Disease; Survival Analysis; National Alzheimer’s Coordinating Center Uniform Data Set
Work hour limitations for graduate medical trainees, motivated by concerns about patient safety, quality of care, and trainee well-being, continue to generate controversy. Little information about sleep habits and the prevalence of sleep disorders among residents is available to inform policy in this area.
To evaluate the sleep habits of matriculating residents, postgraduate year-1 (PGY-1).
An anonymous, voluntary, self-administered survey study was used with 3 validated questionnaires: the Pittsburgh Sleep Quality Index, the Insomnia Severity Index, and the Epworth Sleepiness Scale, which were fielded to PGY-1 residents entering the Accreditation Council for Graduate Medical Education–accredited programs at Massachusetts General Hospital and/or Brigham and Women's Hospitals in June and July 2009.
Of 355 eligible subjects, 310 (87%) participated. Mean sleep time for PGY-1 residents was 7 hours and 34 minutes, and 5.6% of PGY-1 residents had Pittsburgh Sleep Quality Index global scores greater than 5, indicating poor-quality sleep. In multiple linear and ordinal logistic regression models, men had higher Pittsburgh Sleep Quality Index sleep latency scores, whereas women and those with children had higher Epworth Sleepiness Scale daytime sleepiness scores; 18% of PGY-1 residents had abnormal amounts of daytime sleepiness based on the Epworth Sleepiness Scale. The Insomnia Severity Index identified 4.2% of PGY-1 residents with moderate insomnia.
Some PGY-1 residents may begin residency with sleep dysfunctions. Efforts to provide targeted help to selected trainees in managing fatigue during residency should be investigated.
Established heart failure in thalassaemia major has a poor prognosis and optimal management remains unclear.
A 1-year prospective study comparing DFO monotherapy with DFO combined with deferiprone (DFP) for patients with left ventricular ejection fraction (LVEF) <56% was conducted by the Thalassemia Clinical Research Network (TCRN). All patients received DFO at 50–60 mg/kg 12–24 hr/day sc or iv 7 times weekly, combined with either DFP at 75 mg/kg/day (combination arm) or placebo (DFO monotherapy arm). The primary endpoint was the change in LVEF by CMR.
Improvement in LVEF was significant in both study arms at 6 and 12 months (p = 0.04), normalizing ventricular function in 9/16 evaluable patients. With combination therapy, the LVEF increased from 49.9% to 55.2% (+5.3% p = 0.04; n = 10) at 6 months and to 58.3% at 12 months (+8.4% p = 0.04; n = 7). With DFO monotherapy, the LVEF increased from 52.8% to 55.7% (+2.9% p = 0.04; n = 6) at 6 months and to 56.9% at 12 months (+4.1% p = 0.04; n = 4). The LVEF trend did not differ significantly between study arms (p = 0.89). In 2 patients on DFO monotherapy during the study and in 1 patient on combined therapy during follow-up, heart failure deteriorated fatally. The study was originally powered for 86 participants to determine a 5% difference in LVEF improvement between treatments. The study was prematurely terminated due to slow recruitment, and with the achieved sample size of 20 patients there was 80% power to detect an 8.6% difference in EF, which was not demonstrated. Myocardial T2* improved in both arms (combination +1.9 ± 1.6 ms p = 0.04; and DFO monotherapy +1.9 ± 1.4 ms p = 0.04), but with no significant difference between treatments (p = 0.65). Liver iron (p = 0.03) and ferritin (p < 0.001) both decreased significantly in only the combination group.
Both treatments significantly improved LVEF and myocardial T2*. Although this is the largest and only randomized study in patients with LV decompensation, further prospective evaluation is needed to identify optimal chelation management in these high-risk patients.
Thalassemia; Heart failure; Deferoxamine; Deferiprone; Combination
Glutamatergic N-methyl-D-aspartate (NMDA) receptor hypofunction has been proposed as a mechanism underlying psychosis. D-cycloserine, a partial agonist at the glycine site of the NMDA receptor, enhances learning in animal models, although tachyphylaxis develops with repeated dosing. Once-weekly dosing of D-cycloserine produces persistent improvement when combined with cognitive behavioral therapy (CBT) in anxiety disorders. Delusional beliefs can be conceptualized as a learning deficit, characterized by the failure to use contradictory evidence to modify the belief. CBT techniques have been developed with modest success to facilitate such reality-testing (or new learning) in delusional beliefs. The current study evaluated whether D-cycloserine could potentiate beneficial effects of CBT on delusional severity. Twenty-one outpatients with schizophrenia or schizoaffective disorder and moderately severe delusions were randomized in a double-blind cross-over design to receive a single-dose of either D-cycloserine 50mg or placebo in a counterbalanced order on two consecutive weeks 1-hour prior to a CBT intervention involving training in the generation of alternative beliefs. Assessments were completed at baseline, 7 days following the first study drug administration and 7 days following the second study drug administration. Contrary to prediction, there was no significant D-cycloserine treatment effect on delusional distress or severity as measured by the SAPS or PSYRATS. An unexpected finding was an order effect, whereby subjects who received D-cycloserine first had significantly reduced delusional severity, distress, and belief conviction on PSYRATS compared to subjects who received placebo first. However, this finding is consistent with animal models in which D-cycloserine enhances learning only when accompanying the first exposure to training.
NMDA receptor hypofunction; D-cycloserine; Cognitive-Behavioral Therapy; psychosis; delusions; schizophrenia
To evaluate clinical outcomes of frozen-thawed embryo transfer cycles when one or two blastocysts are transferred.
Retrospective chart review
Two hundred forty-three frozen blastocyst transfer (FBT) cycles were analyzed. Clinical pregnancy rate (50.4% vs. 34.7%), live birth rate (45.8% vs. 30.6%), and twin live birth rate (19.3% vs. 0) were significantly higher in the double versus single FBT group, respectively (p < 0.05). Prior fresh cycle success with same-cohort embryos did not predict outcome of FBT cycle. When the fresh cycle was unsuccessful, there still was a significant increase in twinning when two frozen-thawed blastocysts were transferred.
Transferring two blastocysts during an FBT cycle resulted in higher live birth and twin live birth rates. Single FBT provided acceptable pregnancy rates for couples seeking to avoid a multiple pregnancy or for those having a single blastocyst stored. Interestingly, the outcome of fresh cycle with same-cohort embryos did not influence the outcome of frozen-thawed cycle.
Frozen blastocyst transfer; Frozen cycle outcome; IVF; Multiple pregnancy; Number of embryos to transfer
To evaluate pregnancy rate (PR) and live birth rate (LBR) after freezing pronuclear (PN) embryos for two purposes: to reduce the risk of ovarian hyperstimulation syndrome (OHSS) and to bank embryos for cancer patients anticipating gametotoxic chemotherapy/radiotherapy.
Data from 3,621 consecutive IVF cycles were retrospectively analyzed. PN freezing was offered to patients at risk for OHSS and for those wishing to preserve fertility prior to cancer therapy. Primary outcomes evaluated were PR and LBR. Outcomes were compared to patients who underwent fresh embryo transfer (ET) in 2006.
Sixty-six patients froze PN embryos. Thirty-eight were at risk for OHSS. The LBR was 34.3% after one transfer, and 51.4% after a mean of 1.4 transfers. Twenty-eight cancer patients froze embryos. The LBR was 16.7% after one transfer and 25.0% after a mean of 1.5 transfers. The LBR was 35.5% for patients who underwent fresh ET.
PN freezing with delayed ET is an effective tool for achieving pregnancy for patients at risk of OHSS and for cancer patients wishing to preserve fertility.
Ovarian hyperstimulation syndrome; OHSS; IVF; Cancer; Fertility preservation
Supportive social relationships, including a positive patient-practitioner relationship, have been associated with positive health outcomes. Using the data from a randomized controlled trial (RCT) undertaken in the Boston area of the United States, this study sought to identify baseline factors predictive of patients' response to an experimentally applied supportive patient-practitioner relationship. To sort through the hundreds of potential attributes affecting the patient-practitioner relationship, we applied a false discovery rate method borrowed from the field of genomics and bioinformatics. To our knowledge such a method has not previously been applied to generate hypotheses from clinical trial data. In a previous RCT, our team investigated the effect of the patient-practitioner relationship on symptom improvement in patients with irritable bowel syndrome (IBS). Data were collected on a sample of 289 individuals with IBS using a three-week, single blind, three arm, randomized controlled design. We found that a supportive patient-practitioner relationship significantly improved symptomatology and quality of life. A complex, multi-level measurement package was used to prospectively measure change and identify factors associated with improvement. Using a local false discovery rate procedure, we examined the association of 452 baseline subject variables with sensitivity to treatment. Out of 452 variables, only two baseline factors, reclusiveness, and previous trial experience increased sensitivity to the supportive patient-practitioner relationship. A third variable, additional opportunity during the study for subjects to discuss their illness through experiential interview, was associated with improved outcomes among subjects who did not receive the supportive patient-practitioner relationship. The few variables associated with differential benefit suggest that a patient-centered supportive patient-practitioner relationship may be beneficial for most patients. 
This may be especially important for reclusive individuals. Within the context of our study, additional study attention in the form of repeated experiential interviews compensated for a lack of positive patient-practitioner support. A supportive patient-practitioner relationship may also help overcome low provider expectations for subjects with previous trial experience. These results converge with the results of the parent trial, implicating the importance of the social world in healing.
USA; Patient-practitioner relationship; social factors; randomized controlled trial; false discovery rate analysis
While the prognosis of patients with glioblastoma (GBM) remains poor despite recent therapeutic advances, variable survival times suggest wide variation in tumor biology and an opportunity for stratified intervention. We used volumetric analysis and morphometrics to measure the spatial relationship between subventricular zone (SVZ) proximity and survival in a cohort of 39 newly diagnosed GBM patients. We collected T2-weighted and gadolinium-enhanced T1-weighted magnetic resonance images (MRI) at pre-operative, post-operative, pre-radiation therapy, and post-radiation therapy time points, measured tumor volumes and distances to the SVZ, and collected clinical data. Univariate and multivariate Cox regression showed that tumors involving the SVZ and tumor growth rate during radiation therapy were independent predictors of shorter progression-free and overall survival. These results suggest that GBMs in close proximity to the ependymal surface of the ventricles convey a worse prognosis, an observation that may be useful for stratifying treatment.
Electronic supplementary material
The online version of this article (doi:10.1007/s11060-010-0477-1) contains supplementary material, which is available to authorized users.
SVZ; Glioblastoma; Stem cell; Outcome; MRI
Adults with β thalassemia major frequently have low BMD, fractures, and bone pain. The purpose of this study was to determine the prevalence of low BMD, fractures, and bone pain in all thalassemia syndromes in childhood, adolescence, and adulthood, associations of BMD with fractures and bone pain, and etiology of bone disease in thalassemia. Patients of all thalassemia syndromes in the Thalassemia Clinical Research Network, ≥6 yr of age, with no preexisting medical condition affecting bone mass or requiring steroids, participated. We measured spine and femur BMD and whole body BMC by DXA and assessed vertebral abnormalities by morphometric X-ray absorptiometry (MXA). Medical history by interview and review of medical records, physical examinations, and blood and urine collections were performed. Three hundred sixty-one subjects, 49% male, with a mean age of 23.2 yr (range, 6.1–75 yr), were studied. Spine and femur BMD Z-scores < −2 occurred in 46% and 25% of participants, respectively. Greater age, lower weight, hypogonadism, and increased bone turnover were strong independent predictors of low bone mass regardless of thalassemia syndrome. Peak bone mass was suboptimal. Thirty-six percent of patients had a history of fractures, and 34% reported bone pain. BMD was negatively associated with fractures but not with bone pain. Nine percent of participants had uniformly decreased height of several vertebrae by MXA, which was associated with the use of iron chelator deferoxamine before 6 yr of age. In patients with thalassemia, low BMD and fractures occur frequently and independently of the particular syndrome. Peak bone mass is suboptimal. Low BMD is associated with hypogonadism, increased bone turnover, and an increased risk for fractures.
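The Z-score threshold used above (< −2) expresses BMD in standard-deviation units relative to an age- and sex-matched reference distribution. A minimal sketch, with hypothetical reference values (real DXA software uses manufacturer- and population-specific normative data):

```python
def bmd_z_score(bmd, ref_mean, ref_sd):
    """BMD Z-score: standard deviations from the age- and sex-matched
    reference mean for the same skeletal site."""
    return (bmd - ref_mean) / ref_sd

def low_bone_mass(bmd, ref_mean, ref_sd):
    """Apply the Z < -2 low-bone-mass threshold used in the study."""
    return bmd_z_score(bmd, ref_mean, ref_sd) < -2

# Hypothetical example: spine BMD of 0.70 g/cm^2 against an invented
# reference of mean 1.00 g/cm^2, SD 0.12 g/cm^2 gives Z = -2.5.
```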
DXA; BMD; fractures; vertebral morphometry; thalassemia
Perinatal HIV infection is characterized by a sustained high-level viremia and a high risk of rapid progression to AIDS, indicating a failure of immunologic containment of the virus. We hypothesized that age-related differences in the specificity or function of HIV-specific T cells may influence HIV RNA levels and clinical outcome following perinatal infection. Here we define the HIV epitopes targeted by 76 pediatric subjects (47 HIV-infected and 29 HIV-exposed but uninfected), and assess the ability of HIV-specific CD8 and CD4 T cells to degranulate and produce IFNγ, TNFα, and IL-2. No responses were detected among HIV-uninfected infants, while responses among infected subjects increased in magnitude and breadth with age. Gag-specific responses were uncommon during early infancy, and their frequency was significantly lower among children younger than 24 months old (p=0.014). Importantly, Gag responders exhibited significantly lower HIV RNA levels than nonresponders (logVL 5.8 vs. 5.0; p=0.005). Both the total and Gag-specific T cell frequency correlated inversely with viral load after correction for age, whereas no relationship with targeting of other viral proteins was observed. Functional assessment of HIV-specific T cells by multiparameter flow cytometry revealed that polyfunctional CD8 cells were less prevalent in children before 24 months of age, and that HIV-specific CD4 cell responses were of universally low frequency among antiretroviral-naïve children and absent in young infants. These cross-sectional data suggest that qualitative differences in the CD8 response, combined with a deficiency of HIV-specific CD4 cells, may contribute to the inability of young infants to limit replication of HIV.
human; AIDS; T cells
The basis of the association between asthma and an increased rate of pain among children with sickle cell anemia (SCA) is unclear. To provide evidence for a familial contribution to this observation, we tested the hypothesis that a family history of asthma is associated with an increased pain rate. Using data from the Cooperative Study for Sickle Cell Disease (CSSCD), we identified 211 children with SCA for whom asthma history of the parents and siblings was available. A sibling history of asthma was associated with a greater rate of pain (mean rate ratio = 2.48, 95% CI 1.6–4.0; p<0.001) when compared to children without a sibling history of asthma. Parental history of asthma was not associated with an increased rate of pain (mean rate ratio = 1.51, 95% CI 0.92–2.62; p=0.12). Further studies are needed to examine genetic and/or environmental risks for asthma as potential contributors to pain in children with SCA.
Sickle cell disease; asthma; pain; acute chest syndrome
Increasing atmospheric carbon dioxide is responsible for climate changes that are having widespread effects on biological systems. One of the clearest changes is earlier onset of spring and lengthening of the growing season. We designed the present study to examine the interactive effects of timing of dormancy release of seeds with low and high atmospheric CO2 on biomass, reproduction, and phenology in ragweed plants (Ambrosia artemisiifolia L.), which produce highly allergenic pollen. We released ragweed seeds from dormancy at three 15-day intervals and grew plants in climate-controlled glasshouses at either ambient or 700-ppm CO2 concentrations, placing open-top bags over inflorescences to capture pollen. Measurements of plant height and weight; inflorescence number, weight, and length; and days to anthesis and anthesis date were made on each plant, and whole-plant pollen productivity was estimated from an allometric-based model. Timing and CO2 interacted to influence pollen production. At ambient CO2 levels, the earlier cohort acquired a greater biomass, a higher average weight per inflorescence, and a larger number of inflorescences; flowered earlier; and had 54.8% greater pollen production than did the latest cohort. At high CO2 levels, plants showed greater biomass and reproductive effort compared with those in ambient CO2 but only for later cohorts. In the early cohort, pollen production was similar under ambient and high CO2, but in the middle and late cohorts, high CO2 increased pollen production by 32% and 55%, respectively, compared with ambient CO2 levels. Overall, ragweed pollen production can be expected to increase significantly under predicted future climate conditions.
allergenic pollen; Ambrosia artemisiifolia; climate change; climate variability; elevated CO2; global warming; ragweed; spring-time warming