By determining protein-protein interactions in normal, diseased and infected cells, we can improve our understanding of cellular systems and their reaction to various perturbations. In this protocol, we discuss how to use data obtained in affinity purification–mass spectrometry (AP-MS) experiments to generate meaningful interaction networks and effective figures. We begin with an overview of common epitope tagging, expression and AP practices, followed by liquid chromatography–MS (LC-MS) data collection. We then provide a detailed procedure covering a pipeline approach to (i) pre-processing the data by filtering against contaminant lists such as the Contaminant Repository for Affinity Purification (CRAPome) and normalizing using the spectral index (SIN) or normalized spectral abundance factor (NSAF); (ii) scoring via methods such as MiST, SAINT and CompPASS; and (iii) testing the resulting scores. Data formats familiar to MS practitioners are then transformed to those most useful for network-based analyses. The protocol also explores methods available in Cytoscape to visualize and analyze these types of interaction data. The scoring pipeline can take anywhere from 1 d to 1 week, depending on one’s familiarity with the tools and data peculiarities. Similarly, the network analysis and visualization protocol in Cytoscape takes 2–4 h to complete with the provided sample data, but we recommend taking days or even weeks to explore one’s data and find the right questions.
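The NSAF normalization named in step (i) divides each protein's length-normalized spectral count by the sum of that quantity over all proteins in the run. A minimal sketch of the calculation (the function name and list-based interface are our own illustration, not part of the protocol's software):

```python
def nsaf(spectral_counts, lengths):
    """NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j).

    spectral_counts: spectral counts per protein
    lengths: protein lengths (residues), in the same order
    """
    saf = [c / l for c, l in zip(spectral_counts, lengths)]  # length-normalized counts
    total = sum(saf)
    return [s / total for s in saf]  # fractions summing to 1
```

By construction the NSAF values within a run sum to 1, which makes abundances comparable across runs of different depth.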
Most multiple organ failure (MOF) scores were developed over a decade ago, but little has been done to validate them or to understand the differences between the populations each identifies. Given the lack of a gold standard, validation must rely upon objective clinical and resource utilization outcomes. Thus, we propose to: 1) validate two widely accepted MOF scores (Denver's and Marshall's) by examining their association with adverse outcomes in a postinjury population; and 2) compare the risk factors, characteristics and outcomes of patients identified by each score. The Denver MOF score grades (from 0 to 3) 4 organ dysfunctions (lung, kidney, liver, heart) and defines MOF as a score > 3. The Marshall score additionally grades central nervous system (CNS) and hematologic dysfunction (for a total of 6 organs on a 0 to 4 scale). Using a prospectively collected dataset, MOF was scored daily by both systems for 1,389 consecutive trauma patients with ISS > 15 admitted from 1992 to 2004, and their outcomes were evaluated (death; ventilator-free days, VFD; mechanical ventilation time, MV; and length of stay in the intensive care unit, ICU-LOS). Three major groups could be identified: 1) a severe injury group for whom MOF risk factor rates, mortality and utilization were all high (Denver = Marshall = MOF, and Denver = MOF + Marshall = No MOF); 2) a moderate injury group with a medium rate of MOF risk factors, medium utilization and low mortality (Denver = No MOF, Marshall = MOF); and 3) a mild injury group for whom risk factor rates, mortality and utilization were all low (Denver = Marshall = No MOF). Both scores performed well, with the Denver MOF score showing greater specificity. The basic concepts of each score can probably be combined to produce an improved MOF score.
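As a worked illustration of the Denver rule described above (each of 4 organs graded 0–3, with MOF defined as a total score > 3; the function name and interface are ours, not the authors'):

```python
def denver_mof(lung, kidney, liver, heart):
    """Sum the four 0-3 organ grades; MOF is declared when the total exceeds 3."""
    grades = (lung, kidney, liver, heart)
    if not all(0 <= g <= 3 for g in grades):
        raise ValueError("each organ grade must be 0-3")
    total = sum(grades)
    return total, total > 3  # (score, MOF yes/no)
```

For example, grades of 2 (lung), 1 (kidney), 1 (liver) and 0 (heart) sum to 4 and therefore meet the MOF definition, while 1/1/1/0 sums to 3 and does not.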
adult; mortality; blood transfusion; cohort study; critical care
Dynamic kinetic asymmetric transformations (DyKAT) of racemic β-bromo-α-keto esters via direct aldolization of nitromethane and acetone provide access to fully substituted α-glycolic acid derivatives bearing a β-stereocenter. The aldol adducts are obtained in excellent yield with high relative and absolute stereocontrol under mild reaction conditions. Mechanistic studies determined that the reactions proceed through a facile catalyst-mediated racemization of the β-bromo-α-keto esters under a DyKAT Type I manifold.
dynamic kinetic asymmetric transformation (DyKAT); Henry reaction; acetone aldol; organocatalyzed; α-keto ester
Aims: The purpose of this study was to determine whether 3′-5′-cyclic adenosine monophosphate (cAMP)-protein kinase A (PKA)- and Sirtuin-1 (SIRT1)-dependent mechanisms modulate ATP-binding cassette (ABC) transport protein expression. ABC transport proteins (ABCC2–4) are essential for chemical elimination from hepatocytes and biliary excretion. Nuclear factor-E2 related-factor 2 (NRF2) is a transcription factor that mediates ABCC induction in response to chemical inducers and liver injury. However, a role for NRF2 in the regulation of transporter expression in nonchemical models of liver perturbation is largely undescribed. Results: Here we show that fasting increased NRF2 target gene expression through NRF2- and SIRT1-dependent mechanisms. In intact mouse liver, fasting induces NRF2 target gene expression by 1.5- to 5-fold. In mouse and human hepatocytes, treatment with 8-bromoadenosine-cAMP, a cAMP analogue, increased NRF2 target gene expression and antioxidant response element activity, which was decreased by the PKA inhibitor H-89. Moreover, fasting-induced NRF2 target gene expression was decreased in liver and hepatocytes of SIRT1 liver-specific null mice and NRF2-null mice. Lastly, NRF2 and SIRT1 were recruited to Maf recognition elements (MAREs) and antioxidant response elements (AREs) in the human ABCC2 promoter. Innovation: Oxidative stress-mediated NRF2 activation is well described, yet the influence of basic metabolic processes on NRF2 activation is just emerging. Conclusion: The current data point toward a novel role of nutrient status in the regulation of NRF2 activity and the antioxidant response, and indicate that cAMP/PKA and SIRT1 are upstream regulators of fasting-induced activation of the NRF2-ARE pathway. Antioxid. Redox Signal. 20, 15–30.
An asymmetric total synthesis of the aminocyclopentitol pactamycin is described, which delivers the title compound in 15 steps from 2,4-pentanedione. Critical to this approach was the exploitation of a complex symmetry-breaking reduction strategy to assemble the C1, C2, and C7 relative stereochemistry within the first four steps of the synthesis. Multiple iterations of this reduction strategy are described, and a thorough analysis of stereochemical outcomes is detailed. In the final case, an asymmetric Mannich reaction was developed to install a protected amine directly at the C2 position. Symmetry-breaking reduction of this material gave way to a remarkable series of stereochemical outcomes leading to the title compound without recourse to non-strategic downstream manipulations. This synthesis readily accommodates the preparation of structural analogs.
CALGB 19802, a phase II study, evaluated whether dose intensification of daunorubicin and cytarabine could improve disease-free survival (DFS) of adults with acute lymphoblastic leukemia (ALL), and whether high-dose systemic and intrathecal methotrexate could replace cranial radiotherapy for central nervous system (CNS) prophylaxis.
Patients and Methods
One hundred sixty-one eligible, previously untreated patients aged 16–82 years (median, 40 years) were enrolled; 33 (20%) were ≥60 years old.
One hundred twenty-eight patients (80%) achieved a complete remission (CR). Dose intensification of daunorubicin and cytarabine was feasible. With a median follow-up of 10.4 years for surviving patients, 5-year DFS was 25% (95% CI, 18–33%) and overall survival (OS) was 30% (95% CI, 23–37%). Patients <60 years who received the 80 mg/m2 dose of daunorubicin had a DFS of 33% (22–44%) and OS of 39% (29–49%) at 5 years. Eighty-four (52%) patients relapsed, including nine (6%) with isolated CNS relapses. Omission of cranial irradiation did not result in higher than historical CNS relapse rates.
Intensive systemic, oral, and intrathecal methotrexate dosing permitted omission of CNS irradiation. This intensive approach using higher doses of daunorubicin and cytarabine failed to result in an overall improvement in DFS or OS compared with historical CALGB studies. Future therapeutic strategies for adults with ALL should be tailored to specific age and molecular genetic subsets.
The identification of sex-based disparities in the use of effective medications in high-risk populations can lead to interventions to minimize disparities in health outcomes. The objective of this study was to determine sex-specific rates of cardioprotective medication use in a large population-level administrative-health database from a universal-payer environment.
Research design and methods
This observational, population-based cohort study used provincial administrative data to compare the utilization of cardioprotective medications between women and men in the first year following a diabetes diagnosis. Competing risks regression was used to calculate crude and adjusted sub-hazard ratios for time-to-first angiotensin-converting-enzyme inhibitor, angiotensin receptor blocker, or statin dispensations.
There were 15,120 (45.4%) women and 18,174 (54.6%) men with diabetes in the study cohort. Overall cardioprotective medication use was low for both primary and secondary prevention in both women and men. In the year following a diabetes diagnosis, women were less likely than men to use a statin (adjusted sub-hazard ratio [aSHR] 0.90, 95% confidence interval [CI] 0.85 to 0.96), an angiotensin-converting-enzyme inhibitor (aSHR 0.90, 95% CI 0.86 to 0.94), or any cardioprotective medication (aSHR 0.93, 95% CI 0.90 to 0.97).
Cardioprotective medication use was not optimal in women or men. We also identified a health care gap with cardioprotective medication use being lower in women with diabetes compared to men. Closing this gap has the potential to reduce the impact of cardiovascular disease in women with diabetes.
Electronic supplementary material
The online version of this article (doi:10.1186/1758-5996-6-117) contains supplementary material, which is available to authorized users.
Rituximab combined with chemotherapy has improved the survival of previously untreated patients with follicular lymphoma (FL). Nevertheless, many patients neither want nor can tolerate chemotherapy, leading to interest in biological approaches. Epratuzumab is a humanized anti-CD22 monoclonal antibody with efficacy in relapsed FL. Since both rituximab and epratuzumab have single agent activity in FL, we evaluated the antibody combination as initial treatment of patients with FL.
Patients and Methods
Fifty-nine untreated patients with FL received epratuzumab 360 mg/m2 with rituximab 375 mg/m2 weekly for four induction doses. This combination was continued as extended induction in weeks 12, 20, 28, and 36. Response assessed by CT was correlated with clinical risk factors, FDG-PET findings at week 3, Fcγ polymorphisms, immunohistochemical markers, and statin use.
Therapy was well tolerated, with toxicities similar to those expected with rituximab monotherapy. Fifty-two (88.2%) evaluable patients responded, including 25 complete responses (CR) (42.4%) and 27 partial responses (45.8%). At 3 years of follow-up, 60% of patients remain in remission. Follicular Lymphoma International Prognostic Index (FLIPI) risk strongly predicted progression-free survival (p = 0.022).
The high response rate and prolonged time to progression observed with this antibody combination are comparable to those observed after standard chemo-immunotherapies and further support the development of biologic, non-chemotherapeutic approaches for these patients.
Guidelines and experts describe 5% to 10% reductions in body weight as ‘clinically important’; however, it is not clear if 5% to 10% weight reductions correspond to clinically important improvements in health-related quality of life (HRQL). Our objective was to calculate the amount of weight loss required to attain established minimal clinically important differences (MCIDs) in HRQL, measured using three validated instruments.
Data from the Alberta Population-based Prospective Evaluation of Quality of Life Outcomes and Economic Impact of Bariatric Surgery (APPLES) study, a population-based, prospective Canadian cohort including 150 wait-listed, 200 medically managed and 150 surgically treated patients were examined. Two-year changes in weight and HRQL measures (Short-Form (SF)-12 physical (PCS; MCID = 5) and mental (MCS; MCID = 5) component summary score, EQ-5D Index (MCID = 0.03) and Visual Analog Scale (VAS; MCID = 10), Impact of Weight on Quality of Life (IWQOL)-Lite total score (MCID = 12)) were calculated. Separate multivariable linear regression models were constructed within medically and surgically treated patients to determine if weight changes achieved HRQL MCIDs. Pooled analysis in all 500 patients was performed to estimate the weight reductions required to achieve the pre-defined MCID for each HRQL instrument.
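The pooled estimation step can be pictured with a deliberately simplified univariable model: regress two-year HRQL change on percent weight loss, then solve for the loss at which the fitted change equals the instrument's MCID. (The study used multivariable models; the data and names below are illustrative only.)

```python
import numpy as np

def weight_loss_for_mcid(pct_weight_loss, hrql_change, mcid):
    """Fit hrql_change ~ pct_weight_loss as a line and invert it at the MCID."""
    slope, intercept = np.polyfit(pct_weight_loss, hrql_change, 1)
    return (mcid - intercept) / slope

# Hypothetical data: each 1% weight loss adds 0.5 HRQL points over a baseline of 1.
x = [0.0, 10.0, 20.0, 30.0]
y = [1.0, 6.0, 11.0, 16.0]
needed = weight_loss_for_mcid(x, y, mcid=5.0)  # 8% weight loss reaches the MCID
```

Inverting the fitted line is what turns an instrument-specific MCID into the "percent weight reduction required" figures reported in the results.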
Mean age was 43.7 (SD 9.6) years, 88% were women, 92% were white, and mean initial body mass index was 47.9 (SD 8.1) kg/m2. In surgically treated patients (two-year weight loss = 16%), HRQL MCIDs were reached for all instruments except the SF-12 MCS. In medically managed patients (two-year weight loss = 3%), MCIDs were attained in the EQ-index but not the other instruments. In all patients, percent weight reductions to achieve MCIDs were: 23% (95% confidence interval (CI): 17.5, 32.5) for PCS, 25% (17.5, 40.2) for MCS, 9% (6.2, 15.0) for EQ-Index, 23% (17.3, 36.1) for EQ-VAS, and 17% (14.1, 20.4) for IWQOL-Lite total score.
Weight reductions needed to achieve MCIDs for most HRQL instruments are markedly higher than the conventional threshold of 5% to 10%. Surgical, but not medical, treatment consistently led to clinically important improvements in HRQL over two years.
Electronic supplementary material
The online version of this article (doi:10.1186/s12916-014-0175-5) contains supplementary material, which is available to authorized users.
Health-related quality of life; Weight loss; Minimal clinically important difference; Obesity; Patient reported outcomes; Bariatric care
To investigate whether the risk of bladder cancer in individuals with newly diagnosed type 2 diabetes is influenced by the frequency of physician visits before diagnosis as a measure of detection bias.
RESEARCH DESIGN AND METHODS
With the use of linked administrative databases from 1996 to 2006, we established a cohort of 185,100 adults from British Columbia, Canada, with incident type 2 diabetes matched one to one with nondiabetic individuals on age, sex, and index date. Incidence rates and adjusted hazard ratios (aHRs) for bladder cancer were calculated during annual time windows following the index date. Analyses were stratified by number of physician visits in the 2 years before diabetes diagnosis and adjusted for age, sex, year of cohort entry, and socioeconomic status.
The study population was 54% men and had an average age of 60.7 ± 13.5 years; 1,171 new bladder cancers were diagnosed over a median follow-up of 4 years. In the first year after diabetes diagnosis, bladder cancer incidence in the diabetic cohort was 85.3 (95% CI 72.0–100.4) per 100,000 person-years and 66.1 (54.5–79.4) in the control cohort (aHR 1.30 [1.02–1.67], P = 0.03). This first-year increased bladder cancer risk was limited to those with the fewest physician visits 2 years before the index date (≤12 visits, aHR 2.14 [1.29–3.55], P = 0.003). After the first year, type 2 diabetes was not associated with bladder cancer.
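The rates quoted above are events per person-time; a minimal sketch of the crude calculation, with a log-scale normal approximation for the 95% CI that is standard for Poisson counts (the study's exact interval method may differ):

```python
import math

def incidence_rate(events, person_years, per=100_000):
    """Crude incidence rate per `per` person-years with an approximate 95% CI."""
    rate = events / person_years * per
    se_log = 1.0 / math.sqrt(events)      # SE of log(rate) for a Poisson count
    lo = rate * math.exp(-1.96 * se_log)
    hi = rate * math.exp(1.96 * se_log)
    return rate, lo, hi
```

For example, 100 events over 100,000 person-years gives a rate of 100 per 100,000 with a CI of roughly 82 to 122.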
The results suggest that early detection bias may account for an overestimation in previously reported increased risks of bladder cancer associated with type 2 diabetes.
Concerns regarding neurocognitive toxicity of whole-brain radiotherapy (WBRT) have motivated development of alternative, dose-intensive chemotherapeutic strategies as consolidation in primary CNS lymphoma (PCNSL). We performed a multicenter study of high-dose consolidation, without WBRT, in PCNSL. Objectives were to determine: (1) the rate of complete response (CR) after remission induction therapy with methotrexate, temozolomide, and rituximab (MT-R); (2) the feasibility of a two-step approach using high-dose consolidation with etoposide plus cytarabine (EA); (3) progression-free survival (PFS); and (4) the correlation between clinical and molecular prognostic factors and outcome.
Patients and Methods
Forty-four patients with newly diagnosed PCNSL were treated with induction MT-R, and patients who achieved CR received EA consolidation. We performed a prospective analysis of molecular prognostic biomarkers in PCNSL in the setting of a clinical trial.
The rate of CR to MT-R was 66%. The overall 2-year PFS was 0.57, with median follow-up of 4.9 years. The 2-year time to progression was 0.59, and for patients who completed consolidation, it was 0.77. Patients age > 60 years did as well as younger patients, and the most significant clinical prognostic variable was treatment delay. High BCL6 expression correlated with shorter survival.
CALGB 50202 demonstrates for the first time to our knowledge that dose-intensive consolidation for PCNSL is feasible in the multicenter setting and yields rates of PFS and OS at least comparable to those of regimens involving WBRT. On the basis of these encouraging results, an intergroup randomized trial comparing EA consolidation with myeloablative chemotherapy in PCNSL has been activated, in which neither arm involves WBRT.
While the incidence of postinjury multiple-organ failure (MOF) has declined during the past decade, temporal trends of its morbidity, mortality, presentation patterns, and health care resource use have been inconsistent. The purpose of this study was to describe the evolving epidemiology of postinjury MOF from 2003 to 2010 in multiple trauma centers sharing standard treatment protocols.
“Inflammation and Host Response to Injury Collaborative Program” institutions that enrolled more than 20 eligible patients per biennium during the 2003 to 2010 study period were included. The patients were aged 16 years to 90 years and sustained blunt torso trauma with hemorrhagic shock (systolic blood pressure < 90 mm Hg, base deficit ≥ 6 mEq/L, blood transfusion within the first 12 hours), but without severe head injury (motor Glasgow Coma Scale [GCS] score < 4). MOF temporal trends (Denver MOF score > 3) were adjusted for admission risk factors (age, sex, body mass index, Injury Severity Score [ISS], systolic blood pressure, and base deficit) using survival analysis.
A total of 1,643 patients from four institutions were evaluated. MOF incidence decreased over time (from 17% in 2003–2004 to 9.8% in 2009–2010). MOF-related death rate (33% in 2003–2004 to 36% in 2009–2010), intensive care unit stay, and mechanical ventilation duration did not change over the study period. Adjustment for admission risk factors confirmed the crude trends. MOF patients required much longer ventilation and intensive care unit stay, compared with non-MOF patients. Most of the MOF-related deaths occurred within 2 days of the MOF diagnosis. Lung and cardiac dysfunctions became less frequent (57.6% to 50.8%, 20.9% to 12.5%, respectively), but kidney and liver failure rates did not change (10.1% to 12.5%, 15.2% to 14.1%).
Postinjury MOF remains a resource-intensive, morbid, and lethal condition. Lung injury is an enduring challenge and should be a research priority. The lack of outcome improvements suggests that reversing MOF is difficult and prevention is still the best strategy.
LEVEL OF EVIDENCE
Epidemiologic study, level III.
Injury; injury death; multiple organ failure; organ failure mortality; adult respiratory distress syndrome
The progressive depletion of quiescent “bystander” CD4 T-cells, which are non-permissive to HIV infection, is a principal driver of the acquired immunodeficiency syndrome (AIDS). These cells undergo abortive infection characterized by the cytosolic accumulation of incomplete HIV reverse transcripts. These viral DNAs are sensed by an unidentified host sensor that triggers an innate immune response, leading to caspase-1 activation and pyroptosis. Using unbiased proteomic and targeted biochemical approaches as well as two independent methods of lentiviral shRNA-mediated gene knockdown in primary CD4 T-cells, we identify Interferon gamma Inducible protein 16 (IFI16) as a host DNA sensor required for CD4 T-cell death due to abortive HIV infection. These findings provide insights into a key host pathway that plays a central role in CD4 T-cell depletion during disease progression to AIDS.
Dominant mutations in Cu/Zn-superoxide dismutase (SOD1) cause familial forms of amyotrophic lateral sclerosis (ALS), a fatal disorder characterized by the progressive loss of motor neurons. The molecular mechanism underlying the toxic gain-of-function of mutant hSOD1s remains uncertain. Several lines of evidence suggest that toxicity to motor neurons requires damage to non-neuronal cells. In line with this observation, primary astrocytes isolated from rodents over-expressing mutant hSOD1 induce motor neuron death in co-culture. Mitochondrial alterations have been documented in both neuronal and glial cells from ALS patients as well as in ALS animal models. In addition, mitochondrial dysfunction and increased oxidative stress have been linked to the toxicity of mutant hSOD1 in astrocytes and neurons. In mutant SOD1-linked ALS, mitochondrial alterations may be partially due to the increased association of mutant SOD1 with the outer membrane and intermembrane space of the mitochondria, where it can affect several critical aspects of mitochondrial function. We have previously shown that decreasing levels of glutathione, which is crucial for peroxide detoxification in the mitochondria, significantly accelerates motor neuron death in hSOD1G93A mice. Here we employed a catalase targeted to the mitochondria to investigate the effect of increased mitochondrial peroxide detoxification capacity in models of mutant hSOD1-mediated motor neuron death. Over-expression of mitochondria-targeted catalase improved mitochondrial antioxidant defenses and mitochondrial function in hSOD1G93A astrocyte cultures. It also reversed the toxicity of hSOD1G93A-expressing astrocytes toward co-cultured motor neurons; however, ALS animals did not develop the disease later or survive longer.
Hence, while increased oxidative stress and mitochondrial dysfunction have been extensively documented in ALS, these results suggest that preventing peroxide-mediated mitochondrial damage alone is not sufficient to delay the disease.
High-mobility group box 1 (HMGB1) is a late mediator of the systemic inflammation associated with sepsis. Recently, HMGB1 has been shown in animals to be a mediator of hemorrhage-induced organ dysfunction. However, the time course of plasma HMGB1 elevations after trauma in humans remains to be elucidated. Consequently, we hypothesized that mechanical trauma in humans would result in early significant elevations of plasma HMGB1. Trauma patients at risk for multiple organ failure (ISS ≥ 15) were identified for inclusion (n = 23), and postinjury plasma samples were assayed for HMGB1 by enzyme-linked immunosorbent assay. Postinjury HMGB1 levels were compared with markers for patient outcome (age, Injury Severity Score, units of red blood cells (RBCs) transfused in the first 24 h, and base deficit). To investigate whether postinjury transfusion contributes to elevations of circulating HMGB1, levels were determined in both leukoreduced and non-leukoreduced packed RBCs. Plasma HMGB1 was elevated more than 30-fold above healthy controls within 1 h of injury (median, 57.76 vs. 1.77 ng/mL; P < 0.003), peaked from 2 to 6 h postinjury (median, 526.18 ng/mL; P < 0.01 vs. control), and remained elevated above control through 136 h. No clear relationship was evident between postinjury HMGB1 levels and markers for patient outcome. HMGB1 levels increase with duration of RBC storage, although these concentrations did not account for postinjury plasma levels. Leukoreduction attenuated HMGB1 levels in packed RBCs by approximately 55% (P < 0.01). Plasma HMGB1 is significantly increased within 1 h of trauma in humans, with marked elevations occurring from 2 to 6 h postinjury. These results suggest that, in contrast to sepsis, HMGB1 release is an early event after traumatic injury in humans. Thus, HMGB1 may be integral to the early inflammatory response to trauma and is a potential target for future therapeutics.
Critical care; shock; inflammation; resuscitation; multiple organ failure; injury severity; transfusion; leukoreduction
The Alberta Project Promoting active Living and healthy Eating in Schools (APPLE Schools) is a comprehensive school health program that has proven feasible and effective in preventing obesity among school-aged children. To support decision making on expanding this program, evidence on its long-term health and economic impacts is particularly critical. In the present study we estimate the life course impact of the APPLE Schools program in terms of future body weights and avoided health care costs.
We modeled growth rates of body mass index (BMI) using longitudinal data from the National Population Health Survey collected between 1996 and 2008. These growth rate characteristics were used to project BMI trajectories for students who attended APPLE Schools and for students who attended control schools (141 randomly selected schools) in the Canadian province of Alberta.
Throughout the life course, the prevalence of overweight (including obesity) was 1.2% to 2.8% (1.7% on average) lower among students attending APPLE Schools relative to their peers attending control schools. The life course prevalence of obesity was 0.4% to 1.4% (0.8% on average) lower among APPLE Schools students. If the APPLE Schools program were to be scaled up, the potential cost savings would be $33 to 82 million per year for the province of Alberta, or $150 to 330 million per year for Canada.
These projected health and economic benefits seem to support broader implementation of school-based health promotion programs.
Surgical and conservative management of partial tears of the rotator cuff has long been a controversial topic for many generations of shoulder surgeons. These tears frequently occur on both the articular and bursal surfaces and within the intrasubstance of the rotator cuff. The term “PASTA lesion” describes the partial articular supraspinatus tendon avulsion–type injury. A less common variant of this injury is the bony PASTA lesion or partial articular bony avulsion of the supraspinatus tendon (PABAST).
The diagnosis of blunt abdominal trauma can be challenging and resource intensive. Observation with serial clinical assessments plays a major role in the evaluation of these patients, but the time required for intra-abdominal injury to become clinically apparent is unknown. The purpose of this study was to determine the amount of time required for an intra-abdominal injury to become clinically apparent after blunt abdominal trauma via physical examination or commonly followed clinical values.
A retrospective review of patients who sustained blunt trauma resulting in intra-abdominal injury between June 2010 and June 2012 at a Level 1 academic trauma center was performed. Patient demographics, injuries, and the amount of time from emergency department admission to sign or symptom development and subsequent diagnosis were recorded. All diagnoses were made by computed tomography or at the time of surgery. Patient transfers from other hospitals were excluded.
Of 3,574 blunt trauma patients admitted to the hospital, 285 (8%) experienced intra-abdominal injuries. The mean (SD) age was 36 (17) years, the majority were male (194 patients, 68%), and the mean (SD) Injury Severity Score (ISS) was 21 (14). The mean (SD) time from admission to diagnosis via computed tomography or surgery was 74 (55) minutes. Eighty patients (28%) required either surgery (78 patients, 27%) or radiographic embolization (2 patients, 0.7%) for their injury. All patients who required intervention demonstrated a sign or symptom of their intra-abdominal injury within 60 minutes of arrival, although two patients underwent intervention in a delayed fashion. All patients with a blunt intra-abdominal injury manifested a clinical sign or symptom of their intra-abdominal injury, resulting in diagnosis within 8 hours 25 minutes of arrival to the hospital.
All diagnosed intra-abdominal injuries from blunt trauma manifested clinical signs or symptoms that could prompt imaging or intervention, leading to their diagnosis within 8 hours 25 minutes of arrival to the hospital. All patients who required an intervention for their injury manifested a sign or symptom of their injury within 60 minutes of arrival.
Level of Evidence
Therapeutic study, level IV; epidemiologic study, level III.
Blunt trauma; intra-abdominal injury; 8 hours; 60 minutes; clinically apparent
Plantar heel pain is a common disorder of the foot for which patients seek medical treatment. The purpose of this study is to explore the relationship between duration of symptoms in plantar fasciitis patients and demographic factors, the intensity and location of pain, extent of previous treatment, and self-reported pain and function.
The charts of patients presenting with plantar heel pain between June 2008 and October 2010 were reviewed retrospectively, and 182 patients with a primary diagnosis of plantar fasciitis were identified. Patients with symptoms for less than 6 months were classified as acute, and patients with symptoms for 6 months or longer were defined as having chronic symptoms. Comparisons based on duration of symptoms were performed for age, gender, BMI, comorbidities, pain location and intensity, and a functional score measured by the Foot and Ankle Ability Measure (FAAM).
The two groups were similar in age, BMI, gender, and comorbidities. Pain severity, as measured by a VAS, did not differ significantly between the two groups (6.6 vs. 6.2). The acute and chronic groups reported similar levels of function on both the activities of daily living (62 and 65) and sports (47 and 45) subscales of the FAAM. Patients in the chronic group were more likely to have seen more providers and tried more treatment options for this condition.
As plantar fasciitis symptoms extend beyond 6 months, patients do not experience increasing pain intensity or functional limitation. No specific risk factors have been identified to indicate a risk of developing chronic symptoms.
plantar fasciitis; heel pain; functional limitation
An asymmetric oxa-Michael/Michael cascade reaction of p-quinols and α,β-unsaturated aldehydes provides access to hindered dialkyl ethers. A highly enantioselective oxa-Michael addition of a tertiary alcohol precedes an intramolecular cyclohexadienone desymmetrization, which allows for the concomitant formation of four contiguous stereocenters in a single step. The highly functionalized bicyclic frameworks are rapidly obtained from simple starting materials with good diastereoselection and serve as valuable precursors for further manipulation.
Strawberries contain anthocyanins and ellagitannins, which have antioxidant properties. We determined whether the consumption of strawberries increases the plasma antioxidant activity, measured as the ability to decompose the 2,2-diphenyl-1-picrylhydrazyl radical (DPPH), in healthy subjects. The study involved 10 volunteers (age 41 ± 6 years, body weight 74.4 ± 12.7 kg) who consumed 500 g of strawberries daily for 9 days and 7 matched controls. Fasting plasma and spot morning urine samples were collected at baseline, during fruit consumption and after a 6-day wash-out period. DPPH decomposition was measured in deproteinized plasma specimens, both native and pretreated with uricase (non-urate plasma). Twelve phenolics were determined with HPLC. Strawberries had no effect on the antioxidant activity of native plasma or on circulating phenolics. Non-urate plasma DPPH decomposition increased from 5.7 ± 0.6% to 6.6 ± 0.6%, 6.5 ± 1.0% and 6.3 ± 1.4% after 3, 6 and 9 days of supplementation, respectively. The wash-out period reversed this activity back to 5.7 ± 0.8% (p < 0.01). Control subjects did not reveal any changes in plasma antioxidant activity. Significant increases in urinary urolithin A and 4-hydroxyhippuric acid (8.7- and 5.9-fold after 6 days of supplementation with fruits) were noted. Strawberry consumption can increase non-urate plasma antioxidant activity, which, in turn, may decrease the risk of systemic oxidant overactivity.
strawberry; plasma antioxidant activity; dietary intervention; polyphenols
Obesity is a pressing public health concern, which frequently presents in primary care. With the explosive obesity epidemic, there is an urgent need to maximize effective management in primary care. The 5As of Obesity Management™ (5As) are a collection of knowledge tools developed by the Canadian Obesity Network. Low rates of obesity management visits in primary care suggest provider behaviour may be an important variable. The goal of the present study is to increase frequency and quality of obesity management in primary care using the 5As Team (5AsT) intervention to change provider behaviour.
The 5AsT trial is a theoretically informed, pragmatic randomized controlled trial with mixed-methods evaluation. Clinic-based multidisciplinary teams (RN/NP, mental health, dietitians) will be randomized to a control group or to the 5AsT intervention group, which will participate in biweekly learning collaborative sessions supported by internal and external practice facilitation. The learning collaborative content addresses provider-identified barriers to effective obesity management in primary care. Evidence-based shared decision-making tools will be co-developed and iteratively tested by practitioners. Evaluation will be informed by the RE-AIM framework. The primary outcome measure, to which participants are blinded, is the number of weight management visits per full-time equivalent (FTE) position. Patient-level outcomes will also be assessed through a longitudinal cohort study of patients from randomized practices. Patient outcomes include clinical measures (e.g., body mass index [BMI], blood pressure), health-related quality of life (SF-12, EQ5D), and satisfaction with care. Qualitative data collected from providers and patients will be evaluated using thematic analysis to understand the context, implementation and effectiveness of the 5AsT program.
The 5AsT trial will provide a wide range of insights into current practices, knowledge gaps and barriers that limit obesity management in primary practice. The use of existing resources, collaborative design, practice facilitation, and integrated feedback loops cultivates an applicable, adaptable and sustainable approach to increasing the quantity and quality of weight management visits in primary care.
Primary healthcare; Obesity; Randomized control trial; Evaluation studies; Family medicine; Practice facilitation
Tenofovir disoproxil fumarate (TDF) is increasingly available for patients infected with subtype C HIV-1. This subtype is reported to develop the principal TDF resistance mutation in the HIV reverse transcriptase, K65R, with greater propensity than other subtypes. We sought to describe K65R development during TDF use in a cohort of patients infected with subtype C HIV.
Using a prospectively followed cohort with 6-monthly HIV RNA assays, we identified virologic failure (defined as an HIV RNA >1000 c/mL) during treatment that included TDF. Residual serum, stored at the time of the HIV RNA assay, was used for consensus sequencing and allele-specific PCR. We assessed the prevalence of resistance at failure during TDF-containing treatment and its associated factors.
Among 1,682 patients on a TDF-containing regimen, 270 developed virologic failure, of whom 40 were assessed for resistance. By sequencing, the K65R was identified in 5 (12%) patients, major NNRTI mutations in 24 (57%), and the M184V/I in 12 (28%). The K65R was associated with lower HIV RNA at failure (log10 HIV RNA 3.3 versus 4.2 c/mL) and with prior stavudine exposure. An additional 5 patients had minority K65R populations identified by allele-specific PCR.
These data suggest that the K65R prevalence at virologic failure is moderately higher in our subtype C population than in some non-subtype C HIV cohorts. However, we did not find that the K65R was highly selected in HIV-1 subtype C–infected patients with up to 6 months of failure of a TDF-containing regimen.
Oxidative stress is generated in several peripheral nerve injury models. In response to oxidative stress, the transcription factor Nrf2 is activated to induce the expression of antioxidant responsive element (ARE) genes. The role of Nrf2 in peripheral nerve injury has not been studied to date. In this study, we used a sciatic nerve crush model to examine how deletion of Nrf2 affects peripheral nerve degeneration and regeneration. Our study demonstrated that functional recovery in the Nrf2-/- mice was impaired compared to wild type mice after sciatic nerve crush. Larger myelin debris was present in the distal nerve stump of the Nrf2-/- mice than in the wild type mice. The presence of larger myelin debris in the Nrf2-/- mice coincided with reduced macrophage accumulation in the distal nerve stump. This reduced macrophage accumulation may have contributed to slower clearance of myelin, resulting in the presence of larger myelin debris. Meanwhile, axonal regeneration was comparatively lower in the Nrf2-/- mice than in the wild type mice. Even 3 months after injury, more thinly myelinated axon fibers were present in the Nrf2-/- mice than in the wild type mice. Taken collectively, these data support the concept of therapeutic intervention with Nrf2 activators following nerve injury.
Nrf2; sciatic nerve crush; myelin clearance; axonal regeneration; remyelination
The Bridle procedure restores active ankle dorsiflexion through a tri-tendon anastomosis of the tibialis posterior, transferred to the dorsum of the foot, with the peroneus longus and tibialis anterior tendons. Inter-segmental foot motion after the Bridle procedure has not been measured. The purpose of this study was to report kinetic and kinematic variables during walking and heel rise in patients after the Bridle procedure.
Eighteen Bridle and 10 control participants were studied. Walking and heel rise kinetic and kinematic variables were collected and compared using ANOVA.
During walking the Bridle group, compared with controls, had reduced ankle power at push off [2.3 (SD 0.7) W/kg vs. 3.4 (SD 0.6) W/kg, respectively, P<.01], less hallux extension during swing [−13 (SD 7)° vs. 15 (SD 6)°, respectively, P<.01] and slightly less ankle dorsiflexion during swing [6 (SD 4)° vs. 9 (SD 2)°, respectively, P=.03]. During heel rise the Bridle group had 4 (SD 6)° of forefoot on hindfoot dorsiflexion compared to 8 (SD 3)° of plantarflexion in the controls (P<.01).
This study provides evidence that the Bridle procedure restores the majority of dorsiflexion motion during swing. However, plantarflexor function during push off and hallux extension during swing were reduced during walking in the Bridle group. Abnormal mid-tarsal joint motion, forefoot on hindfoot dorsiflexion instead of plantarflexion, was identified in the Bridle group during the more challenging heel rise task. Intervention after the Bridle procedure should maximize ankle plantarflexor function, and midfoot motion should be examined during challenging tasks.