Perioperative Ischaemic Evaluation-2 (POISE-2) is an international 2×2 factorial randomised controlled trial of low-dose aspirin versus placebo and low-dose clonidine versus placebo in patients who undergo non-cardiac surgery. Perioperative aspirin (and possibly clonidine) may reduce the risk of postoperative acute kidney injury (AKI).
Methods and analysis
After receipt of grant funding, serial postoperative serum creatinine measurements began to be recorded in consecutive patients enrolled at substudy participating centres. The last of over 6500 substudy patients from 82 centres in 21 countries were randomised in December 2013. The authors will use logistic regression to estimate the adjusted OR of AKI following surgery (defined, relative to the preoperative serum creatinine value, as a postoperative increase of ≥26.5 μmol/L in the 2 days following surgery or an increase of ≥50% in the 7 days following surgery) comparing each intervention with placebo, and will report the adjusted relative risk reduction. Alternate definitions of AKI will also be considered, as will the outcome of AKI in subgroups defined by the presence of preoperative chronic kidney disease and preoperative chronic aspirin use. At the time of randomisation, a subpopulation agreed to a single measurement of serum creatinine between 3 and 12 months after surgery, and the authors will examine intervention effects on this outcome.
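For illustration, the creatinine-based AKI definition above can be expressed as a short classification function. This is a sketch of the stated definition only, not the substudy's analysis code, and the example values are hypothetical:

```python
def has_aki(preop_creatinine_umol_l, postop_values):
    """Classify postoperative AKI from serial serum creatinine values.

    postop_values: list of (days_after_surgery, creatinine_umol_l) tuples.
    Returns True if creatinine rises >=26.5 umol/L within 2 days of surgery
    or by >=50% within 7 days, relative to the preoperative value.
    """
    for day, value in postop_values:
        rise = value - preop_creatinine_umol_l
        if day <= 2 and rise >= 26.5:
            return True
        if day <= 7 and rise >= 0.5 * preop_creatinine_umol_l:
            return True
    return False

# Hypothetical patient: baseline 80 umol/L, rises to 110 on day 1
# (an absolute rise of 30 >= 26.5, so AKI by the first criterion).
print(has_aki(80.0, [(1, 110.0), (3, 95.0)]))  # True
```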
Ethics and dissemination
The authors were competitively awarded a grant from the Canadian Institutes of Health Research for this POISE-2 AKI substudy. Ethics approval was obtained for additional kidney data collection in consecutive patients enrolled at participating centres, which first began for patients enrolled after January 2011. In patients who provided consent, the remaining longer term serum creatinine data will be collected throughout 2014. The results of this study will be reported no later than 2015.
Clinical Trial Registration Number
This study evaluates the impact of a new 'Preparation for Internship' (PRINT) course, which was developed to facilitate the transition of University of New South Wales (UNSW) medical graduates from Medical School to Internship.
During a period of major curricular reform, the 2007 (old program) and 2009 (new program) cohorts of UNSW final year students completed the Clinical Capability Questionnaire (CCQ) prior to and after undertaking the PRINT course. Clinical supervisors’ ratings and self-ratings of UNSW 2009 medical graduates were obtained from the Hospital-based Prevocational Progress Review Form.
Prior to PRINT, students from both cohorts perceived they had good clinical skills, with lower ratings for capability in procedural skills, operational management, and administrative tasks. After completing PRINT, students from both cohorts perceived significant improvement in their capability in procedural skills, operational management, and administrative tasks. Although PRINT also improved student-perceived confidence, interpersonal skills, and collaboration in both cohorts, curriculum reform to a new outcomes-based program was far more influential in improving self-perceptions in these facets of preparedness for hospital practice than PRINT.
The PRINT course was most effective in improving students' perceptions of their capability in procedural skills, operational management and administrative tasks, indicating that student-to-intern transition courses should be clinically orientated, address relevant skills, use experiential learning, and focus on practical tasks. Other aspects that are important in preparing medical students for hospital practice cannot be addressed in a PRINT course, but major improvements are achievable by program-wide curriculum reform.
Undergraduate medical education; Transition to internship; Outcome based curriculum; Clinical skills
Studies conducted decades ago described substantial disagreement and errors in physicians’ angiographic interpretation of coronary stenosis severity. Despite the potential implications of such findings, no large-scale efforts to measure or improve clinical interpretation were subsequently made.
Methods & Results
We compared clinical interpretation of stenosis severity in coronary lesions with an independent assessment using quantitative coronary angiography (QCA) in 175 randomly selected patients undergoing elective percutaneous coronary intervention (PCI) at 7 U.S. hospitals in 2011. To assess agreement, we calculated the mean difference in percent diameter stenosis between clinical interpretation and QCA and Cohen's weighted kappa statistic. Of 216 treated lesions, median percent diameter stenosis was 80.0% (Q1 and Q3, 80.0 and 90.0%) with 213 (98.6%) assessed as ≥70%. Mean difference in percent diameter stenosis between clinical interpretation and QCA was +8.2 ± 8.4%, reflecting a higher percent diameter stenosis by clinical interpretation on average (P<0.001). A weighted kappa of 0.27 (95% CI, 0.18 to 0.36) was found between the 2 measurements. Of 213 lesions considered ≥70% by clinical interpretation, 56 (26.3%) were <70% by QCA though none was <50%. Differences between the 2 measurements were largest for intermediate lesions by QCA (50 to <70%) with variation existing across sites.
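As an illustration of the agreement statistic used above, a weighted Cohen's kappa can be computed from a confusion matrix of ordinal stenosis categories. The sketch below uses linear weights (the abstract does not state the weighting scheme) and is generic, not the study's code:

```python
def weighted_kappa(matrix):
    """Cohen's kappa with linear weights for an n x n confusion matrix
    of ordinal category counts (rows: rater 1, columns: rater 2)."""
    n = len(matrix)
    total = sum(sum(row) for row in matrix)
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    obs = exp = 0.0
    for i in range(n):
        for j in range(n):
            w = abs(i - j) / (n - 1)  # linear disagreement weight
            obs += w * matrix[i][j] / total
            exp += w * row_tot[i] * col_tot[j] / total**2
    return 1.0 - obs / exp

# Perfect agreement across three ordinal categories gives kappa = 1.
print(weighted_kappa([[10, 0, 0], [0, 10, 0], [0, 0, 10]]))  # 1.0
```

A kappa of 0.27, as reported, indicates only fair agreement well short of the perfect-agreement value of 1.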
Physicians tended to assess coronary lesions treated with PCI as more severe than measurements by QCA. Almost all treated lesions were ≥70% by clinical interpretation, while approximately a quarter were <70% by QCA. These findings suggest opportunities to improve clinical interpretation of coronary angiography.
Health policy and outcomes research; Quality improvement; Coronary angiography; Percutaneous coronary intervention; Quantitative coronary angiography
Metagenomics is a relatively recently established but rapidly expanding field that uses high-throughput next-generation sequencing technologies to characterize the microbial communities inhabiting different ecosystems (including oceans, lakes, soil, tundra, plants and body sites). Metagenomics brings with it a number of challenges, including the management, analysis, storage and sharing of data. In response to these challenges, we have developed a new metagenomics resource (http://www.ebi.ac.uk/metagenomics/) that allows users to easily submit raw nucleotide reads for functional and taxonomic analysis by a state-of-the-art pipeline, and have them automatically stored (together with descriptive, standards-compliant metadata) in the European Nucleotide Archive.
Previously active in the mid-1990s, the Canadian Airway Focus Group (CAFG) studied the unanticipated difficult airway and made recommendations on management in a 1998 publication. The CAFG has since reconvened to examine more recent scientific literature on airway management. The Focus Group’s mandate for this article was to arrive at updated practice recommendations for management of the unconscious/induced patient in whom difficult or failed tracheal intubation is encountered.
Nineteen clinicians with backgrounds in anesthesia, emergency medicine, and intensive care joined this iteration of the CAFG. Each member was assigned topics and conducted reviews of Medline, EMBASE, and Cochrane databases. Results were presented and discussed during multiple teleconferences and two face-to-face meetings. When appropriate, evidence- or consensus-based recommendations were made together with assigned levels of evidence modelled after previously published criteria.
The clinician must be aware of the potential for harm to the patient that can occur with multiple attempts at tracheal intubation. This likelihood can be minimized by moving early from an unsuccessful primary intubation technique to an alternative “Plan B” technique if oxygenation by face mask or ventilation using a supraglottic device is non-problematic. Irrespective of the technique(s) used, failure to achieve successful tracheal intubation in a maximum of three attempts defines failed tracheal intubation and signals the need to engage an exit strategy. Failure to oxygenate by face mask or supraglottic device ventilation occurring in conjunction with failed tracheal intubation defines a failed oxygenation, “cannot intubate, cannot oxygenate” situation. Cricothyrotomy must then be undertaken without delay, although if not already tried, an expedited and concurrent attempt can be made to place a supraglottic device.
Appropriate planning is crucial to avoid morbidity and mortality when difficulty is anticipated with airway management. Many guidelines developed by national societies have focused on management of difficulty encountered in the unconscious patient; however, little guidance appears in the literature on how best to approach the patient with an anticipated difficult airway.
To review this and other subjects, the Canadian Airway Focus Group (CAFG) was re-formed. With representation from anesthesiology, emergency medicine, and critical care, CAFG members were assigned topics for review. As literature reviews were completed, results were presented and discussed during teleconferences and two face-to-face meetings. When appropriate, evidence- or consensus-based recommendations were made, and levels of evidence were assigned.
Previously published predictors of difficult direct laryngoscopy are widely known. More recent studies report predictors of difficult face mask ventilation, video laryngoscopy, use of a supraglottic device, and cricothyrotomy. All are important facets of a complete airway evaluation and must be considered when difficulty is anticipated with airway management. Many studies now document the increasing patient morbidity that occurs with multiple attempts at tracheal intubation. Therefore, when difficulty is anticipated, tracheal intubation after induction of general anesthesia should be considered only when success with the chosen device(s) can be predicted in a maximum of three attempts. Concomitant predicted difficulty using oxygenation by face mask or supraglottic device ventilation as a fallback makes an awake approach advisable. Contextual issues, such as patient cooperation, availability of additional skilled help, and the clinician’s experience, must also be considered in deciding the appropriate strategy.
With an appropriate airway evaluation and consideration of relevant contextual issues, a rational decision can be made on whether an awake approach to tracheal intubation will maximize patient safety or if airway management can safely proceed after induction of general anesthesia. With predicted difficulty, close attention should be paid to details of implementing the chosen approach. This should include having a plan in case of the failure of tracheal intubation or patient oxygenation.
APOBEC3 (A3) proteins are virus restriction factors that provide intrinsic immunity against infections by viruses like HIV-1 and MMTV. A3 proteins are inducible by inflammatory stimuli such as LPS and IFNα via mechanisms that are not yet fully defined. Using genetic and pharmacological studies on C57BL/6 mice and cells, we show that IFNα and LPS induce A3 via different pathways, independent of each other. IFNα positively regulates mA3 mRNA expression through IFNαR•PKC•STAT1 and negatively regulates mA3 mRNA expression via IFNαR•MAPKs signaling pathways. Interestingly, LPS shows some variation in its regulatory behavior. While LPS-mediated positive regulation of mA3 mRNA occurs through TLR4•TRIF•IRF3•PKC, it negatively modulates mA3 mRNA via TLR4•MyD88•MAPK signaling pathways. Additional studies on human PBMCs reveal that PKC differentially regulates IFNα and LPS induction of human A3A, A3F, and A3G mRNA expression. In summary, we have identified important signaling targets downstream of IFNαR and TLR4 that mediate A3 mRNA induction by both LPS and IFNα. Our results provide new insights into the signaling targets that could be manipulated to enhance intracellular stores of A3 and potentially enhance A3 anti-viral function in the host.
APOBEC3; interferon alpha; interferon beta; interferon alpha receptor; interferon regulatory factor 3; lipopolysaccharide; myeloid differentiation primary response gene (88); TIR-domain-containing adapter-inducing interferon-β; protein kinase C delta; Toll-like receptor 4; MAPK; ERK; JNK; P38
Background. Psychotropic medications, in particular second-generation antipsychotics (SGAs) and benzodiazepines, have been associated with harm in elderly populations. Health agencies around the world have issued warnings about the risks of prescribing such medications to frail individuals affected by dementia and current guidelines recommend their use only in cases where the benefits clearly outweigh the risks. This study documents the use of psychotropic medications in the entire elderly population of a Canadian province in the context of current clinical guidelines for the treatment of behavioural disturbances.
Methods. Prevalent and incident utilization of antipsychotics, benzodiazepines and related medications (zopiclone and zaleplon) were determined in the population of Manitobans over age 65 in the time period 1997/98 to 2008/09 fiscal years. Comparisons between patients living in the community and those living in personal care (nursing) homes (PCH) were conducted. Influence of sociodemographic characteristics on prescribing was assessed by generalized estimating equations. Non-optimal use was defined as the prescribing of high dose of antipsychotic medications and the use of combination therapy of a benzodiazepine (or zopiclone/zaleplon) with an antipsychotic. A decrease in intensity of use over time and lower proportions of patients treated with antipsychotics at high dose or in combination with benzodiazepines (or zopiclone/zaleplon) was considered a trend toward better prescribing. Multiple regression analysis determined predictors of non-optimal use in the elderly population.
Results. A 20-fold greater prevalent utilization of SGAs was observed in PCH-dwelling elderly persons compared to those living in the community. In 2008/09, 27% of PCH-dwelling individuals received a prescription for an SGA. Patient characteristics, such as younger age, male gender, and diagnoses of dementia (or use of an acetylcholinesterase inhibitor) or psychosis in the year prior to the prescription, were predictors of non-optimal prescribing (e.g., high dose antipsychotics). Between 2002/03 and 2007/08, amongst new users of SGAs, 10.2% received high doses. Those receiving high dose antipsychotics did not show high levels of polypharmacy.
Conclusions. Despite encouraging trends, the use of psychotropic medications remains high in elderly individuals, especially in residents of nursing homes. Clinicians caring for such patients need to carefully assess risks and benefits.
Antipsychotic; Benzodiazepines; Elderly; Prescribing; Psychotropic
Background. The relative efficacy and safety of lacosamide as adjunctive therapy compared to other antiepileptic drugs has not been well established.
Objective. To determine if lacosamide provides improved efficacy and safety, reduced length of hospital stay and improved quality of life compared with other anti-epileptic therapies for adults with partial-onset seizures.
Data Sources. A systematic review of the medical literature using Medline (1946–Week 4, 2012), EMBASE (1980–Week 3, 2012), Cochrane Central Register of Controlled Trials (Issue 1 of 12, January 2012). Additional studies were identified (through to February 7, 2012) by searching bibliographies, the FDA drug approval files, clinical trial registries and major national and international neurology meeting abstracts. No restrictions on publication status or language were applied.
Study Selection. Randomized controlled trials of lacosamide in adults with partial-onset seizures were included.
Data Extraction. Study selection, extraction and risk of bias assessment were performed independently by two authors. Authors of studies were contacted for missing data.
Data Synthesis. All pooled analyses used the random effects model.
Results. Three trials (1311 patients) met inclusion criteria. Lacosamide increased the 50% responder rate compared to placebo (RR 1.68 [95% CI 1.36 to 2.08]; I2 = 0%). Discontinuation due to adverse events was statistically significantly higher in the lacosamide arm (RR 3.13 [95% CI 1.94 to 5.06]; I2 = 0%). Individual adverse events (ataxia, dizziness, fatigue, and nausea) were also significantly higher in the lacosamide group.
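The pooled risk ratios above come from a random-effects meta-analysis; with I2 = 0% the between-study variance estimate is zero and the model reduces to inverse-variance (fixed-effect) pooling of log risk ratios. A minimal sketch of that pooling, using illustrative inputs rather than the trial data:

```python
import math

def pool_log_rr(studies):
    """Inverse-variance pooling of risk ratios.

    studies: list of (rr, ci_low, ci_high) tuples with 95% CIs.
    Each study's standard error is recovered from its CI width on the
    log scale: se = (ln(hi) - ln(lo)) / 3.92. Returns (rr, lo, hi).
    """
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / 3.92
        w = 1.0 / se**2                      # inverse-variance weight
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se_pooled),
            math.exp(log_rr + 1.96 * se_pooled))

# Two hypothetical studies with identical RRs pool to the same RR
# with a tighter confidence interval.
print(pool_log_rr([(1.7, 1.3, 2.2), (1.7, 1.3, 2.2)]))
```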
Limitations. All dosage arms from the included studies were pooled to make a single pair-wise comparison to placebo. Selective reporting of outcomes was found in all of the included RCTs.
Conclusions. Lacosamide as adjunctive therapy in patients with partial-onset seizures increases the 50% responder rate but with significantly more adverse events compared to placebo.
Systematic review; Meta-analysis; Lacosamide; Partial-onset seizures; Epilepsy; Antiepileptic drugs; Randomized controlled trials
Previous studies have described an “obesity paradox” with heart failure, whereby higher body mass index (BMI) is associated with lower mortality. However, little is known about the impact of obesity on survival after acute myocardial infarction.
Data from 2 registries of patients hospitalized in the United States with acute myocardial infarction in 2003–04 (PREMIER) and 2005–08 (TRIUMPH) were used to examine the association of BMI with mortality. Patients (n=6359) were categorized into BMI groups (kg/m2) using baseline measurements. Two sets of analyses were performed using Cox proportional hazards regression with fractional polynomials to model BMI as categorical and continuous variables. To assess the independent association of BMI with mortality, analyses were repeated adjusting for 7 domains of patient and clinical characteristics.
Median BMI was 28.6. BMI was inversely associated with crude 1-year mortality (normal, 9.2%; overweight, 6.1%; obese, 4.7%; morbidly obese, 4.6%; p<0.001), which persisted after multivariable adjustment. When BMI was examined as a continuous variable, the hazards curve declined with increasing BMI and then increased above a BMI of 40. Compared with patients with a BMI of 18.5, patients with higher BMIs had a 20% to 68% lower mortality at 1 year. No interactions of BMI with age (p=0.37), gender (p=0.87) or diabetes mellitus (p=0.55) were observed.
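The fractional polynomial modelling of BMI referred to above conventionally draws candidate transforms from the power set {-2, -1, -0.5, 0, 0.5, 1, 2, 3}, with power 0 denoting log(x). A minimal sketch of generating those basis transforms (not the study's model-selection code):

```python
import math

FP_POWERS = (-2, -1, -0.5, 0, 0.5, 1, 2, 3)  # conventional FP power set

def fp_transform(x, power):
    """Single fractional-polynomial transform; power 0 denotes log(x)."""
    return math.log(x) if power == 0 else x ** power

# Degree-1 fractional polynomial basis for the cohort's median BMI.
basis = {p: fp_transform(28.6, p) for p in FP_POWERS}
```

A degree-2 fractional polynomial built from two such terms can capture exactly the J-shaped hazard described above (declining with BMI, then rising above 40), which a single linear BMI term cannot.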
There appears to be an “obesity paradox” among acute myocardial infarction patients such that higher BMI is associated with lower mortality, an effect that was not modified by patient characteristics and was comparable across age, gender, and diabetes subgroups.
body mass index; mortality; myocardial infarction; fractional polynomials; obesity paradox
To examine changes in social support during early recovery after acute myocardial infarction (AMI) and determine whether these changes influence outcomes within the first year.
Among 1951 AMI patients enrolled in a 19-center prospective study, we examined changes in social support between baseline (index hospitalization) and 1 month post-AMI to longitudinally assess their association with health status and depressive symptoms within the first year. We further examined whether 1-month support predicted outcomes independent of baseline support. Hierarchical repeated-measures regression evaluated associations, adjusting for site, baseline outcome level, baseline depressive symptoms, sociodemographic characteristics, and clinical factors.
During the first month of recovery, 5.6% of patients had persistently low support, 6.4% had worsened support, 8.1% had improved support, and 80.0% had persistently high support. In risk-adjusted analyses, patients with worsened support (versus persistently high) had greater risk of angina (relative risk=1.46), lower disease-specific quality of life (β=−7.44), lower general mental functioning (β=−4.82), and more depressive symptoms (β=1.94) (all p≤0.01). Conversely, patients with improved support (versus persistently low) had better outcomes, including higher disease-specific quality of life (β=6.78), higher general mental functioning (β=4.09), and fewer depressive symptoms (β=−1.48) (all p≤0.002). In separate analyses, low support at 1 month was significantly associated with poorer outcomes, independent of baseline support level (all p≤0.002).
Changes in social support during early AMI recovery were not uncommon and were important for predicting outcomes. Intervening on low support during early recovery may provide a means of improving outcomes.
depression; health status; myocardial infarction; prognosis; quality of life; social support
Overcoming racial differences in acute coronary syndrome (ACS) outcomes is a strategic goal for US healthcare. Genetic polymorphisms in the adrenergic pathway appear to explain some outcome differences by race in other cardiovascular diseases treated with β-adrenergic receptor-blockade (BB). Whether these genetic variants are associated with survival among ACS patients treated with BB, and if this differs by race, is unknown.
BB after ACS is a measure of quality care, but its effectiveness across racial groups is less clear.
A prospective cohort of 2,673 ACS patients (2,072 Caucasians; 601 African Americans) discharged on BB from 22 U.S. hospitals was followed for 2 years. Subjects were genotyped for polymorphisms in ADRB1, ADRB2, ADRA2C, and GRK5. We used proportional hazards regression to model the effect of genotype on mortality, stratified by race and adjusted for baseline factors.
The overall 2-year mortality rate was 7.5% for Caucasians and 16.7% for African Americans. The prognosis associated with different genotypes in these BB-treated patients differed by race. In Caucasians, ADRA2C 322-325 deletion (D) carriers had significantly lower mortality as compared with homozygous individuals lacking the deletion (HR 0.46; CI 0.21, 0.99; p = 0.047; race-by-genotype interaction p = 0.053). In African Americans, the ADRB2 16R allele was associated with significantly increased mortality (HR for RG vs. GG = 2.10; CI 1.14, 3.86; RR vs. GG = 2.65; CI 1.38, 5.08; p = 0.013; race-by-genotype interaction p = 0.096).
Adrenergic pathway polymorphisms are associated with mortality in ACS patients receiving BB in a race-specific manner. Understanding the mechanism by which different genes impact post-ACS mortality differently in Caucasians and African Americans may illuminate opportunities to improve BB therapy in these groups.
The EQ-5D is a widely used, standardised quality-of-life measure producing health profiles, indices and states. The aims of this study were to assess the role of various factors in how people with Multiple Sclerosis rate their quality of life, based on responses to the EQ-5D received via the web portal of the UK MS Register.
The 4516 responses to the EQ-5D (between May 2011 and April 2012) were collated with basic demographic and descriptive MS data and the resulting dataset was analysed in SPSS (v.20).
The mean health state for people with MS was 59.73 (SD 22.4, median 61), compared to the UK population mean of 82.48 (which is approximately 1SD above the cohort mean). The characteristics of respondents with high health states (at or above +1SD) were: better health profiles (most predictive dimension: Usual Activities), higher health indices, younger age, shorter durations of MS, female gender, relapsing-remitting MS, higher educational attainment and being in paid employment (all p-values<0.001). Conversely, the characteristics of respondents with low health states (at or below -1SD) were: poorer health profiles (most predictive dimension: Mobility), lower health indices, older age, longer durations of MS, male gender, progressive MS, lower educational attainment and having an employment status of sick/disabled (p = 0.0014 for age, all other p-values<0.001). Particular living arrangements were not associated with either the high or low health status groups.
This large-scale study has enabled in-depth analyses on how people with MS rate their quality of life, and it provides new knowledge on the various factors that contribute to their self-assessed health status. These findings demonstrate the impact of MS on quality of life, and they can be used to inform care provision and further research, to work towards enhancing the quality of life of people with MS.
Considerable attention has been devoted to the effect of social support on patient outcomes after acute myocardial infarction (AMI). However, little is known about the relation between patient living arrangements and outcomes. Thus, we used data from PREMIER, a registry of patients hospitalized with AMI at 19 US centers in 2003–04, to assess the association of living alone with post-AMI outcomes. Outcome measures included 4-year mortality, 1-year readmission, and 1-year health status, using the Seattle Angina Questionnaire (SAQ) and Short Form-12 physical health component (SF-12 PCS) scales. Patients who lived alone had higher crude 4-year mortality (21.8% vs. 14.5%, p<0.001), but comparable rates of 1-year readmission (41.6% vs. 38.3%, p=0.79). Living alone was associated with lower unadjusted quality of life (mean SAQ −2.40 (95% confidence interval [CI] −4.44, −0.35), p=0.02), but had no impact on SF-12 PCS (−0.45 (95% CI −1.65, 0.76), p=0.47) compared with patients who did not live alone. After multivariable adjustment, patients who lived alone had a comparable risk of mortality (hazard ratio [HR] 1.35, 95% CI: 0.94-1.93) and readmission (HR 0.99, 95% CI: 0.76-1.28) as patients who lived with others. Mean quality of life scores remained lower among patients who lived alone (SAQ −2.91 (95% CI −5.56, −0.26), p=0.03). Living alone may be associated with poorer angina-related quality of life one year post-MI, but is not associated with mortality, readmission, or other health status measures after adjusting for other patient and treatment characteristics.
Living alone; acute myocardial infarction; social support
Objective. We sought to determine the characteristics of children presenting to United States (US) Emergency Departments (ED) with severe sepsis.
Study design. Cross-sectional analysis using data from the National Hospital Ambulatory Medical Care Survey (NHAMCS). Using triage vital signs and ED diagnoses (defined by the International Classification of Diseases, Ninth Revision codes), we identified children <18 years old presenting with both infection (triage fever or ICD-9 infection) and organ dysfunction (triage hypotension or ICD-9 organ dysfunction).
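The case definition above requires both an infection indicator and an organ-dysfunction indicator in a child under 18. An illustrative sketch follows; the abbreviated code sets are hypothetical stand-ins for the full ICD-9 lists, which are not reproduced here:

```python
# Hypothetical, abbreviated stand-ins for the full ICD-9 code lists.
INFECTION_CODES = {"038", "481", "599.0"}      # e.g. septicemia, pneumonia, UTI
ORGAN_DYSFUNCTION_CODES = {"518.81", "584.9"}  # e.g. respiratory failure, AKI

def meets_severe_sepsis_definition(age_years, triage_fever,
                                   triage_hypotension, icd9_codes):
    """A visit qualifies if the patient is <18 years old and has both an
    infection indicator (triage fever or an infection ICD-9 code) and an
    organ-dysfunction indicator (triage hypotension or a dysfunction code)."""
    if age_years >= 18:
        return False
    infection = triage_fever or any(c in INFECTION_CODES for c in icd9_codes)
    dysfunction = (triage_hypotension or
                   any(c in ORGAN_DYSFUNCTION_CODES for c in icd9_codes))
    return infection and dysfunction

# Febrile 2-year-old with coded respiratory failure: qualifies.
print(meets_severe_sepsis_definition(2, True, False, ["518.81"]))  # True
```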
Results. Of 28.2 million pediatric patients presenting to US EDs each year, severe sepsis was present in 95,055 (0.34%; 95% CI: 0.29–0.39%). Fever and respiratory infection were the most common indicators of an infection. Hypotension and respiratory failure were the most common indicators of organ dysfunction. Most severe sepsis occurred in children ages 31 days–1 year old (32.1%). Most visits for pediatric severe sepsis occurred during winter months (37.4%), and only 11.1% of patients arrived at the ED by ambulance. Over half of severe sepsis cases were self-pay or insured by Medicaid. A large portion (44.1%) of pediatric severe sepsis ED visits occurred in the South census region. ED length of stay was over 3 h, and 16.5% were admitted to the hospital.
Conclusion. Nearly 100,000 children annually present to US EDs with severe sepsis. The findings of this study highlight the unique characteristics of children treated in the ED for severe sepsis.
Sepsis; Emergency department; Pediatrics; Epidemiology; Cross-sectional
To evaluate the relationship between A1C and glucose therapy intensification (GTI) in patients with diabetes mellitus (DM) hospitalized for acute myocardial infarction (AMI).
RESEARCH DESIGN AND METHODS
A1C was measured as part of routine care (clinical A1C) or in the core laboratory (laboratory A1C, results unavailable to clinicians). GTI predictors were identified using hierarchical Poisson regression.
Of 1,274 patients, 886 (70%) had clinical A1C and an additional 263 had laboratory A1C measured. Overall, A1C was <7% in 419 (37%), 7–9% in 415 (36%), and >9% in 315 patients (27%). GTI occurred in 31% of patients and was more frequent in those with clinical A1C both before (34 vs. 24%, P < 0.001) and after multivariable adjustment (relative risk 1.34 [95% CI 1.12–1.62] vs. no clinical A1C).
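The adjusted relative risk above comes from hierarchical Poisson regression; the unadjusted comparison (34% vs. 24%) corresponds to a crude relative risk that can be computed from a 2×2 table with a log-normal 95% CI, sketched below with illustrative counts rather than the study data:

```python
import math

def relative_risk(exposed_events, exposed_total,
                  unexposed_events, unexposed_total):
    """Crude relative risk with a 95% CI from a 2x2 table
    (log-normal approximation for the standard error)."""
    rr = (exposed_events / exposed_total) / (unexposed_events / unexposed_total)
    se = math.sqrt(1 / exposed_events - 1 / exposed_total
                   + 1 / unexposed_events - 1 / unexposed_total)
    return (rr,
            math.exp(math.log(rr) - 1.96 * se),
            math.exp(math.log(rr) + 1.96 * se))

# Illustrative counts yielding proportions of 34% vs. 24%.
print(relative_risk(34, 100, 24, 100))
```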
Long-term glucose control is poor in most AMI patients with DM, but only a minority of patients undergo GTI at discharge. Inpatient A1C assessment is strongly associated with intensification of glucose-lowering therapy.
The aim of this study was to provide a contemporary picture of the presentation, etiology and outcome of infective endocarditis (IE) in a large patient cohort from multiple locations worldwide.
Prospective cohort study of 2781 adults with definite IE admitted to 58 hospitals in 25 countries between June 2000 and September 2005.
The median age of the cohort was 57.9 (IQR 43.2–71.8) years and 72% had native valve IE. Most (77%) patients presented early in the disease (<30 days) with few of the classic clinical hallmarks of IE. Recent health-care exposure was found in one quarter of patients. Staphylococcus aureus was the most common pathogen (31%). Mitral (41%) and aortic (38%) valves were infected most commonly. Complications were common: stroke (17%), embolization other than stroke (23%), heart failure (32%), and intracardiac abscess (14%). Surgical therapy was common (48%) and in-hospital mortality remained high (18%). Prosthetic valve involvement (OR 1.47, 95%CI 1.13–1.90), increasing age (OR 1.30, 95%CI 1.17–1.46 per 10-year interval), pulmonary edema (OR 1.79, 95%CI 1.39–2.30), S. aureus infection (OR 1.54, 95%CI 1.14–2.08), coagulase-negative staphylococcal infection (OR 1.50, 95%CI 1.07–2.10), mitral valve vegetation (OR 1.34, 95%CI 1.06–1.68), and paravalvular complications (OR 2.25, 95%CI 1.64–3.09) were associated with increased risk of in-hospital death, while viridans streptococcal infection (OR 0.52, 95%CI 0.33–0.81) and surgery (OR 0.61, 95%CI 0.44–0.83) were associated with decreased risk.
In the early 21st century, IE is more often an acute disease, characterized by a high rate of S. aureus infection. Mortality remains relatively high.
This review addresses how mutation of the TP53 gene (p53) and ultraviolet light alter the behavior of normal progenitor cells in early epidermal preneoplasia.
Cancer is thought to evolve from single mutant cells, which expand into clones and ultimately into tumors. While the mutations in malignant lesions have been studied intensively, less is known about the earliest stages of preneoplasia, and how environmental factors may contribute to drive expansion of mutant cell clones. Here we review the evidence that ultraviolet radiation not only creates new mutations but also drives the exponential growth of the numerous p53 mutant clones found in chronically exposed epidermis. Published data are reconciled with a new paradigm of epidermal homeostasis that gives insights into the behavior of mutant cells. We also consider the reasons why so few mutant cells progress into tumors and discuss the implications of these findings for cancer prevention.
Epidermis; Preneoplasia; Ultraviolet; p53 mutation; Stem cell
Automated external defibrillators (AEDs) improve survival from out-of-hospital cardiac arrests, but data on their effectiveness in hospitalized patients are limited.
To evaluate the association of AED use and survival for in-hospital cardiac arrest.
Design, Setting, Patients
Cohort study of 11,695 hospitalized patients with cardiac arrests between January 1, 2000 and August 26, 2008 at 204 hospitals following the introduction of AEDs on general hospital wards.
Main Outcome Measure
Survival to hospital discharge by AED use, using multivariable hierarchical regression analyses to adjust for patient factors and hospital site.
Of 11,695 patients, 9616 (82.2%) had non-shockable rhythms (asystole and pulseless electrical activity) and 2079 (17.8%) had shockable rhythms (ventricular fibrillation and pulseless ventricular tachycardia). AEDs were used in 4515 (38.6%) patients. Overall, 2117 (18.1%) patients survived to hospital discharge. Within the entire study population, AED use was associated with a lower rate of survival after in-hospital cardiac arrest compared with no AED use (16.3% vs. 19.3%; adjusted rate ratio (RR), 0.85; 95% confidence interval (CI), 0.78–0.92; P<0.001). Among cardiac arrests due to non-shockable rhythms, AED use was associated with lower survival (10.4% vs. 15.4%; adjusted RR, 0.74; 95% CI, 0.65–0.83; P<0.001). In contrast, for cardiac arrests due to shockable rhythms, AED use was not associated with survival (38.4% vs. 39.8%; adjusted RR, 1.00; 95% CI, 0.88–1.13; P=0.99). These patterns were consistently observed in both monitored and non-monitored hospital units where AEDs were used, after matching patients to the individual units in each hospital where the cardiac arrest occurred, and with a propensity score analysis.
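As a hedged back-of-the-envelope check (not the study's adjusted hierarchical model), the unadjusted risk ratio and a Wald 95% CI can be reconstructed from the reported group size (4515 AED patients of 11,695) and the reported survival percentages; the counts below are approximations derived from those figures:

```python
import math

# Approximate counts reconstructed from the reported figures
# (4515 AED patients of 11,695 total; survival 16.3% vs. 19.3%).
n_aed, n_no_aed = 4515, 11695 - 4515
surv_aed = round(0.163 * n_aed)        # survivors with AED use
surv_no_aed = round(0.193 * n_no_aed)  # survivors without AED use

p1 = surv_aed / n_aed
p0 = surv_no_aed / n_no_aed
rr = p1 / p0  # unadjusted risk ratio

# Wald 95% CI on the log scale: SE(log RR) = sqrt((1-p1)/a + (1-p0)/c)
se_log_rr = math.sqrt((1 - p1) / surv_aed + (1 - p0) / surv_no_aed)
lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The unadjusted estimate lands close to the adjusted RR of 0.85 reported above; the published interval additionally accounts for patient factors and hospital site.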
Use of AEDs in hospitalized patients with cardiac arrest is not associated with improved survival.
Diseases of esophageal epithelium (EE) such as reflux esophagitis and cancer are rising in incidence. Despite this, the cellular behaviors underlying EE homeostasis and repair remain controversial. Here we show that in mice, EE is maintained by a single population of cells that divide stochastically to generate proliferating and differentiating daughters with equal probability. In response to challenge with all-trans retinoic acid (atRA), the balance of daughter cell fate is unaltered but the rate of cell division increases. However, following wounding, cells reversibly switch to producing an excess of proliferating daughters until the wound has closed. Such fate switching enables a single progenitor population to both maintain and repair tissue without the need for a "reserve" slow-cycling stem cell pool.
The MSIS-29 was developed to assess the physical and psychological impact of MS. The aims of this study were to use the responses to the MSIS-29 via the web portal of the UK MS Register to: examine the internal properties of the scale delivered via the internet, profile the cohort, and assess how well the scale measures impact of disability on the potential workforce.
Between May 2011 and April 2012, 4558 people with MS completed the MSIS-29 (v1). The responses were collated with basic demographic and descriptive MS data, and the resulting dataset was analysed in SPSS (v20).
Internal consistency was high (Cronbach's alpha 0.97 MSIS-29-PHYS, 0.92 MSIS-29-PSYCH). The mean MSIS-29-PHYS score was 60.5 (50.6%) with a median of 62, and the mean MSIS-29-PSYCH score was 24.8 (43.8%) with a median of 24. Physical scores increased with age and disease duration (both p<0.001), but there was a weak negative relationship between psychological scores and age (p<0.001). Relative to people with a low physical score, the odds of having an employment status of sick/disabled were 7.2 times higher (95% CI 5.5 to 9.4, p<0.001) for people with a moderate physical score, and 22.3 times higher (95% CI 17.0 to 29.3, p<0.001) for people with a high physical score.
This study, the largest of its kind known, demonstrates that the MSIS-29 can be administered via the internet to characterise a cohort and to predict the likely impact of disability on taking an active part in the workforce, a reasonable proxy for the effects of MS on general activities. The findings examining MSIS-29-PHYS and MSIS-29-PSYCH scores against age support the use of two sub-scales rather than a combined score. These results underline the importance of using a scale such as this to monitor disability levels regularly, guiding MS care so that people can remain as active as possible.
Understanding how stem cells are regulated in adult tissues is a major challenge in cell biology. In the basal layer of human epidermis, clusters of almost quiescent stem cells are interspersed with proliferating and differentiating cells. Previous studies have shown that the proliferating cells follow a pattern of balanced stochastic cell fate. This behaviour enables them to maintain homeostasis, while stem cells remain confined to their quiescent clusters. Intriguingly, these clusters reappear spontaneously in culture, suggesting that they may play a functional role in stem cell auto-regulation. We propose a model of pattern formation that explains how clustering could regulate stem cell activity in homeostatic tissue through contact inhibition and stem cell aggregation.
stem cells; pattern formation; epidermis
A 25-year-old woman presented at 12 weeks gestation with symptoms and laboratory investigations consistent with pheochromocytoma. Imaging modalities available during pregnancy were limited, and MRI of the abdomen and neck failed to localise the tumour. Postpartum imaging, including 131I-metaiodobenzylguanidine and octreotide scans, cardiac CT, cardiac MRI and cardiac catheterisation, allowed accurate localisation of the tumour and helped plan for successful surgical removal.
A large number of diverse, complex, and distributed data resources are currently available in the Bioinformatics domain. The pace of discovery and the diversity of information mean that centralised reference databases like UniProt and Ensembl cannot integrate all potentially relevant information sources. From a user perspective, however, centralised access to all relevant information concerning a specific query is essential. The Distributed Annotation System (DAS) defines a communication protocol to exchange annotations on genomic and protein sequences; this standardisation enables clients to retrieve data from a myriad of sources, thus offering centralised access to end-users.
We introduce MyDas, a web server that facilitates the publishing of biological annotations according to the DAS specification. It deals with the common functionality requirements of making data available, while also providing an extension mechanism in order to implement the specifics of data store interaction. MyDas allows the user to define where the required information is located along with its structure, and is then responsible for the communication protocol details.
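The DAS protocol exchanges annotations over plain HTTP as XML: a client issues a features request of the form <base>/das/<source>/features?segment=<id>:<start>,<stop> and parses the returned FEATURE elements. As a minimal sketch of the client side (the server URL, source name, and segment in the usage note are hypothetical, and a real client would also handle DAS error headers and the full response schema):

```python
from urllib.request import urlopen
import xml.etree.ElementTree as ET

def parse_das_features(xml_text):
    """Extract (id, type, start, end) tuples from a DAS features XML document."""
    root = ET.fromstring(xml_text)
    features = []
    for feat in root.iter("FEATURE"):
        start = feat.findtext("START")
        end = feat.findtext("END")
        features.append((
            feat.get("id"),
            feat.findtext("TYPE"),
            int(start) if start else None,
            int(end) if end else None,
        ))
    return features

def fetch_das_features(base_url, source, segment):
    """GET <base>/das/<source>/features?segment=<id>:<start>,<stop> and parse it."""
    url = f"{base_url}/das/{source}/features?segment={segment}"
    with urlopen(url) as resp:
        return parse_das_features(resp.read())

# Hypothetical usage against a MyDas-hosted source:
# fetch_das_features("http://example.org", "myuniprot", "P00280:1,100")
```

A server such as MyDas sits on the other side of this exchange, mapping the same URL pattern onto the publisher's data store and serialising the results as DAS XML.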