Calcium plays an essential role in nearly all cellular processes. As such, cellular and systemic calcium concentrations are tightly regulated. During sepsis, derangements in this tight regulation frequently occur, and treating hypocalcemia with parenteral calcium administration remains the current practice guideline.
We investigated whether calcium administration worsens mortality and organ dysfunction using an experimental murine model of sepsis and explored the mechanistic role of the family of calcium/calmodulin-dependent protein kinases in mediating these physiologic effects. To highlight the biological relevance of these observations, we conducted a translational study of the association between calcium administration, organ dysfunction, and mortality among a cohort of critically ill septic ICU patients.
Prospective, randomized controlled experimental murine study. Observational clinical cohort analysis.
University research laboratory. Eight ICUs at a tertiary care center.
870 septic ICU patients.
C57BL/6 and CaMKK−/− mice.
Mice underwent cecal ligation and puncture polymicrobial sepsis and were administered calcium chloride (0.25 or 0.25 mg/kg) or normal saline.
Measurements and Main Results
Administering calcium chloride to septic C57BL/6 mice heightened systemic inflammation and vascular leak, exacerbated hepatic and renal dysfunction, and increased mortality. These events were significantly attenuated in CaMKK−/− mice. In a risk-adjusted analysis of septic patients, calcium administration was associated with an increased risk of death, OR 1.92 (95% CI 1.00–3.68, p=0.049), a significant increase in the risk of renal dysfunction, OR 4.74 (95% CI 2.48–9.08, p<0.001), and a significant reduction in ventilator-free days, mean decrease 3.29 days (0.50–6.08 days, p=0.02).
Derangements in calcium homeostasis occur during sepsis that are sensitive to calcium administration. This altered calcium signaling, transduced by the CaMKK cascade, mediates heightened inflammation and vascular leak that culminate in elevated organ dysfunction and mortality. In the clinical management of septic patients, calcium supplementation provides no benefit and may impose harm.
calcium; sepsis; infection; inflammation; calcium/calmodulin-dependent protein kinase; mortality; organ failure
Autophagy is an evolutionarily conserved cytoplasmic process regulated by the energy rheostats mTOR and AMPK that recycles damaged or unused proteins and organelles. It has been described as an important effector arm of immune cells. We have shown that the cytoplasmically oriented calcium/calmodulin-dependent protein kinase I α (CaMKIα) regulates the inflammatory phenotype of the macrophage (Mφ). Here, we hypothesize that CaMKIα mediates Mφ autophagy. LPS induced autophagy in RAW 264.7 cells and murine peritoneal Mφ that was attenuated with biochemical CaMK inhibition or CaMKIα siRNA. Inhibition of CaMKIα reduced LPS-induced p-Thr172AMPK and TORC1 activity, and expression of a constitutively active CaMKIα, but not a kinase-deficient mutant, induced p-Thr172AMPK and autophagy that was attenuated by the AMPK inhibitor Compound C. Co-immunoprecipitation and in vitro kinase assays demonstrated that CaMKIα activates AMPK, thereby inducing ATG7, which also localizes to this CaMKIα-AMPK complex. During LPS-induced lung inflammation, C57Bl/6 mice receiving CaMKIα siRNA displayed reduced lung and bronchoalveolar immune cell autophagy that correlated with reduced neutrophil recruitment, myeloperoxidase activity, and air space cytokine concentration. Independently inhibiting autophagy, using siRNA targeting the PI3 kinase VPS34, yielded similar reductions in lung autophagy and neutrophil recruitment. Thus, a novel CaMKIα-AMPK pathway is rapidly activated in Mφ exposed to LPS and regulates an early autophagic response, independent of TORC1 inhibition. These mechanisms appear to be operant in vivo in orchestrating LPS-induced lung neutrophil recruitment and inflammation.
To evaluate compliance with ACS guidelines and whether trauma center designation, hospital TSCI case volume or spinal surgery volume is associated with paralysis. We hypothesized a priori that trauma center care, by contrast to non-trauma center care, is associated with reduced paralysis at discharge.
Summary Background Data
Approximately 11,000 persons incur a traumatic spinal cord injury (TSCI) in the US annually. The American College of Surgeons (ACS) recommends all TSCI patients be taken to a Level I or II Trauma Center.
We studied 4121 patients diagnosed with TSCI by ICD-9-CM criteria in the 2001 hospital discharge files of seven states (FL, MA, NJ, NY, TX, VA, WA), who were treated in 100 trauma centers and 601 non-trauma centers. We performed multivariate analyses, including a propensity score quintile approach, adjusting for differences in case-mix and clustering by hospital and by state. We also studied 3125 patients using the expanded modified Medicare Provider Analysis and Review records for the years 1996, 2001 and 2006 to assess temporal trends in paralysis by trauma center designation.
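The propensity-score quintile adjustment described here can be sketched in miniature. The sketch below uses synthetic data and simple Mantel-Haenszel pooling across quintile strata; the study's actual models also adjusted for case-mix and clustering by hospital and by state, which this illustration omits.

```python
import random
import statistics

random.seed(0)

# Synthetic cohort: each patient has a propensity score for trauma-center
# treatment, a treatment flag, and a binary outcome (e.g., paralysis).
patients = []
for _ in range(4000):
    ps = random.random()
    treated = random.random() < ps          # treatment tracks the propensity
    risk = 0.15 if treated else 0.20        # hypothetical outcome risks
    outcome = random.random() < risk
    patients.append((ps, treated, outcome))

# Quintile boundaries of the propensity score (4 cut points).
cuts = statistics.quantiles([p[0] for p in patients], n=5)

def quintile(ps):
    for i, cut in enumerate(cuts):
        if ps <= cut:
            return i
    return 4

# Mantel-Haenszel odds ratio pooled across the five propensity strata.
num = den = 0.0
for q in range(5):
    stratum = [p for p in patients if quintile(p[0]) == q]
    n = len(stratum)
    a = sum(1 for _, t, o in stratum if t and o)         # treated, event
    b = sum(1 for _, t, o in stratum if t and not o)     # treated, no event
    c = sum(1 for _, t, o in stratum if not t and o)     # untreated, event
    d = sum(1 for _, t, o in stratum if not t and not o) # untreated, no event
    num += a * d / n
    den += b * c / n

or_mh = num / den
print(f"Mantel-Haenszel OR across propensity quintiles: {or_mh:.2f}")
```

With the hypothetical risks above (15% treated vs. 20% untreated), the pooled odds ratio comes out below 1, analogous in direction to the adjusted OR of 0.67 reported for trauma-center care.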
Mortality was 7.5%, and 16.3% were discharged with paralysis. Only 57.9% (n=2378) received care at a designated trauma center. Trauma centers had a 16-fold higher admission caseload (20.7 vs. 1.3; p<0.001) and 30-fold higher surgical volume (9.6 vs. 0.3; p<0.001). In the multivariate propensity analysis, paralysis was significantly lower at trauma centers (adjusted OR 0.67; 95% CI, 0.53–0.85; p=0.001). Higher surgical volume, not higher admission volume, was associated with lower risk of paralysis. Indeed, at non-trauma centers, higher admission caseload was associated with worse outcome. There was no significant difference in mortality.
Trauma center care is associated with reduced paralysis after TSCI, possibly due to greater use of spinal surgery. National guidelines to triage all such patients to trauma centers are followed little more than half the time.
Red blood cell (RBC) transfusion is associated with alterations in systemic concentrations of IL-8/CXCL8 functional homologues in a murine model. Whether RBC transfusion alters systemic neutrophil chemokine concentrations in individuals sustaining traumatic injury is not known. We conducted a retrospective, single-center study of severely injured trauma patients presenting within 12 h of injury with a base deficit > 6 and hypotension in the field. Plasma concentrations of twenty-five chemokines, cytokines, and growth factors were obtained from both transfused (N=22) and non-transfused (N=33) groups in the first 48 h following admission. The transfused group (mean RBC units ± SD: 2.7 ±1.7) tended to be older (49.9 ±21.1 versus 40.4 ±19.9 years, p=0.10), with a higher percentage of females (40.9% versus 18.2%, p=0.06), and a higher injury severity score (ISS) (27.1 ±12.7 versus 21.4 ±10.2, p=0.07). In univariate and multivariate analyses, transfusion was associated with increased hospital and ICU length of stay but not ventilator-free days. Plasma CXCL8 concentrations were higher in the transfused (mean ±SD: 84 ±88 pg/mL) than the non-transfused group (31 ±21 pg/mL, p=0.003). Using a linear prediction model to calculate bioanalyte concentrations standardized for age, gender, ISS, and admission SBP, we observed that CXCL8 concentrations diverged within 12 hours following injury, with the transfused group showing persistently elevated CXCL8 concentrations by contrast to the decay observed in the non-transfused group. Other bioanalytes showed no differences across time. RBC transfusion is associated with persistently elevated neutrophil chemokine CXCL8 concentrations following traumatic injury.
Chemokines; RBC transfusion; Inflammation; trauma
Our previous Delphi Study identified several audit filters considered sensitive to deviations in prehospital trauma care and potentially useful in conducting performance improvement (PI), a process currently recommended by the American College of Surgeons Committee on Trauma (ACS-COT). This study validates two of those proposed audit filters.
We studied 4744 trauma patients using the electronic records of the Central Region Trauma registry and Emergency Medical Services (EMS) patient logs for the period January 1, 2002 to December 31, 2004. We studied whether 1) a request by on-scene Basic Life Support (BLS) providers for Advanced Life Support (ALS) assistance or 2) failure by EMS personnel to record basic patient physiology at the scene was associated with increased in-hospital mortality. We performed multivariate analyses, including a propensity score quintile approach, adjusting for differences in case mix and clustering by hospital.
Overall mortality was 6.1%. A total of 28.2% (n=1337) of EMS records were missing patient scene physiologic data. Multivariate analysis revealed that patients missing one or more measures of patient physiology at the scene had increased risk of death (adjusted OR 2.15; 95% CI 1.13–4.10). In 17.4% (n=402) of cases BLS requested ALS assistance. Patients for whom BLS requested ALS had a similar risk of death as patients for whom ALS was initially dispatched (OR 1.04; 95% CI 0.51 to 2.15).
Failure of EMS to document basic measures of scene physiology is associated with increased mortality. This deviation in care may serve as a sensitive audit filter for trauma performance improvement. The need by BLS for ALS assistance was not associated with increased mortality.
The chromatin-binding factor HMGB1 functions as a proinflammatory cytokine and late mediator of mortality in murine endotoxemia. Although serine phosphorylation of HMGB1 is necessary for nucleocytoplasmic shuttling prior to its cellular release, the protein kinases involved have not been identified. To investigate whether CaMKIV serine-phosphorylates HMGB1 and mediates its release from macrophages (Mϕ) stimulated with LPS, RAW 264.7 cells or murine primary peritoneal Mϕ were incubated with STO609 (a CaMKIV kinase inhibitor) or KN93 (a CaMKIV inhibitor), or were depleted of CaMKIV by RNAi, prior to stimulation with LPS. We also compared the LPS response of primary Mϕ isolated from CaMKIV +/+ and CaMKIV -/- mice. In both cell types LPS induced activation and nuclear translocation of CaMKIV, which preceded HMGB1 nucleocytoplasmic shuttling. However, Mϕ treated with KN93, STO609 or CaMKIV RNAi prior to LPS showed reduced nucleocytoplasmic shuttling of HMGB1 and release of HMGB1 into the supernatant. In addition, LPS induced serine phosphorylation of HMGB1, which correlated with an interaction between CaMKIV and HMGB1 and with CaMKIV phosphorylation of HMGB1 in vitro. In cells, both HMGB1 phosphorylation and interaction with CaMKIV were inhibited by STO609 or CaMKIV RNAi. Similarly, whereas CaMKIV +/+ Mϕ showed serine phosphorylation of HMGB1 in response to LPS, this phosphorylation was attenuated in CaMKIV -/- Mϕ. Collectively, our results demonstrate that CaMKIV promotes the nucleocytoplasmic shuttling of HMGB1 and suggest that this process may be mediated through CaMKIV-dependent serine phosphorylation of HMGB1.
monocytes/macrophages; lipopolysaccharide; inflammation; signal transduction
Mononuclear phagocyte recognition of apoptotic cells triggering suppressive cytokine signaling is a key event in inflammation resolution from injury. Mice deficient in thrombospondin-1 (thbs1−/−), an extracellular matrix glycoprotein that bridges cell-cell interactions, are prone to LPS-induced lung injury and show defective macrophage IL-10 production during the resolution phase of inflammation. Reconstitution of IL-10 rescues thbs1−/− mice from persistent neutrophilic lung inflammation and injury and thbs1−/− alveolar macrophages show defective IL-10 production following intratracheal instillation of apoptotic neutrophils despite intact efferocytosis. Following co-culture with apoptotic neutrophils, thbs1−/− macrophages show a selective defect in IL-10 production whereas PGE2 and TGF-β1 responses remain intact. Full macrophage IL-10 responses require the engagement of thrombospondin-1 structural repeat 2 domain and the macrophage scavenger receptor CD36 LIMP-II Emp sequence homology (CLESH) domain in vitro. Although TSP-1 is not essential for macrophage engulfment of apoptotic neutrophils in vivo, TSP-1 aids in the curtailment of inflammatory responses during the resolution phase of injury in the lungs by providing a means by which apoptotic cells are recognized and trigger optimal IL-10 production by macrophages.
Thrombospondin-1; Injury Resolution
Physician non-compliance with clinical practice guidelines remains a critical barrier to high quality care. Serious games (using gaming technology for serious purposes) have emerged as a method of studying physician decision making. However, little is known about their validity.
We created a serious game and evaluated its construct validity. We used the decision context of trauma triage in the Emergency Department of non-trauma centers, given widely accepted guidelines that recommend the transfer of severely injured patients to trauma centers. We designed cases with the premise that the representativeness heuristic influences triage (i.e. physicians make transfer decisions based on archetypes of severely injured patients rather than guidelines). We randomized a convenience sample of emergency medicine physicians to a control or cognitive load arm, and compared performance (disposition decisions, number of orders entered, time spent per case). We hypothesized that cognitive load would increase the use of heuristics, increasing the transfer of representative cases and decreasing the transfer of non-representative cases.
We recruited 209 physicians, of whom 168 (79%) began and 142 (68%) completed the task. Physicians transferred 31% of severely injured patients during the game, consistent with rates of transfer for severely injured patients in practice. They entered the same average number of orders in both arms (control (C): 10.9 [SD 4.8] vs. cognitive load (CL):10.7 [SD 5.6], p = 0.74), despite spending less time per case in the control arm (C: 9.7 [SD 7.1] vs. CL: 11.7 [SD 6.7] minutes, p<0.01). Physicians were equally likely to transfer representative cases in the two arms (C: 45% vs. CL: 34%, p = 0.20), but were more likely to transfer non-representative cases in the control arm (C: 38% vs. CL: 26%, p = 0.03).
We found that physicians made decisions consistent with actual practice, that we could manipulate cognitive load, and that load increased the use of heuristics, as predicted by cognitive theory.
Regionalization is intended to optimize outcomes by matching patient needs with institutional resources. The American College of Surgeons – Committee on Trauma (ACS-COT) recommends <5% under-triage (treatment of patients with moderate-severe injuries at non-trauma centers (NTCs)) and <50% over-triage (transfer of patients with minor injuries to trauma centers (TCs)).
To test the feasibility of accomplishing the ACS-COT benchmarks given current practice patterns by describing transfer patterns for patients taken initially to NTCs and estimating volume shifts and potential lives saved if full implementation occurred.
Design, Setting and Patients
Retrospective cohort study of adult trauma patients initially evaluated at NTCs in Pennsylvania (2001–2005). We used published estimates of mortality risk reduction associated with treatment at TCs.
Main Outcome Measures
Under- and over-triage rates; estimated patient volume shifts; number of lives saved.
93,880 adult trauma patients were initially evaluated at NTCs in Pennsylvania between 2001 and 2005. Under-triage was 69%; over-triage was 53%. Achieving <5% under-triage would require the transfer of 18,945 patients/year, a five-fold increase over current practice (3,650 transfers/year). Given an absolute mortality risk reduction of 1.9% for patients with moderate-severe injuries treated at TCs, this change in practice would save 99 potential lives/year, equivalent to 191 transfers to save 1 potential life.
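The transfers-per-life figure follows directly from the numbers reported above; a quick arithmetic check, using only values stated in this abstract:

```python
# Figures reported in the abstract.
transfers_needed_per_year = 18_945   # transfers/year to reach <5% under-triage
lives_saved_per_year = 99            # potential lives saved/year
abs_risk_reduction = 0.019           # mortality ARR at TCs, moderate-severe injuries

# Transfers required to save one potential life.
transfers_per_life = transfers_needed_per_year / lives_saved_per_year
print(round(transfers_per_life))     # 191 transfers per potential life saved

# Only moderate-severe injuries benefit from the ARR, so the implied number of
# such patients among the transfers (derived here, not reported) is:
moderate_severe_benefiting = lives_saved_per_year / abs_risk_reduction
print(round(moderate_severe_benefiting))  # about 5,200 of the 18,945 transfers
```

The gap between the roughly 5,200 patients who stand to benefit and the 18,945 total transfers reflects the 53% over-triage rate: most transferred patients have minor injuries.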
Given current practice patterns, ACS-COT recommendations for the regionalization of trauma patients may not be feasible. To achieve <5% under-triage, TCs must increase their capacity 5-fold, physicians at NTCs must improve their ability to discriminate between moderate-severe and other injuries, or the guidelines must be modified.
decision making; compliance; safety; quality assurance; triage; discrimination
Injured elderly patients experience high rates of undertriage to trauma centers (TCs) while debate continues regarding the age defining a geriatric trauma patient. We sought to identify when mortality risk increases in injured patients due to age alone, to determine whether TC care was associated with improved outcomes for these patients, and to estimate the added admissions burden to TCs using an age threshold for triage.
We performed a retrospective cohort study of injured patients treated at TCs and non-TCs in Pennsylvania from April 1, 2001, to March 31, 2005. Patients were included if they were between 19 and 100 years of age and had sustained minimal injury (ISS<9). The primary outcome was in-hospital mortality. We analyzed age as a predictor of mortality using the fractional polynomial method.
A total of 104,015 patients were included. Mortality risk significantly increased at 57 years (OR: 5.58; 95% CI: 1.07-29.0; p=0.04) relative to 19-year-old patients. TC care was associated with a decreased mortality risk compared to non-TC care (OR: 0.83; 95% CI: 0.69-0.99; p=0.04). Using an age of 70 as a threshold for mandatory triage, we estimated TCs could expect an annual increase of approximately one additional admission per day.
Age is a significant risk factor for mortality in trauma patients, and TC care improves outcomes even in older, minimally injured patients. An age threshold should be considered as a criterion for TC triage. Using the clinically relevant age of 70 as this threshold would not impose a substantial increase on annual TC admissions.
Trauma; Geriatric; Age; Elderly; Triage
Treatment at Level I/II trauma centers improves outcomes for patients with severe injuries. Little is known about the role of physicians’ clinical judgment in triage at outlying hospitals. We assessed the association between physician caseload, case mix, and the triage of trauma patients presenting to non-trauma centers.
A retrospective cohort analysis of patients evaluated between January 1, 2007 and December 31, 2010 by emergency physicians working in eight community hospitals in western Pennsylvania. We linked billing records to hospital charts, summarized physicians’ caseloads, and calculated rates of under-triage (proportion of patients with moderate to severe injuries not transferred to a trauma center) and over-triage (proportion of patients transferred with a minor injury). We measured the correlation between physician characteristics, caseload and rates of triage.
29 (58%) of 50 eligible physicians participated in the study. Physicians had 16.8 (SD=10.1) years of post-residency clinical experience; 21 (72%) were board-certified in Emergency Medicine. They evaluated a median of 2,423 patients/year, of whom 148 (6%) were trauma patients and 3 (0.1%) had moderate to severe injuries. The median under-triage rate was 80%; the median over-triage rate was 91%. Physicians’ caseload of patients with moderate to severe injuries was inversely associated with rates of under-triage (correlation coefficient −0.42, p=0.03). Compared to physicians in the lowest quartile, those in the highest quartile under-triaged 31% fewer patients.
Emergency physicians working in non-trauma centers rarely encounter patients with moderate to severe injuries. Caseload was strongly associated with compliance with American College of Surgeons – Committee on Trauma guidelines.
trauma triage; guidelines; volume-outcome
Recent studies suggest that statin use may improve outcomes in critically ill patients. This has been attributed to the pleiotropic effects and modulation of inflammatory mediators that occur with statin use. We sought to determine whether preinjury statin (PIS) use was associated with improved outcomes in severely injured blunt trauma patients.
Data were obtained from a multicenter prospective cohort study evaluating outcomes in blunt injured adults with hemorrhagic shock. Patients aged 55 years and older were analyzed. Those with isolated traumatic brain injury, cervical cord injury, and those who survived <24 hours were excluded. A propensity score predicting statin use was created using logistic regression. Cox proportional hazard regression was then used to evaluate the effects of PIS use on mortality and the development of multiple organ failure (MOF; multiple organ dysfunction syndrome score >5) and nosocomial infection (NI), after adjusting for important injury characteristics and the propensity of taking PISs.
Overall mortality and MOF rates for the study cohort (n = 295) were 21% and 50%, respectively. Over 24% of patients (n = 71) reported PIS use. Kaplan-Meier analysis revealed no difference in NI or mortality over time but did show a significantly higher incidence of MOF in those with PIS use (p = 0.04). Regression analysis verified that PIS use was independently associated with an 80% higher risk of MOF (hazard ratio: 1.8; 95% confidence interval, 1.1–2.9) and was found to be one of the strongest independent risk factors for the development of MOF.
PIS use was independently associated with a higher risk of MOF postinjury. These results are contrary to previous analyses. The protective effect of statins may be lost in the severely injured, and modulation of the inflammatory response may result in higher morbidity. Further studies are required to better understand the impact and potential therapeutic utility of this commonly prescribed medication both before and after injury.
Preinjury statin use; Multiple organ failure; Cox proportional hazard regression
Computed Tomography (CT) has become an essential tool in the assessment of the stable trauma patient. Intravenous (IV) contrast is commonly relied upon to provide superior image quality, particularly for solid organ injury. However, a significant proportion of injured patients have contraindications to IV contrast. Little information exists concerning the repercussions of CT imaging without IV contrast specifically for splenic injury.
We performed a retrospective analysis using data from our trauma registry and chart review as part of a quality improvement project at our institution. All patients with splenic injury over a 3-year period (2008–2010) who underwent CT of the abdomen without IV contrast (DRY) early during their admission were selected. All splenic injuries had to have been verified with abdominal CT imaging with IV contrast (CONTRAST) or via intraoperative findings. DRY images were independently read by a single, blinded radiologist, assessed for parenchymal injury or 'suspicious' splenic injury findings, and compared with CONTRAST imaging results or intraoperative findings.
Over the study period, 319 patients had documented splenic injury, with 44 (14%) undergoing DRY imaging that was also verified by CONTRAST imaging or operative findings. Splenic parenchymal injury was visualized in only 38% of DRY patients. 'Suspicious' splenic injury radiographic findings were common. When these less specific findings for splenic injury were incorporated into the radiographic assessment, DRY imaging had over 93% sensitivity for detecting splenic injury.
DRY imaging is increasingly being performed post-injury and has low sensitivity for detecting splenic parenchymal injury. However, less specific radiographic findings suspicious for splenic injury, taken in combination, provide high sensitivity for splenic injury detection. These results suggest that CONTRAST imaging is preferred for detecting splenic injury; however, in patients who have contraindications to IV contrast, DRY imaging may be able to select those who require close monitoring or intervention.
To determine whether 1) an age-dependent loss of inducible autophagy underlies the failure to recover from AKI in older, adult animals during endotoxemia, and 2) pharmacologic induction of autophagy, even after established endotoxemia, has therapeutic utility in facilitating renal recovery in aged mice.
Murine model of endotoxemia and cecal ligation and puncture (CLP) induced acute kidney injury (AKI).
Academic research laboratory.
C57Bl/6 mice of 8 (young) and 45 (adult) weeks of age.
Lipopolysaccharide (1.5 mg/kg), Temsirolimus (5 mg/kg), AICAR (100 mg/kg).
Measurements and Main Results
Herein we report that diminished autophagy underlies the failure to recover renal function in older adult mice, utilizing a murine model of LPS-induced AKI. The administration of the mTOR inhibitor temsirolimus, even after established endotoxemia, induced autophagy and protected against the development of AKI.
These novel results demonstrate a role for autophagy in the context of LPS-induced AKI and support further investigation into interventions of this kind, which have the potential to alter the natural history of the disease.
To identify the optimal target of a future intervention to improve physician decision making in trauma triage.
A comparison of the incremental cost-effectiveness ratios (ICERs) of current practice versus hypothetical interventions targeting either physicians’ decisional thresholds (attitudes towards transferring patients to trauma centers) or perceptual sensitivity (ability to identify patients who meet guidelines for transfer).
Taking the societal perspective, we constructed a Markov decision model. We drew estimates of triage patterns, mortality, utilities, and costs from the literature. We assumed that an intervention to change decisional threshold would reduce under-triage but also increase over-triage more than an intervention to change perceptual sensitivity. We performed a series of one-way sensitivity analyses, and studied the most influential variables in a Monte Carlo simulation.
The ICER of an intervention to change perceptual sensitivity was $62,799/quality-adjusted life-year (QALY) gained compared with current practice. The ICER of an intervention to change decisional threshold was $104,975/QALY gained compared with an intervention to change perceptual sensitivity. These findings were most sensitive to the relative cost of hospitalizing patients with moderate-severe injuries and their relative risk of dying at non-trauma centers. In probabilistic sensitivity analyses, at a willingness-to-pay threshold of $100,000/QALY gained, there was a 62% likelihood that an intervention to change perceptual sensitivity was the most cost-effective alternative.
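These comparisons use the standard incremental cost-effectiveness formula, ICER = Δcost/ΔQALY. The sketch below applies it with hypothetical increments chosen only so the ratios land near the reported values; the Markov model's actual per-patient costs and QALYs are not given in this abstract.

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra dollars per QALY gained."""
    return delta_cost / delta_qaly

# Hypothetical increments (illustrative placeholders, not the model's inputs):
# perceptual-sensitivity intervention vs. current practice:
print(round(icer(628, 0.010)))    # about 62,800 $/QALY (reported: $62,799)
# decisional-threshold intervention vs. perceptual-sensitivity intervention:
print(round(icer(1_050, 0.010)))  # about 105,000 $/QALY (reported: $104,975)
```

Note that each strategy is compared against the next-less-effective, non-dominated alternative, not against current practice in both cases; that is why the second ICER is computed relative to the perceptual-sensitivity intervention.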
Even a minor investment in changing decision making in trauma triage could greatly improve the quality of care provided. The optimal intervention depends on the characteristics of the individual trauma systems.
cost-effectiveness; trauma; triage; clinical practice guidelines; heuristics
Sepsis-induced lymphocyte and dendritic cell apoptosis contributes to immunosuppression, which results in an inability to eradicate the primary infection as well as a propensity to acquire new, secondary infections. Another cellular process, autophagy, is also activated in immune cells and plays a protective role. In the present study, we demonstrate that interferon regulatory factor-1 (IRF-1) regulates both immune cell apoptosis and autophagy in a murine endotoxemia model. IRF-1 is activated at an early phase in splenocytes in a Toll-like receptor 4 (TLR4)-dependent, myeloid differentiation primary response gene 88 (MyD88)-independent manner. Further, IRF-1 knockout (KO) mice are protected from a lethal endotoxemia model. This protection is associated with decreased apoptosis and increased autophagy in splenocytes. IRF-1 KO mice experience decreased apoptotic cell loss, especially in CD4+ T lymphocytes and myeloid antigen presenting cells (APC). Meanwhile, IRF-1 KO mice demonstrate increased autophagy and improved mitochondrial integrity. This increased autophagy in KO mice is attributable, at least in part, to deactivation of mammalian target of rapamycin (mTOR)/P70S6 signaling, a principal negative regulator of autophagy. Therefore, we propose a novel role for IRF-1 in regulating both apoptosis and autophagy in splenocytes in the setting of endotoxemia, with IRF-1 promoting apoptosis and inhibiting autophagy.
IRF-1; endotoxemia; apoptosis; autophagy; splenocyte
United States trauma system guidelines specify when to triage patients to specialty centers. Nonetheless, many eligible patients are not transferred as per guidelines. One possible reason is emergency physician decision-making. The objective of the study was to characterize sensory and decisional determinants of emergency physician trauma triage decision-making.
We conducted a decision science study using a signal detection theory-informed approach to analyze physician responses to a web-based survey of 30 clinical vignettes of trauma cases. We recruited a national convenience sample of emergency medicine physicians who worked at hospitals without level I/II trauma center certification. Using trauma triage guidelines as our reference standard, we estimated physicians’ perceptual sensitivity (ability to discriminate between patients who did and did not meet guidelines for transfer) and decisional threshold (tolerance for false positive or false negative decisions).
We recruited 280 physicians: 210 logged in to the website (response rate 74%) and 168 (80%) completed the survey. The regression coefficient on American College of Surgeons – Committee on Trauma (ACS-COT) guidelines for transfer (perceptual sensitivity) was 0.77 (p<0.01, 95% CI 0.68 – 0.87), indicating that the probability of transfer increased only weakly for cases in which the ACS-COT guidelines would recommend transfer. The intercept (decision threshold) was 1.45 (p<0.01, 95% CI 1.27 – 1.63), indicating that participants had a conservative threshold for transfer, erring on the side of not transferring patients. There was significant between-physician variability in perceptual sensitivity and decisional thresholds. No physician demographic characteristics correlated with perceptual sensitivity, but men and physicians working at non-trauma centers without a trauma-center affiliation had higher decisional thresholds.
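One way to read these estimates is as a probit choice model. The sketch below assumes the form P(transfer) = Φ(sensitivity × signal − threshold); this link function and sign convention are an illustrative assumption, not the paper's exact specification.

```python
import math

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Reported estimates: slope on the guideline signal (perceptual sensitivity)
# and intercept (decision threshold / criterion).
sensitivity = 0.77
threshold = 1.45

# Assumed model form: P(transfer) = Phi(sensitivity * signal - threshold),
# with signal = 1 for a guideline-positive case and 0 otherwise.
p_meets_guidelines = phi(sensitivity * 1 - threshold)
p_no_guidelines = phi(sensitivity * 0 - threshold)

print(f"P(transfer | meets guidelines):     {p_meets_guidelines:.2f}")
print(f"P(transfer | does not meet them):   {p_no_guidelines:.2f}")
```

Under this assumed form, a guideline-positive case is transferred roughly a quarter of the time and a guideline-negative case far less often, with both probabilities well below one half, consistent with the conservative threshold and the under-triage described above.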
On a case vignette-based questionnaire, both sensory and decisional elements in emergency physicians’ cognitive processes contributed to the under-triage of trauma patients.
Physician decision making; Clinical guidelines; Compliance; Trauma; Triage; Heuristics; Perceptual sensitivity; Signal detection theory
Transcutaneous techniques to measure serum bilirubin have been validated in neonates but not in adult patients. We evaluated transcutaneous bilirubinometry (TcB) in adults at risk for or diagnosed with hepatic dysfunction to determine if this technology has clinical utility in quantifying the presence and magnitude of hyperbilirubinemia.
Unblinded, consecutive hospitalized adult patients (n = 80) from the General Surgery, Trauma Surgery, and Liver Resection / Transplant services of a tertiary care, university affiliated medical center who were having serum bilirubin measurements performed underwent transcutaneous bilirubin measurement from the forehead, sternum, forearm, and deltoid. The transcutaneous bilirubin measurements were repeated each time serum bilirubin measurements were performed.
Transcutaneous bilirubin measurements from the forehead correlated with serum bilirubin better (R=0.963) than measurements from the forearm (R = 0.792), deltoid (R = 0.922), or sternum (R = 0.928). Forehead TcB detected hepatic dysfunction (serum bilirubin ≥ 2 mg/dl) by receiver operating characteristic curves (area under the curve, AUC = 0.971) as well as the sternum (AUC = 0.970) and better than deltoid and forearm measurements (AUC = 0.935 and 0.893, respectively). A Bland-Altman plot, however, demonstrated that forehead measurements became less accurate as the magnitude of hyperbilirubinemia increased.
Forehead TcB correlated best with serum bilirubin levels but became less accurate at higher values. Refinements in the technology will be required before this technique, while promising, can be considered for routine clinical application in adults being evaluated for hyperbilirubinemia.
Hepatic Function; Trauma; Transplantation; Spectroscopy
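The Bland-Altman analysis cited above quantifies agreement between two measurement methods rather than mere correlation. A minimal sketch, using entirely hypothetical paired serum and forehead TcB values, computes the bias (mean difference) and 95% limits of agreement; a difference that grows with the measured value is the pattern of worsening accuracy the abstract describes.

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement
    between paired measurements from two methods."""
    diffs = [b - a for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)  # sample SD of the differences
    return bias, (bias - spread, bias + spread)

# Hypothetical paired values (mg/dl): TcB underestimates more as
# serum bilirubin rises, mimicking the reported loss of accuracy.
serum = [1.0, 2.0, 4.0, 8.0, 12.0]
tcb   = [1.1, 2.0, 3.6, 6.9, 9.8]
bias, (lower, upper) = bland_altman(serum, tcb)
print(round(bias, 2), round(lower, 2), round(upper, 2))
```

In practice the differences are also plotted against the pairwise means, which is how a magnitude-dependent bias like the one reported becomes visually apparent.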
Light before and during acute illness has been associated with both benefit and harm in animal models and small human studies. Our objective was to determine the associations of light duration (photoperiod) and intensity (insolation) before and during critical illness with hospital mortality in ICU patients. Based on the 'winter immunoenhancement' theory, we tested the hypothesis that a shorter photoperiod before critical illness is associated with improved survival.
We analyzed data from 11,439 patients admitted to 8 ICUs at the University of Pittsburgh Medical Center between June 30, 1999 and July 31, 2004. Daily photoperiod and insolation before and after ICU admission were estimated for each patient using data provided by the United States Naval Observatory and the National Aeronautics and Space Administration, together with direct measurement of the light gradient from outside to the bedside for each ICU room. Our primary outcome was hospital mortality. The association between light and risk of death was analyzed using multivariate analyses, adjusting for potential confounders, including severity of illness, case mix, and ICU type.
The cohort had a mean APACHE III score of 52.9 and a hospital mortality of 10.7%. In total, 128 ICU beds were analyzed; 108 (84%) had windows. Pre-illness photoperiod ranged from 259 to 421 hours in the prior month. A shorter photoperiod was associated with a reduced risk of death: for each 1-hour decrease, the adjusted OR was 0.997 (95% CI 0.994–0.999, p = 0.03). In the ICU, there was near-complete (99.6%) degradation of natural light from outside to the ICU bed. Thus, light exposure once in the ICU approached zero; the 24-hour insolation was 0.005 ± 0.003 kWh/m² with little diurnal variation. There was no association between ICU photoperiod or insolation and mortality.
Consistent with the winter immunoenhancement theory, a shorter photoperiod in the month before critical illness is associated with a reduced risk of death. Once in the ICU, patients are exposed to near negligible natural light despite the presence of windows. Further studies are warranted to determine the underlying mechanisms and whether manipulating light exposure, before or during ICU admission, can enhance survival.
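A per-hour odds ratio of 0.997 is easier to interpret when scaled to the cohort's observed photoperiod range (259–421 hours in the prior month). Assuming the adjusted effect is linear on the log-odds scale (an assumption for illustration, not something reported in the study), a per-unit odds ratio scales by exponentiation:

```python
import math

def scale_or(per_unit_or, units):
    """Scale a per-unit odds ratio to a multi-unit contrast,
    assuming the effect is linear on the log-odds scale."""
    return math.exp(units * math.log(per_unit_or))

per_hour = 0.997
# A 30-hour shorter pre-illness photoperiod (illustrative contrast):
print(round(scale_or(per_hour, 30), 3))
# The full observed range, 421 - 259 = 162 hours:
print(round(scale_or(per_hour, 162), 3))
```

Under this assumption, a modest per-hour effect compounds into a substantially lower odds of death across the full observed photoperiod range.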
The pathogenesis of sepsis is complex and, unfortunately, poorly understood. The cellular process of autophagy is believed to play a protective role in sepsis; however, the mechanisms responsible for its regulation in this setting are ill defined. In the present study, interferon regulatory factor 1 (IRF-1) was found to regulate the autophagic response in lipopolysaccharide (LPS)-stimulated macrophages. In vivo, tissue macrophages obtained from LPS-stimulated IRF-1 knockout (KO) mice demonstrated increased autophagy and decreased apoptosis compared to those isolated from IRF-1 wild-type (WT) mice. In vitro, LPS-stimulated peritoneal macrophages obtained from IRF-1 KO mice exhibited increased autophagy and decreased apoptosis. IRF-1 mediates the inhibition of autophagy by modulating the activation of the mammalian target of rapamycin (mTOR). LPS induced the activation of mTOR in WT peritoneal macrophages, but not in IRF-1 KO macrophages. In contrast, overexpression of IRF-1 alone increased the activation of mTOR and consequently decreased autophagic flux. Furthermore, the inhibitory effect of IRF-1 on mTOR activity was mediated by nitric oxide (NO). Therefore, we propose a novel role for IRF-1 and NO in the regulation of macrophage autophagy during LPS stimulation in which IRF-1/NO inhibits autophagy through mTOR activation.
Trauma/hemorrhagic shock (T/HS) results in cytokine-mediated acute inflammation that is generally considered detrimental.
Paradoxically, plasma levels of the early inflammatory cytokine TNF-α (but not IL-6, IL-10, or NO2-/NO3-) were significantly elevated within 6 h post-admission in 19 human trauma survivors vs. 4 non-survivors. Moreover, plasma TNF-α was inversely correlated with the Marshall Score, an index of organ dysfunction, both in the 23 patients taken together and in the survivor cohort. Accordingly, we hypothesized that if an early, robust pro-inflammatory response were a marker of an appropriate response to injury, then individuals exhibiting such a response would be predisposed to survive. We tested this hypothesis in swine subjected to various experimental paradigms of T/HS. Twenty-three anesthetized pigs were subjected to T/HS (12 HS-only and 11 HS + thoracotomy; mean arterial pressure of 30 mmHg for 45–90 min), along with surgery-only controls. Plasma obtained pre-surgery, at baseline post-surgery, at the beginning of HS, and every 15 min thereafter until 75 min (HS-only group) or 90 min (HS + thoracotomy group) was assayed for TNF-α, IL-6, IL-10, and NO2-/NO3-. Mean TNF-α levels over the post-surgery and HS interval were significantly higher in survivors vs. non-survivors, while non-survivors exhibited no measurable change in TNF-α levels over the same interval.
Contrary to the current dogma, survival in the setting of severe, acute T/HS appears to be associated with an immediate increase in serum TNF-α. It is currently unclear if this response was the cause of this protection, a marker of survival, or both. This abstract won a Young Investigator Travel Award at the SHOCK 2008 meeting in Cologne, Germany.
The Duffy blood group Ag (dfy) binds selective CXC and CC chemokines at high affinity and is expressed on erythrocytes and endothelial cells. However, it does not transmit a signal via G proteins, as occurs with other seven-transmembrane receptors. We hypothesized that dfy functions as a chemokine reservoir and regulates inflammation by altering soluble chemokine concentrations in the blood and tissue compartments. We determined whether Duffy Ag “loss-of-function” phenotypes (human and murine) are associated with alterations in plasma chemokine concentrations during the innate inflammatory response to LPS. Plasma CXCL8 and CCL2 concentrations from humans homozygous for the GATA-1 box polymorphism, a dfy polymorphism that abrogates erythrocyte chemokine binding, were higher than in heterozygotes following LPS stimulation of their whole blood in vitro. Similarly, dfy−/− mice showed higher plasma MIP-2 concentrations than dfy+/+ mice following LPS stimulation of whole blood in vitro. We then determined the relative contributions of erythrocyte and endothelial Duffy Ag in modifying chemokine concentrations and neutrophil recruitment in the lungs following intratracheal LPS administration in dfy−/− and dfy+/+ mice reconstituted with dfy−/− or dfy+/+ marrow. Mice lacking endothelial dfy expression had higher MIP-2 and keratinocyte chemoattractant concentrations in the airspaces. Mice lacking erythrocyte dfy had higher MIP-2 and keratinocyte chemoattractant concentrations in the lung tissue vascular space, but lower plasma chemokine concentrations associated with attenuated neutrophil recruitment into the airspaces. These data indicate that dfy alters soluble chemokine concentrations in blood and local tissue compartments and enhances systemic bioavailability of chemokines produced during local tissue inflammation.
Ischemic tissues require mechanisms to alert the immune system of impending cell damage. The nuclear protein high-mobility group box 1 (HMGB1) can activate inflammatory pathways when released from ischemic cells. We elucidate the mechanism by which HMGB1, one of the key alarm molecules released during liver ischemia/reperfusion (I/R), is mobilized in response to hypoxia. HMGB1 release from cultured hepatocytes was found to be an active process regulated by reactive oxygen species (ROS). Optimal production of ROS and subsequent HMGB1 release by hypoxic hepatocytes required intact Toll-like receptor (TLR) 4 signaling. To elucidate the downstream signaling pathways involved in hypoxia-induced HMGB1 release from hepatocytes, we examined the role of calcium signaling in this process. HMGB1 release induced by oxidative stress was markedly reduced by inhibition of calcium/calmodulin-dependent kinases (CaMKs), a family of proteins involved in a wide range of calcium-linked signaling events. In addition, CaMK inhibition substantially decreased liver damage after I/R and resulted in accumulation of HMGB1 in the cytoplasm of hepatocytes. Collectively, these results demonstrate that hypoxia-induced HMGB1 release by hepatocytes is an active, regulated process that occurs through a mechanism promoted by TLR4-dependent ROS production and downstream CaMK-mediated signaling.
Chronic obstructive pulmonary disease (COPD) is a heterogeneous syndrome characterized by varying degrees of airflow limitation and diffusion impairment. There is increasing evidence to suggest that COPD is also characterized by systemic inflammation. The primary goal of this study was to identify soluble proteins in plasma that associate with the severity of airflow limitation in a COPD cohort with stable disease. A secondary goal was to assess whether unique markers associate with diffusion impairment, based on diffusion capacity of carbon monoxide (DLCO), independent of the forced expiratory volume in 1 second (FEV1).
A cross-sectional study of 73 COPD subjects was performed in order to examine the association of 25 different plasma proteins with the severity of lung function impairment, as defined by baseline measurements of the % predicted FEV1 and the % predicted DLCO. Plasma protein concentrations were assayed using multiplexed immunobead-based cytokine profiling. Associations between lung function and protein concentrations were adjusted for age, gender, pack-year smoking history, current smoking, inhaled corticosteroid use, systemic corticosteroid use, and statin use.
Plasma concentrations of CCL2/monocyte chemoattractant protein-1 (CCL2/MCP-1), CCL4/macrophage inflammatory protein-1β (CCL4/MIP-1β), CCL11/eotaxin, and interleukin-13 (IL-13) were inversely associated with the % predicted FEV1. Plasma concentrations of soluble Fas were associated with the % predicted DLCO, whereas CXCL9/monokine induced by interferon-γ (CXCL9/Mig), granulocyte colony-stimulating factor (G-CSF), and IL-13 showed inverse relationships with the % predicted DLCO.
Systemic inflammation in a COPD cohort is characterized by cytokines implicated in inflammatory cell recruitment and airway remodeling. Plasma concentrations of IL-13 and chemoattractants for monocytes, T lymphocytes, and eosinophils show associations with increasing severity of disease. Soluble Fas, G-CSF and CXCL9/Mig may be unique markers that associate with disease characterized by disproportionate abnormalities in DLCO independent of the FEV1.