Both hyperlactatemia and persistence of hyperlactatemia have been associated with poor outcome. We compared lactate and lactate-derived variables in outcome prediction.
Retrospective observational study. Case records from 2,251 consecutive intensive care unit (ICU) patients admitted between 2001 and 2007 were analyzed. Baseline characteristics, all lactate measurements, and in-hospital mortality were recorded. The time integral of arterial blood lactate levels above the upper normal threshold of 2.2 mmol/L (lactate-time-integral), maximum lactate (max-lactate), and time-to-first-normalization were calculated. Survivors and nonsurvivors were compared, and receiver operating characteristic (ROC) analysis was applied.
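The lactate-derived variables described here reduce to straightforward arithmetic on each patient's measurement series. As an illustration only (the function names and data below are ours, not the study's), the lactate-time-integral can be computed by trapezoidal integration of the excess above the 2.2 mmol/L threshold:

```python
# Sketch of the lactate-derived variables described above (hypothetical data).
# lactate-time-integral: area of the arterial lactate curve above 2.2 mmol/L,
# in min·mmol/L; time-to-first-normalization: first time lactate <= 2.2 mmol/L.

THRESHOLD = 2.2  # upper normal limit, mmol/L

def lactate_time_integral(times_min, lactate):
    """Trapezoidal area of lactate above THRESHOLD (min·mmol/L)."""
    excess = [max(v - THRESHOLD, 0.0) for v in lactate]
    area = 0.0
    for (t0, e0), (t1, e1) in zip(zip(times_min, excess),
                                  zip(times_min[1:], excess[1:])):
        area += 0.5 * (e0 + e1) * (t1 - t0)
    return area

def time_to_first_normalization(times_min, lactate):
    """First measurement time at which lactate is within the normal range."""
    for t, v in zip(times_min, lactate):
        if v <= THRESHOLD:
            return t
    return None  # never normalized during the observation window

times = [0, 60, 120, 240]       # minutes after ICU admission (hypothetical)
values = [4.2, 3.2, 2.2, 1.8]   # mmol/L (hypothetical)
print(round(lactate_time_integral(times, values), 1))  # 120.0 min·mmol/L
print(time_to_first_normalization(times, values))      # 120 min
```

A patient whose lactate never exceeds 2.2 mmol/L contributes zero area and a time-to-first-normalization of 0 min, which is consistent with the survivor medians of 0 reported below.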
A total of 20,755 lactate measurements were analyzed. Data are shown as median [interquartile range]. In nonsurvivors (n = 405) lactate-time-integral (192 [0–1881] min·mmol/L) and time-to-first-normalization (44.0 [0–427] min) were higher than in hospital survivors (n = 1846; 0 [0–134] min·mmol/L and 0 [0–75] min, respectively; all p < 0.001). Normalization of lactate <6 hours after ICU admission revealed better survival compared with normalization of lactate >6 hours (mortality 16.6% vs. 24.4%; p < 0.001). The area under the ROC curve to predict in-hospital mortality was largest for max-lactate, whereas it did not differ among all other lactate-derived variables (all p > 0.05). The area under the ROC curves for admission lactate and lactate-time-integral was not different (p = 0.36).
Hyperlactatemia is associated with in-hospital mortality in a heterogeneous ICU population. In our patients, lactate peak values predicted in-hospital mortality equally well as lactate-time-integral of arterial blood lactate levels above the upper normal threshold.
Lactate; Critically ill; Intensive care units; In-hospital mortality
Human serum albumin (HSA) has long been used as a resuscitation fluid in critically ill patients. It is known to exert several important physiological and pharmacological functions. Among them, the antioxidant properties seem to be of paramount importance, as they may be implicated in the potential beneficial effects that have been observed in the critical care and hepatological settings. The specific antioxidant functions of the protein are closely related to its structure. Indeed, they are due to its multiple ligand-binding capacities and free radical-trapping properties. The HSA molecule can undergo various structural changes modifying its conformation and hence its binding properties and redox state. Such chemical modifications can occur during bioprocesses and storage conditions of the commercial HSA solutions, resulting in heterogeneous solutions for infusion. In this review, we explore the mechanisms that are responsible for the specific antioxidant properties of HSA in its native form, chemically modified forms, and commercial formulations. To conclude, we discuss the implications of this recent literature for future clinical trials using albumin as a drug and for elucidating the effects of HSA infusion in critically ill patients.
Human serum albumin; Antioxidant force; Oxidized albumin; Critically ill patients
Red blood cell (RBC) storage facilitates the supply of RBCs to meet the clinical demand for transfusion and to avoid wastage. However, RBC storage is associated with adverse changes in erythrocytes and their preservation medium. These changes are responsible for functional alterations and for the accumulation of potentially injurious bioreactive substances. They also may have clinically harmful effects, especially in critically ill patients. The clinical consequences of storage lesions, however, remain a matter of persistent controversy. Multiple retrospective, observational, and single-center studies have reported heterogeneous and conflicting findings about the effect of blood storage duration on morbidity and/or mortality in trauma, cardiac surgery, and intensive care unit patients. Describing the details of this controversy, this review not only summarizes the current literature but also highlights the equipoise that currently exists regarding the use of short versus current standard (extended) storage duration red cells in critically ill patients. It also supports the need for large, randomized, controlled trials evaluating the clinical impact of transfusing fresh (short duration of storage) versus older (extended duration of storage) red cells in critically ill patients.
Age of red blood cells; Storage lesion; Critically ill patients; Outcome; Transfusion; Anemia; Trauma; Cardiac surgery; Mortality; Cytokines
A history of prolonged and excessive consumption of alcohol increases the risk for infections. The goal of this study was to investigate circulating white blood cells (WBC) differentiated by flow cytometry and neutrophil CD64 expression in excessive alcohol drinkers versus abstinent or moderate drinkers, and in those with or without infection, in medical patients admitted to the intensive care unit (ICU).
All patients admitted between September 2009 and March 2010 with an ICU-stay of 3 days or more were eligible for inclusion. Upon admission, hematological exams were conducted by flow cytometry.
Overall, 281 adults were included, with 37% identified as at-risk drinkers. The only significant difference found in circulating WBC between at-risk and not-at-risk drinkers was a lower number of B lymphocytes in at-risk drinkers (P = 0.002). Four groups of patients were defined: not-at-risk drinkers with no infection (n = 66); not-at-risk drinkers with infection (n = 112); at-risk drinkers with no infection (n = 53); and at-risk drinkers with infection (n = 50). Whilst the presence of infection significantly reduced levels of noncytotoxic and cytotoxic T lymphocytes and significantly increased levels of CD16– monocytes in not-at-risk drinkers, with variation related to infection severity, infection had no effect on any of the variables assessed in at-risk drinkers. Post-hoc comparisons showed that B-lymphocyte, noncytotoxic and cytotoxic T-lymphocyte, and CD16– monocyte counts in at-risk drinkers were similar to those in not-at-risk drinkers with infection and significantly lower than those in not-at-risk drinkers without infection. Neutrophil CD64 index varied significantly between groups, with variations related to infection, not previous alcohol consumption.
These results show that chronic alcohol exposure has an impact on the immune response to infection in critically ill medical patients. The absence of significant variations in circulating WBC seen in at-risk drinkers according to the severity of infection is suggestive of altered immune response.
Alcohol; At-risk drinking; Intensive care unit; Infection; Flow cytometry; CD64 cells
Delirium features can vary greatly depending on the postoperative population studied; however, most studies focus only on high-risk patients. Describing the impact of delirium and risk factors in mixed populations can help in the development of preventive actions.
The occurrence of delirium was evaluated prospectively in 465 consecutive nonventilated postoperative patients admitted to a surgical intensive care unit (SICU) using the confusion assessment method (CAM). Patients with and without delirium were compared. A multiple logistic regression was performed to identify the main risk factors for delirium in the first 24 h of admission to the SICU and the main predictors of outcomes.
Delirium was diagnosed in 43 (9.2%) individuals and was more frequent on the second and third days of admission. The presence of delirium resulted in longer lengths of SICU and hospital stays [6 days (3–13) vs. 2 days (1–3), p < 0.001 and 26 days (12–39) vs. 6 days (3–13), p < 0.001, respectively], as well as higher hospital and SICU mortality rates [16.3% vs. 4.0%, p = 0.004 and 6.5% vs. 1.7%, p = 0.042, respectively]. The risk factors for delirium were age (odds ratio (OR), 1.04 [1.02-1.07]), Acute Physiologic Score (APS; OR, 1.11 [1.04-1.2]), emergency surgery (OR, 8.05 [3.58-18.06]), the use of benzodiazepines (OR, 2.28 [1.04-5.00]), and trauma (OR, 6.16 [4.1-6.5]).
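As a side note on reading these estimates: an odds ratio from a logistic model is the exponentiated regression coefficient (OR = e^β), so per-unit ORs compound multiplicatively over larger differences. A minimal illustration using the age OR of 1.04 reported above (the calculation, not the data, is ours):

```python
import math

# OR = exp(beta): the per-year age odds ratio of 1.04 implies a
# log-odds coefficient beta = ln(1.04).
beta_age = math.log(1.04)

# Per-unit ORs compound multiplicatively: over a 10-year age difference
# the odds of delirium are multiplied by 1.04 ** 10.
or_10_years = math.exp(10 * beta_age)
print(round(or_10_years, 2))  # 1.48
```

That is, under this model a 10-year age difference raises the odds of delirium by roughly 48%, holding the other covariates fixed.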
Delirium negatively impacts postoperative nonventilated patients. Risk factors can be used to detect high-risk patients in a mixed population of SICU patients.
Delirium; Postoperative; Surgery; Confusion assessment method
Delirium is characterized by a disturbance of consciousness with accompanying change in cognition. Delirium typically manifests as a constellation of symptoms with an acute onset and a fluctuating course. Delirium is extremely common in the intensive care unit (ICU) especially amongst mechanically ventilated patients. Three subtypes have been recognized: hyperactive, hypoactive, and mixed. Delirium is frequently undiagnosed unless specific diagnostic instruments are used. The CAM-ICU is the most widely studied and validated diagnostic instrument. However, the accuracy of this tool may be less than ideal without adequate training of the providers applying it. The presence of delirium has important prognostic implications; in mechanically ventilated patients it is associated with a 2.5-fold increase in short-term mortality and a 3.2-fold increase in 6-month mortality. Nonpharmacological approaches, such as physical and occupational therapy, decrease the duration of delirium and should be encouraged. Pharmacological treatment for delirium traditionally includes haloperidol; however, more data for haloperidol are needed given the paucity of placebo-controlled trials testing its efficacy to treat delirium in the ICU. Second-generation antipsychotics have emerged as an alternative for the treatment of delirium, and they may have a better safety profile. Dexmedetomidine may prove to be a valuable adjunctive agent for patients with delirium in the ICU.
Delirium; Critical illness; Coma; Sedatives; Antipsychotics
The intra-abdominal pressure (IAP) is an important clinical parameter that can significantly change during respiration. Currently, IAP is recorded at end-expiration (IAPee), while continuous IAP changes during respiration (ΔIAP) are ignored. Herein, a novel concept of considering continuous IAP changes during respiration is presented.
Based on the geometric mean of the IAP waveform (MIAP), a mathematical model was developed for calculating respiratory-integrated MIAP (i.e., MIAPri = IAPee + i·ΔIAP), where 'i' is the decimal fraction of the inspiratory time, and where ΔIAP can be calculated as the difference between the IAP at end-inspiration (IAPei) minus IAPee. The effect of various parameters on IAPee and MIAPri was evaluated with a mathematical model and validated afterwards in six mechanically ventilated patients. The MIAP of the patients was also calculated using a CiMON monitor (Pulsion Medical Systems, Munich, Germany). Several other parameters were recorded and used for comparison.
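Under the assumptions of this model, MIAPri is simply a time-weighted mean of the end-expiratory and end-inspiratory pressures, weighted by the inspiratory fraction of the respiratory cycle. A minimal numeric sketch (illustrative values only, not patient data):

```python
def miap_ri(iap_ee, iap_ei, insp_time_s, exp_time_s):
    """Respiratory-integrated mean IAP: MIAPri = IAPee + i * dIAP,
    where i is the inspiratory fraction of the respiratory cycle and
    dIAP = IAPei - IAPee (all pressures in mmHg)."""
    i = insp_time_s / (insp_time_s + exp_time_s)
    return iap_ee + i * (iap_ei - iap_ee)

# Hypothetical ventilated patient: IAPee 10 mmHg, IAPei 16 mmHg, I:E = 1:2.
# MIAPri exceeds the classic end-expiratory reading whenever dIAP > 0.
print(round(miap_ri(10.0, 16.0, 1.0, 2.0), 1))  # 12.0 mmHg
```

With ΔIAP = 0 (no respiratory swing) the formula collapses to the classic IAPee, which is why MIAPri ≥ IAPee in ventilated patients.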
The human study confirmed the mathematical modelling, showing that MIAPri correlates well with MIAP (R² = 0.99); MIAPri was significantly higher than IAPee under all conditions that were used to examine the effects of changes in IAPee, the inspiratory/expiratory (I:E) ratio, and ΔIAP (P < 0.001). Univariate Pearson regression analysis showed significant correlations between MIAPri and IAPei (R = 0.99), IAPee (R = 0.99), and ΔIAP (R = 0.78) (P < 0.001); multivariate regression analysis confirmed that IAPee (mainly affected by the level of positive end-expiratory pressure, PEEP), ΔIAP, and the I:E ratio are independent variables (P < 0.001) determining MIAP. According to the results of a regression analysis, MIAP can also be calculated as
We believe that the novel concept of MIAP is a better representation of IAP (especially in mechanically ventilated patients) because MIAP takes into account the IAP changes during respiration. The MIAP can be estimated by the MIAPri equation. Since MIAPri is almost always greater than the classic IAP, this may have implications for end-organ function during intra-abdominal hypertension. Further clinical studies are necessary to evaluate the physiological effects of MIAP.
Monitoring hepatic blood flow and function might be crucial in treating critically ill patients. Intra-abdominal hypertension is associated with decreased abdominal blood flow, organ dysfunction, and increased mortality. The plasma disappearance rate (PDR) of indocyanine green (ICG) is considered to be a compound marker for hepatosplanchnic perfusion and hepatocellular membrane transport and correlates well with survival in critically ill patients. However, correlation between PDRICG and intra-abdominal pressure (IAP) remains poorly understood. The aim of this retrospective study was to investigate the correlation between PDRICG and classic liver laboratory parameters, IAP and abdominal perfusion pressure (APP). The secondary goal was to evaluate IAP, APP, and PDRICG as prognostic factors for mortality.
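For readers unfamiliar with APP: it is conventionally defined as mean arterial pressure minus intra-abdominal pressure (APP = MAP − IAP), by analogy with cerebral perfusion pressure. A minimal sketch (the function name and values are ours, purely illustrative):

```python
def abdominal_perfusion_pressure(map_mmhg, iap_mmhg):
    """APP = MAP - IAP, in mmHg (the conventional definition)."""
    return map_mmhg - iap_mmhg

# At a fixed MAP of 70 mmHg, rising IAP progressively erodes APP,
# which is the rationale for tracking both pressures together.
for iap in (8, 12, 16, 20):
    print(iap, abdominal_perfusion_pressure(70, iap))  # 20 mmHg -> APP 50 mmHg
```

This dependence of APP on IAP is why a positive PDRICG-APP correlation and a negative PDRICG-IAP correlation, as reported below, are two views of the same phenomenon.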
A total of 182 paired IAP and PDRICG measurements were performed in 40 critically ill patients. The mean values per patient were used for comparison. The IAP was measured using either a balloon-tipped stomach catheter connected to an IAP monitor (Spiegelberg, Hamburg, Germany, or CiMON, Pulsion Medical Systems, Munich, Germany) or a bladder FoleyManometer (Holtech Medical, Charlottenlund, Denmark). PDRICG was measured at the bedside using the LiMON device (Pulsion Medical Systems, Munich, Germany). Primary endpoint was hospital mortality.
There was no significant correlation between PDRICG and classic liver laboratory parameters, but PDRICG did correlate significantly with APP (R = 0.62) and was inversely correlated with IAP (R = -0.52). Changes in PDRICG were associated with significant concomitant changes in APP (R = 0.73) and opposite changes in IAP (R = 0.61). The IAP was significantly higher (14.6 ± 4.6 vs. 11.1 ± 5.3 mmHg, p = 0.03), and PDRICG (10 ± 8.3 vs. 15.9 ± 5.2%, p = 0.02) and APP (43.6 ± 9 vs. 57.9 ± 12.2 mmHg, p < 0.0001) were significantly lower in non-survivors.
PDRICG is positively correlated to APP and inversely correlated to IAP. Changes in APP are associated with significant concomitant changes in PDRICG, while changes in IAP are associated with opposite changes in PDRICG, suggesting that an increase in IAP may compromise hepatosplanchnic perfusion. Both PDRICG and IAP are correlated with outcome. Measurement of PDRICG may be a useful additional clinical tool to assess the negative effects of increased IAP on liver perfusion and function.
Little is known about the effects of renal replacement therapy (RRT) with fluid removal on intra-abdominal pressure (IAP). The global end-diastolic volume index (GEDVI) and extravascular lung water index (EVLWI) can easily be measured bedside by transpulmonary thermodilution (TPTD). The aim of this study is to evaluate the changes in IAP, GEDVI and EVLWI in critically ill patients receiving slow extended daily dialysis (SLEDD) or continuous venovenous haemofiltration (CVVH) with the intention of net fluid removal.
We performed a retrospective cohort study in ICU patients who were treated with SLEDD or CVVH and in whom IAP was also measured. RRT sessions were excluded when the dose of vasoactive medication needed to be changed between the pre- and post-dialysis TPTD measurements or when net fluid loss did not exceed 500 ml. The TPTD measurements were performed within 2 h before and after SLEDD; in the case of CVVH, before and after an interval of 12 h.
We studied 25 consecutive dialysis sessions in nine patients with acute renal failure and cardiogenic or non-cardiogenic pulmonary oedema. The GEDVI and EVLWI values before dialysis were 877 ml/m² and 14 ml/kg, respectively. Average net ultrafiltration per session was 3.6 l, with a net fluid loss of 1.9 l. The GEDVI decreased significantly during dialysis, although by no more than 47.8 ml/m² (p = 0.008), as did the EVLWI, by 1 ml/kg (p = 0.03). The IAP decreased significantly, from 12 to 10.5 mmHg (p < 0.0001).
Net fluid removal by SLEDD or CVVH in the range observed in this study decreased IAP, GEDVI, and EVLWI in critically ill patients, although the EVLWI reduction was modest.
Conservative treatment of patients with severe acute pancreatitis (SAP) may be associated with development of intra-abdominal hypertension (IAH), deterioration of visceral perfusion and increased risk of multiple organ dysfunction. Fluid balance is essential for maintenance of adequate organ perfusion and control of the third space. Timely application of continuous veno-venous haemofiltration (CVVH) may help in balancing fluid replacement and removal of cytokines from the blood and tissue compartments. The aim of the present study was to determine whether CVVH can be recommended as a constituent of conservative treatment in patients with SAP who suffer IAH.
A retrospective analysis of 10 years' experience with low-flow CVVH application in patients with SAP who develop IAH was performed. In all patients, measurement of the intra-abdominal pressure (IAP) was done indirectly through the urinary bladder. Sequential organ failure assessment (SOFA) score was calculated for severity assessment, and necrotizing forms were verified by contrast-enhanced computed tomography. Dynamics of IAP were analysed in parallel with signs of systemic inflammation, dynamics of C-reactive protein and cumulative fluid balance. All variables, complication rate and outcomes were analysed in the whole group and in patients with IAH (CVVH and no-CVVH groups).
Of the total of 130 patients, 75 were treated with application of CVVH and 55 without CVVH. Late hospitalization was associated with application of CVVH. Infection was observed in 28.5% of cases regardless of the type of treatment received, with a similar necessity for surgical intervention. IAH was observed in 68.5% of patients, who had significantly higher SOFA scores than patients with normal IAP. CVVH treatment resulted in a negative cumulative fluid balance starting from day 5 in patients with IAH, whereas without this treatment, fluid balance remained increasingly positive after a week. Finally, application of CVVH resulted in a lower infection rate and shorter hospital stay: 26.7% vs. 37.9%, and a median of 32 (interquartile range (IQR) 12 to 60) days vs. 24 (IQR 4 to 34) days, p = 0.05, comparing the CVVH vs. no-CVVH group. Mortality reached 11.7% in the CVVH group and 13.8% in the no-CVVH group.
Early application of CVVH facilitates negative fluid balance and reduction of IAH in patients with SAP; it is not associated with increased infection or mortality rate and may reduce hospital stay.
Mechanical ventilation (MV) is considered a predisposing factor for increased intra-abdominal pressure (IAP), especially when positive end-expiratory pressure (PEEP) is applied or in the presence of auto-PEEP. So far, no prospective data exist on the effect of MV on IAP. This study aimed to assess the effects of MV on IAP in a group of critically ill patients with no other risk factors for intra-abdominal hypertension (IAH).
An observational multicenter study was conducted on a total of 100 patients divided into two groups: 50 patients without MV and 50 patients with MV. All patients were admitted to the intensive care units of the Medical and Surgical Research Centre, the Carlos J. Finlay Hospital, the Julio Trigo University Hospital, and the Calixto García Hospital, in Havana, Cuba, between July 2000 and December 2004. The IAP was measured twice daily, beginning on admission, using a standard transurethral technique. IAH was defined as an IAP greater than 12 mmHg. Correlations were made between IAP and body mass index (BMI), diagnostic category, gender, age, and ventilatory parameters.
The mean IAP in patients on MV was 6.7 ± 4.1 mmHg and significantly higher than in patients without MV (3.6 ± 2.4 mmHg, p < 0.0001). This difference was maintained regardless of gender, age, BMI, and diagnosis. The use of MV and BMI were independent predictors for IAH for the whole population, while male gender, assisted ventilation mode, and the use of PEEP were independent factors associated with IAH in patients on MV.
In this study, MV was identified as an independent predisposing factor for the development of IAH. Critically ill patients on MV present with higher IAP values on admission and should be monitored very closely, especially if PEEP is applied, even when they have no other apparent risk factors for IAH.
Application of abdominal negative-pressure therapy (NPT) is lifesaving when conservative measures fail to reduce a sustained increase in intra-abdominal pressure and source control cannot be achieved in a single operation because of severe peritonitis. The aim of this study was to share the initial experience with abdominal NPT in Latvia and to provide a review of the relevant literature.
In total, 22 patients were included. All patients were treated with KCI® ABThera™ NPT systems. Acute Physiology and Chronic Health Evaluation II (APACHE II) score on admission, daily sequential organ failure assessment score and Mannheim peritonitis index (MPI) were calculated for severity definition. The frequency of NPT system changes, daily amount of aspirated fluid effluent and the time of abdominal closure were assessed. The overall hospital and ICU stay, as well as the outcomes and the complication rate, were analysed.
A complicated intra-abdominal infection was treated in 18 patients. Abdominal compartment syndrome due to severe acute pancreatitis (SAP), secondary ileus and damage control in polytrauma were indications for NPT in four patients. The median age of the patients was 59 years (range, 28 to 81), median APACHE II score was 15 points (range, 9 to 32) and median MPI was 28 points (range, 21 to 40), indicating a prognostic mortality risk of 60%. Sepsis developed in all patients, and in 20 of them, it was severe. NPT systems were changed on a median of every 4 days, and abdominal closure was feasible on the seventh postoperative day without the need for a repeat laparotomy. Two NPT systems were removed due to bleeding from the retroperitoneal space in patients with SAP. Intestinal fistulae developed in three patients and were successfully treated conservatively. Incisional hernia occurred in three patients. The overall ICU and hospital stay were 14 (range, 5 to 56) and 25 days (range, 10 to 87), respectively. Only one patient died, giving an overall mortality of 4.5%.
Application of abdominal NPT could be a very promising technique for the control of sustained intra-abdominal hypertension and management of severe sepsis due to purulent peritonitis. Further trials are justified for a detailed evaluation of abdominal NPT indications.
Acute kidney insufficiency (AKI) occurs frequently in intensive care units (ICUs). In the management of vascular access for renal replacement therapy (RRT), several factors need to be taken into consideration to achieve an optimal RRT dose and to limit complications. In the medium and long term, some individuals may become chronic dialysis patients, so preserving the vascular network is of major importance. Few studies have focused on the use of dialysis catheters (DC) in ICUs, and clinical practice is driven by the knowledge and management of long-term dialysis catheters in chronic dialysis patients and of central venous catheters in ICU patients. This review describes the appropriate use and management of DCs required to obtain an accurate RRT dose and to reduce mechanical and infectious complications in the ICU setting. To deliver the best RRT dose, the length and diameter of the catheter need to be sufficient. In patients on intermittent hemodialysis, right internal jugular insertion is associated with a higher delivered dialysis dose if the prescribed extracorporeal blood flow is higher than 200 ml/min. To prevent DC colonization, the physician has to be vigilant regarding the jugular position when BMI < 24 and the femoral position when BMI > 28. Subclavian sites should be excluded. Ultrasound guidance should be used, especially at jugular sites. Antibiotic-impregnated dialysis catheters and antibiotic locks are not recommended in routine practice. The efficacy of ethanol and citrate locks has yet to be demonstrated. Hygiene procedures must be respected during DC insertion and manipulation.
Dialysis catheter; Intensive care unit; Catheter dysfunction; Catheter infection
Adverse events (AEs) frequently occur in intensive care units (ICUs) and negatively affect patient outcomes. Targeted improvement strategies for patient safety are difficult to evaluate because of the intrinsic limitations of reporting crude AE rates. Single interventions positively influence the quality of care, but a multifaceted approach has been tested only in selected cases. The present study was designed to evaluate the rate, types, and contributing factors of emerging AEs and to test the hypothesis that a multifaceted intervention on medication might reduce drug-related AEs.
This is a prospective, multicenter, before-and-after study of adult patients admitted to four ICUs during a 24-month period. Voluntary, anonymous, self-reporting of AEs was performed using a detailed, locally designed questionnaire. The temporal impact of a multifaceted implementation strategy to reduce drug-related AEs was evaluated using the risk-index scores methodology.
A total of 2,047 AEs were reported (32 events per 100 ICU patient admissions and 117.4 events per 1,000 ICU patient days) from 6,404 patients, totaling 17,434 patient days. Nurses submitted the majority of questionnaires (n = 1,781, 87%). AEs were eye-witnessed in 49% (n = 1,003) of cases and occurred preferentially during an elective procedure (n = 1,597, 78%) and on morning shifts (n = 1,003, 49%), with a peak rate occurring around 10 a.m. Drug-related AEs were the most prevalent (n = 984, 48%), mainly as a consequence of incorrect prescriptions. Poor communication among caregivers (n = 776) and noncompliance with internal guidelines (n = 525) were the most prevalent contributing factors for AE occurrence. The majority of AEs (n = 1,155, 56.4%) were associated with minimal, temporary harm. Risk-index scores for drug-related AEs decreased from 10.01 ± 2.7 to 8.72 ± 3.52 (absolute risk difference 1.29; 95% confidence interval, 0.88-1.7; p < 0.01) following the introduction of the intervention.
AEs occurred in the ICU with a typical diurnal frequency distribution. Medication-related AEs were the most prevalent. By applying the risk-index scores methodology, we were able to demonstrate that our multifaceted implementation strategy focused on medication-related adverse events led to a decrease in drug-related incidents.
Adverse events; Medical errors; Patient safety; Quality improvement; Intensive care; Reliability
Intensivists are regularly confronted with the question of gastrointestinal bleeding. To date, the latest international recommendations regarding prevention and treatment of gastrointestinal bleeding lack a specific approach to critically ill patients. We present recommendations for management by the intensivist of gastrointestinal bleeding in adults and children, developed with the GRADE system by an expert group of the French-Language Society of Intensive Care (Société de Réanimation de Langue Française, SRLF), with the participation of the French-Language Group of Paediatric Intensive Care and Emergencies (GFRUP), the French Society of Emergency Medicine (SFMU), the French Society of Gastroenterology (SNFGE), and the French Society of Digestive Endoscopy (SFED). The recommendations cover five fields of application: management of gastrointestinal bleeding before endoscopic diagnosis, treatment of upper gastrointestinal bleeding unrelated to portal hypertension, treatment of upper gastrointestinal bleeding related to portal hypertension, management of presumed lower gastrointestinal bleeding, and prevention of upper gastrointestinal bleeding in intensive care.
Gastrointestinal bleeding; Intensive care; Ulcer; Gastric/esophageal varices; Recommendations
This clinical study evaluated the effect of a suctioning maneuver on aspiration past the cuff during mechanical ventilation.
Patients intubated for less than 48 hours with a PVC-cuffed tracheal tube, under mechanical ventilation with a PEEP ≥5 cm H2O and under continuous sedation, were included in the study. At baseline the cuff pressure was set at 30 cm H2O. Then 0.5 ml of blue dye diluted with 3 ml of saline was instilled into the subglottic space just above the cuff. Tracheal suctioning was performed using a 16-French suction catheter with a suction pressure of −400 mbar. A fiberoptic bronchoscopy was performed before and after the suctioning maneuver, looking for the presence of blue dye in the folds within the cuff wall or in the trachea under the cuff. Sealing of the cuff was defined by the absence of leakage of blue dye either into the cuff wall or into the trachea under the cuff.
Twenty-five patients were included. The size of the tracheal tube was 7-mm ID in 5 patients, 7.5-mm ID in 16 patients, and 8-mm ID in 4 patients. Blue dye was never seen in the trachea under the cuff before suctioning and was seen in only one patient (4%) after the suctioning maneuver. Blue dye was observed in the folds within the cuff wall in 6 of 25 patients before suctioning and in 11 of 25 after (p = 0.063). Overall, the incidence of sealing of the cuff was 76% before suctioning and 56% after (p = 0.073).
In patients intubated with a PVC-cuffed tracheal tube and under mechanical ventilation with PEEP ≥5 cm H2O and a cuff pressure set at 30 cm H2O, a single tracheal suctioning maneuver did not increase the risk of aspiration in the trachea under the cuff.
ClinicalTrials.gov, number NCT01170156
Tube cuff; Aspiration; Suctioning maneuver
Recent clinical studies have confirmed the strong prognostic value of persistent hyperlactatemia and delayed lactate clearance in septic shock. Several potential hypoxic and nonhypoxic mechanisms have been associated with persistent hyperlactatemia, but the relative contribution of these factors has not been specifically addressed in comprehensive clinical physiological studies. Our goal was to determine potential hemodynamic and perfusion-related parameters associated with 6-hour lactate clearance in a cohort of hyperdynamic, hyperlactatemic, septic shock patients.
We conducted an acute clinical physiological pilot study that included 15 hyperdynamic, septic shock patients undergoing aggressive early resuscitation. Several hemodynamic and perfusion-related parameters were measured immediately after preload optimization and 6 hours thereafter, with 6-hour lactate clearance as the main outcome criterion. Evaluated parameters included cardiac index, mixed venous oxygen saturation, capillary refill time and central-to-peripheral temperature difference, thenar tissue oxygen saturation (StO2) and its recovery slope after a vascular occlusion test, sublingual microcirculatory assessment, gastric tonometry (pCO2 gap), and plasma disappearance rate of indocyanine green (ICG-PDR). Statistical analysis included Wilcoxon and Mann–Whitney tests.
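ICG-PDR is conventionally derived from a monoexponential fit of the indocyanine green concentration decay, expressed as the percentage of the circulating dye cleared per minute. A sketch using a log-linear least-squares fit on hypothetical readings (the function name and data are ours, not the LiMON device's internals):

```python
import math

def icg_pdr(times_min, concentrations):
    """PDR (%/min) from a log-linear least-squares fit of
    c(t) = c0 * exp(-k * t); PDR = k * 100."""
    ys = [math.log(c) for c in concentrations]
    n = len(times_min)
    mx = sum(times_min) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(times_min, ys))
             / sum((x - mx) ** 2 for x in times_min))
    return -slope * 100.0  # decay constant k = -slope, expressed in %/min

# Hypothetical decay with k = 0.20/min, i.e. a PDR of 20%/min,
# which is in the commonly cited normal range.
times = [0.0, 1.0, 2.0, 3.0, 4.0]  # minutes after injection
conc = [math.exp(-0.20 * t) for t in times]  # normalized concentration
print(round(icg_pdr(times, conc), 1))  # 20.0 %/min
```

A patient in the low-clearance subgroup below (e.g. ICG-PDR near 9.7%/min) would show a visibly flatter decay curve than this simulated normal trace.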
Five patients presented a 6-hour lactate clearance <10%. Compared with 10 patients with a 6-hour lactate clearance ≥10%, they presented a worse hepatosplanchnic perfusion as represented by significantly more severe derangements of ICG-PDR (9.7 (8–19) vs. 19.6 (9–32)%/min, p < 0.05) and pCO2 gap (33 (9.1-62) vs. 7.7 (3–58) mmHg, p < 0.05) at 6 hours. No other systemic, hemodynamic, metabolic, peripheral, or microcirculatory parameters differentiated these subgroups. We also found a significant correlation between ICG-PDR and pCO2 gap (p = 0.02).
Impaired 6-hour lactate clearance could be associated with hepatosplanchnic hypoperfusion in some hyperdynamic septic shock patients. Improvement of systemic, metabolic, and peripheral perfusion parameters does not rule out persistent hepatosplanchnic hypoperfusion in this setting. Severe microcirculatory abnormalities can be detected in hyperdynamic septic shock patients, but their role in lactate clearance is unclear. ICG-PDR may be a useful tool to evaluate hepatosplanchnic perfusion in septic shock patients with persistent hyperlactatemia.
ClinicalTrials.gov Identifier: NCT01271153
Septic shock; Hepatosplanchnic perfusion; Lactate; Microcirculation
Neuromuscular blocking agents (NMBAs), or “paralytics,” often are deployed in the sickest patients in the intensive care unit (ICU) when usual care fails. Despite the publication of guidelines on the use of NMBAs in the ICU in 2002, clinicians have needed more direction to determine which patients would benefit from NMBAs and which patients would be harmed. Recently, new evidence has shown that paralytics hold more promise when used in carefully selected lung injury patients for brief periods of time. When used in early acute respiratory distress syndrome (ARDS), NMBAs help establish a lung-protective ventilation strategy, which leads to improved oxygenation, decreased pulmonary and systemic inflammation, and potentially improved mortality. It also is increasingly recognized that NMBAs can cause harm, particularly critical illness polyneuromyopathy (CIPM), when used for prolonged periods or in septic shock. In this review, we address several practical considerations for clinicians who use NMBAs in their practice. Ultimately, we conclude that NMBAs should be considered a lung-protective adjuvant in early ARDS and that clinicians should consider using an alternative NMBA to the aminosteroids in septic shock with less severe lung injury pending further studies.
Neuromuscular blocking agents; Neuromuscular nondepolarizing agents; Polyneuropathies; Respiratory distress syndrome; Adult; Cisatracurium; Status asthmaticus; Shock; Septic
Thrombocytopenia is a very frequent disorder in the intensive care unit. Many etiologies must be considered, and therapeutic approaches differ according to these causes. However, no guideline exists regarding optimum practices for these situations in critically ill patients. We present recommendations for the management of thrombocytopenia in the intensive care unit, excluding pregnancy, developed by an expert group of the French-Language Society of Intensive Care (Société de Réanimation de Langue Française, SRLF), the French-Language Group of Paediatric Intensive Care and Emergencies (GFRUP), and the Haemostasis and Thrombosis Study Group (GEHT) of the French Society of Haematology (SFH). The recommendations cover six fields of application: definition, epidemiology, and prognosis; diagnostic approach; therapeutic aspects; thrombocytopenia and sepsis; iatrogenic thrombocytopenia, with a special focus on heparin-induced thrombocytopenia; and thrombotic microangiopathy.
Thrombocytopenia; Critical care; Adults; Expert recommendations
Critical illness due to 2009 H1N1 influenza has been characterized by respiratory complications, including acute lung injury (ALI) or acute respiratory distress syndrome (ARDS), and associated with high mortality. We studied the severity, outcomes, and hospital charges of patients with ALI/ARDS secondary to pandemic influenza A infection compared with ALI and ARDS from other etiologies.
A retrospective review was conducted that included patients admitted to the Cleveland Clinic MICU with ALI/ARDS and confirmed influenza A infection, and all patients admitted with ALI/ARDS from any other etiology from September 2009 to March 2010. An itemized list of individual hospital charges was obtained for each patient from the hospital billing office and organized by billing code into a database. Continuous data that were normally distributed are presented as the mean ± SD and were analyzed by the Student’s t test. The chi-square and Fisher exact tests were used to evaluate differences in proportions between patient subgroups. Data that were not normally distributed were compared with the Wilcoxon rank-sum test.
Forty-five patients were studied: 23 in the H1N1 group and 22 in the noninfluenza group. Mean ± SD age was similar (44 ± 13 and 51 ± 17 years, respectively, p = 0.15). H1N1 patients had lower APACHE III scores (66 ± 20 vs. 89 ± 32, p = 0.015) and higher Pplat and PEEP on days 1, 3, and 14. Hospital and ICU length of stay and duration of mechanical ventilation were comparable. SOFA scores over the first 2 weeks in the ICU indicated more severe organ failure in the noninfluenza group (p = 0.017). Hospital mortality was significantly higher in the noninfluenza group (77 vs. 39%, p = 0.016). The noninfluenza group tended to have higher overall charges, including significantly higher cost of blood products in the ICU.
ALI/ARDS secondary to pandemic influenza infection is associated with more severe respiratory compromise but has lower overall acuity and better survival rates than ALI/ARDS due to other causes. Higher absolute charges in the noninfluenza group are likely due to underlying comorbid medical conditions.
ARDS; ALI; Influenza A; Novel influenza; Mechanical ventilation; Hospital cost
Lowering the temperature setting of the heating device during continuous venovenous hemofiltration (CVVH) is an option for managing patient temperature. The purpose of this study was to determine the effects of two different heating-device temperature settings on body temperature and hemodynamic tolerance in patients treated with CVVH.
Thirty patients (mean age: 66.5 years; mean SAPS II: 55) were enrolled in a prospective crossover randomized study. After a baseline of 2 h at 38°C, the heating device was randomly set to either 38°C (group A) or 36°C (group B) for 6 h. Then, the temperatures were switched to 36°C in group A and to 38°C in group B for another 6 h. Hemodynamic parameters and therapeutic interventions to control the hemodynamics were recorded.
There was no significant change in body temperature in either group. During the first 6 h, group B patients showed a significantly increased arterial pressure (p = 0.01), whereas catecholamine dosage decreased significantly (p = 0.04). The number of patients who required fluid infusion or an increase in catecholamine dosage was similar. During the second period of the study, hemodynamic parameters were unchanged in both groups.
In patients undergoing CVVH, warming the substitution fluid above 36°C had no impact on body temperature. We showed that setting the fluid temperature at 36°C for a period of time early in the procedure is beneficial in terms of increased mean arterial pressure and decreased catecholamine infusion dosage.
Renal replacement therapy; Hemofiltration; Hemodynamic; Rewarming device; Temperature
Intensive care frequently results in unintentional harm to patients, and statistics do not seem to improve. The ICU environment is especially unforgiving of mistakes owing to the multidisciplinary, time-critical nature of care and the vulnerability of the patients. Human factors account for the majority of adverse events, and a sound safety climate is therefore essential. This article reviews the existing literature on the aviation-derived training called Crew Resource Management (CRM) and discusses its application in critical care medicine. CRM focuses on teamwork, threat and error management, and blame-free discussion of human mistakes. Though evidence is still scarce, the authors consider CRM to be a promising tool for culture change in the ICU setting, if supported by leadership and well-designed follow-up.
Intensive care; Human factors; Safety climate; Crew resource management
Despite the recommended guidelines, the neonatal management of pain and discomfort often remains inadequate. The purpose of the present study was to determine whether adding a pain and discomfort module to a computerized physician order entry (CPOE) system would improve pain and discomfort evaluation in premature newborns under invasive ventilation.
All newborns <37 weeks gestational age (GA) and requiring invasive ventilation were included in a prospective study during two 6-month periods: before and after the inclusion of the pain and discomfort evaluation module. The main outcome measure was the percentage of patients having at least one assessment of pain and discomfort per day of invasive ventilation using the COMFORT scale.
A total of 122 patients were included: 53 before and 69 after the incorporation of the module. The mean age was 30 (3) weeks GA. After the module was included, the percentage of patients who benefited from at least one pain and discomfort assessment per day increased from 64% to 88% (p < 0.01), and the mean number (SD) of scores recorded per day increased from 1 (1) to 3 (1) (p < 0.01). When the score was not within the established range, the nursing staff adapted analgesia/sedation doses more frequently after module inclusion (53% vs. 34%, p < 0.001). Despite higher mean doses of midazolam after module introduction [47 (45) vs. 31 (18) μg/kg/hr, p < 0.05], the durations of invasive ventilation and hospital stay, and the number of nosocomial infections, were not significantly modified.
Adding a pain and discomfort tool to the CPOE system was a simple and effective way to improve the systematic evaluation of premature newborns who required ventilatory assistance.
Analgesia; Computer-assisted instruction; Newborn; Pain management; Sedation
There is controversy over whether traditional intermittent bolus dosing or continuous infusion of beta-lactam antibiotics is preferable in critically ill patients. No significant difference between these two dosing strategies in terms of patient outcomes has yet been shown, despite compelling in vitro and in vivo pharmacokinetic/pharmacodynamic (PK/PD) data. This lack of significance in clinical outcome studies may be due to several methodological flaws potentially masking the benefits of continuous infusion observed in preclinical studies. In this review, we explore the methodological shortcomings of the published clinical studies and describe the criteria that should be considered for performing a definitive clinical trial. We found that most trials utilized inconsistent antibiotic doses and recruited only small numbers of heterogeneous patient groups. The results of these trials suggest that continuous infusion of beta-lactam antibiotics may have variable efficacy in different patient groups. Patients who may benefit from continuous infusion are critically ill patients with a high level of illness severity. Thus, future trials should test the potential clinical advantages of continuous infusion in this patient population. To further ascertain whether benefits of continuous infusion in critically ill patients do exist, a large-scale, prospective, multinational trial with a robust design is required.
Beta-lactam antibiotic; Continuous infusion; Critically ill; Pharmacokinetic; Pharmacodynamic; Treatment outcome
Pain remains a significant problem for patients hospitalized in intensive care units (ICUs). As research has shown, for some of these patients pain might even persist after discharge and become chronic. Exposure to intense pain and stress during medical and nursing procedures could be a risk factor that contributes to the transition from acute to chronic pain, which represents a major disruption of the pain-processing nervous system. New evidence suggests that the physiological alterations contributing to chronic pain states take place in both the peripheral and central nervous systems. The purpose of this paper is to: 1) review cutting-edge theories regarding pain and the mechanisms that underlie the transition from acute to chronic pain, such as increases in membrane excitability of peripheral and central nerve fibers, synaptic plasticity, and loss of function of descending inhibitory pain fibers; 2) provide information on the association between the immune system and pain, and its crucial contribution to the development of chronic pain syndromes; and 3) discuss mechanisms at the brain level of the nervous system and their contribution to the affective (i.e., emotional) states associated with chronic pain conditions. Finally, we offer suggestions for ICU clinical interventions to attempt to prevent the transition from acute to chronic pain.
Pain; Acute; Chronic; Acute-to-chronic; Intensive care unit; Critical care; Nerve sensitization