Results 1-13 (13)

1.  The Prevalence of Compassion Fatigue and Burnout among Healthcare Professionals in Intensive Care Units: A Systematic Review 
PLoS ONE  2015;10(8):e0136955.
Working in the stressful environment of the Intensive Care Unit (ICU) is an emotionally charged challenge that might affect the emotional stability of medical staff. The quality of care for ICU patients and their relatives might be threatened through long-term absenteeism or a brain and skill drain if the healthcare professionals leave their jobs prematurely in order to preserve their own health.
The purpose of this review is to evaluate the literature related to emotional distress among healthcare professionals in the ICU, with an emphasis on the prevalence of burnout and compassion fatigue and the available preventive strategies.
A systematic literature review was conducted using Embase, Medline OvidSP, CINAHL, Web of Science, PsycINFO, PubMed Publisher, Cochrane and Google Scholar for articles published between 1992 and June 2014. Studies reporting the prevalence of burnout, compassion fatigue, secondary traumatic stress and vicarious trauma in ICU healthcare professionals were included, as well as related intervention studies.
Forty of the 1623 identified publications, which included 14,770 respondents, met the selection criteria. Two studies reported the prevalence of compassion fatigue as 7.3% and 40%; five studies described the prevalence of secondary traumatic stress ranging from 0% to 38.5%. The reported prevalence of burnout in the ICU varied from 0% to 70.1%. A wide range of intervention strategies emerged from the recent literature search, such as different intensivist work schedules, educational programs on coping with emotional distress, improving communication skills, and relaxation methods.
The true prevalence of burnout, compassion fatigue, secondary traumatic stress and vicarious trauma in ICU healthcare professionals remains open for discussion. A thorough exploration of emotional distress in relation to communication skills, ethical rounds, and mindfulness might provide an appropriate starting point for the development of further preventive strategies.
PMCID: PMC4554995  PMID: 26322644
2.  Long-term quality of life in critically ill patients with acute kidney injury treated with renal replacement therapy: a matched cohort study 
Critical Care  2015;19(1):289.
Acute kidney injury (AKI) is a common complication in intensive care unit (ICU) patients and is associated with increased morbidity and mortality. We compared long-term outcome and quality of life (QOL) in ICU patients with AKI treated with renal replacement therapy (RRT) with matched non-AKI-RRT patients.
Over 1 year, consecutive adult ICU patients were included in a prospective cohort study. AKI-RRT patients alive at 1 year and 4 years were matched with non-AKI-RRT survivors from the same cohort in a 1:2 (1 year) and 1:1 (4 years) ratio based on gender, age, Acute Physiology and Chronic Health Evaluation II score, and admission category. QOL was assessed by the EuroQoL-5D and the Short Form-36 survey before ICU admission and at 3 months, 1 and 4 years after ICU discharge.
Of 1953 patients, 121 (6.2%) had AKI-RRT. AKI-RRT hospital survivors (44.6%; N = 54) had 1-year and 4-year survival rates of 87.0% (N = 47) and 64.8% (N = 35), respectively. Forty-seven 1-year AKI-RRT patients were matched with 94 1-year non-AKI-RRT patients. Of 35 4-year survivors, three refused further cooperation, three were lost to follow-up, and one had no matched control. Finally, 28 4-year AKI-RRT patients were matched with 28 non-AKI-RRT patients. During the ICU stay, 1-year and 4-year AKI-RRT patients had more organ dysfunction than their respective matches (Sequential Organ Failure Assessment scores 7 versus 5, P < 0.001, and 7 versus 4, P < 0.001). Long-term QOL was, however, comparable between both groups but lower than in the general population. QOL decreased at 3 months and improved after 1 and 4 years but remained below baseline level. One and 4 years after ICU discharge, 19.1% and 28.6% of AKI-RRT survivors, respectively, remained RRT-dependent, and 81.8% and 71% of them were willing to undergo ICU admission again if needed.
In long-term critically ill AKI-RRT survivors, QOL was comparable to matched long-term critically ill non-AKI-RRT survivors, but lower than in the general population. The majority of AKI-RRT patients wanted to be readmitted to the ICU when needed, despite a higher severity of illness compared to matched non-AKI-RRT patients, and despite the fact that one quarter had persistent dialysis dependency.
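The 1:2 and 1:1 matching described above can be sketched as a greedy nearest-neighbour matcher on exact gender and admission category with nearest age and APACHE II score. The field names, calipers, and tie-breaking below are illustrative assumptions, not the study's actual procedure.

```python
def match_controls(cases, controls, ratio=2, max_age_diff=5, max_apache_diff=3):
    """Greedy 1:N matching of AKI-RRT cases to non-AKI-RRT controls:
    exact match on gender and admission category, nearest neighbour on
    age and APACHE II score within the given calipers."""
    matches = {}
    available = list(controls)
    for case in cases:
        # candidates must match exactly on gender and admission category
        # and fall within the age and APACHE II calipers
        candidates = [c for c in available
                      if c["gender"] == case["gender"]
                      and c["category"] == case["category"]
                      and abs(c["age"] - case["age"]) <= max_age_diff
                      and abs(c["apache2"] - case["apache2"]) <= max_apache_diff]
        # prefer the closest controls by combined age + APACHE II distance
        candidates.sort(key=lambda c: abs(c["age"] - case["age"])
                                      + abs(c["apache2"] - case["apache2"]))
        chosen = candidates[:ratio]
        for c in chosen:
            available.remove(c)  # sample controls without replacement
        matches[case["id"]] = [c["id"] for c in chosen]
    return matches
```

Sampling without replacement, as here, keeps each control matched to at most one case, which is the usual design choice in matched cohort studies.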
Electronic supplementary material
The online version of this article (doi:10.1186/s13054-015-1004-8) contains supplementary material, which is available to authorized users.
PMCID: PMC4527359  PMID: 26250830
5.  Time to look beyond one-year mortality in critically ill hematological patients? 
Critical Care  2014;18(1):107.
The spectacular improvement in the long-term prognosis of patients with hematological malignancies since the 1980s, coupled with the improvement over the past decade in short- and mid-term survival in cases of critical illness, has resulted in increasing referral of such patients to the ICU. A remaining question, however, is how these patients fare in the long term with regard to survival and quality of life. Here we discuss the present multicenter study on survival beyond 1 year in critically ill patients with hematological malignancies. We conclude with suggestions on how we can further improve the long-term outcome of these patients.
PMCID: PMC4056035  PMID: 24517551
7.  Development of antibiotic treatment algorithms based on local ecology and respiratory surveillance cultures to restrict the use of broad-spectrum antimicrobial drugs in the treatment of hospital-acquired pneumonia in the intensive care unit: a retrospective analysis 
Critical Care  2014;18(4):R152.
Timely administration of appropriate antibiotic therapy has been shown to improve outcome in hospital-acquired pneumonia (HAP). Empirical treatment guidelines tailored to local ecology have been advocated in antibiotic stewardship programs. We compared a local ecology based algorithm (LEBA) to a surveillance culture based algorithm (SCBA) in terms of appropriate coverage and spectrum of antimicrobial activity.
We retrospectively assessed two hypothetical empirical antibiotic treatment algorithms for HAP using an existing, high-quality, prospectively collected database from a mixed 36-bed tertiary intensive care unit (ICU). Data on consecutive episodes of microbiologically confirmed HAP were collected over a period of 40 months and divided into a derivation (1 July 2009 to 31 October 2010) and a validation (1 November 2010 to 31 October 2012) cohort. On the derivation cohort we constructed a LEBA, based on overall observed bacterial resistance patterns, and a SCBA, which targeted therapy to surveillance cultures (SC) in the individual patient. Therapy was directed against pathogens found in respiratory SC collected two to five days before HAP; in their absence, the presence or absence of multidrug-resistant (MDR) pathogens in other SC dictated broad-spectrum or narrow-spectrum antibiotic therapy, respectively. Subsequently, LEBA and SCBA were retrospectively reviewed and compared with the antibiotics actually prescribed in the validation cohort.
The first 100 HAP episodes made up the derivation cohort and the subsequent 113 HAP episodes the validation cohort. Appropriate antibiotic coverage rates with LEBA and SCBA were 88.5% and 87.6%, respectively, and did not differ significantly from the appropriateness of the actually prescribed initial therapy (84.1%). SCBA proposed narrow-spectrum therapy more often than LEBA or the actually prescribed antimicrobials (P < 0.001). SCBA also recommended significantly less combination therapy and fewer carbapenems than LEBA (P < 0.001). SCBA targeted antibiotics to recent respiratory SC in 38.1% of HAP episodes (43 of 113); in these cases adequacy was 93% (40 of 43).
Rates of appropriate antimicrobial coverage were identical in LEBA and SCBA. However, in this setting of moderate MDR prevalence, the use of SCBA would result in a significant reduction of the use of broad-spectrum drugs and may be a preferential strategy when implementing antibiotic stewardship programs.
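As described, the SCBA is a simple two-tier decision rule: target therapy to recent respiratory surveillance cultures when available, otherwise let MDR carriage elsewhere decide the spectrum. A minimal sketch of that logic, with illustrative function and parameter names not taken from the study:

```python
def scba_choice(respiratory_sc, other_sc_mdr):
    """Surveillance-culture-based algorithm (SCBA), sketched:
    - if a respiratory surveillance culture taken 2-5 days before HAP
      grew pathogens, direct therapy against those isolates;
    - otherwise, MDR pathogens in any other surveillance culture dictate
      broad-spectrum therapy, and their absence narrow-spectrum therapy."""
    if respiratory_sc:  # list of recent respiratory isolates, may be empty
        return ("targeted", respiratory_sc)
    if other_sc_mdr:    # MDR carriage detected at another site
        return ("broad-spectrum", None)
    return ("narrow-spectrum", None)
```

The LEBA counterpart would be a constant rule derived once from unit-level resistance patterns, which is why it cannot de-escalate for individual low-risk patients the way the SCBA branch structure can.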
PMCID: PMC4223549  PMID: 25030270
8.  Severe Drug-induced Liver Injury Associated with Prolonged Use of Linezolid 
Journal of Medical Toxicology  2010;6(3):322-326.
This study aims to describe a patient developing concomitant severe liver failure and lactic acidosis after long-term treatment with linezolid. A 55-year-old Caucasian woman developed concomitant severe liver failure and lactic acidosis after treatment with linezolid for 50 days because of an infected hip prosthesis. Other causes of liver failure and lactic acidosis were excluded by an extensive diagnostic workup. A liver biopsy showed microvesicular steatosis. As linezolid toxicity was considered to be the cause of the lactic acidosis and the severe hepatic failure, the antibiotic was withdrawn. After 4 days of supportive therapy and hemodialysis, the serum lactate level returned to within normal limits. The prothrombin time ratio and thrombocytes recovered within 2 weeks. Bilirubin levels normalized within 14 weeks. Since no other cause could be identified, the liver injury was considered to be drug-related. Resolution of the hepatotoxicity occurred after discontinuation of linezolid, supportive treatment measures, and hemodialysis. Both lactic acidosis and microvesicular steatosis after the use of linezolid are related to mitochondrial dysfunction. The Council for International Organizations of Medical Sciences/Roussel Uclaf Causality Assessment Method scale rated the adverse drug event as probable. Prolonged exposure to linezolid may induce severe hepatotoxicity. Clinicians should be aware of this possible adverse effect, especially in cases of long-term treatment.
PMCID: PMC3550492  PMID: 20358416
Linezolid; Liver failure; Microvesicular steatosis; Lactic acidosis
9.  Intensive care of the cancer patient: recent achievements and remaining challenges 
A few decades have passed since intensive care unit (ICU) beds became available for critically ill patients with cancer. Although the initial reports showed a dismal prognosis, recent data suggest that an increasing number of patients with solid and hematological malignancies benefit from intensive care support, with dramatically decreased mortality rates. Advances in the management of the underlying malignancies and in the support of organ dysfunctions have led to survival gains in patients with life-threatening complications from the malignancy itself, as well as from infectious and toxic adverse effects related to the oncological treatments. In this review, we appraise the prognostic factors and discuss the overall perspective on the management of critically ill patients with cancer. The prognostic significance of certain factors has changed over time. For example, neutropenia and autologous bone marrow transplantation (BMT) have less adverse prognostic implications than two decades ago. Similarly, because hematologists and oncologists select patients for ICU admission based on the characteristics of the malignancy, the underlying malignancy rarely influences short-term survival after ICU admission. Since recent data do not clearly support the benefit of ICU support for unselected critically ill allogeneic BMT recipients, more outcome research is needed in this subgroup. Because of the overall increased survival reported in critically ill patients with cancer, we outline easy-to-use, evidence-based ICU admission triage criteria that may help avoid denying life support to patients with cancer who could benefit. Lastly, we propose a research agenda to address unanswered questions.
PMCID: PMC3159899  PMID: 21906331
10.  Has information technology finally been adopted in Flemish intensive care units? 
Information technology (IT) may improve the quality, safety and efficiency of medicine, and is especially useful in intensive care units (ICUs), as these are extremely data-rich environments with round-the-clock changing parameters. However, data regarding the implementation rates of IT in ICUs are scarce and restricted to non-European countries. The current paper aims to provide relevant information regarding the implementation of IT in Flemish ICUs (Flanders, Belgium).
The current study is based on two separate but complementary surveys conducted in the region of Flanders (Belgium): a written questionnaire in 2005 followed by a telephone survey in October 2008. We have evaluated the actual health IT adoption rate, as well as its evolution over a 3-year time frame. In addition, we documented the main benefits and obstacles for taking the decision to implement an Intensive Care Information System (ICIS).
Currently, the computerized display of laboratory and radiology results is almost omnipresent in Flemish ICUs (100% and 93.5%, respectively), but computerized physician order entry (CPOE) for these examinations is rarely used. Sixty-five percent of Flemish ICUs use an electronic patient record, 41.3% use CPOE for medication prescriptions, and 27% use computerized medication administration recording. The implementation rate of a dedicated ICIS has doubled over the last 3 years, from 9.3% to 19%, and another 31.7% of ICUs plan to implement an ICIS within the next 3 years. Half of the tertiary non-academic hospitals and all university hospitals have implemented an ICIS; general hospitals, however, lag behind at 8%. The main reasons for postponing ICIS implementation are: (i) the substantial initial investment costs, (ii) integration problems with the hospital information system, (iii) concerns about user-friendly interfaces, (iv) the need for dedicated personnel and (v) the questionable cost-benefit ratio.
Most ICUs in Flanders use hospital IT systems such as computerized laboratory and radiology displays. The adoption rate of ICISs has doubled over the last 3 years but is still surprisingly low, especially in general hospitals. The major reason for not implementing an ICIS is the substantial financial cost, together with the difficulty of demonstrating a favorable cost-benefit ratio.
PMCID: PMC2967500  PMID: 20958955
11.  Determinants and impact of multidrug antibiotic resistance in pathogens causing ventilator-associated-pneumonia 
Critical Care  2008;12(6):R142.
The idea that multidrug resistance (MDR) to antibiotics in pathogens causing ventilator-associated pneumonia (VAP) is an independent risk factor for adverse outcome is still debated. We aimed to identify the determinants of MDR versus non-MDR microbial aetiology in VAP and assessed whether MDR versus non-MDR VAP was independently associated with increased 30-day mortality.
We performed a retrospective analysis of a prospectively registered cohort of adult patients with microbiologically confirmed VAP, diagnosed at a university hospital intensive care unit during a three-year period. Determinants of MDR as compared with non-MDR microbial aetiology and impact of MDR versus non-MDR aetiology on mortality were investigated using multivariate logistic and competing risk regression analysis.
MDR pathogens were involved in 52 of 192 episodes of VAP (27%): methicillin-resistant Staphylococcus aureus in 12 (6%), extended-spectrum β-lactamase producing Enterobacteriaceae in 28 (15%), MDR Pseudomonas aeruginosa and other non-fermenting pathogens in 12 (6%). Multivariable logistic regression identified the Charlson index of comorbidity (odds ratio (OR) = 1.38, 95% confidence interval (CI) = 1.08 to 1.75, p = 0.01) and previous exposure to more than two different antibiotic classes (OR = 5.11, 95% CI = 1.38 to 18.89, p = 0.01) as predictors of MDR aetiology. Thirty-day mortality after VAP diagnosis caused by MDR versus non-MDR was 37% and 20% (p = 0.02), respectively. A multivariate competing risk regression analysis showed that renal replacement therapy before VAP (standardised hazard ratio (SHR) = 2.69, 95% CI = 1.47 to 4.94, p = 0.01), the Charlson index of comorbidity (SHR = 1.21, 95% CI = 1.03 to 1.41, p = 0.03) and septic shock on admission to the intensive care unit (SHR = 1.86, 95% CI = 1.03 to 3.35, p = 0.03), but not MDR aetiology of VAP, were independent predictors of mortality.
The risk of MDR pathogens causing VAP was mainly determined by comorbidity and prior exposure to more than two antibiotics. The increased mortality of VAP caused by MDR as compared with non-MDR pathogens was explained by more severe comorbidity and organ failure before VAP.
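For readers less familiar with odds ratios: a logistic-regression coefficient β maps to OR = e^β, so the reported OR of 1.38 per Charlson point multiplies the odds of MDR aetiology, not the probability. A small standard-library sketch; the 20% baseline probability is an arbitrary illustrative value, not a figure from the study:

```python
import math

def odds(p):
    """Convert a probability to odds."""
    return p / (1 - p)

def prob(o):
    """Convert odds back to a probability."""
    return o / (1 + o)

or_charlson = 1.38                      # reported OR per Charlson point
beta = math.log(or_charlson)            # implied regression coefficient

p0 = 0.20                               # hypothetical baseline MDR probability
p1 = prob(odds(p0) * or_charlson)       # probability after one extra point
p3 = prob(odds(p0) * or_charlson ** 3)  # after three extra points
```

Because the OR acts multiplicatively on the odds, its effect on the probability scale compounds: three extra comorbidity points here nearly double the illustrative baseline risk.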
PMCID: PMC2646301  PMID: 19014695
12.  A novel approach for prediction of tacrolimus blood concentration in liver transplantation patients in the intensive care unit through support vector regression 
Critical Care  2007;11(4):R83.
Tacrolimus is an important immunosuppressive drug for organ transplantation patients. It has a narrow therapeutic range, toxic side effects, and a blood concentration with wide intra- and interindividual variability. Hence, it is of the utmost importance to monitor tacrolimus blood concentration, thereby ensuring clinical effect and avoiding toxic side effects. Prediction models for tacrolimus blood concentration can improve clinical care by optimizing the monitoring of these concentrations, especially in the initial phase after transplantation during the intensive care unit (ICU) stay. This is the first ICU study in which support vector machines, a relatively new data modeling technique, are investigated and tested for their ability to predict tacrolimus blood concentration. Linear support vector regression (SVR) and nonlinear radial basis function (RBF) SVR are compared with multiple linear regression (MLR).
Tacrolimus blood concentrations, together with 35 other relevant variables from 50 liver transplantation patients, were extracted from our ICU database. This resulted in a dataset of 457 blood samples, on average between 9 and 10 samples per patient, finally resulting in a database of more than 16,000 data values. Nonlinear RBF SVR, linear SVR, and MLR were performed after selection of clinically relevant input variables and model parameters. Differences between observed and predicted tacrolimus blood concentrations were calculated. Prediction accuracy of the three methods was compared after fivefold cross-validation (Friedman test and Wilcoxon signed rank analysis).
Linear SVR and nonlinear RBF SVR had mean absolute differences between observed and predicted tacrolimus blood concentrations of 2.31 ng/ml (standard deviation [SD] 2.47) and 2.38 ng/ml (SD 2.49), respectively. MLR had a mean absolute difference of 2.73 ng/ml (SD 3.79). The difference between linear SVR and MLR was statistically significant (p < 0.001). RBF SVR had the advantage of requiring only 2 input variables to perform this prediction in comparison to 15 and 16 variables needed by linear SVR and MLR, respectively. This is an indication of the superior prediction capability of nonlinear SVR.
Prediction of tacrolimus blood concentration with linear and nonlinear SVR was excellent, and accuracy was superior in comparison with an MLR model.
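The comparison reported above can be reproduced in outline with scikit-learn. The synthetic data, hyperparameters, and variable counts below are illustrative assumptions, not the study's dataset or settings; the point is the evaluation pattern of mean absolute difference under cross-validated prediction.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(0)
# Synthetic stand-in for the study's data: a few covariates with a
# mildly nonlinear relation to the "blood concentration" target.
X = rng.normal(size=(200, 5))
y = (8 + 2 * X[:, 0] - 1.5 * X[:, 1] + np.sin(X[:, 2])
     + rng.normal(scale=1.0, size=200))

cv = KFold(n_splits=5, shuffle=True, random_state=0)
models = {
    "MLR": LinearRegression(),
    "linear SVR": SVR(kernel="linear", C=1.0, epsilon=0.1),
    "RBF SVR": SVR(kernel="rbf", C=10.0, gamma="scale", epsilon=0.1),
}
mad = {}
for name, model in models.items():
    pred = cross_val_predict(model, X, y, cv=cv)   # out-of-fold predictions
    mad[name] = np.mean(np.abs(y - pred))          # mean absolute difference
```

Cross-validated out-of-fold predictions, as in the study's fivefold design, prevent the models from being scored on data they were fitted to.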
PMCID: PMC2206504  PMID: 17655766
13.  Clinical relevance of Aspergillus isolation from respiratory tract samples in critically ill patients 
Critical Care  2006;10(1):R31.
The diagnosis of invasive pulmonary aspergillosis, according to the criteria defined by the European Organisation for Research and Treatment of Cancer/Mycoses Study Group (EORTC/MSG), is difficult to establish in critically ill patients. The aim of this study is to address the clinical significance of the isolation of Aspergillus spp. from lower respiratory tract samples in critically ill patients, on the basis of medical and radiological files, using an adapted diagnostic algorithm to discriminate proven and probable invasive pulmonary aspergillosis from Aspergillus colonisation.
Using a historical cohort (January 1997 to December 2003), all critically ill patients with respiratory tract samples positive for Aspergillus were studied. In comparison to the EORTC/MSG criteria, a different appreciation was given to radiological features and microbiological data, including semiquantitative cultures and direct microscopic examination of broncho-alveolar lavage samples.
Over a 7-year period, 172 patients were identified with a positive culture. Of these, 83 patients were classified as having invasive aspergillosis. In 50 of these patients (60%), no high-risk predisposing conditions (neutropenia, hematologic cancer, and stem cell or bone marrow transplantation) were found. Typical radiological imaging (halo and air-crescent sign) occurred in only 5% of patients. In 26 patients, histological examination either by ante-mortem lung biopsy (n = 10) or necropsy (n = 16) was performed, allowing a rough estimation of the predictive value of the diagnostic algorithm. In all patients with histology, all cases of clinically probable pulmonary aspergillosis were confirmed (n = 17). Conversely, all cases classified as colonisation had negative histology (n = 9).
A respiratory tract sample positive for Aspergillus spp. in the critically ill should always prompt further diagnostic assessment, even in the absence of the typical hematological and immunological host risk factors. In a minority of patients, the value of the clinical diagnostic algorithm was confirmed by histological findings, supporting its predictive value. The proposed diagnostic algorithm needs prospective validation.
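The tiered structure of such an algorithm (proven vs. probable invasive aspergillosis vs. colonisation) can be sketched as below. The specific criteria are simplified placeholders, since the study's adapted algorithm weighs semiquantitative cultures, direct BAL microscopy, and radiology in more detail than shown here.

```python
def classify_aspergillus(positive_culture, histology_confirmed,
                         compatible_signs, abnormal_imaging,
                         bal_microscopy_positive):
    """Illustrative tiered classification for a positive respiratory
    Aspergillus culture; criterion names are placeholders, not the
    study's exact definitions."""
    if not positive_culture:
        return "no evidence"
    if histology_confirmed:          # tissue invasion on biopsy/necropsy
        return "proven invasive aspergillosis"
    if compatible_signs and abnormal_imaging and bal_microscopy_positive:
        return "probable invasive aspergillosis"
    return "Aspergillus colonisation"
```

The key property the histology subgroup tested is that the "probable" tier should capture true disease while the bottom tier captures colonisation, which is what the 17 confirmed and 9 negative cases suggest.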
PMCID: PMC1550813  PMID: 16507158