Results 1-10 (10)
1.  Maximal Standard Dose of Parenteral Iron for Hemodialysis Patients: An MRI-Based Decision Tree Learning Analysis 
PLoS ONE  2014;9(12):e115096.
Background and Objectives
Iron overload used to be considered rare among hemodialysis patients after the advent of erythropoiesis-stimulating agents, but recent MRI studies have challenged this view. The aim of this study, based on decision-tree learning and on MRI determination of hepatic iron content, was to identify a noxious pattern of parenteral iron administration in hemodialysis patients.
Design, Setting, Participants and Measurements
We performed a prospective cross-sectional study from 31 January 2005 to 31 August 2013 in the dialysis centre of a French community-based private hospital. A cohort of 199 fit hemodialysis patients free of overt inflammation and malnutrition was treated for anemia with parenteral iron-sucrose and an erythropoiesis-stimulating agent (darbepoetin), in keeping with current clinical guidelines. Hepatic iron stores were measured blindly by T1 and T2* contrast MRI without gadolinium, and the data were analyzed with Chi-squared Automatic Interaction Detection (CHAID).
Results
The CHAID algorithm first split the patients according to their monthly infused iron dose, with a single cutoff of 250 mg/month. In the node comprising the 88 hemodialysis patients who received more than 250 mg/month of IV iron, 78 patients had iron overload on MRI (88.6%, 95% CI: 80% to 93%). The odds ratio for hepatic iron overload on MRI was 3.9 (95% CI: 1.81 to 8.4) with >250 mg/month of IV iron as compared to <250 mg/month. Age, female sex and the hepcidin level also influenced liver iron content on MRI.
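The odds ratio reported above is the standard 2x2-table statistic. A minimal sketch of its computation; the exposed-group counts (78 of 88 with overload) come from the abstract, while the unexposed-group cell counts used below are purely illustrative, not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    return (a * d) / (b * c)

# Exposed node from the abstract: 78 of 88 patients had overload.
# The unexposed counts (73, 38) are hypothetical placeholders.
or_high_dose = odds_ratio(78, 10, 73, 38)
```

With cell counts from a real contingency table, the same one-liner reproduces the crude (unadjusted) odds ratio; confidence intervals require the log-OR standard error on top of this.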
Conclusions
The standard maximal amount of iron infused per month should be lowered to 250 mg in order to lessen the risk of dialysis iron overload and to allow safer use of parenteral iron products.
doi:10.1371/journal.pone.0115096
PMCID: PMC4266677  PMID: 25506921
2.  Diagnostic accuracy of early urinary index changes in differentiating transient from persistent acute kidney injury in critically ill patients: multicenter cohort study 
Critical Care  2013;17(2):R56.
Introduction
Urinary indices have limited effectiveness in separating transient acute kidney injury (AKI) from persistent AKI in ICU patients. Their time-course may vary with the mechanism of AKI. The primary objective of this study was to evaluate the diagnostic value of changes over time of the usual urinary indices in separating transient AKI from persistent AKI.
Methods
An observational prospective multicenter study was performed in six ICUs involving 244 consecutive patients, including 97 without AKI, 54 with transient AKI, and 93 with persistent AKI. Urinary sodium, urea and creatinine were measured at ICU admission (H0) and on 6-hour urine samples during the first 24 ICU hours (H6, H12, H18, and H24). Transient AKI was defined as AKI with a cause for renal hypoperfusion and reversal within 3 days.
Results
Significant increases from H0 to H24 were noted in the fractional excretion of urea (median, 31% (22% to 41%) at H0 versus 39% (29% to 48%) at H24, P < 0.0001), the urinary urea/plasma urea ratio (15 (7 to 28) versus 20 (9 to 40), P < 0.0001), and the urinary creatinine/plasma creatinine ratio (50 (24 to 101) versus 57 (29 to 104), P = 0.01). Fractional excretion of sodium did not change significantly during the first 24 hours in the ICU (P = 0.13). Neither urinary index values at ICU admission nor changes in urinary indices between H0 and H24 performed sufficiently well to recommend their use in the clinical setting (area under the receiver-operating characteristic curve ≤0.65).
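The ≤0.65 threshold above refers to the empirical area under the ROC curve, which equals the probability that a randomly chosen persistent-AKI patient has a higher index value than a randomly chosen transient-AKI patient (ties counting half). A minimal sketch of that rank interpretation (toy scores, not study data):

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC: fraction of (positive, negative) pairs in which
    the positive case scores higher; ties contribute one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC near 0.5 means the index separates the two groups no better than chance, which is why values ≤0.65 were judged clinically unhelpful.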
Conclusion
Although urinary indices at H24 performed slightly better than those at H0 in differentiating transient AKI from persistent AKI, they remain insufficiently reliable to be clinically relevant.
doi:10.1186/cc12582
PMCID: PMC3733426  PMID: 23531299
3.  Procalcitonin levels to guide antibiotic therapy in adults with non-microbiologically proven apparent severe sepsis: a randomised controlled trial 
BMJ Open  2013;3(2):e002186.
Objective
Some patients with the phenotype of severe sepsis may have no overt source of infection or identified pathogen. We investigated whether a procalcitonin-based algorithm influenced antibiotic use in patients with non-microbiologically proven apparent severe sepsis.
Design
This multicentre, randomised, controlled, single-blind trial was performed in two parallel groups.
Setting
Eight intensive care units in France.
Participants
Adults with the phenotype of severe sepsis and no overt source of infection, negative microbial cultures from multiple matrices and no antibiotic exposure shortly before intensive care unit admission.
Intervention
The initiation and duration of antibiotic therapy was based on procalcitonin levels in the experimental arm and on the intensive care unit physicians’ clinical judgement without reference to procalcitonin values in the control arm.
Main outcome measure
The primary outcome was the proportion of patients on antibiotics on day 5 postrandomisation.
Results
Over a 3-year period, 62/1250 screened patients were eligible for the study, of whom 31 were randomised to each arm; 4 later withdrew their consent. At day 5, 18/27 (67%) survivors were on antibiotics in the experimental arm, versus 21/26 (81%) controls (p=0.24; relative risk=0.83, 95% CI: 0.60 to 1.14). Only 8/58 patients (13%) had baseline procalcitonin <0.25 µg/l; in these patients, physicians complied poorly with the algorithm.
Conclusions
In intensive care unit patients with the phenotype of severe sepsis or septic shock and without an overt source of infection or a known pathogen, the current study was unable to confirm that a procalcitonin-based algorithm may influence antibiotic exposure. However, the premature termination of the trial may not allow definitive conclusions.
doi:10.1136/bmjopen-2012-002186
PMCID: PMC3586059  PMID: 23418298
4.  Incidence and outcome of contrast-associated acute kidney injury in a mixed medical-surgical ICU population: a retrospective study 
BMC Nephrology  2013;14:31.
Background
Contrast-enhanced radiographic examinations carry the risk of contrast-associated acute kidney injury (CA-AKI). While CA-AKI is a well-known complication outside the intensive care unit (ICU) setting, data on CA-AKI in ICU patients are scarce. Our aim was to assess the incidence and short-term outcome of CA-AKI in a mixed medical-surgical ICU population.
Methods
We conducted a single-center retrospective analysis between September 2006 and December 2008 on adult patients who underwent a contrast-enhanced computed tomography for urgent diagnostic purposes. CA-AKI was defined as either a relative increment in serum creatinine of ≥ 25% or an absolute increment in serum creatinine of ≥ 0.3 mg/dL within 48 hrs after contrast administration. ICU mortality rates of patients with and without CA-AKI were compared in univariate and multivariate analyses. The need for renal replacement therapy (RRT) was also recorded.
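The composite creatinine criterion used to define CA-AKI in this study can be written as a simple predicate; a minimal sketch of the definition as stated in the abstract (serum creatinine in mg/dL):

```python
def is_ca_aki(baseline_scr, scr_48h):
    """CA-AKI per the study definition: within 48 h of contrast,
    serum creatinine rose by >= 25% of baseline (relative criterion)
    OR by >= 0.3 mg/dL (absolute criterion)."""
    delta = scr_48h - baseline_scr
    relative = delta / baseline_scr >= 0.25
    absolute = delta >= 0.3
    return relative or absolute
```

Either criterion alone suffices, so patients with high baseline creatinine can qualify on the absolute rise even when the relative rise stays below 25%.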
Results
CA-AKI occurred in 24/143 (16.8%) patients. Coexisting risk factors for kidney injury, such as sepsis, nephrotoxic drugs and hemodynamic failure were commonly observed in patients who developed CA-AKI. ICU mortality was significantly higher in patients with than in those without CA-AKI (50% vs 21%, p = 0.004). In multivariate logistic regression, CA-AKI remained associated with ICU mortality (odds ratio: 3.48, 95% confidence interval: 1.10-11.46, p = 0.04). RRT was required in 7 (29.2%) patients with CA-AKI.
Conclusions
In our cohort, CA-AKI was a frequent complication. It was associated with a poor short-term outcome and seemed to occur mainly when multiple risk factors for kidney injury were present. Administration of iodinated contrast media should be considered as a potential high-risk procedure and not as a routine innocuous practice in ICU patients.
doi:10.1186/1471-2369-14-31
PMCID: PMC3637311  PMID: 23379629
Acute kidney injury; Contrast-associated acute kidney injury; Intensive care unit; Mortality; Outcome
5.  Efficacy of renal replacement therapy in critically ill patients: a propensity analysis 
Critical Care  2012;16(6):R236.
Introduction
Although renal replacement therapy (RRT) is a common procedure in critically ill patients with acute kidney injury (AKI), its efficacy remains uncertain. Patients who receive RRT usually have higher mortality rates than those who do not. However, many differences exist in severity patterns between patients with and those without RRT and available results are further confounded by treatment selection bias since no consensus on indications for RRT has been reached so far. Our aim was to account for these biases to accurately assess RRT efficacy, with special attention to RRT timing.
Methods
We performed a propensity analysis using data of the French longitudinal prospective multicenter Outcomerea database. Two propensity scores for RRT were built to match patients who received RRT to controls who did not despite having a close probability of receiving the procedure. AKI was defined according to RIFLE criteria. The association between RRT and hospital mortality was examined through multivariate conditional logistic regression analyses to control for residual confounding. Sensitivity analyses were conducted to examine the impact of RRT timing.
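The matching step described above pairs each RRT patient with a non-RRT control of similar propensity score. The study's actual models (two scores, conditional logistic regression) are richer; below is only a toy sketch of greedy 1:1 nearest-neighbour matching without replacement, with a hypothetical caliper:

```python
def match_nearest(treated_scores, control_scores, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on propensity scores,
    without replacement; a pair is kept only if the score gap is
    within the caliper. Returns (treated_index, control_index) pairs."""
    available = list(enumerate(control_scores))
    pairs = []
    for t_idx, t_score in enumerate(treated_scores):
        best = min(available, key=lambda c: abs(c[1] - t_score), default=None)
        if best is not None and abs(best[1] - t_score) <= caliper:
            pairs.append((t_idx, best[0]))
            available.remove(best)  # matching without replacement
    return pairs
```

Outcome comparison then runs within the matched pairs, which is why the abstract analyses mortality with conditional (pair-stratified) logistic regression rather than an ordinary model.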
Results
Among the 2846 study patients, 545 (19%) received RRT. Crude mortality rates were higher in patients with than in those without RRT (38% vs 17.5%, P < 0.001). After matching and adjustment, RRT was not associated with a reduced hospital mortality. The two propensity models yielded concordant results.
Conclusions
In our study population, RRT failed to reduce hospital mortality. This result emphasizes the need for randomized studies comparing RRT to conservative management in selected ICU patients, with special focus on RRT timing.
doi:10.1186/cc11905
PMCID: PMC3672625  PMID: 23254304
6.  Diagnostic performance of fractional excretion of urea in the evaluation of critically ill patients with acute kidney injury: a multicenter cohort study 
Critical Care  2011;15(4):R178.
Introduction
Several factors, including diuretic use and sepsis, interfere with the fractional excretion of sodium, which is used to distinguish transient from persistent acute kidney injury (AKI). These factors do not affect the fractional excretion of urea (FeUrea). However, there are conflicting data on the diagnostic accuracy of FeUrea.
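FeUrea is computed from paired urine and plasma measurements using the standard fractional-excretion formula; a minimal sketch (units cancel, so only consistency within each analyte matters):

```python
def fe_urea(urine_urea, plasma_urea, urine_creat, plasma_creat):
    """Fractional excretion of urea (%), standard formula:
    (U_urea / P_urea) / (U_creat / P_creat) * 100
    i.e. urea clearance expressed as a fraction of creatinine clearance."""
    return (urine_urea * plasma_creat) / (plasma_urea * urine_creat) * 100.0
```

The study's 35% cutoff would then be applied to this value to call transient versus persistent AKI.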
Methods
We conducted an observational, prospective, multicenter study at three ICUs in university hospitals. Unselected patients, except those with obstructive AKI, were admitted to the participating ICUs during a six-month period. Transient AKI was defined as AKI caused by renal hypoperfusion and reversal within three days. The results are reported as medians (interquartile ranges).
Results
A total of 203 patients were included. According to our definitions, 67 had no AKI, 54 had transient AKI and 82 had persistent AKI. FeUrea was 39% (28 to 40) in the no-AKI group, 41% (29 to 54) in the transient AKI group and 32% (22 to 51) in the persistent AKI group (P = 0.12). FeUrea was of little help in distinguishing transient AKI from persistent AKI, with the area under the receiver operating characteristic curve being 0.59 (95% confidence interval, 0.49 to 0.70; P = 0.06). Sensitivity was 63% and specificity was 54% with a cutoff of 35%. In the subgroup of patients receiving diuretics, the results were similar.
Conclusions
FeUrea may be of little help in distinguishing transient AKI from persistent AKI in critically ill patients, including those receiving diuretic therapy. Additional studies are needed to evaluate alternative markers or strategies to differentiate transient from persistent AKI.
doi:10.1186/cc10327
PMCID: PMC3387621  PMID: 21794161
acute kidney failure; ICU; fractional excretion of sodium; acute tubular necrosis; diuretics; sensitivity and specificity
7.  Impact of ureido/carboxypenicillin resistance on the prognosis of ventilator-associated pneumonia due to Pseudomonas aeruginosa 
Critical Care  2011;15(2):R112.
Introduction
Although Pseudomonas aeruginosa is a leading pathogen responsible for ventilator-associated pneumonia (VAP), the excess in mortality associated with multi-resistance in patients with P. aeruginosa VAP (PA-VAP), taking into account confounders such as treatment adequacy and prior length of stay in the ICU, has not yet been adequately estimated.
Methods
A total of 223 episodes of PA-VAP recorded into the Outcomerea database were evaluated. Patients with ureido/carboxy-resistant P. aeruginosa (PRPA) were compared with those with ureido/carboxy-sensitive P. aeruginosa (PSPA) after matching on duration of ICU stay at VAP onset and adjustment for confounders.
Results
Factors associated with onset of PRPA-VAP were as follows: admission to the ICU with septic shock, broad-spectrum antimicrobials at admission, prior use of ureido/carboxypenicillin, and colonization with PRPA before infection. Adequate antimicrobial therapy was more often delayed in the PRPA group. The crude ICU mortality rate and the hospital mortality rate were not different between the PRPA and the PSPA groups. In multivariate analysis, after controlling for time in the ICU before VAP diagnosis, neither ICU death (odds ratio (OR) = 0.73; 95% confidence interval (CI): 0.32 to 1.69; P = 0.46) nor hospital death (OR = 0.87; 95% CI: 0.38 to 1.99; P = 0.74) was increased in the presence of PRPA infection. This result remained unchanged in the subgroup of 87 patients who received adequate antimicrobial treatment on the day of VAP diagnosis.
Conclusions
After adjustment, and despite the more frequent delay in the initiation of an adequate antimicrobial therapy in these patients, resistance to ureido/carboxypenicillin was not associated with ICU or hospital death in patients with PA-VAP.
doi:10.1186/cc10136
PMCID: PMC3219393  PMID: 21481266
9.  Severe Imported Falciparum Malaria: A Cohort Study in 400 Critically Ill Adults 
PLoS ONE  2010;5(10):e13236.
Background
Large studies on severe imported malaria in non-endemic industrialized countries are lacking. We sought to describe the clinical spectrum of severe imported malaria in French adults and to identify risk factors for mortality at admission to the intensive care unit.
Methodology and Principal Findings
Retrospective review of severe Plasmodium falciparum malaria episodes according to the 2000 World Health Organization definition and requiring admission to the intensive care unit. Data were collected from medical charts using standardised case-report forms, in 45 French intensive care units in 2000–2006. Risk factors for in-hospital mortality were identified by univariate and multivariate analyses.
Data from 400 adults admitted to the intensive care unit were analysed, representing the largest series of severe imported malaria to date. Median age was 45 years; 60% of patients were white, 96% acquired the disease in sub-Saharan Africa, and 65% had not taken antimalarial chemoprophylaxis. Curative quinine treatment was used in 97% of patients. Intensive care unit mortality was 10.5% (42 deaths). By multivariate analysis, three variables at intensive care unit admission were independently associated with hospital death: older age (per 10-year increment, odds ratio [OR], 1.72; 95% confidence interval [95%CI], 1.28–2.32; P = 0.0004), Glasgow Coma Scale score (per 1-point decrease, OR, 1.32; 95%CI, 1.20–1.45; P<0.0001), and higher parasitemia (per 5% increment, OR, 1.41; 95%CI, 1.22–1.62; P<0.0001).
Conclusions and Significance
In a large population of adults treated in a non-endemic industrialized country, severe malaria still carried a high mortality rate. Our data, including predictors of death, can probably be generalized to other non-endemic countries where high-quality healthcare is available.
doi:10.1371/journal.pone.0013236
PMCID: PMC2951913  PMID: 20949045
10.  Norepinephrine weaning in septic shock patients by closed loop control based on fuzzy logic 
Critical Care  2008;12(6):R155.
Introduction
The rate of weaning of vasopressor drugs is usually an empirical choice made by the treating physician in critically ill patients. We applied fuzzy logic principles to modify intravenous norepinephrine (noradrenaline) infusion rates in septic patients in order to reduce the duration of shock.
Methods
Septic patients were randomly assigned to norepinephrine infused either at the clinician's discretion (control group) or under closed-loop control based on fuzzy logic (fuzzy group). The infusion rate changed automatically after analysis of mean arterial pressure in the fuzzy group. The primary end-point was time to cessation of norepinephrine. The secondary end-points were 28-day survival, total amount of norepinephrine infused and duration of mechanical ventilation.
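The closed-loop principle can be sketched with a toy fuzzy rule base mapping the mean-arterial-pressure error to a fractional change in infusion rate. The membership functions, thresholds and gains below are hypothetical illustrations, not those of the trial's controller:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def norepi_rate_change(map_error):
    """Map the MAP error (measured - target, mmHg) to a fractional change
    in norepinephrine infusion rate via three fuzzy rules."""
    low  = tri(map_error, -30, -15, 0)   # MAP below target -> increase dose
    ok   = tri(map_error, -10, 0, 10)    # MAP on target    -> hold dose
    high = tri(map_error, 0, 15, 30)     # MAP above target -> wean dose
    total = low + ok + high
    if total == 0.0:
        return 0.0
    # Defuzzify: weighted average of the rule outputs (+10%, 0%, -10%).
    return (0.10 * low + 0.0 * ok - 0.10 * high) / total
```

Run repeatedly against fresh MAP readings, such a controller weans the drug whenever pressure sits above target, which is the behaviour the trial automated.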
Results
Nineteen patients were randomly assigned to the fuzzy group and 20 to the control group. Weaning of norepinephrine was achieved in 18 of the 20 control patients and in all 19 fuzzy group patients. Median (interquartile range) duration of shock was significantly shorter in the fuzzy group than in the control group (28.5 [20.5 to 42] hours versus 57.5 [43.7 to 117.5] hours; P < 0.0001). There was no significant difference in duration of mechanical ventilation or survival at 28 days between the two groups. The median (interquartile range) total amount of norepinephrine infused during shock was significantly lower in the fuzzy group than in the control group (0.6 [0.2 to 1.0] μg/kg versus 1.4 [0.6 to 2.7] μg/kg; P < 0.01).
Conclusions
Our study has shown a reduction in norepinephrine weaning duration in septic patients enrolled in the fuzzy group. We attribute this reduction to fuzzy control of norepinephrine infusion.
Trial registration
Clinicaltrials.gov NCT00763906.
doi:10.1186/cc7149
PMCID: PMC2646320  PMID: 19068113