There is ample literature on the association of both time to antibiotics and appropriateness of antibiotic therapy with clinical outcomes from sepsis. In fact, the current debate centers on the balance to be struck between prompt empirical therapy and care in the choice of appropriate antibiotics (both in terms of the susceptibility of the infecting organism and of minimizing resistance arising from the use of broad-spectrum agents). The objective of this study was to determine sepsis bundle compliance and the appropriateness of antimicrobial therapy in patients with severe sepsis and septic shock, and their impact on outcomes.
This retrospective cohort study was conducted in the ICU of a tertiary care, private hospital in São Paulo, Brazil, from July 2005 to December 2012, in patients with severe sepsis and septic shock.
A total of 1,279 patients were identified with severe sepsis and septic shock, of whom 358 (32.1%) had bloodstream infection (BSI). The inpatient mortality rate was 29%. In the evaluation of the sepsis bundle, there was a progressive increase over time in arterial lactate collection, obtaining blood cultures prior to antibiotic administration, administration of broad-spectrum antibiotics within 1 hour, and administration of appropriate antimicrobials, with statistically significant differences in the later years of the study. We also observed a significant decrease in mortality. In patients with bloodstream infection, after adjustment for other covariates, the administration of appropriate antimicrobial therapy was associated with a decrease in mortality in patients with severe sepsis and septic shock (p = 0.023).
The administration of appropriate antimicrobial therapy was independently associated with a decline in mortality in patients with severe sepsis and septic shock due to bloodstream infection. As protocol adherence increased over time, the crude mortality rate decreased, which reinforces the need to implement institutional guidelines and monitor appropriate antimicrobial therapy compliance.
Surveillance blood cultures are often obtained in hematopoietic stem cell transplant (HSCT) patients for detection of bloodstream infection. The major aims of this retrospective cohort study were to determine the utility of the practice of obtaining surveillance blood cultures from asymptomatic patients during the first 100 post-transplant days and to determine if obtaining more than one positive blood culture helps in the diagnosis of bloodstream infection.
We conducted a 17-month retrospective analysis of all blood cultures obtained for patients admitted to the hospital for HSCT from January 2010 to June 2011. Each patient’s clinical course, vital signs, diagnostic testing, treatment, and response to treatment were reviewed. The association between number of positive blood cultures and the final diagnosis was analyzed.
Blood culture results for 205 patients were reviewed. Cultures obtained when symptoms of infection were present (clinical cultures) accounted for 1,033 culture sets, whereas 2,474 culture sets were classified as surveillance cultures (no symptoms of infection present). In total, 185 culture sets were positive (5.3% of cultures obtained), accounting for 84 positive culture episodes. The incidence of infection in autologous, related allogeneic, and unrelated allogeneic transplants was 8.3%, 20.0%, and 28.6%, respectively. Coagulase-negative staphylococci were the most common organisms isolated. Based on our application of predefined criteria, there were 29 infections and 55 episodes of positive blood cultures that were not infections. None of the patients who developed infection were diagnosed by surveillance blood cultures. None of the uninfected patients with positive blood cultures showed any clinical change after receiving antibiotics. There was a significant difference between the incidence of BSI in the first and second 50-day periods post-HSCT. There was no association between the number of positive blood cultures and the final diagnosis.
Surveillance blood cultures in patients who have undergone HSCT do not identify bloodstream infections. The number of positive blood cultures was not helpful in determining which patients had infection. Patients are at higher risk of infection in the first 50 days post-transplant.
Enterovirus and herpes simplex viruses are common causes of lymphocytic meningitis. The purpose of this study was to analyse the impact of the use of molecular testing for Enterovirus and herpes simplex viruses (HSV) I and II in all suspected cases of viral meningitis.
From November 18, 2008 to November 17, 2009 (phase II, intervention), all patients admitted with suspected viral meningitis (with pleocytosis) had a CSF sample tested using a nucleic acid amplification test (NAAT). Data collected during this period were compared to those from the previous one-year period, i.e. November 18, 2007 to November 17, 2008 (phase I, observational), when such tests were available but not routinely used.
In total, 2,536 CSF samples were assessed, of which 1,264 were from phase I and 1,272 from phase II. A NAAT for Enterovirus was ordered in 123 cases during phase I (9.7% of the total phase I sample) and in 221 cases in phase II (17.4% of the total phase II sample). Of these, Enterovirus was confirmed in 35 patients (28.5%, 35/123) during phase I and 71 patients (32.1%, 71/221) during phase II (p = 0.107). A NAAT for HSV I and II was ordered in 200 cases in phase I and 274 cases in phase II; the rate of diagnosis of HSV meningitis did not differ between the groups (13 patients, 6.5%, in phase I vs. 13 patients, 4.7%, in phase II; p = 1.0).
The number of cases diagnosed with enteroviral meningitis increased during the course of this study, leading us to believe that the strategy of performing NAAT for Enterovirus on every CSF sample with pleocytosis is fully justified.
Few studies have assessed the time to blood culture positivity as a predictor of clinical outcome in fungal bloodstream infections (BSIs). The purpose of this study was to evaluate the time to positivity (TTP) of blood cultures in patients with Candida albicans BSIs and to assess its impact on clinical outcome.
We performed a historical cohort study of 89 adult patients with C. albicans BSIs. TTP was defined as the time between the start of incubation and the sounding of the automated alert signal indicating growth in the culture bottle.
Patients with BSIs and TTPs of culture of ≤36 h (n=39) and >36 h (n=50) were compared. Septic shock occurred in 46.2% of patients with TTPs of ≤36 h and in 40.0% of patients with TTPs of >36 h (p=0.56). A central venous catheter source was more common with a BSI TTP of ≤36 h (p=0.04). Univariate analysis revealed that an APACHE II score ≥20 at BSI onset, the development of at least one organ system failure (respiratory, cardiovascular, renal, hematologic, or hepatic), SOFA score at BSI onset, SAPS II score at BSI onset, and time to positivity were associated with death. By logistic regression analysis, the only independent predictor of death was time to positivity (OR 1.04; 95% CI, 1.0-1.1; p=0.035), i.e., the odds of death for a patient with C. albicans BSI increased by 4.0% for every additional hour before culture positivity.
A longer time to positivity was associated with higher mortality in Candida albicans BSIs; therefore, initiating empiric antifungal treatment early may improve outcomes.
Candida; Bloodstream infection; Time to positivity; Antifungal therapy
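The per-hour odds ratio reported above compounds multiplicatively: an OR of 1.04 per hour implies that the odds of death scale by 1.04^h over h additional hours to positivity. A minimal illustrative sketch (the function name is ours, not from the study):

```python
def odds_multiplier(hours, or_per_hour=1.04):
    """Multiplicative change in the odds of death after `hours` of
    additional time to culture positivity, given a per-hour odds ratio."""
    return or_per_hour ** hours

# A 24-hour-longer time to positivity multiplies the odds of death
# by roughly 2.56 under the study's point estimate of 1.04 per hour.
print(round(odds_multiplier(24), 2))  # → 2.56
```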
Nosocomial bloodstream infections (nBSIs) are an important cause of morbidity and mortality and are the most frequent type of nosocomial infection in pediatric patients.
We identified the predominant pathogens and antimicrobial susceptibilities of nosocomial bloodstream isolates in pediatric patients (≤16 years of age) in the Brazilian Prospective Surveillance for nBSIs at 16 hospitals from 12 June 2007 to 31 March 2010 (Br SCOPE project).
In our study a total of 2,563 cases of nBSI were reported by hospitals participating in the Br SCOPE project. Among these, 342 clinically significant episodes of BSI were identified in pediatric patients (≤16 years of age). Ninety-six percent of BSIs were monomicrobial. Gram-negative organisms caused 49.0% of these BSIs, Gram-positive organisms caused 42.6%, and fungi caused 8.4%. The most common pathogens were coagulase-negative staphylococci (CoNS) (21.3%), Klebsiella spp. (15.7%), Staphylococcus aureus (10.6%), and Acinetobacter spp. (9.2%). The crude mortality was 21.6% (74 of 342). Forty-five percent of nBSIs occurred in a pediatric or neonatal intensive-care unit (ICU). The most frequent underlying condition was malignancy, in 95 patients (27.8%). Among the potential factors predisposing patients to BSI, central venous catheters were the most frequent (66.4%). Methicillin resistance was detected in 37 S. aureus isolates (27.1%). Of the Klebsiella spp. isolates, 43.2% were resistant to ceftriaxone. Of the Acinetobacter spp. and Pseudomonas aeruginosa isolates, 42.9% and 21.4%, respectively, were resistant to imipenem.
In our multicenter study, we found high mortality and a large proportion of Gram-negative bacilli with elevated levels of resistance in pediatric patients.
The success of central line–associated bloodstream infection (CLABSI) prevention programs in intensive care units (ICUs) has led to the expansion of surveillance at many hospitals. We sought to compare non-ICU CLABSI (nCLABSI) rates with national reports and describe methods of surveillance at several participating US institutions.
Design and Setting
An electronic survey of several medical centers about infection surveillance practices and rate data for non-ICU patients.
Ten tertiary care hospitals.
In March 2011, a survey was sent to 10 medical centers. The survey consisted of 12 questions regarding demographics and CLABSI surveillance methodology for non-ICU patients at each center. Participants were also asked to provide available rate and device utilization data.
Hospitals ranged in size from 238 to 1,400 total beds (median, 815). All hospitals reported using Centers for Disease Control and Prevention (CDC) definitions. Denominators were collected by different means: counting patients with central lines every day (5 hospitals), indirectly estimating on the basis of electronic orders (n = 4), or another automated method (n = 1). Rates of nCLABSI ranged from 0.2 to 4.2 infections per 1,000 catheter-days (median, 2.5). The national rate reported by the CDC using 2009 data from the National Healthcare Safety Network was 1.14 infections per 1,000 catheter-days.
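Rates of this kind are computed as infections per 1,000 central-line days. A minimal sketch of the arithmetic (the figures below are illustrative, not survey data):

```python
def clabsi_rate(infections, catheter_days):
    """CLABSI rate expressed as infections per 1,000 central-line (catheter) days."""
    return infections / catheter_days * 1000

# e.g. a hypothetical ward with 12 nCLABSIs over 10,500 catheter-days:
print(round(clabsi_rate(12, 10_500), 2))  # → 1.14
```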
Only 2 hospitals were below the pooled CLABSI rate for inpatient wards; all others exceeded this rate. Possible explanations include differences in average central line utilization or hospital size, in the impact of certain clinical risk factors notably absent from the definition, and in interpretation and reporting practices. Further investigation is necessary to determine whether the national benchmarks are low or whether the hospitals surveyed here represent a selection of outliers.
Ventilator-associated pneumonia (VAP) is a common infection in the intensive care unit (ICU) and is associated with high mortality.
A quasi-experimental study was conducted in a medical-surgical ICU. Multiple interventions to optimize VAP prevention were performed from October 2008 to December 2010. All of these processes, including the Institute for Healthcare Improvement’s (IHI) ventilator bundle plus oral decontamination with chlorhexidine and continuous aspiration of subglottic secretions (CASS), were adopted for patients undergoing mechanical ventilation.
We evaluated a total of 21,984 patient-days and 6,052 ventilator-days (ventilator utilization rate of 0.27). We found VAP rates of 1.3 and 2.0 per 1,000 ventilator-days in 2009 and 2010, respectively, and achieved zero incidence of VAP several times during 12 months, whenever VAP bundle compliance was over 90%.
These results suggest that it is possible to reduce VAP rates to near zero and sustain these rates, but it requires a complex process involving multiple performance measures and interventions that must be permanently monitored.
Ventilator associated pneumonia; Prevention; Intensive care; VAP bundle
Approximately 150 million central venous catheters (CVCs) are used each year in the United States. Catheter-related bloodstream infection (CR-BSI) is one of the most important complications of CVC use. Our objective was to compare in-hospital mortality between patients with CR-BSI whose catheter was removed and those whose catheter was not removed.
We reviewed all episodes of CR-BSI that occurred in our intensive care unit (ICU) from January 2000 to December 2008. The standard method was defined as a patient with a CVC, at least one positive blood culture obtained from a peripheral vein, and a positive semiquantitative (>15 CFU) culture of a catheter segment from which the same organism was isolated. The conservative method was defined as a patient with a CVC, at least one positive blood culture obtained from a peripheral vein, and one of the following: (1) a differential time to positivity (CVC versus peripheral culture) of more than 2 hours, or (2) simultaneous quantitative blood cultures with a 5:1 ratio (CVC versus peripheral).
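The two conservative criteria amount to a simple either/or rule. A hypothetical sketch (function and parameter names are ours, not from the study), assuming the differential time to positivity is given in hours and quantitative cultures as paired colony counts:

```python
def conservative_crbsi(dtp_hours=None, cvc_cfu=None, peripheral_cfu=None):
    """Return True if either conservative CR-BSI criterion is met:
    (1) the CVC culture turns positive >2 h before the peripheral culture, or
    (2) simultaneous quantitative cultures show a >=5:1 CVC:peripheral ratio."""
    if dtp_hours is not None and dtp_hours > 2:
        return True
    if cvc_cfu is not None and peripheral_cfu:
        return cvc_cfu / peripheral_cfu >= 5
    return False

print(conservative_crbsi(dtp_hours=3))                     # criterion 1 met
print(conservative_crbsi(cvc_cfu=500, peripheral_cfu=50))  # criterion 2 met
print(conservative_crbsi(dtp_hours=1, cvc_cfu=100, peripheral_cfu=60))
```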
Fifty-three episodes of CR-BSI (37 diagnosed by the standard method and 16 by the conservative method) were identified during the study period. There was no statistically significant difference in in-hospital mortality between the standard and the conservative method (57% vs. 75%, p = 0.208) in ICU patients.
In our study there was no statistically significant difference in in-hospital mortality between the standard and conservative methods.
The Surviving Sepsis Campaign (SSC) guidelines for the management of severe sepsis (SS) and septic shock (SSh) have been recommended to reduce morbidity and mortality.
Materials and Methods
A quasi-experimental study was conducted in a medical-surgical ICU. Multiple interventions to optimize SS and SSh patients' clinical outcomes were performed by applying sepsis bundles (6- and 24-hour) beginning in May 2006. We compared bundle compliance and patient outcomes before (July 2005-April 2006) and after (May 2006-December 2009) implementation of the interventions.
A total of 564 SS and SSh patients were identified. Prior to the intervention, compliance with the 6-hour sepsis resuscitation bundle was only 6%. After the intervention, compliance was as follows: 8.2% from May to December 2006, 9.3% in 2007, 21.1% in 2008, and 13.7% in 2009. For the 24-hour management bundle, baseline compliance was 15.0%. After the intervention, compliance was 15.1% from May to December 2006, 21.4% in 2007, 27.8% in 2008, and 44.4% in 2009. In-hospital mortality was 54.0% from July 2005 to April 2006, 41.1% from May to December 2006, 39.3% in 2007, 41.4% in 2008, and 16.2% in 2009.
These results suggest reducing SS and SSh patient mortality is a complex process that involves multiple performance measures and interventions.
Nosocomial bloodstream infections (nBSIs) are an important cause of morbidity and mortality. Data from a nationwide, concurrent surveillance study, Brazilian SCOPE (Surveillance and Control of Pathogens of Epidemiological Importance), were used to examine the epidemiology and microbiology of nBSIs at 16 Brazilian hospitals. In our study 2,563 patients with nBSIs were included from 12 June 2007 to 31 March 2010. Ninety-five percent of BSIs were monomicrobial. Gram-negative organisms caused 58.5% of these BSIs, Gram-positive organisms caused 35.4%, and fungi caused 6.1%. The most common pathogens (monomicrobial) were Staphylococcus aureus (14.0%), coagulase-negative staphylococci (CoNS) (12.6%), Klebsiella spp. (12.0%), and Acinetobacter spp. (11.4%). The crude mortality was 40.0%. Forty-nine percent of nBSIs occurred in the intensive-care unit (ICU). The most frequent underlying condition was malignancy, in 622 patients (24.3%). Among the potential factors predisposing patients to BSI, central venous catheters were the most frequent (70.3%). Methicillin resistance was detected in 157 S. aureus isolates (43.7%). Of the Klebsiella spp. isolates, 54.9% were resistant to third-generation cephalosporins. Of the Acinetobacter spp. and Pseudomonas aeruginosa isolates, 55.9% and 36.8%, respectively, were resistant to imipenem. In our multicenter study, we found high crude mortality and a high proportion of nBSIs due to antibiotic-resistant organisms.
Clostridium difficile-associated disease (CDAD) is a serious nosocomial infection; however, few studies have assessed CDAD outcomes in the intensive care unit (ICU). We evaluated the epidemiology, clinical course, and outcome of hospital-acquired CDAD in the critical care setting.
We performed a historical cohort study on 58 adults in intensive care units with a positive C. difficile cytotoxin assay result.
Sixty-two percent of patients had concurrent infections, 50% of which were bloodstream infections. The most frequently prescribed antimicrobials prior to CDAD were anti-anaerobic agents (60.3%). Septic shock occurred in 32.8% of CDAD patients. The in-hospital mortality was 27.6%. Univariate analysis revealed that SOFA score, at least one organ failure and age were predictors of mortality. Charlson score ≥3, gender, concurrent infection, and number of days with diarrhea before a positive C. difficile toxin assay were not significant predictors of mortality on univariate analysis. Independent predictors for death were SOFA score at infection onset (per 1-point increment, OR 1.40; CI95 1.13–1.75) and age (per 1-year increment, OR 1.10; CI95 1.02–1.19).
In ICU patients with CDAD, advanced age and increased severity of illness at the onset of infection, as measured by the SOFA score, are independent predictors of death.
Enterococci are the third leading cause of nosocomial bloodstream infection (BSI). Vancomycin-resistant enterococci (VRE) are common and pose treatment challenges; however, questions remain about VRE's pathogenicity and direct clinical impact. This study analyzed the inflammatory response to enterococcal BSI, contrasting infections caused by vancomycin-resistant and vancomycin-susceptible isolates.
We performed a historical cohort study on 50 adults with enterococcal BSI to evaluate the associated systemic inflammatory response syndrome (SIRS) and mortality. We examined SIRS scores from 2 days prior through 14 days after the first positive blood culture. Vancomycin-resistant (n = 17) and vancomycin-susceptible infections (n = 33) were compared. Variables significant in univariate analysis were entered into a logistic regression model to determine the effect on mortality.
Sixty percent of BSIs were caused by E. faecalis and 34% by E. faecium; 34% of the isolates were vancomycin resistant. The mean APACHE II (A2) score on the day of BSI was 16. Appropriate antimicrobials were begun within 24 hours in 52% of patients. Septic shock occurred in 62% and severe sepsis in an additional 18%. The incidence of organ failure was as follows: respiratory 42%, renal 48%, hematologic 44%, hepatic 26%. Crude mortality was 48%. Progression to septic shock was associated with death (OR 14.9, p < .001). There was no difference in A2 scores on days -2, -1, and 0 between the VRE and VSE groups. Maximal SIR (severe sepsis, septic shock, or death) was seen on day 2 for VSE BSI vs. day 8 for VRE BSI. No significant difference was noted in the incidence of organ failure or in 7-day or overall mortality between the two groups. Univariate analysis revealed that an A2 score >18 at BSI onset and respiratory, cardiovascular, renal, hematologic, and hepatic failure were associated with death, but time to appropriate therapy >24 hours, age, and infection due to VRE were not. Multivariate analysis revealed that hematologic failure (OR 8.4, p = .025) and cardiovascular failure (OR 7.5, p = .032) independently predicted death.
In patients with enterococcal BSI, (1) the incidence of septic shock and organ failure is high, (2) patients with VRE BSI are not more acutely ill prior to infection than those with VSE BSI, and (3) the development of hematologic or cardiovascular failure independently predicts death.
Resistance to linezolid is rare in clinical isolates of Enterococcus faecalis. A strain resistant to this antimicrobial but susceptible to vancomycin was found to cause central venous catheter colonization in a patient who never received linezolid.
Several acute illness severity scores have been proposed for evaluating patients on admission to intensive care units, but these have not been compared for patients with nosocomial bloodstream infection (nBSI). We compared three severity-of-illness scoring systems for predicting mortality in patients with nBSI due to Pseudomonas aeruginosa.
We performed a historical cohort study on 63 adults in intensive care units with P. aeruginosa monomicrobial nBSI.
The Acute Physiology and Chronic Health Evaluation II (APACHE II), Sequential Organ Failure Assessment (SOFA), and Simplified Acute Physiology Score (SAPS II) were calculated daily from 2 days before through 2 days after the first positive blood culture. Calculation of the area under the receiver operating characteristic (ROC) curve confirmed that APACHE II and SAPS II at day -1 and SOFA at day +1 were better predictors of outcome than the scores on days -2, 0, and +2 of BSI. By stepwise logistic regression analysis of these three scoring systems, SAPS II (OR: 13.03; 95% CI 2.51–70.49) and APACHE II (OR: 12.51; 95% CI 3.12–50.09) on day -1 were the best predictors of mortality.
SAPS II and APACHE II are more accurate than the SOFA score for predicting mortality in this group of patients at day -1 of BSI.
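The comparison above rests on the area under the ROC curve, which equals the probability that a randomly chosen non-survivor receives a higher score than a randomly chosen survivor. A minimal, dependency-free sketch with illustrative data (not from the study):

```python
def roc_auc(labels, scores):
    """AUC via the pairwise (Mann-Whitney) formulation: the fraction of
    (positive, negative) pairs ranked correctly, counting ties as 1/2."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 1 = died, 0 = survived; scores are hypothetical severity scores
print(roc_auc([0, 0, 1, 1], [10, 18, 16, 24]))  # → 0.75
```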
Few studies have assessed the time to blood culture positivity as a predictor of clinical outcome in bloodstream infections (BSIs). The purpose of this study was to evaluate the time to positivity (TTP) of blood cultures in patients with Staphylococcus aureus BSIs and to assess its impact on clinical outcome. We performed a historical cohort study with 91 adult patients with S. aureus BSIs. TTP was defined as the time between the start of incubation and the time that the automated alert signal indicating growth in the culture bottle sounded. Patients with BSIs and TTPs of culture of ≤12 h (n = 44) and >12 h (n = 47) were compared. Septic shock occurred in 13.6% of patients with TTPs of ≤12 h and in 8.5% of patients with TTP of >12 h (P = 0.51). A central venous catheter source was more common with a BSI TTP of ≤12 h (P = 0.010). Univariate analysis revealed that a Charlson score of ≥3, the failure of at least one organ (respiratory, cardiovascular, renal, hematologic, or hepatic), infection with methicillin-resistant S. aureus, and TTPs of ≤12 h were associated with death. Age, gender, an APACHE II score of ≥20 at BSI onset, inadequate empirical antibiotic therapy, hospital-acquired bacteremia, and endocarditis were not associated with mortality. Multivariate analysis revealed that independent predictors of hospital mortality were a Charlson score of ≥3 (odds ratio [OR], 14.4; 95% confidence interval [CI], 2.24 to 92.55), infection with methicillin-resistant S. aureus (OR, 9.3; 95% CI, 1.45 to 59.23), and TTPs of ≤12 h (OR, 6.9; 95% CI, 1.07 to 44.66). In this historical cohort study of BSIs due to S. aureus, a TTP of ≤12 h was a predictor of the clinical outcome.
Pseudomonas aeruginosa strains that produce metallo-β-lactamases (MBLs) are becoming increasingly prevalent. We evaluated the epidemiological and microbiological characteristics of monomicrobial bloodstream infections caused by MBL-producing P. aeruginosa isolates, as well as the clinical outcomes in patients with these infections.
The frequency of ESBL-producing Klebsiella pneumoniae bloodstream infections (BSIs) is high in Brazilian hospitals; however, little is known regarding what role, if any, resistance plays in outcomes in hospitals with a high prevalence of these pathogens.
From 1996 to 2001, hospital-acquired K. pneumoniae BSIs were evaluated retrospectively. Each patient was included only once, at the time of BSI. ESBL-producing strains were identified using the E-test method. The association of variables with bacteremia-related mortality was assessed in a stepwise logistic regression model.
One hundred and eight hospital-acquired K. pneumoniae BSIs met the criteria for inclusion. Fifty-two percent were due to ESBL-producing strains. The overall in-hospital mortality was 40.8%. Variables independently predicting death by multivariate analysis were the following: mechanical ventilation (p = 0.001), number of comorbidities (p = 0.003), antimicrobials prescribed before bacteremia (p = 0.01), and fatal underlying disease (p = 0.025).
Bacteremia due to ESBL producing K. pneumoniae strains was not an independent predictor for death in patients with BSI. An increased mortality in hospital-acquired BSI by K. pneumoniae was related to the requirement for mechanical ventilation, more than two comorbidities, the previous use of two or more antibiotics, and the presence of a rapidly fatal disease.
Some studies of nosocomial bloodstream infection (nBSI) have demonstrated a higher mortality for polymicrobial bacteremia when compared to monomicrobial nBSI. The purpose of this study was to compare differences in systemic inflammatory response and mortality between monomicrobial and polymicrobial nBSI with Pseudomonas aeruginosa.
We performed a historical cohort study on 98 adults with P. aeruginosa (Pa) nBSI. SIRS scores were determined from 2 days prior to the first positive blood culture through 14 days afterwards. Monomicrobial (n = 77) and polymicrobial BSIs (n = 21) were compared.
78.6% of BSIs were caused by monomicrobial P. aeruginosa infection (MPa) and 21.4% by polymicrobial P. aeruginosa infection (PPa). The median APACHE II score on the day of BSI was 22 for MPa and 23 for PPa BSIs. Septic shock occurred in 33.3% of PPa and in 39.0% of MPa BSIs (p = 0.64). Progression to septic shock was associated with death more frequently in PPa (OR 38.5, CI95 2.9–508.5) than in MPa (OR 4.5, CI95 1.7–12.1). Maximal SIR (severe sepsis, septic shock, or death) was seen on day 0 for PPa BSI vs. day 1 for MPa BSI. No significant difference was noted in the incidence of organ failure or in 7-day or overall mortality between the two groups. Univariate analysis revealed that an APACHE II score ≥20 at BSI onset, a Charlson weighted comorbidity index ≥3, burn injury, and respiratory, cardiovascular, renal, and hematologic failure were associated with death, while age, malignant disease, diabetes mellitus, hepatic failure, gastrointestinal complications, inappropriate antimicrobial therapy, infection with imipenem-resistant P. aeruginosa, and polymicrobial nBSI were not. Multivariate analysis revealed that hematologic failure (p < 0.001) and an APACHE II score ≥20 at BSI onset (p = 0.005) independently predicted death.
In this historical cohort study of nBSI with P. aeruginosa, the incidence of septic shock and organ failure was high in both groups. Additionally, patients with PPa BSI were not more acutely ill, as judged by APACHE II score prior to blood culture positivity, than those with MPa BSI. On multivariable logistic regression analysis, the development of hematologic failure and an APACHE II score ≥20 at BSI onset were independent predictors of death; PPa BSI was not.
Increasingly resistant bacteria in sickle cell disease patients indicate a need to evaluate extended-spectrum cephalosporin therapy.
Few long-term multicenter investigations have evaluated the relationships between aggregate antimicrobial drug use in hospitals and bacterial resistance. We measured fluoroquinolone use from 1999 through 2003 in a network of US hospitals. The percentages of fluoroquinolone-resistant Pseudomonas aeruginosa and methicillin-resistant Staphylococcus aureus (MRSA) were obtained from yearly antibiograms at each hospital. Univariate linear regression showed significant associations between a hospital's volume of fluoroquinolone use and percent resistance in most individual study years (1999–2001 for P. aeruginosa, 1999–2002 for S. aureus). When the method of generalized estimating equations was used, a population-averaged longitudinal model incorporating total fluoroquinolone use and the previous year's resistance (to account for autocorrelation) did not show a significant effect of fluoroquinolone use on percent resistance for most drug-organism combinations, except for the relationship between levofloxacin use and percent MRSA. The ecologic relationship between fluoroquinolone use and resistance is complex and requires further study.
Keywords: fluoroquinolones; Staphylococcus aureus; Pseudomonas aeruginosa; pharmacoepidemiology; drug resistance; microbial
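The per-year univariate association described above is an ordinary least-squares slope of percent resistance on fluoroquinolone use across hospitals. A dependency-free sketch with made-up hospital-level data (the values are illustrative, not from the study):

```python
def ols_slope(x, y):
    """Least-squares slope of y on x: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

# hypothetical hospital-level data: fluoroquinolone use
# (DDD per 1,000 patient-days) vs. percent MRSA
use = [50, 80, 110, 140, 170]
pct_mrsa = [20, 28, 30, 41, 46]
print(round(ols_slope(use, pct_mrsa), 3))  # → 0.217
```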
Nosocomial oxacillin-resistant Staphylococcus aureus (ORSA) bloodstream isolates were tested to determine the prevalence of vancomycin heteroresistance. We screened 619 ORSA nosocomial bloodstream isolates from 36 hospitals between 1997 and 2000. Only one isolate exhibiting heterotypic resistance was detected. Thus, vancomycin heteroresistance in clinical bloodstream isolates remains rare in the United States.
In vitro activities of seven fluoroquinolones against 140 clinical Acinetobacter baumannii isolates representing 138 different strain types were determined. The rank order of activity was clinafloxacin > gatifloxacin > levofloxacin > trovafloxacin > gemifloxacin = moxifloxacin > ciprofloxacin. The 31 outbreak-related A. baumannii strains were significantly more resistant than the 109 sporadic strains.
To investigate the dissemination of vancomycin-resistant Enterococcus faecium (VREF) in a 728-bed tertiary-care hospital, all clinical VREF isolates recovered from June 1992 to June 1997 were typed by pulsed-field gel electrophoresis, and the transfer histories of the patients were documented. A total of 413 VREF isolates from urine (52%), wounds (16%), blood (11%), catheter tips (6%), and other sites (15%) were studied. VREF specimens mostly came from patients on wards (66%) but 34% came from patients in an intensive care unit. The number of VREF isolates progressively increased over time, with higher rates of isolation during the winter months and lower rates in the late summer months. Four distinct banding patterns were detected by pulsed-field gel electrophoresis among 316 samples (76%). Strain A (122 samples; 30%) appeared in June 1992 as the first VREF strain and was found until December 1994 throughout the entire hospital. Type B (92 samples; 22%) was initially detected in January 1994 and disappeared in November 1996. Strain C (10 samples; 2%) was limited to late 1996 and early 1997. Strain D (92 samples; 22%) showed two major peaks during March 1996 to August 1996 and January 1997 to February 1997. Unrelated strains (97 samples; 24%) appeared 1 year after the appearance of the first VREF isolate, and the numbers increased slightly over the years. Nosocomial acquisition (i.e., no known detection prior to admission and first isolation from cultures performed with samples retrieved ≥2 days after hospitalization) was found for 316 (91%) of 347 patients. Despite the implementation of Centers for Disease Control and Prevention guidelines, the proportion of related strains and high number of nosocomial cases of infection indicate a high transmission rate inside the hospital. The results imply an urgent need for stringent enforcement of more effective infection control measures.
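The nosocomial-acquisition rule applied above (no known detection prior to admission and a first positive culture from samples retrieved ≥2 days after hospitalization) can be expressed as a one-line check; a hypothetical sketch with names of our own choosing:

```python
def is_nosocomial(days_from_admission_to_first_isolate, known_prior_to_admission):
    """Nosocomial acquisition: no detection prior to admission and the first
    positive culture taken from samples retrieved >=2 days after hospitalization."""
    return (not known_prior_to_admission) and days_from_admission_to_first_isolate >= 2

print(is_nosocomial(5, False))  # → True
print(is_nosocomial(1, False))  # → False
print(is_nosocomial(5, True))   # → False
```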
Our objective was to determine the ability of the internal medicine In-Training Examination (ITE) to predict pass or fail outcomes on the American Board of Internal Medicine (ABIM) certifying examination and to develop an externally validated predictive model and a simple equation that can be used by residency directors to provide probability feedback for their residency programs. We collected a study sample of 155 internal medicine residents from the three Virginia internal medicine programs and a validation sample of 64 internal medicine residents from a residency program outside Virginia. Scores from both samples were collected across three class cohorts. The Kolmogorov-Smirnov z test indicated no statistically significant difference between the distribution of scores for the two samples (z = 1.284, p = .074). Results of the logistic model yielded a statistically significant prediction of ABIM pass or fail performance from ITE scores (Wald = 35.49, SE = 0.036, df = 1, p < .005) and overall correct classifications for the study sample and validation sample at 79% and 75%, respectively. The ITE is a useful tool in assessing the likelihood of a resident's passing or failing the ABIM certifying examination but is less predictive for residents who received ITE scores between 49 and 66.
certifying examination; in-training examination; education; predictions; residents