1.  From intermittent antibiotic point prevalence surveys to quality improvement: experience in Scottish hospitals 
Background
In 2008, the Scottish Antimicrobial Prescribing Group (SAPG) was established to coordinate a national antimicrobial stewardship programme. In 2009 SAPG led participation in a European point prevalence survey (PPS) of hospital antibiotic use. We describe how SAPG used this baseline PPS as the foundation for implementation of measures for improvement in antibiotic prescribing.
Methods
In 2009 data for the baseline PPS were collected in accordance with the European Surveillance of Antimicrobial Consumption (ESAC) protocol. This informed the development of two quality prescribing indicators: compliance with antibiotic policy in acute admission units and duration of surgical prophylaxis. From December 2009 clinicians collected these data on a monthly basis. The prescribing indicators were reviewed and further modified in March 2011. Data for the follow-up PPS in September 2011 were collected as part of a national PPS of healthcare-associated infection and antimicrobial use developed using ECDC protocols.
Results
In the baseline PPS, data were collected in 22 (56%) acute hospitals. The frequency of recording the reason for treatment in medical notes was similar in Scotland (75.9%) and Europe (75.7%). Compliance with policy (81.0%) was also similar to Europe (82.5%), but the proportion of surgical prophylaxis with duration <24 h (68.6%) was higher than in Europe (48.1%; OR 0.41, p<0.001). Following development and implementation of the prescribing indicators, monthly measurement and data feedback in admission units showed documentation of indication improving to ≥90% and compliance with antibiotic prescribing policy increasing from 76% to 90%. The initial prescribing indicator for surgical prophylaxis was less successful in providing consistent national data because there was local discretion over which procedures to include. After a review and a shift of focus to colorectal surgery, the mean proportion receiving single-dose prophylaxis exceeded the target of 95% and the mean proportion compliant with policy was 83%. In the follow-up PPS of 2011, documentation of indication (86.8%) and policy compliance (82.8%) were higher than in the baseline PPS.
Conclusions
The baseline PPS identified priorities for quality improvement. SAPG has demonstrated that national prescribing indicators, regularly reviewed, acceptable to clinicians and applied through routine systematic measurement, can drive improvement in the quality of antibiotic use in key clinical areas. However, our data also show that the ESAC PPS method may underestimate the proportion of surgical prophylaxis with duration <24 h.
doi:10.1186/2047-2994-2-3
PMCID: PMC3573889  PMID: 23320479
Antimicrobial stewardship; Quality improvement; Prescribing indicators; Point prevalence survey; Antibiotic; Hospital prescribing; Surgical prophylaxis
2.  Importance of antimicrobial stewardship to the English National Health Service 
Antimicrobials are an extremely valuable resource across the spectrum of modern medicine. Their development has been associated with dramatic reductions in communicable disease mortality and has facilitated technological advances in cancer therapy, transplantation, and surgery. However, this resource is threatened by the dwindling supply of new antimicrobials and the global increase in antimicrobial resistance. There is an urgent need for antimicrobial stewardship (AMS) to protect our remaining antimicrobials for future generations. AMS emphasizes sensible, appropriate antimicrobial management for the benefit of the individual and society as a whole. Within the English National Health Service (NHS), a series of recent policy initiatives have focused on all aspects of AMS, including best practice guidelines for antimicrobial prescribing, enhanced surveillance mechanisms for monitoring antimicrobial use across primary and secondary care, and new prescribing competencies for doctors in training. Here we provide a concise summary to clarify the current position and importance of AMS within the NHS and review the evidence base for AMS recommendations. The evidence supports the impact of AMS strategies on modifying prescribing practice in hospitals, with beneficial effects on both antimicrobial resistance and the incidence of Clostridium difficile, and no evidence of increased sepsis-related mortality. There is also a promising role for novel diagnostic technologies in AMS, both in enhancing microbiological diagnosis and improving the specificity of sepsis diagnosis. More work is needed to establish an evidence base for interventions to improve public and patient education regarding the role of antibiotics in common clinical syndromes, such as respiratory tract infection. 
Future priorities include establishing novel approaches to antimicrobial management (eg, duration of therapy, combination regimens) to protect against resistance and working with the pharmaceutical industry to promote the development of new antimicrobials.
doi:10.2147/IDR.S39185
PMCID: PMC4047980  PMID: 24936131
antimicrobial resistance; antibiotics; National Health Service; methicillin-resistant Staphylococcus aureus; Clostridium difficile; prescribing
3.  Growing a “Positive Culture” of Antimicrobial Stewardship in a Community Hospital 
Background:
Promoting the appropriate use of antimicrobials is a core value of antimicrobial stewardship. Prospective audit and feedback constitute an effective strategy for reducing the cost and use of antimicrobials, as well as their adverse effects, such as infection with Clostridium difficile.
Objective:
To evaluate the antimicrobial stewardship program in the intensive care unit at the authors’ hospital, in order to determine the cost and utilization of antimicrobials, as well as the rate of nosocomially acquired C. difficile infection.
Methods:
An infectious diseases team, consisting of a physician and a pharmacist, performed prospective audit and feedback during a pilot study (April to June 2010). The team met with the intensive care unit team daily to discuss optimization of therapy. The cost and utilization of antimicrobial drugs, as well as rates of C. difficile infection, were compared between the pilot period and the same period during the previous year (April to June 2009). For 3 months after the pilot phase (i.e., July to September 2010), the strategy was continued 3 days per week.
Results:
After introduction of the antimicrobial stewardship program, there was a significant reduction in the cost of antimicrobial drugs: $27 917 less than during the same period in the previous year, equivalent to a reduction of $15.45 (36.2%) per patient-day ($42.63 versus $27.18). Utilization of broad-spectrum antipseudomonal antimicrobial agents was also significantly lower, declining from 63.16 to 38.59 defined daily doses (DDDs) per 100 patient-days (reduction of 38.9%). After the pilot period, the rate declined further, to 28.47 DDDs/100 patient-days. During the pilot period, there were no cases of C. difficile infection, and in the post-pilot period, there was 1 case (overall rate 0.42 cases/1000 patient-days). This rate was lower than (but not significantly different from) the rate for April to September 2009 (1.87 cases/1000 patient-days). There were no differences in mortality rate or severity of illness.
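Reductions like these are simple rate arithmetic; a minimal sketch in Python (helper names are our own, figures taken from the abstract above):

```python
def percent_reduction(baseline: float, after: float) -> float:
    """Percentage fall from a baseline value to an intervention-period value."""
    return (baseline - after) / baseline * 100

def rate_per_1000_patient_days(cases: int, patient_days: float) -> float:
    """Incidence expressed per 1000 patient-days."""
    return cases / patient_days * 1000

# Antipseudomonal use fell from 63.16 to 38.59 DDDs/100 patient-days:
print(round(percent_reduction(63.16, 38.59), 1))  # -> 38.9
# Antimicrobial cost per patient-day fell from $42.63 to $27.18:
print(round(percent_reduction(42.63, 27.18), 1))  # -> 36.2
```

Both values reproduce the percentages reported in the abstract (38.9% and 36.2%).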
Conclusion:
The antimicrobial stewardship program in this community hospital was associated with significant decreases in antimicrobial costs and in utilization of antipseudomonal antimicrobial agents and a nonsignificant decrease in the rate of C. difficile infection. Knowledge exchange, peer-to-peer communication, and decision support, key factors in this success, will be applied in implementing the antimicrobial stewardship program throughout the hospital.
PMCID: PMC3203822  PMID: 22479082
antimicrobial stewardship; prospective audit and feedback; Clostridium difficile
4.  Development of quality indicators for antimicrobial treatment in adults with sepsis 
BMC Infectious Diseases  2014;14:345.
Background
Outcomes in patients with sepsis are better if initial empirical antimicrobial use is appropriate. Several studies have shown that adherence to guidelines dictating appropriate antimicrobial use positively influences clinical outcome, shortens length of hospital stay and contributes to the containment of antibiotic resistance.
Quality indicators (QIs) can be systematically developed from these guidelines to define and measure appropriate antimicrobial use. We describe the development of a concise set of QIs to assess the appropriateness of antimicrobial use in adult patients with sepsis on a general medical ward or Intensive Care Unit (ICU).
Methods
A RAND-modified, five-step Delphi procedure was used. A multidisciplinary panel of 14 experts appraised and prioritized 40 key recommendations from the Dutch national guideline on antimicrobial use for adult hospitalized patients with sepsis (http://www.swab.nl/guidelines). A procedure to select QIs relevant to clinical outcome, antimicrobial resistance and costs was performed using two rounds of questionnaires, with a face-to-face consensus meeting between the rounds, over a period of three months.
Results
The procedure resulted in the selection of a final set of five QIs, namely: obtain cultures; prescribe empirical antimicrobial therapy according to the national guideline; start intravenous drug therapy; start antimicrobial treatment within one hour; and streamline antimicrobial therapy.
Conclusion
This systematic, stepwise method, which combined evidence and expert opinion, led to a concise and therefore feasible set of QIs for optimal antimicrobial use in hospitalized adult patients with sepsis. The next step will entail subjecting these quality indicators to an applicability test for their clinimetric properties and ultimately, using these QIs in quality-improvement projects. This information is crucial for antimicrobial stewardship teams to help set priorities and to focus improvement.
doi:10.1186/1471-2334-14-345
PMCID: PMC4078010  PMID: 24950718
Sepsis; Antimicrobial treatment; Quality indicator; Quality improvement; Appropriate antimicrobial use; Appropriate antibiotic use
5.  Antimicrobial Stewardship Programs in Health Care Systems 
Clinical Microbiology Reviews  2005;18(4):638-656.
Antimicrobial stewardship programs in hospitals seek to optimize antimicrobial prescribing in order to improve individual patient care as well as reduce hospital costs and slow the spread of antimicrobial resistance. With antimicrobial resistance on the rise worldwide and few new agents in development, antimicrobial stewardship programs are more important than ever in ensuring the continued efficacy of available antimicrobials. The design of antimicrobial management programs should be based on the best current understanding of the relationship between antimicrobial use and resistance. Such programs should be administered by multidisciplinary teams composed of infectious diseases physicians, clinical pharmacists, clinical microbiologists, and infection control practitioners and should be actively supported by hospital administrators. Strategies for changing antimicrobial prescribing behavior include education of prescribers regarding proper antimicrobial usage, creation of an antimicrobial formulary with restricted prescribing of targeted agents, and review of antimicrobial prescribing with feedback to prescribers. Clinical computer systems can aid in the implementation of each of these strategies, especially as expert systems able to provide patient-specific data and suggestions at the point of care. Antibiotic rotation strategies control the prescribing process by scheduled changes of antimicrobial classes used for empirical therapy. When instituting an antimicrobial stewardship program, a hospital should tailor its choice of strategies to its needs and available resources.
doi:10.1128/CMR.18.4.638-656.2005
PMCID: PMC1265911  PMID: 16223951
6.  Trends in Staphylococcus aureus bacteraemia and impacts of infection control practices including universal MRSA admission screening in a hospital in Scotland, 2006–2010: retrospective cohort study and time-series intervention analysis 
BMJ Open  2012;2(3):e000797.
Objectives
To describe secular trends in Staphylococcus aureus bacteraemia (SAB) and to assess the impacts of infection control practices, including universal methicillin-resistant Staphylococcus aureus (MRSA) admission screening on associated clinical burdens.
Design
Retrospective cohort study and multivariate time-series analysis linking microbiology, patient management and health intelligence databases.
Setting
Teaching hospital in North East Scotland.
Participants
All patients admitted to Aberdeen Royal Infirmary between 1 January 2006 and 31 December 2010: n=420 452 admissions and 1 430 052 acute occupied bed days (AOBDs).
Intervention
Universal admission screening programme for MRSA (August 2008) incorporating isolation and decolonisation.
Primary and secondary measures
Hospital-wide prevalence density, hospital-associated incidence density and death within 30 days of MRSA or methicillin-sensitive Staphylococcus aureus (MSSA) bacteraemia.
Results
Between 2006 and 2010, prevalence density of all SAB declined by 41%, from 0.73 to 0.50 cases/1000 AOBDs (p=0.002 for trend), and 30-day mortality from 26% to 14% (p=0.013). Significant reductions were observed in MRSA bacteraemia only. Overnight admissions screened for MRSA rose from 43% during selective screening to >90% within 4 months of universal screening. In multivariate time-series analysis (R2 0.45 to 0.68), universal screening was associated with a 19% reduction in prevalence density of MRSA bacteraemia (−0.035, 95% CI −0.049 to −0.021/1000 AOBDs; p<0.001), a 29% fall in hospital-associated incidence density (−0.029, 95% CI −0.035 to −0.023/1000 AOBDs; p<0.001) and a 46% reduction in 30-day mortality (−15.6, 95% CI −24.1% to −7.1%; p<0.001). Positive associations with fluoroquinolone and cephalosporin use suggested that antibiotic stewardship reduced prevalence density of MRSA bacteraemia by 0.027 (95% CI 0.015 to 0.039)/1000 AOBDs. Rates of MSSA bacteraemia were not significantly affected by screening or antibiotic use.
Conclusions
Declining clinical burdens from SAB were attributable to reductions in MRSA infections. Universal admission screening and antibiotic stewardship were associated with decreases in MRSA bacteraemia and associated early mortality. Control of MSSA bacteraemia remains a priority.
Article summary
Article focus
This study describes the changing epidemiology of MRSA and MSSA bacteraemia in a large inpatient population from Scotland over a 5-year period.
It also evaluates the impact of universal MRSA admission screening, and other infection control practices, on hospital-wide rates of MRSA bacteraemia.
Key messages
Recent declines in clinical burdens from SAB in North East Scotland were attributable to a reduction in invasive MRSA infections.
Compared with a strategy of targeted screening in high-risk environments, universal admission screening may significantly reduce rates of MRSA bacteraemia and associated early mortality alongside improvements in antibiotic stewardship and infection control.
Strategies to reduce clinical burdens from MSSA bacteraemia are required if progress towards national targets for all SAB is to be sustained.
Strengths and limitations of this study
Without a contemporary control, this study could not prove causality, only a temporal association between universal admission screening and rates of MRSA bacteraemia.
ARIMA modelling accounted for the non-independence of data and stochastic elements in time series of infections, and the dynamic effects of changes in other aspects of care.
Findings may be limited to large public hospitals with intensive care units and endemic MRSA but low rates of MRSA infection.
doi:10.1136/bmjopen-2011-000797
PMCID: PMC3378947  PMID: 22685226
7.  A Time-Series Analysis of Clostridium difficile and Its Seasonal Association with Influenza 
OBJECTIVE
To characterize the temporal progression of the monthly incidence of Clostridium difficile infections (CDIs) and to determine whether the incidence of CDI is related to the incidence of seasonal influenza.
DESIGN
A retrospective study of patients in the Nationwide Inpatient Sample during the period from 1998 through 2005.
METHODS
We identified all hospitalizations with a primary or secondary diagnosis of CDI with use of International Classification of Diseases, 9th Revision, Clinical Modification codes, and we did the same for influenza. The incidence of CDI was modeled as an autoregression about a linear trend. To investigate the association of CDI with influenza, we compared national and regional CDI and influenza series data and calculated cross-correlation functions with data that had been prewhitened (filtered to remove temporal patterns common to both series). To estimate the burden of seasonal CDI, we developed a proportional measure of seasonal CDI.
RESULTS
Time-series analysis of the monthly number of CDI cases reveals a distinct positive linear trend and a clear pattern of seasonal variation (R2 = 0.98). The cross-correlation functions indicate that influenza activity precedes CDI activity on both a national and regional basis. The average burden of seasonal (ie, winter) CDI is 23%.
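The prewhitening step described above can be sketched with a simple AR(1) filter; this is an illustrative simplification (the study's actual models are richer), with all function names our own:

```python
import numpy as np

def prewhitened_ccf(x, y, max_lag=6):
    """Cross-correlate y with x after filtering both series with an AR(1)
    model fitted to x; a peak at a positive lag means x leads y."""
    x = np.asarray(x, float) - np.mean(x)
    y = np.asarray(y, float) - np.mean(y)
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])  # AR(1) coefficient
    rx = x[1:] - phi * x[:-1]                             # prewhitened x
    ry = y[1:] - phi * y[:-1]                             # same filter on y
    denom = np.sqrt(np.dot(rx, rx) * np.dot(ry, ry))
    return {lag: float(np.dot(rx[:len(rx) - lag], ry[lag:]) / denom)
            for lag in range(max_lag + 1)}

# Toy monthly series: y repeats x two months later, so the cross-correlation
# should peak at lag 2 (x precedes y, as influenza precedes CDI here).
s = np.sin(0.3 * np.arange(130))
x, y = s[10:110], s[8:108]          # y[t] == x[t - 2]
ccf = prewhitened_ccf(x, y)
print(max(ccf, key=ccf.get))        # -> 2
```

A lead of two months in the toy data shows up as a cross-correlation peak at lag 2; the study applies the same logic to national and regional influenza and CDI series.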
CONCLUSIONS
The epidemiologic characteristics of CDI follow a pattern that is seasonal and associated with influenza, which is likely due to antimicrobial use during influenza seasons. Approximately 23% of average monthly CDI during the peak 3 winter months could be eliminated if CDI remained at summer levels.
doi:10.1086/651095
PMCID: PMC3024857  PMID: 20175682
8.  Antimicrobial stewardship in long term care facilities: what is effective? 
Intense antimicrobial use in long term care facilities promotes the emergence and persistence of antimicrobial resistant organisms and leads to adverse effects such as C. difficile colitis. Guidelines recommend development of antimicrobial stewardship programs for these facilities to promote optimal antimicrobial use. However, the effectiveness of these programs or the contribution of any specific program component is not known. For this review, publications describing evaluation of antimicrobial stewardship programs for long term care facilities were identified through a systematic literature search. Interventions included education, guidelines development, feedback to practitioners, and infectious disease consultation. The studies reviewed varied in types of facilities, interventions used, implementation, and evaluation. Comprehensive programs addressing all infections were reported to have improved antimicrobial use for at least some outcomes. Targeted programs for treatment of pneumonia were minimally effective, and only for indicators of uncertain relevance for stewardship. Programs focusing on specific aspects of treatment of urinary infection – limiting treatment of asymptomatic bacteriuria or prophylaxis of urinary infection – were reported to be effective. There were no reports of cost-effectiveness, and the sustainability of most of the programs is unclear. There is a need for further evaluation to characterize effective antimicrobial stewardship for long term care facilities.
doi:10.1186/2047-2994-3-6
PMCID: PMC3931475  PMID: 24521205
Long term care facility; Antimicrobial stewardship; Pneumonia; Urinary tract infection
9.  Telehealth and Telecare: supporting unpaid carers in Scotland 
The Scottish Government Joint Improvement Team (JIT) and the Scottish Centre for Telehealth (SCT), now merged as the Scottish Centre for Telehealth and Telecare (SCTT), published an Education and Training Strategy for Telehealthcare in Scotland in March 2010. In implementing the Strategy’s dedicated Carers Workstream, the SCTT has been working with Carers Scotland and other carer organisations to promote awareness of the benefits of telehealth and telecare in helping to support unpaid carers. Following on from research undertaken by the University of Leeds into the benefits of telecare for carers, developments to date include the publication of a Training Toolkit for professionals and carer organisations to assist with raising awareness of the benefits of telehealth and telecare specifically for unpaid carers. The toolkit includes outline training programmes, handouts, digital stories and case studies, all of which are adaptable for local use. The presentation will provide an overview of the broader strategic carers’ agenda in relation to telehealth and telecare service delivery, an in-depth look at the Carers and Telehealthcare Training Toolkit and an update on an exciting new technology project to support young carers, being developed in collaboration with the Princess Royal Trust for Carers, Glasgow Caledonian and Edinburgh Universities.
Project objectives/deliverables:
Work with local authority, health boards and other partners to raise awareness of telehealth and telecare amongst carers nationally and locally.
Identify methods and a programme for delivery to ensure that carers have access to appropriate information about access to telehealth and telecare services.
Develop core content and supporting materials in carer awareness and telehealth and telecare training to form a Carers and Telehealthcare Training Toolkit.
Evaluate the impact of the Telehealthcare Training Toolkit on staff and carers to inform future development of the Toolkit contents.
Incorporate telehealth and telecare for carers’ content into core curriculum and CPD modules for health and social care staff.
Identify methods and a programme for delivery to facilitate the integration of telehealth and telecare needs into carer, community care and health assessment processes.
Work with Glasgow Caledonian University and partners to enhance young carers’ access to information and support via a technology solution.
PMCID: PMC3571193
unpaid carers; awareness raising; education; support; self-management
10.  Antibiotic use and resistance in emerging economies: a situation analysis for Viet Nam 
BMC Public Health  2013;13:1158.
Background
Antimicrobial resistance is a major contemporary public health threat. Strategies to contain antimicrobial resistance have been comprehensively set forth, however in developing countries where the need for effective antimicrobials is greatest implementation has proved problematic. A better understanding of patterns and determinants of antibiotic use and resistance in emerging economies may permit more appropriately targeted interventions.
Viet Nam, with a large population, high burden of infectious disease and relatively unrestricted access to medication, is an excellent case study of the difficulties faced by emerging economies in controlling antimicrobial resistance.
Methods
Our working group conducted a situation analysis of the current patterns and determinants of antibiotic use and resistance in Viet Nam. International publications and local reports published between 1-1-1990 and 31-8-2012 were reviewed. All stakeholders analyzed the findings at a policy workshop and feasible recommendations were suggested to improve antibiotic use in Viet Nam.
Here we report the results of our situation analysis focusing on: the healthcare system, drug regulation and supply; antibiotic resistance and infection control; and agricultural antibiotic use.
Results
Market reforms have improved healthcare access in Viet Nam and contributed to better health outcomes. However, increased accessibility has been accompanied by injudicious antibiotic use in hospitals and the community, with predictable escalation in bacterial resistance. Prescribing practices are poor and self-medication is common – often being the most affordable way to access healthcare. Many policies exist to regulate antibiotic use but enforcement is insufficient or lacking.
Pneumococcal penicillin-resistance rates are the highest in Asia and carbapenem-resistant bacteria (notably NDM-1) have recently emerged. Hospital acquired infections, predominantly with multi-drug resistant Gram-negative organisms, place additional strain on limited resources. Widespread agricultural antibiotic use further propagates antimicrobial resistance.
Conclusions
Future legislation regarding antibiotic access must alter incentives for purchasers and providers and ensure effective enforcement. The Ministry of Health recently initiated a national action plan and approved a multicenter health improvement project to strengthen national capacity for antimicrobial stewardship in Viet Nam. This analysis provided important input to these initiatives. Our methodologies and findings may be of use to others across the world tackling the growing threat of antibiotic resistance.
doi:10.1186/1471-2458-13-1158
PMCID: PMC4116647  PMID: 24325208
Antibiotic resistance; Bacterial diseases; Health policy; Health systems; Legislation (health); Resource constrained; Antibiotic consumption
11.  Antimicrobial stewardship: attempting to preserve a strategic resource 
Antimicrobials hold a unique place in our drug armamentarium. Unfortunately the increase in resistance among both gram-positive and gram-negative pathogens, coupled with a lack of new antimicrobial agents, is threatening our ability to treat infections. Antimicrobial use is the driving force behind this rise in resistance, and much of this use is suboptimal. Antimicrobial stewardship programs (ASP) have been advocated as a strategy to improve antimicrobial use. The goals of ASP are to improve patient outcomes while minimizing toxicity and selection for resistant strains by assisting in the selection of the correct agent, right dose, and best duration. Two major strategies for ASP exist: restriction/pre-authorization, which controls use at the time of ordering, and audit and feedback, which reviews ordered antimicrobials and makes suggestions for improvement. Both strategies have some limitations, but have been effective at achieving stewardship goals. Other supplemental strategies such as education, clinical prediction rules, biomarkers, clinical decision support software, and institutional guidelines have been effective at improving antimicrobial use. The most effective antimicrobial stewardship programs have employed multiple strategies to impact antimicrobial use. Using these strategies, stewardship programs have been able to decrease antimicrobial use, the spread of resistant pathogens, the incidence of C. difficile infection, and pharmacy costs, and to improve patient outcomes.
doi:10.3402/jchimp.v1i2.7209
PMCID: PMC3714030  PMID: 23882324
antimicrobial stewardship; resistance; pre-authorization; audit and feedback
12.  Outbreak of Clostridium difficile PCR ribotype 027 - the recent experience of a regional hospital 
BMC Infectious Diseases  2014;14:209.
Background
Clostridium difficile infection (CDI) is the leading cause of healthcare-associated diarrhea, and several outbreaks with increased severity and mortality have been reported. In this study we report a C. difficile PCR ribotype 027 outbreak in Portugal, aiming to contribute to a better knowledge of the epidemiology of this agent in Europe.
Methods
Outbreak report with retrospective study of medical records and active surveillance data of all inpatients with the diagnosis of CDI, from 1st January to 31st December 2012, in a Portuguese hospital. C. difficile isolates were characterized regarding ribotype, toxin genes and moxifloxacin resistance. Outbreak control measures were taken, concerning communication, education, reinforcement of infection control measures, optimization of diagnosis and treatment of CDI, and antibiotic stewardship.
Results
Fifty-three inpatients met the case definition of C. difficile-associated infection: 55% were male, median age was 78.0 years (interquartile range: 71.0-86.0), 75% had co-morbidities, only 15% had a nonfatal underlying condition, 68% had at least one criterion of severe disease at diagnosis, 89% had received prior antibiotic therapy, and 79% of episodes were nosocomial. The CDI rate peaked at 13.89/10,000 bed days. Crude mortality at 6 months was 64.2%, while CDI-attributable mortality was 11.3%. Worse outcome was related to older age (P = 0.022), severity criteria at diagnosis (leukocytosis (P = 0.008) and renal failure), and presence of a fatal underlying condition (P = 0.025). PCR ribotype 027 was identified in 16 of 22 studied samples.
Conclusions
This is the first report of a 027-CDI outbreak in Portugal. We emphasize the relevance of the measures taken to control the outbreak and highlight the importance of implementing a close and active surveillance of CDI.
doi:10.1186/1471-2334-14-209
PMCID: PMC3998949  PMID: 24739945
Clostridium difficile; Outbreak; PCR ribotype 027; Portugal
13.  Implementation of an antimicrobial stewardship program on the medical-surgical service of a 100-bed community hospital 
Background
Antimicrobial stewardship has been promoted as a key strategy for coping with the problems of antimicrobial resistance and Clostridium difficile. Despite the current call for stewardship in community hospitals, including smaller community hospitals, practical examples of stewardship programs are scarce in the reported literature. The purpose of the current report is to describe the implementation of an antimicrobial stewardship program on the medical-surgical service of a 100-bed community hospital employing a core strategy of post-prescriptive audit with intervention and feedback.
Methods
For one hour twice weekly, an infectious diseases physician and a clinical pharmacist audited medical records of inpatients receiving systemic antimicrobial therapy and made non-binding, written recommendations that were subsequently scored for implementation. Defined daily doses (DDDs; World Health Organization Center for Drug Statistics Methodology) and acquisition costs per admission and per patient-day were calculated monthly for all administered antimicrobial agents.
Results
The antimicrobial stewardship team (AST) made one or more recommendations for 313 of 367 audits during a 16-month intervention period (September 2009 – December 2010). Physicians implemented recommendation(s) from each of 234 (75%) audits, including from 85 of 115 for which discontinuation of all antimicrobial therapy was recommended. In comparison to an 8-month baseline period (January 2009 – August 2009), there was a 22% decrease in defined daily doses per 100 admissions (P = .006) and a 16% reduction per 1000 patient-days (P = .013). There was a 32% reduction in antimicrobial acquisition cost per admission (P = .013) and a 25% acquisition cost reduction per patient-day (P = .022).
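The DDD-based density measures used above normalize the total quantity of each agent dispensed by its WHO defined daily dose and by a denominator of hospital activity; a minimal sketch (drug figures hypothetical, helper names our own):

```python
def ddds(grams_dispensed: float, ddd_grams: float) -> float:
    """Number of WHO defined daily doses represented by the grams dispensed."""
    return grams_dispensed / ddd_grams

def ddd_density(total_ddds: float, denominator: float, per: float) -> float:
    """Usage density, e.g. DDDs per 100 admissions or per 1000 patient-days."""
    return total_ddds / denominator * per

# Hypothetical month: 600 g of an agent whose DDD is 2 g,
# across 400 admissions and 2500 patient-days.
total = ddds(600, 2.0)                 # 300 DDDs
print(ddd_density(total, 400, 100))    # DDDs/100 admissions -> 75.0
print(ddd_density(total, 2500, 1000))  # DDDs/1000 patient-days -> 120.0
```

Summing such densities over all agents gives the per-admission and per-patient-day figures that the baseline and intervention periods are compared on.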
Conclusions
An effective antimicrobial stewardship program was implemented with limited resources on the medical-surgical service of a 100-bed community hospital.
doi:10.1186/2047-2994-1-32
PMCID: PMC3499185  PMID: 23043720
Antimicrobial stewardship; ASP; Small community hospital
14.  Outcome measurement of extensive implementation of antimicrobial stewardship in patients receiving intravenous antibiotics in a Japanese university hospital 
Background
Antimicrobial stewardship has not yet become widespread across medical institutions in Japan.
Methods
The infection control team reviewed the individual use of antibiotics, in accordance with published guidelines, in all inpatients receiving intravenous antibiotics (6348 and 6507 patients/year during the first and second annual interventions, respectively), provided consultation with physicians before prescription of antimicrobial agents, and organised an education programme on infection control for all medical staff. The outcomes of extensive implementation of antimicrobial stewardship were evaluated in terms of antimicrobial use density, treatment duration, duration of hospital stay, occurrence of antimicrobial-resistant bacteria and medical expenses.
Results
Prolonged use of antibiotics over 2 weeks was significantly reduced after active implementation of antimicrobial stewardship (2.9% vs. 5.2%, p < 0.001). Significant reduction in the antimicrobial consumption was observed in the second-generation cephalosporins (p = 0.03), carbapenems (p = 0.003), aminoglycosides (p < 0.001), leading to a reduction in the cost of antibiotics by 11.7%. The appearance of methicillin-resistant Staphylococcus aureus and the proportion of Serratia marcescens to Gram-negative bacteria decreased significantly from 47.6% to 39.5% (p = 0.026) and from 3.7% to 2.0% (p = 0.026), respectively. Moreover, the mean hospital stay was shortened by 2.9 days after active implementation of antimicrobial stewardship.
Conclusion
Extensive implementation of antimicrobial stewardship led to a decrease in the inappropriate use of antibiotics, saving in medical expenses, reduction in the development of antimicrobial resistance and shortening of hospital stay.
doi:10.1111/j.1742-1241.2012.02999.x
PMCID: PMC3469737  PMID: 22846073
15.  ICMR programme on Antibiotic Stewardship, Prevention of Infection & Control (ASPIC) 
Antimicrobial resistance and hospital infections have increased alarmingly in India. Antibiotic stewardship and hospital infection control are two broad strategies which have been employed globally to contain the problems of resistance and infections. For this to succeed, it is important to bring on board the various stakeholders in hospitals, especially the clinical pharmacologists. The discipline of clinical pharmacology needs to be involved in themes such as antimicrobial resistance and hospital infection which truly impact patient care. Clinical pharmacologists need to collaborate with faculty in other disciplines such as microbiology to achieve good outcomes for optimal patient care in the hospital setting. The ASPIC programme was initiated by the Indian Council of Medical Research (ICMR) in response to the above need and was designed to bring together faculty from clinical pharmacology, microbiology and other disciplines to collaborate on initiating and improving antibiotic stewardship and concurrently curbing hospital infections through feasible infection control practices. This programme involves the participation of 20 centres per year throughout the country which come together for a training workshop. Topics pertaining to the above areas are discussed in addition to planning a project which helps to improve antibiotic stewardship and infection control practices in the various centres. It is hoped that this programme would empower hospitals and institutions throughout the country to improve antibiotic stewardship and infection control and ultimately contain antimicrobial resistance.
PMCID: PMC4001333  PMID: 24718396
Antimicrobial Stewardship; ASPIC; ICMR workshop
16.  Clostridium difficile with Moxifloxacin/Clindamycin Resistance in Vegetables in Ohio, USA, and Prevalence Meta-Analysis 
Journal of Pathogens  2014;2014:158601.
We (i) determined the prevalence of Clostridium difficile and their antimicrobial resistance to six antimicrobial classes, in a variety of fresh vegetables sold in retail in Ohio, USA, and (ii) conducted cumulative meta-analysis of reported prevalence in vegetables since the 1990s. Six antimicrobial classes were tested for their relevance as risk factors for C. difficile infections (CDIs) (clindamycin, moxifloxacin) or their clinical priority as exhaustive therapeutic options (metronidazole, vancomycin, linezolid, and tigecycline). By using an enrichment protocol we isolated C. difficile from three of 125 vegetable products (2.4%). All isolates were toxigenic, and originated from 4.6% of 65 vegetables cultivated above the ground (n = 3; outer leaves of iceberg lettuce, green pepper, and eggplant). Root vegetables yielded no C. difficile. The C. difficile isolates belonged to two PCR ribotypes, one with an unusual antimicrobial resistance for moxifloxacin and clindamycin (lettuce and pepper; 027-like, A+B+CDT+; tcdC 18 bp deletion); the other PCR ribotype (eggplant, A+B+ CDT−; classic tcdC) was susceptible to all antimicrobials. Results of the cumulative weighted meta-analysis (6 studies) indicate that the prevalence of C. difficile in vegetables is 2.1% and homogeneous (P < 0.001) since the first report in 1996 (2.4%). The present study is the first report of the isolation of C. difficile from retail vegetables in the USA. Of public health relevance, antimicrobial resistance to moxifloxacin/clindamycin (a bacterial-associated risk factor for severe CDIs) was identified on the surface of vegetables that are consumed raw.
doi:10.1155/2014/158601
PMCID: PMC4279118  PMID: 25580297
17.  Control of multidrug resistant bacteria in a tertiary care hospital in India 
Background
The objective of this study was to assess the impact of antimicrobial stewardship programs on the multidrug resistance patterns of bacterial isolates. The study comprised an initial retrospective analysis of multidrug resistance in bacterial isolates for one year (July 2007-June 2008) followed by prospective evaluation of the impact of Antimicrobial Stewardship programs on resistance for two years and nine months (July 2008-March 2011).
Setting
A 300-bed tertiary care private hospital in Gurgaon, Haryana (India)
Methods
Study Design
• July 2007 to June 2008: Resistance patterns of bacterial isolates were studied.
• July 2008: Phase I intervention programme: implementation of an antibiotic policy in the hospital.
• July 2008 to June 2010: Assessment of the impact of the Phase I intervention programme.
• July 2010 to March 2011: Phase II intervention programme: Formation and effective functioning of the antimicrobial stewardship committee. Statistical correlation of the Defined daily dose (DDD) for prescribed drugs with the antimicrobial resistance of Gram negatives.
Results
Phase I intervention programme (July 2008) resulted in a decrease of 4.47% in ESBLs (E.coli and Klebsiella) and a significant decrease of 40.8% in carbapenem-resistant Pseudomonas. Phase II intervention (July 2010) brought a significant reduction (24.7%) in carbapenem-resistant Pseudomonas. However, the resistance in the other Gram negatives (E.coli, Klebsiella, and Acinetobacter) rose and then stabilized. A positive correlation was observed in Pseudomonas and Acinetobacter with carbapenems and cefoperazone-sulbactam.
Piperacillin-tazobactam showed a positive correlation with Acinetobacter only. E.coli and Klebsiella showed positive correlations with cefoperazone-sulbactam and piperacillin-tazobactam.
Conclusion
An antimicrobial stewardship programme with sustained and multifaceted efforts is essential to promote the judicious use of antibiotics.
doi:10.1186/2047-2994-1-23
PMCID: PMC3524029  PMID: 22958481
Carbapenem resistance; Gram negatives; Antimicrobial stewardship program; DDD and Antimicrobial resistance
18.  Characterisation of Clostridium difficile Hospital Ward–Based Transmission Using Extensive Epidemiological Data and Molecular Typing 
PLoS Medicine  2012;9(2):e1001172.
A population-based study in Oxfordshire (UK) hospitals by Sarah Walker and colleagues finds that in an endemic setting with good infection control, ward-based contact cannot account for most new cases of Clostridium difficile infection.
Background
Clostridium difficile infection (CDI) is a leading cause of antibiotic-associated diarrhoea and is endemic in hospitals, hindering the identification of sources and routes of transmission based on shared time and space alone. This may compromise rational control despite costly prevention strategies. This study aimed to investigate ward-based transmission of C. difficile, by subdividing outbreaks into distinct lineages defined by multi-locus sequence typing (MLST).
Methods and Findings
All C. difficile toxin enzyme-immunoassay-positive and culture-positive samples over 2.5 y from a geographically defined population of ∼600,000 persons underwent MLST. Sequence types (STs) were combined with admission and ward movement data from an integrated comprehensive healthcare system incorporating three hospitals (1,700 beds) providing all acute care for the defined geographical population. Networks of cases and potential transmission events were constructed for each ST. Potential infection sources for each case and transmission timescales were defined by prior ward-based contact with other cases sharing the same ST. From 1 September 2007 to 31 March 2010, there were means of 102 tests and 9.4 CDIs per 10,000 overnight stays in inpatients, and 238 tests and 15.7 CDIs per month in outpatients/primary care. In total, 1,276 C. difficile isolates of 69 STs were studied. From MLST, no more than 25% of cases could be linked to a potential ward-based inpatient source, ranging from 37% in renal/transplant, 29% in haematology/oncology, and 28% in acute/elderly medicine to 6% in specialist surgery. Most of the putative transmissions identified occurred shortly (≤1 wk) after the onset of symptoms (141/218, 65%), with few >8 wk (21/218, 10%). Most incubation periods were ≤4 wk (132/218, 61%), with few >12 wk (28/218, 13%). Allowing for persistent ward contamination following ward discharge of a CDI case did not increase the proportion of linked cases after allowing for random meeting of matched controls.
Conclusions
In an endemic setting with well-implemented infection control measures, ward-based contact with symptomatic enzyme-immunoassay-positive patients cannot account for most new CDI cases.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Hospital-acquired infections are common and occur when patients are infected with an organism found in the hospital or health-care environment. Hospital-acquired infections can sometimes cause no symptoms but often lead to illness or even death. A leading hospital-acquired infection is with the anaerobic bacterium Clostridium difficile, which causes gastrointestinal problems, including diarrhea, leading to severe illness and even death, especially in older patients or patients who are already seriously ill. Between 7% and 26% of elderly adult inpatients in hospitals may be asymptomatic carriers of C. difficile, and the spores that are formed by this organism can live outside of the human body for long periods of time and are notoriously resistant to most routine surface-cleaning methods. Following major hospital-associated outbreaks around the world, C. difficile infection has become a prime target for expensive prevention and infection control strategies.
Why Was This Study Done?
Prevention strategies and infection control measures have contributed to reducing the incidence of C. difficile infection; however, to date, there have not been any robust evaluations of the impact of such strategies in reducing the spread of infection at the individual level. In order to implement improved, cost-effective policies, and to work out how to reduce incidence even further, a better understanding of person-to-person spread is crucial, especially as infection with C. difficile depends on a combination of factors, such as antibiotic exposure and host susceptibility. Therefore, the researchers conducted this study to examine in detail the transmission of C. difficile in hospital wards in order to give more insight and information on the nature of person-to-person spread.
What Did the Researchers Do and Find?
The researchers used a population-based study in Oxfordshire, UK, to investigate hospital ward–based transmission of defined C. difficile strains from symptomatic patients by identifying C. difficile infection from routine clinical microbiological samples from 1 September 2007 to 31 March 2010. Throughout this period, Oxfordshire hospitals operated a rigorous infection control policy monitored by infection control staff, in which stool samples for C. difficile testing were taken from admitted patients with persistent diarrhea, and from patients with any diarrhea who were 65 years or older. The researchers tested all stool samples for C. difficile toxins by enzyme immunoassay, cultured positive samples, and genotyped C. difficile isolates by using multi-locus sequence typing (to identify strains, that is, sequence types), and finally, constructed networks of cases and potential transmissions (by tracing contacts for up to 26 weeks) for each sequence type identified.
In order to show which ward-based contacts potentially incorporated direct person-to-person spread and indirect transmission via the environment during shared ward exposure, the researchers analysed links (ward contacts) between the first case (the donor) and the second case (the recipient) for all pairs of cases with the same sequence type. The researchers then calculated the minimum infectious period by measuring the time between the first infected stool sample from the donor and ward contact with the recipient, and calculated the incubation period as the time between this ward contact and the first infected stool sample in the recipient. To reduce the possibility of shared ward contacts occurring by chance, the researchers used patients with negative enzyme immunoassay stool samples as controls to estimate how often such ward contacts reflected actual transmission rather than chance.
Over the study period, almost 30,000 stool samples from almost 15,000 patients were tested for C. difficile, with 4.4% (1,282) found positive for C. difficile in enzyme immunoassay and culture. With genotyping, the researchers identified 69 strains (sequence types) of C. difficile. The researchers found that the majority (66%) of cases of C. difficile infection were not linked to known cases, and only 23% had a credible ward-based donor sharing the same sequence type of C. difficile. Furthermore, the researchers found that most probable transmissions occurred less than one week after the onset of symptoms, with a minority (10%) occurring after eight weeks. Most incubation periods were less than four weeks, but a few (13%) were more than 12 weeks. Importantly, even after allowing for the random meeting of matched controls and for persistent ward contamination, the proportion of linked cases did not increase following ward discharge of a C. difficile infection case.
What Do These Findings Mean?
These findings show that in an endemic setting with well-implemented infection control measures, ward-based contact with symptomatic, enzyme-immunoassay-positive patients cannot account for most new cases of C. difficile infection. Crucially, these findings mean that C. difficile infection might not be effectively controlled by current strategies to prevent person-to-person spread. Although the researchers were able to distinguish different strains of C. difficile, there were insufficient numbers of these different strains to deduce whether the results they obtained might be different if there was a different combination of strain types, that is, if some strains were spreading more in hospitals than others. Finally, in order to determine what other types of control interventions are required to reduce the spread of C. difficile, a better understanding of other routes of transmission and reservoirs of infectivity is needed.
Additional Information
Please access these web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001172.
This study is further discussed in a PLoS Medicine Perspective by Stephan Harbarth and Matthew Samore
The US Centers for Disease Control and Prevention provides information about C. difficile infection, as does the UK Health Protection Agency
The World Health Organization has published a guide for preventing hospital-acquired infections
doi:10.1371/journal.pmed.1001172
PMCID: PMC3274560  PMID: 22346738
19.  Incidence of and risk factors for community-associated Clostridium difficile infection: A nested case-control study 
BMC Infectious Diseases  2011;11:194.
Background
Clostridium difficile is the most common cause of nosocomial infectious diarrhea in the United States. However, recent reports have documented that C. difficile infections (CDIs) are occurring among patients without traditional risk factors. The purpose of this study was to examine the epidemiology of community-associated CDI (CA-CDI) by estimating the incidence of CA-CDI and hospital-acquired CDI (HA-CDI), identifying patient-related risk factors for CA-CDI, and describing adverse health outcomes of CA-CDI.
Methods
We conducted a population-based, retrospective, nested, case-control study within the University of Iowa Wellmark Data Repository from January 2004 to December 2007. We identified persons with CDI, determined whether infection was community-associated (CA) or hospital-acquired (HA), and calculated incidence rates. We collected demographic, clinical, and pharmacologic information for CA-CDI cases and controls (i.e., persons without CDI). We used conditional logistic regression to estimate the odds ratios (ORs) for potential risk factors for CA-CDI.
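The incidence rates in the Methods are simple person-time arithmetic; in the sketch below the denominator of 5,000,000 person-years is an assumption chosen so the example reproduces the reported CA-CDI rate, not a figure from the study.

```python
def incidence_per_100k_person_years(cases, person_years):
    """Incidence rate expressed per 100,000 person-years of follow-up."""
    return cases * 100_000 / person_years

# Illustrative: 558 cases over an assumed 5,000,000 person-years
print(incidence_per_100k_person_years(558, 5_000_000))  # 11.16
```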
Results
The incidence rates for CA-CDI and HA-CDI were 11.16 and 12.1 cases per 100,000 person-years, respectively. CA-CDI cases were more likely than controls to receive antimicrobials (adjusted OR, 6.09 [95% CI 4.59-8.08]) and gastric acid suppressants (adjusted OR, 2.30 [95% CI 1.56-3.39]) in the 180 days before diagnosis. Controlling for other covariates, increased risk for CA-CDI was associated with use of beta-lactam/beta-lactamase inhibitors, cephalosporins, clindamycin, fluoroquinolones, macrolides, and penicillins. However, 27% of CA-CDI cases did not receive antimicrobials in the 180 days before their diagnoses, and 17% did not have any traditional risk factors for CDI.
Conclusions
Our study documented that the epidemiology of CDI is changing, with CA-CDI occurring in populations not traditionally considered "high-risk" for the disease. Clinicians should consider this diagnosis and obtain appropriate diagnostic testing for outpatients with persistent or severe diarrhea who have even remote antimicrobial exposure.
doi:10.1186/1471-2334-11-194
PMCID: PMC3154181  PMID: 21762504
20.  Antimicrobial susceptibility profiles of human and piglet Clostridium difficile PCR-ribotype 078 
In the last decade, outbreaks of nosocomial Clostridium difficile infections (CDI) occurred worldwide. A new emerging type, PCR-ribotype 027, was the associated pathogen. Antimicrobial susceptibility profiles of this type were extensively investigated and used to partly explain its spread. In Europe, the incidence of C. difficile PCR-ribotype 078 recently increased in humans and piglets. Using recommendations of the European Committee on Antimicrobial Susceptibility Testing (EUCAST) and the Clinical and Laboratory Standards Institute (CLSI) we studied the antimicrobial susceptibility to eight antimicrobials, mechanisms of resistance and the relation with previously prescribed antimicrobials in human (n=49) and porcine (n=50) type 078 isolates. Human and porcine type 078 isolates showed similar antimicrobial susceptibility patterns for the antimicrobials tested. In total, 37% of the isolates were resistant to four or more antimicrobial agents. The majority of the human and porcine isolates were susceptible to amoxicillin (100%), tetracycline (100%) and clindamycin (96%) and resistant to ciprofloxacin (96%). More variation was found for resistance patterns to erythromycin (76% in human and 59% in porcine isolates), imipenem (29% in human and 50% in porcine isolates) and moxifloxacin (16% for both human and porcine isolates). MIC values of cefuroxime were high (MICs >256 mg/L) in 96% of the isolates. Resistance to moxifloxacin and clindamycin was associated with a gyrA mutation and the presence of the erm(B) gene, respectively. A large proportion (96%) of the erythromycin resistant isolates did not carry the erm(B) gene. The use of ciprofloxacin (humans) and enrofloxacin (pigs) was significantly associated with isolation of moxifloxacin resistant isolates. Increased fluoroquinolone use could have contributed to the spread of C. difficile type 078.
doi:10.1186/2047-2994-2-14
PMCID: PMC3651393  PMID: 23566553
21.  Unnecessary use of fluoroquinolone antibiotics in hospitalized patients 
BMC Infectious Diseases  2011;11:187.
Background
Fluoroquinolones are among the most commonly prescribed antimicrobials and are an important risk factor for colonization and infection with fluoroquinolone-resistant gram-negative bacilli and for Clostridium difficile infection (CDI). In this study, our aim was to determine current patterns of inappropriate fluoroquinolone prescribing among hospitalized patients, and to test the hypothesis that longer than necessary treatment durations account for a significant proportion of unnecessary fluoroquinolone use.
Methods
We conducted a 6-week prospective, observational study to determine the frequency of, reasons for, and adverse effects associated with unnecessary fluoroquinolone use in a tertiary-care academic medical center. For randomly-selected adult inpatients receiving fluoroquinolones, therapy was determined to be necessary or unnecessary based on published guidelines or standard principles of infectious diseases. Adverse effects were determined based on chart review 6 weeks after completion of therapy.
Results
Of 1,773 days of fluoroquinolone therapy, 690 (39%) were deemed unnecessary. The most common reasons for unnecessary therapy included administration of antimicrobials for non-infectious or non-bacterial syndromes (292 days-of-therapy) and administration of antimicrobials for longer than necessary durations (234 days-of-therapy). The most common syndrome associated with unnecessary therapy was urinary tract infection or asymptomatic bacteriuria (30% of all unnecessary days-of-therapy). Twenty-seven percent (60/227) of regimens were associated with adverse effects possibly attributable to therapy, including gastrointestinal adverse effects (14% of regimens), colonization by resistant pathogens (8% of regimens), and CDI (4% of regimens).
Conclusions
In our institution, 39% of all days of fluoroquinolone therapy were unnecessary. Interventions that focus on improving adherence with current guidelines for duration of antimicrobial therapy and for management of urinary syndromes could significantly reduce overuse of fluoroquinolones.
doi:10.1186/1471-2334-11-187
PMCID: PMC3145580  PMID: 21729289
22.  Risk factors for recurrent Clostridium difficile infection (CDI) hospitalization among hospitalized patients with an initial CDI episode: a retrospective cohort study 
BMC Infectious Diseases  2014;14:306.
Background
Recurrent Clostridium difficile infection (rCDI) is observed in up to 25% of patients with an initial CDI episode (iCDI). We assessed risk factors for rCDI among patients hospitalized with iCDI.
Methods
We performed a retrospective cohort study at Barnes-Jewish Hospital from 1/1/03 to 12/31/09. iCDI was defined as a positive toxin assay for C. difficile with no CDI in previous 60 days, and rCDI as a repeat positive toxin ≤42 days of stopping iCDI treatment. Three demographic, 13 chronic and 12 acute disease characteristics, and 21 processes of care were assessed for association with rCDI. Cox modeling identified independent risk factors for rCDI.
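The 42-day recurrence window in the definition above amounts to a date comparison; the dates in the sketch below are hypothetical.

```python
from datetime import date, timedelta

def is_rcdi(treatment_end, repeat_positive):
    """Per the study definition sketched here: a repeat positive toxin
    assay within 42 days of stopping iCDI treatment counts as rCDI."""
    return repeat_positive <= treatment_end + timedelta(days=42)

# Hypothetical patients: 40 days after treatment end vs. 45 days after
print(is_rcdi(date(2009, 3, 1), date(2009, 4, 10)))  # True
print(is_rcdi(date(2009, 3, 1), date(2009, 4, 15)))  # False
```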
Results
425 (10.1%) of 4,200 patients enrolled developed rCDI. Of the eight risk factors for rCDI on multivariate analyses, the strongest three were 1) high-risk antimicrobials following completion of iCDI treatment (HR 2.95, 95% CI 2.25-3.86), 2) community-onset healthcare-associated iCDI (HR 1.80, 95% CI 1.41-2.29) and 3) fluoroquinolones after completion of iCDI treatment (HR 1.56, 95% CI 1.63-2.08). Other risk factors included gastric acid suppression, ≥2 hospitalizations within prior 60 days, age, and IV vancomycin after iCDI treatment ended.
Conclusions
The rCDI rate was 10.1%. Recognizing such modifiable risk factors as certain antimicrobial treatments and gastric acid suppression may help optimize prevention efforts.
doi:10.1186/1471-2334-14-306
PMCID: PMC4050409  PMID: 24898123
C. difficile; Risk factors; Recurrence
23.  Cost-effectiveness analysis of fidaxomicin versus vancomycin in Clostridium difficile infection 
Journal of Antimicrobial Chemotherapy  2014;69(11):2901-2912.
Objectives
Fidaxomicin was non-inferior to vancomycin with respect to clinical cure rates in the treatment of Clostridium difficile infections (CDIs) in two Phase III trials, but was associated with significantly fewer recurrences than vancomycin. This economic analysis investigated the cost-effectiveness of fidaxomicin compared with vancomycin in patients with severe CDI and in patients with their first CDI recurrence.
Methods
A 1 year time horizon Markov model with seven health states was developed from the perspective of Scottish public healthcare providers. Model inputs for effectiveness, resource use, direct costs and utilities were obtained from published sources and a Scottish expert panel. The main model outcome was the incremental cost-effectiveness ratio (ICER), expressed as cost per quality-adjusted life year (QALY), for fidaxomicin versus vancomycin; ICERs were interpreted using willingness-to-pay thresholds of £20 000/QALY and £30 000/QALY. One-way and probabilistic sensitivity analyses were performed.
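The ICER defined above is the incremental cost divided by the incremental QALYs, with "dominant" meaning cheaper and more effective. A minimal sketch using the abstract's rounded figures follows; the published ICER of £16 529/QALY differs slightly because the model's unrounded inputs are not reported here.

```python
def icer(cost_new, cost_old, qaly_new, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained.
    Returns 'dominant' when the new therapy costs no more AND adds QALYs."""
    d_cost = cost_new - cost_old
    d_qaly = qaly_new - qaly_old
    if d_cost <= 0 and d_qaly > 0:
        return "dominant"
    return d_cost / d_qaly

# Severe CDI: £14,515 vs £14,344, QALY gain +0.010 -> ~£17,100/QALY
# (close to the published £16,529; inputs here are rounded)
print(round(icer(14515, 14344, 0.010, 0.0)))
# First recurrence: fidaxomicin cheaper and more effective -> dominant
print(icer(16535, 16926, 0.019, 0.0))
```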
Results
Total costs were similar with fidaxomicin and vancomycin in patients with severe CDI (£14 515 and £14 344, respectively) and in patients with a first recurrence (£16 535 and £16 926, respectively). Improvements in clinical outcomes with fidaxomicin resulted in small QALY gains versus vancomycin (severe CDI, +0.010; patients with first recurrence, +0.019). Fidaxomicin was cost-effective in severe CDI (ICER £16 529/QALY) and dominant (i.e. more effective and less costly) in patients with a first recurrence. The probability that fidaxomicin was cost-effective at a willingness-to-pay threshold of £30 000/QALY was 60% for severe CDI and 68% in a first recurrence.
Conclusions
Fidaxomicin is cost-effective in patients with severe CDI and in patients with a first CDI recurrence versus vancomycin.
doi:10.1093/jac/dku257
PMCID: PMC4195473  PMID: 25096079
economic; model; antibacterials
24.  National Institute of Allergy and Infectious Disease (NIAID) Funding for Studies of Hospital-Associated Bacterial Pathogens: Are Funds Proportionate to Burden of Disease? 
Background
Hospital-associated infections (HAIs) are associated with a considerable burden of disease and direct costs greater than $17 billion. The pathogens that cause the majority of serious HAIs are Enterococcus faecium, Staphylococcus aureus, Clostridium difficile, Klebsiella pneumoniae, Acinetobacter baumannii, Pseudomonas aeruginosa, and Enterobacter species, referred to as ESCKAPE. We aimed to determine the amount of funding the National Institutes of Health (NIH) National Institute of Allergy and Infectious Diseases (NIAID) allocates to research on antimicrobial resistant pathogens, particularly ESCKAPE pathogens.
Methods
The NIH Research Portfolio Online Reporting Tools (RePORT) database was used to identify NIAID antimicrobial resistance research grants funded in 2007-2009 using the terms "antibiotic resistance," "antimicrobial resistance," and "hospital-associated infection."
Results
Funding for antimicrobial resistance grants increased from 2007 to 2009. Antimicrobial resistance funding for bacterial pathogens saw a smaller increase than funding for non-bacterial pathogens. Total funding for all ESCKAPE pathogens was $22,005,943 in 2007, $30,810,153 in 2008, and $49,801,227 in 2009. S. aureus grants received $29,193,264 in FY2009, the highest funding amount of all the ESCKAPE pathogens. Based on 2009 funding data, approximately $1,565 of research money was spent per S. aureus-related death and $750 per C. difficile-related death.
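The per-death figures in the Results are simple division; the death count used below (~18,650 for S. aureus) is an assumption back-calculated from the quoted funding and per-death figures, not a number stated in the abstract.

```python
def funding_per_death(funding_dollars, deaths):
    """Research dollars spent per pathogen-attributable death."""
    return funding_dollars / deaths

# Assumed ~18,650 S. aureus-related deaths (back-calculated, not from
# the abstract) against the quoted FY2009 funding of $29,193,264
print(round(funding_per_death(29_193_264, 18_650)))  # 1565
```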
Conclusions
Although the funding for ESCKAPE pathogens increased from 2007 to 2009, funding levels for grants related to antimicrobial-resistant bacteria are still lower than funding for antimicrobial-resistant non-bacterial pathogens. Efforts may be needed to improve research funding for resistant bacterial pathogens, particularly as their clinical burden increases.
doi:10.1186/2047-2994-1-5
PMCID: PMC3415121  PMID: 22958856
Antibiotic resistance; NIH; Hospital-associated infection; research funding; disease burden
25.  Reductions in intestinal Clostridiales precede the development of nosocomial Clostridium difficile infection 
Microbiome  2013;1:18.
Background
Antimicrobial use is thought to suppress the intestinal microbiota, thereby impairing colonization resistance and allowing Clostridium difficile to infect the gut. Additional risk factors such as proton-pump inhibitors may also alter the intestinal microbiota and predispose patients to Clostridium difficile infection (CDI). This comparative metagenomic study investigates the relationship between epidemiologic exposures, intestinal bacterial populations and subsequent development of CDI in hospitalized patients. We performed a nested case–control study including 25 CDI cases and 25 matched controls. Fecal specimens collected prior to disease onset were evaluated by 16S rRNA gene amplification and pyrosequencing to determine the composition of the intestinal microbiota during the at-risk period.
Results
The diversity of the intestinal microbiota was significantly reduced prior to an episode of CDI. Sequences corresponding to the phylum Bacteroidetes and to the families Bacteroidaceae and Clostridiales Incertae Sedis XI were depleted in CDI patients compared to controls, whereas sequences corresponding to the family Enterococcaceae were enriched. In multivariable analyses, cephalosporin and fluoroquinolone use, as well as a decrease in the abundance of Clostridiales Incertae Sedis XI were significantly and independently associated with CDI development.
Conclusions
This study shows that a reduction in the abundance of a specific bacterial family - Clostridiales Incertae Sedis XI - is associated with risk of nosocomial CDI and may represent a target for novel strategies to prevent this life-threatening infection.
doi:10.1186/2049-2618-1-18
PMCID: PMC3971611  PMID: 24450844
Intestinal microbiota; Clostridium difficile infection; 16S rRNA gene sequencing; Clostridiales Incertae Sedis XI
