The objective of this study was to analyze the transmission dynamics of ESBL-positive Klebsiella spp. with additional resistance to gentamicin (ESBL-G) in a Dutch region of 650,000 inhabitants in 2012.
All patient-related ESBL-G isolates from 2012 were genotyped using both Amplified Fragment Length Polymorphism (AFLP) and High-throughput MultiLocus Sequence Typing (HiMLST). HiMLST was used to detect (previously unidentified) clusters of ESBL-G-positive patients. Furthermore, all consecutive ESBL-G isolates within patients were studied to evaluate the intra-patient variation of antibiotic phenotypes.
There were 38 ESBL-G isolates, classified into 18 different sequence types (STs) and 21 different AFLP types. Within the STs, four clusters were detected, of which two were previously unknown, resulting in a transmission index of 0.27. An analysis of consecutive ESBL-G isolates (with identical STs) within patients showed that for 68.8% of the patients at least one consecutive isolate had a different antibiotic phenotype.
The transmission of ESBL-G in the Kennemerland region in 2012 was polyclonal, with several outbreaks showing a high level of epidemiological linkage. Furthermore, clustering by antibiotic phenotype appears to be an inadequate approach in this setting. Routine molecular typing of collected ESBL-G isolates may help detect transmission at an early stage, opening the possibility of a rapid response.
Livestock-associated MRSA (MC398) has emerged and is related to an extensive reservoir in pigs and veal calves. Individuals with direct contact with these animals, and their family members, are known to have high MC398 carriage rates. Until now it was assumed that MC398 does not spread to individuals in the community without pig or veal calf exposure. To test this, we determined the proportion of MC398 among MRSA-positive individuals without contact with pigs/veal calves or other known risk factors (MRSA of unknown origin; MUO).
In 17 participating hospitals, we determined over a two-year period the occurrence of MC398 in individuals without direct contact with livestock and with no other known risk factor (n = 271), and tested post hoc the hypothesis that hospitals in pig-dense areas have higher proportions of MC398 among all MUO.
Fifty-six individuals (20.7%) without animal contact carried MC398. In hospitals with high pig densities in the catchment area, the proportion of MC398 among all MUO was higher than in hospitals without pigs in the surroundings.
One fifth of the individuals carrying MUO carried MC398; thus, MC398 is found in individuals without contact with pigs or veal calves. The route of transmission from the animal reservoir to these individuals is unclear; human-to-human transmission and exposure to the surroundings of stables are plausible explanations. Further research is needed to clarify the transmission route.
We collected 110 Salmonella enterica isolates from sick pigs, determined their serotypes, their genotypes by pulsed-field gel electrophoresis (PFGE), and their susceptibility to 12 antimicrobials, and compared the data with a collection of 18,280 isolates obtained from humans. The pig isolates fell into 12 serovars common in human salmonellosis in Taiwan; S. Typhimurium, S. Choleraesuis, S. Derby, S. Livingstone, and S. Schwarzengrund were the 5 most common and together accounted for 84% of the collection. Of the 110 isolates, 106 (96%) were multidrug resistant (MDR) and 48 (44%) had PFGE patterns found in human isolates. S. Typhimurium, S. Choleraesuis, and S. Schwarzengrund were among the most highly resistant serovars, with the majority of isolates of these 3 serovars resistant to 8–11 of the tested antimicrobials. Isolates from pigs and humans sharing a common PFGE pattern displayed identical or very similar resistance patterns, and Salmonella strains that caused severe infection in pigs were also capable of causing infections in humans. The results indicate that pigs are one of the major reservoirs of human salmonellosis in Taiwan. Almost all of the pig isolates were MDR, which highlights the necessity of strictly regulating the use of antimicrobials in the agriculture sector in Taiwan.
Community-acquired K. pneumoniae pneumonia is still common in Asia and is reportedly associated with alcohol use. Oropharyngeal carriage of K. pneumoniae could potentially play a role in the pathogenesis of K. pneumoniae pneumonia; however, little is known about K. pneumoniae oropharyngeal carriage rates and risk factors. This population-based cross-sectional study explores the association of a variety of demographic and socioeconomic factors, as well as alcohol consumption, with oropharyngeal carriage of K. pneumoniae in Vietnam.
Methods and Findings
1029 subjects were randomly selected from age, sex, and urban/rural strata. An additional 613 adult men from a rural environment were recruited and analyzed separately to determine the effects of alcohol consumption. Demographic, socioeconomic, and oropharyngeal carriage data were acquired for each subject. The overall carriage rate of K. pneumoniae was 14.1% (145/1029, 95% CI 12.0%–16.2%). By stepwise logistic regression, K. pneumoniae carriage was found to be independently associated with age (OR 1.03, 95% CI 1.02–1.04), smoking (OR 1.9, 95% CI 1.3–2.9), rural living location (OR 1.6, 95% CI 1.1–2.4), and level of weekly alcohol consumption (OR 1.7, 95% CI 1.04–2.8).
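The reported overall carriage rate and its confidence interval can be reproduced with a normal-approximation (Wald) interval for a proportion. This is an illustrative sketch, not the authors' actual statistical code:

```python
from math import sqrt

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% confidence interval for a proportion."""
    p = successes / n
    se = sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# 145 carriers among 1029 subjects, as reported
p, lo, hi = wald_ci(145, 1029)
print(f"{p:.1%} (95% CI {lo:.1%}-{hi:.1%})")  # prints 14.1% (95% CI 12.0%-16.2%)
```

The Wald interval matches the published figures here; for proportions nearer 0 or 1, a Wilson or exact interval would be the safer choice.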
Moderate to heavy weekly alcohol consumption, older age, smoking, and living in a rural location were all associated with an increased risk of K. pneumoniae carriage in Vietnamese communities. Whether K. pneumoniae carriage is a risk factor for pneumonia remains to be elucidated.
The aim of this study was to compare current screening methods and to evaluate confirmation tests for phenotypic detection of plasmid-mediated AmpC (pAmpC).
For this evaluation we used 503 Enterobacteriaceae isolates from 18 Dutch hospitals and 21 isolates previously confirmed to be pAmpC positive. All isolates were divided into three groups: isolates with (1) reduced susceptibility to ceftazidime and/or cefotaxime; (2) reduced susceptibility to cefoxitin; (3) reduced susceptibility to ceftazidime and/or cefotaxime combined with reduced susceptibility to cefoxitin. Two disk-based tests, with cloxacillin or boronic acid as inhibitor, and an Etest with cefotetan-cefotetan/cloxacillin were used for phenotypic AmpC confirmation. Finally, the presence of pAmpC genes was tested by multiplex and singleplex PCR.
We identified 13 pAmpC-producing Enterobacteriaceae among the 503 isolates (2.6%): 9 CMY-2, 3 DHA-1 and 1 ACC-1 type, all in E. coli. The sensitivity and specificity of reduced susceptibility to ceftazidime and/or cefotaxime combined with cefoxitin were 97% (33/34) and 90% (289/322), respectively. The disk-based test with cloxacillin performed best as a phenotypic confirmation method for AmpC production.
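The quoted sensitivity and specificity follow directly from the stated fractions. A minimal sketch (the false-positive count of 33 is derived from the 289/322 figure, not stated explicitly in the text):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# 33 of 34 true pAmpC positives flagged; 289 of 322 negatives correctly excluded,
# so fp = 322 - 289 = 33 (derived, hypothetical breakdown)
sens, spec = sens_spec(tp=33, fn=1, tn=289, fp=33)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # sensitivity 97%, specificity 90%
```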
For routine phenotypic detection of pAmpC, screening for reduced susceptibility to third-generation cephalosporins combined with reduced susceptibility to cefoxitin is recommended. Confirmation via a combination disk diffusion test using cloxacillin is the best phenotypic option. The prevalence found is worrisome since, owing to their plasmid location, pAmpC genes may spread further and increase in prevalence.
We investigated the relationship between average monthly temperature and the most common clinical pathogens causing infections in intensive care patients.
A prospective unit-based study was conducted in 73 German intensive care units located in 41 different hospitals and 31 different cities, with a total of 188,949 pathogen isolates (102,377 Gram-positive and 86,572 Gram-negative) from 2001 to 2012. We estimated the relationship between the number of clinical pathogens per month and the average temperature in the month of isolation and in the month prior to isolation, adjusting for confounders and long-term trends using time series analysis. Adjusted incidence rate ratios for temperature parameters were estimated with generalized estimating equation models, which account for clustering effects.
The incidence density of Gram-negative pathogens was 15% (IRR 1.15, 95%CI 1.10–1.21) higher at temperatures ≥20°C than at temperatures below 5°C. E. cloacae occurred 43% (IRR = 1.43; 95%CI 1.31–1.56) more frequently at high temperatures, A. baumannii 37% (IRR = 1.37; 95%CI 1.11–1.69), S. maltophilia 32% (IRR = 1.32; 95%CI 1.12–1.57), K. pneumoniae 26% (IRR = 1.26; 95%CI 1.13–1.39), Citrobacter spp. 19% (IRR = 1.19; 95%CI 0.99–1.44) and coagulase-negative staphylococci 13% (IRR = 1.13; 95%CI 1.04–1.22). By contrast, S. pneumoniae was isolated 35% (IRR = 0.65; 95%CI 0.50–0.84) less frequently at high temperatures. For each 5°C increase, we observed a 3% (IRR = 1.03; 95%CI 1.02–1.04) increase in Gram-negative pathogens. This increase was highest for A. baumannii at 8% (IRR = 1.08; 95%CI 1.05–1.12), followed by K. pneumoniae, Citrobacter spp. and E. cloacae at 7%.
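As an interpretation aid (not part of the study itself), a per-5°C IRR can be extrapolated across a wider temperature span, under the assumption that the log-linear relationship holds over the whole range:

```python
def irr_over_span(irr_per_5c, delta_t_celsius):
    """Extrapolate a per-5°C incidence rate ratio across a temperature span,
    assuming the log-linear temperature-rate relationship holds throughout."""
    return irr_per_5c ** (delta_t_celsius / 5)

# Hypothetical illustration: A. baumannii's reported IRR of 1.08 per 5°C
# compounds to roughly a 26% higher rate across a 15°C seasonal difference.
print(round(irr_over_span(1.08, 15), 2))  # 1.26
```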
Incidence densities of clinical pathogens vary with temperature. Significantly higher incidence densities of Gram-negative pathogens were observed during summer, whereas S. pneumoniae peaked in winter. There is increasing evidence that seasonal physiological changes in host susceptibility underlie the differing seasonality of bacterial pathogens. Even though the underlying mechanisms are not yet clear, the temperature-dependent seasonality of pathogens has implications for infection control and study design.
In the Netherlands a successful MRSA Search and Destroy policy is applied in healthcare institutions. We determined the effect of an adjustment of the MRSA Search and Destroy policy for patients in the outpatient clinic on MRSA transmission to health care workers (HCWs).
In June 2008 an adjustment of the policy for outpatients was introduced in a large teaching hospital. Following this adjustment, MRSA-positive patients and patients at risk could be seen and treated applying general precautions, without additional protective measures. Disinfection of the room after the patient had left was also abandoned. To monitor the effect of this policy on MRSA transmission, all physicians and health care workers of the outpatient clinic were repeatedly screened for MRSA carriage.
Before the introduction of the adjusted policy, all physicians and HCWs of the outpatient clinic were screened (baseline measurement, n = 1,073); none was MRSA positive. After introduction of the policy in June 2008, the screening was repeated in October 2008 (n = 1,170) and April 2009 (n = 1,128). In April 2009 one health care worker was MRSA positive, resulting in a mean prevalence of 0.09%, which is lower than the known prevalence in HCWs. The health care worker was colonized with the livestock-related spa type t011. As far as we could verify, no patients with this spa type had been cared for by this health care worker.
The adjusted MRSA policy did not lead to detectable transmission of MRSA to HCWs and was associated with fewer disruptions to the workflow.
MRSA; Outpatient clinic; Outpatients; Staphylococcus aureus; Search & Destroy policy; Transmission
We studied clinical characteristics, appropriateness of initial antibiotic treatment, and other factors associated with day 30 mortality in patients with bacteremia caused by extended-spectrum-β-lactamase (ESBL)-producing bacteria in eight Dutch hospitals. Retrospectively, information was collected from 232 consecutive patients with ESBL bacteremia (due to Escherichia coli, Klebsiella pneumoniae, and Enterobacter cloacae) between 2008 and 2010. In this cohort (median age of 65 years; 24 patients were <18 years of age), many had comorbidities, such as malignancy (34%) or recurrent urinary tract infection (UTI) (15%). One hundred forty episodes (60%) were nosocomial, 54 (23%) were otherwise health care associated, and 38 (16%) were community acquired. The most frequent sources of infection were UTI (42%) and intra-abdominal infection (28%). Appropriate therapy within 24 h after bacteremia onset was prescribed to 37% of all patients and to 54% of known ESBL carriers. The day 30 mortality rate was 20%. In a multivariable analysis, a Charlson comorbidity index of ≥3, an age of ≥75 years, intensive care unit (ICU) stay at bacteremia onset, a non-UTI bacteremia source, and presentation with severe sepsis, but not inappropriate therapy within <24 h (adjusted odds ratio [OR], 1.53; 95% confidence interval [CI], 0.68 to 3.45), were associated with day 30 mortality. Further assessment of confounding and a stratified analysis for patients with UTI and non-UTI origins of infection did not reveal a statistically significant effect of inappropriate therapy on day 30 mortality, and these results were insensitive to the possible misclassification of patients who had received β-lactam–β-lactamase inhibitor combinations or ceftazidime as initial treatment. In conclusion, ESBL bacteremia occurs mostly in patients with comorbidities requiring frequent hospitalization, and 84% of episodes were health care associated. 
Factors other than inappropriate therapy within <24 h determined day 30 mortality.
Selective decontamination of the digestive tract (SDD) selectively eradicates aerobic Gram-negative bacteria (AGNB) by the enteral administration of oral nonabsorbable antimicrobial agents, i.e., colistin and tobramycin. We retrospectively investigated the impact of SDD, applied for 5 years as part of an infection control program for the control of an outbreak with extended-spectrum beta-lactamase (ESBL)-producing Klebsiella pneumoniae in an intensive care unit (ICU), on resistance among AGNB. Colistin MICs were determined on stored ESBL-producing K. pneumoniae isolates using the Etest. The occurrence of both tobramycin resistance among pathogens intrinsically resistant to colistin (CIR) and bacteremia caused by ESBL-producing K. pneumoniae and CIR were investigated. Of the 134 retested ESBL-producing K. pneumoniae isolates, 28 were isolated before SDD was started, and all had MICs of <1.5 mg/liter. For the remaining 106, isolated after starting SDD, MICs ranged between 0.5 and 24 mg/liter. Tobramycin-resistant CIR isolates were found sporadically before the introduction of SDD, but their prevalence increased immediately afterward. Segmented regression analysis showed a highly significant relationship between SDD and resistance to tobramycin. Five patients were identified with bacteremia caused by ESBL-producing K. pneumoniae before SDD and 9 patients thereafter. No bacteremia caused by CIR was found before SDD, but 26 episodes occurred after its introduction. In conclusion, colistin resistance among ESBL-producing K. pneumoniae isolates emerged rapidly after the introduction of SDD. In addition, both the occurrence and the proportion of tobramycin resistance among CIR increased during SDD use. SDD should not be applied in outbreak settings when resistant bacteria are prevalent.
Methicillin-resistant Staphylococcus aureus (MRSA) is a major cause of preventable nosocomial infections and is endemic in hospitals worldwide. The effectiveness of infection control policies varies significantly across hospital settings, and the impact of the hospital context on the rate of nosocomial MRSA infections and the success of infection control is understudied. We conducted a modelling study to evaluate several infection control policies in surgical, intensive care, and medical ward specialties, each with distinct ward conditions and policies, of a tertiary public hospital in Sydney, Australia. We reconfirm hand hygiene as the most successful policy and find it to be necessary for the success of other policies. Active screening for MRSA, patient isolation in single-bed rooms, and additional staffing were found to be less effective. Across these ward specialties, MRSA transmission risk varied by 13%, and reductions in the prevalence and nosocomial incidence rate of MRSA due to infection control policies varied by up to 45%. Different levels of infection control were required to reduce and control nosocomial MRSA infections for each ward specialty. Infection control policies and policy targets should be specific to the ward and the context of the hospital. The model we developed is generic and can be calibrated to represent different ward settings and pathogens transmitted between patients indirectly through health care workers. This can aid the timely and cost-effective design of synergistic and context-specific infection control policies.
This retrospective study evaluated trends in, and the association between, resistance of Pseudomonas aeruginosa isolated from patients with hospital-acquired infections (HAIs) and hospital antimicrobial usage from 2003 through 2011 in a tertiary care hospital in northeast China. HAI was defined as the occurrence of infection after hospital admission, without evidence that the infection was present or incubating (≤48 h) on admission. In vitro susceptibilities were determined by disk diffusion, and susceptibility profiles were interpreted using zone diameter criteria as recommended by the Clinical and Laboratory Standards Institute (CLSI). Data on usage of various antimicrobial agents, expressed as defined daily doses (DDD) per 1,000 patient-days according to the WHO Anatomical Therapeutic Chemical (ATC)/DDD index 2011, were collected from the hospital pharmacy computer database. Of the 747 P. aeruginosa strains collected over the years, most came from respiratory samples (201 isolates, 26.9%), blood (179, 24.0%), and secretions and pus (145, 19.4%). Time series analysis demonstrated a significant increase over time in resistance rates of P. aeruginosa to ticarcillin/clavulanic acid, piperacillin/tazobactam, cefoperazone/sulbactam, piperacillin, imipenem, meropenem, ceftazidime, cefepime, ciprofloxacin, and levofloxacin, but not to aminoglycosides (P<0.001). The rates of carbapenem-resistant P. aeruginosa (CRPA) isolated from patients with HAIs were 14.3%, 17.1%, 21.1%, 24.6%, 37.0%, 48.8%, 56.4%, 51.2%, and 54.1% over successive years. A significant increase in usage of anti-pseudomonal carbapenems (P<0.001) was seen. ARIMA models demonstrated that anti-pseudomonal carbapenem usage was strongly correlated with the prevalence of imipenem- and meropenem-resistant P. aeruginosa (P<0.001). The quarterly increase in CRPA was strongly correlated, at a lag of one quarter, with quarterly use of anti-pseudomonal carbapenems (P<0.001).
Our data demonstrate a positive correlation between anti-pseudomonal antimicrobial usage and P. aeruginosa resistance to several, but not all, classes of antimicrobial agents in the hospital.
Recent years have seen an increase in the frequency of extreme rainfall and subsequent flooding across the world. Climate change models predict that such flooding will become more common, triggering sewer overflows, potentially with increased risks to human health. In August 2010, a triathlon competition was held in Copenhagen, Denmark, shortly after an extreme rainfall. The authors took advantage of this event to investigate disease risks in two comparable cohorts of physically fit, long-distance swimmers competing in the sea next to a large urban area. An established model of bacterial concentration in the water was used to examine the level of pollution in a spatio-temporal manner. Symptoms and exposures among athletes were examined with a questionnaire using a retrospective cohort design, and the questionnaire investigation was repeated after a triathlon competition held in non-polluted seawater in 2011. Diagnostic information was collected from microbiological laboratories. The results showed that the 3.8-kilometer open water swimming competition coincided with the peak of post-flooding bacterial contamination in 2010, with average concentrations of 1.5×10⁴ E. coli per 100 ml of water. The attack rate of disease among 838 swimmers in 2010 was 42%, compared to 8% among 931 swimmers in the 2011 competition (relative risk (RR) 5.0; 95% CI: 4.0–6.39). In 2010, illness was associated with having unintentionally swallowed contaminated water (RR 2.5; 95% CI: 1.8–3.4), and the risk increased with the number of mouthfuls of water swallowed. Confirmed aetiologies of infection included Campylobacter, Giardia lamblia and diarrhoeagenic E. coli. The study demonstrated a considerable risk of illness from water intake when swimming in contaminated seawater in 2010, and a small but measurable risk from non-polluted water in 2011. This suggests a significant risk of disease in people ingesting small amounts of flood water following extreme rainfall in urban areas.
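A cohort relative risk of this kind, with its log-based (Katz) confidence interval, can be sketched as follows. The case counts below are reconstructed from the rounded attack rates (42% of 838 and 8% of 931), so the result only approximates the published RR of 5.0 (95% CI 4.0–6.39):

```python
from math import sqrt, log, exp

def relative_risk(a, n1, b, n2, z=1.96):
    """Relative risk for two cohorts with a Katz log-based 95% CI.
    a/n1: cases/total in the exposed cohort; b/n2: cases/total in the comparison cohort."""
    rr = (a / n1) / (b / n2)
    se = sqrt(1/a - 1/n1 + 1/b - 1/n2)  # SE of ln(RR)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Hypothetical counts back-calculated from the rounded attack rates
rr, lo, hi = relative_risk(a=352, n1=838, b=74, n2=931)
```

The small discrepancy with the published figures reflects the rounding of the attack rates and possibly a different CI method in the original analysis.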
In 2011, Dutch animal production sectors started recording veterinary antimicrobial consumption. These data are used by the Netherlands Veterinary Medicines Authority to create transparency in, and define benchmark indicators for, veterinary consumption of antimicrobials. This paper presents the results of sector-wide consumption of antimicrobials, in the form of prescriptions or deliveries, for all pig, veal calf, and broiler farms. Data were used to calculate animal defined daily dosages per year (ADDD/Y) per pig or veal calf farm. For broiler farms, the number of animal treatment days per year (ATD/Y) was calculated. Furthermore, data were used to calculate the consumption of specific antimicrobial classes per administration route per pig or veal calf farm. The distribution of antimicrobial consumption per farm varied greatly within and between farm categories. All categories, except rosé starter farms, showed a highly right-skewed distribution with a long tail. Median ADDD/Y values varied from 1.2 for rosé finisher farms to 83.2 for rosé starter farms, with 28.6 for white veal calf farms. Median consumption was 9.3 ADDD/Y for production pig farms and 3.0 ADDD/Y for slaughter pig farms. Median consumption in broiler farms was 20.9 ATD/Y. Regarding specific antimicrobial classes, fluoroquinolones were mainly used on veal calf farms, but in low quantities: the P75 range was 0–0.99 ADDD/Y, and 0–0.04 ADDD/Y in pig farms. The P75 range for 3rd/4th-generation cephalosporins was 0–0.07 ADDD/Y for veal calf farms and 0–0.1 ADDD/Y for pig farms. The insights obtained from these results, and the full transparency obtained by monitoring antimicrobial consumption per farm, will help reduce antimicrobial consumption and endorse antimicrobial stewardship. The wide and skewed distribution of consumption has important practical and methodological implications for benchmarking, surveillance and future trend analysis.
Staphylococcus aureus sequence type 398 (ST398) was originally associated with animal infections. We announce the complete genome sequences of two ST398 methicillin-susceptible S. aureus strains from the livestock environment. These genome sequences will assist in the characterization of ST398 features related to host tropism and epidemiological settings.
Livestock-associated methicillin-resistant Staphylococcus aureus (LA-MRSA) emergence is a major public health concern. This study aimed to assess risk factors for persistent MRSA carriage in veal calf farmers and their family members. We also evaluated the dynamics of the MRSA environmental load during the veal calf production cycle.
Design
Observational, longitudinal, repeated cross-sectional study.
Setting
52 veal calf farms in the Netherlands.
Participants
From the end of 2010 to the end of 2011, a total of 211 farmers, family members and employees were included in the study.
Primary outcome and secondary outcome measures
Nasal swabs were taken from participants on days 0, 4 and 7 and at week 12. A persistent MRSA carrier was defined as a person positive for MRSA on days 0, 4 and 7. Participants filled in an extensive questionnaire to identify potential risk factors and confounders. To estimate MRSA prevalence in calves and environmental contamination, animal nasal swabs and Electrostatic Dust Collector samples were taken on day 0 and at week 12.
The presence of potential animal reservoirs (free-ranging farm cats and sheep) and the level of contact with veal calves were positively associated with persistent MRSA carriage. Interestingly, at the end of the study (week 12), animal prevalence had doubled and a significantly higher MRSA environmental load in the stables was found on farms with MRSA carriers.
This study supports the hypothesis that environmental contamination with MRSA plays a role in the acquisition of MRSA in farmers and their household members and suggests that other animal species should also be targeted to implement effective control strategies.
MRSA remains a leading cause of hospital-acquired pneumonia (HAP) and healthcare-associated pneumonia (HCAP). We describe the epidemiology and outcomes of MRSA pneumonia in Canadian hospitals and identify factors contributing to mortality.
Prospective surveillance for MRSA pneumonia in adults was done for one year (2011) in 11 Canadian hospitals. Standard criteria for MRSA HAP, HCAP, ventilator-associated pneumonia (VAP), and community-acquired pneumonia (CAP) were used to identify cases. MRSA isolates underwent antimicrobial susceptibility testing, and were characterized by pulsed-field gel electrophoresis (PFGE) and Panton-Valentine leukocidin (PVL) gene detection. The primary outcome was all-cause mortality at 30 days. A multivariable analysis was done to examine the association between various host and microbial factors and mortality.
A total of 161 patients with MRSA pneumonia were identified: 90 (56%) with HAP, 26 (16%) with HCAP, and 45 (28%) with CAP; 23 (14%) patients had VAP. The mean (±SD) incidence of MRSA HAP was 0.32 (±0.26) per 10,000 patient-days, and of MRSA VAP 0.30 (±0.5) per 1,000 ventilator-days. The 30-day all-cause mortality was 28.0%. In multivariable analysis, the variables associated with mortality were the presence of multiorgan failure (OR 8.1; 95% CI 2.5–26.0) and infection with an isolate with reduced susceptibility to vancomycin (OR 2.5, 95% CI 1.0–6.3).
MRSA pneumonia is associated with significant mortality. Severity of disease at presentation and infection caused by an isolate with an elevated vancomycin MIC are associated with increased mortality. Additional studies are required to better understand the impact of host and microbial variables on outcome.
The English Department of Health introduced universal MRSA screening of admissions to English hospitals in 2010. It commissioned a national audit to review implementation, impact on patient management, admission prevalence and extra yield of MRSA identified compared to “high-risk” specialty or “checklist-activated” screening (CLAS) of patients with MRSA risk factors.
National audit, May 2011. Questionnaires were sent to infection control teams in all English NHS acute trusts, requesting the number of patients admitted and screened, new or previously known MRSA; MRSA point prevalence; screening and isolation policies; and individual risk factors and patient management for all new MRSA patients and a random sample of negatives.
144/167 (86.2%) trusts responded, providing individual patient data for 760 new MRSA patients and 951 negatives. 61% of emergency admissions (median 67.3%), 81% of electives (median 59.4%) and 47% of day-cases (median 41.4%) were screened. MRSA admission prevalence: 1% (median 0.9%) for emergencies, 0.6% (median 0.4%) for electives, 0.4% (median 0%) for day-cases. Approximately 50% of all MRSA identified was new. Inpatient MRSA point prevalence: 3.3% (median 2.9%). 104 (77%) trusts pre-emptively isolated patients with previous MRSA, 63 (35%) pre-emptively isolated admissions to “high-risk” specialties; 7 (5%) used PCR routinely. Mean time to a positive MRSA result: 2.87 days (±1.33); 37% (219/596) of newly identified MRSA patients were discharged before the result was available; 55% of the remainder (205/376) were isolated post-result. In an average trust, CLAS would reduce screening by 50% while identifying 81% of all MRSA; “high-risk” specialty screening would reduce screening by 89%, identifying only 9% of MRSA.
Implementation of universal screening was poor and admission prevalence (new cases) was low. CLAS reduced screening effort for minor decreases in identification, but implementation may prove difficult. The cost-effectiveness of this and other policies awaits evaluation by transmission-dynamic economic modelling using data from this audit. Until then, trusts should seek to improve implementation of the current policy and use of isolation facilities.
The number of extended-spectrum beta-lactamase (ESBL)-positive (+) Escherichia coli is increasing worldwide. In contrast to many other multidrug-resistant bacteria, they are suspected to spread predominantly within the community. The objective of this study was to assess factors associated with community-acquired colonization with ESBL (+) E. coli.
We performed a matched case-control study at the Charité University Hospital Berlin between May 2011 and January 2012. Cases were defined as patients colonized with community-acquired ESBL (+) E. coli identified <72 h after hospital admission. Controls were patients who carried no ESBL-positive bacteria but an ESBL-negative E. coli identified <72 h after hospital admission. Two controls per case were chosen from potential controls according to admission date. Case and control patients completed a questionnaire assessing nutritional habits, travel habits, household situation and the language most commonly spoken at home (mother tongue). An additional rectal swab was obtained together with the questionnaire to verify colonization status. Genotypes of ESBL (+) E. coli strains were determined by PCR and sequencing. Risk factors associated with ESBL (+) E. coli colonization were analyzed by multivariable conditional logistic regression.
We analyzed 85 cases and 170 controls. In the multivariable analysis, speaking an Asian language most commonly at home (OR = 13.4, 95% CI 3.3–53.8; p<0.001) and frequently eating pork (≥3 meals per week; OR = 3.5, 95% CI 1.8–6.6; p<0.001) were independently associated with ESBL colonization. The most common ESBL genotypes were CTX-M-1 with 44% (n = 37), CTX-M-15 with 28% (n = 24) and CTX-M-14 with 13% (n = 11).
An Asian mother tongue and frequent consumption of certain types of meat such as pork were independently associated with colonization by ESBL-positive bacteria. We found neither frequent consumption of poultry nor previous use of antibiotics to be associated with ESBL colonization.
swine; livestock; cattle; geographic information systems; methicillin-resistant Staphylococcus aureus; MRSA; cluster analysis; bacteria; zoonoses; the Netherlands
Methicillin-resistant Staphylococcus aureus (MRSA) is a problem in both hospitals and communities worldwide. In 2003, a new MRSA clade with a reservoir in pigs and veal calves emerged: livestock-associated MRSA (LA-MRSA). We estimated the incidence of bacteraemia due to LA-MRSA using national surveillance data from 2009 in the Netherlands. We found a low incidence of LA-MRSA and MRSA bacteraemia episodes compared to bacteraemia caused by S. aureus overall (0.04, 0.18 and 19.3 episodes per 100,000 inhabitants per year, respectively). LA-MRSA and MRSA were also uncommon compared with numbers from other countries. MRSA in general, and LA-MRSA in particular, does not currently appear to be a public health problem in the Netherlands. The low incidence of LA-MRSA bacteraemia episodes may best be explained by differences in the populations affected by LA-MRSA versus other MRSA, although reduced virulence of the strain involved and the effectiveness of the search and destroy policy may also play a role.
Surgical site infections (SSIs) are associated with severe morbidity, mortality and increased health care costs in vascular surgery.
To implement a bundle of care in vascular surgery and to measure its effect on overall and deep SSI rates.
Prospective, quasi-experimental, cohort study.
Prospective surveillance for SSIs after vascular surgery was performed in the Amphia Hospital in Breda from 2009 through 2011. A bundle developed by the Dutch hospital patient safety program (DHPSP) was introduced in 2009. The elements of the bundle were (1) perioperative normothermia, (2) hair removal before surgery, (3) the use of perioperative antibiotic prophylaxis and (4) discipline in the operating room. Bundle compliance was measured every 3 months in a random sample of surgical procedures and used for feedback.
Bundle compliance improved significantly, from an average of 10% in 2009 to 60% in 2011. In total, 720 vascular procedures were performed during the study period and 75 SSIs (10.4%) were observed. Deep SSI occurred in 25 (3.5%) patients. Patients with SSIs (28.5±29.3 vs 10.8±11.3 days, p<0.001) and deep SSIs (48.3±39.4 vs 11.4±11.8 days, p<0.001) had a significantly longer postoperative hospital stay than patients without an infection. Significantly higher mortality was observed in patients who developed a deep SSI (adjusted OR 2.96; 95% confidence interval 1.32–6.63). Multivariate analysis showed a significant, independent decrease in the SSI rate over time that paralleled the introduction of the bundle. The SSI rate in 2011 was 51% lower than in 2009.
The implementation of the bundle was associated with improved compliance over time and a 51% reduction in the SSI rate for vascular procedures. The bundle requires no expensive or potentially harmful interventions and is therefore an important tool for improving patient safety and reducing SSIs in patients undergoing vascular surgery.
Community-acquired urinary tract infection (CA-UTI) is the most common infection caused by extended-spectrum β-lactamase (ESBL)-producing Enterobacteriaceae, but the clinical epidemiology of these infections in low-prevalence countries is largely unknown. A population-based case-control study was conducted to assess risk factors for CA-UTI caused by ESBL-producing E. coli or K. pneumoniae. The study was carried out in a source population in Eastern Norway, a country with a low prevalence of infections caused by ESBL-producing Enterobacteriaceae. The study population comprised 100 cases with CA-UTI caused by ESBL-producing E. coli or K. pneumoniae and 190 controls with CA-UTI caused by non-ESBL-producing isolates. The following independent risk factors for ESBL-positive UTI were identified: travel to Asia, the Middle East or Africa either during the past 6 weeks (odds ratio (OR) = 21; 95% confidence interval (CI): 4.5–97) or during the past 6 weeks to 24 months (OR = 2.3; 95% CI: 1.1–4.4), recent use of fluoroquinolones (OR = 16; 95% CI: 3.2–80) or β-lactams (except mecillinam) (OR = 5.0; 95% CI: 2.1–12), diabetes mellitus (OR = 3.2; 95% CI: 1.0–11) and recreational freshwater swimming in the past year (OR = 2.1; 95% CI: 1.0–4.0). Factors associated with decreased risk were an increasing number of fish meals per week (OR = 0.68 per fish meal; 95% CI: 0.51–0.90) and age (OR = 0.89 per 5-year increase; 95% CI: 0.82–0.97). In conclusion, we have identified risk factors that elucidate mechanisms and routes of dissemination of ESBL-producing Enterobacteriaceae in a low-prevalence country and that can be used to guide appropriate treatment of CA-UTI and targeted infection control measures.
Although surgical-site infection (SSI) rates are advocated as a major evaluation criterion, the reproducibility of SSI diagnosis is unknown. We assessed agreement in diagnosing SSI among specialists involved in SSI surveillance in Europe.
Twelve case-vignettes based on suspected SSI were submitted to 100 infection-control physicians (ICPs) and 86 surgeons in 10 European countries. Each participant scored eight randomly assigned case-vignettes on a secure online relational database. The intra-class correlation coefficient (ICC) was used to assess agreement for SSI diagnosis on a 7-point Likert scale and the kappa coefficient to assess agreement for SSI depth on a 3-point scale.
Intra-specialty agreement for SSI diagnosis ranged across countries and specialties from 0.00 (95% CI, 0.00–0.35) to 0.65 (0.45–0.82). Inter-specialty agreement varied from 0.04 (0.00–0.62) to 0.55 (0.37–0.74) in Germany. For all countries pooled, intra-specialty agreement was poor for surgeons (0.24, 0.14–0.42) and good for ICPs (0.41, 0.28–0.61). Reading the SSI definitions improved agreement among ICPs (0.57) but not among surgeons (0.09). Intra-specialty agreement for SSI depth ranged across countries and specialties from 0.05 (0.00–0.10) to 0.50 (0.45–0.55) and was not improved by reading the SSI definitions.
Among ICPs and surgeons evaluating case-vignettes of suspected SSI, considerable disagreement occurred regarding the diagnosis, with variations across specialties and countries.
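The kappa coefficient used above for the 3-point SSI-depth scale compares observed rater agreement with the agreement expected by chance from each rater's marginal frequencies. A minimal sketch of the calculation (the ratings are invented for illustration and are not study data):

```python
# Illustrative computation of Cohen's kappa for two raters scoring the
# same items on a categorical scale. Ratings below are hypothetical.

def cohen_kappa(rater1, rater2):
    """Cohen's kappa: (p_obs - p_exp) / (1 - p_exp)."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed agreement: fraction of items both raters scored identically.
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement, from each rater's marginal category frequencies.
    p_exp = sum((rater1.count(c) / n) * (rater2.count(c) / n) for c in categories)
    if p_exp == 1.0:  # degenerate case: both raters used a single category
        return 1.0
    return (p_obs - p_exp) / (1 - p_exp)

# Two hypothetical raters classifying six suspected SSIs by depth.
depth_a = ["superficial", "superficial", "deep", "deep", "organ-space", "deep"]
depth_b = ["superficial", "deep", "deep", "deep", "organ-space", "superficial"]
print(round(cohen_kappa(depth_a, depth_b), 3))  # → 0.455
```

A kappa of 1 indicates perfect agreement and 0 indicates chance-level agreement, which is why values such as 0.05 in the results above signal near-random SSI-depth classification.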
Infectious diseases are among the major causes of death worldwide. We evaluated trends in mortality due to septicemia in Greece and compared them with mortality due to other infections.
Data on mortality stratified by cause of death during 2003–2010 was obtained from the Hellenic Statistical Authority. Deaths caused by infectious diseases were grouped by site of infection and analyzed using SPSS 17.0 software.
During the 8-year period, 45,451 deaths due to infections were recorded in Greece, of which 12.2% were due to septicemia, 69.7% to pneumonia, 1.5% to pulmonary tuberculosis, 0.2% to influenza, 0.5% to other respiratory tract infections, 7.9% to intra-abdominal infections (IAIs), 2.5% to urinary tract infections (UTIs), 2.2% to endocarditis, pericarditis or myocarditis, 1.6% to hepatitis, 1.0% to central nervous system infections, and 0.7% to other infections. In 99.4% of deaths due to septicemia, the causative bacteria were not reported on the death certificate (indeterminate septicemia). More deaths due to indeterminate septicemia were observed during 2007–2010 than during 2003–2006 (3,558 versus 1,966; p<0.05).
Despite the limitations related to the quality of death certificates, this study shows that the mortality rate due to septicemia almost doubled after 2007 in Greece. Proportionally, septicemia accounted for a greater share of the increase in mortality among infectious causes of death over the same period. The emergence of resistance could partially explain this alarming phenomenon. Therefore, stricter infection control measures should be urgently applied in all Greek healthcare facilities.