J Gen Intern Med. 2005 April; 20(4): 340–343.
PMCID: PMC1490095

Teaching Evidence-based Medicine Skills Can Change Practice in a Community Hospital

Sharon E Straus, MD, MSc, FRCPC,1 Chris Ball, MD,2 Nick Balcombe, MD,3 Jonathon Sheldon, MD,3 and Finlay A McAlister, MD, MSc, FRCPC4



Objective

Several studies have evaluated whether evidence-based medicine (EBM) training courses can improve skills such as literature searching and critical appraisal, but to date few data exist on whether teaching EBM skills and providing evidence-based resources change behavior or clinical outcomes. This study was conducted to evaluate whether a multifaceted EBM intervention, consisting of teaching EBM skills and providing electronic evidence resources, changed clinical practice.


Design

Before/after study.


Setting

The medical inpatient units at a district general hospital.


Participants

Thirty-five attending physicians and 12 medicine residents.


Intervention

A multicomponent EBM intervention was provided, including an EBM training course of seven 1-hour sessions, an EBM syllabus and textbook, and evidence-based resources on the hospital network.


Measurements and Main Results

The primary outcome of the study was the quality of evidence in support of therapies initiated for the primary diagnoses in 483 consecutive patients admitted during the month before and the month after the intervention. Patients admitted after implementation of the EBM intervention were significantly more likely to receive therapies proven to be beneficial in randomized controlled trials (62% vs 49%; P = .016). Of these trial-proven therapies, those offered after the EBM intervention were significantly more likely to be based on high-quality randomized controlled trials (95% vs 87%; P = .023).


Conclusions

A multifaceted intervention designed to teach and support EBM significantly improved evidence-based practice patterns in a district general hospital.

Keywords: evidence-based medicine, medical education, practice of medicine

As a result of studies demonstrating substantial gaps between research evidence and the care provided in usual clinical practice,1 there is an increasing emphasis on the teaching of evidence-based medicine (EBM) skills in undergraduate, postgraduate, and continuing medical education programs. These initiatives are based on the untested assumption that teaching health care providers the skills necessary to practice EBM (formulating questions, literature searching, critical appraisal, and application of research evidence) will change clinical performance. Although a number of studies have examined whether training courses can improve specific EBM skills such as literature searching or critical appraisal,2 limited data exist on the impact of teaching EBM skills and providing evidence resources on clinical decision making or clinical outcomes. This study was conducted to address this question by evaluating whether the care offered to medical patients at a district general hospital in the United Kingdom was more evidence based after implementation of a multifaceted EBM training program.


Methods

We performed a before and after study of the quality of evidence in support of therapies initiated for the primary diagnosis of patients admitted to a medical inpatient unit of a district general hospital. It was conducted at Queen's Hospital in Burton-upon-Trent, Staffordshire, United Kingdom (a 465-bed district general hospital without a university affiliation but with a fully integrated information support system, as one of two nationally funded pilot sites for the development of the electronic patient record). There were 35 attending physicians and 3 teams (each consisting of 2 junior and 2 senior residents) in the department of medicine, none of whom had received prior training in clinical epidemiology or EBM.


The EBM intervention was multifaceted. First, we reviewed all discharge summaries for a 2-week period (July 1998) to identify the most common admitting diagnoses. Therapies were identified for each common medical diagnosis, and literature searches were conducted to retrieve evidence supporting these therapies. For each topic, 1-page summaries of the evidence (critically appraised topics; CATs3) were prepared and entered into a database. Second, we provided all participants with the syllabus Practising Evidence-based Medicine and relevant excerpts from the book Evidence-based Medicine: How to Practise and Teach EBM.4,5 Third, we conducted an EBM training course over seven 1-hour sessions in October and November 1998 (these sessions occurred during regularly scheduled teaching rounds and involved small-group teaching similar to that provided during the annual Oxford Workshops on How to Teach Evidence-based Medicine). Each session began with a clinical scenario and the generation of a clinical question by the learners. During the session, we identified an article relevant to the question and used it to develop and hone critical appraisal skills for a variety of study designs. Clinical topics for discussion included the diagnosis of iron deficiency anemia, prognosis following stroke, therapy for dementia and atrial fibrillation, and the association between calcium antagonists and cancer. We also taught efficient strategies for finding evidence using EBM resources, including the Cochrane Library, Best Evidence (a compendium of ACP Journal Club and Evidence Based Medicine), and MEDLINE, as well as how to develop and access the CATs for 65 common topics that had been prepared previously.
Fourth, these EBM resources were installed on the hospital electronic network, which attending physicians and house officers could access through two PCs located on the medical ward (prior to this intervention, only the Oxford Textbook of Medicine and a few locally developed guidelines were available on the network).

Outcome Assessment

To evaluate our EBM intervention, we obtained all discharge summaries for patients admitted for more than 24 hours to the medicine wards at Queen's Hospital in September 1998 and January 1999 (immediately before and after the program, and 2 months during which the same medical teams were attending on the inpatient units), and 2 of the investigators, blinded to admission date, independently assigned a primary diagnosis and primary intervention to each discharge. Any disagreements were resolved by consensus and, in a few instances, by independent assessment of a third investigator. Using methods employed in an earlier study,6 we defined the primary diagnosis as “the disease, syndrome, or condition entirely, or if there were several diagnoses, most responsible for the patient's admission to hospital” and the primary intervention as “the treatment or other manoeuvre that represented the most important attempt to cure, alleviate or care for the patient in respect of his primary diagnosis.”

After designation of the primary diagnosis and intervention for each patient, Best Evidence, the Cochrane Library, and MEDLINE were searched to find evidence for each intervention. Two clinical epidemiologists independently classified the strength of evidence for each intervention using a previously employed scheme6:

Class 1: Those interventions that have been proven to be beneficial in systematic reviews of randomized controlled trials (RCTs) or from individual RCTs.

Class 2: Those interventions with convincing nonexperimental evidence; that is, interventions whose face validity is so great that both investigators judged randomized trials to be unnecessary and, had a placebo been involved, unethical (e.g., antibiotics for pneumonia, pacemaker for complete heart block).

Class 3: Those interventions without substantial evidence, including interventions in common use that meet neither of the above criteria, and interventions shown to be harmful or useless in systematic reviews of RCTs or in individual RCTs.

The primary interventions were defined as “evidence based” if they were class 1 or 2. Finally, the clinical epidemiologists independently categorized the quality of the evidence underlying each class 1 intervention. They used previously described criteria to decide whether an RCT or systematic review of RCTs was of high quality (blinded assessment of outcomes, intention-to-treat analysis, follow-up of at least 80% or losses to follow-up too few to materially affect the results, and sufficient sample size to detect a clinically important difference with power >80%).4 Any disagreements were resolved by consensus.
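The classification scheme and quality criteria above amount to a small decision rule. They can be sketched in Python as follows (the function and parameter names are ours, for illustration only; this is not the authors' instrument):

```python
def classify_intervention(rct_proven: bool,
                          convincing_nonexperimental: bool,
                          harmful_or_useless: bool) -> int:
    """Assign the study's evidence class (1, 2, or 3) to an intervention."""
    if harmful_or_useless:
        return 3  # shown harmful or useless in RCTs or systematic reviews of RCTs
    if rct_proven:
        return 1  # proven beneficial in RCTs or systematic reviews of RCTs
    if convincing_nonexperimental:
        return 2  # face validity so strong that an RCT would be unnecessary/unethical
    return 3      # in common use but without substantial evidence

def is_evidence_based(evidence_class: int) -> bool:
    """The study counted class 1 and class 2 interventions as 'evidence based'."""
    return evidence_class in (1, 2)

def is_high_quality_rct(blinded_outcome_assessment: bool,
                        intention_to_treat: bool,
                        adequate_followup: bool,
                        power_over_80_percent: bool) -> bool:
    """All four quality criteria must hold for a class 1 trial to count as high quality."""
    return all((blinded_outcome_assessment, intention_to_treat,
                adequate_followup, power_over_80_percent))
```

For example, antibiotics for pneumonia would fall into class 2 and therefore count as evidence based, even though no supporting RCT exists.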

The data were entered into, and analyzed with, SPSS for Windows, version 11.0 (SPSS Inc., Chicago, IL).

Approval was received from Queen's District Hospital to review anonymous discharge summaries for this project.


Results

During September 1998, 262 patients were admitted to the Queen's District Hospital medical inpatient units; 275 patients were admitted to these same units in January 1999. We excluded 4 patients admitted for diagnostic work-up (3 for bronchoscopy, 1 for a lactose tolerance test), 3 patients who discharged themselves against medical advice before therapy was instituted, 41 patients admitted for observation (virtually all for chest pain or syncope) who did not receive specific therapy, and 6 patients who received supportive end-of-life care only. The primary diagnoses in the remaining 483 patients were generally similar in both time periods, although there were significantly more admissions for obstructive airways disease exacerbations in January 1999 than in September 1998 (P = .001; Table 1). The age (mean 63 [SD 9.4] years vs mean 62 [SD 8.2] years) and gender (44% vs 46% women) distributions were similar in both time periods. Each attending physician cared for a mean of 36.3 (SD 12.7) patients and each resident for a mean of 32.2 (SD 17.5) patients per month.
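The cohort accounting reported above can be checked directly from the admission and exclusion counts:

```python
# Reconstruct the analytic cohort from the reported admission and exclusion counts.
admitted = 262 + 275           # September 1998 + January 1999 admissions
excluded = 4 + 3 + 41 + 6      # work-up only, self-discharge, observation only, end-of-life care
analysed = admitted - excluded
print(analysed)                # 483, the denominator reported in the text
```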

Table 1
Ten Most Common Primary Diagnoses in Patients Admitted Pre/Post EBM Intervention

Patients admitted after implementation of the EBM intervention were significantly more likely to receive evidence-based therapy than those treated before the intervention (82% vs 74%; P = .046). In particular, patients admitted after the intervention were significantly more likely to receive therapies proven to be beneficial in RCTs (62% vs 49%; P = .016; Table 2). Furthermore, even among the subset of patients receiving therapies shown to be beneficial in RCTs, the therapies offered after the EBM intervention were significantly more likely to be based on high-quality evidence (95% vs 87% based on high-quality RCTs; P = .023; Table 2). Sensitivity analyses demonstrated that patients admitted with coronary disease (71 preintervention, 61 postintervention) were significantly more likely to receive evidence-based therapy after the intervention (98% vs 87%; P = .02), while the quality of evidence for the selected therapy was not significantly different for patients with obstructive airways disease pre/postintervention (91% vs 86% evidence based; P = .47).

Table 2
Evidentiary Basis for Prescribed Therapies Pre/Post EBM Intervention
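The pre/post comparisons reported above are comparisons of two proportions. A pooled two-proportion z-test of the kind commonly applied to such tables can be sketched in plain Python (the group sizes below are hypothetical, since the paper reports 483 patients total but not the exact pre/post split):

```python
from math import erf, sqrt

def two_proportion_p(x1: int, n1: int, x2: int, n2: int) -> float:
    """Two-sided pooled two-proportion z-test; returns the P value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                     # pooled success proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = abs(p1 - p2) / se
    # Two-sided tail of the standard normal, via the error function
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))

# Illustrative counts approximating the reported 62% vs 49% comparison.
p_value = two_proportion_p(x1=149, n1=240, x2=119, n2=243)
```

With any plausible split of the 483 patients, a 13-percentage-point difference of this kind yields P well below .05, consistent with the significance reported in the text.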

Agreement between the investigators was good regarding the primary diagnosis (κ = 0.92) and management (κ = 0.76).
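Cohen's κ, the agreement statistic used above, corrects the raters' observed agreement for the agreement expected by chance. A minimal sketch (the ratings below are hypothetical toy data, not the study's):

```python
def cohens_kappa(ratings_a: list, ratings_b: list) -> float:
    """Chance-corrected agreement (Cohen's kappa) between two raters."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    categories = set(ratings_a) | set(ratings_b)
    expected = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                   for c in categories)               # agreement expected by chance
    return (observed - expected) / (1 - expected)

# Hypothetical example: two raters assigning a primary diagnosis to 10 discharges.
rater1 = ["MI", "MI", "COPD", "COPD", "CVA", "MI", "COPD", "CVA", "MI", "COPD"]
rater2 = ["MI", "MI", "COPD", "CVA",  "CVA", "MI", "COPD", "CVA", "MI", "COPD"]
kappa = cohens_kappa(rater1, rater2)
```

Values above roughly 0.75, like those reported here, are conventionally interpreted as good to excellent agreement.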


Discussion

We have shown that a multifaceted intervention designed to teach EBM skills and implement evidence resources significantly improved practice patterns in a district general hospital. After the EBM intervention, more patients were prescribed therapies proven to be efficacious in randomized trials, and the trials supporting these therapies were significantly more likely to be high quality than before the EBM intervention. The observed absolute improvement of 13% exceeds the 10% absolute improvement that has long been accepted as the minimal clinically important difference for studies of educational interventions.7

The degree to which practice on the medical inpatient units at this district general hospital was evidence based after the EBM intervention is very similar to that reported for medical inpatient units at university-affiliated tertiary care hospitals with attending physicians holding postgraduate degrees in clinical epidemiology. For example, using the same definition as ours, 82% of primary interventions were deemed evidence based at the John Radcliffe Hospital in Oxford, United Kingdom, and 84% at the Ottawa General Hospital in Ottawa, Canada.6,8 These figures include 53% and 57%, respectively, of primary interventions deemed class 1 (i.e., supported by RCTs). Thus, we have shown that the attainment of evidence-based practice is indeed possible in busy clinical settings after implementation of appropriate resources and teaching of EBM skills.

The results of two recent randomized trials suggest there may be benefit to EBM training, although differences in the formulation of the interventions make them difficult to compare with the current study.9,10 In the first study, information management was compared with training in EBM for secondary prevention of cardiac disease in primary care.9 The combination of these interventions showed some improvement in cholesterol management. However, it appears that the study intervention focused on searching and retrieving evidence from the Internet and MEDLINE for this particular condition and did not include education on formulating questions, applying the evidence, or assessing one's performance. Moreover, the intensity of the intervention is unclear. The second study evaluated the impact of an EBM educational intervention among public health workers but did not report impact on actual behaviors.10

However, our study is a before/after case series and does not carry the same weight as a randomized trial. Ideally, a methodologically rigorous study of EBM would expose “control” clinicians to an evidence-poor teaching intervention and allow them to become out of date and unaware of potentially life-saving evidence accessible to, and known by, the evidence-based clinicians in the experimental group. This approach is not ethical, however, and alternative designs must be explored. We were unable to identify an appropriate control site with the same patient, house staff, and attending physician mix and the same informatics infrastructure (our hospital was one of two nationally funded pilot sites with a fully integrated information support system). Nor could we conduct an interrupted time series, given the time constraints imposed by house staff rotation.

Our choice of study design limits the inferences that should be drawn from this study. Thus, while our study suggests that EBM training can meaningfully affect clinical decision making, randomized trials of EBM teaching are clearly needed, and one of us (SES) has embarked on just such a study: a randomized trial of family physicians designed to determine whether an online EBM educational intervention can change behavior and clinical outcomes. Our study may also be criticized for reporting process measures (therapy prescribed) rather than clinical outcomes such as mortality. However, we chose process measures because they are more sensitive indicators of quality of care than clinical outcomes, which may take months or years to manifest.11 Finally, we have no data on how frequently the various evidence resources were accessed by clinicians at Queen's Hospital. However, other investigators have shown that clinicians trained in EBM will use evidence resources that are provided in a convenient and readily accessible format.12,13

In summary, we have demonstrated that the practice of clinicians in a district general hospital changed in a statistically significant and clinically meaningful way after completion of an EBM training course and provision of evidence resources. The implications of our study are further amplified by evidence that clinicians trained in EBM are more likely to remain up to date for longer after their training than clinicians without EBM training.14 Given that surveys of frontline clinicians confirm widespread enthusiasm for EBM and a desire to learn the key skills such as evidence retrieval and critical appraisal,15–19 we believe that training in the practice of EBM should remain a key component of undergraduate and postgraduate education. Proponents of knowledge translation have advocated that changing behavior requires comprehensive approaches directed toward patients, physicians, managers, and policy makers.20 The results of this study suggest that a multifaceted approach to teaching EBM can change behavior.


Acknowledgments

SES is supported by the Ontario Ministry of Health and the Knowledge Translation Program, University of Toronto; FAM is supported by the Alberta Heritage Foundation for Medical Research and the Canadian Institutes of Health Research.

The authors thank Professor D.L. Sackett for conducting many of the EBM teaching sessions in this study and for providing guidance and resources for this study, and the house staff at Queen's Hospital who participated in the study.


References

1. Majumdar SR, McAlister FA, Furberg CD. From publication to practice in chronic cardiovascular disease—the long and winding road. J Am Coll Cardiol. 2004;43:1738–42.
2. Straus SE, McAlister FA. Evidence-based medicine: a commentary on common criticisms. CMAJ. 2000;163:837–41.
3. Sauve J-S, Lee HN, Meade MO, et al., and the General Internal Medicine Fellowship Programme of McMaster University. The critically-appraised topic (CAT): a resident-initiated tactic for applying users' guides at the bedside. Ann R Coll Physicians Surg Can. 1995;28:396–8.
4. Straus SE, Badenoch D, Richardson WS, Rosenberg W, Sackett DL. Practising Evidence-based Medicine. Oxford: Radcliffe Medical Press; 1998.
5. Sackett DL, Richardson WS, Rosenberg WMC, Haynes RB. Evidence-based Medicine: How to Practice and Teach EBM. London: Churchill Livingstone; 1997.
6. Ellis J, Mulligan I, Rowe J, Sackett DL. Inpatient general medicine is evidence based. Lancet. 1995;346:407–10.
7. Sibley JC, Sackett DL, Neufeld V, Gerrard B, Rudnick KV, Fraser W. A randomized trial of continuing medical education. N Engl J Med. 1982;306:511–5.
8. Michaud G, McGown JL, van der Jagt R, Wells G, Tugwell P. Are therapeutic decisions supported by evidence from health care research? Arch Intern Med. 1998;158:1665–8.
9. Langham J, Tucker H, Sloan D, et al. Secondary prevention of cardiovascular disease: the PIER trial. Br J Gen Pract. 2002;142:818–24.
10. Forsetlund L, Bradley P, Forsen L, et al. Randomised controlled trial of a theoretically grounded intervention to diffuse evidence-based public health service. BMC Med Educ. 2003;3:2.
11. Mant J, Hicks N. Detecting differences in quality of care: the sensitivity of measures of process and outcome in treating acute myocardial infarction. BMJ. 1995;311:793–6.
12. Fisher BW, Hayward RSA, Lau FY. Use and perceptions of computerized health information resources by medical residents. Ann R Coll Physicians Surg Can. 2002;35:467–71.
13. Sackett DL, Straus SE. Finding and applying evidence during clinical rounds: the “evidence cart.” JAMA. 1998;280:1336–8.
14. Shin JH, Haynes RB, Johnston ME. Effect of problem-based, self-directed undergraduate education in lifelong learning. Can Med Assoc J. 1993;148:969–76.
15. McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316:361–5.
16. McAlister FA, Graham I, Karr GW, Laupacis A. Evidence-based medicine and the practising clinician: a survey of Canadian general internists. J Gen Intern Med. 1999;14:236–42.
17. Hagdrup N, Falshaw M, Gray RW, Carter Y. All members of primary care team are aware of importance of evidence based medicine. BMJ. 1998;317:282.
18. Ghali WA, Saitz R, Eskew AH, Lemaire JB, Gupta M, Hershman WY. Evidence-based medicine: behaviors, skills, and attitudes of medical students. Ann R Coll Physicians Surg Can. 1998;31:177–82.
19. Olatunbosun OA, Edouard D, Pierson RA. Physicians' attitudes toward evidence based obstetric practice: a questionnaire survey. BMJ. 1998;316:365–6.
20. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients' care. Lancet. 2003;362:1225–30.
