These results show that our educational intervention significantly enhanced residents' EBM skills. Compared with the control group, residents in the intervention group substantially improved their ability to pose well-structured clinical questions, to search the literature for studies pertinent to their questions, and to understand studies' quantitative results. These educational outcomes were durable over a 6- to 9-month period. We did not demonstrate an effect of the EBM course on residents' abilities to evaluate studies' quality or relevance to individual patients. Taken together, these objective findings were consistent with the residents' subjective self-perceptions: after taking the EBM course, they believed they were better able to interpret study results but not better able to apply those results to patients' care.
We are encouraged by these findings. Many observers have noted the difficulties inherent in applying published clinical evidence to the care of individual patients.18–21 Thus, we were not surprised to find that a brief intervention for relatively inexperienced clinicians failed to improve measures of that outcome. We suspect that acquiring expertise in this “ultimate” EBM skill may require a career-long, conscientious effort. In contrast, as shown in this study, the more fundamental EBM skills can be taught successfully even to junior physicians.
There are several potential limitations to our study. First, residents were not formally randomized. However, the two groups had similar baseline characteristics, and adjustment for baseline differences, including performance on the precourse test, did not alter the results. Second, contamination between groups was possible, but this bias would minimize, not increase, differences between groups. Third, nondifferential measurement bias might have occurred, since our test to measure EBM skills had not been previously assessed. However, the impressive initial improvement by the intervention group and the consistency of that improvement by the “crossed over” control group support the instrument's validity and reliability. Finally, the two raters grading the test were not blinded to group assignment, but each scored the tests independently using strict criteria for free-text answers, and most test questions were multiple choice, minimizing the opportunity for measurement bias. In fact, the greatest magnitude of improvement was seen for literature searching (multiple choice questions), and no improvement was seen in study quality assessment (free-text answers), supporting our belief that bias was unlikely to explain our findings.
Despite these limitations, we believe that we have shown that physicians in training can achieve proficiency in seeking, finding, and understanding the “best evidence.” Clearly, further efforts are required to produce greater improvements in critical appraisal skills,22–24 with the goal being for learners to achieve all-round competency in EBM.25
We think our results propel the EBM movement toward new, more challenging terrain, such as assessing whether residents will employ these new skills in the care of patients. The objective of teaching EBM should be to promote its adoption into everyday clinical practice. As technology and the availability of high-quality evidence improve, our hope is that this can be accomplished during physicians' formative training years.
Our expectation is that research and innovation in medical education will lead the way. In our own educational program, where advocacy of the principles of EBM continues to grow,26 we next plan to study the interface between “best evidence” and “best practice.” Assessing whether the application of EBM skills by practicing physicians actually produces improvements in patient care, and evaluating why physicians choose to use, or not use, published evidence relevant to the care of their patients, are important areas for future research.27–29
What can medical educators learn from those choices? These and other questions need answers before EBM can realize its full potential.