Acad Med. Author manuscript; available in PMC 2012 June 1.
PMCID: PMC3102783; NIHMSID: NIHMS285516

Does Simulation-based Medical Education with Deliberate Practice Yield Better Results than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence

Abstract

Purpose

This article compares the effectiveness of traditional clinical education with that of simulation-based medical education (SBME) with deliberate practice (DP) in achieving specific clinical skill acquisition goals.

Method

This is a quantitative meta-analysis that spans twenty years, 1990 to 2010. A search strategy involving three literature databases, 12 search terms, and four inclusion criteria was used. Four authors independently retrieved and reviewed articles. Main outcome measures were extracted to calculate effect sizes.

Results

Of 3,742 articles identified, 14 met inclusion criteria. The overall effect size for the 14 studies evaluating the comparative effectiveness of SBME compared to traditional clinical medical education was 0.71 (95% confidence interval, 0.65–0.76; P < .001).

Conclusions

Although the number of reports analyzed in this meta-analysis is small, these results show that SBME with DP is superior to traditional clinical medical education in achieving specific clinical skill acquisition goals. SBME is a complex educational intervention that should be introduced thoughtfully and evaluated rigorously at training sites. Further research on incorporating SBME with DP into medical education is needed to amplify its power, utility, and cost-effectiveness.

This article addresses the comparative effectiveness of traditional methods of clinical medical education, especially the Halstedian “see one, do one, teach one” approach,1 versus simulation-based medical education (SBME) with deliberate practice (DP). SBME2–4 engages learners in lifelike experiences with varying fidelity designed to mimic real clinical encounters. DP embodies strong and consistent educational interventions grounded in information processing and behavioral theories of skill acquisition and maintenance.5–8 DP has at least nine elements (List 1).

The goal of DP is constant skill improvement, not just skill maintenance. The power of DP has been demonstrated in many professional domains including sports, commerce, performing arts, science, and writing.9 Research shows that DP is a much more powerful predictor of professional accomplishment than experience or academic aptitude.6

Comparative effectiveness research (CER), also known as patient-centered outcomes research, refers to studies that compare the benefits and liabilities of different interventions and strategies to prevent, diagnose, treat, and monitor health conditions.10–12 The aim is to “make head-to-head comparisons of different health care interventions [that] outline the effectiveness—or benefits and harms—of treatment options.”13 Conventional clinical treatment options include drugs, surgery, rehabilitation, and preventive interventions that (1) improve patient health, (2) contribute to quality of life, and (3) boost longevity. Treatment options grounded in comparative research have efficacy in controlled laboratory settings and are also effective in clinical patient care where health care delivery, its receipt, and patient adherence vary widely.14

U.S. CER policies have been published recently by the Institute of Medicine (IOM) under the title, Knowing What Works in Health Care: A Roadmap for the Nation.15 Complementary work by the U.S. Agency for Healthcare Research and Quality (AHRQ) outlines comparative health care research priorities.16 These expressions of CER policies and priorities focus on conventional treatment options. However, they do not address the value of a skillful medical and health professions workforce and the importance of its education for the delivery of effective health care. We assert that human capital, embodied in competent physicians and other health care professionals, is an essential feature of health care delivery even though IOM policies and AHRQ research priorities are silent about the contribution of health professions education to health care delivery.

The purpose of medical education at all levels is to prepare physicians with the knowledge, skills, and features of professionalism needed to deliver quality patient care. Medical education research seeks to make the enterprise more effective, efficient, and economical. Short- and long-run goals of research in medical education are to show that educational programs contribute to physician competence measured in the classroom, the simulation laboratory, and patient care settings. Improved patient outcomes linked directly to educational events are the ultimate goal of medical education research and qualify this scholarship as translational science.17

This article reviews and evaluates evidence about the comparative effectiveness of SBME with DP versus traditional clinical education. The aim is a “head-to-head” comparison of these two educational methods toward the goal of clinical skill acquisition. This is a quantitative meta-analysis of SBME research that spans twenty years, from 1990 to 2010. The comparative review is selective and critical. We also believe it is exhaustive because the number of existing comparative studies is small.

Method

This article was prepared following most of the reporting conventions described in the MOOSE (Meta-analysis of Observational Studies in Epidemiology) Statement18 and the QUOROM Statement19 for reports of meta-analyses of randomized controlled trials.

Study eligibility and identification

Quantitative research synthesis begins with a systematic search of existing literature. Our search strategy covered three literature databases (MEDLINE, Web of Knowledge, PsycINFO) and employed 12 single search terms and concepts (clinical education, clinical outcomes, deliberate practice, fellows, mastery learning, medical education, medical simulation training, medical students, patient outcomes, quality of care, residents, simulator) and their Boolean combinations. We searched publications from 1990 through April 2010. We also reviewed the reference lists of all selected manuscripts to identify additional reports. The intent was to perform a detailed and thorough search of peer-reviewed publications, which have been judged for academic quality, to evaluate the comparative effectiveness of SBME with DP versus traditional clinical education.
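
As an illustration of how 12 single terms can yield a large set of Boolean combinations, the short Python sketch below enumerates the pairwise AND combinations of the terms listed above. The exact query strings, database-specific syntax, and combinations the authors ran are not reported, so this is a hypothetical reconstruction of the general approach, not their actual search.

```python
from itertools import combinations

# The 12 single search terms reported in the Method section.
terms = [
    "clinical education", "clinical outcomes", "deliberate practice", "fellows",
    "mastery learning", "medical education", "medical simulation training",
    "medical students", "patient outcomes", "quality of care", "residents", "simulator",
]

# Hypothetical pairwise Boolean (AND) combinations; database-specific syntax
# (field tags, truncation, MeSH mapping) is omitted.
pairwise_queries = [f'"{a}" AND "{b}"' for a, b in combinations(terms, 2)]

print(len(pairwise_queries))   # 66 pairwise combinations of the 12 terms
print(pairwise_queries[0])     # "clinical education" AND "clinical outcomes"
```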

Study selection

Four authors (WCM, SBI, ERC, and DBW) independently retrieved articles using the 12 search terms and reviewed titles and abstracts. The full text of each article thought to be eligible for the study was also reviewed by the four authors. Four inclusion criteria were used to select the pool of eligible studies for the final analysis: each study had to (1) feature SBME with DP as an educational intervention; (2) have an appropriate comparison group featuring traditional, clinical education or a pre-intervention baseline measurement for single group designs; (3) assess trainee skill acquisition rather than knowledge or attitudes; and (4) present sufficient data to enable effect size (ES) calculation. Conflicts were resolved by consensus.

Data abstraction and synthesis

The following data were extracted from selected studies: (1) study design (i.e., randomized trial, cohort study, case-control study, pre-post baseline study); (2) sample size; (3) outcome variables (i.e., what competency was assessed); and (4) reported skill outcome values (mean and standard deviation).

Two authors (WCM, ERC) abstracted information about the main outcome measure for each study and performed statistical analyses. For studies with a comparison group, effect sizes were calculated as the difference in means between the intervention and control groups, divided by the pooled standard deviation. For these studies the intervention group comprised all medical trainees receiving SBME with DP, while the control group included all medical trainees receiving traditional clinical education. Effect size calculations for pre-post baseline studies (within-subjects designs) were performed by dividing the mean difference between posttest and pretest outcomes by the pretest standard deviation. When sufficient data were not available, we used t-test values and degrees of freedom to calculate the effect size correlation.20 Effect sizes were derived for individual studies and then combined across research reports. Effect size estimates were corrected for sample size. Correlation coefficients were calculated from effect size estimates. For each outcome of interest, pooled estimates and 95% confidence intervals (CIs) of effect size correlations were calculated using an inverse-variance weighted random effects meta-analysis.20 Statistical significance was defined as P < .05. Data analyses were done using Comprehensive Meta-Analysis, Version 2 (Biostat, Englewood, NJ).
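
For readers who wish to reproduce these steps, the Python sketch below implements the effect size computations described above: between-groups and pre-post standardized mean differences, a small-sample correction, conversion to an effect size correlation, and inverse-variance weighted random-effects pooling. The function names are ours, and the DerSimonian–Laird estimator is one common way to fit the random-effects model; the authors used Comprehensive Meta-Analysis, Version 2, whose internal algorithms may differ in detail.

```python
import numpy as np

def d_between(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Between-groups effect size: difference in means over the pooled SD."""
    pooled_sd = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def d_prepost(mean_post, mean_pre, sd_pre):
    """Within-subjects effect size: posttest minus pretest, divided by the pretest SD."""
    return (mean_post - mean_pre) / sd_pre

def small_sample_correction(d, df):
    """Hedges-type correction of d for sample size (df = degrees of freedom)."""
    return d * (1 - 3 / (4 * df - 1))

def r_from_d(d):
    """Convert a standardized mean difference to an effect size correlation (equal-n approximation)."""
    return d / np.sqrt(d**2 + 4)

def r_from_t(t, df):
    """Effect size correlation from a t value and its degrees of freedom."""
    return np.sqrt(t**2 / (t**2 + df))

def random_effects_pool(r, n):
    """Inverse-variance weighted random-effects pooling of correlations (DerSimonian-Laird),
    carried out on Fisher's z scale; returns the pooled r and its 95% CI."""
    r, n = np.asarray(r, float), np.asarray(n, float)
    z = np.arctanh(r)                        # Fisher's z transform
    v = 1.0 / (n - 3)                        # within-study variance of z
    w = 1.0 / v
    z_fixed = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fixed) ** 2)       # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(z) - 1)) / c)  # between-study variance
    w_star = 1.0 / (v + tau2)
    z_pooled = np.sum(w_star * z) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    ci = np.tanh([z_pooled - 1.96 * se, z_pooled + 1.96 * se])
    return float(np.tanh(z_pooled)), tuple(ci)

# Example with made-up summary data from three hypothetical studies.
r_values = [r_from_d(small_sample_correction(d_between(85, 10, 20, 75, 12, 20), 38)),
            r_from_d(d_prepost(90, 70, 15)),
            r_from_t(3.2, 28)]
print(random_effects_pool(r_values, n=[40, 25, 30]))
```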

Results

Each reviewer screened the 3,742 citations identified using our search strategies. We excluded papers if they were not original research, did not involve medical learners, or were not published in English. This left a subset of 328 articles for further review. Each author evaluated this group in detail, and discrepancies were discussed until full consensus was reached on whether each article met the inclusion criteria described above. Of this group, 314 were excluded from the final analysis because they did not feature DP, lacked a comparison group, did not use a simulation-based intervention, or presented insufficient data. Several studies were included even though the term “deliberate practice” was not used in the text. In these cases, descriptions of the type, intensity, and quality of the educational interventions were synonymous with the DP model.

The search strategy and inclusion and exclusion criteria resulted in a final set of 14 research reports addressing medical clinical skill acquisition.21–34 The 14 journal articles are listed in Table 1 in four descending categories ordered by the rigor of their research design.14

Table 1
Studies and Study Features Included in a Meta-Analysis of Articles Published from 1990 to 2010, Comparing the Effectiveness of Traditional Clinical Education with Simulation-Based Medical Education with Deliberate Practice (DP)

A total of 633 learners participated, including 389 internal medicine, surgical, and emergency medicine residents, 226 medical students, and 18 internal medicine fellows. The SBME studies address a number of competencies and skills including advanced cardiac life support, laparoscopic surgical techniques, central venous catheter insertion, cardiac auscultation, and thoracentesis. Six of the studies demonstrated improvement in laparoscopic surgical skills including cholecystectomy, instrument and camera navigation and handling, and suturing live tissues.22–26,34 Two of the studies showed improved performance and adherence to American Heart Association advanced cardiac life support guidelines including responses during actual patient codes.21,30 Cardiac auscultation skills including identification and interpretation of heart sounds and murmurs were improved among medical students and residents in two studies.27,29 Four of the studies demonstrated improved ability among residents and fellows to perform three invasive procedures (hemodialysis catheter insertion, thoracentesis, central venous catheter insertion).28,31–33

Results from the meta-analysis of the 14 studies comparing SBME with DP versus traditional clinical education are displayed quantitatively and as a forest plot in Figure 1.35 The figure shows CER results with 95% CIs for each individual study and overall. The size of each box indicates the study’s relative sample size. Without exception and with very high confidence, the CER data favor SBME with DP in comparison to traditional clinical education or a pre-intervention baseline measure: every study’s confidence interval excludes the null value. The overall effect size correlation (0.71) qualifies as a large effect size36 and summarizes the quantitative power of SBME with DP educational interventions for skill acquisition compared to traditional clinical education.

Figure 1
Random effects meta-analysis of traditional clinical education compared to Simulation-Based Medical Education (SBME) with Deliberate Practice (DP). Effect size correlations with 95% confidence intervals (95% CI) represent the 14 studies included in the meta-analysis.
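
As an illustration of how a forest plot of this kind can be drawn, the Python/matplotlib sketch below uses made-up effect size correlations and confidence intervals purely to show the layout (one square per study, sized by relative sample size, plus a summary row); the values are not those of the 14 studies in Figure 1.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical values for illustration only -- not the results in Figure 1.
studies = ["Study A", "Study B", "Study C", "Overall"]
r       = np.array([0.62, 0.74, 0.70, 0.71])
ci_low  = np.array([0.48, 0.60, 0.55, 0.65])
ci_high = np.array([0.76, 0.88, 0.85, 0.76])
size    = np.array([30, 60, 45, 120])            # box area ~ relative sample size

y = np.arange(len(studies))[::-1]                # plot top to bottom
fig, ax = plt.subplots(figsize=(6, 3))
ax.errorbar(r, y, xerr=[r - ci_low, ci_high - r],
            fmt="none", ecolor="black", capsize=3)     # 95% CI whiskers
ax.scatter(r, y, s=size, marker="s", color="black")    # effect size boxes
ax.axvline(0.0, linestyle="--", color="gray")          # null value (r = 0)
ax.set_yticks(y)
ax.set_yticklabels(studies)
ax.set_xlabel("Effect size correlation (r) with 95% CI")
fig.tight_layout()
plt.show()
```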

Discussion and Conclusions

Only a small number of studies were identified that address “head to head” comparative effectiveness of SBME with DP and traditional clinical education or a pre-intervention baseline. However, the results of this meta-analysis are clear and unequivocal. The meta-analytic outcomes favoring SBME with DP are powerful, consistent, and without exception. There is no doubt that SBME is superior to traditional clinical education for acquisition of a wide range of medical skills represented in this study: advanced cardiac life support, laparoscopic surgery, cardiac auscultation, hemodialysis catheter insertion, thoracentesis, and central venous catheter insertion. We are confident that demonstrations of the utility and cost effectiveness37 of educational interventions featuring SBME with DP will increase as the technology is applied to other skill acquisition and maintenance opportunities in health care.

A growing body of evidence shows that clinical skills acquired in medical simulation laboratory settings transfer directly to improved patient care practices and better patient outcomes. Examples of improved patient care practices linked directly to SBME include studies of better management of difficult obstetrical deliveries (e.g., shoulder dystocia),38 laparoscopic surgery,39 and bronchoscopy.40 Better patient outcomes linked directly to SBME have been reported in several studies using historical control groups that address reductions in catheter-related bloodstream infections41 and postpartum outcomes (e.g., brachial palsy injury,38 neonatal hypoxic-ischemic encephalopathy42) among newborn infants. Such work suggests that traditional, clinical education is insufficient if the goal is skill acquisition and downstream patient safety.

The power and utility of SBME with DP toward the goal of skill acquisition is no longer in doubt, especially compared to traditional models of clinical education. However, we also acknowledge that SBME with DP is a complex intervention with a variety of elements: it has a long implementation chain, its features mutate as a result of refinement and adaptation to local circumstances, and it represents an open system that feeds back on itself. Pawson and colleagues assert, “As interventions are implemented, they change the conditions that made them work in the first place.”43 There is much to learn about the organizational effects of SBME with DP on the medical schools and postgraduate residency programs that adopt these new educational technologies. Best practices to develop faculty to teach using SBME with DP also warrant attention. Finally, we agree with Eva, who asserts that the medical education community needs to “move away from research that is intended to prove the effectiveness of our educational endeavours and towards research that aims to understand the complexity inherent in those activities.”44

The results of this CER study underscore the importance of finding new ways to invest in and grow the human capital embodied in a highly skilled workforce to improve health care delivery and patient safety. CER policies and priorities should endorse the importance of medical and health professions education in addition to investments in basic science research, drug design, medical device fabrication, and other established mechanisms of medical translational science. A recent conference hosted by Harvard Medical School involving educational leaders from eight other U.S. medical schools concluded, “…investigation of the efficacy of simulation in enhancing the performance of medical school graduates received the highest [priority] score.”45 Enhancement of the traditional clinical educational model with evidence-based practices like SBME with DP should be a high priority for medical education policy and research.

This study has several limitations. First, the final number of research studies contained in the meta-analysis (14) is small, even though the data involve 633 medical learners. Second, the meta-analysis primarily addresses acquisition of medical procedural skills. It does not cover acquisition of many other clinical skills, such as judgment under pressure, medical decision-making, situation awareness, teamwork, or professional behavior. It is not known whether the DP model is suited to these skills, and further research is warranted. Third, we are aware of many potential sources of bias that may affect meta-analyses of quasi-experimental research including cohort, case-control, and pre-post studies.46 Despite these limitations, the direction, strength, and consistency of the results from this study indicate the outcomes are robust. We conclude that DP is a key variable in rigorous SBME research and training. Further CER on SBME with DP versus traditional clinical education will refute or endorse this conclusion.

List 1

Nine Elements of Deliberate Practice (DP)*

  1. Highly motivated learners with good concentration who address
  2. Well defined learning objectives or tasks at an
  3. Appropriate level of difficulty with
  4. Focused, repetitive practice that yields
  5. Rigorous, reliable measurements that provide
  6. Informative feedback from educational sources (e.g., simulators, teachers) that promote
  7. Monitoring, error correction, and more DP that enable
  8. Evaluation and performance that may reach a mastery standard, where learning time may vary but expected minimal outcomes are identical, which allows
  9. Advancement to the next task or unit.

Acknowledgments

The authors acknowledge the graphical expertise of Sheila Macomber. They thank Douglas Vaughan, MD, at Northwestern University and Michael S. Gordon, MD, PhD, at the University of Miami for their support of this work.

Funding Support: This research was supported in part by the Jacob R. Suker, MD, professorship in medical education and by grant UL 1 RR025741 from the National Center for Research Resources, National Institutes of Health (NIH). (Dr. McGaghie) The NIH had no role in the preparation, review, or approval of the manuscript.

Footnotes

Other disclosure: None.

Ethical approval: Not applicable.

*Source: McGaghie WC, Siddall VJ, Mazmanian PE, Myers J. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education. Effectiveness of continuing medical education: American College of Chest Physicians evidence-based educational guidelines. CHEST 2009; 135: 62S–68S.

Source: Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006; 21: 251–256.

Contributor Information

Dr. William C. McGaghie, Jacob R. Suker, MD, professor of medical education, professor of preventive medicine, and director of evaluation, NUCATS Institute, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. S. Barry Issenberg, Professor of medicine and assistant director, Gordon Center for Research in Medical Education, University of Miami Miller School of Medicine, Miami, Florida.

Ms. Elaine R. Cohen, Research assistant, department of medicine, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. Jeffrey H. Barsuk, Assistant professor of medicine, division of hospital medicine, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Dr. Diane B. Wayne, Associate professor of medicine and director, internal medicine residency training program, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

References

1. Halsted WS. The training of the surgeon. Bull Johns Hopkins Hosp. 1904;15:267–275.
2. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282(9):861–866. [PubMed]
3. Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach. 2005;27:10–28. [PubMed]
4. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63. [PubMed]
5. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–S81. http://journals.lww.com/academicmedicine/Fulltext/2004/10001/Deliberate_Practice_and_the_Acquisition_and.22.aspx. Accessed February 25, 2011. [PubMed]
6. Ericsson KA. The influence of experience and deliberate practice on the development of superior expert performance. In: Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, editors. The Cambridge Handbook of Expertise and Expert Performance. New York: Cambridge University Press; 2006. pp. 683–703.
7. Cordray DS, Pion GM. Treatment strength and integrity: models and methods. In: Bootzin RR, McKnight PE, editors. Strengthening Research Methodology: Psychological Measurement and Evaluation. Washington, DC: American Psychological Association; 2006. pp. 103–124.
8. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256. [PMC free article] [PubMed]
9. Ericsson KA, Charness N, Feltovich PJ, Hoffman RR. The Cambridge Handbook of Expertise and Expert Performance. New York: Cambridge University Press; 2006.
10. U.S. Department of Health and Human Services. Comparative effectiveness research funding: Federal Coordinating Council for Comparative Effectiveness Research. Available at: http://www.hhs.gov/recovery/programs/cer/. Accessed February 28, 2011.
11. Institute of Medicine. National Priorities for Comparative Effectiveness Research. Washington, DC: National Academies Press; 2009.
12. Hochman M, McCormick D. Characteristics of published comparative effectiveness studies of medications. JAMA. 2010;303 (10):951–958. [PubMed]
13. Agency for Healthcare Research and Quality. AHRQ Effective Health Care Program. Available at: http://effectivehealthcare.ahrq.gov/. Accessed February 25, 2011.
14. Fletcher RH, Fletcher SW, Wagner EH. Clinical Epidemiology—The Essentials. 3. Baltimore: Lippincott Williams & Wilkins; 1996.
15. Institute of Medicine (IOM) Knowing What Works in Health Care: A Roadmap for the Nation. Washington, DC: The National Academies Press; 2008.
16. Agency for Healthcare Research and Quality (AHRQ). What is the Effective Health Care Program. Available at: http://effectivehealthcare.ahrq.gov/index.cfm/what-is-the-effective-health-care-program1/. Accessed February 25, 2011.
17. McGaghie WC. Medical education research as translational science. Sci Transl Med. 2010;2:19cm8. [PubMed]
18. Stroup DF, Berlin JA, Morton SC, et al. for the Meta-analysis Of Observational Studies in Epidemiology (MOOSE) Group. Meta-analysis of observational studies in epidemiology: a proposal for reporting. JAMA. 2000;283 (15):2008–2012. [PubMed]
19. Moher D, Cook DJ, Eastwood S, et al. for the QUOROM Group. Improving the quality of reports of meta-analyses of randomized controlled trials: the QUOROM statement. Lancet. 1999;354:1896–1900. [PubMed]
20. Borenstein M, Hedges LV, Higgins JPT, Rothstein HR. Introduction to Meta-Analysis. Chichester, UK: John Wiley & Sons; 2009.
21. Wayne DB, Butter J, Siddall VJ, et al. Simulation-based training of internal medicine residents in advanced cardiac life support protocols: a randomized trial. Teach Learn Med. 2005;17:202–208. [PubMed]
22. Ahlberg G, Enochsson L, Gallagher AG, et al. Proficiency-based virtual reality training significantly reduces the error rate for residents during their first 10 laparoscopic cholecystectomies. Am J Surg. 2007;193:797–804. [PubMed]
23. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: results of a randomized, double-blinded study. Ann Surg. 2006;243:854–863. [PubMed]
24. Korndorffer JR, Dunne JB, Sierra R, et al. Simulator training for laparoscopic suturing using performance goals translates to the operating room. J Am Coll Surg. 2005;201:23–29. [PubMed]
25. Korndorffer JR, Hayes DJ, Dunne JB, et al. Development and transferability of a cost-effective laparoscopic camera navigation simulator. Surg Endosc. 2005;19:161–167. [PubMed]
26. Van Sickle KR, Bitter EM, Baghai M, et al. Prospective, randomized, double-blind trial of curriculum-based training for intracorporeal suturing and knot tying. J Am Coll Surg. 2008;207:560–568. [PubMed]
27. Issenberg SB, McGaghie WC, Gordon DL, et al. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med. 2002;14:223–228. [PubMed]
28. Barsuk JH, Ahya SN, Cohen ER, et al. Mastery learning of temporary hemodialysis catheter insertion by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;53 (6):A14–A17. [PubMed]
29. Butter J, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med. 2010;25:780–785. [PMC free article] [PubMed]
30. Wayne DB, Didwania A, Feinglass J, et al. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital. CHEST. 2008;133:56–61. [PubMed]
31. Wayne DB, Barsuk JH, O’Leary KO, et al. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48–54. [PubMed]
32. Barsuk JH, McGaghie WC, Cohen ER, et al. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403. [PubMed]
33. Barsuk JH, McGaghie WC, Cohen ER, et al. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701. [PubMed]
34. Stefanidis D, Sierra R, Korndorffer JR, et al. Intensive continuing medical education course training on simulators results in proficiency in laparoscopic suturing. Am J Surg. 2006;191:23–27. [PubMed]
35. Anzures-Cabrera J, Higgins JPT. Graphical displays for meta-analysis: an overview with suggestions for practice. Res Syn Meth. 2010;1:66–80.
36. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.
37. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102. [PubMed]
38. Draycott TJ, Crofts JF, Ash JP, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol. 2008;112:14–20. [PubMed]
39. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236(4):458–463; discussion 463–464. [PubMed]
40. Blum MG, Powers TW, Sundarasan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg. 2004;78:287–291. [PubMed]
41. Barsuk JH, Cohen ER, Feinglass J, et al. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423. [PubMed]
42. Draycott T, Sibanda T, Owen L, et al. Does training in obstetric emergencies improve neonatal outcome? BJOG. 2006;113:177–182. [PubMed]
43. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review – a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(Suppl 1):21–34. [PubMed]
44. Eva KW. The value of paradoxical tensions in medical education research. Med Educ. 2010;44:3–4. [PubMed]
45. Fincher R-ME, White CB, Huang G, Schwartzstein R. Toward hypothesis-driven medical education research: task force report from the millennium conference 2007 on educational research. Acad Med. 2010;85:821–828. http://journals.lww.com/academicmedicine/Abstract/2010/05000/Toward_Hypothesis_Driven_Medical_Education.27.aspx. Accessed February 25, 2011. [PubMed]
46. Colliver JA, Kucera K, Verhulst SJ. Meta-analysis of quasi-experimental research: are systematic narrative reviews indicated? Med Educ. 2008;42:858–865. [PubMed]