J Gen Intern Med. 2010 August; 25(8): 780–785.
Published online 2010 March 26. doi: 10.1007/s11606-010-1309-x
PMCID: PMC2896602

Simulation-based Mastery Learning Improves Cardiac Auscultation Skills in Medical Students

Abstract

Background

Cardiac auscultation is a core clinical skill. However, prior studies show that trainee skills are often deficient and that clinical experience is not a proxy for competence.

Objective

To describe a mastery model of cardiac auscultation education and evaluate its effectiveness in improving bedside cardiac auscultation skills.

Design

Untreated control group design with pretest and posttest.

Participants

Third-year students who received a cardiac auscultation curriculum and fourth-year students who did not.

Intervention

A cardiac auscultation curriculum consisting of a computer tutorial and a cardiac patient simulator. All third-year students were required to meet or exceed a minimum passing score (MPS) set by an expert panel at posttest.

Measurements

Diagnostic accuracy with simulated heart sounds and actual patients.

Results

Trained third-year students (n = 77) demonstrated significantly higher cardiac auscultation accuracy than untrained fourth-year students (n = 31) in assessment of simulated heart sounds (93.8% vs. 73.9%, p < 0.001) and with real patients (81.8% vs. 75.1%, p = 0.003). USMLE Step 1 scores correlated modestly with a computer-based multiple-choice assessment using simulated heart sounds but not with bedside skills on real patients.

Conclusions

A cardiac auscultation curriculum consisting of deliberate practice with a computer-based tutorial and a cardiac patient simulator resulted in improved assessment of simulated heart sounds and more accurate examination of actual patients.

Key words: cardiac auscultation, simulation, medical students, learning

BACKGROUND

Clinical experience has traditionally served as a proxy for competence in skills such as cardiac auscultation. However, research has shown that clinical experience does not correlate with skill in this core competency.1–4 One study showed that medical students correctly identified only 20% of 12 cardiac events, and the accuracy of residents was no better at 19%.2 Another investigation showed that, except for cardiology fellows, there was no improvement in auscultation skills after the third year of medical school among medical students, residents, and faculty.3 The need for improved assessment and training in core skills such as cardiac auscultation is underscored by the transition toward a competency-based model of medical education.5

Simulation technology combined with deliberate practice6 can be used to improve a variety of trainee skills. Examples include cardiac auscultation,7,8 advanced cardiac life support (ACLS),9 thoracentesis,10 and central venous catheter insertion.11–13 In addition to enhancing technical proficiency, some educational interventions have been shown to transfer skills to the clinical environment and improve patient care14 and patient outcomes.15

In an earlier study, we used an expert panel and performance scores from a group of 100 third-year medical students from three Chicago medical schools to set a minimum passing score (MPS) for cardiac auscultation skills.16 Setting an MPS allows development of a mastery learning model in which all learners achieve the desired outcome, although the time needed to achieve mastery may vary.17,18 Mastery learning, a form of competency-based education, is a rigorous way to document that all trainees have achieved competency in a particular procedure or skill. The current study had two aims. The first was to use a mastery learning program featuring simulation technology and deliberate practice to enable third-year medical students to meet or exceed a minimum level of proficiency in cardiac auscultation during the junior medicine clerkship, and to compare their auscultation skills with those of fourth-year students who did not receive the mastery learning program. The second was to determine the impact of this intervention on the accuracy of cardiac auscultation in actual patients.

This article was prepared using reporting conventions described in the TREND Statement19 for nonrandomized comparative studies.

METHODS

Objectives and Design

The study was an untreated control group design with pretest and posttest20 evaluating a simulation-based mastery learning educational intervention designed to increase third-year medical students’ clinical skill in cardiac auscultation (simulation-trained group). Primary measurements were obtained at baseline (pretest), immediately after the educational intervention in the simulation laboratory (posttest), and no longer than two weeks after the educational intervention with actual patients. Fourth-year medical students (traditionally trained) did not receive the intervention and served as controls. The cardiac auscultation proficiency of the simulation-trained group was compared with that of the traditionally trained group to assess the impact of the intervention.

Participants

From October 2008 to April 2009, 77 third-year and 31 fourth-year students on required Department of Medicine rotations at Northwestern Memorial Hospital (NMH) were assessed for cardiac auscultation proficiency using a computerized, multiple-choice, case-based examination and auscultation of actual patients. The third-year students were on the junior medicine clerkship, an 8-week inpatient and 4-week outpatient experience. The fourth-year students were on the senior medicine clerkship, a 4-week inpatient experience. All students had received 2 hours of practice with a cardiac patient simulator (Harvey) during their second year of medical school. There was no other formal cardiac auscultation training. The Northwestern University Feinberg School of Medicine Institutional Review Board approved the study. Participants provided informed consent before the baseline assessment.

Educational Intervention

The curriculum for third-year students featured approximately one hour of deliberate practice of 12 major cardiac findings (split S2, S3, S4, systolic click, innocent murmur, mitral regurgitation, mitral stenosis, aortic regurgitation, aortic stenosis, tricuspid regurgitation, continuous murmur, and pericardial rub). The 12 findings were selected based on the results of a previously published national survey of internal medicine and family medicine residency program directors.21 The intervention included a computer-based, interactive, self-study tutorial (UMedic)22 developed at the University of Miami that features didactic instruction, deliberate practice, and self-assessment. Students accessed this program at Northwestern’s Clinical Education Center and used headphones to listen to the heart sounds. The volume of the heart sounds was fixed. Students completed the tutorial at their own pace and self-reported the time spent with it.

After students spent time with the UMedic tutorial, they received 30–40 minutes of focused review of the major cardiac findings using a cardiac patient simulator (Harvey) led by an experienced clinician educator (JB).

Measurements

Third- and fourth-year students completed a computer-based assessment and examined actual patients. The computerized assessment was a set of 12 previously validated multiple-choice questions combining a brief clinical history, vital signs, and an audiovisual file of the associated heart sounds presented on the Harvey cardiac patient simulator.7,23,24 Students chose the best of four possible responses. There was no time limit for this exercise, and students could listen to the heart sounds as many times as they wished. Third-year students completed the computerized assessment before the intervention (pretest) and again after the education program (posttest). As in past research, the same heart sounds were used for pretest and posttest, although presentation order varied.7 Fourth-year students completed the examination once during their clinical rotation.

At posttest, third-year students were expected to meet or exceed the MPS of 75% set previously by a panel of 16 expert judges.16 Third-year students who did not achieve the MPS at posttest returned for additional practice with the tutorial and were retested until the MPS was reached.

Students also evaluated patients recruited from internal medicine or cardiology practices based on the presence of at least one of the 12 important cardiac findings listed above. Each student participated in one of five assessment sessions. Third-year students completed the assessment within 2 weeks of the intervention. At four of the sessions, students examined four patients; at the fifth session, students examined five patients. Students were provided a brief clinical history and vital signs and were asked to perform a cardiac assessment of each patient within 4 minutes. Students completed a structured response form on which they described the clinical findings (presence, absence, and radiation of systolic murmurs, diastolic murmurs, S3, and S4) and chose the most likely diagnosis from the list of 12 choices. Students received up to 12 points for each patient assessment: 11 for clinical findings and 1 for the final diagnosis. The structured response form was constructed de novo for this study at a level appropriate for third- and fourth-year students. All patients were weighted equally.

At least two board-certified cardiologists, each with more than 15 years of clinical experience and recognized for leadership in cardiology education, also completed a cardiac assessment of the same patients using the same structured response form. Their responses served as the gold standard. Cardiologists and students were unaware of the patients’ clinical diagnoses and echocardiogram findings. In the case of discrepancies between cardiologists, students were given credit for any answer provided by one of the cardiologists. Findings by the cardiologists included mitral regurgitation, aortic stenosis/sclerosis, tricuspid regurgitation, innocent murmur, systolic click, S4, and a normal cardiac exam.
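As a concrete illustration, the scoring rule described above can be sketched in a few lines of Python. This is a hypothetical reconstruction, not the study’s actual scoring code; the response-form item names and answer values are invented for the example.

```python
# Hypothetical sketch of the 12-point scoring rule: 11 clinical-finding
# items plus 1 diagnosis item, with credit given when the student's
# answer matches either of the two cardiologist raters.
def score_patient(student, cardiologist_a, cardiologist_b):
    points = 0
    for item, answer in student.items():
        # Credit the student if at least one expert recorded the same answer.
        if answer in (cardiologist_a[item], cardiologist_b[item]):
            points += 1
    return points  # maximum 12 per patient

# Invented example with one finding item and the diagnosis item shown.
student = {"systolic_murmur": "present", "diagnosis": "mitral regurgitation"}
expert_a = {"systolic_murmur": "present", "diagnosis": "mitral regurgitation"}
expert_b = {"systolic_murmur": "absent", "diagnosis": "innocent murmur"}
print(score_patient(student, expert_a, expert_b))  # 2
```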

Third-year students completed a survey of their satisfaction with the cardiac auscultation curriculum. United States Medical Licensing Exam (USMLE) Step 1 scores were obtained from the registrar to assess correlations between examination scores and cardiac auscultation performance.

Data Analysis

Patient checklist reliability was estimated by calculating the inter-rater reliability of the cardiologists, the preferred method for assessments that depend on human judges, using the kappa (κ) coefficient adjusted with the formula of Brennan and Prediger.25–27
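For readers unfamiliar with the Brennan–Prediger adjustment, a minimal sketch follows. It assumes two raters classifying each item into one of q mutually exclusive categories, so chance agreement is taken as 1/q; the function name and data are illustrative, and this is not the study’s analysis code.

```python
# Minimal sketch of the Brennan-Prediger adjusted kappa (kappa_n):
# observed agreement corrected for uniform chance agreement 1/q,
# where q is the number of response categories.
def brennan_prediger_kappa(ratings_a, ratings_b, q):
    n = len(ratings_a)
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    return (p_obs - 1.0 / q) / (1.0 - 1.0 / q)

# Illustrative data: two raters agree on 9 of 10 items, q = 6 categories.
a = ["MR", "AS", "S4", "normal", "MR", "TR", "click", "AS", "normal", "MR"]
b = ["MR", "AS", "S4", "normal", "TR", "TR", "click", "AS", "normal", "MR"]
print(round(brennan_prediger_kappa(a, b, q=6), 2))  # 0.88
```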

When comparing the performance of third-year students from the computer-based pretest (baseline) to the posttest (outcome), within-group differences were analyzed using the paired t-test. The performance of simulation-trained third-year medical students on the computer-based posttest and patient assessments was compared with that of traditionally trained fourth-year students using the unpaired t-test.

Demographic data were analyzed using the chi-square statistic and the t-test to assess for differences in characteristics between simulation-trained and traditionally trained students. Spearman’s rho was used to assess relationships between computer-based and patient assessment performance and USMLE Step 1 scores.
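This analysis plan maps directly onto standard SciPy routines. The sketch below shows the form of each test using simulated placeholder arrays; it is not the study’s code, and the invented numbers do not reproduce the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder score arrays (percent correct); invented for illustration.
pretest = rng.normal(67, 19, 77)             # third-year baseline
posttest = pretest + rng.normal(26, 10, 77)  # third-year after training
fourth_year = rng.normal(74, 14, 31)         # traditionally trained controls

# Within-group change: paired t-test, pretest vs. posttest.
t_within, p_within = stats.ttest_rel(pretest, posttest)

# Between-group comparison: unpaired t-test, posttest vs. fourth-years.
t_between, p_between = stats.ttest_ind(posttest, fourth_year)

# Group demographics: chi-square on a 2x2 sex-by-group contingency table.
sex_table = np.array([[30, 47], [20, 11]])   # invented counts
chi2, p_chi2, dof, expected = stats.chi2_contingency(sex_table)

# USMLE Step 1 vs. auscultation performance: Spearman's rho.
usmle = rng.normal(225, 20, 77)              # invented Step 1 scores
rho, p_rho = stats.spearmanr(usmle, posttest)
```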

RESULTS

Seventy-eight third-year students and 39 fourth-year students completed required rotations in the Department of Medicine at NMH during the study period. Seventy-seven (99%) third-year students (simulation-trained) and 31 (79%) fourth-year students (traditionally trained) consented to participate in the study. As shown in Table 1, there was no significant difference in age or USMLE Step 1 scores between groups. There were more men in the fourth-year student group. Third-year students spent a mean of 67.5 minutes (SD = 21.5 minutes) with the tutorial.

Table 1
Descriptive Statistics for n = 108 Medical Students

Some variance in the clinical findings of the expert cardiologists was expected. However, the cardiologists displayed moderate inter-rater reliability in their patient assessments as measured by the mean adjusted kappa coefficient (κ_n = 0.76) of their findings on the structured response form.

At baseline, third-year students (M = 67.3%, SD = 18.85%) scored similarly to fourth-year students (M = 73.9%, SD = 14.1%) [p = 0.067]. After simulation training, third-year students improved their scores significantly to 93.8% (SD = 11.6%) [p < 0.001] and performed better than traditionally trained fourth-year students [p < 0.001]. Four third-year students (5.2%) did not achieve the MPS at posttest and required additional practice. Each of these students achieved the MPS when retested after less than one hour of additional self-study. Thirteen fourth-year students (41.9%) did not achieve the MPS. There was no significant difference in performance between males and females in the third-year group (94.1% vs. 93.3%) [p = 0.65] or the fourth-year group (73.8% vs. 74.3%) [p = 0.93].

Table 2 displays simulated cardiac auscultation performance by type of finding. The 12 cardiac findings are divided into diastolic (n = 4), systolic (n = 6) and other (n = 2). As shown, simulation-trained third-year students outperformed fourth-year students in identifying simulated diastolic and systolic findings. Identification of the other two findings was similar between groups.

Table 2
Performance of Simulated Cardiac Auscultation by Type of Finding and Group

Using the cardiologists’ findings as the gold standard, third-year (simulation-trained) students also more accurately assessed patients with cardiac findings (M = 81.8%, SD = 8.8%) than fourth-year students (M = 75.1%, SD = 13.4%) [p = 0.003], who did not complete simulation training but had more clinical experience. A graphic display of the students’ computer-based and patient assessment scores is shown in Figure 1.

Figure 1
Performance of simulation-trained third-year students and traditionally trained fourth-year students in cardiac auscultation clinical skills assessments.

The students’ USMLE Step 1 scores showed a modest correlation with the multiple-choice auscultation posttest (r = 0.27, p = 0.002). There was no significant correlation between USMLE scores and third- or fourth-year students’ assessments of actual patients (r = −0.04, p = 0.81).

Eighty-seven percent of students completed a course evaluation questionnaire. Students reported that this curriculum improved their cardiac auscultation skills, was a useful adjunct to clinical experience and was enjoyable (Table 3).

Table 3
Third-Year Student Responses to Course Evaluation Questionnaire (n = 67); Likert Scale: 1 = Strongly Disagree, 5 = Strongly Agree

COMMENT

Cardiac auscultation is a core clinical skill. Our findings show that a curriculum featuring simulation technology and deliberate practice improved the ability of third-year medical students to identify simulated heart sounds. Third-year students also showed improved accuracy when examining actual patients compared to untrained fourth-year students. As in our other mastery learning programs,9–13 this intervention featured clear goals, deliberate practice, a supportive environment, and rigorous outcome measures. The computer-based delivery of the curriculum was especially suited to the repetitive, deliberate practice that has been shown to improve cardiac auscultation performance.8,28

Use of the mastery learning model allowed all third-year students to achieve the MPS for cardiac auscultation skills during the junior medicine clerkship. These findings advance what is known about cardiac auscultation education for medical students because our approach requires that all learners meet or exceed a minimum standard. Although some students required additional practice time, all third-year students met or exceeded the MPS in cardiac auscultation skills at posttest after a curriculum that was largely self-study. Third-year students in this study performed similarly at baseline to another group of third-year students from three schools,16 yet they outperformed a cohort of fourth-year students who experienced the same medical school curriculum but did not complete the mastery learning program. In fact, more than 40% of the fourth-year students failed to achieve the MPS on the computer-based examination. This reminds us that clinical experience is not a proxy for skill29 and that rigorous assessment is needed to document the proficiency of trainees in important clinical skills such as cardiac auscultation.

In addition to the improvements demonstrated with simulated heart sounds, students who completed the mastery learning curriculum also showed improvement in cardiac auscultation accuracy with actual patients. The addition of key patient data (history, vital signs) likely explains the higher performance of subjects in this study compared to prior studies that used cardiac auscultation findings alone.3,4,7 Although statistically significant improvements occurred both with simulated heart sounds and with actual patients, the improvement with actual patients was less striking than the improvement found in the laboratory setting. This likely relates to the observation that enhanced clinical performance (efficacy) is easier to detect in a controlled clinical environment than in a patient care setting (effectiveness).30

At Northwestern, we focus on competency-based education and patient safety and use multiple simulation-based mastery learning programs to educate and assess trainees and meet competency requirements. In addition to cardiac auscultation, examples include communication skills31 and invasive procedures such as central venous catheter insertion12,13 and ACLS.9 Simulation-based mastery learning programs allow trainees to take as much practice time as they need. Learners then document skill in a simulated environment prior to actual patient care. This approach is feasible and practical, is compliant with competency-based accreditation requirements, and has been shown to improve the quality of patient care delivered by trainees in several competency areas.14,15

Medical students are expected to develop a broad range of skills, and educators must use diverse assessments to accurately evaluate their ability to function as clinicians. We found a modest correlation between USMLE Step 1 scores and student performance on the computer-based multiple-choice examination. However, there was no correlation between USMLE Step 1 scores and bedside assessment of actual patients. This suggests that the performance and interpretation of physical examination findings at the bedside represent an independent clinical skill, distinct from the medical knowledge necessary for success on multiple-choice examinations.

This study has several limitations. First, it represents the performance of a single group of students over a short time period at one medical school. However, the standard-setting data used to set the MPS were obtained from 100 students at three medical schools. Second, not all cardiac diagnoses were studied. Third, we used the clinical assessment of experienced cardiologists rather than echocardiography because there is no gold standard for assessing cardiac physical examination skills.32 We believe this is the most appropriate reference standard because not all valvular abnormalities produce auscultatory findings and some important examination findings have no direct echocardiographic equivalent. Fourth, we used the UMedic self-study tutorial and the Harvey cardiac patient simulator for testing and education. Although Harvey is widely used in medical education,22 we do not know whether other modalities33 would produce similar findings. Finally, the durability of enhanced cardiac auscultation proficiency is unknown and is an appropriate area for further study.

In summary, a simulation-based mastery learning program dramatically improved cardiac auscultation skills in medical students. Skills improved both in assessment of simulated heart sounds and in examination of actual patients. The program was rated favorably by students. Use of a mastery learning program allows all learners to meet or exceed a predetermined MPS and is a valuable model for achieving and documenting competence in important clinical skills.

Acknowledgements

We thank the Northwestern medical students who participated in this study for their dedication to patient care and education. We acknowledge Drs. Douglas Vaughan and John X. Thomas for their support and encouragement of this work. Funding was provided through the Augusta Webster, MD Office of Medical Education and Faculty Development. Dr. McGaghie’s contribution was supported in part by the Jacob R. Suker, MD, professorship in medical education at Northwestern University and by grant UL 1 RR025741 from the National Center for Research Resources, National Institutes of Health. The National Institutes of Health had no role in the preparation, review, or approval of the manuscript.

Funding Office of Medical Education and Faculty Development, Northwestern University Feinberg School of Medicine, Chicago, IL

Conflicts of interest Dr. McGaghie has been a consultant to, and received honoraria from, the Gordon Center for Research in Medical Education at the University of Miami Miller School of Medicine, where the "Harvey" mannequin is manufactured. None of the other authors has conflicts of interest, financial interests, or relationships or affiliations relevant to the subject matter or materials discussed in this manuscript.

References

1. Butterworth JS, Reppert EH. Auscultatory acumen in the general medical population. JAMA. 1960;174:32–4.
2. Mangione S, Nieman LZ, Gracely E, Kaye D. The teaching and practice of cardiac auscultation during internal medicine and cardiology training. Ann Intern Med. 1993;119:47–54.
3. Vukanovic-Criley JM, Criley S, Ward CM, et al. Competency in cardiac examination skills in medical students, trainees, physicians and faculty. Arch Intern Med. 2006;166:610–16. doi: 10.1001/archinte.166.6.610.
4. Mangione S, Nieman LZ. Cardiac auscultation skills of internal medicine and family practice trainees: a comparison of diagnostic proficiency. JAMA. 1997;278:717–22. doi: 10.1001/jama.278.9.717.
5. Accreditation Council for Graduate Medical Education Outcome Project. Available at: http://www.acgme.org/Outcome. Accessed February 1, 2010.
6. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–81. doi: 10.1097/00001888-200410001-00022.
7. Issenberg SB, McGaghie WC, Gordon DL, et al. Effectiveness of a cardiology review course for internal medicine residents using simulation technology and deliberate practice. Teach Learn Med. 2002;14:223–8. doi: 10.1207/S15328015TLM1404_4.
8. Barrett MJ, Kuzma MA, Seto TC, et al. The power of repetition in mastering cardiac auscultation. Am J Med. 2006;119:73–5. doi: 10.1016/j.amjmed.2004.12.036.
9. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–6. doi: 10.1111/j.1525-1497.2006.00341.x.
10. Wayne DB, Barsuk JH, O’Leary KJ, Fudala MJ, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48–54. doi: 10.1002/jhm.268.
11. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403. doi: 10.1002/jhm.468.
12. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37: in press.
13. Barsuk JH, Ahya SN, Cohen ER, McGaghie WC, Wayne DB. Mastery learning of temporary hemodialysis catheter insertion by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;54:70–6. doi: 10.1053/j.ajkd.2008.12.041.
14. Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest. 2008;133:56–61. doi: 10.1378/chest.07-0131.
15. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–3. doi: 10.1001/archinternmed.2009.215.
16. Wayne DB, Butter J, Cohen ER, McGaghie WC. Setting defensible standards for cardiac auscultation skills in medical students. Acad Med. 2009;84(10 Suppl):S94–6. doi: 10.1097/ACM.0b013e3181b38e8c.
17. Block JH, ed. Mastery Learning: Theory and Practice. New York: Holt, Rinehart and Winston; 1971.
18. McGaghie WC, Miller GE, Sajid A, Telder TV. Competency-Based Curriculum Development in Medical Education. Public Health Paper No. 68. Geneva, Switzerland: World Health Organization; 1978.
19. Des Jarlais DC, Lyles C, Crepaz N; TREND Group. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND Statement. Am J Public Health. 2004;94:361–6. doi: 10.2105/AJPH.94.3.361.
20. Shadish WR, Cook TD, Campbell DT. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton Mifflin; 2002.
21. Mangione S. The teaching of cardiac auscultation during internal medicine and family medicine training: a national comparison. Acad Med. 1998;73(Suppl):S10–12. doi: 10.1097/00001888-199810000-00030.
22. Miami Group. UMedic User Manual. Miami: Gordon Center for Research in Medical Education, University of Miami Miller School of Medicine; 2007.
23. Issenberg SB, Petrusa ER, McGaghie WC, et al. Effectiveness of a computer-based system to teach bedside cardiology. Acad Med. 1999;74:S93–5. doi: 10.1097/00001888-199910000-00051.
24. Issenberg SB, McGaghie WC, Brown DD, et al. Development of multimedia computer-based measures of clinical skills in bedside cardiology. In: Melnick DE, et al., eds. The Eighth International Ottawa Conference on Medical Education and Assessment Proceedings. Evolving Assessment: Protecting the Human Dimension. Philadelphia: National Board of Medical Examiners; 2000:821–9.
25. Downing SM. Reliability: on the reproducibility of assessment data. Med Educ. 2004;38:1006–12. doi: 10.1111/j.1365-2929.2004.01932.x.
26. Fleiss JL, Levin B, Paik MC. Statistical Methods for Rates and Proportions. 3rd ed. New York: John Wiley & Sons; 2003.
27. Brennan RL, Prediger DJ. Coefficient kappa: some uses, misuses and alternatives. Educ Psychol Meas. 1981;41:687–99.
28. Barrett MJ, Lacey CS, Sekara AE, Linden EA, Gracely EJ. Mastering cardiac murmurs: the power of repetition. Chest. 2004;126:470–5. doi: 10.1378/chest.126.2.470.
29. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260–73.
30. Fletcher RH, Fletcher SW, Wagner EH. Clinical Epidemiology: The Essentials. 3rd ed. Baltimore: Williams & Wilkins; 1996.
31. Wayne DB, Cohen E, Makoul G, McGaghie WC. The impact of judge selection on standard setting for a patient survey of physician communication skills. Acad Med. 2008;83(10 Suppl):S17–20. doi: 10.1097/ACM.0b013e318183e7bd.
32. Hatala R, Issenberg SB, Kassen B, Cole G, Bacchus CM, Scalese RJ. Assessing cardiac physical examination skills using simulation technology and real patients: a comparison study. Med Educ. 2008;42:628–36. doi: 10.1111/j.1365-2923.2007.02953.x.
33. de Giovanni D, Roberts T, Norman G. Relative effectiveness of high- versus low-fidelity simulation in learning heart sounds. Med Educ. 2009;43:661–8. doi: 10.1111/j.1365-2923.2009.03398.x.
