Background: Cardiac auscultation is a core clinical skill. However, prior studies show that trainee skills are often deficient and that clinical experience is not a proxy for competence.
Objective: To describe a mastery model of cardiac auscultation education and evaluate its effectiveness in improving bedside cardiac auscultation skills.
Design: Untreated control group design with pretest and posttest.
Participants: Third-year medical students who received a cardiac auscultation curriculum and fourth-year students who did not.
Intervention: A cardiac auscultation curriculum consisting of a computer tutorial and a cardiac patient simulator. All third-year students were required to meet or exceed a minimum passing score (MPS) set by an expert panel at posttest.
Measurements: Diagnostic accuracy with simulated heart sounds and actual patients.
Results: Trained third-year students (n=77) demonstrated significantly higher cardiac auscultation accuracy than untrained fourth-year students (n=31) in assessment of simulated heart sounds (93.8% vs. 73.9%, p<0.001) and with real patients (81.8% vs. 75.1%, p=0.003). USMLE scores correlated modestly with a computer-based multiple-choice assessment using simulated heart sounds but not with bedside skills on real patients.
Conclusion: A cardiac auscultation curriculum consisting of deliberate practice with a computer-based tutorial and a cardiac patient simulator resulted in improved assessment of simulated heart sounds and more accurate examination of actual patients.
Clinical experience has traditionally served as a proxy for competence in skills such as cardiac auscultation. However, research has shown that clinical experience does not correlate with skill in this core competency.1–4 One study showed that medical students correctly identified only 20% of 12 cardiac events, and the accuracy of residents was no better at 19%.2 Another investigation showed that, except for cardiology fellows, there was no improvement in auscultation skills after the third year of medical school among medical students, residents, and faculty.3 The need for improved assessment and training in core skills such as cardiac auscultation is emphasized by the transition toward a competency-based model of medical education.5
Simulation technology with deliberate practice6 can be used to improve a variety of trainee skills. Examples include cardiac auscultation,7,8 advanced cardiac life support (ACLS),9 thoracentesis,10 and central venous catheter insertion.11–13 In addition to enhancing technical proficiency, some educational interventions have also been shown to transfer skill to the clinical environment and improve patient care14 and patient outcomes.15
In an earlier study, we used an expert panel and performance scores from a group of 100 third-year medical students from three Chicago medical schools to set a minimum passing score (MPS) for cardiac auscultation skills.16 Setting an MPS allows development of a mastery learning model in which all learners achieve the desired outcome, although the time needed to achieve mastery may vary.17,18 Mastery learning, a form of competency-based education, is a rigorous way to document that all trainees have achieved competency in a particular procedure or skill. The current study had two aims. The first was to use a mastery learning program featuring simulation technology and deliberate practice to allow third-year medical students to meet or exceed a minimum level of proficiency in cardiac auscultation skills during the junior medicine clerkship. We also compared their auscultation skills with those of fourth-year students who did not receive the mastery learning program. The second was to determine the impact of this intervention on the accuracy of cardiac auscultation in actual patients.
This article was prepared using reporting conventions described in the TREND Statement19 for nonrandomized comparative studies.
The study was an untreated control group design with pretest and posttest20 of a simulation-based, mastery learning educational intervention designed to increase third-year medical students’ clinical skills at cardiac auscultation (simulation trained group). Primary measurements were obtained at baseline (pretest), immediately after the educational intervention in the simulation laboratory (posttest), and no longer than two weeks following the educational intervention with actual patients. Fourth-year medical students (traditionally trained) did not receive the intervention and served as controls. The cardiac auscultation proficiency of the simulation trained group was compared with the proficiency of the traditionally trained group to assess the impact of the intervention.
From October 2008 to April 2009, 77 third- and 31 fourth-year students on required Department of Medicine rotations at Northwestern Memorial Hospital (NMH) were assessed in cardiac auscultation proficiency using a computerized multiple-choice case-based examination and auscultation of actual patients. The third-year students were on the junior medicine clerkship, an 8-week inpatient and 4-week outpatient experience. The fourth-year students were on the senior medicine clerkship, a 4-week inpatient experience. All students received 2 hours of practice with a cardiac simulator (Harvey) during their second year of medical school. There was no other formal cardiac auscultation training. The Northwestern University Feinberg School of Medicine Institutional Review Board approved the study. Participants provided informed consent before the baseline assessment.
The curriculum for third-year students featured approximately one hour of deliberate practice of 12 major cardiac findings (split S2, S3, S4, systolic click, innocent murmur, mitral regurgitation, mitral stenosis, aortic regurgitation, aortic stenosis, tricuspid regurgitation, continuous murmur, and pericardial rub). The 12 findings were selected based on results of a previously published national survey of internal medicine and family medicine residency program directors.21 The intervention included a computer-based, interactive self-study tutorial (UMedic)22 developed at the University of Miami, which features didactic instruction, deliberate practice, and self-assessment. Students accessed this program at Northwestern’s Clinical Education Center and used headphones to listen to the heart sounds. The volume of the heart sounds was fixed; students completed the tutorial at their own pace and self-reported the time spent with it.
After students spent time with the UMedic tutorial, they received 30-40 minutes of focused review of the major cardiac findings using a cardiac simulator (Harvey) led by an experienced clinician educator (JB).
Third- and fourth-year students completed a computer-based assessment and examined actual patients. The computerized assessment was a set of 12 previously validated multiple-choice questions combining a brief clinical history, vital signs, and an audiovisual file of associated heart sounds presented on the Harvey cardiac patient simulator.7,23,24 Students chose the best among four possible responses. There was no time limit to complete this exercise and students could listen to the heart sounds as many times as they wished. Third-year students completed the computerized assessment before the intervention (pretest) and again after the education program (posttest). Similar to past research, the same heart sounds were used for pretest and posttest, although presentation order varied.7 Fourth-year students completed the examination once during their clinical rotation.
At posttest, third-year students were expected to meet or exceed the MPS of 75% set previously by a panel of 16 expert judges.16 Third-year students who did not achieve the MPS at posttest returned for additional practice with the tutorial and were retested until the MPS was reached.
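The mastery requirement described above amounts to a simple test-practice-retest loop. The following is a schematic sketch only; the function names and callables are illustrative placeholders, not part of the study's procedures:

```python
MPS = 0.75  # minimum passing score set by the expert panel (75%)

def mastery_cycle(take_posttest, additional_practice):
    """Retest until the learner meets or exceeds the MPS.

    take_posttest: callable returning the learner's score as a fraction.
    additional_practice: callable that re-runs tutorial self-study.
    Both callables are hypothetical stand-ins for the study's actual steps.
    """
    score = take_posttest()
    while score < MPS:
        additional_practice()
        score = take_posttest()
    return score
```

The loop captures the defining feature of mastery learning: the outcome is fixed while the practice time varies by learner.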
Students also evaluated patients recruited from internal medicine or cardiology practices based on the presence of at least one of the 12 important cardiac findings listed above. Students each participated in one of five assessment sessions. Third-year students completed the assessment within 2 weeks of the intervention. At four of the sessions students examined four patients. At the fifth session, the students examined five patients. Students were provided a brief clinical history and vital signs and were asked to perform a cardiac assessment of each patient. They had 4 minutes to perform their assessment. Students completed a structured response form on which they described the clinical findings (presence, absence and radiation of systolic murmurs, diastolic murmurs, S3 and S4) and chose the most likely diagnosis from the list of 12 choices. Students received up to 12 points for each patient assessment; 11 for clinical findings and one for the final diagnosis. The structured response form was constructed de novo for this study at a level appropriate for third and fourth-year students. All patients were weighted the same.
At least two board certified cardiologists, each with more than 15 years of clinical experience and recognized for their leadership in cardiology education, also completed a cardiac assessment of the same patients using the same structured response form. Their responses served as the gold standard. Cardiologists and students were not aware of clinical diagnoses or echocardiogram findings of the patients. In the case of discrepancies between cardiologists, students were given credit for any answer provided by one of the cardiologists. Findings by the cardiologists included mitral regurgitation, aortic stenosis/sclerosis, tricuspid regurgitation, innocent murmur, systolic click, S4 and a normal cardiac exam.
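The scoring rules above (up to 12 points per patient: 11 for checklist findings, 1 for the diagnosis, with credit for any answer given by one of the cardiologists when they disagreed) can be sketched as follows. The data structures and function name are illustrative assumptions, not the study's actual forms:

```python
def score_patient(student_findings, student_dx, expert_findings, expert_dxs):
    """Score one patient assessment (illustrative sketch).

    student_findings: dict mapping each checklist item to the student's answer.
    expert_findings: dict mapping each item to the set of answers given by the
        cardiologists; credit is given for any answer a cardiologist gave.
    expert_dxs: set of diagnoses chosen by the cardiologists (gold standard).
    """
    points = sum(1 for item, answer in student_findings.items()
                 if answer in expert_findings[item])   # up to 11 finding points
    points += 1 if student_dx in expert_dxs else 0     # 1 diagnosis point
    return points                                      # out of 12 per patient
```

Representing each expert answer as a set makes the discrepancy rule explicit: a student answer matching either cardiologist earns the point.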
Third-year students completed a survey of their satisfaction with the cardiac auscultation curriculum. United States Medical Licensing Exam (USMLE) Step 1 scores were obtained from the registrar to assess correlations between examination scores and cardiac auscultation performance.
Patient checklist reliability was estimated from the inter-rater reliability of the cardiologists, the preferred method for assessments that depend on human judges, using the kappa (κ) coefficient adjusted with the formula of Brennan and Prediger.25–27
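The Brennan and Prediger adjustment replaces the usual chance-agreement term with 1/k, where k is the number of response categories, giving κn = (po − 1/k) / (1 − 1/k). A minimal sketch for two raters (function name and example data are illustrative):

```python
def brennan_prediger_kappa(rater_a, rater_b, n_categories):
    """Chance-corrected agreement with a uniform chance term of 1/k."""
    if len(rater_a) != len(rater_b) or not rater_a:
        raise ValueError("raters must score the same non-empty set of cases")
    # Observed proportion of cases on which the two raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    p_e = 1.0 / n_categories  # Brennan-Prediger: uniform chance agreement
    return (p_o - p_e) / (1.0 - p_e)
```

Unlike Cohen's kappa, this form does not depend on the raters' marginal category frequencies, which avoids the paradox of low kappa despite high observed agreement when one category dominates.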
Within-group differences between third-year students’ computer-based pretest (baseline) and posttest (outcome) performance were analyzed using the paired t-test. Performance of simulation-trained third-year medical students at computer-based posttest and patient assessments was compared to traditionally trained fourth-year student performance using the unpaired t-test.
Demographic data were analyzed using the chi square statistic and t-test to assess for differences in characteristics between the simulation-trained and traditionally trained students. Spearman’s rho coefficient was used to assess relationships between computer-based and patient assessment performances and USMLE Step 1 scores.
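The three inferential comparisons could be run as follows. This is a sketch with synthetic scores (assuming NumPy and SciPy are available); the numbers are illustrative and do not reproduce the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic percentage scores, loosely shaped like the reported means/SDs.
pretest = rng.normal(67, 19, size=77)
posttest = pretest + rng.normal(26, 10, size=77)   # simulated within-group gain
fourth_year = rng.normal(74, 14, size=31)
usmle = rng.normal(230, 15, size=77)               # hypothetical Step 1 scores

# Within-group change: paired t-test on the same students' two scores.
t_paired, p_paired = stats.ttest_rel(posttest, pretest)

# Between-group comparison: unpaired t-test, trained vs. control.
t_ind, p_ind = stats.ttest_ind(posttest, fourth_year)

# Association with USMLE Step 1: Spearman's rank correlation (rho).
rho, p_rho = stats.spearmanr(usmle, posttest)
```

Spearman's rho is the appropriate choice here because it makes no assumption that examination scores and auscultation performance are linearly related, only monotonically.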
Seventy-eight third-year students and 39 fourth-year students completed required rotations in the Department of Medicine at NMH during the study period. Seventy-seven (99%) third-year students (simulation-trained) and 31 (79%) fourth-year students (traditionally trained) consented to participate in the study. As shown in Table 1, there was no significant difference in age or USMLE Step 1 scores between groups. There were more men in the fourth-year student group. Third-year students spent a mean of 67.5 minutes (SD=21.5 minutes) with the tutorial.
Some variance in the clinical findings of the expert cardiologists was expected. However, the cardiologists displayed moderate inter-rater reliability in their patient assessments as measured by the mean kappa coefficient (κn=0.76) of their findings on the structured response form.
At baseline, third-year students (M=67.3%, SD=18.85%) scored similarly to fourth-year students (M=73.9%, SD=14.1%) [p=0.067]. After simulation training, third-year students improved their scores significantly to 93.8% (SD=11.6%) [p<0.001] and performed better than traditionally trained fourth-year students [p<0.001]. Four third-year students (5.2%) did not achieve the MPS at posttest and required additional practice. Each of these students achieved the MPS when retested after less than one hour of additional self-study. Thirteen fourth-year students (41.9%) did not achieve the MPS. There was no significant difference in performance between males and females in the third-year student group (94.1% vs. 93.3%) [p=0.65] or the fourth-year student group (73.8% vs. 74.3%) [p=0.93].
Table 2 displays simulated cardiac auscultation performance by type of finding. The 12 cardiac findings are divided into diastolic (n=4), systolic (n=6) and other (n=2). As shown, simulation-trained third-year students outperformed fourth-year students in identifying simulated diastolic and systolic findings. Identification of the other two findings was similar between groups.
Using the cardiologists’ findings as a gold standard, third-year (simulation-trained) students also more accurately assessed patients with cardiac findings (M=81.8%, SD=8.8%) compared to fourth-year students (M=75.1%, SD=13.4%) [p=0.003] who did not complete simulation training but had more clinical experience. A graphic portrayal of the students’ computer-based and patient assessment scores is shown in Figure 1.
The students’ USMLE Step 1 scores showed a modest correlation with the multiple-choice auscultation posttest (r=0.27, p=0.002). There was no significant correlation between USMLE scores and third- or fourth-year students’ assessments of actual patients (r=−0.04, p=0.81).
Eighty-seven percent of students completed a course evaluation questionnaire. Students reported that this curriculum improved their cardiac auscultation skills, was a useful adjunct to clinical experience and was enjoyable (Table 3).
Cardiac auscultation is a core clinical skill. Our findings show that a curriculum featuring simulation technology and deliberate practice improved the ability of third-year medical students to identify simulated heart sounds. Third-year students also showed improved accuracy when examining actual patients when compared to untrained fourth-year students. As in our other mastery learning programs,9–13 this intervention featured clear goals, deliberate practice, a supportive environment and rigorous outcome measures. The computer-based delivery of the curriculum was especially suited to repetitive, deliberate practice which has been shown to improve cardiac auscultation performance.8,28
Use of the mastery learning model allowed all third-year students to achieve the MPS for cardiac auscultation skills during the junior medicine clerkship. These findings advance what is known about cardiac auscultation education for medical students because our approach requires that all learners meet or exceed a minimum standard. Although some students required additional practice time, all third-year students met or exceeded the MPS in cardiac auscultation skills at posttest after a curriculum that was largely self-study. Although third-year students in this study performed similarly at baseline to another group of third-year students from three schools,16 third-year students in this study outperformed a cohort of fourth-year students who experienced the same medical school curriculum but did not complete the mastery learning program. In fact, more than 40% of the fourth-year students failed to achieve the MPS on the computer-based examination. This reminds us that clinical experience is not a proxy for skill29 and that rigorous assessment is needed to document the proficiency of trainees in important clinical skills such as cardiac auscultation.
In addition to the improvements demonstrated with simulated heart sounds, students who completed the mastery learning curriculum also showed improvement in cardiac auscultation accuracy with actual patients. The addition of key patient data (history, vital signs) likely explains the higher performance of subjects in this study compared to prior studies that used cardiac auscultation findings alone.3,4,7 Although statistically significant improvements occurred both with simulated heart sounds and with actual patients, the improvement with actual patients was less striking than the improvement found in the laboratory setting. This likely relates to the observation that enhanced clinical performance (efficacy) is easier to detect in a controlled clinical environment than in a patient care setting (effectiveness).30
At Northwestern, we focus on competency-based education and patient safety and use multiple simulation-based mastery learning programs to educate and assess trainees and meet competency requirements. In addition to cardiac auscultation, examples include communication skills31 and invasive procedures such as central venous catheter insertion,12,13 and ACLS.9 Simulation-based mastery learning programs allow trainees to take as much practice time as they need. Learners then document skill in a simulated environment prior to actual patient care. This approach is feasible and practical, is compliant with competency-based accreditation requirements, and has been shown to improve the quality of patient care delivered by trainees in several competency areas.14,15
Medical students are expected to develop a broad range of skills, and educators must use diverse assessments to accurately evaluate their ability to function as clinicians. We found a modest correlation between USMLE Step 1 score and student performance on the computer-based multiple choice examination. However, there was no correlation between USMLE Step 1 score and bedside assessment of actual patients. This suggests that the performance and interpretation of physical examination findings at the bedside represents an independent clinical skill, distinct from the medical knowledge necessary for success in multiple choice examinations.
This study has several limitations. First, it represents the performance of a single group of students over a short time period at one medical school. However, the standard-setting data used to set the MPS were obtained from 100 students at three medical schools. Second, not all cardiac diagnoses were studied. Third, we used the clinical assessment of experienced cardiologists and not echocardiography because there is no gold standard for assessing cardiac physical examination skills.32 We believe this is the most appropriate reference standard because not all valvular abnormalities produce auscultatory findings and some important examination findings have no direct echocardiographic equivalent. Fourth, we used the UMedic self-study tutorial and Harvey cardiac patient simulator for testing and education. Although Harvey is widely used in medical education,22 we do not know if use of other modalities33 would produce similar findings. Last, the durability of enhanced cardiac auscultation proficiency is unknown and is an appropriate area for further study.
In summary, a simulation-based mastery learning program dramatically improved cardiac auscultation skills in medical students. Skills improved in both assessment of simulated heart sounds and examination of actual patients. The program was rated favorably by students. Use of a mastery learning program allows all learners to meet or exceed a predetermined MPS and is a valuable model to achieve and document competence in important clinical skills.
We thank the Northwestern medical students who participated in this study for their dedication to patient care and education. We acknowledge Drs. Douglas Vaughan and John X. Thomas for their support and encouragement of this work. Funding was provided through the Augusta Webster, MD Office of Medical Education and Faculty Development. Dr. McGaghie’s contribution was supported in part by the Jacob R. Suker, MD, professorship in medical education at Northwestern University and by grant UL 1 RR025741 from the National Center for Research Resources, National Institutes of Health. The National Institutes of Health had no role in the preparation, review, or approval of the manuscript.
Funding Office of Medical Education and Faculty Development, Northwestern University Feinberg School of Medicine, Chicago, IL
Conflicts of interest Dr. McGaghie has been a consultant to, and received honoraria from, the Gordon Center for Research in Medical Education at the University of Miami Miller School of Medicine, where the "Harvey" mannequin is manufactured. None of the other authors has conflicts of interest, financial interests, or relationships or affiliations relevant to the subject matter or materials discussed in this manuscript.