Cardiac auscultation is a core clinical skill. Our findings show that a curriculum featuring simulation technology and deliberate practice improved the ability of third-year medical students to identify simulated heart sounds. Trained third-year students were also more accurate when examining actual patients than untrained fourth-year students. As in our other mastery learning programs,9–13
this intervention featured clear goals, deliberate practice, a supportive environment, and rigorous outcome measures. The computer-based delivery of the curriculum was especially suited to repetitive, deliberate practice, which has been shown to improve cardiac auscultation performance.8,28
Use of the mastery learning model allowed all third-year students to achieve the MPS for cardiac auscultation skills during the junior medicine clerkship. These findings advance what is known about cardiac auscultation education for medical students because our approach requires that all learners meet or exceed a minimum standard. Although some students required additional practice time, all third-year students met or exceeded the MPS in cardiac auscultation skills at posttest after a curriculum that was largely self-study. Although third-year students in this study performed similarly at baseline to another group of third-year students from three schools,16
third-year students in this study outperformed a cohort of fourth-year students who experienced the same medical school curriculum but did not complete the mastery learning program. In fact, more than 40% of the fourth-year students failed to achieve the MPS on the computer-based examination. This reminds us that clinical experience is not a proxy for skill29
and that rigorous assessment is needed to document the proficiency of trainees in important clinical skills such as cardiac auscultation.
In addition to the improvements demonstrated with simulated heart sounds, students who completed the mastery learning curriculum also showed improvement in cardiac auscultation accuracy with actual patients. The addition of key patient data (history, vital signs) likely explains the higher performance of subjects in this study compared to prior studies that used cardiac auscultation findings alone.3,4,7
Although statistically significant improvements occurred both with simulated heart sounds and with actual patients, the improvement with actual patients was less striking than the improvement found in the laboratory setting. This likely relates to the observation that enhanced clinical performance (efficacy) is easier to detect in a controlled clinical environment than in a patient care setting (effectiveness).30
At Northwestern, we focus on competency-based education and patient safety and use multiple simulation-based mastery learning programs to educate and assess trainees and meet competency requirements. In addition to cardiac auscultation, examples include communication skills31
and invasive procedures such as central venous catheter insertion.12,13
Simulation-based mastery learning programs allow trainees to take as much practice time as they need. Learners then document skill in a simulated environment prior to actual patient care. This approach is feasible and practical, is compliant with competency-based accreditation requirements, and has been shown to improve the quality of patient care delivered by trainees in several competency areas.14,15
Medical students are expected to develop a broad range of skills, and educators must use diverse assessments to accurately evaluate their ability to function as clinicians. We found a modest correlation between USMLE Step 1 score and student performance on the computer-based multiple-choice examination. However, there was no correlation between USMLE Step 1 score and bedside assessment of actual patients. This suggests that the performance and interpretation of physical examination findings at the bedside represent an independent clinical skill, distinct from the medical knowledge necessary for success in multiple-choice examinations.
This study has several limitations. First, it represents the performance of a single group of students over a short time period at one medical school. However, the standard-setting data used to set the MPS were obtained from 100 students at three medical schools. Second, not all cardiac diagnoses were studied. Third, we used the clinical assessment of experienced cardiologists and not echocardiography because there is no gold standard for assessing cardiac physical examination skills.32
We believe this is the most appropriate reference standard, as not all valvular abnormalities produce auscultatory findings and some important examination findings have no direct echocardiographic equivalent. Fourth, we used the UMedic self-study tutorial and Harvey cardiac patient simulator for testing and education. Although Harvey is widely used in medical education,22
we do not know if use of other modalities33
would produce similar findings. Last, the durability of enhanced cardiac auscultation proficiency is unknown and is an appropriate area for further study.
In summary, a simulation-based mastery learning program dramatically improved cardiac auscultation skills in medical students. Skills improved both in assessments of simulated heart sounds and in examinations of actual patients. The program was rated favorably by students. Use of a mastery learning program allows all learners to meet or exceed a predetermined MPS and is a valuable model to achieve and document competence in important clinical skills.