Med Educ Online. 2012; 17: 10.3402/meo.v17i0.18391.
Published online 2012 May 16. doi: 10.3402/meo.v17i0.18391
PMCID: PMC3355381

An innovative quality improvement curriculum for third-year medical students

Abstract

Background

Competence in quality improvement (QI) is a priority for medical students. We describe a self-directed QI skills curriculum for medical students in a 1-year longitudinal integrated third-year clerkship: an ideal context to learn and practice QI.

Methods

Two groups of four students identified a quality gap, described existing efforts to address the gap, made measurements quantifying the gap, and proposed a QI intervention. The program was assessed with knowledge and attitude surveys and a validated tool for rating trainee QI proposals. Reaction to the curriculum was assessed by survey and focus group.

Results

Knowledge of QI concepts did not improve (mean knowledge score±SD): pre: 5.9±1.5 vs. post: 6.6±1.3, p=0.20. There were significant improvements in attitudes (mean topic attitude score±SD) toward the value of QI (pre: 9.9±1.8 vs. post: 12.6±1.9, p=0.03) and confidence in QI skills (pre: 13.4±2.8 vs. post: 16.1±3.0, p=0.05). Proposals lacked sufficient analysis of interventions and evaluation plans. Reaction was mixed, including appreciation for the experience and frustration with finding appropriate mentorship.

Conclusion

Clinical-year students were able to conduct a self-directed QI project. Lack of improvement in QI knowledge suggests that self-directed learning in this domain may be insufficient without targeted didactics. Higher order skills such as developing measurement plans would benefit from explicit instruction and mentorship. Lessons from this experience will allow educators to better target QI curricula to medical students in the clinical years.

Keywords: quality improvement education, undergraduate medical education, experiential learning, self-directed learning

Recent papers have called for the reform of medical education to better emphasize the ‘development of skills, behaviors, and attitudes needed by practicing physicians’, especially regarding healthcare quality and systems theory (1). Increasingly, medical school deans (2), medical students (3), national leaders in medical education (4), and leaders in health policy (5) are calling for quality improvement (QI) to be an educational priority. Despite this enthusiasm, undergraduate medical education has failed to incorporate QI adequately into required or elective curricula; in 2010, only 29% of graduating medical students felt that they had received adequate instruction in QI (6).

Experiential learning integrated into clinical practice is felt to be the most effective means of teaching QI skills (7). While QI curricula in undergraduate education have been previously described (8–11), descriptions of curricula integrated with clerkship experiences remain sparse (12–14).

Longitudinal clerkships, in which students spend prolonged periods of time working and learning across specialties in one healthcare system, offer students many opportunities to identify systems deficiencies amenable to QI interventions. We took advantage of the implementation of a year-long longitudinal integrated clerkship (15) at our institution to pilot an innovative QI curriculum. We used a literature-based framework to define appropriate QI competencies for third-year medical students (16). Self-directed learning was emphasized as the foundation for how students will approach new problems throughout their careers, as well as to optimize the use of the limited time allotted for formal didactics (17).

In this paper, we describe the development, implementation, and assessment of a self-directed, experiential QI curriculum for third-year medical students. We provide narrative descriptions of the students’ work and suggest key lessons to inform future clinically based medical student QI curricula, both in traditional and longitudinal clerkships.

Methods

Development process

In 2007, the University of California, San Francisco School of Medicine (UCSF SOM) piloted a 1-year longitudinal integrated third-year clerkship: the Parnassus Integrated Student Clinical Experiences program (PISCES) (15). In longitudinal integrated clerkships, students complete their clinical rotations at a single site over an extended period of time. This approach fosters patient-centeredness and promotes a more holistic approach to healthcare (18, 19). By remaining in a single site, students benefit from being integrated into that healthcare system (19, 20); they become immersed in the logistics and systems that may need improvement. Furthermore, students have a prolonged duration to develop QI projects in the context of their clinical responsibilities. Hence, the PISCES program provides an opportune setting for implementing a new QI curriculum.

To meet the need for education in the competencies of systems-based practice and practice-based learning for this group, a PISCES faculty member (KH) and a student ‘curriculum ambassador’ (DL) developed a QI curriculum. Three themes emerged from a literature review on key principles in undergraduate QI education. First, experiential activities are most effective in teaching QI (11–13, 16, 21–24). Second, students value the opportunity to have creative input in their experiential QI work (10, 11). Finally, QI is best taught as an integrated component of existing clinical experiences, rather than as an isolated course (12).

We developed an experience-based, self-directed QI curriculum, using the Ogrinc et al. framework of QI competencies to develop specific learning objectives for third-year medical students (16). Our curriculum was developed in an iterative fashion starting in 2007–2008. In the first 2 years of the program, which focused on the feasibility of the curriculum, we required that students enrolled in the longitudinal clerkship develop a QI proposal. We identified faculty willing to advise students and established that students were logistically capable of meeting the minimal requirements. In this report, we describe our findings from the 2009–2010 academic year, during which we made pre- and post- measurements of knowledge, skills, and attitudes compared to a control group and formally evaluated final QI proposals using an expert panel and a validated tool.

Curriculum overview

The curriculum objectives were anchored with developmentally appropriate learning objectives as derived from the framework described by Ogrinc et al. (16). Specifically, we set the expectation that third-year medical students would demonstrate competence in developing a QI intervention. The implementation and assessment of interventions are beyond the expected skill level of an ‘advanced beginner’ in this schema. Thus, students were required to present a QI proposal that included evidence of their efforts to (1) analyze a process of care for a healthcare quality gap of their own choosing, (2) make initial measurements quantifying the gap, and (3) recommend changes to close the gap. To identify a quality gap, students were asked to reflect on their clinical experiences to date and identify situations in which a patient received care that was not optimal. The curriculum director (SM) provided guidance for these steps.

To avoid overburdening an already full faculty and student schedule, as well as to emphasize the practice-based learning skills that students will need throughout their careers, we established a curricular structure that included key components of self-directed learning (17). Rather than relying on didactics, we emphasized the role of the educator as the facilitator, to help students identify learning needs specific to their identified QI gap and to develop learning objectives and identify appropriate resources to meet their needs. Specific milestones with structured ‘check-ins’ were prescheduled throughout the academic year.

Implementation

The UCSF Committee on Human Research approved the study. In the 2009–2010 academic year, a pre-intervention survey of knowledge and attitudes about QI was administered to all participating students. The curriculum director presented an hour-long lecture to orient students to the structure and objectives of the curriculum and introduce the basics of QI science. The QI science content of the lecture emphasized the importance of making structure, process, and outcome measurements to determine the efficacy of an intervention. Of the 16 students in the PISCES program, eight were pre-assigned to the QI curriculum and pre-divided into two groups of four. The other eight students were assigned to an outpatient community-based project (not part of this paper).

After each QI group identified a healthcare quality gap, the curriculum director helped focus students on specific measurable gaps relating to the theme they chose, discouraging the pursuit of gaps requiring analysis of health metrics not readily available to the students. As the students’ projects developed, the curriculum director facilitated establishment of a mentorship relationship between groups and faculty with expertise in the chosen quality gap. Mentors were not recruited until this point so that each group could be matched with a faculty mentor whose expertise was relevant to its chosen gap.

During two scheduled and three ad-hoc meetings, the students in each group updated the curriculum director on the progress of their efforts and received guidance for focusing their next steps. Fig. 1 illustrates the timeline of the curriculum, including milestones marking the achievement of specific objectives, ‘check-ins’, and the final 15-min presentation of each group’s QI proposal.

Fig. 1
Timeline for Quality Improvement Curriculum.

Measurements

We measured educational outcomes within the framework of knowledge, skills, and attitudes. QI skills were evaluated using a validated tool for assessing trainee QI proposals (25). Knowledge and attitudes were assessed at the beginning and end of the curriculum using a survey developed for this experience. We also measured students’ reaction to the curriculum through survey questions and a focus group.

Knowledge

Our knowledge assessment reflected the self-directed nature of the curriculum – we expected each group to learn practical lessons. We, therefore, asked students to define two basic concepts that we anticipated all participating students would have learned after a year-long QI project: ‘CQI’ (continuous quality improvement) and ‘PDSA’ (plan-do-study-act). In addition, based on the assumption that clinical-year medical students should be able to ‘identify outcome and process measures appropriate for a clinical problem’ (16), nine questions were developed in which an assessment measure was listed and students were asked to classify the measure as structure, process, or outcome. See Appendix.

Skills

Skill assessment was determined by evaluation of the final presentations. Four UCSF faculty experts in QI education evaluated the presentations using the validated Quality Improvement Proposal Assessment Tool-7 (QIPAT-7) that was originally designed for the evaluation of residents’ QI proposals (25). It defines seven domains (definition of the problem, identification of key stakeholders, evidence of root cause analysis, choice of QI project, potential interventions, proposed interventions, and implementation/evaluation of the intervention). Each domain is rated from 1 (needs improvement) to 5 (exceeds expectations). Evaluators underwent a 30-min training session to standardize the use of the tool.

Attitude

For the attitude assessment, 12 questions regarding confidence in QI competence were informed by prior reports evaluating QI curricula (26, 27). In addition to confidence in skills, we developed three additional attitude domains designed to assess the effect of our curriculum: students’ perceptions of how highly QI work is valued by providers and institutions, the ability of QI projects to improve care provision, and the role of physicians in QI. Questions addressed these themes using a 5-point Likert scale (1=strongly disagree, 5=strongly agree). See Appendix.

Reaction

At the end of the academic year, faculty who were not involved in the QI assignment conducted a student focus group on behalf of the PISCES leadership; it included one question on students’ experience in the QI curriculum. The authors were provided with an anonymized, transcribed summary of the students’ responses. In addition to the focus group comments, nine Likert-scale questions and five open-ended questions assessing satisfaction with the curriculum were included in the post-assignment survey.

Statistical analysis

For the knowledge questions, paired t-tests were performed on the aggregated scores pre- and post-curriculum (1=correct; 0=incorrect). For the skill assessment, scores in each domain reported by the four expert panelists were averaged, and standard deviations were calculated. For the attitude questions, Likert-scale responses from 1 to 5 were summed within each predefined domain to create topic scores for each respondent in the pre- and post-curriculum periods. Paired t-tests were then performed to obtain p values. All analyses were performed with SAS version 9.2 (SAS Institute, Cary, NC).
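
As a rough illustration of the analysis described above (the authors' analyses were performed in SAS and are not reproduced here), the following Python sketch mirrors the same scoring and testing logic with made-up numbers: the student scores, Likert responses, and panelist ratings below are assumptions for the example, not the study data.

    # Minimal sketch (not the authors' SAS code): paired pre/post comparisons using
    # hypothetical scores for eight students, mirroring the analysis described above.
    import numpy as np
    from scipy import stats

    # Knowledge: number of correct answers out of 11 items (1=correct, 0=incorrect, summed).
    pre_knowledge = np.array([5, 7, 4, 6, 6, 5, 7, 7])
    post_knowledge = np.array([6, 8, 5, 7, 6, 6, 8, 7])
    t, p = stats.ttest_rel(pre_knowledge, post_knowledge)  # paired t-test
    print(f"Knowledge: pre {pre_knowledge.mean():.2f}±{pre_knowledge.std(ddof=1):.2f} "
          f"vs. post {post_knowledge.mean():.2f}±{post_knowledge.std(ddof=1):.2f}, p={p:.2f}")

    # Attitudes: 5-point Likert responses are summed within a predefined domain to give
    # one topic score per respondent, then compared pre vs. post with a paired t-test.
    pre_items = np.array([[3, 4, 3], [4, 4, 3], [3, 3, 4], [4, 3, 3],
                          [3, 4, 4], [4, 4, 4], [3, 3, 3], [4, 4, 3]])
    post_items = np.array([[4, 4, 4], [5, 4, 4], [4, 4, 4], [4, 4, 3],
                           [4, 5, 4], [5, 4, 5], [3, 4, 4], [4, 5, 4]])
    pre_topic, post_topic = pre_items.sum(axis=1), post_items.sum(axis=1)
    t, p = stats.ttest_rel(pre_topic, post_topic)
    print(f"Attitude domain: pre {pre_topic.mean():.1f} vs. post {post_topic.mean():.1f}, p={p:.3f}")

    # Skills: each QIPAT-7 domain score is reported as the mean (±SD) of the four panelists' ratings.
    qipat = np.array([[4, 3, 4, 2, 3, 2, 2],   # panelist 1, seven domains rated 1-5
                      [3, 3, 4, 3, 2, 2, 1],   # panelist 2
                      [4, 4, 3, 3, 3, 2, 2],   # panelist 3
                      [3, 3, 4, 2, 2, 3, 2]])  # panelist 4
    print("QIPAT-7 domain means:", qipat.mean(axis=0), "SDs:", qipat.std(axis=0, ddof=1))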

Results

Synopses of student projects

Group 1

Based on their early clinical experiences, group 1 explored the ‘inadequacy of pain control at the end of life’ as the healthcare quality gap of interest. They initially struggled to define the quality gap and considered characterizing the problem as the lack of palliative care consultation for terminally ill patients, inadequate nurse-physician communication around pain, or infrequent use of the pain control service by the primary team. The group met with physicians as well as nurses who care for patients at the end of life. It proved difficult for them to find a single mentor who could devote sufficient time to guide the project, and the group relied on ‘consultations’ with multiple faculty.

Ultimately, the group focused on nursing documentation around pain control and created a chart review tool to analyze the adequacy of pain control for patients at the end of life. Chart review using this tool led to what they described as their major finding: the documentation of pain control in the medical record can result in a confusing data overload that is difficult for providers to interpret. Based on this discovery, they proposed that a more easily interpretable score, similar to the common APGAR score used for evaluating newborns, would allow for better interpretation and tracking of the adequacy of pain control. However, their proposed intervention was not precisely defined, nor did it include specific, measurable, and timely goals, as was required by the curriculum.

Group 2

Based on their clinical experiences, group 2 became interested in preventable causes of delirium. After exploring targetable causes of delirium, they determined that ‘inappropriate urinary catheter (UC) use’ was a healthcare quality gap that was modifiable by a QI intervention. This group identified a mentor early in the year: the medical director of a small, non-teaching inpatient service, who remained actively engaged for the duration of the project. Group members met with the mentor in person three times and exchanged emails on average twice a month throughout the project.

This group initially hypothesized that clinicians would be unaware of whether their patients had indwelling UCs. Using the 20-bed non-teaching hospitalist service at our institution as a model, they first determined which patients had UCs and subsequently surveyed the responsible hospitalist for awareness of whether their patients had UCs. They found that few hospitalists were unaware that their patients had UCs; a survey of residents on the traditional teaching service had the same finding. They adjusted their focus to survey residents to measure their awareness of proper indications for UC placement and consequently recommended an educational intervention to disseminate guideline-based recommendations for UC placement. While this group's intervention was more developed than the other group's intervention, they also did not define specific, measurable, and timely goals for outcomes targeted by their proposed intervention.

Educational outcomes

Our knowledge measures did not show any significant improvement from pre- to post-assignment [(mean knowledge score out of 11 items±SD) pre: 5.88±1.46 vs. post: 6.63±1.3, p=0.20]. In terms of skill assessment, both groups achieved competence in several of the curricular objectives. Both groups completed literature reviews analyzing the extent of the gap, made efforts to understand previous local work to address the problem, and successfully identified relevant stakeholders. Both groups achieved these milestones within the required deadlines.

Evaluation of the final presentations showed some common shortcomings in QI skills. Most notably, both groups demonstrated inadequate analysis of the potential interventions before determining a specific intervention and struggled to define specific, measurable, and timely goals for their proposed interventions. The evaluation scores of student presentations using the QIPAT-7 are shown in Table 1.

Table 1
Skill assessment

In terms of attitudes, students’ confidence in their QI skills increased significantly at the end of the year. Their perception of how highly QI projects are valued by healthcare providers also increased significantly. There was a non-significant increase in perception of the importance of QI projects in improving care systems and the importance of the role of QI efforts in a physician's practice (Table 2).

Table 2
Attitude assessment

Reaction to curriculum

The focus group revealed that two students found the experience to be helpful in translating thoughts into concrete action. The majority of students would have liked the timeline shortened, with more concrete goals built in. Comments in the open response section of the post-assignment survey showed that students wanted greater ‘protected time’ to work on this assignment. Finally, students felt that it would be beneficial to have had increased guidance from mentors and project leaders. Representative comments from the focus group and survey are presented in Table 3.

Table 3
Learner reaction

Discussion

We piloted a QI curriculum in a year-long longitudinal integrated clerkship for third-year medical students that was based on actual clinical experiences, allowed students to select a QI gap, and required self-directed learning to meet specific competencies. While students were able to meet most of the targeted competencies, higher order skills such as establishing timely and measurable goals for a proposed intervention were lacking. Students showed increased confidence in their ability to perform QI as well as improved perceptions of the value of QI for individuals and institutions. However, measurements of QI knowledge did not show improvement over the course of the year – a deficiency of either our assessment tools or the curriculum.

Four major lessons emerged from this experiment. First, the project was feasible. Assessment of student final presentations demonstrated appropriateness of the majority of preselected competencies: groups were able to identify and quantify a quality gap based on observations from their own clinical experiences, complete literature reviews to analyze the extent of the gap, make efforts to understand previous local work to address the problem, and identify relevant stakeholders. They accomplished these objectives in the context of busy clerkship schedules, demonstrating that these are achievable goals for clinical year medical students.

Second, we learned that students did not perform well in our knowledge assessment, suggesting that knowledge objectives should be explicitly taught with the aid of scheduled instructional sessions throughout the curriculum. We experimented with a curricular model that emphasized self-directed learning. However, basic concepts that we believed would have easily emerged through ongoing involvement with a QI project were not demonstrated by our measures. Supplementing the single introductory lecture with additional didactics may have filled this gap.

Third, we discovered that early establishment of project-specific mentorship is essential. The success of QI curricula at other levels can be attributed to motivated faculty mentorship (19, 28). We had anticipated that requiring students to select projects based on their own experiences would result in greater motivation and self-directed learning. However, implementing this strategy revealed that even with the extensive QI expertise available at our institution, it can be difficult for students to find faculty willing to devote the time needed to actively mentor a group on a de novo project. The experience of group 1, who had difficulty finding a mentor, was diminished in comparison to that of the second group, who had an actively engaged expert mentor for the majority of the project. Thus, we conclude that students should either join ongoing projects that approximate their interests or find a mentor who can commit to their project early in the academic year.

Finally, students struggled with the higher order competencies (such as identification of appropriate process and outcome measures, and the ability to recommend changes in clinical processes) required of ‘advanced beginners’ as defined by Ogrinc et al. (16) and would have benefited from explicit instruction and mentorship to help them develop clear project goals as well as a plan for evaluating the impact of their proposed intervention. For example, despite the consistent emphasis on measurable outcomes that was reinforced throughout the year, students found it difficult to define clear improvement goals and measures by which to gauge the success of their proposed interventions (see Table 1). More focused didactics, more frequent and specific feedback, and consistent mentorship could address this finding.

This pilot study has several limitations. First, the sample size was necessarily small, given the predetermined size of the longitudinal clerkship. Second, our measurements of knowledge and attitudes were developed for the purposes of our curriculum, and thus not independently validated. Finally, the curriculum was piloted in a longitudinal integrated clerkship in which students are immersed in a single healthcare system and instilled with a longitudinal, patient-centered perspective. Piloting the curriculum in this setting, which remains rare in most medical schools, limits the generalizability of our conclusions. Despite this limitation, we feel that the lessons we learned may be largely applicable for self-directed and experiential QI educational efforts for clinical year students. This curriculum may be applicable for any clerkship student who spends a significant continuous period of time within a single system, even with a traditional clerkship structure.

Since the IOM's reports on safety and quality 10 years ago, the need for physicians to be competent in quality improvement has emerged as a major area of focus in medical education (1, 5, 29). In response to this need, we developed a self-directed, experiential QI curriculum for third-year medical students in which they gain experience in systems-based practice and subsequently value the important role of physicians in QI efforts. We share four major lessons from this effort: clinical-year medical students are able to conduct a self-directed QI project, mentorship is vital, self-directed learning in this domain may be insufficient without targeted didactics or other pedagogical strategies, and higher level skills such as the measurement of efficacy of interventions would benefit from explicit instruction and consistent mentorship. As the focus on systems-based practice in undergraduate medical education increases, lessons learned from our pilot curriculum can allow educators teaching QI to better target developmentally appropriate competencies in clinical-year medical students.

Acknowledgements

The authors are grateful to Lowell Tong, MD, for his valuable input on this project, Patricia S. O'Sullivan, EdD, for her critical review of the draft, and Judy Maselli, MSPH, for her assistance in the statistical analyses.

Appendix: Pre- and post-curriculum knowledge and attitude questionnaire

Please indicate the degree to which you agree with the following statements, using the following scale: 1=strongly disagree, 2=disagree, 3=neutral, 4=agree, 5=strongly agree

Non-MD healthcare providers highly value QI projects.    1 2 3 4 5
Physicians highly value QI projects.    1 2 3 4 5
Hospital management/managers highly value QI projects.    1 2 3 4 5
QI projects are important for improving patient care.    1 2 3 4 5
QI projects are important for improving patient satisfaction.    1 2 3 4 5
QI projects are important for improving hospital reimbursement.    1 2 3 4 5
Physicians play an important role in a hospital's quality improvement efforts.    1 2 3 4 5
Leading quality improvement efforts is a part of being a practicing physician.    1 2 3 4 5
I am likely to be involved in QI projects in the future.    1 2 3 4 5
I am confident in my ability to identify a QI need.    1 2 3 4 5
I am confident in my ability to identify stakeholders after a QI need has been identified.    1 2 3 4 5
I am confident in my ability to develop a QI project.    1 2 3 4 5

What does PDSA stand for? ______________________________

What does CQI stand for? ___________________________________

For each of the following, please indicate whether the quality measure refers to:

S=Structure, P=Process, O=Outcome (circle one)

Existence of an Electronic Medical Record.    S P O
Time Out is required before each operation.    S P O
Percentage of patients with prior MI on beta-blockers.    S P O
Percentage of patients over 65 who have received the pneumovax vaccine.    S P O
Number of patients who have received smoking cessation counseling.    S P O
Patient satisfaction with their care.    S P O
Hospital readmission rate.    S P O
30-day mortality rate.    S P O
Existence of Computerized Physician Order Entry.    S P O

Conflict of interest and funding

The authors have not received any funding or benefits from industry or elsewhere to conduct this study.

References

1. Leape L, Berwick D, Clancy C, Conway J, Gluck P, Guest J, et al. Transforming healthcare: a safety imperative. Qual Saf Health Care. 2009;18:424–8. [PubMed]
2. Griner PF. Leadership strategies of medical school deans to promote quality and safety. Jt Comm J Qual Patient Saf. 2007;33:63–72. [PubMed]
3. Tsai TC, Bohnen JD, Hafiz S. Instruction in quality improvement and patient safety must be a priority in medical students’ education. Acad Med. 2010;85:743–4. [PubMed]
4. AAMC. Report V: contemporary issues in medicine: quality of care. Medical School Objectives Project. Available from: http://www.aamc.org/meded/msop/ [cited 13 October 2010]
5. Berwick DM, Finkelstein JA. Preparing medical students for the continual improvement of health and health care: Abraham Flexner and the new “public interest” Acad Med. 2010;85:S56–65. [PubMed]
6. AAMC. Medical School Graduation Questionnaire. Available from: http://www.aamc.org/data/gq/ [cited 15 October 2010]. Used with permission of the AAMC; 2000–2010.
7. Boonyasai RT, Windish DM, Chakraborti C, Feldman LS, Rubin HR, Bass EB. Effectiveness of teaching quality improvement to clinicians: a systematic review. JAMA. 2007;298:1023–37. [PubMed]
8. Thompson DA, Cowan J, Holzmueller C, Wu AW, Bass E, Pronovost P. Planning and implementing a systems-based patient safety curriculum in medical education. Am J Med Qual. 2008;23:271–8. [PubMed]
9. Paulman P, Medder J. Teaching the quality improvement process to junior medical students: the Nebraska experience. Fam Med. 2002;34:421–2. [PubMed]
10. Gould BE, Grey MR, Huntington CG, Gruman C, Rosen JH, Storey E, et al. Improving patient care outcomes by teaching quality improvement to medical students in community-based practices. Acad Med. 2002;77:1011–8. [PubMed]
11. Weeks WB, Robinson JL, Brooks WB, Batalden PB. Using early clinical experiences to integrate quality-improvement learning into medical education. Acad Med. 2000;75:81–4. [PubMed]
12. Gould BE, O'Connell MT, Russell MT, Pipas CF, McCurdy FA. Teaching quality measurement and improvement, cost-effectiveness, and patient satisfaction in undergraduate medical education: the UME-21 experience. Fam Med. 2004;36:S57–62. [PubMed]
13. Henley E. A quality improvement curriculum for medical students. Jt Comm J Qual Improv. 2002;28:42–8. [PubMed]
14. Varkey P. Educating to improve patient care: integrating quality improvement into a medical school curriculum. Am J Med Qual. 2007;22:112–6. [PubMed]
15. Poncelet A, Bokser S, Calton B, Hauer K, Kirsch H, Jones T, et al. Development of a longitudinal integrated clerkship at an academic medical center. Med Educ Online. 2011;16:5939. Available from: http://med-ed-online.net/index.php/meo/article/view/5939 [cited 23 April 2012]. [PMC free article] [PubMed]
16. Ogrinc G, Headrick LA, Mutha S, Coleman MT, O'Donnell J, Miles PV. A framework for teaching medical students and residents about practice-based learning and improvement, synthesized from a literature review. Acad Med. 2003;78:748–56. [PubMed]
17. Murad MH, Varkey P. Self-directed learning in health professions education. Ann Acad Med Singapore. 2008;37:580–90. [PubMed]
18. Norris TE, Schaad DC, DeWitt D, Ogur B, Hunt DD. Longitudinal integrated clerkships for medical students: an innovation adopted by medical schools in Australia, Canada, South Africa, and the United States. Acad Med. 2009;84:902–7. [PubMed]
19. Bell SK, Krupat E, Fazio SB, Roberts DH, Schwartzstein RM. Longitudinal pedagogy: a successful response to the fragmentation of the third-year medical student clerkship experience. Acad Med. 2008;83:467–75. [PubMed]
20. Chou CL, Johnston CB, Singh B, Garber JD, Kaplan E, Lee K, et al. A “safe space” for learning and reflection: one school's design for continuity with a peer group across clinical clerkships. Acad Med. 2011;86:1560–5. [PubMed]
21. Headrick LA, Neuhauser D, Schwab P, Stevens DP. Continuous quality improvement and the education of the generalist physician. Acad Med. 1995;70:S104–9. [PubMed]
22. Eiser AR, Connaughton-Storey J. Experiential learning of systems-based practice: a hands-on experience for first-year medical residents. Acad Med. 2008;83:916–23. [PubMed]
23. Gordon PR, Carlson L, Chessman A, Kundrat ML, Morahan PS, Headrick LA. A multisite collaborative for the development of interdisciplinary education in continuous improvement for health professions students. Acad Med. 1996;71:973–8. [PubMed]
24. Windish DM, Reed DA, Boonyasai RT, Chakraborti C, Bass EB. Methodological rigor of quality improvement curricula for physician trainees: a systematic review and recommendations for change. Acad Med. 2009;84:1677–92. [PubMed]
25. Leenstra JL, Beckman TJ, Reed DA, Mundell WC, Thomas KG, Krajicek BJ, et al. Validation of a method for assessing resident physicians’ quality improvement proposals. J Gen Intern Med. 2007;22:1330–4. [PMC free article] [PubMed]
26. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM's practice improvement modules. J Gen Intern Med. 2008;23:927–30. [PMC free article] [PubMed]
27. Ogrinc G, West A, Eliassen MS, Liuw S, Schiffman J, Cochran N. Integrating practice-based learning and improvement into medical student learning: evaluating complex curricular innovations. Teach Learn Med. 2007;19:221–9. [PubMed]
28. Weingart SN, Tess A, Driver J, Aronson MD, Sands K. Creating a quality improvement elective for medical house officers. J Gen Intern Med. 2004;19:861–7. [PMC free article] [PubMed]
29. Cohen J. Letter to medical school deans. Washington, DC: Association of American Medical Colleges; 1999.
