J Gen Intern Med. Mar 2009; 24(3): 361–365.
Published online Jan 21, 2009. doi: 10.1007/s11606-009-0904-1
PMCID: PMC2642556
Resident Self-Assessment and Self-Reflection: University of Wisconsin-Madison’s Five-Year Study
Christopher Hildebrand, MD,1 Elizabeth Trowbridge, MD,2 Mary A. Roach, PhD,2 Anne Gravel Sullivan, PhD,2 Aimee Teo Broman, MS,2 and Bennett Vogelman, MD2,3
1William S. Middleton Veterans Administration Hospital, Madison, WI USA
2University of Wisconsin-Madison School of Medicine and Public Health, Madison, WI USA
3Professor and Senior Associate Chair of Education, 600 Highland Avenue J5/237 Clinical Science Center, Madison, WI 53792-2454 USA
Corresponding author: Bennett Vogelman, Phone: +1-608-2637352, Fax: +1-608-2626743, bsv@medicine.wisc.edu.
Received August 28, 2008; Revised December 2, 2008; Accepted December 16, 2008.
BACKGROUND
Chart review represents a critical cornerstone for practice-based learning and improvement in our internal medicine residency program.
OBJECTIVE
To document residents’ performance monitoring and improvement skills in their continuity clinics, their satisfaction with practice-based learning and improvement, and their ability to self-reflect on their performance.
DESIGN
Retrospective longitudinal design with repeated measures.
PARTICIPANTS
Eighty Internal Medicine residents abstracted data for 3 consecutive years from the medical records of their 4,390 patients in the University of Wisconsin-Madison (UW) Hospital and Clinics and William S. Middleton Veterans Administration (VA) outpatient clinics.
MEASUREMENT
Logistic modeling was used to determine the effect of postgraduate year, resident sex, graduation cohort, and clinic setting on residents’ “compliance rate” on 17 nationally recognized health screening and chronic disease management parameters from 2003 to 2007.
RESULTS
Residents’ adherence to national preventive and chronic disease standards increased significantly from intern to subsequent years for administering immunizations, screening for diabetes, cholesterol, cancer, and behavioral risks, and for management of diabetes. Of the residents, 92% found the chart review exercise beneficial, with 63% reporting gains in understanding about their medical practices, 26% reflecting on specific gaps in their practices, and 8% taking critical action to improve their patient outcomes.
CONCLUSIONS
This paper provides support for the feasibility and practicality of this limited-cost method of chart review. It also directs our residency program’s attention in the continuity clinic to a key area important to internal medicine training programs by highlighting the potential benefit of enhancing residents’ self-reflection skills.
KEY WORDS: practice-based learning and improvement, graduate medical education, chart review, ambulatory care settings
The Accreditation Council for Graduate Medical Education (ACGME) identified practice-based learning and improvement (PBLI) as a key competency in 1999.1 Shortly thereafter, in 2000, our internal medicine residency program began incorporating annual chart review, guided by a follow-up meeting between the resident and project director, as a critical exercise for ensuring that residents developed both an attitude and an aptitude for self-reflection and a lifelong commitment to self-improvement. For the past 5 years, residents in our training program, under the direct supervision of the Associate Chief of Staff for Medicine and Ambulatory Care (CH), routinely conducted annual chart review. They studied the relevant practice guidelines, reviewed the medical records of continuity care patients in their outpatient clinics, documented their adherence to national preventive and chronic disease care recommendations, and reflected on their practices.
In 2003, when the American Board of Internal Medicine (ABIM) developed its practice improvement modules (PIM), which are now required for maintenance of certification, it acknowledged that quality improvement was inherently linked to professionalism and self-regulation.2 External audits might lead to improvement in the quality of patient care,3 but such improvements were not found consistently,4,5 tended to be small to moderate,6 and were most evident when residents’ performance was particularly poor or when educational interventions were particularly intense.6 Moreover, residents often challenged the accuracy of external reports,7,8 and physicians’ belief in the credibility of their own data was essential for quality improvement.2 By establishing self-assessment and reflection on practice as the foundation for the PIMs, the ABIM acknowledged the potential for bias, but affirmed our belief that residents’ honest reflections on their own medical practices provided an element of learning that could not be gleaned through standard didactic medical education or passive review of externally extracted data.
Research reported within the past year raised serious concerns about the quality of residency clinic care practices.9,10 It also provided practical suggestions for improving patient outcomes by strengthening clinic operations,11,12 adapting instruction to residents’ prior knowledge,13 increasing physician-patient continuity,14 and re-designing the ambulatory block structure.15 New methods for teaching and assessing PBLI support clinical understanding,16–19 and disease registries help programs achieve benchmarks for critical care processes.20,21 However, creating meaningful improvement in medical practice requires individual commitment to change.22
In our 5-year retrospective study, we assessed residents’ improvement over time in their performance monitoring skills and outcomes, their skills at self-reflection, and their satisfaction with our PBLI practices. We also considered contextual effects of resident sex and clinic, and the feasibility and value of PBLI to the residency program as we assessed the impact of these practices on residents’ commitment to improved patient care outcomes.
METHODS

Participants and Setting
The sample included 80 residents from five consecutive graduating cohorts (2003–2007) who performed chart review during all 3 years of residency in the University of Wisconsin-Madison Internal Medicine residency program. There were 37 residents (60% male) in the primary care continuity clinic of the Veterans Administration Hospital (VA) and 43 residents (40% male) in one of four University of Wisconsin (UW) health-care clinics. Clinics served a mixed-income, urban/rural patient population. The same weekly outpatient medical curriculum was implemented at all sites under the direct supervision of the faculty member who had responsibility for each resident’s clinic. Assigned faculty were on site during all continuity clinics, personally seeing interns’ patients and providing indirect staffing for patients of PGY-2 and PGY-3 residents. Residents who did not complete all 3 years of chart review (n = 36) were not included in this study.
Chart Review Procedures
Residents were instructed to select, in good faith, 20 charts from their panels of approximately 60 adult patients whom they had seen in their continuity clinics within the past year, with the intention that nearly all of their patients would be reviewed within the 3-year context of the program. To confirm the absence of bias in chart selection, we examined medical record numbers across 3 years for a randomly selected 10% of residents (n = 8) and documented that 17% of their 453 patient charts had undergone re-review. Prior to conducting chart review, small groups of residents were guided through a master sheet summarizing the evidence-based and/or expert opinion references for each parameter and the standardized data abstraction tool. Criteria were documented as “met” if test results were evident in the patient’s chart or if written documentation indicated that the test had been performed, scheduled, or offered; as “not met” if documentation was missing or incomplete; or as “not applicable” if screening or management was not indicated due to age, sex, or medical history. Chart review was performed during 2 h of unscheduled clinic time. The Institutional Review Board categorized this project as exempt, and patient identifiers were anonymized as data collection was completed.
Data abstracted from chart review included patient age and sex as well as residents’ compliance with each of 11 preventive medicine and 6 chronic disease management measures reflecting standards of care as accepted by the US Preventive Services Task Force.23 Measures included (1) immunizations (tetanus booster, pneumococcal 23-valent, inactivated influenza), (2) general health screening (diabetes, cholesterol, hypertension), (3) cancer screening (colorectal, breast, cervical), (4) behavioral risk factor screening (tobacco, alcohol), and (5) chronic disease management for type 2 diabetes mellitus (HgbA1c, blood pressure, LDL, microalbumin, foot care, eye exam). These measures have been endorsed by the National Committee for Quality Assurance (NCQA)24 as quality indicators in ambulatory care.
Self-Reflection
Following chart review, residents completed a brief questionnaire indicating whether they had benefited from the chart review exercise (yes or no) and why. Content analyses were conducted on residents’ qualitative responses using a modified version of four categories developed by Mezirow (1991):25 (1) Lack of engagement captured responses in which residents suggested an absence of learning or otherwise complained about the chart review experience; (2) Understanding captured a range of positive statements about the importance of the process or general appreciation of new learning regarding residents’ strengths or weaknesses; (3) Reflection included positive statements in which residents appeared to interpret and give meaning to their performance on a particular quality indicator; (4) Critical Reflection included statements in which residents clearly addressed a particular problem in their chart review and either planned action to change or specified a commitment to change the situation. Inter-rater reliability between one of the authors (MAR) and a trained graduate assistant indicated excellent reproducibility (kappa = 0.84).
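For readers unfamiliar with the statistic, Cohen’s kappa corrects observed inter-rater agreement for the agreement expected by chance: κ = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e the proportion expected by chance. By the widely cited Landis and Koch benchmarks, values above 0.80, such as the 0.84 obtained here, indicate almost perfect agreement.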
Statistical Analysis
For chart review, each patient’s outcome was coded as 1 if the criterion was met (termed “compliance”), coded as 0 if the criterion was not met and should have been met, and coded as “not applicable” if screening or managing that measure was not indicated given that patient’s risk factors. We assumed that if a measure was coded as met, it was indicated for that patient.
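As a minimal sketch of this coding scheme (the data set and variable names below are illustrative assumptions, not the study’s actual code), a SAS data step might read:

data analysis;
  set chart_review;  /* hypothetical layout: one record per resident, patient, and measure */
  if status = 'met' then compliance = 1;            /* criterion met */
  else if status = 'not met' then compliance = 0;   /* criterion indicated but not met */
  else if status = 'not applicable' then delete;    /* not indicated; excluded from the denominator */
run;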
For each resident, postgraduate year, and health-care measure, compliance was modeled against program year, clinic, and known demographic variables of the resident and patient. Each resident had a maximum of 1,020 observations (17 measures × 3 postgraduate years × 20 patients). Logistic modeling was used to determine the effect of postgraduate year in the residency program (PGY-1, PGY-2, PGY-3), resident sex (male, female), graduation cohort (2003, 2004, 2005, 2006, 2007), clinic setting (VA, UW), patient sex (male, female), and patient age. Generalized Estimating Equations were used to adjust for correlation between repeated observations for a resident and postgraduate year in the program.26 Chart review data were analyzed using the Statistical Analysis Software (SAS) package.27 Resident sex, clinic setting, postgraduate year, patient sex, and patient age were treated as categorical variables; graduation cohort was treated as a continuous variable. Compliance rates were analyzed using a logistic regression model that adjusted for repeated measures and year in the program. Descriptive statistics were used to analyze the survey data.
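A minimal sketch of such a model in SAS, using PROC GENMOD with a REPEATED statement to obtain the GEE adjustment (variable names are illustrative assumptions; the study’s actual code is not reproduced here):

proc genmod data=analysis;
  /* categorical predictors; graduation cohort enters as a continuous covariate */
  class resident_id pgy resident_sex clinic patient_sex patient_age_grp;
  model compliance(event='1') = pgy resident_sex cohort clinic
        patient_sex patient_age_grp / dist=binomial link=logit;
  /* exchangeable working correlation adjusts for repeated observations within resident */
  repeated subject=resident_id / type=exch;
run;

Exponentiating the fitted coefficients for postgraduate year yields odds ratios of the kind reported in the Results.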
RESULTS

Eighty residents abstracted data from the medical records of 4,390 patients (mean = 18 patients per resident for each of 3 years) on 17 core measures.
Patient Characteristics
Comparisons of patient sex and age characteristics within resident sex and clinic setting indicated that at the UW clinic, female residents were significantly more likely than male residents to have female patients (68% vs. 30%, χ2 = 287.8, p < 0.0001); this was not true at the VA clinic, where both male and female residents had 29% female patients (χ2 = 0.03, p = 0.87). Patients at the VA clinic were older than patients at the UW clinics (χ2 = 578.0, p < 0.0001), with about two-thirds of patients at or above age 55 at the VA and two-thirds below age 55 at the UW.
Logistic Analyses of Residents’ Compliance Rates
Residents’ compliance rates for monitoring core health-care measures within postgraduate years are presented in Table 1. A multivariate model using results from all 17 health-care measures for the 80 residents indicated a significant effect of postgraduate year. Residents exhibited a significantly higher compliance rate during their second [OR = 1.72, CI = (1.39, 2.13), p < 0.0001] and third [OR = 1.98, CI = (1.60, 2.44), p < 0.0001] years than during their first year of residency. There was no significant difference in residents’ compliance rates between the second and third years. Following the first postgraduate year, significant improvement was evident in residents’ rates of immunization (tetanus and pneumococcal vaccines), general health screening (diabetes and cholesterol), and cancer screening (colorectal and cervical). For management of patients with diabetes, residents demonstrated significant improvement in screening for LDL, testing for microalbumin, and conducting foot and eye exams. Residents’ compliance rates for addressing patients’ use of tobacco and alcohol during clinic visits also increased significantly. No significant improvements were found for administering influenza vaccines, screening and monitoring for hypertension, conducting mammograms, or checking HgbA1c in patients with diabetes. Residents’ success at screening for and monitoring blood pressure is reflected in Table 1 as the percentage of all patients in the residents’ clinics who either had normal blood pressure at screening or had a treatment plan to address their elevated blood pressure (≥140/90, or ≥130/80 for diabetics).
Table 1
Number of Patients for Whom Test Was Indicated (n) and Compliance Rate (%) by Postgraduate Year for the 80 Residents Who Completed all 3 Years of Chart Review
Residents at the UW clinics exhibited significantly lower compliance rates than residents at the VA clinic [OR = 0.58, CI = (0.44, 0.76), p < 0.0001], and there was a small but significant increase in compliance by graduation cohort [OR = 1.11, CI = (1.04, 1.19), p < 0.002], corresponding to an 11% increase in the odds of compliance for each successive cohort year. Compliance rates did not differ significantly by resident sex, patient sex, or patient age. A comparable logistic model, developed for all 116 residents, indicated that residents who completed all 3 years of chart review (n = 80) exhibited significantly higher rates of compliance than residents who did not (n = 36) [OR = 1.35, CI = (1.10, 1.67), p < 0.005].
Reflection Following Chart Review
The 80 residents completed a total of 160 annual surveys during their 3 years of residency, with 28 residents completing all three surveys, 49 completing at least one but not all three, and 3 completing none. Among completed surveys, the overwhelming majority (92%) indicated that the chart review exercise was beneficial, and positive perceptions were consistently high across postgraduate years (PGY-1 = 90.0%; PGY-2 = 96.5%; PGY-3 = 90.5%).
In response to the follow-up question as to why they thought chart review was beneficial, only 60% of the resident surveys included open-text commentary. Responses fell into four categories.25 The majority of residents (63%) indicated general understanding about the importance of self-assessment or identification of general perceived strengths or deficits, e.g., “It helped me identify areas in which I need to improve.” A total of 26% of the residents identified specific insights or associations among quality indicators, e.g., “I missed general maintenance for complicated patients.” Some residents (8%) emphasized specific actions taken or commitments to change in the future, e.g., “I changed my computer template after the last chart review to incorporate preventative medicine reminders into each visit.”
DISCUSSION

Chart review represents a well-established technique for measuring and continuously improving the quality of health care.28 Academic medical centers have provided benchmarking data by postgraduate year,29–31 and evaluation of personal clinical performance against benchmarks has become an accepted intervention in professional development outside of academic settings.32 Results of this study add to the literature by demonstrating that residents who participated in annual reviews of their own patients’ charts exhibited increased adherence to nationally recognized preventive and chronic disease standards. Residents showed significant improvement over postgraduate year in their rates of administering immunizations (tetanus, pneumococcal), screening for diabetes, cholesterol, and cancer (colorectal, cervical), and screening for behavioral risk factors, as well as in their management of diabetes (LDL, microalbumin, foot and eye care). These findings are consistent with the notion that physicians naturally seek to close gaps in their performance once they become aware of them, especially if those gaps are self-discovered.33
Although our focus has always been on residents’ individual practice-based learning and improvement, differences in compliance rates clearly exist within our residency program and between our residency and other residency programs. For example, although all of our residents were exposed to the same curriculum and model of faculty supervision, compliance rates were significantly higher at the VA than at the UW. This might be explained by the fact that practice guidelines at the VA were fully integrated into an established electronic medical record system, and residents had primary responsibility for their patient panels, whereas electronic medical records were unavailable at the UW, and residents had to schedule their patients through their attendings, thereby limiting access to patient information and the ability to monitor quality improvements. It is more difficult to explain how specific clinic practices (e.g., reminder systems, protocols for immunizations) might have led to differential improvements in some screening measures (e.g., cervical cancer) and not others (e.g., breast cancer), or to specify reasons for incremental improvements over time, given the retrospective nature of our data. Moreover, the fact that we, like Kern (1990),29 found improved outcomes across residency years where others did not,31 or that our compliance rates appear similar to baseline rates in some studies22 but higher than those reported in others,15 suggests differences among practice sites too numerous to speculate about (e.g., patient populations, frequencies of return visits to clinic, core curricula).
Residents consistently reported benefiting from the process of reviewing their patients’ charts. Those who completed the survey typically cited gains in understanding about their medical practices, with some, although not a majority of residents, articulating specific progress toward commitments to change. These findings suggest that the residency program may wish to strengthen PBLI skills in continuity clinics,34 while reinforcing residents’ skills at self-reflection through focused mentorship,35 since residents who fail to engage in reflection or critical reflection25 on this exercise may be less capable in general of addressing their gaps and improving their medical practices.
This paper provided support for the feasibility and practicality of our limited-cost method of chart review. In contrast to more expensive models of quality assessment, we estimate that our method could be implemented in a similar-sized program at an annual cost of about $3,000, based on 15 h per year of faculty time (at $100 per hour), 20 h of program assistant time to monitor compliance (at $15 per hour), and 40 h of data analyst time to compile, analyze, and report program data (at $30 per hour), plus the opportunity costs associated with resident time to complete the chart review.
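As a check on that figure, using the components above: 15 h × $100/h = $1,500 (faculty), 20 h × $15/h = $300 (program assistant), and 40 h × $30/h = $1,200 (data analyst), for a total of $3,000 per year before resident opportunity costs.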
Inherent limitations in the methodological features of our study must also be recognized. We assumed that the value of spending time looking at one’s own work relative to accepted standards of care outweighed the potential for resident bias in chart audit, but in the absence of a control group and double data extraction, we can neither claim that this PBLI activity changed resident behavior nor rule out the possibility that improvement over time reflected additional years of training. Similarly, we integrated practice-based learning into the ambulatory care setting assuming that this type of learning would lead to improved patient care practices, but because we did not actually measure improvements in the quality of patient care, we cannot confirm this association. Finally, we laid a foundation for lifelong learning by protecting the chart review process from outside evaluation, thereby encouraging residents to take responsibility for their own foibles. However, the absence of outcome-based assessments or other evidence that our former residents continued to apply their knowledge to self-improvement in later medical practice hampered our ability to claim success at establishing lifelong self-regulation.
Conclusion

Our method of chart review focused on inculcating a spirit of independent reflective thinking and providing residents with an opportunity to improve their practices based on what they learned. Our data collected over 5 years may have been retrospective, but they resulted in the accrual of a large number of individual measurements, and our findings seem promising enough to recommend more rigorous testing. Indeed, if this practical, meaningful, limited-cost PBLI method could be shown to lead to change in resident behavior within controlled studies, it seems likely that many residency programs would choose to implement it. It is our hope that our enhanced counseling practices will likewise move residents along a continuum toward increasing critical self-reflection and commitment to individual and system change, further accelerating residents’ progress toward improved patient outcomes.
Acknowledgement
Completion of this paper was made possible by a grant from the Medical Education and Research Committee (MERC) of the University of Wisconsin-Madison School of Medicine and Public Health as well as by the support of the Education Innovation Project of the Residency Review Committee for Internal Medicine, of which we are a participating residency. This research was also supported by the University of Wisconsin Institute for Clinical and Translational Research, funded through an NIH Clinical and Translational Science Award (CTSA), grant no. 1 UL1 RR025011. Earlier results of this paper were presented at the Association of Program Directors in Internal Medicine meeting (2004). We wish to acknowledge the programmatic efforts of Suzy Griffiths, Toni Prisk, Vonnie Schoenleber, Jessalyn Richter, and Erik Stava at various stages in the process of this work.
Conflict of Interest None disclosed.
1. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648–54.
2. Duffy FD, Lynn LA, Didura H, et al. Self-assessment of practice performance: development of the ABIM Practice Improvement Module (PIM). J Contin Educ Health Prof. 2008;28(1):38–46.
3. Houston TK, Wall T, Allison JJ, et al. Implementing achievable benchmarks in preventive health: a controlled trial in residency education. Acad Med. 2006;81(7):608–16.
4. Foy R, Eccles MP, Jamtvedt G, Young J, Grimshaw JM, Baker R. What do we know about how to do audit and feedback? Pitfalls in applying evidence from a systematic review. BMC Health Serv Res. 2005;5:50.
5. Kogan JR, Reynolds EE, Shea JA. Effectiveness of report cards based on chart audits of residents’ adherence to practice guidelines on practice performance: a randomized controlled trial. Teach Learn Med. 2003;15(1):25–30.
6. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Does telling people what they have been doing change what they do? A systematic review of the effects of audit and feedback. Qual Saf Health Care. 2006;15(6):433–6.
7. Haan CK, Edwards FH, Poole B, Godley M, Genuardi FJ, Zenni EA. A model to begin to use clinical outcomes in medical education. Acad Med. 2008;83(6):574–80.
8. Lyman JA, Schorling J, Nadkarni M, May N, Scully K, Voss J. Development of a web-based resident profiling tool to support training in practice-based learning and improvement. J Gen Intern Med. 2008;23(4):485–8.
9. Keirns CC, Bosk CL. The unintended consequences of training residents in dysfunctional outpatient settings. Acad Med. 2008;83(5):498–502.
10. Mladenovic J, Shea JA, Duffy FD, Lynn LA, Holmboe ES, Lipner RS. Variation in internal medicine residency clinic practices: assessing practice environments and quality of care. J Gen Intern Med. 2008;23(7):914–20.
11. Stevens DP, Sixta CS, Wagner E, Bowen JL. The evidence is at hand for improving care in settings where residents train. J Gen Intern Med. 2008;23(7):1116–7.
12. Sisson SD, Boonyasai R, Baker-Genaw K, Silverstein J. Continuity clinic satisfaction and valuation in residency training. J Gen Intern Med. 2007;22(12):1704–10.
13. Cook DA, Beckman TJ, Thomas KG, Thompson WG. Adapting web-based instruction to residents’ knowledge improves learning efficiency: a randomized controlled trial. J Gen Intern Med. 2008;23(7):985–90.
14. Dearinger AT, Wilson JF, Griffith CH, Scutchfield FD. The effect of physician continuity on diabetic outcomes in a resident continuity clinic. J Gen Intern Med. 2008;23(7):937–41.
15. Warm EJ, Schauer DP, Diers T, et al. The ambulatory long-block: an Accreditation Council for Graduate Medical Education (ACGME) Educational Innovations Project (EIP). J Gen Intern Med. 2008;23(7):921–6.
16. Lynch DC, Swing SR, Horowitz SD, Holt K, Messer JV. Assessing practice-based learning and improvement. Teach Learn Med. 2004;16(1):85–92.
17. Morrison LJ, Headrick LA. Teaching residents about practice-based learning and improvement. Jt Comm J Qual Patient Saf. 2008;34(8):453–9.
18. Moskowitz EJ, Nash DB. Accreditation Council for Graduate Medical Education competencies: practice-based learning and systems-based practice. Am J Med Qual. 2007;22(5):351–82.
19. Oyler J, Vinci L, Arora V, Johnson J. Teaching internal medicine residents quality improvement techniques using the ABIM’s practice improvement modules. J Gen Intern Med. 2008;23(7):927–30.
20. Halverson LW, Sontheimer D, Duvall S. A residency clinic chronic condition management quality improvement project. Fam Med. 2007;39(2):103–11.
21. Thomas KG, Thomas MR, Stroebel RJ, et al. Use of a registry-generated audit, feedback, and patient reminder intervention in an internal medicine resident clinic: a randomized trial. J Gen Intern Med. 2007;22(12):1740–4.
22. Holmboe ES, Prince L, Green M. Teaching and improving quality of care in a primary care internal medicine residency clinic. Acad Med. 2005;80(6):571–7.
23. U.S. Preventive Services Task Force. Screening and behavioral counseling interventions in primary care to reduce alcohol misuse: recommendation statement. Ann Intern Med. 2004;140(7):554–6.
24. National Committee for Quality Assurance. <http://www.ncqa.org>. Accessed February 26, 2008.
25. Mezirow J. Transformative Dimensions of Adult Learning. San Francisco: Jossey-Bass; 1991.
26. Zeger SL, Liang KY, Albert PS. Models for longitudinal data: a generalized estimating equation approach. Biometrics. 1988;44(4):1049–60.
27. Statistical Analysis Software (SAS) Package, Version 9.1. Cary, NC: SAS Institute Inc.
28. Meyers FJ, Weinberger SE, Fitzgibbons JP, Glassroth J, Duffy FD, Clayton CP. Redesigning residency training in internal medicine: the consensus report of the Alliance for Academic Internal Medicine Education Redesign Task Force. Acad Med. 2007;82(12):1211–9.
29. Kern DE, Harris WL, Boekeloo BO, Barker LR, Hogeland P. Use of an outpatient medical record audit to achieve educational objectives: changes in residents’ performances over six years. J Gen Intern Med. 1990;5(3):218–24.
30. Holmboe ES, Bowen JL, Green M, et al. Reforming internal medicine residency training: a report from the Society of General Internal Medicine’s task force for residency reform. J Gen Intern Med. 2005;20(12):1165–72.
31. Willett LL, Palonen K, Allison JJ, et al. Differences in preventive health quality by residency year: is seniority better? J Gen Intern Med. 2005;20(9):825–9.
32. Kiefe CI, Allison JJ, Williams OD, Person SD, Weaver MT, Weissman NW. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA. 2001;285(22):2871–9.
33. Duffy FD, Holmboe ES. Self-assessment in lifelong learning and improving performance in practice: physician know thyself. JAMA. 2006;296(9):1137–9.
34. Bowen JL, Salerno SM, Chamberlain JK, Eckstrom E, Chen HL, Brandenburg S. Changing habits of practice: transforming internal medicine residency education in ambulatory settings. J Gen Intern Med. 2005;20(12):1181–7.
35. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–102.