The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging.
We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning.
We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure.
An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and facilitate its documentation and evaluation. During a 24-month period, 27 programs have completed PBLI projects, and 15 have reviewed the template with their education committees but have not initiated projects using it.
The development of the tool generated support from program leaders because it enhanced their ability to meet program-specific objectives. The document's peer-protected status, which preserves confidentiality and shields it from legal discovery, has encouraged program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important for meeting accreditation requirements at the program and institutional levels.
The Accreditation Council for Graduate Medical Education has suggested various methods for evaluation of practice-based learning and improvement competency, but data on implementation of these methods are limited.
To compare medical record review and patient surveys on evaluating physician performance in preventive services in an outpatient resident clinic.
Within an ongoing quality improvement project, we collected baseline performance data on preventive services provided for patients at the University of Alabama at Birmingham (UAB) Internal Medicine Residents' ambulatory clinic.
Seventy internal medicine and medicine-pediatrics residents from the UAB Internal Medicine Residency program.
Resident- and clinic-level comparisons of aggregated patient survey and chart documentation rates of (1) screening for smoking status, (2) advising smokers to quit, (3) cholesterol screening, (4) mammography screening, and (5) pneumonia vaccination.
Six hundred fifty-nine patient surveys were collected and 761 charts were abstracted. At the clinic level, rates for screening of smoking status, mammography recommendation, and cholesterol screening were similar (difference <5%) between the 2 methods. Medical record review yielded higher rates than patient surveys for pneumonia vaccination (76% vs 67%) and advice to quit smoking (66% vs 52%). However, within-resident (N = 70) comparison of the 2 methods of estimating screening rates showed substantial variability. The cost of medical record review was substantially higher ($107 vs $17 per physician).
Medical record review and patient surveys provided similar rates for selected preventive health measures at the clinic level, with the exception of pneumonia vaccination and advising to quit smoking. A large variation among individual resident providers was noted.
education, medical; preventive health services; patient survey; medical record review; cost evaluation
The Accreditation Council for Graduate Medical Education (ACGME) requires physicians in training to be educated in 6 competencies considered important for independent medical practice. There is little information about the experiences that residents feel contribute most to the acquisition of the competencies.
To understand how residents perceive their learning of the ACGME competencies and to determine which educational activities were most helpful in acquiring these competencies.
A web-based survey created by the graduate medical education office for institutional program monitoring and evaluation was sent to all residents in ACGME-accredited programs at the David Geffen School of Medicine, University of California-Los Angeles, from 2007 to 2010. Residents responded to questions about the adequacy of their learning for each of the 6 competencies and which learning activities were most helpful in competency acquisition.
We analyzed 1378 responses collected from postgraduate year-1 (PGY-1) to PGY-3 residents in 12 different residency programs, surveyed between 2007 and 2010. The overall response rate varied by year (66%–82%). Most residents (80%–97%) stated that their learning of the 6 ACGME competencies was “adequate.” Patient care activities and observation of attending physicians and peers were listed as the 2 most helpful learning activities for acquiring the 6 competencies.
Our findings reinforce the importance of learning from role models during patient care activities and the heterogeneity of learning activities needed for acquiring all 6 competencies.
The Accreditation Council for Graduate Medical Education (ACGME) requires an annual evaluation of all ACGME-accredited residency and fellowship programs to assess program quality. The results of this evaluation must be used to improve the program. This manuscript describes a metric to be used in conducting ACGME-mandated annual program review of ACGME-accredited anesthesiology residencies and fellowships.
A variety of metrics to assess anesthesiology residency and fellowship programs are identified by the authors through literature review and considered for use in constructing a program "report card."
Metrics used to assess program quality include success in achieving American Board of Anesthesiology (ABA) certification, performance on the annual ABA/American Society of Anesthesiology In-Training Examination, performance on mock oral ABA certification examinations, trainee scholarly activities (publications and presentations), accreditation site visit and internal review results, ACGME and alumni survey results, National Resident Matching Program (NRMP) results, exit interview feedback, diversity data and extensive program/rotation/faculty/curriculum evaluations by trainees and faculty. The results are used to construct a "report card" that provides a high-level review of program performance and can be used in a continuous quality improvement process.
An annual program review is required to assess all ACGME-accredited residency and fellowship programs to monitor and improve program quality. We describe an annual review process based on metrics that can be used to focus attention on areas for improvement and track program performance year-to-year. A "report card" format is described as a high-level tool to track educational outcomes.
The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
In November 2007, the American Board of Internal Medicine (ABIM) and the ACGME initiated the development of milestones for internal medicine residency training. ABIM and ACGME convened a 33-member milestones task force made up of program directors, experts in evaluation and quality, and representatives of internal medicine stakeholder organizations. This article reports on the development process and the resulting list of proposed milestones for each ACGME competency.
The task force adopted the Dreyfus model of skill acquisition as a framework for the internal medicine milestones, and calibrated the milestones with the expectation that residents achieve, at a minimum, the “competency” level in the 5-step progression by the completion of residency. The task force also developed general recommendations for strategies to evaluate the milestones.
The milestones resulting from this effort will promote competency-based resident education in internal medicine, and will allow program directors to track the progress of residents and inform decisions regarding promotion and readiness for independent practice. In addition, the milestones may guide curriculum development, suggest specific assessment strategies, provide benchmarks for resident self-directed assessment-seeking, and assist remediation by facilitating identification of specific deficits. Finally, by making explicit the profession's expectations for graduates and providing a degree of national standardization in evaluation, the milestones may improve public accountability for residency training.
Residents are evaluated using Accreditation Council for Graduate Medical Education (ACGME) core competencies. An Objective Structured Clinical Examination (OSCE) is a potential evaluation tool to measure these competencies and provide outcome data.
Create an OSCE to evaluate and demonstrate improvement in intern core competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice before and after internship.
From 2006 to 2008, 106 interns from 10 medical specialties were evaluated with a preinternship and postinternship OSCE at Madigan Army Medical Center. The OSCE included eight 12-minute stations that collectively evaluated the 6 ACGME core competencies using human patient simulators, standardized patients, and clinical scenarios. Interns were scored using objective and subjective criteria, with a maximum score of 100 for each competency. Stations included death notification, abdominal pain, transfusion consent, suture skills, wellness history, chest pain, altered mental status, and computer literature search. These stations were chosen by specialty program directors, created with input from board-certified specialists, and were peer reviewed.
OSCE testing of the 106 interns (ages 25 to 44 [average, 28.6]; 70 [66%] men; 65 [58%] allopathic medical school graduates) demonstrated statistically significant improvement in all ACGME core competencies: patient care (71.9% to 80.0%, P < .001), medical knowledge (59.6% to 78.6%, P < .001), practice-based learning and improvement (45.2% to 63.0%, P < .001), interpersonal and communication skills (77.5% to 83.1%, P < .001), professionalism (74.8% to 85.1%, P < .001), and systems-based practice (56.6% to 76.5%, P < .001).
An OSCE during internship can evaluate incoming baseline ACGME core competencies and test for interval improvement. The OSCE is a valuable assessment tool to provide outcome measures on resident competency performance and evaluate program effectiveness.
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project began to focus on resident performance in the 6 competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. Beginning in 2007, the ACGME began collecting information on how programs assess these competencies. This report provides information on the nature and extent of those assessments.
Using data collected by the ACGME for site visits, we used descriptive statistics and percentages to describe the number and types of methods and assessors that accredited programs (n = 4417) reported using to assess the competencies. Observed differences among specialties, methodologies, and assessors were tested with analysis of variance procedures.
Almost all (>97%) of programs report assessing all of the competencies and using multiple methods and multiple assessors. Similar assessment methods and evaluator types were consistently used across the 6 competencies. However, there were some differences in the use of patient and family as assessors: Primary care and ambulatory specialties used these to a greater extent than other specialties.
Residency programs are emphasizing the competencies in their evaluation of residents. Understanding the scope of evaluation methodologies that programs use in resident assessment is important for both the profession and the public, so that together we may monitor continuing improvement in US graduate medical education.
The Accreditation Council for Graduate Medical Education (ACGME) requires that training programs integrate system-based practice (SBP) and practice-based learning and improvement (PBLI) into internal medicine residency curricula.
We instituted a seminar series and year-long-mentored curriculum designed to engage internal medicine residents in these competencies.
Residents participate in a seminar series that includes assigned reading and structured discussion with faculty who assist in the development of quality improvement or research projects. Residents pursue projects over the remainder of the year. Monthly works-in-progress meetings, protected time for inquiry, and continued faculty mentorship guide the residents in their project development. Trainees present their work at hospital-wide grand rounds at the end of the academic year. We surveyed residents to assess their self-reported knowledge, attitudes, and skills in SBP and PBLI. In addition, blinded faculty scored projects for appropriateness, impact, and feasibility.
We measured resident self-reported knowledge, attitudes, and skills at the end of the academic year. We found evidence that participants improved their understanding of the context in which they were practicing, and that their ability to engage in quality improvement projects increased. Blinded faculty reviewers favorably ranked the projects’ feasibility, impact, and appropriateness. The ‘Curriculum of Inquiry’ generated 11 quality improvement and research projects during the study period. Barriers to the ongoing work include a limited supply of mentors and delays due to Institutional Review Board approval. Hospital leadership recognizes the importance of the curriculum, and our accreditation manager now cites our ongoing work.
A structured residency-based curriculum facilitates resident demonstration of SBP and PBLI. Residents gain knowledge and skills through this enterprise, and hospitals gain access to trainees who help solve ongoing problems and meet accreditation requirements.
graduate medical education; competencies; longitudinal curriculum
The Accreditation Council for Graduate Medical Education (ACGME) recommends resident portfolios as 1 method for assessing competence in practice-based learning and improvement. In July 2005, when anesthesiology residents in our department were required to start a portfolio, the residents and their faculty advisors did not readily accept this new requirement. Intensive education efforts addressing the goals and importance of portfolios were undertaken. We hypothesized that these educational efforts improved acceptance of the portfolio and retrospectively audited the portfolio evaluation forms completed by faculty advisors.
Intensive education about the goals and importance of portfolios began in January 2006, including presentations at departmental conferences and one-on-one education sessions. Faculty advisors were instructed to evaluate each resident's portfolio and complete a review form. We retrospectively collected data to determine the percentage of review forms completed by faculty. The portfolio reviews also assessed the percentage of 10 required portfolio components residents had completed.
Portfolio review forms were completed by faculty advisors for 13% (5/38) of residents during the first advisor-advisee meeting in December 2005. Initiation of intensive education efforts significantly improved compliance, with review forms completed for 68% (26/38) of residents in May 2006 (P < .0001) and 95% (36/38) in December 2006 (P < .0001). Residents also significantly improved the completeness of portfolios between May and December of 2006.
Portfolios are considered a best-methods technique by the ACGME for evaluation of practice-based learning and improvement. We found that intensive education about the goals and importance of portfolios can enhance acceptance of this evaluation tool, resulting in improved compliance in the completion and evaluation of portfolios.
Standard curricula to teach Internal Medicine residents about quality assessment and improvement, important components of the Accreditation Council for Graduate Medical Education core competencies practice-based learning and improvement (PBLI) and systems-based practice (SBP), have not been easily accessible.
Using the American Board of Internal Medicine’s (ABIM) Clinical Preventive Services Practice Improvement Module (CPS PIM), we incorporated a longitudinal quality assessment and improvement curriculum (QAIC) into the 2 required 1-month ambulatory rotations during the postgraduate year 2. During the first block, residents complete the PIM chart reviews and the patient and system surveys. During the second block, residents reflect on the PIM data and, as a group, perform a small test of change using the Plan–Do–Study–Act (PDSA) cycle in the resident continuity clinic.
To date, 3 resident quality improvement (QI) projects have been undertaken as a result of QAIC, each making significant improvements in the residents’ continuity clinic. Resident confidence levels in QI skills (e.g., writing an aim statement [71% to 96%, P < .01] and using a PDSA cycle [9% to 89%, P < .001]) improved significantly.
The ABIM CPS PIM can be used by Internal Medicine residency programs to introduce QI concepts into their residents’ outpatient practice through encouraging practice-based learning and improvement and systems-based practice.
Internal Medicine residents; quality improvement; practice-based learning and improvement; systems-based practice; practice improvement module
In the United States, the Accreditation Council for Graduate Medical Education (ACGME) requires all accredited Internal Medicine residency training programs to facilitate resident scholarly activity. However, clinical experience and medical education remain the main focus of graduate medical education in many Internal Medicine (IM) residency training programs. Left to design the structure, process, and outcome evaluation of the ACGME research requirement, residency training programs face numerous barriers. Many residency programs report having been cited by the ACGME Residency Review Committee in IM for lack of scholarly activity by residents.
We would like to share our experience at Lincoln Hospital, an affiliate of Weill Medical College Cornell University New York, in designing and implementing a successful structured research curriculum based on ACGME competencies taught during a dedicated "research rotation".
Since the inception of the research rotation in 2004, our residents' participation in scholarly activities has substantially increased. Our residents increasingly appreciate that research is an integral component of residency training and essential to the practice of medicine.
Internal medicine residents' outlook on research can be significantly improved through a research curriculum offered in a structured, dedicated research rotation. This is exemplified by the improvements in resident satisfaction, participation in scholarly activities, and resident research outcomes since the inception of the research rotation in our internal medicine training program.
In 2011, the Accreditation Council for Graduate Medical Education (ACGME) redefined resident duty hour requirements, reducing in-hospital duty hours in an effort to improve patient care, resident well-being, and resident education. We sought to determine the cost of adoption based on changes made by neurology residency programs and departments in direct response to these requirements.
We surveyed department chairs or residency program directors at 123 ACGME-accredited US adult neurology training programs on programmatic changes and resident expansion, hiring practices, and development of new computer-based resources in direct response to the 2011 ACGME duty hour requirements. Using data from publicly available resources, we estimated respondents’ financial cost of adoption.
In all, 63 programs responded (51% response rate); 76% of respondents were program directors. The most common changes implemented were adding night float systems (n = 31; 49%) and increasing faculty responsibility (n = 26; 41%). In direct response to the requirements, 21 programs applied to the ACGME for 40 additional resident positions, 29 of which were fully covered by institutional funds, and nearly half of the departments (n = 26) hired additional personnel, a total of 80 hires (64 full-time equivalents), most commonly mid-level practitioners. The total estimated cost to responding departments was US $12.7 million, or US $201,000 per department annually. When projecting the expenses of planned changes for the following year, the cost increased to US $360,000 per department, with 5-year costs exceeding US $1 million.
The most recent restriction on resident duty hours comes at substantial cost to neurology departments and residency programs.
education; training; academic; quality; safety; costs
Historical bias toward service-oriented inpatient graduate medical education experiences has hindered both resident education and care of patients in the ambulatory setting.
Describe and evaluate a residency redesign intended to improve the ambulatory experience for residents and patients.
Categorical Internal Medicine resident ambulatory practice at the University of Cincinnati Academic Health Center.
We created a year-long continuous ambulatory group-practice experience, separated from traditional inpatient responsibilities and called the long block, as an Accreditation Council for Graduate Medical Education Educational Innovations Project. The practice adopted the Chronic Care Model, and residents received extensive instruction in quality improvement and interprofessional teams.
The long block was associated with significant increases in resident and patient satisfaction as well as improvement in multiple quality process and outcome measures. Continuity and no-show rates also improved.
An ambulatory long block can be associated with improvements in resident and patient satisfaction, quality measures, and no-show rates. Future research should be done to determine effects of the long block on education and patient care in the long term, and elucidate which aspects of the long block most contribute to improvement.
ambulatory education; clinic; residency training; chronic care model
The complex competency labeled practice-based learning and improvement (PBLI) by the Accreditation Council for Graduate Medical Education (ACGME) incorporates core knowledge in evidence-based medicine (EBM). The purpose of this study was to operationally define a “PBLI-EBM” domain for assessing resident physician competence.
The authors used an iterative design process to first content analyze and map correspondences between ACGME and EBM literature sources. The project team, including content and measurement experts and residents/fellows, parsed, classified, and hierarchically organized embedded learning outcomes using a literature-supported cognitive taxonomy. A pool of 141 items was produced from the domain and assessment specifications. The PBLI-EBM domain and resulting items were content validated through formal reviews by a national panel of experts.
The final domain represents overlapping PBLI and EBM cognitive dimensions measurable through written, multiple-choice assessments. It is organized as 4 subdomains of clinical action: Therapy, Prognosis, Diagnosis, and Harm. Four broad cognitive skill branches (Ask, Acquire, Appraise, and Apply) are subsumed under each subdomain. Each skill branch is defined by enabling skills that specify the cognitive processes, content, and conditions pertinent to demonstrable competence. Most items passed content validity screening criteria and were prepared for test form assembly and administration.
The operational definition of PBLI-EBM competence is based on a rigorously developed and validated domain and item pool, and substantially expands conventional understandings of EBM. The domain, assessment specifications, and procedures outlined may be used to design written assessments to tap important cognitive dimensions of the overall PBLI competency, as given by ACGME. For more comprehensive coverage of the PBLI competency, such instruments need to be complemented with performance assessments.
Health professionals need competencies in improvement skills if they are to contribute usefully to improving patient care. Medical education programmes in the USA have not systematically taught improvement skills to residents (registrars in the UK). The Accreditation Council for Graduate Medical Education (ACGME) has recently developed and begun to deploy a competency-based model for accreditation that may encourage the development of improvement skills by the 100,000 residents in accredited programmes. Six competencies have been identified for all physicians, independent of specialty, and measurement tools for these competencies have been described. This model may be applicable to other healthcare professions. This paper explores patterns that inhibit efforts to change practice and proposes an educational model to develop change management skills based on trainees' analysis of their own work.
Key Words: physician education; improvement skills; accreditation; competency
OBJECTIVE: To describe the views of residency program directors regarding the effect of the 2010 duty hour recommendations on the 6 core competencies of graduate medical education.
METHODS: US residency program directors in internal medicine, pediatrics, and general surgery were e-mailed a survey from July 8 through July 20, 2010, after the 2010 Accreditation Council for Graduate Medical Education (ACGME) duty hour recommendations were published. Directors were asked to rate the implications of the new recommendations for the 6 ACGME core competencies as well as for continuity of inpatient care and resident fatigue.
RESULTS: Of 719 eligible program directors, 464 (65%) responded. Most program directors believe that the new ACGME recommendations will decrease residents' continuity with hospitalized patients (404/464 [87%]) and will not change (303/464 [65%]) or will increase (26/464 [6%]) resident fatigue. Additionally, most program directors (249-363/464 [53%-78%]) believe that the new duty hour restrictions will decrease residents' ability to develop competency in 5 of the 6 core areas. Surgery directors were more likely than internal medicine directors to believe that the ACGME recommendations will decrease residents' competency in patient care (odds ratio [OR], 3.9; 95% confidence interval [CI], 2.5-6.3), medical knowledge (OR, 1.9; 95% CI, 1.2-3.2), practice-based learning and improvement (OR, 2.7; 95% CI, 1.7-4.4), interpersonal and communication skills (OR, 1.9; 95% CI, 1.2-3.0), and professionalism (OR, 2.5; 95% CI, 1.5-4.0).
CONCLUSION: Residency program directors' reactions to ACGME duty hour recommendations demonstrate a marked degree of concern about educating a competent generation of future physicians in the face of increasing duty hour standards and regulation.
In 2006, the University of Virginia became one of the first academic medical institutions to be placed on probation, after the Accreditation Council for Graduate Medical Education (ACGME) Institutional Review Committee implemented a new classification system for institutional reviews.
After University of Virginia reviewed its practices and implemented needed changes, the institution was able to have probation removed and full accreditation restored. Whereas graduate medical education committees and designated institutional officials are required to conduct internal reviews of each ACGME–accredited program midway through its accreditation cycle, no similar requirement exists for institutions.
As we designed corrective measures at the University of Virginia, we realized that regularly scheduled audits of the entire institution would have prevented the accumulation of deficiencies. We suggest that institutional internal reviews be implemented to ensure that the ACGME institutional requirements for graduate medical education are met. This process represents practice-based learning and improvement at the institutional level and may prevent other institutions from receiving unfavorable accreditation decisions.
The Accreditation Council for Graduate Medical Education (ACGME) introduced the Outcome Project in July 2001 to improve the quality of resident education through competency-based learning. The purpose of this systematic review is to determine and explore the perceptions of program directors regarding challenges to implementing the ACGME Outcome Project.
We used the PubMed and Web of Science databases and bibliographies for English-language articles published between January 1, 2001, and February 17, 2012. Studies were included if they described program directors' opinions on (1) barriers encountered when attempting to implement ACGME competency-based education, and (2) assessment methods that each residency program was using to implement competency-based education. Articles meeting the inclusion criteria were screened by 2 researchers. The grading criterion was created by the authors and used to assess the quality of each study.
The survey-based data reported the opinions of 1076 program directors. Barriers encountered included (1) lack of time; (2) lack of faculty support; (3) residents' resistance to the Outcome Project; (4) insufficient funding; (5) perceived low priority for the Outcome Project; (6) inadequate salary incentive; and (7) inadequate knowledge of the competencies. Of the 6 competencies, those pertaining to patient care and medical knowledge received the most responses from program directors and were given highest priority.
The reviewed literature revealed that time and financial constraints were the most important barriers encountered when implementing the ACGME Outcome Project.
How long a resident must train to achieve competency is an ongoing debate in medicine. For family medicine, there is an Accreditation Council for Graduate Medical Education (ACGME)–approved proposal to examine the benefits of lengthening family medicine training from 3 to 4 years. The rationale for adding another year of residency in family medicine has included the following: (1) overcoming the effect of the duty hour limits in further reducing educational opportunities, (2) reversing the growing number of first-time takers of the American Board of Family Medicine certification examination who fail to pass, (3) enhancing the family medicine training experience by “decompressing” the ever-growing number of Residency Review Committee requirements to maintain accreditation, and (4) improving the overall quality of family medicine graduates.
The in-training examination is a national and yearly exam administered by the American Board of Emergency Medicine to all emergency medicine residents in the USA. The purpose of the examination is to evaluate a resident’s progress toward obtaining the fundamental knowledge to practice independent emergency medicine.
The purpose of this study was to determine the effect of a 40-hour board review lecture course on resident performance on the emergency medicine in-training examination.
A 40-hour board review lecture course was designed and implemented during the weekly 5-hour resident conferences in the 8 weeks preceding the 2006 in-training examination. Attendance was mandatory at the Accreditation Council for Graduate Medical Education (ACGME) standard of 70% or greater. A positive result was defined as an increase of 10% or more in a resident’s national percentile ranking among peers in the same class year on the emergency medicine in-training examination. Residents were excluded from the study if they had no 2005 in-training examination score for self-comparison. Results were analyzed with 95% confidence intervals (CIs).
Of 16 residents, 1 (6.25%; 95% CI: 0–18%) had a positive result, increasing their national class percentile ranking by 10% or more. Among the 8 PGY2 residents, 1 had a positive result (12.5%; 95% CI: 0–35.4%); among the PGY3 residents, none did (0%; 95% CI: 0–35.4%).
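The nonzero intervals reported above are consistent with a normal-approximation (Wald) binomial confidence interval truncated at zero; a minimal sketch in Python (the function name and the z = 1.96 critical value are our own illustration, not taken from the study):

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wald (normal-approximation) confidence interval for a
    binomial proportion, truncated to the [0, 1] range."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    lower = max(0.0, p - z * se)
    upper = min(1.0, p + z * se)
    return lower, upper

# 1 of 16 residents overall: roughly 0% to 18%
lo_all, hi_all = wald_ci(1, 16)
# 1 of 8 PGY2 residents: roughly 0% to 35.4%
lo_pgy2, hi_pgy2 = wald_ci(1, 8)
```

Note that for 0 successes the Wald interval collapses to (0, 0), so an exact method such as Clopper–Pearson would be needed to bound a 0/n result; this sketch only reproduces the nonzero-count intervals.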
A 40-hour board review lecture course did not improve residents’ in-training examination scores.
In-training examination; Board review; Emergency medicine resident
There are no nationwide data on the methods residency programs are using to assess trainee competence. The Accreditation Council for Graduate Medical Education (ACGME) has recommended tools that programs can use to evaluate their trainees. It is unknown if programs are adhering to these recommendations.
To describe evaluation methods used by our nation’s internal medicine residency programs and assess adherence to ACGME methodological recommendations for evaluation.
All internal medicine programs registered with the Association of Program Directors of Internal Medicine (APDIM).
Descriptive statistics of programs and tools used to evaluate competence; compliance with ACGME recommended evaluative methods.
The response rate was 70%. Programs used an average of 4.2 to 6.0 tools to evaluate their trainees, with heavy reliance on rating forms. Direct observation and practice- and data-based tools were used much less frequently. Most programs were using at least 1 of the ACGME’s “most desirable” methods of evaluation for all 6 measures of trainee competence. These programs had higher support staff to resident ratios than programs using less desirable evaluative methods.
Residency programs are using a large number and variety of tools to evaluate the competence of their trainees. Most are complying with ACGME-recommended methods of evaluation, especially when the support staff to resident ratio is high.
graduate medical education; residency; ACGME; competency
When quality improvement processes are integrated into resident education, many opportunities are created for improved outcomes in patient care. For Bethesda Family Medicine (BFM), integrating quality improvement into resident education is paramount in fulfilling the Accreditation Council for Graduate Medical Education Practice-Based Learning and Improvement core competency requirements.
A resident-developed diabetes management treatment protocol that targeted 11 evidence-based measures recommended for successful diabetes management was implemented within the BFM residency and all physician practices under its parent healthcare system. This study compares diabetes management at BFM and at 2 other family medicine practices at timepoints before and after protocol implementation. We measured hemoglobin A1c (HbA1c), low-density lipoprotein (LDL) cholesterol, and systolic blood pressure (SBP) in adult diabetics and compared patient outcomes for these measures for the first and third quarters of 2009 and 2010.
In BFM patients, HbA1c, LDL, and SBP levels decreased, but only the HbA1c improvement persisted long term. Levels in the comparison groups were generally lower than those of BFM patients, but not significantly so after the first measurement period.
A resident-led treatment protocol can improve HbA1c outcomes among residents' diabetic patients. Periodic educational interventions can enhance residents' focus on diabetes management. Residents in graduate medical education can initiate treatment protocols to improve patient care in a large healthcare system.
Adult diabetics; diabetes management; graduate medical education; quality improvement; resident education
This article, developed for the Betty Ford Institute Consensus Conference on Graduate Medical Education (December, 2008), presents a model curriculum for Family Medicine residency training in substance abuse.
The authors reviewed reports of past Family Medicine curriculum development efforts, previously identified barriers to education in high-risk substance use, approaches to overcoming these barriers, and current training guidelines of the Accreditation Council for Graduate Medical Education (ACGME) and its Family Medicine Residency Review Committee. An eight-module curriculum was developed, based on substance abuse competencies defined by Project MAINSTREAM and linked to core competencies defined by the ACGME. The curriculum provides basic training in high-risk substance use to all residents, while also addressing current training challenges presented by U.S. work hour regulations, the increasing international diversity of Family Medicine resident trainees, and emerging new primary care practice models.
This paper offers a core curriculum, focused on screening, brief intervention and referral to treatment, which can be adapted by residency programs to meet their individual needs. The curriculum encourages direct observation of residents to ensure that core skills are learned and trains residents with several "new skills" that will expand the basket of substance abuse services they will be equipped to provide as they enter practice.
Broad-based implementation of a comprehensive Family Medicine residency curriculum should increase the ability of family physicians to provide basic substance abuse services in a primary care context. Such efforts should be coupled with faculty development initiatives which ensure that sufficient trained faculty are available to teach these concepts and with efforts by major Family Medicine organizations to implement and enforce residency requirements for substance abuse training.
The established guidelines for a diabetes foot examination include assessing circulatory, skin, and neurological status to detect problems early and reduce the likelihood of amputation. Physician adherence to the guidelines for proper examination is less than optimal.
Our objective was to increase compliance with the performance of a proper foot examination through a predominantly physician-directed interventional campaign.
The study consisted of 3 parts: a retrospective chart review to estimate background compliance, an educational intervention, and prospective chart reviews at 3 and 6 months. A properly documented foot examination was defined as assessing at least 2 of the 3 necessary components. The educational intervention consisted of 2 lectures directed at resident physicians and a quality assurance announcement at a general internal medicine staff meeting. Clinic support staff were instructed to remove the shoes and socks of all diabetic patients when rooming them, and reminder signs were placed in each exam room.
There was a significant increase in the performance of proper foot examination over the course of the study (baseline 14.0%, 3 months 58.0%, 6 months 62.1%; P < .001). Documentation of any component of a proper foot examination also increased substantially (32.6%, 67.3%, 72.5%; P < .001). Additionally, performance of each component of a proper exam increased dramatically during the study: neurological (13.5%, 35.8%, 38.5%; P < .001), skin (23.0%, 64.2%, 69.2%; P < .001), and vascular (14.0%, 51.2%, 50.5%; P < .001).
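The significance of such before-and-after changes in documentation rates can be checked with a two-proportion z-test; a minimal sketch (the chart counts below are hypothetical for illustration, since the abstract reports percentages rather than raw counts):

```python
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-proportion z-test using the pooled-proportion standard
    error; returns (z statistic, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical example: 14 of 100 charts compliant at baseline
# versus 62 of 100 at 6 months
z, p = two_proportion_z(14, 100, 62, 100)
```

With counts of this magnitude, the resulting p-value falls far below .001, in line with the significance levels the study reports.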
Patients with diabetes are unlikely to have foot examinations in their primary medical care. A simple, low-cost educational intervention significantly improved the adherence to foot examination guidelines for patients with diabetes.
diabetes; foot ulceration; foot exam; prevention; physical education
The Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee require a competency-based, accessible evaluation system. The paper system at our institution did not meet these demands and suffered from low compliance. A diverse committee of internal medicine faculty, program directors, and house staff designed a new clinical evaluation strategy based on ACGME competencies and utilizing a modular web-based system called ResEval. ResEval more effectively met requirements and provided useful data for program and curriculum development. The system is paperless, allows for evaluations at any time, and produces customized evaluation reports, dramatically improving our ability to analyze evaluation data. The use of this novel system, together with a robust technology infrastructure, repeated training and e-mail reminders, and program leadership commitment, resulted in an increase in clinical skills evaluations performed and a rapid change in the workflow and culture of evaluation at our residency program.
house staff evaluation; clinical skills assessment; internal medicine residency; Internet; educational measurement