1) To describe how internal medicine residency programs fulfill the Accreditation Council for Graduate Medical Education (ACGME) scholarly activity training requirement, including the current context of resident scholarly work, and 2) to compare findings between university and nonuniversity programs.
Cross-sectional mailed survey.
ACGME-accredited internal medicine residency programs.
Internal medicine residency program directors.
Data were collected on 1) interpretation of the scholarly activity requirement, 2) support for resident scholarship, 3) scholarly activities of residents, 4) attitudes toward resident research, and 5) program characteristics. University and nonuniversity programs were compared.
The response rate was 78%. Most residents completed a topic review with presentation (median, 100%) to fulfill the requirement. Residents at nonuniversity programs were more likely to complete case reports (median, 40% vs 25%; P =.04) and present at local or regional meetings (median, 25% vs 20%; P =.01), and were just as likely to conduct hypothesis-driven research (median, 20% vs 20%; P =.75) and present nationally (median, 10% vs 5%; P =.10) as residents at university programs. Nonuniversity programs were more likely to report lack of faculty mentors (61% vs 31%; P <.001) and resident interest (55% vs 40%; P =.01) as major barriers to resident scholarship. Programs support resident scholarship through research curricula (47%), funding (46%), and protected time (32%).
Internal medicine residents complete a variety of projects to fulfill the scholarly activity requirement. Nonuniversity programs are doing as much as university programs in meeting the requirement and supporting resident scholarship despite reporting significant barriers.
ACGME; resident research; medical education; national survey
The Accreditation Council for Graduate Medical Education (ACGME) requires an annual evaluation of all ACGME-accredited residency and fellowship programs to assess program quality. The results of this evaluation must be used to improve the program. This manuscript describes metrics to be used in conducting the ACGME-mandated annual program review of ACGME-accredited anesthesiology residencies and fellowships.
A variety of metrics to assess anesthesiology residency and fellowship programs are identified by the authors through literature review and considered for use in constructing a program "report card."
Metrics used to assess program quality include success in achieving American Board of Anesthesiology (ABA) certification, performance on the annual ABA/American Society of Anesthesiology In-Training Examination, performance on mock oral ABA certification examinations, trainee scholarly activities (publications and presentations), accreditation site visit and internal review results, ACGME and alumni survey results, National Resident Matching Program (NRMP) results, exit interview feedback, diversity data and extensive program/rotation/faculty/curriculum evaluations by trainees and faculty. The results are used to construct a "report card" that provides a high-level review of program performance and can be used in a continuous quality improvement process.
An annual program review is required to assess all ACGME-accredited residency and fellowship programs to monitor and improve program quality. We describe an annual review process based on metrics that can be used to focus attention on areas for improvement and track program performance year-to-year. A "report card" format is described as a high-level tool to track educational outcomes.
The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
In November 2007, the American Board of Internal Medicine (ABIM) and the ACGME initiated the development of milestones for internal medicine residency training. ABIM and ACGME convened a 33-member milestones task force made up of program directors, experts in evaluation and quality, and representatives of internal medicine stakeholder organizations. This article reports on the development process and the resulting list of proposed milestones for each ACGME competency.
The task force adopted the Dreyfus model of skill acquisition as a framework for the internal medicine milestones, and calibrated the milestones with the expectation that residents achieve, at a minimum, the “competency” level in the 5-step progression by the completion of residency. The task force also developed general recommendations for strategies to evaluate the milestones.
The milestones resulting from this effort will promote competency-based resident education in internal medicine, and will allow program directors to track the progress of residents and inform decisions regarding promotion and readiness for independent practice. In addition, the milestones may guide curriculum development, suggest specific assessment strategies, provide benchmarks for resident self-directed assessment-seeking, and assist remediation by facilitating identification of specific deficits. Finally, by making explicit the profession's expectations for graduates and providing a degree of national standardization in evaluation, the milestones may improve public accountability for residency training.
The Accreditation Council for Graduate Medical Education (ACGME) requires that training programs integrate system-based practice (SBP) and practice-based learning and improvement (PBLI) into internal medicine residency curricula.
Context and setting
We instituted a seminar series and a year-long mentored curriculum designed to engage internal medicine residents in these competencies.
Residents participate in a seminar series that includes assigned reading and structured discussion with faculty who assist in the development of quality improvement or research projects. Residents pursue projects over the remainder of the year. Monthly works-in-progress meetings, protected time for inquiry, and continued faculty mentorship guide the residents in their project development. Trainees present their work at hospital-wide grand rounds at the end of the academic year. We surveyed residents to assess their self-reported knowledge, attitudes, and skills in SBP and PBLI. In addition, blinded faculty scored projects for appropriateness, impact, and feasibility.
We measured resident self-reported knowledge, attitudes, and skills at the end of the academic year. We found evidence that participants improved their understanding of the context in which they were practicing, and that their ability to engage in quality improvement projects increased. Blinded faculty reviewers favorably ranked the projects’ feasibility, impact, and appropriateness. The ‘Curriculum of Inquiry’ generated 11 quality improvement and research projects during the study period. Barriers to the ongoing work include a limited supply of mentors and delays due to Institutional Review Board approval. Hospital leadership recognizes the importance of the curriculum, and our accreditation manager now cites our ongoing work.
A structured residency-based curriculum facilitates resident demonstration of SBP and practice-based learning and improvement. Residents gain knowledge and skills through this enterprise, and hospitals gain access to trainees who help to solve ongoing problems and meet accreditation requirements.
graduate medical education; competencies; longitudinal curriculum
The Accreditation Council for Graduate Medical Education (ACGME) requires physicians in training to be educated in 6 competencies considered important for independent medical practice. There is little information about the experiences that residents feel contribute most to the acquisition of the competencies.
To understand how residents perceive their learning of the ACGME competencies and to determine which educational activities were most helpful in acquiring these competencies.
A web-based survey created by the graduate medical education office for institutional program monitoring and evaluation was sent to all residents in ACGME-accredited programs at the David Geffen School of Medicine, University of California-Los Angeles, from 2007 to 2010. Residents responded to questions about the adequacy of their learning for each of the 6 competencies and which learning activities were most helpful in competency acquisition.
We analyzed 1378 responses collected from postgraduate year-1 (PGY-1) to PGY-3 residents in 12 different residency programs, surveyed between 2007 and 2010. The overall response rate varied by year (66%–82%). Most residents (80%–97%) stated that their learning of the 6 ACGME competencies was “adequate.” Patient care activities and observation of attending physicians and peers were listed as the 2 most helpful learning activities for acquiring the 6 competencies.
Our findings reinforce the importance of learning from role models during patient care activities and the heterogeneity of learning activities needed for acquiring all 6 competencies.
The Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee require a competency-based, accessible evaluation system. The paper system at our institution did not meet these demands and suffered from low compliance. A diverse committee of internal medicine faculty, program directors, and house staff designed a new clinical evaluation strategy based on ACGME competencies and utilizing a modular web-based system called ResEval. ResEval more effectively met requirements and provided useful data for program and curriculum development. The system is paperless, allows for evaluations at any time, and produces customized evaluation reports, dramatically improving our ability to analyze evaluation data. The use of this novel system, combined with a robust technology infrastructure, repeated training and e-mail reminders, and program leadership commitment, resulted in an increase in clinical skills evaluations performed and a rapid change in the workflow and culture of evaluation at our residency program.
house staff evaluation; clinical skills assessment; internal medicine residency; Internet; educational measurement
This article, developed for the Betty Ford Institute Consensus Conference on Graduate Medical Education (December, 2008), presents a model curriculum for Family Medicine residency training in substance abuse.
The authors reviewed reports of past Family Medicine curriculum development efforts, previously identified barriers to education in high-risk substance use, approaches to overcoming these barriers, and current training guidelines of the Accreditation Council for Graduate Medical Education (ACGME) and its Family Medicine Residency Review Committee. A proposed eight-module curriculum was developed, based on substance abuse competencies defined by Project MAINSTREAM and linked to core competencies defined by the ACGME. The curriculum provides basic training in high-risk substance use to all residents, while also addressing current training challenges presented by U.S. work hour regulations, the increasing international diversity of Family Medicine resident trainees, and emerging new primary care practice models.
This paper offers a core curriculum, focused on screening, brief intervention and referral to treatment, which can be adapted by residency programs to meet their individual needs. The curriculum encourages direct observation of residents to ensure that core skills are learned and trains residents with several "new skills" that will expand the basket of substance abuse services they will be equipped to provide as they enter practice.
Broad-based implementation of a comprehensive Family Medicine residency curriculum should increase the ability of family physicians to provide basic substance abuse services in a primary care context. Such efforts should be coupled with faculty development initiatives which ensure that sufficient trained faculty are available to teach these concepts and with efforts by major Family Medicine organizations to implement and enforce residency requirements for substance abuse training.
The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging.
We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning.
We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure.
An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and facilitate documentation and evaluation of the process. During a 24-month period, 27 programs completed PBLI projects, and 15 reviewed the template with their education committees but have not initiated projects using the template.
The development of the tool generated program leaders' support because the tool enhanced their ability to meet program-specific objectives. The peer-protected status of this document, which confers confidentiality and protection from discovery, has been beneficial for program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important to meeting accreditation requirements at the program and institutional levels.
OBJECTIVE: To describe the views of residency program directors regarding the effect of the 2010 duty hour recommendations on the 6 core competencies of graduate medical education.
METHODS: US residency program directors in internal medicine, pediatrics, and general surgery were e-mailed a survey from July 8 through July 20, 2010, after the 2010 Accreditation Council for Graduate Medical Education (ACGME) duty hour recommendations were published. Directors were asked to rate the implications of the new recommendations for the 6 ACGME core competencies as well as for continuity of inpatient care and resident fatigue.
RESULTS: Of 719 eligible program directors, 464 (65%) responded. Most program directors believe that the new ACGME recommendations will decrease residents' continuity with hospitalized patients (404/464 [87%]) and will not change (303/464 [65%]) or will increase (26/464 [6%]) resident fatigue. Additionally, most program directors (249-363/464 [53%-78%]) believe that the new duty hour restrictions will decrease residents' ability to develop competency in 5 of the 6 core areas. Surgery directors were more likely than internal medicine directors to believe that the ACGME recommendations will decrease residents' competency in patient care (odds ratio [OR], 3.9; 95% confidence interval [CI], 2.5-6.3), medical knowledge (OR, 1.9; 95% CI, 1.2-3.2), practice-based learning and improvement (OR, 2.7; 95% CI, 1.7-4.4), interpersonal and communication skills (OR, 1.9; 95% CI, 1.2-3.0), and professionalism (OR, 2.5; 95% CI, 1.5-4.0).
CONCLUSION: Residency program directors' reactions to ACGME duty hour recommendations demonstrate a marked degree of concern about educating a competent generation of future physicians in the face of increasing duty hour standards and regulation.
There are no nationwide data on the methods residency programs are using to assess trainee competence. The Accreditation Council for Graduate Medical Education (ACGME) has recommended tools that programs can use to evaluate their trainees. It is unknown if programs are adhering to these recommendations.
To describe evaluation methods used by our nation’s internal medicine residency programs and assess adherence to ACGME methodological recommendations for evaluation.
All internal medicine programs registered with the Association of Program Directors in Internal Medicine (APDIM).
Descriptive statistics of programs and tools used to evaluate competence; compliance with ACGME recommended evaluative methods.
The response rate was 70%. Programs were using an average of 4.2–6.0 tools to evaluate their trainees, with heavy reliance on rating forms. Direct observation and practice- and data-based tools were used much less frequently. Most programs were using at least 1 of the ACGME's “most desirable” methods of evaluation for all 6 measures of trainee competence. These programs had higher support staff to resident ratios than programs using less desirable evaluative methods.
Residency programs are using a large number and variety of tools for evaluating the competence of their trainees. Most are complying with ACGME recommended methods of evaluation especially if the support staff to resident ratio is high.
graduate medical education; residency; ACGME; competency
An important expectation of pediatric education is that residents learn to assess, resuscitate, and stabilize ill or injured children.
To determine whether the Accreditation Council for Graduate Medical Education (ACGME) minimum time requirement for emergency and acute illness experience is adequate to achieve the educational objectives set forth for categorical pediatric residents. We hypothesized that despite residents working five 1-month block rotations in a high-volume (95 000 pediatric visits per year) pediatric emergency department (ED), the comprehensive experience outlined by the ACGME would not be satisfied through clinical exposure.
This was a retrospective, descriptive study comparing actual resident experience to the standard defined by the ACGME. The emergency medicine experience of 35 categorical pediatric residents was tracked, including the number of patients evaluated during training and patient discharge diagnoses. The achievability of the ACGME requirement was determined by reporting the percentage of pediatric residents who cared for at least 1 patient from each of the ACGME-required disorder categories.
A total of 11.4% of residents met the ACGME requirement for emergency and acute illness experience in the ED. The median number of patients evaluated by residents during training in the ED was 941. Disorder categories evaluated least frequently included shock, sepsis, diabetic ketoacidosis, coma/altered mental status, cardiopulmonary arrest, burns, and bowel obstruction.
Pediatric residents working in one of the busiest pediatric EDs in the country and working 1 month more than the ACGME-recommended minimum did not achieve the ACGME requirement for emergency and acute illness experience through direct patient care.
The Accreditation Council for Graduate Medical Education (ACGME) requirements stipulate that psychiatry residents need to be educated in the area of emergency psychiatry. Existing research investigating the current state of this training is limited, and no research to date has assessed whether the ACGME Residency Review Committee requirements for psychiatry residency training are followed by psychiatry residency training programs.
We administered, to chief resident attendees of a national leadership conference, a 24‐item paper survey on the types and amount of emergency psychiatry training provided by their psychiatric residency training programs. Descriptive statistics were used in the analysis.
Of 154 surveys distributed, 111 were returned (72% response rate). Nearly one-third of chief resident respondents indicated that more than 50% of their program's emergency psychiatry training was provided during on-call periods. A minority indicated that they were aware of the ACGME program requirements for emergency psychiatry training. While many programs provided emergency psychiatry training through rotations separate from the on-call period, direct supervision was available during on-call training only about one-third of the time.
The findings suggest that about one‐third of psychiatry residency training programs do not adhere to the ACGME standards for emergency psychiatry training. Enhanced knowledge of the ACGME requirements may enhance psychiatry residents' understanding on how their programs are fulfilling the need for more emergency psychiatry training. Alternative settings to the on‐call period for emergency psychiatry training are more likely to provide for direct supervision.
Increased focus on the number and type of physicians delivering health care in the United States necessitates a better understanding of changes in graduate medical education (GME). Data collected by the Accreditation Council for Graduate Medical Education (ACGME) allow longitudinal tracking of residents, revealing the number and type of residents who continue GME following completion of an initial residency. We examined trends in the percent of graduates pursuing additional clinical education following graduation from ACGME-accredited pipeline specialty programs (specialties leading to initial board certification).
Using data collected annually by the ACGME, we tracked residents graduating from ACGME-accredited pipeline specialty programs between academic year (AY) 2002–2003 and AY 2006–2007 and those pursuing additional ACGME-accredited training within 2 years. We examined changes in the number of graduates and the percent of graduates continuing GME by specialty, by type of medical school, and overall.
The number of pipeline specialty graduates increased by 1171 (5.3%) between AY 2002–2003 and AY 2006–2007. During the same period, the number of graduates pursuing additional GME increased by 1059 (16.7%). The overall rate of continuing GME increased each year, from 28.5% (6331/22229) in AY 2002–2003 to 31.6% (7390/23400) in AY 2006–2007. Rates differed by specialty and for US medical school graduates (26.4% [3896/14752] in AY 2002–2003 to 31.6% [4718/14941] in AY 2006–2007) versus international medical graduates (35.2% [2118/6023] to 33.8% [2246/6647]).
The number of graduates and the rate of continuing GME increased from AY 2002–2003 to AY 2006–2007. Our findings show a recent increase in the rate of continued training for US medical school graduates compared to international medical graduates. Our results differ from previously reported rates of subspecialization in the literature. Tracking individual residents through residency and fellowship programs provides a better understanding of residents' pathways to practice.
The Accreditation Council for Graduate Medical Education (ACGME) requires pediatric residency programs to teach professionalism but does not provide concrete guidance for fulfilling these requirements. Individual programs, therefore, adopt their own methods for teaching and evaluating professionalism, and published research demonstrating how to satisfy the ACGME professionalism requirement is lacking.
We surveyed pediatric residency program directors in 2008 to explore the establishment of expectations for professional conduct, the educational experiences used to foster learning in professionalism, and the evaluation of professionalism.
Surveys were completed by 96 of 189 program directors (51%). A majority reported that new interns attend a session during which expectations for professionalism are conveyed, either verbally (93%) or in writing (65%). However, most program directors reported that “None or Few” of their residents engaged in multiple educational experiences that could foster learning in professionalism. Despite the identification of professionalism as a core competency, only a minority (28%) of programs had a written curriculum in ethics or professionalism. When evaluating professionalism, the most frequently used assessment strategies were rated as “very useful” by only a modest proportion (26%–54%) of respondents.
Few programs have written curricula in professionalism, and opportunities for experiential learning in professionalism may be limited. In addition, program directors expressed only moderate satisfaction with the strategies for evaluating professionalism that were available through 2008.
Education and training in advanced airway management as part of an anesthesiology residency program are necessary to help residents attain the status of expert in difficult airway management. The Accreditation Council for Graduate Medical Education (ACGME) emphasizes that residents in anesthesiology must obtain significant experience with a broad spectrum of airway management techniques. However, no specific number is required as a minimum clinical experience to ensure competency. We have developed a curriculum for a new Advanced Airway Techniques rotation, supplemented with a hands-on Difficult Airway Workshop. We describe here this comprehensive advanced airway management educational program at our institution. Future studies will focus on determining whether education in advanced airway management results in a decrease in airway-related morbidity and mortality and better overall patient outcomes during difficult airway management.
To assess the value of a faculty and resident medical education development program.
Modules on Accreditation Council for Graduate Medical Education (ACGME) competencies and evaluation, teaching methods, and Residency Review Committee guidelines were created, beta-tested, and installed on a website. Pretests and posttests were developed. Faculty and residents were required to complete the course. At initiation and 6 months after training, residents completed a feedback perception survey. Statistical analysis was performed using the Student t test. P < .05 was considered significant.
Forty-nine voluntary faculty members and residents completed the course. The posttest scores on all the ACGME competencies were significantly higher than the pretest scores (P < .05). The results of the residents' survey indicated that the educational development program significantly improved their perceptions of corrective and immediate feedback by faculty.
A formal Internet-based program significantly increases short-term cognitive knowledge about the ACGME competencies among participants and improves trainees' perceptions of the quality of faculty feedback up to 6 months after training.
International medical graduates (IMGs) constitute about a third of United States (US) internal medicine graduates. US residency training programs face challenges in the selection of IMGs with varied background features. However, data on this topic are limited. We analyzed whether any pre-selection characteristics of IMG residents in our internal medicine program are associated with selected outcomes, namely competency-based evaluation, examination performance, and success in acquiring fellowship positions after graduation.
We conducted a retrospective study of 51 IMGs at our ACGME-accredited teaching institution between 2004 and 2007. Background resident features were noted, namely age, gender, self-reported ethnicity, time between medical school graduation and residency (pre-hire time), USMLE Step 1 and Step 2 Clinical Skills scores, pre-GME clinical experience, US externship experience, and interest in pursuing a fellowship after graduation as expressed in personal statements. Data on competency-based evaluations, in-service exam scores, research presentations and publications, and fellowship pursuance were collected. No fellowships were offered in our hospital during the study period. Background features were compared between resident groups according to the following outcomes: (a) annual aggregate PGY-level-specific competency-based evaluation (CBE) score above versus below the median score within our program (scoring scale of 1–10), (b) PGY-level-specific resident in-training exam (ITE) score higher versus lower than the US graduate median score, and (c) success in securing a fellowship within the study period. Using appropriate statistical tests and adjusted regression analysis, odds ratios with 95% confidence intervals were calculated.
Of the study sample, 94% were IMGs; the median age was 35 years (interquartile range [IQR], 25th–75th percentile: 33–37 years); 43% were women, and 59% were Asian physicians. The median pre-hire time was 5 years (IQR: 4–7 years), and the median USMLE Step 1 and Step 2 Clinical Skills scores were 85 (IQR: 80–88) and 82 (IQR: 79–87), respectively. The median aggregate CBE scores during training were: PGY-1, 5.8 (IQR: 5.6–6.3); PGY-2, 6.3 (IQR: 6–6.8); and PGY-3, 6.7 (IQR: 6.7–7.1). Twenty-five percent of our residents scored consistently above the US national median ITE score in all 3 years of training, and 16% pursued a fellowship.
Younger residents had aggregate annual CBE scores above the program median (p < 0.05). Higher USMLE scores were associated with above-US-median ITE scores, reflecting exam-taking skills. Success in acquiring a fellowship was associated with consistent fellowship interest (p < 0.05) and research publications or presentations (p < 0.05). None of the other characteristics, including visa status, were associated with the outcomes.
Background IMG features, namely age and USMLE scores, predict performance evaluations and in-training examination scores during residency training. In addition, enhanced research activities during residency training could facilitate fellowship goals among interested IMGs.
Among medical educators, there are concerns that the 2003 Accreditation Council for Graduate Medical Education (ACGME) duty hour rules (DHR) have encouraged the development of a “shift work” mentality among residents while eroding professionalism by forcing residents to either abandon patients when they hit 80 hours or lie about hours worked. In this qualitative study, we explore how medical and surgical residents perceive and respond to DHR by examining the ‘local’ organizational culture in which their work is embedded.
In 2008, we conducted three months of ethnographic observation of internal medicine and general surgery residents as they went about their everyday work in two hospitals affiliated with the same training program. We also conducted in-depth interviews with seventeen residents. Field notes and interview transcripts were analyzed for perceptions and behaviors surrounding coming and leaving work, reporting of duty hours, and resident opinion about DHR.
Our respondents did not exhibit a “shift work” mentality in relation to their work. We found that residents: 1) occasionally stay in the hospital in order to complete patient care tasks even when, according to the clock, they were required to leave because the organizational culture stressed performing work thoroughly, 2) do not blindly embrace noncompliance with DHR but are thoughtful about the tradeoffs inherent in the regulations, and 3) express nuanced and complex reasons for erroneously reporting duty hours that suggest that reporting hours worked is not a simple issue of lying or truth telling.
Concerns about DHR and the erosion of resident professionalism via the development of a “shift work” mentality are likely to have been over-stated. At the institution we examined, residents did not behave as automatons punching in and out at prescribed times. Rather, they are mindful of the consequences and meaning surrounding the decisions they make to stay or leave work. When work hour rules are broken, residents do not perceive this behavior to be deviant but rather as a reflection of the higher priority that they place on providing patient care than on complying strictly with DHR. The influence of DHR on professionalism is more complex than conventional wisdom suggests and requires additional assessment.
internship and residency; duty hour regulations; professionalism
The American Board of Internal Medicine Certification Examination (ABIM-CE) is one of several methods used to assess medical knowledge, an Accreditation Council for Graduate Medical Education (ACGME) core competency for graduating internal medicine residents. With recent changes in graduate medical education, program directors and internal medicine residents are seeking evidence to guide decisions regarding residency elective choices. Prior studies have shown that formalized elective curricula improve subspecialty ABIM-CE scores. The primary aim of this study was to evaluate whether the number of subspecialty elective exposures, or the specific subspecialties in which residents complete electives, impacts ABIM-CE scores.
ABIM-CE scores, elective exposures, and demographic characteristics were collected for MedStar Georgetown University Hospital internal medicine residents who were first-time takers of the ABIM-CE in 2006–2010 (n=152). Elective exposures were defined as a two-week period assigned to the respective subspecialty. ABIM-CE score was analyzed using the difference between the ABIM-CE score and the standardized passing score (delta-SPS). Subspecialty scores were analyzed using percentage of correct responses. Data were analyzed using GraphPad Prism version 5.00 for Windows.
Paired elective exposure and ABIM-CE scores were available for 131 residents. There was no linear correlation between ABIM-CE mean delta-SPS and the total number of electives or the number of unique elective exposures. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures (143.4 vs 129.7, p=0.051). Repeated electives in individual subspecialties were not associated with a significant difference in mean ABIM-CE delta-SPS.
This study did not demonstrate significant positive associations between individual subspecialty elective exposures and ABIM-CE mean delta-SPS score. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures, suggesting there may be an “ideal” number of elective exposures that supports improved ABIM-CE performance. Repeated elective exposures in an individual subspecialty did not correlate with overall or subspecialty ABIM-CE performance.
Resident education; Gender; Elective; Subspecialty; Graduate medical education
The Accreditation Council for Graduate Medical Education (ACGME) invokes evidence-based medicine (EBM) principles through the practice-based learning core competency. The authors hypothesized that among a representative sample of emergency medicine (EM) residency programs, a wide variability in EBM resident training priorities, faculty expertise expectations, and curricula exists.
The primary objective was to obtain descriptive data regarding EBM practices and expectations from EM physician educators. Our secondary objective was to assess differences in EBM educational priorities among journal club directors compared with non–journal club directors.
A 19-question survey was developed by a group of recognized EBM curriculum innovators and then disseminated to Council of Emergency Medicine Residency Directors (CORD) conference participants, assessing their opinions regarding essential EBM skill sets and EBM curricular expectations for residents and faculty at their home institutions. The survey instrument also identified the degree of interest respondents had in receiving a free monthly EBM journal club curriculum.
A total of 157 individuals registered for the conference, and 98 completed the survey. Seventy-seven respondents (77%) were either residency program directors or assistant/associate program directors. The majority of participants were from university-based programs and had been in practice at least 5 years. Respondents reported the ability to identify flawed research (45%), apply research findings to patient care (43%), and comprehend research methodology (33%) as the most important resident skill sets. The majority of respondents reported no formal journal club or EBM curricula (75%) and did not utilize structured critical appraisal instruments (71%) when reviewing the literature. While journal club directors believed that resident learners’ most important EBM skill is to identify secondary peer-reviewed resources, non–journal club directors identified residents’ ability to distinguish significantly flawed research as the key skill to develop. Interest in receiving a free monthly EBM journal club curriculum was widespread (89%).
Attaining EBM proficiency is an expected outcome of graduate medical education (GME) training, although the specific domains of anticipated expertise differ between faculty and residents. Few respondents currently use a formalized curriculum to guide the development of EBM skill sets. There appears to be a high level of interest in obtaining EBM journal club educational content in a structured format. Measuring the effects of providing journal club curriculum content in conjunction with other EBM interventions may warrant further investigation.
evidence-based medicine; knowledge translation; faculty development
The Department of Graduate Medical Education at Stanford Hospital and Clinics has developed a professional training program for program directors. This paper outlines the goals, structure, and expected outcomes for the one-year Fellowship in Graduate Medical Education Administration program.
The skills necessary for leading a successful Accreditation Council for Graduate Medical Education (ACGME) training program require an increased level of curricular and administrative expertise. To meet the ACGME Outcome Project goals, program directors must demonstrate sophisticated understanding not only of curricular design but also of competency-based performance assessment, resource management, and employment law. Few faculty-development efforts adequately address the complexities of educational administration. As part of an institutional needs assessment, 41% of Stanford program directors indicated that they wanted more training from the Department of Graduate Medical Education.
To address this need, the Fellowship in Graduate Medical Education Administration program will provide a curriculum that includes (1) readings and discussions in 9 topic areas, (2) regular mentoring by the director of Graduate Medical Education (GME), (3) completion of a service project that helps improve GME across the institution, and (4) completion of an individual scholarly project that focuses on education.
The first fellow was accepted during the 2008–2009 academic year. Outcomes for the project include presentation of a project at a national meeting, internal workshops geared towards disseminating learning to peer program directors, and the completion of a GME service project. The paper also discusses lessons learned for improving the program.
Scholarly activity as a component of residency education is becoming increasingly emphasized by the Accreditation Council for Graduate Medical Education. “Limited or no evidence of resident or faculty scholarly activity” is a common citation given to family medicine residency programs by the Review Committee for Family Medicine.
The objective was to provide a model scholarly activity curriculum that has been successful in improving the quality of graduate medical education in a family medicine residency program, as evidenced by a record of resident academic presentations and publications.
We provide a description of the Clinical Scholars Program that has been implemented into the curriculum of the Trident/Medical University of South Carolina Family Medicine Residency Program.
During the most recent 10-year academic period (2000–2010), a total of 111 residents completed training and participated in the Clinical Scholars Program. This program has produced more than 24 presentations during national and international meetings of medical societies and 15 publications in peer-reviewed medical journals. In addition, many of the projects have been presented during meetings of state and regional medical organizations.
This paper presents a model curriculum for teaching about scholarship to family medicine residents. The success of this program is evidenced by the numerous presentations and publications by participating residents.
The Accreditation Council for Graduate Medical Education (ACGME) supports chart audit as a method to track competency in Practice-Based Learning and Improvement. We examined whether peer chart audits performed by internal medicine residents were associated with improved documentation of foot care in patients with diabetes mellitus.
A retrospective electronic chart review was performed on 347 patients with diabetes mellitus cared for by internal medicine residents in a university-based continuity clinic from May 2003 to September 2004. Residents abstracted information pertaining to documentation of foot examinations (neurological, vascular, and skin) from the charts of patients followed by their physician peers. No formal feedback or education was provided.
Significant improvement in the documentation of foot exams was observed over the course of the study. The percentage of patients receiving neurological, vascular, and skin exams increased by 20% (from 13% to 33%; p = 0.001), 26% (from 45% to 71%; p < 0.001), and 18% (from 51% to 72%; p = 0.005), respectively. Similarly, the proportion of patients receiving a well-documented exam including all three components (neurological, vascular, and skin) increased over time (from 6% to 24%; p < 0.001).
Peer chart audits performed by residents in the absence of formal feedback were associated with improved documentation of the foot exam in patients with diabetes mellitus. Although this study suggests that peer chart audits may be an effective tool to improve practice-based learning and documentation of foot care in diabetic patients, evaluating the actual performance of clinical care was beyond the scope of this study and would be better addressed by a randomized controlled trial.
The Accreditation Council for Graduate Medical Education (ACGME) introduced the Outcome Project in July 2001 to improve the quality of resident education through competency-based learning. The purpose of this systematic review is to determine and explore the perceptions of program directors regarding challenges to implementing the ACGME Outcome Project.
We used the PubMed and Web of Science databases and bibliographies for English-language articles published between January 1, 2001, and February 17, 2012. Studies were included if they described program directors' opinions on (1) barriers encountered when attempting to implement ACGME competency-based education, and (2) assessment methods that each residency program was using to implement competency-based education. Articles meeting the inclusion criteria were screened by 2 researchers. The grading criterion was created by the authors and used to assess the quality of each study.
The survey-based data reported the opinions of 1076 program directors. Barriers that were encountered include: (1) lack of time; (2) lack of faculty support; (3) resistance of residents to the Outcome Project; (4) insufficient funding; (5) perceived low priority for the Outcome Project; (6) inadequate salary incentive; and (7) inadequate knowledge of the competencies. Of the 6 competencies, those pertaining to patient care and medical knowledge received the most responses from program directors and were given highest priority.
The reviewed literature revealed that time and financial constraints were the most important barriers encountered when implementing the ACGME Outcome Project.
Morbidity and Mortality (M&M) conferences are an Accreditation Council for Graduate Medical Education (ACGME)-mandated educational series that occurs regularly at all institutions with residency training programs. The potential for learning from medical errors, complications, and unanticipated outcomes is immense, provided that the focus is on education rather than culpability. The educational innovation described in this manuscript is the manner in which we have used the ACGME Outcome Project's 6 core competencies as the structure upon which the cases discussed at our M&M conference are framed. When presented at grand rounds in this novel format, the M&M conference has not only maintained support for the quality improvement efforts in the department but has also improved the educational impact of the conference.
grand rounds; Morbidity and Mortality; ACGME competencies; medical education