The Accreditation Council for Graduate Medical Education (ACGME) requires an annual evaluation of all ACGME-accredited residency and fellowship programs to assess program quality. The results of this evaluation must be used to improve the program. This manuscript describes a metric to be used in conducting ACGME-mandated annual program review of ACGME-accredited anesthesiology residencies and fellowships.
Through a literature review, the authors identified a variety of metrics for assessing anesthesiology residency and fellowship programs and considered them for use in constructing a program "report card."
Metrics used to assess program quality include success in achieving American Board of Anesthesiology (ABA) certification, performance on the annual ABA/American Society of Anesthesiologists In-Training Examination, performance on mock oral ABA certification examinations, trainee scholarly activities (publications and presentations), accreditation site visit and internal review results, ACGME and alumni survey results, National Resident Matching Program (NRMP) results, exit interview feedback, diversity data, and extensive program/rotation/faculty/curriculum evaluations by trainees and faculty. The results are used to construct a "report card" that provides a high-level review of program performance and can be used in a continuous quality improvement process.
An annual program review is required to assess all ACGME-accredited residency and fellowship programs to monitor and improve program quality. We describe an annual review process based on metrics that can be used to focus attention on areas for improvement and track program performance year-to-year. A "report card" format is described as a high-level tool to track educational outcomes.
OBJECTIVE: To describe the views of residency program directors regarding the effect of the 2010 duty hour recommendations on the 6 core competencies of graduate medical education.
METHODS: US residency program directors in internal medicine, pediatrics, and general surgery were e-mailed a survey from July 8 through July 20, 2010, after the 2010 Accreditation Council for Graduate Medical Education (ACGME) duty hour recommendations were published. Directors were asked to rate the implications of the new recommendations for the 6 ACGME core competencies as well as for continuity of inpatient care and resident fatigue.
RESULTS: Of 719 eligible program directors, 464 (65%) responded. Most program directors believe that the new ACGME recommendations will decrease residents' continuity with hospitalized patients (404/464 [87%]) and will not change (303/464 [65%]) or will increase (26/464 [6%]) resident fatigue. Additionally, most program directors (249-363/464 [53%-78%]) believe that the new duty hour restrictions will decrease residents' ability to develop competency in 5 of the 6 core areas. Surgery directors were more likely than internal medicine directors to believe that the ACGME recommendations will decrease residents' competency in patient care (odds ratio [OR], 3.9; 95% confidence interval [CI], 2.5-6.3), medical knowledge (OR, 1.9; 95% CI, 1.2-3.2), practice-based learning and improvement (OR, 2.7; 95% CI, 1.7-4.4), interpersonal and communication skills (OR, 1.9; 95% CI, 1.2-3.0), and professionalism (OR, 2.5; 95% CI, 1.5-4.0).
CONCLUSION: Residency program directors' reactions to ACGME duty hour recommendations demonstrate a marked degree of concern about educating a competent generation of future physicians in the face of increasing duty hour standards and regulation.
The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
In November 2007, the American Board of Internal Medicine (ABIM) and the ACGME initiated the development of milestones for internal medicine residency training. ABIM and ACGME convened a 33-member milestones task force made up of program directors, experts in evaluation and quality, and representatives of internal medicine stakeholder organizations. This article reports on the development process and the resulting list of proposed milestones for each ACGME competency.
The task force adopted the Dreyfus model of skill acquisition as a framework for the internal medicine milestones, and calibrated the milestones with the expectation that residents achieve, at a minimum, the “competency” level in the 5-step progression by the completion of residency. The task force also developed general recommendations for strategies to evaluate the milestones.
The milestones resulting from this effort will promote competency-based resident education in internal medicine, and will allow program directors to track the progress of residents and inform decisions regarding promotion and readiness for independent practice. In addition, the milestones may guide curriculum development, suggest specific assessment strategies, provide benchmarks for resident self-directed assessment-seeking, and assist remediation by facilitating identification of specific deficits. Finally, by making explicit the profession's expectations for graduates and providing a degree of national standardization in evaluation, the milestones may improve public accountability for residency training.
In the United States, the Accreditation Council for Graduate Medical Education (ACGME) requires all accredited internal medicine (IM) residency training programs to facilitate resident scholarly activities. However, clinical experience and medical education remain the main focus of graduate medical education in many IM residency training programs. Left to design the structure, process, and outcome evaluation of the ACGME research requirement on their own, residency training programs face numerous barriers. Many residency programs report having been cited by the ACGME Residency Review Committee for Internal Medicine for lack of scholarly activity by residents.
We would like to share our experience at Lincoln Hospital, an affiliate of Weill Medical College Cornell University New York, in designing and implementing a successful structured research curriculum based on ACGME competencies taught during a dedicated "research rotation".
Since the inception of the research rotation in 2004, our residents' participation in scholarly activities has substantially increased. Our residents increasingly appreciate that research is an integral component of residency training and essential to the practice of medicine.
Internal medicine residents' outlook on research can be significantly improved through a research curriculum offered in a structured and dedicated research rotation. This is exemplified by the improvement noted in resident satisfaction, participation in scholarly activities, and resident research outcomes since the inception of the research rotation in our internal medicine training program.
The Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee require a competency-based, accessible evaluation system. The paper system at our institution did not meet these demands and suffered from low compliance. A diverse committee of internal medicine faculty, program directors, and house staff designed a new clinical evaluation strategy based on ACGME competencies and utilizing a modular web-based system called ResEval. ResEval more effectively met requirements and provided useful data for program and curriculum development. The system is paperless, allows for evaluations at any time, and produces customized evaluation reports, dramatically improving our ability to analyze evaluation data. The use of this novel system and the inclusion of a robust technology infrastructure, repeated training and e-mail reminders, and program leadership commitment resulted in an increase in clinical skills evaluations performed and a rapid change in the workflow and culture of evaluation at our residency program.
house staff evaluation; clinical skills assessment; internal medicine residency; Internet; educational measurement
1) To describe how internal medicine residency programs fulfill the Accreditation Council for Graduate Medical Education (ACGME) scholarly activity training requirement, including the current context of resident scholarly work, and 2) to compare findings between university and nonuniversity programs.
Cross-sectional mailed survey.
ACGME-accredited internal medicine residency programs.
Internal medicine residency program directors.
Data were collected on 1) interpretation of the scholarly activity requirement, 2) support for resident scholarship, 3) scholarly activities of residents, 4) attitudes toward resident research, and 5) program characteristics. University and nonuniversity programs were compared.
The response rate was 78%. Most residents completed a topic review with presentation (median, 100%) to fulfill the requirement. Residents at nonuniversity programs were more likely to complete case reports (median, 40% vs 25%; P =.04) and present at local or regional meetings (median, 25% vs 20%; P =.01), and were just as likely to conduct hypothesis-driven research (median, 20% vs 20%; P =.75) and present nationally (median, 10% vs 5%; P =.10) as residents at university programs. Nonuniversity programs were more likely to report lack of faculty mentors (61% vs 31%; P <.001) and resident interest (55% vs 40%; P =.01) as major barriers to resident scholarship. Programs support resident scholarship through research curricula (47%), funding (46%), and protected time (32%).
Internal medicine residents complete a variety of projects to fulfill the scholarly activity requirement. Nonuniversity programs are doing as much as university programs in meeting the requirement and supporting resident scholarship despite reporting significant barriers.
ACGME; resident research; medical education; national survey
The American Board of Internal Medicine Certification Examination (ABIM-CE) is one of several methods used to assess medical knowledge, an Accreditation Council for Graduate Medical Education (ACGME) core competency for graduating internal medicine residents. With recent changes in graduate medical education, program directors and internal medicine residents are seeking evidence to guide decisions regarding residency elective choices. Prior studies have shown that formalized elective curricula improve subspecialty ABIM-CE scores. The primary aim of this study was to evaluate whether the number of subspecialty elective exposures, or the specific subspecialties in which residents complete electives, impacts ABIM-CE scores.
ABIM-CE scores, elective exposures, and demographic characteristics were collected for MedStar Georgetown University Hospital internal medicine residents who were first-time takers of the ABIM-CE in 2006–2010 (n=152). Elective exposures were defined as a two-week period assigned to the respective subspecialty. ABIM-CE score was analyzed using the difference between the ABIM-CE score and the standardized passing score (delta-SPS). Subspecialty scores were analyzed using percentage of correct responses. Data were analyzed using GraphPad Prism version 5.00 for Windows.
Paired elective exposure and ABIM-CE scores were available in 131 residents. There was no linear correlation between ABIM-CE mean delta-SPS and the total number of electives or the number of unique elective exposures. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures (143.4 compared to 129.7, p=0.051). Repeated electives in individual subspecialties were not associated with significant difference in mean ABIM-CE delta-SPS.
This study did not demonstrate significant positive associations between individual subspecialty elective exposures and ABIM-CE mean delta-SPS score. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures suggesting there may be an “ideal” number of elective exposures that supports improved ABIM-CE performance. Repeated elective exposures in an individual specialty did not correlate with overall or subspecialty ABIM-CE performance.
Resident education; Gender; Elective; Subspecialty; Graduate medical education
Responding to mandates from the Accreditation Council for Graduate Medical Education (ACGME) and American Osteopathic Association (AOA), residency programs have developed competency-based assessment tools. One such tool is the American College of Osteopathic Pediatricians (ACOP) program directors’ annual report. High-stakes clinical skills licensing examinations, such as the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), also assess competency in several clinical domains.
The purpose of this study is to investigate the relationships between program director competency ratings of first-year osteopathic residents in pediatrics and COMLEX-USA Level 2-PE scores from 2005 to 2009.
The sample included all 94 pediatric first-year residents who took COMLEX-USA Level 2-PE and whose training was reviewed by the ACOP for approval of training between 2005 and 2009. Program director competency ratings and COMLEX-USA Level 2-PE scores (domain and component) were merged and analyzed for relationships.
Biomedical/biomechanical domain scores were positively correlated with overall program director competency ratings. Humanistic domain scores were not significantly correlated with overall program director competency ratings, but did show moderate correlation with ratings for interpersonal and communication skills. The six ACGME or seven AOA competencies assessed empirically by the ACOP program directors’ annual report could not be recovered by principal component analysis; instead, three factors were identified, accounting for 86% of the variance between competency ratings.
A few significant correlations were noted between COMLEX-USA Level 2-PE scores and program director competency ratings. Exploring relationships between different clinical skills assessments is inherently difficult because of the heterogeneity of tools used and overlap of constructs within the AOA and ACGME core competencies.
residency program director ratings; clinical skills testing; high-stakes licensing exam; competency assessment; pediatric residents; external validity; COMLEX-USA
The Accreditation Council for Graduate Medical Education (ACGME) requires that training programs integrate system-based practice (SBP) and practice-based learning and improvement (PBLI) into internal medicine residency curricula.
Context and setting
We instituted a seminar series and year-long-mentored curriculum designed to engage internal medicine residents in these competencies.
Residents participate in a seminar series that includes assigned reading and structured discussion with faculty who assist in the development of quality improvement or research projects. Residents pursue projects over the remainder of the year. Monthly works in progress meetings, protected time for inquiry, and continued faculty mentorship guide the residents in their project development. Trainees present their work at hospital-wide grand rounds at the end of the academic year. We performed a survey of residents to assess their self-reported knowledge, attitudes and skills in SBP and PBLI. In addition, blinded faculty scored projects for appropriateness, impact, and feasibility.
We measured resident self-reported knowledge, attitudes, and skills at the end of the academic year. We found evidence that participants improved their understanding of the context in which they were practicing, and that their ability to engage in quality improvement projects increased. Blinded faculty reviewers favorably ranked the projects’ feasibility, impact, and appropriateness. The ‘Curriculum of Inquiry’ generated 11 quality improvement and research projects during the study period. Barriers to the ongoing work include a limited supply of mentors and delays due to Institutional Review Board approval. Hospital leadership recognizes the importance of the curriculum, and our accreditation manager now cites our ongoing work.
A structured residency-based curriculum facilitates resident demonstration of SBP and practice-based learning and improvement. Residents gain knowledge and skills through this enterprise, and hospitals gain access to trainees who help to solve ongoing problems and meet accreditation requirements.
graduate medical education; competencies; longitudinal curriculum
The Accreditation Council for Graduate Medical Education (ACGME) core competencies are used to assess resident performance, and recently similar competencies have become an accepted framework for evaluating medical student achievements as well. However, the utility of incorporating the competencies into the resident application has not yet been assessed.
The objective of this study was to examine letters of recommendation (LORs) to identify ACGME competency–based themes that might help distinguish the least successful from the most successful residents.
Residents entering a university-based residency program from 1994 to 2004 were retrospectively evaluated by faculty and ranked in 4 groups according to perceived level of success. Applications from residents in the highest and lowest groups were abstracted. LORs were qualitatively reviewed and analyzed for 9 themes (6 ACGME core competencies and 3 additional performance measures). The mean number of times each theme was mentioned was calculated for each student. Groups were compared using the χ2 test and the Student t test.
Seventy-five residents were eligible for analysis, and 29 residents were ranked in the highest and lowest groups. Baseline demographics and number of LORs did not differ between the two groups. Successful residents had statistically significantly more comments about excellence in the competency areas of patient care, medical knowledge, and interpersonal and communication skills.
LORs can provide useful clues to differentiate between students who are likely to become the least versus the most successful residency program graduates. Greater usage of the ACGME core competencies within LORs may be beneficial.
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project began to focus on resident performance in the 6 competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. In 2007, the ACGME began collecting information on how programs assess these competencies. This report provides information on the nature and extent of those assessments.
Using data collected by the ACGME for site visits, we use descriptive statistics and percentages to describe the number and type of methods and assessors accredited programs (n = 4417) report using to assess the competencies. Observed differences among specialties, methodologies, and assessors are tested with analysis of variance procedures.
Almost all (>97%) of programs report assessing all of the competencies and using multiple methods and multiple assessors. Similar assessment methods and evaluator types were consistently used across the 6 competencies. However, there were some differences in the use of patients and family members as assessors: primary care and ambulatory specialties used these to a greater extent than other specialties.
Residency programs are emphasizing the competencies in their evaluation of residents. Understanding the scope of evaluation methodologies that programs use in resident assessment is important for both the profession and the public, so that together we may monitor continuing improvement in US graduate medical education.
The Accreditation Council for Graduate Medical Education (ACGME) has announced revisions to the resident duty hour standards in light of a 2008 Institute of Medicine report that recommended further limits. Soliciting resident input regarding the future of duty hours is critical to ensure trainee buy-in.
To assess incoming intern perceptions of duty hour restrictions at 3 teaching hospitals.
We administered an anonymous survey to incoming interns during orientation at 3 teaching hospitals affiliated with 2 Midwestern medical schools in 2009. Survey questions assessed interns' perceptions of maximum shift length, days off, ACGME oversight, and preferences for a “fatigued post-call intern who admitted patient” versus “well-rested covering intern who just picked up patient” for various clinical scenarios.
Eighty-six percent (299/346) of interns responded. Although 59% agreed that residents should not work over 16 hours without a break, 50% of interns favored the current limits. The majority (78%) of interns desired the ability to exceed shift limits for rare cases or clinical opportunities. Most interns (90%) favored oversight by the ACGME, and 97% preferred a well-rested intern for performing a procedure. Meanwhile, only 48% of interns preferred a well-rested intern for discharging a patient or having an end-of-life discussion. Interns who favored 16-hour limits were less concerned with negative consequences of duty hour restrictions (handoffs, reduced clinical experience) and more likely to choose the well-rested intern for certain scenarios (odds ratio 2.33, 95% confidence interval 1.42–3.85, P = .001).
Incoming intern perceptions on limiting duty hours vary. Many interns desire flexibility to exceed limits for interesting clinical opportunities and favor ACGME oversight. Clinical context matters when interns consider the tradeoffs between fatigue and discontinuity.
The Accreditation Council for Graduate Medical Education (ACGME) requires physicians in training to be educated in 6 competencies considered important for independent medical practice. There is little information about the experiences that residents feel contribute most to the acquisition of the competencies.
To understand how residents perceive their learning of the ACGME competencies and to determine which educational activities were most helpful in acquiring these competencies.
A web-based survey created by the graduate medical education office for institutional program monitoring and evaluation was sent to all residents in ACGME-accredited programs at the David Geffen School of Medicine, University of California-Los Angeles, from 2007 to 2010. Residents responded to questions about the adequacy of their learning for each of the 6 competencies and which learning activities were most helpful in competency acquisition.
We analyzed 1378 responses collected from postgraduate year-1 (PGY-1) to PGY-3 residents in 12 different residency programs, surveyed between 2007 and 2010. The overall response rate varied by year (66%–82%). Most residents (80%–97%) stated that their learning of the 6 ACGME competencies was “adequate.” Patient care activities and observation of attending physicians and peers were listed as the 2 most helpful learning activities for acquiring the 6 competencies.
Our findings reinforce the importance of learning from role models during patient care activities and the heterogeneity of learning activities needed for acquiring all 6 competencies.
International medical graduates (IMGs) constitute about a third of United States (US) internal medicine graduates. US residency training programs face challenges in the selection of IMGs with varied background features. However, data on this topic are limited. We analyzed whether any preselection characteristics of IMG residents in our internal medicine program are associated with selected outcomes, namely competency-based evaluation, examination performance, and success in acquiring fellowship positions after graduation.
We conducted a retrospective study of 51 IMGs at our ACGME-accredited teaching institution between 2004 and 2007. Background resident features were noted, namely age, gender, self-reported ethnicity, time between medical school graduation and residency (pre-hire time), USMLE Step I and Step II Clinical Skills scores, pre-GME clinical experience, US externship, and interest in pursuing a fellowship after graduation as expressed in their personal statements. Data were collected on competency-based evaluations, in-service examination scores, research presentations and publications, and fellowship pursuit. No fellowships were offered in our hospital during the study period. Background features were compared between resident groups according to the following outcomes: (a) annual aggregate PGY-level-specific competency-based evaluation (CBE) score above versus below the median score within our program (scoring scale of 1–10), (b) PGY-level-specific resident in-training examination (ITE) score higher versus lower than the US graduate median score, and (c) success in securing a fellowship within the study period. Using appropriate statistical tests and adjusted regression analysis, odds ratios with 95% confidence intervals were calculated.
94% of the study sample were IMGs; median age was 35 years (interquartile range, 25th–75th percentile [IQR]: 33–37 years); 43% were women and 59% were Asian physicians. The median pre-hire time was 5 years (IQR: 4–7 years), and USMLE Step I and Step II Clinical Skills scores were 85 (IQR: 80–88) and 82 (IQR: 79–87), respectively. The median aggregate CBE scores during training were: PGY-1, 5.8 (IQR: 5.6–6.3); PGY-2, 6.3 (IQR: 6.0–6.8); and PGY-3, 6.7 (IQR: 6.7–7.1). Twenty-five percent of our residents scored consistently above the US national median ITE score in all 3 years of training, and 16% pursued a fellowship.
Younger residents had higher aggregate annual CBE score than the program median (p < 0.05). Higher USMLE scores were associated with higher than US median ITE scores, reflecting exam-taking skills. Success in acquiring a fellowship was associated with consistent fellowship interest (p < 0.05) and research publications or presentations (p <0.05). None of the other characteristics including visa status were associated with the outcomes.
Background IMG features, namely age and USMLE scores, predict performance evaluations and in-training examination scores during residency training. In addition, enhanced research activities during residency training could facilitate fellowship goals among interested IMGs.
The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging.
We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning.
We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure.
An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and facilitate documentation and evaluation of the process. During a 24 month period, 27 programs have completed PBLI projects, and 15 have reviewed the template with their education committees, but have not initiated projects using the template.
The development of the tool generated program leaders' support because the tool enhanced the ability to meet program-specific objectives. The document's peer-protected status, which provides confidentiality and protection from discovery, has been beneficial for program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important to meeting accreditation requirements at the program and institutional level.
Because of the Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee (RRC) approval timelines, new residency programs cannot use the Electronic Residency Application Service (ERAS) during their first year of accepting applicants.
We sought to identify differences between program directors’ subjective ratings of applicants from an emergency medicine (EM) residency program’s first year (in which ERAS was not used) to their ratings of applicants the following year in which ERAS was used.
The University of Utah Emergency Medicine Residency Program received approval from the ACGME in 2004. Applicants for the entering class of 2005 (year 1) did not use ERAS, submitting a separate application, while those applying for the following year (year 2) used ERAS. Residency program directors rated applicants using subjective components of their applications, assigning scores on scales from 0–10 or 0–5 (10 or 5 = highest score) for select components of the application. We retrospectively reviewed and compared these ratings between the 2 years of applicants.
A total of 130 and 458 prospective residents applied during year 1 and year 2, respectively. Applicants were similar in average scores for research (1.65 vs. 1.81, scale 0–5, p = 0.329) and volunteer work (5.31 vs. 5.56, scale 0–10, p = 0.357). Year 1 applicants received higher scores for their personal statement (3.21 vs. 2.22, scale 0–5, p < 0.001), letters of recommendation (7.0 vs. 5.94, scale 0–10, p < 0.001), dean’s letter (3.5 vs. 2.7, scale 1–5, p < 0.001), and in their potential contribution to class characteristics (4.64 vs. 3.34, scale 0–10, p < 0.001).
While the number of applicants increased, the use of ERAS in a new residency program did not improve the overall subjective ratings of residency applicants. Year 1 applicants received higher scores for the written components of their applications and in their potential contributions to class characteristics.
Residency application; ERAS; Subjective Ratings
Residency program directors are challenged to effectively teach and assess the Accreditation Council for Graduate Medical Education's (ACGME) 6 competencies. The purpose of this study was to characterize the morbidity and mortality (M&M) conference as a cost-effective and efficient approach for addressing the ACGME competencies through evaluation of resident participation and case diversity.
In our modified M&M conference, senior residents submit a weekly list of cases to the conference proctors. The resident presents the case, including a critique of management, using the medical literature. The resident submits a case summary evaluating patient care practices, integrating scientific evidence, and evaluating systemic barriers to care. Completed case summaries are distributed and archived for reference.
During a 3-year period, 30 residents presented 196 cases. Of these, 37 (19%) directly related to systems-based practice, 20 (10%) involved problems with inadequate communication, and 11 (6%) included issues of professionalism or ethics. All cases involved practice-based learning and medical knowledge.
The M&M conference addresses the core competencies through resident participation as well as directed analysis of diverse cases.
The Accreditation Council for Graduate Medical Education (ACGME) invokes evidence-based medicine (EBM) principles through the practice-based learning core competency. The authors hypothesized that among a representative sample of emergency medicine (EM) residency programs, a wide variability in EBM resident training priorities, faculty expertise expectations, and curricula exists.
The primary objective was to obtain descriptive data regarding EBM practices and expectations from EM physician educators. Our secondary objective was to assess differences in EBM educational priorities among journal club directors compared with non–journal club directors.
A 19-question survey was developed by a group of recognized EBM curriculum innovators and then disseminated to Council of Emergency Medicine Residency Directors (CORD) conference participants, assessing their opinions regarding essential EBM skill sets and EBM curricular expectations for residents and faculty at their home institutions. The survey instrument also identified the degree of interest respondents had in receiving a free monthly EBM journal club curriculum.
A total of 157 individuals registered for the conference, and 98 completed the survey. Seventy-seven (77% of respondents) were either residency program directors or assistant/associate program directors. The majority of participants were from university-based programs and had been in practice at least 5 years. Respondents reported the ability to identify flawed research (45%), apply research findings to patient care (43%), and comprehend research methodology (33%) as the most important resident skill sets. The majority of respondents reported having no formal journal club or EBM curricula (75%) and did not use structured critical appraisal instruments (71%) when reviewing the literature. While journal club directors believed that resident learners’ most important EBM skill is to identify secondary peer-reviewed resources, non–journal club directors identified residents’ ability to distinguish significantly flawed research as the key skill to develop. Interest in receiving a free monthly EBM journal club curriculum was widespread (89%).
Attaining EBM proficiency is an expected outcome of graduate medical education (GME) training, although the specific domains of anticipated expertise differ between faculty and residents. Few respondents currently use a formalized curriculum to guide the development of EBM skill sets. There appears to be a high level of interest in obtaining EBM journal club educational content in a structured format. Measuring the effects of providing journal club curriculum content in conjunction with other EBM interventions may warrant further investigation.
evidence-based medicine; knowledge translation; faculty development
The Accreditation Council for Graduate Medical Education (ACGME) introduced the Outcome Project in July 2001 to improve the quality of resident education through competency-based learning. The purpose of this systematic review is to explore program directors' perceptions of the challenges to implementing the ACGME Outcome Project.
We used the PubMed and Web of Science databases and bibliographies for English-language articles published between January 1, 2001, and February 17, 2012. Studies were included if they described program directors' opinions on (1) barriers encountered when attempting to implement ACGME competency-based education, and (2) assessment methods that each residency program was using to implement competency-based education. Articles meeting the inclusion criteria were screened by 2 researchers. The grading criterion was created by the authors and used to assess the quality of each study.
The survey-based data reported the opinions of 1076 program directors. Barriers encountered included (1) lack of time; (2) lack of faculty support; (3) resistance of residents to the Outcome Project; (4) insufficient funding; (5) perceived low priority for the Outcome Project; (6) inadequate salary incentive; and (7) inadequate knowledge of the competencies. Of the 6 competencies, those pertaining to patient care and medical knowledge received the most responses from program directors and were given the highest priority.
The reviewed literature revealed that time and financial constraints were the most important barriers encountered when implementing the ACGME Outcome Project.
The Accreditation Council for Graduate Medical Education (ACGME) requires pediatric residency programs to teach professionalism but does not provide concrete guidance for fulfilling these requirements. Individual programs, therefore, adopt their own methods for teaching and evaluating professionalism, and published research demonstrating how to satisfy the ACGME professionalism requirement is lacking.
We surveyed pediatric residency program directors in 2008 to explore the establishment of expectations for professional conduct, the educational experiences used to foster learning in professionalism, and the evaluation of professionalism.
Surveys were completed by 96 of 189 program directors (51%). A majority reported that new interns attend a session during which expectations for professionalism are conveyed, either verbally (93%) or in writing (65%). However, most program directors reported that “None or Few” of their residents engaged in multiple educational experiences that could foster learning in professionalism. Despite the identification of professionalism as a core competency, a minority (28%) of programs had a written curriculum in ethics or professionalism. When evaluating professionalism, the most frequently used assessment strategies were rated as “very useful” by only a modest proportion (26%–54%) of respondents.
Few programs have written curricula in professionalism, and opportunities for experiential learning in professionalism may be limited. In addition, program directors expressed only moderate satisfaction with the strategies for evaluating professionalism that were available through 2008.
The Accreditation Council for Graduate Medical Education (ACGME) uses a 29-question Resident Survey for yearly residency program assessments. This article describes methodology for aggregating Resident Survey data into 5 discrete areas of program performance for use in the accreditation process. This article also describes methodology for setting thresholds that may assist Residency Review Committees in identifying programs with potential compliance problems.
A team of ACGME staff and Residency Review Committee chairpersons reviewed the survey for content and proposed thresholds (through a modified Angoff procedure) that would indicate problematic program functioning.
Interrater agreement was high for the 5 content areas and for the threshold values (percentage of noncompliant residents), indicating that programs above these thresholds may warrant follow-up by the accrediting organization. Comparison of the Angoff procedure with the actual distribution of the data revealed that the Angoff thresholds closely approximated 1 standard deviation above the content area mean.
Data from the ACGME Resident Survey may be aggregated into internally consistent and consensually valid areas that may help Residency Review Committees make more targeted and specific judgments about program compliance.
Graduate medical education has moved towards competency-based training. The aim of this study was to assess hand surgery program directors’ opinions of exposure gaps in core competencies rated as essential for hand surgery training.
We surveyed the 74 ACGME-accredited hand surgery fellowship program directors. Respondents rated each of 9 general areas of practice, 97 knowledge topics, and 172 procedures into one of three categories: essential, exposure needed, or unnecessary. Program directors also rated trainee exposure to each component at their respective programs. Moderate and large exposure gaps were defined as at least 25% and 50% of programs, respectively, rating trainees as lacking proficiency in the component at the end of training.
Sixty-two of 74 program directors (84%) responded to the survey. For the 76 knowledge topics and 98 procedures rated as essential, a majority of the knowledge topics (61%; n = 46) and procedures (72%; n = 71) had at least a moderate exposure gap. In addition, 22% (n = 17) of the essential knowledge topics and 26% (n = 25) of the essential procedures had a large exposure gap.
This study illuminates the discrepancies between what is believed to be important for practicing hand surgeons and graduates’ proficiency as perceived by program directors. The field of hand surgery must work to determine if program directors have unrealistic expectations for what is essential for practicing hand surgeons or if reforms are needed to improve exposure to essential skills in hand surgery training.
Competencies; Competency-based training; Hand surgery fellowship; Hand surgery training; Medical knowledge; Patient care
Many have called for ambulatory training redesign in internal medicine (IM) residencies to increase primary care career outcomes. Dysfunctional clinic environments are widely believed to be a key barrier to meaningful ambulatory education, but little is actually known about the educational milieu of continuity clinics nationwide.
We wished to describe the infrastructure and educational milieu at resident continuity clinics and assess clinic readiness to meet new IM-RRC requirements.
We conducted a national survey of directors of academic and community-based continuity clinics at ACGME-accredited IM training programs.
Two hundred twenty-one of 365 clinic directors (62%), representing 49% of training programs, responded. Continuity clinics varied widely in size, structure, and educational organization. Clinics below the 25th percentile of total clinic sessions would not meet RRC-IM requirements for total number of clinic sessions. Only two-thirds of clinics provided a longitudinal mentor. Forty-three percent of directors reported that their trainees felt stressed in the clinic environment, and 25% of clinic directors felt overwhelmed.
The survey used self-reported data and was not anonymous. Larger clinics and university-based clinics were slightly overrepresented among respondents. Data may not reflect changes made to programs since 2008.
This national survey demonstrates that the continuity clinic experience varies widely across IM programs, with many sites not yet meeting new ACGME requirements. The combination of disadvantaged and ill patients, inadequately resourced clinics, stressed residents, and overwhelmed clinic directors suggests that many sites need substantial reorganization and institutional commitment. New paradigms, encouraged by ACGME requirement changes such as increased separation of inpatient and outpatient duties, are needed to improve the continuity clinic experience.
clinic; resident education; ACGME; primary care
The Accreditation Council for Graduate Medical Education (ACGME) expects programs to engage in ongoing, meaningful improvement, facilitated in part through an annual process of program assessment and improvement. The Duke University Hospital Office of Graduate Medical Education (OGME) used an institutional practice-based learning and improvement strategy to improve the annual evaluation and improvement of its programs.
The OGME implemented several strategies including the development and dissemination of a template for the report, program director and coordinator development, a reminder and tracking system, incorporation of the document into internal reviews, and use of incentives to promote program adherence.
In the first year of implementation (summer 2005), 27 programs (37%) submitted documentation of their annual program evaluation and improvement to the OGME; this increased to 100% of programs by 2009. A growing number of programs elected to use the template in lieu of written minutes. The number of citations related to required program review and improvement decreased from 12 in a single academic year to 3 over the last 5 years.
Duke University Hospital's institutional initiative to incorporate practice-based learning and improvement resulted in increased documentation, greater use of a standardized template, fewer ACGME-related citations, and enhanced consistency in preparing for ACGME site visits.
Residents are evaluated using Accreditation Council for Graduate Medical Education (ACGME) core competencies. An Objective Structured Clinical Examination (OSCE) is a potential evaluation tool to measure these competencies and provide outcome data.
Create an OSCE to evaluate and demonstrate improvement in intern core competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice before and after internship.
From 2006 to 2008, 106 interns from 10 medical specialties were evaluated with a preinternship and postinternship OSCE at Madigan Army Medical Center. The OSCE included eight 12-minute stations that collectively evaluated the 6 ACGME core competencies using human patient simulators, standardized patients, and clinical scenarios. Interns were scored using objective and subjective criteria, with a maximum score of 100 for each competency. Stations included death notification, abdominal pain, transfusion consent, suture skills, wellness history, chest pain, altered mental status, and computer literature search. These stations were chosen by specialty program directors, created with input from board-certified specialists, and were peer reviewed.
OSCE testing of the 106 interns (ages 25 to 44 [average, 28.6]; 70 [66%] men; 65 [58%] allopathic medical school graduates) demonstrated statistically significant improvement in all 6 ACGME core competencies: patient care (71.9% to 80.0%, P < .001), medical knowledge (59.6% to 78.6%, P < .001), practice-based learning and improvement (45.2% to 63.0%, P < .001), interpersonal and communication skills (77.5% to 83.1%, P < .001), professionalism (74.8% to 85.1%, P < .001), and systems-based practice (56.6% to 76.5%, P < .001).
An OSCE during internship can evaluate incoming baseline ACGME core competencies and test for interval improvement. The OSCE is a valuable assessment tool to provide outcome measures on resident competency performance and evaluate program effectiveness.