In the United States, the Accreditation Council for Graduate Medical Education (ACGME) requires all accredited internal medicine residency training programs to facilitate resident scholarly activity. However, clinical experience and medical education remain the main focus of graduate medical education in many internal medicine (IM) residency training programs. Left to design the structure, process, and outcome evaluation of the ACGME research requirement, residency training programs face numerous barriers. Many residency programs report having been cited by the ACGME Residency Review Committee for Internal Medicine for lack of scholarly activity by residents.
We would like to share our experience at Lincoln Hospital, an affiliate of Weill Medical College of Cornell University, New York, in designing and implementing a successful structured research curriculum based on ACGME competencies, taught during a dedicated "research rotation".
Since the inception of the research rotation in 2004, our residents' participation in scholarly activities has substantially increased. Our residents increasingly appreciate that research is an integral component of residency training and essential to the practice of medicine.
Internal medicine residents' outlook on research can be significantly improved by a research curriculum offered through a structured, dedicated research rotation. This is exemplified by the improvements in resident satisfaction, participation in scholarly activities, and resident research outcomes since the inception of the research rotation in our internal medicine training program.
The Accreditation Council for Graduate Medical Education (ACGME) requires an annual evaluation of all ACGME-accredited residency and fellowship programs to assess program quality. The results of this evaluation must be used to improve the program. This manuscript describes a metric to be used in conducting ACGME-mandated annual program review of ACGME-accredited anesthesiology residencies and fellowships.
A variety of metrics to assess anesthesiology residency and fellowship programs are identified by the authors through literature review and considered for use in constructing a program "report card."
Metrics used to assess program quality include success in achieving American Board of Anesthesiology (ABA) certification, performance on the annual ABA/American Society of Anesthesiologists In-Training Examination, performance on mock oral ABA certification examinations, trainee scholarly activities (publications and presentations), accreditation site visit and internal review results, ACGME and alumni survey results, National Resident Matching Program (NRMP) results, exit interview feedback, diversity data, and extensive program/rotation/faculty/curriculum evaluations by trainees and faculty. The results are used to construct a "report card" that provides a high-level review of program performance and can be used in a continuous quality improvement process.
An annual program review is required to assess all ACGME-accredited residency and fellowship programs to monitor and improve program quality. We describe an annual review process based on metrics that can be used to focus attention on areas for improvement and track program performance year-to-year. A "report card" format is described as a high-level tool to track educational outcomes.
The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
In November 2007, the American Board of Internal Medicine (ABIM) and the ACGME initiated the development of milestones for internal medicine residency training. ABIM and ACGME convened a 33-member milestones task force made up of program directors, experts in evaluation and quality, and representatives of internal medicine stakeholder organizations. This article reports on the development process and the resulting list of proposed milestones for each ACGME competency.
The task force adopted the Dreyfus model of skill acquisition as a framework for the internal medicine milestones, and calibrated the milestones with the expectation that residents achieve, at a minimum, the “competency” level in the 5-step progression by the completion of residency. The task force also developed general recommendations for strategies to evaluate the milestones.
The milestones resulting from this effort will promote competency-based resident education in internal medicine, and will allow program directors to track the progress of residents and inform decisions regarding promotion and readiness for independent practice. In addition, the milestones may guide curriculum development, suggest specific assessment strategies, provide benchmarks for resident self-directed assessment-seeking, and assist remediation by facilitating identification of specific deficits. Finally, by making explicit the profession's expectations for graduates and providing a degree of national standardization in evaluation, the milestones may improve public accountability for residency training.
Scholarly activity is a requirement for accreditation by the Accreditation Council for Graduate Medical Education. There is currently no uniform definition used by all Residency Review Committees (RRCs). A total of 6 of the 27 RRCs currently have a rubric or draft of a rubric to evaluate scholarly activity.
To develop a definition of scholarly activity and a set of rubrics to be used in program accreditation to reduce subjectivity of the evaluation of scholarly activity at the level of individual residency programs and across RRCs.
We performed a review of the pertinent literature and selected faculty promotion criteria across the United States to develop a structure for a proposed rubric of scholarly activity, drawing on work on scholarship by experts to create a definition of scholarly activity and rubrics for its assessment.
The literature review showed that academic institutions in the United States place emphasis on all 4 major components of Boyer's definition of scholarship: discovery, integration, application, and teaching. We feel that the assessment of scholarly activity should mirror these findings as set forth in our proposed rubric. Our proposed rubric is intended to ensure a more objective evaluation of these components of scholarship in accreditation reviews, and to address both expectations for scholarly pursuits for core teaching faculty and those for resident and fellow physicians.
Scholarly research is a key component of Canadian urology residency. Through comparison of scholarly performance of urology residents before residency with that achieved during residency, we aimed to elicit predictive factors for completion of research activities.
Electronic surveys were sent to 152 urology residents of 11 accredited Canadian programs. Survey questions pertained to post-graduate training year (PGY), formal education, scholarly activity completed before and after the start of residency, protected/dedicated research time, structured research curriculum and pursuit of fellowship training.
Surveys were completed by 42 residents from 10 programs. Only 26% of residents had a structured research curriculum, 38% a dedicated research rotation, and 43% protected research time. We found that 45% of residents had published at least 1 manuscript so far during residency (mean 1.14 ± 0.32), and 43% had submitted at least 1 manuscript (mean 0.86 ± 0.25). During residency, 62% of residents completed ≥1 formal research presentation (median 1.5; range: 0-≥10). Only the level of PGY significantly affected the number of manuscripts published (p < 0.001) and the number of formal research presentations (p < 0.001) completed during residency. The 86% of residents planning to pursue fellowship training had a mean of 1.25 ± 0.37 publications and 2.25 ± 0.54 presentations during residency.
Level of PGY significantly affected quantitative scholarly activity, but the numbers and types of presentations performed prior to residency, completion of an honours or graduate degree and plans to pursue fellowship training did not.
Scholarly activity as a component of residency education is becoming increasingly emphasized by the Accreditation Council for Graduate Medical Education. “Limited or no evidence of resident or faculty scholarly activity” is a common citation given to family medicine residency programs by the Review Committee for Family Medicine.
The objective was to provide a model scholarly activity curriculum that has been successful in improving the quality of graduate medical education in a family medicine residency program, as evidenced by a record of resident academic presentations and publications.
We provide a description of the Clinical Scholars Program that has been implemented into the curriculum of the Trident/Medical University of South Carolina Family Medicine Residency Program.
During the most recent 10-year academic period (2000–2010), a total of 111 residents completed training and participated in the Clinical Scholars Program. This program has produced more than 24 presentations during national and international meetings of medical societies and 15 publications in peer-reviewed medical journals. In addition, many of the projects have been presented during meetings of state and regional medical organizations.
This paper presents a model curriculum for teaching about scholarship to family medicine residents. The success of this program is evidenced by the numerous presentations and publications by participating residents.
Professional organizations have called for individualized training approaches, as well as for opportunities for resident scholarship, to ensure that internal medicine residents have sufficient knowledge and experience to make informed career choices.
Context and Purpose
To address these training issues within the University of California, San Francisco, internal medicine program, we created the Areas of Distinction (AoD) program to supplement regular clinical duties with specialized curricula designed to engage residents in clinical research, global health, health equities, medical education, molecular medicine, or physician leadership. We describe our AoD program and present this initiative's evaluation data.
Methods and Program Evaluation
We evaluated features of our AoD program, including program enrollment, resident satisfaction, recruitment surveys, quantity of scholarly products, and our residents' certifying examination scores. Finally, we described the costs of implementing and maintaining the AoDs.
AoD enrollment increased from 81% to 98% during the past 5 years. Both quantitative and qualitative data demonstrated a positive effect on recruitment and improved resident satisfaction with the program, and the number and breadth of scholarly presentations have increased without an adverse effect on our board certification pass rate.
The AoD system led to favorable outcomes in the domains of resident recruitment, satisfaction, scholarship, and board performance. Our intervention showed that residents can successfully obtain clinical training while engaging in specialized education beyond the bounds of core medicine training. Nurturing these interests may empower residents to better shape their careers by providing earlier insight into internist roles that transcend classic internal medicine training.
The Accreditation Council for Graduate Medical Education requires scholarly activity for both faculty and residents in obstetrics and gynecology (Ob-Gyn). There is little evidence on the most effective method to train, recruit, and retain research faculty who can mentor resident researchers at small programs.
To address this problem, we created the “Baby Steps” program for a small university-based Ob-Gyn program.
After a thorough assessment of existing resources, a postdoctoral researcher was recruited and coupled with an established researcher to raise the standards of resident research, facilitate and coordinate resident projects, and support clinical faculty participation in research activities. Grant submissions, grants awarded, publications submitted, presentations, and awards were tracked before and after the implementation of the Baby Steps program for faculty and residents.
After 2 years the program has already begun to show an increase in scholarly activity. In a program of 12 residents, 8 made one or more presentations at regional or national meetings within the previous 24 months. Additionally, 8 of 12 clinical faculty members were engaged as mentors in resident research, compared with only 3 in past years. Further, abstract, paper, and grant submissions by faculty increased approximately 25%.
The addition of a mentored postdoctoral researcher was associated with improvements to both resident and faculty research activities. Based on this success, a sister residency program has incorporated the Baby Steps approach into its training.
The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging.
We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning.
We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure.
An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and facilitate documentation and evaluation of the process. During a 24-month period, 27 programs completed PBLI projects, and 15 reviewed the template with their education committees but have not yet initiated projects using it.
The development of the tool generated support from program leaders because it enhanced their ability to meet program-specific objectives. The document's peer-protected status, conferring confidentiality and protection from discovery, has encouraged program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important for meeting accreditation requirements at the program and institutional levels.
The Accreditation Council for Graduate Medical Education (ACGME) requires pediatric residency programs to teach professionalism but does not provide concrete guidance for fulfilling these requirements. Individual programs, therefore, adopt their own methods for teaching and evaluating professionalism, and published research demonstrating how to satisfy the ACGME professionalism requirement is lacking.
We surveyed pediatric residency program directors in 2008 to explore the establishment of expectations for professional conduct, the educational experiences used to foster learning in professionalism, and the evaluation of professionalism.
Surveys were completed by 96 of 189 program directors (51%). A majority reported that new interns attend a session during which expectations for professionalism are conveyed, either verbally (93%) or in writing (65%). However, most program directors reported that “None or Few” of their residents engaged in multiple educational experiences that could foster learning in professionalism. Despite the identification of professionalism as a core competency, a minority (28%) of programs had a written curriculum in ethics or professionalism. When evaluating professionalism, the most frequently used assessment strategies were rated as “very useful” by only a modest proportion (26%–54%) of respondents.
Few programs have written curricula in professionalism, and opportunities for experiential learning in professionalism may be limited. In addition, program directors expressed only moderate satisfaction with the strategies for evaluating professionalism available through 2008.
There are no nationwide data on the methods residency programs are using to assess trainee competence. The Accreditation Council for Graduate Medical Education (ACGME) has recommended tools that programs can use to evaluate their trainees. It is unknown if programs are adhering to these recommendations.
To describe evaluation methods used by our nation’s internal medicine residency programs and assess adherence to ACGME methodological recommendations for evaluation.
All internal medicine programs registered with the Association of Program Directors of Internal Medicine (APDIM).
Descriptive statistics of programs and tools used to evaluate competence; compliance with ACGME recommended evaluative methods.
The response rate was 70%. Programs were using an average of 4.2–6.0 tools to evaluate their trainees, with heavy reliance on rating forms. Direct observation and practice- and data-based tools were used much less frequently. Most programs were using at least 1 of the ACGME's “most desirable” methods of evaluation for all 6 measures of trainee competence. These programs had higher support staff-to-resident ratios than programs using less desirable evaluative methods.
Residency programs are using a large number and variety of tools for evaluating the competence of their trainees. Most are complying with ACGME-recommended methods of evaluation, especially those with a high support staff-to-resident ratio.
graduate medical education; residency; ACGME; competency
The American Board of Internal Medicine Certification Examination (ABIM-CE) is one of several methods used to assess medical knowledge, an Accreditation Council for Graduate Medical Education (ACGME) core competency for graduating internal medicine residents. With recent changes in graduate medical education, program directors and internal medicine residents are seeking evidence to guide decisions regarding residency elective choices. Prior studies have shown that formalized elective curricula improve subspecialty ABIM-CE scores. The primary aim of this study was to evaluate whether the number of subspecialty elective exposures, or the specific subspecialties in which residents complete electives, impacts ABIM-CE scores.
ABIM-CE scores, elective exposures, and demographic characteristics were collected for MedStar Georgetown University Hospital internal medicine residents who were first-time takers of the ABIM-CE in 2006–2010 (n=152). Elective exposures were defined as a two-week period assigned to the respective subspecialty. ABIM-CE score was analyzed using the difference between the ABIM-CE score and the standardized passing score (delta-SPS). Subspecialty scores were analyzed using percentage of correct responses. Data were analyzed using GraphPad Prism version 5.00 for Windows.
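The delta-SPS metric is simple arithmetic: the examinee's score minus the standardized passing score. A minimal sketch of the calculation, using hypothetical scores and a hypothetical passing score rather than the study's data:

```python
# Illustrative sketch of the delta-SPS metric: the difference between an
# examinee's ABIM-CE score and the standardized passing score (SPS).
# All numbers below are hypothetical, not the study's data.

def delta_sps(score: float, standardized_passing_score: float) -> float:
    """Margin by which a score exceeds (or falls short of) the passing standard."""
    return score - standardized_passing_score

SPS = 366                      # hypothetical passing score, for illustration only
scores = [500, 480, 510, 495]  # hypothetical examinee scores

deltas = [delta_sps(s, SPS) for s in scores]
mean_delta = sum(deltas) / len(deltas)
print(deltas)      # [134, 114, 144, 129]
print(mean_delta)  # 130.25
```

Group comparisons such as "≤14 versus ≥15 elective exposures" then reduce to comparing mean delta-SPS between the two groups.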
Paired elective exposure and ABIM-CE scores were available in 131 residents. There was no linear correlation between ABIM-CE mean delta-SPS and the total number of electives or the number of unique elective exposures. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures (143.4 compared to 129.7, p=0.051). Repeated electives in individual subspecialties were not associated with significant difference in mean ABIM-CE delta-SPS.
This study did not demonstrate significant positive associations between individual subspecialty elective exposures and ABIM-CE mean delta-SPS score. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures suggesting there may be an “ideal” number of elective exposures that supports improved ABIM-CE performance. Repeated elective exposures in an individual specialty did not correlate with overall or subspecialty ABIM-CE performance.
Resident education; Gender; Elective; Subspecialty; Graduate medical education
Because of the Accreditation Council for Graduate Medical Education (ACGME) and Residency Review Committee (RRC) approval timelines, new residency programs cannot use the Electronic Residency Application Service (ERAS) during their first year of accepting applicants.
We sought to identify differences between program directors’ subjective ratings of applicants from an emergency medicine (EM) residency program’s first year (in which ERAS was not used) to their ratings of applicants the following year in which ERAS was used.
The University of Utah Emergency Medicine Residency Program received approval from the ACGME in 2004. Applicants for the entering class of 2005 (year 1) did not use ERAS, submitting a separate application, while those applying for the following year (year 2) used ERAS. Residency program directors rated applicants using subjective components of their applications, assigning scores on scales from 0–10 or 0–5 (10 or 5 = highest score) for select components of the application. We retrospectively reviewed and compared these ratings between the 2 years of applicants.
A total of 130 and 458 prospective residents applied during year 1 and year 2, respectively. Applicants were similar in average scores for research (1.65 vs. 1.81, scale 0–5, p = 0.329) and volunteer work (5.31 vs. 5.56, scale 0–10, p = 0.357). Year 1 applicants received higher scores for their personal statement (3.21 vs. 2.22, scale 0–5, p < 0.001), letters of recommendation (7.0 vs. 5.94, scale 0–10, p < 0.001), dean’s letter (3.5 vs. 2.7, scale 1–5, p < 0.001), and in their potential contribution to class characteristics (4.64 vs. 3.34, scale 0–10, p < 0.001).
While the number of applicants increased, the use of ERAS in a new residency program did not improve the overall subjective ratings of residency applicants. Year 1 applicants received higher scores for the written components of their applications and in their potential contributions to class characteristics.
Residency application; ERAS; Subjective Ratings
Physician workforce projections by the Institute of Medicine require enhanced training in geriatrics for all primary care and subspecialty physicians. Defining essential geriatrics competencies for internal medicine and family medicine residents would improve training for primary care and subspecialty physicians. The objectives of this study were to (1) define essential geriatrics competencies common to internal medicine and family medicine residents that build on established national geriatrics competencies for medical students, are feasible within current residency programs, are assessable, and address the Accreditation Council for Graduate Medical Education competencies; and (2) involve key stakeholder organizations in their development and implementation.
Initial candidate competencies were defined through small group meetings and a survey of more than 100 experts, followed by detailed item review by 26 program directors and residency clinical educators from key professional organizations. Throughout, an 8-member working group made revisions to maintain consistency and compatibility among the competencies. Support and participation by key stakeholder organizations were secured throughout the project.
The process identified 26 competencies in 7 domains: Medication Management; Cognitive, Affective, and Behavioral Health; Complex or Chronic Illness(es) in Older Adults; Palliative and End-of-Life Care; Hospital Patient Safety; Transitions of Care; and Ambulatory Care. The competencies map directly onto the medical student geriatric competencies and the 6 Accreditation Council for Graduate Medical Education Competencies.
Through a consensus-building process that included leadership and members of key stakeholder organizations, a concise set of essential geriatrics competencies for internal medicine and family medicine residencies has been developed. These competencies are well aligned with concerns for residency training raised in a recent Medicare Payment Advisory Commission report to Congress. Work is underway through stakeholder organizations to disseminate and assess the competencies among internal medicine and family medicine residency programs.
The Accreditation Council for Graduate Medical Education (ACGME) requires physicians in training to be educated in 6 competencies considered important for independent medical practice. There is little information about the experiences that residents feel contribute most to the acquisition of the competencies.
To understand how residents perceive their learning of the ACGME competencies and to determine which educational activities were most helpful in acquiring these competencies.
A web-based survey created by the graduate medical education office for institutional program monitoring and evaluation was sent to all residents in ACGME-accredited programs at the David Geffen School of Medicine, University of California-Los Angeles, from 2007 to 2010. Residents responded to questions about the adequacy of their learning for each of the 6 competencies and which learning activities were most helpful in competency acquisition.
We analyzed 1378 responses collected from postgraduate year-1 (PGY-1) to PGY-3 residents in 12 different residency programs, surveyed between 2007 and 2010. The overall response rate varied by year (66%–82%). Most residents (80%–97%) stated that their learning of the 6 ACGME competencies was “adequate.” Patient care activities and observation of attending physicians and peers were listed as the 2 most helpful learning activities for acquiring the 6 competencies.
Our findings reinforce the importance of learning from role models during patient care activities and the heterogeneity of learning activities needed for acquiring all 6 competencies.
OBJECTIVE: To describe the views of residency program directors regarding the effect of the 2010 duty hour recommendations on the 6 core competencies of graduate medical education.
METHODS: US residency program directors in internal medicine, pediatrics, and general surgery were e-mailed a survey from July 8 through July 20, 2010, after the 2010 Accreditation Council for Graduate Medical Education (ACGME) duty hour recommendations were published. Directors were asked to rate the implications of the new recommendations for the 6 ACGME core competencies as well as for continuity of inpatient care and resident fatigue.
RESULTS: Of 719 eligible program directors, 464 (65%) responded. Most program directors believe that the new ACGME recommendations will decrease residents' continuity with hospitalized patients (404/464 [87%]) and will not change (303/464 [65%]) or will increase (26/464 [6%]) resident fatigue. Additionally, most program directors (249-363/464 [53%-78%]) believe that the new duty hour restrictions will decrease residents' ability to develop competency in 5 of the 6 core areas. Surgery directors were more likely than internal medicine directors to believe that the ACGME recommendations will decrease residents' competency in patient care (odds ratio [OR], 3.9; 95% confidence interval [CI], 2.5-6.3), medical knowledge (OR, 1.9; 95% CI, 1.2-3.2), practice-based learning and improvement (OR, 2.7; 95% CI, 1.7-4.4), interpersonal and communication skills (OR, 1.9; 95% CI, 1.2-3.0), and professionalism (OR, 2.5; 95% CI, 1.5-4.0).
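The odds ratios and 95% confidence intervals reported above come from 2×2 comparisons of survey responses between specialties. A minimal sketch of the standard Wald calculation, with made-up counts rather than the study's data:

```python
import math

# Hedged sketch of a standard odds-ratio calculation with a Wald 95%
# confidence interval from a 2x2 table, the kind of comparison reported
# above. The counts are made up for illustration, not the study's data.

def odds_ratio_ci(a: int, b: int, c: int, d: int) -> tuple:
    """Rows: group 1 (a agree, b disagree), group 2 (c agree, d disagree)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# e.g. 60/80 directors in one specialty vs. 120/300 in another agreeing
or_, lo, hi = odds_ratio_ci(60, 20, 120, 180)
print(f"OR {or_:.1f} (95% CI {lo:.1f}-{hi:.1f})")  # OR 4.5 (95% CI 2.6-7.8)
```

A CI that excludes 1.0, as in the comparisons above, indicates a statistically significant difference between the two groups of directors.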
CONCLUSION: Residency program directors' reactions to ACGME duty hour recommendations demonstrate a marked degree of concern about educating a competent generation of future physicians in the face of increasing duty hour standards and regulation.
The Accreditation Council for Graduate Medical Education (ACGME) requires that training programs integrate system-based practice (SBP) and practice-based learning and improvement (PBLI) into internal medicine residency curricula.
Context and setting
We instituted a seminar series and a year-long mentored curriculum designed to engage internal medicine residents in these competencies.
Residents participate in a seminar series that includes assigned reading and structured discussion with faculty who assist in the development of quality improvement or research projects. Residents pursue projects over the remainder of the year. Monthly works in progress meetings, protected time for inquiry, and continued faculty mentorship guide the residents in their project development. Trainees present their work at hospital-wide grand rounds at the end of the academic year. We performed a survey of residents to assess their self-reported knowledge, attitudes and skills in SBP and PBLI. In addition, blinded faculty scored projects for appropriateness, impact, and feasibility.
We measured resident self-reported knowledge, attitudes, and skills at the end of the academic year. We found evidence that participants improved their understanding of the context in which they were practicing, and that their ability to engage in quality improvement projects increased. Blinded faculty reviewers favorably ranked the projects’ feasibility, impact, and appropriateness. The ‘Curriculum of Inquiry’ generated 11 quality improvement and research projects during the study period. Barriers to the ongoing work include a limited supply of mentors and delays due to Institutional Review Board approval. Hospital leadership recognizes the importance of the curriculum, and our accreditation manager now cites our ongoing work.
A structured residency-based curriculum facilitates resident demonstration of SBP and PBLI. Residents gain knowledge and skills through this enterprise, and hospitals gain access to trainees who help solve ongoing problems and meet accreditation requirements.
graduate medical education; competencies; longitudinal curriculum
Many have called for ambulatory training redesign in internal medicine (IM) residencies to increase primary care career outcomes. Dysfunctional clinic environments are widely believed to be a key barrier to meaningful ambulatory education, but little is actually known about the educational milieu of continuity clinics nationwide.
We wished to describe the infrastructure and educational milieu at resident continuity clinics and assess clinic readiness to meet new IM-RRC requirements.
National survey of ACGME accredited IM training programs.
Directors of academic and community-based continuity clinics.
Two hundred twenty-one of 365 (62%) clinic directors, representing 49% of training programs, responded. Wide variation exists among continuity clinics in size, structure, and educational organization. Clinics below the 25th percentile of total clinic sessions would not meet RRC-IM requirements for total number of clinic sessions. Only two-thirds of clinics provided a longitudinal mentor. Forty-three percent of directors reported that their trainees felt stressed in the clinic environment, and 25% of clinic directors felt overwhelmed.
The survey used self-reported data and was not anonymous. A slight predominance of larger clinics and university-based clinics responded. Data may not reflect changes to programs made since 2008.
This national survey demonstrates that the continuity clinic experience varies widely across IM programs, with many sites not yet meeting new ACGME requirements. The combination of disadvantaged and ill patients, inadequately resourced clinics, stressed residents, and overwhelmed clinic directors suggests that many sites need substantial reorganization and institutional commitment. New paradigms, encouraged by ACGME requirement changes such as increased separation of inpatient and outpatient duties, are needed to improve the continuity clinic experience.
clinic; resident education; ACGME; primary care
The Accreditation Council for Graduate Medical Education (ACGME) invokes evidence-based medicine (EBM) principles through the practice-based learning core competency. The authors hypothesized that among a representative sample of emergency medicine (EM) residency programs, a wide variability in EBM resident training priorities, faculty expertise expectations, and curricula exists.
The primary objective was to obtain descriptive data regarding EBM practices and expectations from EM physician educators. Our secondary objective was to assess differences in EBM educational priorities among journal club directors compared with non–journal club directors.
A 19-question survey was developed by a group of recognized EBM curriculum innovators and then disseminated to Council of Emergency Medicine Residency Directors (CORD) conference participants, assessing their opinions regarding essential EBM skill sets and EBM curricular expectations for residents and faculty at their home institutions. The survey instrument also identified the degree of interest respondents had in receiving a free monthly EBM journal club curriculum.
A total of 157 individuals registered for the conference, and 98 completed the survey. Seventy-seven (77% of respondents) were either residency program directors or assistant/associate program directors. The majority of participants were from university-based programs and in practice at least 5 years. Respondents reported the ability to identify flawed research (45%), apply research findings to patient care (43%), and comprehend research methodology (33%) as the most important resident skill sets. The majority of respondents reported no formal journal club or EBM curricula (75%) and do not utilize structured critical appraisal instruments (71%) when reviewing the literature. While journal club directors believed that resident learners’ most important EBM skill is to identify secondary peer-reviewed resources, non–journal club directors identified residents’ ability to distinguish significantly flawed research as the key skill to develop. Interest in receiving a free monthly EBM journal club curriculum was high (89%).
Attaining EBM proficiency is an expected outcome of graduate medical education (GME) training, although the specific domains of anticipated expertise differ between faculty and residents. Few respondents currently use a formalized curriculum to guide the development of EBM skill sets. There appears to be a high level of interest in obtaining EBM journal club educational content in a structured format. Measuring the effects of providing journal club curriculum content in conjunction with other EBM interventions may warrant further investigation.
evidence-based medicine; knowledge translation; faculty development
An important expectation of pediatric education is that residents learn to assess, resuscitate, and stabilize ill or injured children.
To determine whether the Accreditation Council for Graduate Medical Education (ACGME) minimum time requirement for emergency and acute illness experience is adequate to achieve the educational objectives set forth for categorical pediatric residents. We hypothesized that despite residents working five 1-month block rotations in a high-volume (95,000 pediatric visits per year) pediatric emergency department (ED), the comprehensive experience outlined by the ACGME would not be satisfied through clinical exposure.
This was a retrospective, descriptive study comparing actual resident experience to the standard defined by the ACGME. The emergency medicine experience of 35 categorical pediatric residents was tracked including number of patients evaluated during training and patient discharge diagnoses. The achievability of the ACGME requirement was determined by reporting the percentage of pediatric residents that cared for at least 1 patient from each of the ACGME-required disorder categories.
A total of 11.4% of residents met the ACGME requirement for emergency and acute illness experience in the ED. The median number of patients evaluated by residents during training in the ED was 941. Disorder categories evaluated least frequently included shock, sepsis, diabetic ketoacidosis, coma/altered mental status, cardiopulmonary arrest, burns, and bowel obstruction.
Pediatric residents working in one of the busiest pediatric EDs in the country and working 1 month more than the ACGME-recommended minimum did not achieve the ACGME requirement for emergency and acute illness experience through direct patient care.
The Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee require a competency-based, accessible evaluation system. The paper system at our institution did not meet these demands and suffered from low compliance. A diverse committee of internal medicine faculty, program directors, and house staff designed a new clinical evaluation strategy based on ACGME competencies and utilizing a modular web-based system called ResEval. ResEval more effectively met requirements and provided useful data for program and curriculum development. The system is paperless, allows for evaluations at any time, and produces customized evaluation reports, dramatically improving our ability to analyze evaluation data. The use of this novel system and the inclusion of a robust technology infrastructure, repeated training and e-mail reminders, and program leadership commitment resulted in an increase in clinical skills evaluations performed and a rapid change in the workflow and culture of evaluation at our residency program.
house staff evaluation; clinical skills assessment; internal medicine residency; Internet; educational measurement
International medical graduates (IMGs) constitute about a third of United States (US) internal medicine graduates. US residency training programs face challenges in selection of IMGs with varied background features. However, data on this topic are limited. We analyzed whether any pre-selection characteristics of IMG residents in our internal medicine program are associated with selected outcomes, namely competency-based evaluation, examination performance, and success in acquiring fellowship positions after graduation.
We conducted a retrospective study of 51 IMGs at our ACGME-accredited teaching institution between 2004 and 2007. Background resident features, namely age, gender, self-reported ethnicity, time between medical school graduation and residency (pre-hire time), USMLE Step I and Step II clinical skills scores, pre-GME clinical experience, US externship, and interest in pursuing fellowship after graduation expressed in their personal statements, were noted. Data on competency-based evaluations, in-service exam scores, research presentations and publications, and fellowship pursuance were collected. There were no fellowships offered in our hospital in this study period. Background features were compared between resident groups according to the following outcomes: (a) annual aggregate graduate PGY-level-specific competency-based evaluation (CBE) score above versus below the median score within our program (scoring scale of 1–10), (b) US graduate PGY-level-specific resident in-training exam (ITE) score higher versus lower than the median score, and (c) those who succeeded in securing a fellowship within the study period. Using appropriate statistical tests and adjusted regression analysis, odds ratios with 95% confidence intervals were calculated.
Ninety-four percent of the study sample were IMGs; the median age was 35 years (inter-quartile range, 25th–75th percentile (IQR): 33–37 years); 43% were women and 59% were Asian physicians. The median pre-hire time was 5 years (IQR: 4–7 years), and USMLE Step I and Step II clinical skills scores were 85 (IQR: 80–88) and 82 (IQR: 79–87), respectively. The median aggregate CBE scores during training were: PG1 5.8 (IQR: 5.6–6.3); PG2 6.3 (IQR: 6–6.8); and PG3 6.7 (IQR: 6.7–7.1). Twenty-five percent of our residents scored consistently above US national median ITE scores in all 3 years of training, and 16% pursued a fellowship.
Younger residents had higher aggregate annual CBE scores than the program median (p < 0.05). Higher USMLE scores were associated with higher than US median ITE scores, reflecting exam-taking skills. Success in acquiring a fellowship was associated with consistent fellowship interest (p < 0.05) and research publications or presentations (p < 0.05). None of the other characteristics, including visa status, were associated with the outcomes.
Background IMG features, namely age and USMLE scores, predict performance evaluation and in-training examination scores during residency training. In addition, enhanced research activities during residency training could facilitate fellowship goals among interested IMGs.
Programmatic changes for the dermatology residency program at The University of Texas Medical Branch were first introduced in 2005, with the faculty goal of incorporating formal dermatology research projects into the 3-year postgraduate training period. This curriculum initially developed as a recommendation for voluntary scholarly project activity by residents, but it evolved into a program requirement for all residents in 2009. Departmental support for this activity includes assignment of a faculty mentor with a similar interest in the research topic; departmental financial support for needed supplies and materials; statistical consultation with the Office of Biostatistics for study design and data analysis; a 2-week elective that provides protected time from clinical activities for the purpose of preparing research for publication and submission to a peer-reviewed medical journal; and a departmental award recognizing the best resident scholarly project each year. Since the inception of this program, five classes have graduated a total of 16 residents. Ten residents submitted their research studies for peer review and published their scholarly projects in seven dermatology journals through the current academic year. These articles included three prospective investigations, three surveys, one article related to dermatology education, one retrospective chart review, one case series, and one article about dermatopathology. An additional article from a 2012 graduate about dermatology education has also been submitted to a journal. This new program for residents was adapted from our historically successful Dermatology Honors Research Program for medical students at The University of Texas Medical Branch. Our experience with this academic initiative to promote dermatology research by residents is outlined. It is recommended that additional residency programs consider adopting similar research programs to enrich resident education.
dermatology; resident; research; education; accreditation
The Ministry of Health, Labour and Welfare of Japan has been promoting participation in scholarly activities for physicians during residency training. However, there is debate regarding whether this is worthwhile for residents.
To evaluate residents’ opinions of engaging in scholarly activities and identify factors associated with overall satisfaction with their training program.
Cross-sectional national survey.
1,124 second-year residents in teaching hospitals in Japan in 2007.
Collected data included demographics, teaching hospital characteristics and resources, residents’ research experiences, including type of activities, barriers to performing scholarly activities, residents’ opinions of scholarly requirements, and resident satisfaction with their residency program.
1,124 of 1,500 residents responded, for a response rate of 74.9%. Our data showed that 60.2% of Japanese residents engaged in some type of scholarly activity. Barriers included: “No resident time”; “No mentor”; and “No resident interest.” Sixty-three percent of residents thought that research should be a residency requirement. In multivariate logistic analysis, residents’ overall satisfaction with their residency program was significantly associated with participation in research activity (odds ratio (OR), 1.5; 95% confidence interval (CI), 1.1–2.1); male gender (OR, 1.5; 95% CI: 1.1–2.2); satisfaction with residency compensation (OR, 3.8; 95% CI, 2.6–5.0); and satisfaction with the residency curriculum (OR, 19.5; 95% CI, 13.7–27.7).
The majority of residents surveyed thought that research activity was worthwhile. Residents’ participation in research activity was associated with higher levels of satisfaction with residency training. Implementing measures to overcome existing barriers may have educational benefits for residents.
residency; clinical research; job satisfaction; medical education; Japan
The Accreditation Council for Graduate Medical Education (ACGME) requirements stipulate that psychiatry residents need to be educated in the area of emergency psychiatry. Existing research investigating the current state of this training is limited, and no research to date has assessed whether the ACGME Residency Review Committee requirements for psychiatry residency training are followed by psychiatry residency training programs.
We administered a 24‐item paper survey to chief resident attendees of a national leadership conference, assessing the types and amount of emergency psychiatry training provided by their psychiatric residency training programs. Descriptive statistics were used in the analysis.
Of 154 surveys distributed, 111 were returned (72% response rate). Nearly one‐third of chief resident respondents indicated that more than 50% of their program's emergency psychiatry training was provided during on‐call periods. A minority indicated that they were aware of the ACGME program requirements for emergency psychiatry training. While many programs provided emergency psychiatry training through rotations separate from the on‐call period, direct supervision was available during on‐call training only about one‐third of the time.
The findings suggest that about one‐third of psychiatry residency training programs do not adhere to the ACGME standards for emergency psychiatry training. Greater knowledge of the ACGME requirements may enhance psychiatry residents' understanding of how their programs are fulfilling the need for more emergency psychiatry training. Settings other than the on‐call period are more likely to provide direct supervision for emergency psychiatry training.