The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging.
We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning.
We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure.
An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and facilitate documentation and evaluation of the process. During a 24-month period, 27 programs completed PBLI projects, and 15 reviewed the template with their education committees but have not initiated projects using the template.
The development of the tool generated program leaders' support because the tool enhanced the ability to meet program-specific objectives. The peer-protected status of this document for confidentiality and from discovery has been beneficial for program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important to meeting accreditation requirements at the program and institutional level.
The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
In November 2007, the American Board of Internal Medicine (ABIM) and the ACGME initiated the development of milestones for internal medicine residency training. ABIM and ACGME convened a 33-member milestones task force made up of program directors, experts in evaluation and quality, and representatives of internal medicine stakeholder organizations. This article reports on the development process and the resulting list of proposed milestones for each ACGME competency.
The task force adopted the Dreyfus model of skill acquisition as a framework for the internal medicine milestones, and calibrated the milestones with the expectation that residents achieve, at a minimum, the “competency” level in the 5-step progression by the completion of residency. The task force also developed general recommendations for strategies to evaluate the milestones.
The milestones resulting from this effort will promote competency-based resident education in internal medicine, and will allow program directors to track the progress of residents and inform decisions regarding promotion and readiness for independent practice. In addition, the milestones may guide curriculum development, suggest specific assessment strategies, provide benchmarks for resident self-directed assessment-seeking, and assist remediation by facilitating identification of specific deficits. Finally, by making explicit the profession's expectations for graduates and providing a degree of national standardization in evaluation, the milestones may improve public accountability for residency training.
The Accreditation Council for Graduate Medical Education (ACGME) requires an annual evaluation of all ACGME-accredited residency and fellowship programs to assess program quality. The results of this evaluation must be used to improve the program. This manuscript describes a metric to be used in conducting ACGME-mandated annual program review of ACGME-accredited anesthesiology residencies and fellowships.
A variety of metrics to assess anesthesiology residency and fellowship programs are identified by the authors through literature review and considered for use in constructing a program "report card."
Metrics used to assess program quality include success in achieving American Board of Anesthesiology (ABA) certification, performance on the annual ABA/American Society of Anesthesiology In-Training Examination, performance on mock oral ABA certification examinations, trainee scholarly activities (publications and presentations), accreditation site visit and internal review results, ACGME and alumni survey results, National Resident Matching Program (NRMP) results, exit interview feedback, diversity data and extensive program/rotation/faculty/curriculum evaluations by trainees and faculty. The results are used to construct a "report card" that provides a high-level review of program performance and can be used in a continuous quality improvement process.
An annual program review is required to assess all ACGME-accredited residency and fellowship programs to monitor and improve program quality. We describe an annual review process based on metrics that can be used to focus attention on areas for improvement and track program performance year-to-year. A "report card" format is described as a high-level tool to track educational outcomes.
Standard curricula to teach Internal Medicine residents about quality assessment and improvement, important components of the Accreditation Council for Graduate Medical Education core competencies of practice-based learning and improvement (PBLI) and systems-based practice (SBP), have not been easily accessible.
Using the American Board of Internal Medicine’s (ABIM) Clinical Preventive Services Practice Improvement Module (CPS PIM), we have incorporated a longitudinal quality assessment and improvement curriculum (QAIC) into the 2 required 1-month ambulatory rotations during the postgraduate year 2. During the first block, residents complete the PIM chart reviews and the patient and system surveys. The second block includes resident reflection using PIM data and a group-performed small test of change using the Plan–Do–Study–Act (PDSA) cycle in the resident continuity clinic.
To date, 3 resident quality improvement (QI) projects have been undertaken as a result of QAIC, each making significant improvements in the residents’ continuity clinic. Resident confidence levels in QI skills (e.g., writing an aim statement [71% to 96%, P < .01] and using a PDSA cycle [9% to 89%, P < .001]) improved significantly.
The ABIM CPS PIM can be used by Internal Medicine residency programs to introduce QI concepts into their residents’ outpatient practice through encouraging practice-based learning and improvement and systems-based practice.
Internal Medicine residents; quality improvement; practice-based learning and improvement; systems-based practice; practice improvement module
Residents are evaluated using Accreditation Council for Graduate Medical Education (ACGME) core competencies. An Objective Structured Clinical Examination (OSCE) is a potential evaluation tool to measure these competencies and provide outcome data.
Create an OSCE to evaluate and demonstrate improvement in intern core competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice before and after internship.
From 2006 to 2008, 106 interns from 10 medical specialties were evaluated with a preinternship and postinternship OSCE at Madigan Army Medical Center. The OSCE included eight 12-minute stations that collectively evaluated the 6 ACGME core competencies using human patient simulators, standardized patients, and clinical scenarios. Interns were scored using objective and subjective criteria, with a maximum score of 100 for each competency. Stations included death notification, abdominal pain, transfusion consent, suture skills, wellness history, chest pain, altered mental status, and computer literature search. These stations were chosen by specialty program directors, created with input from board-certified specialists, and were peer reviewed.
All OSCE testing on the 106 interns (ages 25 to 44 [average, 28.6]; 70 [66%] men; 65 [58%] allopathic medical school graduates) resulted in statistically significant improvement in all ACGME core competencies: patient care (71.9% to 80.0%, P < .001), medical knowledge (59.6% to 78.6%, P < .001), practice-based learning and improvement (45.2% to 63.0%, P < .001), interpersonal and communication skills (77.5% to 83.1%, P < .001), professionalism (74.8% to 85.1%, P < .001), and systems-based practice (56.6% to 76.5%, P < .001).
An OSCE during internship can evaluate incoming baseline ACGME core competencies and test for interval improvement. The OSCE is a valuable assessment tool to provide outcome measures on resident competency performance and evaluate program effectiveness.
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project began to focus on resident performance in the 6 competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. Beginning in 2007, the ACGME began collecting information on how programs assess these competencies. This report provides information on the nature and extent of those assessments.
Using data collected by the ACGME for site visits, we use descriptive statistics and percentages to describe the number and type of methods and assessors accredited programs (n = 4417) report using to assess the competencies. Observed differences among specialties, methodologies, and assessors are tested with analysis of variance procedures.
Almost all (>97%) of programs report assessing all of the competencies and using multiple methods and multiple assessors. Similar assessment methods and evaluator types were consistently used across the 6 competencies. However, there were some differences in the use of patient and family as assessors: Primary care and ambulatory specialties used these to a greater extent than other specialties.
Residency programs are emphasizing the competencies in their evaluation of residents. Understanding the scope of evaluation methodologies that programs use in resident assessment is important for both the profession and the public, so that together we may monitor continuing improvement in US graduate medical education.
In the United States, the Accreditation Council for Graduate Medical Education (ACGME) requires all accredited Internal Medicine residency training programs to facilitate resident scholarly activities. However, clinical experience and medical education remain the main focus of graduate medical education in many Internal Medicine (IM) residency training programs. Left to design the structure, process, and outcome evaluation of the ACGME research requirement, residency training programs face numerous barriers. Many residency programs report having been cited by the ACGME residency review committee in IM for lack of scholarly activity by residents.
We would like to share our experience at Lincoln Hospital, an affiliate of Weill Medical College of Cornell University, New York, in designing and implementing a successful structured research curriculum based on ACGME competencies taught during a dedicated "research rotation".
Since the inception of the research rotation in 2004, participation of our residents in scholarly activities has substantially increased. Our residents increasingly appreciate that research is an integral component of residency training and essential for the practice of medicine.
Internal medicine residents' outlook on research can be significantly improved by a research curriculum offered through a structured and dedicated research rotation. This is exemplified by the improvement noted in resident satisfaction, participation in scholarly activities, and resident research outcomes since the inception of the research rotation in our internal medicine training program.
The complex competency labeled practice-based learning and improvement (PBLI) by the Accreditation Council for Graduate Medical Education (ACGME) incorporates core knowledge in evidence-based medicine (EBM). The purpose of this study was to operationally define a “PBLI-EBM” domain for assessing resident physician competence.
The authors used an iterative design process to first content analyze and map correspondences between ACGME and EBM literature sources. The project team, including content and measurement experts and residents/fellows, parsed, classified, and hierarchically organized embedded learning outcomes using a literature-supported cognitive taxonomy. A pool of 141 items was produced from the domain and assessment specifications. The PBLI-EBM domain and resulting items were content validated through formal reviews by a national panel of experts.
The final domain represents overlapping PBLI and EBM cognitive dimensions measurable through written, multiple-choice assessments. It is organized as 4 subdomains of clinical action: Therapy, Prognosis, Diagnosis, and Harm. Four broad cognitive skill branches (Ask, Acquire, Appraise, and Apply) are subsumed under each subdomain. Each skill branch is defined by enabling skills that specify the cognitive processes, content, and conditions pertinent to demonstrable competence. Most items passed content validity screening criteria and were prepared for test form assembly and administration.
The operational definition of PBLI-EBM competence is based on a rigorously developed and validated domain and item pool, and substantially expands conventional understandings of EBM. The domain, assessment specifications, and procedures outlined may be used to design written assessments to tap important cognitive dimensions of the overall PBLI competency, as given by ACGME. For more comprehensive coverage of the PBLI competency, such instruments need to be complemented with performance assessments.
The Accreditation Council for Graduate Medical Education has suggested various methods for evaluation of practice-based learning and improvement competency, but data on implementation of these methods are limited.
To compare medical record review and patient surveys on evaluating physician performance in preventive services in an outpatient resident clinic.
Within an ongoing quality improvement project, we collected baseline performance data on preventive services provided for patients at the University of Alabama at Birmingham (UAB) Internal Medicine Residents' ambulatory clinic.
Seventy internal medicine and medicine-pediatrics residents from the UAB Internal Medicine Residency program.
Resident- and clinic-level comparisons of aggregated patient survey and chart documentation rates of (1) screening for smoking status, (2) advising smokers to quit, (3) cholesterol screening, (4) mammography screening, and (5) pneumonia vaccination.
Six hundred fifty-nine patient surveys and 761 charts were abstracted. At the clinic level, rates for screening of smoking status, recommending mammography, and cholesterol screening were similar (difference <5%) between the 2 methods. Higher rates for pneumonia vaccination (76% vs 67%) and advice to quit smoking (66% vs 52%) were seen on medical record review than on patient surveys. However, within-resident (N=70) comparisons of the 2 methods of estimating screening rates showed significant variability. The cost of medical record review was substantially higher ($107 vs $17/physician).
Medical record review and patient surveys provided similar rates for selected preventive health measures at the clinic level, with the exception of pneumonia vaccination and advising to quit smoking. A large variation among individual resident providers was noted.
education; medical; preventive health services; patient survey; medical record review; cost evaluation
The Accreditation Council for Graduate Medical Education (ACGME) recommends resident portfolios as 1 method for assessing competence in practice-based learning and improvement. In July 2005, when anesthesiology residents in our department were required to start a portfolio, the residents and their faculty advisors did not readily accept this new requirement. Intensive education efforts addressing the goals and importance of portfolios were undertaken. We hypothesized that these educational efforts improved acceptance of the portfolio and retrospectively audited the portfolio evaluation forms completed by faculty advisors.
Intensive education about the goals and importance of portfolios began in January 2006, including presentations at departmental conferences and one-on-one education sessions. Faculty advisors were instructed to evaluate each resident's portfolio and complete a review form. We retrospectively collected data to determine the percentage of review forms completed by faculty. The portfolio reviews also assessed the percentage of 10 required portfolio components residents had completed.
Portfolio review forms were completed by faculty advisors for 13% (5/38) of residents during the first advisor-advisee meeting in December 2005. Initiation of intensive education efforts significantly improved compliance, with review forms completed for 68% (26/38) of residents in May 2006 (P < .0001) and 95% (36/38) in December 2006 (P < .0001). Residents also significantly improved the completeness of portfolios between May and December of 2006.
Portfolios are considered a best-methods technique by the ACGME for evaluation of practice-based learning and improvement. We have found that intensive education about the goals and importance of portfolios can enhance acceptance of this evaluation tool, resulting in improved compliance in the completion and evaluation of portfolios.
There are no nationwide data on the methods residency programs are using to assess trainee competence. The Accreditation Council for Graduate Medical Education (ACGME) has recommended tools that programs can use to evaluate their trainees. It is unknown if programs are adhering to these recommendations.
To describe evaluation methods used by our nation’s internal medicine residency programs and assess adherence to ACGME methodological recommendations for evaluation.
All internal medicine programs registered with the Association of Program Directors of Internal Medicine (APDIM).
Descriptive statistics of programs and tools used to evaluate competence; compliance with ACGME recommended evaluative methods.
The response rate was 70%. Programs used an average of 4.2–6.0 tools to evaluate their trainees, with heavy reliance on rating forms. Direct observation and practice- and data-based tools were used much less frequently. Most programs were using at least 1 of the ACGME's “most desirable” methods of evaluation for all 6 measures of trainee competence. These programs had higher support staff-to-resident ratios than programs using less desirable evaluative methods.
Residency programs are using a large number and variety of tools for evaluating the competence of their trainees. Most are complying with ACGME-recommended methods of evaluation, especially if the support staff-to-resident ratio is high.
graduate medical education; residency; ACGME; competency
The Accreditation Council for Graduate Medical Education (ACGME) requires physicians in training to be educated in 6 competencies considered important for independent medical practice. There is little information about the experiences that residents feel contribute most to the acquisition of the competencies.
To understand how residents perceive their learning of the ACGME competencies and to determine which educational activities were most helpful in acquiring these competencies.
A web-based survey created by the graduate medical education office for institutional program monitoring and evaluation was sent to all residents in ACGME-accredited programs at the David Geffen School of Medicine, University of California-Los Angeles, from 2007 to 2010. Residents responded to questions about the adequacy of their learning for each of the 6 competencies and which learning activities were most helpful in competency acquisition.
We analyzed 1378 responses collected from postgraduate year-1 (PGY-1) to PGY-3 residents in 12 different residency programs, surveyed between 2007 and 2010. The overall response rate varied by year (66%–82%). Most residents (80%–97%) stated that their learning of the 6 ACGME competencies was “adequate.” Patient care activities and observation of attending physicians and peers were listed as the 2 most helpful learning activities for acquiring the 6 competencies.
Our findings reinforce the importance of learning from role models during patient care activities and the heterogeneity of learning activities needed for acquiring all 6 competencies.
In 2006, the University of Virginia became one of the first academic medical institutions to be placed on probation, after the Accreditation Council for Graduate Medical Education (ACGME) Institutional Review Committee implemented a new classification system for institutional reviews.
After the University of Virginia reviewed its practices and implemented needed changes, the institution was able to have probation removed and full accreditation restored. Whereas graduate medical education committees and designated institutional officials are required to conduct internal reviews of each ACGME-accredited program midway through its accreditation cycle, no similar requirement exists for institutions.
As we designed corrective measures at the University of Virginia, we realized that regularly scheduled audits of the entire institution would have prevented the accumulation of deficiencies. We suggest that institutional internal reviews be implemented to ensure that the ACGME institutional requirements for graduate medical education are met. This process represents practice-based learning and improvement at the institutional level and may prevent other institutions from receiving unfavorable accreditation decisions.
Health professionals need competencies in improvement skills if they are to contribute usefully to improving patient care. Medical education programmes in the USA have not systematically taught improvement skills to residents (registrars in the UK). The Accreditation Council for Graduate Medical Education (ACGME) has recently developed and begun to deploy a competency based model for accreditation that may encourage the development of improvement skills by the 100 000 residents in accredited programmes. Six competencies have been identified for all physicians, independent of specialty, and measurement tools for these competencies have been described. This model may be applicable to other healthcare professions. This paper explores patterns that inhibit efforts to change practice and proposes an educational model to provide change management skills based on trainees' analysis of their own work.
Key Words: physician education; improvement skills; accreditation; competency
The American Board of Internal Medicine Certification Examination (ABIM-CE) is one of several methods used to assess medical knowledge, an Accreditation Council for Graduate Medical Education (ACGME) core competency for graduating internal medicine residents. With recent changes in graduate medical education, program directors and internal medicine residents are seeking evidence to guide decisions regarding residency elective choices. Prior studies have shown that formalized elective curricula improve subspecialty ABIM-CE scores. The primary aim of this study was to evaluate whether the number of subspecialty elective exposures, or the specific subspecialties in which residents complete electives, impacts ABIM-CE scores.
ABIM-CE scores, elective exposures, and demographic characteristics were collected for MedStar Georgetown University Hospital internal medicine residents who were first-time takers of the ABIM-CE in 2006–2010 (n=152). An elective exposure was defined as a two-week period assigned to the respective subspecialty. ABIM-CE score was analyzed using the difference between the ABIM-CE score and the standardized passing score (delta-SPS). Subspecialty scores were analyzed using the percentage of correct responses. Data were analyzed using GraphPad Prism version 5.00 for Windows.
Paired elective exposure and ABIM-CE scores were available in 131 residents. There was no linear correlation between ABIM-CE mean delta-SPS and the total number of electives or the number of unique elective exposures. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures (143.4 compared to 129.7, p=0.051). Repeated electives in individual subspecialties were not associated with significant difference in mean ABIM-CE delta-SPS.
This study did not demonstrate significant positive associations between individual subspecialty elective exposures and ABIM-CE mean delta-SPS score. Residents with ≤14 elective exposures had higher ABIM-CE mean delta-SPS than those with ≥15 elective exposures suggesting there may be an “ideal” number of elective exposures that supports improved ABIM-CE performance. Repeated elective exposures in an individual specialty did not correlate with overall or subspecialty ABIM-CE performance.
Resident education; Gender; Elective; Subspecialty; Graduate medical education
Historical bias toward service-oriented inpatient graduate medical education experiences has hindered both resident education and care of patients in the ambulatory setting.
Describe and evaluate a residency redesign intended to improve the ambulatory experience for residents and patients.
Categorical Internal Medicine resident ambulatory practice at the University of Cincinnati Academic Health Center.
We created a year-long continuous ambulatory group-practice experience, separated from traditional inpatient responsibilities, called the long block, as part of an Accreditation Council for Graduate Medical Education Educational Innovations Project. The practice adopted the Chronic Care Model, and residents received extensive instruction in quality improvement and interprofessional teams.
The long block was associated with significant increases in resident and patient satisfaction as well as improvement in multiple quality process and outcome measures. Continuity and no-show rates also improved.
An ambulatory long block can be associated with improvements in resident and patient satisfaction, quality measures, and no-show rates. Future research should be done to determine effects of the long block on education and patient care in the long term, and elucidate which aspects of the long block most contribute to improvement.
ambulatory education; clinic; residency training; chronic care model
The Accreditation Council for Graduate Medical Education (ACGME) requires that training programs integrate system-based practice (SBP) and practice-based learning and improvement (PBLI) into internal medicine residency curricula.
We instituted a seminar series and year-long-mentored curriculum designed to engage internal medicine residents in these competencies.
Residents participate in a seminar series that includes assigned reading and structured discussion with faculty, who assist in the development of quality improvement or research projects. Residents pursue projects over the remainder of the year. Monthly works-in-progress meetings, protected time for inquiry, and continued faculty mentorship guide the residents in their project development. Trainees present their work at hospital-wide grand rounds at the end of the academic year. We surveyed residents to assess their self-reported knowledge, attitudes, and skills in SBP and PBLI. In addition, blinded faculty scored projects for appropriateness, impact, and feasibility.
We measured resident self-reported knowledge, attitudes, and skills at the end of the academic year. We found evidence that participants improved their understanding of the context in which they were practicing, and that their ability to engage in quality improvement projects increased. Blinded faculty reviewers favorably ranked the projects’ feasibility, impact, and appropriateness. The ‘Curriculum of Inquiry’ generated 11 quality improvement and research projects during the study period. Barriers to the ongoing work include a limited supply of mentors and delays due to Institutional Review Board approval. Hospital leadership recognizes the importance of the curriculum, and our accreditation manager now cites our ongoing work.
A structured residency-based curriculum facilitates resident demonstration of SBP and PBLI. Residents gain knowledge and skills through this enterprise, and hospitals gain access to trainees who help to solve ongoing problems and meet accreditation requirements.
graduate medical education; competencies; longitudinal curriculum
This article, developed for the Betty Ford Institute Consensus Conference on Graduate Medical Education (December, 2008), presents a model curriculum for Family Medicine residency training in substance abuse.
The authors reviewed reports of past Family Medicine curriculum development efforts, previously identified barriers to education in high-risk substance use, approaches to overcoming these barriers, and current training guidelines of the Accreditation Council for Graduate Medical Education (ACGME) and its Family Medicine Residency Review Committee. A proposed eight-module curriculum was developed, based on substance abuse competencies defined by Project MAINSTREAM and linked to core competencies defined by the ACGME. The curriculum provides basic training in high-risk substance use to all residents, while also addressing current training challenges presented by U.S. work hour regulations, the increasing international diversity of Family Medicine resident trainees, and emerging new primary care practice models.
This paper offers a core curriculum, focused on screening, brief intervention, and referral to treatment, which can be adapted by residency programs to meet their individual needs. The curriculum encourages direct observation of residents to ensure that core skills are learned and trains residents in several "new skills" that will expand the basket of substance abuse services they will be equipped to provide as they enter practice.
Broad-based implementation of a comprehensive Family Medicine residency curriculum should increase the ability of family physicians to provide basic substance abuse services in a primary care context. Such efforts should be coupled with faculty development initiatives which ensure that sufficient trained faculty are available to teach these concepts and with efforts by major Family Medicine organizations to implement and enforce residency requirements for substance abuse training.
The Accreditation Council for Graduate Medical Education (ACGME) invokes evidence-based medicine (EBM) principles through the practice-based learning core competency. The authors hypothesized that among a representative sample of emergency medicine (EM) residency programs, a wide variability in EBM resident training priorities, faculty expertise expectations, and curricula exists.
The primary objective was to obtain descriptive data regarding EBM practices and expectations from EM physician educators. Our secondary objective was to assess differences in EBM educational priorities among journal club directors compared with non–journal club directors.
A 19-question survey was developed by a group of recognized EBM curriculum innovators and then disseminated to Council of Emergency Medicine Residency Directors (CORD) conference participants, assessing their opinions regarding essential EBM skill sets and EBM curricular expectations for residents and faculty at their home institutions. The survey instrument also identified the degree of interest respondents had in receiving a free monthly EBM journal club curriculum.
A total of 157 individuals registered for the conference, and 98 completed the survey; 77 of these respondents (79%) were residency program directors or assistant/associate program directors. The majority of participants were from university-based programs and in practice at least 5 years. Respondents reported the ability to identify flawed research (45%), apply research findings to patient care (43%), and comprehend research methodology (33%) as the most important resident skill sets. The majority of respondents reported no formal journal club or EBM curricula (75%) and do not utilize structured critical appraisal instruments (71%) when reviewing the literature. While journal club directors believed that resident learners’ most important EBM skill is to identify secondary peer-reviewed resources, non–journal club directors identified residents’ ability to distinguish significantly flawed research as the key skill to develop. Interest in receiving a free monthly EBM journal club curriculum was widely accepted (89%).
Attaining EBM proficiency is an expected outcome of graduate medical education (GME) training, although the specific domains of anticipated expertise differ between faculty and residents. Few respondents currently use a formalized curriculum to guide the development of EBM skill sets. There appears to be a high level of interest in obtaining EBM journal club educational content in a structured format. Measuring the effects of providing journal club curriculum content in conjunction with other EBM interventions may warrant further investigation.
evidence-based medicine; knowledge translation; faculty development
OBJECTIVE: To describe the views of residency program directors regarding the effect of the 2010 duty hour recommendations on the 6 core competencies of graduate medical education.
METHODS: US residency program directors in internal medicine, pediatrics, and general surgery were e-mailed a survey from July 8 through July 20, 2010, after the 2010 Accreditation Council for Graduate Medical Education (ACGME) duty hour recommendations were published. Directors were asked to rate the implications of the new recommendations for the 6 ACGME core competencies as well as for continuity of inpatient care and resident fatigue.
RESULTS: Of 719 eligible program directors, 464 (65%) responded. Most program directors believe that the new ACGME recommendations will decrease residents' continuity with hospitalized patients (404/464 [87%]) and will not change (303/464 [65%]) or will increase (26/464 [6%]) resident fatigue. Additionally, most program directors (249-363/464 [53%-78%]) believe that the new duty hour restrictions will decrease residents' ability to develop competency in 5 of the 6 core areas. Surgery directors were more likely than internal medicine directors to believe that the ACGME recommendations will decrease residents' competency in patient care (odds ratio [OR], 3.9; 95% confidence interval [CI], 2.5-6.3), medical knowledge (OR, 1.9; 95% CI, 1.2-3.2), practice-based learning and improvement (OR, 2.7; 95% CI, 1.7-4.4), interpersonal and communication skills (OR, 1.9; 95% CI, 1.2-3.0), and professionalism (OR, 2.5; 95% CI, 1.5-4.0).
CONCLUSION: Residency program directors' reactions to ACGME duty hour recommendations demonstrate a marked degree of concern about educating a competent generation of future physicians in the face of increasing duty hour standards and regulation.
The in-training examination is a national and yearly exam administered by the American Board of Emergency Medicine to all emergency medicine residents in the USA. The purpose of the examination is to evaluate a resident’s progress toward obtaining the fundamental knowledge to practice independent emergency medicine.
The purpose of this study was to determine the effect of a 40-hour board review lecture course on emergency medicine residents' in-training examination performance.
A 40-hour board review lecture course was designed and implemented during the weekly 5-hour resident conferences in the 8 weeks preceding the 2006 in-training examination. Attendance was mandatory at the Accreditation Council for Graduate Medical Education (ACGME) standard of 70% or greater. A positive result was defined as an increase of 10% or more in a resident’s individual national class percentile ranking among national peers for their class year on the emergency medicine in-training examination. Residents with no 2005 in-training examination score for self-comparison were excluded from the study. Results were analyzed using 95% confidence intervals (CIs).
Of 16 residents, 1 (6.25%; 95% CI: 0%–18%) achieved a positive result, increasing their national class percentile ranking by 10% or more. Among PGY2 residents, 1 of 8 (12.5%; 95% CI: 0%–35.4%) had a positive result; among PGY3 residents, none did (0%; 95% CI: 0%–35.4%).
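The abstract does not state which interval method was used; as an assumption, a normal-approximation (Wald) interval clipped at zero reproduces the reported figures for the 1-of-16 and 1-of-8 results:

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) 95% CI for a proportion, clipped to [0, 1]."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half), min(1.0, p + half)

lo, hi = wald_ci(1, 16)          # 1 of 16 residents improved by >= 10%
print(f"{lo:.0%}-{hi:.0%}")      # → 0%-18%, matching the reported interval
lo2, hi2 = wald_ci(1, 8)         # 1 of 8 PGY2 residents
print(f"{lo2:.0%}-{hi2:.1%}")    # → 0%-35.4%
```

The lower bound goes negative at these small sample sizes before clipping, which is why exact (Clopper–Pearson) intervals are often preferred for proportions this close to zero.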
A 40-hour board review lecture course did not improve residents' in-training examination scores.
In-training examination; Board review; Emergency medicine resident
The established guidelines for a diabetes foot examination include assessing circulatory, skin, and neurological status to detect problems early and reduce the likelihood of amputation. Physician adherence to the guidelines for proper examination is less than optimal.
Our objective was to increase compliance with the performance of a proper foot examination through a predominantly physician-directed interventional campaign.
The study consisted of 3 parts: a retrospective chart review to estimate baseline compliance, an educational intervention, and prospective chart reviews at 3 and 6 months. A properly documented foot examination was defined as assessment of at least 2 of the 3 necessary components. The educational intervention consisted of 2 lectures directed at resident physicians and a quality assurance announcement at a general internal medicine staff meeting. Clinic support staff were instructed to remove the shoes and socks of all patients with diabetes when they were placed in exam rooms, and reminder signs were placed in each exam room.
There was a significant increase in the performance of proper foot examination over the course of the study (baseline 14.0%, 3 months 58.0%, 6 months 62.1%; P < .001). Documentation of any component of a proper foot examination also increased substantially (32.6%, 67.3%, 72.5%; P < .001). Additionally, performance of each component of a proper exam increased dramatically during the study: neurological (13.5%, 35.8%, 38.5%; P < .001), skin (23.0%, 64.2%, 69.2%; P < .001), and vascular (14.0%, 51.2%, 50.5%; P < .001).
Patients with diabetes are unlikely to have foot examinations in their primary medical care. A simple, low-cost educational intervention significantly improved the adherence to foot examination guidelines for patients with diabetes.
diabetes; foot ulceration; foot exam; prevention; physical education
An important expectation of pediatric residency education is that residents learn to assess, resuscitate, and stabilize ill or injured children.
To determine whether the Accreditation Council for Graduate Medical Education (ACGME) minimum time requirement for emergency and acute illness experience is adequate to achieve the educational objectives set forth for categorical pediatric residents. We hypothesized that despite residents working five 1-month block rotations in a high-volume (95 000 pediatric visits per year) pediatric emergency department (ED), the comprehensive experience outlined by the ACGME would not be satisfied through clinical exposure.
This was a retrospective, descriptive study comparing actual resident experience to the standard defined by the ACGME. The emergency medicine experience of 35 categorical pediatric residents was tracked, including the number of patients evaluated during training and patient discharge diagnoses. The achievability of the ACGME requirement was determined by reporting the percentage of pediatric residents who cared for at least 1 patient from each of the ACGME-required disorder categories.
Only 11.4% of residents met the ACGME requirement for emergency and acute illness experience in the ED. The median number of patients evaluated by residents during training in the ED was 941. Disorder categories evaluated least frequently included shock, sepsis, diabetic ketoacidosis, coma/altered mental status, cardiopulmonary arrest, burns, and bowel obstruction.
Pediatric residents working in one of the busiest pediatric EDs in the country and working 1 month more than the ACGME-recommended minimum did not achieve the ACGME requirement for emergency and acute illness experience through direct patient care.
The Accreditation Council for Graduate Medical Education (ACGME) Learning Portfolio is recommended as a tool to develop and document reflective, practice-based learning and improvement. There is no consensus regarding the appropriate content of a learning portfolio in medical education. Studying lessons selected for inclusion in their learning portfolios by surgical trainees could help identify useful subject matter for this purpose.
Each month, all residents in our surgery residency program submit entries into their individual Surgical Learning and Instructional Portfolio (SLIP). The SLIP entries from July 2008 to 2009 (n = 420) were deidentified and randomized using a random number generator. We conducted a thematic content analysis of 50 random portfolio entries to identify lessons learned. Two independent raters analyzed the “3 lessons learned” portion of the portfolio entries and identified themes and subthemes using the constant comparative method used in grounded theory.
The collaborative coding process resulted in theme saturation after the identification of 7 themes and their subthemes. Themes in decreasing order of frequency included complications, disease epidemiology, disease presentation, surgical management of disease, medical management of disease, operative techniques, and pathophysiology. Junior residents chose to focus on a broad array of foundational topics including disease presentation, epidemiology, and overall management of diseases, whereas postgraduate year-4 (PGY-4) and PGY-5 residents most frequently chose to focus on complications as learning points.
Lessons learned reflect perceived needs of the trainees based on training year. When given a template to follow, junior and senior residents choose to reflect on different subject matter to meet their learning goals.
The Accreditation Council for Graduate Medical Education (ACGME) expects programs to engage in ongoing, meaningful improvement, facilitated in part through an annual process of program assessment and improvement. The Duke University Hospital Office of Graduate Medical Education (OGME) used an institutional practice-based learning and improvement strategy to improve the annual evaluation and improvement of its programs.
The OGME implemented several strategies including the development and dissemination of a template for the report, program director and coordinator development, a reminder and tracking system, incorporation of the document into internal reviews, and use of incentives to promote program adherence.
In the first year of implementation (summer 2005), 27 programs (37%) submitted documentation of their annual program evaluation and improvement to the OGME; this increased to 100% of programs by 2009. A growing number of programs elected to use the template in lieu of written minutes. The number of citations related to required program review and improvement decreased from 12 in a single academic year to 3 over the last 5 years.
Duke University Hospital's institutional initiative to incorporate practice-based learning and improvement resulted in increased documentation, greater use of a standardized template, fewer ACGME-related citations, and enhanced consistency in preparing for ACGME site visits.