Results 1-25 (754353)

1.  Use of a Structured Template to Facilitate Practice-Based Learning and Improvement Projects 
Background
The Accreditation Council for Graduate Medical Education (ACGME) requires residency programs to meet and demonstrate outcomes across 6 competencies. Measuring residents' competency in practice-based learning and improvement (PBLI) is particularly challenging.
Purpose
We developed an educational tool to meet ACGME requirements for PBLI. The PBLI template helped programs document quality improvement (QI) projects and supported increased scholarly activity surrounding PBLI learning.
Methods
We reviewed program requirements for 43 residency and fellowship programs and identified specific PBLI requirements for QI activities. We also examined ACGME Program Information Form responses on PBLI core competency questions surrounding QI projects for program sites visited in 2008–2009. Data were integrated by a multidisciplinary committee to develop a peer-protected PBLI template guiding programs through process, documentation, and evaluation of QI projects. All steps were reviewed and approved through our GME Committee structure.
Results
An electronic template, companion checklist, and evaluation form were developed using identified project characteristics to guide programs through the PBLI process and facilitate documentation and evaluation of the process. During a 24-month period, 27 programs have completed PBLI projects, and 15 have reviewed the template with their education committees but have not initiated projects using the template.
Discussion
The development of the tool generated program leaders' support because the tool enhanced the ability to meet program-specific objectives. The peer-protected status of this document, providing confidentiality and protection from discovery, has been beneficial for program usage. The document aggregates data on PBLI and QI initiatives, offers opportunities to increase scholarship in QI, and meets the ACGME goal of linking measures to outcomes important to meeting accreditation requirements at the program and institutional level.
doi:10.4300/JGME-D-11-00195.1
PMCID: PMC3399615  PMID: 23730444
2.  Measuring Resident Physicians' Performance of Preventive Care 
BACKGROUND
The Accreditation Council for Graduate Medical Education has suggested various methods for evaluation of practice-based learning and improvement competency, but data on implementation of these methods are limited.
OBJECTIVE
To compare medical record review and patient surveys on evaluating physician performance in preventive services in an outpatient resident clinic.
DESIGN
Within an ongoing quality improvement project, we collected baseline performance data on preventive services provided for patients at the University of Alabama at Birmingham (UAB) Internal Medicine Residents' ambulatory clinic.
PARTICIPANTS
Seventy internal medicine and medicine-pediatrics residents from the UAB Internal Medicine Residency program.
MEASUREMENTS
Resident- and clinic-level comparisons of aggregated patient survey and chart documentation rates of (1) screening for smoking status, (2) advising smokers to quit, (3) cholesterol screening, (4) mammography screening, and (5) pneumonia vaccination.
RESULTS
Six hundred fifty-nine patient surveys were collected and 761 charts were abstracted. At the clinic level, rates for screening of smoking status, mammography recommendation, and cholesterol screening were similar (difference <5%) between the 2 methods. Higher rates for pneumonia vaccination (76% vs 67%) and advice to quit smoking (66% vs 52%) were seen on medical record review versus patient surveys. However, within-resident (N=70) comparison of the 2 methods of estimating screening rates showed significant variability. The cost of medical record review was substantially higher ($107 vs $17/physician).
CONCLUSIONS
Medical record review and patient surveys provided similar rates for selected preventive health measures at the clinic level, with the exception of pneumonia vaccination and advising to quit smoking. A large variation among individual resident providers was noted.
doi:10.1111/j.1525-1497.2006.00338.x
PMCID: PMC1828097  PMID: 16499544
education, medical; preventive health services; patient survey; medical record review; cost evaluation
3.  Educational Experiences Residents Perceive As Most Helpful for the Acquisition of the ACGME Competencies 
Background
The Accreditation Council for Graduate Medical Education (ACGME) requires physicians in training to be educated in 6 competencies considered important for independent medical practice. There is little information about the experiences that residents feel contribute most to the acquisition of the competencies.
Objective
To understand how residents perceive their learning of the ACGME competencies and to determine which educational activities were most helpful in acquiring these competencies.
Method
A web-based survey created by the graduate medical education office for institutional program monitoring and evaluation was sent to all residents in ACGME-accredited programs at the David Geffen School of Medicine, University of California-Los Angeles, from 2007 to 2010. Residents responded to questions about the adequacy of their learning for each of the 6 competencies and which learning activities were most helpful in competency acquisition.
Results
We analyzed 1378 responses collected from postgraduate year-1 (PGY-1) to PGY-3 residents in 12 different residency programs, surveyed between 2007 and 2010. The overall response rate varied by year (66%–82%). Most residents (80%–97%) stated that their learning of the 6 ACGME competencies was “adequate.” Patient care activities and observation of attending physicians and peers were listed as the 2 most helpful learning activities for acquiring the 6 competencies.
Conclusion
Our findings reinforce the importance of learning from role models during patient care activities and the heterogeneity of learning activities needed for acquiring all 6 competencies.
doi:10.4300/JGME-D-11-00058.1
PMCID: PMC3399609  PMID: 23730438
4.  Accreditation council for graduate medical education (ACGME) annual anesthesiology residency and fellowship program review: a "report card" model for continuous improvement 
BMC Medical Education  2010;10:13.
Background
The Accreditation Council for Graduate Medical Education (ACGME) requires an annual evaluation of all ACGME-accredited residency and fellowship programs to assess program quality. The results of this evaluation must be used to improve the program. This manuscript describes a metric to be used in conducting ACGME-mandated annual program review of ACGME-accredited anesthesiology residencies and fellowships.
Methods
A variety of metrics to assess anesthesiology residency and fellowship programs are identified by the authors through literature review and considered for use in constructing a program "report card."
Results
Metrics used to assess program quality include success in achieving American Board of Anesthesiology (ABA) certification, performance on the annual ABA/American Society of Anesthesiologists In-Training Examination, performance on mock oral ABA certification examinations, trainee scholarly activities (publications and presentations), accreditation site visit and internal review results, ACGME and alumni survey results, National Resident Matching Program (NRMP) results, exit interview feedback, diversity data, and extensive program/rotation/faculty/curriculum evaluations by trainees and faculty. The results are used to construct a "report card" that provides a high-level review of program performance and can be used in a continuous quality improvement process.
Conclusions
An annual program review is required to assess all ACGME-accredited residency and fellowship programs to monitor and improve program quality. We describe an annual review process based on metrics that can be used to focus attention on areas for improvement and track program performance year-to-year. A "report card" format is described as a high-level tool to track educational outcomes.
doi:10.1186/1472-6920-10-13
PMCID: PMC2830223  PMID: 20141641
5.  Charting the Road to Competence: Developmental Milestones for Internal Medicine Residency Training 
Background
The Accreditation Council for Graduate Medical Education (ACGME) Outcome Project requires that residency program directors objectively document that their residents achieve competence in 6 general dimensions of practice.
Intervention
In November 2007, the American Board of Internal Medicine (ABIM) and the ACGME initiated the development of milestones for internal medicine residency training. ABIM and ACGME convened a 33-member milestones task force made up of program directors, experts in evaluation and quality, and representatives of internal medicine stakeholder organizations. This article reports on the development process and the resulting list of proposed milestones for each ACGME competency.
Outcomes
The task force adopted the Dreyfus model of skill acquisition as a framework for the internal medicine milestones, and calibrated the milestones with the expectation that residents achieve, at a minimum, the “competency” level in the 5-step progression by the completion of residency. The task force also developed general recommendations for strategies to evaluate the milestones.
Discussion
The milestones resulting from this effort will promote competency-based resident education in internal medicine, and will allow program directors to track the progress of residents and inform decisions regarding promotion and readiness for independent practice. In addition, the milestones may guide curriculum development, suggest specific assessment strategies, provide benchmarks for resident self-directed assessment-seeking, and assist remediation by facilitating identification of specific deficits. Finally, by making explicit the profession's expectations for graduates and providing a degree of national standardization in evaluation, the milestones may improve public accountability for residency training.
doi:10.4300/01.01.0003
PMCID: PMC2931179  PMID: 21975701
6.  Assessing Intern Core Competencies With an Objective Structured Clinical Examination 
Background
Residents are evaluated using Accreditation Council for Graduate Medical Education (ACGME) core competencies. An Objective Structured Clinical Examination (OSCE) is a potential evaluation tool to measure these competencies and provide outcome data.
Objective
Create an OSCE to evaluate and demonstrate improvement in intern core competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice before and after internship.
Methods
From 2006 to 2008, 106 interns from 10 medical specialties were evaluated with a preinternship and postinternship OSCE at Madigan Army Medical Center. The OSCE included eight 12-minute stations that collectively evaluated the 6 ACGME core competencies using human patient simulators, standardized patients, and clinical scenarios. Interns were scored using objective and subjective criteria, with a maximum score of 100 for each competency. Stations included death notification, abdominal pain, transfusion consent, suture skills, wellness history, chest pain, altered mental status, and computer literature search. These stations were chosen by specialty program directors, created with input from board-certified specialists, and were peer reviewed.
Results
All OSCE testing on the 106 interns (ages 25 to 44 [average, 28.6]; 70 [66%] men; 65 [58%] allopathic medical school graduates) resulted in statistically significant improvement in all ACGME core competencies: patient care (71.9% to 80.0%, P < .001), medical knowledge (59.6% to 78.6%, P < .001), practice-based learning and improvement (45.2% to 63.0%, P < .001), interpersonal and communication skills (77.5% to 83.1%, P < .001), professionalism (74.8% to 85.1%, P < .001), and systems-based practice (56.6% to 76.5%, P < .001).
Conclusion
An OSCE during internship can evaluate incoming baseline ACGME core competencies and test for interval improvement. The OSCE is a valuable assessment tool to provide outcome measures on resident competency performance and evaluate program effectiveness.
doi:10.4300/01.01.0006
PMCID: PMC2931201  PMID: 21975704
7.  Residency Programs' Evaluations of the Competencies: Data Provided to the ACGME About Types of Assessments Used by Programs 
Background
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project began to focus on resident performance in the 6 competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. Beginning in 2007, the ACGME began collecting information on how programs assess these competencies. This report provides information on the nature and extent of those assessments.
Methods
Using data collected by the ACGME for site visits, we use descriptive statistics and percentages to describe the number and type of methods and assessors accredited programs (n  =  4417) report using to assess the competencies. Observed differences among specialties, methodologies, and assessors are tested with analysis of variance procedures.
Results
Almost all (>97%) of programs report assessing all of the competencies and using multiple methods and multiple assessors. Similar assessment methods and evaluator types were consistently used across the 6 competencies. However, there were some differences in the use of patient and family as assessors: Primary care and ambulatory specialties used these to a greater extent than other specialties.
Conclusion
Residency programs are emphasizing the competencies in their evaluation of residents. Understanding the scope of evaluation methodologies that programs use in resident assessment is important for both the profession and the public, so that together we may monitor continuing improvement in US graduate medical education.
doi:10.4300/JGME-02-04-30
PMCID: PMC3010956  PMID: 22132294
8.  Instituting systems-based practice and practice-based learning and improvement: a curriculum of inquiry 
Medical Education Online  2013;18:10.3402/meo.v18i0.21612.
Background
The Accreditation Council for Graduate Medical Education (ACGME) requires that training programs integrate system-based practice (SBP) and practice-based learning and improvement (PBLI) into internal medicine residency curricula.
Context and setting
We instituted a seminar series and a year-long mentored curriculum designed to engage internal medicine residents in these competencies.
Methods
Residents participate in a seminar series that includes assigned reading and structured discussion with faculty who assist in the development of quality improvement or research projects. Residents pursue projects over the remainder of the year. Monthly works in progress meetings, protected time for inquiry, and continued faculty mentorship guide the residents in their project development. Trainees present their work at hospital-wide grand rounds at the end of the academic year. We performed a survey of residents to assess their self-reported knowledge, attitudes and skills in SBP and PBLI. In addition, blinded faculty scored projects for appropriateness, impact, and feasibility.
Outcomes
We measured resident self-reported knowledge, attitudes, and skills at the end of the academic year. We found evidence that participants improved their understanding of the context in which they were practicing, and that their ability to engage in quality improvement projects increased. Blinded faculty reviewers favorably ranked the projects’ feasibility, impact, and appropriateness. The ‘Curriculum of Inquiry’ generated 11 quality improvement and research projects during the study period. Barriers to the ongoing work include a limited supply of mentors and delays due to Institutional Review Board approval. Hospital leadership recognizes the importance of the curriculum, and our accreditation manager now cites our ongoing work.
Conclusions
A structured residency-based curriculum facilitates resident demonstration of SBP and practice-based learning and improvement. Residents gain knowledge and skills through this enterprise, and hospitals gain access to trainees who help to solve ongoing problems and meet accreditation requirements.
doi:10.3402/meo.v18i0.21612
PMCID: PMC3776321  PMID: 24044686
graduate medical education; competencies; longitudinal curriculum
9.  Evaluating Practice-Based Learning and Improvement: Efforts to Improve Acceptance of Portfolios 
Introduction
The Accreditation Council for Graduate Medical Education (ACGME) recommends resident portfolios as 1 method for assessing competence in practice-based learning and improvement. In July 2005, when anesthesiology residents in our department were required to start a portfolio, the residents and their faculty advisors did not readily accept this new requirement. Intensive education efforts addressing the goals and importance of portfolios were undertaken. We hypothesized that these educational efforts improved acceptance of the portfolio and retrospectively audited the portfolio evaluation forms completed by faculty advisors.
Methods
Intensive education about the goals and importance of portfolios began in January 2006, including presentations at departmental conferences and one-on-one education sessions. Faculty advisors were instructed to evaluate each resident's portfolio and complete a review form. We retrospectively collected data to determine the percentage of review forms completed by faculty. The portfolio reviews also assessed the percentage of 10 required portfolio components residents had completed.
Results
Portfolio review forms were completed by faculty advisors for 13% (5/38) of residents during the first advisor-advisee meeting in December 2005. Initiation of intensive education efforts significantly improved compliance, with review forms completed for 68% (26/38) of residents in May 2006 (P < .0001) and 95% (36/38) in December 2006 (P < .0001). Residents also significantly improved the completeness of portfolios between May and December of 2006.
Discussion
Portfolios are considered a best-methods technique by the ACGME for evaluation of practice-based learning and improvement. We have found that intensive education about the goals and importance of portfolios can enhance acceptance of this evaluation tool, resulting in improved compliance in completion and evaluation of portfolios.
doi:10.4300/JGME-D-10-00010.1
PMCID: PMC3010953  PMID: 22132291
10.  Teaching Internal Medicine Residents Quality Improvement Techniques using the ABIM’s Practice Improvement Modules 
Summary
Introduction/aim
Standard curricula to teach Internal Medicine residents about quality assessment and improvement, important components of the Accreditation Council for Graduate Medical Education core competencies of practice-based learning and improvement (PBLI) and systems-based practice (SBP), have not been easily accessible.
Program description
Using the American Board of Internal Medicine’s (ABIM) Clinical Preventive Services Practice Improvement Module (CPS PIM), we have incorporated a longitudinal quality assessment and improvement curriculum (QAIC) into the 2 required 1-month ambulatory rotations during postgraduate year 2. During the first block, residents complete the PIM chart reviews and the patient and system surveys. The second block includes resident reflection on the PIM data and a small test of change performed by the group using the Plan–Do–Study–Act (PDSA) cycle in the resident continuity clinic.
Program Evaluation
To date, 3 resident quality improvement (QI) projects have been undertaken as a result of QAIC, each making significant improvements in the residents’ continuity clinic. Resident confidence levels in QI skills (e.g., writing an aim statement [71% to 96%, P < .01] and using a PDSA cycle [9% to 89%, P < .001]) improved significantly.
Discussion
The ABIM CPS PIM can be used by Internal Medicine residency programs to introduce QI concepts into their residents’ outpatient practice through encouraging practice-based learning and improvement and systems-based practice.
doi:10.1007/s11606-008-0549-5
PMCID: PMC2517947  PMID: 18449612
Internal Medicine residents; quality improvement; practice-based learning and improvement; systems-based practice; practice improvement module
11.  The research rotation: competency-based structured and novel approach to research training of internal medicine residents 
Background
In the United States, the Accreditation Council for Graduate Medical Education (ACGME) requires all accredited internal medicine (IM) residency training programs to facilitate resident scholarly activities. However, clinical experience and medical education still remain the main focus of graduate medical education in many IM residency training programs. Left to design the structure, process, and outcome evaluation of the ACGME research requirement, residency training programs face numerous barriers. Many residency programs report having been cited by the ACGME Residency Review Committee for Internal Medicine for lack of scholarly activity by residents.
Methods
We would like to share our experience at Lincoln Hospital, an affiliate of Weill Medical College of Cornell University, New York, in designing and implementing a successful structured research curriculum based on ACGME competencies, taught during a dedicated "research rotation".
Results
Since the inception of the research rotation in 2004, participation of our residents in scholarly activities has substantially increased. Our residents increasingly believe and appreciate that research is an integral component of residency training and is essential for the practice of medicine.
Conclusion
Internal medicine residents' outlook in research can be significantly improved using a research curriculum offered through a structured and dedicated research rotation. This is exemplified by the improvement noted in resident satisfaction, their participation in scholarly activities and resident research outcomes since the inception of the research rotation in our internal medicine training program.
doi:10.1186/1472-6920-6-52
PMCID: PMC1630691  PMID: 17044924
12.  Impact of 2011 Resident Duty Hour Requirements on Neurology Residency Programs and Departments 
The Neurohospitalist  2014;4(3):119-126.
Objective:
In 2011, the Accreditation Council for Graduate Medical Education (ACGME) redefined resident duty hour requirements by reducing residents' in-hospital duty hours in an effort to improve patient care, resident well-being, and resident education. We sought to determine the cost of adoption based on changes made by neurology residency programs and departments in response to these requirements.
Methods:
We surveyed department chairs or residency program directors at 123 ACGME-accredited US adult neurology training programs on programmatic changes and resident expansion, hiring practices, and development of new computer-based resources in direct response to the 2011 ACGME duty hour requirements. Using data from publicly available resources, we estimated respondents’ financial cost of adoption.
Results:
In all, 63 responded (51% response rate); 76% were program directors. The most common changes implemented by programs were adding night float systems (n = 31; 49%) and increasing faculty responsibility (n = 26; 41%). In direct response to the requirements, 21 programs applied to ACGME for 40 additional residents, 29 of which were fully covered by institutional funds. In direct response to the requirements, nearly half of the departments (n = 26) hired individuals for a total of 80 hires (or 64 full-time equivalents), most commonly mid-level practitioners. The total estimated cost to responding departments was US $12.7 million or US $201,000 per department annually. When projecting expenses of planned changes for the following year, costs increased to US $360,000 per department, with 5-year costs exceeding US $1 million.
Conclusions:
The most recent restriction on resident duty hours comes at substantial cost to neurology departments and residency programs.
doi:10.1177/1941874413518640
PMCID: PMC4056414  PMID: 24982715
education; training; academic; quality; safety; costs
13.  The Ambulatory Long-Block: An Accreditation Council for Graduate Medical Education (ACGME) Educational Innovations Project (EIP) 
Introduction
Historical bias toward service-oriented inpatient graduate medical education experiences has hindered both resident education and care of patients in the ambulatory setting.
Aim
Describe and evaluate a residency redesign intended to improve the ambulatory experience for residents and patients.
Setting
Categorical Internal Medicine resident ambulatory practice at the University of Cincinnati Academic Health Center.
Program Description
We created a year-long continuous ambulatory group-practice experience, separated from traditional inpatient responsibilities and called the long block, as an Accreditation Council for Graduate Medical Education Educational Innovations Project. The practice adopted the Chronic Care Model, and residents received extensive instruction in quality improvement and interprofessional teams.
Program Evaluation
The long block was associated with significant increases in resident and patient satisfaction as well as improvement in multiple quality process and outcome measures. Continuity and no-show rates also improved.
Discussion
An ambulatory long block can be associated with improvements in resident and patient satisfaction, quality measures, and no-show rates. Future research should be done to determine effects of the long block on education and patient care in the long term, and elucidate which aspects of the long block most contribute to improvement.
doi:10.1007/s11606-008-0588-y
PMCID: PMC2517908  PMID: 18612718
ambulatory education; clinic; residency training; chronic care model
14.  Mapping Cognitive Overlaps Between Practice-Based Learning and Improvement and Evidence-Based Medicine: An Operational Definition for Assessing Resident Physician Competence 
Purpose
The complex competency labeled practice-based learning and improvement (PBLI) by the Accreditation Council for Graduate Medical Education (ACGME) incorporates core knowledge in evidence-based medicine (EBM). The purpose of this study was to operationally define a “PBLI-EBM” domain for assessing resident physician competence.
Method
The authors used an iterative design process to first content analyze and map correspondences between ACGME and EBM literature sources. The project team, including content and measurement experts and residents/fellows, parsed, classified, and hierarchically organized embedded learning outcomes using a literature-supported cognitive taxonomy. A pool of 141 items was produced from the domain and assessment specifications. The PBLI-EBM domain and resulting items were content validated through formal reviews by a national panel of experts.
Results
The final domain represents overlapping PBLI and EBM cognitive dimensions measurable through written, multiple-choice assessments. It is organized as 4 subdomains of clinical action: Therapy, Prognosis, Diagnosis, and Harm. Four broad cognitive skill branches (Ask, Acquire, Appraise, and Apply) are subsumed under each subdomain. Each skill branch is defined by enabling skills that specify the cognitive processes, content, and conditions pertinent to demonstrable competence. Most items passed content validity screening criteria and were prepared for test form assembly and administration.
Conclusions
The operational definition of PBLI-EBM competence is based on a rigorously developed and validated domain and item pool, and substantially expands conventional understandings of EBM. The domain, assessment specifications, and procedures outlined may be used to design written assessments to tap important cognitive dimensions of the overall PBLI competency, as given by ACGME. For more comprehensive coverage of the PBLI competency, such instruments need to be complemented with performance assessments.
doi:10.4300/JGME-D-09-00029.1
PMCID: PMC2931258  PMID: 21975994
15.  Changing education to improve patient care 
Quality in Health Care : QHC  2001;10(Suppl 2):ii54-ii58.
Health professionals need competencies in improvement skills if they are to contribute usefully to improving patient care. Medical education programmes in the USA have not systematically taught improvement skills to residents (registrars in the UK). The Accreditation Council for Graduate Medical Education (ACGME) has recently developed and begun to deploy a competency based model for accreditation that may encourage the development of improvement skills by the 100 000 residents in accredited programmes. Six competencies have been identified for all physicians, independent of specialty, and measurement tools for these competencies have been described. This model may be applicable to other healthcare professions. This paper explores patterns that inhibit efforts to change practice and proposes an educational model to provide changes in management skills based on trainees' analysis of their own work.
Key Words: physician education; improvement skills; accreditation; competency
doi:10.1136/qhc.0100054
PMCID: PMC1765748  PMID: 11700380
16.  Duty Hour Recommendations and Implications for Meeting the ACGME Core Competencies: Views of Residency Directors 
Mayo Clinic Proceedings  2011;86(3):185-191.
OBJECTIVE: To describe the views of residency program directors regarding the effect of the 2010 duty hour recommendations on the 6 core competencies of graduate medical education.
METHODS: US residency program directors in internal medicine, pediatrics, and general surgery were e-mailed a survey from July 8 through July 20, 2010, after the 2010 Accreditation Council for Graduate Medical Education (ACGME) duty hour recommendations were published. Directors were asked to rate the implications of the new recommendations for the 6 ACGME core competencies as well as for continuity of inpatient care and resident fatigue.
RESULTS: Of 719 eligible program directors, 464 (65%) responded. Most program directors believe that the new ACGME recommendations will decrease residents' continuity with hospitalized patients (404/464 [87%]) and will not change (303/464 [65%]) or will increase (26/464 [6%]) resident fatigue. Additionally, most program directors (249-363/464 [53%-78%]) believe that the new duty hour restrictions will decrease residents' ability to develop competency in 5 of the 6 core areas. Surgery directors were more likely than internal medicine directors to believe that the ACGME recommendations will decrease residents' competency in patient care (odds ratio [OR], 3.9; 95% confidence interval [CI], 2.5-6.3), medical knowledge (OR, 1.9; 95% CI, 1.2-3.2), practice-based learning and improvement (OR, 2.7; 95% CI, 1.7-4.4), interpersonal and communication skills (OR, 1.9; 95% CI, 1.2-3.0), and professionalism (OR, 2.5; 95% CI, 1.5-4.0).
CONCLUSION: Residency program directors' reactions to ACGME duty hour recommendations demonstrate a marked degree of concern about educating a competent generation of future physicians in the face of increasing duty hour standards and regulation.
doi:10.4065/mcp.2010.0635
PMCID: PMC3046937  PMID: 21307391
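Editor's note: for readers unfamiliar with the odds ratios reported in entry 16 (e.g., OR 3.9, 95% CI 2.5–6.3 for surgery versus internal medicine program directors), the Python sketch below shows the standard calculation of an odds ratio and its Wald confidence interval from a 2×2 table. The counts used are hypothetical, chosen only so the point estimate equals 3.9; they are not the study's data, and the function name odds_ratio_ci is our own.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a/b = group 1 yes/no counts, c/d = group 2 yes/no counts."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lower = math.exp(math.log(or_) - z * se_log)
    upper = math.exp(math.log(or_) + z * se_log)
    return or_, lower, upper

# Hypothetical counts (NOT the study's data): surgery directors answering
# "will decrease patient care competency" yes/no vs internal medicine directors.
or_, lower, upper = odds_ratio_ci(90, 30, 100, 130)
print(f"OR {or_:.1f} (95% CI {lower:.1f}-{upper:.1f})")   # OR 3.9 (95% CI 2.4-6.4)
```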
17.  Practice-Based Learning and Improvement for Institutions: A Case Report 
Background
In 2006, the University of Virginia became one of the first academic medical institutions to be placed on probation, after the Accreditation Council for Graduate Medical Education (ACGME) Institutional Review Committee implemented a new classification system for institutional reviews.
Intervention
After the University of Virginia reviewed its practices and implemented needed changes, the institution was able to have probation removed and full accreditation restored. Whereas graduate medical education committees and designated institutional officials are required to conduct internal reviews of each ACGME-accredited program midway through its accreditation cycle, no similar requirement exists for institutions.
Learning
As we designed corrective measures at the University of Virginia, we realized that regularly scheduled audits of the entire institution would have prevented the accumulation of deficiencies. We suggest that institutional internal reviews be implemented to ensure that the ACGME institutional requirements for graduate medical education are met. This process represents practice-based learning and improvement at the institutional level and may prevent other institutions from receiving unfavorable accreditation decisions.
doi:10.4300/JGME-D-10-00071.1
PMCID: PMC3010952  PMID: 22132290
18.  Barriers to Implementing the ACGME Outcome Project: A Systematic Review of Program Director Surveys 
Introduction
The Accreditation Council for Graduate Medical Education (ACGME) introduced the Outcome Project in July 2001 to improve the quality of resident education through competency-based learning. The purpose of this systematic review is to determine and explore the perceptions of program directors regarding challenges to implementing the ACGME Outcome Project.
Methods
We used the PubMed and Web of Science databases and bibliographies for English-language articles published between January 1, 2001, and February 17, 2012. Studies were included if they described program directors' opinions on (1) barriers encountered when attempting to implement ACGME competency-based education, and (2) assessment methods that each residency program was using to implement competency-based education. Articles meeting the inclusion criteria were screened by 2 researchers. The grading criterion was created by the authors and used to assess the quality of each study.
Results
The survey-based data reported the opinions of 1076 program directors. Barriers that were encountered include: (1) lack of time; (2) lack of faculty support; (3) resistance of residents to the Outcome Project; (4) insufficient funding; (5) perceived low priority for the Outcome Project; (6) inadequate salary incentive; and (7) inadequate knowledge of the competencies. Of the 6 competencies, those pertaining to patient care and medical knowledge received the most responses from program directors and were given highest priority.
Conclusions
The reviewed literature revealed that time and financial constraints were the most important barriers encountered when implementing the ACGME Outcome Project.
doi:10.4300/JGME-D-11-00222.1
PMCID: PMC3546570  PMID: 24294417
19.  Length of Training Debate in Family Medicine: Idealism Versus Realism? 
How long a resident must train to achieve competency is an ongoing debate in medicine. For family medicine, there is an Accreditation Council for Graduate Medical Education (ACGME)–approved proposal to examine the benefits of lengthening family medicine training from 3 to 4 years. The rationale for adding another year of residency in family medicine has included the following: (1) overcoming the effect of the duty hour limits in further reducing educational opportunities, (2) reversing the growing number of first-time takers of the American Board of Family Medicine certification examination who fail to pass it, (3) enhancing the family medicine training experience by “decompressing” the ever-growing number of Residency Review Committee requirements to maintain accreditation, and (4) improving the overall quality of family medicine graduates.
doi:10.4300/JGME-D-12-00250.1
PMCID: PMC3693679  PMID: 24404258
20.  Board review course effect on resident in-training examination 
Background
The in-training examination is a national and yearly exam administered by the American Board of Emergency Medicine to all emergency medicine residents in the USA. The purpose of the examination is to evaluate a resident’s progress toward obtaining the fundamental knowledge to practice independent emergency medicine.
Aims
The purpose of this study was to determine the effects of a 40-hour board review lecture course on the resident in-training examination in emergency medicine.
Methods
A 40-hour board review lecture course was designed and implemented during the weekly 5-hour resident conferences in the 8 weeks preceding the 2006 in-training examination date. Attendance was mandatory at the Accreditation Council for Graduate Medical Education (ACGME) standard of 70% or greater. A positive result was defined as an increase of 10% or greater in a resident's individual national class percentile ranking among national peers for their class year on the emergency medicine in-training examination. A resident was excluded from the study if there was no 2005 in-training examination score for self-comparison. Ninety-five percent confidence intervals (CIs) were used to analyze the results.
Results
Of 16 residents, 1 (6.25%; 95% CI: 0–18%) showed a positive result of increasing their national class percentile ranking by 10% or greater. For PGY-2 residents, 1 of 8 had a positive result (12.5%; 95% CI: 0–35.4%). For PGY-3 residents, none (0%; 95% CI: 0–35.4%) had a positive result.
Conclusions
A 40-hour board review lecture course has no positive effect on improving a resident's in-training examination score.
doi:10.1007/s12245-008-0068-5
PMCID: PMC2657258  PMID: 19384650
In-training examination; Board review; Emergency medicine resident
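Editor's note: the confidence intervals reported in entry 20 (0–18% for 1 of 16 residents overall; 0–35.4% for 1 of 8 PGY-2 residents) are consistent with a normal-approximation (Wald) binomial interval truncated at 0. The Python sketch below is illustrative only and is not drawn from the cited study; the function name wald_ci is our own. Note that a Wald interval is a rough approximation at these small sample sizes; an exact (Clopper–Pearson) interval would be wider.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) CI for a binomial proportion, truncated to [0, 1]."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)   # standard error of the proportion
    lower = max(0.0, p - z * se)
    upper = min(1.0, p + z * se)
    return p, lower, upper

# 1 of 16 residents overall, and 1 of 8 PGY-2 residents, with a positive result
for k, n in [(1, 16), (1, 8)]:
    p, lower, upper = wald_ci(k, n)
    print(f"{k}/{n}: {p:.1%} (95% CI {lower:.1%} to {upper:.1%})")
# prints approximately 6.2% (0.0% to 18.1%) and 12.5% (0.0% to 35.4%)
```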
21.  The State of Evaluation in Internal Medicine Residency 
Journal of General Internal Medicine  2008;23(7):1010-1015.
Background
There are no nationwide data on the methods residency programs are using to assess trainee competence. The Accreditation Council for Graduate Medical Education (ACGME) has recommended tools that programs can use to evaluate their trainees. It is unknown if programs are adhering to these recommendations.
Objective
To describe evaluation methods used by our nation’s internal medicine residency programs and assess adherence to ACGME methodological recommendations for evaluation.
Design
Nationwide survey.
Participants
All internal medicine programs registered with the Association of Program Directors of Internal Medicine (APDIM).
Measurements
Descriptive statistics of programs and tools used to evaluate competence; compliance with ACGME recommended evaluative methods.
Results
The response rate was 70%. Programs were using an average of 4.2–6.0 tools to evaluate their trainees, with heavy reliance on rating forms. Direct observation and practice- and data-based tools were used much less frequently. Most programs were using at least 1 of the ACGME’s “most desirable” methods of evaluation for all 6 measures of trainee competence. These programs had higher support staff to resident ratios than programs using less desirable evaluative methods.
Conclusions
Residency programs are using a large number and variety of tools for evaluating the competence of their trainees. Most are complying with ACGME-recommended methods of evaluation, especially if the support staff to resident ratio is high.
doi:10.1007/s11606-008-0578-0
PMCID: PMC2517950  PMID: 18612734
graduate medical education; residency; ACGME; competency
22.  Chronic Disease Management: A Residency-Led Intervention to Improve Outcomes in Diabetic Patients 
The Ochsner Journal  2012;12(4):323-330.
Background
When quality improvement processes are integrated into resident education, many opportunities are created for improved outcomes in patient care. For Bethesda Family Medicine (BFM), integrating quality improvement into resident education is paramount in fulfilling the Accreditation Council for Graduate Medical Education Practice-Based Learning and Improvement core competency requirements.
Methods
A resident-developed diabetes management treatment protocol that targeted 11 evidence-based measures recommended for successful diabetes management was implemented within the BFM residency and all physician practices under its parent healthcare system. This study compares diabetes management at BFM and at 2 other family medicine practices at timepoints before and after protocol implementation. We measured hemoglobin A1c (HbA1c), low-density lipoprotein (LDL) cholesterol, and systolic blood pressure (SBP) in adult diabetics and compared patient outcomes for these measures for the first and third quarters of 2009 and 2010.
Results
In BFM patients, HbA1c, LDL, and SBP levels decreased, but only the HbA1c improvement persisted long term. For the comparison groups, levels were generally lower than those of BFM patients, but the differences were not significant after the first measurement period.
Conclusions
A resident-led treatment protocol can improve HbA1c outcomes among residents' diabetic patients. Periodic educational interventions can enhance residents' focus on diabetes management. Residents in graduate medical education can initiate treatment protocols to improve patient care in a large healthcare system.
PMCID: PMC3527859  PMID: 23267258
Adult diabetics; diabetes management; graduate medical education; quality improvement; resident education
23.  Providing competency-based family medicine residency training in substance abuse in the new millennium: a model curriculum 
BMC Medical Education  2010;10:33.
Background
This article, developed for the Betty Ford Institute Consensus Conference on Graduate Medical Education (December, 2008), presents a model curriculum for Family Medicine residency training in substance abuse.
Methods
The authors reviewed reports of past Family Medicine curriculum development efforts, previously-identified barriers to education in high risk substance use, approaches to overcoming these barriers, and current training guidelines of the Accreditation Council for Graduate Medical Education (ACGME) and their Family Medicine Residency Review Committee. A proposed eight-module curriculum was developed, based on substance abuse competencies defined by Project MAINSTREAM and linked to core competencies defined by the ACGME. The curriculum provides basic training in high risk substance use to all residents, while also addressing current training challenges presented by U.S. work hour regulations, increasing international diversity of Family Medicine resident trainees, and emerging new primary care practice models.
Results
This paper offers a core curriculum, focused on screening, brief intervention and referral to treatment, which can be adapted by residency programs to meet their individual needs. The curriculum encourages direct observation of residents to ensure that core skills are learned and trains residents with several "new skills" that will expand the basket of substance abuse services they will be equipped to provide as they enter practice.
Conclusions
Broad-based implementation of a comprehensive Family Medicine residency curriculum should increase the ability of family physicians to provide basic substance abuse services in a primary care context. Such efforts should be coupled with faculty development initiatives which ensure that sufficient trained faculty are available to teach these concepts and with efforts by major Family Medicine organizations to implement and enforce residency requirements for substance abuse training.
doi:10.1186/1472-6920-10-33
PMCID: PMC2885404  PMID: 20459842
24.  Effect of a Physician-directed Educational Campaign on Performance of Proper Diabetic Foot Exams in an Outpatient Setting 
BACKGROUND
The established guidelines for a diabetes foot examination include assessing circulatory, skin, and neurological status to detect problems early and reduce the likelihood of amputation. Physician adherence to the guidelines for proper examination is less than optimal.
OBJECTIVE
Our objective was to increase compliance with the performance of a proper foot examination through a predominantly physician-directed interventional campaign.
METHODS
The study consisted of 3 parts: a retrospective chart review to estimate background compliance, an educational intervention, and prospective chart review at 3 and 6 months. A properly documented foot examination was defined as assessing at least 2 of the 3 necessary components. The educational intervention consisted of 2 lectures directed at resident physicians and a quality assurance announcement at a general internal medicine staff meeting. Clinic support staff were instructed to remove the shoes and socks of all diabetic patients when they were placed in exam rooms, and signs reminding diabetics were placed in each exam room.
RESULTS
There was a significant increase in the performance of proper foot examination over the course of the study (baseline 14.0%, 3 months 58.0%, 6 months 62.1%; P < .001). Documentation of any component of a proper foot examination also increased substantially (32.6%, 67.3%, 72.5%; P < .001). Additionally, performance of each component of a proper exam increased dramatically during the study: neurological (13.5%, 35.8%, 38.5%; P < .001), skin (23.0%, 64.2%, 69.2%; P < .001), and vascular (14.0%, 51.2%, 50.5%; P < .001).
CONCLUSIONS
Patients with diabetes are unlikely to have foot examinations in their primary medical care. A simple, low-cost educational intervention significantly improved the adherence to foot examination guidelines for patients with diabetes.
doi:10.1046/j.1525-1497.2003.10662.x
PMCID: PMC1494848  PMID: 12709092
diabetes; foot ulceration; foot exam; prevention; physical education
25.  Meeting Requirements and Changing Culture 
Journal of General Internal Medicine  2004;19(5 Pt 2):492-495.
The Accreditation Council for Graduate Medical Education (ACGME) and the Residency Review Committee require a competency-based, accessible evaluation system. The paper system at our institution did not meet these demands and suffered from low compliance. A diverse committee of internal medicine faculty, program directors, and house staff designed a new clinical evaluation strategy based on ACGME competencies and utilizing a modular web-based system called ResEval. ResEval more effectively met requirements and provided useful data for program and curriculum development. The system is paperless, allows for evaluations at any time, and produces customized evaluation reports, dramatically improving our ability to analyze evaluation data. The use of this novel system, combined with a robust technology infrastructure, repeated training and e-mail reminders, and program leadership commitment, resulted in an increase in clinical skills evaluations performed and a rapid change in the workflow and culture of evaluation at our residency program.
doi:10.1111/j.1525-1497.2004.30065.x
PMCID: PMC1492338  PMID: 15109310
house staff evaluation; clinical skills assessment; internal medicine residency; Internet; educational measurement
