Results 1-11 (11)
4.  Development, Testing, and Implementation of the ACGME Clinical Learning Environment Review (CLER) Program 
Since the release of the Institute of Medicine's report on resident hours and patient safety, there have been calls for enhanced institutional oversight of duty hour limits and of efforts to enhance the quality and safety of care in teaching hospitals. The ACGME has established the Clinical Learning Environment Review (CLER) program as a key component of the Next Accreditation System, with the aim of promoting the safety and quality of care in teaching hospitals and of the care residents will provide in a lifetime of practice after completing training. The program focuses on 6 areas: engagement of residents in patient safety; quality improvement; care transitions; appropriate resident supervision; duty hour oversight and fatigue management; and professionalism.
Over the coming 18 months, the ACGME will develop, test, and fully implement this new program by conducting visits to the nearly 400 clinical sites of sponsoring institutions with two or more specialty or subspecialty programs. These site visits will provide an understanding of how the learning environment for the 116,000 current residents and fellows addresses the 6 areas important to safety and quality of care, and will generate baseline data on the status of these activities in accredited institutions. We expect that, over time, the CLER program will serve as a new source of formative feedback for teaching institutions and generate national data that will guide performance improvement in United States graduate medical education.
doi:10.4300/JGME-04-03-31
PMCID: PMC3444205  PMID: 23997895
5.  The Next Accreditation System: Stakeholder Expectations and Dialogue with the Community 
In February 2012, in an article in the New England Journal of Medicine,1 the Accreditation Council for Graduate Medical Education (ACGME) provided an initial description of, and the rationale for, the Next Accreditation System (NAS). This piece follows up by reflecting on questions about the NAS, serving as a starting point for a dialogue with the community and as the first in a series of articles that will describe key attributes of the NAS, offer practical guidance to programs and sponsoring institutions, and solicit stakeholder input. Dialogue with the community will help answer questions and allow the ACGME to clarify and refine certain elements of the NAS, mindful that many details of the NAS are yet to be finalized. In communicating about the NAS, the ACGME must therefore balance a timely response to the community's desire to learn more against the need to have details well established, so as to avoid making changes after details have been released to stakeholders and the public.
doi:10.4300/JGME-04-02-35
PMCID: PMC3399632  PMID: 23730461
9.  Tracking Residents Through Multiple Residency Programs: A Different Approach for Measuring Residents' Rates of Continuing Graduate Medical Education in ACGME-Accredited Programs 
Background
Increased focus on the number and type of physicians delivering health care in the United States necessitates a better understanding of changes in graduate medical education (GME). Data collected by the Accreditation Council for Graduate Medical Education (ACGME) allow longitudinal tracking of residents, revealing the number and type of residents who continue GME following completion of an initial residency. We examined trends in the percent of graduates pursuing additional clinical education following graduation from ACGME-accredited pipeline specialty programs (specialties leading to initial board certification).
Methods
Using data collected annually by the ACGME, we tracked residents graduating from ACGME-accredited pipeline specialty programs between academic year (AY) 2002–2003 and AY 2006–2007 and those pursuing additional ACGME-accredited training within 2 years. We examined changes in the number of graduates and the percent of graduates continuing GME by specialty, by type of medical school, and overall.
Results
The number of pipeline specialty graduates increased by 1171 (5.3%) between AY 2002–2003 and AY 2006–2007. During the same period, the number of graduates pursuing additional GME increased by 1059 (16.7%). The overall rate of continuing GME increased each year, from 28.5% (6331/22229) in AY 2002–2003 to 31.6% (7390/23400) in AY 2006–2007. Rates differed by specialty and for US medical school graduates (26.4% [3896/14752] in AY 2002–2003 to 31.6% [4718/14941] in AY 2006–2007) versus international medical graduates (35.2% [2118/6023] to 33.8% [2246/6647]).
Conclusion
The number of graduates and the rate of continuing GME increased from AY 2002–2003 to AY 2006–2007. Our findings show a recent increase in the rate of continued training for US medical school graduates relative to international medical graduates. Our results differ from previously reported rates of subspecialization in the literature. Tracking individual residents through residency and fellowship programs provides a better understanding of residents' pathways to practice.
doi:10.4300/JGME-D-10-00105.1
PMCID: PMC3010950  PMID: 22132288
10.  Residency Programs' Evaluations of the Competencies: Data Provided to the ACGME About Types of Assessments Used by Programs 
Background
In 1999, the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project began to focus on resident performance in the 6 competencies of patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. Beginning in 2007, the ACGME began collecting information on how programs assess these competencies. This report provides information on the nature and extent of those assessments.
Methods
Using data collected by the ACGME for site visits, we computed descriptive statistics and percentages to describe the number and type of methods and assessors that accredited programs (n = 4417) report using to assess the competencies. Observed differences among specialties, methodologies, and assessors were tested with analysis of variance procedures.
Results
Almost all (>97%) programs report assessing all of the competencies, using multiple methods and multiple assessors. Similar assessment methods and evaluator types were used consistently across the 6 competencies. However, there were some differences in the use of patients and family members as assessors: primary care and ambulatory specialties used these to a greater extent than other specialties did.
Conclusion
Residency programs are emphasizing the competencies in their evaluation of residents. Understanding the scope of evaluation methodologies that programs use in resident assessment is important for both the profession and the public, so that together we may monitor continuing improvement in US graduate medical education.
doi:10.4300/JGME-02-04-30
PMCID: PMC3010956  PMID: 22132294
11.  Assessing Physicians' Orientation Toward Lifelong Learning 
BACKGROUND
Despite the importance of lifelong learning as an element of professionalism, no psychometrically sound instrument is available for its assessment among physicians.
OBJECTIVE
To assess the validity and reliability of an instrument developed to measure physicians' orientation toward lifelong learning.
DESIGN
Mail survey.
PARTICIPANTS
Seven hundred and twenty-one physicians, of whom 444 (62%) responded.
MEASUREMENT
The Jefferson Scale of Physician Lifelong Learning (JSPLL), which includes 19 items answered on a 4-point Likert scale, was used with additional questions about respondents' professional activities related to continuous learning.
RESULTS
Factor analysis of the JSPLL yielded 4 subscales entitled “professional learning beliefs and motivation,” “scholarly activities,” “attention to learning opportunities,” and “technical skills in seeking information,” which are consistent with widely recognized features of lifelong learning. The validity of the scale and its subscales was supported by significant correlations with a set of criterion measures that presumably require continuous learning. The internal consistency reliability (coefficient α) of the JSPLL was 0.89, and the test-retest reliability was 0.91.
CONCLUSIONS
Empirical evidence supports the validity and reliability of the JSPLL.
doi:10.1111/j.1525-1497.2006.00500.x
PMCID: PMC1831612  PMID: 16918737
lifelong learning; physicians; psychometrics; validity; reliability