1. Description of a Developmental Criterion-Referenced Assessment for Promoting Competence in Internal Medicine Residents
Rationale
End-of-rotation global evaluations can be subjective, produce inflated grades, lack interrater reliability, and offer information of little value. This article outlines the creation of a unique developmental criterion-referenced assessment that applies adult learning theory and the learner, manager, teacher model, and represents an innovative application of the competencies to the American Board of Internal Medicine (ABIM) 9-point scale.
Intervention
We describe the process used by Southern Illinois University School of Medicine to develop rotation-specific, criterion-based evaluation anchors that evolved into an effective faculty development exercise.
Results
The intervention gave faculty a clearer understanding of the 6 Accreditation Council for Graduate Medical Education competencies, each rotation's educational goals, and how rotation design affects meaningful work-based assessment. We also describe easily attainable successes in evaluation design and pitfalls that other institutions may be able to avoid. Shifting the evaluation emphasis to the residents' development of competence has made the expectations of rotation faculty more transparent, has facilitated conversations between the program director and residents, and has improved the specificity of the feedback the tool provides. Our findings showed that the new approach reduced grade inflation compared with the ABIM end-of-rotation global evaluation form.
Discussion
We offer the new developmental criterion-referenced assessment, a unique application of the competencies to the ABIM 9-point scale, as a transferable model for improving the validity and reliability of resident evaluations across graduate medical education programs.
doi:10.4300/01.01.0012
PMCID: PMC2931180  PMID: 21975710
2. Development of the Objective Structured System-Interaction Examination
Study Objective
The purpose of this study was to develop an objective method of evaluating resident competency in systems-based practice.
Study Design
Faculty developed a 12-station examination, the Objective Structured System-Interaction Examination (OSSIE), patterned after Objective Structured Clinical Examinations (OSCEs), to evaluate residents' ability to work effectively within the complex system of medical care. Scenarios consisted of multiple situations, such as patient hand-offs, consultations, complicated discharges, and family meetings, in which residents interacted with simulated professionals, simulated patients, and simulated family members to demonstrate systems-based skills. Twelve second-year residents participated in the OSSIE.
Findings
Along with the simulated professionals, a faculty member provided each resident with immediate feedback and completed an evaluation form designed specifically to assess systems-based practice. Residents, faculty, and staff evaluated the OSSIE and felt that it provided a rich learning experience and was a beneficial means of formative assessment. The residents' third-year learning experiences were adapted to meet their needs, and suggestions were offered for curriculum revision.
Discussion
The OSSIE is unique in that it uses standardized professionals, involves scenarios in a variety of settings, and incorporates current technology, including an electronic health record and a state-of-the-art simulation laboratory, into the examination. Challenges to implementation include faculty time, scheduling of residents, and availability of resources.
Conclusion
By using the OSSIE, faculty are able to assess resident performance, provide constructive feedback, and tailor training opportunities to improve competence in systems-based practice. The reliability and validity of an instrument developed for use with the OSSIE are currently being determined.
doi:10.4300/01.01.0013
PMCID: PMC2931182  PMID: 21975711
