Physician communication skills are associated with improved patient satisfaction, better health outcomes, greater adherence to treatment, and more active self-management of chronic illnesses.1–7
The Accreditation Council for Graduate Medical Education, the American Board of Medical Specialties, and the Association of American Medical Colleges have underscored the importance of communication skills by including training and assessment in communication and interpersonal skills among the core competency domains.8–10
The Bayer-Fetzer Conference on Physician-Patient Communication in Medical Education convened authors of the major theoretical models of physician-patient communication and other important stakeholders to reach a consensus on the essential elements that characterize physician-patient communication. The resulting report, termed the Kalamazoo Consensus Statement (Kalamazoo I), identified 7 key elements of communication in clinical encounters: build the relationship, open the discussion, gather information, understand the patient's perspective, share information, reach agreement, and provide closure.1
It was hoped that, by providing a common framework, this expert consensus statement would facilitate the development of communication curricula and assessment tools in medical education. The later Kalamazoo II report recommended specific methods for assessing communication skills, including direct observation with real patients, ratings of simulated encounters with standardized patients (SPs), ratings of video- or audio-taped interactions, patient surveys, and knowledge/skills/attitudes examinations.11
The original Kalamazoo Essential Elements Checklist included 23 items assessing subcompetencies identified in the Kalamazoo I report. However, its scaling options (done well, needs improvement, not done) limited a rater's ability to distinguish among the range of physician communication skills, and it required considerable time to complete. In response to these limitations, Calhoun et al12 adapted the checklist by replacing the original response options with a 5-point Likert scale that allowed raters to evaluate each communication skill on a continuum from “poor” to “excellent,” and shifted to global ratings for the 7 essential elements. This tool is subsequently referred to as the KEECC-A.
Initial studies of the KEECC-A suggest that it is a flexible tool with psychometric data supporting its use in some medical education settings. Rider et al2 studied fellows during a simulated family meeting using the KEECC-A and a gap analysis as part of a multisource assessment of communication skills. The investigators added 2 dimensions (demonstrates empathy and communicates information accurately) to the KEECC-A instrument and found Cronbach α values of .84 for the original 7 dimensions and .87 for the 9 dimensions of their modified tool. These data provide evidence for the internal consistency of the measure when completed by peer raters and suggest that one strength of the KEECC-A is its ease of use by multiple raters.12
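For readers unfamiliar with the statistic, Cronbach α estimates internal consistency from the variances of the individual items and the variance of the total score. The following minimal Python sketch illustrates the computation; the rating matrix shown is hypothetical 5-point Likert data for illustration only, not data from any of the studies cited here:

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for an (encounters x items) matrix of ratings.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    k = ratings.shape[1]                          # number of rated items
    item_vars = ratings.var(axis=0, ddof=1)       # sample variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 4 encounters x 3 items on a 5-point Likert scale
ratings = np.array([[4, 4, 5],
                    [3, 3, 3],
                    [5, 5, 5],
                    [2, 3, 2]], dtype=float)
print(round(cronbach_alpha(ratings), 3))  # high alpha: items rise and fall together
```

An α near 1 indicates that the items move together across encounters, which is the sense in which the .84 and .87 values reported above support the internal consistency of the KEECC-A dimensions.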
The usability of the KEECC-A has also been highlighted. Schirmer et al13 reported that the KEECC-A helped less experienced faculty raters focus on the 7 key elements of communication. Furthermore, Calhoun et al12 found that the average time to complete the KEECC-A, plus 2 additional dimensions, was 7 (±2.7) minutes, suggesting that the tool is feasible for faculty use.
We developed an institutional communication skills curriculum for first-year core residency programs, using the Kalamazoo framework to guide curricular development and assessment. The curriculum focused on 3 key topic areas: informed consent, disclosure of errors, and sharing bad news. Each topic area comprised an online module, a small-group discussion facilitated by the program director or key clinical faculty, and a 3-station objective structured clinical examination (OSCE), for a total of 9 OSCEs. Residents, faculty, and SPs used the KEECC-A as the assessment tool for the OSCEs.
This article extends the work of Rider et al14 and Calhoun et al12 by examining evidence for the reliability and validity of the KEECC-A using ratings from SPs, faculty, and resident self-ratings. In addition, scores on the KEECC-A and the American Board of Internal Medicine Patient Satisfaction Questionnaire (PSQ) were analyzed for a subgroup of residents to examine evidence of the measure's convergent validity. The KEECC-A has the potential to fill the need for a communication assessment tool that is linked to an expert conceptual framework, is brief, and can be used by a wide variety of raters and specialties.