J Gen Intern Med. 2010 September; 25(Suppl 4): 586–592.
Published online 2010 August 25. doi: 10.1007/s11606-010-1358-1
PMCID: PMC2940445

Developing Measures of Educational Change for Academic Health Care Teams Implementing the Chronic Care Model in Teaching Practices

Abstract

BACKGROUND

The Chronic Care Model (CCM) is a multidimensional framework designed to improve care for patients with chronic health conditions. The model strives for productive interactions between informed, activated patients and proactive practice teams, resulting in better clinical outcomes and greater satisfaction. While measures for improving care may be clear, measures of residents’ competency to provide chronic care do not exist. This report describes the process used to develop educational measures and results from CCM settings that used them to monitor curricular innovations.

SUBJECTS

Twenty-six academic health care teams participating in the national and California Academic Chronic Care Collaboratives.

METHOD

Using successive discussion groups and surveys, participants engaged in an iterative process to identify desirable and feasible educational measures for curricula that addressed educational objectives linked to the CCM. The measures were designed to help residency programs address new accreditation requirements and were tested with teams actively engaged in redesigning educational programs.

ANALYSIS

Field notes from each discussion and lists from work groups were synthesized using the CCM framework. Descriptive statistics were used to report survey results and measurement performance.

RESULTS

Work groups generated educational objectives and 17 associated measurements. Seventeen (65%) teams provided feasibility and desirability ratings for the 17 measures. Two process measures were selected for use by all teams. Teams reported variable success using the measures. Several teams reported use of additional measures, suggesting more extensive curricular change.

CONCLUSION

Using an iterative process in collaboration with program participants, we successfully defined a set of feasible and desirable education measures for academic health care teams using the CCM. These were used variably to measure the results of curricular changes, while simultaneously addressing requirements for residency accreditation.

Electronic supplementary material

The online version of this article (doi:10.1007/s11606-010-1358-1) contains supplementary material, which is available to authorized users.

KEY WORDS: Chronic Care Model, quality improvement, graduate medical education, ambulatory care, practice-based learning and improvement, systems-based practice

BACKGROUND

As the number of individuals living with a chronic illness increases, caring for patients with chronic health conditions has become a common endeavor for physicians-in-training in all settings.1 Optimal management of chronic illnesses should be a central goal for any medical education curriculum. Yet the existing care models and their associated learning environments fall short of achieving expected outcomes for many chronic conditions.2 In an effort to address this gap in care, new models of care delivery have been tested in clinical practices,3,4 but few reports describe efforts to make similar changes in academic training environments.5,6 In 2005, the Institute for Improving Clinical Care of the Association of American Medical Colleges (AAMC) launched a national collaborative designed to train health care teams that included residents-in-training to implement a new model for delivering systematic chronic care, the Chronic Care Model (CCM), in academic primary care practices. Based on the early successes of this national collaborative, the California Healthcare Foundation funded a similar effort in the state of California in 2007.

The Chronic Care Model (CCM) is a multidimensional framework designed to assist practices with improving care for patients with chronic health conditions.7 It differs from the traditional ambulatory visit, in which an individual physician evaluates an uninformed, passive patient. The redesigned practice strives for productive interactions between informed, activated patients and proactive practice teams, resulting in better clinical outcomes and greater patient and clinician satisfaction. The CCM is summarized in Table 1. Practices that employ the CCM use evidence to provide quality care (decision support); redesign the way team care is delivered to improve quality (delivery system design); implement information systems to improve communication between team members, provide evidence-based reminders for care decisions, monitor the effectiveness of care for individual patients and populations, and identify patients in need of planned care interventions (clinical information systems); and support patients in their self-care through education, skill building, and self-belief (self-management support). These attributes are situated in the larger context of the health system that values and provides incentives for quality chronic care delivery (health care organization) and the community that supports patients in their self-management (community resources and policies). The CCM uses well-defined, evidence-based process and outcome measures to assess progress toward optimal management of individuals and populations with a specified chronic condition. For example, a practice monitoring improvement of care for patients with diabetes mellitus will track the number of patients reaching the target for glycemic control (outcome measure) as well as the number with up-to-date monofilament foot examinations (process measure).

Table 1
Chronic Care Model Components

Implementing the CCM in academic primary care practices where residents provide some or all of the clinical care requires making changes in the educational program simultaneously with making changes in the practice. While the outcome measures for improving care may be clear, no similar measures of educational progress existed at the time the national collaborative was launched. Because a shift in residency program accreditation was underway,8 we saw this as an opportunity to assist the programs involved in the collaboratives with addressing new competency-based accreditation requirements.

In 1999, the Accreditation Council for Graduate Medical Education (ACGME) endorsed six new general competencies that all graduates must achieve and linked these competencies to residency program accreditation.8 Two of these competencies in particular, systems-based practice (SBP) and practice-based learning and improvement (PBLI), can readily be met through resident participation in efforts to implement the CCM in their continuity clinic practices or other ambulatory settings.

Our aim was to develop a discrete set of educational measures at the interface between the CCM and the SBP and PBLI competencies that would facilitate curricular change in residency programs committed to redesigning their clinical practices using the CCM. We report here the iterative process undertaken to develop measures of educational change and the resulting measures that teams participating in the national and California collaboratives used to monitor curricular innovations.

METHODS

Setting

In January 2005, the AAMC, in partnership with the MacColl Institute for Healthcare Innovation (Group Health Research Institute, Seattle, WA) and supported by the Robert Wood Johnson Foundation, called for proposals from teaching hospitals to participate in a national academic chronic care collaborative for the purpose of implementing the CCM in academic practice settings. Participants were required to form quality improvement teams that included both practice (e.g., physician and nurse champions) and educational (e.g., faculty and resident) leaders. Thirty-six self-selected quality improvement teams from 22 institutions across the US participated in this national collaborative. For most teams, the residency program director or an associate program director with the ability to facilitate change in the curriculum served as the educational leader. Teams participated in a series of five training sessions (four face-to-face and one virtual, Web-based) between June 2005 and November 2006 to learn about the CCM and the Model for Improvement, a method that uses rapid-cycle tests of change to improve the quality of care.9 In order to monitor their local improvement efforts, teams reported monthly on various process and outcome measures related to their work. For example, if the team was working to improve the quality of diabetes care, reported measures included the percent of the diabetic population in the practice with a hemoglobin A1c measured in the previous 6 months (process measure) and the percent of the population with A1c measurements less than 7.0% (outcome measure). In addition, teams were expected to implement educational changes that integrated the CCM into the residency program curriculum.
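
To make the reported measures concrete, the following sketch shows one way a practice could compute the monthly diabetes measures described above from a disease registry. This is a minimal illustration, not the collaborative's actual reporting tool; the registry structure, field names, and values are hypothetical.

    # A minimal sketch of computing the monthly diabetes measures from a
    # practice registry; the registry structure and values are hypothetical.
    from datetime import date, timedelta

    # One hypothetical record per patient with diabetes in the practice registry.
    registry = [
        {"patient_id": 1, "last_a1c_date": date(2006, 3, 10), "last_a1c_value": 6.8},
        {"patient_id": 2, "last_a1c_date": date(2005, 9, 1),  "last_a1c_value": 8.2},
        {"patient_id": 3, "last_a1c_date": date(2006, 5, 22), "last_a1c_value": 7.4},
        {"patient_id": 4, "last_a1c_date": date(2006, 1, 15), "last_a1c_value": 6.5},
    ]

    def monthly_diabetes_measures(rows, as_of):
        """Return (process %, outcome %) for the diabetic population."""
        window_start = as_of - timedelta(days=182)  # roughly the previous 6 months
        n = len(rows)
        # Process measure: percent with a hemoglobin A1c measured in the window.
        measured = sum(1 for r in rows if r["last_a1c_date"] >= window_start)
        # Outcome measure: percent of the population with A1c < 7.0%.
        controlled = sum(1 for r in rows if r["last_a1c_value"] < 7.0)
        return 100 * measured / n, 100 * controlled / n

    process, outcome = monthly_diabetes_measures(registry, as_of=date(2006, 6, 1))
    print(f"A1c checked in last 6 months: {process:.0f}%; A1c < 7.0%: {outcome:.0f}%")
    # -> A1c checked in last 6 months: 75%; A1c < 7.0%: 50%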

Based on the early successes of this national collaborative, the California Healthcare Foundation funded a similar effort for teaching hospitals in California in 2007. Teams participating in the national collaborative served as the development cohort, and those participating in the California collaborative served as the implementation cohort for the educational measures.

At the first training session of the national collaborative, we engaged volunteer participants in an iterative development process to (1) define potential educational objectives at the interface between the SBP and PBLI competencies and the Chronic Care Model, and propose specific measurement strategies, (2) determine which educational measures were both desirable and feasible at the local residency program and practice level, (3) select two required measures for use across all teams involved in the collaborative, and (4) report on educational outcomes using these measures and additional optional measures as desired. We chose this iterative method because it mirrored the quality improvement model whereby the end users are engaged in the development process as the most knowledgeable contributors and most invested in using the product.

Educational Measures

Using a small-group process, education representatives from each team and all other interested participants worked to define potential educational objectives and measures of educational change. The ACGME competencies were reviewed with attention to the specific detailed objectives for the SBP and PBLI competencies. Participants were subdivided into six smaller groups based on the six components of the CCM and charged with generating learning objectives that would address their assigned CCM component and some part of the SBP or PBLI competencies. For example, the small group assigned to the Clinical Information Systems (CIS) component of the CCM suggested: “Residents routinely receive reports about their practice outcomes for the disease of interest (e.g., diabetes mellitus)” as an educational objective linked to the PBLI competency.

Ideas from each group were recorded and used to generate a list of learning objectives tied to each component of the CCM and linked to either the SBP or PBLI competency. This summary report was presented to the entire group of participants. Through discussion, the ideas were validated as representative and further revised for clarity. For example, the suggestion above was revised to “Residents will improve their understanding of the practice’s performance for the condition of interest.”

The authors then developed a survey with 17 proposed educational measures linked to these educational objectives (see online Appendix). For the example objective above, the measurement was: “Number of residents receiving, reviewing, and discussing at least one registry report for the practice population.” The disease registry is the primary way that practices keep track of the relevant clinical quality measures as a reflection of the quality of care patients are receiving. All participating teams were invited to rate each potential educational measure based on the value it would provide to the team in facilitating educational change at the training site (“desirability”) and whether it would be feasible for the team to implement the measure at the training site (“feasibility”). Likert scales were used to rate each item where 1 = not at all desirable/feasible, 3 = neutral, and 5 = highly desirable/feasible. Surveys and reminders were distributed via e-mail. Mean ratings were calculated for all survey items. A minimal threshold mean score of “4 = somewhat desirable/feasible” or higher was used to retain educational measures for further consideration.
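
As an illustration of the survey analysis just described, the following sketch computes mean desirability and feasibility ratings per proposed measure and retains those whose means reach the threshold of 4 on both scales. The measure names and team ratings below are invented for the example.

    # A minimal sketch of the survey analysis: mean desirability and
    # feasibility ratings per proposed measure, retaining measures whose
    # means both reach the threshold. Names and ratings are invented.
    survey_ratings = {
        # measure -> list of (desirability, feasibility) ratings, each 1-5
        "registry report review":   [(5, 4), (4, 5), (5, 4)],
        "community resource visit": [(4, 2), (3, 3), (4, 2)],
    }

    THRESHOLD = 4.0  # mean of "4 = somewhat desirable/feasible" or higher

    def retained_measures(ratings, threshold=THRESHOLD):
        retained = []
        for measure, pairs in ratings.items():
            mean_desirable = sum(d for d, _ in pairs) / len(pairs)
            mean_feasible = sum(f for _, f in pairs) / len(pairs)
            if mean_desirable >= threshold and mean_feasible >= threshold:
                retained.append(measure)
        return retained

    print(retained_measures(survey_ratings))  # -> ['registry report review']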

During subsequent workshops at the next collaborative training session, survey results were presented, and the six potential educational measures meeting the minimal threshold for consideration were discussed. Based on these discussions and the goals of the collaborative, the authors selected two ‘required’ measures for use across the collaborative. These were chosen for their high desirability and feasibility ratings and their potential to promote educational change in new curricular areas. The remaining 15 measures were optional for teams.

Teams identified the population of ‘learners’ they considered the subjects of the educational changes. In most cases, the targeted learners were the residents expected to participate in the chronic care curriculum, but some teams also included others (e.g., teaching faculty) who were learning about the CCM. Teams were required to send monthly reports to the leadership of the collaborative, including counts of learners who had met the expectations of the required measures and counts of learners for any optional educational measures they chose to use. We used descriptive statistics to report results.

RESULTS

Successive workshop discussions generated several learning objectives. For the PBLI competency, participants generated learning objectives related to two components of the CCM: clinical information systems and decision support. For the SBP competency, learning objectives were generated for four CCM components: health system, delivery system design, self-management support, and community resources. Table 2 shows the reformulation of these learning objectives, linking CCM components, ACGME competencies, and potential educational measures for each objective. Most of the educational measures were classified as “process” measures or ways of tracking resident “exposure” to new concepts. Six of the 17 measures were considered “outcome” measures, requiring residents to demonstrate observable behaviors.

Table 2
Relationship Between Two ACGME Competencies,8 the Chronic Care Model Components, Educational Objectives, and Measures of Educational Change

Seventeen of 26 (65%) participating teams completed the Education Measures desirability/feasibility survey (7 from Family Medicine programs, 7 from Internal Medicine programs, 2 from Medicine-Pediatrics programs, and 1 from a Nurse Practitioner program). Results are shown in Table 3. The six measures with mean ratings of “somewhat to highly” feasible and desirable were retained for further discussion (shown in bold). Two of the retained measures were outcome measures related to critical appraisal of the literature and answering clinical questions, activities familiar to residency training curricula. The remaining four measures were process measures not routinely a part of residency training: analyzing population reports on the quality of care provided and improving the patient-centeredness of the practice. The two measures that were selected for reporting across all participating teams were: (1) number of residents receiving, reviewing, and discussing at least one registry report for the practice population (PBLI competency objective under the clinical information systems component of the CCM); (2) number of residents learning and demonstrating self-management support strategies (SBP competency objective under the self-management support component of the CCM) (Table 3).

Table 3
Results of Education Measures Survey, Required and Optional Measures

National Collaborative Results

Teams participating in the national collaborative developed the measures as described above. In the fourth month of the program, these teams were asked to begin reporting on the required education measures. Only 4 of 26 and 3 of 26 participating teams reported results for the first and second required educational measures, respectively, between October 2005 and October 2006, with 383 learners exposed to at least one of the curricular innovations assessed by the required education measures. The target learner population, initially defined as the residents in the clinical practice sites, evolved to include all team members in the practice sites, since the concepts were new to everyone. Because both of the required measures represented new concepts in the curriculum, all teams reported a performance baseline of zero. Data are shown in Table 4.

Table 4
Team Performance on Education Measures

California Collaborative

The California collaborative employed the educational measures developed by the national collaborative. The majority of teams reported these measures after the first team training session, and the longer period of use provided more opportunity to observe their effect. Results are shown in Table 4.

Figures 1 and 2 illustrate cumulative improvement over the reporting period for 15 California collaborative teams reporting results for at least 6 months. In addition to the required measures, some teams used additional education measures. Examples include: percent of learners conducting a planned visit (12 teams, 178 learners, 60% of learners achieved), percent of learners identifying, learning, and teaching others about a clinical question (5 teams, 47 learners, 88% achieved), and percent of learners completing a Plan-Do-Study-Act cycle (9 teams, 149 learners, 95% achieved).

Figure 1
Required measure 1: percent of learners reviewing a registry report. Legend: cumulative results for all reporting teams in California collaborative.
Figure 2
Required measure 2: percent of learners demonstrating self-management support strategies. Legend: cumulative results for all reporting teams in California collaborative.
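
The cumulative series plotted in Figures 1 and 2 can be illustrated with a short sketch: each month's count of learners newly meeting a measure is accumulated and expressed as a percent of the targeted learner population. The monthly counts and target size below are hypothetical, chosen only to show the computation.

    # A minimal sketch of the cumulative reporting behind Figures 1 and 2;
    # the monthly counts and target population are hypothetical.
    def cumulative_percent(new_learners_per_month, target_population):
        met = 0
        series = []
        for new in new_learners_per_month:
            met += new  # learners newly meeting the measure this month
            series.append(round(100 * met / target_population, 1))
        return series

    print(cumulative_percent([5, 7, 6, 8, 4, 6], target_population=40))
    # -> [12.5, 30.0, 45.0, 65.0, 75.0, 90.0]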

DISCUSSION

Using an iterative process in collaboration with those most affected by the outcome, we successfully defined a set of feasible and desirable education measures at the intersection of the CCM and the ACGME competencies that residency programs addressing curricular changes in chronic care could use to measure results. To facilitate early success in exposing learners to the new care delivery model, we chose process rather than outcome measures as the required measures. We based this decision on our desire to minimize the additional burden on programs already undertaking extensive practice re-design at their local training sites and on the relative resistance to more robust measures that we detected during the development process.

As demonstrated by the low reporting rate, teams participating in the national collaborative had difficulty using the educational measures on top of the other requirements of their participation in the collaborative. This was less of a problem for California collaborative participants. Two observations may explain part of this difference. First, many of the California teams had previously used disease registries in their practices and did not need to overcome this significant obstacle. Second, a leading self-management program had previously been developed and championed in California,11 and many team members were familiar with the concepts. Although the development teams from the national collaborative lagged behind the implementation teams in the California collaborative, together they engaged more than 700 learners in the new curriculum. In addition, 80% of the implementation teams that reported measures used additional measures to drive the curricular change they desired.

Initially, we anticipated that residents would be the defined learner population. Teams recognized early that all members of the practice re-design teams, including faculty, were learning new skills and should be counted as learners participating in the curriculum. As a result, faculty development was recognized to be as important a focus as resident engagement in the new curriculum.

Our study has several limitations. Not all teams participated in the development process, although 65% of teams completed the feasibility/desirability survey. We did not track specific participation and cannot comment on the differences between participants and non-participants. We also cannot detect differences between training disciplines; different measures may be more important in different learning environments or to different disciplines. Not all teams reported results of educational measures on a monthly basis, and some did not report at all. Thus, these results reflect what is possible for teams who were able to overcome barriers to curricular change and successfully implement the CCM in their academic practices. Finally, these measures are for educational use, and their impact on patient care quality has not been evaluated.

This report offers a new approach to educational assessment for the development of ACGME competencies in the setting of chronic care. This work presents a framework for measuring educational outcomes linked to use of the CCM in residency training programs. Although process measures do not assure that new skills have been mastered, the stage is set for testing and refining educational outcome measures to improve residency training in chronic illness care.

Electronic supplementary material

Below is the link to the electronic supplementary material.

ESM 1 (DOC 32 kb)

Acknowledgement

The Robert Wood Johnson Foundation and the California Healthcare Foundation generously supported the academic chronic care collaboratives that served as the basis for this work.

Conflict of interest None disclosed.

References

1. Bodenheimer T, Chen E, Bennett H. Confronting the growing burden of chronic disease: can the US health care workforce do the job? Health Affairs. 2009;28:64–74. doi: 10.1377/hlthaff.28.1.64.
2. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press; 2001.
3. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness. JAMA. 2002;288:1775–1779. doi: 10.1001/jama.288.14.1775.
4. Bodenheimer T, Wagner EH, Grumbach K. Improving primary care for patients with chronic illness: the chronic care model, part 2. JAMA. 2002;288:1909–1914. doi: 10.1001/jama.288.15.1909.
5. Warm EJ, Schauer DP, Diers T, et al. The ambulatory long-block: an Accreditation Council for Graduate Medical Education (ACGME) Educational Innovations Project (EIP). J Gen Intern Med. 2008;23:921–926. doi: 10.1007/s11606-008-0588-y.
6. Dipiero A, Dorr DA, Kelso C, Bowen JL. Integrating systematic chronic care for diabetes into an academic general internal medicine resident-faculty practice. J Gen Intern Med. 2008;23:1749–1756. doi: 10.1007/s11606-008-0751-5.
7. Wagner EH, Austin BT, Davis C, Hindmarsh M, Schaefer J, Bonomi A. Improving chronic illness care: translating evidence into action. Health Affairs. 2001;20:64–78. doi: 10.1377/hlthaff.20.6.64.
8. ACGME Outcome Project. Available at: http://www.acgme.org/outcome/comp/compMin.asp. Accessed March 23, 2010.
9. Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The improvement guide: a practical approach to enhancing organizational performance. 2nd ed. San Francisco: Jossey-Bass Publishers; 2009.
10. Bonomi AE, Wagner EH, Glasgow RE, Von Korff M. Assessment of Chronic Illness Care (ACIC): a practical tool to measure quality improvement. Health Serv Res. 2002;37:791–820. doi: 10.1111/1475-6773.00049.
11. Lorig KR, Ritter P, Stewart AL, et al. Chronic disease self-management program: two-year health status and health care utilization outcomes. Med Care. 2001;39:1217–1223. doi: 10.1097/00005650-200111000-00008.
12. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138:476–481.
