J Grad Med Educ. 2009 September; 1(1): 49–60.
PMCID: PMC2931181

Systems-Based Practice Defined: Taxonomy Development and Role Identification for Competency Assessment of Residents



To demonstrate a methodology for coding and taxonomy development and to operationally define residents' competence in systems-based practice (SBP) in terms of observable roles, actions, and behaviors.


The Accreditation Council for Graduate Medical Education's (ACGME's) full-text definition of SBP and the 6 discrete expectations it contains were content analyzed. Structured interviews of 88 health care professionals, using a variant of focus group interviewing called the nominal group process, were conducted and qualitatively analyzed to identify the key attributes of SBP. Themes obtained from these 2 procedures were conceptually matched and organized to create a taxonomy of observable SBP behaviors and the SBP domain.


Six general resident roles emerged, under which 35 specific behavioral attributes were subsumed; together these constituted the specified SBP domain. Sample SBP items, categorized by role, were derived that reflected “in-context” representations of ACGME SBP expectations.


Our comprehensive analysis created an operational representation of the SBP competency. The taxonomy development model provides a framework for constructing assessment instrument(s) that could be applied to the other ACGME competencies or complex concepts in medical education.


In the United States, the Accreditation Council for Graduate Medical Education (ACGME) requires that all residents demonstrate competency in systems-based practice (SBP). The other 5 competencies designated by the ACGME are medical knowledge, professionalism, patient care, interpersonal and communication skills, and practice-based learning and improvement. These broad, relatively complex and unobservable attributes are difficult to measure because they have not been formally defined using consensus procedures among health care professionals. This study undertook the task of operationally defining SBP competence based on expert consensus.


There is a growing focus in medicine and other health professions on evaluating physicians' knowledge of health care systems. This emphasis has evolved from a combination of factors, including (1) an increase in knowledge and technologies that require a more systems-based rather than individual-practice emphasis in patient care; (2) widely publicized increases in health care errors; (3) consumers' higher levels of information and expectations in accessing a variety of services within a system of care; and (4) efforts to contain overwhelming health care costs, such as those seen in managed care, which require scrutiny of all components of a health care system.1,9

The accreditation standards for graduate medical education programs in several nations now emphasize that physicians understand and practice within a larger system of patient care (eg, the United States' ACGME Outcome Project,10 Canada's CanMEDS,11 and the United Kingdom General Medical Council's Good Medical Practice).10,12 When physicians competently practice within larger systems of care, patients are better able to access a full range of services, and physicians can provide more coordinated care contained within the limited resources available.

How the ACGME Defines Competency in SBP

The ACGME broadly defines SBP as the demonstration of “an awareness of and responsiveness to the larger context and system of health care and the ability to effectively call on system resources to provide care that is of optimal value.”10 The language of the SBP definition further states that residents must obtain and demonstrate competency by meeting the 6 resident expectations noted in table 1.

Table 1
The Six Systems-Based Practice (SBP) Expectations of the Accreditation Council for Graduate Medical Education

Although more specific than the above definition, the embedded SBP expectations still represent complex behaviors involving undefined aspects of residents' knowledge, skills, and dispositions. A classic problem in assessment in general and at the crux of measuring complex domains of behavior is the need to define such ambiguous competency areas in highly specific, well-elaborated terms to clarify exactly what it is that one is trying to measure.13

In sum, observing complex behaviors, like SBP, requires identifying the underlying layers of abilities, indicators of personality traits, and other attributes with some systematic procedure.14,15 To properly assess the defined domain, a choice must then be made as to the best method of assessment, such as written tests, direct performance observations or, perhaps, self-reported or supervisor-rated scales. The assessment method should be the one most likely to yield valid and authentic information on the competency domain. For instance, although certain clinical practice behaviors are best tapped as directly observed performances, implied cognitive capacities are usually indirectly tapped through written examinations.15 There are those who advocate the use of more “authentic” assessment methods calling for examinees to demonstrate the requisite behaviors in real-life contexts, such as physicians showing how they would actually negotiate the health care system while tackling tasks related to patient care in a hospital. Such contextualized performance assessments are claimed to both define expectations better and raise standards in education.16

In its description of SBP, the ACGME is clear in its requirement that resident competence needs to be assessed. Beyond the expectations, however, the council does not stipulate how the assessments should be designed, the assessment methods that would be the best suited for different competency areas, or who should undertake the development work.

Within the resident competency evaluation literature, there are surprisingly few models that offer clear and direct routes through the assessment design process in a way that avoids measurement pitfalls later. Optimally, what is needed is a set of step-by-step “guidelines” for the design and validation of educational instruments for assessing ACGME competencies such as SBP. Trochim17 recommends the use of already established and empirically tested design models found within the measurement and evaluation literature.

Taxonomy development as a methodology has been applied to better define aspects of clinical practice, such as medical errors.18 One possible route is a process for developing and validating a taxonomy of observable behaviors for undefined constructs such as SBP, referred to as domain specification in the assessment design literature.19 Taxonomies are classification schemes that may or may not have a hierarchic structure but that systematically identify, categorize, and name items belonging to some broad category by using explicit criteria. Serving as a conceptual framework, such applications of taxonomies may help “illuminate and magnify” unknown and ambiguous constructs.20


Iterative Assessment Design Process

This study applies an iterative process model for assessment design19 as the overarching methodological approach for developing a taxonomy of behaviors underlying the SBP domain. The process has 4 phases, and each is depicted in figure 1. The approach starts by specifying the assessment context in terms of the population to be assessed (in this case, resident physicians), the purposes for which the resulting assessment tool(s) will be used (eg, accreditation, education program monitoring, or resident coaching), and the constructs or competency areas (in this case, SBP) that will be targeted for measurement (phase 1). Next, the domain is clarified in terms of unambiguous, action-oriented, and observable indicators to facilitate the instrument design process that follows (phase 2). Instrument design involves selection of appropriate assessment method(s) and writing of items to best tap into the domain, as specified (phase 3). In phase 4, the resultant domain and instrument are tested, validated, and revised iteratively as needed until the desired levels of validity, reliability, and utility are achieved. Both content validation and empiric validation methods can be used in phase 4, which typically includes such procedures as external expert reviews and pilot testing of a tool with psychometric evaluations of score validity and reliability.

Figure 1
Adaptation of the Process Model for Systems-Based Practice Development19

Phases in the process model need not be sequentially implemented, but they are typically cyclic. For example, as soon as the domain is specified (phase 2), a content validation may occur (phase 4), followed by revisions needed to the domain. This may be followed by instrument design work (phase 3) and another round of content validation and revision (phase 4). Development cycles may thus continue in an ongoing way.

The process model for assessment design is grounded in the psychometric model of test development, as recommended in the Standards for Educational and Psychological Testing.21 In our work, we have applied all 4 phases of the process model,19 with a primary focus on domain specification—the goal being to clarify and validate the SBP domain in the form of a hierarchic taxonomy, with a round of content validation by stakeholders.

Summary of Steps in the Design Process and SBP Taxonomy Development

In the first part of the SBP domain-specification procedure (table 2), we defined the boundary of the ACGME expectations for residents and parsed their roles as systems-based practitioners through a content analysis of ACGME documents (see steps 1-2 outlined in table 2). In the second part, we independently used a nominal group process, a procedure similar to focus group interviews, to gather opinion statements by selected categories of health care professionals—namely, nurses, resident physicians, and pharmacists. The groups provided their perceptions of different dimensions of systems-based practice. The opinions collected were coded qualitatively using a grounded theory approach and clustered in a hierarchic taxonomy that is organized from general to more specific indicators of SBP (steps 3-4).22,23 The results from each part of the procedure were then corroborated against each other (step 5). Residents and attending physicians served as external consultants to review and refine the domain thus derived. The final part of the procedure used the corroborated taxonomy (or the validated SBP domain) to prepare examples of items for a performance rating scale that may be used by residency directors to rate residents on SBP competency (step 6). The final SBP taxonomy and domain are intended to serve as the basis for designing performance assessment tools for resident physician populations similar to those suggested in the Appendix.

Table 2
Steps of the Process Model Employed to Derive a Performance Rating Tool in the Systems-Based Practice (SBP) Competency Area

Full Description of Steps in the Design Process and SBP Taxonomy Development

Table 3 represents an interim product, showing the initial stages of what eventually unfolded as the SBP taxonomy. Figure 2 depicts the more detailed taxonomy development process (the fully specified SBP domain) and the taxonomy confirmation process.

Figure 2
Process of Development and Confirmation of the Systems-Based Practice Taxonomy
Table 3
The Systems-Based Practice (SBP) Taxonomy: Initial Stages

Identifying the “Domain” or Scope of the SBP Expectations

We first examined the existing language of the SBP competency.10 This is represented in the first column of table 3, where the underscored text identifies what was collectively determined by 3 of the investigators to be the essential content (ie, what the resident must do) for each SBP expectation. Through this procedure, we also identified 6 embedded actions stating the expected resident behaviors. The first word in the second column of table 3 gives each expected resident behavior in the form of a verb that indicates “doing” something (“work,” “coordinate,” “perform,” “advocate,” “identify”).

Behavior Identification from Source Documents

Next, we sought to dig more deeply into the language of the SBP expectations by identifying the underlying concepts and learning outcomes associated with each one (figure 2, left). To do so, we reviewed appropriate source documents (eg, residency program descriptions, ACGME teaching materials, and peer-reviewed journal publications) to identify all of the targeted topics and skills necessary to teach and assess constructs like SBP. Our focus was exclusively on finding language in a broad array of source documents to place each SBP expectation into context. All documentation remained centered within the medical education literature. Based on our comprehensive searches in 2006 and again in 2007, there were comparatively few articles referring to SBP compared with terms like professionalism. Moreover, it was beyond the scope of the present study to conduct comprehensive or systematic reviews of the existing published literature or other written documentation, because our purpose in searching such literature was narrower in focus.

As an example, for the ACGME SBP action of “perform cost-benefit analysis,” we recorded behaviors identified in the published literature in efforts by Tomolo et al24 as well as Englander et al.25 As another example, to specify the SBP action of “coordinate” patient care, we reviewed the published literature concerned with effects of practices,26 as well as the ACGME source documentation for another of its competencies: patient care. This review allowed us to record a domain-specific list of targeted topics, such as patient history, physical examination, diagnosis, and treatment. These activities comprise the “patient care” needs that must be met to serve the ACGME patient care competency; in addition, these activities need to be coordinated to meet the needs of the health care system. There is a fine distinction between the two: the SBP expectation of “coordinating patient care” versus the language of the ACGME expectation for the competency of patient care. We were interested in activities and language directly related to SBP.

Once each SBP expectation had been sufficiently clarified in terms of descriptive language, our search for domain-specific content ended. We then constructed the initial SBP taxonomy tying together the SBP expectations, embedded actions, and behaviors identified from the domain specification, given the source document review.

Collection of Health Care Professional Opinion Statements

Next, we sought to obtain the perspectives of a range of stakeholders within the health care system (figure 2, right side). To do so, we revisited the original data set reported in Graham et al3 at 2 large, academic, urban medical centers in New York City. In that study, there were a total of 88 health care profession participants from 7 care categories, including resident, attending physician, nurse, social work, pharmacy, and physical-occupational therapy. They fell within 16 specialties (eg, anesthesia, internal medicine, intensive care unit, psychiatry, neurological surgery). Each care category was deemed a stakeholder in the health care system because of multiple daily interactions with residents to meet patient care responsibilities directly (eg, nursing) or indirectly (eg, pharmacy).

In that study, we used a nominal group process methodology to gather opinion statements from the hospital stakeholders about their beliefs of residents' behavioral knowledge, skills, and dispositions.3 These statements were generated first individually and then collectively as a stakeholder group to represent what they believed was necessary for residents to demonstrate competency in SBP.

Using qualitative analysis, the results of the Graham et al3 study showed that there were distinguishable differences in perspective among stakeholders, enough to justify extending a working definition of SBP beyond physicians to include the other members of the health care team. In the present study, these opinion statements were revisited, with the angle of analysis shifted to all responses from all 88 stakeholders (n = 88) at the aggregate level. Once the data were organized in this way, the opinion statement collection procedure was complete.

Thematic Coding of the Stakeholder Opinion Statements

We used content analysis and a variant of grounded theory as the main process for the thematic coding of all of the opinion statements collected. Content analysis is a systematic analysis of the occurrence of words, phrases, and concepts in text material; it is done through the use of a coding scheme, which consists of categories and operational definitions.27 To derive a coding scheme to conduct the content analysis, we adhered to the established qualitative methodology of grounded theory22,23 to identify the salient themes as we analyzed the stakeholder opinion statement data. The thematic coding consisted of the following tasks. First, the aggregated stakeholder opinion statement list was reviewed to identify emergent behavioral themes. During this initial review, 2 members of the research team worked both independently and in consultation with each other to code a limited yet robust sample of the opinion statements. Disagreements were resolved by discussion. Several iterations of theme coding ensued. Second, once a common structure emerged, additional members of the research team went through a similar process until a general framework was in place. Third, 1 investigator used this coding scheme to revisit the complete data set and organize it according to themes. Fourth, the final theme list was generated and reviewed by 2 other investigators. Variations of grounded theory and content analysis have been used widely in other disciplines (eg, social science) and have also been applied in medical education settings.28,30

To content-validate the stakeholder opinion statement theme list, one investigator met with a number of attending physicians and senior residents during a 6-month period to review progress and keep the content analysis process focused and contextually relevant. Some meetings were scheduled, whereas others were happenstance or opportunistic. Other stakeholder groups (nursing, pharmacy, social work, physical therapy) were not included in this process because of scheduling difficulty, perceived task complexity, and because physicians and residents were more readily available to us through our involvement in other projects with them. We based this methodological decision on the rationale that an entire spectrum of stakeholders (attending physicians, nurses, pharmacy, residents, social work, physical therapy) had confirmed a theme structure in a previous study,3 so the present group of physicians and residents was simply confirming the structure derived from that process. Limitations of this decision, and how the results may differ from Graham et al,3 are addressed in a subsequent section.

The following is an example of what our consultation with physicians and residents revealed through an iterative process during the 6-month period. The statement provided by a stakeholder group, “knows and uses information systems,” was analyzed in 2 parts: in the first part, “knows and uses” are verbs describing expectations for residents, and in the second, “information systems” forms the content variable of the statement. We established that 2 behavioral themes are embedded within this opinion statement: “knows information systems” and “uses information systems.” Through this iterative process, we also learned that many statements generated behavioral themes that were similar in concept and meaning. For example, all stakeholder groups reported such terms as “conveys availability,” “is accessible,” “make themselves available,” and “lets people know who they are and how to reach them.” To eliminate redundancies within our content analysis procedures, we compiled lists of terms and then, as previously mentioned, consulted with physicians and residents on multiple occasions to combine and classify these related themes under a single behavioral theme. In this example, the final term was availability.
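The redundancy-elimination step described above behaves, in effect, like mapping synonymous phrasings onto a single agreed canonical theme. A minimal sketch follows; the phrase-to-theme mapping mirrors the “availability” example in the text, while the dictionary and function names are hypothetical and not part of the study.

```python
# Illustrative sketch of the redundancy-elimination step: synonymous
# stakeholder phrasings collapse onto one canonical behavioral theme.
# The mapping below mirrors the "availability" example from the text;
# names are hypothetical, not drawn from the study's actual codebook.
CANONICAL_THEMES = {
    "conveys availability": "availability",
    "is accessible": "availability",
    "make themselves available": "availability",
    "lets people know who they are and how to reach them": "availability",
}

def canonicalize(theme: str) -> str:
    """Return the agreed canonical theme, or the theme itself if unmapped."""
    return CANONICAL_THEMES.get(theme, theme)

raw_themes = [
    "conveys availability",
    "is accessible",
    "knows information systems",
]
# De-duplicate after canonicalization: two "availability" variants merge.
merged = sorted({canonicalize(t) for t in raw_themes})
```

Applied to the example statements, the 3 raw themes reduce to 2 canonical ones, matching the consolidation the consultants performed by hand.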

The “SBP Match List”: Matching Domain Specification and Themes

For the last step in taxonomy confirmation, the investigators began a process in which the SBP behaviors identified from our analysis of published and source documentation used to describe SBP behavior were “matched” to the themes that emerged from the health care team stakeholders' opinion statements. Regarding what constitutes a match, the investigators adhered to the following rationale: to justify inclusion into the SBP taxonomy, there had to be a direct and obvious connection between the identified behaviors from the source documentation and the behavioral themes from the stakeholder groups. Thus, a “match” is defined as occurring when a behavioral theme from the stakeholder groups was able to confirm a behavior derived from existing SBP source documentation. The matching task was based on a set of materials reviewed by multiple stakeholder groups, so it was done only by the investigators. The output of the matching task was, however, reviewed by attending physician and resident consultants.

For example, patient information was one of the behaviors identified through the domain specification of the action “coordinates patient care.” Separately, the overarching behavioral theme identified for the stakeholder opinion statement “knows the patient's history, exam, and current status” was also coded as patient information. Therefore, the behavior result of the domain specification (patient information) and the theme result from the stakeholder opinion statements (patient information) were successfully matched and, hence, confirmed. In summary, through this matching process we sought agreement between the behaviors identified through source documentation in the initial taxonomy development and the stakeholder behavioral themes obtained via the opinions of the personnel working within the system from the taxonomy confirmation.

It is important to note that the final SBP taxonomy consists of only the stakeholder behavioral themes and source document behaviors that met this threshold. Numerous behavioral themes identified by the stakeholders were interesting and worthwhile to consider, but not all could be tied to the analysis of the ACGME SBP expectations or the associated literature review; perhaps in other instances or contexts these behavioral themes should be considered. We decided that if a stakeholder behavioral theme could not be matched in some way to the behaviors identified via the SBP source documentation (which are direct descendants of the SBP actions and, ultimately, the SBP expectations of the ACGME), then that behavioral theme should not be considered part of the SBP domain. This rule kept the taxonomy anchored to our goal of operationally defining the ACGME's SBP expectations.

It is also the case that some SBP expectations identified by the ACGME or uncovered as an expectation-related behavior in the source documentation for SBP may not have been represented in the stakeholder behavioral themes. Perhaps this is because the stakeholders were unaware of its importance or felt that some aspect of an expectation was irrelevant from their perspective. For instance, cost-benefit analysis possibly falls into this situation because large, research-driven academic institutions may not have as much focus on day-to-day costs as small, cost-driven hospitals. Unlike the previous scenario, where we excluded stakeholder behavioral themes that did not match our criteria for inclusion, an ACGME SBP-identified expectation that is not mentioned by the stakeholders is, nonetheless, still necessary to include in the final SBP taxonomy. As a rule of thumb, if the accrediting agency language identifies an expectation, then it has effectively deemed this expectation to be important, and therefore it needs to be defined and measured, regardless of whether or not it was considered from the stakeholders' perspective.
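The two inclusion rules just described are asymmetric: a matched pair is confirmed, an unmatched stakeholder theme is excluded, but an unmatched ACGME-derived behavior is retained anyway. As a sketch (with toy sets whose members are drawn from examples in the text; the variable names are hypothetical), the rules amount to simple set operations:

```python
# Illustrative sketch of the asymmetric matching and inclusion rules.
# The set members are toy examples taken from the text; the rule itself
# follows the paper: matches are confirmed, unmatched stakeholder themes
# are dropped, and unmatched ACGME-derived behaviors are kept regardless.
source_behaviors = {"patient information", "cost-benefit analysis"}   # from ACGME/source documents
stakeholder_themes = {"patient information", "availability"}          # from opinion statements

confirmed = source_behaviors & stakeholder_themes            # matched, hence confirmed
dropped_themes = stakeholder_themes - source_behaviors       # stakeholder-only: excluded
retained_unmatched = source_behaviors - stakeholder_themes   # ACGME-derived: kept anyway

final_taxonomy = confirmed | retained_unmatched
```

Note that the final taxonomy always equals the full set of source-document behaviors; the stakeholder themes serve to confirm, never to extend, the ACGME-derived domain.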

Roles and Operational Definitions of the Matched List

After the matched list was constructed, we invited a group of volunteer physicians and senior residents to help us convert it into contextually grounded definitions.19 A contextual definition is a behavior indicating competence in SBP that can be observed in a measurable way, given a condition. For example, a physician may be observed in the role of a collaborator while giving patients advice on health insurance options. For the conversion, physicians and residents were used as validators largely because of ease of scheduling and accessibility. This time, we asked 4 consultants (2 attending physicians, 1 fellow, and 1 chief resident) to individually meet with 2 of the investigators. Their task was to help us make the contents of the matched list observable and measurable by providing examples from the routine activities of residents that are relevant to practice within the “system.” In this way, both content validation and domain clarification were iteratively built into the taxonomy development process. Examples of their efforts can be seen in the fourth column of table 4.

Table 4
The Complete Systems-Based Practice (SBP) Taxonomy

The last and most important task (to achieve the goals of the present study) was to identify an overarching set of resident roles, one for each SBP expectation. As mentioned earlier, an SBP expectation is different from a role: the role makes the general expectation operational by situating the physician's actions in a context.

The final taxonomy retained only those behaviors from the matched list that could be operationalized through the lens of SBP. The investigators, through several consultations with attending physicians and residents, analyzed all contextual definitions related to each of the 6 SBP expectations, compared and contrasted each against existing source documentation, such as CanMEDS,11 and collectively decided on a role to represent it. Finally, once the contextual definitions and roles were identified and agreed on, we were able to write a preliminary set of SBP items for a potential performance assessment tool for residents. Three investigators drafted multiple items, and attending physicians and residents edited, improved, or deleted each. An example set of SBP evaluation items is located in the Appendix.


Our methods yielded 35 matched behaviors and 6 resident roles conforming to the ACGME list of 6 expectations for SBP competency. For the purposes of the present study, we define an action term as something “residents can do to demonstrate their competence in SBP yet is presently described in an immeasurable way within the ACGME SBP language.” Specifically, the actions extracted from the 6 ACGME SBP resident expectations are by themselves neither directly observable nor measurable because they are devoid of context.

For all 6 ACGME SBP expectations, the qualitative, stepwise process described previously in this study resulted in the taxonomy of SBP. Based on the language in the SBP taxonomy, observable contextual definitions were generated, and actual evaluation tool items were structured and written. The information starts with that presented in table 3, and the complete SBP taxonomy is represented in table 4. In addition, 6 working examples of how an ACGME SBP expectation is translated to a role, an expectation, an observable action and, finally, a few sample items are presented in the last column of table 5 and will be explained through the following resident role descriptions (the Appendix includes a list of sample items for each role).

Table 5
A Working Example of the Systems-Based Practice (SBP) Taxonomy

SBP Role 1: System Consultant

To operationalize this role as completely as possible, a statement based on the SBP taxonomy (table 5), such as the following, could be used: “To meet the ACGME-SBP resident expectation no. 1 to work within different delivery systems, the resident in the role of SBP system consultant must understand the system of care delivery by being able to educate patients through, for instance, advising and guiding them.” Based on this, an example of an item on a performance assessment could be: “I have seen this resident discussing the limitations of different insurance plans with patients and their families.” In this instance, the operationalization is not just about patient care; instead, it is about the patient and resident's relationship with the health care system, insurance plans, and knowledge of what both can provide together.
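The translation chain illustrated above (expectation number, expectation, role, sample item) can be pictured as one record of the taxonomy. A minimal sketch follows, populated with the System Consultant example from the text; the record type and its field names are hypothetical, not part of the study.

```python
# Illustrative sketch: one row of the SBP taxonomy represented as a record,
# populated from the System Consultant example in the text. The class and
# field names are hypothetical conveniences, not the study's own schema.
from dataclasses import dataclass

@dataclass
class TaxonomyEntry:
    expectation_no: int       # ACGME-SBP expectation number (1-6)
    expectation: str          # ACGME expectation language
    role: str                 # resident role that operationalizes it
    sample_item: str          # observable performance-assessment item

entry = TaxonomyEntry(
    expectation_no=1,
    expectation="work within different delivery systems",
    role="system consultant",
    sample_item=("I have seen this resident discussing the limitations of "
                 "different insurance plans with patients and their families."),
)
```

The remaining 5 roles (care coordinator, resource manager, patient advocate, team collaborator, system evaluator) would each fill one such record in the same way.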

SBP Role 2: Care Coordinator

To operationalize this role, a statement such as the following could be constructed based on the SBP taxonomy (table 5): “To meet the ACGME-SBP resident expectation no. 2 to coordinate patient care, the resident in the role of SBP care coordinator must understand the effects of his/her practices on the system by being able to demonstrate professionalism in systems-related activities shown, for instance, through demonstrations of reliability, availability, and taking responsibility.” Based on this operationalization, an example evaluation tool item could be: “I have seen this resident contacting the previous health provider to understand patients' problems better.” This action is an important demonstration of reliability, availability, and taking responsibility, because residents who routinely communicate with previous health providers can more effectively plan patient care by learning what has occurred prior to the present illness, including symptoms, work-up, and interventions. This action also allows for clarification and confirmation of the medical history provided by the patient. Additionally, in the hospital setting, contact with outpatient providers promotes continuity of care and facilitates discharge planning.

SBP Role 3: Resource Manager

A statement such as the following could be constructed based on the SBP taxonomy (table 5): “To meet the ACGME-SBP resident expectation no. 3 to perform cost-benefit analysis, the resident in the role of SBP resource manager must practice cost-effectiveness by using resources in ways that are, for instance, under careful monitoring and cost efficient.” Based on this operationalization, an example of an item could be: “I have seen this resident select computed tomography (CT or CAT scan) by considering patient needs and costs to the system.” This activity is important to SBP because managing costs under careful monitoring and for cost efficiency requires residents to be mindful of how tests they order have broader implications than just the clinical information they provide. Tests ordered unnecessarily, without sound clinical reasoning, are costly to the patient and the system in terms of time, effort, and resources.

SBP Role 4: Patient Advocate

A statement such as the following could be constructed based on the SBP taxonomy (table 5): “To meet the ACGME-SBP resident expectation no. 4 to advocate for quality care, the resident in the role of SBP patient advocate must promote patient advocacy by working within the care system to, for instance, realize system restraints when providing patient care.” Based on this, an example of an item could be: “I have seen this resident make adjustments to patient care in a flexible way to work around delays in getting patients' lab reports.” Working around such roadblocks promotes patient advocacy by functioning within the constraints of the system to minimize delays in starting potentially life-saving treatments.

SBP Role 5: Team Collaborator

A statement such as the following could be constructed based on the SBP taxonomy (table 5): “To meet the ACGME-SBP resident expectation no. 5 to work in teams, the resident in the role of SBP team collaborator must use a team approach by communicating with health care personnel in ways that, for instance, demonstrate networking and relationship management.” Based on this, an example of an evaluation item could be: “I have seen this resident interacting with therapists (physical, occupational, respiratory) to communicate treatment, follow-up plans, and other concerns.” When residents use their system-wide networks and collaborative relationships to communicate treatment, follow-up plans, and other concerns, this benefits the system by ensuring coordinated and seamless transfer of care between clinicians.

SBP Role 6: System Evaluator

A statement such as the following could be constructed based on the SBP taxonomy (table 5): “To meet the ACGME-SBP resident expectation no. 6 to identify system errors, the resident in the role of SBP system evaluator must participate in implementing potential systems solutions by identifying system errors and, for instance, suggesting improvements, changes, and modifications.” Based on this, an example of an item could be: “I have seen this resident discussing treatment protocols with [fellow residents, social workers, nurses, medical students].” Development of and adherence to systematic protocols can serve as potential systems solutions to minimize errors in patient care. Because residents are often involved in carrying out protocols, they are likely to identify where errors occur, either in design or implementation, and offer suggestions for improvement that will benefit the system.

Discussion

Our main purpose was to address the long-term goal of the ACGME to use well-defined SBP expectations to evaluate SBP behavior in resident physicians.10 The intention was to develop an SBP taxonomy by using a systematic design process, which we believed would provide the basis for more contextually relevant definitions of SBP indicators. To do so, it was necessary to take the existing SBP language in ACGME documents and convert it to specific roles and measurable outcomes that are meaningful in the context of a typical resident's “health care system.” We showed that this could be achieved by converting the SBP expectations into performance evaluation items, grouped within 6 roles, that are more measurable and were corroborated by health professionals. For illustrative purposes, a list of sample items for each role is given in the Appendix.

The SBP definition reported here may well overlap with existing competency frameworks. For example, there are terminology overlaps with Canada's CanMEDS,11 although its expectations are more broadly defined than those of the ACGME. Although the SBP domain did not overlap with other ACGME competencies, such as medical knowledge or practice-based learning and improvement, there was some overlap with the patient care and professionalism areas.

It is important to emphasize that the 6 SBP roles offer a way to represent residents' SBP separately from the other ACGME competencies. For instance, the SBP role of team collaborator is not necessarily about how professionally residents interact with team members or how well they communicate. For the SBP competency, we identify the role of team collaborator to be concerned with how residents use their competency in SBP to form collaborative networks with the purpose of improving the functioning of the system.

How well residents function in the team collaborator role may indeed be dependent on their professional behavior or their interpersonal communication skills, but these separate, distinct ACGME competencies are relevant only to the extent that they reflect (or are in the service of) the system and the resident's functioning within the defined team collaborator role. The SBP indicators are skills or behaviors that serve the system in some important way, regardless of language or superficial semantic overlaps.

Our assessment design approach resulted in one possible conceptual framework as the basis for SBP and is only a beginning. Future work should continue in terms of evaluation and possible psychometric validation of the taxonomy and the performance assessment tools that emerge. The framework may help residency programs better understand what SBP actually is in everyday terms. More pragmatically, residency programs may see the SBP taxonomy development process as a model with which to inform the development of ACGME competency-related curricula.31 It can also be used to prepare for accreditation site visits, or to further design a residency program's own internal assessment and evaluation processes.

Future Directions

The methodology used to define SBP was comprehensive and based on an established procedural model, but it is by no means the final word on this complex competency. Residency programs should continue efforts to: (1) further discuss and validate the SBP definition as represented through the SBP taxonomy; (2) refine the process of defining SBP; and (3) improve the SBP taxonomy as a template for program development and resident evaluation. Researchers can make additional efforts to more operationally define (and then continually refine) the ACGME competency language.

With the present SBP taxonomy, various residency programs can “harvest” from it content relevant to their own needs. As written, the SBP taxonomy and its representative items can be used as a springboard for thinking about curriculum contents. New items that are more program specific (eg, surgery, primary care, or psychiatry) could be written from existing ones.

The present study places importance on generating context-specific, measurable examples of actions and behaviors from a process model framework.19 Further refinement of the identified categories and additional generation of context-specific examples need to be an ongoing process. This is necessary to ensure that the evaluation tools produced are maximally valid. It is also imperative that ongoing investigations tap into constantly evolving expectations and practices. The findings are bounded by geography (eg, urban health care centers) and by time (eg, new or revised ACGME competencies), which requires that the present SBP taxonomy keep pace with graduate medical education across many environments and specialties and evolve to meet future accreditation expectations.

Limitations

There are a number of limitations regarding this research. First, we realize that there are numerous other possible health care settings (eg, rural, suburban) and that 2 large, urban, academic medical centers are not fully representative of SBPs in health care.

Second, we acknowledge that other stakeholder perspectives, such as patients or family members, were not represented. These groups also have unique viewpoints of residents and their roles vis-à-vis the health care system. Future studies should take this into consideration.

Third, although multiple stakeholder groups (eg, nursing) were involved in generating behavioral themes, we relied only on attending physicians and residents to iteratively refine language and turn matched behaviors into relevant context definitions in the later stages of the research process. This reliance was based on their personal experience and familiarity with the clinical environment, as well as on our lack of access to other stakeholder groups in the later stages of this project. There may be drawbacks to this that need to be addressed in future studies. However, other stakeholders' voices (ie, nursing, pharmacy, social work, physical therapy) were “at the table” during data gathering and coding.

Fourth, the SBP roles identified in the taxonomy here are semantically different from those of the earlier study.3 This should be expected, because the SBP taxonomy development and validation process was more in-depth and used more systematic analysis of the original data set. Each iteration resulted in language that was more concise and representative. We would expect that as we define SBP in more specialty-specific terms, such terminology differences will continue to appear.

Fifth, the way in which the examples of SBP evaluation items are presently structured (ie, to be completed at the end of residents' rotation periods) raises general concerns about possible memory biases of the raters. For example, in the context of tutorial-based evaluation, recent research has shown that end-of-rotation assessments do not predict performance well because of variability from one session to another.32 Instead, what is required is sampling of performance over time and avoidance of long instruments. Similar concerns are reported by Brennan and Norman,33 who suggest that at least 8 “encounters” need to be assessed in practice to give reliable information. Although the approach described here assumes a single end-of-rotation evaluation (see rating scale in the Appendix), there is no reason why it could not be applied in other contexts, such as rotations in which there are multiple assessments.

Conclusions

The SBP taxonomy developed through a process model for assessment design19 shows that the core elements of the ACGME's definition of SBP can be linked to actual resident behaviors and performance actions. Our comprehensive analysis of the constructs underlying SBP created an operational representation of this previously vague competency. Residency directors, regardless of specialty, could easily adapt and further develop this taxonomy for assessing residents' behaviors and performance. This conceptual framework provides new insights into developing resident assessment instrument(s) that could be used as a model for defining roles and observable behaviors in any specialty, such as surgery or pediatrics. Any of the 5 other ACGME competencies, or other complex concepts in graduate medical education, could be clarified using the methods demonstrated.

Appendix: Suggested Response Scale and Sample Systems-Based Practice Items Categorized by Roles

  • 0  = Not applicable
  • 1  = Never (during a rotation)
  • 2  = Rarely (once or twice during the rotation)
  • 3  = Sometimes (5-10 times during the rotation)
  • 4  = Many times (11-15 times)
  • 5  = Most of the time (more than 15 times)

Role 1: System Consultant

I have observed this resident:

  • — Discussing limitations of different insurance plans with patients and their families.
  • — Referring patients and their families to financial advisors when needed.
  • — Considering costs while selecting procedures like CAT scans.
  • — Discussing alternative and complementary treatments (like acupuncture, chiropractic, aromatherapy, etc) with patients and families.

I have observed this resident discussing benefits/risks/costs of inpatient versus outpatient, rehab versus home, or limitations/restrictions in these choices related to health insurance with:

  • — Patients and families.
  • — Residents.

Role 2: Care Coordinator

I have observed this resident:

  • — Conducting detailed and prioritized sign-outs.
  • — Contacting the patients' previous health care providers on admission.
  • — Verifying prior health information (past history) of inpatients from multiple sources (like patient, patient's family, etc) when necessary and available.
  • — Answering pages promptly.
  • — Following the approved protocols for conducting procedures (eg, phlebotomy, intravenous puncture, splinting, central venous line placement, etc).
  • — Employing preventive measures (like disposal of used needles) to avoid risks to other health professionals.
  • — Referring patients to appropriate services.
  • — Responding promptly to calls from other disciplines.

Role 3: Resource Manager

I have observed this resident:

  • — Managing documentation of medical records with minimal errors.
  • — Accessing translation services when needed.
  • — Using electronic ordering systems with minimal errors.

Role 4: Patient Advocate

I have observed this resident making adjustments (demonstrating flexibility) to work around roadblocks like:

  • — Unavailability of the internet.
  • — Delays in getting lab reports.
  • — Unavailability of relevant staff.

Role 5: Team Collaborator

I have observed this resident interacting with the following to communicate treatment, follow-up plans, and other concerns:

  • — Nurses.
  • — Therapists (physical, occupational, respiratory).
  • — Pharmacists.
  • — Social workers.

Role 6: System Evaluator

I have observed this resident providing constructive feedback to:

  • — Fellow residents.
  • — Nurses.
  • — Social workers.
  • — Medical students.


Mark J. Graham, PhD, is Director for Education Research at the Center for Education Research and Evaluation, Columbia University Medical Center. At the time this study was conducted Zoon Naqvi, MBBS, EdM, was at the Center for Education Research and Evaluation, Columbia University Medical Center; John Encandela, PhD, was at the Center for Education Research and Evaluation, Columbia University Medical Center, and at New York-Presbyterian Hospital; Kelli J. Harding, MD, is at the Department of Psychiatry, Columbia University; and Madhabi Chatterji, PhD, is at Teachers College, Columbia University.

The Stemmler Fund for Medical Education Research from the National Board of Medical Examiners supported this research. All findings and statements are, however, the sole opinions of the authors. A previous version of this paper was presented in March 2008 at the American Education Research Association (AERA), Division I, New York, NY. We are grateful to Hilary Schmidt for the energy and commitment she gave to conceptualizing this project. We wish to acknowledge all of the health team members for making time to participate in this study, and we thank Stan Hamstra and Maria Mylopoulos for their early encouragement at the AERA meeting in New York. We appreciate the thoughtful reviews of previous drafts of this manuscript by Dorene Balmer, Nicole Borges, Clarissa Cortland, Ingrid Philibert, Boyd Richards, Peter Wyer, and 2 anonymous reviewers.


1. Iglehart J. K. The struggle for reform–challenges and hopes for comprehensive health care legislation. N Engl J Med. 2009;360(17):1693–1695. [PubMed]
2. Ginsburg J. A., Doherty R. B. Public Policy Committee of the American College of Physicians. Achieving a high-performance health care system with universal access: what the United States can learn from other countries. Ann Intern Med. 2008;148(1):55–75. [PubMed]
3. Graham M. J., Naqvi Z., Encandela J. What indicates competency in systems based practice?: an analysis of perspective consistency among healthcare team members. Adv Health Sci Educ Theory Pract. 2009;14(2):187–203. [PubMed]
4. Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
5. Lancet. Educating doctors for world health. Lancet. 2001;358(9292):1471. [PubMed]
6. Owen J. W., Roberts O. Globalization, health, and foreign policy: emerging linkages and interests. Global Health. 2005;1:12. Available at: Accessed August 18, 2009. [PMC free article] [PubMed]
7. Palmer K. T., Harling C. C., Harrison J., Macdonald E. B., Snashall D. C. Good medical practice: guidance for occupational physicians. Occup Med (Lond) 2002;52(6):341–352. [PubMed]
8. Romanow R. J. Building on Values: The Future of Health Care in Canada. Saskatoon, Canada: Commission on the Future of Health Care in Canada; Available at: Accessed August 18, 2009.
9. Ludmerer K. M. Time to Heal: American Medical Education from the Turn of the Century to the Era of Managed Care. New York, NY: Oxford University Press; 1999.
10. ACGME. ACGME outcomes project. 2009. Available at: Accessed August 27.
11. The Royal College of Physicians and Surgeons of Canada. The CanMEDS Physician Competency Framework. 2008. Available at: Accessed July 31.
12. The General Medical Council. Good Medical Practice. Available at: Accessed August 18, 2009.
13. DeVellis R. F. Scale Development: Theory and Applications. Newbury Park, CA: Sage Publications; 1991. Applied Social Research Methods Series; vol 26.
14. Roe R. A. What makes a competent psychologist? Eur Psychol. 2002;7(3):192–202.
15. Bashook P. G. Best practices for assessing competence and performance of the behavioral health workforce. Adm Policy Ment Health. 2005;32(5–6):563–592. [PubMed]
16. Wiggins G. Research news and comments: response to Terwillinger. Educ Res. 1998;27(6):20–21.
17. Trochim W. The Research Methods Knowledge Base. 2nd ed. Cincinnati, OH: Atomic Dog Publishing; 2000.
18. Zhang J., Patel V. L., Johnson T. R., Shortliffe E. H. A cognitive taxonomy of medical errors. J Biomed Inform. 2004;37(3):193–204. [PubMed]
19. Chatterji M. Designing and Using Tools for Educational Assessment. Boston, MA: Allyn and Bacon; 2003.
20. Bordage G. Conceptual frameworks to illuminate and magnify. Med Educ. 2009;43(4):312–319. [PubMed]
21. American Educational Research Association. Standards for Educational and Psychological Testing. Washington, DC: American Psychological Association; 1999.
22. Strauss A. L., Corbin J. M. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 2nd ed. Thousand Oaks, CA: Sage Publications; 1998.
23. Glaser B. G. Basics of Grounded Theory Analysis: Emergence vs. Forcing. Mill Valley, CA: Sociology Press; 1992.
24. Tomolo A., Caron A., Perz M. L., Fultz T., Aron D. C. The Outcomes Card: development of a systems-based practice educational tool. J Gen Intern Med. 2005;20(8):769–771. [PMC free article] [PubMed]
25. Englander R., Agostinucci W., Zalneraiti E., Carraccio C. L. Teaching residents systems-based practice through a hospital cost-reduction program: a “win-win” situation. Teach Learn Med. 2006;18(2):150–152. [PubMed]
26. Schattner A., Bronstein A., Jellin N. Information and shared decision-making are top patients' priorities. BMC Health Serv Res. 2006;6:21. Available at: Accessed August 18, 2009. [PMC free article] [PubMed]
27. Baker L. M. Information needs at the end of life: a content analysis of one person's story. J Med Libr Assoc. 2004;92(1):78–82. [PMC free article] [PubMed]
28. Ginsburg S., Regehr G., Stern D., Lingard L. The anatomy of the professional lapse: bridging the gap between traditional frameworks and students' perceptions. Acad Med. 2002;77(6):516–522. [PubMed]
29. Ackerman A., Graham M., Schmidt H., Stern D. T., Miller S. Z. Critical events in the lives of interns. J Gen Intern Med. 2009;24(1):27–32. [PMC free article] [PubMed]
30. Cutler J. L., Harding K. J., Mozian S. A. Discrediting the notion “working with ‘crazies’ will make you ‘crazy’”: addressing stigma and enhancing empathy in medical student education. Adv Health Sci Educ Theory Pract. 2008. In press. [PubMed]
31. Ziegelstein R. C., Fiebach N. H. “The mirror” and “the village”: a new method of teaching practice-based learning and improvement and systems-based practice. Acad Med. 2004;79(1):83–88. [PubMed]
32. Eva K. W., Solomon P., Neville A. J. Using a sampling strategy to address psychometric challenges in tutorial-based assessments. Adv Health Sci Educ. 2007;12:19–33. [PubMed]
33. Brennan B. G., Norman G. W. Use of encounter cards for evaluation of residents in obstetrics. Acad Med. 1997;72(10):S43–S44. [PubMed]
