Health Serv Res. 2007 June; 42(3 Pt 1): 1257–1273.
PMCID: PMC1955254

Measuring Organizational Attributes of Primary Care Practices: Development of a New Instrument



Objective

To develop an instrument to measure organizational attributes relevant to family practices, using the perspectives of clinicians, nurses, and staff.

Data Sources/Study Setting

Clinicians, nurses, and office staff (n = 640) from 51 community family medicine practices.


Study Design

A survey designed to measure a practice's internal resources for change was created for use in family medicine practices by a multidisciplinary panel of experts in primary care research and health care organizational performance. This survey was administered in a cross-sectional study to a sample of diverse practices participating in an intervention trial. A factor analysis identified groups of questions relating to latent constructs underlying practices' internal capacity for change. ANOVA methods were used to confirm that the factors differentiated practices.

Data Collection

The survey was administered to all staff from 51 practices.

Principal Findings

The factor analysis resulted in four stable and internally consistent factors. Three of these factors, “communication,” “decision-making,” and “stress/chaos,” describe resources for change in primary care practices. One factor, labeled “history of change,” may be useful in assessing the success of interventions.


Conclusions

A 21-item questionnaire can reliably measure four important organizational attributes relevant to family practices. These attributes can be used both as outcome measures and as features for targeting system interventions.

Keywords: Primary care, organizational attributes, measurement

The impact of the work environment on clinical performance and outcomes is receiving renewed interest from health systems researchers, health care administrators, and policy makers (IOM Committee on Quality of Health Care in America 2001; Shortell 2002; Shortell and Selberg 2002). A growing body of literature derived from research in large health care settings, such as hospitals and large multispecialty group practices, suggests that better health outcomes are associated with particular organizational attributes (Shortell and LoGerfo 1981; Shortell 1985, 1990, 2002; Davies and Ware 1988; Shortell, O'Brien et al. 1994; Shortell, Zimmerman et al. 1994; Shortell et al. 1998; Mitchell and Shortell 1997; Davies and Nutley 2000; Donaldson et al. 2000; Ferlie and Shortell 2001). Similar studies within primary care have the potential to affect a broader spectrum of the American population. In a given year most Americans visit a primary care physician (Benson and Marano 1994), with more than a quarter of these visits to family medicine practices (Woodwell 1999). Primary care practices, including family medicine, general internal medicine, and pediatric practices, are unique among health providers in that they must serve as the front line for a large variety of health care needs, from prevention to identification of disease and illness to the treatment of ailments or referral to specialists. To understand the impact of organizational attributes of primary care practices on the delivery of patient care, one first needs to understand the attributes of these typically small medical providers and develop a method for measuring them.

A key element of a practice's ability to maintain and improve quality of care for its patients is its ability to adapt to the evolving understanding of medicine, to demands for enhanced clinical performance, and to changes in the larger health care management system. While general models of change have been proposed (Senge 1990, 1994; Rogers 1995), these models have typically not incorporated the features unique to primary care practices. One exception is the primary care change model recently described by Cohen et al. (2004) that includes four interdependent elements that determine a practice's capacity for sustainable change. This model emerged from three federally funded studies: two descriptive studies, the Direct Observation of Primary Care (DOPC) study (Crabtree et al. 1998; Miller et al. 1998) and the Prevention & Competing Demands in Primary Care (P&CD) study (Crabtree et al. 2001; Miller et al. 1998; Tallia et al. 2003), and one intervention study, the Study to Enhance Prevention by Understanding Practice (STEP-UP) (Goodwin et al. 2001; Cohen et al. 2004). The model identifies clinician and staff characteristics, particularly their interrelationships, as important in distinguishing between practices' abilities to improve their rates of delivery of prevention services. Of the four elements described by Cohen et al. (motivation of key stakeholders, resources for change, outside motivators, and opportunities for change), resources for change best describes the organizational characteristics that a practice must have to modify not only its technical aspects, but also its values and beliefs regarding itself as an organization. The resources for change element includes internal resources such as relationships among practice members, leadership and decision-making approaches, communication, and perception of competing demands. Information management and management infrastructure are also facets of the internal resources for change element.

Measurement tools exist for assessing important organizational attributes of larger health systems, such as hospitals (Shortell 1985; Shortell et al. 1991, 2000; Jennings and Westfall 1994; Nabitz et al. 2000; Weeks et al. 2000; Meyer and Collier 2001; Nordhaus-Bike 2001; Goldstein and Schweikhart 2002). These attributes include: (1) leadership that engages a diversity of perspectives and shares critical information in order to enhance problem-solving processes; (2) a culture that fosters openness, connectedness, and learning; (3) relationships that foster communication and collaboration; (4) management functions that describe the presence of diverse structural components and processes, such as fiscal, material, and clinical management, recognition and feedback, and strategic planning; and (5) information mastery that includes the access and use of information that supports learning and problem-solving activities (Shortell et al. 1998; Davies and Nutley 2000; Donaldson et al. 2000; Ferlie and Shortell 2001). The first three of these attributes, in particular, were also identified by Cohen et al. (2004) as internal resources needed to create and sustain change in the primary care practice. However, a systematic tool to measure these attributes in primary care practices has not been developed. Because these practices are much smaller and have much more limited resources than hospitals and other larger health systems, instruments for measuring attributes of larger health systems are not expected to describe and differentiate between primary care practices well. As such, an instrument must be created specifically for use in the primary care practice.

This study aims to develop an instrument to measure organizational attributes of primary care practices and to evaluate the measurement properties of this newly developed instrument. The instrument, intended to be a survey of clinicians, nurses, and staff within the practice, seeks to measure some facets of resources for change, including relationships among practice members, leadership and decision-making approaches, communication, and perception of competing demands. The other facets, specifically information management and management infrastructure, are thought to be better addressed at the practice level by one or two key members of the practice, for example, the office manager or medical director. This instrument was developed based on the experience of investigators in family medicine, internal medicine, and pediatric practices; however, its use will be evaluated within family medicine practices. It was hypothesized that this instrument's items would measure internal resources for change and be able to sort practices according to the strength of their available internal resources.


Methods

Instrument Development

The instrument was developed as a measurement tool of organizational performance for an NHLBI-funded clinical trial (R01 HL70800), “Using Learning Teams for Reflective Adaptation” or ULTRA. The ULTRA study evolved from a series of NCI- and AHRQ-funded studies that identified key attributes of primary care practice organizational performance and internal resources that increased the capacity for change (Crabtree et al. 1998, 2001; Miller et al. 1998, 2001; Stange et al. 1998; Cohen et al. 2004); however, a structured tool that operationalized these attributes was unavailable. To develop such a tool, the ULTRA team summarized existing data, reviewed existing instruments, and convened an expert panel.

With the model for practice change in mind, the research team reviewed published instruments that have been validated for measuring organizational performance in the hospital ICU (Shortell et al. 1991; Shortell, Zimmerman et al. 1994), hospitals (Jennings and Westfall 1994; Nabitz et al. 2000; Shortell et al. 2000; Meyer and Collier 2001; Goldstein and Schweikhart 2002), large corporations (Hertz et al. 1994; Meyer 1998; Zairi 1998; NIST 2002), and nursing homes (Scott-Cawiezell et al. 2004, 2005; Scott et al. 2005). While few of these had been applied in smaller practices, some of the items included in these instruments described the experiences of the research team in primary care practices. Additional items were pulled from an unpublished instrument developed to assess practice organization in a pilot project for the MacArthur Initiative on Depression and Primary Care and other projects. The team culled through the items from each of these instruments to generate a list of approximately 120 items focusing on concepts of communication, relationships, leadership, and decision making.

A panel of experts convened in early 2003 for two 2-day instrument development working sessions. Panel participants, with experience in a broad range of primary care settings, included investigators from the DOPC, P&CD, and STEP-UP studies, as well as experts in health care evaluation, health policy, complexity science, nursing home organization, delivery of chronic care in primary care practice, large health care organizations, and instrument design. The panel initially reviewed the core concepts and sorted items accordingly. Obvious duplicates were removed and the remaining items were checked to be sure the wording was appropriate for a primary care practice (versus a larger health system). Questions were adapted or eliminated with the following goals: (1) the complete survey could be finished by practice employees in less than 15 minutes; (2) items accurately characterized small family practices; (3) items retained the original wording whenever possible; (4) items were clear and brief; (5) items addressed a single issue; and (6) wording of the items was simple enough to be comprehended by most clinicians, nurses, and office staff in family practice. The final 28 items were combined to form the Survey of Organizational Attributes for Primary Care (SOAPC). All items on the SOAPC were formatted using a 5-point Likert-type scale, ranging from 1=strongly disagree to 5=strongly agree.


An evaluation of the initial 28-item questionnaire was incorporated as part of the NHLBI-funded ULTRA trial and was approved by the UMDNJ-RWJMS IRB. The questionnaires were distributed by the office manager at each practice and later returned to the office manager in sealed envelopes. Staff were reminded to complete and return the questionnaires.

Sample of Practices

Practices participating in the ULTRA trial were members of the New Jersey Family Medicine Research Network as well as other community-based practices throughout New Jersey and eastern Pennsylvania. Practices were purposefully recruited to include diversity in location (urban, suburban, and rural), populations served (primarily minority versus primarily nonminority), practice size (solo, small group, and large group), and race/ethnicity of physicians. Recruitment of practices included dissemination of study materials and/or personal contact by phone with the physician or office manager in order to briefly introduce the project, and, as a final step, a meeting at the practice site with members of the practice. Initially, 179 practices were mailed or faxed material about the study. Of these, we were unable to speak directly with practice leadership in 85 practices, leaving 94 that were actively invited to participate through a phone or in-person conversation with practice leadership. Of these 94, 61 (65 percent) consented to participate. Of the 61 that consented, seven practices withdrew from the study and three practices had not returned surveys at the time of this paper, leaving 51 for this analysis.

Of the 51 practices, six had 11 or more clinicians, six had seven to 10 clinicians, 19 had four to six clinicians, 19 had two to three clinicians, and seven had a single clinician. Weekly patient volume ranged from 12 practices that saw less than 100 patients per week, to 19 with 100–300 patients per week, to 11 practices with 300–500 patients per week, and nine practices with more than 500 patients per week. Six practices were located in an urban area, while 44 were suburban and one rural. Thirty-eight practices were owned by physicians, nine by a hospital health system, two by a university, one by a church, and one by a corporation. Nine practices were owned by minority clinicians. The overall average racial characteristics of the patients were as follows: 68 percent white, 17 percent black/African American, 1 percent Native American, 1 percent Pacific Islander, 3 percent Asian Indian, 3 percent Asian, and 7 percent some other race. On average 10 percent of patients were Hispanic, and 90 percent were non-Hispanic. Minority patient populations tended to be clustered in a few practices. For example, in one practice 80 percent of patients were Hispanic, yet in another practice only 1 percent of patients were Hispanic. On average, 17 percent of patients seen in the practices were under age 17, 31 percent were ages 18 to 44, 30 percent were ages 45 to 65, and 22 percent were age 65 and older.


Analysis

Initial item analysis of the SOAPC examined the mean, median, variance, skewness, and floor and ceiling effects of each of the SOAPC items. Additionally, outliers were identified and the extent of missing data was assessed. Items with good between-practice variation relative to within-practice variation were identified.

Using information provided by multiple correlations, factor analysis, and Cronbach's α coefficients, items were selected for inclusion in a simplified questionnaire and divided to describe distinct factors. A scree plot was used to identify the appropriate number of factors. Factor analysis with an oblique rotation, allowing for correlation between factors, was used to sort the items. Factor loadings of at least 0.4 were chosen to select items for each factor in order to maximize differentiation between factors. The decision to include or eliminate individual items in a shortened version of the questionnaire weighed a number of considerations: (1) items were considered for elimination if they were largely unrelated to information provided by other items in the questionnaire, as indicated by small multiple correlations; (2) items that had better between-practice variability and within-practice consistency were favored for inclusion; (3) items were eliminated if they had high factor loadings for more than one factor and were determined to have imprecise interpretations; and (4) sets of items that, when grouped into factors, subjectively made sense according to principles of organizational function were preferred.
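The scree and loading-threshold steps described above can be sketched in Python. This is an illustrative sketch only: the data are synthetic, the item structure is invented, and a full oblique rotation would require a dedicated factor-analysis package; here NumPy alone is used to show the eigenvalue (scree) step and the 0.4 loading cutoff.

```python
import numpy as np

rng = np.random.default_rng(0)
n_resp, n_items = 1000, 6

# Two synthetic latent constructs drive six Likert-like items:
# items 0-2 load on construct A, items 3-5 on construct B (illustrative only).
latent = rng.normal(size=(n_resp, 2))
true_pattern = np.array([[1.0, 0.0]] * 3 + [[0.0, 0.7]] * 3)
items = latent @ true_pattern.T + 0.5 * rng.normal(size=(n_resp, n_items))

# Scree step: eigenvalues of the item correlation matrix, largest first.
corr = np.corrcoef(items, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Principal-component loadings; items with |loading| >= 0.4 on a factor
# would be candidates for assignment to that factor.
pc_loadings = eigvecs * np.sqrt(np.clip(eigvals, 0, None))
retained = np.abs(pc_loadings[:, :2]) >= 0.4
```

With two latent constructs, the first two eigenvalues stand well above 1 while the rest fall below it, which is the pattern a scree plot makes visible.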

Note that if practices were homogeneous in terms of the proportion of office staff, nurses, and other clinical staff answering the SOAPC, then methods that maximize the distance between groups of observations, such as canonical discriminant analysis, would have been appropriate. In this case, due to the heterogeneity of the distributions of responders in each practice, it was better to treat each staff respondent as an individual observation in a factor analysis. Formal analyses that compare practices may then account for the roles of the SOAPC responders.

A practice score for each factor was created by averaging the scores for each individual staff member in that practice. The score for an individual was the average of the items belonging to that factor. Thus, the practice scores range from 1 to 5. Means and variances of the practice scores for each factor were calculated to determine if the factors distinguished between practices. Further, a correlation analysis studied the relationship between factor scores for each practice.
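The two-stage averaging above (items → individual score → practice score) can be made concrete with a small sketch. The item-to-factor assignments and the response data here are hypothetical, not the actual SOAPC key:

```python
import numpy as np

# Hypothetical 1-5 Likert responses: rows = individual staff, columns = items.
# Which items belong to which factor is illustrative, not the SOAPC key.
factor_items = {"communication": [0, 1, 2], "stress_chaos": [3, 4]}
responses = {
    "practice_A": np.array([[4, 5, 4, 2, 2],
                            [5, 4, 4, 1, 2]]),
    "practice_B": np.array([[2, 3, 2, 4, 5],
                            [3, 2, 2, 5, 4],
                            [2, 2, 3, 4, 4]]),
}

def practice_score(resp, item_idx):
    # Individual score = mean of that person's items for the factor;
    # practice score = mean of individual scores, so it stays in [1, 5].
    indiv = resp[:, item_idx].mean(axis=1)
    return indiv.mean()

scores = {p: {f: practice_score(r, idx) for f, idx in factor_items.items()}
          for p, r in responses.items()}
```

Because every item is on a 1–5 scale, both the individual and the practice scores are bounded by 1 and 5, as stated in the text.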


Results

Data in these analyses were collected from 640 staff, including clinicians, nurses, and office staff, from 51 family practices, representing a response rate of 58 percent. Of 269 physicians from the practices surveyed, 156 (58.0 percent) returned the questionnaire; of 38 physician assistants and nurse practitioners, 27 (71.1 percent) responded; of 329 other clinical staff, 178 (54.1 percent) responded; and of 439 office staff, 262 (59.7 percent) responded. We were unable to identify the roles of 25 staff members, of whom 17 (68.0 percent) responded. All except one of the returned questionnaires had at least 25 of the 28 questions answered. The one questionnaire in which all of the questions were left blank was excluded from this analysis. Further, no one question was left blank by more than four individual respondents.

Tables 1 and 2 include the lists of individual items, sorted into their final categories. Listed along with each item (Table 3) are the means, standard deviations, multiple correlations, and initial factor loadings along the four dimensions. Multiple correlations ranged from 0.11 up to 0.79. R2 values that give a rough description of the variation attributable to practice membership varied between 0.14 and 0.34, which is reasonably high for ordinal data.

Table 1
List of Items by Factor
Table 2
Items Eliminated from Final Survey
Table 3
List of Means (Standard Deviations), Multiple Correlations, R2 Values Describing Proportion of Variation Attributable to Practice Membership, and the Principal Components Factor Analysis Loadings for Each Item

The scree plot indicated that a four-factor solution was appropriate. Eigenvalues from the principal component analysis were 9.4, 2.0, 1.3, and 0.9 and dropped off significantly thereafter.

Factor loadings generally led to straightforward assignment of items to factors. The only exceptions were items 24 and 25, which cross-loaded on communication and decision making. The factor loadings for these items were also smaller than other loadings within these factors; thus, these items were eliminated. Items 22, 23, and 26 were removed from consideration as part of the remaining factors because they did not load significantly on any factor. Although item 27 had a significant factor loading, it had a somewhat lower multiple correlation, reflecting the conceptual reality that it addressed leadership in a much different way than many of the other questions on leadership; the Cronbach's α for decision making improved significantly upon its removal. Finally, although item 28 loaded significantly on chaos, it was eliminated because, despite its higher multiple correlation (0.64), its removal did not significantly change the internal consistency of the chaos factor.

The final factors are labeled “communication,” “decision making,” “stress/chaos,” and “history of change.” The first three factors in particular describe three components of the internal resources for change element described by Cohen et al. Cronbach's α's of 0.81, 0.88, and 0.85, respectively, for these three factors indicated strong internal consistency. The fourth factor, based on three items, was retained because it provided a meaningful measure of perceived change, which would be of particular use in distinguishing practices at the end of an intervention trial. The Cronbach's α, 0.73, was also reasonable for history of change.
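The internal-consistency statistic reported above, Cronbach's α, is straightforward to compute from the item responses for a factor. A minimal sketch with a made-up respondent-by-item matrix (the SOAPC data themselves are not reproduced here):

```python
import numpy as np

def cronbach_alpha(item_matrix):
    """Cronbach's alpha: rows = respondents, columns = items in one factor."""
    x = np.asarray(item_matrix, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)         # variance of each item
    total_var = x.sum(axis=1).var(ddof=1)     # variance of the summed scale
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Synthetic Likert-type data: three items tracking a common construct.
data = np.array([[4, 4, 5],
                 [2, 3, 2],
                 [5, 4, 4],
                 [1, 2, 2],
                 [3, 3, 4]])
alpha = cronbach_alpha(data)
```

When items move together (as in this synthetic example) α approaches 1; values in the 0.7–0.9 range, as reported for the four SOAPC factors, are conventionally read as acceptable to strong internal consistency.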

Correlations between the factors, when factors are measured on each individual employee and when factors are measured at the practice level, are given in the first and second rows, respectively, of each cell in Table 4. As anticipated, according to the perceptions of family physician practice employees, better communication and teamwork are associated with more participatory decision making; more stressful and chaotic work environments are associated with less communication and with less participatory decision making. History of change is associated with lower communication and participatory decision making and with more stressful and chaotic work environments.

Table 4
Percentiles and IQRs of and Correlations between Factors

Table 4 also presents percentiles and interquartile ranges of the practice scores. The interquartile range, the distance between the 25th and 75th percentiles, is much greater for communication and decision making than for the other two factors, indicating that the practices differ most on these factors. One-way ANOVAs, treating practice as the predictor, indicated that all factors differed significantly across practices (all with p-values <.0001).
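The between-practice comparison above amounts to a one-way ANOVA with practice as the grouping variable. A minimal sketch, using hypothetical per-respondent scores from three practices rather than the study data:

```python
from scipy.stats import f_oneway

# Hypothetical per-respondent "communication" scores from three practices.
practice_a = [4.2, 4.5, 4.1, 4.4]
practice_b = [2.1, 2.4, 2.0, 2.3]
practice_c = [3.3, 3.1, 3.4, 3.2]

# Practice is the predictor: a small p-value says the factor differentiates
# practices, i.e. between-practice variation exceeds within-practice variation.
f_stat, p_value = f_oneway(practice_a, practice_b, practice_c)
```

Here the practices' mean scores are far apart relative to the within-practice spread, so the F statistic is large and the p-value very small, mirroring the pattern reported for all four SOAPC factors.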


Discussion

Health care in the United States is rapidly changing and primary care practices are faced with unprecedented challenges in what has been described as “…the best of times, and the worst of times” (Grumbach 1999). Surprise and uncertainty are the norm in this rapidly changing landscape (McDaniel et al. 2003), so there is a critical need for primary care practices to constantly adapt their care and management practices (Graham et al. 2002). Many practices are faced with constant turnover of both clinicians and staff (Ruhe et al. 2004; Goodwin et al. 2001; Tallia et al. 2003), at the same time that there are pleas for practices to establish collaborative teams to coordinate the care for patients with complex and often multiple chronic conditions (Grumbach 1999; IOM Committee on Quality of Health Care in America 2001; Lemieux-Charles et al. 2002). Thus, there is a critical need to monitor the organizational performance of primary care practices and search for interventions and models of care that promote practices' capacity to thrive as demands on the practice continually change.

This research developed an instrument that characterizes and differentiates primary care practices on four distinct scales: communication, decision making, stress/chaos, and history of change. The factor labeled communication simply describes whether all members of the practice are able to work through problems as a team through discussion and consultation with one another. High scores indicate better communication. High scores on decision making indicate that within the practice there is a participatory approach to making decisions and that the leadership encourages input from all employees of the practice. High scores on stress/chaos indicate that the employees feel overwhelmed by the workload. High scores on history of change indicate that there have been numerous changes in the management and culture of the practice. The first three of these dimensions fit into the resources for change element of the change model described by Cohen et al. (2004).

History of change, particularly when evaluated at the practice level, was associated with lower levels of communication and participatory decision making and with higher levels of chaos. Further investigation may help to understand the causal direction of this relationship. Does change cause chaos within the practice, followed by less communication? Or is the perception of change greater when staff members are not included in the decision-making process? One weakness of the survey items is that we cannot identify the nature of the change being measured, whether it is change due to turnover of staff, financial crises, alterations of the operating procedures in the practice, etc. In the ULTRA study, which provided the baseline data for development of the SOAPC instrument, we hypothesize that although change may happen in any practice due to multiple influences, the ability to maintain processes that allow for positive change initiated within the practice is actually facilitated by better communication and higher participatory decision making. This hypothesis and, in general, the relationship of change with chaos, communication, and participatory decision making are best addressed in the context of prospective, longitudinal studies with the aid of qualitative observations.

The assessment of communications does not reflect the rich, multifaceted nature of relationships and communication, but is rather a fairly simple assessment of whether communication exists or not. In particular, features such as diversity of opinion (McDaniel and Walls 1997), whether people are collectively mindful of their work (Weick and Sutcliffe 2001), and heedful of the impact of their actions on others (Weick and Roberts 1993) need to be considered. Items crafted to address these features would help to assess the qualities of relationships within the practice that are critical for establishing the capacity for change.

There are a number of limitations to this study. The instrument was tested in a relatively small number of practices from a fairly narrow region of two northeastern states. To help ensure transferability of the results, practices were selected to represent a range of small to medium-sized independent organizations located in a range of inner-city, suburban, and semi-rural regions. These practices were also primarily family medicine practices. While we did not test this instrument in pediatric and general internal medicine practices, we would expect them to behave similarly to family practices because they have similar external pressures and perform similar functions. Finally, the data for this analysis come from practices that agreed to participate in an intervention study aimed at improving office functioning and patient care. As such, one would expect the practices included in this sample to be more innovative than the typical practice. Thus, one would expect the variation observed between practices in this study to be amplified when one considers a broader spectrum of primary care practices.

The four scales were also internally consistent from a quantitative standpoint and corresponded well with in-depth qualitative data available for each individual item within a practice. Additional research will be required to establish the optimal range of scores for achieving different clinical and other performance outcomes.

This instrument provides a practical resource for monitoring secular change, establishing performance measure benchmarks, and targeting interventions. Benchmarking is especially important as practices attempt to implement the chronic care team approaches that many see as a necessary step for enhancing the process and outcomes of care in the primary care setting (Brook et al. 1996; Cleary and Edgman-Levitan 1997; Ovretveit et al. 2002). It is also increasingly recognized that individual practices adapt their care and management processes to meet the needs of their unique “fitness landscape” or niche (Miller et al. 2001; Cohen et al. 2004). Intervention studies are now increasingly being tailored to the local practice's attributes (Stange 1996; Goodwin et al. 2001; Stange et al. 2003; Solberg et al. 2004). This instrument assesses practice attributes that not only can be studied for correlations with provision of patient care, but can be targeted in interventions and used to document the impact of these interventions.


Acknowledgments

The authors appreciate the enthusiastic participation of the practices that made this study possible. The authors also wish to acknowledge members of the panel of experts who spent time reviewing and discussing instruments: Patrice Gregory, Ph.D.; Alfred Tallia, M.D., M.P.H.; Thomas Rundall, Ph.D.; Russell Glasgow, Ph.D.; Dan Gaylin, Ph.D.; Reuben McDaniel, Jr., Ed.D.; Laura Leviton, Ph.D.; and Kurt Stange, M.D., Ph.D. Funding for this panel was provided by the Robert Wood Johnson Foundation. The authors also wish to thank two reviewers for their helpful critiques. Funding support for data collection and analysis was provided by a grant from the National Heart, Lung, and Blood Institute (R01 HL70800) and an AAFP Research Center Grant. This research was also supported by the Cancer Institute of New Jersey's Primary Care Research shared resource.

We gratefully acknowledge the practices from the New Jersey Family Medicine Research Network and Eastern Pennsylvania Inquiry Collaborative Network, whose participation made this study possible.


References

  • Benson V, Marano M. Current Estimates from the National Health Interview Survey. Hyattsville, MD: National Center for Health Statistics; 1994.
  • Brook RH, McGlynn EA, Cleary PD. Quality of Health Care. Part 2: Measuring Quality of Care. New England Journal of Medicine. 1996;335(13):966–70. [PubMed]
  • Cleary PD, Edgman-Levitan S. Health Care Quality: Incorporating Consumer Perspectives. Journal of the American Medical Association. 1997;278(19):1608–12. [PubMed]
  • Cohen D, McDaniel R, Crabtree B, Ruhe M, Weyer S, Tallia A, Miller W, Goodwin M, Nutting P, Solberg L, Zyzanski S, Jaen C, Gilchrist V, Stange K. A Practice Change Model for Quality Improvement in Primary Care Practice. Journal of Healthcare Management. 2004;49(3):155–68. [PubMed]
  • Crabtree BF, Miller WL, Aita VA, Flocke SA, Stange KC. Primary Care Practice Organization and Preventive Services Delivery: A Qualitative Analysis. Journal of Family Practice. 1998;46(5):403–9. [PubMed]
  • Crabtree BF, Miller WL, Stange KC. Understanding Practice from the Ground Up. Journal of Family Practice. 2001;50(10):881–7. [PubMed]
  • Davies AR, Ware JE., Jr Involving Consumers in Quality of Care Assessment. Health Affairs (Millwood) 1988;7(1):33–48.
  • Davies HT, Nutley SM. Developing Learning Organisations in the New NHS. British Medical Journal. 2000;320(7240):998–1001. [PMC free article] [PubMed]
  • Donaldson M, O'Connor D, Bishop D. Exploring Innovation and Quality Improvement in Health Care Micro-Systems: A Cross-Case Analysis. Washington, DC: Institute of Medicine, National Academy Press; 2000.
  • Ferlie EB, Shortell SM. Improving the Quality of Health Care in the United Kingdom and the United States: A Framework for Change. Milbank Quarterly. 2001;79(2):281–315. [PubMed]
  • Goldstein SM, Schweikhart SB. Empirical Support for the Baldrige Award Framework in U.S. Hospitals. Health Care Management Review. 2002;27(1):62–75. [PubMed]
  • Goodwin MA, Zyzanski SJ, Zronek S, Ruhe M, Weyer S, Konrad N, Esola D, Stange KC. A Clinical Trial of Tailored Office Systems for Preventive Service Delivery. The Study To Enhance Prevention by Understanding Practice (STEP-UP) American Journal of Preventive Medicine. 2001;21(1):20–8. [PubMed]
  • Graham R, Roberts R, Ostergaard D, Kahn N, Pugno P, Green L. Family Practice in the United States: A Status Report. Journal of the American Medical Association. 2002;288:1097–101. [PubMed]
  • Grumbach K. Primary Care in the United States—The Best of Times, The Worst of Times. New England Journal of Medicine. 1999;341(26):2008–10. [PubMed]
  • Hertz HS, Reimann CW, Bostwick MC. The Malcolm Baldrige National Quality Award Concept: Could It Help Stimulate or Accelerate Health Care Quality Improvement? Quality Management in Health Care. 1994;2(4):63–72. [PubMed]
  • IOM Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
  • Jennings K, Westfall F. A Survey-Based Benchmarking Approach for Health Care Using the Baldrige Quality Criteria. Joint Commission Journal on Quality Improvement. 1994;20(9):500–9. [PubMed]
  • Lemieux-Charles L, McGuire W, Blidner I. Building Interorganizational Knowledge for Evidence-Based Health System Change. Health Care Management Review. 2002;27(3):48–59. [PubMed]
  • McDaniel RR, Jr, Jordan ME, Fleeman BF. Surprise, Surprise, Surprise! A Complexity Science View of the Unexpected. Health Care Management Review. 2003;28(3):266–78. [PubMed]
  • McDaniel RR, Jr, Walls M. Diversity as a Management Strategy for Organizations. Journal of Management Inquiry. 1997;6(4):363–75.
  • Meyer SM. Contrasting the Original Malcolm Baldrige National Quality Award and the Health Care Pilot Award. Quality Management in Health Care. 1998;6(3):12–21. [PubMed]
  • Meyer S, Collier D. An Empirical Test of the Causal Relationships in the Baldrige Health Care Pilot Criteria. Journal of Operations Management. 2001;19:403–25.
  • Miller WL, Crabtree BF, McDaniel R, Stange KC. Understanding Change in Primary Care Practice Using Complexity Theory. Journal of Family Practice. 1998;46(5):369–76. [PubMed]
  • Miller WL, McDaniel RR, Jr, Crabtree BF, Stange KC. Practice Jazz: Understanding Variation in Family Practice Using Complexity Science. Journal of Family Practice. 2001;50(10):872–8. [PubMed]
  • Mitchell PH, Shortell SM. Adverse Outcomes and Variations in Organization of Care Delivery. Medical Care. 1997;35(11):NS19–32. [PubMed]
  • Nabitz U, Klazinga N, Walburg J. The EFQM Excellence Model: European and Dutch Experiences with the EFQM Approach in Health Care. European Foundation for Quality Management. International Journal for Quality in Health Care. 2000;12(3):191–201. [PubMed]
  • NIST. Baldrige National Quality Program. Gaithersburg, MD: NIST; 2002. Available at
  • Nordhaus-Bike A. Rare Breed. St. Louis System Goes the Extra Mile in Seeking Baldrige Award. Materials Management in Health Care. 2001;10(1):14–5. [PubMed]
  • Ovretveit J, Bate P, Cleary P, Cretin S, Gustafson D, McInnes K, McLeod H, Molfenter T, Plsek P, Robert G, Shortell S, Wilson T. Quality Collaboratives: Lessons from Research. Quality and Safety in Health Care. 2002;11(4):345–51. [PMC free article] [PubMed]
  • Rogers E. Diffusion of Innovations. New York: Free Press; 1995.
  • Ruhe M, Gotler RS, Goodwin MA, Stange KC. Physician and Staff Turnover in Community Primary Care Practice. Journal of Ambulatory Care Management. 2004;27(3):242–8. [PubMed]
  • Scott J, Vojir C, Jones K, Moore L. Assessing Nursing Homes' Capacity to Create and Sustain Improvement. Journal of Nursing Care Quality. 2005;20(1):36–42. [PubMed]
  • Scott-Cawiezell J, Jones K, Moore L, Vojir C. Nursing Home Culture: A Critical Component in Sustained Improvement. Journal of Nursing Care Quality. 2005;20(4):341–8. [PubMed]
  • Scott-Cawiezell J, Schenkman M, Moore L, Vojir C, Connoly RP, Pratt M, Palmer L. Exploring Nursing Home Staff's Perceptions of Communication and Leadership to Facilitate Quality Improvement. Journal of Nursing Care Quality. 2004;19(3):242–52. [PubMed]
  • Senge PM. The Fifth Discipline. The Art & Practice of the Learning Organization. New York: Doubleday; 1990.
  • Senge PM. The Fifth Discipline: The Art and Practice of the Learning Organization. New York: Doubleday/Currency; 1994.
  • Shortell S. Effective Organizational Change: Assessing the Evidence. Washington, DC: National Academies Press: Institute of Medicine Annual Meeting; 2002.
  • Shortell SM. High-Performing Healthcare Organizations: Guidelines for the Pursuit of Excellence. Hospital & Health Services Administration. 1985;30(4):7–35.
  • Shortell SM. Developing Effective Culture Vital to Hospital Strategy. Modern Healthcare. 1990;20(30):38.
  • Shortell SM, Bennett CL, Byck GR. Assessing the Impact of Continuous Quality Improvement on Clinical Practice: What It Will Take to Accelerate Progress. Milbank Quarterly. 1998;76(4):593–624, 510. [PubMed]
  • Shortell SM, Jones RH, Rademaker AW, Gillies RR, Dranove DS, Hughes EF, Budetti PP, Reynolds KS, Huang CF. Assessing the Impact of Total Quality Management and Organizational Culture on Multiple Outcomes of Care for Coronary Artery Bypass Graft Surgery Patients. Medical Care. 2000;38(2):207–17. [PubMed]
  • Shortell SM, LoGerfo JP. Hospital Medical Staff Organization and Quality of Care: Results for Myocardial Infarction and Appendectomy. Medical Care. 1981;19(10):1041–55. [PubMed]
  • Shortell SM, O'Brien JL, Hughes EF, Carman JM, Foster RW, Boerstler H, O'Connor EJ. Assessing the Progress of TQM in U.S. Hospitals: Findings from Two Studies. Quality Letter for Healthcare Leaders. 1994;6(3):14–7. [PubMed]
  • Shortell SM, Rousseau DM, Gilles RR, Devers KJ, Simons TL. Organizational Assessment in Intensive Care Units (ICUs): Construct Development, Reliability, and Validity of the ICU Nurse-Physician Questionnaire. Medical Care. 1991;29(8):709–26. [PubMed]
  • Shortell SM, Selberg J. Working Differently. The IOM's Call to Action. Healthcare Executive. 2002;17(1):6–10. [PubMed]
  • Shortell SM, Zimmerman JE, Rousseau DM, Gillies RR, Wagner DP, Draper EA, Knaus WA, Duffy J. The Performance of Intensive Care Units: Does Good Management Make a Difference? Medical Care. 1994;32(5):508–25. [PubMed]
  • Solberg LI, Hroscikoski MC, Sperl-Hillen JM, O'Connor PJ, Crabtree BF. Key Issues in Transforming Health Care Organizations for Quality: The Case of Advanced Access. Joint Commission Journal on Quality and Safety. 2004;30(1):15–24. [PubMed]
  • Stange K. One Size Doesn't Fit All. Multimethod Research Yields New Insights into Interventions to Increase Prevention in Family Practice. Journal of Family Practice. 1996;43(4):358–60. [PubMed]
  • Stange KC, Goodwin MA, Zyzanski SJ, Dietrich AJ. Sustainability of a Practice-Individualized Preventive Service Delivery Intervention. American Journal of Preventive Medicine. 2003;25(4):296–300. [PubMed]
  • Stange KC, Zyzanski SJ, Jaen CR, Callahan EJ, Kelly RB, Gillanders WR, Shank JC, Chao J, Medalie JH, Miller WL, Crabtree BF, Flocke SA, Gilchrist VJ, Langa DM, Goodwin MA. Illuminating the ‘Black Box.’ A Description of 4454 Patient Visits to 138 Family Physicians. Journal of Family Practice. 1998;46(5):377–89. [PubMed]
  • Tallia A, Stange KC, McDaniel RR, Aita V, Miller WL, Crabtree BF. Understanding Organizational Design Perspectives of Primary Care Practices. Journal of Healthcare Management. 2003;48(1):45–59. [PubMed]
  • Weeks WB, Hamby L, Stein A, Batalden PB. Using the Baldrige Management System Framework in Health Care: The Veterans Health Administration Experience. Joint Commission Journal on Quality Improvement. 2000;26(7):379–87. [PubMed]
  • Weick K, Roberts K. Collective Mind in Organizations: Heedful Interrelating on Flight Decks. Administrative Science Quarterly. 1993;38:357–81.
  • Weick K, Sutcliffe K. Managing the Unexpected. San Francisco: Jossey-Bass; 2001.
  • Woodwell DA. National Ambulatory Medical Care Survey 1997 Summary. Advance Data. 1999;(305):1–28. [PubMed]
  • Zairi M. Managing Human Resources in Healthcare: Learning from World Class Practices—Part I. Health Manpower Management. 1998;24(2–3):48–57. [PubMed]

Articles from Health Services Research are provided here courtesy of Health Research & Educational Trust