Qual Saf Health Care. 2007 October; 16(5): 387–399.
PMCID: PMC2464970

Application of statistical process control in healthcare improvement: systematic review



Objective

To systematically review the literature regarding how statistical process control—with control charts as a core tool—has been applied to healthcare quality improvement, and to examine the benefits, limitations, barriers and facilitating factors related to such application.

Data sources

Original articles found in relevant databases, including Web of Science and Medline, covering the period 1966 to June 2004.

Study selection

From 311 articles, 57 empirical studies, published between 1990 and 2004, met the inclusion criteria.


Data extraction

A standardised data abstraction form was used for extracting data relevant to the review questions, and the data were analysed thematically.


Results

Statistical process control was applied in a wide range of settings and specialties, at diverse levels of organisation and directly by patients, using 97 different variables. The review revealed 12 categories of benefits, 6 categories of limitations, 10 categories of barriers, and 23 factors that facilitate its application, all fully referenced in this report. Statistical process control helped different actors manage change and improve healthcare processes. It also enabled patients with, for example, asthma or diabetes mellitus, to manage their own health, and thus has therapeutic qualities. Its power hinges on correct and smart application, which is not necessarily a trivial task. This review catalogues 11 approaches to such smart application, including risk adjustment and data stratification.


Conclusions

Statistical process control is a versatile tool which can help diverse stakeholders to manage change in healthcare and improve patients' health.

Quality improvement (QI) practices represent a leading approach to the essential, and often challenging, task of managing organisational change.1 Statistical process control (SPC) is, in turn, a key approach to QI.2 SPC was developed in the 1920s by the physicist Walter Shewhart to improve industrial manufacturing. It migrated to healthcare, first in laboratory settings (eg, Fisher and Humphries3) and then into direct patient care applications, along with other approaches to QI. Before we report on our systematic review of the literature on how SPC has been applied to QI in healthcare, there is a need to define SPC and its role in QI.

“Statistical process control (SPC) is a philosophy, a strategy, and a set of methods for ongoing improvement of systems, processes, and outcomes. The SPC approach is based on learning through data and has its foundation in the theory of variation (understanding common and special causes). The SPC strategy incorporates the concepts of an analytic study, process thinking, prevention, stratification, stability, capability, and prediction. SPC incorporates measurement, data collection methods, and planned experimentation. Graphical methods, such as Shewhart charts (more commonly called ‘control charts'), run charts, frequency plots, histograms, Pareto analysis, scatter diagrams, and flow diagrams are the primary tools used in SPC.” (Carey4, p xviii)

The terms “statistical process control” and “statistical quality control” are often used interchangeably,5 although sometimes the latter is used to describe a broader organisational approach to quality management that evolved into the concept of total quality management.6

One of the tenets of QI is that to improve healthcare performance we must change our way of working.7 But change does not always mean improvement. To discriminate between changes that yield improvement and those that do not, relevant aspects of performance need to be measured. In addition, measurement guides decisions about where improvement efforts should be focused in the first place. SPC may facilitate such decision making. Control charts, central to SPC, are used to visualise and analyse the performance of a process—including biological processes such as blood pressure homoeostasis or organisational processes such as patient care in a hospital—over time, sometimes in real time. Statistically derived decision rules help users to determine whether the performance of a process is stable and predictable or whether there is variation in the performance that makes the process unstable and unpredictable. One source of such variation can be a successful intervention aimed at improvement that changes performance for the better. If the improvement is maintained, the process will stabilise again at its new level of performance. All of this can be easily determined by using SPC.4
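The mechanics of a control chart can be illustrated in a few lines of code. The sketch below shows a basic individuals (XmR) chart with invented data; it is a hypothetical minimal example, not a method taken from the reviewed studies. The constant 2.66 is the standard factor that converts the mean moving range into 3-sigma limits.

```python
def xmr_limits(data):
    """Centre line and 3-sigma control limits for an individuals
    (XmR) chart, estimated from the mean moving range."""
    n = len(data)
    centre = sum(data) / n
    # Mean of the absolute differences between consecutive points
    mr_bar = sum(abs(data[i] - data[i - 1]) for i in range(1, n)) / (n - 1)
    # 2.66 = 3/d2, where d2 = 1.128 for moving ranges of size 2
    return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

def special_cause_points(data):
    """Indices of points outside the control limits, ie signals of
    special-cause variation under the most basic decision rule."""
    centre, lcl, ucl = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]
```

A stable process yields no signals; a successful improvement intervention would first appear as a signal and, if sustained, would call for recalculating the limits at the new level of performance.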

Although there are theoretical propositions that SPC can facilitate decision making and QI in healthcare (eg, Berwick,8 Benneyan et al,9 Plsek10) it is not clear what empirical support there is in the literature for such a position11:

“The techniques of statistical process control, which have proved to be invaluable in other settings, appear not to have realised their potential in health care. ... Is this because they are, as yet, rarely used in this way in health care? Is it because they are unsuccessful when used in this way and thus not published (publication bias)? Or is it that they are being successfully used but not by people who have the inclination to share their experience in academic journals?” (p 200)

The present systematic review aimed to answer these questions. We examined the literature for how and where SPC has been applied in QI of clinical/patient care processes and the benefits, limitations, barriers and facilitating factors related to such application.

Materials and methods

Drawing on the principles and procedures for systematic review of QI interventions12 we searched for articles on the application of SPC in healthcare QI published between 1966 and June 2004 (see appendix A) in the following databases: Web of Science, Ovid Medline(R), EMBASE, CINAHL (Cumulative Index to Nursing and Allied Health Literature), PsycInfo, and the Centre for Reviews and Dissemination databases. We also included articles found by searching reference lists or from elsewhere which we were aware of, if they met our inclusion criteria: original empirical studies of SPC application in improvement of clinical/patient care processes in healthcare organisations, published in English. We excluded articles dealing with application of SPC in laboratory or technical processes (eg, film processing) and in surveillance/monitoring (unless they also contained empirical data about improvement efforts), as well as tutorials (unless they contained empirical case studies), letters, book reviews and dissertations.

We reviewed abstracts, when available, or else other information about the publication provided in the database (eg, publication type such as letters, book reviews or original articles). Articles that did not meet the inclusion criterion were excluded. We retrieved and read the full text of the remaining articles, again excluding the articles that did not meet the inclusion criterion.

We developed, pilot tested and modified a data abstraction form which we then used to consistently capture information of relevance to our review questions on reading the full text articles. The information recorded was: whether and how the article met the inclusion criterion; study objective(s); study design; whether the study empirically compared application of SPC with any other method for process data display and analysis; reported benefits, limitations, barriers and facilitating factors related to SPC; organisational setting; country where study was conducted; clinical specialty; unit of analysis; variables for SPC analysis; and other observations. Some questions in the form required a yes/no or brief response (eg, country where study was conducted) and others required answers in the form of direct quotes from the article or a summary written by the reviewer. Each article was read and data abstracted by one member of the review team (the coauthors of this review). Following this, all the data abstraction forms were reviewed by the first author, who solicited clarification and checked for any missing or incomplete data to ensure consistency in reporting across all articles reviewed. He also conducted the initial data synthesis, which was then reviewed by the entire team.

We determined the study design for each article and whether the investigators intended to test the utility of SPC application, alone or in combination with other interventions. In several articles, the study design or study objectives were not explicitly stated; in such cases, our determination was based on our reading of the full text papers.

Simple descriptive statistics—for example, the number of publications per year of publication or per country—were used to characterise the included studies. The qualitative nature of our research questions and of the abstracted data shaped our analysis and synthesis of findings regarding benefits, limitations, SPC variables, etc.13 The abstracted data were reviewed one question at a time and data from each article were classified into one or more thematic categories, each with a descriptive heading. Informed by our present understanding of QI and healthcare, we developed these categories as we reviewed the data, rather than using categories derived a priori from theory. For data that did not fit into an existing category, we developed a new one. Thus the categories emerged as we synthesised the data. We report the categorised data in tabular form, illustrated with examples, and give the references of all the source studies.

To strengthen our review through investigator triangulation,14 we sought feedback on an earlier version of this manuscript from two SPC experts: one was the most frequent coauthor in the included studies and the other was an expert on SPC application also in settings other than healthcare. Their comments helped us refine our data synthesis and distil our findings.


Results

The database searches yielded 311 references. The initial review (abstracts etc) yielded 100 articles which we read in full text form. Of these, 57 articles met the inclusion criteria and have been included in the review.15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71 To characterise the body of literature, figure 1 shows the year of publication and whether the studies were conducted in the USA or elsewhere (further specified below); table 1 gives the study designs and objectives—whether or not to test SPC utility.

Figure 1 The number of included articles by year of publication. (A total of 55 articles were published in 1990–2003; the two articles from 2004 are not included in this graph since the database searches were conducted in June 2004.)
Table 1 Study design and objectives of the studies included in the systematic review*

Most of the articles (45/57) concerned application of SPC in healthcare improvement in the USA.15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,37,38,39,40,42,43,45,47,49,50,51,52,53,54,55,56,59,60,63,67,68,69,70,71 While the first US‐based article was published in 1990, the non‐US articles were published between 1998 and 2003: three articles were from the UK,61,62,66 three were from Switzerland,36,41,46 and one each from Australia,58 Finland,65 France,57 Indonesia,44 Norway64 and Spain.48 The intention to test the utility of SPC is exemplified by a study aiming to reduce the rate of acquisition of methicillin‐resistant Staphylococcus aureus (MRSA) on wards and units at Glasgow Royal Infirmary hospitals.61 Annotated control charts displaying data on MRSA acquisition were fed back monthly to medical staff, managers and hotel services. Sustained reductions in the rate of acquisition from the baseline, which could not otherwise be accounted for, began 2 months later. In contrast, investigators at a paediatric emergency department used SPC to demonstrate a decline in the rate of contamination following the introduction of a new approach to drawing blood for culture specimens,68 without any intention of testing the utility of SPC per se.
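For count or proportion data such as monthly contamination rates, an attribute chart is typically used rather than a chart of individual measurements. The sketch below, using invented numbers, shows how per-month 3-sigma limits for a p-chart could be computed; it is an illustrative assumption, not the actual analysis from the cited studies.

```python
import math

def p_chart_limits(defectives, sample_sizes):
    """Centre line (pooled proportion) and per-period 3-sigma limits
    for a p-chart; limits widen for smaller samples, and the lower
    limit is floored at zero."""
    p_bar = sum(defectives) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma))
    return p_bar, limits

# Hypothetical example: contaminated cultures out of monthly totals
p_bar, limits = p_chart_limits([5, 4, 6], [100, 100, 100])
```

Because the limits depend on each month's sample size, a p-chart remains interpretable even when monthly volumes fluctuate.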

To characterise the content of the articles, we first present how and where SPC has been applied to healthcare QI. Tables 2–4 present the study settings (ie, hospital etc where SPC was applied; table 2), the fields of healthcare (ie, specialties or forms of care; table 3), and the units of analysis (table 4). Table 5 lists the 97 distinct SPC variables that have been reported. Tables 6–9 convey our synthesis of the reported benefits, limitations, barriers and facilitating factors related to SPC application. For each category, we give explanations or examples and references to the source articles.

Table 2 How and where SPC was applied: study settings*
Table 3 How and where SPC was applied: fields of healthcare*
Table 4 How and where SPC was applied: units of analysis*
Table 5 SPC variables*
Table 6 Benefits of using SPC to improve clinical processes*
Table 7 Limitations of SPC application in improvement of clinical processes*
Table 8 Barriers to SPC application*
Table 9 Factors or conditions facilitating application of SPC*


Discussion

SPC has been applied to healthcare improvement in a wide range of settings and specialties, at diverse levels of organisations and directly by patients, using many types of variables (fig 1, tables 2–5). We found reports of substantial benefits of SPC application, as well as important limitations of, barriers to and factors that facilitate SPC application (tables 6–9). These findings indicate that SPC can indeed be a powerful and versatile tool for managing change in healthcare through QI. Besides helping diverse stakeholders manage and improve healthcare processes, SPC can also help clinicians and patients understand and improve patients' health when applied directly to health indicators such as PEFR in asthma or blood sugar concentrations in diabetes. In healthcare, the “study subject” can thus also be an active agent in the process, as when patients apply SPC to their own health. Several studies indicated the empowering effects this may have on patients.35,38,40,50 SPC application thus has therapeutic potential as it can help patients manage their own health. We agree with Alemi and Neuhauser70 that this potential merits further investigation.

Most of the included articles concerned application of SPC in healthcare improvement in the USA. Articles from other countries appeared only towards the end of the study period (fig 1). We have no explanation for this finding, but we speculate that it is related to differences between US and other healthcare systems with regard to QI awareness and implementation.73

Only 22 studies included in the review were intended to test the utility of SPC (table 1). Of the four controlled studies, only one included a control chart in the intervention (as a minor component which did not fully exploit the features of SPC). In 35 articles we did not find an intention to test the utility of SPC application. In those cases, SPC was applied for other reasons (ie, to evaluate the impact of other interventions). Even though many articles thus did not address the utility of SPC, all studies offered information—to varying degrees—relevant to our review's question of how SPC has been applied to healthcare. The utility of SPC is reflected in benefits reported regarding SPC application (table 6).

SPC has been applied in over 20 specialties or fields of healthcare, at a wide range of levels (tables 3 and 4), suggesting that SPC has broad applicability in healthcare. The dominance of anaesthesia and intensive care can be explained in large part by the fact that many studies included their services in conjunction with other specialties. This reflects the way in which anaesthesia has a vital supporting role in many clinical care processes. The 97 SPC variables reported (table 5) demonstrate a diversity of situations in which SPC has been applied, ranging from process indicators of patients' health to health outcomes and many aspects of healthcare processes and organisational performance. This indicates that SPC is a versatile QI tool.

The benefits of SPC application (table 6) mirror those given in books and tutorials on SPC (exemplified by the quote in the Introduction to this review). As noted in a report from a top‐ranked healthcare system which has applied SPC widely:

“Among the most powerful quality management tools that IHC [Intermountain Health Care, USA] has applied is statistical process control, SPC. Most notable among those tools are control charts. Under optimal conditions, these graphical depictions of process performance allow participants to know what is happening within their processes as ‘real time' data enable them to make appropriate decisions. The capability of truly understanding processes and variation in a timely manner has resulted in the most dramatic, immediate, and ongoing improvements of any management technique applied at IHC.” (Shaha,26 p 22)

The limitations of SPC application (table 7) identified by this review are important, and yet perhaps less emphasised than the benefits in books and tutorials on SPC. SPC cannot solve all problems and must be applied wisely. There are many opportunities to “go wrong”, as illustrated by the case where incorrect application was highlighted by other authors (limitation number 5 in table 7). In several cases, our own understanding of SPC suggested that investigators had not used it correctly or fully (eg, standard decision rules to detect special causes were not applied to identify process changes). In the worst case scenario, incorrect application of SPC could lead to erroneous conclusions about process performance and waste time, effort and spirit and even contribute to patient harm. In the more authoritative studies we reviewed, co‐investigators included experts in industrial engineering or statistics or authors who otherwise had developed considerable expertise in SPC methodology. On the basis of these observations, we conclude that although SPC charts may be easy to use even for patients, clinicians or managers without extensive SPC training, they may not be equally simple to construct correctly. To apply SPC is, paradoxically, both simple and difficult at the same time. Its power hinges on correct and smart application, which is not necessarily a trivial task. The key, then, is to develop or recruit the expertise necessary to use SPC correctly and fully and to make SPC easy for non‐experts to use, before using it widely.
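One of the standard decision rules mentioned above is a sustained run of points on one side of the centre line, which signals a process shift even when no single point breaches the control limits. It can be sketched as follows; the eight-point run length is a common convention (some texts use seven or nine points), and this is an illustrative sketch rather than code from the reviewed studies.

```python
def shift_detected(data, centre, run_length=8):
    """Detect a sustained process shift: `run_length` consecutive
    points on the same side of the centre line."""
    run = 0
    last_side = 0
    for x in data:
        # +1 above the centre line, -1 below, 0 exactly on it
        side = 1 if x > centre else (-1 if x < centre else 0)
        if side != 0 and side == last_side:
            run += 1
        else:
            run = 1 if side != 0 else 0
        last_side = side
        if run >= run_length:
            return True
    return False
```

Applying such rules mechanically, rather than judging charts by eye, is part of what distinguishes correct SPC use from the incomplete applications noted above.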

Autocorrelation is another limitation of SPC highlighted by this review. Our review, and published books, offer limited advice on how to manage it:

“There is no single acceptable way of dealing with autocorrelation. Some would say simply to ignore it. [Others] would disagree and suggest various measures to deal with the phenomenon. One way is to avoid the autocorrelation by sampling less frequently. ... Others argue against plotting autocorrelated data on control charts and recommend that the data be plotted on a line chart (without any centerline or control limits).” (Carey,4 p 68)
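A simple first step, before choosing among these strategies, is to quantify the autocorrelation in the data. The sketch below computes the lag-1 autocorrelation coefficient; the warning threshold of about 0.5 sometimes quoted in SPC texts is a rule of thumb, not a finding of this review.

```python
def lag1_autocorrelation(data):
    """Lag-1 autocorrelation coefficient of a time series.
    Strongly positive values warn that standard control limits,
    which assume independent observations, will be too tight."""
    n = len(data)
    mean = sum(data) / n
    denom = sum((x - mean) ** 2 for x in data)
    num = sum((data[i] - mean) * (data[i + 1] - mean) for i in range(n - 1))
    return num / denom
```

A positive coefficient (consecutive points resembling each other) inflates the false-alarm rate of a standard chart; sampling less frequently, as suggested above, is one way to reduce it.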

Just over a quarter of the articles reported barriers to SPC application (table 8). The three broad divisions of barriers—people, data and chart construction, and IT—indicate where extra care should be taken when introducing SPC in a healthcare organisation. Ideas on how to manage the limitations of and barriers to SPC application can be found among the factors reported to facilitate SPC application (table 9). They deal with, and go beyond, the areas of barriers we found. We noted the prominence of learning and also of focusing on topics of interest to clinicians and patients. The 11 categories under the heading “Smart application of SPC can be helpful” contain valuable approaches that can be used to improve SPC application. Examples include risk adjustment51,52,71 and stratification30,37,59 to enable correct SPC analysis of data from heterogeneous populations of patients (or organisational units). Basic understanding of SPC must be taught to stakeholders and substantial skill and experience is required to set up successful SPC application. Experts, or facilitators, in healthcare organisations can help, as indicated in table 9, and as we have described for other QI methods.74
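Stratification itself is mechanically simple: observations are grouped by a stratum variable (eg, ward or case-mix group, both hypothetical here) and a separate chart is built for each group, rather than mixing heterogeneous populations on one chart. A minimal sketch of the grouping step, under those assumptions:

```python
from collections import defaultdict

def stratified_means(records):
    """Group (stratum, value) observations by stratum and return
    the per-stratum mean, ie the centre line that a separate
    control chart for each stratum would be built around."""
    groups = defaultdict(list)
    for stratum, value in records:
        groups[stratum].append(value)
    return {s: sum(v) / len(v) for s, v in groups.items()}
```

Charting each stratum separately prevents between-group differences from masquerading as special-cause variation within a single mixed chart.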

We found more information on SPC benefits and facilitating factors than on limitations and barriers, and this may represent a form of publication bias, as indicated by the quote in the Introduction.11 We did not find any study that reported failed SPC application. We can speculate that there have been situations when SPC application failed, just as there must be many cases of successful SPC application that have not been reported in the literature. Studies of failed SPC application efforts, as well as studies designed to identify successful ways to apply SPC to manage change, would help inform future SPC application efforts. On the basis of this review, we agree with the argument that “medical quality improvement will not reach its full potential unless accurate and transparent reports of improvement work are published frequently and widely (p 319),”75 and also that the way forward is to strengthen QI research rather than to lower the bar for publication.76

Methodological considerations regarding the included studies

None of the studies we found was designed to evaluate the effectiveness quantitatively—that is, the magnitude of benefits—of SPC application. This would have required other study designs such as cluster randomised trials or quasi‐experimental studies.12 Although the “methods of evaluating complex interventions such as quality improvement interventions are less well described [than those to evaluate less complex interventions such as drugs]”, Eccles et al argue that the “general principle underlying the choice of evaluative design is ... simple—those conducting such evaluations should use the most robust design possible to minimise bias and maximise generalisability. [The] design and conduct of quantitative evaluative studies should build upon the findings of other quality improvement research (p 47).”77 This review can provide such a foundation for future evaluative studies.

An important distinction is warranted here: we believe that SPC rests on a solid theoretical, statistical foundation and is a highly robust method for analysing process performance. The designs of the studies included in this systematic review were, however, not particularly robust with regard to evaluating the effectiveness of SPC application, and that was not their objective. This does not mean that SPC is not a useful tool for QI in healthcare, only that the studies reviewed here were more vulnerable to bias than more robust study designs, even if they do indicate many clear benefits of SPC application (table 6). Despite the studies not being designed to evaluate the effectiveness of SPC, many used SPC to effectively show the impact of QI or other change initiatives. In this way, SPC analysis can be just as powerful and robust as study designs often deemed superior, such as randomised controlled trials (RCTs).77 The key to this power is the statistical and practical ability to detect significant changes over time in process performance when applying SPC.9 On the basis of a theoretical comparison between control charts and RCTs, Solodky et al38 argue that control charts can complement RCTs, and sometimes even be preferable to RCTs, since they are so robust and enable replication—“the gold standard” for research quality—at much lower cost than do RCTs. These points have been further elaborated in subsequent work.78,79

A curious methodological wrinkle arises in our review: can the application of a method (eg, SPC) be evaluated using that same method? Several of the included studies used SPC both as (part of) an intervention and as a method to evaluate the impact of that intervention. For example, Curran et al used annotated control charts to feed information on MRSA acquisition rates back to stakeholders and used these same control charts to show the effectiveness of the feedback programme.61

Relationship between monitoring and improvement

When SPC is applied for monitoring, rather than for managing change, the aims are different—for example, to detect even small but clinically important deviations in performance—as are the methodological challenges.80,81 This review focused on the latter. Thus although studies on SPC application for monitoring healthcare performance were excluded from this review, we recognise the importance of such monitoring. The demarcation between monitoring and improvement is not absolute. Indeed, there are important connections between measurement, monitoring and improvement, even if improvement does not follow automatically from indications of dissatisfactory performance. “To improve performance, organizations and individuals need the capability to control, improve, and design processes, and then to monitor the effects of this improvement work on the results. Measurement alone will not suffice (pp 1–35).”82

Monitoring performance by way of control charts has been suggested as a better approach to clinical governance in the British National Health Service. Through six case studies, Mohammed et al demonstrate how control chart monitoring of performance can distinguish normal performance from performance that is either substandard or better than usual care. “These case studies illustrate an important role for Shewhart's approach to understanding and reducing variation. They demonstrate the simplicity and power of control charts at guiding their users towards appropriate action for improvement (p 466).”83

Comments on the review methodology

No search strategy is perfect, and we may well have missed some studies where SPC was applied to healthcare QI. There are no SPC‐specific keywords (eg, Medical Subject Headings, MeSH), so we had to rely on text words. Studies not containing our search terms in the title or abstract could still be of potential interest, although presumably we found most of the articles where SPC application was a central element. We believe the risk that we systematically missed relevant studies is small; any studies we might have missed would therefore probably not have changed our findings much.

The review draws on our reading, interpretation and selection of predominantly qualitative data—in the form of text and figures—in the included articles to answer the questions in our data abstraction form. The questions we addressed, the answers we derived from the studies, and the ways we synthesised the findings are not the only ways to approach this dataset. Furthermore, each member of the review team brought different knowledge and experiences of relevance to the review, potentially challenging the reliability of our analysis. An attempt was made to reduce that risk by having one investigator read all data abstraction forms, and obtain clarifications or additional data from the original articles when needed. That investigator also conducted the initial data synthesis, which was then reviewed by the entire team and the two outside experts. Although other interpretations and syntheses of these data are possible, we believe that ours are plausible and hope they are useful.

The methods for reviewing studies based primarily on qualitative data in healthcare are less well developed than the more established methods for quantitative systematic reviews, and they are in a phase of development and diversification.13,84,85 Among the different methods for synthesising evidence, our approach is best characterised as an interpretive (rather than integrative) review applying thematic analysis—it “involves the identification of prominent or recurrent themes in the literature, and summarising the findings of different studies under thematic headings”.86 There is no gold standard for how to conduct reviews of primarily qualitative studies. Our response to this uncertainty has been to use the best ideas we could find, and to be explicit about our approach to allow readers to assess the findings and their provenance.

The main limitation of this review is the uncertainty regarding the methodological quality of many of the primary studies. Assessment of quality of qualitative studies is still under debate, and there is no consensus on whether at all, or, if so, how to conduct such assessments.84 We reviewed all the studies that satisfied our inclusion criteria and made no further quality assessment. Therefore our findings should be considered as tentative indications of benefits, limitations, etc to be corroborated, or rejected, by future research. The main strength of this review is our systematic and explicit approach to searching and including studies for review, and to data abstraction using a standardised form. It has helped generate an overview of how SPC has been applied to healthcare QI with both breadth and depth—similar to the benefits of thematic analysis reported by investigators reviewing young people's views on health and health behaviour.87

In conclusion, this review indicates how SPC has been applied to healthcare QI with substantial benefits to diverse stakeholders. Although there are important limitations and barriers regarding its application, when applied correctly SPC is a versatile tool which can enable stakeholders to manage change in healthcare and improve patients' health.


Acknowledgements

We thank Ms Christine Wickman, Information Specialist at the Karolinska Institutet Library, for expert assistance in conducting the database searches. We also acknowledge the pilot work conducted by Ms Miia Maunuaho as a student project at Helsinki University, supervised by Professor Brommels, which provided a starting point for this study. We thank Professor Duncan Neuhauser, Case Western Reserve University, Cleveland, Ohio, USA, and Professor Bo Bergman, Chalmers University of Technology, Gothenburg, Sweden, for their helpful comments on an earlier version of this manuscript. We thank Dr Rebecca Popenoe for her editorial assistance.


Abbreviations

MRSA - methicillin resistant Staphylococcus aureus

PEFR - peak expiratory flow rate

QI - quality improvement

RCT - randomised controlled trial

SPC - statistical process control

Appendix A

Database search strategy

Web of Science (1986 – 11 June 2004)

TS [topic search]  = ((statistical process control or statistical quality control or control chart* or (design of experiment and doe)) and (medical or nurs* or patient* or clinic* or healthcare or health care))

We limited the search to articles in English only which reduced the number of hits from 167 to 159. We saved these 159 titles with abstracts in an EndNote library. Using a similar strategy, we searched the following databases through Ovid:

  • Ovid MEDLINE(R) (1966 to week 1, June 2004)
  • EMBASE (1988 to week 24, 2004)
  • CINAHL (1982 to week 1, June 2004)
  • PsycINFO (1985 to week 5, May 2004)

This yielded 287 hits, including many duplicates, which we saved in the same EndNote library as above.

Centre for Reviews and Dissemination (CRD)

We searched all CRD databases and found two articles which we also added to our EndNote library.


Funding: No dedicated funding was received for this study. All coauthors were supported by their respective employers in conducting this research as part of their work.

Competing interests: None.


1. Burnes B. Managing change: a strategic approach to organisational dynamics, 3rd edn. London: Financial Times/Prentice Hall, 2000.
2. Wheeler D J, Chambers D S. Understanding statistical process control, 2nd edn. Knoxville, TN: SPC Press, 1992.
3. Fisher L M, Humphries B L. Statistical quality control of rabbit brain thromboplastin for clinical use in the prothrombin time determination. Am J Clin Pathol 1966;45:148–52.
4. Carey R G. Improving healthcare with control charts: basic and advanced SPC methods and case studies. Milwaukee: ASQ Quality Press, 2003.
5. Daniels S, Johnson K, Johnson C. Quality glossary. 2006 [cited 28 June 2006].
6. Blumenthal D. Applying industrial quality management science to physicians' clinical decisions. In: Blumenthal D, Scheck A, eds. Improving clinical practice: total quality management and the physician. San Francisco: Jossey‐Bass Publishers, 1995:25–49.
7. Berwick D M. A primer on leading the improvement of systems. BMJ 1996;312:619–22.
8. Berwick D M. Controlling variation in health care: a consultation from Walter Shewhart. Med Care 1991;29:1212–25.
9. Benneyan J C, Lloyd R C, Plsek P E. Statistical process control as a tool for research and healthcare improvement. Qual Saf Health Care 2003;12:458–64.
10. Plsek P E. Quality improvement methods in clinical medicine. Pediatrics 1999;103(1 Suppl E):203–14.
11. Wilcock P M, Thomson R G. Modern measurement for a modern health service. Qual Health Care 2000;9:199–200.
12. Grimshaw J, McAuley L M, Bero L A, et al. Systematic reviews of the effectiveness of quality improvement strategies and programmes. Qual Saf Health Care 2003;12:298–303.
13. Mays N, Roberts E, Popay J. Synthesising research evidence. In: Fulop N, Allen P, Clarke A, et al, eds. Studying the organisation and delivery of health services: research methods. London: Routledge, 2001:188–220.
14. Fulop N, Allen P, Clarke A, et al. Issues in studying the organisation and delivery of health services. In: Fulop N, Allen P, Clarke A, et al, eds. Studying the organisation and delivery of health services: research methods. London: Routledge, 2001:1–23.
15. Re R N, Krousel‐Wood M A. How to use continuous quality improvement theory and statistical quality control tools in a multispecialty clinic [see comment]. QRB Qual Rev Bull 1990;16:391–7.
16. Schnelle J F, Newman D R, Fogarty T E, et al. Assessment and quality‐control of incontinence care in long‐term nursing facilities. J Am Geriatr Soc 1991;39:165–71.
17. Bluth E I, Havrilla M, Blakeman C. Quality improvement techniques: value to improve the timeliness of preoperative chest radiographic reports. AJR Am J Roentgenol 1993;160:995–8.
18. McKenzie L. Process management: two control charts. Health Care Superv 1993;12:70–81.
19. Rollo J L, Fauser B A. Computers in total quality management. Statistical process control to expedite stats. Arch Pathol Lab Med 1993;117:900–5.
20. Dey M L, Sluyter G V, Keating J E. Statistical process control and direct care staff performance. J Ment Health Adm 1994;21:201–9.
21. Guinane C S, Sikes J I, Wilson R K. Using the PDSA cycle to standardize a quality assurance program in a quality improvement‐driven environment. Jt Comm J Qual Improv 1994;20:696–705.
22. Laffel G, Luttman R, Zimmerman S. Using control charts to analyze serial patient‐related data. Qual Manag Health Care 1994;3:70–7.
23. Nelson F E, Hart M K, Hart R F. Application of control chart statistics to blood pressure measurement variability in the primary care setting. J Am Acad Nurse Pract 1994;6:17–28.
24. Carey R G, Teeters J L. CQI case‐study—reducing medication errors. Jt Comm J Qual Improv 1995;21:232–7.
25. Oniki T A, Clemmer T P, Arthur L K, et al. Using statistical quality control techniques to monitor blood glucose levels. Proc Annu Symp Comput Appl Med Care 1995:586–90.
26. Shaha S H. Acuity systems and control charting. Qual Manag Health Care 1995;3:22–30.
27. Ziegenfuss J T Jr, McKenna C K. Ten tools of continuous quality improvement: a review and case example of hospital discharge. Am J Med Qual 1995;10:213–20.
28. Johnson C C, Martin M. Effectiveness of a physician education program in reducing consumption of hospital resources in elective total hip replacement. South Med J 1996;89:282–9.
29. Piccirillo J F. The use of patient satisfaction data to assess the impact of continuous quality improvement efforts. Arch Otolaryngol Head Neck Surg 1996;122:1045–8.
30. Shahian D M, Williamson W A, Svensson L G, et al. Applications of statistical quality control to cardiac surgery. Ann Thorac Surg 1996;62:1351–8; discussion 1358–9.
31. Tsacle E G, Aly N A. An expert system model for implementing statistical process control in the health care industry. Computers & Industrial Engineering 1996;31:447–50.
32. Cornelissen G, Halberg F, Hawkins D, et al. Individual assessment of antihypertensive response by self‐starting cumulative sums. J Med Eng Technol 1997;21:111–20.
33. Linford L H, Clemmer T P, Oniki T A. Development of a blood glucose protocol using statistical quality control techniques. Nutr Clin Pract 1997;12:38–41.
34. Ornstein S M, Jenkins R G, Lee F W, et al. The computer‐based patient record as a CQI tool in a family medicine center. Jt Comm J Qual Improv 1997;23:347–61.
35. Boggs P B, Wheeler D, Washburne W F, et al. Peak expiratory flow rate control chart in asthma care: chart construction and use in asthma care. Ann Allergy Asthma Immunol 1998;81:552–62.
36. Konrad C, Gerber H R, Schuepfer G, et al. Transurethral resection syndrome: effect of the introduction into clinical practice of a new method for monitoring fluid absorption. J Clin Anesth 1998;10:360–5.
37. Nelson E C, Splaine M E, Batalden P B, et al. Building measurement and data collection into medical practice. Ann Intern Med 1998;128:460–6.
38. Solodky C, Chen H G, Jones P K, et al. Patients as partners in clinical research—a proposal for applying quality improvement methods to patient care. Med Care 1998;36:AS13–20.
39. Vitez T S, Macario A. Setting performance standards for an anesthesia department. J Clin Anesth 1998;10:166–75.
40. Boggs P B, Hayati F, Washburne W F, et al. Using statistical process control charts for the continual improvement of asthma care. Jt Comm J Qual Improv 1999;25:163–81.
41. Konrad C, Gerber H, Schupfer G, et al. Detection of fluid volume absorption by end‐tidal alcohol monitoring in patients undergoing endoscopic renal pelvic surgery. J Clin Anesth 1999;11:386–90.
42. Levett J M, Carey R G. Measuring for improvement: from Toyota to thoracic surgery. Ann Thorac Surg 1999;68:353–8.
43. Pollard J B, Garnerin P. Outpatient preoperative evaluation clinic can lead to a rapid shift from inpatient to outpatient surgery: a retrospective review of perioperative setting and outcome. J Clin Anesth 1999;11:39–45.
44. Purba M. Use of a control chart to monitor diarrhoea admissions: a quality improvement exercise in West Kalimantan Provincial Hospital, Pontianak, Indonesia. J Qual Clin Pract 1999;19:145–7.
45. Schwab R A, DelSorbo S M, Cunningham M R, et al. Using statistical process control to demonstrate the effect of operational interventions on quality indicators in the emergency department. J Healthc Qual 1999;21:38–41.
46. Bonetti P O, Waeckerlin A, Schuepfer G, et al. Improving time‐sensitive processes in the intensive care unit: the example of "door‐to‐needle time" in acute myocardial infarction. Int J Qual Health Care 2000;12:311–7.
47. Ratcliffe M B, Khan J H, Magee K M, et al. Collection of process data after cardiac surgery: initial implementation with a Java‐based intranet applet. Ann Thorac Surg 2000;69:1817–21.
48. Saturno P J, Felices F, Segura J, et al. Reducing time delay in the thrombolysis of myocardial infarction: an internal quality improvement project. Am J Med Qual 2000;15:85–93.
49. Sinanan M, Wicks K, Peccoud M, et al. Formula for surgical practice resuscitation in an academic medical center. Am J Surg 2000;179:417–21.
50. Staker L V. Changing clinical practice by improving systems: the pursuit of clinical excellence through practice‐based measurement for learning and improvement. Qual Manag Health Care 2000;9:1–13.
51. Alemi F, Oliver D W. Tutorial on risk‐adjusted P‐charts. Qual Manag Health Care 2001;10:1–9.
52. Alemi F, Sullivan T. Tutorial on risk adjusted X‐bar charts: applications to measurement of diabetes control. Qual Manag Health Care 2001;9:57–65.
53. Aronsky D, Kendall D, Merkley K, et al. A comprehensive set of coded chief complaints for the emergency department. Acad Emerg Med 2001;8:980–9.
54. Caron A, Neuhauser D V. Health care organization improvement reports using control charts for key quality characteristics: ORYX measures as examples. Qual Manag Health Care 2001;9:28–39.
55. Quinn D C, Graber A L, Elasy T A, et al. Overcoming turf battles: developing a pragmatic, collaborative model to improve glycemic control in patients with diabetes. Jt Comm J Qual Improv 2001;27:255–64.
56. Roman S H, Chassin M R. Windows of opportunity to improve diabetes care when patients with diabetes are hospitalized for other conditions. Diabetes Care 2001;24:1371–6.
57. Boelle P Y, Bonnet F, Valleron A J. An integrated system for significant anaesthetic events monitoring. J Am Med Inform Assoc 2002;9:S23–S27.
58. Burnett L, Chesher D, Burnett J R. Optimizing the availability of "stat" laboratory tests using Shewhart "C" control charts. Ann Clin Biochem 2002;39:140–4.
59. Carey R G. Improving patient satisfaction: a control chart case study. J Ambul Care Manage 2002;25:78–83.
60. Chu J, Neuhauser D V, Schwartz I, et al. The efficacy of automated/electrical twitch obtaining intramuscular stimulation (atoims/etoims) for chronic pain control: evaluation with statistical process control methods. Electromyogr Clin Neurophysiol 2002;42:393–401.
61. Curran E T, Benneyan J C, Hood J. Controlling methicillin‐resistant Staphylococcus aureus: a feedback approach using annotated statistical process control charts. Infect Control Hosp Epidemiol 2002;23:13–8.
62. Marshall T, Mohammed M A, Lim H T. Understanding variation for clinical governance: an illustration using the diagnosis and treatment of sore throat. Br J Gen Pract 2002;52:277–83.
63. Stewart L J, Greisler D. Measuring primary care practice performance within an integrated delivery system: a case study. J Healthc Manag 2002;47:250–61.
64. Fasting S, Gisvold S E. Statistical process control methods allow the analysis and improvement of anesthesia care. Can J Anaesth 2003;50:767–74.
65. Hyrkas K, Lehti K. Continuous quality improvement through team supervision supported by continuous self‐monitoring of work and systematic patient feedback. J Nurs Manag 2003;11:177–88.
66. Marshall T, Mohammed M A. Understanding variation in quality improvement: the treatment of sore throats in primary care. Fam Pract 2003;20:69–73.
67. Mertens W C, Higby D J, Brown D, et al. Improving the care of patients with regard to chemotherapy‐induced nausea and emesis: the effect of feedback to clinicians on adherence to antiemetic prescribing guidelines. J Clin Oncol 2003;21:1373–8.
68. Norberg A, Christopher N C, Ramundo M L, et al. Contamination rates of blood cultures obtained by dedicated phlebotomy vs intravenous catheter. JAMA 2003;289:726–9.
69. Rosow E, Adam J, Coulombe K, et al. Virtual instrumentation and real‐time executive dashboards: solutions for health care systems. Nurs Adm Q 2003;27:58–76.
70. Alemi F, Neuhauser D. Time‐between control charts for monitoring asthma attacks. Jt Comm J Qual Saf 2004;30:95–102.
71. Hart M K, Robertson J W, Hart R F, et al. Application of variables control charts to risk‐adjusted time‐ordered healthcare data [comment]. Qual Manag Health Care 2004;13:99–119.
72. Nelson E C, Batalden P B, Huber T P, et al. Microsystems in health care: Part 1. Learning from high‐performing front‐line clinical units. Jt Comm J Qual Improv 2002;28:472–93.
73. Olsson J, Elg M, Molfenter T. Reflections on transnational transferability of improvement technologies—a comparison of factors for successful change in the United States and northern Europe. Qual Manag Health Care 2003;12:259–69.
74. Thor J, Wittlov K, Herrlin B, et al. Learning helpers: how they facilitated improvement and improved facilitation—lessons from a hospital‐wide quality improvement initiative. Qual Manag Health Care 2004;13:60–74.
75. Davidoff F, Batalden P. Toward stronger evidence on quality improvement. Draft publication guidelines: the beginning of a consensus project. Qual Saf Health Care 2005;14:319–25.
76. Pronovost P, Wachter R. Proposed standards for quality improvement research and publication: one step forward and two steps back. Qual Saf Health Care 2006;15:152–3.
77. Eccles M, Grimshaw J, Campbell M, et al. Research designs for studies evaluating the effectiveness of change and improvement strategies. Qual Saf Health Care 2003;12:47–52.
78. Diaz M, Neuhauser D. Pasteur and parachutes: when statistical process control is better than a randomized controlled trial. Qual Saf Health Care 2005;14:140–3.
79. Neuhauser D, Diaz M. Quality improvement research: are randomised trials necessary? Qual Saf Health Care 2007;16:77–80.
80. Benneyan J C, Borgman A D. Risk‐adjusted sequential probability ratio tests and longitudinal surveillance methods [see comment]. Int J Qual Health Care 2003;15:5–6.
81. Lim T O. Statistical process control tools for monitoring clinical performance. Int J Qual Health Care 2003;15:3–4.
82. Berwick D M, James B, Coye M J. Connections between quality measurement and improvement. Med Care 2003;41(1 Suppl):I30–I38.
83. Mohammed M A, Cheng K K, Rouse A, et al. Bristol, Shipman, and clinical governance: Shewhart's forgotten lessons. Lancet 2001;357:463–7.
84. Mays N, Pope C, Popay J. Systematically reviewing qualitative and quantitative evidence to inform management and policy‐making in the health field. J Health Serv Res Policy 2005;10(Suppl 1):6–20.
85. Pawson R, Greenhalgh T, Harvey G, et al. Realist review—a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy 2005;10(Suppl 1):21–34.
86. Dixon‐Woods M, Agarwal S, Jones D, et al. Synthesising qualitative and quantitative evidence: a review of possible methods. J Health Serv Res Policy 2005;10:45–53.
87. Harden A, Garcia J, Oliver S, et al. Applying systematic review methods to studies of people's views: an example from public health research. J Epidemiol Community Health 2004;58:794–800.
