BMJ. 2000 March 18; 320(7237): 791–794.
PMCID: PMC1117777

Gaps in the continuity of care and progress on patient safety

Richard I Cook, associate director; Marta Render, director; and David D Woods, associate director

The patient safety movement includes a wide variety of approaches and views about how to characterise patient safety, study failure and success, and improve safety. Ultimately all these approaches make reference to the nature of technical work of practitioners at the “sharp end” in the complex, rapidly changing, intrinsically hazardous world of health care.1,2 It is clear that a major activity of technical workers (physicians, nurses, technicians, pharmacists, and others) is coping with complexity and, in particular, coping with the gaps that complexity spawns.3 Exploration of gaps and the way practitioners anticipate, detect, and bridge them is a fruitful means of pursuing robust improvements in patient safety.

Summary points

  • Complex systems involve many gaps between people, stages, and processes
  • Analysis of accidents usually reveals the presence of many gaps, yet only rarely do gaps produce accidents
  • Safety is increased by understanding and reinforcing practitioners' normal ability to bridge gaps
  • This contradicts the conventional view that systems must be isolated from the unreliable human element
  • We know little about how practitioners identify and bridge new gaps that occur when systems change


The notion of gaps is simple. Gaps are discontinuities in care. They may appear as losses of information or momentum or interruptions in delivery of care. In practice gaps rarely lead to overt failure. Rather, most gaps are anticipated, identified, and bridged and their consequences nullified by the technical work done at the sharp end. These gap-driven activities are so intimately woven into the fabric of technical work that neither outsiders nor insiders recognise them as distinct from other technical work.

Gaps are most readily seen when they are aligned with organisational and institutional boundaries that mark changes in responsibility or authority, different roles of professionals, or formal divisions of labour. For example, the loss of coherence in a plan of care that occurs during changes of shift is a kind of gap. Another example is the loss of information that sometimes accompanies transfers of patients from one facility to another, as when a patient is discharged from hospital to a rehabilitation facility. Gaps can appear within the processes of care and even in association with single practitioners. For example, when a nurse cares for two or more patients and must divide attention between them there is a potential for gaps in the continuity of care. Skilfully managing the division of attention is a means for bridging gaps—for example, by increasing the frequency of “checking” on the more ill patient or by putting off routine activities such as charting to provide longer periods for observation. But bridging has its own costs and vulnerabilities. To bridge a gap is not to eliminate it; some bridges are robust and reliable but others are frail, brittle, and easily undone by outside circumstances.

Gaps may arise from the unintended side effects of organisational and technological change. Such gaps may be either entirely new gaps or old ones that were regularly bridged before the change undermined the effectiveness of established bridges. To continue the previous example, consider the effects of dividing nursing work between nurses and “patient care technicians.” The economic benefits of division are substantial: it allows the nurse to spend virtually the entire work shift concentrating on high level tasks that require certain credentials (giving intravenous drugs, for example) while other tasks are given to less skilled personnel. Among the side effects of such a change, however, are restrictions on the ability of the individual nurse to anticipate and detect gaps in the care of the patients. The nurse now has more patients to track, requiring more (and more complicated) inferences about which patient will next require attention, where monitoring needs to be more intensive, and so forth. The change also limits the restructuring of work to adjust to changing demands for attention.

Gaps and technical work

Post hoc analysis often attributes healthcare accidents to “human error.” This attribution is heavily influenced by hindsight bias.4 More detailed exploration of the events, however, shows multiple gaps. For example, although popular (and regulatory) opinion regarded the surgeon as the proximate “cause” of the Florida “wrong leg” amputation case (box), there were multiple contributors to that event including misleading information on the operative consent and incorrect posting of the case on the daily schedule board.5 Similarly, in the Colorado nurses case a variety of breakdowns in continuity eventually led to the intravenous injection of benzathine penicillin and death of the newborn patient.6 What these cases have in common (aside from being celebrated in the popular press as evidence of healthcare delivery run amok) is the presence of multiple gaps that practitioners were unsuccessful in bridging.

Celebrated cases and folk models of human error

A few celebrated accidents have shaped public perceptions of safety in health care. Simple stories of these events obscure the multiple factors, interacting goals, and conflicting constraints that confronted practitioners and gave rise to the accidents. These simple stories “explain” how accidents happen and become the basis for folk models of human error. These folk models are strongly (but wrongly) held by the public and also by healthcare professionals, regulators, and managers. Based on these folk models of human error, various ineffective countermeasures are proposed—including increased sanctions, “blame and train” approaches, and new technology to prevent human error.

Among the celebrated cases in the United States are the 1995 Florida “wrong leg” case and the 1996 Colorado nurses case. The first story of each case produced a simple indictment of human error as the “cause” of the accident, but detailed study has shown that multiple factors were involved and that the accident followed the pattern characteristic of complex system failures.1 In both cases, strong censure of practitioners followed public attention.

Florida wrong leg case5

Celebrated first story—Capricious surgeon amputates contralateral leg rather than the one intended. Surgeon disciplined by state licensing board.

Some factors not included in first story:

  • Both legs were diseased; contralateral leg needed amputation as well
  • Amputation of contralateral leg was proposed to patient
  • Consent identified contralateral leg as target of amputation
  • Operating room schedule identified contralateral leg as target of amputation
  • Patient was anaesthetised and contralateral leg was prepped and draped before the surgeon's entry into theatre

Colorado nurses case6

Celebrated first story—Three nurses administer benzathine penicillin intravenously, causing the death of a neonate. Nurses charged with criminal negligence. One nurse pleads guilty to a reduced charge; another fights the charge and is exonerated.

Some factors not included in first story:

  • Documentation of the mother's prior treatment for syphilis was not available, prompting consultations with a doctor that led to an order for treatment of the neonate with penicillin; it is doubtful that any treatment was required
  • Pharmacist miscalculated the dose, resulting in a tenfold overdose (1.5 million units instead of 150 000 units) being delivered to the neonatal unit
  • Nurses, concerned about giving large volume of intramuscular drug to a newborn, sought guidance about changing to intravenous administration
  • Syringe label was difficult to read
  • Available reference materials were ambiguous about acceptable routes of administration
  • Drug was seldom used

Gaps generate accidents only rarely. In most instances, practitioners are able to detect and bridge gaps. Accidents result from breakdowns in the mechanisms that practitioners use to anticipate, detect, and bridge gaps. The opportunities for failure are many but the incidence of failure is low because practitioners are generally effective in bridging gaps. The irony is that stakeholders can attribute failure to human error only because practitioners in these roles usually bridge the gaps and prevent any escalation toward bad consequences for patients.

This is tantamount to claiming that practitioners create safety—a claim that is at once controversial and intriguing because it flies in the face of conventional views of both how accidents occur and also how safety might be increased.7 The conventional view holds that the system is safe by design but can be degraded by the failure of its human components. Thus, it is claimed, efforts to increase safety should be directed at guarding the system against the unreliable human element, in most cases by isolating practitioners from the system using technology, rules, guidelines, etc.

The alternative view

The alternative view, proposed here, is that accidents occur because conditions overwhelm or nullify the mechanisms practitioners normally use to detect and bridge gaps. Safety is increased primarily by understanding and reinforcing practitioners' ability to detect and bridge gaps. Because practitioners create safety locally, efforts to forestall errors by isolating practitioners from the system will misfire. Attempts at isolation are likely to decrease practitioners' ability to detect and bridge gaps and lead to new paths to and forms of failure.8

According to this alternative view, progress on safety begins with the development of a detailed understanding of technical work as experienced by practitioners in context.9 Much of human practitioners' expertise in action revolves around gaps. Work in the real world involves detecting when things have gone awry; discriminating between data and artefact; discarding red herrings; knowing when to abandon approaches that will ultimately become unsuccessful; and reacting smoothly to escalating consequences. It involves recognising that hazards are approaching; detecting and managing incipient failure; and, when failure cannot be avoided, working to recover from failure.

Past characterisations of safety and its relation to technical work have emphasised deliberate “violations” or failure to adhere to narrow procedural guides. But these characterisations reflect idealised models of what technical work entails; they largely ignore the presence of gaps or the need to cope with them. The focus on the pervasiveness of “error” in healthcare settings mistakes the nature of how the system works to produce results.10 As Barley and Orr observe: “anomalies that surface in aggregate analyses of technical work . . . are paralleled by, if not rooted in, disjunctures and ironies that pervade day-to-day life in technical settings.”9 It is important to note that the anomalies arise not because human behaviour is irreducibly irrational but because the aggregate analyses do not take account of the presence of gaps or the demands that bridging them places on sharp end workers.

Coping with gaps

When gaps recur and there is sufficient exposure to them, formal efforts to create bridges in anticipation are common. Indeed, it is possible to use the many types of forms and documents that exist in health care as pointers to the presence of and response to gaps. Take the example of forms used to communicate information between facilities when transferring the responsibility and authority for patient care. The use of discharge planning documents marks the anticipation of a certain type of gap and also of an effort to create a bridge to permit care to flow smoothly over the gap. How well the bridge serves the purpose depends on many factors including the institutions involved, characteristics of the patient, and so forth. These are empirical questions rather than theoretical ones, and the issues seem mundane.

Similarly, many of the social and organisational constructions that fill health care reflect the presence of gaps and efforts to overcome them: the various “report” activities between shifts and between caregivers, for instance. These easily seen gaps are a useful model for looking at the more heterogeneous gaps that permeate practice.

Gaps are not always apparent, nor are appropriate bridging reactions always clear. There may be only the subtlest hints (or no hints at all) that a gap in an individual's care is present. Continuing the previous example, some inconsistency between the drugs listed in a transfer document and the list of conditions in the patient's history might arouse suspicion that something is missing in the history. This is a case of data in the world triggering awareness of a potential gap.

Identifying gaps

Much experience and expertise involves having in mind a catalogue of conditions and situations that hint that gaps may be present, knowing about the kinds of gaps that can and are likely to occur, and being able to delineate existing gaps and techniques for bridging them.11 In addition, much of what counts for high quality performance in health care involves anticipating gaps that might occur in the future. Anticipating allows the effects of future gaps to be forestalled either by directing care in such a way that the gaps are never encountered or by creating bridges that will span the gaps if they arise. For example, physicians and nurses sometimes keep private notes, anticipating that the official medical record will become unavailable or be awkward to use in the future.

A crucial aspect of gaps is how new gaps are identified. New technology, new organisational structures, new techniques and knowledge, and new demands change the world of technical work—often in unexpected ways. Changes may create new, unfamiliar gaps or may change the character of old, familiar ones so that previously constructed bridges do not span them. A critical (and so far unexplored) safety issue is how practitioners recognise that new gaps are present in their work world or that the characteristics of familiar gaps have changed. Indeed, the Colorado nurses case has the hallmarks of a gap that arises through change and, undetected and unappreciated, participates in catastrophe. To plan effective change, it is essential to know how people detect and understand the new gaps produced by change.

Gaps as tools for research

The enormous complexity of health care is a daunting obstacle to those trying to study safety systematically. Everything, it seems, is connected to everything else, and every thread of action and cause is wound into a great Gordian knot. The pursuit of gaps as a research target is a means of cutting through the knot. Gaps themselves mark the areas of vulnerability and show the mechanism by which complexity flows through health care to individual patients. Pursuing gaps is a method that allows technical work to guide both research into and improvement in safety.

Future work on gaps might be approached in three different ways. Firstly, a catalogue of gaps would yield a map of many of the complexities and hazards in work at the sharp end of systems. Secondly, tracing out the details of how practitioners anticipate, detect, and bridge gaps within the context of actual practice would provide the outlines of what constitutes practitioners' expertise. Thirdly, discovering how gaps are created by organisational and institutional change would link the processes of management to the real demands confronting practitioners. Together, these explorations of gaps can provide a coherent, usable view of patient safety, a kind of landscape that can be used to identify future safety problems, anticipate the impact of change, and measure progress.

Coping with discontinuities in care is a major activity of health workers


 Funding: Preparation of this paper was made possible in part by support from the Department of Veterans Affairs Midwest Patient Safety Center of Inquiry (GAPS).

Competing interests: None declared.


1. Cook RI, Woods DD. Operating at the sharp end: the complexity of human error. In: Bogner MS, editor. Human error in medicine. Hillsdale, NJ: Lawrence Erlbaum; 1994. pp. 255–310.
2. National Patient Safety Foundation. Agenda for research and development in patient safety. 24 May 1999 (accessed 22 Nov 1999).
3. Woods DD. Coping with complexity: the psychology of human behavior in complex systems. In: Goodstein LP, Andersen HB, Olsen SE, editors. Tasks, errors and mental models. London: Taylor and Francis; 1988. pp. 128–148.
4. Woods DD, Cook RI. Perspectives on human error: hindsight biases and local rationality. In: Durso RS, editor. Handbook of applied cognition. New York: Wiley; 1999. pp. 141–171.
5. Cook RI, Woods DD, Miller C. A tale of two stories: contrasting views of patient safety. Chicago: National Patient Safety Foundation; 1998. (Report from a workshop on assembling the scientific basis for progress on patient safety.)
6. Senders JW, Schneider PJ, McCadden P, Grant RS, Torres CH, Cohen MR, et al. Error, negligence, crime: the Denver nurses trial. Enhancing patient safety and reducing errors in health care proceedings. Chicago: National Patient Safety Foundation; 1999. pp. 65–76.
7. Cook RI. Two years before the mast: learning how to learn about patient safety. Enhancing patient safety and reducing errors in health care proceedings. Chicago: National Patient Safety Foundation; 1999. pp. 61–64.
8. Rasmussen J, Pejtersen AM, Goodstein LP. At the periphery of effective coupling: human error. In: Cognitive systems engineering. New York: Wiley; 1994. pp. 135–159.
9. Barley SR, Orr JE, editors. Between craft and science: technical work in US settings. Ithaca, NY: ILR Press; 1997. p. 16.
10. Corrigan J, Kohn L, Donaldson M, editors. To err is human: building a safer health system. Washington DC: National Academy Press; 1999.
11. Klein G. Sources of power: how people make decisions. Cambridge, MA: MIT Press; 1998.
