Surg Oncol Clin N Am. Author manuscript; available in PMC 2013 July 1.
Published in final edited form as:
PMCID: PMC3465945

Patient Safety in Surgical Oncology: Perspective from the Operating Room

Yue-Yung Hu, MD, MPH1,2 and Caprice C. Greenberg, MD, MPH3


In 1999, the Institute of Medicine1 published "To Err is Human: Building a Safer Health System," emphasizing the prevalence of preventable medical errors in American health care and the role of systems, processes, and conditions in ensuring (or undermining) safety. As 50–65% of inpatient adverse events are experienced by surgical patients2, and 75% of these occur intra-operatively3, the operating room (OR) is a high-impact area for safety improvements.

Traditionally, surgical vulnerability has been measured in terms of preoperative risk – patient and procedural risk factors4, surgeon volume5, institutional volume6 – while safety has been defined by the absence of postoperative morbidity and/or mortality. The intra-operative phase of care, despite its obvious relevance to the field of surgical safety and rich potential as a data source, has been largely neglected. Because it is understudied, there are many gaps in our knowledge of the intra-operative factors that contribute to or detract from patient safety, and, as a result, few evidence-based guidelines or interventions exist to support hospitals or their providers in the OR7–10.

Surgery is an inherently hazardous work domain requiring high reliability. Safe operations result from the successful coordination of individuals and teams of diverse training and experience levels, working within complex hospital systems, under constraints imposed by time, uncertainty, and health status. Human factors engineering, focusing on "the interaction among humans and other elements of a system…physical, cognitive, organizational, environmental, and other,"11 has been deployed in other, similarly high-risk industries, such as aeronautics and nuclear reactor control, and is responsible for advances in their safety and reliability. Because it addresses the etiology of error at all levels – individual, team, and system – human factors analysis is an ideal tool for the study of safety in the OR.

Theoretical Models


In his 1990 treatise, "Human Error," James Reason12 describes his Swiss cheese model of error (Figure 1). In it, the system is represented by a stack of Swiss cheese slices, each analogous to a protective layer in the system, with holes symbolizing the potential for failure at each step in the process. Because the holes in Swiss cheese (the vulnerabilities of a system) are not continuous throughout a stack (the system), most problems are stopped at one layer or another, before they culminate in a larger, more consequential error. In order for a catastrophic failure to occur, the holes must be aligned at every level.
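The model's multiplicative logic can be made concrete with a small numeric sketch. The per-layer failure probabilities below are hypothetical, and the layers are assumed to fail independently; the point is simply that requiring every defense to fail at once drives the probability of catastrophe far below that of any single layer's failure:

```python
# Hypothetical, independent failure probabilities for four defensive
# layers (e.g. count protocol, X-ray, time-out, surgeon's final check).
layer_failure_probs = [0.05, 0.10, 0.02, 0.08]

# In the Swiss cheese model, a catastrophic failure requires the "holes"
# to align at every level, so its probability is the product of the
# per-layer probabilities.
p_catastrophe = 1.0
for p in layer_failure_probs:
    p_catastrophe *= p

print(f"Worst single layer: {max(layer_failure_probs):.2f}")  # 0.10
print(f"All layers align:   {p_catastrophe:.2e}")             # 8.00e-06
```

Conversely, the sketch shows why latent conditions matter: widening any single hole (raising one layer's probability) multiplies straight through to the system-level risk.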

Figure 1
Reason's Swiss Cheese Model of Error. A) Each slice of cheese represents a barrier in the system. The holes in each slice represent opportunities for failure at each step, active or latent. B) Alignment of the system's vulnerabilities allows “hazards” ...

As per Reason, these holes may be of two types: active and latent. Active errors are those that are traditionally invoked during discussions about adverse events: readily apparent, they are committed by a human at "the sharp end," at the point of care. A retained foreign body, for example, represents an active error: the failure to remove an instrument at the end of an operation. However, humans do not make these errors in isolation; they are predisposed towards them by latent conditions at "the blunt end," in the system. Leaving an instrument in the patient is not the act of an individual surgeon; it is one precipitated by existing flaws in the organizational design of the entire process – the cumbersome and error-prone nature of the counting protocol, for example13, 14.


In recent years, human factors experts have begun to view the human as the hero, rather than the source, of error. Indeed, in his follow-up book, "The Human Contribution," Reason15 cautions against "an excessive reliance on system measures," as it is individuals who constitute a system's last line of defense against error. With the uniquely human ability to anticipate and adapt to changing circumstances, people are capable of recovering problems that have managed to propagate through even the most thoughtfully designed systems. This heroism, however, has its limits. Citing Carthey's16 observational study of arterial switch operations, which demonstrated an increased risk of death as the number of minor events rose, regardless of whether they were compensated, Reason proposes a knotted rubber band model of system resilience (Figure 2). In it, the system is analogous to a rubber band, with a knot in the middle to represent current operating conditions. To maintain safety, the knot must stay within a narrow operating zone; stretch applied in one direction by dangerous perturbations in the system must be counteracted by compensatory corrections in the opposite direction. With a rising number of perturbations and corrections, the system becomes distorted beyond its capacity to respond.

Figure 2
Reason's Knotted Rubber Band Model of System Resilience. The rubber band represents the system, and the knot represents current operating conditions. To maintain safety, the knot must remain in the “safe operating zone” (dotted area). ...


Retrospective Studies

The characterization of intra-operative human and system factors that impact safety has thus far been limited. Among available methodologies, the most widely utilized is the retrospective reconstruction of intra-operative events – root or common cause analysis17, 18, for example, or the analysis of malpractice claims data3, 19, 20. While such research has been informative about the specific factors that may lead to adverse outcomes, it is susceptible to bias21. Such post-hoc analyses suffer from inaccurate or incomplete recall; without a contemporaneous record, it is difficult to capture all of the mechanisms that culminate in error. Furthermore, focusing research efforts on the negative effects of care selects for only part of the available data; information regarding events that are averted or compensated – processes that would be highly instructive in understanding safety in the OR – is lost.

Field Observations

Prospective data collection in the OR is thus needed to completely describe the intra-operative delivery of care. Field observations have been described by several groups14, 16, 22–33, but have yet to be broadly applied; most of these studies are restricted to small case series at single institutions, for a number of reasons. First, human factors engineering is a relatively new field to medicine, and few people with experience in both disciplines (or multidisciplinary collaborations) exist. Access to the OR may be difficult to attain due to an under-recognition of intra-operative safety problems, as well as cultural mores regarding provider privacy in the workplace. Those who are successful in gaining entry are likely to encounter additional cognitive barriers to the complete transcription of intra-operative events: it can be very difficult to completely observe multiple simultaneous conversations or incidents, to link all downstream occurrences to all earlier preconditions, and to recall everything after the operation has ended. Moreover, as only a few extra people may unobtrusively be present in any OR at one time, the comprehension of ongoing events may only be as complete as the knowledge base and/or memory of the observers; consultation with domain experts for clarification may only be possible retrospectively. Nevertheless, the vast majority of evidence about human factors in the OR has been generated using live field observations and will be reviewed.

Video-Based Observations

By circumventing many of the aforementioned methodological limitations, video-based analyses34, 35 hold great potential for furthering the study of safety in the OR. Video may be recorded prospectively but reviewed retrospectively and repeatedly, until all events are fully understood and the connections between them are completely deciphered; as such, it eliminates many of the issues surrounding observer recall and subjectivity. Additionally, it may serve as an educational tool – a mechanism for providing targeted feedback to individuals, teams, and organizational leaders. However, video poses its own challenges. While it is theoretically indiscriminate in its capture, it may still generate incomplete data, depending on the technological capacity and/or functionality of the audiovisual equipment. Additionally, providers may be reluctant to be video-recorded due to fears that the recordings will be used during performance evaluations36 or in courts of law35. Such concerns may be addressed, at least in part, by carefully constructing research protocols with multiple layers of protection for study subjects, including restricted data access, scheduled data destruction, and acquisition of a Certificate of Confidentiality37. We will review the evidence generated using video in the OR.

Human Factors in the OR

For the purposes of this discussion, we will divide human factors into those pertaining to humans and those corresponding to the system. Human attributes are relevant to performance both individually and within a team, and include such qualities as communication, coordination, cooperation, leadership, and vigilance. System features circumscribe the environment in which humans work – the equipment they use, the structure of the larger organization in which they work, or the policies that govern them, for example. Several examples from the literature will be detailed below. Please note that these examples are intended to be illustrative only; they are not exhaustive lists of human factors.


As active errors were originally conceived by Reason, human cognitive limitations were the most proximal causal factor. Our own imperfect behavior, it was thought, makes us prone to failures at all stages of performance: planning (mistakes – flawed intentions), memory storage (lapses – omissions), and task execution (slips – failure to act as planned). Although humans are now viewed with increasing positivity – as agents of recovery, rather than sources of erraticism – their abilities are still subject to limitations. While we are recognized for our ability to compensate, this capacity diminishes with progressive perturbations in the system. The most competent nurse can slip when he's counting, and is even more likely to do so if he is simultaneously juggling the surgeons' requests for new instruments and the anesthesiologists' need for blood products, coordinating with the pre-operative and post-operative units, and answering the resident's pager. In the past, we have countered human limitations with increased standardization, theorizing that these additional barriers to atypical behavior would protect against failure. Requiring X-rays or automating the count procedure with bar-coding7 or radiofrequency identification technology38 decreases reliance on the error-prone manual count. However, recent human factors data indicate that flexibility is needed in the system to permit heroes to maneuver39; one must remain cognizant of the fact that even well-intended protocols (like the manual count) run the risk of inadvertently disabling providers13. An appropriate balance between minimizing human slips, lapses, and mistakes and maximizing human heroic potential must be maintained.
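To illustrate why an automated count is less prone to slips than a manual one, the core of a scan-in/scan-out reconciliation (as a bar-coding or RFID system might perform it) reduces to a multiset difference. The item identifiers and function below are hypothetical, not any vendor's actual protocol:

```python
from collections import Counter

def reconcile_count(scanned_in, scanned_out):
    """Return the multiset of items scanned into the field but never
    scanned out -- candidates for a retained foreign body."""
    return dict(Counter(scanned_in) - Counter(scanned_out))

# Hypothetical tagged items for one case:
scanned_in = ["sponge-01", "sponge-02", "sponge-03", "clamp-11"]
scanned_out = ["sponge-01", "sponge-03", "clamp-11"]
print(reconcile_count(scanned_in, scanned_out))  # {'sponge-02': 1}
```

Unlike a distracted manual counter, the reconciliation itself cannot lapse; the residual risks shift to the system level (unscanned items, reader failures), consistent with the balance the paragraph describes.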

Communication is one of the most studied and most critical human factors in medicine. Root cause analyses of sentinel events reported to the Joint Commission on Accreditation of Healthcare Organizations between 2004 and mid-2011 implicate faulty communication in 56% of operative or post-operative complications, 63% of retained foreign bodies, and 68% of wrong patient, wrong site, or wrong procedure cases40. The importance of communication is further supported by reviews of surgical malpractice claims: Griffen41 attributes 22% of complications to miscommunication, making it the most pervasive behavioral problem of all he investigated, while Greenberg19 found that 30% of all communication breakdowns occurred in the OR. Lingard42 estimated that 31% of all procedurally-relevant communications in the OR fail; of these, 36% have tangible effects, such as inefficiency, delay, resource waste, or procedural error.

Miscommunication has multiple etiologies, and therefore is best addressed with a multi-pronged approach. As we mentioned above, standardization may help in certain, selected scenarios – protocols may serve as memory aids, for example, ensuring that all salient points are covered in a discussion. After standardized communication was integrated into handoffs at Northwestern University43, surgical residents' perceptions of the accuracy, completeness, and clarity during the transfer of care improved significantly. After implementing the Situation, Background, Assessment, and Recommendation model of communication into their surgical curriculum, the Mount Sinai School of Medicine44 demonstrated a decrease in order entry errors. The University of Washington's45 computerized sign-out system allowed residents to spend more time with patients during pre-rounds and halved the number of patients missed on rounds, while improving resident ratings of continuity of care and workload.

Checklists work analogously in the OR, reminding providers to do the things which are relevant to almost every operation, but also have another important function. Unlike surgical inpatient teams, the OR team is multidisciplinary; the individuals that comprise it are more likely to differ in their understandings of the situation at hand. The checklist compels them towards the establishment of a shared mental model that enables each team member to better anticipate and plan his/her own role. Multi-national studies have demonstrated its impact on patient morbidity and mortality, as well as provider attitudes regarding safety9, 46.

However, standardized protocols cannot help with the majority of communication in the OR – that which occurs spontaneously, in response to continuously evolving events. In such situations, communication is best accomplished ad hoc – with flexibility for individuals to speak up about arising threats to safety as they see fit. To achieve safety, a level of candidness, and hence a sense of team, is needed; the OR must be an environment in which each team member recognizes and is comfortable in his/her role as an equal contributor – a form of checks and balances for his/her colleagues and the system. The importance of such teamwork in the OR setting to patient outcomes is well-established. Across 44 Veterans Affairs Medical Centers and 8 academic hospitals, OR team members who reported higher levels of positive communication and collaboration with attending and resident physicians on the surgical service were found to have lower risk-adjusted morbidity rates47. Mazzocco30 demonstrated increased odds of complications or death when intra-operative information sharing – a communication behavior that she distinguishes from briefing and which incorporates "mutual respect" and "appropriate[ness]" – was observed to be low. In Catchpole's26 observational study of laparoscopic cholecystectomies and carotid endarterectomies, higher leadership and management skills scores for surgeons and nurses correlated with shorter operating times and lower rates of procedural problems and errors outside the operating field, respectively.

As these studies show, there is a high degree of variability surrounding teamwork in the OR30, 47. Indeed, even within a single OR team, the perception of it may differ depending on the discipline of the reporting party48. Compared to anesthesiologists and nurses, surgeons seem to overestimate the communication and teamwork in the room49–51. These disparities may be the result of the traditional vertical hierarchy in surgery. Surgeons, at its top, are simply not the ones who feel constrained by it, and thus are less likely to recognize the value of a flattened one (i.e., the open communication and shared decision-making that it would promote)51. Likewise, while nurses, anesthesiologists, and surgeons are equally capable of recognizing tension in the OR, they disagree on the responsibility for creating and resolving it52.

Although certainly a more amorphous target than a successful handoff or briefing, teamwork is indeed amenable to intervention. After introducing a Team Training curriculum, Northwestern University53 reduced its observable intra-operative communication failure rate from 0.7/hour to 0.3/hour. At the University of Oxford54, a non-technical skills course decreased operative technical errors and non-operative procedural errors during laparoscopic cholecystectomies. The Veterans Health Administration, having implemented a medical team training program for OR personnel in its facilities on a rolling basis, documented a decline in risk-adjusted surgical mortality that was 50% greater in the trained hospitals than in the untrained ones55. Such interventions represent adaptations of Crew Resource Management (CRM), a training module developed in aviation to educate cockpit crews about communication (e.g. assertiveness, briefing/debriefing), error management (e.g. the recognition of "red flag" situations), and teamwork (e.g. cross checking, interpersonal skills, shared mental models, conflict resolution, and flat hierarchies)54, 56–58, and thus far have been conducted in a one-time fashion. Despite their apparent success, the investigators of these studies note the limitations of a single intervention; because the adoption of CRM techniques represents a significant cultural and professional shift in medicine, continuous training and feedback are needed54, 56.

Several instruments have been developed for measuring teamwork in the OR, and may be considered for assessments of baseline needs, as well as post-intervention change and sustainability over time. The Observational Teamwork Assessment for Surgery (OTAS) consists of a teamwork-related task checklist (patient tasks, equipment/provisions tasks, and communication tasks) and a global rating scale for teamwork-related behaviors (communication, coordination, leadership, monitoring, and cooperation). While its developers report good inter-observer reliability and content validity59, they have also described a learning curve for using it, which may limit its reproducibility by other groups60. The Oxford Non-Technical Skills System (NOTECHS) rates each OR subteam (anesthesiology, nursing, surgery) on 4 dimensions: leadership & management, teamwork & cooperation, problem-solving & decision-making, and situation awareness. Its developers, too, have demonstrated reliability and validity, as well as correlation with OTAS32, but it also has not found widespread use outside of its home institution.

The System

In an operation, the system may refer to the physical environment of that particular OR or the policies, practices, and organizational structure of the department or hospital. It may also implicate the professional culture or values of an institution as a whole, or that of the discipline of surgery.

As a human factor that describes the system, equipment has face validity for most surgeons; it is easy for us to appreciate the value in having functional, well-designed equipment available at the appropriate times. Healey61 documented 64 instances of unavailable or non-functional equipment in 35 out of 50 observed general surgical procedures; of all of the intra-operative distractions and/or interruptions he noted, these contributed the most to interference with the case. In 31 cardiac operations, Wiegmann24 mapped 11% of surgical flow disruptions ("deviations from the natural process of an operation") to difficulties with equipment or technology.

Perhaps less readily understood, but no less critical, are the organizational processes of the OR. For example, in a traditional OR system, the surgeon provides his/her own estimated case durations, a practice which introduces a great deal of subjectivity as well as variability, hindering attempts to match OR capacity to usage. Without accurate approximations of case length, the appropriate allocation of human and equipment resources is difficult. In our own study62, we observed an instance in which more oncology cases were simultaneously booked than the number of oncology kits available; the team expended extra time and effort to obtain the necessary instruments in a piecemeal fashion, and the case was delayed. At the Mayo Clinic63, the development of a surgeon-specific procedural database to provide estimates of case duration based upon historical and prospective moving averages was among several initiatives that led to increased OR efficiency and financial performance.
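The scheduling idea above (replacing surgeon self-estimates with estimates derived from historical moving averages) can be sketched as follows. This is an illustrative model, not the Mayo Clinic's actual system; the window size and default duration are assumptions:

```python
from collections import defaultdict, deque

class CaseDurationEstimator:
    """Moving-average estimate of case duration per (surgeon, procedure)."""

    def __init__(self, window=10, default_minutes=120):
        self.default_minutes = default_minutes
        # Keep only the most recent `window` durations for each pair.
        self.history = defaultdict(lambda: deque(maxlen=window))

    def record(self, surgeon, procedure, minutes):
        self.history[(surgeon, procedure)].append(minutes)

    def estimate(self, surgeon, procedure):
        durations = self.history[(surgeon, procedure)]
        if not durations:
            return self.default_minutes  # no history yet: book a default block
        return sum(durations) / len(durations)

est = CaseDurationEstimator(window=5)
for m in [95, 110, 100, 105]:
    est.record("surgeon-A", "colectomy", m)
print(est.estimate("surgeon-A", "colectomy"))  # 102.5
```

As each completed case is recorded, the estimate tracks the surgeon's recent performance rather than a one-time guess, which is what allows OR capacity to be matched to actual usage.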

Similarly, in a traditional OR system, the surgeon specifies the contents of his/her own instrument kits. With this practice, thousands of case-cards or pick-tickets may result, consuming a significant amount of nursing and central processing time and effort. After nursing leadership and surgical faculty collaborated to consolidate and streamline the instrument trays at the University of Alabama64, tray and case cart errors decreased. Because circulators spent less time making phone calls to central sterile and flash-sterilizing instruments, their ability to attend to the case increased.

To our knowledge, there are no instruments that assess the system in isolation. The Safety Attitudes Questionnaire65 and the OR Management Attitudes Questionnaire49 ask respondents to rate teamwork, as well as to describe the organizational climate towards safety. Wiegmann's24 observational tool includes a teamwork category, and Healey's61 contains several measures of communication and "procedural" interference, as well as the environment. The Disruptions in Surgery Index (DiSI) also incorporates individual and team factors in its list of potential disruptions, and is essentially a survey, rather than an observational tool, querying respondents about the perceived impact on themselves and their team members66.

A Case Study

We captured one particularly illustrative example on video in our own observational study62. During a procedure, an alarm began to sound, yet its source was unclear; the alert did not indicate to which piece of equipment it belonged, or whether it represented equipment malfunction or patient endangerment. As the surgeons continued to operate, the nurses and anesthesiologists became absorbed in its investigation. After several minutes, the circulator took charge of the situation, sent the anesthesiologists back to the head of the bed, and called biomedical engineering for help. Eventually, a failed warmer was determined to be the source, and a new one was brought.

Because nothing adverse happened, this incident would likely have been dismissed, rather than reported. However, it provides us with useful information. An analogous case in the aviation industry, in which recovery was not achieved, warns us against ignoring such data. In 1972, the landing gear indicator light on Eastern Air Lines Flight 401 failed to illuminate. While the crew preoccupied itself with troubleshooting the issue, the autopilot became deactivated and the flight crashed67. The tragedy of the event lies in the fact that the landing gear could have been lowered manually; 101 people died as a result of a burned-out lightbulb (equipment unimportant to the flight) and a failure of the team to maintain situational awareness (vigilance) and to manage their human resources (delegating the responsibilities for troubleshooting and flying to separate people).

Our case displayed similar equipment problems (malfunction and poor design), as well as an initial failure of the team to maintain situational awareness and to manage their human resources. However, the smooth engagement of the team – leadership by the circulator and cooperation from the anesthesiologists and the biomedical engineers – led to a quick recovery, and may have prevented a more catastrophic event.


In the OR, as in other high acuity, high reliability work environments, human and system factors interact to impact safety. The complex interrelations and interdependencies between people, resources, information, and technology must be more clearly delineated if successful interventions are to be developed. We emphasize that “success” must not be determined narrowly; given the intricate interconnections between various human factors in the OR and the importance of flexibility to system resilience, all consequences must be evaluated. Emerging techniques for conducting such research hold much promise in the advancement of our understanding of intra-operative safety.


Despite our knowledge that the majority of surgical adverse events occur in the operating room, our understanding of the intraoperative phase of care is incomplete; most studies measure surgical safety in terms of preoperative risk or postoperative morbidity and mortality. Because of the OR's complexity, human factors engineering provides an ideal methodology for studies of intraoperative safety. In this article, we review models of error and resilience as delineated by human factors experts, correlating them to OR performance. We then outline existing methodologies for studying intraoperative safety, focusing on video-based observational research. Finally, we detail specific human and system factors that have been examined in the OR.


  • Human factors engineering is a methodology well-suited for the study of complex intraoperative processes
  • Video provides a promising means of studying human factors in the OR
  • Providers are predisposed to committing active errors by latent conditions in the system
  • Adaptability is a positive aspect of human variability that constitutes the last line of defense against error


Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

The authors have nothing to disclose.


1. Kohn LT, Corrigan JM, Donaldson MS. To Err is Human: Building a Safer Health System. Institute of Medicine; 2000. [PubMed]
2. Gawande AA, Thomas EJ, Zinner MJ, et al. The incidence and nature of surgical adverse events in Colorado and Utah in 1992. Surgery. 1999;126(1):66–75. [PubMed]
3. Rogers SO, Jr., Gawande AA, Kwaan M, et al. Analysis of surgical errors in closed malpractice claims at 4 liability insurers. Surgery. 2006;140(1):25–33. [PubMed]
4. Raval MV, Cohen ME, Ingraham AM, et al. Improving American College of Surgeons National Surgical Quality Improvement Program risk adjustment: incorporation of a novel procedure risk score. J Am Coll Surg. 2010;211(6):715–723. [PubMed]
5. Birkmeyer JD, Stukel TA, Siewers AE, et al. Surgeon volume and operative mortality in the United States. N Engl J Med. 2003;349(22):2117–2127. [PubMed]
6. Birkmeyer JD, Dimick JB, Staiger DO. Operative mortality and procedure volume as predictors of subsequent hospital performance. Ann Surg. 2006;243(3):411–417. [PubMed]
7. Greenberg CC, Diaz-Flores R, Lipsitz SR, et al. Bar-coding surgical sponges to improve safety: a randomized controlled trial. Ann Surg. 2008;247(4):612–616. [PubMed]
8. Lingard L, Regehr G, Orser B, et al. Evaluation of a preoperative checklist and team briefing among surgeons, nurses, and anesthesiologists to reduce failures in communication. Arch Surg. 2008;143(1):12–17. discussion 18. [PubMed]
9. Haynes AB, Weiser TG, Berry WR, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360(5):491–499. [PubMed]
10. Michaels RK, Makary MA, Dahab Y, et al. Achieving the National Quality Forum's “Never Events”: prevention of wrong site, wrong procedure, and wrong patient operations. Ann Surg. 2007;245(4):526–532. [PubMed]
11. International Ergonomics Association. What is Ergonomics? [Accessed April 6, 2010].
12. Reason JT. Human Error. 1st ed. Cambridge University Press; 1990.
13. Dierks MM, Christian CK, Roth EM, et al. Healthcare Safety: The Impact of Disabling Safety Protocols. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans. 2004;34(6):693–698.
14. Christian CK, Gustafson ML, Roth EM, et al. A prospective study of patient safety in the operating room. Surgery. 2006;139(2):159–173. [PubMed]
15. Reason JT. The Human Contribution. 1st ed. Ashgate; 2008.
16. Carthey J, de Leval MR, Reason JT. The human factor in cardiac surgery: errors and near misses in a high technology medical domain. Ann Thorac Surg. 2001;72(1):300–305. [PubMed]
17. Mallett R, Conroy M, Saslaw LZ, et al. Preventing Wrong Site, Procedure, and Patient Events Using a Common Cause Analysis. Am J Med Qual. 2011 [PubMed]
18. Faltz LL, Morley JN, Flink E, et al. The New York Model: Root Cause Analysis Driving Patient Safety Initiative to Ensure Correct Surgical and Invasive Procedures (Vol. 1: Assessment) 2008. [PubMed]
19. Greenberg CC, Regenbogen SE, Studdert DM, et al. Patterns of communication breakdowns resulting in injury to surgical patients. J Am Coll Surg. 2007;204(4):533–540. [PubMed]
20. Regenbogen SE, Greenberg CC, Studdert DM, et al. Patterns of technical error among surgical malpractice claims: an analysis of strategies to prevent injury to surgical patients. Ann Surg. 2007;246(5):705–711. [PubMed]
21. Gopher D. Why it is not sufficient to study errors and incidents: human factors and safety in medical systems. Biomed Instrum Technol. 2004;38(5):387–391. [PubMed]
22. Roth EM, Christian CK, Gustafson ML, et al. Using field observations as a tool for discovery: analysing cognitive and collaborative demands in the operating room. Cognition, Technology, and Work. 2004;2004(6):148–157.
23. Schraagen JM, Schouten T, Smit M, et al. Assessing and improving teamwork in cardiac surgery. Qual Saf Health Care. 2010;19(6):e29. [PubMed]
24. Wiegmann DA, ElBardissi AW, Dearani JA, et al. Disruptions in surgical flow and their relationship to surgical errors: an exploratory investigation. Surgery. 2007;142(5):658–665. [PubMed]
25. Arora S, Hull L, Sevdalis N, et al. Factors compromising safety in surgery: stressful events in the operating room. Am J Surg. 2010;199(1):60–65. [PubMed]
26. Catchpole K, Mishra A, Handa A, et al. Teamwork and error in the operating room: analysis of skills and roles. Ann Surg. 2008;247(4):699–706. [PubMed]
27. de Leval MR, Carthey J, Wright DJ, et al. Human factors and cardiac surgery: a multicenter study. J Thorac Cardiovasc Surg. 2000;119(4 Pt 1):661–672. [PubMed]
28. Healey AN, Undre S, Vincent CA. Developing observational measures of performance in surgical teams. Qual Saf Health Care. 2004;13(Suppl 1):i33–40. [PMC free article] [PubMed]
29. Lingard L, Reznick R, Espin S, et al. Team communications in the operating room: talk patterns, sites of tension, and implications for novices. Acad Med. 2002;77(3):232–237. [PubMed]
30. Mazzocco K, Petitti DB, Fong KT, et al. Surgical team behaviors and patient outcomes. Am J Surg. 2009;197(5):678–685. [PubMed]
31. Mishra A, Catchpole K, Dale T, et al. The influence of non-technical performance on technical outcome in laparoscopic cholecystectomy. Surg Endosc. 2008;22(1):68–73. [PubMed]
32. Mishra A, Catchpole K, McCulloch P. The Oxford NOTECHS System: reliability and validity of a tool for measuring teamwork behaviour in the operating theatre. Qual Saf Health Care. 2009;18(2):104–108. [PubMed]
33. Parker SE, Laviana AA, Wadhera RK, et al. Development and evaluation of an observational tool for assessing surgical flow disruptions and their impact on surgical performance. World J Surg. 2010;34(2):353–361. [PubMed]
34. Guerlain S, Adams RB, Turrentine FB, et al. Assessing team performance in the operating room: development and use of a “black-box” recorder and other tools for the intraoperative environment. J Am Coll Surg. 2005;200(1):29–37. [PubMed]
35. Xiao Y, Schimpff S, Mackenzie C, et al. Video technology to advance safety in the operating room and perioperative environment. Surg Innov. 2007;14(1):52–61. [PubMed]
36. Kim YJ, Xiao Y, Hu P, et al. Staff acceptance of video monitoring for coordination: a video system to support perioperative situation awareness. J Clin Nurs. 2009;18(16):2366–2371. [PubMed]
37. Guerlain S, Turrentine B, Adams R, et al. Using video data for the analysis and training of medical personnel. Cognition, Technology, and Work. 2004;6:131–138.
38. Macario A, Morris D, Morris S. Initial clinical evaluation of a handheld device for detecting retained surgical gauze sponges using radiofrequency identification technology. Arch Surg. 2006;141(7):659–662. [PubMed]
39. Sheridan TB. Risk, human error, and system resilience: fundamental ideas. Hum Factors. 2008;50(3):418–426. [PubMed]
40. The Joint Commission on Accreditation of Healthcare Organizations (JCAHO). Sentinel Event Data: Root Causes by Event Type (2004–Second Quarter 2011). Aug 5, 2011.
41. Griffen FD, Stephens LS, Alexander JB, et al. Violations of behavioral practices revealed in closed claims reviews. Ann Surg. 2008;248(3):468–474. [PubMed]
42. Lingard L, Espin S, Whyte S, et al. Communication failures in the operating room: an observational classification of recurrent types and effects. Qual Saf Health Care. 2004;13(5):330–334. [PMC free article] [PubMed]
43. Wayne JD, Tyagi R, Reinhardt G, et al. Simple standardized patient handoff system that increases accuracy and completeness. J Surg Educ. 2008;65(6):476–485. [PubMed]
44. Telem DA, Buch KE, Ellis S, et al. Integration of a formalized handoff system into the surgical curriculum: resident perspectives and early results. Arch Surg. 2011;146(1):89–93. [PubMed]
45. Van Eaton EG, Horvath KD, Lober WB, et al. A randomized, controlled trial evaluating the impact of a computerized rounding and sign-out system on continuity of care and resident work hours. J Am Coll Surg. 2005;200(4):538–545. [PubMed]
46. Weiser TG, Haynes AB, Dziekan G, et al. Effect of a 19-item surgical safety checklist during urgent operations in a global patient population. Ann Surg. 2010;251(5):976–980. [PubMed]
47. Davenport DL, Henderson WG, Mosca CL, et al. Risk-adjusted morbidity in teaching hospitals correlates with reported levels of communication and collaboration on surgical teams but not with scale measures of teamwork climate, safety climate, or working conditions. J Am Coll Surg. 2007;205(6):778–784. [PubMed]
48. Makary MA, Sexton JB, Freischlag JA, et al. Operating room teamwork among physicians and nurses: teamwork in the eye of the beholder. J Am Coll Surg. 2006;202(5):746–752. [PubMed]
49. Flin R, Yule S, McKenzie L, et al. Attitudes to teamwork and safety in the operating theatre. Surgeon. 2006;4(3):145–151. [PubMed]
50. Mills P, Neily J, Dunn E. Teamwork and communication in surgical teams: implications for patient safety. J Am Coll Surg. 2008;206(1):107–112. [PubMed]
51. Sexton JB, Thomas EJ, Helmreich RL. Error, stress, and teamwork in medicine and aviation: cross sectional surveys. BMJ. 2000;320(7237):745–749. [PMC free article] [PubMed]
52. Lingard L, Regehr G, Espin S, et al. Perceptions of operating room tension across professions: building generalizable evidence and educational resources. Acad Med. 2005;80(10 Suppl):S75–79. [PubMed]
53. Halverson AL, Casey JT, Andersson J, et al. Communication failure in the operating room. Surgery. 2011;149(3):305–310. [PubMed]
54. McCulloch P, Mishra A, Handa A, et al. The effects of aviation-style non-technical skills training on technical performance and outcome in the operating theatre. Qual Saf Health Care. 2009;18(2):109–115. [PubMed]
55. Neily J, Mills PD, Young-Xu Y, et al. Association between implementation of a medical team training program and surgical mortality. JAMA. 2010;304(15):1693–1700. [PubMed]
56. Grogan EL, Stiles RA, France DJ, et al. The impact of aviation-based teamwork training on the attitudes of health-care professionals. J Am Coll Surg. 2004;199(6):843–848. [PubMed]
57. Hamman WR. The complexity of team training: what we have learned from aviation and its applications to medicine. Qual Saf Health Care. 2004;13(Suppl 1):i72–79. [PMC free article] [PubMed]
58. Burke CS, Salas E, Wilson-Donnelly K, et al. How to turn a team of experts into an expert medical team: guidance from the aviation and military communities. Qual Saf Health Care. 2004;13(Suppl 1):i96–104. [PMC free article] [PubMed]
59. Hull L, Arora S, Kassab E, et al. Observational teamwork assessment for surgery: content validation and tool refinement. J Am Coll Surg. 2011;212(2):234–243. e231–235. [PubMed]
60. Sevdalis N, Lyons M, Healey AN, et al. Observational teamwork assessment for surgery: construct validation with expert versus novice raters. Ann Surg. 2009;249(6):1047–1051. [PubMed]
61. Healey AN, Sevdalis N, Vincent CA. Measuring intra-operative interference from distraction and interruption observed in the operating theatre. Ergonomics. 2006;49(5–6):589–604. [PubMed]
62. Hu YY, Arriaga A, Roth EM, et al. Protecting patients from an unsafe system: the etiology and recovery of intra-operative deviations in care. 2011. In process. [PMC free article] [PubMed]
63. Cima RR, Brown MJ, Hebl JR, et al. Use of lean six sigma methodology to improve operating room efficiency in a high-volume tertiary-care academic medical center. J Am Coll Surg. 2011. [PubMed]
64. Heslin MJ, Doster BE, Daily SL, et al. Durable improvements in efficiency, safety, and satisfaction in the operating room. J Am Coll Surg. 2008;206(5):1083–1089. discussion 1089–1090. [PubMed]
65. Sexton JB, Helmreich RL, Neilands TB, et al. The Safety Attitudes Questionnaire: psychometric properties, benchmarking data, and emerging research. BMC Health Serv Res. 2006;6:44. [PMC free article] [PubMed]
66. Sevdalis N, Forrest D, Undre S, et al. Annoyances, disruptions, and interruptions in surgery: the Disruptions in Surgery Index (DiSI). World J Surg. 2008;32(8):1643–1650. [PubMed]
67. Yanez L. Eastern Flight 301: The Story of the Crash. Miami Herald. 2007 Dec.