J Am Med Inform Assoc. 2012 Jan-Feb; 19(1): 2–5.
Published online Nov 24, 2011. doi: 10.1136/amiajnl-2011-000674
PMCID: PMC3240771
The dangerous decade
Enrico Coiera,1 Jos Aarts,2 and Casimir Kulikowski3
1Centre for Health Informatics, University of New South Wales, Sydney, New South Wales, Australia
2Institute of Health Policy and Management, Erasmus University Rotterdam, Rotterdam, The Netherlands
3Department of Computer Science, Rutgers University, Piscataway, New Jersey, USA
Correspondence to Professor Enrico Coiera, Centre for Health Informatics, University of New South Wales, Sydney, NSW 2052, Australia; e.coiera@unsw.edu.au
Accepted November 1, 2011.
Over the next 10 years, more information and communication technology (ICT) will be deployed in the health system than in its entire previous history. Systems will be larger in scope, more complex, and will move from regional to national and supranational scale. Yet we are at roughly the same place the aviation industry was in the 1950s with respect to system safety. Even if ICT harm rates do not increase, increased ICT use will increase the absolute number of ICT-related harms. Factors that could diminish ICT harm include adoption of common standards, technology maturity, and better system development, testing, implementation, and end-user training. Factors that will increase harm rates include the complexity and heterogeneity of systems and their interfaces, rapid implementation, and poor training of users. Mitigating these harms will not be easy, as organizational inertia is likely to generate a hysteresis-like lag, in which the paths to increase and decrease harm are not identical.
Keywords: Cognitive study (including experiments emphasizing verbal protocol analysis and usability), collaborative technologies, communication, decision support, developing/using computerized provider order entry, ethical study methods, historical, human–computer interaction and human-centered computing, legal, policy, qualitative/ethnographic field study, safety, social/organizational study, surveys and needs analysis, system implementation and management issues
There is a paradox in the relationship between information and communication technology (ICT) and patient safety. ICT can improve the quality, safety and effectiveness of clinical services and patient outcomes,1 although the evidence base for this is sometimes weak.2 As a consequence, the rapid deployment of ICT on a national scale is a priority for many nations faced with a diminishing clinical workforce, increasing workloads, and resource constraints.3 4
However, ICT use can also lead to patient harm.5 Many commentators have raised concerns that ICT has yet to deliver on its promises,6 or that the rapid adoption of ICT is a risk.7 7a Errors persist in clinical practice even after ICT is introduced,8 because manual processes co-exist with the automated, and the interfaces between the two are seldom perfect. Others counter that such overemphasis on ICT-related harm only delays the implementation of a crucial technology that will save lives.9
It appears that we are caught in a bind. The demands for health system reform are now so compelling that there seems no choice but to implement complex ICT on a large, often national, scale. Yet these ICT systems are less mature than we would like, and our understanding of how to implement and use them safely remains in its infancy. As such, we are faced with a pressing policy challenge on both the national and international stages.10
The Institute of Medicine (IOM) of the USA's National Academy of Sciences has now issued a report entitled “Health IT and Patient Safety: Building Safer Systems for Better Care”.10a It identifies what is known about health ICT safety, and comments on the complex socio-technical context within which these systems are developed and operate. The report makes clear that there is currently a significant gap in our understanding of the extent and severity of ICT-related harm, but that awaiting more evidence before dealing with the problem is no longer an option. It recommends that safety standards for health IT systems be developed for manufacturers to follow, and that a new federal entity be created specifically to monitor and investigate patient deaths, serious injuries, and unsafe conditions associated with health IT, and to make its findings public. It concludes that the Food and Drug Administration should immediately begin work on a framework for regulating health ICT. The report is, however, cautious, suggesting that safety regulation would impede industry innovation, an argument that would literally not fly in the aviation industry. Indeed, dissenters, both within the IOM report and outside it, believe that the time for regulation has now well and truly arrived.10b Calls for regulation of clinical software have, after all, been with us for some time.10c 10d The IOM report is a welcome call both for the resources needed to fill the gaps in our research evidence about the safety of ICT, and for resources to support the active detection and management of ICT-related harms and near misses. Its caution toward recommending regulation may, however, be misplaced. Simply put, if healthcare wants the benefits of ICT then it must actively manage its risks.
While basic technical standards for interoperability are now being adopted in many nations, clear standards for user interface design, decision support system construction, or clinician training are only slowly emerging. We are now beginning to appreciate the complex sociotechnical construct that is created when ICT is placed in the hands of users in busy clinical environments.11 Implementing the same ICT in highly similar organizations can still end up having different results, because of local differences in work or communication patterns.12 Our understanding of the unintended opportunities for harm that arise when interruptions and multitasking disrupt clinicians using information systems is also in its infancy. The psychological literature on interruption is complex, and designing ICT that is ‘interruption safe’ remains a challenging goal.13 14
If we look to industries in which technical safety is also crucial, history tells us that the journey to creating robust, industrial-strength, and safe systems is a sometimes perilous one.15 The aviation industry, often held up as a paragon of safety, developed its safety culture, processes and technology after a very challenging period. Commercial aircraft did crash, lives were lost, and out of catastrophic failures, learning occurred. The learning cycle repeats with every new ‘quantum leap’ in plane design and the human–machine adaptations that must follow. Technological change always creates new pathways to harm, as evidenced by the sensor malfunctions that led to the Air France Airbus catastrophe.
In comparison to aviation, healthcare appears to be more complex, heterogeneous, and harder to control. Yet over the next 10 years, around the world, we will build and deploy more ICT into the health system than in the entire previous history of our discipline. These systems will be larger in scope, more complex, and move from regional to national, and possibly supranational scale. Yet we could argue that we are at roughly the same place the aviation industry was in the 1950s. Will health ICT have to go through a similar painful period of learning from unexpected accidents?
At present, ICT-related errors are slowly being identified, and their different types and causes described.15a Analysis of incident reports from the clinical front line is of particular importance right now, and is revealing the complex nature of the human and technical problems that combine to harm patients.16 Unsurprisingly, the majority of incident reports reflect the mundane reality of working in organizations in which there are power outages, resource constraints, network disruptions, and printer failures, as well as the human factors associated with working in technologically, cognitively, and socially complex environments.16 17 Not all types of ICT-related patient harm are the same. Some are serious, as with patient injuries from incorrect dosage of medications or radiotherapy.17a Other events may have a minor impact on individual patients but signal widespread organizational disruption.
Should we expect the number of patient harm events to rise as ICT is widely introduced into healthcare environments? With increased use of ICT the number of harms due to ICT is indeed likely to rise. Assuming a fixed probability of harm associated with each ICT use, in a setting of increased ICT use, there can indeed only be an increased number of ICT harm events (figure 1). To capture this simple observation, we suggest a first postulate of patient safety: For a given configuration of a system, the number of patient harm events increases in proportion to the frequency of events likely to cause harm.
Figure 1 ICT-associated patient harm is likely to increase in step with ICT usage. Different system configurations will have higher or lower opportunities for harm within them, shaping the actual harm rate experienced. ICT, information and communication technology.
The actual rate of harm will also be shaped by the intrinsic safety of the system in its entirety. This ‘opportunity for harm’ in a given system configuration is governed by two broad sets of factors. One set diminishes the chances for harm and relates to technology maturity, improvements in quality of system development, testing, implementation and end user training. Simplification of systems through the adoption of common standards should also lead to reduced harm rates. The second set of factors is associated with an increased harm rate. These include increasing complexity and heterogeneity of systems and interfaces between systems, overly rapid implementation schedules and poor training of users.
At first glance it seems that all we now have to do is work on harm mitigation, to offset any growth in system size, use, and complexity. However, health systems have repeatedly been observed not to behave in a simple linear fashion. There are indeed factors that bear a direct, approximately linear relationship to harm: the correspondence between the number of times a piece of software is used and the likelihood of harm, for example. Reduce how often the software is used, and you should reduce the associated harm.
Other factors are unlikely to be so well behaved. The health system often appears to exhibit system inertia, a resistance to change or a failure to achieve the degree of change expected for a specific amount of effort.18 System inertia may well underpin our slow progress with reducing iatrogenic patient harm and delivering evidence-based recommended care. System inertia has been long studied in the organizational science literature, and one specific manifestation of inertia is a hysteresis-like lag, in which the paths to increase and decrease an output of an organization are not identical,19 possibly because of the way organizations are structured, and their competing demands.18 We should not be surprised that in the complex feedback systems designed to control patient harm, many system states will lag attempts to change them, resulting in a ‘hysteresis-like’ relationship between output and control variables.
This gives rise to a second patient safety postulate: For a given configuration of a system, the rate of harm relative to opportunity to harm will exhibit a ‘hysteresis-like’ curve, as system inertia may impede attempts to reduce the harm rate, once it has increased.
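This hysteresis-like behaviour can be made concrete with a toy simulation (entirely our own illustration, not a model from the literature cited here): harm is assumed to track the opportunity for harm quickly when opportunity rises but, because of organizational inertia, only sluggishly when opportunity is reduced.

```python
# Toy model of a hysteresis-like lag (illustrative assumptions throughout):
# harm responds to changes in the 'opportunity for harm' at different
# rates depending on direction, as organizational inertia would predict.
def simulate(opportunities, k_up=0.9, k_down=0.2):
    """First-order lag with asymmetric response rates."""
    harm, path = 0.0, []
    for opp in opportunities:
        k = k_up if opp > harm else k_down  # fast to worsen, slow to improve
        harm += k * (opp - harm)
        path.append(round(harm, 3))
    return path

# Opportunity for harm ramps up, then is wound back by mitigation efforts:
print(simulate([0.2, 0.4, 0.6, 0.8, 1.0, 0.8, 0.6, 0.4, 0.2]))
```

At an opportunity level of 0.4 the simulated harm is about 0.38 on the way up but about 0.78 on the way down: the two paths are not identical, which is the signature of the postulated hysteresis-like curve.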
This means that mitigating the expected increase in ICT-related harm due to increased use will not be easy (figure 2). Making the health system do what we want it to do is non-trivial. We thus need to understand better how those factors that increase or decrease harm rates interact to help find ways of reducing the rate of harm attributable to ICT.
Figure 2 Changes in configuration will alter the opportunities for harm within a system, some making it safer and others less so. Harm rates are likely to be higher with increased complexity, and truncated implementation times. These harms will reduce as technology, (…)
Increased system usage
The level of investment in ICT provides a simple proxy measure for the growth in the number of system implementations and users. According to one estimate, the US health ICT market will experience a 4.5% compound annual growth rate over the next few years.20 According to our first ‘postulate’, and assuming only a linear relationship between increasing system deployments and harm, we could expect an approximately 5% per annum compound growth in ICT-related patient harms. This is, however, likely to be a lower bound, as the harm rate is more likely to be non-linear. The opportunities for harm are likely to be governed by complex network effects, in which the number of new implementations, the number of new users, and the growth in rate of usage all contribute to increased harm, suggesting more of a logistic curve.
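To make the arithmetic explicit, the sketch below projects relative harm growth under the linear assumption (the 4.5% figure is from the market estimate cited above; everything else is our illustrative simplification):

```python
# Project relative ICT-related harm counts, assuming (illustratively) that
# harms grow in lockstep with ICT deployments at a 4.5% compound annual rate.
def compound_growth(base, rate, years):
    """Value of `base` after `years` of compound growth at `rate`."""
    return base * (1 + rate) ** years

# Normalizing today's annual harm count to 1.0:
for year in (1, 5, 10):
    print(f"year {year}: {compound_growth(1.0, 0.045, year):.2f}")
```

Under these assumptions the harm count rises by roughly 55% over the decade; a logistic, network-effect model of the kind suggested above would rise faster still during its growth phase.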
Increased system complexity
We know from clinical practice that harm occurs at the boundaries of care, when information is passed from one party to the next. The situation is the same with ICT. For example, most hospitals have no homogeneous ICT ‘system’ but rather maintain a patchwork of different departmental systems, plumbed together to meet the organization's needs. Unsurprisingly, networking errors and failures of information to pass between different clinical subsystems are commonly reported when patient safety incident reports are analyzed.17
Differences in information models between systems mean that semantic and technical interoperability will always remain imperfect. There is thus a patient safety trade-off between the flexibility to choose the most suitable systems for specific clinical needs and the risks of interfacing heterogeneous systems. As system standards are an ever-moving and dynamic ‘target’,4 this problem is unlikely to be solved in the near term.
System complexity is also encountered at the user interface, and simpler systems that better fit the tasks at hand and the cognitive models of users are considered the ideal. Ensuring such usability requires system designers to engage, using different methods, at multiple stages in the software life cycle,21 underscoring the challenge to us all in managing the multiple interactions that occur between user, task, information system and environment.21a
Reduced implementation times
There is evidence that rapid ICT implementation is associated with increased risks of error and patient harm.22 Shortened implementation is often the result of artificial timetables, driven by factors including a failure to appreciate the importance of training the clinical workforce, a need to minimize implementation costs, or externally imposed deadlines.22a In the USA, even institutions with mature ICT systems and a trained and expert workforce still need to make substantial efforts to comply with meaningful use requirements. There is likely to be a tail of laggard health organizations that have no or immature existing systems and little local expertise, but are facing looming deadlines with significant cost penalties. These laggards face a choice between absorbing the penalties, which may have substantial fiscal impacts, and undertaking programmes of rapid ICT adoption. Organizations that adopt ICT in haste, without clinical leadership and support, and only to avoid penalties based upon highly specific rules, will probably attempt to ‘tick the boxes’ for meaningful use certification. They will not necessarily be implementing the systems they need, nor attending to long-term issues of system quality and safety. An unintended consequence of a short deadline to comply with meaningful use requirements in the USA is that we may see a spike in harm associated with poorly implemented systems in laggard organizations. It may also mean the financial failure of some organizations that are unable to comply.
Better software quality
There are scant public data about the rate of software errors in clinical systems or their impact on patient safety. Such data, when they are actually collected, are likely to be commercial and kept confidential, and there appear to be no formal mechanisms for enforcing the recording or reporting of software error rates.
One now decade-old dataset, generated from three releases of a major commercial US medical record system, is illustrative.23 24 The system contained 188 separate software components across 6500 files. Release 1 had defects in 58 of 180 components, seven of which were only discovered post-release. Release 2 had defects in 64 of 185 components, with five discovered post-release. Release 3 showed a numerical improvement in quality, with only 40 of 188 components being defective, but six of these were still only discovered post-release. Such analyses allow us to estimate the defect rate in even good-quality software, and help developers decide when to release software, based upon estimates of the number of as yet undetected errors.
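The component counts above reduce to simple per-release rates; the sketch below reproduces that arithmetic (counts are taken from the dataset described in the text; the two rate definitions are our own illustrative choices):

```python
# Per-release defect summary for the commercial medical record system
# described above: (defective components, total components, defects
# found only post-release). Counts from the cited dataset.
releases = {
    1: (58, 180, 7),
    2: (64, 185, 5),
    3: (40, 188, 6),
}

for rel, (defective, total, post) in releases.items():
    defect_rate = defective / total  # fraction of components with defects
    escape_rate = post / defective   # fraction of defective components missed in testing
    print(f"release {rel}: {defect_rate:.0%} of components defective, "
          f"{escape_rate:.0%} of those only found post-release")
```

Even release 3, the best of the three at roughly 21% defective components, still shipped defects that surfaced only after release, which is why release decisions based on estimates of undetected errors matter.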
We can expect improvements in the software development process, and maturation of existing clinical products and the organizations that produce them, all to contribute in a positive way to error reduction post-implementation. Our challenge at present is that there appear to be no public mechanisms for tracking or studying clinical software errors before and after system release, making it difficult to shape policy that would support improved software quality. Sophisticated analytical techniques to model the safety profiles of ICT in use, such as accident models,25 are also still not commonplace.
Improved standards compliance
Technical standardization improves the interoperability of disparate software systems, and minimizes the opportunity for error inherent in federated data systems, at the level of individual health services, or at a national scale. Standards for messaging, clinical documentation, and security can harness the ‘many eyes’ of the technical community who can come together to produce inherently more robust, and thus safer, systems. Certification of the software development process, and of the quality of service provided, should inherently reduce the likelihood that systems when implemented contain errors. Certification for locally implemented systems, and accredited and certified education for clinical users in system use will also help flatten growth in ICT-related harm.26 However, adopting standards occurs in a resource-constrained world, and as the burden of standards compliance grows, there are penalties to be paid in what in the end is a zero-sum resources game. A wealth of standards will lead to a poverty of their implementation, to paraphrase Herb Simon's famous aphorism.18
Predicting the actual harm rate and total patient harms that we will see through the use of ICT in healthcare over the next decade is currently not possible. However, it is realistic to make qualitative predictions, based upon those factors that are likely to dominate the patient safety equation. It is prudent to prepare for the likelihood that there will initially be a substantial increase in harm associated with health ICT over the next decade. The increased opportunity for harm arising from the increased penetration of complex ICT into healthcare is likely to be a remarkable feature of the next few years, and it is unlikely that mitigating strategies will develop as quickly. Harm rates are likely to be compounded by systems being implemented according to artificially truncated implementation times, or being put into the hands of under-trained users. There is still a road to be traveled before we are able to construct information systems that we know are certifiably safe in the clinical workplace, and we can train clinical hands that are certified safe to use them.
Our urgent challenge is now to move beyond observing that ICT may lead to patient harm. As underscored by the IOM report on ICT safety,10a we must move to tracking true harm rates with active monitoring of incidents,27 and exploring the underlying controllability of the system. Because of its complex sociotechnical nature, the health system is not easily ‘controllable’ and we need to understand the limits to our influence on health system performance, and how that shapes our responsibilities to patients, and our responses to outbreaks of ICT-related harm.
For the majority of ICT-related incidents that lead to little or no patient harm, our responsibilities are to understand the underlying root cause and eliminate or minimize future occurrences. For those hopefully rare events that may be of a catastrophic nature, we need to explore the notion of design-standardized human overrides. Patient safety is not just about studiously dissecting past events. It is also about managing events as they play out in real (dare we say ‘internet’) time. Where is the ‘kill switch’ in our health ICT systems when large-scale privacy breaches are occurring, or large volumes of critical patient data are being corrupted? Who is authorized to activate such a switch? How will clinical services operate when all their computers are temporarily shut down? These are serious issues of joint responsibility between practitioners, healthcare administrators, ICT managers and policy-makers that need to be understood.
We should not shy away from this reality. The next 10 years are going to see the fruits of many decades of hard work transforming the healthcare system through the application of ICT. The coming years, however, need to be years of caution and learning, as we institute governance systems, probably global in scale, which allow us to learn rapidly from what has gone wrong, and alert others to the risk. Most vendors sell their products in many different nations, and the lessons from a failure of a system in one country are likely to be as relevant when the same system is implemented in another. Any industry that shows that it is serious about learning, is honest and transparent about its failings, and demonstrates that it can rapidly improve when things go wrong is bound to succeed. That has been the lesson from aviation. However for some it is going to be a dangerous decade. Let us not kid ourselves that it will be anything else.
Footnotes
Competing interests: None.
Provenance and peer review: Commissioned; internally peer reviewed.
1. Bates DW, Cohen M, Leape L, et al. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc 2001;8:299–308. [PMC free article] [PubMed]
2. McKibbon A, Lokker C, Handler S, et al. The effectiveness of integrated health information technologies across the phases of medication management: a systematic review. J Am Med Inform Assoc 2012;19:22–30. [PMC free article] [PubMed]
3. Coiera E, Hovenga E. Building a sustainable health system. Geissbuhler A, Haux R, Kulikowski C, editors. IMIA yearbook of medical informatics 2007. Methods Inf Med 2007;46(Suppl 1):11–18.
4. Coiera E. Building a national health IT system from the middle out. J Am Med Inform Assoc 2009;16:271–3. [PMC free article] [PubMed]
5. Coiera E, Westbrook JI, Wyatt JC. The safety and Quality of Decision Support systems. Haux R, Kulikowski C, editors. IMIA yearbook of medical informatics 2006. Methods Inf Med 2006;45(Suppl 1):20–5.
6. Wears RL, Berg M. Computer technology and clinical work: still waiting for Godot. JAMA 2005;293:1261–3. [PubMed]
7. Koppel R, Localio A, Cohen A, et al. Neither panacea nor black box: responding to three Journal of Biomedical Informatics papers on computerized physician order entry systems. J Biomed Inform 2005;38:267–9. [PubMed]
7a. Aarts J. The future of electronic prescribing. Stud Health Technol Inform 2011;166:13–17. [PubMed]
8. Rodriguez-Gonzalez CG, Herranz-Alonso A, Martin-Barbero ML, et al. Prevalence of medication administration errors in two medical units with automated prescription and dispensing. J Am Med Inform Assoc 2012;19:72–8. [PMC free article] [PubMed]
9. Bates DW. Computerized physician order entry and medication errors: finding a balance. J Biomed Inform 2005;38:262–3. [PubMed]
10. Bloomrosen M, Starren J, Lorenzi NM, et al. Anticipating and addressing the unintended consequences of health IT and policy: a report from the AMIA 2009 Health Policy Meeting. J Am Med Inform Assoc 2011;18:82–90. [PMC free article] [PubMed]
10a. IOM (Institute of Medicine) Health IT and Patient Safety: Building Safer Systems For Better Care. Washington, DC: The National Academies Press, 2012.
10b. Lohr S. Panel emphasizes safety in digitization of health records. New York Times 2011:B7.
10c. Miller RA, Gardner RM; for the American Medical Informatics Association (AMIA), the Medical Library Association, the Association of Academic Health Science Libraries, the American Health Information Management Association, and the American Nurses Association. Recommendations for responsible monitoring and regulation of clinical software systems. J Am Med Inform Assoc 1997;4:442–57. [PMC free article] [PubMed]
10d. Coiera E, Westbrook J. Should clinical software be regulated? Med J Aust 2006;184:600–1. [PubMed]
11. Berg M, Aarts J, Van der Lei J. ICT in health care: sociotechnical approaches. Methods Inf Med 2003;42:297–301. [PubMed]
12. Lanham HJ, Leykum LK, McDaniel RR., Jr Same organization, same electronic health records (EHRs) system, different use: exploring the linkage between practice member communication patterns and EHR use patterns in an ambulatory care setting. J Am Med Inform Assoc. Published Online First: 16 August 2011. doi:10.1136/amiajnl-2011-000263. [PMC free article] [PubMed]
13. Magrabi F, Li SYW, Day RO, et al. Errors and electronic prescribing: a controlled laboratory study to examine task complexity and interruption effects. J Am Med Inform Assoc 2010:575–83. [PMC free article] [PubMed]
14. Li SYW, Magrabi F, Coiera E. A systematic review of the psychological literature on interruption and its patient safety implications. J Am Med Inform Assoc 2012;19:6–12. [PMC free article] [PubMed]
15. Petroski H. Success Through Failure: The Paradox of Design. Princeton: Princeton University Press, 2006.
15a. Sittig DF, Singh H. Defining health information technology-related errors: new developments since To Err Is Human. Arch Intern Med 2011;171:1281–4. [PubMed]
16. Magrabi F, Ong MS, Runciman W, et al. Using FDA reports to inform a classification for health information technology safety problems. J Am Med Inform Assoc 2012;19:45–53. [PMC free article] [PubMed]
17. Magrabi F, Ong MS, Runciman W, et al. An analysis of computer-related patient safety incidents to inform the development of a classification. J Am Med Inform Assoc 2010;17:663–70. [PMC free article] [PubMed]
17a. Magrabi F, Ong M, Runciman W, et al. Patient safety problems associated with healthcare information technology: an analysis of adverse events reported to the US Food and Drug Administration. AMIA 2011 Annual Symposium, Washington, DC, 2011:853–8. [PMC free article] [PubMed]
18. Coiera E. Why system inertia makes health reform so hard. BMJ 2011;343:27–9. [PubMed]
19. Ford JD. The occurrence of structural hysteresis in declining organizations. Acad Manag Rev 1980;5:589–98.
20. Taylor H. Healthcare IT Market Expected to Reach $73.1 Billion in 2010: Compass Intelligence, 2010. http://www.compassintelligence.com/tabid/67/portalid/0/Default.aspx?DeliverableId=2089.
21. Yen PY, Bakken S. Review of health information technology usability study methodologies. J Am Med Inform Assoc. Published Online First: 9 August 2011. doi:10.1136/amiajnl-2010-000020. [PMC free article] [PubMed]
21a. Coiera E. Getting technical about socio-technical systems science. Int J Med Inf 2007;76(suppl 1):98–103. [PubMed]
22. Han YY, Carcillo JA, Venkataraman ST, et al. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics 2005;116:1506–12. [PubMed]
22a. Sittig DF, Ash JS, Zhang J, et al. Lessons from Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics 2006;118:797–801. [PubMed]
23. Stringfellow C, Andrews A, Wohlin C, et al. Estimating the number of components with defects post-release that showed no defects in testing. Softw Test Verif Reliab 2002;12:93–122.
24. Hewett R, Kulkarni A, Seker R, et al. On Effective Use of Reliability Models and Defect Data in Software Development. Region 5 Conference, 7–9 April 2006 IEEE, San Antonio, TX, USA. 2006:67–71 http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=5507460.
25. Magrabi F, McDonnell G, Westbrook JI, et al. Using an accident model to design safe electronic medication management systems. Stud Health Technol Inform 2007;129:948–52. [PubMed]
26. Singh H, Classen DC, Sittig DF. Creating an oversight infrastructure for electronic health record-related patient safety hazards. J Patient Saf. Published Online First: 10 November 2011. doi:10.1097/PTS.0b013e31823d8df0. [PubMed]
27. Sittig DF, Classen DC. Safe electronic health record use requires a comprehensive monitoring and evaluation framework. JAMA 2010;303:450e1. [PMC free article] [PubMed]