Arch Intern Med. Author manuscript; available in PMC Jun 9, 2013.
PMCID: PMC3677061
NIHMSID: NIHMS476207
Defining Health Information Technology-related Errors: New Developments Since To Err is Human*
Dean F. Sittig, PhD1 and Hardeep Singh, MD, MPH2
1University of Texas – Memorial Hermann Center for Healthcare Quality & Safety, National Center for Cognitive Informatics and Decision Making, School of Biomedical Informatics, University of Texas Health Sciences Center, Houston, TX
2Houston VA Health Services Research and Development Center of Excellence and The Center of Inquiry to Improve Outpatient Safety Through Effective Electronic Communication, Michael E. DeBakey Veterans Affairs Medical Center and Section of Health Services Research, Department of Medicine, Baylor College of Medicine, Houston, TX
Address for Correspondence: Dean F. Sittig, Ph.D. UT - Memorial Hermann Center for Healthcare Quality & Safety, University of Texas School of Biomedical Informatics at Houston, 6410 Fannin St. UTPB 1100.43, Houston, TX 77030, Work: 713-500-7977, Fax: 713-500-0766, dean.f.sittig@uth.tmc.edu
Despite its promise, recent literature has revealed possible safety hazards of health information technology (HIT) use. The Office of the National Coordinator for HIT recently sponsored an Institute of Medicine committee to synthesize evidence and experience from the field on how HIT affects patient safety. To lay the groundwork for defining, measuring, and analyzing HIT-related safety hazards, we propose that Health information technology-related error occurs anytime HIT is unavailable for use, malfunctions during use, is used incorrectly by someone, or when HIT interacts with another system component incorrectly, resulting in data being lost or incorrectly entered, displayed, or transmitted. These errors, or the decisions that result from them, significantly increase the risk of adverse events and patient harm. In this paper, we describe how a socio-technical approach can be used to understand the complex origins of HIT errors, which may have roots in rapidly evolving technological, professional, organizational, and policy initiatives.
Keywords: Electronic Health Records, Health Information Technology, Patient Safety, Errors
Two Institute of Medicine (IOM) reports have recommended the use of information technologies to improve patient safety and reduce errors in health care1,2. Broadly speaking, health information technology (HIT) is the overarching term applied to various information and communication technologies used to collect, transmit, display, or store patient data. Despite HIT’s promise in improving safety, recent literature has revealed potential safety hazards associated with its use, often referred to as e-iatrogenesis.3,4 For example, Koppel et al5 describe 22 types of errors facilitated by a commercially available electronic health record (EHR) system’s computerized provider order entry (CPOE) application. In response to similar emerging concerns, the Office of the National Coordinator for HIT recently sponsored an IOM committee to “review the available evidence and the experience from the field” on how HIT use affects patient safety. Given the national impact of HIT, this initiative is a major step forward in ensuring the safety and well-being of our patients. However, the field currently lacks acceptable definitions of HIT-related errors and it is unclear how best to measure or analyze “HIT errors”.
The goal of this manuscript is to advance the understanding of HIT-related errors and explain how adverse events, near misses, and patient harm can result from problems with HIT itself or from interactions between HIT, its users, and the work system. Health information technology errors almost always jeopardize patient outcomes and have high potential for harm6 because they are often latent errors that occur at the “blunt end” of the healthcare system,7 potentially affecting large numbers of patients if not corrected. Furthermore, if important structural or process-related HIT problems are not addressed proactively, the care of millions of patients may be affected owing to the impending widespread adoption and implementation of EHRs8. We thus focus heavily on errors related to the use of EHR systems.
We define the HIT work system as the combination of the hardware and software required to implement the HIT, as well as the social environment in which it is implemented. We thus propose that HIT errors should be defined from the socio-technical viewpoint of end users (including patients, when applicable) rather than from the purely technical viewpoint of manufacturers, developers, vendors, and personnel responsible for implementation. A health information technology-related error occurs any time the HIT system is unavailable for use, malfunctions during use, is used incorrectly, or interacts with another system component incorrectly, resulting in data being lost or incorrectly entered, displayed, or transmitted.9,10 Errors with HIT may involve failures of either structures or processes and can occur in the design and development, implementation and use, or evaluation and optimization phases of the HIT life cycle.11 This approach is consistent with the currently recommended systems and human factors approaches used to understand and reduce error.1
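For illustration only, the proposed definition could be captured in a simple incident-reporting data structure that tags each event with the failure mode and life-cycle phase named above. This is a hypothetical sketch in Python, not part of the article or of any reporting standard, and all names are assumptions.

```python
from dataclasses import dataclass
from enum import Enum

class FailureMode(Enum):
    UNAVAILABLE = "HIT unavailable for use"
    MALFUNCTION = "HIT malfunctions during use"
    INCORRECT_USE = "HIT used incorrectly"
    BAD_INTERACTION = "HIT interacts incorrectly with another system component"

class LifeCyclePhase(Enum):
    DESIGN_AND_DEVELOPMENT = "design and development"
    IMPLEMENTATION_AND_USE = "implementation and use"
    EVALUATION_AND_OPTIMIZATION = "evaluation and optimization"

@dataclass
class HITErrorReport:
    """One reported HIT-related error, tagged with the categories defined above."""
    description: str
    failure_mode: FailureMode
    phase: LifeCyclePhase
    data_affected: str  # e.g., "lost", "incorrectly entered", "incorrectly displayed"

# Hypothetical report for a malfunction discovered after go-live:
report = HITErrorReport(
    description="Weight recorded in pounds was treated as kilograms by dosing logic",
    failure_mode=FailureMode.MALFUNCTION,
    phase=LifeCyclePhase.IMPLEMENTATION_AND_USE,
    data_affected="incorrectly processed",
)
print(f"{report.failure_mode.value} ({report.phase.value}): {report.description}")
```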
The HIT system is considered to be unavailable for use if, for any reason, the user cannot enter, review, transmit, or print data (e.g., a patient’s medication allergies or most recent laboratory test results). Reasons could include unavailable computer hardware (e.g., a missing keyboard; problems with the computer’s monitor, the network routers that connect the computer to the data servers and printers, or the server where data are stored), unavailable software (e.g., missing components of the operating system that manages either the computer applications, such as the internet browser and EHR, or the interface between an EHR system and the information system of an ancillary service such as radiology or the laboratory), and unavailable power (e.g., a power outage that results in hospital-wide computer failure).4
The HIT system is considered to be malfunctioning (i.e., available, but not working correctly) whenever a user cannot accomplish the desired task despite using the HIT system as designed. In this situation, error results from any hardware or software defect (or bug) that prohibits a user from entering or reviewing data, or any defect that causes the data to be entered, displayed, transmitted, or stored incorrectly. For example, the clinician might enter a patient’s weight in pounds, and the weight-based dosing algorithm might fail to convert it to kilograms before calculating the appropriate dose, resulting in a 2-fold overdose.
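To make this failure mode concrete, here is a minimal sketch in Python (not drawn from the article or from any vendor system; all function names and values are illustrative) of a weight-based dosing routine that normalizes the recorded weight to kilograms, together with the roughly 2-fold overdose that results when a weight entered in pounds is silently treated as kilograms.

```python
# Minimal sketch (illustrative only) of the weight-unit failure described above.
# A dosing routine that normalizes to kilograms avoids the error; skipping the
# conversion step roughly doubles the calculated dose.

LB_PER_KG = 2.20462  # pounds per kilogram

def weight_based_dose_mg(weight: float, unit: str, mg_per_kg: float) -> float:
    """Return a dose in mg, converting the weight to kilograms first."""
    if unit == "lb":
        weight = weight / LB_PER_KG  # the step that is missing in the error scenario
    elif unit != "kg":
        raise ValueError(f"Unrecognized weight unit: {unit!r}")
    return round(weight * mg_per_kg, 1)

# A patient weighing 154 lb (about 70 kg), dosed at 1 mg per kg:
correct_dose = weight_based_dose_mg(154, "lb", mg_per_kg=1.0)  # about 70 mg
faulty_dose = round(154 * 1.0, 1)  # 154 mg if the pounds value is treated as kilograms
print(correct_dose, faulty_dose)   # roughly a 2-fold overdose
```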
Finally, errors can occur even when hardware and software are functioning as designed; for instance, when users do not use the hardware or software as intended. For example, users might enter free-text comments (e.g., “take 7.5 mg Mon-Fri only”) that contradict information contained in the structured section of the medication order (e.g., “Warfarin tabs 10 mg QD”).12 Errors may also arise when 2 or more parts of the HIT system (e.g., the CPOE application and the pharmacy’s medication-dispensing system) interact in an unpredicted manner, resulting in inaccurate, incomplete, or lost data during entry, display, transmission, or storage.13
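The structured-versus-free-text contradiction above could, in principle, be caught by a simple automated cross-check before dispensing. The sketch below is illustrative only, not the authors’ method or any EHR vendor’s API: it extracts dose values mentioned in a free-text comment and flags them when they disagree with the structured dose field, using the warfarin example from the text.

```python
import re

def dose_conflict(structured_dose_mg: float, free_text: str) -> bool:
    """Return True if the free-text comment mentions a dose (in mg) that
    differs from the structured dose field."""
    mentioned = [float(m) for m in
                 re.findall(r"(\d+(?:\.\d+)?)\s*mg", free_text, flags=re.IGNORECASE)]
    return any(abs(dose - structured_dose_mg) > 0.01 for dose in mentioned)

# Hypothetical order mirroring the warfarin example in the text:
order = {"drug": "Warfarin", "dose_mg": 10.0, "sig": "QD",
         "comment": "take 7.5 mg Mon-Fri only"}

if dose_conflict(order["dose_mg"], order["comment"]):
    print("Alert: free-text comment appears to contradict the structured dose; "
          "review before dispensing.")
```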
Leveson14 proposes that new technologies have fundamentally altered the nature of errors and asserts that these changes necessitate new models and methods for investigating technology-related errors. Thus, technological advances could give rise to increasingly complex and multifaceted errors in healthcare. In view of the resultant expanding and evolving context of safe HIT implementation and use, we illustrate how a recently developed socio-technical model for HIT evaluation and use can provide an origin-specific typology for HIT errors.15 The model’s 8 dimensions (Table 1) comprehensively account for the technology; its users, their respective workflow processes, and how these 2 elements interface with the technology; the work system context, including organizational and policy factors that affect HIT; and, notably, the interactions between all of these factors.16 Table 1 lists examples of specific EHR-related errors that can occur within each of the 8 dimensions of the socio-technical model, along with examples of potential ways that the likelihood of each error could be reduced. Thus, the model not only illustrates the complex relationships between active and latent errors but also lays a foundation for error analysis.
Table 1
Examples of the types of errors that can occur within each dimension of the socio-technical model16 and corresponding suggested mitigating procedures.
In conclusion, rapid advances in HIT development, implementation, and regulation have complicated the landscape of HIT-related safety issues. Erroneous or missing data, and the decisions based on them, increase the risk of adverse events and unnecessary costs. Because these errors can and frequently do occur after implementation, simply increasing oversight of HIT vendors’ development processes will not address all HIT-related errors. Comprehensive efforts to reduce HIT errors must start with clear definitions and an origin-focused understanding of HIT errors that addresses important socio-technical aspects of HIT use and implementation. To this end, we provide herein a much-needed foundation for coordinating safety initiatives of HIT designers, developers, implementers, users, and policy makers, who must continue to work together to achieve a high-reliability HIT work system for safe patient care.
Acknowledgments
Dr. Sittig is supported in part by a grant from the National Library of Medicine R01-LM006942 and by a SHARP contract from the Office of the National Coordinator for Health Information Technology (ONC #10510592).
Dr. Singh is supported by an NIH K23 career development award (K23CA125585), the VA National Center of Patient Safety, Agency for Health Care Research and Quality, a SHARP contract from the Office of the National Coordinator for Health Information Technology (ONC #10510592), and in part by the Houston VA HSR&D Center of Excellence (HFP90-020).
These sources had no role in the preparation, review, or approval of the manuscript.
We thank Laura A. Petersen, MD, MPH, VAHSR&D Center of Excellence, Michael E. DeBakey Veterans Affairs Medical Center, and Baylor College of Medicine, and Eric J. Thomas, MD, MPH, University of Texas, Houston-Memorial Hermann Center for Healthcare Quality and Safety and Department of Medicine, University of Texas Medical School, Houston, for their guidance in this work and Annie Bradford, PhD, for assistance with medical editing, for which they received no compensation.
Footnotes
*Portions of this manuscript were presented to the United States Institute of Medicine Committee on Patient Safety and Health Information Technology, held December 14, 2010, in Washington, DC.
The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs or any of the other funding agencies.
References
1. Institute of Medicine. To err is human: Building a safer health system. Washington, DC: National Academy Press; 1999. [Report by the Committee on Quality of Health Care in America]
2. Institute of Medicine. Patient Safety: Achieving a new standard for care. Washington, DC: National Academy Press; 2004. [Report by the Committee on Data Standards for Patient Safety]
3. Weiner JP, Kfuri T, Chan K, Fowles JB. “e-Iatrogenesis”: The most critical unintended consequence of CPOE and other HIT. J Am Med Inform Assoc. 2007 Feb 28;
4. Myers RB, Jones SL, Sittig DF. Reported clinical information system adverse events in US Food and Drug Administration databases. Appl Clin Inform. 2011;2:63–74. doi: 10.4338/ACI-2010-11-RA-0064.
5. Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, Strom BL. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005 Mar 9;293(10):1197–203.
6. Hofer TP, Kerr EA, Hayward RA. What is an error? Eff Clin Pract. 2000 Nov-Dec;3(6):261–9.
7. Reason J. Human error: models and management. BMJ. 2000 Mar 18;320(7237):768–70.
8. Stead W, Lin H, editors. Computational technology for effective health care: immediate steps and strategic directions. Washington, DC: National Academies Press; 2009.
9. Mangalmurti SS, Murtagh L, Mello MM. Medical malpractice liability in the age of electronic health records. N Engl J Med. 2010 Nov 18;363(21):2060–7.
10. Perrow C. Normal Accidents: Living with High-Risk Technologies. Princeton University Press; Princeton, New Jersey: 1999.
11. Walker JM, Carayon P, Leveson N, Paulus RA, Tooker J, Chin H, Bothe A Jr, Stewart WF. EHR safety: the way forward to safe and effective systems. J Am Med Inform Assoc. 2008 May-Jun;15(3):272–7.
12. Singh H, Mani S, Espadas D, Petersen N, Franklin V, Petersen LA. Prescription errors and outcomes related to inconsistent information transmitted through computerized order entry: a prospective study. Arch Intern Med. 2009 May 25;169(10):982–9.
13. Kleiner B. Sociotechnical System Design in Health Care. In: Carayon P, editor. Handbook of Human Factors and Ergonomics in Health Care and Patient Safety. Mahwah, NJ: Lawrence Erlbaum; 2007.
14. Leveson N. A New Accident Model for Engineering Safer Systems. Safety Science. 2004 Apr;42(4):237–270.
15. Sittig DF, Singh H. Eight rights of safe electronic health record use. JAMA. 2009 Sep 9;302(10):1111–3.
16. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care. 2010 Oct;19(Suppl 3):i68–74.
17. Kilbridge P. Computer crash--lessons from a system failure. N Engl J Med. 2003 Mar 6;348(10):881–2.
18. Shojania KG. Patient Mix-Up. AHRQ WebM&M [serial online] 2003 Feb; Available at: http://www.webmm.ahrq.gov/case.aspx?caseID=1.
19. Horsky J, Kuperman GJ, Patel VL. Comprehensive analysis of a medication dosing error related to CPOE. J Am Med Inform Assoc. 2005 Jul-Aug;12(4):377–82.
20. AHIMA MPI Task Force. Merging Master Patient Indexes. 1997 Sep; Available at: http://www.cstp.umkc.edu/~leeyu/Mahi/medical-data6.pdf.
21. Koppel R, Wetterneck T, Telles JL, Karsh BT. Workarounds to barcode medication administration systems: their occurrences, causes, and threats to patient safety. J Am Med Inform Assoc. 2008 Jul-Aug;15(4):408–23.
22. Kuperman GJ, Teich JM, Tanasijevic MJ, Ma’Luf N, Rittenberg E, Jha A, Fiskio J, Winkelman J, Bates DW. Improving response to critical laboratory results with automation: results of a randomized controlled trial. J Am Med Inform Assoc. 1999 Nov-Dec;6(6):512–22.
23. Singh H, Wilson L, Petersen LA, Sawhney MK, Reis B, Espadas D, Sittig DF. Improving follow-up of abnormal cancer screens using electronic health records: trust but verify test result communication. BMC Med Inform Decis Mak. 2009 Dec 9;9:49.
24. Singh H, Thomas EJ, Sittig DF, Wilson L, Espadas D, Khan MM, Petersen LA. Notification of abnormal lab test results in an electronic medical record: do any safety concerns remain? Am J Med. 2010 Mar;123(3):238–44.
25. Singh H, Thomas EJ, Mani S, Sittig D, Arora H, Espadas D, Khan MM, Petersen LA. Timely follow-up of abnormal diagnostic imaging test results in an outpatient setting: are electronic medical records achieving their potential? Arch Intern Med. 2009 Sep 28;169(17):1578–86.
26. Strom BL, Schinnar R, Aberra F, Bilker W, Hennessy S, Leonard CE, Pifer E. Unintended effects of a computerized physician order entry nearly hard-stop alert to prevent a drug interaction: a randomized controlled trial. Arch Intern Med. 2010 Sep 27;170(17):1578–83.
27. Grissinger M. Preventing serious tissue injury with intravenous promethazine (Phenergan). Pharmacy & Therapeutics. 2009 Apr;34(4):175–6.
28. Medication reconciliation. 2005 National Patient Safety Goal #8 by the Joint Commission.
29. Poon EG, Blumenfeld B, Hamann C, Turchin A, Graydon-Baker E, McCarthy PC, Poikonen J, Mar P, Schnipper JL, Hallisey RK, Smith S, McCormack C, Paterno M, Coley CM, Karson A, Chueh HC, Van Putten C, Millar SG, Clapp M, Bhan I, Meyer GS, Gandhi TK, Broverman CA. Design and implementation of an application and associated services to support interdisciplinary medication reconciliation efforts at an integrated healthcare delivery network. J Am Med Inform Assoc. 2006 Nov-Dec;13(6):581–92.
30. APPROVED: Will Not Score Medication Reconciliation in 2009. Joint Commission. Available at: http://www.jcrinc.com/common/PDFs/fpdfs/pubs/pdfs/JCReqs/JCP-03-09-S1.pdf.
31. Revised National Patient Safety Goal on medication reconciliation is approved. Joint Commission Online - December 8, 2010.