As noted above, federal legislation is stimulating numerous HIT implementation efforts around the country. While the potential benefits of these efforts are broad, experience has shown that major technological changes can bring about negative unintended consequences that jeopardize the success of implementations. Consequently, AMIA dedicated its 2009 Annual Health Policy Meeting to an exploration of unintended consequences of HIT and related policies.
The goals of AMIA's 2009 health policy conference were as follows:
- Explore and outline approaches to recognizing, anticipating, and addressing unintended consequences of HIT and HIT-related policies and legislation.
- Identify areas for further study and research in the above areas.
The recommendations of the meeting (described at the end of the paper) were formulated during facilitated breakout discussion sessions attended by conference participants; these sessions are described in detail below.
Two conference speakers, representing other safety-conscious industries, shared experience relevant to HIT and its unintended consequences, providing critical knowledge that informed the breakout discussions. Dr Nancy Leveson, an expert in aviation software safety, presented highlights of the aviation industry's experience in addressing safety issues. Dr Leveson focused on factors contributing to previous IT disasters, noting that a high percentage of famous aviation disasters involved software that had been adapted from another product or use rather than developed de novo. She discussed mistakes commonly made in introducing technology: attempting to do too much too fast, building technology-centered automation that is unusable or error prone and necessitates unacceptable changes in workflow, and failing to build in safety at the start of design. Rodeina Davis, CIO of the Blood Center of Wisconsin, discussed the history and experience of blood banking software under FDA regulation. Keynote speakers Dr David Blumenthal, ONC Director, and Aneesh Chopra, Federal Chief Technology Officer of the U.S., addressed national issues in the HIT arena.
To open the meeting's discussions, AMIA posed the following questions:
- What do we know about unintended consequences that are related to HIT design, implementation, and use? To what extent can we anticipate, describe, categorize, and prevent (or mitigate) unintended consequences and what approaches have been effective? To what extent are unintended consequences undesirable? Desirable (conferring benefits or efficiencies)? What do we still need to learn?
- Who is responsible for anticipating unintended consequences?
- What unintended consequences may arise from policies engendered by current/pending legislation and regulation related to HIT?
- What lessons can we learn from other industries and how can we leverage them in addressing unintended consequences related to HIT?
The meeting's Steering Committee developed a working definition of unintended consequences: “Unintended consequences are outcomes of actions that are not originally intended in a particular situation (eg, HIT implementation).” In particular, the focus of the meeting was on those undesirable outcomes that are rarely, if ever, foreseen. For the sake of brevity, they are referred to subsequently in this paper simply as unintended consequences.
Breakout sessions focused on four intersecting domains
The breakout sessions were planned by the Steering Committee to stimulate in-depth consideration of HIT-related topics through an open exchange of knowledge and experience by meeting participants. Participants chose among breakout sessions focused on four domains: technology, human factors and cognition, organization, and fiscal/policy and regulation.
While unintended consequences of health IT are frequently considered within a single domain (eg, technology), unintended consequences of HIT implementations and policy often arise at the intersections of domains. For example, a recent qualitative meta-analysis of HIT implementations found that organizational efficiency is not automatically increased simply by implementing a technology solution; actions needed to promote success include management involvement, integration of the system into clinical workflow, establishment of compatibility between software and hardware, and user involvement, education, and training.36 These actions straddle the domains.
Decisions made about the technical design of an HIT system (eg, the organization and display of information) can affect the ease or difficulty of its use, given the limits of human cognitive abilities. Failure to consider the complexity and dynamism of clinical workflow when designing and implementing HIT solutions can reduce their effectiveness and may lead to unintentional errors that affect patient safety. Unintended consequences can also arise in the organization domain from regulation or legislation that mandates meaningful use of HIT by a certain deadline: some practitioners are not yet included in the definition of meaningful users and thus are not eligible for payment incentives.
To characterize these intersecting effects, AMIA introduced an input–output model of unintended consequences, with four domains serving as inputs:
- Technology: hardware and software systems that are implemented and the constraints they impose.
- Human factors and cognition: the thought processes, habits of behavior, and mental capabilities that humans bring to the use of HIT tools and processes.
- Organization: the embedding of technology in the complex environment of healthcare organizations.
- Fiscal/policy and regulation: the legislative and regulatory environment governing the design, implementation, and use of HIT.
HIT-related unintended consequences arising in these domains have multi-dimensional implications. The figure below outlines relationships among the types of consequences and how they affect stakeholders. (The model shows Fiscal separately from Policy and Regulation, but breakout discussions of these topics were combined because of their overlap.)
Figure: Input–output model of unintended consequences.
Technology domain
The conversion from a paper-based system to an electronic one brings inevitable challenges. Some problems are caused by attributes or characteristics of specific HIT systems, while others result from the general change process or relate to the existing level of computerization in an organization. New types of errors may be generated when a task is performed using a computer rather than paper, and communication via electronic means differs from face-to-face communication among clinicians and between clinicians and patients.
The transition from paper to an electronic system can result in errors due to problems with technical design, confusion about system features by users, and workflow mismatches.20
- Users of CPOE often select a pick-list item that is close to their desired choice but is not technically correct or is less precise than intended (see the sketch following this list). The available choices may be limited by tool design (eg, no facility for an 'other' type of choice) or by the terminology used to specify the list entries.
- The quality of clinical documentation may be affected by a feature of EHRs that allows multiple parties to access and edit the same records. A positive consequence is that multiple providers can double check and refine summaries of clinical problems or medications; providers may correct entries, reconcile with patient lists, or add their own insights. However, different providers may disagree on the quantity and specificity of documentation to include on shared lists. When conflicts such as these are not managed, shared lists may develop duplicate, or even contradictory, entries that lead to confusion as different providers attempt to document in a manner that matches their needs and cognitive workflow.
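To make the pick-list failure mode concrete, the following minimal sketch shows how a tool that offers only closest-match selection, with no 'other' escape hatch, can silently substitute a less precise item for the clinician's intent. The medication list and matching strategy are invented for illustration and are not drawn from any cited system.

```python
# Hypothetical pick list; real CPOE catalogs are far larger.
import difflib

PICK_LIST = [
    "acetaminophen 325 mg tablet",
    "acetaminophen 500 mg tablet",
    "acetaminophen-codeine 300-30 mg tablet",
]

def select_from_pick_list(intended: str) -> str:
    """Return the closest list item; there is no free-text 'other' option."""
    # cutoff=0.0 forces a selection no matter how poor the match is.
    return difflib.get_close_matches(intended, PICK_LIST, n=1, cutoff=0.0)[0]

# The clinician intends a formulation that is not on the list; the tool
# returns a near match that is less precise than intended, with no error.
print(select_from_pick_list("acetaminophen 650 mg extended-release tablet"))
```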
Problems may arise when clinicians have false expectations regarding data accuracy and processing or have an unquestioning trust of automated systems without a thorough understanding of their limitations; when clinicians who have worked exclusively in automated environments find themselves in work settings without these technologies; and when there are insufficient backup systems and processes in place if applications go down.37
Another way that computerization affects workflow has been termed 'alert dependence.' Electronic systems that track and audit physician decisions through clinical decision support or other mechanisms may lead to overdependence on safety checks, particularly when the system interacts with the provider only after a problem has been detected. Moreover, providers who have grown accustomed to a particular alert at one practice site may fail to realize that a second site is not running that rule. On the other hand, the phenomenon of 'alert fatigue' may prompt CPOE users to override a large percentage of alerts, potentially compromising the safety benefits that are the goal of integrating decision support into the application.38
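A brief, hypothetical sketch can make both phenomena concrete; the rule, the sites, and the override figures below are invented for illustration, not taken from the cited studies.

```python
# Two sites run different decision-support rule sets. A provider used to
# the interaction alert at site A may assume site B performs the same check.
from typing import Callable, Optional

Order = dict  # eg, {"drug": "warfarin", "coadministered": ["aspirin"]}

def warfarin_aspirin_rule(order: Order) -> Optional[str]:
    if order["drug"] == "warfarin" and "aspirin" in order["coadministered"]:
        return "Interaction alert: warfarin + aspirin increases bleeding risk"
    return None

SITE_RULES: dict = {
    "site_a": [warfarin_aspirin_rule],
    "site_b": [],  # the rule was never enabled here; no alert will ever fire
}

def check_order(site: str, order: Order) -> list:
    return [msg for rule in SITE_RULES[site] if (msg := rule(order))]

order = {"drug": "warfarin", "coadministered": ["aspirin"]}
print(check_order("site_a", order))  # ['Interaction alert: ...']
print(check_order("site_b", order))  # [] -- silence here is not safety

# Alert fatigue is often tracked as an override rate:
fired, overridden = 1200, 1068  # invented counts
print(f"override rate: {overridden / fired:.0%}")  # 89%
```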
Furthermore, EHR systems newly installed in an organization must work in concert with the computer systems already in place; harmonization between these systems may be disrupted by subsequent implementations and updates.39
Systems that are improperly integrated, requiring that data be entered into multiple systems, may result in data fragmentation. For example, when a CPOE system is not integrated with a pharmacy system, every order has to be printed and then manually re-entered into the pharmacy system. Further, the maintenance of multiple networks in an organization requires that data be updated in all relevant systems, or records can become outdated, incomplete, or inconsistent.40
Experts recommend that organizations use rigorous approaches to ensure the quality of data in HIT systems. These include manual checking of results, development of measurable data quality benchmarks and procedures for identifying deviations from benchmarks, and training of users to correctly enter data in electronic forms.37
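As an illustration of the kinds of checks these recommendations imply, the sketch below flags deviations from a completeness benchmark and detects the cross-system inconsistency described in the preceding paragraph. The field names, the 95% benchmark, and the sample records are assumptions made for the example.

```python
# Sample records with assumed fields; None marks a missing value.
records = [
    {"id": 1, "allergies": "penicillin", "weight_kg": 72.0},
    {"id": 2, "allergies": None, "weight_kg": 80.5},
    {"id": 3, "allergies": "none known", "weight_kg": None},
]

COMPLETENESS_BENCHMARK = 0.95  # assumed target: 95% of fields populated

def completeness(field: str) -> float:
    return sum(r[field] is not None for r in records) / len(records)

for field in ("allergies", "weight_kg"):
    rate = completeness(field)
    if rate < COMPLETENESS_BENCHMARK:
        print(f"DEVIATION: {field} completeness {rate:.0%} is below benchmark")

# Fragmentation check: the same patient's weight as held by two
# unintegrated systems; disagreement signals a stale or missed update.
cpoe_weights = {"patient_1": 72.0}
pharmacy_weights = {"patient_1": 68.0}
for pid, w in cpoe_weights.items():
    if pharmacy_weights.get(pid) != w:
        print(f"INCONSISTENT: {pid} weight differs across systems")
```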
Human factors and cognition domain
Knowledge of the principles of human–computer interaction and an appreciation of the importance of human and cognitive factors are critical to HIT design and implementation. The study of human factors related to HIT systems focuses on the systematic application of knowledge about human sensory, perceptual, mental, psychomotor, and other characteristics to the design of these systems. Human and cognitive factors place emphasis on the mental (memory, knowledge, strategies) and social properties characteristic of humans.23
Horsky et al noted, “Cognition is considered to be a process of coordinating, mediating, and redistributing knowledge representations that are internal (ie, in the mind) and external (eg, visual displays, written instructions, etc). Environmental, social, cultural, organizational and regulatory factors contribute to the complexity of these systems that stretch over human beings and the technology they work with. Computing technology and artifacts are integral parts of this cognitive process and should be designed to correspond to human characteristics of reasoning, memory, attention, and constraints (human-centered design).”41
Electronic health records have enabled the collection of large quantities of data and text regarding patient encounters, hospitalizations, procedures, and test results. Increasingly voluminous repositories of electronic records converted from paper can lead to unintended consequences such as difficulty finding the data relevant to a specific clinical need. When not adequately organized, these vast quantities of data, coupled with certain aspects of system design, can increase cognitive load, a term from cognitive science referring to the load on working memory during information processing.
In addition to cognitive load, clinicians also face usability limitations with many current HIT systems. For example, order sets in CPOE systems are intended to relieve much of the tedious burden of selecting one order at a time and enable physicians to devote more cognitive resources to treatment and management planning. A study of strategy selection in order entry found that entering orders is a complex process that can be made more difficult or eased by interface and design support.42
Dense information displays (eg, many levels of nested drop-down lists) can make the selection process cumbersome.24
The very changes that ease usability problems can, if not adequately monitored, create additional problems as unintended consequences.
Other concerns have arisen from the inherent properties of HIT systems. Patel et al report that exposure to the tightly structured format of EHRs is associated with changes in physicians’ information gathering (more efficient) and reasoning strategies (hypothesis driven) compared with their use of paper-based records (slow and data driven). The system's support for guiding and narrowing the search changed the directionality of reasoning from data driven to hypothesis based, resulting in errors that could be anticipated. Alongside these intended changes came unintended ones: loss of the narrative thread of the patient history and the additional cognitive effort needed to assemble a patient history from discrete information, leading to different types of errors.25
Alert and reminder systems are frequently characterized by linear, rigid rules, an approach that is a poor fit for the inherent complexity of medical decision making and is inconsistent with the way in which people tend to make decisions as shown by the classical decision making literature on heuristics and biases.43
This formalization of rules to manage decisions that were previously managed informally entails loss of flexibility, leading to loss of resilience, with the danger of generating medical error as an unintended consequence. Such examples indicate that a deeper understanding of the cognitive properties of a system prior to its implementation is needed to help planners anticipate and pre-empt many unintended consequences.
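A minimal sketch can show the rigidity at issue; the thresholds and fields below are invented. The rule encodes a single linear condition and therefore behaves identically in clinical situations that an experienced clinician would distinguish.

```python
def renal_dose_alert(dose_mg: float, creatinine: float) -> bool:
    # Rigid rule: alert whenever the dose exceeds a fixed cap for an
    # elevated creatinine, regardless of trend, dialysis status, or the
    # urgency of the clinical situation.
    return creatinine > 1.5 and dose_mg > 500

# Fires for a patient whose creatinine is elevated but improving and who
# may be dialyzed (where the cap may not apply)...
print(renal_dose_alert(dose_mg=600, creatinine=1.6))  # True
# ...and stays silent just under the hard-coded boundary.
print(renal_dose_alert(dose_mg=600, creatinine=1.5))  # False
```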
Organization domain
Peter Drucker described the modern hospital as “altogether the most complex human organization ever devised.”44
As increasingly complex HIT systems are implemented in complex environments, they will affect larger, more heterogeneous groups of people and organizations in a variety of settings. Major implementation challenges for an organization tend to be behavioral, sociological, cultural, and financial rather than strictly technical. An HIT implementation can lead to physical, mental, and emotional exhaustion of an organization and its workforce, thus rendering the organization reluctant (or unable) to move forward with further implementation efforts. The critical importance of managing the power and organizational conflicts inherent in information system development is being increasingly understood.17
To create an effective foundation for organizational transformation, there is a need for strong support by both management and future users during HIT implementations.32
Discussing factors contributing to HIT implementation challenges, Lorenzi et al focus on the organization's capacity for change and its recognition of the importance of context: “The implementation of an IT system… requires a detailed plan that is driven by both capacity for change and context of change. Capacity represents the ability of the organization to invest in high-quality training, extensive support at go-live, and managers who can respond flexibly to changes in the environment so that patient safety is maintained as the highest priority. Context is the environment to which the implementation plan must adapt. A rollout schedule, for example, must take into account the many interdependencies that exist among clinical units as well as organizational changes that are occurring during the implementation.” Examples of aspects of implementation that are embedded in organizational structures, supports, and processes and may become sources of frustration include workflow changes, difficulty getting technical help when it is needed, perceived (or actual) disassociation of IT staff from operational needs, and conflicting organizational priorities.31
A systematic literature review outlined lessons learned from HIT implementations in seven countries and found that strong project leadership using appropriate project management techniques, the establishment of standards, and staff training are needed to avoid risks that could compromise success. The review described ways in which HIT technical features interact with the social features of the healthcare work environment, and how this juxtaposition may contribute to complications of HIT deployment.45
Harrison et al described unintended consequences resulting from the interaction between HIT and the healthcare organization's socio-technical system (workflows, culture, social interactions, and technologies), and offered the ISTA (interactive socio-technical analysis) model to help study these consequences and their causes.9
The Institute of Medicine report To Err is Human noted that, for optimum use, technology must be a ‘member’ of the work team.46
Considering an organization's workflow and procedures and the roles of its clinical teams during system planning, design, and implementation is critical. Workflow problems after CPOE implementation were rated high on a list of concerns of representatives from 176 U.S. hospitals responding to a recent survey.34
Indeed, HIT implementations are opportunities to review existing workflow processes to ensure that all are effective and up to date, and to identify those that are unique to the institution; information gained from these reviews can guide the modifications needed to off-the-shelf HIT products. Implementation of HIT may blur the distinctions among traditional roles, such as those of clinicians, information technology providers, and administrators. When developing HIT solutions, it is necessary to balance system standardization with flexibility: standardization allows for a consistent approach to clinical care, information exchange, and related processes, while flexibility allows for customization to patient individuality, distinct clinical workflows, and HIT users' preferences.
Ongoing evaluation and monitoring of HIT systems, to measure implementation success and pinpoint unintended consequences arising during use, is important. Sittig et al recommended measures to assess system availability, use, benefits, and potential hazards.19
Campbell et al addressed the pervasive problem of system downtime, which can throw an organization's HIT-dependent operations into chaos. They advised healthcare organizations to prepare and test contingency plans so that operations can continue during downtime; these plans should include requirements for paper backup systems, procedures for operating without electronic resources, training for employees, and periodic drills to assure that the procedures function as planned and that all staff are thoroughly familiar with them.37
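One of the measures named above, system availability, lends itself to a simple illustration. In the sketch below, the downtime log format and the service-level target are assumptions, not part of the cited recommendations.

```python
# Compute availability over a reporting period from logged outages.
from datetime import datetime, timedelta

downtime_log = [  # (start, end) of unplanned outages in one month
    (datetime(2009, 11, 3, 2, 0), datetime(2009, 11, 3, 2, 45)),
    (datetime(2009, 11, 17, 14, 10), datetime(2009, 11, 17, 16, 0)),
]

period = timedelta(days=30)
down = sum((end - start for start, end in downtime_log), timedelta())
availability = 1 - down / period
print(f"availability: {availability:.4%}")  # 99.6412% for this log

TARGET = 0.999  # assumed service-level target
if availability < TARGET:
    print("below target: review backup procedures and drill schedules")
```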
Fiscal/policy and regulatory domain
The widespread, accelerated diffusion of technology resulting from recent legislation may engender unintended consequences manifested in various ways throughout organizations under pressure to accommodate the changes. Legislative and regulatory changes are moving health IT from a voluntary initiative to a highly regulated activity. Examples include revised privacy and security obligations for practices covered by the Health Insurance Portability and Accountability Act of 1996 (HIPAA), modifications to the definitions of covered entities, and new penalties for HIPAA violations.47
While ARRA holds out the promise of incentives to acquire and implement EHR systems, these efforts may introduce some unintended consequences related to its specific requirements and the fast pace of the implementations across large numbers of providers. Examples include the lack of empirical data to support the phased-in implementation of certain indicators of meaningful use, and concerns about the effect of current and future feature-oriented certification criteria on the ability of EHR vendors to innovate. Also, it is unclear to what extent the requirements for meaningful use could become barriers to HIT adoption by physicians and hospitals.48
The ONC Request for Proposal mentioned above acknowledges the need to study unintended consequences that may arise from the ARRA-driven, rapid market growth (by and large unregulated and potentially not evidence-based) of HIT vendors and software.18
The breakout group did not reach consensus on whether formal regulation of EHR software would, on the whole, be beneficial or harmful. Plenary speaker Rodeina Davis discussed the blood bank community's experience with regulated software. She reported that community's consensus: although regulation was necessary when originally introduced, advances in software development methodologies have since made formal regulation less advantageous.
Although meeting participants agreed that current EHR software, like all software, contains errors, there was debate about the impact of the rigidity and loss of nimbleness that formal software regulation entails. One anecdotal example given was that regulated software, such as that in some radiology or blood banking systems, is tied to a specific operating system version; as a result, vendors are prohibited from rapidly patching systems when the underlying operating system (eg, Windows) is compromised by a new virus or exploit. Given the evolving meaningful use requirements and the mandate for interconnection, this inability to respond to novel threats concerned some participants. A loss of nimbleness can also slow the pace of innovation: while breakthrough innovation is possible in a highly regulated environment, conventional wisdom holds that regulation slows innovation. All participants agreed that the current generation of EHR software was less than optimal, and that significant innovation would be necessary before EHRs could achieve the promise of radical transformation of the healthcare process. In the time available, participants could not reach closure on an optimal tradeoff between regulation and nimbleness.