Information technology can support the implementation of clinical research findings in practice settings. Technology can address the quality gap in health care by providing automated decision support to clinicians that integrates guideline knowledge with electronic patient data to present real-time, patient-specific recommendations. However, technical success in implementing decision support systems may not translate directly into system use by clinicians. Successful technology integration into clinical work settings requires explicit attention to the organizational context. We describe the application of a “sociotechnical” approach to integration of ATHENA DSS, a decision support system for the treatment of hypertension, into geographically dispersed primary care clinics. We applied an iterative technical design in response to organizational input and obtained ongoing endorsements of the project by the organization's administrative and clinical leadership. Conscious attention to organizational context at the time of development, deployment, and maintenance of the system was associated with extensive clinician use of the system.
Information technology holds great promise as a means to support clinical practice guidelines; however, many promising information systems have encountered substantial difficulties in implementation.1,2,3,4 In some cases, clinicians have low rates of using the system, for example, not interacting directly with a guideline by using the computer5 or not accessing a clinical decision support system.6 We developed a decision support system for hypertension, ATHENA DSS, that automates evidence-based guidelines for management of primary hypertension.7 We recognized that development was only the first step in implementing a decision support system in a health care system: successful implementation requires full integration into the clinical setting.8,9,10,11,12,13,14 We evaluated what was known about barriers and facilitators for guideline implementation, including experience with successful and unsuccessful implementations of information technologies in clinical settings, and designed an approach that benefited from these insights.
Our implementation has gone well. The system has run for more than 15 months in clinic sites in nine different cities that are part of three administratively separate medical centers. The system has displayed detailed individualized advisories for more than 10,000 patients. Clinicians are interacting with the advisories at substantial rates. We describe the steps that we took in implementing this system. A key feature of our approach is its interdisciplinary nature: We address technical-informatics aspects and social-organizational aspects in an integrated manner.11,12,13,14,15 Rather than describe the technical aspects of the application in isolation from its clinical and organizational context, we focus on the interrelationships of the technical design with the clinical and organizational context and aims.
“In the current health care system, scientific knowledge about best care is not applied systematically or expeditiously to clinical practice. An average of about 17 years is required for new knowledge generated by randomized controlled trials to be incorporated into practice.”16,17 The gap between scientific evidence and clinical practice can be bridged only by influencing clinician behavior to translate research findings into routine clinical practice.18,19 Several major program initiatives address this need, including the Department of Veterans Affairs Quality Enhancement Research Initiative (QUERI). QUERI includes methodologies for translation research, with an emphasis on organizational structures and process considerations.20 Translation projects ideally pay explicit attention to identification of effective strategies for organizational implementation, so that translation can be systematized for installation at other medical centers. As described in the overview paper for this series, the QUERI program defines six standard steps of the QUERI process; we describe here our work on step 4, which is “Identify and Implement Interventions to Promote Best Practices.”21
In Crossing the Quality Chasm, the Institute of Medicine16 calls for a “New Health System for the 21st Century” that recognizes the predominance of chronic disease, creates an infrastructure to support evidence-based care, and facilitates the use of information technology to translate research findings into practice. The Institute of Medicine has identified “improvability gaps” in health care, i.e., areas in which clinical practice falls substantially short of evidence-based best practices. Information technology has substantial potential to support translation projects that address improvability gaps. However, attempts to integrate new informatics technology into clinical work settings are not always successful,22 as illustrated dramatically by the termination of the electronic medical record at Cedars-Sinai Medical Center in Los Angeles.23 Integration of new technology into an organization is a “politically textured process of organizational change” that must accord primacy to the needs of the users and the organization.2 Technology use depends on the “meticulous interrelation of the system's functioning” with the work of the health care professionals.2 The implementation must be supported by administrative leadership and by future users.1 A sociotechnical approach includes fundamental incorporation of organizational factors that include both iterative technical design in response to organizational input and explicit attention to the political context of technology implementation.2 We describe the integration of a guideline-based decision support system into geographically dispersed primary care clinics in a large health care system. We describe both the technical features of the implementation developed by responding iteratively to organizational input and the interrelated process of attending to the organizational context by obtaining and maintaining endorsement of the project by the organization's administrative and clinical leadership.
We developed a system for implementing clinical practice guidelines designed to translate hypertension research findings into practice in primary care clinics. The approach provides automated decision support for primary care clinicians. The decision support system ATHENA DSS was designed as a platform-independent system for integration with an existing electronic medical record (EMR) system. ATHENA DSS uses a guideline interpreter to combine patient information from the EMR with knowledge of hypertension to generate patient-specific recommendations, explanations, and evidence-based education, which are then delivered to clinicians in a pop-up window at the time of outpatient primary care clinic visits.7,24 ATHENA DSS was constructed using the EON architecture developed at Stanford Medical Informatics for guideline-based decision support systems.25 The basic system architecture includes a knowledge base (KB) that models hypertension knowledge, a guideline interpreter, a temporal database mediator, and a custom graphical user interface. ATHENA DSS, developed in Protégé, separates the KB from the interpreter rules and the patient database, so that clinician-managers can easily browse and update the KB.
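The separation of guideline knowledge from the interpreter can be illustrated with a minimal sketch. The rule contents, thresholds, and names below are invented for illustration and do not reflect ATHENA's actual Protégé knowledge model:

```python
# Illustrative sketch only: guideline knowledge lives in a data structure
# (the KB) that clinician-managers can browse and revise, while a generic
# interpreter applies it to patient data. Rules here are hypothetical.
KNOWLEDGE_BASE = [
    # (condition on patient record, recommendation text)
    (lambda p: p["systolic_bp"] >= 140,
     "Blood pressure above target; consider intensifying therapy."),
    (lambda p: "thiazide" not in p["drugs"],
     "Consider a thiazide diuretic as first-line therapy."),
]

def interpret(kb, patient):
    """Generic guideline interpreter: evaluate each KB rule against the
    patient data and collect the applicable recommendations."""
    return [advice for condition, advice in kb if condition(patient)]
```

Because the rules live in data rather than in the interpreter's code, the guideline knowledge can be updated without changing the program logic, which is the property the Protégé-based design exploits.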
Our aim was to integrate ATHENA DSS into the primary care clinics at three VA medical centers—VA Palo Alto Health Care System (VAPAHCS), San Francisco VA Medical Center, and Durham VA Medical Center—to implement national guidelines for the treatment of hypertension. We are evaluating the impact of this guideline implementation on patient care in a randomized, controlled trial (RCT) with patients' blood pressure control and guideline–drug concordance as the primary outcome measures (results not yet available). We implemented the system first at VAPAHCS for a limited number of clinicians who would not be enrolled in the study. After we had gained experience with the implementation process, we installed the system at San Francisco and Durham, starting with the physician-investigators and later moving to a larger group of primary care clinicians for the clinical trial.
ATHENA DSS is a platform-independent system designed for integration with legacy patient data systems.26 VAPAHCS uses the national Department of Veterans Affairs' medical record system27 VistA (Veterans Health Information Systems and Technology Architecture) (largely based on M, formerly known as the Massachusetts General Hospital Utility Multiprogramming System or MUMPS), and its user interface, Computerized Patient Record System–Graphical User Interface (CPRS-GUI).
An overall organizational requirement for the initial deployment at VAPAHCS was that the system be consistent with VAPAHCS's goals for clinical practice guideline implementation and that it enhance clinician acceptance of guideline-based recommendations. Our design requirements included achievement of VAPAHCS's administrative goals, acceptability to clinician-users through a simple user interface, and consistency with the RCT protocol. Our integration objectives for ATHENA DSS followed directly from these requirements.
To accomplish these goals, we involved representatives of the organization in the technical design in an iterative process, selecting technical features to enhance acceptability of the system to clinicians and administrators: a sociotechnical approach.
We included three VA medical centers: VAPAHCS, San Francisco VA Medical Center, and Durham VA Medical Center. VAPAHCS, the initial site, is a large integrated health care system in mid-coastal and central California, spanning more than 13,000 square miles, with a tertiary care hospital at the main campus and a network of sites with subacute units, long-term care, and outpatient clinics. We included clinic sites of VAPAHCS located at Palo Alto (PAD), Menlo Park (MPD), San Jose (SJC), Monterey (MON), Capitola (CAP), Livermore (LD), Stockton (STC), and Modesto (MOD). Driving time from the main campus at Palo Alto to the outer sites is greater than two hours even in optimal traffic conditions. Primary care clinicians with drug prescribing privileges include approximately 55 attending physicians, 40 resident physicians, and seven nurse practitioners and physician assistants. Clinic sites include many shared-use areas in which primary care clinicians and specialists use the same computers. In the primary care areas of VAPAHCS, there were 146 computers from various manufacturers, running either Microsoft Windows NT or Windows 2000. All are networked to the central VistA computers located in Palo Alto.
San Francisco and Durham VA Medical Centers' primary care clinics included in the study are both located in a single building, with a smaller number of computers (approximately 25 at each site). Clinicians at all sites possess a wide range of computer experience, including some who are adept and others who are beginners.
We describe features of the technical deployment designed to support the organization's requirements for system operation, focusing on the aspects of the technical design that addressed clinical and organizational aims.
We developed a method for transferring existing patient data to ATHENA DSS. An M program extracts patient data from VistA each night, based on the following selection criteria: patient has a diagnosis of hypertension; patient has a scheduled appointment in a primary care clinic (general medical clinic, geriatrics clinic, or women's health clinic) within a five-day window (encompassing Friday as the prior workday for Monday clinics). For each patient identified, the following data are extracted: diagnoses; result(s) of 14 selected laboratory tests together with the date of the test; all prescribed drugs in the pharmacy database with date, dose, and number of pills dispensed; and all blood pressure, pulse, height, and weight measurements with date. The extracted patient data file is sent by ftp to the ATHENA server. We precompute advisories on all patients whose data are extracted each night.
Our system minimizes the code installed on each client computer in the clinics. We use a client-server architecture where the guideline interpreter and temporal database mediator run as server processes on the ATHENA server. We install the ATHENA Client software that manages the pop-up window centrally on the ATHENA server. We install on client machines only stable commercial software that requires no update during the course of the clinical trial and a small client program that monitors messages from the CPRS. When the ATHENA knowledge base is updated, for example, to reflect refinements in the wording of messages, the new knowledge base is installed only on the server.
When the CPRS notifies ATHENA that a clinician has opened a patient record, ATHENA Client requests an advisory for that patient from the guideline interpreter. When ATHENA receives the advisory, it generates a pop-up window within the CPRS-GUI window containing recommendations based on the available patient data. The pop-up window includes tabs and buttons that allow clinicians to access additional screens and enter new data. Clinicians can easily bypass the window by clicking outside it or by closing it. A sample advisory is shown in the figure. When the clinician enters additional patient data, for example, blood pressure measurements or diagnostic information not previously available, and requests an updated advisory, the information is transmitted to the server and the pop-up window is refreshed with the updated advisory. Clinicians may enter feedback about the program either by clicking on a checklist for each drug recommendation or by entering free text in a comment box. The BP-Prescription Graphs tab displays all the blood pressures and antihypertensive agents for the patient on the same time line, with a scroll bar for viewing older data. Graphical displays of clinical information about hypertension can summarize and clarify complex interrelationships.28
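The request-and-refresh cycle described above can be sketched as follows. The class and method names are hypothetical, not ATHENA's actual interfaces; the real client listens for CPRS patient-selection messages and communicates with the guideline interpreter on the ATHENA server:

```python
# Illustrative sketch of the client-side advisory flow. "server" stands
# in for the ATHENA server's guideline interpreter; "window_text" stands
# in for the contents of the pop-up window inside the CPRS-GUI.
class AthenaClientSketch:
    def __init__(self, server):
        self.server = server        # exposes get_advisory(patient_id, extra_data)
        self.window_text = None     # current pop-up window contents

    def on_patient_opened(self, patient_id):
        # CPRS notified us that a record was opened: fetch and display
        # the precomputed advisory for this patient.
        self.window_text = self.server.get_advisory(patient_id, extra_data=None)

    def on_clinician_entry(self, patient_id, new_data):
        # The clinician entered new data (e.g., a blood pressure) and
        # requested an update: send it to the server and refresh.
        self.window_text = self.server.get_advisory(patient_id, extra_data=new_data)
```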
Data logging aids user acceptability, by capturing feedback that can be used to refine the system, and supports the goals of the clinical trial. The patient data and the precomputed advisories for all patients are logged, providing a record of the recommendations that would have been generated for both control and intervention groups, regardless of whether they were displayed to the clinician. The system captures patient data entered by the clinician-user, the updated recommendations displayed, and clicks indicating that the user viewed additional screens.
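A minimal sketch of this event logging, assuming invented field names rather than ATHENA's actual schema:

```python
import datetime

def log_event(log, event_type, patient_id, payload):
    """Append one timestamped event to the log. Event types here are
    illustrative, e.g. "advisory_precomputed" (recorded for control and
    intervention patients alike), "data_entered", or "screen_viewed"."""
    log.append({
        "time": datetime.datetime.now().isoformat(),
        "event": event_type,
        "patient": patient_id,
        "payload": payload,
    })
```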
Translation projects typically unite people from disparate cultures and activities into what may be, at least initially, an unwieldy team. Teams need time to develop skills.29 Good teamwork requires that all the members are trying to succeed at the same game and have a common understanding of the rules and the language for communication. Academic informaticians, health services researchers, physician domain experts, and hospital information system specialists typically have disparate disciplinary perspectives and cultures. Consequently, uniting them on the same team with common overall objectives requires effort and commitment to develop a shared vocabulary and patterns for handing off tasks. We approached this challenge by starting with small tasks and using bridge personnel, i.e., individuals who had familiarity with at least two of the four disciplines. As our team gained experience in working together, we developed patterns of communication and identified the most effective role assignments. One of the key roles that emerged was the database/pharmacist who ensured effective communication between the Health Care System's VistA experts and the medical informatics experts.
Effective implementation requires a realistic initial assessment of local facilitators for and barriers to implementation and periodic reassessment as the project develops. We describe these barriers and facilitators in the following sections.
The goal of the project, to improve care of patients with hypertension by implementing the VA guidelines, aligns with general institutional goals. The initial grant proposal for the project was discussed with the Chief of the Information Resource Management Service (IRMS) and the Chief of Staff, who both wrote letters of support for the project. The proposal included an implementation phase followed by a clinical trial phase. During the implementation phase, we took steps to ensure that the project remained well integrated with the administrative goals of VAPAHCS. We identified several physician-administrators (all general internists) whose approval and authorization were essential for successful implementation. These included the physician with oversight of VAPAHCS outpatient clinics, of which the primary care clinics are a subset, and the physician with oversight of the large block of satellite sites in the east San Francisco Bay area, including the LD, STC, and MOD sites. Several months before the anticipated launch of the clinical trial, we gave physician-administrators access to ATHENA DSS and offered them the opportunity to assess (1) the time involved in viewing the pop-up window and the impact on clinical work and (2) the clinical content of the knowledge base to ensure its consistency with VA guidelines. We were able to provide the system to selected individuals without activating it for clinicians who would later be enrolled in the clinical trial. We encouraged the physician-administrators to provide feedback to us, which we then addressed in redesign. For example, one physician-administrator requested that we display the pop-up window the day before the scheduled clinic visit as well as on the day of the visit because he encouraged outpatient clinic physicians to review their charts the day before the visits to allow the clinic schedule to run on time (visit planning). We revised the system accordingly.
Early in the planning stage, we met with IRMS administrators and networking staff to outline our plans and obtain their input and approval, and we maintained close contact with the IRMS staff during planning and implementation. The IRMS staff provided support in several key areas including installing the project server in the IRMS server room so that it could benefit from backup power supply, air conditioning, and optimal network connections; programming the M patient data extract; and network support.
We presented the project to the Health Care System's Medical Informatics Committee to gain feedback from key personnel. We also discussed the project with the CPRS Implementation Coordinator at several phases and redesigned in response to comments. For example, we added a second patient identifier to the pop-up window when the coordinator requested the presence of two identifiers to enhance patient safety.
Successful guideline implementation requires local clinical opinion leader “buy in” of the clinical content. Clinicians must be assured that the guideline recommendations are well founded. The recommendations presented should be based on sufficient backing.30 In our case, national guidelines provide overall backing for the recommendations. For the recommendations based on compelling indications, we display the evidence base supporting the recommendations.31 However, automation of guidelines requires translation of guideline knowledge into computable formats, which involves supplemental knowledge and decisions about how to resolve ambiguities in the guidelines.32 Additional backing is needed for supplemental knowledge and for choices about resolution of ambiguities. One of our experts directed the specialized Hypertension Clinic in the Health Care System and served as the primary domain expert for hypertension, enhancing clinician-users' confidence in the recommendations made by ATHENA DSS. We consulted with medical center experts in nephrology, cardiology, and rheumatology for some specialized content.
Clinical opinion leaders must also be confident that the guidelines apply well to their own patient population. We recruited several physicians to assist with review of our guideline implementation in ATHENA DSS. In addition to the physician-administrators described above (one of whom was also the medical center's overall guideline implementation leader), we recruited the supervisor of the general medical clinics at the Palo Alto site and the primary care chief resident as physician-monitors. We shared the knowledge rules used in ATHENA DSS, gave them individual training sessions in use of the system, activated the system at their clinics, and encouraged them to comment directly and to use the feedback features built into ATHENA DSS as described above.
Implementing the system at additional medical centers, under different administrations, presented a new set of issues. It is more difficult to achieve and sustain enthusiasm for the new technology at sites other than where the system was developed and tested. With funding from the VA Health Service Research and Development service to study the impact of hypertension guideline implementation using ATHENA, we were able to implement the ATHENA system at two additional VA medical centers (San Francisco and Durham). The San Francisco VA Medical Center is in the same VA regional group, the Sierra-Pacific Veterans Integrated Service Network (VISN) 21, as VA Palo Alto Health Care System but has its own medical center director and administrative structure completely separate from VAPAHCS. The Durham VA Medical Center has a separate administration and is located thousands of miles from Palo Alto. At these medical centers, we faced and addressed unique implementation issues that did not occur at Palo Alto.
Both San Francisco and Durham had on-site physician-investigators who were primarily responsible for overseeing implementation, but neither of these physicians had special training or experience in informatics. We designed the system to run with as little maintenance as possible required from the local site. The study funds provided for a half-time research assistant at each site, who was the only funded support for all aspects of the project at that site. The additional sites recruited research assistants who had enough computer skills to work with the staff at Palo Alto to determine that the system was up and running and to do initial steps in troubleshooting problems.
Local IRMS personnel at the additional sites had less invested in seeing that ATHENA worked properly than the IRMS personnel at Palo Alto. A small but crucial amount of support was required from the IRMS to install the extract program and to mount the software on the client computers in the clinics. In San Francisco, the VISN chief for IRMS facilitated this process. It was important for the developers at Palo Alto to recognize the need for IRMS staff at each site to review the software to assess its impact on network traffic and security. Our stance was to encourage and cooperate with every review of the system that the local IRMS staff deemed important. In addition, a Palo Alto staff member with extensive IRMS experience (RC) visited Durham for two days to work directly with the IRMS staff on installation.
At Durham, ATHENA DSS was run from a local server set up by the project. Because the ATHENA server was part of a research project, it was outside the main computing infrastructure of the IRMS at Durham and required an on-site person to maintain and update changes in that server. This person was the information officer for the research group participating in the study that brought ATHENA to the site. For wider implementation, ownership for this server, its maintenance, and updating will need to be assumed by the hospital IRMS.
Special approval from the facility clinical guidelines committee was required at the Durham VA Medical Center. The medical center director had to know of the project, and both the medical center and VISN directors had to approve the implementation, particularly since other initiatives were in development that would have had overlapping goals (e.g., a clinical reminder in the CPRS directed at patients whose blood pressure was higher than the recommended target range).
Since the VAPAHCS study site included clinics located in seven different cities across a wide geographic area, it was not feasible to bring the clinicians together for training. Training at VAPAHCS was accomplished in a short (typically ten-minute) telephone call with a member of the project staff. For the Durham and San Francisco sites, a training slide show was created in Palo Alto and provided to the lead clinician investigator at each site for local customization. The lead physician-investigator at each of those two sites then conducted a group training session for the clinicians, with individual follow-up sessions by either the physician-investigator or the research assistant with clinicians who missed the group session. As a result, training at the Durham and San Francisco sites was conducted by people who were not directly familiar with the development and nuances of ATHENA. As with all clinician training, it was difficult to achieve 100% compliance, even with both group and individual training sessions.
Clinicians receive many clinical reminders. One effective means to sustain clinician interest was to provide quarterly feedback on guideline–drug concordance for hypertension. Clinicians are aware of the evidence supporting treatment of hypertension to lower cardiovascular risk; presenting them with their medical center's and their individual rates of adequacy of control of blood pressure appeared to sustain interest. This often stimulated more questions and renewed interest in achieving stated goals or questions about how patients or outcomes were chosen. Providing a forum for these questions was important. At the Durham VA Medical Center, this occurred in the monthly clinical staff meetings.
The physician-investigators in San Francisco and Durham were not the original developers of ATHENA DSS and did not initially have a sense of “ownership” of the knowledge base. Lack of familiarity with the program could potentially interfere with their enthusiastic endorsement of it to their colleagues. We addressed this potential barrier by providing opportunities for the physician-investigators to “test drive” the system with their own patients in real time in their own clinics and by reviewing panels of patients in their office. We also discussed knowledge-base issues frequently in telephone conference calls, developing a group consensus about how to handle medical questions. Over time, this process built confidence that the knowledge in the system is state-of-the-art and correct.
In the first 15 months of the clinical trial, 91 primary care clinicians from the three sites were assigned to the ATHENA study arm. The ATHENA system displayed ATHENA advisories for 10,806 distinct patients. Clinicians entered a new blood pressure and updated the advisory for 34% of patients. Clinicians interacted with the advisory screen in this or other ways for 63% of patients. Use of the system remained high throughout the 15 months. These rates of clinician use contrast sharply with reported rates of other clinical decision support systems, which are accessed only a very small percentage of the time that they are available.5 Clinicians entered free-text messages into the advisory's comment field for 747 patients; approximately half included an explanation of why a particular recommendation was not being followed. This extensive use of the system by clinicians in practice speaks to the usability and usefulness of the system.
ATHENA DSS is being installed at additional VA medical centers. Plans are under way to update the knowledge base in light of recent changes to hypertension guidelines and to extend the knowledge base to include diabetes and hyperlipidemia, two other important cardiovascular risk factors.
Although ATHENA DSS recommendations use patient data from VistA and appear in a pop-up window within the CPRS-GUI cover sheet for the patient selected, the system does not currently write patient data to VistA. In future work, we hope to further the integration with VistA so that, for example, blood pressure measurements entered into the ATHENA advisory window to obtain an updated recommendation may also be written to the patient's medical record, and drug recommendations made by ATHENA can generate a physician order-entry screen. We are surveying the clinicians for comments on the system. We hope to obtain input from clinician-users to inform redesign of the user interface.
Information technology can help translate research findings into practice, but implementations present organizational challenges. Integration of a platform-independent system into a legacy electronic medical record is difficult. In this case, the EON architecture software developers at Stanford Medical Informatics were not themselves employees of the institution (VA) implementing the system. Our approach of building a collaborative team bridging the necessary institutions and disciplines, soliciting and addressing the organizations' interests in the technical design, and maintaining close contact with the local administration, hospital information systems group, and clinical opinion leaders has led to a successful implementation in a complex environment. We have computed and displayed detailed patient-specific advisories to clinicians for thousands of patients.
In addition to the success of the deployment as measured by the display of advisories, we also have good indications that the advisories have captured the clinicians' interest. The clinicians are interacting with the displayed advisories at a very substantial rate. These time-pressured clinicians, who are generally fully booked in clinics with complex patients, entered hundreds of free-text comments suggesting their interest in the system. Achieving this degree of implementation was possible only through the cooperation of a large number of individuals. We believe that this was achieved in large part because of the careful attention to sociotechnical integration.
One of the accomplishments of this project is system implementation at sites that do not have a medical informatics–oriented lead physician. Information technology that can be implemented only at sites with highly involved informatics-oriented physicians would be quite limited in dissemination. We aimed to develop a system that could be implemented in a manner highly integrated with clinical care without an on-site physician with strong informatics skills. Our successful integration at two additional medical centers, both geographically and administratively separate from the main site, speaks to the robustness of the technology.
Recent papers have reported a lack of impact of computerized decision support for chronic disease in primary care, in contrast to the prior success in preventive medicine and other areas. Computer-based cardiac care suggestions had no effect on physicians' adherence to recommended care.33 Eccles et al.6 found that a system implementing guidelines for asthma and angina had no significant effect on processes of care. They noted that, for much of the study, the median number of clinician interactions with the system was zero. We do not yet have our clinical trial results for impact on clinician guideline adherence, but we have found substantial rates of clinician interaction with the system. Eccles' group conducted a parallel interview study to understand factors influencing system adoption and found that respondents thought the system did not fit well within their practice context.34 Our higher rates of clinician use of the system may reflect the attention given to sociotechnical integration. We anticipate that the information that we have collected and reported on clinician use of the system and on organizational factors will illuminate the trial results whether positive or negative.
We used Berg's inspired term sociotechnical to describe our approach to integrating social-organizational issues with the technical-informatics issues.2 The work that we describe in this paper was not a sociological study of an informatics implementation; rather it was a description of both social and technical aspects of an implementation done deliberately using approaches that take account of sociotechnical issues to maximize the likelihood of success.
Berg15 has also discussed the ways in which introduction of new information technology to the medical workplace can change the way that work is done. Future work could explore ways in which this may happen at the sites using the ATHENA DSS or other decision support systems for guideline implementation. Researchers could also investigate changes to the system design and the user interface that could improve the system further. For example, an advisory is displayed for every hypertensive patient who meets the eligibility criteria for the program, and the advisory includes a large amount of information on the top level. Ash et al.22 noted in a recent paper on unintended consequences of information technology in health care that “decision support systems could trigger an overdose of reminders, alerts, or warning messages.” In future work, focus groups with clinician-users could help guide design changes. For example, more selective triggers for the appearance of the advisory and/or a smaller pop-up window that alerts the user to the presence of an advisory that is displayed only if the clinician clicks to request it could be incorporated to lower the risk of decision support overload.
Our approach to implementing new information technology addresses both social-organizational issues and informatics-technical issues in an interrelated manner and can be applied to cross-platform and cross-institution implementations in other settings.
Supported in part by VA HSR&D CPI-99-275 and RCD-96-301 and NIH LM05708 and LM06245.
The authors thank the Director and staff, in particular IRMS and primary care leadership, at VA Palo Alto Health Care System, Durham VA Medical Center, and San Francisco VA Medical Center. The authors also thank the clinicians who have participated in monitoring and using ATHENA DSS.
The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs.