Objective: To determine the availability of inpatient computerized physician order entry in U.S. hospitals and the degree to which physicians are using it.
Design: Combined mail and telephone survey of 964 randomly selected hospitals, contrasting 2002 data with the results of a survey conducted in 1997.
Measurements: Availability: computerized order entry has been installed and is available for use by physicians; inducement: the degree to which use of computers to enter orders is required of physicians; participation: the proportion of physicians at an institution who enter orders by computer; and saturation: the proportion of total orders at an institution entered by a physician using a computer.
Results: The response rate was 65%. Computerized order entry was not available to physicians at 524 (83.7%) of 626 hospitals responding, whereas 60 (9.6%) reported complete availability and 41 (6.5%) reported partial availability. Of 91 hospitals providing data about inducement/requirement to use the system, it was optional at 31 (34.1%), encouraged at 18 (19.8%), and required at 42 (46.2%). At 36 hospitals (45.6%), more than 90% of physicians on staff use the system, whereas six (7.6%) reported 51–90% participation and 37 (46.8%) reported participation by fewer than half of physicians. Saturation was bimodal, with 25 (35.2%) hospitals reporting that more than 90% of all orders are entered by physicians using a computer and 20 (28.2%) reporting that less than 10% of all orders are entered this way.
Conclusion: Despite increasing consensus about the desirability of computerized physician order entry (CPOE) use, these data indicate that only 9.6% of U.S. hospitals presently have CPOE completely available. In those hospitals that have CPOE, its use is frequently required. In approximately half of those hospitals, more than 90% of physicians use CPOE; in one-third of them, more than 90% of orders are entered via CPOE.
In an editorial in American Medical News, legibility, remote access, and the potential “to make users better doctors” were described as the upsides of computerized physician order entry (CPOE) use, but the downsides of typing, system rigidity, and time were cited as making implementation of CPOE systems a highly controversial topic.1 We define CPOE as a process that allows a physician to use a computer to directly enter medical orders. Physicians are not the only members of the health care team who might enter orders into a computerized system, but they are the focus of this particular study. Hospitals are being encouraged by outside forces to implement CPOE in an effort to reduce medical errors. We conducted a survey in 1997, with results published in 1998,2 to discover what percentage of U.S. hospitals had CPOE at that time and to determine how heavily used it was in hospitals that had it. We found that one-third of hospitals claimed to have CPOE available but that it was little used at these sites. An earlier survey with a low response rate had found that 20% of surveyed institutions had CPOE,3 and a study published in 2000 that was limited to inpatient medication ordering by physicians reported that less than 10% of hospitals or health systems had such systems.4 A survey of hospital information systems in Japan discovered that order-entry systems for laboratory, imaging, and pharmacy were available at fewer than 20% of reporting hospitals, but this was not necessarily physician order entry.5 A 2003 report by the Leapfrog Group (a coalition of public and private organizations founded by the Business Roundtable, which is an association of chief executive officers of Fortune 500 companies) stated that 4.1% of the reporting hospitals in a recent survey had CPOE fully implemented,6 but the sample was largely limited to certain states and urban hospitals.
During the five years since the results of our last survey were published, there have been numerous publications about the benefits of CPOE7,8,9,10 and about some of the difficulties encountered by hospitals implementing it.11,12,13 Several governmental agencies and other bodies such as the Leapfrog Group have made efforts to encourage CPOE use.14,15,16 To aid organizations during planning and implementation, a number of guides and manuals have been published as well.17,18,19,20,21 Although much attention is being focused on CPOE, no recent nationwide figures on hospital installations have been published. Therefore, we decided to send the same survey to the same sample population in 2002 that we did in 1997. The questions to be addressed here are: how widespread is the implementation of CPOE in hospitals across the United States, where is it available, and how much is it used?
The survey was originally designed to fit easily on a postcard so that respondents would immediately realize how quickly it could be filled out and they would be more likely to return it. We used exactly the same format again for this reason. Physician order entry was defined as: “direct entry of patient orders into a computer by the physician, whether using a keyboard, light pen, voice entry, mouse, or other device. This does not include entry by a surrogate or intermediary.” Although “provider” is now often substituted for “physician” in the term CPOE, we use “physician” to be consistent with the prior study. There were four questions requiring an answer on either a Likert scale or a visual analog scale (from 0% to 100% marked in quarters), as shown in . The first question was about the extent of availability of CPOE, the second concerned the level of requirement of its use (inducement), the third asked for an estimate of the percentage of physicians using it (participation), and the fourth asked for an estimate of the percentage of orders entered this way (saturation).
For the first survey, we had taken a random sample of 1,000 accredited hospitals from among those listed in the American Hospital Association Guide,19 a directory of all accredited hospitals in the United States. This sample size was more than adequate for estimating the proportion with order entry to within ± 5% with 95% confidence. Data concerning the names and addresses of contact people listed in the guide were entered into a database for generation of personalized letters and mailing labels. We accepted whatever contact person was listed, and in most cases it was the chief executive officer. For the 2002 survey, we updated the mailing list by checking it against the current AHA Guide and, when necessary, by phoning the last known number. Of the 1,000 hospitals selected for our first survey, 964 were still in existence, as far as we could determine, and updated information about these was entered into our database.
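The claim that 1,000 hospitals is "more than adequate" follows from the standard sample-size formula for estimating a proportion, n = z²p(1−p)/e². A minimal sketch of that calculation (the function and variable names are ours, for illustration only):

```python
import math

def sample_size(margin, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion to within
    +/- margin at the confidence level implied by z.
    p = 0.5 is the conservative worst-case assumption."""
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# +/- 5% margin at 95% confidence (z ~ 1.96)
n = sample_size(0.05)
print(n)  # 385, so a sample of 1,000 is more than adequate
```

With a conservative p = 0.5, only 385 responses are needed for a ±5% margin at 95% confidence, well under the 1,000 hospitals sampled.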
The Oregon Health & Science University Institutional Review Board approved the project. As with the first survey, the framework of the Total Design Method22 was used so that we could be confident the survey methodology was rigorous. A mailing was sent to each selected hospital, including a cover letter outlining the purpose of the study, a short summary of the results of the prior survey, and a self-addressed, stamped postcard asking the four questions. When it became clear that the response rate would be so low that valid conclusions would not be possible, we began making follow-up phone calls to find out why the postcards were not being returned. At the same time, we asked the four questions. When we became aware through these phone calls that the contact people listed in the AHA Guide were not forwarding the surveys to the most knowledgeable person in the organization (as we asked in the cover letter), we also probed for the name of an appropriate informant. Three trained research assistants followed a uniform script to encourage return of the surveys or, if the respondent preferred, to administer the questions once the caller reached the right person.
Respondents who answered by mail were compared with those who answered over the phone. Data from answers to the four questions were analyzed descriptively with simple proportions calculated for each question. Comments from both the postcards and the phone conversations were analyzed to find patterns.
Responses to the mail survey numbered 110 of 964, or just 11%. However, we reached an additional 516 by phone, so our ultimate response rate was 65%. In 1997, we had received 376 of 983 surveys (of the 1,000 we mailed, 17 had incorrect addresses), or 37%, so we were able to greatly improve the response rate this time with intense follow-up. We compared the phone and mail responses for each of the four questions using the chi-square test for significance. We found that in each case, the difference was significant, so we could not pool the mail and phone data, and we also could not conduct statistical tests comparing the 2002 and 1997 data. Descriptive results for each question and their proportions are shown in . Results of the 1997 survey are indicated for comparison purposes in the left columns, followed by the 2002 mail responses and the 2002 phone responses on the right. separates out the 2002 results for the hospitals that have CPOE.
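The chi-square comparison of mail and phone responses described above can be sketched as follows. The counts in the example table are hypothetical placeholders, not the survey's actual data:

```python
def chi_square(observed):
    """Pearson chi-square statistic for a contingency table
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in observed]
    col_totals = [sum(col) for col in zip(*observed)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(observed):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = mail vs. phone respondents,
# columns = CPOE available vs. not available
table = [[25, 85],
         [76, 440]]
stat = chi_square(table)
# stat is compared against the critical value for the table's
# degrees of freedom, e.g. 3.841 for df = 1 at alpha = 0.05.
```

A statistic exceeding the critical value, as the authors found for each of the four questions, indicates that the mail and phone response distributions differ and cannot safely be pooled.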
In response to the availability question, 83.7% of the hospitals responded that they do not have computerized order entry available for use by physicians. Of the 16% that have CPOE (see ), 40.6% have it available in some locations (6.5% of the total responding hospitals) and 59.4% provide it completely in all locations (9.6% of all respondents). The second question concerned inducement. Of the hospitals that have physician order entry available, 34.1% consider its use optional, 19.8% encourage it, and 46.2% require its use.
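The two denominators in these figures (hospitals with CPOE vs. all responding hospitals) can be reconciled with simple arithmetic; a sketch using the counts reported in the abstract:

```python
respondents = 626
partial = 41    # CPOE available in some locations
complete = 60   # CPOE available in all locations
with_cpoe = partial + complete  # 101 hospitals, ~16% of respondents

# Partial availability as a share of CPOE hospitals...
print(round(100 * partial / with_cpoe, 1))    # 40.6
# ...and the same hospitals as a share of all respondents
print(round(100 * partial / respondents, 1))  # 6.5
```

The same hospitals thus appear as 40.6% or 6.5% depending on which denominator is used.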
Some categories for participation and saturation listed in have been collapsed because the numbers within them are small, but the entire spread for participation and saturation is shown in . The third question asked respondents for a percentage estimate of how many physicians use computerized order entry (participation). For 45.6% of those that have CPOE available, participation is over 90%. Only 17.7% of these hospitals report that 10% or fewer physicians use it. The results for saturation (the percent of orders entered by physicians using a computer vs. other mechanisms) indicate that 28.2% of those who responded to this question have 10% or fewer total orders entered this way; 35.2% report 90% or greater saturation. The distribution for both participation and saturation is bimodal, as indicated in .
Comments on the postcards followed three themes. First, many hospitals are planning to implement CPOE or said it is coming soon. Second, a number of respondents said that their hospitals are too small to consider CPOE or that the cost is too great. Third, many hospitals continue to have order entry done by an intermediary. Interestingly, many respondents said that they could only give us minimal answers; in a number of cases, we were told the hospital has CPOE but other questions could not be answered. In many cases, we reached the appropriate person after numerous phone calls only to be told that it is against hospital policy to do surveys. These have not been tallied as responses.
We believe that the figures reported for 2002 are accurate, and they represent the best estimate of CPOE diffusion yet available. The numbers indicate that there has been a decrease in the percent of hospitals that have CPOE when one compares 1997 and 2002 proportions for availability. We believe, however, that the figures reported in 1997 were artificially inflated. For the new survey, we aggressively pursued responses from each hospital that answered the first survey, and not one indicated that CPOE had been discontinued. A number were no longer in existence or had merged with other hospitals, however. It seems likely that respondents to the first survey sometimes reported having CPOE when they did not. This could have been the result of their misunderstanding what CPOE was, despite our careful definition of it, or their not actually knowing the correct answer (the majority of respondents were administrators). The environment surrounding CPOE has changed considerably since 1997; there is increased awareness of what it is, so those answering questions about it in 2002 most likely answered correctly. The low mail response in 2002 could be the result of a hesitancy to answer “any surveys,” as we were often told by phone. Two limitations of written surveys are that explanations cannot be given to the respondent, and the person receiving the survey instrument might not be the most knowledgeable. With the phone follow-up mechanism we used, we were able to overcome these problems and, we believe, obtain more accurate answers. The fact that there are significant differences between the mail and phone answers in 2002 could further indicate problems with the mail survey methodology.
In future studies, vendors could be encouraged to provide additional data about availability, chief information officers might be specifically targeted, and more detailed data about exact use should be requested. In addition, a new random sample might be selected.
The Leapfrog survey was conducted online and primarily included certain states and urban hospitals in its regional rollout areas. Leapfrog reports that in 2003, 4.1% of respondents had CPOE fully available, whereas our figure is 9.6%. Our data also show a trend over the past five years toward a higher percentage of hospitals with CPOE requiring its use, with concomitant increases in participation and saturation. Although the causes of these trends cannot be determined from our data, the upward movement should please CPOE proponents.
The hospital study results indicate that CPOE still does not enjoy widespread implementation across the United States. There are approximately 6,000 hospitals in the United States, yet we estimate that only 9.6% have CPOE completely available. In those hospitals that have CPOE, its use is frequently required. In approximately half of those hospitals, over 90% of physicians use CPOE; in one-third of them, over 90% of orders are entered by physicians via CPOE.
This work was supported initially by a grant from Mr. Paul Mongerson and then by National Library of Medicine grant L06942-02. The authors thank Lara Fournier, MS, and Britt Ash, AB, for their assistance with data gathering; Richard Dykstra, MD, and Dean Sittig, PhD, for their help interpreting the results; and Vanessa Dorr, RN, MSN, for help with both the mailing list and interpretation.