As a result of our work to date, we have learned several lessons that are more broadly applicable to decision support systems, and especially to ordering systems that present critical order checks to practitioners. First, the rules and logic that govern order checks should be understandable, editable and maintainable by system operators and users. CPRS order check rules are created centrally and are meant to serve many local VA hospitals. This model is similar to that of commercial systems whose knowledge bases and rule sets are meant to serve a wide range of customers. Advantages of commercial knowledge bases include their comprehensiveness and their ability to draw on a larger pool of expert opinion, but a drawback is their high sensitivity, which results in frequent order checks. 11
We believe that the VA's centralized order check development model exhibits this same trade-off. Individual VA hospitals have the technical ability to customize or adjust rules, but doing so is a significant undertaking without easy-to-use tools and well-defined organizational processes. We agree with Kuperman et al., who recommend that drug knowledge base creators provide the tools necessary to understand, customize and share rule information and that organizations create the policy and procedure infrastructure to support the use of these tools. 12
Second, system behavior should be easy to monitor, and ease of evaluation and the development of built-in evaluation tools should play a more significant role in system design. As we have documented, it is particularly difficult to retrieve information about cancelled orders from the VA's CPRS system. Yet without this information, we cannot measure override rates and thus cannot assess how often users overrule CPRS drug-drug and drug-allergy interaction rules. Because there is no native order check evaluation tool, we had to develop our own local evaluation method in 2001 and again in 2006, using different techniques each time. Although CPRS is used at all VA hospitals, reproducing our study at other sites would require significant effort because of variation in local implementations and in available human and technical resources.
Finally, we agree with Abookire et al. that system behavior in general should be evaluated periodically, especially after significant changes to order check rules, ordering policies, or software features. 4
Clinical decision support systems are expensive, complex systems that must be tightly integrated with other hospital information systems. Without periodic evaluation, it is difficult to know how these systems are actually being used, and monitoring may alert system operators to the unexpected impact of changes in the environment. Clearly, evaluation of ordering and override rates would be warranted after major changes in the design and features of an order check system, such as those suggested by other researchers (categorized override reasons, tighter integration with the maintenance of patient allergy lists, suppression of renewal order checks for previously tolerated medications) or, in our case, the addition of non-VA medications and changes in topical medications. 4,5,7
Less obvious, perhaps, is that indirect changes, such as shifts in patient population, house staff, or system policies, could also have unexpected effects on order checking and must be monitored as well.
Comparing these results to those from 2001, we noted a statistically significant increase in the overall rate of high severity order checks from 0.5% to 2.5%. This is due in part to the introduction of new critical order check types such as “No patient allergy assessment” and to possible changes in the logic of previously existing order check types. We also speculated that new VA Puget Sound allergy policies might have contributed to the much higher number of drug-allergy order checks. Previously only pharmacists could enter patient allergies, but a new policy permits practitioners, nurses and dieticians to enter allergies as well; the ability to remove allergies, however, remains limited to pharmacists. In addition, during the period between the two studies, VA Puget Sound began standardizing allergy data by disallowing free text allergy entry, matching existing free text allergies to drug file allergies, and removing any unmatched entries. Any new locally standardized allergy terms are submitted to a national data standardization process to be added to the national drug file. We did not control for these factors in our study design, but we think it likely that these new policies were unanticipated contributors to changes in order check behavior.
As van der Sijs et al. concluded in their review of drug safety order check studies, error factors can unwittingly originate at many levels, from the individual to the organization, and maintaining both high sensitivity and high specificity of order checks is one of the challenges of decision support systems. 3
Frequent order check evaluation, informed by knowledge of the system and its environment, could help system operators adjust and improve their decision support systems before problems such as distrust or order check fatigue become an issue. Override rates would presumably be one component of such an evaluation, but because they measure only the final step in practitioner order entry, other evaluation methods such as behavior observation and work analysis should also be used to paint a richer picture of order checks and ordering behavior.
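As a rough illustration of the override-rate component of such an evaluation, the sketch below tallies practitioner responses to interruptive order checks and reports the override and cancellation rates. The counts and field names are illustrative assumptions for the example, not data or a schema from this study.

```python
# Minimal sketch of an override-rate tally; the counts are
# illustrative assumptions, not data from this study.
responses = {"overridden": 850, "cancelled": 150}

total = sum(responses.values())  # total interruptive order checks
override_rate = responses["overridden"] / total
cancel_rate = responses["cancelled"] / total

print(f"override rate: {override_rate:.1%}")  # → override rate: 85.0%
print(f"cancel rate: {cancel_rate:.1%}")      # → cancel rate: 15.0%
```

In practice these counts would come from the prospective order log rather than a hand-built dictionary, and would be broken down by order check type, as in our analysis.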
Our study is similar to other quantitative studies that have reported override rates that are generally considered high. 4,7,13,14
However, our purpose was not only to show current override rates at VA Puget Sound but also to demonstrate and discuss issues regarding local monitoring of practitioner order check override rates in a centrally developed CPOE system as part of ongoing quality assurance.
There is significant interest in the medical informatics community in improving CPOE systems such as the VA's CPRS by reducing override rates through the elimination of clinically irrelevant order checks. Shah et al. report higher practitioner acceptance of order checks when only a subset of the original drug database was used and when only the most critical order checks required practitioner action before signing, and Weingart and colleagues recommend that clinically irrelevant order checks be suppressed. 7,14
Current VA order check logic is based in large part on VA drug classes, which often group pharmacologically unrelated medications; this design is believed to contribute to the unacceptably high CPRS override rates that studies such as ours continue to document. To address this source of clinically irrelevant order checks, the VA has purchased a proprietary database that bases drug-drug and drug-allergy interaction checking on specific chemical structure rather than broad drug classes and offers new features such as dosage checking. In addition, the VA is exploring other features, such as expanded laboratory finding order checks and the incorporation of co-existing problems and patient characteristics including age, gender, and potential for pregnancy.
It is worth noting that, although we did not qualitatively evaluate the clinical relevance of the order checks in our data set, other studies show a surprisingly large percentage of order checks to be clinically relevant compared to the corresponding override rates. A high override rate therefore does not necessarily mean that a high percentage of order checks are clinically irrelevant. In the study by Weingart et al, 41% of drug-drug and 24% of drug-allergy order checks were deemed inappropriate, leaving the majority of order checks, in fact, appropriate. 7
However, the same study measured 89% drug-drug and 91% drug-allergy override rates, implying that many clinically relevant order checks were being overridden. Similarly, in a study by Hsieh that reported an 80% override rate for drug-allergy order checks, 55% of the override reasons fell into the “Aware/Will Monitor” category, indicating that the majority of these order checks may have been clinically relevant as well. In our case, although we report a very high override rate and show that it has stayed high over time, we do not believe that reducing this rate should necessarily be a goal in itself, without consideration of other factors of practitioner work.
Many overridden order checks may be clinically relevant or there may be a wide variation in perceived clinical relevance of order checks, as Spina et al conclude. 15
Certainly, we take seriously the problems associated with high override rates: informatics research has appropriately focused on practitioner acceptance of decision support systems, and a system that includes many order checks forcing the practitioner to respond can be disruptive and perceived as a nuisance. However, one recent study of perceptions of CPRS order checks suggests that practitioners may be more accepting of “false positive” order checks than previously reported and, furthermore, that these false positives may have a neutral to positive impact. 16
It is important to remember that 15% of the order checks in our study resulted in a cancelled order that presumably enhanced patient safety. Override rates should not be used as the only gauge of system performance because these numbers do not reveal the practitioners' decision-making process; they capture specific, recordable, yes/no decision points that may not accurately reflect the complexities of that process. In the analysis of our log data, we observed that some overrides occurred after the cancellation of an initial order with the same orderable item and order check, possibly indicating that the practitioner thoroughly considered the order check before re-entering the order and overriding it. For other orders, practitioners entered override reasons containing only a space or period character, possibly indicating that the practitioner barely looked at the order check before overriding or felt it was a nuisance not worthy of explanation. More research is needed to better understand ordering and order check behaviors and their relationship to information needs, decision making, system quality, and ultimately patient outcomes.
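The two log-analysis observations above (near-empty override reasons, and overrides that follow cancellation of an order for the same item) can be expressed as simple checks over order log records. The sketch below is a hedged illustration only: the record fields (`item`, `action`, `reason`) and the sample entries are invented for the example and do not reflect the actual CPRS log schema.

```python
import re

# Hypothetical order-check log; field names and entries are invented
# for illustration and are not the actual CPRS log schema.
log = [
    {"order_id": 1, "item": "warfarin",   "action": "cancel",   "reason": None},
    {"order_id": 2, "item": "warfarin",   "action": "override", "reason": "benefit outweighs risk"},
    {"order_id": 3, "item": "aspirin",    "action": "override", "reason": "."},
    {"order_id": 4, "item": "lisinopril", "action": "override", "reason": " "},
]

def trivial_reason(reason):
    """True if the override reason is empty or contains only
    whitespace/punctuation (e.g. a lone space or period)."""
    return reason is None or re.fullmatch(r"[\s\W]*", reason) is not None

overrides = [r for r in log if r["action"] == "override"]

# Overrides whose reason suggests little or no engagement with the check.
trivial = [r for r in overrides if trivial_reason(r["reason"])]

# Overrides on an item that was previously cancelled, suggesting the
# practitioner reconsidered the order check before re-entering the order.
cancelled_items = {r["item"] for r in log if r["action"] == "cancel"}
reconsidered = [r for r in overrides if r["item"] in cancelled_items]

print(len(trivial), "trivial override reasons;",
      len(reconsidered), "override(s) after cancellation of the same item")
```

Classifying overrides this way separates reasons that carry no information from those that hint at deliberate reconsideration, which is one way a built-in evaluation tool could triage override data for review.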
Because CPRS does not save all cancelled orders, we used a prospective logging system to capture orders as they were entered. For this study, we were unable to use the same logging methods as in our 2001 study, although the underlying system, CPRS, and the study measurements (orders, order checks) were the same. As discussed above, within each order check type we chose to analyze the order override rate because the practitioner is presented with a single interruptive window containing all of the order checks. In comparing our results to previous work that analyzed order checks separately, we acknowledge the possibility that the previously reported drug-drug order check override rate may be slightly lower than the order override rate. This highlights the difficulty of using retrospective analysis of CPRS orders to determine the relevance of individual order checks when several order checks appear on the same screen and the order is signed with a single override reason. We believe this supports our recommendation that qualitative work be carried out in parallel with quantitative order check analysis to analyze user behavior and order check effectiveness.
Also, we sampled orders at different times of the year. For this study, we analyzed orders over 6 days in early January, excluding the intervening weekend, whereas the 2001 study analyzed orders entered during a continuous week in early August (August 1, 2001 through August 8, 2001). 6
It is possible that variation in the experience of ordering practitioners (primarily house staff) influenced the results.
It is challenging to identify which factor or combination of factors, both technical and social, may have contributed to new system behaviors, including significantly higher drug-allergy order check and override rates. Our study did not control for many possible changes in the environment, so we cannot say with certainty what caused the increases we report. We speculate that the addition of non-VA medications or changes in hospital policy supporting more comprehensive allergy documentation may have affected override rates, and we recommend further research to determine whether this is the case.