We found that our User Centred Design format was more efficient and more usable than the CPOE Test format. We also found that the User Centred Design format was as efficient and usable as the existing Paper format. The User Centred Design format had a similar number of errors to the CPOE Test and Paper formats. Our secondary post hoc analyses showed similar results, strengthening our conclusion that the User Centred Design format was more efficient and more usable than the CPOE Test format, and similar to the existing Paper format.
Our results provide quantitative evaluative evidence of the impact of user-centred design, supporting the link between design, efficiency, and usability.3
Our physicians admit approximately 12–20 patients per day to our service, so a time difference of 5 min per ordering task is of great clinical importance. We found no differences in the number of ordering errors between formats, and a marginal increase in potentially harmful errors with our CPOE Test system compared to Paper. Our ordering error results are not comparable to field studies of medication errors with other CPOE systems, because we conducted our study in a simulated environment where our participants may have been less diligent than in usual practice.1
Regardless, the error rates in the CPOE Test and User Centred Design formats were high, so we need to make further design improvements to reduce or trap these errors. Our User Centred Design format was only a functional prototype, which lacked basic medication ordering decision support. The next iteration of a User Centred Design format would benefit from basic decision support, reminders to order important preventive treatments such as DVT prophylaxis, and cues to order the patient's preadmission medications.
We provided no training on our User Centred Design format, whereas we provided a total of approximately 13 h of training to the 27 physicians on the CPOE Test system. Despite this difference in training intensity, we found that the User Centred Design format was more efficient and more usable than the CPOE Test format. Budgets for annual maintenance (training and support) in hospital CPOE implementations have been estimated to be as high as $1.35 million for a 500-bed hospital.24
Our results raise the enticing possibility that good design could reap further dividends through reduced need for support and training.
We have used the results of our study to inform the redesign of our CPOE Test system, and we have deferred implementation until these design issues are addressed. Our user-centred design recommendations could not be implemented in this version of our vendor system due to limitations in the existing software architecture. However, our recommendations are being integrated into future versions of our vendor system.
Our study had several strengths. Our study population represented a broad range of users, from junior residents to staff physicians, with a broad range of experience with computers, our existing electronic patient record, and our existing paper order sets. Our study design controlled for variations between participants, because all participants completed tasks in all order set formats. Our efficiency results were consistent across the study sample, suggesting that prior computer experience or order set experience was not a major determinant of our results.
Our study has several important limitations. We evaluated only one CPOE system that was still under development, so our results cannot be generalized to other CPOE systems. Our participants were using the CPOE Test format for the first time, so their efficiency would likely improve with further use. Our small sample size could not detect small differences in usability or safety between the User Centred Design and Paper formats, but was able to detect statistically significant differences between the CPOE Test and Paper formats. Our estimate of task efficiency for the Paper format was imperfect because of an error in executing our study procedures. However, our conclusions were similar after making extreme assumptions to correct for this error. We did not evaluate the reliability of our method for detecting and classifying errors in the submitted orders. Finally, we focused only on the task efficiency and safety of the ordering process. We did not examine task efficiency or safety at the order verification, dispensing, or administration phases.
Other hospitals may develop a usability lab with a small investment: a room with computers, the necessary software, and the expertise of one or more usability or human factors engineers to conduct the studies. We had no difficulty recruiting volunteers to participate in our usability study, since clinicians are highly motivated to contribute to the design of clinical systems. We have also conducted usability evaluations prior to the purchase of other clinical information systems and intravenous infusion devices.25
We found that our User Centred Design format was more efficient and more usable than our CPOE Test format. We also found that the User Centred Design format was as efficient and usable as the existing Paper format. We conclude that application of user-centred design principles can enhance task efficiency and usability, increasing the likelihood of successful implementation. We have deferred our own CPOE implementation until these design issues are addressed.