There have been few reports on the usability of structured, synoptic clinical reports. The aim of the present study was to compare the efficacy, efficiency, and user friendliness of standard electronic patient documentation and a structured, synoptic, bladder cancer-specific clinical documentation tool (eCancerCareBladder) in a randomized study in a controlled laboratory setting. The results suggest that the use of eCancerCareBladder
results in faster and more accurate reconstruction of medical history compared with review of standard EMRs. Producing the structured report, including some text typing in predetermined entry fields and completing drop-down menus, did not take significantly longer than completing a standard narrative dictated report, but it resulted in significantly better report quality in terms of recorded data. Finally, the vast majority of clinicians participating in the study preferred to use eCancerCareBladder. The improved quality of clinical notes probably results from predetermined data entry fields, which reduce the amount of data missing from clinical notes; the DePICT feature is especially valuable when medical history data are reviewed. Events that span a longer period of time (eg, serial bladder instillation in our scenarios) are captured particularly efficiently using the DePICT feature, compared with opening multiple clinical notes in traditional EMRs. Our results are in agreement with previous findings suggesting user preference for graphical entry and drop-down menus over text entry; embedded objects (data entry lists) and templates are also appreciated.20
These features may result in more consistent documentation, fewer documentation errors, and increased compliance.21
This should translate into improved patient safety.22
eCancerCareBladder is an example of bottom-up software design with successful adoption, an approach characterized by participation of end users in all stages of development.23–29
Clinical input has led to several efficient and user-friendly features, such as the timeline function, lists, drop-down menus, shortcuts, and direct data entry. Our results suggest that eCancerCareBladder
is superior to EMRs in ease of software use, clinical user satisfaction, and general software preference.
Successful clinical adoption of applications can benefit from formal usability testing of software during development as well as after implementation. Evaluating the success of an information system has several dimensions, encompassing system quality, information quality, user satisfaction, individual impact, and organizational impact.30
The last four cover the effectiveness or influence of an informatics system. In evaluating the effectiveness of our new bladder cancer information system, we focused on the individual impact on physicians, timeliness, quality of reported data, and user satisfaction. Our own previous unpublished research has indicated good user uptake and utilization of the system. We chose to conduct a controlled, comparative, randomized study in a human laboratory setting. The clinical scenario was simulated using real-life EMR and eCancerCareBladder cases, and cystoscopy findings were also presented as video recordings of real-life cases. This approach is novel and allows investigation of various details of the end product (in this case, a clinical report). Furthermore, the time used during the various steps of the process can be measured exactly. An alternative study design could be a retrospective comparison of notes produced with the two methods, but this could introduce several biases; the type of case could potentially affect the software chosen (eg, a simple case dictated, a complicated case documented with the synoptic tool). Another design could be a prospective evaluation of the two systems, in which clinical notes would be produced with both systems for a period of time. Biases would be minimized, but the study would impose a significant work burden and potential delays in the workflow of the clinics. It would also be challenging to capture the timeliness of the systems used.
The limitations of this study include the artificial study environment, in which participants were not rushed or disrupted, as is often the case in busy clinical practice. Further, the sample size was not large, limiting the power available to detect differences. On the other hand, the controlled laboratory environment allowed more precise analysis, and the cases used to study the two software programs were identical. We did not conduct any cost analysis, which is obviously an important aspect when implementation of software is considered. Previous cost–benefit analyses have suggested that implementing electronic notes reduces transcription costs and duplicate data entry, while improving the quality of decision support through graphical displays and searchable databases as a means to view a linear record.31
Another neglected aspect of clinical software is the ability to query a database for research purposes. The eCancerCareBladder
database has been used for bladder cancer research since its development in 2005, and, to date, more than 10 reports based on analyses of the database have been published in peer-reviewed journals. Although the study was conducted in a dedicated uro-oncology practice at a tertiary care referral center, we expect the results to be generalizable, as the clinical scenarios presented are typical of cases seen daily by general urologists in community settings as well. Also, although some participants were experienced uro-oncologists, the group represented all levels of expertise.