Usability testing was used to evaluate whether a new technology, a digital
pen and paper system, would be usable for hospital nurses. Twenty-one
nurses in a Labor and Delivery unit were randomly assigned to two
groups, and a crossover design was used to compare the digital pen and
paper system to conventional pens. Data collection included observations, interviews, and
a questionnaire. Results showed that nurses had
a positive attitude toward the system and could foresee its potential
benefits, but they found that in its current design the system had poor
usability and interfered with nurses’ work practices. Usability
testing provided important insight into the needs of nurses and the
suitability of this technology. This study is an example of how a user-centered
approach can improve our understanding of the real needs
of nurses and contribute to the design of useful and usable technologies.
Despite recommendations that patients be involved in the design and testing of health technologies, few reports describe how to involve patients in systematic and meaningful ways to ensure that applications are customized to meet their needs. User-centered design (UCD) is an approach that involves end-users throughout the development process so that technologies support users' tasks, are easy to operate, and are of value to users. In this paper we provide an overview of UCD and use the development of Pocket Personal Assistant for Tracking Health (Pocket PATH) to illustrate how these principles and techniques were applied to involve patients in the development of this interactive health technology. Involving patient-users in the design and testing ensured functionality and usability, thereby increasing the likelihood of promoting the intended health outcomes.
user-centered design; interactive health technologies; lung transplantation; self-monitoring; handheld computers
Developing functional clinical informatics products that are also usable remains a challenge. Despite evidence that usability testing should be incorporated into the lifecycle of health information technologies, rarely does this occur. Challenges include poor standards, a lack of knowledge around usability practices, and the expense involved in rigorous testing with a large number of users. Remote usability testing may be a solution for many of these challenges. Remotely testing an application can greatly enhance the number of users who can iteratively interact with a product, and it can reduce the costs associated with usability testing. A case study presents experiences with remote usability testing in the evaluation of a Web site designed for health informatics knowledge dissemination. The lessons learned can inform others seeking to enhance their evaluation toolkits for clinical informatics products.
Building upon the foundation of the Structured Narrative electronic health record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses’ perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses’ perspectives, and assess nurses’ perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.
Electronic Nursing Documentation; User Interface; Functional Requirements; Nursing Documentation Templates; User-Centered Approach; Clinical Document Architecture; Document Ontology; Technology Acceptance Model; Task-Technology Fit model
A research-practice gap exists between what is known about conducting methodologically rigorous randomized controlled trials (RCTs) and what is done. Evidence consistently shows that pediatric RCTs are susceptible to high risk of bias; therefore novel methods of influencing the design and conduct of trials are required. The objective of this study was to develop and pilot test a wiki designed to educate pediatric trialists and trainees in the principles involved in minimizing risk of bias in RCTs. The focus was on preliminary usability testing of the wiki.
The wiki was developed through adaptation of existing knowledge translation strategies and through tailoring the site to the identified needs of the end-users. The wiki was evaluated for usability and user preferences regarding the content and formatting. Semi-structured interviews were conducted with 15 trialists and systematic reviewers, representing varying levels of experience with risk of bias or the conduct of trials. Data were analyzed using content analysis.
Participants found the wiki to be well organized, easy to use, and straightforward to navigate. Suggestions for improvement tended to focus on clarification of the text or on esthetics, rather than on the content or format. Participants liked the additional features of the site that were supplementary to the text, such as the interactive examples, and the components that focused on practical applications, adding relevance to the theory presented. While the site could be used by both trialists and systematic reviewers, the lack of a clearly defined target audience caused some confusion among participants.
Participants were supportive of using a wiki as a novel educational tool. The results of this pilot test will be used to refine the risk of bias wiki, which holds promise as a knowledge translation intervention for education in medical research methodology.
Although amblyopia is most successfully treated when detected in early childhood, many preschool-aged children are not being screened. This project explored the delivery of Web-based vision screenings, integrated with patient education, to parents and children, aged 3 to 6 years. Through a user-centered design methodology involving requirements gathering, iterative prototype development, and usability testing, a highly usable screening Website was created. Interviewing and testing parents and children in the home were essential in gathering accurate data about environments where the tool would actually be used. Frequent iterations of designing, testing, and modifying the tool were useful in identifying and correcting usability problems. Usability goals were set early in the project, and in the final phase a satisfaction questionnaire was administered to participants. Twenty-one out of 22 final usability objectives were achieved and the feasibility of Web-based vision screening was demonstrated.
PROMIS (Patient-Reported Outcome Measurement Information System) is developing a set of tools for collecting patient-reported outcomes, including computerized adaptive testing that can be administered using different modes, such as computers or phones. The user interfaces for these tools will be designed using the principles of universal design to ensure that they are accessible to all users, including those with disabilities. We review the rationale for making health assessment instruments accessible to users with disabilities, briefly review the standards and guidelines that exist to support developers in the creation of user interfaces with accessibility in mind, and describe the usability and accessibility testing PROMIS will conduct with content experts and users with and without disabilities. Finally, we discuss threats to validity and reliability presented by universal design principles. We argue that the social and practical benefits of interfaces designed to include a broad range of potential users, including those with disabilities, seem to outweigh the need for standardization. Suggestions for future research are also included.
computer-adapted testing; patient reported outcomes; accessibility; disability
Human factors engineering is a discipline that deals with computer and human systems and processes and provides a methodology for designing and evaluating systems as they interact with human beings. This article reviews important current and past efforts in human factors engineering in health informatics in the context of current trends in health informatics.
The methodology of human factors engineering and usability testing in particular were reviewed in this article.
This methodology arises from the field of human factors engineering, which uses principles from cognitive science and applies them to implementations such as a computer-human interface and user-centered design.
Patient safety and best practice of medicine require a partnership among patients, clinicians, and computer systems that serve to improve the quality and safety of patient care. People approach work and problems with their own knowledge base and set of past experiences, and their ability to use systems properly and with low error rates is directly related to the usability as well as the utility of those systems. Unusable systems have been responsible for medical error and patient harm and have even led to the death of patients and increased mortality rates. Electronic Health Record and Computerized Physician Order Entry systems, like any medical device, should come with a known safety profile that minimizes medical error and harm.
Health Informatics; Human Factors Engineering; Usability Testing; User-Centered Design; Patient Safety
As patient care becomes more collaborative in nature, there is a need for information technology that supports interdisciplinary practices of care. This study developed and performed usability testing of a standalone computer-based information tool to support the interdisciplinary practice of palliative severe pain management (SPM).
A grounded theory-participatory design (GT-PD) approach was used with three distinct palliative data sources to obtain and understand user requirements for SPM practice and how a computer-based information tool could be designed to support those requirements.
The GT-PD concepts and categories provided a rich perspective of palliative SPM and the process and information support required for different SPM tasks. A conceptual framework consisting of an ontology and a set of three problem-solving methods was developed to reconcile the requirements of different interdisciplinary team members. The conceptual framework was then implemented as a prototype computer-based information tool that has different modes of use to support both day-to-day case management and education of palliative SPM. Usability testing of the computer tool was performed, and the tool tested favorably in a laboratory setting.
An interdisciplinary computer-based information tool can be developed to support the different work practices and information needs of interdisciplinary team members, but explicit requirements must be sought from all prospective users of such a tool. Qualitative methods such as the hybrid GT-PD approach used in this research are particularly helpful for articulating computer tool design requirements.
ClinicalTrials.gov is a Web-based system intended for a diverse audience, including patients, family members and other members of the public. Throughout the system design and development process, our decisions have been driven by usability concerns. We first describe the overall design of the site, including the home page, which provides a site overview and rapid access to the information contained within it. Next we discuss the data presentation format which has been standardized in spite of data coming to us from many different sources. We provide a detailed description of the search and browse features that are intended to simplify the complexities of medical terminology and support information discovery. We conclude with a review of our evaluation activities and future plans.
Evidence-based preventive services offer profound health benefits, yet Americans receive only half of indicated care. A variety of government and specialty society policy initiatives are promoting the adoption of information technologies to engage patients in their care, such as personal health records, but current systems may not utilize the technology's full potential.
Using a previously described model to make information technology more patient-centered, we developed an interactive preventive health record (IPHR) designed to more deeply engage patients in preventive care and health promotion. We recruited 14 primary care practices to promote the IPHR to all adult patients and sought practice and patient input in designing the IPHR to ensure its usability, salience, and generalizability. The input involved patient usability tests, practice workflow observations, learning collaboratives, and patient feedback. Use of the IPHR was measured using practice appointment and IPHR databases.
The IPHR that emerged from this process generates tailored patient recommendations based on guidelines from the U.S. Preventive Services Task Force and other organizations. It extracts clinical data from the practices' electronic medical record and obtains health risk assessment information from patients. Clinical content is translated and explained in lay language. Recommendations review the benefits and uncertainties of services and possible actions for patients and clinicians. Embedded in recommendations are self-management tools, risk calculators, decision aids, and community resources, selected to match the patient's clinical circumstances. Within six months, practices had encouraged 14.4% of patients to use the IPHR (ranging from 1.5% to 28.3% across the 14 practices). Practices successfully incorporated the IPHR into workflow, using it to prepare patients for visits, augment health behavior counseling, explain test results, automatically issue patient reminders for overdue services, prompt clinicians about needed services, and formulate personalized prevention plans.
The IPHR demonstrates that a patient-centered personal health record that interfaces with the electronic medical record can give patients a high level of individualized guidance and be successfully adopted by busy primary care practices. Further study and refinement are necessary to make information systems even more patient-centered and to demonstrate their impact on care.
Clinicaltrials.gov identifier: NCT00589173
Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations.
Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS.
Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations.
The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self-evaluate the amount of cognitive and physical burden associated with using the device.
A total of 83 heuristic violations were identified in the studies. The distribution of the heuristic violations and their average severity are reported. The nurse evaluators successfully completed all 30 sessions of the performance evaluations. All nurses were able to use the device after a single training session. On average, the nurses took 111 seconds (SD 30 seconds) to complete the simulated task. The NASA Task Load Index results indicated that the work overhead on the nurses was low. In fact, most of the burden measures were consistent with zero. The only potentially significant burden was temporal demand, which was consistent with the primary use case of the tool.
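The workload figures above lend themselves to a simple tally. As a rough illustration (the six subscale names are the standard NASA-TLX dimensions, but the ratings and timings below are hypothetical values, not the study's data), an unweighted "raw TLX" score is just the mean of the six subscale ratings:

```python
from statistics import mean, stdev

# Hypothetical 0-100 ratings on the six standard NASA-TLX subscales.
# Raw (unweighted) TLX: overall workload = mean of the six ratings.
ratings = {
    "mental_demand": 15,
    "physical_demand": 5,
    "temporal_demand": 55,  # the one elevated dimension, as in the study
    "performance": 10,
    "effort": 20,
    "frustration": 5,
}
raw_tlx = mean(ratings.values())
print(f"Raw TLX workload: {raw_tlx:.1f} / 100")

# Simulated-session completion times in seconds (hypothetical values).
times = [95, 140, 110, 82, 131, 108]
print(f"Completion time: mean {mean(times):.0f} s, SD {stdev(times):.0f} s")
```

The raw (unweighted) variant shown here is a common simplification of the original TLX procedure, which weights the subscales by pairwise comparisons.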
The evaluation has shown that our design was functional and met the requirements demanded by the nurses’ tight schedules and heavy workloads. The user interface embedded in the tool provided compelling utility to the nurse with minimal distraction.
clinical decision support systems; user-computer interface; software design; human computer interaction; usability testing; heuristic evaluations; software performance; patient-centered care
Interventions that target cancer patients and their caregivers have been shown to improve communication, support, and emotional well-being.
To adapt an in-person communication intervention for cancer patients and caregivers to a web-based format, and to examine the usability and acceptability of the web-based program among representative users.
A tailored, interactive web-based communication program for cancer patients and their family caregivers was developed based on an existing in-person, nurse-delivered intervention. The development process involved: 1) building a multidisciplinary team of content and web design experts, 2) combining key components of the in-person intervention with the unique tailoring and interactive features of a web-based platform, and 3) conducting focus groups and usability testing to obtain feedback from representative program users at multiple time points.
Four focus groups with 2 to 3 patient-caregiver pairs per group (n = 22 total participants) and two iterations of usability testing with 4 patient-caregiver pairs per session (n = 16 total participants) were conducted. Response to the program's structure, design, and content was favorable, even among users who were older or had limited computer and internet experience. The program received high ratings for ease of use and overall usability (mean System Usability Score of 89.5 out of 100).
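The overall score reported above uses the standard System Usability Scale computation: ten items rated 1 to 5, with odd (positively worded) items contributing rating minus 1 and even (negatively worded) items contributing 5 minus rating, the summed contributions scaled by 2.5. A minimal sketch with a hypothetical response set, not study data:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from ten 1-5
    Likert responses. Odd-numbered items (positively worded) contribute
    rating - 1; even-numbered items (negatively worded) contribute
    5 - rating. The summed contributions are scaled by 2.5."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical responses from one participant.
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 5, 2]))  # -> 92.5
```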
Many elements of a nurse-delivered patient-caregiver intervention can be successfully adapted to a web-based format. A multidisciplinary design team and an iterative evaluation process with representative users were instrumental in the development of a usable and well-received web-based program.
Communication; caregiving; social support; technology assessment; cancer; oncology
Moodscope is an entirely service-user-developed online mood-tracking and feedback tool with built-in social support, designed to stabilize and improve mood. Many free internet tools are available with no assessment of acceptability, validity or usefulness. This study provides an exemplar for future assessments.
A mixed-methods approach was used. Participants with mild to moderate low mood used the tool for 3 months. Correlations between weekly assessments using the Patient Health Questionnaire (PHQ-9) and the Generalized Anxiety Disorder Assessment (GAD-7) with daily Moodscope scores were examined to provide validity data. After 3 months, focus groups and questionnaires assessed use and usability of the tool.
Moodscope scores correlated significantly with scores on the PHQ-9 and the GAD-7 for all weeks, suggesting a valid measure of mood. Low rates of use, particularly toward the end of the trial, demonstrate potential problems relating to ongoing motivation. Questionnaire data indicated that the tool was easy to learn and use, but there were concerns about the mood adjectives, site layout, and the buddy system. Participants in the focus groups found the tool acceptable overall, but felt clarification of the role and target group was required.
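The validity analysis described above rests on Pearson correlation between weekly questionnaire totals and daily tool scores (for example, averaged per week). A minimal sketch with hypothetical values; a strong negative r is what one would expect, since higher PHQ-9 totals indicate worse mood:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sqrt(sxx * syy)

# Hypothetical weekly PHQ-9 totals and mean daily Moodscope scores for
# the same six weeks (illustrative only; higher Moodscope = better mood).
phq9_weekly = [14, 12, 11, 9, 10, 8]
moodscope_weekly = [32, 38, 41, 52, 47, 55]

r = pearson_r(phq9_weekly, moodscope_weekly)
print(f"Pearson r = {r:.2f}")  # strongly negative: worse depression, lower mood
```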
With appropriate adjustments, Moodscope could be a useful tool for clinicians as a way of initially identifying patterns and influences on mood in individuals experiencing low mood. For those who benefit from ongoing mood tracking and the social support provided by the buddy system, Moodscope could be an ongoing adjunct to therapy.
Depression; mood tracking; self-help; social support; web-based
Evidence is emerging that certain technologies such as computerized provider order entry may reduce the likelihood of patient harm. However, many technologies that should reduce medical errors have been abandoned because of problems with their design, their impact on workflow, and general dissatisfaction with them by end users. Patient safety researchers have therefore looked to human factors engineering for guidance on how to design technologies to be usable (easy to use) and useful (improving job performance, efficiency, and/or quality). While this is a necessary step towards improving the likelihood of end user satisfaction, it is still not sufficient. Human factors engineering research has shown that the manner in which technologies are implemented also needs to be designed carefully if benefits are to be realized. This paper reviews the theoretical knowledge on what leads to successful technology implementation and how this can be translated into specifically designed processes for successful technology change. The literature on diffusion of innovations, technology acceptance, organisational justice, participative decision making, and organisational change is reviewed and strategies for promoting successful implementation are provided. Given the rapid and ever increasing pace of technology implementation in health care, it is critical for the science of technology implementation to be understood and incorporated into efforts to improve patient safety.
Usability evaluations are a powerful tool that can assist developers in their efforts to optimize the quality of their web environment. This underutilized, experimental method can serve to move applications toward true user-centered design. This article describes the usability methodology and illustrates its importance and application by describing a usability study undertaken at the Mayo Clinic for the purpose of improving an academic research web environment. Academic institutions struggling in an era of declining reimbursements are finding it difficult to maintain academic enterprises on the back of clinical revenues. This may result in declining amounts of time that clinical investigators have to spend in non–patient-related activities. For this reason, we have undertaken to design a web environment, which can minimize the time that a clinician-investigator needs to spend to accomplish academic instrumental activities of daily living. Usability evaluation is a powerful application of human factors engineering, which can improve the utility of web-based Informatics applications.
Although there is broad consensus that careful content vetting and user testing is important in the development of technology-based educational interventions, often these steps are overlooked. This paper highlights the development of a theory-guided, web-based communication aid (CONNECT™), designed to facilitate treatment decision making among patients with advanced cancer.
The communication aid included an online survey, patient skills training module and an automated physician report. Development steps included: 1) evidence-based content development, 2) usability testing, 3) pilot testing, and 4) patient utilization and satisfaction.
Usability testing identified some confusing directions and navigation for the online survey and validated the relevance of the “patient testimonials” in the skills module. Preliminary satisfaction from the implementation of the communication aid showed that 66% found the survey length reasonable and 70% found it helpful in talking with the physician. Seventy percent reported the skills module helpful and about half found it affected the consultation.
Designing patient education interventions for translation into practice requires the integration of health communication best practice including user feedback along the developmental process.
This developmental process can be translated to a broad array of community based patient and provider educational interventions.
usability testing; web-based education; provider patient communication; cancer treatment; health communication; intervention development; decision aids
The role of usability testing in the evaluation of an electronic health record system could improve the chances that the design is integrated with existing workflow and business processes in a clear, efficient way.
An oncology electronic health record (EHR) was implemented without prior usability testing. Before expanding the system to new clinics, this study was initiated to examine the role of usability testing in the evaluation of an EHR product and whether novice users could identify issues with usability that resonated with more experienced users of the system. In addition, our study evaluated whether usability issues with an already implemented system affect efficiency and satisfaction of users.
A general usability guide was developed by a group of five informaticists. Using this guide, four novice users evaluated an EHR product and identified issues. A panel of five experts reviewed the identified issues to determine agreement with and applicability to the already implemented system. A survey of 42 experienced users of the previously implemented EHR was also performed to assess efficiency and general satisfaction.
The novice users identified 110 usability issues. Our expert panel agreed with 90% of the issues and recommendations for correction identified by the novice users. Our survey had a 54% response rate. The majority of the experienced users of the previously implemented system, which did not benefit from upfront usability testing, had a high degree of dissatisfaction with efficiency and general functionality but higher overall satisfaction than expected.
In addition to reviewing features and content of an EHR system, usability testing could improve the chances that the EHR design is integrated with existing workflow and business processes in a clear and efficient way.
The Epigenomics resource at the National Center for Biotechnology Information (NCBI) has been created to serve as a comprehensive public repository for whole-genome epigenetic data sets (www.ncbi.nlm.nih.gov/epigenomics). We have constructed this resource by selecting the subset of epigenetics-specific data from the Gene Expression Omnibus (GEO) database and then subjecting them to further review and annotation. Associated data tracks can be viewed using popular genome browsers or downloaded for local analysis. We have performed extensive user testing throughout the development of this resource, and new features and improvements are continuously being implemented based on the results. We have made substantial usability improvements to user interfaces, enhanced functionality, made identification of data tracks of interest easier and created new tools for preliminary data analyses. Additionally, we have made efforts to enhance the integration between the Epigenomics resource and other NCBI databases, including the Gene database and PubMed. Data holdings have also increased dramatically since the initial publication describing the NCBI Epigenomics resource and currently consist of >3700 viewable and downloadable data tracks from 955 biological sources encompassing five well-studied species. This updated manuscript highlights these changes and improvements.
Objectives: This study tested the usability of the Database of International Rehabilitation Research, a bibliographic database developed by the Center for International Rehabilitation Research Information and Exchange (CIRRIE).
Methods: Potential users, i.e., rehabilitation researchers, were asked to participate in a usability study. Test questions were designed to represent common tasks performed in a bibliographic database. Participants were asked to think aloud during the test so that both their actions and comments could be recorded.
Results: This study identified common problems that participants had while searching the database and aspects of the database that needed improvement.
Conclusions: Usability testing proved to be an effective method for evaluating database effectiveness and user satisfaction. The method used provided valuable information about how the database searchers approached their searches as well as how they performed them.
The complexity and rapid growth of genetic data demand investment in information technology to support effective use of this information. Creating infrastructure to communicate genetic information to health care providers and enable them to manage that data can positively affect a patient’s care in many ways. However, genetic data are complex and present many challenges. We report on the usability of a novel application designed to assist providers in receiving and managing a patient’s genetic profile, including ongoing updated interpretations of the genetic variants in those patients. Because these interpretations are constantly evolving, managing them represents a challenge. We conducted usability tests with potential users of this application and reported findings to the application development team, many of which were addressed in subsequent versions. Clinicians were excited about the value this tool provides in pushing out variant updates to providers and overall gave the application high usability ratings, but had some difficulty interpreting elements of the interface. Many issues identified required relatively little development effort to fix, suggesting that consistently incorporating this type of analysis in the development process can be highly beneficial. For genetic decision support applications, our findings suggest the importance of designing a system that can deliver the most current knowledge and highlight the significance of new genetic information for clinical care. Our results demonstrate that using a development and design process that is user focused helped optimize the value of this application for personalized medicine.
clinical decision support; electronic health records; genomics; personalized medicine
Designing usable geovisualization tools is an emerging problem in GIScience software development. We are often satisfied that a new method provides an innovative window on our data, but functionality alone is insufficient assurance that a tool is applicable to a problem in situ. As extensions of the static methods they evolved from, geovisualization tools are bound to enable new knowledge creation. We have yet to learn how to adapt techniques from interaction designers and usability experts toward our tools in order to maximize this ability. This is especially challenging because there is limited existing guidance for the design of usable geovisualization tools. Their design requires knowledge about the context of work within which they will be used, and should involve user input at all stages, as is the practice in any human-centered design effort. Toward that goal, we have employed a wide range of techniques in the design of ESTAT, an exploratory geovisualization toolkit for epidemiology. These techniques include verbal protocol analysis, card-sorting, focus groups, and an in-depth case study. This paper reports the design process and evaluation results from our experience with the ESTAT toolkit.
This paper describes the development and evaluation of an innovative application designed to engage children and their parents in weekly asthma self-monitoring and self-management. The application prompts an early response to deteriorations in chronic asthma control and provides physicians with longitudinal data to assess the effectiveness of asthma therapy and to prompt adjustments. The evaluation included 2 iterative usability testing cycles with 6 children with asthma and 2 parents of children with asthma to assess user performance and satisfaction with the application. Several usability problems were identified, and changes were made to ensure the acceptability of the application and the relevance of its content. This novel application is unique compared to existing asthma tools and may shift asthma care from the current reactive, acute care model to a preventive, proactive, patient-centered approach in which treatment decisions are tailored to patients’ individual patterns of chronic asthma control to prevent acute exacerbations.
Usability can influence patients’ acceptance and adoption of a health information technology. However, little research has been conducted to study the usability of a self-management health care system, especially one geared toward elderly patients.
This usability study evaluated a new computer-based self-management system interface for older adults with chronic diseases, using a paper prototype approach.
Fifty older adults with different chronic diseases participated. Two usability evaluation methods were used: (1) a heuristic evaluation and (2) end-user testing with a think-aloud protocol, audio recording, videotaping, and interviewing. A set of usability metrics was employed to determine overall system usability, including task incompletion rate, task completion time, frequency of error, frequency of help, satisfaction, perceived usefulness, and perceived ease of use. Interviews were used to elicit participants’ comments on the system design. The quantitative data were analyzed using descriptive statistics, and the qualitative data were analyzed for content.
The participants were able to perform the predesigned self-management tasks with the current system design and they expressed mostly positive responses about the perceived usability measures regarding the system interface. However, the heuristics evaluation, performance measures, and interviews revealed a number of usability problems related to system navigation, information search and interpretation, information presentation, and readability. Design recommendations for further system interface modifications were discussed.
This study verified the usability of the self-management system developed for older adults with chronic diseases. We also demonstrated that our usability evaluation approach can quickly and effectively identify usability problems in a health care information system at an early stage of development using a paper prototype. Conducting a usability evaluation is an essential step in system development to ensure that system features match users’ true needs, expectations, and characteristics, and to minimize the likelihood that users will commit errors or have difficulty using the system.
Usability evaluation; self-management; patient participation; chronic disease
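The quantitative metrics named in the study above (task incompletion rate, task completion time, frequency of error, and frequency of help) are straightforward descriptive statistics over per-participant session records. A minimal sketch of how such metrics might be computed, using hypothetical field names not drawn from the study:

```python
from statistics import mean

# Hypothetical session logs; field names ("completed", "time_s", etc.) are
# illustrative assumptions, not the study's actual data schema.
sessions = [
    {"completed": True,  "time_s": 140, "errors": 1, "help_requests": 0},
    {"completed": True,  "time_s": 205, "errors": 3, "help_requests": 1},
    {"completed": False, "time_s": 300, "errors": 5, "help_requests": 2},
]

def usability_metrics(sessions):
    """Descriptive statistics for the metrics named in the abstract."""
    n = len(sessions)
    completed = [s for s in sessions if s["completed"]]
    return {
        # Proportion of participants who did not finish the task
        "task_incompletion_rate": 1 - len(completed) / n,
        # Mean completion time over successful sessions only
        "mean_completion_time_s": mean(s["time_s"] for s in completed),
        # Mean number of errors and help requests across all sessions
        "mean_error_frequency": mean(s["errors"] for s in sessions),
        "mean_help_frequency": mean(s["help_requests"] for s in sessions),
    }

print(usability_metrics(sessions))
```

In practice these figures would be reported alongside qualitative findings (think-aloud comments, interview content), since the numbers alone do not explain why a task failed.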
Usability testing can help generate design ideas to enhance the quality and safety of health information technology. Despite these potential benefits, few healthcare organizations conduct systematic usability testing prior to software implementation. We used a Rapid Usability Evaluation (RUE) method to apply usability testing to software development at a major VA Medical Center. We describe the development of the RUE method, provide two examples of how it was successfully applied, and discuss key insights gained from this work. Clinical informaticists with limited usability training were able to apply RUE to improve software evaluation and elected to continue to use this technique. RUE methods are relatively simple, do not require advanced training or usability software, and should be easy to adopt. Other healthcare organizations may be able to implement RUE to improve software effectiveness, efficiency, and safety.