Reichlin and colleagues10
explored the acceptability and usability of an interactive web-based game intended to help men with prostate cancer in their treatment decision process. The stated objective was to gain user feedback on an alpha version of the application regarding the usability and feasibility of the game, and to guide future iterations. Specifically, the researchers evaluated whether 1) users would accept the game as a decision aid; 2) users could easily navigate it as intended; and 3) it increased confidence and participation in decision making. The application was designed so that users could provide their preferences regarding possible side effects associated with four typical treatments. The graphical interface, which was designed with participatory input from prior unpublished work, involved two rounds of play, each with a computer-guided orientation round. Two different metaphors were used in this decision support aid: simulated playing cards (Figure 1) for helping patients rate side effects for different treatments, and a slot machine metaphor to communicate the probabilities of different side effects.
Figure 1. Simulated Playing Cards for Rating Side Effects10
With respect to the application design, the researchers reported involving users (n=13) throughout the design and build process. The sample comprised men with a recent diagnosis of localized prostate cancer who had already made a treatment decision. Participants were first observed using the game and were then given an investigator-developed survey instrument on the usability and acceptability of the application. Following the survey, the men participated in one of three focus groups. Results from the survey indicated that participants found the game an effective method for providing information on treatments and side effects. The survey respondents also identified three features that needed improvement: how the time periods for side effects were displayed, the use of the slot-machine ‘spinner’, and the display of the final treatments ranked highest by the user.
Analysis of focus group data found that participants appreciated being able to generate lists of questions for their clinicians. In general, participants reported the game to be acceptable and useful in a decision-making process. They also wanted to see greater detail on the side effects, a wider range of treatment options, more transparency about the data underlying the probabilities presented, and more data about longer-term outcomes. Another area the researchers highlighted in discussing their results was the participants’ desire for personalization, that is, for the application to elicit user data (age, marital status, physical health) and use these within the application.
Noting the importance of ensuring usability for patients with limited computer experience and with vision and dexterity issues, Fromme, Kenworthy-Heinige, and Hribar11
reported the development process for a tablet computer application designed to help patients report symptom and quality-of-life information. The team began by gathering user requirements from participating oncologists; chief among these was a high degree of ease of use, given the computer literacy and functional limitations of the target audience. During the user-testing phase, they observed men with prostate cancer (n=7) interacting with the application and then interviewed them. The application was later assessed with a larger group of patients (n=60) with diverse cancer types; participants in this group were not observed or interviewed, but an ‘ease-of-use’ survey was administered. The graphical web interface for displaying symptom and quality-of-life questions was based on National Cancer Institute guidelines (now adopted by all of the national institutes)12
along with heuristic evaluations drawn from the literature. The researchers provide a thorough discussion of how vision issues guided their selection of font sizes (18–27 point), the choice of a sans-serif font, and the use of primary colors for navigational elements to counter age-related changes in the lens and cornea. In line with this approach, the researchers used larger boxes in place of radio buttons, given the small target area of native HTML radio button elements, and color-coded the selected response options. Other design considerations included displaying only one question at a time, so that the largest font sizes and interface elements could be used without requiring scrolling, and having the survey automatically advance to the next question following a user response. To simplify the interface further, the researchers disabled the browser's navigational controls and added a pause button.
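A minimal sketch of this style of interface, rendered server-side as HTML, might look as follows. The function name, markup, and styling values are hypothetical illustrations of the approach described above (large tap targets instead of native radio buttons, one question per screen), not the study's actual code.

```python
def render_question(prompt: str, options: list, selected: int = -1) -> str:
    """Build HTML for a single full-screen question whose answers are
    large, color-coded buttons rather than native radio inputs."""
    buttons = []
    for i, label in enumerate(options):
        # A 'selected' class drives the color coding of the chosen answer;
        # large buttons replace the small click area of radio buttons.
        cls = "answer selected" if i == selected else "answer"
        buttons.append(f'<button class="{cls}" data-index="{i}">{label}</button>')
    # Font size chosen from the 18-27 point range reported by the authors.
    return (
        '<section class="question" style="font-size:24pt">'
        f'<p>{prompt}</p>{"".join(buttons)}</section>'
    )
```

Selecting an answer would toggle the `selected` class and advance the survey to the next question; that client-side behavior is omitted here.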
Direct observations using ‘talk aloud’ approaches and subsequent interviews with the participants (n=7) identified issues related to delays in the conditional coloring of response boxes after an answer was selected, screen visibility, and tablet settings: the stylus button accidentally brought up context menus, and the tablet itself switched between portrait and landscape orientations. Results from the 8-item survey given to the larger research sample (n=60) indicated high ease-of-use scores overall; while patients over 65 had significantly lower scores, the reported ease of use was still quite high for this group.
Stoddard and colleagues13
reported on efforts to test and refine a smoking cessation website created by the National Cancer Institute (NCI). The original prototype was designed using NCI recommendations,12
with content developed and edited by communication experts for a 6th-grade reading level. The authors made revisions based on feedback from other usability experts at NCI and also developed a formal usability testing plan. The first stage of this plan involved identifying a list of critical tasks that most users would need to perform, such as finding information on the website about medications and printing copies. Five subjects, all current smokers with an interest in quitting, participated in usability tests, which were videotaped and conducted in a private room with an adjoining observation room. As participants used the website and completed the critical tasks, they were encouraged to use the ‘talk aloud’ approach. Two rounds of usability testing took place. In the first round, participants were assessed on whether each of nine given tasks could be completed within one minute; participants were also given an internally constructed 10-item satisfaction survey. The overall task completion rate for the first round was 42.3%, with none of the tasks on finding specific information completed successfully. Modifications were then made to the website, including moving or eliminating information and editing labels and navigation elements. In a subsequent round of testing with seven new subjects, the overall task completion rate increased more than two-fold to 89% and was deemed acceptable in meeting the overall testing goals for the site. Responses to the 10-item satisfaction survey after the second round showed increased satisfaction across many of the questions.
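The reported gain is easy to check with a quick calculation using the two completion rates summarized above:

```python
# Overall task completion rates from the two rounds of usability testing.
round_one = 42.3  # percent; no information-finding tasks completed
round_two = 89.0  # percent; after revisions to labels, navigation, and content

improvement = round_two / round_one  # roughly 2.1, i.e. more than two-fold
```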
Within our own research efforts, we have focused on developing and evaluating the Electronic Self-Report Assessment-Cancer (ESRA-C) computer platform, which is designed for use by patients throughout the ambulatory services of a comprehensive cancer center. We have reported elsewhere that both patient and clinician users found the application highly acceptable and that it significantly increased discussion of symptom and quality-of-life issues in clinic visits.14, 15
We recently completed data collection in a second randomized clinical trial examining the impact of ESRA-C on patients' verbal reports of problematic issues and on symptom distress levels at the end of therapy. Because the new intervention was substantially more complex at the patient level, we focused on systematic development and testing with a fully patient-centered approach, making few assumptions about patient needs and requests. The design process for this application included a needs assessment, analysis, design/testing/development, and application release, as suggested by Mayhew.16 Interaction with participants reflects the testing phase of this lifecycle, and by virtue of iterative development, we were able to identify new user needs.
The accompanying figure describes the two iterative methods that were used to further develop and extend ESRA-C: participatory design (PD)17
and iterative development (ID). The cycle on the left used focus groups, individual interviews, and mock-up prototypes to assess the ESRA-C content and experience, particularly the reporting and viewing of patient-centered assessment data. We also assessed how users wanted to integrate their assessments into a personal health record and how they preferred to control access to this information. Findings from the focus groups and individual interviews were brought back to a subset of investigators comprising the Design Group, who evaluated the findings of the PD process in light of architectural, technical, and budget concerns and determined which needs could be addressed with the patient-oriented ESRA-C and which had to be deferred for future development.
Iterative Development Model for ESRA-C
The cycle on the right used an iterative process of development and usability testing to evaluate solutions proposed to meet the project requirements. We began this process concurrently with the PD process, as our previous experience with the clinician-oriented ESRA-C15
provided us with several areas with which to begin. We iteratively integrated feedback from both the usability sessions and the needs assessment activities into new mockups of ESRA-C for use in subsequent usability testing sessions, and into a storyboard depicting how the user would move through the application. Data capture for later analysis was made possible through an external video camera and a Tobii T60 eye tracker.18
The results have been analyzed and incorporated into the newest version of ESRA-C, and will be published elsewhere.
Blow-up of usability testing in the ESRA-C storyboard
A number of key findings and implications for practice emerged from our review of these three exemplar studies, as well as from our own research experience. A summary of the exemplar studies may be found in the accompanying table, followed by a discussion of implications for clinical practice.
Summary of Exemplar Patient-centered Technology Usability Studies