Actionable reminders (electronic reminders linked to computerized order entry) might improve care by facilitating direct ordering of recommended tests. The authors implemented four enhanced actionable reminders targeting performance of annual mammography, one-time bone-density screening, and diabetic testing. There was no difference in rates of appropriate testing between the four intervention and four matched control primary care clinics for screening mammography (OR 0.81, 95% CI 0.64 to 1.02), bone-density exams (OR 1.29, 95% CI 0.82 to 2.02), HbA1c monitoring (OR 0.91, 95% CI 0.58 to 1.42) and LDL cholesterol monitoring (OR 1.40, 95% CI 0.76 to 2.59). Of the survey respondents, 79% almost never used the system or were unaware of the functionality. Among the 228 visits with indirect evidence of mammography reminder use, only 9 (3.9%) were followed by test performance, a significantly lower proportion than among visits without such evidence. Our actionable reminders did not improve receipt of overdue testing, potentially due to limitations of workflow integration.
Many patients do not receive recommended preventive or chronic disease care.1 2 These gaps have been demonstrated in screening for breast cancer3 and osteoporosis4 as well as for monitoring diabetes.5 6 Inadequate testing may lead to delays in diagnosis and treatment of unrecognized conditions and suboptimal management of chronic diseases.
Adoption of computerized decision-support systems can improve adherence to recommended care guidelines.7–9 Electronic reminders have shown varying levels of success in improving rates of breast cancer screening and diabetic testing.10–12 However, these electronic systems can disrupt care processes,13 and their effectiveness can be limited by technical and non-technical issues.14–16 Crucial components of a successful reminder system include proper integration of the system into clinician workflow14 and the ability to facilitate the recommended action.17 18 We hypothesized that implementing electronic reminders with direct links to computerized physician order entry would facilitate physician test ordering as part of an efficient workflow process and would improve the performance of overdue screening and diabetic monitoring tests in the outpatient setting.
We implemented our intervention from March 2007 to January 2008 across four primary care clinics affiliated with a large academic medical center in Boston, Massachusetts, USA. All physicians in these practices had been using a well-established internally developed electronic health record with electronic reminders for both preventive care and chronic disease care. These reminders did increase performance of recommended tests,10 but gaps remained. In March 2007, a new feature of electronic order entry was added to the electronic health record, allowing the ordering of laboratory and radiology exams from within the electronic record. Four clinics volunteered to have this new feature activated. We identified four clinics affiliated with the same medical center to serve as matched control sites. The study was approved by the human studies committee at Partners Healthcare System.
We created a set of ‘actionable’ electronic reminders that built on the new functionality of electronic order entry. These reminders directly linked recommendations to the order entry module, facilitating test ordering. The actionable reminders promoted (1) annual mammography in women 40–80 years old; (2) one-time bone density exam in patients at high risk for fracture; and (3) hemoglobin A1c testing every 6 months and low-density lipoprotein (LDL) cholesterol testing every 12 months among patients with diabetes.
Each time that a clinician opened an electronic patient chart, the algorithm for all reminders was run to determine whether the patient was overdue for recommended care. This algorithm searched all laboratory and radiology results as well as the electronic problem list, medication list, and allergy list. The reminders were displayed on the main patient summary screen and physician signature screen, and delivered in a passive rather than active manner.
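The eligibility rules and the overdue check described above can be sketched in pseudocode-like Python. This is an illustrative reconstruction under stated assumptions, not the production algorithm: the field names, the high-risk flag, and the patient-record shape are hypothetical, and the real system also consulted the problem, medication, and allergy lists.

```python
from datetime import date

# Illustrative sketch of the overdue-test logic described above.
# Field names and the "high_fracture_risk" flag are hypothetical.

def overdue_reminders(patient, today):
    due = []

    def last(test):
        results = patient.get(test, [])
        return max(results) if results else None

    def overdue(test, interval_days):
        latest = last(test)
        return latest is None or (today - latest).days > interval_days

    # (1) Annual mammography in women 40-80 years old
    if patient.get("sex") == "F" and 40 <= patient.get("age", 0) <= 80:
        if overdue("mammography", 365):
            due.append("mammography")

    # (2) One-time bone-density exam in patients at high fracture risk
    if patient.get("high_fracture_risk") and last("bone_density") is None:
        due.append("bone_density")

    # (3) Diabetic monitoring: HbA1c every 6 months, LDL every 12 months
    if patient.get("diabetes"):
        if overdue("hba1c", 182):
            due.append("hba1c")
        if overdue("ldl", 365):
            due.append("ldl")

    return due
```

In this sketch the check runs on every chart open, returning the list of reminders to render on the patient summary screen.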
The actionable reminders facilitated completion of the recommended test or procedure by providing a drop-down box displaying potential actions (figure 1). For laboratory tests, a single click completed the order. This order then automatically generated a printed slip for the patient to bring to the laboratory for testing. For radiology studies, providers were taken to the radiology order entry screen to provide additional information and schedule the exam.
The actionable reminders were activated in four intervention clinics between April and July 2007. Information technology staff provided group training sessions for the practice management staff. Each physician received 1 h of one-on-one training on the new features within the electronic health record. After implementation of the alerts at each site, there was a follow-up informational session led by the information technology staff.
We defined the four clinics using the actionable reminders and electronic order entry as intervention sites. We next identified the pool of primary care clinics (n=18) that were affiliated with the same academic health center and used the same electronic health record, but had not volunteered to use the actionable reminder and order entry modules. We selected four control sites from this group, matching to the intervention sites based on number of patient visits and baseline annual screening mammography rates. Physicians at these matched control sites continued to be exposed to non-actionable versions of the same reminders provided to physicians practicing in the intervention sites. The reminders in both intervention and control practices facilitated clinician recording of structured responses related to the overdue tests. These structured responses indicated that the test was scheduled or provided a reason it was not ordered.
Our primary outcome measure was the proportion of overdue tests that were performed following office visits where reminders were triggered. We measured this outcome at 2 weeks for blood tests and 2 months for mammograms or bone-density exams. We enrolled eligible patient visits at intervention and control sites during a 6-month period prior to the actionable reminder implementation (baseline) and during a 6-month period following implementation (follow-up).
Data on performance of exams were collected from the hospital's electronic central data repository. We reviewed a random subset of 75 cases for each alert type to determine the alert's positive and negative predictive values. The positive predictive values for the alerts were 80% for overdue mammography, 64% for bone density exams, 78% for hemoglobin A1c and 82% for LDL cholesterol. The negative predictive values were 88% for mammography, 84% for bone density exams, 100% for hemoglobin A1c, and 96% for LDL cholesterol.
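For clarity, the predictive values above follow from standard chart-review arithmetic. The counts in the example below are illustrative, not the study's actual review data.

```python
def predictive_values(tp, fp, tn, fn):
    """Return (PPV, NPV) from chart-review counts.

    PPV = TP / (TP + FP): fraction of fired alerts that were truly overdue.
    NPV = TN / (TN + FN): fraction of non-fired cases that were truly up to date.
    """
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical example: if 60 of 75 reviewed fired alerts were correct,
# PPV = 60 / 75 = 0.80, matching the mammography alert's reported value.
```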
We fit multivariable logistic regression models to produce adjusted estimates of the impact of the actionable reminders. The dependent variable in these models was the performance of recommended overdue tests, while the primary predictor was intervention versus control status. Additional independent variables included patient sex, race, insurance status, and baseline testing rate at each clinic for the measure of interest. We used the generalized estimating equations approach to adjust for clustering of patients within clinicians. We repeated a similar analysis among patients overdue for mammography to determine whether clinician documentation of a test-related structured response predicted performance of mammography.
We surveyed the 25 primary care clinicians practicing at the intervention sites 3 months following the implementation of the actionable reminders using a paper mailing followed by a second mailing to non-responders. The survey items assessed general use patterns of the electronic health record and self-reported use of the actionable electronic reminders.
The intervention and control groups demonstrated good balance in patient sociodemographic characteristics (see table 1, available online at www.jamia.org). The clinicians in the intervention and control groups were well matched based on gender, years since graduation, and number of patient visits in the study period (see table 2, available online at www.jamia.org). There was no difference between the intervention and control sites in performance of overdue tests for breast cancer screening, osteoporosis screening, and diabetes management (table 1).
The clinician survey response rate was 100%. All (100%) of the physicians reported using the electronic health record with every patient encounter. When asked how often they placed orders directly from the electronic actionable reminders using a 5-point Likert scale ranging from ‘Use during almost every patient visit’ to ‘Not aware of this feature,’ 46% responded ‘Almost never use,’ and an additional 33% responded ‘Not aware of this feature.’
Among patient visits in intervention practices where mammography testing was overdue, mammography was performed following 9/228 (3.9%) visits with a structured response compared with 316/3278 (9.6%) visits without a structured response (p=0.001). In control practices, mammography was performed following 2/140 (1.4%) visits with a structured response compared with 549/3507 (14.8%) visits without a structured response (p<0.001).
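The comparison above is a standard two-by-two contingency test. As a sketch, the intervention-practice counts can be checked with scipy's chi-square test; the authors' exact statistical procedure is not stated, so the p-value here may differ slightly from the reported one.

```python
from scipy.stats import chi2_contingency

# Intervention practices: mammography performed vs not performed, split by
# whether a structured response was recorded (counts from the text above).
table = [
    [9, 228 - 9],        # visits with a structured response
    [316, 3278 - 316],   # visits without a structured response
]
chi2, p, dof, expected = chi2_contingency(table)
# p is well below 0.05, consistent with the reported difference.
```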
We studied a new set of actionable electronic reminders designed to improve quality of primary care and found no significant improvement in testing rates for diabetes care, breast cancer screening, and osteoporosis screening. Unfortunately, many intervention physicians reported not using, and even not being aware of, the new features of the electronic health record.
In prior reviews of electronic clinical decision support systems, reminders prompting clinicians to improve guideline adherence have often improved clinician performance.9 19 However, these studies almost universally compared the effect of electronic reminders to no reminders. In our study, we measured the incremental effect of linking reminders to the electronic order entry system, with both intervention and control sites receiving similar basic information from the electronic reminders.
Compared with informational reminders, the implementation of actionable reminders presented additional challenges of facilitating efficient test ordering across different ordering systems. Limited interfaces between our reminders and lab and radiology order systems were considerable hurdles for clinician adoption. This lack of ‘efficiency gain’ for ordering tests likely caused the unexpectedly lower rate of mammography testing when clinicians recorded mammography-related structured responses. When ordering mammography, clinicians likely went directly to the radiology order entry system, rather than take extra steps by clicking the reminders. We believe that most clinicians used the actionable reminders for informational purposes, as well as to document reasons tests were not performed. This theory was supported by the similar relationship between structured responses and mammogram performance seen in the control clinics with purely informational reminders.
To explore other reasons for low usage of the actionable reminders, we performed informal follow-up discussions with clinicians in the intervention sites. In addition to the workflow barriers described above, concerns arose regarding the content of the reminders. Clinicians felt that a few alerts contained inaccuracies that led to needless extra work. While our chart review found that the positive predictive values of the reminders were reasonable (64%–82%), the perception of inaccurate reminders in some cases may have led some clinicians to abandon the reminders altogether.
Our experience provides quantitative support for the theoretical frameworks identifying barriers to adoption of clinical decision support systems. During our evaluation, several practical lessons emerged. First, our ordering workflow using reminders needed to be more efficient. Second, our intervention likely would have benefited from obtaining data on usage of the system earlier in the process, allowing us to address the underlying issues in a more timely fashion. Third, despite our training efforts, a significant subset of clinicians in the intervention group was not fully aware of the capabilities of the system and required retraining. Finally, the usage of our reminders could have improved from increased communication with clinicians regarding perceived inaccuracies in the reminders, which may have increased confidence and use of the system. Our lessons are consistent with a recent review suggesting that effective evaluation strategies can provide vital information to improve the implementation of these systems.20
We conducted an evaluation of the implementation of a health-information-technology-based quality-improvement initiative which included collection of clinical data as well as an assessment of physician use patterns. However, our study does have some limitations. Our intervention group comprised clinicians who volunteered to use the reminders, leading to potential selection bias. However, we would have expected a bias toward a positive effect from the intervention because these clinicians were more likely to respond to the reminders. In addition, we were unable to directly measure whether the actionable reminders were clicked and used the recording of structured responses as a proxy for reminder usage. While we believe that the vast majority of these responses are entered via the reminders, it is possible to enter these structured responses outside the context of the reminders. Finally, the generalizability of our results may be limited by the fact that the intervention occurred within a single healthcare system and was performed using an internally developed electronic health record.
Our actionable reminders did not impact the performance of overdue testing, as compared to passive reminders, in the primary care setting. The limited effect of these reminders was likely strongly influenced by inefficiencies in the test ordering process. We believe that actionable reminders still have the potential to improve clinician performance and do not consider our findings to be sufficient to dismiss their utility. Future implementations of similar projects should address issues regarding full integration into the order management process, training, and content of reminders to ensure successful quality improvement.
We thank the Longitudinal Medical Record development team at Partners Healthcare System, for their work in creating the actionable reminders, and J Fiskio, for her assistance in data acquisition.
Funding: This study was funded by the Agency for Healthcare Research and Quality (1 R01 HS 015226-01) and the National Library of Medicine (2 T15 LM 007092-17).
Competing interests: None.
Ethics approval: This study was conducted with the approval of the Partners Healthcare System, Boston, Massachusetts.
Provenance and peer review: Not commissioned; externally peer reviewed.