A Web-based training course with embedded video clips for reducing central line–associated bloodstream infections (CLABSIs) was evaluated and shown to improve clinician knowledge and retention of knowledge over time. To our knowledge, this is the first study to evaluate Web-based CLABSI training as a stand-alone intervention.
Central line–associated bloodstream infections (CLABSIs) are a significant cause of mortality and morbidity for critically ill patients. The 2011 Infectious Diseases Society of America–Centers for Disease Control and Prevention guidelines for the prevention of CLABSIs recommend implementing educational programs as well as periodic assessment of knowledge and adherence to the guidelines for those who insert and maintain catheters.1
Conventional educational programs involve lecture formats, distributed materials, workshops, simulations, or some combination thereof.2 Online or Web-based formats offer several advantages over conventional training formats, including unrestricted physical location, flexibility in training time, access to course information after completion, individualization of training, and tools to automate assessment and documentation.3
This study aimed to determine (a) the effect of a Web-based training course on clinicians’ knowledge and perceived norms regarding prevention of CLABSIs in 5 hospital clinical education programs and (b) knowledge retention months after the training course. To our knowledge, this is the first study to examine Web-based CLABSI training as a stand-alone method of increasing infection-control knowledge.
This study was approved by the institutional review boards at each study site. Five hospitals in the United States participated in the study. Each participating site identified a minimum of 20 clinicians who placed central venous catheters (CVCs) to take the training course described below. The 5 hospitals integrated the course into hospital-specific training activities targeted at those clinicians who might perform or assist in CVC placement. Study participants were registered nurses, nurse practitioners, physician assistants, residents, fellows, and attending physicians from hospital medicine, critical-care medicine, surgery, and radiology departments. Data were collected electronically via the training course’s assessment tool.
The self-paced course, which provided video demonstration of common errors, aimed to educate clinicians in (a) outcomes and morbidity of CLABSIs and (b) methods to prevent CLABSIs. The content of the course was based on published systematic reviews.1 The course was pilot tested previously in a prospective, randomized, controlled study in the admitting department of a university-based high-volume trauma center.4 In the pilot study, residents who had completed the CVC training course were significantly more likely to comply with sterile practices than were residents who took a paper-based training course or those who had not taken the training course (P = .003).
The CVC Knowledge and Attitude Questionnaire (CVCKAQ) was developed (see Figure 1) to assess knowledge of the outcomes and morbidity of CLABSIs and of methods to prevent CLABSIs (multiple-choice questions 1–17). Attitude toward sterile practices was assessed using constructs from the theory of planned behavior (Likert-scale questions 18–22), which asserts that people act according to their intentions and their perceived control over the behavior, and that their intentions are shaped by attitudes toward the behavior, subjective norms, and perceived behavioral control.5
The questionnaire was piloted in April 2008 on 10 residents at 1 participating institution and revised on the basis of the results and feedback from the residents. The revised version was piloted at a different participating institution, but the questionnaire's reliability was not specifically tested. The CVCKAQ was administered 3 times: (1) as a pretest, before the course; (2) as a posttest, immediately after the course; and (3) as a follow-up test, 3–4 months after course completion. The CVCKAQ was completed electronically. A dedicated course Web portal was used at each site (each in a different city) to reduce potential cross-influence among sites. Each participant was assigned a unique anonymous sign-in to allow within-subject comparison of test scores across the pretest, posttest, and follow-up test while protecting anonymity.
Linear mixed models were used to analyze the change in test scores (posttraining minus pretraining). The models included random intercepts to account for within-subject and within-site correlation. Linear mixed models with random intercepts were also used to assess whether subjective norms regarding CVC infection prevention were related to test scores. The statistical software R was used for all analyses.6
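The core within-subject quantity behind this analysis, the posttest-minus-pretest change score and its standard error, can be sketched as follows. The scores below are hypothetical, invented for illustration; the study's per-participant data are not reproduced here, and this simple paired summary ignores the subject- and site-level random intercepts that the authors' mixed models in R would account for.

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical pretest and posttest knowledge scores (%) for 6 participants.
# These values are illustrative only, not the study's data.
pre  = [58.8, 52.9, 64.7, 58.8, 70.6, 47.1]
post = [76.5, 82.4, 70.6, 82.4, 88.2, 70.6]

# Within-subject change scores (posttraining minus pretraining), as in the paper.
changes = [b - a for a, b in zip(pre, post)]

mean_change = mean(changes)
# Standard error of the mean change (sample SD over sqrt(n)).
se_change = stdev(changes) / sqrt(len(changes))

print(f"mean change = {mean_change:.1f}% (SE {se_change:.1f}%)")
```

In the balanced, complete-data case this paired mean change coincides with the fixed-effect estimate from a random-intercept model; the mixed-model formulation additionally handles missing follow-up tests and correlation among participants within a site.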
There were 177 respondents from 5 hospitals. The mean score on the 17 knowledge questions was calculated for the pretest (baseline) and posttest questionnaires. The mean pretest score was 59.6% (standard error [SE], 0.9%), and the mean posttest score (test administered immediately after the course) was 77.9% (SE, 0.9%), a significant increase of 18.3% (SE, 1.1%; P < .001). Pretest and posttest scores were examined for each of the 5 sites (Table 1). Mean posttest scores were significantly higher than pretest scores at all 5 sites: site 1 (81.6% vs 58.6%; P < .001), site 2 (81.1% vs 57.5%; P < .001), site 3 (73.6% vs 59.6%; P < .001), site 4 (83.1% vs 59.4%; P < .001), and site 5 (76.5% vs 68.1%; P = .044). Site 4 had the largest change in mean score from pretest to posttest (23.7% [SE, 2.1%]), and site 5 had the smallest (8.4% [SE, 3.9%]).
Site 4 was the only site at which a sufficient number of participants finished the follow-up test (the completion rate for the pretest, posttest, and follow-up test was 100%), so its data were analyzed to assess knowledge retention. Table 2 shows that mean follow-up test scores decreased by 12.2% from the posttest scores but remained significantly higher (P < .001) than the baseline pretest scores. Responses to the attitude questions were not predictive of the increase in posttest scores over pretest scores.
The goal of this study was to examine the effect of a Web-based training course about CVC placement on clinicians' knowledge, and on retention of that knowledge, across multiple hospitals. We found that the training course did increase clinicians' knowledge about CVC-associated safety risks, and much of this knowledge was retained after 3–4 months, as scores decreased by only about 12% over that time. Clinicians' attitudes toward CLABSIs were not predictive of the improvement in knowledge scores.
Effective multicenter interventions addressing CLABSIs have utilized several different training modalities,7,8 including didactic lectures, self-study modules, checklists, simulations, and recommended procedures posted as reminders. Two other studies have incorporated Web-based CVC training as part of infection-control programs. Berenholtz et al included such training as 1 of 5 interventions designed to increase provider awareness.9 That training was paired with 16 lectures, evaluated alongside them, and was successful in decreasing CVC bloodstream infections. In another program, Web-based CVC training was derived from a 30-hour patient safety course at the University of California, San Francisco.10 The advantage of the Web-based training course we used was that it was self-directed and did not require additional staff to give lectures or perform assessments. At 1 site, incoming residents were required to complete the modules before starting their training. Such self-directed study offers the learner autonomy and flexibility.11,12
All participants served as their own control to account for expected differences in knowledge at the start of the course. We found no evidence that pretraining attitudes predicted knowledge gains, which suggests that targeting the training to groups with particular pretraining attitudes would not increase its effect on clinicians' knowledge.
This study is limited in that only 1 site had sufficient participants complete the 3- to 4-month follow-up test; thus, the study results may have been biased. Also, this study was not designed to assess whether a high level of knowledge translated to improved patient outcomes. Moreover, the study was not designed to stratify results by profession.
Future multicenter studies on effective ways to improve training may include larger samples with measures of behavioral change and patient outcomes, such as CLABSI rates or utilization of femoral sites. Researchers should also determine ways to assess whether such a course has an effect on longer-term knowledge. Moreover, stratifying results by operator profession may provide further information about training targets.
We thank Cheryl Richards, BS, LPN, RHIA, for coordinating software access and ensuring that follow-up data were addressed at each site, and Kelli Trungale, MLS, ELS, for editorial assistance.
Financial support. A.D.H. was supported by National Institutes of Health (NIH) grant 1 K24 AI079040-03. J.T.J. is associated with an investigator-initiated grant from Baxter Healthcare Corporation for a device to reduce CLABSIs.
Potential conflicts of interest. All authors report no conflicts of interest relevant to this article.