Evidence-based interventions are often disseminated in public health education with little known about their operational fidelity. This study examined the delivery of intervention components (operational fidelity) of Healthy Relationships, a widely disseminated HIV prevention program designed for people living with HIV/AIDS. Two hundred ninety-nine agencies that had been trained in the intervention by the Centers for Disease Control and Prevention were contacted, and 122 (41%) completed confidential interviews. Among the 93 agencies that implemented the program, 39 (42%) adapted at least one core element activity and 21 (23%) dropped an activity. Most adaptations were intended to improve the community fit of the intervention. Agencies believed that funders demand that they implement the intervention with fidelity. Models of technology transfer that emphasize behavior change processes rather than specific curriculum content may advance prevention program dissemination.
Evidence-based health care is essential to enhancing quality of services and improving public health. In HIV/AIDS prevention, evidence-based interventions are distributed to community-based organizations and public health service providers through a program delivery system established by the Centers for Disease Control and Prevention (CDC; Collins et al., 2006). Interventions deemed effective can either be directly funded by the CDC or funded through state and local health departments. When selecting programs to include in its dissemination program, the CDC uses well-defined standards of intervention efficacy, almost invariably relying on (a) outcomes from randomized controlled trials (RCTs), (b) demonstrated feasibility for practical implementation through a community replication process, and (c) evidence for cost-effectiveness (Lyles et al., 2004). Interventions that meet the CDC’s efficacy, feasibility, and cost-effectiveness criteria are field tested for potential implementation and ultimately assembled into user-friendly curricula, materials, programmatic training, and quality assurance packages. The CDC intervention packages are produced and distributed through the Diffusion of Effective Behavioral Interventions (DEBI) program. Since its launch in 2002, the DEBI program has changed the landscape of HIV prevention services in the US. New HIV prevention interventions are periodically added to the DEBI program through curriculum and program development grants. Thousands of community-based service providers working within hundreds of community-based organizations (CBOs) and health departments are trained to implement interventions rolled out as DEBIs.
Training in evidence-based programs emphasizes the importance of close adherence to procedures that were used in the clinical trials that originally tested the intervention. In the case of the CDC DEBIs, each intervention has key characteristics and core elements to guide its implementation. In this framework, an intervention’s key characteristics are flexible and can be altered without changing the essential nature of the intervention. The CDC (2010) defines Key Characteristics as “those parts of an intervention (activities and delivery methods) that can be adapted to meet the needs of the CBO or target population.” Adapting key characteristics is actually encouraged to assure contextualization, customization, and community fit (Kalichman et al., 2007). In contrast, a DEBI’s core elements are considered essential to assuring a program’s effectiveness and, according to the CDC, must be closely adhered to. Specifically, “core elements are those parts of an intervention that must be done and cannot be changed. They come from the behavioral theory upon which the intervention or strategy is based; they are thought to be responsible for the intervention's effectiveness. Core elements are essential and cannot be ignored, added to, or changed” (CDC, 2010). In other words, fidelity to the core elements is the basis for expecting positive outcomes from a DEBI (Dworkin et al., 2008; Fradt et al., 2000; Harshbarger et al., 2006; McKleroy et al., 2006). From this perspective, it is essential that an intervention's core elements are not compromised. The degree to which end-users have retained the integrity of DEBI core elements and implement these interventions with fidelity has only recently been the subject of operations research.
Studies indicate that the core elements of evidence-based HIV prevention interventions are being adapted and even omitted by end users, calling into question the fidelity of their implementation. Galbraith et al. (2008), for example, studied the fidelity of the widely implemented DEBI program Focus on Kids, a theory-based behavioral intervention designed to reduce HIV/STI risks among African-American youth (Stanton et al., 2004). In a telephone survey of 34 agencies that were using Focus on Kids, Galbraith et al. (2008) found that one in five providers had dropped at least one of the intervention’s core elements, with some core elements being dropped by nearly every user. Adaptations and significant adjustments to essential intervention activities were also common, with more than one in three implementers changing the content of the original Focus on Kids activities. The most common reasons for adaptations were to assure that the intervention was suitable for the target community and to update the materials for current use. More than half of Focus on Kids users added new activities to the program aimed at meeting needs beyond those addressed in the original intervention, such as alcohol use, drug use and sexual abuse. Adding components to an intervention requires time and hence the abbreviation or removal of originally packaged activities was also common. Similar adaptations, changes, and deletions that result in non-fidelity to DEBI programs have been observed in other operations studies (Harshbarger et al., 2006; Rebchook et al., 2006). The degree to which these adaptations, all geared toward program improvement, actually enhance or degrade intervention effectiveness is unknown.
Fidelity to an HIV prevention intervention designed for people living with HIV/AIDS and delivered in clinical services has also been examined. Interventions that target HIV-positive persons, or positive prevention, are integral to US national HIV prevention priorities (CDC, 2003), and positive prevention is recommended for all comprehensive HIV prevention plans in developing countries (Bunnell et al., 2006; Kalichman, 2005). Iverson et al. (2008) evaluated the fidelity of the physician-delivered positive prevention intervention Partnership for Health (Richardson et al., 2004). This intervention uses message framing, repetition, and reinforcement during routine patient visits to enhance knowledge, skills, and motivations to practice safer sex. Partnership for Health delivers a ‘prescription for prevention’ to improve patient-provider communication about safer sex, disclosure of HIV status, and HIV transmission risk reduction. The core elements of the intervention emphasize that activities are delivered by providers to HIV-positive patients in outpatient clinics. Partnership for Health is integrated into routine clinical care, and all clinic staff are trained to include prevention counseling in their practice. Providers initiate brief (3- to 5-minute) discussions about safer sex, including self-protection, partner protection, and disclosure during routine clinic visits. In a fidelity evaluation that used clinic chart abstraction as well as patient and provider reports, Iverson et al. found a steady and significant increase in the delivery of prevention prescriptions to patients over the first four months of intervention implementation, followed by a steep decline in the subsequent two months and a sustained low frequency of delivery over 15 additional months.
These results were consistent with survey data collected from patients who reported infrequent discussions about safer sex with their providers and even less exposure to the prevention prescriptions over the course of the post-implementation evaluation. A trajectory of behavioral decay occurred over time among providers, similar to the degradation that is common to individual-level behavior change.
The limited fidelity to DEBI interventions, especially their core elements, observed in past research may result from several factors. Providers may find certain intervention elements overly complex or too difficult to implement. Time constraints may also demand shortening programs and therefore dropping activities. Adaptations may also be motivated by the need to improve intervention fit to the community. The CDC’s DEBI program itself is perceived as a top-down approach to dissemination (Dworkin et al., 2008), which may contribute to resistance to maintaining fidelity. Although non-fidelity to DEBI interventions examined thus far has been common, the factors that influence adaptations are less known. The purpose of the current study was to examine operational fidelity to, as well as the factors associated with fidelity to, a community-based positive prevention intervention.
Healthy Relationships is a five-session support group style intervention that is disseminated by the CDC as a DEBI program and is included among the CDC’s best-evidence interventions (CDC, 2009; Crepaz et al., 2006; Lyles et al., 2007). Healthy Relationships emphasizes building skills for managing HIV status disclosure decisions and skills for practicing safer sex (Kalichman et al., 2001). The intervention has three major components focused on (a) decision-making skills for disclosure of HIV status to friends and family, (b) decision-making skills for HIV status disclosure to sex partners, and (c) safer sex negotiation and HIV transmission risk reduction behavioral self-management skills. The core elements of Healthy Relationships are: (a) defining stress and reinforcing coping skills as applied to disclosure decision making and establishing healthier and safer relationships; (b) using behavioral rehearsal and role play modeling with feedback to teach and practice coping skills; (c) teaching decision-making skills for HIV status disclosure; (d) providing personal feedback reports to motivate risk behavior changes; and (e) using movie clips from popular films to set up scenarios for disclosure and risk reduction role-plays. The core elements are expressed as cognitive-behavioral activities in the intervention content.
We evaluated whether core element activities of Healthy Relationships were implemented, changed, or dropped (operational fidelity). For example, the use of popular film clips to set up skills building exercises is a signature feature of the intervention. Although agencies can use any number of specific movie scenes to accomplish the aims of the exercises, the scenes are used within a prescribed behavioral skills training framework that is central to the core elements. We focused on the adaptation of program activities that are linked to the core elements as well as the rationale behind adaptations. We also examined service agency perceptions of the CDC’s DEBI program in relation to operational fidelity.
We contacted the CDC and requested the names of agencies and agency personnel trained in Healthy Relationships since it was instituted as a DEBI program in 2005. There had been a total of 63 trainings in Healthy Relationships conducted across the US between January 2005 and April 2008. A total of 999 persons who worked for 235 CBOs and 64 health departments had been trained. We attempted to reach each of the trained agencies by contacting all of the persons listed. The CDC only distributes Healthy Relationships intervention packages to agencies that have received training. Therefore, we employed a universal sampling scheme to reach all agencies trained in Healthy Relationships.
Figure 1 shows the flow of agencies sampled in the study. A total of 139 agencies could not be contacted, 38 agencies were contacted but did not complete interviews, and 111 CBOs and 11 health departments from 37 states completed interviews. Individuals interviewed were primarily direct service staff and included program managers, case managers, and prevention specialists.
For the current study, we adapted measures previously used in HIV prevention program operations research (Galbraith et al., 2008; Kalichman et al., 1997). Participants were asked about their Healthy Relationships training experiences and their agency’s use of the intervention, particularly whether intervention activities were used and, if used, whether there were changes to the intervention content. With respect to specific intervention activities, participants reported whether the activity was implemented or dropped and, if implemented, whether their agency changed how the activity was used in relation to the original program content. The intervention activities directly addressed three behavioral domains keyed to the core elements: (a) disclosure of HIV status decisions to family and friends, (b) disclosure decisions to sex partners, and (c) practicing safer sex. Within each domain, four intervention activities were the focus of our assessment: (a) use of personalized feedback reports, (b) risk continuum activities, (c) decisional balance grids, and (d) movie clips for interactive skills training. For each of the twelve activities, participants rated the importance of four reasons for changing the activity: (a) to simplify the intervention, (b) the activity was not understood, (c) time constraints, and (d) community fit. Ratings were made on 5-point scales (1 = not at all important, 5 = extremely important).
We also assessed the structural features of the intervention that included key characteristics, such as whether the intervention was delivered in small groups, the duration of the intervention sessions, whether groups were mixed or segregated by gender and sexual orientation, and the characteristics of the group facilitators. Participants also rated their perceptions of Healthy Relationships effectiveness on five principal outcomes: stress reduction, coping with HIV/AIDS, helping with HIV disclosure decisions to non-partners and to sex partners, and increasing safer sex. Each perceived outcome was rated on a 10-point scale (1 = not at all effective, 10 = very effective). Finally, we asked participants to reflect on their views of the CDC DEBI program and respond to nine questions regarding their perceptions of DEBI interventions. These items were derived from formative meetings with service providers and were intended to reflect beliefs about the DEBI program in particular and the importance of maintaining fidelity to prevention interventions more generally. Agencies that did not implement Healthy Relationships were asked about their use of other CDC DEBI programs and the reasons why Healthy Relationships was not implemented.
Participating agencies were asked whether they had changed any of the major activities from the original implementation guidelines. Participants’ responses were probed to determine whether changes to core activities constituted a breach in intervention fidelity. For example, agencies can use a variety of different scenes from popular films to conduct skills training role plays without violating fidelity. However, failing to follow the steps of modeling, rehearsal, and feedback for skills training was defined as a breach in fidelity. For each core element activity, agencies were also asked if the activity was omitted. Fidelity was defined as having implemented all 12 core element activities without omitting or substantially changing them.
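The fidelity coding rule described here can be illustrated with a minimal sketch. The per-activity record fields and the `classify_fidelity` helper below are invented for illustration and are not part of the study's instruments; they simply encode the stated rule that an agency counts as implementing with fidelity only if all 12 core element activities were delivered without omission or substantial change.

```python
# Hypothetical sketch of the fidelity coding rule: fidelity means all 12
# core element activities were implemented without being dropped or changed.

CORE_ACTIVITIES = 12  # 4 activities in each of 3 behavioral domains

def classify_fidelity(activity_records):
    """activity_records: one dict per core element activity, e.g.
    {"implemented": True, "changed": False}. Returns True only when
    every activity was implemented and none was substantially changed."""
    if len(activity_records) != CORE_ACTIVITIES:
        raise ValueError("expected one record per core element activity")
    return all(r["implemented"] and not r["changed"] for r in activity_records)

# Example: an agency that delivered every activity unchanged...
faithful = [{"implemented": True, "changed": False}] * CORE_ACTIVITIES
# ...versus one that adapted a single activity (e.g., a feedback report).
adapted = faithful[:11] + [{"implemented": True, "changed": True}]

print(classify_fidelity(faithful))  # True
print(classify_fidelity(adapted))   # False
```

Note that under this rule a single adapted or dropped activity classifies the whole agency as non-fidelity, which matches the all-or-nothing definition used in the study.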
The majority of telephone interviews were conducted with one person per agency trained in the Healthy Relationships intervention. Five agencies requested that more than one staff person be interviewed to provide a more complete look at their agency’s experiences with the intervention. In these cases, the staff persons were interviewed together and agreement was reached by consensus. We contacted each person at each agency listed as trained in Healthy Relationships by email inviting them to participate in a confidential structured interview. Although it was common for the contact person listed to no longer be with the agency, all of the staff who participated were directly involved with Healthy Relationships at their agency. Individuals who expressed interest in participating were scheduled for a telephone interview. Missed appointments were rescheduled an unlimited number of times. Individuals who did not respond to the invitation email were sent two subsequent invitations. Returned emails were not pursued further. For agencies that did not implement Healthy Relationships the interviews lasted approximately 10 minutes. Interviews with agencies that had implemented Healthy Relationships required 45 minutes to complete. Agencies were offered a $50 office supply store gift card for their participation. All of the procedures were approved by the University of Connecticut Institutional Review Board.
Initial descriptive analyses were performed to examine intervention implementation as well as reasons for not implementing Healthy Relationships. The main analyses focused on comparing agencies that had implemented Healthy Relationships with fidelity to those that had adapted the intervention. Analyses for categorical variables used contingency table chi-square (χ2) tests, and comparisons on continuous variables used independent t-tests. Comparisons were conducted on the importance of reasons for dropping or changing core elements using a within-subjects analysis of variance (ANOVA). Statistical significance was defined using conventional probabilities, p < .05 and p < .01.
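As a minimal illustration of the contingency-table comparisons described here (this is not the authors' code, and the counts are invented), the Pearson chi-square statistic for a 2x2 table can be computed directly from the cell frequencies:

```python
# Pearson chi-square for a 2x2 contingency table [[a, b], [c, d]],
# using the closed form: chi2 = n * (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d)).

def chi_square_2x2(a, b, c, d):
    """Return the (uncorrected) Pearson chi-square statistic, df = 1."""
    n = a + b + c + d
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

# Hypothetical counts: rows = fidelity vs. adapted agencies,
# columns = directly CDC-funded yes/no (numbers are for illustration only).
stat = chi_square_2x2(30, 11, 21, 31)
print(round(stat, 2))  # 9.95 — compared against the chi-square df=1 critical value
```

A statistic above the df = 1 critical value of 3.84 would be significant at p < .05, the convention stated above; real analyses would also report the exact p-value.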
A total of 122 of the 299 (41%) CBOs and health departments trained in Healthy Relationships were interviewed. The median duration that the agencies provided AIDS services was 19 years. The median number of staff at the agencies was 25, with 5 dedicated to HIV prevention. Forty-two percent (n = 51) of the agencies received funding directly from the CDC for prevention services and 76% (n = 93) received funding through the Ryan White Care Act. The majority of agencies (57%, n = 69) indicated that they were maintaining their current level of funding, whereas 21% (n = 26) reported increased funding for new programs and 22% (n = 27) reported losing funding and cutting services. More than half of agencies (53%, n = 65) perceived themselves as doing better financially than most other agencies, 38% (n = 46) saw themselves as doing about the same, and 9% (n = 10) perceived their agency as doing worse than most others.
In most cases (n = 118, 97%) the person we interviewed had been trained to conduct Healthy Relationships. In addition, 71% (n = 86) had facilitated Healthy Relationships groups, 83% (n = 101) had read the intervention manual, and 67% (n = 82) had read the original intervention research outcome article. Eighty-four percent (n = 102) of agencies indicated that they had implemented at least one CDC-DEBI intervention, with 61% (n = 75) having implemented 3 or more DEBIs. Among the 78 agencies that had implemented a DEBI program for men who have sex with men, 60 (77%) had used Healthy Relationships. Similarly, 71 agencies had implemented a DEBI program for heterosexual men, of which 65 (75%) had used Healthy Relationships as was the case for 65 of the 87 (75%) agencies that implemented a DEBI program for women. In addition, agencies implemented the intervention with populations that it was not originally tested on; 34 (28%) with transgender persons, 28 (23%) with injection drug users, and 7 (6%) had used the intervention with adolescents.
A total of 29 of the 122 trained agencies (22%) had not implemented Healthy Relationships. Table 1 shows the reasons that agencies indicated for not implementing the program. As shown in the table, most agencies identified multiple reasons for not implementing Healthy Relationships. The most common reasons concerned perceiving the program as not meeting agency priorities, not meeting community needs, and implementation difficulty. The least common reasons for not implementing the program were a lack of staff interest and not having adequate space.
Among the 93 agencies that implemented Healthy Relationships, 52 (56%) stated that they had adapted the intervention, of which 39 (42%) changed at least one core element activity and 21 (23%) dropped at least one core element activity entirely. Table 2 shows the adaptation and dropping of each major Healthy Relationships activity as well as the importance ratings for each change. Results showed that most of the core element activities were adhered to by at least 70% of implementing agencies. It was far more common to adapt activities than to drop them altogether. Changes occurred across all three of the intervention domains concerning disclosure decision making skills and safer sex skills building. Few core activities were dropped from the intervention, with the most common being the Personalized Feedback Report. Changes were most commonly made to improve the fit of intervention activities to the target community. The within-subjects ANOVA performed on the reasons for adapting individual components of the intervention showed that agencies were significantly more likely to adapt or drop components to simplify the intervention delivery and to improve the suitability of the intervention for their target community, Wilks’ Lambda = .432, F(3, 45) = 19.7, p < .01 (see Table 3). Post hoc tests showed that community fit was rated as significantly more important than all three other reasons for adaptation.
Table 4 shows adherence to Healthy Relationships’ key characteristics among agencies that adapted the core elements and agencies that implemented with core element fidelity. Results showed that only one key characteristic differed between agencies that adapted the intervention and those that implemented with fidelity: agencies that adapted core elements were less likely to have matched group facilitators to group participants’ race/ethnicity as indicated in the intervention key characteristics. There were no other differences in adapted key characteristics between agencies that did not and did maintain fidelity to the core elements.
Implementing the Healthy Relationships intervention was significantly associated with directly receiving CDC funding, χ2(1, N = 122) = 5.07, p < .02, and receiving Ryan White Care Act funding, χ2(1, N = 122) = 4.21, p < .05. Participants expressed concern that the CDC requires programs to maintain intervention fidelity while also indicating that it is in the best interest of the agency and community to adapt programs. A majority of participants (over 80%) believed that state health departments and the CDC tell agencies to implement DEBI programs exactly as they were originally tested. More than half of agencies stated that the DEBIs should be modified by changing or dropping activities. Nearly all participants, more than 90%, also felt that services were improved as a direct result of the DEBIs. The only factor that differed between groups was the perception that their agency had difficulty finding programs that fit their community needs; agencies that adapted Healthy Relationships indicated having more difficulty finding programs that fit (see Table 5).
Overall, agencies that adapted Healthy Relationships did not differ from agencies that implemented with fidelity in their perceptions of program effectiveness. Only one difference was observed; agencies that adapted Healthy Relationships rated it as less effective in helping people deal with stress (see Table 6).
There are now several evidence-based interventions available for HIV prevention services delivered in both clinical and community settings. The degree of latitude that end-users have in adapting effective interventions while preserving their efficacy can be established through implementation fidelity testing. Unfortunately, few HIV prevention interventions have been subjected to implementation fidelity testing. The current study assessed operational fidelity to Healthy Relationships, an intensive risk reduction behavioral skills training group for HIV-positive adults. While nearly all of the staff interviewed were trained in the intervention, 83% had read the intervention manual and 67% had read the original outcome research article. We found that most of the agencies that were trained in the intervention had implemented it and that half of those agencies had implemented with fidelity. Although it was common for agencies to alter the intervention content, one in five agencies dropped at least one core element activity. The most common reasons for altering the intervention were to improve its community fit and to meet time constraints for program delivery. These findings are similar to previous operations studies of DEBI programs, where adaptations, deletions, and non-adherence to core elements were observed, often in response to tailoring the intervention to improve community fit (Galbraith et al., 2008; Harshbarger et al., 2008; Iverson et al., 2008).
The current findings should be interpreted in light of the study methodological limitations. First, the participating agencies represented 40% of all agencies trained in Healthy Relationships. The sample cannot therefore be considered representative and caution should be exercised in generalizing the findings to agencies trained in Healthy Relationships. Another sampling limitation is that end-users change over time. There is no basis for assuming that agencies trained in the intervention at the time of this study will represent those trained in the future. Differences in staff training may impact the implications of the findings. This study also relied on self-reported implementation of program activities rather than objective measures drawn from agency records. Our measures also did not assess implementation facilitators and barriers, such as technical assistance and adaptation guidance. The single cross-sectional data collection point also cautions against drawing causal inferences from the study. We also did not include a standardized and validated measure of fidelity in the study. Despite these limitations, we believe that our findings have implications for improving the dissemination of Healthy Relationships.
Agencies that reported adapting or dropping core elements from the intervention did so to improve program fit to the community. Unfortunately, agencies have little guidance for making adjustments to the core elements of evidence-based interventions, and there is no feedback system in place for updating DEBI interventions. Alternative models of evidence-based intervention dissemination emphasize assuring community fit, capitalizing on agency creativity, and moving away from strict fidelity testing. Process models, for example, downplay the importance of specific intervention content and curriculum materials while emphasizing overarching behavior change processes. For example, Rotheram-Borus et al. (2009a, 2009b) suggested that all effective interventions share common factors, including goal setting and behavioral skills building, that are crafted and expressed differently in various interventions. Under process models, the personalized feedback reports that were dropped by agencies in the current study could be substituted with an alternative approach to motivational enhancement. Similarly, one goal setting exercise could be replaced with another that better fits the community while still meeting the objectives of a core behavior change process.
The most common adaptation to Healthy Relationships occurred in the use of movie clips for behavioral skills building. Although agencies are trained and given guidance for selecting movie clips from popular films for use in role play-driven behavioral rehearsals, as many as one in five agencies adapted this activity beyond the guidance. The most frequently dropped intervention elements were the Personalized Feedback Reports, which were designed to enhance motivation for behavior change. Unfortunately, without a test of implementation fidelity it is not possible to know whether the content omissions impact outcomes. As an alternative to dropping an element, agencies may be provided with guidance for adaptation. In the case of Personalized Feedback Reports, it may have been possible to deliver the feedback verbally or in a pre-group individual counseling session rather than dropping the activity altogether. There were also several adaptations to the intervention’s key characteristics, particularly with regard to maintaining gender and sexual orientation homogeneity within groups and the recommended characteristics of the group facilitators. However, the available research indicates that changes in the key characteristics may have minimal impact on implementing Healthy Relationships (Kalichman et al., 2007).
In summary, a majority of AIDS service providers trained to deliver Healthy Relationships implemented the program. Only about half of those that implemented, however, did so with fidelity to the core elements. Providers who adapted the intervention found it as beneficial as those who implemented with fidelity. Altering core elements to improve community fit of Healthy Relationships did not appear to impact any perceptions of effectiveness. However, research is needed to determine whether fidelity to core elements is related to real world efficacy. Greater guidance is needed for implementing Healthy Relationships to maximally adhere to its principles while also maximizing its fit to community needs.
This project was supported by National Institute of Mental Health (NIMH) grants R01-MH71164 and R01-MH82633.
The authors thank Charles Collins at the Centers for Disease Control and Prevention for providing the list of participating agencies, Daniel O’Connell, James Tesoriero, and Haven Battles at the New York State Department of Health for discussions regarding study design, and Jennifer Galbraith for sharing measures from previous research.