HHS Public Access Author Manuscript
Child Sch. Author manuscript; available in PMC 2010 July 14.
Published in final edited form as:
Child Sch. 2008; 30(2): 116–127.
PMCID: PMC2903757

Methodology for Evaluating an Adaptation of Evidence-Based Drug Abuse Prevention in Alternative Schools

Laura M. Hopson, Ph.D., Assistant Professor, and Lori K. Holleran Steiker, Ph.D., Associate Professor


The purpose of this article is to set forth an innovative methodological protocol for culturally grounding interventions with high-risk youth in alternative schools. The study used mixed methods to evaluate original and adapted versions of a culturally grounded substance abuse prevention program. The qualitative and quantitative methods concurrently explore behaviors around drugs and alcohol, contextual variables for youth substance abuse and related factors, cultural perspectives regarding drug-related attitudes and behaviors, and the complex reasons behind students’ substance use choices. While questionnaires are used to capture demographics, cultural and acculturative variables, drug use, drug and alcohol attitudes and expectancies, and school culture variables, focus groups capture the voices of the students and staff and trends that cannot be fully understood via questionnaires. In this study, focus groups aid in understanding student drug and alcohol choices, attitudes, and behaviors and help the researchers home in on questions and necessary changes to future research procedures.

Keywords: Methodology, Adaptation, Evidence-Based, Substance Abuse, Prevention


Effective school-based drug prevention programs require special methodological considerations. Schools serve a broad range of youth who can benefit from evidence-based prevention programs, but there are many challenges to implementing these programs in school settings. Interventions that work well at one school may be a poor fit for others. This article presents an innovative methodology that was used to evaluate adapted and original versions of an evidence-based drug abuse prevention program, Keepin’ it REAL, in four alternative high schools. Keepin’ it REAL, originally developed by Flavio Marsiglia and Michael Hecht with students in Phoenix, Arizona, teaches youth to use four research-elicited resistance strategies: Refuse, Explain, Avoid, and Leave (Hecht, Marsiglia, Elek, Wagstaff, Kulis, & Dustman, 2003).

During Phase I of this research study, students at each school created their own versions of Keepin’ It REAL student workbooks by incorporating their language and life experiences into written scenarios. In addition, they created their own videos, to be utilized in place of the four created by youth in Phoenix for the original curriculum, to illustrate the program’s four resistance strategies. This manuscript discusses the methodology used in Phase II to implement and evaluate the differential effectiveness of the adapted versions and the original version of the curriculum. The methodology employed Participatory Action Research approaches (Whyte, 1991) to engage youth and staff at each setting and incorporate their culture and values. Mixed methods were used in the evaluation to capture the fullest, most accurate picture of students’ drug beliefs, choices and behaviors. This article begins by providing an overview of drug use in alternative schools and reasons for creating culturally grounded adaptations of evidence-based curricula. The research design and methodology are then presented in detail.

Drug Use in Alternative Schools

Alternative schools are important settings for prevention programs because they serve students at greater risk for substance use than those in traditional school settings (Hopson, 2006). Alternative school youth tend to use more drugs than other students and are more likely to use them to cope with stressors (Lehr et al., 2004; Vaughn, Slicker, & Hein, 2000). According to the 1998 National Alternative High School Youth Risk Behavior Survey conducted by the Centers for Disease Control, over 92 percent of alternative school student respondents said they had consumed alcohol at least once, and two thirds had done so during the past month. More than 85 percent of students reported smoking marijuana and over a third had tried cocaine at least once (Grunbaum et al., 2000). Although alternative schools report needing interventions for preventing substance abuse, relatively little research has examined implementation and effectiveness of prevention interventions in alternative schools (Kubik, Lytle, & Fulkerson, 2004; Sussman, Dent, & Stacy, 2002; Sussman, Sun, McCuller, & Dent, 2003).

A Need for Culturally Grounded Prevention Research in Alternative Schools

Hecht et al. (2003) have shown that curricula are more effective when they are grounded in the life experiences and culture of participating youth. School-based practitioners often feel that evidence-based prevention programs developed outside their communities do not meet the needs of their students (Botvin, 2004). This is one reason cited in the literature for the lack of evidence-based programs implemented with fidelity in schools. When these programs are used, facilitators often make modifications to the curriculum during implementation (Botvin 2004).

School Culture

Much of the literature on culturally grounded prevention focuses on the students’ race and ethnicity. However, the culture of the school is important to consider as well. Due to the great variation among alternative schools, it is especially important to examine school culture in these settings. Alternative schools vary in their philosophy and approach to working with students. The main purpose of alternative schools is to serve students whose needs are not met by traditional schools (Dupper, 2005). Many alternative schools exist to provide a disciplinary setting for disruptive students for a defined period of time. These schools typically emphasize improving student behavior rather than academic achievement (Dupper, 2005; Raywid, 1994). Other alternative schools aim to foster educational settings that meet the needs of students who are not thriving in traditional schools by providing self-paced curricula, smaller classes and more services to students (Dupper, 2005).

Two of the alternative schools in this study are disciplinary and two are non-disciplinary. Because these schools serve different purposes, they may vary considerably in their culture, which can influence every phase of a research project and determine the success of a newly introduced program.

Implications of School Culture

Research indicates that school culture can influence whether school staff are likely to incorporate new, innovative practices into their work with students (Glisson, 2002; Bowen, Rose, & Ware, 2006). The literature suggests that students from schools with open, collaborative cultures in which staff members share a common view about their mission have better outcomes than those from schools with hierarchical cultures (Bowen et al., 2006; Harris & Hopkins, 2000; Hofman et al., 2001; Keys, Sharp, Greene, & Grayson, 2003; Lee & Smith, 1993). Schools are more effective at reducing problems such as violence when rules are developed collaboratively with students and staff members demonstrate respect for student differences (Erickson, Mattaini, & McGuire, 2004). In positive school cultures, school practitioners work together while respecting each other’s differences (Hiatt-Michael, 2001). Bowen and associates have labeled schools with the cultural characteristics described above as learning organizations (Bowen et al., 2006).

The culture of a learning organization fosters flexibility, acceptance of change, and openness to new ways of working toward organizational goals (Argyris, 1992; Bowen et al., 2006). Members of the organization accept new ideas and responsibility for the progress of the organization (Hiatt-Michael, 2001). Learning organizations are characterized by actions and sentiments that enable the organization to value and use information from school staff members and other key stakeholders. This shared information is then used to plan, implement, and evaluate strategies that help the school achieve its goals (Bowen et al., 2006).

Using Adaptation to Create Culturally Grounded Prevention

Because schools have different cultures and their students have unique life experiences, it may be helpful to adapt evidence-based curricula to improve the fit within a particular school. Program adaptation is defined as any deliberate or accidental modification of a program, such as deleting or adding components, changing the nature of components, changing the way the program is administered, or making cultural modifications (Backer, 2001). Studies and research reviews that examine substance abuse program implementation indicate that prevention programs are rarely implemented with strict adherence to the curriculum (Backer, 2001; Berman & McLaughlin, 1976; Flay et al., 1987; Gottfredson & Gottfredson, 2002). Reasons for poor fidelity in schools include lack of training and support, inadequate resources, large class size, low morale and burnout among teachers and school staff, and insufficient time (Botvin, 2004). An additional barrier to program fidelity is the need to adapt programs to meet the unique needs of a particular school and student population. Teachers and administrators often argue that prevention programs need to be tailored to better meet the needs of ethnic minority students (Botvin, 2004).

The concept of adaptation is controversial because of the possibility that it diminishes program effectiveness. Research supports the idea that interventions implemented with a great deal of fidelity have better outcomes than those in which implementers diverge from protocols (Blakely et al., 1987; Botvin et al., 1995; Elliott & Mihalic, 2004). Another body of literature argues that community settings should be allowed to adapt curricula to meet their specific needs (Castro, Barrera, & Martinez, 2004). This idea has gained popularity in the face of evidence demonstrating that, although many research dollars have been spent developing and evaluating evidence-based programs, few community settings that serve youth are likely to implement them with fidelity or use them at all (Castro, Barrera, & Martinez, 2004).

Interventions that are not culturally grounded for a given community are unlikely to receive much support from key stakeholders, which makes it unlikely that the curriculum will ever be implemented and sustained. One way to address this problem is by creating adaptations of evidence-based programs that maintain core, effective curriculum components while allowing communities to tailor the intervention to meet their needs (Castro, Barrera, & Martinez, 2004).

Creating adapted versions of Keepin’ it REAL for alternative schools is useful because few existing interventions are designed for this population or for populations that are already experimenting with substance use. The original Keepin’ it REAL is a universal prevention program. Universal prevention aims to prevent harmful behavior by reaching the general population before they have engaged in that behavior (Institute of Medicine [IOM], 1994). An indicated prevention program, which aims to prevent substance use among those at elevated risk who may already be using a variety of substances, is more appropriate for the alternative school population (IOM, 1994). Accordingly, the adaptation aims to create materials that reflect the life experiences of youth who are more likely to have encountered and experimented with substances than a more traditional school population.

The adapted versions of the curriculum evaluated in this study were created using structured procedures in which students at each school created new curriculum materials and videos. The procedures were developed in close collaboration with curriculum developers in order to maintain the core components and remain true to its theoretical base. A group of students at each school read workbook exercises and reworded them to reflect their daily activities, language, substances they had encountered, and other aspects of their culture and life experiences. In each case, students were given the following instructions: “The scenario should capture real events that occur commonly in the lives of your particular group.” Students also rewrote scripts for videos and filmed new ones to more accurately reflect their life experiences. In writing the scripts, students were instructed as follows: “Scenarios depicted in videos should be events that at least 75% of group members have witnessed or experienced to assure that common scenarios are being captured.” Focus groups were conducted with the students who created the adapted materials to explore whether they felt that the products accurately reflected their life experiences.

Participatory Action Research as Framework

Participatory Action Research (PAR) methods served as a foundation for creating the adaptation of Keepin’ it REAL in Phase I and planning the implementation and evaluation of the adapted curriculum, which is the focus of the study presented here. PAR requires collaboration between researchers and participants at every phase of the research process, a willingness to use participants’ definitions of their needs and potential solutions (Kidd & Kral, 2005), and shared power in making decisions at every phase of the research process (Kelly, 2005). The researcher takes on the role of consultant and serves to facilitate rather than direct the research process (Gosin, Dustman, et al., 2003). Participatory Action Research has been described as being both a process and a goal (Greenwood, Whyte, & Harkavy, 1993).

The benefits of PAR include building participants’ capacity to develop knowledge and skills and to solve their own problems. However, because PAR studies typically rely on qualitative methods and case studies rather than experimental designs, it is more difficult to definitively demonstrate program effectiveness (Hughes, 2003). To take advantage of the benefits of both experimental design research and Participatory Action Research, studies can integrate the two approaches. In this type of research, the researcher’s role is to bring to the table a discussion about the importance of strong research methods, evidence-based practices, and a theoretical framework while incorporating knowledge from participants into every step of the research process (Hughes, 2003). Scientific rigor and community-based research can be difficult to integrate (Allison & Rootman, 1996; Hughes, 2003). However, in order to ground substance abuse prevention in the realities of the youth recipients, it is critical that drug prevention curricula and research grow from community partnerships, engaging key stakeholders and agencies in all phases of the prevention process.

Research Design and Methodology

The adapted Keepin’ it REAL curricula were evaluated using a mixed methods design that included a quasi-experimental pretest-posttest design with six-week and three-month follow-ups, along with focus groups with students and the staff who facilitated the curricula. In order to remain consistent with the collaborative approach employed to create the original Keepin’ it REAL curriculum and the adapted versions, Participatory Action Research (PAR) methods were used in planning for implementation and evaluation. PAR methods were employed for the following activities:

  • Selection of groups to participate in the study at each school
  • Determining the duration of the curriculum
  • Scheduling a training on the curriculum
  • Determining when and where questionnaires would be distributed
  • Conducting weekly consultations with facilitators at each school
  • Obtaining staff and student feedback on the curriculum through focus groups

In addition to using PAR methods to engage key stakeholders in planning for implementation and evaluation, mixed methods were employed to produce the most generalizable findings possible while giving students and staff a voice in interpreting the data. The qualitative data can also provide valuable information about the process of implementation and suggestions for improving future research on implementing evidence-based practices in the participating alternative schools (Hopson, 2006). Researchers have encouraged greater use of mixed methods research in schools to clarify the process of implementation and its implications for program effectiveness (Hoagwood & Johnson, 2003).

School social workers regularly employ practices similar to those defined as Participatory Action Research by collaborating with professionals from other disciplines, such as teachers, administrators, and school psychologists. They are also trained to understand multiple perspectives on issues. The participatory methods employed in this study are likely to come naturally to many school social workers because they emphasize the importance of allowing the target population to define problems and potential solutions.


All procedures were approved by the Institutional Review Board at the University of Texas at Austin and the participating schools prior to beginning study activities. Purposive sampling was used to select the schools for participation based on their need for drug abuse prevention and their willingness to participate in both phases of the project. Each school included three treatment conditions: students receiving the adapted version of the curriculum, students receiving the original version of the curriculum, and students in a comparison condition who received neither version. Due to the variation between schools, it was important to include all three conditions at each site rather than have separate conditions located at different schools.

Because the alternative schools in this study served different purposes and had different structures, the researchers collaborated closely with staff and administrators to plan the details of implementation (Hopson, 2006). In order to remain true to the theoretical underpinnings of culturally grounded research and PAR, researchers worked to maximize methodological rigor while applying an implementation plan that would work best in each setting. This resulted in slight variations in implementation. For example, some of the schools had pre-existing problem solving classes that were appropriate for the curriculum while others had to use a health class or a pre-existing student support group. Randomization to treatment conditions was not feasible due to administrative constraints. The school principals asked that the curriculum be provided to pre-existing groups and that it be consistent with the purpose of the group or class in which it was offered. This made it impossible to randomly assign students to groups or randomly select classrooms for participation.

After obtaining written consent, participants in all conditions were administered a questionnaire. The questionnaire included items about demographic characteristics, culture and acculturation, current substance use, attitudes about substance use, and use of strategies for resisting substance use. After completing the questionnaire, students in the experimental groups were asked to attend six original or adapted Keepin’ it REAL sessions over the course of six weeks. The duration of the curriculum was determined based on the schedule of disciplinary schools that typically work with students for six to eight weeks before they return to their home schools. The time for offering these sessions was determined by the principal and staff at each setting in order to minimize disruptions to the schedules of staff and students.

After completion of the curricula, students in all three conditions were given the questionnaire again. At this time, students who received the original and adapted curricula were asked to participate in a 45 to 60 minute focus group to discuss their perceptions about the program. All participating students were asked to complete the questionnaire again at six weeks following completion of the curriculum.

At post-test, school staff members were asked to complete the School Success Profile-Learning Organization to assess school culture. They were also asked to participate in focus groups and interviews to discuss their experience of implementing the curriculum.

Facilitator Training

Curriculum facilitators were teachers or counselors at each school. The Keepin’ it REAL curriculum includes an easy to follow teacher’s manual and was designed to require little or no formal training. However, the research team conducted brief trainings with each facilitator prior to implementation. The trainings were completed in 60 to 90 minutes and facilitators were given the teacher’s manual, the student workbooks, and a copy of the videos. The researcher and principal investigator conducted the trainings by showing the curriculum videos and discussing the curriculum session-by-session with the facilitators. During the training, the facilitators were encouraged to ask questions and were given contact information for the researcher and principal investigator. Facilitators were compensated $20 an hour for implementing the curriculum.

Although the researchers included facilitator training in this study, Keepin’ it REAL is a curriculum that school social workers can implement with no formal training because implementation instructions for each session are described in a detailed, inexpensive training manual. It is feasible for a practitioner to implement the curriculum with little or no help from researchers or curriculum developers.


Culture and Acculturation

Initial demographic questions included an ethnicity checklist with various terms for each ethnicity and a blank for filling in a personal identifier as well as a question about generations of family members who were born outside of this country. Cuellar’s ARSMA-II multidimensional acculturation measure was used to allow assessment of both acculturation level and acculturative type (see Cuellar et al., 1995 for more information). Ultimately, analyses of data gathered with this measure will provide information about the differences in substance use among youth with different acculturative types.

Drug Use and Drug Use Expectancies

Measures of drug use expectancies and attitudes were adapted from questionnaire items used to evaluate the original version of Keepin’ it REAL using large school populations. Drug use was measured using items adapted from the Texas School Survey of Substance Use, which has been used for almost a decade to measure trends in substance use among Texas students (TCADA, 2000). Using measures that are consistent with DRS and the Texas School Survey measures provides the opportunity to compare sample characteristics and intervention outcomes across these studies. Because these measures were adapted from those used in other questionnaires, they were evaluated for internal consistency and test-retest reliability for this study. These reliabilities were acceptable, ranging from .81 to .94 (Rubin & Babbie, 2005).
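The internal consistency reliabilities reported above are conventionally computed with Cronbach’s alpha. The article does not describe its computation or software, so the following is only a minimal illustrative sketch, using entirely hypothetical Likert-scale responses rather than the study’s data:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items score matrix.

    alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals)
    """
    k = len(item_scores[0])  # number of items in the scale
    item_vars = [pvariance([row[i] for row in item_scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses: 5 respondents answering a 4-item scale (1-5 Likert)
scores = [
    [4, 4, 3, 4],
    [2, 2, 2, 1],
    [5, 4, 5, 5],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
]
print(round(cronbach_alpha(scores), 2))  # → 0.96
```

A value in the study’s reported .81 to .94 range would similarly indicate acceptable internal consistency; test-retest reliability would instead correlate total scores across the two administrations.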

Questionnaires are a tool that school social workers and other practitioners can also use in evaluating an intervention. In order to preserve confidentiality in this study, school staff did not have access to individual student questionnaires, which would complicate evaluation by school practitioners who want to maintain the same protections. However, students can be asked to complete questionnaires anonymously, and practitioners can then evaluate change on these variables for the entire group over time. School social workers can also partner with a researcher or evaluator to select and administer surveys and analyze the data. Many questionnaire developers also provide the service of analyzing the data and creating reports that explain the results.

School Culture

The School Success Profile – Learning Organization (SSP-LO) was used to measure organizational characteristics that facilitate student learning (Bowen, Rose, & Ware, 2006). This measure was included to determine whether participating alternative schools are significantly different in ways that could facilitate or hinder students’ ability to benefit from participation in the Keepin’ it REAL curriculum (Bowen, Rose, & Ware, 2006).

The SSP-LO assesses characteristics that define a school’s culture. Schools that score higher as learning organizations may be more successful in implementing the intervention because they are likely to be more accepting of new, innovative programs. Because each school created adapted videos and workbook materials that reflected their experiences, the school culture may have an impact on the materials that students choose to produce. For example, schools in which staff collaborate and work as a team facilitate student learning more than schools in which there is little collaboration (Bowen et al., 2006; Lee, Dedrick, & Smith, 1991; Lee & Smith, 1993). Collaboration and teamwork for the purpose of creating a culture that fosters learning are defined not only by interactions among school staff but also by interactions with students, their families, and community members (Bowen et al., 2006). In a school that values student contributions, students may feel that they have more freedom to express their views and that the materials they produce will be respected. The products from this type of school may differ greatly from those of a school setting in which students are given few opportunities to express their opinions. The SSP-LO has demonstrated strong internal consistency reliability and construct validity (Bowen et al., 2006).

School social workers and other practitioners can use measures such as the SSP-LO to plan for implementing a prevention strategy. The measure can provide information about collaboration among staff members and students and about whether school practitioners are accepting of new, innovative practices, and it requires only about ten minutes to complete. Once the surveys have been collected, the survey developers analyze the data and provide a user-friendly report that describes areas in which a school’s culture is consistent with a learning organization and areas in which changes may be needed to foster a positive learning environment that is conducive to innovation and academic success for students. If a social worker finds that the school’s culture is unlikely to be accepting of a new intervention that relies on active collaboration among staff and students, the social worker can work with staff and administrators to develop a plan for building such a culture. Although fostering the culture of a learning organization where it does not yet exist is a daunting task, it can have benefits for staff and students that go far beyond the success of any particular intervention because the culture of a learning organization is associated with positive student outcomes in many areas (Bowen et al., 2006).

Student Focus Groups

Students who receive the curricula are also asked to participate in 45 to 60 minute focus groups to discuss their perceptions about the program. Student focus groups are conducted at posttest to provide information in the students’ own words that can supplement quantitative findings. The focus group protocol includes questions on the following topics:

  • Substance use by peers
  • Approaches that would be useful in preventing abuse of substances
  • Videos used in the curriculum
  • Components of the curriculum that were useful
  • Components of the curriculum that were not useful

Staff Focus Groups

Staff who are involved with implementing the curricula are asked to participate in focus groups or individual interviews to explore their perceptions about the adapted and original versions of the curriculum and about the goodness of fit between the intervention and the school setting. The purpose of the focus group is to supplement data from the SSP-LO and explore whether staff in different organizations express different perceptions about the quality of the intervention and its usefulness for their students.

Staff focus group transcripts are analyzed for themes related to staff perceptions about the adapted and original versions of the curriculum and about the goodness of fit between the intervention and the school setting.

Focus groups are a tool that school social workers and other practitioners can apply in school settings fairly easily. They require little time from school staff and students and can provide very important information about the success of an intervention. It may be helpful to partner with a consultant or external researcher to develop focus group questions, facilitate the groups, and analyze the data. These data can then be used to plan for implementing a prevention program that is likely to meet the needs of staff and students, or for adapting an existing intervention if focus groups reveal that it is not culturally appropriate for the school or students.

Reliability and Validity Checks for Qualitative Data

Some techniques that are helpful in establishing reliability and validity of qualitative data included in this study are:

  • Examining participant responses across different forms of the same question
  • Applying a consistent analytic method
  • Prolonged engagement
  • Triangulation
  • Negative case analysis (Franklin & Ballan, 2001)

The focus groups included different means of obtaining similar information. The protocol asked students to report positive and negative perceptions of the Keepin’ it REAL curriculum and also asked for their ideas about appropriate prevention strategies for their age group. These two categories of questions both generated information about techniques that are and are not helpful in prevention programs for this population.

Applying a consistent analytic method includes predetermining an analytical approach that is guided by a theoretical framework (Franklin & Ballan, 2001). For this study, consistent methods are employed to analyze each focus group. Analysis begins with researchers using open coding to define the themes. They then work to combine redundant themes and reveal additional themes until coding reaches the point of saturation, at which no further themes are evident in the data.

Prolonged engagement involves spending enough time in a setting to reduce distortions in the data that could be caused by the researcher’s presence (Franklin & Ballan, 2001). This procedure was employed in each school to build rapport with school staff, students, and administrators. The researcher began visiting the schools during the adaptation phase of the larger study, which began in September 2005, and made weekly visits to each school between the pretest and posttest administrations, which occurred between March and June of 2006.

Triangulation is the process of using multiple data sources to verify findings (Creswell, 1998; Franklin & Ballan, 2001). By including both quantitative and qualitative methods to explore the curriculum’s impact on substance use and youth attitudes, the researcher can explore the validity of both data sources. When quantitative data and qualitative data provide corroborating evidence that confirms or fails to confirm a hypothesis, the researcher can have greater confidence in both forms of data (Franklin & Ballan, 2001).

Negative Case Analysis involves examining data that disconfirms hypotheses or themes that the researcher has defined (Creswell, 1998; Franklin & Ballan, 2001). In an attempt to further establish the trustworthiness of the data, the researchers look for anomalies, or instances in which the statements diverged from the main themes.

Quantitative Analyses

The analysis of data from this project is ongoing. Repeated measures multivariate analysis of variance (MANOVA) is used to determine the relationship between the dependent variables and the independent variables. Because groups were not randomly assigned to treatment conditions, chi-square and t-test analyses are used to test for differences between groups at pretest. Analysis of variance (ANOVA) is used to assess differences between schools on the School Success Profile-Learning Organization (SSP-LO).
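
The pretest group comparisons described above can be sketched in a few lines. The following Python example is purely illustrative: the scores, sample sizes, and gender-by-condition table are hypothetical, not the study's data.

```python
# Illustrative sketch (hypothetical data): checking for pretest differences
# between non-randomized treatment groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical pretest attitude scores for two treatment conditions
original = rng.normal(loc=3.0, scale=0.8, size=40)
adapted = rng.normal(loc=3.1, scale=0.8, size=40)

# t-test for a continuous pretest measure (e.g., drug attitudes)
t_stat, t_p = stats.ttest_ind(original, adapted)

# Chi-square test for a categorical variable (e.g., gender by condition)
contingency = np.array([[18, 22],   # condition 1: male, female
                        [20, 20]])  # condition 2: male, female
chi2, chi_p, dof, expected = stats.chi2_contingency(contingency)

print(f"t = {t_stat:.2f}, p = {t_p:.3f}; chi2 = {chi2:.2f}, p = {chi_p:.3f}")
```

Nonsignificant results in such checks support treating the non-randomized groups as comparable at baseline, which is the rationale the text gives for these tests.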

Qualitative Analysis

The qualitative analysis for this study is also ongoing. The focus group discussions were audio recorded and transcribed verbatim. These transcriptions are supplemented by the researcher’s notes, written during the focus groups. The analytic process begins with open coding of the focus group transcriptions, in which two researchers independently assign codes to statements related to the research questions. The transcriptions are analyzed for themes related to substance use, attitudes about substances, attitudes about the curriculum, and helpful prevention strategies for the participants.

The researchers analyze the transcriptions independently and manually assign codes to pertinent statements. Each researcher also independently develops a list of preliminary codes. The researchers meet after coding the transcripts to achieve consensus on the preliminary codes. They then independently code the transcripts again, combining redundant codes and achieving greater specificity where necessary, and meet to reach consensus on these secondary codes. A third repetition of this process further combines related codes and produces the final list of codes and themes. Every theme reflects ideas that occurred repeatedly in each of the focus groups. The coding procedure continues until codes reach the point of saturation, at which further analysis yields no additional themes and the researchers agree on the core themes (Lofland & Lofland, 1995; Strauss, 1987).
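
The saturation criterion described above can be made concrete with a small sketch. The Python example below is illustrative only; the function name and the hypothetical codes are assumptions for demonstration, not themes from the study.

```python
# Illustrative sketch (hypothetical codes): tracking whether successive coding
# passes still surface new codes, i.e., whether saturation has been reached.
def new_codes_per_round(rounds):
    """Given a list of code sets (one per coding pass), return how many
    previously unseen codes each pass contributed."""
    seen, counts = set(), []
    for codes in rounds:
        fresh = set(codes) - seen
        counts.append(len(fresh))
        seen |= fresh
    return counts

rounds = [
    {"abstinence unrealistic", "peer pressure", "family influence"},  # pass 1
    {"peer pressure", "boredom", "abstinence unrealistic"},           # pass 2
    {"boredom", "family influence"},                                  # pass 3
]
print(new_codes_per_round(rounds))  # [3, 1, 0]: pass 3 adds nothing new
```

When a coding pass contributes zero previously unseen codes, further passes are unlikely to yield new themes, which is the operational sense of saturation used above.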

Fidelity Assessment

After obtaining permission from facilitators, one of the researchers made several scheduled and unscheduled visits to the groups implementing the curriculum to observe implementation and address any questions from facilitators. The researcher observed approximately 15 minutes of the curriculum during each visit; visits were kept short to prevent students from feeling discomfort discussing substance use issues in the presence of an outsider. The researcher asked the facilitator which session the group was completing in order to assess how completely the facilitator was conducting each session. The researcher also interviewed facilitators at the end of the curriculum to discuss implementation and whether they had chosen to leave out any sections or add any additional material.

Preliminary Findings

Analyses of the data are currently underway. To date, focus group data reveal common themes in student perceptions of this curriculum and of prevention programs in general. Students feel that prevention programs should be conducted with younger students, such as those in elementary and middle school, rather than with high school students. In general, students express that the adapted version of the curriculum is still more appropriate as a universal prevention program for younger students who have not yet initiated use than as an indicated prevention program for their peers who are already experimenting with substances. Abstinence messages receive consistent criticism as unrealistic for this population. Students indicate that the materials created for the adapted version of the curriculum more accurately depict the real-life experiences of local students. Ongoing analyses will be used to determine whether students receiving the adapted version experience the intervention as significantly more realistic than students receiving the original version. Analyses will also determine whether students in the three conditions show significant differences in substance use and attitudes about substances over time. Finally, the researchers will explore whether cultural differences among students are related to patterns of drug use and attitudes.

Analysis of the SSP-LO does not reveal differences in school culture, and scores for each school indicate characteristics consistent with learning organizations. This may help to explain why each school was able to successfully facilitate a student-driven adaptation of the curriculum and its subsequent implementation. The adaptation requires a school culture that is open to innovation and school practitioners who collaborate with each other and with students in decision making, since the students generated the adapted materials. This type of intervention may be more successful in schools, such as those participating in this study, that are characteristic of learning organizations.

Many lessons learned during the implementation and evaluation of the curricula in alternative schools can strengthen future prevention efforts. These lessons can be applied when school practitioners plan to implement a prevention program. Approximately one-third of the students dropped out of the study between pretest and posttest. This attrition is concerning because some of the students most in need of substance abuse prevention may have dropped out of the study. It is important to maintain close consultation with students during implementation to ensure that they feel the intervention is relevant for them. Incentives are an important method for increasing student participation. Again, practitioners will want to consult with students to determine whether incentives are meaningful to them. Gift cards to a particular store, for example, may not be a meaningful incentive for students who have no means of transportation to that store.

The researchers maintained close communication with facilitators during implementation. This was critical for ensuring that the curriculum was completed in many of the schools because facilitators often had questions about particular sessions during the course of the 6-week curriculum. Some facilitators also became frustrated during implementation because of their busy schedules and needed encouragement to continue. When this frustration occurred, the researchers could collaborate with facilitators to problem solve by suggesting that facilitators offer the curriculum at a more convenient time, for example, or that they divide the content of one session into two consecutive days to minimize the time devoted to the curriculum on any given day.

Another factor that contributed to the successful completion of the curriculum by many students was the emphasis on empowering students to be educators about problems with substance use in their schools. Researchers and facilitators in this study repeatedly told students that they are in the best position to educate others about substance use among their peers and to make suggestions for improving prevention programs. Students responded favorably by openly discussing their perceptions of problematic substance use and possible solutions in the focus groups.

Discussion

This article has presented a mixed-methods approach to evaluating adapted versions of an evidence-based substance abuse prevention program. Mixed methods are ideally suited for evaluating such efforts at all points in the research timeline, from project planning and research design, through the intervention’s cultural grounding via adaptation, to program evaluation and ongoing utilization. Including qualitative methods can provide valuable information about the implementation process and ideas for increasing the number of evidence-based practices that are likely to succeed in schools (Hoagwood & Johnson, 2003). The ideas and language of participants that emerge from the qualitative data are invaluable in interpreting findings and defining future research plans.

Participatory Action Research (PAR) methods were infused throughout the study to learn from students and staff about the types of interventions and implementation strategies that are likely to succeed at their schools. It has long been known in social research that interventions cannot flourish without participation and commitment on the part of the host environment (Price & Lorion, 1987; Kelly, 1987; Tornatzky et al., 1983). Partnerships and collaborations that emerge from embracing PAR techniques are the best vehicles to engage, retain, and resonate with students and staff in alternative schools (Hopson, 2006).

The researchers were careful to include measures of culture and acculturation in addition to outcome measures for drug use and attitudes about drugs. These are important for understanding whether the curriculum may be more culturally grounded for some students than others. Since schools typically serve students from a range of cultural backgrounds, a curriculum needs to reflect the culture and life experiences of a diverse student body in order to be successful (Hecht et al., 2003).

Sites’ receptiveness to and enthusiasm about this project varied somewhat because of the complexities of alternative school settings. It is important to utilize instruments such as the SSP-LO to consider organizational culture variables when assessing the needs and benefits of a health promotion curriculum in such settings. Results of the SSP-LO can help researchers understand why an intervention may succeed in one school and not in another. It may also be a helpful tool for determining whether school staff members are ready and willing to implement an evidence-based practice. For some schools, aspects of school culture may need to be addressed before an administrator or researcher attempts to introduce a new intervention.

The energy and time spent connecting with and getting input from school staff and other stakeholders consistently paid off in terms of program involvement and enthusiasm. Students and staff were consistently treated as experts and were consulted as such. The more respectful and non-judgmental the stance of the research team towards the participants, the more open they were in the focus groups.

Although school social workers rarely have the time and resources to replicate all of the evaluation procedures presented here, this article discusses strategies, such as combining anonymous questionnaires and focus groups, that they can use to evaluate their interventions. Partnering with researchers or program evaluators can be mutually beneficial, allowing researchers to learn whether prevention programs are successful in the community and providing school practitioners and administrators with data to guide decisions about continuing to implement a curriculum, creating an adaptation, or finding a new curriculum altogether.

School social workers and many other alternative school practitioners are trained and talented at speaking the “language” of their students and spontaneously adapt curricula to fit their students’ styles and needs. The authors recommend that research in the area of substance abuse prevention include creating and evaluating systematic, youth-centered adaptations to allow for cultural grounding of evidence-based curricula. It is the vision of the authors that no substance abuse prevention curricula will be widely distributed or called effective without directions for cultural adaptation.

Contributor Information

Laura M. Hopson, The University at Albany School of Social Welfare, 135 Western Avenue, Richardson Hall, Room 208, Albany, NY 12222, (518) 591-8787, lhopson@uamail.albany.edu.

Lori K. Holleran Steiker, The University of Texas at Austin, School of Social Work, 1925 San Jacinto Blvd., Austin, TX 78712, (512) 232-9330, lorikay@mail.utexas.edu.

References

  • Allison KR, Rootman I. Scientific rigor and community participation in health promotion research: are they compatible? Health Promotion International. 1996;11(4):333–340.
  • Argyris C. On organizational learning. Malden, MA: Blackwell Business; 1992.
  • Backer TE. Finding the balance: Program fidelity and adaptation in substance abuse prevention: A state of the art review. Rockville, MD: Center for Substance Abuse Prevention; 2001.
  • Bergman P, McLaughlin MW. Implementation of educational innovation. The Educational Forum. 1976;40:345–370.
  • Blakely CH, Mayer JP, Gottschalk RG, Schmitt N, Davidson W, Roitman DB, Emshoff JG. The fidelity-adaptation debate: Implications for the implementation of public sector social programs. American Journal of Community Psychology. 1987;15:253–268.
  • Botvin GJ, Schinke SP, Epstein JA, Diaz T, Botvin EM. Effectiveness of culturally-focused and generic skills training approaches to alcohol and drug abuse prevention among minority adolescents: Two-year follow-up results. Psychology of Addictive Behaviors. 1995;8:183–194.
  • Botvin GJ. Advancing prevention science and practice: Challenges, critical issues, and future directions. Prevention Science. 2004;5(1):69–72. [PubMed]
  • Bowen GL, Rose RA, Ware WB. The reliability and validity of the School Success Profile Learning Organization Measure. Evaluation and Program Planning. 2006;29:97–104.
  • Castro FG, Barrera M, Martinez CR. The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prevention Science. 2004;5(1):41–45. [PubMed]
  • Cuellar I, Arnold B. Acculturation Rating Scale for Mexican Americans-II: A revision of the original ARSMA scale. Hispanic Journal of Behavioral Sciences. 1995;17(3):275–305.
  • Dupper D. Guides for designing and establishing alternative school programs for dropout prevention. In: Franklin C, Harris MB, Allen-Meares P, editors. The School Resource Book for School Social Workers, Counselors and Mental Health Practitioners. New York: Oxford University Press; 2005. pp. 413–422.
  • Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prevention Science. 2004;5:47–53. [PubMed]
  • Erickson C, Mattaini MA, McGuire MS. Constructing nonviolent cultures in schools: The state of the science. Children and Schools. 2004;26:102–116.
  • Flay BR. Mass media and smoking cessation: A critical review. American Journal of Public Health. 1987;77:153–160. [PubMed]
  • Glisson C. The organizational context of children’s mental health services. Clinical Child and Family Psychology Review. 2002;5(4):233–252. [PubMed]
  • Gosin MN, Dustman AE, Drapeau AE, Harthun ML. Participatory action research: Creating an effective prevention curriculum for adolescents in the Southwestern US. Health Education Research. 2003;18(3):363–379. [PubMed]
  • Gottfredson DC, Gottfredson GD. Quality of school-based prevention programs: Results from a national survey. Journal of Research in Crime and Delinquency. 2002;39:55–59.
  • Greenwood DJ, Whyte WF, Harkavy I. Participatory Action Research as a Process and as a Goal. Human Relations. 1993;46(2):175–192.
  • Hopson LM. Unpublished dissertation. The University of Texas at Austin; 2006. Effectiveness of Culturally Grounded Adaptations of an Evidence-based Substance Abuse Prevention Program with Alternative School Students.
  • Hughes J. Commentary: Participatory action research leads to sustainable school and community improvement. School Psychology Review. 2003;32(1):38–43.
  • Kidd SA, Kral MJ. Practicing participatory research. Journal of Counseling Psychology. 2005;52:187–195.
  • Kelly PJ. Practical suggestions for community interventions using participatory action research. Public Health Nursing. 2005;22(1):65–73. [PubMed]
  • Keys W, Sharp C, Greene K, Grayson H. Successful leadership of schools in urban and challenging contexts: A review of the literature. National College for School Leadership; 2003. Retrieved March 20, 2004, from the National College for School Leadership Web site.
  • Grunbaum JA, Kann L, Kinchen SA, Ross JG, Gowda VR, Collins JL, Kolbe LJ. Youth risk behavior surveillance: National Alternative High School Youth Risk Behavior Survey, United States, 1998. Journal of School Health. 2000;70(1):5–17. [PubMed]
  • Harris A, Hopkins D. Introduction to special feature: Alternative perspectives on school improvement. School Leadership and Management. 2000;20(1):6–14.
  • Hecht ML, Marsiglia FF, Elek E, Wagstaff DA, Kulis S, Dustman P. Culturally-grounded substance use prevention: An evaluation of the keepin’ it REAL curriculum. Prevention Science. 2003;4(4):233–248. [PubMed]
  • Hiatt-Michael DB. Schools as learning communities: A vision for organic school reform. The School Community Journal. 2001;11:113–127.
  • Hoagwood K, Johnson J. School psychology: A public health framework I. From evidence based practices to evidence based policies. Journal of School Psychology. 2003;41:3–21.
  • Hofman RH, Hofman WHA, Guldemong H. The effectiveness of cohesive schools. International Journal of Leadership in Education. 2001;4(2):115–135.
  • Holleran L, Reeves L, Marsiglia FF, Dustman P. Creating culturally grounded videos for substance abuse prevention: A dual perspective on process. Journal of Social Work Practice in the Addictions. 2002;2(1):55–78.
  • Institute of Medicine. Reducing risks for mental disorders: Frontiers for preventive intervention research. Washington, DC: National Academy Press; 1994. [PubMed]
  • Kelly JG. Seven criteria when conducting community-based prevention research: A research agenda and commentary. In: Steinberg JA, Silverman MM, editors. Preventing mental disorders: A research perspective. Washington, DC: US Government Printing Office; 1987. DHHS Publication No. (ADM) 87-1492.
  • Kubik MY, Lytle L, Fulkerson JA. Physical activity, dietary practices, and other health behaviors of at-risk youth attending alternative high schools. Journal of School Health. 2004;74(4):119–124. [PubMed]
  • Lee VE, Smith JB. Effects of school restructuring on the achievement and engagement of middle-grade students. Sociology of Education. 1993;66:164–187.
  • Lehr CA, Moreau RA, Lange CM, Lanners EJ. Alternative schools: Findings from a national survey of the states. Minneapolis, MN: University of Minnesota, Institute on Community Integration; 2004.
  • Price RH, Lorion RP. Prevention programming as organizational reinvention: From research to implementation. In: Shaffer D, Philips I, Enzer NB, editors. Prevention of mental health disorders, alcohol and other drug use in children and adolescents. OSAP Prevention Monograph-2. Washington, DC: U.S. Government Printing Office; 1987. DHHS Pub. No. (ADM) 90-1646.
  • Raywid MA. Alternative schools: The state of the art. Educational Leadership. 1994;52(1):26–31.
  • Rubin A, Babbie ER. Research methods for social work. 5th ed. Belmont, CA: Brooks/Cole – Thomson Learning; 2005.
  • Sussman S, Dent CW, Stacy AW, Craig S. One-year outcomes of project towards no drug abuse. Preventive Medicine. 1998;27:632–642. [PubMed]
  • Sussman S, Sun P, McCuller WJ, Dent CW. Project Towards no Drug Abuse: Two-year outcomes of a trial that compares health educator delivery to self-instruction. Preventive Medicine. 2003;37(2):155–162. [PubMed]
  • Texas Commission on Alcohol and Drug Abuse (TCADA). 1998 Texas school survey of substance use among students on the border: Grades 4–12. 2000. Retrieved October 15, 2005.
  • Tornatzky LG, Fergus EO, Avellar JW, Fairweather GW. The process of technological innovation: Reviewing the literature. Washington, D.C: National Science Foundation; 1983.
  • Vaughn D, Slicker E, Van Hein J. Adolescent problems and coping strategies: Alternative schools versus non-alternative school students. Research in the Schools. 2000;7(2):41–48.
  • Whyte WT. Participatory action research. Newbury Park, CA: Sage Publications; 1991.