J Behav Health Serv Res. Author manuscript; available in PMC 2017 April 1.
PMCID: PMC4312742
NIHMSID: NIHMS618215

A Statewide Common Elements Initiative for Children’s Mental Health

Abstract

Many evidence-based treatments (EBTs) for child and adolescent mental health disorders have been developed, but few are available in public mental health settings. This paper describes initial implementation outcomes for a state-funded effort in Washington State to increase EBT availability via a common elements training and consultation approach focused on 4 major problem areas (anxiety, PTSD, depression, and behavioral problems). Clinicians (N = 180) reported significant improvement in their ability to assess and treat all problem areas at post-consultation. Clinicians from organizations with a supervisor-level “EBT champion” had higher baseline scores on a range of outcomes, but many differences disappeared at post-consultation. Outcomes suggest that a common elements initiative that includes training and consultation may positively impact clinician-level outcomes, and that having “in-house” organizational expertise may provide additional benefits.

Introduction

Over 500 evidence-based treatments (EBTs) for child and adolescent mental health disorders exist, many with multiple randomized controlled trials (RCTs) supporting their efficacy in improving outcomes and functioning.1,2 However, few EBTs are provided in public mental health settings,3,4 leading to recognition that community-level implementation of EBTs is proceeding at “an unacceptably slow pace.”5(p208) To address this science-to-practice divide, states such as Hawaii and New York have used a variety of strategies6 to overcome barriers to EBT adoption and to build in facilitators, with the ultimate goal of increasing child and adolescent access to and receipt of EBTs. Organizational champions, individuals who make use of their reputations and informal status to support a change effort, have been identified as an important facilitator in theoretical models of implementation.7–9 Empirical studies support the effectiveness of local champions in facilitating implementation, particularly in its earlier phases (e.g., initial adoption).10 Frequently, champions are situated at some level of an organization’s leadership,11 where they are more likely to hold status and have an influence on the behavior of others.

The most commonly used implementation strategy has been in-person training for mental health professionals.12 Hawaii, for example, has a 10-year history of providing training and additional supports (e.g., performance monitoring) for a range of specific EBTs and for common elements of EBTs.13 Eighteen states have undertaken initiatives to implement Trauma-Focused Cognitive Behavioral Therapy14 (TF-CBT), offering training, a period of expert case consultation, and supervisor support among other strategies.15 In New York State, an EBT center was established that offers in-person training and case consultation in EBTs for depression, disruptive behavior disorders, and Posttraumatic Stress Disorder (PTSD).16

Although many states offer training and other supports to expand EBT availability, state-funded initiatives often have limited evaluations of implementation outcomes (e.g., clinician adoption, fidelity) and/or child/adolescent outcomes. Furthermore, more detailed outcomes, when available, are typically reported only to funders (e.g., states, foundations). Only a limited number of reports in the scientific literature have included implementation and/or clinical outcomes (for some exceptions, see 13,16–20).

Evaluating the impact of EBT training efforts is a priority given training’s role as a primary implementation strategy. Recent findings21,22 document the importance of the method of training delivery (active vs. predominantly didactic) and of the specific implementation supports following initial training (e.g., case consultation, supervision). Training alone, without additional supports,23 appears insufficient for changing clinician practice.21,22 Empirical evaluations of training efforts and accompanying supports, such as consultation, will inform future efforts so that investments made in training are maximized.

This paper describes initial implementation outcomes (e.g., knowledge gains, adoption) for an ongoing statewide implementation effort to increase EBT availability in child public mental health in Washington State. The initiative, CBT+, is funded by the state organization overseeing public mental health services (i.e., Washington State Department of Social and Health Services, Division of Behavioral Health and Recovery) using Federal Block Grant dollars. The CBT+ Initiative provides clinicians with in-person training and expert consultation in cognitive behavioral therapy (CBT) and parent management training (PMT) for the most common mental health problems of childhood. From 2009 to 2013, nearly 500 clinicians from more than 53 organizations participated. This paper first describes the development of the CBT+ Initiative and the rationale for a common elements training approach in Washington State, then describes the CBT+ training and consultation approach. Lastly, this paper presents and discusses the implications of clinician-level outcomes for knowledge and adoption, considering consultation dose, the impact of being from an organization with a supervisor-level EBT “champion,” and change over time from multipoint analyses with a subsample of trainees.

CBT+ Development

The impetus for CBT+ grew out of the early excitement and potential promise of common elements approaches developed by Weisz and Chorpita.25,26 The CBT+ Initiative evolved from an earlier statewide, public mental health TF-CBT training and consultation initiative. Trauma-focused CBT is a well-established EBT for trauma-related sequelae, including PTSD.27 The TF-CBT focus began with a National Child Traumatic Stress Network grant (second author, PI). When grant support ended in 2005, the state provided funding to sustain yearly TF-CBT training and consultation. The TF-CBT Initiative was well-received by public mental health organizations, as evidenced by registration slots filling within a few days, clinicians completing initiative requirements (i.e., participation in the 2-day training and 6 months of biweekly consultation calls), and multiple requests for additional trainings.

After 2 years of state support for TF-CBT, the lead faculty (first and second authors) and participating organizations noted that an intervention with a single primary focus (i.e., trauma impact) had limited reach in public mental health. Only a small percentage of children referred for mental health services have trauma-related symptoms as the presenting or primary treatment concern.28 In Washington, exposure to a specific CBT approach – TF-CBT – seemed to create interest in how to apply CBT skills to other presenting problems. However, a broader, alternative approach was needed to efficiently and effectively extend EBTs to meet provider interest and the treatment needs of the children and their caregivers.

Rationale for a Common Elements Training Approach

CBT+ was developed to broaden EBT applicability for the public mental health population. Researchers have called for new approaches to EBT dissemination and implementation to extend reach and more rapidly achieve the goal of improving outcomes on a broad scale.29,30 Following the work of Chorpita and Weisz, one potential strategy for accomplishing this goal is the use of a common elements approach.31 Most EBTs for commonly occurring child and adolescent problem areas (depression, anxiety, behavior disorders) comprise discrete clinical interventions or strategies, termed “practice elements” or “kernels” (e.g., relaxation, praise, exposure31,32). Typically, common elements approaches are modular: practice elements can be delivered independently or together to achieve specific treatment outcomes.33 An approach that provides training in elements relevant to the treatment of depression, behavior problems, and anxiety (including PTSD) would equip clinicians to treat the majority of children seeking services in the public mental health system. In Washington State specifically, a report28 on child service utilization (N = 30,055) indicated that at least 70% had diagnoses that fell in 1 of these 3 areas. A modularized common elements approach offers two additional advantages: potentially better acceptability among clinicians,25,34 given the flexibility it allows when faced with comorbidity or treatment interference, and the potential to streamline training and consultation.35,36 For many states and organizations, it is not feasible to pay for training and ongoing supervision in multiple single-focus EBTs to treat the range of diagnoses and problem areas seen in public mental health settings. Focusing on clinician competency in common practice elements could be an efficient, attractive, and cost-effective alternative.37

Empirical Evidence for Common Elements Approaches

Although evidence is accumulating, it is important to note that the current enthusiasm for common elements approaches has advanced more rapidly than empirical support for their effectiveness. In a recently completed RCT of Chorpita and Weisz’s Modularized Approach to Treating Children and Adolescents (MATCH),38 the modularized common elements approach resulted in better client outcomes than traditional EBT approaches or usual care at both post-treatment and at a 2-year follow-up.25,39 Chorpita and colleagues have over 10 years of history testing a common elements approach to treating anxiety disorders with positive outcomes (e.g., 26,40). Additional RCTs of MATCH are ongoing in Maine, Massachusetts, and California (Chorpita and Weisz, MacArthur-funded). For adult populations, Barlow and colleagues developed and are testing a common elements, transdiagnostic approach (i.e., Unified Protocol41), with preliminary results from 2 small open trials42 and 1 small RCT43 showing promise. A series of pilot projects also have tested the feasibility of common elements approaches in school settings.44,45 For instance, Lyon and colleagues44 trained clinicians in school-based health centers in a common elements approach adapted from the work of Chorpita et al.46 Findings indicated acceptability and feasibility at the clinician level. Globally, researchers developed a common elements intervention for delivery by lay counselors, with pilot data showing promise in southern Iraq and on the Thailand-Burma border.47 Recently completed RCTs at both sites show promising results.

Method

CBT Plus Training and Consultation Approach

CBT+ includes a 3-day, in-person, skills-based training provided by the CBT+ developers (first and second author) and other CBT+ faculty, all of whom have CBT expertise (e.g., national trainers in other EBTs, received MATCH training, developers/trainers for other common elements approaches47). Training is followed by 6 months of biweekly phone consultation, predominantly provided by CBT+ faculty. Organizations are required to send 1 supervisor and 2–3 clinicians so that each has a participating supervisor and a clinician cohort. Participants also have access to a yearly, advanced, 1-day booster training and the CBT+ listserv. Supervisor-specific supports are also offered.56 Yearly trainings are announced through email distribution by the state division for mental health to regional networks that manage community mental health organization contracts and via announcement on the CBT+ listserv.

Training content for CBT+ includes a focus on assessment and treatment for 4 presenting problem areas: depression, anxiety, trauma-related anxiety (i.e., PTSD), and behavior problems. In the area of assessment, trainees practice scoring completed standardized measures for case vignettes and practice giving assessment feedback through behavioral rehearsal in small peer groups. Assessment measures used in CBT+ were selected based on the following characteristics: 1) a small number of items; 2) quick scoring by hand; 3) availability in the public domain (i.e., no cost to organizations); and 4) strong psychometric properties (e.g., Child Posttraumatic Stress Scale48).

CBT+ training focuses on emotion regulation skills, exposure, and cognitive reprocessing for anxiety and PTSD; on emotion regulation skills, behavioral activation, problem solving, and cognitive reprocessing for depression; and on PMT strategies (e.g., praise, rewards, consequences) for behavioral problems. Given the trauma-focused origins of CBT+, PTSD treatment includes positioning the common elements within the TF-CBT model.14 Following the MATCH approach,38 the CBT+ model encourages a primary focus on the indicated elements for each problem area, but supports modularity – or tailoring – when clinically indicated (e.g., adding PMT to anxiety treatment when a child has comorbid anxiety and behavior problems). Training also targets CBT general competencies (e.g., agenda setting, collaborative homework assignment49) found to occur infrequently in community mental health settings.29 The process of CBT+ training is active and includes experiential learning activities (e.g., cognitive restructuring activity for a situation in the clinicians’ own life), trainer modeling and video demonstration of skills, trainee behavioral rehearsal of practice elements with both peer and trainer feedback and coaching, and small and large group work. Training is tailored to focus on the typical clients, setting constraints (e.g., 50-minute sessions), and identified EBT implementation challenges in public mental health settings (e.g., decision-making when faced with comorbidity, engagement strategies).

Within 3 weeks of the training, clinicians begin CBT+ expert-led consultation calls that focus on implementing CBT with clients on their caseloads. Calls are led by the first, second, and fifth authors, and by other CBT+ faculty. Each call group includes 3–4 organizational teams with approximately 10–15 trainees (clinicians and supervisors) per call. A specific case presentation format is used to help clinicians quickly get to the clinical question or difficulty so that the consultant and peers on the call can plan for concrete next steps. Calls involve reviewing assessment data to determine clinical focus, application of CBT+ components to cases, and problem-solving challenges with child and caregiver engagement. Trainees are expected to present at least 1 case over the consultation period and attend 9 of 12 calls in order to receive a CBT+ certificate of participation.

CBT+ emphasizes identifying and leveraging local EBT “champions” to support implementation and sustainment within their organizations. One ongoing mechanism for building and utilizing champions in CBT+ involves inviting individuals at the supervisor level who successfully completed the CBT+ Initiative to co-facilitate consultation calls with one of the CBT+ faculty. This benefits both the consultation group and the organization: the champion brings experience using CBT+ to the consultation call group, and the champion’s own CBT+ expertise is enhanced through the opportunity to hone supervisory/consultation skills, which may advance CBT+ practice and supervision in the organization.

Procedures

Data for the current study predominantly come from an evaluation of all CBT+ trainees during the first 3 years of CBT+ (2009–2011; 4 cohorts). Trainees completed questionnaires at the beginning of training (via paper and pencil, at the training) and after the 6-month consultation period (via online survey, same content), henceforth referred to as the pre-post sample. For 3 of the 4 cohorts (2010–2011), data also included expert consultant ratings for all case presentations on each of the 12 consultation calls (4-item online survey). Participation in the pre-post evaluation was expected as part of Initiative involvement. Although no incentives were provided, participants received multiple reminders to complete the online surveys and also received telephone reminders in an attempt to increase completion rates. To the authors’ knowledge, clinicians and supervisors who participated in the CBT+ training did not receive any systematic incentives or workload reductions from their organizations; however, the Initiative provided 18–20 hours of Continuing Education Credits for attending the training.

Supplementing these data, all trainees in these 3 cohorts were asked to also participate in a more time-intensive, longitudinal evaluation of CBT+, for which incentives were available, given the increased burden on participant time. Those who agreed received a $10 gift card for completion of additional measures, via web survey, at each of 4 assessment points: pre-training, post-training (within 1 week), after the 6-month consultation was completed, and at 3 months post-consultation. All evaluation activities were reviewed by the Washington State IRB and determined to be exempt. This manuscript focuses predominantly on analyses with the pre-post sample of participants in the larger evaluation, with secondary analyses using the subsample that completed the longitudinal evaluation.

Participants

Participants were clinicians and supervisors employed at public mental health clinics in Washington State (see Table 1) who participated in the CBT+ Initiative. Of the 400 total participants in the 4 Washington State CBT+ training cohorts between 2009–2011 (fall 2009, fall 2010, spring 2011, fall 2011), 320 completed the pre-training survey. Of these, 36 were not clinicians or supervisors (i.e., administrators, case managers, or in other roles) and were excluded, resulting in a sample of 284 (78% participation rate). Of the 284, 180 completed the 6-month post-consultation assessment and constitute the primary sample for this study. Reasons for attrition (N = 104) included departure from the organization (n = 27; 26%) and general non-response (n = 77; 74%) (see Missing Data Analyses section for examination of attrition bias). Pre-post sample participants (N = 180) were predominantly female (80%), Caucasian (84%), Master’s-level clinicians (93%) in their late twenties and thirties (68%) (see Table 1). The longitudinal subsample consisted of 71 participants, representing 54% of CBT+ trainees from the 3 cohorts in years that included the longitudinal evaluation (2010–2011).

Table 1
Participant demographics

Measures

Demographics

Participants completed a questionnaire that included demographic (e.g., sex, ethnicity) and background information (e.g., role in the organization [clinician, supervisor], years of experience).

Self-report of skill

Participants self-reported skill/ability for individual treatment elements for each of the 4 problem areas addressed in the CBT+ training (0, do not use; 1, minimal; 2, minimal to moderate; 3, moderate; 4, moderate to advanced; 5, advanced). Participants were asked to rate their skills and understanding of elements separately for each problem area. Therefore, certain items were common across all or some problem areas (e.g., using assessment measures, psychoeducation). Other items were unique by problem area (e.g., for depression: “Pleasurable activity scheduling (help the child identify, plan for, and engage in fun activities)”; for anxiety: “Facing your fears/exposure (developing a list of feared situations/memories, helping the client gradually face them)”; for behavior problems: “Positive parenting (increase positive time together)”; and for trauma/PTSD: “Trauma narrative (developing and working with child to modify cognitive distortions throughout narrative)”).

A principal components factor analysis with Varimax orthogonal rotation was conducted; the scree plot showed a clear inflection, with 4 factors accounting for 68.3% of the variance: depression/anxiety, behavioral problems, trauma, and a fourth factor reflecting assessment skill. Items for depression/anxiety were divided into 2 scales, despite clustering as 1 factor, to aid interpretation and to explore the possibility of different trajectories of change. The 5 final scales were anxiety (7 items; Cronbach’s α = .89), depression (8 items; Cronbach’s α = .88), behavioral problems (10 items; Cronbach’s α = .94), PTSD (10 items; Cronbach’s α = .94), and assessment (4 items; Cronbach’s α = .90).
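The internal-consistency coefficients reported above follow the standard Cronbach's α formula, which can be computed directly from raw item scores. A minimal sketch (the item-score matrix below is hypothetical, not the study data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of scale totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 0-5 skill ratings from four respondents on two items;
# perfectly parallel items yield alpha = 1.0.
alpha = cronbach_alpha([[1, 2], [2, 3], [3, 4], [4, 5]])
```

Sample variances (ddof=1) are used throughout; some software uses population variances, which yields the same α because the (n − 1) factors cancel.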

Evidence-based clinician-level activities

The Evidence-based Clinician Checklist is a locally developed questionnaire used to assess participant practice characteristics consistent with EBTs. Items are rated on a 4-point Likert scale (1, rarely; 2, occasionally; 3, regularly; 4, almost always). The checklist was derived from review of the EBT literature for the purpose of specifically capturing the hallmark clinician activities associated with delivery of CBT. By design it was intended to include only essential activities, described in a behaviorally specific way, so as to be readily understood by clinicians. Sample items included, “I use standardized measures or questionnaires to identify and measure specific clinical conditions (depression, PTSD, ADHD, behavior problems),” and “I give verbal or written feedback about my diagnostic or clinical impressions to the child and/or the child’s caregiver and establish agreement on the problems to address in treatment.” Because of the late introduction of this measure to the study, these data are only available for 3 of 4 cohorts of the sample (fall 2010, spring 2011 & fall 2011 training). Principal components factor analysis on the original 8 items revealed that 3 items had unacceptable component loadings (< .5), degrading scale internal consistency. The scale was reduced to one 5-item factor, accounting for 58.4% of the variance (Cronbach’s α of .82), assessing: 1) use of standardized assessment, 2) providing assessment feedback and establishing agreement on problems, 3) providing information on treatment options, 4) using a specific EBT matched to clinical need, and 5) re-administering measures.

Evidence-based organization-level activities

The Evidence-based Organizational Checklist is a questionnaire used to assess organizational characteristics that support the delivery of EBTs (REMOVED, unpublished measure). Items are rated on a 4-point Likert scale (1, never; 2, occasionally; 3, most of the time; 4, ongoing/routine). This measure was also introduced later in the study and is only available for 3 cohorts. The checklist was derived from review of the dissemination and implementation literature for the purpose of specifically capturing the hallmark organizational activities associated with EBT uptake. Again, by design it was intended to include only essential activities and describe them in a behaviorally-specific way. This measure was included because participation in CBT+, or other EBT training, is sometimes an organization’s first foray into EBTs. Organizations may make some changes to support EBTs over the course of the 6-month consultation period (e.g., instituting routine screening and assessment; setting up a CBT+ specific supervision group). Sample items for this measure include, “Executive leadership (e.g. administrators, directors) explicitly and repeatedly express support for and promote use of Evidence Based Practices,” and “Clinicians are provided with EBT training opportunities and ready access to EBT materials (manuals, handouts, equipment).” After principal components factor analysis, the original 7-item scale was reduced to 6 items (1 had a low factor loading [< .5]), assessing: 1) executive leadership support for EBTs; 2) availability of EBT training and materials; 3) EBT clinical supervision; 4) policies around screening and assessment; 5) an agency expectation to use a matched EBT, if available; and 6) procedures for fidelity monitoring. These items represented a single factor, accounting for 60.3% of the variance, with a Cronbach’s α = .87.

Consultation dose

Consultation call attendance was reported by the expert consultants leading the consultation calls.

Case presentation quality ratings

Clinicians’ case presentations on the consultation calls were rated by the expert consultant to examine adherence to training expectations. Consultants received an online survey on the day of their consultation call and rated each presentation for presence/absence on: 1) assessment data; 2) identification of a clinical target; 3) specification of 1 or more appropriate CBT+ component(s) (e.g., mentioning use of exposure for anxiety treatment); and 4) homework assignment. These items were summed to create a consultation call quality index for each presentation (range: 0–4).

Organizational Champion Status

Organizations were dichotomized on the basis of whether they had an internal, supervisor-level EBT champion. For purposes of this study, an EBT champion was operationalized as a supervisor who served as a co-consultant on at least 1 consultation call. Because the literature suggests that champions in leadership roles may have more influence,11 only supervisor-level champions were invited to be consultation co-facilitators. CBT experts who led the consultation calls identified the EBT champion co-consultants as vocal proponents and adopters of either TF-CBT (before the CBT+ Initiative began) or CBT+ (when these individuals themselves were consultation call participants). Champions also served an advisory role for the Initiative and participated in in-person meetings or conference calls approximately once or twice a year.

Results

Descriptive Analyses

Table 1 presents demographic and descriptive data stratified by participants (clinicians and supervisors) with only baseline data (N = 284 for most data; n = 217 or 214 for evidence-based clinician and organizational-level activities, respectively) and those with follow-up data (N = 180 for most data; n = 146 or 144 for clinician and organization evidence-based activities). Call attendance data were available for 155 (86.1%) participants (attendance data were not available for two 2009 consult call groups). For these participants, the mean number of calls attended was 9.4 (SD = 1.9, range = 1–12), with 80% (n = 124) attending 9 or more calls. More than half (56%, n = 87) attended 10 or more calls, exceeding the 9 required to receive the CBT+ certificate of participation. The mean number of cases presented by participants during the 6-month consultation period was 2.0 (SD = 1.3, range = 1–7).

Missing Data Analyses

Cross tabulations with chi-square analyses and t-tests were run to examine any differences between participants with only a baseline survey and those with a follow-up. There were no significant differences on demographics or outcome variables at baseline. However, those with baseline-only data attended fewer consultation calls (M = 7.1 vs. 9.4, t(98.5) = −6.6, p < .001), were less likely to receive clinical supervision (91.3% vs. 97.2%, χ2(2) = 4.8, p = .028) and supervision in CBT (20.2% vs. 31.3%, χ2(1)= 4.09, p = .043), and were more likely to have left their organization during the 6-month period (19.2% vs. 4.4%, χ2(1) = 16.5, p < .001).
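Attrition comparisons of this kind can be sketched as follows. All scores and counts below are hypothetical stand-ins for the evaluation data, and Welch's unequal-variance t-test is assumed from the fractional degrees of freedom (e.g., t(98.5)) reported above:

```python
import numpy as np
from scipy import stats

# Hypothetical stand-ins: consultation calls attended by follow-up
# completers vs. baseline-only participants.
rng = np.random.default_rng(0)
calls_completers = rng.normal(9.4, 1.9, size=155).clip(1, 12)
calls_baseline_only = rng.normal(7.1, 2.5, size=60).clip(0, 12)

# Welch's t-test (unequal variances) gives fractional df like t(98.5).
t_stat, p_val = stats.ttest_ind(calls_baseline_only, calls_completers,
                                equal_var=False)

# Chi-square test of independence on a 2x2 cross tabulation,
# e.g., left organization (yes/no) by completer status (counts hypothetical).
table = np.array([[20, 84],    # baseline-only: left, stayed
                  [8, 172]])   # completers:    left, stayed
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
```

Note that `chi2_contingency` applies Yates' continuity correction by default for 2×2 tables, which is a common but not universal convention.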

Primary Analyses

Change over time

Table 2 depicts paired t-tests indicating that participants reported significant improvements on self-reported skill in all five areas: anxiety (t(177) = −10.01, p < .001, d = 0.76), depression (t(173) = −8.38, p < .001, d = 0.64), behavioral problems (t(175) = −5.72, p < .001, d = 0.43), PTSD (t(176) = −11.10, p < .001, d = 0.84), and assessment (t(178) = −15.06, p < .001, d = 1.16). They also reported significant improvements on EBT activities at the clinician (t(143) = −10.24, p < .001, d = 0.86) and organizational level (t(145) = −3.27, p = .001, d = 0.28). Cohen’s d effect sizes (calculated as within-subject effect sizes, adjusting for the correlations between the means; equation 8 in ref. 50) were moderate (> .5) to large (> .8) for most tests, with assessment skill, clinician EBT checklist scores, and PTSD/trauma showing the largest effect sizes.
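Within-subject effect sizes of this kind can be illustrated with a small sketch. Here d_rm divides the mean pre-post difference by the SD of the difference scores, and d_ig rescales d_rm to the independent-groups metric using the pre-post correlation; these are common conventions, and the exact equation the paper cites may differ in detail. The function and data are hypothetical:

```python
import numpy as np
from scipy import stats

def paired_change(pre, post):
    """Paired t-test plus within-subject Cohen's d.

    d_rm = mean difference / SD of difference scores (repeated-measures d);
    d_ig rescales d_rm via the pre-post correlation r. One common
    convention; the paper's exact cited equation may differ.
    """
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    t_stat, p_val = stats.ttest_rel(pre, post)   # paired-samples t-test
    diff = post - pre
    d_rm = diff.mean() / diff.std(ddof=1)        # repeated-measures d
    r = np.corrcoef(pre, post)[0, 1]             # pre-post correlation
    d_ig = d_rm * np.sqrt(2 * (1 - r))           # independent-groups metric
    return t_stat, p_val, d_rm, d_ig
```

Because repeated-measures d uses the (typically smaller) SD of difference scores, it runs larger than the independent-groups metric when pre and post scores are highly correlated.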

Table 2
Pre-post sample: Paired t-tests for change in self-reported understanding and skill

Association with consultation

Two-Way Repeated-Measures (RM) ANOVAs were run to explore whether changes were related to consultation call dosage or participation (operationalized as number of cases presented and as the case presentation quality index). There were no significant associations.

Case presentation quality

Table 3 provides detail from the independent case presentation quality ratings for the first 4 cases presented. Presentation quality was high across all 4 case presentations (Ms = 3.4–3.9). Three RM-ANOVAs were run to explore variation within each of the first 3 case presentations (the fourth was underpowered, n = 12). All 3 were significant. Pairwise between-measure tests, with Sidak’s adjustment for multiple comparisons, indicated that, of the 4 elements rated by consultants, significantly fewer participants specified assigning homework. In addition, second and later case presentations were more likely to include mention of specific CBT+ components.

Table 3
Pre-post sample: Case presentation descriptives for the first four cases presented by each participant

Exploratory Analyses

Association with champion organization status

The potential impact of having an internal organization EBT champion was explored by examining whether the 37 participants who were in organizations with a champion had different mean scores at either assessment or different rates of change. Analyses involved t-tests and Two-Way RM-ANOVAs (time x champion status) to test for statistical significance. Participants from champion organizations reported higher scores at baseline for skill in 3 of 5 areas: anxiety (t(106.8) = −2.26, p = .026), PTSD (t(80.1) = −3.90, p < .001), and assessment (t(171) = −2.21, p = .039). They also had higher scores for evidence-based activities at both the clinician (t(137) = −2.87, p = .005) and organizational level (t(138) = −4.00, p < .001). Group differences for skill in depression and behavior problems were not significant. At post-consultation, participants from champion organizations reported higher scores only for skill in PTSD (t(171) = −2.76, p = .006) and evidence-based activities at the clinician (t(139) = −2.79, p = .006) and organizational levels (t(139) = −4.31, p < .001). There were no statistically significant time x champion status interaction terms; champion status was not related to rate of change on any of the outcome measures (self-reported skill in the five areas, evidence-based clinician or organizational activities).

Associations between organization champion status and call ratings for the first case presentation were also explored (as most participants had at least one case presentation). There were no differences in mean quality ratings between groups. Of the 2 elements with more variation (i.e., specifying CBT+ components, assigning homework), a greater percentage of participants from champion organizations reported assigning homework (75% vs. 52%, χ2(1) = 4.7, p = .03).

Longitudinal subsample analysis

To examine change over multiple time points more sensitively, the same set of outcome variables was examined in the longitudinal subsample with baseline data on all outcome measures (n = 71). A series of multilevel growth models with full maximum likelihood estimated the overall linear and piecewise rates of change between time points. For each outcome, −2 Log Likelihood deviance statistics determined the best-fitting model (see Table 4). Four outcomes significantly improved at a linear rate over all time points from pre-training through 3-months post-consultation follow-up: depression skill (β = .25, p < .001); anxiety skill (β = .30, p < .001); behavior problem skill (β = .30, p < .001); and the organization checklist (β = .10, p < .001). Three outcomes significantly improved from pre-training through consultation, but then plateaued (i.e., no significant change) from the end of consultation to the 3-month follow-up: PTSD skill (β = .38, p < .001), assessment skill (β = .70, p < .001), and clinician evidence-based activities (β = .34, p < .001). This plateau does not appear related to ceiling effects; for all outcomes, standard deviations were not significantly smaller and negative skew was not significantly greater in the last two time points, compared to the first two. Across outcomes, mean scores in the last two time points ranged from 1.5 to 2.5 standard deviations lower than the highest possible score. These statistics indicate that there was additional variance possible in each of the two final time points for all outcome measures (analyses available upon request).
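The linear versus piecewise growth specifications can be illustrated by their time codings. The variable names below are hypothetical, and the commented model call assumes statsmodels' MixedLM with ML (not REML) estimation so that deviances of nested models are comparable:

```python
import pandas as pd

# Four assessment waves: pre-training (0), post-training (1),
# post-consultation (2), 3-month follow-up (3).
waves = pd.DataFrame({"wave": [0, 1, 2, 3]})

# Linear specification: a single slope across all four waves.
waves["linear"] = waves["wave"]

# Piecewise specification: slope1 runs through the end of consultation;
# slope2 captures change from consultation end to follow-up only,
# allowing the improve-then-plateau pattern reported for PTSD skill,
# assessment skill, and clinician evidence-based activities.
waves["slope1"] = waves["wave"].clip(upper=2)        # 0, 1, 2, 2
waves["slope2"] = (waves["wave"] - 2).clip(lower=0)  # 0, 0, 0, 1

# Hypothetical model comparison on person-period data (not shown):
# import statsmodels.formula.api as smf
# fit = smf.mixedlm("skill ~ slope1 + slope2", data,
#                   groups=data["id"]).fit(reml=False)
# With ML estimation, -2 * fit.llf deviances of nested models can be
# compared, as in the text's -2 Log Likelihood model selection.
```

A significant slope1 with a nonsignificant slope2 corresponds to the "improve then plateau" trajectory described above.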

Table 4
Longitudinal sample: Mixed model regression coefficient estimates for rate of change over time on all outcome variables

Discussion

This project was designed to evaluate implementation factors in the context of a common elements training program, delivered as part of a statewide initiative. The majority of clinicians who remained with their organizations completed the required consultation calls, suggesting some acceptability of both the CBT+ training process (in-person training, consultation calls) and potentially of the components-based approach. A number of clinicians completed more than the required number of consultation calls and case presentations, which may indicate that clinicians perceived the calls to be beneficial to their practice and worth the time investment. Based on their self-report, clinician skill improved significantly across the 4 targeted problem areas (i.e., depression, anxiety, behavior problems, and PTSD), with effect sizes, albeit calculated without a comparison condition, in the moderate to large range. However, clinician self-report of skill may or may not be representative of actual behavioral improvements.51 Independent consultant ratings of clinicians' case presentations provide some additional information on clinician adoption of the CBT+ model: clinicians noted a CBT+ area of focus (e.g., depression), used standardized measures, and mentioned CBT+ components when discussing cases. However, by independent consultant report, clinicians were significantly less likely to report having assigned therapeutic homework activities. Other research on usual care29 indicates that, as in CBT+, homework assignment and review occur infrequently, despite being an important CBT competency.49 These findings bolster past research suggesting a need for greater attention to client homework assignment and review during training and consultation.

Interestingly, participants from organizations with an internal champion had higher pre-training levels of self-reported skill in 3 of the 5 areas (anxiety, PTSD, and assessment), although, except for PTSD, these differences disappeared by the post-consultation assessment point. That is, clinicians from organizations with a champion did not demonstrate an increased rate of change. For evidence-based clinician- and organizational-level activities, differences between clinicians from organizations with and without an organizational champion were maintained from pre-training to post-consultation, suggesting that having an organizational champion may be more beneficial in these areas. One promising finding, given the low base rate across Initiative participants, was that those from organizations with an internal champion were more likely to specify homework activities in case presentations, suggesting that organizations with champions may be paying greater attention to, and/or supporting, this general CBT competency. Among other things, positive organizational culture and climate have been linked to EBT uptake, decreased clinician turnover, and improved child outcomes.52,53 The literature promotes the idea of EBT champions within organizations for implementing and sustaining EBTs.54 Although not the focus of this paper, the field would benefit from additional research on how to build champions, or other internal network change agents,55 within organizations to promote and advance EBTs.

The multi-time point analyses provide additional insight into the pattern of CBT+ participant change over time, and in particular, the importance of consultation. Although some outcomes continued to improve through the 3-month post-consultation follow-up, a number of gains – skill in assessment, skill in treating PTSD, and evidence-based clinician activities – plateaued. Given the small sample size (n = 71), these analyses are exploratory in nature, but reflect other findings21,23 from clinician training on the potential need for ongoing consultation or supervision. Organizations likely need to explicitly continue EBT supports to maintain improvements, either through local supervisors’ continuing to provide support around specific EBT activities56 or potentially using newly emerging technologies that include clinician reminders57 or other strategies.

As is common in public mental health,58 clinician turnover in CBT+ was notable. At least 9% of participants in this study left their organization in the 6-month window following the in-person training alone. As turnover rates are approximately 20% per year in public mental health, rates in the CBT+ initiative are in line with national averages.58 Organizations need access to efficient models for ongoing EBT training, as most cannot afford to send new hires to multiple trainings, each with its own set of post-training activities (e.g., required consultation, booster trainings) and fidelity monitoring procedures. Findings from the current study indicate that clinicians reported improved skill over time, participated in the post-training supports, and maintained skills post-consultation. A common elements approach, like CBT+, may offer a more streamlined and targeted approach (i.e., 1 training) for organizations to prepare new hires to treat the most common child mental health problems.

One limitation of many evaluations of training programs for behavioral health services, including this one, is that change in self-reported knowledge and skills may not necessarily reflect the acquisition or use of skills in actual practice,59,60 although recent work suggests that self-report may have some validity.61 Regardless, determining whether such changes have occurred may require more rigorous and direct methods. The consultant ratings, although more rigorous than clinician self-report, still relied on clinicians' self-report of their sessions with clients. Among the more rigorous options, one possibility is to evaluate skill in the EBT elements objectively, either through behavioral rehearsal with a standardized patient62 or through direct observation of skill use with clients.63 However, both of these methods are costly and resource-intensive. Behavioral rehearsal methodology was included in 1 CBT+ cohort, with analyses underway; preliminary findings suggest that behavioral rehearsal corroborates self-reported skill improvement over time. Additional efforts to devise feasible methods for evaluating the fidelity of EBT implementation following training are needed.64 A second limitation is the high non-response rate at the 6-month follow-up point, even among those clinicians who remained at their agencies. However, attrition analyses indicated no demographic or baseline differences in outcomes between those with and without follow-up data, and findings from the longitudinal sub-sample bolster the larger pre-post sample findings. Offering incentives at the post-assessment for the pre-post sample might have improved retention.

Implications for Behavioral Health

The large investments in EBT training in recent years may not have produced the presumed benefits. Research demonstrates that training alone21,23 is unrelated to actual practice change. Without a larger investment that includes post-training supports, investments in training may produce only poor or suboptimal outcomes.65 The CBT+ Initiative in Washington State is an example of how this ongoing support could be provided. Although predominantly based on self-report, the study suggests that the combination of training and consultation may result in knowledge gain and implementation of a common elements model, CBT+. A common elements approach potentially offers a more streamlined, integrated, and sustainable approach to training mental health providers in EBTs.38,39 In addition, this study offers some support for the beneficial role of organizational champions in supporting cross-cutting CBT competencies, such as homework assignment, within their organizations. As more states roll out EBT initiatives, it will be important to include evaluation of implementation and client outcomes to ensure that limited resources are achieving the targeted goals of improving clinician practice and positively impacting child and family outcomes.

Acknowledgments

This publication was made possible in part by funding from the Washington State Department of Social and Health Services, Division of Behavioral Health Recovery and from grant numbers F32 MH086978, K08 MH095939, and R01 MH095749, awarded from the National Institute of Mental Health (NIMH).

Drs. Dorsey, Lyon, and Murray are investigators with the Implementation Research Institute (IRI) at the George Warren Brown School of Social Work, Washington University in St. Louis, which is supported by an award from the National Institute of Mental Health (R25 MH080916) and the Department of Veterans Affairs, Health Services Research & Development Service, Quality Enhancement Research Initiative (QUERI).

Footnotes

Conflicts of Interest: The authors have no conflicts of interest to report.

References

1. Kazdin AE. Psychotherapy for Children and Adolescents: Directions for Research and Practice. London: Oxford University Press; 2000.
2. Weisz JR, Sandler IN, Durlak JA, et al. Promoting and protecting youth mental health through evidence-based prevention and treatment. American Psychologist. 2005;60(6):628–648. [PubMed]
3. President’s New Freedom Commission on Mental Health. Final Report. Rockville, MD: Substance Abuse and Mental Health Services Administration; 2003. Achieving the Promise: Transforming Mental Health Care in America. Available online at http://govinfo.library.unt.edu/mentalhealthcommission/reports/FinalReport/downloads/FinalReport.pdf.
4. Weisz JR, Jensen-Doss A, Hawley KM. Evidence-based youth psychotherapies versus usual clinical care: A meta-analysis of direct comparisons. American Psychologist. 2006;61(7):671–689. [PubMed]
5. Mitchell PF. Evidence-based practice in real-world services for young people with complex needs: New opportunities suggested by recent implementation science. Children and Youth Services Review. 2011;33(2):207–216.
6. Bruns EJ, Hoagwood KM. State implementation of evidence-based practice for youths, part I: responses to the state of the evidence. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47(4):369–373. [PubMed]
7. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50. [PMC free article] [PubMed]
8. Fixsen DL, Naoom SF, Blase KA, et al. Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; 2005. (FMHI Publication #231)
9. Rogers EM. Diffusion of Innovations. New York: Simon and Schuster; 2003.
10. Hendy J, Barlow J. The role of the organizational champion in achieving health system change. Social Science & Medicine. 2012;74(3):348–355. [PubMed]
11. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(1):4–23. [PMC free article] [PubMed]
12. Powell BJ, McMillen JC, Proctor EK, et al. A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2012;69(2):123–157. [PMC free article] [PubMed]
13. Daleiden EL, Chorpita BF, Donkervoet C, et al. Getting better at getting them better: Health outcomes and evidence-based practice within a system of care. Journal of the American Academy of Child and Adolescent Psychiatry. 2006;45(6):749–756. [PubMed]
14. Cohen JA, Mannarino AP, Deblinger E, editors. Treating trauma and traumatic grief in children and adolescents. New York: Guilford Press; 2006.
15. Sigel BA, Lynch CE, Kramer TL, et al. Characteristics of 17 statewide initiatives to disseminate Trauma-Focused Cognitive-Behavioral Therapy (TF-CBT). Psychological Trauma: Theory, Research, Practice, and Policy. 2013;5(4):323–333.
16. Gleacher A, Nadeem E, Moy A, et al. Statewide CBT Training for clinicians and supervisors treating youth: The New York State evidence-based treatment dissemination center. Journal of Emotional and Behavioral Disorders. 2011;19(3):182–92.
17. Hoagwood KE, Olin SS, Kerker BD, et al. Empirically based school interventions targeted at academic and mental health functioning. Journal of Emotional and Behavioral Disorders. 2007;15(2):66–92.
18. Lopez MA, Osterberg LD, Jensen-Doss A, et al. Effects of workshop training for providers under mandated use of an evidence-based practice. Administration and Policy in Mental Health. 2011;38(4):301–312. [PubMed]
19. Nakamura BJ, Chorpita BF, Hirsch M, et al. Large-scale implementation of evidence-based treatments for children 10 years later: Hawaii’s evidence-based services initiative in children’s mental health. Clinical Psychology: Science and Practice. 2011;18(1):24–35.
20. Wonderlich SA, Simonich HK, Myers TC, et al. Evidence-based mental health interventions for traumatized youth: A statewide dissemination project. Behaviour Research and Therapy. 2011;49(10):579–587. [PubMed]
21. Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17(1):1–30. [PMC free article] [PubMed]
22. Jensen-Doss A, Cusack KJ, de Arellano MA. Workshop-based training in trauma-focused CBT: An in-depth analysis of impact on provider practices. Community Mental Health Journal. 2008;44(4):227–244. [PubMed]
23. Herschell AD, Kolko DJ, Baumann BL, et al. The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clinical Psychology Review. 2010;30(4):448–466. [PMC free article] [PubMed]
24. Stokes TF, Baer DM. An implicit technology of generalization. Journal of Applied Behavior Analysis. 1977;10(2):349–367. [PMC free article] [PubMed]
25. Weisz JR, Chorpita BF, Palinkas LA, et al. Testing standard and modular designs for psychotherapy with youth depression, anxiety, and conduct problems: A randomized effectiveness trial. Archives of General Psychiatry. 2012;69(3):274–282. [PubMed]
26. Chorpita BF, Taylor AA, Francis SE, et al. Efficacy of modular Cognitive Behavior Therapy for childhood anxiety disorders. Behavior Therapy. 2004;35(2):263–287.
27. Dorsey S, Briggs EC, Woods BA. Cognitive-behavioral treatment for posttraumatic stress disorder in children and adolescents. Child and Adolescent Psychiatric Clinics of North America. 2011;20(2):255–269. [PMC free article] [PubMed]
28. Burley M. Outpatient treatment differences for children served in Washington’s public mental health system. Olympia, WA: Washington State Institute for Public Policy; 2009. Document No. 09-10-3401. Available online at http://www.wsipp.wa.gov/rptfiles/09-10-3401.pdf.
29. Garland AF, Brookman-Frazee L, Hurlburt MS, et al. Mental health care for children with disruptive behavior problems: a view inside therapists’ offices. Psychiatric Services. 2010;61(8):788–795. [PMC free article] [PubMed]
30. Wandersman A, Duffy J, Flaspohler P, et al. Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology. 2008;41(3–4):171–181. [PubMed]
31. Chorpita BF, Daleiden EL, Weisz JR. Identifying and selecting the common elements of evidence based interventions: A distillation and matching model. Mental Health Services Research. 2005;7(1):5–20. [PubMed]
32. Embry D, Biglan A. Evidence-based Kernels: Fundamental units of behavioral influence. Clinical Child and Family Psychology Review. 2008;11(3):75–113. [PMC free article] [PubMed]
33. Chorpita BF, Daleiden EL, Weisz JR. Modularity in the design and application of therapeutic interventions. Applied and Preventive Psychology. 2005;11(3):141–156.
34. Borntrager CF, Chorpita BF, Higa-McMillan C, et al. Provider attitudes toward evidence-based practices: Are the concerns with the evidence or with the manuals? Psychiatric Services. 2009;60(5):677–681. [PubMed]
35. Barth RP, Lee BR, Lindsey MA, et al. Evidence-based practice at a crossroads: The timely emergence of common elements and common factors. Research on Social Work Practice. 2012;22(1):108–119.
36. Chorpita BF, Regan J. Dissemination of effective mental health treatment procedures: Maximizing the return on a significant investment. Behaviour Research and Therapy. 2009;47(11):990–993. [PubMed]
37. Garland AF, Bickman L, Chorpita BF. Change what? Identifying quality improvement targets by investigating usual mental health care. Administration and Policy in Mental Health and Mental Health Services Research. 2010;37(1–2):15–26. [PMC free article] [PubMed]
38. Chorpita BF, Weisz JR, editors. MATCH-ADC: Modular Approach to Therapy for Children with Anxiety, Depression, Trauma, or Conduct Problems. Satellite Beach, FL: Practice Wise; 2009.
39. Chorpita BF, Weisz JR, Daleiden EL, et al. Long term outcomes for the Child STEPs randomized effectiveness trial: A comparison of modular and standard treatment designs with usual care. Journal of Consulting and Clinical Psychology. 2013;81(6):999–1009. [PubMed]
40. Wood JJ, Drahota A, Sze K, et al. Cognitive behavioral therapy for anxiety in children with autism spectrum disorders: A randomized, controlled trial. Journal of Child Psychology and Psychiatry. 2009;50(3):224–234. [PMC free article] [PubMed]
41. Barlow DH, Boisseau CL, Ellard KK, et al. Unified Protocol for Transdiagnostic Treatment of Emotional Disorders: Therapist Guide. Oxford, England: Oxford University Press; 2010.
42. Ellard KK, Fairholme CP, Boisseau CL, et al. Unified protocol for the transdiagnostic treatment of emotional disorders: Protocol development and initial outcome data. Cognitive and Behavioral Practice. 2010;17(1):88–101. [PMC free article] [PubMed]
43. Farchione TJ, Fairholme CP, Ellard KK, et al. Unified protocol for transdiagnostic treatment of emotional disorders: A randomized controlled trial. Behavior Therapy. 2012;43(3):666–678. [PMC free article] [PubMed]
44. Lyon AR, Charlesworth-Attie S, Vander Stoep A, et al. Modular psychotherapy for youth with internalizing problems: Implementation with therapists in school-based health centers. School Psychology Review. 2011;40(4):569–581.
45. Stephan SH, Wissow L, Pichler E. Utilizing common factors and practice elements to improve mental health care by school-based primary care providers. Report on Emotional and Behavioral Disorders in Youth. 2010;10(54):81–86.
46. Chorpita B, Becker K, Phillips L, et al. Practitioner Guides. Satellite Beach, FL: PracticeWise; 2009.
47. Murray LK, Dorsey S, Haroz E, et al. A common elements transdiagnostic mental health approach for low- and middle-income countries: Initial pilots. Cognitive and Behavioral Practice. in press. [PMC free article] [PubMed]
48. Foa EB, Johnson KM, Feeny NC, et al. The Child PTSD Symptom Scale (CPSS): A preliminary examination of its psychometric properties. Journal of Clinical Child Psychology. 2001;30(3):376–384. [PubMed]
49. Sburlati ES, Schniering CA, Lyneham HJ, et al. A model of therapist competencies for the empirically supported cognitive behavioral treatment of child and adolescent anxiety and depressive disorders. Clinical Child and Family Psychology Review. 2011;14(1):89–109. [PubMed]
50. Morris SB, DeShon RP. Combining effect size estimates in meta-analysis with repeated measures and independent-groups designs. Psychological Methods. 2002;7(1):105–125. [PubMed]
51. Beidas RS, Barmish AJ, Kendall PC. Training as usual: Can therapist behavior change after reading a manual and attending a brief workshop on cognitive behavioral therapy for youth anxiety? The Behavior Therapist. 2009;32(5):97–101.
52. Glisson C, Schoenwald S, Kelleher K, et al. Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Administration and Policy in Mental Health. 2008;35(1–2):124–133. [PubMed]
53. Glisson C, Schoenwald SK, Hemmelgarn A, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology. 2010;78(4):537–550. [PMC free article] [PubMed]
54. Atkins MS, Frazier SL, Leathers SJ, et al. Teacher key opinion leaders and mental health consultation in low-income urban schools. Journal of Consulting and Clinical Psychology. 2008;76(5):905–908. [PubMed]
55. Valente TW. Network interventions. Science. 2012;337(6090):49–53. [PubMed]
56. Dorsey S, Pullmann MD, Deblinger E, et al. Improving practice in community-based settings: A randomized trial of supervision - Study protocol. Implementation Science. 2013;8(89):1–11. [PMC free article] [PubMed]
57. Shojania KG, Jennings A, Mayhew A, et al. Effect of point-of-care computer reminders on physician behavior: A systematic review. Canadian Medical Association Journal. 2010;182:E216–E225. [PMC free article] [PubMed]
58. Eby LT, Burk H, Maher CP. How serious of a problem is staff turnover in substance abuse treatment? A longitudinal study of actual turnover. Journal of Substance Abuse Treatment. 2010;39(3):264–271. [PMC free article] [PubMed]
59. Hurlburt MS, Garland AF, Nguyen K, et al. Child and family therapy process: Concordance of therapist and observational perspectives. Administration and Policy in Mental Health. 2010;37(3):230–244. [PMC free article] [PubMed]
60. Miller W, Sorensen J, Selzer J, et al. Disseminating evidence-based practices in substance abuse treatment: A review with suggestions. Journal of Substance Abuse Treatment. 2006;31(1):25–39. [PubMed]
61. Ward AM, Regan J, Chorpita BF, et al. Tracking evidence based practice with youth: Validity of the MATCH and Standard Manual Consultation Records. Journal of Clinical Child & Adolescent Psychology. 2013;42(1):44–55. [PubMed]
62. Beidas RS, Cross W, Dorsey S. Show me, don’t tell me: Behavioral rehearsal as a training and analogue fidelity tool. Cognitive and Behavioral Practice. doi: 10.1016/j.cbpra.2013.04.002. published online ahead of print May 5, 2013. [PMC free article] [PubMed] [Cross Ref]
63. Hogue A, Ozechowski TJ, Robbins MS, et al. Making fidelity an intramural game: Localizing quality assurance procedures to promote sustainability of evidence-based practices in usual care. Clinical Psychology Science and Practice. 2013;20(1):60–77.
64. Schoenwald SK, Garland AF, Chapman JE, et al. Toward the effective and efficient measurement of implementation fidelity. Administration and Policy in Mental Health. 2011;38(1):32–43. [PMC free article] [PubMed]
65. Olmstead T, Carroll KM, Canning-Ball M, et al. Cost and cost-effectiveness of three strategies for training clinicians in motivational interviewing. Drug and Alcohol Dependence. 2011;116(1–3):195–202. [PMC free article] [PubMed]