This article describes the state of workshop education for mental health professionals in the assessment and management of suicide risk. We applied rigorous eligibility criteria to select in-person workshops that (1) target mental health professionals; (2) aim to promote general clinical competence in the assessment and management of suicide risk; and (3) have at least one published, peer-reviewed article describing or evaluating the training or model on which it is based. Our study is the first to provide a cross-program description of the objectives and methods of the clinician-targeted workshops; characterize the training, qualifications, and feedback for the trainers who deliver the workshops; and review published studies about training outcomes.
We surveyed developers of the 12 programs that met the criteria and discovered that these workshops cover a wide range of learning objectives, with the heaviest focus on assessment and formulation of suicide risk. Workshops that emphasized documentation and managing care in their written learning objectives were less common. Thousands of clinicians in the mental health workforce attend workshop training each year, and more than 40,000 mental health professionals have participated in these 12 workshops. Half of the workshops included in our study have been delivered by program developers only; the other half are delivered by other trained instructors as well as by the developers. These additional instructors typically have at least a master’s degree, clinical experience, and licensure, and some have past teaching experience. They generally receive one to two days of initial training and minimal ongoing feedback, which mostly consists of results from satisfaction surveys. None of the programs reported routinely providing expert feedback based on observed performance.
The content provided to clinician participants in these workshops has strong face validity and bears the mark of expert clinician-developers. The programs are “evidence-based” in the sense that some of the content draws from clinical epidemiology and treatment research. Several of the workshops have grown out of mature treatment or prevention models, which have demonstrated efficacy with respect to improving participant knowledge and attitudes. Many use innovative and promising pedagogical techniques. One of the authors (ARP) attended most of the publicly available workshops examined in this study and found them practical and engaging. In short, these workshops generally convey the best available recommendations for clinical practice, often in innovative ways. Furthermore, some of the programs have begun to take steps to assure that participants have learned what is taught: RRSR requires participants to pass a multiple-choice test, and QPRT and CASE require role-played demonstration of specific interviewing skills. Nevertheless, there is an urgent need to evaluate the impact that training has on the care mental health professionals provide and the outcomes they achieve.
Our project revealed that research documenting real-world outcomes from these workshops for mental health professionals is limited. Studies up to this point have established that clinician knowledge and attitudes improve in response to training, but the evidence with respect to clinical skill comes from just two studies, making it difficult to draw conclusions. McNiel and colleagues’ (2008) workshop produced a meaningful post-training improvement in vignette-based written risk assessment. Gask and colleagues (2006) found that mental health professionals’ interview skills improved in the short term, but the effect did not persist in a small follow-up group. We found no studies addressing the impact of workshop training on observed practice with real patients or on outcomes clinicians achieve with patients after participating in training. Although the framework taught in the CAMS workshop has evidence to support its efficacy with patients, this evidence cannot be extrapolated to the workshop because the clinicians providing treatment in the studies we reviewed were not trained via a workshop, but rather trained extensively in CAMS at the home institution of the developer.
Thus, based on the evidence available at this time, we can conclude that workshops provide an effective means for transferring knowledge and shifting attitudes, but not necessarily skills. Until we have more evidence that workshops improve skill and impact patient outcomes, clinicians and administrators can think of workshops as serving a valuable role in clinician education (transferring knowledge and attitudes, introducing skills), but should recognize that workshops have not yet been demonstrated to improve the clinical care of suicidal patients.
We focused on one type of educational opportunity for practicing clinicians: in-person workshops that purport to strengthen general, transtheoretical competency. These workshops do not represent the full range of clinical education that is available for professionals who wish to improve their ability to work with suicidal individuals. For example, we did not focus on workshops for clinicians seeking specific expertise in a manualized treatment or techniques for working with suicidal individuals. We also did not catalog education that is taking place in other venues, such as ad hoc employer in-services, clinical supervision, online courses, and professional journal articles. Thus, this article contributes knowledge about a significant, but limited, part of the spectrum of clinical training opportunities. Studies of clinical education in specific treatments, along with data on education delivered through other venues, are needed to evaluate the full range of clinical education in the assessment, management, and treatment of patients at risk for suicide.
We relied upon a relatively brief survey to minimize participant burden and maximize participation. While trying to be as comprehensive as possible, our survey may not have captured all domains relevant to describing the state of workshop education.
Finally, we gathered some of our data from developers via self-report. A vested interest in their program could bias developers’ reporting of the number of clinicians trained, the qualifications of trainers, and the support trainers receive. Nevertheless, most of the information included in this article is based on publicly available material.
Our study on the current state of workshop education in the assessment and management of suicide risk suggests several specific areas for future development and study. First, workshop developers should focus effort on factors that influence the implementation of knowledge, attitudes, and skills gained in workshops. Follow-up sessions, online refreshers, and special training for supervisors and administrators may be necessary for workshops to have an effect on patient care. Educational developers should thus consider collaborating with implementation scientists, whose field of study specifically includes designing interventions to improve implementation of skills and practices into service settings (Proctor et al., 2009).
Second, half of the workshops in this study are disseminated using instructors who are trained by developers or by master trainers to deliver the workshop. While this “train-the-trainer” model is efficient and often necessary, the ability of trainers to present the workshops with fidelity and competence is a potential constraint on effectiveness. Research on the education and support process for instructors—including the role of instructor selection, feedback, and support—is needed to determine how to promote successful transfer of training from master trainers to instructors.
Third, despite the high prevalence of suicidal ideation and behavior among patients receiving mental health treatment, there are very limited data about current practices and needs in the mental health workforce. We lack basic information about how (and how often) clinicians elicit, explore, and respond to information about suicidal thinking, plans, and behavior. Scientific knowledge about usual care can help focus educational efforts on the areas most needing improvement and enable the field to measure progress.
Lastly, we need controlled studies evaluating the development, continued use, and clinical impact of the skills and approaches taught in these workshops. Researchers may also wish to focus on the comparative effectiveness of workshops for different kinds of clinicians. For example, some workshops may be more useful for outpatient than inpatient clinicians, or for early-career rather than experienced professionals. Ideally, such studies would be conducted using observational data on real-world application of skills and patient response.