Researchers in health behavior change have found it difficult to elucidate causal mechanisms that can be empirically demonstrated. Multiple theories of health behavior change suggest which constructs to measure and intervene upon (e.g., self-efficacy, social support). Typically, studies are designed to test the efficacy of an intervention that purports to target a number of behavior change constructs, usually through some combination of increasing knowledge, teaching skills, changing attitudes, and applying self-management strategies. These bundled interventions are delivered in numerous settings (e.g., primary care, home, schools, and worksites) and through various delivery modes (e.g., print materials, telephone contacts, websites, group settings).
When carefully conducted randomized trials (typically with treatment as usual as the comparison condition) are undertaken to evaluate health behavior change interventions, they generally produce modest effects on behavior change.1 When effects are detected, it is difficult to determine the “active ingredients” of the bundled intervention that resulted in behavior change.2 Determining the active ingredients of the intervention is often not even a primary interest of the investigators, who would be satisfied to conclude that the theoretically based intervention package had a significant effect on behavior change. As a result, clinicians and policymakers (among others) are usually left without a satisfactory answer to the question, “What works?” when it comes to health behavior change interventions.
An innovative study in this issue of the American Journal of Preventive Medicine highlights research methods that can help us better address the “What works?” question. In their paper on web-based smoking-cessation components and tailoring depth, Strecher and colleagues3 present the results of a randomized trial testing a web-based smoking-cessation intervention in a sample of 1848 smokers. Smokers were recruited through two HMOs and randomized to one of 16 experimental arms. Each treatment condition tested a different combination of five psychosocial and communication components that were delivered through a website.
The primary outcome measure was 7-day smoking abstinence assessed after 6 months of intervention. Abstinence was related to the high-depth-tailored success stories and highly personalized message source components, but not to outcome expectations, efficacy expectations, or single versus multiple exposures to the intervention materials. There was also an effect for receiving all three high-depth tailoring factors (success stories, outcome expectations, and efficacy expectations), with smoking abstinence of nearly 40% for participants receiving this combination of intervention components. This finding that more message tailoring is better than less message tailoring is consistent with Noar and colleagues’ meta-analysis of tailored interventions.1
Strecher and colleagues3 bring together three important elements in their research that move us closer to more definitive answers about “What works?” when it comes to smoking cessation. First, they recruited their study participants through two HMOs, which allowed them to reach out to all potential participants. Using the entire HMOs’ population of smokers as the sampling frame has the advantage of engaging smokers at different levels of motivational readiness: from those not at all thinking about quitting to those who are currently ready to quit.4 This recruitment method also enhances external validity because it reflects how the intervention would likely be disseminated in a healthcare organization. This strategy has been shown to be successful in other smoking-cessation intervention trials.5,6
Second, Strecher and colleagues3 describe how their study is the first phase of a multiphase optimization strategy (MOST) aimed at determining the active program components and their optimal dose for effective interventions.7 Typically, the evaluation of the “active ingredients” of an intervention occurs post hoc, by testing whether hypothesized behavior change constructs mediate the treatment–outcome relationship.8 While mediation analyses can suggest which constructs were affected by the intervention, which in turn affected change in behavior, the overall study design is limited, since participants were not randomized to these intervention components.2 That is, participants are typically randomized to the entire intervention or to a comparison condition. MOST incorporates a fractional factorial experimental design borrowed from the engineering field. This design allows for testing a theory-driven subset of intervention component combinations rather than crossing all possible component combinations in a full factorial experimental design.2 The fractional factorial design can provide considerable cost savings for more rapid prototype testing of intervention components and will likely be used more in future health behavior change research.
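For readers unfamiliar with fractional factorial designs, a minimal sketch may help fix the idea. With five two-level components, a full factorial would require 2^5 = 32 arms; a half-fraction design with defining relation I = ABCDE yields 16 arms, matching the number of arms in the trial. The component names below are illustrative stand-ins, not the authors’ actual design specification:

```python
from itertools import product

# Illustrative stand-in names for five two-level intervention
# components (NOT the authors' exact design specification).
components = ["success_stories", "outcome_expectations",
              "efficacy_expectations", "message_source",
              "exposure_schedule"]

def half_fraction(n_factors):
    """Generate a 2^(n-1) half-fraction factorial design.

    The first n-1 factors get a full factorial in +1/-1 coding; the
    last factor is set to the product of the others (defining
    relation I = ABCDE), halving the number of runs while preserving
    estimability of main effects.
    """
    runs = []
    for levels in product([-1, 1], repeat=n_factors - 1):
        last = 1
        for x in levels:
            last *= x
        runs.append(levels + (last,))
    return runs

design = half_fraction(len(components))
print(len(design))  # 16 experimental arms instead of 32
```

Because every run satisfies the defining relation, the product of the five coded levels is +1 in each arm; the trade-off is that the aliased factor’s higher-order interactions cannot be separated from lower-order ones.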
The third important element in Strecher and colleagues’3 study is the use of eHealth technology to deliver the intervention component variations in a feasible and extensible system. Using web-based technologies not only capitalizes on the Internet’s ability to extend the reach of intervention delivery9 but can expand the range of intervention components that can be tested. Interactive web-based interventions can provide participants with theoretically based behavior change tools that more closely match the theoretical principles of behavior change than print-based static interventions. For example, a web-based intervention can facilitate goal setting by helping participants break down large goals into smaller ones, assess these smaller goals more frequently, and automatically adjust goals to more challenging levels.10 eHealth technologies can also enhance more intensive interventions. For example, Tate and colleagues11,12 used a website for daily to weekly online submissions of calorie and fat intake and energy expenditure. Both studies found significant weight loss for the intervention participants who received timely feedback from counselors via e-mail.
Strecher and colleagues’ study3 adds to the body of research on eHealth interventions.10,13 Their study demonstrates innovation in health behavior change research that, in conjunction with state-of-the-science evaluation and data-collection methods, will help us get better answers to the “What works?” question. AJPM looks forward to publishing similar innovative empirical studies on the science of behavior change.
No financial disclosures were reported by the author of this paper.