As adults age, their physical activity decreases and sedentary behavior increases, leading to increased risk of negative health outcomes. Wearable electronic activity monitors have shown promise for delivering effective behavior change techniques. However, little is known about the feasibility and acceptability of wearables other than Fitbit devices (Fitbit Inc, San Francisco, California), combined with telephone counseling, among adults older than 55 years.
The purpose of our study was to determine the feasibility, acceptability, and effect on physical activity of an intervention combining a wearable physical activity monitor, tablet device, and telephone counseling among adults aged 55-79 years.
Adults (N=40, aged 55-79 years, body mass index 25-35, <60 min of activity per week) were randomized to a 12-week intervention or a wait-list control. Intervention participants received a Jawbone Up24 monitor, a tablet with the Jawbone Up app installed, and brief weekly telephone counseling. Participants set daily and weekly step goals and used the monitor's idle alert, which notified them when they had been sedentary for more than 1 h. Interventionists provided brief counseling once per week by telephone. Feasibility was measured using observation and study records, and acceptability was measured by self-report using validated items. Physical activity and sedentary time were measured using ActivPAL monitors following standard protocols. Body composition was measured using dual-energy x-ray absorptiometry scans, and fitness was measured using a 6-min walk test.
Participants were a mean of 61.48 (SD 5.60) years old; 85% (34/40) were female, and 65% (26/40) were white. Average activity monitor wear time was 81.85 (SD 3.73) of 90 days. Of the 20 Up24 monitors, 5 were reported broken and 1 was lost. No related adverse events were reported. Acceptability items were rated at least 4 on a scale of 1-5. Effect sizes for most outcomes were small, including stepping time per day (d=0.35), steps per day (d=0.26), sitting time per day (d=0.21), body fat (d=0.17), and weight (d=0.33).
The intervention was feasible and acceptable in this population. Effect sizes were similar to the sizes found using other wearable electronic activity monitors, indicating that when combined with telephone counseling, wearable activity monitors are a potentially effective tool for increasing physical activity and decreasing sedentary behavior.
Clinicaltrials.gov NCT01869348; https://clinicaltrials.gov/ct2/show/NCT01869348 (Archived by WebCite at http://www.webcitation.org/6odlIolqy)
Increasing physical activity and decreasing sedentary behavior can reduce the risk of many negative health outcomes among older adults, including cardiovascular diseases, type 2 diabetes, cancer, and all-cause mortality [1-4]. The effects of moderate-vigorous intensity physical activity and sedentary behavior on these outcomes appear to be independent [5,6]; thus, older adults could benefit from interventions targeting both behaviors simultaneously. Unfortunately, rates of physical activity are low in this population. Recent estimates from objective monitoring suggest that most American adults spend less than 2% of their time in moderate-vigorous intensity physical activity. Moderate-vigorous intensity activity decreases with age; improving activity habits among mid-aged and older adults could prevent later functional decline and even mortality [9,10]. In addition, sedentary behavior is highly prevalent, amounting to most of adults' waking hours. Although interventions have demonstrated positive effects on both behaviors, these methods suffer from limitations related to poor sustainability and poor scalability. There is a clear need for interventions that are effective in the long term as well as easy to disseminate.
Wearable electronic activity monitors are advanced versions of pedometers that offer more behavior change techniques and implement them in different ways than the standard displays on the device itself. Because these devices send information to a mobile app, they can offer feedback that better conforms to theoretical recommendations (eg, specific and clear; comparing with similar others, past accomplishments, and specific goals) [14,15]. They also deliver many additional behavior change techniques that are not possible with standard pedometers, such as goal setting, social support, and cues to action. Cues to action are likely particularly important for replacing sedentary behavior with physical activity, as they alert participants to their sedentary behavior in real time. Traditionally, delivery of these behavior change techniques would require either in-person counseling or frequent (and thus expensive) tailored print materials. Delivery via mobile app offers an opportunity for interventions that are both effective and broad in reach.
These improvements on pedometers show promise, but wearable electronic monitors and their companion mobile apps still lack several important behavior change techniques. In particular, empirically proven techniques such as action planning and problem solving are typically absent from these apps. Adding brief counseling to provision of these devices could allow interventionists to deliver the full range of behavior change techniques standard in behavioral physical activity interventions: the counseling provides any techniques missing from the apps, while the apps allow for improved implementation of other fundamental techniques.
Studies of wearable electronic devices and mobile apps published thus far have found equivocal physical activity outcomes, though their use of Fitbit and BodyMedia products may not generalize to other self-monitoring systems [13,16,17]. It is also possible that these devices may not be feasible or acceptable to older adult populations, who are in unique need of more effective activity interventions. The feasibility and acceptability of mobile phone-based interventions among adults older than 55 years are not yet clear. Some studies have found low acceptability and a preference for other media, as well as increased barriers to mobile phone use with increasing age. Older adults have also reported feeling that iPads were designed for younger audiences rather than for their age group. However, some studies have found positive feasibility and acceptability results for smart device health interventions in this age group [21,22]. Short-term tests indicate that activity monitors may be acceptable to mid-aged and older adults, but the feasibility and acceptability of longer-term usage are unclear. Of the few published studies of wearable electronic activity monitors, even fewer have described feasibility and acceptability results in detail.
The purpose of this study was to investigate the feasibility and acceptability of an intervention combining the Jawbone Up system and telephone counseling. To our knowledge, this system had not yet been tested in intervention trials. In a content analysis of available wearables and their apps, the Up system included the most behavior change techniques and implemented them most closely to theoretical recommendations. In addition to providing a wearable device and mobile app, we provided brief weekly telephone counseling adapted to include behavior change techniques known to be important in physical activity research but absent from the Up system [15,25-27]. We hypothesized that the intervention would be feasible and acceptable for this population. To operationalize these outcomes, we measured the number of days the monitor was worn and self-reported acceptability items taken from similar eHealth studies.
Figure 1 shows the CONSORT diagram for the study. This trial was a parallel randomized controlled pilot trial with 1:1 group allocation. Participants (N=40) were recruited in 2 cohorts of 20 between 2014 and 2015 via advertisements in local newspapers, online mailing lists, and university announcements. Cohort 1 was recruited over 6 months in 2014, and cohort 2 was recruited over 8 months in 2015. Major inclusion criteria were age between 55 and 79 years, body mass index (BMI) between 25 and 35, the ability to read and understand English, and the ability to read words on a tablet-sized device. Major exclusion criteria included self-reported habitual physical activity of more than 60 min per week, health issues that might preclude safe walking, psychological issues that might interfere with full participation, current use of a wearable electronic activity monitoring system, and endorsing cardiovascular risk questions on the Physical Activity Readiness Questionnaire. If the only questions endorsed had to do with taking medication, individuals could participate if they provided a doctor's consent. Randomization was carried out using sequentially numbered opaque sealed envelopes according to standard protocols, with randomization stratified by the 2 cohorts to promote adequate numbers of participants able to talk to one another through the app. The envelopes were lined with foil (to prevent seeing the group assignment inside) and carbon paper (to provide an audit trail), randomly sorted by an individual not involved with the randomization visit process, and then numbered sequentially. As interventionists opened each envelope, they signed and dated it and saved the inner paper, which recorded the original printed allocation, the carbon-copied sequence number, the participant's ID number, the interventionist's signature, and the date of opening.
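As an illustration only (not the study's actual procedure, which used pre-printed sealed envelopes), a 1:1 allocation sequence stratified by cohort could be sketched as follows; the function name and seed are hypothetical:

```python
import random

def make_allocation_sequence(n_per_cohort, seed):
    """Shuffle an equal number of intervention and control assignments
    for one cohort, as would be printed and sealed into sequentially
    numbered opaque envelopes before enrollment begins."""
    rng = random.Random(seed)  # fixed seed so the sequence is auditable
    sequence = (["intervention"] * (n_per_cohort // 2) +
                ["control"] * (n_per_cohort // 2))
    rng.shuffle(sequence)
    return sequence

# Stratifying by cohort: each cohort of 20 gets its own balanced sequence,
# guaranteeing 10 intervention and 10 control participants per cohort.
allocations = {cohort: make_allocation_sequence(20, seed=cohort)
               for cohort in (1, 2)}
```

Generating the sequence per stratum (rather than one sequence for all 40 participants) is what ensures each cohort has enough intervention participants to interact with one another in the app.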
Participants attended 4 scheduled visits. The first consisted of informed consent procedures and provision of a research-grade activity monitor. Participants were given information on the intervention procedures and the nature of the wearable and app before providing informed consent. Approximately a week later, participants returned for a full baseline assessment and orientation to the study. A midpoint assessment occurred at 6 weeks (questionnaire and physical activity assessment only), and a full final assessment occurred at 12 weeks. Participants could not be blinded to their group, and resource limitations precluded using blinded assessors for all participants. All procedures were approved by the University of Texas Medical Branch Institutional Review Board and registered at clinicaltrials.gov prior to beginning data collection.
The participants randomized to the intervention group were lent a mini tablet (Apple iPad Mini, Apple Inc, Cupertino, CA) and a wearable electronic activity monitor (Up24, Jawbone Inc, San Francisco, CA) for home use during the study. The tablet was preloaded with the Jawbone Up app and synced to an Up24 for each participant. Figure 2 shows an example of activity feedback and social support in the Up app; note that this example used data and social interaction from researchers in the study, not from the study participants. Detailed information on the contents of the app, including behavior change techniques and adherence to theory-based recommendations, is available in a previous publication. All participants were provided with premade accounts that existed on a “team” with all other participants, as well as an account for interventionist surveillance. The orientation visit included guidance on the use of the wearable and app, encouragement to comment on and like others' activity, and an initial goal-setting session. Interventionists encouraged participants to view their data at least twice per day, in the morning and in the late afternoon or evening. Participants set goals for physical activity (short- and long-term) and sedentary behavior (longest bout length). Interventionists provided training for self-monitoring, viewing feedback, and using sedentary behavior prompts in the app. Although some changes in the appearance of the app and the tools it provided occurred during the overall study period, no substantive changes to the physical activity feedback content occurred. We were unable to determine whether individual participants updated their apps during their intervention periods, but updates should not have affected the overall experience.
Weekly telephone counseling was provided by a team led by the principal investigator and a postdoctoral fellow with extensive training in behavioral counseling. The team included a predoctoral fellow and a clinical research coordinator who were trained by the principal investigator and the postdoctoral fellow. Initial calls by team members were observed by the principal investigator and postdoctoral fellow, and feedback was provided to maintain quality. In addition, team members followed a scripted counseling guide. Counseling calls were designed to last approximately 15-20 min each. Each call included a check-in for any adverse events or technical problems, reevaluation of weekly step goals, and action planning for the next week. Goals were negotiated between the counselors and individuals, with counselors suggesting at least 7000 steps per day (based on step counts found to be appropriate for very deconditioned older adults, as we determined with baseline fitness tests) on 2 days per week, increasing over time to at least 5 days per week. Sedentary bout goals were also negotiated, with 1 h being the length suggested and typically agreed to by participants. These goals were entered into the app so that progress bars would measure progress toward them. Idle alerts used the sedentary bout goals to determine when to vibrate to alert participants that they had been sedentary too long.
Weekly special topics delivered additional behavior change techniques from the CALO-RE framework, as follows: planning social support, problem solving, self-rewards, when and where to perform the behavior, relapse prevention, stress management, and time management. The app provided other behavior change techniques, listed in Multimedia Appendix 1 (see Lyons et al for more complete descriptions; only behavior change techniques related to physical activity are listed here). Because the SmartCoach portion of the app adjusted content based on user behavior, we cannot state with certainty which behavior change techniques the app delivered to each participant; some participants may not have triggered delivery of every possible technique. Interventionists based their counseling on data taken from the app over the previous week to negotiate changing goals.
The wait-list control group did not receive any intervention until after their final assessment, when they were provided the intervention in full.
Physical activity was measured using an ActivPAL device (PAL Technologies Ltd, Glasgow, Scotland). This small, thin device was attached to the front of each participant's thigh, midway between the knee and the trunk, using an adhesive strip. The ActivPAL is well validated for measuring both physical activity and sedentary behavior [32,33]. Participants wore the devices for 7 days at each assessment period (baseline, 6 weeks, 12 weeks). No specific criteria have been published for determining wear time and usable data for these monitors. Following the procedures of Bickmore et al, we removed daily activity values below the 5th percentile (0.00 stepping minutes), as these were possibly representative of nonwear time (eg, time the monitor spent being mailed back and forth to participants, or time it was carried rather than worn). Physical activity was operationalized as mean minutes of physical activity per day, mean minutes spent sitting per day, and mean steps per day across all valid days per assessment.
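The wear-time screening rule above can be sketched in a few lines; the daily values below are hypothetical, not participant data, and this is an illustration of the rule rather than the authors' analysis code:

```python
# Hypothetical per-day stepping minutes for one 7-day ActivPAL wear period.
days = [0.0, 42.5, 38.0, 51.2, 0.0, 47.9, 55.3]

# Drop days at the floor of the distribution (0.00 stepping minutes),
# treated as possible nonwear (eg, monitor in transit or carried, not worn).
valid = [d for d in days if d > 0.0]

# Outcome for the assessment: mean stepping minutes across valid days only.
mean_stepping_minutes = sum(valid) / len(valid)
```

Averaging over valid days only prevents mailing or nonwear days from dragging the participant's estimated activity toward zero.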
Feasibility was measured in several ways. Use of the monitor was measured first by abstracting information weekly from the app. Because the interventionist account was a friend of each participant account, participants' daily data were posted to our news feed. We confirmed these data at the conclusion of the study using downloadable comma-separated value files from the Jawbone website. These files provided extensive information on the different parts of the app that were used each day; days on which activity, food, and sleep were logged were taken from these files. Attrition, adverse events, completed counseling calls, and reports of technical problems or loss of equipment were taken from the study records kept by interventionists (phone counseling logs) and the clinical research coordinator (records of emails and phone calls from participants). Acceptability of the monitor, app, and tablet was measured using items adapted from Vandelanotte et al [35,36]. Several additional items specific to the monitor and apps were included (eg, “I would continue using the idle alert”); these items used the same stems and responses as the ones adapted from previous research. All responses were made on a Likert-type scale from 1 (strongly disagree) to 5 (strongly agree). We also used a measure of perceived competence from the Intrinsic Motivation Inventory, with its items adapted to address competence using the tablet.
Fitness was estimated using a 6-min walk test. Participants were asked to walk for 6 min around a rectangular route marked with cones, with a trained assessor tracking the time and laps completed. At the end of the 6 min, participants waited where they stopped while the assessor measured their distance from the closest cone. Distance walked in 6 min was recorded in feet.
Percent body fat was estimated using dual-energy x-ray absorptiometry (GE Lunar iDXA, GE Medical Systems Lunar, Madison, WI) at baseline and 12 weeks. Height and weight were measured using a stadiometer and a calibrated scale. Sociodemographics were recorded at baseline and included age, gender, race, and ethnicity.
Weekly telephone counseling logs were completed by counselors to indicate whether counseling calls were completed or missed. If a call was missed at the scheduled time, counselors attempted to contact the participant up to 5 times.
All self-report measures were taken in-person using paper questionnaires. Several indicators of feasibility were measured using data from the mobile app, but all other assessments occurred face-to-face.
Data were analyzed using the R system version 3.3.1 (R Foundation for Statistical Computing, Vienna, Austria). Differences at baseline were investigated using Student t tests and chi-square tests. Differences between groups were estimated using analyses of covariance (ANCOVA), controlling for baseline values of the dependent variable. Box-Cox transformations were used to improve the validity of the inference. All analyses used the intent-to-treat principle, carrying the last observation forward for participants who dropped out. Multiple imputation was not used because the data did not appear to be missing at random. An analysis of only study completers was also conducted for comparison purposes. To account for potential clustering of effects due to social networking, we also ran models that included a random effect of cohort. Effect sizes are presented as Cohen d.
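The Cohen d effect sizes reported throughout can be computed as the between-group mean difference divided by the pooled standard deviation. A minimal sketch, assuming the usual pooled-SD formulation (the analyses themselves were run in R; this function name is illustrative):

```python
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference between two groups, using the
    pooled sample standard deviation as the scaling unit."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    # Pooled SD weights each group's variance by its degrees of freedom.
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd
```

By convention, d near 0.2 is a small effect and d near 0.5 a medium one, which is why the observed values of 0.21-0.35 are described as small.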
As this was a pilot test intended to investigate feasibility, this study was not powered to detect a statistically significant difference in the primary outcome. Rather, the purpose of statistical tests was to provide estimated effect sizes that could inform decision making regarding development of a follow-up, fully powered intervention trial.
As shown in Table 1, participants (N=40) were 61.5 (SD 5.6) years old with a BMI of 30.3 (SD 3.5). They were mostly female (34/40, 85%) and white (26/40, 65%). No related adverse events were reported. One participant dropped out of the intervention group, as did 1 from the wait-list control. Participants in the intervention group completed a mean of 10.2 (SD 2.4) of 12 counseling calls. Participants wore their Up24 monitors on average 81.85 (SD 3.73) of 90 days, with a minimum of 69 days. Although the intervention did not instruct usage of the nonactivity portions of the app, participants also spontaneously tracked their sleep (mean 11.70, SD 11.97 days) and food intake (mean 2.65, SD 7.83 days). Figure 3 shows week-to-week changes in monitor wear as mean and standard deviation (lower bar).
During the study period, 5 Jawbone Up24 monitors were reported broken by participants and were replaced. One additional monitor was lost and replaced. No tablets were lost, and all technical problems with them were resolved without the need for replacement. Responses to the acceptability questions are shown in Table 2, broken down by participants under the age of 60 years versus participants aged 60 years or older. All but one of the questions (including the reverse-coded, negatively worded question) received a mean rating over 4 of 5 across both age groups; the exception received a 3.9 from participants aged 60 years or older (“would you continue to wear the monitor?”). Usage and step data were successfully retrieved weekly from the Up app by research assistants. Because the interventionist account was a “friend” of each participant, participants' information and discussions were displayed on the interventionist account's news feed.
Physical activity, body composition, and anthropometric results for both groups from the ANCOVA models are shown in Table 3. Analyses restricted to study completers with complete data did not produce substantially different results (eg, effect sizes based on complete data were 0.40 for minutes and 0.31 for steps, as compared with 0.35 and 0.26, respectively, in the intent-to-treat analysis); therefore, we present the results from the intent-to-treat analysis. When we added a random effect for cohort to these models to account for potential effects of social interaction within cohorts, results did not change meaningfully. Intraclass correlation coefficients ranged from 0 (minutes sedentary, body fat) to 0.10 (weight).
This physical activity and sedentary behavior intervention, using an electronic activity monitor system with phone counseling, was found to be feasible and acceptable in a sample of older adults. The study also produced small changes in total physical activity time and weight favoring the intervention group, although statistical differences between groups were not interpretable given the underpowered nature of this pilot trial. Effect sizes (0.35 for minutes, 0.26 for steps, 0.21 for sedentary time) suggest that a larger-scale implementation of the intervention would likely produce small but potentially clinically significant improvements in physical activity time and steps taken.
The findings of this study are very similar to those of a pilot study of postmenopausal women using a Fitbit system, who increased their steps from approximately 5900 at baseline to 6700 at 16 weeks. Comparing a Fitbit One plus brief counseling with a pedometer group, Cadmus-Bertram et al found an effect size of approximately 0.24 for steps, whereas our comparison with a wait-list control produced an effect size of approximately 0.26 (from approximately 5100 to 6200 steps, as compared with approximately 4600 steps at both time points in the control group). Another study that compared Fitbit One with texting against Fitbit One without texting found no significant difference between groups and no increase in steps over baseline in either group. A preexperimental study of Fitbit provision among adults more than 60 years of age also found a very similar increase of approximately 1100 steps over 12 weeks. A large-scale weight loss study that compared a standard behavioral weight loss intervention with and without a BodyMedia wearable monitor found no difference between the two in physical activity. The BodyMedia monitor differed substantially from the Fitbit and Jawbone devices in terms of the behavior change techniques available in its app, which may partially explain this different result. Taken together, these results suggest that wearable electronic activity monitors with sophisticated feedback apps and supplemental guidance can indeed produce clinically significant increases in physical activity.
Nearly half of our sample (18/40, 45%) self-reported as black, Hispanic, or other race, groups that are at increased risk of inactivity and related negative health outcomes. More than half of the sample (22/40, 55%) were aged above 60 years, with 10 (25%) above 65 years. Baseline fitness estimates also indicated that many of the participants were quite deconditioned. In all, this sample represents a population in critical need of novel and effective interventions to increase physical activity. Ample epidemiological evidence suggests that even small increases in physical activity among sedentary older adults can produce large health improvements [3,42]. Although a standard physical activity intervention might be expected to produce an increase of 1000-2000 steps , these interventions are typically much more intensive than the one tested here and thus likely more difficult to disseminate.
Feasibility and acceptability findings showed that participants overall were compliant and reported enjoying the intervention. Based on our findings for broken and lost monitors, researchers may need to purchase extra monitors for any long-term planned implementation. This study required purchase of 6 additional monitors on top of the original 20, which was more than expected. Acceptability findings were high for all components of the intervention, including for participants aged above 60 years. Despite technical issues such as broken monitors (not syncing, not powering on, buttons falling off), participants reported that the monitor, tablet, and app were user-friendly. No participant stated that they would rather use a simple pedometer instead of the provided wearable electronic activity monitor.
These results also raise many questions for future research. The extent to which apps with wearable devices cause increases in physical activity, as compared with telephone counseling or the two in combination, is not clear. We specifically designed the brief counseling to address behavior change techniques absent from the app; a study of the Jawbone monitor in isolation may find different results. In addition, we arranged for participant accounts to be “friended” with other participants in their cohort to allow for anonymous “likes” and comments. Social interaction, whether with other participants or with family and friends, could be a powerful tool for increasing the efficacy of these devices.
The rigor of this study was limited by several aspects of its design and by its nature as a small project conducted with very limited financial resources. As a pilot study, it was not powered to detect statistically significant differences in its outcomes or long-term behavior maintenance. We also cannot determine the feasibility, acceptability, or effects of individual portions of the intervention, such as the monitor alone or telephone counseling alone. Comparison with a wait-list control also limits our ability to interpret feasibility or acceptability relative to other interventions such as pedometers. Because the study was not powered for hypothesis testing, we do not report P values; the effect sizes, however, may be useful for powering future studies that use wearable electronic activity monitors. A related limitation lies in our study design and analytic plan. We did not anticipate the importance of socializing within the app during our planning process and did not plan for clustering. Because of resource limitations, we were unable to ensure that all participants had equal access to socialization at the same time (ie, some participants had fewer people to talk to in the app for periods of time). We attempted to account for socialization by adding a random effect for cohort to our models, but even that technique cannot truly account for the potential effects of differing social opportunities when cohorts do not spend equal time with each other in the app. Future studies that use the full potential of these apps, including online social networking, will need to plan for clustering in their recruitment schedules and analytic plans.
An issue with all studies that use commercially available technology is sustainability. The Jawbone company is no longer manufacturing activity monitors, and it is not clear what the future holds for the Up app. Although other, similar wearables and apps exist, the number of behavior change techniques and the quality of their implementation differ. In particular, social interaction is implemented quite differently in competitors' products, which could affect the results of future studies.
Another possible limitation concerns the ActivPAL research-grade physical activity measurement devices. Comparisons with other physical activity studies in terms of activity time are difficult due to differences in how this outcome is estimated. ActivPAL minute estimates cover any physical activity, not only moderate to vigorous intensity activity. Because our focus was on replacing sedentary time with any kind of activity, we felt this was the appropriate outcome. However, because many other studies use Actigraph devices to measure moderate-vigorous intensity activity as their primary outcome, comparisons across studies for active time are difficult. We have provided both steps and active time to allow for more comparisons with other studies.
An intervention using wearable electronic activity monitors, tablets, and brief phone counseling was found feasible and acceptable in a population of sedentary, overweight middle-aged and older adults. These systems show promise as relatively inexpensive, scalable methods for the delivery of evidence-based behavior change techniques. Future studies are needed to better understand how and why monitor interventions may increase physical activity, for example, by comparing monitors alone with monitors with additional behavior change techniques delivered via counseling.
This study was internally funded by the Claude D Pepper Older Americans Independence Center (P30AG024832) and Sealy Center on Aging. Additional salary support was provided by a Mentored Research Scholar Grant in Applied and Clinical Research (MRSG-14-165-01-CPPB) from the American Cancer Society, the American Heart Association (13BGIA17110021), and the Cancer Prevention Research Institute of Texas (RP140020).
Behavior change techniques made available by the Jawbone Up system.
CONSORT e-health checklist 1.6.
Authors' Contributions: EJL developed the study. MCS, ZHL, and EM assisted with implementing the protocol, collecting data, and writing the manuscript. KJ performed statistical analyses and assisted with writing the manuscript. All authors provided edits to the manuscript.
Conflicts of Interest: MCS’s spouse has an equity interest in Apple Inc, a company that may potentially benefit from the research results. In addition, ZHL is employed by Beachbody, a company that may potentially benefit from the research results. ZHL’s employment began after data collection and analysis. UTMB’s Conflicts of Interest Committee has reviewed these conflicts and a management plan was implemented to prevent any appearance of a conflict of interests. Any inquiries regarding this management plan can be directed to UTMB’s Office of Institutional Compliance, (409) 747-8701.