In 1999, results from a study of lung cancer screening in cigarette smokers demonstrated that low dose helical computed tomography (CT) could detect nearly four times as many lesions as single-view posteroanterior chest radiograph.1 Enthusiasm for CT screening grew even though there was no evidence that the modality could reduce lung cancer mortality. A randomized controlled trial (RCT) with a mortality endpoint was proposed, but some cautioned that persons at high lung cancer risk would not accept randomization due to a strong desire to receive CT. While two feasibility studies2,3 demonstrated that such concerns were unfounded, there was an additional concern that CT use would increase, and that an RCT had to be done quickly. With this in mind, the US National Cancer Institute (NCI) launched the National Lung Screening Trial (NLST) in 2002, requesting that the 33 screening centers enroll the necessary 50,000 participants within two years. Enrollment to NLST finished ahead of schedule: 53,454 persons were randomized within twenty months. We describe NLST’s recruitment effort, reporting on methods used and costs associated with some of the methods for the 22 NLST centers that contributed data (Table 1).
The NLST has been described in detail elsewhere.4 Briefly, the NLST, an RCT, was designed to assess whether CT can reduce lung cancer mortality, relative to single-view posteroanterior chest radiograph, among individuals at elevated risk of lung cancer. Participants were between 55 and 74 years old, either current cigarette smokers or former smokers who had quit within the previous 15 years, and had a smoking history of at least 30 pack-years. Pack-years was calculated by multiplying the number of years smoked by the average number of packs (20 cigarettes) smoked per day. Each participant was randomized to one modality and invited to receive three annual screens. Thirty-three medical centers across the US enrolled and screened participants, and the study was approved by the local Institutional Review Board (IRB) at each screening center. The study launched in 2002, completed recruitment in 2004, completed screening in 2007, and was stopped in October 2010, after which a significant 20 percent lung cancer mortality reduction was reported.5
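As an illustration, the eligibility criteria above can be sketched as a simple check. This is a hypothetical helper written for this summary, not NLST's actual screening logic; the function and parameter names are our own:

```python
def pack_years(years_smoked, packs_per_day):
    """Pack-years = years smoked x average packs (20 cigarettes) smoked per day."""
    return years_smoked * packs_per_day

def nlst_eligible(age, is_current_smoker, years_since_quit, years_smoked, packs_per_day):
    """Sketch of the NLST age and smoking criteria described in the text.

    Eligible participants were 55-74 years old, current smokers or former
    smokers who had quit within the previous 15 years, with a smoking
    history of at least 30 pack-years.
    """
    if not 55 <= age <= 74:
        return False
    if not is_current_smoker and years_since_quit > 15:
        return False
    return pack_years(years_smoked, packs_per_day) >= 30

# For example, a 60-year-old who quit 5 years ago after smoking 2 packs a day
# for 20 years has 40 pack-years and meets all three criteria.
```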
NLST had two administrative components. The Lung Screening Study (LSS) had 10 centers and was administered through an NCI contract mechanism. The other component, administered by the American College of Radiology Imaging Network (ACRIN) through an NCI grant and Clinical Trials Group cooperative agreement mechanism, had 23 centers. LSS and ACRIN were harmonized on major trial activities.4
Each center was free to choose its recruitment methods. We categorized methods into three broad categories: direct mail, community outreach, and mass media. Direct mail refers to the use of mass mailing to invite persons whose names and addresses appear on a list procured from a commercial, government, or other organization. Community outreach refers to the use of community resources and activities, and mass media the use of print or broadcast media or advertising to inform persons about the trial. Subcategories of the three broad categories were identified (Tables 2a and 2b). Eleven subcategories were identified for direct mail, 11 for community outreach, and 8 for mass media. LSS centers, given their previous favorable experience with direct mail in the Prostate, Lung, Colorectal and Ovarian (PLCO) Cancer Screening Trial, chose this method as their primary means of recruitment. ACRIN centers felt that mass media and advertisements in physicians’ offices would be most effective and began their recruitment efforts there.
A three-part questionnaire was completed by each center coordinator and/or another staff member involved in recruitment (referred to as “coordinator” henceforth). Coordinators typically held research associate-type positions, although some were nurses, and at least one was a physician. The first part asked the coordinator to subjectively assess the degree to which each method listed in Tables 2a and 2b was used. Coordinators were instructed to leave the answer space blank if a method was not used. Otherwise, the coordinator was to put “+++,” “++,” “+,” or “−”. Coordinators were instructed to assign “+++” if a method was used extensively, assign fewer “+”’s if the method was used to lesser degrees, and assign “−” if a method was used but abandoned due to lack of success or other issues. This method was chosen because it had been used successfully in evaluating recruitment efforts in another cancer screening trial.6 For ease of presentation, “+++” was recoded to “A”, “++” to “B”, “+” to “C”, and “−” to “F”.
The second part asked coordinators to report the number of inquiries generated by, and the number of participants enrolled because of, each method, with both based either on participant queries at the time of first contact or retrospective approximations by the coordinator. Respondents also were asked to report the exact or approximate total cost of each method, exclusive of personnel and overhead costs. Respondents were not queried as to the specific activities they included in the reported total cost. Coordinators also were asked to report the exact or approximate number of pieces mailed for direct mail methods. Data for the second part were not available for all centers: some had no data or could not provide an approximation, while others could provide data for only certain methods. For direct mail, response and enrollment rate were calculated by dividing the number of respondents and the number enrolled, respectively, by the number of pieces mailed. For all categories, cost per enrollee was calculated by dividing cost by number enrolled. The third part of the questionnaire asked coordinators to report lessons learned as a result of recruitment efforts. Coordinators relied on their memories to complete the first and third parts of the questionnaire; internal records, if available, were also used to complete the first part. The second part was completed using internal records.
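The rate and cost measures described above are simple ratios; they can be sketched as follows (function names and the illustrative figures are ours, not NLST data):

```python
def response_rate(n_respondents, n_pieces_mailed):
    """Direct mail response rate: respondents per piece mailed."""
    return n_respondents / n_pieces_mailed

def enrollment_rate(n_enrolled, n_pieces_mailed):
    """Direct mail enrollment rate: enrollees per piece mailed."""
    return n_enrolled / n_pieces_mailed

def cost_per_enrollee(total_cost, n_enrolled):
    """Cost per enrollee, exclusive of personnel and overhead costs."""
    return total_cost / n_enrolled

# Illustrative (made-up) figures: 100,000 pieces mailed generate
# 2,500 inquiries and 1,000 enrollees at a total cost of $50,000.
print(response_rate(2500, 100000))     # 0.025, i.e. 2.5%
print(enrollment_rate(1000, 100000))   # 0.01, i.e. 1.0%
print(cost_per_enrollee(50000, 1000))  # 50.0, i.e. $50 per enrollee
```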
Twenty-two centers completed the questionnaire. Each center, its administrative affiliation, location, catchment area, and number enrolled is listed in Table 1. Ten centers were affiliated with LSS; the remaining 12 were affiliated with ACRIN. The 22 responding centers recruited 46,402, or 87%, of all NLST participants.
Direct mail was utilized by every center in varying degrees. Eighteen centers used commercial mailing lists, although one center abandoned the method. Direct mail using commercial lists was a focus of LSS recruitment efforts, with seven of nine centers reporting extensive use. Direct mail using lists from the American Cancer Society was a focus of ACRIN recruitment, with three centers reporting extensive use. Other extensively-used sources included Department of Motor Vehicles lists and lists of persons registered in a health care system.
The community outreach method used by most centers was referrals from NLST participants/word-of-mouth: seventeen centers obtained participants in this manner. Eleven centers used enrollment seminars: seven rated the method as extensively used, while one center abandoned it. Community outreach methods were used extensively by two centers in recruiting persons of minority ethnicity or heritage; other centers either used community outreach methods for minority recruitment to a lesser degree or abandoned them.
Mass media was used extensively by ACRIN and less so by LSS. All twelve ACRIN centers used radio, while only six LSS centers used radio, and one abandoned the method. Eleven ACRIN centers and eight LSS centers used newspapers. Television and magazines were each used by eleven of twelve ACRIN centers, although two abandoned television and three magazines. Television was used by five LSS centers, with one abandoning it. Magazines were used only by one LSS center.
Data on number enrolled, enrollment rate, and cost per enrollee for recruitment methods are presented in Tables 3, 4, and 5. Twelve centers contributed data for direct mail methods, seven for community outreach methods, and nine centers for mass media methods. Enrollment rate and cost per enrollee varied substantially across centers.
The enrollment rate for direct mail ranged from approximately 0.2% to 3.7%, and the cost per enrollee ranged from $6 to $325 when a targeted mailing to recruit Hispanics is included ($223 otherwise). The median cost per enrollee was $101. Centers that reported information on direct mail methods recruited approximately 19,000 participants using this method.
For community outreach, cost per enrollee ranged from no cost to $103; the median was $4. Referral/word of mouth was a successful and inexpensive method: 3 of the 4 centers that reported using it paid nothing for referrals, while the remaining center paid the referring participant $25 for each enrollee. Use of in-person recruiting methods, such as health fairs and enrollment seminars, had low or no cost per enrollee but enrolled very few participants. Centers that reported information on community outreach recruited approximately 1,000 participants using these methods.
For mass media, cost per enrollee ranged from no cost to $1,953; the median was $79. Cost per enrollee and number enrolled varied widely by media type; furthermore, costs for certain types of media efforts (for example, radio) varied substantially across centers, perhaps because certain efforts were covered by the sponsoring organization or an advocacy group, or were conducted as public service announcements. Centers that reported information on mass media recruited approximately 4,200 participants using these methods.
A few coordinators stated that when conducting extensive recruitment efforts, it was critical to know where to find persons who are likely to be eligible and interested, and how best to approach them. One coordinator stated that it was important to “go where the smokers are,” and as such, that center held informational sessions at bars and casinos. Another coordinator recommended that, for recruitment in rural areas, large medical establishments partner with local facilities to build trust. With regard to direct mail, one coordinator felt that it was important to use marketing techniques to develop materials that would speak to the target population. Another recommended timing a media blitz with a direct mailing effort to enhance knowledge of the trial, and one suggested “saturating the airwaves” by placing radio ads at the same, strategically chosen, times of day for a consecutive number of days. For minority recruitment, a coordinator recommended hiring interviewers who are of the same ethnicity or heritage as the population to be enrolled. It also was noted that targeted recruitment of minority populations is more expensive and time consuming than recruitment from the general population. Two coordinators reported that monetary incentives, including free or subsidized transportation, were useful. Coordinators also recommended using recruitment methods that did not result in additional cost to the screening center, such as public service announcements, television and radio interviews with study staff, and newspaper articles. Finally, some reported that it was important to prospectively track data on recruitment so that successful strategies could be identified, retained, and fine-tuned, and others either dropped or remedied.
The NLST recruited 53,454 participants in a little over a year and a half, an average of nearly 2,700 a month. Many more participants were enrolled using direct mail as compared with mass media and community outreach, although median cost per enrollee was inversely correlated with number enrolled. In some instances costs for similar types of recruitment activities, particularly direct mail, varied substantially by center. Researchers who use our data therefore must be cognizant of the range of possible costs and the possible reasons for such variability. They also must take into consideration personnel costs, which were not included in the reported total costs but do vary by method.
ITALUNG7 and NELSON,8 two RCTs of CT screening for lung cancer, have published enrollment rates for direct mail. Both reported a 4.5% enrollment rate, which is higher than NLST’s top enrollment rate of 3.7%. The higher enrollment rates in these two studies may be due to differences in eligibility criteria: in ITALUNG, participants were 69 or younger at enrollment and smoked a minimum of 20 pack-years (NLST’s criteria were 74 and 30, respectively), and in NELSON, participants were as young as 50 (NLST’s criterion was 55). These characteristics may have made potential participants more willing to enroll, as younger age and lower smoking exposure may make participation easier. In addition, the ITALUNG recruitment base comprised patients of general practitioners who received encouragement from their doctors to enroll.
Neither ITALUNG nor NELSON reported costs associated with direct mail. Two RCTs, one a multiphasic cancer screening RCT and one a trial of lung cancer chemoprevention, have reported them, however. PLCO reported a median cost of $21 per enrollee, as compared with NLST’s cost of $101.6 It is probable that the lower cost of direct mail in PLCO reflects that trial’s composition: nearly half of PLCO participants were never smokers, and anecdotal evidence suggests that recruitment of smokers is more difficult. The CARET chemoprevention RCT9 relied heavily on direct mail and reported a screening center cost per enrollee of $392,10 substantially higher than NLST’s cost. The CARET figure included personnel costs as well as the costs of randomization and a six month follow-up visit, which probably explains the four-fold difference between the trials.
We are not surprised that NLST enrollment yield varied by center. Many factors are likely to have contributed. Differences in target population most certainly affect yield. Our centers were located in rural, urban, and inner-city areas; anecdotal reports indicate that willingness to participate differs by geographic location, and may be lowest in areas where distrust of the medical community is prevalent. Another source of variability may be the type of mailing list used. Some centers were able to procure lists of smokers, while others used lists that included both smokers and non-smokers. Yield also may vary by contents of direct mail packets. One center reported that response to thin packets was greater than response to more substantial mailings. Experience also may impact yield: staff who have conducted previous recruitment efforts are likely to know how best to conduct a given effort.
It is curious, though, that NLST’s direct mail costs per enrollee vary so widely by site. The discrepancy between high and low cost per enrollee is either 54-fold ($325 vs. $6) or 38-fold ($223 vs. $6), depending on whether Colorado’s unique Hispanic recruitment effort, which had the highest cost per enrollee, is included. As was the case with yield, cost differences may be due to differences in experience with recruitment efforts. Differences also may be due to costs of mailing lists. The University of Alabama at Birmingham, for example, had access to a roster of clinic patients who had expressed interest in participating in research studies. Other sites used commercial mailing lists that were purchased from for-profit organizations. Yet other sites used lists compiled by advocacy organizations, which are likely to have been non-profit ventures that may have been willing to charge a low fee. Variability may reflect different accounting methods or perhaps differences or omissions in what centers included in total costs. However, examination of cost per piece mailed, a measure that removes the impact of enrollment rate, shows a much tighter range, $0.20 to $2.00 (10-fold difference), with most between $0.30 and $1.00 (3-fold difference), and suggests that costs to mail one piece are not meaningfully different across centers. Nevertheless, we recommend that researchers who wish to use our data to plan recruitment efforts carefully consider the possible sources of cost variability, determine which might be of relevance to their particular recruitment effort, and keep close track of costs as efforts proceed. The same level of consideration also must be given to our cost findings for community outreach and mass media, as variability across centers existed for those strategies too.
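The point about cost per piece mailed can be made concrete: cost per enrollee factors into cost per piece divided by enrollment rate, so a low enrollment rate inflates cost per enrollee even when per-piece mailing costs are similar. A minimal sketch with invented figures (these are not the centers' actual numbers):

```python
def cost_per_piece(total_cost, n_pieces):
    """Mailing cost per piece mailed."""
    return total_cost / n_pieces

def cost_per_enrollee(total_cost, n_pieces, enroll_rate):
    """Cost per enrollee = cost per piece / enrollment rate."""
    return cost_per_piece(total_cost, n_pieces) / enroll_rate

# Two hypothetical centers with comparable per-piece costs ($0.30 and
# $0.50) but very different enrollment rates (3.7% vs. 0.2%):
print(cost_per_enrollee(30000, 100000, 0.037))  # ~8.11, i.e. about $8 per enrollee
print(cost_per_enrollee(50000, 100000, 0.002))  # 250.0, i.e. $250 per enrollee
```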
Some centers reported zero or very low cost for certain recruitment strategies, most notably health fairs and enrollment seminars. While there may be no fee to place a booth at an event or use a large conference room during off-hours, the staffing of that booth and presentations at that seminar do result in personnel costs. We were pessimistic that most centers would have records detailing staff commitment to specific recruitment activities, and therefore asked them to omit personnel costs. Omission of those data, however, limits the usefulness of our findings, as the reader cannot glean a true total cost of a recruitment activity. Researchers who use our data therefore must consider differences in full-time-equivalent staffing for each possible activity. They also must consider how personnel costs vary by the number of persons with whom contact is made. A health seminar with few participants may require only one presenter, but as the number of attendees grows, additional personnel would be necessary to assist with coordination efforts and distribution of materials, for example.
Choosing a recruitment strategy that is likely to be successful and low in cost is no easy task. In NLST, direct mail was responsible for the most enrollees, but community outreach was cheaper, and mass media was not particularly expensive when methods were successful. It is naïve to think, however, that relying on one recruitment strategy in isolation is a wise approach. As noted by one coordinator, a combination of a media blitz and direct mailing was very successful. Advertising can familiarize people with the existence of a study, making receipt of a direct mail package seem less foreign. We were unable to measure the effect and cost of contemporaneous recruitment efforts, but such data could be collected in future efforts using a “check all that apply” approach.
While our findings must be applied cautiously, we feel that they still are of use to the research community. They provide measures that can be compared with one another and then ranked. While ranges, particularly for costs, are wide, medians are useful in that they are not affected by outliers. Our findings also have identified ways in which future studies can improve upon our data collection. First, recruitment must be tracked from the beginning of a study. Careful records need to be kept: they must include details of how much time specific personnel spent on each recruitment activity, when those activities occurred (during business hours or in the evenings or on weekends), and associated unusual or unexpected costs. Respondents must be asked how they heard of the study and must be allowed to indicate more than one source. They also should be queried as to their motivations for participation. Important as well is the practice of analyzing these data frequently while recruitment is on-going, particularly during the early months of the study. Such examination will allow failing methods to be identified and remedied if possible and excessive costs to be addressed.
In NLST, a large lung cancer screening trial, about 79 percent of enrollment is thought to have been attributable to direct mail, 17 percent to mass media, and four percent to community outreach. Costs per enrollee for specific methods (median for direct mail: $101; median for mass media: $79; median for community outreach: $4), which were exclusive of personnel and overhead costs, varied across centers, and the reason for such variation is likely to be multi-factorial. Personnel costs are known to vary by recruitment activity and must be considered when using the NLST data to plan recruitment endeavors, as must be the reasons for variability in our data. Future studies of recruitment success must collect careful, detailed records from study start to calculate a complete measure of cost per enrollee. The NLST data, however, are useful in that they provide a ranking of yield and costs for different methods; such a comparison can be used to inform decisions concerning recruitment methods.
The authors thank the screening center investigators and staff of the National Lung Screening Trial (NLST). Most importantly, we acknowledge the study participants, whose contributions made this study possible.
FUNDING This research was supported by eleven contracts (NO1CN25476, NO1CN25511, NO1CN25512, NO1CN25513, NO1CN25514, NO1CN25515, NO1CN25516, NO1CN25518, NO1CN25522, NO1CN25524, NO1CN75022) from the Division of Cancer Prevention, National Cancer Institute, NIH, DHHS, and by grants (U01 CA80098 and CA79778) to the American College of Radiology Imaging Network (ACRIN) under a cooperative agreement with the Cancer Imaging Program, Division of Cancer Treatment and Diagnosis, NCI.
The authors wish to acknowledge the following individuals who contributed to this project by completing questionnaires:
Denise Aberle, MDa
Lisa Gren, PhDd
Ella Kazerooni, MDi
Jeffrey Schragin, MDp
Catherine E. Smiths
D. Lynn Werners
aUniversity of California, Los Angeles. Los Angeles, CA, USA
bMayo Clinic, Rochester, Minnesota. Rochester, MN, USA
cHenry Ford Health System. Detroit, MI, USA
dUniversity of Utah Health Sciences Center. Salt Lake City, UT, USA
eJohns Hopkins University. Baltimore, MD, USA
fUniversity of Minnesota School of Public Health. Minneapolis, MN, USA
gBeth Israel Deaconess Medical Center. Boston, MA, USA
hGeorgetown University Medical Center. Washington, DC, USA
iUniversity of Michigan Medical Center. Ann Arbor, MI, USA
jWashington University School of Medicine. St. Louis, MO, USA
kUniversity of Alabama at Birmingham. Birmingham, AL, USA
lWake Forest University. Winston-Salem, NC, USA
mPacific Health Research & Education Institute. Honolulu, HI, USA
nMedical University of South Carolina. Charleston, SC, USA
oMarshfield Clinic Research Foundation. Marshfield, WI, USA
pUniversity of Pittsburgh Medical Center. Pittsburgh, PA, USA
qBrown University, Rhode Island Hospital. Providence, RI, USA
rUniversity of Colorado Denver. Aurora, CO, USA
sUniversity of Pennsylvania. Philadelphia, PA, USA
tUniversity of Iowa. Iowa City, IA, USA
uEmory University. Atlanta, GA, USA
vSt. Luke’s Mountain States Tumor Institute. Boise, ID, USA