To examine how the National Cancer Institute-funded Community Networks Program (CNP) operationalized principles of community-based participatory research (CBPR).
Based on our review of the literature and extant CBPR measurement tools, scientists from nine of 25 CNPs developed a 27-item questionnaire to self-assess CNP operationalization of nine CBPR principles.
Of 25 CNPs, 22 (88%) completed the questionnaire. Most scored well on CBPR principles to recognize community as a unit of identity, build on community strengths, facilitate co-learning, embrace iterative processes in developing community capacity, and achieve a balance between data generation and intervention. CNPs varied in the extent to which they employed CBPR principles of addressing determinants of health, sharing power among partners, engaging community in research dissemination, and striving for sustainability.
Although tool development in this field is in its infancy, findings suggest that fidelity to CBPR processes can be assessed in a variety of settings.
Despite a national commitment to reduce health disparities,1,2 recent studies show persistent differences in cancer incidence and mortality among different racial, ethnic, social, and geographic groups.3,4 Cancer-related health disparities across groups are due to numerous, interwoven factors, including differences in environmental stressors, socioeconomic status, health care coverage, providers' racial/ethnic/social biases, access to cancer screening and care, cultural beliefs about cancer, lifestyle behaviors, participation in routine cancer screening, and biological characteristics of the cancer.5 Attempts to reduce disparities through research are complicated by the fact that many vulnerable groups face a broad array of barriers that reduce willingness or ability to participate in research.6–12 For some disadvantaged communities, research has failed to address community concerns, has not yielded benefits to the community, has been exploitive, or has caused harm because findings attached unfavorable notoriety to the group.6,9–11
Community-based participatory research (CBPR) seeks to improve the capacity of research to decrease cancer and other enduring health disparities.13 CBPR emerges from social justice and action research traditions, both of which recognize the unique strengths and perspectives of community partners and aim to produce tangible benefits for communities participating in research.14–16 CBPR principles require that academic and community partners work together to design research, collect and interpret data, and disseminate results. Community members should gain skills through the collaboration and see tangible benefits of having participated. Such endeavors can enhance research methods and relevance of findings.6,8,15–19
Despite these promises, there is little evidence evaluating the extent to which CBPR projects have involved communities in research. Processes associated with developing, implementing, analyzing, and disseminating research are poorly documented and evaluated across a range of CBPR projects. In their review of 60 CBPR studies, Viswanathan and colleagues found varying degrees of community involvement in priority setting, methods selection, proposal development and funding, study design and implementation, translation of research findings, and integration and sustainability of programs, as well as in community capacity building, all of which are essential characteristics of CBPR.18,19
Recognizing the potential strengths of CBPR, in 2005 the National Cancer Institute (NCI) Center to Reduce Cancer Health Disparities (CRCHD) funded 25 Community Networks Programs (CNPs) to employ CBPR methods to reduce the unequal burden of cancer in minority and disadvantaged communities across the United States (US) and American Samoa.20 Although CNPs were funded at different levels (based on geographic scope), each CNP was charged to develop a research infrastructure that operated on CBPR principles. CNPs were required to convene a community advisory group, sponsor cancer education and outreach, train researchers and community members in CBPR methods, and conduct intervention studies. Community and academic partners were asked to work together to define research priorities in a way that was unconstrained by the biases of each and with the freedom to address research across the cancer research continuum, from discovery to dissemination.21 Engaging communities in research is consistent with NCI's commitment to interdisciplinary and team research,22 intentionally extending the team to community members who are experts in community culture and resources.
The CNPs provided an opportunity to explore how measures of adherence to CBPR principles, especially the extent to which community members are engaged in the design and dissemination of research, can be operationalized across a diverse sample of CBPR projects. Spearheaded by the authors of this paper, in 2009 the CNPs undertook a self-evaluation process to understand the extent to which each CNP's research effort reflected CBPR principles. As part of this process, we aimed to design and field test a quantitative tool to measure adherence to CBPR principles. In this paper, we describe the process of developing and implementing this quantitative evaluation measure for the CNPs, and discuss how such knowledge can be used to improve participatory processes and outcomes of CBPR endeavors within and beyond the CNPs.
The Principal Investigators of the CNPs appointed the authors, representing nine of 25 CNPs, to develop and implement an instrument to measure how the CNPs involved their relevant communities in accordance with CBPR principles. Five of the nine CNPs were established in 2000; the others in 2005. The authors reviewed the literature, available instruments, and their prior applications in response to NCI's funding announcement, with the goal of developing an instrument that could be utilized across a broad spectrum of communities and projects at different stages in their development.
The community involvement measure applied most broadly was one used in the 2004 review by the Agency for Healthcare Research and Quality of CBPR projects, but that instrument was utilized only by the reviewers and consisted only of yes/no items.18 The most concise statement of the domains of community involvement in CBPR projects was the set of nine CBPR principles outlined by Israel and colleagues.14 These principles state that CBPR projects should: 1) recognize the community as a unit of identity; 2) build on the strengths and resources within the community; 3) facilitate a collaborative, equitable partnership in all research phases, involving an empowering and power-sharing process that attends to social inequalities; 4) foster co-learning and capacity building among all partners; 5) integrate and achieve a balance between data generation and intervention for the mutual benefit of all partners; 6) focus on the local relevance of public health problems and on ecological perspectives that attend to multiple determinants of health; 7) involve systems development in a cyclical and iterative process; 8) disseminate results to all partners and involve them in the wider dissemination of results; and 9) involve a long-term process and commitment to sustainability. However, published literature was lacking in how these principles could be operationalized into measures that could be used across multiple CBPR projects in different communities.
Because of this gap, the authors developed a quantitative rating form based on Israel et al's nine principles, with three items tapping each principle. We based our scoring system on a previously validated guideline by Green et al, who published a 26-item tool (each with five response options) for assessing how well participatory research approaches are reflected in grant applications.23,24 Green et al organized their items in five domains: 1) participants and the nature of their involvement; 2) origin of the research question; 3) purpose of the research; 4) process and methodological implications; and 5) nature of the research outcomes. Although Green's tool was not organized according to Israel et al's nine principles, it included items relevant to several of the principles, including research partnership, project relevance, co-learning, and dissemination of findings. In collaboration with Green, Van Olphen used this tool to evaluate a single CBPR project, but noted the need to adapt it to better reflect the purpose and circumstances of her project.25
Learning from this, we developed the tool to more specifically reflect the purposes of the CNPs. We organized it around Israel et al's nine principles, which are accepted in the field as key components of CBPR. Investigators from a subset of CNPs, selected because of their years of experience in CBPR, developed three items for each of the nine principles. For each of the resulting 27 items, we specified five response options to reflect low to high operationalization of the principle. The tool evolved through an iterative process of review, discussion, and revision by CNP representatives.
This 27-item questionnaire was sent to the 25 CNP Principal Investigators (PIs); 22 questionnaires were returned, completed either by the PI (n=12) or project manager (n=10). Analysis was done using Excel®. In using their tool to assess grant proposals, Green and colleagues cautioned against calculating means and total scores, as their response options did not follow a simple hierarchy and the “best” response option would likely depend on community context.23 We followed this same logic in the analysis of data collected with our tool. Thus, findings are reported as frequencies (i.e., with no attempt to rank or score responses).
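The frequency-only tabulation described here is simple enough to sketch in a few lines of code. The item identifiers and responses below are hypothetical illustrations, not actual CNP data; the point is that each item's response options are tallied as counts, with no means or total scores computed, since the options do not form a simple hierarchy.

```python
from collections import Counter

# Hypothetical sketch: each completed questionnaire maps item IDs to a
# response option (1-5). Item names and values are illustrative only.
responses = [
    {"P1_item1": 5, "P1_item2": 4, "P2_item1": 3},  # CNP A
    {"P1_item1": 4, "P1_item2": 4, "P2_item1": 5},  # CNP B
    {"P1_item1": 5, "P1_item2": 2, "P2_item1": 3},  # CNP C
]

def item_frequencies(questionnaires):
    """Tally how many respondents chose each response option, per item.

    Deliberately reports only frequencies: no means or summed scores,
    because response options need not follow a simple hierarchy.
    """
    freqs = {}
    for q in questionnaires:
        for item, option in q.items():
            freqs.setdefault(item, Counter())[option] += 1
    return freqs

freqs = item_frequencies(responses)
# e.g., freqs["P1_item1"] tallies to Counter({5: 2, 4: 1})
```

The same tallying is straightforward in a spreadsheet (as the authors did in Excel); the code simply makes explicit that the analysis stops at counts per response option.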
In CBPR projects, the community must be clearly defined and want to engage with academics to research issues of mutual interest.14 As shown in Table 1, almost all CNPs (86%) had detailed or general definitions of their community. Of the 22 responding CNPs, 18 focused on specific racial/ethnic groups (e.g., African Americans, American Indians and Alaska Natives, Asian Americans, Hispanics, and Native Hawaiians) within or across states, while four focused on underserved populations within specific geographic areas (e.g., Appalachia, Arkansas, Tampa, and Boston). All CNPs reported that partnering communities had expressed at least moderate interest in cancer, and all but two communities had expressed moderate interest in participating in cancer research. These findings are not surprising, as the CNP request for applications prompted applicants to describe the community, and letters of support were expected to demonstrate community interest in cancer and research.
An assumption of CBPR is that research will be more successful and yield more meaningful findings if it respects community values and capitalizes on the cultural assets and resources of the community.14 We found that the setting of priorities for CNP research projects and programs was dominated by the community in 8 (36.4%) CNPs and based on equal input by community and academic partners in 11 (50.0%) CNPs; however, priority setting was dominated by academic researchers in 3 (13.6%) CNPs. CNPs reported that, for the most part, their research projects and programs were built on community strengths. All but one CNP had at least moderate documentation of the community's strengths and resources.
In CBPR, mechanisms should be in place to allow the community to influence research projects and processes;14 for example, as advisors, hired staff, or administrators and leaders of the research. CNPs were required to convene community advisory groups to influence research, but only 11 (50.0%) CNPs reported that this group was empowered to approve, disapprove, and recommend changes to all (100% of) CNP research proposals. One CNP did not require community approval for any research projects, while the remaining CNPs required approval for some (3), half (2), or most (5) research projects. Two CNPs reported no personnel from the partnering community, while six (27.3%) reported that the PI and most other personnel were from the partnering community. Three CNPs were based in community agencies and subcontracted with academic researchers and institutions as needed, while the other 19 (86.4%) were based in the university with minimal (3), moderate (11), or half (5) of the funding going to the community.
CBPR projects should have mechanisms to facilitate the reciprocal transfer of knowledge and skills.14 We found that all CNPs reported providing opportunities for community partners to learn about research and for academic researchers to learn about the culture and health issues of the community. All CNPs reported two or more examples of how community partners were strengthened by participation in the CNP.
CBPR projects should lead to improvements in the community, as well as produce new knowledge.14 All CNPs reported at least two examples of helping community partners obtain resources for cancer services. All noted that community members had articulated CNP benefits; e.g., in letters of support for subsequent CNP applications. Although two CNPs reported that the community's cancer care system had been unchanged by their work, 20 (90.9%) reported that work of the CNP had resulted in minimal (4), some (7), or significant (9) improvements in the community's cancer services.
Recognizing that health is influenced by social as well as community factors, CBPR projects should “strive to achieve broad-scale social changes aimed at eliminating health disparities.”14 CNPs varied in the number of sponsored initiatives to address non-proximal causes of cancer (e.g., general education, racism, stress, jobs, insurance, income, housing, or environment) from none (1) to four or more (7). No CNP reported that their projects focused primarily on the intra- and interpersonal levels of behavior change, and 12 (54.5%) reported sponsoring projects on all five levels of intervention noted in the social ecological model.26 All but one CNP reported that a major purpose was to empower the community to identify and address its own issues, and 15 (68.2%) trained and supported community members to serve in leadership roles.
CBPR recognizes that it takes time spent in discussion and other aspects of engagement to develop and implement research projects that equally consider community and academic interests.14 Most (86.4%) CNPs reported four or more community-academia meetings per year to discuss, propose, review, improve, or interpret findings related to the CNP. Most (90.9%) CNPs noted that community members were somewhat or extremely comfortable initiating meetings about the CNP or questioning CNP research and programs. However, eight (36.4%) CNPs had initiated no, or only minimal, discussions with the community on long-term sustainability of the program.
Findings from CBPR projects should be respectfully presented, with presentation and authorship opportunities for community.14 Four (18.2%) CNPs reported sharing findings only with community advisors, while 16 (72.7%) CNPs also shared findings through reports, newsletters, and community meetings. No CNPs required community co-authorship of CNP-related publications, but 17 (77.3%) reported that community co-authorship was expected. Four (18.2%) CNPs required community co-presentation of CNP findings at meetings or conferences, and 14 (63.6%) expected and supported this.
Funding levels may fluctuate, but CBPR partnerships should continue to function, and communities should be supported in obtaining their own funding.14 We found that 17 (77.3%) CNPs felt they had some chance of continuing their CNP research infrastructures after NCI funding ended. All CNPs reported that community members had secured grants to sustain or expand programs related to cancer, and ten (45.5%) reported helping community secure four or more such grants. All but two CNPs reported that at least one junior researcher had secured his/her own research funding.
As with past efforts to evaluate CBPR studies, we found variation among CNPs in the operationalization of CBPR principles, reflecting the diversity of CBPR partnerships as well as the settings within which cancer health disparities persist.18 Some principles seem easier to adhere to than others. For example, we found that most CNPs scored well on recognizing the community as a unit of identity, assessing and building on community strengths, facilitating co-learning, embracing iterative processes in developing research and capacity, and achieving a balance between data generation and intervention.
There was wider variation in the abilities of the CNPs to share power and resources with their respective communities. For example, not all CNPs empowered their community advisory group to disapprove CNP research projects or employed members of the target community. Although some CBPR scientists urge that grant funds be split at least evenly between university and community partners,12 this was true for only eight of the 22 CNPs. Perhaps this is not surprising, as CNPs were funded by NCI, with its clear research mission, and 19 of the CNPs were based in universities. Unfortunately, awards through the CNP mechanism were capped at a fixed total cost (i.e., including direct and indirect costs). Hence, university-based CNPs tended to have less money to share with community than did the three community-based CNPs, although a few university-based CNPs were able to negotiate a reduced indirect cost rate.
CNPs also varied in sponsorship of projects addressing multiple determinants of health. CBPR recognizes that underserved groups are likely to live and work in risk-laden environments and may have limited access to health insurance and health care.14 Some aspects of blue collar jobs (e.g., shift work and external control of tasks) and the need to work multiple jobs to make ends meet can jeopardize health and reduce time to access care.27 Long-term exposure to racism and structural inequalities clearly diminish health.28 Despite the value of addressing non-proximal causes of cancer and working at the policy level to improve cancer care, many CNP research projects tested interventions aimed at changing individual health behavior (albeit within the context of families, organizations, and communities), rather than increasing socio-economic status or reducing damage from stress or racism. Addressing upstream determinants of health was limited by low funding levels for pilot research projects ($50,000 total cost) and the usefulness of “simple” research projects that can yield early victories in training community members and junior researchers in CBPR.29
The results from this self-evaluation of the CNPs also identified several gaps in our CBPR processes. Sustainability remains a profound challenge to the institutionalization of innovation.30,31 Most CNPs helped community organizations and junior researchers build capacity and secure their own grants. However, it is unclear if CNPs themselves will be sustained without continued research infrastructure funding. Although community-university partnerships can yield benefits to both entities, maintaining these partnerships requires continued support for meetings, training, and resource sharing as new partners and junior researchers emerge.32
Because of the complexity of interactions between members of each community and the CBPR researchers working with them, as well as differences between communities, there are inherent limitations to the use of any quantitative tool to measure adherence to CBPR principles. Comprehensive CBPR evaluation must include other methods such as the use of focus groups, individual interviews, ethnographic observations, and other documentation from university and community stakeholders.25 However, development of a quantitative tool that is easy to use is important for CBPR projects that lack the resources to collect in-depth qualitative data and allows for some standardization of how CBPR is measured across projects.
Overall, the high response rate across the diverse researchers involved in the CNPs indicates that it is feasible to implement this tool across a wide variety of CBPR projects. We were pleased with the variation in responses that our adapted tool elicited from the CNPs and its usefulness in our self-evaluation process. We believe that a tool like this could help programs gauge the extent to which they reflect the essential components of CBPR and engage community partners.
However, we recognize several limitations of our approach. Although our evaluation tool was informed by the work of Viswanathan et al18 and Green et al23 to operationalize and quantify the participatory aspects of the CNPs, the tool could be improved. The vetting of the tool was limited to CNP PIs and their designees. Some items reflected CNP-specific activities (e.g., references to “CNP proposal” or “cancer research”), which can be made more general or contextualized to other CBPR projects by replacing the term “CNP” with the appropriate proposal title and the topic of “cancer” with the appropriate topic. Some response options were vague, for example, asking if CNPs offered no, few, some, several, or many opportunities for training. This was done purposely because CNPs were funded at different levels, some focusing on one specific geographical community while others focused regionally or nationally, precluding use of absolute numbers in response options for some items. This illustrates a common difficulty that may be anticipated in quantifying some response options. We relied on self-report without justification of answers, and findings are likely biased in favor of the CNPs because the respondents were CNP PIs and project managers. A potential next step could be to field test the tool with community leaders who were an integral part of the CNPs.
To improve this tool, items and associated response options should be further examined for relevance, and response options should be quantified to the extent possible. Future work should include validity testing to assure that the items in fact measure Israel et al's nine principles, and might include broader review by CBPR experts. Attention should be focused on whether criterion validity is feasible and if construct validity is practicable. On the assumption that CBPR improves both participatory processes and research outcomes, future research should examine the association between extent of CBPR operationalization and alleged or anticipated effects (i.e., “constructs” that would be expected to change with committed application of CBPR principles).33,34
There could be several ways to use a validated tool to measure CBPR operationalization. If used periodically over the course of the program, it could stimulate increased opportunities for participation and empowerment of community members and facilitate early discussions about sustainability and expected benefits to community programs and health care systems. Asking community partners to complete the tool would allow comparison of community and academic perspectives (although Van Olphen reported a low response rate from community partners when asking them to complete her CBPR questionnaire).25
Overall, findings confirm the variability in the extent to which CBPR principles are applied in the implementation of CBPR projects. Although tool development in this field is in its infancy, results also suggest that the CBPR process can be operationalized and measured. Given the paucity of tools to assess the “participatory” components of CBPR, we hope our work helps others develop, refine, and test CBPR measures.
This research was supported by grant numbers: U01CA114630 (‘Imi Hale Native Hawaiian Cancer Network, PI: Chong); U01CA114640 (Asian American Network for Cancer Awareness, Research and Training, PI: Chen); U01 CA114591 (Weaving an Islander Network for Cancer Awareness, Research and Training, PI: Tanjasiri); U01 CA114626 (The University of Oklahoma Community Networks Program, PI: Campbell); U01CA114652 (National Black Leadership Initiative on Cancer III: Community Networks Program, PI: Blumenthal); U01 CA114696 (Southwest American Indian Collaborative Network, PI: Coe); U01CA114641 (Meharry Medical College - Community Health Centers Network, PI: Hargreaves); U01 CA114582 (ATECAR - Asian Community Cancer Network, PI: Ma); U01 CA114601 (South Carolina Cancer Disparities Community Network, PI: Hébert).
Contributor Statement: All authors contributed to the conceptualization of the research question, framework for questionnaire design, data collection, and interpretation. Drs. Braun, Tanjasiri, Blumenthal, and Hargreaves took the lead on questionnaire design; Dr. Campbell took the lead on data analysis; Drs. Braun, Nguyen, Tanjasiri, and Hébert led the writing team.
The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
The authors have no conflicts or disclosures.
Human Subjects Protection: ‘Imi Hale Native Hawaiian Cancer Network (U01CA114630) is approved by the Native Hawaiian Health Care Systems IRB, which reviewed this study and determined it exempt.
Kathryn L. Braun, University of Hawaii and Co-Principal Investigator, ‘Imi Hale Native Hawaiian Cancer Network, 894 Queen Street, Honolulu, HI 96813, 808-330-1759, Email: kbraun@hawaii.edu.
Tung T. Nguyen, University of California San Francisco and Deputy Principal Investigator, Asian American Network for Cancer Awareness, Research, and Training; Box 0320, UCSF Medical Center, San Francisco, CA 94143, 415-514-8659, Email: Tung.Nguyen@ucsf.edu.
Sora Park Tanjasiri, California State University, Fullerton and Principal Investigator, Weaving an Islander Network for Cancer Awareness, Research and Training, PO Box 6870, Fullerton, CA 92834, 657-278-4592, Email: stanjasiri@fullerton.edu.
Janis Campbell, The University of Oklahoma Health Sciences Center, Principal Investigator, The University of Oklahoma Community Networks Program, 801 NE 13th Street, Room 309, Oklahoma City, OK 73104, 405-271-2229, Email: janiscampbell@ouhsc.edu.
Sue P. Heiney, University of South Carolina, College of Nursing, Columbia, SC 29203, 803-777-8214, Email: heineys@mailbox.sc.edu.
Heather M. Brandt, University of South Carolina, Arnold School of Public Health, Department of Health Promotion, Education, and Behavior, 800 Sumter Street, Columbia, SC 29208, 803-777-7096, Email: hbrandt@sc.edu.
Selina A. Smith, Morehouse School of Medicine, 720 Westview Drive SW, Atlanta, GA 30310, 404-756-5205, Email: ssmith@msm.edu.
Daniel S. Blumenthal, Morehouse School of Medicine, 720 Westview Drive SW, Atlanta, GA 30310, 404-752-1625, Email: dblumenthal@msm.edu.
Margaret Hargreaves, Meharry Medical College, 1005 D.B. Todd Blvd., Nashville, TN 37208, 615-327-6927, Email: mhargreaves@mmc.edu.
Kathryn Coe, Department of Public Health, School of Medicine, Indiana University-Purdue University, Indianapolis, IN 46206, 317-278-3072, Email: coek@iupui.edu.
Grace X. Ma, College of Health Professions, Temple University, 913 Ritter Annex, 1301 Cecil B. Moore Ave., Philadelphia, PA 19122, 215-204-5108, Email: email@example.com.
Donna Kenerson, Department of Internal Medicine, Meharry Medical College, 1005 Dr. D.B. Todd Blvd., Nashville, TN 37208, 615-327-5914, Email: dkenerson@mmc.edu.
Kushal Patel, Meharry Medical College, 1005 D.B. Todd Blvd., Nashville, TN 37208, 615-327-5648, Email: kpatel@mmc.edu.
JoAnn Tsark, ‘Imi Hale Native Hawaiian Cancer Network, Papa Ola Lokahi, 894 Queen Street, Honolulu, HI 96813, 808-526-1700, Email: jtsark@imihale.org.
James R. Hébert, South Carolina Statewide Cancer Prevention and Control Program, University of South Carolina, 915 Greene Street, Suite 241-2, Columbia, SC 29208, 803-576-5666, Email: jhebert@sc.edu.