Am J Public Health. Author manuscript; available in PMC Jun 1, 2012.
PMCID: PMC3292685
NIHMSID: NIHMS326031
Operationalization of community-based participatory research principles across the National Cancer Institute’s Community Network Programs
Kathryn L Braun, DrPH, Professor of Public Health, Tung T. Nguyen, MD, Professor of Medicine, Sora Park Tanjasiri, DrPH, Professor of Health Science, Janis Campbell, PhD, Assistant Professor of Research, Sue P. Heiney, PhD, RN, FAAN, Dunn Shealy Professor of Nursing, Heather M. Brandt, PhD, CHES, Selina A. Smith, PhD, MDiv, Associate Professor of Community Health & Preventive Medicine, Daniel S. Blumenthal, MD, MPH, Professor of Community Health and Preventive Medicine, Margaret Hargreaves, PhD, Professor of Internal Medicine, Kathryn Coe, PhD, Professor and Lilly Scholar, Grace X. Ma, PhD, Professor of Public Health and Director of Center for Asian Health, Donna Kenerson, PhD, RN, MPA, Program Manager, Kushal Patel, PhD, Assistant Professor of Internal Medicine, JoAnn Tsark, Project Director, and James R. Hébert, Sc.D., Health Sciences Distinguished Professor of Epidemiology and Director
Kathryn L Braun, University of Hawaii and Co-Principal Investigator, Imi Hale Native Hawaiian Cancer Network, 894 Queen Street, Honolulu, HI 96813, 808-330-1759, kbraun@hawaii.edu
Objectives
To examine how the National Cancer Institute-funded Community Network Program (CNP) operationalized principles of community-based participatory research (CBPR).
Methods
Based on our review of the literature and extant CBPR measurement tools, scientists from nine of 25 CNPs developed a 27-item questionnaire to self-assess CNP operationalization of nine CBPR principles.
Results
Of 25 CNPs, 22 (88%) completed the questionnaire. Most scored well on CBPR principles to recognize community as a unit of identity, build on community strengths, facilitate co-learning, embrace iterative processes in developing community capacity, and achieve a balance between data generation and intervention. CNPs varied in the extent to which they employed CBPR principles of addressing determinants of health, sharing power among partners, engaging community in research dissemination, and striving for sustainability.
Conclusions
Although tool development in this field is in its infancy, findings suggest that fidelity to CBPR processes can be assessed in a variety of settings.
Keywords: Cancer disparities, community health, empowerment, health status disparities, indigenous populations, minority health, partnerships, training
Despite a national commitment to reduce health disparities,1,2 recent studies show persistent differences in cancer incidence and mortality among different racial, ethnic, social, and geographic groups.3,4 Cancer-related health disparities across groups are due to numerous, interwoven factors, including differences in environmental stressors, socioeconomic status, health care coverage, providers' racial/ethnic/social biases, access to cancer screening and care, cultural beliefs about cancer, lifestyle behaviors, participation in routine cancer screening, and biological characteristics of the cancer.5 Attempts to reduce disparities through research are complicated by the fact that many vulnerable groups face a broad array of barriers that reduce willingness or ability to participate in research.6–12 For some disadvantaged communities, research has failed to address community concerns, has not yielded benefits to the community, has been exploitive, or has caused harm because findings attached unfavorable notoriety to the group.6,9–11
Community-based participatory research (CBPR) seeks to improve the capacity of research to decrease cancer and other enduring health disparities.13 CBPR emerges from social justice and action research traditions, both of which recognize the unique strengths and perspectives of community partners and aim to produce tangible benefits for communities participating in research.14–16 CBPR principles require that academic and community partners work together to design research, collect and interpret data, and disseminate results. Community members should gain skills through the collaboration and see tangible benefits of having participated. Such endeavors can enhance research methods and relevance of findings.6,8,15–19
Despite these promises, there is little evidence evaluating the extent to which CBPR projects have involved communities in research. Processes associated with developing, implementing, analyzing, and disseminating research are poorly documented and evaluated across a range of CBPR projects. In their review of 60 CBPR studies, Viswanathan and colleagues found varying degrees of community involvement in priority setting, methods selection, proposal development and funding, study design and implementation, translation of research findings, and integration and sustainability of programs, as well as of community capacity building, all essential characteristics of CBPR.18,19
Recognizing the potential strengths of CBPR, in 2005 the National Cancer Institute (NCI) Center to Reduce Cancer Health Disparities (CRCHD) funded 25 Community Networks Programs (CNPs) to employ CBPR methods to reduce the unequal burden of cancer in minority and disadvantaged communities across the United States (US) and American Samoa.20 Although CNPs were funded at different levels (based on geographic scope), each CNP was charged to develop a research infrastructure that operated on CBPR principles. CNPs were required to convene a community advisory group, sponsor cancer education and outreach, train researchers and community members in CBPR methods, and conduct intervention studies. Community and academic partners were asked to work together to define research priorities in a way that was unconstrained by the biases of each and with the freedom to address research across the cancer research continuum, from discovery to dissemination.21 Engaging communities in research is consistent with NCI's commitment to interdisciplinary and team research,22 intentionally extending the team to community members who are experts in community culture and resources.
The CNPs provided an opportunity to explore how measures of adherence to CBPR principles, especially the extent to which community members are engaged in the design and dissemination of research, can be operationalized across a diverse sample of CBPR projects. Spearheaded by the authors of this paper, in 2009 the CNPs undertook a self-evaluation process to understand the extent to which each CNP's research effort reflected CBPR principles. As part of this process, we aimed to design and field test a quantitative tool to measure adherence to CBPR principles. In this paper, we describe the process of developing and implementing this quantitative evaluation measure for the CNPs, and discuss how such knowledge can be used to improve participatory processes and outcomes of CBPR endeavors within and beyond the CNPs.
The Principal Investigators of the CNPs appointed the authors, representing nine of 25 CNPs, to develop and implement an instrument to measure how the CNPs involved their relevant communities in accordance with CBPR principles. Five of the nine CNPs were established in 2000; the others in 2005. The authors reviewed the literature, available instruments, and their prior applications in response to NCI's funding announcement, with the goal of developing an instrument that could be utilized across a broad spectrum of communities and projects at different stages in their development.
The community involvement measure applied most broadly was the one used in the Agency for Healthcare Research and Quality's 2004 review of CBPR projects, but that instrument was utilized only by the reviewers and consisted only of yes/no items.18 The most concise statement of the domains of community involvement in CBPR projects was the nine CBPR principles outlined by Israel and colleagues.14 These principles state that CBPR projects should: 1) recognize the community as a unit of identity; 2) build on the strengths and resources within the community; 3) facilitate a collaborative, equitable partnership in all research phases, involving an empowering and power-sharing process that attends to social inequalities; 4) foster co-learning and capacity building among all partners; 5) integrate and achieve a balance between data generation and intervention for the mutual benefit of all partners; 6) focus on the local relevance of public health problems and on ecological perspectives that attend to multiple determinants of health; 7) involve systems development in a cyclical and iterative process; 8) disseminate results to all partners and involve them in the wider dissemination of results; and 9) involve a long-term process and commitment to sustainability. However, the published literature offered little guidance on how these principles could be operationalized into measures that could be used across multiple CBPR projects in different communities.
Because of this gap, the authors developed a quantitative rating form based on Israel et al's nine principles, with three items tapping each principle. We based our scoring system on a previously validated guideline by Green et al, who published a 26-item tool (each with five response options) for assessing how well participatory research approaches are reflected in grant applications.23,24 Green et al organized their items in five domains: 1) participants and the nature of their involvement; 2) origin of the research question; 3) purpose of the research; 4) process and methodological implications; and 5) nature of the research outcomes. Although Green's tool was not organized according to Israel et al's nine principles, it included items relevant to several of the principles, including research partnership, project relevance, co-learning, and dissemination of findings. In collaboration with Green, Van Olphen used this tool to evaluate a single CBPR project, but noted the need to adapt it to better reflect the purpose and circumstances of her project.25
Learning from this, we developed the tool to more specifically reflect the purposes of the CNPs. We organized it around Israel et al's nine principles, which are accepted in the field as key components of CBPR. Investigators from a subset of CNPs, selected because of their years of experience in CBPR, developed three items for each of the nine principles. For each of the resulting 27 items, we specified five response options to reflect low to high operationalization of the principle. The tool evolved through an iterative process of review, discussion, and revision by CNP representatives.
This 27-item questionnaire was sent to the 25 CNP Principal Investigators (PIs); 22 questionnaires were returned, completed either by the PI (n=12) or the project manager (n=10). Analysis was done using Excel®. In using their tool to assess grant proposals, Green and colleagues cautioned against calculating means and total scores, as their response options did not follow a simple hierarchy and the “best” response option would likely depend on community context.23 We followed this same logic in the analysis of data collected with our tool. Thus, findings are reported as frequencies (i.e., with no attempt to rank or score responses).
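To illustrate this frequency-only approach, the sketch below tabulates hypothetical responses to the 27 items (three per principle, each with five ordered response options) without computing means or total scores. It is a minimal illustration only: the actual tabulation was done in Excel, and the item identifiers (e.g., "P1_1"), data layout, and example records are assumptions, not the questionnaire's actual labels or data.

```python
from collections import Counter

# Nine CBPR principles (Israel et al), three items each, options coded 1-5
# from low to high operationalization. Labels paraphrase the paper's headings.
PRINCIPLES = {
    1: "Recognize community as a unit of identity",
    2: "Build on community strengths",
    3: "Facilitate a collaborative, equitable partnership",
    4: "Foster co-learning and capacity building",
    5: "Balance data generation and intervention",
    6: "Attend to multiple determinants of health",
    7: "Involve systems development through a cyclical, iterative process",
    8: "Disseminate results and involve partners in dissemination",
    9: "Commit to long-term processes and sustainability",
}

def item_ids():
    """Return the 27 hypothetical item identifiers: P1_1 ... P9_3."""
    return [f"P{p}_{i}" for p in PRINCIPLES for i in (1, 2, 3)]

def tabulate(responses):
    """Count how many CNPs chose each response option for each item.

    `responses` is a list of dicts, one per responding CNP, mapping item id
    to the chosen option (1-5). Following Green et al's caution, options are
    not treated as a simple hierarchy, so only frequencies are reported.
    """
    return {item: Counter(r.get(item) for r in responses) for item in item_ids()}

if __name__ == "__main__":
    # Two fabricated records stand in for the 22 completed questionnaires.
    example = [
        {item: 4 for item in item_ids()},
        {item: 3 for item in item_ids()},
    ]
    freqs = tabulate(example)
    for p, label in PRINCIPLES.items():
        print(label)
        for i in (1, 2, 3):
            item = f"P{p}_{i}"
            counts = ", ".join(f"option {k}: {v}" for k, v in sorted(freqs[item].items()))
            print(f"  {item}: {counts}")
```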
Operationalization of CBPR Principles
Recognize community as a unit of identity
In CBPR projects, the community must be clearly defined and want to engage with academics to research issues of mutual interest.14 As shown in Table 1, almost all CNPs (86%) had detailed or general definitions of their community. Of the 22 responding CNPs, 18 focused on specific racial/ethnic groups (e.g., African Americans, American Indians and Alaska Natives, Asian Americans, Hispanics, and Native Hawaiians) within or across states, while four focused on underserved populations within specific geographic areas (e.g., Appalachia, Arkansas, Tampa, and Boston). All CNPs reported that partnering communities had expressed at least moderate interest in cancer, and all but two communities had expressed moderate interest in participating in cancer research. These findings are not surprising, as the CNP request for application prompted applicants to describe the community, and letters of support were expected to demonstrate community interest in cancer and research.
Table 1
Frequencies of responses to CBPR checklist for CNP infrastructures (n=22)
Build on community strengths
An assumption of CBPR is that research will be more successful and yield more meaningful findings if it respects community values and capitalizes on the cultural assets and resources of the community.14 We found that the setting of priorities for CNP research projects and programs was dominated by the community in 8 (36.4%) CNPs and based on equal input by community and academic partners in 11 (50.0%) CNPs; however, priority setting was dominated by academic researchers in 3 (13.6%) CNPs. CNPs reported that, for the most part, their research projects and programs were built on community strengths. All but one CNP had at least moderate documentation of the community's strengths and resources.
Facilitate a collaborative, equitable partnership that attends to social inequalities
In CBPR, mechanisms should be in place to allow the community to influence research projects and processes;14 for example, as advisors, hired staff, or administrators and leaders of the research. CNPs were required to convene community advisory groups to influence research, but only 11 (50.0%) CNPs reported that this group was empowered to approve, disapprove, and recommend changes to all (100% of) CNP research proposals. One CNP did not require community approval for any research projects, while the remaining CNPs required approval for some (3), half (2), or most (5) research projects. Two CNPs reported no personnel from the partnering community, while six (27.3%) reported that the PI and most other personnel were from the partnering community. Three CNPs were based in community agencies and subcontracted with academic researchers and institutions as needed, while the other 19 (86.4%) were based in the university with minimal (3), moderate (11), or half (5) of the funding going to the community.
Foster co-learning and capacity building
CBPR projects should have mechanisms to facilitate the reciprocal transfer of knowledge and skills.14 We found that all CNPs reported providing opportunities for community partners to learn about research and for academic researchers to learn about the culture and health issues of the community. All CNPs reported two or more examples of how community partners were strengthened by participation in the CNP.
Integrate and achieve a balance between data generation and intervention
CBPR projects should lead to improvements in the community, as well as produce new knowledge.14 All CNPs reported at least two examples of helping community partners obtain resources for cancer services. All noted that community members had articulated CNP benefits; e.g., in letters of support for subsequent CNP applications. Although two CNPs reported that the community's cancer care system had been unchanged by their work, 20 (90.9%) reported that work of the CNP had resulted in minimal (4), some (7), or significant (9) improvements in the community's cancer services.
Focus on ecological perspectives that attend to multiple determinants of health
Recognizing that health is influenced by social as well as community factors, CBPR projects should “strive to achieve broad-scale social changes aimed at eliminating health disparities.”14 CNPs varied in the number of sponsored initiatives to address non-proximal causes of cancer (e.g., general education, racism, stress, jobs, insurance, income, housing, or environment) from none (1) to four or more (7). No CNP reported that its projects focused primarily on the intra- and interpersonal levels of behavior change, and 12 (54.5%) reported sponsoring projects on all five levels of intervention noted in the social ecological model.26 All but one CNP reported that a major purpose was to empower the community to identify and address its own issues, and 15 (68.2%) trained and supported community members to serve in leadership roles.
Involve systems development through a cyclical and iterative process
CBPR recognizes that developing and implementing research projects that give equal weight to community and academic interests takes time spent in discussion and other forms of engagement.14 Most (86.4%) CNPs reported four or more community-academia meetings per year to discuss, propose, review, improve, or interpret findings related to the CNP. Most (90.9%) CNPs noted that community members were somewhat or extremely comfortable initiating meetings about the CNP or questioning CNP research and programs. However, eight (36.4%) CNPs had initiated no, or only minimal, discussions with the community on long-term sustainability of the program.
Disseminate results and involve partners in this dissemination
Findings from CBPR projects should be respectfully presented, with presentation and authorship opportunities for community partners.14 Four (18.2%) CNPs reported sharing findings only with community advisors, while 16 (72.7%) CNPs also shared findings through reports, newsletters, and community meetings. No CNP required community co-authorship of CNP-related publications, but 17 (77.3%) reported that community co-authorship was expected. Four (18.2%) CNPs required community co-presentation of CNP findings at meetings or conferences, and 14 (63.6%) expected and supported this.
Involve long-term processes and commitment to sustainability
Funding levels may fluctuate, but CBPR partnerships should continue to function, and communities should be supported in obtaining their own funding.14 We found that 17 (77.3%) CNPs felt they had some chance of continuing their CNP research infrastructures after NCI funding ended. All CNPs reported that community members had secured grants to sustain or expand programs related to cancer, and ten (45.5%) reported helping the community secure four or more such grants. All but two CNPs reported that at least one junior researcher had secured his/her own research funding.
As with past efforts to evaluate CBPR studies, we found variation among CNPs in the operationalization of CBPR principles, reflecting the diversity of CBPR partnerships as well as the settings within which cancer health disparities persist.18 Some principles seem easier to adhere to than others. For example, we found that most CNPs scored well on recognizing the community as a unit of identity, assessing and building on community strengths, facilitating co-learning, embracing iterative processes in developing research and capacity, and achieving a balance between data generation and intervention.
There was wider variation in abilities of the CNPs to share power and resources with their respective communities. For example, not all CNPs empowered their community advisory group to disapprove CNP research projects or employed members of the target community. Although some CBPR scientists urge that grant funds be split at least evenly between university and community partners,12 this was true for only eight of the 22 CNPs. Perhaps this is not surprising, as CNPs were funded by NCI, with its clear research mission, and 19 of the CNPs were based in universities. Unfortunately, awards through the CNP mechanism were capped at a fixed total cost (i.e., including direct and indirect costs). Hence, university-based CNPs tended to have less money to share with community than did the three community-based CNPs, although a few university-based CNPs were able to negotiate a reduced indirect cost rate.
CNPs also varied in sponsorship of projects addressing multiple determinants of health. CBPR recognizes that underserved groups are likely to live and work in risk-laden environments and may have limited access to health insurance and health care.14 Some aspects of blue-collar jobs (e.g., shift work and external control of tasks) and the need to work multiple jobs to make ends meet can jeopardize health and reduce time to access care.27 Long-term exposure to racism and structural inequalities clearly diminishes health.28 Despite the value of addressing non-proximal causes of cancer and working at the policy level to improve cancer care, many CNP research projects tested interventions aimed at changing individual health behavior (albeit within the context of families, organizations, and communities), rather than increasing socio-economic status or reducing damage from stress or racism. Addressing upstream determinants of health was limited by low funding levels for pilot research projects ($50,000 total cost) and the usefulness of “simple” research projects that can yield early victories in training community members and junior researchers in CBPR.29
The results from this self-evaluation of the CNPs also identified several gaps in our CBPR processes. Sustainability remains a profound challenge to the institutionalization of innovation.30,31 Most CNPs helped community organizations and junior researchers build capacity and secure their own grants. However, it is unclear if CNPs themselves will be sustained without continued research infrastructure funding. Although community-university partnerships can yield benefits to both entities, maintaining these partnerships requires continued support for meetings, training, and resource sharing as new partners and junior researchers emerge.32
Because of the complexity of interactions between members of each community and the CBPR researchers working with them, as well as differences between communities, there are inherent limitations to the use of any quantitative tool to measure adherence to CBPR principles. Comprehensive CBPR evaluation must include other methods such as the use of focus groups, individual interviews, ethnographic observations, and other documentation from university and community stakeholders.25 However, development of a quantitative tool that is easy to use is important for CBPR projects that lack the resources to collect in-depth qualitative data and allows for some standardization of how CBPR is measured across projects.
Overall, the high response rate across the diverse researchers involved in the CNPs indicates that it is feasible to implement this tool across a wide variety of CBPR projects. We were pleased with the variation in responses that our adapted tool elicited from the CNPs and its usefulness in our self-evaluation process. We believe that a tool like this could help programs gauge the extent to which they reflect the essential components of CBPR and engage community partners.
However, we recognize several limitations of our approach. Although our evaluation tool was informed by the work of Viswanathan et al18 and Green et al23 to operationalize and quantify the participatory aspects of the CNPs, the tool could be improved. The vetting of the tool was limited to CNP PIs and their designees. Some items reflected CNP-specific activities (e.g., references to “CNP proposal” or “cancer research”), which can be made more general or contextualized to other CBPR projects by replacing the term “CNP” with the appropriate proposal title and the topic of “cancer” with the appropriate topic. Some response options were vague, for example, asking if CNPs offered no, few, some, several, or many opportunities for training. This was done purposively because CNPs were funded at different levels, some focusing on one specific geographic community while others focused regionally or nationally, precluding use of absolute numbers in response options for some items. This illustrates a common difficulty that may be anticipated in quantifying some response options. We relied on self-report without justification of answers, and findings are likely biased in favor of the CNPs because the respondents were CNP PIs and project managers. A potential next step could be to field test the tool with community leaders who were an integral part of the CNPs.
To improve this tool, items and associated response options should be further examined for relevance, and response options should be quantified to the extent possible. Future work should include validity testing to assure that the items in fact measure Israel et al's nine principles, and might include broader review by CBPR experts. Attention should be focused on whether criterion validity is feasible and if construct validity is practicable. On the assumption that CBPR improves both participatory processes and research outcomes, future research should examine the association between extent of CBPR operationalization and alleged or anticipated effects (i.e., “constructs” that would be expected to change with committed application of CBPR principles).33,34
There could be several ways to use a validated tool to measure CBPR operationalization. If used periodically over the course of the program, it could stimulate increased opportunities for participation and empowerment of community members and facilitate early discussions about sustainability and expected benefits to community programs and health care systems. Asking community partners to complete the tool would allow comparison of community and academic perspectives (although Van Olphen and colleagues had a low response rate from community partners when they asked them to complete their CBPR questionnaire).25
Overall, findings confirm the variability in the extent to which CBPR principles are applied in the implementation of CBPR projects. Although tool development in this field is in its infancy, results also suggest that the CBPR process can be operationalized and measured. Given the paucity of tools to assess the “participatory” components of CBPR, we hope our work helps others develop, refine, and test CBPR measures.
Acknowledgements
This research was supported by grant numbers: U01CA114630 (‘Imi Hale Native Hawaiian Cancer Network, PI: Chong); U01CA114640 (Asian American Network for Cancer Awareness, Research and Training, PI: Chen); U01 CA114591 (Weaving an Islander Network for Cancer Awareness, Research and Training, PI: Tanjasiri); U01 CA114626 (The University of Oklahoma Community Networks Program, PI: Campbell); U01CA114652 (National Black Leadership Initiative on Cancer III: Community Networks Program, PI: Blumenthal); U01 CA114696 (Southwest American Indian Collaborative Network, PI: Coe); U01CA114641 (Meharry Medical College - Community Health Centers Network, PI: Hargreaves); U01 CA114582 (ATECAR - Asian Community Cancer Network, PI: Ma); U01 CA114601 (South Carolina Cancer Disparities Community Network, PI: Hébert).
Footnotes
Contributor Statement: All authors contributed to the conceptualization of the research question, framework for questionnaire design, data collection, and interpretation. Drs. Braun, Tanjasiri, Blumenthal, and Hargreaves took the lead on questionnaire design; Dr. Campbell took the lead on data analysis; Drs. Braun, Nguyen, Tanjasiri, and Hébert led the writing team.
The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
The authors have no conflicts or disclosures.
Human Subjects Protection: ‘Imi Hale Native Hawaiian Cancer Network (U01CA114630) is approved by the Native Hawaiian Health Care Systems IRB, which reviewed this study and determined it exempt.
Contributor Information
Kathryn L Braun, University of Hawaii and Co-Principal Investigator, Imi Hale Native Hawaiian Cancer Network, 894 Queen Street, Honolulu, HI 96813, 808-330-1759, kbraun@hawaii.edu.
Tung T. Nguyen, University of California San Francisco and Deputy Principal Investigator, Asian American Network for Cancer Awareness, Research, and Training; Box 0320, UCSF Medical Center, San Francisco, CA 94143, (415) 514-8659, Tung.Nguyen@ucsf.edu.
Sora Park Tanjasiri, California State University, Fullerton and Principal Investigator, Weaving an Islander Network for Cancer Awareness, Research and Training, PO Box 6870, Fullerton, CA 92834, 657-278-4592, stanjasiri@fullerton.edu.
Janis Campbell, The University of Oklahoma Health Sciences Center, Principal Investigator, The University of Oklahoma Community Networks Program, 801 NE 13th Street, Room 309, Oklahoma City, OK 73104, 405-271-2229, janiscampbell@ouhsc.edu.
Sue P. Heiney, University of South Carolina, College of Nursing, Columbia, SC 29203, 803-777-8214, heineys@mailbox.sc.edu.
Heather M. Brandt, University of South Carolina, Arnold School of Public Health, Department of Health Promotion, Education, and Behavior, 800 Sumter Street, Columbia, SC 29208, 803-777-7096, hbrandt@sc.edu.
Selina A. Smith, Morehouse School of Medicine, 720 Westview Drive SW, Atlanta, GA 30310, 404-756-5205, ssmith@msm.edu.
Daniel S. Blumenthal, Morehouse School of Medicine, 720 Westview Dr. SW, Atlanta, GA 30310, 404-752-1625, dblumenthal@msm.edu.
Margaret Hargreaves, Meharry Medical College, 1005 D.B. Todd Blvd, Nashville, TN 37208, 615-327-6927, mhargreaves@mmc.edu.
Kathryn Coe, Department of Public Health, School of Medicine, Indiana University-Purdue University, Indianapolis, IN 46206, 317-278-3072, coek@iupui.edu.
Grace X. Ma, College of Health Professions, Temple University, 913 Ritter Annex, 1301 Cecil B. Moore Ave, Philadelphia, PA 19122, 215-204-5108, grace.ma@temple.edu.
Donna Kenerson, Department of Internal Medicine, Meharry Medical College, 1005 Dr. D.B. Todd Blvd., Nashville, TN 37208, 615-327-5914, dkenerson@mmc.edu.
Kushal Patel, Meharry Medical College, 1005 D.B. Todd Blvd, Nashville, TN 37208, 615-327-5648, kpatel@mmc.edu.
JoAnn Tsark, Imi Hale Native Hawaiian Cancer Network, Papa Ola Lokahi, 894 Queen Street, Honolulu, HI 96813, 808-526-1700, jtsark@imihale.org.
James R. Hébert, South Carolina Statewide Cancer Prevention and Control Program, University of South Carolina, 915 Greene Street, Suite 241-2, Columbia, SC 29208, 803-576-5666, jhebert@sc.edu.
1. US Department of Health and Human Services. Healthy People 2010: Understanding and improving health. Washington, DC: US Department of Health and Human Services; 2000. http://www.healthypeople.gov/ [accessed Nov 10, 2010].
2. Centers for Disease Control and Prevention. CDC Health Disparities and Inequalities Report – United States, 2011. MMWR. 2011;60(Suppl):1–113. http://www.cdc.gov/mmwr/pdf/other/su6001.pdf [accessed Mar 24, 2011].
3. Miller BA, Chu KC, Hankey BF, Reis LAG. Cancer incidence and mortality patterns among specific Asian and Pacific Islander populations in the U.S. Cancer Causes Control. 2008;19:227–256.
4. US Cancer Statistics Working Group. United States cancer statistics: 2004 incidence and mortality. Atlanta: US Department of Health and Human Services, Centers for Disease Control and Prevention and National Cancer Institute; 2007.
5. Haynes MA, Smedley BD. The unequal burden of cancer: An assessment of NIH research and programs for ethnic minorities and the medically underserved. Washington, DC: National Academy Press; 1999.
6. Fong M, Braun KL, Tsark J. Improving Native Hawaiian health through community-based participatory research. Calif J Health Promot. 2003;1:136–148.
7. Larson C, Schlundt D, Patel K, McClellan L, Hargreaves M. Disparities in perceptions of healthcare access in a community sample. J Ambul Care Manage. 2007;30:142–149.
8. Larson C, Schlundt D, Patel K, Goldzweig I, Hargreaves M. Community participation in health initiatives for marginalized populations. J Ambul Care Manage. 2009;32:264–270.
9. McCallum JM, Arekere DM, Green BL, Katz RV, Rivers BM. Awareness and knowledge of the US Public Health Service syphilis study at Tuskegee: implications for biomedical research. J Health Care Poor Underserved. 2006;17(4):716–733.
10. Mello MM, Wolf LE. The Havasupai Indian Tribe case — Lessons for research involving stored biologic samples. N Engl J Med. 2010;363:204–207.
11. Scharff DP, Mathews KJ, Jackson P, Hoffsuemmer J, Martin E, Edwards D. More than Tuskegee: understanding mistrust about research participation. J Health Care Poor Underserved. 2010;21:879–897.
12. Burhansstipanov L, Christopher S, Schumacher SA. Lessons learned from community-based participatory research in Indian country. Cancer Control. 2005;12(Suppl 2):70–76.
13. Gottlieb B. Community-based approaches to cancer disparities. In: Koh HK, editor. Toward the elimination of cancer disparities. New York: Springer; 2009. pp. 317–357.
14. Israel BA, Schulz AJ, Parker EA, Becker AB, Allen AJ, Guzman R. Critical issues in developing and following community based participatory research principles. In: Minkler M, Wallerstein N, editors. Community-based participatory research for health. San Francisco: Jossey-Bass; 2003. pp. 53–76.
15. Minkler M. Linking science and policy through community-based participatory research to study and address health disparities. Am J Public Health. 2010;100(Suppl 1):S81–S87.
16. Wallerstein N, Duran B. Community-based participatory research contributions to intervention research: the intersection of science and practice to improve health equity. Am J Public Health. 2010;100:S40–S46.
17. Heiney SP, Adams SA, Wells LM, Johnson H. Evaluation of conceptual framework for recruitment of African American breast cancer patients. Oncol Nursing Forum. 2010;37:E160–E167.
18. Viswanathan M, Ammerman A, Eng E, Gartlehner G, Lohr KN, Griffith D, Rhodes S, Samuel-Hodge C, Maty S, Lux L, Webb L, Sutton SF, Swinson R, Jackman A, Whitener L. Community-based participatory research: assessing the evidence. Agency for Healthcare Research and Quality, US Department of Health and Human Services; Jul 2004.
19. Israel BA, Eng E, Schulz AJ, Parker EA. Introduction to methods in community-based participatory research for health. In: Israel BA, Eng E, Schulz AJ, Parker EA, editors. Methods in community-based participatory research for health. San Francisco: Jossey-Bass; 2005. pp. 3–26.
20. National Cancer Institute, Center to Reduce Cancer Health Disparities. Community Network Programs. http://crchd.cancer.gov/cnp/background.html [accessed Nov 2, 2010].
21. Hébert JR, Brandt HM, Armstead CA, Adams SA, Steck SE. Interdisciplinary, translational, and community-based participatory research: finding a common language to improve cancer research. Cancer Epidemiol Biomarkers Prev. 2009;18:1213–1217.
22. US Department of Health and Human Services, National Institutes of Health. NIH Roadmap for Medical Research. 2008.
23. Green LW, George MA, Daniel M, Frankish CJ, Herbert CP, Bowie WR, O'Neill M. Guidelines for participatory research in health promotion. In: Minkler M, Wallerstein N, editors. Community-based participatory research for health. San Francisco, CA: Jossey-Bass; 2003. pp. 419–428.
24. Mercer SL, Green LW, Cargo M, Potter MA, Daniel M, Olds RS, et al. Reliability-tested guidelines for assessing participatory research projects. In: Minkler M, Wallerstein N, editors. Community-based participatory research for health. San Francisco: Jossey-Bass; 2009. pp. 407–418.
25. Van Olphen J, Ottoson J, Green L, Barlow J, Koblick K, Hiatt R. Evaluation of a partnership approach to translating research on breast cancer and the environment. Prog Community Health Partnersh. 2009;3:213–226.
26. McLeroy KR, Bibeau D, Steckler A, Glanz K. An ecological perspective on health promotion programs. Health Educ Q. 1988;15:351–377.
27. Huisman M, Van Lenthe F, Avendano M, Mackenbach J. The contribution of job characteristics to socioeconomic inequalities in incidence of myocardial infarction. Soc Sci Med. 2008;66:2240–2252.
28. Thomas SB, Quinn SC, Butler J, Fryer CS, Garza MA. Toward a fourth generation of disparities research to achieve health equity. Annu Rev Public Health. 2011;32:399–416.
29. Mitchell RE, Florin P, Stevenson JF. Supporting community-based prevention and health promotion initiatives: developing effective technical assistance systems. Health Educ Behav. 2002;29:620–639.
30. Scheirer MA. Is sustainability possible? A review and commentary on empirical studies of program sustainability. Am J Evaluation. 2005;26:320–347.
31. Altman DG. Challenges in sustaining public health interventions. Health Educ Behav. 2009;36:24–28.
32. Silka L, Renault-Caragianes P. Community-university research partnerships: devising a model for ethical engagement. J Higher Educ Outreach Engagement. 2006;11:171–183.
33. Evans WJ. Construct validity of the Attitudes about Reality Scale. Psychol Rep. 2000;86:738–744.
34. Smith TW, Frohm KD. What's so unhealthy about hostility? Construct validity and social correlates of the Cook and Medley Ho Scale. Health Psychol. 1985;4:503–520.