Open J Epidemiol. Author manuscript; available in PMC Sep 9, 2013.
Published in final edited form as:
Open J Epidemiol. Feb 2013; 3(1): 20–24.
doi: 10.4236/ojepi.2013.31004
PMCID: PMC3767154
NIHMSID: NIHMS466488
Correlates of self-reported colorectal cancer screening accuracy in a multi-specialty medical group practice
Arica White,1 Sally W. Vernon,1,2 Jan M. Eberth,1,3 Jasmin A. Tiro,4 Sharon P. Coan,2 Peter N. Abotchie,2 and Anthony Greisinger5
1School of Public Health, Division of Epidemiology, University of Texas Health Science Center, Houston, USA
2School of Public Health, Division of Health Promotion and Behavioral Sciences, University of Texas Health Science Center, Houston, USA
3Department of Biostatistics, University of Texas MD Anderson Cancer Center, Houston, USA
4Department of Clinical Sciences, University of Texas Southwestern Medical Center, Dallas, USA
5Kelsey Research Foundation, Houston, USA
Arica White: awhite5@cdc.gov
Purpose
We assessed whether accuracy of self-reported screening for colorectal cancer (CRC) varied by respondent characteristics or healthcare utilization.
Methods
From 2005 to 2007, 857 respondents aged 51 – 74 were recruited from a multi-specialty medical group practice to answer a questionnaire about their CRC screening (CRCS) behaviors. Self-reports were compared with administrative and medical records to assess concordance, sensitivity, specificity, and report-to-records ratios for overall CRCS (fecal occult blood test, sigmoidoscopy, and/or colonoscopy).
Results
Concordance was good (≥0.8 to <0.9) or fair (≥0.7 to <0.8) for most subgroups; respondents with >5 visits outside the clinic had poor (<0.7) concordance. Sensitivity estimates were mostly excellent (≥0.9) or good, but were poor for respondents whose healthcare provider did not advise a specific CRCS test. Specificity was poor for respondents who were aged 65+ years, male, or college graduates, had a family history of CRC, had >5 visits outside the clinic, or whose healthcare provider advised a specific CRCS test. Respondents aged 65+ years and those with >5 outside visits over-reported CRCS.
Conclusions
With few exceptions, self-reports of CRCS in an insured population are reasonably accurate across subgroups. More work is needed to replicate these findings in diverse settings and populations to better understand subgroup differences and to improve measures of CRCS.
Keywords: Colorectal Cancer, Screening, Correlates, Self-Report, Accuracy
Although colorectal cancer screening (CRCS) rates are increasing [1], there is still room for improvement if we are to achieve Healthy People 2010 goals [2] and eliminate disparities. Monitoring adherence to guidelines enables us to assess progress towards meeting these goals and to identify screening disparities for population subgroups [2]. Adherence to CRCS guidelines is often assessed using self-reported data, in part, because of the time, cost, and limited access to medical records [3]. Increased reliance on self-reports underscores the need for accurate measures of adherence [3,4].
Although a number of studies have assessed agreement between self-reported CRCS and administrative data or medical records [5–19], fewer have examined subgroup differences in accuracy [5,6,8–12,15,16,19], and most were limited to socio-demographic characteristics.
No studies have examined whether healthcare utilization factors, such as the number of visits to a healthcare provider, are associated with the accuracy of self-reported CRCS. Identifying subgroup differences in the accuracy of self-reported CRCS may assist in the interpretation of prevalence estimates from survey data and the results of behavioral interventions. Understanding differences also may be useful in guiding clinical decision-making and improving patient-physician communication about CRCS.
We used data from a randomized controlled trial designed to evaluate the reliability and validity of a standardized self-report questionnaire of CRCS behaviors to examine whether the accuracy of self-report measures of CRCS behavior varied by respondent characteristics and healthcare utilization.
2.1. Study Setting and Sample Selection
A questionnaire, developed by one of the authors (SWV) in collaboration with scientists at the National Cancer Institute (NCI) and the Centers for Disease Control and Prevention (CDC) [4,12], was evaluated for reliability and validity using three modes of survey administration: mail, telephone, and face-to-face [13]. Study participants were men and women, 51 to 74 years old, who were primary care patients for at least 5 years at a large multispecialty medical group practice in Houston, Texas [13]. Patients with a prior history of CRC were excluded. From 2005 to 2007, 1004 patients were randomized to the mail, telephone, or face-to-face mode of survey administration. Of these, 857 completed a baseline survey and were included in this analysis. Additional details about recruitment, eligibility, study design, and study procedures are described elsewhere [13]. The study protocol was approved by the University of Texas Health Science Center at Houston Committee for Protection of Human Subjects.
2.2. Measures
Adherence to CRCS guidelines was defined as: a fecal occult blood test (FOBT) within the past year, flexible sigmoidoscopy within the past 5 years, or colonoscopy within the past 5 years. For FOBT and sigmoidoscopy, these recommendations are the same as the American Cancer Society (ACS) guidelines in effect at the time of the study [20]. Instead of colonoscopy within the past 10 years per ACS guidelines, we restricted the measure to within the past 5 years to match our eligibility criteria. This ensured a sufficient number of patients and reduced the likelihood of receiving CRCS from an outside provider.
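The adherence definition above reduces to a date-window check over each recorded test. The following sketch illustrates that logic; the function, dictionary, and data layout are assumptions for illustration, not the study's actual code.

```python
from datetime import date

# Guideline windows used in the study: FOBT within 1 year, flexible
# sigmoidoscopy or colonoscopy within 5 years (colonoscopy restricted
# from the usual 10 years, per the study's eligibility criteria).
GUIDELINE_WINDOW_YEARS = {"fobt": 1, "sigmoidoscopy": 5, "colonoscopy": 5}

def within_guidelines(tests, reference_date):
    """tests: iterable of (test_name, test_date) pairs from the record.
    Returns True if any test falls inside its guideline window."""
    for name, test_date in tests:
        window = GUIDELINE_WINDOW_YEARS.get(name)
        if window is None:
            continue  # ignore tests outside the three CRCS modalities
        cutoff = reference_date.replace(year=reference_date.year - window)
        if test_date >= cutoff:
            return True
    return False
```

A respondent with only a colonoscopy three years before the survey would count as adherent, while one whose most recent FOBT was two years old would not.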
Self-reports were compared to a combined database of administrative and medical records (referred to as the combined medical record). Type of tests and dates of each test were abstracted from the combined medical record. Inter-rater agreement was assessed for three pairs of raters for 81 patients. For the most recent test within guidelines, agreement was 98% (kappa = 0.96). For all tests within the past 5 years, agreement was 91% (kappa = 0.89). If a patient reported a test from an outside provider that was not recorded in the combined medical record, we contacted the provider to confirm the report [13]. Of the 30 providers contacted, 23 provided the requested information, and this information was added to the database.
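The kappa statistics reported alongside the raw agreement percentages adjust for chance agreement. A minimal Cohen's kappa for a 2 × 2 rater table looks like this (the counts in the test are hypothetical, not the study's abstraction data):

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two raters classifying each record yes/no.
    a = both yes, b = rater1 yes / rater2 no,
    c = rater1 no / rater2 yes, d = both no."""
    n = a + b + c + d
    p_observed = (a + d) / n
    # Chance agreement: product of the raters' marginal "yes" rates
    # plus the product of their marginal "no" rates.
    p_chance = ((a + b) / n) * ((a + c) / n) + ((c + d) / n) * ((b + d) / n)
    return (p_observed - p_chance) / (1 - p_chance)
```

Perfect agreement yields kappa = 1.0, and agreement no better than chance yields kappa = 0, which is why a 98% raw agreement can correspond to a kappa of 0.96 rather than 0.98.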
We assessed the following characteristics through the survey: age (categorized as 51 – 64, 65+); gender (male, female); race/ethnicity (non-Hispanic white, African American, other); marital status (not married, married/living with partner); education (<high school diploma/General Equivalency Diploma (GED), some college, college+); family history of cancer (yes, no); number of physician visits in the past 5 years at the clinic (0 – 5, >5); number of physician visits in the past 5 years outside of the clinic (0 – 5, >5); and whether their healthcare provider advised a specific CRC test (FOBT, sigmoidoscopy, colonoscopy). We included respondent characteristics, family history of CRC, and whether a healthcare provider advised a specific CRC test so that our results could be compared with previous studies.
2.3. Data Analysis
We combined data from the three survey modes because there were no differences in validity estimates by mode [13]. We computed concordance (i.e., percentage agreement), sensitivity, specificity, and report-to-records ratio for any CRCS test (FOBT, sigmoidoscopy, and/or colonoscopy) by subgroup. Subgroups with fewer than 5 in a cell of their respective 2 × 2 tables were collapsed into an adjacent category if possible; subgroups that could not be collapsed were excluded. Two-sided 95% confidence intervals were calculated for all measures. Analyses by individual CRC test were attempted but yielded unstable estimates for multiple subgroups because small cells (<5) could not be meaningfully collapsed.
Tisnado et al.’s [21] criteria for evaluating sensitivity and specificity of ambulatory care services were used to assess concordance, sensitivity, and specificity: ≥0.9 denotes excellent agreement, ≥0.8 to <0.9 good agreement, ≥0.7 to <0.8 fair agreement, and <0.7 poor agreement. The report-to-records ratio is a measure of net bias in test reporting [22]. Values >1.0 indicate over-reporting, and values <1.0 indicate under-reporting; 95% confidence intervals were used to determine the precision of over- or under-reporting. All data were analyzed using Microsoft Excel 2007 and Intercooled Stata version 9.0 (Stata Corporation, College Station, TX).
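The four validity measures, and the banding applied to them, reduce to a few lines over the 2 × 2 table of self-report versus the combined medical record. This is an illustrative sketch with hypothetical counts, not the study's analysis code:

```python
def validity_measures(tp, fp, fn, tn):
    """2x2 table: rows = self-report (yes/no), columns = combined record.
    tp = reported and recorded, fp = reported but not recorded,
    fn = not reported but recorded, tn = neither."""
    n = tp + fp + fn + tn
    return {
        "concordance": (tp + tn) / n,               # percentage agreement
        "sensitivity": tp / (tp + fn),              # reported it when they had it
        "specificity": tn / (tn + fp),              # denied it when they had not
        "report_to_records": (tp + fp) / (tp + fn), # >1.0 means over-reporting
    }

def agreement_band(value):
    """Bands per the Tisnado et al. criteria described above."""
    if value >= 0.9:
        return "excellent"
    if value >= 0.8:
        return "good"
    if value >= 0.7:
        return "fair"
    return "poor"
```

For example, a subgroup with 80 true positives, 10 false positives, 5 false negatives, and 40 true negatives would have good concordance, excellent sensitivity, good specificity, and a report-to-records ratio above 1.0 (net over-reporting).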
Most respondents were 51 – 64 years old (81.2%), female (65.8%), married or living with a partner (74.2%), and college educated (54.6%); 59.4% were non-Hispanic white and 26.4% were African American. Approximately 11.3% had a family history of CRC. Most (93.2%) had a clinic visit within the last year, and 86.8% had more than 5 clinic visits within the past 5 years. Only 14.5% had more than 5 visits outside the clinic within the past 5 years. Most (61.5%) reported having had FOBT, sigmoidoscopy, and/or colonoscopy in the past.
Concordance was mostly good, except for the following subgroups: respondents aged 65 years and older, those with a family history of CRC, those with 5 or fewer visits to the clinic, and those with more than 5 visits outside the clinic (Table 1). Sensitivity estimates were excellent or good except for respondents whose healthcare provider did not advise a specific CRCS test. In contrast, specificity estimates were mostly fair but were poor for several subgroups: those aged 65 years and older, males, college graduates, those with a positive family history of CRC, those with more than 5 visits to an outside provider in the past 5 years, and those whose provider advised a specific CRCS test. Report-to-records ratios indicated over-reporting among respondents aged 65 years and older and those with more than 5 visits to an outside provider within the past 5 years.
Table 1
Concordance, sensitivity, specificity, and report-to-records ratio comparing self-report of colorectal cancer screening within guidelines* to combined medical record (“gold standard”) by respondent characteristics.
Our finding that older patients over-reported any CRCS test within guidelines was similar to Partin et al.’s [12], suggesting that researchers and providers may need to rely on other sources of information to ascertain screening history for this subgroup. Other published studies on correlates of accuracy of self-reported CRCS tests are limited primarily to socio-demographics and show little consistency in the validity measures reported [5,6,8–12,19]. Six studies used one or more of the same four validity measures as our study (concordance [8–12,19], sensitivity [8–10,12,19], specificity [8–10,12,19], and/or report-to-records ratio [12]) to determine whether age [10–12], race/ethnicity [8,9,11,12,19], sex [9,12], education [11,12], marital status [12], and family history of CRC [11,12] were associated with accuracy of self-reported FOBT [8–11,19], sigmoidoscopy [8,9,19], colonoscopy [8,19], and/or any CRCS [12]. Inconsistencies in findings among these studies [8–10,12,19] are likely due to variation in the populations studied, the time interval used to assess recall, and the CRCS guidelines used to measure validity, which limited the comparability of our findings to only one study [12]. Study samples included carpenters [10], patients from health maintenance organizations [8,9] and primary care clinics [19], and users of Veterans Administration healthcare facilities [12]. The time interval used to compare self-reported CRCS behaviors with medical records also differed across studies, particularly for FOBT, where the intervals included one [10,12,19], two [9], and five years [8]. Only two studies [12,19] used evidence-based guidelines to assess the validity of self-reported CRCS.
To our knowledge, this is the first study to examine whether the accuracy of self-reported CRCS varies by healthcare utilization. Patients who had more than 5 visits to an outside provider in the past 5 years may over-report CRCS. For patients who reported seeing outside providers, we were able in most instances to contact those providers and verify self-reports; however, patients may not have provided complete information about the medical care they received elsewhere, which might explain why specificity and concordance were poor. Nevertheless, providers need to be diligent in taking medical histories among patients using multiple sources of medical care. Further, there is a need for better patient-provider communication about CRCS: sensitivity was low among respondents whose provider had not advised a specific CRCS test (few correctly reported having a CRCS test when they had one), and specificity was low among respondents whose provider had advised a specific test (few correctly reported not having a test when they had not).
Strengths of our study include the use of a standardized questionnaire to ascertain self-reported screening, inclusion of covariates that have not been examined previously, a diverse sample with a large number of African Americans, a relatively stable patient population where almost all endoscopies were done on site, and follow-up with outside providers to verify self-reports in the few cases where CRCS was done elsewhere. Nevertheless, the generalizability of our findings is limited because our sample consisted of insured men and women receiving care primarily from one practice.
To our knowledge, this is the most comprehensive assessment to date of the accuracy of self-reported measures of CRCS by respondent characteristics and healthcare utilization factors. Our findings show that self-report of any CRCS test within guidelines, as measured by concordance, sensitivity, and report-to-records ratio, is reasonably accurate across subgroups. However, more research is needed to replicate these findings in diverse settings and populations to better understand subgroup differences and improve measures of CRCS.
Acknowledgments
Grant support: PRC SIP 19-04 U48 DP000057 from the Centers for Disease Control and Prevention (S.W. Vernon and A. Greisinger). Pre-doctoral Fellowship (A. White, J. M. Eberth & P. N. Abotchie), University of Texas School of Public Health Cancer Education and Career Development Program—National Cancer Institute/NIH Grant R25-CA-57712. Post-doctoral Fellowship (J. M. Eberth), University of Texas MD Anderson Cancer Center—National Cancer Institute R25-CA-57730 and NIH P30-CA-016672.
The content is solely the responsibility of the authors and does not necessarily represent the official views of the Centers for Disease Control and Prevention, National Cancer Institute or the National Institutes of Health.
1. Klabunde CN, Cronin KA, Breen N, Waldron WR, Ambs AH, Nadel MR. Trends in colorectal cancer test use among vulnerable populations in the United States. Cancer Epidemiology, Biomarkers & Prevention. 2011;20:1611–1621. doi: 10.1158/1055-9965.EPI-11-0220.
2. US Department of Health and Human Services, Office of Disease Prevention and Health Promotion. Healthy People 2020. 2011. www.healthypeople.gov
3. Nosowsky R, Giordano TJ. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) privacy rule: Implications for clinical research. Annual Review of Medicine. 2006;57:575–590. doi: 10.1146/annurev.med.57.121304.131257.
4. Vernon SW, Meissner H, Klabunde C, Rimer BK, Ahnen DJ, Bastani R, et al. Measures for ascertaining use of colorectal cancer screening in behavioral, health services, and epidemiologic research. Cancer Epidemiology, Biomarkers & Prevention. 2004;13:898–905.
5. Baier M, Calonge N, Cutter G, McClatchey M, Schoentgen S, Hines S, et al. Validity of self-reported colorectal cancer screening behavior. Cancer Epidemiology, Biomarkers & Prevention. 2000;9:229–232.
6. Brown JB, Adams ME. Patients as reliable reporters of medical care process. Recall of ambulatory encounter events. Medical Care. 1992;30:400–411. doi: 10.1097/00005650-199205000-00003.
7. Gordon NP, Hiatt RA, Lampert DI. Concordance of self-reported data and medical record audit for six cancer screening procedures. Journal of the National Cancer Institute. 1993;85:566–570. doi: 10.1093/jnci/85.7.566.
8. Hall HI, Van Den Eeden SK, Tolsma DD, Rardin K, Thompson T, Hughes Sinclair A, et al. Testing for prostate and colorectal cancer: Comparison of self-report and medical record audit. Preventive Medicine. 2004;39:27–35. doi: 10.1016/j.ypmed.2004.02.024.
9. Hiatt RA, Perez-Stable EJ, Quesenberry C Jr, Sabogal F, Otero-Sabogal R, McPhee SJ. Agreement between self-reported early cancer detection practices and medical audits among Hispanic and non-Hispanic white health plan members in northern California. Preventive Medicine. 1995;24:278–285. doi: 10.1006/pmed.1995.1045.
10. Lipkus IM, Samsa GP, Dement J, Skinner CS, Green LS, Pompeii L, et al. Accuracy of self-reports of fecal occult blood tests and test results among individuals in the carpentry trade. Preventive Medicine. 2003;37:513–519. doi: 10.1016/S0091-7435(03)00178-6.
11. Mandelson MT, LaCroix AZ, Anderson LA, Nadel MR, Lee NC. Comparison of self-reported fecal occult blood testing with automated laboratory records among older women in a health maintenance organization. American Journal of Epidemiology. 1999;150:617–621. doi: 10.1093/oxfordjournals.aje.a010060.
12. Partin MR, Grill J, Noorbaloochi S, Powell AA, Burgess DJ, Vernon SW, et al. Validation of self-reported colorectal cancer screening behavior from a mixed-mode survey of veterans. Cancer Epidemiology, Biomarkers & Prevention. 2008;17:768–776. doi: 10.1158/1055-9965.EPI-07-0759.
13. Vernon SW, Tiro JA, Vojvodic RW, Coan S, Diamond PM, Greisinger A, et al. Reliability and validity of a questionnaire to measure colorectal cancer screening behaviors: Does mode of survey administration matter? Cancer Epidemiology, Biomarkers & Prevention. 2008;17:758–767. doi: 10.1158/1055-9965.EPI-07-2855.
14. Bastani R, Glenn BA, Maxwell AE, Ganz PA, Mojica CM, Chang LC. Validation of self-reported colorectal cancer (CRC) screening in a study of ethnically diverse first-degree relatives of CRC cases. Cancer Epidemiology, Biomarkers & Prevention. 2008;17:791–798. doi: 10.1158/1055-9965.EPI-07-2625.
15. Rauscher GH, Johnson TP, Cho YI, Walk JA. Accuracy of self-reported cancer-screening histories: A meta-analysis. Cancer Epidemiology, Biomarkers & Prevention. 2008;17:748–757. doi: 10.1158/1055-9965.EPI-07-2629.
16. Schenck AP, Klabunde CN, Warren JL, Peacock S, Davis WW, Hawley ST, et al. Evaluation of claims, medical records, and self-report for measuring fecal occult blood testing among Medicare enrollees in fee for service. Cancer Epidemiology, Biomarkers & Prevention. 2008;17:799–804. doi: 10.1158/1055-9965.EPI-07-2620.
17. Beebe TJ, Jenkins SM, Anderson KJ, Davern ME, Rockwood TH. The effects of survey mode and asking about future intentions on self-reports of colorectal cancer screening. Cancer Epidemiology, Biomarkers & Prevention. 2008;17:785–790. doi: 10.1158/1055-9965.EPI-07-2622.
18. Jones RM, Mongin SJ, Lazovich D, Church TR, Yeazel MW. Validity of four self-reported colorectal cancer screening modalities in a general population: Differences over time and by intervention assignment. Cancer Epidemiology, Biomarkers & Prevention. 2008;17:777–784. doi: 10.1158/1055-9965.EPI-07-0441.
19. Shokar NK, Vernon SW, Carlson CA. Validity of self-reported colorectal cancer test use in different racial/ethnic groups. Family Practice. 2011;28:683–688. doi: 10.1093/fampra/cmr026.
20. American Cancer Society. Cancer facts and figures 2008. Atlanta: American Cancer Society; 2008.
21. Tisnado DM, Adams JL, Liu H, Damberg CL, Chen WP, Hu FA, et al. What is the concordance between the medical record and patient self-report as data sources for ambulatory care? Medical Care. 2006;44:132–140. doi: 10.1097/01.mlr.0000196952.15921.bf.
22. Warnecke RB, Sudman S, Johnson TP, O’Rourke D, Davis AM, Jobe JB. Cognitive aspects of recalling and reporting health-related events: Papanicolaou smears, clinical breast examinations, and mammograms. American Journal of Epidemiology. 1997;146:982–992. doi: 10.1093/oxfordjournals.aje.a009226.