Colorectal cancer (CRC) screening may impact public health through both early diagnosis and actual cancer prevention via identification and colonoscopic removal of precursor lesions (adenomatous polyps)1. However, currently only half of the eligible population undergoes CRC screening2.
Moreover, many of these patients use tests that are insensitive to adenomatous polyps. Therefore, despite being eminently preventable, CRC still ranks as the second leading cause of cancer death in the United States1.
Traditional approaches to screening dichotomize the population into increased-risk (e.g., a personal or family history of colorectal neoplasia, or conditions such as ulcerative colitis) and average-risk (age ≥50) groups. The most recent guidelines categorize the available screening options as those that primarily identify cancers versus those that may also detect adenomas (with implications for CRC prevention)1. Generally, tests with higher patient acceptability lack sensitivity, whereas the more accurate tests are associated with lower patient compliance (secondary to cost, discomfort, intrusiveness, etc.)3. For instance, the fecal occult blood test (FOBT) is widely accepted by patients, but its sensitivity for advanced adenomas is only ~10%1,3. Colonoscopy is arguably the single best test, given its combined diagnostic and therapeutic facets, although many patients are reluctant to undergo this procedure.
While most would agree that increased-risk patients need colonoscopy, the majority of patients who develop CRC would a priori have been considered “average-risk”. On the other hand, screening the entire average-risk population with colonoscopy is inefficient in that only ~5–10% will harbor advanced neoplasia (carcinomas or advanced adenomas)3. Moreover, resource constraints make it unlikely that the ~80% of the population aged ≥50 who currently do not undergo colonoscopy will ever have this test performed3. The conundrum is that most of the people who need colonoscopy do not get it, whereas the majority of people who undergo colonoscopy do not actually benefit from it. This dilemma provided the impetus for CRC risk-stratification.
There are numerous non-familial CRC risk factors, including age, gender, race, diet, obesity, diabetes, smoking, and alcohol4. Indeed, modifiable risk factors have been estimated to be responsible for 39–71% of CRCs5. African-Americans have a higher incidence of, and mortality from, CRC. There are myriad potential explanations, including both societal issues (e.g., access to healthcare) and biological differences (a higher proportion of tumors that evolve via DNA mismatch repair deficiency and thus may behave more aggressively)6.
The report by Lieberman and colleagues confirmed that African-Americans have a higher prevalence of adenomas >9 mm (advanced adenomas)7. This large database review noted that the effect size was more pronounced in females. Additionally, there was an increased proximal distribution of these adenomas, which was probably related to factors such as age and gender. Nevertheless, since flexible sigmoidoscopy does not visualize the proximal colon, one could make an argument for preferentially using colonoscopy over flexible sigmoidoscopy in African-Americans.
These data also suggest that race should possibly be one factor in determining when to begin CRC screening. Currently, screening for the average-risk population is recommended to start at age 50. However, this cutoff is not infallible, given that ~8% of CRC cases occur before age 508. Exogenous factors can also influence the age at diagnosis. For instance, alcohol and tobacco use have been associated with a markedly younger age at CRC presentation compared with abstainers9. With regard to race, the earlier median age at CRC diagnosis in African-Americans (~6 years earlier than in Whites)10 provides further support for the American College of Gastroenterology’s (ACG) recommendation to initiate screening in African-Americans at age 456.
A vexing problem for individualizing screening strategies is that the absolute effect size of each risk factor is relatively modest. For instance, when family history is the indication for colonoscopy (a remarkably heterogeneous group in electronic database studies), the advanced adenoma prevalence appears to be only marginally higher than that for average-risk screening11. Thus, the goal of targeting finite colonoscopic resources to those likely to benefit from polypectomy is daunting. The standard approach of using a prescreen test (e.g., FOBT or flexible sigmoidoscopy) to determine the need for colonoscopy (analogous to the use of the Pap smear as a gatekeeper to colposcopy in cervical cancer screening) is not ideal, as these tests lack the requisite sensitivity1.
A promising new technology is CT colonography (virtual colonoscopy). Although it requires a colonic purge and is associated with some discomfort, suboptimal sensitivity for adenomas <1 cm, and radiation exposure, CT colonography (CTC) has many appealing aspects, including good performance for advanced lesions12. Several barriers to widespread implementation of CTC center on its lack of therapeutic capability, since ~25–35% of patients will actually harbor adenomas. There is some controversy over whether patients with intermediate adenomas (5–9 mm) should undergo colonoscopic polypectomy, given cost-effectiveness and related considerations1. One could argue that the risks of advanced features/cancer in adenomas <1 cm are low, although some patients and physicians may find the strategy of allowing potentially premalignant lesions the opportunity to grow unpalatable13. Additionally, given logistic hurdles, referral for polypectomy may require a separate visit and bowel purge. This is not trivial, in that the colonic preparation is often cited as the major reason for not undergoing CRC screening14. Furthermore, the extra-colonic findings that are frequently identified during CTC represent a double-edged sword. While some may be clinically beneficial, more are likely to be unimportant but may still obligate further investigations, increasing costs and discomfort15. Given these issues, a possible strategy would be to use CT colonography in low-risk patients (who are less likely to require polypectomy) while reserving colonoscopy for higher-risk patients16.
Finally, the need for more precise risk-stratification extends to patients undergoing colonoscopic surveillance. Many colonoscopies are “squandered” on overly aggressive follow-up. Endoscopic findings may provide reasonable, but far from perfect, long-term risk guidance17. One could envision incorporating other genetic and environmental risk factors to better guide surveillance colonoscopy intervals. A further complication is the issue of missed lesions, which can be mitigated, at least partially, by rigorous endoscopist quality control1.
So what is the primary care physician to do at present? The paramount role is as an advocate, since lack of physician recommendation and patient awareness are major impediments to CRC screening18. It bears emphasis that even suboptimal regimens are better than nothing. Physicians can adjust three facets of the screening program: modality, age of initiation, and test frequency. It seems logical that patients with several risk factors should undergo colonoscopy. When to begin screening is less well-defined. In our opinion, multiple risk factors may warrant an earlier age of initiation, consonant with the ACG guideline that African-Americans should begin screening at age 456. The data on intervals between screening tests remain unclear, with current guidelines (10 years and 5 years after a negative colonoscopy and CT colonography, respectively) based on expert opinion rather than compelling data1. Thus, some individualization based on risk factors seems reasonable in clinical practice.
What is on the horizon? This is clearly an active area of research among numerous groups. Blood tests are being investigated, including those using genomic and proteomic technologies to identify tumor products or the host response. Research is also focusing on elucidating the lower-penetrance, higher-frequency genes that modulate CRC risk. Single nucleotide polymorphism (SNP) analysis has yielded a number of genes that may portend a modest (<2-fold) elevation of risk. Potential approaches to improving predictive ability include utilizing SNP panels (related by disease pathophysiology, molecular pathways, etc.) or contextualizing with exogenous factors (e.g., smoking susceptibility genes)19,20. Our group has been interested in bridging bio-optics to the detection of field carcinogenesis, potentially allowing a rectally based risk assessment for neoplasia elsewhere in the colon21,22. There are numerous other emerging technologies and approaches22. A key concept is to maximize sensitivity in order to achieve an excellent negative predictive value. Since the idea is to eliminate the need for colonoscopy for many of the ~70% of patients who are actually adenoma-free, the negative predictive value is the critical parameter. On the other hand, some false positives are acceptable because, without the prescreen, all of those patients would have required more intrusive tests such as colonoscopy.
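A back-of-the-envelope calculation illustrates this point; the test characteristics here are hypothetical, chosen purely to show the dependence on sensitivity rather than to describe any particular assay. Assuming an adenoma prevalence p of ~30% (i.e., ~70% adenoma-free, as above), a prescreen with sensitivity (Se) of 95% and specificity (Sp) of 60% would yield a negative predictive value of

\[
\mathrm{NPV} = \frac{\mathrm{Sp}\,(1-p)}{\mathrm{Sp}\,(1-p) + (1-\mathrm{Se})\,p} = \frac{0.60 \times 0.70}{0.60 \times 0.70 + 0.05 \times 0.30} \approx 0.97
\]

Under these assumptions, roughly 44% of patients would test negative and could forgo colonoscopy, with only ~1.5% of all screened patients harboring a missed adenoma. Lowering sensitivity to 80% would drop the NPV to ~87%, underscoring why sensitivity, rather than specificity, is paramount for a prescreen.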
In summary, it is becoming clear that, as we enter the era of personalized medicine, CRC screening will evolve from simply dichotomizing patients into average or increased risk to assigning more precise gradations (“shades of gray”). By assessing both genetic and environmental risk factors, we may be able to more rationally tailor screening strategies to maximize cost-effectiveness and optimize the risk-benefit balance. While waiting for this field to mature, utilizing the published guidelines with judicious, evidence-based modifications (such as more aggressive screening of African-Americans) would seem prudent.