Concerns about rising global cancer rates, which are predicted to increase by about 50% by 2020 [65
], are intensifying discussions about appropriate prevention strategies. These projections point to a major preventable public health problem in both developing and developed countries. It remains to be determined whether health care professionals are ready to deal with this major societal issue, including its economic impact.
Almost 35 years ago, Doll and Peto [67
] suggested that diet likely accounted for about 30% of the risk of developing cancer. Since then, a wealth of evidence has pointed to the ability of multiple dietary components to modify cancer development and progression. The World Cancer Research Fund and American Institute of Cancer Research Report [65
] concluded, on the basis of a review of thousands of published articles, that diet contributes significantly to cancers worldwide, but that the actual percentage is highly dependent on the specific diet consumed and the type of cancer. Regardless, even this comprehensive evidence-based review rates the diet-cancer relationship as ‘probable’ rather than ‘compelling’. A quick glance at the scientific literature explains why this is the case. Reviews, which often extol the benefits of classes of foods or their components, also point to the considerable variation in response across experiments [68
]. Some have argued that the interpretation of dietary data from large populations is riddled with inaccuracies and is analogous to comparing apples and oranges. The failure to take into consideration individual variation in the amounts of foods/components consumed, and in how these are digested, metabolised to active intermediates and eliminated as waste products, likely contributes significantly to the difficulty of unravelling which foods are most important for health [71
]. An integrated framework that simultaneously examines nutrigenomics, nutrigenetics, epigenetics and transcriptomics should provide important clues about who might benefit or be placed at risk due to dietary change [71
]. It must be understood that this diet-phenotype relationship can also be influenced by the frequency and magnitude of insults that result from excess calories, viruses, bacteria and environmental contaminants [72].
Genetic tests are already available for more than 1,700 diseases (http://www.ncbi.nlm.nih.gov/sites/GeneTests/
). The availability of databases that will allow for the effective use of genomic information to create personalised intervention strategies is sorely needed. While pharmacogenetic testing has already emerged as a strategy for predicting the efficacy of drugs, a similar approach has not yet occurred with nutrigenetics. Pharmacogenetics employs gene polymorphism information to predict the extent to which drugs are transported and metabolised, and thus to calculate the quantity needed to bring about a response. Similarly, genetic variation in the transport and metabolism of dietary factors may alter their impact on cancer. Moy et al. [74
] suggest that knowledge about glutathione S-transferase M1 or T1 polymorphisms may be useful for predicting the amounts of isothiocyanates needed to reduce the risk of gastric cancer. In their study in men in Shanghai, those with the T1 null condition required a smaller amount of isothiocyanates to reduce risk than those with the non-null condition. Similarly, incorporating information about multiple P450
polymorphisms may help predict who would benefit most from limiting meat consumption in terms of colorectal cancer risk [75
]. A subset of individuals, about 5% of the population, showed a risk close to 50-fold higher than the ~20% excess risk suggested by typical epidemiological findings. This raises an intriguing question about whether current global public health recommendations are the most appropriate when some, but not all, individuals may be particularly vulnerable. A peroxisome proliferator-activated receptor δ polymorphism (789 C→T) may also shed light on who might benefit most or be placed at risk from high fish intake [76
]. Unfortunately, while these studies reveal intriguing relationships among nutrigenetics, diet and cancer risk, they remain largely unsubstantiated and therefore speculative.
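The dilution at work in such unstratified estimates can be sketched with purely illustrative arithmetic; the numbers below are invented and are not taken from the studies cited above.

```python
# Illustrative sketch only (invented numbers): how pooling a small susceptible
# subgroup with the rest of the population averages away its elevated risk.

def pooled_relative_risk(fraction, subgroup_rr, other_rr=1.0):
    """Population-average relative risk for a two-group mixture."""
    return fraction * subgroup_rr + (1 - fraction) * other_rr

# A 5% subgroup carrying a 5-fold risk is already enough to produce the
# ~20% excess risk (pooled relative risk of about 1.2) that unstratified
# epidemiological studies report.
print(round(pooled_relative_risk(0.05, 5.0), 2))
```

Read the other way round, a pooled relative risk near 1.2 is compatible with a far larger risk concentrated in a genotype-defined minority, which is the point the colorectal cancer example above makes.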
Copy number may be an additional variable that influences the response to foods. Some of the strongest evidence that this is the case comes from the observation that an increase in amylase gene copy number is associated with an increased enzymatic activity and starch digestion [77
]. Some have proposed that copy number may account for about 25% of the individual variation in response [78].
Today, considerable focus is on genome-wide association studies (GWAS) as an approach for identifying genes that precipitate diseases, including cancer. Some GWAS are considering dietary variables [79
], which is in contrast to the vast majority. At this point, GWAS have largely reconfirmed what was already known, yet continue to indicate that identifying the most important genetic variables will not be an easy task, possibly because of redundantly controlled cellular processes [80].
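At its core, the single-variant test repeated across the genome in a GWAS is a simple case-control contingency test. A minimal sketch, with invented allele counts (the function and numbers are illustrative, not from any cited study):

```python
# Hypothetical single-variant GWAS-style test: Pearson chi-square (1 df) on a
# 2x2 table of risk-allele vs other-allele counts in cases and controls.
# All counts are invented for illustration.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic (no continuity correction) for [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

cases = (300, 700)      # (risk allele, other allele) on 1,000 case chromosomes
controls = (240, 760)   # same layout for 1,000 control chromosomes
print(round(chi_square_2x2(*cases, *controls), 2))
```

A real GWAS repeats such a test over hundreds of thousands to millions of variants, which is why genome-wide significance thresholds (conventionally p < 5 × 10⁻⁸) are far stricter than the single-test 1-df cut-off of 3.84, and why adding dietary covariates to every test is so rarely attempted.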
For nutrigenetics to have meaning, the genetic change must be intrinsically linked to a specific biological process [72
]. Some of the strongest evidence for a link between a genetic polymorphism and a biological outcome comes from studies on the vitamin D receptor (VDR) gene Fok1 polymorphism and calcium homeostasis [81
]. The longer VDR Fok1 f allele, which is less responsive to vitamin D than the shorter F allele, is linked to increased colorectal cancer risk when calcium intakes are low [82
]. While the f allele is accompanied by reduced calcium accretion and poorer bone health compared with the longer F allele, it remains to be determined how the change in calcium accretion relates to a change in cancer risk. Unquestionably, greater attention must be given to how changes in gene polymorphisms, deletions or copy number, for example, relate specifically to cancer processes and thereby to changes in risk.
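How a gene-diet interaction of this kind is usually examined can be sketched with genotype-stratified odds ratios; all counts below are hypothetical and are not drawn from the VDR studies cited above.

```python
# Hypothetical sketch of a gene-diet interaction analysis: odds ratios for
# low vs high calcium intake, computed separately within each VDR Fok1
# genotype stratum. Every count here is invented for illustration.

def odds_ratio(exp_cases, exp_controls, unexp_cases, unexp_controls):
    """Odds ratio for the exposure (here, low calcium intake)."""
    return (exp_cases * unexp_controls) / (exp_controls * unexp_cases)

# arguments: low-calcium cases/controls, then high-calcium cases/controls
or_ff = odds_ratio(60, 40, 30, 70)  # carriers of the less responsive f allele
or_FF = odds_ratio(45, 55, 40, 60)  # carriers of the more responsive F allele

print(round(or_ff, 2))  # low calcium markedly raises risk in this stratum
print(round(or_FF, 2))  # little effect of calcium intake in this stratum
```

A formal analysis would fit a logistic model with a genotype × intake interaction term, but the stratified pattern, an effect of calcium intake confined to f-allele carriers, is what defines the interaction.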
While considerable excitement exists for using nutrigenetics to predict the benefits or risks of consuming specific foods, the area remains in its infancy. A recent US government report found that nutrigenetic tests might be misleading or even harmful because they make claims that cannot be scientifically proven (http://www.gao.gov/new.items/d06977t.pdf).
Considerable evidence suggests that epigenetic abnormalities induced by diet are also amongst the most important factors affecting cancer risk. At least four distinct processes are involved in epigenetics: DNA methylation, histone modifications, microRNAs and other noncoding regulatory RNAs, and chromatin remodelling [83
]. Some of the strongest data linking diet to epigenetic events comes from studies with the agouti mouse model. Adding dietary factors (i.e. choline, betaine or folic acid), which enhance methylation, to the maternal diet of these pregnant mice leads to a change in the phenotype of some of the offspring [84
]. Interestingly, adding genistein, which does not provide methyl groups, also leads to a change in the phenotype from a yellow to more agouti offspring [85
]. Most importantly, these shifts in coat colour are accompanied by a reduction in the risk of cancer, as well as of diabetes and obesity. The shift in obesity in these animals is noteworthy given the worldwide obesity epidemic. Such findings should serve as justification for additional attention to bioenergetic-epigenetic interrelationships, especially those modified by dietary factors.
Myzak and Dashwood [86
] have demonstrated that sulphoraphane, butyrate and allyl sulphur are effective inhibitors of histone deacetylase (HDAC). HDAC inhibition was associated with global increases in histone acetylation, enhanced interactions of acetylated histones with the promoter region of the P21
gene, and elevated expression of p21Cip1/Waf1 and BAX proteins. Importantly, sulphoraphane has been reported to reduce HDAC activity in humans [87
]. Future research likely needs to relate HDAC changes in humans to a change in a cancer-related process. Furthermore, since acetylation is only one method to regulate histone homeostasis [83
], greater attention needs to be given to how nutrition might influence the other types of histone modifications.
Genomic and epigenomic processes likely do not entirely account for the ability of dietary factors to influence phenotypic changes since changes in the rate of transcription of genes (transcriptomics) can also be fundamental to cellular processes [88
]. Multiple pathways appear to intersect as a cause of multiple diseases [89
]. Thus, the examination of these pathways via transcriptomic profiles may simultaneously provide important clues about multiple disease risks. Notably, several bioactive food components, including both essential and non-essential nutrients, can regulate gene expression patterns. Their influence on gene transcription and translation is not only concentration dependent but also time dependent [90
]. Nevertheless, these changes may provide critical insights about the specificity of individual food components to influence one or more biological processes, including those involved in the risk of cancer development and/or tumour behaviour.
To date, few human studies have used transcriptomics to characterise the response to specific foods or their components. A recent study [92
] suggested its potential by demonstrating specific gene expression patterns in leucocytes a few hours after consumption of a high-protein or -carbohydrate breakfast cereal. Thus, it is conceivable that a bolus approach might be used with selected foods or components in concert with a transcriptomic profile to generate a predictive model for those who might benefit or be placed at risk due to a change in eating behaviour. It is unclear if blood truly reflects changes in target tissues, and thus exfoliated or other more relevant cell types may be needed. In another recent study, prostate biopsies were effective in detecting transcriptomic shifts caused by consumption of a low-fat/low-glycaemic-load diet compared with a traditional diet [93
]. It should be noted that over-interpretation of the physiological significance of transcriptomic patterns is certainly possible since these are single snapshots. Furthermore, mRNA abundance is not always proportional to protein activity and thus may limit its overall utility to serve as a predictor of responders and non-responders.
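The simplest form of such a transcriptomic comparison, a per-gene fold change with a two-sample statistic, can be sketched as follows. The expression values are invented, and real analyses use moderated statistics with multiple-testing correction across thousands of genes.

```python
import math
from statistics import mean, stdev

# Toy sketch of the simplest transcriptomic comparison: per-gene log2 fold
# change and a Welch t statistic between two diet groups. Values invented.

def log2_fold_change(group_a, group_b):
    return math.log2(mean(group_a) / mean(group_b))

def welch_t(group_a, group_b):
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2
    return (mean(group_a) - mean(group_b)) / math.sqrt(
        va / len(group_a) + vb / len(group_b))

# normalised expression of one hypothetical gene in biopsies from two groups
low_fat = [8.1, 8.4, 7.9, 8.3, 8.2]
control = [9.0, 9.3, 9.1, 8.9, 9.2]
print(round(log2_fold_change(low_fat, control), 2))
print(round(welch_t(low_fat, control), 2))
```

Even in this toy form the caveats raised above apply: the snapshot says nothing about protein abundance or activity, and a profile measured in blood may not mirror the target tissue.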