The long-term goal of applying genomics-based approaches to human cancer risk assessment is to replace the current testing paradigm, which relies on genotoxicity and carcinogenicity testing, with mechanism-based assays that allow both hazard detection and assessment of the relevance of observed findings to humans rather than rodents. For this to become a reality, molecular alterations and mechanistic insights derived from human cellular models must be correlated with injury, or the potential for injury, in humans. Linking the outcomes of in vitro toxicogenomics investigations to ongoing human omics-based biomarker studies could help make this happen. Consequently, more extensive data derived from human studies are needed, in particular appropriate samples from individuals exposed at low but well-defined levels. Because biomarkers suitable for monitoring exposure in human populations are essential to assessing human risk and relevance, developing genomic biomarkers for individuals exposed to a specific agent of concern would lay the groundwork for broader biomarker-based approaches (McHale et al. 2010
). Genomic approaches have the potential to facilitate the discovery of surrogate biomarkers: gene expression signatures, or expression patterns of proteins or metabolites, linked to a particular phenotype. This “phenotypic anchoring” of genomic signatures (Paules 2003
) would allow such patterns to serve as surrogate biomarkers that may be useful in treatment and risk assessment decision making in clinical or regulatory settings, even if the underlying molecular mechanism is not fully understood. The power of this approach has been demonstrated by gene-expression-based surrogate biomarkers that have informed clinicians about the prognosis of breast tumors and have helped in the design of appropriate therapeutic regimens (Paik et al. 2004
; Sotiriou and Pusztai 2009
; van ’t Veer et al. 2002
). Thus, genomic approaches that use appropriate samples from well-designed studies of exposed human populations may yield powerful novel biomarkers, useful both in the clinical setting and in risk assessment.
Systems toxicology approaches should also account for the relative sensitivity of humans and the variability of the human response. As carcinogens are increasingly recognized to affect multiple molecular mechanisms, and thus multiple cellular pathways, insights into these mechanisms could inform new predictive approaches, such as in vitro
assays, and allow for the development of specific, mechanism-based human biomarkers. These biomarkers could then provide insight into genetic variability in the responses that shape an individual's risk of developing cancer. Such mechanistic data will play a key role in the future of risk assessment, aiding the identification of additional sources of human variability and susceptibility (e.g., background diseases and processes, coexposures) and improving prediction of interactions across environmental and endogenous exposures. Identifying the mechanistic drivers of adverse responses will be particularly important in the risk assessment of low-dose exposures. Once progress is made in these areas, it may become possible to address the dose–response curve in an individual, which can take multiple forms depending on such factors as the individual's genetic background, the target tissue affected, and the actual internal dose of a specific compound. Goodsaid et al. (2010)
noted that regulators appear willing to accept such approaches when their use is clearly defined, the supporting evidence is strong, and the approaches have been validated and qualified for regulatory use. In general, educating stakeholders will be crucial to successfully implementing the new testing paradigm.
Toxicogenomics applications require further technological standardization as well as biological standardization, especially with respect to the annotation of genes and pathways related to toxicologically relevant end points. Further progress must also be made in systems toxicology applications, that is, in developing integrative approaches across multiple genomic, genetic, molecular, and cellular assays to assess toxic events from a holistic perspective, as described for the Connectivity Map approach. First-generation toxicogenomics studies used microarray-based whole-genome analysis of changes in gene expression. Current technologies analyze the interplay between epigenetic events (e.g., whole-genome DNA methylation and histone acetylation), changes in mRNA levels, changes in the levels of regulatory microRNAs, and proteomic and metabolomic events, thus increasing the potential for identifying pivotal pathways whose perturbation ultimately contributes to toxicity and disease. Accomplishing this will require better data analysis tools, specifically bioinformatics-based decision-support tools, to help not only research scientists but also chemical and drug registrants and regulators. It will also require publicly accessible databases that integrate different methods and types of information, from emerging omics data types to traditional pathological, toxicological, physiological, molecular, and clinical data. New methods of training and familiarizing all parties with these new tools and strategies will be needed; this training may require additions to existing curricula for students and special, targeted training opportunities for professionals.
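To illustrate the kind of integrative, signature-based comparison that the Connectivity Map exemplifies, the sketch below scores how strongly a query signature (sets of up- and down-regulated genes) is enriched at the extremes of a reference expression profile ranked from most up- to most down-regulated. The gene names, function names, and the simplified Kolmogorov–Smirnov-style running-sum statistic are illustrative assumptions for exposition, not the published Connectivity Map implementation.

```python
def enrichment_score(ranked_genes, gene_set):
    """KS-like running-sum enrichment of gene_set within a ranked gene list.

    Walk the ranked list; step up on signature genes, down on others, and
    return the running sum's most extreme deviation from zero.
    """
    n, hits = len(ranked_genes), len(gene_set)
    if hits == 0 or hits == n:
        return 0.0
    hit_step = 1.0 / hits          # increment when a signature gene is found
    miss_step = 1.0 / (n - hits)   # decrement otherwise
    running, extreme = 0.0, 0.0
    for gene in ranked_genes:
        running += hit_step if gene in gene_set else -miss_step
        if abs(running) > abs(extreme):
            extreme = running
    return extreme

def connectivity_score(ranked_genes, up_set, down_set):
    """Positive when up-genes concentrate at the top of the ranking and
    down-genes at the bottom (signature matches the reference profile);
    negative for the reverse; 0.0 when the two enrichments are discordant."""
    es_up = enrichment_score(ranked_genes, up_set)
    es_down = enrichment_score(ranked_genes, down_set)
    if (es_up >= 0) == (es_down >= 0):   # same sign: inconclusive match
        return 0.0
    return (es_up - es_down) / 2.0
```

A reference profile whose top-ranked genes match the query's up-set and whose bottom-ranked genes match its down-set yields a score near +1, suggesting the two perturbations engage similar pathways; a score near -1 suggests opposing effects.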
A major challenge is the need to phenotypically anchor genomic responses from in vitro studies and testing assays of chemical carcinogenesis to human pathophysiology. The critical need for human relevance is not a new problem, but it has appeared intractable in the past, at least in part because of the paucity of critical human samples and information. Overcoming it now will require cooperation and data sharing between private and public research partners, as well as broad collaborative efforts.