Despite widespread consensus on the need to transform toxicology and risk assessment to keep pace with the technological and computational changes that have revolutionized the life sciences, much work remains to be done to achieve the vision of a toxicology built on a mechanistic foundation. A workshop was organized to explore one key aspect of this transformation – the development of Pathways of Toxicity (PoT) as a key tool for hazard identification based on systems biology. Several issues were discussed in depth in the workshop: The first was the challenge of formally defining the concept of a PoT as distinct from, but complementary to, other toxicological pathway concepts such as mode of action (MoA). The workshop arrived at a preliminary definition of PoT as “a molecular definition of cellular processes shown to mediate adverse outcomes of toxicants”. It was further recognized that normal physiological pathways exist that maintain homeostasis and that these, when sufficiently perturbed, can become PoT. Second, the workshop sought to define adequate public and commercial resources for PoT information, including data, visualization, analyses, tools, and use-cases, as well as the kinds of efforts that will be necessary to create such a resource. Third, the workshop explored ways in which systems biology approaches could inform pathway annotation, and which resources are needed and available to provide relevant PoT information to the diverse user communities.
“There is a goal, but no way; what we call a way is hesitation.” Franz Kafka (Kafka, 1931, p 230)
The “Toxicology in the 21st Century” (Tox-21c) movement, initiated with the 2007 National Research Council report (NRC, 2007; Krewski et al., 2010), has stirred the toxicological community (Hartung and McBride, 2011; Hartung and Leist, 2008; Hartung, 2008, 2009) and initiated a far-reaching discussion about current practices in risk assessment and possible avenues for advancement. A critical overview of the extensive dialog that ensued after the publication of the report has been compiled by Andersen and Krewski (2010). Within a few years, the discussion has moved from whether the field of toxicology should change to discussions on how and when to do so – from the call for a Human Toxicology Project (Seidle and Stephens, 2009; http://www.humantoxicologyproject.org) to ongoing programs of US federal agencies (Judson et al., 2010; Knudsen et al., 2011) and the redefinition of the EPA toxicity-testing paradigm (Firestone et al., 2010).
The United States Food and Drug Administration (FDA) has recently embraced this strategy (Hamburg, 2011):
“We must bring 21st century approaches to 21st century products and problems. Toxicology is a prime example. Most of the toxicology tools used for regulatory assessment rely on high-dose animal studies and default extrapolation procedures and have remained relatively unchanged for decades, despite the scientific revolutions of the past half-century. We need better predictive models to identify concerns earlier in the product development process to reduce time and costs. We also need to modernize the tools used to assess emerging concerns about potential risks from food and other product exposures. … With an advanced field of regulatory science, new tools, including functional genomics, proteomics, metabolomics, high-throughput screening, and systems biology, can replace current toxicology assays with tests that incorporate the mechanistic underpinnings of disease and of underlying toxic side effects. This should allow the development, validation, and qualification of preclinical and clinical models that accelerate the evaluation of toxicities during drug development. … Ultimately, investments in regulatory science can lead to a new era of progress and safety. Because such investments will promote not only public health but also the economy, job creation, and global economic competitiveness, they have major implications for the nation’s future.”
We could not summarize it better.
The key proposal of Tox-21c is straightforward: we have to base regulatory toxicology (for environmental chemicals, because this was the mandate of the National Academy of Sciences panel) on mechanism and mode of action (MoA). The term “toxicity pathways” was coined in the NRC report, and the term “Pathway of Toxicity” (PoT) was later introduced by Hartung and colleagues (Hartung, 2009; Hartung and McBride, 2011). The OECD uses “adverse outcome pathway” (AOP) in the context of its QSAR Toolbox and ecotoxicology (Ankley et al., 2006) and recently published a draft proposal for a template and guidance on developing and assessing the completeness of AOP (OECD, 2012). This is in line with the science of toxicology moving toward a more complete mechanistic understanding. There have already been some tentative efforts to identify and describe PoT. One component of the Tox-21 alliance formed by the US EPA (ToxCast™), the NIEHS (within the National Toxicology Program), the NIH Chemical Genomics Center (the high-throughput testing program), and the FDA (the Critical Path Initiative) is focused on the use of high-throughput screening data to facilitate and test PoT.
The limitations of the existing paradigm are well known: hazard assessment based on animal testing has limited throughput achieved at high cost; if traditional tests were applied to the backlog of existing chemicals of concern for which there are limited safety data, the costs would be enormous, and, even if that were not an obstacle, the testing capacity is simply too small (see, e.g., Hartung and Rovida, 2009a,b; Rovida and Hartung, 2009; Seok et al., 2013). Furthermore, while the continued or expanded use of animal testing has become increasingly objectionable to the general public, as well as to many in the toxicology community, there is at the same time a public mandate to perform more thorough hazard assessment and testing for industrial chemicals (e.g., the European REACH legislation), not to mention the demands of the pharmaceutical and consumer product industries. New types of products, e.g., nanomaterials, that will likely play an important role in our economic future require a more sophisticated hazard assessment paradigm (Hartung, 2010a). Finally, the necessary practice of high-dose to low-dose extrapolation is imprecise and often results in an overly cautious approach.
To foster the ideas of the NRC report, in October 2012 the Center for Alternatives to Animal Testing (CAAT), supported by the Doerenkamp-Zbinden Foundation, Zurich, Switzerland, and Unilever, held a workshop on “Pathways of Toxicity” that discussed the concept of PoT as well as the necessary associated tools, standards, and core competencies. The three-day workshop brought together a diverse group of more than 30 front-line researchers and experts from academia (e.g., universities in Boston, Alberta, and Tel Aviv, and Johns Hopkins University in Baltimore), independent research institutes (TNO Netherlands and The Hamner Institutes for Health Sciences), industry (e.g., Agilent and Unilever), non-governmental organizations (e.g., The Humane Society of the US), systems biology/toxicology content and tool providers (e.g., KEGG, Thomson Reuters, WikiPathways, Reactome, Ingenuity Systems, Genometry), and the regulatory professionals who employ toxicology studies and data analysis tools to protect public health (e.g., NIH & NIEHS, US EPA, US FDA, European Commission). This report presents the conclusions and perspectives from that workshop. We outline the possible benefits of mapping PoT and clarify the meaning and definition of PoT, complemented by a thorough discussion of the usefulness and validation of a public PoT database. Finally, we discuss future challenges and directions, including the idea of creating a PoT consortium.
Toxicology, like the rest of biology, is undergoing a shift from a reductionist approach to a more system-oriented view that takes advantage of the newer, high-content and high-throughput technologies (van Vliet, 2011). The opportunity to move away from the limited mechanistic information provided by traditional animal tests to a pathway-based approach that provides detailed, specific, mechanistic understanding at a cellular level, predictive for target organ toxicities in a causal (ideally dose dependent) manner, presents both challenges and opportunities (Hartung and McBride, 2011; Hartung et al., 2012). As part of this challenge, the production of a comprehensive list of all PoT – that is, the “human toxome” – would be of great benefit. This concept is based on the assumption that the number of PoT is finite, and that, once mapped, toxicology can move towards more certainty while sharply reducing and eventually eliminating the need for animal testing (see Section 4).
Pathway-based approaches to toxicity testing require different methods of extrapolation. With animal testing, an expensive, two-year animal assay may establish, for example, that a 6 ppm exposure concentration is a point-of-departure for specific adverse responses. For non-cancer effects, this in-life point-of-departure would be divided by various uncertainty factors to arrive at a “safe” dose. For carcinogens, linear low-dose modeling would be used to estimate a dose associated with some level of risk (e.g., 1/100,000 or 1/1,000,000). With a PoT approach, the point-of-departure (PoD) will arise from observations in in vitro test batteries that provide richer multi-dose concentration-response curves. These in vitro PoDs will be adjusted using in vitro to in vivo extrapolation (Rotroff et al., 2010; Wetmore et al., 2012), and computational pathway models (Bhattacharya et al., 2011) will be needed to derive proposed “safe doses”, depending on characteristics of the pathway architecture. These pathway approaches will link dose and dynamics – especially at low doses – and will show a clear causal linkage between initiating event and adverse outcome that should be useful both for setting safe doses and for identifying biomarkers.
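The traditional extrapolation arithmetic described above can be sketched in a few lines. The uncertainty factors and risk levels used here are purely illustrative placeholders, not values from any actual assessment:

```python
# Illustrative sketch of traditional dose extrapolation from an animal-study
# point-of-departure (PoD). All numerical values are hypothetical.

def safe_dose_noncancer(pod_ppm, uncertainty_factors):
    """Non-cancer effects: divide the in-life PoD by the product of uncertainty factors."""
    total_uf = 1
    for uf in uncertainty_factors:
        total_uf *= uf
    return pod_ppm / total_uf

def risk_specific_dose_linear(pod_ppm, risk_at_pod, target_risk):
    """Carcinogens: linear low-dose extrapolation, dose proportional to risk."""
    return pod_ppm * (target_risk / risk_at_pod)

# 6 ppm PoD divided by 10x (interspecies) and 10x (intraspecies) factors:
print(safe_dose_noncancer(6.0, [10, 10]))          # 0.06 ppm
# Dose for a 1/1,000,000 risk, assuming (hypothetically) 10% risk at the PoD:
print(risk_specific_dose_linear(6.0, 0.1, 1e-6))
```

The contrast with the PoT approach is that the in vitro PoD would instead come from a concentration-response curve and be adjusted by kinetic in vitro to in vivo extrapolation rather than by fixed default factors.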
Lastly, toxicology must move away from an approach that extrapolates from rodents toward one based on human tissue; this by definition necessitates understanding toxicological mechanisms at the cellular and pathway level, together with in vitro to in vivo extrapolation of dose levels. Ultimately, a pathway-based approach that uses human tissue, informed by a deeper mechanistic understanding of toxicity as well as of human disease, decreases uncertainty in decision-making.
As an example of the problems that face regulators when testing a substance with current approaches, consider the dilemma posed by negative results: there is always the possibility that a different dosing scheme, a different species, or some other experimental variation might yield very different results. The uncertainty only increases when we consider that animals might have a defense mechanism not present in humans or in sensitive populations such as newborns, who, for instance, lack a functional blood-brain barrier for chemicals. Conventionally, however, we assume that with some additional measures (high doses, species selection, more than one species, structural alerts, etc.) we can overcome this problem. A more definitive answer could be given if we had a complete list of human-relevant PoT and a corresponding validated test battery. Then we could, for the first time, be reasonably confident that a substance does not trigger or perturb relevant PoT. Similarly, we could establish concentrations of substances (in vitro no-effect levels, NOEL(in vitro)) at which no PoT is triggered. It is important to note that the triggering of a PoT does not necessarily indicate harm, but a potential for harm.
As omics technologies have increasingly added to our knowledge of biology, pathway-oriented databases such as KEGG, WikiPathways, and Reactome have proliferated, so the question might be asked: is there really a need for another pathway database?
Participants also agreed that, ideally, the database should be constructed to allow easy answers to inquiries from researchers (e.g., which nodes within a signaling network are suspected of being involved in endocrine disruption?) as well as from regulatory scientists looking to de-risk chemicals early in the R&D process (e.g., for which nodes in a PoT are there assays?). And lastly, it should be able to answer the question, “Which nodes are important for regulatory purposes?”
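The kinds of queries listed above imply a database in which pathway nodes are linked to annotations and assays. A minimal sketch of that idea, using an in-memory SQLite store – the schema, node names, and example rows are entirely hypothetical, not a proposed design:

```python
# Hypothetical mini-schema illustrating the query requirements for a PoT database.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE node (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE node_annotation (node_id INTEGER, annotation TEXT);
CREATE TABLE assay (node_id INTEGER, assay_name TEXT);
INSERT INTO node VALUES (1, 'ERalpha'), (2, 'Nrf2'), (3, 'AhR');
INSERT INTO node_annotation VALUES (1, 'endocrine disruption'), (3, 'endocrine disruption');
INSERT INTO assay VALUES (1, 'ER transactivation assay'), (2, 'ARE reporter assay');
""")

# "Which nodes are suspected of involvement in endocrine disruption?"
suspects = sorted(r[0] for r in con.execute(
    "SELECT n.name FROM node n JOIN node_annotation a ON n.id = a.node_id "
    "WHERE a.annotation = 'endocrine disruption'"))

# "For which nodes in a PoT are there assays?"
with_assays = sorted(r[0] for r in con.execute(
    "SELECT DISTINCT n.name FROM node n JOIN assay s ON n.id = s.node_id"))

print(suspects)     # ['AhR', 'ERalpha']
print(with_assays)  # ['ERalpha', 'Nrf2']
```

A real resource would of course need a richer model (species, evidence levels, dose context), but the point is that annotations and assay coverage should be first-class, queryable links.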
After extensive discussion, the workshop participants came up with a formal definition of a Pathway of Toxicity:
A Pathway of Toxicity is a molecular definition of the cellular processes shown to mediate adverse outcomes of toxicants.
This definition focuses our attention on thoroughly understanding the molecular mechanisms of toxicity while maintaining the emphasis on the cellular context. PoT are relevant to regulators if, and only if, we can define necessary and sufficient pathways for adverse outcomes and establish their relevance by evaluating the scientific evidence. Evidence-based toxicology (EBT) could serve as a framework to establish the tools necessary for validating the PoT (see Section 5) (Hartung, 2010b).
It is important to keep in mind that a linear pathway is an artificial construct – all pathways are abstracted from a broader, global cellular network and are therefore, at some level, oversimplifications (see Figure 2 and, for an overview, e.g., Kholodenko et al., 2012). The full complexity of a network, however, is difficult to represent on a map and distracts from the key events. It may therefore be necessary not to think of pathways as sharply and precisely delineated from the broader cellular network, but rather to keep in mind that a pathway representation may always be a “warm, fuzzy cloud”: warm, since the answer is close but not necessarily exact; fuzzy, since the membership of components in a pathway is graded; and a cloud, since the boundaries are not sharply defined.
There will be several challenges to refining the definition of PoT into a useful working definition – how does one choose where a pathway ends? How does a pathway-based approach refine our understanding of a dose-response dependency? Toxicological processes are both spatially and temporally dynamic – how will this be represented in a pathway-based approach?
There are other questions that will need to be addressed as evidence accumulates: are PoT perturbations of known physiological pathways? For example, proliferation is a normal process; when does one re-label it as a PoT? Is it possible that certain PoT are novel pathways active only in the presence of a toxicant? Are there any PoT that are distinct pathways altogether? How many PoT can we expect to find? “132,” Mel Andersen, one of the proponents of Tox-21c and a workshop organizer, often answers, adding after a pause: “as a toxicologist/risk assessor, I am accustomed to false accuracy.”
At this moment, any prediction of the number and nature of PoT is pure speculation and will have to await more experimental evidence. Nonetheless, the number of cellular targets and metabolic pathways is finite, and thus the number of PoT should be finite as well. Evolution cannot have left too many vulnerable points, given the number of xenobiotics we are exposed to and the astonishingly large number of healthy years we enjoy on average. We grasp the enormous redundancy and buffering provided by biological networks when considering the surprising number of viable homozygous knockout mice, which often display only subtle phenotypic changes despite lacking an entire gene. The recent finding that each human individual is null for both alleles of more than twenty genes also attests to the genome’s redundancy (MacArthur et al., 2012).
One unique challenge for the PoT database will be the requirement not only to represent the PoT or their network but also the kinetics and cellular or tissue location of these events, as a PoT represents a spatio-temporal event in the cell. In this respect, it may be necessary to extend the definition of PoT to include a more quantitative model, similar to those discussed in Uri Alon’s An Introduction to Systems Biology (Alon, 2007). From this perspective, a pathway represents not just a link between a series of nodes but might instead be thought of as a wiring diagram with components such as positive and negative feedback loops, along with quantitative information about inputs, thresholds, and outputs.
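The idea of a pathway as a quantitative wiring diagram rather than a chain of nodes can be illustrated with a toy model: a stressor input induces a response protein that feeds back negatively on its own production. All parameters and the functional form are hypothetical, chosen only to show the kind of behavior such diagrams capture:

```python
# Toy quantitative pathway model: stressor-induced response with negative
# feedback (Hill-type self-inhibition). Parameters are hypothetical.

def simulate(stressor, steps=10000, dt=0.01):
    """Euler integration of dR/dt = k_prod*stressor / (1 + (R/K)**n) - k_deg*R."""
    k_prod, k_deg, K, n = 1.0, 0.5, 1.0, 4  # production, degradation, threshold, steepness
    R = 0.0
    for _ in range(steps):
        dR = k_prod * stressor / (1.0 + (R / K) ** n) - k_deg * R
        R += dR * dt
    return R

# Negative feedback buffers the steady-state response: a 10-fold increase in
# stressor input yields far less than a 10-fold increase in output.
low, high = simulate(1.0), simulate(10.0)
print(high / low)  # well below 10
```

This buffering is exactly why thresholds and feedback architecture matter for distinguishing a tolerated perturbation from a PoT: the same wiring that maintains homeostasis determines where the system fails.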
Most importantly, toxicology is not alone in identifying pathways – all the life sciences are on the same quest under the label of systems biology. It is the logical next step stemming from the advent of high-content technologies (omics), attempting to create order by identifying the underlying pathways. Therefore, we will not have to reinvent the wheel as pathway mapping, visualization, and database tools are increasingly being developed in other areas of the life sciences (e.g., Cytoscape (Cline et al., 2007), PathVisio (van Iersel et al., 2008), iPath (Letunic et al., 2008), CellDesigner (Funahashi et al., 2003), VANTED (Junker et al., 2006), IPA from Ingenuity Systems, Agilent Genespring, or MetaCore from Thomson Reuters).
As an example for primary data analysis – identification of statistically significant signatures and mapping of cross-technology datasets onto known pathways – the Human Toxome Consortium, which initiated this PoT workshop, is largely relying on Agilent GeneSpring software. GeneSpring is a comprehensive package that combines advanced bioinformatics tools for the analysis of gene expression microarray, next generation sequencing (NGS), LC/MS, and GC/MS data with the ability to conduct joint analysis in the context of curated or customized pathways. At the time of writing, GeneSpring supports WikiPathways, BioCyc, Ingenuity, and MetaCore content; KEGG support will become available later this year. Besides data normalization, QC, clustering, and statistical analyses of their primary gene expression and metabolite abundance data, users can perform pathway enrichment computations that leverage multiple data types and seamlessly explore and co-analyze the results overlaid on pathway diagrams in the Pathway Architect module. Additional analysis and visualization methods tailored to the specific needs of PoT projects, such as multi-omics correlation tools, will be developed in collaboration with members of the NIH transformative research project on “Mapping the Human Toxome by Systems Toxicology” (http://humantoxome.com).
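The pathway enrichment computations mentioned above typically boil down to an over-representation test: is the overlap between a list of omics hits and a pathway's gene set larger than chance? This is not the specific algorithm used by any particular tool, but a generic hypergeometric sketch with hypothetical numbers:

```python
# Generic over-representation (pathway enrichment) test via the
# hypergeometric tail probability. All input numbers are hypothetical.
from math import comb

def hypergeom_enrichment_p(genome_size, pathway_size, hits, hits_in_pathway):
    """P(X >= hits_in_pathway) when drawing `hits` genes without replacement
    from a genome containing `pathway_size` pathway members."""
    p = 0.0
    for k in range(hits_in_pathway, min(hits, pathway_size) + 1):
        p += comb(pathway_size, k) * comb(genome_size - pathway_size, hits - k) \
             / comb(genome_size, hits)
    return p

# 20,000 genes, a 100-gene pathway, 200 differentially expressed genes,
# 12 of which fall in the pathway (~1 expected by chance):
p = hypergeom_enrichment_p(20000, 100, 200, 12)
print(p < 0.001)  # True: far more overlap than chance
```

In practice, tools also correct such p-values for testing many pathways at once (e.g., false discovery rate control), which matters when screening hundreds of candidate PoT.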
WikiPathways (Pico et al., 2008; Kelder et al., 2012) facilitates the contribution and maintenance of pathway information by the biology community. It is an open, collaborative platform dedicated to online pathway curation and thus complements ongoing efforts such as KEGG and Reactome. Building on the same MediaWiki software that powers Wikipedia, it includes custom graphical pathway editing tools and integrated databases covering major small-(bio)molecule systems. The web-based format of WikiPathways lowers the barrier for biologists (e.g., toxicologists) to participate in pathway curation. More importantly, the open, public approach of WikiPathways allows for wider participation by the entire toxicological community. This approach also shifts the bulk of peer review, editorial curation, and maintenance to the toxicological community, and as such can provide content for more formally peer-reviewed efforts, such as Reactome or the creation of a PoT database. Efforts to use WikiPathways content and tools in the context of in vitro toxicology, specifically to address the use of human disease mechanisms in silico in the interpretation of in vitro toxicological data, have started under the Assuring Safety Without Animal Testing (ASAT) initiative for allergic contact dermatitis and hepatocellular cancer and are soon to be extended with models for cholestasis.
Reactome, another valuable resource, is a freely accessible, open-source, curated, and peer-reviewed knowledge-base of human bioreactions, pathways, and processes, which serves as a platform for pathway visualization and analysis of complex experimental data sets (Croft et al., 2011). A recent extension of the Reactome data model permits the capture of normal biological pathway behavior and the prediction of its response to a stress, such as a mutational change in a protein’s function or the presence of a novel small molecule in the environment, in a comprehensive and internally consistent format (Milacic et al., 2012). The Reactome data model allows for annotation of small molecules, toxicological agents, and their specific modes of action. Pathway data visualization is facilitated by the Reactome Pathway Browser, a Systems Biology Graphical Notation (SBGN)-based interface (Le Novere et al., 2009), which exploits the Proteomics Standard Initiative Common QUery InterfaCe (PSICQUIC) web services (Aranda et al., 2011) to overlay molecular interaction data from external interaction databases. Overlaying interaction data from the ChEMBL or DrugBank (Gaulton et al., 2012; Knox et al., 2011) databases of bioactive drug-like compounds provides an opportunity to identify protein variant-drug interactions, novel small-molecule targets, off-target effects, or pharmaceuticals that can perturb or moderate reactions or PoT. Reactome also provides the Functional Interaction (FI) network plug-in for Cytoscape, which can identify gene network patterns related to diseases, including cancer (Wu and Stein, 2012). Future expansion of the Reactome pathway database and the FI network with interactions based upon PoT should significantly improve coverage, enrich the functional annotations supported, and enhance the functionality of the pathway and network analyses.
MetaCore™ from Thomson Reuters (formerly GeneGo) is a commercial systems biology platform for network and pathway analysis. MetaCore includes a large, manually-curated database of molecular interactions (protein-protein, compound-protein, enzyme-reaction, reaction-substrate, miRNA, etc.), and tools to flexibly reconstruct and analyze biological networks. MetaCore also contains over 800 Canonical Pathway Maps – interactive visual representations of precise molecular pathways for well-characterized and annotated biological, metabolic, disease, and toxicological processes. At this time, 260 of these maps, covering a wide range of pathways relevant to toxicological and disease processes, have been made freely available at http://pathwaymaps.com.
However, many of these existing pathway and network mapping tools are more suited to hypothesis generation and do not provide the necessary precision and reproducibility for predicting full dose-dependent in vivo toxicity in man that will be required for PoT to become a useful tool for regulators. Validating PoT will likely require a sustained, coordinated effort to generate the necessary datasets to benchmark and provide context to the scoring of PoT.
Furthermore, we will need to develop tools suitable for systems toxicology, with the aim of validating them for regulatory purposes. As part of this effort, an evidence-based toxicology collaboration (EBTC, http://www.ebtox.com) has been established, which brings together agency representatives, individuals from the corporate sector, and those promoting the paradigm shift in toxicology (Zurlo, 2011). Evidence-based toxicology uses concepts learned from evidence-based medicine, mechanistic/molecular toxicology, biostatistics, and validation to bring the necessary consistency and objectivity to the process. Moreover, evidence-based toxicology can help concisely summarize existing evidence on a specific topic so that experts and non-experts can use an EBT-assessed PoT database for decision-making in a regulatory context. Notably, EBT has embarked on developing validation concepts for 21st century tools (Hartung, 2010b; Judson et al., 2013; Hartung et al., 2013a).
Many obstacles remain before a comprehensive, PoT-based toxicology can be realized. Some of them are technological: while transcriptomics is a mature technology, metabolomics is just beginning to contribute to systems toxicology (Bouhifd et al., 2013), and some technologies – such as phosphoproteomics – remain in their infancy (van Vliet, 2011). Furthermore, even though gene and protein networks are relatively complete for humans (Tang et al., 2012; Taylor and Wrana, 2012), such “hairball” networks tell only a limited story, and it is difficult to extract complete, concise pathways from them or to take dose, spatial, and temporal effects into account. In particular, causality with respect to predicting target organ specificity needs to be addressed (Hartung et al., 2013a). New methodologies for determining dose-response with high-throughput, high-content data and a PoT-based approach will have to be evaluated. It may then be necessary to bootstrap our way from what we know to what we do not know in an iterative process. The workshop participants agreed, however, that we do not need to know every detail of a pathway to use it in the context of a PoT, but we do need to establish fit-for-purpose principles.
Depending on the specific PoT, it may also be necessary to address the question of what types of data will be included and how the data will be integrated. Combining datasets of transcriptomics, metabolomics, and other omics still represents a challenge, although some progress in the application of systems biology approaches to such cross-domain data integration in toxicology has already been made (e.g., Xu et al., 2008). Integrating biomarker and epidemiology data will require new ways to turn the surfeit of existing data into useful information.
Other challenges will involve a dedicated process of consensus building in the toxicology community to develop a useful ontology and structured vocabulary to facilitate sharing information. And lastly, it will require new tools and concepts within the risk assessment community as toxicology moves away from older paradigms into a more probabilistic approach (Hartung et al., 2013b, 2012).
The creation of a PoT database will make it necessary to form and coordinate a larger consortium and link it to the development of the necessary concepts. Central steering needs to be established, incorporating the ideas of opinion leaders and the needs of stakeholders, especially regulators, who ultimately have to accept the changes derived from novel approaches (Hartung, 2009). Regulators therefore need a seat at the table to provide input into the processes from the very beginning. The governance of such a consortium effort needs to be established, as do the quality assurance (validation), the comparison to current approaches, and the possible transition. CAAT and its partners are currently trying to form such a consortium to define and set up a public resource for PoT information.
The vision presented here takes advantage of innovations afforded by our rapidly evolving understanding of systems biology and a host of molecular, informational, and computational tools. Toxicity testing today is much like cartography before the development of satellites – islands of well-described territory alongside vast oceans about which little is known; it could be said that even the extent of the unmapped territory is unknown. A mapped human toxome, available in a PoT database, would provide the necessary perspective to bring toxicology into the 21st century.
Freeman Dyson (Princeton), in his book The Scientist as Rebel, said: “The great advances in science usually result from new tools rather than from new doctrines” (Dyson, 2006, p 805). The map of the human toxome, available in a PoT database, promises to be such a new tool.
This CAAT workshop on Pathways of Toxicity was made possible by support from Unilever and the extensive discussions and experiences of the NIH transformative research project on “Mapping the Human Toxome by Systems Toxicology” (R01ES020750) and FDA grant “DNTox-21c Identification of pathways of developmental neurotoxicity for high throughput testing by metabolomics” (U01FD004230).
*a report of t4 – the transatlantic think tank for toxicology, a collaboration of the toxicologically oriented chairs in Baltimore, Konstanz, and Utrecht, sponsored by the Doerenkamp-Zbinden Foundation (DZF). The workshop was organized and sponsored by CAAT, DZF, and Unilever.
General disclaimer: The opinions expressed in this report are those of the participants as individuals and do not necessarily reflect the opinions of the organizations they are affiliated with; participants do not necessarily endorse all recommendations made.