Metabolomics, the comprehensive analysis of metabolites in a biological system, provides detailed information about the biochemical/physiological status of a biological system, and about the changes caused by chemicals. Metabolomics analysis is used in many fields, ranging from the analysis of the physiological status of genetically modified organisms in safety science to the evaluation of human health conditions. In toxicology, metabolomics is the -omics discipline that is most closely related to classical knowledge of disturbed biochemical pathways. It allows rapid identification of the potential targets of a hazardous compound. It can give information on target organs and often can help to improve our understanding regarding the mode-of-action of a given compound. Such insights aid the discovery of biomarkers that either indicate pathophysiological conditions or help the monitoring of the efficacy of drug therapies. The first toxicological applications of metabolomics were for mechanistic research, but different ways to use the technology in a regulatory context are being explored. Ideally, further progress in that direction will position the metabolomics approach to address the challenges of toxicology of the 21st century. To address these issues, scientists from academia, industry, and regulatory bodies came together in a workshop to discuss the current status of applied metabolomics and its potential in the safety assessment of compounds. We report here on the conclusions of three working groups addressing questions regarding 1) metabolomics for in vitro studies, 2) the appropriate use of metabolomics in systems toxicology, and 3) the use of metabolomics in a regulatory context.
Metabolomics is the comprehensive analysis of hundreds of metabolites in a biological sample; it provides detailed information on the physiological status of a living organism, a cell, or a subcellular compartment at any given moment. The analytes of interest are the small endogenous molecules, such as carbohydrates, amino acids, nucleotides, phospholipids, steroids, or fatty acids and their derivatives, which are produced and/or transformed by cells as a result of cellular metabolism (Lindon et al., 2004; Patti et al., 2012). Since these metabolites directly reflect the biochemical processes of the system under investigation, their analysis offers the opportunity not only to gain insight into the activity of biochemical pathways giving a particular metabolite profile, but also into the alteration of such pathways. As such, metabolomics can be used to study human physiology not only under normal conditions but also in pathological situations.
This opens up the possibility for its application in clinical settings; as such an approach allows the monitoring of treatment success at a very early stage. This is essential given the rise of combinatory multi-drug treatment scenarios. In this context, metabolomics has been expanding in scope from a basic research approach to an applied science, not only in medicine but also in the fields of biotechnology and toxicology (Bouhifd et al., 2013; Jungnickel and Luch, 2012; Jungnickel et al., 2012; Llorach et al., 2012; Nguyen et al., 2012; van Ravenzwaay et al., 2012; Zhang et al., 2012b). For instance, in biotechnology, metabolomics offers the possibility of assessing the relationship of a genetic modification(s) to a specific desired phenotype in an effort to determine the critical biochemical pathways involved. For example, it allows identification of increased activation of certain metabolic pathways for improved yield or production (Kim et al., 2012). In clinical medicine and pharmacology, metabolomics is becoming an established tool for the identification of pathologies through the use of more relevant biomarkers (Patti et al., 2012; Rhee and Gerszten, 2012).
Metabolomics is ideally positioned to address the challenges of toxicology in the 21st century (tox-21c). It represents a powerful tool for collecting rich mechanistic information indicating not only the extent of a toxic insult but also its underlying mechanisms.
From the currently available data it seems that metabolomics information can be more easily related to classical toxicological endpoints used in animal studies than, e.g., transcriptomics data. One reason for this may be that the changes in the metabolic profile are often “downstream” of those initial changes that occur at the level of the genome, transcriptome, and proteome (van Ravenzwaay et al., 2007). In addition, the relatively small number of metabolites (i.e., hundreds/thousands) present in a tissue or bio-fluid, compared to the tens of thousands of transcripts, or hundreds of thousands of proteins can be advantageous when working to determine meaningful changes associated with a toxic effect (Strauss et al., 2012; van Ravenzwaay et al., 2007). For toxicological studies, the fact that extracellular metabolites somehow reflect the intracellular situation is of major importance. This is the basis for non-invasive or minimally-invasive sampling of body fluids (blood, urine, etc.), and metabolomics analysis on such samples to gain information on target organ toxicities that would otherwise only be identifiable by highly invasive (histopathological) methods (Ebbels et al., 2007; Lindon et al., 2003). Also, time course studies within one study subject or animal are greatly facilitated by this particular advantage of the metabolomics approach (Ebbels et al., 2007; van Ravenzwaay et al., 2007, 2012).
In addition to providing information on a large number of metabolites in one measurement from body fluids, tissues, or whole organisms (e.g., fungi, aquatic organisms), metabolomics has been applied to in vitro cell systems for understanding drug effects (Balcke et al., 2011; Strigun et al., 2011a,b). First pilot studies indicate that high-throughput chemical screening will be an important future application of metabolomics approaches (http://www.stemina.com). Finally, new imaging techniques are not only capable of locating environmental toxicants within biological systems but can be used in combination with metabolomics approaches to describe specific toxicological effects within cells (Haase et al., 2011; Tentschert et al., in press).
Due to the increasing use of metabolomics in toxicology and safety sciences, a workshop was organized in Berlin on February 14–15, 2012. Scientists from academia, industry, and regulatory bodies discussed the current status of this approach and its present/future applicability. One day prior to the workshop, an international symposium was organized by BASF/CAAT-Europe to present the state of the art regarding the use of metabolomics for addressing a variety of pertinent toxicological questions. Participants identified several hurdles in the wider application of metabolomics in safety assessments and for in vitro compound screening. This paved the way for in-depth discussions on these issues in the workshop that followed. Here, we summarize the result of these discussions and offer solutions for successfully moving forward with this important area of research.
The application of metabolomics in vitro is an emerging theme that has been driven mostly by two major factors: (1) a better understanding of the biochemical changes provoked by a toxic insult in a defined and controllable experimental system and (2) the increasing need to move towards the use of human-relevant non-animal alternatives in toxicology in accordance with policies endorsing the 3Rs concept (Reduction, Refinement, and Replacement of animal testing). Special challenges for the application of metabolomics in vitro can be summarized as 1) different requirements of models, 2) quality criteria and quality control, 3) application areas, 4) investigation strategies, 5) technical challenges of the analysis, and 6) extrapolation to the in vivo situation.
Apart from the evident benefits of reducing animal testing and getting better insights into the molecular targets of xenobiotics and their mode of action (MoA), the application of metabolomics to in vitro systems allows the application of this approach at a high throughput level. Due to the increasing interest in in vitro systems combined with metabolomics, a working group specifically discussed the current uses, overall strengths, and pitfalls of in vitro metabolomics. The major topics that were discussed are:
The subchapters below give a summary of the discussion.
As for other methods, standardization is essential for comparability and reproducibility of results. The in vitro metabolomics approach faces difficulties similar to those of other in vitro approaches with respect to the heterogeneity and special requirements of the experimental models (Hartung, 2007). These concerns can be classified according to their level of complexity and the handling requirements as:
In addition, model organisms, such as zebrafish (D. rerio) and nematodes (C. elegans) often are used in an in vitro manner, but with the advantage of a complete living organism also endowed with metabolic capacity. The impact of handling and special requirements of the in vitro systems should be subjected to routine evaluation. For instance, changes in central metabolism also can be due to changes in the cell culture conditions rather than to the chemical insult. Therefore, profound knowledge of the system used and performance of controls for all potential influence factors of the in vitro model are crucial (see also Good Cell Culture Practice guidance, Coecke et al., 2005).
The relevance and reproducibility of in vitro data depends strongly upon the quality of the test system and its analytical endpoint (Hartung, 2009a; Kadereit et al., 2012; Leist et al., 2010, 2012a). Therefore, establishment of a complete set of quality criteria and guidelines is crucial for the acceptance and further use of in vitro metabolomics approaches. Such criteria can be classified into three categories, (1) general requirements (as also defined for Good Cell Culture Practice) (Coecke et al., 2005; Hartung et al., 2002), (2) criteria specific for the use of model systems for the purpose of metabolomics analysis, and (3) criteria referring to the quantification of in vitro metabolomics data.
For metabolomics, i.e., an approach that aims at a simultaneous determination of hundreds of metabolites, quality criteria are crucial in order 1) to avoid artifacts and 2) to facilitate the comparison of generated data. Establishment of reference systems with reference standards for metabolomics are essential for research purposes and for the credibility of this approach (Holmes et al., 2010). Therefore, the most important issues include the definition and the availability of negative and positive test controls. Notably, standards will allow a clear communication of results and integration of metabolomics data with other -omics approaches, e.g., proteomics and transcriptomics (Holmes et al., 2010).
Moreover, quality criteria should cover the definition of the test and its respective acceptance criteria; sterility issues; assurance of the identity and condition of the cells (e.g., cell aging or spontaneous mutations); measures for the ratio of cell types and differentiation stages; adequate measures for viability assessment; availability of in vitro bio-kinetics data, including determination of the free concentrations of test components and information on their cellular concentrations; and the contribution of metabolism in the case of metabolically active systems.
Apart from the biological variability, analytics and sample processing also are potential sources of variability. Therefore, the group agreed that there is an urgent need for harmonization of metabolomics protocols, allowing integration of quality criteria not only at the biological level, but also at the analytical level. Moreover, it is also relevant to have a quality control for measurement and data processing bias.
All information about technical details, such as the extraction methods and the storage methods, should be compiled in standard operating procedures. Ideally, other responses of the model will be evaluated in parallel with metabolomics data (e.g., cell morphology and/or viability parameters). This allows the anchoring of the metabolomics data set to physiologic or functional responses. In this respect, the presence of essential targets, signaling pathways, and response features needs to be verified. In this way it can be ensured that the changes in the metabolic profile are related to specific physiological conditions. Another important but frequently neglected criterion is whether the chosen in vitro model can be positioned reasonably within a decision tree, i.e., how the information obtained from the model can contribute to an overall evaluation or strategy.
This is still an area of active development, and generally applicable solutions are not available. Differences exist between the analysis of the metabolome in medium (cell supernatant) and in cells/tissues. In the first case, leakage from cells needs to be controlled, but sample spiking with isotope-labeled reference standards is easier than for the analysis of intracellular metabolites. Sets of standards covering different pathways may be used (Roede et al., 2012). Different normalization approaches have been tried for measurements within cells.
These include the use of “house-keeping” metabolites or combinations thereof, or the introduction of isotope labeled standards. The availability of a “housekeeping” metabolite or any other form of normalization standard is of utmost importance to correct for errors and variation in the cell number, in cell harvesting, during the extraction procedure, during sample processing, in detection sensitivity, and in many other steps related to the overall analysis of the metabolome. Further features required for a robust and reproducible quantification are a stable baseline metabolite pattern and a reproducible response to positive controls.
Metabolomics is a versatile approach with multiple potential applications in drug discovery and safety profiling. The in vivo metabolomics approach has already proven advantageous, not only in clinical applications but also in toxicology. Examples include the identification of toxic modes of action (van Ravenzwaay et al., 2012) and toxicological screenings that provide insights into the potential toxic effects of chemicals under development, thereby accelerating decision-making processes (Kolle et al., 2012). Potential uses of the in vitro metabolomics approach include the following:
A major goal of the field is the use of metabolomics information in comparison to known standards to predict actions of unknown chemicals in biological systems (Rusyn et al., 2012). To develop such models, metabolite patterns related to well known training compounds would be used to develop classification schemes. These would then be applied to the metabolome patterns triggered by unknown compounds in order to predict their toxicological hazard (Fig. 1).
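The pattern-comparison step described above can be sketched as a nearest-centroid classification. All compound classes, metabolite values, and the three-metabolite profiles below are hypothetical placeholders, not data from the workshop:

```python
import numpy as np

# Hypothetical reference profiles: rows = training compounds with a known
# mode of action, columns = log2 fold changes of measured metabolites.
reference = {
    "hepatotoxic": np.array([[1.2, -0.8, 0.1], [1.0, -1.1, 0.3]]),
    "nephrotoxic": np.array([[-0.5, 0.9, 1.4], [-0.7, 1.2, 1.1]]),
}

def classify(profile):
    """Assign an unknown metabolite profile to the nearest class centroid
    (Euclidean distance); a stand-in for the pattern-comparison step."""
    centroids = {cls: mat.mean(axis=0) for cls, mat in reference.items()}
    return min(centroids, key=lambda c: np.linalg.norm(profile - centroids[c]))

unknown = np.array([1.1, -0.9, 0.2])
print(classify(unknown))  # -> hepatotoxic
```

In practice, far richer classification schemes (e.g., regularized multivariate models trained on hundreds of metabolites) would replace this minimal distance rule, but the logic of matching an unknown pattern to known training compounds is the same.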
A slightly less ambitious application would be relative ranking, i.e., metabolomics would provide relative information within a group of compounds to rank them, e.g., according to toxicity and to facilitate decisions on further development.
This approach can help in understanding the effect of chemicals on the complexity of metabolic networks (Roede et al., 2012). Furthermore, this is expected to lead to the discovery of the metabolic pathways that are perturbed by the chemical. Such information would help to pinpoint potential targets of chemicals and drugs and to predict their mode-of-action, as demonstrated by recent studies (Strigun et al., 2011a, 2012). Recently, simple metabolome analysis was shown to be useful to classify drugs into MoA classes (Strigun et al., 2011b).
In an extension of the mode-of-action approach, metabolomics can be used for the mapping of toxicity-related pathways (Hartung and McBride, 2011; Hartung et al., 2012; Shintu et al., 2012). The essential challenge is the identification of the critical pathways that lead to toxicity, as opposed to other chemical-induced changes that are only adaptations, counter-regulations, or epiphenomena (Andersen et al., 2011; Bhattacharya et al., 2011; Boekelheide and Andersen, 2010; Hartung et al., 2012). The pathways of toxicity (PoT) may be specific for the cell types and model systems, but some may allow general predictions from in vitro to in vivo.
One of the great expectations is that information from different in vitro models will allow predictions of potential target organs of toxicity in vivo (Zidek et al., 2007). This would require the identification of responses and of the activation of PoT in a concentration-dependent manner and in different systems predictive of processes in various organs. Moreover, background information on the relevant metabolite changes or activation of PoT in vivo would be required for various target organs, as well as for various modes of toxicity that may affect them (e.g., classifying hepatotoxicants as producing cholestasis, hyperlipidosis, or necrosis in the liver).
Upon exposure of a model system to a chemical, multiple changes take place; the shift in metabolite patterns will depend on the test concentration and exposure time. It will be important to identify for each model the type of change (i.e., the combination of metabolites and the extent of their change) that predicts toxicity. The conditions leading to these changes (i.e., the free concentration of the chemical or its key metabolite) can then serve as the point of departure (PoD): a concentration of a test chemical that results in a significant change in the in vitro system and is considered predictive for the in vivo situation. The PoD is used in in vitro-in vivo extrapolation (IVIVE) calculations to determine the relevant in vivo dose or exposure.
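A minimal sketch of deriving such a point of departure from concentration-response data. The concentrations, fold changes, and the 1.5-fold threshold are all hypothetical, and a real benchmark-concentration analysis would fit a dose-response model rather than pick the lowest exceeding test concentration:

```python
# Hypothetical concentration-response data for one key metabolite:
# tested concentrations (µM) and the fold change each produced in vitro.
concentrations = [0.1, 1.0, 10.0, 100.0]
fold_changes = [1.02, 1.10, 1.65, 2.40]

def point_of_departure(concs, responses, threshold=1.5):
    """Return the lowest tested concentration whose response exceeds the
    threshold fold change; a simplified stand-in for a benchmark-
    concentration analysis."""
    for c, r in zip(concs, responses):
        if r >= threshold:
            return c
    return None  # no significant effect within the tested range

print(point_of_departure(concentrations, fold_changes))  # -> 10.0
```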
Not only may endogenous metabolites be detected, but other questions may be addressed as well: What are the concentrations of different xenobiotics’ metabolites, and how do they change over time?
Human cells with different genetic backgrounds may vary in their responses to toxicants (Ingelman-Sundberg, 2001). Metabolomics may be useful to identify such differences. Information gained therefrom will be useful to model subpopulations with different susceptibilities.
In the case of distinct reactions of cells from different species, metabolomics may help identify the reasons and consequences of inter-species variability. Such knowledge would improve species extrapolations, e.g., from rodents to man.
Metabolomics approaches also may open the door to human-relevant research on idiosyncratic toxicity; such toxicities occur when a convergence of risk factors (disease, age, gender, co-medications, nutritional status, physiological status, microbiome, and genetic predispositions) disturbs the otherwise stable homeostasis and allows adverse chemical effects at otherwise innocuous concentrations (Clayton et al., 2006; Coen et al., 2009). Metabolomics studies allow insight into the cellular homeostasis under different experimental conditions, and the data may explain the conditions under which unexpected toxicities would occur.
Standard metabolomics methods measure concentrations of metabolites – “frozen” at a certain time point – but not the speed of their turnover. Knowledge of the complete set of metabolites is not enough to predict the phenotype, especially for higher cells in which the distinct metabolic processes involved in their production and degradation are finely regulated and interconnected. In these cases, quantitative knowledge of intracellular fluxes is required for a comprehensive characterization of metabolic networks and their functional operation (Cascante and Marin, 2008). Under given homeostatic conditions, for instance, a glycolysis metabolite may show similar concentrations but, e.g., low turnover when mitochondria are functioning and high turnover when mitochondria are unable to meet energy requirements. For instance, depletion of ATP in livers due to high fructose exposure does not yield information on whether glycolysis or mitochondria are affected, while fluxomics would deliver clear results (Latta et al., 2000). Metabolic flux analysis has been used, e.g., to study drug effects on the metabolome of HepG2 cells (Niklas et al., 2009). By using isotope-labeled substrates of metabolism in combination with time series experiments, information on the turnover – fluxes – in different pathways can be obtained. The use of isotope labeling in metabolomics and fluxomics has been recently reviewed (Klein and Heinzle, 2012). This type of metabolomics data covers an important aspect relevant to chemical hazards, which is necessary for a systems biology type of modeling but does not yet represent a routine approach in toxicology (Hartung et al., 2012).
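The turnover information that concentration snapshots miss can be recovered from isotope-labeling time series. A minimal sketch, using synthetic data and assuming simple first-order label incorporation E(t) = 1 - exp(-k·t), which real metabolic flux analysis generalizes considerably:

```python
import numpy as np

# Hypothetical label-enrichment time series for one metabolite after
# switching to a 13C-labelled substrate (fraction labelled, times in h).
t = np.array([0.5, 1.0, 2.0, 4.0])
enrichment = 1.0 - np.exp(-0.8 * t)  # synthetic data, true k = 0.8 per h

def turnover_rate(t, enrichment):
    """Estimate the first-order turnover constant k from label incorporation,
    assuming E(t) = 1 - exp(-k t): a linear fit of -ln(1 - E) against t,
    forced through the origin."""
    y = -np.log(1.0 - enrichment)
    return float(np.sum(t * y) / np.sum(t * t))  # least-squares slope

print(round(turnover_rate(t, enrichment), 2))  # -> 0.8
```

Two metabolites with identical concentrations but k differing tenfold would thus be distinguished, which is exactly the information plain metabolite profiling cannot provide.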
Compared to the sequential measurement of individual endpoints, large increases in the speed of data acquisition are to be expected. The simultaneous availability of data on a large number of metabolites also is likely to increase the sensitivity.
The qualitative and quantitative analysis of the metabolome in vitro opens the opportunity for discovering biomarkers, which could be used for diagnostic purposes; other metabolites may be useful as biomarkers for the efficacy of drugs, and/or they may help to quantify the progression of human relevant diseases or the extent of organ damage.
A still unsolved large challenge of in vitro toxicology is the understanding of communication between cells that contributes to adverse effects. This is particularly important for interactions involving inflammatory and non-parenchymal cells. For instance, interactions between neurons and glial cells (Falsig et al., 2004; Henn et al., 2009, 2011; Kremer et al., 2010; Schildknecht et al., 2012; Hirt et al., 2000) involve multiple metabolite exchanges. Communication in co-cultures may be bidirectional, and the overall response would not be understood from the reaction of single cells alone (Gantner et al., 1996). Metabolomics approaches may help elucidate chemical communication between cells in the context of adverse reactions.
The combination of metabolomics with good in vitro models has great potential for the field of 3Rs (Hartung and Leist, 2008; Leist et al., 2008a,b). In vivo metabolomics already substantially contributes to the 3Rs principle by reducing animal testing through refinement (Kolle et al., 2012). In vitro predictions may lead directly to the replacement of animals, as well as to the improvement of the chemical risk assessment of pharmaceuticals and environmental toxicants. The rich information also would help to optimize in vivo testing. For instance, relevant endpoints could be chosen and the study design optimized. This would lead to a reduction in the use of animals. The use of early biomarkers also would shorten studies and thus lead to a refinement.
Two different strategies may be followed for the use of metabolomics in safety evaluations. The more long-term perspective is that a large set of rich data, comprising metabolomics and transcriptomics information, would be sufficient on its own to predict potential hazard. One requirement would be broad background knowledge of systems toxicology and the human toxome. This would be realistic in the more distant future.
In the immediate future the strategy will not be based on vast biological background data but rather on pattern comparison. Reference compounds will be tested in an in vitro model battery. The metabolome analysis of unknown compounds then will be aligned with the known pattern of the reference material. Here also, data from other -omics approaches can be implemented into the alignment pattern (Fig. 1).
To promote the widespread application of in vitro metabolomics, several technical challenges need to be solved.
Metabolomics is particularly challenging with respect to quality control, as the data set obtained is the result of a multi-step process. Each of these steps can create potential artifacts. It also should be noted that the in silico handling of large amounts of data requires a defined, quality-assured workflow. Also, sample preparation steps are critical, as the desired metabolites are usually embedded in biological matrices. Thus, metabolites have to be extracted without compromising their structure and concentration. Some metabolic processes are so fast that the metabolite pattern may change during sample preparation. Every in vitro model comes with its own particular issues concerning quality control. Thus, guidelines and SOPs both require continuous adaptation.
Different challenges apply depending on whether samples are collected extracellularly (cell supernatants reflecting the secretome) or intracellularly (cell lysates reflecting the endogenous metabolome), and large differences exist between the analysis of the two. For extracellular metabolites, contamination of the supernatant by cells or cell debris, e.g., through leakage, can be controlled by gentle centrifugation of the samples. The intracellular case is actually the more challenging one: processing must be carried out with high speed in order to avoid changes of the intracellular metabolite concentrations, but also contamination with exogenous metabolites from cell culture media. Therefore, washing steps have to be included, which can delay the processing of the samples, and different washing procedures can themselves have different effects on cellular metabolites. The choice of sampling time points poses particular challenges, as the cellular reaction to toxicants changes over time through the activation of counter-regulatory pathways.
Speed of sample processing prior to quenching also is a crucial step, since metabolite concentrations can drastically change in a very short period of time. Therefore, the fast “freezing” of the biochemical processes by sample quenching is of high relevance for obtaining reliable data. In addition, depending on the in vitro system, more technical steps could be included, affecting the quality and consistency of the analyzed metabolites.
The strength of in vitro systems is the control and easy variation of parameters. To fully profit from these features, a large number of samples need to be measured. Increase in sample throughput (with respect to sample preparation, measurement, and analysis) is a major factor determining future widespread use of in vitro metabolomics.
As for other -omics approaches, undersampling (too low a sample number) leads to overfitting. Simply put, statistical tables contain too many columns (endpoints) compared to rows (sample replicates). The choice of the right sample size is essential for a conclusion regarding whether a marker behaves differently from the controls or not. The sample size should be determined from preliminary experiments in which different sample replicates are set and the internal variability among samples is used to estimate the number of replicates needed to achieve statistically significant results. Statistical rules suggest that the sample size needs to be matched to the number of metabolites and to the required statistical power. For the estimation of the right sample size for metabolomics approaches, some in silico tools can be recruited, e.g., the programs named samr, ssize, and ssize.fdr. Notably, the variables for these estimation tools have to be well chosen, e.g., the number of measured metabolites and the relative abundance of the metabolite concentrations. In practice, it often will not be possible to adhere completely to the stringent rules of statistics. Compromises need to be found that still allow technical feasibility.
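To illustrate why the sample size must grow with the number of metabolites, here is a simplified two-group estimate using a Bonferroni correction for multiplicity. The effect size, variance, and metabolite count are hypothetical, and dedicated tools such as ssize.fdr use FDR-based criteria rather than this conservative adjustment:

```python
from statistics import NormalDist

def samples_per_group(delta, sigma, n_metabolites, alpha=0.05, power=0.8):
    """Two-sample, per-group sample size for detecting a mean difference
    delta (with per-group SD sigma) in one metabolite, Bonferroni-adjusted
    for testing n_metabolites endpoints simultaneously. Illustrative only."""
    nd = NormalDist()
    alpha_adj = alpha / n_metabolites          # Bonferroni adjustment
    z_a = nd.inv_cdf(1 - alpha_adj / 2)        # two-sided critical value
    z_b = nd.inv_cdf(power)                    # power quantile
    n = 2 * ((z_a + z_b) * sigma / delta) ** 2
    return int(n) + 1                          # round up

# Detecting a 1-SD shift in any of 200 measured metabolites needs roughly
# four times as many replicates as testing a single pre-chosen endpoint:
print(samples_per_group(delta=1.0, sigma=1.0, n_metabolites=200))
print(samples_per_group(delta=1.0, sigma=1.0, n_metabolites=1))
```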
In order to identify the robustness and reproducibility of the system it is essential to understand the baseline metabolite pattern of the cell system under the standard conditions of culturing and prior to a toxic insult. Only by doing so can reliable parameters be defined for comparison when applying a toxic compound and changing the metabolomics profile.
Cellular metabolites can be present at concentrations spanning at least 6 orders of magnitude, but they cannot be amplified like DNA, and they are chemically much more diverse than proteins. This poses a particular analytical challenge for metabolomics, and the sensitivity of the method to cover low abundance metabolites needs to be increased considerably.
Normalization allows reduction of the potential variability among replicates or experimental samples, e.g., due to viability changes of cells. Normalization will correct for slight changes in cell harvesting, during the extraction procedure, during sample processing, or in detection sensitivity. Normalization parameters are essential for analysis and comparison of in vitro metabolomics data. Different options in this regard include the use of external standards, such as protein concentrations, quantification of cell number, as well as the use of internal references, such as “housekeeping” metabolites (Ruiz-Aracama et al., 2011). Instead of external standards or internal “housekeeping” metabolites, intra-sample normalization based on overall metabolite quantity could be performed. For this procedure, exclusion of contaminants and artifacts is crucial. For instance, plasticizers may be present in varying amounts in the “metabolite” spectrum. Such contaminants would spoil normalization to a sum of total metabolites.
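The intra-sample option mentioned above, total-sum normalization with contaminant exclusion, can be sketched as follows. The metabolite names, peak areas, and the plasticizer peak are hypothetical:

```python
# Hypothetical raw peak areas for one sample; keys flagged as contaminants
# (e.g., plasticizer peaks) are excluded before total-sum normalization.
raw = {"glucose": 400.0, "lactate": 300.0, "alanine": 100.0,
       "phthalate_peak": 200.0}
contaminants = {"phthalate_peak"}

def total_sum_normalize(peaks, exclude):
    """Scale each metabolite to the summed intensity of all non-contaminant
    peaks, so samples with different cell numbers become comparable."""
    total = sum(v for k, v in peaks.items() if k not in exclude)
    return {k: v / total for k, v in peaks.items() if k not in exclude}

norm = total_sum_normalize(raw, contaminants)
print(norm["glucose"])  # -> 0.5
```

Had the phthalate peak been left in the sum, every metabolite fraction would have been deflated by its varying contribution, which is exactly how contaminants spoil this normalization.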
Metabolites are a chemically extremely diverse group of compounds. They range from highly charged phosphoesters or organic cations to extremely hydrophobic lipid constituents. Moreover, many metabolites exist as isomers or epimers that need to be separated. At present, a combination of analytical approaches is used to target metabolites with different physiochemical properties. This requires different analytical technologies, e.g., NMR, GC-MS, UPLC-ESI-MS/MS, TLC/GC-FID, DFI MS/MS. A big technical challenge is to optimize the technology in a way to allow analysis of all metabolites with only few methods.
A metabolomics-specific database is still lacking, but some technology providers, such as Agilent and Bruker, have started providing first solutions. These need to meet several challenges, i.e. (a) identification of metabolites: It is still common that the analysis yields a large number of metabolite peaks that cannot be unequivocally assigned to a chemical structure; (b) assignment of identified metabolites to known metabolic pathways or PoT; (c) combination of metabolite information with other data, e.g., transcriptomics.
The traditional perception of a metabolic pathway is a sequence of steps leading from an educt to a product. The analysis of metabolic pathways aims to determine the concentration and the fate of the relevant molecules at every stage of the procedure. The challenge arises from the fact that pathways are not linear, one-way roads but rather should be seen as parts of an intricate metabolic network. In this sense, each analyzed molecule is a node of such a network. In cells exposed to toxicants such nodes may change (altered metabolite concentration). Alternatively, nodes may remain relatively constant, but the connections originating from them may change (altered metabolic flux).
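The node-versus-edge distinction can be made concrete with a toy network. All metabolite names, concentrations, and flux values below are invented for illustration:

```python
# A toy metabolic network: node -> metabolite concentration, and
# (node, node) -> flux through the connecting reaction, for a control
# and a (hypothetical) toxicant-treated condition.
control_nodes = {"glucose": 5.0, "pyruvate": 1.0, "lactate": 2.0}
treated_nodes = {"glucose": 5.1, "pyruvate": 1.0, "lactate": 2.1}
control_flux = {("glucose", "pyruvate"): 3.0, ("pyruvate", "lactate"): 1.0}
treated_flux = {("glucose", "pyruvate"): 3.1, ("pyruvate", "lactate"): 2.5}

def changed(reference, measured, rel_tol=0.2):
    """Keys whose value differs from the reference by more than rel_tol
    (relative change); works for node concentrations and edge fluxes alike."""
    return sorted(k for k in reference
                  if abs(measured[k] - reference[k]) / reference[k] > rel_tol)

print(changed(control_nodes, treated_nodes))  # -> []
print(changed(control_flux, treated_flux))    # -> [('pyruvate', 'lactate')]
```

Here the nodes appear unperturbed while one connection carries a strongly altered flux, the situation the text describes in which concentration measurements alone would miss the effect.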
The ultimate challenge of the in vitro metabolomics approach is the extrapolation of the in vitro data to obtain relevant information for the in vivo situation. This will require further advances in the field of physiology-based pharmacokinetic modeling (Blaauboer et al., 2012; Leist et al., 2012b; Louisse et al., 2012; Prot and Leclerc, 2012).
More immediate goals will be to provide qualitative information on, for instance, what a potential target organ may be or whether developmental toxicity is to be expected (Kleinstreuer et al., 2011). The overall vision is that in vitro metabolomics facilitates qualitative in vivo predictions. For instance, key metabolites (or groups thereof) may be selected that predict in vivo toxicity (Yoon et al., 2012). Their concentrations would be used to define benchmark concentrations to be used as the point-of-departure for IVIVE. With the definition of the points-of-departure and the employment of PBPK-modeling (ADME), the NOAEL for the in vivo situation can be calculated.
The challenge of information-rich technologies (high-throughput and high-content; for overview see (van Vliet, 2011)) is to make sense of extremely large datasets. This requires the integration of data, likely from different technologies and test systems (Leist et al., 2012b). Systems biology proposes to make use of our increasing understanding of biological systems, i.e., how the different endpoints are physiologically interconnected. In the end, it attempts to model the dynamics of the biological system (especially on a biochemical and molecular biology level) and its response to perturbations such as disease. For toxicology, an analogous approach, i.e., a “systems toxicology,” could be envisaged (Hartung et al., 2012) in which the impact of an agent on the biological system is modeled. This concept represents a move from black-box models of effects (from apical endpoints), where effects are recorded without understanding the underlying mechanisms, to an approach based on knowledge of the MoA. The 2007 report of the US National Research Council called for exactly this (NRC, 2007). The buzzword “Toxicology for the 21st Century” (Tox-21c), or similarly “Toxicity Testing for the 21st Century,” has been taken up to describe the variety of activities implementing the report. Among them, an NIH-funded initiative to map the human toxome by systems toxicology is attempting to create a process for pathway-of-toxicity annotation and sharing (http://www.humantoxome.com). To enable a systems toxicology approach and to allow quantitative modeling, we have to move beyond qualitative MoA knowledge and instead describe molecularly defined pathways. The abbreviation PoT, for pathway-of-toxicity, has been coined to differentiate PoT from MoA/toxicity pathways, which are typically defined in a narrative way (Blaauboer et al., 2012; Hartung and McBride, 2011).
The opportunities lying in such a systems toxicology approach were discussed intensively in the consensus process to a roadmap for replacing systemic toxicological animal testing (Basketter et al., 2012).
The established networks within an organism, which form the basis for modeling in systems toxicology, are based on molecular biology and biochemistry. Transcriptomics in all of its variants, including the increasing use of deep sequencing technologies, is the key approach for the molecular biology part, with a minor additional contribution by proteomics studies. Metabolomics is the core approach for the biochemistry part of this modeling. In this sense the advent of metabolomics in toxicology represents a “kick-start” into systems toxicology.
This can be initially viewed as a mostly academic exercise aimed at the generation of new knowledge that is not aimed at a specific regulatory purpose. However, society has large expectations of toxicology: this science has the potential to identify potential hazards of chemicals, and to provide improved safety to the consumer (Hartung, 2009b; Leist et al., 2008a). This situation calls for the exploitation of new powerful technologies such as metabolomics, and the goal of making regulatory use of this approach should be kept in mind. Early-stage uses, before definitive regulatory decisions are made on the basis of systems toxicology information, could be the screening for high risk compounds. This means that the right questions must be asked early in the process, i.e., to focus testing on substances with a higher likelihood of being identified as a problem.
The concept of PoT is key to the Tox-21c and systems toxicology concepts. Ironically, even after some years of discussion, no definition of PoT has been agreed upon, though various such initiatives are under way. Two very different views prevail at present, as discussed elsewhere (Hartung et al., 2012): (a) PoT represent the cascade of events leading to the perturbation of a system; (b) PoT represent the downstream signaling triggered by perturbed physiology (Fig. 2). Intuitively, PoT are understood as the initiating events (Fig. 3). However, neither metabolomics nor transcriptomics is currently used to assess these early events; instead, we typically seek to assess the established new homeostasis under stress (Hartung et al., 2012).
For Tox-21c and systems toxicology we need high-resolution sampling to capture time-dependent changes (dynamics) and the dose-response behavior of systems challenged with toxicants. We need to monitor a wide range of phenotypes (hazards). Metabolomics is especially well suited for this purpose as it (1) is most closely related to phenotypic changes representing functional endpoints, (2) assesses many such processes simultaneously, (3) has some protocols that are already broadly fit-for-purpose in terms of throughput, cost, sensitivity, coverage of the metabolome, and reproducibility, (4) achieves the sample throughput required for detailed dynamic/dose-response studies, (5) can sometimes be non-invasive (especially NMR and secretome technologies), and (6) is future-proof, since an untargeted approach can be employed. The latter means that we are not necessarily constrained by established knowledge on pathways. The unbiased measurement of many endpoints instead challenges such preconceptions (not to say prejudices), opening up opportunities for identifying new PoT or for re-balancing the relative importance of different PoT.
Workshop participants felt that current metabolomics technologies are largely fit for the purpose of Tox-21c, while there is a tremendous need to (1) define standard procedures for quality control and data reporting, (2) annotate metabolites and pathways, and (3) quantify the metabolites required for biological modeling.
A number of technological challenges were identified:
Available methods appear straightforward but current pathway databases may not reflect reality.
Routine metabolomics does not directly report on metabolic fluxes, which are necessary for modeling as discussed above.
Obviously, this is the holy grail of systems biology, which is only emerging as a discipline. It is still difficult to obtain the required data (forerunners are, for example, the Metabolights repository (EBI at http://www.ebi.ac.uk/metabolights) or DIXA (at http://www.dixa-fp7.eu) for toxicogenomics and metabolomics data). It represents a major challenge at a computational level, for which bioinformatics resources need to evolve. The good news is that toxicology is not alone: The entire field of biomedicine is embracing systems approaches, and each discipline benefits from cross-fertilization.
We have to distinguish here between (a) compound screening (typically based on signatures), which should allow an early regulatory use of metabolomics, as discussed in the previous chapters, and (b) validating causal pathways for the purpose of Tox-21c. The validation of the former screening approach would be based on gathering data on many compounds that are historically well understood and looking for similarity of signatures, predictivity, and anecdotal evidence of mechanistic relevance. However, we will not necessarily understand how changes in these biochemical pathways actually cause disease/pathology. The mechanistic approach of Tox-21c, in contrast, requires interfering with critical points in the identified pathways in order to prove causality. This is difficult and laborious, but more convincing than deduction from phenotypic changes. Modeling strategies might bridge the gap, as they would allow virtual experiments to check the plausibility of suggested PoT before validation of a causal role is undertaken.
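The signature-based screening idea in (a) can be sketched as follows. The reference signatures and the unknown profile below are invented for illustration, and cosine similarity stands in for the more elaborate multivariate statistics used in practice: the metabolite profile of an unknown compound is matched against reference profiles of well-characterized compounds, and high similarity suggests a shared MoA.

```python
# Signature matching sketch (all profiles hypothetical): compare an
# unknown compound's metabolite changes with reference MoA signatures
# via cosine similarity.
import math

references = {
    "peroxisome_proliferator": [2.1, -0.8, 0.3, 1.7, -1.2],
    "liver_enzyme_inducer":    [-0.4, 1.9, 1.1, -0.2, 0.6],
}
unknown = [1.8, -0.6, 0.4, 1.5, -1.0]   # log2 fold changes (assumed)

def cosine(a, b):
    """Cosine similarity between two equally long profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

best = max(references, key=lambda name: cosine(unknown, references[name]))
print(best)
```

A real prediction model would additionally require a similarity threshold below which no MoA call is made, so that compounds with genuinely novel profiles are not forced into an existing class.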
Current work on PoT identification is focused on in vitro systems. Therefore, the relevance for in vivo situations and correspondence of PoT will have to be established. Currently, multiple species often are a prerequisite for regulatory acceptance; the translation of PoT between species needs to be established. A similarity of signatures argues for predictivity and mechanistic relevance, but stability of signatures under various experimental conditions and their relevance to humans need to be established.
We should keep in mind that the future use of PoT may be much simpler than the methods used to find them: ultimately, identified PoT should allow the design of rapid and targeted assays, e.g., for high throughput platforms. Metabolomics will not be the stand-alone approach to identify PoT. The combination with transcriptomics can help resolve relevant pathways, as metabolites typically play a role in several pathways. Metabolomics could be used to screen for candidate PoT, which are then targeted in subsequent assays. The question arises whether metabolomics should be prioritized over other -omics for PoT identification. Many of the aspects discussed above argue in favor of this, including the low costs and high throughput once the method is established and the relative ease of interpreting metabolite changes. However, there is a strong need to integrate metabolomics information with classical endpoints from clinics, pathology, histology, etc. This poses some difficulties with regard to the time point of sampling. Classic endpoints represent “late” events (Fig. 3). Sampling at time points when these become evident may not be optimal for metabolomics endpoints and the identification of activated PoT (these are rather early events). Indeed, we might need to control for the occurrence of late, generally-degenerative events as confounders for PoT identification, “taking them out of the equation” by measuring, for example, at subtoxic concentrations or early, before functional manifestations.
Taken together, metabolomics is core to the implementation of the Tox-21c concept. It will be a workhorse for PoT identification and possibly later for the testing of PoT activation/perturbation, as it multiplexes information on various PoT. The major step in converting metabolomics information into high throughput test systems is the transition from largely untargeted PoT identification to the targeted measurement of the predictive metabolite changes that are characteristic for known PoT.
The transition of “omics” technologies from basic to applied research may yield approaches that drastically improve our ability to conduct both predictive and diagnostic assessments of chemical toxicity and increase the efficiency for development of new drugs. In addition, information from omics technologies can improve the regulatory assessment of the safety profile of new compounds. However, regulators need to be convinced about the validity of such data. Here, policy makers play an essential role in speeding up the acceptance of these approaches for regulatory purposes. In order to achieve this, a major effort should be undertaken to design validation strategies tailored for omics technologies.
Today, a key challenge for the regulatory framework is to adapt more flexibly to rapidly-emerging technologies while at the same time ensuring safety for humans and the environment. However, the onus for the integration of these new data also rests with the researchers, who have a responsibility to objectively convey the strengths and weaknesses of the underlying techniques and to work in conjunction with regulators for the validation of these new methods. The main issues discussed at the workshop are summarized below:
In all case studies presented at the workshop, metabolomics analysis was able to reliably identify toxicological MoA. This was independent of the technological platform (e.g., mass spectrometric or nuclear magnetic resonance spectroscopic identification of the metabolites). Therefore, the question of whether metabolomics is suitable for MoA identification was answered in the affirmative.
There is a need to discuss with regulators on a case-by-case basis whether the evidence obtained with metabolomics is sufficient for identification of mode of action. One issue may be that, currently, neither standardization of metabolomics methods nor guidance on how this could be achieved is available. As the identification of MoA is not a mandatory regulatory (“standard”) requirement and also not a toxicological endpoint, it is, at present, included in studies only on a voluntary basis. However, MoA identification is becoming more important in regulatory frameworks. For instance, the identification of endocrine disruptors is one of the targets of both EU and US legislation. Therefore, it is clear that knowing the MoA of a chemical will result in a better interpretation of the toxicological data, and it is likely to contribute positively to the entire risk assessment process (van Ravenzwaay et al., 2012), for example, by addressing species-specificity (Forgacs et al., 2012).
Regardless of input from regulators, scientists using metabolomics should strive towards: 1) a high quality study design, 2) the development of appropriate standard operating procedures (SOP), and 3) a high level of standardization. Once the method used is well described, it is important to follow the developed SOP strictly in order to minimize changes over time and to ensure comparability of results. Thus, overall attention to quality management will be one of the essential features for laboratories using metabolomics, and it will lead to increased confidence in the approach from risk assessors.
The participants of the working group felt that it would be useful to obtain validation for metabolomics, but given the lack of standardization, for the time being it may not be useful to try to achieve complete validation of all elements of MoA identification with the metabolomics approach. The evidence-based toxicology (EBT) initiative may provide alternative ways to evaluate test performance (Stephens et al., 2013). For instance, procedures have been suggested for high throughput screens that may be used as models for the evaluation of the usefulness and robustness of metabolomics approaches (Judson et al., 2013). Participants were of the opinion that, as regulators become more familiar with metabolomics, they are likely to recognize the value and advantages of this approach. They might then request its more frequent use (as has happened for markers of kidney toxicity). One of the additional advantages of metabolomics would be that it could put species comparisons (e.g., rat, mouse, human) on a more solid data basis. Metabolomics also can contribute to the assessment of additive or synergistic effects in co-exposure scenarios for both pharmaceuticals and environmental toxicants, which are more the rule than the exception in daily life.
A future perspective might be deduced from knowledge about other related (-omics) technologies. First examples for the use of transcriptomics can be found in the development of new pharmaceuticals, as well as in the safety evaluation of genetically modified crops (EC, 2011). The value (credibility) of MoA determined by fingerprints or biomarkers can be confirmed if the changes observed can be causally linked to toxicological pathways. It should also be noted that -omics data could be obtained routinely from regulatory studies, thus reducing the need for additional experiments and providing a highly standardized experimental setup of the biological study. To further enhance the acceptance of metabolomics, a careful design of biological experiments and high quality data are essential (e.g., appropriate biological model, treatment regime, and sampling method). In addition, proper controls, reference compounds, phenotypic anchoring as well as appropriate validation procedures should be used to ensure the quality of the generated data. Overall, metabolomics appears to be ready to be incorporated into regulatory testing as an additional robust source of relevant information, in a toxicological weight of evidence approach.
One of the critical elements of any regulatory study is the determination of a NOAEL. Sufficient guidance is available for experienced toxicologists to consistently determine a NOAEL based on the classical parameters observed in standard toxicological studies. However, for new technologies, such as metabolomics, there is very little guidance available. The absence of guidance criteria on how to determine a NOAEL in metabolomics is a hurdle for introducing such studies within a regulatory context. Therefore, defining criteria or providing guidance on metabolomics NOAEL setting is of utmost importance. It would reduce planning uncertainty, especially for management decisions driven by financial factors and considerations of the time-to-market. Currently, there is only general guidance on how to determine a NOAEL in -omics studies from two ECETOC workshop reports (ECETOC, 2008) – see note on “ECETOC guidance” at the end of this section.
For scientists involved in metabolomics, or any other -omics approach, it is clear that with hundreds or even thousands of parameters measured, a single parameter cannot be used to determine a NOAEL. Classical stochastic-based statistical methods would result in an overly high false discovery rate, and a large degree of unreliability. Refined statistics can resolve this problem to some extent. For example, false discovery rate corrections can be introduced to estimate the probability of parameters being changed randomly. Bayesian statistics and considerations of biological relevance based on existing background knowledge are required to define meaningful endpoints. Experience has already shown that some MoA can be detected with a relatively low number of consistently altered parameters. To better evaluate the consistency of an effect it would be desirable to investigate multiple time points, but this is not always possible. If a series of parameters is found to be changed consistently, and if these parameters are known to be associated with a known pathway, then this set of changes would constitute a metabolomics effect useful for consideration as a toxicological endpoint. However, not all such metabolomics patterns need to result in pathological conditions or adverse effects in general. For instance, liver enzyme induction correlates with a specific metabolome pattern but does not necessarily result in functional or structural damage. With increased knowledge of the significance of metabolomics pathways, compensatory reactions may also become visible. These may represent good (in vitro) biomarkers of toxicity, BoT (Blaauboer et al., 2012), and they need to be taken into account for systems biology models of overall adverse outcomes (Hartung and McBride, 2011; Hartung et al., 2012).
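The false discovery rate correction mentioned above can be illustrated with the classical Benjamini-Hochberg procedure. The p-values below are invented for illustration: with many metabolites tested simultaneously, a raw p < 0.05 cutoff would flag several metabolites by chance alone, whereas the BH procedure keeps the expected fraction of false positives among the reported hits at the chosen level q.

```python
# Benjamini-Hochberg FDR control (p-values hypothetical): reject the
# hypotheses whose sorted p-values fall below rank * q / m.

def benjamini_hochberg(pvalues, q=0.05):
    """Return indices of hypotheses rejected at FDR level q."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0  # largest rank whose p-value passes the BH threshold
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * q / m:
            k = rank
    return sorted(order[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.5, 0.9]
print(benjamini_hochberg(pvals))
```

Note that five of the ten p-values are below 0.05, yet only two survive the correction; this is exactly the kind of shrinkage that protects a metabolomics NOAEL from being driven by random fluctuations.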
In addition, metabolomics could add to our understanding of the difference between compensatory reactions (adaptation) and those changes that are linked to cell fate decisions. Indeed, metabolomics patterns might constitute useful BoT (Blaauboer et al., 2012) that can help define appropriate NOAEL.
In summary, the identification of altered metabolic pathways by metabolomics approaches does not necessarily mean that they lead to an adverse outcome. Consequently, for the time being, metabolomics is not a stand-alone approach in toxicology; it needs and can be matched with other toxicological data. An interesting and fruitful approach is to correlate metabolomics effects (patterns of change) with adverse toxicological outcomes, and to develop prediction models. Moreover, the relevance of reversibility is not yet clear for metabolomics parameters, and requires further studies.
In contrast to metabolomics data, the relevance of transcriptomics findings is often less clear, as these changes rarely can be linked directly to phenotypic changes. From a statistical point of view, transcriptomics is also more problematic than metabolomics because there are many more parameters relative to the sample size. But the combination of both transcriptomics and metabolomics may significantly enhance data interpretation, especially when results from time series experiments are available.
The participants of the working group recommended building a data base using metabolomics data from regulatory studies in order to validate its use for predicting adverse effects and/or identifying MoA. Using samples from regulatory studies would provide the necessary standards to correlate changes of metabolomics data with classical toxicological parameters.
ECETOC’s guidance to derive a meaningful NOAEL recommended that (1) only specific patterns of change (in any type of -omics study) should be used to conclude that a potentially relevant biological effect is taking place, (2) as changes in -omics pathways do not necessarily imply that changes at the cellular, individual, or population level will occur, these pathways need to be correlated to observable histological changes at the microscopic or macroscopic level, and (3) to use changes in an -omics pattern for NOAEL purposes, it must be assured that the pathway identified is related to an adverse effect (ECETOC, 2010).
There are two major sources of variability, technical and biological, which contribute to the overall variability and need to be handled separately. Technical variability results from the analytical process, starting with sample preparation and extending to the separation and detection of metabolites. Optimization of procedures, the use of quality control samples, compliance with SOP, and the exact monitoring and documentation of observed deviations from SOP protocols can help to reduce this variability. Randomization of samples and quality control also are important measures to reduce variability. The second source of variability is the one inherent to the biological system used. Here also, standardization and the development of SOP will help to reduce variability. Moreover, it has been noted that each additional step in the experimental protocol will introduce more variability. Therefore, reducing complexity is essential. The risk of high variability is that it can mask subtle but important effects and thus reduce the sensitivity of the technology in obtaining biologically relevant data. As indicated above, variability is associated with the protocols and SOP used; therefore, variability needs to be determined and defined for each individual “test system,” and only then is it possible to decide how the test system can be used, i.e., which questions can be addressed and which cannot, in terms of signal-to-noise ratios. At high noise levels (i.e., high variability), only large signals can be studied. For example, in a study using different rat strains, the metabolome patterns and MoA induced by 2-methyl-4-chlorophenoxyacetic acid were still clearly visible, despite the additional noise introduced by using different rat strains. Weak changes, such as those associated with anemia, were less clear when using different rat strains (Strauss et al., 2009).
With increasing knowledge of how metabolites respond to different confounding factors such as reduced food consumption, dietary changes, age, etc. such effects can be recognized and compensated for. New statistical methods also allow the identification of outliers in -omics studies and thus help to reduce variability in experimental groups in which, e.g., one animal behaved quite differently from the rest. Therefore, statistical models need to be developed that have the capability of “learning.” This means that recursive cycles of new data generation and improved analysis will improve the already existing model and make it more and more accurate.
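The outlier identification mentioned above can be sketched with a robust z-score based on the median absolute deviation (MAD). The group values below are invented; the point of using median/MAD rather than mean/SD is that the outlier animal does not inflate the spread estimate used to detect it.

```python
# Flagging an outlier animal in a treatment group (hypothetical data)
# via MAD-based robust z-scores; |z| > 3.5 is a common cutoff.
import statistics

# one summary value per animal, e.g. a pathway-level metabolite score
group = [1.02, 0.97, 1.05, 0.99, 1.01, 2.40]

med = statistics.median(group)
mad = statistics.median(abs(x - med) for x in group)
robust_z = [0.6745 * (x - med) / mad for x in group]  # 0.6745 rescales MAD to SD units

outliers = [i for i, z in enumerate(robust_z) if abs(z) > 3.5]
print(outliers)
```

In line with the "learning" models described above, the cutoff and the summary score itself would be refined as more study data accumulate.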
The use of metabolomics in human samples is highly attractive because relevant body fluids such as blood or urine can be easily obtained. One example is collection of specimens in national bio-monitoring banks, such as the German environmental specimen bank, where sample specimens are stored and can be re-analyzed in the future. However, human variability is much higher than the biological variability encountered in controlled animal experiments. Therefore, much larger sample sizes and enhanced sub-grouping of the population are needed. Human variability also will be an important factor to be taken into account when translating metabolomics findings from animal studies to humans. Again, standardization will be very important, but factors such as lifestyle, diet, disease state, etc. will inevitably introduce elements of variability.
For in vitro studies, the situation often is more complex than might be expected. As cell culture procedures can involve many steps, variability introduced by the experimental setup may be quite high. Initial experiences of several participants suggest that variability associated with in vivo systems may be less than that associated with in vitro systems.
The participants concluded that, due to lack of standardization, currently no general guidance can be provided for evaluation of the variability and that each individual researcher needs to assess the variability of their system/procedure. Guidance for adequate study design (e.g., how to determine adequate group sizes) based on statistical considerations (e.g., strength of the effect, prevalence, etc.) would be helpful. Such adaptations of the study design are not easily possible for regulatory studies that follow a strict protocol and build on historical background data. With a more mid-term or long-term perspective, regulatory study designs from already existing protocols may need to be changed to allow incorporation of modern endpoints such as those from the metabolomics approach.
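A back-of-the-envelope calculation illustrates the kind of group-size guidance referred to above. This uses the standard normal approximation for a two-sample t-test, n per group ≈ 2 × ((z_{1-α/2} + z_{1-β}) / d)², where d is the standardized effect size (Cohen's d); the effect sizes below are assumed, not workshop figures.

```python
# Group-size estimate via the normal approximation for a two-sample
# t-test (parameters assumed for illustration).
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.8):
    """Approximate animals per group for a two-sample comparison."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_b = NormalDist().inv_cdf(power)          # desired power
    return math.ceil(2 * ((z_a + z_b) / effect_size) ** 2)

print(n_per_group(1.0))   # large effect
print(n_per_group(0.5))   # moderate effect needs roughly 4x the animals
```

The quadratic dependence on effect size is the practical message: halving the detectable effect quadruples the required group size, which is exactly why regulatory study designs with fixed group sizes constrain which metabolomics effects can be resolved.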
In view of the many opportunities that metabolomics has to offer for toxicology, particularly in terms of identifying MoA, it would be desirable to make the metabolomics approach acceptable for regulatory purposes (Fig. 4). This would require some sort of validation process, as it is common for any other new method.
Assuming that the regulatory use of metabolomics would, for the time being, concentrate on MoA identification, the workshop recommended that each individual metabolome pattern indicating a particular MoA should be validated. In order to ensure that adverse outcome pathways would be addressed in such an exercise, it would be necessary to clearly demonstrate a good correlation with toxicological effects such as pathology. Plausibility of the metabolomics changes and observed toxicological effects should be one of the key elements for validation. Before the start of any type of regulatory validation, it seemed advisable first to consult with regulators to explain the usefulness of metabolomics in a regulatory context and to ensure that, following validation, such data would also be acceptable for regulatory purposes (Fig. 4). This requires, first of all, more communication with regulators and the publication/communication of success stories. It also will become more important to reach out to those more involved in regulatory and risk assessment aspects of toxicology. One example of regulatory acceptance is the altered metabolomics biomarker pattern for the detection of certain types of kidney damage (Dieterle et al., 2010; Fuchs and Hewitt, 2011).
The participants concluded that some guidance needs to be provided on how validation of metabolomics methods could be achieved. This guidance should be developed jointly by multiple stakeholders together with regulators and risk assessment institutions.
Currently, metabolomics is used mainly for academic research purposes, and only a few companies have started to use this approach for the early identification of toxicological effects. The use and application of metabolomics in toxicology would advance more rapidly if it were also used for regulatory purposes. This would require at least some type of validation protocol (see the aforementioned questions) and acceptance of the outcome of the studies by regulatory agencies. Certainly the latter would require that regulatory agencies become more familiar with the metabolomics approach. Ideally, they would use it themselves to better understand the strengths and weaknesses of this approach and to build confidence in the data obtained. The working group believed that regulators would hardly accept metabolomics data unless they have gained their own experience with this technology.
The question was asked whether the regulatory framework for metabolomics should be different for pharmaceutical active ingredients, pesticides, or industrial chemicals. There was agreement that the regulatory framework should be identical for all sectors, as far as identification of the MoA in toxicology is addressed. For some special modes of action, e.g., endocrine disruption, there is a regulatory demand for identifying them (Hecker and Hollert, 2011). Consequently, MoA identification by means of metabolomics should be attractive, as this could be done without additional animal studies (Fig. 4) by using various biomatrices (e.g., blood and urine) from regulatory studies (van Ravenzwaay et al., 2010; Zhang et al., 2012a). For the time being, the integration of metabolomics data into a regulatory decision-making framework may be limited to MoA identification for the three sectors. It was noted by some participants that, in the absence of any toxicological findings, which is not uncommon for certain classes of industrial chemicals (evaluated under REACH), there is no merit in MoA identification by metabolomics (or any other approach). For pharmaceutical compounds, there could be more (regulatory) options for the use of metabolomics, particularly with respect to human relevance and the comparison of metabolite responses in different species. A further practical application of metabolomics in a regulatory context is its use in diagnostics and food quality evaluation (Shepherd et al., 2011).
An aspect of metabolomics that has not received much attention, but could be very attractive for both research and regulatory purposes, is the fact that metabolomics data include information on both normal constituents of the organisms tested and on the test substance and its metabolites. Additionally, by integrating imaging techniques in metabolomics studies, the obtained results give insights into the actual distribution of metabolite patterns and pharmaceuticals or environmental toxicants and their metabolites within tissue or single cells. Thus, metabolomics could simultaneously provide information on the chemical exposure in the organisms/cells tested and on the perturbation triggered thereby. Although this may require adaptation of the technical equipment, tracking exposure and analyzing internal dose-response relationships is highly attractive. Overall, this will add to the weight of evidence concerning toxicological effects following chemical exposure.
For metabolomics information obtained from in vitro data, an important aspect is the translation to the in vivo situation. This has to be demonstrated before such data can be used in a regulatory framework. Concerning the combination of metabolomics data with information obtained from other -omics technologies (often referred to as systems biology), integration of transcriptomics and metabolomics data has already been shown to be helpful for increasing certainty in the identification of a specific effect/MoA (Bundy et al., 2008) or for identifying pathways influencing susceptibility to toxicity (Cavill et al., 2011). However, there is a need for development of better tools for data integration and to further advance computer software. This would enable the use of combined -omics data to achieve a more comprehensive interpretation. Emerging technologies include mass spectrometry-based imaging metabolomics approaches like matrix-assisted laser desorption/ionization (MALDI) and time of flight secondary ion mass spectrometry (ToF SIMS). They would provide new insights into intra-tissue and intra-cellular metabolite distribution changes at sub-organ or sub-cellular levels, but data handling and standardization for quantitative analysis may be even more complex.
At present, there is considerably more experience in applying transcriptomics to toxicological investigations. In the future, the use of metabolomics is likely to increase, as the information obtained with this approach is closer to classical toxicological endpoints, and therefore easier to interpret.
Organizations that may help with the design of guidance for regulatory use and validation include the WHO, OECD, ILSI, and ECETOC.
The use of metabolomics for a better understanding of the pathways and regulatory mechanisms relevant to the toxicity and efficacy of compounds (e.g., drugs, pesticides, industrial chemicals) will bring significant benefits to consumers/patients. However, some challenges still need to be overcome. Handling the large amounts of data is still not a trivial task. Moreover, more effort is required with respect to the standardization of protocols, study designs, data processing, and analysis. Altogether, this will ensure the generation of reliable information that can be compared among laboratories and used to build meaningful databases for application in many fields of the life sciences.
Regarding the application of metabolomics for safety evaluations, two different strategies may be followed. The long-term perspective is that a large (“information rich”) set of data from simple model systems (e.g., human cell cultures) would be sufficient on its own to predict potential hazard. The data would comprise metabolomics and transcriptomics information, and an additional requirement would be broad background knowledge of systems toxicology and the human toxome; this will become realistic only in the more distant future. The strategy for the immediate future does not require vast systems biology background knowledge but is based, rather, on pattern comparisons (“signatures”). Reference compounds are tested in an in vitro or in vivo model battery, and the metabolome analysis of an unknown compound is then aligned with the known patterns of the reference materials. Within this approach, data from other -omics technologies can also be incorporated to refine the predictive value of the strategy.
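The pattern-comparison strategy can be sketched computationally: an unknown compound's metabolite profile is scored against reference signatures and the best match is reported. The reference signatures, metabolite names, and fold-change values below are entirely hypothetical, and cosine similarity is just one simple scoring choice, not a method prescribed by the report.

```python
import math

# Hypothetical reference signatures (metabolite fold-changes vs. control)
# for compounds with a known mode of action; all values are illustrative.
REFERENCE_SIGNATURES = {
    "peroxisome_proliferation": {"palmitate": -0.8, "carnitine": 1.2, "glucose": 0.1},
    "cholestasis":              {"palmitate": 0.2, "carnitine": -0.1, "glucose": -0.9},
}

def cosine_similarity(a, b):
    """Cosine similarity between two metabolite profiles (dicts of fold-changes)."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(profile):
    """Rank reference signatures by similarity to an unknown compound's profile."""
    scores = {name: cosine_similarity(profile, sig)
              for name, sig in REFERENCE_SIGNATURES.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# An unknown compound whose profile resembles the first reference signature.
unknown = {"palmitate": -0.7, "carnitine": 1.0, "glucose": 0.0}
best_match, score = classify(unknown)[0]
print(best_match)
```

In practice such classifiers operate on hundreds of metabolites with statistical significance testing and curated reference databases; the sketch only conveys the alignment-against-known-patterns idea.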
The rapidly emerging use of metabolomics analysis as an endpoint for in vitro test systems requires special attention. Such test systems often allow high throughput and a large degree of control over the experimental conditions. However, the extrapolation of in vitro data to the in vivo situation is still a substantial scientific challenge. In many cases, information from multiple systems may need to be combined to account for tissue effects such as cell-cell interactions, compensatory regulation, and communication between different organs. Moreover, the interpretation of data from individual systems often requires ample experience. A second major issue is the susceptibility of in vitro systems to experimental artifacts arising from poor study design or small variations in the experimental conditions. Therefore, now more than ever, quality control of the study design, and of all conditions crucial to the good performance of the system, must be taken very seriously.
Implementation of metabolomics in the regulatory context will require intense collaboration among the different stakeholders, whether they belong to academia, industry, or regulatory bodies. It will be crucial to jointly investigate and define the relevance of the changes observed. If this is achieved, then the innovative methodology of metabolomics can be rapidly integrated into the regulatory process to provide more complete information on chemical effects at the physiological/cellular level, on the spatial distribution of both toxicants and specific marker metabolites within whole tissues and single cells, and on the safety of humans and the environment.
Work by Thomas Hartung and Helena Hogberg on metabolomics is supported by the NIH transformative research grant “Mapping the Human Toxome by Systems Toxicology” (R01ES020750) and the FDA grant “DNTox-21c identification of pathways of developmental neurotoxicity for high throughput testing by metabolomics” (U01FD004230). Work by Mardas Daneshian and Marcel Leist was supported by the State of Baden-Württemberg and the Doerenkamp-Zbinden Foundation.
*a report of t4 – the transatlantic think tank for toxicology, a collaboration of the toxicologically oriented chairs in Baltimore, Konstanz and Utrecht sponsored by the Doerenkamp-Zbinden Foundation. The opinions expressed in this report are those of the participants as individuals and do not necessarily reflect the opinions of the organizations they are affiliated with; participants do not necessarily endorse all recommendations made.
The views expressed in this article are those of the author(s) and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.