The objective of the NexGen program is to begin to incorporate recent progress in molecular and systems biology into risk assessment practice. A broad array of new data and methods is being considered, including genomics, epigenomics, transcriptomics, proteomics, and metabolomics. Initially, this effort will ensure that risk assessments include state-of-the-science information. The ultimate success of this program, however, will be based on the incorporation of new practices that facilitate faster, cheaper, and/or more accurate assessments of public health risks. We anticipate that these new approaches will have a variety of applications, such as the assessment of new and existing chemicals in commerce and the design of chemical products and processes that reduce or eliminate the use or generation of hazardous substances. The program, described briefly in this commentary, maps a course forward and engenders movement from strategy to practical application in risk assessment.
The NexGen program is a U.S. EPA–led, multiagency collaboration among the U.S. EPA, the NIEHS, the National Center for Advancing Translational Sciences, the Centers for Disease Control and Prevention/Agency for Toxic Substances and Disease Registry, the Food and Drug Administration’s National Center for Toxicological Research, the Department of Defense, and the State of California’s Environmental Protection Agency. These agencies are pooling knowledge, data, and analyses to explore the use of new science in risk assessment and to provide advice to the U.S. EPA’s National Center for Environmental Assessment.
The broad set of questions we seek to address in the NexGen program includes the following:
- How can these new data and methods substantively improve our understanding of risk?
- Can scientifically sound assessments be made faster, cheaper, and/or more accurately using these new methods, and can these methods better address a variety of environmental management challenges (risk context)?
- How can these new types of information best be incorporated into risk assessments and used to inform risk managers and the public?
- What new policies and procedures are needed to produce consistent, reasonable, and robust assessments?
Specifically, NexGen aims to develop a) a NexGen framework informed by the NRC framework for risk-based decision making (NRC 2009); b) a bioinformatics system for knowledge mining, creation, and integration to serve risk assessment; and c) prototype assessments targeted to the risk context and iteratively refined through discussions with scientists, risk managers, and stakeholders. These three aims are discussed further below.
Framework for risk-based decision making. Developing and implementing new approaches to risk assessment will require engaging a broad spectrum of stakeholders. The NRC framework for risk-based decision making provides a structure for such stakeholder engagement. Key components of the framework are public stakeholder discussion in the problem formulation, scoping, and planning steps of risk assessment; increased transparency throughout the entire process; and tailoring risk assessments more closely to the risk context. The framework process provides opportunities for fostering transparent and open discussion among a broad array of stakeholders. This effort ensures access to a broad representation of stakeholders (not just experts in technical fields), fostering their desired level of understanding, meeting their specific information needs, and providing resources to less advantaged groups so that equal access to the process is guaranteed. In February 2011, the U.S. EPA and its NexGen partners held a public meeting to begin to engage stakeholders in the NexGen process (U.S. EPA 2011a). Additionally, an expert workshop, open to the public, was held by the Emerging Science for Environmental Health Decisions Committee of the National Academy of Sciences 14–15 June 2012.
An important task for NexGen is to match risk context to specific methodologies and to the level of scientific certainty required for decision making. To begin tailoring risk assessment approaches to the risk context, the NexGen program has constructed a three-tier scheme (Figure 1). The figure shows distinct tiers with differing assessment approaches; in practice, these approaches lie on a continuum that could be modified for various situations. The cost of assessment in time, resources, and the number of animals used increases as one moves from Tier 1 to Tier 2 and then to Tier 3; scientific certainty also increases.
Figure 1. The proposed assessment paradigm is tailored to meet specific risk management needs for different types of environmental problems. From left to right, Tier 1 is designed to evaluate the tens of thousands of chemicals in commerce to which the American …

Bioinformatics: knowledge mining, creation, and integration. In today’s rapidly expanding world of information, productive use of new and existing information depends on the effective and efficient integration of dissimilar types of knowledge from a wide variety of sources. Information relevant to NexGen is found as unstructured information reported in the open literature, electronic “libraries” of molecular biology data such as those housed at the National Library of Medicine, and legally mandated test results reported to the U.S. EPA in rigidly structured formats. “Unstructured,” in this context, refers to how information is presented in open literature text (e.g., not in a standardized format such as that used for test data submission) (Blake 2010). Consequently, we and others are developing informatics-based systems that support scientists as they face the daunting task of synthesizing diverse information from a wide array of resources. For example, diverse sources such as the U.S. EPA’s Aggregated Computational Toxicology Resource database, the NIH Comparative Toxicogenomic Database, and the National Library of Medicine Gene Expression Omnibus, in addition to textual descriptions of health end points found in hundreds of papers in the open literature, might contain, in combination, the necessary information to characterize hazard and exposure–response for a risk assessment. Informatics can help identify, summarize, and analyze large amounts of data from various sources for additional human consideration, as well as enable discovery and reduce the need to rely on known associations. The amount and breadth of data captured by informatics approaches will facilitate evaluation of both uncertainty (e.g., measurement error) and variability (e.g., among species, in humans), as recommended by the NRC (2009). Note, however, that informatics is a tool to assist scientists, not a replacement for human expertise and judgment.
A key feature of the NexGen program is the development of targeted prototype assessments to help engender movement from strategy to practical application. With these initial prototypes, we seek to demonstrate proof of concept, to characterize the value of information, and to determine decision rules for using new types of data and knowledge in risk assessment. We anticipate that the data-rich prototypes will a) help us understand how to use molecular and systems biology data to evaluate data-limited chemicals and b) provide insight into problematic issues generally unresolved by conventional data (e.g., response in the low-exposure range, characterization of a susceptible subpopulation). As part of this effort, we are exploring both qualitative and quantitative uses of the data and predictive methods and models (Chiu et al. 2010; Edwards and Preston 2008; Felter et al. 2011; Judson et al. 2011; Wetmore et al. 2012).

Although federal human health assessment guidelines explicitly encourage the use of mechanistic information, these guidelines largely reflect the knowledge and thinking of the 1980s and early 1990s. Currently, information concerning omic × environment interactions might be discussed qualitatively in assessments as supporting information, but, to date, such data have not been widely defined in regard to adversity (i.e., adverse or not adverse). Consequently, “omics” data have been used rarely in risk assessment and management decisions (Judson et al. 2010a; U.S. EPA 2011b).
Recent advances in scientific understanding of molecular and systems biology support the view that environmental chemicals can act through multiple toxicity pathways to induce adverse health outcomes (Edwards and Preston 2008; Guyton et al. 2009; Judson et al. 2010b; Miller et al. 2009). Moreover, the relationship between a dose and a particular outcome in an individual could take multiple forms depending on genetic background, target tissue, and other factors besides mechanisms of action. Interindividual variability and preexisting backgrounds of response are, in turn, key determinants of the population dose–response curve (NRC 2009). Moving from current risk assessment practices to risk assessment based on a modern view of disease will require a paradigm shift.
For the prototype human health assessments, we are evaluating several health end points/diseases at three levels of complexity, or tiers; the prototype risk assessments currently under development are shown in the table below. Prototypes will attempt to identify consistent molecular and cellular patterns reflective of causal relationships between chemical exposures and induction of human health end points and to evaluate exposure or dose relationships using these approaches. The intent is to use in vivo traditional data to explore further the predictive potential of both in vivo and in vitro “omic” data. Observed associations will be grouped into weight-of-evidence categories, describing the certainty with which an observed effect can be attributed to a particular chemical. In addition, each prototype will seek, to the extent feasible, to evaluate human variability, background health end point incidence, adaptation, and exposures to similar chemicals. Using such a construct, the effects of mixture exposures and nonchemical stressors (e.g., socioeconomic factors, lifestyle) could be evaluated in later stages of the effort. Criteria for choosing the initial chemicals for prototype development were human exposures for which common, underlying mechanisms are generally understood and for which both in vitro molecular biology data and in vivo traditional data are available. We particularly emphasized the availability of in vivo human data, including observed responses at or near ambient concentrations and traditional upstream events. Initial work on methods used to inform the various tiers has been published (Judson et al. 2011; McHale et al. 2012; Thomas et al. 2011; Villeneuve et al. 2012). In partnership with the NIEHS, we are also adding diabetes/metabolic disease to the set of prototypes (not shown in the table) (Thayer et al. 2012). Over time, additional chemical and health end point or disease combinations will be developed.
Prototype risk assessments organized by issue
Underlying questions considered in these prototypes include the following:
- How can molecular and systems biology provide insights into potential adverse effects, or a lack of effects, in humans—when combined with in vivo data or in the absence of in vivo data?
- How can these data inform relative potency estimates or exposure/dose–response relationships predictive of in vivo human responses?
- What is the role of dosimetry or physiologically based pharmacokinetic modeling in using in vitro data?
- Can these data inform us about:
  - Variability and susceptibility in the human population?
  - Mixtures interactions?
- What are the strengths and weaknesses of these new approaches for assessing risks in the human population?
- How can the probabilities of harm to public health be better characterized, including noncancer health effects, and how will uncertainty and variability be characterized?
Additionally, results of the prototype development efforts are likely to spur further research and test methods development.