Risk Anal. Author manuscript; available in PMC October 26, 2011.
PMCID: PMC3202604; NIHMSID: NIHMS191952
Toxicity Testing in the 21st Century: Implications for Human Health Risk Assessment
Robert J. Kavlock, Ph.D., Christopher P. Austin, M.D., and Raymond Tice, Ph.D.
The risk analysis perspective by Daniel Krewski and colleagues lays out the long-term vision and strategic plan developed by a National Research Council committee (1), sponsored by the U.S. Environmental Protection Agency (EPA) with support from the U.S. National Toxicology Program (NTP), to “advance the practices of toxicity testing and human health assessment of environmental agents.” Components of the vision include chemical characterization; the use of human cell-based, high-throughput assays that cover the diversity of toxicity pathways; targeted testing using animals to fill data gaps; dose-response and extrapolation modeling; and the generation and use of population-based and human exposure data for interpreting the results of toxicity tests. The strategic plan recognizes that meeting this vision will require a major research effort, conducted over a decade or more, to identify all of the important toxicity pathways, and that a clear distinction must be made between pathway perturbations that are truly adverse (i.e., likely to lead to adverse health outcomes in humans) and those that are not. Krewski et al. note that achieving this vision in a reasonable time frame (i.e., decades) would require an interdisciplinary research institute, coordinated and funded primarily by the U.S. Federal government, that would foster appropriate intramural and extramural research. This approach is expected to greatly increase the number of compounds that can be tested while providing data more directly relevant to human health risk assessment. The NTP through its Roadmap,1 the National Institutes of Health (NIH) Chemical Genomics Center (NCGC) through the Molecular Libraries Initiative,2 and the EPA through its ToxCast program3 and its draft Strategic Plan for the Future of Toxicity Testing have each recognized the need to bring innovation into the assessment of the toxicological activity of chemicals, and each has made progress in doing so. The grand challenge put forth by Krewski et al., however, requires an effort unparalleled in the field of toxicology and risk assessment.
In recognition of the importance of the NRC report (1) and to accelerate progress in this area, two NIH institutes and the EPA have entered into a formal collaboration known as Tox21 to identify mechanisms of chemically induced biological activity, prioritize chemicals for more extensive toxicological evaluation, and develop more predictive models of in vivo biological response (2). Consistent with the vision outlined by Krewski et al., success in achieving these goals is expected to result in methods for toxicity testing that are more scientific and cost-effective, as well as models for risk assessment that are more mechanistically based. As a consequence, a reduction or replacement of animals in regulatory testing is anticipated to occur in parallel with an increased ability to evaluate the large numbers of chemicals that currently lack adequate toxicological evaluation. Ultimately, Tox21 is expected to deliver biological activity profiles that are predictive of in vivo toxicities for the thousands of under-studied substances of concern to regulatory authorities in the United States and in many other countries.
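To make concrete what a "biological activity profile predictive of in vivo toxicities" might look like computationally, the sketch below fits a simple classifier mapping per-assay HTS readouts to a binary in vivo toxicity call. The data, features, and model choice here are illustrative assumptions only, not Tox21 methodology or results.

```python
# Illustrative sketch: fitting a predictive model that maps in vitro HTS
# activity profiles to an in vivo toxicity call. All data below are
# synthetic placeholders, not Tox21 results.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Rows: compounds; columns: per-assay activity scores (e.g., -log10 AC50).
n_compounds, n_assays = 200, 25
hts_profiles = rng.normal(size=(n_compounds, n_assays))

# Binary in vivo outcome (1 = adverse effect observed in targeted testing),
# simulated here as a noisy function of a few "toxicity pathway" assays.
signal = hts_profiles[:, :3].sum(axis=1)
in_vivo_toxic = (signal + rng.normal(scale=0.5, size=n_compounds) > 0).astype(int)

model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, hts_profiles, in_vivo_toxic, cv=5, scoring="roc_auc")
print(f"Cross-validated ROC AUC: {scores.mean():.2f}")
```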
The Tox21 collaboration is being coordinated through a five-year Memorandum of Understanding (MoU),4 which leverages the strengths of each organization. The MoU builds on the experimental toxicology expertise of the NTP, headquartered at the NIH National Institute of Environmental Health Sciences (NIEHS); the high-throughput screening (HTS) technology of the NCGC, managed by the National Human Genome Research Institute (NHGRI); and the computational toxicology capabilities of the EPA's National Center for Computational Toxicology (NCCT). Each party brings complementary expertise to bear on the application of novel methodologies for evaluating large numbers of chemicals for their potential to interact with the myriad biological processes relevant to toxicity. A central aspect of Tox21 is the unique capability of the NCGC's high-speed, automated screening robots to simultaneously test thousands of potentially toxic compounds in biochemical and cell-based HTS assays, together with the ability to target this resource toward environmental health issues. As mentioned by Krewski et al., EPA's ToxCast program is an integral and critical component for achieving the Tox21 goals laid out in the MoU.
To support the goals of Tox21, four focus groups (Chemical Selection, Biological Pathways/Assays, Informatics, and Targeted Testing) have been established; these groups correspond to the different components of the NRC vision described by Krewski et al. The Chemical Selection group coordinates the selection of chemicals for the Tox21 compound library to be tested at the NCGC. A library of nearly 2400 chemicals selected by the NTP and the EPA is already under study at the NCGC, and results from several dozen HTS assays are already available. In the near term, this library will be expanded to approximately 8400 compounds, with an additional ~1400 compounds selected by the NTP, ~1400 compounds selected by the EPA, and ~2800 clinically approved drugs selected by the NCGC. Compound selection is currently based largely on the compound having a defined chemical structure and known purity; on the extent of its solubility and stability in dimethyl sulfoxide (DMSO), the preferred solvent for HTS assays conducted at the NCGC; and on the compound having low volatility. Implementing quality control procedures to ensure the identity, purity, and stability of all compounds in the library is an important responsibility of this group. A subset of the Tox21 chemical library will be included in Phase II of the ToxCast program, which will examine a broader suite of assays in order to evaluate the predictive power of the bioactivity signatures derived in Phase I. Phase II of ToxCast will be launched by the summer of 2009.
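To illustrate the selection criteria described above, the sketch below screens a candidate list against them. The record fields and numeric thresholds are hypothetical placeholders rather than the actual Tox21 acceptance rules.

```python
# Illustrative sketch: screening candidate compounds against the library
# acceptance criteria described above. Field names and thresholds are
# hypothetical placeholders, not the actual Tox21 rules.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    has_defined_structure: bool   # e.g., a resolvable SMILES/InChI
    purity_pct: float             # assayed purity of the supplied lot
    dmso_soluble: bool            # soluble at the screening stock concentration
    dmso_stable: bool             # stable in DMSO over the storage period
    vapor_pressure_mmhg: float    # proxy for volatility

def acceptable(c: Candidate,
               min_purity: float = 90.0,
               max_vapor_pressure: float = 1.0) -> bool:
    """Return True if the candidate meets all library criteria."""
    return (c.has_defined_structure
            and c.purity_pct >= min_purity
            and c.dmso_soluble
            and c.dmso_stable
            and c.vapor_pressure_mmhg <= max_vapor_pressure)

candidates = [
    Candidate("compound A", True, 98.5, True, True, 0.01),
    Candidate("compound B", True, 72.0, True, True, 0.02),   # fails purity
    Candidate("compound C", True, 95.0, False, True, 0.01),  # fails solubility
]
library = [c for c in candidates if acceptable(c)]
print([c.name for c in library])  # -> ['compound A']
```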
The Biological Pathways/Assays group is identifying critical cellular toxicity pathways for interrogation using biochemical- and cell-based high-throughput screens and is prioritizing HTS assays for use at the NCGC. Assays already performed at the NCGC include those to assess (1) cytotoxicity and activation of caspases in a number of human and rodent cell types, (2) up-regulation of p53, (3) agonist/antagonist activity toward a number of nuclear receptors, and (4) differential cytotoxicity in several cell lines associated with an inability to repair various classes of DNA damage. Other assays under consideration include those for a variety of physiologically important molecular pathways (e.g., cellular stress responses), as well as methods for integrating human and rodent hepatic metabolic activation into reporter gene assays. Based on the results obtained, this group will construct test batteries useful for identifying hazards to humans and for prioritizing chemicals for further, more in-depth evaluation.
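One way to picture how calls from such a battery could be assembled into a per-compound bioactivity profile is sketched below. The assay names and the single-threshold activity call are illustrative assumptions, not the NCGC's actual analysis pipeline.

```python
# Illustrative sketch: collapsing concentration-response readouts from a
# battery of HTS assays into a per-compound bioactivity profile. Assay names
# and the activity-call rule are hypothetical, not the NCGC pipeline.
from collections import defaultdict

# (compound, assay) -> AC50 in micromolar; None means no response observed.
ac50_um = {
    ("cmpd-1", "cytotoxicity_HepG2"): 3.2,
    ("cmpd-1", "p53_upregulation"): 12.0,
    ("cmpd-1", "ER_agonist"): None,
    ("cmpd-2", "cytotoxicity_HepG2"): None,
    ("cmpd-2", "p53_upregulation"): None,
    ("cmpd-2", "ER_agonist"): 0.8,
}

def activity_call(ac50, threshold_um=50.0):
    """Call a compound 'active' in an assay if it responds below the threshold."""
    return ac50 is not None and ac50 <= threshold_um

profiles = defaultdict(dict)
for (compound, assay), ac50 in ac50_um.items():
    profiles[compound][assay] = activity_call(ac50)

for compound, profile in profiles.items():
    print(compound, profile)
```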
The Informatics group is developing databases to store all Tox21-related data and is evaluating the results of testing conducted at the NCGC and via ToxCast for predictive toxicity patterns. To encourage independent evaluations and analyses of the Tox21 test results, all data, together with comparative animal and human data where available, will be made publicly accessible through various databases, including the EPA's ACToR (Aggregated Computational Toxicology Resource), the NIEHS' CEBS (Chemical Effects in Biological Systems), and the National Center for Biotechnology Information's PubChem.
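Because the data will be public, third parties can retrieve them programmatically. As one example, the sketch below pulls a per-assay activity summary for a compound through PubChem's PUG REST web service; the compound chosen is arbitrary, and the endpoint layout should be checked against the current service documentation.

```python
# Illustrative sketch: retrieving publicly deposited bioassay results for a
# compound via PubChem's PUG REST web service. The example compound is
# arbitrary; error handling is minimal for brevity.
import requests

BASE = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"

def cids_for_name(name: str) -> list[int]:
    """Resolve a compound name to PubChem compound identifiers (CIDs)."""
    r = requests.get(f"{BASE}/compound/name/{name}/cids/JSON", timeout=30)
    r.raise_for_status()
    return r.json()["IdentifierList"]["CID"]

def assay_summary(cid: int) -> str:
    """Fetch the per-assay activity summary for a CID as CSV text."""
    r = requests.get(f"{BASE}/compound/cid/{cid}/assaysummary/CSV", timeout=30)
    r.raise_for_status()
    return r.text

cid = cids_for_name("caffeine")[0]
print(assay_summary(cid).splitlines()[0])  # header row of the summary table
```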
As HTS data on compounds that lack adequate toxicity testing become available via Tox21, there will be a need to test selected compounds in more comprehensive assays. The Targeted Testing group is developing strategies and capabilities for this purpose, using assays that involve higher-order testing systems (e.g., roundworms [Caenorhabditis elegans], zebrafish embryos, rodents).
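A minimal sketch of how such prioritization might be computed from HTS results appears below. The scoring rule, which favors compounds active in more assays at lower AC50 values, is a hypothetical illustration, not the group's actual strategy.

```python
# Illustrative sketch: ranking compounds for targeted follow-up testing by the
# breadth and potency of their HTS activity. The scoring rule is a
# hypothetical placeholder, not the Targeted Testing group's strategy.
import math

# compound -> {assay: AC50 in micromolar, for assays where it was active}
hits = {
    "cmpd-1": {"cytotoxicity_HepG2": 3.2, "p53_upregulation": 12.0},
    "cmpd-2": {"ER_agonist": 0.8},
    "cmpd-3": {},
}

def priority_score(active_ac50s: dict) -> float:
    """Higher scores for compounds active in more assays at lower AC50s."""
    return sum(-math.log10(ac50 * 1e-6) for ac50 in active_ac50s.values())

ranked = sorted(hits, key=lambda c: priority_score(hits[c]), reverse=True)
print(ranked)  # compounds first in line for higher-order (e.g., zebrafish) testing
```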
In addition to the testing activities, the MoU promotes coordination and sponsorship of workshops, symposia, and seminars to educate the various stakeholder groups, including regulatory scientists and the public, about Tox21-related activities. Persons interested in following the progress of Tox21 are invited to join the EPA's Chemical Prioritization Community of Practice5, which meets monthly via teleconference.
Given the scope of the challenge presented by Krewski et al., success will require a long-term, concerted effort by a large number of investigators working in a coordinated manner. The Tox21 consortium welcomes participation in this effort by individual scientists and by organizations. The implications of success are considerable. If successful, we will be able to address regulatory demands such as those imposed by the Food Quality Protection Act's endocrine screening program6 and the new European Community Regulation on chemicals and their safe use, known as REACH (Registration, Evaluation, Authorisation and Restriction of Chemical Substances);7 identify key modes of action on a scale not imaginable even a few years ago; direct a much more efficient and effective use of animals in toxicity testing; identify potentially susceptible subpopulations based on polymorphisms in toxicity pathways; screen the effects of mixtures; and study emerging issues such as the safety of nanomaterials. The acquisition of data from broad-scale HTS programs also creates demands to integrate this knowledge and understand its implications for systems biology, and to have risk assessors trained and conversant in the new technologies and their uses. While the ultimate goal of eliminating the use of animals in toxicology testing might seem unattainable, it is only by carefully evaluating the relevance and reliability of strategies based on in vitro test methods that the utility and limitations of such an approach can be determined and decisions made on how best to conduct toxicology testing in the future. To do otherwise will result in increasing demands being placed on systems never designed to handle the large numbers of chemicals in need of evaluation, and in continued reliance on test methods based on empirical observation rather than on mechanistic understanding.
Contributor Information
Robert J. Kavlock, Director, National Center for Computational Toxicology, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711.
Christopher P. Austin, Director, NIH Chemical Genomics Center, National Human Genome Research Institute, National Institutes of Health, 9800 Medical Center Drive, MSC 3370, Bethesda, MD 20892-3370.
Raymond Tice, Acting Chief, Biomolecular Screening Branch, National Toxicology Program, National Institute of Environmental Health Sciences, Mail Code EC-17, P.O. Box 12233, Research Triangle Park, NC 27709.
References
1. National Research Council. Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: The National Academies Press; 2007.
2. Collins FS, Gray GM, Bucher JR. Transforming Environmental Health Protection. Science. 2008;319:906–907.