1.  Compound Cytotoxicity Profiling Using Quantitative High-Throughput Screening 
Environmental Health Perspectives  2007;116(3):284-291.
Background
The propensity of compounds to produce adverse health effects in humans is generally evaluated using animal-based test methods. Such methods can be relatively expensive, low-throughput, and associated with pain suffered by the treated animals. In addition, differences in species biology may confound extrapolation to human health effects.
Objective
The National Toxicology Program and the National Institutes of Health Chemical Genomics Center are collaborating to identify a battery of cell-based screens to prioritize compounds for further toxicologic evaluation.
Methods
A collection of 1,408 compounds previously tested in one or more traditional toxicologic assays were profiled for cytotoxicity using quantitative high-throughput screening (qHTS) in 13 human and rodent cell types derived from six common targets of xenobiotic toxicity (liver, blood, kidney, nerve, lung, skin). Selected cytotoxicants were further tested to define response kinetics.
Results
qHTS of these compounds produced robust and reproducible results, which allowed cross-compound, cross-cell type, and cross-species comparisons. Some compounds were cytotoxic to all cell types at similar concentrations, whereas others exhibited species- or cell type–specific cytotoxicity. Closely related cell types and analogous cell types in human and rodent frequently showed different patterns of cytotoxicity. Some compounds inducing similar levels of cytotoxicity showed distinct time dependence in kinetic studies, consistent with known mechanisms of toxicity.
Conclusions
The generation of high-quality cytotoxicity data on this large library of known compounds using qHTS demonstrates the potential of this methodology to profile a much broader array of assays and compounds, which, in aggregate, may be valuable for prioritizing compounds for further toxicologic evaluation, identifying compounds with particular mechanisms of action, and potentially predicting in vivo biological response.
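As an illustration of the kind of concentration-response analysis that underlies qHTS cytotoxicity profiling, the minimal Python sketch below fits a four-parameter Hill curve to one compound's viability readings to estimate a half-maximal cytotoxic concentration (AC50). The readings, starting values, and 14-point concentration series are invented placeholders, not data from this study.

# Minimal sketch: fit a Hill (log-logistic) model to one compound's
# concentration-response data from a cytotoxicity screen.
import numpy as np
from scipy.optimize import curve_fit

def hill(logc, bottom, top, logac50, slope):
    # Descending four-parameter curve: 'top' viability at low concentration,
    # falling to 'bottom' at high concentration; logac50 marks the midpoint.
    return bottom + (top - bottom) / (1.0 + 10.0 ** ((logc - logac50) * slope))

# Hypothetical viability readings (% of vehicle control) at 14 concentrations.
conc_molar = np.logspace(-9, -4, 14)          # 1 nM .. 100 uM
viability = np.array([99, 101, 98, 97, 96, 93, 88, 78, 62, 45, 30, 20, 14, 11], float)

popt, _ = curve_fit(hill, np.log10(conc_molar), viability,
                    p0=[10.0, 100.0, -6.0, 1.0], maxfev=10000)
bottom, top, logac50, slope = popt
print(f"Estimated AC50 ~ {10**logac50:.2e} M, Hill slope ~ {slope:.2f}")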
doi:10.1289/ehp.10727
PMCID: PMC2265061  PMID: 18335092
1,536-well; cell viability; NTP 1,408 compound library; PubChem; qHTS; RT-CES
2.  Analysis of multiple compound–protein interactions reveals novel bioactive molecules 
The authors use machine learning of compound-protein interactions to explore drug polypharmacology and to efficiently identify bioactive ligands, including novel scaffold-hopping compounds for two pharmaceutically important protein families: G-protein coupled receptors and protein kinases.
We have demonstrated that machine learning of multiple compound–protein interactions is useful for efficient ligand screening and for assessing drug polypharmacology. This approach successfully identified novel scaffold-hopping compounds for two pharmaceutically important protein families: G-protein-coupled receptors and protein kinases. These bioactive compounds were not detected by existing computational ligand-screening methods in comparative studies. The results of this study indicate that data derived from chemical genomics can be highly useful for exploring chemical space, and this systems biology perspective could accelerate drug discovery processes.
The discovery of novel bioactive molecules advances our systems-level understanding of biological processes and is crucial for innovation in drug development. Perturbations of biological systems by chemical probes provide broader applications not only for analysis of complex systems but also for intentional manipulations of these systems. Nevertheless, the lack of well-characterized chemical modulators has limited their use. Recently, chemical genomics has emerged as a promising area of research applicable to the exploration of novel bioactive molecules, and researchers are currently striving toward the identification of all possible ligands for all target protein families (Wang et al, 2009). Chemical genomics studies have shown that patterns of compound–protein interactions (CPIs) are too diverse to be understood as simple one-to-one events. There is an urgent need to develop appropriate data mining methods for characterizing and visualizing the full complexity of interactions between chemical space and biological systems. However, no existing screening approach has so far succeeded in identifying novel bioactive compounds using multiple interactions among compounds and target proteins.
High-throughput screening (HTS) and computational screening have greatly aided in the identification of early lead compounds for drug discovery. However, the large number of assays required for HTS to identify drugs that target multiple proteins render this process very costly and time-consuming. Therefore, interest in using in silico strategies for screening has increased. The most common computational approaches, ligand-based virtual screening (LBVS) and structure-based virtual screening (SBVS; Oprea and Matter, 2004; Muegge and Oloff, 2006; McInnes, 2007; Figure 1A), have been used for practical drug development. LBVS aims to identify molecules that are very similar to known active molecules and generally has difficulty identifying compounds with novel structural scaffolds that differ from reference molecules. The other popular strategy, SBVS, is constrained by the number of three-dimensional crystallographic structures available. To circumvent these limitations, we have shown that a new computational screening strategy, chemical genomics-based virtual screening (CGBVS), has the potential to identify novel, scaffold-hopping compounds and assess their polypharmacology by using a machine-learning method to recognize conserved molecular patterns in comprehensive CPI data sets.
The CGBVS strategy used in this study was made up of five steps: CPI data collection, descriptor calculation, representation of interaction vectors, predictive model construction using training data sets, and predictions from test data (Figure 1A). Importantly, step 1, the construction of a data set of chemical structures and protein sequences for known CPIs, did not require the three-dimensional protein structures needed for SBVS. In step 2, compound structures and protein sequences were converted into numerical descriptors. These descriptors were used to construct chemical or biological spaces in which decreasing distance between vectors corresponded to increasing similarity of compound structures or protein sequences. In step 3, we represented multiple CPI patterns by concatenating these chemical and protein descriptors. Using these interaction vectors, we could quantify the similarity of molecular interactions for compound–protein pairs, despite the fact that the ligand and protein similarity maps differed substantially. In step 4, concatenated vectors for CPI pairs (positive samples) and non-interacting pairs (negative samples) were input into an established machine-learning method. In the final step, the classifier constructed using training sets was applied to test data.
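The interaction-vector idea in steps 2-4 can be sketched in a few lines of Python: a compound descriptor and a protein descriptor are concatenated into one vector, and a classifier is trained on known interacting versus non-interacting pairs. The descriptors below are random placeholders, and a support vector machine is used as one common choice of learner; the study's actual descriptors and training procedure are not reproduced here.

# Sketch of the CGBVS interaction-vector construction (steps 2-5).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_pairs, n_chem, n_prot = 200, 64, 40
compound_desc = rng.normal(size=(n_pairs, n_chem))   # stand-in for chemical descriptors
protein_desc  = rng.normal(size=(n_pairs, n_prot))   # stand-in for sequence descriptors
labels = rng.integers(0, 2, size=n_pairs)            # 1 = known CPI, 0 = non-interacting pair

interaction_vectors = np.hstack([compound_desc, protein_desc])           # step 3

model = SVC(kernel="rbf", probability=True).fit(interaction_vectors, labels)  # step 4

# Step 5: score a new compound-protein pair by the same concatenation.
new_pair = np.hstack([rng.normal(size=n_chem), rng.normal(size=n_prot)])
print("Predicted interaction probability:", model.predict_proba([new_pair])[0, 1])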
To evaluate the predictive value of CGBVS, we first compared its performance with that of LBVS by fivefold cross-validation. CGBVS performed with considerably higher accuracy (91.9%) than did LBVS (84.4%; Figure 1B). We next compared CGBVS and SBVS in a retrospective virtual screening based on the human β2-adrenergic receptor (ADRB2). Figure 1C shows that CGBVS provided higher hit rates than did SBVS. These results suggest that CGBVS is more successful than conventional approaches for prediction of CPIs.
We then evaluated the ability of the CGBVS method to predict the polypharmacology of ADRB2 by attempting to identify novel ADRB2 ligands from a group of G-protein-coupled receptor (GPCR) ligands. We ranked the prediction scores for the interactions of 826 reported GPCR ligands with ADRB2 and then analyzed the 50 highest-ranked compounds in greater detail. Of the 21 commercially available compounds, 11 showed ADRB2-binding activity and were not previously reported to be ADRB2 ligands. These compounds included ligands not only for aminergic receptors but also for neuropeptide Y-type 1 receptors (NPY1R), which have low protein homology to ADRB2. Most ligands we identified were not detected by LBVS and SBVS, which suggests that only CGBVS could identify this unexpected cross-reaction for a ligand originally developed to target a peptidergic receptor.
The true value of CGBVS in drug discovery must be tested by assessing whether this method can identify scaffold-hopping lead compounds from a set of compounds that is structurally more diverse. To assess this ability, we analyzed 11 500 commercially available compounds to predict compounds likely to bind to two GPCRs and two protein kinases. Functional assays revealed that nine ADRB2 ligands, three NPY1R ligands, five epidermal growth factor receptor (EGFR) inhibitors, and two cyclin-dependent kinase 2 (CDK2) inhibitors were concentrated in the top-ranked compounds (hit rate=30, 15, 25, and 10%, respectively). We also evaluated the extent of scaffold hopping achieved in the identification of these novel ligands. One ADRB2 ligand, two NPY1R ligands, and one CDK2 inhibitor exhibited scaffold hopping (Figure 4), indicating that CGBVS can use this characteristic to rationally predict novel lead compounds, a crucial and very difficult step in drug discovery. This feature of CGBVS is critically different from existing predictive methods, such as LBVS, which depend on similarities between test and reference ligands, and focus on a single protein or highly homologous proteins. In particular, CGBVS is useful for targets with undefined ligands because this method can use CPIs with target proteins that exhibit lower levels of homology.
In summary, we have demonstrated that data mining of multiple CPIs is of great practical value for exploration of chemical space. As a predictive model, CGBVS could provide an important step in the discovery of such multi-target drugs by identifying the group of proteins targeted by a particular ligand, leading to innovation in pharmaceutical research.
The discovery of novel bioactive molecules advances our systems-level understanding of biological processes and is crucial for innovation in drug development. For this purpose, the emerging field of chemical genomics is currently focused on accumulating large assay data sets describing compound–protein interactions (CPIs). Although new target proteins for known drugs have recently been identified through mining of CPI databases, using these resources to identify novel ligands remains unexplored. Herein, we demonstrate that machine learning of multiple CPIs can not only assess drug polypharmacology but can also efficiently identify novel bioactive scaffold-hopping compounds. Through a machine-learning technique that uses multiple CPIs, we have successfully identified novel lead compounds for two pharmaceutically important protein families, G-protein-coupled receptors and protein kinases. These novel compounds were not identified by existing computational ligand-screening methods in comparative studies. The results of this study indicate that data derived from chemical genomics can be highly useful for exploring chemical space, and this systems biology perspective could accelerate drug discovery processes.
doi:10.1038/msb.2011.5
PMCID: PMC3094066  PMID: 21364574
chemical genomics; data mining; drug discovery; ligand screening; systems chemical biology
3.  Use of in Vitro HTS-Derived Concentration–Response Data as Biological Descriptors Improves the Accuracy of QSAR Models of in Vivo Toxicity 
Environmental Health Perspectives  2010;119(3):364-370.
Background
Quantitative high-throughput screening (qHTS) assays are increasingly being used to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in collaboration with the National Institutes of Health Chemical Genomics Center.
Objectives
Our goal was to test a hypothesis that dose–response data points of the qHTS assays can serve as biological descriptors of assayed chemicals and, when combined with conventional chemical descriptors, improve the accuracy of quantitative structure–activity relationship (QSAR) models applied to prediction of in vivo toxicity end points.
Methods
We obtained cell viability qHTS concentration–response data for 1,408 substances assayed in 13 cell lines from PubChem; for a subset of these compounds, rodent acute toxicity half-maximal lethal dose (LD50) data were also available. We used the k nearest neighbor classification and random forest QSAR methods to model LD50 data using chemical descriptors either alone (conventional models) or combined with biological descriptors derived from the concentration–response qHTS data (hybrid models). Critical to our approach was the use of a novel noise-filtering algorithm to treat qHTS data.
Results
Both the external classification accuracy and coverage (i.e., fraction of compounds in the external set that fall within the applicability domain) of the hybrid QSAR models were superior to conventional models.
Conclusions
Concentration–response qHTS data may serve as informative biological descriptors of molecules that, when combined with conventional chemical descriptors, may considerably improve the accuracy and utility of computational approaches for predicting in vivo animal toxicity end points.
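A minimal sketch of the hybrid-descriptor idea follows: conventional chemical descriptors are augmented with qHTS concentration-response readouts before kNN or random forest modeling, and cross-validated accuracy is compared against chemistry-only models. All arrays are random placeholders, and the noise-filtering step described in the Methods is not reproduced.

# Hybrid (chemical + biological) descriptors for QSAR, illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_compounds = 300
chem_desc = rng.normal(size=(n_compounds, 50))        # conventional chemical descriptors
qhts_desc = rng.normal(size=(n_compounds, 13 * 14))   # 13 cell lines x 14 concentrations
toxic_class = rng.integers(0, 2, size=n_compounds)    # e.g. LD50 above/below a threshold

hybrid = np.hstack([chem_desc, qhts_desc])

for name, model in [("kNN", KNeighborsClassifier(n_neighbors=5)),
                    ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    acc_conv = cross_val_score(model, chem_desc, toxic_class, cv=5).mean()
    acc_hyb  = cross_val_score(model, hybrid, toxic_class, cv=5).mean()
    print(f"{name}: conventional CV accuracy {acc_conv:.2f}, hybrid {acc_hyb:.2f}")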
doi:10.1289/ehp.1002476
PMCID: PMC3060000  PMID: 20980217
acute toxicity; animal testing; computational toxicology; quantitative high-throughput screening; QSAR
4.  Use of Cell Viability Assay Data Improves the Prediction Accuracy of Conventional Quantitative Structure–Activity Relationship Models of Animal Carcinogenicity 
Environmental Health Perspectives  2008;116(4):506-513.
Background
To develop efficient approaches for rapid evaluation of chemical toxicity and human health risk of environmental compounds, the National Toxicology Program (NTP) in collaboration with the National Center for Chemical Genomics has initiated a project on high-throughput screening (HTS) of environmental chemicals. The first HTS results for a set of 1,408 compounds tested for their effects on cell viability in six different cell lines have recently become available via PubChem.
Objectives
We have explored these data in terms of their utility for predicting adverse health effects of the environmental agents.
Methods and results
Initially, the classification k nearest neighbor (kNN) quantitative structure–activity relationship (QSAR) modeling method was applied to the HTS data only, for a curated data set of 384 compounds. The resulting models had prediction accuracies as high as 89%, 71%, and 74% for the training set, the test set (training and test sets together comprised 275 compounds), and the external validation set (109 compounds), respectively. We then asked if HTS results could be of value in predicting rodent carcinogenicity. We identified 383 compounds for which data were available from both the Berkeley Carcinogenic Potency Database and NTP–HTS studies. We found that compounds classified by HTS as “actives” in at least one cell line were likely to be rodent carcinogens (sensitivity 77%); however, HTS “inactives” were far less informative (specificity 46%). Using chemical descriptors only, kNN QSAR modeling resulted in 62.3% prediction accuracy for rodent carcinogenicity on this data set. Importantly, the prediction accuracy of the model was significantly improved (72.7%) when chemical descriptors were augmented by HTS data, which were regarded as biological descriptors.
Conclusions
Our studies suggest that combining NTP–HTS profiles with conventional chemical descriptors could considerably improve the predictive power of computational approaches in toxicology.
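The sensitivity and specificity figures quoted above come from cross-tabulating an HTS activity call against the rodent carcinogenicity label; the short Python sketch below shows that bookkeeping on a handful of made-up records (not the 383-compound data set).

# Sensitivity/specificity of an HTS activity call as a carcinogenicity flag.
records = [
    # (active_in_at_least_one_HTS_cell_line, rodent_carcinogen) - invented examples
    (True, True), (True, False), (False, True), (False, False),
    (True, True), (False, False), (True, True), (False, True),
]

tp = sum(1 for hts, carc in records if hts and carc)
fn = sum(1 for hts, carc in records if not hts and carc)
tn = sum(1 for hts, carc in records if not hts and not carc)
fp = sum(1 for hts, carc in records if hts and not carc)

sensitivity = tp / (tp + fn)   # fraction of carcinogens flagged active by HTS
specificity = tn / (tn + fp)   # fraction of non-carcinogens flagged inactive by HTS
print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}")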
doi:10.1289/ehp.10573
PMCID: PMC2291015  PMID: 18414635
carcinogenesis; computational toxicology; high-throughput screening; QSAR
5.  Identification of Thyroid Hormone Receptor Active Compounds Using a Quantitative High-Throughput Screening Platform 
To adapt the use of the GH3.TRE-Luc reporter gene cell line to a quantitative high-throughput screening (qHTS) platform, we miniaturized the reporter gene assay to a 1536-well plate format. In total, 1280 chemicals from the Library of Pharmacologically Active Compounds (LOPAC) and the National Toxicology Program (NTP) 1408 compound collection were analyzed to identify potential thyroid hormone receptor (TR) agonists and antagonists. Of the 2688 compounds tested, eight scored as potential TR agonists when the positive hit cut-off was defined at ≥10% efficacy, relative to maximal triiodothyronine (T3) induction, and with only one of those compounds reaching ≥20% efficacy. One common class of compounds positive in the agonist assays was retinoids such as all-trans retinoic acid, which are likely acting via the retinoid-X receptor, the heterodimer partner with the TR. Five potential TR antagonists were identified, including the antiallergy drug tranilast and the anxiolytic drug SB 205384, but also some cytotoxic compounds like 5-fluorouracil. None of the inactive compounds were structurally related to T3, nor had been reported elsewhere to be thyroid hormone disruptors, so false negatives were not detected. None of the low potency (>100 µM) TR agonists resembled T3 or T4, thus these may not bind directly in the ligand-binding pocket of the receptor. For TR agonists, in the qHTS, a hit cut-off of ≥20% efficacy at 100 µM may avoid identification of positives with low or no physiological relevance. The miniaturized GH3.TRE-Luc assay offers a promising addition to the in vitro test battery for endocrine disruption, and given the low percentage of compounds testing positive, its high-throughput nature is an important advantage for future toxicological screening.
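The efficacy cut-off can be illustrated with a small Python sketch: a test compound's maximal reporter signal is expressed as a percentage of the maximal T3 induction, and the ≥10% and ≥20% thresholds are applied. The signal values are invented placeholders, not study data.

# Percent efficacy relative to the maximal T3 induction, with two hit cut-offs.
vehicle_signal = 1000.0          # luciferase counts, baseline (invented)
t3_max_signal = 21000.0          # maximal T3 induction, positive control (invented)

def percent_efficacy(compound_max_signal):
    return 100.0 * (compound_max_signal - vehicle_signal) / (t3_max_signal - vehicle_signal)

for name, signal in [("compound A", 3400.0), ("compound B", 5600.0), ("compound C", 1300.0)]:
    eff = percent_efficacy(signal)
    call_10 = "agonist" if eff >= 10 else "inactive"
    call_20 = "agonist" if eff >= 20 else "inactive"
    print(f"{name}: {eff:.1f}% efficacy -> {call_10} at 10% cut-off, {call_20} at 20% cut-off")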
doi:10.2174/2213988501408010036
PMCID: PMC3999704  PMID: 24772387
Endocrine disruption; pituitary cells; quantitative high-throughput screening; thyroid hormone receptor; reporter gene assay; retinoid-X receptor.
6.  Chemical combinations elucidate pathway interactions and regulation relevant to Hepatitis C replication 
SREBP-2, oxidosqualene cyclase (OSC) or lanosterol demethylase were identified as novel sterol pathway-associated targets that, when probed with chemical agents, can inhibit hepatitis C virus (HCV) replication. Using a combination chemical genetics approach, combinations of chemicals targeting sterol pathway enzymes downstream of and including OSC or protein geranylgeranyl transferase I (PGGT) produce robust and selective synergistic inhibition of HCV replication. Inhibition of enzymes upstream of OSC elicits proviral responses that are dominant to the effects of inhibiting all downstream targets. Inhibition of the sterol pathway without inhibition of regulatory feedback mechanisms ultimately results in an increase in HCV replication because of a compensatory upregulation of 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGCR) expression. Increases in HMGCR expression without inhibition of HMGCR enzymatic activity ultimately stimulate HCV replication through increasing the cellular pool of geranylgeranyl pyrophosphate (GGPP). Chemical inhibitors that ultimately prevent SREBP-2 activation, inhibit PGGT or encourage the production of polar sterols have great potential as HCV therapeutics if associated toxicities can be reduced.
Chemical inhibition of enzymes in either the cholesterol or the fatty acid biosynthetic pathways has been shown to impact viral replication, both positively and negatively (Su et al, 2002; Ye et al, 2003; Kapadia and Chisari, 2005; Sagan et al, 2006; Amemiya et al, 2008). FBL2 has been identified as a 50 kDa geranylgeranylated host protein that is necessary for localization of the hepatitis C virus (HCV) replication complex to the membranous web through its close association with the HCV protein NS5A and is critical for HCV replication (Wang et al, 2005). Inhibition of the protein geranylgeranyl transferase I (PGGT), an enzyme that transfers geranylgeranyl pyrophosphate (GGPP) to cellular proteins such as FBL2 for the purpose of membrane anchoring, negatively impacts HCV replication (Ye et al, 2003). Conversely, chemical agents that increase intracellular GGPP concentrations promote viral replication (Kapadia and Chisari, 2005). Statin compounds that inhibit 3-hydroxy-3-methylglutaryl coenzyme A reductase (HMGCR), the rate-limiting enzyme in the sterol pathway (Goldstein and Brown, 1990), have been suggested to inhibit HCV replication through ultimately reducing the cellular pool of GGPP (Ye et al, 2003; Kapadia and Chisari, 2005; Ikeda et al, 2006). However, inhibition of the sterol pathway with statin drugs has not yielded consistent results in patients. The use of statins for the treatment of HCV is likely to be complicated by the reported compensatory increase in HMGCR expression in vitro and in vivo (Stone et al, 1989; Cohen et al, 1993) in response to treatment. Enzymes in the sterol pathway are regulated on a transcriptional level by sterol regulatory element-binding proteins (SREBPs), specifically SREBP-2 (Hua et al, 1993; Brown and Goldstein, 1997). When cholesterol stores in cells are depleted, SREBP-2 activates transcription of genes in the sterol pathway such as HMGCR, HMG-CoA synthase, farnesyl pyrophosphate (FPP) synthase, squalene synthase (SQLS) and the LDL receptor (Smith et al, 1988, 1990; Sakai et al, 1996; Brown and Goldstein, 1999; Horton et al, 2002). The requirement of additional downstream sterol pathway metabolites for HCV replication has not been completely elucidated.
To further understand the impact of the sterol pathway and its regulation on HCV replication, we conducted a high-throughput combination chemical genetic screen using 16 chemical probes that are known to modulate the activity of target enzymes relating to the sterol biosynthesis pathway (Figure 1). Using this approach, we identified several novel antiviral targets including SREBP-2 as well as targets downstream of HMGCR in the sterol pathway such as oxidosqualene cyclase (OSC) and lanosterol demethylase. Many of our chemical probes, specifically SR-12813, farnesol and squalestatin, strongly promoted replicon replication. The actions of both farnesol and squalestatin ultimately result in an increase in the cellular pool of GGPP, which is known to increase HCV replication (Ye et al, 2003; Kapadia and Chisari, 2005; Wang et al, 2005).
Chemical combinations targeting enzymes upstream of squalene epoxidase (SQLE) at the top of the sterol pathway (Figure 4A) elicited Bateson-type epistatic responses (Boone et al, 2007), where the upstream agent's response predominates over the effects of inhibiting all downstream targets. This was especially notable for combinations including simvastatin and either U18666A or squalestatin, and for squalestatin in combination with Ro48-8071. Treatment with squalestatin prevents the SQLS substrate, farnesyl pyrophosphate (FPP), from being further metabolized by the sterol pathway. As FPP concentrations increase, the metabolite can be shunted away from the sterol pathway toward farnesylation and GGPP synthetic pathways, resulting in an increase in host protein geranylgeranylation, including that of FBL2, and consequently replicon replication. This increase in replicon replication explains the observed epistasis over Ro48-8071 treatment.
Combinations between probes targeting enzymes downstream of and including OSC produced robust synergies with each other or with a PGGT inhibitor. Figure 4B highlights examples of antiviral synergy resulting from treatment of cells with an OSC inhibitor in combination with an inhibitor of either an enzyme upstream or downstream of OSC. A combination of terconazole and U18666A is synergistic without similar combination effects in the host proliferation screen. Likewise, clomiphene was also synergistic when added to replicon cells in combination with U18666A. One of the greatest synergies observed downstream in the sterol pathway is a combination of amorolfine and AY 9944, suggesting that there is value in developing combinations of drugs that target enzymes in the sterol pathway, which are downstream of HMGCR.
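As a rough illustration of how a pairwise combination can be scored against a non-interaction reference, the sketch below uses Bliss independence, in which the expected combined inhibition of two non-interacting agents is fA + fB - fA*fB. Bliss independence is chosen here only for illustration and is not necessarily the scoring framework used in the study; the inhibition values are invented.

# Bliss-independence excess as a simple synergy indicator (illustrative only).
def bliss_excess(f_a, f_b, f_ab_observed):
    # Positive excess suggests synergy, negative suggests antagonism.
    expected = f_a + f_b - f_a * f_b
    return f_ab_observed - expected

# Fractional inhibition of replicon replication (0 = no effect, 1 = complete) - invented.
f_osc_inhibitor = 0.30      # e.g. an OSC inhibitor alone
f_pggt_inhibitor = 0.25     # e.g. GGTI-286 alone
f_combination = 0.70        # the pair together

print(f"Bliss excess: {bliss_excess(f_osc_inhibitor, f_pggt_inhibitor, f_combination):+.2f}")
# A positive value here means more inhibition than the independence model expects.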
Interactions with the protein prenylation pathway also showed strong mechanistic patterns (Figure 4C). GGTI-286 is a peptidomimetic compound resembling the CAAX domain of a protein to be geranylgeranylated and is a competitive inhibitor of protein geranylgeranylation. Simvastatin impedes the antiviral effect of GGTI-286 at low concentrations but that antagonism is balanced by comparable synergy at higher concentrations. At the low simvastatin concentrations, a compensatory increase in HMGCR expression leads to increased cellular levels of GGPP, which are likely to result in an increase in PGGT enzymatic turnover and decreased GGTI-286 efficacy. The antiviral synergy observed at the higher inhibitor concentrations is likely nonspecific as synergy was also observed in a host viability assay. Further downstream, however, a competitive interaction was observed between GGTI-286 and squalestatin, where the opposing effect of one compound obscures the other compound's effect. This competitive relationship between GGTI and SQLE explains the epistatic response observed between those two agents. For inhibitors of targets downstream of OSC, such as amorolfine, there are strong antiviral synergies with GGTI-286. Notably, combinations with OSC inhibitors and GGTI-286 were selective, in that comparable synergy was not found in a host viability assay. This selectivity suggests that jointly targeting OSC and PGGT is a promising avenue for future HCV therapy development.
This study provides a comprehensive and unique perspective into the impact of sterol pathway regulation on HCV replication and provides compelling insight into the use of chemical combinations to maximize antiviral effects while minimizing proviral consequences. Our results suggest that HCV therapeutics developed against sterol pathway targets must consider the impact on underlying sterol pathway regulation. We found combinations of inhibitors of the lower part of the sterol pathway that are effective and synergistic with each other when tested in combination. Furthermore, the combination effects observed with simvastatin suggest that, though statins inhibit HMGCR activity, the resulting regulatory consequences of such inhibition ultimately lead to undesirable epistatic effects. Inhibitors that prevent SREBP-2 activation, inhibit PGGT or encourage the production of polar sterols have great potential as HCV therapeutics if associated toxicities can be reduced.
The search for effective Hepatitis C antiviral therapies has recently focused on host sterol metabolism and protein prenylation pathways that indirectly affect viral replication. However, inhibition of the sterol pathway with statin drugs has not yielded consistent results in patients. Here, we present a combination chemical genetic study to explore how the sterol and protein prenylation pathways work together to affect hepatitis C viral replication in a replicon assay. In addition to finding novel targets affecting viral replication, our data suggest that the viral replication is strongly affected by sterol pathway regulation. There is a marked transition from antagonistic to synergistic antiviral effects as the combination targets shift downstream along the sterol pathway. We also show how pathway regulation frustrates potential hepatitis C therapies based on the sterol pathway, and reveal novel synergies that selectively inhibit hepatitis C replication over host toxicity. In particular, combinations targeting the downstream sterol pathway enzymes produced robust and selective synergistic inhibition of hepatitis C replication. Our findings show how combination chemical genetics can reveal critical pathway connections relevant to viral replication, and can identify potential treatments with an increased therapeutic window.
doi:10.1038/msb.2010.32
PMCID: PMC2913396  PMID: 20531405
chemical genetics; combinations and synergy; hepatitis C; replicon; sterol biosynthesis
7.  Profiling of drugs and environmental chemicals for functional impairment of neural crest migration in a novel stem cell-based test battery 
Archives of Toxicology  2014;88(5):1109-1126.
Developmental toxicity in vitro assays have hitherto been established as stand-alone systems, based on a limited number of toxicants. Within the embryonic stem cell-based novel alternative tests project, we developed a test battery framework that allows inclusion of any developmental toxicity assay and that explores the responses of such test systems to a wide range of drug-like compounds. We selected 28 compounds, including several biologics (e.g., erythropoietin), classical pharmaceuticals (e.g., roflumilast) and also six environmental toxicants. The chemical, toxicological and clinical data of this screen library were compiled. In order to determine a non-cytotoxic concentration range, cytotoxicity data were obtained for all compounds from HEK293 cells and from murine embryonic stem cells. Moreover, an estimate of relevant exposures was provided by literature data mining. To evaluate feasibility of the suggested test framework, we selected a well-characterized assay that evaluates ‘migration inhibition of neural crest cells.’ Screening at the highest non-cytotoxic concentration resulted in 11 hits (e.g., geldanamycin, abiraterone, gefitinib, chlorpromazine, cyproconazole, arsenite). These were confirmed in concentration–response studies. Subsequent pharmacokinetic modeling indicated that triadimefon exerted its effects at concentrations relevant to the in vivo situation, and also interferon-β and polybrominated diphenyl ether showed effects within the same order of magnitude of concentrations that may be reached in humans. In conclusion, the test battery framework can identify compounds that disturb processes relevant for human development and therefore may represent developmental toxicants. The open structure of the strategy allows rich information to be generated on both the underlying library, and on any contributing assay.
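One plausible way to pick the "highest non-cytotoxic concentration" used for screening is sketched below: the highest tested concentration at which viability stays above a chosen threshold in both cytotoxicity assays. The 90% threshold and the data are assumptions for illustration, not the criterion used in the study.

# Selecting the highest non-cytotoxic test concentration (illustrative).
viability_threshold = 90.0   # % of control, assumed cut-off

# (concentration in uM, % viability in HEK293, % viability in mESC) - invented
data = [(0.1, 101, 99), (0.3, 100, 98), (1.0, 97, 95),
        (3.0, 95, 91), (10.0, 88, 72), (30.0, 55, 31)]

non_cytotoxic = [c for c, hek, mesc in data
                 if hek >= viability_threshold and mesc >= viability_threshold]
screening_conc = max(non_cytotoxic) if non_cytotoxic else None
print("highest non-cytotoxic test concentration:", screening_conc, "uM")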
Electronic supplementary material
The online version of this article (doi:10.1007/s00204-014-1231-9) contains supplementary material, which is available to authorized users.
doi:10.1007/s00204-014-1231-9
PMCID: PMC3996367  PMID: 24691702
Test battery-based compound screening; Developmental toxicity testing; hESC-based test system; Neural crest migration assay
8.  Identification of Compounds with Anti-Proliferative Activity against Trypanosoma brucei brucei Strain 427 by a Whole Cell Viability Based HTS Campaign 
Human African Trypanosomiasis (HAT) is caused by two trypanosome sub-species, Trypanosoma brucei rhodesiense and Trypanosoma brucei gambiense. Drugs available for the treatment of HAT have significant issues related to difficult administration regimes and limited efficacy across species and disease stages. Hence, there is considerable need to find new alternative and less toxic drugs. An approach to identify starting points for new drug candidates is high throughput screening (HTS) of large compound library collections. We describe the application of an Alamar Blue-based, 384-well HTS assay to screen a library of 87,296 compounds against the related trypanosome subspecies, Trypanosoma brucei brucei bloodstream form lister 427. Primary hits identified against T.b. brucei were retested, and IC50 values were estimated against T.b. brucei and the mammalian cell line HEK293 to determine a selectivity index for each compound. The screening campaign identified 205 compounds with greater than 10 times selectivity against T.b. brucei. Cluster analysis of these compounds, taking into account chemical and structural properties required for drug-like compounds, afforded a panel of eight compounds for further biological analysis. These compounds had IC50 values ranging from 0.22 µM to 4 µM with associated selectivity indices ranging from 19 to greater than 345. Further testing against T.b. rhodesiense led to the selection of 6 compounds from 5 new chemical classes with activity against the causative species of HAT, which can be considered potential candidates for HAT early drug discovery. Structure–activity relationship (SAR) mining revealed components of those hit compound structures that may be important for biological activity. Four of these compounds have undergone further testing to 1) determine whether they are cidal or static in vitro at the minimum inhibitory concentration (MIC), and 2) estimate the time to kill.
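The selectivity index reported above is simply the ratio of the mammalian-cell IC50 to the parasite IC50; the Python sketch below shows that calculation, including the common case where no HEK293 IC50 is reached within the tested range. All IC50 values are invented placeholders.

# Selectivity index: SI = IC50(HEK293) / IC50(T. b. brucei), hits kept when SI > 10.
hits = {
    # compound: (IC50 vs T.b. brucei in uM, IC50 vs HEK293 in uM) - invented
    "cmpd-01": (0.22, 76.0),
    "cmpd-02": (4.0, 12.0),
    "cmpd-03": (1.5, None),    # None: no HEK293 IC50 reached up to the top test concentration
}
top_test_conc = 100.0  # uM; used as a lower bound when no HEK293 IC50 was reached

for name, (ic50_tbb, ic50_hek) in hits.items():
    si = (ic50_hek if ic50_hek is not None else top_test_conc) / ic50_tbb
    flag = "selective" if si > 10 else "not selective"
    prefix = ">" if ic50_hek is None else ""
    print(f"{name}: SI {prefix}{si:.0f} -> {flag}")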
Author Summary
Human African Trypanosomiasis (HAT), also known as African sleeping sickness, is a disease caused by sub-species of Trypanosoma. The disease affects developing countries within Africa, mainly occurring in rural regions that lack resources to purchase drugs for treatment. Drugs that are currently available have significant side effects, and treatment regimes are lengthy and not always transferrable to the field. In consideration of these factors, new drugs are urgently needed for the treatment of HAT. To discover compounds suitable for drug discovery, cultured trypanosomes can be tested against libraries of compounds to identify candidates for further biological analysis. We have utilised a 384-well format, Alamar Blue viability assay to screen a large non-proprietary compound collection against Trypanosoma brucei brucei bloodstream form lister 427. The assay was shown to be reproducible, with reference compounds exhibiting activity in agreement with previously published results. Primary screening hits were retested against T.b. brucei and HEK293 mammalian cells in order to assess selectivity against the parasite. Selective hits were characterised by chemical analysis, taking into consideration drug-like properties amenable to further progression. Priority compounds were tested against a panel of protozoan parasites, including Trypanosoma brucei rhodesiense, Trypanosoma cruzi, Leishmania donovani and Plasmodium falciparum. Five new compound classes were discovered that are amenable to progression in the drug discovery process for HAT.
doi:10.1371/journal.pntd.0001896
PMCID: PMC3510080  PMID: 23209849
9.  Environmental Impact on Vascular Development Predicted by High-Throughput Screening 
Environmental Health Perspectives  2011;119(11):1596-1603.
Background: Understanding health risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. High-throughput screening (HTS) in the U.S. Environmental Protection Agency (EPA) ToxCast™ project provides vast data on an expanding chemical library currently consisting of > 1,000 unique compounds across > 500 in vitro assays in phase I (complete) and Phase II (under way). This public data set can be used to evaluate concentration-dependent effects on many diverse biological targets and build predictive models of prototypical toxicity pathways that can aid decision making for assessments of human developmental health and disease.
Objective: We mined the ToxCast phase I data set to identify signatures for potential chemical disruption of blood vessel formation and remodeling.
Methods: ToxCast phase I screened 309 chemicals using 467 HTS assays across nine assay technology platforms. The assays measured direct interactions between chemicals and molecular targets (receptors, enzymes), as well as downstream effects on reporter gene activity or cellular consequences. We ranked the chemicals according to individual vascular bioactivity score and visualized the ranking using ToxPi (Toxicological Priority Index) profiles.
Results: Targets in inflammatory chemokine signaling, the vascular endothelial growth factor pathway, and the plasminogen-activating system were strongly perturbed by some chemicals, and we found positive correlations with developmental effects from the U.S. EPA ToxRefDB (Toxicological Reference Database) in vivo database containing prenatal rat and rabbit guideline studies. We observed distinctly different correlative patterns for chemicals with effects in rabbits versus rats, despite derivation of in vitro signatures based on human cells and cell-free biochemical targets, implying conservation but potentially differential contributions of developmental pathways among species. Follow-up analysis with antiangiogenic thalidomide analogs and additional in vitro vascular targets showed in vitro activity consistent with the most active environmental chemicals tested here.
Conclusions: We predicted that blood vessel development is a target for environmental chemicals acting as putative vascular disruptor compounds (pVDCs) and identified potential species differences in sensitive vascular developmental pathways.
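In the spirit of the ToxPi-style ranking mentioned in the Methods, the sketch below ranks chemicals by a composite score built from normalized, weighted assay-group readouts. The weights, assay groups, and values are invented, and the actual ToxCast/ToxPi scoring is not reproduced here.

# Composite vascular bioactivity score as a weighted sum of normalized assay readouts.
assay_weights = {"VEGF_pathway": 0.4, "chemokine_signalling": 0.3, "plasminogen_system": 0.3}

chemicals = {
    # chemical: normalized potency score (0 = inactive, 1 = most potent) per assay group - invented
    "chem-A": {"VEGF_pathway": 0.9, "chemokine_signalling": 0.6, "plasminogen_system": 0.2},
    "chem-B": {"VEGF_pathway": 0.1, "chemokine_signalling": 0.2, "plasminogen_system": 0.0},
    "chem-C": {"VEGF_pathway": 0.5, "chemokine_signalling": 0.8, "plasminogen_system": 0.7},
}

scores = {name: sum(assay_weights[a] * v for a, v in profile.items())
          for name, profile in chemicals.items()}
for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: vascular bioactivity score {score:.2f}")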
doi:10.1289/ehp.1103412
PMCID: PMC3226499  PMID: 21788198
angiogenesis; developmental toxicity; high-throughput screening (HTS); thalidomide; vascular development
10.  Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features 
Toxicological Sciences  2009;112(2):385-393.
In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation.
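The additive, feature-enrichment idea behind WFS can be sketched as follows: each structural feature receives a weight reflecting how enriched it is among toxic training compounds, and a new compound is scored by summing the weights of its features. The simple smoothed log-odds weight used here is an assumption for illustration; the paper's weighting is based on the statistical significance of the enrichment, which is not reproduced.

# Feature-enrichment weights and additive scoring (illustrative stand-in for WFS).
import math

# Training data: compound -> (set of structural features, toxic?) - invented
train = {
    "c1": ({"nitro", "aryl_halide"}, True),
    "c2": ({"nitro"}, True),
    "c3": ({"ester"}, False),
    "c4": ({"aryl_halide", "ester"}, False),
    "c5": ({"nitro", "ester"}, True),
}

features = set().union(*(f for f, _ in train.values()))
weights = {}
for feat in features:
    tox_with = sum(1 for f, tox in train.values() if tox and feat in f) + 0.5      # +0.5: smoothing
    nontox_with = sum(1 for f, tox in train.values() if not tox and feat in f) + 0.5
    weights[feat] = math.log(tox_with / nontox_with)   # positive = enriched in toxic compounds

def wfs_like_score(compound_features):
    return sum(weights.get(f, 0.0) for f in compound_features)

print({f: round(w, 2) for f, w in weights.items()})
print("score for a new compound with {nitro, ester}:", round(wfs_like_score({"nitro", "ester"}), 2))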
doi:10.1093/toxsci/kfp231
PMCID: PMC2777082  PMID: 19805409
modeling; toxicity prediction; structural features; cell viability; caspase-3,7 activation; in vivo toxicity
11.  A systematic study of mitochondrial toxicity of environmental chemicals using quantitative high throughput screening 
Chemical Research in Toxicology  2013;26(9):1323-1332.
A goal of the Tox21 program is to transition toxicity testing from traditional in vivo models to in vitro assays that assess how chemicals affect cellular responses and toxicity pathways. A critical contribution of the NIH Chemical Genomics Center (NCGC) to the Tox21 program is the implementation of a quantitative high throughput screening (qHTS) approach, using cell- and biochemical-based assays to generate toxicological profiles for thousands of environmental compounds. Here, we evaluated the effect of chemical compounds on mitochondrial membrane potential in HepG2 cells by screening a library of 1,408 compounds provided by the National Toxicology Program (NTP) in a qHTS platform. Compounds were screened over 14 concentrations, and results showed that 91 and 88 compounds disrupted mitochondrial membrane potential after treatment for 1 or 5 h, respectively. Seventy-six compounds active at both time points were clustered by structural similarity, producing 11 clusters and 23 singletons. Thirty-eight compounds covering most of the active chemical space were more extensively evaluated. Thirty-six of the 38 compounds were confirmed to disrupt mitochondrial membrane potential using a fluorescence plate reader and 35 were confirmed using a high content imaging approach. Among the 38 compounds, 4 and 6 induced LDH release, a measure of cytotoxicity, at 1 or 5 h, respectively. Compounds were further assessed for mechanism of action (MOA) by measuring changes in oxygen consumption rate, which enabled identification of 20 compounds as uncouplers. This comprehensive approach allows for evaluation of thousands of environmental chemicals for mitochondrial toxicity and identification of possible MOAs.
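Grouping the confirmed actives by structural similarity can be pictured with the small sketch below, where fingerprints are sets of feature identifiers, pairwise Tanimoto similarity is computed, and compounds are merged into a cluster when similarity exceeds a cut-off. The fingerprints, the 0.7 cut-off, and the single-linkage style are illustrative assumptions, not the clustering protocol used in the study.

# Single-linkage grouping of compounds by Tanimoto similarity (illustrative).
def tanimoto(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

fingerprints = {                      # invented feature-ID sets standing in for real fingerprints
    "m1": {1, 2, 3, 4}, "m2": {1, 2, 3, 4, 5}, "m3": {7, 8, 9},
    "m4": {7, 8, 9, 10}, "m5": {20, 21},
}
cutoff = 0.7

clusters = []                         # list of sets of compound names
for name, fp in fingerprints.items():
    hits = [c for c in clusters if any(tanimoto(fp, fingerprints[m]) >= cutoff for m in c)]
    merged = {name}.union(*hits) if hits else {name}
    clusters = [c for c in clusters if c not in hits] + [merged]

singletons = [c for c in clusters if len(c) == 1]
print(f"{len(clusters) - len(singletons)} multi-member clusters, {len(singletons)} singletons:", clusters)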
doi:10.1021/tx4001754
PMCID: PMC4154066  PMID: 23895456
mitochondrial membrane potential assay; mitochondrial toxicity; NTP 1408 compound library; oxygen consumption rate; qHTS; Tox21 collaboration
12.  Quantitative High-Throughput Screening for Chemical Toxicity in a Population-Based In Vitro Model 
Toxicological Sciences  2012;126(2):578-588.
A shift in toxicity testing from in vivo to in vitro may efficiently prioritize compounds, reveal new mechanisms, and enable predictive modeling. Quantitative high-throughput screening (qHTS) is a major source of data for computational toxicology, and our goal in this study was to aid in the development of predictive in vitro models of chemical-induced toxicity, anchored on interindividual genetic variability. Eighty-one human lymphoblast cell lines from 27 Centre d’Etude du Polymorphisme Humain trios were exposed to 240 chemical substances (12 concentrations, 0.26nM–46.0μM) and evaluated for cytotoxicity and apoptosis. qHTS screening in the genetically defined population produced robust and reproducible results, which allowed for cross-compound, cross-assay, and cross-individual comparisons. Some compounds were cytotoxic to all cell types at similar concentrations, whereas others exhibited interindividual differences in cytotoxicity. Specifically, the qHTS in a population-based human in vitro model system has several unique aspects that are of utility for toxicity testing, chemical prioritization, and high-throughput risk assessment. First, standardized and high-quality concentration-response profiling, with reproducibility confirmed by comparison with previous experiments, enables prioritization of chemicals for variability in interindividual range in cytotoxicity. Second, genome-wide association analysis of cytotoxicity phenotypes allows exploration of the potential genetic determinants of interindividual variability in toxicity. Furthermore, highly significant associations identified through the analysis of population-level correlations between basal gene expression variability and chemical-induced toxicity suggest plausible mode of action hypotheses for follow-up analyses. We conclude that as the improved resolution of genetic profiling can now be matched with high-quality in vitro screening data, the evaluation of the toxicity pathways and the effects of genetic diversity are now feasible through the use of human lymphoblast cell lines.
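One of the population-level analyses described above, correlating a cytotoxicity phenotype with basal expression of a gene across cell lines, can be sketched as follows; both vectors are simulated placeholders and the study's actual association statistics are not reproduced.

# Correlating basal gene expression with a cytotoxicity phenotype across cell lines.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_cell_lines = 81
basal_expression = rng.normal(size=n_cell_lines)                               # one gene, 81 lines (simulated)
log_ec50 = 0.4 * basal_expression + rng.normal(scale=1.0, size=n_cell_lines)   # one chemical (simulated)

rho, p_value = spearmanr(basal_expression, log_ec50)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")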
doi:10.1093/toxsci/kfs023
PMCID: PMC3307611  PMID: 22268004
chemical cytotoxicity; apoptosis; HapMap; lymphoblasts; qHTS
13.  The Role of the Toxicologic Pathologist in the Post-Genomic Era# 
Journal of Toxicologic Pathology  2013;26(2):105-110.
An era can be defined as a period in time identified by distinctive character, events, or practices. We are now in the genomic era. The pre-genomic era: There was a pre-genomic era. It started many years ago with novel and seminal animal experiments, primarily directed at studying cancer. It is marked by the development of the two-year rodent cancer bioassay and the ultimate realization that alternative approaches and short-term animal models were needed to replace this resource-intensive and time-consuming method for predicting human health risk. Many alternative approaches and short-term animal models were proposed and tried but, to date, none have completely replaced our dependence upon the two-year rodent bioassay. However, the alternative approaches and models themselves have made tangible contributions to basic research, clinical medicine and to our understanding of cancer and they remain useful tools to address hypothesis-driven research questions. The pre-genomic era was a time when toxicologic pathologists played a major role in drug development, evaluating the cancer bioassay and the associated dose-setting toxicity studies, and exploring the utility of proposed alternative animal models. It was a time when there was a shortage of qualified toxicologic pathologists. The genomic era: We are in the genomic era. It is a time when the genetic underpinnings of normal biological and pathologic processes are being discovered and documented. It is a time for sequencing entire genomes and deliberately silencing relevant segments of the mouse genome to see what each segment controls and if that silencing leads to increased susceptibility to disease. What remains to be charted in this genomic era is the complex interaction of genes, gene segments, post-translational modifications of encoded proteins, and environmental factors that affect genomic expression. In this current genomic era, the toxicologic pathologist has had to make room for a growing population of molecular biologists. In this present era, newly emerging DVM and MD scientists enter the work arena with a PhD in pathology often based on some aspect of molecular biology or molecular pathology research. In molecular biology, the almost daily technological advances require one’s complete dedication to remain at the cutting edge of the science. Similarly, the practice of toxicologic pathology, like other morphological disciplines, is based largely on experience and requires dedicated daily examination of pathology material to maintain a well-trained eye capable of distilling specific information from stained tissue slides - a dedicated effort that cannot be well done as an intermezzo between other tasks. It is a rare individual that has true expertise in both molecular biology and pathology. In this genomic era, the newly emerging DVM-PhD or MD-PhD pathologist enters a marketplace without many job opportunities in contrast to the pre-genomic era. Many face an identity crisis needing to decide to become a competent pathologist or, alternatively, to become a competent molecular biologist. At the same time, more PhD molecular biologists without training in pathology are members of the research teams working in drug development and toxicology. How best can the toxicologic pathologist interact in the contemporary team approach in drug development, toxicology research and safety testing?
Based on their biomedical training, toxicologic pathologists are in an ideal position to link data from the emerging technologies with their knowledge of pathobiology and toxicology. To enable this linkage and obtain the synergy it provides, the bench-level, slide-reading expert pathologist will need to have some basic understanding and appreciation of molecular biology methods and tools. On the other hand, it is not likely that the typical molecular biologist could competently evaluate and diagnose stained tissue slides from a toxicology study or a cancer bioassay. The post-genomic era: The post-genomic era will likely arrive approximately around 2050 at which time entire genomes from multiple species will exist in massive databases, data from thousands of robotic high throughput chemical screenings will exist in other databases, genetic toxicity and chemical structure-activity-relationships will reside in yet other databases. All databases will be linked and relevant information will be extracted and analyzed by appropriate algorithms following input of the latest molecular, submolecular, genetic, experimental, pathology and clinical data. Knowledge gained will permit the genetic components of many diseases to be amenable to therapeutic prevention and/or intervention. Much like computerized algorithms are currently used to forecast weather or to predict political elections, computerized sophisticated algorithms based largely on scientific data mining will categorize new drugs and chemicals relative to their health benefits versus their health risks for defined human populations and subpopulations. However, this form of a virtual toxicity study or cancer bioassay will only identify probabilities of adverse consequences from interaction of particular environmental and/or chemical/drug exposure(s) with specific genomic variables. Proof in many situations will require confirmation in intact in vivo mammalian animal models. The toxicologic pathologist in the post-genomic era will be the best suited scientist to confirm the data mining and its probability predictions for safety or adverse consequences with the actual tissue morphological features in test species that define specific test agent pathobiology and human health risk.
doi:10.1293/tox.26.105
PMCID: PMC3695332  PMID: 23914052
genomic era; history of toxicologic pathology; molecular biology
14.  A Three-Stage Algorithm to Make Toxicologically Relevant Activity Calls from Quantitative High Throughput Screening Data 
Environmental Health Perspectives  2012;120(8):1107-1115.
Background: The ability of a substance to induce a toxicological response is better understood by analyzing the response profile over a broad range of concentrations than at a single concentration. In vitro quantitative high throughput screening (qHTS) assays are multiple-concentration experiments with an important role in the National Toxicology Program’s (NTP) efforts to advance toxicology from a predominantly observational science at the level of disease-specific models to a more predictive science based on broad inclusion of biological observations.
Objective: We developed a systematic approach to classify substances from large-scale concentration–response data into statistically supported, toxicologically relevant activity categories.
Methods: The first stage of the approach finds active substances with robust concentration–response profiles within the tested concentration range. The second stage finds substances with activity at the lowest tested concentration not captured in the first stage. The third and final stage separates statistically significant (but not robustly statistically significant) profiles from responses that lack statistically compelling support (i.e., “inactives”). The performance of the proposed algorithm was evaluated with simulated qHTS data sets.
Results: The proposed approach performed well for 14-point concentration–response curves with typical levels of residual error (σ ≤ 25%) or when maximal response (|RMAX|) was > 25% of the positive control response. The approach also worked well in most cases for smaller sample sizes when |RMAX| ≥ 50%, even with as few as four data points.
Conclusions: The three-stage classification algorithm performed better than one-stage classification approaches based on overall F-tests, t-tests, or linear regression.
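The staged decision flow can be made concrete with the Python sketch below, in which each substance falls through stage 1 (robust concentration-response), stage 2 (activity already present at the lowest tested concentration), and stage 3 (statistically significant versus inactive). The specific thresholds and p-value cut-offs are invented stand-ins for the paper's statistical tests.

# Illustrative three-stage activity-call cascade for qHTS curve summaries.
def classify(curve):
    # curve: dict of summary statistics for one substance (all values assumed)
    if curve["fit_converged"] and abs(curve["rmax"]) >= 25 and curve["robust_p"] < 0.01:
        return "active (stage 1: robust concentration-response)"
    if curve["response_at_lowest_conc"] >= 25:
        return "active (stage 2: active at lowest tested concentration)"
    if curve["overall_p"] < 0.05:
        return "marginally active (stage 3: significant but not robust)"
    return "inactive (stage 3)"

examples = [
    {"fit_converged": True,  "rmax": 80, "robust_p": 0.001, "response_at_lowest_conc": 5,  "overall_p": 0.001},
    {"fit_converged": False, "rmax": 30, "robust_p": 0.20,  "response_at_lowest_conc": 40, "overall_p": 0.02},
    {"fit_converged": True,  "rmax": 10, "robust_p": 0.30,  "response_at_lowest_conc": 8,  "overall_p": 0.03},
    {"fit_converged": True,  "rmax": 5,  "robust_p": 0.90,  "response_at_lowest_conc": 2,  "overall_p": 0.60},
]
for i, c in enumerate(examples, 1):
    print(f"substance {i}: {classify(c)}")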
doi:10.1289/ehp.1104688
PMCID: PMC3440085  PMID: 22575717
activity calls; concentration–response; Hill equation; quantitative high throughput screening; Tox21
15.  A Static-Cidal Assay for Trypanosoma brucei to Aid Hit Prioritisation for Progression into Drug Discovery Programmes 
Human African Trypanosomiasis is a vector-borne disease of sub-Saharan Africa that causes significant morbidity and mortality. Current therapies have many drawbacks, and there is an urgent need for new, better medicines. Ideally such new treatments should be fast-acting cidal agents that cure the disease in as few doses as possible. Screening assays used for hit-discovery campaigns often do not distinguish cytocidal from cytostatic compounds and further detailed follow-up experiments are required. Such studies usually do not have the throughput required to test the large numbers of hits produced in a primary high-throughput screen. Here, we present a 384-well assay that is compatible with high-throughput screening and provides an initial indication of the cidal nature of a compound. The assay produces growth curves at ten compound concentrations by assessing trypanosome counts at 4, 24 and 48 hours after compound addition. A reduction in trypanosome counts over time is used as a marker for cidal activity. The lowest concentration at which cell killing is seen is a quantitative measure for the cidal activity of the compound. We show that the assay can identify compounds that have trypanostatic activity rather than cidal activity, and importantly, that results from primary high-throughput assays can overestimate the potency of compounds significantly. This is due to biphasic growth inhibition, which remains hidden at low starting cell densities and is revealed in our static-cidal assay. The assay presented here provides an important tool to follow-up hits from high-throughput screening campaigns and avoid progression of compounds that have poor prospects due to lack of cidal activity or overestimated potency.
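The cidal-versus-static read-out can be illustrated with a short sketch: trypanosome counts at 4, 24, and 48 h are inspected per concentration, a decline over time is taken as cidal activity, and the lowest concentration showing a decline is reported. The counts below are invented placeholders, not study data.

# Classify concentrations as cidal from count trajectories and report the lowest cidal one.
timepoints_h = (4, 24, 48)

# concentration (uM) -> trypanosome counts at 4, 24, 48 h (invented)
growth_curves = {
    0.1: (1.0e4, 4.0e4, 1.6e5),   # parasites keep growing
    0.3: (1.0e4, 1.2e4, 1.3e4),   # growth arrested (static)
    1.0: (1.0e4, 6.0e3, 8.0e2),   # counts fall over time (cidal)
    3.0: (1.0e4, 2.0e3, 1.0e2),
}

def is_cidal(counts):
    return counts[-1] < counts[0]          # fewer cells at 48 h than at 4 h

cidal_concs = sorted(c for c, counts in growth_curves.items() if is_cidal(counts))
print("lowest cidal concentration:", cidal_concs[0] if cidal_concs else "none", "uM")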
Author Summary
Trypanosoma brucei is a protozoan parasite causing African sleeping sickness. Current treatments for this disease have significant limitations, underlining the need for better and safer drugs. To identify new chemical starting points for drug development, large compound collections are screened against the parasite. Such screens typically do not distinguish between compounds that slow the growth of the parasite and compounds that actually kill the parasite (cidal compounds). Here, we present the development of an assay to identify such compounds. The main advantage of our assay is that it marries a relatively high-throughput to increased understanding of mode of action. Many active compounds (hits) are usually identified in T. brucei primary screening campaigns, making it difficult to select which compounds should undergo further development. Our assay allows testing of all of the hits for cidal activity so that only the most promising compounds are progressed. We show that the starting cell density used in the T. brucei growth assay can have a significant effect on the shape of dose response curves, and that important information regarding the mode of action of a compound can remain hidden at low starting densities as used commonly in T. brucei screening assays.
doi:10.1371/journal.pntd.0001932
PMCID: PMC3510075  PMID: 23209868
16.  Nanomaterial Toxicity Testing in the 21st Century: Use of a Predictive Toxicological Approach and High Throughput Screening 
Accounts of chemical research  2012;46(3):607-621.
Conspectus
The production of engineered nanomaterials (ENMs) is a scientific breakthrough in material design and the development of new consumer products. While the successful implementation of nanotechnology is important for the growth of the global economy, we also need to consider the possible environmental health and safety (EHS) impact as a result of the novel physicochemical properties that could generate hazardous biological outcomes. In order to assess ENM hazard, reliable and reproducible screening approaches are needed to test the basic materials as well as nano-enabled products. A platform is required to investigate the potentially endless number of bio-physicochemical interactions at the nano/bio interface, in response to which we have developed a predictive toxicological approach. We define a predictive toxicological approach as the use of mechanisms-based high throughput screening in vitro to make predictions about the physicochemical properties of ENMs that may lead to the generation of pathology or disease outcomes in vivo. The in vivo results are used to validate and improve the in vitro high throughput screening (HTS) and to establish structure-activity relationships (SARs) that allow hazard ranking and modeling by an appropriate combination of in vitro and in vivo testing. This notion is in agreement with the landmark 2007 report from the US National Academy of Sciences, “Toxicity Testing in the 21st Century: A Vision and a Strategy” (http://www.nap.edu/catalog.php?record_id=11970), which advocates increased efficiency of toxicity testing by transitioning from qualitative, descriptive animal testing to quantitative, mechanistic and pathway-based toxicity testing in human cells or cell lines using high throughput approaches. Accordingly, we have implemented HTS approaches to screen compositional and combinatorial ENM libraries to develop hazard ranking and structure-activity relationships that can be used for predicting in vivo injury outcomes. This predictive approach allows the bulk of the screening analysis and high volume data generation to be carried out in vitro, following which limited, but critical, validation studies are carried out in animals or whole organisms. Risk reduction in the exposed human or environmental populations can then focus on limiting or avoiding exposures that trigger these toxicological responses as well as implementing safer design of potentially hazardous ENMs. In this communication, we review the tools required for establishing predictive toxicology paradigms to assess inhalation and environmental toxicological scenarios through the use of compositional and combinatorial ENM libraries, mechanism-based HTS assays, hazard ranking and development of nano-SARs. We will discuss the major injury paradigms that have emerged based on specific ENM properties, as well as describing the safer design of ZnO nanoparticles based on characterization of dissolution chemistry as a major predictor of toxicity.
doi:10.1021/ar300022h
PMCID: PMC4034475  PMID: 22676423
17.  Profiling Environmental Chemicals for Activity in the Antioxidant Response Element Signaling Pathway Using a High Throughput Screening Approach 
Environmental Health Perspectives  2012;120(8):1150-1156.
Background: Oxidative stress has been implicated in the pathogenesis of a variety of diseases ranging from cancer to neurodegeneration, highlighting the need to identify chemicals that can induce this effect. The antioxidant response element (ARE) signaling pathway plays an important role in the amelioration of oxidative stress. Thus, assays that detect the up-regulation of this pathway could be useful for identifying chemicals that induce oxidative stress.
Objectives: We used cell-based reporter methods and informatics tools to efficiently screen a large collection of environmental chemicals and identify compounds that induce oxidative stress.
Methods: We utilized two cell-based ARE reporter assays, β-lactamase and luciferase, to screen a U.S. National Toxicology Program 1,408-compound library (NTP 1408, which contains 1,340 unique compounds) for compounds able to induce oxidative stress in HepG2 cells, using quantitative high throughput screening (qHTS).
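In qHTS, each compound's titration series is typically summarized by fitting a concentration-response curve. Below is a minimal sketch of such a fit using a four-parameter Hill equation on synthetic data; the data, parameter names, and use of scipy.optimize.curve_fit are illustrative assumptions, not the authors' analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, n):
    """Four-parameter Hill (logistic) model commonly used to fit qHTS titration data."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** n)

# Illustrative 15-point titration (molar) with a synthetic ARE reporter response (%).
rng = np.random.default_rng(0)
conc = np.logspace(-9, -4, 15)
response = hill(conc, 0.0, 100.0, 5e-7, 1.2) + rng.normal(0.0, 3.0, conc.size)

# Constrain AC50 and slope to positive values so the power term stays well defined.
popt, _ = curve_fit(hill, conc, response,
                    p0=[0.0, 100.0, 1e-6, 1.0],
                    bounds=([-20.0, 50.0, 1e-10, 0.1], [20.0, 150.0, 1e-3, 5.0]))
bottom, top, ac50, slope = popt
print(f"fitted AC50 ~ {ac50:.2e} M, Hill slope ~ {slope:.2f}")
```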
Results: Roughly 3% (34 of 1,340) of the unique compounds demonstrated activity across both cell-based assays. Based on biological activity and structure–activity relationship profiles, we selected 50 compounds for retesting in the two ARE assays and in an additional follow-up assay that employed a mutated ARE linked to β-lactamase. Using this strategy, we identified 30 compounds that demonstrated activity in the ARE-bla and ARE-luc assays and were able to determine structural features conferring compound activity across assays.
Conclusions: Our results support the robustness of using two different cell-based approaches for identifying compounds that induce ARE signaling. Together, these methods are useful for prioritizing chemicals for further in-depth mechanism-based toxicity testing.
doi:10.1289/ehp.1104709
PMCID: PMC3440086  PMID: 22551509
ARE; Nrf2; oxidative stress; qHTS; toxicity; Tox21
18.  Quantitative High-Throughput Screen Identifies Inhibitors of the Schistosoma mansoni Redox Cascade 
Schistosomiasis is a tropical disease associated with high morbidity and mortality, currently affecting over 200 million people worldwide. Praziquantel is the only drug used to treat the disease, and with its increased use the probability of developing drug resistance has grown significantly. The Schistosoma parasites can survive for up to decades in the human host due in part to a unique set of antioxidant enzymes that continuously degrade the reactive oxygen species produced by the host's innate immune response. Two principal components of this defense system have been recently identified in S. mansoni as thioredoxin/glutathione reductase (TGR) and peroxiredoxin (Prx), and as such these enzymes present attractive new targets for anti-schistosomiasis drug development. Inhibition of TGR/Prx activity was screened in a dual-enzyme format with reducing equivalents being transferred from NADPH to glutathione via a TGR-catalyzed reaction and then to hydrogen peroxide via a Prx-catalyzed step. A fully automated quantitative high-throughput screening (qHTS) experiment was performed against a collection of 71,028 compounds tested as 7- to 15-point concentration series at 5 µL reaction volume in 1536-well plate format. In order to generate a robust data set and to minimize the effect of compound autofluorescence, apparent reaction rates derived from a kinetic read were utilized instead of end-point measurements. Actives identified from the screen, along with previously untested analogues, were subjected to confirmatory experiments using the screening assay and subsequently against the individual targets in secondary assays. Several novel active series were identified that inhibited TGR at a range of potencies, with IC50s ranging from micromolar to the assay response limit (∼25 nM). This is, to our knowledge, the first report of a large-scale HTS to identify lead compounds for a helminthic disease, and provides a paradigm that can be used to jump-start development of novel therapeutics for other neglected tropical diseases.
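To make the rate-based read-out concrete, here is a minimal sketch of turning a kinetic read into an apparent reaction rate (the slope of signal versus time) and a percent-inhibition value; the time points, signals and autofluorescence offset are synthetic illustrations, not the screen's data.

```python
import numpy as np

# Synthetic kinetic read: fluorescence measured at several time points for one well.
time_min = np.array([0, 2, 4, 6, 8, 10], dtype=float)

def apparent_rate(signal):
    """Slope of a least-squares line through the kinetic trace (signal units per minute).
    Using the slope instead of a single end-point reduces the influence of a constant
    autofluorescence offset contributed by the test compound."""
    slope, _intercept = np.polyfit(time_min, signal, 1)
    return slope

uninhibited = 1000 + 50 * time_min        # full TGR/Prx-coupled activity
inhibited = 1000 + 5 * time_min + 300     # inhibited well with an autofluorescent offset

# Percent inhibition relative to the uninhibited control rate.
pct_inhibition = 100 * (1 - apparent_rate(inhibited) / apparent_rate(uninhibited))
print(f"{pct_inhibition:.0f}% inhibition")
```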
Author Summary
Schistosomiasis, also known as bilharzia, is a tropical disease associated with high morbidity and mortality, currently affecting over 200 million people worldwide. Praziquantel is the only drug used to treat the disease, and with its increased use the probability of developing resistance has grown significantly. The Schistosoma parasites can survive for up to decades in the human host due in part to a unique set of antioxidant enzymes that continuously degrade the reactive oxygen species produced by the host's innate immune response. Two principal components of this defense system, thioredoxin/glutathione reductase (TGR) and peroxiredoxin (Prx2), have been recently identified and validated as targets for anti-schistosomiasis drug development. In search of inhibitors of this critical redox cascade, we optimized and performed a highly miniaturized automated screen of 71,028 compounds arrayed as 7- to 15-point dilution sets. We identified novel structural series of TGR inhibitors, several of which are highly potent and should serve both as mechanistic tools for probing redox pathways in S. mansoni and as starting points for developing much-needed new treatments for schistosomiasis. The paradigm presented here effectively bridges the gap between academic target identification and the first steps of drug development, and should be applicable to a variety of other important neglected diseases.
doi:10.1371/journal.pntd.0000127
PMCID: PMC2217675  PMID: 18235848
19.  Main Report 
Genetics in Medicine  2006;8(Suppl 1):12S-252S.
Background:
States vary widely in their use of newborn screening tests, with some mandating screening for as few as three conditions and others mandating as many as 43 conditions, including varying numbers of the 40+ conditions that can be detected by tandem mass spectrometry (MS/MS). There has been no national guidance on the best candidate conditions for newborn screening since the National Academy of Sciences report of 1975 [1] and the United States Congress Office of Technology Assessment report of 1988 [2], despite rapid developments since then in genetics, in screening technologies, and in some treatments.
Objectives:
In 2002, the Maternal and Child Health Bureau (MCHB) of the Health Resources and Services Administration (HRSA) of the United States Department of Health and Human Services (DHHS) commissioned the American College of Medical Genetics (ACMG) to: (1) conduct an analysis of the scientific literature on the effectiveness of newborn screening; (2) gather expert opinion to delineate the best evidence for screening for specified conditions and develop recommendations focused on newborn screening, including but not limited to the development of a uniform condition panel; and (3) consider other components of the newborn screening system that are critical to achieving the expected outcomes in those screened.
Methods:
A group of experts in various areas of subspecialty medicine and primary care, health policy, law, public health, and consumers worked with a steering committee and several expert work groups, using a two-tiered approach to assess and rank conditions. A first step was developing a set of principles to guide the analysis. This was followed by developing criteria by which conditions could be evaluated, and then identifying the conditions to be evaluated. A large and broadly representative group of experts was asked to provide their opinions on the extent to which particular conditions met the selected criteria, relying on supporting evidence and references from the scientific literature. The criteria were distributed among three main categories for each condition: the availability and characteristics of the screening test; the availability and complexity of diagnostic services; and the availability and efficacy of treatments related to the conditions. A survey process utilizing a data collection instrument was used to gather expert opinion on the conditions in the first tier of the assessment. The data collection format and survey provided the opportunity to quantify expert opinion and to obtain the views of a diverse set of interest groups (necessary due to the subjective nature of some of the criteria). Statistical analysis of data produced a score for each condition, which determined its ranking and initial placement in one of three categories (high scoring, moderately scoring, or low scoring/absence of a newborn screening test). In the second tier of these analyses, the evidence base related to each condition was assessed in depth (e.g., via systematic reviews of reference lists including MedLine, PubMed and others; books; Internet searches; professional guidelines; clinical evidence; and cost/economic evidence and modeling). The fact sheets reflecting these analyses were evaluated by at least two acknowledged experts for each condition. These experts assessed the data and the associated references related to each criterion and provided corrections where appropriate, assigned a value to the level of evidence and the quality of the studies that established the evidence base, and determined whether there were significant variances from the survey data. Survey results were subsequently realigned with the evidence obtained from the scientific literature during the second-tier analysis for all objective criteria, based on input from at least three acknowledged experts in each condition. The information from these two tiers of assessment was then considered with regard to the overriding principles and other technology- or condition-specific recommendations. On the basis of this information, conditions were assigned to one of three categories as described above: Core Panel; Secondary Targets (conditions that are part of the differential diagnosis of a core panel condition); and Not Appropriate for Newborn Screening (either no newborn screening test is available or there is poor performance with regard to multiple other evaluation criteria).
ACMG also considered features of optimal newborn screening programs beyond the tests themselves by assessing the degree to which programs met certain goals (e.g., availability of educational programs, proportions of newborns screened and followed up). Assessments were based on the input of experts serving in various capacities in newborn screening programs and on 2002 data provided by the programs of the National Newborn Screening and Genetics Resource Center (NNSGRC). In addition, a brief cost-effectiveness assessment of newborn screening was conducted.
Results:
Uniform panel
A total of 292 individuals determined to be generally representative of the regional distribution of the United States population and of areas of expertise or involvement in newborn screening provided a total of 3,949 evaluations of 84 conditions. For each condition, the responses of at least three experts in that condition were compared with those of all respondents for that condition and found to be consistent. A score of 1,200 on the data collection instrument provided a logical separation point between high scoring conditions (1,200–1,799 of a possible 2,100) and low scoring (<1,000) conditions. A group of conditions with intermediate scores (1,000–1,199) was identified, all of which were part of the differential diagnosis of a high scoring condition or apparent in the result of the multiplex assay. Some are identified by screening laboratories and others by diagnostic laboratories. This group was designated as a “secondary target” category for which the program must report the diagnostic result.
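As a toy illustration of the score-based placement just described, the function below maps a condition's data-collection-instrument score onto the three initial categories; the cut points come from the text, while the function itself and the example scores are ours.

```python
def initial_category(score):
    """Initial placement of a condition from its data-collection-instrument score
    (maximum possible score 2,100), using the cut points given in the text."""
    if score >= 1200:
        return "high scoring"                               # candidate for the core panel
    if 1000 <= score <= 1199:
        return "intermediate (secondary target candidate)"  # part of a differential diagnosis
    return "low scoring / no newborn screening test"

for s in (1750, 1105, 820):
    print(s, "->", initial_category(s))
```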
Using the validated evidence base and expert opinion, each condition that had previously been assigned to a category based on scores gathered through the data collection instrument was reconsidered. Again, the factors taken into consideration were: 1) available scientific evidence; 2) availability of a screening test; 3) presence of an efficacious treatment; 4) adequate understanding of the natural history of the condition; and 5) whether the condition was either part of the differential diagnosis of another condition or whether the screening test results related to a clinically significant condition.
The conditions were then assigned to one of three categories as previously described (core panel, secondary targets, or not appropriate for Newborn Screening).
Among the 29 conditions assigned to the core panel are three hemoglobinopathies associated with a Hb/S allele, six amino acidurias, five disorders of fatty acid oxidation, nine organic acidurias, and six unrelated conditions (congenital hypothyroidism (CH), biotinidase deficiency (BIOT), congenital adrenal hyperplasia (CAH), classical galactosemia (GALT), hearing loss (HEAR) and cystic fibrosis (CF)). Twenty-three of the 29 conditions in the core panel are identified with multiplex technologies such as tandem mass spectrometry (MS/MS) or high pressure liquid chromatography (HPLC). On the basis of the evidence, six of the 35 conditions initially placed in the core panel were moved into the secondary target category, which expanded to 25 conditions. Test results not associated with potential disease in the infant (e.g., carriers) were also placed in the secondary target category. When newborn screening laboratory results definitively establish carrier status, the result should be made available to the health care professional community and families. Twenty-seven conditions were determined to be inappropriate for newborn screening at this time.
Conditions with limited evidence reported in the scientific literature were more difficult to evaluate, quantify and place in one of the three categories. In addition, many conditions were found to occur in multiple forms distinguished by age of onset, severity, or other features. Further, unless a condition was already included in newborn screening programs, there was a potential for bias in the information related to some criteria. In such circumstances, the quality of the studies underlying the data, such as expert opinion that considered case reports and reasoning from first principles, determined the placement of the conditions into particular categories.
Newborn screening program optimization
Assessment of the activities of newborn screening programs, based on program reports, was done for the six program components: education; screening; follow-up; diagnostic confirmation; management; and program evaluation. Considerable variation was found between programs with regard to whether particular aspects (e.g., prenatal education program availability, tracking of specimen collection and delivery) were included and the degree to which they were provided. Newborn screening program evaluation systems were also assessed to determine their adequacy and uniformity, with the goal of improving interprogram evaluation and comparison and ensuring that the expected benefits of identification through screening are realized.
Conclusions:
The state of the published evidence in the fast-moving worlds of newborn screening and medical genetics has not kept up with the implementation of new technologies, thus requiring the considerable use of expert opinion to develop recommendations about a core panel of conditions for newborn screening. Twenty-nine conditions were identified as primary targets for screening, for which all components of the newborn screening system should be maximized. An additional 25 conditions were listed that could be identified in the course of screening for core panel conditions. Programs are obligated to establish a diagnosis and communicate the result to the health care provider and family. It is recognized that screening may not have been maximized for the detection of these secondary conditions but that some proportion of such cases may be found among those screened for core panel conditions. With additional screening, greater training of primary care health care professionals and subspecialists will be needed, as will the development of an infrastructure for appropriate follow-up and management throughout the lives of children who have been identified as having one of these rare conditions. Recommended actions to overcome barriers to an optimal newborn screening system include: the establishment of a national role in the scientific evaluation of conditions and the technologies by which they are screened; standardization of case definitions and reporting procedures; enhanced oversight of hospital-based screening activities; long-term data collection and surveillance; and consideration of the financial needs of programs to allow them to deliver the appropriate services to the screened population.
doi:10.1097/01.gim.0000223467.60151.02
PMCID: PMC3109899
20.  Identification of Pregnane X Receptor Ligands Using Time-Resolved Fluorescence Resonance Energy Transfer and Quantitative High-Throughput Screening 
Abstract
The human pregnane X nuclear receptor (PXR) is a xenobiotic-regulated receptor that is activated by a range of diverse chemicals, including antibiotics, antifungals, glucocorticoids, and herbal extracts. PXR has been characterized as an important receptor in the metabolism of xenobiotics due to induction of cytochrome P450 isozymes and activation by a large number of prescribed medications. Developing methodologies that can efficiently detect PXR ligands will be clinically beneficial to avoid potential drug–drug interactions. To facilitate the identification of PXR ligands, a time-resolved fluorescence resonance energy transfer (TR-FRET) assay was miniaturized to a 1,536-well microtiter plate format to employ quantitative high-throughput screening (qHTS). The optimized 1,536-well TR-FRET assay showed Z′-factors of ≥0.5. Seven- to 15-point concentration–response curves (CRCs) were generated for 8,280 compounds using both terbium and fluorescein emission data, resulting in the generation of 241,664 data points. The qHTS method allowed us to retrospectively examine single concentration screening datasets to assess the sensitivity and selectivity of the PXR assay at different compound screening concentrations. Furthermore, nonspecific assay artifacts such as concentration-based quenching of the terbium signal and compound fluorescence were identified through the examination of CRCs for specific emission channels. The CRC information was also used to define chemotypes associated with PXR ligands. This study demonstrates the feasibility of profiling thousands of compounds against PXR using the TR-FRET assay in a high-throughput format.
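The Z′-factor quoted above is the standard measure of assay quality in high-throughput screening. The short sketch below computes it from simulated positive- and negative-control wells; the control values, well counts and TR-FRET ratio scale are illustrative assumptions, not the study's data.

```python
import numpy as np

def z_prime(pos, neg):
    """Z'-factor = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values of 0.5 or higher are conventionally taken to indicate an assay
    with enough separation between controls to support screening."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Illustrative TR-FRET ratios for control wells (not the authors' data).
rng = np.random.default_rng(0)
positive_controls = rng.normal(0.80, 0.03, 32)   # e.g. wells with a known PXR ligand
negative_controls = rng.normal(0.20, 0.03, 32)   # e.g. DMSO-only wells
print(f"Z'-factor = {z_prime(positive_controls, negative_controls):.2f}")
```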
doi:10.1089/adt.2009.193
PMCID: PMC3116688  PMID: 19505231
21.  Drug Discovery for Schistosomiasis: Hit and Lead Compounds Identified in a Library of Known Drugs by Medium-Throughput Phenotypic Screening 
Background
Praziquantel (PZQ) is the only widely available drug to treat schistosomiasis. Given the potential for drug resistance, it is prudent to search for novel therapeutics. Identification of anti-schistosomal chemicals has traditionally relied on phenotypic (whole organism) screening with adult worms in vitro and/or animal models of disease—tools that limit automation and throughput with modern microtiter plate-formatted compound libraries.
Methods
A partially automated, three-component phenotypic screen workflow is presented that utilizes at its apex the schistosomular stage of the parasite adapted to a 96-well plate format with a throughput of 640 compounds per month. Hits that arise are subsequently screened in vitro against adult parasites and finally for efficacy in a murine model of disease. Two GO/NO GO criteria filters in the workflow prioritize hit compounds for tests in the animal disease model in accordance with a target drug profile that demands short-course oral therapy. The screen workflow was inaugurated with 2,160 chemically diverse natural and synthetic compounds, of which 821 are drugs already approved for human use. This affords a unique starting point to ‘reposition’ (re-profile) drugs as anti-schistosomals with potential savings in development timelines and costs.
Findings
Multiple and dynamic phenotypes could be categorized for schistosomula and adults in vitro, and a diverse set of ‘hit’ drugs and chemistries were identified, including anti-schistosomals, anthelmintics, antibiotics, and neuromodulators. Of those hits prioritized for tests in the animal disease model, a number of leads were identified, one of which compares reasonably well with PZQ in significantly decreasing worm and egg burdens, and disease-associated pathology. Data arising from the three components of the screen are posted online as a community resource.
Conclusions
To accelerate the identification of novel anti-schistosomals, we have developed a partially automated screen workflow that interfaces schistosomula with microtiter plate-formatted compound libraries. The workflow has identified various compounds and drugs as hits in vitro and leads, with the prescribed oral efficacy, in vivo. Efforts to improve throughput, automation, and rigor of the screening workflow are ongoing.
Author Summary
The flatworm disease schistosomiasis infects over 200 million people with just one drug (praziquantel) available—a concern should drug resistance develop. Present drug discovery approaches for schistosomiasis are slow and not conducive to automation in a high-throughput format. Therefore, we designed a three-component screen workflow that positions the larval (schistosomulum) stage of S. mansoni at its apex followed by screens of adults in culture and, finally, efficacy tests in infected mice. Schistosomula are small enough and available in sufficient numbers to interface with automated liquid handling systems and prosecute thousands of compounds in short time frames. We inaugurated the workflow with a 2,160 compound library that includes known drugs in order to cost effectively ‘re-position’ drugs as new therapies for schistosomiasis and/or identify compounds that could be modified to that end. We identify a variety of ‘hit’ compounds (antibiotics, psychoactives, antiparasitics, etc.) that produce behavioral responses (phenotypes) in schistosomula and adults. Tests in infected mice of the most promising hits identified a number of ‘leads,’ one of which compares reasonably well with praziquantel in killing worms, decreasing egg production by the parasite, and ameliorating disease pathology. Efforts continue to more fully automate the workflow. All screen data are posted online as a drug discovery resource.
doi:10.1371/journal.pntd.0000478
PMCID: PMC2702839  PMID: 19597541
22.  Revealing a signaling role of phytosphingosine-1-phosphate in yeast 
Perturbing metabolic systems of bioactive sphingolipids with a genetic approach. Multiple types of “omics” data collected from the system. Systems approach for integrating multiple “omics” information. Predicting signal transduction information flow: lipid; TF activation; gene expression.
In contemporary biomedical research, gene mutation remains the most powerful and commonly used tool in molecular and systems biology for perturbation and dissection of biological systems. However, as biological systems consist of highly connected networks, for example, metabolic networks or signal transduction networks, perturbing one portion could result in widely spread effects across the network. Such ‘ripple effects' in systems pose a challenge to the paradigm of investigating the role of a metabolite through mutating enzymes required for its production. In this study, we have developed a systems biology approach that integrates different types of ‘-omics' data to identify signal transduction pathways involving sphingolipids and gene expression. See Figure 1 for an overall scheme of our approaches.
Sphingolipids are a family of bioactive lipids that have important signaling functions in cells; in yeast, de novo synthesis is required to mediate the cell response to heat shock. We hypothesized that a specific sphingolipid, phyto-sphingosine-1-phosphate (PHS1P), functions as a signaling molecule in the heat stress response (HSR) because, though its mammalian counterparts are known to have important signaling roles, the function of this metabolite in yeast remains unknown. To identify a putative role of PHS1P in the HSR, we deleted the genes involved in production (LCB4 and LCB5) and degradation (DPL1) of PHS1P to perturb its levels in cells. In wild-type cells, heat shock induces a significant increase in PHS1P. Over the same time course, the expression of over a thousand genes was modulated.
While deleting the genes involved in PHS1P metabolism ‘clamped' the PHS1P concentration as expected, these mutations also resulted in widespread changes in many sphingolipids in addition to PHS1P. This ‘ripple effect' prevented direct identification of the signaling role of PHS1P in gene expression. We overcame this difficulty by using a set of systems approaches as follows: (1) identifying the relationships between the levels of each individual sphingolipid species and gene expression by combining correlation analysis and clustering; (2) identifying the putative PHS1P-sensitive subset of genes by analyzing the results from step 1; (3) identifying transcription factors (TFs) that potentially regulate these PHS1P-sensitive genes through promoter analysis; (4) modeling the activation states of the TFs by combining gene expression data and promoter sequence data; and finally, (5) modeling the relationship between sphingolipids and activation of TFs.
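To make step (1) concrete, the sketch below correlates each gene's expression profile with PHS1P levels across samples and keeps the significantly correlated genes as candidate PHS1P-sensitive genes; the matrices, sample count and significance cutoff are synthetic assumptions, and a real analysis would also involve clustering and multiple-testing correction.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

# Synthetic data: PHS1P level and expression of 500 genes across 12 samples
# (e.g. strains x time points); purely illustrative, not the study's measurements.
phs1p = rng.normal(size=12)
expression = rng.normal(size=(500, 12))
expression[:25] += 2.0 * phs1p          # plant a block of lipid-correlated genes

def phs1p_sensitive(expr_matrix, lipid, alpha=0.05):
    """Return (gene index, r) for genes whose expression correlates with PHS1P (p < alpha).
    This mirrors the idea of separating lipid-sensitive genes from 'ripple effect' changes."""
    hits = []
    for i, profile in enumerate(expr_matrix):
        r, p = pearsonr(profile, lipid)
        if p < alpha:
            hits.append((i, r))
    return hits

hits = phs1p_sensitive(expression, phs1p)
positives = sum(1 for _, r in hits if r > 0)
print(f"{len(hits)} candidate PHS1P-sensitive genes ({positives} positively correlated)")
```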
Our study showed that 441 genes were differentially expressed in the lcb4Δ/lcb5Δ strain in comparison to the wild-type strain; however, only 77 genes among them showed a significant correlation with respect to PHS1P, with 22 genes positively correlated and 54 genes negatively correlated. The results led to a hypothesis that the genes showing significant correlation were PHS1P sensitive whereas differential expression of other genes resulted from the compounding ‘ripple effects' of the gene deletions. We tested this hypothesis by directly treating cells with PHS1P and monitoring the expression levels of the genes that were PHS1P sensitive and PHS1P insensitive, and the results showed that the expression of PHS1P-sensitive genes indeed changed in response to the treatment whereas others did not. We developed a statistical model, referred to as the Bayesian transcription factor state model, to infer activation states of TFs in cells under a specific condition based on the genomic information and gene expression data. We then used a Bayesian logistic regression to further model the relationship between the lipid concentrations and activation states of the TFs. Combined TF enrichment analysis and TF state modeling indicated that the HAP TF complex was likely responding to the signal from PHS1P and mediating the regulation of PHS1P-sensitive genes. We tested this hypothesis by treating the wild-type strain and a yeast strain carrying a deletion of the HAP4 gene (hap4Δ), a component of the HAP complex, with PHS1P and monitoring the expression of PHS1P-sensitive genes. Indeed, PHS1P induced the genes in the wild-type strain but not in hap4Δ, thus indicating that induction of the PHS1P-sensitive genes required a functioning HAP complex (see Figure 5).
In summary, our experiments demonstrated that, though gene mutation remains one of the most powerful tools to perturb biological systems, the high connectivity of biological systems poses a challenge for using this approach to identify signaling roles of bioactive metabolites. Here, we demonstrated that, by combining the information from multiple types of ‘-omics' data using systems approaches, it is possible to circumvent these difficulties and reveal novel signal transduction pathways.
Sphingolipids including sphingosine-1-phosphate and ceramide participate in numerous cell programs through signaling mechanisms. This class of lipids has important functions in stress responses; however, determining which sphingolipid mediates specific events has remained encumbered by the numerous metabolic interconnections of sphingolipids, such that modulating a specific lipid of interest through manipulating metabolic enzymes causes ‘ripple effects', which change levels of many other lipids. Here, we develop a method of integrative analysis for genomic, transcriptomic, and lipidomic data to address this previously intractable problem. This method revealed a specific signaling role for phytosphingosine-1-phosphate, a lipid with no previously defined specific function in yeast, in regulating genes required for mitochondrial respiration through the HAP complex transcription factor. This approach could be applied to extract meaningful biological information from a similar experimental design that produces multiple sets of high-throughput data.
doi:10.1038/msb.2010.3
PMCID: PMC2835565  PMID: 20160710
information integration; lipidomics; signal transduction; sphingolipids; transcriptomics
23.  Cross-species chemogenomic profiling reveals evolutionarily conserved drug mode of action 
Chemogenomic screens were performed in both budding and fission yeasts, allowing for a cross-species comparison of drug–gene interaction networks. Drug–module interactions were more conserved than individual drug–gene interactions. Combination of data from both species can improve drug–module predictions and helps identify a compound's mode of action.
Understanding the molecular effects of chemical compounds in living cells is an important step toward rational therapeutics. Drug discovery aims to find compounds that will target a specific pathway or pathogen with minimal side effects. However, even when an effective drug is found, its mode of action (MoA) is typically not well understood. The lack of knowledge regarding a drug's MoA makes the drug discovery process slow and rational therapeutics incredibly difficult. More recently, different high-throughput methods have been developed that attempt to discern how a compound exerts its effects in cells. One of these methods relies on measuring the growth of cells carrying different mutations in the presence of the compounds of interest, commonly referred to as chemogenomics (Wuster and Babu, 2008). The differential growth of the different mutants provides clues as to what the compounds target in the cell (Figure 2). For example, if a drug inhibits a branch in a vital two-branch pathway, then mutations in the second branch might result in cell death if the mutants are grown in the presence of the drug (Figure 2C). As these compound–mutant functional interactions are expected to be relatively rare, one can assume that the growth rate of a mutant–drug combination should generally be equal to the product of the growth rate of the untreated mutant with the growth rate of the drug-treated wild type. This expectation is defined as the neutral model and deviations from this provide a quantitative score that allow us to make informed predictions regarding a drug's MoA (Figure 2B; Parsons et al, 2006).
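The multiplicative neutral model described above can be written as expected growth = (mutant growth) x (drug-treated wild-type growth), both relative to untreated wild type, with deviations scored as drug–gene interactions. The sketch below computes a simple log-ratio deviation as a stand-in for such a score; the numbers and the exact normalization are illustrative assumptions rather than the authors' scoring scheme.

```python
import numpy as np

def d_score(g_mutant_drug, g_mutant, g_wt_drug, g_wt=1.0):
    """Deviation from the multiplicative neutral model, on a log2 scale.
    Growth rates are given relative to untreated wild type; a strongly negative score
    suggests the mutant is hypersensitive to the drug, a positive score resistance.
    This is a simplified stand-in for the D-scores described in the study."""
    expected = (g_mutant / g_wt) * (g_wt_drug / g_wt)
    observed = g_mutant_drug / g_wt
    return np.log2(observed / expected)

# Illustrative numbers: the mutant grows at 90% of WT, drug-treated WT at 70% of WT,
# but the drug-treated mutant grows at only 20% of WT, a synthetic-sick interaction.
print(f"D-score ~ {d_score(0.20, 0.90, 0.70):.2f}")
```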
The availability of these high-throughput approaches now allows us to perform cross-species studies of functional interactions between compounds and genes. In this study, we have performed a quantitative analysis of compound–gene interactions for two fungal species (budding yeast (S. cerevisiae) and fission yeast (S. pombe)) that diverged from each other approximately 500–700 million years ago. A collection of 2957 compounds from the National Cancer Institute (NCI) were screened in both species for inhibition of wild-type cell growth. A total of 132 were found to be bioactive in both fungi and 9, along with 12 additional well-characterized drugs, were selected for subsequent screening. Mutant libraries of 727 and 438 gene deletions were used for S. cerevisiae and S. pombe, respectively, and these were selected based on availability of genetic interaction data from previous studies (Collins et al, 2007; Roguev et al, 2008; Fiedler et al, 2009) and contain an overlap of 190 one-to-one orthologs that can be directly compared. Deviations from the neutral expectation were quantified as drug–gene interactions scores (D-scores) for the 21 compounds against the deletion libraries. Replicates of both screens showed very high correlations (S. cerevisiae r=0.72, S. pombe r=0.76) and reproduced well previously known compound–gene interactions (Supplementary information). We then compared the D-scores for the 190 one-to-one orthologs present in the data set of both species. Despite the high reproducibility, we observed a very poor conservation of these compound–gene interaction scores across these species (r=0.13, Figure 4A).
Previous work had shown that, across these same species, genetic interactions within protein complexes were much more conserved than average genetic interactions (Roguev et al, 2008). Similarly, we observed a higher cross-species conservation of the compound–module (complex or pathway) interactions than the overall compound–gene interactions. Specifically, the data derived from fission yeast were a poor predictor of S. cerevisiae drug–gene interactions, but a good predictor of budding yeast compound–module connections (Figure 4B). Also, a combined score from both species improved the prediction of compound–module interactions, above the accuracy observed with the S. cerevisiae information alone, but this improvement was not observed for the prediction of drug–gene interactions (Figure 4B). Data from both species were used to predict drug–module interactions, and one specific interaction (compound NSC-207895 interaction with DNA repair complexes) was experimentally verified by showing that the compound activates the DNA damage repair pathway in three species (S. cerevisiae, S. pombe and H. sapiens).
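One simple way to obtain such module-level interaction scores is to average the gene-level D-scores of a module's members in each species and then combine the two species. The sketch below does exactly that with made-up gene names and values; the module definition, averaging and equal-weight combination are all illustrative assumptions rather than the authors' method.

```python
import numpy as np

def module_score(d_scores_by_gene, module_genes):
    """Summarize a compound-module interaction as the mean D-score of the module's members
    (genes missing from the screen are ignored). Averaging over a complex or pathway is one
    simple way to recover the more conserved module-level signal described above."""
    vals = [d_scores_by_gene[g] for g in module_genes if g in d_scores_by_gene]
    return float(np.mean(vals)) if vals else float("nan")

# Illustrative gene-level D-scores for one compound in each yeast (made-up values).
sc = {"RAD51": -2.1, "RAD52": -1.8, "MRE11": -0.4, "TUB1": 0.1}
sp = {"RAD51": -1.5, "RAD52": -2.3, "MRE11": -0.9, "TUB1": 0.3}
repair_module = ["RAD51", "RAD52", "MRE11"]      # a hypothetical DNA-repair module

combined = 0.5 * (module_score(sc, repair_module) + module_score(sp, repair_module))
print(f"S. cerevisiae {module_score(sc, repair_module):.2f}, "
      f"S. pombe {module_score(sp, repair_module):.2f}, combined {combined:.2f}")
```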
To understand why the combination of chemogenomic data from two species might improve drug–module interaction predictions, we also analyzed previously published cross-species genetic–interaction data. We observed a significant correlation between the conservation of drug–gene and gene–gene interactions among the one-to-one orthologs (r=0.28, P-value=0.0078). Additionally, the strongest interactions of benomyl (a microtubule inhibitor) were with complexes that also had strong and conserved genetic interactions with microtubules (Figure 4C). We hypothesize that a significant number of the compound–gene interactions obtained from chemogenomic studies are not direct interactions with the physical target of the compounds, but instead reflect indirect interactions with genes that genetically interact with the main target(s). This would explain why the compound interaction networks show evolutionary patterns similar to those of the genetic interaction networks.
In summary, these results shed some light on the interplay between the evolution of genetic networks and the evolution of drug response. Understanding how genetic variability across different species might result in different sensitivity to drugs should improve our capacity to design treatments. Concretely, we hope that this line of research might one day help us create drugs and drug combinations that specifically affect a pathogen or diseased tissue, but not the host.
We present a cross-species chemogenomic screening platform using libraries of haploid deletion mutants from two yeast species, Saccharomyces cerevisiae and Schizosaccharomyces pombe. We screened a set of compounds of known and unknown mode of action (MoA) and derived quantitative drug scores (or D-scores), identifying mutants that are either sensitive or resistant to particular compounds. We found that compound–functional module relationships are more conserved than individual compound–gene interactions between these two species. Furthermore, we observed that combining data from both species allows for more accurate prediction of MoA. Finally, using this platform, we identified a novel small molecule that acts as a DNA damaging agent and demonstrate that its MoA is conserved in human cells.
doi:10.1038/msb.2010.107
PMCID: PMC3018166  PMID: 21179023
chemogenomics; evolution; modularity
24.  Paradigm Shift in Toxicity Testing and Modeling 
The AAPS Journal  2012;14(3):473-480.
The limitations of traditional toxicity testing, which is characterized by high-cost animal models with low-throughput readouts, inconsistent responses, ethical issues, and uncertain extrapolability to humans, call for alternative strategies for chemical risk assessment. A new strategy using in vitro human cell-based assays has been designed to identify key toxicity pathways and molecular mechanisms leading to the prediction of an in vivo response. The emergence of quantitative high-throughput screening (qHTS) technology has proved to be an efficient way to decompose complex toxicological end points into specific pathways of targeted organs. In addition, qHTS has made a significant impact on computational toxicology in two aspects. First, the ease of mechanism-of-action identification brought about by in vitro assays has enhanced the simplicity and effectiveness of machine learning, and second, the high-throughput nature and high reproducibility of qHTS have greatly improved the data quality and increased the quantity of training datasets available for predictive model construction. In this review, the benefits of qHTS routinely used in the US Tox21 program will be highlighted. Quantitative structure–activity relationship models built on traditional in vivo data and new qHTS data will be compared and analyzed. In conjunction with the transition from the pilot phase to the production phase of the Tox21 program, more qHTS data will be made available that will enrich the data pool for predictive toxicology. It is foreseeable that new in silico toxicity models based on high-quality qHTS data will achieve unprecedented reliability and robustness, thus becoming a valuable tool for risk assessment and drug discovery.
doi:10.1208/s12248-012-9358-1
PMCID: PMC3385826  PMID: 22528508
computational toxicology; qHTS; risk assessment; Tox21
25.  Identification of Small Molecule Lead Compounds for Visceral Leishmaniasis Using a Novel Ex Vivo Splenic Explant Model System 
Background
New drugs are needed to treat visceral leishmaniasis (VL) because the current therapies are toxic, expensive, and parasite resistance may weaken drug efficacy. We established a novel ex vivo splenic explant culture system from hamsters infected with luciferase-transfected Leishmania donovani to screen chemical compounds for anti-leishmanial activity.
Methodology/Principal Findings
This model has advantages over in vitro systems in that it: 1) includes the whole cellular population involved in the host-parasite interaction; 2) is initiated at a stage of infection when the immunosuppressive mechanisms that lead to progressive VL are evident; 3) involves the intracellular form of Leishmania; 4) supports parasite replication that can be easily quantified by detection of parasite-expressed luciferase; 5) is adaptable to a high-throughput screening format; and 6) can be used to identify compounds that have both direct and indirect anti-parasitic activity. The assay showed excellent discrimination between positive (amphotericin B) and negative (vehicle) controls with a Z' Factor >0.8. A duplicate screen of 4 chemical libraries containing 4,035 compounds identified 202 hits (5.0%) with a Z score of <–1.96 (p<0.05). Eighty-four (2.1%) of the hits were classified as lead compounds based on the in vitro therapeutic index (ratio of the compound concentration causing 50% cytotoxicity in the HepG2 cell line to the concentration that caused 50% reduction in the parasite load). Sixty-nine (82%) of the lead compounds were previously unknown to have anti-leishmanial activity. The most frequently identified lead compounds were classified as quinoline-containing compounds (14%), alkaloids (10%), aromatics (11%), terpenes (8%), phenothiazines (7%) and furans (5%).
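The hit and lead criteria above (a Z score below -1.96 for the parasite-derived luciferase signal, followed by an in vitro therapeutic index based on HepG2 cytotoxicity) lend themselves to a short sketch. The plate values, the specific well, and the selectivity cutoff of 10 below are illustrative assumptions, not the study's data or criteria.

```python
import numpy as np

def z_scores(signals):
    """Z score of each well's luciferase signal relative to the plate distribution.
    Z < -1.96 (nominal p < 0.05) flags a reduction in parasite-derived signal; by chance
    roughly 2-3% of inactive wells also cross this cutoff, which is why hits are retested."""
    signals = np.asarray(signals, dtype=float)
    return (signals - signals.mean()) / signals.std(ddof=1)

def therapeutic_index(hepg2_cc50, parasite_ic50):
    """In vitro therapeutic index: HepG2 50% cytotoxic concentration divided by the
    concentration causing a 50% reduction in parasite load (same units for both)."""
    return hepg2_cc50 / parasite_ic50

rng = np.random.default_rng(2)
plate = rng.normal(1.0, 0.1, 384)     # illustrative normalized luminescence values
plate[10] = 0.55                      # one well with a strong drop in parasite signal
hit_wells = np.where(z_scores(plate) < -1.96)[0]
print("candidate hit wells:", hit_wells)

ti = therapeutic_index(hepg2_cc50=50.0, parasite_ic50=2.0)
print(f"therapeutic index = {ti:.0f} ->", "lead-like" if ti >= 10 else "insufficient window")
```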
Conclusions/Significance
The ex vivo splenic explant model provides a powerful approach to identify new compounds active against L. donovani within the pathophysiologic environment of the infected spleen. Further in vivo evaluation and chemical optimization of these lead compounds may generate new candidates for preclinical studies of treatment for VL.
Author Summary
Visceral leishmaniasis is a life-threatening parasitic disease present in several countries of the world. New drugs are needed to treat this disease because treatments are becoming increasingly ineffective. We established a novel system to screen for new anti-leishmanial compounds that utilizes spleen cells from hamsters infected with the parasite Leishmania donovani. The parasite strain we used was genetically engineered to emit light through the incorporation of the firefly luciferase gene. This laboratory test system has the advantage of reproducing the cellular environment where the drug has to combat the infection. The efficacy of the compounds is easily determined by measuring the light emitted by the surviving parasites in a luminometer after exposing the infected cells to the test compounds. The screening of more than 4,000 molecules showed that 84 (2.1%) of them had anti-leishmanial activity and an acceptable toxicity profile. Eighty-two percent of these molecules, which had varied chemical structures, were previously unknown to have anti-leishmanial activity. Further animal studies of these new chemical entities may identify drug candidates for the treatment of visceral leishmaniasis.
doi:10.1371/journal.pntd.0000962
PMCID: PMC3039689  PMID: 21358812
