1.  Enhanced stochastic optimization algorithm for finding effective multi-target therapeutics 
BMC Bioinformatics  2011;12(Suppl 1):S18.
Background
For treating a complex disease such as cancer, we need effective means to control the biological network that underlies the disease. However, biological networks are typically robust to external perturbations, making it difficult to beneficially alter the network dynamics by controlling a single target. In fact, multi-target therapeutics is often more effective compared to monotherapies, and combinatory drugs are commonly used these days for treating various diseases. A practical challenge in combination therapy is that the number of possible drug combinations increases exponentially, which makes the prediction of the optimal drug combination a difficult combinatorial optimization problem. Recently, a stochastic optimization algorithm called the Gur Game algorithm was proposed for drug optimization, which was shown to be very efficient in finding potent drug combinations.
Results
In this paper, we propose a novel stochastic optimization algorithm that can be used for effective optimization of combinatory drugs. The proposed algorithm analyzes how the concentration change of a specific drug affects the overall drug response, thereby making an informed guess on how the concentration should be updated to improve the drug response. We evaluated the performance of the proposed algorithm based on various drug response functions, and compared it with the Gur Game algorithm.
Conclusions
Numerical experiments clearly show that the proposed algorithm significantly outperforms the original Gur Game algorithm, in terms of reliability and efficiency. This enhanced optimization algorithm can provide an effective framework for identifying potent drug combinations that lead to optimal drug response.
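Neither the Gur Game nor the authors' enhanced algorithm is specified in this abstract; purely as a generic illustration of the kind of informed, coordinate-wise stochastic search described above, the Python sketch below nudges one drug's concentration level at a time and keeps the change when a hypothetical response function improves. The function response, the level grid, and the acceptance rule are all invented for the example and do not reproduce the paper's method.

```python
import random

def response(levels):
    """Hypothetical drug-response surrogate: best when every drug sits at level 2."""
    return -sum((l - 2) ** 2 for l in levels)

def stochastic_search(n_drugs=4, n_levels=5, iters=300, seed=0):
    rng = random.Random(seed)
    levels = [rng.randrange(n_levels) for _ in range(n_drugs)]
    best_levels, best_val = levels[:], response(levels)
    for _ in range(iters):
        i = rng.randrange(n_drugs)                 # pick one drug to perturb
        step = rng.choice([-1, 1])                 # propose raising or lowering its level
        cand = levels[:]
        cand[i] = min(max(cand[i] + step, 0), n_levels - 1)
        # informed acceptance: keep the change if the measured response improves,
        # but occasionally accept a worse one to escape local optima
        if response(cand) > response(levels) or rng.random() < 0.1:
            levels = cand
        if response(levels) > best_val:
            best_levels, best_val = levels[:], response(levels)
    return best_levels, best_val

if __name__ == "__main__":
    print(stochastic_search())   # expect levels close to [2, 2, 2, 2]
```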
doi:10.1186/1471-2105-12-S1-S18
PMCID: PMC3044272  PMID: 21342547
2.  A Diverse Stochastic Search Algorithm for Combination Therapeutics 
BioMed Research International  2014;2014:873436.
Background. Design of drug combination cocktails to maximize sensitivity for individual patients presents a challenge in terms of minimizing the number of experiments to attain the desired objective. The enormous number of possible drug combinations constrains exhaustive experimentation approaches, and personal variations in genetic diseases restrict the use of prior knowledge in optimization. Results. We present a stochastic search algorithm that consists of a parallel experimentation phase followed by a combination of focused and diversified sequential search. We evaluated our approach on seven synthetic examples (four of them evaluated twice with different parameters) and on two biological examples of bacterial and lung cancer cell inhibition response to combination drugs. The performance of our approach compared to the recently proposed adaptive reference update approach was superior for all the examples considered, achieving an average of 45% reduction in the number of experimental iterations. Conclusions. As the results illustrate, the proposed diverse stochastic search algorithm can produce optimized combinations in a relatively small number of iterative steps. This approach can be combined with available knowledge on the genetic makeup of the patient to design the optimal selection of drug cocktails.
doi:10.1155/2014/873436
PMCID: PMC3971504  PMID: 24738075
3.  Bird biodiversity assessments in temperate forest: the value of point count versus acoustic monitoring protocols 
PeerJ  2015;3:e973.
Effective monitoring programs for biodiversity are needed to assess trends in biodiversity and evaluate the consequences of management. This is particularly true for birds and faunas that occupy interior forest and other areas of low human population density, as these are frequently under-sampled compared to other habitats. For birds, Autonomous Recording Units (ARUs) have been proposed as a supplement or alternative to point counts made by human observers to enhance monitoring efforts. We employed two strategies (i.e., simultaneous-collection and same-season) to compare point count and ARU methods for quantifying species richness and composition of birds in temperate interior forests. The simultaneous-collection strategy compares surveys by ARUs and point counts, with methods matched in time, location, and survey duration such that the person and machine simultaneously collect data. The same-season strategy compares surveys from ARUs and point counts conducted at the same locations throughout the breeding season, but methods differ in the number, duration, and frequency of surveys. This second strategy more closely follows the ways in which monitoring programs are likely to be implemented. Site-specific estimates of richness (but not species composition) differed between methods; however, the nature of the relationship was dependent on the assessment strategy. Estimates of richness from point counts were greater than estimates from ARUs in the simultaneous-collection strategy. Woodpeckers in particular, were less frequently identified from ARUs than point counts with this strategy. Conversely, estimates of richness were lower from point counts than ARUs in the same-season strategy. Moreover, in the same-season strategy, ARUs detected the occurrence of passerines at a higher frequency than did point counts. Differences between ARU and point count methods were only detected in site-level comparisons. Importantly, both methods provide similar estimates of species richness and composition for the region. Consequently, if single visits to sites or short-term monitoring are the goal, point counts will likely perform better than ARUs, especially if species are rare or vocalize infrequently. However, if seasonal or annual monitoring of sites is the goal, ARUs offer a viable alternative to standard point-count methods, especially in the context of large-scale or long-term monitoring of temperate forest birds.
doi:10.7717/peerj.973
PMCID: PMC4451018  PMID: 26038728
ARU; Avian; Conservation; Diversity; Interior forest; Long-term monitoring; Management; Methodology
4.  Characterization of an Arginine:Pyruvate Transaminase in Arginine Catabolism of Pseudomonas aeruginosa PAO1▿  
Journal of Bacteriology  2007;189(11):3954-3959.
The arginine transaminase (ATA) pathway represents one of the multiple pathways for l-arginine catabolism in Pseudomonas aeruginosa. The AruH protein was proposed to catalyze the first step in the ATA pathway, converting the substrates l-arginine and pyruvate into 2-ketoarginine and l-alanine. Here we report the initial biochemical characterization of this enzyme. The aruH gene was overexpressed in Escherichia coli, and its product was purified to homogeneity. High-performance liquid chromatography and mass spectrometry (MS) analyses were employed to detect the presence of the transamination products 2-ketoarginine and l-alanine, thus demonstrating the proposed biochemical reaction catalyzed by AruH. The enzymatic properties and kinetic parameters of dimeric recombinant AruH were determined by a coupled reaction with NAD+ and l-alanine dehydrogenase. The optimal activity of AruH was found at pH 9.0, and it has a novel substrate specificity with an order of preference of Arg > Lys > Met > Leu > Orn > Gln. With l-arginine and pyruvate as the substrates, Lineweaver-Burk plots of the data revealed a series of parallel lines characteristic of a ping-pong kinetic mechanism with calculated Vmax and kcat values of 54.6 ± 2.5 μmol/min/mg and 38.6 ± 1.8 s−1. The apparent Km and catalytic efficiency (kcat/Km) were 1.6 ± 0.1 mM and 24.1 mM−1 s−1 for pyruvate and 13.9 ± 0.8 mM and 2.8 mM−1 s−1 for l-arginine. When l-lysine was used as the substrate, MS analysis suggested Δ1-piperideine-2-carboxylate as its transamination product. These results implied that AruH may have a broader physiological function in amino acid catabolism.
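As a quick arithmetic check (not additional data), the reported catalytic efficiencies follow directly from the quoted kcat and Km values:

\[
k_{\text{cat}}/K_m = \frac{38.6\ \text{s}^{-1}}{1.6\ \text{mM}} \approx 24.1\ \text{mM}^{-1}\,\text{s}^{-1}\ (\text{pyruvate}), \qquad
k_{\text{cat}}/K_m = \frac{38.6\ \text{s}^{-1}}{13.9\ \text{mM}} \approx 2.8\ \text{mM}^{-1}\,\text{s}^{-1}\ (\text{L-arginine})
\]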
doi:10.1128/JB.00262-07
PMCID: PMC1913410  PMID: 17416668
5.  Functional Genomics Enables Identification of Genes of the Arginine Transaminase Pathway in Pseudomonas aeruginosa▿  
Journal of Bacteriology  2007;189(11):3945-3953.
Arginine utilization in Pseudomonas aeruginosa with multiple catabolic pathways represents one of the best examples of the metabolic versatility of this organism. To identify genes involved in arginine catabolism, we have employed DNA microarrays to analyze the transcriptional profiles of this organism in response to l-arginine. While most of the genes involved in arginine uptake, regulation, and metabolism have been identified as members of the ArgR (arginine-responsive regulatory protein) regulon in our previous study, they did not include any genes of the arginine dehydrogenase (ADH) pathway. In this study, 18 putative transcriptional units of 38 genes, including the two known genes of the ADH pathway, kauB and gbuA, were found to be inducible by exogenous l-arginine in the absence of ArgR. To identify the missing genes that encode enzymes for the initial steps of the ADH pathway, the potential physiological functions of those candidate genes in arginine utilization were studied by growth phenotype analysis of knockout mutants. Expression of these genes was induced by l-arginine in an aruF mutant strain devoid of a functional arginine succinyltransferase pathway, the major route of arginine utilization. Disruption of dadA, a putative catabolic alanine dehydrogenase-encoding gene, in the aruF mutant produced no growth on l-arginine, suggesting the involvement of l-alanine in arginine catabolism. This hypothesis was further supported by the detection of an l-arginine-inducible arginine:pyruvate transaminase activity in the aruF mutant. Knockout of aruH and aruI, which encode an arginine:pyruvate transaminase and a 2-ketoarginine decarboxylase in an operon, also abolished the ability of the aruF mutant to grow on l-arginine. The results of high-performance liquid chromatography analysis demonstrated consumption of 2-ketoarginine and suggested that generation of 4-guanidinobutyraldehyde occurred in the aruF mutant but not in the aruF aruI mutant. These results led us to propose the arginine transaminase pathway that removes the α-amino group of l-arginine via transamination instead of oxidative deamination by dehydrogenase or oxidase as originally proposed. In the same genetic locus, we also identified a two-component system, AruRS, for the regulation of arginine-responsive induction of the arginine transaminase pathway. This work depicted a wider network of arginine metabolism than we previously recognized.
doi:10.1128/JB.00261-07
PMCID: PMC1913404  PMID: 17416670
6.  A high-performance spatial database based approach for pathology imaging algorithm evaluation 
Background:
Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform.
Context:
The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model.
Aims:
(1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure.
Materials and Methods:
We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we have implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we have employed a shared nothing parallel database architecture, which distributes data homogenously across multiple database partitions to take advantage of parallel computation power and implements spatial indexing to achieve high I/O throughput.
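The platform itself relies on the PAIS model, SQL with spatial extensions, and a shared-nothing parallel database, none of which is reproduced here. As a loose, self-contained illustration of how a grid-based spatial index cuts down a spatial join, the Python sketch below bins axis-aligned result regions from two hypothetical algorithms into grid cells and compares only regions that share a cell; all region data and the cell size are made up.

```python
from collections import defaultdict

def grid_index(regions, cell=100):
    """Map each axis-aligned region (xmin, ymin, xmax, ymax) to the grid cells it overlaps."""
    index = defaultdict(list)
    for rid, (x0, y0, x1, y1) in enumerate(regions):
        for gx in range(int(x0 // cell), int(x1 // cell) + 1):
            for gy in range(int(y0 // cell), int(y1 // cell) + 1):
                index[(gx, gy)].append(rid)
    return index

def overlaps(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def spatial_join(set_a, set_b, cell=100):
    """Return index pairs of overlapping regions, testing only pairs that share a grid cell."""
    idx_b = grid_index(set_b, cell)
    pairs = set()
    for ra, rect_a in enumerate(set_a):
        for key in grid_index([rect_a], cell):     # cells touched by rect_a
            for rb in idx_b[key]:
                if overlaps(rect_a, set_b[rb]):
                    pairs.add((ra, rb))
    return sorted(pairs)

if __name__ == "__main__":
    algo1 = [(0, 0, 50, 50), (200, 200, 260, 260)]     # hypothetical boundaries from algorithm 1
    algo2 = [(40, 40, 90, 90), (500, 500, 550, 550)]   # hypothetical boundaries from algorithm 2
    print(spatial_join(algo1, algo2))                  # [(0, 0)]
```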
Results:
Our work proposes a high performance, parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole slide images. The tools we develop are open source and available to download.
Conclusions:
Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is the support for queries involving spatial predicates and comparisons. In our work, we develop an efficient data model and parallel database approach to model, normalize, manage and query large volumes of analytical image result data. Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provide a full pipeline to normalize, load, manage and query analytical results for algorithm evaluation.
doi:10.4103/2153-3539.108543
PMCID: PMC3624706  PMID: 23599905
Algorithm validation; parallel database; pathology imaging; spatial database
7.  3D Protein structure prediction with genetic tabu search algorithm 
BMC Systems Biology  2010;4(Suppl 1):S6.
Background
Protein structure prediction (PSP) has important applications in different fields, such as drug design, disease prediction, and so on. In protein structure prediction, there are two important issues. The first one is the design of the structure model and the second one is the design of the optimization technology. Because of the complexity of the realistic protein structure, the structure model adopted in this paper is a simplified model, which is called the off-lattice AB model. After the structure model is assumed, optimization technology is needed to search for the best conformation of a protein sequence based on the assumed structure model. However, PSP is an NP-hard problem even if the simplest model is assumed. Thus, many algorithms have been developed to solve the global optimization problem. In this paper, a hybrid algorithm, which combines the genetic algorithm (GA) and the tabu search (TS) algorithm, is developed to complete this task.
Results
In order to develop an efficient optimization algorithm, several improved strategies are developed for the proposed genetic tabu search algorithm. The combined use of these strategies can improve the efficiency of the algorithm. In these strategies, tabu search introduced into the crossover and mutation operators can improve the local search capability, the adoption of a variable population size strategy can maintain the diversity of the population, and the ranking selection strategy can improve the possibility of an individual with a low energy value entering the next generation. Experiments are performed with Fibonacci sequences and real protein sequences. Experimental results show that the lowest energy obtained by the proposed GATS algorithm is lower than that obtained by previous methods.
Conclusions
The hybrid algorithm has the advantages of both the genetic algorithm and the tabu search algorithm. It makes use of the multiple search points of the genetic algorithm, and can overcome the poor hill-climbing capability of the conventional genetic algorithm by using the flexible memory functions of TS. Compared with some previous algorithms, the GATS algorithm has better performance in global optimization and can predict 3D protein structure more effectively.
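The abstract names the strategies (tabu search inside crossover and mutation, variable population size, ranking selection) without giving their formulas; the sketch below is only a compressed, generic genetic-algorithm loop whose mutation step avoids recently visited solutions via a tabu set, applied to a toy quadratic objective rather than the off-lattice AB model energy, and with a fixed population size for brevity.

```python
import random

def energy(x):
    """Toy objective standing in for the off-lattice AB model energy (lower is better)."""
    return sum(xi ** 2 for xi in x)

def mutate_tabu(x, tabu, rng, step=0.3, tries=10):
    """Gaussian mutation that rejects moves onto solutions already in the tabu set."""
    for _ in range(tries):
        cand = tuple(xi + rng.gauss(0, step) for xi in x)
        key = tuple(round(c, 1) for c in cand)    # coarse key used for tabu membership
        if key not in tabu:
            tabu.add(key)
            return cand
    return x

def genetic_tabu_search(dim=6, pop_size=30, gens=200, seed=0):
    rng = random.Random(seed)
    pop = [tuple(rng.uniform(-2, 2) for _ in range(dim)) for _ in range(pop_size)]
    tabu = set()
    for _ in range(gens):
        pop.sort(key=energy)                      # ranking selection: keep the best half
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, dim)
            children.append(mutate_tabu(a[:cut] + b[cut:], tabu, rng))  # one-point crossover
        pop = parents + children
    best = min(pop, key=energy)
    return best, energy(best)

if __name__ == "__main__":
    print(round(genetic_tabu_search()[1], 4))     # prints a small energy for the toy objective
```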
doi:10.1186/1752-0509-4-S1-S6
PMCID: PMC2880412  PMID: 20522256
8.  Rapid optimization of drug combinations for the optimal angiostatic treatment of cancer 
Angiogenesis  2015;18(3):233-244.
Drug combinations can improve angiostatic cancer treatment efficacy and enable the reduction of side effects and drug resistance. Combining drugs is non-trivial due to the high number of possibilities. We applied a feedback system control (FSC) technique with a population-based stochastic search algorithm to navigate through the large parametric space of nine angiostatic drugs at four concentrations to identify optimal low-dose drug combinations. This implied an iterative approach of in vitro testing of endothelial cell viability and algorithm-based analysis. The optimal synergistic drug combination, containing erlotinib, BEZ-235 and RAPTA-C, was reached in a small number of iterations. Final drug combinations showed enhanced endothelial cell specificity and synergistically inhibited proliferation (p < 0.001), but not migration of endothelial cells, and forced enhanced numbers of endothelial cells to undergo apoptosis (p < 0.01). Successful translation of this drug combination was achieved in two preclinical in vivo tumor models. Tumor growth was inhibited synergistically and significantly (p < 0.05 and p < 0.01, respectively) using reduced drug doses as compared to optimal single-drug concentrations. At the applied conditions, single-drug monotherapies had no or negligible activity in these models. We suggest that FSC can be used for rapid identification of effective, reduced dose, multi-drug combinations for the treatment of cancer and other diseases.
Electronic supplementary material
The online version of this article (doi:10.1007/s10456-015-9462-9) contains supplementary material, which is available to authorized users.
doi:10.1007/s10456-015-9462-9
PMCID: PMC4473022  PMID: 25824484
Anti-angiogenesis; Combination therapy; Drug–drug interactions; Feedback system control; Search algorithm
9.  A hybrid algorithm for instant optimization of beam weights in anatomy-based intensity modulated radiotherapy: A performance evaluation study 
The study aims to introduce a hybrid optimization algorithm for anatomy-based intensity modulated radiotherapy (AB-IMRT). Our proposal is that by integrating an exact optimization algorithm with a heuristic optimization algorithm, the advantages of both algorithms can be combined, which will lead to an efficient global optimizer solving the problem at a very fast rate. Our hybrid approach combines the Gaussian elimination algorithm (an exact optimizer) with the fast simulated annealing algorithm (a heuristic global optimizer) for the optimization of beam weights in AB-IMRT. The algorithm has been implemented using MATLAB software. The optimization efficiency of the hybrid algorithm is assessed by (i) analysis of the numerical characteristics of the algorithm and (ii) analysis of the clinical capabilities of the algorithm. The numerical and clinical characteristics of the hybrid algorithm are compared with the Gaussian elimination method (GEM) and fast simulated annealing (FSA). The numerical characteristics include convergence, consistency, number of iterations and overall optimization speed, which were analyzed for the cases of 8 patients. The clinical capabilities of the hybrid algorithm are demonstrated in (a) prostate and (b) brain cases. The analyses reveal that (i) the convergence speed of the hybrid algorithm is approximately three times higher than that of the FSA algorithm; (ii) the convergence (percentage reduction in the cost function) in the hybrid algorithm is improved by about 20% compared to that in the GEM algorithm; (iii) the hybrid algorithm is capable of producing relatively better treatment plans in terms of Conformity Index (CI) [~ 2% - 5% improvement] and Homogeneity Index (HI) [~ 4% - 10% improvement] compared to the GEM and FSA algorithms; (iv) the sparing of organs at risk in hybrid algorithm-based plans is better than that in GEM-based plans and comparable to that in FSA-based plans; and (v) the beam weights resulting from the hybrid algorithm are about 20% smoother than those obtained with the GEM and FSA algorithms. In summary, the study demonstrates that hybrid algorithms can be effectively used for fast optimization of beam weights in AB-IMRT.
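The dose model and the exact GEM/FSA formulation are not given in the abstract; the sketch below only illustrates the general pattern of an exact solution followed by stochastic refinement, on an invented least-squares problem: an unconstrained least-squares solve stands in for Gaussian elimination, and a fast-annealing loop refines the weights under a non-negativity constraint. The matrix A and the prescription d are random placeholders.

```python
import numpy as np

def cost(A, w, d):
    """Quadratic cost between delivered dose A @ w and prescribed dose d."""
    return float(np.sum((A @ w - d) ** 2))

def hybrid_optimize(A, d, iters=5000, t0=1.0, seed=0):
    rng = np.random.default_rng(seed)
    # Exact step: unconstrained least-squares solution (stand-in for Gaussian elimination),
    # clipped to non-negative beam weights.
    w = np.clip(np.linalg.lstsq(A, d, rcond=None)[0], 0, None)
    best_w, best_c = w.copy(), cost(A, w, d)
    for k in range(iters):
        t = t0 / (1 + k)                                        # fast annealing schedule
        cand = np.clip(w + rng.normal(0, 0.05, size=w.shape), 0, None)
        dc = cost(A, cand, d) - cost(A, w, d)
        if dc < 0 or rng.random() < np.exp(-dc / max(t, 1e-9)):  # Metropolis acceptance
            w = cand
        if cost(A, w, d) < best_c:
            best_w, best_c = w.copy(), cost(A, w, d)
    return best_w, best_c

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.random((20, 8))       # hypothetical beamlet-to-voxel dose matrix
    d = rng.random(20)            # hypothetical prescribed dose per voxel
    print(round(hybrid_optimize(A, d)[1], 4))
```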
doi:10.4103/0971-6203.79693
PMCID: PMC3119957  PMID: 21731224
Anatomy-based IMRT; hybrid algorithm; intensity modulated radiotherapy; optimization; fast simulated annealing
10.  Machine Learning Assisted Design of Highly Active Peptides for Drug Discovery 
PLoS Computational Biology  2015;11(4):e1004074.
The discovery of peptides possessing high biological activity is very challenging due to the enormous diversity of candidates, of which only a minority have the desired properties. To lower cost and reduce the time to obtain promising peptides, machine learning approaches can greatly assist in the process and even partly replace expensive laboratory experiments by learning a predictor with existing data or with a smaller amount of data generation. Unfortunately, once the model is learned, selecting peptides having the greatest predicted bioactivity often requires a prohibitive amount of computational time. For this combinatorial problem, heuristics and stochastic optimization methods are not guaranteed to find adequate solutions. We focused on recent advances in kernel methods and machine learning to learn a predictive model with proven success. For this type of model, we propose an efficient algorithm based on graph theory that is guaranteed to find the peptides for which the model predicts maximal bioactivity. We also present a second algorithm capable of sorting the peptides of maximal bioactivity. Extensive analyses demonstrate how these algorithms can be part of an iterative combinatorial chemistry procedure to speed up the discovery and the validation of peptide leads. Moreover, the proposed approach does not require the use of known ligands for the target protein since it can leverage recent multi-target machine learning predictors where ligands for similar targets can serve as initial training data. Finally, we validated the proposed approach in vitro with the discovery of new cationic antimicrobial peptides. Source code is freely available at http://graal.ift.ulaval.ca/peptide-design/.
Author Summary
Part of the complexity of drug discovery is the sheer chemical diversity to explore, combined with all the requirements a compound must meet to become a commercial drug. Hence, it makes sense to automate this chemical exploration endeavor in a wise, informed, and efficient fashion. Here, we focused on peptides as they have properties that make them excellent drug starting points. Machine learning techniques may replace expensive in-vitro laboratory experiments by learning an accurate model of them. However, computational models also suffer from the combinatorial explosion due to the enormous chemical diversity. Indeed, applying the model to every peptide would take an astronomical amount of computer time. Therefore, given a model, is it possible to determine, using reasonable computational time, the peptide that has the best properties and chance for success? This exact question is what motivated our work. We focused on recent advances in kernel methods and machine learning to learn a model that already had excellent results. We demonstrate that this class of model has mathematical properties that make it possible to rapidly identify and sort the best peptides. Finally, in-vitro and in-silico results are provided to support and validate this theoretical discovery.
doi:10.1371/journal.pcbi.1004074
PMCID: PMC4388847  PMID: 25849257
11.  Change of Platelet Reactivity to Antiplatelet Therapy after Stenting Procedure for Cerebral Artery Stenosis: VerifyNow Antiplatelet Assay before and after Stenting 
Neurointervention  2012;7(1):23-26.
Purpose
VerifyNow antiplatelet assays were performed before and after stenting for various cerebral artery stenoses to determine the effect of the procedure itself on the function of the dual antiplatelet agents given.
Materials and Methods
A total of 30 consecutive patients who underwent a cerebral arterial stenting procedure were enrolled. The antiplatelet pretreatment regimen was aspirin (100 mg daily) and clopidogrel (a 300 mg loading dose followed by 75 mg daily). The VerifyNow antiplatelet assay was performed before and right after stenting. The two test results were compared in terms of aspirin reaction units (ARU), P2Y12 reaction units (PRU), baseline (BASE), and percentage inhibition. We evaluated the occurrence of any intra-procedural in-stent thrombosis or immediate thromboembolic complication, and ischemic events at 1-month follow-up.
Results
The median pre-ARU was 418 (range, 350-586). For clopidogrel, the medians of the pre-BASE, PRU, and percent inhibition were 338 (279-454), 256 (56-325), and 27% (0-57%), respectively. The medians of the post-ARU, BASE, PRU, and percent inhibition after stenting were 469 (range, 389-573), 378 (288-453), 274 (81-370), and 26% (0-79%), respectively. There was a significant increase in ARU (p=0.045), BASE (p=0.026), and PRU (p=0.018) from before to after stenting. One immediate thromboembolic event was observed in the poor-response group after stenting. There was no in-stent thrombosis or ischemic event at 1-month follow-up.
Conclusion
We observed a significant increase of platelet reactivity to dual antiplatelet therapy right after stenting procedure for various cerebral arterial stenoses.
doi:10.5469/neuroint.2012.7.1.23
PMCID: PMC3299946  PMID: 22454781
Stent; Cerebrovascular disorders; Atherosclerosis; Antiplatelet drugs; VerifyNow antiplatelet assay
12.  NAVIS-An UGV Indoor Positioning System Using Laser Scan Matching for Large-Area Real-Time Applications 
Sensors (Basel, Switzerland)  2014;14(7):11805-11824.
Laser scan matching with grid-based maps is a promising tool for real-time indoor positioning of mobile Unmanned Ground Vehicles (UGVs). However, there are still critical implementation problems, such as the ability to estimate the position by sensing the unknown indoor environment with sufficient accuracy and low enough latency for stable vehicle control, so further development work is necessary. Unfortunately, most of the existing methods employ heuristics for quick positioning, in which accumulated errors easily lead to loss of positioning accuracy. This severely restricts their application in large areas and over lengthy periods of time. This paper introduces an efficient real-time mobile UGV indoor positioning system for large-area applications using laser scan matching with an improved probabilistically-motivated Maximum Likelihood Estimation (IMLE) algorithm, which is based on a multi-resolution, patch-divided grid likelihood map. Compared with traditional methods, the improvements embodied in IMLE include: (a) Iterative Closed Point (ICP) preprocessing, which adaptively decreases the search scope; (b) a brute-force search matching method on multi-resolution map layers, based on the likelihood value between the current laser scan and the grid map within the refined search scope, adopted to obtain the globally optimal position at each scan matching; and (c) a patch-divided likelihood map supporting a large indoor area. A UGV platform called NAVIS was designed, manufactured, and tested on a low-cost robot integrating a LiDAR and an odometer sensor to verify the IMLE algorithm. A series of experiments based on simulated data and field tests with NAVIS showed that the proposed IMLE algorithm is a better way to perform local scan matching: it offers a quick and stable positioning solution with high accuracy, so it can be part of a large-area localization/mapping application. The NAVIS platform can reach an update rate of 12 Hz in a feature-rich environment and 2 Hz even in a feature-poor environment. Therefore, it can be utilized in real-time applications.
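Ignoring ICP preprocessing, the multi-resolution layers, and patch division, the brute-force matching step can be pictured as scoring candidate offsets of a scan against a grid likelihood map; the Python sketch below does exactly that for integer 2D translations only, with a toy map and scan invented for the example.

```python
import numpy as np

def scan_match(grid, scan_points, search=3):
    """Brute-force search over integer (dx, dy) offsets: the score of an offset is the
    sum of grid likelihoods at the shifted scan points; returns the best-scoring offset."""
    h, w = grid.shape
    best_offset, best_score = (0, 0), -np.inf
    for dx in range(-search, search + 1):
        for dy in range(-search, search + 1):
            score = 0.0
            for x, y in scan_points:
                gx, gy = x + dx, y + dy
                if 0 <= gx < w and 0 <= gy < h:
                    score += grid[gy, gx]
            if score > best_score:
                best_offset, best_score = (dx, dy), score
    return best_offset, best_score

if __name__ == "__main__":
    grid = np.zeros((20, 20))
    grid[5, 4:9] = 1.0                      # a "wall" in the likelihood map
    scan = [(x, 5) for x in range(2, 7)]    # the same wall seen shifted by (+2, 0)
    print(scan_match(grid, scan))           # expect offset (2, 0)
```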
doi:10.3390/s140711805
PMCID: PMC4168456  PMID: 24999715
laser scan matching; indoor position; real-time; iterative closed point; unmanned ground vehicle
13.  Therapy of moderate and severe psoriasis 
Objective and methods
This health technology assessment (HTA) report systematically synthesises randomized controlled studies (RCT) on the therapy of moderate and severe psoriasis vulgaris which were published between 1999 and 2004; it includes some important clinical studies published after 2004 and thus updates the English HTA report by Griffiths et al. [1]. The major objective is the evaluation of the medical effectiveness of different therapeutical approaches and of their cost-effectiveness with relevance for Germany.
Results
The major conclusions from the results of medical RCTs on moderate and severe psoriasis vulgaris are:
Oral fumarates are effective in the treatment of moderate to severe psoriasis vulgaris. However, fumarates quite frequently cause moderate side effects. Cyclosporine and methotrexate are both effective in the treatment of severe psoriasis vulgaris. Both substances have different spectra of side effects, which may limit their individual applicability. Acitretin is only moderately effective in the treatment of severe psoriasis of the plaque type. Calcipotriol or UV radiation used at the same time can increase the clinical effectiveness of acitretin.
Systemic PUVA, balneo-PUVA and UVB therapy are all effective for the treatment of severe psoriasis. The combination of UV therapy with vitamin D3 analogues or with topical steroids is more effective than treatment with UV radiation alone. Saltwater baths increase the effectiveness of UVB therapy. No RCTs on the therapeutic effects of topical tar or of dithranol in combination with UV therapy have been published so far. Continuous therapy with PUVA should not be applied due to its proven photocarcinogenicity.
Three substances from the group of biologicals (Efalizumab, Etanercept, and Infliximab) are now available in Europe and a further substance (Alefacept) is available in the USA for the treatment of moderate to severe psoriasis. All biologicals have been effective in placebo-controlled studies. The substances differ in the times until a clinical effect is observable, in the spectrum of side effects and in their efficacy in psoriatic arthritis.
From health-economic studies considering both costs and clinical effectiveness, oral fumarates appear to be superior to acitretin or cyclosporine (although cyclosporine appears to be more effective in severe psoriasis). From the health-economic view, methotrexate is equivalent to UVB or PUVA and superior to cyclosporine. The therapy options UVB, UVB plus calcipotriol and PUVA are equivalent and superior to balneo-phototherapy. Biologicals are cost-intensive and should be used when other approaches are not sufficient or are not applicable due to their side effects.
The HTA report summarizes some health-economic studies on dithranol, on calcipotriol and on the combination with tar and UV light. No RCTs have been published for the treatment of severe psoriasis with these agents alone, but it appears certain that these substances are effective in severe psoriasis as well.
Discussion
The spectrum of therapeutical options has fortunately increased during the last years. It must be emphasized that a number of therapeutical procedures exist which are not discussed in detail in this HTA. This is due to the literature search strategy: only RCTs performed with patients with moderate and/or severe psoriasis vulgaris were included in this evaluation. This led to the exclusion of a number of substances which are traditionally used alone or in combination for the treatment of moderate or severe psoriasis vulgaris (e.g. dithranol, salicylic acid, tar, corticosteroids and topical retinoids). Moreover, other approaches which include neither drugs nor UV light are not discussed in this HTA, although the authors believe in the importance of psychotherapeutic interventions, educational approaches and combined medical and non-medical approaches in rehabilitation medicine in the management of psoriasis vulgaris.
The transferability of the health-economic evaluations is strongly limited by the fact that all but one of the included evaluations were not aligned to a German setting. A future research question will be the evaluation of the duration of remission and of relapse rates in the context of the different therapy options for moderate and severe psoriasis. Moreover, the consideration of combined outcomes, such as the improvement of psoriatic symptoms and the decrease of symptoms in accompanying psoriatic arthritis, represents a future requirement of health assessment.
Conclusions
From the clinical point of view, it is positive that the spectrum of therapeutic procedures for a chronic severe skin disease has increased continuously during the last years. In cases of individual contraindications or individual inefficacies, it is now possible to try alternative approaches. Moreover, the risk of long-term side effects can be reduced by changing the therapeutic procedure after some time (so-called rotation therapy). The therapeutic algorithm for severe psoriasis vulgaris now includes photo(chemo-)therapy in combination with topical substances, oral fumaric acid esters, retinoids (in combination with phototherapy or topical substances), methotrexate, cyclosporine and the new biologicals.
Future studies should address therapeutic approaches which cannot easily be studied by RCTs, e.g. physical, balneological and climatic approaches, educational programs and complex rehabilitation therapy, all of which may have positive effects on individuals with severe psoriasis.
As with the medical management of moderate and severe psoriasis, the economic evaluation also points toward a strategic therapy concept which corresponds to a large extent to the algorithm followed in medical practice.
PMCID: PMC3011355  PMID: 21289958
14.  Search Algorithms as a Framework for the Optimization of Drug Combinations 
PLoS Computational Biology  2008;4(12):e1000249.
Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but presently they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms—originally developed for digital communication—modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of the decline with age in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs using only one-third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6–9 interventions in 80–90% of tests, compared with 15–30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort and suggests a general strategy for its solution.
Author Summary
This work describes methods that identify drug combinations that might alleviate the suffering caused by complex diseases. Our biological model systems are: physiological decline associated with aging, and selective killing of cancer cells. The novelty of this approach is based on a new application of methods from digital communications theory, which becomes useful when the number of possible combinations is large and a complete set of measurements cannot be obtained. This limit is reached easily, given the many drugs and doses available for complex diseases. We are not simply using computer models but are using search algorithms implemented with biological measurements, built to integrate information from different sources, including simulations. This might be considered parallel biological computation and differs from the classic systems biology approach by having search algorithms rather than explicit quantitative models as the central element. Because variation is an essential component of biology, this approach might be more appropriate for combined drug interventions, which can be considered a form of biological control. Search algorithms are used in many fields in physics and engineering. We hope that this paper will generate interest in a new application of importance to human health from practitioners of diverse computational disciplines.
doi:10.1371/journal.pcbi.1000249
PMCID: PMC2590660  PMID: 19112483
15.  Efficacy and safety of antiplatelet-combination therapy after drug-eluting stent implantation 
Background/Aims
Combination single-pill therapy can improve cost-effectiveness in a typical medical therapy. However, there is little evidence about the efficacy and tolerability of combination single-pill antiplatelet therapy after percutaneous coronary intervention (PCI) with drug-eluting stents (DES).
Methods
From June to November 2012, in total, 142 patients who met the following criteria were enrolled: at least 18 years old; successful PCI with DES at least 3 months earlier; and regular medication of aspirin and clopidogrel with no side effects. After VerifyNow P2Y12 and aspirin assays, the combination single pill of aspirin and clopidogrel was given and laboratory tests were repeated 6 weeks later.
Results
At baseline, the incidence of aspirin resistance, defined as aspirin reaction unit (ARU) ≥ 550, was 9.2%, that of clopidogrel resistance, defined as P2Y12 reaction unit (PRU) ≥ 230, was 46.5%, and that of percent inhibition of PRU < 20% was 32.4%. At follow-up, the incidence of resistance by ARU value was 7.0%, 50.0% by PRU value, and 35.9% by percentage inhibition of PRU, respectively. The mean values of ARU (431.5 ± 63.6 vs. 439.8 ± 55.2; p = 0.216) and PRU (227.5 ± 71.4 vs. 223.3 ± 76.0; p = 0.350) were not significantly different before versus after antiplatelet-combination single-pill therapy. Five adverse events (3.5%) were observed during the study period.
Conclusions
Combination single-pill antiplatelet therapy, which may reduce daily pill burden for patients after PCI with DES, demonstrated similar efficacy to separate dual-pill antiplatelet therapy.
doi:10.3904/kjim.2014.29.2.210
PMCID: PMC3956991  PMID: 24648804
Aspirin; Clopidogrel; Drug combinations
16.  Primitive Fitting Based on the Efficient multiBaySAC Algorithm 
PLoS ONE  2015;10(3):e0117341.
Although RANSAC is proven to be robust, the original RANSAC algorithm selects hypothesis sets at random, generating numerous iterations and high computational costs because many hypothesis sets are contaminated with outliers. This paper presents a conditional sampling method, multiBaySAC (Bayes SAmple Consensus), that fuses the BaySAC algorithm with statistical testing of candidate model parameters for fitting multiple primitives to unorganized 3D point clouds. This paper first presents a statistical testing algorithm for a candidate model parameter histogram to detect potential primitives. As the detected initial primitives were optimized using a parallel strategy rather than a sequential one, every data point in the multiBaySAC algorithm was assigned to multiple prior inlier probabilities for initial multiple primitives. Each prior inlier probability determined the probability that a point belongs to the corresponding primitive. We then implemented in parallel a conditional sampling method: BaySAC. With each iteration of the hypothesis testing process, hypothesis sets with the highest inlier probabilities were selected and verified for the existence of multiple primitives, revealing the fitting of multiple primitives. Moreover, the updated version of the initial probability was implemented based on a memorable form of Bayes' Theorem, which describes the relationship between prior and posterior probabilities of a data point by determining whether the hypothesis set to which a data point belongs is correct. The proposed approach was tested using real and synthetic point clouds. The results show that the proposed multiBaySAC algorithm can achieve a high computational efficiency (averaging 34% higher than the efficiency of the sequential RANSAC method) and fitting accuracy (exhibiting good performance in the intersection of two primitives), whereas the sequential RANSAC framework clearly suffers from over- and under-segmentation problems. Future work will aim at further optimizing this strategy through its application to other problems such as multiple point cloud co-registration and multiple image matching.
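The histogram-based detection of candidate primitives and the parallel multi-primitive handling are specific to the paper; as a minimal single-primitive illustration of the BaySAC idea of conditional sampling, the sketch below chooses each hypothesis set from the points with the highest current inlier probabilities and then raises or lowers those probabilities after testing the hypothesis (a crude heuristic stand-in for the Bayes update), here for a 2D line fit on synthetic data.

```python
import numpy as np

def fit_line(p, q):
    """Normalized line a*x + b*y + c = 0 through two distinct points."""
    (x0, y0), (x1, y1) = p, q
    a, b = y1 - y0, x0 - x1
    n = np.hypot(a, b)
    return a / n, b / n, -(a * x0 + b * y0) / n

def baysac_line(points, iters=25, tol=0.05, seed=0):
    rng = np.random.default_rng(seed)
    prob = np.full(len(points), 0.5) + rng.normal(0, 1e-3, len(points))  # prior inlier probabilities
    best_model, best_count = None, -1
    for _ in range(iters):
        idx = np.argsort(-prob)[:2]               # hypothesis set: the two most probable inliers
        a, b, c = fit_line(points[idx[0]], points[idx[1]])
        dist = np.abs(a * points[:, 0] + b * points[:, 1] + c)
        inlier = dist < tol
        if inlier.sum() > best_count:
            best_model, best_count = (a, b, c), int(inlier.sum())
        if inlier.sum() >= len(points) // 2:      # hypothesis looks good: promote consistent points
            prob = np.where(inlier, np.minimum(prob * 1.2, 0.99), prob * 0.8)
        else:                                     # hypothesis looks bad: demote its seed points
            prob[idx] *= 0.5
    return best_model, best_count

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    xs = np.linspace(0, 1, 30)
    line_pts = np.c_[xs, 0.5 * xs + 0.1 + rng.normal(0, 0.005, 30)]
    data = np.vstack([line_pts, rng.random((10, 2))])   # 30 line points + 10 outliers
    print(baysac_line(data)[1])                         # most of the 30 line points found as inliers
```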
doi:10.1371/journal.pone.0117341
PMCID: PMC4363901  PMID: 25781620
17.  Spike-based Decision Learning of Nash Equilibria in Two-Player Games 
PLoS Computational Biology  2012;8(9):e1002691.
Humans and animals face decision tasks in an uncertain multi-agent environment where an agent's strategy may change in time due to the co-adaptation of others' strategies. The neuronal substrate and the computational algorithms underlying such adaptive decision making, however, are largely unknown. We propose a population coding model of spiking neurons with a policy gradient procedure that successfully acquires optimal strategies for classical game-theoretical tasks. The suggested population reinforcement learning reproduces data from human behavioral experiments for the blackjack and the inspector game. It performs optimally according to a pure (deterministic) and mixed (stochastic) Nash equilibrium, respectively. In contrast, temporal-difference(TD)-learning, covariance-learning, and basic reinforcement learning fail to perform optimally for the stochastic strategy. Spike-based population reinforcement learning, shown to follow the stochastic reward gradient, is therefore a viable candidate to explain automated decision learning of a Nash equilibrium in two-player games.
Author Summary
Socio-economic interactions are captured in a game theoretic framework by multiple agents acting on a pool of goods to maximize their own reward. Neuroeconomics tries to explain the agent's behavior in neuronal terms. Classical models in neuroeconomics use temporal-difference(TD)-learning. This algorithm incrementally updates values of state-action pairs, and actions are selected according to a value-based policy. In contrast, policy gradient methods do not introduce values as intermediate steps, but directly derive an action selection policy which maximizes the total expected reward. We consider a decision making network consisting of a population of neurons which, upon presentation of a spatio-temporal spike pattern, encodes binary actions by the population output spike trains and a subsequent majority vote. The action selection policy is parametrized by the strengths of synapses projecting to the population neurons. A gradient learning rule is derived which modifies these synaptic strengths and which depends on four factors, the pre- and postsynaptic activities, the action and the reward. We show that for classical game-theoretical tasks our decision making network endowed with the four-factor learning rule leads to Nash-optimal action selections. It also mimics human decision learning for these same tasks.
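The paper's model uses populations of spiking neurons with a four-factor synaptic learning rule; the sketch below keeps only the policy-gradient ingredient, applying REINFORCE with a Bernoulli (logistic) policy to matching pennies, a two-player zero-sum game whose unique Nash equilibrium is to play each action with probability 0.5. The game, learning rate, and parameterization are simplifications chosen for the example, not the paper's network.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def reinforce_matching_pennies(rounds=50000, lr=0.05, seed=0):
    """Both players learn a Bernoulli policy P(heads) = sigmoid(theta) by REINFORCE.
    Player A is rewarded for matching, player B for mismatching (zero-sum)."""
    rng = random.Random(seed)
    theta_a, theta_b = 1.0, -1.0                 # start away from the equilibrium
    avg_a = avg_b = 0.0
    for t in range(1, rounds + 1):
        pa, pb = sigmoid(theta_a), sigmoid(theta_b)
        a = rng.random() < pa                    # True = heads
        b = rng.random() < pb
        ra = 1.0 if a == b else -1.0             # A's reward; B receives -ra
        # REINFORCE: d log pi(action) / d theta = action - p for a Bernoulli policy
        theta_a += lr * ra * ((1.0 if a else 0.0) - pa)
        theta_b += lr * (-ra) * ((1.0 if b else 0.0) - pb)
        avg_a += (pa - avg_a) / t                # running average of each player's strategy
        avg_b += (pb - avg_b) / t
    return avg_a, avg_b                          # both averages hover near the Nash value 0.5

if __name__ == "__main__":
    print(reinforce_matching_pennies())
```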
doi:10.1371/journal.pcbi.1002691
PMCID: PMC3459907  PMID: 23028289
18.  Application of Composite Dictionary Multi-Atom Matching in Gear Fault Diagnosis 
Sensors (Basel, Switzerland)  2011;11(6):5981-6002.
The sparse decomposition based on matching pursuit is an adaptive sparse expression method for signals. This paper proposes a composite dictionary multi-atom matching decomposition and reconstruction algorithm, and introduces threshold de-noising into the reconstruction algorithm. Based on the structural characteristics of gear fault signals, a composite dictionary combining the impulse time-frequency dictionary and the Fourier dictionary was constructed, and a genetic algorithm was applied to search for the best matching atom. The analysis results of gear fault simulation signals indicated the effectiveness of the hard threshold, and the impulse or harmonic characteristic components could be separately extracted. Meanwhile, the robustness of the composite dictionary multi-atom matching algorithm at different noise levels was investigated. Considering the effect of data length on the calculation efficiency of the algorithm, an improved segmented decomposition and reconstruction algorithm was proposed, and the calculation efficiency of the decomposition algorithm was significantly enhanced. In addition, it is shown that the multi-atom matching algorithm was superior to the single-atom matching algorithm in both calculation efficiency and algorithm robustness. Finally, the above algorithm was applied to gear fault engineering signals, and achieved good results.
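The composite impulse/Fourier dictionary and the genetic-algorithm atom search are specific to the paper; the sketch below only shows plain matching pursuit on a small, made-up dictionary of unit-norm cosine atoms, picking the atom best correlated with the residual at each step and subtracting its contribution.

```python
import numpy as np

def matching_pursuit(signal, dictionary, n_atoms=3):
    """Greedy sparse decomposition: repeatedly pick the dictionary atom with the largest
    inner product with the residual, record its coefficient, and subtract it."""
    residual = signal.astype(float)
    atoms, coeffs = [], []
    for _ in range(n_atoms):
        corr = dictionary @ residual               # inner products (atoms are unit norm)
        k = int(np.argmax(np.abs(corr)))
        atoms.append(k)
        coeffs.append(float(corr[k]))
        residual = residual - corr[k] * dictionary[k]
    return atoms, coeffs, residual

if __name__ == "__main__":
    n = 256
    t = np.arange(n)
    # Dictionary of unit-norm cosine atoms at integer frequencies (a toy stand-in for the
    # composite impulse/Fourier dictionary used in the paper).
    freqs = np.arange(1, 40)
    dictionary = np.array([np.cos(2 * np.pi * f * t / n) for f in freqs])
    dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)
    signal = (3 * dictionary[4] + 1.5 * dictionary[20]
              + 0.05 * np.random.default_rng(0).normal(size=n))
    print(matching_pursuit(signal, dictionary, n_atoms=2)[0])   # expect atom indices [4, 20]
```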
doi:10.3390/s110605981
PMCID: PMC3231441  PMID: 22163938
composite dictionary; multi-atom matching; threshold de-noising; segmented decomposition and reconstruction; gear fault diagnosis; genetic algorithm
19.  Improved Ant Algorithms for Software Testing Cases Generation 
The Scientific World Journal  2014;2014:392309.
Ant colony optimization (ACO) for software test case generation is a very popular topic in software testing engineering. However, traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensive improved ant colony optimization (ACIACO), which is based on all three of the above methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain premature convergence, promote case coverage, and reduce the number of iterations.
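The three improved update strategies are not specified in the abstract; as a generic baseline only, the sketch below shows the standard ACO mechanics they modify (probabilistic choice proportional to pheromone, evaporation, reinforcement of the iteration-best solution) on a toy path-selection problem with invented costs.

```python
import random

def ant_colony(path_costs, ants=10, iters=50, rho=0.3, seed=0):
    """Toy ACO on a set of alternative paths: ants pick a path with probability proportional
    to its pheromone; pheromone evaporates at rate rho and the iteration-best path is reinforced."""
    rng = random.Random(seed)
    pheromone = [1.0] * len(path_costs)
    best_path, best_cost = None, float("inf")
    for _ in range(iters):
        choices = rng.choices(range(len(path_costs)), weights=pheromone, k=ants)
        it_best = min(choices, key=lambda i: path_costs[i])
        if path_costs[it_best] < best_cost:
            best_path, best_cost = it_best, path_costs[it_best]
        pheromone = [(1 - rho) * p for p in pheromone]     # evaporation
        pheromone[it_best] += 1.0 / path_costs[it_best]    # reinforce the iteration-best path
    return best_path, best_cost

if __name__ == "__main__":
    print(ant_colony([5.0, 3.0, 8.0, 2.5, 6.0]))           # expect (3, 2.5)
```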
doi:10.1155/2014/392309
PMCID: PMC4032714  PMID: 24883391
20.  Ultraviolet Phototherapy Management of Moderate-to-Severe Plaque Psoriasis 
Executive Summary
Objective
The purpose of this evidence based analysis was to determine the effectiveness and safety of ultraviolet phototherapy for moderate-to-severe plaque psoriasis.
Research Questions
The specific research questions for the evidence review were as follows:
What is the safety of ultraviolet phototherapy for moderate-to-severe plaque psoriasis?
What is the effectiveness of ultraviolet phototherapy for moderate-to-severe plaque psoriasis?
Clinical Need: Target Population and Condition
Psoriasis is a common chronic, systemic inflammatory disease affecting the skin, nails and occasionally the joints and has a lifelong waxing and waning course. It has a worldwide occurrence with a prevalence of at least 2% of the general population, making it one of the most common systemic inflammatory diseases. The immune-mediated disease has several clinical presentations with the most common (85% - 90%) being plaque psoriasis.
Characteristic features of psoriasis include scaling, redness, and elevation of the skin. Patients with psoriasis may also present with a range of disabling symptoms such as pruritus (itching), pain, bleeding, or burning associated with plaque lesions and up to 30% are classified as having moderate-to-severe disease. Further, some psoriasis patients can be complex medical cases in which diabetes, inflammatory bowel disease, and hypertension are more likely to be present than in control populations and 10% also suffer from arthritis (psoriatic arthritis). The etiology of psoriasis is unknown but is thought to result from complex interactions between the environment and predisposing genes.
Management of psoriasis is related to the extent of the skin involvement, although its presence on the hands, feet, face or genitalia can present challenges. Moderate-to-severe psoriasis is managed by phototherapy and a range of systemic agents including traditional immunosuppressants such as methotrexate and cyclosporin. Treatment with modern immunosuppressant agents known as biologicals, which more specifically target the immune defects of the disease, is usually reserved for patients with contraindications and those failing or unresponsive to treatments with traditional immunosuppressants or phototherapy.
Treatment plans are based on a long-term approach to managing the disease, patient’s expectations, individual responses and risk of complications. The treatment goals are several fold but primarily to:
1) improve physical signs and secondary psychological effects,
2) reduce inflammation and control skin shedding,
3) control physical signs as long as possible, and to
4) avoid factors that can aggravate the condition.
Approaches are generally individualized because of the variable presentation, quality of life implications, co-existent medical conditions, and triggering factors (e.g. stress, infections and medications). Individual responses and commitments to therapy also present possible limitations.
Phototherapy
Ultraviolet phototherapy units have been licensed since February 1993 as a class 2 device in Canada. Units are available as hand held devices, hand and foot devices, full-body panel, and booth styles for institutional and home use. Units are also available with a range of ultraviolet A, broad and narrow band ultraviolet B (BB-UVB and NB-UVB) lamps. After establishing appropriate ultraviolet doses, three-times weekly treatment schedules for 20 to 25 treatments are generally needed to control symptoms.
Evidence-Based Analysis Methods
The literature search strategy employed keywords and subject headings to capture the concepts of 1) phototherapy and 2) psoriasis. The search involved runs in the following databases: Ovid MEDLINE (1996 to March Week 3 2009), OVID MEDLINE In-Process and Other Non-Indexed Citations, EMBASE (1980 to 2009 Week 13), the Wiley Cochrane Library, and the Centre for Reviews and Dissemination/International Agency for Health Technology Assessment. Parallel search strategies were developed for the remaining databases. Search results were limited to human and English-language published between January 1999 and March 31, 2009. Search alerts were generated and reviewed for relevant literature up until May 31, 2009.
Inclusion Criteria
English language reports and human studies
Ultraviolet phototherapy interventions for plaque-type psoriasis
Reports involving efficacy and/or safety outcome studies
Original reports with defined study methodology
Standardized measurements on outcome events such as technical success, safety, effectiveness, durability, quality of life or patient satisfaction
Exclusion Criteria
Non-systematic reviews, letters, comments and editorials
Randomized trials involving side-to-side or half body comparisons
Randomized trials not involving ultraviolet phototherapy intervention for plaque-type psoriasis
Trials involving dosing studies, pilot feasibility studies or lacking control groups
Summary of Findings
A 2000 health technology evidence report on the overall management of psoriasis by The National Institute for Health Research (NIHR) Health Technology Assessment Program of the UK was identified in the MAS evidence-based review. The report included 109 RCT studies published between 1966 and June 1999 involving four major treatment approaches – 51 on phototherapy, 32 on oral retinoids, 18 on cyclosporin and five on fumarates. The absence of RCTs on methotrexate was noted as original studies with this agent had been performed prior to 1966.
Of the 51 RCT studies involving phototherapy, 22 involved UVA, 21 involved UVB, five involved both UVA and UVB and three involved natural light as a source of UV. The RCT studies included comparisons of treatment schedules, ultraviolet source, addition of adjuvant therapies, and comparisons between phototherapy and topical treatment schedules. Because of heterogeneity, no synthesis or meta-analysis could be performed. Overall, the reviewers concluded that the efficacy of only five therapies could be supported from the RCT-based evidence review: photochemotherapy or phototherapy, cyclosporin, systemic retinoids, combination topical vitamin D3 analogues (calcipotriol) and corticosteroids in combination with phototherapy, and fumarates. Although there was no RCT evidence supporting methotrexate, its efficacy for psoriasis is well known and it continues to be a treatment mainstay.
The conclusion of the NIHR evidence review was that both photochemotherapy and phototherapy were effective treatments for clearing psoriasis, although their comparative effectiveness was unknown. Despite the conclusions on efficacy, a number of issues were identified in the evidence review and several areas for future research were discussed to address these limitations. Trials focusing on comparative effectiveness, either between ultraviolet sources or between classes of treatment such as methotrexate versus phototherapy, were recommended to refine treatment algorithms. The need for better assessment of the cost-effectiveness of therapies, considering systemic drug costs and costs of surveillance as well as drug efficacy, was also noted. Overall, the authors concluded that phototherapy and photochemotherapy had important roles in psoriasis management and were standard therapeutic options for psoriasis offered in dermatology practices.
The MAS evidence-based review focusing on the RCT trial evidence for ultraviolet phototherapy management of moderate-to-severe plaque psoriasis was performed as an update to the NIHR 2000 systematic review on treatments for severe psoriasis. In this review, an additional 26 RCT reports examining phototherapy or photochemotherapy for psoriasis were identified. Among the studies were two RCTs comparing ultraviolet wavelength sources, five RCTs comparing different forms of phototherapy, four RCTs combining phototherapy with prior spa saline bathing, nine RCTs combining phototherapy with topical agents, two RCTs combining phototherapy with the systemic immunosuppressive agents methotrexate or alefacept, one RCT comparing phototherapy with an additional light source (the excimer laser), and one comparing a combination therapy with phototherapy and psychological intervention involving simultaneous audiotape sessions on mindfulness and stress reduction. Two trials also examined the effect of treatment setting on effectiveness of phototherapy, one on inpatient versus outpatient therapy and one on outpatient clinic versus home-based phototherapy.
Conclusions
The conclusions of the MAS evidence-based review are outlined in Table ES1. In summary, phototherapy provides good control of clinical symptoms in the short term for patients with moderate-to-severe plaque-type psoriasis that have failed or are unresponsive to management with topical agents. However, many of the evidence gaps identified in the NIHR 2000 evidence review on psoriasis management persisted. In particular, the lack of evidence on the comparative effectiveness and/or cost-effectiveness between the major treatment options for moderate-to-severe psoriasis remained. The evidence on effectiveness and safety of longer term strategies for disease management has also not been addressed. Evidence for the safety, effectiveness, or cost-effectiveness of phototherapy delivered in various settings is emerging but is limited. In addition, because all available treatments for psoriasis – a disease with a high prevalence, chronicity, and cost – are palliative rather than curative, strategies for disease control and improvements in self-efficacy employed in other chronic disease management strategies should be investigated.
RCT Evidence for Ultraviolet Phototherapy Treatment of Moderate-To-Severe Plaque Psoriasis
Phototherapy is an effective treatment for moderate-to-severe plaque psoriasis
Narrow-band phototherapy (NB-UVB) is more effective than broad-band phototherapy (BB-UVB) for moderate-to-severe plaque psoriasis
Oral-PUVA achieves a greater clinical response, requires fewer treatments, and has a greater cumulative UV irradiation dose than UVB to achieve treatment effects for moderate-to-severe plaque psoriasis
Spa salt water baths prior to phototherapy did increase the short-term clinical response of moderate-to-severe plaque psoriasis but did not decrease the cumulative UV irradiation dose
Addition of topical agents (the vitamin D3 analogue calcipotriol) to NB-UVB did not increase mean clinical response or decrease the number of treatments or the cumulative UV irradiation dose
Methotrexate prior to NB-UVB in high-need psoriasis patients did significantly increase clinical response, decrease the number of treatment sessions, and decrease the cumulative UV irradiation dose
Phototherapy following alefacept did increase early clinical response in moderate-to-severe plaque psoriasis
The effectiveness and safety of home NB-UVB phototherapy were not inferior to NB-UVB phototherapy provided in a clinic to patients with psoriasis referred for phototherapy. Treatment burden was lower and patient satisfaction was higher with home therapy, and patients in both groups preferred future phototherapy treatments at home
Ontario Health System Considerations
A 2006 survey of ultraviolet phototherapy services in Canada identified 26 phototherapy clinics in Ontario for a population of over 12 million. At that time, there were 177 dermatologists in the province, and phototherapy services were provided in 28% (14/50) of its geographic regions. The majority of the phototherapy services were reported to be located in densely populated areas; relatively few patients living in rural communities had access to these services. The inconvenience of multiple weekly visits for optimal phototherapy treatment effects poses additional burdens to those with travel difficulties related to health, job, or family-related responsibilities.
Physician OHIP billing for phototherapy services totaled 117,216 billings in 2007, representing approximately 1,800 patients in the province treated in private clinics. The number of patients treated in hospitals is difficult to estimate as physician costs are not billed directly to OHIP in this setting. Instead, phototherapy units and services provided in hospitals are funded by hospitals’ global budgets. Some hospitals in the province, however, have divested their phototherapy services, so the number of phototherapy clinics and their total capacity is currently unknown.
Technological advances have enabled changes in phototherapy treatment regimens from lengthy hospital inpatient stays to outpatient clinic visits and, more recently, to an at-home basis. When combined with a telemedicine follow-up, home phototherapy may provide an alternative strategy for improved access to service and follow-up care, particularly for those with geographic or mobility barriers. Safety and effectiveness have, however, so far been evaluated for only one phototherapy home-based delivery model. Alternate care models and settings could potentially increase service options and access, but the broader consequences of the varying cost structures and incentives that either increase or decrease phototherapy services are unknown.
Economic Analyses
The focus of the current economic analysis was to characterize the costs associated with the provision of NB-UVB phototherapy for plaque-type, moderate-to-severe psoriasis in different clinical settings, including home therapy. A literature review was conducted, and no published cost-effectiveness (cost-utility) economic analyses were identified in this area.
Hospital, Clinic, and Home Costs of Phototherapy
Costs for NB-UVB phototherapy were based on consultations with equipment manufacturers and dermatologists. Device costs applicable to the provision of NB-UVB phototherapy in hospitals, private clinics and at a patient’s home were estimated. These costs included capital costs of purchasing NB-UVB devices (amortized over 15-20 years), maintenance costs of replacing equipment bulbs, physician costs of phototherapy treatment in private clinics ($7.85 per phototherapy treatment), and medication and laboratory costs associated with treatment of moderate-to-severe psoriasis.
NB-UVB phototherapy services provided in a hospital setting were paid for by hospitals directly. Phototherapy services in private clinic and home settings were paid for by the clinic and patient, respectively, except for physician services covered by OHIP. Indirect funding was provided to hospitals as part of global budgeting and resource allocation. Home therapy services for NB-UVB phototherapy were not covered by the MOHLTC. Coverage for home-based phototherapy, however, was in some cases provided by third-party insurers.
Device costs for NB-UVB phototherapy were estimated for two types of phototherapy units: a “booth unit” consisting of 48 bulbs used in hospitals and clinics, and a “panel unit” consisting of 10 bulbs for home use. The device costs of the booth and panel units were estimated at approximately $18,600 and $2,900, respectively; simple amortization over 15 and 20 years implied yearly costs of approximately $2,500 and $150, respectively. The replacement cost for individual bulbs was about $120, resulting in total annual maintenance costs of about $8,640 and $120 for booth and panel units, respectively.
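For readers who wish to adapt these figures to other equipment or settings, the annual equipment cost can be approximated from the capital cost, amortization period, and bulb-replacement inputs quoted above. The short Python sketch below illustrates the calculation; the amortization period, the number of bulb replacements per year, and the amortization convention are assumptions for illustration (the report's quoted yearly capital cost for the booth unit is higher than simple straight-line amortization would give, suggesting it reflects additional accounting not modelled here).

```python
# Illustrative sketch of an annual equipment-cost calculation for a
# phototherapy unit. All parameter values are assumptions for illustration;
# the report's own amortization convention appears to differ, so outputs
# will not exactly match the published figures.

def annual_equipment_cost(capital_cost, amortization_years,
                          bulbs_replaced_per_year, bulb_cost):
    """Straight-line amortization of the device plus yearly bulb replacement."""
    amortized_capital = capital_cost / amortization_years
    maintenance = bulbs_replaced_per_year * bulb_cost
    return amortized_capital, maintenance

# Hypothetical inputs loosely based on the figures quoted above.
# 72 bulb replacements/year is an assumption chosen to match the quoted
# ~$8,640 annual maintenance figure for the 48-bulb booth unit.
booth_capital, booth_maint = annual_equipment_cost(18_600, 15, 72, 120)
panel_capital, panel_maint = annual_equipment_cost(2_900, 20, 1, 120)
print(f"Booth: ~${booth_capital:,.0f}/yr capital + ${booth_maint:,.0f}/yr bulbs")
print(f"Panel: ~${panel_capital:,.0f}/yr capital + ${panel_maint:,.0f}/yr bulbs")
```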
Estimated Total Costs for Ontario
The average annual cost per patient for NB-UVB phototherapy provided in the hospital, in a private clinic, or at home was estimated to be $292, $810, and $365, respectively. For comparison purposes, treatment of moderate-to-severe psoriasis with methotrexate and cyclosporin amounted to $712 and $3,407 annually per patient, respectively; yearly costs for biological drugs were estimated to be $18,700 for alefacept and $20,300 for etanercept-based treatments.
Total annual costs of NB-UVB phototherapy were estimated by applying average costs to an estimated proportion of the population (age 18 or older) eligible for phototherapy treatment. The prevalence of psoriasis was estimated to be approximately 2% of the population, of which about 85% was of plaque-type psoriasis and approximately 20% to 30% was considered moderate-to-severe in disease severity. An estimate of 25% for moderate-to-severe psoriasis cases was used in the current economic analysis, resulting in a range of 29,400 to 44,200 cases. Approximately 21% of these patients were estimated to be using NB-UVB phototherapy for treatment, resulting in between 6,200 and 9,300 cases. The average (7,700) number of cases was used to calculate associated costs for Ontario by treatment setting.
Total annual costs were as follows: $2.3 million in a hospital setting, $6.3 million in a private clinic setting, and $2.8 million for home phototherapy. Costs for phototherapy services provided in private clinics were greater ($810 per patient annually; total of $6.3 million annually) and differed from the same services provided in the hospital setting only in terms of additional physician costs associated with phototherapy OHIP fees.
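To show how the case counts and the total costs above fit together, the brief sketch below reproduces the arithmetic chain. The adult population figure is an assumed input, chosen so that the stated proportions yield roughly the published case volume; it is not a figure taken from the report, so the outputs are approximations rather than a re-derivation of the published totals.

```python
# Sketch of the case-volume and total-cost estimates described above.
# The adult population is an illustrative assumption; all proportions and
# per-patient costs are taken from the text.

adult_population = 8_650_000          # assumed Ontario adults (18+), illustrative
psoriasis_prevalence = 0.02           # ~2% of the population
plaque_fraction = 0.85                # ~85% plaque-type
moderate_to_severe_fraction = 0.25    # 25% point estimate used in the analysis
nb_uvb_uptake = 0.21                  # ~21% estimated to use NB-UVB phototherapy

eligible = (adult_population * psoriasis_prevalence *
            plaque_fraction * moderate_to_severe_fraction)
nb_uvb_cases = eligible * nb_uvb_uptake
print(f"Moderate-to-severe plaque cases: ~{eligible:,.0f}")
print(f"NB-UVB phototherapy cases: ~{nb_uvb_cases:,.0f}")   # ~7,700 in the report

# Average annual cost per patient by setting, from the report.
annual_cost_per_patient = {"hospital": 292, "private clinic": 810, "home": 365}
for setting, cost in annual_cost_per_patient.items():
    total = nb_uvb_cases * cost
    print(f"Total annual NB-UVB cost, {setting}: ~${total / 1e6:.1f} million")
```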
Keywords
Psoriasis, ultraviolet radiation, phototherapy, photochemotherapy, NB-UVB, BB-UVB, PUVA
PMCID: PMC3377497  PMID: 23074532
21.  The JCSG MR pipeline: optimized alignments, multiple models and parallel searches 
The practical limits of molecular replacement can be extended by using several specifically designed protein models based on fold-recognition methods and by exhaustive searches performed in a parallelized pipeline. Updated results from the JCSG MR pipeline, which to date has solved 33 molecular-replacement structures with less than 35% sequence identity to the closest homologue of known structure, are presented.
The success rate of molecular replacement (MR) falls considerably when search models share less than 35% sequence identity with their templates, but can be improved significantly by using fold-recognition methods combined with exhaustive MR searches. Models based on alignments calculated with fold-recognition algorithms are more accurate than models based on conventional alignment methods such as FASTA or BLAST, which are still widely used for MR. In addition, by designing MR pipelines that integrate phasing and automated refinement and allow parallel processing of such calculations, one can effectively increase the success rate of MR. Here, updated results from the JCSG MR pipeline are presented, which to date has solved 33 MR structures with less than 35% sequence identity to the closest homologue of known structure. By using difficult MR problems as examples, it is demonstrated that successful MR phasing is possible even in cases where the similarity between the model and the template can only be detected with fold-recognition algorithms. In the first step, several search models are built based on all homologues found in the PDB by fold-recognition algorithms. The models resulting from this process are used in parallel MR searches with different combinations of input parameters of the MR phasing algorithm. The putative solutions are subjected to rigid-body and restrained crystallographic refinement and ranked based on the final values of free R factor, figure of merit and deviations from ideal geometry. Finally, crystal packing and electron-density maps are checked to identify the correct solution. If this procedure does not yield a solution with interpretable electron-density maps, then even more alternative models are prepared. The structurally variable regions of a protein family are identified based on alignments of sequences and known structures from that family and appropriate trimmings of the models are proposed. All combinations of these trimmings are applied to the search models and the resulting set of models is used in the MR pipeline. It is estimated that with the improvements in model building and exhaustive parallel searches with existing phasing algorithms, MR can be successful for more than 50% of recognizable homologues of known structures below the threshold of 35% sequence identity. This implies that about one-third of the proteins in a typical bacterial proteome are potential MR targets.
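As a rough illustration of the workflow described above (and not of the JCSG code itself), the Python sketch below outlines the main loop: run MR for every combination of fold-recognition-based search model and phasing-parameter set in parallel, refine the putative solutions, rank them by free R factor and figure of merit, and accept the first solution that passes packing and map checks. The heavy lifting is done by external crystallography programs, so those steps are passed in as placeholder callables.

```python
# Schematic sketch of an exhaustive, parallel MR pipeline of the kind
# described above. Model building, MR phasing, refinement, and packing/map
# checks are external programs here represented by caller-supplied callables;
# nothing in this sketch is the actual JCSG implementation.
from concurrent.futures import ProcessPoolExecutor
from itertools import product


def mr_pipeline(models, parameter_sets, run_mr, refine, acceptable):
    """Try every (search model, MR parameter set) combination and rank results.

    models         -- search models built from fold-recognition alignments
    parameter_sets -- alternative input parameters for the MR phasing program
    run_mr         -- callable (model, params) -> candidate solution or None
    refine         -- callable solution -> refined solution with .r_free, .fom
    acceptable     -- callable solution -> bool (crystal packing / map checks)
    """
    # 1. Run MR searches for all model/parameter combinations in parallel.
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(run_mr, m, p)
                   for m, p in product(models, parameter_sets)]
        results = [f.result() for f in futures]
    candidates = [r for r in results if r is not None]

    # 2. Rigid-body and restrained refinement, then rank by free R factor
    #    (lower is better) and figure of merit (higher is better).
    refined = sorted((refine(c) for c in candidates),
                     key=lambda s: (s.r_free, -s.fom))

    # 3. Return the best solution passing packing and electron-density checks;
    #    if none passes, the caller can retry with trimmed search models.
    return next((s for s in refined if acceptable(s)), None)
```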
doi:10.1107/S0907444907050111
PMCID: PMC2394805  PMID: 18094477
molecular replacement; sequence-alignment accuracy; homology modeling; parameter-space screening; structural genomics
22.  Refinement of a Population-Based Bayesian Network for Fusion of Health Surveillance Data 
Objective
The project involves analytic combination of multiple evidence sources to monitor health at hundreds of care facilities. A demonstration module featuring a population-based Bayes Network [1] was refined and expanded for application in the Department of Defense Electronic Surveillance System for Community-Based Epidemics (ESSENCE).
Introduction
The ESSENCE demonstration module was built to help DoD health monitors make routine decisions based on disparate evidence sources such as daily counts of ILI-related chief complaints, ratios of positive lab tests for influenza, patient age distribution, and counts of antiviral prescriptions [1]. The module was a population-based (rather than individual-based) Bayesian network (PBN) in that inputs were algorithmic results from these multiple aggregate data streams, and output was the degree of belief that the combined evidence required investigation. The module reduced total alerts substantially and retained sensitivity to the majority of documented outbreaks while clarifying underlying sources of evidence. The current effort was to advance the prototype to production by refining components of the fusion methodology to improve sensitivity while retaining the reduced alert rate.
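To make the idea of fusing aggregate evidence streams into a single degree of belief concrete, the sketch below combines several hypothetical surveillance streams with a naive Bayes combination. The streams, likelihood values, and prior are invented for illustration; the actual ESSENCE module is a richer, trained Bayesian network rather than this simplified stand-in.

```python
# Illustrative naive-Bayes fusion of aggregate surveillance evidence streams
# into a single degree of belief that investigation is warranted.
# All streams, likelihoods, and the prior are invented for illustration.

def fuse_evidence(prior, likelihoods, observations):
    """P(investigate | evidence), assuming conditional independence of streams.

    likelihoods[stream] = (P(alert | investigate), P(alert | no investigate))
    observations[stream] = True if the stream is currently alerting.
    """
    p_inv, p_not = prior, 1.0 - prior
    for stream, alerting in observations.items():
        p_alert_inv, p_alert_not = likelihoods[stream]
        p_inv *= p_alert_inv if alerting else (1.0 - p_alert_inv)
        p_not *= p_alert_not if alerting else (1.0 - p_alert_not)
    return p_inv / (p_inv + p_not)

likelihoods = {
    "ili_chief_complaints": (0.80, 0.10),     # hypothetical values
    "positive_flu_lab_ratio": (0.70, 0.05),
    "antiviral_prescriptions": (0.60, 0.15),
}
belief = fuse_evidence(prior=0.02,
                       likelihoods=likelihoods,
                       observations={"ili_chief_complaints": True,
                                     "positive_flu_lab_ratio": True,
                                     "antiviral_prescriptions": False})
print(f"Belief that investigation is warranted: {belief:.2f}")
```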
Methods
The multi-level approach to sensitivity improvement included expanded syndromic queries, more data-sensitive algorithm selection, improved transformation of algorithm outputs to alert states, and hierarchical training of Bayesian networks. Components were tested individually, and the net result was iteratively refined based on performance against documented outbreaks.
We examined time series of classes of prescribed drugs and laboratory tests during known events and discussed outbreak-associated elements with domain experts to liberalize data queries. Algorithms were matched to data streams with injection testing applied to 4.5 years of data from 502 outpatient clinics. A hierarchical approach was applied for improved training and verification of PBNs for events related to categories of Influenza-like Illness, Gastrointestinal, Fever, Neurological, and Rash, chosen both for public health importance and for availability of multiple supporting data types. Hierarchical, modular training was applied to common subnetworks, such as a severity indicator PBN depending on case disposition, acute case indicators, complex evaluation/management codes, and patient bounce-backs, depicted in Figure 1. Conversion of individual algorithm outputs to belief states (e.g. “at least two red alerts/past 7 days”) was broadened using analysis of lags between data sources. With data from the known events, we calculated decision support thresholds for the parent-level PBN decision nodes with a stochastic optimization technique maximizing the ratio of alert rates during outbreak to non-outbreak periods.
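The threshold-calibration step mentioned at the end of the Methods can be illustrated with a simple search that maximizes the ratio of alert rates in outbreak versus non-outbreak periods. In the sketch below, a plain grid search with a minimum-sensitivity constraint stands in for the stochastic optimization actually used, and the belief scores are synthetic.

```python
# Illustrative calibration of a decision-support threshold on a fused belief
# score: choose the threshold maximizing the ratio of the alert rate during
# documented outbreak periods to the alert rate during non-outbreak periods.
# A grid search stands in for the project's stochastic optimization, and the
# scores below are synthetic.
import numpy as np

def alert_rate(scores, threshold):
    return float(np.mean(scores >= threshold))

def calibrate_threshold(outbreak_scores, baseline_scores, grid,
                        min_sensitivity=0.5):
    best_t, best_ratio = None, -np.inf
    for t in grid:
        sens = alert_rate(outbreak_scores, t)
        false_alarms = alert_rate(baseline_scores, t)
        ratio = sens / max(false_alarms, 1e-6)     # avoid division by zero
        if sens >= min_sensitivity and ratio > best_ratio:
            best_t, best_ratio = t, ratio
    return best_t, best_ratio

rng = np.random.default_rng(0)
baseline = rng.beta(2, 8, size=2000)    # synthetic non-outbreak belief scores
outbreak = rng.beta(6, 3, size=200)     # synthetic outbreak-period belief scores
t, ratio = calibrate_threshold(outbreak, baseline, np.linspace(0.05, 0.95, 19))
print(f"Chosen threshold {t:.2f}, outbreak/non-outbreak alert-rate ratio {ratio:.1f}")
```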
Results
The expanded data queries, more stream-specific algorithm selection, generalized state transformation, and hierarchical PBN training detected 22 of an expanded collection of 24 documented outbreaks, with incremental improvement ongoing. The mean alert rate drop achieved by the Bayes Net was 87% (minimum of 85%) compared to the combined alerts of all component algorithms across syndromes and facilities.
Conclusions
Expansion and further technical validation upheld the PBN approach as a user-friendly means of analytic decision support given multiple, variably weighted evidence sources. The PBN affords not only sharply reduced alerting, but also transparent indication of evidence underlying each alert. The older algorithm approach remains available as backup. Beta testing of the resulting production system will drive further modification.
PMCID: PMC3692906
Fusion; Bayesian Network; Multivariate; Decision Support
23.  DT-Web: a web-based application for drug-target interaction and drug combination prediction through domain-tuned network-based inference 
BMC Systems Biology  2015;9(Suppl 3):S4.
Background
The identification of drug-target interactions (DTI) is a costly and time-consuming step in drug discovery and design. Computational methods capable of predicting reliable DTI play an important role in the field. Algorithms may aim to design new therapies based on a single approved drug or a combination of them. Recently, recommendation methods relying on network-based inference in connection with knowledge coming from the specific domain have been proposed.
Description
Here we propose a web-based interface to the DT-Hybrid algorithm, which applies a recommendation technique based on bipartite network projection implementing resource transfer within the network. This technique, combined with domain-specific knowledge expressing drug and target similarity, is used to compute recommendations for each drug. Our web interface allows users: (i) to browse all the predictions inferred by the algorithm; (ii) to upload their custom data on which they wish to obtain a prediction through a DT-Hybrid based pipeline; and (iii) to support the early stages of drug combination, repositioning, substitution, or resistance studies by finding drugs that can act simultaneously on multiple targets in a multi-pathway environment. Our system is periodically synchronized with DrugBank and updated accordingly. The website is free, open to all users, and available at http://alpha.dmi.unict.it/dtweb/.
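To give a feel for the recommendation technique that underlies the algorithm, the sketch below implements basic two-step resource transfer (network-based inference) on a toy drug-target bipartite adjacency matrix. This is the plain projection only: DT-Hybrid additionally tunes the transfer with drug-drug and target-target similarity, which is omitted here, and the interaction matrix is invented.

```python
# Minimal sketch of bipartite network-based inference ("resource transfer")
# for drug-target recommendation. A is a binary drug x target interaction
# matrix (toy data). DT-Hybrid weights this projection with domain-specific
# drug and target similarity, which is not shown here.
import numpy as np

def nbi_scores(A):
    """Two-step resource spreading: targets -> drugs -> targets."""
    drug_degree = A.sum(axis=1, keepdims=True)     # k(d_i)
    target_degree = A.sum(axis=0, keepdims=True)   # k(t_l)
    # W[i, j]: fraction of drug j's resource that ends up on drug i
    W = (A / target_degree) @ (A / drug_degree).T
    return W @ A                                   # recommendation scores

A = np.array([[1, 1, 0, 0],     # drug 0 interacts with targets 0 and 1
              [0, 1, 1, 0],     # drug 1 interacts with targets 1 and 2
              [0, 0, 1, 1]],    # drug 2 interacts with targets 2 and 3
             dtype=float)
scores = nbi_scores(A)
# High scores on currently unobserved pairs are candidate new interactions.
candidates = np.argwhere((A == 0) & (scores > 0))
print(scores.round(2))
print("Candidate drug-target pairs:", candidates.tolist())
```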
Conclusions
Our web interface allows users to search and visualize information on drugs and targets, and optionally to provide their own data to compute a list of predictions. The user can visualize information about the characteristics of each drug, a list of predicted and validated targets, and associated enzymes and transporters. A table containing key information and GO classification allows users to perform their own analysis on our data. A special interface for data submission allows the execution of a pipeline, based on DT-Hybrid, predicting new targets with corresponding p-values expressing the reliability of each group of predictions. Finally, it is also possible to specify a list of genes and track down all the drugs that may have an indirect influence on them, based on a multi-drug, multi-target, multi-pathway analysis that aims to discover drugs for future follow-up studies.
doi:10.1186/1752-0509-9-S3-S4
PMCID: PMC4464606  PMID: 26050742
drug-target interaction; domain-tuned network-based inference; drug repositioning; drug combinations; drug substitutions; drug resistance; early stage analysis; online tool
24.  Research on Implementation of Interventions in Tuberculosis Control in Low- and Middle-Income Countries: A Systematic Review 
PLoS Medicine  2012;9(12):e1001358.
Cobelens and colleagues systematically reviewed research on implementation and cost-effectiveness of the WHO-recommended interventions for tuberculosis.
Background
Several interventions for tuberculosis (TB) control have been recommended by the World Health Organization (WHO) over the past decade. These include isoniazid preventive therapy (IPT) for HIV-infected individuals and household contacts of infectious TB patients, diagnostic algorithms for rule-in or rule-out of smear-negative pulmonary TB, and programmatic treatment for multidrug-resistant TB. There is no systematically collected data on the type of evidence that is publicly available to guide the scale-up of these interventions in low- and middle-income countries. We investigated the availability of published evidence on their effectiveness, delivery, and cost-effectiveness that policy makers need for scaling-up these interventions at country level.
Methods and Findings
PubMed, Web of Science, EMBASE, and several regional databases were searched for studies published from 1 January 1990 through 31 March 2012 that assessed health outcomes, delivery aspects, or cost-effectiveness for any of these interventions in low- or middle-income countries. Selected studies were evaluated for their objective(s), design, geographical and institutional setting, and generalizability. Studies reporting health outcomes were categorized as primarily addressing efficacy or effectiveness of the intervention. These criteria were used to draw landscapes of published research. We identified 59 studies on IPT in HIV infection, 14 on IPT in household contacts, 44 on rule-in diagnosis, 19 on rule-out diagnosis, and 72 on second-line treatment. Comparative effectiveness studies were relatively few (n = 9) and limited to South America and sub-Saharan Africa for IPT in HIV-infection, absent for IPT in household contacts, and rare for second-line treatment (n = 3). Evaluations of diagnostic and screening algorithms were more frequent (n = 19) but geographically clustered and mainly of non-comparative design. Fifty-four studies evaluated ways of delivering these interventions, and nine addressed their cost-effectiveness.
Conclusions
There are substantial gaps in the published evidence needed to guide scale-up of these five WHO-recommended TB interventions at country level, which for many countries possibly precludes program-wide implementation of these interventions. There is a strong need for rigorous operational research studies to be carried out in programmatic settings to inform the best use of existing and new interventions in TB control.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Tuberculosis (TB), caused by Mycobacterium tuberculosis, is curable and preventable, but according to the World Health Organization (WHO), in 2011, 8.7 million people had symptoms of TB (usually a productive cough and fever) and 1.4 million people—95% from low- and middle-income countries—died from TB. TB is also the leading cause of death in people with HIV worldwide, and in 2010 about 10 million children were orphaned as a result of their parents dying from TB. To help reduce the considerable global burden of TB, a global initiative called the Stop TB Partnership, led by WHO, has implemented a strategy to reduce deaths from TB by 50% by 2015—even greater than the target of Millennium Development Goal 6 (to reverse the increase in TB incidence by 2015).
Why Was This Study Done?
Over the past few years, WHO has recommended that countries implement several interventions to help control the spread of tuberculosis through measures to improve prevention, diagnosis, and treatment. Five such interventions currently recommended by WHO are: treatment with isoniazid to prevent TB among people who are HIV positive, and also among household contacts of people infected with TB; the use of clinical pathways (algorithms) for diagnosing TB in people accessing health care who have a negative smear test (the most commonly used diagnostic test, which relies on sputum samples), known as “rule-in algorithms”; screening algorithms for excluding TB in people who have HIV (“rule-out algorithms”); and finally, provision of second-line treatment for multidrug-resistant tuberculosis (a form of TB that does not respond to the most commonly used drugs) under programmatic conditions. The effectiveness of these interventions, their costs, and the practicalities of implementation are all important information for countries seeking to control TB following the WHO guidelines, but little is known about the availability of this information. Therefore, in this study the researchers systematically reviewed published studies to find evidence of the effectiveness of each of these interventions when implemented in routine practice, and also for additional information on the setting and conditions of implemented interventions, which might be useful to other countries.
What Did the Researchers Do and Find?
Using a specific search strategy, the researchers comprehensively searched through several key databases of publications, including regional databases, to identify 208 (out of 11,489 found initially) suitable research papers published between January 1990 and March 2012. For included studies, the researchers also noted the geographical location and setting and the type and design of study.
Of the 208 included studies, 59 focused on isoniazid prevention therapy in HIV infection, and only 14 on isoniazid prevention therapy for household contacts. There were 44 studies on “rule-in” clinical diagnosis, 19 on “rule-out” clinical diagnosis, and 72 studies on second-line treatment for TB. Studies on each intervention had some weaknesses, and overall, researchers found that there were very few real-world studies reporting on the effectiveness of interventions in program settings (rather than under optimal conditions in research settings). Few studies evaluated the methods used to implement the intervention or addressed delivery and operational issues (such as adherence to treatment), and there were limited economic evaluations of the recommended interventions. Furthermore, the researchers found that in general, the South Asian region was poorly represented.
What Do These Findings Mean?
These findings suggest that there is limited evidence on effectiveness, delivery, and cost-effectiveness to guide the scale-up of the five WHO-recommended interventions to control tuberculosis in the relevant countries and settings, despite the urgent need for such interventions to be implemented. The poor evidence base identified in this review highlights the tension between the decision to adopt a recommendation and its implementation adapted to local circumstances, and may be an important reason why these interventions are not implemented in many countries. This study also suggests that creative thinking is necessary to address the gaps between WHO recommendations and global health policy on new interventions and their real-world implementation in country-wide TB control programs. Future research should focus more on operational studies, the results of which should be made publicly available, and researchers, donors, and medical journals could perhaps re-consider their priorities to help bridge the knowledge gap identified in this study.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001358.
WHO has a wide range of information about TB and research on TB, including more about the STOP TB strategy and the STOP TB Partnership
The UN website has more information about MDG 6
The Global Fund to Fight AIDS, Tuberculosis and Malaria has specific information about progress on TB control
doi:10.1371/journal.pmed.1001358
PMCID: PMC3525528  PMID: 23271959
25.  Perturbation Biology: Inferring Signaling Networks in Cellular Systems 
PLoS Computational Biology  2013;9(12):e1003290.
We present a powerful experimental-computational technology for inferring network models that predict the response of cells to perturbations, and that may be useful in the design of combinatorial therapy against cancer. The experiments are systematic series of perturbations of cancer cell lines by targeted drugs, singly or in combination. The response to perturbation is quantified in terms of relative changes in the measured levels of proteins, phospho-proteins and cellular phenotypes such as viability. Computational network models are derived de novo, i.e., without prior knowledge of signaling pathways, and are based on simple non-linear differential equations. The prohibitively large solution space of all possible network models is explored efficiently using a probabilistic algorithm, Belief Propagation (BP), which is three orders of magnitude faster than standard Monte Carlo methods. Explicit executable models are derived for a set of perturbation experiments in the SKMEL-133 melanoma cell line, which is resistant to the therapeutically important inhibitor of RAF kinase. The resulting network models reproduce and extend known pathway biology. They empower potential discoveries of new molecular interactions and predict efficacious novel drug perturbations, such as the inhibition of PLK1, which is verified experimentally. This technology is suitable for application to larger systems in diverse areas of molecular biology.
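As a concrete, hedged illustration of the kind of simple non-linear differential equation model referred to above, the sketch below integrates a small coupled system in which each node's rate of change is a saturating function of its weighted inputs plus linear decay, with a drug perturbation entering as a constant external input. The interaction weights and rate constants are invented for illustration; this is the general model form only, not the paper's fitted network.

```python
# Illustrative simulation of a small nonlinear-ODE network model of the
# general kind described above:
#   dx_i/dt = eps_i * tanh(sum_j w_ij * x_j + u_i) - alpha_i * x_i,
# where x_i are (phospho)protein levels relative to baseline and u_i encodes
# a drug perturbation. All parameter values here are invented.
import numpy as np

def simulate(W, eps, alpha, u, x0, dt=0.01, steps=2000):
    """Forward-Euler integration of the perturbation-response model."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        dxdt = eps * np.tanh(W @ x + u) - alpha * x
        x = x + dt * dxdt
    return x

# Toy 3-node network: node 0 activates node 1, node 1 inhibits node 2.
W = np.array([[0.0,  0.0, 0.0],
              [1.5,  0.0, 0.0],
              [0.0, -2.0, 0.0]])
eps = np.array([1.0, 1.0, 1.0])
alpha = np.array([1.0, 1.0, 1.0])

unperturbed = simulate(W, eps, alpha, u=np.zeros(3), x0=np.zeros(3))
inhibit_node0 = simulate(W, eps, alpha, u=np.array([-1.0, 0.0, 0.0]), x0=np.zeros(3))
print("Steady state, no drug:          ", unperturbed.round(2))
print("Steady state, node 0 inhibited: ", inhibit_node0.round(2))
```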
Author Summary
Drugs that target specific effects of signaling proteins are promising agents for treating cancer. One of the many obstacles facing optimal drug design is inadequate quantitative understanding of the coordinated interactions between signaling proteins. De novo model inference of network or pathway models refers to the algorithmic construction of mathematical predictive models from experimental data without dependence on prior knowledge. De novo inference is difficult because of the prohibitively large number of possible sets of interactions that may or may not be consistent with observations. Our new method overcomes this difficulty by adapting a method from statistical physics, called Belief Propagation, which first calculates probabilistically the most likely interactions in the vast space of all possible solutions, then derives a set of individual, highly probable solutions in the form of executable models. In this paper, we test this method on artificial data and then apply it to model signaling pathways in a BRAF-mutant melanoma cancer cell line based on a large set of rich output measurements from a systematic set of perturbation experiments using drug combinations. Our results are in agreement with established biological knowledge, predict novel interactions, and predict efficacious drug targets that are specific to the experimental cell line and potentially to related tumors. The method has the potential, with sufficient systematic perturbation data, to model, de novo and quantitatively, the effects of hundreds of proteins on cellular responses, on a scale that is currently unreachable in diverse areas of cell biology. In a disease context, the method is applicable to the computational design of novel combination drug treatments.
doi:10.1371/journal.pcbi.1003290
PMCID: PMC3868523  PMID: 24367245
