We present a tool that enables users to visualize and interact with a comprehensive description of a multi-scale model of the renal nephron. A one-dimensional anatomical model of the nephron has been created and is used for visualization and modelling of tubule transport in various nephron anatomical segments. Mathematical models of nephron segments are embedded in the one-dimensional model. At the cellular level, these segment models use models encoded in CellML to describe cellular and subcellular transport kinetics. A web-based presentation environment has been developed that allows the user to visualize and navigate through the multi-scale nephron model, including simulation results, at the different spatial scales encompassed by the model description. The Zinc extension to Firefox is used to provide an interactive three-dimensional view of the tubule model, and the native Firefox rendering of scalable vector graphics is used to present schematic diagrams for cellular and subcellular scale models. The model viewer is embedded in a web page that dynamically presents content based on user input. For example, when viewing the whole nephron model, the user might be presented with information on the various embedded segment models as these are selected in the three-dimensional model view. Alternatively, the user may choose to focus the model viewer on a cellular model located in a particular nephron segment in order to view the various membrane transport proteins. Selecting a specific protein may then present the user with a description of the mathematical model governing the behaviour of that protein—including the mathematical model itself and various simulation experiments used to validate the model against the literature.
physiome project; CellML; computational physiology
The loss of cardiac pump function accounts for a significant increase in both mortality and morbidity in Western society, where there is currently a one in four lifetime risk, and costs associated with acute and long-term hospital treatments are accelerating. The significance of cardiac disease has motivated the application of state-of-the-art clinical imaging techniques and functional signal analysis to aid diagnosis and clinical planning. Measurements of cardiac function currently provide high-resolution datasets for characterizing cardiac patients. However, the clinical practice of using population-based metrics derived from separate image or signal-based datasets often indicates contradictory treatment plans owing to inter-individual variability in pathophysiology. To address this issue, the goal of our work, demonstrated in this study through four specific clinical applications, is to integrate multiple types of functional data into a consistent framework using multi-scale computational modelling.
cardiac modelling; patient specific; multi-scale; virtual physiological human
The Virtual Physiological Human is synonymous with a programme in computational biomedicine that aims to develop a framework of methods and technologies to investigate the human body as a whole. It is predicated on the transformational character of information technology, brought to bear on that most crucial of human concerns, our own health and well-being.
computational biomedicine; biotechnology; biomedical informatics
Polymorphisms identified in genome-wide association studies of human traits rarely explain more than a small proportion of the heritable variation, and improving this situation within the current paradigm appears daunting. Given a well-validated dynamic model of a complex physiological trait, a substantial part of the underlying genetic variation must manifest as variation in model parameters. These parameters are themselves phenotypic traits. By linking whole-cell phenotypic variation to genetic variation in a computational model of a single heart cell, incorporating genotype-to-parameter maps, we show that genome-wide association studies on parameters reveal much more genetic variation than when using higher-level cellular phenotypes. The results suggest that letting such studies be guided by computational physiology may facilitate a causal understanding of the genotype-to-phenotype map of complex traits, with strong implications for the development of phenomics technology.
Despite an ever-increasing number of genome locations reported to be associated with complex human diseases or quantitative traits, only a small proportion of phenotypic variations in a typical quantitative trait can be explained by the discovered variants. We argue that this problem can partly be resolved by combining the statistical methods of quantitative genetics with computational biology. We demonstrate this for the in silico genotype-to-phenotype map of a model heart cell in conjunction with publicly accessible genomic data. We show that genome-wide association studies (GWAS) on model parameters identify more causal variants and can build better prediction models for the higher-level phenotypes than by performing GWAS on the higher-level phenotypes themselves. Since model parameters are in principle measurable physiological phenotypes, our findings suggest that development of future phenotyping technologies could be guided by mathematical models of the biological systems being targeted.
Understanding the causal chain from genotypic to phenotypic variation is a tremendous challenge with huge implications for personalized medicine. Here we argue that linking computational physiology to genetic concepts, methodology, and data provides a new framework for this endeavor. We exemplify this causally cohesive genotype–phenotype (cGP) modeling approach using a detailed mathematical model of a heart cell. In silico genetic variation is mapped to parametric variation, which propagates through the physiological model to generate multivariate phenotypes for the action potential and calcium transient under regular pacing, and ion currents under voltage clamping. The resulting genotype-to-phenotype map is characterized using standard quantitative genetic methods and novel applications of high-dimensional data analysis. These analyses reveal many well-known genetic phenomena like intralocus dominance, interlocus epistasis, and varying degrees of phenotypic correlation. In particular, we observe penetrance features such as the masking/release of genetic variation, so that without any change in the regulatory anatomy of the model, traits may appear monogenic, oligogenic, or polygenic depending on which genotypic variation is actually present in the data. The results suggest that a cGP modeling approach may pave the way for a computational physiological genomics capable of generating biological insight about the genotype–phenotype relation in ways that statistical-genetic approaches cannot.
causally cohesive genotype–phenotype modeling; multivariate genotype-to-phenotype map; cGP heart model; penetrance; epistasis
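The parameter-as-phenotype idea above can be illustrated with a deliberately small sketch. Everything here, the three loci, their per-allele effects, and the clipped "higher-level phenotype", is invented for illustration and is not the heart-cell model from these studies; the point is only that saturation at the higher level can mask genetic signal that is fully visible at the parameter level.

```python
import random

random.seed(1)

# Hypothetical genotype-to-parameter map: three biallelic loci additively
# determine one physiological parameter (say, a channel conductance).
def genotype_to_parameter(genotype):
    # genotype: tuple of allele counts (0, 1 or 2) at each locus
    effects = (0.10, 0.05, 0.02)          # assumed per-allele effects
    return 1.0 + sum(g * e for g, e in zip(genotype, effects))

def phenotype(p):
    # Saturation above p = 1.15 masks part of the parameter variation.
    return min(p, 1.15)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

population = [tuple(random.choice((0, 1, 2)) for _ in range(3))
              for _ in range(500)]
params = [genotype_to_parameter(g) for g in population]
phenos = [phenotype(p) for p in params]

# Marginal association of locus 0 with the parameter vs. the phenotype:
locus0 = [g[0] for g in population]
r_param = pearson(locus0, params)
r_pheno = pearson(locus0, phenos)
print(f"locus 0 vs parameter: r = {r_param:.3f}")
print(f"locus 0 vs phenotype: r = {r_pheno:.3f}")
```

Because the map from genotype to parameter is linear while the map to the phenotype saturates, the same locus shows a weaker marginal association at the phenotype level, mirroring the masking effect the abstracts describe.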
Weight loss and under-nutrition are relatively common in older people, and are associated with poor outcomes including increased rates of hospital admissions and death. In a pilot study of 49 undernourished, community-dwelling older people we found that daily treatment for one year with a combination of testosterone tablets and a nutritional supplement produced a significant reduction in hospitalizations. We propose a larger, multicentre study to explore and hopefully confirm this exciting, potentially important finding (NHMRC project grant number 627178).
A one-year randomized controlled trial in which subjects are allocated to either oral testosterone undecanoate plus a high-calorie oral nutritional supplement, or placebo medication plus a low-calorie oral nutritional supplement. 200 older, community-dwelling, undernourished people [Mini Nutritional Assessment score <24 and either: a) low body weight (body mass index, in kg/m2: <22) or b) recent weight loss (>7.5% over 3 months)]. Hospital admissions, quality-adjusted life years, functional status, nutritional health, muscle strength, body composition and other variables will be assessed.
The pilot study showed that combined treatment with an oral testosterone and a supplement drink was well tolerated and safe, and reduced the number of people hospitalised and the duration of hospital admissions in undernourished, community-dwelling older people. This is an exciting finding, as it identifies a treatment which may be of substantial benefit to many older people in our community. We now propose to conduct a multi-centre study to test these findings in a substantially larger subject group, and to determine the cost effectiveness of this treatment.
Australian Clinical Trial Registry: ACTRN 12610000356066
Motivation: Integrative mathematical and statistical models of cardiac anatomy and physiology can play a vital role in understanding cardiac disease phenotype and planning therapeutic strategies. However, the accuracy and predictive power of such models is dependent upon the breadth and depth of noninvasive imaging datasets. The Cardiac Atlas Project (CAP) has established a large-scale database of cardiac imaging examinations and associated clinical data in order to develop a shareable, web-accessible, structural and functional atlas of the normal and pathological heart for clinical, research and educational purposes. A goal of CAP is to facilitate collaborative statistical analysis of regional heart shape and wall motion and characterize cardiac function among and within population groups.
Results: Three main open-source software components were developed: (i) a database with web-interface; (ii) a modeling client for 3D + time visualization and parametric description of shape and motion; and (iii) open data formats for semantic characterization of models and annotations. The database was implemented using a three-tier architecture utilizing MySQL, JBoss and Dcm4chee, in compliance with the DICOM standard to provide compatibility with existing clinical networks and devices. Parts of Dcm4chee were extended to access image specific attributes as search parameters. To date, approximately 3000 de-identified cardiac imaging examinations are available in the database. All software components developed by the CAP are open source and are freely available under the Mozilla Public License Version 1.1 (http://www.mozilla.org/MPL/MPL-1.1.txt).
Supplementary information: Supplementary data are available at Bioinformatics online.
We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment and socio-economic interactions, and the co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.
European funding under framework 7 (FP7) for the virtual physiological human (VPH) project has been in place now for nearly 2 years. The VPH network of excellence (NoE) is helping in the development of common standards, open-source software, freely accessible data and model repositories, and various training and dissemination activities for the project. It is also helping to coordinate the many clinically targeted projects that have been funded under the FP7 calls. An initial vision for the VPH was defined by the framework 6 strategy for a European physiome (STEP) project in 2006. It is now time to assess the accomplishments of the last 2 years and update the STEP vision for the VPH. We consider the biomedical science, healthcare and information and communications technology challenges facing the project and we propose the VPH Institute as a means of sustaining the vision of VPH beyond the time frame of the NoE.
virtual physiological human; physiome; computational physiology; multi-scale modelling
Deterministic dynamic models of complex biological systems contain a large number of parameters and state variables, related through nonlinear differential equations with various types of feedback. A metamodel of such a dynamic model is a statistical approximation model that maps variation in parameters and initial conditions (inputs) to variation in features of the trajectories of the state variables (outputs) throughout the entire biologically relevant input space. A sufficiently accurate mapping can be exploited both instrumentally and epistemically. Multivariate regression methodology is a commonly used approach for emulating dynamic models. However, when the input-output relations are highly nonlinear or non-monotone, a standard linear regression approach is prone to give suboptimal results. We therefore hypothesised that a more accurate mapping can be obtained by locally linear or locally polynomial regression. We present here a new method for local regression modelling, Hierarchical Cluster-based PLS regression (HC-PLSR), where fuzzy C-means clustering is used to separate the data set into parts according to the structure of the response surface. We compare the metamodelling performance of HC-PLSR with polynomial partial least squares regression (PLSR) and ordinary least squares (OLS) regression on various systems: six different gene regulatory network models with various types of feedback, a deterministic mathematical model of the mammalian circadian clock and a model of the mouse ventricular myocyte function.
Our results indicate that multivariate regression is well suited for emulating dynamic models in systems biology. The hierarchical approach turned out to be superior to both polynomial PLSR and OLS regression in all three test cases. The advantage, in terms of explained variance and prediction accuracy, was largest in systems with highly nonlinear functional relationships and in systems with positive feedback loops.
HC-PLSR is a promising approach for metamodelling in systems biology, especially for highly nonlinear or non-monotone parameter-to-phenotype maps. The algorithm can be flexibly adjusted to suit the complexity of the dynamic model behaviour, inviting automation in the metamodelling of complex systems.
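A minimal sketch of the local-regression idea, with two stated simplifications: hard k-means stands in for fuzzy C-means clustering, and per-cluster ordinary least squares stands in for PLSR (the input here is one-dimensional, so PLSR would reduce to OLS anyway). All data and cluster counts are illustrative.

```python
import math
import random

random.seed(0)

# Toy nonlinear input-output map that defeats a single global linear model.
xs = [i / 100.0 * 2 * math.pi for i in range(100)]
ys = [math.sin(x) for x in xs]

def fit_ols(px, py):
    """Ordinary least squares for y = a + b*x."""
    n = len(px)
    mx, my = sum(px) / n, sum(py) / n
    sxx = sum((x - mx) ** 2 for x in px)
    if sxx == 0.0:
        return my, 0.0                 # degenerate cluster: constant model
    b = sum((x - mx) * (y - my) for x, y in zip(px, py)) / sxx
    return my - b * mx, b

def kmeans_1d(data, k, iters=20):
    """Plain k-means on scalars (hard stand-in for fuzzy C-means)."""
    centers = random.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            j = min(range(k), key=lambda c: abs(x - centers[c]))
            clusters[j].append(x)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return centers

def local_linear_predict(x, centers, models):
    j = min(range(len(centers)), key=lambda c: abs(x - centers[c]))
    a, b = models[j]
    return a + b * x

# Global linear fit vs. cluster-wise local linear fits.
a, b = fit_ols(xs, ys)
global_sse = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

k = 4
centers = kmeans_1d(xs, k)
models = []
for c in range(k):
    member = [(x, y) for x, y in zip(xs, ys)
              if min(range(k), key=lambda j: abs(x - centers[j])) == c]
    if member:
        models.append(fit_ols([m[0] for m in member], [m[1] for m in member]))
    else:
        models.append((0.0, 0.0))
local_sse = sum((y - local_linear_predict(x, centers, models)) ** 2
                for x, y in zip(xs, ys))
print(f"global SSE = {global_sse:.3f}, local SSE = {local_sse:.3f}")
```

Since each per-cluster fit minimizes its own residual and the global line is one of its candidates, the local model can never do worse in total squared error, and on non-monotone surfaces like this one it does far better, which is the core of the metamodelling argument above.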
Building repositories of computational models of biological systems ensures that published models are available for both education and further research, and can provide a source of smaller, previously verified models to integrate into a larger model.
One problem with earlier repositories has been the limitations in facilities to record the revision history of models. Often, these facilities are limited to a linear series of versions which were deposited in the repository. This is problematic for several reasons. Firstly, there are many instances in the history of biological systems modelling where an 'ancestral' model is modified by different groups to create many different models. With a linear series of versions, if the changes made to one model are merged into another model, the merge appears as a single item in the history. This hides useful revision history information, and also makes further merges much more difficult, as there is no record of which changes have or have not already been merged. In addition, a long series of individual changes made outside of the repository are also all merged into a single revision when they are put back into the repository, making it difficult to separate out individual changes. Furthermore, many earlier repositories only retain the revision history of individual files, rather than of a group of files. This is an important limitation to overcome, because some types of models, such as CellML 1.1 models, can be developed as a collection of modules, each in a separate file.
The need for revision history is widely recognised for computer software, and considerable effort has gone into developing version control systems, including distributed version control systems (DVCSs), for tracking revision history. However, to date, there has been no published research on how DVCSs can be applied to repositories of computational models of biological systems.
We have extended the Physiome Model Repository software to be fully revision history aware, by building it on top of Mercurial, an existing DVCS. We have demonstrated the utility of this approach, when used in conjunction with the model composition facilities in CellML, to build and understand more complex models. We have also demonstrated the ability of the repository software to present version history to casual users over the web, and to highlight specific versions which are likely to be useful to users.
Providing facilities for maintaining and using revision history information is an important part of building a useful repository of computational models, as this information is useful both for understanding the source of and justification for parts of a model, and to facilitate automated processes such as merges. The availability of fully revision history aware repositories, and associated tools, will therefore be of significant benefit to the community.
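The ancestry bookkeeping that makes repeated merges tractable can be sketched in a few lines. The revision graph below is invented for illustration, and Mercurial's actual storage differs, but the principle is the same: the changes one line of development still lacks are an ancestor-set difference, which is exactly the information a linear version series discards.

```python
# Minimal revision DAG: each revision lists its parent revisions.
# Revision ids and the scenario are illustrative, not real repository data.
parents = {
    "r0": [],            # common ancestral model
    "a1": ["r0"],        # group A's first change
    "a2": ["a1"],        # group A's second change
    "b1": ["r0"],        # group B's fork of the same model
    "m1": ["a1", "b1"],  # merge of b1 into A's line
}

def ancestors(rev):
    """All revisions reachable from rev, including rev itself."""
    seen, stack = set(), [rev]
    while stack:
        r = stack.pop()
        if r not in seen:
            seen.add(r)
            stack.extend(parents[r])
    return seen

def unmerged(target, source):
    """Revisions in source's history that target has not yet merged."""
    return ancestors(source) - ancestors(target)

print(sorted(unmerged("b1", "a2")))   # b1 predates the merge: a1 and a2
print(sorted(unmerged("m1", "a2")))   # after the m1 merge only a2 remains
```

With only a linear series of deposited versions, the `m1` merge would be a single opaque item and `unmerged` could not be computed, which is the limitation the repository work above addresses.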
The field modelling language FieldML is being developed as a standard for modelling and interchanging field descriptions in software, suitable for a wide range of computation techniques. It comprises a rich set of operators for defining generalized fields as functions of other fields, starting with basic domain fields including sets of discrete objects and coordinate systems. It is extensible by adding new operators and by their arbitrary combination in expressions, making it well suited for describing the inherent complexity of biological materials and organ systems. This paper describes the concepts behind FieldML, including a simple example of a spatially varying finite-element field. It outlines current implementations in established, open source computation and visualization software, both drawing on decades of bioengineering modelling software development experience.
field; modelling; computation; serialization; FieldML
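In the spirit of the simple finite-element example mentioned above, here is a hedged sketch of evaluating a spatially varying field on a 1-D mesh: a geometric field x(xi) and a dependent field u are both interpolated from nodal values with linear Lagrange basis functions. The node values, element list, and function names are ours for illustration, not FieldML syntax.

```python
# A 1-D mesh: nodal coordinates, nodal field values, and elements that
# each join two nodes. All numbers are invented for illustration.
nodes_x = [0.0, 1.0, 2.5, 4.0]      # nodal coordinates (geometric field)
nodes_u = [0.0, 2.0, 3.0, 2.0]      # nodal values of the dependent field
elements = [(0, 1), (1, 2), (2, 3)]

def basis(xi):
    """Linear Lagrange basis on the local coordinate xi in [0, 1]."""
    return (1.0 - xi, xi)

def interpolate(values, element, xi):
    """Evaluate a field at local coordinate xi within an element."""
    phi = basis(xi)
    n0, n1 = elements[element]
    return phi[0] * values[n0] + phi[1] * values[n1]

# Both fields are functions of the same local coordinate field, so they
# can be evaluated at the midpoint of element 1 with the same machinery:
x_mid = interpolate(nodes_x, 1, 0.5)
u_mid = interpolate(nodes_u, 1, 0.5)
print(f"x = {x_mid}, u = {u_mid}")   # x = 1.75, u = 2.5
```

The design point this illustrates is the one the abstract emphasizes: geometry and dependent variables are both "generalized fields" defined by composing interpolation operators over more basic fields.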
The Physiome Project, exemplified by the Cardiac Physiome, is now 10 years old. In this article, we review past progress and future challenges in developing a quantitative framework for understanding human physiology that incorporates both genetic inheritance and environmental influence. Despite the enormity of the challenge, which is certainly greater than that facing the pioneers of the human genome project 20 years ago, there is reason for optimism that real and accelerating progress is being made.
Mass populations of toxin-producing cyanobacteria commonly develop in fresh-, brackish- and marine waters and effective strategies for monitoring and managing cyanobacterial health risks are required to safeguard animal and human health. A multidisciplinary study, including two UK freshwaters with a history of toxic cyanobacterial blooms, was undertaken to explore different approaches for the identification, monitoring and management of potentially toxic cyanobacteria and their associated risks. The results demonstrate that (i) cyanobacterial bloom occurrence can be predicted at a local and national scale using process-based and statistical models; (ii) cyanobacterial concentration and distribution in waterbodies can be monitored using remote sensing, but minimum detection limits need to be evaluated; (iii) cyanotoxins may be transferred to spray-irrigated root crops; and (iv) attitudes and perceptions towards risks influence the public's preferences and willingness-to-pay for cyanobacterial health risk reductions in recreational waters.
The development of standards for encoding mathematical models is an important component of model building and model sharing among scientists interested in understanding multi-scale physiological processes. CellML provides such a standard, particularly for models based on biophysical mechanisms, and a substantial number of models are now available in the CellML Model Repository. However, there is an urgent need to extend the current CellML metadata standard to provide biological and biophysical annotation of the models in order to facilitate model sharing, automated model reduction and connection to biological databases. This paper gives a broad overview of a number of new developments on CellML metadata and provides links to further methodological details available from the CellML website.
CellML; markup languages; metadata; modelling
Type 2 diabetes is characterized by insulin resistance of target organs, which is due to impaired insulin signal transduction. The skeleton of signaling mediators that provide for normal insulin action has been established. However, the detailed kinetics, and their mechanistic generation, remain incompletely understood. We measured time-courses in primary human adipocytes for the short-term phosphorylation dynamics of the insulin receptor (IR) and the IR substrate-1 in response to a step increase in insulin concentration. Both proteins exhibited a rapid transient overshoot in tyrosine phosphorylation, reaching maximum within 1 min, followed by an intermediate steady-state level after approximately 10 min. We used model-based hypothesis testing to evaluate three mechanistic explanations for this behavior: (A) phosphorylation and dephosphorylation of IR at the plasma membrane only; (B) the additional possibility for IR endocytosis; (C) the alternative additional possibility of feedback signals to IR from downstream intermediates. We concluded that (A) is not a satisfactory explanation; that (B) may serve as an explanation only if internalization, dephosphorylation, and subsequent recycling are all permitted; and that (C) is acceptable. These mechanistic insights cannot be obtained by mere inspection of the datasets, and they are rejections, and thus stronger and more final conclusions than ordinary model predictions.
Insulin is a central player in maintaining energy balance in our bodies and in type 2 diabetes, where the effect of insulin on its target tissues is diminished. Insulin acts on cells by binding to specific insulin receptors (IRs) at the cell surface. This triggers a series of events, including attachment of phosphate to IR, activation of downstream proteins that eventually mediate the signal to specific targets in the cell, and internalization of IR to the inner cytosolic part of the cell. The importance, time relations, and interactions between these events are not fully understood. We have collected experimental time-series and developed a novel analysis method based on mathematical modeling to gain insights into these initial aspects of how insulin controls cells. The main conclusion is that either IR internalization and the subsequent recycling back to the cell surface or feedbacks from downstream proteins (or both) must be significantly active during the first few minutes of insulin action. These conclusions could not have been reached from the experimental data through conventional biological reasoning, and this work thus illustrates the power of modeling to improve our understanding of biological systems.
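Hypothesis (A) above, phosphorylation and dephosphorylation at the plasma membrane only, can be sketched as a two-state mass-action model, and doing so makes the rejection plausible: a step increase in insulin drives such a system monotonically to steady state, with no transient overshoot of the kind measured. All rate constants below are invented for illustration and are not fitted values from the study.

```python
def simulate_model_a(k_phos=1.0, k_dephos=0.2, insulin=1.0,
                     t_end=10.0, dt=0.001):
    """Euler integration of a two-state receptor model:
    IR <-> IRp, with phosphorylation driven by a step insulin input.
    Rate constants are illustrative, not fitted."""
    ir, irp = 1.0, 0.0          # fractions of receptor in each state
    trace = []
    for _ in range(int(t_end / dt)):
        flux = k_phos * insulin * ir - k_dephos * irp
        ir -= flux * dt
        irp += flux * dt
        trace.append(irp)
    return trace

trace = simulate_model_a()
peak = max(trace)
final = trace[-1]
print(f"peak IRp = {peak:.3f}, final IRp = {final:.3f}")
# Monotone rise: the maximum occurs at the final time point.
```

Because the measured IR phosphorylation peaks within a minute and then relaxes to a lower plateau, a monotone model of this form cannot reproduce the data, which is the model-based rejection the abstract describes.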
During embryogenesis, multicellular animals are shaped via cell proliferation, cell rearrangement, and apoptosis. At the end of development, tissue architecture is then maintained through balanced rates of cell proliferation and loss. Here, we take an in silico approach to look for generic systems features of morphogenesis in multicellular animals that arise as a consequence of the evolution of development. Using artificial evolution, we evolved cellular automata-based digital organisms that have distinct embryonic and homeostatic phases of development. Although these evolved organisms use a variety of strategies to maintain their form over time, organisms of different types were all found to rapidly recover from environmental damage in the form of wounds. This regenerative response was most robust in an organism with a stratified tissue-like architecture. An evolutionary analysis revealed that evolution itself contributed to the ability of this organism to maintain its form in the face of genetic and environmental perturbation, confirming the results of previous studies. In addition, the exceptional robustness of this organism to surface injury was found to result from an upward flux of cells, driven in part by cell divisions within a stable niche at the tissue base. Given the general nature of the model, our results lead us to suggest that many of the robust systems properties observed in real organisms, including scar-free wound-healing in well-protected embryos and the layered tissue architecture of regenerating epithelial tissues, may be by-products of the evolution of morphogenesis, rather than the direct result of selection.
During development, multicellular animals are shaped by cell proliferation, cell rearrangement, and cell death to generate an adult whose form is maintained over time. Disruption of this finely balanced state can have devastating consequences, including aging, psoriasis, and cancer. Typically, however, development is robust, so that animals achieve the same final form even when challenged by environmental damage such as wounding. To see how morphogenetic robustness arises, we have taken an in silico approach to evolve digital organisms that exhibit distinct phases of growth and homeostasis. During the homeostasis period, organisms were found to use a variety of strategies to maintain their form. Remarkably, however, all recovered from severe wounds, despite having evolved in the absence of selection pressure to do so. This ability to regenerate was most striking in an organism with a tissue-like architecture, where it was enhanced by a directional flux of cells that drives tissue turnover. This identifies a stratified architecture, like that seen in human skin and gut, as an evolutionarily accessible and robust form of tissue organisation, and suggests that wound-healing may be a general feature of evolved morphogenetic systems. Both may therefore contribute to homeostasis, wound-healing, and regeneration in real animals.
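As a toy illustration of the wound-closure behaviour described above, a single local division rule on a 1-D lattice suffices: an empty site adjacent to a living cell is refilled. This rule is ours, far simpler than the evolved genomes in the study, and intended only to show how local rules yield global recovery.

```python
def step(tissue):
    """One update: any empty site with a living neighbour is refilled
    by a 'division' of that neighbour."""
    new = list(tissue)
    for i in range(len(tissue)):
        if tissue[i] == 0:
            left = tissue[i - 1] if i > 0 else 0
            right = tissue[i + 1] if i < len(tissue) - 1 else 0
            if left or right:
                new[i] = 1
    return new

tissue = [1] * 20
tissue[5:12] = [0] * 7          # inflict a 7-cell wound
wounded = sum(tissue)

t = 0
while 0 in tissue:
    tissue = step(tissue)
    t += 1

print(f"{wounded}/20 cells after wounding; healed in {t} steps")
```

The wound closes from both edges at one cell per step per side, so a 7-cell wound heals in 4 steps; in the evolved organisms the analogous repair emerges without any explicit healing rule, which is the paper's point.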
Discrete stochastic simulations are a powerful tool for understanding the dynamics of chemical kinetics when there are small-to-moderate numbers of certain molecular species. In this paper we introduce delays into the stochastic simulation algorithm, thus mimicking delays associated with transcription and translation. We then show that this process may explain the observed sustained oscillations in expression levels of hes1 mRNA and Hes1 protein more faithfully than continuous deterministic models.
Delay processes are ubiquitous in the biological sciences but are not always well-represented in mathematical models attempting to describe these biological processes. Additional issues arise when attempting to capture the uncertainty (intrinsic noise) associated with chemical kinetics in dealing with when and in what order reactions take place. Complicating the situation further are important instances when certain key molecules occur only in small numbers, so that it is not meaningful to talk about concentrations.
In this paper Barrio et al. show how to incorporate delay, intrinsic noise, and discreteness associated with chemical kinetic systems into a very simple algorithm called the delay stochastic simulation algorithm (DSSA). This algorithm very naturally generalises the stochastic simulation algorithm that does not treat delays. The authors then apply the DSSA to a specific set of experiments performed by Hirata et al. who showed, amongst other things, that serum treatment of cultured cells induces cyclic expression of both mRNA and protein of the Notch effector Hes1 with a two-hour period. The authors show how this approach can explain additional experiments performed by Hirata et al., and, because this approach is very general, suggest that it can provide deep insights into the relationship between delayed processes, intrinsic noise, and small numbers of molecules in many biological systems.
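The DSSA extends Gillespie's direct method by scheduling the products of delayed reactions for insertion at a later time. The sketch below applies that scheme to an invented two-reaction network (delayed zeroth-order production, immediate first-order decay); it is not the Hes1 model, which additionally needs delayed negative feedback to oscillate, but it shows the core bookkeeping.

```python
import random
import heapq

random.seed(42)

def dssa(k_produce=2.0, k_decay=0.1, tau=5.0, t_end=200.0):
    """Delay stochastic simulation of an illustrative network:
    0 -> X with rate k_produce, where X appears tau time units after
    the reaction fires; X -> 0 with rate k_decay * X, immediately."""
    t, x = 0.0, 0               # time and copy number of species X
    pending = []                # min-heap of maturation times for delayed X
    history = []
    while t < t_end:
        a1 = k_produce          # propensity of (delayed) production
        a2 = k_decay * x        # propensity of (immediate) decay
        a0 = a1 + a2
        dt = random.expovariate(a0)
        if pending and pending[0] <= t + dt:
            # A delayed product matures before the next reaction: apply it
            # and resample (valid because exponentials are memoryless).
            t = heapq.heappop(pending)
            x += 1
        else:
            t += dt
            if random.random() * a0 < a1:
                heapq.heappush(pending, t + tau)   # schedule delayed product
            elif x > 0:
                x -= 1
        history.append((t, x))
    return history

history = dssa()
final_x = history[-1][1]
print(f"final copy number: {final_x}")
```

The copy number fluctuates around k_produce/k_decay = 20, illustrating both the intrinsic noise and the small-number regime the commentary emphasizes; replacing the constant production term with a delayed, protein-repressed one is what yields the Hes1-style oscillations.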
Observations on the relationship between cardiac work rate and the levels of energy metabolites adenosine triphosphate (ATP), adenosine diphosphate (ADP), and phosphocreatine (CrP) have not been satisfactorily explained by theoretical models of cardiac energy metabolism. Specifically, the in vivo stability of ATP, ADP, and CrP levels in response to changes in work and respiratory rate has eluded explanation. Here a model of mitochondrial oxidative phosphorylation, previously developed based on data obtained from isolated cardiac mitochondria, is integrated with a spatially distributed model of oxygen transport in the myocardium to analyze data obtained from several laboratories over the past two decades. The model includes the components of the respiratory chain, the F₀F₁-ATPase, adenine nucleotide translocase, and the mitochondrial phosphate transporter at the mitochondrial level; adenylate kinase, creatine kinase, and ATP consumption in the cytoplasm; and oxygen transport between capillaries, interstitial fluid, and cardiomyocytes. The integrated model is able to reproduce experimental observations on ATP, ADP, CrP, and inorganic phosphate levels in canine hearts over a range of workload and during coronary hypoperfusion and predicts that cytoplasmic inorganic phosphate level is a key regulator of the rate of mitochondrial respiration at workloads for which the rate of cardiac oxygen consumption is less than or equal to approximately 12 μmol per minute per gram of tissue. At work rates corresponding to oxygen consumption higher than 12 μmol min⁻¹ g⁻¹, model predictions deviate from the experimental data, indicating that at high work rates, additional regulatory mechanisms that are not currently incorporated into the model may be important. Nevertheless, the integrated model explains metabolite levels observed at low to moderate workloads and the changes in metabolite levels and tissue oxygenation observed during graded hypoperfusion.
These findings suggest that the observed stability of energy metabolites emerges as a property of a properly constructed model of cardiac substrate transport and mitochondrial metabolism. In addition, the validated model provides quantitative predictions of changes in phosphate metabolites during cardiac ischemia.
To function properly over a range of work rates, the heart must maintain its metabolic energy level within a range that is narrow relative to changes in the rate of energy utilization. Decades of observations have revealed that in cardiac muscle cells, the supply of adenosine triphosphate (ATP)—the primary currency of intracellular energy transfer—is controlled to maintain intracellular concentrations of ATP and related compounds within narrow ranges. Yet the development of a mechanistic understanding of this tight control has lagged behind experimental observation. This paper introduces a computational model that links ATP synthesis in a subcellular body called the mitochondrion with ATP utilization in the cytoplasm, and reveals that the primary control mechanism operating in the system is feedback of substrate concentrations for ATP synthesis. In other words, changes in the concentrations of the products generated by the utilization of ATP in the cell (adenosine diphosphate and inorganic phosphate) effect changes in the rate at which mitochondria utilize those products to resynthesize ATP.
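The substrate-feedback principle described above can be reduced to a single conserved ATP/Pi pool: ATP synthesis depends on inorganic phosphate (Pi) through a saturating, Michaelis-Menten-like term, while consumption tracks workload. Pool sizes and constants below are invented, and the full model is far richer, but the qualitative outcome matches the claim: Pi rises with workload while ATP moves comparatively little.

```python
def steady_state(workload, v_max=30.0, km=1.0, total=10.0,
                 dt=0.001, t_end=200.0):
    """Euler integration of a toy conserved pool: atp + pi == total.
    Synthesis is Pi-fed and saturating; consumption is workload-driven.
    All constants are illustrative, not fitted."""
    atp, pi = 9.0, 1.0          # nominal mM values
    for _ in range(int(t_end / dt)):
        synthesis = v_max * pi / (km + pi)      # Pi-stimulated ATP production
        consumption = workload                  # work-rate-driven ATP use
        atp = min(max(atp + (synthesis - consumption) * dt, 0.0), total)
        pi = total - atp
    return atp, pi

for w in (5.0, 10.0, 20.0):
    atp, pi = steady_state(w)
    print(f"workload {w:4.1f}: ATP = {atp:.2f}, Pi = {pi:.2f}")
```

At steady state the synthesis term equals the workload, so Pi settles at workload*km/(v_max - workload): quadrupling the workload here roughly decuples Pi while ATP falls by under 20%, a miniature version of the metabolite stability the integrated model reproduces at low to moderate work rates.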
One hundred and forty-one randomly selected surgical patients, aged 35 years or over, were studied preoperatively, followed through their operative procedures, and reassessed during the first post-operative week for evidence of myocardial ischaemia associated with surgical operations under general anaesthesia. Of these patients 38% were found to have preoperative clinical evidence of heart disease, hypertension, or diabetes; 45% had abnormal preoperative E.C.G. patterns.
Three patients experienced myocardial infarction during or within 36 hours of operation, all of the occult type; all were in the preoperative abnormal groups. Non-specific postoperative E.C.G. changes were equally common in the groups of patients with normal or abnormal preoperative electrocardiograms.
A relationship existed between a rise in serum lactic dehydrogenase (L.D.H.) concentration and the field of the operation, but the diagnosis of infarction was not confused provided serum L.D.H. isoenzyme patterns and a rise in serum aspartate aminotransferase (S.G.O.T.) levels were consistent with the diagnosis.