1.  Genetic Predictors of Response to Serotonergic and Noradrenergic Antidepressants in Major Depressive Disorder: A Genome-Wide Analysis of Individual-Level Data and a Meta-Analysis 
PLoS Medicine  2012;9(10):e1001326.
Testing whether genetic information could inform the selection of the best drug for patients with depression, Rudolf Uher and colleagues searched for genetic variants that could predict clinically meaningful responses to two major groups of antidepressants.
Background
It has been suggested that outcomes of antidepressant treatment for major depressive disorder could be significantly improved if treatment choice is informed by genetic data. This study aims to test the hypothesis that common genetic variants can predict response to antidepressants in a clinically meaningful way.
Methods and Findings
The NEWMEDS consortium, an academia–industry partnership, assembled a database of over 2,000 European-ancestry individuals with major depressive disorder, prospectively measured treatment outcomes with serotonin reuptake inhibiting or noradrenaline reuptake inhibiting antidepressants and available genetic samples from five studies (three randomized controlled trials, one part-randomized controlled trial, and one treatment cohort study). After quality control, a dataset of 1,790 individuals with high-quality genome-wide genotyping provided adequate power to test the hypotheses that antidepressant response or a clinically significant differential response to the two classes of antidepressants could be predicted from a single common genetic polymorphism. None of the more than half million genetic markers significantly predicted response to antidepressants overall, serotonin reuptake inhibitors, or noradrenaline reuptake inhibitors, or differential response to the two types of antidepressants (genome-wide significance p<5×10−8). No biological pathways were significantly overrepresented in the results. No significant associations (genome-wide significance p<5×10−8) were detected in a meta-analysis of NEWMEDS and another large sample (STAR*D), with 2,897 individuals in total. Polygenic scoring found no convergence among multiple associations in NEWMEDS and STAR*D.
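The genome-wide significance threshold applied above (p < 5×10−8) is conventionally derived as a Bonferroni correction of the usual α = 0.05 for roughly one million effectively independent common variants; a minimal sketch (the marker count is the conventional estimate, not a figure from this study):

```python
# Conventional derivation of the genome-wide significance threshold:
# the usual alpha of 0.05 divided by ~1 million effectively
# independent common variants tested across the genome.
alpha = 0.05
effective_independent_tests = 1_000_000  # conventional estimate

genome_wide_threshold = alpha / effective_independent_tests
print(genome_wide_threshold)  # 5e-08
```

Any single-variant association must clear this stringent threshold to be declared genome-wide significant, which is why none of the half million markers tested here qualified.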
Conclusions
No single common genetic variant was associated with antidepressant response at a clinically relevant level in a European-ancestry cohort. Effects specific to particular antidepressant drugs could not be investigated in the current study.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Genetic and environmental factors can influence a person's response to medications. Taking advantage of the recent advancements in genetics, scientists are working to match specific gene variations with responses to particular medications. Knowing whether a patient is likely to respond to a drug or have serious side effects would allow doctors to select the best treatment up front. Right now, there are only a handful of examples where a patient's version of a particular gene predicts their response to a particular drug. Some scientists believe that there will be many more such matches between genetic variants and treatment responses. Others think that because the action of most drugs is influenced by many different genes, a variant in one of those genes is unlikely to have measurable effect in most cases.
Why Was This Study Done?
One of the areas where patients' responses to available drugs vary widely is severe depression (major depressive disorder). Prescription of an antidepressant is often the first step in treating the disease. However, fewer than half of patients get well on the first antidepressant prescribed. Those who don't respond to the first drug need to work with their doctors through multiple courses of treatment to find the right drug and the right dose for them. For some patients, none of the existing drugs work well.
To see whether genetic information could help improve the choice of antidepressant, researchers from universities and the pharmaceutical industry joined forces in this large study. They examined two ways to use genetic information to improve the treatment of depression. First, they searched all genes for common genetic variants that could predict which patients would not respond to the two major groups of antidepressants (serotonin reuptake inhibitors, or SRIs, and noradrenaline reuptake inhibitors, or NRIs). They hoped that this would help with the development of new drugs that could help these patients. Second, they looked for common genetic variants in all genes that could identify patients who responded to one of the two major groups of antidepressants. Such predictors would make it possible to know which drug to prescribe for which patient.
What Did the Researchers Do and Find?
The researchers selected 1,790 patients with severe depression who had participated in one of several research studies; 1,222 of the patients had been treated with an SRI, the remaining 568 with an NRI, and it was recorded how well the drugs worked for each patient. The researchers also had a detailed picture of the genetic make-up of each patient, with information for over half a million genetic variants. They then looked for an association between genetic variants and responses to drugs.
They found not a single genetic variant that could predict clearly whether a person would respond to antidepressants in general, to one of the two main groups (SRIs and NRIs), or much better to one than the other. They also didn't find any combination of variants in groups of genes that work together that could predict responses. Combining their data with those from another large study did not yield any robust predictors either.
What Do These Findings Mean?
This study was large enough that it should have been possible to find common genetic variants that by themselves could predict a clinically meaningful response to SRIs and/or NRIs, had such variants existed. That the study failed to find any suggests they do not exist. It is still possible, however, that less common variants, or combinations of variants, could predict response. To find those, if they do exist, even larger studies will be needed.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001326
The National Institute of General Medical Sciences at the US National Institutes of Health has a fact sheet on personalized medicine
PubMed Health at the US National Library of Medicine has a page on major depressive disorder
Wikipedia has pages on major depressive disorder and pharmacogenetics, the study of how genetic variation affects response to certain drugs (note that Wikipedia is a free online encyclopedia that anyone can edit)
The UK National Health Service has comprehensive information pages on depression
doi:10.1371/journal.pmed.1001326
PMCID: PMC3472989  PMID: 23091423
2.  The Role of the Toxicologic Pathologist in the Post-Genomic Era# 
Journal of Toxicologic Pathology  2013;26(2):105-110.
An era can be defined as a period in time identified by distinctive character, events, or practices. We are now in the genomic era. The pre-genomic era: The pre-genomic era started many years ago with novel and seminal animal experiments, primarily directed at studying cancer. It is marked by the development of the two-year rodent cancer bioassay and the ultimate realization that alternative approaches and short-term animal models were needed to replace this resource-intensive and time-consuming method for predicting human health risk. Many alternative approaches and short-term animal models were proposed and tried but, to date, none has completely replaced our dependence upon the two-year rodent bioassay. However, the alternative approaches and models themselves have made tangible contributions to basic research, clinical medicine, and our understanding of cancer, and they remain useful tools to address hypothesis-driven research questions. The pre-genomic era was a time when toxicologic pathologists played a major role in drug development, evaluating the cancer bioassay and the associated dose-setting toxicity studies, and exploring the utility of proposed alternative animal models. It was a time when there was a shortage of qualified toxicologic pathologists. The genomic era: We are in the genomic era. It is a time when the genetic underpinnings of normal biological and pathologic processes are being discovered and documented. It is a time for sequencing entire genomes and deliberately silencing relevant segments of the mouse genome to see what each segment controls and whether that silencing leads to increased susceptibility to disease. What remains to be charted in this genomic era is the complex interaction of genes, gene segments, post-translational modifications of encoded proteins, and environmental factors that affect genomic expression.
In this current genomic era, the toxicologic pathologist has had to make room for a growing population of molecular biologists. In this present era, newly emerging DVM and MD scientists enter the work arena with a PhD in pathology, often based on some aspect of molecular biology or molecular pathology research. In molecular biology, the almost daily technological advances require one's complete dedication to remain at the cutting edge of the science. Similarly, the practice of toxicologic pathology, like other morphological disciplines, is based largely on experience and requires dedicated daily examination of pathology material to maintain a well-trained eye capable of distilling specific information from stained tissue slides - a dedicated effort that cannot be done well as an intermezzo between other tasks. It is a rare individual who has true expertise in both molecular biology and pathology. In this genomic era, the newly emerging DVM-PhD or MD-PhD pathologist enters a marketplace with few job opportunities, in contrast to the pre-genomic era. Many face an identity crisis, needing to decide whether to become a competent pathologist or a competent molecular biologist. At the same time, more PhD molecular biologists without training in pathology are members of the research teams working in drug development and toxicology. How best can the toxicologic pathologist interact in the contemporary team approach to drug development, toxicology research, and safety testing? Based on their biomedical training, toxicologic pathologists are in an ideal position to link data from the emerging technologies with their knowledge of pathobiology and toxicology. To enable this linkage and obtain the synergy it provides, the bench-level, slide-reading expert pathologist will need to have some basic understanding and appreciation of molecular biology methods and tools.
On the other hand, it is not likely that the typical molecular biologist could competently evaluate and diagnose stained tissue slides from a toxicology study or a cancer bioassay. The post-genomic era: The post-genomic era will likely arrive around 2050, at which time entire genomes from multiple species will exist in massive databases, data from thousands of robotic high-throughput chemical screenings will exist in other databases, and genetic toxicity and chemical structure-activity relationships will reside in yet other databases. All databases will be linked, and relevant information will be extracted and analyzed by appropriate algorithms following input of the latest molecular, submolecular, genetic, experimental, pathology, and clinical data. Knowledge gained will permit the genetic components of many diseases to become amenable to therapeutic prevention and/or intervention. Much as computerized algorithms are currently used to forecast weather or to predict political elections, sophisticated computerized algorithms based largely on scientific data mining will categorize new drugs and chemicals relative to their health benefits versus their health risks for defined human populations and subpopulations. However, this form of virtual toxicity study or cancer bioassay will only identify probabilities of adverse consequences from the interaction of particular environmental and/or chemical/drug exposure(s) with specific genomic variables. Proof in many situations will require confirmation in intact in vivo mammalian animal models. The toxicologic pathologist in the post-genomic era will be the scientist best suited to confirm the data mining and its probability predictions for safety or adverse consequences against the actual tissue morphological features in test species that define specific test agent pathobiology and human health risk.
doi:10.1293/tox.26.105
PMCID: PMC3695332  PMID: 23914052
genomic era; history of toxicologic pathology; molecular biology
3.  Number of Patients Studied Prior to Approval of New Medicines: A Database Analysis 
PLoS Medicine  2013;10(3):e1001407.
In an evaluation of medicines approved by the European Medicines Agency from 2000 to 2010, Ruben Duijnhoven and colleagues find that the number of patients evaluated for medicines approved for chronic use is inadequate for evaluation of safety or long-term efficacy.
Background
At the time of approval of a new medicine, there are few long-term data on the medicine's benefit–risk balance. Clinical trials are designed to demonstrate efficacy, but have major limitations with regard to safety in terms of patient exposure and length of follow-up. This study of the number of patients who had been administered medicines at the time of medicine approval by the European Medicines Agency aimed to determine the total number of patients studied, as well as the number of patients studied long term for chronic medication use, compared with the International Conference on Harmonisation's E1 guideline recommendations.
Methods and Findings
All medicines containing new molecular entities approved between 2000 and 2010 were included in the study, including orphan medicines as a separate category. The total number of patients studied before approval was extracted (main outcome). In addition, the number of patients with long-term use (6 or 12 mo) was determined for chronic medication. 200 unique new medicines were identified: 161 standard and 39 orphan medicines. The median total number of patients studied before approval was 1,708 (interquartile range [IQR] 968–3,195) for standard medicines and 438 (IQR 132–915) for orphan medicines. On average, chronic medication was studied in a larger number of patients (median 2,338, IQR 1,462–4,135) than medication for intermediate (878, IQR 513–1,559) or short-term use (1,315, IQR 609–2,420). Safety and efficacy of chronic use was studied in fewer than 1,000 patients for at least 6 and 12 mo in 46.4% and 58.3% of new medicines, respectively. Among the 84 medicines intended for chronic use, 68 (82.1%) met the guideline recommendations for 6-mo use (at least 300 participants studied for 6 mo and at least 1,000 participants studied for any length of time), whereas 67 (79.8%) of the medicines met the criteria for 12-mo patient exposure (at least 100 participants studied for 12 mo).
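The guideline criteria applied above reduce to simple threshold checks; a hedged sketch in Python (the function and its inputs are illustrative, not part of the study's methods):

```python
# Check a chronic-use medicine's pre-approval exposure against the
# criteria described above: the 6-month criterion requires at least
# 300 participants studied for 6 months and at least 1,000 studied
# for any length of time; the 12-month criterion requires at least
# 100 participants studied for 12 months.
def meets_exposure_criteria(total, six_month, twelve_month):
    return {
        "six_month_ok": six_month >= 300 and total >= 1000,
        "twelve_month_ok": twelve_month >= 100,
    }

# Hypothetical medicine: 2,338 patients studied in total, 350
# followed for 6 months, 90 followed for 12 months.
print(meets_exposure_criteria(2338, 350, 90))
# {'six_month_ok': True, 'twelve_month_ok': False}
```

Applying such a check to each of the 84 chronic-use medicines yields the compliance percentages reported above.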
Conclusions
For medicines intended for chronic use, the number of patients studied before marketing is insufficient to evaluate safety and long-term efficacy. Both safety and efficacy require continued study after approval. New epidemiologic tools and legislative actions necessitate a review of the requirements for the number of patients studied prior to approval, particularly for chronic use, and adequate use of post-marketing studies.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Before any new medicine is marketed for the treatment of a human disease, it has to go through extensive laboratory and clinical research. In the laboratory, scientists investigate the causes of diseases, identify potential new treatments, and test these interventions in disease models, some of which involve animals. The safety and efficacy of potential new interventions is then investigated in a series of clinical trials—studies in which the new treatment is tested in selected groups of patients under strictly controlled conditions, first to determine whether the drug is tolerated by humans and then to assess its efficacy. Finally, the results of these trials are reviewed by the government body responsible for drug approval; in the US, this body is the Food and Drug Administration, and in the European Union, the European Medicines Agency (EMA) is responsible for the scientific evaluation and approval of new medicines.
Why Was This Study Done?
Clinical trials are primarily designed to test the efficacy—the ability to produce the desired therapeutic effect—of new medicines. The number of patients needed to establish efficacy determines the size of a clinical trial, and the indications for which efficacy must be shown determine the trial's duration. However, identifying adverse effects of drugs generally requires the drug to be taken by more patients than are required to show efficacy, so the information about adverse effects is often relatively limited at the end of clinical testing. Consequently, when new medicines are approved, their benefit–risk ratios are often poorly defined, even though physicians need this information to decide which treatment to recommend to their patients. For the evaluation of risk or adverse effects of medicines being developed for chronic (long-term) treatment of non-life-threatening diseases, current guidelines recommend that at least 1,000–1,500 patients are exposed to the new drug and that 300 and 100 patients use the drug for six and twelve months, respectively, before approval. But are these guidelines being followed? In this database analysis, the researchers use data collected by the EMA to determine how many patients are exposed to new medicines before approval in the European Union and how many are exposed for extended periods of time to medicines intended for chronic use.
What Did the Researchers Do and Find?
Using the European Commission's Community Register of Medicinal Products, the researchers identified 161 standard medicines and 39 orphan medicines (medicines to treat or prevent rare life-threatening diseases) that contained new active substances and that were approved in the European Union between 2000 and 2010. They extracted information on the total number of patients studied and on the number exposed to the medicines for six months and twelve months before approval of each medicine from EMA's European public assessment reports. The average number of patients studied before approval was 1,708 for standard medicines and 438 for orphan medicines (marketing approval is easier to obtain for orphan medicines than for standard medicines to encourage drug companies to develop medicines that might otherwise be unprofitable). On average, medicines for chronic use (for example, asthma medications) were studied in more patients (2,338) than those for intermediate use such as anticancer drugs (878), or short-term use such as antibiotics (1,315). The safety and efficacy of chronic use was studied in fewer than 1,000 patients for at least six and twelve months in 46.4% and 58.4% of new medicines, respectively. Finally, among the 84 medicines intended for chronic use, 72 were studied in at least 300 patients for six months, and 70 were studied in at least 100 patients for twelve months.
What Do These Findings Mean?
These findings suggest that although the number of patients studied before approval is sufficient to determine the short-term efficacy of new medicines, it is insufficient to determine safety or long-term efficacy. Any move by drug approval bodies to require pharmaceutical companies to increase the total number of patients exposed to a drug, or the number exposed for extended periods of time to drugs intended for chronic use, would inevitably delay the entry of new products into the market, which likely would be unacceptable to patients and healthcare providers. Nevertheless, the researchers suggest that a reevaluation of the study size and long-term data requirements that need to be met for the approval of new medicines, particularly those designed for long-term use, is merited. They also stress the need for continued study of both the safety and efficacy of new medicines after approval and the importance of post-marketing studies that actively examine safety issues.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001407.
The European Medicines Agency (EMA) provides information about all aspects of the scientific evaluation and approval of new medicines in the European Union; its European public assessment reports are publicly available
The European Commission's Community Register of Medicinal Products is a publicly searchable database of medicinal products approved for human use in the European Union
The US Food and Drug Administration provides information about drug approval in the US for consumers and for health professionals
The US National Institutes of Health provides information (including personal stories) about clinical trials
doi:10.1371/journal.pmed.1001407
PMCID: PMC3601954  PMID: 23526887
4.  Evaluating Drug Prices, Availability, Affordability, and Price Components: Implications for Access to Drugs in Malaysia 
PLoS Medicine  2007;4(3):e82.
Background
Malaysia's stable health care system is facing challenges with increasing medicine costs. To investigate these issues a survey was carried out to evaluate medicine prices, availability, affordability, and the structure of price components.
Methods and Findings
The methodology developed by the World Health Organization (WHO) and Health Action International (HAI) was used. Price and availability data for 48 medicines were collected from 20 public sector facilities, 32 private sector retail pharmacies, and 20 dispensing doctors in four geographical regions of West Malaysia. Medicine prices were compared with international reference prices (IRPs) to obtain a median price ratio. The daily wage of the lowest-paid unskilled government worker was used to gauge the affordability of medicines. Price component data were collected throughout the supply chain, and markups, taxes, and other distribution costs were identified. In private pharmacies, innovator brand (IB) prices were 16 times higher than the IRPs, while generics were 6.6 times higher. In dispensing doctor clinics, the figures were 15 times higher than the IRPs for innovator brands and 7.5 times for generics. Dispensing doctors applied high markups of 50%–76% for IBs, and up to 316% for generics. Retail pharmacy markups were also high: 25%–38% for IBs and 100%–140% for generics. In the public sector, where medicines are free, availability was low even for medicines on the National Essential Drugs List. For a month's treatment for peptic ulcer disease and hypertension, people have to pay about a week's wages in the private sector.
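The two summary measures used above, the median price ratio against the IRP and affordability in days' wages, reduce to simple arithmetic; a sketch with illustrative figures only (not the survey data):

```python
from statistics import median

# Median price ratio (MPR): local unit prices for a medicine divided
# by its international reference price (IRP), summarized as a median.
def median_price_ratio(local_prices, irp):
    return median(p / irp for p in local_prices)

# Affordability: days of the lowest-paid worker's wages needed to buy
# a full course of treatment.
def days_of_wages(course_cost, daily_wage):
    return course_cost / daily_wage

# Illustrative figures only; currency units are arbitrary.
print(median_price_ratio([8.0, 12.0, 20.0], irp=2.0))   # 6.0
print(days_of_wages(course_cost=21.0, daily_wage=7.0))  # 3.0
```

An MPR of 6.0 would mean patients pay six times the international reference price; an affordability of 3.0 would mean a month's treatment costs three days' wages.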
Conclusions
The free market by definition does not control medicine prices, necessitating price monitoring and control mechanisms. Markups for generic products are greater than for IBs. Reducing the base price without controlling markups may increase profits for retailers and dispensing doctors without reducing the price paid by end users. To increase access and affordability, promotion of generic medicines and improved availability of medicines in the public sector are required.
Drug price and availability data were collected from West Malaysian public sector facilities, private sector retail pharmacies, and dispensing doctors. Mark-ups were higher on generic drugs than on innovator brands.
Editors' Summary
Background.
The World Health Organization has said that one-third of the people of the world cannot access the medicines they need. An important reason for this problem is that prices are often too high for people or government-funded health systems to afford. In developing countries, most people who need medicines have to pay for them out of their own pockets. Where the cost of drugs is covered by health systems, spending on medicines is a major part of the total healthcare budget. Governments use a variety of approaches to try to control the cost of drugs and make sure that essential medicines are affordable and not overpriced. According to the theory of “free market economics,” the costs of goods and services are determined by interactions between buyers and sellers and not by government intervention. However, free market economics does not work well at containing the costs of medicines, particularly new medicines, because new medicines are protected by patent law, which legally prevents others from making, using, or selling the medicine for a particular period of time. Therefore, without government intervention, there is nothing to help to push down prices.
Why Was This Study Done?
Malaysia is a middle-income country with a relatively effective public health system, but it is facing a rapid rise in drug costs. In Malaysia, medicine prices are determined by free-market economics, without any control by government. Government hospitals are expected to provide drugs free, but a substantial proportion of medicines are paid for by patients who buy them directly from private pharmacies or prescribing doctors. There is evidence that Malaysian patients have difficulties accessing the drugs they need and that cost is an important factor. Therefore, the researchers who wrote this paper wanted to examine the cost of different medicines in Malaysia, and their availability and affordability from different sources.
What Did the Researchers Do and Find?
In this research project, 48 drugs were studied, of which 28 were part of a “core list” identified by the World Health Organization as “essential drugs” on the basis of the global burden of disease. The remaining 20 reflected health care needs in Malaysia itself. The costs of each medicine were collected from government hospitals, private pharmacies, and dispensing doctors in four different regions of Malaysia. Data were collected for the “innovator brand” (made by the original patent holder) and for “generic” brands (an equivalent drug to the innovator brand, produced by a different company once the innovator brand no longer has an exclusive patent). The medicine prices were compared against international reference prices (IRP), which are the average prices offered by not-for-profit drug companies to developing countries. Finally, the researchers also compared the cost of the drugs with daily wages, in order to work out their “affordability.”
The researchers found that, irrespective of the source of medicines, prices were on average very much higher than the international reference price, ranging from 2.4 times the IRP for innovator brands accessed through public hospitals, to 16 times the IRP for innovator brands accessed through private pharmacies. The availability of medicines was also very poor, with only 25% of generic medicines available on average through the public sector. The affordability of many of the medicines studied was again very poor. For example, one month's supply of ranitidine (a drug for stomach ulcers) was equivalent to around three days' wages for a low-paid government worker, and one month's supply of fluoxetine (an antidepressant) would cost around 26 days' wages.
What Do These Findings Mean?
These results show that essential drugs are very expensive in Malaysia and are not universally available. Many people would not be able to pay for essential medicines. The cost of medicines in Malaysia seems to be much higher than in areas of India and Sri Lanka, although the researchers did not attempt to collect data in order to carry out an international comparison. It is possible that the high cost and low availability in Malaysia are the result of a lack of government regulation. Overall, the findings suggest that the government should set up mechanisms to prevent drug manufacturers from increasing prices too much and thus ensure greater access to essential medicines.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0040082.
Read a related PLoS Medicine Perspective article by Suzanne Hill
Information is available from the World Health Organization on Improving Access to Medicines
Information on medicine prices is available from Health Action International
Wikipedia has an entry on Patent (a type of intellectual property that is normally used to prevent other companies from selling a newly invented medicine). (Wikipedia is an internet encyclopedia anyone can edit.)
The Drugs for Neglected Diseases Initiative is an international collaboration between public organizations that aims to develop drugs for people suffering from neglected diseases
doi:10.1371/journal.pmed.0040082
PMCID: PMC1831730  PMID: 17388660
5.  Genomics for Disease Treatment and Prevention 
The enormous advances in genetics and genomics of the past decade have the potential to revolutionize health care, including mental health care, and bring about a system predominantly characterized by the practice of genomic and personalized medicine. We briefly review the history of genetics and genomics and present heritability estimates for major chronic diseases of aging and neuropsychiatric disorders. We then assess the extent to which the results of genetic and genomic studies are currently being leveraged clinically for disease treatment and prevention and identify priority research areas in which further work is needed. Pharmacogenomics has emerged as one area of genomics that already has had notable impacts on disease treatment and the practice of medicine. However, little evidence is available for the clinical validity and utility of predictive testing based on genomic information, and this gap has, to some extent, hindered broader-scale preventive efforts for common, complex diseases. Furthermore, although other disease areas have had greater success in identifying genetic factors responsible for various conditions, progress in identifying the genetic basis of neuropsychiatric diseases has lagged behind. We review social, economic, and policy issues relevant to genomic medicine, and find that a new model of health care based on proactive and preventive health planning and individualized treatment will require major advances in health care policy and administration. Specifically, incentives for relevant stakeholders are critical, as are the realignment of incentives and education initiatives for physicians and updates to pertinent legislation. Moreover, the translational behavioral and public health research necessary for fully integrating genomics into health care is lacking, and further work in these areas is needed.
In short, while the pace of advances in genetic and genomic science and technology has been rapid, more work is needed to fully realize the potential for impacting disease treatment and prevention generally, and mental health specifically.
doi:10.1016/j.psc.2010.11.005
PMCID: PMC3073546  PMID: 21333845
genomics; genetic testing; genetic risk assessment; public health genomics; pharmacogenomics
6.  Discovery of small molecule cancer drugs: Successes, challenges and opportunities 
Molecular Oncology  2012;6(2):155-176.
The discovery and development of small molecule cancer drugs has been revolutionised over the last decade. Most notably, we have moved from a one-size-fits-all approach that emphasized cytotoxic chemotherapy to a personalised medicine strategy that focuses on the discovery and development of molecularly targeted drugs that exploit the particular genetic addictions, dependencies and vulnerabilities of cancer cells. These exploitable characteristics are increasingly being revealed by our expanding understanding of the abnormal biology and genetics of cancer cells, accelerated by cancer genome sequencing and other high-throughput genome-wide campaigns, including functional screens using RNA interference. In this review we provide an overview of contemporary approaches to the discovery of small molecule cancer drugs, highlighting successes, current challenges and future opportunities. We focus in particular on four key steps: Target validation and selection; chemical hit and lead generation; lead optimization to identify a clinical drug candidate; and finally hypothesis-driven, biomarker-led clinical trials. Although all of these steps are critical, we view target validation and selection and the conduct of biology-directed clinical trials as especially important areas upon which to focus to speed progress from gene to drug and to reduce the unacceptably high attrition rate during clinical development. Other challenges include expanding the envelope of druggability for less tractable targets, understanding and overcoming drug resistance, and designing intelligent and effective drug combinations. We discuss not only scientific and technical challenges, but also the assessment and mitigation of risks as well as organizational, cultural and funding problems for cancer drug discovery and development, together with solutions to overcome the ‘Valley of Death’ between basic research and approved medicines. 
We envisage a future in which addressing these challenges will enhance our rapid progress towards truly personalised medicine for cancer patients.
Highlights
► Here we review small molecule cancer drug discovery and development. ► We focus on target selection, hit identification, lead optimization and clinical trials. ► A particular emphasis of this article is personalized medicine.
doi:10.1016/j.molonc.2012.02.004
PMCID: PMC3476506  PMID: 22440008
Small molecule cancer drug discovery and development; Target validation and selection; Hit identification; Lead optimization and clinical trials; Personalized medicine
7.  Paving the Way to Personalized Genomic Medicine: Steps to Successful Implementation 
Over the last decade there has been vast interest in and focus on the implementation of personalized genomic medicine. Although there is general agreement that personalized genomic medicine involves utilizing genome technology to assess individual risk and ensure the delivery of the “right treatment, for the right patient, at the right time,” different categories of stakeholders focus on different aspects of personalized genomic medicine and operationalize it in diverse ways. In order to move toward a clearer, more holistic understanding of the concept, this article begins by identifying and defining three major elements of personalized genomic medicine commonly discussed by stakeholders: molecular medicine, pharmacogenomics, and health information technology. The integration of these three elements has the potential to improve health and reduce health care costs, but it also raises many challenges. This article endeavors to address these challenges by identifying five strategic areas that will require significant investment for the successful integration of personalized genomics into clinical care: (1) health technology assessment; (2) health outcomes research; (3) education (of both health professionals and the public); (4) communication among stakeholders; and (5) the development of best practices and guidelines. While different countries and global regions display marked heterogeneity in funding of health care in the form of public, private, or blended payor systems, previous analyses of personalized genomic medicine and attendant technological innovations have been performed without due attention to this complexity. Hence, this article focuses on personalized genomic medicine in the United States as a model case study wherein a significant portion of health care payors represent private, nongovernment resources. 
Lessons learned from the present analysis of personalized genomic medicine could usefully inform health care systems in other global regions where payment for personalized genomic medicine will be enabled through private or hybrid public-private funding systems.
doi:10.2174/187569209788653998
PMCID: PMC2809376  PMID: 20098629
Personalized Genomic Medicine; Personalized Medicine; Ethics; Genomics; Policy
8.  Issue Information 
Aims and Scope: Molecular Genetics & Genomic Medicine is a peer reviewed journal for rapid dissemination of high-quality research related to the dynamically developing areas of human, molecular and medical genetics. The journal publishes original research articles covering novel findings in phenotypic, molecular, biological, and genomic aspects of genomic variation, inherited disorders and birth defects. The broad publishing spectrum of Molecular Genetics & Genomic Medicine includes rare and common disorders from diagnosis to treatment. Examples of appropriate articles include reports of novel disease genes, functional studies of genetic variants, in-depth genotype-phenotype studies, genomic analysis of inherited disorders, molecular diagnostic methods, medical bioinformatics, ethical, legal, and social implications (ELSI), and novel approaches to clinical diagnosis. We anticipate that Molecular Genetics & Genomic Medicine will provide a high quality scientific home for next generation sequencing studies of rare and common disorders, which will make novel findings in this fascinating area easily and rapidly accessible to the scientific community. This will serve as the basis for translating next generation sequencing studies into individualized diagnostics and therapeutics, for day-to-day medical care.
Molecular Genetics & Genomic Medicine publishes original research articles, reviews, and research methods papers, along with invited editorials and commentaries. Original research papers must report well-conducted research with conclusions supported by the data presented.
Molecular Genetics & Genomic Medicine is a Wiley Open Access journal, one of a series of peer reviewed titles publishing quality research with speed and efficiency. For further information visit the Wiley Open Access website.
doi:10.1002/mgg3.31
PMCID: PMC3907910  PMID: 24498632
9.  Red Blood Cell Transfusion and Mortality in Trauma Patients: Risk-Stratified Analysis of an Observational Study 
PLoS Medicine  2014;11(6):e1001664.
Using a large multicentre cohort, Pablo Perel and colleagues evaluate the association of red blood cell transfusion with mortality according to the predicted risk of death for trauma patients.
Please see later in the article for the Editors' Summary
Background
Haemorrhage is a common cause of death in trauma patients. Although transfusions are extensively used in the care of bleeding trauma patients, there is uncertainty about the balance of risks and benefits and how this balance depends on the baseline risk of death. Our objective was to evaluate the association of red blood cell (RBC) transfusion with mortality according to the predicted risk of death.
Methods and Findings
A secondary analysis of the CRASH-2 trial (which originally evaluated the effect of tranexamic acid on mortality in trauma patients) was conducted. The trial included 20,127 trauma patients with significant bleeding from 274 hospitals in 40 countries. We evaluated the association of RBC transfusion with mortality in four strata of predicted risk of death: <6%, 6%–20%, 21%–50%, and >50%. For this analysis the exposure considered was RBC transfusion, and the main outcome was death from all causes at 28 days. A total of 10,227 patients (50.8%) received at least one transfusion. We found strong evidence that the association of transfusion with all-cause mortality varied according to the predicted risk of death (p-value for interaction <0.0001). Transfusion was associated with an increase in all-cause mortality among patients with <6% and 6%–20% predicted risk of death (odds ratio [OR] 5.40, 95% CI 4.08–7.13, p<0.0001, and OR 2.31, 95% CI 1.96–2.73, p<0.0001, respectively), but with a decrease in all-cause mortality in patients with >50% predicted risk of death (OR 0.59, 95% CI 0.47–0.74, p<0.0001). Transfusion was associated with an increase in fatal and non-fatal vascular events (OR 2.58, 95% CI 2.05–3.24, p<0.0001). The risk associated with RBC transfusion was significantly increased for all the predicted risk of death categories, but the relative increase was higher for those with the lowest (<6%) predicted risk of death (p-value for interaction <0.0001). As this was an observational study, the results could have been affected by different types of confounding. In addition, we could not consider haemoglobin in our analysis. Sensitivity analyses, excluding patients who died early, conducting propensity score analysis adjusting for use of platelets, fresh frozen plasma, and cryoprecipitate, and adjusting for country, produced similar results.
Conclusions
The association of transfusion with all-cause mortality appears to vary according to the predicted risk of death. Transfusion may reduce mortality in patients at high risk of death but increase mortality in those at low risk. The effect of transfusion in low-risk patients should be further tested in a randomised trial.
Trial registration
www.ClinicalTrials.gov NCT01746953
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Trauma—a serious injury to the body caused by violence or an accident—is a major global health problem. Every year, injuries caused by traffic collisions, falls, blows, and other traumatic events kill more than 5 million people (9% of annual global deaths). Indeed, for people between the ages of 5 and 44 years, injuries are among the top three causes of death in many countries. Trauma sometimes kills people through physical damage to the brain and other internal organs, but hemorrhage (serious uncontrolled bleeding) is responsible for 30%–40% of trauma-related deaths. Consequently, early trauma care focuses on minimizing hemorrhage (for example, by using compression to stop bleeding) and on restoring blood circulation after blood loss (health-care professionals refer to this as resuscitation). Red blood cell (RBC) transfusion is often used for the management of patients with trauma who are bleeding; other resuscitation products include isotonic saline and solutions of human blood proteins.
Why Was This Study Done?
Although RBC transfusion can save the lives of patients with trauma who are bleeding, there is considerable uncertainty regarding the balance of risks and benefits associated with this procedure. RBC transfusion, which is an expensive intervention, is associated with several potential adverse effects, including allergic reactions and infections. Moreover, blood supplies are limited, and the risks from transfusion are high in low- and middle-income countries, where most trauma-related deaths occur. In this study, which is a secondary analysis of data from a trial (CRASH-2) that evaluated the effect of tranexamic acid (which stops excessive bleeding) in patients with trauma, the researchers test the hypothesis that RBC transfusion may have a beneficial effect among patients at high risk of death following trauma but a harmful effect among those at low risk of death.
What Did the Researchers Do and Find?
The CRASH-2 trial included 20,127 patients with trauma and major bleeding treated in 274 hospitals in 40 countries. In their risk-stratified analysis, the researchers investigated the effect of RBC transfusion on CRASH-2 participants with a predicted risk of death (estimated using a validated model that included clinical variables such as heart rate and blood pressure) on admission to hospital of less than 6%, 6%–20%, 21%–50%, or more than 50%. That is, the researchers compared death rates among patients in each stratum of predicted risk of death who received an RBC transfusion with death rates among patients who did not receive a transfusion. Half the patients received at least one transfusion. Transfusion was associated with an increase in all-cause mortality at 28 days after trauma among patients with a predicted risk of death of less than 6% or of 6%–20%, but with a decrease in all-cause mortality among patients with a predicted risk of death of more than 50%. In absolute figures, compared to no transfusion, RBC transfusion was associated with 5.1 more deaths per 100 patients in the patient group with the lowest predicted risk of death but with 11.9 fewer deaths per 100 patients in the group with the highest predicted risk of death.
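The stratum-by-stratum comparison described above can be illustrated with a minimal sketch of a two-by-two odds ratio and Wald confidence interval within one risk stratum. The counts below are hypothetical, not CRASH-2 data, and the study's actual analysis used adjusted models rather than this crude calculation.

```python
import math

def odds_ratio_ci(deaths_tx, total_tx, deaths_notx, total_notx, z=1.96):
    """Crude odds ratio of death for transfused vs. non-transfused patients
    within a single predicted-risk stratum, with a Wald 95% CI."""
    a = deaths_tx                      # transfused, died
    b = total_tx - deaths_tx           # transfused, survived
    c = deaths_notx                    # not transfused, died
    d = total_notx - deaths_notx       # not transfused, survived
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, (lo, hi)

# Hypothetical counts for a low-risk stratum (illustration only):
or_, (lo, hi) = odds_ratio_ci(120, 1000, 30, 1200)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An OR above 1 with a confidence interval excluding 1, as in the <6% stratum of the study, indicates higher observed mortality among transfused patients in that stratum.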
What Do These Findings Mean?
These findings show that RBC transfusion is associated with an increase in all-cause deaths among patients with trauma and major bleeding with a low predicted risk of death, but with a reduction in all-cause deaths among patients with a high predicted risk of death. In other words, these findings suggest that the effect of RBC transfusion on all-cause mortality may vary according to whether a patient with trauma has a high or low predicted risk of death. However, because the participants in the CRASH-2 trial were not randomly assigned to receive an RBC transfusion, it is not possible to conclude that receiving an RBC transfusion actually increased the death rate among patients with a low predicted risk of death. It might be that the patients with this level of predicted risk of death who received a transfusion shared other unknown characteristics (confounders) that were actually responsible for their increased death rate. Thus, to provide better guidance for clinicians caring for patients with trauma and hemorrhage, the hypothesis that RBC transfusion could be harmful among patients with trauma with a low predicted risk of death should be prospectively evaluated in a randomised controlled trial.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001664.
This study is further discussed in a PLOS Medicine Perspective by Druin Burch
The World Health Organization provides information on injuries and on violence and injury prevention (in several languages)
The US Centers for Disease Control and Prevention has information on injury and violence prevention and control
The National Trauma Institute, a US-based non-profit organization, provides information about hemorrhage after trauma and personal stories about surviving trauma
The UK National Health Service Choices website provides information about blood transfusion, including a personal story about transfusion after a serious road accident
The US National Heart, Lung, and Blood Institute also provides detailed information about blood transfusions
MedlinePlus provides links to further resources on injuries, bleeding, and blood transfusion (in English and Spanish)
More information is available about CRASH-2 (in several languages)
doi:10.1371/journal.pmed.1001664
PMCID: PMC4060995  PMID: 24937305
10.  The Paradox of Equipoise: The Principle That Drives and Limits Therapeutic Discoveries in Clinical Research 
Background
Progress in clinical medicine relies on the willingness of patients to take part in experimental clinical trials, particularly randomized controlled trials (RCTs). Before agreeing to enroll in clinical trials, patients require guarantees that they will not knowingly be harmed and will have the best possible chances of receiving the most favorable treatments. This guarantee is provided by the acknowledgment of uncertainty (equipoise), which removes ethical dilemmas and makes it easier for patients to enroll in clinical trials.
Methods
Since the design of clinical trials is mostly affected by clinical equipoise, the “clinical equipoise hypothesis” has been postulated. If the uncertainty requirement holds, this means that investigators cannot predict what they are going to discover in any individual trial that they undertake. In some instances, new treatments will be superior to standard treatments, while in others, standard treatments will be superior to experimental treatments, and in still others, no difference will be detected between new and standard treatments. It is hypothesized that there must be a relationship between the overall pattern of treatment successes and the uncertainties that RCTs are designed to address.
Results
An analysis of published trials shows that the results cannot be predicted at the level of individual trials. However, the results also indicate that the overall pattern of discovery of treatment success across a series of trials is predictable and is consistent with the clinical equipoise hypothesis. The analysis shows that we can discover no more than 25% to 50% of successful treatments when they are tested in RCTs. The analysis also indicates that this discovery rate is optimal in helping to preserve the clinical trial system; a high discovery rate (e.g., a 90% to 100% probability of success) is neither feasible nor desirable since under these circumstances, neither the patient nor the researcher has an interest in randomization. This in turn would halt the RCT system as we know it.
Conclusions
The “principle or law of clinical discovery” described herein predicts the efficiency of the current system of RCTs at generating discoveries of new treatments. The principle is derived from the requirement for uncertainty or equipoise as a precondition for RCTs, the precept that paradoxically drives discoveries of new treatments while limiting the proportion and rate of new therapeutic discoveries.
PMCID: PMC2782889  PMID: 19910921
11.  Transforming the practice of medicine using genomics 
Recent studies have demonstrated the use of genomic data, particularly gene expression signatures, as clinical prognostic factors in complex diseases. Such studies herald the future for genomic medicine and the opportunity for personalized prognosis in a variety of clinical contexts that utilize genome-scale molecular information. Several key areas represent logical and critical next steps in the use of complex genomic profiling data towards the goal of personalized medicine. First, analyses should be geared toward the development of molecular profiles that predict future events, such as major clinical events or the response, resistance, or adverse reaction to therapy. Second, these profiles must move into actual clinical practice by forming the basis for the next generation of clinical trials that will employ these methodologies to stratify patients. Finally, formidable challenges remain in the translation of genomic technologies into clinical medicine that will need to be addressed: professional and public education, health outcomes research, reimbursement, regulatory oversight, and privacy protection.
PMCID: PMC2781216  PMID: 22461094
Genomic medicine; Personalized medicine; Human genome
12.  Quantifying the Impoverishing Effects of Purchasing Medicines: A Cross-Country Comparison of the Affordability of Medicines in the Developing World 
PLoS Medicine  2010;7(8):e1000333.
Laurens Niëns and colleagues estimate the impoverishing effects of four medicines in 16 low- and middle-income countries using the impoverishment method as a metric of affordability and show that medicine purchases could impoverish large numbers of people.
Background
Increasing attention is being paid to the affordability of medicines in low- and middle-income countries (LICs and MICs) where medicines are often highly priced in relation to income levels. The impoverishing effect of medicine purchases can be estimated by determining pre- and postpayment incomes, which are then compared to a poverty line. Here we estimate the impoverishing effects of four medicines in 16 LICs and MICs using the impoverishment method as a metric of affordability.
Methods and Findings
Affordability was assessed in terms of the proportion of the population being pushed below US$1.25 or US$2 per day poverty levels because of the purchase of medicines. The prices of salbutamol 100 mcg/dose inhaler, glibenclamide 5 mg cap/tab, atenolol 50 mg cap/tab, and amoxicillin 250 mg cap/tab were obtained from facility-based surveys undertaken using a standard measurement methodology. The World Bank's World Development Indicators provided household expenditure data and information on income distributions. In the countries studied, purchasing these medicines would impoverish large portions of the population (up to 86%). Originator brand products were less affordable than the lowest-priced generic equivalents. In the Philippines, for example, originator brand atenolol would push an additional 22% of the population below US$1.25 per day, whereas for the lowest priced generic equivalent this demographic shift is 7%. Given related prevalence figures, substantial numbers of people are affected by the unaffordability of medicines.
Conclusions
Comparing medicine prices to available income in LICs and MICs shows that medicine purchases by individuals in those countries could lead to the impoverishment of large numbers of people. Action is needed to improve medicine affordability, such as promoting the use of quality assured, low-priced generics, and establishing health insurance systems.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
In recent years, the international community has prioritized access to essential medicines, which has required focusing on the accessibility, availability, quality, and affordability of life-saving medicines and the development of appropriate data and research agendas to measure these components. Determining the degree of affordability of medicines, especially in low- and middle-income countries, is a complex process as the term affordability is vague. However, the cost of medicines is a major public health issue, especially as the majority of people in developing countries do not have health insurance and medicines freely provided through the public sector are often unavailable. Therefore, although countries have a legal obligation to make essential medicines available to those who need them at an affordable cost, poor people often have to pay for the medicines that they need when they are ill. Consequently, where medicine prices are high, people may have to forego treatment or they may go into debt if they decide to buy the necessary medicines.
Why Was This Study Done?
The researchers wanted to show the impact of the cost of medicines on poorer populations by undertaking an analysis that quantified the proportion of people who would be pushed into poverty (an income level of US$1.25 or US$2 a day) because their only option is to pay out-of-pocket expenses for the life-saving medicines they need. The researchers referred to this consequence as the “impoverishing effect of a medicine.”
What Did the Researchers Do and Find?
The researchers generated “impoverishment rates” of four medicines in 16 low- and middle-income countries by comparing households' daily per capita income before and after (the hypothetical) purchase of one of the following: a salbutamol 100 mcg/dose inhaler, glibenclamide 5 mg cap/tab, atenolol 50 mg cap/tab, and amoxicillin 250 mg cap/tab. This selection of drugs covers the treatment/management of three chronic diseases and one acute illness. The cost of each medicine was taken from standardized surveys, which report median patient prices for a selection of commonly used medicines in the private sector (the availability of essential medicines in the public sector is much lower, so many people will depend on the private sector for their medicines) for both originator brand and lowest priced generic products. If the prepayment income was above the US$1.25 (or US$2) poverty line and the postpayment income fell below it, the purchase was counted as impoverishing.
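The impoverishment metric described above reduces to a simple before/after comparison against a poverty line. The sketch below illustrates the idea with a hypothetical income sample; the function name, the income figures, and the boundary convention (strictly above the line before payment, at or below it after) are illustrative assumptions, not the study's exact procedure.

```python
def impoverishment_rate(daily_incomes, medicine_cost_per_day, poverty_line=1.25):
    """Share of people whose daily per capita income (US$) sits above the
    poverty line before paying for a medicine but falls to or below it after."""
    newly_poor = [
        y for y in daily_incomes
        if y > poverty_line and y - medicine_cost_per_day <= poverty_line
    ]
    return len(newly_poor) / len(daily_incomes)

# Hypothetical income sample, not survey data (US$ per day):
incomes = [0.9, 1.3, 1.5, 2.4, 3.0, 6.0]
print(impoverishment_rate(incomes, medicine_cost_per_day=0.5))
```

In this toy sample, two of the six people are pushed below the US$1.25 line by a US$0.50-per-day medicine; the person already below the line is not counted, which is why the rate measures additional impoverishment rather than total poverty.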
According to the results of this analysis, a substantial proportion (up to 86%) of the population in the countries studied would be pushed into poverty as a result of purchasing one of the four selected medicines. Furthermore, the lowest priced generic versions of each medicine were generally substantially more affordable than originator brand products. For example, in the Philippines, purchasing originator brand atenolol would push an additional 22% of the population below US$1.25 per day compared to 7% if the lowest priced generic equivalent was bought instead. In effect, purchasing essential medicines for both chronic and acute conditions could impoverish large numbers of people, especially if originator brand products are bought.
What Do These Findings Mean?
Although the purchasing of medicines represents only part of the costs associated with the management of an illness, it is clear that the high cost of medicines has catastrophic effects on poor people. In addition, as the treatment of chronic conditions often requires a combination of medicines, the cost of treating and managing a chronic condition such as asthma, diabetes, or cardiovascular disease is likely to be even more unaffordable than what is reported in this study. Therefore, concerted action is urgently required to improve medicine affordability and prevent poor populations from being pushed further into poverty. Such action could include: governments, civil society organizations, and others making access to essential medicines more of a priority and considering this strategy as an integral part of reducing poverty; the development, implementation, and enforcement of sound national and international price policies; actively promoting the use of quality assured, low-cost generic drugs; ensuring the availability of essential medicines in the public sector at little or no charge to poor people; establishing health insurance systems with outpatient medicine benefits; and encouraging pharmaceutical companies to differentially price medicines that are still subject to patent restrictions.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000333.
For a comprehensive resource for medicine prices, availability, and affordability, see Health Action International
Guidelines about access to essential medicines and pharmaceutical policies can be found at WHO
Transparency Alliance provides more information about medicines
Access to essential medicines has become a key campaign topic; for more information see Médecins Sans Frontières (Doctors without Borders)
doi:10.1371/journal.pmed.1000333
PMCID: PMC2930876  PMID: 20824175
13.  Survival-Related Profile, Pathways, and Transcription Factors in Ovarian Cancer 
PLoS Medicine  2009;6(2):e1000024.
Background
Ovarian cancer has a poor prognosis due to advanced stage at presentation and either intrinsic or acquired resistance to classic cytotoxic drugs such as platinum and taxoids. Recent large clinical trials with different combinations and sequences of classic cytotoxic drugs indicate that further significant improvement in prognosis from this type of drug is not to be expected. Currently, a large number of drugs targeting dysregulated molecular pathways in cancer cells have been developed and are being introduced in the clinic. A major challenge is to identify those patients who will benefit from drugs targeting these specific dysregulated pathways. The aims of our study were (1) to develop a gene expression profile associated with overall survival in advanced stage serous ovarian cancer, (2) to assess the association of pathways and transcription factors with overall survival, and (3) to validate our identified profile and pathways/transcription factors in an independent set of ovarian cancers.
Methods and Findings
According to a randomized design, profiling of 157 advanced stage serous ovarian cancers was performed in duplicate using ∼35,000 70-mer oligonucleotide microarrays. A continuous predictor of overall survival was built taking into account well-known issues in microarray analysis, such as multiple testing and overfitting. A functional class scoring analysis was utilized to assess pathways/transcription factors for their association with overall survival. The prognostic value of genes that constitute our overall survival profile was validated on a fully independent, publicly available dataset of 118 well-defined primary serous ovarian cancers. Furthermore, functional class scoring analysis was also performed on this independent dataset to assess the similarities with results from our own dataset. An 86-gene overall survival profile discriminated between patients with unfavorable and favorable prognosis (median survival, 19 versus 41 mo, respectively; permutation p-value of log-rank statistic = 0.015) and maintained its independent prognostic value in multivariate analysis. Genes that composed the overall survival profile were also able to discriminate between the two risk groups in the independent dataset. In our dataset 17/167 pathways and 13/111 transcription factors were associated with overall survival, of which 16 and 12, respectively, were confirmed in the independent dataset.
Conclusions
Our study provides new clues to genes, pathways, and transcription factors that contribute to the clinical outcome of serous ovarian cancer and might be exploited in designing new treatment strategies.
Ate van der Zee and colleagues analyze the gene expression profiles of ovarian cancer samples from 157 patients, and identify an 86-gene expression profile that seems to predict overall survival.
Editors' Summary
Background.
Ovarian cancer kills more than 100,000 women every year and is one of the most frequent causes of cancer death in women in Western countries. Most ovarian cancers develop when an epithelial cell in one of the ovaries (two small organs in the pelvis that produce eggs) acquires genetic changes that allow it to grow uncontrollably and to spread around the body (metastasize). In its early stages, ovarian cancer is confined to the ovaries and can often be treated successfully by surgery alone. Unfortunately, early ovarian cancer rarely has symptoms, so a third of women with ovarian cancer have advanced disease when they first visit their doctor, with symptoms that include vague abdominal pains and mild digestive disturbances. That is, cancer cells have spread into their abdominal cavity and metastasized to other parts of the body (so-called stage III and IV disease). The outlook for women diagnosed with stage III and IV disease, who are treated with a combination of surgery and chemotherapy, is very poor. Only 30% of women with stage III disease, and 5% with stage IV, are still alive five years after their cancer is diagnosed.
Why Was This Study Done?
If the cellular pathways that determine the biological behavior of ovarian cancer could be identified, it might be possible to develop more effective treatments for women with stage III and IV disease. One way to identify these pathways is to use gene expression profiling (a technique that catalogs all the genes expressed by a cell) to compare gene expression patterns in the ovarian cancers of women who survive for different lengths of time. Genes with different expression levels in tumors with different outcomes could be targets for new treatments. For example, it might be worth developing inhibitors of proteins whose expression is greatest in tumors with short survival times. In this study, the researchers develop an expression profile that is associated with overall survival in advanced-stage serous ovarian cancer (more than half of ovarian cancers originate in serous cells, epithelial cells that secrete a watery fluid). The researchers also assess the association of various cellular pathways and transcription factors (proteins that control the expression of other proteins) with survival in this type of ovarian carcinoma.
What Did the Researchers Do and Find?
The researchers analyzed the gene expression profiles of tumor samples taken from 157 patients with advanced stage serous ovarian cancer and used the “supervised principal components” method to build a predictor of overall survival from these profiles and patient survival times. This 86-gene predictor discriminated between patients with favorable and unfavorable outcomes (average survival times of 41 and 19 months, respectively). It also discriminated between groups of patients with these two outcomes in an independent dataset collected from 118 additional serous ovarian cancers. Next, the researchers used “functional class scoring” analysis to assess the association between pathway and transcription factor expression in the tumor samples and overall survival. Seventeen of 167 KEGG pathways (“wiring” diagrams of molecular interactions, reactions and relations involved in cellular processes and human diseases listed in the Kyoto Encyclopedia of Genes and Genomes) were associated with survival, 16 of which were confirmed in the independent dataset. Finally, 13 of 111 analyzed transcription factors were associated with overall survival in the tumor samples, 12 of which were confirmed in the independent dataset.
What Do These Findings Mean?
These findings identify an 86-gene overall survival gene expression profile that seems to predict overall survival for women with advanced serous ovarian cancer. However, before this profile can be used clinically, further validation of the profile and more robust methods for determining gene expression profiles are needed. Importantly, these findings also provide new clues about the genes, pathways and transcription factors that contribute to the clinical outcome of serous ovarian cancer, clues that can now be exploited in the search for new treatment strategies. Finally, these findings suggest that it might eventually be possible to tailor therapies to the needs of individual patients by analyzing which pathways are activated in their tumors and thus improve survival times for women with advanced ovarian cancer.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000024.
This study is further discussed in a PLoS Medicine Perspective by Simon Gayther and Kate Lawrenson
See also a related PLoS Medicine Research Article by Huntsman and colleagues
The US National Cancer Institute provides a brief description of what cancer is and how it develops, and information on all aspects of ovarian cancer for patients and professionals (in English and Spanish)
The UK charity Cancerbackup provides general information about cancer, and more specific information about ovarian cancer
MedlinePlus also provides links to other information about ovarian cancer (in English and Spanish)
The KEGG Pathway database provides pathway maps of known molecular networks involved in a wide range of cellular processes
doi:10.1371/journal.pmed.1000024
PMCID: PMC2634794  PMID: 19192944
14.  Liver Dysfunction and Phosphatidylinositol-3-Kinase Signalling in Early Sepsis: Experimental Studies in Rodent Models of Peritonitis 
PLoS Medicine  2012;9(11):e1001338.
Experimental studies in a rat model of fecal peritonitis conducted by Michael Bauer and colleagues show that in this model, changes in liver function occur early in the development of sepsis, with potential implications for prognosis and development of new therapeutic approaches.
Background
Hepatic dysfunction and jaundice are traditionally viewed as late features of sepsis and portend poor outcomes. We hypothesized that changes in liver function occur early in the onset of sepsis, yet pass undetected by standard laboratory tests.
Methods and Findings
In a long-term rat model of faecal peritonitis, biotransformation and hepatobiliary transport were impaired, depending on subsequent disease severity, as early as 6 h after peritoneal contamination. Phosphatidylinositol-3-kinase (PI3K) signalling was simultaneously induced at this time point. At 15 h there was hepatocellular accumulation of bilirubin, bile acids, and xenobiotics, with disturbed bile acid conjugation and drug metabolism. Cholestasis was preceded by disruption of the bile acid and organic anion transport machinery at the canalicular pole. Inhibitors of PI3K partially prevented cytokine-induced loss of villi in cultured HepG2 cells. Notably, mice lacking the PI3Kγ gene were protected against cholestasis and impaired bile acid conjugation. This was partially confirmed by an increase in plasma bile acids (e.g., chenodeoxycholic acid [CDCA] and taurodeoxycholic acid [TDCA]) observed in 48 patients on the day severe sepsis was diagnosed; unlike bilirubin (area under the receiver-operating curve: 0.59), these bile acids predicted 28-d mortality with high sensitivity and specificity (area under the receiver-operating curve: CDCA: 0.77; TDCA: 0.72; CDCA+TDCA: 0.87).
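The predictive figures quoted above are areas under the receiver-operating curve (AUC), which equals the probability that a randomly chosen non-survivor shows a higher marker value than a randomly chosen survivor. A minimal sketch of that computation, on toy numbers rather than the study's measurements:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic.

    labels: 1 for the event (here, 28-day mortality), 0 otherwise.
    scores: the candidate marker (e.g., a plasma bile acid level).
    Ties between a case and a control count as half a win.
    """
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical marker values; an AUC near 1 means cases score higher.
labels = [1, 1, 1, 0, 0, 0, 0]
marker = [9.0, 7.5, 4.0, 5.0, 3.0, 2.0, 1.0]
auc = roc_auc(labels, marker)
```

An AUC of 0.5 corresponds to a useless marker, which is why bilirubin's 0.59 is described as a poor predictor relative to the bile acids.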
Conclusions
Liver dysfunction is an early and commonplace event in the rat model of sepsis studied here; PI3K signalling seems to play a crucial role. All aspects of hepatic biotransformation are affected, with severity relating to subsequent prognosis. Detected changes significantly precede conventional markers and are reflected by early alterations in plasma bile acids. These observations carry important implications for the diagnosis of liver dysfunction and pharmacotherapy in the critically ill. Further clinical work is necessary to extend these concepts into clinical practice.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Sepsis (blood poisoning)—a life-threatening condition caused by an inappropriate immune response to an infection—is a major global cause of death. Normally, when bacteria or other microbes enter the human body, the immune system efficiently destroys the invaders. In sepsis the immune system goes into overdrive, and the chemicals it releases into the blood to combat the infection trigger widespread inflammation (swelling). This leads to the formation of small blood clots and leaky blood vessels that block the flow of blood to vital organs such as the kidneys and liver. In the most severe cases, multiple organs fail and the patient dies. Anyone can get sepsis, but people with weakened immune systems, the very young, and the elderly are most vulnerable. Symptoms of sepsis include fever, chills, rapid breathing, a fast heart rate, and confusion. In its early stages, sepsis can be treated with antibiotics alone, but people with severe sepsis need to be admitted to an intensive care unit where the vital organs can be supported while the infection is treated.
Why Was This Study Done?
Thirty to fifty percent of people who develop severe sepsis die. If sepsis could be diagnosed in its early stages, it might be possible to save more people. Unfortunately, the symptoms of sepsis mimic those of other conditions, and, because sepsis tends to develop very quickly, it is often not diagnosed until it is too late to save the patient's life. The development of liver (hepatic) dysfunction and jaundice are both regarded as late features of sepsis (jaundice is yellowing of the skin and eyes caused by a build-up of bilirubin in the blood). However, the researchers hypothesized that changes in liver function occur early in sepsis and could, therefore, be used to improve the diagnosis and management of sepsis.
What Did the Researchers Do and Find?
The researchers induced sepsis in rats by injecting bacteria into the peritoneal cavity (the gap between the abdominal wall and the abdominal organs), separated the infected animals into predicted survivors and non-survivors based on their heart stroke volume measured using cardiac ultrasound, and then examined their liver function. The expression of genes encoding proteins involved in “biotransformation” and “hepatobiliary transport” (the processes that convert waste products and toxic chemicals into substances that can be conjugated to increase solubility and then excreted) was down-regulated within six hours of sepsis induction in the predicted non-survivors compared to the predicted survivors. Functional changes such as bilirubin and bile acid accumulation in the liver (cholestasis), poor excretion of xenobiotics (molecules not usually found in the body such as antibiotics), and disturbed bile acid conjugation were also seen in predicted non-survivors but not in survivors. Moreover, phosphatidylinositol-3-kinase (PI3K) signaling (which is involved in several immune processes) increased soon after sepsis induction in non-survivor but not in survivor animals. Notably, mice lacking the PI3Kγ gene did not develop cholestasis or show impaired bile acid conjugation after induction of sepsis. Finally, in human patients, plasma bile acids were increased in 48 patients on the day that severe sepsis was diagnosed, and these increases accurately predicted death in these patients.
What Do These Findings Mean?
These findings show that liver dysfunction is an early event in animal models of sepsis and that PI3K signalling plays a crucial role in the development of liver dysfunction. They show that all aspects of liver biotransformation are affected during sepsis and suggest that outcomes are related to the severity of these changes. The limited clinical data included in this study also support the hypothesis that changes in liver function occur early in sepsis, although these data need confirming and extending. Taken together, these findings suggest that liver function tests might aid early diagnosis of sepsis and might also provide information about likely outcomes. They also have important implications for the use of drugs in patients who are critically ill with sepsis, in that some of the drugs routinely administered to such patients may not be adequately detoxified and may, therefore, contribute to organ injury. Finally, these findings suggest that inhibition of PI3Kγ may alleviate sepsis-associated cholestasis.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001338.
This study is further discussed in a PLOS Medicine Perspective by John Marshall
The US National Institute of General Medical Sciences has a fact sheet on sepsis
The UK National Health Service Choices website has information about sepsis and about jaundice
The Surviving Sepsis Campaign, which was developed to improve the management, diagnosis, and treatment of sepsis, provides basic information about sepsis
The Sepsis Alliance, a US not-for-profit organization, also provides information about sepsis for patients and their families, including personal stories about sepsis
The not-for profit UK Sepsis Trust is another useful source of information about sepsis that includes patient stories
MedlinePlus provides links to additional resources about sepsis and jaundice (in English and Spanish)
doi:10.1371/journal.pmed.1001338
PMCID: PMC3496669  PMID: 23152722
15.  What do evidence-based secondary journals tell us about the publication of clinically important articles in primary healthcare journals? 
BMC Medicine  2004;2:33.
Background
We conducted this analysis to determine i) which journals publish high-quality, clinically relevant studies in internal medicine, general/family practice, general practice nursing, and mental health; and ii) the proportion of clinically relevant articles in each journal.
Methods
We performed an analytic survey of a hand search of 170 general medicine, general healthcare, and specialty journals for 2000. Research staff assessed individual articles by using explicit criteria for scientific merit for healthcare application. Practitioners assessed the clinical importance of these articles. Outcome measures were the number of high-quality, clinically relevant studies published in the 170 journal titles and how many of these were published in each of four discipline-specific, secondary "evidence-based" journals (ACP Journal Club for internal medicine and its subspecialties; Evidence-Based Medicine for general/family practice; Evidence-Based Nursing for general practice nursing; and Evidence-Based Mental Health for all aspects of mental health). Original studies and review articles were classified for purpose: therapy and prevention, screening and diagnosis, prognosis, etiology and harm, economics and cost, clinical prediction guides, and qualitative studies.
Results
We evaluated 60,352 articles from 170 journal titles. The pass criteria of high-quality methods and clinically relevant material were met by 3059 original articles and 1073 review articles. For ACP Journal Club (internal medicine), four titles supplied 56.5% of the articles and 27 titles supplied the other 43.5%. For Evidence-Based Medicine (general/family practice), five titles supplied 50.7% of the articles and 40 titles supplied the remaining 49.3%. For Evidence-Based Nursing (general practice nursing), seven titles supplied 51.0% of the articles and 34 additional titles supplied 49.0%. For Evidence-Based Mental Health (mental health), nine titles supplied 53.2% of the articles and 34 additional titles supplied 46.8%. For the disciplines of internal medicine, general/family practice, and mental health (but not general practice nursing), the number of clinically important articles was correlated with Science Citation Index (SCI) Impact Factors.
Conclusions
Although many clinical journals publish high-quality, clinically relevant and important original studies and systematic reviews, the articles for each discipline studied were concentrated in a small subset of journals. This subset varied according to healthcare discipline; however, many of the important articles for all disciplines in this study were published in broad-based healthcare journals rather than subspecialty or discipline-specific journals.
doi:10.1186/1741-7015-2-33
PMCID: PMC518974  PMID: 15350200
16.  Predicting Survival within the Lung Cancer Histopathological Hierarchy Using a Multi-Scale Genomic Model of Development 
PLoS Medicine  2006;3(7):e232.
Background
The histopathologic heterogeneity of lung cancer remains a significant confounding factor in its diagnosis and prognosis—spurring numerous recent efforts to find a molecular classification of the disease that has clinical relevance.
Methods and Findings
Molecular profiles of tumors from 186 patients representing four different lung cancer subtypes (and 17 normal lung tissue samples) were compared with a mouse lung development model using principal component analysis in both temporal and genomic domains. An algorithm for the classification of lung cancers using a multi-scale developmental framework was developed. Kaplan–Meier survival analysis was conducted for lung adenocarcinoma patient subgroups identified via their developmental association. We found multi-scale genomic similarities between four human lung cancer subtypes and the developing mouse lung that are prognostically meaningful. Significant association was observed between the localization of human lung cancer cases along the principal mouse lung development trajectory and the corresponding patient survival rate at three distinct levels of classical histopathologic resolution: among different lung cancer subtypes, among patients within the adenocarcinoma subtype, and within the stage I adenocarcinoma subclass. The earlier the genomic association between a human tumor profile and the mouse lung development sequence, the poorer the patient's prognosis. Furthermore, decomposing this principal lung development trajectory identified a gene set that was significantly enriched for pyrimidine metabolism and cell-adhesion functions specific to lung development and oncogenesis.
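The Kaplan–Meier survival analysis mentioned above can be sketched concisely: at each observed death time, the survival estimate is multiplied by the fraction of at-risk patients surviving that time, and censored patients simply leave the risk set. The data here are made up for illustration:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.

    times: follow-up time per patient; events: 1 = death observed,
    0 = censored. Returns (time, survival estimate) at each death time.
    """
    at_risk = len(times)
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            surv *= 1 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= times.count(t)  # everyone observed at t leaves the risk set
    return curve

# Hypothetical follow-up times (months) for a small patient subgroup:
times = [5, 8, 12, 12, 20, 30, 30, 41]
events = [1, 1, 1, 0, 1, 1, 0, 0]
curve = kaplan_meier(times, events)
```

Comparing such curves between subgroups (e.g., with a log-rank test) is how the developmental-association subgroups described above would be assessed.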
Conclusions
From a multi-scale disease modeling perspective, the molecular dynamics of murine lung development provide an effective framework that is not only data driven but also informed by the biology of development for elucidating the mechanisms of human lung cancer biology and its clinical outcome.
Editors' Summary
Background.
Lung cancer causes the most deaths from cancer worldwide—around a quarter of all cancer deaths—and the number of deaths is rising each year. There are a number of different types of the disease, whose names come from early descriptions of the cancer cells when seen under the microscope: carcinoid, small cell, and non–small cell, which make up 2%, 13%, and 86% of lung cancers, respectively. To make things more complicated, each of these cancer types can be subdivided further. It is important to distinguish the different types of cancer because they differ in their rates of growth and how they respond to treatment; for example, small cell lung cancer is the most rapidly progressing type of lung cancer. But although these current classifications of cancers are useful, researchers believe that if the underlying molecular changes in these cancers could be discovered then a more accurate way of classifying cancers, and hence predicting outcome and response to treatment, might be possible.
Why Was This Study Done?
Previous work has suggested that some cancers come from very immature cells, that is, cells that are present in the early stages of an animal's development from an embryo in the womb to an adult animal. Many animals have been closely studied so as to understand how they develop; the best studied model that is also relevant to human disease is the mouse, and researchers have previously studied lung development in mice in detail. This group of researchers wanted to see if there was any relation between the activity (known as expression) of mouse genes during the development of the lung and the expression of genes in human lung cancers, particularly whether they could use gene expression to try to predict the outcome of lung cancer in patients.
What Did the Researchers Do and Find?
They compared the gene expression in lung cancer samples from 186 patients with four different types of lung cancer (and in 17 normal lung tissue samples) to the gene expression found in normal mice during development. They found similarities between expression patterns in the lung cancer subtypes and the developing mouse lung, and that these similarities explain some of the different outcomes for the patients. In general, they found that when the gene expression in the human cancer was similar to that of very immature mouse lung cells, patients had a poor prognosis. When the gene expression in the human cancer was more similar to mature mouse lung cells, the prognosis was better. However, the researchers found that carcinoid tumors had rather different expression profiles compared to the other tumors.
  The researchers were also able to discover some specific gene types that seemed to have particularly strong associations between mouse development and the human cancers. Two of these gene types were ones that are involved in building and breaking down DNA itself, and ones involved in how cells stick together. This latter group of genes is thought to be involved in how cancers spread.
What Do These Findings Mean?
These results provide a new way of thinking about how to classify lung cancers, and also point to a few groups of genes that may be particularly important in the development of the tumor. However, before these results are used in any clinical assessment, further work will need to be done to work out whether they are true for other groups of patients.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0030232.
•  MedlinePlus has information from the United States National Library of Medicine and other government agencies and health-related organizations
•  The National Institute on Aging is also a good place to start looking for information
•  The National Cancer Institute and Lung Cancer Online have a wide range of information on lung cancer
Comparison of gene expression patterns in patients with lung cancer and in mouse lung development showed that those tumors associated with earlier mouse lung development had a poorer prognosis.
doi:10.1371/journal.pmed.0030232
PMCID: PMC1483910  PMID: 16800721
17.  Something going on in Milan: a review of the 4th International PhD Student Cancer Conference 
ecancermedicalscience  2010;4:198.
The 4th International PhD Student Cancer Conference was held at the IFOM-IEO-Campus in Milan from 19–21 May 2010 http://www.semm.it/events_researchPast.php
The Conference covered many topics related to cancer, from basic biology to clinical aspects of the disease. All attendees presented their research, by either giving a talk or presenting a poster. This conference is an opportunity to introduce PhD students to top cancer research institutes across Europe.
The core participating institutes included:
•  European School of Molecular Medicine (SEMM), IFOM-IEO Campus, Milan
•  Beatson Institute for Cancer Research (BICR), Glasgow
•  Cambridge Research Institute (CRI), Cambridge, UK
•  MRC Gray Institute of Radiation Biology (GIROB), Oxford
•  London Research Institute (LRI), London
•  Paterson Institute for Cancer Research (PICR), Manchester
•  The Netherlands Cancer Institute (NKI), Amsterdam
‘You organizers have crushed all my prejudices towards Italians. Congratulations, I enjoyed the conference immensely!’ Although it might have sounded rude, this was surely meant as a genuine compliment (at least, that’s how we took it), especially as it came from someone who was himself the fusion of two usually antithetical concepts: fashion style and English nationality.
The year 2010 has marked an important event for Italian research in the international scientific panorama: the European School of Molecular Medicine (SEMM) had the honour to host the 4th International PhD Student Cancer Conference, which was held from 19–21 May 2010 at the IFOM-IEO-Campus (http://www.semm.it/events_researchPast.php) in Milan.
The conference was attended by more than one hundred students from a selection of cutting-edge European institutes devoted to cancer research. The rationale behind it is to promote cooperation among young scientists across Europe, giving them a forum to debate science and to exchange ideas and experiences. But that is not all: it is also designed to put PhD students in touch with other prestigious research centres and to create connections for future postdoc positions or jobs. And last but not least, it is a golden chance for penniless PhD students to spend a couple of extra days visiting a foreign country (a motivation that will, of course, never be voiced to supervisors).
The network of participating institutes has a three-nation core, made up of the Netherlands Cancer Institute, the Italian European School of Molecular Medicine (SEMM) and five UK cancer research institutes (The London Research Institute, The Cambridge Research Institute, The Beatson Institute for Cancer Research in Glasgow, The Paterson Institute for Cancer Research in Manchester and the MRC Gray Institute for Radiation Oncology and Biology in Oxford).
The conference is hosted and organised every year by one of the core institutes: the first was held in Cambridge in 2007, followed by Amsterdam in 2008 and London in 2009; this year it was Milan's turn.
In addition to the core institutes, PhD students from several other high-profile institutes are invited to attend the conference. This year, participants applied from the Spanish National Cancer Centre (CNIO, Madrid), the German Cancer Research Centre (DKFZ, Heidelberg), the European Molecular Biology Laboratory (EMBL, Heidelberg) and the San Raffaele Institute (HSR, Milan). Moreover, four ‘special guests’ from the National Centre for Biological Sciences of Bangalore (India) attended the conference in Milan. This represents a first step in widening the horizons beyond Europe towards a worldwide network of talented PhD students in the life sciences.
The conference spanned two and a half days (Wednesday 19th to Friday 21st May) and touched on a broad spectrum of topics: from basic biology to development, from cancer therapies to modelling and top-down, new-generation global approaches. The final selection of presentations was a tough task for us organisers (Chiara Segré, Federica Castellucci, Francesca Milanesi, Gianluca Varetti and Gian Maria Sarra Ferraris), owing to the high scientific level of the abstracts submitted. In the end, 26 top students were chosen to give a 15-min oral presentation in one of eight sessions: Development & Differentiation, Cell Migration, Immunology & Cancer, Modelling & Large Scale approaches, Genome Instability, Signal Transduction, Cancer Genetics & Drug Resistance, Stem Cells in Biology and Cancer.
The scientific programme was further enriched by two scientific special sessions, held by Professor Pier Paolo di Fiore and Dr Giuseppe Testa, Principal Investigators at the IFOM-IEO-Campus and by a bioethical round table on human embryonic stem cell research moderated by Silvia Camporesi, a senior PhD student in the SEMM PhD Programme ‘Foundation of Life Science and their Bioethical Consequences’.
On top of everything, we had the pleasure of inviting, as keynote speakers, two leading European scientists in the fields of cancer invasion and biology of stem cells, respectively: Dr Peter Friedl from The Nijmegen Centre for Molecular Life (The Netherlands) and Professor Andreas Trumpp from The Heidelberg Institute for Stem Cell Technology and Experimental Medicine (Heidelberg).
All the student talks distinguished themselves by the impressive quality of their science, encouraging evidence of the high level of research carried out in Europe. It would be beyond the purpose of this report to summarise all 26 talks, which touched on many different and specific topics. For further information, the conference abstract book with all the scientific content is available on the conference Web site (http://www.semm.it/events_researchPast.php). In what follows, the special sessions and the keynote lectures are discussed in detail.
doi:10.3332/ecancer.2010.198
PMCID: PMC3234021  PMID: 22276043
18.  Metabolic network reconstruction of Chlamydomonas offers insight into light-driven algal metabolism 
A comprehensive genome-scale metabolic network of Chlamydomonas reinhardtii, including a detailed account of light-driven metabolism, is reconstructed and validated. The model provides a new resource for research of C. reinhardtii metabolism and in algal biotechnology.
•  The genome-scale metabolic network of Chlamydomonas reinhardtii (iRC1080) was reconstructed, accounting for >32% of the estimated metabolic genes encoded in the genome and including extensive details of lipid metabolic pathways.
•  This is the first metabolic network to explicitly account for the stoichiometry and wavelengths of metabolic photon usage, providing a new resource for research on C. reinhardtii metabolism and for developments in algal biotechnology.
•  Metabolic functional annotation and the largest transcript verification of a metabolic network to date were performed, at least partially verifying >90% of the transcripts accounted for in iRC1080. Analysis of the network supports hypotheses concerning the evolution of latent lipid pathways in C. reinhardtii, including very long-chain polyunsaturated fatty acid and ceramide synthesis pathways.
•  A novel approach for modeling light-driven metabolism was developed that accounts for both light source intensity and the spectral quality of emitted light. The constructs resulting from this approach, termed prism reactions, were shown to significantly improve the accuracy of model predictions, and their use was demonstrated for the evaluation of light source efficiency and design.
Algae have garnered significant interest in recent years, especially for their potential application in biofuel production. The model eukaryotic microalga Chlamydomonas reinhardtii has been widely used to study photosynthesis, cell motility and phototaxis, cell wall biogenesis, and other fundamental cellular processes (Harris, 2001). Characterizing algal metabolism is key to engineering production strains and understanding photobiological phenomena. Based on extensive literature on C. reinhardtii metabolism, its genome sequence (Merchant et al, 2007), and gene functional annotation, we have reconstructed and experimentally validated the genome-scale metabolic network for this alga, iRC1080, the first network to account for detailed photon absorption permitting growth simulations under different light sources. iRC1080 accounts for 1080 genes, associated with 2190 reactions and 1068 unique metabolites and encompasses 83 subsystems distributed across 10 cellular compartments (Figure 1A). Its >32% coverage of estimated metabolic genes is a tremendous expansion over previous algal reconstructions (Boyle and Morgan, 2009; Manichaikul et al, 2009). The lipid metabolic pathways of iRC1080 are considerably expanded relative to existing networks, and chemical properties of all metabolites in these pathways are accounted for explicitly, providing sufficient detail to completely specify all individual molecular species: backbone molecule and stereochemical numbering of acyl-chain positions; acyl-chain length; and number, position, and cis–trans stereoisomerism of carbon–carbon double bonds. Such detail in lipid metabolism will be critical for model-driven metabolic engineering efforts.
We experimentally verified transcripts accounted for in the network under permissive growth conditions, detecting >90% of tested transcript models (Figure 1B) and providing validating evidence for the contents of iRC1080. We also analyzed the extent of transcript verification by specific metabolic subsystems. Some subsystems stood out as more poorly verified, including chloroplast and mitochondrial transport systems and sphingolipid metabolism, all of which exhibited <80% of transcripts detected, reflecting incomplete characterization of compartmental transporters and supporting a hypothesis of latent pathway evolution for ceramide synthesis in C. reinhardtii. Additional lines of evidence from the reconstruction effort similarly support this hypothesis including lack of ceramide synthetase and other annotation gaps downstream in sphingolipid metabolism. A similar hypothesis of latent pathway evolution was established for very long-chain fatty acids (VLCFAs) and their polyunsaturated analogs (VLCPUFAs) (Figure 1C), owing to the absence of this class of lipids in previous experimental measurements, lack of a candidate VLCFA elongase in the functional annotation, and additional downstream annotation gaps in arachidonic acid metabolism.
The network provides a detailed account of metabolic photon absorption by light-driven reactions, including photosystems I and II, light-dependent protochlorophyllide oxidoreductase, provitamin D3 photoconversion to vitamin D3, and rhodopsin photoisomerase; this network accounting permits the precise modeling of light-dependent metabolism. iRC1080 accounts for effective light spectral ranges through analysis of biochemical activity spectra (Figure 3A), either reaction activity or absorbance at varying light wavelengths. Defining effective spectral ranges associated with each photon-utilizing reaction enabled our network to model growth under different light sources via stoichiometric representation of the spectral composition of emitted light, termed prism reactions. Coefficients for different photon wavelengths in a prism reaction correspond to the ratios of photon flux in the defined effective spectral ranges to the total emitted photon flux from a given light source (Figure 3B). This approach distinguishes the amount of emitted photons that drive different metabolic reactions. We created prism reactions for most light sources that have been used in published studies for algal and plant growth including solar light, various light bulbs, and LEDs. We also included regulatory effects, resulting from lighting conditions insofar as published studies enabled. Light and dark conditions have been shown to affect metabolic enzyme activity in C. reinhardtii on multiple levels: transcriptional regulation, chloroplast RNA degradation, translational regulation, and thioredoxin-mediated enzyme regulation. Through application of our light model and prism reactions, we were able to closely recapitulate experimental growth measurements under solar, incandescent, and red LED lights. Through unbiased sampling, we were able to establish the tremendous statistical significance of the accuracy of growth predictions achievable through implementation of prism reactions. 
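The prism-reaction idea described above reduces to simple arithmetic: each coefficient is the fraction of a light source's total emitted photon flux that falls inside one photon-utilizing reaction's effective spectral range. A sketch with an invented red-LED-like spectrum (the paper's actual spectra and effective ranges are not reproduced here):

```python
def prism_coefficients(spectrum, ranges):
    """Stoichiometric coefficients of a 'prism reaction'.

    spectrum: {wavelength_nm: relative photon flux} of a light source.
    ranges: {reaction: (lo_nm, hi_nm)} effective spectral ranges.
    Ranges may overlap, so coefficients need not sum to 1.
    """
    total = sum(spectrum.values())
    return {
        name: sum(flux for wl, flux in spectrum.items() if lo <= wl <= hi) / total
        for name, (lo, hi) in ranges.items()
    }

# Invented spectrum sampled every 10 nm, and two illustrative ranges:
spectrum = {650: 1.0, 660: 3.0, 670: 5.0, 680: 4.0, 690: 2.0}
ranges = {"photosystem_II": (400, 680), "photosystem_I": (680, 700)}
coeffs = prism_coefficients(spectrum, ranges)
```

A source whose flux concentrates in the ranges that drive growth-limiting reactions yields larger coefficients there, which is how the model can rank light sources such as red LEDs against sunlight.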
Finally, application of the photosynthetic model was demonstrated prospectively to evaluate light utilization efficiency under different light sources. The results suggest that, of the existing light sources, red LEDs provide the greatest efficiency, about three times as efficient as sunlight. Extending this analysis, the model was applied to design a maximally efficient LED spectrum for algal growth. The result was a 677-nm peak LED spectrum with a total incident photon flux of 360 μE/m2/s, suggesting that for the simple objective of maximizing growth efficiency, LED technology has already reached an effective theoretical optimum.
In summary, the C. reinhardtii metabolic network iRC1080 that we have reconstructed offers insight into the basic biology of this species and may be employed prospectively for genetic engineering design and light source design relevant to algal biotechnology. iRC1080 was used to analyze lipid metabolism and generate novel hypotheses about the evolution of latent pathways. The predictive capacity of metabolic models developed from iRC1080 was demonstrated in simulating mutant phenotypes and in evaluation of light source efficiency. Our network provides a broad knowledgebase of the biochemistry and genomics underlying global metabolism of a photoautotroph, and our modeling approach for light-driven metabolism exemplifies how integration of largely unvisited data types, such as physicochemical environmental parameters, can expand the diversity of applications of metabolic networks.
Metabolic network reconstruction encompasses existing knowledge about an organism's metabolism and genome annotation, providing a platform for omics data analysis and phenotype prediction. The model alga Chlamydomonas reinhardtii is employed to study diverse biological processes from photosynthesis to phototaxis. Recent heightened interest in this species results from an international movement to develop algal biofuels. Integrating biological and optical data, we reconstructed a genome-scale metabolic network for this alga and devised a novel light-modeling approach that enables quantitative growth prediction for a given light source, resolving wavelength and photon flux. We experimentally verified transcripts accounted for in the network and physiologically validated model function through simulation and generation of new experimental growth data, providing high confidence in network contents and predictive applications. The network offers insight into algal metabolism and potential for genetic engineering and efficient light source design, making it a pioneering resource for studying light-driven metabolism and quantitative systems biology.
doi:10.1038/msb.2011.52
PMCID: PMC3202792  PMID: 21811229
Chlamydomonas reinhardtii; lipid metabolism; metabolic engineering; photobioreactor
19.  Implementation of the CDC translational informatics platform - from genetic variants to the national Swedish Rheumatology Quality Register 
Background
Sequencing of the human genome and the subsequent analyses have produced immense volumes of data. Technological advances have opened new windows into genomics beyond the DNA sequence. In parallel, clinical practice generates large amounts of data, an underused source with much greater potential in translational research than is currently realized. This research aims to implement a translational medicine informatics platform that integrates clinical data (disease diagnosis, disease activity, and treatment) of Rheumatoid Arthritis (RA) patients from Karolinska University Hospital with their research database (biobanks, genotype variants, and serology) at the Center for Molecular Medicine, Karolinska Institutet.
Methods
Requirements engineering methods were used to identify user requirements. Unified Modeling Language and data modeling methods were used to model the universe of discourse and data sources. Oracle 11g was used as the database management system, and the clinical development center (CDC) was used as the application interface. Patient data were anonymized, and we employed authorization and security methods to protect the system.
Results
We developed a user requirement matrix, which provided a framework for evaluating three translational informatics systems. The implementation of the CDC successfully integrated the biological research database (15,172 DNA, serum, and synovial samples; 1,436 cell samples; and 65 SNPs per patient) with the clinical database (5,652 clinical visits) for a cohort of 379 patients, presented in three profiles. Basic functionalities provided by the translational medicine platform are research data management, development of bioinformatics workflows and analyses, sub-cohort selection, and re-use of clinical data in research settings. Finally, the system allowed researchers to extract subsets of attributes from cohorts according to specific biological, clinical, or statistical features.
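The sub-cohort selection functionality described above can be sketched as a filter over an integrated patient table. This is purely a hypothetical illustration: the field names, thresholds, and records below are invented and are not part of the CDC platform's schema.

```python
# Hypothetical sketch of sub-cohort selection over integrated clinical and
# biological data. All field names and values are invented for illustration.

patients = [
    {"id": 1, "das28": 5.6, "anti_ccp": True,  "visits": 12},
    {"id": 2, "das28": 2.1, "anti_ccp": False, "visits": 4},
    {"id": 3, "das28": 4.8, "anti_ccp": True,  "visits": 9},
]

def select_subcohort(records, min_das28, seropositive):
    """Return patients meeting both a clinical criterion (disease activity)
    and a serological criterion (antibody status)."""
    return [p for p in records
            if p["das28"] >= min_das28 and p["anti_ccp"] == seropositive]

high_activity_seropositive = select_subcohort(patients, 4.0, True)
```

In the real platform such selections would run against the Oracle database rather than in-memory records, but the principle of combining clinical and biological attributes in one query is the same.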
Conclusions
Research and clinical database integration is a real challenge and a roadblock in translational research. Through this research we addressed these challenges and demonstrated the usefulness of the CDC. We adhered to ethical regulations pertaining to patient data, and we determined that existing software solutions could not meet the translational research needs at hand. We used RA as a test case because we have ample data on an active, longitudinal cohort.
doi:10.1186/1479-5876-11-85
PMCID: PMC3623742  PMID: 23548156
Swedish Rheumatology Quality Register (SRQ); Translational medicine platform; Secondary use of clinical data; Patient de-identification
20.  A Systematic Review of Studies That Aim to Determine Which Outcomes to Measure in Clinical Trials in Children  
PLoS Medicine  2008;5(4):e96.
Background
In clinical trials the selection of appropriate outcomes is crucial to the assessment of whether one intervention is better than another. Selection of inappropriate outcomes can compromise the utility of a trial. However, the process of selecting the most suitable outcomes to include can be complex. Our aim was to systematically review studies that address the process of selecting outcomes or outcome domains to measure in clinical trials in children.
Methods and Findings
We searched Cochrane databases (no date restrictions) in December 2006; and MEDLINE (1950 to 2006), CINAHL (1982 to 2006), and SCOPUS (1966 to 2006) in January 2007 for studies of the selection of outcomes for use in clinical trials in children. We also asked a group of experts in paediatric clinical research to refer us to any other relevant studies. From these articles we extracted data on the clinical condition of interest, description of the method used to select outcomes, the people involved in the selection process, the outcomes selected, and limitations of the method as defined by the authors. The literature search identified 8,889 potentially relevant abstracts. Of these, 70 were retrieved, and 25 were included in the review. These studies described the work of 13 collaborations representing various paediatric specialties including critical care, gastroenterology, haematology, psychiatry, neurology, respiratory paediatrics, rheumatology, neonatal medicine, and dentistry. Two groups utilised the Delphi technique, one used the nominal group technique, and one used both methods to reach a consensus about which outcomes should be measured in clinical trials. Other groups used semistructured discussion, and one group used a questionnaire-based survey. The collaborations involved clinical experts, research experts, and industry representatives. Three groups involved parents of children affected by the particular condition.
Conclusions
Very few studies address the appropriate choice of outcomes for clinical research with children, and in most paediatric specialties no research has been undertaken. Among the studies we did assess, very few involved parents or children in selecting outcomes that should be measured, and none directly involved children. Research should be undertaken to identify the best way to involve parents and children in assessing which outcomes should be measured in clinical trials.
Ian Sinha and colleagues show, in a systematic review of published studies, that there are very few studies that address the appropriate choice of outcomes for clinical research with children.
Editors' Summary
Background.
When adult patients are given a drug for a disease by their doctors, they can be sure that its benefits and harms will have been carefully studied in clinical trials. Clinical researchers will have asked how well the drug does when compared to other drugs by giving groups of patients the various treatments and determining several “outcomes.” These are measurements carefully chosen in advance by clinical experts that ensure that trials provide as much information as possible about how effectively a drug deals with a specific disease and whether it has any other effects on patients' health and daily life. The situation is very different, however, for pediatric (child) patients. About three-quarters of the drugs given to children are “off-label”—they have not been specifically tested in children. The assumption used to be that children are just small people who can safely take drugs tested in adults provided the dose is scaled down. However, it is now known that children's bodies handle many drugs differently from adult bodies and that a safe dose for an adult can sometimes kill a child even after scaling down for body size. Consequently, regulatory bodies in the US, Europe, and elsewhere now require clinical trials to be done in children and drugs for pediatric use to be specifically licensed.
Why Was This Study Done?
Because children are not small adults, the methodology used to design trials involving children needs to be adapted from that used to design trials in adult patients. In particular, the process of selecting the outcomes to include in pediatric trials needs to take into account the differences between adults and children. For example, because children's brains are still developing, it may be important to include outcome measures that will detect any effect that drugs have on intellectual development. In this study, therefore, the researchers undertook a systematic review of the medical literature to discover how much is known about the best way to select outcomes in clinical trials in children.
What Did the Researchers Do and Find?
The researchers used a predefined search strategy to identify all the studies published since 1950 that examined the selection of outcomes in clinical trials in children. They also asked experts in pediatric clinical research for details of relevant studies. Only 25 studies, which covered several pediatric specialties and were published by 13 collaborative groups, met the strict eligibility criteria laid down by the researchers for their systematic review. Several approaches previously used to choose outcomes in clinical trials in adults were used in these studies to select outcomes. Two groups used the “Delphi” technique, in which opinions are sought from individuals, collated, and fed back to the individuals to generate discussion and a final, consensus agreement. One group used the “nominal group technique,” which involves the use of structured face-to-face discussions to develop a solution to a problem followed by a vote. Another group used both methods. The remaining groups (except one that used a questionnaire) used semistructured discussion meetings or workshops to decide on outcomes. Although most of the groups included clinical experts, people doing research on the specific clinical condition under investigation, and industry representatives, only three groups asked parents about which outcomes should be included in the trials, and none asked children directly.
What Do These Findings Mean?
These findings indicate that very few studies have addressed the selection of appropriate outcomes for clinical research in children. Indeed, in many pediatric specialties no research has been done on this important topic. Importantly, some of the studies included in this systematic review clearly show that it is inappropriate to use the outcomes used in adult clinical trials in pediatric populations. Overall, although the studies identified in this review provide some useful information on the selection of outcomes in clinical trials in children, further research is urgently needed to ensure that this process is made easier and more uniform. In particular, much more research must be done to determine the best way to involve children and their parents in the selection of outcomes.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050096.
A related PLoS Medicine Perspective article is available
The European Medicines Agency provides information about the regulation of medicines for children in Europe
The US Food and Drug Administration Office of Pediatric Therapeutics provides similar information for the US
The UK Medicines and Healthcare products Regulatory Agency also provides information on why medicines need to be tested in children
The UK Medicines for Children Research Network aims to facilitate the conduct of clinical trials of medicines for children
The James Lind Alliance has been established in the UK to increase patient involvement in medical research issues such as outcome selection in clinical trials
doi:10.1371/journal.pmed.0050096
PMCID: PMC2346505  PMID: 18447577
21.  Role of DNA Methylation and Epigenetic Silencing of HAND2 in Endometrial Cancer Development 
PLoS Medicine  2013;10(11):e1001551.
Please see later in the article for the Editors' Summary
Background
Endometrial cancer incidence is continuing to rise in the wake of the current ageing and obesity epidemics. Much of the risk for endometrial cancer development is influenced by the environment and lifestyle. Accumulating evidence suggests that the epigenome serves as the interface between the genome and the environment and that hypermethylation of stem cell polycomb group target genes is an epigenetic hallmark of cancer. The objective of this study was to determine the functional role of epigenetic factors in endometrial cancer development.
Methods and Findings
Epigenome-wide methylation analysis of >27,000 CpG sites in endometrial cancer tissue samples (n = 64) and control samples (n = 23) revealed that HAND2 (a gene encoding a transcription factor expressed in the endometrial stroma) is one of the most commonly hypermethylated and silenced genes in endometrial cancer. A novel integrative epigenome-transcriptome-interactome analysis further revealed that HAND2 is the hub of the most highly ranked differential methylation hotspot in endometrial cancer. These findings were validated using candidate gene methylation analysis in multiple clinical sample sets of tissue samples from a total of 272 additional women. Increased HAND2 methylation was a feature of premalignant endometrial lesions and was seen to parallel a decrease in RNA and protein levels. Furthermore, women with high endometrial HAND2 methylation in their premalignant lesions were less likely to respond to progesterone treatment. HAND2 methylation analysis of endometrial secretions collected using high vaginal swabs taken from women with postmenopausal bleeding specifically identified those patients with early stage endometrial cancer with both high sensitivity and high specificity (receiver operating characteristics area under the curve = 0.91 for stage 1A and 0.97 for higher than stage 1A). Finally, mice harbouring a Hand2 knock-out specifically in their endometrium were shown to develop precancerous endometrial lesions with increasing age, and these lesions also demonstrated a lack of PTEN expression.
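As a hedged illustration of the reported classifier performance (receiver operating characteristic area under the curve of 0.91 for stage 1A disease), the sketch below computes an AUC from invented methylation scores via the Mann-Whitney rank relationship; it is not the study's analysis or data.

```python
# Illustrative only: ROC AUC computed as the probability that a randomly
# chosen positive (cancer) sample scores higher than a randomly chosen
# negative (control) sample, counting ties as 0.5.

def auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical HAND2 methylation scores for cancer and control samples
pos = [0.9, 0.8, 0.75, 0.6]
neg = [0.5, 0.4, 0.65, 0.3]
result = auc(pos, neg)
```

With these invented scores, 15 of the 16 positive-negative pairs are correctly ordered, giving an AUC of 0.9375.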
Conclusions
HAND2 methylation is a common and crucial molecular alteration in endometrial cancer that could potentially be employed as a biomarker for early detection of endometrial cancer and as a predictor of treatment response. The true clinical utility of HAND2 DNA methylation, however, requires further validation in prospective studies.
Editors' Summary
Background
Cancer, which is responsible for 13% of global deaths, can develop anywhere in the body, but all cancers are characterized by uncontrolled cell growth and reduced cellular differentiation (the process by which unspecialized cells such as “stem” cells become specialized during development, tissue repair, and normal cell turnover). Genetic alterations—changes in the sequence of nucleotides (DNA's building blocks) in specific genes—are required for this cellular transformation and subsequent cancer development (carcinogenesis). However, recent evidence suggests that epigenetic modifications—reversible, heritable changes in gene function that occur in the absence of nucleotide sequence changes—may also be involved in carcinogenesis. For example, the addition of methyl groups to a set of genes called stem cell polycomb group target genes (PCGTs; polycomb genes control the expression of their target genes by modifying their DNA or associated proteins) is one of the earliest molecular changes in human cancer development, and increasing evidence suggests that hypermethylation of PCGTs is an epigenetic hallmark of cancer.
Why Was This Study Done?
The methylation of PCGTs, which is triggered by age and by environmental factors that are associated with cancer development, reduces cellular differentiation and leads to the accumulation of undifferentiated cells that are susceptible to cancer development. It is unclear, however, whether epigenetic modifications have a causal role in carcinogenesis. Here, the researchers investigate the involvement of epigenetic factors in the development of endometrial (womb) cancer. The risk of endometrial cancer (which affects nearly 50,000 women annually in the United States) is largely determined by environmental and lifestyle factors. Specifically, the risk of this cancer is increased in women in whom estrogen (a hormone that drives cell proliferation in the endometrium) is functionally dominant over progesterone (a hormone that inhibits endometrial proliferation and causes cell differentiation); obese women and women who have taken estrogen-only hormone replacement therapies fall into this category. Thus, endometrial cancer is an ideal model in which to study whether epigenetic mechanisms underlie carcinogenesis.
What Did the Researchers Do and Find?
The researchers collected data on genome-wide DNA methylation at cytosine- and guanine-rich sites in endometrial cancers and normal endometrium and integrated this information with the human interactome and transcriptome (all the physical interactions between proteins and all the genes expressed, respectively, in a cell) using an algorithm called Functional Epigenetic Modules (FEM). This analysis identified HAND2 as the hub of the most highly ranked differential methylation hotspot in endometrial cancer. HAND2 is a progesterone-regulated stem cell PCGT. It encodes a transcription factor that is expressed in the endometrial stroma (the connective tissue that lies below the epithelial cells in which most endometrial cancers develop) and that suppresses the production of the growth factors that mediate the growth-inducing effects of estrogen on the endometrial epithelium. The researchers hypothesized, therefore, that epigenetic deregulation of HAND2 could be a key step in endometrial cancer development. In support of this hypothesis, the researchers report that HAND2 methylation was increased in premalignant endometrial lesions (cancer-prone, abnormal-looking tissue) compared to normal endometrium, and was associated with suppression of HAND2 expression. Moreover, a high level of endometrial HAND2 methylation in premalignant lesions predicted a poor response to progesterone treatment (which stops the growth of some endometrial cancers), and analysis of HAND2 methylation in endometrial secretions collected from women with postmenopausal bleeding (a symptom of endometrial cancer) accurately identified individuals with early stage endometrial cancer. Finally, mice in which the Hand2 gene was specifically deleted in the endometrium developed precancerous endometrial lesions with age.
What Do These Findings Mean?
These and other findings identify HAND2 methylation as a common, key molecular alteration in endometrial cancer. These findings need to be confirmed in more women, and studies are needed to determine the immediate molecular and cellular consequences of HAND2 silencing in endometrial stromal cells. Nevertheless, these results suggest that HAND2 methylation could potentially be used as a biomarker for the early detection of endometrial cancer and for predicting treatment response. More generally, these findings support the idea that methylation of HAND2 (and, by extension, the methylation of other PCGTs) is not a passive epigenetic feature of cancer but is functionally involved in cancer development, and provide a framework for identifying other genes that are epigenetically regulated and functionally important in carcinogenesis.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001551
The US National Cancer Institute provides information on all aspects of cancer and has detailed information about endometrial cancer for patients and professionals (in English and Spanish)
The not-for-profit organization American Cancer Society provides information on cancer and how it develops and specific information on endometrial cancer (in several languages)
The UK National Health Service Choices website includes an introduction to cancer, a page on endometrial cancer, and a personal story about endometrial cancer
The not-for-profit organization Cancer Research UK provides general information about cancer and specific information about endometrial cancer
Wikipedia has a page on cancer epigenetics (note: Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
The Eve Appeal charity that supported this research provides useful information on gynecological cancers
doi:10.1371/journal.pmed.1001551
PMCID: PMC3825654  PMID: 24265601
22.  GliomaPredict: a clinically useful tool for assigning glioma patients to specific molecular subtypes 
Background
Advances in generating genome-wide gene expression data have accelerated the development of molecular-based tumor classification systems. Tools that allow the translation of such molecular classification schemas from research into clinical applications are still missing in the emerging era of personalized medicine.
Results
We developed GliomaPredict, a computational tool that allows fast and reliable classification of glioma patients into one of six previously published stratified subtypes, based on sets of extensively validated classifiers derived from hundreds of glioma transcriptomic profiles. Our tool uses a principal component analysis (PCA)-based approach to generate a visual representation of the analyses, quantifies the confidence of the underlying subtype assignment, and presents results as a printable PDF file. The GliomaPredict tool is implemented as a plug-in application for the widely used GenePattern framework.
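A minimal sketch of the PCA-plus-nearest-centroid idea described above, assuming random placeholder data and invented dimensions. This is not GliomaPredict's code; the tool's actual classifiers and confidence quantification are not reproduced here.

```python
# Sketch: project expression profiles into the principal component space of
# a reference cohort, then assign a new sample to the nearest subtype
# centroid. All data below are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.normal(size=(60, 100))   # 60 reference profiles x 100 genes
labels = np.repeat(np.arange(6), 10)     # six known subtypes, 10 samples each

# PCA via SVD of the mean-centred reference matrix
mean = reference.mean(axis=0)
centred = reference - mean
_, _, vt = np.linalg.svd(centred, full_matrices=False)
components = vt[:3]                      # keep 3 PCs for visualisation

proj = centred @ components.T
centroids = np.array([proj[labels == k].mean(axis=0) for k in range(6)])

def assign_subtype(sample):
    """Project a new expression profile and return the nearest centroid."""
    p = (sample - mean) @ components.T
    dists = np.linalg.norm(centroids - p, axis=1)
    return int(dists.argmin())
```

A confidence measure could then be derived from the margin between the nearest and second-nearest centroid distances, though GliomaPredict's actual metric may differ.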
Conclusions
GliomaPredict provides a user-friendly, clinically applicable platform for instantly assigning gene expression-based subtypes to patients with gliomas, thereby aiding clinical trial design and therapeutic decision-making. We expect that, in time, GliomaPredict and tools like it will become routinely used in translational/clinical research and in the clinical care of patients with gliomas.
doi:10.1186/1472-6947-10-38
PMCID: PMC2912783  PMID: 20633285
23.  Promoting synergistic research and education in genomics and bioinformatics 
BMC Genomics  2008;9(Suppl 1):I1.
Bioinformatics and genomics are closely related disciplines that hold great promise for the advancement of research and development in complex biomedical systems, as well as in public health, drug design, comparative genomics, and personalized medicine. Research and development in these two important areas are impacting science and technology.
High-throughput sequencing and molecular imaging technologies marked the beginning of a new era for modern translational medicine and personalized healthcare. Having the human sequence and personalized digital images in hand has also created tremendous demand for powerful supercomputing, statistical learning, and artificial intelligence approaches to handle the massive bioinformatics and personalized healthcare data, which will have a profound effect on how biomedical research is conducted toward the improvement of human health and the prolonging of human life. The International Society of Intelligent Biological Medicine (http://www.isibm.org) and its official journals, the International Journal of Functional Informatics and Personalized Medicine (http://www.inderscience.com/ijfipm) and the International Journal of Computational Biology and Drug Design (http://www.inderscience.com/ijcbdd), in collaboration with the International Conference on Bioinformatics and Computational Biology (Biocomp), promote research, education, and awareness of this emerging integrated inter/multidisciplinary field. The 2007 International Conference on Bioinformatics and Computational Biology (BIOCOMP07) was held in Las Vegas, United States of America, on June 25-28, 2007. The conference attracted over 400 papers covering broad research areas in genomics, biomedicine, and bioinformatics. Biocomp 2007 provided a common platform for the cross-fertilization of ideas and helped shape knowledge and scientific achievements by bridging these two important disciplines in an interactive and attractive forum. With this objective in mind, Biocomp 2007 aimed to promote interdisciplinary and multidisciplinary education and research.
Twenty-five high-quality peer-reviewed papers were selected from more than 400 submissions for this supplementary issue of BMC Genomics. These papers contribute to a wide range of important research fields, including gene expression data analysis and applications, high-throughput genome mapping, sequence analysis, gene regulation, protein structure prediction, disease prediction by machine learning techniques, systems biology, and database and biological software development. We encourage participants to submit proposals for genomics sessions, special-interest research sessions, workshops, and tutorials to Professor Hamid R. Arabnia (hra@cs.uga.edu) to ensure that Biocomp continues to play a leadership role in promoting inter/multidisciplinary research and education in these fields. Biocomp received a top conference ranking, with a high score of 0.95/1.00. Biocomp is academically co-sponsored by the International Society of Intelligent Biological Medicine and research laboratories and centers of Harvard University – Massachusetts Institute of Technology, Indiana University – Purdue University, Georgia Tech – Emory University, UIUC, UCLA, Columbia University, the University of Texas at Austin, the University of Iowa, and others. Biocomp-Worldcomp brings leading scientists together from across the nation and around the world, and aims to promote synergistic components such as keynote lectures, special-interest sessions, workshops, and tutorials in response to advances in cutting-edge research.
doi:10.1186/1471-2164-9-S1-I1
PMCID: PMC3226105  PMID: 18366597
24.  Exploratory trials, confirmatory observations: A new reasoning model in the era of patient-centered medicine 
Background
The prevailing view in therapeutic clinical research today is that observational studies are useful for generating new hypotheses and that controlled experiments (i.e., randomized clinical trials, RCTs) are the most appropriate method for assessing and confirming the efficacy of interventions.
Discussion
The current trend towards patient-centered medicine calls for alternative ways of reasoning, and in particular for a shift towards hypothetico-deductive logic, in which theory is adjusted in light of individual facts. A model of this kind should change our approach to drug research, development, and regulation. The assessment of new therapeutic agents would be viewed as a continuous process, and regulatory approval would no longer be regarded as the final step in the testing of a hypothesis but rather as the hypothesis-generating step.
The main role of RCTs in this patient-centered research paradigm would be to generate hypotheses, while observations would serve primarily to test their validity for different types of patients. Under hypothetico-deductive logic, RCTs are considered "exploratory" and observations, "confirmatory".
Summary
In this era of tailored therapeutics, the answers to therapeutic questions cannot come exclusively from methods that rely on data aggregation, the analysis of similarities, controlled experiments, and a search for the best outcome for the average patient; they must also come from methods based on data disaggregation, analysis of subgroups and individuals, an integration of research and clinical practice, systematic observations, and a search for the best outcome for the individual patient. We must look not only to evidence-based medicine, but also to medicine-based evidence, in seeking the knowledge that we need.
doi:10.1186/1471-2288-11-57
PMCID: PMC3097156  PMID: 21518440
25.  Reporting Bias in Drug Trials Submitted to the Food and Drug Administration: Review of Publication and Presentation 
PLoS Medicine  2008;5(11):e217.
Background
Previous studies of drug trials submitted to regulatory authorities have documented selective reporting of both entire trials and favorable results. The objective of this study is to determine the publication rate of efficacy trials submitted to the Food and Drug Administration (FDA) in approved New Drug Applications (NDAs) and to compare the trial characteristics as reported by the FDA with those reported in publications.
Methods and Findings
This is an observational study of all efficacy trials found in approved NDAs for New Molecular Entities (NMEs) from 2001 to 2002 inclusive and all published clinical trials corresponding to the trials within the NDAs. For each trial included in the NDA, we assessed its publication status, primary outcome(s) reported and their statistical significance, and conclusions. Seventy-eight percent (128/164) of efficacy trials contained in FDA reviews of NDAs were published. In a multivariate model, trials with favorable primary outcomes (OR = 4.7, 95% confidence interval [CI] 1.33–17.1, p = 0.018) and active controls (OR = 3.4, 95% CI 1.02–11.2, p = 0.047) were more likely to be published. Forty-one primary outcomes from the NDAs were omitted from the papers. Papers included 155 outcomes that were in the NDAs, 15 additional outcomes that favored the test drug, and two other neutral or unknown additional outcomes. Excluding outcomes with unknown significance, there were 43 outcomes in the NDAs that did not favor the NDA drug. Of these, 20 (47%) were not included in the papers. The statistical significance of five of the remaining 23 outcomes (22%) changed between the NDA and the paper, with four changing to favor the test drug in the paper (p = 0.38). Excluding unknowns, 99 conclusions were provided in both NDAs and papers, nine conclusions (9%) changed from the FDA review of the NDA to the paper, and all nine did so to favor the test drug (100%, 95% CI 72%–100%, p = 0.0039).
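As a worked, hedged illustration of the odds-ratio statistic reported above: the study's ORs (e.g., OR = 4.7 for publication of trials with favorable primary outcomes) came from a multivariate model that is not reproduced here, and the 2×2 counts below are invented.

```python
# Illustrative odds-ratio computation for a 2x2 table. Counts are invented
# and do not correspond to the study's data.

def odds_ratio(a, b, c, d):
    """OR for the table [[a, b], [c, d]]:
    a = exposed with event,   b = exposed without event,
    c = unexposed with event, d = unexposed without event."""
    return (a * d) / (b * c)

# Hypothetical counts: published vs. unpublished by outcome favorability
a, b = 90, 10   # favorable primary outcome: published, unpublished
c, d = 38, 26   # unfavorable primary outcome: published, unpublished
result = odds_ratio(a, b, c, d)
```

With these invented counts the crude OR is (90×26)/(10×38) ≈ 6.16, i.e., trials with favorable outcomes would have roughly six times the odds of publication; an adjusted (multivariate) estimate such as the study's would generally differ from the crude one.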
Conclusions
Many trials were still not published 5 y after FDA approval. Discrepancies between the trial information reviewed by the FDA and information found in published trials tended to lead to more favorable presentations of the NDA drugs in the publications. Thus, the information that is readily available in the scientific literature to health care professionals is incomplete and potentially biased.
Lisa Bero and colleagues review the publication status of all efficacy trials carried out in support of new drug approvals from 2001 and 2002, and find that a quarter of trials remain unpublished.
Editors' Summary
Background.
All health-care professionals want their patients to have the best available clinical care—but how can they identify the optimum drug or intervention? In the past, clinicians used their own experience or advice from colleagues to make treatment decisions. Nowadays, they rely on evidence-based medicine—the systematic review and appraisal of clinical research findings. So, for example, before a new drug is approved for the treatment of a specific disease in the United States and becomes available for doctors to prescribe, the drug's sponsors (usually a pharmaceutical company) must submit a “New Drug Application” (NDA) to the US Food and Drug Administration (FDA). The NDA tells the story of the drug's development from laboratory and animal studies through to clinical trials, including “efficacy” trials in which the efficacy and safety of the new drug and of a standard drug for the disease are compared by giving groups of patients the different drugs and measuring several key (primary) “outcomes.” FDA reviewers use this evidence to decide whether to approve a drug.
Why Was This Study Done?
Although the information in NDAs is publicly available, clinicians and patients usually learn about new drugs from articles published in medical journals after drug approval. Unfortunately, drug sponsors sometimes publish the results only of the trials in which their drug performed well and in which statistical analyses indicate that the drug's improved performance was a real effect rather than a lucky coincidence. Trials in which a drug did not show a “statistically significant benefit” or where the drug was found to have unwanted side effects often remain unpublished. This “publication bias” means that the scientific literature can contain an inaccurate picture of a drug's efficacy and safety relative to other therapies. This may lead to clinicians preferentially prescribing newer, more expensive drugs that are not necessarily better than older drugs. In this study, the researchers test the hypothesis that not all the trial results in NDAs are published in medical journals. They also investigate whether there are any discrepancies between the trial data included in NDAs and in published articles.
What Did the Researchers Do and Find?
The researchers identified all the efficacy trials included in NDAs for totally new drugs that were approved by the FDA in 2001 and 2002 and searched the scientific literature for publications between July 2006 and June 2007 relating to these trials. Only three-quarters of the efficacy trials in the NDAs were published; trials with favorable outcomes were nearly five times as likely to be published as those without favorable outcomes. Although 155 primary outcomes were in both the papers and the NDAs, 41 outcomes were only in the NDAs. Conversely, 17 outcomes were only in the papers; 15 of these favored the test drug. Of the 43 primary outcomes reported in the NDAs that showed no statistically significant benefit for the test drug, only half were included in the papers; for five of the reported primary outcomes, the statistical significance differed between the NDA and the paper and generally favored the test drug in the papers. Finally, nine out of 99 conclusions differed between the NDAs and the papers; each time, the published conclusion favored the test drug.
What Do These Findings Mean?
These findings indicate that the results of many trials of new drugs are not published 5 years after FDA approval of the drug. Furthermore, unexplained discrepancies between the data and conclusions in NDAs and in medical journals are common and tend to paint a more favorable picture of the new drug in the scientific literature than in the NDAs. Overall, these findings suggest that the information on the efficacy of new drugs that is readily available to clinicians and patients through the published scientific literature is incomplete and potentially biased. The recent introduction in the US and elsewhere of mandatory registration of all clinical trials before they start and of mandatory publication in trial registers of the full results of all the predefined primary outcomes should reduce publication bias over the next few years and should allow clinicians and patients to make fully informed treatment decisions.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050217.
This study is further discussed in a PLoS Medicine Perspective by An-Wen Chan
PLoS Medicine recently published a related article by Ida Sim and colleagues: Lee K, Bacchetti P, Sim I (2008) Publication of clinical trials supporting successful new drug applications: A literature analysis. PLoS Med 5: e191. doi:10.1371/journal.pmed.0050191
The Food and Drug Administration provides information about drug approval in the US for consumers and for health-care professionals; detailed information about the process by which drugs are approved is on the Web site of the FDA Center for Drug Evaluation and Research (in English and Spanish)
NDAs for approved drugs can also be found on this Web site
The ClinicalTrials.gov Web site provides information about the US National Institutes of Health clinical trial registry, background information about clinical trials, and a fact sheet detailing the requirements of the FDA Amendments Act 2007 for trial registration
The World Health Organization's International Clinical Trials Registry Platform is working toward setting international norms and standards for the reporting of clinical trials (in several languages)
doi:10.1371/journal.pmed.0050217
PMCID: PMC2586350  PMID: 19067477