Recently, the World Health Organization (WHO) declared the spread of polio an international public health emergency requiring a coordinated international response. Although the importance of such a response is recognized, there are challenges to stopping the spread of polio and achieving a polio-free world. The most important issue is directing limited national resources to the specific areas where polio is endemic. In an article published in BMC Medicine, Upfill-Brown and colleagues recognized this problem and successfully identified potential risk areas in Nigeria using a validated spatial predictive model of wild poliovirus circulation. They also showed that lower vaccine-derived population immunity is associated with a higher probability and number of wild poliovirus cases within a district. Identifying potential risk areas and understanding the magnitude of risk may help endemic countries direct their limited resources to the areas most at risk, maximizing the impact of interventions and motivating people to participate in intervention programs. These efforts are crucial if endemic countries hope to eradicate polio.
Please see related research article: http://www.biomedcentral.com/1741-7015/12/92.
Polio; Spatial model; Risk area; OPV
The placebo response plays a major role in psychiatry, particularly in depression. A new network meta-analysis investigates whether the effects of placebo vary in studies comparing fluoxetine and venlafaxine, two widely prescribed antidepressants. Even though data from this article indicate that the effects of placebos do not differ, publication bias cannot be ruled out. The authors use their finding to criticise the paradigm of evidence-based medicine, questioning whether there is anything certain in psychiatry and, more precisely, in the field of antidepressant treatment for major depression. This study stimulates the debate about the validity of scientific knowledge in medicine and highlights the importance of considering things from a different perspective. However, the authors’ view should be considered with caution. As clinicians, we make decisions every day, integrating individual clinical expertise and patients’ preferences and values with the best, up-to-date research data. The quality of scientific information must be improved, but we still think that valid conclusions to help clinical practice can be drawn from a critical and cautious use of the best available, if flawed, evidence.
Please see related articles: http://www.biomedcentral.com/1741-7015/11/230 and http://www.biomedcentral.com/1741-7015/12/106.
Antidepressants; Major depressive disorder; Meta-analysis; Placebo; Publication bias
Circulating free fatty acids are often elevated in patients with type 2 diabetes (T2D) and obese individuals. Chronic exposure to high levels of saturated fatty acids has detrimental effects on islet function and insulin secretion. Altered gene expression and epigenetics may contribute to T2D and obesity. However, there is limited information on whether fatty acids alter the genome-wide transcriptome profile in conjunction with DNA methylation patterns in human pancreatic islets. To dissect the molecular mechanisms linking lipotoxicity to impaired insulin secretion, we investigated the effects of a 48 h palmitate treatment in vitro on genome-wide mRNA expression and DNA methylation patterns in human pancreatic islets.
Genome-wide mRNA expression was analyzed using Affymetrix GeneChip® Human Gene 1.0 ST whole transcript-based array (n = 13) and genome-wide DNA methylation was analyzed using Infinium HumanMethylation450K BeadChip (n = 13) in human pancreatic islets exposed to palmitate or control media for 48 h. A non-parametric paired Wilcoxon statistical test was used to analyze mRNA expression. Apoptosis was measured using Apo-ONE® Homogeneous Caspase-3/7 Assay (n = 4).
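The paired design described above (each donor's islets serving as their own control under palmitate and control media) calls for a non-parametric paired test. A minimal sketch of such a comparison for a single gene, with invented expression values for 13 hypothetical donors (only the choice of test reflects the stated method):

```python
# Illustrative paired non-parametric comparison for one gene across donors.
# Expression values are invented; only the Wilcoxon paired test mirrors
# the analysis described in the methods.
from scipy.stats import wilcoxon

control   = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 5.1, 4.7, 5.0, 5.2, 4.9, 5.1, 5.0]
palmitate = [5.6, 5.2, 5.9, 5.4, 5.5, 5.8, 5.6, 5.1, 5.4, 5.9, 5.3, 5.7, 5.5]

# Paired test: each donor contributes one control and one palmitate value.
stat, p_value = wilcoxon(palmitate, control)
print(f"Wilcoxon statistic = {stat}, p = {p_value:.4f}")
```

In a genome-wide setting this test would be run per transcript, with the resulting p-values adjusted for multiple testing.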
While glucose-stimulated insulin secretion was decreased, there was no significant effect on apoptosis in human islets exposed to palmitate. We identified 1,860 differentially expressed genes in palmitate-treated human islets. These include candidate genes for T2D, such as TCF7L2, GLIS3, HNF1B and SLC30A8. Additionally, genes involved in glycolysis/gluconeogenesis, pyruvate metabolism, fatty acid metabolism, glutathione metabolism and the one-carbon pool by folate were differentially expressed in palmitate-treated human islets. Palmitate treatment altered the global DNA methylation level and DNA methylation levels of CpG island shelves and shores, 5′UTR, 3′UTR and gene body regions in human islets. Moreover, 290 genes with differential expression had a corresponding change in DNA methylation, for example, TCF7L2 and GLIS3. Importantly, of the genes differentially expressed due to palmitate treatment in human islets, 67 were also associated with BMI and 37 were differentially expressed in islets from T2D patients.
Our study demonstrates that palmitate treatment of human pancreatic islets gives rise to epigenetic modifications that together with altered gene expression may contribute to impaired insulin secretion and T2D.
Palmitate; Human pancreatic islets; Type 2 diabetes; Lipotoxicity; DNA methylation; mRNA expression; Insulin secretion; Epigenetics
Palmitic acid, or hexadecanoic acid, a 16-carbon saturated fatty acid (FA), accounts for approximately 38% of total circulating FA in both lean and obese humans. In an article published in BMC Medicine, Hall et al. report that cultured islets from healthy donors, when exposed to palmitate, undergo changes in CpG methylation that are associated with altered expression of 290 genes. Their results provide a first look at the mechanisms by which the human endocrine pancreas retains a durable genomic imprint of FA exposure, one that can influence gene expression and possibly cell phenotype in the long term. Such studies are likely to help in understanding the epigenetic response of β cells to a disturbed metabolic environment, especially one created by obesity.
Please see related article: http://www.biomedcentral.com/1741-7015/12/103
Pancreatic cells; Fatty acids; Palmitate; Epigenetics; Gene expression; Type 2 diabetes; Islets of Langerhans; Obesity; DNA methylation
Tuberculous pericarditis (TBP) is associated with high morbidity and mortality, and is an important treatable cause of heart failure in developing countries. Tuberculous aetiology of pericarditis is difficult to diagnose promptly. The utility of the new quantitative PCR test (Xpert MTB/RIF) for the diagnosis of TBP is unknown. This study sought to evaluate the diagnostic accuracy of the Xpert MTB/RIF test compared to pericardial adenosine deaminase (ADA) and unstimulated interferon-gamma (uIFNγ) in suspected TBP.
From October 2009 through September 2012, 151 consecutive patients with suspected TBP were enrolled at a single centre in Cape Town, South Africa. Mycobacterium tuberculosis culture and/or pericardial histology served as the reference standard for definite TBP. Receiver-operating-characteristic curve analysis was used for selection of ADA and uIFNγ cut-points.
Of the participants, 49% (74/151) were classified as definite TBP, 33% (50/151) as probable TBP and 18% (27/151) as non-TBP. A total of 105 (74%) participants were human immunodeficiency virus (HIV) positive. Xpert MTB/RIF had a sensitivity and specificity (95% confidence interval (CI)) of 63.8% (52.4% to 75.1%) and 100% (85.6% to 100%), respectively. Concentration of pericardial fluid by centrifugation and use of standard sample processing did not improve Xpert MTB/RIF accuracy. ADA (≥35 IU/L) and uIFNγ (≥44 pg/ml) both had a sensitivity of 95.7% (88.1% to 98.5%) and a negative likelihood ratio of 0.05 (0.02 to 0.10). However, the specificity and positive likelihood ratio of uIFNγ were higher than those of ADA (96.3% (81.7% to 99.3%) and 25.8 (3.6 to 183.4) versus 84% (65.4% to 93.6%) and 6.0 (3.7 to 9.8); P = 0.03) at an estimated background TB prevalence of 30%. The sensitivity and negative predictive value of both uIFNγ and ADA were higher than those of Xpert MTB/RIF (P < 0.001).
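The accuracy measures reported above all derive from a standard 2×2 table of test results against the reference standard. A minimal sketch, using hypothetical cell counts chosen only to roughly reproduce the ADA figures (the study's actual cell values are not given here and are an assumption):

```python
# Standard diagnostic accuracy quantities from a 2x2 table.
# The counts passed below are a hypothetical reconstruction, not the
# study's actual data.
def diagnostic_accuracy(tp, fn, fp, tn):
    sens = tp / (tp + fn)                               # sensitivity
    spec = tn / (tn + fp)                               # specificity
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    lr_neg = (1 - sens) / spec                          # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

sens, spec, lr_pos, lr_neg = diagnostic_accuracy(tp=66, fn=3, fp=4, tn=21)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} LR-={lr_neg:.2f}")
```

With these illustrative counts the output approximates the reported ADA profile (sensitivity ~95.7%, specificity 84%, negative likelihood ratio ~0.05).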
uIFNγ offers superior accuracy for the diagnosis of microbiologically confirmed TBP compared to the ADA assay and the Xpert MTB/RIF test.
Tuberculous pericarditis; Adenosine deaminase; Interferon γ; Xpert MTB/RIF test; Diagnosis
Building on an approach developed to assess the economic returns to cardiovascular research, we estimated the economic returns from publicly and charitably funded cancer-related research in the UK that arise from the net value of the improved health outcomes.
To assess these economic returns from cancer-related research in the UK we estimated: 1) public and charitable expenditure on cancer-related research in the UK from 1970 to 2009; 2) net monetary benefit (NMB), that is, the health benefit measured in quality adjusted life years (QALYs) valued in monetary terms (using a base-case value of a QALY of GB£25,000) minus the cost of delivering that benefit, for a prioritised list of interventions from 1991 to 2010; 3) the proportion of NMB attributable to UK research; 4) the elapsed time between research funding and health gain; and 5) the internal rate of return (IRR) from cancer-related research investments on health benefits. We analysed the uncertainties in the IRR estimate using sensitivity analyses to illustrate the effect of some key parameters.
In 2011/12 prices, total expenditure on cancer-related research from 1970 to 2009 was £15 billion. The NMB of the 5.9 million QALYs gained from the prioritised interventions from 1991 to 2010 was £124 billion. Calculation of the IRR incorporated an estimated elapsed time of 15 years. We related 17% of the annual NMB estimated to be attributable to UK research (for each of the 20 years 1991 to 2010) to 20 years of research investment 15 years earlier (that is, for 1976 to 1995). This produced a best-estimate IRR of 10%, compared with 9% previously estimated for cardiovascular disease research. The sensitivity analysis demonstrated the importance of smoking reduction as a major source of improved cancer-related health outcomes.
We have demonstrated a substantive IRR from net health gain to public and charitable funding of cancer-related research in the UK, and further validated the approach that we originally used in assessing the returns from cardiovascular research. In doing so, we have highlighted a number of weaknesses and key assumptions that need strengthening in further investigations. Nevertheless, these cautious estimates demonstrate that the returns from past cancer research have been substantial, and justify the investments made during the period 1976 to 1995.
Medical research investment; QALYs; Cancer; Medical research charities; Value of health; Rate of return; Time lags; Research payback
The recent publication of the PREDIMED trial provided definitive evidence that a Mediterranean diet provides protection against cardiovascular disease. Two articles published in BMC Medicine provide further understanding of why this may be the case, by considering contributory effects of olive oil, a core food in the diet, and polyphenols, a class of identifiable protective compounds. Using a number of statistical models, the analyses showed around a 35% cardiovascular disease risk reduction in the highest consumers of olive oil, and a similar degree of risk reduction for all-cause mortality when comparing the highest to the lowest quintiles of polyphenol intake. These effects were greater than those generally observed in cohort studies unconnected to trials. This suggests that better control of the background diet may be necessary to reveal the value of individual foods and nutrients within a dietary pattern, bearing in mind that, by nature, it is difficult to separate the effects of foods, nutrients and whole diets.
Please see related articles: http://www.biomedcentral.com/1741-7015/12/77 and http://www.biomedcentral.com/1741-7015/12/78.
Mediterranean diet; Cardiovascular disease; Mortality; PREDIMED study
Generic atypical antipsychotic drugs offer health authorities opportunities for considerable savings. However, schizophrenia and bipolar disorders are complex diseases that require tailored treatments. Consequently, health authorities have generally introduced only limited demand-side measures to encourage the preferential prescribing of generics. This is unlike the situation with hypertension, hypercholesterolaemia or acid-related stomach disorders.
The objectives of this study were to compare the effect of the limited demand-side measures in Western European countries and regions on the subsequent prescribing of risperidone following the introduction of generics; to utilise the findings to provide future guidance to health authorities; and, where possible, to investigate the utilisation of generic versus originator risperidone and the prices of generic risperidone.
Principally, this was a segmented regression analysis of retrospective time-series data of the effect of the various initiatives in Belgium, Ireland, Scotland and Sweden following the introduction of generic risperidone. The study included patients prescribed at least one atypical antipsychotic drug up to 20 months before and up to 20 months after generic risperidone. In addition, retrospective observational studies were carried out in Austria and Spain (Catalonia) from 2005 to 2011 as well as one English primary care organisation (Bury Primary Care Trust (PCT)).
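A segmented regression of interrupted time-series data, as named above, typically lets both the level and the slope of utilisation change at the month generics enter the market. A minimal sketch on synthetic monthly data (all numbers and variable names are illustrative, not the study's):

```python
# Segmented (interrupted time-series) regression sketch:
# y = b0 + b1*time + b2*post + b3*time_since_intervention + error.
# Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(40)                    # 20 months pre, 20 months post
post = (months >= 20).astype(float)       # indicator for post-generic period
time_since = np.where(post == 1, months - 20, 0.0)

# Synthetic outcome: baseline 60, pre-trend +0.2/month, level drop of 5
# at generic entry, slope change of -0.3/month afterwards, plus noise.
y = 60 + 0.2 * months - 5 * post - 0.3 * time_since + rng.normal(0, 0.5, 40)

X = np.column_stack([np.ones(40), months, post, time_since])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change, trend_change = coef
print(f"level change = {level_change:.2f}, trend change = {trend_change:.3f}")
```

The fitted `level_change` and `trend_change` coefficients are the quantities a segmented regression reports as the immediate and gradual effects of the intervention.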
There was a consistent steady reduction in risperidone as a percentage of total selected atypical antipsychotic utilisation following generics. A similar pattern was seen in Austria and Spain, with stable utilisation in one English PCT. However, there was considerable variation in the utilisation of generic risperidone, ranging from 98% of total risperidone in Scotland to only 14% in Ireland. Similarly, the price of generic risperidone varied considerably. In Scotland, generic risperidone was only 16% of pre-patent loss prices versus 72% in Ireland.
The consistent finding of no increase in risperidone prescribing post generics, despite limited specific demand-side measures, suggests there is no ‘spillover’ effect from one class to another that would encourage the preferential prescribing of generic atypical antipsychotic drugs. This is exacerbated by the complexity of the disease area and differences in side-effects between treatments. There appeared to be no clinical issues with generic risperidone, and prices inversely reflected measures to enhance their utilisation.
Generics; Antipsychotics; Risperidone; Demand-side measures; Drug utilisation; Cross national study
The use of antibiotics is the single most important driver of antibiotic resistance. Nevertheless, antibiotic overuse remains common. A decline in antibiotic prescribing in the United States coincided with the launch of national educational campaigns in the 1990s and other interventions, including the introduction of routine infant immunization with the pneumococcal conjugate vaccine (PCV-7); however, it is unknown whether these trends have been sustained in more recent years.
We performed an analysis of nationally representative data from the Medical Expenditure Panel Surveys from 2000 to 2010. Trends in population-based prescribing were examined for overall antibiotics, broad-spectrum antibiotics, antibiotics for acute respiratory tract infections (ARTIs) and antibiotics prescribed during ARTI visits. Rates were reported for three age groups: children and adolescents (<18 years), adults (18 to 64 years), and older adults (≥65 years).
An estimated 1.4 billion antibiotics were dispensed over the study period. Overall antibiotic prescribing decreased 18% (risk ratio (RR) 0.82, 95% confidence interval (95% CI) 0.72 to 0.94) among children and adolescents, remained unchanged for adults, and increased 30% (1.30, 1.14 to 1.49) among older adults. Rates of broad-spectrum antibiotic prescriptions doubled from 2000 to 2010 (2.11, 1.81 to 2.47). Proportions of broad-spectrum antibiotic prescribing increased across all age groups: 79% (1.79, 1.52 to 2.11) for children and adolescents, 143% (2.43, 2.07 to 2.86) for adults and 68% (1.68, 1.45 to 1.94) for older adults. ARTI antibiotic prescribing decreased 57% (0.43, 0.35 to 0.52) among children and adolescents and 38% (0.62, 0.48 to 0.80) among adults; however, it remained unchanged among older adults. While the number of ARTI visits declined by 19%, patients with ARTI visits were more likely to receive an antibiotic (73% versus 64%; P <0.001) in 2010 than in 2000.
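The changes above are expressed as risk ratios with 95% confidence intervals. A crude sketch of how such a ratio and its log-scale interval can be computed under a simple Poisson approximation (the counts are invented, and this is not the survey-weighted method the study used):

```python
# Rate ratio between two periods with a log-scale 95% CI under a
# Poisson approximation. Counts are hypothetical.
import math

def rate_ratio_ci(events_1, pop_1, events_0, pop_0, z=1.96):
    rr = (events_1 / pop_1) / (events_0 / pop_0)
    se_log = math.sqrt(1 / events_1 + 1 / events_0)   # SE of log(RR)
    lo = rr * math.exp(-z * se_log)
    hi = rr * math.exp(z * se_log)
    return rr, lo, hi

# Hypothetical: 820 prescriptions per 1,000 persons in 2010 versus
# 1,000 per 1,000 persons in 2000 -> RR = 0.82, an 18% decrease.
rr, lo, hi = rate_ratio_ci(820, 1000, 1000, 1000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

An RR of 0.82 corresponds to the 18% decrease reported for children and adolescents; the interval narrows as the underlying counts grow.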
Antibiotic use has decreased among children and adolescents, but has increased for older adults. Broad-spectrum antibiotic prescribing continues to be on the rise. Public policy initiatives to promote the judicious use of antibiotics should continue and programs targeting older adults should be developed.
Antibiotic; Prescribing; Ambulatory care; Antibiotic resistance; Surveillance
The existence of socio-economic inequalities in child mortality is well documented. African cities grow faster than cities in most other regions of the world, and inequalities in African cities are thought to be particularly large. Revealing health-related inequalities is essential in order for governments to be able to act against them. This study aimed to systematically compare inequalities in child mortality across 10 major African cities (Cairo, Lagos, Kinshasa, Luanda, Abidjan, Dar es Salaam, Nairobi, Dakar, Addis Ababa, Accra), and to investigate trends in such inequalities over time.
Data from two rounds of Demographic and Health Surveys (DHS), where available, were used for this study: one from around the year 2000 and one from between 2007 and 2011. Child mortality rates within cities were calculated by population wealth quintile. Inequality in child mortality was assessed by computing two measures of relative inequality (the rate ratio and the concentration index) and two measures of absolute inequality (the difference and the Erreygers index).
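Three of the measures named above can be sketched directly from quintile-level rates. The mortality rates below are invented (poorest to richest, equal 20% population shares); the grouped concentration-index formula is a standard one, not necessarily the exact estimator the study implemented:

```python
# Inequality measures from hypothetical quintile-level child mortality
# rates (deaths per 1,000 live births), ordered poorest -> richest.
def rate_ratio_and_difference(rates):
    return rates[0] / rates[-1], rates[0] - rates[-1]

def concentration_index(rates, shares):
    # Grouped formula: C = (2 / mu) * sum_t f_t * y_t * R_t - 1,
    # where R_t is the fractional rank (midpoint of cumulative share).
    mu = sum(f * y for f, y in zip(shares, rates))
    cum, total = 0.0, 0.0
    for f, y in zip(shares, rates):
        rank = cum + f / 2
        total += f * y * rank
        cum += f
    return 2 * total / mu - 1

rates = [150, 120, 100, 80, 60]       # illustrative rates
shares = [0.2] * 5
rr, diff = rate_ratio_and_difference(rates)
ci = concentration_index(rates, shares)
print(rr, diff, round(ci, 3))         # negative C: burden on the poor
```

For a health "bad" such as mortality, a negative concentration index indicates that deaths are concentrated among the poor, which is the pattern the study examines.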
Mean child mortality rates ranged from about 39 deaths per 1,000 live births in Cairo (2008) to about 107 deaths per 1,000 live births in Dar es Salaam (2010). Significant inequalities were found in Kinshasa, Luanda, Abidjan, and Addis Ababa in the most recent survey. The difference between the poorest quintile and the richest quintile was as much as 108 deaths per 1,000 live births (95% confidence interval 55 to 166) in Abidjan in 2011–2012. When comparing inequalities across cities or over time, confidence intervals of all measures almost always overlap. Nevertheless, inequalities appear to have increased in Abidjan, while they appear to have decreased in Cairo, Lagos, Dar es Salaam, Nairobi and Dakar.
Considerable inequalities exist in almost all cities but the level of inequalities and their development over time appear to differ across cities. This implies that inequalities are amenable to policy interventions and that it is worth investigating why inequalities are higher in one city than in another. However, larger samples are needed in order to improve the certainty of our results. Currently available data samples from DHS are too small to reliably quantify the level of inequalities within cities.
Socioeconomic factors; Urban health; Child mortality; Africa; Social justice
The aim of this study is to outline a general process for assessing the feasibility of performing a valid network meta-analysis (NMA) of randomized controlled trials (RCTs) to synthesize direct and indirect evidence for alternative treatments for a specific disease population.
Several steps to assess the feasibility of an NMA are proposed based on existing recommendations. Next, a case study is used to illustrate this NMA feasibility assessment process in order to compare everolimus in combination with hormonal therapy to alternative chemotherapies in terms of progression-free survival for women with advanced breast cancer.
A general process for assessing the feasibility of an NMA is outlined that incorporates explicit steps to visualize the heterogeneity in terms of treatment and outcome characteristics (Part A) as well as the study and patient characteristics (Part B). Additionally, steps are performed to illustrate differences within and across different types of direct comparisons in terms of baseline risk (Part C) and observed treatment effects (Part D), since there is a risk that the treatment effect modifiers identified may not explain the observed heterogeneity or inconsistency in the results due to unexpected, unreported or unmeasured differences. Depending on the data available, alternative approaches are suggested: list assumptions; perform a meta-regression analysis, subgroup analysis or sensitivity analyses; or summarize why an NMA is not feasible.
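One prerequisite implicit in the steps above is that the evidence network be connected: every treatment must be reachable from every other through a chain of direct comparisons before direct and indirect evidence can be combined. A minimal connectivity check, using a hypothetical trial list:

```python
# Check whether an evidence network of treatments is connected.
# Each trial is a pair of directly compared treatments (hypothetical data).
from collections import defaultdict, deque

def is_connected(treatments, trials):
    graph = defaultdict(set)
    for a, b in trials:               # each head-to-head trial adds an edge
        graph[a].add(b)
        graph[b].add(a)
    seen = {treatments[0]}
    queue = deque(seen)
    while queue:                      # breadth-first search from one node
        node = queue.popleft()
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen == set(treatments)

# A-B, B-C and C-D trials connect all four treatments; without B-C the
# network splits into two disconnected components.
print(is_connected(["A", "B", "C", "D"], [("A", "B"), ("B", "C"), ("C", "D")]))
print(is_connected(["A", "B", "C", "D"], [("A", "B"), ("C", "D")]))
```

A disconnected network is one of the situations in which the final step, summarizing why an NMA is not feasible, would apply.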
The process outlined to assess the feasibility of an NMA provides a stepwise framework that will help to ensure that the underlying assumptions are systematically explored and that the risks (and benefits) of pooling and indirectly comparing treatment effects from RCTs for a particular research question are transparent.
Advanced breast cancer; Everolimus; Chemotherapy; Network meta-analysis; Progression-free survival; Systematic literature review; Feasibility assessment
One of the challenges facing the Global Polio Eradication Initiative is efficiently directing limited resources, such as specially trained personnel, community outreach activities, and satellite vaccinator tracking, to the most at-risk areas to maximize the impact of interventions. A validated predictive model of wild poliovirus circulation would greatly inform prioritization efforts by accurately forecasting areas at greatest risk, thus enabling the greatest effect of program interventions.
Using Nigerian acute flaccid paralysis surveillance data from 2004-2013, we developed a spatial hierarchical Poisson hurdle model fitted within a Bayesian framework to study historical polio caseload patterns and forecast future circulation of type 1 and 3 wild poliovirus within districts in Nigeria. A Bayesian temporal smoothing model was applied to address data sparsity underlying estimates of covariates at the district level.
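The hurdle structure named above separates whether a district reports any cases from how many it reports given at least one. A sketch of the two parts for a Poisson hurdle (the parameter values are illustrative, not fitted estimates, and the study's actual model is spatial, hierarchical and Bayesian):

```python
# Poisson hurdle model sketch: Bernoulli "hurdle" (any cases vs. none)
# plus a zero-truncated Poisson for the positive counts.
# Parameters are illustrative only.
import math

def hurdle_pmf(k, p, lam):
    """P(Y = k) with hurdle probability p and Poisson rate lam."""
    if k == 0:
        return 1 - p
    truncated = (lam ** k * math.exp(-lam)) / (math.factorial(k) * (1 - math.exp(-lam)))
    return p * truncated

def hurdle_mean(p, lam):
    # E[Y] = p * lam / (1 - exp(-lam)), the mean of the truncated part
    # weighted by the probability of clearing the hurdle.
    return p * lam / (1 - math.exp(-lam))

p, lam = 0.3, 2.0                      # illustrative district-level values
total = sum(hurdle_pmf(k, p, lam) for k in range(50))
print(round(total, 6), round(hurdle_mean(p, lam), 3))
```

In the study's framework, covariates such as vaccine-derived immunity would enter through the hurdle probability and the rate, which is what allows the two associations reported below to differ.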
We find that calculated vaccine-derived population immunity is significantly negatively associated with the probability and number of wild poliovirus case(s) within a district. Recent case information is significantly positively associated with probability of a case, but not the number of cases. We used lagged indicators and coefficients from the fitted models to forecast reported cases in the subsequent six-month periods. Over the past three years, the average predictive ability is 86 ± 2% and 85 ± 4% for wild poliovirus type 1 and 3, respectively. Interestingly, the predictive accuracy of historical transmission patterns alone is equivalent (86 ± 2% and 84 ± 4% for type 1 and 3, respectively). We calculate uncertainty in risk ranking to inform assessments of changes in rank between time periods.
The model developed in this study successfully predicts districts at risk for future wild poliovirus cases in Nigeria. The highest predicted district risk was 12.8 WPV1 cases in 2006, while the lowest district risk was 0.001 WPV1 cases in 2013. Model results have been used to direct the allocation of many different interventions, including political and religious advocacy visits. This modeling approach could be applied to other vaccine preventable diseases for use in other control and elimination programs.
Polio eradication; Spatial epidemiology; Risk modeling; Disease mapping
Despite intense investigation, the temporal sequence between alcohol consumption and mental health remains unclear. This study explored the relationship between alcohol consumption and mental health over multiple occasions, and compared a series of competing theoretical models to determine which best reflected the association between the two.
Data from phases 5 (1997 to 1999), 7 (2002 to 2004), and 9 (2007 to 2009) of the Whitehall II prospective cohort study were used, providing approximately 10 years of follow-up for 6,330 participants (73% men; mean ± SD age 55.8 ± 6.0 years). Mental health was assessed using the Short Form (SF)-36 mental health component score. Alcohol consumption was defined as the number of UK units of alcohol drunk per week. Four dynamic latent change score models were compared: 1) a baseline model in which alcohol consumption and mental health trajectories did not influence each other, 2) a model in which alcohol consumption influenced changes in mental health but mental health exerted no effect on changes in drinking and 3) vice versa, and 4) a reciprocal model in which both variables influenced changes in each other.
The third model, in which mental health influenced changes in alcohol consumption but not vice versa, was the best fit. In this model, the effect of previous mental health on upcoming change in alcohol consumption was negative (γ = -0.31, 95% CI -0.52 to -0.10), meaning that those with better mental health tended to make greater reductions (or shallower increases) in their drinking between occasions.
Mental health appears to be the leading indicator of change in the dynamic longitudinal relationship between mental health and weekly alcohol consumption in this sample of middle-aged adults. In addition to fuelling increases in alcohol consumption among low-level consumers, poor mental health may also be a maintaining factor for heavy alcohol consumption. Future work should seek to examine whether there are critical levels of alcohol intake at which different dynamic relationships begin to emerge between alcohol-related measures and mental health.
Alcohol; Mental health; Longitudinal; Reciprocal; Self-medication; Temporality
Providing additional Saturday rehabilitation can improve functional independence and health-related quality of life at discharge, and it may reduce patient length of stay, yet the economic implications are not known. The aim of this study was to determine, from a health service perspective, whether the provision of rehabilitation to inpatients on a Saturday in addition to Monday to Friday was cost effective compared to Monday to Friday rehabilitation alone.
Cost-utility and cost-effectiveness analyses were undertaken alongside a multi-center, single-blind randomized controlled trial with a 30-day follow-up after discharge. Participants were adults admitted for inpatient rehabilitation in two publicly funded metropolitan rehabilitation facilities. The control group received usual care rehabilitation services from Monday to Friday and the intervention group received usual care plus an additional rehabilitation service on Saturday. The incremental cost-utility ratio was reported as cost per quality-adjusted life year (QALY) gained, and an incremental cost-effectiveness ratio (ICER) was reported as cost per minimal clinically important difference (MCID) achieved in functional independence.
A total of 996 patients (mean age 74 years, standard deviation 13) were randomly assigned to the intervention (n = 496) or control group (n = 500). The mean difference in cost of AUD$1,673 (95% confidence interval (CI) -271 to 3,618) was a saving in favor of the intervention group. The incremental cost-utility ratio showed a saving of AUD$41,825 (95% CI -2,817 to 74,620) per QALY gained for the intervention group. The ICER showed a saving of AUD$16,003 (95% CI -3,074 to 87,361) per MCID achieved in functional independence for the intervention group. If the willingness to pay per QALY gained or per MCID in functional independence was zero dollars, the probability of the intervention being cost effective was 96% and 95%, respectively. A sensitivity analysis removing Saturday penalty rates did not significantly alter the outcome.
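The incremental calculations behind figures like these reduce to differences in mean costs and mean effects between arms. A minimal sketch with invented per-patient means (the trial's patient-level data are not reproduced here):

```python
# Incremental cost-effectiveness sketch:
# ICER = (cost_int - cost_ctrl) / (effect_int - effect_ctrl).
# All numbers below are illustrative, not the trial's data.
def icer(cost_int, cost_ctrl, effect_int, effect_ctrl):
    d_cost = cost_int - cost_ctrl
    d_effect = effect_int - effect_ctrl
    if d_cost < 0 and d_effect > 0:
        # Intervention is "dominant": cheaper and more effective, so the
        # ratio is reported as a saving per unit of benefit gained.
        return d_cost / d_effect, "dominant"
    return d_cost / d_effect, "trade-off"

# Hypothetical means: the intervention saves $1,500 per patient and
# gains 0.04 QALYs per patient.
value, label = icer(cost_int=28500, cost_ctrl=30000,
                    effect_int=0.70, effect_ctrl=0.66)
print(f"{value:,.0f} per QALY ({label})")
```

A negative ratio with positive incremental effect, as in the trial's results, is a saving per QALY gained rather than a cost; in practice the uncertainty around such ratios is usually characterized by bootstrapping patient-level data.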
From a health service perspective, the provision of rehabilitation to inpatients on a Saturday in addition to Monday to Friday, compared to Monday to Friday rehabilitation alone, is likely to be cost saving per QALY gained and for a MCID in functional independence.
Australian and New Zealand Clinical Trials Registry November 2009
Rehabilitation; Economic evaluation; Randomized controlled trial; Allied health
Mutations in the cystic fibrosis transmembrane conductance regulator (CFTR) gene lead to the disease cystic fibrosis (CF). Although patients with CF often have disturbances in glucose metabolism including impaired insulin release, no previous studies have tested the hypothesis that CFTR has a biological function in pancreatic beta-cells.
Experiments were performed on islets and single beta-cells from human donors and NMRI mice. Detection of CFTR was investigated using PCR and confocal microscopy. Effects on insulin secretion were measured with radioimmunoassay (RIA). The patch-clamp technique was used to measure ion channel currents and calcium-dependent exocytosis (as changes in membrane capacitance) on single cells with high temporal resolution. Analysis of ultrastructure was done on transmission electron microscopy (TEM) images.
We detected the presence of CFTR and measured a small CFTR conductance in both human and mouse beta-cells. The augmentation of insulin secretion at 16.7 mM glucose by activation of CFTR by cAMP (forskolin (FSK) or GLP-1) was significantly inhibited when CFTR antagonists (GlyH-101 and/or CFTRinh-172) were added. Likewise, capacitance measurements demonstrated reduced cAMP-dependent exocytosis upon CFTR inhibition, concomitant with a decreased number of docked insulin granules. Finally, our studies demonstrate that CFTR acts upstream of the chloride channel Anoctamin 1 (ANO1; TMEM16A) in the regulation of cAMP- and glucose-stimulated insulin secretion.
Our work demonstrates a novel function for CFTR as a regulator of pancreatic beta-cell insulin secretion and exocytosis, and puts forward a role for CFTR as a regulator of ANO1 and of downstream priming of insulin granules prior to fusion and release of insulin. The pronounced regulatory effect of CFTR on insulin secretion is consistent with the impaired insulin secretion seen in patients with CF.
CFTR; Cystic Fibrosis; Diabetes; Insulin secretion; Islet of Langerhans; Beta-cell; Exocytosis
Appropriate public health responses to infectious disease threats should be based on best-available evidence, which requires timely, reliable data for appropriate analysis. During the early stages of epidemics, analysis of ‘line lists’ with detailed information on laboratory-confirmed cases can provide important insights into the epidemiology of a specific disease. The objective of the present study was to investigate the extent to which reliable epidemiologic inferences could be made from publicly available epidemiologic data of human infection with influenza A(H7N9) virus.
We collated and compared six different line lists of laboratory-confirmed human cases of influenza A(H7N9) virus infection in the 2013 outbreak in China, including the official line list constructed by the Chinese Center for Disease Control and Prevention plus five other line lists by HealthMap, Virginia Tech, Bloomberg News, the University of Hong Kong and FluTrackers, based on publicly available information. We characterized clinical severity and transmissibility of the outbreak, using line lists available at specific dates to estimate epidemiologic parameters, to replicate real-time inferences on the hospitalization fatality risk, and the impact of live poultry market closure.
Demographic information was mostly complete (less than 10% missing for all variables) in different line lists, but there were more missing data on dates of hospitalization, discharge and health status (more than 10% missing for each variable). The estimated onset to hospitalization distributions were similar (median ranged from 4.6 to 5.6 days) for all line lists. Hospital fatality risk was consistently around 20% in the early phase of the epidemic for all line lists and approached the final estimate of 35% afterwards for the official line list only. Most of the line lists estimated >90% reduction in incidence rates after live poultry market closures in Shanghai, Nanjing and Hangzhou.
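One of the line-list computations above, the onset-to-hospitalization delay, is simple to sketch while tolerating the missing dates the study describes. The rows below are invented for illustration, not real H7N9 cases:

```python
# Median onset-to-hospitalization delay from a toy line list,
# skipping records with missing dates. Rows are invented.
from datetime import date
from statistics import median

line_list = [
    {"onset": date(2013, 3, 1), "hospitalized": date(2013, 3, 6)},
    {"onset": date(2013, 3, 4), "hospitalized": date(2013, 3, 8)},
    {"onset": date(2013, 3, 7), "hospitalized": None},   # missing date
    {"onset": date(2013, 3, 9), "hospitalized": date(2013, 3, 15)},
]

delays = [
    (row["hospitalized"] - row["onset"]).days
    for row in line_list
    if row["onset"] and row["hospitalized"]
]
print(delays, median(delays))
```

Records with missing hospitalization dates simply drop out of the delay estimate, which is why the completeness of date fields, flagged above as the weakest part of the public line lists, directly limits this kind of inference.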
We demonstrated that analysis of publicly available data on H7N9 permitted reliable assessment of transmissibility and geographical dispersion, whereas assessment of clinical severity was less straightforward. Our results highlight the potential value of constructing a minimum dataset with a standardized format and definitions, and with regular updates of patient status. Such an approach could be particularly useful for diseases that spread across multiple countries.
Epidemiological monitoring; Line list; Infectious disease outbreak; Influenza A virus; H7N9 subtype
Wheat, once considered a critical ingredient of a healthy diet, has become a major threat in the eyes of the public. The term non-celiac gluten sensitivity has been widely adopted to describe a clinical entity characterized by symptoms induced by gluten in the absence of the diagnostic criteria of other gluten-related disorders. However, it has not been shown that gluten per se is involved, and it is debatable whether the condition is a disease. Nevertheless, a large number of individuals go gluten-free, avoiding wheat, rye and barley, even without a defined medical cause. In a study in BMC Medicine, Volta and colleagues from Italy report on a large, multicenter attempt to estimate the prevalence of non-celiac gluten sensitivity in secondary gastroenterology care. They found that approximately 3% of their more than 12,000 patients fulfilled their criteria for non-celiac gluten sensitivity. However, we are still challenged with finding stricter clinical criteria for the condition, developing a usable clinical approach to gluten challenge in these individuals, and understanding its pathogenesis.
Please see related article http://www.biomedcentral.com/1741-7015/12/85.
Celiac disease; Diagnosis; FODMAP; Gluten; Gluten-free diet; Irritable bowel syndrome; Multicenter study; Non-celiac gluten sensitivity
Non-celiac gluten sensitivity (NCGS) is still an ill-defined syndrome with several unsettled issues despite increasing awareness of its existence. We carried out a prospective survey on NCGS in Italian centers for the diagnosis of gluten-related disorders, with the aims of defining the clinical picture of this new syndrome and of roughly establishing its prevalence compared with celiac disease.
From November 2012 to October 2013, 38 Italian centers (27 adult gastroenterology, 5 internal medicine, 4 pediatrics, and 2 allergy) participated in this prospective survey. A questionnaire was used to allow uniform and accurate collection of clinical, biochemical, and instrumental data.
In total, 486 patients with suspected NCGS were identified in this 1-year period. The female/male ratio was 5.4 to 1, and the mean age was 38 years (range 3–81). The clinical picture was characterized by combined gastrointestinal (abdominal pain, bloating, diarrhea and/or constipation, nausea, epigastric pain, gastroesophageal reflux, aphthous stomatitis) and systemic manifestations (tiredness, headache, fibromyalgia-like joint/muscle pain, leg or arm numbness, 'foggy mind,' dermatitis or skin rash, depression, anxiety, and anemia). In the large majority of patients, the time lapse between gluten ingestion and the appearance of symptoms varied from a few hours to 1 day. The most frequent associated disorders were irritable bowel syndrome (47%), food intolerance (35%) and IgE-mediated allergy (22%). An associated autoimmune disease was detected in 14% of cases. Regarding family history, 18% of our patients had a relative with celiac disease, but no correlation was found between NCGS and positivity for HLA-DQ2/-DQ8. IgG anti-gliadin antibodies were detected in 25% of the patients tested. Only a proportion of patients underwent duodenal biopsy; for those who did, the biopsies showed normal intestinal mucosa (69%) or a mild increase in intraepithelial lymphocytes (31%). The ratio of suspected NCGS to new celiac disease diagnoses, assessed in 28 of the participating centers, was 1.15 to 1.
This prospective survey shows that NCGS is strongly associated with female sex and adult age. Based on our results, the prevalence of NCGS seems to be only slightly higher than that of celiac disease.
Please see related article http://www.biomedcentral.com/1741-7015/12/86.
Non-celiac gluten sensitivity; Celiac disease; Prospective survey; Clinical picture; Duodenal biopsy; Anti-gliadin antibodies
In recent years, different approaches to large-scale mental health service provision for children in war-affected, mainly low- and middle-income, countries have been developed. Some school-based programs aiming both to strengthen resilience and to reduce symptoms of trauma-related distress have been evaluated. In an article published in BMC Medicine, Tol and colleagues integrate their findings on the efficacy of a universal school-based intervention across four countries and do not recommend classroom-based intervention as a treatment for trauma-related symptoms, since no consistent positive effects were found. Worse, for some children this type of universal intervention may impair recovery. Since universal school-based programs similar to the one evaluated here are widely implemented, Tol et al.’s results are highly relevant to inform the field of mental health service provision in war-affected countries.
Please see related article http://www.biomedcentral.com/1741-7015/12/56.
Trauma; Children; War; PTSD; Depression; Treatment; Prevention; Trial; Effectiveness
Primary focal segmental glomerulosclerosis (FSGS) accounts for nearly 10% of patients who require renal replacement therapy. Elevated circulating levels of soluble urokinase receptor (suPAR) have been identified as a biomarker to discriminate primary FSGS from other glomerulopathies, although subsequent reports have questioned the diagnostic utility of this test. In a study in BMC Medicine, Huang et al. demonstrate that urinary suPAR excretion assists in distinguishing primary FSGS from other glomerular diseases, and that high plasma suPAR concentrations are not directly linked to a decline in glomerular filtration rate (GFR). This observation suggests that further investigation of suPAR in patients with FSGS is warranted. It should be interpreted in light of a recent report that B7-1 is expressed in the podocytes of a subset of patients with FSGS, and that blocking this molecule may represent the first successful targeted intervention for the disease. These advances highlight the rapid pace of scientific progress in nephrology. Nephrologists should work together, share resources, and expedite the design of protocols to evaluate these novel biomarkers in a comprehensive and scientifically valid manner.
Please see related article http://www.biomedcentral.com/1741-7015/12/81.
Focal segmental glomerulosclerosis (FSGS); Soluble urokinase receptor (suPAR); Podocyte; B7-1
Focal segmental glomerulosclerosis (FSGS) is a major cause of end-stage renal disease. Recent studies have proposed that plasma soluble urokinase receptor (suPAR) might be a causative circulating factor, but this proposal has caused controversy. This study aimed to measure urinary suPAR levels in patients with primary FSGS and to explore their significance in the pathogenesis of the disease.
Sixty-two patients with primary FSGS, diagnosed between January 2006 and January 2012, with complete clinical and pathologic data were enrolled, together with disease and normal controls. Urinary suPAR levels were measured using commercial ELISA kits and normalized to urinary creatinine (Cr). The associations between urinary suPAR levels and clinical data at presentation and during follow-up were analyzed. Conditionally immortalized human podocytes were used to study the effect of urinary suPAR on activation of β3 integrin, detected by AP5 staining.
The urinary suPAR level of patients with primary FSGS (500.56, IQR 262.78 to 1,059.44 pg/μmol Cr) was significantly higher than that of patients with minimal change disease (307.86, IQR 216.54 to 480.18 pg/μmol Cr, P = 0.033), membranous nephropathy (250.23, IQR 170.37 to 357.59 pg/μmol Cr, P <0.001), secondary FSGS (220.45, IQR 149.38 to 335.54 pg/μmol Cr, P <0.001) and normal subjects (183.59, IQR 103.92 to 228.78 pg/μmol Cr, P <0.001). The urinary suPAR level of patients with the cellular variant was significantly higher than that of patients with the tip variant. The urinary suPAR level in the patients with primary FSGS was positively correlated with 24-hour urine protein (r = 0.287, P = 0.024). During follow-up, the urinary suPAR level of patients with complete remission decreased significantly (661.19, IQR 224.32 to 1,115.29 pg/μmol Cr versus 217.68, IQR 121.77 to 415.55 pg/μmol Cr, P = 0.017). The AP5 signal was strongly induced along the cell membrane when human differentiated podocytes were incubated with the urine of patients with FSGS at presentation, and the signal could be reduced by a blocking antibody specific to uPAR.
Urinary suPAR was specifically elevated in patients with primary FSGS and was associated with disease severity. The elevated urinary suPAR could activate β3 integrin on human podocytes.
Please see related article http://www.biomedcentral.com/1741-7015/12/82.
Focal segmental glomerulosclerosis; Urinary soluble urokinase receptor; Podocyte
Rapid risk stratification is a core task in emergency medicine. Identifying patients at high and low risk shortly after admission could support clinical decision-making regarding treatment, level of observation, allocation of resources and post-discharge follow-up. The purpose of the present study was to determine whether plasma copeptin measurement predicts short-, mid- and long-term mortality in unselected admitted patients.
Consecutive patients >40 years old admitted to an inner-city hospital were included. Within the first 24 hours after admission, a structured medical interview was conducted and self-reported medical history was recorded. All patients underwent a clinical examination, an echocardiographic evaluation and collection of blood for later measurement of risk markers.
Plasma for copeptin measurement was available from 1,320 patients (mean age 70.5 years, 59.4% women). Median follow-up time was 11.5 years (range 11.0 to 12.0 years). Copeptin was considered elevated when above the 97.5th percentile in healthy individuals (>11.3 pmol/L), which was the case in 627 patients (47.5%).
Mortality within the first week was 2.7% (17/627) for patients with elevated copeptin (>11.3 pmol/L) compared with 0.1% (1/693) for patients with normal copeptin concentrations (≤11.3 pmol/L) (P <0.01). Three-month mortality was 14.5% (91/627) for patients with elevated copeptin compared with 3.2% (22/693) for patients with normal copeptin. The corresponding figures for one-year mortality and for the entire observation period were 27.6% (173/627) versus 8.7% (60/693) and 82.9% (520/627) versus 57.5% (398/693) (P <0.01 for both), respectively.
Multivariable Cox regression analyses showed that elevated copeptin was significantly and independently related to short-, mid- and long-term mortality. Adjusted hazard ratios were 2.4 for three-month mortality, 1.9 for one-year mortality and 1.4 for mortality over the entire observation period.
In patients admitted to an inner-city hospital, copeptin was strongly associated with short-, mid- and long-term mortality. The results suggest that rapid copeptin measurement could be a useful tool for both disposition in an emergency department and for mid- and long-term risk assessment.
Biomarker; Mortality; Inflammation
A subset of children with autism spectrum disorders (ASD), an estimated 15% to 30% of patients, show a significant and persistent regression in speech and social function during early childhood. There are no established treatments for the regressive symptoms. However, there are some known causes of this type of regression, such as Rett syndrome and Landau-Kleffner syndrome (LKS), and in LKS steroids have been used as a treatment. Some evidence suggests an autoimmune contribution to the pathophysiology of autism (Chez MG, Guido-Estrada N: Immune therapy in autism: historical experience and future directions with immunomodulatory therapy. Neurotherapeutics 2010, 7:293–301; Wasilewska J, Kaczmarski M, Stasiak-Barmuta A, Tobolczyk J, Kowalewska E: Low serum IgA and increased expression of CD23 on B lymphocytes in peripheral blood in children with regressive autism aged 3-6 years old. Arch Med Sci 2012, 8:324–331; Stefanatos G: Changing perspectives on Landau-Kleffner syndrome. Clin Neuropsychol 2011, 25:963–988), raising the possibility that steroids might be a useful therapy for regression in ASD. A retrospective study published in BMC Neurology by Duffy et al. (Duffy, et al: Corticosteroid therapy in regressive autism: A retrospective study of effects on the Frequency Modulated Auditory Evoked Response (FMAER), language, and behavior. BMC Neurol 2014, 14:70) reviewed 20 steroid-treated patients with regressive ASD (STAR) and 24 ASD control patients not treated with steroids (NSA). Improvements in clinical function and in a neurophysiological biomarker were seen in the steroid-treated children from pre- to post-prednisolone treatment. This research provides a rationale for a randomized trial of steroid therapy to determine the longer-term benefits and complications of steroids in this population.
Please see related article http://www.biomedcentral.com/1471-2377/14/70/abstract.
Autism; Regression; Corticosteroids; FMAER; Language; Behavior
It is unknown whether individuals at high cardiovascular risk derive a cardiovascular benefit from increased olive oil consumption. The aim was to assess the association of total olive oil intake and its varieties (extra-virgin and common olive oil) with the risk of cardiovascular disease and mortality in a Mediterranean population at high cardiovascular risk.
We included 7,216 men and women at high cardiovascular risk, aged 55 to 80 years, from the PREvención con DIeta MEDiterránea (PREDIMED) study, a multicenter, randomized, controlled clinical trial. Participants were randomized to one of three interventions: a Mediterranean diet supplemented with nuts, a Mediterranean diet supplemented with extra-virgin olive oil, or a control low-fat diet. The present analysis was conducted as an observational prospective cohort study with a median follow-up of 4.8 years. Cardiovascular disease (stroke, myocardial infarction and cardiovascular death) and mortality were ascertained from medical records and the National Death Index. Olive oil consumption was evaluated with validated food frequency questionnaires. Multivariate Cox proportional hazards models and generalized estimating equations were used to assess the association of baseline and yearly repeated measurements of olive oil intake with cardiovascular disease and mortality.
During follow-up, 277 cardiovascular events and 323 deaths occurred. Participants in the highest energy-adjusted tertile of baseline total olive oil and extra-virgin olive oil consumption had a 35% (HR: 0.65; 95% CI: 0.47 to 0.89) and a 39% (HR: 0.61; 95% CI: 0.44 to 0.85) lower risk of cardiovascular disease, respectively, compared with the reference tertile. Higher baseline total olive oil consumption was also associated with a 48% (HR: 0.52; 95% CI: 0.29 to 0.93) lower risk of cardiovascular mortality. For each 10 g/d increase in extra-virgin olive oil consumption, cardiovascular disease and mortality risk decreased by 10% and 7%, respectively. No significant associations were found for cancer or all-cause mortality. The associations between cardiovascular events and extra-virgin olive oil intake were significant in the Mediterranean diet intervention groups but not in the control group.
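For readers unfamiliar with the notation, the percent risk reductions quoted above follow directly from the hazard ratios as (1 − HR) × 100. The snippet below is a reading aid only, not part of the study's analysis:

```python
# Percent risk reduction implied by a hazard ratio: (1 - HR) * 100, rounded.
def risk_reduction_pct(hazard_ratio: float) -> int:
    return round((1 - hazard_ratio) * 100)

print(risk_reduction_pct(0.65))  # total olive oil, cardiovascular disease -> 35
print(risk_reduction_pct(0.61))  # extra-virgin olive oil, cardiovascular disease -> 39
print(risk_reduction_pct(0.52))  # total olive oil, cardiovascular mortality -> 48
```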
Olive oil consumption, specifically the extra-virgin variety, is associated with reduced risks of cardiovascular disease and mortality in individuals at high cardiovascular risk.
This study was registered at controlled-trials.com (http://www.controlled-trials.com/ISRCTN35739639). International Standard Randomized Controlled Trial Number (ISRCTN): 35739639. Registration date: 5 October 2005.
Olive oil; Cardiovascular; Mortality; Mediterranean Diet; PREDIMED
Hyperglycemia is associated with an increased risk of all-site cancer that may be mediated through activation of the renin-angiotensin system (RAS) and 3-hydroxy-3-methylglutaryl-coenzyme A reductase (HMGCR) pathways. We examined the joint associations of optimal glycemic control (HbA1c <7%), RAS inhibitors and HMGCR inhibitors with cancer incidence in patients with type 2 diabetes.
Patients with type 2 diabetes who had no history of cancer and no prior exposure to RAS or HMGCR inhibitors at baseline were observed between 1996 and 2005. All patients underwent a comprehensive assessment at baseline and were followed until death or until censoring in 2005.
After a median follow-up of 4.91 years (interquartile range, 2.81 to 6.98), 271 of 6,103 patients developed all-site cancer. At baseline, patients with incident cancers were older, had a longer duration of diabetes, higher alcohol and tobacco use, and higher systolic blood pressure and albuminuria, but lower triglyceride levels and estimated glomerular filtration rate (P <0.05). Patients who developed cancer during follow-up were less likely to have started using statins (22.5% versus 38.6%, P <0.001), fibrates (5.9% versus 10.2%, P = 0.02), metformin (63.8% versus 74.5%, P <0.001) or thiazolidinediones (0.7% versus 6.8%, P <0.001) than those who remained cancer-free. After adjusting for covariables, new treatment with metformin (hazard ratio: 0.39; 95% confidence interval: 0.25, 0.61; P <0.001), thiazolidinediones (0.18; 0.04, 0.72; P = 0.015), sulphonylureas (0.44; 0.27, 0.73; P = 0.014), insulin (0.58; 0.38, 0.89; P = 0.01), statins (0.47; 0.31, 0.70; P <0.001) and RAS inhibitors (0.55; 0.39, 0.78; P <0.001) was associated with reduced cancer risk. Patients with all three risk factors (HbA1c ≥7%, non-use of RAS inhibitors and non-use of statins) had a four-fold higher adjusted risk of cancer than those with none (incidence per 1,000 person-years for no risk factors: 3.40 (0.07, 6.72); one risk factor: 6.34 (4.19, 8.50); two risk factors: 8.40 (6.60, 10.20); three risk factors: 13.08 (9.82, 16.34); P <0.001).
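The incidence figures above are rates per 1,000 person-years, that is, events divided by total follow-up time and scaled by 1,000. The sketch below illustrates the computation with invented counts, not the study's data:

```python
# Incidence rate per 1,000 person-years: events / person-years * 1,000.
def incidence_per_1000_py(events: int, person_years: float) -> float:
    return events / person_years * 1000

# Hypothetical example: 12 incident cancers over 4,000 person-years of follow-up.
rate = incidence_per_1000_py(12, 4000)  # -> 3.0 per 1,000 person-years
```

Note how the reported gradient matches the rates: 13.08 / 3.40 ≈ 3.8, i.e. roughly a four-fold difference between the three-risk-factor and zero-risk-factor groups.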
Hyperglycemia may promote cancer growth, an effect that may be attenuated by optimal glycemic control and inhibition of the RAS and HMGCR pathways.
Cancer risk; Type 2 diabetes; Glycemic control; Statins; Renin-angiotensin-system inhibitors