OBJECTIVE--To evaluate the prevention of respiratory complications after abdominal surgery by a comparison of a global policy of incentive spirometry with a regimen consisting of deep breathing exercises for low risk patients and incentive spirometry plus physiotherapy for high risk patients. DESIGN--Stratified randomised trial. SETTING--General surgical service of an urban teaching hospital. PATIENTS--456 patients undergoing abdominal surgery. Patients less than 60 years of age with an American Society of Anesthesiologists (ASA) classification of 1 were considered to be at low risk. OUTCOME MEASURES--Respiratory complications were defined as clinical features consistent with collapse or consolidation, a temperature above 38 degrees C, plus either confirmatory chest radiology or positive results on sputum microbiology. We also recorded the time that staff devoted to prophylactic respiratory therapy. RESULTS--There was good baseline equivalence between the groups. The incidence of respiratory complications was 15% (35/231) for patients in the incentive spirometry group and 12% (28/225) for patients in the mixed therapy group (P = 0.40; 95% confidence interval -3.6% to 9.0%). Similar amounts of staff time were required to provide incentive spirometry and deep breathing exercises for low risk patients. The inclusion of physiotherapy for high risk patients, however, required an extra 30 minutes of staff time per patient. CONCLUSIONS--When the use of resources is taken into account, the most efficient regimen of prophylaxis against respiratory complications after abdominal surgery is deep breathing exercises for low risk patients and incentive spirometry for high risk patients.
According to the World Health Organization, the alarming rise of antibiotic resistance among pathogens worldwide has become a serious threat to our ability to treat infectious diseases. Extensive use of antibiotics by livestock producers promotes the spread of new resistant strains, some of zoonotic concern, which increases food-borne illness in humans and places a significant economic burden on healthcare systems. Furthermore, consumer preference for meat, poultry and fish produced without antibiotics shapes today’s market demand. The One Health Initiative therefore regards it as inevitable that the use of antibiotics must be reduced in favour of alternative, improved means of disease control: vaccination and prophylactics. Beyond the intense research focused on novel therapeutic molecules, both of these strategies rely heavily on the availability of cost-effective, efficient and scalable production platforms that allow large-volume manufacturing of vaccines, antibodies and other biopharmaceuticals. Within this context, plant-based platforms for the production of recombinant therapeutic proteins offer significant advantages over conventional expression systems, including freedom from animal pathogens, low production costs, fast turnaround and response times, and rapid, nearly unlimited scalability. Moreover, because dried leaves and seeds can be stored at room temperature for lengthy periods without loss of recombinant protein, plant expression systems could support the development of edible vaccines and prophylactics, which would not require “cold chain” storage and transportation and could be administered in mass volumes with minimal processing. Several biotechnology companies have already developed and adopted plant-based platforms for commercial production of recombinant protein therapeutics.
In this manuscript, we outline the challenges in the process of livestock immunization, as well as the current plant biotechnology developments aimed at addressing these challenges.
Globally, Nigeria had the fourth highest incidence of tuberculosis (TB) in 2009. Datasets from the 2008 Nigeria Demographic and Health Survey (NDHS) were used to examine factors associated with respondents’ knowledge of and attitude towards TB in Nigeria. The sample comprised 47,193 male and female respondents, all aged 15-49 years. Factors associated with knowledge of and attitude towards TB were examined against a set of individual-, household- and community-level variables, using multiple binary logistic regression analyses. Overall, 74.7% of respondents reported having ever heard of TB. Of those who had ever heard of TB, 76.9% believed that TB can be cured, 19.6% would want a family member's TB to be kept secret, and 63.1% believed that TB is spread from person to person through the air by coughing or sneezing. Multivariate analysis indicated that the probability of poor knowledge of and negative attitude towards TB was consistently significant among the poorest households (lowest wealth quintile), certain geopolitical regions (North Central), respondents with no schooling, non-working respondents, the youngest age-group (15-19 years), and rural areas [adjusted odds ratios (AOR)=0.76, 95% CI 0.66-0.86 for respondents who had ever heard of TB; AOR=0.89, 95% CI 0.80-0.99 for those who had ever heard of TB and believed that TB can be cured; AOR=0.83, 95% CI 0.73-0.94 for those who had ever heard of TB and would conceal a family member's TB; and AOR=0.88, 95% CI 0.78-0.99 for those who had ever heard of TB and believed TB is spread from person to person through the air by coughing or sneezing]. Efforts to improve knowledge of and attitude towards TB in Nigeria should focus on the youngest age-group (15-19 years), the poorest households, and respondents with no schooling.
Improving the knowledge and attitude of these groups may increase the number of people who seek treatment early.
Attitude; Determinants; Knowledge; Tuberculosis; Nigeria
The following are the proceedings of a symposium held at the Second International Congress for Respiratory Science in Bad Honnef, Germany. The goals of the symposium were to delineate the blood-gas barrier phenotype across vertebrate species; to delineate the interrelationship between the evolution of the blood-gas barrier, locomotion and metabolism; to introduce the selection pressures for the evolution of the surfactant system as a key to understanding the physiology of the blood-gas barrier; to introduce the lung lipofibroblast and its product, leptin, which coordinately regulates pulmonary surfactant, type IV collagen in the basement membrane and host defense, as the cell-molecular site of selection pressure for the blood-gas barrier; to drill down to the gene regulatory network(s) involved in leptin signaling and the blood-gas barrier phenotype; and to extend the relationship between leptin and the blood-gas barrier to diving mammals.
The Liverpool Care Pathway for the Dying Patient (LCP) aims to transfer hospice principles of care for dying patients to other health-care sectors. This post-bereavement survey explored the LCP's effectiveness in improving quality of care for cancer patients.
Postal self-completion questionnaires were sent to 778 next of kin of consecutive deceased patients who had died an ‘expected’ cancer death in a hospice or an acute tertiary hospital.
Following exclusions (n=53), 255 of the 725 next of kin agreed to participate (35.2% response rate). Overall, hospice participants reported the best quality of care, and hospital participants, for whom care was not supported by the LCP, reported the worst quality of care. Multivariate analysis showed the hospice was an independent predictor for patients being treated with dignity (OR 8.46) and receiving adequate family support (OR 7.18) (P<0.0001). Care supported by the LCP and the hospital specialist palliative care team were both associated with good family support, but neither was an independent predictor.
From the bereaved relatives' perspective, within the hospital, the LCP is effective in improving specific aspects of care, such as symptom control for dying patients. Further improvement is required, however, to attain the hospice standard of care.
dying; evaluation; Liverpool Care Pathway; post-bereavement survey; proxy; quality of care
Delirium, an acute organ dysfunction, is common among critically ill patients leading to significant morbidity and mortality; its epidemiology in a mixed cardiology and cardiac surgery intensive care unit (CVICU) is not well established. We sought to determine the prevalence and risk factors for delirium among CVICU patients.
Prospective observational study.
27-bed medical-surgical CVICU.
200 consecutive patients with an expected CVICU length of stay >24 hours.
Baseline demographic data and daily assessments for delirium using the validated and reliable Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) were recorded, and quantitative tracking of delirium risk factors was conducted. Separate analyses studied the role of admission risk factors for occurrence of delirium during the CVICU stay and identified daily occurring risk factors for the development of delirium on a subsequent CVICU day.
Prevalence of delirium was 26%, similar among cardiology and cardiac surgical patients. Nearly all (92%) exhibited the hypoactive subtype of delirium. Benzodiazepine use on admission was independently predictive of a 3-fold increased risk of delirium [odds ratio 3.1 (95% CI 1.0, 9.4), p=0.04] during the CVICU stay. Of the daily occurring risk factors, patients who received benzodiazepines [2.6 (1.2, 5.7), p=0.02] or had restraints or devices that precluded mobilization [2.9 (1.3, 6.5), p<0.01] were more likely to have delirium the following day. Hemodynamic status was not associated with delirium.
Delirium occurred in 1 in 4 patients in the CVICU and was predominately hypoactive in subtype. Chemical restraints via use of benzodiazepines or the use of physical restraints/restraining devices predisposed patients to a greater risk of delirium, pointing to areas of quality improvement that would be new to the vast majority of CVICUs.
delirium; cardiovascular intensive care unit; acute coronary syndrome; cardiac surgery; benzodiazepines; restraints
Prostatic abscess is a rarely described condition, commonly caused by gram-negative organisms such as enterobacteria. However, as the prevalence of methicillin-resistant Staphylococcus aureus (MRSA) increases in the community, reports of unusual infections due to this organism have recently been published. Here, we describe a patient with type 2 diabetes mellitus who presented with diabetic ketoacidosis, later found to be due to a prostatic abscess from which MRSA was cultured.
One of the challenges in evaluating new otoprotective agents for potential benefit in human populations is the availability of an established clinical paradigm with real-world relevance. These studies were explicitly designed to develop a real-world digital music exposure that reliably induces temporary threshold shift (TTS) in normal-hearing human subjects.
Thirty-three subjects participated in studies that measured effects of digital music player use on hearing. Subjects selected either rock or pop music, which was then presented at 93–95 (n=10), 98–100 (n=11), or 100–102 (n=12) dBA in-ear exposure level for a period of four hours. Audiograms and distortion product otoacoustic emissions (DPOAEs) were measured prior to and after music exposure. Post-music tests were initiated 15 min, 1 hr 15 min, 2 hr 15 min, and 3 hr 15 min after the exposure ended. Additional tests were conducted the following day and one week later.
Changes in thresholds after the lowest level exposure were difficult to distinguish from test-retest variability; however, TTS was reliably detected after higher levels of sound exposure. Changes in audiometric thresholds had a “notch” configuration, with the largest changes observed at 4 kHz (mean=6.3±3.9 dB; range=0–13 dB). Recovery was largely complete within the first 4 hours post-exposure, and all subjects showed complete recovery of both thresholds and DPOAE measures when tested 1 week post-exposure.
These data provide insight into the variability of TTS induced by music player use in a healthy, normal-hearing, young adult population, with music playlist, level, and duration carefully controlled. These data confirm the likelihood of temporary changes in auditory function following digital music player use. Such data are essential for the development of a human clinical trial protocol that provides a highly powered design for evaluating novel therapeutics in human clinical trials. Care must be taken to fully inform potential subjects in future TTS studies, including protective agent evaluations, that some noise exposures have resulted in neural degeneration in animal models, even when both audiometric thresholds and DPOAE levels returned to pre-exposure values.
music; hearing loss; digital audio player; MP3; distortion product otoacoustic emission; temporary threshold shift; TTS
As degradation of formalin-fixed paraffin-embedded (FFPE) samples limits the ability to profile mRNA expression, we explored factors predicting the success of mRNA expression profiling of FFPE material and investigated an approach to overcome the limitation.
Bladder (n=140, stored 3–8 years) and cervix (n=160, stored 8–23 years) carcinoma FFPE samples were hybridised to Affymetrix Exon 1.0ST arrays. Percentage detection above background (%DABG) measured technical success. Biological signal was assessed by distinguishing cervix squamous cell carcinoma (SCC) and adenocarcinoma (AC) using a gene signature. As miR-205 had been identified as a marker of SCC, precursor mir-205 was measured by Exon array and mature miR-205 by qRT–PCR. Genome-wide microRNA (miRNA) expression (Affymetrix miRNA v2.0 arrays) was compared in eight newer FFPE samples with biological signal and eight older samples without.
RNA quality controls (QCs) (e.g., RNA integrity number (RIN)) failed to predict profiling success, but sample age correlated with %DABG in bladder (R=−0.30, P<0.01) and cervix (R=−0.69, P<0.01). Biological signal was lost in older samples, and neither the signature nor precursor mir-205 separated samples by histology. miR-205 qRT–PCR discriminated SCC from AC, validated by miRNA profiling (26-fold higher in SCC; P=1.10 × 10⁻⁵). Genome-wide miRNA (R=0.95) and small nucleolar RNA (R=0.97) expression correlated well between the eight newer and eight older FFPE samples, and better than mRNA expression (R=0.72).
Sample age is the best predictor of successful mRNA profiling of FFPE material, and miRNA profiling overcomes the limitation of age and copes well with older samples.
microRNA; FFPE; degradation; microarray; snoRNA; profiling
Small recombinant antibody fragments (e.g. scFvs and VHHs), which are highly tissue permeable, are being investigated for antivenom production, as conventional antivenoms consisting of IgG or F(ab’)2 antibody fragments do not effectively neutralize venom toxins located in deep tissues. However, antivenoms composed entirely of small antibody fragments may have poor therapeutic efficacy due to their short serum half-lives. To increase serum persistence while maintaining tissue penetration, we prepared low and high molecular mass antivenom antibodies. Four llama VHHs were isolated from an immune VHH-displayed phage library and were shown to have high affinity, in the low nM range, for α-cobratoxin (α-Cbtx), the most lethal component of Naja kaouthia venom. Subsequently, our highest affinity VHH (C2) was fused to a human Fc fragment to create a VHH2-Fc antibody that would offer prolonged serum persistence. After in planta (Nicotiana benthamiana) expression and purification, we show that our VHH2-Fc antibody retained high-affinity binding to α-Cbtx. Mouse α-Cbtx challenge studies showed that our highest affinity VHHs (C2 and C20) and the VHH2-Fc antibody effectively neutralized lethality induced by α-Cbtx at an antibody:toxin molar ratio as low as approximately 0.75:1. Further research towards the development of an antivenom therapeutic involving these anti-α-Cbtx VHHs and VHH2-Fc antibody molecules should involve testing them in combination, to determine whether they maintain tissue penetration capability and low immunogenicity, and whether they exhibit improved serum persistence and therapeutic efficacy.
The microbiota contributes to the induction of both effector and regulatory responses in the gastrointestinal tract. However, the mechanisms controlling these distinct properties remain poorly understood. We previously showed that commensal DNA promotes intestinal immunity. Here, we find that the capacity of bacterial DNA to stimulate immune responses is species specific and correlates with the frequency of motifs known to exert immunosuppressive function. In particular, we show that the DNA of Lactobacillus species, including various probiotics, is enriched in suppressive motifs able to inhibit lamina propria DC activation. In addition, immunosuppressive oligonucleotides sustain Treg cell conversion during inflammation and limit pathogen-induced immunopathology and colitis. Altogether, our findings identify DNA suppressive motifs as a molecular ligand expressed by commensals and support the idea that a balance between stimulatory and regulatory DNA motifs contributes to the induction of controlled immune responses in the GI tract and gut immune homeostasis. Further, our findings suggest that the endogenous regulatory capacity of DNA motifs enriched in some commensal bacteria could be exploited for therapeutic purposes.
Auxinic herbicides are widely used in agriculture to selectively control broadleaf weeds. Prolonged use of auxinic herbicides has resulted in the evolution of resistance to these herbicides in some biotypes of Brassica kaber (wild mustard), a common weed in agricultural crops. In this study, auxinic herbicide resistance from B. kaber was transferred to Brassica juncea and Brassica rapa, two commercially important Brassica crops, by traditional breeding coupled with in vitro embryo rescue. A high frequency of embryo regeneration and hybrid plant establishment was achieved. Transfer of auxinic herbicide resistance from B. kaber to the hybrids was assessed by whole-plant screening of hybrids with dicamba, a widely used auxinic herbicide. Furthermore, the hybrids were tested for fertility (both pollen and pistil) and their ability to produce backcross progeny. The auxinic herbicide-resistant trait was introgressed into B. juncea by backcross breeding. DNA ploidy of the hybrids as well as of the backcross progeny was estimated by flow cytometry. Creation of auxinic herbicide-resistant Brassica crops by non-transgenic approaches should facilitate effective weed control, encourage less tillage, provide herbicide rotation options, minimize occurrence of herbicide resistance, and increase acceptance of these crops.
Auxinic herbicides; Embryo rescue; Dicamba; Introgression
In low-income countries, surgical site infections (SSIs) are a very frequent form of hospital-acquired infection. Surveillance is an important method for controlling SSI but it is unclear how this can best be performed in low-income settings.
To examine the epidemiological characteristics of various components of an SSI surveillance programme in a single Kenyan hospital.
The study assessed the inter-observer consistency of the surgical wound class (SWC) and American Society of Anesthesiologists (ASA) scores using the kappa statistic. Post-discharge telephone calls were evaluated against an outpatient clinician review ‘gold standard’. The predictive value of components of the Centers for Disease Control and Prevention – National Healthcare Safety Network (CDC-NHSN) risk index was examined in patients having major obstetric or gynaecological surgery (O&G) between August 2010 and February 2011.
After appropriate training, surgeons and anaesthetists were found to be consistent in their use of the SWC and ASA scores respectively. Telephone calls were found to have a sensitivity of 70% [95% confidence interval (CI): 47–87] and a specificity of 100% (95% CI: 95–100) for detection of post-discharge SSI in this setting. In 954 patients undergoing major O&G operations, the SWC score was the only parameter in the CDC-NHSN risk index model associated with the risk of SSI (odds ratio: 4.00; 95% CI: 1.21–13.2; P = 0.02).
Surveillance for SSI can be conducted in a low-income hospital setting, although dedicated staff, intensive training and local modifications to surveillance methods are necessary. Surveillance for post-discharge SSI using telephone calls is imperfect but provides a practical alternative to clinic-based diagnosis. The SWC score was the only predictor of SSI risk in O&G surgery in this context.
Epidemiology; Kenya; Sub-Saharan Africa; Surgical site infection; Surveillance
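The sensitivity and specificity figures reported for telephone surveillance follow directly from a 2×2 comparison against the clinic ‘gold standard’. As a minimal sketch, with hypothetical counts chosen only to reproduce figures of the same magnitude as those reported (not the study's data):

```python
# Sensitivity and specificity of a screening method from a 2x2 comparison
# against a gold standard. The counts below are hypothetical.

def sensitivity(tp, fn):
    """Proportion of true cases the screening method detects."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Proportion of non-cases the screening method correctly rules out."""
    return tn / (tn + fp)

# 14 of 20 true SSIs detected; 80 of 80 non-cases correctly ruled out
print(f"sensitivity={sensitivity(14, 6):.0%}")   # sensitivity=70%
print(f"specificity={specificity(80, 0):.0%}")   # specificity=100%
```

The confidence intervals quoted in the abstract would additionally require an interval method for proportions (e.g. exact binomial), which depends on the actual cell counts.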
We have performed a meta-analysis of cancer risk associated with the rs17878362 polymorphism of the TP53 suppressor gene (PIN3, polymorphism in intron 3: a 16 bp sequence insertion/duplication in intron 3), using a compilation of 25 published studies with 10 786 cases and 11 760 controls. Homozygous carriers of the duplicated allele (A2A2) had a significantly increased cancer risk compared with A1A1 carriers (aggregated odds ratio (OR)=1.45, 95% confidence interval (CI)=1.22–1.74). However, there was no significant effect for A1A2 heterozygotes (A1A2 versus A1A1, aggregated OR=1.08, 95% CI=0.99–1.18). No significant heterogeneity or publication bias was detected in the data set analysed. When comparing population groups, increased cancer risk was associated with A2A2 carriage in Indian, Mediterranean and Northern European populations, but not in the Caucasian population of the United States. Analysis by cancer site showed an increased risk for A2A2 carriers for breast and colorectal cancers, but not for lung cancers. These results support the conclusion that the A2A2 genotype of rs17878362 is associated with increased cancer risk, with population- and tumour-specific effects.
TP53; rs17878362; intron3; PIN3
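The per-study odds ratios pooled in a meta-analysis such as the one above are conventionally computed from 2×2 genotype-by-outcome tables, with confidence intervals from the log-odds-ratio normal approximation. A minimal sketch with illustrative counts (not data from any of the 25 studies):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with 95% CI from a 2x2 table, using the log-OR
    normal approximation (Woolf's method):
        a = cases with genotype,    b = controls with genotype,
        c = cases without genotype, d = controls without genotype."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Illustrative counts only
or_, lo, hi = odds_ratio_ci(a=20, b=80, c=10, d=90)
print(f"OR={or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR=2.25, 95% CI 0.99-5.09
```

Aggregating such per-study estimates into the pooled ORs reported above would additionally require a fixed- or random-effects weighting scheme.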
The TP53 tumour-suppressor gene is expressed as several protein isoforms generated by different mechanisms, including use of alternative promoters, splicing sites and translational initiation sites, that are conserved through evolution and within the TP53 homologues, TP63 and TP73. Although first described in the eighties, the importance of p53 isoforms in regulating the suppressive functions of p53 has only become evident in the last 10 years, by analogy with observations that p63 and p73 isoforms appear indispensable for a full understanding of the biological functions of TP63 and TP73. This review summarizes recent advances in the field of ‘p53 isoforms’, including new data on p63 and p73 isoforms. Details of the alternative mechanisms that produce p53 isoforms, and of the cis- and trans-regulators identified, are provided. The main focus is on their biological functions (apoptosis, cell cycle, aging and so on) in cellular and animal models, including mouse, zebrafish and Drosophila. Finally, the deregulation of p53 isoform expression in human cancers is reviewed. Based on these latest results, several developments are expected in the future: the identification of drugs modulating p53 isoform expression; the generation of animal models; and the evaluation of p53 isoforms as biomarkers in human cancers.
tumour suppressor; p53; p63; p73; p53 family; isoforms
The aim of this study was to identify, on the basis of pre-operative imaging (within 24 hours of surgery), patients no longer requiring ureteric stone surgery prior to embarking on semirigid ureteroscopy (R-URS) for urolithiasis.
The imaging of all consecutive patients on whom R-URS for urolithiasis was performed over a 12-month period was reviewed. All patients had undergone a plain x-ray of the kidney, ureters and bladder (KUB), abdominal non-contrast computed tomography (NCCT-KUB) or both on the day of surgery.
A total of 96 patients were identified for the study. Stone sizes ranged from 3 mm to 20 mm. Thirteen patients (14%) were cancelled as no stones were identified on pre-operative imaging. Of the patients cancelled, 8 (62%) required NCCT-KUB to confirm spontaneous stone passage.
One in seven patients was stone free on the day of surgery. Identifying these patients avoids unnecessary anaesthesia and instrumentation of the urinary tract, with the associated morbidity. Up-to-date imaging prior to embarking on elective ureteric stone surgery is therefore highly recommended.
Urolithiasis; Ureteroscopy; Imaging
DB289 is the first oral drug shown in clinical trials to have efficacy in treating African trypanosomiasis (African sleeping sickness). Mild liver toxicity was noted but was not treatment limiting. However, development of DB289 was terminated when several treated subjects developed severe kidney injury, a liability not predicted from preclinical testing. We tested the hypothesis that the kidney safety liability of DB289 would be detected in a mouse diversity panel (MDP) comprising 34 genetically diverse inbred mouse strains. MDP mice received 10 days of oral treatment with DB289 or vehicle, and the classical renal biomarkers blood urea nitrogen (BUN) and serum creatinine (sCr), as well as urine biomarkers of kidney injury, were measured. While BUN and sCr remained within reference ranges, marked elevations were observed for kidney injury molecule-1 (KIM-1) in the urine of sensitive mouse strains. KIM-1 elevations were not always coincident with elevations in alanine aminotransferase (ALT), suggesting that renal injury was not linked to hepatic injury. Genome-wide association analyses of KIM-1 elevations indicated that genes participating in cholesterol and lipid biosynthesis and transport, oxidative stress, and cytokine release may play a role in DB289 renal injury. Taken together, the data resulting from this study highlight the utility of an MDP for predicting clinically relevant toxicities, identifying relevant toxicity biomarkers that may translate into the clinic, and identifying potential mechanisms underlying toxicities. In addition, the sensitive mouse strains identified in this study may be useful in screening next-in-class compounds for renal injury.
pharmacogenetic; mouse diversity panel; DB289; human African trypanosomiasis; sleeping sickness
Although Toll-like receptor 9 (TLR9) has been implicated in regulating cytokine and type I interferon (IFN) production during malaria in humans and mice, the high AT content of the Plasmodium falciparum genome prompted us to examine the possibility that malarial DNA triggers TLR9-independent DNA-sensing pathways. The P. falciparum genome contains over 6000 ATTTTTAC (“AT-rich”) motifs, which we show here potently induce type I IFNs. Parasite DNA, parasitized erythrocytes and oligonucleotides containing the AT-rich motif induce type I IFNs via a pathway that does not involve previously described sensors, including TLR9, DAI, RNA polymerase III or IFI16/p204. Rather, AT-rich DNA sensing involves an unknown receptor that couples to the STING, TBK1 and IRF3-IRF7 signaling pathway. Mice lacking both IRF3 and IRF7, the kinase TBK1 or the type I IFN receptor were resistant to otherwise lethal cerebral malaria. Collectively, these observations implicate AT-rich DNA sensing via STING, TBK1 and IRF3-IRF7 in P. falciparum malaria.
Bladder cancer is the fifth most commonly diagnosed cancer in the USA and the adult cancer with the highest average healthcare cost per patient. However, little is known about the factors influencing patients' treatment decisions, quality of life, and responses to treatment impairments. The main focus of this paper is to better understand the impact of muscle-invasive bladder cancer on patient quality of life and its added implications for primary caregivers and healthcare providers. We discuss treatment options, side effects, and the challenges that patients and family caregivers face in different phases along the disease trajectory, and we identify crucial areas of needed research.
Vitamin A and its metabolite, retinoic acid (RA), have recently been implicated in the regulation of immune homeostasis via the peripheral induction of regulatory T cells. Here we show that RA is also required to elicit proinflammatory CD4+ helper T cell responses to infection and mucosal vaccination. Retinoic acid receptor alpha (RARα) is the critical mediator of these effects. Strikingly, antagonism of RAR signaling and deficiency in RARα(Rara−/−) results in a cell autonomous CD4+ T cell activation defect. Altogether, these findings reveal a fundamental role for the RA/RARα axis in the development of both regulatory and inflammatory arms of adaptive immunity and establish nutritional status as a broad regulator of adaptive T cell responses.
Genome-wide association studies (GWAS) have demonstrated a significant polygenic contribution to bipolar disorder (BD) where disease risk is determined by the summation of many alleles of small individual magnitude. Modelling polygenic risk scores may be a powerful way of identifying disrupted brain regions whose genetic architecture is related to that of BD. We determined the extent to which common genetic variation underlying risk to BD affected neural activation during an executive processing/language task in individuals at familial risk of BD and healthy controls. Polygenic risk scores were calculated for each individual based on GWAS data from the Psychiatric GWAS Consortium Bipolar Disorder Working Group (PGC-BD) of over 16 000 subjects. The familial group had a significantly higher polygene score than the control group (P=0.04). There were no significant group by polygene interaction effects in terms of association with brain activation. However, we did find that an increasing polygenic risk allele load for BD was associated with increased activation in limbic regions previously implicated in BD, including the anterior cingulate cortex and amygdala, across both groups. The findings suggest that this novel polygenic approach to examine brain-imaging data may be a useful means of identifying genetically mediated traits mechanistically linked to the aetiology of BD.
amygdala; anterior cingulate; bipolar disorder; fMRI; polygenic
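A polygenic risk score of the kind described above is, at its core, a weighted sum of risk-allele counts, with per-SNP weights taken from discovery-GWAS effect sizes. A minimal sketch with hypothetical SNP names and weights (this is not the PGC-BD scoring pipeline):

```python
# Polygenic risk score: weighted sum of risk-allele counts across SNPs,
# with weights taken from discovery-GWAS effect sizes (e.g. log odds
# ratios). SNP identifiers and weights below are hypothetical.

def polygenic_score(genotypes, weights):
    """genotypes: SNP -> risk-allele count (0, 1 or 2)
    weights:   SNP -> per-allele effect size, e.g. log(OR)"""
    return sum(genotypes.get(snp, 0) * w for snp, w in weights.items())

weights = {"rs0001": 0.05, "rs0002": -0.02, "rs0003": 0.10}
subject = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
print(polygenic_score(subject, weights))  # ≈ 0.08
```

Real pipelines additionally prune SNPs for linkage disequilibrium and threshold them by discovery-sample P value before summing, which this sketch omits.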
Air pollution in Tehran is widely recognized as a serious environmental challenge, posing significant threats to the health of the resident population. Improving air quality will be difficult for many reasons, including climate and topography, heavy dependence on motor vehicles for mobility, and limited resources to reduce polluting emissions. Consequently, it is useful to have information regarding the scale of the health threat and the economic value of reducing that threat.
This paper integrates information on air quality, population, economic valuation, and health science to assess the most serious impact of fine particle pollution on humans, which is increased mortality risk, and provides estimates of the costs of present pollution levels, both in terms of risk and in terms of economic value relative to attaining air quality standards.
Mid-range results indicate that mortality risk for the population aged 30 and over would be reduced from 8.2 per 1,000 residents annually to 7.4 per 1,000 and that the estimated annual economic benefits of this reduced risk would be $378.5 million, if health-based World Health Organization-recommended annual average PM2.5 standards were met.
The potential public health benefits of reducing particulate air pollution are significant, and will increase with growing population.
Air pollution; Mortality; Iran
People with colorectal cancer have impaired quality of life (QoL). We investigated which factors were most strongly associated with this impairment.
Four hundred and ninety-six people with colorectal cancer completed questionnaires about QoL, functioning, symptoms, co-morbidity, cognitions and personal and social factors. Disease, treatment and co-morbidity data were abstracted from case notes. Multiple linear regression identified modifiable and unmodifiable factors independently predictive of global quality of life (EORTC-QLQ-C30).
Of unmodifiable factors, female sex (P<0.001), more self-reported co-morbidities (P=0.006) and metastases at diagnosis (P=0.036) significantly predicted poorer QoL, but explained little of the variability in the model (R2=0.064). Adding modifiable factors, poorer role (P<0.001) and social functioning (P=0.003), fatigue (P=0.001), dyspnoea (P=0.001), anorexia (P<0.001), depression (P<0.001) and worse perceived consequences (P=0.013) improved the model fit considerably (R2=0.574). Omitting functioning subscales resulted in recent diagnosis (P=0.002), lower perceived personal control (P=0.020) and travel difficulties (P<0.001) becoming significant predictors.
Most factors affecting QoL are modifiable, especially symptoms (fatigue, anorexia, dyspnoea) and depression. Beliefs about illness are also important. Unmodifiable factors, including metastatic (or unstaged) disease at diagnosis, have less impact. There appears to be potential for interventions to improve QoL in patients with colorectal cancer.
quality of life; colorectal cancer; assessment