The aim of this work is to understand whether shared genetic influences can explain the association between obesity and cognitive performance, including slower and more variable reaction times (RTs) and worse response inhibition.
RTs on a four-choice RT task and on a go/no-go task, and commission errors on the go/no-go task, were measured in 1,312 twins aged 7-10 years. BMI was measured at ages 9-12 years. Biometric twin models were run to estimate the genetic correlation (rG) between body mass index (BMI) and three cognitive measures: mean RT (MRT), RT variability (RTV; the standard deviation of RTs), and commission errors (a measure of response inhibition).
Genetic correlations indicated that 20%-30% of the genes underlying BMI were shared with both RT measures. However, only small phenotypic correlations of MRT and RTV with later BMI (rPh ≈ 0.1) were observed. Commission errors were unassociated with later BMI (rPh = −0.03, ns).
Our results are the first to demonstrate significant shared genetic effects between RT performance and BMI. Our findings add biological support to the notion that obesity is associated with slower and more variable RTs. However, our results also emphasize the small nature of the association, which may explain previous negative findings.
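In a bivariate twin model, the genetic contribution to the phenotypic correlation between two traits is rG·√(h²₁·h²₂), which is why a moderate genetic correlation can coexist with a small rPh when heritabilities are moderate. A minimal sketch of this relationship, using hypothetical heritability values rather than estimates from this study:

```python
import math

def genetic_contribution(r_g, h2_trait1, h2_trait2):
    """Genetic component of the phenotypic correlation in a
    bivariate twin model: rG * sqrt(h2_trait1 * h2_trait2)."""
    return r_g * math.sqrt(h2_trait1 * h2_trait2)

# Hypothetical values: rG = 0.25 (BMI vs. RT), heritabilities 0.7 and 0.3
print(round(genetic_contribution(0.25, 0.7, 0.3), 3))  # prints 0.115
```

Even with a genetic correlation of 0.25, the implied genetic contribution to the phenotypic correlation is only about 0.11, consistent with the small rPh reported above.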
Rift Valley fever virus (RVFV) is a formidable pathogen that causes severe disease and abortion in a variety of livestock species and a range of disease in humans that includes hemorrhagic fever, fulminant hepatitis, encephalitis and blindness. The natural transmission cycle involves mosquito vectors, but exposure can also occur through contact with infected fluids and tissues. The lack of approved antiviral therapies and vaccines for human use underlies the importance of small animal models for proof-of-concept efficacy studies. Several mouse and rat models of RVFV infection have been well characterized and provide useful systems for the study of certain aspects of pathogenesis, as well as antiviral drug and vaccine development. However, certain host-directed therapeutics may not act on mouse or rat pathways. Here, we describe the natural history of disease in golden Syrian hamsters challenged subcutaneously with the pathogenic ZH501 strain of RVFV. Peracute disease resulted in rapid lethality within 2 to 3 days of RVFV challenge. High-titer viremia and substantial viral loads were observed in most tissues examined; however, histopathology and immunostaining for RVFV antigen were largely restricted to the liver. Acute hepatocellular necrosis associated with a strong presence of viral antigen in the hepatocytes indicates that fulminant hepatitis is the likely cause of mortality. Further studies to assess susceptibility and disease progression following respiratory-route exposure are warranted. The hamster model of RVFV infection is suitable for early-stage antiviral drug and vaccine development studies.
Kidney transplantation has improved survival and quality of life for patients with end-stage renal disease. Despite excellent short-term results due to better and more potent immunosuppressive drugs, long-term survival of transplanted kidneys has not improved accordingly in the last decades. Consequently, there is a strong interest in immunosuppressive regimens that maintain efficacy for the prevention of rejection whilst preserving renal structure and function. In this respect, the infusion of mesenchymal stromal cells (MSCs) may be an interesting immunosuppressive strategy. MSCs have immune-suppressive properties and actively contribute to tissue repair. In experimental animal studies, the combination of a mammalian target of rapamycin (mTOR) inhibitor and MSCs was shown to attenuate alloimmune responses and to promote allograft tolerance. The current study will test the hypothesis that MSC treatment, in combination with the mTOR inhibitor everolimus, facilitates tacrolimus withdrawal, reduces fibrosis and decreases the incidence of opportunistic infections compared with standard-dose tacrolimus.
Seventy renal allograft recipients, aged 18–75 years, will be included in this Phase II, open-label, randomized, non-blinded, prospective, single-centre clinical study. Patients in the MSC-treated group will receive two doses of autologous bone marrow-derived MSCs intravenously (target 1.5×10⁶, range 1–2×10⁶ MSCs per kg body weight), 7 days apart, at 6 and 7 weeks after transplantation, in combination with everolimus and prednisolone. At the time of the second MSC infusion, tacrolimus will be reduced to 50% of the original dose and completely withdrawn 1 week later. Patients in the control group will receive everolimus, prednisolone and standard-dose tacrolimus. The primary end point is fibrosis, compared by quantitative Sirius Red scoring of the MSC-treated and untreated groups at 6 months versus 4 weeks post-transplant. Secondary end points include: a composite efficacy failure end point (biopsy-proven acute rejection, graft loss or death); renal function and proteinuria; opportunistic infections; immune monitoring; and “subclinical” cardiovascular disease, assessed by echocardiography in the different treatment groups.
This study will provide information on whether MSCs in combination with everolimus can be used for tacrolimus withdrawal, and whether this strategy leads to preservation of renal structure and function in renal transplant recipients.
Mesenchymal stromal cells; Renal transplantation; Fibrosis; Immune modulation; Repair
Aim: Attempts have been made to use CTC values to interpret treatment response and to guide changes of chemotherapy, using a static cut-off of 5 CTC to stratify patients into favourable and unfavourable responders. We propose a new approach to interpreting treatment effect that uses significant changes in CTC values (SCV limits) as the grouping parameter for responders and non-responders to chemotherapy among metastatic breast cancer (mBC) patients. Method: CTC were analysed using the CellSearch System in blood from 47 mBC patients before the start of a new chemotherapy and before the third cycle of therapy. The new and old approaches to interpreting changes in CTC values were compared in relation to progression-free survival (PFS). Results: Both the new approach using significant CTC change (P = .032) and the old approach using the static cut-off (P < .001) correlated significantly with PFS in the cohort of 47 patients. Conclusion: We propose interpreting significant changes between baseline and follow-up CTC values as a tool for assessing treatment effect in mBC. Our approach stratified patients into new risk groups that differed significantly with respect to PFS. More patients are needed to balance the size of the risk groups for a better comparison with the existing approach based on the 5 CTC cut-off.
The degree of functional impairment and adverse developmental outcomes in individuals with attention-deficit/hyperactivity disorder (ADHD) likely reflect interplay between genes and environment. To establish whether physical exercise might reduce the level of ADHD symptoms or ADHD-related impairments, we conducted a comprehensive review of the effect of exercise in children with ADHD. Findings on the impact of exercise in animals and typically developing humans, and an overview of putative mechanisms involved are also presented to provide the context in which to understand this review.
The electronic databases PubMed, OVID and Web of Knowledge were searched for all studies investigating the effect of exercise in children and adolescents with ADHD, as well as animal models of ADHD behaviours (available in January 2013). Of 2,150 initially identified records, 16 were included.
Animal studies indicate that exercise, especially early in development, may be beneficial for ADHD symptom reduction. The limited research investigating the effect of exercise in children and adolescents with ADHD suggests that exercise may improve executive functioning and behavioural symptoms associated with ADHD. While animal research suggests brain-derived neurotrophic factor (BDNF) and catecholamines (CAs) play a role in mediating these effects, the association between BDNF and ADHD remains unclear in humans.
The potential protective qualities of exercise with regard to reducing symptoms and impairments commonly associated with ADHD may hold promise for the future. Further research is needed to firmly establish whether there are clinically significant effects of exercise on the severity of ADHD symptoms, impairments and associated developmental outcomes.
Zoonotic transmission of lethal henipaviruses (HNVs) from their natural fruit bat reservoirs to humans has only been reported in Australia and South/Southeast Asia. However, a recent study discovered numerous HNV clades in African bat samples. To determine the potential for HNV spillover events among humans in Africa, here we examine well-curated sets of bat (Eidolon helvum, n=44) and human (n=497) serum samples from Cameroon for Nipah virus (NiV) cross-neutralizing antibodies (NiV-X-Nabs). Using a vesicular stomatitis virus (VSV)-based pseudoparticle seroneutralization assay, we detect NiV-X-Nabs in 48% and 3–4% of the bat and human samples, respectively. Seropositive human samples are found almost exclusively in individuals who reported butchering bats for bushmeat. Seropositive human sera also neutralize Hendra virus and Gh-M74a (an African HNV) pseudoparticles, as well as live NiV. Butchering bat meat and living in areas undergoing deforestation are the most significant risk factors associated with seropositivity. Evidence for HNV spillover events warrants increased surveillance efforts.
Henipaviruses (HNVs) infect bats in Asia and Africa, but transmission to humans (often with lethal consequences) is known only in Asia. Here the authors show that 3% of human serum samples from certain areas in Cameroon contain antibodies against HNV, indicating spillover into the human population.
Gelatinous polymers including extracellular polymeric substances (EPSs) are fundamental to biophysical processes in aquatic habitats, including mediating aggregation processes and functioning as the matrix of biofilms. Yet insight into the impact of these sticky molecules on the environmental transmission of pathogens in the ocean is limited. We used the zoonotic parasite Toxoplasma gondii as a model to evaluate polymer-mediated mechanisms that promote transmission of terrestrially derived pathogens to marine fauna and humans. We show that transparent exopolymer particles, a particulate form of EPS, enhance T. gondii association with marine aggregates, material consumed by organisms otherwise unable to access micrometre-sized particles. Adhesion to EPS biofilms on macroalgae also captures T. gondii from the water, enabling uptake of pathogens by invertebrates that feed on kelp surfaces. We demonstrate the acquisition, concentration and retention of T. gondii by kelp-grazing snails, which can transmit T. gondii to threatened California sea otters. Results highlight novel mechanisms whereby aquatic polymers facilitate incorporation of pathogens into food webs via association with particle aggregates and biofilms. Identifying the critical role of invisible polymers in transmission of pathogens in the ocean represents a fundamental advance in understanding and mitigating the health impacts of coastal habitat pollution with contaminated runoff.
extracellular polymeric substances; transparent exopolymer particles; Toxoplasma gondii; zoonotic pathogen; marine transmission; sea otter
The diagnosed incidence of small intestine neuroendocrine tumors (SI-NETs) is increasing, and the underlying genomic mechanisms have not been defined for these tumors. Using exome/genome sequence analysis of SI-NETs, we identified recurrent somatic mutations and deletions in CDKN1B, the cyclin-dependent kinase inhibitor gene, which encodes p27. We observed frameshift mutations of CDKN1B in 14 of 180 SI-NETs, and we detected hemizygous deletions encompassing CDKN1B in 7 out of 50 SI-NETs, nominating p27 as a tumor suppressor and implicating cell cycle dysregulation in the etiology of SI-NET.
It is commonly accepted that summer cyanobacterial blooms cannot be efficiently utilized by grazers due to low nutritional quality and production of toxins; however, the evidence for such effects in situ is often contradictory. Using field and experimental observations on Baltic copepods and bloom-forming diazotrophic filamentous cyanobacteria, we show that cyanobacteria may in fact support zooplankton production during summer. To highlight this side of zooplankton-cyanobacteria interactions, we conducted: (1) a field survey investigating linkages between cyanobacteria and reproduction and growth indices in the copepod Acartia tonsa; (2) an experiment testing relationships between ingestion of the cyanobacterium Nodularia spumigena (measured by molecular diet analysis) and organismal responses (oxidative balance, reproduction and development) in the copepod A. bifilosa; and (3) an analysis of long-term (1999–2009) data testing relationships between cyanobacteria and growth indices in nauplii of the copepods Acartia spp. and Eurytemora affinis in a coastal area of the northern Baltic proper. In the field survey, N. spumigena had positive effects on copepod egg production and egg viability, effectively increasing their viable egg production. By contrast, Aphanizomenon sp. showed a negative relationship with egg viability yet no significant effect on viable egg production. In the experiment, ingestion of N. spumigena mixed with the green alga Brachiomonas submarina had significant positive effects on copepod oxidative balance, egg viability and development of early naupliar stages, whereas egg production was negatively affected. Finally, the long-term data analysis identified cyanobacteria as a significant positive predictor of naupliar growth in Acartia spp. and E. affinis.
Taken together, these results suggest that bloom-forming diazotrophic cyanobacteria contribute to the feeding and reproduction of zooplankton during summer and create a favorable growth environment for copepod nauplii.
Fecal microbiota transplantation (FMT) is an effective treatment for recurrent Clostridium difficile infection (CDI) and is considered as a treatment for other gastrointestinal (GI) diseases. We followed up the relief of symptoms and long-term, over-a-year microbiota stabilization in a 46-year-old man, who underwent FMT for antibiotic-induced, non-CDI colitis nine months after being treated for CDI by FMT. Fecal and mucosal microbiota was analyzed before the second FMT and during 14 months after FMT by using a high-throughput phylogenetic microarray. FMT resolved the symptoms and restored normal GI function. Microbiota analysis revealed increased bacterial diversity in the rectal mucosa and a stable fecal microbiota up to three months after FMT. A number of mucosa-associated bacteria increased after FMT and some of these bacteria remained increased in feces up to 14 months. Notably, the increased bacteria included Bifidobacterium spp. and various representatives of Clostridium clusters IV and XIVa, such as Clostridium leptum, Oscillospira guillermondii, Sporobacter termitidis, Anaerotruncus colihominis, Ruminococcus callidus, R. bromii, Lachnospira pectinoschiza, and C. colinum, which are presumed to be anti-inflammatory. The presented case suggests a possible role of microbiota in restoring and maintaining normal GI functionality and improves our knowledge of the etiology of antibiotic-induced, noninfectious colitis.
We examined the phenotype and function of lymphocytes collected from the peripheral blood (PBL) and tumor (TIL) of patients with two different solid malignancies: colorectal cancer liver metastases (CRLM) and ovarian cancer (OVC).
Tumor and corresponding peripheral blood were collected from 16 CRLM and 22 OVC patients and, immediately following resection, were processed and analyzed using a multi-color flow cytometry panel. Cytokine mRNA from purified PBL and TIL CD4+ T cells was also analyzed by qPCR.
Overall, we found similar changes in the phenotypic and cytokine profiles when the TIL were compared to PBL from patients with the two different malignancies. In CRLM patients, the percentage of Treg (CD4+/CD25+/FoxP3+) in PBL and TIL was similar: 8.1% versus 10.2%, respectively. However, the frequency of Treg in primary OVC TIL was higher than in PBL: 19.2% versus 4.5% (p < 0.0001). A subpopulation of Treg expressing HLA-DR was markedly increased in TIL compared to PBL in both tumor types (CRLM: 69.0% versus 31.7%, p = 0.0002; OVC: 74.6% versus 37.0%, p < 0.0001), suggesting preferential Treg activation within the tumor. The cytokine mRNA profile showed that IL-6, a cytokine known for its immunosuppressive properties through STAT3 upregulation, was increased in TIL samples from patients with OVC and CRLM. Both TIL populations also contained a significantly higher proportion of activated CD8+ T cells (HLA-DR+/CD38+) compared to PBL (CRLM: 30.2% vs 7.7%, p = 0.0012; OVC: 57.1% vs 12.0%, p < 0.0001).
This study demonstrates that multi-color flow cytometry of freshly digested tumor samples reveals phenotypic differences between TIL and PBL T-cell subpopulations. The TIL composition in primary and metastatic tumors from two distinct histologies was remarkably similar, showing a greater proportion of activated/suppressive Treg (HLA-DR+, CD39+, CTLA-4+ and Helios+) and activated cytotoxic T cells (CD8+/HLA-DR+/CD38+) compared with PBL, and an increase in IL-6 mRNA in CD4+ TIL.
Electronic supplementary material
The online version of this article (doi:10.1186/s40425-014-0038-9) contains supplementary material, which is available to authorized users.
Tumor infiltrating lymphocytes; Regulatory T cells
To determine whether the concomitant use of duloxetine with prescription nonsteroidal anti-inflammatory drugs (NSAIDs) or aspirin was associated with an increased risk for upper gastrointestinal (UGI) bleeding compared with taking these analgesics alone.
Truven Health Analytics MarketScan Research Databases were examined for hospital admissions of adult patients indexed from January 1, 2007–December 31, 2011. Cases were patients with UGI hemorrhage or peptic ulcer disease. Controls were randomly selected from the remaining admissions to match 10:1 with cases based on age, sex, and admission date. Prescription medication exposure groups of interest were: 1) no exposure to duloxetine, NSAIDs or aspirin; 2) duloxetine only; 3) NSAIDs or aspirin only; 4) duloxetine plus NSAIDs or aspirin. Logistic regression and the relative excess risk due to interaction were used to estimate any increased risk of UGI bleeding for patients prescribed these medications across these groups.
There were 33,571 cases and 335,710 controls identified. Comparing exposure group 2 and group 4, the adjusted odds ratio was 1.03 (95% confidence interval [CI]: 0.94, 1.12), and the adjusted relative excess risk due to interaction was 0.352 (95% CI: –0.18, 0.72) for risk of UGI bleeding, neither of which supports an increased risk or an interaction between duloxetine and prescription NSAIDs or aspirin for these events.
There was no evidence of an increased risk for UGI bleeding when duloxetine was taken with prescription NSAIDs or aspirin. In addition, there was no evidence of an interaction between duloxetine and prescription NSAIDs or aspirin for an increased risk of these events.
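The relative excess risk due to interaction reported above follows the standard additive-interaction formula RERI = OR11 − OR10 − OR01 + 1, where OR11 is the odds ratio for joint exposure and OR10, OR01 are the odds ratios for each exposure alone. A minimal sketch with hypothetical odds ratios, not the study's estimates:

```python
def reri(or_joint, or_duloxetine_only, or_nsaid_only):
    """Relative excess risk due to interaction on the additive scale,
    using odds ratios as approximations of relative risks:
    RERI = OR11 - OR10 - OR01 + 1. Zero means no additive interaction."""
    return or_joint - or_duloxetine_only - or_nsaid_only + 1.0

# Hypothetical odds ratios: joint exposure 2.0, duloxetine alone 1.1,
# NSAID/aspirin alone 1.6 (illustrative values only)
print(round(reri(2.0, 1.1, 1.6), 2))  # prints 0.3
```

A RERI whose confidence interval includes zero, as in the study above, indicates no evidence of additive interaction between the two exposures.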
duloxetine; upper gastrointestinal bleeding; NSAIDs; aspirin
Next-generation sequencing technology has increased the capacity to generate molecular data for plant biological research, including phylogenetics, and can potentially contribute to resolving complex phylogenetic problems. The evolutionary history of Medicago L. (Leguminosae: Trifoliae) remains unresolved due to incongruence between published phylogenies. Identification of the processes causing this genealogical incongruence is essential for the inference of a correct species phylogeny of the genus and requires that more molecular data, preferably from low-copy nuclear (LCN) genes, are obtained across different species. Here we report the development of 50 novel LCN markers in Medicago and assess the phylogenetic properties of each marker. We used the genomic resources available for Medicago truncatula Gaertn., hybridisation-based gene enrichment (sequence capture) techniques and next-generation sequencing to generate sequences. This proves to be a cost-effective alternative to amplicon sequencing in phylogenetic studies at the genus or tribe level and allows for an increase in the number and size of targeted loci. Substitution rate estimates for each of the 50 loci are provided, and an overview of the variation in substitution rates among a large number of low-copy nuclear genes in plants is presented for the first time. Aligned sequences of major species lineages of Medicago and its sister genus are made available and can be used in further probe development for sequence capture of the same markers.
Digital disease detection tools are technologically sophisticated, but they depend on digital information, which is simply unavailable in many areas suffering from high disease burdens. In areas where news is often reported in local media with no digital counterpart, integration of local news information with digital surveillance systems, such as HealthMap (Boston Children’s Hospital), is critical. Little research has been published regarding the specific contribution of local health-related articles to digital surveillance systems. In response, the USAID PREDICT project implemented a local media surveillance (LMS) pilot study in partner countries to monitor disease events reported in print media. This research assessed the potential of LMS to enhance digital surveillance reach in five low- and middle-income countries. Over 16 weeks, select surveillance system attributes of LMS, such as simplicity, flexibility, acceptability, timeliness, and stability, were evaluated to identify strengths and weaknesses in the surveillance method. Findings revealed that LMS filled gaps in digital surveillance network coverage by contributing valuable localized information on disease events to the global HealthMap database. A total of 87 health events were reported through the LMS pilot in the 16-week monitoring period, including 71 unique reports not found by the HealthMap digital detection tool. HealthMap, in turn, identified an additional 236 health events outside of LMS. It was also observed that participants’ belief in the importance of the project and proper source selection were crucial to the success of this method. The timely identification of disease outbreaks near points of emergence and the recognition of risk factors associated with disease occurrence continue to be important components of any comprehensive surveillance system for monitoring disease activity across populations.
The LMS method, with its minimal resource commitment, could be one tool used to address the information gaps seen in global ‘hot spot’ regions.
The aim of the study was to explore the serum levels of eight angiogenesis biomarkers in patients with benign, borderline or malignant epithelial ovarian neoplasms and to compare them to those of healthy controls. In addition, we aimed to study how those biomarkers predict the clinical course and survival of patients with epithelial ovarian cancer.
We enrolled 132 patients with ovarian neoplasms and 32 unaffected women in this study. Serum samples were collected preoperatively at the time of diagnosis and the levels of angiogenesis biomarkers were measured with an ELISA.
Levels of Ang-1, Ang-2, VEGF and VEGF-D, and the VEGF/sVEGFR-2 and Ang-2/sVEGFR-2 ratios, were elevated, whereas sVEGFR-2 was lower, in patients with ovarian carcinoma than in women with normal ovaries or benign and/or borderline ovarian neoplasms. In ROC analysis, the area under the curve for the serum Ang-2/sVEGFR-2 ratio (0.76) was greater than that for Ang-2 (0.75) and VEGF (0.65) but lower than that for CA 125 (0.90) in differentiating ovarian cancer from benign or borderline ovarian tumors. In ovarian cancer, a high Ang-2/sVEGFR-2 ratio was associated with the presence of ascites, high stage and grade of ovarian cancer, a primary residual tumor >1 cm and recurrence of disease. Elevated Ang-2, VEGF, VEGF/sVEGFR-2, Ang-2/VEGF and Ang-2/sVEGFR-2 ratios and a low level of sVEGFR-2 were significant predictors of poor overall survival (OS) and recurrence-free survival (RFS) in univariate survival analyses.
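Areas under the ROC curve like those reported above can be computed from raw marker values with the rank-based (Mann–Whitney) estimator: the AUC is the probability that a randomly chosen case has a higher marker value than a randomly chosen control. A minimal sketch using synthetic, purely illustrative values for a hypothetical ratio marker:

```python
def auc(cases, controls):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the fraction of (case, control) pairs in which the case has the
    higher marker value, with ties counted as 0.5."""
    n_pairs = len(cases) * len(controls)
    wins = sum((x > y) + 0.5 * (x == y) for x in cases for y in controls)
    return wins / n_pairs

# Synthetic marker ratios (illustrative only, not study data)
cancer = [1.8, 2.1, 1.4, 2.6, 1.9]
benign = [1.0, 1.3, 1.7, 0.9, 1.2]
print(round(auc(cancer, benign), 2))  # prints 0.96
```

An AUC of 0.5 corresponds to a non-informative marker and 1.0 to perfect discrimination, which is the scale on which the 0.65–0.90 values above should be read.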
Ovarian cancer patients had elevated circulating levels of angiogenesis-related growth factors, reflecting increased angiogenesis and poor prognosis. The serum level of Ang-2 most accurately predicted poor OS, whereas the Ang-2/sVEGFR-2 ratio best predicted the malignancy of ovarian neoplasms and short RFS.
Electronic supplementary material
The online version of this article (doi:10.1186/1471-2407-14-696) contains supplementary material, which is available to authorized users.
Angiopoietins; VEGFs; VEGFRs; Biomarker; Ovarian carcinoma; Prognosis
An estimated 2.3 million disability-adjusted life years are lost globally from leishmaniasis. In Peru's Amazon region, the department of Madre de Dios (MDD) has the highest annual incidence rates of human leishmaniasis in the country. Leishmania (Viannia) braziliensis is the species most frequently responsible for the form of disease that results in tissue destruction of the nose and mouth. However, essentially nothing is known regarding the reservoirs of this vector-borne, zoonotic parasite in MDD. Wild rodents have been suspected, or proven, to be reservoirs of several Leishmania spp. in various ecosystems and countries. Additionally, people who live or work in forested terrain, especially those who are not regionally local and whose immune systems are thus naïve to the parasite, are at greatest risk of contracting L. (V.) braziliensis. Hence, the objective of this study was to collect tissues from wild rodents captured at several study sites along the Amazonian segment of the newly constructed Transoceanic Highway and to use molecular laboratory techniques to analyze samples for the presence of Leishmania parasites. Liver tissues from a total of 217 rodents were tested via polymerase chain reaction; bone marrow and skin biopsies (ear and tail) were also tested from a subset of these animals. The most numerous rodent species captured and tested were Oligoryzomys microtis (40.7%), Hylaeamys perenensis (15.7%), and Proechimys spp. (12%). All samples were negative for Leishmania, implying that although incidental infections may occur, these abundant rodent species are unlikely to serve as primary reservoirs of L. (V.) braziliensis along the Transoceanic Highway in MDD. Therefore, although these rodent species may persist and even thrive in moderately altered landscapes, we did not find any evidence to suggest they pose a risk for L. (V.) braziliensis transmission to human inhabitants in this high-prevalence region.
Elevated blood pressure (BP) levels in childhood have been associated with subsequent atherosclerosis. However, it is uncertain whether this risk is attenuated in individuals who acquire normal BP by adulthood. The present study examined the effect of child and adult BP levels on carotid artery intima-media thickness (cIMT) in adulthood.
Methods and Results
The cohort consisted of 4,210 participants from four prospective studies (mean follow-up 23 years). Childhood elevated BP was defined according to the tables from the National High Blood Pressure Education Program. In adulthood, BP was classified as elevated for individuals with systolic BP ≥120 mmHg, diastolic BP ≥80 mmHg or self-reported use of antihypertensive medications. cIMT was measured in the left common carotid artery. High cIMT was defined as a cIMT at or above the age-, sex-, race-, and cohort-specific 90th percentile. Individuals with persistently elevated BP, and individuals with normal childhood BP but elevated adult BP, had increased risk of high cIMT (RR [95% CI]: 1.82 [1.47–2.38] and 1.57 [1.22–2.02], respectively) when compared with individuals with normal child and adult BP. In contrast, individuals with elevated BP as children but not as adults did not have significantly increased risk (1.24 [0.92–1.67]). In addition, these individuals had lower risk of high cIMT (0.66 [0.50–0.88]) than those with persistently elevated BP. The results were consistent when controlling for age, sex and adiposity, and when different BP definitions were applied.
Individuals with persistently elevated BP from childhood to adulthood had increased risk of carotid atherosclerosis. This risk was reduced if elevated BP during childhood resolved by adulthood.
risk factors; atherosclerosis; blood pressure; hypertension; epidemiology
The role of the frq gene in the Neurospora crassa circadian rhythm has been widely studied, but technical limitations have hindered a thorough analysis of the frq circadian expression waveform. Our experiments have improved the precision of defining Neurospora’s circadian rhythm kinetics using a codon-optimized firefly luciferase reporter gene driven by the frq promoter. In vivo examination of this real-time reporter has allowed a better understanding of the relationship of the light-responsive elements of the frq promoter to its circadian feedback components. We provide a detailed phase response curve showing the phase shifts induced by a light pulse applied at different points of the circadian cycle. Using the frq-luc reporter, we have found that a 12-h light:12-h dark cycle (12L:12D) results in a luciferase expression waveform that is more complex and higher in amplitude than that seen in free-running conditions of constant darkness (DD). When using a lighting regime more consistent with solar timing, rather than a square-wave pattern, one observes a circadian waveform that is smoother, lower in amplitude, and different in phasing. Using dim light in place of darkness in these experiments also affects the resulting waveform and phasing. Our experiments illustrate Neurospora’s circadian kinetics in greater detail than previous methods, providing further insight into the complex biochemical, genetic, and physiological mechanisms underpinning the circadian oscillator.
Neurospora; circadian; VIVID; WCC; PRC; PTC; luminescence; frq; entrainment
The capacity to conduct zoonotic pathogen surveillance in wildlife is critical for the recognition and identification of emerging health threats. The PREDICT project, a component of United States Agency for International Development’s Emerging Pandemic Threats program, has introduced capacity building efforts to increase zoonotic pathogen surveillance in wildlife in global ‘hot spot’ regions where zoonotic disease emergence is likely to occur. Understanding priorities, challenges, and opportunities from the perspectives of the stakeholders is a key component of any successful capacity building program.
A survey was administered to wildlife officials and to PREDICT-implementing in-country project scientists in 16 participating countries in order to identify similarities and differences in perspectives between the groups regarding capacity needs for zoonotic pathogen surveillance in wildlife.
Both stakeholder groups identified certain human-animal interfaces (i.e. areas of high contact between wildlife and humans with potential risk for disease transmission), such as hunting and markets, as important ongoing targets for wildlife surveillance. Findings regarding challenges also showed some agreement across stakeholder groups: a lack of sustainable funding across regions was identified as the greatest challenge for conducting wildlife surveillance for zoonotic pathogens (wildlife officials: 96%; project scientists: 81%). However, the opportunity for improving zoonotic pathogen surveillance capacity most frequently identified as important by wildlife officials was increasing communication or coordination among agencies, sectors, or regions (100% of wildlife officials), whereas the opportunities most frequently identified as important by project scientists were increasing human capacity, increasing laboratory capacity, and the growing interest or awareness regarding wildlife disease or surveillance programs (each identified by 69% of project scientists).
A One Health approach to capacity building applied at local and global scales will have the greatest impact on improving zoonotic pathogen surveillance in wildlife. This approach will involve increasing communication and cooperation across ministries and sectors so that experts and stakeholders work together to identify and mitigate surveillance gaps. Over time, this transdisciplinary approach to capacity building will help overcome existing challenges and promote efficient targeting of high risk interfaces for zoonotic pathogen transmission.
Wildlife pathogen surveillance; Capacity building; Stakeholder; One Health; Zoonoses; Global health
Electroencephalography (EEG) provides a direct measure of neural activity with demonstrated reliability, developmental stability, and high heritability, making it well suited to neuroscientific investigation of ADHD. This systematic review evaluates the utility of electrophysiological measures as potential intermediate phenotypes for ADHD in a subset of domains: quantitative EEG indices of arousal and intra-individual variability, and functional investigations of inhibitory and error processing using the event-related potential (ERP) technique. Each domain demonstrates consistent and meaningful associations with ADHD, a degree of genetic overlap with ADHD, and potential links to specific genetic variants. Investigating the genetic and environmental contributions to EEG/ERP measures and their genetic overlap with ADHD may enhance molecular genetic studies and provide novel insights into aetiology. Such research will aid the precise characterisation of the clinical deficits seen in ADHD and guide the development of novel intervention and prevention strategies for those at risk.
ADHD; arousal; default-mode; electrophysiology; endophenotype; executive function; genetics; heritability
Twin studies indicate that the frequent co-occurrence of attention deficit hyperactivity disorder (ADHD) symptoms and reading difficulties (RD) is largely due to shared genetic influences. Both disorders are associated with multiple cognitive impairments, but it remains unclear which cognitive impairments lie on the aetiological pathways underlying the co-occurrence of the symptoms. We address this question using a sample of twins aged 7–10 years and a range of cognitive measures previously associated with ADHD symptoms or RD.
We performed multivariate structural equation modelling analyses on parent and teacher ratings on the ADHD symptom domains of inattention and hyperactivity, parent ratings on RD, and cognitive data on response inhibition (commission errors, CE), reaction time variability (RTV), verbal short-term memory (STM), working memory (WM) and choice impulsivity, from a population sample of 1312 twins aged 7–10 years.
Three cognitive processes showed significant phenotypic and genetic associations with both inattention symptoms and RD: RTV, verbal WM and STM. While STM captured only 11% of the shared genetic risk between inattention and RD, the estimates increased somewhat for WM (21%) and RTV (28%); yet most of the genetic sharing between inattention and RD remained unaccounted for in each case.
While response inhibition and choice impulsivity did not emerge as important cognitive processes underlying the co-occurrence of ADHD symptoms and RD, RTV and verbal memory processes each showed significant phenotypic and genetic associations with both inattention symptoms and RD. Future studies employing longitudinal designs will be required to investigate the developmental pathways and the direction of causality further.
Environmental transmission of the zoonotic parasite Toxoplasma gondii, which is shed only by felids, poses risks to human and animal health in temperate and tropical ecosystems. Atypical T. gondii genotypes have been linked to severe disease in people and the threatened population of California sea otters. To investigate land-to-sea parasite transmission, we screened 373 carnivores (feral domestic cats, mountain lions, bobcats, foxes, and coyotes) for T. gondii infection and examined the distribution of genotypes in 85 infected animals sampled near the sea otter range.
Nested PCR-RFLP analyses and direct DNA sequencing at six independent polymorphic genetic loci (B1, SAG1, SAG3, GRA6, L358, and Apico) were used to characterize T. gondii strains in infected animals. Strains consistent with Type X, a novel genotype previously identified in over 70% of infected sea otters and four terrestrial wild carnivores along the California coast, were detected in all sampled species, including domestic cats. However, odds of Type X infection were 14 times higher (95% CI: 1.3–148.6) for wild felids than feral domestic cats. Type X infection was also linked to undeveloped lands (OR = 22, 95% CI: 2.3–250.7). A spatial cluster of terrestrial Type II infection (P = 0.04) was identified in developed lands bordering an area of increased risk for sea otter Type II infection. Two spatial clusters of animals infected with strains consistent with Type X (P≤0.01) were detected in less developed landscapes.
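The odds ratios reported above come from standard 2×2 contingency-table analysis. As an illustrative sketch (the counts below are hypothetical and chosen only to reproduce an odds ratio of 14; they are not the study's actual data), the odds ratio and its Wald 95% confidence interval can be computed as:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts: 10 of 15 wild felids vs 5 of 40 domestic
# cats Type X-positive gives OR = (10 * 35) / (5 * 5) = 14.0.
or_, lo, hi = odds_ratio_ci(10, 5, 5, 35)
```

A wide interval such as the study's 1.3–148.6 typically reflects small cell counts, since the standard error of log(OR) is dominated by the smallest cell.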
Differences in T. gondii genotype prevalence among domestic and wild felids, as well as the spatial distribution of genotypes, suggest co-existing domestic and wild T. gondii transmission cycles that likely overlap at the interface of developed and undeveloped lands. Anthropogenic development driving contact between these cycles may increase atypical T. gondii genotypes in domestic cats and facilitate transmission of potentially more pathogenic genotypes to humans, domestic animals, and wildlife.
Toxoplasma gondii, a global parasite shed by domestic and wild felids, can cause severe disease in people and animals. In coastal California, USA, many sea otters have died due to T. gondii. Because T. gondii is shed by felids on land, otter infection suggests that this extremely hardy parasite is transported in freshwater runoff to aquatic environments, where animals and humans can become exposed. Molecular characterization of T. gondii strains infecting terrestrial and marine hosts can provide clues about parasite transmission cycles and sources of otter infection. By testing 373 terrestrial carnivores (domestic cats and wild carnivores) sharing the California coast and characterizing T. gondii infection in 85 of them, we found that Type X, the genotype previously identified in over 70% of infected sea otters tested, was more common in wild felids than in domestic cats. However, the discovery of Type X in domestic cats in this region suggests that they may play an important role in marine infection, as their populations are larger than those of wild felids. Differences in types of T. gondii among carnivores and in urban and agricultural vs. undeveloped areas suggest that there are separate, but overlapping, domestic and wild cycles of T. gondii transmission in coastal California.
PURPOSE: This study investigates whether the uptake of 2-(2-nitro-1H-imidazol-1-yl)-N-(2,2,3,3,3-pentafluoropropyl)-acetamide ([18F]EF5) and 2-deoxy-2-[18F]fluoro-d-glucose ([18F]FDG) is associated with a hypoxia-driven adverse phenotype in head and neck squamous cell carcinoma cell lines and tumor xenografts. METHODS: Xenografts were imaged in vivo, and tumor sections were stained for hypoxia-inducible factor 1α (Hif-1α), carbonic anhydrase IX (CA IX), and glucose transporter 1 (Glut-1). Tracer uptake and Hif-1α expression were determined in cell lines under 1% hypoxia. RESULTS: High [18F]EF5 uptake was seen in xenografts expressing high levels of CA IX, Glut-1, and Hif-1α, whereas low [18F]EF5 uptake was detected in xenografts expressing low amounts of CA IX and Hif-1α. [18F]EF5 uptake varied extensively between cell lines under normoxic conditions. A clear correlation was found between Hif-1α expression and [18F]FDG uptake during hypoxia. CONCLUSIONS: The UT-SCC cell lines studied differed in their hypoxic phenotypes, and these differences were detectable with [18F]EF5. Acute hypoxia increases [18F]FDG uptake in vitro, whereas high [18F]EF5 uptake reflects a more complex phenotype associated with hypoxia and an aggressive growth pattern.
Essential information regarding the efficacy and safety of vitamin K-antagonist (VKA) treatment for atrial fibrillation (AF) in non-dialysis dependent chronic kidney disease (CKD) is still lacking in the current literature. The aim of our study was to compare the risks of stroke or transient ischemic attack (TIA) and major bleeds between patients without CKD (eGFR >60 ml/min) and those with moderate (eGFR 30–60 ml/min) or severe non-dialysis dependent CKD (eGFR <30 ml/min).
We included 300 patients without CKD, 294 with moderate, and 130 with severe non-dialysis dependent CKD, who were matched for age and sex. Uni- and multivariate Cox regression analyses were performed reporting hazard ratios (HRs) for the endpoint of stroke or TIA and the endpoint of major bleeds as crude values and adjusted for comorbidity and platelet-inhibitor use.
Overall, 6.2% (45/724; 1.7/100 patient-years) of patients developed stroke or TIA and 15.6% (113/724; 4.8/100 patient-years) experienced a major bleeding event. Patients with severe CKD were at higher risk of stroke or TIA and of major bleeds during VKA treatment than those without renal impairment (HR 2.75, 95% CI 1.25–6.05, and HR 1.66, 95% CI 0.97–2.86) or those with moderate CKD (HR 3.93, 95% CI 1.71–9.00, and HR 1.86, 95% CI 1.08–3.21), respectively. These risks were similar for patients without and with moderate CKD. Importantly, both less time spent within the therapeutic range and high INR variability were associated with increased risks of stroke or TIA and major bleeds in patients with severe CKD.
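The event rates above combine a simple proportion (events per patient) with a crude incidence rate per 100 patient-years. As a minimal sketch of that arithmetic (the total follow-up time is back-calculated from the reported rate rather than stated in the abstract):

```python
def incidence_per_100_py(events, patient_years):
    """Crude incidence rate per 100 patient-years of follow-up."""
    return 100 * events / patient_years

# The reported 45 stroke/TIA events at 1.7/100 patient-years imply
# roughly 100 * 45 / 1.7, i.e. about 2,650 patient-years of follow-up.
implied_follow_up = 100 * 45 / 1.7

# Simple proportion: 45 of 724 patients, about 6.2%.
proportion_pct = 100 * 45 / 724
```

Note that the proportion and the incidence rate answer different questions: the former ignores follow-up time, while the latter accounts for how long each patient was observed.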
VKA treatment for AF in patients with severe CKD has a poor safety and efficacy profile, likely related to suboptimal anticoagulation control. Our findings stress the need for better-tailored, individualised anticoagulant treatment approaches for patients with AF and severe CKD.