Using a pulse-chase 13CO2 plant-labeling experiment, we compared the flow of plant carbon into macromolecular fractions of rhizosphere soil microorganisms. Time-dependent 13C dilution patterns in microbial cellular fractions were used to calculate their turnover times. The turnover times of microbial biomolecules varied: microbial RNA (19 h) and DNA (30 h) turned over fastest, followed by chloroform fumigation extraction-derived soluble cell lysis products (14 days), while phospholipid fatty acids (PLFAs) had the slowest turnover (42 days). PLFA/NLFA 13C analyses suggest that both mutualistic arbuscular mycorrhizal and saprophytic fungi are dominant in initial plant carbon uptake. In contrast, high initial 13C enrichment in RNA hints at bacterial importance in initial C uptake, given the dominance of bacterially derived RNA in total extracts of soil RNA. This discrepancy may be explained by the low renewal rate of bacterial lipids that we observed, which could bias lipid fatty acid-based interpretations of the role of bacteria in soil microbial food webs. Based on our findings, we question current assumptions regarding plant-microbe carbon flux and suggest that the rhizosphere bacterial contribution to plant assimilate uptake could be higher than currently assumed. This highlights the need for more detailed quantitative investigations with nucleic acid biomarkers to validate these findings.
13C tracer experiment; bacteria; carbon; DNA; fungi; PLFA; RNA; soil
Lotic ecosystems such as rivers and streams are unique in that they represent a continuum of both space and time during the transition from headwaters to the river mouth. As microbes have very different controls over their ecology, distribution and dispersion compared with macrobiota, we wished to explore biogeographical patterns within a river catchment and uncover the major drivers structuring bacterioplankton communities. Water samples collected across the River Thames Basin, UK, covering the transition from headwater tributaries to the lower reaches of the main river channel, were characterised using 16S rRNA gene pyrosequencing. This approach revealed an ecological succession in the bacterial community composition along the river continuum, moving from a community dominated by Bacteroidetes in the headwaters to one dominated by Actinobacteria downstream. Location of the sampling point in the river network (measured as the cumulative water channel distance upstream) was found to be the most predictive spatial feature, implying that ecological processes pertaining to temporal community succession are of prime importance in driving the assemblages of riverine bacterioplankton communities. A decrease in bacterial activity rates and an increase in the abundance of low nucleic acid bacteria relative to high nucleic acid bacteria were found to correspond with these downstream changes in community structure, suggesting corresponding functional changes. Our findings show that bacterial communities across the Thames basin exhibit an ecological succession along the river continuum, and that this is primarily driven by water residence time rather than the physico-chemical status of the river.
We investigate the properties of a Wright-Fisher diffusion process started from frequency x at time 0 and conditioned to be at frequency y at time T. Such a process is called a bridge. Bridges arise naturally in the analysis of selection acting on standing variation and in the inference of selection from allele frequency time series. We establish a number of results about the distribution of neutral Wright-Fisher bridges and develop a novel rejection sampling scheme for bridges under selection that we use to study their behavior.
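The simplest (and least efficient) way to realize a bridge is naive rejection: simulate unconditioned paths from x and keep only those landing near y at time T. The sketch below is an illustration of that idea only; the Euler-Maruyama discretization, the endpoint tolerance, and the function name are assumptions, not the paper's sampling scheme:

```python
import numpy as np

def wf_bridge_rejection(x, y, T, s=0.0, n_steps=200, tol=0.05,
                        max_tries=100_000, seed=0):
    """Naive rejection sampler for an approximate Wright-Fisher bridge.

    Simulates Euler-Maruyama paths of
        dX = s*X*(1-X) dt + sqrt(X*(1-X)) dW,  X(0) = x,
    and accepts the first path whose endpoint lies within `tol` of y.
    Illustrative only; the neutral case corresponds to s = 0.
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    for _ in range(max_tries):
        path = np.empty(n_steps + 1)
        path[0] = x
        for i in range(n_steps):
            p = path[i]
            drift = s * p * (1.0 - p)
            diffusion = np.sqrt(max(p * (1.0 - p), 0.0))
            step = drift * dt + diffusion * np.sqrt(dt) * rng.standard_normal()
            path[i + 1] = min(max(p + step, 0.0), 1.0)  # frequencies stay in [0, 1]
        if abs(path[-1] - y) < tol:
            return path  # an accepted (approximate) bridge path
    raise RuntimeError("no path accepted; widen tol or raise max_tries")
```

Whole-path rejection becomes hopeless when y is unlikely under the unconditioned dynamics, which is precisely why purpose-built bridge samplers of the kind developed in the paper are needed.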
In a wide array of kidney diseases, type 1 angiotensin (AT1) receptors are present on the immune cells that infiltrate the renal interstitium. Here, we examined the actions of AT1 receptors on macrophages in progressive renal fibrosis and found that macrophage-specific AT1 receptor deficiency exacerbates kidney fibrosis induced by unilateral ureteral obstruction (UUO). Macrophages isolated from obstructed kidneys of mice lacking AT1 receptors solely on macrophages had heightened expression of proinflammatory M1 cytokines, including IL-1. Evaluation of isolated AT1 receptor–deficient macrophages confirmed the propensity of these cells to produce exaggerated levels of M1 cytokines, which led to more severe renal epithelial cell damage via IL-1 receptor activation in coculture compared with WT macrophages. A murine kidney cross-transplantation model with concomitant UUO revealed that the augmentation of renal fibrosis instigated by AT1 receptor–deficient macrophages is mediated by IL-1 receptor stimulation in the kidney. This study indicates that a key role of AT1 receptors on macrophages is to protect the kidney from fibrosis by limiting activation of IL-1 receptors in the kidney.
Our study describes the incidence of and risk factors for undiagnosed diabetes in elderly cancer patients. Using Surveillance, Epidemiology, and End Results-Medicare data, we followed patients with breast, colorectal, lung, or prostate cancer from 24 months before to 3 months after cancer diagnosis. Medicare claims were used to exclude patients with diabetes diagnosed 24 to 4 months before cancer (look-back period), identify those with diabetes undiagnosed until cancer, and construct indicators of preventive services, physician contact, and comorbidity during the look-back period. Logistic regression analyses were performed to identify factors associated with undiagnosed diabetes. Overall, 2,678 patients had diabetes undiagnosed until cancer. Rates were highest in patients with both advanced-stage cancer and low prior primary care/medical specialist contact (breast 8.2%, colorectal 5.9%, lung 4.4%). Nonwhite race/ethnicity, living in a census tract with a higher percentage of the population in poverty and a lower percentage college educated, lower prior use of preventive services, and lack of primary care and/or medical specialist care prior to cancer were all associated with higher (P ≤ 0.05) adjusted odds of undiagnosed diabetes. Undiagnosed diabetes is relatively common in selected subgroups of cancer patients, including those already at high risk of poor outcomes due to advanced cancer stage.
Patients with Parkinson's disease wore a device on the wrist that gave reminders to take levodopa and also measured bradykinesia and dyskinesia. Patients acknowledged taking their medication by placing the thumb on the device. Some patients performed this acknowledgement repeatedly and unconsciously. This study examines whether this behaviour reflected increased impulsivity.
Methods and Results
Twenty-five participants were selected because they had (i) the excess acknowledgements described above, (ii) Impulsive-Compulsive Behaviours (ICB), or (iii) neither of these. A blinded assessor applied clinical scales to measure Impulsive-Compulsive Behaviours, cognition, depression, anxiety and apathy. A Response Ratio (RR), representing the number of acknowledgements/number of doses (expressed as a percentage), was tightly correlated with ratings of Impulsive-Compulsive Behaviours (r2 = 0.79) in 19/25 subjects. Some of these patients had dyskinesia, which was higher with extraneous responses than with responses indicating medication consumption. Six of the 25 subjects had high Impulsive-Compulsive Behaviour scores, higher apathy scores, low levels of dyskinesia and normal Response Ratios. Patients without ICB (low RR) also had low dyskinesia levels regardless of the relevance of the response.
An elevated Response Ratio is a specific measure of a type of ICB in which increased incentive salience is attributed to cues in the presence of high striatal dopamine levels, manifested by high levels of dyskinesia. This study also points to a second form of ICB that occurs in the absence of dyskinesia, is associated with normal Response Ratios and higher apathy scores, and may represent prefrontal pathology.
Preexisting comorbidity adversely impacts breast cancer treatment and outcomes. We examined the incremental impact of comorbidity undetected until cancer. We followed breast cancer patients in SEER-Medicare from 12 months before to 84 months after diagnosis. Two comorbidity indices were constructed: the National Cancer Institute index, using 12 months of claims before cancer, and a second index for previously undetected conditions, using three months after cancer. Conditions present in the first were excluded from the second. Overall, 6,184 (10.1%) had ≥1 undetected comorbidity. Chronic obstructive pulmonary disease (38%) was the most common undetected condition. In multivariable analyses that adjusted for comorbidity detected before cancer, older age, later stage, higher grade, and poor performance status all were associated with higher odds of ≥1 undetected comorbidity. In stage I–III cancer, undetected comorbidity was associated with lower adjusted odds of receiving adjuvant chemotherapy (Odds Ratio (OR) = 0.81, 95% Confidence Interval (CI) 0.73–0.90, P < 0.0001; OR = 0.38, 95% CI 0.30–0.49, P < 0.0001; index score 1 or ≥2, respectively), and with increased mortality (Hazard Ratio (HR) = 1.45, 95% CI 1.38–1.53, P < 0.0001; HR = 2.38, 95% CI 2.18–2.60, P < 0.0001; index score 1 or ≥2). Undetected comorbidity is associated with less aggressive treatment and higher mortality in breast cancer.
Soil carbon (C) storage is dependent upon the complex dynamics of fresh and native organic matter cycling, which are regulated by plant and soil-microbial activities. A fundamental challenge is to link microbial biodiversity with plant-soil C cycling processes to elucidate the underlying mechanisms regulating soil carbon. To address this, we contrasted vegetated grassland soils with bare soils, which had been plant-free for 3 years, using stable isotope (13C) labeled substrate assays and molecular analyses of bacterial communities. Vegetated soils had higher C and N contents, biomass, and substrate-specific respiration rates. Conversely, following substrate addition, cycling of unlabeled native soil C was accelerated in bare soil and retarded in vegetated soil, indicative of differential priming effects. Functional differences were reflected in bacterial biodiversity, with Alphaproteobacteria and Acidobacteria dominating vegetated and bare soils, respectively. Significant isotopic enrichment of soil RNA was found after substrate addition, and rates varied according to substrate type. However, assimilation was independent of plant presence, which, in contrast to large differences in 13CO2 respiration rates, indicated greater substrate C use efficiency in bare, Acidobacteria-dominated soils. Stable isotope probing (SIP) revealed that most community members had utilized the substrates, with little evidence for competitive outgrowth of sub-populations. Our findings support theories on how plant-mediated soil resource availability affects the turnover of different pools of soil carbon, and we further identify a potential role of soil microbial biodiversity.
Specifically we conclude that emerging theories on the life histories of dominant soil taxa can be invoked to explain changes in soil carbon cycling linked to resource availability, and that there is a strong case for considering microbial biodiversity in future studies investigating the turnover of different pools of soil carbon.
upland acidic grassland; bacteria; substrate-specific respiration; priming effects; substrate carbon use efficiency; T-RFLP; RNA stable isotope probing; soil organic carbon
The relative contributions of B lymphocytes and plasma cells during allograft rejection remain unclear. Therefore, the effects of B cell depletion on acute cardiac rejection, chronic renal rejection, and skin graft rejection were compared using CD20 or CD19 mAbs. Both CD20 and CD19 mAbs effectively depleted mature B cells, while CD19 mAb treatment also depleted plasmablasts and some plasma cells. B cell depletion did not affect acute cardiac allograft rejection, although CD19 mAb treatment prevented allograft-specific IgG production. Strikingly, CD19 mAb treatment significantly reduced renal allograft rejection and abrogated allograft-specific IgG development, while CD20 mAb treatment did not. By contrast, B cell depletion exacerbated skin allograft rejection and augmented the proliferation of adoptively transferred alloantigen-specific CD4+ T cells, demonstrating that B cells can also negatively regulate allograft rejection. Thus, B cells can either positively or negatively regulate allograft rejection depending on the nature of the allograft and the intensity of the rejection response. Moreover, CD19 mAb may represent a new approach for depleting both B cells and plasma cells to concomitantly impair T cell activation, inhibit the generation of new allograft-specific Abs, or reduce preexisting allograft-specific Ab levels in transplant patients.
Human clinical trials using type 1 angiotensin (AT1) receptor antagonists indicate that angiotensin II is a critical mediator of cardiovascular and renal disease. However, recent studies have suggested that individual tissue pools of AT1 receptors may have divergent effects on target organ damage in hypertension.
We examined the role of AT1 receptors on T lymphocytes in the pathogenesis of hypertension and its complications.
Methods and Results
Deficiency of AT1 receptors on T cells potentiated kidney injury during hypertension, with exaggerated renal expression of chemokines and enhanced accumulation of T cells in the kidney. Kidneys and purified CD4+ T cells from “T cell knockout” mice lacking AT1 receptors on T lymphocytes had augmented expression of Th1-associated cytokines including IFN-γ and TNF-α. Within T lymphocytes, the transcription factors T-bet and GATA-3 promote differentiation toward the Th1 and Th2 lineages, respectively, and AT1 receptor-deficient CD4+ T cells had enhanced T-bet/GATA-3 expression ratios favoring induction of the Th1 response. Conversely, mice that were unable to mount a Th1 response due to T-bet deficiency were protected from kidney injury in our hypertension model.
The current studies identify an unexpected role for AT1 receptors on T lymphocytes to protect the kidney in the setting of hypertension by favorably modulating CD4+ T helper cell differentiation.
Hypertension; Kidney disease; T lymphocytes; Inflammation
Activation of type 1 angiotensin (AT1) receptors causes hypertension, leading to progressive kidney injury. AT1 receptors are expressed on immune cells, and previous studies have identified a role for immune cells in angiotensin II–dependent hypertension. We therefore examined the role of AT1 receptors on immune cells in the pathogenesis of hypertension by generating bone marrow chimeras with wild-type donors or donors lacking AT1A receptors (BMKO). The two groups had virtually identical blood pressures at baseline, suggesting that AT1 receptors on immune cells do not make a unique contribution to the determination of baseline blood pressure. By contrast, in response to chronic angiotensin II infusion, the BMKOs had an augmented hypertensive response, suggesting a protective effect of AT1 receptors on immune cells with respect to blood pressure elevation. The BMKOs had 50% more albuminuria after 4 weeks of angiotensin II–dependent hypertension. Angiotensin II–induced pathological injury to the kidney was similar in the experimental groups. However, there was exaggerated renal expression of the macrophage chemokine monocyte chemoattractant protein 1 in the BMKO group, leading to persistent accumulation of macrophages in the kidney. This enhanced mononuclear cell infiltration into the BMKO kidneys was associated with exaggerated renal expression of the vasoactive mediators interleukin-1β and interleukin-6. Thus, in angiotensin II–induced hypertension, bone marrow–derived AT1 receptors limited mononuclear cell accumulation in the kidney and mitigated the chronic hypertensive response, possibly through the regulation of vasoactive cytokines.
angiotensin II; hypertension; inflammation; kidney diseases; lymphocytes
Estimating the incidence of medical conditions using claims data often requires constructing a prevalence period that predates an event of interest, for instance, the diagnosis of cancer, to exclude those with pre-existing conditions from the incidence risk set. Conditions missed during the prevalence period may be misclassified as incident conditions (false positives) after the event of interest.
Using Medicare claims, we examined the impact of selecting shorter versus longer prevalence periods on the incidence and misclassification of 12 relatively common conditions in older persons.
The source of data for this study was the National Cancer Institute’s Surveillance, Epidemiology, and End Results cancer registry linked to Medicare claims. Two cohorts of women were included: 33,731 diagnosed with breast cancer between 2000 and 2002, who had ≥ 36 months of Medicare eligibility prior to cancer, the event of interest; and 101,649 without cancer meeting the same Medicare eligibility criterion. Cancer patients were followed from 36 months before cancer diagnosis (prevalence period) up to 3 months after diagnosis (incidence period). Non-cancer patients were followed for up to 39 months after the beginning of Medicare eligibility. A sham date was inserted after 36 months to separate the prevalence and incidence periods. Using 36 months as the gold standard, the prevalence period was then shortened in 6-month increments to examine the impact on the number of conditions first detected during the incidence period.
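The cohort construction above reduces to simple date-window logic. The sketch below is a simplified illustration; the function name, the 30-day approximation of a month, and the window handling are assumptions, not the study's actual claims algorithm:

```python
from datetime import date, timedelta

def classify_condition(claim_dates, index_date,
                       prevalence_months=36, incidence_months=3):
    """Classify a condition relative to an index event (cancer diagnosis
    or a sham date): 'prevalent' if any claim falls in the look-back
    window before the index date, else 'incident' if one falls in the
    window just after it, else 'absent'. Months approximated as 30 days.
    """
    prev_start = index_date - timedelta(days=30 * prevalence_months)
    inc_end = index_date + timedelta(days=30 * incidence_months)
    if any(prev_start <= d < index_date for d in claim_dates):
        return "prevalent"
    if any(index_date <= d <= inc_end for d in claim_dates):
        return "incident"
    return "absent"
```

The misclassification mechanism falls out directly: with a long look-back a condition claimed well before the index date is (correctly) prevalent, while shortening the look-back drops that early claim from view and a later claim makes the same condition appear incident.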
In the breast cancer cohort, shortening the prevalence period from 36 to 6 months increased the incidence rates (per 1,000 patients) of all conditions; for example: hypertension 196 to 243; diabetes 34 to 76; chronic obstructive pulmonary disease 29 to 46; osteoarthritis 27 to 36; congestive heart failure 20 to 36; osteoporosis 22 to 29; and cerebrovascular disease 13 to 21. Shortening the prevalence period had less impact on those without cancer.
Selecting a short prevalence period to rule out pre-existing conditions can, through misclassification, substantially inflate estimates of incident conditions. In incidence studies based on Medicare claims, selecting a prevalence period of ≥24 months balances the need to exclude pre-existing conditions with retaining the largest possible cohort.
Incidence; Prevalence; Misclassification; Look back; Medical claims; Medicare
In breast cancer, diabetes diagnosed prior to cancer (previously diagnosed) is associated with advanced cancer stage and increased mortality. However, in the general population, 40% of diabetes is undiagnosed until glucose testing, and evidence suggests that one consequence of increased evaluation and management around breast cancer diagnosis is the increased detection of previously undiagnosed diabetes. Biological factors, for instance higher insulin levels due to untreated disease, and other factors underlying the association between previously diagnosed diabetes and breast cancer could differ in those whose diabetes remains undiagnosed until cancer. Our objectives were to identify factors associated with previously undiagnosed diabetes in breast cancer, and to examine associations between previously undiagnosed diabetes and cancer stage, treatment patterns, and mortality.
Using Surveillance, Epidemiology, and End Results-Medicare, we identified women diagnosed with breast cancer and diabetes between 01/2001 and 12/2005. Diabetes was classified as previously diagnosed if it was identified within Medicare claims between 24 and 4 months before cancer diagnosis, and previously undiagnosed if it was identified from 3 months before to ≤ 3 months after cancer. Patients were followed until 12/2007 or death, whichever came first. Multivariate analyses were performed to examine risk factors for previously undiagnosed diabetes and associations between undiagnosed (compared to previously diagnosed) diabetes, cancer stage, treatment, and mortality.
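The two timing categories can be sketched as a date comparison. The function below is a simplified illustration with 30-day months; the study's claims-based algorithm is more involved, and claims falling in the gap between 4 and 3 months before cancer are left unclassified here:

```python
from datetime import date

def classify_diabetes(first_dm_claim, cancer_dx):
    """Classify diabetes timing relative to breast cancer diagnosis:
    'previously diagnosed' if first identified 24 to 4 months before
    cancer, 'previously undiagnosed' if identified from 3 months before
    to 3 months after cancer. Months approximated as 30 days."""
    delta = (first_dm_claim - cancer_dx).days
    if -24 * 30 <= delta <= -4 * 30:
        return "previously diagnosed"
    if -3 * 30 <= delta <= 3 * 30:
        return "previously undiagnosed"
    return "unclassified"
```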
Of 2,418 patients, 634 (26%) had previously undiagnosed diabetes; the remainder had previously diagnosed diabetes. The mean age was 77.8 years, and 49.4% were diagnosed with in situ or stage I disease. Age > 80 years (40% of the cohort) and limited health system contact (primary care physician and/or preventive services) prior to cancer were associated with higher adjusted odds of previously undiagnosed diabetes. Previously undiagnosed diabetes was associated with higher adjusted odds of advanced-stage (III/IV) cancer (Odds Ratio = 1.37; 95% Confidence Interval (CI) 1.05–1.80; P = 0.02) and a higher adjusted mortality rate due to causes other than cancer (Hazard Ratio = 1.29; 95% CI 1.02–1.63; P = 0.03).
In breast cancer, previously undiagnosed diabetes is associated with advanced-stage cancer and increased mortality. Identifying the underlying biological factors will require further investigation.
Breast cancer; Diabetes; Previously undiagnosed; Risk factors; Stage; Mortality; Survival
Soil DNA extraction has become a critical step in describing microbial biodiversity. Historically, ascertaining overarching microbial ecological theories has been hindered because independent studies have used numerous custom and commercial DNA extraction procedures. For that reason, a standardized soil DNA extraction method (ISO-11063) was previously published. However, although this ISO method is suited for molecular tools such as quantitative PCR and community fingerprinting techniques, it has only been optimized for examining soil bacteria. Therefore, the aim of this study was to identify an appropriate soil DNA extraction procedure for examining bacterial, archaeal and fungal diversity in soils of contrasting land-use and physico-chemical properties. Three different procedures were tested: the ISO-11063 standard; a custom procedure (GnS-GII); and a modified ISO procedure (ISOm) which includes a different mechanical lysis step (a FastPrep®-24 lysis step instead of the recommended bead-beating). The efficacy of each method was first assessed by estimating microbial biomass through total DNA quantification. Then, the abundances and community structure of bacteria, archaea and fungi were determined using real-time PCR and terminal restriction fragment length polymorphism approaches. Results showed that DNA yield was improved with the GnS-GII and ISOm procedures, and fungal community patterns were found to be strongly dependent on the extraction method. The main methodological factor responsible for differences between extraction procedure efficiencies was found to be the soil homogenization step. For integrative studies which aim to examine bacteria, archaea and fungi simultaneously, the ISOm procedure results in higher DNA recovery and better represents microbial communities.
Rituximab improves survival in follicular lymphoma (FL), but is considerably more expensive than conventional chemotherapy. We estimated the total direct medical costs, cumulative survival, and cost-effectiveness of adding rituximab to first-line chemotherapy for FL, based on a single source of data representing routine practice in the elderly. Using Surveillance, Epidemiology, and End Results (SEER) registry data plus Medicare claims, we identified 1,117 FL patients who received first-line CHOP (cyclophosphamide, doxorubicin, vincristine, and prednisone) or CVP (cyclophosphamide, vincristine, and prednisone), with or without rituximab. Multivariate regression was used to estimate adjusted cumulative cost and survival differences between the two groups over four years after beginning treatment. The median age was 73 years (minimum 66 years), 56% had stage III-IV disease, and 67% received rituximab. Adding rituximab to first-line chemotherapy was associated with higher adjusted incremental total cost ($18,695; 95% Confidence Interval (CI) $9,302–$28,643) and longer adjusted cumulative survival (0.18 years; 95% CI 0.10–0.27) over four years of follow-up. The expected cost-effectiveness was $102,142 (95% CI $34,531–$296,337) per life-year gained. In routine clinical practice, adding rituximab to first-line chemotherapy for elderly patients with FL results in higher direct medical costs to Medicare and longer cumulative survival after four years.
Background. Trastuzumab improves survival in HER2-positive women with metastatic breast cancer (MBC). The consequences of longer survival include a higher likelihood of additional metastases, including those in the central nervous system (CNS). The effect of CNS metastases on both trastuzumab discontinuation and survival in older patients has not been described. Patients and Methods. We used the Surveillance Epidemiology and End Results (SEER) Medicare data to identify a cohort of 562 women age 66 or older with MBC who were diagnosed between January 1, 2000 and December 31, 2005, free of CNS metastases, and initiated trastuzumab after MBC diagnosis. Time to discontinuation and time to death were analyzed using proportional hazards models. Results. Newly diagnosed CNS metastases were associated with both higher risk of trastuzumab discontinuation (relative hazard [RH] = 1.78, 95% CI 1.11–2.87) and higher risk of death (RH = 2.49, 95% CI 1.84–3.37). The incidence rate of new CNS metastases was comparable among various sites of metastasis (10.7 to 14.7 per 1,000 patient-months), except for bone which was higher (24.1 per 1,000). Conclusion. The diagnosis of CNS metastases was associated with an increase in both the likelihood of discontinuing trastuzumab therapy as well as the risk of death.
Vascular injury and remodeling are common pathological sequelae of hypertension. Previous studies have suggested that the renin-angiotensin system (RAS), acting through the type 1 angiotensin (AT1) receptor, promotes vascular pathology in hypertension. To study the role of AT1 receptors in this process, we generated mice with cell-specific deletion of AT1 receptors in vascular smooth muscle cells (VSMCs) using Cre/loxP technology. We crossed the SM22α-Cre transgenic mouse line expressing Cre recombinase in smooth muscle cells with a mouse line bearing a conditional allele of the Agtr1a gene (Agtr1a flox), encoding the major murine AT1-receptor isoform (AT1A). In SM22α-Cre+Agtr1a flox/flox (SMKO) mice, AT1A-receptors were efficiently deleted from VSMCs in larger vessels, but not from resistance vessels such as pre-glomerular arterioles. Thus, vasoconstrictor responses to angiotensin II were preserved in SMKOs. To induce hypertensive vascular remodeling, mice were continuously infused with angiotensin II for 4 weeks. During infusion of angiotensin II, blood pressures increased significantly and to a similar extent in SMKOs and controls. In control mice, there was evidence of vascular oxidative stress indicated by enhanced nitrated tyrosine residues in segments of aorta; this was significantly attenuated in SMKOs. Despite these differences in oxidative stress, the extent of aortic medial expansion induced by angiotensin II infusion was virtually identical in both groups. Thus, vascular AT1A-receptors promote oxidative stress in the aortic wall but are not required for remodeling in angiotensin II-dependent hypertension.
angiotensin II; hypertrophy; hyperplasia; aorta; smooth muscle; hypertension
Practice guidelines define hemodialysis catheter dysfunction as blood flow rate (BFR) <300 mL/min. We conducted a study using data from DaVita and the United States Renal Data System to evaluate the impact of catheter dysfunction on dialysis and other medical services. Patients were included if they had ≥8 consecutive weeks of catheter dialysis between 8/2004 and 12/2006. Actual BFR <300 mL/min despite planned BFR ≥300 mL/min was used to define catheter dysfunction during each dialysis session. Among 9,707 patients, the average age was 62 years, 53% were female, and 40% were black. The median duration of catheter dialysis was 190 days, and the cohort accounted for 1,075,701 catheter dialysis sessions. There were 70,361 sessions with catheter dysfunction, and 6,331 (65.2%) patients had at least one session with catheter dysfunction. In multivariate repeated measures analysis, catheter dysfunction was associated with increased odds of missing a dialysis session due to access problems (Odds ratio [OR] 2.50; P < 0.001), having an access-related procedure (OR 2.10; P < 0.001), and being hospitalized (OR 1.10; P = 0.001). Catheter dysfunction defined according to National Kidney Foundation (NKF) vascular access guidelines results in disruptions of dialysis treatment and increased use of other medical services.
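The session-level definition used in the study reduces to a simple predicate. The sketch below is illustrative; the function names and the (planned, actual) session encoding are assumptions:

```python
def is_dysfunction(planned_bfr, actual_bfr, threshold=300):
    """One dialysis session counts as catheter dysfunction when the
    actual blood flow rate (mL/min) falls below the threshold despite
    a planned rate at or above it."""
    return planned_bfr >= threshold and actual_bfr < threshold

def dysfunction_fraction(sessions):
    """Fraction of (planned_bfr, actual_bfr) session pairs flagged
    as catheter dysfunction."""
    flags = [is_dysfunction(p, a) for p, a in sessions]
    return sum(flags) / len(flags) if flags else 0.0
```

Note that sessions where the planned rate is already below the threshold are not counted, since the low actual flow then reflects the prescription rather than the catheter.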
Performing inference on contemporary samples of DNA sequence data is an important and challenging task. Computationally intensive methods such as importance sampling (IS) are attractive because they make full use of the available data, but in the presence of recombination the large state space of genealogies can be prohibitive. In this article, we make progress by developing an efficient IS proposal distribution for a two-locus model of sequence data. We show that the proposal developed here leads to much greater efficiency, outperforming existing IS methods that could be adapted to this model. Among several possible applications, the algorithm can be used to find maximum likelihood estimates for mutation and crossover rates, and to perform ancestral inference. We illustrate the method on previously reported sequence data covering two loci either side of the well-studied TAP2 recombination hotspot. The two loci are themselves largely non-recombining, so we obtain a gene tree at each locus and are able to infer in detail the effect of the hotspot on their joint ancestry. We summarize this joint ancestry by introducing the gene graph, a summary of the well-known ancestral recombination graph.
coalescence; Monte Carlo likelihood; probability; sequences; stochastic processes
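How strongly importance-sampling efficiency depends on the proposal can be seen even in a toy problem far removed from genealogies: estimating a small Gaussian tail probability P(Z > c). Sampling from N(0, 1) directly almost never hits the tail, whereas the shifted-exponential proposal below concentrates every sample there. This is a standard textbook illustration, not the genealogical proposal developed in the article:

```python
import numpy as np

def is_tail_prob(c=4.0, n=100_000, seed=1):
    """Importance-sampling estimate of P(Z > c) for Z ~ N(0, 1),
    using the proposal Y = c + Exp(1), whose support is exactly the tail.
    Weight = target density / proposal density, computed in log space."""
    rng = np.random.default_rng(seed)
    y = c + rng.exponential(1.0, n)
    # log N(0,1) density minus log Exp(1) density shifted to start at c
    log_w = -0.5 * y**2 - 0.5 * np.log(2.0 * np.pi) + (y - c)
    return np.exp(log_w).mean()
```

With c = 4 the true value is about 3.17e-5, so plain Monte Carlo with the same n would typically see only a handful of tail hits, while the IS estimate is accurate to within roughly a percent.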
Blood flow rate (BFR) <300 mL/min commonly is used to define hemodialysis catheter dysfunction and the need for interventions to prevent complications. The objective of this study was to describe patterns of unplanned BFR <300 mL/min during catheter hemodialysis using data from DaVita dialysis facilities and the United States Renal Data System. Patients were included if they received at least eight weeks of hemodialysis exclusively through a catheter between 08/04 and 12/06, and catheter hemodialysis was the first treatment modality following diagnosis of end-stage renal disease (first access), or it immediately followed at least one 30-day period of dialysis exclusively through a fistula or graft (replacement access). Actual BFR <300 mL/min despite a planned BFR ≥300 mL/min defined catheter dysfunction during each dialysis session. There were 3,364 patients, 268,363 catheter dialysis sessions, and 19,118 (7.1%) sessions with catheter dysfunction. Almost two-thirds of patients had ≥1 catheter dysfunction session, and 30% had ≥1 catheter dysfunction session per month. Patients with catheter as a replacement access had a higher rate of catheter dysfunction than those with a catheter as first access (hazard ratio: 1.13; P = 0.04). Catheter dysfunction affects almost one-third of catheter dialysis patients each month and two-thirds overall.
We used Surveillance, Epidemiology, and End Results-Medicare data (2000-2006) to describe treatment and survival in women diagnosed with metastatic breast cancer (MBC) who received trastuzumab. There were 610 patients with a mean age of 74 years. Overall, 32% received trastuzumab alone and 47% received trastuzumab plus a taxane. In multivariate analysis, trastuzumab plus chemotherapy was associated with a lower adjusted cancer mortality rate (Hazard Ratio [HR] 0.54; 95% Confidence Interval [CI] 0.39-0.74; p < .001) than trastuzumab alone among patients who received trastuzumab as part of first-line therapy. Adding chemotherapy to first-line trastuzumab for metastatic breast cancer is associated with improved cancer survival.
Breast cancers; Chemotherapy; Outcomes research
Productivity and predation are thought to be crucial drivers of bacterial diversity. We tested how the productivity–diversity relationship of a natural bacterial community is modified by the presence of protist predators with different feeding preferences. In the absence of predators, there was a unimodal relationship between bacterial diversity and productivity. We found that three protist species (Bodo, Spumella and Cyclidium) had widely divergent effects on bacterial diversity across the productivity gradient. Bodo and Cyclidium had little effect on the shape of the productivity–diversity relationship, while Spumella flattened the relationship. We explain these results in terms of the feeding preferences of these predators.
bacteria; productivity; biodiversity; protist; predation
Prostaglandin (PG) E2 has multiple actions that may affect blood pressure. It is synthesized from arachidonic acid by the sequential actions of phospholipases, cyclooxygenases, and PGE synthases. While microsomal PGE synthase 1 (mPGES1) is the only genetically verified PGE synthase, results of previous studies examining the consequences of mPGES1 deficiency on blood pressure (BP) are conflicting. To determine whether genetic background modifies the impact of mPGES1 on BP, we generated mPGES1−/− mice on two distinct inbred backgrounds, DBA/1lacJ and 129/SvEv. On the DBA/1 background, baseline BP was similar between wild-type (WT) and mPGES1−/− mice. By contrast, on the 129 background, baseline BPs were significantly higher in mPGES1−/− animals than in WT controls. During angiotensin II infusion, the DBA/1 mPGES1−/− and WT mice developed mild hypertension of similar magnitude, while 129-mPGES1−/− mice developed more severe hypertension than WT controls. DBA/1 animals developed only minimal albuminuria in response to angiotensin II infusion. By contrast, WT 129 mice had significantly higher levels of albumin excretion than WT DBA/1 mice, and the extent of albuminuria was further augmented in 129 mPGES1−/− animals. The increase in urinary excretion of PGE2 with angiotensin II seen in WT mice of both strains was attenuated in mPGES1−/− animals. Urinary excretion of thromboxane was unaffected by angiotensin II in the DBA/1 lines but increased more than 4-fold in 129 mPGES1−/− mice. These data indicate that genetic background significantly modifies the BP response to mPGES1 deficiency. Exaggerated production of thromboxane may contribute to the robust hypertension and albuminuria in 129 mPGES1-deficient mice.
prostanoids; PGE synthase; blood pressure; strain; hypertension
Diffuse large B-cell lymphoma (DLBCL) comprises 31% of lymphomas in the United States. Although it is an aggressive type of lymphoma, 40% to 50% of patients are cured with treatment. The study objectives were to identify patient factors associated with treatment and survival in DLBCL.
Using Surveillance, Epidemiology, and End Results (SEER) registry data linked to Medicare claims, we identified 7,048 patients diagnosed with DLBCL between January 1, 2001 and December 31, 2005. Patients were followed from diagnosis until the end of their claims history (maximum December 31, 2007) or death. Medicare claims were used to characterize the first infused chemo-immunotherapy (C-I therapy) regimen and to identify radiation. Multivariate analyses were performed to identify patient demographic, socioeconomic, and clinical factors associated with treatment and with survival. Outcome variables in the survival analysis were all-cause mortality, non-Hodgkin's lymphoma (NHL) mortality, and other/unknown cause mortality.
Overall, 84% (n = 5,887) received C-I therapy or radiation treatment during the observation period: both, 26%; C-I therapy alone, 53%; and radiation alone, 5%. Median age at diagnosis was 77 years, 54% were female, 88% were white, and 43% had Stage III or IV disease at diagnosis. The median time to first treatment was 42 days, and 92% of these patients had received their first treatment by day 180 following diagnosis. In multivariate analysis, the treatment rate was significantly lower among patients ≥ 80 years old, blacks versus whites, those living in a census tract with ≥ 12% poverty, and those with extra-nodal disease. Blacks had a lower treatment rate overall (Hazard Ratio [HR] 0.77; P < 0.001) and were less likely to receive treatment within 180 days of diagnosis (Odds Ratio [OR] 0.63; P = 0.002) than whites. In multivariate survival analysis, black race was associated with higher all-cause mortality (HR 1.24; P = 0.01) and other/unknown cause mortality (HR 1.35; P = 0.01), but not mortality due to NHL (HR 1.16; P = 0.19).
In elderly patients diagnosed with DLBCL, there are large differences in treatment access and survival between blacks and whites.
Drugs and antibodies that interrupt vascular endothelial growth factor (VEGF) signaling pathways improve outcomes in patients with a variety of cancers by inhibiting tumor angiogenesis. A major adverse effect of these treatments is hypertension, suggesting a critical role for VEGF in blood pressure (BP) regulation. However, the physiological mechanisms underlying the control of BP by VEGF are unclear. To address this question, we administered a specific antibody against the major VEGF receptor, VEGFR2, to normal mice and assessed the consequences on BP. Compared to vehicle-treated controls, administration of the anti-VEGFR2 antibody caused a rapid and sustained increase in BP of ≈10 mm Hg. This increase in BP was associated with a significant reduction in renin mRNA expression in the kidney (p=0.019) and in urinary excretion of aldosterone (p<0.05). Treatment with the anti-VEGFR2 antibody also caused marked reduction in expression of endothelial and neuronal nitric oxide synthases (eNOS and nNOS) in the kidney. To examine the role of nitric oxide (NO) in the hypertension caused by blocking VEGFR2, mice were treated with Nω-nitro-L-arginine methyl ester (L-NAME) (20 mg/kg/day), an inhibitor of NO production. L-NAME administration abolished the difference in BP between the vehicle- and anti-VEGFR2-treated groups. Our data suggest that VEGF, acting via VEGFR2, plays a critical role in BP control by promoting NOS expression and NO activity. Interfering with this pathway is likely to be one mechanism underlying hypertension caused by anti-angiogenic agents targeting VEGF.
hypertension; angiogenesis; cancer; vascular endothelial growth factor; nitric oxide