Targeted radiotherapy of liver malignancies has been found to be effective in selected patients. A key limiting factor of these therapies is the relatively low radiation tolerance of the liver parenchyma. We sought to assess the preventive effects of a combined regimen of pentoxifylline (PTX), ursodeoxycholic acid (UDCA) and low-dose low-molecular-weight heparin (LMWH) on focal radiation-induced liver injury (fRILI).
Methods and Materials
Patients with liver metastases from colorectal carcinoma who were scheduled for local ablation by radiotherapy (image-guided high-dose-rate interstitial brachytherapy) were prospectively randomized to receive PTX, UDCA and LMWH for 8 weeks (treatment) or no medication (control). Focal RILI at follow-up was assessed using functional hepatobiliary magnetic resonance imaging (MRI). A minimal threshold dose, i.e. the dose to which the outer rim of the fRILI had been exposed, was quantified by merging MRI and dosimetry data.
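In principle, the threshold-dose readout could be sketched as follows (a minimal, hypothetical illustration: the 2D dose grid, the binary fRILI mask, and the 4-neighbour rim definition are assumptions for demonstration, not the study's actual registration pipeline):

```python
# Hypothetical sketch: after registering the MRI-derived fRILI mask onto the
# planning dose grid, the minimal threshold dose is the lowest planned dose
# found on the outer rim of the lesion.

def minimal_threshold_dose(dose, mask):
    """Return the minimum dose over rim voxels of a binary lesion mask.

    dose -- 2D list of planned doses (Gy) per voxel
    mask -- 2D list of 0/1 flags marking the fRILI volume
    A rim voxel belongs to the mask but has at least one 4-neighbour
    outside of it (or lies on the grid border).
    """
    rows, cols = len(mask), len(mask[0])
    rim_doses = []
    for i in range(rows):
        for j in range(cols):
            if not mask[i][j]:
                continue
            neighbours = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
            on_rim = any(
                not (0 <= a < rows and 0 <= b < cols) or not mask[a][b]
                for a, b in neighbours
            )
            if on_rim:
                rim_doses.append(dose[i][j])
    return min(rim_doses)
```

With a dose grid decreasing outward from an isocenter and a lesion mask covering the hotter region, the function returns the dose at the lesion boundary, i.e. the surrogate threshold dose.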
Results of a planned interim analysis necessitated premature termination of the trial. Twenty-two patients were included in the per-protocol analysis. The minimal mean hepatic threshold dose 6 weeks after radiotherapy (primary endpoint) was significantly higher in the treatment group than in the control group (19.1 Gy versus 14.6 Gy, p = 0.011). Qualitative evidence of fRILI on MRI at 6 weeks was observed in 45.5% of patients in the treatment group versus 90.9% in the control group. No significant differences between the groups were observed at the 12-week follow-up.
The post-therapeutic application of PTX, UDCA and low-dose LMWH significantly reduced the extent and incidence of fRILI at 6 weeks after radiotherapy. The development of subsequent fRILI at 12 weeks (4 weeks after cessation of PTX, UDCA and LMWH, which were given during weeks 1–8) in the treatment group was comparable to that in the control group, supporting the observation that the agents mitigated fRILI.
EU Clinical Trials Register 2008-002985-70; ClinicalTrials.gov NCT01149304
Cathepsin K (CatK) is mainly expressed by osteoclasts and plays an important role in bone resorption. As CatK is expressed and secreted by osteoclasts during active bone resorption, it may be a useful and specific biochemical marker of osteoclastic activity. Therefore, serum CatK levels were studied for monitoring the treatment of females with postmenopausal osteoporosis with zoledronic acid. Serum CatK levels were determined in nine postmenopausal females before and after 3, 6 and 12 months of treatment. The levels were significantly reduced after 3 and 6 months (P<0.05), whereas they returned to baseline after 1 year. Taken together, the serum level of CatK may be suitable for monitoring anti-osteoporotic therapy in association with treatment response.
bisphosphonate; cathepsin K; osteoporosis; bone density measurement; compliance
The complex and enormous diversity of microorganisms associated with plant roots is important for plant health and growth and is shaped by numerous factors. This study aimed to unravel the effects of soil type on bacterial communities in the rhizosphere of field-grown lettuce. We used an experimental plot system with three different soil types that had been stored at the same site for 10 years under the same agricultural management to reveal differences directly linked to the soil type and not influenced by other factors such as climate or cropping history. Bulk soil and rhizosphere samples were collected 3 and 7 weeks after planting. The analysis of 16S rRNA gene fragments amplified from total community DNA by denaturing gradient gel electrophoresis and pyrosequencing revealed soil-type-dependent differences in the bacterial community structure of the bulk soils and the corresponding rhizospheres. The rhizosphere effect differed depending on the soil type and the plant developmental stage. Despite the soil-type-dependent differences in bacterial community composition, several genera such as Sphingomonas, Rhizobium, Pseudomonas, and Variovorax were significantly increased in the rhizosphere of lettuce grown in all three soils. The number of rhizosphere responders was highest 3 weeks after planting. Interestingly, the highest shoot dry weights were observed in the soil with the highest number of responders. Heatmap analysis revealed that many dominant operational taxonomic units were shared among rhizosphere samples of lettuce grown in diluvial sand, alluvial loam, and loess loam, and that only a subset was increased in relative abundance in the rhizosphere compared to the corresponding bulk soil. The findings of this study provide insights into the effect of soil type on the rhizosphere microbiome of lettuce.
Lactuca sativa; bacterial communities; 16S rRNA gene analysis; DGGE; pyrosequencing; rhizosphere responders
A recent paper by Eklund et al. (2012) showed that up to 70% false positive results may occur when analyzing functional magnetic resonance imaging (fMRI) data with the statistical parametric mapping (SPM) software, which may mainly be caused by insufficient compensation for the temporal correlation between successive scans. Here, we show that a blockwise permutation method can be an effective alternative to the standard correction for correlated residuals in the general linear model, which assumes an AR(1) model as used in SPM for analyzing fMRI data. The blockwise permutation approach including a random shift developed by our group (Adolf et al., 2011) accounts for the temporal correlation structure of the data without requiring a specific definition of the underlying autocorrelation model. A total of 1465 publicly accessible resting-state data sets were re-analyzed, and the results were compared with those of Eklund et al. (2012). With the new permutation method, the nominal familywise error rate for the detection of activated voxels was approximately maintained even under the most critical conditions in which Eklund et al. found the largest deviations from the nominal error level. Thus, the method presented here can serve as a tool to improve the quality and reliability of fMRI data analyses.
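As an illustration of the general idea (not the authors' SPM implementation), a blockwise permutation with a random circular shift for a single voxel time series might look like this; the block length, the sum-of-products test statistic, and the data are assumptions for demonstration only:

```python
# Illustrative sketch of blockwise permutation with a random circular shift
# (after the idea in Adolf et al., 2011): shifting and permuting whole blocks
# preserves the short-range autocorrelation within each block.
import random

def blockwise_permutation(series, block_len, rng):
    """Randomly circular-shift the series, then permute blocks of
    length block_len."""
    n = len(series)
    shift = rng.randrange(n)                     # random circular shift
    shifted = series[shift:] + series[:shift]
    blocks = [shifted[i:i + block_len] for i in range(0, n, block_len)]
    rng.shuffle(blocks)                          # permute block order
    return [x for block in blocks for x in block]

def permutation_pvalue(series, regressor, block_len, n_perm=999, seed=0):
    """One-sided p-value for the statistic sum(x_t * r_t) under
    blockwise permutation of the series."""
    rng = random.Random(seed)
    stat = lambda s: sum(x * r for x, r in zip(s, regressor))
    observed = stat(series)
    exceed = sum(
        stat(blockwise_permutation(series, block_len, rng)) >= observed
        for _ in range(n_perm)
    )
    return (exceed + 1) / (n_perm + 1)
```

Because only blocks are rearranged, each permuted series has the same values and largely the same within-block correlation structure as the original, which is what allows the familywise error rate to be maintained without an explicit autocorrelation model.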
SPM analysis; functional MRI; familywise error rate; blockwise permutation including a random shift; autocorrelation
N-Acetylcysteine (NAC) reduces reperfusion injury and infarct size in experimental macroangiopathic stroke. Here we investigate the impact of NAC on the development of the histopathology of microangiopathic cerebrovascular disease, including initial intravasal erythrocyte accumulations, blood–brain barrier (BBB) disturbances, microbleeds and infarcts.
Spontaneously hypertensive stroke-prone rats (SHRSP) were treated with NAC (12 mg/kg body weight, daily oral application for 3 to 30 weeks) and compared to untreated SHRSP. In all rats, the numbers of microbleeds, thromboses, infarcts and stases were quantified by HE staining. Representative brains were additionally stained for von Willebrand factor (vWF), IgG, glutathione and GFAP.
NAC-treated animals exhibited significantly more microbleeds and a greater number of vessels with BBB disturbances, but also elevated glutathione levels in astrocytes surrounding small vessels. NAC treatment reduced the numbers of thromboses, infarcts and arteriolar stases.
NAC reduces the frequency of thromboses and infarcts at the expense of an increase in small microbleeds in a rat model of microangiopathic cerebrovascular disease. We suppose that NAC acts via an at least partial inactivation of vWF, resulting in insufficient sealing of initial endothelial injury and thus more small microbleeds. By elevating glutathione levels, NAC most likely exerts a radical scavenger function and protects small vessels against extended ruptures and subsequent infarcts. Finally, these findings suggest that stases are mainly caused by endothelial injuries and restricted thromboses.
Animal model; Blood–brain barrier; Cerebral microbleed; Cerebral small vessel disease; von Willebrand factor
Cerebral small vessel disease (CSVD) is associated with vessel wall changes, microbleeds, blood–brain barrier (BBB) disturbances, and reduced cerebral blood flow (CBF). As spontaneously hypertensive stroke-prone rats (SHRSP) may be a valid model of some aspects of human CSVD, we aimed to identify whether those changes occur in definite temporal stages and whether there is an initial phenomenon beyond those common vascular alterations. Groups of 51 SHRSP were examined simultaneously by histologic (Hematoxylin–Eosin, IgG immunohistochemistry, vessel diameter measurement) and imaging methods (magnetic resonance imaging, 201-thallium-diethyldithiocarbamate/99m-technetium-HMPAO single photon emission computed tomography, conducted as a pilot study) at different ages. Vascular pathology in SHRSP proceeds in definite stages, with an age-dependent accumulation of erythrocytes in capillaries and arterioles representing the homogeneous initial step of the disease. Erythrocyte accumulations are followed by BBB disturbances and microbleeds, both also increasing with age. Microthromboses, tissue infarctions with CBF reduction, and disturbed potassium uptake represent the final stage of vascular pathology in SHRSP. Erythrocyte accumulations without cerebral tissue damage, which we parsimoniously interpreted as stases, represent the first step of vascular pathology in SHRSP. If this initial phenomenon could be identified in patients, these erythrocyte accumulations might be a promising target for implementing prophylactic and therapeutic strategies in human CSVD.
animal models; blood–brain barrier; microcirculation; SPECT; white matter disease
Symptoms of gastro-esophageal reflux disease (GERD) in pregnancy are reported with a prevalence of 30–80%. The aim of this study was to assess the prevalence and severity of GERD symptoms during the course of pregnancy. Furthermore, current practice in medical care for GERD during pregnancy was assessed.
We performed a prospective longitudinal cohort study of 510 pregnant women (mean age 28.12 years, SD 5.3). Reflux symptoms were assessed using the validated Reflux Disease Questionnaire (RDQ). Additional information was collected about therapy. A group of non-pregnant women (mean age 24.56 years, SD 5.7) was included as controls. Frequency and severity of reflux symptoms were recorded in each trimester of pregnancy.
The prevalence of GERD symptoms in pregnant women increased from 26.1% in the first trimester to 36.1% in the second and 51.2% in the third trimester of pregnancy. The prevalence of GERD symptoms in the control group was 9.3%.
Pregnant women received medication for their GERD symptoms in 12.8% of cases during the first, 9.1% during the second and 15.7% during the third trimester. More than 90% of the medications used were antacids; no proton pump inhibitors (PPIs) were used.
GERD symptoms occur more often in pregnant than in non-pregnant women, and their frequency rises in the course of pregnancy. Medical therapy is used in a minority of cases and often provides no adequate symptom relief.
Gastro-esophageal reflux disease; Pregnancy; Heartburn; Regurgitation; GERD symptoms
Gastroesophageal reflux disease (GERD) is associated with impaired epithelial barrier function that is regulated by cell-cell contacts. The aim of the study was to investigate the expression pattern of selected components involved in the formation of tight junctions in relation to GERD.
Eighty-four patients with GERD-related symptoms, either with endoscopic signs (erosive: n = 47) or without them (non-erosive: n = 37), as well as 26 patients lacking GERD-specific symptoms as controls were included. Endoscopic and histological characterization of esophagitis was performed according to the Los Angeles and adapted Ismail-Beigi criteria, respectively. Mucosal biopsies from the distal esophagus were taken for analysis by histopathology, immunohistochemistry and quantitative reverse-transcription polymerase chain reaction (RT-PCR) of five genes encoding tight junction components [Occludin, Claudin-1, -2, Zonula occludens (ZO-1, -2)].
Histopathology confirmed GERD-specific alterations such as dilated intercellular spaces in the esophageal mucosa of patients with GERD compared to controls (P < 0.05). Claudin-1 and -2 showed a 2- to 6-fold upregulation at the transcript level (P < 0.01) and in part at the protein level (P < 0.015) in GERD, while subgroup analysis revealed this upregulation for erosive reflux disease (ERD) only. In both erosive and non-erosive reflux disease, expression levels of Occludin and ZO-1, -2 were not significantly affected. Notably, the induced expression of both claudins did not correlate with histopathological parameters (basal cell hyperplasia, dilated intercellular spaces) in patients with GERD.
Taken together, the lack of correlation between the expression of tight junction-related components and histomorphological GERD-specific alterations does not support a major role of the five proteins studied in the pathogenesis of GERD.
Gastroesophageal reflux disease; Tight junction; Claudins; Esophagitis; Inflammation
Larvae of the Western Corn Rootworm (WCR) feeding on maize roots cause heavy economic losses in the US and in Europe. New or adapted pest management strategies urgently require a better understanding of the multitrophic interactions in the rhizosphere. This study aimed to investigate the effect of WCR root feeding on the microbial communities colonizing the maize rhizosphere.
In a greenhouse experiment, maize lines KWS13, KWS14, KWS15 and MON88017 were grown in three different soil types in the presence and absence of WCR larvae. Bacterial and fungal community structures were analyzed by denaturing gradient gel electrophoresis (DGGE) of 16S rRNA gene and ITS fragments, PCR-amplified from the total rhizosphere community DNA. DGGE bands with increased intensity were excised from the gel, cloned and sequenced in order to identify specific bacteria responding to WCR larval feeding. DGGE fingerprints showed that the soil type and the maize line influenced the fungal and bacterial communities inhabiting the maize rhizosphere. WCR larval feeding affected the rhizosphere microbial populations in a soil type and maize line dependent manner. DGGE band sequencing revealed an increased abundance of Acinetobacter calcoaceticus in the rhizosphere of several maize lines in all soil types upon WCR larval feeding.
The effects of both the rhizosphere and WCR larval feeding seemed to be stronger on bacterial than on fungal communities. Bacterial and fungal community shifts in response to larval feeding were most likely due to changes in root exudation patterns. The increased abundance of A. calcoaceticus suggests that phenolic compounds were released upon WCR wounding.
Since the mid-1990s, investigational sites in the countries of Central and Eastern Europe (CEE) have been increasingly utilized by pharmaceutical companies because of their high productivity in terms of patient enrolment into clinical trials. Based on the FDA’s publicly accessible Clinical Investigator Inspection List, we present an analysis of findings and outcome classifications from FDA inspections during Investigational New Drug (IND) studies and compare the results for the CEE region with those from Western European countries and the USA. Data from all 5531 FDA clinical trial inspections that occurred between 1994 (when the FDA first performed inspections in CEE) and the end of 2010 were entered into a database for comparative analysis. Of these, 4865 routine data audit (DA) inspections were analyzed: 401 from clinical trials performed in Western Europe, 230 in CEE, 3858 in the USA, and 376 in other countries. The average number of deficiencies per inspection ranged between 0.99 for CEE and 1.97 for Western Europe. No deficiencies were noted during 16.6%, 39.0%, and 21.5% of the inspections in Western Europe, CEE and the USA, respectively. The percentages of inspections after which no follow-up action was indicated were 36.9% for Western Europe, 55.7% for CEE, and 44.3% for US sites. CEE was also the region with the lowest percentage of inspections that required official or voluntary action. On the basis of FDA inspection data, the high productivity of CEE sites appears to be accompanied by regulatory compliance and data quality standards that are not inferior to those in Western regions.
clinical trials; inspection; Central and Eastern Europe (CEE); data quality; deficiencies
Human cerebral small vessel disease (CSVD) has been hypothesized to be an age-dependent disease accompanied by similar vascular changes in other organs. Spontaneously hypertensive stroke-prone rats (SHRSP) feature numerous vascular risk factors and may be a valid model of some aspects of human CSVD. Here we compare renal histopathological changes with the brain pathology of SHRSP.
Material and Methods
We histologically investigated the brains and kidneys of 61 SHRSP at different ages (12 to 44 weeks). The brain pathology (aggregated erythrocytes in capillaries and arterioles, microbleeds, microthromboses) and the kidney pathology (aggregated erythrocytes within peritubular capillaries, tubular protein cylinders, glomerulosclerosis) were quantified separately. The prediction of brain pathology by kidney pathology was assessed by creating receiver operating characteristic (ROC) curves integrating the degree of kidney pathology and the age of the SHRSP.
Both brain and kidney pathology show an age dependency and proceed in definite stages, with an aggregation of erythrocytes in capillaries and arterioles, which we parsimoniously interpreted as stases, representing the initial finding in both organs. Early renal tubulointerstitial damage, characterized by rather few intravasal erythrocyte aggregations and tubular protein cylinders, predicts the initial step of the cerebral vascular pathology of SHRSP, marked by accumulated erythrocytes. The combined increase of intravasal erythrocyte aggregations and protein cylinders, accompanied by glomerulosclerosis and thrombotic renal microangiopathy in kidneys of older SHRSP, predicts the final stages of the cerebrovascular lesions, marked by microbleeds and thrombotic infarcts.
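At its core, the ROC-based prediction amounts to ranking animals by a kidney-derived score and asking how well that ranking separates animals with and without the cerebral finding. A minimal rank-based AUC sketch (the score values and labels below are purely illustrative, not study data):

```python
# Minimal sketch of ROC analysis via the rank-based (Mann-Whitney) AUC:
# the AUC equals the probability that a randomly chosen positive case
# receives a higher score than a randomly chosen negative case.

def roc_auc(scores, labels):
    """Area under the ROC curve from raw scores and binary labels.

    scores -- predictor values (e.g. a composite of kidney-pathology
              grade and age; higher = more pathological)
    labels -- 1 if the brain finding is present, else 0
    Ties between a positive and a negative count as 0.5.
    """
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

An AUC of 1.0 indicates perfect separation, 0.5 indicates no predictive value; the trapezoidal area under the empirical ROC curve gives the same number.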
Our results illustrate a close association between structural brain and kidney pathology and support the concept of small vessel disease as an age-dependent systemic pathology. Furthermore, improved combined nephrologic and neurologic diagnostics may help to identify patients with CSVD at an early stage.
To assess brachytherapy catheter positioning accuracy and to evaluate the effects of prolonged irradiation time on the tolerance dose of normal liver parenchyma following single-fraction irradiation with ¹⁹²Ir.
Materials and methods
Fifty patients with 76 malignant liver tumors treated by computed tomography (CT)-guided high-dose-rate brachytherapy (HDR-BT) were included in the study. The prescribed radiation dose was delivered by 1–11 catheters with exposure times in the range of 844–4432 seconds. Magnetic resonance imaging (MRI) datasets for assessing irradiation effects on normal liver tissue (edema and hepatocyte dysfunction), obtained 6 and 12 weeks after HDR-BT, were merged with 3D dosimetry data. The isodose of the treatment plan covering the same volume as the irradiation effect was taken as a surrogate for the liver tissue tolerance dose. Catheter positioning accuracy was assessed by calculating the shift between the 3D center coordinates of the irradiation effect volume and the tolerance dose volume for 38 irradiation effects in 30 patients induced by catheters implanted in a nearly parallel arrangement. Effects of prolonged irradiation were assessed in areas where the irradiation effect volume and the tolerance dose volume did not overlap (mismatch areas) using a catheter contribution index. This index was calculated for 48 irradiation effects induced by at least two catheters in 44 patients.
Positioning accuracy of the brachytherapy catheters was 5–6 mm. The orthogonal and axial shifts between the center coordinates of the irradiation effect volume and the tolerance dose volume, in relation to the direction vector of catheter implantation, were highly correlated and to a first approximation identical in the T1-w and T2-w MRI sequences (p = 0.003 and p < 0.001, respectively), as were the shifts between the 6- and 12-week examinations (p = 0.001 and p = 0.004, respectively). There was a significant shift of the irradiation effect towards the catheter entry site compared with the planned dose distribution (p < 0.005). Prolonged treatment time is expected to increase the normal tissue tolerance dose; here, however, the catheter contribution indices indicated a lower tolerance dose of the liver parenchyma in areas with prolonged irradiation (p < 0.005).
Positioning accuracy of the brachytherapy catheters is sufficient for clinical practice. The reduced tolerance dose in areas exposed to prolonged irradiation contradicts results published in the current literature. Effects of prolonged dose administration on the liver tolerance dose for treatment times of up to 60 minutes per HDR-BT session are not pronounced compared with the effects of catheter positioning accuracy and are therefore of minor importance in treatment planning.
The German Network of Disorders of Sex Development (DSD)/Intersexuality carried out a large-scale clinical evaluation study on quality of life, gender identity, treatment satisfaction, coping, and problems associated with diagnoses and therapies in individuals with disorders of sex development (DSD). DSD are a heterogeneous group of various genetic disorders of sex determination or sex differentiation, all of which are rare conditions. In about half of all cases the molecular genetic diagnosis is unknown and diagnosis rests on clinical features.
Methods and design
The multi-centre clinical evaluation study includes short-term follow-up in some and cross-sectional assessments in all age and diagnostic groups fitting the criteria of DSD. Recruitment ran from January 2005 until December 2007 throughout Germany and, additionally, in 2007 in Austria and German-speaking Switzerland. The study consists of a psychosocial inquiry with standardized instruments for children, adolescents and their parents, and for adults, together with the collection of DSD-specific medical data by the attending physician. The main goal was the description of clinical outcomes and the health-care situation of individuals with DSD, using a broad generic definition of DSD including all conditions with a mismatch of chromosomal, gonadal and phenotypical sex. In total, 439 children and adolescents, their parents, and adults with DSD participated.
The clinical evaluation study represents the most comprehensive study in this clinical field. The paper discusses the study protocol, the data management and data quality as well as the classification used, and it describes the study population. Given the lack of large datasets in rare conditions such as DSD and often biased results from small scale clinical case series, the study aims to generate concrete hypotheses for evidence-based guidelines, which should be tested in further studies.
In many research areas it is necessary to find differences between treatment groups with respect to several variables. For example, studies of microarray data seek, for each variable, a significant difference of the location parameters from zero, or from one in the case of ratios. However, in some studies a significant deviation of the difference in locations from zero (or of the ratio from one) is biologically meaningless. In such cases, a relevant difference or ratio is sought.
This article addresses the use of relevance-shifted tests on ratios for a multivariate parallel two-sample group design. Two empirical procedures are proposed which embed the relevance-shifted test on ratios. As both procedures test a hypothesis for each variable, the resulting multiple testing problem has to be considered; hence, the procedures include a multiplicity correction. Both procedures are extensions of available procedures for point null hypotheses achieving exact control of the familywise error rate. Whereas shifting the null hypothesis alone would give straightforward solutions, the difficulties motivating the empirical considerations discussed here arise from the fact that the shift is considered in both directions and the whole parameter space between these two limits has to be accepted as the null hypothesis.
The first procedure uses a permutation algorithm and is appropriate for designs with a moderately large number of observations. However, many experiments have limited sample sizes; then the second procedure might be more appropriate, in which multiplicity is corrected according to a concept of data-driven ordering of hypotheses.
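For a single variable, the relevance-shifted permutation idea can be sketched as follows. This is a simplified, one-sided illustration (testing whether the ratio of means exceeds a relevance threshold delta by rescaling the treatment sample), not the paper's full two-sided procedures with multiplicity control; the data and threshold below are invented for demonstration:

```python
# Hedged sketch of a relevance-shifted permutation test for one variable:
# to test H0: mean(x)/mean(y) <= delta against H1: ratio > delta, the
# treatment sample is divided by delta (a shift on the log scale) and a
# standard one-sided permutation test of "difference in means > 0" is run.
import random

def relevance_shifted_pvalue(x, y, delta, n_perm=999, seed=0):
    """One-sided permutation p-value for H0: mean(x)/mean(y) <= delta."""
    rng = random.Random(seed)
    shifted_x = [v / delta for v in x]           # relevance shift
    pooled = shifted_x + list(y)
    n_x = len(x)
    stat = lambda a, b: sum(a) / len(a) - sum(b) / len(b)
    observed = stat(shifted_x, y)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # relabel group membership
        if stat(pooled[:n_x], pooled[n_x:]) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)
```

Extending this to many variables is where the paper's contribution lies: the per-variable tests must be combined under a multiplicity correction (permutation-based or via data-driven ordering of hypotheses) so that the familywise error rate is controlled exactly.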
Changes in synaptic efficacy underlying learning and memory processes are assumed to be associated with alterations of the protein composition of synapses. Here, we performed a quantitative proteomic screen to monitor changes in the synaptic proteome of four brain areas (auditory cortex, frontal cortex, hippocampus, striatum) during auditory learning. Mice were trained in a shuttle-box GO/NO-GO paradigm to discriminate between rising and falling frequency-modulated tones to avoid mild electric foot shock. Control-treated mice received corresponding numbers of either the tones or the foot shocks. Six and 24 h later, the composition of a fraction enriched in synaptic cytomatrix-associated proteins was compared to that obtained from naïve mice by quantitative mass spectrometry. In the synaptic protein fraction obtained from trained mice, the average percentage (±SEM) of downregulated proteins (59.9 ± 0.5%) exceeded that of upregulated proteins (23.5 ± 0.8%) in the brain regions studied. This effect was significantly smaller in foot shock (42.7 ± 0.6% down, 40.7 ± 1.0% up) and tone controls (43.9 ± 1.0% down, 39.7 ± 0.9% up). These data suggest that learning processes initially induce removal and/or degradation of proteins from presynaptic and postsynaptic cytoskeletal matrices before these structures can acquire a new, post-learning organisation. In silico analysis points to a general role of insulin-like signalling in this process.
Animal proteomics; Auditory learning; Chemical synapse; Isotope-coded protein labelling; Learning and memory; Quantitative mass spectrometry