It is not well established whether pretreatment 18F-FDG PET/CT can predict the local response of head and neck squamous cell carcinoma (HNSCC) to chemoradiotherapy (CRT). We examined 118 patients who had completed CRT: 11 with nasopharyngeal cancer (NPC), 30 with oropharyngeal cancer (OPC), and 77 with laryngohypopharyngeal cancer (LHC). PET/CT parameters of the primary tumor, including metabolic tumor volume (MTV), total lesion glycolysis (TLG), and maximum and mean standardized uptake values (SUVmax and SUVmean), were correlated with local response according to primary site and human papillomavirus (HPV) status. Receiver operating characteristic analyses were performed to assess the predictive value of the PET/CT parameters, and logistic regression analyses were used to identify independent predictors. The area under the curve (AUC) of the PET/CT parameters ranged from 0.53 to 0.63 in NPC and from 0.50 to 0.54 in OPC. HPV-negative OPC showed AUCs ranging from 0.51 to 0.58, whereas all HPV-positive OPCs showed a complete response. In contrast, AUCs ranged from 0.71 to 0.90 in LHC. Moreover, the AUCs of MTV and TLG were significantly higher than those of SUVmax and SUVmean (P < 0.01). On multivariate analysis, high MTV (>25.0 mL) and high TLG (>144.8 g) remained independent, significant predictors of incomplete response compared with low MTV (odds ratio [OR], 13.4; 95% confidence interval [CI], 2.5–72.9; P = 0.003) and low TLG (OR, 12.8; 95% CI, 2.4–67.9; P = 0.003), respectively. In conclusion, the predictive efficacy of pretreatment 18F-FDG PET/CT varies with the primary site and the chosen parameter. The local response of LHC is highly predictable with volume-based PET/CT parameters.
Chemoradiotherapy; head and neck squamous cell carcinoma; local response; metabolic tumor volume; total lesion glycolysis
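As an illustration of the ROC analysis above, the AUC of a single PET/CT parameter can be computed nonparametrically as a Mann-Whitney statistic. The sketch below uses hypothetical MTV values, not the study data:

```python
def auc_mann_whitney(values_incomplete, values_complete):
    """AUC as the probability that a randomly chosen incomplete responder
    has a higher parameter value than a complete responder
    (Mann-Whitney U statistic divided by n1 * n2); ties count as 0.5."""
    wins = 0.0
    for x in values_incomplete:
        for y in values_complete:
            if x > y:
                wins += 1.0
            elif x == y:
                wins += 0.5
    return wins / (len(values_incomplete) * len(values_complete))

# Hypothetical MTV values (mL); real values would come from PET/CT segmentation
mtv_incomplete = [40.2, 18.5, 61.0, 33.1]     # incomplete responders
mtv_complete = [12.4, 8.9, 22.0, 15.3, 26.7]  # complete responders
print(auc_mann_whitney(mtv_incomplete, mtv_complete))  # 0.9
```

An AUC of 0.5 corresponds to no discrimination and 1.0 to perfect separation of incomplete from complete responders.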
Chromosome 9p21 single nucleotide polymorphism (SNP) is a susceptibility variant for acute myocardial infarction (AMI) in the primary prevention setting. However, it is controversial whether this SNP is also associated with recurrent myocardial infarction (ReMI) in the secondary prevention setting. The purpose of this study was to evaluate the impact of chromosome 9p21 SNP on ReMI in patients receiving secondary prevention programmes after AMI.
A prospective observational study.
Osaka Acute Coronary Insufficiency Study (OACIS) in Japan.
2022 patients from the OACIS database.
Genotyping of the 9p21 rs1333049 variant.
Primary outcome measures
ReMI event after survival discharge for 1 year.
A total of 43 ReMIs occurred during the 1-year follow-up period. Although the rs1333049 C allele conferred increased susceptibility to a first AMI in an additive model when compared with 1373 healthy controls (OR 1.20, 95% CI 1.09 to 1.33, p=2.3×10⁻⁴), patients with the CC genotype had a lower incidence of ReMI at 1 year after discharge for AMI (log-rank p=0.005). The adjusted HR for the CC genotype as compared with the CG/GG genotypes was 0.20 (0.06 to 0.65, p=0.007). Subgroup analysis demonstrated that the association between the rs1333049 CC genotype and a lower incidence of 1-year ReMI was common to all subgroups.
Homozygous carriers of the rs1333049 C allele on chromosome 9p21 showed a reduced risk of 1-year ReMI in the contemporary percutaneous coronary intervention era, although the C allele had conferred susceptibility to the first AMI.
Acute Myocardial Infarction; Secondary Prevention; Single Nucleotide Polymorphism; 9p21
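The genotype comparison above rests on a 2x2 odds ratio with a confidence interval on the log scale. A minimal sketch, with hypothetical counts rather than the OACIS data, is:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table, where (a, b) are exposed cases/controls
    and (c, d) are unexposed cases/controls, with a Wald 95% CI computed
    on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: ReMI events vs event-free by genotype (CC vs CG/GG)
or_, lo, hi = odds_ratio_ci(4, 600, 39, 1379)
print(f"OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

The Wald interval is the simplest choice; with small cell counts an exact or profile-likelihood interval would be preferable.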
We discuss sample size determination for clinical trials evaluating the joint effects of an intervention on two potentially correlated co-primary time-to-event endpoints. For illustration, we consider the most common case, a comparison of two randomized groups, and use typical copula families to model the bivariate endpoints. A correlation structure of the bivariate logrank statistic is specified to account for the correlation among the endpoints, although the between-group comparison is performed using the univariate logrank statistic. We propose methods to calculate the required sample size to compare the two groups and evaluate the performance of the methods and the behavior of required sample sizes via simulation.
Bivariate dependence; Censored data; Copula model; Logrank statistic; Power
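One way to realize the copula model described above is to simulate bivariate event times by conditional inversion of the copula. The sketch below assumes a Clayton copula and exponential margins with hypothetical hazard rates; simulated pairs like these can feed a Monte Carlo evaluation of logrank power and required sample size:

```python
import math
import random

def clayton_bivariate_exponential(n, theta, lam1, lam2, seed=0):
    """Draw n pairs of positively correlated exponential event times whose
    dependence follows a Clayton copula (Kendall's tau = theta / (theta + 2)).
    U2 is sampled from its conditional distribution given U1 by inversion."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u1, v = rng.random(), rng.random()
        u2 = (u1 ** -theta * (v ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
        # Inverse-CDF transform to exponential margins
        pairs.append((-math.log(u1) / lam1, -math.log(u2) / lam2))
    return pairs

# Simulated co-primary endpoint times with marginal hazards 0.1 and 0.2
pairs = clayton_bivariate_exponential(20000, theta=2.0, lam1=0.1, lam2=0.2)
```

Here theta = 2 gives Kendall's tau = 0.5; other copula families (Gumbel, Frank) would be substituted in the same way.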
The onset of acute myocardial infarction (AMI) shows characteristic circadian variations involving a definite morning peak and a less-defined night-time peak. However, the factors influencing the circadian patterns of AMI onset and their influence on morning and night-time peaks have not been fully elucidated.
Design, setting and participants
An analysis of patients registered between 1998 and 2008 in the Osaka Acute Coronary Insufficiency Study, which is a prospective, multicentre observational study of patients with AMI in the Osaka region of Japan. The present study included 7755 consecutive patients with a known time of AMI onset.
Main outcomes and measures
A mixture of two von Mises distributions was used to examine whether a circadian pattern of AMI had uniform, unimodal or bimodal distribution, and the likelihood ratio test was then used to select the best circadian pattern among them. The hierarchical likelihood ratio test was used to identify factors affecting the circadian patterns of AMI onset. The Kaplan-Meier method was used to estimate survival curves of 1-year mortality according to AMI onset time.
The overall population had a bimodal circadian pattern of AMI onset characterised by a high, sharp morning peak and a lower, less-defined night-time peak (bimodal p<0.001). Although several lifestyle-related factors had statistically significant associations with the circadian pattern of AMI onset, serum triglyceride level had the most prominent association. Patients with triglyceride ≥150 mg/dL on admission had only one morning peak in the circadian pattern of AMI onset during weekdays, with no peaks detected on weekends, whereas all other subgroups had two peaks throughout the week.
The circadian pattern of AMI onset was characterised by bimodality. Notably, several lifestyle-related factors, particularly serum triglyceride levels, had a strong relation with the circadian pattern of AMI onset.
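The two-component von Mises mixture used above can be sketched as follows. The mixing weight and concentration parameters are hypothetical, chosen only to reproduce a sharp morning peak and a broader night-time peak; in the study these would be estimated by maximum likelihood and compared with uniform and unimodal fits via likelihood ratio tests:

```python
import numpy as np

def von_mises_pdf(t, mu, kappa):
    """Circular density on [0, 2*pi); np.i0 is the modified Bessel function I0."""
    return np.exp(kappa * np.cos(t - mu)) / (2 * np.pi * np.i0(kappa))

def bimodal_onset_pdf(t, p, mu1, kappa1, mu2, kappa2):
    """Two-component von Mises mixture for time of day of AMI onset,
    with clock hours mapped to angles; p is the weight of the morning peak."""
    return p * von_mises_pdf(t, mu1, kappa1) + (1 - p) * von_mises_pdf(t, mu2, kappa2)

hours = np.arange(24)
angles = 2 * np.pi * hours / 24
# Hypothetical parameters: a sharp 09:00 peak and a broader 22:00 peak
dens = bimodal_onset_pdf(angles, p=0.65,
                         mu1=2 * np.pi * 9 / 24, kappa1=4.0,
                         mu2=2 * np.pi * 22 / 24, kappa2=1.5)
print(int(hours[np.argmax(dens)]))  # hour of the dominant (morning) peak: 9
```

Setting kappa to 0 in one component recovers a uniform-plus-unimodal model, which is how the nested likelihood ratio comparison arises.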
Postoperative atrial fibrillation (POAF) is one of the most common complications after cardiac surgery. Patients who develop POAF have a prolonged stay in the intensive care unit and hospital and an increased risk of postoperative stroke. Many guidelines for the management of cardiac surgery patients, therefore, recommend perioperative administration of beta-blockers to prevent and treat POAF. Landiolol is an ultra-short acting beta-blocker, and some randomized controlled trials of landiolol administration for the prevention of POAF have been conducted in Japan. This meta-analysis evaluated the effectiveness of landiolol administration for the prevention of POAF after cardiac surgery.
The Medline/PubMed and BioMed Central databases were searched for randomized controlled trials comparing cardiac surgery patients who received perioperative landiolol with a control group (saline administration, no drug administration, or other treatment). Two independent reviewers selected the studies for inclusion. Data regarding POAF and safety outcomes were extracted. Odds ratios (ORs) with 95% confidence intervals (CIs) were calculated using the Mantel–Haenszel method (fixed effects model).
Six trials with a total of 560 patients were included in the meta-analysis. Landiolol administration significantly reduced the incidence of POAF after cardiac surgery (OR 0.26, 95% CI 0.17–0.40). The effectiveness of landiolol administration was similar in three groups: all patients who underwent coronary artery bypass grafting (CABG) (OR 0.27, 95% CI 0.17–0.43), patients who underwent CABG compared with a control group who received saline or nothing (OR 0.28, 95% CI 0.17–0.45), and all patients who underwent cardiac surgery compared with a control group who received saline or nothing (OR 0.27, 95% CI 0.17–0.42). Only two adverse events associated with landiolol administration were observed (2/302, 0.7%): hypotension in one patient and asthma in one patient.
Landiolol administration reduces the incidence of POAF after cardiac surgery and is well tolerated.
Electronic supplementary material
The online version of this article (doi:10.1007/s12325-014-0116-x) contains supplementary material, which is available to authorized users.
Atrial fibrillation; Beta-blocker; Cardiac surgery; Cardiology; Landiolol; Meta-analysis; Perioperative
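The fixed-effects Mantel–Haenszel pooling used above can be sketched as follows, with hypothetical per-trial counts in place of the six included trials:

```python
def mantel_haenszel_or(tables):
    """Fixed-effect Mantel-Haenszel pooled odds ratio across 2x2 tables.
    Each table is (a, b, c, d) = (treated events, treated non-events,
    control events, control non-events)."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# Hypothetical per-trial POAF counts with and without landiolol
trials = [(3, 27, 10, 20), (5, 45, 15, 35), (2, 28, 9, 21)]
print(round(mantel_haenszel_or(trials), 2))  # 0.22
```

Each stratum contributes in proportion to its size, so small trials cannot dominate the pooled estimate.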
Glucocorticoids (GCs) are one of the most effective anti-inflammatory drugs for treating acute and chronic inflammatory diseases. However, several studies have shown that GCs alter collagen metabolism in the skin and induce skin atrophy. Cortisol is the endogenous GC that is released in response to various stressors. Over the last decade, extraadrenal cortisol production in various tissues has been reported. Skin also synthesizes cortisol through a de novo pathway and through an activating enzyme. 11β-hydroxysteroid dehydrogenase 1 (11β-HSD1) is the enzyme that catalyzes the conversion of hormonally inactive cortisone to active cortisol in cells. We previously found that 11β-HSD1 negatively regulates proliferation of keratinocytes. To determine the function of 11β-HSD1 in dermal fibroblasts and collagen metabolism, the effect of a selective 11β-HSD1 inhibitor was studied in mouse tissues and dermal fibroblasts. The expression of 11β-HSD1 increased with age in mouse skin. Subcutaneous injection of a selective 11β-HSD1 inhibitor increased dermal thickness and collagen content in the mouse skin. In vitro, proliferation of dermal fibroblasts derived from 11β-HSD1 null mice (Hsd11b1−/− mice) was significantly increased compared with fibroblasts from wild-type mice. However, in vivo, dermal thickness of Hsd11b1−/− mice was not altered in 3-month-old and 1-year-old mouse skin compared with wild-type mouse skin. These in vivo findings suggest the presence of compensatory mechanisms in Hsd11b1−/− mice. Our findings suggest that 11β-HSD1 inhibition may reverse the decreased collagen content observed in intrinsically and extrinsically aged skin and in skin atrophy that is induced by GC treatment.
The benefits of statins, commonly prescribed for hypercholesterolemia, in treating Alzheimer’s disease (AD) have not yet been fully established. A recent randomized clinical trial did not show any therapeutic effects of two statins on cognitive function in AD. Interestingly, however, the results of the Rotterdam study, one of the largest prospective cohort studies, showed reduced risk of AD in statin users. Based on the current understanding of statin actions and AD pathogenesis, it is still worth exploring whether statins can prevent AD when administered decades before the onset of AD or from midlife. This review discusses the possible beneficial effects of statins, drawn from previous clinical observations, pathogenic mechanisms, which include β-amyloid (Aβ) and tau metabolism, genetic and non-genetic risk factors (apolipoprotein E, cholesterol, sex, hypertension, and diabetes), and other clinical features (vascular dysfunction and oxidative and inflammatory stress) of AD. These findings suggest that administration of statins in midlife might prevent AD in late life by modifying genetic and non-genetic risk factors for AD. It should be clarified whether statins inhibit Aβ accumulation, tau pathological features, and brain atrophy in humans. To answer this question, a randomized controlled study using amyloid positron emission tomography (PET), tau-PET, and magnetic resonance imaging would be useful. This clinical evaluation could help us to overcome this devastating disease.
statin; Alzheimer’s disease; prevention; Abeta; isoprenoids
Objective. To evaluate which types of DNA damage are detected in rheumatoid arthritis (RA). Methods. DNA adducts, namely 8-oxo-7,8-dihydro-2′-deoxyguanosine (8-oxo-dG), 1,N6-etheno-2′-deoxyadenosine (εdA), and heptanone-etheno-2′-deoxycytidine (HεdC), in genomic DNA derived from whole blood cells of 46 RA patients and 31 healthy controls were analyzed by high-performance liquid chromatography tandem mass spectrometry, and their levels in RA patients and controls were compared. In addition, the correlation between DNA adducts and clinical parameters of RA was analyzed. Results. Compared with controls, HεdC levels in RA were significantly higher (P < 0.0001) and age dependent (r = 0.43, P < 0.01), whereas there was no significant difference in 8-oxo-dG or εdA accumulation between RA patients and controls. HεdC levels correlated well with the number of swollen joints (r = 0.57, P < 0.0001) and weakly with the number of tender joints (r = 0.26, P = 0.08) of RA patients, but showed no significant association with serological markers such as C-reactive protein and matrix metalloproteinase 3. Conclusion. These findings indicate that HεdC may have some influence on the development of RA and/or its complications.
To investigate the utility of serum squamous cell carcinoma antigen (SCC-Ag) levels at the diagnosis of recurrent cervical cancer for decision making in patient management.
Clinical records from 167 cervical cancer patients who developed recurrence between April 1996 and September 2010 were reviewed. A Cox proportional hazards regression model was used to investigate the prognostic significance of serum SCC-Ag levels at the time of recurrence. The effects of various salvage treatments on survival outcomes of recurrent cervical cancer were examined with respect to serum SCC-Ag levels.
Serum SCC-Ag levels were elevated (>2.0 ng/mL) in 125 patients (75%) when recurrence was diagnosed. These patients exhibited significantly shorter postrecurrence survival than those with normal SCC-Ag levels (log-rank; p=0.033). Multivariate analyses revealed that an elevated serum SCC-Ag level was an independent prognostic factor for poor postrecurrence survival. In patients with SCC-Ag levels <14.0 ng/mL, radiotherapy or surgery resulted in improved survival compared with chemotherapy or supportive care. In contrast, in patients with SCC-Ag levels of ≥14.0 ng/mL, salvage treatment with radiotherapy had only a minimal impact on postrecurrence survival.
The serum SCC-Ag level measured when cervical cancer recurrence is diagnosed can be useful for deciding upon the appropriate salvage treatment.
Decision-making; Recurrent cervical cancer; Squamous cell carcinoma antigen; Survival
Clinical trials with event-time outcomes as co-primary contrasts are common in many areas such as infectious disease, oncology, and cardiovascular disease. We discuss methods for calculating the sample size for randomized superiority clinical trials with two correlated time-to-event outcomes as co-primary contrasts when the time-to-event outcomes are exponentially distributed. The approach is simple and easily applied in practice.
bivariate exponential distribution; conjunctive power; co-primary endpoints; copula; log-transformed hazard ratio; right-censored; type II error rate
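A standard ingredient of such calculations is Schoenfeld's events formula for a single logrank comparison under exponential times; the sketch below shows that univariate building block. The conjunctive power for two correlated co-primary endpoints additionally depends on the joint (copula) distribution, which this sketch does not model:

```python
import math
from statistics import NormalDist

def required_events(hr, alpha=0.05, power=0.80):
    """Schoenfeld's approximation: total number of events needed for a
    two-sided logrank test with 1:1 allocation to detect hazard ratio hr."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return 4 * (z_a + z_b) ** 2 / math.log(hr) ** 2

def sample_size(hr, lam, tau, alpha=0.05, power=0.80):
    """Total patients: events inflated by the exponential event probability
    P(T <= tau) = 1 - exp(-lam * tau) by the follow-up time tau."""
    p_event = 1 - math.exp(-lam * tau)
    return math.ceil(required_events(hr, alpha, power) / p_event)

print(math.ceil(required_events(0.7)))       # ~247 events for HR 0.7
print(sample_size(0.7, lam=0.1, tau=5.0))    # total patients at that hazard
```

For co-primary endpoints, the larger of the per-endpoint sample sizes is only a lower bound; the joint power requirement generally pushes the number higher, depending on the correlation.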
Whole-body computed tomography (CT) has gained importance in the early diagnostic phase of trauma care. However, the diagnostic value of CT for seriously injured patients is not thoroughly clarified. This study assessed whether preoperative CT beneficially affected survival of patients with blunt trauma who required emergency bleeding control.
This retrospective study was conducted from January 2004 to December 2010 in two tertiary trauma centers in Japan. The primary inclusion criterion was patients with blunt trauma who required emergency bleeding control (surgery or transcatheter arterial embolization). CT before emergency bleeding control was performed at the attending physician's discretion based on individual patient condition (for example, hemodynamic stability or certain abnormalities in the primary survey). We assessed covariates associated with 28-day mortality with multivariate logistic regression analysis and evaluated standardized mortality ratio (SMR, ratio of observed to predicted mortality by Trauma and Injury Severity Score (TRISS) method) in two subgroups of patients who did or did not undergo CT.
The inclusion criterion was fulfilled by 152 patients with a median Injury Severity Score of 35.3. During the early resuscitation phase, 132 (87%) patients underwent CT and 20 (13%) did not. Severity of injury was significantly higher in the non-CT versus CT group patients. Observed mortality rate was significantly lower in the CT versus non-CT group (18% vs. 80%, P <0.001). Multivariate adjustment for the probability of survival (Ps) by TRISS method confirmed CT as an independent predictor for 28-day mortality (adjusted OR, 7.22; 95% CI, 1.76 to 29.60; P = 0.006). In the subgroup with less severe trauma (TRISS Ps ≥50%), SMR in the CT group was 0.63 (95% CI, 0.23 to 1.03; P = 0.066), indicating no significant difference between observed and predicted mortality in the CT group. In contrast, in the subgroup with more severe trauma (TRISS Ps <50%), SMR was 0.65 (95% CI, 0.41 to 0.90; P = 0.004) only in the CT group, whereas the difference between observed and predicted mortality was not significant in the non-CT group, suggesting a possible beneficial effect of CT on survival only in trauma patients at high risk of death.
CT performed before emergency bleeding control might be associated with improved survival, especially in severe trauma patients with TRISS Ps of <50%.
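The standardized mortality ratio used above is the ratio of observed deaths to the TRISS-predicted number. A minimal sketch with hypothetical data, using a simple Poisson-based normal-approximation CI (the study's exact CI method may differ), is:

```python
import math

def smr(observed_deaths, predicted_survival_probs):
    """Standardized mortality ratio: observed deaths divided by the
    TRISS-predicted number of deaths (sum of 1 - Ps over patients),
    with an approximate 95% CI treating the observed count as Poisson."""
    expected = sum(1 - ps for ps in predicted_survival_probs)
    ratio = observed_deaths / expected
    half = 1.96 * math.sqrt(observed_deaths) / expected
    return ratio, ratio - half, ratio + half

# Hypothetical cohort: 10 deaths among 40 patients with these TRISS Ps values
ps = [0.3] * 20 + [0.6] * 20
ratio, lo, hi = smr(10, ps)
print(f"SMR {ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

An SMR below 1 with a CI excluding 1 indicates fewer deaths than the TRISS model predicts.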
Tuberous sclerosis complex (TSC) is an autosomal dominant disorder with multi-system involvement and variable manifestations. There has been significant progress in TSC research and in the development of technologies used to diagnose this disorder. As a result, individuals with mild TSC are now being diagnosed, including many older adults who have not developed seizures or cognitive abnormalities. We conducted a statistical analysis of the frequency of TSC manifestations in a population of Japanese adults and children, comparing our findings with historical data. The chi-square test was used to examine the frequency of each manifestation by age. A total of 166 outpatients at the Department of Dermatology of Osaka University Hospital during the period from January 2001 to March 2011 were included in the study. Compared with previous reports, the frequency of neurologic manifestations (except autism) was lower in this cohort, and the frequency of skin manifestations (except hypomelanotic macules) was higher. The frequencies of pulmonary lymphangioleiomyomatosis and renal manifestations were not significantly different from those previously reported. Regarding the association of each manifestation with age, the frequency of neurologic manifestations (except subependymal giant cell astrocytoma) was significantly higher in younger patients than in older patients. The frequencies of skin manifestations and renal angiomyolipoma were significantly higher in older patients than in younger patients. Because of their high frequency and visibility, skin manifestations are useful in the diagnosis of TSC. Moreover, uterine perivascular epithelioid cell tumor was characterized as a new finding associated with TSC.
Purpose. To investigate aqueous concentrations of vascular endothelial growth factor (VEGF) in eyes with myopic choroidal neovascularization (CNV). Methods. Aqueous samples were collected, and VEGF concentrations were measured by enzyme-linked immunosorbent assay in 16 eyes (16 patients) with active myopic CNV, 23 eyes (16 patients) with high myopia without myopic CNV, and 8 control eyes (7 patients). Differences in the VEGF concentrations among the groups were compared. Results. The estimated mean VEGF concentrations were significantly lower in eyes with myopic CNV (82.0 pg/mL; P = 0.016) and in highly myopic eyes without myopic CNV (58.9 pg/mL; P < 0.001) than in controls (116.6 pg/mL). Among highly myopic eyes, the estimated mean VEGF concentration was significantly (P < 0.05) higher in eyes with myopic CNV than in those without. In eyes with high myopia with or without CNV, the VEGF concentration was significantly associated with the presence of myopic CNV (stepwise regression analysis, R = 0.325, P = 0.044) but not with age, axial length, or intraocular pressure. Conclusion. Increased levels of VEGF may play a role in the pathogenesis of CNV in highly myopic eyes.
Evidence on the efficacy and safety of recombinant human thrombomodulin (rhTM) treatment for sepsis-induced disseminated intravascular coagulation (DIC), and especially on related mortality, is limited. We hypothesized that patients with sepsis-induced DIC receiving rhTM would have improved mortality compared with patients of similar acuity who did not.
This retrospective cohort study conducted in three tertiary referral hospitals in Japan between January 2006 and June 2011 included all patients with sepsis-induced DIC who required ventilator management. Primary endpoint was in-hospital mortality, with duration of intensive care unit treatment, changes in DIC scores and rate of bleeding complications as secondary endpoints. Regression technique was used to develop a propensity model adjusted for baseline imbalances between groups.
Eligible were 162 patients with sepsis-induced DIC; 68 patients received rhTM and 94 did not. Patients receiving rhTM had higher severity of illness according to baseline characteristics. After adjusting for these imbalances by stratified propensity score analysis, treatment with rhTM was significantly associated with reduced in-hospital mortality (adjusted hazard ratio, 0.45; 95% confidence interval, 0.26–0.77; p = 0.013). Associations between rhTM treatment and higher numbers of intensive care unit-free days, ventilator-free days, and vasopressor-free days were also observed. DIC scores decreased significantly in the rhTM group compared with the control group in the early period after rhTM treatment, whereas the incidence of bleeding-related adverse events did not differ between the two groups.
Therapy with rhTM may be associated with reduced in-hospital mortality in adult mechanically ventilated patients with sepsis-induced DIC.
Sepsis; DIC; Anticoagulant therapy; Thrombomodulin; Outcome assessment; Retrospective studies
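Stratified propensity score analysis, as used above, can be sketched in two steps: quintile stratification on the estimated score, then a stratum-weighted treated-control comparison. The sketch below takes precomputed propensity scores as given and uses a crude risk difference rather than the study's Cox model; all inputs are hypothetical:

```python
def quintile_strata(scores):
    """Assign each subject to a propensity-score quintile (0-4) by rank."""
    ranked = sorted(range(len(scores)), key=lambda i: scores[i])
    strata = [0] * len(scores)
    for rank, i in enumerate(ranked):
        strata[i] = min(rank * 5 // len(scores), 4)
    return strata

def stratified_risk_difference(treated, died, strata, n_strata=5):
    """Average the treated-minus-control mortality difference within each
    stratum, weighting by stratum size (a crude stratified estimate)."""
    total, n = 0.0, len(treated)
    for s in range(n_strata):
        idx = [i for i in range(n) if strata[i] == s]
        t = [i for i in idx if treated[i]]
        c = [i for i in idx if not treated[i]]
        if t and c:
            rd = (sum(died[i] for i in t) / len(t)
                  - sum(died[i] for i in c) / len(c))
            total += rd * len(idx) / n
    return total
```

Comparing outcomes only within strata of similar scores is what removes the measured baseline imbalance between the rhTM and control groups.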
Parkinsonian rigidity has been thought to be constant throughout the full range of joint motion. The aim of this study was to perform a detailed investigation of the joint angle dependency of rigidity. We first measured muscle tone at the elbow joint in 20 healthy subjects and demonstrated that an angle of approximately 60° of flexion marks the division between two different angle-torque characteristics. We then measured muscle tone at the elbow joint in 24 Parkinson's disease (PD) patients and calculated elastic coefficients in flexion and extension in the ranges of 10°–60° (distal) and 60°–110° (proximal). Rigidity as represented by the elastic coefficient in the distal phase of elbow extension correlated best with the UPDRS rigidity score (r = 0.77). A significant difference between the UPDRS rigidity score 0 and score 1 groups was observed in the elastic coefficient in the distal phase of extension (P < 0.0001), whereas no significant difference was observed in the proximal phase of extension or in either phase of flexion. Parkinsonian rigidity thus shows variable properties depending on the elbow joint angle and is most clearly detected in the distal phase of elbow extension.
Background and purpose
It is controversial whether the transverse acetabular ligament (TAL) is a reliable guide for determining the cup orientation during total hip arthroplasty (THA). We investigated the variations in TAL anatomy and the TAL-guided cup orientation.
80 hips with osteoarthritis secondary to hip dysplasia (OA) and 80 hips with osteonecrosis of the femoral head (ON) were examined. We compared the anatomical anteversion of TAL and the TAL-guided cup orientation in relation to both disease and gender using 3D reconstruction of computed tomography (CT) images.
Mean TAL anteversion was 11° (SD 10, range –12 to 35). The OA group (least-square mean 16°, 95% confidence interval (CI): 14–18) had larger anteversion than the ON group (least-square mean 6.2°, CI: 3.8–7.5). Females (least-square mean 20°, CI: 17–23) had larger anteversion than males (least-square mean 7.0°, CI: 4.6–9.3) in the OA group, while there were no differences between the sexes in the ON group. When TAL was used for anteversion guidance with the radiographic cup inclination fixed at 40°, 39% of OA hips and 9% of ON hips had more than 10° of variance from the target anteversion of 15°.
In ON hips, TAL is a good guide for determining cup orientation during THA, although it is not a reliable guide in hips with OA secondary to dysplasia. This is because TAL orientation has large individual variation and is influenced by disease and gender.
We have reported that altered gut flora is associated with septic complications and eventual death in critically ill patients with systemic inflammatory response syndrome. However, how fecal pH relates to the clinical course of these patients is unclear. We sought to determine whether fecal pH can be used as an assessment tool for the clinical course of critically ill patients.
Four hundred ninety-one fecal samples were collected from 138 patients who were admitted to the Department of Traumatology and Acute Critical Medicine, Osaka University Graduate School of Medicine, Japan. These patients were treated in the intensive care unit for more than 2 days. Fecal pH, fecal organic acids, and fecal bacteria counts were measured and compared between survivors and nonsurvivors and between patients with and without bacteremia. Logistic regression was used to estimate the relations of fecal pH, age, sex, and APACHE II score to mortality and to the incidence of bacteremia. Differences in fecal organic acids and fecal bacteria counts among acidic, neutral, and alkaline feces were also analyzed.
A fecal pH above 6.6 was significantly associated with increased mortality (odds ratio, 2.46; 95% confidence interval, 1.25 to 4.82) and with the incidence of bacteremia (3.25; 1.67 to 6.30). Total organic acid was increased in acidic feces and decreased in alkaline feces. Lactic acid, succinic acid, and formic acid were the main contributors to acidity in acidic feces. In alkaline feces, acetic acid was significantly decreased. Propionic acid was markedly decreased in both acidic and alkaline feces compared with neutral feces. No differences in bacterial counts were noted among the groups.
The data presented here demonstrate that a fecal pH outside the normal range was associated with the clinical course and prognosis of critically ill patients.
Healthcare-associated methicillin-resistant Staphylococcus aureus (HA-MRSA) infection in intensive care unit (ICU) patients prolongs ICU stay and causes high mortality. Predicting HA-MRSA infection on admission can strengthen precautions against MRSA transmission. This study aimed to clarify the risk factors for HA-MRSA infection in an ICU from data obtained within 24 hours of patient ICU admission.
We prospectively studied HA-MRSA infection in 474 consecutive patients admitted for more than 2 days to our medical, surgical, and trauma ICU in a tertiary referral hospital in Japan. Data obtained from patients within 24 hours of ICU admission on 11 prognostic variables possibly related to outcome were evaluated to predict infection risk in the early phase of ICU stay. Stepwise multivariate logistic regression analysis was used to identify independent risk factors for HA-MRSA infection.
Thirty patients (6.3%) had MRSA infection, and 444 patients (93.7%) were infection-free. Intubation, presence of an open wound, treatment with antibiotics, and steroid administration, all occurring within 24 hours of ICU admission, were identified as independent prognostic indicators. Patients with intubation or an open wound comprised 96.7% of MRSA-infected patients but only 57.4% of all patients admitted.
Four prognostic variables were found to be risk factors for HA-MRSA infection in the ICU: intubation, open wound, treatment with antibiotics, and steroid administration, all occurring within 24 hours of ICU admission. Preemptive infection control in patients with these risk factors might effectively decrease HA-MRSA infection.
Cross-talk between the coagulation system and inflammatory reactions during sepsis causes organ damage followed by multiple organ dysfunction syndrome or even death. Therefore, anticoagulant therapies have been expected to be beneficial in the treatment of severe sepsis. Recombinant human soluble thrombomodulin (rhTM) binds to thrombin to inactivate coagulation, and the thrombin-rhTM complex activates protein C to produce activated protein C. The purpose of this study was to examine the efficacy of rhTM for treating patients with sepsis-induced disseminated intravascular coagulation (DIC).
This study comprised 65 patients with sepsis-induced DIC who required ventilatory management. All patients fulfilled the criteria of severe sepsis and the International Society on Thrombosis and Haemostasis criteria for overt DIC. The initial 45 patients were treated without rhTM (control group), and the following 20 consecutive patients were treated with rhTM (0.06 mg/kg/day) for six days (rhTM group). The primary outcome measure was 28-day mortality. Stepwise multivariate Cox regression analysis was used to assess which independent variables were associated with mortality. Comparisons of Sequential Organ Failure Assessment (SOFA) score on sequential days between the two groups were analyzed by repeated measures analysis of variance.
Cox regression analysis showed 28-day mortality to be significantly lower in the rhTM group than in the control group (adjusted hazard ratio, 0.303; 95% confidence interval, 0.106 to 0.871; P = 0.027). SOFA score in the rhTM group decreased significantly in comparison with that in the control group (P = 0.028). In the post hoc test, SOFA score decreased rapidly in the rhTM group compared with that in the control group on day 1 (P < 0.05).
We found that rhTM administration may improve organ dysfunction in patients with sepsis-induced DIC. Further clinical investigations are necessary to evaluate the effect of rhTM on the pathophysiology of sepsis-induced DIC.
The gut under severe insult is considered to play an important role in promoting infection and multiple organ dysfunction syndrome, from the viewpoint of the altered intestinal epithelium, immune system, and commensal bacteria. There are few reports, however, on the relationship between gut flora and septic complications.
We analyzed the gut flora of patients with systemic inflammatory response syndrome (SIRS) and used classification and regression trees (CART) to identify key bacteria, and their cutoff values, for infectious complications and mortality. Eighty-one SIRS patients with a serum C-reactive protein level higher than 10 mg/dL who were treated in the intensive care unit (ICU) for more than 2 days were included in the study. We quantitatively evaluated nine types of bacteria in fecal samples by plate or tube technique. Two hundred seventy-one samples were analyzed using CART and logistic regression.
The dominant factors for complication of enteritis were the minimum number of total obligate anaerobes and the maximum number of Staphylococcus and Enterococcus. The dominant factors for complication of bacteremia were the minimum numbers of total obligate anaerobes and total facultative anaerobes. The dominant factors for mortality were the numbers of total obligate anaerobes and total facultative anaerobes and age.
A decrease in total obligate anaerobes and an increase in pathogenic bacteria in the gut are associated with septic complications and mortality in patients with SIRS. The altered gut flora may be a potential prognostic marker in SIRS patients.
Gut; Flora; Probiotics; Sepsis; Classification and regression trees; ICU
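The core of CART is an exhaustive search, at each node, for the impurity-minimizing cutoff on each predictor. A single-split sketch with hypothetical fecal culture data (not the study's measurements) is:

```python
def gini(labels):
    """Gini impurity of a set of binary outcome labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(values, labels):
    """CART-style search for the single cutoff on one predictor that
    minimizes the size-weighted Gini impurity of the two child nodes."""
    best_cut, best_score = None, float("inf")
    n = len(values)
    for cut in sorted(set(values))[:-1]:
        left = [labels[i] for i in range(n) if values[i] <= cut]
        right = [labels[i] for i in range(n) if values[i] > cut]
        score = (len(left) * gini(left) + len(right) * gini(right)) / n
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut, best_score

# Hypothetical log10 counts of total obligate anaerobes vs bacteremia (1 = yes)
counts = [9.8, 9.5, 9.1, 8.7, 7.9, 7.5, 7.2, 6.8]
bacteremia = [0, 0, 0, 0, 1, 1, 1, 1]
print(best_split(counts, bacteremia))  # (7.9, 0.0)
```

The full CART procedure applies this search recursively to each child node and then prunes the tree, but the cutoff values it reports are found exactly this way.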
Survival analysis methods such as the Kaplan-Meier method, log-rank test, and Cox proportional hazards regression (Cox regression) are commonly used to analyze data from randomized withdrawal studies in patients with major depressive disorder. However, these common methods may be inappropriate when long-term censored relapse-free times appear in the data, because they assume that, were complete follow-up possible, every individual would eventually experience the event of interest.
In this paper, to analyse data that include such long-term censored relapse-free times, we discuss a semi-parametric cure regression (Cox cure regression), which combines a logistic formulation for the probability of occurrence of an event with a Cox proportional hazards specification for the time of occurrence of the event. In specifying the treatment's effect on disease-free survival, we consider both the fraction of long-term survivors and the risks associated with a relapse of the disease. In addition, we develop a tree-based method for time-to-event data to identify groups of patients with differing prognoses (cure survival CART). Although such methods typically adapt the log-rank statistic for recursive partitioning procedures, the method applied here uses a likelihood ratio (LR) test statistic from fitting a cure survival regression that assumes exponential or Weibull distributions for the latency time to relapse.
The method is illustrated using data from a sertraline randomized withdrawal study in patients with major depressive disorder.
We conclude that Cox cure regression reveals who may be cured and how the treatment and other factors affect both the cure probability and the relapse time of uncured patients, and that the cure survival CART output provides easily understandable and interpretable information, useful both in identifying groups of patients with differing prognoses and in applying Cox cure regression models to obtain meaningful interpretations.
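The mixture cure model underlying Cox cure regression can be sketched in its simplest form, with no covariates and an exponential latency distribution. The data and the crude grid-search estimator below are purely illustrative:

```python
import math

def cure_loglik(times, events, pi_cure, lam):
    """Log-likelihood of a mixture cure model with exponential latency:
    a fraction pi_cure never relapses; the remainder relapse at rate lam."""
    ll = 0.0
    for t, d in zip(times, events):
        s_u = math.exp(-lam * t)
        if d:  # observed relapse: only an uncured patient can relapse
            ll += math.log((1 - pi_cure) * lam * s_u)
        else:  # censored: either cured, or uncured and relapse-free so far
            ll += math.log(pi_cure + (1 - pi_cure) * s_u)
    return ll

# Hypothetical data: 4 early relapses, 6 patients still relapse-free at t = 10
times = [0.5, 1.0, 1.5, 2.0] + [10.0] * 6
events = [1, 1, 1, 1] + [0] * 6

# Crude grid-search MLE for (cure fraction, relapse rate)
grid = ((p / 20, l / 10) for p in range(1, 20) for l in range(1, 21))
pi_hat, lam_hat = max(grid, key=lambda pl: cure_loglik(times, events, *pl))
print(pi_hat, lam_hat)  # 0.6 0.8
```

The censored term is what distinguishes the cure model from ordinary survival likelihoods: a long censored time is evidence of cure rather than merely of slow relapse. Cox cure regression replaces the constants pi_cure and lam with logistic and proportional hazards regressions on covariates.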