The majority of patients with lung cancer present with metastatic disease. Chronic inflammation and subsequent activation of NF-κB have been associated with the development of cancers. The RelA/p65 subunit of NF-κB is typically associated with transcriptional activation. In this report we show that RelA/p65 can function as an active transcriptional repressor through enhanced methylation of the BRMS1 metastasis suppressor gene promoter via direct recruitment of DNMT-1 to chromatin in response to TNF. TNF-mediated phosphorylation of S276 on RelA/p65 is required for RelA/p65-DNMT-1 interactions, chromatin loading of DNMT-1, and subsequent BRMS1 promoter methylation and transcriptional repression. The ability of RelA/p65 to function as an active transcriptional repressor is promoter specific, as the NF-κB-regulated gene cIAP2 is transcriptionally activated while BRMS1 is repressed under identical conditions. Small molecule inhibition of either of the minimal interacting domains between RelA/p65-DNMT-1 and RelA/p65-BRMS1 promoter abrogates BRMS1 methylation and its transcriptional repression. The ability of RelA/p65 to directly recruit DNMT-1 to chromatin, resulting in promoter-specific methylation and transcriptional repression of the tumor metastasis suppressor gene BRMS1, highlights a new mechanism through which NF-κB can regulate metastatic disease, and offers a potential target for newer generation epigenetic oncopharmaceuticals.
DNMT-1; Phosphorylation; RelA-p65; TNF; Transcription
S-Nitrosoglutathione (GSNO) reductase regulates cell signaling pathways relevant to asthma and protects cells from nitrosative stress. Recent evidence suggests that this enzyme may prevent human hepatocellular carcinoma arising in the setting of chronic hepatitis. We hypothesized that GSNO reductase may also protect the lung against potentially carcinogenic reactions associated with nitrosative stress. We report that wild-type Ras is S-nitrosylated and activated by nitrosative stress and that it is denitrosylated by GSNO reductase. In human lung cancer, the activity and expression of GSNO reductase are decreased. Further, the distribution of the enzyme (including its colocalization with wild-type Ras) is abnormal. We conclude that decreased activity of GSNO reductase could leave the human lung vulnerable to the oncogenic effects of nitrosative stress, as is the case in the liver. This potential should be considered when developing therapies that inhibit pulmonary GSNO reductase to treat asthma and other conditions.
lung cancer; S-nitrosoglutathione reductase; Ras
We determined the efficacy, biological activity, pharmacokinetics and safety of the hypomethylating agent 5-azacitidine (Celgene Corp., Summit, New Jersey) in dogs with naturally occurring invasive urothelial carcinoma.
Materials and Methods
We performed a preclinical phase I trial in dogs with naturally occurring invasive urothelial carcinoma to examine once daily subcutaneous administration of 5-azacitidine in 28-day cycles at doses of 0.10 to 0.30 mg/kg per day according to 2 dose schedules, including days 1 to 5 (28-day cohort) or days 1 to 5 and 15 to 19 (14-day cohort). Clinical efficacy was assessed by serial cystosonography, radiography and cystoscopy. Urinary 5-azacitidine pharmacokinetic analysis was also done. Pretreatment and posttreatment peripheral blood mononuclear cell and invasive urothelial carcinoma DNA, respectively, were analyzed for global and gene-specific [CDKN2A (p14ARF)] methylation changes.
Enrolled in the study were 19 dogs with naturally occurring invasive urothelial carcinoma. In the 28-day cohort the maximum tolerated dose was 0.20 mg/kg per day, with higher doses resulting in grade 3 or 4 neutropenia in 4 of 6 dogs. In the 14-day cohort the maximum tolerated dose was 0.10 mg/kg per day, with grade 3 or 4 neutropenia seen in 2 of 3 dogs treated at higher doses. No grade 3 or 4 nonhematological toxicity was observed during either dosing schedule. Of 18 dogs evaluable for tumor response, partial remission, stable disease and progressive disease were observed in 4 (22.2%), 9 (50.0%) and 4 (22.2%), respectively. Consistent 5-azacitidine levels (205 to 857 ng/ml) were detected in urine. Pretreatment and posttreatment methylation analysis revealed no significant correlation with clinical response.
Subcutaneous 5-azacitidine showed promising clinical activity in a canine invasive urothelial carcinoma model, thus meriting further development in humans with urothelial carcinoma.
urinary bladder; urothelium; carcinoma; azacitidine; dogs
The survival of patients with non–small-cell lung cancer (NSCLC), even when resectable, remains poor. Several small studies suggest that occult metastases (OMs) in pleura, bone marrow (BM), or lymph nodes (LNs) are present in early-stage NSCLC and are associated with a poor outcome. We investigated the prevalence of OMs in resectable NSCLC and their relationship with survival.
Patients and Methods
Eligible patients had previously untreated, potentially resectable NSCLC. Saline lavage of the pleural space, performed before and after pulmonary resection, was examined cytologically. Rib BM and all histologically negative LNs (N0) were examined for OM, diagnosed by cytokeratin immunohistochemistry (IHC). Survival probabilities were estimated using the Kaplan-Meier method. The log-rank test and Cox proportional hazards regression model were used to compare survival of groups of patients. P < .05 was considered significant.
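For readers implementing the survival analysis described above, the Kaplan-Meier method reduces to a short product-limit computation; the sketch below is a minimal, generic implementation in pure Python (the times and event flags are invented for illustration, not study data):

```python
def kaplan_meier(times, events):
    """Product-limit estimator: S(t) = prod over event times of (1 - d_i / n_i).
    times: observation times; events[i] is 1 for death, 0 for censoring.
    Returns (time, survival) pairs at each event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n = at_risk
        # group tied observation times
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            at_risk -= 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n
            curve.append((t, surv))
    return curve

# invented example: 6 patients, two censored (event flag 0)
print(kaplan_meier([6, 7, 10, 15, 19, 25], [1, 0, 1, 1, 0, 1]))
```

In practice a library such as lifelines or R's survival package handles ties, confidence bands, and the log-rank comparison used in the study; the sketch only shows the core estimator.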
From July 1999 to March 2004, 1,047 eligible patients (538 men and 509 women; median age, 67.2 years) were entered onto the study, of whom 50% had adenocarcinoma and 66% had stage I NSCLC. Pleural lavage was cytologically positive in only 29 patients. OMs were identified in 66 (8.0%) of 821 BM specimens and 130 (22.4%) of 580 LN specimens. In univariate and multivariable analyses OMs in LN but not BM were associated with significantly worse disease-free survival (hazard ratio [HR], 1.50; P = .031) and overall survival (HR, 1.58; P = .009).
In early-stage NSCLC, LN OMs detected by IHC identify patients with a worse prognosis. Future clinical trials should test the role of IHC in identifying patients for adjuvant therapy.
Sublobar resection (SR) is commonly used for patients considered high-risk for lobectomy. Non-operative therapies are increasingly being reported for similar risk patients because of perceived lower morbidity. We report 30 and 90 day adverse events (AEs) from ACOSOG Z4032, a multicenter phase III study for high-risk stage I non-small cell lung cancer (NSCLC) patients.
Data from 222 evaluable patients randomized to SR (n=114) or SR with brachytherapy (SRB) (n=108) are reported. AEs were recorded using the Common Terminology Criteria for Adverse Events, Version 3.0, at 30 and 90 days after surgery. Risk factors (age, baseline DLCO% and FEV1%, upper versus lower lobe resection, performance status, surgical approach (VATS versus open), and extent of resection (wedge versus segmentectomy)) were analyzed using a multivariable logistic model for their impact on the incidence of grade 3 and higher (G3+) AEs. Respiratory AEs were also specifically analyzed.
Median age, FEV1% and DLCO% were similar for the two treatment groups. There was no difference in the location of resection (upper versus lower lobe) or in the use of segmental or wedge resections. There were no differences between the groups with respect to “respiratory” G3+ AEs (30 days: 14.9% vs. 19.4%; p=0.35; 0–90 days: 19.3% vs. 25%; p=0.31) or “any” G3+ AEs (30 days: 25.4% vs. 30.6%; p=0.37; 0–90 days: 29.8% vs. 37%; p=0.25). Further analysis combined the two groups. Mortality occurred in 3 (1.4%) patients by 30 days and in 6 (2.7%) patients by 90 days. Four of the six deaths were felt to be attributable to surgery. When considered as continuous variables, FEV1% was associated with “any” G3+ AE at days 0–30 (p=0.03; OR=0.98) and days 0–90 (p=0.05; OR=0.98), and DLCO% was associated with “respiratory” G3+ AE at days 0–30 (p=0.03; OR=0.97) and days 0–90 (p=0.05; OR=0.98). Segmental resection was associated with a higher incidence of any G3+ AE compared with wedge resection at days 0–30 (40.3% versus 22.7%; OR=2.56; p<0.01) and days 0–90 (41.5% versus 29.7%; OR=1.96; p=0.04). The median FEV1% was 50% and the median DLCO% was 46%. Using these median values as potential cutpoints, only a DLCO% of less than 46% was significantly associated with an increased risk of “respiratory” and “any” G3+ AEs for days 0–30 and 0–90.
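For readers interpreting the odds ratios above, converting an OR back to an absolute risk is simple arithmetic; the sketch below applies the reported segmentectomy OR of 2.56 to the unadjusted wedge-resection rate purely for illustration (the result will not exactly match the observed 40.3%, since the reported OR is covariate-adjusted):

```python
def apply_odds_ratio(baseline_risk, odds_ratio):
    """Convert a baseline risk through an odds ratio back to a risk:
    odds = p / (1 - p); new_odds = odds * OR; new_p = new_odds / (1 + new_odds)."""
    odds = baseline_risk / (1.0 - baseline_risk)
    new_odds = odds * odds_ratio
    return new_odds / (1.0 + new_odds)

# wedge-resection G3+ AE rate at days 0-30 was 22.7%; OR 2.56 for segmentectomy
risk = apply_odds_ratio(0.227, 2.56)
print(f"implied segmentectomy risk: {risk:.1%}")
```

This also illustrates why odds ratios overstate relative risk when the baseline event rate is high, as it is here.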
In a multicenter setting, SRB was not associated with increased morbidity compared to SR alone. SR/SRB can be performed safely in high-risk patients with NSCLC with low 30 and 90 day mortality and acceptable morbidity. Segmental resection was associated with increased “any” G3+ AE, and DLCO% less than 46% was associated with “any” G3+AE as well as “respiratory” G3+ AE at both 30 and 90 days.
The shortage in organ donation is a major limiting factor for patients with end-stage lung disease. Expanding the donor pool would be beneficial. We investigated the importance of geographic distance between the donor and recipient and hypothesized that it would not be a critical determinant of outcomes after lung transplantation.
We retrospectively reviewed the United Network for Organ Sharing lung transplant database from 2000 to 2005 to allow sufficient time for bronchiolitis obliterans syndrome (BOS) development. Allograft recipients were stratified by geographic distance from their donors (local, regional, and national) and had yearly follow-up. The primary end points were the development of BOS and 1-year and 3-year mortality. Posttransplant outcomes were compared using a multivariable Cox proportional hazard model. Kaplan-Meier curves were compared by log-rank test.
Of 6,055 allograft recipients, donors were local in 59%, regional in 19.3%, and national in 21.7%. BOS-free survival did not differ by geographic distance. Geographic distance did not independently predict BOS (hazard ratio, 1.03; 95% confidence interval, 0.96 to 1.10). Similarly, Kaplan-Meier survival curves were not significantly worse for recipients with national donors. Geographic distance did not independently predict 3-year mortality (hazard ratio, 0.95; 95% confidence interval, 0.89 to 1.01).
With appropriate donor selection, moderately long geographic distance (average ischemic time < 6 hours) between the donor and recipient is not associated with the development of BOS or increased death after lung transplantation. By placing less emphasis on distance, more donors could potentially be used to expand the donor pool.
Virtual screening targeting the urokinase receptor (uPAR) led to (3R)-4-cyclohexyl-3-(hexahydrobenzo[d][1,3]dioxol-5-yl)-N-((hexahydrobenzo[d][1,3]dioxol-5-yl)methyl)butan-1-aminium 1 (IPR-1) and 4-(4-((3,5-dimethylcyclohexyl)carbamoyl)-2-(4-isopropylcyclohexyl)pyrazolidin-3-yl)piperidin-1-ium 3 (IPR-69). Synthesis of an analog of 1, namely 2 (IPR-9), and of 3 enabled cell-based studies: in breast MDA-MB-231 assays, both compounds inhibited invasion, migration and adhesion with IC50 near 30 μM. Both compounds blocked angiogenesis with IC50 of 3 μM. Compounds 2 and 3 inhibited cell growth with IC50 of 6 and 18 μM, respectively, and induced apoptosis. Biochemical assays revealed lead-like properties for 3, but not 2. Compound 3 administered orally reached a peak concentration of nearly 40 μM with a half-life of about 2 hours. In NOD-SCID mice inoculated with breast TMD-231 cells in their mammary fat pads, compound 3 produced a 20% reduction in tumor volumes, and less extensive metastasis was observed in the treated mice. The suitable pharmacokinetic properties of 3 and the encouraging preliminary results in metastasis make it an ideal starting point for next-generation compounds.
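The reported peak concentration and half-life for 3 allow a quick sketch of its plasma concentration over time, assuming simple one-compartment first-order elimination (an assumption for illustration; the abstract does not state the kinetic model):

```python
def concentration(c0_uM, t_half_h, t_h):
    """First-order decay from peak: C(t) = C0 * 2^(-t / t_half)."""
    return c0_uM * 2.0 ** (-t_h / t_half_h)

# peak ~40 uM and half-life ~2 h are the values reported in the abstract
for t in (0, 2, 4, 6):
    print(f"{t} h: {concentration(40.0, 2.0, t):.1f} uM")
```

Under this assumption the compound would fall below the ~18 μM cell-growth IC50 within roughly one half-life of the peak, which is the kind of back-of-envelope check that motivates dosing-schedule design.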
The Pacific population of leatherback sea turtles (Dermochelys coriacea) has drastically declined in the last 25 years. This decline has been linked to incidental capture by fisheries, egg and meat harvesting, and recently, to climate variability and resource limitation. Here we couple growth rates with feeding experiments and food intake functions to estimate daily energy requirements of leatherbacks throughout their development. We then estimate mortality rates from available data, enabling us to raise food intake (energy requirements) of the individual to the population level. We place energy requirements in context of available resources (i.e., gelatinous zooplankton abundance). Estimated consumption rates suggest that a single leatherback will eat upward of 1000 metric tonnes (t) of jellyfish in its lifetime (range 924–1112), with the Pacific population consuming 2.1×10⁶ t of jellyfish annually (range 1.0–3.7×10⁶), equivalent to 4.2×10⁸ megajoules (MJ) (range 2.0–7.4×10⁸). Model estimates suggest 2–7 yr-old juveniles comprise the majority of the Pacific leatherback population biomass and account for most of the jellyfish consumption (1.1×10⁶ t of jellyfish or 2.2×10⁸ MJ per year). Leatherbacks are large gelatinous zooplanktivores with consumption to biomass ratios of 96 (up to 192 if feeding strictly on low energy density Cnidarians); they, therefore, have a large capacity to impact gelatinous zooplankton landscapes. Understanding the leatherback's needs for gelatinous zooplankton, versus the availability of these resources, can help us better assess population trends and the influence of climate induced resource limitations to reproductive output.
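The population-level figures above are internally consistent, which can be verified with a few lines of arithmetic (the jellyfish energy density below is implied by dividing the reported energy by the reported tonnage, and the biomass figure assumes the consumption-to-biomass ratio of 96 applies at the population scale; neither is stated directly in the abstract):

```python
# Back-of-envelope check of the reported leatherback consumption figures.
annual_jellyfish_t = 2.1e6      # tonnes of jellyfish per year (reported)
annual_energy_MJ = 4.2e8        # megajoules per year (reported)

# implied energy density of jellyfish, in MJ per kg
density_MJ_per_kg = annual_energy_MJ / (annual_jellyfish_t * 1000.0)
print(f"implied energy density: {density_MJ_per_kg} MJ/kg")

# a consumption-to-biomass ratio of 96 implies this population biomass
biomass_t = annual_jellyfish_t / 96.0
print(f"implied population biomass: {biomass_t:.0f} t")
```

The implied ~0.2 MJ/kg is characteristic of low-energy-density gelatinous prey, consistent with the abstract's framing of leatherbacks as gelatinous zooplanktivores.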
Z4032 is a randomized clinical trial conducted by the American College of Surgeons Oncology Group that compared sublobar resection alone (SR) to sublobar resection with brachytherapy (SRB) for high-risk operable patients with non-small cell lung cancer (NSCLC). The current report evaluates the early impact of adjuvant brachytherapy on pulmonary function tests (PFTs), dyspnea, and perioperative (30-day) respiratory complications in this impaired patient population.
Eligible stage I NSCLC patients with tumors 3 cm or less were randomized to SR or SRB. The outcomes measured included the percent predicted forced expiratory volume (FEV1%), percent predicted carbon monoxide diffusion capacity (DLCO%), and dyspnea score using the UC San Diego Shortness of Breath Questionnaire. Pulmonary morbidity was assessed using the Common Terminology Criteria for Adverse Events (AE) Version 3.0 (CTCAE). Outcomes were measured at baseline and at 3 months. A 10% change in PFT or a 10-point change in dyspnea score was deemed clinically meaningful.
Z4032 permanently closed to patient accrual in January 2010 with a total of 224 patients. At 3-month follow-up, PFT data were available for the 148 (74 SR/74 SRB) patients described in this report. There were no differences in baseline characteristics between the arms. In the SR arm, 9 (12%) patients reported grade 3 respiratory AEs compared with 12 (16%) in the SRB arm (p=0.49). There was no significant change in DLCO% or dyspnea score from baseline to 3 months within either arm. For FEV1%, the percent change from baseline to 3 months was significant within the SR arm (p=0.03), with patients showing an improvement in FEV1% at 3 months. Multivariable regression analysis (adjusted for baseline values) showed no significant impact of treatment arm, tumor location (upper versus other lobe), or surgical approach (VATS versus thoracotomy) on the 3-month values for FEV1%, DLCO% and dyspnea scores. There was no significant difference in the incidence of clinically meaningful change (10% PFT change or 10-point dyspnea score change) between the two arms. Twenty-two percent of patients with lower lobe tumors compared with 9% with upper lobe tumors demonstrated a 10% decline in FEV1% (odds ratio 2.79; 95% CI = 1.07–7.25; p=0.04).
Adjuvant intraoperative brachytherapy performed in conjunction with sublobar resection does not significantly worsen pulmonary function, or dyspnea at 3-months in a high-risk population with NSCLC. SRB was not associated with increased perioperative pulmonary AE. Lower-lobe resection was the only factor that was significantly associated with a clinically meaningful decline in FEV1%.
Pancreatic cancer is an especially deadly form of cancer with a survival rate <2%. Pancreatic cancers respond poorly to existing chemotherapeutic agents and radiation, and progress in the treatment of pancreatic cancer remains elusive. To address this unmet medical need, a better understanding of the critical pathways and molecular mechanisms involved in pancreatic tumor development, progression and resistance to traditional therapy is critical. Reduction-oxidation (redox) signaling systems are emerging as important targets in pancreatic cancer. AP endonuclease 1/redox effector factor 1 (APE1/Ref-1) is upregulated in human pancreatic cancer cells, and modulation of its redox activity blocks the proliferation and migration of pancreatic cancer cells as well as pancreatic cancer-associated endothelial cells (PCECs) in vitro. Modulation of APE1/Ref-1 using a specific inhibitor of APE1/Ref-1's redox function, E3330, leads to a decrease in transcription factor activity for NFκB, AP-1, and HIF1 in vitro. This study aims to further establish the redox signaling protein APE1/Ref-1 as a molecular target in pancreatic cancer. Here, we show that inhibition of APE1/Ref-1 via E3330 results in tumor growth inhibition in cell lines as well as in pancreatic cancer xenograft models in mice. Pharmacokinetic (PK) studies also demonstrate that E3330 attains >10 μM blood concentrations and is detectable in tumor xenografts. Through inhibition of APE1/Ref-1, the activity of NFκB, AP-1, and HIF1α, key transcriptional regulators involved in survival, invasion and metastasis, is blocked. These data indicate that E3330, an inhibitor of APE1/Ref-1, has potential in pancreatic cancer, and clinical investigation of APE1/Ref-1 as a molecular target is warranted.
Pancreatic cancer; animal models of cancer; new targets; xenograft models; cellular responses to anticancer drugs; cellular pharmacology; pharmacokinetics and pharmacodynamics; agents with other mechanisms of action
The ability to predict the efficacy of molecularly targeted therapies for non-small cell lung cancer (NSCLC) in an individual patient remains problematic. The purpose of this study was to identify tumor biomarkers, using a refined “coexpression extrapolation (COXEN)” algorithm with a continuous spectrum of drug activity, that predict NSCLC drug sensitivity and therapeutic efficacy for Vorinostat, a histone deacetylase inhibitor, and Velcade, a proteasome inhibitor. Using our refined COXEN algorithm, biomarker prediction models were discovered and trained for Vorinostat and Velcade based on in vitro drug activity profiles of 9 NSCLC cell lines (NCI-9). Independently, a panel of 40 NSCLC cell lines (UVA-40) was treated with Vorinostat or Velcade to obtain 50% growth inhibition values. Genome-wide expression profiles for both the NCI-9 and UVA-40 cell lines were determined using the Affymetrix HG-U133A platform. Modeling generated multi-gene expression signatures for Vorinostat (45-gene, p=0.002) and Velcade (15-gene, p=0.0002), with one overlapping gene (CFLAR). Examination of the Vorinostat gene ontology revealed a predilection for cellular replication and death, whereas that of Velcade suggested involvement in cellular development and carcinogenesis. Multivariate regression modeling of the refined COXEN scores significantly predicted the activity of combination therapy in NSCLC cells (p=0.007). Through refinement of the COXEN algorithm, we provide an in silico method to generate biomarkers that predict tumor sensitivity to molecularly targeted therapies. Use of this refined COXEN methodology has significant implications for the a priori examination of targeted therapies to more effectively streamline subsequent clinical trial design and cost.
Lung cancer; histone deacetylase inhibitor; proteasome inhibitor; tumor biomarker; molecularly-targeted agents; chemotherapy
The effect of gender, race and socioeconomic status on contemporary outcomes following lung cancer resections has not been comprehensively evaluated nationwide. We hypothesized that risk-adjusted outcomes for lung cancer resections would not be influenced by these factors.
From 2003 to 2007, 129,207 patients undergoing lung cancer resections were evaluated using the Nationwide Inpatient Sample (NIS). Multiple regression was utilized to estimate the effects of gender, race and socioeconomic status on risk-adjusted outcomes.
Average patient age was 66.8±10.5 years. Females accounted for 5.0% of the total study population. Among racial groups, White patients underwent the large majority of operations (86.2%), followed by Black (6.9%) and Hispanic (2.8%) patients. Overall, the incidence of mortality was 2.9%, postoperative complications 30.4%, and pulmonary complications 22.0%. Female gender, race, and mean income were all multivariate correlates of adjusted mortality and morbidity. Black patients incurred decreased risk-adjusted morbidity and mortality compared with White patients. Hispanics and Asians demonstrated decreased risk-adjusted complication rates. Importantly, low-income status independently increased the adjusted odds of mortality.
Female gender is associated with decreased mortality and morbidity following lung cancer resections. Complication rates are lower for Black, Hispanic and Asian patients. Low socioeconomic status increases the risk of in-hospital death. These factors should be considered during patient risk stratification for lung cancer resection.
Gender; Race; Income; Lung Cancer; Surgery; Outcomes
Pre-clinical in vivo studies can help guide the selection of agents and regimens for clinical testing. However, one of the challenges in screening anti-cancer therapies is the assessment of off-target human toxicity. There is a need for in vivo models that can simulate efficacy and toxicities of promising therapeutic regimens. For example, hematopoietic cells of human origin are particularly sensitive to a variety of chemotherapeutic regimens but in vivo models to assess potential toxicities have not been developed. In this study, a xenograft model containing humanized bone marrow is utilized as an in vivo assay to monitor hematotoxicity.
A proof-of-concept, temozolomide-based regimen was developed that inhibits tumor xenograft growth. This regimen was selected for testing since it has been previously shown to cause myelosuppression in mice and humans. The dose-intensive regimen was administered to NOD/SCID/γ-chain-null mice reconstituted with human hematopoietic cells, and the impact of treatment on human hematopoiesis was evaluated.
The dose-intensive regimen resulted in significant decreases in growth of human-glioblastoma xenografts. When this regimen was administered to mice containing humanized bone marrow, flow cytometric analyses indicated that the human bone-marrow cells were significantly more sensitive to treatment than the murine bone-marrow cells, and that the regimen was highly toxic to human-derived hematopoietic cells of all lineages (progenitor, lymphoid, and myeloid).
The humanized bone-marrow xenograft model described has the potential to be used as a platform for monitoring the impact of anti-cancer therapies on human hematopoiesis and could lead to subsequent refinement of therapies prior to clinical evaluation.
human xenograft model; myelosuppression; stem-cell
Breast cancer metastasis suppressor gene-1 (BRMS1) mRNA and protein expression are significantly decreased in non-small cell lung cancer (NSCLC) and this is a poor prognostic indicator. Given that the BRMS1 promoter region contains a promoter-associated CpG island (CGI) that encompasses the transcriptional start site, we hypothesized that decreased BRMS1 mRNA and protein levels in NSCLC were secondary to increased BRMS1 promoter methylation. Methylation-specific PCR (MSP) of the two known CGIs (−3477 to −2214 and −531 to +608) in the BRMS1 genome was performed in NSCLC cells. This demonstrated a robust increase in methylation of the promoter-associated CGI (−531 to +608) but not of the upstream CGI (−3477 to −2214). To experimentally verify that methylation contributes to BRMS1 transcriptional repression, we cloned the BRMS1 promoter region, including the promoter-associated CGI, into a luciferase reporter gene and found that BRMS1 promoter activity was dramatically inhibited under methylated conditions. We then assessed the BRMS1 methylation profile with MSP and bisulphite-sequencing PCR in human NSCLC adenocarcinoma (n = 20) and squamous cell carcinoma (n = 20) relative to adjacent non-cancerous bronchial epithelium. There was a significant increase in BRMS1 promoter methylation in all NSCLC specimens relative to non-cancerous tissues, with the most dramatic difference in squamous cell cancer histology. Subsequent immunostaining demonstrated that nuclear BRMS1 expression is reduced in lung cancer specimens compared to normal bronchial epithelium. The association between BRMS1 promoter methylation and specific clinical and histopathological variables was examined using a general linear model. Pathological tumour stage was associated with increased BRMS1 methylation in squamous cell cancers.
These observations demonstrate that methylation of the promoter-associated CGI in BRMS1 results in its transcriptional repression, and highlight the potential clinical relevance of this methylation event with respect to NSCLC tumour histology and pathological stage.
lung cancer; BRMS1; methylation; methylation-specific PCR; immunohistochemistry; pathological stage; squamous cell cancer
Pseudoachalasia is a rare clinical diagnosis with diverse manifestations. We present the case of a 22-year-old male with esophageal adenocarcinoma who was initially diagnosed with achalasia. This unfortunate presentation reinforces the importance of a careful preoperative workup for dysphagia irrespective of age.
Lowe syndrome, which is characterized by defects in the central nervous system, eyes and kidneys, is caused by mutation of the phosphoinositide 5-phosphatase OCRL1. The mechanisms by which loss of OCRL1 leads to the phenotypic manifestations of Lowe syndrome are currently unclear, in part, owing to the lack of an animal model that recapitulates the disease phenotype. Here, we describe a zebrafish model for Lowe syndrome using stable and transient suppression of OCRL1 expression. Deficiency of OCRL1, which is enriched in the brain, leads to neurological defects similar to those reported in Lowe syndrome patients, namely increased susceptibility to heat-induced seizures and cystic brain lesions. In OCRL1-deficient embryos, Akt signalling is reduced and there is both increased apoptosis and reduced proliferation, most strikingly in the neural tissue. Rescue experiments indicate that catalytic activity and binding to the vesicle coat protein clathrin are essential for OCRL1 function in these processes. Our results indicate a novel role for OCRL1 in neural development, and support a model whereby dysregulation of phosphoinositide metabolism and clathrin-mediated membrane traffic leads to the neurological symptoms of Lowe syndrome.
With the exception of surgery, standard platinum-based chemotherapeutic agents are the preferred treatment for non-small cell lung cancer (NSCLC); however, little improvement in 5-year survival has been made. It is therefore highly desirable to develop innovative therapeutic agents for NSCLC treatment.
Highly enantioselective synthetic methods were developed and a broad compound library was established. Cell toxicity, cell sensitivity, cell proliferation, cell invasion, and three-dimensional colony formation assays were used to assess the anticancer potential of these compounds in non–small-cell lung cancer (NSCLC) cell lines.
We found that the S-form of compound PL54 (PL54S, 5–20 µM) exhibited strong anticancer activity in 5 tested NSCLC cell lines. We further synthesized a highly pure R-form enantiomer of PL54 (PL54R) and its racemate (PL54Rac) and characterized their anticancer activities. The results showed that PL54S is more potent than PL54R and PL54Rac against the tested cell lines. Furthermore, less cellular toxicity was observed in normal human lung fibroblasts. Similarly, PL54S displayed greater anti-colony-formation activity compared with PL54R and PL54Rac. The cellular sensitivity assay revealed that PL54S and PL54Rac significantly suppressed clonogenic formation compared with PL54R and dimethyl sulfoxide controls (p < 0.01). All PL54 compounds (5 to 20 µM) significantly inhibited cell proliferation and invasion of the A549 cell line (p < 0.01). A soft agar colony formation assay revealed that PL54S and PL54Rac (10 mM), but not PL54R, significantly inhibited colony formation of tested NSCLC cells (p < 0.01).
These stereospecific compounds may provide a novel therapeutic approach for the treatment of NSCLC.
We contrasted the forced diving bradycardia between two genetically similar (inbred) rat strains (Fischer and Buffalo), compared to that of outbred rats (Wistar). The animals were habituated to forced diving for 4 weeks. Each animal was then tested during one 40 s dive on each of 3 days. The heart rate (fH) was measured before, during, and after each dive. Fischer and Buffalo exhibited marked difference in dive bradycardia (Fischer: 120.9 ± 14.0 beats min⁻¹ vs. Buffalo: 92.8 ± 12.8 beats min⁻¹, P < 0.05). Outbred rats showed an intermediate response (103.0 ± 30.9 beats min⁻¹) but their between-animal variability in mean dive fH and pre-diving resting fH were higher than the inbred strains (P < 0.05), which showed no difference (P > 0.05). The decreased variability in fH in inbred rats as compared with the outbred group indicates that reduced genetic variability minimizes variability of the diving bradycardia between individuals. Heritability within strains was assessed by the repeatability (R) index and was 0.93 ± 0.05 for the outbred, 0.84 ± 0.16 for Buffalo, and 0.80 ± 0.12 for Fischer rats for fH during diving. Our results suggest that a portion of the mammalian diving bradycardia may be a heritable trait.
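The repeatability (R) index used above is the intraclass correlation derived from a one-way ANOVA over repeated measures per animal; a minimal sketch with synthetic, equal-sized groups (invented numbers, not the study data):

```python
def repeatability(groups):
    """Intraclass correlation R = s2_A / (s2_A + MS_within), where s2_A is the
    among-animal variance component from a one-way ANOVA.
    groups: list of equal-length lists, one per animal (repeated measures)."""
    k = len(groups)                       # number of animals
    n = len(groups[0])                    # measures per animal
    grand = sum(sum(g) for g in groups) / (k * n)
    ms_among = n * sum((sum(g) / n - grand) ** 2 for g in groups) / (k - 1)
    ms_within = sum((x - sum(g) / n) ** 2
                    for g in groups for x in g) / (k * (n - 1))
    s2_a = (ms_among - ms_within) / n     # among-animal variance component
    return s2_a / (s2_a + ms_within)

# three synthetic animals whose dive fH never varies across trials -> R = 1
print(repeatability([[100, 100, 100], [120, 120, 120], [90, 90, 90]]))
```

An R near 1, as reported for all three strains, means most of the variance in dive heart rate lies between animals rather than between an animal's own repeated dives.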
forced diving; heart rate; rat; repeatability; quantitative genetics
Measuring the metabolic rate of sea turtles is fundamental to understanding their ecology, yet the presently available methods are limited. Accelerometry is a relatively new technique for estimating metabolic rate that has shown promise with a number of species, but its utility with air-breathing divers is not yet established. The present study undertakes laboratory experiments to investigate whether the rate of oxygen uptake (V̇O2) at the surface in active sub-adult green turtles Chelonia mydas and hatchling loggerhead turtles Caretta caretta correlates with overall dynamic body acceleration (ODBA), a derivative of acceleration used as a proxy for metabolic rate. Six green turtles (25–44 kg) and two loggerhead turtles (20 g) were instrumented with tri-axial acceleration logging devices and placed singly into a respirometry chamber. The green turtles were able to submerge freely within a 1.5 m deep tank and the loggerhead turtles were tethered in water 16 cm deep so that they swam at the surface. A significant prediction equation for mean V̇O2 over an hour in a green turtle from measures of ODBA and mean flipper length (R² = 0.56) returned a mean estimate error across turtles of 8.0%. The range of temperatures used in the green turtle experiments (22–30°C) had only a small effect on V̇O2. A V̇O2-ODBA equation for the loggerhead hatchling data was also significant (R² = 0.67). Together these data indicate the potential of the accelerometry technique for estimating energy expenditure in sea turtles, which may have important applications in sea turtle diving ecology, and also in conservation, such as assessing turtle survival times when trapped underwater in fishing nets.
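ODBA is conventionally computed as the sum over the three axes of the absolute dynamic acceleration, where the static (gravitational) component is removed with a running mean; the sketch below is a generic implementation (the 5-sample smoothing window is an invented choice, not taken from the study):

```python
def running_mean(x, w):
    """Centered running mean with edge truncation: estimates static acceleration."""
    half = w // 2
    return [sum(x[max(0, i - half): i + half + 1]) /
            len(x[max(0, i - half): i + half + 1]) for i in range(len(x))]

def odba(ax, ay, az, window=5):
    """Per-sample overall dynamic body acceleration:
    ODBA_i = |ax_i - static_x_i| + |ay_i - static_y_i| + |az_i - static_z_i|"""
    total = []
    for axis in (ax, ay, az):
        static = running_mean(axis, window)
        dyn = [abs(a - s) for a, s in zip(axis, static)]
        total = dyn if not total else [t + d for t, d in zip(total, dyn)]
    return total

# a motionless turtle (constant acceleration on every axis) yields ODBA = 0
print(odba([0.0] * 10, [0.0] * 10, [1.0] * 10))
```

The window length and smoothing method (running mean vs. high-pass filter) are tuning choices that vary between studies; only the sum-of-absolute-dynamic-components structure is standard.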
The effect of seasonal variation on postoperative outcomes following lung cancer resections is unknown. We hypothesized that postoperative outcomes following surgical resection for lung cancer within the United States would not be impacted by operative season.
From 2002 to 2007, 182 507 isolated lung cancer resections (lobectomy (n = 147 937), sublobar resection (n = 21 650), and pneumonectomy (n = 13 916)) were evaluated using the Nationwide Inpatient Sample (NIS) database. Patients were stratified according to operative season: spring (n = 47 382), summer (n = 46 131), fall (n = 45 370), and winter (n = 43 624). Multivariate regression models were applied to assess the effect of operative season on adjusted postoperative outcomes.
Patient co-morbidities and risk factors were similar despite the operative season. Lobectomy was the most common operation performed: spring (80.0%), summer (81.3%), fall (81.8%), and winter (81.1%). Lung cancer resections were more commonly performed at large, high-volume (>75th percentile operative volume) centers (P < 0.001). Unadjusted mortality was lowest during the spring (2.6%, P < 0.001) season compared with summer (3.1%), fall (3.0%) and winter (3.2%), while complications were most common in the fall (31.7%, P < 0.001). Hospital length of stay was longest for operations performed in the winter season (8.92 ± 0.11 days, P < 0.001). Importantly, multivariable logistic regression revealed that operative season was an independent predictor of in-hospital mortality (P < 0.001) and of postoperative complications (P < 0.001). Risk-adjusted odds of in-hospital mortality were increased for lung cancer resections occurring during all other seasons compared with those occurring in the spring.
Outcomes following surgical resection for lung cancer are independently influenced by time of year. Risk-adjusted in-hospital mortality and hospital length of stay were lowest during the spring season.
Season; Lung Cancer; Surgery; Outcomes
In 2005, the time-based waiting list for lung transplantation was replaced by an illness/benefit lung allocation score (LAS). Although short-term outcomes after transplantation have been reported to be similar before and after the new system, little is known about long-term results. The objective of this study was to evaluate the impact of LAS on the development of bronchiolitis obliterans syndrome as well as on overall 3-year and bronchiolitis obliterans syndrome-related survival.
Data obtained from the United Network for Organ Sharing were used to review 8091 patients who underwent lung transplantation from 2002 to 2008. Patients were stratified according to time of transplantation into those treated before initiation of the LAS (pre-LAS group, January 2002–April 2005, n = 3729) and those treated after implementation of the score (post-LAS group, May 2005–May 2008, n = 4362). Overall 3-year survival for the patient groups was compared using univariate analysis, a Cox proportional hazards model to generate relative risk, and Kaplan-Meier curve analyses.
During the 3-year follow-up period, bronchiolitis obliterans syndrome developed in 22% of lung transplant recipients (n = 1801). Although the incidence of postoperative bronchiolitis obliterans syndrome development was similar between groups, post-LAS patients incurred fewer bronchiolitis obliterans syndrome-free days (609 ± 7.5 vs 682 ± 9; P < .0001; log-rank test P = .0108) than did pre-LAS patients. Overall 3-year survival was lower in post-LAS patients and approached statistical significance (P = .05). Similarly, bronchiolitis obliterans syndrome-related survival was worse for patients in the post-LAS group (log-rank test P = .01).
In the current LAS era, lung transplant recipients have significantly fewer bronchiolitis obliterans syndrome-free days after 3-year follow-up. Compared with the pre-LAS population, overall and bronchiolitis obliterans syndrome-related survival appears worse in the post-LAS era. Limitation of known risk factors for development of bronchiolitis obliterans syndrome may prove even more important in this patient population. (J Thorac Cardiovasc Surg 2011;141:1278–82)
Bronchiolitis obliterans syndrome (BOS) is the major hurdle preventing long-term success in lung transplantation, and is the primary reason for the 50% 5-year survival. Recipient and perioperative risk factors have been investigated in BOS, but less is known about donor factors. Therefore, we investigated which donor factors are important in the development of BOS.
We performed a retrospective review of the United Network for Organ Sharing lung transplant database from 1987 to 2008. Lung transplant recipients had yearly follow-up. Donor factors were evaluated for their influence on BOS development. Kaplan-Meier plots of BOS-free survival were compared for each donor factor and a multivariate Cox proportional hazard model for BOS was created with donor factors.
A total of 17,222 lung transplant recipients were identified; 6,991 recipients had sufficient follow-up BOS data. Of these recipients, 57% (n = 3,984) developed BOS within 5 years. Recipients who received lungs from donors who were younger, had no active pulmonary infection, or were not current tobacco users had longer BOS-free survival. Recipients who received lungs with higher partial pressures of oxygen in arterial blood (PaO2) developed more BOS (p < 0.0001). High donor PaO2, older donor age, and current donor tobacco use were independent predictors of BOS in lung transplant recipients.
Donor factors and donor management strategies are important contributors to development of recipient BOS. Identification of these factors may help limit BOS and may identify recipients at high risk. Surprisingly, high PaO2 in the donor is an independent predictor of BOS development.
Medicaid and Uninsured populations are a significant focus of current healthcare reform. We hypothesized that outcomes following major surgical operations in the United States are dependent on primary payer status.
From 2003 to 2007, 893,658 major surgical operations were evaluated using the Nationwide Inpatient Sample (NIS) database: lung resection, esophagectomy, colectomy, pancreatectomy, gastrectomy, abdominal aortic aneurysm repair, hip replacement, and coronary artery bypass. Patients were stratified by primary payer status: Medicare (n = 491,829), Medicaid (n = 40,259), Private Insurance (n = 337,535), and Uninsured (n = 24,035). Multivariate regression models were applied to assess outcomes.
Unadjusted mortality was higher for the Medicare (4.4%; odds ratio [OR], 3.51), Medicaid (3.7%; OR, 2.86), and Uninsured (3.2%; OR, 2.51) patient groups compared with the Private Insurance group (1.3%, P < 0.001). Mortality was lowest for Private Insurance patients independent of operation. After controlling for age, gender, income, geographic region, operation, and 30 comorbid conditions, Medicaid payer status was associated with the longest length of stay and highest total costs (P < 0.001). Medicaid (P < 0.001) and Uninsured (P < 0.001) payer status independently conferred the highest adjusted risks of mortality.
Medicaid and Uninsured payer status confers increased risk-adjusted mortality. Medicaid was further associated with the greatest adjusted length of stay and total costs regardless of risk factors or operation. These differences serve as an important proxy for larger socioeconomic and health system-related issues that could be targeted to improve surgical outcomes for US patients.
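The unadjusted odds ratios reported above follow directly from the group mortality rates with Private Insurance as the reference; small differences from the published values (3.51, 2.86, 2.51) reflect rounding of the percentages, and the risk-adjusted estimates come from the multivariate models, not this arithmetic. A brief illustrative sketch:

```python
def odds(p):
    """Convert a proportion (e.g. mortality rate) to odds."""
    return p / (1.0 - p)

def odds_ratio(p_group, p_reference):
    """Unadjusted odds ratio of a group versus a reference group."""
    return odds(p_group) / odds(p_reference)

private = 0.013  # Private Insurance mortality, the reference group
for name, p in [("Medicare", 0.044), ("Medicaid", 0.037), ("Uninsured", 0.032)]:
    print(f"{name}: unadjusted OR = {odds_ratio(p, private):.2f}")
```

Running this reproduces ORs of roughly 3.49, 2.92, and 2.51, consistent with the reported values given that the inputs are rounded percentages rather than raw counts.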
Chronic allograft vasculopathy (CAV) is a major cause of long-term complications and mortality after heart transplantation. Although recipient factors have been implicated, little is known of the role of donor factors in CAV development. We sought to identify donor factors associated with development of CAV after heart transplantation.
We reviewed the United Network for Organ Sharing heart transplant database from August 1987 to May 2008. Univariate and multivariate analyses were performed to assess the association between donor variables and the onset of CAV for adult recipients. Donor age was matched to recipient age and analyzed with respect to development of CAV.
Of the 39,704 recipients, a total of 11,714 (29.5%) experienced CAV. Multivariate analysis demonstrated seven donor factors as independent predictors of CAV: age, ethnicity, sex, weight, history of diabetes, hypertension, and tobacco use. When young donors (0 to 19.9 years) and old donors (≥50 years) were matched to each recipient age group, older donor age (≥50 years) conferred a higher risk of CAV development compared with younger donor age (0 to 19.9 years; p < 0.0001).
Donor factors including sex, hypertension, diabetes, and tobacco use are independently associated with recipient CAV. Older donor age confers a greater risk of CAV development regardless of the age of the recipient. A heightened awareness for the development of CAV is warranted when using older donors in adult cardiac transplantation, in particular with recipients 40 years of age or older.