1.  Left Ventricular Assist Devices 
Executive Summary
Objective
The objective of this health technology policy assessment was to determine the effectiveness and cost-effectiveness of using implantable ventricular assist devices in the treatment of end-stage heart failure.
Heart Failure
Heart failure is a complex syndrome that impairs the ability of the heart to maintain adequate blood circulation, resulting in multiorgan abnormalities and, eventually, death. In the period of 1994 to 1997, 38,702 individuals in Ontario had a first hospital admission for heart failure. Despite reported improvement in survival, the five-year mortality rate for heart failure is about 50%.
For patients with end-stage heart failure that does not respond to medical therapy, surgical treatment or traditional circulatory assist devices, heart transplantation (in appropriate patients) is the only treatment that provides significant patient benefit.
Heart Transplant in Ontario
With a shortage in the supply of donor hearts, patients are waiting longer for a heart transplant and may die before a donor heart is available. From 1999 to 2003, 55 to 74 people received a heart transplant in Ontario each year. Another 12 to 21 people died while waiting for a suitable donor heart. Of these, 1 to 5 deaths occurred in people under 18 years old. The rate-limiting factor in heart transplant is the supply of donor hearts. Without an increase in available donor hearts, attempts at prolonging the life of some patients on the transplant wait list could have a harmful effect on other patients who are pushed down the waiting list (a knock-on effect).
LVAD Technology
Ventricular assist devices (VADs) have been developed to provide circulatory assistance to patients with end-stage heart failure. These are small pumps that usually assist the damaged left ventricle (LVADs) and may be situated within the body (intracorporeal) or outside the body (extracorporeal). Some of these devices were designed for use in the right ventricle (RVAD) or in both ventricles (bi-ventricular).
LVADs have been mainly used as a “bridge-to-transplant” for patients on a transplant waiting list. As well, they have been used as a “bridge-to-recovery” in acute heart failure, but this experience is limited. There has been an increasing interest in using LVAD as a permanent (destination) therapy.
Review of LVAD by the Medical Advisory Secretariat
The Medical Advisory Secretariat’s review included a descriptive synthesis of findings from five systematic reviews and 60 reports published between January 2000 and December 2003. Additional information was obtained through consultation and by searching the websites of Health Canada, the United Network of Organ Sharing, Organ Donation Ontario, and LVAD manufacturers.
Summary of Findings
Safety and Effectiveness
Previous HTAs and current Level 3 evidence from prospective non-randomized controlled studies showed that when compared to optimal medical therapy, LVAD support significantly improved the pre-transplant survival rates of heart transplant candidates waiting for a suitable donor heart (71% for LVAD and 36% for medical therapy). Pre-transplant survival rates reported ranged from 58% to 90% (median 74%). Improved transplant rates were also reported for people who received pre-transplant LVAD support (e.g. 67% for LVAD vs 33% for medical therapy). Reported transplant rates for LVAD patients ranged from 39% to 90% (median 71%).
Patient age greater than 60 years and the pre-existing conditions of respiratory failure associated with septicemia, ventilation, and right heart failure were independent risk factors for mortality after LVAD implantation.
LVAD support was shown to improve the New York Heart Association (NYHA) functional classification and quality of life of patients waiting for heart transplant. LVAD also enabled approximately 41%–49% of patients to be discharged from hospital and wait for a heart transplant at home. However, over 50% of the discharged patients required re-hospitalization due to adverse events.
Post-transplant survival rates for LVAD-bridged patients were similar to or better than the survival rates of patients bridged by medical therapy.
LVAD support has been associated with serious adverse events, including infection (median 53%, range 6%–72%), bleeding (8.6%–48%, median 35%), thromboembolism (5%–37%), neurologic disorders (7%–28%), right ventricular failure (11%–26%), organ dysfunction (5%–50%) and hemolysis (6%–20%). Bleeding tends to occur in the first few post-implant days and is rare thereafter. It is fatal in 2%–7% of patients. Infection and thromboembolism occurred throughout the duration of the implant, though their frequency tended to diminish with time. Device malfunction has been identified as one of the major complications. Fatalities directly attributable to the devices were about 1% in short-term LVAD use. However, mechanical failure was the second most frequent cause of death in patients on prolonged LVAD support. Malfunctions mainly involved the external components, which could often be replaced with backup components.
LVAD has been used as a bridge-to-recovery in patients suffering from acute cardiogenic shock due to cardiomyopathy, myocarditis or cardiotomy. The survival rates were reported to be lower than in bridge-to-transplant (median 26%). Some of the bridge-to-recovery patients (14%–75%) required a heart transplant or remained on prolonged LVAD support. According to an expert in the field, experience with LVAD as a bridge-to-recovery technology has been more favourable in Germany than in North America, where it is not regarded as a major indication since evidence for its effectiveness in this setting is limited.
LVAD has also been explored as a destination therapy. A small, randomized, controlled trial (level 2 evidence) showed that LVAD significantly increased the 1-year survival rate of patients with end-stage heart failure who were not eligible for a heart transplant (51% for LVAD vs 25% for medical therapy). However, the improved survival was accompanied by an adverse event rate 2.35 times that of medically treated patients and a higher hospital re-admission rate. The 2-year survival rate with LVAD decreased to 23%, although it was still significantly better than that of patients on medical therapy (8%). The leading causes of death were sepsis (41%) and device failure (17%).
The FDA has given conditional approval for the permanent use of HeartMate SNAP VE LVAS in patients with end-stage heart failure who are not eligible for heart transplantation, although the long-term effect of this application is not known.
In Canada, four LVAD systems have been licensed for bridge-to-transplant only. The use of LVAD support raises ethical issues because of the implications of potential explantation that could be perceived as a withdrawal of life support.
Potential Impact on the Transplant Waiting List
With the shortage of donor hearts for adults, LVAD support probably would not increase the number of patients who receive a heart transplant. If LVAD-supported candidates are prioritized for urgent heart transplant, there will be a knock-on effect: other transplant candidates without LVAD support would be pushed down the list, resulting in longer waits, deterioration in health status, and death before a suitable donor heart becomes available.
Under the current policy for allocating donor hearts in Ontario, patients on LVAD support would be downgraded to Status 3 with a lower priority to receive a transplant. This would likely result in an expansion of the transplant waiting list with an increasing number of patients on prolonged LVAD support, which is not consistent with the indication of LVAD use approved by Health Canada.
There is indication in the United Kingdom that LVAD support in conjunction with an urgent transplant listing in the pediatric population may decrease the number of deaths on the waiting list without a harmful knock-on effect on other transplant candidates.
Conclusion
LVAD support as a bridge-to-transplant has been shown to improve the survival rate, functional status and quality of life of patients on the heart transplant waiting list. However, due to the shortage of donor hearts and the current heart transplant algorithm, LVAD support for transplant candidates of all age groups would likely result in an expansion of the waiting list and prolonged use of LVAD with significant budget implications but without increasing the number of heart transplants. Limited level 4 evidence showed that LVAD support in children yielded survival rates comparable to those in the adult population. The introduction of LVAD in the pediatric population would be more cost-effective and might not have a negative effect on the transplant waiting list.
PMCID: PMC3387736  PMID: 23074453
2.  UTILIZATION OF AN EMR-BIOREPOSITORY TO IDENTIFY THE GENETIC PREDICTORS OF CALCINEURIN-INHIBITOR TOXICITY IN HEART TRANSPLANT RECIPIENTS 
Calcineurin inhibitors (CIs) are immunosuppressive agents prescribed to patients after solid organ transplant to prevent rejection. Although these drugs have been transformative for allograft survival, long-term use is complicated by side effects including nephrotoxicity. Given the narrow therapeutic index of CIs, therapeutic drug monitoring is used to prevent acute rejection from underdosing and acute toxicity from overdosing, but drug monitoring does not alleviate long-term side effects. Patients on calcineurin inhibitors for long periods almost universally experience declines in renal function, and a subpopulation of transplant recipients ultimately develop chronic kidney disease that may progress to end-stage renal disease attributable to calcineurin-inhibitor toxicity (CNIT). Pharmacogenomics has the potential to identify patients who are at high risk for developing advanced chronic kidney disease caused by CNIT and to provide them with existing alternative immunosuppressive therapy. In this study we utilized BioVU, Vanderbilt University Medical Center’s DNA biorepository linked to de-identified electronic medical records, to assemble a cohort of 115 heart transplant recipients prescribed calcineurin inhibitors and to identify genetic risk factors for CNIT. We identified 37 cases of nephrotoxicity in our cohort, defining nephrotoxicity as a monthly median estimated glomerular filtration rate (eGFR) < 30 mL/min/1.73 m2 at least six months post-transplant for at least three consecutive months. All heart transplant patients were genotyped on the Illumina ADME Core Panel, a pharmacogenomic genotyping platform that assays 184 variants across 34 genes. In Cox regression analysis adjusting for age at transplant, pre-transplant chronic kidney disease, pre-transplant diabetes, and the three most significant principal components, we did not identify any markers that met our multiple-testing threshold. As a secondary analysis we also modeled post-transplant eGFR directly with linear mixed models adjusted for age at transplant, cyclosporine use, median BMI, and the three most significant principal components. While no SNPs met our threshold for significance, a SNP previously identified in genetic studies of tacrolimus dosing, CYP3A5 rs776746, replicated in an adjusted analysis at an uncorrected p-value of 0.02 (coefficient (SE) = 14.60 (6.41)). While larger independent studies will be required to further validate this finding, this study underscores the EMR’s usefulness as a resource for longitudinal pharmacogenetic study designs.
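The nephrotoxicity case definition above is a rolling criterion over longitudinal eGFR values. As a rough illustration only (not the authors' code), the Python sketch below flags a CNIT case when the monthly median eGFR stays below 30 mL/min/1.73 m2 for at least three consecutive observed months, starting six months post-transplant; the helper function and DataFrame column names are assumptions.

```python
import pandas as pd

def flag_cnit(egfr_df: pd.DataFrame, threshold: float = 30.0,
              min_months: int = 3, washout_months: int = 6) -> bool:
    """Illustrative CNIT case definition on longitudinal eGFR data.

    egfr_df is assumed to have columns:
      'months_post_tx' (float) and 'egfr' (mL/min/1.73 m^2).
    """
    # Keep measurements taken at least six months after transplant.
    post = egfr_df[egfr_df["months_post_tx"] >= washout_months].copy()
    if post.empty:
        return False

    # Monthly median eGFR, indexed by month number post-transplant.
    post["month"] = post["months_post_tx"].astype(int)
    monthly = post.groupby("month")["egfr"].median().sort_index()

    # Count consecutive observed months with median eGFR below the threshold.
    run, prev_month = 0, None
    for month, value in monthly.items():
        if value < threshold:
            if prev_month is not None and month == prev_month + 1 and run > 0:
                run += 1
            else:
                run = 1            # a gap in observed months restarts the run
            if run >= min_months:
                return True
        else:
            run = 0
        prev_month = month
    return False
```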
PMCID: PMC3923429  PMID: 24297552
3.  Polymorphisms, Mutations, and Amplification of the EGFR Gene in Non-Small Cell Lung Cancers 
PLoS Medicine  2007;4(4):e125.
Background
The epidermal growth factor receptor (EGFR) gene is the prototype member of the type I receptor tyrosine kinase (TK) family and plays a pivotal role in cell proliferation and differentiation. There are three well described polymorphisms that are associated with increased protein production in experimental systems: a polymorphic dinucleotide repeat (CA simple sequence repeat 1 [CA-SSR1]) in intron one (lower number of repeats) and two single nucleotide polymorphisms (SNPs) in the promoter region, −216 (G/T or T/T) and −191 (C/A or A/A). The objective of this study was to examine distributions of these three polymorphisms and their relationships to each other and to EGFR gene mutations and allelic imbalance (AI) in non-small cell lung cancers.
Methods and Findings
We examined the frequencies of the three polymorphisms of EGFR in 556 resected lung cancers and corresponding non-malignant lung tissues from 336 East Asians, 213 individuals of Northern European descent, and seven of other ethnicities. We also studied the EGFR gene in 93 corresponding non-malignant lung tissue samples from European-descent patients from Italy and in peripheral blood mononuclear cells from 250 normal healthy US individuals enrolled in epidemiological studies including individuals of European descent, African–Americans, and Mexican–Americans. We sequenced the four exons (18–21) of the TK domain known to harbor activating mutations in tumors and examined the status of the CA-SSR1 alleles (presence of heterozygosity, repeat number of the alleles, and relative amplification of one allele) and allele-specific amplification of mutant tumors as determined by a standardized semiautomated method of microsatellite analysis. Variant forms of SNP −216 (G/T or T/T) and SNP −191 (C/A or A/A) (associated with higher protein production in experimental systems) were less frequent in East Asians than in individuals of other ethnicities (p < 0.001). Both alleles of CA-SSR1 were significantly longer in East Asians than in individuals of other ethnicities (p < 0.001). Expression studies using bronchial epithelial cultures demonstrated a trend towards increased mRNA expression in cultures having the variant SNP −216 G/T or T/T genotypes. Monoallelic amplification of the CA-SSR1 locus was present in 30.6% of the informative cases and occurred more often in individuals of East Asian ethnicity. AI was present in 44.4% (95% confidence interval: 34.1%–54.7%) of mutant tumors compared with 25.9% (20.6%–31.2%) of wild-type tumors (p = 0.002). The shorter allele in tumors with AI in East Asian individuals was selectively amplified (shorter allele dominant) more often in mutant tumors (75.0%, 61.6%–88.4%) than in wild-type tumors (43.5%, 31.8%–55.2%, p = 0.003). In addition, there was a strong positive association between AI ratios of CA-SSR1 alleles and AI of mutant alleles.
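For context, the confidence intervals quoted above for allelic imbalance frequencies (for example, 44.4% with 95% CI 34.1%–54.7%) are consistent with a standard normal-approximation interval for a proportion. A minimal sketch of that calculation, with hypothetical counts chosen only to land near the reported proportion (this is not the paper's analysis code):

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) 95% CI for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# Hypothetical counts chosen only to reproduce a proportion near 44.4%;
# the abstract reports the proportion and interval, not the raw counts.
p, lo, hi = wald_ci(40, 90)
print(f"AI in mutant tumors: {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```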
Conclusions
The three polymorphisms associated with increased EGFR protein production (shorter CA-SSR1 length and variant forms of SNPs −216 and −191) were found to be rare in East Asians as compared to other ethnicities, suggesting that the cells of East Asians may make relatively less intrinsic EGFR protein. Interestingly, especially in tumors from patients of East Asian ethnicity, EGFR mutations were found to favor the shorter allele of CA-SSR1, and selective amplification of the shorter allele of CA-SSR1 occurred frequently in tumors harboring a mutation. These distinct molecular events targeting the same allele would both be predicted to result in greater EGFR protein production and/or activity. Our findings may help explain some of the ethnic differences observed in mutational frequencies and responses to TK inhibitors.
Masaharu Nomura and colleagues examine the distribution of EGFR polymorphisms in different populations and find differences that might explain different responses to tyrosine kinase inhibitors in lung cancer patients.
Editors' Summary
Background.
Most cases of lung cancer—the leading cause of cancer deaths worldwide—are “non-small cell lung cancer” (NSCLC), which has a very low cure rate. Recently, however, “targeted” therapies have brought new hope to patients with NSCLC. Like all cancers, NSCLC occurs when cells begin to divide uncontrollably because of changes (mutations) in their genetic material. Chemotherapy drugs treat cancer by killing these rapidly dividing cells, but, because some normal tissues are sensitive to these agents, it is hard to kill the cancer completely without causing serious side effects. Targeted therapies specifically attack the changes in cancer cells that allow them to divide uncontrollably, so it might be possible to kill the cancer cells selectively without damaging normal tissues. Epidermal growth factor receptor (EGFR) was one of the first molecules for which a targeted therapy was developed. In normal cells, messenger proteins bind to EGFR and activate its “tyrosine kinase,” an enzyme that sticks phosphate groups on tyrosine (an amino acid) in other proteins. These proteins then tell the cell to divide. Alterations to this signaling system drive the uncontrolled growth of some cancers, including NSCLC.
Why Was This Study Done?
Molecules that inhibit the tyrosine kinase activity of EGFR (for example, gefitinib) dramatically shrink some NSCLCs, particularly those in East Asian patients. Tumors shrunk by tyrosine kinase inhibitors (TKIs) often (but not always) have mutations in EGFR's tyrosine kinase. However, not all tumors with these mutations respond to TKIs, and other genetic changes—for example, amplification (multiple copies) of the EGFR gene—also affect tumor responses to TKIs. It would be useful to know which genetic changes predict these responses when planning treatments for NSCLC and to understand why the frequency of these changes varies between ethnic groups. In this study, the researchers have examined three polymorphisms—differences in DNA sequences that occur between individuals—in the EGFR gene in people with and without NSCLC. In addition, they have looked for associations between these polymorphisms, which are present in every cell of the body, and the EGFR gene mutations and allelic imbalances (genes occur in pairs but amplification or loss of one copy, or allele, often causes allelic imbalance in tumors) that occur in NSCLCs.
What Did the Researchers Do and Find?
The researchers measured how often three EGFR polymorphisms (the length of a repeat sequence called CA-SSR1, and two single nucleotide variations [SNPs])—all of which probably affect how much protein is made from the EGFR gene—occurred in normal tissue and NSCLC tissue from East Asians and individuals of European descent. They also looked for mutations in the EGFR tyrosine kinase and allelic imbalance in the tumors, and then determined which genetic variations and alterations tended to occur together in people with the same ethnicity. Among many associations, the researchers found that shorter alleles of CA-SSR1 and the minor forms of the two SNPs occurred less often in East Asians than in individuals of European descent. They also confirmed that EGFR kinase mutations were more common in NSCLCs in East Asians than in European-descent individuals. Furthermore, mutations occurred more often in tumors with allelic imbalance, and in tumors where there was allelic imbalance and an EGFR mutation, the mutant allele was amplified more often than the wild-type allele.
What Do These Findings Mean?
The researchers use these associations between gene variants and tumor-associated alterations to propose a model to explain the ethnic differences in mutational frequencies and responses to TKIs seen in NSCLC. They suggest that because of the polymorphisms in the EGFR gene commonly seen in East Asians, people from this ethnic group make less EGFR protein than people from other ethnic groups. This would explain why, if a threshold level of EGFR is needed to drive cells towards malignancy, East Asians have a high frequency of amplified EGFR tyrosine kinase mutations in their tumors—mutation followed by amplification would be needed to activate EGFR signaling. This model, though speculative, helps to explain some clinical findings, such as the frequency of EGFR mutations and of TKI sensitivity in NSCLCs in East Asians. Further studies of this type in different ethnic groups and in different tumors, as well as with other genes for which targeted therapies are available, should help oncologists provide personalized cancer therapies for their patients.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0040125.
US National Cancer Institute information on lung cancer and on cancer treatment for patients and professionals
MedlinePlus encyclopedia entries on NSCLC
Cancer Research UK information for patients about all aspects of lung cancer, including treatment with TKIs
Wikipedia pages on lung cancer, EGFR, and gefitinib (note that Wikipedia is a free online encyclopedia that anyone can edit)
doi:10.1371/journal.pmed.0040125
PMCID: PMC1876407  PMID: 17455987
4.  Nuclear Receptor Expression Defines a Set of Prognostic Biomarkers for Lung Cancer 
PLoS Medicine  2010;7(12):e1000378.
David Mangelsdorf and colleagues show that nuclear receptor expression is strongly associated with clinical outcomes of lung cancer patients, and this expression profile is a potential prognostic signature for lung cancer patient survival time, particularly for individuals with early stage disease.
Background
The identification of prognostic tumor biomarkers that also would have potential as therapeutic targets, particularly in patients with early stage disease, has been a long sought-after goal in the management and treatment of lung cancer. The nuclear receptor (NR) superfamily, which is composed of 48 transcription factors that govern complex physiologic and pathophysiologic processes, could represent a unique subset of these biomarkers. In fact, many members of this family are the targets of already identified selective receptor modulators, providing a direct link between individual tumor NR quantitation and selection of therapy. The goal of this study, which begins this overall strategy, was to investigate the association between mRNA expression of the NR superfamily and the clinical outcome for patients with lung cancer, and to test whether a tumor NR gene signature provided useful information (over available clinical data) for patients with lung cancer.
Methods and Findings
Using quantitative real-time PCR to study NR expression in 30 microdissected non-small-cell lung cancers (NSCLCs) and their pair-matched normal lung epithelium, we found great variability in NR expression among patients' tumor and non-involved lung epithelium, found a strong association between NR expression and clinical outcome, and identified an NR gene signature from both normal and tumor tissues that predicted patient survival time and disease recurrence. The NR signature derived from the initial 30 NSCLC samples was validated in two independent microarray datasets derived from 442 and 117 resected lung adenocarcinomas. The NR gene signature was also validated in 130 squamous cell carcinomas. The prognostic signature in tumors could be distilled to expression of two NRs, short heterodimer partner and progesterone receptor, as single gene predictors of NSCLC patient survival time, including for patients with stage I disease. Of equal interest, the studies of microdissected histologically normal epithelium and matched tumors identified expression in normal (but not tumor) epithelium of NGFIB3 and mineralocorticoid receptor as single gene predictors of good prognosis.
Conclusions
NR expression is strongly associated with clinical outcomes for patients with lung cancer, and this expression profile provides a unique prognostic signature for lung cancer patient survival time, particularly for those with early stage disease. This study highlights the potential use of NRs as a rational set of therapeutically tractable genes as theragnostic biomarkers, and specifically identifies short heterodimer partner and progesterone receptor in tumors, and NGFIB3 and MR in non-neoplastic lung epithelium, for future detailed translational study in lung cancer.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Lung cancer, the most common cause of cancer-related death, kills 1.3 million people annually. Most lung cancers are “non-small-cell lung cancers” (NSCLCs), and most are caused by smoking. Exposure to chemicals in smoke causes changes in the genes of the cells lining the lungs that allow the cells to grow uncontrollably and to move around the body. How NSCLC is treated and responds to treatment depends on its “stage.” Stage I tumors, which are small and confined to the lung, are removed surgically, although chemotherapy is also sometimes given. Stage II tumors have spread to nearby lymph nodes and are treated with surgery and chemotherapy, as are some stage III tumors. However, because cancer cells in stage III tumors can be present throughout the chest, surgery is not always possible. For such cases, and for stage IV NSCLC, where the tumor has spread around the body, patients are treated with chemotherapy alone. About 70% of patients with stage I and II NSCLC but only 2% of patients with stage IV NSCLC survive for five years after diagnosis; more than 50% of patients have stage IV NSCLC at diagnosis.
Why Was This Study Done?
Patient responses to treatment vary considerably. Oncologists (doctors who treat cancer) would like to know which patients have a good prognosis (are likely to do well) to help them individualize their treatment. Consequently, the search is on for “prognostic tumor biomarkers,” molecules made by cancer cells that can be used to predict likely clinical outcomes. Such biomarkers, which may also be potential therapeutic targets, can be identified by analyzing the overall pattern of gene expression in a panel of tumors using a technique called microarray analysis and looking for associations between the expression of sets of genes and clinical outcomes. In this study, the researchers take a more directed approach to identifying prognostic biomarkers by investigating the association between the expression of the genes encoding nuclear receptors (NRs) and clinical outcome in patients with lung cancer. The NR superfamily contains 48 transcription factors (proteins that control the expression of other genes) that respond to several hormones and to diet-derived fats. NRs control many biological processes and are targets for several successful drugs, including some used to treat cancer.
What Did the Researchers Do and Find?
The researchers analyzed the expression of NR mRNAs using “quantitative real-time PCR” in 30 microdissected NSCLCs and in matched normal lung tissue samples (mRNA is the blueprint for protein production). They then used an approach called standard classification and regression tree analysis to build a prognostic model for NSCLC based on the expression data. This model predicted both survival time and disease recurrence among the patients from whom the tumors had been taken. The researchers validated their prognostic model in two large independent lung adenocarcinoma microarray datasets and in a squamous cell carcinoma dataset (adenocarcinomas and squamous cell carcinomas are two major NSCLC subtypes). Finally, they explored the roles of specific NRs in the prediction model. This analysis revealed that the ability of the NR signature in tumors to predict outcomes was mainly due to the expression of two NRs—the short heterodimer partner (SHP) and the progesterone receptor (PR). Expression of either gene could be used as a single gene predictor of the survival time of patients, including those with stage I disease. Similarly, the expression of either nerve growth factor induced gene B3 (NGFIB3) or mineralocorticoid receptor (MR) in normal tissue was a single gene predictor of a good prognosis.
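The model-building step named above, standard classification and regression tree (CART) analysis, can be sketched with scikit-learn. The example below uses simulated expression values and a binarized outcome purely for illustration; the variable names, sample split, and tree depth are assumptions, not the authors' pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical data: 30 tumors x 48 nuclear-receptor expression values
# (e.g., qRT-PCR delta-Ct), with an assumed binary "poor outcome" label.
n_samples, n_receptors = 30, 48
X = rng.normal(size=(n_samples, n_receptors))
y = rng.integers(0, 2, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# A shallow tree mirrors a signature distilled to a few receptors
# (the paper singles out SHP and PR as single-gene predictors).
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(X_train, y_train)

print(export_text(tree, feature_names=[f"NR_{i}" for i in range(n_receptors)]))
print("held-out accuracy:", tree.score(X_test, y_test))
```

With real data the outcome would be survival time or recurrence rather than a random label, but the tree-growing and validation steps keep the same shape.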
What Do These Findings Mean?
These findings indicate that the expression of NR mRNA is strongly associated with clinical outcomes in patients with NSCLC. Furthermore, they identify a prognostic NR expression signature that provides information on the survival time of patients, including those with early stage disease. The signature needs to be confirmed in more patients before it can be used clinically, and researchers would like to establish whether changes in mRNA expression are reflected in changes in protein expression if NRs are to be targeted therapeutically. Nevertheless, these findings highlight the potential use of NRs as prognostic tumor biomarkers. Furthermore, they identify SHP and PR in tumors and two NRs in normal lung tissue as molecules that might provide new targets for the treatment of lung cancer and new insights into the early diagnosis, pathogenesis, and chemoprevention of lung cancer.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000378.
The Nuclear Receptor Signaling Atlas (NURSA) is a consortium of scientists sponsored by the US National Institutes of Health that provides scientific reagents, datasets, and educational material on nuclear receptors and their co-regulators to the scientific community through a Web-based portal
The Cancer Prevention and Research Institute of Texas (CPRIT) provides information and resources to anyone interested in the prevention and treatment of lung and other cancers
The US National Cancer Institute provides detailed information for patients and professionals about all aspects of lung cancer, including information on non-small-cell carcinoma and on tumor markers (in English and Spanish)
Cancer Research UK also provides information about lung cancer and information on how cancer starts
MedlinePlus has links to other resources about lung cancer (in English and Spanish)
Wikipedia has a page on nuclear receptors (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1000378
PMCID: PMC3001894  PMID: 21179495
5.  Extracorporeal Lung Support Technologies – Bridge to Recovery and Bridge to Lung Transplantation in Adult Patients 
Executive Summary
For cases of acute respiratory distress syndrome (ARDS) and progressive chronic respiratory failure, the first-choice treatment is mechanical ventilation. For decades, this method has been used to support critically ill patients in respiratory failure. Despite its life-saving potential, however, several experimental and clinical studies have suggested that ventilator-induced lung injury can adversely affect the lungs and patient outcomes. Current opinion is that by reducing the pressure and volume of gas delivered to the lungs during mechanical ventilation, the stress applied to the lungs is eased, enabling them to rest and recover. In addition, mechanical ventilation may fail to provide adequate gas exchange, and patients may therefore suffer from severe hypoxia and hypercapnea. For these reasons, extracorporeal lung support technologies may play an important role in the clinical management of patients with lung failure, allowing not only the transfer of oxygen and carbon dioxide (CO2) but also buying the lungs the time needed to rest and heal.
Objective
The objective of this analysis was to assess the effectiveness, safety, and cost-effectiveness of extracorporeal lung support technologies in the improvement of pulmonary gas exchange and the survival of adult patients with acute pulmonary failure and those with end-stage chronic progressive lung disease as a bridge to lung transplantation (LTx). The application of these technologies in primary graft dysfunction (PGD) after LTx is beyond the scope of this review and is not discussed.
Clinical Applications of Extracorporeal Lung Support
Extracorporeal lung support technologies [i.e., Interventional Lung Assist (ILA) and extracorporeal membrane oxygenation (ECMO)] have been advocated for use in the treatment of patients with respiratory failure. These techniques do not treat the underlying lung condition; rather, they improve gas exchange while enabling the implantation of a protective ventilation strategy to prevent further damage to the lung tissues imposed by the ventilator. As such, extracorporeal lung support technologies have been used in three major lung failure case types:
As a bridge to recovery in acute lung failure – for patients with injured or diseased lungs to give their lungs time to heal and regain normal physiologic function.
As a bridge to LTx – for patients with irreversible end stage lung disease requiring LTx.
As a bridge to recovery after LTx – used as lung support for patients with PGD or severe hypoxemia.
Ex-Vivo Lung Perfusion and Assessment
Recently, the evaluation and reconditioning of donor lungs ex-vivo has been introduced into clinical practice as a method of improving the rate of donor lung utilization. Generally, about 15% to 20% of donor lungs are suitable for LTx, but these figures may increase with the use of ex-vivo lung perfusion. The ex-vivo evaluation and reconditioning of donor lungs is currently performed at the Toronto General Hospital (TGH) and preliminary results have been encouraging (Personal communication, clinical expert, December 17, 2009). If its effectiveness is confirmed, the use of the technique could lead to further expansion of donor organ pools and improvements in post-LTx outcomes.
Extracorporeal Lung support Technologies
ECMO
The ECMO system consists of a centrifugal pump, a membrane oxygenator, inlet and outlet cannulas, and tubing. The exchange of oxygen and CO2 then takes place in the oxygenator, which delivers the reoxygenated blood back into one of the patient’s veins or arteries. Additional ports may be added for haemodialysis or ultrafiltration.
Two different techniques may be used to introduce ECMO: venoarterial and venovenous. In the venoarterial technique, cannulation is through either the femoral artery and the femoral vein, or through the carotid artery and the internal jugular vein. In the venovenous technique cannulation is through both femoral veins or a femoral vein and internal jugular vein; one cannula acts as inflow or arterial line, and the other as an outflow or venous line. Venovenous ECMO will not provide adequate support if a patient has pulmonary hypertension or right heart failure. Problems associated with cannulation during the procedure include bleeding around the cannulation site and limb ischemia distal to the cannulation site.
ILA
Interventional Lung Assist (ILA) is used to remove excess CO2 from the blood of patients in respiratory failure. The system is characterized by a novel, low-resistance gas exchange device with a diffusion membrane composed of polymethylpentene (PMP) fibres. These fibres are woven into a complex configuration that maximizes the exchange of oxygen and CO2 by simple diffusion. The system is also designed to operate without the help of an external pump, though one can be added if higher blood flow is required. The device is then applied across an arteriovenous shunt between the femoral artery and femoral vein. Depending on the size of the arterial cannula used and the mean systemic arterial pressure, a blood flow of up to 2.5 L/min can be achieved (up to 5.5 L/min with an external pump). The cannulation is performed after intravenous administration of heparin.
Recently, the first commercially available extracorporeal membrane ventilator (NovaLung GmbH, Hechingen, Germany) was approved for clinical use by Health Canada for patients in respiratory failure. The system has been used in more than 2,000 patients with various indications in Europe, and was used for the first time in North America at the Toronto General Hospital in 2006.
Evidence-Based Analysis
The research questions addressed in this report are:
Does ILA/ECMO facilitate gas exchange in the lungs of patients with severe respiratory failure?
Does ILA/ECMO improve the survival rate of patients with respiratory failure caused by a range of underlying conditions including patients awaiting LTx?
What are the possible serious adverse events associated with ILA/ECMO therapy?
To address these questions, a systematic literature search was performed on September 28, 2009 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 2005 to September 28, 2008. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search. Articles with an unknown eligibility were reviewed with a second clinical epidemiologist and then a group of epidemiologists until consensus was established.
Inclusion Criteria
Studies in which ILA/ECMO was used as a bridge to recovery or bridge to LTx
Studies containing information relevant to the effectiveness and safety of the procedure
Studies including at least five patients
Exclusion Criteria
Studies reporting the use of ILA/ECMO for inter-hospital transfers of critically ill patients
Studies reporting the use of ILA/ECMO in patients during or after LTx
Animal or laboratory studies
Case reports
Outcomes of Interest
Reduction in partial pressure of CO2
Correction of respiratory acidosis
Improvement in partial pressure of oxygen
Improvement in patient survival
Frequency and severity of adverse events
The search yielded 107 citations in Medline and 107 citations in EMBASE. After reviewing the information provided in the titles and abstracts, eight citations were found to meet the study inclusion criteria. One study was then excluded because of an overlap in the study population with a previous study. Reference checking did not produce any additional studies for inclusion. Seven case series studies, all conducted in Germany, were thus included in this review (see Table 1).
Also included is the recently published CESAR trial, a multicentre RCT in the UK in which ECMO was compared with conventional intensive care management. The results of the CESAR trial were published when this review was initiated. In the absence of any other recent RCT on ECMO, the results of this trial were considered for this assessment and no further searches were conducted. A literature search was then conducted for the application of ECMO as a bridge to LTx (January 1, 2005 to present). A total of 127 citations on this topic were identified and reviewed, but none were found to have examined the use of ECMO as a bridge to LTx.
Quality of Evidence
To grade the quality of evidence, the grading system formulated by the GRADE working group and adopted by MAS was applied. The GRADE system classifies the quality of a body of evidence as high, moderate, low, or very low according to four key elements: study design, study quality, consistency across studies, and directness.
Results
Trials on ILA
Of the seven studies identified, six involved patients with ARDS caused by a range of underlying conditions; the seventh included only patients awaiting LTx. All studies reported the rate of gas exchange and respiratory mechanics before ILA and for up to 7 days of ILA therapy. Four studies reported the means and standard deviations of blood gas transfer and arterial blood pH, which were used for meta-analysis.
Fischer et al. reported their first experience on the use of ILA as a bridge to LTx. In their study, 12 patients at high urgency status for LTx, who also had severe ventilation refractory hypercapnea and respiratory acidosis, were connected to ILA prior to LTx. Seven patients had a systemic infection or sepsis prior to ILA insertion. Six hours after initiation of ILA, the partial pressure of CO2 in arterial blood significantly decreased (P < .05) and arterial blood pH significantly improved (P < .05) and remained stable for one week (last time point reported). The partial pressure of oxygen in arterial blood improved from 71 mmHg to 83 mmHg 6 hours after insertion of ILA. The ratio of PaO2/FiO2 improved from 135 at baseline to 168 at 24 hours after insertion of ILA but returned to baseline values in the following week.
Trials on ECMO
The UK-based CESAR trial was conducted to assess the effectiveness and cost of ECMO therapy for severe, acute respiratory failure. The trial protocol was published in 2006 and details of the methods used for the economic evaluation were published in 2008. The study itself was a pragmatic trial (similar to a UK trial of neonatal ECMO), in which best standard practice was compared with an ECMO protocol. The trial involved 180 patients with acute but potentially reversible respiratory failure, each also having a Murray score of ≥ 3.0 or uncompensated hypercapnea at a pH of < 7.2. Enrolled patients were randomized in a 1:1 ratio to receive either conventional ventilation treatment or ECMO while on a ventilator. Conventional management included intermittent positive pressure ventilation, high frequency oscillatory ventilation, or both. As a pragmatic trial, a specific management protocol was not followed; rather, the treatment centres were advised to follow a low-volume, low-pressure ventilation strategy. A tidal volume of 4 to 8 mL/kg body weight and a plateau pressure of < 30 cm H2O were recommended.
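As a concrete reading of the entry criteria and the recommended lung-protective settings described above, here is a minimal sketch (the patient-record fields are assumed; this is not the trial's screening software):

```python
from dataclasses import dataclass

@dataclass
class Patient:
    murray_score: float            # Murray lung injury score
    ph: float                      # arterial pH
    weight_kg: float
    tidal_volume_ml: float
    plateau_pressure_cmh2o: float

def cesar_eligible(p: Patient) -> bool:
    """Severe but potentially reversible respiratory failure:
    Murray score >= 3.0, or uncompensated hypercapnea with pH < 7.2."""
    return p.murray_score >= 3.0 or p.ph < 7.2

def within_protective_ventilation(p: Patient) -> bool:
    """Recommended strategy: tidal volume 4-8 mL/kg body weight
    and plateau pressure < 30 cm H2O."""
    vt_per_kg = p.tidal_volume_ml / p.weight_kg
    return 4.0 <= vt_per_kg <= 8.0 and p.plateau_pressure_cmh2o < 30.0

example = Patient(murray_score=3.2, ph=7.31, weight_kg=70,
                  tidal_volume_ml=420, plateau_pressure_cmh2o=27)
print(cesar_eligible(example), within_protective_ventilation(example))
```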
Conclusions
ILA
Bridge to recovery
No RCTs or observational studies compared ILA to other treatment modalities.
Case series have shown that ILA therapy results in significant CO2 removal from arterial blood and correction of respiratory acidosis, as well as an improvement in oxygen transfer.
ILA therapy enabled a lowering of respiratory settings to protect the lungs without causing a negative impact on arterial blood CO2 and arterial blood pH.
The impact of ILA on patient long-term survival cannot be determined through the studies reviewed.
In-hospital mortality across studies ranged from 20% to 65%.
Ischemic complications were the most frequent adverse events following ILA therapy.
Leg amputation is a rare but possible outcome of ILA therapy, having occurred in about 0.9% of patients in these case series. New techniques involving the insertion of additional cannula into the femoral artery to perfuse the leg may lower this rate.
Bridge to LTx
The results of one case series (n=12) showed that ILA effectively removes CO2 from arterial blood and corrects respiratory acidosis in patients with ventilation refractory hypercapnea awaiting a LTx.
Eight of the 12 patients (67%) awaiting a LTx were successfully transplanted, and one-year survival for those transplanted was 80%.
Since all studies are case series, the grade of the evidence for these observations is classified as “LOW”.
ECMO
Bridge to recovery
Based on the results of a pragmatic trial and an intention-to-treat analysis, referral of patients to an ECMO-based centre significantly improves patient survival without disability compared to conventional ventilation. The results of the CESAR trial showed that:
For patients with information about disability, survival without severe disability was significantly higher in the ECMO arm.
Assuming that the three patients in the conventional ventilation arm who did not have information about severe disability were all disabled, the results were also significant.
Assuming that none of these patients were disabled, the results were of borderline significance.
A greater, though not statistically significant, proportion of patients in the ECMO arm survived.
The rate of serious adverse events was higher among patients in the ECMO group.
The grade of evidence for the above observations is classified as “HIGH”.
Bridge to LTx
No studies fitting the inclusion criteria were identified.
There are no accurate data on the use of ECMO in patients awaiting LTx.
Economic Analysis
The objective of the economic analysis was to determine the costs associated with extracorporeal lung support technologies for bridge to LTx in adults. A literature search was conducted for which the target population was adults eligible for extracorporeal lung support. The primary analytic perspective was that of the Ministry of Health and Long-Term Care (MOHLTC). Articles published in English and fitting the following inclusion criteria were reviewed:
Full economic evaluations including cost-effectiveness analyses (CEA), cost-utility analyses (CUA), cost-benefit analyses (CBA);
Economic evaluations reporting incremental cost-effectiveness ratios (ICER) i.e. cost per quality adjusted life year (QALY), life years gained (LYG), or cost per event avoided; and
Studies in patients eligible for lung support technologies as a bridge to lung transplantation.
The search yielded no articles reporting comparative economic analyses.
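For reference, the incremental cost-effectiveness ratio (ICER) named in the inclusion criteria is the incremental cost divided by the incremental effect, for example cost per QALY gained. A minimal sketch with hypothetical comparator figures (only the $186,000 ILA cost per case appears later in this report; the other numbers are illustrative assumptions):

```python
def icer(cost_new: float, cost_comparator: float,
         effect_new: float, effect_comparator: float) -> float:
    """Incremental cost-effectiveness ratio, e.g. cost per QALY gained."""
    delta_cost = cost_new - cost_comparator
    delta_effect = effect_new - effect_comparator
    if delta_effect == 0:
        raise ValueError("No incremental effect; ICER undefined.")
    return delta_cost / delta_effect

# Hypothetical example: a bridging strategy costing $186,000 per case versus
# an assumed $60,000 for comparator care, gaining 1.5 vs 0.9 QALYs.
print(f"ICER: ${icer(186_000, 60_000, 1.5, 0.9):,.0f} per QALY gained")
```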
Resource Use and Costs
Costs associated with both ILA and ECMO (outlined in Table ES-1) were obtained from the University Health Network (UHN) case costing initiative (personal communication, UHN, January 2010). Consultation with a clinical expert in the field was also conducted to verify resource utilization. The consultant was situated at the UHN in Toronto. The UHN has one ECMO machine, which cost approximately $100,000. The system is 18 years old and is used an average of 3 to 4 times a year with 35 procedures being performed over the last 9 years. The disposable cost per patient associated with ECMO is, on average, $2,200. There is a maintenance cost associated with the machine (not reported by the UHN), which is currently absorbed by the hospital’s biomedical engineering department.
The average capital cost of an ILA device is $7,100 per device, per patient, while the average cost of the reusable pump is $65,000. The UHN has performed 16 of these procedures over the last 2.5 years. Similarly, there is a maintenance cost that was not reported by the UHN but is absorbed by the hospital’s biomedical engineering department.
Resources Associated with Extracorporeal Lung Support Technologies
Hospital costs associated with ILA were based on the average cost incurred by the hospital for 11 cases performed in the FY 07/08 (personal communication, UHN, January 2010). The resources incurred with this hospital procedure included:
Device and disposables
OR transplant
Surgical ICU
Laboratory work
Medical imaging
Pharmacy
Clinical nutrition
Physiotherapy
Occupational therapy
Speech and language pathology
Social work
The average length of stay in hospital was 61 days for ILA (range: 5 to 164 days) and the average direct cost was $186,000 per case (range: $19,000 to $552,000). This procedure has a high staffing requirement to monitor patients in hospital, driving up the average cost per case.
PMCID: PMC3415698  PMID: 23074408
6.  DIP–STR: Highly Sensitive Markers for the Analysis of Unbalanced Genomic Mixtures 
Human Mutation  2013;34(4):644-654.
Samples containing highly unbalanced DNA mixtures from two individuals commonly occur both in forensic mixed stains and in peripheral blood DNA microchimerism induced by pregnancy or following organ transplant. Because of PCR amplification bias, the genetic identification of a DNA that contributes trace amounts to a mixed sample represents a tremendous challenge. This means that standard genetic markers, namely microsatellites, also referred to as short tandem repeats (STRs), and single-nucleotide polymorphisms (SNPs), have limited power in addressing common questions of forensic and medical genetics. To address this issue, we developed a molecular marker, named DIP–STR, that relies on pairing deletion–insertion polymorphisms (DIPs) with STRs. This novel analytical approach allows for the unambiguous genotyping of a minor component in the presence of a major component, where DIP–STR genotypes of the minor component were successfully procured at ratios up to 1:1,000. The compound nature of this marker generates a high level of polymorphism that is suitable for identity testing. Here, we demonstrate the power of the DIP–STR approach on an initial set of nine markers surveyed in a Swiss population. Finally, we discuss the limitations and potential applications of our new system, including preliminary tests on clinical samples and estimates of their performance on simulated DNA mixtures.
doi:10.1002/humu.22280
PMCID: PMC3675636  PMID: 23355272
compound genetic marker; forensic; DNA microchimerism; diagnostics
7.  Genetic Predisposition to Increased Blood Cholesterol and Triglyceride Lipid Levels and Risk of Alzheimer Disease: A Mendelian Randomization Analysis 
PLoS Medicine  2014;11(9):e1001713.
In this study, Proitsi and colleagues use a Mendelian randomization approach to dissect the causal nature of the association between circulating lipid levels and late onset Alzheimer's Disease (LOAD) and find that genetic predisposition to increased plasma cholesterol and triglyceride lipid levels is not associated with elevated LOAD risk.
Please see later in the article for the Editors' Summary
Background
Although altered lipid metabolism has been extensively implicated in the pathogenesis of Alzheimer disease (AD) through cell biological, epidemiological, and genetic studies, the molecular mechanisms linking cholesterol and AD pathology are still not well understood and contradictory results have been reported. We have used a Mendelian randomization approach to dissect the causal nature of the association between circulating lipid levels and late onset AD (LOAD) and test the hypothesis that genetically raised lipid levels increase the risk of LOAD.
Methods and Findings
We included 3,914 patients with LOAD, 1,675 older individuals without LOAD, and 4,989 individuals from the general population from six genome wide studies drawn from a white population (total n = 10,578). We constructed weighted genotype risk scores (GRSs) for four blood lipid phenotypes (high-density lipoprotein cholesterol [HDL-c], low-density lipoprotein cholesterol [LDL-c], triglycerides, and total cholesterol) using well-established SNPs in 157 loci for blood lipids reported by Willer and colleagues (2013). Both full GRSs using all SNPs associated with each trait at p < 5×10⁻⁸ and trait specific scores using SNPs associated exclusively with each trait at p < 5×10⁻⁸ were developed. We used logistic regression to investigate whether the GRSs were associated with LOAD in each study and results were combined together by meta-analysis. We found no association between any of the full GRSs and LOAD (meta-analysis results: odds ratio [OR] = 1.005, 95% CI 0.82–1.24, p = 0.962 per 1 unit increase in HDL-c; OR = 0.901, 95% CI 0.65–1.25, p = 0.530 per 1 unit increase in LDL-c; OR = 1.104, 95% CI 0.89–1.37, p = 0.362 per 1 unit increase in triglycerides; and OR = 0.954, 95% CI 0.76–1.21, p = 0.688 per 1 unit increase in total cholesterol). Results for the trait specific scores were similar; however, the trait specific scores explained much smaller phenotypic variance.
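The weighted genotype risk score described above is essentially a dosage-weighted sum of published per-allele effect sizes. A minimal sketch under assumed inputs (effect-allele dosages coded 0/1/2 and illustrative weights; this is not the study's own code):

```python
import numpy as np

def weighted_grs(dosages: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Weighted genotype risk score per individual.

    dosages: (n_individuals, n_snps) counts of the effect allele (0, 1, 2).
    weights: (n_snps,) per-allele effect sizes for the lipid trait.
    """
    return dosages @ weights

# Hypothetical toy data: 5 individuals, 4 trait-associated SNPs.
rng = np.random.default_rng(1)
dosages = rng.integers(0, 3, size=(5, 4)).astype(float)
weights = np.array([0.08, 0.12, 0.05, 0.20])   # assumed per-allele betas

scores = weighted_grs(dosages, weights)
print(scores)  # one lipid-trait risk score per individual
```

In the study itself these scores were then entered into per-cohort logistic regressions of LOAD status and combined by meta-analysis.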
Conclusions
Genetic predisposition to increased blood cholesterol and triglyceride lipid levels is not associated with elevated LOAD risk. The observed epidemiological associations between abnormal lipid levels and LOAD risk could therefore be attributed to the result of biological pleiotropy or could be secondary to LOAD. Limitations of this study include the small proportion of lipid variance explained by the GRS, biases in case-control ascertainment, and the limitations implicit to Mendelian randomization studies. Future studies should focus on larger LOAD datasets with longitudinal sampled peripheral lipid measures and other markers of lipid metabolism, which have been shown to be altered in LOAD.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Currently, about 44 million people worldwide have dementia, a group of brain disorders characterized by an irreversible decline in memory, communication, and other “cognitive” functions. Dementia mainly affects older people and, because people are living longer, experts estimate that more than 135 million people will have dementia by 2050. The commonest form of dementia is Alzheimer disease. In this type of dementia, protein clumps called plaques and neurofibrillary tangles form in the brain and cause its degeneration. The earliest sign of Alzheimer disease is usually increasing forgetfulness. As the disease progresses, affected individuals gradually lose their ability to deal with normal daily activities such as dressing. They may become anxious or aggressive or begin to wander. They may also eventually lose control of their bladder and of other physical functions. At present, there is no cure for Alzheimer disease although some of its symptoms can be managed with drugs. Most people with the disease are initially cared for at home by relatives and other unpaid carers, but many patients end their days in a care home or specialist nursing home.
Why Was This Study Done?
Several lines of evidence suggest that lipid metabolism (how the body handles cholesterol and other fats) is altered in patients whose Alzheimer disease develops after the age of 60 years (late onset Alzheimer disease, LOAD). In particular, epidemiological studies (observational investigations that examine the patterns and causes of disease in populations) have found an association between high amounts of cholesterol in the blood in midlife and the risk of LOAD. However, observational studies cannot prove that abnormal lipid metabolism (dyslipidemia) causes LOAD. People with dyslipidemia may share other characteristics that cause both dyslipidemia and LOAD (confounding) or LOAD might actually cause dyslipidemia (reverse causation). Here, the researchers use “Mendelian randomization” to examine whether lifetime changes in lipid metabolism caused by genes have a causal impact on LOAD risk. In Mendelian randomization, causality is inferred from associations between genetic variants that mimic the effect of a modifiable risk factor and the outcome of interest. Because gene variants are inherited randomly, they are not prone to confounding and are free from reverse causation. So, if dyslipidemia causes LOAD, genetic variants that affect lipid metabolism should be associated with an altered risk of LOAD.
What Did the Researchers Do and Find?
The researchers investigated whether genetic predisposition to raised lipid levels increased the risk of LOAD in 10,578 participants (3,914 patients with LOAD, 1,675 elderly people without LOAD, and 4,989 population controls) using data collected in six genome wide studies looking for gene variants associated with Alzheimer disease. The researchers constructed a genotype risk score (GRS) for each participant using genetic risk markers for four types of blood lipids on the basis of the presence of single nucleotide polymorphisms (SNPs, a type of gene variant) in their DNA. When the researchers used statistical methods to investigate the association between the GRS and LOAD among all the study participants, they found no association between the GRS and LOAD.
What Do These Findings Mean?
These findings suggest that the genetic predisposition to raised blood levels of four types of lipid is not causally associated with LOAD risk. The accuracy of this finding may be affected by several limitations of this study, including the small proportion of lipid variance explained by the GRS and the validity of several assumptions that underlie all Mendelian randomization studies. Moreover, because all the participants in this study were white, these findings may not apply to people of other ethnic backgrounds. Given their findings, the researchers suggest that the observed epidemiological associations between abnormal blood lipid levels and LOAD risk could be secondary to variation in lipid levels for reasons other than genetics, or to LOAD itself, a possibility that can be investigated by studying blood lipid levels and other markers of lipid metabolism over time in large groups of patients with LOAD. Importantly, however, these findings provide new information about the role of lipids in LOAD development that may eventually lead to new therapeutic and public-health interventions for Alzheimer disease.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001713.
The UK National Health Service Choices website provides information (including personal stories) about Alzheimer's disease
The UK not-for-profit organization Alzheimer's Society provides information for patients and carers about dementia, including personal experiences of living with Alzheimer's disease
The US not-for-profit organization Alzheimer's Association also provides information for patients and carers about dementia and personal stories about dementia
Alzheimer's Disease International is the international federation of Alzheimer disease associations around the world; it provides links to individual associations, information about dementia, and links to World Alzheimer Reports
MedlinePlus provides links to additional resources about Alzheimer's disease (in English and Spanish)
Wikipedia has a page on Mendelian randomization (note: Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001713
PMCID: PMC4165594  PMID: 25226301
8.  Variation in PTX3 Is Associated with Primary Graft Dysfunction after Lung Transplantation 
Rationale: Elevated long pentraxin-3 (PTX3) levels are associated with the development of primary graft dysfunction (PGD) after lung transplantation. Abnormalities in innate immunity, mediated by PTX3 release, may play a role in PGD pathogenesis.
Objectives: Our goal was to test whether variants in the gene encoding PTX3 are risk factors for PGD.
Methods: We performed a candidate gene association study in recipients from the multicenter, prospective Lung Transplant Outcomes Group cohort enrolled between July 2002 and July 2009. The primary outcome was International Society for Heart and Lung Transplantation grade 3 PGD within 72 hours of transplantation. Targeted genotyping of 10 haplotype-tagging PTX3 single-nucleotide polymorphisms (SNPs) was performed in lung transplant recipients. The association between PGD and each SNP was evaluated by logistic regression, adjusting for pretransplantation lung disease, cardiopulmonary bypass use, and population stratification. The association between SNPs and plasma PTX3 levels was tested across genotypes in a subset of recipients with idiopathic pulmonary fibrosis.
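A hedged sketch of the kind of covariate-adjusted logistic regression described above, fitted for one SNP with statsmodels; the toy data, additive genotype coding, and covariate names are illustrative assumptions, not the study's actual variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "pgd": rng.integers(0, 2, n),   # grade 3 PGD within 72 h (0/1)
    "snp": rng.integers(0, 3, n),   # additive genotype coding (0/1/2)
    "dx": rng.integers(0, 2, n),    # pretransplant diagnosis indicator
    "cpb": rng.integers(0, 2, n),   # cardiopulmonary bypass use (0/1)
    "pc1": rng.normal(size=n),      # ancestry principal component
})

# Logistic regression of PGD on one SNP, adjusted for the covariates above.
X = sm.add_constant(df[["snp", "dx", "cpb", "pc1"]])
fit = sm.Logit(df["pgd"], X).fit(disp=0)
print(np.exp(fit.params["snp"]), fit.pvalues["snp"])  # odds ratio and p-value
```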
Measurements and Main Results: Six hundred fifty-four lung transplant recipients were included. The incidence of PGD was 29%. Two linked 5′ region variants, rs2120243 and rs2305619, were associated with PGD (odds ratio, 1.5; 95% confidence interval, 1.1 to 1.9; P = 0.006 and odds ratio, 1.4; 95% confidence interval, 1.1 to 1.9; P = 0.007, respectively). The minor allele of rs2305619 was significantly associated with higher plasma PTX3 levels measured pretransplantation (P = 0.014) and at 24 hours (P = 0.047) after transplantation in patients with idiopathic pulmonary fibrosis.
Conclusions: Genetic variants of PTX3 are associated with PGD after lung transplantation and with increased plasma PTX3 levels.
doi:10.1164/rccm.201204-0692OC
PMCID: PMC3480532  PMID: 22822025
primary graft dysfunction; single-nucleotide polymorphism; long pentraxin 3; lung transplantation
9.  Spirometric Assessment of Lung Transplant Patients: One Year Follow-Up 
Clinics (Sao Paulo, Brazil)  2009;64(6):519-525.
OBJECTIVE:
The purpose of this study was to compare spirometry data between patients who underwent single-lung or double-lung transplantation during the first year after transplantation.
INTRODUCTION:
Lung transplantation, which was initially described as an experimental method in 1963, has become a therapeutic option for patients with advanced pulmonary diseases due to improvements in organ conservation, surgical technique, immunosuppressive therapy and treatment of post-operative infections.
METHODS:
We retrospectively reviewed the records of the 39 patients who received lung transplantation in our institution between August 2003 and August 2006. Twenty-nine patients survived one year post-transplantation, and all of them were followed.
RESULTS:
The increase in lung function in the double-lung transplant group was more substantial than that of the single-lung transplant group, exhibiting a statistically significant difference from the 1st month onward in both the forced expiratory volume in one second (FEV1) and the forced vital capacity (FVC) in comparison to the pre-transplant values (p <0.05).
Comparison between the double-lung and single-lung transplant groups of emphysema patients demonstrated a significant difference in lung function beginning in the 3rd month after transplantation.
DISCUSSION:
The analyses of the whole group of transplant recipients and the sub-group of emphysema patients suggest the superiority of bilateral transplant over the unilateral alternative. Although the pre-transplant values of lung function were worse in the double-lung group, this difference was no longer significant in the subsequent months after surgery.
CONCLUSION:
Although both groups demonstrated functional improvement after transplantation, there was a clear tendency to greater improvement in FVC and FEV1 in the bilateral transplant group. Among our subjects, double-lung transplantation improved lung function.
doi:10.1590/S1807-59322009000600006
PMCID: PMC2705150  PMID: 19578655
Lung transplantation; Spirometry; Respiratory function tests; Emphysema; Insufflation
10.  Non‐tuberculous mycobacteria in end stage cystic fibrosis: implications for lung transplantation 
Thorax  2006;61(6):507-513.
WC and NS contributed equally.
Non‐tuberculous mycobacteria (NTM) frequently colonise patients with end stage cystic fibrosis (CF), but their impact on the course of the disease following lung transplantation is unknown.
Methods
Lung transplant recipients with CF who underwent lung transplantation at our institution between January 1990 and May 2003 (n = 146) and CF patients awaiting lung transplantation in May 2003 (n = 31) were studied retrospectively.
Results
The prevalence rate of NTM isolated from respiratory cultures in patients with end stage CF referred for lung transplantation was 19.7%, compared with a prevalence rate of 13.7% for NTM isolates in CF lung transplant recipients. The overall prevalence of invasive NTM disease after lung transplantation was low (3.4%) and was predicted most strongly by pre‐transplant NTM isolation (p = 0.001, Fisher's exact test, odds ratio (OR) 6.13, 95% CI 3.2 to 11.4). This association was restricted to Mycobacterium abscessus (p = 0.005, Fisher's exact test, OR 7.45, 95% CI 2.9 to 16.9). While NTM disease caused significant morbidity in a small number of patients after transplantation, it was successfully treated and did not influence the post‐transplant course of the disease.
Conclusion
The isolation of NTM before transplantation in CF patients should not be an exclusion criterion for lung transplantation, but it may alert the clinician to patients at risk of recurrence following transplantation.
doi:10.1136/thx.2005.049247
PMCID: PMC2111233  PMID: 16601086
cystic fibrosis; lung transplantation; non‐tuberculous mycobacteria; Mycobacterium abscessus
11.  Evaluation of the Lung Cancer Risks at Which to Screen Ever- and Never-Smokers: Screening Rules Applied to the PLCO and NLST Cohorts 
PLoS Medicine  2014;11(12):e1001764.
Martin Tammemägi and colleagues evaluate which risk groups of individuals, including nonsmokers and high-risk individuals from 65 to 80 years of age, should be screened for lung cancer using computed tomography.
Please see later in the article for the Editors' Summary
Background
Lung cancer risks at which individuals should be screened with computed tomography (CT) for lung cancer are undecided. This study's objectives are to identify a risk threshold for selecting individuals for screening, to compare its efficiency with the U.S. Preventive Services Task Force (USPSTF) criteria for identifying screenees, and to determine whether never-smokers should be screened. Lung cancer risks are compared between smokers aged 55–64 and ≥65–80 y.
Methods and Findings
Applying the PLCOm2012 model, a model based on 6-y lung cancer incidence, we identified the risk threshold above which National Lung Screening Trial (NLST, n = 53,452) CT arm lung cancer mortality rates were consistently lower than rates in the chest X-ray (CXR) arm. We evaluated the USPSTF and PLCOm2012 risk criteria in intervention arm (CXR) smokers (n = 37,327) of the Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO). The numbers of smokers selected for screening, and the sensitivities, specificities, and positive predictive values (PPVs) for identifying lung cancers were assessed. A modified model (PLCOall2014) evaluated risks in never-smokers. At PLCOm2012 risk ≥0.0151, the 65th percentile of risk, the NLST CT arm mortality rates are consistently below the CXR arm's rates. The number needed to screen to prevent one lung cancer death in the 65th to 100th percentile risk group is 255 (95% CI 143 to 1,184), and in the 30th to <65th percentile risk group is 963 (95% CI 291 to −754); the number needed to screen could not be estimated in the <30th percentile risk group because of absence of lung cancer deaths. When applied to PLCO intervention arm smokers, compared to the USPSTF criteria, the PLCOm2012 risk ≥0.0151 threshold selected 8.8% fewer individuals for screening (p<0.001) but identified 12.4% more lung cancers (sensitivity 80.1% [95% CI 76.8%–83.0%] versus 71.2% [95% CI 67.6%–74.6%], p<0.001), had fewer false-positives (specificity 66.2% [95% CI 65.7%–66.7%] versus 62.7% [95% CI 62.2%–63.1%], p<0.001), and had higher PPV (4.2% [95% CI 3.9%–4.6%] versus 3.4% [95% CI 3.1%–3.7%], p<0.001). In total, 26% of individuals selected for screening based on USPSTF criteria had risks below the threshold PLCOm2012 risk ≥0.0151. Of PLCO former smokers with quit time >15 y, 8.5% had PLCOm2012 risk ≥0.0151. None of 65,711 PLCO never-smokers had PLCOm2012 risk ≥0.0151. Risks and lung cancers were significantly greater in PLCO smokers aged ≥65–80 y than in those aged 55–64 y. This study omitted cost-effectiveness analysis.
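For readers unfamiliar with the screening metrics quoted above, the short sketch below shows how sensitivity, specificity, positive predictive value, and the number needed to screen are conventionally computed; all counts and rates in it are made up for illustration and do not come from the NLST or PLCO data.

```python
# Hypothetical classification counts for a risk-based selection rule.
tp, fp, fn, tn = 80, 1800, 20, 3500

sensitivity = tp / (tp + fn)   # lung cancers correctly selected for screening
specificity = tn / (tn + fp)   # people without cancer correctly not selected
ppv = tp / (tp + fp)           # cancers among those selected

# Number needed to screen = 1 / absolute reduction in lung cancer mortality.
mortality_cxr, mortality_ct = 0.021, 0.017   # hypothetical arm-level rates
nns = 1 / (mortality_cxr - mortality_ct)

print(round(sensitivity, 3), round(specificity, 3), round(ppv, 3), round(nns))
```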
Conclusions
The USPSTF criteria for CT screening include some low-risk individuals and exclude some high-risk individuals. Use of the PLCOm2012 risk ≥0.0151 criterion can improve screening efficiency. Currently, never-smokers should not be screened. Smokers aged ≥65–80 y are a high-risk group who may benefit from screening.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Lung cancer is the most commonly occurring cancer in the world and the most common cause of cancer-related deaths. Like all cancers, lung cancer occurs when cells acquire genetic changes that allow them to grow uncontrollably and to move around the body (metastasize). The most common trigger for these genetic changes in lung cancer is exposure to cigarette smoke. Symptoms of lung cancer include a persistent cough and breathlessness. If lung cancer is diagnosed when it is confined to the lung (stage I), the tumor can often be removed surgically. Stage II tumors, which have spread into nearby lymph nodes, are usually treated with surgery plus chemotherapy or radiotherapy. For more advanced lung cancers that have spread throughout the chest (stage III) or the body (stage IV), surgery is rarely helpful and these tumors are treated with chemotherapy and radiotherapy alone. Overall, because most lung cancers are not detected until they are advanced, less than 17% of people diagnosed with lung cancer survive for five years.
Why Was This Study Done?
Screening for lung cancer—looking for early disease in healthy people—could save lives. In the US National Lung Screening Trial (NLST), annual screening with computed tomography (CT) reduced lung cancer mortality by 20% among smokers at high risk of developing cancer compared with screening with a chest X-ray. But what criteria should be used to decide who is screened for lung cancer? The US Preventive Services Task Force (USPSTF), for example, recommends annual CT screening of people who are 55–80 years old, have smoked 30 or more pack-years (one pack-year is defined as a pack of cigarettes per day for one year), and—if they are former smokers—quit smoking less than 15 years ago. However, some experts think lung cancer risk prediction models—statistical models that estimate risk based on numerous personal characteristics—should be used to select people for screening. Here, the researchers evaluate PLCOm2012, a lung cancer risk prediction model based on the incidence of lung cancer among smokers enrolled in the US Prostate, Lung, Colorectal and Ovarian Cancer Screening Trial (PLCO). Specifically, the researchers use NLST and PLCO screening trial data to identify a PLCOm2012 risk threshold for selecting people for screening and to compare the efficiency of the PLCOm2012 model and the USPSTF criteria for identifying “screenees.”
What Did the Researchers Do and Find?
By analyzing NLST data, the researchers calculated that at PLCOm2012 risk ≥0.0151, mortality (death) rates among NLST participants screened with CT were consistently below mortality rates among NLST participants screened with chest X-ray and that 255 people with a PLCOm2012 risk ≥0.0151 would need to be screened to prevent one lung cancer death. Next, they used data collected from smokers in the screened arm of the PLCO trial to compare the efficiency of the PLCOm2012 and USPSTF criteria for identifying screenees. They found that 8.8% fewer people had a PLCOm2012 risk ≥0.0151 than met USPSTF criteria for screening, but 12.4% more lung cancers were identified. Thus, using PLCOm2012 improved the sensitivity and specificity of the selection of individuals for lung cancer screening over using USPSTF criteria. Notably, 8.5% of PLCO former smokers with quit times of more than 15 years had PLCOm2012 risk ≥0.0151, none of the PLCO never-smokers had PLCOm2012 risk ≥0.0151, and the calculated risks and incidence of lung cancer were greater among PLCO smokers aged ≥65–80 years than among those aged 55–64 years.
What Do These Findings Mean?
Despite the absence of a cost-effectiveness analysis in this study, these findings suggest that the use of the PLCOm2012 risk ≥0.0151 threshold rather than USPSTF criteria for selecting individuals for lung cancer screening could improve screening efficiency. The findings have several other important implications. First, these findings suggest that screening may be justified in people who stopped smoking more than 15 years ago; USPSTF currently recommends that screening stop once an individual's quit time exceeds 15 years. Second, these findings do not support lung cancer screening among never-smokers. Finally, these findings suggest that smokers aged ≥65–80 years might benefit from screening, although the presence of additional illnesses and reduced life expectancy need to be considered before recommending the provision of routine lung cancer screening to this section of the population.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001764.
The US National Cancer Institute provides information about all aspects of lung cancer for patients and health-care professionals, including information on lung cancer screening (in English and Spanish)
Cancer Research UK also provides detailed information about lung cancer and about lung cancer screening
The UK National Health Service Choices website has a page on lung cancer that includes personal stories
MedlinePlus provides links to other sources of information about lung cancer (in English and Spanish)
Information about the USPSTF recommendations for lung cancer screening is available
doi:10.1371/journal.pmed.1001764
PMCID: PMC4251899  PMID: 25460915
12.  A longitudinal study of patients’ symptoms before and during the first year after lung transplantation 
Clinical transplantation  2012;26(6):E576-E589.
Background
Lung transplantation provides a viable option for survival of end-stage respiratory disease. In addition to prolonging survival, there is considerable interest in improving patient-related outcomes such as transplant recipients’ symptom experiences.
Methods
A prospective, repeated measures design was used to describe the symptom experience of 85 lung transplant recipients between 2000 and 2005. The Transplant Symptom Inventory (TSI) was administered before and at 1, 3, 6, 9, and 12 months post-transplant. Ridit analysis provided a unique method for describing symptom experiences and changes.
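Ridit analysis scores each ordinal response category against a reference distribution, and a group's mean ridit near 0.5 indicates no shift relative to that reference. The sketch below illustrates the standard calculation with hypothetical symptom-distress counts, not the TSI data.

```python
import numpy as np

# Hypothetical counts across ordinal distress categories (none ... severe).
reference = np.array([40, 25, 20, 15])    # e.g., pre-transplant distribution
comparison = np.array([60, 20, 12, 8])    # e.g., a post-transplant time point

# Ridit for category j: proportion of the reference below j plus half the
# proportion within j.
ref_p = reference / reference.sum()
ridits = np.cumsum(ref_p) - 0.5 * ref_p

# Mean ridit of the comparison group: estimated probability that a randomly
# chosen comparison subject reports more distress than a reference subject.
mean_ridit = float(np.sum(comparison / comparison.sum() * ridits))
print(mean_ridit)
```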
Results
After lung transplantation, significant (p<.05) improvements were reported for the most frequently occurring and most distressing pre-transplant symptoms (e.g., shortness of breath with activity). Marked increases in the frequency and distress of new symptoms, such as tremors, were also reported. Patterns of symptom frequency and distress varied with the time since transplant.
Conclusion
The findings provide data-based information that can be used to inform pre- and post-transplant patient education and also help caregivers anticipate a general time frame for symptom changes in order to prevent or minimize symptoms and their associated distress. In addition, symptoms are described, using an innovative method of illustration which shows “at-a-glance” changes or lack of changes in patients’ symptoms from pre- to post-lung transplant.
doi:10.1111/ctr.12002
PMCID: PMC3518596  PMID: 22988999
symptoms; symptom experience; lung transplant; transplant candidates; transplant recipients
13.  Fialuridine Induces Acute Liver Failure in Chimeric TK-NOG Mice: A Model for Detecting Hepatic Drug Toxicity Prior to Human Testing 
PLoS Medicine  2014;11(4):e1001628.
Gary Peltz, Jeffrey Glenn, and colleagues report that a pre-clinical mouse toxicology model can detect liver toxicity of a drug that caused liver failure in several early clinical trial participants in 1993.
Please see later in the article for the Editors' Summary
Background
Seven of 15 clinical trial participants treated with a nucleoside analogue (fialuridine [FIAU]) developed acute liver failure. Five treated participants died, and two required a liver transplant. Preclinical toxicology studies in mice, rats, dogs, and primates did not provide any indication that FIAU would be hepatotoxic in humans. Therefore, we investigated whether FIAU-induced liver toxicity could be detected in chimeric TK-NOG mice with humanized livers.
Methods and Findings
Control and chimeric TK-NOG mice with humanized livers were treated orally with FIAU 400, 100, 25, or 2.5 mg/kg/d. The response to drug treatment was evaluated by measuring plasma lactate and liver enzymes, by assessing liver histology, and by electron microscopy. After treatment with FIAU 400 mg/kg/d for 4 d, chimeric mice developed clinical and serologic evidence of liver failure and lactic acidosis. Analysis of liver tissue revealed steatosis in regions with human, but not mouse, hepatocytes. Electron micrographs revealed lipid and mitochondrial abnormalities in the human hepatocytes in FIAU-treated chimeric mice. Dose-dependent liver toxicity was detected in chimeric mice treated with FIAU 100, 25, or 2.5 mg/kg/d for 14 d. Liver toxicity did not develop in control mice that were treated with the same FIAU doses for 14 d. In contrast, treatment with another nucleotide analogue (sofosbuvir 440 or 44 mg/kg/d po) for 14 d, which did not cause liver toxicity in human trial participants, did not cause liver toxicity in mice with humanized livers.
Conclusions
FIAU-induced liver toxicity could be readily detected using chimeric TK-NOG mice with humanized livers, even when the mice were treated with a FIAU dose that was only 10-fold above the dose used in human participants. The clinical features, laboratory abnormalities, liver histology, and ultra-structural changes observed in FIAU-treated chimeric mice mirrored those of FIAU-treated human participants. The use of chimeric mice in preclinical toxicology studies could improve the safety of candidate medications selected for testing in human participants.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Before new drugs are approved for clinical use, they undergo extensive preclinical (laboratory-based) and clinical testing. In the preclinical studies, scientists investigate the causes of diseases, identify potential new drugs, and test promising drug candidates in animals. Animal testing is performed to determine whether the new drug is likely to work, and to screen for drug-induced toxicity. In preclinical toxicology studies, new drugs are given to two or more animal species to find out whether the drug has any short- or long-term toxic effects such as damage to the liver (hepatotoxicity). Drugs that pass these animal tests enter clinical trials. Phase I clinical trials test new drugs in a handful of healthy volunteers or patients to evaluate their safety and to identify possible side effects. In phase II trials, a larger group of patients receives the new drug to evaluate its safety further and to get an initial idea of its effectiveness. Finally, in phase III trials, very large groups of patients are randomly assigned to receive the new drug or an established treatment for their disease. These randomized controlled trials provide detailed information about the effectiveness and safety of a candidate drug, and must be completed before a drug can be approved for clinical use.
Why Was This Study Done?
Since animals are not perfect models for people, candidate drugs can cause toxicities in clinical trials that were not predicted by preclinical toxicology testing performed using animal species. For example, in 1993, 15 participants in a phase II trial were given a nucleoside analogue called fialuridine to treat hepatitis B virus infection (nucleoside analogues often have antiviral activity). Seven participants developed liver failure and lactic acidosis (buildup of lactic acid in the blood). Analysis of liver tissue from the affected participants revealed steatosis (fatty degeneration), intracellular fat droplets, and swollen mitochondria (these organelles are the powerhouses of the cell). Five participants subsequently died, and two had to have a liver transplant. In preclinical toxicology testing in mice, rats, dogs, and primates, there had been no indications that fialuridine would be hepatotoxic in people. It now seems that the expression of a nucleoside transporter in the mitochondria of humans but not of other animals may underlie the human-specific mitochondrial toxicity and hepatotoxicity of fialuridine. With several other nucleoside analogues in development, a better screening tool for human-specific mitochondrial toxicity is needed. In this study, the researchers investigate whether fialuridine toxicity can be detected in TK-NOG mice with chimeric (humanized) livers. TK-NOG mice are immunodeficient mice that have been genetically engineered so that human liver cells (hepatocytes) transplanted into these animals establish a long-lived mature “human organ.”
What Did the Researchers Do and Find?
The researchers treated chimeric (with transplanted human liver cells) and control (without transplanted human liver cells) TK-NOG mice with several doses of fialuridine. After treatment with the highest dose (1,600-fold above the dose used in the phase II trial) for four days, the chimeric mice developed liver failure and lactic acidosis. Moreover, steatosis and lipid and mitochondrial abnormalities developed in the regions of their livers that contained human hepatocytes but not in regions that contained mouse hepatocytes. Notably, the control mice had not developed liver toxicity after 14 days of treatment with the highest dose of drug. Liver toxicity was also easily detectable in chimeric mice that had been treated for 14 days with a fialuridine dose only 10-fold above that used in the human trial. Treatment with another nucleoside analogue that does not cause liver toxicity in people did not cause liver toxicity in the chimeric mice.
What Do These Findings Mean?
These findings show that fialuridine-induced liver toxicity can be readily detected using TK-NOG mice that have humanized livers at drug doses only 10-fold higher than those that caused liver failure in the phase II trial. Although the liver toxicity developed much more quickly in these mice than in the human trial participants, the clinical features, laboratory abnormalities, and structural changes seen in the fialuridine-treated chimeric TK-NOG mice closely mirrored those seen in fialuridine-treated people. The use of TK-NOG mice containing humanized livers in toxicology testing will not reveal whether drugs have human-specific toxicities outside the liver. Since they are highly immunocompromised, chimeric TK-NOG mice cannot be used to detect immune-mediated drug toxicities. Nevertheless, these findings suggest that the use of chimeric mice in toxicology studies could help improve the safety of candidate drugs that are tested in humans.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001628.
The US Food and Drug Administration, the body that approves drugs for clinical use in the US, provides an overview for patients about the drug development process from the laboratory to the clinic
The UK Medicines and Healthcare Products Regulatory Agency (MHRA) provides more detailed information for patients and the public about the drug development process, including a section on preclinical research, which includes information on animal testing
The US National Institutes of Health provides information about clinical trials, including personal stories from people who have taken part in clinical trials
The UK National Health Service Choices website has information for patients about clinical trials and medical research, including personal stories about participation in clinical trials
Understanding Animal Research is a UK advocacy group that provides information about the importance of animal research to the public, teachers, scientists, journalists, and policy makers
Wikipedia has a page on animal testing (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001628
PMCID: PMC3988005  PMID: 24736310
14.  Kidney and liver organ transplantation in persons with human immunodeficiency virus 
Executive Summary
Objective
The objective of this analysis is to determine the effectiveness of solid organ transplantation in persons with end stage organ failure (ESOF) and human immunodeficiency virus (HIV+).
Clinical Need: Condition and Target Population
Patients with end stage organ failure who have been unresponsive to other forms of treatment eventually require solid organ transplantation. Similar to persons who are HIV negative (HIV−), persons living with HIV infection (HIV+) are at risk for ESOF from viral (e.g. hepatitis B and C) and non-viral aetiologies (e.g. coronary artery disease, diabetes, hepatocellular carcinoma). Additionally, HIV+ persons incur risks of ESOF from HIV-associated nephropathy (HIVAN), accelerated liver damage from hepatitis C virus (HCV+), with which an estimated 30% of HIV positive (HIV+) persons are co-infected, and coronary artery disease secondary to antiretroviral therapy. Concerns that the need for post-transplant immunosuppression and/or the interaction of immunosuppressive drugs with antiretroviral agents may accelerate the progression of HIV disease, as well as the risk of opportunistic infections post-transplantation, have led to uncertainty regarding the overall benefit of transplantation among HIV+ patients. Moreover, the scarcity of donor organs and their use in a population where the clinical benefit of transplantation is uncertain has limited the availability of organ transplantation to persons living with ESOF and HIV.
With the development of highly active antiretroviral therapy (HAART), which has been available in Canada since 1997, there has been improved survival and health-related quality of life for persons living with HIV. HAART can suppress HIV replication, enhance immune function, and slow disease progression. HAART-managed persons can now be expected to live longer than those in the pre-HAART era, and as a result many will now experience ESOF well before they experience life-threatening conditions related to HIV infection. Given their improved prognosis and the burden of illness they may experience from ESOF, the benefit of solid organ transplantation for HIV+ patients needs to be reassessed.
Evidence-Based Analysis Methods
Research Questions
What are the effectiveness and cost effectiveness of solid organ transplantation in HIV+ persons with ESOF?
Literature Search
A literature search was performed on September 22, 2009 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 1996 to September 22, 2009.
Inclusion Criteria
Systematic reviews with or without a meta-analysis, RCTs, and non-RCTs with controls
HIV+ population undergoing solid organ transplantation
HIV+ population managed with HAART therapy
Controls include persons undergoing solid organ transplantation who are i) HIV− ii) HCV+ mono-infected, and iii) HIV+ persons with ESOF not transplanted.
Studies that completed and reported results of a Kaplan-Meier survival curve analysis.
Studies with a minimum (mean or median) follow-up of 1 year.
English language citations
Exclusion Criteria
Case reports and case series were excluded from this review.
Outcomes of Interest
i) Risk of Death after transplantation
ii) Death censored graft survival (DCGS)
iii) HIV disease progression defined as the post transplant incidence of:
- opportunistic infections or neoplasms,
- CD4+ T-cell count < 200 cells/mm3, and
- any detectable level of plasma HIV viral load.
iv) Acute graft rejection,
v) Return to dialysis,
vi) Recurrence of HCV infection
Summary of Findings
No direct evidence comparing an HIV+ cohort undergoing transplantation with a similar HIV+ cohort not undergoing transplantation (wait list) was found in the literature search.
The results of this review are reported for the following comparison cohorts undergoing transplantation:
i) Kidney Transplantation: HIV+ cohort compared with HIV− cohort
ii) Liver Transplantation: HIV+ cohort compared with HIV− cohort
iii) Liver Transplantation: HIV+ HCV+ (co-infected) cohort compared with HCV+ (mono-infected) cohort
Kidney Transplantation: HIV+ vs. HIV−
Based on a pooled HIV+ cohort sample size of 285 patients across four studies, the risk of death after kidney transplantation in an HIV+ cohort does not differ from that of an HIV− cohort [hazard ratio (HR): 0.90; 95% CI: 0.36, 2.23]. The quality of evidence supporting this outcome is very low.
Death censored graft survival was reported in one study with an HIV+ cohort sample size of 100, and was statistically significantly different (p=.03) from that in the HIV− cohort (n=36,492). However, the quality of evidence supporting this outcome was determined to be very low. There was also uncertainty in the rate of return to dialysis after kidney transplantation in both the HIV+ and HIV− groups and the effect, if any, this may have on patient survival. Because of the very low quality evidence rating, the effect of kidney transplantation on HIV-disease progression is uncertain.
The rate of acute graft rejection was determined using the data from one study. There was a nonsignificant difference between the HIV+ and HIV− cohorts (OR 0.13; 95% CI: 0.01, 2.64), although again, because of very low quality evidence there is uncertainty in this estimate of effect.
Liver Transplantation: HIV+ vs. HIV−
Based on a combined HIV+ cohort sample size of 198 patients across five studies, the risk of death after liver transplantation in an HIV+ cohort (with at least 50% of the cohort co-infected with HCV+) is statistically significantly greater, by 64%, compared with an HIV− cohort (HR: 1.64; 95% CI: 1.32, 2.02). The quality of evidence supporting this outcome is very low.
Death censored graft survival was reported for an HIV+ cohort in one study (n=11); however, the DCGS rate of the contemporaneous control HIV− cohort was not reported. Because of sparse data, the quality of evidence supporting this outcome is very low, indicating that death censored graft survival is uncertain.
Both the CD4+ T-cell count and HIV viral load appear controlled post-transplant, with an incidence of opportunistic infection of 20.5%. However, the quality of this evidence for these outcomes is very low, indicating uncertainty in these effects. Similarly, because of very low quality evidence, there is uncertainty in the rate of acute graft rejection among both the HIV+ and HIV− groups.
Liver Transplantation: HIV+/HCV+ vs. HCV+
Based on a combined HIV+/HCV+ cohort sample size of 156 from seven studies, the risk of death after liver transplantation is significantly greater (2.8 fold) in a co-infected cohort compared with an HCV+ mono-infected cohort (HR: 2.81; 95% CI: 1.47, 5.37). The quality of evidence supporting this outcome is very low. Death censored graft survival evidence was not available.
Regarding disease progression, based on a combined sample size of 71 persons in the co-infected cohort, the CD4+ T-cell count and HIV viral load appear controlled post transplant; however, again the quality of evidence supporting this outcome is very low. The rate of opportunistic infection in the co-infected cohort was 7.2%. The quality of evidence supporting this estimate is very low, indicating uncertainty in these estimates of effect.
Based on a combined HIV+/HCV+ cohort (n=57), the rate of acute graft rejection does not differ from that of an HCV+ mono-infected cohort (OR: 0.88; 95% CI: 0.44, 1.76). Also, based on a combined HIV+/HCV+ cohort (n=83), the rate of HCV+ recurrence does not differ from that of an HCV+ mono-infected cohort (OR: 0.66; 95% CI: 0.27, 1.59). In both cases, the quality of the supporting evidence was very low.
Overall, because of very low quality evidence, there is uncertainty in the effect of kidney or liver transplantation in HIV+ persons with end stage organ failure compared with those not infected with HIV. Examining the economics of this issue, the costs of kidney and liver transplants in an HIV+ patient population are, on average, 56K and 147K per case, respectively, based on both Canadian and American experiences.
PMCID: PMC3377507  PMID: 23074407
15.  Lung Transplantation 
Lung transplantation is the only definitive therapy for many forms of end-stage lung diseases. However, the success of lung transplantation is limited by many factors: (1) Too few lungs available for transplantation due to limited donors or injury to the donor lung; (2) current methods of preservation of excised lungs do not allow extended periods of time between procurement and implantation; (3) acute graft failure is more common with lungs than other solid organs, thus contributing to poorer short-term survival after lung transplant compared with that for recipients of other organs; (4) lung transplant recipients are particularly vulnerable to pulmonary infections; and (5) chronic allograft dysfunction, manifest by bronchiolitis obliterans syndrome, is frequent and limits long-term survival. Scientific advances may provide significant improvements in the outcome of lung transplantation. The National Heart, Lung, and Blood Institute convened a working group of investigators on June 14–15, 2004, in Bethesda, Maryland, to identify opportunities for scientific advancement in lung transplantation, including basic and clinical research. This workshop provides a framework to identify critical issues related to clinical lung transplantation, and to delineate important areas for productive scientific investigation.
doi:10.1164/rccm.200501-098WS
PMCID: PMC2718411  PMID: 16020804
allograft dysfunction; infection; ischemia-reperfusion injury; lung transplantation; obliterative bronchiolitis; rejection
16.  A Nonhuman Primate Model of Lung Regeneration: Detergent-Mediated Decellularization and Initial In Vitro Recellularization with Mesenchymal Stem Cells 
Tissue Engineering. Part A  2012;18(23-24):2437-2452.
Currently, patients with end-stage lung disease are limited to lung transplantation as their only treatment option. Unfortunately, the lungs available for transplantation are few. Moreover, transplant recipients require life-long immune suppression to tolerate the transplanted lung. A promising alternative therapeutic strategy is decellularization of whole lungs, which permits the isolation of an intact scaffold comprised of innate extracellular matrix (ECM) that can theoretically be recellularized with autologous stem or progenitor cells to yield a functional lung. Nonhuman primates (NHP) provide a highly relevant preclinical model with which to assess the feasibility of recellularized lung scaffolds for human lung transplantation. Our laboratory has successfully accomplished lung decellularization and initial stem cell inoculation of the resulting ECM scaffold in an NHP model. Decellularization of normal adult rhesus macaque lungs as well as the biology of the resulting acellular matrix have been extensively characterized. Acellular NHP matrices retained the anatomical and ultrastructural properties of native lungs with minimal effect on the content, organization, and appearance of ECM components, including collagen types I and IV, laminin, fibronectin, and sulfated glycosaminoglycans (GAG), due to decellularization. Proteomics analysis showed enrichment of ECM proteins in total tissue extracts due to the removal of cells and cellular proteins by decellularization. Cellular DNA was effectively removed after decellularization (∼92% reduction), and the remaining nuclear material was found to be highly disorganized, very-low-molecular-weight fragments. Both bone marrow- and adipose-derived mesenchymal stem cells (MSC) attach to the decellularized lung matrix and can be maintained within this environment in vitro, suggesting that these cells may be promising candidates and useful tools for lung regeneration. Analysis of decellularized lung slice cultures to which MSC were seeded showed that the cells attached to the decellularized matrix, elongated, and proliferated in culture. Future investigations will focus on optimizing the recellularization of NHP lung scaffolds toward the goal of regenerating pulmonary tissue. Bringing this technology to eventual human clinical application will provide patients with an alternative therapeutic strategy as well as significantly reduce the demand for transplantable organs and patient wait-list time.
doi:10.1089/ten.tea.2011.0594
PMCID: PMC3501118  PMID: 22764775
17.  Short tandem repeat sequences in the Mycoplasma genitalium genome and their use in a multilocus genotyping system 
BMC Microbiology  2008;8:130.
Background
Several methods have been reported for strain typing of Mycoplasma genitalium. The value of these methods has never been comparatively assessed. The aims of this study were: 1) to identify new potential genetic markers based on an analysis of short tandem repeat (STR) sequences in the published M. genitalium genome sequence; 2) to apply previously and newly identified markers to a panel of clinical strains in order to determine the optimal combination for an efficient multi-locus genotyping system; 3) to further confirm sexual transmission of M. genitalium using the newly developed system.
Results
We performed a comprehensive analysis of STRs in the genome of the M. genitalium type strain G37 and identified 18 loci containing STRs. In addition to one previously studied locus, MG309, we chose two others, MG307 and MG338, for further study. Based on an analysis of 74 unrelated patient specimens from New Orleans and Scandinavia, the discriminatory indices (DIs) for these three markers were 0.9153, 0.7381 and 0.8730, respectively. Two other previously described markers, including single nucleotide polymorphisms (SNPs) in the rRNA genes (rRNA-SNPs) and SNPs in the MG191 gene (MG191-SNPs) were found to have DIs of 0.5820 and 0.9392, respectively. A combination of MG309-STRs and MG191-SNPs yielded almost perfect discrimination (DI = 0.9894). An additional finding was that the rRNA-SNPs distribution pattern differed significantly between Scandinavia and New Orleans. Finally we applied multi-locus typing to further confirm sexual transmission using specimens from 74 unrelated patients and 31 concurrently infected couples. Analysis of multi-locus genotype profiles using the five variable loci described above revealed 27 of the couples had concordant genotype profiles compared to only four examples of concordance among the 74 unrelated randomly selected patients.
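The discriminatory index reported here is conventionally the Hunter-Gaston index, derived from Simpson's index of diversity: the probability that two strains sampled at random from the panel belong to different types. A minimal sketch with hypothetical type counts (not the study's typing results):

```python
def discriminatory_index(type_counts):
    """Hunter-Gaston discriminatory index for a typing scheme: the
    probability that two randomly sampled strains have different types."""
    n = sum(type_counts)
    return 1 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical distribution of 74 strains across six genotypes at one locus.
print(discriminatory_index([20, 15, 12, 10, 9, 8]))
```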
Conclusion
We propose that a combination of the MG309-STRs and MG191-SNPs is efficient for general epidemiological studies and addition of MG307-STRs and MG338-STRs is potentially useful for sexual network studies of M. genitalium infection. The multi-locus typing analysis of 74 unrelated M. genitalium-infected individuals and 31 infected couples adds to the evidence that M. genitalium is sexually transmitted.
doi:10.1186/1471-2180-8-130
PMCID: PMC2515158  PMID: 18664269
18.  Genetic Polymorphism of Interferon Regulatory Factor 5 (IRF5) Correlates with Allograft Acute Rejection of Liver Transplantation 
PLoS ONE  2014;9(4):e94426.
Background
Although liver transplantation is one of the most efficient curative therapies for end stage liver diseases, recipients may suffer liver graft loss post-operatively. IRF-5, a member of the Interferon Regulatory Factor family, functions as a key regulator in the TLR4 cascade and is capable of inducing inflammatory cytokines. Although TLR4 has been proved to contribute to acute allograft rejection, including after liver transplantation, the correlation between the IRF5 gene and acute rejection has not yet been elucidated.
Methods
The study enrolled a total of 289 recipients, including 39 females and 250 males, and 39 recipients developed acute allograft rejection within 6 months post-transplantation. The allograft rejections were diagnosed by liver biopsies. Genomic DNA of recipients was extracted from pre-operative peripheral blood. Genotyping of IRF-5, including rs3757385, rs752637 and rs11761199, was performed, followed by SNP frequency and Hardy-Weinberg equilibrium analysis.
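A minimal illustration of a Hardy-Weinberg equilibrium check of the kind mentioned above: a chi-square goodness-of-fit test comparing observed genotype counts with counts expected from the estimated allele frequencies. The counts below are hypothetical, not the study's genotyping results.

```python
from scipy.stats import chi2

def hardy_weinberg_chisq(n_major_hom, n_het, n_minor_hom):
    """Chi-square test of observed genotype counts against Hardy-Weinberg
    expectations (1 degree of freedom)."""
    n = n_major_hom + n_het + n_minor_hom
    p = (2 * n_major_hom + n_het) / (2 * n)   # major allele frequency
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_major_hom, n_het, n_minor_hom]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return stat, chi2.sf(stat, df=1)

# Hypothetical genotype counts for one SNP among 289 recipients.
print(hardy_weinberg_chisq(150, 110, 29))
```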
Results
The genetic polymorphism of rs3757385 was found to be associated with acute rejection. G/G homozygous individuals were at higher risk of acute rejection, with a P value of 0.042 (OR = 2.34 (1.07–5.10)).
Conclusions
IRF5, which transcriptionally activates inflammatory cytokines, is genetically associated with acute rejection and might function as a risk factor for acute rejection of liver transplantations.
doi:10.1371/journal.pone.0094426
PMCID: PMC4005731  PMID: 24788560
19.  Integrative Genomic Analyses Identify BRF2 as a Novel Lineage-Specific Oncogene in Lung Squamous Cell Carcinoma 
PLoS Medicine  2010;7(7):e1000315.
William Lockwood and colleagues show that the focal amplification of a gene, BRF2, on Chromosome 8p12 plays a key role in squamous cell carcinoma of the lung.
Background
Traditionally, non-small cell lung cancer is treated as a single disease entity in terms of systemic therapy. Emerging evidence suggests the major subtypes—adenocarcinoma (AC) and squamous cell carcinoma (SqCC)—respond differently to therapy. Identification of the molecular differences between these tumor types will have a significant impact in designing novel therapies that can improve the treatment outcome.
Methods and Findings
We used an integrative genomics approach, combining high-resolution comparative genomic hybridization and gene expression microarray profiles, to compare AC and SqCC tumors in order to uncover alterations at the DNA level, with corresponding gene transcription changes, which are selected for during development of lung cancer subtypes. Through the analysis of multiple independent cohorts of clinical tumor samples (>330), normal lung tissues and bronchial epithelial cells obtained by bronchial brushing in smokers without lung cancer, we identified the overexpression of BRF2, a gene on Chromosome 8p12, which is specific for development of SqCC of lung. Genetic activation of BRF2, which encodes an RNA polymerase III (Pol III) transcription initiation factor, was found to be associated with increased expression of small nuclear RNAs (snRNAs) that are involved in processes essential for cell growth, such as RNA splicing. Ectopic expression of BRF2 in human bronchial epithelial cells induced a transformed phenotype and demonstrated downstream oncogenic effects, whereas RNA interference (RNAi)-mediated knockdown suppressed growth and colony formation of SqCC cells overexpressing BRF2, but not AC cells. Frequent activation of BRF2 in >35% of preinvasive bronchial carcinoma in situ lesions, as well as in dysplastic lesions, provides evidence that BRF2 expression is an early event in cancer development of this cell lineage.
Conclusions
This is the first study, to our knowledge, to show that the focal amplification of a gene in Chromosome 8p12 plays a key role in squamous cell lineage specificity of the disease. Our data suggest that genetic activation of BRF2 represents a unique mechanism of SqCC lung tumorigenesis through the increase of Pol III-mediated transcription. It can serve as a marker for lung SqCC and may provide a novel target for therapy.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Lung cancer is the commonest cause of cancer-related death. Every year, 1.3 million people die from this disease, which is mainly caused by smoking. Most cases of lung cancer are “non-small cell lung cancers” (NSCLCs). Like all cancers, NSCLC starts when cells begin to divide uncontrollably and to move round the body (metastasize) because of changes (mutations) in their genes. These mutations are often in “oncogenes,” genes that, when activated, encourage cell division. Oncogenes can be activated by mutations that alter the properties of the proteins they encode or by mutations that increase the amount of protein made from them, such as gene amplification (an increase in the number of copies of a gene). If NSCLC is diagnosed before it has spread from the lungs (stage I disease), it can be surgically removed and many patients with stage I NSCLC survive for more than 5 years after their diagnosis. Unfortunately, in more than half of patients, NSCLC has metastasized before it is diagnosed. This stage IV NSCLC can be treated with chemotherapy (toxic chemicals that kill fast-growing cancer cells) but only 2% of patients with stage IV lung cancer are alive 5 years after diagnosis.
Why Was This Study Done?
Traditionally, NSCLC has been regarded as a single disease in terms of treatment. However, emerging evidence suggests that the two major subtypes of NSCLC—adenocarcinoma and squamous cell carcinoma (SqCC)—respond differently to chemotherapy. Adenocarcinoma and SqCC start in different types of lung cell and experts think that for each cell type in the body, specific combinations of mutations interact with the cell type's own unique characteristics to provide the growth and survival advantage needed for cancer development. If this is true, then identifying the molecular differences between adenocarcinoma and SqCC could provide targets for more effective therapies for these major subtypes of NSCLC. Amplification of a chromosome region called 8p12 is very common in NSCLC, which suggests that an oncogene that drives lung cancer development is present in this chromosome region. In this study, the researchers investigate this possibility by looking for an amplified gene in the 8p12 chromosome region that makes increased amounts of protein in lung SqCC but not in lung adenocarcinoma.
What Did the Researchers Do and Find?
The researchers used a technique called comparative genomic hybridization to show that focal regions of Chromosome 8p are amplified in about 40% of lung SqCCs, but that DNA loss in this region is the most common alteration in lung adenocarcinomas. Ten genes in the 8p12 chromosome region were expressed at higher levels in the SqCC samples that they examined than in adenocarcinoma samples, they report, and overexpression of five of these genes correlated with amplification of the 8p12 region in the SqCC samples. Only one of the genes—BRF2—was more highly expressed in squamous carcinoma cells than in normal bronchial epithelial cells (the cell type that lines the tubes that take air into the lungs and from which SqCC develops). Artificially induced expression of BRF2 in bronchial epithelial cells made these normal cells behave like tumor cells, whereas reduction of BRF2 expression in squamous carcinoma cells made them behave more like normal bronchial epithelial cells. Finally, BRF2 was frequently activated in two early stages of squamous cell carcinoma—bronchial carcinoma in situ and dysplastic lesions.
What Do These Findings Mean?
Together, these findings show that the focal amplification of chromosome region 8p12 plays a role in the development of lung SqCC but not in the development of lung adenocarcinoma, the other major subtype of NSCLC. These findings identify BRF2 (which encodes an RNA polymerase III transcription initiation factor, a protein that is required for the synthesis of RNA molecules that help to control cell growth) as a lung SqCC-specific oncogene and uncover a unique mechanism for lung SqCC development. Most importantly, these findings suggest that genetic activation of BRF2 could be used as a marker for lung SqCC, which might facilitate the early detection of this type of NSCLC, and that BRF2 might provide a new target for therapy.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000315.
The US National Cancer Institute provides detailed information for patients and professionals about all aspects of lung cancer, including information on non-small cell carcinoma (in English and Spanish)
Cancer Research UK also provides information about lung cancer and information on how cancer starts
MedlinePlus has links to other resources about lung cancer (in English and Spanish)
doi:10.1371/journal.pmed.1000315
PMCID: PMC2910599  PMID: 20668658
20.  Lung Cancer Susceptibility and hOGG1 Ser326Cys Polymorphism: A Meta-Analysis 
Cancers  2010;2(4):1813-1829.
Recent lung cancer studies have focused on identifying the effects of single nucleotide polymorphisms (SNPs) in candidate genes, among which DNA repair genes are increasingly being studied. Genetic variations in DNA repair genes are thought to modulate DNA repair capacity and are suggested to be related to lung cancer risk. In this study, we assessed reported studies of the association between the human 8-oxoguanine DNA glycosylase 1 (hOGG1) Ser326Cys polymorphism and lung cancer. We conducted MEDLINE, Current Contents and Web of Science searches using "hOGG1", "lung cancer" and "polymorphism" as keywords to search for papers published from January 1995 through August 2010. Data were combined using both fixed-effects (the inverse variance-weighted method) and random-effects (DerSimonian and Laird method) models. The Cochran Q test was used for the assessment of heterogeneity. Publication bias was assessed by both Begg’s and Egger’s tests. We identified 20 case-control studies in 21 different ethnic populations. As two studies were not in Hardy-Weinberg equilibrium, 18 case-control studies in 19 different ethnic populations (7,792 cases and 9,358 controls) were included in our meta-analysis. Summary frequencies of the Cys allele among Caucasians and Asians based on the random effects model were 20.9% (95% confidence interval (CI) = 18.9–22.9) and 46.1% (95% CI = 40.2–52.0), respectively. The distribution of the Cys allele was significantly different between Asians and Caucasians (P < 0.001). The Cys/Cys genotype was significantly associated with lung cancer risk in Asian populations (odds ratio = 1.27, 95% CI = 1.09–1.48) but not in Caucasian populations. This ethnic difference in lung cancer risk may be due to environmental factors such as cigarette smoking and dietary factors. Although the summary risk for developing lung cancer may not be large, lung cancer is such a common malignancy that even a small increase in risk can translate to a large number of excess lung cancer cases. As lung cancer is a multifactorial disease, further investigations of the gene-gene and gene-environment interactions on the hOGG1 polymorphism-associated lung cancer risk may help to better understand the molecular pathogenesis of human lung cancer.
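As a compact illustration of the two pooling approaches named in this abstract, the sketch below combines hypothetical per-study log odds ratios first with inverse-variance (fixed-effect) weights and then with DerSimonian-Laird (random-effects) weights; the numbers are invented and are not the meta-analysis data.

```python
import numpy as np

# Hypothetical per-study log odds ratios and their standard errors.
log_or = np.array([0.30, 0.10, 0.45, 0.20])
se = np.array([0.15, 0.20, 0.25, 0.18])

w = 1 / se**2                                    # inverse-variance weights
fixed = np.sum(w * log_or) / np.sum(w)           # fixed-effect pooled log OR

# DerSimonian-Laird estimate of the between-study variance (tau^2).
q = np.sum(w * (log_or - fixed) ** 2)            # Cochran's Q statistic
tau2 = max(0.0, (q - (len(log_or) - 1)) / (w.sum() - (w**2).sum() / w.sum()))

w_re = 1 / (se**2 + tau2)                        # random-effects weights
random_eff = np.sum(w_re * log_or) / np.sum(w_re)

print(np.exp(fixed), np.exp(random_eff))         # pooled odds ratios
```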
doi:10.3390/cancers2041813
PMCID: PMC3840447  PMID: 24281202
epidemiology; genetic polymorphism; lung cancer; meta-analysis; human 8-oxoguanine DNA glycosylase 1 (hOGG1)
21.  HIV and liver transplantation: The British Columbia experience, 2004 to 2013 
Historically, HIV-positive individuals have not been considered to be candidates for liver transplantation due to the need for further immunosuppression of these patients post-transplant, as well as other factors such as pharmacokinetic interactions between the necessary antiretroviral and immunosuppressant drugs. However, HIV-positive individuals with end-stage liver disease are now eligible for liver transplantation in British Columbia. The purpose of this study was to summarize the outcomes of HIV-positive individuals referred for liver transplantation in British Columbia.
BACKGROUND:
The demand for definitive management of end-stage organ disease in HIV-infected Canadians is growing. Until recently, despite international evidence of good clinical outcomes, HIV-infected Canadians with end-stage liver disease were ineligible for transplantation, except in British Columbia (BC), where the liver transplant program of BC Transplant has accepted these patients for referral, assessment, listing and provision of liver allograft. There is a need to evaluate the experience in BC to determine the issues surrounding liver transplantation in HIV-infected patients.
METHODS:
The present study was a chart review of 28 HIV-infected patients who were referred to BC Transplant for liver transplantation between 2004 and 2013. Data regarding HIV and liver disease status, initial transplant assessment and clinical outcomes were collected.
RESULTS:
Most patients were BC residents and were assessed by the multidisciplinary team at the BC clinic. The majority had undetectable HIV viral loads, were receiving antiretroviral treatments and were infected with hepatitis C virus (n=16). The most common comorbidities were anxiety and mood disorders (n=4), and hemophilia (n=4). Of the patients eligible for transplantation, four were transplanted for autoimmune hepatitis (5.67 years post-transplant), nonalcoholic steatohepatitis (2.33 years), hepatitis C virus (2.25 years) and hepatitis B-delta virus coinfection (recent transplant). One patient died from acute renal failure while waiting for transplantation. Ten patients died during preassessment and 10 were unsuitable transplant candidates. The most common reason for unsuitability was stable disease not requiring transplantation (n=4).
CONCLUSIONS:
To date, interdisciplinary care and careful selection of patients have resulted in successful outcomes including the longest living HIV-infected post-liver transplant recipient in Canada.
PMCID: PMC4173979  PMID: 25285113
Hepatitis; HIV; Liver; Transplant
22.  Genetic variants associated with idiopathic pulmonary fibrosis susceptibility and mortality: a genome-wide association study 
The lancet. Respiratory medicine  2013;1(4):309-317.
Summary
Background
Idiopathic pulmonary fibrosis (IPF) is a devastating disease that probably involves several genetic loci. Several rare genetic variants and one common single nucleotide polymorphism (SNP) of MUC5B have been associated with the disease. Our aim was to identify additional common variants associated with susceptibility and ultimately mortality in IPF.
Methods
First, we did a three-stage genome-wide association study (GWAS): stage one was a discovery GWAS, and stages two and three were independent case-control studies. DNA samples from European-American patients with IPF meeting standard criteria were obtained from several US centres for each stage. Data for European-American control individuals for stage one were gathered from the database of genotypes and phenotypes; additional control individuals were recruited at the University of Pittsburgh to increase the number. For controls in stages two and three, we gathered data for additional sex-matched European-American control individuals who had been recruited in another study. DNA samples from patients and from control individuals were genotyped to identify SNPs associated with IPF. SNPs identified in stage one were carried forward to stage two, and those that achieved genome-wide significance (p<5 × 10−8) in a meta-analysis were carried forward to stage three. Three case series were selected from stages one and two of the GWAS using samples with follow-up data. Mortality analyses were done in these case series to assess the SNPs associated with IPF that had achieved genome-wide significance in the meta-analysis of stages one and two. Finally, we obtained gene-expression profiling data for lungs of patients with IPF from the Lung Genomics Research Consortium and analysed correlation with SNP genotypes.
Findings
In stage one of the GWAS (542 patients with IPF and 542 control individuals matched one-to-one to cases by genetic ancestry estimates), we identified 20 loci. Six SNPs reached genome-wide significance in stage two (544 patients, 687 control individuals): three TOLLIP SNPs (rs111521887, rs5743894, rs5743890) and one MUC5B SNP (rs35705950) at 11p15.5; one MDGA2 SNP (rs7144383) at 14q21.3; and one SPPL2C SNP (rs17690703) at 17q21.31. Stage three (324 patients, 702 control individuals) confirmed the associations for all of these SNPs except rs7144383. Linkage disequilibrium between the MUC5B SNP (rs35705950) and the TOLLIP SNPs (rs111521887 [r²=0.07], rs5743894 [r²=0.16], and rs5743890 [r²=0.01]) was low. A total of 683 patients from the GWAS were included in the mortality analysis. Individuals who developed IPF despite carrying the protective TOLLIP minor allele of rs5743890 had an increased mortality risk (meta-analysis with fixed-effect model: hazard ratio 1.72 [95% CI 1.24–2.38]; p=0.0012). TOLLIP expression was decreased by 20% in individuals carrying the minor allele of rs5743890 (p=0.097), by 40% in those with the minor allele of rs111521887 (p=3.0 × 10⁻⁴), and by 50% in those with the minor allele of rs5743894 (p=2.93 × 10⁻⁵), compared with homozygous carriers of the common alleles of these SNPs.
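The r² values quoted for linkage disequilibrium between the MUC5B and TOLLIP SNPs can be read as squared correlations between allele counts. The abstract does not state how LD was computed; as an illustrative sketch only, r² is often estimated from unphased 0/1/2 genotype dosages as the squared Pearson correlation, as shown below with invented genotype vectors.

import numpy as np

def ld_r2(dosage_a, dosage_b):
    # r^2 as the squared Pearson correlation of 0/1/2 allele dosages
    # (a common composite estimate of linkage disequilibrium for unphased data).
    a = np.asarray(dosage_a, dtype=float)
    b = np.asarray(dosage_b, dtype=float)
    r = np.corrcoef(a, b)[0, 1]
    return r ** 2

# Invented dosage vectors for two SNPs typed in the same ten individuals
snp_x = [0, 1, 0, 0, 2, 1, 0, 0, 1, 0]
snp_y = [0, 0, 1, 0, 0, 1, 0, 1, 0, 0]
print(round(ld_r2(snp_x, snp_y), 3))  # values near 0 indicate low LD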
Interpretation
Novel variants in TOLLIP and SPPL2C are associated with IPF susceptibility. One novel variant of TOLLIP, rs5743890, is also associated with mortality. These associations and the reduced expression of TOLLIP in patients with IPF who carry TOLLIP SNPs emphasise the importance of this gene in the disease.
Funding
National Institutes of Health; National Heart, Lung, and Blood Institute; Pulmonary Fibrosis Foundation; Coalition for Pulmonary Fibrosis; and Instituto de Salud Carlos III.
doi:10.1016/S2213-2600(13)70045-6
PMCID: PMC3894577  PMID: 24429156
23.  Bridge to lung transplantation and rescue post-transplant: the expanding role of extracorporeal membrane oxygenation 
Journal of Thoracic Disease  2014;6(8):1070-1079.
Over the last several decades, the growth of lung transplantation has been hindered by a demand for donor lungs that far exceeds supply, leading to considerable waiting times and mortality among patients awaiting transplant. This has prompted the search for alternative bridging strategies in patients with end-stage lung disease. The use of extracorporeal membrane oxygenation (ECMO) as a bridge to lung transplantation, and as a rescue strategy for primary graft dysfunction (PGD) after transplant, has been studied previously; however, because of initially poor outcomes, it was not widely adopted. In recent years, with significant improvements in the technology, several single- and multi-center studies have reported promising outcomes with ECMO both as a bridging strategy and as a therapy for PGD post-transplant. These results have challenged current thinking on ECMO use and prompted a re-examination of its utility, efficacy and safety in conjunction with lung transplantation. In this review, we address the various aspects of ECMO use as a bridge to lung transplantation and as a post-transplant rescue therapy for PGD. We emphasize newer ECMO technologies, examine recent observational studies and randomized trials of ECMO use before and after lung transplantation, and reflect on our own institutional experience with ECMO in these difficult clinical situations.
doi:10.3978/j.issn.2072-1439.2014.06.04
PMCID: PMC4133542  PMID: 25132974
Lung transplantation; extracorporeal membrane oxygenation (ECMO); primary graft dysfunction (PGD)
24.  Impact of Commonly Used Transplant Immunosuppressive Drugs on Human NK Cell Function Is Dependent upon Stimulation Condition 
PLoS ONE  2013;8(3):e60144.
Lung transplantation is a recognised treatment for patients with end-stage pulmonary disease. Transplant recipients receive life-long administration of immunosuppressive drugs that target T cell-mediated graft rejection. However, little is known about the impact of these drugs on NK cells, which have the potential to be alloreactive in response to HLA-mismatched ligands on the lung allograft and may thereby negatively affect allograft survival. NK cells from 20 healthy controls were assessed in response to Cyclosporine A, Mycophenolic acid (MPA; the active form of Mycophenolate mofetil) and Prednisolone over a range of concentrations. The impact of these clinically used immunosuppressive drugs on cytotoxicity (measured by CD107a expression), IFN-γ production and proliferation (measured by CFSE dilution) was assessed in response to various stimuli, including MHC class I-negative cell lines, IL-2/IL-12 cytokines and PMA/Ionomycin. Treatment with MPA and Prednisolone significantly reduced CD107a expression in response to cell line stimulation, whereas addition of MPA and Cyclosporine A reduced CD107a expression and IFN-γ production following PMA/Ionomycin stimulation. Proliferation was diminished by treatment with each drug. Additional functional inhibitors (LY294002, PD98059, Rottlerin, Rapamycin) were used to elucidate the intracellular pathways of NK cell activation in response to stimulation with K562 cells or PMA-I. CD107a expression decreased significantly with the addition of PD98059 following K562 stimulation, and with the addition of LY294002, PD98059 or Rottlerin following PMA-I stimulation. Ten lung transplant patients, none of whom received immunosuppressive drugs pre-transplant, were assessed for longitudinal changes in NK cell function post-transplant in relation to the administration of immunosuppressive drugs; individual patients showed different longitudinal patterns of NK cell function. These results provide mechanistic insights into the pathways of NK cell activation and show that commonly administered transplant immunosuppressive agents and clinical rejection/infection events have differential effects on NK cell function, which may influence the immune response following lung transplantation.
doi:10.1371/journal.pone.0060144
PMCID: PMC3605368  PMID: 23555904
25.  Discordance between Mycobacterial Interspersed Repetitive-Unit-Variable-Number Tandem-Repeat Typing and IS6110 Restriction Fragment Length Polymorphism Genotyping for Analysis of Mycobacterium tuberculosis Beijing Strains in a Setting of High Incidence of Tuberculosis
Journal of Clinical Microbiology  2008;46(10):3338-3345.
IS6110 restriction fragment length polymorphism (RFLP) genotyping is the most widely used genotyping method for studying the epidemiology of Mycobacterium tuberculosis. However, because of the complexity of the IS6110 RFLP technique and of the interpretation of RFLP data, mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) genotyping has been proposed as the new genotyping standard. This study aimed to determine the discriminatory power of different MIRU-VNTR locus combinations relative to IS6110 RFLP genotyping, using a collection of Beijing genotype M. tuberculosis strains with a well-established phylogenetic history. Clustering, diversity index, clustering concordance, concordance among unique genotypes, and divergent and convergent evolution were calculated for seven combinations of 27 different MIRU-VNTR loci and compared with IS6110 RFLP results. Our results confirmed previous findings that MIRU-VNTR genotyping can be used to estimate the extent of recent or ongoing transmission. However, molecular epidemiological linking of cases varied significantly depending on the genotyping method used. We conclude that IS6110 RFLP and MIRU-VNTR loci evolve independently and at different rates, which leads to discordance between the transmission chains predicted by the respective genotyping methods. Concordance between the two methods could be improved by incorporating genetic distance (GD) into the clustering formulae for some MIRU-VNTR locus combinations. In summary, our findings differ from previous reports; this may be explained by the fact that in settings of low tuberculosis incidence the genetic distance between epidemiologically unrelated isolates is sufficient to define a strain with either marker, whereas in settings of high incidence the continuous evolution and persistence of strains reveal the weaknesses inherent in these markers.
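The diversity index typically reported when comparing typing schemes such as MIRU-VNTR and IS6110 RFLP is the Hunter-Gaston discriminatory index, D = 1 − [1/(N(N−1))] Σ nj(nj−1), where N is the number of isolates and nj is the number of isolates sharing the j-th genotype. Whether this exact index was used here is not stated in the abstract; the Python sketch below simply shows the calculation on invented MIRU-VNTR profiles.

from collections import Counter

def hunter_gaston_di(genotypes):
    # D = 1 - [1 / (N(N-1))] * sum over genotypes of n_j(n_j - 1)
    counts = Counter(genotypes)
    n = len(genotypes)
    if n < 2:
        raise ValueError("need at least two isolates")
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Invented MIRU-VNTR profiles for ten isolates (each string is one genotype)
profiles = ["223325", "223325", "223326", "233325", "223325",
            "224325", "223326", "233325", "223327", "223328"]
print(round(hunter_gaston_di(profiles), 3))  # higher D = better discrimination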
doi:10.1128/JCM.00770-08
PMCID: PMC2566133  PMID: 18716230
