Results 1-25 (34)
1.  Predictors of Donor Follow-up after Living Donor Liver Transplantation 
Donor safety in living liver donation is of paramount importance; however, information on long-term outcomes is limited by incomplete follow-up. We sought to ascertain factors that predict post-donation follow-up in 456 living liver donors in the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL). Completed donor follow-up was defined as physical, phone, or laboratory contact at a given time point. Univariate and multivariable mixed effects logistic regression models were developed to predict completed follow-up using donor and recipient demographic and clinical data and donor quality of life data. 90% of donors completed follow-up in the first three months and 83% at year 1; completed follow-up ranged from 57% to 72% in years 2–7 and from 41% to 56% in years 8–10. The probability of completed follow-up in the first year was higher for white donors (odds ratio (OR)=3.27, 95% confidence interval (CI)=1.25–8.58), but lower for donors whose recipients had hepatitis C virus or hepatocellular carcinoma (OR=0.34, 95% CI=0.17–0.69). After the first year, older age at donation predicted more complete follow-up. There were significant center differences at all time points (OR range 0.29–10.11), with center variability both in return for in-center visits and in the use of phone/long-distance visits. Donor follow-up in the first year post-donation was excellent but decreased with time. Predictors of follow-up varied based on the time since donation. Adapting center best practices, enhanced by using telephone and social media to maintain contact with donors, represents a significant opportunity to gain valuable information about long-term donor outcomes.
doi:10.1002/lt.23912
PMCID: PMC4117821  PMID: 24824858
clinical practice; donor outcomes; donation; follow-up; quality of life
2.  Recovery Time, Quality of Life, and Mortality in Hemodialysis Patients: The Dialysis Outcomes and Practice Patterns Study (DOPPS) 
Background
There is limited information about the clinical and prognostic significance of patient-reported recovery time.
Study Design
Prospective cohort study.
Setting & Participants
6,040 patients in the DOPPS.
Predictor
Answer to question, “How long does it take you to recover from a dialysis session?” categorized as follows: <2, 2–6, 7–12, or >12 hours.
Outcomes & Measurements
Cross-sectional and longitudinal associations between recovery time and patient characteristics, hemodialysis treatment variables, health-related quality of life (HRQoL) and hospitalization and mortality.
Results
32% reported recovery time <2 hours; 41%, 2–6 hours; 17%, 7–12 hours; and 10%, >12 hours. Using proportional odds (ordinal) logistic regression, shorter recovery time was associated with male sex, full-time employment, and higher serum albumin. Longer recovery time was associated with older age, longer dialysis vintage, higher body mass index, diabetes, and psychiatric disorder. Greater intradialytic weight loss, longer dialysis session length, and lower dialysate sodium concentration were also associated with longer recovery time. In facilities that used a uniform dialysate sodium concentration for ≥90% of patients, the adjusted OR of longer recovery time, comparing dialysate sodium concentration <140 vs 140 mEq/L, was 1.72 (95% CI, 1.37–2.16). Recovery time was positively correlated with symptoms of kidney failure and kidney disease burden score, and inversely correlated with HRQoL mental and physical component summary scores. Using Cox regression, adjusting for potential confounders not influenced by recovery time, recovery time was positively associated with first hospitalization and mortality (adjusted HRs for recovery time >12 vs. 2–6 hours of 1.22 [95% CI, 1.09–1.37] and 1.47 [95% CI, 1.19–1.83], respectively).
Limitations
Answers are subjective and not supported by physiological measurements.
Conclusions
Recovery time can be used to identify patients with poorer HRQoL and higher risks of hospitalization and mortality. Interventions to reduce recovery time and possibly to improve clinical outcomes, such as increasing dialysate sodium concentration, need to be tested in randomized trials.
doi:10.1053/j.ajkd.2014.01.014
PMCID: PMC4069238  PMID: 24529994
hemodialysis; DOPPS; quality of life; patient reported; outcomes
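The study's predictor is the self-reported recovery time bucketed into four categories (<2, 2–6, 7–12, >12 hours). A minimal sketch of that categorization; how responses falling between categories (e.g., 6.5 hours) were assigned is not stated in the abstract, so the boundary handling here is an assumption:

```python
def categorize_recovery_time(hours: float) -> str:
    """Bucket self-reported post-dialysis recovery time (in hours)
    into the four DOPPS categories used in the study.
    Boundary handling for in-between values is an assumption."""
    if hours < 2:
        return "<2"
    elif hours <= 6:
        return "2-6"
    elif hours <= 12:
        return "7-12"
    else:
        return ">12"
```

An ordinal (proportional odds) model then treats these four categories as an ordered outcome.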
3.  Visual Field Improvement in the Collaborative Initial Glaucoma Treatment Study 
American Journal of Ophthalmology  2014;158(1):96-104.e2.
Purpose
To evaluate critically visual field (VF) improvement in participants in the Collaborative Initial Glaucoma Treatment Study (CIGTS).
Design
Prospective, comparative case series from a randomized clinical trial comparing trabeculectomy and topical medications in treating open-angle glaucoma (OAG).
Methods
607 subjects with newly-diagnosed OAG were identified for study. Baseline and follow-up VF tests were obtained and mean deviation (MD) change from baseline over follow-up was analyzed. Clinically substantial change (loss or improvement) was defined as change from baseline of ≥3 decibels in MD. Baseline factors were inspected to determine their association with VF improvement in repeated measures regression models.
Results
The percentage of participants showing substantial VF improvement over time was similar to that showing VF loss through five years after initial treatment, after which VF loss became more frequent. Measures of better intraocular pressure (IOP) control during treatment were significantly predictive of VF improvement, including a lower mean IOP, a lower minimum IOP, and lower sustained levels of IOP over follow-up. Other predictive factors included female sex [odds ratio (OR)=1.73], visits one year prior to cataract extraction (OR=0.11), and an interaction between treatment and baseline MD wherein surgically treated subjects with worse baseline VF loss were more likely to show VF improvement.
Conclusions
In the CIGTS, substantial VF loss and improvement were comparable through five years of follow-up, after which VF loss became more frequent. Predictive factors for VF improvement included several indicators of better IOP control, which supports the postulate that VF improvement was real.
doi:10.1016/j.ajo.2014.04.003
PMCID: PMC4190842  PMID: 24727262
4.  Impact of prior admissions on 30-day readmissions in Medicare heart failure inpatients 
Mayo Clinic Proceedings  2014;89(5):623-630.
Objectives
To determine how all-cause hospitalizations within 12 months preceding an index heart failure (HF) hospitalization affect risk stratification for 30-day all-cause readmission.
Patients and Methods
Early readmission of HF inpatients is challenging to predict, yet this outcome is used to compare hospital performance and guide reimbursement. Most risk models do not consider the potentially important variable of prior admissions. We analyzed Medicare HF inpatients aged ≥66 years admitted to 14 Michigan community hospitals between October 2002 and June 2004. Clinical data were obtained from admission charts, hospitalization dates from Centers for Medicare & Medicaid Services (CMS) claims, and mortality dates from the Social Security Death Index. We used mixed-effects logistic regression and reclassification indices to evaluate the ability of a CMS chart-based readmission risk model, prior admissions, and their combination to predict 30-day readmission in survivors of the index HF hospitalization.
Results
Of 1,807 patients, 43 (2.4%) died during the index admission; 476 of 1,764 survivors (27%) were readmitted ≤30 days after discharge. Adjusted for the CMS readmission model, prior admissions significantly increased the odds of 30-day readmission (1 vs. 0: OR=4.67, 95% CI 3.37–6.46; ≥2 vs. 0: OR=6.49, 95% CI 4.93–8.55; both p<.001), improved model discrimination (c-statistic 0.61 to 0.74, p<.001), and reclassified many patients (net reclassification index 0.40; integrated discrimination index 0.12).
Conclusions
In Medicare HF inpatients, prior all-cause admissions strongly increase all-cause readmission risk and markedly improve risk stratification for 30-day readmission.
doi:10.1016/j.mayocp.2013.12.018
PMCID: PMC4017659  PMID: 24684780
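The net reclassification index reported above (0.40) summarizes how adding prior admissions moves patients between risk categories: upward moves count as correct for patients who were readmitted and incorrect for those who were not. A minimal sketch of the standard categorical NRI; the function name and category encoding are illustrative, not from the paper:

```python
def net_reclassification_index(old_cat, new_cat, event):
    """Categorical NRI = P(up|event) - P(down|event)
                       + P(down|no event) - P(up|no event).
    old_cat/new_cat: per-patient risk-category index (higher = riskier)
    under the old and new model; event: True if readmitted."""
    up_e = down_e = n_e = 0
    up_ne = down_ne = n_ne = 0
    for old, new, e in zip(old_cat, new_cat, event):
        if e:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    return (up_e - down_e) / n_e + (down_ne - up_ne) / n_ne
```

A positive NRI indicates that, on balance, the new model moves readmitted patients up and non-readmitted patients down in risk category.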
5.  Accommodating Measurements Below a Limit of Detection: A Novel Application of Cox Regression 
American Journal of Epidemiology  2014;179(8):1018-1024.
In environmental epidemiology, measurements of exposure biomarkers often fall below the assay's limit of detection. Existing methods for handling this problem, including deletion, substitution, parametric regression, and multiple imputation, can perform poorly if the proportion of “nondetects” is high or parametric models are misspecified. We propose an approach that treats the measured analyte as the modeled outcome, implying a role reversal when the analyte is a putative cause of a health outcome. Following a scale reversal as well, our approach uses Cox regression to model the analyte, with confounder adjustment. The method makes full use of quantifiable analyte measures, while appropriately treating nondetects as censored. Under the proportional hazards assumption, the hazard ratio for a binary health outcome is interpretable as an adjusted odds ratio: the odds for the outcome at any particular analyte concentration divided by the odds given a lower concentration. Our approach is broadly applicable to cohort studies, case-control studies (frequency matched or not), and cross-sectional studies conducted to identify determinants of exposure. We illustrate the method with cross-sectional survey data to assess sex as a determinant of 2,3,7,8-tetrachlorodibenzo-p-dioxin concentration and with prospective cohort data to assess the association between 2,4,4′-trichlorobiphenyl exposure and psychomotor development.
doi:10.1093/aje/kwu017
PMCID: PMC3966718  PMID: 24671072
2,3,7,8-tetrachlorodibenzo-p-dioxin; 2,4,4′-trichlorobiphenyl; hazard identification; limit of detection; National Health and Nutrition Examination Survey; nondetects; proportional hazards
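The key step in the approach above is the role and scale reversal: the measured analyte becomes the "survival time" in a Cox model, and nondetects become right-censored observations at the limit of detection. One plausible encoding, assuming nondetects are reported as None and using simple negation for the scale reversal (the paper's exact transformation may differ):

```python
def to_cox_inputs(concentrations, lod):
    """Recast analyte concentrations as Cox-style (time, event) pairs.
    Scale reversal: 'time' = -concentration, so higher concentrations
    are 'earlier'. A nondetect (None) is only known to lie below the
    limit of detection, so it is right-censored at -lod."""
    times, events = [], []
    for c in concentrations:
        if c is None:           # nondetect: conc < lod
            times.append(-lod)
            events.append(0)    # censored
        else:
            times.append(-c)
            events.append(1)    # fully observed measurement
    return times, events
```

The resulting pairs can be fed to any proportional-hazards routine with the health outcome and confounders as covariates, which is how the hazard ratio acquires its adjusted odds-ratio interpretation.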
6.  Estimating Minimally Important Differences for Two Vision-Specific Quality of Life Measures 
Purpose.
To estimate minimally important differences (MIDs) for the Visual Activities Questionnaire (VAQ) and the National Eye Institute-Visual Function Questionnaire (NEI-VFQ).
Methods.
A total of 607 subjects with newly diagnosed open-angle glaucoma (OAG) were enrolled in the Collaborative Initial Glaucoma Treatment Study (CIGTS) and randomized to initial treatment with medications or surgery. Subjects underwent an ophthalmic examination and a telephone-administered quality of life (QOL) interview before randomization and every six months thereafter. The VAQ and NEI-VFQ were used to assess participants' perceptions of their visual function. Clinical measures included the mean deviation (MD) from Humphrey 24-2 full-threshold visual field (VF) testing and best-corrected visual acuity (VA) measured using the Early Treatment Diabetic Retinopathy Study (ETDRS) protocol. Anchor-based (using MD and VA) and distribution-based methods were used to estimate MIDs.
Results.
Anchor-based cross-sectional analyses at 66 months follow-up found a 10-letter increment in better eye VA corresponded to MIDs of 5.2 units for VAQ and 3.8 units for NEI-VFQ total scores. A 3-dB increment in the better eye MD yielded MIDs of 2.6 and 2.3 units for the same two questionnaires. In longitudinal analyses, MIDs for the VAQ were 3.2 units for a 10-letter change of VA and 3.4 units for a 3-dB change in the MD. Distribution-based MIDs were larger.
Conclusions.
A range of MIDs for the VAQ (2.6–6.5 units) and NEI-VFQ (2.3–3.8 units) was found. Although similar in magnitude, MIDs were sensitive to the MID estimation method, the anchor chosen, and differences between questionnaires. (ClinicalTrials.gov number, NCT00000149.)
Minimally important differences (MIDs) were estimated for two vision-specific quality of life measures, VAQ and NEI-VFQ, using anchor-based cross-sectional, anchor-based longitudinal, and distribution-based methods. Subscale MIDs for VAQ (2.6–6.5 units) and NEI-VFQ (2.3–3.8 units) were found.
doi:10.1167/iovs.13-13683
PMCID: PMC4095718  PMID: 24906863
MID; glaucoma; CIGTS; NEI-VFQ; VAQ
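The anchor-based idea above can be reduced to a slope: regress the QOL score on the clinical anchor, then scale the slope by a clinically meaningful anchor increment (10 ETDRS letters of VA, or 3 dB of MD). The study's models were longitudinal and covariate-adjusted; this univariable ordinary-least-squares version is only an illustrative sketch:

```python
def anchor_based_mid(anchor, score, increment):
    """Anchor-based MID sketch: OLS slope of QOL score on the
    clinical anchor, multiplied by a meaningful anchor increment
    (e.g. 10 VA letters or 3 dB of mean deviation)."""
    n = len(anchor)
    mean_x = sum(anchor) / n
    mean_y = sum(score) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(anchor, score))
    sxx = sum((x - mean_x) ** 2 for x in anchor)
    return sxy / sxx * increment
```

With this definition, a steeper score-versus-anchor relationship yields a larger MID for the same anchor increment, which is why the estimates differ between questionnaires and anchors.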
7.  Sex-Specific Differences in Hemodialysis Prevalence and Practices and the Male-to-Female Mortality Rate: The Dialysis Outcomes and Practice Patterns Study (DOPPS) 
PLoS Medicine  2014;11(10):e1001750.
In this study, Port and colleagues describe hemodialysis prevalence and patient characteristics by sex, compare the male-to-female mortality rate with that of the general population, and evaluate sex interactions with mortality. The results show that women's survival advantage was markedly diminished in hemodialysis patients.
Please see later in the article for the Editors' Summary
Background
A comprehensive analysis of sex-specific differences in the characteristics, treatment, and outcomes of individuals with end-stage renal disease undergoing dialysis might reveal treatment inequalities and targets to improve sex-specific patient care. Here we describe hemodialysis prevalence and patient characteristics by sex, compare the adult male-to-female mortality rate with data from the general population, and evaluate sex interactions with mortality.
Methods and Findings
We assessed the Human Mortality Database and 206,374 patients receiving hemodialysis from 12 countries (Australia, Belgium, Canada, France, Germany, Italy, Japan, New Zealand, Spain, Sweden, the UK, and the US) participating in the international, prospective Dialysis Outcomes and Practice Patterns Study (DOPPS) between June 1996 and March 2012. Among 35,964 sampled DOPPS patients with full data collection, we studied patient characteristics (descriptively) and mortality (via Cox regression) by sex. In all age groups, more men than women were on hemodialysis (59% versus 41% overall), with large differences observed between countries. The average estimated glomerular filtration rate at hemodialysis initiation was higher in men than women. The male-to-female mortality rate ratio in the general population varied from 1.5 to 2.6 for age groups <75 y, but in hemodialysis patients was close to one. Compared to women, men were younger (mean ± standard deviation: 61.9±14.6 versus 63.1±14.5 y), less frequently obese, more frequently married and recipients of a kidney transplant, more frequently affected by coronary artery disease, and less frequently depressed. Interaction analyses showed that the mortality risk associated with several comorbidities and hemodialysis catheter use was lower for men (hazard ratio [HR]=1.11) than women (HR=1.33, interaction p<0.001). This study is limited by its inability to establish causality for the observed sex-specific differences, and it does not provide information about patients not treated with dialysis or dying prior to a planned start of dialysis.
Conclusions
Women's survival advantage was markedly diminished in hemodialysis patients. The finding that fewer women than men were being treated with dialysis for end-stage renal disease merits detailed further study, as the large discrepancies in sex-specific hemodialysis prevalence by country and age group are likely explained by factors beyond biology. Modifiable variables, such as catheter use, showing significant sex interactions suggest interventional targeting.
Editors' Summary
Background
Throughout life, the kidneys filter waste products (from the normal breakdown of tissues and from food) and excess water from the blood to make urine. Chronic kidney disease—an increasingly common condition globally—gradually destroys the kidney's filtration units (the nephrons). As the nephrons stop working, the rate at which the blood is filtered (the glomerular filtration rate) decreases, and waste products build up in the blood, eventually leading to life-threatening end-stage kidney (renal) disease. Symptoms of chronic kidney disease, which rarely occur until the disease is advanced, include tiredness, swollen feet and ankles, and frequent urination, particularly at night. Chronic kidney disease cannot be cured, but its progression can be slowed by controlling diabetes and other conditions that contribute to its development. End-stage kidney disease is treated by regular hemodialysis (a process in which blood is cleaned by passing it through a filtration machine) or by kidney transplantation.
Why Was This Study Done?
Like many other long-term conditions, the prevalence (the proportion of the population that has a specific disease) of chronic kidney disease and of end-stage renal disease, and treatment outcomes for these conditions, may differ between men and women. Some of these sex-specific differences may arise because of sex-specific differences in normal biological functions. Other sex-specific differences may be related to sex-specific differences in patient care or in patient awareness of chronic kidney disease. A comprehensive analysis of sex-specific differences among individuals with end-stage renal disease might identify both treatment inequalities and ways to improve sex-specific care. Here, in the Dialysis Outcomes and Practice Patterns Study (DOPPS), the researchers investigate sex-specific differences in the prevalence and practices of hemodialysis and in the characteristics of patients undergoing hemodialysis, and investigate the adult male-to-female mortality (death) rate among patients undergoing hemodialysis. The DOPPS is a prospective cohort study that is investigating the characteristics, treatment, and outcomes of adult patients undergoing hemodialysis in representative facilities in 19 countries (12 countries were available for analysis at the time of the current study).
What Did the Researchers Do and Find?
To investigate sex-specific differences in hemodialysis prevalence, the researchers compared data from the Human Mortality Database, which provides detailed population and mortality data for 37 countries, with data collected by the DOPPS. Forty-one percent of DOPPS patients were women, compared to 52% of the general population in 12 of the DOPPS countries. Next, the researchers used data collected from a randomly selected subgroup of patients to examine sex-specific differences in patient characteristics and mortality. The average estimated glomerular filtration rate at hemodialysis initiation was higher in men than women. Moreover, men were more frequently recipients of a kidney transplant than women. Notably, although in the general population in a given age group women were less likely to die than men, among hemodialysis patients, women were as likely to die as men. Finally, the researchers investigated which patient characteristics were associated with the largest sex-specific differences in mortality risk. The use of a hemodialysis catheter (a tube that is inserted into a patient's vein to transfer their blood into the hemodialysis machine) was associated with a lower mortality risk in men than in women.
What Do These Findings Mean?
These findings show that, among patients treated with hemodialysis for end-stage renal disease, women differ from men in many ways. Although some of these sex-specific differences may be related to biology, others may be related to patient care and to patient awareness of chronic kidney disease. Because this is an observational study, these findings cannot prove that the reported differences in hemodialysis prevalence, treatment, and mortality are actually caused by being a man or a woman. Importantly, however, these findings suggest that hemodialysis may abolish the survival advantage that women have over men in the general population and that fewer women than men are being treated for end-stage renal disease, even though chronic kidney disease is more common in women than in men. Finally, the finding that the use of hemodialysis catheters for access to veins is associated with a higher mortality risk among women than among men suggests that, where possible, women should be offered arteriovenous fistula placement, a surgical procedure that is recommended for vascular access during long-term hemodialysis but that may, in the past, have been underused in women.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001750.
More information about the DOPPS program is available
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease and about hemodialysis, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease and about hemodialysis (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including information and personal stories about hemodialysis
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
MedlinePlus has pages about chronic kidney disease and about hemodialysis
doi:10.1371/journal.pmed.1001750
PMCID: PMC4211675  PMID: 25350533
8.  High density lipoprotein is targeted for oxidation by myeloperoxidase in rheumatoid arthritis 
Annals of the Rheumatic Diseases  2013;72(10):1725-1731.
Objective
Phagocyte-derived myeloperoxidase (MPO) and pro-inflammatory high density lipoprotein (HDL) associate with rheumatoid arthritis (RA), but the link between MPO and HDL has not been systematically examined. In this study we investigated whether MPO can oxidize HDL and determined the MPO-specific oxidative signature of apoA1 by peptide mapping in RA subjects without and with known cardiovascular disease (CVD).
Methods
Two MPO oxidation products, 3-chlorotyrosine and 3-nitrotyrosine, were quantified by tandem mass spectrometry (MS/MS) in in vitro model system studies and in plasma and HDL derived from healthy controls and RA subjects. MPO levels and cholesterol efflux were determined. Site-specific nitration and chlorination of apoA1 peptides were quantified by MS/MS.
Results
RA subjects demonstrated higher levels of MPO, MPO-oxidized HDL, and diminished cholesterol efflux. There was a marked increase in MPO-specific 3-chlorotyrosine and 3-nitrotyrosine content in HDL in RA subjects, consistent with specific targeting of HDL, with increased nitration in RA subjects with CVD. Cholesterol efflux capacity was diminished in RA subjects and correlated inversely with HDL 3-chlorotyrosine, suggesting a mechanistic role for MPO. Nitrated HDL was elevated in RA subjects with CVD compared with RA subjects without CVD. Oxidative peptide mapping revealed site-specific unique oxidation signatures on apoA1 for RA subjects without and with CVD.
Conclusion
We report an increase in MPO-mediated HDL oxidation that is regiospecific in RA and accentuated in those with CVD. Decreased cholesterol efflux capacity due to MPO-mediated chlorination is a potential mechanism for atherosclerosis in RA and raises the possibility that oxidant-resistant forms of HDL may attenuate this increased risk.
doi:10.1136/annrheumdis-2012-202033
PMCID: PMC4151549  PMID: 23313808
Rheumatoid Arthritis; Myeloperoxidase; High Density Lipoprotein; Mass Spectrometry; Oxidative stress
9.  Computerized Assessment of Competence-Related Abilities in Living Liver Donors: The Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL) 
Clinical Transplantation  2013;27(4):633-645.
Background
Despite its importance, determination of competence to consent to organ donation varies widely based on local standards. We piloted a new tool to aid transplant centers in donor assessment.
Methods
We assessed competence-related abilities among potential living liver donors (LDs) in the 9-center A2ALL study. Prospective LDs viewed an educational video, and were queried to assess Understanding, Appreciation, Reasoning, and ability to express a Final Choice using the MacArthur Competence Assessment Tool for Clinical Research, adapted for computerized administration in LDs (“MacLiver”). Videotaped responses were scored by a clinical neuropsychologist (JF).
Results
Ninety-three LDs were assessed. Mean (standard deviation; domain maximum) scores were: Understanding: 18.1 (2.6; max=22), Appreciation: 5.1 (1.0; max=6), Reasoning: 3.1 (0.8; max=4), and Final Choice: 3.8 (0.5; max=4). Scores did not differ by demographics, relationship to the recipient, eligibility to donate, or eventual donation (p>0.4). Higher education was associated with greater Understanding (p=0.004) and Reasoning (p=0.03).
Conclusion
Standardized, computerized education with independent ratings of responses may (1) alert the clinical staff to potential donors who may not be competent to donate, and (2) highlight areas needing further assessment and education, leading to better informed decision-making.
doi:10.1111/ctr.12184
PMCID: PMC4096778  PMID: 23859354
Living Donation; Comprehension; MacArthur Competence Assessment Tool for Clinical Research; Informed Consent; Ethics; Transplantation
10.  A Randomized Controlled Trial of Pretransplant Antiviral Therapy to Prevent Recurrence of Hepatitis C after Liver Transplantation 
Hepatology (Baltimore, Md.)  2013;57(5):1752-1762.
Hepatitis C virus (HCV) infection recurs in liver recipients who are viremic at transplantation. We conducted a randomized controlled trial to test the efficacy and safety of pre-transplant pegylated interferon alfa-2b plus ribavirin (Peg-IFN alfa-2b/RBV) for prevention of post-transplant HCV recurrence. Enrollees had HCV and were listed for liver transplantation, with either potential living donors or a MELD upgrade for hepatocellular carcinoma. Patients with HCV genotypes (G) 1/4/6 (n=44/2/1) were randomized 2:1 to treatment (n=31) or untreated control (n=16); HCV G2/3 patients (n=32) were assigned to treatment. Overall, 59 were treated and 20 were not. Peg-IFN alfa-2b, starting at 0.75 μg/kg/wk, and ribavirin (RBV), starting at 600 mg/d, were escalated as tolerated. Patients assigned to treatment versus control had similar baseline characteristics. Combined virologic response (CVR) included pre-transplant sustained virologic response (SVR12) and post-transplant virologic response (pTVR), defined as undetectable HCV RNA 12 weeks after end of treatment or transplant, respectively. In intent-to-treat analyses, 12 (19%) assigned to treatment and 1 (6%) assigned to control achieved CVR (p=0.29); per-protocol values were 13 (22%) and 0 (0%) (p=0.03). Among treated G1/4/6 patients, 23/30 received transplant, of whom 22% had pTVR; among treated G2/3 patients, 21/29 received transplant, of whom 29% had pTVR. pTVR was 0%, 18%, and 50% in patients treated for <8, 8–16, and >16 weeks, respectively (p=0.01). Serious adverse events (SAEs) occurred with similar frequency in treated versus untreated patients (68% vs. 55%, p=0.30), but the number of SAEs per patient was higher in the treated group (2.7 vs. 1.3, p=0.003).
Conclusion
Pretransplant treatment with Peg-IFN/RBV prevents post-transplant recurrence of HCV in selected patients. Efficacy is higher with >16 weeks of treatment, but treatment is associated with an increased risk of potentially serious complications.
doi:10.1002/hep.25976
PMCID: PMC3510348  PMID: 22821361
Peginterferon alfa-2b; Ribavirin; Cirrhosis; Waiting List; post-transplant virologic response
11.  Functional Elements Associated with Hepatic Regeneration in Living Donors after Right Hepatic Lobectomy1,2 
We quantified rates of hepatic regeneration and functional recovery for 6 months after right hepatic lobectomy in living donors for liver transplantation.
Twelve donors were studied at baseline; eight were retested at (mean±SD) 11±3 days (T1), 10 at 91±9 days (T2), and 10 at 185±17 days (T3) after donation. Liver and spleen volumes were measured by computed tomography (CT) and single photon emission computed tomography (SPECT). Hepatic metabolism was assessed from caffeine and erythromycin clearance, and hepatic blood flow from cholates, galactose, and perfused hepatic mass (PHM, by SPECT).
Regeneration rates (mL liver per kg body weight per day) were 0.60±0.22 from baseline to T1, 0.05±0.02 from T1 to T2, and 0.01±0.01 from T2 to T3 by CT, and 0.54±0.20, 0.04±0.01, and 0.01±0.02 by SPECT. At T3, liver volume was 84±7% of baseline by CT and 92±13% by SPECT. Changes in hepatic metabolism did not achieve statistical significance. At T1, unadjusted clearance ratios relative to baseline were 0.75±0.07 for intravenous cholate (p=0.0001), 0.88±0.15 for galactose (p=0.0681), 0.84±0.08 (p=0.002) for PHM, and 0.83±0.19 (p=0.056) for estimated hepatic blood flow. These ratios approached 1.00 by T3. At T1, ratios adjusted per liter of liver were 20%–50% greater than baseline and trended toward baseline by T3. Several findings were consistent with alteration of the portal circulation: increased cholate shunt, increased spleen volume, decreased platelet count, and decreased clearance of orally administered cholate.
During the first 2 weeks after donation, hepatic regeneration is rapid and accounts for nearly two-thirds of total regeneration. Increases in hepatic blood flow and uptake of cholate characterize the early phase of regeneration. Right lobe donation alters the portal circulation of living donors, but long-term clinical consequences, if any, are unknown.
doi:10.1002/lt.23592
PMCID: PMC3600052  PMID: 23239552
cholate; SPECT liver-spleen scan; erythromycin; caffeine; galactose
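The regeneration rate's units (mL liver per kg body weight per day) imply a straightforward calculation from serial volume measurements. A sketch, with hypothetical volumes and body weight chosen only to illustrate the arithmetic (these are not the study's data):

```python
def regeneration_rate(vol_start_ml, vol_end_ml, body_wt_kg, days):
    """Hepatic regeneration rate in mL liver / kg body weight / day,
    from liver volumes measured at the start and end of an interval."""
    return (vol_end_ml - vol_start_ml) / (body_wt_kg * days)

# Hypothetical example: a 50 kg donor whose remnant grows from
# 800 mL to 1130 mL over the 11 days from donation to T1.
rate = regeneration_rate(800, 1130, 50, 11)
```

With these made-up inputs the rate works out to 0.60 mL/kg/day, on the order of the early-phase CT estimate reported above.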
12.  Major bleeding events and risk stratification of antithrombotic agents in hemodialysis: Results from the DOPPS 
Kidney International  2013;84(3).
Benefits and risks of antithrombotic agents remain unclear in the hemodialysis population. We aimed to determine variation in antithrombotic agent use and rates of major bleeding events, and to identify factors predictive of stroke and bleeding, allowing risk stratification and more rational decisions about the use of antithrombotic agents.
The sample included 48,144 patients in 12 countries in the Dialysis Outcomes and Practice Patterns Study Phases I–IV. Antithrombotic agents included oral anticoagulants (OAC), aspirin (ASA), and other antiplatelet agents (APA). OAC prescription, comorbidities, and vascular access were assessed at study entry; data on clinical events, including hospitalization due to bleeding, were collected every four months during follow-up.
There was wide variation in OAC (0.3–18%), APA (3–25%) and ASA use (8–36%), and major bleeding rates (0.05–0.22 events/year) among countries. Rates of all-cause mortality, cardiovascular mortality, and bleeding events requiring hospitalization were elevated in patients prescribed OAC across adjusted models. The CHADS2 score predicted the risk of stroke in atrial fibrillation patients. Gastrointestinal bleeding in the past 12 months was highly predictive of major bleeding events; for patients with previous gastrointestinal bleeding, the rate of bleeding exceeded the rate of stroke by at least 2-fold across categories of CHADS2 score.
Prescription of antithrombotic agents varied greatly. The CHADS2 score and a history of gastrointestinal bleeding were predictive of stroke and bleeding events, respectively, with bleeding rates substantially exceeding stroke rates in all groups including patients at high stroke risk. Appropriate risk stratification and a cautious approach should be considered before OAC use in the dialysis population.
doi:10.1038/ki.2013.170
PMCID: PMC3885984  PMID: 23677245
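For readers unfamiliar with the CHADS2 score used for risk stratification in the abstract above, a minimal sketch of the standard scoring rule follows; the function name and signature are illustrative, not from the study:

```python
def chads2(chf: bool, hypertension: bool, age: int,
           diabetes: bool, prior_stroke_tia: bool) -> int:
    """Standard CHADS2 stroke-risk score: congestive heart failure,
    hypertension, age >= 75, and diabetes each add 1 point;
    prior stroke or TIA adds 2 points (range 0-6)."""
    score = 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if age >= 75 else 0
    score += 1 if diabetes else 0
    score += 2 if prior_stroke_tia else 0
    return score
```

For example, a 80-year-old patient with heart failure, hypertension, and a prior stroke scores 5; the abstract's finding is that, in dialysis patients with prior gastrointestinal bleeding, bleeding rates exceeded stroke rates across all such score categories.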
13.  Relationship between heart rate variability and pulse wave velocity and their association with patient outcomes in chronic kidney disease  
Clinical Nephrology  2013;81(1):9-19.
Background: Arterial stiffness and low heart rate variability (HRV) have each been associated with increased cardiovascular risk in a variety of patient populations. We explored the relationship between HRV and pulse wave velocity (PWV, a measure of arterial stiffness) in patients with chronic kidney disease (CKD) prior to end-stage renal disease (ESRD), and examined their association with the outcomes of cardiovascular disease (CVD), death, and progression to ESRD. Methods: The RRI-CKD Study is a 4-center prospective cohort study of CKD stages 3–5 (n = 834). A subset underwent both HRV testing by 24-hour Holter and carotid-femoral PWV (n = 240). Multiple linear regression was used to assess predictors of PWV, and Cox regression to investigate the association of HRV and PWV with time to first CVD event or death and ESRD. Results: Although several HRV measures were inversely correlated with PWV, this association was attenuated after adjustment for age and/or diabetes and no longer significant after adjustment for C-reactive protein. Low HRV and high PWV were individually associated with increased risk of the composite endpoint of CVD/death in multivariable analysis. The risk of the composite of CVD/death was highest for patients with both low HRV and high PWV. Conclusion: Age, diabetes, and inflammation together explained the inverse association between HRV and PWV. Inflammation may play a role in the pathogenesis of both low HRV and high PWV. The combination of low HRV and high PWV showed the strongest association with the composite CVD/death outcome. Mechanisms underlying abnormalities in PWV and HRV, and the role of these measures as intermediate outcomes in future trials in CKD patients, merit further study.
doi:10.5414/CN108020
PMCID: PMC4504149  PMID: 24356038
autonomic nervous system; cardiovascular disease risk factors; cardiovascular outcomes; cohort study; end stage renal disease
14.  Major Depressive Disorder in a Family Study of Obsessive-Compulsive Disorder with Pediatric Probands 
Depression and anxiety  2011;28(6):10.1002/da.20824.
Objective
The study examined the comorbidity of obsessive-compulsive disorder (OCD) with major depressive disorder (MDD) in a family study of OCD with pediatric probands.
Method
The study examined the lifetime prevalence of MDD in 133 first- and 459 second-degree relatives of pediatric probands with OCD and normal controls, and identified clinical variables associated with MDD in case first-degree relatives (FDR). All available FDR were directly interviewed blind to proband status; parents were also interviewed to assess the family psychiatric history of FDR and second-degree relatives (SDR). Best-estimate diagnoses were made using all sources of information. Data were analyzed with logistic regression and robust Cox regression models.
Results
Lifetime MDD prevalence was higher in case than control FDR (30.4% vs. 15.4%). Lifetime MDD prevalence was higher in FDR of case probands with MDD than in either FDR of case probands without MDD (46.3% vs. 19.7%) or control FDR (46.3% vs. 15.4%). MDD in case FDR was associated with MDD in case probands and with age and OCD in those relatives. Lifetime MDD prevalence was similar in case and control SDR. However, lifetime MDD prevalence was higher in SDR of case probands with MDD than in either SDR of case probands without MDD (31.9% vs. 16.8%) or control SDR (31.9% vs. 15.4%).
Conclusions
The results provide further evidence of the heterogeneity of early-onset OCD and suggest that early-onset OCD comorbid with MDD is a complex familial syndrome.
doi:10.1002/da.20824
PMCID: PMC3859241  PMID: 21538726
anxiety; comorbidity; first-degree relatives; second-degree relatives; logistic regression; survival analysis
15.  Hemoglobin A1c Levels and Mortality in the Diabetic Hemodialysis Population 
Diabetes Care  2012;35(12):2527-2532.
OBJECTIVE
Lowering hemoglobin A1c to <7% reduces the risk of microvascular complications of diabetes, but the importance of maintaining this target in diabetes patients with kidney failure is unclear. We evaluated the relationship between A1c levels and mortality in an international prospective cohort study of hemodialysis patients.
RESEARCH DESIGN AND METHODS
Included were 9,201 hemodialysis patients from 12 countries (Dialysis Outcomes and Practice Patterns Study 3 and 4, 2006–2010) with type 1 or type 2 diabetes and at least one A1c measurement during the first 8 months after study entry. Associations between A1c and mortality were assessed with Cox regression, adjusting for potential confounders.
RESULTS
The association between A1c and mortality was U-shaped. Compared with an A1c of 7–7.9%, the hazard ratios (95% CI) for A1c levels were 1.35 (1.09–1.67) for <5%, 1.18 (1.01–1.37) for 5–5.9%, 1.21 (1.05–1.41) for 6–6.9%, 1.16 (0.94–1.43) for 8–8.9%, and 1.38 (1.11–1.71) for ≥9.0%, after adjustment for age, sex, race, BMI, serum albumin, years of dialysis, serum creatinine, 12 comorbid conditions, insulin use, hemoglobin, LDL cholesterol, country, and study phase. Diabetes medications were prescribed for 35% of patients with A1c <6% and not prescribed for 29% of those with A1c ≥9%.
CONCLUSIONS
A1c levels strongly predicted mortality in hemodialysis patients with type 1 or type 2 diabetes. Mortality increased as A1c moved further from 7–7.9%; thus, the target A1c range in hemodialysis patients may encompass values higher than those recommended by current guidelines. Adjusting glucose-lowering therapy to keep A1c within this range may be a modifiable practice for improving outcomes in dialysis patients.
doi:10.2337/dc12-0573
PMCID: PMC3507600  PMID: 22912431
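The U-shaped association reported above can be seen directly in the abstract's numbers; the dictionary below simply encodes the adjusted hazard ratios quoted in the results (band labels paraphrased; the reference band carries a hazard ratio of 1.0 by definition):

```python
# Adjusted mortality hazard ratios by A1c band, as reported in the abstract.
hr_by_a1c = {
    "<5%": 1.35,
    "5-5.9%": 1.18,
    "6-6.9%": 1.21,
    "7-7.9%": 1.00,  # reference band
    "8-8.9%": 1.16,
    ">=9%": 1.38,
}

# The hazard is lowest at the reference band and rises in both directions,
# which is what "U-shaped" means here.
lowest_band = min(hr_by_a1c, key=hr_by_a1c.get)
```

Note that these are point estimates only; the confidence intervals for the 8–8.9% band in the abstract cross 1.0.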
16.  Severe Psychiatric Problems in Right Hepatic Lobe Donors for Living Donor Liver Transplantation 
Transplantation  2007;83(11):1506-1508.
Background
The morbidity and mortality from donation of a right hepatic lobe for living donor liver transplantation (LDLT) is an important issue for this procedure. We report the prevalence of severe psychiatric postoperative complications from the Adult-to-Adult Living Donor Liver Transplantation Cohort study (A2ALL), which was established to define the risks and benefits of LDLT for donors and recipients.
Methods
Severe psychiatric complications were evaluated in all donors from the A2ALL study who were evaluated between 1998 and February 2003.
Results
Of the 392 donors, 16 (4.1%) had one or multiple psychiatric complications, including three severe psychiatric complications (suicide, accidental drug overdose, and suicide attempt).
Conclusions
Despite extensive preoperative screening, some donors experience severe psychiatric complications, including suicide, after liver donation. Psychiatric assessment and monitoring of liver donors may help to understand and prevent such tragic events.
doi:10.1097/01.tp.0000263343.21714.3b
PMCID: PMC3762598  PMID: 17565325
Transplantation; Psychiatric; Morbidity; Liver
17.  Spironolactone and colitis: Increased mortality in rodents and in humans 
Inflammatory Bowel Diseases  2011;18(7):1315-1324.
Background and Methods
Crohn’s disease causes intestinal inflammation leading to intestinal fibrosis. Spironolactone is an anti-fibrotic medication commonly used in heart failure to reduce mortality. We examined whether spironolactone is anti-fibrotic in the context of intestinal inflammation. In vitro, spironolactone repressed fibrogenesis in TGFβ-stimulated human colonic myofibroblasts. However, spironolactone therapy significantly increased mortality in two rodent models of inflammation-induced intestinal fibrosis, suggesting spironolactone could be harmful during intestinal inflammation. Since IBD patients rarely receive spironolactone therapy, we examined whether spironolactone use was associated with mortality in a common cause of inflammatory colitis, Clostridium difficile infection (CDI).
Results
Spironolactone use during CDI was associated with increased mortality in a retrospective cohort of 4008 inpatients (15.9% vs. 9.1%, n=390 deaths, p<0.0001). In patients without liver disease, the adjusted odds ratio (OR) for inpatient mortality associated with 80 mg spironolactone was 1.99 (95% CI: 1.51–2.63). In contrast to the main effect of spironolactone on mortality, multivariable modeling revealed a protective interaction between liver disease and spironolactone dose. The adjusted OR for mortality after CDI was 1.96 (95% CI: 1.50–2.55) for patients without liver disease on spironolactone vs. 1.28 (95% CI: 0.82–2.00) for patients with liver disease on spironolactone, when compared to a reference group with neither liver disease nor spironolactone use.
Conclusions
We propose that discontinuing spironolactone during CDI in patients without liver disease could reduce hospital mortality roughly 2-fold, potentially averting CDI-associated deaths in as many as 35,000 patients annually across Europe and the US.
doi:10.1002/ibd.21929
PMCID: PMC3288762  PMID: 22081497
18.  Antibody Levels to Persistent Pathogens and Incident Stroke in Mexican Americans 
PLoS ONE  2013;8(6):e65959.
Background
Persistent pathogens have been proposed as risk factors for stroke; however, the evidence remains inconclusive. Mexican Americans have an increased risk of stroke especially at younger ages, as well as a higher prevalence of infections caused by several persistent pathogens.
Methodology/Principal Findings
Using data from the Sacramento Area Latino Study on Aging (n = 1621), the authors used discrete-time regression to examine associations between stroke risk and (1) immunoglobulin G antibody levels to Helicobacter pylori (H. pylori), Cytomegalovirus, Varicella Zoster Virus, Toxoplasma gondii, and Herpes simplex virus 1, and (2) concurrent exposure to several pathogens (pathogen burden), defined as: (a) summed sero-positivity, (b) number of pathogens eliciting high antibody levels, and (c) average antibody level. Models were adjusted for socio-demographics and stroke risk factors. Antibody levels to H. pylori predicted incident stroke in fully adjusted models (Odds Ratio: 1.58; 95% Confidence Interval: 1.09, 2.28). No significant associations were found between stroke risk and antibody levels to the other four pathogens. No associations were found between pathogen burden and incident stroke in fully adjusted models.
Conclusions/Significance
Our results suggest that exposure to H. pylori may be a stroke risk factor in Mexican Americans and may contribute to ethnic differences in stroke risk given the increased prevalence of exposure to H. pylori in this population. Future studies are needed to confirm this association.
doi:10.1371/journal.pone.0065959
PMCID: PMC3682951  PMID: 23799066
19.  Predictors of heart rate variability and its prognostic significance in chronic kidney disease 
Background.
Heart rate variability (HRV), a noninvasive measure of autonomic dysfunction and a risk factor for cardiovascular disease (CVD), has not been systematically studied in nondialysis chronic kidney disease (CKD).
Methods.
HRV was assessed using 24-h Holter monitoring in 305 subjects from the Renal Research Institute-CKD Study, a four-center prospective cohort of CKD (Stages 3–5). Multiple linear regression was used to assess predictors of HRV (both time and frequency domain) and Cox regression used to predict outcomes of CVD, composite of CVD/death and end-stage renal disease (ESRD).
Results.
A total of 47 CVD events, 67 ESRD events and 24 deaths occurred over a median follow-up of 2.7 years. Lower HRV was significantly associated with older age, female gender, diabetes, higher heart rate, higher C-reactive protein and phosphorus, lower serum albumin, and Stage 5 CKD. Lower HRV (mostly frequency domain) was significantly associated with higher risk of CVD and the composite end point of CVD or death. Lower HRV (frequency domain) was also significantly associated with higher risk of progression to ESRD, although this effect was relatively weaker.
Conclusions.
This study draws attention to the importance of HRV as a relatively under recognized predictor of adverse cardiovascular and renal outcomes in patients with nondialysis CKD. Whether interventions that improve HRV will improve these outcomes in this high-risk population deserves further study.
doi:10.1093/ndt/gfr340
PMCID: PMC3616757  PMID: 21765187
autonomic nervous system; cardiovascular disease risk factors; cardiovascular outcomes; cohort study; end-stage renal disease
20.  Clinical Characteristics of Newly Diagnosed Primary, Pigmentary, and Pseudoexfoliative Open-Angle Glaucoma in the Collaborative Initial Glaucoma Treatment Study 
The British journal of ophthalmology  2012;96(9):1180-1184.
Background/Aims
Three types of open-angle glaucoma (OAG) – primary, pigmentary, and pseudoexfoliative – are frequently encountered. The aim of this study was to compare demographic, ocular, and systemic medical information collected on people with these three OAG types at diagnosis, and determine if the OAG type affected prognosis.
Methods
Information on 607 participants of the Collaborative Initial Glaucoma Treatment Study was accessed. Descriptive statistics characterized their demographic, ocular, and medical status at diagnosis. Comparisons were made using analysis of variance (ANOVA), and chi-square or Fisher exact tests. Multinomial, mixed, and logistic regression analyses were also performed.
Results
Relative to people with primary OAG, those with pigmentary OAG were younger, more likely to be white, less likely to have a family history of glaucoma, and more myopic. Those with pseudoexfoliative OAG were older, more likely to be white, more likely to be female, less likely to have bilateral disease, and presented with higher intraocular pressure (IOP) and better visual acuity. The type of glaucoma was not associated with IOP or visual field progression during follow-up.
Conclusion
Characteristics of newly-diagnosed enrollees differed by the type of OAG. While some of these differences relate to the pathogenesis of OAG type, other differences are noteworthy for further evaluation within population-based samples of subjects with newly-diagnosed OAG.
doi:10.1136/bjophthalmol-2012-301820
PMCID: PMC3480313  PMID: 22773091
Glaucoma; Epidemiology
21.  Evaluating clinical change and visual function concerns in drivers and non-drivers with glaucoma 
Purpose
To compare drivers and non-drivers, and describe the specific concerns of drivers, among individuals with glaucoma.
Methods
607 newly-diagnosed glaucoma patients from 14 clinical centers of the Collaborative Initial Glaucoma Treatment Study were randomly assigned to initial medicine or surgery and followed every six months for up to 5 years. Driving status (drivers vs. non-drivers) and patient-reported visual function were determined by the Visual Activities Questionnaire and the National Eye Institute Visual Function Questionnaire. Clinical evaluation included visual field mean deviation (MD) and visual acuity. Statistical comparisons were made using t, chi-square, and exact tests, regression, and Rasch analyses.
Results
Drivers were more likely than non-drivers to be male, white, married, employed, and have more education, higher income, and fewer co-morbidities. Over 50% of drivers reported at least “some” difficulty performing tasks involving glare, whereas 22% reported at least “some” difficulty with tasks requiring peripheral vision. At 54 months, drivers with moderate/severe bilateral visual field loss (VFL) reported greater difficulty with night driving and tasks involving visual search and visual processing speed than drivers with less bilateral VFL (all p-values <0.05). While those who remained drivers over follow-up had better MD in both eyes than those who became non-drivers due to eyesight, a number of drivers had marked VFL.
Conclusion
Inquiring about specific difficulties with tasks related to glare, visual processing speed, visual search and peripheral vision in driving, especially among patients with substantial bilateral VF damage, will enable physicians to more effectively counsel patients regarding driving.
doi:10.1167/iovs.08-2575
PMCID: PMC3395081  PMID: 19060263
22.  Outcomes for Adult Living Donor Liver Transplantation: Comparison of A2ALL and National Experience 
Objective
To determine if A2ALL (Adult-to-Adult Living Donor Liver Transplantation Cohort Study) findings are reflected in the national experience, and further define risk factors for patient mortality and graft loss in living donor liver transplantation (LDLT).
Background
A2ALL previously identified risk factors for mortality after LDLT, including early center experience, older recipient age and duration of cold ischemia.
Methods
LDLTs at the 9 A2ALL centers (n=702) and 67 non-A2ALL centers (n=1664) from 1/1/98 to 12/31/07 in the Scientific Registry of Transplant Recipients database were analyzed. Potential predictors of time to death or graft failure were tested using multivariable Cox regression, starting at time of transplant.
Results
There was no significant difference in overall mortality between A2ALL and non-A2ALL centers. Higher mortality hazard ratios (HR) were associated with donor age (HR=1.13/10 years, P<0.001), recipient age (HR=1.20/10 years, P<0.001), serum creatinine (HR=1.52 (log scale), P<0.001), diagnosis of HCC (HR=2.12, P<0.001) or HCV (HR=1.18, P=0.03), ICU (HR=2.52, P<0.001) or hospitalized (HR=1.62, P<0.001) vs home, and earlier center experience (LDLT case number ≤15, HR=1.61, P<0.001; HR=2.24, P<0.001 among A2ALL centers, HR=1.45, P=0.005 among non-A2ALL centers). Cold ischemia time (CIT) >4.5 hours was also associated with higher mortality (HR=1.79, P<0.001). Other than for center experience, comparisons of risk factor effects between A2ALL and non-A2ALL centers were not significant. Increased risk of graft failure in early experience was comparable in both groups.
Conclusions
Mortality risk factors were similar at A2ALL and non-A2ALL centers. Variables associated with graft loss were identified and showed similar trends, with some minor differences in degree of significance. These analyses demonstrate that the findings from the A2ALL consortium are relevant to other centers in the U.S. performing LDLT, and conclusions and recommendations from A2ALL may help guide clinical decision making.
doi:10.1002/lt.22288
PMCID: PMC3116058  PMID: 21360649
Liver transplantation; Living donor; Post-operative outcomes; Risk factors
23.  Laboratory test results after living liver donation in the Adult to Adult Living Donor Liver Transplantation Cohort Study (A2ALL) 
Liver Transplantation  2011;17(4):409-417.
Introduction
Information on long-term health among living liver donors is incomplete. Because changes in standard laboratory tests may reflect underlying health among donors, pre- and post- donation results were examined in the Adult-to-Adult Living Donor Liver Transplantation Cohort Study (A2ALL).
Methods
A2ALL followed 487 living liver donors who donated at nine U.S. transplant centers between 1998 and 2009. Aminotransferase and alkaline phosphatase activities (AST, ALT, AP), bilirubin, INR, albumin, white blood cell count (WBC), hemoglobin, platelet count, ferritin, serum creatinine and BUN were measured at evaluation and post-donation: 1 week, 1 month, 3 months, 1 year and yearly thereafter. Repeated measures models were used to estimate median lab values at each time point and test for differences between values at evaluation (baseline) and post-donation time points.
Results
Platelet counts were significantly decreased at every time point compared to baseline, and at three years were 19% lower. Approximately 10% of donors had a platelet count ≤150 (×1000/mm3) at 2–3 years post-donation. Donors with a platelet count ≤150 (×1000/mm3) at one year had had significantly lower mean platelet counts (189±32) vs. the remainder of the cohort (267±56, p<0.0001) at evaluation. Statistically significant differences from evaluation were noted for AST, AP, INR and albumin through the first year, although most measurements were in the normal range. Median values for WBC, hemoglobin, ferritin, albumin, serum creatinine, BUN, and INR were not substantially outside the normal range at any time point.
Conclusions
After three months, most laboratory values return to normal among right hepatic lobe liver donors, with slower return to baseline levels for AST, AP, INR and albumin. Persistently decreased platelet counts warrant further investigation.
doi:10.1002/lt.22246
PMCID: PMC3295864  PMID: 21445924
living donor liver transplantation; complications; liver transplantation; liver function tests; hepatectomy
24.  Visual Field Progression in the Collaborative Initial Glaucoma Treatment Study: The Impact of Treatment and other Baseline Factors 
Ophthalmology  2008;116(2):200-207.
Purpose
To evaluate factors associated with visual field (VF) progression, using all available follow-up through nine years after treatment initiation, in the Collaborative Initial Glaucoma Treatment Study (CIGTS).
Design
Longitudinal follow-up of participants enrolled in a randomized clinical trial.
Participants
607 newly diagnosed glaucoma patients.
Methods
In a randomized clinical trial, 607 subjects with newly diagnosed open-angle glaucoma were initially treated with either medication or trabeculectomy. After treatment initiation and early follow-up, subjects were evaluated clinically at 6-month intervals. Study participants in both arms of the CIGTS were treated aggressively in an effort to reduce intraocular pressure (IOP) to a level at or below a predetermined, eye-specific target pressure. VF progression was analyzed using repeated measures models.
Main outcome measures
VF progression, measured by Humphrey 24-2 full threshold testing and assessed by the change in the mean deviation (MD), and an indicator of substantial worsening of the VF (MD decrease of ≥3 dB from baseline), assessed at each follow-up visit.
Results
Follow-up indicates minimal change from baseline in each initial treatment group’s average MD. However, at the eight-year follow-up examination, substantial worsening (≥3 dB) of MD from baseline was found in 21.3% and 25.5% of the initial surgery and initial medicine groups, respectively. The effect of initial treatment on subsequent VF loss was modified by time (P<0.0001), baseline MD (P=0.03), and diabetes (P=0.01). Initial surgery led to less VF progression than initial medicine in subjects with advanced VF loss at baseline, whereas subjects with diabetes had more VF loss over time if treated initially with surgery.
Conclusions
The CIGTS intervention protocol led to a lowering of IOP that persisted over time in both treatment groups. VF progression was seen in a subset of subjects that grew over time to more than 20%. Our findings that initial surgery was beneficial for subjects who presented at diagnosis with more advanced VF loss, but detrimental for patients with diabetes, are noteworthy and warrant independent confirmation.
doi:10.1016/j.ophtha.2008.08.051
PMCID: PMC3316491  PMID: 19019444
25.  IMPROVEMENT IN SURVIVAL ASSOCIATED WITH ADULT-TO-ADULT LIVING DONOR LIVER TRANSPLANTATION 
Gastroenterology  2007;133(6):1806-1813.
Background and Aims
More than 2000 adult-to-adult living donor liver transplants (LDLT) have been performed in the U.S., yet the potential benefit to liver transplant candidates of undergoing LDLT compared to waiting for deceased donor liver transplant (DDLT) is unknown. The aim of this study was to determine if there is a survival benefit of adult LDLT.
Methods
Adults with chronic liver disease who had a potential living donor evaluated from 1/98 to 2/03 at nine university-based hospitals were analyzed. Starting at the time of a potential donor’s evaluation, we compared mortality after LDLT to mortality among those who remained on the waitlist or received DDLT. Median follow-up was 4.4 years. Comparisons were made by hazard ratios (HR) adjusted for LDLT candidate characteristics at the time of donor evaluation.
Results
Among 807 potential living donor recipients, 389 received LDLT, 249 received DDLT, 99 died without transplant, and 70 were awaiting transplant at last follow-up. Receipt of LDLT was associated with an adjusted mortality HR of 0.56 (95% confidence interval [CI] 0.42–0.74; P<0.001) relative to candidates who did not receive LDLT. As centers gained greater experience (> 20 LDLT), LDLT benefit was magnified, with a mortality HR of 0.35 (CI 0.23–0.53; P<0.001).
Conclusions
Adult LDLT was associated with lower mortality than the alternative of waiting for DDLT. This reduction in mortality was magnified as centers gained experience with living donor liver transplantation. This reduction in transplant candidate mortality must be balanced against the risks undertaken by the living donors themselves.
doi:10.1053/j.gastro.2007.09.004
PMCID: PMC3170913  PMID: 18054553
