Kidney transplantation and accompanying medical conditions may result in changes in body composition. Such changes have been evaluated in Caucasian recipients, but not in Asian recipients. We therefore conducted a study in Asian recipients, whose body composition differs from that of Caucasians. A total of 50 Asian recipients were enrolled in a prospective cohort. Using bioelectrical impedance analysis, body composition (muscle and fat mass) was assessed at 2 weeks (baseline) and at 1, 3, 6, 9, and 12 months after kidney transplantation. To identify predictors of these changes, the data were analyzed by multivariate analysis with forward selection. All patients had good graft function during the study period. Patients gained approximately 3 kg within 1 year of kidney transplantation. The proportion of muscle mass significantly decreased (Ptrend = 0.001) and the proportion of fat mass significantly increased over time (Ptrend = 0.002). Multivariate analysis revealed that male sex, deceased donor type, and low protein intake were associated with an increase in fat mass and a decrease in muscle mass. These results may help elucidate racial differences in body composition changes after transplantation, as well as the factors related to these changes.
Asian; Body Composition; Fat; Kidney Transplantation; Muscle
Race and ethnicity are influential in estimating glomerular filtration rate (GFR). We aimed to derive Korean coefficients for the Modification of Diet in Renal Disease (MDRD) study equations and to obtain novel estimation equations. Reference GFR was measured by systemic inulin clearance. Serum creatinine (SCr) values were measured by the alkaline picrate Jaffé kinetic method and then recalibrated to the CX3 analyzer and to isotope dilution mass spectrometry (IDMS). The Korean coefficients for the 4- and 6-variable MDRD and IDMS MDRD study equations based on SCr recalibrated to CX3 and to IDMS were 0.73989/0.74254 and 0.99096/0.9554, respectively. Coefficients for the 4- and 6-variable MDRD equations based on SCr measured by the Jaffé method were 1.09825 and 1.04334, respectively. The modified equations performed better than the original equations. The novel 4-variable equations for Koreans based on SCr measured and recalibrated to IDMS were 107.904 × SCr^−1.009 × age^−0.02 (× 0.667, if female) and 87.832 × SCr^−0.882 × age^−0.01 (× 0.653, if female), respectively. The MDRD and IDMS MDRD study equations modified with ethnic coefficients, together with the novel equations, improve the performance of GFR estimation across the range of renal function.
Coefficient; Glomerular Filtration Rate; Inulin Clearance; Modification of Diet in Renal Disease Study
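The equation forms above can be made concrete in code. This is a minimal sketch: it assumes the standard IDMS-traceable 4-variable MDRD base equation (175 × SCr^−1.154 × age^−0.203, × 0.742 if female) to which the Korean coefficient from the abstract is applied; the novel-equation constants are taken from the abstract, and the sign of the age exponent in the second novel equation is reconstructed from context (superscripts were flattened in the source).

```python
def egfr_idms_mdrd_korean(scr_mg_dl, age_yr, female, korean_coeff=0.99096):
    """IDMS-traceable 4-variable MDRD eGFR multiplied by the Korean
    ethnic coefficient reported in the abstract (0.99096 for the
    4-variable IDMS MDRD equation). Base equation (assumed standard):
    175 x SCr^-1.154 x age^-0.203 x (0.742 if female)."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        egfr *= 0.742
    return egfr * korean_coeff


def egfr_korean_novel_idms(scr_mg_dl, age_yr, female):
    """Novel 4-variable Korean equation (IDMS-recalibrated SCr):
    87.832 x SCr^-0.882 x age^-0.01 x (0.653 if female).
    The sign of the age exponent is an assumption here."""
    egfr = 87.832 * scr_mg_dl ** -0.882 * age_yr ** -0.01
    if female:
        egfr *= 0.653
    return egfr
```

For a 50-year-old man with SCr 1.0 mg/dL, both forms yield an eGFR in roughly the high-70s to mid-80s mL/min/1.73 m2.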
Successful transplantation across a positive crossmatch barrier remains one of the long-standing problems in kidney transplant medicine. The aim of this study was to describe seven consecutive living-donor renal transplantations in recipients with a positive crossmatch or donor-specific antibodies (DSAs). A preconditioning regimen of plasmapheresis and intravenous immunoglobulin was delivered three times a week until the crossmatch and/or DSAs became negative. Mycophenolate mofetil and tacrolimus were started two days before plasmapheresis. The protocol was modified to include anti-CD20 antibody (rituximab, 375 mg/m2) from patient 3 through patient 7. All seven patients achieved negative conversion of the crossmatch or DSAs, and kidney transplantation was performed successfully in all cases. Acute cellular rejection occurred in two patients; both episodes were subclinical and controlled with high-dose steroid treatment. Antibody-mediated rejection occurred in one patient and was readily reversed with plasmapheresis. All recipients maintained normal graft function during 7-24 months of follow-up. Our study suggests that sensitized patients can be transplanted successfully after desensitization pretreatment.
Desensitization, Immunologic; Immunoglobulins, Intravenous; Kidney Transplantation; Plasmapheresis; Rituximab
Previous studies have reported a relationship between thyroid hormone levels and mortality in dialysis patients. However, little is known about the association between free thyroxine (fT4) levels and mortality in patients on peritoneal dialysis (PD). This study investigated the association between basal fT4 level, its annual variation, and mortality in PD patients.
Patients on maintenance PD were enrolled from a prospective multicenter cohort study in Korea; their serum triiodothyronine, fT4, and thyroid-stimulating hormone levels were measured 12 months apart. Patients with overt thyroid disease and those receiving thyroid hormone replacement therapy were excluded from the analysis. Patients were divided into two groups based on the median fT4 level. Differences in all-cause, infection-related, and cardiovascular mortality were analyzed between the two groups. The association of basal levels and annual variation with mortality was investigated using Kaplan–Meier curves and Cox proportional hazards models.
Among 235 PD patients, 31 (13.2%) deaths occurred during the mean follow-up period of 24 months. Infection (38.7%) was the most common cause of death. Lower basal fT4 levels were an independent predictor of all-cause and infection-related death (hazard ratio [HR] = 2.74, 95% confidence interval [CI] 1.27–5.90, P = 0.01, and HR = 6.33, 95% CI 1.16–34.64, P = 0.03, respectively). Longitudinally, patients with persistently lower fT4 levels during the 12-month period had significantly higher all-cause mortality than those with persistently higher levels (HR = 3.30, 95% CI 1.15–9.41, P = 0.03). The area under the receiver operating characteristic curve of fT4 for predicting all-cause and infection-related mortality was 0.60 and 0.68, respectively.
fT4 level is an independent predictor of mortality, particularly infection-related mortality, in PD patients. This association was consistent for both baseline measurements and annual variation patterns. Close attention should be paid to infection in PD patients with relatively low fT4 levels.
The clinical benefits of bioimpedance spectroscopy (BIS)-guided fluid management in patients on hemodialysis have been widely demonstrated. However, no previous reports have evaluated the effect of regular, serial BIS-guided fluid management on residual renal function (RRF) in patients on peritoneal dialysis (PD). We will therefore evaluate the clinical efficacy of BIS-guided fluid management for preserving RRF and protecting against cardiovascular events in patients on PD.
This is a multicenter, prospective, randomized controlled trial. A total of 138 participants on PD will be enrolled and randomly assigned to receive either BIS-guided fluid management or fluid management based on clinical information alone for 1 year. The primary outcome is the change in glomerular filtration rate (GFR) between months 0 and 12 after starting treatment. The secondary outcomes include GFR at month 12, time to the anuric state (urine volume <100 ml/day), and fatal and nonfatal cardiovascular events during treatment.
This is the first clinical trial to investigate the effect of BIS-guided fluid management on RRF and on protection against cardiovascular events in patients on PD.
ClinicalTrials.gov number NCT01887262, registered June 24, 2013.
Fluid balance; Bioimpedance spectroscopy; Peritoneal dialysis
The effect of high-flux (HF) dialysis on mortality rates could vary with the duration of dialysis. We evaluated the effects of HF dialysis on mortality rates in incident and prevalent hemodialysis (HD) patients.
Incident and prevalent HD patients were selected from the Clinical Research Center registry for end-stage renal disease (ESRD), a Korean prospective observational cohort study. Incident HD patients were defined as newly diagnosed ESRD patients initiating HD. Prevalent HD patients were defined as patients who had been receiving HD for > 3 months. The primary outcome measure was all-cause mortality.
This study included 1,165 incident and 1,641 prevalent HD patients. Following a median 24 months of follow-up, the mortality rates of the HF and low-flux (LF) groups did not significantly differ in the incident patients (hazard ratio [HR], 1.046; 95% confidence interval [CI], 0.592 to 1.847; p = 0.878). In the prevalent patients, HF dialysis was associated with decreased mortality compared with LF dialysis (HR, 0.606; 95% CI, 0.416 to 0.885; p = 0.009).
HF dialysis was associated with a decreased mortality rate in prevalent HD patients, but not in incident HD patients.
Dialysis; Renal dialysis; Mortality
Fimasartan is a novel angiotensin II receptor blocker. Fimasartan is mainly eliminated via biliary excretion, and its urinary elimination is less than 3%.
Based on guidance from the United States Food and Drug Administration, a reduced pharmacokinetic (PK) study was conducted to evaluate the effect of renal function on the PK of fimasartan in patients with renal impairment and healthy volunteers.
A single-center, single-dose, open-label, healthy-volunteer-controlled trial was conducted in patients with renal impairment (RI) (estimated glomerular filtration rate lower than 30 mL/min/1.73 m2) and age-, weight-, and sex-matched healthy volunteers (estimated glomerular filtration rate higher than 90 mL/min/1.73 m2). All participants received a single oral dose of fimasartan 120 mg, after which serial blood sampling for PK evaluation was conducted. Noncompartmental PK analysis of fimasartan was performed. A mixed-effects model approach was used to identify significant covariates and estimate PK parameters.
Sixteen subjects were enrolled (8 healthy volunteers and 8 RI patients). The maximum plasma concentrations and areas under the plasma concentration curves of the RI patients were higher than those of the healthy volunteers, with geometric mean ratios of 1.87 and 1.73, respectively. The relative bioavailability of fimasartan from the population PK analysis was 77% higher in the RI patients than in the healthy volunteers.
The increased exposure to fimasartan in RI patients was explained by the increased relative bioavailability, consistent with known alterations in pharmacokinetics associated with impaired renal function.
renal impairment; fimasartan; pharmacokinetics; safety
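The noncompartmental quantities reported above (AUC and geometric mean ratios) can be computed with a few lines of code. This is a generic sketch of the technique, not the study's actual analysis pipeline; the function names are illustrative.

```python
import math


def auc_tlast(times, concs):
    """AUC from time 0 to the last sampling time by the linear
    trapezoidal rule, the standard noncompartmental estimate."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for t1, t2, c1, c2 in zip(times, times[1:], concs, concs[1:]))


def geometric_mean_ratio(test_values, ref_values):
    """Geometric mean ratio of a PK parameter, e.g., AUC in renally
    impaired patients versus matched healthy volunteers."""
    def gmean(xs):
        return math.exp(sum(math.log(x) for x in xs) / len(xs))
    return gmean(test_values) / gmean(ref_values)
```

The geometric mean (rather than the arithmetic mean) is used because PK parameters such as AUC and Cmax are conventionally treated as log-normally distributed.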
Controversy persists regarding the appropriate initiation timing of renal replacement therapy for patients with end-stage renal disease. We evaluated the effect of dialysis initiation timing on clinical outcomes. Initiation times were classified according to glomerular filtration rate (GFR).
We enrolled a total of 1691 adult patients who started dialysis between August 2008 and March 2013 in a multi-center, prospective cohort study at the Clinical Research Center for End Stage Renal Disease in the Republic of Korea. The patients were classified into the early-start group or the late-start group according to the mean estimated GFR value, which was 7.37 ml/min/1.73 m2. The primary outcome was patient survival, and the secondary outcomes were hospitalization, cardiovascular events, vascular access complications, change of dialysis modality, and peritonitis. The two groups were compared before and after matching with propensity scores.
Before propensity score matching, the early-start group had poorer survival (P<0.001). Hospitalization, cardiovascular events, vascular access complications, changes in dialysis modality, and peritonitis did not differ between the groups. A total of 854 patients (427 in each group) were selected by propensity score matching. After matching, neither patient survival nor any of the other outcomes differed between the groups.
After propensity score adjustment, early initiation of dialysis showed no clinical benefit over late initiation.
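Propensity score matching of the kind used above can be sketched as follows. This is a simplified illustration (logistic-regression scores with greedy 1:1 nearest-neighbor matching and a caliper), not the study's actual matching procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression


def propensity_match(X, treated, caliper=0.05):
    """Greedy 1:1 nearest-neighbor propensity score matching.
    X: covariate matrix; treated: boolean array marking the exposed
    group (e.g., early-start). Returns (treated_idx, control_idx) pairs."""
    # Propensity score = P(treated | covariates) from logistic regression
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    controls = list(np.flatnonzero(~treated))
    pairs = []
    for i in np.flatnonzero(treated):
        if not controls:
            break
        j = min(controls, key=lambda c: abs(ps[i] - ps[c]))  # closest score
        if abs(ps[i] - ps[j]) <= caliper:
            pairs.append((int(i), int(j)))
            controls.remove(j)  # each control is used at most once
    return pairs
```

With a loose caliper every treated subject is matched as long as controls remain; a tight caliper trades sample size for closer covariate balance.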
Idiopathic membranous nephropathy (iMN) is a common cause of nephrotic syndrome in adults. Biomarkers that accurately indicate the severity of iMN and predict long-term prognosis are lacking. Here, we evaluated the clinical significance of circulating tumor necrosis factor receptors (cTNFRs) as prognostic biomarkers of iMN with nephrotic syndrome. A total of 113 patients with biopsy-proven iMN and 43 healthy volunteers were enrolled in this study. Ninety patients with iMN had nephrotic-range proteinuria. cTNFR levels were measured using serum samples collected at the time of initial diagnosis. cTNFR levels were higher in the patients with nephrotic syndrome than in those with subnephrotic-range proteinuria or in the healthy volunteers (P for trend <0.001). Estimated glomerular filtration rate and proteinuria tended to worsen as cTNFR levels increased. A cTNFR1 level within the highest tertile was a significant risk factor for renal progression after adjustment, in comparison with the other tertiles (hazard ratio [HR], 3.39; 95% confidence interval [95% CI], 1.48–7.78; P = 0.004). A cTNFR2 level within the highest tertile also significantly increased the risk of renal progression (HR, 3.29; 95% CI, 1.43–7.54; P = 0.005). Renal tubular TNFR expression was associated with cTNFR levels. However, cTNFR levels were not associated with anti-phospholipase A2 receptor autoantibody reactivity/levels or with treatment response. This study demonstrated that cTNFR levels at the time of initial diagnosis could predict renal progression in patients with iMN.
Blood pressure variability (BPV) can affect target organ damage (TOD) independently, even when blood pressure is normal. Few studies have examined this in chronic kidney disease (CKD) patients. We evaluated the relationship between BPV and TOD in a cross-sectional, multicenter study of hypertensive CKD patients. We evaluated 1,173 patients using 24-hr ambulatory blood pressure monitoring. BPV was defined as the average real variability, i.e., the mean of the absolute differences between consecutive systolic blood pressure readings. TOD was defined as left ventricular hypertrophy (LVH) (Romhilt-Estes score ≥4 on electrocardiography) and kidney injury (estimated glomerular filtration rate [eGFR] <30 mL/min/1.73 m2 and proteinuria). The mean BPV of the subjects was 15.9±4.63 mmHg. BPV was positively associated with LVH in univariate analysis and after multivariable adjustment (odds ratio per 1 mmHg increase in BPV: 1.053, P=0.006). In contrast, BPV showed no relationship with kidney injury. These data suggest that BPV may be positively associated with LVH in hypertensive CKD patients.
Blood Pressure Variability; Kidney Failure, Chronic; Hypertension; Hypertrophy, Left Ventricular; Target Organ Damage
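The average real variability used above has a simple definition that can be made concrete in code; a minimal sketch:

```python
def average_real_variability(sbp_readings):
    """Average real variability (ARV): the mean of the absolute
    differences between consecutive systolic blood pressure
    readings, e.g., from 24-hr ambulatory monitoring."""
    if len(sbp_readings) < 2:
        raise ValueError("ARV needs at least two readings")
    diffs = [abs(b - a) for a, b in zip(sbp_readings, sbp_readings[1:])]
    return sum(diffs) / len(diffs)
```

Unlike the standard deviation, ARV depends on the order of the readings, so it captures reading-to-reading swings rather than overall dispersion.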
Vitamin D deficiency and increased urinary albumin excretion (UAE) are both important and potentially related health problems; however, the nature of their relationship has not been established in normoalbuminuric subjects.
We obtained data from 14,594 normoalbuminuric Korean adults who underwent voluntary health screenings. We used a generalized additive model to examine the threshold level for the relationship between serum 25-hydroxyvitamin D [25(OH)D] and urinary albumin-creatinine ratio (UACR) levels. We conducted multivariate logistic regression for high-normal UAE (UACR, 10–29 mg/g) according to various categories of vitamin D status.
The generalized additive model confirmed a non-linear relationship between serum 25(OH)D and UACR levels, and the threshold concentration of 25(OH)D was 8.0 ng/mL after multivariate adjustment. Comparing subjects who fell into the lowest category of serum 25(OH)D levels with subjects who were in the reference range (the highest category), we observed that the multivariate adjusted odds ratio (OR) for high-normal UAE was significantly increased, regardless of the criteria used to categorize vitamin D levels: OR of the 1st quartile over the 4th quartile, 1.20 (95% CI, 1.04-1.39); OR of the 1.0-4.9th percentile over the 50-100th percentile, 1.56 (95% CI, 1.25-1.93); and OR of vitamin D deficiency group over vitamin D sufficiency group, 1.28 (95% CI, 1.08-1.52).
We demonstrated that there was an inverse relationship between serum 25(OH)D less than 8.0 ng/mL and UACR in normoalbuminuric subjects, suggesting that severe vitamin D deficiency could cause an increase in UAE in subjects with normoalbuminuria.
Epidemiology; Low-grade albuminuria; Threshold; Vitamin D deficiency
The long-term prognosis of clinically early IgA nephropathy (IgAN) patients remains to be clarified. We investigated the long-term outcomes of IgAN patients with an apparently benign presentation and evaluated prognostic factors for renal survival.
We included patients with biopsy-proven IgAN who had estimated glomerular filtration rates (eGFR) ≥60 mL/min/1.73 m2, normal blood pressure, and proteinuria <0.5 g/day at the time of biopsy. The primary outcome was progression to end-stage renal disease (ESRD). The secondary outcome was a 50% increase in serum creatinine level or an increase in proteinuria to >1 g/day.
The analysis included 153 patients who met the inclusion criteria. At diagnosis, their median systolic blood pressure was 120 (110–130) mmHg, eGFR was 85.9 (74.9–100.1) mL/min/1.73 m2, and proteinuria was 0.25 (0.13–0.38) g/day. Of these, 4 patients died and 6 reached ESRD. The 30-year renal survival rate was 85.5%. Three patients had increased serum creatinine levels and 11 developed proteinuria. Remission was observed in 35 (22.9%) patients. A moderate or severe degree of interstitial fibrosis (adjusted odds ratio [OR] 5.93, 95% confidence interval [CI] 1.44–24.45, P = 0.014) and hypoalbuminemia (adjusted OR 6.18, 95% CI 1.20–31.79, P = 0.029) were independent predictors of the secondary outcome.
This study showed that the prognosis of early IgAN is not always favorable, with progression to ESRD in some cases. Hypoalbuminemia and interstitial fibrosis should be considered important prognostic factors in clinically early IgAN patients.
IgA nephropathy; Interstitial fibrosis; Progression of renal failure
The cost-saving effects of early referral to a nephrologist in patients with chronic kidney disease (CKD) have not been fully evaluated. We evaluated health care costs before and after dialysis initiation according to referral time.
A total of 879 patients who were newly diagnosed as having end-stage renal disease from August 2008 to June 2011 were prospectively enrolled. The early referral (ER) group was defined as patients who were referred to a nephrologist more than a year before dialysis and had visited a nephrology clinic 2 or more times. Patients whose referral time was less than a year were considered the late referral (LR) group. Information about medical costs was acquired from the claim data of the Korea Health Insurance Review and Assessment Service.
The total medical costs during the first 12 months after the initiation of dialysis were not different between the 526 ER patients and the 353 LR patients. However, the costs of the ER patients during the first month were significantly lower than those of the LR patients (ER vs. LR: 3029±2219 vs. 3438±2821 US dollars [USD], P = 0.025). The total 12-month health care costs before the initiation of dialysis were significantly lower in the ER group (ER vs. LR: 6206±5873 vs. 8610±7820 USD, P<0.001). In the multivariate analysis, ER significantly lowered the health care costs during the 12 months before (2534.0±436.2 USD, P<0.001) and the first month (428.5±172.3 USD, P = 0.013) after the initiation of dialysis.
The ER of patients with CKD to a nephrologist is associated with decreased medical costs during the pretreatment period of renal replacement therapy and the early period of dialysis initiation.
The effect of membrane flux on mortality in hemodialysis (HD) patients is controversial. Residual renal function (RRF) has been shown to be not only a predictor of mortality but also a contributor to β2-microglobulin clearance in HD patients. Our study aimed to determine how RRF interacts with dialyzer membrane flux in relation to mortality in HD patients.
HD patients were included from the Clinical Research Center registry for End-Stage Renal Disease, a prospective observational cohort study in Korea. Cox proportional hazards regression models were used to examine the association between the use of high-flux dialysis membranes and all-cause mortality in patients with and without RRF. The primary outcome was all-cause mortality.
This study included 893 patients with 24 h-residual urine volume ≥100 ml (569 and 324 dialyzed using low-flux and high-flux dialysis membranes, respectively) and 913 patients with 24 h-residual urine volume <100 ml (570 and 343 dialyzed using low-flux and high-flux dialysis membranes, respectively). After a median follow-up period of 31 months, mortality was not significantly different between the high and low-flux groups in patients with 24 h-residual urine volume ≥100 ml (HR 0.86, 95% CI, 0.38–1.95, P = 0.723). In patients with 24 h-residual urine volume <100 ml, HD using high-flux dialysis membrane was associated with decreased mortality compared to HD using low-flux dialysis membrane in multivariate analysis (HR 0.40, 95% CI, 0.21–0.78, P = 0.007).
Our data showed that HD using high-flux dialysis membranes had a survival benefit in patients with 24 h-residual urine volume <100 ml, but not in patients with 24 h-residual urine volume ≥100 ml. These findings suggest that high-flux dialysis rather than low-flux dialysis might be considered in HD patients without RRF.
Oxidative stress is a major mediator of adverse outcomes after kidney transplantation. Bilirubin, a potential antioxidant, is produced via heme oxygenase-1 (HO-1) and conjugated by UDP-glucuronosyltransferase (UGT1A1). In this study, we investigated the effects of HO-1 and UGT1A1 sequence variations on kidney allograft outcomes.
Clinical data were collected from 429 Korean recipients who underwent kidney transplantation between 1990 and 2008. Genotyping for UGT1A1*28 and HO-1 (A−413T) was performed. Acute rejection and graft survival were monitored as end-points.
Serum levels of total bilirubin were significantly increased after transplantation (0.41±0.19 mg/dL to 0.80±0.33 mg/dL, P<0.001). Post-transplant 1-year bilirubin level was higher in 6/7 or 7/7 carriers compared with 6/6 homozygotes in terms of the UGT1A1*28 polymorphism (6/6 vs. 6/7 vs. 7/7: 0.71±0.27 vs. 1.06±0.36 vs. 1.10±0.45 mg/dL, P<0.001). According to an additive model of genotype analysis, the 7-allele genotype had a protective effect on the development of acute rejection compared with the 6-allele (odds ratio 0.43, 95% CI 0.25–0.73, P for trend = 0.006). Multivariate Cox regression analysis revealed that individuals carrying the 7-allele had a decreased risk of graft loss, by a factor of 0.36 (95% CI 0.15–0.85, P = 0.019). The HO-1 (A−413T) polymorphism had no effect on serum bilirubin levels or graft outcomes.
The UGT1A1*28 polymorphism is associated with changes in serum bilirubin and with graft outcome after kidney transplantation.
Dose selection is an important step in pharmacokinetic (PK) studies of hemodialysis patients. We propose a simulation-based dose-selection method for PK studies of hemodialysis patients using a subpharmacological dose of oseltamivir as a model drug.
The concentrations of oseltamivir and its active metabolite, oseltamivir carboxylate (OC), were measured by liquid chromatography-tandem mass spectrometry. To determine a low oseltamivir dose exhibiting PK linearity, a pilot dose-determination study (n = 4) was performed using single-dose escalation. After the dose was determined, a low-dose study (n = 10) was performed, and the optimal dose required to reach the hypothetical target OC exposure (area under the concentration-time curve [AUC] of 60,000 ng · hr/mL) was simulated using a nonparametric superposition method. Finally, the observed PK at the optimal dose was compared with the simulated PK to verify predictability.
In the pilot low dose determination study, 2.5 mg of oseltamivir was determined to be the low dose. Subsequently, we performed a single-dose PK study with the low oseltamivir dose in an additional group of 10 hemodialysis patients. The predicted AUClast of OC following continuous oseltamivir doses was simulated, and 35 mg of oseltamivir corresponded to the hypothetical target AUClast of OC. The observed PK profiles of OC at a 35-mg oseltamivir dose and the simulated data based on the low dose study were in close alignment.
The results indicate that the proposed method provides a rational approach to determine the proper PK dose in hemodialysis patients.
Hemodialysis; Pharmacokinetics; Drug; Dosage
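Nonparametric superposition, the simulation technique named above, predicts multiple-dose concentrations by summing time-shifted copies of an observed single-dose profile, assuming linear, time-invariant PK. A minimal sketch under those assumptions (with optional dose scaling); it is not the study's actual code:

```python
import numpy as np


def superpose(times, conc_single, tau, n_doses, dose_ratio=1.0):
    """Nonparametric superposition: predict concentrations after
    repeated dosing by summing time-shifted single-dose profiles.
    times: sampling times (h); conc_single: observed single-dose
    concentrations; tau: dosing interval (h); dose_ratio: target
    dose / studied dose, assuming dose-proportional (linear) PK."""
    times = np.asarray(times, dtype=float)
    conc_single = np.asarray(conc_single, dtype=float) * dose_ratio
    total = np.zeros_like(times)
    for k in range(n_doses):
        shifted = times - k * tau
        # Zero before each dose; linear interpolation within the profile.
        # right=0.0 assumes the profile was sampled long enough that
        # late concentrations are near zero.
        total += np.interp(shifted, times, conc_single, left=0.0, right=0.0)
    return total
```

Scaling a 2.5-mg profile up to a candidate dose and superposing over the dosing schedule is how the target AUC can be searched for under the linearity assumption.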
Sunitinib is an oral multitargeted tyrosine kinase inhibitor used mainly for the treatment of metastatic renal cell carcinoma. The renal adverse effects (RAEs) of sunitinib have not been fully investigated. The aim of this study was to determine the incidence and risk factors of RAEs (proteinuria [PU] and renal insufficiency [RI]) and to investigate the relationship between PU and antitumor efficacy.
We performed a retrospective review of medical records of patients who had received sunitinib for more than 3 months.
One hundred and fifty-five patients (mean age, 58.7 ± 12.6 years) were enrolled, and the mean baseline creatinine level was 1.24 mg/dL. PU developed in 15 of 111 patients, and preexisting PU was aggravated in six of 111 patients. Only one patient developed typical nephrotic syndrome. Following discontinuation of sunitinib, PU was improved in 12 of 17 patients but persisted in five of 17 patients. RI occurred in 12 of 155 patients, and the maximum creatinine level was 3.31 mg/dL. RI improved in two of 12 patients but persisted in 10 of 12 patients. Risk factors for PU were hypertension, dyslipidemia, and chronic kidney disease. Older age was a risk factor for RI. The median progression-free survival was significantly better for patients who showed PU.
The incidence of RAEs associated with sunitinib was lower than previously reported. The severity of RAEs was mild to moderate and partially reversible after cessation of sunitinib. We suggest that blood pressure, urinalysis, and renal function be monitored closely in patients receiving sunitinib.
Acute kidney injury; Proteinuria; Sunitinib
The impact of dialysis modality on survival is still somewhat controversial. Given possible differences in patients’ characteristics and the cause and rate of death in different countries, the issue needs to be evaluated in Korean cohorts.
A nationwide prospective observational cohort study (NCT00931970) was performed to compare survival between peritoneal dialysis (PD) and hemodialysis (HD). A total of 1,060 end-stage renal disease patients in Korea who began dialysis between September 1, 2008 and June 30, 2011 were followed through December 31, 2011.
The patients (PD, 30.6%; HD, 69.4%) were followed up for 16.3±7.9 months. PD patients were significantly younger, were less likely to be diabetic, and had lower body mass index and larger urine volume than HD patients. Infection was the most common cause of death. Multivariate Cox regression in the entire cohort revealed that PD tended to be associated with a lower risk of death than HD [hazard ratio (HR) 0.63, 95% confidence interval (CI) 0.36–1.08]. In propensity score-matched pairs (n = 278 in each modality), cumulative survival probabilities for PD and HD patients were 96.9% and 94.1% at 12 months (P = 0.152) and 94.3% and 87.6% at 24 months (P = 0.022), respectively. Patients on PD had a 51% lower risk of death than those on HD (HR 0.49, 95% CI 0.25–0.97).
PD was associated with superior survival compared with HD in the early period of dialysis, even after adjusting for differences in patient characteristics between the two modalities. Notably, the most common cause of death in this Korean cohort was infection.
IL-15 is a powerful T cell growth factor (TCGF) with particular importance for the maintenance of CD8+ T cells. Because costimulation blockade does not result in universal tolerance, we hypothesized that "escape" from costimulation blockade might represent a CD8+, IL-15/IL-15R-dependent process. For this analysis, we used an IL-15 mutant/Fcγ2a protein, a potentially cytolytic protein that is also a high-affinity, receptor-site-specific antagonist of the IL-15Rα receptor protein, as a therapeutic agent. The IL-15-related fusion protein was used as monotherapy or in combination with CTLA4/Fc in murine islet allograft models. As monotherapies, CTLA4/Fc and IL-15 mutant/Fcγ2a were comparably effective in a semiallogeneic model system, and combined treatment with IL-15 mutant/Fcγ2a plus CTLA4/Fc produced universal permanent engraftment. In a fully MHC-mismatched strain combination known to be refractory to costimulation blockade, combined treatment with both fusion proteins proved highly effective; >70% of recipients were tolerized. The analysis revealed that IL-15 mutant/Fc treatment confers partial protection from both CD4+ and CD8+ T cell graft infiltration. In rejections occurring despite CTLA4/Fc treatment, concomitant treatment with the IL-15 mutant/Fcγ2a protein blocked a CD8+ T cell-dominated rejection process. This protection was linked to a blunted proliferative response of alloreactive T cells as well as silencing of CTL-related gene expression. Hence, we have demonstrated that targeting the IL-15/IL-15R pathway represents a new and potent strategy to prevent costimulation blockade-resistant, CD8+ T cell-driven rejection.
Owing to shared receptor components, the biologic activities of IL-15 are similar to those of IL-2. However, the patterns of tissue expression of IL-2/IL-2Rα and IL-15/IL-15Rα differ. The development of agents targeting the receptor and signaling elements of IL-15 may provide a new perspective for treatment of diseases associated with expression of IL-15/IL-15R. We designed, genetically constructed, and expressed a receptor site-specific IL-15 antagonist by mutating glutamine residues within the C terminus of IL-15 to aspartic acid and genetically linked this mutant IL-15 to murine Fcγ2a. These mutant IL-15 proteins specifically bind to the IL-15R, competitively inhibit IL-15-triggered cell proliferation, and do not activate the STAT-signaling pathway. Because the receptor site-specific antagonist IL-15 mutant/Fcγ2a fusion proteins had a prolonged t½ in vivo and the potential for destruction of IL-15R+ leukocytes, we examined the immunosuppressive activity of this agent. An IL-15 mutant/Fcγ2a fusion protein markedly attenuated Ag-specific delayed-type hypersensitivity responses and decreased leukocyte infiltration within the delayed-type hypersensitivity sites. These findings suggest that 1) IL-15/IL-15R+ cells are crucial to these T cell-dependent immune responses, and 2) treatment with IL-15 mutant/Fcγ2a protein may ameliorate T cell-dependent immune/inflammatory diseases.
Anemia and vitamin D deficiency are both important health issues; however, the nature of the association between vitamin D and either hemoglobin or anemia remains unresolved in the general population.
Data on 11,206 adults were obtained from the fifth Korean National Health and Nutritional Examination Survey. A generalized additive model was used to examine the threshold level for relationship between serum 25-hydroxyvitamin D [25(OH)D] and hemoglobin levels. A multivariate logistic regression for anemia was conducted according to 25(OH)D quintiles. All analyses were stratified according to sex and menstrual status.
The generalized additive model confirmed a threshold 25(OH)D level of 26.4 ng/mL (males, 27.4 ng/mL; premenopausal females, 11.8 ng/mL; postmenopausal females, 13.4 ng/mL). The threshold level affected the pattern of association between 25(OH)D and anemia risk: the odds ratio of the 1st quintile, but not of the 2nd, 3rd, and 4th quintiles, differed significantly from the 5th quintile in both premenopausal and postmenopausal females; however, there was no obvious trend in males.
This population-based study demonstrated a non-linear, threshold-type relationship between serum 25(OH)D and hemoglobin levels in females. Further interventional studies are warranted to determine whether an appropriate hemoglobin level can be achieved by correcting vitamin D deficiency.
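A threshold relationship of the kind reported above can be illustrated with a simple piecewise-linear ("hockey-stick") least-squares fit: hemoglobin rises with 25(OH)D below the breakpoint and plateaus above it. The sketch below is a minimal stand-in for the generalized additive model actually used in the study; the data, variable names, and candidate breakpoints are all synthetic and illustrative.

```python
def fit_line(z, y):
    """Ordinary least squares for y = a + b*z; returns (a, b, sse)."""
    n = len(z)
    mz, my = sum(z) / n, sum(y) / n
    sxx = sum((zi - mz) ** 2 for zi in z)
    sxy = sum((zi - mz) * (yi - my) for zi, yi in zip(z, y))
    b = sxy / sxx
    a = my - b * mz
    sse = sum((yi - (a + b * zi)) ** 2 for zi, yi in zip(z, y))
    return a, b, sse

def threshold_fit(x, y, candidates):
    """Fit y = a + b*min(x, t) for each candidate threshold t and
    keep the t with the smallest residual sum of squares."""
    best = None
    for t in candidates:
        z = [min(xi, t) for xi in x]
        a, b, sse = fit_line(z, y)
        if best is None or sse < best[3]:
            best = (t, a, b, sse)
    return best  # (threshold, intercept, slope, sse)

# Synthetic example: hemoglobin rises up to 25(OH)D of 26 ng/mL, then plateaus.
vitd = list(range(5, 46))  # 25(OH)D, ng/mL
hb = [13.5 if v >= 26 else 13.5 - 0.1 * (26 - v) for v in vitd]
t, a, b, sse = threshold_fit(vitd, hb, [20, 22, 24, 26, 28, 30])
```

With this noiseless toy data the grid search recovers the breakpoint at 26 ng/mL; with real survey data one would use a proper GAM (e.g. penalized splines) and report a confidence interval for the threshold.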
Elevated serum level of fibroblast growth factor-23 (FGF23) is associated with adverse outcomes in dialyzed patients.
The CUPID study compared the efficacy of a cinacalcet-based regimen with conventional care (vitamin D and phosphate binders) for achieving the stringent NKF-K/DOQI targets in peritoneal dialysis (PD) patients. Additionally, we analyzed the change in FGF23 levels between the two treatment arms to explore the effect of cinacalcet in lowering FGF23.
Multicenter, open-labeled, randomized controlled study.
Seven university-affiliated hospitals in Korea.
Overall, 66 peritoneal dialysis patients were enrolled.
Sixty-six patients were randomly assigned to treatment with either cinacalcet + oral vitamin D (cinacalcet group, n = 33) or oral vitamin D alone (control group, n = 33) to achieve K/DOQI targets. CUPID included a 4-week screening phase for vitamin D washout, a 12-week dose-titration phase, and a 4-week assessment phase. We calculated the mean values of iPTH, Ca, P, and Ca × P during the assessment phase, together with the final FGF23 level, to assess outcomes.
The main outcome measures were achievement of a >30% reduction in iPTH from baseline (primary) and reduction in FGF23 (secondary).
In total, 72.7% (n = 24) of the cinacalcet group and 93.9% (n = 31) of the control group completed the study. The cinacalcet group received 30.2 ± 18.0 mg/day of cinacalcet and 0.13 ± 0.32 μg/day of oral vitamin D (P < 0.001 vs. the control group, which received 0.27 ± 0.18 μg/day of vitamin D). The proportion of patients who reached the primary endpoint did not differ significantly between groups (48.5% vs. 51.5%, cinacalcet vs. control, P = 1.000). After treatment, the cinacalcet group experienced a significant reduction in FGF23 levels (median, from 3,960 to 2,325 RU/mL, P = 0.002), whereas the change in the control group was not significant (from 2,085 to 2,415 RU/mL). The percent change in FGF23 after treatment also differed significantly between the two groups (−42.54% vs. 15.83%, P = 0.008). After adjustment, cinacalcet treatment was independently associated with the reduction in serum FGF23.
Cinacalcet treatment was independently associated with the reduction of FGF23 in our PD patients.
Trial registration: ClinicalTrials.gov NCT01101113
Cinacalcet; Fibroblast growth factor 23; Peritoneal dialysis
The effects of air pollution on the respiratory and cardiovascular systems, and the resulting impacts on public health, have been widely studied. However, little is known about the effect of air pollution on the occurrence of hemorrhagic fever with renal syndrome (HFRS), a rodent-borne infectious disease. In this study, we evaluated the correlation between air pollution and HFRS incidence from 2001 to 2010, and assessed whether the correlation remained significant after accounting for climate variables.
We obtained data regarding HFRS, particulate matter smaller than 10 μm (PM10) as an index of air pollution, and climate variables including temperature, humidity, and precipitation from the national database of South Korea. Poisson regression models were established to predict the number of HFRS cases using air pollution and climate variables with different time lags. We then compared the ability of the climate model and the combined climate and air pollution model to predict the occurrence of HFRS.
The correlations between PM10 and HFRS were significant in univariate analyses, although the direction of the correlations varied with the time lag. In multivariate analyses adjusted for climate variables, the effects of PM10 differed across time lags; however, PM10 without a time lag was selected in the final model for predicting HFRS cases. The combined climate and PM10 model predicted HFRS cases better than the climate-only model, both for the study period and for the year 2011.
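The modeling approach described above — counts of HFRS cases regressed on lagged PM10 — can be sketched in miniature. Below is an illustrative pure-Python Poisson log-linear fit (Newton's method on the log-likelihood, i.e. IRLS) with a helper for aligning a lagged exposure series; the data and variable names are invented for the sketch and are not the study's.

```python
import math

def lagged_pairs(counts, exposure, lag):
    """Align case counts at time t with exposure at time t - lag."""
    if lag == 0:
        return list(counts), list(exposure)
    return list(counts[lag:]), list(exposure[:-lag])

def poisson_fit(x, y, iters=50):
    """Fit log E[y] = b0 + b1*x by Newton's method on the Poisson
    log-likelihood (equivalent to iteratively reweighted least squares)."""
    b0 = b1 = 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        g0 = sum(yi - mi for yi, mi in zip(y, mu))               # score: intercept
        g1 = sum((yi - mi) * xi for yi, xi, mi in zip(y, x, mu))  # score: slope
        h00 = sum(mu)                                             # Fisher information
        h01 = sum(mi * xi for mi, xi in zip(mu, x))
        h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det                         # Newton step
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1
```

In practice, the study's models would also carry the climate covariates (temperature, humidity, precipitation) on the right-hand side; choosing among lag structures then amounts to comparing the predictive fit of such models.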
This is the first report to document an association between HFRS and PM10 level.
Air pollution; Hantavirus; Hemorrhagic fever with renal syndrome; Particulate matter; Infection
The timing of referral to a nephrologist may influence the outcome of chronic kidney disease patients, but its impact has not been evaluated thoroughly. The results of a recent study showing an association between early referral and patient survival are still being debated. A total of 1,028 patients newly diagnosed with end-stage renal disease (ESRD) from July 2008 to October 2011 were enrolled. Early referral (ER) was defined as a first meeting with a nephrologist, with provision of dialysis education, more than a year before the start of dialysis; all other patients were considered late referral (LR). The relationship between referral pattern and mortality in ESRD patients was explored using Cox proportional hazards regression models. Time from referral to dialysis was significantly longer in the 599 ER patients than in the 429 LR patients (62.3 ± 58.9 vs. 2.9 ± 3.4 months, P < 0.001). Emergency hemodialysis using a temporary vascular catheter was required in 485 (47.2%) of all patients: 262 (43.7%) of the ER group compared with 223 (52.0%) of the LR group (P = 0.009). After 2 years of follow-up, survival was better in the ER group than in the LR group (hazard ratio [HR] for LR vs. ER, 2.38; 95% confidence interval [CI], 1.27–4.45; P = 0.007). In patients with diabetic nephropathy, survival was also significantly higher in the ER group (HR 4.74, 95% CI 1.73–13.00, P = 0.002), and the HR increased with age. Timely referral to a nephrologist in the predialysis stage is associated with reduced mortality.
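The hazard ratios quoted above compare the instantaneous risk of death between groups. As a back-of-the-envelope illustration only — the study itself used Cox proportional hazards regression, which does not assume constant hazards — the sketch below computes a crude hazard ratio as a ratio of event rates per person-time, with made-up numbers:

```python
def event_rate(events, person_years):
    """Crude incidence rate: events per person-year of follow-up."""
    return events / person_years

def crude_hazard_ratio(events_a, py_a, events_b, py_b):
    """Hazard ratio of group A relative to group B, assuming a
    constant hazard (exponential survival) within each group."""
    return event_rate(events_a, py_a) / event_rate(events_b, py_b)

# Invented numbers for illustration (not the study's data):
# late-referral:  40 deaths over 500 person-years -> rate 0.080/yr
# early-referral: 25 deaths over 750 person-years -> rate 0.033/yr
hr_lr_vs_er = crude_hazard_ratio(40, 500, 25, 750)  # ≈ 2.4
```

Read this way, an HR of 2.38 for late versus early referral corresponds to late-referred patients dying at roughly 2.4 times the rate of early-referred patients over the follow-up period.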