To study the body mass index (BMI) trajectory in patients with incident end-stage kidney disease and its association with all-cause mortality.
This longitudinal cohort study included 17022 adult patients commencing hemodialysis (HD; n = 10860) or peritoneal dialysis (PD; n = 6162) between 2001 and 2008 who had ≥6 months of follow-up and ≥2 weight measurements, using Australia and New Zealand Dialysis and Transplant Registry data. The association of time-varying BMI with all-cause mortality was explored using multivariate Cox regression models.
The median follow-up was 2.3 years. There was a non-linear change in mean BMI (kg/m2) over time, with an initial decrease from 27.6 (95% confidence interval [CI]: 27.5, 27.7) to 26.7 (95% CI: 26.6, 26.9) at 3 months, followed by increases to 27.1 (95% CI: 27, 27.2) at 1 year and 27.2 (95% CI: 26.8, 27.1) at 3 years, and a gradual decrease thereafter. The BMI trajectory was significantly lower in HD patients who died than in those who survived, although this pattern was not observed in PD patients. Compared to the reference time-varying BMI category of 25.1–28 kg/m2, the mortality risks of both HD and PD patients were greater in all categories of time-varying BMI <25 kg/m2. The mortality risks were significantly lower in all categories of time-varying BMI >28.1 kg/m2 among HD patients, but only in the category 28.1–31 kg/m2 among PD patients.
BMI changed over time in a non-linear fashion in incident dialysis patients. Time-varying measures of BMI were significantly associated with mortality risk in both HD and PD patients.
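The time-varying Cox analysis described above relies on restructuring each patient's repeated BMI measurements into counting-process (start, stop] intervals. As an illustrative sketch only (the category cut-points and record layout here are our assumptions, not taken from the registry analysis), the conversion might look like:

```python
# Hypothetical sketch: reshaping repeated BMI measurements into the
# counting-process (start, stop] rows consumed by a time-varying Cox model.
# Category cut-points and record layout are illustrative assumptions.

def bmi_category(bmi):
    """Map BMI (kg/m2) onto categories like those in the abstract."""
    if bmi < 25.0:
        return "<25"
    if bmi <= 28.0:
        return "25.1-28"   # reference category
    if bmi <= 31.0:
        return "28.1-31"
    return ">31"

def to_intervals(measurements, follow_up, died):
    """Split one patient's follow-up into (start, stop] rows.

    measurements: list of (time_in_years, bmi), sorted, first at t = 0
    follow_up:    total observed time in years
    died:         True if the patient died at the end of follow-up
    """
    rows = []
    for i, (t, bmi) in enumerate(measurements):
        stop = measurements[i + 1][0] if i + 1 < len(measurements) else follow_up
        rows.append({"start": t, "stop": stop,
                     "bmi_cat": bmi_category(bmi),
                     "event": int(died and stop == follow_up)})
    return rows

# One synthetic patient followed for the cohort's median 2.3 years
for row in to_intervals([(0.0, 27.6), (0.25, 26.7), (1.0, 27.1)],
                        follow_up=2.3, died=True):
    print(row)
```

Each row contributes to the risk set only over its own interval, which is how the Cox model lets BMI "travel" with the patient over follow-up.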
Inflammation at both systemic and local intraperitoneal levels commonly affects peritoneal dialysis (PD) patients. Interest in inflammatory markers as targets of therapeutic intervention has been considerable as they are recognised as predictors of poor clinical outcomes. However, prior to embarking on strategies to reduce inflammatory burden, it is of paramount importance to define the underlying processes that drive the chronic active inflammatory status. The present review aims to comprehensively describe clinical causes of inflammation in PD patients to which potential future strategies may be targeted.
Dietary sodium restriction is a key management strategy in chronic kidney disease (CKD). Recent evidence has demonstrated short-term reductions in blood pressure (BP) and proteinuria with sodium restriction; however, the effect on other cardiovascular-related risk factors requires investigation in CKD.
The LowSALT CKD study involved 20 hypertensive Stage III-IV CKD patients counselled by a dietitian to consume a low-sodium diet (<100 mmol/day). The study was a randomised crossover trial comparing 2 weeks of high-sodium (additional 120 mmol sodium tablets) and low-sodium intake (placebo). Measurements were taken after each crossover arm, including BP (peripheral and central), adipokines (inflammation markers and adiponectin), volume markers (extracellular-to-intracellular [E/I] fluid ratio; N-terminal pro-brain natriuretic peptide [NT-proBNP]), kidney function (estimated glomerular filtration rate [eGFR]) and proteinuria (urine protein-creatinine ratio [PCR] and albumin-creatinine ratio [ACR]). Outcomes were compared using paired t-tests for each crossover arm.
The BP-lowering benefit of low-sodium intake (peripheral BP, mean ± SD: 148/82 ± 21/12 mmHg) compared with high-sodium intake (159/87 ± 15/10 mmHg) was reflected in central BP and in reductions in eGFR, PCR, ACR, NT-proBNP and E/I ratio. There was no change in inflammatory markers, or in total or high molecular weight adiponectin.
Short-term benefits of sodium restriction on BP were reflected in significant change in kidney function and fluid volume parameters. Larger, long-term adequately powered trials in CKD are necessary to confirm these results.
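The paired analysis in this crossover design reduces to a t-test on within-patient differences between the high- and low-sodium arms. A minimal sketch with invented systolic BP values (not the trial's data) illustrates the computation:

```python
# Paired t statistic for a crossover trial: each patient is their own control.
# The BP values below are synthetic, for illustration only.
from math import sqrt
from statistics import mean, stdev

def paired_t(arm_a, arm_b):
    """t statistic for paired differences between two crossover arms."""
    d = [a - b for a, b in zip(arm_a, arm_b)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

high_na = [159, 162, 150, 170, 155, 165, 158, 161]  # systolic BP, high-sodium arm
low_na  = [148, 150, 141, 158, 149, 150, 147, 149]  # same patients, low-sodium arm

t = paired_t(high_na, low_na)   # t is about 11.9 for this toy data
print(f"t = {t:.2f} on {len(high_na) - 1} degrees of freedom")
```

With 19 degrees of freedom (the trial's 20 patients), a |t| above roughly 2.09 corresponds to two-sided p < 0.05.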
Universal Trial Number U1111-1125-2149 registered on 13/10/2011; Australian New Zealand Clinical Trials Registry Number ACTRN12611001097932 registered on 21/10/2011.
Dietary sodium; Nutrition; Chronic kidney disease; Cardiovascular disease; Blood pressure; Kidney function; Inflammation
Repeated exposure to peritoneal dialysis (PD) solutions contributes to cumulative intraperitoneal inflammation and peritoneal injury. The present study aimed to explore the capacity of dialysate interleukin-6 (IL-6) to a) predict peritoneal membrane function and peritonitis in incident PD patients, and b) evaluate the influence of neutral-pH, low glucose degradation product (GDP) PD solution on dialysate IL-6 levels.
The study included 88 incident participants from the balANZ trial who had completed 24 months of follow-up. Change in peritoneal solute transport rate (PSTR) and peritonitis were primary outcome measures, and the utility of IL-6 and IL-6 appearance rate (IL-6 AR) in predicting these outcomes was analyzed using multilevel linear regression and Cox proportional hazards models, respectively. Sensitivity analyses were performed by analyzing outcomes in a peritonitis-free cohort (n = 56).
Dialysate IL-6 concentration significantly increased from baseline to 24 months (mean difference 19.07 pg/mL; P < 0.001) but was not affected by the type of PD solution received (P = 0.68). An increase in PSTR from baseline was associated with higher levels of IL-6 (P = 0.004), the use of standard solutions (P = 0.005) and longer PD duration (P < 0.001). Baseline IL-6 level was not associated with a shorter time to first peritonitis (adjusted hazard ratio 1.00, 95% CI 0.99-1.00, P = 0.74). Analysis of IL-6 AR as well as sensitivity analyses in a peritonitis-free cohort yielded comparable results.
Dialysate IL-6 concentration increased with longer PD duration and was a significant, independent predictor of PSTR. The use of biocompatible PD solutions exerted no significant effect on dialysate IL-6 levels but did abrogate the increase in PSTR associated with standard PD solutions. This is the first study to examine the impact of biocompatible solutions on the utility of IL-6 in predicting PSTR and peritonitis.
Biocompatible; Glucose degradation products; Interleukin-6; Peritoneal dialysis; Peritoneal solute transport rate; Peritonitis
♦ Background: The impact of climatic variations on peritoneal dialysis (PD)-related peritonitis has not been studied in detail. The aim of the current study was to determine whether various climatic zones influenced the probability of occurrence or the clinical outcomes of peritonitis.
♦ Methods: Using ANZDATA registry data, the study included all Australian patients receiving PD between 1 October 2003 and 31 December 2008. Climatic regions were defined according to the Köppen classification.
♦ Results: The overall peritonitis rate was 0.59 episodes per patient-year. Most of the patients lived in Temperate regions (65%), with others residing in Subtropical (26%), Tropical (6%), and Other climatic regions (Desert, 0.6%; Grassland, 2.3%). Compared with patients in Temperate regions, those in Tropical regions demonstrated significantly higher overall peritonitis rates and a shorter time to a first peritonitis episode [adjusted hazard ratio: 1.15; 95% confidence interval (CI): 1.01 to 1.31]. Culture-negative peritonitis was significantly less likely in Tropical regions [adjusted odds ratio (OR): 0.42; 95% CI: 0.25 to 0.73]; its occurrence in Subtropical and Other regions was comparable to that in Temperate regions. Fungal peritonitis was independently associated with Tropical regions (OR: 2.18; 95% CI: 1.22 to 3.90) and Other regions (OR: 3.46; 95% CI: 1.73 to 6.91), where rates of antifungal prophylaxis were also lower. Outcomes after first peritonitis episodes were comparable in all groups.
♦ Conclusions: Tropical regions were associated with a higher overall peritonitis rate (including fungal peritonitis) and a shorter time to a first peritonitis episode. Augmented peritonitis prophylactic measures such as antifungal therapy and exit-site care should be considered in PD patients residing in Tropical climates.
Antibiotics; bacteria; climate; fungus; microbiology; peritonitis; outcomes
♦ Objective: Management of peritoneal dialysis (PD)-associated peritonitis requires timely intervention by experienced staff, which may not be uniformly available throughout the week. The aim of the present study was to examine the effects of weekend compared with weekday presentation on peritonitis outcomes.
♦ Methods: The study, which used data from the Australia and New Zealand Dialysis and Transplant Registry, included all Australian patients receiving PD between 1 October 2003 and 31 December 2008. The independent predictors of weekend presentation and subsequent peritonitis outcomes were assessed by multivariate logistic regression.
♦ Results: Peritonitis presentation rates were significantly lower on Saturdays [0.46 episodes per year; 95% confidence interval (CI): 0.42 to 0.49 episodes per year] and on Sundays (0.43 episodes per year; 95% CI: 0.40 to 0.47 episodes per year) than all other weekdays; they peaked on Mondays (0.76 episodes per year; 95% CI: 0.72 to 0.81 episodes per year). Weekend presentation with a first episode of peritonitis was independently associated with lower body mass index and residence less than 100 km away from the nearest PD unit. Patients presenting with peritonitis on the weekend were significantly more likely to be hospitalized [adjusted odds ratio (OR): 2.32; 95% CI: 1.85 to 2.90], although microbial profiles and empiric antimicrobial treatments were comparable between the weekend and weekday groups. Antimicrobial cure rates were also comparable (79% vs 79%, p = 0.9), with the exception of cure rates for culture-negative peritonitis, which were lower on the weekend (80% vs 88%, p = 0.047). Antifungal prophylaxis was less likely to be co-prescribed for first peritonitis episodes presenting on weekdays (OR: 0.68; 95% CI: 0.05 to 0.89).
♦ Conclusions: Patients on PD are less likely to present with peritonitis on the weekend. Nevertheless, the microbiology, treatment, and outcomes of weekend and weekday PD peritonitis presentations are remarkably similar. Exceptions include the associations of weekend presentation with a higher hospitalization rate and a lower cure rate in culture-negative infection.
After hours; bacteria; fungus; microbiology; peritonitis; outcomes; temporal variation
The aim of this study was to investigate the characteristics and outcomes of patients receiving renal replacement therapy for end-stage kidney disease (ESKD) secondary to haemolytic uraemic syndrome (HUS).
The study included all patients with ESKD who commenced renal replacement therapy in Australia and New Zealand between 15/5/1963 and 31/12/2010, using data from the ANZDATA Registry. HUS ESKD patients were compared with matched controls with an alternative primary renal disease using propensity scores based on age, gender and treatment era.
Of the 58422 patients included in the study, 241 (0.4%) had ESKD secondary to HUS. HUS ESKD was independently associated with younger age, female gender and European race. Compared with matched controls, HUS ESKD was not associated with mortality on renal replacement therapy (adjusted hazard ratio [HR] 1.14, 95% CI 0.87-1.50, p = 0.34) or dialysis (HR 1.34, 95% CI 0.93-1.93, p = 0.12), but did independently predict recovery of renal function (HR 54.01, 95% CI 1.45-11.1, p = 0.008). 130 (54%) HUS patients received 166 renal allografts. Overall renal allograft survival rates were significantly lower for patients with HUS ESKD at 1 year (73% vs 91%), 5 years (62% vs 85%) and 10 years (49% vs 73%). HUS ESKD was an independent predictor of renal allograft failure (HR 2.59, 95% CI 1.70-3.95, p < 0.001). Sixteen (12%) HUS patients experienced failure of 22 renal allografts due to recurrent HUS. HUS ESKD was not independently associated with the risk of death following renal transplantation (HR 0.92, 95% CI 0.35-2.44, p = 0.87).
HUS is an uncommon cause of ESKD, which is associated with comparable patient survival on dialysis, an increased probability of renal function recovery, comparable patient survival post-renal transplant and a heightened risk of renal transplant graft failure compared with matched ESKD controls.
Haemolytic uraemic syndrome; Kidney Failure; Chronic; Outcomes; Renal function recovery; Renal transplantation; Thrombotic microangiopathy
Peritoneal dialysis (PD) is a preferred home dialysis modality and has a number of added advantages including improved initial patient survival and cost effectiveness over haemodialysis. Despite these benefits, uptake of PD remains relatively low, especially in developed countries. Wider implementation of PD is compromised by higher technique failure from infections (e.g., PD peritonitis) and ultrafiltration failure. These are inevitable consequences of peritoneal injury, which is thought to result primarily from continuous exposure to PD fluids that are characterised by their “unphysiologic” composition. In order to overcome these barriers, a number of more biocompatible PD fluids, with neutral pH, low glucose degradation product content, and bicarbonate buffer have been manufactured over the past two decades. Several preclinical studies have demonstrated their benefit in terms of improvement in host cell defence, peritoneal membrane integrity, and cytokine profile. This paper aims to review randomised controlled trials assessing the use of biocompatible PD fluids and their effect on clinical outcomes.
Tunnelled central venous dialysis catheter use is significantly limited by the occurrence of catheter-related infections. This randomised controlled trial assessed the efficacy of a 48-hour 70% ethanol lock versus heparin locks in prolonging the time to the first episode of catheter-related blood stream infection (CRBSI).
Patients undergoing haemodialysis (HD) via a tunnelled catheter were randomised 1:1 to once-weekly ethanol locks (with two heparin locks between other dialysis sessions) versus thrice-weekly heparin locks.
Observed catheter days in the heparin (n = 24) and ethanol (n = 25) groups were 1814 and 3614, respectively. CRBSI occurred at a rate of 0.85 vs 0.28 episodes per 1000 catheter days in the heparin vs ethanol group by intention-to-treat analysis (incidence rate ratio [IRR] for ethanol vs heparin 0.17; 95% CI 0.02-1.63; p = 0.12). Flow issues requiring catheter removal occurred at a rate of 1.6 vs 1.4 per 1000 catheter days in the heparin and ethanol groups, respectively (IRR 0.85; 95% CI 0.20-3.5; p = 0.82 for ethanol vs heparin).
Catheter survival and catheter-related blood stream infection were not significantly different but there was a trend towards a reduced rate of infection in the ethanol group. This study establishes proof of concept and will inform an adequately powered multicentre trial to definitively examine the efficacy and safety of ethanol locks as an alternative to current therapies used in the prevention of catheter-associated blood stream infections in patients dialysing with tunnelled catheters.
Australian New Zealand Clinical Trials Registry ACTRN12609000493246
Catheter related blood stream infection (CRBSI); Central venous catheter; Ethanol; Lock therapy; Haemodialysis (HD); Prophylaxis
Despite evidence implicating dietary sodium in the pathogenesis of cardiovascular disease (CVD) in chronic kidney disease (CKD), quality intervention trials in CKD patients are lacking. This study aims to investigate the effect of reducing sodium intake on blood pressure, risk factors for progression of CKD and other cardiovascular risk factors in CKD.
The LowSALT CKD study is a six-week randomized crossover trial assessing the effect of a moderate (180 mmol/day) compared with a low (60 mmol/day) sodium intake on cardiovascular risk factors and risk factors for kidney function decline in mild-moderate CKD (stage III-IV). The primary outcome of interest is 24-hour ambulatory blood pressure, with secondary outcomes including arterial stiffness (pulse wave velocity), proteinuria and fluid status. The randomized crossover trial (Phase 1) is supported by an ancillary trial (Phase 2) of longitudinal-observational design to assess the longer-term effectiveness of sodium restriction. Phase 2 will continue measurement of outcomes as per Phase 1, with the addition of patient-centered outcomes, such as dietary adherence to sodium restriction (degree of adherence and barriers/enablers), quality of life and taste assessment.
The LowSALT CKD study is an investigator-initiated study specifically designed to assess the proof-of-concept and efficacy of sodium restriction in patients with established CKD. Phase 2 will assess the longer term effectiveness of sodium restriction in the same participants, enhancing the translation of Phase 1 results into practice. This trial will provide much-needed insight into sodium restriction as a treatment option to reduce risk of CVD and CKD progression in CKD patients.
Universal Trial Number: U1111-1125-2149. Australian New Zealand Clinical Trials Registry Number: ACTRN12611001097932
Dietary sodium; Chronic kidney disease; Blood pressure; Cardiovascular disease; Arterial stiffness; Clinical trial; Patient compliance; Taste disturbance
The aim of the study was to determine whether distance between residence and peritoneal dialysis (PD) unit influenced peritonitis occurrence, microbiology, treatment and outcomes.
The study included all patients receiving PD between 1/10/2003 and 31/12/2008, using ANZDATA Registry data.
Overall, 365 (6%) patients lived ≥100 km from their nearest PD unit (distant group), while 6183 (94%) lived <100 km away (local group). Median time to first peritonitis in distant patients (1.34 years, 95% CI 1.07-1.61) was significantly shorter than in local patients (1.68 years, 95% CI 1.59-1.77, p = 0.001), whilst overall peritonitis rates were higher in distant patients (incidence rate ratio 1.32, 95% CI 1.20-1.46). Living ≥100 km away from a PD unit was independently associated with a higher risk of S. aureus peritonitis (adjusted odds ratio [OR] 1.64, 95% CI 1.09-2.47). Distant patients with first peritonitis episodes were less likely to be hospitalised (64% vs 73%, p = 0.008) and to receive antifungal prophylaxis (4% vs 10%, p = 0.01), but more likely to receive vancomycin-based antibiotic regimens (52% vs 42%, p < 0.001). In multivariable logistic regression analysis of peritonitis outcomes, distant patients were more likely to be cured with antibiotics alone (OR 1.55, 95% CI 1.03-2.24). All other outcomes were comparable between the two groups.
Living ≥100 km away from a PD unit was associated with an increased risk of S. aureus peritonitis, modified approaches to peritonitis treatment, and peritonitis outcomes that were comparable to, or better than, those of patients living closer to a PD unit. Staphylococcal decolonisation should receive particular consideration in remote-living patients.
Antibiotics; Bacteria; Fungus; Microbiology; Peritoneal Dialysis; Peritonitis; Outcomes; Relapse; Remoteness
This report discusses the case of a 52-year-old female with post-transplant lymphoproliferative disorder confined to the central nervous system, which was managed with high-dose methotrexate (HDMTX) in the context of end-stage renal disease. The patient received two doses of HDMTX followed by extended-hours high-flux hemodialysis, plasma methotrexate concentration monitoring and leucovorin rescue. The hemodialysis technique used was effective in clearing plasma methotrexate and allowed delivery of HDMTX to achieve complete remission with limited and reversible direct methotrexate-related toxicity. Dialysis-dependent renal failure does not preclude the use of HDMTX when required for curative therapy of malignancy.
High dose methotrexate; end stage renal disease; dialysis; primary central nervous system lymphoma; post-transplant lymphoproliferative disorder
Nightly extended hours hemodialysis may improve left ventricular hypertrophy, left ventricular function and endothelial function, but presents problems of sustainability and increased cost. The effect of alternate nightly home hemodialysis (NHD) on cardiovascular structure and function is not known.
Sixty-three patients on standard hemodialysis (SHD: 3.5-6 hours/session, 3-5 sessions weekly) converted to NHD (6-10 hours/session overnight for 3-5 sessions weekly). Two-dimensional transthoracic echocardiography and ultrasound measures of brachial artery reactivity (BAR), carotid intima-media thickness (CIMT), total arterial compliance (TAC) and augmentation index (AIX) were performed post-dialysis at baseline and 18-24 months following conversion to NHD. In 37 patients, an index of oxidative stress (plasma malondialdehyde [MDA]), antioxidant enzyme activities (catalase [CAT], glutathione peroxidase [GPX] and superoxide dismutase [SOD]) and total antioxidant status (TAS) were measured at baseline, 3 and 6 months.
Left ventricular mass index (LVMI) remained stable. Despite significant derangement at baseline, there were no changes in diastolic function measures, CIMT, BAR and TAC. AIX increased. Conversion to NHD improved bone mineral metabolism parameters and blood pressure control. Interdialytic weight gains increased. No definite improvements in measures of oxidative stress were demonstrated.
Despite improvement in uremic toxin levels and some cardiovascular risk factors, conversion to an alternate nightly NHD regimen did not improve cardiovascular structure and function. Continuing suboptimal control of uremic toxins and interdialytic weight gains may be a possible explanation. This study adds to the increasing uncertainty about the nature of improvement in cardiovascular parameters with conversion to intensive hemodialysis regimens. Future randomized controlled trials will be important to determine whether increases in dialysis session duration, frequency or both are most beneficial for improving cardiovascular disease whilst minimizing costs and the impact of dialysis on quality of life.
Diastolic Function; Ejection Fraction; Left Ventricular Mass Index; Left Ventricular Hypertrophy; Nocturnal Hemodialysis; Carotid Intima-Media Thickness; Oxidative Stress; Arterial Compliance
Aluminium-containing phosphate binders have long been used for the treatment of hyperphosphatemia in dialysis patients. Their safety became controversial in the early 1980s after reports of aluminium-related neurological and bone disease began to appear. Available historical evidence, however, suggests that neurological toxicity may have primarily been caused by excessive exposure to aluminium in dialysis fluid, rather than by aluminium-containing oral phosphate binders. Limited evidence suggests that aluminium bone disease may also be on the decline in the era of aluminium removal from dialysis fluid, even with continued use of aluminium binders.
The K/DOQI and KDIGO guidelines both suggest avoiding aluminium-containing binders. These guidelines will tend to promote the use of the newer, more expensive binders (lanthanum, sevelamer), which have limited evidence for benefit and, like aluminium, limited long-term safety data. Treating hyperphosphatemia in dialysis patients continues to represent a major challenge, and there is a large body of evidence linking serum phosphate concentrations with mortality. Most nephrologists agree that phosphate binders have the potential to meaningfully reduce mortality in dialysis patients. Aluminium is one of the cheapest, most effective and well tolerated of the class, however there are no prospective or randomised trials examining the efficacy and safety of aluminium as a binder. Aluminium continues to be used as a binder in Australia as well as some other countries, despite concern about the potential for toxicity. There are some data from selected case series that aluminium bone disease may be declining in the era of reduced aluminium content in dialysis fluid, due to rigorous water testing.
This paper seeks to revisit the contemporary evidence for the safety record of aluminium-containing binders in dialysis patients. It puts their use into the context of the newer, more expensive binders and increasing concerns about the risks of calcium binders, which continue to be widely used. The paper seeks to answer whether the continued use of aluminium is justifiable in the absence of prospective data establishing its safety, and we call for prospective trials to be conducted comparing the available binders both in terms of efficacy and safety.
Tibolone is a synthetic steroid used with increasing frequency to treat symptoms of menopause, including in patients with solid-organ transplants who are taking concurrent immune suppression. To the best of our knowledge, there are no reported drug interactions between tibolone and tacrolimus, one of the principal immune suppressants used in kidney transplantation.
We report the case of a 49-year-old Caucasian woman who had received a kidney transplant and who developed acute kidney injury secondary to tacrolimus toxicity 10 days after starting tibolone therapy. No alternative causes were found. Tibolone is known to be a weak competitive inhibitor of CYP3A4, which is involved in tacrolimus metabolism.
Despite careful evaluation, no alternative reason was found for the acute kidney injury, and her kidney function returned to its previous baseline within several days of cessation of the medication, with no other specific treatment. Using the Drug Interaction Probability Scale, we conclude that she experienced a probable drug interaction. We believe that transplant clinicians should utilise frequent therapeutic drug monitoring of tacrolimus in patients starting or stopping tibolone therapy.
Approximately 50% of patients with stage 3 chronic kidney disease are 25-hydroxyvitamin D insufficient, and this prevalence increases with falling glomerular filtration rate. Vitamin D is now recognised as having pleiotropic roles beyond bone and mineral homeostasis, with the vitamin D receptor and metabolising machinery identified in multiple tissues. Worryingly, recent observational data have highlighted an association between hypovitaminosis D and increased cardiovascular mortality, possibly mediated via vitamin D effects on insulin resistance and inflammation. The main hypothesis of this study is that oral vitamin D supplementation will ameliorate insulin resistance in patients with stage 3 chronic kidney disease when compared to placebo. Secondary hypotheses will test whether this is associated with decreased inflammation and bone/adipocyte-endocrine dysregulation.
This study is a single-centre, double-blinded, randomised, placebo-controlled trial. Inclusion criteria are: estimated glomerular filtration rate 30-59 ml/min/1.73 m2; age ≥18 years on entry to the study; and serum 25-hydroxyvitamin D level <75 nmol/L. Patients will be randomised 1:1 to receive either oral cholecalciferol 2000 IU/day or placebo for 6 months. The primary outcome will be an improvement in insulin sensitivity, measured by hyperinsulinaemic euglycaemic clamp. Secondary outcome measures will include serum parathyroid hormone, cytokines (interleukin-1β, interleukin-6, tumour necrosis factor alpha), adiponectin (total and high molecular weight), osteocalcin (carboxylated and under-carboxylated), peripheral blood mononuclear cell nuclear factor kappa-B p65 binding activity, brachial artery reactivity, aortic pulse wave velocity and waveform analysis, and indirect calorimetry. All outcome measures will be performed at baseline and at the end of the study.
To date, no randomised controlled trial has been performed in pre-dialysis CKD patients to study the relationship of vitamin D status and supplementation with insulin resistance and markers of adverse cardiovascular risk. We remain hopeful that cholecalciferol may be a safe intervention, with health benefits beyond those related to bone-mineral homeostasis.
Australian and New Zealand Clinical Trials Registry ACTRN12609000246280.
Catheter-related bacteraemias (CRBs) contribute significantly to morbidity, mortality and health care costs in dialysis populations. Despite international guidelines recommending avoidance of catheters for haemodialysis access, hospital admissions for CRBs have doubled in the last decade. The primary aim of the study is to determine whether weekly instillation of 70% ethanol prevents CRBs compared with standard heparin saline.
The study will follow a prospective, open-label, randomized controlled design. Inclusion criteria are adult patients with incident or prevalent tunnelled intravenous dialysis catheters on three times weekly haemodialysis, with no current evidence of catheter infection and no personal, cultural or religious objection to ethanol use, who are on adequate contraception and are able to give informed consent. Patients will be randomized 1:1 to receive 3 mL of intravenous-grade 70% ethanol into each lumen of the catheter once a week and standard heparin locks on other dialysis days, or to receive heparin locks only. The primary outcome measure will be time to the first episode of CRB, which will be defined using standard objective criteria. Secondary outcomes will include adverse reactions, incidence of CRB caused by different pathogens, time to infection-related catheter removal, time to exit-site infections and costs. Prospective power calculations indicate that the study will have 80% statistical power to detect a clinically significant increase in median infection-free survival from 200 days to 400 days if 56 patients are recruited into each arm.
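The quoted power calculation can be sanity-checked with the standard Schoenfeld events formula for a log-rank comparison, assuming exponential survival so that the hazard ratio equals the ratio of medians. This is our reconstruction, not the protocol's actual method, so the numbers need not match exactly (the protocol presumably also modelled accrual and follow-up):

```python
# Back-of-envelope Schoenfeld calculation: number of events needed for a
# two-sided log-rank test with 1:1 allocation, assuming exponential survival.
from math import ceil, log
from statistics import NormalDist

def required_events(median_control, median_treatment, alpha=0.05, power=0.80):
    """Events needed to detect the given change in median time-to-event."""
    hazard_ratio = median_control / median_treatment  # exponential assumption
    z = NormalDist().inv_cdf
    z_alpha, z_power = z(1 - alpha / 2), z(power)
    return ceil(4 * (z_alpha + z_power) ** 2 / log(hazard_ratio) ** 2)

# Doubling median infection-free survival from 200 to 400 days (HR = 0.5)
print(required_events(200, 400))  # 66 events at 80% power
```

The required sample size then depends on what fraction of the 112 randomised patients are expected to experience a CRB during follow-up.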
This investigator-initiated study has been designed to provide evidence to help nephrologists reduce the incidence of CRBs in haemodialysis patients with tunnelled intravenous catheters.
Australian New Zealand Clinical Trials Registry Number: ACTRN12609000493246
The main hypothesis of this study is that oral heme iron polypeptide (HIP; Proferrin® ES) administration will more effectively augment iron stores in erythropoiesis-stimulating agent (ESA)-treated peritoneal dialysis (PD) patients than conventional oral iron supplementation (Ferrogradumet®).
Inclusion criteria are peritoneal dialysis patients treated with darbepoetin alfa (DPO; Aranesp®, Amgen) for ≥1 month. Patients will be randomized 1:1 to receive either slow-release ferrous sulphate (1 tablet twice daily; control) or HIP (1 tablet twice daily) for a period of 6 months. The study will follow an open-label design, but outcome assessors will be blinded to study treatment. During the 6-month study period, haemoglobin levels will be measured monthly and iron studies (including transferrin saturation [TSAT] measurements) will be performed bi-monthly. The primary outcome measure will be the difference in TSAT levels between the 2 groups at the end of the 6-month study period, adjusted for baseline values using analysis of covariance (ANCOVA). Secondary outcome measures will include serum ferritin concentration, haemoglobin level, DPO dosage, Key's index (DPO dosage divided by haemoglobin concentration), and occurrence of adverse events (especially gastrointestinal adverse events).
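The primary analysis described above, final TSAT compared between arms with adjustment for baseline, is a standard ANCOVA, equivalent to an ordinary least-squares fit of the final value on treatment group plus baseline value. A sketch on synthetic data (the effect size, variances and group coding are invented purely for illustration):

```python
# ANCOVA as OLS: final TSAT ~ intercept + treatment group + baseline TSAT.
# All numbers below are synthetic; only the model form mirrors the protocol.
import numpy as np

rng = np.random.default_rng(0)
n = 40
baseline = rng.normal(25, 5, n)            # baseline TSAT (%)
group = np.repeat([0, 1], n // 2)          # 0 = ferrous sulphate, 1 = HIP
# simulate a true adjusted treatment effect of +4 TSAT percentage points
final = 5 + 0.8 * baseline + 4 * group + rng.normal(0, 2, n)

X = np.column_stack([np.ones(n), group, baseline])
beta, *_ = np.linalg.lstsq(X, final, rcond=None)
print(f"adjusted treatment effect: {beta[1]:.2f} TSAT points")
```

The coefficient on the group indicator is the baseline-adjusted between-arm difference, the quantity the trial designates as its primary outcome; adjusting for baseline generally gives more power than comparing raw final values or change scores.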
This investigator-initiated multicentre study has been designed to provide evidence to help nephrologists and their peritoneal dialysis patients determine whether HIP administration more effectively augments iron stores in ESA-treated PD patients than conventional oral iron supplementation.
Australia New Zealand Clinical Trials Registry number ACTRN12609000432213.
There has not been a comprehensive, multi-centre study of streptococcal peritonitis in patients on peritoneal dialysis (PD) to date.
The predictors, treatment and clinical outcomes of streptococcal peritonitis were examined by binary logistic regression and multilevel, multivariate Poisson regression in all Australian PD patients across 66 centres between 2003 and 2006.
Two hundred and eighty-seven episodes of streptococcal peritonitis (4.6% of all peritonitis episodes) occurred in 256 individuals. Its occurrence was independently predicted by Aboriginal or Torres Strait Islander racial origin. Compared with other organisms, streptococcal peritonitis was associated with significantly lower risks of relapse (3% vs 15%), catheter removal (10% vs 23%) and permanent haemodialysis transfer (9% vs 18%), as well as a shorter duration of hospitalisation (5 vs 6 days). Overall, 249 (87%) patients were successfully treated with antibiotics without experiencing relapse, catheter removal or death. The majority of streptococcal peritonitis episodes were treated with either intraperitoneal vancomycin (most common) or first-generation cephalosporins for a median period of 13 days (interquartile range 8–18 days). Initial empiric antibiotic choice did not influence outcomes.
Streptococcal peritonitis is not an infrequent complication of PD and is more common in indigenous patients. When treated with either first-generation cephalosporins or vancomycin for a period of 2 weeks, streptococcal peritonitis is associated with lower risks of relapse, catheter removal and permanent haemodialysis transfer than other forms of PD-associated peritonitis.
Post-transplant anaemia remains a common problem after kidney transplantation, with an incidence ranging from nearly 80% at day 0 to about 25% at 1 year. It has been associated with poor graft outcome, and recently has also been shown to be associated with increased mortality.
Our transplant unit routinely administers oral iron supplements to renal transplant recipients but this is frequently accompanied by side effects, mainly gastrointestinal intolerance. Intravenous iron is frequently administered to dialysis patients and we sought to investigate this mode of administration in transplant recipients after noticing less anaemia in several patients who had received intravenous iron just prior to being called in for transplantation.
This study is a single-centre, prospective, open-label, randomised, controlled trial of oral versus intravenous iron supplements in renal transplant recipients and aims to recruit approximately 100 patients over a 12-month period. Patients will be randomised to receive a single dose of 500 mg iron polymaltose (intravenous iron group) or 2 ferrous sulphate slow-release tablets daily (oral iron group). The primary outcome is time to normalisation of haemoglobin post-transplant. Prospective power calculations have indicated that a minimum of 48 patients in each group would have to be followed up for 3 months in order to have a 90% probability of detecting a halving of the time to correction of haemoglobin levels to ≥110 g/l in iron-treated patients, assuming an α of 0.05. All eligible adult patients undergoing renal transplantation at the Princess Alexandra Hospital will be offered participation in the trial. Exclusion criteria will include iron overload (transferrin saturation >50% or ferritin >800 μg/l), or previous intolerance of either oral or intravenous iron supplements.
If the trial shows a reduction in the time to correction of anaemia with intravenous iron, or fewer side effects than with oral iron, then intravenous iron may become the standard of treatment in this patient group.
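As a rough illustration of the power calculation quoted above, the standard normal-approximation formula for comparing two group means reproduces the figure of 48 patients per group when the target difference corresponds to roughly 0.67 standard deviations (e.g. Δ = 1, σ = 1.5). This is a sketch under that assumed effect size only; the trial's actual calculation, for a time-to-event outcome, may have used a different method.

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.90):
    """Per-group sample size for a two-sample comparison of means
    (normal approximation): n = 2 * ((z_{1-a/2} + z_power) * sigma / delta)^2.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided critical value (~1.96)
    z_power = z.inv_cdf(power)          # quantile for desired power (~1.28)
    return math.ceil(2 * ((z_alpha + z_power) * sigma / delta) ** 2)

# Assumed standardized effect of delta/sigma = 1/1.5 (~0.67 SD),
# alpha = 0.05, power = 90%:
print(n_per_group(delta=1.0, sigma=1.5))  # -> 48
```

Note how the required sample size scales with the inverse square of the standardized effect: halving the detectable difference quadruples the number of patients needed per group.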
Background. Serum creatinine concentration is an unreliable and insensitive marker of chronic kidney disease (CKD). To improve CKD detection, the Australasian Creatinine Consensus Working Committee recommended reporting of estimated glomerular filtration rate (eGFR) using the four-variable Modification of Diet in Renal Disease (MDRD) formula with every request for serum creatinine concentration. The aim of this study was to evaluate the impact of automated laboratory reporting of eGFR on the quantity and quality of referrals to nephrology services in Southeast Queensland, Australia.
Methods. Outpatient referrals to a tertiary and regional renal service, and a single private practice were prospectively audited over 3–12 months prior to and 12 months following the introduction of automated eGFR reporting and concomitant clinician education. The appropriateness of referrals to a nephrologist was assessed according to the Kidney Check Australia Taskforce (KCAT) criteria. Significant changes in the quantity and/or quality of referrals over time were analysed by exponentially weighted moving average (EWMA) charts with control limits based on ±3 standard deviations.
Results. A total of 1019 patients were referred to the centres during the study period. Monthly referrals overall increased by 40% following the introduction of eGFR reporting, and this was most marked for the tertiary renal service (52% above baseline). The appropriateness of nephrologist referrals, as adjudicated by the KCAT criteria, fell significantly from 74.3% in the 3 months pre-eGFR reporting to 65.2% in the 12 months thereafter (P < 0.05). Nevertheless, a greater absolute number of CKD patients were appropriately being referred for nephrologist review in the post-eGFR period (24 versus 15 per month). Patients referred following the introduction of eGFR were significantly more likely to be older (median 63.2 versus 59.3 years, P < 0.05), diabetic (25 versus 18%, P = 0.05) and have stage 3 CKD (48% versus 36%, P < 0.01).
Conclusion. The introduction of automated eGFR calculation has led to an overall increase in referrals with a small but significant decrease in referral quality. The increase in referrals was seen predominantly in older and diabetic patients with stage 3 CKD and appeared to result in net benefit.
chronic kidney disease; Cockcroft–Gault equation; glomerular filtration rate; Modification of Diet in Renal Disease equation; serum creatinine
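The EWMA control-chart analysis described in the Methods above can be sketched generically as follows. This is a standard implementation of the EWMA statistic with ±3-standard-deviation control limits, not the study's actual analysis code; the smoothing constant λ = 0.2 is an assumed illustrative value, and the in-control mean and standard deviation would in practice be estimated from the pre-intervention baseline period.

```python
import math

def ewma_chart(values, mean, sd, lam=0.2, k=3.0):
    """EWMA control chart: z_t = lam*x_t + (1-lam)*z_{t-1}, z_0 = mean.

    Control limits at time t are mean +/- k*sd*sqrt(lam/(2-lam)*(1-(1-lam)^(2t))).
    Returns a list of (ewma, lower_limit, upper_limit, out_of_control) tuples.
    """
    z = mean  # EWMA initialised at the in-control mean
    results = []
    for t, x in enumerate(values, start=1):
        z = lam * x + (1 - lam) * z
        # time-varying variance factor of the EWMA statistic
        var_factor = (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t))
        half_width = k * sd * math.sqrt(var_factor)
        lo, hi = mean - half_width, mean + half_width
        results.append((z, lo, hi, not (lo <= z <= hi)))
    return results

# A sustained shift in monthly referral counts (e.g. from 10 to 15
# per month, with an assumed in-control SD of 1) signals quickly:
shifted = ewma_chart([15.0] * 10, mean=10.0, sd=1.0)
print(shifted[-1][3])  # -> True (out of control)
```

Because the EWMA pools information across successive observations, it detects small sustained shifts in a monthly referral rate sooner than a chart based on individual monthly values alone.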
Chronic kidney disease is a major public health problem globally. Despite this, there are fewer high-quality, high-impact clinical trials in nephrology than in other internal medicine specialties, which has led to large gaps in evidence. To address this deficiency, the Australasian Kidney Trials Network, a collaborative research group, was formed in 2005. Since then, the Network has provided infrastructure and expertise to conduct patient-focused, high-quality, investigator-initiated clinical trials in nephrology. The Network has been successful not only in engaging the nephrology community in Australia and New Zealand but also in forming collaborations with leading researchers from other countries. This article describes the establishment, development, and functions of the Network. The article also discusses the current and future funding strategies to ensure uninterrupted conduct of much-needed clinical trials in nephrology to improve the outcomes of patients affected by kidney diseases with cost-effective interventions.
chronic kidney disease; clinical trials; collaborative research; health policy; investigator initiated; randomized controlled trial