1.  Precision of Biomarkers to Define Chronic Inflammation in CKD 
American Journal of Nephrology  2008;28(5):808-812.
Background/Aims
Several inflammatory biomarkers have been found to be associated with cardiovascular disease or all-cause mortality in dialysis patients, but their usefulness in clinical practice or as surrogate endpoints is not certain. The purpose of the present study was to determine the intrapatient variation of C-reactive protein, IL-6, fetuin-A and albumin in a population of dialysis patients.
Methods
Apparently healthy dialysis patients with either a tunneled dialysis catheter or fistula had monthly assessments of these biomarkers for a total of four determinations, and the intraclass correlation coefficients were calculated as measures of intersubject variance.
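As a point of reference, the intraclass correlation coefficient named here can be computed from a one-way random-effects ANOVA decomposition. Below is a minimal sketch with made-up numbers; it is not the study's data or code.

```python
import numpy as np

# Hypothetical monthly biomarker values (e.g., CRP in mg/L) for 5 patients,
# 4 determinations each; illustrative only, not data from the study.
x = np.array([
    [4.1, 9.8, 3.2, 6.0],
    [1.0, 1.4, 0.8, 1.1],
    [12.5, 7.9, 15.2, 10.4],
    [2.3, 2.8, 5.9, 2.1],
    [7.7, 6.5, 8.8, 14.0],
])
n, k = x.shape  # subjects, repeated measurements per subject

# One-way random-effects ANOVA mean squares.
grand_mean = x.mean()
ms_between = k * ((x.mean(axis=1) - grand_mean) ** 2).sum() / (n - 1)
ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))

# ICC(1): the share of total variance attributable to between-subject
# differences. A low value signals large within-subject (visit-to-visit)
# fluctuation, the pattern this abstract reports.
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1) = {icc:.2f}")
```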
Results
Our results showed large within-subject variation relative to the total variation in the measurements (31-46%). Having a tunneled catheter as opposed to a fistula was not significantly associated with mean levels, suggesting that chronic subclinical catheter infection does not explain the variation seen in the biomarkers. In contrast, there was a rapid change in these biomarkers with a clinically apparent acute infection.
Conclusion
These results suggest that these biomarkers have limitations for use as surrogate endpoints in clinical trials due to wide fluctuations, even in apparently clinically healthy individuals.
doi:10.1159/000135692
PMCID: PMC2574778  PMID: 18506106
Biomarkers, precision; Chronic inflammation; Chronic kidney disease; CKD stage 5D; Inflammatory biomarkers, intrapatient variance; Tunneled dialysis catheter
2.  Effects of starting hemodialysis with an arteriovenous fistula or central venous catheter compared with peritoneal dialysis: a retrospective cohort study 
BMC Nephrology  2012;13:88.
Background
Although several studies have demonstrated early survival advantages with peritoneal dialysis (PD) over hemodialysis (HD), the reason for the excess mortality observed among incident HD patients has, to our knowledge, yet to be established. This study explores the relationship between mortality and dialysis modality, focusing on the role of HD vascular access type at the time of dialysis initiation.
Methods
A retrospective cohort study was performed among local adult chronic kidney disease patients who consecutively initiated dialysis in our institution in 2008, either with PD or with HD via a tunneled cuffed venous catheter (HD-TCC) or a functional arteriovenous fistula (HD-AVF). A total of 152 patients were included in the final analysis (HD-AVF, n = 59; HD-TCC, n = 51; PD, n = 42). All-cause and dialysis access-related morbidity/mortality were evaluated at one year. Univariate and multivariate analyses were used to compare the survival of PD patients with that of patients who initiated HD with an AVF or with a TCC.
Results
Compared with PD patients, both HD-AVF and HD-TCC patients were more likely to be older (p<0.001) and to have a higher frequency of diabetes mellitus (p = 0.017) and cardiovascular disease (p = 0.020). Overall, HD-TCC patients were more likely to have clinical visits (p = 0.069), emergency room visits (p<0.001) and hospital admissions (p<0.001). At the end of follow-up, HD-TCC patients had a higher rate of dialysis access-related complications (1.53 vs. 0.93 vs. 0.64 per patient-year; p<0.001) and hospitalizations (0.47 vs. 0.07 vs. 0.14 per patient-year; p = 0.034) than HD-AVF and PD patients, respectively. The survival rates at one year were 96.6%, 74.5% and 97.6% for the HD-AVF, HD-TCC and PD groups, respectively (p<0.001). In multivariate analysis, HD-TCC use at the time of dialysis initiation was an important factor associated with death (HR 16.128; 95% CI 1.431-181.778; p = 0.024).
Conclusion
Our results suggest that HD vascular access type at the time of renal replacement therapy initiation is an important modifier of the relationship between dialysis modality and survival among incident dialysis patients.
doi:10.1186/1471-2369-13-88
PMCID: PMC3476986  PMID: 22916962
3.  Care of undocumented-uninsured immigrants in a large urban dialysis unit 
BMC Nephrology  2012;13:112.
Background
Medical, ethical and financial dilemmas may arise in treating undocumented-uninsured patients with end-stage renal disease (ESRD). Here we describe the 10-year experience of treating undocumented-uninsured ESRD patients in a large public dialysis unit.
Methods
We evaluated the medical files of all chronic dialysis patients treated at the Tel-Aviv Medical Center between 2000 and 2010. Data were obtained for all immigrant patients without documentation and medical insurance. Clinical data were compared with those of an age-matched cohort of 77 insured dialysis patients.
Results
Fifteen undocumented-uninsured patients were treated with chronic scheduled dialysis therapy for a mean length of 2.3 years and a total of 4953 hemodialysis sessions, despite lack of reimbursement. All undocumented-uninsured patients presented initially with symptoms attributed to uremia and with stage 5 chronic kidney disease (CKD). In comparison, in the age-matched cohort, only 6 patients (8%) were initially evaluated by a nephrologist at stage 5 CKD. Levels of hemoglobin (8.5 ± 1.7 versus 10.8 ± 1.6 g/dL; p < 0.0001) and albumin (33.8 ± 4.8 versus 37.7 ± 3.9 g/L; p < 0.001) were lower in the undocumented-uninsured dialysis patients compared with the age-matched insured patients at initiation of hemodialysis therapy. These significant changes were persistent throughout the treatment period. Hemodialysis was performed in all the undocumented-uninsured patients via tunneled cuffed catheters (TCC) without higher rates of TCC-associated infections. The rate of skipped hemodialysis sessions was similar in the undocumented-uninsured and age-matched insured cohorts.
Conclusions
Undocumented-uninsured dialysis patients initially presented at advanced stages of CKD, with lower hemoglobin levels and worse nutritional status than age-matched insured patients. The type of vascular access for hemodialysis was less than optimal with regard to current guidelines. The national and international nephrology communities need to establish a policy concerning the treatment of undocumented-uninsured patients with CKD.
doi:10.1186/1471-2369-13-112
PMCID: PMC3615959  PMID: 22992409
Dialysis; ESRD; Undocumented; Uninsured; Immigrants
4.  Risk Models to Predict Chronic Kidney Disease and Its Progression: A Systematic Review 
PLoS Medicine  2012;9(11):e1001344.
A systematic review of risk prediction models conducted by Justin Echouffo-Tcheugui and Andre Kengne examines the evidence base for prediction of chronic kidney disease risk and its progression, and suitability of such models for clinical use.
Background
Chronic kidney disease (CKD) is common, and associated with increased risk of cardiovascular disease and end-stage renal disease, which are potentially preventable through early identification and treatment of individuals at risk. Although risk factors for occurrence and progression of CKD have been identified, their utility for CKD risk stratification through prediction models remains unclear. We critically assessed risk models to predict CKD and its progression, and evaluated their suitability for clinical use.
Methods and Findings
We systematically searched MEDLINE and Embase (1 January 1980 to 20 June 2012). Dual review was conducted to identify studies that reported on the development, validation, or impact assessment of a model constructed to predict the occurrence/presence of CKD or progression to advanced stages. Data were extracted on study characteristics, risk predictors, discrimination, calibration, and reclassification performance of models, as well as validation and impact analyses. We included 26 publications reporting on 30 CKD occurrence prediction risk scores and 17 CKD progression prediction risk scores. The vast majority of CKD risk models had acceptable-to-good discriminatory performance (area under the receiver operating characteristic curve >0.70) in the derivation sample. Calibration was less commonly assessed, but overall was found to be acceptable. Only eight CKD occurrence and five CKD progression risk models have been externally validated, displaying modest-to-acceptable discrimination. Whether novel biomarkers of CKD (circulatory or genetic) can improve prediction largely remains unclear, and impact studies of CKD prediction models have not yet been conducted. Limitations of risk models include the lack of ethnic diversity in derivation samples, and the scarcity of validation studies. The review is limited by the lack of an agreed-on system for rating prediction models, and the difficulty of assessing publication bias.
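The two headline measures in this paragraph, discrimination and calibration, can be computed with standard tools. A minimal sketch on simulated data follows; the data and names are our own construction, unrelated to any model in the review.

```python
import numpy as np
from sklearn.calibration import calibration_curve
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Simulated validation sample: predicted CKD risks and observed outcomes,
# well calibrated by construction. Illustrative only.
risk = rng.uniform(0.0, 1.0, 2000)
outcome = rng.binomial(1, risk)

# Discrimination: area under the ROC curve (the review's >0.70 benchmark).
auc = roc_auc_score(outcome, risk)
print(f"AUC = {auc:.2f}")

# Calibration: observed event rate vs. mean predicted risk per risk decile.
obs, pred = calibration_curve(outcome, risk, n_bins=10)
for o, p in zip(obs, pred):
    print(f"predicted {p:.2f} -> observed {o:.2f}")
```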
Conclusions
The development and clinical application of renal risk scores is in its infancy; however, the discriminatory performance of existing tools is acceptable. The effect of using these models in practice is still to be explored.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Chronic kidney disease (CKD)—the gradual loss of kidney function—is increasingly common worldwide. In the US, for example, about 26 million adults have CKD, and millions more are at risk of developing the condition. Throughout life, small structures called nephrons inside the kidneys filter waste products and excess water from the blood to make urine. If the nephrons stop working because of injury or disease, the rate of blood filtration decreases, and dangerous amounts of waste products such as creatinine build up in the blood. Symptoms of CKD, which rarely occur until the disease is very advanced, include tiredness, swollen feet and ankles, puffiness around the eyes, and frequent urination, especially at night. There is no cure for CKD, but progression of the disease can be slowed by controlling high blood pressure and diabetes, both of which cause CKD, and by adopting a healthy lifestyle. The same interventions also reduce the chances of CKD developing in the first place.
Why Was This Study Done?
CKD is associated with an increased risk of end-stage renal disease, which is treated with dialysis or by kidney transplantation (renal replacement therapies), and of cardiovascular disease. These life-threatening complications are potentially preventable through early identification and treatment of CKD, but most people present with advanced disease. Early identification would be particularly useful in developing countries, where renal replacement therapies are not readily available and resources for treating cardiovascular problems are limited. One way to identify people at risk of a disease is to use a “risk model.” Risk models are constructed by testing the ability of different combinations of risk factors that are associated with a specific disease to identify those individuals in a “derivation sample” who have the disease. The model is then validated on an independent group of people. In this systematic review (a study that uses predefined criteria to identify all the research on a given topic), the researchers critically assess the ability of existing CKD risk models to predict the occurrence of CKD and its progression, and evaluate their suitability for clinical use.
What Did the Researchers Do and Find?
The researchers identified 26 publications reporting on 30 risk models for CKD occurrence and 17 risk models for CKD progression that met their predefined criteria. The risk factors most commonly included in these models were age, sex, body mass index, diabetes status, systolic blood pressure, serum creatinine, protein in the urine, and serum albumin or total protein. Nearly all the models had acceptable-to-good discriminatory performance (a measure of how well a model separates people who have a disease from people who do not have the disease) in the derivation sample. Not all the models had been calibrated (assessed for whether the average predicted risk within a group matched the proportion that actually developed the disease), but in those that had been assessed calibration was good. Only eight CKD occurrence and five CKD progression risk models had been externally validated; discrimination in the validation samples was modest-to-acceptable. Finally, very few studies had assessed whether adding extra variables to CKD risk models (for example, genetic markers) improved prediction, and none had assessed the impact of adopting CKD risk models on the clinical care and outcomes of patients.
What Do These Findings Mean?
These findings suggest that the development and clinical application of CKD risk models is still in its infancy. Specifically, these findings indicate that the existing models need to be better calibrated and need to be externally validated in different populations (most of the models were tested only in predominantly white populations) before they are incorporated into guidelines. The impact of their use on clinical outcomes also needs to be assessed before their widespread use is recommended. Such research is worthwhile, however, because of the potential public health and clinical applications of well-designed risk models for CKD. Such models could be used to identify segments of the population that would benefit most from screening for CKD, for example. Moreover, risk communication to patients could motivate them to adopt a healthy lifestyle and to adhere to prescribed medications, and the use of models for predicting CKD progression could help clinicians tailor disease-modifying therapies to individual patient needs.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001344.
This study is further discussed in a PLOS Medicine Perspective by Maarten Taal
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including a selection of patient experiences of kidney disease
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
doi:10.1371/journal.pmed.1001344
PMCID: PMC3502517  PMID: 23185136
5.  A Systematic Review and Meta-Analysis of Utility-Based Quality of Life in Chronic Kidney Disease Treatments 
PLoS Medicine  2012;9(9):e1001307.
Melanie Wyld and colleagues examined previously published studies to assess pooled utility-based quality of life of the various treatments for chronic kidney disease. They conclude that the highest utility was for kidney transplants, with home-based automated peritoneal dialysis being second.
Background
Chronic kidney disease (CKD) is a common and costly condition to treat. Economic evaluations of health care often incorporate patient preferences for health outcomes using utilities. The objective of this study was to determine pooled utility-based quality of life (the numerical value attached to the strength of an individual's preference for a specific health outcome) by CKD treatment modality.
Methods and Findings
We conducted a systematic review, meta-analysis, and meta-regression of peer-reviewed published articles and of PhD dissertations published through 1 December 2010 that reported utility-based quality of life (utility) for adults with late-stage CKD. Studies reporting utilities by proxy (e.g., reported by a patient's doctor or family member) were excluded.
In total, 190 studies reporting 326 utilities from over 56,000 patients were analysed. There were 25 utilities from pre-treatment CKD patients, 226 from dialysis patients (haemodialysis, n = 163; peritoneal dialysis, n = 44), 66 from kidney transplant patients, and three from patients treated with non-dialytic conservative care. Using time tradeoff as a referent instrument, kidney transplant recipients had a mean utility of 0.82 (95% CI: 0.74, 0.90). The mean utility was comparable in pre-treatment CKD patients (difference = −0.02; 95% CI: −0.09, 0.04), 0.11 lower in dialysis patients (95% CI: −0.15, −0.08), and 0.2 lower in conservative care patients (95% CI: −0.38, −0.01). Patients treated with automated peritoneal dialysis had a significantly higher mean utility (0.80) than those on continuous ambulatory peritoneal dialysis (0.72; p = 0.02). The mean utility of transplant patients increased over time, from 0.66 in the 1980s to 0.85 in the 2000s, an increase of 0.19 (95% CI: 0.11, 0.26). Utility varied by elicitation instrument, with standard gamble producing the highest estimates, and the SF-6D by Brazier et al., University of Sheffield, producing the lowest estimates. The main limitations of this study were that treatment assignments were not random, that only transplant had longitudinal data available, and that we calculated EuroQol Group EQ-5D scores from SF-36 and SF-12 health survey data, and therefore the algorithms may not reflect EQ-5D scores measured directly.
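The pooled utilities quoted above are meta-analytic estimates. As a simplified illustration of the underlying mechanics, here is a fixed-effect inverse-variance pooling sketch with hypothetical study-level values; the authors' actual analysis also involved meta-regression and random effects and is more involved.

```python
import numpy as np

# Hypothetical study-level mean utilities and standard errors for one
# treatment modality; illustrative only, not values from the review.
means = np.array([0.80, 0.85, 0.78, 0.84])
se = np.array([0.03, 0.05, 0.04, 0.02])

# Fixed-effect inverse-variance weighting: precise studies count for more.
w = 1.0 / se**2
pooled = (w * means).sum() / w.sum()
pooled_se = (1.0 / w.sum()) ** 0.5
print(f"pooled utility = {pooled:.2f} (95% CI ±{1.96 * pooled_se:.2f})")
```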
Conclusions
For patients with late-stage CKD, treatment with dialysis is associated with a significant decrement in quality of life compared to treatment with kidney transplantation. These findings provide evidence-based utility estimates to inform economic evaluations of kidney therapies, useful for policy makers and in individual treatment discussions with CKD patients.
Editors' Summary
Background
Ill health can adversely affect an individual's quality of life, particularly if caused by long-term (chronic) conditions, such as chronic kidney disease—in the United States alone, 23 million people have chronic kidney disease, of whom 570,000 are treated with dialysis or kidney transplantation. In order to measure the cost-effectiveness of interventions to manage medical conditions, health economists use an objective measurement known as quality-adjusted life years. However, although useful, quality-adjusted life years are often criticized for not taking into account the views and preferences of the individuals with the medical conditions. A measurement called a utility solves this problem. A utility is a numerical value (measured on a 0 to 1 scale, where 0 represents death and 1 represents full health) of the strength of an individual's preference for specified health-related outcomes, as measured by “instruments” (questionnaires) that rate direct comparisons or assess quality of life.
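As a worked illustration of how utilities feed into quality-adjusted life years (our own example, using round numbers close to the pooled estimates reported below):

$$\text{QALYs} = \sum_i u_i \, t_i, \qquad 5~\text{years} \times 0.82 = 4.1~\text{QALYs (transplant)}, \qquad 5~\text{years} \times 0.71 \approx 3.6~\text{QALYs (dialysis)},$$

so five years spent in each state differ by roughly half a quality-adjusted life year.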
Why Was This Study Done?
Previous studies have suggested that, in people with chronic kidney disease, quality of life (as measured by utility) is higher in those with a functioning kidney transplant than in those on dialysis. However, currently, it is unclear whether the type of dialysis affects quality of life: hemodialysis is a highly technical process that directly filters the blood, usually must be done 2–4 times a week, and can only be performed in a health facility; peritoneal dialysis, in which fluids are infused into the abdominal cavity, can be done nightly at home (automated peritoneal dialysis) or throughout the day (continuous ambulatory peritoneal dialysis). In this study, the researchers reviewed and assimilated all of the available evidence to investigate whether quality of life in people with chronic kidney disease (as measured by utility) differed according to treatment type.
What Did the Researchers Do and Find?
The researchers did a comprehensive search of 11 databases to identify all relevant studies that included people with severe (stage 3, 4, or 5) chronic kidney disease, their form of treatment, and information on utilities—either reported directly, or included in quality of life instruments (SF-36), so the researchers could calculate utilities by using a validated algorithm. The researchers also recorded the prevalence rates of diabetes in study participants. Then, using statistical models that adjusted for various factors, including treatment type and the method of measuring utilities, the researchers were able to calculate the pooled utilities of each form of treatment for chronic kidney disease.
The researchers included 190 studies, representing over 56,000 patients and generating 326 utility estimates, in their analysis. The majority of utilities (77%) were derived through the SF-36 questionnaire via calculation. Of the 326 utility estimates, 25 were from patients pre-dialysis, 226 were from dialysis patients (the majority of whom were receiving hemodialysis), 66 were from kidney transplant patients, and three were from conservative care patients. The researchers found that the highest average utility was for those who had renal transplantation, 0.82, followed by the pre-dialysis group (0.80), dialysis patients (0.71), and, finally, patients receiving conservative care (0.62). When comparing the type of dialysis, the researchers found that there was little difference in utility between hemodialysis and peritoneal dialysis, but patients using automated peritoneal dialysis had, on average, a higher utility (0.80) than those treated with continuous ambulatory peritoneal dialysis (0.72). Finally, the researchers found that patient groups with diabetes had significantly lower utilities than those without diabetes.
What Do These Findings Mean?
These findings suggest that in people with chronic kidney disease, renal transplantation is the best treatment option to improve quality of life. For those on dialysis, home-based automated peritoneal dialysis may improve quality of life more than the other forms of dialysis: this finding is important, as this type of dialysis is not as widely used as other forms and is also cheaper than hemodialysis. Furthermore, these findings suggest that patients who choose conservative care have significantly lower quality of life than patients treated with dialysis, a finding that warrants further investigation. Overall, in addition to helping to inform economic evaluations of treatment options, the information from this analysis can help guide clinicians caring for patients with chronic kidney disease in their discussions about possible treatment options.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001307.
Information about chronic kidney disease is available from the National Kidney Foundation and MedlinePlus
Wikipedia gives information on general utilities (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001307
PMCID: PMC3439392  PMID: 22984353
6.  Arteriovenous Graft Placement in Predialysis Patients: A Potential Catheter-Sparing Strategy 
Background
When pre-dialysis patients are deemed to be unsuitable candidates for an arteriovenous fistula, current guidelines recommend waiting until just before or after initiation of dialysis before placing a graft. This strategy may increase catheter use when these patients start dialysis. We compared the outcomes of patients whose grafts were placed before and after dialysis initiation.
Study design
Retrospective analysis of a prospective computerized vascular access database.
Setting and participants
Chronic kidney disease patients receiving their first arteriovenous graft (n=248) at a large medical center.
Predictor
Timing of graft placement (before or after initiation of dialysis).
Outcome & measurements
Primary graft failure, cumulative graft survival, catheter-dependence, and catheter-related bacteremia.
Results
The first graft was placed pre-dialysis in 62 patients and post-dialysis in 186 patients. Primary graft failure was similar for pre-dialysis and post-dialysis grafts (20% vs 24%; p=0.5). Median cumulative graft survival was similar for pre-dialysis and post-dialysis grafts (365 vs 414 days; HR 1.22; 95% CI, 0.81–1.98; p=0.3). The median duration of catheter dependence after graft placement in the post-dialysis group was 48 days and was associated with 0.63 (95% CI, 0.48–0.79) episodes of catheter-related bacteremia per patient.
Limitations
Retrospective analysis, single medical center
Conclusion
Grafts placed pre-dialysis have similar primary failure rates and cumulative survival to those placed after starting dialysis. However, post-dialysis graft placement is associated with prolonged catheter dependence and frequent bacteremia. Pre-dialysis graft placement may reduce catheter-dependence and bacteremia in selected patients.
doi:10.1053/j.ajkd.2011.01.026
PMCID: PMC4034174  PMID: 21458898
7.  An integrated review of "unplanned" dialysis initiation: reframing the terminology to "suboptimal" initiation 
BMC Nephrology  2009;10:22.
Background
Ideally, care prior to the initiation of dialysis should increase the likelihood that patients start electively outside of the hospital setting with a mature arteriovenous fistula (AVF) or peritoneal dialysis (PD) catheter. However, unplanned dialysis continues to occur in patients both known and unknown to nephrology services, and in both late and early referrals. The objective of this article is to review the clinical and socioeconomic outcomes of unplanned dialysis initiation. The secondary objective is to explore the potential cost implications of reducing the rate of unplanned first dialysis in Canada.
Methods
MEDLINE and EMBASE from inception to 2008 were used to identify studies examining the clinical, economic or quality of life (QoL) outcomes in patients with an unplanned versus planned first dialysis. Data were described in a qualitative manner.
Results
Eight European studies (5,805 patients) were reviewed. Duration of hospitalization and mortality were higher in the unplanned than in the planned population. Patients undergoing a first unplanned dialysis had significantly worse laboratory parameters and QoL. Rates of unplanned dialysis ranged from 24% to 49%. The total annual burden to the Canadian healthcare system of unplanned dialysis in 2005 was estimated at $33 million in direct hospital costs alone. Reducing the rate of unplanned dialysis by one-half would yield savings ranging from $13.3 to $16.1 million.
Conclusion
The clinical and socioeconomic impact of unplanned dialysis is significant. To more consistently characterize the unplanned population, the term suboptimal initiation is proposed to include dialysis initiation in hospital and/or with a central venous catheter and/or with a patient not starting on their chronic modality of choice. Further research and implementation of initiatives to reduce the rate of suboptimal initiation of dialysis in Canada are needed.
doi:10.1186/1471-2369-10-22
PMCID: PMC2735745  PMID: 19674452
8.  Duration of temporary catheter use for hemodialysis: an observational, prospective evaluation of renal units in Brazil 
BMC Nephrology  2011;12:63.
Background
For chronic hemodialysis, the ideal permanent vascular access is the arteriovenous fistula (AVF). Temporary catheters should be reserved for acute dialysis needs. The AVF is associated with lower infection rates, better clinical results, and higher quality of life and survival than temporary catheters. In Brazil, the proportion of patients using a temporary catheter for more than 3 months from the beginning of therapy is used as an indicator of the quality of renal units. The aim of this study is to evaluate factors associated with the time between the beginning of hemodialysis with a temporary catheter and the placement of the first arteriovenous fistula in Brazil.
Methods
This is an observational, prospective, non-concurrent study using national administrative registries of all patients financed by the public health system who began renal replacement therapy (RRT) between 2000 and 2004 in Brazil. Eligible patients were incident patients undergoing hemodialysis for the first time. Patients were excluded if they had hemodialysis reportedly started after the date of death (inconsistent database), were younger than 18 years old, had HIV, had no record of the first dialysis unit, or were dialyzed in units with fewer than twenty patients. To evaluate individual and renal-unit factors associated with the event of interest, a frailty model was used (N = 55,589).
Results
Among the 23,824 patients (42.9%) who underwent fistula placement during the study period, 18.2% kept the temporary catheter for more than three months until fistula creation. The analysis identified five statistically significant factors associated with a longer time to first fistula: higher age (hazard ratio [HR] 0.99, 95% CI 0.99-1.00); hypertension and cardiovascular disease as the cause of chronic renal disease (HR 0.94, 95% CI 0.9-0.98); residing in a capital city (HR 0.92, 95% CI 0.9-0.95); residing in certain regions of Brazil - South (HR 0.83, 95% CI 0.8-0.87), Midwest (HR 0.88, 95% CI 0.83-0.94), Northeast (HR 0.91, 95% CI 0.88-0.94), or North (HR 0.88, 95% CI 0.83-0.94); and the type of renal unit (public or private).
Conclusion
Monitoring the provision of arteriovenous fistulas in renal units could improve the care given to patients with end stage renal disease.
doi:10.1186/1471-2369-12-63
PMCID: PMC3227575  PMID: 22093280
9.  Cohort profile: Canadian study of prediction of death, dialysis and interim cardiovascular events (CanPREDDICT) 
BMC Nephrology  2013;14:121.
Background
The Canadian Study of Prediction of Death, Dialysis and Interim Cardiovascular Events (CanPREDDICT) is a large, prospective, pan-Canadian, cohort study designed to improve our understanding of determinants of renal and cardiovascular (CV) disease progression in patients with chronic kidney disease (CKD). The primary objective is to clarify the associations between traditional and newer biomarkers in the prediction of specific renal and CV events, and of death in patients with CKD managed by nephrologists. This information could then be used to better understand biological variation in outcomes, to develop clinical prediction models and to inform enrolment into interventional studies which may lead to novel treatments.
Methods/Designs
Enrolment commenced in 2008; 2546 patients with eGFR between 15 and 45 ml/min/1.73 m2 have been enrolled from a representative sample in 25 rural, urban, academic and non-academic centres across Canada. Patients are to be followed for an initial 3 years at 6-monthly intervals, and subsequently annually. Traditional biomarkers include eGFR, urine albumin:creatinine ratio (uACR), hemoglobin (Hgb), phosphate and albumin. Newer biomarkers of interest were selected on the basis of biological relevance to important processes, commercial availability and assay reproducibility. They include asymmetric dimethylarginine (ADMA), N-terminal pro-brain natriuretic peptide (NT-pro-BNP), troponin I, cystatin C, high-sensitivity C-reactive protein (hsCRP), interleukin-6 (IL6) and transforming growth factor beta 1 (TGFβ1). Blood and urine samples are collected at baseline and every 6 months, and stored at −80°C. Outcomes of interest include renal replacement therapy, CV events and death, the latter two of which are adjudicated by an independent panel.
Discussion
The baseline distribution of the newer biomarkers does not appear to track with markers of kidney function and may therefore offer some discriminatory value in predicting future outcomes. The granularity of the data presented at baseline may foster additional questions.
The value of the cohort as a unique resource to understand outcomes of patients under the care of nephrologists in a single payer healthcare system cannot be overstated. Systematic collection of demographic, laboratory and event data should lead to new insights.
The mean age of the cohort was 68 years; 90% were Caucasian, 62% were male, and 48% had diabetes. Forty percent of the cohort had eGFR between 30 and 45 mL/min/1.73 m2, 22% had eGFR values below 20 mL/min/1.73 m2, and 61% had uACR < 30. Serum albumin, hemoglobin, calcium and 25-hydroxyvitamin D (25(OH)D) levels were progressively lower in the lower eGFR strata, while parathyroid hormone (PTH) levels increased. Cystatin C, ADMA, NT-proBNP, hsCRP, troponin I and IL-6 were significantly higher in the lower GFR strata, whereas 25(OH)D and TGFβ1 values were lower at lower GFR. The distributions of each of the newer biomarkers by eGFR and uACR categories were variable.
doi:10.1186/1471-2369-14-121
PMCID: PMC3691726  PMID: 23758910
Chronic kidney disease; Biomarkers; Observational cohort study; Outcomes; Progression; CV disease
10.  Barriers to successful implementation of care in home haemodialysis (BASIC-HHD):1. Study design, methods and rationale 
BMC Nephrology  2013;14:197.
Background
Ten years on from the National Institute for Health and Clinical Excellence technology appraisal guideline on haemodialysis in 2002, the clinical community has yet to rise to the challenge of providing home haemodialysis (HHD) to 10-15% of the dialysis cohort. The renal registry report suggests underutilization of a treatment type that has attracted considerable research interest and several publications worldwide on its apparent benefit for both the physical and mental health of patients. An understanding of the drivers to introducing and sustaining the modality, from organizational, economic, clinical and patient perspectives, is fundamental to realizing the full benefits of the therapy, with the potential to provide an evidence base for effective care models. Through the BASIC-HHD study, we seek to understand the clinical, patient- and carer-related psychosocial, economic and organisational determinants of successful uptake and maintenance of home haemodialysis, and thereby to engage all major stakeholders in the process.
Design and methods
We have adopted an integrated mixed methodology (convergent, parallel design) for this study. The study arms include (a) patient, (b) organization, (c) carer and (d) economic evaluation. The three patient study cohorts (n = 500) include pre-dialysis patients (200), hospital haemodialysis patients (200) and home haemodialysis patients (100) from geographically distinct NHS sites across the country, with variable prevalence of home haemodialysis. The pre-dialysis patients will also be followed up prospectively for 12 months from study entry to understand their journey to renal replacement therapy; subsequently, before-and-after studies will be carried out for the select few who commence dialysis in the study period. The process will entail quantitative methods and ethnographic interviews of all groups in the study. Data collection will involve clinical and biomarker data, quantitative psychosocial assessments and neuropsychometric tests in patients. Organizational attitudes and dialysis unit practices will be studied together with perceptions of healthcare providers on provision of home HD. Economic evaluation of home and hospital haemodialysis practices will also be undertaken, and we will apply scenario ("what … if") analysis using system dynamics modeling to investigate the impact of different policy choices and financial models on dialysis technology adoption, care pathways and costs. Less attention is often given to patients' carers, who provide informal support, often of a complex nature, to patients afflicted by chronic ailments such as end-stage kidney disease. Engaging the carers is fundamental to realizing the full benefits of a complex, home-based intervention, and a qualitative study of the carers will be undertaken to elicit their fears, concerns and perceptions of home HD before and after the patient's commencement of treatment. The data sets will be analysed independently, and the findings will be mixed at the interpretation stage to form a coherent message that will inform practice in the future.
Discussion
The BASIC-HHD study is designed to assemble pivotal information on dialysis modality choice and uptake, investigating users, care-givers and care delivery processes and studying their variation in a multi-layered analytical approach within a single health care system. The study results should inform modality-specific service and patient pathway redesign.
Study Registration
This study has been reviewed and approved by the Greater Manchester West Health Research Authority National Research Ethics Service (NRES). The study is on the NIHR (CLRN) portfolio.
doi:10.1186/1471-2369-14-197
PMCID: PMC3851985  PMID: 24044499
Barriers; Home haemodialysis; Mixed methods; Qualitative; Organisation; Adoption; Quality of life
11.  Hemodialysis in children: general practical guidelines 
Over the past 20 years children have benefited from major improvements in both the technology and the clinical management of dialysis. Morbidity during dialysis sessions has decreased, with seizures being exceptional and hypotensive episodes rare. Pain and discomfort have been reduced with the use of chronic internal jugular venous catheters and anesthetic creams for fistula puncture. Non-invasive technologies to assess patient target dry weight and access flow can significantly reduce patient morbidity and health care costs. The development of urea kinetic modeling enables calculation of the delivered dialysis dose, Kt/V, and an indirect assessment of protein intake. Nutritional assessment and support are of major importance for the growing child. Even if the validity of these “urea only” data is questioned, their analysis provides information useful for follow-up. Newer machines provide more precise control of ultrafiltration by volumetric assessment and continuous blood volume monitoring during dialysis sessions. Buffered bicarbonate solutions are now standard, and more biocompatible synthetic membranes and specific small-size materials (dialyzers and tubing) have been developed for young infants. More recently, the concept of “ultrapure” dialysate, i.e. dialysate free from microbiological contamination and endotoxins, has developed. This will enable the use of hemodiafiltration, especially with the on-line option, which has many theoretical advantages and should be considered where maximum/optimum dialysis is needed. Although the optimum dialysis dose requirement for children remains uncertain, reports of longer-duration and/or daily dialysis show that these regimens are more effective for phosphate control than conventional hemodialysis and should be considered at least for some high-risk patients with cardiovascular impairment. In children hemodialysis has to be individualized and viewed as an “integrated therapy” in light of their long-term exposure to chronic renal failure treatment. Dialysis is seen only as a temporary measure for children, since renal transplantation offers the best chance of rehabilitation in terms of educational and psychosocial functioning. For children on long-term chronic dialysis, however, the highest standards should be applied to preserve their future “cardiovascular life”; this might include more dialysis time and on-line hemodiafiltration with synthetic high-flux membranes if we are able to improve on the rather restricted concept of small-solute urea dialysis clearance.
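For reference, the dialysis dose Kt/V mentioned above is, in its simplest single-pool form (ignoring urea generation and fluid removal), the urea clearance K multiplied by session length t, divided by the urea distribution volume V. A worked example with round illustrative numbers:

$$Kt/V = \frac{200~\text{ml/min} \times 240~\text{min}}{30{,}000~\text{ml}} = 1.6.$$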
doi:10.1007/s00467-005-1876-y
PMCID: PMC1766474  PMID: 15947992
Hemodialysis; Children; Guidelines
12.  Rationale and design of the HEALTHY-CATH trial: A randomised controlled trial of Heparin versus EthAnol Lock THerapY for the prevention of Catheter Associated infecTion in Haemodialysis patients 
BMC Nephrology  2009;10:23.
Background
Catheter-related bacteraemias (CRBs) contribute significantly to morbidity, mortality and health care costs in dialysis populations. Despite international guidelines recommending avoidance of catheters for haemodialysis access, hospital admissions for CRBs have doubled in the last decade. The primary aim of the study is to determine whether weekly instillation of 70% ethanol prevents CRBs compared with standard heparin saline.
Methods/design
The study will follow a prospective, open-label, randomized controlled design. Eligible patients are adults with incident or prevalent tunneled intravenous dialysis catheters on three-times-weekly haemodialysis, with no current evidence of catheter infection and no personal, cultural or religious objection to ethanol use, who are on adequate contraception and are able to give informed consent. Patients will be randomized 1:1 to receive 3 mL of intravenous-grade 70% ethanol into each lumen of the catheter once a week and standard heparin locks on other dialysis days, or to receive heparin locks only. The primary outcome measure will be time to the first episode of CRB, defined using standard objective criteria. Secondary outcomes will include adverse reactions, incidence of CRB caused by different pathogens, time to infection-related catheter removal, time to exit-site infections, and costs. Prospective power calculations indicate that the study will have 80% statistical power to detect a clinically significant increase in median infection-free survival from 200 days to 400 days if 56 patients are recruited into each arm.
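The quoted figure can be sanity-checked with Schoenfeld's events formula for a two-arm log-rank comparison under proportional hazards; this is a standard approximation, not necessarily the method the investigators used. Under exponential survival, doubling the median infection-free survival corresponds to a hazard ratio of 0.5.

```python
from math import ceil, log

from scipy.stats import norm

# Schoenfeld approximation: total events needed for a 1:1 two-arm
# log-rank test under proportional hazards.
alpha, power = 0.05, 0.80
hr = 200 / 400  # doubling median survival halves the hazard (exponential)

z_a = norm.ppf(1 - alpha / 2)
z_b = norm.ppf(power)
events = 4 * (z_a + z_b) ** 2 / log(hr) ** 2
print(ceil(events))  # ~66 CRB events required in total
```

Translating the required events into 56 patients per arm additionally needs assumptions about accrual, follow-up and event probability, which the abstract does not state.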
Discussion
This investigator-initiated study has been designed to provide evidence to help nephrologists reduce the incidence of CRBs in haemodialysis patients with tunnelled intravenous catheters.
Trial Registration
Australian New Zealand Clinical Trials Registry Number: ACTRN12609000493246
doi:10.1186/1471-2369-10-23
PMCID: PMC2738669  PMID: 19691852
13.  Dialysis-associated peritonitis in children 
Peritonitis remains a frequent complication of peritoneal dialysis in children and is the most common reason for technique failure. The microbiology is characterized by a predominance of Gram-positive organisms, with fungi responsible for less than 5% of episodes. Data collected by the International Pediatric Peritonitis Registry have revealed worldwide variation in the bacterial etiology of peritonitis, as well as in the rate of culture-negative peritonitis. Risk factors for infection include young age, the absence of prophylactic antibiotics at catheter placement, spiking of dialysis bags, and the presence of a catheter exit-site or tunnel infection. Clinical symptoms at presentation are somewhat organism-specific and can be objectively assessed with a Disease Severity Score. Whereas recommendations for empiric antibiotic therapy in children have been published by the International Society for Peritoneal Dialysis, epidemiologic and antibiotic susceptibility data suggest that it may be desirable to take the patient- and center-specific history of microorganisms and their sensitivity patterns into account when prescribing initial therapy. The vast majority of patients are treated successfully and continue peritoneal dialysis, with the poorest outcomes noted in patients with peritonitis secondary to Gram-negative organisms or fungi and in those with a relapsing infection.
doi:10.1007/s00467-008-1113-6
PMCID: PMC2810362  PMID: 19190935
Antibiotics; Children; Infection; Peritonitis; Peritoneal dialysis
14.  Laparoscopic Placement and Revision of Peritoneal Dialysis Catheters 
Chronic peritoneal dialysis is an option for many patients with end-stage renal disease. Laparoscopy offers an alternative approach in the management of dialysis patients. Over an 18-month period, laparoscopy was used for placement or revision of seven peritoneal dialysis catheters. All were placed in patients with end-stage renal disease for chronic dialysis. Two catheters were initially placed using the laparoscope, and in five other patients the position of the catheter was revised. Of the two patients who had their catheters placed initially, one had a previous lower midline incision and underwent laparoscopic placement of a catheter and lysis of pelvic adhesions. The second patient had hepatitis C and chronically elevated liver function tests; he underwent laparoscopic placement of a peritoneal dialysis catheter and liver biopsy. Five patients had laparoscopic revision for non-functional catheters. Four were found to have omental adhesions surrounding the catheter. Three patients were found to have a fibrin clot within the catheter, and in one patient the small bowel was adherent to the catheter. All seven patients had general endotracheal anesthesia. There were no operative or anesthetic complications. The average operative time was 56 minutes. Four patients had their procedure in an ambulatory setting and were discharged home the same day. One patient was admitted for 23-hour observation, and two patients had their procedure while in the hospital for other reasons. In follow-up, there was one early failure at two weeks, which required removal of the catheter for infection. One catheter was removed at the time of a combined kidney/pancreas transplant eight months after revision. The other five catheters were still functional at an average follow-up of ten months. These results suggest that laparoscopy is another method for placement of peritoneal dialysis catheters and, more importantly, for revision in patients with catheters rendered non-functional by adhesions. It also provides an opportunity to evaluate the abdomen and perform concomitant procedures.
PMCID: PMC3015335  PMID: 10323172
Laparoscopy; Dialysis catheter; Renal Disease
15.  A Rare Case of Aeromonas Hydrophila Catheter Related Sepsis in a Patient with Chronic Kidney Disease Receiving Steroids and Dialysis: A Case Report and Review of Aeromonas Infections in Chronic Kidney Disease Patients 
Case Reports in Nephrology  2013;2013:735194.
Aeromonas hydrophila (AH) is an aquatic bacterium. We present the case of a fifty-five-year-old man with chronic kidney disease (CKD) due to crescentic IgA nephropathy who presented to us with fever. He had recently been pulsed with methylprednisolone, followed by oral prednisolone, and discharged on maintenance dialysis through a double-lumen dialysis catheter. Blood cultures from a peripheral vein and from the double-lumen dialysis catheter grew AH. We speculate that low immunity due to steroids and uremia, along with touch contamination of the dialysis catheter by the patient or dialysis nurse, could have led to this rare infection. Dialysis catheter-related infection by AH is rare. We present our case here and take the opportunity to give a brief review of AH infections in CKD patients.
doi:10.1155/2013/735194
PMCID: PMC3914193  PMID: 24558624
16.  A Crossover Intervention Trial Evaluating the Efficacy of a Chlorhexidine-Impregnated Sponge (BIOPATCH®) to Reduce Catheter-Related Bloodstream Infections in Hemodialysis Patients 
Background
Catheter-related bloodstream infections (BSI) account for the majority of hemodialysis-related infections. There are no published data on the efficacy of the chlorhexidine-impregnated foam dressing at reducing catheter-related BSI in hemodialysis patients.
Design
Prospective non-blinded cross-over intervention trial to determine the efficacy of a chlorhexidine-impregnated foam dressing (Biopatch®) to reduce catheter-related BSI in hemodialysis patients.
Setting
Two outpatient dialysis centers
Patients
A total of 121 patients who were dialyzed through tunneled central venous catheters received the intervention during the trial.
Methods
The primary outcome of interest was the incidence of catheter-related bloodstream infections. A nested cohort study of all patients who received the Biopatch® Antimicrobial Dressing was also conducted. Backward stepwise logistic regression analysis was used to determine independent risk factors for development of BSI.
Results
Thirty-seven bloodstream infections occurred in the intervention group, for a rate of 6.3 BSIs/1000 dialysis sessions, and 30 bloodstream infections occurred in the control group, for a rate of 5.2 BSIs/1000 dialysis sessions [RR 1.22, 95% CI (0.76, 1.97); P=0.46]. The Biopatch® Antimicrobial Dressing was well tolerated, with only two patients (<2%) experiencing dermatitis that led to its discontinuation. The only independent risk factor for development of BSI was dialysis treatment at one dialysis center [aOR 4.4 (1.77, 13.65); P=0.002]. Age ≥ 60 years [aOR 0.28 (0.09, 0.82); P=0.02] was associated with lower risk of BSI.
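As a quick arithmetic check of how the quoted rates and rate ratio relate: the session denominators below are back-calculated from the reported figures, so treat them as approximate.

```python
# Event counts from the abstract; denominators back-calculated from rates.
bsi_intervention, bsi_control = 37, 30
sessions_intervention = bsi_intervention / 6.3 * 1000  # ~5,873 sessions
sessions_control = bsi_control / 5.2 * 1000            # ~5,769 sessions

rate_i = bsi_intervention / sessions_intervention * 1000  # 6.3 per 1000
rate_c = bsi_control / sessions_control * 1000            # 5.2 per 1000
# ~1.21, close to the reported 1.22 (the published rates are rounded).
print(f"RR = {rate_i / rate_c:.2f}")
```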
Conclusion
The use of a chlorhexidine-impregnated foam dressing (Biopatch®) did not decrease catheter-related BSIs among hemodialysis patients with tunneled central venous catheters.
doi:10.1086/657075
PMCID: PMC3077924  PMID: 20879855
chlorhexidine-impregnated dressing; hemodialysis; bloodstream infection; tunneled catheter
17.  Tunneled central venous catheters: Experience from a single center 
Indian Journal of Nephrology  2011;21(2):107-111.
In the past, vascular surgeons were called in to place tunneled central venous catheters (TVCs) for hemodialysis patients. The advent of the percutaneous technique has resulted in an increasing number of interventional nephrologists inserting them. A single-centre, three-year audit of 100 TVCs with a cumulative follow-up of 492 patient-months is presented here. From 2007 to 2010, 100 TVCs were placed percutaneously by nephrologists in the operating room or the interventional nephrology suite. Patients who completed a minimum of three months on the catheter were included in the analysis. There were 69 males and 31 females with a mean age of 52.3 ± 13.6 years (range: 25-76). Chronic glomerulonephritis was the commonest cause of CKD (45%), followed by diabetes (39%). The right internal jugular vein was the preferred site (94%). The TVC was used as the primary access to initiate dialysis in 25% of patients, in whom a live donor was available for renal transplant. The blood flow was 250-270 ml/min. Kaplan-Meier analysis showed that the 3-month and 6-month catheter survival rates were 80% and 55%, respectively. The main complications were exit-site blood ooze, catheter block and kink. The catheter-related bacteremia rate was low at 0.4/1000 patient-days. The primary cause of dropout was patient death unrelated to the TVC. Patients under the age of 40 years showed better catheter survival, but gender, catheter site and etiology of CKD had no bearing on survival. Tunneled central venous catheters could find a niche as the primary access of choice for patients awaiting live-donor renal transplantation, in view of their immediate usability, high blood flows, low infection rates and adequate patency for 3-6 months.
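The 3- and 6-month figures above come from a Kaplan-Meier analysis. A minimal sketch of such an estimate on simulated catheter follow-up data, assuming the lifelines library (hypothetical values, not the audit's data):

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)

# Hypothetical follow-up: days each catheter was observed, and whether it
# failed (1) or was censored (0), e.g., by death or transplant.
durations = rng.exponential(scale=250, size=100).round()
events = rng.binomial(1, 0.6, size=100)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events)
print(kmf.survival_function_at_times([90, 180]))  # 3- and 6-month survival
```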
doi:10.4103/0971-4065.82133
PMCID: PMC3132329  PMID: 21769173
Bacteremia; catheter survival; hemodialysis; tunneled central venous catheters
18.  Cancer Screening: A Mathematical Model Relating Secreted Blood Biomarker Levels to Tumor Sizes  
PLoS Medicine  2008;5(8):e170.
Background
Increasing efforts and financial resources are being invested in early cancer detection research. Blood assays detecting tumor biomarkers promise noninvasive and financially reasonable screening for early cancer with high potential of positive impact on patients' survival and quality of life. For novel tumor biomarkers, the actual tumor detection limits are usually unknown and there have been no studies exploring the tumor burden detection limits of blood tumor biomarkers using mathematical models. Therefore, the purpose of this study was to develop a mathematical model relating blood biomarker levels to tumor burden.
Methods and Findings
Using a linear one-compartment model, the steady state between tumor biomarker secretion into and removal out of the intravascular space was calculated. Two conditions were assumed: (1) the compartment (plasma) is well-mixed and kinetically homogenous; (2) the tumor biomarker consists of a protein that is secreted by tumor cells into the extracellular fluid compartment, and a certain percentage of the secreted protein enters the intravascular space at a continuous rate. The model was applied to two pathophysiologic conditions: tumor biomarker is secreted (1) exclusively by the tumor cells or (2) by both tumor cells and healthy normal cells. To test the model, a sensitivity analysis was performed assuming variable conditions of the model parameters. The model parameters were primed on the basis of literature data for two established and well-studied tumor biomarkers (CA125 and prostate-specific antigen [PSA]). Assuming biomarker secretion by tumor cells only and 10% of the secreted tumor biomarker reaching the plasma, the calculated minimally detectable tumor sizes ranged between 0.11 mm3 and 3,610.14 mm3 for CA125 and between 0.21 mm3 and 131.51 mm3 for PSA. When biomarker secretion by healthy cells and tumor cells was assumed, the calculated tumor sizes leading to positive test results ranged between 116.7 mm3 and 1.52 × 106 mm3 for CA125 and between 27 mm3 and 3.45 × 105 mm3 for PSA. One of the limitations of the study is the absence of quantitative data available in the literature on the secreted tumor biomarker amount per cancer cell in intact whole body animal tumor models or in cancer patients. Additionally, the fraction of secreted tumor biomarkers actually reaching the plasma is unknown. Therefore, we used data from published cell culture experiments to estimate tumor cell biomarker secretion rates and assumed a wide range of secretion rates to account for their potential changes due to field effects of the tumor environment.
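In our own notation (the paper's symbols may differ), the steady state described above balances biomarker influx into plasma against first-order elimination:

$$ f \, s \, N = k_{el} \, V_p \, c \quad \Longrightarrow \quad N_{\min} = \frac{k_{el} \, V_p \, c_{\mathrm{LoD}}}{f \, s}, $$

where $s$ is the secretion rate per tumor cell, $f$ the fraction of secreted protein reaching plasma, $N$ the number of secreting tumor cells, $k_{el}$ the elimination rate constant, $V_p$ the plasma volume, and $c_{\mathrm{LoD}}$ the assay detection limit; the minimal detectable tumor volume then follows from $N_{\min}$ and an assumed cell density.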
Conclusions
This study introduced a linear one-compartment mathematical model that allows estimation of minimal detectable tumor sizes based on blood tumor biomarker assays. Assuming physiological data on CA125 and PSA from the literature, the model predicted detection limits of tumors that were in qualitative agreement with the actual clinical performance of both biomarkers. The model may be helpful in future estimation of minimal detectable tumor sizes for novel proteomic biomarker assays if sufficient physiologic data for the biomarker are available. The model may address the potential and limitations of tumor biomarkers, help prioritize biomarkers, and guide investments into early cancer detection research efforts.
Sanjiv Gambhir and colleagues describe a linear one-compartment mathematical model that allows estimation of minimal detectable tumor sizes based on blood tumor biomarker assays.
Editors' Summary
Background.
Cancers—disorganized masses of cells that can occur in any tissue—develop when cells acquire genetic changes that allow them to grow uncontrollably and to spread around the body (metastasize). If a cancer (tumor) is detected when it is small, surgery can often provide a cure. Unfortunately, many cancers (particularly those deep inside the body) are not detected until they are large enough to cause pain or other symptoms by pressing against surrounding tissue. By this time, it may be impossible to remove the original tumor surgically and there may be metastases scattered around the body. In such cases, radiotherapy and chemotherapy can sometimes help, but the outlook for patients whose cancers are detected late is often poor. Consequently, researchers are trying to develop early detection tests for different types of cancer. Many tumors release specific proteins—“cancer biomarkers”—into the blood and the hope is that it might be possible to find sets of blood biomarkers that detect cancers when they are still small and thus save many lives.
Why Was This Study Done?
For most biomarkers, it is not known how the amount of protein detected in the blood relates to tumor size or how sensitive the assays for biomarkers must be to improve patient survival. In this study, the researchers develop a “linear one-compartment” mathematical model to predict how large tumors need to be before blood biomarkers can be used to detect them and test this model using published data on two established cancer biomarkers—CA125 and prostate-specific antigen (PSA). CA125 is used to monitor the progress of patients with ovarian cancer after treatment; ovarian cancer is rarely diagnosed in its early stages and only one-fourth of women with advanced disease survive for 5 y after diagnosis. PSA is used to screen for prostate cancer and has increased the detection of this cancer in its early stages when it is curable.
What Did the Researchers Do and Find?
To develop a model that relates secreted blood biomarker levels to tumor sizes, the researchers assumed that biomarkers mix evenly throughout the patient's blood, that cancer cells secrete biomarkers into the fluid that surrounds them, that 0.1%–20% of these secreted proteins enter the blood at a continuous rate, and that biomarkers are continuously removed from the blood. The researchers then used their model to calculate the smallest tumor sizes that might be detectable with these biomarkers by feeding in existing data on CA125 and on PSA, including assay detection limits and the biomarker secretion rates of cancer cells growing in dishes. When only tumor cells secreted the biomarker and 10% of the secreted biomarker reached the blood, the model predicted that ovarian tumors between 0.11 mm3 (smaller than a grain of salt) and nearly 4,000 mm3 (about the size of a cherry) would be detectable by measuring CA125 blood levels (the range was determined by varying the amount of biomarker secreted by the tumor cells and the assay sensitivity); for prostate cancer, the detectable tumor sizes ranged from a similarly small lower bound to about 130 mm3 (pea-sized). However, healthy cells often also secrete small quantities of cancer biomarkers. With this condition incorporated into the model, the estimated detectable tumor sizes (or total tumor burden including metastases) ranged between grape-sized and melon-sized for ovarian cancers and between pea-sized and about grapefruit-sized for prostate cancers.
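A toy implementation of that steady-state balance makes the orders of magnitude concrete. Every parameter below is a placeholder chosen for illustration, not a value from the paper or from the CA125/PSA literature.

```python
# Toy steady-state calculation in the spirit of the model described above.
# All parameter values are hypothetical placeholders.
secretion_per_cell = 1e-7    # ng of biomarker secreted per cell per day
fraction_to_plasma = 0.10    # share of secreted protein reaching plasma
half_life_days = 5.0         # plasma half-life of the biomarker
plasma_volume_ml = 3000.0
assay_limit_ng_per_ml = 10.0

k_el = 0.693 / half_life_days  # first-order elimination rate (1/day)

# Steady state: fraction_to_plasma * secretion_per_cell * N = k_el * V_p * c.
n_min = (k_el * plasma_volume_ml * assay_limit_ng_per_ml) / (
    fraction_to_plasma * secretion_per_cell
)
volume_mm3 = n_min / 1e6  # assuming ~1e6 cells per mm^3 of tumor
print(f"minimal detectable burden: {n_min:.2e} cells (~{volume_mm3:.0f} mm^3)")
```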
What Do These Findings Mean?
The accuracy of the tumor sizes calculated by the researchers' mathematical model is limited by the lack of data on how tumors behave in the human body and by the many assumptions built into the model. Nevertheless, the model predicts detection limits for ovarian and prostate cancer that broadly mirror the clinical performance of both biomarkers. Somewhat worryingly, the model also indicates that a tumor may have to be very large before blood biomarkers reveal its presence, a result that could limit the clinical usefulness of biomarkers, especially those secreted not only by tumor cells but also by healthy cells. Even so, as more information about how biomarkers behave in the human body becomes available, this model (and more complex versions of it) should help researchers decide which biomarkers are likely to improve early cancer detection and patient outcomes.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0050170.
The US National Cancer Institute provides a brief description of what cancer is and how it develops and a fact sheet on tumor markers; it also provides information on all aspects of ovarian and prostate cancer for patients and professionals, including information on screening and testing (in English and Spanish)
The UK charity Cancerbackup also provides general information about cancer and more specific information about ovarian and prostate cancer, including the use of CA125 and PSA for screening and follow-up
The American Society of Clinical Oncology offers a wide range of information on various cancer types, including online published articles on the current status of cancer diagnosis and management from the educational book developed by the annual meeting faculty and presenters. Registration is mandatory, but information is free
doi:10.1371/journal.pmed.0050170
PMCID: PMC2517618  PMID: 18715113
19.  Blood Loss through AV Fistula: A Case Report and Literature Review 
Little has been written about acute blood loss from hemodialysis vascular access. We describe a 57-year-old Caucasian male with an approximately 7 g/dL drop in hemoglobin due to bleeding from a ruptured aneurysm in his right brachiocephalic arteriovenous fistula (AVF). There was no evidence of fistula infection. The patient was successfully managed by blood transfusions and insertion of a tunneled dialysis catheter for dialysis access. Later, the fistula was ligated and a new fistula was constructed in the opposite arm. Aneurysm should be considered in cases of acute vascular access bleeding in chronic dialysis patients.
doi:10.4061/2011/350870
PMCID: PMC3118665  PMID: 21716705
20.  A randomised controlled trial of Heparin versus EthAnol Lock THerapY for the prevention of Catheter Associated infecTion in Haemodialysis patients – the HEALTHY-CATH trial 
BMC Nephrology  2012;13:146.
Background
Tunnelled central venous dialysis catheter use is significantly limited by the occurrence of catheter-related infections. This randomised controlled trial assessed the efficacy of a 48-hour 70% ethanol lock vs heparin locks in prolonging the time to the first episode of catheter-related blood stream infection (CRBSI).
Methods
Patients undergoing haemodialysis (HD) via a tunnelled catheter were randomised 1:1 to once per week ethanol locks (with two heparin locks between other dialysis sessions) vs thrice per week heparin locks.
Results
Observed catheter days in the heparin (n=24) and ethanol (n=25) groups were 1814 and 3614, respectively. CRBSI occurred at a rate of 0.85 vs. 0.28 per 1000 catheter days in the heparin vs. ethanol group by intention-to-treat analysis (incidence rate ratio (IRR) for ethanol vs. heparin 0.17; 95% CI 0.02-1.63; p=0.12). Flow issues requiring catheter removal occurred at a rate of 1.6 vs. 1.4 per 1000 catheter days in the heparin and ethanol groups, respectively (IRR 0.85; 95% CI 0.20-3.5; p=0.82 for ethanol vs. heparin).
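For readers unfamiliar with the arithmetic, the short sketch below shows how per-1000-catheter-day rates and an incidence rate ratio are computed. The catheter-day totals are taken from the abstract, but the event counts are hypothetical placeholders, since the abstract reports rates rather than raw counts.

```python
# Rate per 1000 catheter days and incidence rate ratio (IRR).
# Event counts below are hypothetical, not the trial's data.

def rate_per_1000_catheter_days(events, catheter_days):
    """Events per 1000 catheter days of observation."""
    return 1000.0 * events / catheter_days

heparin = rate_per_1000_catheter_days(events=2, catheter_days=1814)  # assumed count
ethanol = rate_per_1000_catheter_days(events=1, catheter_days=3614)  # assumed count
print(f"heparin {heparin:.2f}/1000 d, ethanol {ethanol:.2f}/1000 d, "
      f"IRR (ethanol vs heparin) {ethanol / heparin:.2f}")
```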
Conclusions
Catheter survival and catheter-related blood stream infection rates were not significantly different, but there was a trend towards a reduced rate of infection in the ethanol group. This study establishes proof of concept and will inform an adequately powered multicentre trial to definitively examine the efficacy and safety of ethanol locks as an alternative to current therapies used in the prevention of catheter-associated blood stream infections in patients dialysing with tunnelled catheters.
Trial Registration
Australian New Zealand Clinical Trials Registry ACTRN12609000493246
doi:10.1186/1471-2369-13-146
PMCID: PMC3531247  PMID: 23121768
Catheter related blood stream infection (CRBSI); Central venous catheter; Ethanol; Lock therapy; Haemodialysis (HD); Prophylaxis
21.  Risk factors associated with Peritoneal Dialysis catheter survival: A nine year single center study in 315 patients 
The journal of vascular access  2010;11(4):316-322.
PURPOSE
To review the peritoneal dialysis (PD) catheter outcomes at our center and assess factors affecting the catheter survival.
METHODS
We carried out a retrospective study of 315 patients who had their first PD catheter placed between January 2001 and September 2009 at the UT Southwestern/DaVita Peritoneal Dialysis Clinic in Dallas, Texas. Medical records were reviewed for the patients' demographic and clinical information. The primary end point of the study was PD catheter failure, defined as removal of a dysfunctional PD catheter due to catheter-related complications. Catheter survival was estimated using the Kaplan-Meier method, and a Cox proportional hazards regression model was used to identify factors independently associated with catheter survival.
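As background on the survival-estimation step, a minimal Kaplan-Meier estimator is sketched below. The follow-up times and event flags are invented for illustration and have nothing to do with the study's records.

```python
# Minimal Kaplan-Meier estimator (toy data; not the study's records).

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = catheter failure, 0 = censored.
    Returns (time, survival) pairs at each observed failure time.
    Ties are handled by processing failures before censorings at equal times."""
    surv = 1.0
    curve = []
    at_risk = len(times)
    # Sort by time; at tied times, failures (event=1) come first.
    for t, e in sorted(zip(times, events), key=lambda p: (p[0], -p[1])):
        if e == 1:
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # both failures and censorings leave the risk set
    return curve

# Toy cohort of five catheters: failures at 3 and 12 months, three censored.
print(kaplan_meier([3, 8, 12, 20, 36], [1, 0, 1, 0, 0]))
```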
RESULTS
The mean age of the patients was 49.7 ± 29 years. The study population included 54.6% females; 42.5% African American, 27.9% Caucasian and 22.9% Hispanic patients. Diabetes was the primary etiology of end-stage renal disease in 43.2% of patients. More than 90% of patients had one or more co-morbidities, and 57.5% had previous abdominal surgery. The mean BMI for the group was 28.6 ± 13.8 kg/m2. Less than a quarter of the patients (24.1%) had non-infectious/mechanical catheter problems.
Overall PD catheter survival rates at 12, 24 and 36 months were 92.9%, 91.9% and 91.1%, respectively. A PD catheter-related non-infectious problem was the only independent variable significantly associated with catheter survival (hazard ratio 22.467; 95% CI 6.665–75.732). No significant association was observed between PD catheter survival and other risk factors, including age, BMI, diabetic status, comorbidities, previous abdominal surgeries or infections.
CONCLUSIONS
Our study shows excellent 3-year PD catheter survival (91.1%). Only PD catheter-related non-infectious problems were significantly associated with catheter failure. Other factors, such as age, gender, race, BMI, diabetic status, comorbidities, previous abdominal surgeries, peritoneal infections or exit-site/tunnel infections, were not found to affect PD catheter survival and should not be considered barriers to PD initiation.
PMCID: PMC3207262  PMID: 20890875
Peritoneal dialysis catheter; survival; risk factors
22.  Nephrology for the people: Presidential Address at the 42nd Regional Meeting of the Japanese Society of Nephrology in Okinawa 2012 
The social and economic burdens of dialysis are growing worldwide as the number of patients increases. Dialysis is becoming a heavy burden even in developed countries. Thus, preventing end-stage kidney disease is of the utmost importance. Early detection and treatment are recommended because late referral is common, with most chronic kidney disease (CKD) patients remaining asymptomatic until a late stage. Three-quarters of dialysis patients initiated dialysis therapy within 1 year after referral to the facility. Since its introduction in 2002, the definition of CKD has been widely accepted not only by nephrologists but also by other medical specialties, such as cardiologists and general practitioners. Japan has a long history of general screening of school children, university students, and employees of companies and government offices, with everybody asked to participate. Urine testing for proteinuria and hematuria is popular among Japanese people; however, its outcomes have not been well studied. We examined the associations of clinical and laboratory data from several sources with the survival of dialysis patients, as well as predictors of progression to dialysis, in community-based screening (Okinawa Dialysis Study, OKIDS). At an early CKD stage, patients are usually asymptomatic; therefore, regular health checks using a urine dipstick and serum creatinine are recommended. The intervals for follow-up, however, are debatable because of the cost. CKD is a strong risk factor for cardiovascular disease and death and also plays an important role in infection and malignancy, particularly in elderly people. People can live longer with healthy kidneys.
doi:10.1007/s10157-013-0776-x
PMCID: PMC3751387  PMID: 23392566
Survival; Predictor; Chronic kidney disease (CKD); End-stage kidney disease (ESKD); Proteinuria
23.  Hemodialysis Access Usage Patterns in the Incident Dialysis Year and Associated Catheter-Related Complications 
Background
Hemodialysis (HD) access is considered a critical and actionable determinant of morbidity, with a growing literature suggesting that initial HD access type is an important marker of long-term outcomes. Accordingly, we examined HD access during the incident dialysis period, focusing on infection risk and successful fistula creation over the first dialysis year.
Study Design
Longitudinal cohort
Setting & Participants
All United States adults admitted to Fresenius Medical Care North America facilities within 15 days of first maintenance dialysis session between January 1 and December 31, 2007.
Predictor
Vascular access type at HD initiation
Outcomes
Vascular access type at 90 days and at the end of the first year on HD, bloodstream infection within the first year by access type, and catheter complication rate.
Results
Amongst the 25,003 incident dialysis patients studied, 19,622 (78.5%) initiated dialysis with a catheter, 4,151 (16.6%) with a fistula, and 1,230 (4.9%) with a graft. At 90 days, 14,105 (69.7%) had a catheter, 4,432 (21.9%) had a fistula, and 1,705 (8.4%) had a graft. Functioning fistulas and grafts at dialysis initiation had first-year failure rates of 10% and 15%, respectively. Grafts were seldom replaced by fistulas (3%), while 7,064 (47.6%) of all patients who initiated with a catheter alone still had only a catheter at 1 year. Overall, 3,327 (13.3%) patients had at least one positive blood culture during follow-up, with the risk being similar between the fistula and graft groups but approximately 3-fold higher in patients with a catheter (p<0.001 for either comparison). Nearly one in three catheters (32.5%) required tissue plasminogen activator (TPA) administration, at a median time of 41 days, with 59% requiring more than one TPA administration.
Limitations
Potential underestimation of bacteremia because follow-up blood culture results did not include samples sent to local laboratories.
Conclusions
Among a large and representative population of incident US dialysis patients, catheter use remains very high over the first year of hemodialysis care and is associated with high rates of mechanical complications and bloodstream infection.
doi:10.1053/j.ajkd.2012.09.006
PMCID: PMC3532948  PMID: 23159234
24.  Hypoxia and oxidative stress markers in pediatric patients undergoing hemodialysis: cross section study 
BMC Nephrology  2012;13:136.
Background
Tissue injury due to hypoxia and/or free radicals is common in a variety of disease processes. This cross-sectional study aimed to investigate the effects of chronic kidney disease (CKD) and hemodialysis (HD) on hypoxia and oxidative stress biomarkers.
Methods
Forty pediatric patients with CKD on HD and 20 healthy children were recruited. Plasma hypoxia-inducible factor-1α (HIF-1α) and vascular endothelial growth factor (VEGF) were measured with specific ELISA kits, while total antioxidant capacity (TAC), total peroxide (TPX), pyruvate and lactate were measured by enzymatic/chemical colorimetric methods. The oxidative stress index (OSI) and lactate/pyruvate (L/P) ratio were calculated.
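The two derived indices are simple ratios of the measured analytes. A minimal sketch follows, assuming the commonly used definition OSI = (TPX/TAC) × 100; the abstract does not state the exact formulas or units, so both definitions and the example values should be treated as assumptions.

```python
# Derived indices from the measured analytes (definitions assumed; the
# abstract does not state the exact formulas or units).

def oxidative_stress_index(tpx, tac):
    """OSI, commonly expressed as total peroxide divided by total antioxidant
    capacity x 100; units must be made commensurate before dividing."""
    return 100.0 * tpx / tac

def lactate_pyruvate_ratio(lactate, pyruvate):
    """L/P ratio; both concentrations in the same units."""
    return lactate / pyruvate

print(oxidative_stress_index(tpx=25.0, tac=1.2))         # made-up values
print(lactate_pyruvate_ratio(lactate=1.8, pyruvate=0.12)) # made-up values
```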
Results
TAC was significantly lower, while TPX, OSI and VEGF were significantly higher, in patients both before and after the dialysis session than in controls. Lactate and HIF-1α levels were significantly higher before dialysis than in controls. TAC and the L/P ratio were lower before dialysis than after dialysis. Before dialysis, VEGF correlated positively with pyruvate, and HIF-1α and OSI correlated positively with TPX but negatively with TAC. After dialysis, HIF-1α correlated negatively with TPX and OSI, while OSI correlated positively with TPX.
Conclusions
CKD patients experience considerable tissue hypoxia together with oxidative stress. Hemodialysis ameliorated hypoxia but lowered antioxidants, as evidenced by the differences in HIF-1α and TAC levels before compared with after dialysis.
doi:10.1186/1471-2369-13-136
PMCID: PMC3509393  PMID: 23061474
25.  Modifiable Risk Factors for Early Mortality on Hemodialysis 
Data on incident hemodialysis patients from 2001 to 2007 were abstracted from The Renal Disease Registry (TRDR) in central Ontario, Canada, and patients were followed until December 2008 to determine 90-day mortality rates for incident hemodialysis patients. Modifiable risk factors for early mortality were identified using a Cox model. In total, 876 of 4,807 incident patients died during their first year on dialysis; 304 (34.7%) of these deaths occurred within the first 90 days of dialysis initiation. The majority of deaths were attributed to a cardiovascular event or infection and occurred more often in older patients and those with cardiovascular co-morbidities. Of the potentially modifiable risk factors, low body mass index (<18.5 kg/m2), a surrogate for malnutrition, was a strong predictor of early mortality [adjusted hazard ratio (HR) 4.22 (CI: 3.12–5.17)]. Central venous catheter use was associated with a 2.40-fold increased risk of death (CI: 1.4–3.90). Patients who attended a multidisciplinary pre-dialysis clinic were less likely to die (HR: 0.60, CI: 0.47–0.78). The first 90 days after initiation of dialysis are a period of especially high risk of death. We have identified potentially modifiable risk factors in vascular access type, pre-dialysis care and nutritional status.
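For readers unfamiliar with the modelling step, a minimal Cox proportional hazards sketch follows, using the open-source lifelines package. The data frame below is invented for illustration and is not TRDR data; the covariate names mirror the risk factors discussed in the abstract.

```python
# Minimal Cox proportional hazards sketch (invented data, not TRDR records).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days":     [12, 90, 45, 88, 90, 30, 90, 60],  # follow-up, capped at 90 days
    "died":     [1,  0,  1,  1,  0,  1,  0,  0],   # 1 = death within follow-up
    "catheter": [1,  0,  1,  0,  1,  1,  0,  1],   # central venous catheter at start
    "low_bmi":  [1,  0,  0,  1,  1,  1,  0,  0],   # BMI < 18.5, malnutrition proxy
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days", event_col="died")
cph.print_summary()  # the exp(coef) column gives the adjusted hazard ratios
```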
doi:10.1155/2012/435736
PMCID: PMC3409533  PMID: 22888426
