1.  Precision of Biomarkers to Define Chronic Inflammation in CKD 
American journal of nephrology  2008;28(5):808-812.
Background/Aims
Several inflammatory biomarkers have been found to be associated with cardiovascular disease or all-cause mortality in dialysis patients, but their usefulness in clinical practice or as surrogate endpoints is not certain. The purpose of the present study was to determine the intrapatient variation of C-reactive protein, IL-6, fetuin-A and albumin in a population of dialysis patients.
Methods
Apparently healthy dialysis patients with either a tunneled dialysis catheter or fistula had monthly assessments of these biomarkers for a total of four determinations, and the intraclass correlation coefficients were calculated as measures of intersubject variance.
Results
Our results showed large within-subject variation relative to the total variation in the measurements (31-46%). Having a tunneled catheter as opposed to a fistula was not significantly associated with mean levels, suggesting that chronic subclinical catheter infection does not explain the variation seen in the biomarkers. In contrast, there was a rapid change in these biomarkers with a clinically apparent acute infection.
Conclusion
These results suggest that these biomarkers have limitations for use as surrogate endpoints in clinical trials due to wide fluctuations, even in apparently clinically healthy individuals.
doi:10.1159/000135692
PMCID: PMC2574778  PMID: 18506106
Biomarkers, precision; Chronic inflammation; Chronic kidney disease; CKD stage 5D; Inflammatory biomarkers, intrapatient variance; Tunneled dialysis catheter
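As a rough illustration of the statistic this abstract relies on, the sketch below computes a one-way intraclass correlation coefficient (ICC) from repeated monthly measurements. All values, including the 3 × 4 data matrix, are invented for illustration, not taken from the study; note that a within-subject share of 31-46%, as reported above, corresponds to an ICC of roughly 0.54-0.69.

```python
# One-way intraclass correlation: ICC(1) = between-subject share of variance.
# The 3 patients x 4 monthly measurements below are invented for illustration.
import numpy as np

x = np.array([
    [5.1, 6.0, 4.8, 5.5],     # patient 1: four monthly CRP-like values
    [12.0, 9.5, 14.2, 11.1],  # patient 2
    [2.2, 2.9, 2.5, 2.0],     # patient 3
])
n, k = x.shape
grand = x.mean()
ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))

icc1 = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1) = {icc1:.2f}; within-subject share = {1 - icc1:.2f}")
```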
2.  Care of undocumented-uninsured immigrants in a large urban dialysis unit 
BMC Nephrology  2012;13:112.
Background
Medical, ethical, and financial dilemmas may arise in treating undocumented-uninsured patients with end-stage renal disease (ESRD). Here we describe the 10-year experience of treating undocumented-uninsured ESRD patients in a large public dialysis unit.
Methods
We evaluated the medical files of all the chronic dialysis patients treated at the Tel-Aviv Medical Center between the years 2000–2010. Data for all immigrant patients without documentation and medical insurance were obtained. Clinical data were compared with an age-matched cohort of 77 insured dialysis patients.
Results
Fifteen undocumented-uninsured patients were treated with chronic scheduled dialysis therapy for a mean of 2.3 years and a total of 4953 hemodialysis sessions, despite lack of reimbursement. All undocumented-uninsured patients presented initially with symptoms attributed to uremia and with stage 5 chronic kidney disease (CKD). In comparison, only 6 patients (8%) in the age-matched cohort were first evaluated by a nephrologist at stage 5 CKD. Levels of hemoglobin (8.5 ± 1.7 versus 10.8 ± 1.6 g/dL; p < 0.0001) and albumin (33.8 ± 4.8 versus 37.7 ± 3.9 g/L; p < 0.001) were lower in the undocumented-uninsured dialysis patients than in the age-matched insured patients at initiation of hemodialysis therapy, and these differences persisted throughout the treatment period. Hemodialysis was performed in all the undocumented-uninsured patients via tunneled cuffed catheters (TCC), without higher rates of TCC-associated infections. The rate of skipped hemodialysis sessions was similar in the undocumented-uninsured and age-matched insured cohorts.
Conclusions
Undocumented-uninsured dialysis patients presented initially in the advanced stages of CKD with lower levels of hemoglobin and worse nutritional status in comparison with age-matched insured patients. The type of vascular access for hemodialysis was less than optimal with regards to current guidelines. There is a need for the national and international nephrology communities to establish a policy concerning the treatment of undocumented-uninsured patients with CKD.
doi:10.1186/1471-2369-13-112
PMCID: PMC3615959  PMID: 22992409
Dialysis; ESRD; Undocumented; Uninsured; Immigrants
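The group comparisons above can be reproduced from summary statistics alone. A minimal sketch, assuming a Welch two-sample t-test (the abstract does not name the test used) and the reported hemoglobin means, SDs, and group sizes (15 uninsured vs. 77 insured):

```python
# Two-group comparison from summary statistics, as reported in the abstract:
# hemoglobin 8.5 +/- 1.7 g/dL (n=15) vs 10.8 +/- 1.6 g/dL (n=77).
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=8.5, std1=1.7, nobs1=15,
                            mean2=10.8, std2=1.6, nobs2=77,
                            equal_var=False)  # Welch's t-test
print(f"t = {t:.2f}, p = {p:.2g}")
```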
3.  Risk Models to Predict Chronic Kidney Disease and Its Progression: A Systematic Review 
PLoS Medicine  2012;9(11):e1001344.
A systematic review of risk prediction models conducted by Justin Echouffo-Tcheugui and Andre Kengne examines the evidence base for prediction of chronic kidney disease risk and its progression, and suitability of such models for clinical use.
Background
Chronic kidney disease (CKD) is common, and associated with increased risk of cardiovascular disease and end-stage renal disease, which are potentially preventable through early identification and treatment of individuals at risk. Although risk factors for occurrence and progression of CKD have been identified, their utility for CKD risk stratification through prediction models remains unclear. We critically assessed risk models to predict CKD and its progression, and evaluated their suitability for clinical use.
Methods and Findings
We systematically searched MEDLINE and Embase (1 January 1980 to 20 June 2012). Dual review was conducted to identify studies that reported on the development, validation, or impact assessment of a model constructed to predict the occurrence/presence of CKD or progression to advanced stages. Data were extracted on study characteristics, risk predictors, discrimination, calibration, and reclassification performance of models, as well as validation and impact analyses. We included 26 publications reporting on 30 CKD occurrence prediction risk scores and 17 CKD progression prediction risk scores. The vast majority of CKD risk models had acceptable-to-good discriminatory performance (area under the receiver operating characteristic curve >0.70) in the derivation sample. Calibration was less commonly assessed, but overall was found to be acceptable. Only eight CKD occurrence and five CKD progression risk models have been externally validated, displaying modest-to-acceptable discrimination. Whether novel biomarkers of CKD (circulatory or genetic) can improve prediction largely remains unclear, and impact studies of CKD prediction models have not yet been conducted. Limitations of risk models include the lack of ethnic diversity in derivation samples, and the scarcity of validation studies. The review is limited by the lack of an agreed-on system for rating prediction models, and the difficulty of assessing publication bias.
Conclusions
The development and clinical application of renal risk scores is in its infancy; however, the discriminatory performance of existing tools is acceptable. The effect of using these models in practice is still to be explored.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Chronic kidney disease (CKD)—the gradual loss of kidney function—is increasingly common worldwide. In the US, for example, about 26 million adults have CKD, and millions more are at risk of developing the condition. Throughout life, small structures called nephrons inside the kidneys filter waste products and excess water from the blood to make urine. If the nephrons stop working because of injury or disease, the rate of blood filtration decreases, and dangerous amounts of waste products such as creatinine build up in the blood. Symptoms of CKD, which rarely occur until the disease is very advanced, include tiredness, swollen feet and ankles, puffiness around the eyes, and frequent urination, especially at night. There is no cure for CKD, but progression of the disease can be slowed by controlling high blood pressure and diabetes, both of which cause CKD, and by adopting a healthy lifestyle. The same interventions also reduce the chances of CKD developing in the first place.
Why Was This Study Done?
CKD is associated with an increased risk of end-stage renal disease, which is treated with dialysis or by kidney transplantation (renal replacement therapies), and of cardiovascular disease. These life-threatening complications are potentially preventable through early identification and treatment of CKD, but most people present with advanced disease. Early identification would be particularly useful in developing countries, where renal replacement therapies are not readily available and resources for treating cardiovascular problems are limited. One way to identify people at risk of a disease is to use a “risk model.” Risk models are constructed by testing the ability of different combinations of risk factors that are associated with a specific disease to identify those individuals in a “derivation sample” who have the disease. The model is then validated on an independent group of people. In this systematic review (a study that uses predefined criteria to identify all the research on a given topic), the researchers critically assess the ability of existing CKD risk models to predict the occurrence of CKD and its progression, and evaluate their suitability for clinical use.
What Did the Researchers Do and Find?
The researchers identified 26 publications reporting on 30 risk models for CKD occurrence and 17 risk models for CKD progression that met their predefined criteria. The risk factors most commonly included in these models were age, sex, body mass index, diabetes status, systolic blood pressure, serum creatinine, protein in the urine, and serum albumin or total protein. Nearly all the models had acceptable-to-good discriminatory performance (a measure of how well a model separates people who have a disease from people who do not have the disease) in the derivation sample. Not all the models had been calibrated (assessed for whether the average predicted risk within a group matched the proportion that actually developed the disease), but in those that had been assessed calibration was good. Only eight CKD occurrence and five CKD progression risk models had been externally validated; discrimination in the validation samples was modest-to-acceptable. Finally, very few studies had assessed whether adding extra variables to CKD risk models (for example, genetic markers) improved prediction, and none had assessed the impact of adopting CKD risk models on the clinical care and outcomes of patients.
What Do These Findings Mean?
These findings suggest that the development and clinical application of CKD risk models is still in its infancy. Specifically, these findings indicate that the existing models need to be better calibrated and need to be externally validated in different populations (most of the models were tested only in predominantly white populations) before they are incorporated into guidelines. The impact of their use on clinical outcomes also needs to be assessed before their widespread use is recommended. Such research is worthwhile, however, because of the potential public health and clinical applications of well-designed risk models for CKD. Such models could be used to identify segments of the population that would benefit most from screening for CKD, for example. Moreover, risk communication to patients could motivate them to adopt a healthy lifestyle and to adhere to prescribed medications, and the use of models for predicting CKD progression could help clinicians tailor disease-modifying therapies to individual patient needs.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001344.
This study is further discussed in a PLOS Medicine Perspective by Maarten Taal
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including a selection of patient experiences of kidney disease
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
doi:10.1371/journal.pmed.1001344
PMCID: PMC3502517  PMID: 23185136
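A hedged sketch of the two performance measures this review keeps returning to: discrimination, summarized by the area under the ROC curve (with >0.70 deemed acceptable), and calibration, a comparison of predicted risk with observed event rates. The data below are synthetic, generated so that outcomes follow the predicted risks, which makes the model well calibrated by construction and puts the AUC well above the 0.70 threshold.

```python
# Discrimination (AUC) and a crude decile-based calibration check.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
risk = rng.uniform(0, 1, 1000)        # model-predicted probabilities
y = rng.uniform(0, 1, 1000) < risk    # outcomes drawn to match those risks

print("AUC:", round(roc_auc_score(y, risk), 3))

# calibration: mean predicted risk vs. observed event rate, by risk decile
edges = np.quantile(risk, np.linspace(0.1, 0.9, 9))
for d in range(10):
    m = np.digitize(risk, edges) == d
    print(f"decile {d}: predicted {risk[m].mean():.2f}, observed {y[m].mean():.2f}")
```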
4.  Effects of starting hemodialysis with an arteriovenous fistula or central venous catheter compared with peritoneal dialysis: a retrospective cohort study 
BMC Nephrology  2012;13:88.
Background
Although several studies have demonstrated early survival advantages with peritoneal dialysis (PD) over hemodialysis (HD), the reason for the excess mortality observed among incident HD patients has, to our knowledge, yet to be established. This study explores the relationship between mortality and dialysis modality, focusing on the role of HD vascular access type at the time of dialysis initiation.
Methods
A retrospective cohort study was performed among local adult chronic kidney disease patients who consecutively initiated PD and HD with a tunneled cuffed venous catheter (HD-TCC) or a functional arteriovenous fistula (HD-AVF) in our institution in 2008. A total of 152 patients were included in the final analysis (HD-AVF, n = 59; HD-TCC, n = 51; PD, n = 42). All-cause and dialysis access-related morbidity/mortality were evaluated at one year. Univariate and multivariate analyses were used to compare the survival of PD patients with those who initiated HD with an AVF or with a TCC.
Results
Compared with PD patients, both HD-AVF and HD-TCC patients were more likely to be older (p<0.001) and to have a higher frequency of diabetes mellitus (p = 0.017) and cardiovascular disease (p = 0.020). Overall, HD-TCC patients were more likely to have clinical visits (p = 0.069), emergency room visits (p<0.001) and hospital admissions (p<0.001). At the end of follow-up, HD-TCC patients had a higher rate of dialysis access-related complications (1.53 vs. 0.93 vs. 0.64 per patient-year; p<0.001) and hospitalizations (0.47 vs. 0.07 vs. 0.14 per patient-year; p = 0.034) than HD-AVF and PD patients, respectively. The survival rates at one year were 96.6%, 74.5% and 97.6% for the HD-AVF, HD-TCC and PD groups, respectively (p<0.001). In multivariate analysis, HD-TCC use at the time of dialysis initiation was the most important factor associated with death (HR 16.128, 95% CI 1.431-181.778, p = 0.024).
Conclusion
Our results suggest that HD vascular access type at the time of renal replacement therapy initiation is an important modifier of the relationship between dialysis modality and survival among incident dialysis patients.
doi:10.1186/1471-2369-13-88
PMCID: PMC3476986  PMID: 22916962
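The survival comparison above rests on Kaplan-Meier estimates and a multivariate Cox model whose exponentiated coefficient is the reported hazard ratio. A minimal sketch using the lifelines package and a toy 12-patient dataset; all column names and values are invented, not the study's data.

```python
# Kaplan-Meier survival plus a Cox model whose exp(coef) is the hazard ratio.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months": [12, 12, 5, 12, 9, 12, 3, 12, 7, 12, 11, 12],  # follow-up time
    "died":   [0, 1, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0],          # event indicator
    "hd_tcc": [0, 0, 1, 0, 1, 1, 1, 0, 1, 0, 1, 0],          # 1 = catheter start
})

km = KaplanMeierFitter().fit(df["months"], df["died"], label="all patients")
print(km.survival_function_)   # survival estimate over follow-up

cox = CoxPHFitter().fit(df, duration_col="months", event_col="died")
cox.print_summary()            # exp(coef) for hd_tcc is the hazard ratio
```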
5.  Analysis of Vascular Access in Haemodialysis Patients - Single Center Experience 
Background
Vascular access is key to the successful management of chronic haemodialysis (HD) patients. Though the native arteriovenous fistula (AVF) is considered the access of choice, many patients in our country initiate haemodialysis through a central venous catheter (CVC). There is a paucity of data on vascular access in haemodialysis patients from southern India.
Aim
The aim of the present study was to review our experience with vascular access in haemodialysis patients (both central venous catheters and arteriovenous fistulae) and to assess its success rate and common complications.
Materials and Methods
This prospective study was conducted between January 2014 and December 2014 in our institute. A total of 50 patients with Chronic Kidney Disease (CKD) underwent vascular access intervention during the above period.
Results
A temporary venous catheter in the right internal jugular vein (96%) was the most common mode of initiation of haemodialysis, with a 34.48% incidence of catheter-related sepsis. Fifty percent of catheters were removed electively, with a mean catheter survival of 77.23 ± 14.8 days. The wrist (60%) was the most common site of AVF creation, followed by the arm (30%), mid-forearm (7.5%) and leg (2.5%). Complications included distal oedema (17.5%) and venous hypertension (2.5%). Primary failure occurred in 25% of patients and was more common in diabetic patients, the elderly (>60 years), and with distal fistulae. Elderly patients (>60 years) starting dialysis with a CVC were more likely to remain CVC dependent at 90 days.
Conclusion
Late presentation and delayed diagnosis of chronic kidney disease (CKD) necessitate dialysis initiation through a temporary catheter. The dialysis catheter, with its attendant complications, further adds to morbidity, mortality, health care burden, and costs. Early nephrology referral and permanent access creation in the pre-dialysis stage could avert the unnecessary complications and costs of catheter use.
doi:10.7860/JCDR/2015/13342.6611
PMCID: PMC4625272  PMID: 26557553
Arteriovenous fistula; Temporary haemodialysis catheter
6.  Epidemiology of haemodialysis catheter complications: a survey of 865 dialysis patients from 14 haemodialysis centres in Henan province in China 
BMJ Open  2015;5(11):e007136.
Objectives
To investigate the incidence rates and risk factors for catheter-related complications in different districts and populations in Henan Province in China.
Design
Cross-sectional.
Setting
Fourteen hospitals in Henan Province.
Participants
865 patients with renal dysfunction undergoing dialysis using catheters between October 2013 and October 2014.
Main outcome measures
The main outcome measures were complications, risk factors and patient characteristics. Catheter-related complications included catheter-related infection (catheter exit-site infection, catheter tunnel infection and catheter-related bloodstream infection), catheter dysfunction (thrombosis, catheter malposition or kinking, and fibrin shell formation) and central vein stenosis.
Results
The overall incidence rates were 7.74/1000 catheter-days for catheter infections (affecting 38.61% of all patients), 10.58/1000 catheter-days for catheter dysfunction (affecting 56.65%), and 0.68/1000 catheter-days for central vein stenosis (affecting 8.79%). Multivariate analysis showed that increased age, diabetes, primary educational level or below, rural residence, lack of a nephropathy visit before dialysis and pre-established permanent vascular access, not taking oral drugs to prevent catheter thrombus, lower serum albumin levels and higher ferritin levels were independently associated with catheter infections. Rural residence, not taking oral drugs to prevent thrombus, lack of an imaging examination after catheter insertion, non-tunnel catheter type, lack of medical insurance, lack of a nephropathy visit before dialysis and pre-established permanent vascular access, left-sided catheter position, access via the femoral vein and lower haemoglobin level were independently associated with catheter dysfunction. Diabetes, lack of a nephropathy visit before dialysis and pre-established permanent vascular access, lack of oral drugs to prevent catheter thrombus, left-sided catheter location and a higher number of catheter insertions were independently associated with central vein stenosis.
Conclusions
The rate of catheter-related complications was high in patients with end-stage renal disease in Henan Province. Our findings suggest that strategies should be implemented to decrease complication rates.
doi:10.1136/bmjopen-2014-007136
PMCID: PMC4663418  PMID: 26589425
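The incidence rates above are expressed per 1000 catheter-days: events divided by total days of catheter exposure, scaled by 1000. A minimal sketch of that arithmetic; the counts are invented, chosen only to land near the abstract's 7.74/1000 infection rate.

```python
# Events per 1000 catheter-days from hypothetical counts.
events = 330            # catheter infections observed (hypothetical)
catheter_days = 42_600  # total days of catheter exposure (hypothetical)

rate = 1000 * events / catheter_days
print(f"{rate:.2f} infections per 1000 catheter-days")  # -> 7.75
```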
7.  The impact of Vascular Access on the Adequacy of Dialysis and the Outcome of the Dialysis Treatment: One Center Experience 
Materia Socio-Medica  2015;27(2):114-117.
Introduction:
Chronic kidney disease (CKD) is a gradual reduction in glomerular filtration rate (GFR) caused by the destruction of a large number of nephrons. Kidney failure is the final stage of CKD, with GFR <15 ml/min/1.73m2 or a requirement for dialysis. Patients must have vascular access established, which is both the “life line” and the “Achilles heel” of hemodialysis treatment.
Aim:
The purpose of this research is to describe the demographic structure of the hemodialysis center in Konjic and to demonstrate the impact of vascular access on the adequacy and outcome of dialysis treatment.
Methods:
This cross-sectional study included 36 patients on hemodialysis at the center in Konjic from September 2010 to December 2014. Data were collected from medical records, and dialysis was considered adequate when Kt/V > 1.2. Statistical analysis was performed using SPSS software and Student's t-test.
Results:
Mortality among patients treated by dialysis was 37.8%. The ratio of male to female patients was 55.6% vs. 44.5%, with an average age of 52.91±14.36 years and an average hemodialysis duration of five years. The highest percentage of patients (72.2%) was dialyzed through an arterio-venous fistula (AVF) on the forearm. In these patients the most common complication was thrombosis (30.5%), which required recanalization in 11% and replacement in 19.5% of patients. Of the remaining patients, 16.7% were dialyzed via a temporary and 11.1% via a permanent catheter in the subclavian vein; the most common complication in these patients was infection (83.3% of cases). Although the AVF is more frequent, experience shows frequent implantation of a permanent catheter in elderly patients due to the poorer quality of their blood vessels. Although Kt/V was below 1.2 in patients dialyzed through a temporary catheter and above 1.2 for the other two access types, our results indicate that vascular access did not influence the quality of dialysis. The average Kt/V shows that an adequate dialysis dose was delivered in this center, meaning that, beyond vascular access, other factors can also affect dialysis treatment, as noticed by both patients and staff.
Conclusion:
Although mortality was highest in patients with a permanent catheter and lowest in patients with an AVF, the type of vascular access did not affect the outcome of dialysis treatment.
doi:10.5455/msm.2015.27.4-114-117
PMCID: PMC4404955  PMID: 26005389
hemodialysis; vascular access; Kt/V
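The adequacy cutoff above, Kt/V > 1.2, is usually estimated from pre- and post-dialysis urea rather than measured directly. A sketch assuming the second-generation Daugirdas single-pool formula (the abstract does not state how Kt/V was obtained) with made-up session values:

```python
# Single-pool Kt/V via the second-generation Daugirdas formula:
# Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W
# where R = post/pre BUN ratio, t = session hours, UF = ultrafiltrate (L),
# W = post-dialysis weight (kg). Session values below are invented.
import math

def sp_ktv(bun_pre, bun_post, hours, uf_litres, weight_kg):
    r = bun_post / bun_pre
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / weight_kg

print(round(sp_ktv(bun_pre=70, bun_post=25, hours=4.0,
                   uf_litres=2.5, weight_kg=70.0), 2))  # -> 1.22, i.e. adequate
```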
8.  Cost Analysis of Hemodialysis and Peritoneal Dialysis Access in Incident Dialysis Patients 
♦ Background: Although several studies have demonstrated the economic advantages of peritoneal dialysis (PD) over hemodialysis (HD), few reports in the literature have compared the costs of HD and PD access. The aim of the present study was to compare the resources required to establish and maintain the dialysis access in patients who initiated HD with a tunneled cuffed catheter (TCC) or an arteriovenous fistula (AVF) and in patients who initiated PD.
♦ Methods: We retrospectively analyzed the 152 chronic kidney disease patients who consecutively initiated dialysis treatment at our institution in 2008 (HD-AVF, n = 65; HD-TCC, n = 45; PD, n = 42). Detailed clinical and demographic information and data on access type were collected for all patients. A comprehensive measure of total dialysis access costs, including surgery, radiology, hospitalization for access complications, physician costs, and transportation costs was obtained at year 1 using an intention-to-treat approach. All resources used were valued using 2010 prices, and costs are reported in 2010 euros.
♦ Results: Compared with the HD-AVF and HD-TCC modalities, PD was associated with a significantly lower risk of access-related interventions (adjusted rate ratios: 1.572 and 1.433 respectively; 95% confidence intervals: 1.253 to 1.891 and 1.069 to 1.797). The mean dialysis access-related costs per patient-year at risk were €1171.6 [median: €608.8; interquartile range (IQR): €563.1 - €936.7] for PD, €1555.2 (median: €783.9; IQR: €371.4 - €1571.7) for HD-AVF, and €4208.2 (median: €1252.4; IQR: €947.9 - €2983.5) for HD-TCC (p < 0.001). In multivariate analysis, total dialysis access costs were significantly higher for the HD-TCC modality than for either PD or HD-AVF (β = -0.53; 95% CI: -1.03 to -0.02; and β = -0.50; 95% CI: -0.96 to -0.04).
♦ Conclusions: Compared with patients initiating HD, those initiating PD required fewer resources to establish and maintain a dialysis access during the first year of treatment.
doi:10.3747/pdi.2011.00309
PMCID: PMC3862096  PMID: 23455977
Cost analysis; health economics; hemodialysis; dialysis access; vascular access; peritoneal catheter
9.  Geographic and facility variation in initial use of non-tunneled catheters for incident maintenance hemodialysis patients 
BMC Nephrology  2016;17:20.
Background
Non-tunneled (temporary) hemodialysis catheters (NTHCs) are the least-optimal initial vascular access for incident maintenance hemodialysis patients yet little is known about factors associated with NTHC use in this context. We sought to determine factors associated with NTHC use and examine regional and facility-level variation in NTHC use for incident maintenance hemodialysis patients.
Methods
We analyzed registry data collected between January 2001 and December 2010 from 61 dialysis facilities within 12 geographic regions in Canada. Multi-level models and intra-class correlation coefficients were used to evaluate variation in NTHC use as initial hemodialysis access across facilities and geographic regions. Facility and patient characteristics associated with the lowest and highest quartiles of NTHC use were compared.
Results
During the study period, 21,052 patients initiated maintenance hemodialysis using a central venous catheter (CVC). This included 10,183 patients (48.3 %) in whom the initial CVC was a NTHC, as opposed to a tunneled CVC. Crude variation in NTHC use across facilities ranged from 3.7 to 99.4 % and across geographic regions from 32.4 to 85.1 %. In an adjusted multi-level logistic regression model, the proportion of total variation in NTHC use explained by facility-level and regional variation was 40.0 % and 34.1 %, respectively. Similar results were observed for the subgroup of patients who received greater than 12 months of pre-dialysis nephrology care. Patient-level factors associated with increased NTHC use were male gender, history of angina, pulmonary edema, COPD, hypertension, increasing distance from dialysis facility, higher serum phosphate, lower serum albumin and later calendar year.
Conclusions
There is wide variation in NTHC use as initial vascular access for incident maintenance hemodialysis patients across facilities and geographic regions in Canada. Identifying modifiable factors that explain this variation could facilitate a reduction of NTHC use in favor of more optimal initial vascular access.
doi:10.1186/s12882-016-0236-4
PMCID: PMC4769546  PMID: 26920700
Vascular access; Hemodialysis; Temporary hemodialysis catheters; Epidemiology; Practice variation
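The "proportion of total variation explained" at the facility and regional levels above is a variance partition from a multi-level logistic model: each random-intercept variance divided by the total, with the individual-level variance on the latent logistic scale conventionally fixed at π²/3. The component values below are invented, picked only to roughly reproduce the reported 40.0% / 34.1% shares.

```python
# Variance partition coefficients for a random-intercepts logistic model.
import math

var_facility = 5.1                  # hypothetical facility-level variance
var_region = 4.3                    # hypothetical region-level variance
var_individual = math.pi ** 2 / 3   # logistic latent-scale variance, ~3.29

total = var_facility + var_region + var_individual
print(f"facility share: {var_facility / total:.1%}")  # ~40%
print(f"region share:   {var_region / total:.1%}")    # ~34%
```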
10.  An integrated review of "unplanned" dialysis initiation: reframing the terminology to "suboptimal" initiation 
BMC Nephrology  2009;10:22.
Background
Ideally, care prior to the initiation of dialysis should increase the likelihood that patients start electively outside of the hospital setting with a mature arteriovenous fistula (AVF) or peritoneal dialysis (PD) catheter. However, unplanned dialysis continues to occur in patients both known and unknown to nephrology services, and in both late and early referrals. The objective of this article is to review the clinical and socioeconomic outcomes of unplanned dialysis initiation. The secondary objective is to explore the potential cost implications of reducing the rate of unplanned first dialysis in Canada.
Methods
MEDLINE and EMBASE from inception to 2008 were used to identify studies examining the clinical, economic or quality of life (QoL) outcomes in patients with an unplanned versus planned first dialysis. Data were described in a qualitative manner.
Results
Eight European studies (5,805 patients) were reviewed. Duration of hospitalization and mortality were higher for the unplanned versus the planned population. Patients undergoing a first unplanned dialysis had significantly worse laboratory parameters and QoL. Rates of unplanned dialysis ranged from 24% to 49%. The total annual burden to the Canadian healthcare system of unplanned dialysis in 2005 was estimated at $33 million in direct hospital costs alone. Reducing the rate of unplanned dialysis by one-half would yield estimated savings of $13.3 to $16.1 million.
Conclusion
The clinical and socioeconomic impact of unplanned dialysis is significant. To more consistently characterize the unplanned population, the term suboptimal initiation is proposed to include dialysis initiation in hospital and/or with a central venous catheter and/or with a patient not starting on their chronic modality of choice. Further research and implementation of initiatives to reduce the rate of suboptimal initiation of dialysis in Canada are needed.
doi:10.1186/1471-2369-10-22
PMCID: PMC2735745  PMID: 19674452
11.  Cohort profile: Canadian study of prediction of death, dialysis and interim cardiovascular events (CanPREDDICT) 
BMC Nephrology  2013;14:121.
Background
The Canadian Study of Prediction of Death, Dialysis and Interim Cardiovascular Events (CanPREDDICT) is a large, prospective, pan-Canadian, cohort study designed to improve our understanding of determinants of renal and cardiovascular (CV) disease progression in patients with chronic kidney disease (CKD). The primary objective is to clarify the associations between traditional and newer biomarkers in the prediction of specific renal and CV events, and of death in patients with CKD managed by nephrologists. This information could then be used to better understand biological variation in outcomes, to develop clinical prediction models and to inform enrolment into interventional studies which may lead to novel treatments.
Methods/Designs
Enrolment commenced in 2008; 2546 patients with eGFR between 15 and 45 ml/min/1.73m2 have been enrolled from a representative sample of 25 rural, urban, academic and non-academic centres across Canada. Patients are to be followed for an initial 3 years at 6-month intervals, and subsequently annually. Traditional biomarkers include eGFR, urine albumin-to-creatinine ratio (uACR), hemoglobin (Hgb), phosphate and albumin. Newer biomarkers of interest were selected on the basis of biological relevance to important processes, commercial availability and assay reproducibility. They include asymmetric dimethylarginine (ADMA), N-terminal pro-brain natriuretic peptide (NT-pro-BNP), troponin I, cystatin C, high sensitivity C-reactive protein (hsCRP), interleukin-6 (IL6) and transforming growth factor beta 1 (TGFβ1). Blood and urine samples are collected at baseline and every 6 months, and stored at −80°C. Outcomes of interest include renal replacement therapy, CV events and death, the latter two of which are adjudicated by an independent panel.
Discussion
The baseline distribution of newer biomarkers does not appear to track to markers of kidney function and therefore may offer some discriminatory value in predicting future outcomes. The granularity of the data presented at baseline may foster additional questions.
The value of the cohort as a unique resource to understand outcomes of patients under the care of nephrologists in a single payer healthcare system cannot be overstated. Systematic collection of demographic, laboratory and event data should lead to new insights.
The mean age of the cohort was 68 years, 90% were Caucasian, 62% were male, and 48% had diabetes. Forty percent of the cohort had eGFR between 30 and 45 mL/min/1.73m2, 22% had eGFR values below 20 mL/min/1.73m2, and 61% had uACR < 30. Serum albumin, hemoglobin, calcium and 25-hydroxyvitamin D (25(OH)D) levels were progressively lower in the lower eGFR strata, while parathyroid hormone (PTH) levels increased. Cystatin C, ADMA, NT-proBNP, hsCRP, troponin I and IL-6 were significantly higher in the lower GFR strata, whereas 25(OH)D and TGFβ1 values were lower at lower GFR. The distributions of the newer biomarkers by eGFR and uACR categories were variable.
doi:10.1186/1471-2369-14-121
PMCID: PMC3691726  PMID: 23758910
Chronic kidney disease; Biomarkers; Observational cohort study; Outcomes; Progression; CV disease
12.  Clinical outcomes and mortality in elderly peritoneal dialysis patients 
Clinics  2015;70(5):363-368.
OBJECTIVES:
To evaluate the clinical outcomes and identify the predictors of mortality in elderly patients undergoing peritoneal dialysis.
METHODS:
We conducted a retrospective study including all incident peritoneal dialysis cases in patients ≥65 years of age treated from 2001 to 2014. Demographic and clinical data on the initiation of peritoneal dialysis and the clinical events during the study period were collected. Infectious complications were recorded. Overall and technique survival rates were analyzed.
RESULTS:
Fifty-eight patients who began peritoneal dialysis during the study period were considered for analysis, and 50 of these patients were included in the final analysis. Peritoneal dialysis exchanges were performed by another person for 65% of the patients, whereas 79.9% of patients preferred to perform the peritoneal dialysis themselves. Peritonitis and catheter exit site/tunnel infection incidences were 20.4±16.3 and 24.6±17.4 patient-months, respectively. During the follow-up period, 40 patients were withdrawn from peritoneal dialysis. Causes of death included peritonitis and/or sepsis (50%) and cardiovascular events (30%). The mean patient survival time was 38.9±4.3 months, and the survival rates were 78.8%, 66.8%, 50.9% and 19.5% at 1, 2, 3 and 4 years after peritoneal dialysis initiation, respectively. Advanced age, the presence of additional diseases, increased episodes of peritonitis, the use of continuous ambulatory peritoneal dialysis, and low albumin levels and daily urine volumes (<100 ml) at the initiation of peritoneal dialysis were predictors of mortality. The mean technique survival duration was 61.7±5.2 months. The technique survival rates were 97.9%, 90.6%, 81.5% and 71% at 1, 2, 3 and 4 years, respectively. None of the factors analyzed were predictors of technique survival.
CONCLUSIONS:
Mortality was higher in elderly patients. Factors affecting mortality in elderly patients included advanced age, the presence of comorbid diseases, increased episodes of peritonitis, use of continuous ambulatory peritoneal dialysis, and low albumin levels and daily urine volumes (<100 ml) at the initiation of peritoneal dialysis.
doi:10.6061/clinics/2015(05)10
PMCID: PMC4449459  PMID: 26039954
Peritoneal Dialysis; Elderly; Mortality
13.  Patency and complications of translumbar dialysis catheters 
Seminars in dialysis  2015;28(4):E41-E47.
Background
The translumbar tunneled dialysis catheter (TLDC) is a temporary dialysis access for patients who have exhausted traditional access for dialysis. While a few small studies have reported success with TLDCs, additional studies are warranted to understand the short- and long-term patency and safety of the TLDC.
Methods
We conducted a retrospective analysis of adult patients who received TLDC for hemodialysis access from June 2006 to June 2013. Patient demographics, comorbid conditions, dialysis details, catheter insertion procedures and associated complications, catheter patency, and patient survival data were collected. Catheter patency was studied using Kaplan-Meier curve; catheter functionality was assessed with catheter intervals and catheter related complications were used to estimate catheter safety.
Results
There were 84 TLDCs inserted in 28 patients, with 28 primary insertions and 56 exchanges. All TLDC insertions were technically successful, with good blood flow during dialysis (>300 ml/min) and no immediate complications (major bleeding or clotting). The median numbers of days in place for the initial catheter, secondary catheters, and all catheters combined were 65, 84 and 244, respectively. The catheter patency rates at 3, 6 and 12 months were 43%, 25% and 7%, respectively. The main complications were poor blood flow (40%) and catheter-related infection (36%), which led to catheter removal in 30.8% and 35.9% of cases, respectively. After translumbar catheter placement, 42.8% of the patients were successfully converted to another vascular access or to peritoneal dialysis.
Conclusion
These data suggest that the TLDC may serve as a safe short-term alternative access for dialysis patients who have exhausted conventional vascular access.
doi:10.1111/sdi.12358
PMCID: PMC4836066  PMID: 25800550
End stage renal disease; Vascular access; Dialysis catheter; Clinical outcome
14.  From ZeRO to HeRO: Saving lives one HeRO at a time 
Highlights
• The HeRO graft is a fully subcutaneous vascular access system that bypasses the central venous system.
• It provides haemodialysis access for patients with central venous stenosis.
• The HeRO graft is a potentially long term option for ongoing dialysis in patients who have exhausted all other options.
• Studies show a reduced rate of infection and bacteraemia associated with the HeRO graft.
• Patency rates are similar to those of traditional arteriovenous grafts.
Introduction
This case report intends to highlight the Haemodialysis Reliable Outflow (HeRO) graft as a potential long term option for ongoing dialysis in patients with central venous stenosis.
Presentation of case
A 55 year old patient, who developed end stage renal failure (ESRF) after chemotherapy treatment for breast cancer, presented at the limit of her dialysis access after a 15 year haemodialysis history that had caused central vein stenosis.
The patient was initially started on peritoneal dialysis but after repeated peritonitis was switched to haemodialysis.
Over fifteen years of haemodialysis the patient had fistulae created in all four limbs. She had multiple tunnelled neck lines and developed an occluded left brachiocephalic vein and stenosed superior vena cava. Catheter dialysis via the right internal jugular vein was attempted but proved increasingly problematic due to poor clearances and frequent catheter changes.
A further attempt was made to treat with peritoneal dialysis, but again, this was unsuccessful.
As the patient had two failed attempts at peritoneal dialysis, had exhausted all her peripheral access options, and was having problematic catheter dialysis, she was offered the option of the HeRO graft as a ‘last resort’.
Discussion
The HeRO graft is a fully subcutaneous vascular access system that bypasses the central venous system, providing haemodialysis access for patients with central venous stenosis. It consists of an arterial graft component and a venous outflow component, which are connected via a titanium connector. The central vein stenosis/occlusion is stented with insertion of the graft into the right atrium, which is then secured to the arterial component for needling. So that successful dialysis could be completed as soon as possible post-operatively, the HeRO graft, in this instance, was combined with an immediate needling graft (Acuseal). This allowed the patient to receive successful dialysis within hours of completing the procedure.
Conclusion
This patient had reached the end of her haemodialysis life with no other options available. She was treated successfully with the HeRO graft, which at two months was patent and problem free. The patient had been able to return to work for the first time in 15 months.
Utilising the HeRO graft in this way may provide new, potentially long term, options for safe and effective dialysis in patients with central venous stenosis.
doi:10.1016/j.ijscr.2016.08.004
PMCID: PMC5008058  PMID: 27588751
HeRO, haemodialysis reliable outflow graft; ESRF, end stage renal failure; AV, arteriovenous; FDA, Food and Drug Administration; TDC, tunnelled dialysis catheter; Arteriovenous graft; Central venous stenosis; Dialysis; End stage renal failure; Haemodialysis reliable outflow graft
15.  A Systematic Review and Meta-Analysis of Utility-Based Quality of Life in Chronic Kidney Disease Treatments 
PLoS Medicine  2012;9(9):e1001307.
Melanie Wyld and colleagues examined previously published studies to assess pooled utility-based quality of life of the various treatments for chronic kidney disease. They conclude that the highest utility was for kidney transplants, with home-based automated peritoneal dialysis being second.
Background
Chronic kidney disease (CKD) is a common and costly condition to treat. Economic evaluations of health care often incorporate patient preferences for health outcomes using utilities. The objective of this study was to determine pooled utility-based quality of life (the numerical value attached to the strength of an individual's preference for a specific health outcome) by CKD treatment modality.
Methods and Findings
We conducted a systematic review, meta-analysis, and meta-regression of peer-reviewed published articles and of PhD dissertations published through 1 December 2010 that reported utility-based quality of life (utility) for adults with late-stage CKD. Studies reporting utilities by proxy (e.g., reported by a patient's doctor or family member) were excluded.
In total, 190 studies reporting 326 utilities from over 56,000 patients were analysed. There were 25 utilities from pre-treatment CKD patients, 226 from dialysis patients (haemodialysis, n = 163; peritoneal dialysis, n = 44), 66 from kidney transplant patients, and three from patients treated with non-dialytic conservative care. Using time tradeoff as a referent instrument, kidney transplant recipients had a mean utility of 0.82 (95% CI: 0.74, 0.90). The mean utility was comparable in pre-treatment CKD patients (difference = −0.02; 95% CI: −0.09, 0.04), 0.11 lower in dialysis patients (95% CI: −0.15, −0.08), and 0.2 lower in conservative care patients (95% CI: −0.38, −0.01). Patients treated with automated peritoneal dialysis had a significantly higher mean utility (0.80) than those on continuous ambulatory peritoneal dialysis (0.72; p = 0.02). The mean utility of transplant patients increased over time, from 0.66 in the 1980s to 0.85 in the 2000s, an increase of 0.19 (95% CI: 0.11, 0.26). Utility varied by elicitation instrument, with standard gamble producing the highest estimates, and the SF-6D by Brazier et al., University of Sheffield, producing the lowest estimates. The main limitations of this study were that treatment assignments were not random, that only transplant had longitudinal data available, and that we calculated EuroQol Group EQ-5D scores from SF-36 and SF-12 health survey data, and therefore the algorithms may not reflect EQ-5D scores measured directly.
Conclusions
For patients with late-stage CKD, treatment with dialysis is associated with a significant decrement in quality of life compared to treatment with kidney transplantation. These findings provide evidence-based utility estimates to inform economic evaluations of kidney therapies, useful for policy makers and in individual treatment discussions with CKD patients.
Editors' Summary
Background
Ill health can adversely affect an individual's quality of life, particularly if caused by long-term (chronic) conditions, such as chronic kidney disease—in the United States alone, 23 million people have chronic kidney disease, of whom 570,000 are treated with dialysis or kidney transplantation. In order to measure the cost-effectiveness of interventions to manage medical conditions, health economists use an objective measurement known as quality-adjusted life years. However, although useful, quality-adjusted life years are often criticized for not taking into account the views and preferences of the individuals with the medical conditions. A measurement called a utility solves this problem. Utilities are a numerical value (measured on a 0 to 1 scale, where 0 represents death and 1 represents full health) of the strength of an individual's preference for specified health-related outcomes, as measured by “instruments” (questionnaires) that rate direct comparisons or assess quality of life.
Why Was This Study Done?
Previous studies have suggested that, in people with chronic kidney disease, quality of life (as measured by utility) is higher in those with a functioning kidney transplant than in those on dialysis. However, currently, it is unclear whether the type of dialysis affects quality of life: hemodialysis is a highly technical process that directly filters the blood, usually must be done 2–4 times a week, and can only be performed in a health facility; peritoneal dialysis, in which fluids are infused into the abdominal cavity, can be done nightly at home (automated peritoneal dialysis) or throughout the day (continuous ambulatory peritoneal dialysis). In this study, the researchers reviewed and assimilated all of the available evidence to investigate whether quality of life in people with chronic kidney disease (as measured by utility) differed according to treatment type.
What Did the Researchers Do and Find?
The researchers did a comprehensive search of 11 databases to identify all relevant studies that included people with severe (stage 3, 4, or 5) chronic kidney disease, their form of treatment, and information on utilities—either reported directly, or included in quality of life instruments (SF-36), so the researchers could calculate utilities by using a validated algorithm. The researchers also recorded the prevalence rates of diabetes in study participants. Then, using statistical models that adjusted for various factors, including treatment type and the method of measuring utilities, the researchers were able to calculate the pooled utilities of each form of treatment for chronic kidney disease.
The researchers included 190 studies, representing over 56,000 patients and generating 326 utility estimates, in their analysis. The majority of utilities (77%) were derived through the SF-36 questionnaire via calculation. Of the 326 utility estimates, 25 were from patients pre-dialysis, 226 were from dialysis patients (the majority of whom were receiving hemodialysis), 66 were from kidney transplant patients, and three were from conservative care patients. The researchers found that the highest average utility was for those who had renal transplantation, 0.82, followed by the pre-dialysis group (0.80), dialysis patients (0.71), and, finally, patients receiving conservative care (0.62). When comparing the type of dialysis, the researchers found that there was little difference in utility between hemodialysis and peritoneal dialysis, but patients using automated peritoneal dialysis had, on average, a higher utility (0.80) than those treated with continuous ambulatory peritoneal dialysis (0.72). Finally, the researchers found that patient groups with diabetes had significantly lower utilities than those without diabetes.
What Do These Findings Mean?
These findings suggest that in people with chronic kidney disease, renal transplantation is the best treatment option to improve quality of life. For those on dialysis, home-based automated peritoneal dialysis may improve quality of life more than the other forms of dialysis: this finding is important, as this type of dialysis is not as widely used as other forms and is also cheaper than hemodialysis. Furthermore, these findings suggest that patients who choose conservative care have significantly lower quality of life than patients treated with dialysis, a finding that warrants further investigation. Overall, in addition to helping to inform economic evaluations of treatment options, the information from this analysis can help guide clinicians caring for patients with chronic kidney disease in their discussions about possible treatment options.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001307.
Information about chronic kidney disease is available from the National Kidney Foundation and MedlinePlus
Wikipedia gives information on general utilities (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001307
PMCID: PMC3439392  PMID: 22984353
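A hedged sketch of the pooling step behind a meta-analysis like this one: an inverse-variance random-effects (DerSimonian-Laird) combination of per-study mean utilities. The three estimates and standard errors below are invented; the paper's actual model also adjusted for elicitation instrument and other covariates via meta-regression.

```python
# DerSimonian-Laird random-effects pooling of study-level mean utilities.
import numpy as np

u = np.array([0.84, 0.70, 0.88])   # per-study mean utilities (hypothetical)
se = np.array([0.03, 0.05, 0.04])  # their standard errors (hypothetical)

w = 1 / se**2                                   # fixed-effect weights
mean_fe = np.sum(w * u) / w.sum()
q = np.sum(w * (u - mean_fe) ** 2)              # Cochran's Q
tau2 = max(0.0, (q - (len(u) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))
w_re = 1 / (se**2 + tau2)                       # random-effects weights
pooled = np.sum(w_re * u) / w_re.sum()
print(f"pooled utility = {pooled:.2f}, tau^2 = {tau2:.4f}")
```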
16.  Arteriovenous Graft Placement in Predialysis Patients: A Potential Catheter-Sparing Strategy 
Background
When pre-dialysis patients are deemed to be unsuitable candidates for an arteriovenous fistula, current guidelines recommend waiting until just before or after initiation of dialysis before placing a graft. This strategy may increase catheter use when these patients start dialysis. We compared the outcomes of patients whose grafts were placed before and after dialysis initiation.
Study design
Retrospective analysis of a prospective computerized vascular access database.
Setting and participants
Chronic kidney disease patients receiving their first arteriovenous graft (n=248) at a large medical center.
Predictor
Timing of graft placement (before or after initiation of dialysis)
Outcome & measurements
Primary graft failure, cumulative graft survival, catheter-dependence, and catheter-related bacteremia.
Results
The first graft was placed pre-dialysis in 62 patients and post-dialysis in 186 patients. Primary graft failure was similar for pre-dialysis and post-dialysis grafts (20% vs 24%; p=0.5). The median cumulative graft survival was similar for the pre-dialysis and post-dialysis grafts (365 vs 414 days; HR, 1.22; 95% CI, 0.81–1.98; p=0.3). The median duration of catheter-dependence after graft placement in the post-dialysis group was 48 days, and was associated with 0.63 (95% CI, 0.48–0.79) episodes of catheter-related bacteremia per patient.
Limitations
Retrospective analysis, single medical center
Conclusion
Grafts placed pre-dialysis have similar primary failure rates and cumulative survival to those placed after starting dialysis. However, post-dialysis graft placement is associated with prolonged catheter dependence and frequent bacteremia. Pre-dialysis graft placement may reduce catheter-dependence and bacteremia in selected patients.
doi:10.1053/j.ajkd.2011.01.026
PMCID: PMC4034174  PMID: 21458898
17.  Fetuin-A and vascular calcification in Indian end-stage renal disease population 
Indian Journal of Nephrology  2016;26(1):33-38.
Fetuin-A levels and their correlation with vascular calcification and other biochemical markers of chronic kidney disease-mineral and bone disorder (CKD-MBD) have not been studied in the Indian end-stage renal disease population. Forty patients on dialysis for more than 3 months were studied. Biochemical parameters of CKD-MBD, highly sensitive C-reactive protein (hs-CRP), lipid profile and fetuin-A levels were estimated. Multi-slice computed tomography (MSCT) at the level of L1–L4 was performed, and a calcification score was calculated using the AJ 130 smart score. Fetuin-A levels were correlated with the calcification score and biochemical markers of CKD-MBD. The mean fetuin-A level was 0.33 ± 0.098 g/l. Abdominal aortic calcification scores correlated positively with age (P < 0.01) and duration of dialysis (P = 0.018). No correlation was detected between the MSCT score and calcium phosphate product, intact parathyroid hormone, vitamin D, triglycerides or fetuin-A, and fetuin-A levels did not correlate with age, dialysis duration or calcium phosphate product; however, significant correlations with vitamin D3 (P = 0.034) and serum albumin (P = 0.002) were detected, along with an inverse correlation with hs-CRP. Patients with ischemic heart disease had numerically lower fetuin-A levels (P = 0.427) and numerically higher MSCT scores (P = 0.135). Patients with low hs-CRP (<10) had numerically higher fetuin-A levels (P = 0.090) and significantly lower MSCT scores (P = 0.020). Calcium deposition seen on MSCT increases with age and duration of dialysis but is not related to fetuin-A levels. The relationship with other parameters of CKD-MBD remains inconclusive. Large controlled studies are needed to establish the role of fetuin-A in vascular calcification in the Indian population.
doi:10.4103/0971-4065.157007
PMCID: PMC4753739  PMID: 26937076
Abdominal aorta calcification; end-stage renal disease; fetuin-A; India
18.  Duration of temporary catheter use for hemodialysis: an observational, prospective evaluation of renal units in Brazil 
BMC Nephrology  2011;12:63.
Background
For chronic hemodialysis, the ideal permanent vascular access is the arteriovenous fistula (AVF). Temporary catheters should be reserved for acute dialysis needs. The AVF is associated with lower infection rates, better clinical results, and a higher quality of life and survival when compared to temporary catheters. In Brazil, the proportion of patients with temporary catheters for more than 3 months from the beginning of therapy is used as an evaluation of the quality of renal units. The aim of this study is to evaluate factors associated with the time between the beginning of hemodialysis with temporary catheters and the placement of the first arteriovenous fistula in Brazil.
Methods
This is an observational, prospective non-concurrent study using national administrative registries of all patients financed by the public health system who began renal replacement therapy (RRT) between 2000 and 2004 in Brazil. Incident patients undergoing hemodialysis for the first time were eligible. Patients were excluded who: had hemodialysis reportedly started after the date of death (inconsistent database); were younger than 18 years old; had HIV; had no record of the first dialysis unit; or were dialyzed in units with fewer than twenty patients. To evaluate individual and renal-unit factors associated with the event of interest, a frailty model was used (N = 55,589).
Results
Among the 23,824 patients (42.9%) who underwent fistula placement during the study period, 18.2% kept the temporary catheter for more than three months until fistula creation. The analysis identified five statistically significant factors associated with a longer time to first fistula: higher age (hazard ratio [HR] 0.99, 95% CI 0.99-1.00); having hypertension and cardiovascular diseases (HR 0.94, 95% CI 0.9-0.98) as the cause of chronic renal disease; residing in capital cities (HR 0.92, 95% CI 0.9-0.95); residing in certain regions of Brazil - South (HR 0.83, 95% CI 0.8-0.87), Midwest (HR 0.88, 95% CI 0.83-0.94), Northeast (HR 0.91, 95% CI 0.88-0.94), or North (HR 0.88, 95% CI 0.83-0.94); and the type of renal unit (public or private).
Conclusion
Monitoring the provision of arteriovenous fistulas in renal units could improve the care given to patients with end stage renal disease.
doi:10.1186/1471-2369-12-63
PMCID: PMC3227575  PMID: 22093280
19.  Safety and functionality of transhepatic hemodialysis catheters in chronic hemodialysis patients 
PURPOSE
We aimed to investigate the safety and functionality of tunneled transhepatic hemodialysis catheters in chronic hemodialysis patients.
METHODS
Thirty-eight patients (20 women aged 56±10 years and 18 men aged 61±11 years) with transhepatic tunneled hemodialysis catheters were evaluated. The date of the first transhepatic catheterization, indications, procedure details, functional time periods of catheters, reasons for the removal or revision of catheters, catheter-related complications, and current conditions of patients were retrospectively analyzed.
RESULTS
A total of 69 catheters were properly placed in all patients (100% technical success) under imaging guidance during the 91-month follow-up period. The functionality of 35 catheters could not be evaluated: five catheters were removed for reasons unrelated to complications (surgical fistulas were opened in two cases [2/35, 5.7%] and transplantation was performed in three cases [3/35, 8.6%]), 18 patients died while their catheters were functional (18/35, 51.4%), and 12 catheters were still functional at the time of the study (12/35, 34.3%). Catheter functionality was evaluated in the remaining 34 catheters, which required revision because of complications. On Kaplan-Meier analysis, only half of the catheters remained functional at day 136. The four main complications were thrombosis (16/34, 47%; complication rate of 0.37 per 100 catheter-days), infection (8/34, 23.5%; 0.18 per 100 catheter-days), migration (8/34, 23.5%; 0.18 per 100 catheter-days), and kinking (2/34, 6%; 0.04 per 100 catheter-days).
CONCLUSION
Transhepatic venous catheterization is a safe and functional alternative route in chronic hemodialysis patients without an accessible central venous route. The procedure can be performed with high technical success and low complication rates under imaging guidance.
doi:10.5152/dir.2016.16043
PMCID: PMC5098952  PMID: 27601303
20.  Effect of Clopidogrel on Early Failure of Arteriovenous Fistulas for Hemodialysis 
JAMA  2008;299(18):2164-2171.
Context
The arteriovenous fistula is the preferred type of vascular access for hemodialysis because of lower thrombosis and infection rates and lower health care expenditures compared with synthetic grafts or central venous catheters. Early failure of fistulas due to thrombosis or inadequate maturation is a barrier to increasing the prevalence of fistulas among patients treated with hemodialysis. Small, inconclusive trials have suggested that antiplatelet agents may reduce thrombosis of new fistulas.
Objective
To determine whether clopidogrel reduces early failure of hemodialysis fistulas.
Design, Setting, and Participants
Randomized, double-blind, placebo-controlled trial conducted from 2003 to 2007 at 9 US centers comprising academic and community nephrology practices. Eight hundred seventy-seven participants with end-stage renal disease or advanced chronic kidney disease were followed up until 150 to 180 days after fistula creation or 30 days after initiation of dialysis, whichever occurred later.
Intervention
Participants were randomly assigned to receive clopidogrel (300-mg loading dose followed by daily dose of 75 mg; n = 441) or placebo (n = 436) for 6 weeks starting within 1 day after fistula creation.
Main Outcome Measures
The primary outcome was fistula thrombosis, determined by physical examination at 6 weeks. The secondary outcome was failure of the fistula to become suitable for dialysis. Suitability was defined as use of the fistula at a dialysis machine blood pump rate of 300 mL/min or more during 8 of 12 dialysis sessions.
Results
Enrollment was stopped after 877 participants were randomized based on a stopping rule for intervention efficacy. Fistula thrombosis occurred in 53 (12.2%) participants assigned to clopidogrel compared with 84 (19.5%) participants assigned to placebo (relative risk, 0.63; 95% confidence interval, 0.46–0.97; P = .018). Failure to attain suitability for dialysis did not differ between the clopidogrel and placebo groups (61.8% vs 59.5%, respectively; relative risk, 1.05; 95% confidence interval, 0.94–1.17; P = .40).
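The primary-outcome relative risk above can be checked from the 2x2 counts with an unadjusted log-normal approximation, as in the sketch below; the trial's reported interval may differ slightly, for example because of the sequential stopping rule.

    import math

    events_clo, n_clo = 53, 441   # fistula thrombosis, clopidogrel arm
    events_pbo, n_pbo = 84, 436   # fistula thrombosis, placebo arm

    rr = (events_clo / n_clo) / (events_pbo / n_pbo)
    se_log_rr = math.sqrt(1/events_clo - 1/n_clo + 1/events_pbo - 1/n_pbo)
    lo, hi = (rr * math.exp(s * 1.96 * se_log_rr) for s in (-1, 1))
    print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")  # RR is approximately 0.63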
Conclusion
Clopidogrel reduces the frequency of early thrombosis of new arteriovenous fistulas but does not increase the proportion of fistulas that become suitable for dialysis.
Trial Registration
clinicaltrials.gov Identifier: NCT00067119
doi:10.1001/jama.299.18.2164
PMCID: PMC4943222  PMID: 18477783
21.  Biomarker Profiling by Nuclear Magnetic Resonance Spectroscopy for the Prediction of All-Cause Mortality: An Observational Study of 17,345 Persons 
PLoS Medicine  2014;11(2):e1001606.
In this study, Würtz and colleagues conducted high-throughput profiling of blood specimens in two large population-based cohorts in order to identify biomarkers for all-cause mortality and enhance risk prediction. The authors found that biomarker profiling improved prediction of the short-term risk of death from all causes above established risk factors. However, further investigations are needed to clarify the biological mechanisms and the utility of these biomarkers to guide screening and prevention.
Please see later in the article for the Editors' Summary
Background
Early identification of ambulatory persons at high short-term risk of death could benefit targeted prevention. To identify biomarkers for all-cause mortality and enhance risk prediction, we conducted high-throughput profiling of blood specimens in two large population-based cohorts.
Methods and Findings
106 candidate biomarkers were quantified by nuclear magnetic resonance spectroscopy of non-fasting plasma samples from a random subset of the Estonian Biobank (n = 9,842; age range 18–103 y; 508 deaths during a median of 5.4 y of follow-up). Biomarkers for all-cause mortality were examined using stepwise proportional hazards models. Significant biomarkers were validated and incremental predictive utility assessed in a population-based cohort from Finland (n = 7,503; 176 deaths during 5 y of follow-up). Four circulating biomarkers predicted the risk of all-cause mortality among participants from the Estonian Biobank after adjusting for conventional risk factors: alpha-1-acid glycoprotein (hazard ratio [HR] 1.67 per 1–standard deviation increment, 95% CI 1.53–1.82, p = 5×10⁻³¹), albumin (HR 0.70, 95% CI 0.65–0.76, p = 2×10⁻¹⁸), very-low-density lipoprotein particle size (HR 0.69, 95% CI 0.62–0.77, p = 3×10⁻¹²), and citrate (HR 1.33, 95% CI 1.21–1.45, p = 5×10⁻¹⁰). All four biomarkers were predictive of cardiovascular mortality, as well as death from cancer and other nonvascular diseases. One in five participants in the Estonian Biobank cohort with a biomarker summary score within the highest percentile died during the first year of follow-up, indicating prominent systemic reflections of frailty. The biomarker associations all replicated in the Finnish validation cohort. Including the four biomarkers in a risk prediction score improved risk assessment for 5-y mortality (increase in C-statistic 0.031, p = 0.01; continuous reclassification improvement 26.3%, p = 0.001).
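The per-1-SD hazard ratios reported above follow from z-scoring each biomarker before fitting the proportional hazards model, so that the exponentiated coefficient is the HR per standard-deviation increment. A minimal sketch with the Python lifelines library and synthetic data (one biomarker, one adjustment covariate) follows; none of it reflects the actual cohorts.

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(2)
    n = 5000
    df = pd.DataFrame({
        "followup_years": rng.uniform(0.1, 5.4, n),
        "died": rng.integers(0, 2, n),
        "age": rng.normal(50, 18, n),                          # conventional risk factor
        "alpha1_acid_glycoprotein": rng.lognormal(0, 0.3, n),  # candidate biomarker
    })

    # z-score the biomarker so exp(coef) is the HR per 1-SD increment
    col = "alpha1_acid_glycoprotein"
    df[col] = (df[col] - df[col].mean()) / df[col].std()

    CoxPHFitter().fit(df, duration_col="followup_years", event_col="died").print_summary()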
Conclusions
Biomarker associations with cardiovascular, nonvascular, and cancer mortality suggest novel systemic connectivities across seemingly disparate morbidities. The biomarker profiling improved prediction of the short-term risk of death from all causes above established risk factors. Further investigations are needed to clarify the biological mechanisms and the utility of these biomarkers for guiding screening and prevention.
Editors' Summary
Background
A biomarker is a biological molecule found in blood, body fluids, or tissues that may signal an abnormal process, a condition, or a disease. The level of a particular biomarker may indicate a patient's risk of disease, or likely response to a treatment. For example, cholesterol levels are measured to assess the risk of heart disease. Most current biomarkers are used to test an individual's risk of developing a specific condition. There are none that accurately assess whether a person is at risk of ill health generally, or likely to die soon from a disease. Early and accurate identification of people who appear healthy but in fact have an underlying serious illness would provide valuable opportunities for preventative treatment.
While most tests measure the levels of a specific biomarker, there are some technologies that allow blood samples to be screened for a wide range of biomarkers. These include nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry. These tools have the potential to be used to screen the general population for a range of different biomarkers.
Why Was This Study Done?
Identifying new biomarkers that provide insight into the risk of death from all causes could be an important step in linking different diseases and assessing patient risk. In this study, the authors used NMR spectroscopy to screen blood samples for biomarkers that accurately predict the risk of death in the general population, rather than among people already known to be ill.
What Did the Researchers Do and Find?
The researchers studied two large groups of people, one in Estonia and one in Finland. Both countries have set up health registries that collect and store blood samples and health records over many years. The registries include large numbers of people who are representative of the wider population.
The researchers first tested blood samples from a representative subset of the Estonian group, testing 9,842 samples in total. They looked at 106 different biomarkers in each sample using NMR spectroscopy. They also looked at the health records of this group and found that 508 people died during the follow-up period after the blood sample was taken, the majority from heart disease, cancer, and other diseases. Using statistical analysis, they looked for any links between the levels of different biomarkers in the blood and people's short-term risk of dying. They found that the levels of four biomarkers—plasma albumin, alpha-1-acid glycoprotein, very-low-density lipoprotein (VLDL) particle size, and citrate—appeared to accurately predict short-term risk of death. They repeated this study with the Finnish group, this time with 7,503 individuals (176 of whom died during the five-year follow-up period after giving a blood sample) and found similar results.
The researchers carried out further statistical analyses to take into account other known factors that might have contributed to the risk of life-threatening illness. These included factors such as age, weight, tobacco and alcohol use, cholesterol levels, and pre-existing illness, such as diabetes and cancer. The association between the four biomarkers and short-term risk of death remained the same even when controlling for these other factors.
The analysis also showed that combining the test results for all four biomarkers, to produce a biomarker score, provided a more accurate measure of risk than any of the biomarkers individually. This biomarker score also proved to be the strongest predictor of short-term risk of dying in the Estonian group. Individuals with a biomarker score in the top 20% had a risk of dying within five years that was 19 times greater than that of individuals with a score in the bottom 20% (288 versus 15 deaths).
What Do These Findings Mean?
This study suggests that there are four biomarkers in the blood—alpha-1-acid glycoprotein, albumin, VLDL particle size, and citrate—that can be measured by NMR spectroscopy to assess whether otherwise healthy people are at short-term risk of dying from heart disease, cancer, and other illnesses. However, further validation of these findings is still required, and additional studies should examine the biomarker specificity and associations in settings closer to clinical practice. The combined biomarker score appears to be a more accurate predictor of risk than tests for more commonly known risk factors. Identifying individuals who are at high risk using these biomarkers might help to target preventative medical treatments to those with the greatest need.
However, there are several limitations to this study. As an observational study, it provides evidence of only a correlation between a biomarker score and ill health. It does not identify any underlying causes. Other factors, not detectable by NMR spectroscopy, might be the true cause of serious health problems and would provide a more accurate assessment of risk. Nor does this study identify what kinds of treatment might prove successful in reducing the risks. Therefore, more research is needed to determine whether testing for these biomarkers would provide any clinical benefit.
There were also some technical limitations to the study. NMR spectroscopy does not detect as many biomarkers as mass spectrometry, which might therefore identify further biomarkers for a more accurate risk assessment. In addition, because both study groups were northern European, it is not yet known whether the results would be the same in other ethnic groups or populations with different lifestyles.
In spite of these limitations, the fact that the same four biomarkers are associated with a short-term risk of death from a variety of diseases does suggest that similar underlying mechanisms are taking place. This observation points to some potentially valuable areas of research to understand precisely what's contributing to the increased risk.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001606
The US National Institute of Environmental Health Sciences has information on biomarkers
The US Food and Drug Administration has a Biomarker Qualification Program to help researchers in identifying and evaluating new biomarkers
Further information on the Estonian Biobank is available
The Computational Medicine Research Team of the University of Oulu and the University of Bristol have a webpage that provides further information on high-throughput biomarker profiling by NMR spectroscopy
doi:10.1371/journal.pmed.1001606
PMCID: PMC3934819  PMID: 24586121
22.  A Crossover Intervention Trial Evaluating the Efficacy of a Chlorhexidine-Impregnated Sponge (BIOPATCH®) to Reduce Catheter-Related Bloodstream Infections in Hemodialysis Patients 
Background
Catheter-related bloodstream infections (BSI) account for the majority of hemodialysis-related infections. There are no published data on the efficacy of the chlorhexidine-impregnated foam dressing at reducing catheter-related BSI in hemodialysis patients.
Design
Prospective non-blinded cross-over intervention trial to determine the efficacy of a chlorhexidine-impregnated foam dressing (Biopatch®) to reduce catheter-related BSI in hemodialysis patients.
Setting
Two outpatient dialysis centers
Patients
A total of 121 patients who were dialyzed through tunneled central venous catheters received the intervention during the trial.
Methods
The primary outcome of interest was the incidence of catheter-related bloodstream infections. A nested cohort study of all patients who received the Biopatch® Antimicrobial Dressing was also conducted. Backward stepwise logistic regression analysis was used to determine independent risk factors for development of BSI.
Results
Thirty-seven bloodstream infections occurred in the intervention group (6.3 BSIs per 1000 dialysis sessions) and 30 in the control group (5.2 BSIs per 1000 dialysis sessions) (RR 1.22, 95% CI 0.76–1.97; P=0.46). The Biopatch® Antimicrobial Dressing was well tolerated, with only two patients (<2%) experiencing dermatitis that led to its discontinuation. The only independent risk factor for development of BSI was dialysis treatment at one of the two centers (aOR 4.4, 95% CI 1.77–13.65; P=0.002), while age ≥60 years was associated with a lower risk (aOR 0.28, 95% CI 0.09–0.82; P=0.02).
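A backward stepwise logistic regression of the kind described above can be sketched as follows; the variables and data are illustrative stand-ins, not the trial's dataset, and elimination here drops the least significant covariate until all remaining p-values fall below 0.10.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 121
    X = pd.DataFrame({
        "center_A": rng.integers(0, 2, n),   # dialyzed at the higher-risk center
        "age_ge_60": rng.integers(0, 2, n),
        "diabetes": rng.integers(0, 2, n),
    })
    y = rng.integers(0, 2, n)                # any BSI during follow-up

    kept = list(X.columns)
    while True:  # backward stepwise elimination
        model = sm.Logit(y, sm.add_constant(X[kept])).fit(disp=0)
        pvals = model.pvalues.drop("const")
        if pvals.max() < 0.10 or len(kept) == 1:
            break
        kept.remove(pvals.idxmax())

    print(np.exp(model.params))              # adjusted odds ratios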
Conclusion
The use of a chlorhexidine-impregnated foam dressing (Biopatch®) did not decrease catheter-related BSIs among hemodialysis patients with tunneled central venous catheters.
doi:10.1086/657075
PMCID: PMC3077924  PMID: 20879855
chlorhexidine-impregnated dressing; hemodialysis; bloodstream infection; tunneled catheter
23.  Tunneled central venous catheters: Experience from a single center 
Indian Journal of Nephrology  2011;21(2):107-111.
In the past, vascular surgeons were called in to place tunneled central venous catheters (TVC) for hemodialysis patients. The advent of percutaneous techniques has resulted in an increasing number of interventional nephrologists inserting them. A single-centre, three-year audit of 100 TVCs with a cumulative follow-up of 492 patient-months is presented here. From 2007 to 2010, 100 TVCs were placed percutaneously by nephrologists in the operating room or the interventional nephrology suite. Those who completed a minimum of three months on the catheter were included in the analysis. There were 69 males and 31 females with a mean age of 52.3±13.6 years (range: 25-76). Chronic glomerulonephritis was the commonest cause of CKD (45%), followed by diabetes (39%). The right internal jugular vein was the preferred site (94%). The TVC was utilized as the primary access to initiate dialysis in 25% of patients, in whom a live donor was available for renal transplant. The blood flow was 250-270 ml/min. Kaplan-Meier analysis showed 3-month and 6-month catheter survival rates of 80% and 55%, respectively. The main complications were exit-site blood ooze, catheter block, and kinking. The catheter-related bacteremia rate was low at 0.4/1000 patient-days. The primary cause of dropout was patient death unrelated to the TVC. Those under the age of 40 years showed better catheter survival, but gender, catheter site, and etiology of CKD had no bearing on survival. Tunneled central venous catheters could find a niche as the primary access of choice before live-donor renal transplantation, in view of their immediate usability, high blood flows, low infection rates, and adequate patency for 3-6 months.
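The fixed-time survival figures quoted above (80% at 3 months, 55% at 6 months) come straight off the Kaplan-Meier curve; the sketch below shows that readout, plus the bacteremia rate per 1000 patient-days, using the Python lifelines library and synthetic stand-in numbers.

    import numpy as np
    from lifelines import KaplanMeierFitter

    rng = np.random.default_rng(4)
    months_on_catheter = rng.exponential(7, 100)  # 100 TVCs, synthetic durations
    failed = rng.integers(0, 2, 100)              # 1 = catheter failure, 0 = censored

    kmf = KaplanMeierFitter().fit(months_on_catheter, event_observed=failed)
    print(kmf.survival_function_at_times([3, 6]))  # 3- and 6-month catheter survival

    bacteremia_episodes, patient_days = 6, 15000   # illustrative counts
    print(1000 * bacteremia_episodes / patient_days)  # episodes per 1000 patient-days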
doi:10.4103/0971-4065.82133
PMCID: PMC3132329  PMID: 21769173
Bacteremia; catheter survival; hemodialysis; tunneled central venous catheters
24.  Blood Loss through AV Fistula: A Case Report and Literature Review 
Little has been written about acute blood loss from hemodialysis vascular access. We describe a 57-year-old Caucasian male with an approximately 7 g/dL drop in hemoglobin due to bleeding from a ruptured aneurysm in his right brachiocephalic arteriovenous fistula (AVF). There was no evidence of fistula infection. The patient was successfully managed by blood transfusions and insertion of a tunneled dialysis catheter for dialysis access. Later, the fistula was ligated and a new fistula was constructed in the opposite arm. Aneurysm should be considered in cases of acute vascular access bleeding in chronic dialysis patients.
doi:10.4061/2011/350870
PMCID: PMC3118665  PMID: 21716705
25.  A randomised controlled trial of Heparin versus EthAnol Lock THerapY for the prevention of Catheter Associated infecTion in Haemodialysis patients – the HEALTHY-CATH trial 
BMC Nephrology  2012;13:146.
Background
Tunnelled central venous dialysis catheter use is significantly limited by the occurrence of catheter-related infections. This randomised controlled trial assessed the efficacy of a 48-hour 70% ethanol lock versus heparin locks in prolonging the time to the first episode of catheter-related blood stream infection (CRBSI).
Methods
Patients undergoing haemodialysis (HD) via a tunnelled catheter were randomised 1:1 to once-weekly ethanol locks (with two heparin locks between the other dialysis sessions) or thrice-weekly heparin locks.
Results
Observed catheter days in the heparin (n=24) and ethanol (n=25) groups were 1814 and 3614, respectively. CRBSI occurred at a rate of 0.85 vs 0.28 per 1000 catheter days in the heparin vs ethanol group by intention-to-treat analysis (incidence rate ratio [IRR] for ethanol vs heparin 0.17; 95% CI 0.02-1.63; p=0.12). Flow issues requiring catheter removal occurred at rates of 1.6 and 1.4 per 1000 catheter days in the heparin and ethanol groups, respectively (IRR 0.85; 95% CI 0.20-3.5; p=0.82 for ethanol vs heparin).
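The incidence-rate-ratio comparison above can be reproduced in form with a Poisson model offset by log catheter-days, as in the sketch below; the event counts are rough back-calculations from the reported rates, not the trial data, so the IRR will not match exactly.

    import numpy as np
    import statsmodels.api as sm

    catheter_days = np.array([1814.0, 3614.0])  # heparin, ethanol
    events = np.array([2.0, 1.0])               # illustrative CRBSI counts per group
    ethanol = np.array([0.0, 1.0])              # group indicator

    X = sm.add_constant(ethanol)
    fit = sm.GLM(events, X, family=sm.families.Poisson(),
                 offset=np.log(catheter_days)).fit()
    print(np.exp(fit.params[1]))                # IRR, ethanol vs heparin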
Conclusions
Catheter survival and catheter-related blood stream infection were not significantly different but there was a trend towards a reduced rate of infection in the ethanol group. This study establishes proof of concept and will inform an adequately powered multicentre trial to definitively examine the efficacy and safety of ethanol locks as an alternative to current therapies used in the prevention of catheter-associated blood stream infections in patients dialysing with tunnelled catheters.
Trial Registration
Australian New Zealand Clinical Trials Registry ACTRN12609000493246
doi:10.1186/1471-2369-13-146
PMCID: PMC3531247  PMID: 23121768
Catheter related blood stream infection (CRBSI); Central venous catheter; Ethanol; Lock therapy; Haemodialysis (HD); Prophylaxis
