1.  Precision of Biomarkers to Define Chronic Inflammation in CKD 
American Journal of Nephrology  2008;28(5):808-812.
Background/Aims
Several inflammatory biomarkers have been found to be associated with cardiovascular disease or all-cause mortality in dialysis patients, but their usefulness in clinical practice or as surrogate endpoints is not certain. The purpose of the present study was to determine the intrapatient variation of C-reactive protein, IL-6, fetuin-A and albumin in a population of dialysis patients.
Methods
Apparently healthy dialysis patients with either a tunneled dialysis catheter or fistula had monthly assessments of these biomarkers for a total of four determinations, and the intraclass correlation coefficients were calculated as measures of intersubject variance.
Results
Our results showed large within-subject variation relative to the total variation in the measurements (31–46%). Having a tunneled catheter as opposed to a fistula was not significantly associated with mean levels, suggesting that chronic subclinical catheter infection does not explain the variation seen in the biomarkers. In contrast, there was a rapid change in these biomarkers with a clinically apparent acute infection.
Conclusion
These results suggest that these biomarkers have limitations for use as surrogate endpoints in clinical trials due to wide fluctuations, even in apparently clinically healthy individuals.
doi:10.1159/000135692
PMCID: PMC2574778  PMID: 18506106
Biomarkers, precision; Chronic inflammation; Chronic kidney disease; CKD stage 5D; Inflammatory biomarkers, intrapatient variance; Tunneled dialysis catheter
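The reliability statistic this study relies on can be sketched in a few lines. This is a generic one-way random-effects intraclass correlation coefficient (the share of total variance attributable to between-subject differences) computed on invented biomarker data; it is not the study's own code or data.

```python
# Hedged sketch: one-way random-effects ICC, the kind of statistic the
# abstract describes. All measurement values below are illustrative.

def icc_oneway(data):
    """data: list of per-subject measurement lists, each of equal length k."""
    n = len(data)                    # number of subjects
    k = len(data[0])                 # repeated measurements per subject
    grand = sum(sum(row) for row in data) / (n * k)
    subj_means = [sum(row) / k for row in data]
    # between-subject and within-subject mean squares (one-way ANOVA)
    msb = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(data, subj_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Four monthly CRP-like measurements for three hypothetical patients
measurements = [
    [5.0, 6.0, 5.5, 5.8],
    [12.0, 11.5, 13.0, 12.2],
    [2.0, 2.5, 1.8, 2.2],
]
print(round(icc_oneway(measurements), 3))
```

A high ICC means subjects differ much more from each other than from themselves over time; the wide within-subject fluctuation reported in the abstract corresponds to a lower ICC.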
2.  Effects of starting hemodialysis with an arteriovenous fistula or central venous catheter compared with peritoneal dialysis: a retrospective cohort study 
BMC Nephrology  2012;13:88.
Background
Although several studies have demonstrated early survival advantages with peritoneal dialysis (PD) over hemodialysis (HD), the reason for the excess mortality observed among incident HD patients remains to be established. This study explores the relationship between mortality and dialysis modality, focusing on the role of HD vascular access type at the time of dialysis initiation.
Methods
A retrospective cohort study was performed among local adult chronic kidney disease patients who consecutively initiated PD and HD with a tunneled cuffed venous catheter (HD-TCC) or a functional arteriovenous fistula (HD-AVF) in our institution in the year 2008. A total of 152 patients were included in the final analysis (HD-AVF, n = 59; HD-TCC, n = 51; PD, n = 42). All cause and dialysis access-related morbidity/mortality were evaluated at one year. Univariate and multivariate analysis were used to compare the survival of PD patients with those who initiated HD with an AVF or with a TCC.
Results
Compared with PD patients, both HD-AVF and HD-TCC patients were more likely to be older (p<0.001) and to have a higher frequency of diabetes mellitus (p = 0.017) and cardiovascular disease (p = 0.020). Overall, HD-TCC patients were more likely to have clinical visits (p = 0.069), emergency room visits (p<0.001) and hospital admissions (p<0.001). At the end of follow-up, HD-TCC patients had a higher rate of dialysis access-related complications (1.53 vs. 0.93 vs. 0.64, per patient-year; p<0.001) and hospitalizations (0.47 vs. 0.07 vs. 0.14, per patient-year; p = 0.034) than HD-AVF and PD patients, respectively. The survival rates at one year were 96.6%, 74.5% and 97.6% for the HD-AVF, HD-TCC and PD groups, respectively (p<0.001). In multivariate analysis, HD-TCC use at the time of dialysis initiation was an important factor associated with death (HR 16.128, 95%CI [1.431-181.778], p = 0.024).
Conclusion
Our results suggest that HD vascular access type at the time of renal replacement therapy initiation is an important modifier of the relationship between dialysis modality and survival among incident dialysis patients.
doi:10.1186/1471-2369-13-88
PMCID: PMC3476986  PMID: 22916962
3.  The impact of Vascular Access on the Adequacy of Dialysis and the Outcome of the Dialysis Treatment: One Center Experience 
Materia Socio-Medica  2015;27(2):114-117.
Introduction:
Chronic kidney disease (CKD) is a gradual reduction in glomerular filtration rate (GFR) caused by the destruction of a large number of nephrons. Kidney failure is the final stage of CKD, with GFR <15 ml/min/1.73 m2 or a requirement for dialysis. Patients must have vascular access established, which is both the “life line” and the “Achilles heel” of hemodialysis treatment.
Aim:
The purpose of this research is to describe the demographic structure of the hemodialysis center in Konjic, and to demonstrate the impact of vascular access on the adequacy and outcome of dialysis treatment.
Methods:
This cross-sectional study included 36 patients on hemodialysis at the Center in Konjic from September 2010 to December 2014. Data were collected from medical records, and adequate dialysis was defined as Kt/V > 1.2. Statistical analysis was performed using SPSS software and Student's t-test.
Results:
The mortality of patients treated by dialysis was 37.8%. The ratio of male to female patients was 55.6% vs. 44.5%, with an average age of 52.91±14.36 years and an average duration of hemodialysis of five years. The highest percentage of patients (72.2%) were dialyzed through an arteriovenous fistula (AVF) on the forearm. In these patients the most common complication was thrombosis (30.5%), which required recanalization in 11% and replacement in 19.5% of patients. Of the remaining patients, 16.7% were dialyzed via a temporary and 11.1% via a permanent catheter in the subclavian vein; the most common complication in these patients was infection, in 83.3% of cases. Although the AVF is the more frequent access, experience shows frequent implantation of permanent catheters in elderly patients because of the poorer quality of their blood vessels. Although Kt/V was below 1.2 in patients dialyzed through a temporary catheter and above 1.2 for the other two access types, our results confirm that vascular access did not influence the quality of dialysis. The average Kt/V shows that an adequate dialysis dose was delivered in this Center, which means that, beyond vascular access, other factors can also affect dialysis treatment, as noticed by both patients and staff.
Conclusion:
Although the mortality rate was highest in patients with a permanent catheter and lowest in patients with an AVF, the type of vascular access did not affect the outcome of dialysis treatment.
doi:10.5455/msm.2015.27.4-114-117
PMCID: PMC4404955  PMID: 26005389
hemodialysis; vascular access; Kt/V
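The adequacy threshold used above (Kt/V > 1.2) can be made concrete. The abstract does not state how Kt/V was computed, so the sketch below uses the widely cited second-generation Daugirdas estimate of single-pool Kt/V, applied to an entirely hypothetical dialysis session:

```python
import math

def sp_ktv(pre_bun, post_bun, hours, uf_litres, post_weight_kg):
    """Second-generation Daugirdas estimate of single-pool Kt/V.

    pre_bun/post_bun: blood urea nitrogen before and after the session,
    hours: session length, uf_litres: ultrafiltration volume,
    post_weight_kg: post-dialysis weight.
    """
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / post_weight_kg

# Hypothetical session: BUN falls from 60 to 20 mg/dL over 4 h, 2 L removed, 70 kg
ktv = sp_ktv(60, 20, 4.0, 2.0, 70.0)
print(f"Kt/V = {ktv:.2f}, adequate: {ktv > 1.2}")  # → Kt/V = 1.28, adequate: True
```

A session that clears urea less effectively (smaller drop in BUN, shorter time) pushes the estimate below the 1.2 adequacy cut-off used in the study.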
4.  Care of undocumented-uninsured immigrants in a large urban dialysis unit 
BMC Nephrology  2012;13:112.
Background
Medical, ethical and financial dilemmas may arise in treating undocumented-uninsured patients with end-stage renal disease (ESRD). Here we describe the 10-year experience of treating undocumented-uninsured ESRD patients in a large public dialysis unit.
Methods
We evaluated the medical files of all the chronic dialysis patients treated at the Tel-Aviv Medical Center between the years 2000–2010. Data for all immigrant patients without documentation and medical insurance were obtained. Clinical data were compared with an age-matched cohort of 77 insured dialysis patients.
Results
Fifteen undocumented-uninsured patients were treated with chronic scheduled dialysis therapy for a mean length of 2.3 years and a total of 4953 hemodialysis sessions, despite lack of reimbursement. All undocumented-uninsured patients presented initially with symptoms attributed to uremia and with stage 5 chronic kidney disease (CKD). In comparison, in the age-matched cohort, only 6 patients (8%) were initially evaluated by a nephrologist at stage 5 CKD. Levels of hemoglobin (8.5 ± 1.7 versus 10.8 ± 1.6 g/dL; p < 0.0001) and albumin (33.8 ± 4.8 versus 37.7 ± 3.9 g/L; p < 0.001) were lower in the undocumented-uninsured dialysis patients compared with the age-matched insured patients at initiation of hemodialysis therapy. These significant changes were persistent throughout the treatment period. Hemodialysis was performed in all the undocumented-uninsured patients via tunneled cuffed catheters (TCC) without higher rates of TCC-associated infections. The rate of skipped hemodialysis sessions was similar in the undocumented-uninsured and age-matched insured cohorts.
Conclusions
Undocumented-uninsured dialysis patients presented initially in the advanced stages of CKD with lower levels of hemoglobin and worse nutritional status in comparison with age-matched insured patients. The type of vascular access for hemodialysis was less than optimal with regards to current guidelines. There is a need for the national and international nephrology communities to establish a policy concerning the treatment of undocumented-uninsured patients with CKD.
doi:10.1186/1471-2369-13-112
PMCID: PMC3615959  PMID: 22992409
Dialysis; ESRD; Undocumented; Uninsured; Immigrants
5.  Cost Analysis of Hemodialysis and Peritoneal Dialysis Access in Incident Dialysis Patients 
♦ Background: Although several studies have demonstrated the economic advantages of peritoneal dialysis (PD) over hemodialysis (HD), few reports in the literature have compared the costs of HD and PD access. The aim of the present study was to compare the resources required to establish and maintain the dialysis access in patients who initiated HD with a tunneled cuffed catheter (TCC) or an arteriovenous fistula (AVF) and in patients who initiated PD.
♦ Methods: We retrospectively analyzed the 152 chronic kidney disease patients who consecutively initiated dialysis treatment at our institution in 2008 (HD-AVF, n = 65; HD-TCC, n = 45; PD, n = 42). Detailed clinical and demographic information and data on access type were collected for all patients. A comprehensive measure of total dialysis access costs, including surgery, radiology, hospitalization for access complications, physician costs, and transportation costs was obtained at year 1 using an intention-to-treat approach. All resources used were valued using 2010 prices, and costs are reported in 2010 euros.
♦ Results: Compared with the HD-AVF and HD-TCC modalities, PD was associated with a significantly lower risk of access-related interventions (adjusted rate ratios: 1.572 and 1.433 respectively; 95% confidence intervals: 1.253 to 1.891 and 1.069 to 1.797). The mean dialysis access-related costs per patient-year at risk were €1171.6 [median: €608.8; interquartile range (IQR): €563.1 - €936.7] for PD, €1555.2 (median: €783.9; IQR: €371.4 - €1571.7) for HD-AVF, and €4208.2 (median: €1252.4; IQR: €947.9 - €2983.5) for HD-TCC (p < 0.001). In multivariate analysis, total dialysis access costs were significantly higher for the HD-TCC modality than for either PD or HD-AVF (β = -0.53; 95% CI: -1.03 to -0.02; and β = -0.50; 95% CI: -0.96 to -0.04).
♦ Conclusions: Compared with patients initiating HD, those initiating PD required fewer resources to establish and maintain a dialysis access during the first year of treatment.
doi:10.3747/pdi.2011.00309
PMCID: PMC3862096  PMID: 23455977
Cost analysis; health economics; hemodialysis; dialysis access; vascular access; peritoneal catheter
6.  Analysis of Vascular Access in Haemodialysis Patients - Single Center Experience 
Background
Vascular access is the key in successful management of chronic haemodialysis (HD) patients. Though native arteriovenous fistula (AVF) is considered the access of choice, many patients in our country initiate haemodialysis through central venous catheter (CVC). There is paucity of data on vascular access in haemodialysis patients from southern India.
Aim
Aim of the present study was to review our experience of vascular access in Haemodialysis patients (both central venous catheters and arteriovenous fistula) and to assess its success rate and common complications.
Materials and Methods
This prospective study was conducted between January 2014 and December 2014 in our institute. A total of 50 patients with Chronic Kidney Disease (CKD) underwent vascular access intervention during the above period.
Results
A temporary venous catheter in the right internal jugular vein was the most common mode of initiation of haemodialysis (96%), with a 34.48% incidence of catheter-related sepsis. Fifty percent of catheters were removed electively, with a mean catheter survival of 77.23 ± 14.8 days. The wrist (60%) was the most common site of AVF creation, followed by the arm (30%), mid-forearm (7.5%) and leg (2.5%). Complications included distal oedema (17.5%) and venous hypertension (2.5%). Primary failure occurred in 25% of patients and was more common in diabetic patients, the elderly (>60 years) and distal fistulas. Elderly patients (>60 years) starting dialysis with a CVC were more likely to be CVC dependent at 90 days.
Conclusion
Late presentation and delayed diagnosis of chronic kidney disease (CKD) necessitate dialysis initiation through a temporary catheter. The dialysis catheter, with its attendant complications, further adds to morbidity, mortality, health care burden and costs. Early nephrology referral and permanent access creation in the pre-dialysis stage could avert the unnecessary complications and costs of catheters.
doi:10.7860/JCDR/2015/13342.6611
PMCID: PMC4625272  PMID: 26557553
Arteriovenous fistula; Temporary haemodialysis catheter
7.  Risk Models to Predict Chronic Kidney Disease and Its Progression: A Systematic Review 
PLoS Medicine  2012;9(11):e1001344.
A systematic review of risk prediction models conducted by Justin Echouffo-Tcheugui and Andre Kengne examines the evidence base for prediction of chronic kidney disease risk and its progression, and suitability of such models for clinical use.
Background
Chronic kidney disease (CKD) is common, and associated with increased risk of cardiovascular disease and end-stage renal disease, which are potentially preventable through early identification and treatment of individuals at risk. Although risk factors for occurrence and progression of CKD have been identified, their utility for CKD risk stratification through prediction models remains unclear. We critically assessed risk models to predict CKD and its progression, and evaluated their suitability for clinical use.
Methods and Findings
We systematically searched MEDLINE and Embase (1 January 1980 to 20 June 2012). Dual review was conducted to identify studies that reported on the development, validation, or impact assessment of a model constructed to predict the occurrence/presence of CKD or progression to advanced stages. Data were extracted on study characteristics, risk predictors, discrimination, calibration, and reclassification performance of models, as well as validation and impact analyses. We included 26 publications reporting on 30 CKD occurrence prediction risk scores and 17 CKD progression prediction risk scores. The vast majority of CKD risk models had acceptable-to-good discriminatory performance (area under the receiver operating characteristic curve > 0.70) in the derivation sample. Calibration was less commonly assessed, but overall was found to be acceptable. Only eight CKD occurrence and five CKD progression risk models have been externally validated, displaying modest-to-acceptable discrimination. Whether novel biomarkers of CKD (circulatory or genetic) can improve prediction largely remains unclear, and impact studies of CKD prediction models have not yet been conducted. Limitations of risk models include the lack of ethnic diversity in derivation samples, and the scarcity of validation studies. The review is limited by the lack of an agreed-on system for rating prediction models, and the difficulty of assessing publication bias.
Conclusions
The development and clinical application of renal risk scores is in its infancy; however, the discriminatory performance of existing tools is acceptable. The effect of using these models in practice is still to be explored.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Chronic kidney disease (CKD)—the gradual loss of kidney function—is increasingly common worldwide. In the US, for example, about 26 million adults have CKD, and millions more are at risk of developing the condition. Throughout life, small structures called nephrons inside the kidneys filter waste products and excess water from the blood to make urine. If the nephrons stop working because of injury or disease, the rate of blood filtration decreases, and dangerous amounts of waste products such as creatinine build up in the blood. Symptoms of CKD, which rarely occur until the disease is very advanced, include tiredness, swollen feet and ankles, puffiness around the eyes, and frequent urination, especially at night. There is no cure for CKD, but progression of the disease can be slowed by controlling high blood pressure and diabetes, both of which cause CKD, and by adopting a healthy lifestyle. The same interventions also reduce the chances of CKD developing in the first place.
Why Was This Study Done?
CKD is associated with an increased risk of end-stage renal disease, which is treated with dialysis or by kidney transplantation (renal replacement therapies), and of cardiovascular disease. These life-threatening complications are potentially preventable through early identification and treatment of CKD, but most people present with advanced disease. Early identification would be particularly useful in developing countries, where renal replacement therapies are not readily available and resources for treating cardiovascular problems are limited. One way to identify people at risk of a disease is to use a “risk model.” Risk models are constructed by testing the ability of different combinations of risk factors that are associated with a specific disease to identify those individuals in a “derivation sample” who have the disease. The model is then validated on an independent group of people. In this systematic review (a study that uses predefined criteria to identify all the research on a given topic), the researchers critically assess the ability of existing CKD risk models to predict the occurrence of CKD and its progression, and evaluate their suitability for clinical use.
What Did the Researchers Do and Find?
The researchers identified 26 publications reporting on 30 risk models for CKD occurrence and 17 risk models for CKD progression that met their predefined criteria. The risk factors most commonly included in these models were age, sex, body mass index, diabetes status, systolic blood pressure, serum creatinine, protein in the urine, and serum albumin or total protein. Nearly all the models had acceptable-to-good discriminatory performance (a measure of how well a model separates people who have a disease from people who do not have the disease) in the derivation sample. Not all the models had been calibrated (assessed for whether the average predicted risk within a group matched the proportion that actually developed the disease), but in those that had been assessed calibration was good. Only eight CKD occurrence and five CKD progression risk models had been externally validated; discrimination in the validation samples was modest-to-acceptable. Finally, very few studies had assessed whether adding extra variables to CKD risk models (for example, genetic markers) improved prediction, and none had assessed the impact of adopting CKD risk models on the clinical care and outcomes of patients.
What Do These Findings Mean?
These findings suggest that the development and clinical application of CKD risk models is still in its infancy. Specifically, these findings indicate that the existing models need to be better calibrated and need to be externally validated in different populations (most of the models were tested only in predominantly white populations) before they are incorporated into guidelines. The impact of their use on clinical outcomes also needs to be assessed before their widespread use is recommended. Such research is worthwhile, however, because of the potential public health and clinical applications of well-designed risk models for CKD. Such models could be used to identify segments of the population that would benefit most from screening for CKD, for example. Moreover, risk communication to patients could motivate them to adopt a healthy lifestyle and to adhere to prescribed medications, and the use of models for predicting CKD progression could help clinicians tailor disease-modifying therapies to individual patient needs.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001344.
This study is further discussed in a PLOS Medicine Perspective by Maarten Taal
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including a selection of patient experiences of kidney disease
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
doi:10.1371/journal.pmed.1001344
PMCID: PMC3502517  PMID: 23185136
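The discrimination measure this review reports (area under the ROC curve) has a simple rank-based interpretation: the probability that a randomly chosen case receives a higher risk score than a randomly chosen non-case. A minimal sketch, with entirely made-up risk scores:

```python
# Rank-based (Mann-Whitney) AUC: ties between a case and a non-case count as 1/2.
def auc(case_scores, noncase_scores):
    wins = sum((c > n) + 0.5 * (c == n)
               for c in case_scores for n in noncase_scores)
    return wins / (len(case_scores) * len(noncase_scores))

cases = [0.9, 0.8, 0.7, 0.6]      # predicted risks of people who developed CKD
noncases = [0.5, 0.65, 0.3, 0.4]  # predicted risks of people who did not
print(auc(cases, noncases))       # values above 0.70 count as acceptable here
```

An AUC of 0.5 means the score is no better than chance; the review's 0.70 threshold for "acceptable" discrimination corresponds to cases outranking non-cases 70% of the time.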
8.  A Systematic Review and Meta-Analysis of Utility-Based Quality of Life in Chronic Kidney Disease Treatments 
PLoS Medicine  2012;9(9):e1001307.
Melanie Wyld and colleagues examined previously published studies to assess pooled utility-based quality of life of the various treatments for chronic kidney disease. They conclude that the highest utility was for kidney transplants, with home-based automated peritoneal dialysis being second.
Background
Chronic kidney disease (CKD) is a common and costly condition to treat. Economic evaluations of health care often incorporate patient preferences for health outcomes using utilities. The objective of this study was to determine pooled utility-based quality of life (the numerical value attached to the strength of an individual's preference for a specific health outcome) by CKD treatment modality.
Methods and Findings
We conducted a systematic review, meta-analysis, and meta-regression of peer-reviewed published articles and of PhD dissertations published through 1 December 2010 that reported utility-based quality of life (utility) for adults with late-stage CKD. Studies reporting utilities by proxy (e.g., reported by a patient's doctor or family member) were excluded.
In total, 190 studies reporting 326 utilities from over 56,000 patients were analysed. There were 25 utilities from pre-treatment CKD patients, 226 from dialysis patients (haemodialysis, n = 163; peritoneal dialysis, n = 44), 66 from kidney transplant patients, and three from patients treated with non-dialytic conservative care. Using time tradeoff as a referent instrument, kidney transplant recipients had a mean utility of 0.82 (95% CI: 0.74, 0.90). The mean utility was comparable in pre-treatment CKD patients (difference = −0.02; 95% CI: −0.09, 0.04), 0.11 lower in dialysis patients (95% CI: −0.15, −0.08), and 0.2 lower in conservative care patients (95% CI: −0.38, −0.01). Patients treated with automated peritoneal dialysis had a significantly higher mean utility (0.80) than those on continuous ambulatory peritoneal dialysis (0.72; p = 0.02). The mean utility of transplant patients increased over time, from 0.66 in the 1980s to 0.85 in the 2000s, an increase of 0.19 (95% CI: 0.11, 0.26). Utility varied by elicitation instrument, with standard gamble producing the highest estimates, and the SF-6D by Brazier et al., University of Sheffield, producing the lowest estimates. The main limitations of this study were that treatment assignments were not random, that only transplant had longitudinal data available, and that we calculated EuroQol Group EQ-5D scores from SF-36 and SF-12 health survey data, and therefore the algorithms may not reflect EQ-5D scores measured directly.
Conclusions
For patients with late-stage CKD, treatment with dialysis is associated with a significant decrement in quality of life compared to treatment with kidney transplantation. These findings provide evidence-based utility estimates to inform economic evaluations of kidney therapies, useful for policy makers and in individual treatment discussions with CKD patients.
Editors' Summary
Background
Ill health can adversely affect an individual's quality of life, particularly if caused by long-term (chronic) conditions, such as chronic kidney disease—in the United States alone, 23 million people have chronic kidney disease, of whom 570,000 are treated with dialysis or kidney transplantation. In order to measure the cost-effectiveness of interventions to manage medical conditions, health economists use an objective measurement known as quality-adjusted life years. However, although useful, quality-adjusted life years are often criticized for not taking into account the views and preferences of the individuals with the medical conditions. A measurement called a utility solves this problem. A utility is a numerical value (on a scale from 0 to 1, where 0 represents death and 1 represents full health) of the strength of an individual's preference for specified health-related outcomes, as measured by “instruments” (questionnaires) that rate direct comparisons or assess quality of life.
Why Was This Study Done?
Previous studies have suggested that, in people with chronic kidney disease, quality of life (as measured by utility) is higher in those with a functioning kidney transplant than in those on dialysis. However, currently, it is unclear whether the type of dialysis affects quality of life: hemodialysis is a highly technical process that directly filters the blood, usually must be done 2–4 times a week, and can only be performed in a health facility; peritoneal dialysis, in which fluids are infused into the abdominal cavity, can be done nightly at home (automated peritoneal dialysis) or throughout the day (continuous ambulatory peritoneal dialysis). In this study, the researchers reviewed and assimilated all of the available evidence to investigate whether quality of life in people with chronic kidney disease (as measured by utility) differed according to treatment type.
What Did the Researchers Do and Find?
The researchers did a comprehensive search of 11 databases to identify all relevant studies that included people with severe (stage 3, 4, or 5) chronic kidney disease, their form of treatment, and information on utilities—either reported directly, or included in quality of life instruments (SF-36), so the researchers could calculate utilities by using a validated algorithm. The researchers also recorded the prevalence rates of diabetes in study participants. Then, using statistical models that adjusted for various factors, including treatment type and the method of measuring utilities, the researchers were able to calculate the pooled utilities of each form of treatment for chronic kidney disease.
The researchers included 190 studies, representing over 56,000 patients and generating 326 utility estimates, in their analysis. The majority of utilities (77%) were derived through the SF-36 questionnaire via calculation. Of the 326 utility estimates, 25 were from patients pre-dialysis, 226 were from dialysis patients (the majority of whom were receiving hemodialysis), 66 were from kidney transplant patients, and three were from conservative care patients. The researchers found that the highest average utility was for those who had renal transplantation, 0.82, followed by the pre-dialysis group (0.80), dialysis patients (0.71), and, finally, patients receiving conservative care (0.62). When comparing the type of dialysis, the researchers found that there was little difference in utility between hemodialysis and peritoneal dialysis, but patients using automated peritoneal dialysis had, on average, a higher utility (0.80) than those treated with continuous ambulatory peritoneal dialysis (0.72). Finally, the researchers found that patient groups with diabetes had significantly lower utilities than those without diabetes.
What Do These Findings Mean?
These findings suggest that in people with chronic kidney disease, renal transplantation is the best treatment option to improve quality of life. For those on dialysis, home-based automated peritoneal dialysis may improve quality of life more than the other forms of dialysis: this finding is important, as this type of dialysis is not as widely used as other forms and is also cheaper than hemodialysis. Furthermore, these findings suggest that patients who choose conservative care have significantly lower quality of life than patients treated with dialysis, a finding that warrants further investigation. Overall, in addition to helping to inform economic evaluations of treatment options, the information from this analysis can help guide clinicians caring for patients with chronic kidney disease in their discussions about possible treatment options.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001307.
Information about chronic kidney disease is available from the National Kidney Foundation and MedlinePlus
Wikipedia gives information on general utilities (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001307
PMCID: PMC3439392  PMID: 22984353
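The arithmetic linking the review's pooled utilities to economic evaluation is straightforward. In the sketch below, the 0.82 transplant utility and the 0.11 dialysis decrement come from the abstract; the 10-year horizon and 3% discount rate are assumptions added for illustration, not values from the study.

```python
def qalys(utility, years):
    """Undiscounted quality-adjusted life years: utility weight times time."""
    return utility * years

def discounted_qalys(utility, years, rate=0.03):
    """Annual utility weights discounted at `rate`, as is common in
    cost-effectiveness analysis (assumed rate, not from the review)."""
    return sum(utility / (1 + rate) ** t for t in range(1, years + 1))

transplant = 0.82        # pooled mean utility, kidney transplant recipients
dialysis = 0.82 - 0.11   # dialysis utility was 0.11 lower (95% CI -0.15, -0.08)

gain = qalys(transplant, 10) - qalys(dialysis, 10)
print(round(gain, 2))    # undiscounted QALY gain over the assumed 10 years
print(round(discounted_qalys(transplant, 10) - discounted_qalys(dialysis, 10), 2))
```

Even a 0.11 utility difference compounds into a meaningful QALY gain over a decade, which is why such pooled estimates matter for the economic evaluations the authors have in mind.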
9.  Arteriovenous Graft Placement in Predialysis Patients: A Potential Catheter-Sparing Strategy 
Background
When pre-dialysis patients are deemed to be unsuitable candidates for an arteriovenous fistula, current guidelines recommend waiting until just before or after initiation of dialysis before placing a graft. This strategy may increase catheter use when these patients start dialysis. We compared the outcomes of patients whose grafts were placed before and after dialysis initiation.
Study design
Retrospective analysis of a prospective computerized vascular access database.
Setting and participants
Chronic kidney disease patients receiving their first arteriovenous graft (n=248) at a large medical center.
Predictor
Timing of graft placement (before or after initiation of dialysis)
Outcome & measurements
Primary graft failure, cumulative graft survival, catheter-dependence, and catheter-related bacteremia.
Results
The first graft was placed pre-dialysis in 62 patients and post-dialysis in 186 patients. Primary graft failure was similar for pre-dialysis and post-dialysis grafts (20% vs 24%; p=0.5). The median cumulative graft survival was similar for the pre-dialysis and post-dialysis grafts (365 vs 414 days; HR, 1.22; 95% CI, 0.81–1.98; p=0.3). The median duration of catheter-dependence after graft placement in the post-dialysis group was 48 days, and was associated with 0.63 (95% CI, 0.48–0.79) episodes of catheter-related bacteremia per patient.
Limitations
Retrospective analysis, single medical center
Conclusion
Grafts placed pre-dialysis have similar primary failure rates and cumulative survival to those placed after starting dialysis. However, post-dialysis graft placement is associated with prolonged catheter dependence and frequent bacteremia. Pre-dialysis graft placement may reduce catheter-dependence and bacteremia in selected patients.
doi:10.1053/j.ajkd.2011.01.026
PMCID: PMC4034174  PMID: 21458898
10.  An integrated review of "unplanned" dialysis initiation: reframing the terminology to "suboptimal" initiation 
BMC Nephrology  2009;10:22.
Background
Ideally, care prior to the initiation of dialysis should increase the likelihood that patients start electively outside of the hospital setting with a mature arteriovenous fistula (AVF) or peritoneal dialysis (PD) catheter. However, unplanned dialysis continues to occur in patients both known and unknown to nephrology services, and in both late and early referrals. The objective of this article is to review the clinical and socioeconomic outcomes of unplanned dialysis initiation. The secondary objective is to explore the potential cost implications of reducing the rate of unplanned first dialysis in Canada.
Methods
MEDLINE and EMBASE from inception to 2008 were used to identify studies examining the clinical, economic or quality of life (QoL) outcomes in patients with an unplanned versus planned first dialysis. Data were described in a qualitative manner.
Results
Eight European studies (5,805 patients) were reviewed. Duration of hospitalization and mortality was higher for the unplanned versus planned population. Patients undergoing a first unplanned dialysis had significantly worse laboratory parameters and QoL. Rates of unplanned dialysis ranged from 24-49%. The total annual burden to the Canadian healthcare system of unplanned dialysis in 2005 was estimated at $33 million in direct hospital costs alone. Reducing the rate of unplanned dialysis by one-half yielded savings ranging from $13.3 to $16.1 million.
Conclusion
The clinical and socioeconomic impact of unplanned dialysis is significant. To more consistently characterize the unplanned population, the term suboptimal initiation is proposed to include dialysis initiation in hospital and/or with a central venous catheter and/or with a patient not starting on their chronic modality of choice. Further research and implementation of initiatives to reduce the rate of suboptimal initiation of dialysis in Canada are needed.
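The cost argument above lends itself to a quick back-of-envelope check. The sketch below uses only the figures quoted in the abstract; the naive halving is an upper bound, since avoided unplanned starts still incur planned-start costs, which is presumably why the reported savings of $13.3–16.1 million fall below it:

```python
# Back-of-envelope check of the cost argument in the abstract.
# The $33M annual burden and the halving target are from the abstract;
# everything else here is arithmetic, not data from the review.
total_unplanned_cost = 33_000_000   # annual direct hospital cost, CAD
reduction_fraction = 0.5            # policy target: halve unplanned starts

# Naive upper bound on savings; real savings are lower because patients
# diverted to a planned start still generate (smaller) costs.
naive_savings = total_unplanned_cost * reduction_fraction
print(f"naive upper bound: ${naive_savings / 1e6:.1f}M")  # $16.5M
```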
doi:10.1186/1471-2369-10-22
PMCID: PMC2735745  PMID: 19674452
11.  Duration of temporary catheter use for hemodialysis: an observational, prospective evaluation of renal units in Brazil 
BMC Nephrology  2011;12:63.
Background
For chronic hemodialysis, the ideal permanent vascular access is the arteriovenous fistula (AVF). Temporary catheters should be reserved for acute dialysis needs. The AVF is associated with lower infection rates, better clinical results, and a higher quality of life and survival when compared to temporary catheters. In Brazil, the proportion of patients with temporary catheters for more than 3 months from the beginning of therapy is used as an evaluation of the quality of renal units. The aim of this study is to evaluate factors associated with the time between the beginning of hemodialysis with temporary catheters and the placement of the first arteriovenous fistula in Brazil.
Methods
This is an observational, prospective, non-concurrent study using national administrative registries of all patients financed by the public health system who began renal replacement therapy (RRT) between 2000 and 2004 in Brazil. Eligible patients were incident patients undergoing hemodialysis for the first time. Patients were excluded if they: had hemodialysis reportedly started after the date of death (inconsistent database); were younger than 18 years old; had HIV; had no record of the first dialysis unit; or were dialyzed in units with fewer than twenty patients. To evaluate individual and renal-unit factors associated with the event of interest, a frailty model was used (N = 55,589).
Results
Among the 23,824 patients (42.9%) who underwent fistula placement during the study period, 18.2% kept the temporary catheter for more than three months until fistula creation. The analysis identified five statistically significant factors associated with a longer time to first fistula: older age (hazard ratio [HR] 0.99, 95% CI 0.99-1.00); hypertension and cardiovascular disease (HR 0.94, 95% CI 0.9-0.98) as the cause of chronic renal disease; residing in capital cities (HR 0.92, 95% CI 0.9-0.95); residing in certain regions of Brazil - South (HR 0.83, 95% CI 0.8-0.87), Midwest (HR 0.88, 95% CI 0.83-0.94), Northeast (HR 0.91, 95% CI 0.88-0.94), or North (HR 0.88, 95% CI 0.83-0.94); and the type of renal unit (public or private).
Conclusion
Monitoring the provision of arteriovenous fistulas in renal units could improve the care given to patients with end stage renal disease.
doi:10.1186/1471-2369-12-63
PMCID: PMC3227575  PMID: 22093280
12.  Cohort profile: Canadian study of prediction of death, dialysis and interim cardiovascular events (CanPREDDICT) 
BMC Nephrology  2013;14:121.
Background
The Canadian Study of Prediction of Death, Dialysis and Interim Cardiovascular Events (CanPREDDICT) is a large, prospective, pan-Canadian, cohort study designed to improve our understanding of determinants of renal and cardiovascular (CV) disease progression in patients with chronic kidney disease (CKD). The primary objective is to clarify the associations between traditional and newer biomarkers in the prediction of specific renal and CV events, and of death in patients with CKD managed by nephrologists. This information could then be used to better understand biological variation in outcomes, to develop clinical prediction models and to inform enrolment into interventional studies which may lead to novel treatments.
Methods/Designs
Commenced in 2008, the study has enrolled 2546 patients with eGFR between 15 and 45 mL/min/1.73 m2 from a representative sample of 25 rural, urban, academic and non-academic centres across Canada. Patients are to be followed for an initial 3 years at 6-month intervals, and annually thereafter. Traditional biomarkers include eGFR, urine albumin:creatinine ratio (uACR), hemoglobin (Hgb), phosphate and albumin. Newer biomarkers of interest were selected on the basis of biological relevance to important processes, commercial availability and assay reproducibility. They include asymmetric dimethylarginine (ADMA), N-terminal pro-brain natriuretic peptide (NT-proBNP), troponin I, cystatin C, high-sensitivity C-reactive protein (hsCRP), interleukin-6 (IL-6) and transforming growth factor beta 1 (TGFβ1). Blood and urine samples are collected at baseline and every 6 months, and stored at −80°C. Outcomes of interest include renal replacement therapy, CV events and death, the latter two of which are adjudicated by an independent panel.
Discussion
The baseline distribution of newer biomarkers does not appear to track to markers of kidney function and therefore may offer some discriminatory value in predicting future outcomes. The granularity of the data presented at baseline may foster additional questions.
The value of the cohort as a unique resource to understand outcomes of patients under the care of nephrologists in a single payer healthcare system cannot be overstated. Systematic collection of demographic, laboratory and event data should lead to new insights.
The mean age of the cohort was 68 years, 90% were Caucasian, 62% were male, and 48% had diabetes. Forty percent of the cohort had eGFR between 30–45 mL/min/1.73m2, 22% had eGFR values below 20 mL/min/1.73m2; 61% had uACR < 30. Serum albumin, hemoglobin, calcium and 25-hydroxyvitamin D (25(OH)D) levels were progressively lower in the lower eGFR strata, while parathyroid hormone (PTH) levels increased. Cystatin C, ADMA, NT-proBNP, hsCRP, troponin I and IL-6 were significantly higher in the lower GFR strata, whereas 25(OH)D and TGFβ1 values were lower at lower GFR. These distributions of each of the newer biomarkers by eGFR and uACR categories were variable.
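Several of the studies above stratify patients by eGFR (e.g., 15–45 mL/min/1.73 m2 here) without stating which estimating equation was used. Purely as an illustration, the 4-variable IDMS-traceable MDRD equation is one commonly used estimator; the input values below are hypothetical:

```python
def egfr_mdrd(scr_mg_dl, age, female=False, black=False):
    """Estimate GFR (mL/min/1.73 m^2) with the 4-variable IDMS-traceable
    MDRD equation. Shown only as an example of how eGFR strata like those
    above can be derived; the cohort's actual equation is not stated."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Hypothetical patient: serum creatinine 2.5 mg/dL, age 68, male.
print(round(egfr_mdrd(2.5, 68), 1))  # ~25.8, inside the 15-45 band
```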
doi:10.1186/1471-2369-14-121
PMCID: PMC3691726  PMID: 23758910
Chronic kidney disease; Biomarkers; Observational cohort study; Outcomes; Progression; CV disease
13.  Barriers to successful implementation of care in home haemodialysis (BASIC-HHD):1. Study design, methods and rationale 
BMC Nephrology  2013;14:197.
Background
Ten years on from the National Institute for Health and Clinical Excellence's 2002 technology appraisal guideline on haemodialysis, the clinical community has yet to rise to the challenge of providing home haemodialysis (HHD) to 10-15% of the dialysis cohort. The renal registry report suggests underutilization of a treatment that has attracted considerable research interest and several publications worldwide on its apparent benefits for both the physical and mental health of patients. An understanding of the drivers to introducing and sustaining the modality, from organizational, economic, clinical and patient perspectives, is fundamental to realizing the full benefits of the therapy, with the potential to provide an evidence base for effective care models. Through the BASIC-HHD study, we seek to understand the clinical, patient- and carer-related psychosocial, economic and organisational determinants of successful uptake and maintenance of home haemodialysis and thereby engage all major stakeholders in the process.
Design and methods
We have adopted an integrated mixed methodology (convergent, parallel design) for this study. The study arms are (a) patient, (b) organization, (c) carer and (d) economic evaluation. The three patient cohorts (n = 500) comprise pre-dialysis patients (200), hospital haemodialysis patients (200) and home haemodialysis patients (100) from geographically distinct NHS sites across the country with variable prevalence of home haemodialysis. The pre-dialysis patients will also be followed prospectively for 12 months from study entry to understand their journey to renal replacement therapy; subsequently, before-and-after studies will be carried out for the select few who commence dialysis within the study period. The process will entail quantitative methods and ethnographic interviews of all groups in the study. Data collection will involve clinical and biomarker data, quantitative psychosocial assessments and neuropsychometric tests in patients. Organizational attitudes and dialysis unit practices will be studied, together with healthcare providers' perceptions of the provision of home HD. An economic evaluation of home and hospital haemodialysis practices will also be undertaken, and we will apply scenario ("what … if") analysis using system dynamics modeling to investigate the impact of different policy choices and financial models on dialysis technology adoption, care pathways and costs. Less attention is often given to the carers who provide informal, often complex, support to patients afflicted by chronic ailments such as end-stage kidney disease. Engaging the carers is fundamental to realizing the full benefits of a complex, home-based intervention, and a qualitative study of the carers will be undertaken to elicit their fears, concerns and perceptions of home HD before and after the patient's commencement of the treatment.
The data sets will be analysed independently and the findings will be mixed at the stage of interpretation to form a coherent message that will be informing practice in the future.
Discussion
The BASIC-HHD study is designed to assemble pivotal information on dialysis modality choice and uptake, investigating users, care-givers and care delivery processes and study their variation in a multi-layered analytical approach within a single health care system. The study results would define modality specific service and patient pathway redesign.
Study Registration
This study has been reviewed and approved by the Greater Manchester West Health Research Authority National Research Ethics Service (NRES). The study is on the NIHR (CLRN) portfolio.
doi:10.1186/1471-2369-14-197
PMCID: PMC3851985  PMID: 24044499
Barriers; Home haemodialysis; Mixed methods; Qualitative; Organisation; Adoption; Quality of life
14.  Advanced Electrophysiologic Mapping Systems 
Executive Summary
Objective
To assess the effectiveness, cost-effectiveness, and demand in Ontario for catheter ablation of complex arrhythmias guided by advanced nonfluoroscopy mapping systems. Particular attention was paid to ablation for atrial fibrillation (AF).
Clinical Need
Tachycardia
Tachycardia refers to a diverse group of arrhythmias characterized by heart rates that are greater than 100 beats per minute. It results from abnormal firing of electrical impulses from heart tissues or abnormal electrical pathways in the heart because of scars. Tachycardia may be asymptomatic, or it may adversely affect quality of life owing to symptoms such as palpitations, headaches, shortness of breath, weakness, dizziness, and syncope. Atrial fibrillation, the most common sustained arrhythmia, affects about 99,000 people in Ontario. It is associated with higher morbidity and mortality because of increased risk of stroke, embolism, and congestive heart failure. In atrial fibrillation, most of the abnormal arrhythmogenic foci are located inside the pulmonary veins, although the atrium may also be responsible for triggering or perpetuating atrial fibrillation. Ventricular tachycardia, often found in patients with ischemic heart disease and a history of myocardial infarction, is often life-threatening; it accounts for about 50% of sudden deaths.
Treatment of Tachycardia
The first line of treatment for tachycardia is antiarrhythmic drugs; for atrial fibrillation, anticoagulation drugs are also used to prevent stroke. For patients refractory to or unable to tolerate antiarrhythmic drugs, ablation of the arrhythmogenic heart tissues is the only option. Surgical ablation such as the Cox-Maze procedure is more invasive. Catheter ablation, involving the delivery of energy (most commonly radiofrequency) via a percutaneous catheter system guided by X-ray fluoroscopy, has been used in place of surgical ablation for many patients. However, this conventional approach in catheter ablation has not been found to be effective for the treatment of complex arrhythmias such as chronic atrial fibrillation or ventricular tachycardia. Advanced nonfluoroscopic mapping systems have been developed for guiding the ablation of these complex arrhythmias.
The Technology
Four nonfluoroscopic advanced mapping systems have been licensed by Health Canada:
CARTO EP mapping System (manufactured by Biosense Webster, CA) uses weak magnetic fields and a special mapping/ablation catheter with a magnetic sensor to locate the catheter and reconstruct a 3-dimensional geometry of the heart superimposed with colour-coded electric potential maps to guide ablation.
EnSite System (manufactured by Endocardial Solutions Inc., MN) includes a multi-electrode non-contact catheter that conducts simultaneous mapping. A processing unit uses the electrical data to compute more than 3,000 isopotential electrograms that are displayed on a reconstructed 3-dimensional geometry of the heart chamber. The navigational system, EnSite NavX, can be used separately with most mapping catheters.
The LocaLisa Intracardiac System (manufactured by Medtronics Inc, MN) is a navigational system that uses an electrical field to locate the mapping catheter. It reconstructs the location of the electrodes on the mapping catheter in 3-dimensional virtual space, thereby enabling an ablation catheter to be directed to the electrode that identifies abnormal electric potential.
Polar Constellation Advanced Mapping Catheter System (manufactured by Boston Scientific, MA) is a multielectrode basket catheter with 64 electrodes on 8 splines. Once deployed, each electrode is automatically traced. The information enables a 3-dimensional model of the basket catheter to be computed. Colour-coded activation maps are reconstructed online and displayed on a monitor. By using this catheter, a precise electrical map of the atrium can be obtained in several heartbeats.
Review Strategy
A systematic search of Cochrane, MEDLINE and EMBASE was conducted to identify studies that compared ablation guided by any of the advanced systems to fluoroscopy-guided ablation of tachycardia. English-language studies with sample sizes greater than or equal to 20 that were published between 2000 and 2005 were included. Observational studies on safety of advanced mapping systems and fluoroscopy were also included. Outcomes of interest were acute success, defined as termination of arrhythmia immediately following ablation; long-term success, defined as being arrhythmia free at follow-up; total procedure time; fluoroscopy time; radiation dose; number of radiofrequency pulses; complications; cost; and the cost-effectiveness ratio.
Quality of the individual studies was assessed using established criteria. Quality of the overall evidence was determined by applying the GRADE evaluation system. (3) Qualitative synthesis of the data was performed. Quantitative analysis using Revman 4.2 was performed when appropriate.
Quality of the Studies
Thirty-four studies met the inclusion criteria. These comprised 18 studies on CARTO (4 randomized controlled trials [RCTs] and 14 non-RCTs), 3 RCTs on EnSite NavX, 4 studies on LocaLisa Navigational System (1 RCT and 3 non-RCTs), 2 studies on EnSite and CARTO, 1 on Polar Constellation basket catheter, and 7 studies on radiation safety.
The quality of the studies ranged from moderate to low. Most of the studies had small sample sizes with selection bias, and there was no blinding of patients or care providers in any of the studies. Duration of follow-up ranged from 6 weeks to 29 months, with most having at least 6 months of follow-up. There was heterogeneity with respect to the approach to ablation, definition of success, and drug management before and after the ablation procedure.
Summary of Findings
Evidence is based on a small number of small RCTs and non-RCTs with methodological flaws.
Advanced nonfluoroscopy mapping/navigation systems provided real-time 3-dimensional images with integration of anatomic and electrical potential information that enable better visualization of areas of interest for ablation.
Advanced nonfluoroscopy mapping/navigation systems appear to be safe; they consistently shortened the fluoroscopy duration and radiation exposure.
Evidence suggests that nonfluoroscopy mapping and navigation systems may be used as adjuncts to rather than replacements for fluoroscopy in guiding the ablation of complex arrhythmias.
Most studies showed a nonsignificant trend toward lower overall failure rate for advanced mapping-guided ablation compared with fluoroscopy-guided mapping.
Pooled analyses of small RCTs and non-RCTs that compared fluoroscopy- with nonfluoroscopy-guided ablation of atrial fibrillation and atrial flutter showed that advanced nonfluoroscopy mapping and navigational systems:
Yielded acute success rates of 69% to 100%, not significantly different from fluoroscopy ablation.
Had overall failure rates at 3 months to 19 months of 1% to 40% (median 25%).
Resulted in a 10% relative reduction in overall failure rate for advanced mapping guided-ablation compared to fluoroscopy guided ablation for the treatment of atrial fibrillation.
Yielded added benefit over fluoroscopy in guiding the ablation of complex arrhythmia. The advanced systems were shown to reduce the arrhythmia burden and the need for antiarrhythmic drugs in patients with complex arrhythmia who had failed fluoroscopy-guided ablation.
Based on predominantly observational studies, circumferential PV ablation guided by a nonfluoroscopy system was shown to do the following:
Result in freedom from atrial fibrillation (with or without antiarrhythmic drug) in 75% to 95% of patients (median 79%). This effect was maintained up to 28 months.
Result in freedom from atrial fibrillation without antiarrhythmic drugs in 47% to 95% of patients (median 63%).
Improve patient survival at 28 months after the procedure as compared with drug therapy.
Require special skills; patient outcomes are operator dependent, and there is a significant learning curve effect.
Complication rates of pulmonary vein ablation guided by an advanced mapping/navigation system ranged from 0% to 10% with a median of 6% during a follow-up period of 6 months to 29 months.
The complication rate of the study with the longest follow-up was 8%.
The most common complications of advanced catheter-guided ablation were stroke, transient ischemic attack, cardiac tamponade, myocardial infarction, atrial flutter, congestive heart failure, and pulmonary vein stenosis. A small number of cases with fatal atrial-esophageal fistula had been reported and were attributed to the high radiofrequency energy used rather than to the advanced mapping systems.
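The pooled analyses referred to above combine study-level effect estimates. As a generic sketch of that technique (not a reanalysis of this review's data; the event counts below are invented), a fixed-effect inverse-variance pooling of risk ratios can be written as:

```python
import math

def pooled_risk_ratio(studies):
    """Fixed-effect inverse-variance pooling of risk ratios.

    `studies` is a list of (events_tx, n_tx, events_ctrl, n_ctrl) tuples.
    Each study's log risk ratio is weighted by the inverse of its
    approximate variance, then the weighted mean is exponentiated.
    """
    num = den = 0.0
    for a, n1, c, n2 in studies:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1 / a - 1 / n1 + 1 / c - 1 / n2  # approx. variance of log RR
        weight = 1.0 / var
        num += weight * log_rr
        den += weight
    return math.exp(num / den)

# Two invented studies: (failures_mapping, n_mapping, failures_fluoro, n_fluoro)
print(round(pooled_risk_ratio([(20, 100, 25, 100), (15, 120, 22, 118)]), 2))  # 0.74
```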
Economic Analysis
An Ontario-based economic analysis suggests that the cumulative incremental upfront costs of catheter ablation of atrial fibrillation guided by advanced nonfluoroscopy mapping could be recouped in 4.7 years through cost avoidance arising from less need for antiarrhythmic drugs and fewer hospitalizations for stroke and heart failure.
Expert Opinion
Expert consultants to the Medical Advisory Secretariat noted the following:
Nonfluoroscopy mapping is not necessary for simple ablation procedures (e.g., typical flutter). However, it is essential in the ablation of complex arrhythmias including these:
Symptomatic, drug-refractory atrial fibrillation
Arrhythmias in people who have had surgery for congenital heart disease (e.g., macro re-entrant tachycardia).
Ventricular tachycardia due to myocardial infarction
Atypical atrial flutter
Advanced mapping systems represent an enabling technology in the ablation of complex arrhythmias. The ablation of these complex cases would not have been feasible or advisable with fluoroscopy-guided ablation and, therefore, comparative studies would not be feasible or ethical in such cases.
Many of the studies included patients with relatively simple arrhythmias (e.g., typical atrial flutter and atrial ventricular nodal re-entrant tachycardia), for which the success rates using the fluoroscopy approach were extremely high and unlikely to be improved upon using nonfluoroscopic mapping.
By age 50, almost 100% of people who have had surgery for congenital heart disease will develop arrhythmia.
Some centres are under greater pressure because of expertise in complex ablation procedures for subsets of patients.
The use of advanced mapping systems requires the support of additional electrophysiologic laboratory time and nursing time.
Conclusions
For patients who have symptomatic, drug-refractory atrial fibrillation and are otherwise healthy, catheter ablation offers a treatment option that is less invasive than open surgical ablation.
Small RCTs that may have been limited by type 2 errors showed significant reductions in fluoroscopy exposure in nonfluoroscopy-guided ablation and a trend toward lower overall failure rate that did not reach statistical significance.
Pooled analysis suggests that advanced mapping systems may reduce the overall failure rate in the ablation of atrial fibrillation.
Observational studies suggest that ablation guided by complex mapping/navigation systems is a promising treatment for complex arrhythmias such as highly symptomatic, drug-refractory atrial fibrillation for which rate control is not an option.
In people with atrial fibrillation, ablation guided by advanced nonfluoroscopy mapping resulted in arrhythmia free rates of 80% or higher, reduced mortality, and better quality of life at experienced centres.
Although generally safe, serious complications such as stroke, atrial-esophageal fistula, and pulmonary vein stenosis have been reported following ablation procedures.
Experts advised that advanced mapping systems are also required for catheter ablation of:
Hemodynamically unstable ventricular tachycardia from ischemic heart disease
Macro re-entrant atrial tachycardia after surgical correction of congenital heart disease
Atypical atrial flutter
Catheter ablation of atrial fibrillation is still evolving, and it appears that different ablative techniques may be appropriate depending on the characteristics of the patient and the atrial fibrillation.
Data from centres that perform electrophysiological mapping suggest that patients with drug-refractory atrial fibrillation may be the largest group with unmet need for advanced mapping-guided catheter ablation in Ontario.
Nonfluoroscopy mapping-guided pulmonary vein ablation for the treatment of atrial fibrillation has a significant learning effect; therefore, it is advisable for the province to establish centres of excellence to ensure a critical volume, to gain efficiency and to minimize the need for antiarrhythmic drugs after ablation and the need for future repeat ablation procedures.
PMCID: PMC3379531  PMID: 23074499
15.  Hemodialysis in children: general practical guidelines 
Over the past 20 years children have benefited from major improvements in both technology and clinical management of dialysis. Morbidity during dialysis sessions has decreased with seizures being exceptional and hypotensive episodes rare. Pain and discomfort have been reduced with the use of chronic internal jugular venous catheters and anesthetic creams for fistula puncture. Non-invasive technologies to assess patient target dry weight and access flow can significantly reduce patient morbidity and health care costs. The development of urea kinetic modeling enables calculation of the dialysis dose delivery, Kt/V, and an indirect assessment of the intake. Nutritional assessment and support are of major importance for the growing child. Even if the validity of these “urea only” data is questioned, their analysis provides information useful for follow-up. Newer machines provide more precise control of ultrafiltration by volumetric assessment and continuous blood volume monitoring during dialysis sessions. Buffered bicarbonate solutions are now standard and more biocompatible synthetic membranes and specific small size material dialyzers and tubing have been developed for young infants. More recently, the concept of “ultrapure” dialysate, i.e. free from microbiological contamination and endotoxins, has developed. This will enable the use of hemodiafiltration, especially with the on-line option, which has many theoretical advantages and should be considered in the case of maximum/optimum dialysis need. Although the optimum dialysis dose requirement for children remains uncertain, reports of longer duration and/or daily dialysis show they are more effective for phosphate control than conventional hemodialysis and should be considered at least for some high-risk patients with cardiovascular impairment. In children hemodialysis has to be individualized and viewed as an “integrated therapy” considering their long-term exposure to chronic renal failure treatment. 
Dialysis is seen only as a temporary measure for children compared with renal transplantation because this enables the best chance of rehabilitation in terms of educational and psychosocial functioning. In long term chronic dialysis, however, the highest standards should be applied to these children to preserve their future “cardiovascular life” which might include more dialysis time and on-line hemodiafiltration with synthetic high flux membranes if we are able to improve on the rather restricted concept of small-solute urea dialysis clearance.
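The Kt/V dose measure discussed above is commonly estimated from pre- and post-dialysis urea values. A minimal sketch using the second-generation Daugirdas single-pool formula follows; the example values are illustrative, not data from the article:

```python
import math

def sp_ktv(pre_bun, post_bun, hours, uf_litres, weight_kg):
    """Single-pool Kt/V via the second-generation Daugirdas formula:

        Kt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF / W

    where R is the post/pre urea ratio, t the session length in hours,
    UF the ultrafiltration volume in litres and W the post-dialysis
    weight in kg.
    """
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_litres / weight_kg

# Hypothetical paediatric session: BUN 70 -> 25 mg/dL over 4 h,
# 2 L ultrafiltration, 40 kg post-dialysis weight.
print(round(sp_ktv(70, 25, hours=4, uf_litres=2.0, weight_kg=40), 2))  # 1.26
```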
doi:10.1007/s00467-005-1876-y
PMCID: PMC1766474  PMID: 15947992
Hemodialysis; Children; Guidelines
16.  Undercarboxylated osteocalcin as a biomarker of subclinical atherosclerosis in non-dialysis patients with chronic kidney disease 
Background
Studies in recent years have shown that undercarboxylated osteocalcin (uOC) not only maintains bone mineralization, but is also involved in the regulation of atherosclerosis. However, a correlation between uOC and carotid atherosclerosis in non-dialysis patients with chronic kidney disease (CKD) has not been investigated.
Methods
A total of 240 non-dialysis patients with CKD were included in the study. For these patients, the median estimated glomerular filtration rate (eGFR) was 20.05 (12.43–49.32) ml/min/1.73m2. Serum uOC levels were measured using enzyme-linked immunosorbent assay (ELISA). Carotid ultrasonography was performed to assess carotid atherosclerotic plaques and intima–media thickness (IMT) in an attempt to analyze the relationship between uOC level and carotid atherosclerosis.
Results
The uOC levels of non-dialysis patients with CKD were significantly lower than those of healthy controls [28.16 (21.40–45.85) ng/mL vs. 36.42 (28.05–49.28) ng/mL, P < 0.01]. The uOC levels gradually decreased as CKD progressed (P < 0.01). The uOC levels were significantly lower in patients with carotid plaques than in patients without carotid plaques [25.98 (20.14–31.35) ng/mL vs. 31.02 (25.86–36.40) ng/mL, P < 0.01]. uOC level showed a significant negative correlation with IMT (r = -0.33, P < 0.01). Logistic regression analysis revealed that, after adjustment for various confounding factors, decreased uOC levels indicated an increased likelihood of carotid atherosclerotic plaque development in non-dialysis patients with CKD (for every 1-SD decrease in the uOC level, odds ratio 1.70, 95% confidence interval 1.24–2.98, P < 0.01). Multivariate stepwise regression analysis demonstrated that decreased uOC level (β = -0.163, P < 0.05) was an independent risk factor for increased carotid IMT in non-dialysis patients with CKD.
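The per-SD odds ratio reported above scales multiplicatively across SD units, because logistic-regression effects are additive on the log-odds scale. A short sketch of that generic property (not a reanalysis of the study's data):

```python
import math

# The abstract reports OR 1.70 per 1-SD decrease in uOC. The logistic
# coefficient is the log of that OR, so a k-SD decrease corresponds to
# an odds ratio of 1.70 ** k.
or_per_sd = 1.70
beta = math.log(or_per_sd)       # coefficient per SD on the log-odds scale
or_two_sd = math.exp(2 * beta)   # implied OR for a 2-SD decrease
print(round(or_two_sd, 2))       # 2.89 (= 1.70 squared)
```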
Conclusion
Serum uOC levels in non-dialysis patients with CKD are significantly lower than those in healthy individuals, and uOC is closely associated with subclinical atherosclerosis in CKD patients.
doi:10.1186/s12929-015-0183-6
PMCID: PMC4573290  PMID: 26381729
Osteocalcin; Atherosclerosis; Chronic kidney disease; Biomarker
17.  Rationale and design of the HEALTHY-CATH trial: A randomised controlled trial of Heparin versus EthAnol Lock THerapY for the prevention of Catheter Associated infecTion in Haemodialysis patients 
BMC Nephrology  2009;10:23.
Background
Catheter-related bacteraemias (CRBs) contribute significantly to morbidity, mortality and health care costs in dialysis populations. Despite international guidelines recommending avoidance of catheters for haemodialysis access, hospital admissions for CRBs have doubled in the last decade. The primary aim of the study is to determine whether weekly instillation of 70% ethanol prevents CRBs compared with standard heparin saline.
Methods/design
The study will follow a prospective, open-label, randomized controlled design. Inclusion criteria are adult patients with incident or prevalent tunneled intravenous dialysis catheters on three times weekly haemodialysis, with no current evidence of catheter infection and no personal, cultural or religious objection to ethanol use, who are on adequate contraception and are able to give informed consent. Patients will be randomized 1:1 to receive 3 mL of intravenous-grade 70% ethanol into each lumen of the catheter once a week and standard heparin locks for other dialysis days, or to receive heparin locks only. The primary outcome measure will be time to the first episode of CRB, which will be defined using standard objective criteria. Secondary outcomes will include adverse reactions, incidence of CRB caused by different pathogens, time to infection-related catheter removal, time to exit site infections and costs. Prospective power calculations indicate that the study will have 80% statistical power to detect a clinically significant increase in median infection-free survival from 200 days to 400 days if 56 patients are recruited into each arm.
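The abstract does not state how its power calculation was performed; one standard route to such a figure is Schoenfeld's approximation for the number of events required under a proportional-hazards model, sketched here under the assumption of exponential survival (so that doubling the median from 200 to 400 days corresponds to a hazard ratio of 0.5):

```python
import math
from statistics import NormalDist

def required_events(hr, alpha=0.05, power=0.80):
    """Schoenfeld's approximation for the total number of events needed
    to detect hazard ratio `hr` with a two-sided log-rank test and
    equal (1:1) allocation: d = 4 * (z_a + z_b)^2 / ln(hr)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return 4 * (z_a + z_b) ** 2 / math.log(hr) ** 2

# Median infection-free survival 200 -> 400 days (exponential) => HR 0.5.
print(round(required_events(0.5)))  # 65
```

Roughly 65 catheter-related bacteraemia events would be needed; enrolling 56 patients per arm then depends on the expected event rate and follow-up, details the abstract does not give.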
Discussion
This investigator-initiated study has been designed to provide evidence to help nephrologists reduce the incidence of CRBs in haemodialysis patients with tunnelled intravenous catheters.
Trial Registration
Australian New Zealand Clinical Trials Registry Number: ACTRN12609000493246
doi:10.1186/1471-2369-10-23
PMCID: PMC2738669  PMID: 19691852
18.  Dialysis-associated peritonitis in children 
Peritonitis remains a frequent complication of peritoneal dialysis in children and is the most common reason for technique failure. The microbiology is characterized by a predominance of Gram-positive organisms, with fungi responsible for less than 5% of episodes. Data collected by the International Pediatric Peritonitis Registry have revealed a worldwide variation in the bacterial etiology of peritonitis, as well as in the rate of culture-negative peritonitis. Risk factors for infection include young age, the absence of prophylactic antibiotics at catheter placement, spiking of dialysis bags, and the presence of a catheter exit-site or tunnel infection. Clinical symptoms at presentation are somewhat organism specific and can be objectively assessed with a Disease Severity Score. Whereas recommendations for empiric antibiotic therapy in children have been published by the International Society of Peritoneal Dialysis, epidemiologic data and antibiotic susceptibility data suggest that it may be desirable to take the patient- and center-specific history of microorganisms and their sensitivity patterns into account when prescribing initial therapy. The vast majority of patients are treated successfully and continue peritoneal dialysis, with the poorest outcome noted in patients with peritonitis secondary to Gram-negative organisms or fungi and in those with a relapsing infection.
doi:10.1007/s00467-008-1113-6
PMCID: PMC2810362  PMID: 19190935
Antibiotics; Children; Infection; Peritonitis; Peritoneal dialysis
19.  Geographic and facility variation in initial use of non-tunneled catheters for incident maintenance hemodialysis patients 
BMC Nephrology  2016;17:20.
Background
Non-tunneled (temporary) hemodialysis catheters (NTHCs) are the least-optimal initial vascular access for incident maintenance hemodialysis patients, yet little is known about factors associated with NTHC use in this context. We sought to determine factors associated with NTHC use and examine regional and facility-level variation in NTHC use for incident maintenance hemodialysis patients.
Methods
We analyzed registry data collected between January 2001 and December 2010 from 61 dialysis facilities within 12 geographic regions in Canada. Multi-level models and intra-class correlation coefficients were used to evaluate variation in NTHC use as initial hemodialysis access across facilities and geographic regions. Facility and patient characteristics associated with the lowest and highest quartiles of NTHC use were compared.
Results
During the study period, 21,052 patients initiated maintenance hemodialysis using a central venous catheter (CVC). This included 10,183 patients (48.3 %) in whom the initial CVC was a NTHC, as opposed to a tunneled CVC. Crude variation in NTHC use across facilities ranged from 3.7 to 99.4 % and across geographic regions from 32.4 to 85.1 %. In an adjusted multi-level logistic regression model, the proportion of total variation in NTHC use explained by facility-level and regional variation was 40.0 % and 34.1 %, respectively. Similar results were observed for the subgroup of patients who received greater than 12 months of pre-dialysis nephrology care. Patient-level factors associated with increased NTHC use were male gender, history of angina, pulmonary edema, COPD, hypertension, increasing distance from dialysis facility, higher serum phosphate, lower serum albumin and later calendar year.
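For context on how the 40.0% (facility) and 34.1% (region) figures arise: in a multilevel logistic model, level-specific shares of variation are commonly computed as variance partition coefficients on the latent scale, with the patient-level residual fixed at π²/3. A minimal sketch; the variance components below are back-solved from the reported proportions for illustration and are not taken from the study:

```python
import math

def variance_partition(var_region, var_facility):
    """Share of total latent-scale variation attributable to the facility and
    region levels of a multilevel logistic model (latent-variable method)."""
    var_resid = math.pi ** 2 / 3  # patient-level residual on the logistic scale
    total = var_region + var_facility + var_resid
    return var_facility / total, var_region / total

# Illustrative components chosen to reproduce the reported shares
vpc_facility, vpc_region = variance_partition(var_region=4.33, var_facility=5.08)
print(f"facility {vpc_facility:.1%}, region {vpc_region:.1%}")  # facility 40.0%, region 34.1%
```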
Conclusions
There is wide variation in NTHC use as initial vascular access for incident maintenance hemodialysis patients across facilities and geographic regions in Canada. Identifying modifiable factors that explain this variation could facilitate a reduction of NTHC use in favor of more optimal initial vascular access.
doi:10.1186/s12882-016-0236-4
PMCID: PMC4769546  PMID: 26920700
Vascular access; Hemodialysis; Temporary hemodialysis catheters; Epidemiology; Practice variation
20.  Laparoscopic Placement and Revision of Peritoneal Dialysis Catheters 
Chronic peritoneal dialysis is an option for many patients with end stage renal disease. Laparoscopy offers an alternative approach in the management of dialysis patients. Over an 18-month period, laparoscopy was used for placement or revision of seven peritoneal dialysis catheters. All were placed in patients with end stage renal disease for chronic dialysis. Two catheters were initially placed using the laparoscope, and in five other patients, the position of the catheter was revised. Of the two patients who had their catheters placed initially, one patient had a previous lower midline incision and underwent laparoscopic placement of a catheter and lysis of pelvic adhesions. The second patient had hepatitis C and chronically elevated liver function tests. He underwent laparoscopic placement of a peritoneal dialysis catheter and liver biopsy. Five patients had laparoscopic revision for non-functional catheters. Four were found to have omental adhesions surrounding the catheter. Three patients were found to have a fibrin clot within the catheter, and in one patient the small bowel was adherent to the catheter. All seven patients had general endotracheal anesthesia. There were no operative or anesthetic complications. The average operative time was 56 minutes. Four patients had their procedure in an ambulatory setting and were discharged home the same day. One patient was admitted for 23-hour observation, and two patients had their procedure while in the hospital for other reasons. In follow-up, there was one early failure at two weeks, which required removal of the catheter for infection. One catheter was removed at the time of a combined kidney/pancreas transplant eight months after revision. The other five catheters are still functional with an average follow-up of ten months.
These results suggest that laparoscopy is another method for placement of peritoneal dialysis catheters and more importantly for revision in patients with nonfunctional catheters secondary to adhesions. It also provides an opportunity to evaluate the abdomen and perform concomitant procedures.
PMCID: PMC3015335  PMID: 10323172
Laparoscopy; Dialysis catheter; Renal Disease
21.  Epidemiology of haemodialysis catheter complications: a survey of 865 dialysis patients from 14 haemodialysis centres in Henan province in China 
BMJ Open  2015;5(11):e007136.
Objectives
To investigate the incidence rates and risk factors for catheter-related complications in different districts and populations in Henan Province in China.
Design
Cross-sectional.
Setting
Fourteen hospitals in Henan Province.
Participants
865 patients with renal dysfunction undergoing dialysis using catheters between October 2013 and October 2014.
Main outcome measures
The main outcome measures were complications, risk factors and patient characteristics. Catheter-related complications included catheter-related infection (catheter exit-site infection, catheter tunnel infection and catheter-related bloodstream infection), catheter dysfunction (thrombosis, catheter malposition or kinking, and fibrin shell formation) and central vein stenosis.
Results
The overall incidence rates were 7.74/1000 catheter-days for catheter infections (affecting 38.61% of all patients), 10.58/1000 catheter-days for catheter dysfunction (56.65%), and 0.68/1000 catheter-days for central vein stenosis (8.79%). Multivariate analysis showed that increased age, diabetes, primary educational level or below, rural residence, lack of a nephropathy visit before dialysis and pre-established permanent vascular access, not taking oral drugs to prevent catheter thrombus, lower serum albumin levels and higher ferritin levels were independently associated with catheter infections. Rural residence, not taking oral drugs to prevent thrombus, lack of an imaging examination after catheter insertion, non-tunnel catheter type, lack of medical insurance, lack of nephropathy visit before dialysis and pre-established permanent vascular access, left-sided catheter position, access via the femoral vein and lower haemoglobin level were independently associated with catheter dysfunction. Diabetes, lack of nephropathy visit before dialysis and pre-established permanent vascular access, lack of oral drugs to prevent catheter thrombus, left-sided catheter location and higher number of catheter insertions, were independently associated with central vein stenosis.
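The per-1000 catheter-day rates are events divided by cumulative catheter-days, scaled by 1000. The abstract does not report the total catheter-days, so the numbers below are hypothetical, chosen only to illustrate the unit:

```python
def rate_per_1000_catheter_days(events, catheter_days):
    """Incidence rate expressed per 1000 catheter-days."""
    return 1000 * events / catheter_days

# Hypothetical example: 100 infections over 12,920 catheter-days
print(round(rate_per_1000_catheter_days(100, 12_920), 2))  # 7.74
```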
Conclusions
The rate of catheter-related complications was high in patients with end-stage renal disease in Henan Province. Our findings suggest that strategies should be implemented to decrease complication rates.
doi:10.1136/bmjopen-2014-007136
PMCID: PMC4663418  PMID: 26589425
22.  Tenckhoff Peritoneal Dialysis Catheter Insertion in a Northern Ireland District General Hospital 
The Ulster Medical Journal  2015;84(3):166-170.
Introduction
Chronic kidney disease (CKD) affects approximately 5% of the population. Based on 2014 data, peritoneal dialysis (PD) is underutilised in Northern Ireland with a prevalence of only 11% in patients requiring renal replacement therapy (RRT). Recent National Institute for Health and Care Excellence (NICE) guidelines aim to increase the rate of PD utilisation to 39% amongst patients requiring RRT. In order to implement these guidelines, nephrologists must have access to a reliable, effective PD catheter insertion service. The aim of this study was to assess the outcomes of PD catheter insertions and rates of incident PD use in a single centre in anticipation of a potential increased uptake.
Methods
A retrospective analysis was conducted of all patients who underwent PD catheter insertion between April 2003 and October 2011. Case notes were reviewed for demographic information, complications, need for re-intervention, and primary catheter patency at 12 months. The UK Renal Registry annual reports were also reviewed for data on annual uptake of PD in our institution.
Results
Fifty-four patients underwent PD catheter insertion between 2005 and 2011; 61% were male with a median age of 58 (range 21-82) years. Early complications (≤30 days) included bowel perforation (n=1) and wound infection (n=2). During this study period 17 (31%) patients required manipulation or reinsertion for catheter obstruction/migration. The primary catheter patency at 12 months was 76%. The average uptake of PD as the first treatment modality (incident use) was 21.3% compared to a Northern Ireland (NI) average of 12.4%.
Conclusion
Complication rates were comparable to the International Society of Peritoneal Dialysis (ISPD) guidelines in this case series and PD uptake was higher than the NI average. Therefore, local provision of an expert surgical PD catheter insertion service may potentially facilitate an increased uptake of this modality amongst RRT patients but further research is warranted.
PMCID: PMC4642258  PMID: 26668419
23.  A Rare Case of Aeromonas Hydrophila Catheter Related Sepsis in a Patient with Chronic Kidney Disease Receiving Steroids and Dialysis: A Case Report and Review of Aeromonas Infections in Chronic Kidney Disease Patients 
Case reports in nephrology  2013;2013:735194.
Aeromonas hydrophila (AH) is an aquatic bacterium. We present the case of a fifty-five-year-old gentleman with chronic kidney disease (CKD) due to crescentic IgA nephropathy who presented to us with fever. He had recently been pulsed with methylprednisolone followed by oral prednisolone and discharged on maintenance dialysis through a double-lumen dialysis catheter. Blood cultures from a peripheral vein and the double-lumen dialysis catheter grew AH. We speculate that low immunity due to steroids and uremia, along with touch contamination of the dialysis catheter by the patient or dialysis nurse, could have led to this rare infection. Dialysis catheter-related infection by AH is rare. We present our case here and take the opportunity to give a brief review of AH infections in CKD patients.
doi:10.1155/2013/735194
PMCID: PMC3914193  PMID: 24558624
24.  A Crossover Intervention Trial Evaluating the Efficacy of a Chlorhexidine-Impregnated Sponge (BIOPATCH®) to Reduce Catheter-Related Bloodstream Infections in Hemodialysis Patients 
Background
Catheter-related bloodstream infections (BSI) account for the majority of hemodialysis-related infections. There are no published data on the efficacy of the chlorhexidine-impregnated foam dressing at reducing catheter-related BSI in hemodialysis patients.
Design
Prospective non-blinded cross-over intervention trial to determine the efficacy of a chlorhexidine-impregnated foam dressing (Biopatch®) to reduce catheter-related BSI in hemodialysis patients.
Setting
Two outpatient dialysis centers
Patients
A total of 121 patients who were dialyzed through tunneled central venous catheters received the intervention during the trial.
Methods
The primary outcome of interest was the incidence of catheter-related bloodstream infections. A nested cohort study of all patients who received the Biopatch® Antimicrobial Dressing was also conducted. Backward stepwise logistic regression analysis was used to determine independent risk factors for development of BSI.
Results
Thirty-seven bloodstream infections occurred in the intervention group (6.3 BSIs/1000 dialysis sessions) and 30 in the control group (5.2 BSIs/1000 dialysis sessions) [RR 1.22, CI (0.76, 1.97); P=0.46]. The Biopatch® Antimicrobial Dressing was well tolerated, with only two patients (<2%) experiencing dermatitis that led to its discontinuation. The only independent risk factor for development of BSI was dialysis treatment at one dialysis center [aOR 4.4 (1.77, 13.65); P=0.002]. Age ≥ 60 years was associated with lower risk of BSI [aOR 0.28 (0.09, 0.82); P=0.02].
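The reported effect estimate can be approximately reproduced from the event counts and rates alone, using the usual log-normal approximation for a rate ratio (our reconstruction; the paper's exact method is not stated, and the rounded published rates explain the small discrepancies):

```python
import math

def rate_ratio_ci(events_a, rate_a, events_b, rate_b, z=1.959964):
    """Rate ratio with a 95% CI via the log-normal approximation,
    using variance 1/events_a + 1/events_b on the log scale."""
    rr = rate_a / rate_b
    se = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Intervention: 37 BSIs at 6.3/1000 sessions; control: 30 BSIs at 5.2/1000
rr, lo, hi = rate_ratio_ci(37, 6.3, 30, 5.2)
print(f"RR {rr:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # RR 1.21 (95% CI 0.75, 1.96)
```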
Conclusion
The use of a chlorhexidine-impregnated foam dressing (Biopatch®) did not decrease catheter-related BSIs among hemodialysis patients with tunneled central venous catheters.
doi:10.1086/657075
PMCID: PMC3077924  PMID: 20879855
chlorhexidine-impregnated dressing; hemodialysis; bloodstream infection; tunneled catheter
25.  Tunneled central venous catheters: Experience from a single center 
Indian Journal of Nephrology  2011;21(2):107-111.
In the past, vascular surgeons were called in to place tunneled central venous catheters (TVCs) for hemodialysis patients. The advent of the percutaneous technique has resulted in an increasing number of interventional nephrologists inserting them. A single-centre three-year audit of 100 TVCs with a cumulative follow-up of 492 patient-months is presented here. From 2007 to 2010, 100 TVCs were placed percutaneously by nephrologists in the operating room or the interventional nephrology suite. Those who completed a minimum of three months on the catheter were included in the analysis. There were 69 males and 31 females with a mean age of 52.3±13.6 years (range: 25-76). Chronic glomerulonephritis was the commonest cause of CKD (45%), followed by diabetes (39%). The right internal jugular vein was the preferred site (94%). A TVC was utilized as the primary access to initiate dialysis in 25% of patients in whom a live donor was available for renal transplantation. The blood flow was 250-270 ml/min. Kaplan-Meier analysis showed that the 3-month and 6-month catheter survival rates were 80% and 55%, respectively. The main complications were exit-site blood ooze, catheter block and kink. The catheter-related bacteremia rate was low at 0.4/1000 patient-days. The primary cause of drop-out was patient death unrelated to the TVC. Those under the age of 40 years showed better survival, but gender, catheter site and etiology of CKD had no bearing on survival. Tunneled central venous catheters could find a niche as the primary access of choice for pre-transplant live-donor renal transplant candidates in view of their immediate usability, high blood flows, low infection rates and adequate patency for 3-6 months.
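The quoted bacteremia rate can be reconciled with the cumulative follow-up by converting patient-months to patient-days, assuming an average month of 365.25/12 ≈ 30.44 days (our assumption; the paper reports follow-up in patient-months only):

```python
# Cross-check of the reported bacteremia rate against total follow-up.
DAYS_PER_MONTH = 365.25 / 12                 # assumed average month length
patient_days = 492 * DAYS_PER_MONTH          # cumulative follow-up in days
expected_crbs = 0.4 / 1000 * patient_days    # at 0.4 episodes/1000 patient-days
print(round(patient_days), round(expected_crbs, 1))  # 14975 6.0
```

That is, the reported rate corresponds to roughly six catheter-related bacteremia episodes over the whole audit.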
doi:10.4103/0971-4065.82133
PMCID: PMC3132329  PMID: 21769173
Bacteremia; catheter survival; hemodialysis; tunneled central venous catheters
