1.  Long-Term Survival and Dialysis Dependency Following Acute Kidney Injury in Intensive Care: Extended Follow-up of a Randomized Controlled Trial 
PLoS Medicine  2014;11(2):e1001601.
Martin Gallagher and colleagues examine the long-term outcomes of renal replacement therapy (RRT) dosing in patients with acute kidney injury randomized to normal vs. augmented RRT.
Please see later in the article for the Editors' Summary
Background
The incidence of acute kidney injury (AKI) is increasing globally and it is much more common than end-stage kidney disease. AKI is associated with high mortality and cost of hospitalisation. Studies of treatments to reduce this high mortality have used differing renal replacement therapy (RRT) modalities and have not shown improvement in the short term. The reported long-term outcomes of AKI are variable and the effect of differing RRT modalities upon them is not clear. We used the prolonged follow-up of a large clinical trial to prospectively examine the long-term outcomes and effect of RRT dosing in patients with AKI.
Methods and Findings
We extended the follow-up of participants in the Randomised Evaluation of Normal vs. Augmented Levels of RRT (RENAL) study from 90 days to 4 years after randomization. Primary and secondary outcomes were mortality and requirement for maintenance dialysis, respectively, assessed in 1,464 (97%) patients at a median of 43.9 months (interquartile range [IQR] 30.0–48.6 months) post randomization. A total of 468/743 (63%) and 444/721 (62%) patients died in the lower and higher intensity groups, respectively (risk ratio [RR] 1.04, 95% CI 0.96–1.12, p = 0.49). Amongst survivors to day 90, 21 of 411 (5.1%) and 23 of 399 (5.8%) in the respective groups were treated with maintenance dialysis (RR 1.12, 95% CI 0.63–2.00, p = 0.69). The prevalence of albuminuria among survivors was 40% and 44%, respectively (p = 0.48). Quality of life was not different between the two treatment groups. The generalizability of these findings to other populations with AKI requires further exploration.
Conclusions
Patients with AKI requiring RRT in intensive care have high long-term mortality but few require maintenance dialysis. Long-term survivors have a heavy burden of proteinuria. Increased intensity of RRT does not reduce mortality or subsequent treatment with dialysis.
Trial registration
www.ClinicalTrials.gov NCT00221013
Editors' Summary
Background
Throughout life, the kidneys perform the essential task of filtering waste products (from the normal breakdown of tissues and from food) and excess water from the blood to make urine. Chronic kidney disease (caused, for example, by diabetes) gradually destroys the kidneys' filtration units (the nephrons), eventually leading to life-threatening end-stage kidney disease. However, the kidneys can also stop working suddenly because of injury, infection, or poisoning. Acute kidney injury (AKI) is much more common than end-stage kidney disease and its incidence is increasing worldwide. In the US, for example, the number of hospitalizations that included an AKI diagnosis rose from 4,000 in 1996 to 23,000 in 2008. Moreover, nearly half of patients with AKI will die shortly after the condition develops. Symptoms of AKI include changes in urination, swollen feet and ankles, and tiredness. Treatments for AKI aim to prevent fluid and waste build-up in the body and treat the underlying cause (e.g., severe infection or dehydration) while allowing the kidneys time to recover. In some patients, it is sufficient to limit the fluid intake and to reduce waste build-up by eating a diet that is low in protein, salt, and potassium. Other patients need renal replacement therapy (RRT), life-supporting treatments such as hemodialysis and hemofiltration, two processes that clean the blood by filtering it outside the body.
Why Was This Study Done?
The long-term outcomes of AKI (specifically, death and chronic kidney disease) and the effects of different RRT modalities on these outcomes are unclear. A recent controlled trial that randomly assigned patients with AKI who were managed in intensive care units (ICUs) to receive two different intensities of continuous hemodiafiltration (a combination of hemodialysis and hemofiltration) found no difference in all-cause mortality (death) at 90 days. Here, the researchers extend the follow-up of this trial (the Randomized Evaluation of Normal vs. Augmented Levels of renal replacement therapy [RENAL] study) to investigate longer-term mortality, the variables that predict mortality, treatment with long-term dialysis (an indicator of chronic kidney disease), and functional outcomes in patients with AKI treated with different intensities of continuous RRT.
What Did the Researchers Do and Find?
For the Prolonged Outcomes Study of RENAL (POST-RENAL), the researchers extended the follow-up of the RENAL participants up to 4 years. Over a median follow-up of 43.9 months, 63% of patients in the lower intensity treatment group died compared to 62% of patients in the higher intensity group. Overall, a third of patients who survived to 90 days died during the extended follow-up. Among the survivors to day 90, 5.1% and 5.8% of patients in the lower and higher intensity groups, respectively, were treated with maintenance dialysis during the extended follow-up. Among survivors who consented to analysis, 40% and 44% of patients in the lower and higher intensity groups, respectively, had albuminuria (protein in the urine, an indicator of kidney damage). Patients in both groups had a similar quality of life (determined through telephone interviews). Finally, increasing age, APACHE III score (a scoring system that predicts the survival of patients in ICU), and serum creatinine level (an indicator of kidney function) at randomization were all predictors of long-term mortality.
What Do These Findings Mean?
These findings indicate that patients with AKI in ICUs who require RRT have a high long-term mortality. They show that few survivors require maintenance dialysis for chronic kidney disease but that there is a substantial rate of albuminuria among survivors despite relative preservation of kidney function. The findings also suggest that the intensity of RRT has no significant effect on mortality or the need for dialysis. Because these findings were obtained in a randomized controlled trial, they may not be generalizable to other patient populations. Moreover, although data on mortality and maintenance dialysis were available for all the trial participants, clinical and biochemical outcomes were only available for some participants and may not be representative of all the participants. Despite these study limitations, these findings suggest that survivors of AKI may be at a high risk of death or of developing chronic kidney disease. Survivors of AKI are, therefore, at high risk of further illness and long-term albuminuria reduction strategies may offer a therapeutic intervention for this group of patients.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001601.
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about the kidneys and about all aspects of kidney disease and its treatment; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The Mayo Clinic provides information for patients about acute kidney injury
Wikipedia has a page on acute kidney injury (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including a link to a video about acute kidney injury
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations (IFKF), aims to raise awareness about kidneys and kidney disease; its website provides information about acute kidney injury
The MedlinePlus Encyclopedia has pages about acute kidney failure and about renal dialysis
The UK National Institute for Health and Care Excellence (NICE) recently published new guidelines on the treatment of acute kidney injury; a clinical practice guideline for acute kidney injury produced by KDIGO (a not-for-profit organization that aims to improve the care and outcomes of kidney disease patients worldwide through the development and implementation of global clinical practice guidelines) is available; the Acute Kidney Injury app provides a fast and simple way to explore guidelines on the diagnosis, prevention, and management of AKI
doi:10.1371/journal.pmed.1001601
PMCID: PMC3921111  PMID: 24523666
2.  A Systematic Review and Meta-Analysis of Utility-Based Quality of Life in Chronic Kidney Disease Treatments 
PLoS Medicine  2012;9(9):e1001307.
Melanie Wyld and colleagues examined previously published studies to assess pooled utility-based quality of life of the various treatments for chronic kidney disease. They conclude that the highest utility was for kidney transplants, with home-based automated peritoneal dialysis being second.
Background
Chronic kidney disease (CKD) is a common and costly condition to treat. Economic evaluations of health care often incorporate patient preferences for health outcomes using utilities. The objective of this study was to determine pooled utility-based quality of life (the numerical value attached to the strength of an individual's preference for a specific health outcome) by CKD treatment modality.
Methods and Findings
We conducted a systematic review, meta-analysis, and meta-regression of peer-reviewed published articles and of PhD dissertations published through 1 December 2010 that reported utility-based quality of life (utility) for adults with late-stage CKD. Studies reporting utilities by proxy (e.g., reported by a patient's doctor or family member) were excluded.
In total, 190 studies reporting 326 utilities from over 56,000 patients were analysed. There were 25 utilities from pre-treatment CKD patients, 226 from dialysis patients (haemodialysis, n = 163; peritoneal dialysis, n = 44), 66 from kidney transplant patients, and three from patients treated with non-dialytic conservative care. Using time tradeoff as a referent instrument, kidney transplant recipients had a mean utility of 0.82 (95% CI: 0.74, 0.90). The mean utility was comparable in pre-treatment CKD patients (difference = −0.02; 95% CI: −0.09, 0.04), 0.11 lower in dialysis patients (95% CI: −0.15, −0.08), and 0.2 lower in conservative care patients (95% CI: −0.38, −0.01). Patients treated with automated peritoneal dialysis had a significantly higher mean utility (0.80) than those on continuous ambulatory peritoneal dialysis (0.72; p = 0.02). The mean utility of transplant patients increased over time, from 0.66 in the 1980s to 0.85 in the 2000s, an increase of 0.19 (95% CI: 0.11, 0.26). Utility varied by elicitation instrument, with standard gamble producing the highest estimates, and the SF-6D by Brazier et al., University of Sheffield, producing the lowest estimates. The main limitations of this study were that treatment assignments were not random, that only transplant had longitudinal data available, and that we calculated EuroQol Group EQ-5D scores from SF-36 and SF-12 health survey data, and therefore the algorithms may not reflect EQ-5D scores measured directly.
Conclusions
For patients with late-stage CKD, treatment with dialysis is associated with a significant decrement in quality of life compared to treatment with kidney transplantation. These findings provide evidence-based utility estimates to inform economic evaluations of kidney therapies, useful for policy makers and in individual treatment discussions with CKD patients.
Editors' Summary
Background
Ill health can adversely affect an individual's quality of life, particularly if caused by long-term (chronic) conditions, such as chronic kidney disease—in the United States alone, 23 million people have chronic kidney disease, of whom 570,000 are treated with dialysis or kidney transplantation. In order to measure the cost-effectiveness of interventions to manage medical conditions, health economists use an objective measurement known as quality-adjusted life years. However, although useful, quality-adjusted life years are often criticized for not taking into account the views and preferences of the individuals with the medical conditions. A measurement called a utility solves this problem. A utility is a numerical value (measured on a 0 to 1 scale, where 0 represents death and 1 represents full health) that expresses the strength of an individual's preference for specified health-related outcomes, as measured by “instruments” (questionnaires) that rate direct comparisons or assess quality of life.
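The arithmetic behind this scale is simple: a utility weight multiplied by time spent in that health state gives quality-adjusted life years (QALYs). A minimal sketch, using invented illustrative numbers rather than figures from the study:

```python
def qalys(years: float, utility: float) -> float:
    """QALYs = time in a health state x its utility weight (0 = death, 1 = full health)."""
    if not 0.0 <= utility <= 1.0:
        raise ValueError("utility must lie in [0, 1]")
    return round(years * utility, 2)

# Hypothetical comparison: 10 years in a state with utility 0.71
# versus 10 years in a state with utility 0.82
print(qalys(10, 0.71))  # 7.1 QALYs
print(qalys(10, 0.82))  # 8.2 QALYs
```

This is why small differences in pooled utilities between treatment modalities matter for economic evaluations: multiplied over years of survival, they translate into meaningful QALY differences.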
Why Was This Study Done?
Previous studies have suggested that, in people with chronic kidney disease, quality of life (as measured by utility) is higher in those with a functioning kidney transplant than in those on dialysis. However, currently, it is unclear whether the type of dialysis affects quality of life: hemodialysis is a highly technical process that directly filters the blood, usually must be done 2–4 times a week, and can only be performed in a health facility; peritoneal dialysis, in which fluids are infused into the abdominal cavity, can be done nightly at home (automated peritoneal dialysis) or throughout the day (continuous ambulatory peritoneal dialysis). In this study, the researchers reviewed and assimilated all of the available evidence to investigate whether quality of life in people with chronic kidney disease (as measured by utility) differed according to treatment type.
What Did the Researchers Do and Find?
The researchers did a comprehensive search of 11 databases to identify all relevant studies that included people with severe (stage 3, 4, or 5) chronic kidney disease, their form of treatment, and information on utilities—either reported directly or derivable from quality of life instruments (such as the SF-36) using a validated algorithm. The researchers also recorded the prevalence rates of diabetes in study participants. Then, using statistical models that adjusted for various factors, including treatment type and the method of measuring utilities, the researchers were able to calculate the pooled utilities of each form of treatment for chronic kidney disease.
The researchers included 190 studies, representing over 56,000 patients and generating 326 utility estimates, in their analysis. The majority of utilities (77%) were derived through the SF-36 questionnaire via calculation. Of the 326 utility estimates, 25 were from patients pre-dialysis, 226 were from dialysis patients (the majority of whom were receiving hemodialysis), 66 were from kidney transplant patients, and three were from conservative care patients. The researchers found that the highest average utility was for those who had renal transplantation, 0.82, followed by the pre-dialysis group (0.80), dialysis patients (0.71), and, finally, patients receiving conservative care (0.62). When comparing the type of dialysis, the researchers found that there was little difference in utility between hemodialysis and peritoneal dialysis, but patients using automated peritoneal dialysis had, on average, a higher utility (0.80) than those treated with continuous ambulatory peritoneal dialysis (0.72). Finally, the researchers found that patient groups with diabetes had significantly lower utilities than those without diabetes.
What Do These Findings Mean?
These findings suggest that in people with chronic kidney disease, renal transplantation is the best treatment option to improve quality of life. For those on dialysis, home-based automated peritoneal dialysis may improve quality of life more than the other forms of dialysis: this finding is important, as this type of dialysis is not as widely used as other forms and is also cheaper than hemodialysis. Furthermore, these findings suggest that patients who choose conservative care have significantly lower quality of life than patients treated with dialysis, a finding that warrants further investigation. Overall, in addition to helping to inform economic evaluations of treatment options, the information from this analysis can help guide clinicians caring for patients with chronic kidney disease in their discussions about possible treatment options.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001307.
Information about chronic kidney disease is available from the National Kidney Foundation and MedlinePlus
Wikipedia gives information on general utilities (note that Wikipedia is a free online encyclopedia that anyone can edit; available in several languages)
doi:10.1371/journal.pmed.1001307
PMCID: PMC3439392  PMID: 22984353
3.  Risk Models to Predict Chronic Kidney Disease and Its Progression: A Systematic Review 
PLoS Medicine  2012;9(11):e1001344.
A systematic review of risk prediction models conducted by Justin Echouffo-Tcheugui and Andre Kengne examines the evidence base for prediction of chronic kidney disease risk and its progression, and suitability of such models for clinical use.
Background
Chronic kidney disease (CKD) is common, and associated with increased risk of cardiovascular disease and end-stage renal disease, which are potentially preventable through early identification and treatment of individuals at risk. Although risk factors for occurrence and progression of CKD have been identified, their utility for CKD risk stratification through prediction models remains unclear. We critically assessed risk models to predict CKD and its progression, and evaluated their suitability for clinical use.
Methods and Findings
We systematically searched MEDLINE and Embase (1 January 1980 to 20 June 2012). Dual review was conducted to identify studies that reported on the development, validation, or impact assessment of a model constructed to predict the occurrence/presence of CKD or progression to advanced stages. Data were extracted on study characteristics, risk predictors, discrimination, calibration, and reclassification performance of models, as well as validation and impact analyses. We included 26 publications reporting on 30 CKD occurrence prediction risk scores and 17 CKD progression prediction risk scores. The vast majority of CKD risk models had acceptable-to-good discriminatory performance (area under the receiver operating characteristic curve>0.70) in the derivation sample. Calibration was less commonly assessed, but overall was found to be acceptable. Only eight CKD occurrence and five CKD progression risk models have been externally validated, displaying modest-to-acceptable discrimination. Whether novel biomarkers of CKD (circulatory or genetic) can improve prediction largely remains unclear, and impact studies of CKD prediction models have not yet been conducted. Limitations of risk models include the lack of ethnic diversity in derivation samples, and the scarcity of validation studies. The review is limited by the lack of an agreed-on system for rating prediction models, and the difficulty of assessing publication bias.
Conclusions
The development and clinical application of renal risk scores is in its infancy; however, the discriminatory performance of existing tools is acceptable. The effect of using these models in practice is still to be explored.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Chronic kidney disease (CKD)—the gradual loss of kidney function—is increasingly common worldwide. In the US, for example, about 26 million adults have CKD, and millions more are at risk of developing the condition. Throughout life, small structures called nephrons inside the kidneys filter waste products and excess water from the blood to make urine. If the nephrons stop working because of injury or disease, the rate of blood filtration decreases, and dangerous amounts of waste products such as creatinine build up in the blood. Symptoms of CKD, which rarely occur until the disease is very advanced, include tiredness, swollen feet and ankles, puffiness around the eyes, and frequent urination, especially at night. There is no cure for CKD, but progression of the disease can be slowed by controlling high blood pressure and diabetes, both of which cause CKD, and by adopting a healthy lifestyle. The same interventions also reduce the chances of CKD developing in the first place.
Why Was This Study Done?
CKD is associated with an increased risk of end-stage renal disease, which is treated with dialysis or by kidney transplantation (renal replacement therapies), and of cardiovascular disease. These life-threatening complications are potentially preventable through early identification and treatment of CKD, but most people present with advanced disease. Early identification would be particularly useful in developing countries, where renal replacement therapies are not readily available and resources for treating cardiovascular problems are limited. One way to identify people at risk of a disease is to use a “risk model.” Risk models are constructed by testing the ability of different combinations of risk factors that are associated with a specific disease to identify those individuals in a “derivation sample” who have the disease. The model is then validated on an independent group of people. In this systematic review (a study that uses predefined criteria to identify all the research on a given topic), the researchers critically assess the ability of existing CKD risk models to predict the occurrence of CKD and its progression, and evaluate their suitability for clinical use.
What Did the Researchers Do and Find?
The researchers identified 26 publications reporting on 30 risk models for CKD occurrence and 17 risk models for CKD progression that met their predefined criteria. The risk factors most commonly included in these models were age, sex, body mass index, diabetes status, systolic blood pressure, serum creatinine, protein in the urine, and serum albumin or total protein. Nearly all the models had acceptable-to-good discriminatory performance (a measure of how well a model separates people who have a disease from people who do not have the disease) in the derivation sample. Not all the models had been calibrated (assessed for whether the average predicted risk within a group matched the proportion that actually developed the disease), but in those that had been assessed calibration was good. Only eight CKD occurrence and five CKD progression risk models had been externally validated; discrimination in the validation samples was modest-to-acceptable. Finally, very few studies had assessed whether adding extra variables to CKD risk models (for example, genetic markers) improved prediction, and none had assessed the impact of adopting CKD risk models on the clinical care and outcomes of patients.
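Discriminatory performance, as described above, is usually summarized by the area under the receiver operating characteristic curve (AUC), which equals the probability that the model scores a randomly chosen person who develops the disease higher than a randomly chosen person who does not. A minimal sketch of that pairwise-concordance interpretation, using made-up risk scores (not data from any reviewed model):

```python
def auc_by_concordance(case_scores, noncase_scores):
    """AUC as the fraction of case/non-case pairs the model ranks correctly."""
    wins = 0.0
    for c in case_scores:
        for n in noncase_scores:
            if c > n:
                wins += 1.0
            elif c == n:
                wins += 0.5  # ties count as half a correct ranking
    return wins / (len(case_scores) * len(noncase_scores))

cases = [0.9, 0.8, 0.7, 0.6]     # illustrative predicted risks, people who developed CKD
noncases = [0.5, 0.4, 0.3, 0.2]  # illustrative predicted risks, people who did not
print(auc_by_concordance(cases, noncases))  # 1.0: perfect separation
```

An AUC of 0.5 means the model ranks pairs no better than chance; the review's threshold of >0.70 corresponds to correctly ranking more than 70% of such pairs.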
What Do These Findings Mean?
These findings suggest that the development and clinical application of CKD risk models is still in its infancy. Specifically, these findings indicate that the existing models need to be better calibrated and need to be externally validated in different populations (most of the models were tested only in predominantly white populations) before they are incorporated into guidelines. The impact of their use on clinical outcomes also needs to be assessed before their widespread use is recommended. Such research is worthwhile, however, because of the potential public health and clinical applications of well-designed risk models for CKD. Such models could be used to identify segments of the population that would benefit most from screening for CKD, for example. Moreover, risk communication to patients could motivate them to adopt a healthy lifestyle and to adhere to prescribed medications, and the use of models for predicting CKD progression could help clinicians tailor disease-modifying therapies to individual patient needs.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001344.
This study is further discussed in a PLOS Medicine Perspective by Maarten Taal
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including a selection of patient experiences of kidney disease
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
doi:10.1371/journal.pmed.1001344
PMCID: PMC3502517  PMID: 23185136
4.  Association of Non-alcoholic Fatty Liver Disease with Chronic Kidney Disease: A Systematic Review and Meta-analysis 
PLoS Medicine  2014;11(7):e1001680.
In a systematic review and meta-analysis, Giovanni Musso and colleagues examine the association between non-alcoholic fatty liver disease and chronic kidney disease.
Please see later in the article for the Editors' Summary
Background
Chronic kidney disease (CKD) is a frequent, under-recognized condition and a risk factor for renal failure and cardiovascular disease. Increasing evidence connects non-alcoholic fatty liver disease (NAFLD) to CKD. We conducted a meta-analysis to determine whether the presence and severity of NAFLD are associated with the presence and severity of CKD.
Methods and Findings
English and non-English articles from international online databases from 1980 through January 31, 2014 were searched. Observational studies assessing NAFLD by histology, imaging, or biochemistry and defining CKD as either estimated glomerular filtration rate (eGFR) <60 ml/min/1.73 m2 or proteinuria were included. Two reviewers extracted studies independently and in duplicate. Individual participant data (IPD) were solicited from all selected studies. Studies providing IPD were combined with studies providing only aggregate data with the two-stage method. Main outcomes were pooled using random-effects models. Sensitivity and subgroup analyses were used to explore sources of heterogeneity and the effect of potential confounders. The influences of age, whole-body/abdominal obesity, homeostasis model of insulin resistance (HOMA-IR), and duration of follow-up on effect estimates were assessed by meta-regression. Thirty-three studies (63,902 participants, 16 population-based and 17 hospital-based, 20 cross-sectional, and 13 longitudinal) were included. For 20 studies (61% of included studies, 11 cross-sectional and nine longitudinal, 29,282 participants), we obtained IPD. NAFLD was associated with an increased risk of prevalent (odds ratio [OR] 2.12, 95% CI 1.69–2.66) and incident (hazard ratio [HR] 1.79, 95% CI 1.65–1.95) CKD. Non-alcoholic steatohepatitis (NASH) was associated with a higher prevalence (OR 2.53, 95% CI 1.58–4.05) and incidence (HR 2.12, 95% CI 1.42–3.17) of CKD than simple steatosis. Advanced fibrosis was associated with a higher prevalence (OR 5.20, 95% CI 3.14–8.61) and incidence (HR 3.29, 95% CI 2.30–4.71) of CKD than non-advanced fibrosis. In all analyses, the magnitude and direction of effects remained unaffected by diabetes status, after adjustment for other risk factors, and in other subgroup and meta-regression analyses. In cross-sectional and longitudinal studies, the severity of NAFLD was positively associated with CKD stages. 
Limitations of the analysis are the relatively small size of studies utilizing liver histology and the suboptimal sensitivity of ultrasound and biochemistry for NAFLD detection in population-based studies.
Conclusion
The presence and severity of NAFLD are associated with an increased risk and severity of CKD.
Editors' Summary
Background
Chronic kidney disease (CKD)—the gradual loss of kidney function—is becoming increasingly common. In the US, for example, more than 10% of the adult population (about 26 million people) and more than 25% of individuals older than 65 years have CKD. Throughout life, the kidneys perform the essential task of filtering waste products (from the normal breakdown of tissues and from food) and excess water from the blood to make urine. CKD gradually destroys the kidneys' filtration units, the rate of blood filtration decreases, and dangerous amounts of waste products build up in the blood. Symptoms of CKD, which rarely occur until the disease is very advanced, include tiredness, swollen feet, and frequent urination, particularly at night. There is no cure for CKD, but progression of the disease can be slowed by controlling high blood pressure and diabetes (two risk factors for CKD), and by adopting a healthy lifestyle. The same interventions also reduce the chances of CKD developing in the first place.
Why Was This Study Done?
CKD is associated with an increased risk of end-stage renal (kidney) disease and of cardiovascular disease. These life-threatening complications are potentially preventable through early identification and treatment of CKD. Because early recognition of CKD has the potential to reduce its health-related burden, the search is on for new modifiable risk factors for CKD. One possible new risk factor is non-alcoholic fatty liver disease (NAFLD), which, like CKD, is becoming increasingly common. Healthy livers contain little or no fat but, in the US, 30% of the general adult population and up to 70% of patients who are obese or have diabetes have some degree of NAFLD, which ranges in severity from simple fatty liver (steatosis), through non-alcoholic steatohepatitis (NASH), to NASH with fibrosis (scarring of the liver) and finally cirrhosis (extensive scarring). In this systematic review and meta-analysis, the researchers investigate whether NAFLD is a risk factor for CKD by looking for an association between the two conditions. A systematic review identifies all the research on a given topic using predefined criteria; meta-analysis uses statistical methods to combine the results of several studies.
What Did the Researchers Do and Find?
The researchers identified 33 studies that assessed NAFLD and CKD in nearly 64,000 participants, including 20 cross-sectional studies in which participants were assessed for NAFLD and CKD at a single time point and 13 longitudinal studies in which participants were assessed for NAFLD and then followed up to see whether they subsequently developed CKD. Meta-analysis of the data from the cross-sectional studies indicated that NAFLD was associated with a 2-fold increased risk of prevalent (pre-existing) CKD (an odds ratio [OR] of 2.12; an OR indicates the chance that an outcome will occur given a particular exposure, compared to the chance of the outcome occurring in the absence of that exposure). Meta-analysis of data from the longitudinal studies indicated that NAFLD was associated with a nearly 2-fold increased risk of incident (new) CKD (a hazard ratio [HR] of 1.79; an HR indicates how often a particular event happens in one group compared to how often it happens in another group, over time). NASH was associated with a higher prevalence and incidence of CKD than simple steatosis. Similarly, advanced fibrosis was associated with a higher prevalence and incidence of CKD than non-advanced fibrosis.
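The odds ratio described above comes straight from a 2x2 exposure-by-outcome table: the odds of CKD among people with NAFLD divided by the odds of CKD among people without it. A minimal sketch with invented counts (not the study's data):

```python
def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """OR = (odds of outcome in exposed) / (odds of outcome in unexposed)."""
    odds_exposed = exposed_cases / exposed_noncases
    odds_unexposed = unexposed_cases / unexposed_noncases
    return odds_exposed / odds_unexposed

# Hypothetical 2x2 table: 30 of 100 NAFLD patients have CKD,
# versus 15 of 100 people without NAFLD
print(round(odds_ratio(30, 70, 15, 85), 2))  # 2.43
```

An OR of 1 would mean NAFLD carries no excess odds of CKD; the pooled OR of 2.12 reported here means the odds of prevalent CKD were roughly doubled in people with NAFLD. Hazard ratios are analogous but come from time-to-event models and so account for when incident CKD occurs, not just whether it occurs.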
What Do These Findings Mean?
These findings suggest that NAFLD is associated with an increased prevalence and incidence of CKD and that increased severity of liver disease is associated with an increased risk and severity of CKD. Because these associations persist after allowing for established risk factors for CKD, these findings identify NAFLD as an independent CKD risk factor. Certain aspects of the studies included in this meta-analysis (for example, only a few studies used biopsies to diagnose NAFLD; most used less sensitive tests that may have misclassified some individuals with NAFLD as normal) and the methods used in the meta-analysis may limit the accuracy of these findings. Nevertheless, these findings suggest that individuals with NAFLD should be screened for CKD even in the absence of other risk factors for the disease, and that better treatment of NAFLD may help to prevent CKD.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001680.
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Digestive Diseases Information Clearinghouse provides information about non-alcoholic liver disease
The US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease, including some personal stories, and information on non-alcoholic fatty liver disease
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers
The British Liver Trust, a not-for-profit organization, provides information about non-alcoholic fatty liver disease, including a patient story
doi:10.1371/journal.pmed.1001680
PMCID: PMC4106719  PMID: 25050550
5.  Methoxy polyethylene glycol-epoetin beta for anemia with chronic kidney disease 
Chronic kidney disease (CKD) is a risk factor for end-stage renal failure and cardiovascular events. In patients with CKD, anemia is often caused by decreased erythropoietin production relative to hemoglobin levels. As correction of anemia is associated with improved cardiac and renal function and quality of life, erythropoiesis-stimulating agents (ESAs) are standard therapy for anemia in CKD patients. However, traditional ESAs such as epoetin or darbepoetin have short half-lives and require frequent administration, dose changes, and close monitoring of hemoglobin concentration to maintain target hemoglobin levels. Methoxy polyethylene glycol-epoetin beta (MPG-EPO) is the only ESA that is generated by chemical modification of glycosylated erythropoietin through the integration of one specific, long, linear chain of polyethylene glycol. This ESA induces continuous erythropoietin receptor activation and has a long half-life (approximately 130 hours). Subcutaneous or intravenous administration of MPG-EPO once every 2 weeks or monthly achieved a high hemoglobin response rate in patients with anemia associated with CKD, regardless of whether the patient was undergoing dialysis. According to data from an observational time and motion study, MPG-EPO maintains hemoglobin levels at the same dose even when administered less frequently. This suggests that compared with the use of traditional ESAs, administration of MPG-EPO reduces the overall time and cost associated with the management of anemia in CKD patients undergoing dialysis. MPG-EPO is generally well tolerated and most adverse events are of mild to moderate severity. The most commonly reported adverse effects are hypertension, nasopharyngitis, and diarrhea. Subcutaneous injection of MPG-EPO is significantly less painful than subcutaneous injection of darbepoetin. In conclusion, MPG-EPO is as effective and safe as traditional ESAs in managing renal anemia, irrespective of whether the patient is undergoing dialysis.
doi:10.2147/IJNRD.S23447
PMCID: PMC3333806  PMID: 22536082
methoxy polyethylene glycol-epoetin beta; renal anemia; end-stage renal failure; hemoglobin; erythropoiesis-stimulating agent
6.  Sex-Specific Differences in Hemodialysis Prevalence and Practices and the Male-to-Female Mortality Rate: The Dialysis Outcomes and Practice Patterns Study (DOPPS) 
PLoS Medicine  2014;11(10):e1001750.
In this study, Port and colleagues describe hemodialysis prevalence and patient characteristics by sex, compare men-to-women mortality rate with data from the general population, and evaluate sex interactions with mortality. The results show that women's survival advantage was markedly diminished in hemodialysis patients.
Please see later in the article for the Editors' Summary
Background
A comprehensive analysis of sex-specific differences in the characteristics, treatment, and outcomes of individuals with end-stage renal disease undergoing dialysis might reveal treatment inequalities and targets to improve sex-specific patient care. Here we describe hemodialysis prevalence and patient characteristics by sex, compare the adult male-to-female mortality rate with data from the general population, and evaluate sex interactions with mortality.
Methods and Findings
We assessed the Human Mortality Database and 206,374 patients receiving hemodialysis from 12 countries (Australia, Belgium, Canada, France, Germany, Italy, Japan, New Zealand, Spain, Sweden, the UK, and the US) participating in the international, prospective Dialysis Outcomes and Practice Patterns Study (DOPPS) between June 1996 and March 2012. Among 35,964 sampled DOPPS patients with full data collection, we studied patient characteristics (descriptively) and mortality (via Cox regression) by sex. In all age groups, more men than women were on hemodialysis (59% versus 41% overall), with large differences observed between countries. The average estimated glomerular filtration rate at hemodialysis initiation was higher in men than women. The male-to-female mortality rate ratio in the general population varied from 1.5 to 2.6 for age groups <75 y, but in hemodialysis patients was close to one. Compared to women, men were younger (mean ± standard deviation: 61.9 ± 14.6 versus 63.1 ± 14.5 y), were less frequently obese, were more frequently married and recipients of a kidney transplant, more frequently had coronary artery disease, and were less frequently depressed. Interaction analyses showed that the mortality risk associated with several comorbidities and hemodialysis catheter use was lower for men (hazard ratio [HR] = 1.11) than women (HR = 1.33, interaction p<0.001). This study is limited by its inability to establish causality for the observed sex-specific differences and does not provide information about patients not treated with dialysis or dying prior to a planned start of dialysis.
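The male-to-female mortality rate ratio compared here is a ratio of two incidence rates (deaths per unit of person-time); a minimal sketch with hypothetical counts, not DOPPS or Human Mortality Database figures:

```python
def mortality_rate_ratio(deaths_m, person_years_m, deaths_f, person_years_f):
    """Male mortality rate divided by female mortality rate,
    each rate being deaths per person-year of follow-up."""
    rate_m = deaths_m / person_years_m
    rate_f = deaths_f / person_years_f
    return rate_m / rate_f

# Hypothetical: 150 male deaths over 1,000 person-years versus
# 100 female deaths over 1,000 person-years gives a ratio of 1.5,
# the lower end of the general-population range cited above.
ratio = mortality_rate_ratio(150, 1000, 100, 1000)
```

A ratio near 1, as observed among hemodialysis patients, means the two groups are dying at roughly the same rate.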
Conclusions
Women's survival advantage was markedly diminished in hemodialysis patients. The finding that fewer women than men were being treated with dialysis for end-stage renal disease merits detailed further study, as the large discrepancies in sex-specific hemodialysis prevalence by country and age group are likely explained by factors beyond biology. Modifiable variables, such as catheter use, showing significant sex interactions suggest interventional targeting.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Throughout life, the kidneys filter waste products (from the normal breakdown of tissues and from food) and excess water from the blood to make urine. Chronic kidney disease—an increasingly common condition globally—gradually destroys the kidney's filtration units (the nephrons). As the nephrons stop working, the rate at which the blood is filtered (the glomerular filtration rate) decreases, and waste products build up in the blood, eventually leading to life-threatening end-stage kidney (renal) disease. Symptoms of chronic kidney disease, which rarely occur until the disease is advanced, include tiredness, swollen feet and ankles, and frequent urination, particularly at night. Chronic kidney disease cannot be cured, but its progression can be slowed by controlling diabetes and other conditions that contribute to its development. End-stage kidney disease is treated by regular hemodialysis (a process in which blood is cleaned by passing it through a filtration machine) or by kidney transplantation.
Why Was This Study Done?
Like many other long-term conditions, the prevalence (the proportion of the population that has a specific disease) of chronic kidney disease and of end-stage renal disease, and treatment outcomes for these conditions, may differ between men and women. Some of these sex-specific differences may arise because of sex-specific differences in normal biological functions. Other sex-specific differences may be related to sex-specific differences in patient care or in patient awareness of chronic kidney disease. A comprehensive analysis of sex-specific differences among individuals with end-stage renal disease might identify both treatment inequalities and ways to improve sex-specific care. Here, in the Dialysis Outcomes and Practice Patterns Study (DOPPS), the researchers investigate sex-specific differences in the prevalence and practices of hemodialysis and in the characteristics of patients undergoing hemodialysis, and investigate the adult male-to-female mortality (death) rate among patients undergoing hemodialysis. The DOPPS is a prospective cohort study that is investigating the characteristics, treatment, and outcomes of adult patients undergoing hemodialysis in representative facilities in 19 countries (12 countries were available for analysis at the time of the current study).
What Did the Researchers Do and Find?
To investigate sex-specific differences in hemodialysis prevalence, the researchers compared data from the Human Mortality Database, which provides detailed population and mortality data for 37 countries, with data collected by the DOPPS. Forty-one percent of DOPPS patients were women, compared to 52% of the general population in 12 of the DOPPS countries. Next, the researchers used data collected from a randomly selected subgroup of patients to examine sex-specific differences in patient characteristics and mortality. The average estimated glomerular filtration rate at hemodialysis initiation was higher in men than women. Moreover, men were more frequently recipients of a kidney transplant than women. Notably, although in the general population in a given age group women were less likely to die than men, among hemodialysis patients, women were as likely to die as men. Finally, the researchers investigated which patient characteristics were associated with the largest sex-specific differences in mortality risk. The use of a hemodialysis catheter (a tube that is inserted into a patient's vein to transfer their blood into the hemodialysis machine) was associated with a lower mortality risk in men than in women.
What Do These Findings Mean?
These findings show that, among patients treated with hemodialysis for end-stage renal disease, women differ from men in many ways. Although some of these sex-specific differences may be related to biology, others may be related to patient care and to patient awareness of chronic kidney disease. Because this is an observational study, these findings cannot prove that the reported differences in hemodialysis prevalence, treatment, and mortality are actually caused by being a man or a woman. Importantly, however, these findings suggest that hemodialysis may abolish the survival advantage that women have over men in the general population and that fewer women than men are being treated for end-stage renal disease, even though chronic kidney disease is more common in women than in men. Finally, the finding that the use of hemodialysis catheters for access to veins is associated with a higher mortality risk among women than among men suggests that, where possible, women should be offered a surgical process called arteriovenous fistula placement, which is recommended for access to veins during long-term hemodialysis but which may, in the past, have been underused in women.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001750.
More information about the DOPPS program is available
The US National Kidney and Urologic Diseases Information Clearinghouse provides information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease and about hemodialysis, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease and about hemodialysis (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including information and personal stories about hemodialysis
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
MedlinePlus has pages about chronic kidney disease and about hemodialysis
doi:10.1371/journal.pmed.1001750
PMCID: PMC4211675  PMID: 25350533
7.  Burden and management of chronic kidney disease in Japan: systematic review of the literature 
Background
Chronic kidney disease (CKD) is a common disorder with increasing prevalence worldwide. This systematic literature review aims to provide insights specific to Japan regarding the burden and treatment of CKD.
Methods
We reviewed English and Japanese language publications from the last 10 years, reporting economic, clinical, humanistic, and epidemiologic outcomes, as well as treatment patterns and guidelines on CKD in Japan.
Results
This review identified 85 relevant articles. The prevalence of CKD was found to have increased in Japan, attributable to multiple factors, including better survival on dialysis therapy and a growing elderly population. Risk factors for disease progression differed depending on CKD stage, with proteinuria, smoking, hypertension, and low levels of high-density lipoprotein commonly associated with progression in patients with stage 1 and 2 disease. Serum albumin and hemoglobin levels were the variables most sensitive to progression in patients with stage 3 and 5 disease, respectively. Economic data were limited. Increased costs were associated with disease progression, and with peritoneal dialysis as compared with either hemodialysis or combination therapy (hemodialysis + peritoneal dialysis) treatment options. Pharmacological treatments were found to potentially improve quality of life and to result in cost savings. We found no reports of treatment patterns in patients with early-stage CKD; however, calcium channel blockers were the most commonly prescribed antihypertensive agents in hemodialysis patients. Treatment guidelines focused on anemia management related to dialysis and recommendations for peritoneal dialysis treatment and preventative measures. Few studies focused on humanistic burden in Japanese patients; Japanese patients reported greater disease burden but better physical functioning compared with US and European patients.
Conclusion
A dearth of evidence regarding the earlier stages of kidney disease presents an incomplete picture of CKD disease burden in Japan. Further research is needed to gain additional insight into CKD in Japan.
doi:10.2147/IJNRD.S30894
PMCID: PMC3540912  PMID: 23319870
renal disease; epidemiology; disease progression; treatment patterns
8.  Effects of the dose of erythropoiesis stimulating agents on cardiovascular events, quality of life, and health-related costs in hemodialysis patients: the clinical evaluation of the dose of erythropoietins (C.E. DOSE) trial protocol 
Trials  2010;11:70.
Background
Anemia is a risk factor for death, adverse cardiovascular outcomes and poor quality of life in patients with chronic kidney disease (CKD). Erythropoiesis-stimulating agents (ESAs) are commonly used to increase hemoglobin levels in this population. In observational studies, higher hemoglobin levels (around 11-13 g/dL) are associated with improved survival and quality of life compared to hemoglobin levels around 9-10 g/dL. A systematic review of randomized trials found that targeting higher hemoglobin levels with ESA causes an increased risk of adverse vascular outcomes. It is possible, but has never been formally tested in a randomized trial, that ESA dose rather than targeted hemoglobin concentration itself mediates the increased risk of adverse vascular outcomes. The Clinical Evaluation of the DOSe of Erythropoietins (C.E. DOSE) trial will assess the benefits and harms of a high versus a low fixed ESA dose for the management of anemia in patients with end stage kidney disease.
Methods/Design
This is a randomized, prospective open label blinded end-point (PROBE) trial due to enrol 2204 hemodialysis patients in Italy. Patients will be randomized 1:1 to 4000 IU/week versus 18000 IU/week of intravenous epoetin alfa or beta, or any other ESA in equivalent doses. The dose will be adjusted only if hemoglobin levels fall outside the 9.5-12.5 g/dL range. The primary outcome will be a composite of all-cause mortality, nonfatal stroke, nonfatal myocardial infarction and hospitalization for cardiovascular causes. Quality of life and costs will also be assessed.
Discussion
The C.E. DOSE study will help inform the optimal therapeutic strategy for the management of anemia of hemodialysis patients, improving clinical outcomes, quality of life and costs, by ascertaining the potential benefits and harms of different fixed ESA doses.
Trial registration
Clinicaltrials.gov NCT00827021
doi:10.1186/1745-6215-11-70
PMCID: PMC2903576  PMID: 20534124
9.  Renal Function and Risk of Coronary Heart Disease in General Populations: New Prospective Study and Systematic Review 
PLoS Medicine  2007;4(9):e270.
Background
End-stage chronic kidney disease is associated with striking excesses of cardiovascular mortality, but it is uncertain to what extent renal function is related to risk of subsequent coronary heart disease (CHD) in apparently healthy adults. This study aims to quantify the association of markers of renal function with CHD risk in essentially general populations.
Methods and Findings
Estimated glomerular filtration rate (eGFR) was calculated using standard prediction equations based on serum creatinine measurements made in 2,007 patients diagnosed with nonfatal myocardial infarction or coronary death during follow-up and in 3,869 people without CHD in the Reykjavik population-based cohort of 18,569 individuals. There were small and nonsignificant odds ratios (ORs) for CHD risk over most of the range in eGFR, except in the lowest category of the lowest fifth (corresponding to values of <60 ml/min/1.73m2), in which the OR was 1.33 (95% confidence interval 1.01–1.75) after adjustment for several established cardiovascular risk factors. Findings from the Reykjavik study were reinforced by a meta-analysis of six previous reports (identified in electronic and other databases) involving a total of 4,720 incident CHD cases (including Reykjavik), which yielded a combined risk ratio of 1.41 (95% confidence interval 1.19–1.68) in individuals with baseline eGFR less than 60 ml/min/1.73m2 compared with those with higher values.
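The combined risk ratio reported here comes from pooling study-level estimates. The abstract does not state the exact pooling method, so the sketch below illustrates one standard approach, fixed-effect inverse-variance pooling on the log scale, with hypothetical study results rather than the six studies actually combined:

```python
import math

def pooled_ratio(ratios, cis):
    """Fixed-effect inverse-variance pooling of ratio estimates
    (risk or hazard ratios). Work on the log scale; recover each
    study's standard error from its 95% confidence interval."""
    total_weight = 0.0
    weighted_log_sum = 0.0
    for rr, (lo, hi) in zip(ratios, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # CI width -> SE
        w = 1.0 / se**2                                   # inverse variance
        total_weight += w
        weighted_log_sum += w * math.log(rr)
    return math.exp(weighted_log_sum / total_weight)

# Two hypothetical studies (RR with 95% CI); the pooled estimate
# falls between them, weighted toward the more precise study.
pooled = pooled_ratio([1.3, 1.6], [(1.0, 1.69), (1.1, 2.33)])
```

More precise studies (narrower intervals) get larger weights, which is why meta-analytic estimates such as the 1.41 above are dominated by the biggest studies.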
Conclusions
Although there are no strong associations between lower-than-average eGFR and CHD risk in apparently healthy adults over most of the range in renal function, there may be a moderate increase in CHD risk associated with very low eGFR (i.e., renal dysfunction) in the general population. These findings could have implications for the further understanding of CHD and targeting cardioprotective interventions.
John Danesh and colleagues conclude there may be a moderate increase in risk of coronary heart disease associated with very low estimated glomerular filtration rate.
Editors' Summary
Background.
Coronary heart disease (CHD), the leading cause of death in most Western countries, is a “cardiovascular” disease—literally a disorder affecting the heart and/or blood vessels. In CHD, the blood vessels that supply the heart become increasingly narrow. Eventually, the flow of blood to the heart slows or stops, causing chest pains (angina), breathlessness, and heart attacks. Many factors increase the risk of developing CHD and other cardiovascular diseases, including high blood pressure, high blood levels of cholesterol (a type of fat), or being overweight. Individuals can reduce their chances of developing cardiovascular disease by taking drugs to reduce their blood pressure or cholesterol levels or by making lifestyle changes (so-called cardioprotective interventions). Another important risk factor for cardiovascular disease is end-stage chronic kidney disease (CKD), a condition in which the kidneys stop working. (In healthy people, the kidneys remove waste products and excess fluid from the body.) People with end-stage CKD (which is treated by dialysis) have about a five times higher risk of dying from cardiovascular disease compared with healthy people.
Why Was This Study Done?
End-stage CKD is preceded by a gradual loss of kidney function. There is a clear association between non-dialysis–dependent CKD and the incidence of cardiovascular events (such as heart attacks) in people who already have signs of cardiovascular disease. But are people with slightly dysfunctional kidneys (often because of increasing age) but without any obvious cardiovascular disease at greater risk of developing cardiovascular diseases than people with fully functional kidneys? If the answer is yes, it might be possible to reduce CHD deaths by minimizing the exposure of people with CKD to other risk factors for cardiovascular disease. In this study, the researchers have taken two approaches to answer this question. In a population-based study, they have examined whether there is any association in healthy adults between kidney function measured at the start of the study and incident CHD (the first occurrence of CHD) over subsequent years. In addition, they have systematically searched the published literature for similar studies and combined the results of these studies using statistical methods, a so-called “meta-analysis.”
What Did the Researchers Do and Find?
Between 1967 and 1991, nearly 19,000 middle-aged men and women without a history of heart attacks living in Reykjavik, Iceland, enrolled in a prospective study of cardiovascular disease. Baseline blood samples were taken at enrollment and the participants' health monitored for 20 years on average. The researchers identified 2,007 participants who suffered a nonfatal heart attack or died of CHD during follow-up and 3,869 who remained disease free. They then calculated the estimated glomerular filtration rate (eGFR; a measure of kidney function) for each participant from baseline creatinine measurements (creatinine is a muscle waste product). There was no association between lower-than-average eGFRs and the risk of developing CHD over most of the range of eGFR values. However, people whose eGFR was below approximately 60 units had about a 40% higher risk of developing CHD after allowing for established cardiovascular risk factors than individuals with higher eGFRs. This finding was confirmed by the meta-analysis of six previous studies, which included a further 2,700 incident CHD cases.
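The eGFR computed from serum creatinine can be illustrated with the four-variable MDRD study equation, one of several standard prediction equations (the summary does not specify which equation the researchers used, and the leading constant depends on creatinine assay calibration; the original equation's race coefficient is omitted here):

```python
def egfr_mdrd(creatinine_mg_dl, age_years, female=False):
    """Four-variable MDRD estimate of glomerular filtration rate,
    in ml/min/1.73 m^2. Illustrative only: the constant 175 assumes
    an IDMS-calibrated creatinine assay."""
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    return egfr

# A creatinine of 1.0 mg/dL in a 60-year-old man yields an eGFR
# well above the 60-unit cut-off used to define CKD in this study;
# doubling creatinine pushes the estimate below that cut-off.
normal = egfr_mdrd(1.0, 60)
reduced = egfr_mdrd(2.0, 60)
```

Because eGFR falls as creatinine rises, the "below about 60 units" group in the analysis corresponds to substantially elevated creatinine for a given age and sex.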
What Do These Findings Mean?
These findings indicate that people with an eGFR below about 60 units (the cut-off used to define CKD) may have an increased risk of developing CHD. They also indicate a nonlinear association between kidney function and CHD risk. That is, any association with CHD became evident only when the eGFR dropped below about 60 units. These findings need confirming in different ethnic groups and by using more accurate methods to measure eGFRs. Nevertheless, they suggest that improving kidney function across the board is unlikely to have much effect on the overall incidence of CHD. Instead, they suggest that targeting cardioprotective interventions at the one in ten adults in Western countries whose eGFR is below 60 units might be a good way to reduce the burden of CHD.
Additional Information.
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.0040270.
MedlinePlus encyclopedia pages on coronary heart disease, chronic kidney failure, and end-stage kidney disease (in English and Spanish).
Information for patients and carers from the American Heart Association on all aspects of heart disease, including prevention of CHD
Information from the British Heart Foundation on heart disease and on keeping the heart healthy
Information on chronic kidney disease from the US National Kidney Foundation, and the US National Kidney and Urologic Diseases Information Clearing House (in English and Spanish)
Information on chronic kidney disease from the UK National Kidney Foundation
doi:10.1371/journal.pmed.0040270
PMCID: PMC1961630  PMID: 17803353
10.  Kidney and liver organ transplantation in persons with human immunodeficiency virus 
Executive Summary
Objective
The objective of this analysis is to determine the effectiveness of solid organ transplantation in persons with end stage organ failure (ESOF) and human immunodeficiency virus (HIV+).
Clinical Need: Condition and Target Population
Patients with end stage organ failure who have been unresponsive to other forms of treatment eventually require solid organ transplantation. Similar to persons who are HIV negative (HIV−), persons living with HIV infection (HIV+) are at risk for ESOF from viral (e.g. hepatitis B and C) and non-viral aetiologies (e.g. coronary artery disease, diabetes, hepatocellular carcinoma). Additionally, HIV+ persons also incur risks of ESOF from HIV-associated nephropathy (HIVAN), accelerated liver damage from hepatitis C virus (HCV+), with which an estimated 30% of HIV positive (HIV+) persons are co-infected, and coronary artery disease secondary to antiretroviral therapy. Concerns that the need for post transplant immunosuppression and/or the interaction of immunosuppressive drugs with antiretroviral agents may accelerate the progression of HIV disease, as well as the risk of opportunistic infections post transplantation, have led to uncertainty regarding the overall benefit of transplantation among HIV+ patients. Moreover, the scarcity of donor organs and their use in a population where the clinical benefit of transplantation is uncertain has limited the availability of organ transplantation to persons living with ESOF and HIV.
With the development of highly active antiretroviral therapy (HAART), which has been available in Canada since 1997, there has been improved survival and health-related quality of life for persons living with HIV. HAART can suppress HIV replication, enhance immune function, and slow disease progression. HAART-managed persons can now be expected to live longer than those in the pre-HAART era and as a result many will now experience ESOF well before they experience life-threatening conditions related to HIV infection. Given their improved prognosis and the burden of illness they may experience from ESOF, the benefit of solid organ transplantation for HIV+ patients needs to be reassessed.
Evidence-Based Analysis Methods
Research Questions
What are the effectiveness and cost effectiveness of solid organ transplantation in HIV+ persons with ESOF?
Literature Search
A literature search was performed on September 22, 2009 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 1996 to September 22, 2009.
Inclusion Criteria
Systematic review with or without a meta-analysis, RCT, non-RCT with controls
HIV+ population undergoing solid organ transplantation
HIV+ population managed with HAART therapy
Controls include persons undergoing solid organ transplantation who are i) HIV−, ii) HCV+ mono-infected, and iii) HIV+ persons with ESOF not transplanted.
Studies that completed and reported results of a Kaplan-Meier Survival Curve analysis.
Studies with a minimum (mean or median) follow-up of 1 year.
English language citations
Exclusion Criteria
Case reports and case series were excluded from this review.
Outcomes of Interest
i) Risk of Death after transplantation
ii) Death censored graft survival (DCGS)
iii) HIV disease progression defined as the post transplant incidence of:
- opportunistic infections or neoplasms,
- CD4+ T-cell count < 200 cells/mm3, and
- any detectable level of plasma HIV viral load.
iv) Acute graft rejection,
v) Return to dialysis,
vi) Recurrence of HCV infection
Summary of Findings
No direct evidence comparing an HIV+ cohort undergoing transplantation with the same not undergoing transplantation (wait list) was found in the literature search.
The results of this review are reported for the following comparison cohorts undergoing transplantation:
i) Kidney Transplantation: HIV+ cohort compared with HIV− cohort
ii) Liver Transplantation: HIV+ cohort compared with HIV− cohort
iii) Liver Transplantation: HIV+ HCV+ (co-infected) cohort compared with HCV+ (mono-infected) cohort
Kidney Transplantation: HIV+ vs. HIV−
Based on a pooled HIV+ cohort sample size of 285 patients across four studies, the risk of death after kidney transplantation in an HIV+ cohort does not differ from that of an HIV− cohort [hazard ratio (HR): 0.90; 95% CI: 0.36, 2.23]. The quality of evidence supporting this outcome is very low.
Death censored graft survival was reported in one study with an HIV+ cohort sample size of 100, and was statistically significantly different (p=.03) from that in the HIV− cohort (n=36,492). However, the quality of evidence supporting this outcome was determined to be very low. There was also uncertainty in the rate of return to dialysis after kidney transplantation in both the HIV+ and HIV− groups and the effect, if any, this may have on patient survival. Because of the very low quality evidence rating, the effect of kidney transplantation on HIV-disease progression is uncertain.
The rate of acute graft rejection was determined using the data from one study. There was a nonsignificant difference between the HIV+ and HIV− cohorts (OR 0.13; 95% CI: 0.01, 2.64), although again, because of very low quality evidence there is uncertainty in this estimate of effect.
Liver Transplantation: HIV+ vs. HIV−
Based on a combined HIV+ cohort sample size of 198 patients across five studies, the risk of death after liver transplantation in an HIV+ cohort (with at least 50% of the cohort co-infected with HCV+) is statistically significantly greater, by 64%, than in an HIV− cohort (HR: 1.64; 95% CI: 1.32, 2.02). The quality of evidence supporting this outcome is very low.
Death censored graft survival was reported for an HIV+ cohort in one study (n=11); however, the DCGS rate of the contemporaneous control HIV− cohort was not reported. Because of sparse data, the quality of evidence supporting this outcome is very low, indicating death censored graft survival is uncertain.
Both the CD4+ T-cell count and HIV viral load appear controlled post transplant, with an incidence of opportunistic infection of 20.5%. However, the quality of this evidence for these outcomes is very low, indicating uncertainty in these effects. Similarly, because of very low quality evidence there is uncertainty in the rate of acute graft rejection among both the HIV+ and HIV− groups.
Liver Transplantation: HIV+/HCV+ vs. HCV+
Based on a combined HIV+/HCV+ cohort sample size of 156 from seven studies, the risk of death after liver transplantation is significantly greater (2.8-fold) in a co-infected cohort compared with an HCV+ mono-infected cohort (HR: 2.81; 95% CI: 1.47, 5.37). The quality of evidence supporting this outcome is very low. Death-censored graft survival evidence was not available.
Regarding disease progression, based on a combined sample size of 71 persons in the co-infected cohort, the CD4+ T-cell count and HIV viral load appear controlled post transplant; however, again the quality of evidence supporting this outcome is very low. The rate of opportunistic infection in the co-infected cohort was 7.2%. The quality of evidence supporting this estimate is very low, indicating uncertainty in these estimates of effect.
Based on a combined HIV+/HCV+ cohort (n=57), the rate of acute graft rejection does not differ from that of an HCV+ mono-infected cohort (OR: 0.88; 95% CI: 0.44, 1.76). Also based on a combined HIV+/HCV+ cohort (n=83), the rate of HCV+ recurrence does not differ from that of an HCV+ mono-infected cohort (OR: 0.66; 95% CI: 0.27, 1.59). In both cases, the quality of the supporting evidence was very low.
Overall, because of very low quality evidence, there is uncertainty in the effect of kidney or liver transplantation in HIV+ persons with end-stage organ failure compared with those not infected with HIV. Examining the economics of this issue, the costs of kidney and liver transplants in an HIV+ patient population are, on average, $56,000 and $147,000 per case, respectively, based on both Canadian and American experiences.
PMCID: PMC3377507  PMID: 23074407
11.  Cultural adaptation and validation of the “Kidney Disease and Quality of Life - Short Form (KDQOL-SF™) version 1.3” questionnaire in Egypt 
BMC Nephrology  2012;13:170.
Background
Health Related Quality of Life (HRQOL) instruments need disease- and country-specific validation. In Arab countries, there is no specific validated questionnaire for assessment of HRQOL in chronic kidney disease (CKD) patients. The aim of this study was to present an Arabic translation, adaptation, and the subsequent validation of the Kidney Disease Quality of Life - Short Form (KDQOL-SF™) version 1.3 questionnaire in a representative series of Egyptian CKD patients.
Methods
KDQOL-SF™ version 1.3 was translated into Arabic by two independent translators, and then subsequently translated back into English. After translation disparities were reconciled, the final Arabic questionnaire was tested by interviewing 100 pre-dialysis CKD (stage 1-4) patients randomly selected from outpatients attending the Nephrology clinic at the Main Alexandria University Hospital. Test-retest reliability was assessed, in a subsample of 50 consecutive CKD patients, by two interviews 7 days apart, and internal consistency was estimated by Cronbach’s α. Discriminant, concept, and construct validity were assessed.
Results
All items of SF-36 met the criterion for internal consistency and were reproducible. Of the 10 kidney disease targeted scales, only three had Cronbach’s α <0.7: quality of social interaction (0.23), work status (0.28), and cognitive function (0.60). All disease specific scales were reproducible. Results from discriminant validity showed that the study questionnaire could discriminate between patients’ subgroups. As for concept validity, the correlation between all domains of the questionnaire with overall health rate was significant for all domains except for work status, sexual function, emotional wellbeing, and role emotional. Furthermore, the correlation between the disease specific domains and the two composite summaries of SF-36 (physical and mental composite summaries) was significant for all domains except for sexual function with the mental composite summary. Construct validity was indicated by the observation that the majority of the domains of the kidney disease targeted scale of KDQOL-SF™ 1.3 were significantly inter-correlated. Finally, principal component analysis of the kidney disease targeted scale indicated that this part of the questionnaire could be summarized into 10 factors that together explained 70.9% of the variance.
Conclusion
The results suggest that this Arabic version of the KDQOL-SF™ 1.3 questionnaire is a valid and reliable tool for use in Egyptian patients with CKD.
doi:10.1186/1471-2369-13-170
PMCID: PMC3583144  PMID: 23237591
Chronic kidney disease; Egypt; Health-related quality of life; KDQOL-SF™ 1.3; Questionnaire validation
12.  Clinical Utility of Vitamin D Testing 
Executive Summary
This report from the Medical Advisory Secretariat (MAS) was intended to evaluate the clinical utility of vitamin D testing in average risk Canadians and in those with kidney disease. As a separate analysis, this report also includes a systematic literature review of the prevalence of vitamin D deficiency in these two subgroups.
This evaluation did not set out to determine the serum vitamin D thresholds that might apply to non-bone health outcomes. For bone health outcomes, no high or moderate quality evidence could be found to support a target serum level above 50 nmol/L. Similarly, no high or moderate quality evidence could be found to support vitamin D’s effects in non-bone health outcomes, other than falls.
Vitamin D
Vitamin D is a lipid soluble vitamin that acts as a hormone. It stimulates intestinal calcium absorption and is important in maintaining adequate phosphate levels for bone mineralization, bone growth, and remodelling. It is also believed to be involved in the regulation of cell growth, proliferation, and apoptosis (programmed cell death), as well as modulation of the immune system and other functions. Alone or in combination with calcium, vitamin D has also been shown to reduce the risk of fractures in elderly men (≥ 65 years) and postmenopausal women, and the risk of falls in community-dwelling seniors. However, in a comprehensive systematic review, inconsistent results were found concerning the effects of vitamin D in conditions such as cancer, all-cause mortality, and cardiovascular disease. In fact, no high or moderate quality evidence could be found concerning the effects of vitamin D in such non-bone health outcomes. Given the uncertainties surrounding the effects of vitamin D in non-bone health related outcomes, it was decided that this evaluation should focus on falls and the effects of vitamin D on bone health, exclusively within average-risk individuals and patients with kidney disease.
Synthesis of vitamin D occurs naturally in the skin through exposure to ultraviolet B (UVB) radiation from sunlight, but it can also be obtained from dietary sources including fortified foods, and supplements. Foods rich in vitamin D include fatty fish, egg yolks, fish liver oil, and some types of mushrooms. Since it is usually difficult to obtain sufficient vitamin D from non-fortified foods, either due to low content or infrequent use, most vitamin D is obtained from fortified foods, exposure to sunlight, and supplements.
Clinical Need: Condition and Target Population
Vitamin D deficiency may lead to rickets in infants and osteomalacia in adults. Factors believed to be associated with vitamin D deficiency include:
darker skin pigmentation,
winter season,
living at higher latitudes,
skin coverage,
kidney disease,
malabsorption syndromes such as Crohn’s disease, cystic fibrosis, and
genetic factors.
Patients with chronic kidney disease (CKD) are at a higher risk of vitamin D deficiency due to either renal losses or decreased synthesis of 1,25-dihydroxyvitamin D.
Health Canada currently recommends that, until the daily recommended intakes (DRI) for vitamin D are updated, Canada’s Food Guide (Eating Well with Canada’s Food Guide) should be followed with respect to vitamin D intake. Issued in 2007, the Guide recommends that Canadians consume two cups (500 ml) of fortified milk or fortified soy beverages daily in order to obtain a daily intake of 200 IU. In addition, men and women over the age of 50 should take 400 IU of vitamin D supplements daily. Additional recommendations were made for breastfed infants.
A Canadian survey evaluated the median vitamin D intake derived from diet alone (excluding supplements) among 35,000 Canadians, 10,900 of whom were from Ontario. Among Ontarian males ages 9 and up, the median daily dietary vitamin D intake ranged between 196 IU and 272 IU per day. Among females, it varied from 152 IU to 196 IU per day. In boys and girls ages 1 to 3, the median daily dietary vitamin D intake was 248 IU, while among those 4 to 8 years it was 224 IU.
Vitamin D Testing
Two laboratory tests for vitamin D are available, 25-hydroxy vitamin D, referred to as 25(OH)D, and 1,25-dihydroxyvitamin D. Vitamin D status is assessed by measuring the serum 25(OH)D levels, which can be assayed using radioimmunoassays, competitive protein-binding assays (CPBA), high pressure liquid chromatography (HPLC), and liquid chromatography-tandem mass spectrometry (LC-MS/MS). These may yield different results with inter-assay variation reaching up to 25% (at lower serum levels) and intra-assay variation reaching 10%.
The optimal serum concentration of vitamin D has not been established and it may change across different stages of life. Similarly, there is currently no consensus on target serum vitamin D levels. There does, however, appear to be a consensus on the definition of vitamin D deficiency at 25(OH)D < 25 nmol/L, which is based on the risk of diseases such as rickets and osteomalacia. Higher target serum levels have also been proposed based on subclinical endpoints such as parathyroid hormone (PTH). Therefore, in this report, two conservative target serum levels have been adopted: 25 nmol/L (based on the risk of rickets and osteomalacia), and 40 to 50 nmol/L (based on vitamin D’s interaction with PTH).
Ontario Context
Volume & Cost
The volume of vitamin D tests done in Ontario has been increasing over the past 5 years, with a steep increase from 169,000 tests in 2007 to more than 393,400 tests in 2008. The number of tests continues to rise, with the projected number of tests for 2009 exceeding 731,000. According to the Ontario Schedule of Benefits, the billing cost of each test is $51.70 for 25(OH)D (L606, 100 LMS units, $0.517/unit) and $77.55 for 1,25-dihydroxyvitamin D (L605, 150 LMS units, $0.517/unit). Province wide, the total annual cost of vitamin D testing has increased from approximately $1.7M in 2004 to over $21.0M in 2008. The projected annual cost for 2009 is approximately $38.8M.
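The per-test billing figures follow directly from the Schedule of Benefits unit values quoted above (units × $0.517/unit). A minimal arithmetic check; the "all tests are 25(OH)D" simplification in the projection line is an assumption for illustration, not a figure from the report:

```python
# Billing arithmetic from the Ontario Schedule of Benefits figures quoted above.
# Work in tenths of cents (517 per LMS unit) to keep the arithmetic exact.
UNIT_TENTH_CENTS = 517  # $0.517 per LMS unit

def billing_cost(lms_units):
    """Cost of a test in dollars, given its LMS unit value."""
    return lms_units * UNIT_TENTH_CENTS / 1000

cost_25ohd = billing_cost(100)   # L606: 25(OH)D
cost_125d  = billing_cost(150)   # L605: 1,25-dihydroxyvitamin D

# Rough 2009 projection, ASSUMING every projected test is a 25(OH)D test
# (the report's $38.8M figure will differ because the test mix differs).
projected_2009 = 731_000 * cost_25ohd
```

The assumption-laden projection lands near $37.8M, in the same range as the report's $38.8M projection.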
Evidence-Based Analysis
The objective of this report is to evaluate the clinical utility of vitamin D testing in the average risk population and in those with kidney disease. As a separate analysis, the report also sought to evaluate the prevalence of vitamin D deficiency in Canada. The specific research questions addressed were thus:
What is the clinical utility of vitamin D testing in the average risk population and in subjects with kidney disease?
What is the prevalence of vitamin D deficiency in the average risk population in Canada?
What is the prevalence of vitamin D deficiency in patients with kidney disease in Canada?
Clinical utility was defined as the ability to improve bone health outcomes with the focus on the average risk population (excluding those with osteoporosis) and patients with kidney disease.
Literature Search
A literature search was performed on July 17th, 2009 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, EMBASE, the Cumulative Index to Nursing & Allied Health Literature (CINAHL), the Cochrane Library, and the International Agency for Health Technology Assessment (INAHTA) for studies published from January 1, 1998 until July 17th, 2009. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search. Articles with unknown eligibility were reviewed with a second clinical epidemiologist, then a group of epidemiologists until consensus was established. The quality of evidence was assessed as high, moderate, low or very low according to GRADE methodology.
Observational studies that evaluated the prevalence of vitamin D deficiency in Canada in the population of interest were included based on the inclusion and exclusion criteria listed below. The baseline values were used in this report in the case of interventional studies that evaluated the effect of vitamin D intake on serum levels. Studies published in grey literature were included if no studies published in the peer-reviewed literature were identified for specific outcomes or subgroups.
Considering that vitamin D status may be affected by factors such as latitude, sun exposure, food fortification, among others, the search focused on prevalence studies published in Canada. In cases where no Canadian prevalence studies were identified, the decision was made to include studies from the United States, given the similar policies in vitamin D food fortification and recommended daily intake.
Inclusion Criteria
Studies published in English
Publications that reported the prevalence of vitamin D deficiency in Canada
Studies that included subjects from the general population or with kidney disease
Studies in children or adults
Studies published between January 1998 and July 17th 2009
Exclusion Criteria
Studies that included subjects defined according to a specific disease other than kidney disease
Letters, comments, and editorials
Studies that measured the serum vitamin D levels but did not report the percentage of subjects with serum levels below a given threshold
Outcomes of Interest
Prevalence of serum vitamin D less than 25 nmol/L
Prevalence of serum vitamin D less than 40 to 50 nmol/L
Serum 25-hydroxyvitamin D was the metabolite used to assess vitamin D status. Results from adult and children studies were reported separately. Subgroup analyses according to factors that affect serum vitamin D levels (e.g., seasonal effects, skin pigmentation, and vitamin D intake) were reported if enough information was provided in the studies.
Quality of Evidence
The quality of the prevalence studies was based on the method of subject recruitment and sampling, possibility of selection bias, and generalizability to the source population. The overall quality of the trials was examined according to the GRADE Working Group criteria.
Summary of Findings
Fourteen prevalence studies examining Canadian adults and children met the eligibility criteria. With the exception of one longitudinal study, the studies had a cross-sectional design. Two studies were conducted among Canadian adults with renal disease but none studied Canadian children with renal disease (though three such US studies were included). No systematic reviews or health technology assessments that evaluated the prevalence of vitamin D deficiency in Canada were identified. Two studies were published in grey literature, consisting of a Canadian survey designed to measure serum vitamin D levels and a study in infants presented as an abstract at a conference. Also included were the results of vitamin D tests performed in community laboratories in Ontario between October 2008 and September 2009 (provided by the Ontario Association of Medical Laboratories).
Different threshold levels were used in the studies; thus, we reported the percentage of subjects with serum levels between 25 and 30 nmol/L and between 37.5 and 50 nmol/L. Some studies stratified the results according to factors affecting vitamin D status, and two used multivariate models to investigate the effects of these characteristics (including age, season, BMI, vitamin D intake, and skin pigmentation) on serum 25(OH)D levels. It is unclear, however, if these studies were adequately powered for these subgroup analyses.
Study participants generally consisted of healthy, community-dwelling subjects and most excluded individuals with conditions or medications that alter vitamin D or bone metabolism, such as kidney or liver disease. Although the studies were conducted in different parts of Canada, fewer were performed in Northern latitudes, i.e. above 53°N, which is equivalent to the city of Edmonton.
Adults
Serum vitamin D levels of < 25 to 30 nmol/L were observed in 0% to 25.5% of the subjects included in five studies; the weighted average was 3.8% (95% CI: 3.0, 4.6). The preliminary results of the Canadian survey showed that approximately 5% of the subjects had serum levels below 29.5 nmol/L. The results of over 600,000 vitamin D tests performed in Ontarian community laboratories between October 2008 and September 2009 showed that 2.6% of adults (> 18 years) had serum levels < 25 nmol/L.
The prevalence of serum vitamin D levels below 37.5-50 nmol/L reported among studies varied widely, ranging from 8% to 73.6% with a weighted average of 22.5%. The preliminary results of the CHMS survey showed that between 10% and 25% of subjects had serum levels below 37 to 48 nmol/L. The results of the vitamin D tests performed in community laboratories showed that 10% to 25% of the individuals had serum levels between 39 and 50 nmol/L.
In an attempt to explain this inter-study variation, the study results were stratified according to factors affecting serum vitamin D levels, as summarized below. These results should be interpreted with caution as none were adjusted for other potential confounders. Adequately powered multivariate analyses would be necessary to determine the contribution of risk factors to lower serum 25(OH)D levels.
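The weighted averages reported throughout this summary are prevalence estimates weighted by study sample size. A minimal sketch with hypothetical study data (the report's actual per-study sample sizes and prevalences are not reproduced here):

```python
# Hypothetical (sample size, prevalence %) pairs for five studies -- illustrative only.
studies = [(120, 2.0), (300, 4.5), (80, 25.5), (450, 0.0), (200, 3.1)]

total_n = sum(n for n, _ in studies)
# Sample-size-weighted average prevalence across studies.
weighted_avg = sum(n * p for n, p in studies) / total_n
```

Note how one small study with a 25.5% prevalence barely moves the weighted average when the larger studies report low prevalences, which is how a 0% to 25.5% range can still yield a weighted average of only 3.8%.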
Seasonal variation
Three adult studies evaluating serum vitamin D levels in different seasons observed a trend towards a higher prevalence of serum levels < 37.5 to 50 nmol/L during the winter and spring months, specifically 21% to 39%, compared to 8% to 14% in the summer. The weighted average was 23.6% over the winter/spring months and 9.6% over summer. The difference between the seasons was not statistically significant in one study and not reported in the other two studies.
Skin Pigmentation
Four studies observed a trend toward a higher prevalence of serum vitamin D levels < 37.5 to 50 nmol/L in subjects with darker skin pigmentation compared to those with lighter skin pigmentation, with weighted averages of 46.8% among adults with darker skin colour and 15.9% among those with fairer skin.
Vitamin D intake and serum levels
Four adult studies evaluated serum vitamin D levels according to vitamin D intake and showed an overall trend toward a lower prevalence of serum levels < 37.5 to 50 nmol/L with higher levels of vitamin D intake. One study observed a dose-response relationship between higher vitamin D intake from supplements, diet (milk), and sun exposure (results not adjusted for other variables). It was observed that subjects taking 50 to 400 IU or > 400 IU of vitamin D per day had a 6% and 3% prevalence of serum vitamin D level < 40 nmol/L, respectively, versus 29% in subjects not on vitamin D supplementation. Similarly, among subjects drinking one or two glasses of milk per day, the prevalence of serum vitamin D levels < 40 nmol/L was found to be 15%, versus 6% in those who drink more than two glasses of milk per day and 21% among those who do not drink milk. On the other hand, one study observed little variation in serum vitamin D levels during winter according to milk intake, with the proportion of subjects exhibiting vitamin D levels of < 40 nmol/L being 21% among those drinking 0-2 glasses per day, 26% among those drinking > 2 glasses, and 20% among non-milk drinkers.
The overall quality of evidence for the studies conducted among adults was deemed to be low, although it was considered moderate for the subgroups of skin pigmentation and seasonal variation.
Newborn, Children and Adolescents
Five Canadian studies evaluated serum vitamin D levels in newborns, children, and adolescents. In four of these, it was found that between 0% and 36% of children exhibited deficiency across age groups, with a weighted average of 6.4%. The results of over 28,000 vitamin D tests performed in children 0 to 18 years old in Ontario laboratories (Oct. 2008 to Sept. 2009) showed that 4.4% had serum levels of < 25 nmol/L.
According to two studies, 32% of infants 24 to 30 months old and 35.3% of newborns had serum vitamin D levels of < 50 nmol/L. Two studies of children 2 to 16 years old reported that 24.5% and 34% had serum vitamin D levels below 37.5 to 40 nmol/L. In both studies, older children exhibited a higher prevalence than younger children, with weighted averages 34.4% and 10.3%, respectively. The overall weighted average of the prevalence of serum vitamin D levels < 37.5 to 50 nmol/L among pediatric studies was 25.8%. The preliminary results of the Canadian survey showed that between 10% and 25% of subjects between 6 and 11 years (N= 435) had serum levels below 50 nmol/L, while for those 12 to 19 years, 25% to 50% exhibited serum vitamin D levels below 50 nmol/L.
The effects of season, skin pigmentation, and vitamin D intake were not explored in Canadian pediatric studies. A Canadian surveillance study did, however, report 104 confirmed cases (2.9 cases per 100,000 children) of vitamin D-deficient rickets among Canadian children age 1 to 18 between 2002 and 2004, of which 57 (55%) were from Ontario. The highest incidence occurred among children living in the North, i.e., the Yukon, Northwest Territories, and Nunavut. In 92 (89%) cases, skin pigmentation was categorized as intermediate to dark, 98 (94%) had been breastfed, and 25 (24%) were offspring of immigrants to Canada. There were no cases of rickets in children receiving ≥ 400 IU of vitamin D supplementation per day.
Overall, the quality of evidence of the studies of children was considered very low.
Kidney Disease
Adults
Two studies evaluated serum vitamin D levels in Canadian adults with kidney disease. The first included 128 patients with chronic kidney disease stages 3 to 5, 38% of whom had serum vitamin D levels of < 37.5 nmol/L (measured between April and July). This is higher than what was reported in Canadian studies of the general population during the summer months (i.e., between 8% and 14%). In the second, which examined 419 subjects who had received a renal transplant (mean time since transplantation: 7.2 ± 6.4 years), the prevalence of serum vitamin D levels < 40 nmol/L was 27.3%. The authors concluded that the prevalence observed in the study population was similar to what is expected in the general population.
Children
No studies evaluating serum vitamin D levels in Canadian pediatric patients with kidney disease could be identified, although three such US studies among children with chronic kidney disease stages 1 to 5 were included. The mean age varied between 10.7 and 12.5 years in two studies but was not reported in the third. Across all three studies, the prevalence of serum vitamin D levels below the range of 37.5 to 50 nmol/L varied between 21% and 39%, which is not considerably different from what was observed in studies of healthy Canadian children (24% to 35%).
Overall, the quality of evidence in adults and children with kidney disease was considered very low.
Clinical Utility of Vitamin D Testing
A high quality comprehensive systematic review published in August 2007 evaluated the association between serum vitamin D levels and different bone health outcomes in different age groups. A total of 72 studies were included. The authors observed that there was a trend towards improvement in some bone health outcomes with higher serum vitamin D levels. Nevertheless, precise thresholds for improved bone health outcomes could not be defined across age groups. Further, no new studies on the association were identified during an updated systematic review on vitamin D published in July 2009.
With regard to non-bone health outcomes, there is no high or even moderate quality evidence that supports the effectiveness of vitamin D in outcomes such as cancer, cardiovascular outcomes, and all-cause mortality. Even if there is any residual uncertainty, there is no evidence that testing vitamin D levels encourages adherence to Health Canada’s guidelines for vitamin D intake. A normal serum vitamin D threshold required to prevent non-bone health related conditions cannot be resolved until a causal effect or correlation has been demonstrated between vitamin D levels and these conditions. This is an ongoing research issue around which there is currently too much uncertainty to base any conclusions that would support routine vitamin D testing.
For patients with chronic kidney disease (CKD), there is again no high or moderate quality evidence supporting improved outcomes through the use of calcitriol or vitamin D analogs. In the absence of such data, the authors of the guidelines for CKD patients consider it best practice to maintain serum calcium and phosphate at normal levels, while supplementation with active vitamin D should be considered if serum PTH levels are elevated. As previously stated, the authors of guidelines for CKD patients believe that there is not enough evidence to support routine vitamin D [25(OH)D] testing. According to what is stated in the guidelines, decisions regarding the commencement or discontinuation of treatment with calcitriol or vitamin D analogs should be based on serum PTH, calcium, and phosphate levels.
Limitations associated with the evidence for vitamin D testing include ambiguities in the definition of an ‘adequate threshold level’ and both inter- and intra-assay variability. The MAS considers that both the lack of consensus on target serum vitamin D levels and the assay limitations directly affect and undermine the clinical utility of testing. The evidence supporting the clinical utility of vitamin D testing is thus considered to be of very low quality.
Daily vitamin D intake, either through diet or supplementation, should follow Health Canada’s recommendations for healthy individuals of different age groups. For those with medical conditions such as renal disease, liver disease, and malabsorption syndromes, and for those taking medications that may affect vitamin D absorption/metabolism, physician guidance should be followed with respect to both vitamin D testing and supplementation.
Conclusions
Studies indicate that vitamin D, alone or in combination with calcium, may decrease the risk of fractures and falls among older adults.
There is no high or moderate quality evidence to support the effectiveness of vitamin D in other outcomes such as cancer, cardiovascular outcomes, and all-cause mortality.
Studies suggest that the prevalence of vitamin D deficiency in Canadian adults and children is relatively low (approximately 5%), and between 10% and 25% have serum levels below 40 to 50 nmol/L (based on very low to low grade evidence).
Given the limitations associated with serum vitamin D measurement, ambiguities in the definition of a ‘target serum level’, and the availability of clear guidelines on vitamin D supplementation from Health Canada, vitamin D testing is not warranted for the average risk population.
Health Canada has issued recommendations regarding the adequate daily intake of vitamin D, but current studies suggest that the mean dietary intake is below these recommendations. Accordingly, Health Canada’s guidelines and recommendations should be promoted.
Based on a moderate level of evidence, individuals with darker skin pigmentation appear to have a higher risk of low serum vitamin D levels than those with lighter skin pigmentation and therefore may need to be specially targeted with respect to optimum vitamin D intake. The cause-effect of this association is currently unclear.
Individuals with medical conditions such as renal and liver disease, osteoporosis, and malabsorption syndromes, as well as those taking medications that may affect vitamin D absorption/metabolism, should follow their physician’s guidance concerning both vitamin D testing and supplementation.
PMCID: PMC3377517  PMID: 23074397
13.  The kSORT Assay to Detect Renal Transplant Patients at High Risk for Acute Rejection: Results of the Multicenter AART Study 
PLoS Medicine  2014;11(11):e1001759.
Minnie Sarwal and colleagues developed a gene expression assay using peripheral blood samples to detect patients with renal transplant at high risk for acute rejection.
Please see later in the article for the Editors' Summary
Background
Development of noninvasive molecular assays to improve disease diagnosis and patient monitoring is a critical need. In renal transplantation, acute rejection (AR) increases the risk for chronic graft injury and failure. Noninvasive diagnostic assays to improve current late and nonspecific diagnosis of rejection are needed. We sought to develop a test using a simple blood gene expression assay to detect patients at high risk for AR.
Methods and Findings
We developed a novel correlation-based algorithm by step-wise analysis of gene expression data in 558 blood samples from 436 renal transplant patients collected across eight transplant centers in the US, Mexico, and Spain between 5 February 2005 and 15 December 2012 in the Assessment of Acute Rejection in Renal Transplantation (AART) study. Gene expression was assessed by quantitative real-time PCR (QPCR) in one center. A 17-gene set—the Kidney Solid Organ Response Test (kSORT)—was selected in 143 samples for AR classification using discriminant analysis (area under the receiver operating characteristic curve [AUC] = 0.94; 95% CI 0.91–0.98), validated in 124 independent samples (AUC = 0.95; 95% CI 0.88–1.0) and evaluated for AR prediction in 191 serial samples, where it predicted AR up to 3 mo prior to detection by the current gold standard (biopsy). A novel reference-based algorithm (using 13 12-gene models) was developed in 100 independent samples to provide a numerical AR risk score, to classify patients as high risk versus low risk for AR. kSORT was able to detect AR in blood independent of age, time post-transplantation, and sample source without additional data normalization; AUC = 0.93 (95% CI 0.86–0.99). Further validation of kSORT is planned in prospective clinical observational and interventional trials.
Conclusions
The kSORT blood QPCR assay is a noninvasive tool to detect high risk of AR of renal transplants.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Throughout life, the kidneys filter waste products (from the normal breakdown of tissues and food) and excess water from the blood to make urine. If the kidneys stop working for any reason, the rate at which the blood is filtered decreases, and dangerous amounts of creatinine and other waste products build up in the blood. The kidneys can fail suddenly (acute kidney failure) because of injury or poisoning, but usually failing kidneys stop working gradually over many years (chronic kidney disease). Chronic kidney disease is very common, especially in people who have high blood pressure or diabetes and in elderly people. In the UK, for example, about 20% of people aged 65–74 years have some degree of chronic kidney disease. People whose kidneys fail completely (end-stage kidney disease) need regular dialysis (hemodialysis, in which blood is filtered by an external machine, or peritoneal dialysis, which uses blood vessels in the abdominal lining to do the work of the kidneys) or a renal transplant (the surgical transfer of a healthy kidney from another person into the patient's body) to keep them alive.
Why Was This Study Done?
Our immune system protects us from pathogens (disease-causing organisms) by recognizing specific molecules (antigens) on the invader's surface as foreign and initiating a sequence of events that kills the invader. Unfortunately, the immune system sometimes recognizes kidney transplants as foreign and triggers transplant rejection. The chances of rejection can be minimized by “matching” the antigens on the donated kidney to those on the tissues of the kidney recipient and by giving the recipient immunosuppressive drugs. However, acute rejection (rejection during the first year after transplantation) affects about 20% of kidney transplants. Acute rejection needs to be detected quickly and treated with a short course of more powerful immunosuppressants because it increases the risk of transplant failure. The current “gold standard” method for detecting acute rejection, applied when the level of creatinine in the patient's blood begins to rise, is to surgically remove a small piece (biopsy) of the transplanted kidney for analysis. However, other conditions can change creatinine levels, acute rejection can occur without creatinine levels changing (subclinical acute rejection), and biopsies are invasive. Here, the researchers develop a noninvasive test for acute kidney rejection called the Kidney Solid Organ Response Test (kSORT) based on gene expression levels in the blood.
What Did the Researchers Do and Find?
For the Assessment of Acute Rejection in Renal Transplantation (AART) study, the researchers used an assay called quantitative polymerase chain reaction (QPCR) to measure the expression of 43 genes whose expression levels change during acute kidney rejection in blood samples collected from patients who had had a kidney transplant. Using a training set of 143 samples and statistical analyses, the researchers identified a 17-gene set (kSORT) that discriminated between patients with and without acute rejection detected by kidney biopsy. The 17-gene set correctly identified 39 of the samples taken from 47 patients with acute rejection as being from patients with acute rejection, and 87 of 96 samples from patients without acute rejection as being from patients without acute rejection. The researchers validated the gene set using 124 independent samples. Then, using 191 serial samples, they showed that the gene set was able to predict acute rejection up to three months before detection by biopsy. Finally, the researchers used 100 blood samples to develop an algorithm (a step-wise calculation) to classify patients as being at high or low risk of acute rejection.
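The training-set performance quoted above reduces to two simple proportions, sensitivity and specificity. A minimal sketch (the helper is ours; the counts are taken from the abstract):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    return tp / (tp + fn), tn / (tn + fp)

# 39 of 47 acute-rejection samples and 87 of 96 non-rejection samples
# were classified correctly by the 17-gene set in the training phase.
sens, spec = sens_spec(tp=39, fn=8, tn=87, fp=9)
print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
```

This works out to a sensitivity of about 83% and a specificity of about 91% against the biopsy reference standard.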
What Do These Findings Mean?
These findings describe the early development of a noninvasive tool (kSORT) that might, eventually, help clinicians identify patients at risk of acute rejection after kidney transplantation. kSORT needs to be tested in more patients before being used clinically, however, to validate its predictive ability, particularly given that the current gold standard test against which it was compared (biopsy) is far from perfect. An additional limitation of kSORT is that it did not discriminate between cell-mediated and antibody-mediated immune rejection. These two types of immune rejection are treated in different ways, so clinicians ideally need a test for acute rejection that indicates which form of immune rejection is involved. The authors are conducting a follow-up study to help determine whether kSORT can be used in clinical practice to identify acute rejection and to identify which patients are at greatest risk of transplant rejection and may require biopsy.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001759.
The US National Kidney and Urologic Diseases Information Clearinghouse provides links to information about all aspects of kidney disease; the US National Kidney Disease Education Program provides resources to help improve the understanding, detection, and management of kidney disease (in English and Spanish)
The UK National Health Service Choices website provides information for patients on chronic kidney disease and about kidney transplants, including some personal stories
The US National Kidney Foundation, a not-for-profit organization, provides information about chronic kidney disease and about kidney transplantation (in English and Spanish)
The not-for-profit UK National Kidney Federation provides support and information for patients with kidney disease and for their carers, including information and personal stories about kidney donation and transplantation
World Kidney Day, a joint initiative between the International Society of Nephrology and the International Federation of Kidney Foundations, aims to raise awareness about kidneys and kidney disease
MedlinePlus provides links to additional resources about kidney diseases, kidney failure, and kidney transplantation; the MedlinePlus encyclopedia has a page about transplant rejection
doi:10.1371/journal.pmed.1001759
PMCID: PMC4227654  PMID: 25386950
14.  Anemia and chronic kidney disease are potential risk factors for mortality in stroke patients: a historic cohort study 
BMC Nephrology  2010;11:27.
Background
Chronic kidney disease (CKD) is associated with a higher stroke risk. Anemia is a common consequence of CKD and is also a possible risk factor for cerebrovascular disease. The purpose of this study was to examine whether anemia and CKD are independent risk factors for mortality after stroke.
Methods
This historic cohort study was based on a stroke registry and included patients treated for a first clinical stroke in the stroke unit of one academic hospital over a three-year period. Mortality predictors comprised demographic characteristics, CKD, glomerular filtration rate (GFR), anemia, and other stroke risk factors. GFR was estimated by means of the simplified Modification of Diet in Renal Disease formula. Renal function was classified into five groups according to the Kidney Disease Outcomes Quality Initiative (K/DOQI) CKD classification. Anemia was defined as a hemoglobin value < 120 g/L in women or < 130 g/L in men on admission. Kaplan-Meier survival curves and Cox models were used to describe and analyze one-year survival.
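For orientation, the simplified (four-variable) MDRD equation and the anemia cut-offs described above can be sketched as follows. The helper names are ours, and the MDRD coefficients shown are the commonly published 186-based form, not values taken from this paper:

```python
def egfr_mdrd(scr_mg_dl, age, female, black=False):
    """Simplified 4-variable MDRD estimate of GFR, in mL/min/1.73 m^2.

    Commonly published form: 186 x Scr^-1.154 x age^-0.203,
    x 0.742 if female, x 1.212 if Black (Scr in mg/dL).
    """
    egfr = 186.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def is_anemic(hb_g_per_l, female):
    """Study definition: Hb < 120 g/L in women, < 130 g/L in men, on admission."""
    return hb_g_per_l < (120.0 if female else 130.0)
```

For example, `egfr_mdrd(1.1, 70, female=True)` gives roughly 52 mL/min/1.73 m^2, in the range of the mean GFR reported for this cohort.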
Results
Among 890 adult stroke patients, the mean (Standard Deviation) calculated GFR was 64.3 (17.8) ml/min/1.73 m2 and 17% had anemia. Eighty-two (10%) patients died during the first year after discharge. Among those, 50 (61%) had K/DOQI CKD stages 3 to 5 and 32 (39%) stages 1 or 2 (p < 0.001). Anemia was associated with an increased risk of death one year after discharge (p < 0.001). After adjustment for other factors, a higher hemoglobin level was independently associated with decreased mortality one year after discharge [hazard ratio (95% CI) 0.98 (0.97-1.00)].
Conclusions
Both CKD and anemia are frequent among stroke patients and are potential risk factors for decreased one-year survival. The inclusion of only patients with a first-ever clinical stroke and the determination of anemia from a single measurement on admission are limitations to the external validity. Whether early detection and management of both CKD and anemia could improve survival in stroke patients should be investigated.
doi:10.1186/1471-2369-11-27
PMCID: PMC2973927  PMID: 20950484
15.  Measurement invariance of the kidney disease and quality of life instrument (KDQOL-SF) across Veterans and non-Veterans 
Background
Studies have demonstrated that the perceived health-related quality of life (HRQOL) of patients receiving hemodialysis is significantly impaired. Since HRQOL outcome data are often used to compare groups to determine health care effectiveness, it is imperative that measures of HRQOL are valid. However, valid HRQOL comparisons between groups can only be made if instrument invariance is demonstrated. The Kidney Disease Quality of Life-Short Form (KDQOL-SF) is a widely used HRQOL measure for patients with chronic kidney disease (CKD); however, it has not been validated in the Veteran population. Therefore, the purpose of this study was to examine the measurement invariance of the KDQOL-SF across Veterans and non-Veterans with CKD.
Methods
Data for this study were from two large prospective observational studies of patients receiving hemodialysis: 1) the Veteran End-Stage Renal Disease Study (VETERAN) (N = 314) and 2) the Dialysis Outcomes and Practice Patterns Study (DOPPS) (N = 3,300). Health-related quality of life was measured with the KDQOL-SF, which consists of the SF-36 and the Kidney Disease Component Summary (KDCS). Single-group confirmatory factor analysis (CFA) was used to evaluate the goodness-of-fit of the hypothesized measurement model for responses to the subscales of the KDCS and SF-36 instruments when analyzed together; given acceptable goodness-of-fit in each group, multigroup CFA was then used to compare the structure of this factor model in the two samples. The pattern of factor loadings (configural invariance), the magnitude of factor loadings (metric invariance), and the magnitude of item intercepts (scalar invariance) were assessed, as was the degree to which factors have the same variances, covariances, and means across groups (structural invariance).
Results
CFA demonstrated that the hypothesized two-factor model (KDCS and SF-36) fit the data of both the Veteran and DOPPS samples well, supporting configural invariance. Multigroup CFA results concerning metric and scalar invariance suggested partial strict invariance for the SF-36, but only weak invariance for the KDCS. Structural invariance was not supported.
Conclusions
Results suggest that Veterans may interpret the KDQOL-SF differently than non-Veterans. Further evaluation of measurement invariance of the KDQOL-SF between Veterans and non-Veterans is needed using large, randomly selected samples before comparisons between these two groups using the KDQOL-SF can be done reliably.
doi:10.1186/1477-7525-8-120
PMCID: PMC2984554  PMID: 20973987
16.  Bone Mineral Metabolism and Subsequent Hospitalization With Poor Quality of Life in Dialysis Patients 
Nephro-urology Monthly  2014;6(1):e14944.
Background:
Significant impairment in health-related quality of life (HRQOL) among dialysis patients can be partly explained by comorbid disorders such as chronic kidney disease-mineral and bone disorder (CKD-MBD). Disturbances in calcium and phosphorus metabolism also increase mortality and morbidity. Therefore, further efforts to treat these abnormalities may improve survival.
Objectives:
We designed a large multicenter population-based study in Iran to describe and assess the relation between HRQOL, hospitalization, and bone metabolism markers.
Patients and Methods:
We enrolled a total of 5,820 dialysis patients, who volunteered to participate, from 132 dialysis centers in different parts of the country between October 2010 and August 2011. The Iranian adapted version of the Kidney Disease Quality of Life-Short Form (KDQOL-SF™) version 1.3 questionnaire was used to assess health-related quality of life. Clinical and demographic characteristics were gathered from patients' data files.
Results:
The mean (SD) age of patients was 54.88 (16.36) years (range 2 to 99 years), and 43.1% were female. The scores for the kidney disease component summary (KDCS), physical component summary, mental component summary, and total quality of life were significantly higher in the lower quartile of corrected serum calcium and the higher quartile of serum parathyroid hormone (PTH) levels (P < 0.05). In a multilevel regression analysis, corrected serum calcium level was associated with total KDCS and Short Form health survey (SF-36) scores after adjusting for other variables, while hospitalization was directly correlated with serum phosphorus level and inversely correlated with dialysis duration and quality of life.
Conclusions:
In the current study, quality of life was correlated with serum calcium level, calcium-phosphate product, and serum PTH level, while hospitalization was correlated only with serum phosphorus level. However, quality of life was inversely correlated with hospitalization.
doi:10.5812/numonthly.14944
PMCID: PMC3968968  PMID: 24719817
Bone Density; Quality of Life; Renal Dialysis; Calcium; Phosphate
17.  Validation of the Kidney Disease Quality of Life-Short Form: a cross-sectional study of a dialysis-targeted health measure in Singapore 
BMC Nephrology  2010;11:36.
Background
In Singapore, the prevalence of end-stage renal disease (ESRD) and the number of people on dialysis is increasing. The impact of ESRD on patient quality of life has been recognized as an important outcome measure. The Kidney Disease Quality Of Life-Short Form (KDQOL-SF™) has been validated and is widely used as a measure of quality of life in dialysis patients in many countries, but not in Singapore. We aimed to determine the reliability and validity of the KDQOL-SF™ for haemodialysis patients in Singapore.
Methods
From December 2006 through January 2007, this cross-sectional study gathered data on patients ≥21 years old, who were undergoing haemodialysis at National Kidney Foundation in Singapore. We used exploratory factor analysis to determine construct validity of the eight KDQOL-SF™ sub-scales, Cronbach's alpha coefficient to determine internal consistency reliability, correlation of the overall health rating with kidney disease-targeted scales to confirm validity, and correlation of the eight sub-scales with age, income and education to determine convergent and divergent validity.
Results
Of 1980 haemodialysis patients, 1180 (59%) completed the KDQOL-SF™. Full information was available for 980 participants, with a mean age of 56 years. The sample was representative of the total dialysis population in Singapore, except that Indian ethnicity was over-represented. The eight sub-scales proposed by the instrument's designers were confirmed and together accounted for 68.4% of the variance. All sub-scales had a Cronbach's α above the recommended minimum of 0.7, indicating good reliability (range: 0.72 to 0.95), except Social function (0.66). Correlation of items within subscales was higher than correlation of items outside subscales in 90% of cases. The overall health rating correlated positively with the kidney disease-targeted scales, confirming validity. General health subscales had significant associations with age, income and education, confirming convergent and divergent validity.
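Cronbach's alpha, the reliability criterion used above, is a standard formula relating the sum of item variances to the variance of the total score. A minimal sketch (not the authors' code):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns of equal length.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    sum_item_var = sum(statistics.variance(col) for col in items)
    return k / (k - 1) * (1 - sum_item_var / statistics.variance(totals))
```

Perfectly redundant items (e.g. two identical columns) give an alpha of 1.0; values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency, as in the results above.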
Conclusions
The psychometric properties of the KDQOL-SF™ resulting from this first-time administration of the instrument support the validity and reliability of the KDQOL-SF™ as a measure of quality of life of haemodialysis patients in Singapore. It is, however, necessary to determine the test-retest reliability of the KDQOL-SF™ among the haemodialysis population of Singapore.
doi:10.1186/1471-2369-11-36
PMCID: PMC3014913  PMID: 21172008
18.  Diagnosis and Management of Chronic Kidney Disease in the Elderly: a Field of Ongoing Debate 
Aging and Disease  2012;3(5):360-372.
Chronic kidney disease (CKD) is rather common in elderly adults, who comprise the fastest growing subset of patients with end-stage renal disease (ESRD). At present, there are no specific guidelines and recommendations regarding early identification and management of elderly patients with CKD, and the current CKD classification system may overestimate its exact prevalence. Screening strategies based either on a more accurate formula for estimating GFR alone or, preferably, on its combination with proteinuria are urgently needed in order to raise awareness and to promote early diagnosis of CKD in the elderly. The number of elderly dialysis patients is also increasing and may lead to severe socio-economic problems worldwide. Both hemodialysis and peritoneal dialysis can sustain life, but each presents various disadvantages. There is a trend toward home-based dialysis therapies, but the results are based on a small number of patients. Recent reports indicate that dialysis may not provide a clear benefit over non-dialysis management regarding survival and quality of life, especially in the presence of extensive comorbidities. Current practices around the world regarding access to dialysis in the elderly are rather controversial, reflecting each country's health policies and ethical norms. Although advanced age should not be considered an absolute contraindication for kidney transplantation, it is not frequently offered to elderly ESRD patients because of the shortage of renal grafts. Global assessment of all physical and mental/psychological issues and full informed consent regarding possible complications are mandatory before listing elderly ESRD patients for kidney transplantation. As scientific evidence is rather scarce, there is an urgent need for prospective studies and an individualized approach to the diagnosis and treatment of elderly CKD patients, in order to optimize care and improve quality of life in this special population.
PMCID: PMC3501392  PMID: 23185717
Chronic kidney disease; elderly; hemodialysis; peritoneal dialysis; renal transplantation
19.  Current and emerging treatment options for the elderly patient with chronic kidney disease 
The objective of this article is to review the current and emerging treatments of CKD prior to dialysis in the elderly. Worldwide, there are increasing numbers of people aged over 65 years. In parallel, there are increasing numbers of elderly patients presenting with chronic kidney disease (CKD), particularly in the more advanced stages. The elderly have quite different health care needs related to their associated comorbidity, frailty, social isolation, poor functional status, and cognitive decline. Clinical trials assessing treatments for CKD have usually excluded patients older than 70–75 years; it is therefore difficult to translate therapies recommended for younger patients with CKD across to the elderly. Many elderly people with CKD progress to end-stage kidney disease and face the dilemma of whether to undertake dialysis or accept a conservative approach supported by palliative care. This places pressure on the patient, their family, and health care resources. The clinical trajectory of elderly CKD patients has in the past been unclear, but recent evidence suggests that many patients over 75 years of age with multiple comorbidities have greatly reduced life expectancy and quality of life, even if they choose dialysis treatment. Offering a conservative pathway supported by palliative care is a reasonable option for some patients under these circumstances. The elderly person who chooses to have dialysis will frequently have requirements different from those of younger patients. In carefully selected elderly patients, kidney transplantation can still result in improved life expectancy and quality of life. There is a genuine need for the inclusion of the elderly in future CKD clinical trials so that we can produce evidence-based therapies for this group. In addition, new therapies to treat and slow CKD progression are needed for all age groups.
doi:10.2147/CIA.S39763
PMCID: PMC3896291  PMID: 24477220
frailty; functional status; palliative care; elderly; dialysis; kidney transplantation
20.  Sleep Quality, Mood, Alertness and Their Variability in CKD and ESRD 
Nephron. Clinical Practice  2010;114(4):c277-c287.
Background/Aims
Little is known about the association of chronic kidney disease (CKD) with sleep quality, mood, and alertness. In this report, we assessed these symptoms among patients with advanced CKD (stages 4–5) and those with end-stage renal disease (ESRD) and compared them to healthy controls without known kidney disease.
Methods
Patients were recruited from local dialysis units, outpatient nephrology clinics and the Thomas E. Starzl Transplant Institute. Healthy control subjects matched for age, gender and race were drawn from an archival database. Daily symptoms of sleep quality, mood, and alertness were assessed by visual analogue scales of the Pittsburgh Sleep Diary. Health-related quality of life was assessed by the Short Form-36 instrument.
Results
Sixty-nine dialysis patients and 23 patients with advanced CKD demonstrated worse scores for sleep quality, mood, and alertness (p < 0.001) than controls. In adjusted analyses, European-American race, dialysis dependency, younger age, and the physical performance components of the SF-36 were significantly associated with poor sleep quality, mood, and alertness (p < 0.05). The dialysis population demonstrated higher day-to-day variability in scores than either the advanced CKD patients or the controls.
Conclusion
Advanced CKD and dialysis dependency are associated with impaired and highly variable sleep quality, mood, and alertness.
doi:10.1159/000276580
PMCID: PMC2865402  PMID: 20090370
Mood; Sleep quality; Alertness; Chronic kidney disease; End-stage renal disease; Pittsburgh Sleep Diary; SF-36
21.  An epidemiologic model to project the impact of changes in glomerular filtration rate on quality of life and survival among persons with chronic kidney disease 
Purpose
Predicting the timing and number of end-stage renal disease (ESRD) cases from a population of individuals with pre-ESRD chronic kidney disease (CKD) has not previously been reported. The objective is to predict the timing and number of cases of ESRD occurring over the lifetime of a cohort of hypothetical CKD patients in the US based on a range of baseline estimated glomerular filtration rate (eGFR) values and varying rates of eGFR decline.
Methods
A three-state Markov model – functioning kidney, ESRD, and death – with an annual cycle length is used to project changes in baseline eGFR on long-term health outcomes in a hypothetical cohort of CKD patients. Using published eGFR-specific risk equations and adjusting for predictive characteristics, the probability of ESRD (eGFR <10), time to death, and incremental cost-effectiveness ratios for hypothetical treatments (costing US$10, $5, and $2/day), are projected over the cohort’s lifetime under two scenarios: an acute drop in eGFR (mimicking acute kidney injury) and a reduced hazard ratio for ESRD (mimicking an effective intervention).
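The cohort projection described above can be sketched generically as a discrete-time Markov chain. The transition probabilities below are illustrative placeholders, not the paper's published eGFR-specific risk equations:

```python
# Illustrative annual transition probabilities between the three states
# (functioning kidney, ESRD, death); death is absorbing.
P = {
    "functioning": {"functioning": 0.90, "esrd": 0.04, "dead": 0.06},
    "esrd":        {"functioning": 0.00, "esrd": 0.80, "dead": 0.20},
    "dead":        {"functioning": 0.00, "esrd": 0.00, "dead": 1.00},
}

def run_cohort(years, start=None):
    """Propagate cohort fractions through the model, one annual cycle at a time."""
    state = dict(start or {"functioning": 1.0, "esrd": 0.0, "dead": 0.0})
    for _ in range(years):
        state = {s: sum(state[f] * P[f][s] for f in P) for s in P}
    return state
```

Because each row of `P` sums to 1, the cohort fractions always sum to 1; life-years, QALYs, and time to ESRD are then obtained by accumulating the state fractions (with appropriate weights) over the cohort's lifetime.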
Results
Among CKD patients aged 50 years, an acute eGFR decrement from 45 mL/minute to 35 mL/minute yields decreases of 1.6 life-years, 1.5 quality-adjusted life-years (QALYs), 0.8 years until ESRD, and an increase of 183 per 1,000 progressing to ESRD. Among CKD patients aged 60 years, lowering the hazard ratio of ESRD to 0.8 yields values of 0.2, 0.2, 0.2, and 46 per 1,000, respectively. Incremental cost-effectiveness ratios are higher (ie, less favorable) for higher baseline eGFR, indicating that interventions occurring later in the course of disease are more likely to be economically attractive.
Conclusion
Both acute kidney injury and slowing the rate of eGFR decline produce substantial shifts in expected numbers and timing of ESRD among CKD patients. This model is a useful tool for planning management of CKD patients.
doi:10.2147/IJNRD.S58074
PMCID: PMC4086666  PMID: 25061330
epidemiology; decision model; policy analysis; cost effectiveness; acute kidney injury; disease progression; end-stage renal disease
22.  The natural history of, and risk factors for, progressive Chronic Kidney Disease (CKD): the Renal Impairment in Secondary care (RIISC) study; rationale and protocol 
BMC Nephrology  2013;14:95.
Background
Chronic kidney disease (CKD) affects up to 16% of the adult population and is associated with significant morbidity and mortality. People at highest risk from progressive CKD are defined by a sustained decline in estimated glomerular filtration rate (eGFR) and/or the presence of significant albuminuria/proteinuria and/or more advanced CKD. Accurate mapping of the bio-clinical determinants of this group will enable improved risk stratification and direct the development of better targeted management for people with CKD.
Methods/Design
The Renal Impairment In Secondary Care study is a prospective, observational cohort study; patients managed in secondary care with CKD stage 4 or 5, or with CKD stage 3 and accelerated progression and/or proteinuria, are eligible to participate. Participants undergo a detailed bio-clinical assessment that includes measures of vascular health, periodontal health, quality of life and socio-economic status, clinical assessment, and collection of samples for biomarker analysis. The assessments take place at baseline and at six, 18, 36, 60 and 120 months; the outcomes of interest include cardiovascular events, progression to end-stage kidney disease, and death.
Discussion
The determinants of progression of chronic kidney disease are not fully understood, though a number of risk factors for progression (both traditional and novel) have been proposed. This study will provide a detailed bio-clinical phenotype of patients with high-risk chronic kidney disease (high risk of both progression and cardiovascular events) and will assess them repeatedly over a prolonged follow-up period. Recruitment commenced in Autumn 2010, and the study will provide many outputs that add to the evidence base for progressive chronic kidney disease.
doi:10.1186/1471-2369-14-95
PMCID: PMC3664075  PMID: 23617441
CKD progression; Observational cohort study; Inflammation; Arterial stiffness; Periodontitis
23.  Clinical management of nondialysis patients with chronic kidney disease: a retrospective observational study. Data from the SONDA study (Survey Of Non-Dialysis outpAtients) 
Background
A lack of awareness of chronic kidney disease (CKD) often results in delayed diagnosis and inadequate treatment.
Purpose
The objective of this study was to assess the therapeutic management and outcome of nondialysis CKD patients.
Methods
Three hundred ninety-seven patients (54.9% male; mean age 67.5 ± 14.6 years) were retrospectively screened at the Nephrology Department, GB Grassi Hospital, Rome, Italy. After a baseline visit, patient data were collected every 6 months for a total of 24 months. Clinical characteristics were measured at baseline; the following outcomes were then measured every 6 months: CKD stage, presence of concomitant diseases, treatment, and adherence to Kidney Disease Outcomes Quality Initiative (K/DOQI) guidelines for anemia management.
Results
Three hundred sixty-eight (92.7%) patients attended at least one visit and 92 (23.2%) patients attended all four visits. Patients were mainly referred to a nephrologist for chronic renal failure (61.7%) or hypertension (42.8%). At baseline, 79.6% of patients had a previous hospitalization and 79.1% were receiving antihypertensive medication. Serum creatinine and/or glomerular filtration rate was examined in >90% of patients, whereas parathyroid hormone was rarely examined (5.5%). Vitamin D supplementation was received by 6.5% of patients. The majority of patients were at CKD stage 3 or 4 (32% and 23.9%, respectively), and staging did not change significantly over time. The use of antithrombotic, antilipidemic and erythropoietin medication increased over the four surveys. The majority of patients (86.8%) achieved hemoglobin K/DOQI target levels.
Conclusion
These findings demonstrate a current lack of attention to CKD and related disorders (mineral metabolism, electrolyte balance, and anemia) at the level of the general practitioner (GP) and non-nephrology specialist, which can result in both delayed referral and inadequate treatment. By increasing both awareness of CKD and the coordination between GPs and nephrologists, patient clinical and therapeutic outcomes may be improved.
doi:10.2147/IJNRD.S38405
PMCID: PMC3579409  PMID: 23550080
chronic kidney disease; metabolic bone disorders; predialysis; renal failure
24.  Foods with added fiber improve stool frequency in individuals with chronic kidney disease with no impact on appetite or overall quality of life 
BMC Research Notes  2013;6:510.
Background
Fiber intake may be low in individuals with chronic kidney disease (CKD) due to diet restriction and/or poor appetite associated with uremic symptoms, contributing to constipation and reduced quality of life. This report describes the effects of foods with added fiber on gastrointestinal function and symptoms, clinical markers, and quality of life in CKD patients.
Findings
Adults with CKD (n = 15; 9 F, 6 M; 66 ± 15 y) were provided with cereal, cookies and snack bars without added fiber for 2 weeks, followed by similar foods providing 23 g/d of added fiber for 4 weeks, to incorporate into their usual diets. Participants completed the Kidney Disease Quality of Life (KDQOL-36) questionnaire, the Simplified Nutritional Appetite Questionnaire (SNAQ) and the Epworth Sleepiness Scale (ESS) bi-weekly and the Gastrointestinal Symptom Rating Scale (GSRS) weekly, and recorded stool frequency and compliance daily. Serum cholesterol and glucose were assessed during both the control and intervention periods. Providing 23 g/d of added fiber increased stool frequency (1.3 ± 0.2 to 1.6 ± 0.2 stools/d; P = 0.02), decreased total cholesterol (175 ± 12 to 167 ± 11 mg/dL; P = 0.02) and improved the TC:HDL ratio (4.0 ± 0.3 to 3.7 ± 0.2; P = 0.02). GSRS and SNAQ scores did not change, but SNAQ scores suggested poor appetite in 7 participants with or without added fiber. The KDQOL Mental Health Composite decreased from 53 ± 2 to 48 ± 2 (P = 0.01) while the Physical Health Composite increased from 31 ± 2 to 35 ± 3 (P = 0.02), with no change in overall QOL. The ESS score decreased from 10 ± 1 to 8 ± 1 (P = 0.04).
Conclusion
Consuming foods with added fiber may be an effective means of increasing fiber intakes, improving stool frequency, and lipid profile in individuals with CKD.
Trial registration
ClinicalTrials.gov, # NCT01842087
doi:10.1186/1756-0500-6-510
PMCID: PMC4235217  PMID: 24304924
Chronic kidney disease; Fiber; Quality of life; Gastrointestinal function; Appetite; GSRS; SNAQ; ESS
25.  Psychosocial factors in adults with chronic kidney disease: characteristics of pilot participants in the Tasmanian Chronic Kidney Disease study 
BMC Nephrology  2013;14:83.
Background
Psychosocial factors including depression, anxiety and lower social support are common in patients with chronic kidney disease (CKD). However the influence of these potentially modifiable risk factors on morbidity and mortality in this renal population is unknown. The Tasmanian Chronic Kidney Disease study is a prospective cohort study which aims to examine the influence of both biomedical and psychosocial factors on disease progression, decision making and length and quality of life in adults with severe CKD, prior to kidney replacement therapy (KRT). This paper describes the recruitment, baseline characteristics and initial follow-up of pilot participants.
Methods
Adults aged > 18 years with stage 4 CKD (eGFR 15–29 mls/min/1.73 m2) and not receiving dialysis were recruited via treating physicians. Measures included depression (9-item Patient Health Questionnaire), anxiety (Beck Anxiety Inventory) and social support (Multidimensional Scale of Perceived Social Support). Primary outcomes were kidney disease progression, use of KRT and health-related quality of life (Kidney Disease and Quality of Life Short Form and the EQ-5D).
Results
Of those invited (n = 105), 49 provided consent and completed the baseline assessment. There were no significant differences between responders and non-responders in age, gender or socio-economic status (all p > 0.05). Participants were predominantly male (63.3%) with a mean age of 72.6 ± 10.2 years. Mean serum creatinine was 241 ± 62 μmol/L with a mean eGFR of 22 ± 5 mls/min/1.73 m2. The prevalence of major depression and moderate to severe anxiety was 10% and 9%, respectively. Less severe depression and fewer anxiety symptoms were associated with higher health-related quality of life. Follow-up at 10 months showed CKD progression in 34% of participants (use of KRT in 16%, stage 5 CKD without KRT in 18%) and one death, with the remainder stable at CKD stage 3 or 4.
Conclusions
Results indicate that a larger prospective study is feasible and has the capacity to examine the influence of biomedical and psychosocial factors on kidney disease progression, use of dialysis and transplantation, and salient personal and economic outcomes. Findings have the potential to provide an evidence base for revising healthcare provision in order to optimize the care of patients with CKD.
doi:10.1186/1471-2369-14-83
PMCID: PMC3637060  PMID: 23586969
Anxiety; Chronic kidney disease; Cohort; Depression; Health-related quality of life; Social support
