Insulin resistance (IR) is associated with increased cardiovascular risk in multiple patient populations, including those on chronic hemodialysis (CHD). Active vitamin D deficiency is postulated to play a role in the extent of IR observed in CHD patients. We hypothesized that administration of paricalcitol, an active vitamin D medication, influences IR in CHD patients.
Pilot randomized-controlled trial
10 prevalent chronic hemodialysis patients on a stable dose of paricalcitol.
Paricalcitol was withheld for 8 weeks in all patients (phase I). Parathyroid hormone levels were managed with the calcium-sensing receptor agonist cinacalcet. At week 8, patients were randomized to continue cinacalcet or to restart paricalcitol for 8 weeks (phase II). The primary outcome was the change in IR measured as glucose disposal rate (GDR) by hyperinsulinemic euglycemic clamp (HEGC). Secondary outcomes included between-group changes in indirect indices of IR, biomarkers of inflammation, and adipokines.
Mean age was 49 years (range 46–57) and 40% were female. Compared to baseline, there was no detectable change in the GDR at the end of phase I (p=0.7). There was also no statistically significant difference in GDR between groups at the end of phase II (p=0.9). No changes were observed in indirect indices of IR, adipokines or biomarkers of inflammation in either phase.
The results of this pilot study suggest that withdrawal of paricalcitol for 8–16 weeks, and its reinstitution for 8 weeks after withdrawal, does not influence IR measured by HEGC in CHD patients.
insulin resistance; HOMA; chronic kidney disease; metabolism
The use of novel biomarkers to detect incident acute kidney injury (AKI) in the critically ill is hindered by heterogeneity of injury and the potentially confounding effects of prevalent AKI. Here we examined the ability of urine neutrophil gelatinase-associated lipocalin (NGAL), L-type fatty acid binding protein (L-FABP), and cystatin C to predict AKI development, death, and dialysis in a nested case-control study of 380 critically ill adults with an eGFR over 60 ml/min/1.73 m2. One hundred thirty AKI cases were identified following biomarker measurement and were compared to 250 controls without AKI. Areas under the receiver-operator characteristic curves (AUC-ROCs) for discriminating incident AKI from non-AKI were 0.58 (95% CI: 0.52-0.64), 0.59 (0.52-0.65), and 0.50 (0.48-0.57) for urine NGAL, L-FABP, and cystatin C, respectively. The combined AUC-ROC for NGAL and L-FABP was 0.59 (0.56-0.69). Both urine NGAL and L-FABP independently predicted AKI during multivariate regression; however, risk reclassification indices were mixed. Neither urine biomarker was independently associated with death [NGAL hazard ratio 1.35 (95% CI: 0.93-1.96), L-FABP 1.15 (0.82-1.61)], though both independently predicted the need for acute dialysis [NGAL 3.44 (1.73-6.83), L-FABP 2.36 (1.30-4.25)]. Thus, urine NGAL and L-FABP were independently associated with the development of incident AKI and receipt of dialysis but exhibited poor discrimination for incident AKI using conventional definitions.
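As background for the discrimination statistics above: an AUC-ROC equals the probability that a randomly chosen case has a higher biomarker value than a randomly chosen control (the Mann-Whitney interpretation), with ties counted as one-half. A minimal sketch with synthetic, illustrative values (not study data):

```python
def auc_roc(case_scores, control_scores):
    """AUC-ROC via its rank interpretation: the probability that a
    randomly chosen case scores higher than a randomly chosen
    control, counting ties as one-half."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

# Synthetic example of a marker with modest case-control separation,
# of the same order as the ~0.6 AUCs reported above (illustrative only).
cases = [0.9, 0.7, 0.6, 0.4]
controls = [0.8, 0.5, 0.3, 0.2]
print(auc_roc(cases, controls))  # 0.75
```

An AUC of 0.5 corresponds to no discrimination, which is why the cystatin C estimate of 0.50 above indicates essentially chance-level performance.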
Pre-ESRD care is associated with improved outcomes among patients receiving dialysis. It is unknown what proportion of US micropolitan and rural dialysis patients receive pre-ESRD care, and whether they benefit from such care compared with urban patients.
A retrospective cohort study was performed using data from the US Renal Data System. Patients ≥18 years old who initiated dialysis in 2006 and 2007 were classified as rural, micropolitan, or urban and prevalence of pre-ESRD care (early nephrology care >6 months, permanent vascular access, dietary education) was determined using the medical evidence report. The association of pre-ESRD care with dialysis mortality and transplantation was assessed using Cox regression with stratification for geographic residence.
Of 204,463 dialysis patients, 80% were urban, 10.2% micropolitan, and 9.8% rural. Overall attainment of pre-ESRD care was poor. After adjustment, there were no significant geographic differences in attainment of early nephrology care or permanent dialysis access. Receipt of pre-ESRD care was associated with reduced all-cause mortality and an increased likelihood of transplantation to a similar degree regardless of geographic residence. Both micropolitan and rural patients received less dietary education (RR 0.80, 95% CI 0.76–0.84 and RR 0.85, 95% CI 0.80–0.89, respectively).
Among patients who receive dialysis, the prevalence of early nephrology care and permanent dialysis access is poor and does not vary by geographic residence. Micropolitan and rural patients receive less dietary education despite an observed mortality benefit, suggesting that barriers may exist to quality dietary care in more remote locations.
rural; disparity; chronic kidney disease
Physical activity (PA) plays important roles in the development of kidney disease and its complications; however, the validity of standard tools for measuring PA is not well understood.
We investigated the performance of several readily available and widely used PA and physical function questionnaires, individually and in combination, against accelerometry in a cohort of participants with CKD.
Setting and Participants
Forty-six participants from the Seattle Kidney Study, an observational cohort study of persons with CKD, completed the PA Scale for the Elderly, the Human Activity Profile (HAP), the Medical Outcomes Study SF-36 questionnaire, and the Four Week PA History Questionnaire (FWH). We simultaneously measured PA using an Actigraph GT3X accelerometer over a 14-day period. We estimated the validity of each instrument by testing its association with log-transformed accelerometry counts. We used the Akaike information criterion to investigate the performance of combinations of questionnaires.
All questionnaire scores were significantly associated with log-transformed accelerometry counts. The HAP correlated best with accelerometry counts (r2=0.32), followed by the SF-36 (r2=0.23). Forty-three percent of the variability in accelerometry counts was explained by a model combining the HAP, SF-36, and FWH.
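To illustrate the kind of model comparison described above (regressing log-transformed accelerometry counts on questionnaire scores, singly and in combination, and judging fit with the Akaike information criterion), here is a small sketch with synthetic data. Variable names and values are assumptions for illustration, not the study's data or code:

```python
import math
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares with an intercept; returns the
    residual sum of squares (RSS)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return float(resid @ resid)

def aic(rss, n, k):
    """Gaussian-likelihood AIC (up to an additive constant):
    n*ln(RSS/n) + 2*(k+1), for k predictors plus an intercept."""
    return n * math.log(rss / n) + 2 * (k + 1)

# Synthetic illustration: log accelerometry counts driven by two
# hypothetical questionnaire scores (e.g., HAP and SF-36) plus noise.
rng = np.random.default_rng(0)
n = 46                                   # cohort size reported above
q1 = rng.normal(size=n)
q2 = rng.normal(size=n)
log_counts = 0.6 * q1 + 0.4 * q2 + rng.normal(scale=0.5, size=n)

rss_single = fit_ols(q1.reshape(-1, 1), log_counts)
rss_combined = fit_ols(np.column_stack([q1, q2]), log_counts)
# Lower AIC indicates the better fit-versus-complexity trade-off.
print(aic(rss_single, n, 1), aic(rss_combined, n, 2))
```

Because AIC penalizes each added parameter, a combined-questionnaire model is preferred only when the extra questionnaire improves fit by more than its complexity cost, which mirrors the selection logic described in the Methods.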
A combination of measurement tools can account for a modest component of PA in patients with CKD; however, a substantial proportion of physical activity is not captured by standard assessments.
chronic kidney disease; physical activity; accelerometry; questionnaires
Protein energy wasting (PEW) is highly prevalent in patients undergoing maintenance hemodialysis (MHD). Importantly, there is a robust association between the extent of PEW and the risk of hospitalization and death in these patients, regardless of the nutritional marker used. The multiple etiologies of PEW in advanced kidney disease are still being elucidated. Apart from the multiple mechanisms that might lead to PEW, it appears that the common pathway for all the derangements is exaggerated protein degradation along with decreased protein synthesis. The hemodialysis procedure per se is an important contributor to this process. Metabolic and hormonal derangements such as acidosis, inflammation, and resistance to the anabolic properties of insulin and growth hormone are all implicated in the development of PEW in MHD patients. Appropriate management of MHD patients at risk for PEW requires a comprehensive combination of strategies to diminish protein and energy depletion and to institute therapies that will avoid further losses. The mainstay of nutritional treatment in MHD patients is provision of an adequate amount of protein and energy, using oral supplementation as needed. Intradialytic parenteral nutrition should be attempted in patients who cannot use the gastrointestinal tract efficiently. Other anabolic strategies such as exercise, anabolic hormones, anti-inflammatory therapies, and appetite stimulants can be considered as complementary therapies in suitable patients.
Arteriovenous graft stenosis leading to thrombosis is a major cause of complications in patients undergoing hemodialysis. Procedural interventions may restore patency but are costly. Although there is no proven pharmacologic therapy, dipyridamole may be promising because of its known vascular antiproliferative activity.
We conducted a randomized, double-blind, placebo-controlled trial of extended-release dipyridamole, at a dose of 200 mg, and aspirin, at a dose of 25 mg, given twice daily after the placement of a new arteriovenous graft until the primary outcome, loss of primary unassisted patency (i.e., patency without thrombosis or requirement for intervention), was reached. Secondary outcomes were cumulative graft failure and death. Primary and secondary outcomes were analyzed with the use of a Cox proportional-hazards regression with adjustment for prespecified covariates.
At 13 centers in the United States, 649 patients were randomly assigned to receive dipyridamole plus aspirin (321 patients) or placebo (328 patients) over a period of 4.5 years, with 6 additional months of follow-up. The incidence of primary unassisted patency at 1 year was 23% (95% confidence interval [CI], 18 to 28) in the placebo group and 28% (95% CI, 23 to 34) in the dipyridamole–aspirin group, an absolute difference of 5 percentage points. Treatment with dipyridamole plus aspirin significantly prolonged the duration of primary unassisted patency (hazard ratio, 0.82; 95% CI, 0.68 to 0.98; P = 0.03) and inhibited stenosis. The incidences of cumulative graft failure, death, the composite of graft failure or death, and serious adverse events (including bleeding) did not differ significantly between study groups.
Treatment with dipyridamole plus aspirin had a significant but modest effect in reducing the risk of stenosis and improving the duration of primary unassisted patency of newly created grafts. (ClinicalTrials.gov number, NCT00067119.)
Increased fatigue is a predictor of morbidity and mortality in older adults. Fatigability describes a change in performance or self-reported fatigue in response to physical activity (PA). However, the relationship of fatigability to PA-related energy expenditure (PAEE) is unknown. Changes in performance, fatigue, and energy expenditure were measured simultaneously in 17 adults (11 females, 74–94 years old) performing eight standardized PA tasks with various energy expenditure requirements in a whole-room indirect calorimeter. Change in performance was objectively measured using a PA movement monitor, and change in fatigue was self-reported on a seven-point scale for each task. Performance and perceived fatigability severity scores were calculated as the ratio of change in performance or fatigue, respectively, to PAEE. We found that changes in both objective performance and self-reported fatigue were associated with energy expenditure (Spearman rho = −0.72 and −0.68, respectively, p < 0.001) on a task requiring a relatively high level of energy expenditure. The performance and perceived fatigability severity scores were significantly correlated (rho = 0.77, p < 0.001) on this task. In summary, the results of this proof-of-concept pilot study show that both perceived and performance fatigability severity scores are related to PAEE-induced fatigue on a task requiring a relatively high level of energy expenditure. We conclude that fatigability severity is a valid measure of PAEE-induced fatigue in older adults.
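A fatigability severity score, as defined above, is simply a change measure normalized by the energy expenditure of the task. A minimal sketch (the function name and example values are illustrative assumptions, not the study's data or code):

```python
def fatigability_severity(change, paee):
    """Fatigability severity score: change in performance (e.g.,
    activity counts) or in self-reported fatigue, divided by the
    physical activity-related energy expenditure (PAEE) of the task."""
    if paee <= 0:
        raise ValueError("PAEE must be positive")
    return change / paee

# Illustrative values only: a 2-point rise in reported fatigue on a
# task costing 35 kcal yields a perceived severity score of about 0.057.
print(fatigability_severity(change=2.0, paee=35.0))
```

Normalizing by PAEE is what lets scores from tasks with very different energetic demands be compared on a common scale.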
Aging; Fatigue; Tiredness; Resting energy expenditure; Physical activity
Protein-energy wasting (PEW), which is manifested by low serum levels of albumin or prealbumin, sarcopenia and weight loss, is one of the strongest predictors of mortality in patients with chronic kidney disease (CKD). Although PEW might be engendered by non-nutritional conditions, such as inflammation or other comorbidities, the question of causality does not refute the effectiveness of dietary interventions and nutritional support in improving outcomes in patients with CKD. The literature indicates that PEW can be mitigated or corrected with an appropriate diet and enteral nutritional support that targets dietary protein intake. In-center meals or oral supplements provided during dialysis therapy are feasible and inexpensive interventions that might improve survival and quality of life in patients with CKD. Dietary requirements and enteral nutritional support must also be considered in patients with CKD and diabetes mellitus, in patients undergoing peritoneal dialysis, renal transplant recipients, and in children with CKD. Adjunctive pharmacological therapies, such as appetite stimulants, anabolic hormones, and antioxidative or anti-inflammatory agents, might augment dietary interventions. Intraperitoneal or intradialytic parenteral nutrition should be considered for patients with PEW whenever enteral interventions are not possible or are ineffective. Controlled trials are needed to better assess the effectiveness of in-center meals and oral supplements.
Loss of lean body mass (sarcopenia) is associated with increased morbidity and mortality in patients receiving chronic hemodialysis (CHD). Insulin resistance (IR), which is highly prevalent in patients receiving CHD, has been proposed to play a critical role in the development of sarcopenia. The aim of this study was to examine the effect of IR on amino acid metabolism in patients receiving CHD.
This was a cross-sectional study.
The study included 12 prevalent African American patients receiving CHD (i.e., patients who had been on dialysis for more than 90 days).
IR was measured as glucose disposal rate (GDR) determined from hyperinsulinemic euglycemic clamp (HEGC) studies performed 3 consecutive times. Plasma amino acid (AA) concentrations were measured by real-time high-performance liquid chromatography (HPLC) throughout the clamp study. The primary outcome was the percentage change in leucine concentrations during the clamp study. The main predictor was the GDR measured simultaneously during the HEGC studies. Mixed-model analysis was used to account for repeated measures.
All individual AA concentrations declined significantly in response to high-dose insulin administration (P < .001). There was a significant direct association between GDR by HEGC studies and the percentage change in leucine concentration (P = .02). Although positive correlations were observed between GDR values and concentration changes from baseline for other AAs, these associations did not reach statistical significance.
Our results suggest that the severity of IR of carbohydrate metabolism is associated with a lesser decline in plasma leucine concentrations, indicating a similar resistance to protein anabolism. Insulin resistance represents a potential mechanism for the sarcopenia commonly observed in patients receiving CHD.
Because of the number of factors affecting nutritional and metabolic status in patients with advanced chronic kidney disease or on maintenance dialysis, the prevention and treatment of protein-energy wasting (PEW) of chronic kidney disease should involve a comprehensive combination of maneuvers to diminish protein and energy depletion, in addition to therapies that will avoid further losses. The available evidence suggests that nutritional supplementation, administered orally or parenterally, is effective in the treatment of maintenance dialysis patients with PEW in whom oral dietary intake from regular meals cannot maintain adequate nutritional status. Increased oral nutrient intake during dialysis and at home is the ideal choice for this intervention. In clinical practice, the advantages of intradialytic oral nutritional supplements include proven efficacy and compliance. Therefore, at a minimum, oral nutritional supplementation given intradialytically should be attempted in maintenance dialysis patients with PEW, accompanied by individualized dietary advice for appropriate intake at home. In patients who cannot tolerate oral feeding, other forms of nutritional supplementation, including intradialytic parenteral nutrition, are a reasonable strategy. Although not proven conclusively, nutritional interventions in the form of supplementation may lead to considerable improvements in mortality, hospitalization, and treatment costs.
Wasting; cachexia; ESRD; malnutrition; IDPN
Antibody response to the inactivated influenza vaccine is not well described in kidney transplant recipients on newer, but commonly used, immunosuppression medications. We hypothesized that kidney transplant recipient participants on tacrolimus-based regimens would have decreased antibody response compared with healthy controls.
Prospective cohort study of 53 kidney transplant recipients and 106 healthy control participants over the 2006–2007 influenza season. All participants received standard inactivated influenza vaccine.
Setting and participants
Kidney transplant recipients on tacrolimus-based regimens at a single academic medical center and healthy controls.
Presence of kidney transplant.
Proportion of participants achieving seroresponse (four-fold rise in antibody titer) and seroprotection (antibody titer greater than 1:32) one month after vaccination.
Antibody titers before vaccination and one month after vaccination using hemagglutinin inhibition assays for influenza types A/H1N1, A/H3N2, and B.
A smaller proportion of the transplantation group compared with the healthy control group developed the primary outcomes of seroresponse or seroprotection for all three influenza types at one month post vaccination. The response to influenza type A/H3N2 was statistically different, with the transplantation group having 69% decreased odds of developing seroresponse (95% CI 0.16 to 0.62, P = 0.001) and 78% decreased odds of developing seroprotection (95% CI 0.09 to 0.53, P = 0.001) compared with healthy controls. When participants less than 6 months from time of transplantation were considered, this group had significantly decreased response to the vaccine as compared with healthy controls.
Small sample size; potential for confounding; the outcome measure used is the standard but does not provide information about vaccine efficacy.
Kidney transplant recipients, especially within 6 months of transplantation, had diminished antibody response to the 2006–07 inactivated influenza vaccine.
influenza; vaccination; immunosuppression; kidney transplant; tacrolimus
Although annual influenza vaccination is recommended for kidney transplant recipients, its efficacy, as reflected by serum antibody titers, has not been well studied beyond 1 month after vaccination in this population.
We performed a single-center prospective cohort study of 51 kidney transplant recipients and 102 healthy controls receiving the 2006–2007 influenza vaccine. Anti-hemagglutinin antibody titers to A/H1N1, A/H3N2, and B were measured prior to vaccination, 1 month after vaccination, and again at the end of influenza season. The primary outcome was the proportion of participants maintaining seroprotection (antibody titer ≥ 1:32) for the duration of influenza season after influenza vaccination.
Median follow-up time was 175 and 155 days in the transplant and control groups, respectively. For types A/H1N1 and B, similarly high proportions of the transplant and control groups maintained seroprotection (A/H1N1: 88.5% vs 83.7%; B: 81.6% vs 74.2%). For type A/H3N2, significantly fewer of the transplant group (66.7%) than the control group (90%) maintained a protective influenza vaccine response (odds ratio 0.21, 95% CI 0.07–0.64). This difference disappeared in adjusted analyses. Actual geometric mean titers decreased significantly within both groups (P < 0.001), but this decline did not differ between groups.
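The unadjusted odds ratio quoted above follows directly from the two seroprotection proportions. As a quick check (a sketch using the proportions reported in this abstract; the published 0.21 reflects the exact participant counts rather than rounded percentages):

```python
def odds_ratio(p1, p0):
    """Odds ratio for group 1 versus group 0, given the proportion
    with the outcome in each group."""
    return (p1 / (1.0 - p1)) / (p0 / (1.0 - p0))

# Proportions maintaining A/H3N2 seroprotection: 66.7% of the
# transplant group vs 90% of the control group.
print(round(odds_ratio(0.667, 0.90), 2))  # 0.22
```

Note that odds ratios exaggerate differences relative to risk ratios when the outcome is common, as it is here; the ratio of proportions (66.7%/90%) is only about 0.74.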
Once they have developed protective vaccine-induced antibody responses to influenza vaccine, kidney transplant recipients are able to maintain adequate protective levels of antibody compared with healthy controls.
transplant; influenza; vaccine; immunosuppression
To document the stability, concurrent validity and clinical correlates of two fatigability severity measures as recommended by the American Geriatrics Society.
Two independent-living facilities and one community senior center.
43 volunteers, average age 85 ± 6 years.
Perceived fatigability severity was quantified by directly asking subjects to report change in energy following a standardized 10-minute walk at a self-selected pace. Performance fatigability severity was defined as the ratio of change in walking speed to total distance walked. The walk test was repeated within two weeks to assess stability. Total daily physical activity (PA) was measured over seven consecutive days using a waist-worn accelerometer. Frailty was measured with the VES-13 interview scale, and gait speed was measured using a standardized 25-foot walk test.
The perceived and performance fatigability severity measures were significantly correlated (r=.94, p<.01) and stable over two assessments (r=.82 and .85, respectively; p<.01). Both fatigability severity measures were significantly correlated with PA level (r=−.42 and −.44, respectively; p<.05), frailty (r=.47 and .53, respectively; p<.01), and gait speed (r=−.45 and −.54, respectively; p<.01).
The methodology described in this study permits the calculation of two highly correlated fatigability severity scores, which summarize the relationship of a person’s change in self-reported tiredness or change in physical performance to concurrently measured PA. The fatigability severity scores are reproducible and correlated with clinical measures predictive of decline. The methods used to quantify fatigability severity can be implemented during a brief assessment (< 15 minutes) and should be useful in the design and evaluation of interventions to increase PA in older adults at risk for functional decline.
fatigability; physical activity; elderly
To describe patient hypertension knowledge and associations with blood pressure measurements.
Patients with chronic kidney disease (CKD) were asked about the impact of high blood pressure on the kidneys and about their target blood pressure goal. Systolic blood pressure was measured using automated sphygmomanometers.
In 338 adults with hypertension and pre-dialysis CKD, the median [IQR] age was 59 [47, 68] years, 45% [n=152] were women, and 18% [n=62] were non-white. Lower systolic blood pressure (SBP) was associated with female sex (SBP mmHg median [IQR] 132 [117,149] women vs. 137 [124,152] men; p=0.04), less advanced CKD (SBP 134 [122,147] stages 1–2 vs. 132 [118,148] stage 3 vs. 140 [125,156] stages 4–5; p=0.01), and patient ability to correctly identify SBP goal (SBP 134 [119,150] correct vs. 141 [125,154] incorrect; p=0.05). In adjusted analysis, knowledge of blood pressure goal remained independently associated with lower SBP (−9.96 mmHg [−19.97, −1.95] in correct respondents vs. incorrect; p<0.001).
Patient knowledge of goal blood pressure is independently associated with improved blood pressure control.
Interventions to improve patient knowledge of specific blood pressure targets may have an important role in optimizing blood pressure management.
Studies have documented an association between chronic kidney disease (CKD) and increased risk of end-stage renal disease (ESRD), death, and comorbidities, including cardiovascular disease and metabolic syndrome, in the general population. However, there are few data on the relationship between CKD and AIDS-defining events (ADEs), and to our knowledge, no studies have analyzed death as a competing risk for ADE among HIV-infected persons. An observational cohort study was performed to determine the incidence of and risks for developing an ADE or death among HIV-infected persons with and without CKD from 1998 to 2005. CKD was defined as an estimated glomerular filtration rate (eGFR) less than 60 ml/min/1.73 m2 using the CKD-Epidemiology Collaboration (CKD-EPI) equation. Log-rank tests and Cox regression were used to model time to development of ADE and/or death as combined and separate outcomes, and competing-risk models were used for ADE versus mortality. Among the 2,127 persons who contributed 5,824 person-years of follow-up, 22% were female, 34% were African American, 38% were on HAART, and 3% had CKD at baseline. ADE occurred in 227 (11%) persons, and there were 80 (4%) deaths. CKD was not significantly associated with ADE/death (HR 1.3, 95% CI: 0.5, 3.2), ADE (HR 1.0, 95% CI: 0.4, 3.1), or death (HR 1.6, 95% CI: 0.4, 3.1). Competing-risk analyses confirmed no statistically significant associations between CKD and these outcomes. CKD was uncommon among HIV-infected persons presenting for care in this racially diverse cohort and was not independently associated with risk of developing an ADE or dying during follow-up.
HIV; CKD; AIDS defining event (ADE); mortality
It is likely that patients with chronic kidney disease (CKD) have a limited understanding of their illness. Here we studied the relationships between objective and perceived knowledge in CKD using the Kidney Disease Knowledge Survey and the Perceived Kidney Disease Knowledge Survey. We quantified perceived and objective knowledge in 399 patients at all stages of non-dialysis-dependent CKD. The median patient age was 58 years, 47% were women, 77% had stage 3-5 CKD, and 83% were Caucasian. The overall median score on the perceived knowledge survey was 2.56 (range: 1-4), and this new measure exhibited excellent reliability and construct validity. In unadjusted analysis, perceived knowledge was associated with patient characteristics defined a priori, including objective knowledge and patient satisfaction with physician communication. In adjusted analysis, older age, male gender, and limited health literacy were associated with lower perceived knowledge. Additional analysis revealed that perceived knowledge was associated with significantly higher odds (2.13), and objective knowledge with lower odds (0.91), of patient satisfaction with physician communication. Thus, our results provide a means to evaluate distinct forms of patient kidney knowledge and identify specific opportunities for education tailored to patients with CKD.
Patient knowledge; kidney disease; health literacy; perceived knowledge; objective knowledge
To compare the relative effectiveness of angiotensin-converting enzyme inhibitors (ACEIs) and angiotensin receptor blockers (ARBs) in reducing cardiovascular mortality in chronic hemodialysis patients, we conducted an observational analysis of all patients initiated on ACEI or ARB therapy while undergoing chronic hemodialysis at a large dialysis provider. Survival curves with mortality hazard ratios (HRs) were generated using the Kaplan-Meier method and Cox regression. Outcomes were compared using inverse probability of treatment weighting and propensity score matching. Over 6 years, 22,800 patients were newly initiated on an ACEI and 5,828 on an ARB after at least 60 days of chronic hemodialysis. After adjustment for baseline cardiovascular risk factors, there was no significant difference in the risk of cardiovascular, all-cause, or cerebrovascular mortality in patients initiated on an ARB versus an ACEI (HR 0.96). A third of the 28,628 patients newly started on an ACEI or ARB went on to initiate another antihypertensive medication in succession. After adjustment for risk factors, 701 patients initiated on combined ACEI and ARB therapy (HR 1.45) and 6,866 patients on an ACEI plus a non-ARB antihypertensive agent (HR 1.27) were at increased risk of cardiovascular death compared with 1,758 patients initiated on an ARB plus non-ACEI antihypertensive therapy. Thus, an ARB in combination with another antihypertensive medication (but not an ACEI) may have a beneficial effect on cardiovascular mortality. As observational studies may be confounded by indication even after adjustment, randomized clinical trials are needed to confirm these findings.
Tacrolimus, an immunosuppressive drug widely prescribed in kidney transplantation, requires therapeutic drug monitoring due to its marked interindividual pharmacokinetic variability and narrow therapeutic index. Previous studies have established that CYP3A5 rs776746 is associated with tacrolimus clearance, blood concentration, and dose requirement. The importance of other drug absorption, distribution, metabolism, and elimination (ADME) gene variants has not been well characterized.
We used novel DNA biobank and electronic medical record resources to identify ADME variants associated with tacrolimus dose requirement. Broad ADME genotyping was performed on 446 kidney transplant recipients who had been dosed to steady state with tacrolimus. The cohort was obtained from Vanderbilt's DNA biobank, BioVU, which contains linked, de-identified electronic medical record data. Genotyping included Affymetrix DMET Plus (1936 polymorphisms), custom Sequenom MassARRAY iPLEX Gold assay (95 polymorphisms), and ancestry-informative markers. The primary outcome was tacrolimus dose requirement defined as blood concentration-to-dose ratio.
In analyses that adjusted for race and other clinical factors, we replicated the association of tacrolimus blood concentration-to-dose ratio with CYP3A5 rs776746 (p = 7.15 × 10−29) and identified associations with nine variants in linkage disequilibrium with rs776746, including eight CYP3A4 variants. No NR1I2 variants were significantly associated. Age, weight, and hemoglobin were also significantly associated with the outcome. In final models, rs776746 explained 39% of the variability in dose requirement, and 46% was explained by the model containing clinical covariates.
This study highlights the utility of DNA biobanks and electronic medical records for tacrolimus pharmacogenomic research.
pharmacogenomics; pharmacokinetics; calcineurin inhibitor; tacrolimus; electronic medical records; kidney transplant; cytochrome P4503A5; genetic polymorphism; dosing
We evaluated whether black race is associated with higher incidence of End Stage Renal Disease (ESRD) among a cohort of blacks and whites of similar, generally low socioeconomic status, and whether risk factor patterns differ among blacks and whites and explain the poorly understood racial disparity in ESRD. Incident diagnoses of ESRD among 79,943 black and white participants in the Southern Community Cohort Study (SCCS) were ascertained by linkage with the United States Renal Data System (USRDS) from 2002 through 2009. Person-years of follow up were calculated from date of entry into the SCCS until date of ESRD diagnosis, date of death, or September 1, 2009, whichever occurred first. Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals (CI) for incident ESRD among black and white participants in relation to baseline characteristics. After 329,003 person-years of follow-up, 687 incident cases of ESRD were identified in the cohort. The age-adjusted ESRD incidence rate was 273 (per 100,000) among blacks, 3.5-fold higher than the rate of 78 among whites. Risk factors for ESRD included male sex (HR = 1.6; 95% CI 1.4–1.9), low income (HR = 1.5; 95% CI 1.2–1.8 for income below vs. above $15,000), smoking (HR = 1.2; 95% CI 1.02–1.4) and histories of diabetes (HRs increasing to 9.4 (95% CI 7.4–11.9) among those with ≥20 years diabetes duration) and hypertension (HR = 2.9; 95% CI 2.3–3.7). Patterns and magnitudes of association were virtually identical among blacks and whites. After adjustment for these risk factors, blacks continued to have a higher risk for ESRD (HR = 2.4; 95% CI = 1.9–3.0) relative to whites. The black-white disparity in risk of ESRD was attenuated but not eliminated after control for known risk factors in a closely socioeconomically matched cohort. Further research characterizing biomedical factors, including CKD progression, in ESRD occurrence in these two racial groups is needed.
Increased oxidative stress and inflammation are highly prevalent in chronic kidney disease (CKD), yet few studies have investigated whether oral antioxidant therapy can alter markers of inflammation or oxidative stress in CKD. The purpose of this study was to investigate whether a combination of mixed tocopherols and alpha lipoic acid (ALA) would alter biomarkers of oxidative stress and inflammation in subjects with Stage 3–4 CKD.
This was a prospective, randomized, double-blind, placebo-controlled pilot trial. 62 subjects were enrolled and randomly assigned to receive the combination of mixed tocopherols 666 IU/day plus ALA 600 mg/day, or their matching placebos, for a total of 8 weeks. Plasma F2-isoprostane and protein thiol concentrations were measured as biomarkers of oxidative stress, and C-reactive protein (CRP) and interleukin-6 (IL-6) concentrations as biomarkers of systemic inflammation.
There were no significant differences in demographics, diabetic status, or estimated glomerular filtration rate (eGFR) between the treatment and placebo groups at baseline. 58 of 62 randomized subjects (93%) completed the study protocol. After two months of treatment, there were no significant changes in F2-isoprostane, protein thiol, CRP, or IL-6 concentrations with mixed tocopherol and ALA treatment compared to matching placebos, whether analyzed as intention-to-treat or as-treated. Diabetic status and baseline body mass index did not influence the results.
Combination oral mixed tocopherols and ALA treatment for 2 months does not influence biomarkers of oxidative stress and inflammation in Stage 3–4 CKD patients.
Little is known about disease specific knowledge in patients with chronic kidney disease (CKD). We developed and examined the results of a survey to characterize kidney disease knowledge.
Survey about kidney disease knowledge, with questions developed by experts.
Setting and Participants
401 adult patients with CKD (Stages 1–5) attending a nephrology clinic from April to October 2009.
Outcomes & Measurements
We calculated survey reliability using the Kuder-Richardson-20 coefficient, and established construct validity by testing a priori hypotheses of associations between the survey and patient characteristics. We descriptively analyzed survey responses and applied linear regression analyses to evaluate associations with patient characteristics. Health literacy was measured using the Rapid Estimate of Adult Literacy in Medicine.
Participants' median age was 58 (25th–75th percentile, 46–68) years, 83% were White, 18% had limited literacy, and 77% had CKD Stages 3–5. The 28-question knowledge survey had good reliability (KR-20 = 0.72), and the mean (SD) knowledge score was 66% (15%). In support of the construct validity of our knowledge survey, bivariate analyses showed that scores were associated with age (β, −0.01 per ten years; 95% CI, −0.02 to −0.005; p=0.003), formal education (β, 0.09; 95% CI, 0.03–0.15; p=0.004), health literacy (β, 0.06; 95% CI, 0.03–0.10; p=0.001), kidney education class participation (β, 0.05; 95% CI, 0.01–0.09; p=0.009), knowing someone else with CKD (β, 0.05; 95% CI, 0.02–0.08; p=0.001), and awareness of one's own CKD diagnosis (β, 0.07; 95% CI, 0.04–0.10; p<0.001). Findings were similar in adjusted analyses.
Recruitment from one clinic limits generalizability of findings.
For patients with CKD, this kidney disease knowledge survey (KiKS) is reliable and valid, and identifies areas of and risk factors for poor kidney knowledge. Further study is needed to determine the impact of CKD knowledge on self-care behaviors and clinical outcomes.
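The KR-20 coefficient used to assess the survey's reliability can be computed directly from an item-response matrix. A minimal sketch, assuming dichotomously scored items; the response data below are hypothetical, not the KiKS data:

```python
def kr20(responses):
    """Kuder-Richardson 20 internal-consistency coefficient for a matrix
    of dichotomous items: one row per respondent, one 0/1 entry per item
    (1 = correct answer)."""
    n = len(responses)            # number of respondents
    k = len(responses[0])         # number of items
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n
    # Variance of respondents' total scores
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    # Sum over items of p*q, where p = proportion answering item correctly
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

# Hypothetical matrix: 4 respondents x 3 items
print(round(kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]), 2))  # -> 0.75
```

A KR-20 near 0.7 or above is conventionally taken as acceptable reliability for group-level comparisons, consistent with the 0.72 reported for this survey.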
We tested the hypothesis that long-term resistance exercise combined with intradialytic oral nutrition (IDON) supplementation will improve markers of muscle mass and strength further compared to IDON alone in chronic hemodialysis (CHD) patients.
Randomized controlled trial.
Outpatient Dialysis Unit at an academic center.
Main outcome measure
Lean body mass (LBM). Muscle strength and other nutritional parameters were measured as secondary outcomes.
Thirty-two participants (age 43±13 years, 21 male) on CHD.
Subjects were randomly assigned to IDON plus resistance exercise (NS+EX) or IDON (NS) alone for 6 months. IDON consisted of a lactose-free formula providing protein, carbohydrate, and fat. In the NS+EX arm, three sets of 12 repetitions of leg press were completed before each dialysis session.
22 out of 32 participants completed the 6-month intervention. There were no statistically significant differences between the study interventions with respect to changes in LBM and body weight when comparing NS+EX to NS. There were also no statistically significant differences in any of the secondary outcomes measured in the study. Body weight (80.3±16.6 kg, 81.1±17.5 kg and 80.9±18.2 kg at baseline, month 3 and month 6, respectively, P=0.02) and 1-Repetition Maximum (468±148 lb, 535±144 lb, 552±142 lb, respectively, P=0.001) increased statistically significantly during the study for all patients combined.
This study did not show a benefit of adding resistance exercise to nutritional supplementation on long-term somatic protein accretion beyond that of supplementation alone. When both treatment groups were combined, body weight and muscle strength improved during the study.
hemodialysis; protein-energy wasting; resistance exercise; nutrition supplementation
Wasting/cachexia is prevalent among patients with chronic kidney disease (CKD). It is to be distinguished from malnutrition, which is defined as the consequence of insufficient food intake or an improper diet. Malnutrition is characterized by hunger, which is an adaptive response, whereas anorexia is prevalent in patients with wasting/cachexia. Energy expenditure decreases as a protective mechanism in malnutrition, whereas it remains inappropriately high in cachexia/wasting. In malnutrition, fat mass is preferentially lost while lean body mass and muscle mass are preserved. In cachexia/wasting, muscle is wasted and fat is relatively underutilized. Restoring adequate food intake or altering the composition of the diet reverses malnutrition. Nutrition supplementation does not totally reverse cachexia/wasting. The diagnostic criteria of cachexia/protein-energy wasting in CKD are considered. The association of wasting surrogates, such as serum albumin and prealbumin, with mortality is strong, making them robust outcome predictors. At the patient level, longevity has consistently been observed in patients with CKD who have more muscle and/or fat, who report better appetite, and who eat more. Although inadequate nutritional intake may contribute to wasting or cachexia, recent evidence indicates that other factors, including systemic inflammation, perturbations of appetite-controlling hormones from reduced renal clearance, aberrant neuropeptide signaling, insulin and insulin-like growth factor resistance, and metabolic acidosis, may be important in the pathogenesis of CKD-associated wasting. A number of novel therapeutic approaches, such as ghrelin agonists and melanocortin receptor antagonists, are currently at the experimental level and await confirmation by randomized controlled clinical trials in patients with CKD-associated cachexia/wasting syndrome.
Wasting; Chronic kidney disease