Results 1-25 (35)

1.  Physical Performance and Frailty in Chronic Kidney Disease 
American journal of nephrology  2013;38(4):307-315.
Background
Poor physical performance and frailty are associated with elevated risks of death and disability. Chronic kidney disease (CKD) is also strongly associated with these outcomes. The risks of poor physical performance and frailty among CKD patients, however, are not well established.
Methods
We measured the Short Physical Performance Battery (SPPB, a summary test of gait speed, chair-raises and balance; range 0–12) and the five elements of frailty among 1111 Chronic Renal Insufficiency Cohort participants. Adjusting for demographics and multiple comorbidities, we fit a linear regression model for the outcome of SPPB score and an ordinal logistic regression model for frailty status.
Results
Median (interquartile range [IQR]) age was 65 (57–71) years, median estimated glomerular filtration rate (eGFR) for non-dialysis patients was 49 (36–62) ml/min/1.73m2, and median SPPB score was 9 (7–10). Seven percent of participants were frail and 43% were pre-frail. Compared with the SPPB score for eGFR >60 ml/min/1.73m2, the SPPB was 0.51 points lower for eGFR 30 – 59; 0.61 points lower for eGFR 15 – 29; and 1.75 points lower for eGFR <15 (p<0.01 for all comparisons). eGFR 30 – 59 (OR 1.45; p=0.024), eGFR 15 – 29 (OR 2.02; p=0.002) and eGFR <15 (OR 4.83; p<0.001) were associated with worse frailty status compared with eGFR >60 ml/min/1.73m2.
Conclusions
CKD severity was associated with poor physical performance and frailty in a graded fashion. Future trials should determine if outcomes for CKD patients with frailty and poor physical performance are improved by targeted interventions.
doi:10.1159/000355568
PMCID: PMC4019506  PMID: 24107579
Physical performance; frailty; chronic kidney disease
2.  Candidate Gene Association Study of Coronary Artery Calcification in Chronic Kidney Disease: Findings from the Chronic Renal Insufficiency Cohort Study 
Objectives
To identify loci for coronary artery calcification (CAC) in patients with chronic kidney disease (CKD).
Background
CKD is associated with increased CAC and subsequent coronary heart disease (CHD) but the mechanisms remain poorly defined. Genetic studies of CAC in CKD may provide a useful strategy for identifying novel pathways in CHD.
Methods
We performed a candidate gene study (~2,100 genes; ~50,000 SNPs) of CAC within the Chronic Renal Insufficiency Cohort (CRIC) Study (n=1,509; 57% European, 43% African ancestry). SNPs with preliminary evidence of association with CAC in CRIC were examined for association with CAC in PennCAC (n=2,560) and Amish Family Calcification Study (AFCS; n=784) samples. SNPs with suggestive replication were further analyzed for association with myocardial infarction (MI) in the Pakistan Risk of Myocardial Infarction study (PROMIS) (n=14,885).
Results
Of 268 SNPs reaching P < 5×10⁻⁴ for CAC in CRIC, 28 SNPs in 23 loci had nominal support (P < 0.05 and in same direction) for CAC in PennCAC or AFCS. Besides chr9p21 and COL4A1, known loci for CHD, these included SNPs having reported GWAS association with hypertension (e.g., ATP2B1). In PROMIS, four of the 23 suggestive CAC loci (chr9p21, COL4A1, ATP2B1 and ABCA4) had significant associations with MI consistent with their direction of effect on CAC.
Conclusions
We identified several loci associated with CAC in CKD that also relate to MI in a general population sample. CKD imparts a high risk of CHD and may provide a useful setting for discovery of novel CHD genes and pathways.
doi:10.1016/j.jacc.2013.01.103
PMCID: PMC3953823  PMID: 23727086
Coronary artery calcification (CAC); chronic kidney disease (CKD); Chronic Renal Insufficiency Cohort Study (CRIC); myocardial infarction (MI); risk factors; candidate genes; single nucleotide polymorphisms (SNPs)
3.  Race modifies the association between adiposity and inflammation in patients with chronic kidney disease: findings from the CRIC study 
Obesity (Silver Spring, Md.)  2014;22(5):1359-1366.
Objective
To examine the race-specific association of inflammation with adiposity and muscle mass in subjects with chronic kidney disease (CKD).
Design and Methods
Plasma concentration of IL-1β, IL-Receptor antagonist (IL-1RA), IL-6, IL-10, TNF-α, TGF-β, hs-CRP, fibrinogen, and serum albumin were measured in 3,939 Chronic Renal Insufficiency Cohort study participants. Bioelectric impedance analysis was used to determine body fat mass (BFM) and fat free mass (FFM).
Results
Plasma levels of hs-CRP, fibrinogen, IL-1RA, IL-6, and TNF-α increased and serum albumin decreased across the quartiles of body mass index. In multivariable analysis, BFM and FFM were positively associated with hs-CRP, fibrinogen, IL-1β, IL-1RA and IL-6. One standard deviation (SD) increase in BFM and FFM was associated with 0.36 (95% CI 0.33, 0.39) and 0.26 (95% CI 0.22, 0.30) SD increase in log transformed hs-CRP, respectively (p<0.001). Race stratified analysis showed that the association between biomarkers and BFM and FFM differed by race, with Caucasians demonstrating a stronger association with markers of inflammation than African Americans.
Conclusion
BFM and FFM are positively associated with markers of inflammation in patients with CKD. Race-stratified analysis showed that this association is stronger in Caucasians than in African Americans.
doi:10.1002/oby.20692
PMCID: PMC4327849  PMID: 24415732
Bioelectric impedance analysis; cytokines; acute phase proteins; muscle mass; Body mass index; African Americans
4.  Functional status and survival after kidney transplantation 
Transplantation  2014;97(2):189-195.
Background
Older patients constitute a growing proportion of U.S. kidney transplant recipients and often have a high burden of comorbidities. A summary measure of health such as functional status might enable transplant professionals to better evaluate and counsel these patients about their prognosis after transplant.
Methods
We linked UNOS registry data about post-transplant survival with pre-transplant functional status data (physical function [PF] scale of the Medical Outcomes Study Short Form-36) among individuals undergoing kidney transplant from 6/1/2000 – 5/31/2006. We examined the relationship between survival and functional status with multivariable Cox regression, adjusted for age. Using logistic regression models for three-year survival, we also estimated the reduction in deaths in the hypothetical scenario that recipients with poor functional status in this cohort experienced modest improvements in function.
Results
The cohort comprised 10,875 kidney transplant recipients (KTRs) with a mean age of 50 years; 14% were ≥65. Differences in three-year mortality between highest and lowest PF groups ranged from 3% among recipients <35 years to 14% among recipients ≥65 years. In multivariable Cox regression, worse PF was associated with higher mortality (HR 1.66 for lowest versus highest PF quartiles; p<0.001). Interactions between PF and age were non-significant. We estimated that 11% fewer deaths would occur if KTRs with the lowest functional status experienced modest improvements in function.
Conclusions
Across a wide age range, functional status was an independent predictor of post-transplant survival. Functional status assessment may be a useful tool with which to counsel patients about post-transplant outcomes.
doi:10.1097/TP.0b013e3182a89338
PMCID: PMC3946985  PMID: 24113514
Older age; functional status; kidney transplant; survival
5.  Objectives and Design of the Hemodialysis Fistula Maturation Study 
Background
A large proportion of newly created arteriovenous fistulas cannot be used for dialysis because they fail to mature adequately to support the hemodialysis blood circuit. The Hemodialysis Fistula Maturation (HFM) Study was designed to elucidate clinical and biological factors associated with fistula maturation outcomes.
Study Design
Multicenter prospective cohort study.
Setting & Participants
Approximately 600 patients undergoing creation of a new hemodialysis fistula will be enrolled at 7 centers in the United States and followed up for as long as 4 years.
Predictors
Clinical, anatomical, biological, and process-of-care attributes identified pre-operatively, intra-operatively, or post-operatively.
Outcomes
The primary outcome is unassisted clinical maturation defined as successful use of the fistula for dialysis for four weeks without any maturation-enhancing procedures. Secondary outcomes include assisted clinical maturation, ultrasound-based anatomical maturation, fistula procedures, fistula abandonment, and central venous catheter use.
Measurements
Pre-operative ultrasound arterial and venous mapping, flow-mediated and nitroglycerin-mediated brachial artery dilation, arterial pulse wave velocity, and venous distensibility; intra-operative vein tissue collection for histopathological and molecular analyses; post-operative ultrasounds at 1 day, 2 weeks, 6 weeks, and prior to fistula intervention and initial cannulation.
Results
Assuming complete data, no covariate adjustment, and unassisted clinical maturation of 50%, there will be 80% power to detect ORs of 1.83 and 1.61 for dichotomous predictor variables with exposure prevalences of 20% and 50%, respectively.
Limitations
Exclusion of two-stage transposition fistulas limits generalizability. The requirement for study visits may result in a cohort that is healthier than the overall population of patients undergoing fistula creation.
Conclusions
The HFM Study will be of sufficient size and scope to 1) evaluate a broad range of mechanistic hypotheses, 2) identify clinical practices associated with maturation outcomes, 3) assess the predictive utility of early indicators of fistula outcome, and 4) establish targets for novel therapeutic interventions to improve fistula maturation.
doi:10.1053/j.ajkd.2013.06.024
PMCID: PMC4134933  PMID: 23992885
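The power statement in the HFM Study design above can be checked with a short simulation. The sketch below is a minimal, assumed setup (single dichotomous predictor, logistic regression, 5% two-sided significance level); the sample size, 50% maturation rate, odds ratios, and exposure prevalences are taken from the abstract, and the sketch is not the study's actual power calculation.

```python
# Minimal power-by-simulation sketch for the design assumptions quoted in the abstract.
# Assumptions beyond the abstract: alpha = 0.05, a single dichotomous predictor, and
# an intercept chosen so the overall maturation rate is about 50%.
import numpy as np
import statsmodels.api as sm

def simulated_power(n=600, prevalence=0.20, odds_ratio=1.83,
                    marginal_rate=0.50, alpha=0.05, n_sims=1000, seed=1):
    """Approximate power to detect a dichotomous predictor in a logistic model."""
    rng = np.random.default_rng(seed)
    beta = np.log(odds_ratio)
    # Pick the intercept so the overall event (maturation) rate is near marginal_rate.
    intercept = np.log(marginal_rate / (1 - marginal_rate)) - beta * prevalence
    hits = 0
    for _ in range(n_sims):
        x = rng.binomial(1, prevalence, n)
        p = 1.0 / (1.0 + np.exp(-(intercept + beta * x)))
        y = rng.binomial(1, p)
        result = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        hits += result.pvalues[1] < alpha
    return hits / n_sims

# Both calls should land near 0.8, consistent with the 80% power quoted above.
print(simulated_power(prevalence=0.20, odds_ratio=1.83))
print(simulated_power(prevalence=0.50, odds_ratio=1.61))
```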
6.  APOL1 Risk Variants, Race, and Progression of Chronic Kidney Disease 
The New England journal of medicine  2013;369(23):2183-2196.
BACKGROUND
Among patients in the United States with chronic kidney disease, black patients are at increased risk for end-stage renal disease, as compared with white patients.
METHODS
In two studies, we examined the effects of variants in the gene encoding apolipoprotein L1 (APOL1) on the progression of chronic kidney disease. In the African American Study of Kidney Disease and Hypertension (AASK), we evaluated 693 black patients with chronic kidney disease attributed to hypertension. In the Chronic Renal Insufficiency Cohort (CRIC) study, we evaluated 2955 white patients and black patients with chronic kidney disease (46% of whom had diabetes) according to whether they had 2 copies of high-risk APOL1 variants (APOL1 high-risk group) or 0 or 1 copy (APOL1 low-risk group). In the AASK study, the primary outcome was a composite of end-stage renal disease or a doubling of the serum creatinine level. In the CRIC study, the primary outcomes were the slope in the estimated glomerular filtration rate (eGFR) and the composite of end-stage renal disease or a reduction of 50% in the eGFR from baseline.
RESULTS
In the AASK study, the primary outcome occurred in 58.1% of the patients in the APOL1 high-risk group and in 36.6% of those in the APOL1 low-risk group (hazard ratio in the high-risk group, 1.88; P<0.001). There was no interaction between APOL1 status and trial interventions or the presence of baseline proteinuria. In the CRIC study, black patients in the APOL1 high-risk group had a more rapid decline in the eGFR and a higher risk of the composite renal outcome than did white patients, among those with diabetes and those without diabetes (P<0.001 for all comparisons).
CONCLUSIONS
Renal risk variants in APOL1 were associated with the higher rates of end-stage renal disease and progression of chronic kidney disease that were observed in black patients as compared with white patients, regardless of diabetes status. (Funded by the National Institute of Diabetes and Digestive and Kidney Diseases and others.)
doi:10.1056/NEJMoa1310345
PMCID: PMC3969022  PMID: 24206458
7.  The impact of documentation of severe acute kidney injury on mortality 
Clinical nephrology  2013;80(6):417-425.
Aims
Modification of the mortality risk associated with acute kidney injury (AKI) necessitates recognition of AKI when it occurs. We sought to determine whether formal documentation of AKI in the medical record, assessed by billing codes for AKI, would be associated with improved clinical outcomes.
Methods
Retrospective cohort study conducted at three hospitals within a single university health system. Adults without severe underlying kidney disease who suffered in-hospital AKI as defined by a doubling of baseline creatinine (n = 5,438) were included. Those whose AKI was formally documented according to discharge billing codes were compared to those without such documentation in terms of 30-day mortality.
Results
Formal documentation of AKI occurred in 2,325 patients (43%). Higher baseline creatinine, higher peak creatinine, medical admission status, and higher Sequential Organ Failure Assessment (SOFA) score were strongly associated with documentation of AKI. After adjustment for severity of disease, formal AKI documentation was associated with reduced 30-day mortality (OR 0.81; CI 0.68–0.96; p = 0.02). Patients with formal documentation were more likely to receive a nephrology consultation (31% vs. 6%, p < 0.001) and fluid boluses (64% vs. 45%, p < 0.001), and had a more rapid discontinuation of angiotensin-converting enzyme inhibitor and angiotensin-receptor blocker medications (HR 2.04; CI 1.69–2.46; p < 0.001).
Conclusions
Formal documentation of AKI is associated with improved survival after adjustment for illness severity among patients with creatinine-defined AKI.
doi:10.5414/CN108072
PMCID: PMC4018223  PMID: 24075024
acute kidney injury; documentation; mortality; cohort studies; international classification of diseases
8.  Effect of Dipyridamole plus Aspirin on Hemodialysis Graft Patency 
The New England journal of medicine  2009;360(21):2191-2201.
BACKGROUND
Arteriovenous graft stenosis leading to thrombosis is a major cause of complications in patients undergoing hemodialysis. Procedural interventions may restore patency but are costly. Although there is no proven pharmacologic therapy, dipyridamole may be promising because of its known vascular antiproliferative activity.
METHODS
We conducted a randomized, double-blind, placebo-controlled trial of extended-release dipyridamole, at a dose of 200 mg, and aspirin, at a dose of 25 mg, given twice daily after the placement of a new arteriovenous graft until the primary outcome, loss of primary unassisted patency (i.e., patency without thrombosis or requirement for intervention), was reached. Secondary outcomes were cumulative graft failure and death. Primary and secondary outcomes were analyzed with the use of a Cox proportional-hazards regression with adjustment for prespecified covariates.
RESULTS
At 13 centers in the United States, 649 patients were randomly assigned to receive dipyridamole plus aspirin (321 patients) or placebo (328 patients) over a period of 4.5 years, with 6 additional months of follow-up. The incidence of primary unassisted patency at 1 year was 23% (95% confidence interval [CI], 18 to 28) in the placebo group and 28% (95% CI, 23 to 34) in the dipyridamole–aspirin group, an absolute difference of 5 percentage points. Treatment with dipyridamole plus aspirin significantly prolonged the duration of primary unassisted patency (hazard ratio, 0.82; 95% CI, 0.68 to 0.98; P = 0.03) and inhibited stenosis. The incidences of cumulative graft failure, death, the composite of graft failure or death, and serious adverse events (including bleeding) did not differ significantly between study groups.
CONCLUSIONS
Treatment with dipyridamole plus aspirin had a significant but modest effect in reducing the risk of stenosis and improving the duration of primary unassisted patency of newly created grafts. (ClinicalTrials.gov number, NCT00067119.)
doi:10.1056/NEJMoa0805840
PMCID: PMC3929400  PMID: 19458364
9.  Urine neutrophil gelatinase-associated lipocalin levels do not improve risk prediction of progressive chronic kidney disease 
Kidney international  2013;83(5):909-914.
Novel biomarkers may improve our ability to predict which patients with chronic kidney disease (CKD) are at higher risk for progressive loss of renal function. Here we assessed the performance of urine neutrophil gelatinase-associated lipocalin (NGAL) for outcome prediction in a diverse cohort of 3386 patients with CKD in the CRIC study. In this cohort, the baseline mean estimated glomerular filtration rate (eGFR) was 42.4 ml/min/1.73m2; the median 24-hour urine protein was 0.2 g/day; and the median urine NGAL concentration was 17.2 ng/mL. Over an average follow-up of 3.2 years, there were 689 cases in which the eGFR was decreased by half or incident end-stage renal disease developed. Even after accounting for eGFR, proteinuria and other known CKD progression risk factors, urine NGAL remained a significant independent risk factor (Cox model hazard ratio 1.70 for the highest versus lowest quartile). The association between baseline urine NGAL levels and risk of CKD progression was strongest in the first two years after biomarker measurement. Within this time frame, adding urine NGAL to a model which included eGFR, proteinuria and other CKD progression risk factors led to a net reclassification improvement of 24.7%, but the C-statistic remained nearly identical. Thus, while urine NGAL was an independent risk factor for progression among patients with established CKD of diverse etiology, it did not substantially improve prediction of outcome events.
doi:10.1038/ki.2012.458
PMCID: PMC3642209  PMID: 23344473
10.  A New Equation to Estimate Glomerular Filtration Rate 
Annals of internal medicine  2009;150(9):604-612.
Background
Equations to estimate glomerular filtration rate (GFR) are routinely used to assess kidney function. Current equations have limited precision and systematically underestimate measured GFR at higher levels.
Objective
To develop a new estimating equation (CKD-EPI creatinine equation).
Design
Cross-sectional analysis. Separate pooled databases for equation development and validation. Representative U.S. population for prevalence estimates.
Setting
Research studies and clinical populations (“studies”) with measured GFR. National Health and Nutrition Examination Survey (NHANES) 1999-2006.
Patients
Equation development in 10 studies (8254 people) and validation in 16 studies (3896 people). Prevalence estimates based on 16,032 people.
Measurements
GFR measured as the clearance of exogenous filtration markers (iothalamate in the development dataset; iothalamate and other markers in the validation dataset). Linear regression to estimate the logarithm of measured GFR from standardized creatinine, sex, race and age.
Results
In the validation dataset, the CKD-EPI equation performed better than the MDRD Study equation (p<0.001 for all subsequent comparisons), especially at higher GFR: lesser bias (median difference between measured and estimated GFR of 2.5 vs. 5.5 mL/min/1.73 m2, respectively); improved precision (interquartile range of the differences of 16.6 vs. 18.3 mL/min/1.73 m2, respectively); and greater accuracy (percentage of estimated GFR within 30% of measured GFR of 84.1% vs. 80.6%, respectively). In NHANES, median (interquartile range) estimated GFR was 94.5 (79.7 – 108.1) vs. 85.0 (72.9 – 98.5) mL/min/1.73 m2, and the prevalence (95% confidence interval) of CKD was 11.5 (10.6, 12.4) % vs. 13.1 (12.1, 14.0) %, respectively.
Limitations
Limited number of elderly people and racial and ethnic minorities with measured GFR.
Conclusions
The CKD-EPI creatinine equation is more accurate than the MDRD Study equation and could replace it for routine clinical use.
PMCID: PMC2763564  PMID: 19414839
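For reference, the sketch below implements the published 2009 CKD-EPI creatinine equation that the abstract above describes. The coefficients are taken from the published equation, not re-derived here, and should be verified against the original paper before any clinical or research use.

```python
# Sketch of the 2009 CKD-EPI creatinine equation (coefficients as published; verify before use).
def ckd_epi_2009(scr_mg_dl, age_years, female, black):
    """Estimated GFR in ml/min/1.73 m^2 from standardized serum creatinine (mg/dl)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# Example: a 65-year-old non-black woman with serum creatinine 1.1 mg/dl (~52.6 ml/min/1.73 m^2).
print(round(ckd_epi_2009(1.1, 65, female=True, black=False), 1))
```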
11.  African American race, obesity, and blood product transfusion are risk factors for acute kidney injury in critically ill trauma patients 
Journal of critical care  2012;27(5):496-504.
Purpose
Acute kidney injury (AKI) is a common source of morbidity after trauma. We sought to determine novel risk factors for AKI, by Acute Kidney Injury Network (AKIN) criteria, in critically ill trauma patients.
Materials and Methods
Prospective cohort study of 400 patients admitted to the ICU of a level one trauma center, followed for development of AKI over five days.
Results
AKI developed in 147/400 (36.8%) patients. In multivariable regression analysis, independent risk factors for AKI included African American race (OR 1.86; 95% CI 1.08, 3.18; p=0.024), body mass index ≥30 (OR 4.72 versus normal BMI; 95% CI 2.59, 8.61; p<0.001), diabetes mellitus (OR 3.26; 95% CI 1.30, 8.20; p=0.012), abdominal Abbreviated Injury Scale score ≥4 (OR 3.78; 95% CI 1.79, 7.96; p<0.001), and unmatched packed red blood cells administered during resuscitation (OR 1.13 per unit; 95% CI 1.04, 1.23; p=0.004). AKIN stages 1, 2, and 3 were associated with hospital mortality rates of 9.8%, 13.7%, and 30.4%, respectively, compared with 3.8% for those without AKI (p<0.001).
Conclusions
AKI in critically ill trauma patients is associated with substantial mortality. The findings of African American race, obesity, and blood product administration as independent risk factors for AKI deserve further study to elucidate underlying mechanisms.
doi:10.1016/j.jcrc.2012.02.002
PMCID: PMC3472045  PMID: 22591570
acute kidney injury; trauma; critical illness; race; obesity; transfusion; epidemiology; risk factors
12.  Estimating GFR Among Participants in the Chronic Renal Insufficiency Cohort (CRIC) Study 
Background
Glomerular filtration rate (GFR) is considered the best measure of kidney function, but repeated assessment is not feasible in most research studies.
Study Design
Cross-sectional study of 1,433 participants from the Chronic Renal Insufficiency Cohort (CRIC) Study (i.e., the GFR subcohort) to derive an internal GFR estimating equation using a split sample approach.
Setting & Participants
Adults from 7 US metropolitan areas with mild to moderate chronic kidney disease; 48% had diabetes and 37% were black.
Index Test
CRIC GFR estimating equation
Reference Test or Outcome
Urinary 125I-iothalamate clearance testing (measured GFR)
Other Measurements
Laboratory measures including serum creatinine and cystatin C, and anthropometrics
Results
In the validation dataset, the model that included serum creatinine, serum cystatin C, age, gender, and race was the most parsimonious and similarly predictive of mGFR compared to a model additionally including bioelectrical impedance analysis phase angle, CRIC clinical center, and 24-hour urinary creatinine excretion. Specifically, the root mean square errors for the two models were 0.207 and 0.202, respectively. The performance of the CRIC GFR estimating equation was most accurate among the subgroups of younger participants, men, non-blacks, non-Hispanics, those without diabetes, those with body mass index <30 kg/m2, those with higher 24-hour urine creatinine excretion, those with lower levels of high-sensitivity C-reactive protein, and those with higher mGFR.
Limitations
Urinary clearance of 125I-iothalamate is an imperfect measure of true GFR; cystatin C is not standardized to certified reference material; lack of external validation; small sample sizes limit analyses of subgroup-specific predictors.
Conclusions
The CRIC GFR estimating equation predicts measured GFR accurately in the CRIC cohort using serum creatinine and cystatin C, age, gender, and race. Its performance was best among younger and healthier participants.
doi:10.1053/j.ajkd.2012.04.012
PMCID: PMC3565578  PMID: 22658574
glomerular filtration rate (GFR); kidney function; GFR estimation
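As a point of reference for the model comparison above, the sketch below shows how a root mean square error of this kind is computed on a validation set. The toy values are illustrative placeholders rather than CRIC data, and the log-GFR error scale is an assumption about how such errors are typically reported for GFR estimating equations.

```python
# Minimal RMSE sketch for comparing GFR estimating equations on held-out data.
# The numbers are placeholders, not CRIC measurements.
import numpy as np

measured_gfr  = np.array([28.0, 41.0, 55.0, 63.0, 37.0])   # mGFR, ml/min/1.73 m^2
estimated_gfr = np.array([25.0, 45.0, 52.0, 70.0, 34.0])   # equation output for the same people

def rmse_log_scale(measured, estimated):
    """Root mean square error of the estimates on the natural-log scale (assumed)."""
    residuals = np.log(measured) - np.log(estimated)
    return float(np.sqrt(np.mean(residuals ** 2)))

print(round(rmse_log_scale(measured_gfr, estimated_gfr), 3))
```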
13.  Electronically-measured adherence to immunosuppressive medications and kidney function after deceased donor kidney transplantation* 
Clinical transplantation  2010;25(2):E124-E131.
Background
Non-adherence with immunosuppressive medications can result in allograft rejection and eventually allograft loss.
Methods
In a racially diverse population, we utilized microelectronic cap monitors to determine the association of adherence with a single immunosuppressive medication and kidney allograft outcomes post-transplantation. This prospective cohort study enrolled 243 patients from eight transplant centers to provide adherence and kidney allograft outcomes data. To determine the association of adherence with change in estimated glomerular filtration rate (eGFR), we fit mixed effects models with the outcome being change in eGFR over time. We also fit Cox proportional hazards models to determine the association of adherence with time to persistent 25% and 50% decline in eGFR.
Results
The distribution of adherence post-transplant was as follows: 164 (68%), 49 (20%) and 30 (12%) had >85–100%, 50–85% and <50% adherence, respectively. During follow-up, 79 (33%) of the subjects experienced a persistent 25% decline in eGFR or allograft loss, and 36 (15%) experienced a persistent 50% decline in eGFR or allograft loss. Adherence was not associated with acute rejection or with a persistent 25% or 50% decline in eGFR. In both the unadjusted and adjusted models, adherence and black race were not associated with change in eGFR over time.
Conclusions
Non-adherence with a single immunosuppressive medication was not associated with kidney allograft outcomes.
doi:10.1111/j.1399-0012.2010.01340.x
PMCID: PMC3566245  PMID: 20977496
kidney transplant; adherence; renal function
14.  Synthetic vascular hemodialysis access vs native arteriovenous fistula: A cost-utility analysis 
Annals of surgery  2012;255(1):181-186.
Objective
To determine the cost-effectiveness of two different vascular access strategies among incident dialysis patients.
Summary Background Data
Vascular access is a principal cause of morbidity and cost in hemodialysis patients. Recent guidelines and initiatives are intended to increase the proportion of patients with a fistula. However, there is growing awareness of the high prevalence of fistula failures and attendant complications.
Methods
A decision analysis using a Markov model was implemented to compare two different vascular access strategies among incident dialysis patients: a) placing an arteriovenous fistula (AVF1st) as the initial access followed by a synthetic vascular access if the AVF did not mature compared to b) placing a synthetic vascular access (SVA1st) as the initial access device. The cost-utility was evaluated across a range of the risk of complications from temporary catheters and SVA.
Results
Under base case assumptions, the AVF1st strategy yielded 2.19 QALYs compared to 2.06 QALYs from the SVA1st strategy. The incremental cost-effectiveness was $9389/QALY for AVF1st compared to SVA1st and was less than $50,000 per QALY as long as the probability of maturation is 36% or greater. AVF1st was the dominant strategy when the AVF maturation rate was 69% or greater.
Conclusion
The high risk of complications of temporary catheters and the overall low AVF maturation rate explain why a universal policy of AVF1st for all incident dialysis patients may not optimize clinical outcomes. Strong consideration should be given to a more patient-centered approach that takes into account the likelihood of AVF maturation.
doi:10.1097/SLA.0b013e31822f4e9b
PMCID: PMC3243807  PMID: 21918428
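To make the cost-utility arithmetic above concrete, here is a minimal sketch of an incremental cost-effectiveness ratio (ICER) calculation. The QALY values come from the abstract; the cost figures are hypothetical placeholders chosen only so the ratio lands near the reported $9,389/QALY, and the sketch does not reproduce the underlying Markov model.

```python
# Minimal ICER sketch for the AVF1st versus SVA1st comparison described above.
def icer(cost_new, cost_comparator, qaly_new, qaly_comparator):
    """Incremental cost per QALY gained for the new strategy versus the comparator."""
    return (cost_new - cost_comparator) / (qaly_new - qaly_comparator)

qaly_avf_first, qaly_sva_first = 2.19, 2.06        # QALYs from the abstract
cost_avf_first, cost_sva_first = 61_221, 60_000    # hypothetical costs (difference of $1,221)

ratio = icer(cost_avf_first, cost_sva_first, qaly_avf_first, qaly_sva_first)
print(f"${ratio:,.0f} per QALY")  # about $9,392/QALY with these placeholder costs
```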
16.  Assessment of variation in live donor kidney transplantation across transplant centers in the United States 
Transplantation  2011;91(12):1357-1363.
Background
Transplant centers vary in the proportion of kidney transplants performed using live donors. Clinical innovations that facilitate live donation may drive this variation.
Methods
We assembled a cohort of renal transplant candidates at 194 US centers using registry data from 1999 – 2005. We measured the magnitude of live donor kidney transplantation (LDKTx) by developing a standardized live donor transplant ratio (SLDTR) for each center that accounted for differences in center populations. We examined associations between center characteristics and the likelihood that individual transplant candidates underwent LDKTx. To identify practices through which centers increase LDKTx, we also examined center characteristics associated with consistently being in the upper three quartiles of SLDTR.
Results
The cohort comprised 148,168 patients, among whom 34,593 (23.3%) underwent LDKTx. In multivariable logistic regression, candidates had an increased likelihood of undergoing LDKTx at centers with greater use of “unrelated donors” (defined as non-spouses and non-first-degree family members of the recipient; OR 1.31 for highest versus lowest use, p=0.02) and at centers with programs to overcome donor-recipient incompatibility (OR 1.33, p=0.01). Centers consistently in the upper three SLDTR quartiles were also more likely to use “unrelated” donors (OR 8.30 per tertile of higher use, p<0.01), to have incompatibility programs (OR 4.79, p<0.01), and to use laparoscopic nephrectomy (OR 2.53 per tertile of higher use, p=0.02).
Conclusion
Differences in center population do not fully account for differences in the use of LDKTx. To maximize opportunities for LDKTx, centers may accept more unrelated donors and adopt programs to overcome biological incompatibility.
doi:10.1097/TP.0b013e31821bf138
PMCID: PMC3462439  PMID: 21562451
Live donor transplantation; center variation
17.  Longer-term outcomes after kidney transplantation from seronegative deceased donors at increased risk for blood-borne viral infection 
Transplantation  2011;91(11):1211-1217.
Background
Transmission of human immunodeficiency virus (HIV) and hepatitis C to transplant recipients has drawn attention to the use of allografts from seronegative donors at increased risk for viral infection (DIRVI).
Methods
We performed a cohort study of 7,803 kidney transplant recipients whose kidneys were recovered through one of two organ procurement organizations (OPO) from 1996 to 2007. Detailed OPO data on donor risk factors were linked to recipient data from the Organ Procurement and Transplantation Network.
Results
Median recipient follow-up was 3.9 years. 368 (5%) patients received DIRVI kidneys, a third of which were procured from donors with a history of injection drug use or commercial sex work. Compared to standard criteria kidney recipients, DIRVI kidney recipients were more likely to be HIV-positive or Black. In multivariable Cox regression, using DIRVI recipients as the reference, recipients of standard criteria donor kidneys had lower mortality (HR 0.71, p<0.01) and no difference in death-censored allograft failure (HR 1.09, p=0.62), whereas recipients of expanded criteria donor kidneys had no significant difference in mortality (HR 0.98, p=0.83), but a higher allograft failure rate (HR 1.93, p<0.01). High-quality data on post-transplant recipient viral testing were not available.
Conclusions
DIRVI kidney recipients experienced higher mortality than standard criteria kidney recipients. This finding could be explained if sicker patients received DIRVI kidneys (i.e., residual confounding) or the less likely possibility of undetected transmission of viral infections. Given the limitations of registry data used in this analysis, prospective studies are needed to further elucidate these findings.
doi:10.1097/TP.0b013e318218d59a
PMCID: PMC3462444  PMID: 21527872
Kidney transplantation; Viral transmission; Human immunodeficiency virus; High Risk Donor
18.  Relationship of Estimated GFR and Coronary Artery Calcification in the (CRIC) Chronic Renal Insufficiency Cohort Study 
Background
Coronary artery calcification (CAC) is associated with increased mortality risk in the general population. Although individuals with chronic kidney disease (CKD) are at markedly increased mortality risk, the incidence, prevalence, and prognosis of CAC in CKD is not well-understood.
Study Design
Cross-sectional observational study.
Setting and Participants
Analysis of 1,908 participants who underwent coronary calcium scanning as part of the multi-ethnic CRIC (Chronic Renal Insufficiency Cohort) Study.
Predictor
Estimated glomerular filtration rate (eGFR) computed using the Modification of Diet in Renal Disease (MDRD) Study equation, stratified by race, sex and diabetic status. eGFR was treated both as a continuous variable and as a categorical variable compared with the reference range of >60 ml/min/1.73 m2.
Measurements
CAC was detected using either an Imatron C-300 electron beam computed tomography scanner or a multi-detector CT scanner and was quantified using the Agatston score, treated as a categorical variable. Analyses were performed using ordinal logistic regression.
Results
We found a strong and graded relationship between lower eGFR and increasing CAC. In unadjusted models, ORs increased from 1.68 (95% CI, 1.23–2.31) for eGFR from 50–59 to 2.82 (95% CI, 2.06–3.85) for eGFR of <30. Multivariable adjustment only partially attenuated the results (OR, 1.53; 95% CI, 1.07–2.20) for eGFR<30.
Limitations
Use of eGFR rather than measured GFR.
Conclusions
We demonstrated a graded relationship between severity of CKD and CAC, independent of traditional risk factors. These findings support recent guidelines stating that, if vascular calcification is present, it should be considered a complementary component of the decision making required to individualize treatment of CKD.
doi:10.1053/j.ajkd.2011.04.024
PMCID: PMC3183168  PMID: 21783289
19.  Metabolic Syndrome, Components, and Cardiovascular Disease Prevalence in Chronic Kidney Disease: Findings from the Chronic Renal Insufficiency Cohort (CRIC) Study 
American Journal of Nephrology  2011;33(6):477-484.
Background/Aims
Metabolic syndrome may increase the risk for incident cardiovascular disease (CVD) and all-cause mortality in the general population. It is unclear whether, and to what degree, metabolic syndrome is associated with CVD in chronic kidney disease (CKD). We determined metabolic syndrome prevalence among individuals with a broad spectrum of kidney dysfunction, examining the role of the individual elements of metabolic syndrome and their relationship to prevalent CVD.
Methods
We evaluated four models to compare metabolic syndrome or its components to predict prevalent CVD using prevalence ratios in the Chronic Renal Insufficiency Cohort (CRIC) Study.
Results
Among 3,939 CKD participants, the prevalence of metabolic syndrome was 65% and there was a significant association with prevalent CVD. Metabolic syndrome was more common in diabetics (87.5%) compared with non-diabetics (44.3%). Hypertension was the most prevalent component, and increased triglycerides the least prevalent. Using the Bayesian information criterion, we found that a model treating the factors defining metabolic syndrome as a single interval-scaled variable was the best of the four models, both for CKD participants overall and for diabetics and non-diabetics separately.
Conclusion
The predictive value of this model for future CVD outcomes will subsequently be validated in longitudinal analyses.
doi:10.1159/000327618
PMCID: PMC3095834  PMID: 21525746
Cardiovascular disease; Chronic kidney disease; Chronic Renal Insufficiency Cohort (CRIC) Study; Metabolic syndrome
20.  Diuretics, calciuria and secondary hyperparathyroidism in the Chronic Renal Insufficiency Cohort 
Nephrology Dialysis Transplantation  2011;26(4):1258-1265.
Background. Secondary hyperparathyroidism is a common complication of chronic kidney disease (CKD) that is associated with bone disease, cardiovascular disease and death. Pathophysiological factors that maintain secondary hyperparathyroidism in advanced CKD are well-known, but early mechanisms of the disease that can be targeted for its primary prevention are poorly understood. Diuretics are widely used to control volume status and blood pressure in CKD patients but are also known to have important effects on renal calcium handling, which we hypothesized could alter the risk of secondary hyperparathyroidism.
Methods. We examined the relationship of diuretic treatment with urinary calcium excretion, parathyroid hormone (PTH) levels and prevalence of secondary hyperparathyroidism (PTH ≥ 65 pg/mL) in a cross-sectional study of 3616 CKD patients in the Chronic Renal Insufficiency Cohort.
Results. Compared with no diuretics, treatment with loop diuretics was independently associated with higher adjusted urinary calcium (55.0 versus 39.6 mg/day; P < 0.001), higher adjusted PTH [67.9, 95% confidence interval (CI) 65.2–70.7 pg/mL, versus 52.8, 95% CI 51.1–54.6 pg/mL, P < 0.001] and greater odds of secondary hyperparathyroidism (odds ratio 2.1; 95% CI 1.7–2.6). Thiazide monotherapy was associated with lower calciuria (25.5 versus 39.6 mg/day; P < 0.001) but only modestly lower PTH levels (50.0, 95% CI 47.8–52.3, versus 52.8, 95% CI 51.1–54.6 pg/mL, P = 0.04) compared with no diuretics. However, coadministration of thiazide and loop diuretics was associated with lower urinary calcium excretion (30.3 versus 55.0 mg/day; P < 0.001) and lower odds of hyperparathyroidism (odds ratio 1.3 versus 2.1; P for interaction = 0.05) compared with loop diuretics alone.
Conclusions. Loop diuretic use was associated with greater calciuria, higher PTH levels and greater odds of secondary hyperparathyroidism compared with no diuretic treatment. These associations were attenuated in patients who were coadministered thiazides. Diuretic choice is a potentially modifiable determinant of secondary hyperparathyroidism in CKD.
doi:10.1093/ndt/gfr026
PMCID: PMC3108352  PMID: 21382989
calciuria; chronic kidney disease; diuretics; parathyroid hormone
21.  Fibroblast Growth Factor 23 and Risks of Mortality and End-Stage Renal Disease in Patients with Chronic Kidney Disease 
Context
High levels of the phosphate regulating hormone, fibroblast growth factor 23 (FGF23), associate with mortality in patients with end-stage renal disease (ESRD), but little is known about its relationship with adverse outcomes in the much larger population of patients with earlier stages of chronic kidney disease (CKD).
Objective
Evaluate FGF23 as a risk factor for adverse outcomes in patients with CKD.
Design, Setting and Participants
A prospective study of 3,879 participants with CKD stages 2 – 4 who enrolled in the Chronic Renal Insufficiency Cohort between June 2003 and September 2008.
Main Outcome Measures
All-cause mortality and ESRD.
Results
At enrollment, mean estimated glomerular filtration rate (eGFR) was 42.8 ± 13.5 ml/min/1.73m2, and median FGF23 was 145 (interquartile range [IQR] 96 – 239) reference units/ml (RU/ml). During a median follow-up of 3.5 (IQR 2.5 – 4.4) years, 266 participants died (20.3/1000 person-years) and 410 reached ESRD (33.0/1000 person-years). Higher FGF23 levels independently associated with a greater risk of death in adjusted analyses of FGF23 on a continuous scale (hazard ratio [HR] per SD of lnFGF23, 1.5; 95%CI 1.3 – 1.7) or in quartiles (quartile 1, reference; quartile 2, HR 1.3; 95%CI 0.8 – 2.2; quartile 3, HR 2.0; 95%CI 1.2 – 3.3; quartile 4, HR 3.0; 95%CI 1.8 – 5.1). FGF23 was not independently associated with ESRD in adjusted analyses of the entire cohort, however, the effect was modified by eGFR (P for interaction = 0.005), which was the strongest predictor for ESRD. FGF23 independently associated with significantly greater risk of ESRD among participants with eGFR 30 – 44 (HR 1.3 per SD of lnFGF23; 95%CI 1.04 – 1.6) and ≥ 45 (HR 1.7; 95%CI 1.1 – 2.4), but not < 30 ml/min/1.73m2.
Conclusion
Elevated FGF23 is an independent risk factor for ESRD in patients with relatively preserved kidney function and for mortality across the spectrum of CKD.
doi:10.1001/jama.2011.826
PMCID: PMC3124770  PMID: 21673295
22.  FGF23 induces left ventricular hypertrophy 
The Journal of Clinical Investigation  2011;121(11):4393-4408.
Chronic kidney disease (CKD) is a public health epidemic that increases risk of death due to cardiovascular disease. Left ventricular hypertrophy (LVH) is an important mechanism of cardiovascular disease in individuals with CKD. Elevated levels of FGF23 have been linked to greater risks of LVH and mortality in patients with CKD, but whether these risks represent causal effects of FGF23 is unknown. Here, we report that elevated FGF23 levels are independently associated with LVH in a large, racially diverse CKD cohort. FGF23 caused pathological hypertrophy of isolated rat cardiomyocytes via FGF receptor–dependent activation of the calcineurin-NFAT signaling pathway, but this effect was independent of klotho, the coreceptor for FGF23 in the kidney and parathyroid glands. Intramyocardial or intravenous injection of FGF23 in wild-type mice resulted in LVH, and klotho-deficient mice demonstrated elevated FGF23 levels and LVH. In an established animal model of CKD, treatment with an FGF–receptor blocker attenuated LVH, although no change in blood pressure was observed. These results unveil a klotho-independent, causal role for FGF23 in the pathogenesis of LVH and suggest that chronically elevated FGF23 levels contribute directly to high rates of LVH and mortality in individuals with CKD.
doi:10.1172/JCI46122
PMCID: PMC3204831  PMID: 21985788
23.  Chronic Kidney Disease and Prevalent Atrial Fibrillation: The Chronic Renal Insufficiency Cohort (CRIC) 
American heart journal  2010;159(6):1102-1107.
Background
The epidemiology of atrial fibrillation (AF) has been mainly investigated in patients with end-stage renal disease (ESRD), with limited data on less advanced chronic kidney disease (CKD) stages.
Methods
A total of 3267 adult participants (50% non-Hispanic blacks, 46% females) with CKD from the Chronic Renal Insufficiency Cohort (CRIC) were included in this study. None of the study participants had been on dialysis. Those with self-identified race/ethnicity other than non-Hispanic black or white (N=323) or those without ECG data (N=22) were excluded. AF was ascertained by a 12-lead electrocardiogram (ECG) and self-report. Age-, sex-, and race/ethnicity-specific prevalence rates of AF were estimated and compared between subgroups. Cross-sectional associations and correlates of prevalent AF were examined using unadjusted and multivariable-adjusted logistic regression analysis.
Results
The mean estimated glomerular filtration rate (GFR) was 43.6 (±13.0) ml/min/1.73 m2. AF was present in 18% of the study population and in more than 25% of those 70 years or older. In multivariable-adjusted models, a 1-SD increase in age (11 years) [odds ratio (OR) and 95% CI: 1.27 (1.13, 1.43), P<0.0001], female sex [0.80 (0.65, 0.98), P=0.0303], smoking (former vs. never) [1.34 (1.08, 1.66), P=0.0081], history of heart failure [3.28 (2.47, 4.36), P<0.001], and history of cardiovascular disease [1.94 (1.56, 2.43), P<0.0001] were significantly associated with AF. Race/ethnicity, hypertension, diabetes, body mass index, physical activity, education, high sensitivity C-reactive protein, total cholesterol, and alcohol intake were not significantly associated with AF. An estimated GFR <45 ml/min/1.73 m2 was associated with AF in an unadjusted model [1.35 (1.13–1.62); P=0.0010], but not after multivariable adjustment [1.12 (0.92–1.35); P=0.2710].
Conclusions
Nearly one in five participants in CRIC, a national study of CKD, had evidence of AF at study entry, a prevalence similar to that reported among patients with ESRD and 2–3 times that reported in the general population. Risk factors for AF in this CKD population do not mirror those reported in the general population.
doi:10.1016/j.ahj.2010.03.027
PMCID: PMC2891979  PMID: 20569726
24.  Variability of Creatinine Measurements in Clinical Laboratories: Results from the CRIC Study 
American Journal of Nephrology  2010;31(5):426-434.
Objectives
Estimating equations using serum creatinine (SCr) are often used to assess glomerular filtration rate (GFR). Such creatinine (Cr)-based formulae may produce biased estimates of GFR when using Cr measurements that have not been calibrated to reference laboratories. In this paper, we sought to examine the degree of this variation in Cr assays across several laboratories associated with academic medical centers affiliated with the Chronic Renal Insufficiency Cohort (CRIC) Study, to consider how best to correct for this variation, and to quantify the impact of such corrections on eligibility for participation in CRIC. Variability of Cr is of particular concern in the conduct of CRIC, a large multicenter study of subjects with chronic renal disease, because eligibility for the study depends on Cr-based assessment of GFR.
Methods
A library of 5 large volume plasma specimens from apheresis patients was assembled, representing levels of plasma Cr from 0.8 to 2.4 mg/dl. Samples from this library were used for measurement of Cr at each of the 14 CRIC laboratories repetitively over time. We used graphical displays and linear regression methods to examine the variability in Cr, and used linear regression to develop calibration equations. We also examined the impact of the various calibration equations on the proportion of subjects screened as potential participants who were actually eligible for the study.
Results
There was substantial variability in Cr assays across laboratories and over time. We developed calibration equations for each laboratory; these equations varied substantially among laboratories and somewhat over time in some laboratories. The laboratory site contributed the most to variability (51% of the variance unexplained by the specimen) and variation with time accounted for another 15%. In some laboratories, calibration equations resulted in differences in eligibility for CRIC of as much as 20%.
Conclusions
The substantial variability in SCr assays across laboratories necessitates calibration of SCr measures to a common standard. Failing to do so may substantially affect study eligibility and clinical interpretations when they are determined by Cr-based estimates of GFR.
doi:10.1159/000296250
PMCID: PMC2883847  PMID: 20389058
Chronic renal disease; Creatinine measurements, variability; Chronic Renal Insufficiency Cohort (CRIC) Study; Glomerular filtration rate
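The calibration approach described above amounts to regressing each laboratory's creatinine results against a common reference and applying the fitted line to that laboratory's measurements. The sketch below assumes a simple slope-and-intercept calibration; the specimen values are illustrative placeholders, not the CRIC specimen library.

```python
# Minimal per-laboratory creatinine calibration sketch (illustrative values only).
import numpy as np

reference_cr = np.array([0.8, 1.2, 1.6, 2.0, 2.4])     # reference-laboratory values (mg/dl)
local_cr     = np.array([0.9, 1.35, 1.75, 2.15, 2.6])  # one site's mean assay results

# Regress the reference values on the local values: calibrated = a + b * local.
b, a = np.polyfit(local_cr, reference_cr, 1)

def calibrate(scr_local):
    """Map a site-specific serum creatinine onto the reference scale."""
    return a + b * scr_local

# A screening creatinine of 1.5 mg/dl at this site, re-expressed on the reference scale.
print(round(calibrate(1.5), 2))
```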
25.  Predictors of having a potential live donor: a prospective cohort study of kidney transplant candidates 
The barriers to live donor transplantation are poorly understood. We performed a prospective cohort study of individuals undergoing renal transplant evaluation. Participants completed a questionnaire that assessed clinical characteristics as well as knowledge and beliefs about transplantation. A participant satisfied the primary outcome if anyone contacted the transplant center to be considered as a live donor for that participant. The final cohort comprised 203 transplant candidates, among whom 80 (39.4%) had a potential donor contact the center and 19 (9.4%) underwent live donor transplantation. In multivariable logistic regression, younger candidates (OR 1.65 per 10 fewer years, p<0.01) and those with annual income ≥$15,000 (OR 4.22, p=0.03) were more likely to attract a potential live donor. Greater self-efficacy, a measure of the participant’s belief in his or her ability to attract a donor, was a predictor of having a potential live donor contact the center (OR 2.73 per point, p<0.01), while knowledge was not (p=0.56). The lack of association between knowledge and having a potential donor suggests that more intensive education of transplant candidates will not increase live donor transplantation. On the other hand, self-efficacy may be an important target in designing interventions to help candidates find live donors.
doi:10.1111/j.1600-6143.2009.02848.x
PMCID: PMC2864790  PMID: 19845584
live donor; kidney transplantation; health disparities
