Background. Human immunodeficiency virus (HIV)–infected individuals are at higher risk for chronic kidney disease than HIV-uninfected individuals. We investigated whether the inflammation present in treated HIV infection contributes to kidney dysfunction among HIV-infected men receiving highly active antiretroviral therapy.
Methods. The glomerular filtration rate (GFR) was directly measured (using iohexol) along with 12 markers of inflammation in Multicenter AIDS Cohort Study participants. Exploratory factor analysis was used to identify inflammatory processes related to kidney dysfunction. The estimated levels of these inflammatory processes were used in adjusted logistic regression analyses evaluating cross-sectional associations with kidney function outcomes.
Results. There were 434 HIV-infected men receiving highly active antiretroviral therapy and 200 HIV-uninfected men. HIV-infected men were younger (median age, 51 vs 53 years) and had higher urine protein-creatinine ratios (median, 98 vs 66 mg/g) but comparable GFRs (median, 109 vs 106 mL/min/1.73 m2). We identified an inflammatory process dominated by five markers: soluble tumor necrosis factor receptor 2, soluble interleukin 2 receptor α, soluble gp130, soluble CD27, and soluble CD14. An increase of 1 standard deviation in this inflammatory process was associated with significantly greater odds of GFR ≤90 mL/min/1.73 m2 (odds ratio, 2.0) and urine protein >200 mg/g (odds ratio, 2.3).
Conclusions. Higher circulating levels of immune activation markers among treated HIV-infected men may partially explain their higher burden of kidney dysfunction compared with uninfected men.
HIV infection; inflammatory markers; chronic kidney disease; glomerular filtration rate; immune activation
Fibroblast growth factor 23 (FGF23), an early marker of kidney dysfunction, is associated with cardiovascular death. Its role in HIV-positive individuals is unknown. We measured FGF23 in 100 HIV-negative and 191 HIV-positive nondiabetic adults with normal baseline estimated glomerular filtration rate (GFR). We measured GFR by iohexol annually, albumin-creatinine ratio (ACR) every 6 months, and pulse wave velocity, carotid plaque, and carotid intima-media thickness (IMT) at baseline and 2 years. Progressive albuminuria was defined as a follow-up ACR ≥2-fold higher than baseline and ≥30 mg/g. Regression models assessed associations of FGF23 with baseline factors and longitudinal changes in disease markers. FGF23 levels were similar by HIV serostatus. Among HIV-positive persons, factors independently associated with higher baseline FGF23 levels included female sex (adjusted ratio of geometric means [95% CI], 1.46 [1.21, 1.76]), serum phosphorus (1.20 [1.03, 1.40]), HCV infection (1.31 [1.10, 1.56]), and nonsuppressed HIV RNA (1.27 [1.01, 1.76]). At baseline, FGF23 was not associated with GFR, albuminuria, carotid plaque, or carotid IMT in adjusted cross-sectional analyses of HIV-positive individuals. However, higher baseline FGF23 was associated with progressive albuminuria (odds ratio, 1.48 [95% CI, 1.05, 2.08]) and a more rapid increase in IMT (13 μm/year; 95% CI, 3, 24). These findings suggest a role for FGF23 in identifying HIV-positive patients at greater risk for cardiovascular and kidney disease.
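The progressive-albuminuria definition used above (follow-up ACR at least 2-fold the baseline value and at least 30 mg/g) can be expressed as a simple predicate. This is an illustrative sketch only; the function name and example values are ours, not from the study:

```python
def progressive_albuminuria(baseline_acr: float, followup_acr: float) -> bool:
    """Study definition: follow-up ACR (mg/g) must be BOTH
    >= 2-fold the baseline ACR AND >= 30 mg/g."""
    return followup_acr >= 2 * baseline_acr and followup_acr >= 30.0

# Baseline 12 mg/g, follow-up 31 mg/g: doubled and above 30 mg/g
print(progressive_albuminuria(12.0, 31.0))  # True
# Baseline 20 mg/g, follow-up 35 mg/g: above 30 but not doubled
print(progressive_albuminuria(20.0, 35.0))  # False
```

Requiring both criteria prevents trivial doublings at very low ACR (e.g., 2 to 4 mg/g) from counting as progression.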
Anemia is common in chronic kidney disease (CKD) and associated with poor outcomes. In cross-sectional studies, lower estimated glomerular filtration rate (eGFR) has been associated with increased risk for anemia. The aim of this study was to determine how hematocrit changes as eGFR declines and what factors impact this longitudinal association.
We followed 1094 African-Americans with hypertensive nephropathy who participated in the African-American Study of Kidney Disease and Hypertension. Mixed effects models were used to determine longitudinal change in hematocrit as a function of eGFR. Interaction terms were used to assess for differential effects of age, gender, baseline eGFR, baseline proteinuria, malnutrition and inflammation on eGFR-associated declines in hematocrit. In sensitivity analyses, models were run using iGFR (measured by renal clearance of 125I-iothalamate) in place of eGFR.
At baseline, mean hematocrit was 39% and 441 (40%) individuals had anemia. The longitudinal relationship between eGFR and hematocrit differed by baseline eGFR and was steeper when baseline eGFR was <45 mL/min/1.73 m2. For example, the absolute decline in hematocrit per 10 mL/min/1.73 m2 decline in longitudinal eGFR was −3.7, −1.3 and −0.5% for baseline eGFR values of 20, 40 and 60 mL/min/1.73 m2, respectively (P < 0.001 comparing the longitudinal association between baseline eGFR = 40 or 60 versus baseline eGFR = 20 mL/min/1.73 m2). Similarly, male sex, younger age (<65 years) and higher baseline proteinuria (protein-to-creatinine ratio >0.22) were associated with greater hematocrit declines per unit decrease in longitudinal eGFR compared with female sex, older age and low baseline proteinuria, respectively (P-interaction <0.05 for each comparison). The longitudinal eGFR–hematocrit association did not differ by body mass index, serum albumin or C-reactive protein.
Men, younger individuals and those with low baseline eGFR (<45 mL/min/1.73 m2) or baseline proteinuria are particularly at risk for eGFR-related declines in hematocrit.
African-American Study of Kidney Disease and Hypertension (AASK); anemia; chronic kidney disease; hematocrit
HIV-infected African Americans who carry the APOL1 high-risk genotype had faster kidney function decline than persons with the low-risk genotype; this was most pronounced among those without sustained HIV suppression. Associations were not observed among those with sustained viral suppression.
Background. Existing data suggest that human immunodeficiency virus (HIV)-infected African Americans carrying 2 copies of the APOL1 risk alleles have greater risk of kidney disease than noncarriers. We sought to determine whether HIV RNA suppression mitigates APOL1-related kidney function decline among African Americans enrolled in the Multicenter AIDS Cohort Study.
Methods. We genotyped HIV-infected men for the G1 and G2 risk alleles and ancestry informative markers. Mixed-effects models were used to estimate the annual rate of estimated glomerular filtration rate (eGFR) decline, comparing men carrying 2 (high-risk) vs 0–1 risk allele (low-risk). Effect modification by HIV suppression status (defined as HIV type 1 RNA level <400 copies/mL for >90% of follow-up time) was evaluated using interaction terms and stratified analyses.
Results. Of the 333 African American men included in this study, 54 (16%) carried the APOL1 high-risk genotype. Among HIV-infected men with unsuppressed viral loads, those with the high-risk genotype had a 2.42 mL/minute/1.73 m2 (95% confidence interval [CI], −3.52 to −1.32) faster annual eGFR decline than men with the low-risk genotype. This association was independent of age, comorbid conditions, baseline eGFR, ancestry, and HIV-related factors. In contrast, the rate of decline was similar by APOL1 genotype among men with sustained viral suppression (−0.16 mL/minute/1.73 m2/year; 95% CI, −0.59 to 0.27; P for interaction <.001).
Conclusions. Unsuppressed HIV-infected African Americans with the APOL1 high-risk genotype experience an accelerated rate of kidney function decline; HIV suppression with antiretroviral therapy may reduce these deleterious renal effects.
HIV; antiretroviral therapy; genetic; kidney disease
APOL1 genotype is associated with advanced kidney disease in African-Americans, but the pathogenic mechanisms are unclear. Here, associations of APOL1 genotype with urine biomarkers of glomerular and tubular injury, and with kidney function decline, were evaluated.
Setting & Participants
431 HIV-infected African-American women enrolled in Women's Interagency HIV Study (WIHS).
Albumin-creatinine ratio (ACR), four tubular injury biomarkers (interleukin 18 [IL-18], kidney injury molecule 1 [KIM-1], neutrophil gelatinase-associated lipocalin [NGAL], and α1-microglobulin [α1m]), and kidney function estimated using the CKD-EPI cystatin C equation.
Participants were genotyped for APOL1 single-nucleotide polymorphisms rs73885319 (G1 allele) and rs71785313 (G2 allele). Urine biomarker levels were measured using stored samples from 1999-2000. Cystatin C was measured using serum collected at baseline and 4- and 8-year follow-up.
At baseline, ACR levels were higher among the 47 women with 2 APOL1 risk alleles versus the 384 women with 0/1 risk allele (median, 24 vs. 11 mg/g; p < 0.001). Compared to women with 0/1 risk allele, women with 2 risk alleles had 104% higher ACR (95% CI, 29%-223%) and 2-fold greater risk of ACR > 30 mg/g (95% CI, 1.17-3.44) after multivariable adjustment. APOL1 genotype showed little association with urine IL-18:Cr, KIM-1:Cr, and NGAL:Cr (estimates of -5% [95% CI, -24% to 18%], -20% [95% CI, -36% to 1%], and 10% [95% CI, -26% to 64%], respectively) or with detectable urine α1m (prevalence ratio, 1.13; 95% CI, 0.65-1.97) in adjusted analyses. Compared to women with 0/1 risk allele, women with 2 risk alleles had faster eGFR decline, by 1.2 (95% CI, -2.2 to -0.2) mL/min/1.73 m2 per year, and had 1.7- and 3.4-fold greater rates of incident chronic kidney disease (95% CI, 1.1-2.5) and 10% annual eGFR decline (95% CI, 1.7-6.7), respectively, with minimal attenuation after adjustment for glomerular and tubular injury biomarkers.
Results may not be generalizable to men.
Among HIV-infected African-American women, APOL1-associated kidney injury appears to localize to the glomerulus, rather than the tubules.
APOL1 genotype; risk variant; risk allele; G1 allele; G2 allele; single-nucleotide polymorphism (SNP); albumin-creatinine ratio (ACR); proteinuria; tubular injury biomarker; apolipoprotein L1; kidney disease; renal function; glomerular injury; African American; Women's Interagency HIV Study (WIHS)
Human immunodeficiency virus-infected individuals have benefited from improved viral suppression, but a discrepancy in end-stage renal disease risk between black and nonblack HIV-infected persons remains, in part due to continued disparities in antiretroviral use and viral suppression, and higher rates of comorbidities.
Background. Human immunodeficiency virus (HIV)-infected adults, particularly those of black race, are at high-risk for end-stage renal disease (ESRD), but contributing factors are evolving. We hypothesized that improvements in HIV treatment have led to declines in risk of ESRD, particularly among HIV-infected blacks.
Methods. Using data from the North American AIDS Cohort Collaboration for Research and Design from January 2000 to December 2009, we validated 286 incident ESRD cases using abstracted medical evidence of dialysis (lasting >6 months) or renal transplant. A total of 38 354 HIV-infected adults aged 18–80 years contributed 159 825 person-years (PYs). Age- and sex-standardized incidence ratios (SIRs) were estimated by race. Poisson regression was used to identify predictors of ESRD.
Results. HIV-infected ESRD cases were more likely to be of black race, have diabetes mellitus or hypertension, inject drugs, and/or have a prior AIDS-defining illness. The overall SIR was 3.2 (95% confidence interval [CI], 2.8–3.6) but was significantly higher among black patients (4.5 [95% CI, 3.9–5.2]). ESRD incidence declined from 532 to 303 per 100 000 PYs and 138 to 34 per 100 000 PYs over the time period for blacks and nonblacks, respectively, coincident with notable increases in both the prevalence of viral suppression and the prevalence of ESRD risk factors including diabetes mellitus, hypertension, and hepatitis C virus coinfection.
Conclusions. The risk of ESRD remains high among HIV-infected individuals in care but is declining with improvements in virologic suppression. HIV-infected black persons continue to comprise the majority of cases, as a result of higher viral loads, comorbidities, and genetic susceptibility.
end-stage renal disease (ESRD); chronic kidney disease (CKD); HIV infection/AIDS; HIV/AIDS; glomerular filtration rate (GFR)
Focal segmental glomerulosclerosis (FSGS) recurs after kidney transplantation in more than 30% of cases and can lead to allograft loss. Serum soluble urokinase-type plasminogen activator receptor (suPAR) is implicated in the pathogenesis of native and recurrent FSGS.
We conducted a retrospective study of 25 adults with post-transplant FSGS. We investigated the relationship between suPAR levels and podocyte changes and the impact of therapy on podocyte structure. We assessed response to therapy by improvement in proteinuria, allograft function and resolution of histologic changes.
A median of 15 (interquartile range, 10–23) plasmapheresis sessions was administered; 13 of the subjects also received rituximab. Median pre-treatment suPAR levels were higher among those with severe (≥75%) versus those with mild (≤25%) podocyte foot process effacement (13,030 vs. 4,806 pg/mL; P=0.02). Overall, with therapy, mean ± standard deviation proteinuria improved from 5.1 ± 3.8 to 2.1 ± 2.8 mg/dL (P=0.003), mean podocyte effacement decreased from 57 ± 33% to 22 ± 22% (P=0.0001), median (interquartile range) estimated glomerular filtration rate increased from 32.9 (20.6–44.2) to 39.3 (28.8–63.4) mL/min/1.73 m2 (P<0.0001), and suPAR levels decreased from a median of 6,781 to 4,129 pg/mL (P=0.02).
Podocyte effacement is the first pathological manifestation of FSGS post-transplant. The degree of podocyte effacement correlates with suPAR levels at time of diagnosis. Response to therapy results in significant reduction of suPAR levels and complete or significant improvement of podocyte effacement.
kidney transplant; podocyte effacement; FSGS; suPAR; rituximab
Serum albumin concentrations are a strong predictor of mortality and cardiovascular disease in HIV-infected individuals. We studied the longitudinal associations between serum albumin levels and kidney function decline in a population of HIV-infected women.
Retrospective cohort analysis.
Setting & Participants
The study participants were recruited from the Women’s Interagency HIV Study (WIHS), a large observational study designed to understand risk factors for the progression of HIV infection in women living in urban communities. 908 participants had baseline assessment of kidney function and two follow-up measures over an average of 8 years.
The primary predictor was serum albumin concentration.
We examined annual change in kidney function. Secondary outcomes included rapid kidney function decline and incident reduced estimated glomerular filtration rate (eGFR).
Kidney function decline was determined by cystatin C–based eGFR (eGFRcys) and creatinine-based eGFR (eGFRcr) at baseline and follow-up. Each model was adjusted for kidney disease and HIV-related risk factors using linear and relative risk regression.
After multivariable adjustment, each 0.5-g/dL decrement in baseline serum albumin concentration was associated with a 0.56-mL/min/1.73 m2 faster annual decline in eGFRcys (P<0.001), which was only slightly attenuated to 0.55 mL/min/1.73 m2 after adjustment for albuminuria. Results were similar whether using eGFRcys or eGFRcr. In adjusted analyses, each 0.5-g/dL lower baseline serum albumin was associated with a 1.71-fold greater risk of rapid kidney function decline (P<0.001) and a 1.72-fold greater risk of incident reduced eGFR (P<0.001).
The cohort is composed of only female participants from urban communities within the United States.
Lower levels of serum albumin were strongly associated with kidney function decline and incident reduced eGFR in HIV-infected women, independent of HIV disease status, BMI and albuminuria.
albumin; kidney function; HIV; incident reduced eGFR; albuminuria; disease trajectory; chronic kidney disease (CKD) progression
Natural history studies suggest increased risk for kidney function decline with HIV infection, but few studies have made comparisons with HIV-uninfected women. We examined whether HIV infection treated with highly active antiretroviral therapy (HAART) remains associated with faster kidney function decline in the Women's Interagency HIV Study. HIV-infected women initiating HAART with (n=105) or without (n=373) tenofovir (TDF) were matched to HIV-uninfected women on calendar time and length of follow-up, age, systolic blood pressure, hepatitis C antibody serostatus, and diabetes history. Linear mixed models were used to evaluate differences in annual estimated glomerular filtration rate (eGFR). Person-visits totaled 4,741 and 11,512 for the TDF-treated and non-TDF-treated analyses, respectively. Mean baseline eGFRs were higher among women initiated on TDF-containing HAART and lower among those on TDF-sparing HAART compared to their respective HIV-uninfected matches (p<0.05 for both). HIV-infected women had annual rates of eGFR change similar to HIV-uninfected matches (p-interaction >0.05 for both). Adjusting for baseline eGFR, mean eGFRs at 1 and 3 years of follow-up among women initiated on TDF-containing HAART were lower than those of their uninfected matches (−4.98 and −4.26 ml/min/1.73 m2, respectively; p<0.05 for both). Mean eGFR of women initiated on TDF-sparing HAART was lower versus uninfected matches at 5 years (−2.19 ml/min/1.73 m2, p=0.03). HAART-treated HIV-infected women had lower mean eGFRs at follow-up but experienced rates of annual eGFR decline similar to HIV-uninfected women. Tenofovir use in HIV-infected women with normal kidney function did not accelerate long-term kidney function decline relative to HIV-uninfected women.
Proteinuria is associated with adverse clinical outcomes in HIV infection. Here we evaluated whether APOL1 risk alleles, previously associated with advanced kidney disease, are independently associated with proteinuria in HIV infection in a cross-sectional study of HIV-infected women in the Women's Interagency HIV Study. We estimated the percent difference in urine protein excretion and odds of proteinuria (200 mg/g and higher) associated with two versus one or no APOL1 risk allele using linear and logistic regression, respectively. Of 1285 women successfully genotyped, 379 carried one and 80 carried two risk alleles. Proteinuria was present in 124 women, 78 of whom had proteinuria confirmed on a second sample. In women without prior AIDS, two risk alleles were independently associated with 69% higher urine protein excretion (95% CI: 36%, 108%) and 5-fold higher odds of proteinuria (95% CI: 2.45, 10.37) versus one or no risk allele. No association was found in women with prior AIDS. Analyses in which women with impaired kidney function were excluded and proteinuria was confirmed by a second urine sample yielded similar estimates. Thus, APOL1 risk alleles are associated with significant proteinuria in HIV-infected persons without prior clinical AIDS, independent of clinical factors traditionally associated with proteinuria. Trials are needed to determine whether APOL1 genotyping identifies individuals who could benefit from earlier intervention to prevent overt renal disease.
Due to the improved longevity afforded by combination antiretroviral therapy (cART), HIV-infected individuals are developing several non-AIDS-related comorbid conditions. Consequently, medical management of the HIV-infected population is increasingly complex, with a growing list of potential drug-drug interactions (DDIs). This article reviews some of the most relevant and emerging potential interactions between antiretroviral medications and other agents. The most common DDIs are those involving protease inhibitors or non-nucleoside reverse transcriptase inhibitors, which alter the cytochrome P450 enzyme system and/or drug transporters such as P-glycoprotein. Of note are the new agents for the treatment of chronic hepatitis C virus infection. These new drug classes, along with other drugs increasingly used in this patient population, represent a significant challenge to achieving the goals of effective HIV suppression and minimization of drug-related toxicities. Awareness of DDIs and a multidisciplinary approach are imperative in reaching these goals.
drug-drug interactions; HIV; antiretroviral; hepatitis C virus; therapeutic drug monitoring
Higher left ventricular (LV) mass strongly predicts cardiovascular mortality in hemodialysis patients. Although several parameters of preload and afterload have been associated with higher LV mass, whether these parameters independently predict LV mass remains unclear.
This study examined a cohort of 391 adults with incident hemodialysis enrolled in the Predictors of Arrhythmic and Cardiovascular Risk in End Stage Renal Disease (PACE) study. The main exposures were systolic and diastolic blood pressure (BP), pulse pressure, arterial stiffness by pulse wave velocity (PWV), volume status estimated by pulmonary pressures using echocardiogram and intradialytic weight gain. The primary outcome was baseline left ventricular mass index (LVMI).
Systolic blood pressure (BP), diastolic BP, and pulse pressure were each significantly associated with LVMI by linear regression, regardless of whether dialysis-unit or non-dialysis-day BP measurements were used. Adjusting for cardiovascular confounders, every 10 mmHg increase in systolic or diastolic BP was significantly associated with higher LVMI (SBP β = 7.26, 95% CI: 4.30, 10.23; DBP β = 10.05, 95% CI: 5.06, 15.04), and increased pulse pressure was also associated with higher LVMI (β = 0.71, 95% CI: 0.29, 1.13). Intradialytic weight gain was also associated with higher LVMI, although the association was attenuated after adjustment (β = 3.25, 95% CI: 0.67, 5.83). PWV and pulmonary pressures were not associated with LVMI after multivariable adjustment (β = 0.19, 95% CI: −1.14, 1.79; and β = 0.10, 95% CI: −0.51, 0.70, respectively). Simultaneously adjusting for all main exposures demonstrated that higher BP was independently associated with higher LVMI (SBP β = 5.64, 95% CI: 2.78, 8.49; DBP β = 7.29, 95% CI: 2.26, 12.31, for every 10 mmHg increase in BP).
Among a younger, incident hemodialysis population, higher systolic, diastolic, or pulse pressure, regardless of timing with dialysis, was most strongly associated with higher LV mass. Future studies should consider using various BP measures in examining the impact of BP on LVM and cardiovascular disease. Findings from such studies could suggest that high BP should be treated more aggressively to promote LVH regression in incident hemodialysis patients.
Electronic supplementary material
The online version of this article (doi:10.1186/s12882-015-0131-4) contains supplementary material, which is available to authorized users.
Proteinuria occurs commonly among HIV-infected and -uninfected injection drug users (IDUs) and is associated with increased mortality risk. Vitamin D deficiency, highly prevalent among IDUs and potentially modifiable, may contribute to proteinuria. To determine whether vitamin D is associated with proteinuria in this population, we conducted a cross-sectional study in the AIDS Linked to the IntraVenous Experience (ALIVE) Study.
25(OH)-vitamin D levels were measured in 268 HIV-infected and 614 HIV-uninfected participants. The association between vitamin D deficiency (<10 ng/mL) and urinary protein excretion was evaluated by linear regression. The odds of persistent proteinuria (urine protein-to-creatinine ratio >200 mg/g on two occasions) associated with vitamin D deficiency was examined using logistic regression.
One-third of participants were vitamin D-deficient. Vitamin D deficiency was independently associated with higher urinary protein excretion (P<0.05) among HIV-infected and diabetic IDUs (P-interaction<0.05 for both). Persistent proteinuria occurred in 18% of participants. Vitamin D deficiency was associated with >6-fold odds of persistent proteinuria among diabetic IDUs (odds ratio [OR]=6.29, 95% confidence interval [CI]: 1.54, 25.69), independent of sociodemographic characteristics, comorbid conditions, body mass index, and impaired kidney function (estimated GFR <60 mL/min/1.73 m2); no association, however, was observed among non-diabetic IDUs (OR=1.06, 95% CI: 0.64, 1.76) (P-interaction<0.05).
Vitamin D deficiency was associated with higher urinary protein excretion among those with HIV infection and diabetes. Vitamin D deficiency was independently associated with persistent proteinuria among diabetic IDUs, although not in non-diabetic persons. Whether vitamin D repletion ameliorates proteinuria in these patients requires further study.
Vitamin D deficiency; proteinuria; HIV; injection drug use; diabetes
Tenofovir disoproxil fumarate (TDF) may cause acute kidney injury and proximal tubular dysfunction. However, no detailed studies document urinary phosphate wasting as a marker of TDF-induced tubulopathy.
Records of HIV-infected patients with presumed TDF toxicity were reviewed. We describe the characteristics and clinical course of 15 patients who had documented elevated (>20%) fractional excretion of phosphate (FEphos).
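The abstract does not spell out how FEphos is computed; the standard fractional-excretion formula, sketched below, is the usual approach. The function name and example values are ours, not from the study:

```python
def fractional_excretion_phosphate(urine_phos: float, serum_phos: float,
                                   urine_cr: float, serum_cr: float) -> float:
    """Standard fractional excretion of phosphate, in percent:
    (urine phosphate x serum creatinine) /
    (serum phosphate x urine creatinine) x 100.
    Units must match within each analyte (e.g., all mg/dL)."""
    return (urine_phos * serum_cr) / (serum_phos * urine_cr) * 100

# Hypothetical spot values: urine PO4 40, serum PO4 3,
# urine Cr 60, serum Cr 1 (all mg/dL)
fe = fractional_excretion_phosphate(40.0, 3.0, 60.0, 1.0)
print(round(fe, 1))  # 22.2 -> above the 20% threshold used in the study
```

Because the creatinine terms normalize for urine concentration, a single spot urine and simultaneous serum sample suffice; values above roughly 20% in the setting of normal or low serum phosphate suggest renal phosphate wasting.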
Patients were predominantly Caucasian and male (73% and 80%, respectively), with a mean age of 56 years (range, 38–76). Of the 15 patients, 11 had an estimated glomerular filtration rate (eGFR) >90 mL/min/1.73 m2 at the time of TDF initiation. The mean duration of TDF therapy prior to diagnosis of TDF toxicity was 64 months. Mean FEphos was 34% (range, 20–62%). The mean eGFR at TDF initiation was 104 mL/min/1.73 m2 [standard deviation (SD), 17.0], with a gradual decline to 69 mL/min/1.73 m2 (SD, 19.0) by the time of TDF discontinuation. Of the 10 patients with repeat FEphos measurements after TDF discontinuation, 9 had improvement in their FEphos; of these, 6 had normalization of their FEphos. Estimated GFR improved in 12 patients after discontinuation of TDF, though importantly, none returned to their baseline eGFR.
Urinary phosphate wasting is a sensitive marker for TDF-induced proximal tubulopathy and is associated with unrecognized and permanent renal function decline. Tubular dysfunction can develop after years of TDF therapy in those with normal kidney function at the time of drug initiation, suggesting that continued vigilance should be maintained in all patients receiving TDF.
HIV; kidney injury; phosphate wasting; proximal tubular dysfunction; tenofovir
Sudden cardiac death occurs commonly in the end-stage renal disease population receiving dialysis, with 25% dying of sudden cardiac death over 5 years. Despite this high risk, surprisingly few prospective studies have studied clinical- and dialysis-related risk factors for sudden cardiac death and arrhythmic precursors of sudden cardiac death in end-stage renal disease.
We present a brief summary of the risk factors for arrhythmias and sudden cardiac death in persons with end-stage renal disease as the rationale for the Predictors of Arrhythmic and Cardiovascular Risk in End Stage Renal Disease (PACE) study, a prospective cohort study of patients recently initiated on chronic hemodialysis, with the overall goal of understanding arrhythmic and sudden cardiac death risk. Participants were screened for eligibility and excluded if they already had a pacemaker or an automatic implantable cardioverter-defibrillator. We describe the study aims, design, and data collection for 574 incident hemodialysis participants from the Baltimore region of Maryland, USA. Participants were recruited from 27 hemodialysis units and underwent detailed clinical, dialysis, and cardiovascular evaluation at baseline and follow-up. Cardiovascular phenotyping was conducted on nondialysis days, with signal-averaged electrocardiogram, echocardiogram, pulse wave velocity, ankle-brachial index, and cardiac computed tomography and angiography conducted at baseline. Participants were followed annually with study visits, including electrocardiogram, pulse wave velocity, and ankle-brachial index, for up to 4 years. A biorepository of serum, plasma, DNA, RNA, and nail samples was collected to study genetic and serologic factors associated with disease.
Studies of modifiable risk factors for sudden cardiac death will help set the stage for clinical trials to test therapies to prevent sudden cardiac death in this high-risk population.
Dialysis; Hemodialysis; Mortality; Sudden death; Sudden cardiac death; Arrhythmia; End stage renal disease
Early recognition and management of chronic kidney disease (CKD) are associated with better outcomes. Internal medicine residency should prepare physicians to diagnose and manage CKD.
To examine whether residency training and program characteristics were associated with CKD knowledge and to investigate the effectiveness of an internet-based training module in improving CKD knowledge, we analyzed data from CKD training modules administered annually to U.S. internal medicine residents from July 1, 2005 to June 30, 2009. Baseline CKD knowledge was assessed using pre-tests. The modules' effectiveness was evaluated by post-tests. Comparisons were performed using chi-square tests and paired t-tests.
Of 4,702 residents, 38%, 33%, and 29% were program year (PGY)-1, PGY-2, and PGY-3, respectively. Baseline CKD knowledge was poor, with mean pre-test scores of 45.1%–57.0% across the four years. The lowest pre-test performance was on CKD recognition. Pre-test scores were better with higher training levels (P-trend < 0.001, except 2005–2006 [P-trend = 0.35]). Affiliation with a renal fellowship program or program location within a region of high end-stage kidney disease prevalence was not associated with better baseline CKD knowledge. Completion of the CKD module led to significant improvements from pre- to post-test scores (mean improvement, 27.8% [SD, 21.3%]), which were consistent from 2005 to 2009.
Knowledge of diagnosis and management of CKD improves during residency training but remains poor among graduating residents. Web-based training can be effective in educating physicians on CKD-related issues. Studies are needed to determine whether knowledge gained from such an intervention translates to improved care of CKD patients.
Kidney disease; Education; Internet; Primary care
Cystatin C has been proposed as an alternative marker of kidney function among HIV-infected persons in whom serum creatinine is affected by extra-renal factors.
In this cross-sectional study, we compared estimated glomerular filtration rates (eGFR) using serum creatinine versus cystatin C between 150 HIV-uninfected and 783 HIV-infected men. We evaluated the prevalence of chronic kidney disease (CKD; eGFR<60 mL/min/1.73 m2) and examined the influence of extra-renal factors on GFR-estimates among HIV-infected men.
eGFRSCr was similar by HIV serostatus, but eGFRCysC was lower in HIV-infected men. A higher proportion of HIV-infected men were classified as having CKD when using eGFRCysC versus eGFRSCr (7% vs. 5%, P<0.01). In HIV-infected individuals without CKD, eGFRSCr was higher than eGFRCysC, while it was lower than eGFRCysC in persons with CKD. In HIV-infected men, older age, proteinuria, and prior clinical AIDS were inversely associated with both GFR estimates. Higher serum albumin levels and ACE-inhibitor/ARB use were associated with lower eGFRSCr. HIV viral load, hepatitis C coinfection, and serum alkaline phosphatase were inversely associated with eGFRCysC.
Among HIV-uninfected and HIV-infected men with similar social risk behaviors, GFR estimates differed by biomarker and kidney function level. eGFRCysC classified a larger proportion of HIV-infected men as having CKD compared to eGFRSCr. Differences between these GFR-estimating methods may be due to the effects of extra-renal factors on serum creatinine and cystatin C. Until GFR-estimating equations are validated among HIV-infected individuals, current GFR estimates based on these biomarkers should be interpreted with care in this patient population.
HIV; kidney disease; serum creatinine; cystatin C; glomerular filtration rate; Multicenter AIDS Cohort Study
Diabetes and hypertension, common conditions in antiretroviral therapy (ART)-treated HIV-infected individuals, are associated with glomerular hyperfiltration, which precedes the onset of proteinuria and accelerated kidney function decline. In the Multicenter AIDS Cohort Study, we examined the extent to which hyperfiltration is present and associated with metabolic, cardiovascular, HIV and treatment risk factors among HIV-infected men.
Cross-sectional cohort using direct measurement of glomerular filtration rate (GFR) by iohexol plasma clearance for 367 HIV-infected men and 241 HIV-uninfected men who were free of CKD.
Hyperfiltration was defined as GFR >140 mL/min/1.73 m2, with the threshold lowered by 1 mL/min/1.73 m2 for each year of age over 40. Multivariate logistic regression was used to estimate the odds ratios (OR) of prevalent hyperfiltration for metabolic, cardiovascular, HIV and cumulative ART exposure factors.
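The age-adjusted definition above can be written as a simple check; the function names are illustrative, and we assume the threshold stays flat at 140 below age 40, which the definition implies but does not state:

```python
def hyperfiltration_threshold(age):
    # 140 mL/min/1.73 m2, lowered by 1 for each year of age over 40
    # (assumed flat at 140 for ages 40 and under)
    return 140 - max(age - 40, 0)

def is_hyperfiltering(gfr, age):
    # Hyperfiltration: measured GFR above the age-adjusted cutoff
    return gfr > hyperfiltration_threshold(age)
```

For example, a 50-year-old with an iohexol-measured GFR of 135 mL/min/1.73 m2 exceeds his cutoff of 130 and would be classified as hyperfiltering, while a 30-year-old with the same GFR would not.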
Among subjects without CKD, the prevalence of hyperfiltration was higher for HIV-infected participants (25%) compared to uninfected participants (17%; p=0.01). HIV infection was associated with hyperfiltration (OR: 1.70, 95% CI: 1.11, 2.61) and modified the association between diabetes and hyperfiltration, such that the association among HIV-uninfected men (OR: 2.56, 95% CI: 1.33, 5.54) was not observed among HIV-infected men (OR: 1.19, 95% CI: 0.69, 2.05). These associations were independent of known risk factors for hyperfiltration. Indicators of hyperglycemia and hypertension were also associated with hyperfiltration, as was cumulative zidovudine exposure.
Hyperfiltration, a potential modifiable predictor of kidney disease progression, is common among ART-treated HIV-infected men. HIV infection is associated with significant odds of hyperfiltration in addition to known risk factors for kidney damage.
Glomerular hyperfiltration; glomerular filtration rate; HIV; antiretroviral therapy; iohexol
In the early highly active antiretroviral therapy (HAART) era, kidney dysfunction was strongly associated with death among HIV-infected individuals. We re-examined this association in the later HAART period to determine whether chronic kidney disease (CKD) remains a predictor of death after HAART-initiation.
To evaluate the effect of kidney function at the time of HAART initiation on time to all-cause mortality, we evaluated 1415 HIV-infected women initiating HAART in the Women’s Interagency HIV Study (WIHS). Multivariable proportional hazards models with survival times calculated from HAART initiation to death were constructed; participants were censored at the time of the last available visit or December 31, 2006.
CKD (eGFR <60 ml/min/1.73 m2) at HAART initiation was associated with higher mortality risk adjusting for age, race, hepatitis C serostatus, AIDS history and CD4+ cell count (hazard ratio [HR]=2.23, 95% confidence interval [CI]: 1.45–3.43). Adjustment for hypertension and diabetes history attenuated this association (HR=1.89, CI: 0.94–3.80). Lower kidney function at HAART initiation was weakly associated with increased mortality risk in women with prior AIDS (HR=1.09, CI: 1.00–1.19, per 20% decrease in eGFR).
Kidney function at HAART initiation remains an independent predictor of death in HIV-infected individuals, especially in those with a history of AIDS. Our study emphasizes the necessity of monitoring kidney function in this population. Additional studies are needed to determine mechanisms underlying the increased mortality risk associated with CKD in HIV-infected persons.
kidney disease; mortality; HIV; WIHS; antiretroviral therapy
The Model for End-Stage Liver Disease (MELD) score incorporates serum creatinine and was introduced to facilitate allocation of orthotopic liver transplantation (LT). Our objective was to determine the impact of MELD and kidney function on all-cause mortality. Among LTs performed in a tertiary referral hospital between 1995 and 2009, 419 cases were studied. Cox proportional hazards models were constructed to estimate the hazard ratios (HR) and 95% confidence intervals (CI) for death. Over mean follow-ups of 8.4 and 3.1 years during the pre-MELD and MELD eras, 57 and 63 deaths were observed, respectively. Those transplanted during the MELD era had a higher likelihood of hepatorenal syndrome (8% vs 2%, P < 0.01), lower kidney function (median estimated glomerular filtration rate [eGFR] 77.8 vs 92.6 mL/min/1.73 m2, P < 0.01), and more pretransplantation renal replacement therapy (RRT) (5% vs 1%; P < 0.01). All-cause mortality risk was similar in the MELD vs the pre-MELD era (HR: 0.98, 95% CI: 0.58–1.65). The risk of death, however, was nearly 3-fold greater (95% CI: 1.14–6.60) among those requiring pre-transplant RRT. Similarly, eGFR < 60 mL/min/1.73 m2 post-transplant was associated with a 2.5-fold higher mortality (95% CI: 1.48–4.11). The study suggests that MELD implementation had no impact on all-cause mortality post-LT. However, the need for pre-transplant RRT and post-transplant kidney dysfunction was associated with a more than 2-fold greater risk of subsequent death.
eGFR; mortality; MELD; liver transplant
Background. Anaemia worsens as kidney function declines. Both conditions are associated with increased mortality. Serum cystatin C is purportedly a more sensitive marker of kidney disease and a better predictor of mortality than serum creatinine. However, studies suggest that extrarenal factors also influence cystatin C levels.
Methods. We determined whether estimates of glomerular filtration rate [estimated glomerular filtration rate (eGFR)] based on serum cystatin C alone or in combination with serum creatinine were superior to those based on serum creatinine in recognizing impaired kidney function in the setting of anaemia in a sub-sample of the Third National Health and Nutrition Examination Survey of the USA consisting of 6734 participants, 20 years or older.
Results. The prevalence of moderate to severe kidney disease (eGFR 15–59 mL/min/1.73 m2) among anaemic persons was 15–16% when based on serum creatinine alone (eGFRSCR) or combined with cystatin C (eGFRSCR + CYSC); this estimate increased to nearly 25% when kidney function was estimated by cystatin C (eGFRCYSC). The adjusted odds ratios of kidney disease in anaemic versus non-anaemic persons were slightly higher with eGFRCYSC than eGFRSCR and eGFRSCR + CYSC in younger adults [odds ratio (OR) = 5.22, 95% confidence interval (CI): 2.23, 12.17], women (OR = 5.34, 95% CI: 2.36, 12.06) and those with elevated C-reactive protein (CRP) (OR = 7.36, 95% CI: 1.98, 27.36).
Conclusions. Impaired kidney function was common in individuals with anaemia. Among anaemic individuals, the prevalence estimate for kidney disease was notably higher when kidney function was estimated by cystatin C alone compared with the estimations by serum creatinine alone or in combination with serum cystatin C. eGFRCYSC may be particularly helpful in identifying kidney disease in the setting of anaemia among younger persons, women and those with elevated CRP. Regardless of which renal biomarker is used, our study suggests that an evaluation for underlying kidney disease should be considered in the standard workup of anaemia.
anaemia; chronic kidney failure; creatinine; cystatin C; glomerular filtration rate
Background. The role of active hepatitis C virus (HCV) replication in chronic kidney disease (CKD) risk has not been clarified.
Methods. We compared CKD incidence in a large cohort of HIV-infected subjects who were HCV seronegative, HCV viremic (detectable HCV RNA), or HCV aviremic (HCV seropositive, undetectable HCV RNA). Stages 3 and 5 CKD were defined according to standard criteria. Progressive CKD was defined as a sustained 25% glomerular filtration rate (GFR) decrease from baseline to a GFR < 60 mL/min/1.73 m2. We used Cox models to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs).
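The progressive-CKD criterion above reduces to a check on a follow-up measurement against baseline. How "sustained" was operationalized across visits is not specified in the abstract, so this sketch (function name ours) tests only a single follow-up value:

```python
def meets_progressive_ckd(baseline_gfr, follow_up_gfr):
    """Progressive CKD per the stated definition: a 25% GFR decrease
    from baseline reaching GFR < 60 mL/min/1.73 m2.

    Illustrative only; the abstract's 'sustained' requirement over
    repeated measurements is not modeled here.
    """
    # at least a 25% decrease from baseline AND below the stage 3 cutoff
    return follow_up_gfr <= 0.75 * baseline_gfr and follow_up_gfr < 60
```

Note that both conditions must hold: a drop from 70 to 55 mL/min/1.73 m2 crosses the 60 cutoff but is less than a 25% decrease, so it would not qualify.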
Results. A total of 52 602 HCV seronegative, 9508 HCV viremic, and 913 HCV aviremic subjects were included. Compared with HCV seronegative subjects, HCV viremic subjects were at increased risk for stage 3 CKD (adjusted HR 1.36 [95% CI, 1.26, 1.46]), stage 5 CKD (1.95 [1.64, 2.31]), and progressive CKD (1.31 [1.19, 1.44]), while HCV aviremic subjects were also at increased risk for stage 3 CKD (1.19 [0.98, 1.45]), stage 5 CKD (1.69 [1.07, 2.65]), and progressive CKD (1.31 [1.02, 1.68]).
Conclusions. Compared with HIV-infected subjects who were HCV seronegative, both HCV viremic and HCV aviremic individuals were at increased risk for moderate and advanced CKD.
HIV; hepatitis C virus; chronic kidney disease; hepatitis C RNA; cohort study; glomerular filtration rate; injection drug use
Tenofovir disoproxil fumarate is a widely used antiretroviral for HIV infection that has been associated with an increased risk of chronic kidney disease (CKD). Our objective was to derive a scoring system to predict 5-year risk of developing CKD in HIV-infected individuals and to estimate difference in risk associated with tenofovir use.
We evaluated time to first occurrence of CKD (estimated glomerular filtration rate <60 ml/min per 1.73 m2) in 21 590 HIV-infected men from the Veterans Health Administration initiating antiretroviral therapy from 1997 to 2010.
We developed a point-based score using multivariable Cox regression models. Median follow-up was 6.3 years, during which 2059 CKD events occurred.
Dominant contributors to the CKD risk score were traditional kidney risk factors (age, glucose, SBP, hypertension, triglycerides, proteinuria); CD4+ cell count was also a component, but not HIV RNA. The overall 5-year event rate was 7.7% in tenofovir users and 3.8% in nonusers [overall adjusted hazard ratio 2.0, 95% confidence interval (CI) 1.8–2.2]. There was a progressive increase in 5-year CKD risk, ranging from less than 1% (zero points) to 16% (≥9 points) in nonusers of tenofovir, and from 1.4 to 21.4% among tenofovir users. The estimated number-needed-to-harm (NNH) for tenofovir use ranged from 108 for those with zero points to 20 for persons with at least nine points. Among tenofovir users with at least 1 year of exposure, NNH ranged from 68 (zero points) to 5 (≥9 points).
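The NNH figures above follow from the absolute risk difference between tenofovir users and nonusers at a given score level. A minimal sketch of that arithmetic, applied here to the overall 5-year event rates (the published per-score NNHs were presumably computed from unrounded risks, so this illustrates the formula rather than reproducing those exact values):

```python
def number_needed_to_harm(risk_exposed, risk_unexposed):
    # NNH = 1 / absolute risk increase attributable to the exposure
    return 1.0 / (risk_exposed - risk_unexposed)

# Overall 5-year CKD event rates from the abstract:
# 7.7% among tenofovir users vs 3.8% among nonusers
nnh_overall = number_needed_to_harm(0.077, 0.038)  # about 26 persons
```

In words: at the overall event rates, roughly one extra CKD event would be expected for every 26 patients treated with tenofovir for 5 years; the score stratifies this from over 100 (low score) down to 20 (high score).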
The CKD risk score can be used to predict an HIV-infected individual’s absolute risk of developing CKD over 5 years and may facilitate clinical decision-making around tenofovir use.
chronic kidney disease; HIV; risk score; tenofovir
In the context of HIV, the initiation of effective antiretroviral therapy (ART) has been found to increase the risk of dyslipidemia in HIV-infected individuals, and dyslipidemia is a risk factor for kidney disease in the general population. Therefore, we examined changes in lipid profiles in HIV-infected men following ART initiation and the association with future kidney dysfunction. HIV-infected men from the Multicenter AIDS Cohort Study initiating ART between December 31, 1995 and September 30, 2011 with measured lipid and serum creatinine values pre-ART and post-ART were selected. The associations between changes in total cholesterol or high-density lipoprotein following ART initiation and the change in estimated glomerular filtration rate (eGFR) over time were assessed using piecewise linear mixed effects models. There were 365 HIV-infected men who contributed to the analysis. In the adjusted models, at 3 years post-ART, those with changes in total cholesterol >50 mg/dl had an average decrease in eGFR of 2.6 ml/min/1.73 m2 per year (p<0.001) and at 5 years post-ART, the average decrease was 2.4 ml/min/1.73 m2 per year (p=0.008). This decline contrasted with the estimates for those with changes in total cholesterol ≤50 mg/dl: a 1.4 ml/min/1.73 m2 decrease per year (p<0.001) and a 0.1 ml/min/1.73 m2 decrease per year (p=0.594) for the same time periods, respectively. Large decreases in high-density lipoprotein (a decline of greater than 5 mg/dl) were not associated with declines in eGFR. These results indicate that large ART-related increases in total cholesterol may be a risk factor for kidney function decline in HIV-infected men. Should these results be generalizable to the broader HIV population, monitoring cholesterol changes following the initiation of ART may be important in identifying HIV-infected persons at risk for kidney disease.