This US, multicenter, observational study assessed CKD prevalence in adult patients with type 2 diabetes mellitus (T2DM) and characterized the proportion of detected and undiagnosed CKD in the primary care setting using the following: a clinician survey; a patient physical exam and medical history; a single blood draw for estimated glomerular filtration rate (eGFR) and glycosylated hemoglobin (HbA1c); urine dipstick for protein; urine albumin-creatinine ratio (ACR); two patient quality-of-life questionnaires; and a 15-month medical record review. The study enrolled 9339 adults with T2DM across 466 investigator sites; of these, 9307 had complete data for analysis. The 15-month retrospective review showed that urine protein, urine ACR, and eGFR testing had not been performed in 51.4%, 52.9%, and 15.2% of individuals, respectively. Of the 9307 patients, 5036 (54.1%) had Stage 1–5 CKD based on eGFR and albuminuria; however, only 607 (12.1%) of those patients were identified as having CKD by their clinicians. Clinicians were more successful in diagnosing Stage 3–5 CKD than Stages 1 and 2. Clinicians' likelihood of identifying CKD did not differ by practice setting, number of years in practice, or self-reported number of patients seen per week. Patient self-reported awareness of CKD was 81.1% when practitioners had detected it versus 2.6% in the absence of a diagnosis. In the primary care of T2DM, recommended urine CKD testing is underutilized and CKD is significantly under-diagnosed. This is the first study to show that CKD detection is associated with patient awareness.
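The staging in this abstract combines eGFR with evidence of kidney damage (albuminuria). A minimal sketch of that classification logic is below; the thresholds follow the standard KDOQI staging criteria that studies of this kind typically apply, and the function name is illustrative, not taken from the study protocol.

```python
def ckd_stage(egfr, albuminuria):
    """Return CKD stage 1-5, or None if no CKD by these criteria.

    egfr: estimated GFR in mL/min/1.73 m2
    albuminuria: True if albuminuria (kidney damage marker) is present

    Thresholds follow standard KDOQI staging; stages 1-2 additionally
    require evidence of kidney damage such as albuminuria.
    """
    if egfr < 15:
        return 5
    if egfr < 30:
        return 4
    if egfr < 60:
        return 3
    # eGFR >= 60: CKD only if there is evidence of kidney damage
    if albuminuria:
        return 2 if egfr < 90 else 1
    return None
```

Note that patients with preserved eGFR but no albuminuria fall outside the staging entirely, which is one reason early-stage CKD is easy to miss without urine testing.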
Tenofovir has been associated with renal tubular injury. Biomarkers that signal early tubular dysfunction are needed because creatinine rise lags behind tenofovir-associated kidney dysfunction. We examined several urinary biomarkers to determine if rises accompanying tenofovir initiation preceded creatinine changes.
Three urinary biomarkers of tubular impairment (neutrophil gelatinase-associated lipocalin [NGAL], N-acetyl-β-D-glucosaminidase [NAG], and β-2-microglobulin [β2MG]) were measured across three time points (one pre-tenofovir visit and two post-tenofovir visits) in 132 HIV-positive women from the Women's Interagency HIV Study (WIHS). Women initiating HAART containing tenofovir were propensity-score matched to women initiating HAART without tenofovir and to women not on HAART.
There were no differences between groups for NGAL or NAG, but β2MG was 19 times more likely to be elevated among tenofovir users at the second post-tenofovir visit compared to non-tenofovir users at the pre-tenofovir visit (p<0.01). History of proteinuria was associated with elevated NGAL (p<0.01). Factors associated with elevated NAG were GFR<60 ml/min, history of proteinuria, hepatitis C (p<0.01 for all), and diabetes mellitus (p=0.05). Factors associated with increased odds of elevated β2MG were HIV RNA>100,000 copies/ml, hepatitis C, boosted protease inhibitor (PI) use, and GFR<60 ml/min (p≤0.01 for all).
β2MG levels are elevated in women on tenofovir indicating probable early renal dysfunction. Biomarker elevation is additionally associated with baseline chronic kidney disease, uncontrolled viremia, and boosted PI use. Future studies are needed to explore urinary biomarker thresholds in identifying treated HIV-infected individuals at risk for renal dysfunction.
Tenofovir; urinary biomarkers; HIV infected women
Chronic kidney disease (CKD) is associated with increased morbidity and mortality in coronary artery disease (CAD) patients. We compared the economic attractiveness of CAD revascularization procedures in patients with and without CKD. Our population included 6218 patients with significant CAD undergoing cardiac catheterization at Duke University between 1996 and 2001, with follow-up through 2002. We investigated the influence of CKD (creatinine clearance < 60 mL/min) upon 3-year survival and medical costs in our CAD population. Coronary artery bypass graft (CABG) surgery was an economically attractive alternative vs. percutaneous coronary intervention (PCI) or medical therapy for all patients with left main disease, three-vessel CAD patients without CKD, and two-vessel CAD patients with CKD. Medical therapy was an economically attractive strategy vs. CABG surgery or PCI for three-vessel CAD patients with CKD, two-vessel CAD patients without CKD, and all single-vessel CAD patients.
Cost-benefit analysis; Revascularization; Coronary disease; Kidney disease
Conflicting relationships have been described between anemia correction using erythropoiesis-stimulating agents (ESAs) and progression of chronic kidney disease (CKD). This study was undertaken to examine the impact of target hemoglobin on progression of kidney disease in the CHOIR (Correction of Hemoglobin and Outcomes in Renal Insufficiency) trial.
Secondary analysis of a randomized controlled trial
Setting and participants
1432 participants with CKD and anemia
Participants were randomized to a target hemoglobin of 13.5 vs 11.3 g/dL with the use of epoetin-alfa.
Outcomes and measurements
Cox regression was used to estimate hazard ratios for progression of CKD (a composite of doubling of creatinine, initiation of renal replacement therapy (RRT), or death). Interactions between hemoglobin target and select baseline variables (estimated glomerular filtration rate (eGFR), proteinuria, diabetes, heart failure, and smoking history) were also examined.
Participants randomized to higher hemoglobin targets experienced a shorter time to progression of kidney disease in both univariate (HR, 1.25; 95% CI, 1.03–1.52; p=0.02) and multivariable models (HR, 1.22; 95% CI, 1.00–1.48; p=0.05). These differences were attributable to higher rates of RRT and death among participants in the high-hemoglobin arm. Hemoglobin target did not interact with eGFR, proteinuria, diabetes, or heart failure (p>0.05 for all). In the multivariable model, hemoglobin target interacted with tobacco use (p=0.04) such that the higher target carried a greater risk of CKD progression among participants who currently smoked (HR, 2.50; 95% CI, 1.23–5.09; p=0.01), a risk not present among those who did not currently smoke (HR, 1.15; 95% CI, 0.93–1.41; p=0.2).
This was a post-hoc analysis; thus cause and effect cannot be determined.
These results suggest that high hemoglobin target is associated with a greater risk of progression of CKD. This risk may be augmented by concurrent smoking. Further defining the mechanism of injury may provide insight into methods to optimize outcomes in anemia management.
Targeting a higher hemoglobin in patients with chronic kidney disease leads to adverse cardiovascular outcomes, yet the reasons remain unclear. Herein, we sought to determine whether changes in erythropoiesis-stimulating agent (ESA) dose and in hemoglobin were predictive of changes in blood pressure (BP) and whether these changes were associated with cardiovascular outcomes.
In this secondary analysis of 1421 Correction of Hemoglobin and Outcomes in Renal Insufficiency (CHOIR) participants, mixed model analyses were used to describe monthly changes in ESA dose and hemoglobin with changes in diastolic BP (DBP) and systolic BP (SBP). Poisson modeling was performed to determine whether changes in hemoglobin and BP were associated with the composite end point of death or cardiovascular outcomes.
Monthly average DBP, but not SBP, was higher in participants in the higher hemoglobin arm. Increases in ESA doses and in hemoglobin were significantly associated with linear increases in DBP, but not consistently with increases in SBP. In models adjusted for demographics and comorbid conditions, increases in ESA dose (>0 U) and larger increases in hemoglobin (>1.0 g/dL/month) were associated with poorer outcomes [event rate ratio per 1000 U weekly dose per month increase 1.05, (1.02–1.08), P = 0.002 and event rate ratio 1.70 (1.02–2.85), P = 0.05, respectively]. However, increasing DBP was not associated with adverse outcomes [event rate ratio 1.01 (0.98–1.03), P = 0.7].
Among CHOIR participants, higher hemoglobin targets, increases in ESA dose and in hemoglobin were associated both with increases in DBP and with higher event rates; however, increasing DBP was not associated with adverse outcomes.
anemia; blood pressure; cardiovascular events; chronic kidney disease; erythropoietin dose
Chronic kidney disease is assuming epidemic proportions, and an increasing number of clinical trials are testing treatments developed to improve morbidity and mortality. Surprisingly, however, a large proportion of these trials have had negative or neutral results. When trials unexpectedly demonstrate either no benefit or a detrimental impact of a treatment, especially when that treatment is already used in practice, critics commonly argue that the results were dictated by flawed trial design rather than the intrinsic properties of the treatment. In kidney disease therapeutics, trials commonly rely on observational data and test the hypothesis that these associations may be extrapolated to cause-and-effect. Other key issues in trial design that may affect outcomes include the impact of enrolling relatively healthier subjects, the complexity of recruiting participants with specific characteristics while maintaining generalizability, and the subtleties of event adjudication and quality of life assessments. In this article, general principles of trial design will be discussed and the potential lessons learned from recent trials in nephrology will be critically reviewed.
nephrology; clinical trial; CHOIR; CREATE; HEMO; MDRD
To examine the association between changes in glomerular filtration rates (GFR) and antiretroviral therapy (ART)-mediated suppression of plasma HIV-1 viremia.
Observational, prospective, multicenter cohort study.
ART regimens or treatment strategies in HIV-1-infected subjects were implemented through randomized clinical trials; 1776 ambulatory subjects from these trials also enrolled in this cohort study.
The association between suppression of viremia and GFR changes from baseline was examined using the abbreviated Modification of Diet in Renal Disease equation in mixed-effects linear models.
GFR improvement was associated with ART-mediated suppression of plasma viremia in subjects with both chronic kidney disease stage ≥2 and low baseline CD4 cell counts (< 200 cells/µl). In this subset, viral suppression (by > 1.0 log10 copies/ml or to < 400 copies/ml) was associated with an average increase in GFR of 9.2 ml/min per 1.73 m2 from baseline (95% confidence interval, 1.6–16.8; P = 0.02) over a median follow-up of 160 weeks. The magnitude of this association increased in subjects who had greater baseline impairment of renal function, and it did not depend on race or sex.
Viral suppression was associated with GFR improvements in those with both low CD4 cell counts and impaired baseline renal function, supporting an independent contribution of HIV-1 replication to chronic renal dysfunction in advanced HIV disease. GFR improvement not associated with viral suppression also was observed in subjects with higher CD4 cell counts.
antiretroviral therapy; chronic kidney disease; HIV-1 viremia; HIV-associated nephropathy
Reduced kidney function and albuminuria are associated with higher risk for cardiovascular disease (CVD) and mortality in HIV-infected individuals. We investigated whether reduced estimated glomerular filtration rate (eGFR) and albuminuria are associated with subclinical vascular disease, as assessed by carotid intima-medial thickness (cIMT).
Cross-sectional analysis of 476 HIV-infected individuals without clinical evidence of CVD enrolled in the Fat Redistribution and Metabolic Change in HIV infection (FRAM) study, using multivariable linear regression. eGFRCys and eGFRCr were calculated from cystatin C and creatinine levels. Albuminuria was defined as a positive urine dipstick (≥1+) or urine albumin-to-creatinine ratio ≥30 mg/g. Common and internal cIMT were measured by high-resolution B-mode ultrasound.
In unadjusted analyses, eGFRCys and eGFRCr were strongly associated with common and internal cIMT. Each 10 ml/min/1.73 m2 decrease in eGFRCys and eGFRCr was associated with a 0.008 mm higher common cIMT (p = 0.003 and p = 0.01, respectively) and with a 0.024 and 0.029 mm higher internal cIMT, respectively (p = 0.003). These associations were eliminated after adjustment for age, gender, and race. Albuminuria showed little association with common or internal cIMT in all models.
In HIV-infected individuals without prior CVD, reduced kidney function and albuminuria were not independently associated with subclinical vascular disease, as assessed by cIMT. These results suggest that research should focus on searching for novel mechanisms by which kidney disease confers cardiovascular risk in HIV-infected individuals.
Cystatin C; Intima-medial thickness; HIV; Atherosclerosis; Cardiovascular disease; Kidney
In the early highly active antiretroviral therapy (HAART) era, kidney dysfunction was strongly associated with death among HIV-infected individuals. We re-examined this association in the later HAART period to determine whether chronic kidney disease (CKD) remains a predictor of death after HAART-initiation.
To evaluate the effect of kidney function at the time of HAART initiation on time to all-cause mortality, we evaluated 1415 HIV-infected women initiating HAART in the Women’s Interagency HIV Study (WIHS). Multivariable proportional hazards models with survival times calculated from HAART initiation to death were constructed; participants were censored at the time of the last available visit or December 31, 2006.
CKD (eGFR <60 ml/min/1.73 m2) at HAART initiation was associated with higher mortality risk adjusting for age, race, hepatitis C serostatus, AIDS history and CD4+ cell count (hazard ratio [HR]=2.23, 95% confidence interval [CI]: 1.45–3.43). Adjustment for hypertension and diabetes history attenuated this association (HR=1.89, CI: 0.94–3.80). Lower kidney function at HAART initiation was weakly associated with increased mortality risk in women with prior AIDS (HR=1.09, CI: 1.00–1.19, per 20% decrease in eGFR).
Kidney function at HAART initiation remains an independent predictor of death in HIV-infected individuals, especially in those with a history of AIDS. Our study emphasizes the necessity of monitoring kidney function in this population. Additional studies are needed to determine mechanisms underlying the increased mortality risk associated with CKD in HIV-infected persons.
kidney disease; mortality; HIV; WIHS; antiretroviral therapy
High-dose erythropoiesis-stimulating agents (ESA) for anemia of chronic kidney disease (CKD) have been associated with adverse clinical outcomes and do not always improve erythropoiesis. We hypothesized that high-dose ESA requirement would be associated with elevated inflammatory biomarkers, decreased adipokines, and increased circulating, endogenous soluble erythropoietin receptors (sEpoR).
A cross-sectional cohort of 32 anemic CKD participants receiving ESA was enrolled at a single center, and cytokine profiles, adipokines, and sEpoR were compared between participants stratified by ESA dose requirement: usual-dose darbepoetin-α (<1 μg/kg/week) versus high-dose (≥1 μg/kg/week).
Baseline characteristics were similar between groups; however, hemoglobin was lower among participants on high-dose (1.4 μg/kg/week) vs usual-dose (0.5 μg/kg/week) ESA.
In adjusted analyses, high-dose ESA was associated with increased odds of elevations in C-reactive protein and interleukin-6 (p < 0.05 for both). There was no correlation between high-dose ESA and adipokines. Higher ESA dose correlated with higher levels of sEpoR (rs = 0.39, p = 0.03). In adjusted analyses, higher ESA dose (per μg/kg/week) was associated with 53% greater odds of sEpoR being above the median (p < 0.05).
High-dose ESA requirement among anemic CKD participants was associated with elevated inflammatory biomarkers and higher levels of circulating sEpoR, an inhibitor of erythropoiesis. Further research confirming these findings is warranted.
Compared with controls, HIV-infected persons have a greater prevalence of kidney disease as assessed by high levels of cystatin C and albuminuria, but not as assessed by creatinine level. However, the clinical importance of elevated cystatin C and albuminuria in the HIV-infected population has not been studied.
We conducted an observational cohort study to determine the association of kidney disease (measured by albuminuria, cystatin C, and serum creatinine) with mortality.
Setting & Participants
922 HIV-infected persons enrolled in the FRAM (Fat Redistribution and Metabolic Change in HIV infection) study.
Serum cystatin C and serum creatinine were used to estimate glomerular filtration rate (eGFR). Albuminuria was defined as a positive urine dipstick (≥1+) or a urine albumin-creatinine ratio > 30 mg/g.
At baseline, reduced kidney function (eGFRSCysC <60 mL/min/1.73m2) or albuminuria was present in 28% of participants. After five years of follow-up, mortality was 48% among those with both eGFRSCysC <60 mL/min/1.73m2 and albuminuria, 23% in those with eGFRSCysC <60 mL/min/1.73m2 alone, 20% in those with albuminuria alone, and 9% in those with neither condition. After multivariable adjustment for demographics, cardiovascular risk factors, HIV-related factors, and inflammatory markers, eGFRSCysC <60 mL/min/1.73m2 and albuminuria were associated with nearly a twofold increase in mortality, whereas eGFRSCr <60 mL/min/1.73m2 did not appear to have any substantial association with mortality. Together, eGFRSCysC <60 mL/min/1.73m2 and albuminuria accounted for 17% of the population-level attributable risk for mortality.
Vital status was unknown in 261 participants from the original cohort.
Kidney disease marked by albuminuria or increased cystatin C levels appears to be an important risk factor for mortality in HIV-infected individuals. A substantial proportion of this risk may be unrecognized because of the current reliance on serum creatinine to estimate kidney function in clinical practice.
kidney disease; mortality; HIV infection
Microalbuminuria is associated with increased risk of cardiovascular disease and mortality. The objective of the study was to evaluate if HIV infection was an independent risk factor for microalbuminuria.
The relationship between HIV infection and microalbuminuria was assessed using subjects enrolled in the study of Fat Redistribution and Metabolic Change in HIV Infection, which consists of HIV-positive and control men and women. Participants with proteinuria (dipstick ≥1+) were excluded.
Microalbuminuria (urinary albumin/creatinine ratio, ACR>30 mg/g) was present in 11% of HIV-infected and 2% of control participants (P<0.001), a fivefold odds after multivariate adjustment (odds ratio, 5.11; 95% confidence interval, 1.97–13.31; P=0.0008). Several cardiovascular risk factors were associated with higher ACR in HIV-infected participants: insulin resistance (HOMA>4; 32%, P<0.0001), systolic blood pressure (21%, P=0.01 for 120–140 versus <120 mmHg, and 43%, P<0.06 for >140 versus <120 mmHg), and family history of hypertension (17%, P=0.03). Higher CD4 cell count was associated with lower albumin/creatinine ratio (−24%, P=0.009 for 200–400 versus <200 cells/µl and −26%, P=0.005 for >400 versus <200 cells/µl).
HIV infection had a strong and independent association with microalbuminuria, the severity of which was predicted by markers of insulin resistance, hypertension, and advanced HIV infection. These associations warrant further investigation, as the increased prevalence of microalbuminuria in HIV infection may be a harbinger of future risk of cardiovascular and kidney diseases.
Microalbuminuria; kidney; urine protein; insulin resistance; lipodystrophy
Although studies have reported a high prevalence of end-stage renal disease in human immunodeficiency virus (HIV)-infected individuals, little is known about moderate impairments in kidney function. Cystatin C measurement may be more sensitive than creatinine for detecting impaired kidney function in persons with HIV.
We evaluated kidney function in the Fat Redistribution and Metabolic Change in HIV Infection (FRAM) cohort, a representative sample of 1008 HIV-infected persons and 290 controls from the Coronary Artery Risk Development in Young Adults (CARDIA) study in the United States.
Cystatin C level was elevated in HIV-infected individuals; the mean±SD cystatin C level was 0.92±0.22 mg/L in those infected with HIV and 0.76±0.15 mg/L in controls (P<.001). In contrast, both mean creatinine levels and estimated glomerular filtration rates appeared similar in HIV-infected individuals and controls (0.87±0.21 vs 0.85±0.19 mg/dL [to convert to micromoles per liter, multiply by 88.4] [P=.35] and 110±26 vs 106±23 mL/min/1.73 m2 [P=.06], respectively). Persons with HIV infection were more likely to have a cystatin C level greater than 1.0 mg/L (OR, 9.8; 95% confidence interval, 4.4-22.0 [P<.001]), a threshold demonstrated to be associated with increased risk for death and cardiovascular and kidney disease. Among participants with HIV, potentially modifiable risk factors for kidney disease, hypertension, and low high-density lipoprotein concentration were associated with a higher cystatin C level, as were lower CD4 lymphocyte count and coinfection with hepatitis C virus (all P<.001).
Individuals infected with HIV had substantially worse kidney function when measured by cystatin C level compared with HIV-negative controls, whereas mean creatinine levels and estimated glomerular filtration rates were similar. Cystatin C measurement could be a useful clinical tool to identify HIV-infected persons at increased risk for kidney and cardiovascular disease.
Hypertension is common in hemodialysis patients; however, the relationship between interdialytic weight gain (IDWG) and blood pressure (BP) is incompletely characterized. This study seeks to define the relationship between IDWG and BP in prevalent hemodialysis subjects.
Study Design, Setting, & Participants
This study used data from 32,295 dialysis sessions in 442 subjects followed up for 6 months in the Crit-Line Intradialytic Monitoring Benefit (CLIMB) Study.
Outcomes & Measurements
Mixed linear regression was used to analyze the relationship between percentage of IDWG (IDWG [%] = [current predialysis weight – previous postdialysis weight]/dry weight * 100) as the independent variable and systolic BP (SBP) and predialysis – postdialysis SBP (ΔSBP) as dependent variables.
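The IDWG definition above translates directly into code. A minimal sketch (variable names are illustrative; the formula is taken from the definition in this abstract):

```python
def idwg_percent(predialysis_wt, prev_postdialysis_wt, dry_wt):
    """Percentage interdialytic weight gain, per the study definition:

    IDWG (%) = (current predialysis weight - previous postdialysis weight)
               / dry weight * 100

    All weights in the same unit (e.g., kg).
    """
    return (predialysis_wt - prev_postdialysis_wt) / dry_wt * 100.0
```

For example, a patient with a 70 kg dry weight arriving at 72 kg after leaving the previous session at 70 kg has an IDWG of about 2.9%.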
In unadjusted analyses, every 1% increase in percentage of IDWG was associated with a 1.00 mm Hg (95% confidence interval [CI], ±0.24) increase in predialysis SBP (P < 0.0001), 0.65 mm Hg (95% CI, ±0.24) decrease in postdialysis SBP (P < 0.0001), and 1.66 mm Hg (95% CI, ±0.25) increase in ΔSBP (P < 0.0001). After controlling for other significant predictors of SBP, every 1% increase in percentage of IDWG was associated with a 1.00 mm Hg (95% CI, ±0.24) increase in predialysis SBP (P < 0.0001) and a 1.08 mm Hg (95% CI, ±0.22) increase in ΔSBP with hemodialysis (P < 0.0001). However, in subjects with diabetes as the cause of end-stage renal disease, subjects with lower creatinine levels, and older subjects, the magnitude of the association between percentage of IDWG and predialysis SBP was less pronounced. The magnitude of the association between percentage of IDWG and ΔSBP was less pronounced in younger subjects and subjects with lower dry weights. Results were similar with diastolic BP.
Hemodialysis BP measurements are imprecise estimates of BP and true hemodynamic burden in dialysis subjects.
In prevalent hemodialysis subjects, increasing percentage of IDWG is associated with increases in predialysis BP and BP changes with hemodialysis; however, the magnitude of the relationship is modest and modified by other clinical factors. Thus, although overall volume status may impact on BP to a greater extent, day-to-day variations in weight gain have a modest role in BP increases in prevalent subjects with end-stage renal disease.
Interdialytic weight gain; blood pressure; hemodialysis
This study examines the association between microalbuminuria and the development of proteinuria among HIV-infected persons.
948 subjects provided urine samples for albumin, protein, and creatinine measurements semiannually. Microalbuminuria was defined as an albumin-to-creatinine ratio >30 mg/g; proteinuria as a protein-to-creatinine ratio ≥0.350 mg/mg. The progression from microalbuminuria to proteinuria was described.
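The two cutoffs above amount to a simple three-way classifier. A sketch using the thresholds stated in this abstract (the function name and category labels are illustrative):

```python
def classify_urine(acr_mg_per_g, pcr_mg_per_mg):
    """Classify urine protein excretion using the study's definitions:

    - proteinuria: protein-to-creatinine ratio (PCR) >= 0.350 mg/mg
    - microalbuminuria: albumin-to-creatinine ratio (ACR) > 30 mg/g
      (in the absence of overt proteinuria)
    - normal: neither threshold exceeded
    """
    if pcr_mg_per_mg >= 0.350:
        return "proteinuria"
    if acr_mg_per_g > 30:
        return "microalbuminuria"
    return "normal"
```

Checking proteinuria first mirrors the study design, in which overt proteinuria supersedes the microalbuminuria category.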
At baseline, 69.4% had no detectable proteinuria, 20.2% had microalbuminuria, and 10.4% had proteinuria. Subjects with microalbuminuria and proteinuria were more likely to be black (p=0.03), have lower CD4+ counts (p=0.02 and p=0.0001, respectively, compared with subjects without abnormal protein excretion), and have a higher HIV RNA level (p=0.08 and p=0.04, respectively). Among 658 subjects with normal urine protein, 82.7% continued to have no abnormality, 14.3% developed microalbuminuria, and 3.0% developed proteinuria. Subjects without baseline proteinuria (i.e., either normal protein excretion or microalbuminuria) who developed proteinuria were more likely to have microalbuminuria (p=0.001), a lower CD4+ count (p=0.06), and a higher plasma HIV RNA (p=0.03) than those who did not progress to proteinuria. In multivariate analysis, only microalbuminuria remained associated with the development of proteinuria (OR=2.9; 95% CI 1.5, 5.5; p=0.001).
Microalbuminuria predicts the development of proteinuria among HIV-infected persons. Because proteinuria has been linked to poorer outcomes, strategies to affect microalbuminuria should be tested.
HIV-1; microalbuminuria; proteinuria; HIVAN; urine
Background. Patients with end-stage renal disease (ESRD) requiring chronic haemodialysis who undergo coronary artery bypass graft surgery (CABG) are at significant risk for perioperative mortality. However, the impact of changes in ESRD patient volume and characteristics over time on operative outcomes is unclear.
Methods. Using the Nationwide Inpatient Sample database (1988–2003), we evaluated rates of CABG surgery with and without concurrent valve surgery among ESRD patients and outcomes including in-hospital mortality and length of hospital stay. Multivariate regression models were used to account for patient characteristics and potential confounders.
Results. From 1988 to 2003, annual rates of CABG among ESRD patients doubled from 2.5 to 5 per 1000 patient-years. Concomitantly, the patient case-mix changed to include patients with greater co-morbidities such as diabetes, hypertension and obesity (all P < 0.001). Nonetheless, among ESRD patients, in-hospital mortality rates declined nearly 6-fold from over 31% to 5.4% (versus 4.7% to 1.8% among non-ESRD), and the median length of in-hospital stay fell by nearly half, from 25 to 13 days (versus 14 to 10 days among non-ESRD).
Conclusions. Since 1988, an increasing number of patients with ESRD have been receiving CABG in the USA. Despite increasing co-morbidities, operative mortality rates and length of in-hospital stay have declined substantially. Nonetheless, mortality rates remain almost 3-fold higher compared with non-ESRD patients, indicating a need for ongoing improvement.
coronary artery bypass graft; end-stage renal disease; in-hospital mortality; perioperative outcomes
In the United States, HIV-related kidney disease disproportionately affects individuals of African descent; however, there are few estimates of kidney disease prevalence in Africa. We evaluated the prevalence of kidney disease among HIV-infected and uninfected Rwandan women.
The Rwandan Women's Interassociation Study and Assessment prospectively enrolled 936 women. Associations with estimated glomerular filtration rate (eGFR)<60 mL/min/1.73 m2 and proteinuria were assessed in separate logistic regression models.
Among 891 non-pregnant women with available data, 2.4% had an eGFR<60 mL/min/1.73 m2 (calculated by the Modification of Diet in Renal Disease equation, MDRD eGFR) and 8.7% had proteinuria ≥1+. The prevalence of decreased eGFR varied markedly depending on the estimating method used, with the highest prevalence by Cockcroft-Gault. Regardless of the method used to estimate GFR, the proportion with decreased eGFR or proteinuria did not differ significantly between HIV-infected and -uninfected women in unadjusted analysis. After adjusting for age and blood pressure, HIV infection was associated with significantly higher odds of decreased MDRD eGFR but not proteinuria.
In a well-characterized cohort of Rwandan women, HIV infection was associated with decreased MDRD eGFR. The prevalence of decreased eGFR among HIV-infected women in our study was lower than that previously reported in African-Americans and in other Central and East African HIV populations, although there was substantial variability depending on the equation used to estimate GFR. Future studies are needed to optimize GFR estimates and to determine the impact of antiretroviral therapy on kidney disease in this population.
How co-infection with hepatitis C virus (HCV) impacts on the trajectory of kidney function among HIV-infected patients is unclear. This study examined the effect of HCV on kidney function over time among women infected with HIV.
Retrospective observational cohort
Setting and Participants
Study sample included participants from the Women's Interagency HIV Study who were HIV-infected and had received HCV antibody testing and serum creatinine measurement at baseline.
Outcomes and Measurement
Estimated glomerular filtration rate (eGFR) was calculated from semi-annual serum creatinine measurements using the 4-variable Modification of Diet in Renal Disease (MDRD) Study equation. Linear mixed models were used to evaluate the independent effect of being HCV seropositive on eGFR over time, adjusting for demographic factors, co-morbid conditions, illicit drug use, measures of HIV disease status, use of medications, and interactions with baseline low eGFR (<60 mL/min/1.73m2).
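For reference, the 4-variable MDRD Study equation used above can be sketched as follows. This is a minimal illustration, not the study's analysis code; the 175 coefficient assumes IDMS-traceable creatinine calibration (the original equation used 186), and the function name is illustrative.

```python
def mdrd_egfr(scr_mg_dl, age_yr, female, black):
    """4-variable MDRD Study eGFR in mL/min/1.73 m2.

    eGFR = 175 * SCr^-1.154 * age^-0.203
           * 0.742 (if female) * 1.212 (if Black)

    scr_mg_dl: serum creatinine in mg/dL
    age_yr:    age in years
    """
    egfr = 175.0 * (scr_mg_dl ** -1.154) * (age_yr ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr
```

Note how strongly the estimate depends on serum creatinine: doubling creatinine roughly halves the estimated GFR, which is why semi-annual creatinine measurements can track eGFR decline over time.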
Of the 2,684 HIV-infected women, 952 (35%) were found to be HCV seropositive. For 180 women with CKD at baseline (eGFR <60 mL/min/1.73m2), being HCV seropositive was independently associated with a fully-adjusted net decline in eGFR of about 5% per year (95% CI: 3.2 to 7.2%), relative to women who were seronegative. In contrast, HCV was not independently associated with decline in eGFR among women without low eGFR at baseline (p<0.001 for interaction).
The MDRD Study equation has not been validated as a measure of GFR among persons with HIV or HCV. Proteinuria was not included in the study analysis. Because the study is observational, the effects of residual confounding cannot be excluded.
Among HIV-infected women with CKD, co-infection with HCV is associated with a modest, but statistically significant decline in eGFR over time. More careful monitoring of kidney function may be warranted for HIV-infected patients with CKD who are also co-infected with HCV.
hepatitis C virus; HIV; kidney diseases; women
Intradialytic increases in blood pressure (BP) can complicate the management of hypertension in hemodialysis (HD) patients but the long-term consequences are uncertain. Thus, we sought to determine if BP elevations during HD were associated with higher 2-year mortality among incident HD patients.
Secondary analysis of a prospective dialysis cohort.
Setting and Participants:
Incident hemodialysis patients in the Dialysis Morbidity and Mortality Wave 2 Study.
Changes in systolic BP (SBP) during hemodialysis (ie, postdialysis SBP–predialysis SBP), averaged from 3 hemodialysis sessions prior to enrollment.
Time to 2-year all-cause mortality.
Cox regression was used to model hazard ratios for mortality associated with changes in SBP during HD, while adjusting for demographics, comorbid conditions, interdialytic weight gain, laboratory variables, and antihypertensive agents.
Of 1,748 patients, 12.2% exhibited >10mmHg increases in SBP during HD. In adjusted analyses, every 10mmHg increase in SBP during HD was independently associated with a 6% increased hazard of death (HR 1.06, CI 1.01-1.11). When also adjusted for diastolic BP and postdialysis SBP, the adjusted hazard of death associated with increasing SBP during HD remained significant (HR 1.12, CI 1.05-1.21, per 10mmHg increase in ΔSBP during HD). However, in analyses adjusted for predialysis SBP, there was a significant interaction between change in SBP and predialysis SBP. In analyses stratified by predialysis SBP, trends toward increased mortality associated with increasing SBP during dialysis were present in patients with predialysis SBP <160mmHg; however, this relationship was significant only in patients with predialysis SBP <120mmHg.
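Because a Cox model is log-linear in the covariate, a hazard ratio reported "per 10 mmHg" compounds multiplicatively for larger changes. A hedged illustration using the adjusted HR reported above (the helper function is ours, not part of the study):

```python
hr_per_10 = 1.06  # adjusted HR per 10 mmHg rise in SBP during HD (from the study)

def hr_for_delta(delta_mmhg, hr_per_unit=hr_per_10, unit=10.0):
    """Implied hazard ratio for an arbitrary SBP change, assuming the
    log-linearity built into the Cox proportional hazards model."""
    return hr_per_unit ** (delta_mmhg / unit)

print(round(hr_for_delta(20), 3))  # 1.06**2 ~= 1.124 for a 20 mmHg rise
print(round(hr_for_delta(-10), 3))  # reciprocal for a 10 mmHg fall
```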
Secondary analysis with a limited number of baseline BP measurements and limited information on dialysis prescription.
Increases in SBP of >10mmHg during hemodialysis occurred in ∼10% of incident patients. Although increasing systolic BP during HD was associated with decreased 2-year survival, these findings were limited to patients with predialysis systolic BP <120 mmHg.
blood pressure; end-stage renal disease; epidemiology and outcomes; hemodialysis; hypertension; mortality
Trials of anemia correction in chronic kidney disease have found either no benefit or detrimental outcomes with higher targets. We performed a secondary analysis of patients with chronic kidney disease enrolled in the Correction of Hemoglobin and Outcomes in Renal Insufficiency (CHOIR) trial to measure the potential for competing benefit and harm from achieved hemoglobin level and epoetin dose. In the 4-month analysis, significantly more patients in the high-hemoglobin arm than in the low-hemoglobin arm were unable to achieve their target hemoglobin and required high-dose epoetin-α. In unadjusted analyses, the inability to achieve a target hemoglobin and high-dose epoetin-α were each significantly associated with increased risk of a primary endpoint (death, myocardial infarction, congestive heart failure, or stroke). In adjusted models, high-dose epoetin-α was associated with a significantly increased hazard of a primary endpoint, whereas randomization to the high-hemoglobin arm was not, suggesting a possible mediating effect of the higher target via dose. Similar results were seen in the 9-month analysis. Our study demonstrates that patients achieving their target had better outcomes than those who did not, and among subjects who achieved their randomized target, no increased risk associated with the higher hemoglobin goal was detected. Prospective studies are needed to confirm this relationship and to determine safe dosing algorithms for patients unable to achieve target hemoglobin.
anemia; chronic kidney disease; epoetin-α; dose; epidemiology and outcomes
The CHOIR trial in anemic patients with chronic kidney disease compared epoetin-alfa treatment to low (11.3 g/dl) and high (13.5 g/dl) hemoglobin targets on the composite end point of death, hospitalization for heart failure, stroke, and myocardial infarction. However, other anemia management trials in patients with chronic kidney disease have found increased risk when hemoglobin is targeted above 13 g/dl. In this secondary analysis of the CHOIR trial, we compared outcomes in the subgroups of patients with diabetes and heart failure to describe the relationship of treatment to these two different hemoglobin targets. By Cox regression analysis, there was no increased risk associated with the higher hemoglobin target among patients with heart failure. In patients without heart failure, however, the hazard ratio (1.86) associated with the higher target was significant. Comparing survival curves in an unadjusted model, patients with diabetes did not have a greater hazard associated with the higher target. Subjects without diabetes had a significantly greater hazard in the high as compared to the low target, but the interaction between diabetes and the target was not significant. We suggest that the increased risks associated with higher hemoglobin targets are not clinically apparent among subgroups with greater mortality risk. These differential outcomes underscore the need for dedicated trials in these subpopulations.
anemia; diabetes mellitus; heart failure; kidney
Pulse pressure is a well-established marker of vascular stiffness and is associated with increased mortality in hemodialysis patients. Here we sought to determine whether a decrease in pulse pressure during hemodialysis was associated with improved outcomes, using data from 438 hemodialysis patients enrolled in the 6-month Crit-Line Intradialytic Monitoring Benefit Study. The relationship between changes in pulse pressure during dialysis (2-week average) and the primary end point of non-access-related hospitalization and death was adjusted for demographics, comorbidities, medications, and laboratory variables. In the analyses that included both pre- and post-dialysis pulse pressure, higher pre-dialysis and lower post-dialysis pulse pressure were associated with a decreased hazard of the primary end point. Further, every 10 mm Hg decrease in pulse pressure during dialysis was associated with a 20% lower hazard of the primary end point. In separate models that included pulse pressure and the change in pulse pressure during dialysis, neither pre- nor post-dialysis pulse pressure was associated with the primary end point, but each 10 mm Hg decrease in pulse pressure during dialysis was associated with about a 20% lower hazard of the primary end point. Our study found that in prevalent dialysis subjects, a decrease in pulse pressure during dialysis was associated with improved outcomes. Further study is needed to identify how to control pulse pressure to improve outcomes.
end-stage renal disease; hemodialysis; intradialytic blood pressure; morbidity and mortality; outcomes; pulse pressure
Because both renal disease and immune activation predict progression to AIDS, we evaluated the relationships between dipstick proteinuria ≥1+ [7% of 1012 subjects], CrCl <90mL/min [18% of 1071 subjects], and percentages of peripheral activated CD8 cells (CD8+CD38+HLA-DR+) in antiretroviral-naïve, HIV-infected subjects enrolled into AIDS Clinical Trials Group studies 384 and A5095. Proteinuria, but not CrCl, was associated with higher percentages of CD8+CD38+HLA-DR+ cells [55% vs. 50%; P=0.01], with even more pronounced differences in men and among Blacks and Hispanics. Proteinuria may be a surrogate measure of greater immune activation in HIV-infected patients initiating antiretroviral therapy.
HIV-1; proteinuria; renal failure; nephropathy; immune activation
Acute kidney injury is common in HIV-infected patients, and has been associated with increased morbidity and mortality. Prior to the introduction of effective antiretroviral therapy, acute kidney injury in HIV-positive patients was most commonly the result of volume depletion, septicemia, or nephrotoxic medications. Acute kidney injury remains a significant problem in the antiretroviral era, and is still commonly attributed to infection or nephrotoxic medications. Less common causes such as direct infectious insults, immune restoration inflammatory syndrome, rhabdomyolysis, and obstruction should be considered when the underlying process is not obvious. In addition to advanced HIV disease, several other patient characteristics have emerged as potential risk factors for acute kidney injury in the antiretroviral era, including older age, diabetes, pre-existing chronic kidney disease, and hepatitis co-infection or liver disease.
Acute kidney injury; acute tubular necrosis; HIV
To evaluate the pharmacokinetics and pharmacogenomics of efavirenz (EFV) and lopinavir/ritonavir (LPV/RTV) in HIV-infected persons requiring hemodialysis.
Prospective, observational study of HIV-infected hemodialysis subjects receiving one 600mg tablet daily of EFV (N=13) or three 133.3/33.3mg capsules twice daily of LPV/RTV (N=13).
24-hour EFV and 12-hour LPV/RTV pharmacokinetics were assessed. Geometric mean ratios were calculated using historical controls with normal renal function. The effects of several candidate gene polymorphisms were also explored.
The geometric mean (95% CI, %CV) Cmin, Cmax, and AUC for the EFV group were 1.81µg/mL (0.93, 3.53; 103%), 5.04µg/mL (3.48, 7.29; 72%), and 71.5µg·h/mL (43.2, 118.3; 93%), respectively. These parameters were 2.76µg/mL (1.86, 4.11; 53%), 8.45µg/mL (6.41, 11.15; 52%), and 69.6µg·h/mL (55.6, 87.2; 37%) for LPV and 0.08µg/mL (0.05, 0.14; 63%), 0.58µg/mL (0.44, 0.76; 41%), and 3.74µg·h/mL (2.91, 4.80; 37%) for RTV. The AUC geometric mean ratios (90% CI) for EFV, LPV, and RTV were 132% (89, 197), 81% (67, 97), and 92% (76, 111), respectively. LPV Cmin was lower than expected in the hemodialysis group. Higher EFV concentrations were associated with the CYP2B6 516G>T polymorphism.
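Geometric means and %CV values like those reported above are conventionally computed on log-transformed concentrations, since PK parameters are approximately log-normal. A minimal sketch, assuming the log-normal %CV convention; the concentration values are illustrative, not study data:

```python
import math

def geo_mean_cv(values):
    """Geometric mean and log-normal %CV of positive PK parameters.

    %CV uses the log-normal convention: 100 * sqrt(exp(s^2) - 1),
    where s is the sample SD of the natural-log values.
    """
    logs = [math.log(v) for v in values]
    n = len(logs)
    mean_log = sum(logs) / n
    var_log = sum((x - mean_log) ** 2 for x in logs) / (n - 1)
    gm = math.exp(mean_log)
    cv_pct = 100.0 * math.sqrt(math.exp(var_log) - 1.0)
    return gm, cv_pct

# Illustrative trough concentrations in ug/mL -- hypothetical, not study data
cmins = [1.2, 2.0, 0.9, 3.1, 1.6]
gm, cv = geo_mean_cv(cmins)
print(round(gm, 2), round(cv, 1))
```

A geometric mean ratio, as used against the historical controls above, is then simply the ratio of two such geometric means, with confidence limits computed on the log scale.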
The pharmacokinetics of EFV and LPV/RTV in hemodialysis suggest that no dosing adjustments are necessary in treatment-naïve patients. As HIV-infected hemodialysis patients are disproportionately Black, the increased frequency of the CYP2B6 516G>T polymorphism may lead to higher EFV levels. The potentially lower LPV trough levels in this population suggest that LPV/RTV should be used with caution in protease inhibitor-experienced patients.
Pharmacokinetics; Pharmacogenomics; Renal Failure; Dialysis; HIV; Efavirenz; Lopinavir; Ritonavir