To examine the association between changes in glomerular filtration rates (GFR) and antiretroviral therapy (ART)-mediated suppression of plasma HIV-1 viremia.
Observational, prospective, multicenter cohort study.
ART regimens or treatment strategies in HIV-1-infected subjects were implemented through randomized clinical trials; 1776 ambulatory subjects from these trials also enrolled in this cohort study.
The association between suppression of viremia and GFR changes from baseline was examined using the abbreviated Modification of Diet in Renal Disease equation in mixed effects linear models.
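As an illustration, the abbreviated (4-variable) MDRD estimate referenced here can be sketched as follows. The coefficients shown are the commonly published ones; the study's exact implementation is not specified in the abstract, so treat this as an assumption:

```python
def mdrd_egfr(scr_mg_dl, age_years, female, black):
    """Abbreviated (4-variable) MDRD eGFR in mL/min per 1.73 m^2.

    Uses the widely published 186-coefficient form; an illustrative
    sketch, not necessarily the exact equation version the study used.
    """
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742  # published adjustment factor for women
    if black:
        egfr *= 1.212  # published adjustment factor for Black patients
    return egfr
```

Values below 60 mL/min/1.73 m2 on this scale correspond to the chronic kidney disease threshold used throughout these studies.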
GFR improvement was associated with ART-mediated suppression of plasma viremia in subjects with both chronic kidney disease stage ≥2 and low baseline CD4 cell counts (< 200 cells/µl). In this subset, viral suppression (by > 1.0 log10 copies/ml or to < 400 copies/ml) was associated with an average increase in GFR of 9.2 ml/min per 1.73 m2 from baseline (95% confidence interval, 1.6–16.8; P = 0.02) over a median follow-up of 160 weeks. The magnitude of this association increased in subjects who had greater baseline impairment of renal function, and it did not depend on race or sex.
Viral suppression was associated with GFR improvements in those with both low CD4 cell counts and impaired baseline renal function, supporting an independent contribution of HIV-1 replication to chronic renal dysfunction in advanced HIV disease. GFR improvement that was not associated with viral suppression was also observed in subjects with higher CD4 cell counts.
antiretroviral therapy; chronic kidney disease; HIV-1 viremia; HIV-associated nephropathy
Reduced kidney function and albuminuria are associated with higher risk for cardiovascular disease (CVD) and mortality in HIV-infected individuals. We investigated whether reduced estimated glomerular filtration rate (eGFR) and albuminuria are associated with subclinical vascular disease, as assessed by carotid intima-medial thickness (cIMT).
Cross-sectional analysis of 476 HIV-infected individuals without clinical evidence of CVD enrolled in the Fat Redistribution and Metabolic Change in HIV infection (FRAM) study, using multivariable linear regression. eGFRCys and eGFRCr were calculated from cystatin C and creatinine levels. Albuminuria was defined as a positive urine dipstick (≥1+) or urine albumin-to-creatinine ratio ≥30 mg/g. Common and internal cIMT were measured by high-resolution B-mode ultrasound.
In unadjusted analyses, eGFRCys and eGFRCr were strongly associated with common and internal cIMT. Each 10 ml/min/1.73 m2 decrease in eGFRCys and eGFRCr was associated with a 0.008 mm higher common cIMT (p = 0.003 and p = 0.01, respectively) and with a 0.024 and 0.029 mm higher internal cIMT, respectively (p = 0.003). These associations were eliminated after adjustment for age, gender, and race. Albuminuria showed little association with common or internal cIMT in any model.
In HIV-infected individuals without prior CVD, reduced kidney function and albuminuria were not independently associated with subclinical vascular disease, as assessed by cIMT. These results suggest that research should focus on searching for novel mechanisms by which kidney disease confers cardiovascular risk in HIV-infected individuals.
Cystatin C; Intima-medial thickness; HIV; Atherosclerosis; Cardiovascular disease; Kidney
In the early highly active antiretroviral therapy (HAART) era, kidney dysfunction was strongly associated with death among HIV-infected individuals. We re-examined this association in the later HAART period to determine whether chronic kidney disease (CKD) remains a predictor of death after HAART-initiation.
To evaluate the effect of kidney function at the time of HAART initiation on time to all-cause mortality, we evaluated 1415 HIV-infected women initiating HAART in the Women’s Interagency HIV Study (WIHS). Multivariable proportional hazards models with survival times calculated from HAART initiation to death were constructed; participants were censored at the time of the last available visit or December 31, 2006.
CKD (eGFR <60 ml/min/1.73 m2) at HAART initiation was associated with higher mortality risk adjusting for age, race, hepatitis C serostatus, AIDS history and CD4+ cell count (hazard ratio [HR]=2.23, 95% confidence interval [CI]: 1.45–3.43). Adjustment for hypertension and diabetes history attenuated this association (HR=1.89, CI: 0.94–3.80). Lower kidney function at HAART initiation was weakly associated with increased mortality risk in women with prior AIDS (HR=1.09, CI: 1.00–1.19, per 20% decrease in eGFR).
Kidney function at HAART initiation remains an independent predictor of death in HIV-infected individuals, especially in those with a history of AIDS. Our study emphasizes the necessity of monitoring kidney function in this population. Additional studies are needed to determine mechanisms underlying the increased mortality risk associated with CKD in HIV-infected persons.
kidney disease; mortality; HIV; WIHS; antiretroviral therapy
High-dose erythropoiesis-stimulating agents (ESA) for anemia of chronic kidney disease (CKD) have been associated with adverse clinical outcomes and do not always improve erythropoiesis. We hypothesized that high-dose ESA requirement would be associated with elevated inflammatory biomarkers, decreased adipokines, and increased circulating, endogenous soluble erythropoietin receptors (sEpoR).
A cross-sectional cohort of 32 anemic CKD participants receiving ESA was enrolled at a single center, and cytokine profiles, adipokines, and sEpoR were compared between participants stratified by ESA dose requirement (usual-dose darbepoetin-α (<1 μg/kg/week) versus high-dose (≥1 μg/kg/week)).
Baseline characteristics were similar between groups; however, hemoglobin was lower among participants on high-dose (1.4 μg/kg/week) vs usual-dose (0.5 μg/kg/week) ESA.
In adjusted analyses, high-dose ESA was associated with increased odds of elevated c-reactive protein and interleukin-6 (p < 0.05 for both). There was no correlation between high-dose ESA and adipokines. Higher ESA dose correlated with higher levels of sEpoR (rs = 0.39, p = 0.03). In adjusted analyses, higher ESA dose (per μg/kg/week) was associated with 53% greater odds of sEpoR being above the median (p < 0.05).
High-dose ESA requirement among anemic CKD participants was associated with elevated inflammatory biomarkers and higher levels of circulating sEpoR, an inhibitor of erythropoiesis. Further research confirming these findings is warranted.
Compared with controls, HIV-infected persons have a greater prevalence of kidney disease as assessed by high levels of cystatin C and albuminuria, but not as assessed by creatinine level. However, the clinical importance of elevated cystatin C and albuminuria in the HIV-infected population has not been studied.
We conducted an observational cohort study to determine the association of kidney disease (measured by albuminuria, cystatin C, and serum creatinine) with mortality.
Setting & Participants
922 HIV-infected persons enrolled in the FRAM (Fat Redistribution and Metabolic Change in HIV infection) study.
Serum cystatin C and serum creatinine were used to estimate glomerular filtration rate (eGFR). Albuminuria was defined as a positive urine dipstick (≥1+) or a urine albumin-creatinine ratio > 30 mg/g.
At baseline, reduced kidney function (eGFRSCysC <60 mL/min/1.73m2) or albuminuria was present in 28% of participants. After five years of follow-up, mortality was 48% among those with both eGFRSCysC <60 mL/min/1.73m2 and albuminuria, 23% in those with eGFRSCysC <60 mL/min/1.73m2 alone, 20% in those with albuminuria alone, and 9% in those with neither condition. After multivariable adjustment for demographics, cardiovascular risk factors, HIV-related factors, and inflammatory markers, eGFRSCysC <60 mL/min/1.73m2 and albuminuria were associated with nearly a twofold increase in mortality, whereas eGFRSCr <60 mL/min/1.73m2 did not appear to have any substantial association with mortality. Together, eGFRSCysC <60 mL/min/1.73m2 and albuminuria accounted for 17% of the population-level attributable risk for mortality.
Vital status was unknown in 261 participants from the original cohort.
Kidney disease marked by albuminuria or increased cystatin C levels appears to be an important risk factor for mortality in HIV-infected individuals. A substantial proportion of this risk may be unrecognized because of the current reliance on serum creatinine to estimate kidney function in clinical practice.
kidney disease; mortality; HIV infection
Microalbuminuria is associated with increased risk of cardiovascular disease and mortality. The objective of the study was to evaluate if HIV infection was an independent risk factor for microalbuminuria.
The relationship between HIV infection and microalbuminuria was assessed using subjects enrolled in the study of Fat Redistribution and Metabolic Change in HIV Infection, which consists of HIV-positive and control men and women. Participants with proteinuria (dipstick ≥1+) were excluded.
Microalbuminuria (urinary albumin/creatinine ratio, ACR >30 mg/g) was present in 11% of HIV-infected and 2% of control participants (P<0.001), corresponding to a fivefold higher odds after multivariable adjustment (odds ratio, 5.11; 95% confidence interval, 1.97–13.31; P=0.0008). Several cardiovascular risk factors were associated with higher ACR in HIV-infected participants: insulin resistance (HOMA>4; 32%, P<0.0001), systolic blood pressure (21%, P=0.01 for 120–140 versus <120 mmHg, and 43%, P<0.06 for >140 versus <120 mmHg), and family history of hypertension (17%, P=0.03). Higher CD4 cell count was associated with a lower albumin/creatinine ratio (−24%, P=0.009 for 200–400 versus <200 cells/µl and −26%, P=0.005 for >400 versus <200 cells/µl).
HIV infection had a strong and independent association with microalbuminuria, the severity of which was predicted by markers of insulin resistance, hypertension, and advanced HIV infection. These associations warrant further investigation, as the increased prevalence of microalbuminuria in HIV infection may be a harbinger of future risk of cardiovascular and kidney diseases.
Microalbuminuria; kidney; urine protein; insulin resistance; lipodystrophy
Although studies have reported a high prevalence of end-stage renal disease in human immunodeficiency virus (HIV)-infected individuals, little is known about moderate impairments in kidney function. Cystatin C measurement may be more sensitive than creatinine for detecting impaired kidney function in persons with HIV.
We evaluated kidney function in the Fat Redistribution and Metabolic Change in HIV Infection (FRAM) cohort, a representative sample of 1008 HIV-infected persons and 290 controls from the Coronary Artery Risk Development in Young Adults (CARDIA) study in the United States.
Cystatin C level was elevated in HIV-infected individuals; the mean±SD cystatin C level was 0.92±0.22 mg/L in those infected with HIV and 0.76±0.15 mg/L in controls (P<.001). In contrast, both mean creatinine levels and estimated glomerular filtration rates appeared similar in HIV-infected individuals and controls (0.87±0.21 vs 0.85±0.19 mg/dL [to convert to micromoles per liter, multiply by 88.4] [P=.35] and 110±26 vs 106±23 mL/min/1.73 m2 [P=.06], respectively). Persons with HIV infection were more likely to have a cystatin C level greater than 1.0 mg/L (odds ratio, 9.8; 95% confidence interval, 4.4-22.0 [P<.001]), a threshold demonstrated to be associated with increased risk for death and cardiovascular and kidney disease. Among participants with HIV, potentially modifiable risk factors for kidney disease (hypertension and low high-density lipoprotein concentration) were associated with a higher cystatin C level, as were lower CD4 lymphocyte count and coinfection with hepatitis C virus (all P<.001).
Individuals infected with HIV had substantially worse kidney function when measured by cystatin C level compared with HIV-negative controls, whereas mean creatinine levels and estimated glomerular filtration rates were similar. Cystatin C measurement could be a useful clinical tool to identify HIV-infected persons at increased risk for kidney and cardiovascular disease.
Hypertension is common in hemodialysis patients; however, the relationship between interdialytic weight gain (IDWG) and blood pressure (BP) is incompletely characterized. This study seeks to define the relationship between IDWG and BP in prevalent hemodialysis subjects.
Study Design, Setting, & Participants
This study used data from 32,295 dialysis sessions in 442 subjects followed up for 6 months in the Crit-Line Intradialytic Monitoring Benefit (CLIMB) Study.
Outcomes & Measurements
Mixed linear regression was used to analyze the relationship between percentage of IDWG (IDWG [%] = [current predialysis weight − previous postdialysis weight]/dry weight × 100) as the independent variable and systolic BP (SBP) and predialysis − postdialysis SBP (ΔSBP) as dependent variables.
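The IDWG definition above is a simple ratio and can be expressed directly; the function below is an illustrative sketch of that formula, with hypothetical argument names:

```python
def idwg_percent(predialysis_weight_kg, prev_postdialysis_weight_kg, dry_weight_kg):
    """Interdialytic weight gain as a percentage of dry weight:
    (current predialysis weight - previous postdialysis weight) / dry weight * 100.
    """
    gain_kg = predialysis_weight_kg - prev_postdialysis_weight_kg
    return gain_kg / dry_weight_kg * 100.0
```

For example, a patient with a 70 kg dry weight who gains 2 kg between sessions has an IDWG of about 2.9%.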
In unadjusted analyses, every 1% increase in percentage of IDWG was associated with a 1.00 mm Hg (95% confidence interval [CI], ± 0.24) increase in predialysis SBP (P < 0.0001), 0.65 mm Hg (95% CI, ±0.24) decrease in postdialysis SBP (P < 0.0001), and 1.66 mm Hg (95% CI, ±0.25) increase in ΔSBP (P < 0.0001). After controlling for other significant predictors of SBP, every 1% increase in percentage of IDWG was associated with a 1.00 mm Hg (95% CI, ±0.24) increase in predialysis SBP (P < 0.0001) and a 1.08 mm Hg (95% CI, ±0.22) increase in ΔSBP with hemodialysis (P < 0.0001). However, in subjects with diabetes as the cause of end-stage renal disease, subjects with lower creatinine levels, and older subjects, the magnitude of the association between percentage of IDWG and predialysis SBP was less pronounced. The magnitude of percentage of IDWG on ΔSBP was less pronounced in younger subjects and subjects with lower dry weights. Results were similar with diastolic BP.
Hemodialysis BP measurements are imprecise estimates of BP and true hemodynamic burden in dialysis subjects.
In prevalent hemodialysis subjects, increasing percentage of IDWG is associated with increases in predialysis BP and BP changes with hemodialysis; however, the magnitude of the relationship is modest and modified by other clinical factors. Thus, although overall volume status may impact on BP to a greater extent, day-to-day variations in weight gain have a modest role in BP increases in prevalent subjects with end-stage renal disease.
Interdialytic weight gain; blood pressure; hemodialysis
This study examines the association between microalbuminuria and the development of proteinuria among HIV-infected persons.
948 subjects provided urine samples for albumin, protein, and creatinine measurements semiannually. Microalbuminuria was an albumin-to-creatinine ratio of >30 mg/gm. Proteinuria was a protein-to-creatinine ratio of ≥0.350 mg/mg. The progression from microalbuminuria to proteinuria was described.
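The study's urinary categories follow directly from the two thresholds just defined; the helper below is a sketch of that classification, applying the proteinuria cutoff first since proteinuria supersedes microalbuminuria:

```python
def classify_urine(acr_mg_per_g, pcr_mg_per_mg):
    """Classify urine protein status using the study's definitions:
    proteinuria = protein-to-creatinine ratio >= 0.350 mg/mg;
    microalbuminuria = albumin-to-creatinine ratio > 30 mg/g.
    """
    if pcr_mg_per_mg >= 0.350:
        return "proteinuria"
    if acr_mg_per_g > 30:
        return "microalbuminuria"
    return "normal"
```

Under these definitions, a subject's status can move from "normal" to "microalbuminuria" to "proteinuria" across semiannual visits, which is the progression the study describes.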
At baseline, 69.4% had no detectable proteinuria, 20.2% had microalbuminuria, and 10.4% had proteinuria. Subjects with microalbuminuria and proteinuria were more likely to be black (p=0.03), to have lower CD4+ counts (p=0.02 and p=0.0001, respectively, compared with subjects without abnormal protein excretion), and to have higher HIV RNA levels (p=0.08 and p=0.04, respectively). Among 658 subjects with normal urine protein at baseline, 82.7% continued to have no abnormality, 14.3% developed microalbuminuria, and 3.0% developed proteinuria. Subjects without baseline proteinuria (i.e., either normal protein excretion or microalbuminuria) who developed proteinuria were more likely to have microalbuminuria (p=0.001), a lower CD4+ count (p=0.06), and a higher plasma HIV RNA level (p=0.03) than those who did not progress to proteinuria. In multivariate analysis, only microalbuminuria remained associated with the development of proteinuria (OR=2.9; 95% CI 1.5, 5.5; p=0.001).
Microalbuminuria predicts the development of proteinuria among HIV-infected persons. Because proteinuria has been linked to poorer outcomes, strategies to affect microalbuminuria should be tested.
HIV-1; microalbuminuria; proteinuria; HIVAN; urine
Background. Patients with end-stage renal disease (ESRD) requiring chronic haemodialysis who undergo coronary artery bypass graft surgery (CABG) are at significant risk for perioperative mortality. However, the impact of changes in ESRD patient volume and characteristics over time on operative outcomes is unclear.
Methods. Using the Nationwide Inpatient Sample database (1988–2003), we evaluated rates of CABG surgery with and without concurrent valve surgery among ESRD patients, along with outcomes including in-hospital mortality and length of hospital stay. Multivariate regression models were used to account for patient characteristics and potential confounders.
Results. From 1988 to 2003, annual rates of CABG among ESRD patients doubled from 2.5 to 5 per 1000 patient-years. Concomitantly, the patient case-mix shifted to include patients with greater co-morbidities such as diabetes, hypertension and obesity (all P < 0.001). Nonetheless, among ESRD patients, in-hospital mortality rates declined nearly 6-fold, from over 31% to 5.4% (versus 4.7% to 1.8% among non-ESRD patients), and the median length of in-hospital stay fell by nearly half, from 25 to 13 days (versus 14 to 10 days among non-ESRD patients).
Conclusions. Since 1988, an increasing number of patients with ESRD have been receiving CABG in the USA. Despite increasing co-morbidities, operative mortality rates and length of in-hospital stay have declined substantially. Nonetheless, mortality rates remain almost 3-fold higher compared to non-ESRD patients indicating a need for ongoing improvement.
coronary artery bypass graft; end-stage renal disease; in-hospital mortality; perioperative outcomes
In the United States, HIV-related kidney disease disproportionately affects individuals of African descent; however, there are few estimates of kidney disease prevalence in Africa. We evaluated the prevalence of kidney disease among HIV-infected and uninfected Rwandan women.
The Rwandan Women's Interassociation Study and Assessment prospectively enrolled 936 women. Associations with estimated glomerular filtration rate (eGFR)<60 mL/min/1.73 m2 and proteinuria were assessed in separate logistic regression models.
Among 891 non-pregnant women with available data, 2.4% had an eGFR<60 mL/min/1.73 m2 (calculated by the Modification of Diet in Renal Disease equation, MDRD eGFR) and 8.7% had proteinuria ≥1+. The prevalence of decreased eGFR varied markedly depending on the estimating method used, with the highest prevalence by Cockcroft-Gault. Regardless of the method used to estimate GFR, the proportion with decreased eGFR or proteinuria did not differ significantly between HIV-infected and -uninfected women in unadjusted analysis. After adjusting for age and blood pressure, HIV infection was associated with significantly higher odds of decreased MDRD eGFR but not proteinuria.
In a well-characterized cohort of Rwandan women, HIV infection was associated with decreased MDRD eGFR. The prevalence of decreased eGFR among HIV-infected women in our study was lower than that previously reported in African-Americans and in other Central and East African HIV populations, although there was substantial variability depending on the equation used to estimate GFR. Future studies are needed to optimize GFR estimates and to determine the impact of antiretroviral therapy on kidney disease in this population.
How co-infection with hepatitis C virus (HCV) impacts on the trajectory of kidney function among HIV-infected patients is unclear. This study examined the effect of HCV on kidney function over time among women infected with HIV.
Retrospective observational cohort study.
Setting and Participants
Study sample included participants from the Women's Interagency HIV Study who were HIV-infected and had received HCV antibody testing and serum creatinine measurement at baseline.
Outcomes and Measurement
Estimated glomerular filtration rate (eGFR) calculated from semi-annual serum creatinine measurements using the 4-variable Modification of Diet in Renal Diseases (MDRD) Study equation. Linear mixed models were used to evaluate the independent effect of being HCV seropositive on eGFR over time, adjusting for demographic factors, co-morbid conditions, illicit drug use, measures of HIV disease status, use of medications, and interactions with baseline low eGFR (<60 mL/min/1.73m2).
Of the 2,684 HIV-infected women, 952 (35%) were found to be HCV seropositive. For 180 women with CKD at baseline (eGFR <60 mL/min/1.73m2), being HCV seropositive was independently associated with a fully-adjusted net decline in eGFR of about 5% per year (95% CI: 3.2 to 7.2%), relative to women who were seronegative. In contrast, HCV was not independently associated with decline in eGFR among women without low eGFR at baseline (p<0.001 for interaction).
The MDRD Study equation has not been validated as a measure of GFR among persons with HIV or HCV. Proteinuria was not included in the study analysis. Because the study is observational, the effects of residual confounding cannot be excluded.
Among HIV-infected women with CKD, co-infection with HCV is associated with a modest, but statistically significant decline in eGFR over time. More careful monitoring of kidney function may be warranted for HIV-infected patients with CKD who are also co-infected with HCV.
hepatitis C virus; HIV; kidney diseases; women
Intradialytic increases in blood pressure (BP) can complicate the management of hypertension in hemodialysis (HD) patients but the long-term consequences are uncertain. Thus, we sought to determine if BP elevations during HD were associated with higher 2-year mortality among incident HD patients.
Secondary analysis of a prospective dialysis cohort.
Setting and Participants
Incident hemodialysis patients in the Dialysis Morbidity and Mortality Wave 2 Study.
Changes in systolic BP (SBP) during hemodialysis (ie, postdialysis SBP–predialysis SBP), averaged from 3 hemodialysis sessions prior to enrollment.
Time to 2-year all-cause mortality.
Cox regression was used to model hazard ratios for mortality associated with changes in SBP during HD, while adjusting for demographics, comorbid conditions, interdialytic weight gain, laboratory variables, and antihypertensive agents.
Of 1,748 patients, 12.2% exhibited >10mmHg increases in SBP during HD. In adjusted analyses, every 10mmHg increase in SBP during HD was independently associated with a 6% increased hazard of death (HR 1.06, CI 1.01-1.11). When also adjusted for diastolic BP and postdialysis SBP, the adjusted hazard of death associated with increasing SBP during HD remained significant (HR 1.12, CI 1.05-1.21, per 10mmHg increase in ΔSBP during HD). However, in analyses adjusted for predialysis SBP, there was a significant interaction between change in SBP and predialysis SBP. In analyses stratified by predialysis SBP, trends for an increased mortality associated with increasing SBP during dialysis were present in patients with predialysis SBP <160mmHg, however, this relationship was only significant in patients with predialysis SBP <120mmHg.
Secondary analysis with a limited number of baseline BP measurements and limited information on dialysis prescription.
Increases in SBP of >10mmHg during hemodialysis occurred in ∼10% of incident patients. Although increasing systolic BP during HD was associated with decreased 2-year survival, these findings were limited to patients with predialysis systolic BP <120 mmHg.
blood pressure; end-stage renal disease; epidemiology and outcomes; hemodialysis; hypertension; mortality
Trials of anemia correction in chronic kidney disease have found either no benefit or detrimental outcomes with higher targets. We performed a secondary analysis of patients with chronic kidney disease enrolled in the Correction of Hemoglobin and Outcomes in Renal Insufficiency (CHOIR) trial to measure the potential for competing benefit and harm from achieved hemoglobin and epoetin dose. In the 4-month analysis, significantly more patients in the high-hemoglobin than in the low-hemoglobin arm were unable to achieve the target hemoglobin and required high-dose epoetin-α. In unadjusted analyses, the inability to achieve a target hemoglobin and high-dose epoetin-α were each significantly associated with increased risk of a primary endpoint (death, myocardial infarction, congestive heart failure, or stroke). In adjusted models, high-dose epoetin-α was associated with a significantly increased hazard of a primary endpoint, whereas randomization to the high-hemoglobin arm was not, suggesting a possible mediating effect of the higher target via dose. Similar results were seen in the 9-month analysis. Our study demonstrates that patients achieving their target had better outcomes than those who did not, and that among subjects who achieved their randomized target, no increased risk associated with the higher hemoglobin goal was detected. Prospective studies are needed to confirm this relationship and to determine safe dosing algorithms for patients unable to achieve the target hemoglobin.
anemia; chronic kidney disease; epoetin-α; dose; epidemiology and outcomes
The CHOIR trial in anemic patients with chronic kidney disease compared epoetin-alfa treatment to low (11.3 g/dl) and high (13.5 g/dl) hemoglobin targets on the composite end point of death, hospitalization for heart failure, stroke, and myocardial infarction. Other anemia management trials in patients with chronic kidney disease have likewise found increased risk when hemoglobin is targeted above 13 g/dl. In this secondary analysis of the CHOIR trial, we compared outcomes among the subgroups of patients with diabetes and heart failure to describe the comparative relationship of treatment to these two hemoglobin goals. By Cox regression analysis, there was no increased risk associated with the higher hemoglobin target among patients with heart failure. In patients without heart failure, however, the hazard ratio (1.86) associated with the higher target was significant. Comparing survival curves in an unadjusted model, patients with diabetes did not have a greater hazard associated with the higher target. Subjects without diabetes had a significantly greater hazard in the high-target than in the low-target arm, but the interaction between diabetes and target was not significant. We suggest that the increased risks associated with higher hemoglobin targets are not clinically apparent in subgroups with greater mortality risk. These differential outcomes underscore the need for dedicated trials in these subpopulations.
anemia; diabetes mellitus; heart failure; kidney
Pulse pressure is a well established marker of vascular stiffness and is associated with increased mortality in hemodialysis patients. Here we sought to determine if a decrease in pulse pressure during hemodialysis was associated with improved outcomes using data from 438 hemodialysis patients enrolled in the 6-month Crit-Line Intradialytic Monitoring Benefit Study. The relationship between changes in pulse pressure during dialysis (2-week average) and the primary end point of non-access-related hospitalization and death were adjusted for demographics, comorbidities, medications, and laboratory variables. In the analyses that included both pre- and post-dialysis pulse pressure, higher pre-dialysis and lower post-dialysis pulse pressure were associated with a decreased hazard of the primary end point. Further, every 10 mm Hg decrease in pulse pressure during dialysis was associated with a 20% lower hazard of the primary end point. In separate models that included pulse pressure and the change in pulse pressure during dialysis, neither pre- nor post-dialysis pulse pressure were associated with the primary end point, but each 10 mm Hg decrease in pulse pressure during dialysis was associated with about a 20% lower hazard of the primary end point. Our study found that in prevalent dialysis subjects, a decrease in pulse pressure during dialysis was associated with improved outcomes. Further study is needed to identify how to control pulse pressure to improve outcomes.
end-stage renal disease; hemodialysis; intradialytic blood pressure; morbidity and mortality; outcomes; pulse pressure
Because both renal disease and immune activation predict progression to AIDS, we evaluated the relationships between dipstick proteinuria ≥1+ [7% of 1012 subjects], CrCl <90mL/min [18% of 1071 subjects], and percentages of peripheral activated CD8 cells (CD8+CD38+HLA-DR+) in antiretroviral-naïve, HIV-infected subjects enrolled into AIDS Clinical Trials Group studies 384 and A5095. Proteinuria, but not CrCl, was associated with higher percentages of CD8+CD38+HLA-DR+ cells [55% vs. 50%; P=0.01], with even more pronounced differences in men and among Blacks and Hispanics. Proteinuria may be a surrogate measure of greater immune activation in HIV-infected patients initiating antiretroviral therapy.
HIV-1; proteinuria; renal failure; nephropathy; immune activation
Acute kidney injury is common in HIV-infected patients, and has been associated with increased morbidity and mortality. Prior to the introduction of effective antiretroviral therapy, acute kidney injury in HIV-positive patients was most commonly the result of volume depletion, septicemia, or nephrotoxic medications. Acute kidney injury remains a significant problem in the antiretroviral era, and is still commonly attributed to infection or nephrotoxic medications. Less common causes such as direct infectious insults, immune restoration inflammatory syndrome, rhabdomyolysis, and obstruction should be considered when the underlying process is not obvious. In addition to advanced HIV disease, several other patient characteristics have emerged as potential risk factors for acute kidney injury in the antiretroviral era, including older age, diabetes, pre-existing chronic kidney disease, and hepatitis co-infection or liver disease.
Acute kidney injury; acute tubular necrosis; HIV
To evaluate the pharmacokinetics and pharmacogenomics of efavirenz (EFV) and lopinavir/ritonavir (LPV/RTV) in HIV-infected persons requiring hemodialysis.
Prospective, observational study of HIV-infected hemodialysis subjects receiving one 600mg tablet daily of EFV (N=13) or three 133.3/33.3mg capsules twice daily of LPV/RTV (N=13).
24-hour EFV and 12-hour LPV/RTV pharmacokinetics were assessed. Geometric mean ratios were calculated using historical controls with normal renal function. The effects of several candidate gene polymorphisms were also explored.
The geometric mean (95% CI, %CV) Cmin, Cmax, and AUC for the EFV group were 1.81µg/mL (0.93, 3.53; 103%), 5.04µg/mL (3.48, 7.29; 72%), and 71.5µg·h/mL (43.2, 118.3; 93%), respectively. These parameters were 2.76µg/mL (1.86, 4.11; 53%), 8.45µg/mL (6.41, 11.15; 52%), and 69.6µg·h/mL (55.6, 87.2; 37%) for LPV and 0.08µg/mL (0.05, 0.14; 63%), 0.58µg/mL (0.44, 0.76; 41%), and 3.74µg·h/mL (2.91, 4.80; 37%) for RTV. The AUC geometric mean ratios (90% CI) for EFV, LPV, and RTV were 132% (89, 197), 81% (67, 97), and 92% (76, 111), respectively. LPV Cmin was lower than expected in the hemodialysis group. Higher EFV concentrations were associated with the CYP2B6 516G>T polymorphism.
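The summary statistics above rest on geometric means and their ratios; the sketch below illustrates those computations. How the study handled the historical-control data is not stated, so the `gmr_percent` comparison is an assumption about the general approach, not the study's exact method:

```python
import math

def geometric_mean(values):
    """Geometric mean: the exponential of the mean of the natural logs."""
    return math.exp(sum(math.log(v) for v in values) / len(values))

def gmr_percent(test_values, reference_values):
    """Geometric mean ratio of test vs reference, expressed as a percent,
    e.g. a hemodialysis group versus historical controls."""
    return 100.0 * geometric_mean(test_values) / geometric_mean(reference_values)
```

Because pharmacokinetic parameters such as AUC are roughly log-normally distributed, ratios of geometric means (rather than arithmetic means) are the conventional exposure comparison.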
The pharmacokinetics of EFV and LPV/RTV in hemodialysis suggest that no dosing adjustments are necessary in treatment-naïve patients. As HIV-infected hemodialysis patients are disproportionately black, the increased frequency of the CYP2B6 516G>T polymorphism may lead to higher EFV levels. The potentially lower LPV trough levels in this population suggest that LPV/RTV should be used with caution in protease inhibitor-experienced patients.
Pharmacokinetics; Pharmacogenomics; Renal Failure; Dialysis; HIV; Efavirenz; Lopinavir; Ritonavir
Proteinuria is associated with progressive renal disease and overall mortality in HIV-infected patients. However, the prevalence and correlates of quantitative proteinuria in the HAART era are unknown.
Spot urine protein to creatinine (P/Cr) ratios, an accepted measure of quantitative daily proteinuria, were measured annually since 2002 in participants of the AIDS Clinical Trials Group Longitudinal Linked Randomized Trials (ALLRT) cohort. We used linear regression models with generalized estimating equations to identify factors associated with the abnormal P/Cr thresholds of ≥0.2 and ≥1.0.
A total of 2857 participants, most of whom were receiving antiretroviral therapy, were analyzed. 16% and 3% had P/Cr levels ≥0.2 and ≥1.0, respectively, at first measurement. P/Cr levels did not change during a median follow-up of 3 (IQR 2, 4) years. Factors associated with P/Cr ≥0.2 at any measurement included greater age, lower glomerular filtration rate, female sex, antiretroviral therapy prior to entry into the parent randomized trial, HIV-1 RNA level ≥400 copies/ml, lower CD4 cell count, and history of hypertension, diabetes, or hepatitis C co-infection (all P<0.04). Black race and higher non-HDL-C levels were associated with P/Cr levels ≥1.0 but not with P/Cr levels ≥0.2. Hepatitis B co-infection and current use of adefovir, indinavir, and tenofovir were not associated with either P/Cr threshold.
Both HIV- and non-HIV-related factors are associated with abnormal levels of proteinuria and identify patients at greater risk of worse clinical outcomes. Several of these factors are differentially associated with the lower and higher proteinuria thresholds.
HIV; Proteinuria; Antiretroviral Therapy; Nephropathy
Administrative claims are a rich source of information for epidemiological and health services research; however, the ability to accurately capture specific diseases or complications using claims data has been debated. In this study, we examined the validity of International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnosis codes for the identification of hyponatremia in an outpatient managed care population.
We analyzed outpatient laboratory and professional claims for patients aged 18 years and older in the National Managed Care Benchmark Database from Integrated Healthcare Information Services. We obtained all claims for outpatient serum sodium laboratory tests performed in 2004 and 2005, and all outpatient professional claims with a primary or secondary ICD-9-CM diagnosis code of hyponatremia (276.1).
A total of 40,668 outpatient serum sodium laboratory results were identified as hyponatremic (serum sodium < 136 mmol/L). The sensitivity of ICD-9-CM codes for hyponatremia in outpatient professional claims within 15 days before or after the laboratory date was 3.5%. Even for severe cases (serum sodium ≤ 125 mmol/L), sensitivity was < 30%. Specificity was > 99% for all cutoff points.
ICD-9-CM codes in administrative data are insufficient to identify hyponatremia in an outpatient population.
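The validation logic above, in which the laboratory serum sodium result serves as the gold standard and the ICD-9-CM code is the test under evaluation, can be sketched as a simple tally. The record structure and toy values here are assumptions for illustration.

```python
def sensitivity_specificity(records):
    # records: iterable of (lab_hyponatremic, icd_coded) boolean pairs,
    # with the laboratory serum sodium result as the gold standard.
    tp = sum(1 for lab, icd in records if lab and icd)          # true positives
    fn = sum(1 for lab, icd in records if lab and not icd)      # false negatives
    tn = sum(1 for lab, icd in records if not lab and not icd)  # true negatives
    fp = sum(1 for lab, icd in records if not lab and icd)      # false positives
    sensitivity = tp / (tp + fn)  # coded cases among lab-confirmed cases
    specificity = tn / (tn + fp)  # uncoded among lab-normal results
    return sensitivity, specificity

# Toy example: 2 lab-confirmed cases (1 coded), 2 normal results (0 coded).
sens, spec = sensitivity_specificity(
    [(True, True), (True, False), (False, False), (False, False)]
)
```

The study's pattern of very low sensitivity with very high specificity corresponds to many lab-confirmed cases lacking a diagnosis code (large fn) while coded diagnoses rarely appear without laboratory confirmation (small fp).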
The steady-state pharmacokinetics of lamivudine were evaluated in 11 subjects with human immunodeficiency virus infection and end-stage renal disease, 9 of whom were receiving hemodialysis and 2 of whom were receiving chronic ambulatory peritoneal dialysis (CAPD). All subjects received 150 mg of lamivudine daily for at least 2 weeks prior to sampling for determination of the pharmacokinetics of lamivudine over a 24-h period on 2 consecutive days. On the first day, subjects received 150 mg of oral lamivudine and underwent dialysis (hemodialysis or CAPD). On the second day, subjects received another 150 mg of oral lamivudine but dialysis was not performed. For the subjects undergoing hemodialysis, the geometric mean predose serum lamivudine concentration was 1.14 μg/ml (95% confidence interval [CI], 0.83 to 1.58 μg/ml), the geometric mean maximum concentration in serum (Cmax) was 3.77 μg/ml (95% CI, 3.01 to 4.71 μg/ml), and the geometric mean area under the serum concentration-time curve from time zero to 24 h (AUC0-24) was 49.8 μg · h/ml (95% CI 39.1 to 63.6 μg · h/ml). Hemodialysis removed approximately 28 mg of lamivudine but had no significant effect on Cmax or AUC0-24. In the absence of hemodialysis, the geometric mean lamivudine terminal elimination half-life was 17.2 h (95% CI, 10.5 to 28.1 h), whereas the geometric mean intradialysis half-life of lamivudine was 5.3 h (95% CI, 3.4 to 8.2 h). The pharmacokinetics of lamivudine in subjects undergoing CAPD were similar to those in subjects undergoing hemodialysis. CAPD removed 24 mg of lamivudine over a 24-h period but had no effect on Cmax or AUC0-24. Pharmacokinetic modeling suggests that a lamivudine dose of 25 mg daily in hemodialysis subjects would provide serum exposure similar to that provided by a dose of 150 mg twice daily in patients with normal renal function.
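The dose-reduction logic behind the pharmacokinetic modeling described above can be sketched under a linear-pharmacokinetics assumption: daily steady-state exposure is AUC = F × daily dose / CL, so matching exposure across populations scales the dose by the ratio of apparent clearances. The clearance values below are purely illustrative, not parameters reported in the study.

```python
def matched_daily_dose(reference_daily_dose_mg, cl_reference, cl_impaired):
    # Under linear pharmacokinetics, daily AUC = F * daily dose / CL,
    # so holding AUC constant across populations scales the daily dose
    # by the ratio of apparent clearances.
    return reference_daily_dose_mg * (cl_impaired / cl_reference)

# Illustrative apparent clearances only (L/h); assumed, not study values.
# Reference regimen: 150 mg twice daily = 300 mg/day with normal renal function.
reduced_dose = matched_daily_dose(300, cl_reference=24.0, cl_impaired=2.0)
```

With these assumed clearances the matched dose works out to about 25 mg/day, which mirrors the form of the modeling conclusion quoted above, though the study's actual model and parameters may differ.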
Racial differences in the use of diagnostic and therapeutic services have an impact on outcomes in patients with chronic kidney disease. Important contributors to these racial disparities are inadequate insurance, poor access to health services networks, and overt prejudice or subconscious bias. The use of an appropriate dose of hemodialysis is a fundamental health intervention for end-stage renal disease, which can serve as a measure of the adequacy of healthcare provision. When the dose of hemodialysis was analyzed by race, the greatest deficiency in care was observed for African Americans, who had a 60% greater likelihood of receiving inadequate dialysis compared with whites. The Centers for Medicare and Medicaid Services (CMS) have developed and implemented evidence-based clinical practice guidelines designed to improve the services provided by the renal community. This approach had a positive impact on the dialysis doses received by patients: the percentage of patients achieving a benchmark urea reduction ratio (URR) ≥65% increased from 43% in 1993 to 72% in 1997. The most dramatic improvement, however, was seen among African Americans, with a 92% increase in the proportion of patients achieving a URR ≥65%. Rather than focusing on who is treated, processes should be adopted that focus on how patients are treated. Increasing the use of evidence-based practices offers strategies aimed at ensuring equal treatment for all and encompasses physician accountability, without the need for specific race-based intervention programs.