Serum albumin concentrations are a strong predictor of mortality and cardiovascular disease in HIV-infected individuals. We studied the longitudinal associations between serum albumin levels and kidney function decline in a population of HIV-infected women.
Retrospective cohort analysis.
Setting & Participants
The study participants were recruited from the Women’s Interagency HIV Study (WIHS), a large observational study designed to understand risk factors for the progression of HIV infection in women living in urban communities. A total of 908 participants had a baseline assessment of kidney function and two follow-up measures over an average of 8 years.
The primary predictor was serum albumin concentration.
We examined annual change in kidney function. Secondary outcomes included rapid kidney function decline and incident reduced estimated glomerular filtration rate (eGFR).
Kidney function decline was determined from cystatin C–based eGFR (eGFRcys) and creatinine-based eGFR (eGFRcr) measured at baseline and follow-up. Each model was adjusted for kidney disease and HIV-related risk factors using linear and relative risk regression.
After multivariate adjustment, each 0.5-g/dL decrement in baseline serum albumin concentration was associated with a 0.56-mL/min/1.73 m2 faster annual decline in eGFRcys (p<0.001), which was only slightly attenuated, to 0.55 mL/min/1.73 m2, after adjustment for albuminuria. Results were similar whether using eGFRcys or eGFRcr. In adjusted analyses, each 0.5-g/dL lower baseline serum albumin was associated with a 1.71-fold greater risk of rapid kidney function decline (p<0.001) and a 1.72-fold greater risk of incident reduced eGFR (p<0.001).
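As a concrete illustration of the annual-decline outcome, a per-participant slope of eGFR over time can be computed with ordinary least squares. This is a sketch only: the study's estimates come from adjusted linear regression models, and the visit values below are hypothetical.

```python
def annual_egfr_slope(visits):
    """Ordinary least-squares slope of eGFR (mL/min/1.73 m2 per year)
    across one participant's visits, given (years_from_baseline, eGFR) pairs."""
    n = len(visits)
    mean_t = sum(t for t, _ in visits) / n
    mean_g = sum(g for _, g in visits) / n
    cov = sum((t - mean_t) * (g - mean_g) for t, g in visits)
    var = sum((t - mean_t) ** 2 for t, _ in visits)
    return cov / var

# Three measures over 8 years, matching the study design (hypothetical values)
print(annual_egfr_slope([(0, 90), (4, 85), (8, 80)]))  # -1.25 mL/min/1.73 m2 per year
```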
The cohort is composed of only female participants from urban communities within the United States.
Lower levels of serum albumin were strongly associated with kidney function decline and incident reduced eGFR in HIV-infected women, independent of HIV disease status, BMI and albuminuria.
albumin; kidney function; HIV; incident reduced eGFR; albuminuria; disease trajectory; chronic kidney disease (CKD) progression
In contrast to findings from cohorts comprised primarily of HIV-infected men, verbal memory deficits are the largest cognitive deficit found in HIV-infected women from the Women’s Interagency HIV Study (WIHS), and this deficit is not explained by depressive symptoms or substance abuse. HIV-infected women may be at greater risk for verbal memory deficits due to a higher prevalence of cognitive risk factors such as high psychosocial stress and lower socioeconomic status. Here, we investigate the association between perceived stress using the Perceived Stress Scale (PSS-10) and verbal memory performance using the Hopkins Verbal Learning Test (HVLT) in 1009 HIV-infected and 496 at-risk HIV-uninfected WIHS participants. Participants completed a comprehensive neuropsychological test battery which yielded seven cognitive domain scores, including a primary outcome of verbal memory. HIV infection was not associated with a higher prevalence of high perceived stress (i.e., PSS-10 score in the top tertile) but was associated with worse performance on verbal learning (p<0.01) and memory (p<0.001), as well as attention (p=0.02). Regardless of HIV status, high stress was associated with poorer performance in those cognitive domains (p’s< 0.05) as well as processing speed (p=0.01) and executive function (p<0.01). A significant HIV by stress interaction was found only for the verbal memory domain (p=0.02); among HIV-infected women only, high stress was associated with lower performance (p’s<0.001). That association was driven by the delayed verbal memory measure in particular. These findings suggest that high levels of perceived stress contribute to the deficits in verbal memory observed in WIHS women.
HIV; Verbal memory; Stress; Women; Cognition
Chronic kidney disease (CKD) is common in HIV infection and is associated with mortality. Urinary markers of tubular injury have been associated with future kidney disease risk, but associations with mortality are unknown.
We evaluated the associations of urinary interleukin-18 (IL-18), liver fatty acid binding protein (L-FABP), kidney injury molecule-1 (KIM-1), neutrophil gelatinase-associated lipocalin (NGAL), and albumin-to-creatinine ratio (ACR) with 10-year, all-cause death in 908 HIV-infected women. Kidney function was estimated using cystatin C (eGFRcys).
There were 201 deaths during 9,269 person-years of follow-up. After demographic adjustment, compared to the lowest tertile, the highest tertiles of IL-18 (HR 2.54, 95% CI 1.75–3.68), KIM-1 (2.04, 1.44–2.89), NGAL (1.50, 1.05–2.14), and ACR (1.63, 1.13–2.36) were associated with higher mortality. After multivariable adjustment including eGFRcys, only the highest tertiles of IL-18 (1.88, 1.29–2.74) and ACR (1.46, 1.01–2.12) remained independently associated with mortality. Findings with KIM-1 were borderline (1.41, 0.99–2.02). We found a J-shaped association between L-FABP and mortality: compared to persons in the lowest tertile, the HR for the middle tertile of L-FABP was 0.67 (0.46–0.98) after adjustment. Findings were stronger when IL-18, ACR, and L-FABP were simultaneously included in models.
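The crude mortality rate behind these figures is a simple function of event counts and follow-up time. A minimal sketch using the counts reported above (the adjusted hazard ratios, of course, come from regression models, not from this calculation):

```python
def rate_per_100py(events, person_years):
    """Crude incidence rate per 100 person-years of follow-up."""
    return 100.0 * events / person_years

# 201 deaths over 9,269 person-years, as reported above
print(round(rate_per_100py(201, 9269), 2))  # 2.17 deaths per 100 person-years
```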
Among HIV-infected women, some urinary markers of tubular injury are associated with mortality risk, independently of eGFRcys and ACR. These markers represent potential tools to identify early kidney injury in persons with HIV.
HIV; IL-18; KIM-1; L-FABP; NGAL; urinary biomarkers
Recent studies suggest that HIV-specific antibody-dependent cell-mediated cytotoxicity (ADCC) antibodies contribute to protective immunity against HIV. An important characteristic of future HIV vaccines will, therefore, be the ability to stimulate production of these antibodies in both men and women. Early studies suggest that men may have a better ADCC antibody response against HIV than women. Our objective was to determine whether men and women differ with respect to their ADCC response to HIV-1 gp120. HIV-positive, asymptomatic untreated men and women were matched for race, age, CD4+ T cell number, HIV-1 viral load, and treatment, and their HIV-1 gp120 ADCC antibody titers were compared. A standard 51Cr-release assay was used to determine HIV-1 gp120 ADCC antibody titers in HIV-1-seropositive individuals from the Multicenter AIDS Cohort Study (MACS; n=32) and the Women's Interagency HIV Study (WIHS; n=32). Both sexes had high ADCC titers against HIV-1 gp120: 34.4% (n=11) and 40.6% (n=13) of men and women, respectively, had titers of 10,000; 62.5% (n=20) and 56.3% (n=18) had titers of 100,000. The groups did not differ in percent specific release (% SR), lytic units (LU), or correlations of titer with viral load or CD4+ T cell count. Both groups also had similar cross-clade ADCC antibody responses (p>0.5 for % SR and LU). Comparable groups of asymptomatic HIV-1-infected men and women had comparable HIV-1 gp120 ADCC antibodies. Both sexes had significant cross-clade reactivity. Differences between men and women may become evident as disease progresses; this should be evaluated at later stages of HIV-1 infection.
We examined whether the established associations of HIV infection and HIV disease progression with worse health-related quality of life (HQOL) were applicable to women with severe trauma histories, in this case Rwandan women genocide survivors, the majority of whom were HIV-infected. Additionally, this study attempted to clarify whether post-traumatic stress symptoms were uniquely associated with HQOL or confounded with depression.
The Rwandan Women’s Interassociation Study and Assessment (RWISA) was a longitudinal prospective study of HIV-infected and uninfected women. At study entry 922 women (705 HIV+ and 217 HIV−) completed measures of symptoms of post-traumatic stress and HQOL as well as other demographic, clinical and behavioral characteristics.
Even after controlling for potential confounders and mediators, HIV+ women, in particular those with the lowest CD4 counts, scored significantly worse on HQOL and overall QOL than did HIV− women. Even after controlling for depression and HIV disease progression, women with more post-traumatic stress symptoms scored worse on HQOL and overall QOL than women with fewer post-traumatic stress symptoms.
This study demonstrated that post-traumatic stress symptoms were independently associated with HQOL and overall QOL, independent of depression and other confounders or potential mediators. Future research should examine whether the long term impact of treatment on physical and psychological symptoms of HIV and post-traumatic stress symptoms would generate improvement in HQOL.
Quality of Life; Posttraumatic Stress Disorder; HIV; Women; Rwanda
This study addressed whether psychopharmacologic and psychotherapeutic treatment of depressed HIV+ women met standards defined in the best-practice literature, and tested hypothesized predictors of standard-concordant care. A total of 1,352 HIV-positive women in the multi-center Women’s Interagency HIV Study were queried about depressive symptoms and mental health service utilization, using standards published by the American Psychiatric Association and the Agency for Healthcare Research and Quality to define adequate depression treatment. We identified those who: 1) reported clinically significant depressive symptoms (CSDS) using Centers for Epidemiological Studies – Depression Scale (CES-D) scores of ≥16; or 2) had lifetime diagnoses of major depressive disorder (MDD) assessed by World Mental Health Composite International Diagnostic Interviews plus concurrent elevated depressive symptoms in the past 12 months. Adequate treatment prevalence was 46.2% (n=84) for MDD and 37.9% (n=211) for CSDS. Multivariable logistic regression analysis found that adequate treatment was more likely among women who saw the same primary care provider consistently, who had poorer role functioning, who paid out-of-pocket for healthcare, and who were not African American or Hispanic/Latina. This suggests that adequate depression treatment may be increased by promoting healthcare provider continuity, reaching out to individuals with lower levels of role impairment, and addressing the specific needs and concerns of African American and Hispanic/Latina women.
Women and HIV; Depression Treatment; Psychopharmacology; Psychotherapy
We examined the interaction of illicit drug use and depressive symptoms, and how they affect the subsequent likelihood of highly active antiretroviral therapy (HAART) use among women with HIV/AIDS.
Subjects included 1,710 HIV-positive women recruited from six sites in the U.S.: Brooklyn, the Bronx, Chicago, Los Angeles, the San Francisco/Bay Area, and Washington, DC. Cases of probable depression were identified using depressive symptom scores on the Centers for Epidemiologic Studies Depression Scale. Crack, cocaine, heroin, and amphetamine use was self-reported at 6-month intervals. We conducted multivariate random-effects logistic regression analysis of data collected during sixteen waves of semiannual interviews conducted from April 1996 through March 2004.
We found an interaction effect between illicit drug use and depression that acted to suppress subsequent HAART use, controlling for virologic and immunologic indicators, socio-demographic variables, time, and study site.
This is the first study to document the interactive effects of drug use and depressive symptoms on reduced likelihood of HAART use in a national cohort of women. Since evidence-based behavioral health and antiretroviral therapies for each of these three conditions are now available, comprehensive HIV treatment is an achievable public health goal.
HIV; depression; HAART; drug use
HIV infection and illicit drug use are each associated with diminished cognitive performance. This study examined the separate and interactive effects of HIV and recent illicit drug use on verbal memory, processing speed and executive function in the multicenter Women's Interagency HIV Study (WIHS).
Participants included 952 HIV-infected and 443 HIV-uninfected women (mean age=42.8, 64% African-American). Outcome measures included the Hopkins Verbal Learning Test - Revised (HVLT-R) and the Stroop test. Three drug use groups were compared: recent illicit drug users (cocaine or heroin use in past 6 months, n=140), former users (lifetime cocaine or heroin use but not in past 6 months, n=651), and non-users (no lifetime use of cocaine or heroin, n=604).
The typical pattern of recent drug use was daily or weekly smoking of crack cocaine. HIV infection and recent illicit drug use were each associated with worse verbal learning and memory (p's<.05). Importantly, there was an interaction between HIV serostatus and recent illicit drug use such that recent illicit drug use (compared to non-use) negatively impacted verbal learning and memory only in HIV-infected women (p's <0.01). There was no interaction between HIV serostatus and illicit drug use on processing speed or executive function on the Stroop test.
The interaction between HIV serostatus and recent illicit drug use on verbal learning and memory suggests a potential synergistic neurotoxicity that may affect the neural circuitry underlying performance on these tasks.
Natural history studies suggest increased risk for kidney function decline with HIV infection, but few studies have made comparisons with HIV-uninfected women. We examined whether HIV infection treated with highly active antiretroviral therapy (HAART) remains associated with faster kidney function decline in the Women's Interagency HIV Study. HIV-infected women initiating HAART with (n=105) or without (n=373) tenofovir (TDF) were matched to HIV-uninfected women on calendar and length of follow-up, age, systolic blood pressure, hepatitis C antibody serostatus, and diabetes history. Linear mixed models were used to evaluate differences in annual estimated glomerular filtration rate (eGFR). Person-visits were 4,741 and 11,512 for the TDF-treated and non-TDF-treated analyses, respectively. Mean baseline eGFRs were higher among women initiated on TDF-containing HAART and lower among those on TDF-sparing HAART compared to their respective HIV-uninfected matches (p<0.05 for both). HIV-infected women had annual rates of eGFR changes similar to HIV-uninfected matches (p-interaction >0.05 for both). Adjusting for baseline eGFR, mean eGFRs at 1 and 3 years of follow-up among women initiated on TDF-containing HAART were lower than their uninfected matches (−4.98 and −4.26 ml/min/1.73 m2, respectively; p<0.05 for both). Mean eGFR of women initiated on TDF-sparing HAART was lower versus uninfected matches at 5 years (–2.19 ml/min/1.73 m2, p=0.03). HAART-treated HIV-infected women had lower mean eGFRs at follow-up but experienced rates of annual eGFR decline similar to HIV-uninfected women. Tenofovir use in HIV-infected women with normal kidney function did not accelerate long-term kidney function decline relative to HIV-uninfected women.
Human leukocyte antigen (HLA) genotype has been associated with probability of spontaneous clearance of hepatitis C virus (HCV). However, no prior studies have examined whether this relationship may be further characterized by grouping HLA alleles according to their supertypes, defined by their binding capacities. There is debate regarding the most appropriate method to define supertypes. Therefore, previously reported HLA supertypes (46 class I and 25 class II) were assessed for their relation with HCV clearance in a population of 758 HCV-seropositive women. Two HLA class II supertypes were significant in multivariable models that included: (i) supertypes with significant or borderline associations with HCV clearance after adjustment for multiple tests, and (ii) individual HLA alleles not part of these supertypes, but associated with HCV clearance in our prior study in this population. Specifically, supertype DRB3 (prevalence ratio (PR)=0.4; p=0.004) was associated with HCV persistence while DR8 (PR=1.8; p=0.01) was associated with HCV clearance. Two individual alleles (B*57:01 and C*01:02) associated with HCV clearance in our prior study became non-significant in analysis that included supertypes while B*57:03 (PR=1.9; p=0.008) and DRB1*07:01 (PR=1.7; p=0.005) retained significance. These data provide epidemiologic support for the significance of HLA supertypes in relation to HCV clearance.
hepatitis C virus; HLA; human leukocyte antigen; supertype
HIV-infected persons have substantially higher risk of kidney failure than persons without HIV, but serum creatinine levels are insensitive for detecting declining kidney function. We hypothesized that urine markers of kidney injury would be associated with declining kidney function among HIV-infected women.
In the Women's Interagency HIV Study (WIHS), we measured concentrations of the albumin-to-creatinine ratio (ACR), interleukin-18 (IL-18), kidney injury molecule-1 (KIM-1), and neutrophil gelatinase-associated lipocalin (NGAL) in stored urine from 908 HIV-infected and 289 uninfected participants. Primary analyses used cystatin C–based estimated glomerular filtration rate (CKD-EPI eGFRcys) as the outcome, measured at baseline and two follow-up visits over eight years; secondary analyses used creatinine (CKD-EPI eGFRcr). Each urine biomarker was categorized into tertiles, and kidney decline was modeled with both continuous and dichotomized outcomes.
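Tertile categorization of a biomarker, as described above, can be sketched in a few lines. This is for illustration only: the cut points here come from Python's `statistics.quantiles`, which may differ slightly from the study's method, and the concentrations are hypothetical.

```python
from statistics import quantiles

def tertile_labels(values):
    """Assign each biomarker value to tertile 1-3 using the sample's
    own tertile cut points (illustrative method, not the study's)."""
    lo, hi = quantiles(values, n=3)  # the two tertile cut points
    return [1 if v <= lo else 2 if v <= hi else 3 for v in values]

# Hypothetical urine biomarker concentrations
print(tertile_labels([12, 40, 7, 95, 33, 60, 21, 80, 50]))  # [1, 2, 1, 3, 2, 3, 1, 3, 2]
```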
Compared with the lowest tertiles, the highest tertiles of ACR (−0.15 ml/min/1.73 m2, p<0.0001), IL-18 (−0.09 ml/min/1.73 m2, p<0.0001), and KIM-1 (−0.06 ml/min/1.73 m2, p<0.001) were independently associated with faster eGFRcys decline after multivariate adjustment including all three biomarkers among HIV-infected women. Among these biomarkers, only IL-18 was associated with each dichotomized eGFRcys outcome: ≥3% (relative risk 1.40; 95% CI 1.04-1.89); ≥5% (1.88; 1.30-2.71); and ≥10% (2.16; 1.20-3.88) for the highest versus lowest tertile. In alternative models using eGFRcr, the highest tertile of KIM-1 had independent associations with 5% (1.71; 1.25-2.33) and 10% (1.78; 1.07-2.96) decline, and the highest IL-18 tertile with 10% decline (1.97; 1.00-3.87).
Among HIV-infected women in the WIHS cohort, novel urine markers of kidney injury detect risk for subsequent declines in kidney function.
HIV; KIM-1; NGAL; IL-18; albumin-to-creatinine ratio; cystatin C; kidney injury
Contraception can reduce the dual burden of high fertility and high HIV prevalence in sub-Saharan Africa, but significant barriers remain regarding access and use. We describe factors associated with nonuse of contraception and with use of specific contraceptive methods in HIV-positive and HIV-negative Rwandan women. Data from 395 HIV-positive and 76 HIV-negative women who desired no pregnancy in the previous 6 months were analyzed using univariate and multivariate logistic regression models to identify clinical and demographic characteristics that predict contraceptive use. Differences in contraceptive methods used depended on marital/partner status, the partner's knowledge of a woman's HIV status, and age. Overall, condoms, abstinence, and hormonal methods were the most used, though differences existed by HIV status. Fewer than 10% of women, both HIV+ and HIV−, used no contraception. Important differences exist between HIV-positive and HIV-negative women with regard to contraceptive method use that should be addressed by interventions seeking to improve contraceptive prevalence.
To estimate the association between vitamin D deficiency and bacterial vaginosis (BV) among nonpregnant HIV-infected and uninfected women.
In a substudy of the Women's Interagency HIV Study including women from Chicago and New York, the association of BV with vitamin D deficiency, demographics, and disease characteristics was tested using generalized estimating equations. Deficiency was defined as 25(OH) vitamin D <20 ng/mL and insufficiency as ≥20 and ≤30 ng/mL. BV was defined by the Amsel criteria.
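The vitamin D categories can be expressed as a small classifier. A sketch only: the treatment of a value of exactly 20 ng/mL as insufficient is an assumption, since the stated cut points leave that boundary implicit.

```python
def vitamin_d_status(ng_ml):
    """Categorize serum 25(OH) vitamin D (ng/mL) using the cut points above.
    Exactly 20 ng/mL is treated as insufficient (an assumption)."""
    if ng_ml < 20:
        return "deficient"
    if ng_ml <= 30:
        return "insufficient"
    return "sufficient"

print(vitamin_d_status(15), vitamin_d_status(25), vitamin_d_status(35))
# deficient insufficient sufficient
```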
Among 602 observations of nonpregnant women (480 HIV-infected and 122 uninfected), BV was found in 19%. Vitamin D deficiency was found in 59.4%, and insufficiency was found in 24.4%. In multivariable analysis, black race was the most significant predictor of BV (adjusted odds ratio [AOR] 5.90; 95% confidence interval [CI] 2.52-13.8). Vitamin D deficiency was independently associated with BV among HIV-infected women (AOR 3.12, 95% CI 1.16-8.38) but not among HIV-uninfected women. There was a negative linear correlation between vitamin D concentration and prevalence of BV in HIV-infected women (r=−0.15, p=0.001).
Vitamin D deficiency was very common in this cohort and significantly associated with BV among HIV-infected women. These preliminary findings suggest that further epidemiologic and mechanistic exploration of the relationship between vitamin D and BV in HIV-infected women is warranted.
HIV causes inflammation that can be at least partially corrected by HAART. To determine the qualitative and quantitative nature of cytokine perturbation, we compared cytokine patterns in three HIV clinical groups including HAART responders (HAART), untreated HIV non-controllers (NC), and HIV-uninfected (NEG).
Multiplex assays were used to measure 32 cytokines in a cross-sectional study of participants in the Women's Interagency HIV Study (WIHS). Participants from 3 groups were included: HAART (n=17), NC (n=14), and HIV NEG (n=17).
Several cytokines and chemokines showed significant differences between NC and NEG participants, including elevated IP-10 and TNF-α and decreased IL-12(p40), IL-15, and FGF-2 in NC participants. Biomarker levels among HAART women more closely resembled the NEG, with the exception of TNF-α and FGF-2. Secondary analyses of the combined HAART and NC groups revealed that IP-10 showed a strong, positive correlation with viral load and negative correlation with CD4+ T cell counts. The growth factors VEGF, EGF, and FGF-2 all showed a positive correlation with increased CD4+ T cell counts.
Untreated, progressive HIV infection was associated with decreased serum levels of cytokines important in T cell homeostasis (IL-15) and T cell phenotype determination (IL-12), and increased levels of innate inflammatory mediators such as IP-10 and TNF-α. HAART was associated with cytokine profiles that more closely resembled those of HIV uninfected women. The distinctive pattern of cytokine levels in the 3 study groups may provide insights into HIV pathogenesis, and responses to therapy.
HIV; CD4+ T cells; cytokines; chemokines; HAART
Depression and posttraumatic stress disorder (PTSD) are common in developing and postconflict countries. The purpose of this study is to examine longitudinal changes in PTSD in HIV-infected and uninfected Rwandan women who experienced the 1994 genocide.
Five hundred thirty-five HIV-positive and 163 HIV-negative Rwandan women in an observational cohort study were followed for 18 months. Data on PTSD symptoms were collected longitudinally by the Harvard Trauma Questionnaire (HTQ) and analyzed in relationship to demographics, HIV status, antiretroviral treatment (ART), and depression. PTSD was defined as a score on the HTQ of ≥2.
There was a continuing reduction in HTQ scores at each follow-up visit. The prevalence of PTSD symptoms changed significantly, with 61% of the cohort having PTSD at baseline vs. 24% after 18 months. Women with higher HTQ scores were the most likely to have improvement in PTSD symptoms (p<0.0001). A higher rate of baseline depressive symptoms (p<0.0001) was associated with less improvement in PTSD symptoms. HIV infection and ART were not found to be consistently related to PTSD improvement.
HIV care settings can become an important venue for the identification and treatment of psychiatric problems affecting women with HIV in postconflict and developing countries. Providing opportunities for women with PTSD symptoms to share their history of trauma to trained counselors and addressing depression, poverty, and ongoing violence may contribute to reducing symptoms.
Background. Human leukocyte antigen (HLA) class I and II genotype is associated with clearance of hepatitis C virus (HCV) infection, but little is known regarding its relation with HCV viral load or risk of liver disease in patients with persistent HCV infection.
Methods. High-resolution HLA class I and II genotyping was conducted in a prospective cohort of 519 human immunodeficiency virus (HIV)–seropositive and 100 HIV-seronegative women with persistent HCV infection. The end points were baseline HCV viral load and 2 noninvasive indexes of liver disease, fibrosis-4 (FIB-4), and the aspartate aminotransferase to platelet ratio index (APRI), measured at baseline and prospectively.
Results. DQB1*0301 was associated with low baseline HCV load (β = −.4; 95% confidence interval [CI], −.6 to −.3; P < .00001), as well as with low odds of FIB-4–defined (odds ratio [OR], .5; 95% CI, .2–.9; P = .02) and APRI-defined liver fibrosis (OR, .5; 95% CI, .3–1.0; P = .06) at baseline and/or during follow-up. Most additional associations with HCV viral load also involved HLA class II alleles. Additional associations with FIB-4 and APRI primarily involved class I alleles, for example, the relation of B*1503 with APRI-defined fibrosis had an OR of 2.0 (95% CI, 1.0–3.7; P = .04).
Conclusions. HLA genotype may influence HCV viral load and risk of liver disease, including DQB1*0301, which was associated with HCV clearance in prior studies.
In a longitudinal study of outcomes on atazanavir-based therapy in a large cohort of HIV-infected women, hair levels of atazanavir were the strongest independent predictor of virologic suppression. Hair antiretroviral concentrations may serve as a useful tool in HIV care.
Background. Adequate exposure to antiretrovirals is important to maintain durable responses, but methods to assess exposure (eg, querying adherence and single plasma drug level measurements) are limited. Hair concentrations of antiretrovirals can integrate adherence and pharmacokinetics into a single assay.
Methods. Small hair samples were collected from participants in the Women's Interagency HIV Study (WIHS), a large cohort of human immunodeficiency virus (HIV)-infected (and at-risk noninfected) women. From 2003 through 2008, we analyzed atazanavir hair concentrations longitudinally for women reporting receipt of atazanavir-based therapy. Multivariate random effects logistic regression models for repeated measures were used to estimate the association of hair drug levels with the primary outcome of virologic suppression (HIV RNA level, <80 copies/mL).
Results. In total, 424 WIHS participants (51% African-American, 31% Hispanic) contributed 1443 person-visits to the analysis. After adjusting for age, race, treatment experience, pretreatment viral load, CD4 count and AIDS status, and self-reported adherence, hair levels were the strongest predictor of suppression. Categorized hair antiretroviral levels revealed a monotonic relationship to suppression; women with atazanavir levels in the highest quintile had an odds ratio (OR) of 59.8 (95% confidence interval [CI], 29.0–123.2) for virologic suppression. Hair atazanavir concentrations were even more strongly associated with resuppression of viral loads in subgroups in which there had been previous lapses in adherence (OR, 210.2 [95% CI, 46.0–961.1]), low hair levels (OR, 132.8 [95% CI, 26.5–666.0]), or detectable viremia (OR, 400.7 [95% CI, 52.3–3069.7]).
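The unadjusted form of such an odds ratio, with its 95% confidence interval, follows from a 2x2 table via the standard error of the log odds ratio. A sketch with hypothetical counts, not the study's adjusted, repeated-measures estimates:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a/b = outcome present/absent in the exposed group,
    c/d = outcome present/absent in the unexposed group."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts for suppression by high vs. low hair drug level
or_, lo, hi = odds_ratio_ci(50, 10, 30, 40)
print(round(or_, 2), round(lo, 1), round(hi, 1))  # 6.67 2.9 15.3
```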
Conclusions. Antiretroviral hair levels surpassed any other predictor of virologic outcomes to HIV treatment in a large cohort. Low antiretroviral exposure in hair may trigger interventions prior to failure or herald virologic failure in settings where measurement of viral loads is unavailable. Monitoring hair antiretroviral concentrations may be useful for prolonging regimen durability.
While the human leukocyte antigen (HLA) genotype has been associated with the rate of HIV disease progression in untreated patients, little is known regarding these relationships in patients using highly active antiretroviral therapy (HAART). The limited data reported to date identified few HLA-HIV disease associations in patients using HAART and even occasional associations that were opposite of those found in untreated patients. We conducted high-resolution HLA class I and II genotyping in a random sample (n = 860) of HIV-seropositive women enrolled in a long-term cohort initiated in 1994. HLA-HIV disease associations before and after initiation of HAART were examined using multivariate analyses. In untreated HIV-seropositive patients, we observed many of the predicted associations, consistent with prior studies. For example, HLA-B*57 (β = −0.7; 95% confidence interval [CI] = −0.9 to −0.5; P = 5 × 10⁻¹¹) and Bw4 (β = −0.2; 95% CI = −0.4 to −0.1; P = 0.009) were inversely associated with baseline HIV viral load, and B*57 was associated with a low risk of rapid CD4+ decline (odds ratio [OR] = 0.2; 95% CI = 0.1 to 0.6; P = 0.002). Conversely, in treated patients, the odds of a virological response to HAART were lower for B*57:01 (OR = 0.2; 95% CI = 0.0 to 0.9; P = 0.03), and Bw4 (OR = 0.4; 95% CI = 0.1 to 1.0; P = 0.04) was associated with low odds of an immunological response. The associations of HLA genotype with HIV disease are different and sometimes even opposite in treated and untreated patients.
We previously reported an increased risk of all-cause and AIDS mortality among HIV-infected women with albuminuria (proteinuria or microalbuminuria) enrolled in the Women’s Interagency HIV Study (WIHS) prior to the introduction of highly active antiretroviral therapy (HAART).
The current analysis includes 1,073 WIHS participants who subsequently initiated HAART. Urinalysis for proteinuria and semi-quantitative testing for microalbuminuria from two consecutive study visits prior to HAART initiation were categorized as follows: confirmed proteinuria (both specimens positive for protein), confirmed microalbuminuria (both specimens positive, with at least one positive for microalbuminuria), unconfirmed albuminuria (one specimen positive for proteinuria or microalbuminuria), or negative (both specimens negative). Time from HAART initiation to death was modeled using proportional hazards analysis.
Compared to the reference group of women with two negative specimens, the hazard ratio (HR) for all-cause mortality was significantly elevated for women with confirmed microalbuminuria (HR 1.9; 95% CI 1.2–2.9). Confirmed microalbuminuria was also independently associated with AIDS death (HR 2.3; 95% CI 1.3–4.3), while women with confirmed proteinuria were at increased risk for non-AIDS death (HR 2.4; 95% CI 1.2–4.6).
In women initiating HAART, pre-existing microalbuminuria independently predicted increased AIDS mortality, while pre-existing proteinuria predicted increased risk of non-AIDS death. Urine testing may identify HIV-infected individuals at increased risk for mortality even after the initiation of HAART. Future studies should consider whether these widely available tests can identify individuals who would benefit from more aggressive management of HIV infection and comorbid conditions associated with mortality in this population.
HIV; microalbuminuria; proteinuria; mortality; non-AIDS death
In the early highly active antiretroviral therapy (HAART) era, kidney dysfunction was strongly associated with death among HIV-infected individuals. We re-examined this association in the later HAART period to determine whether chronic kidney disease (CKD) remains a predictor of death after HAART-initiation.
To evaluate the effect of kidney function at the time of HAART initiation on time to all-cause mortality, we evaluated 1415 HIV-infected women initiating HAART in the Women’s Interagency HIV Study (WIHS). Multivariable proportional hazards models with survival times calculated from HAART initiation to death were constructed; participants were censored at the time of the last available visit or December 31, 2006.
CKD (eGFR <60 mL/min/1.73 m2) at HAART initiation was associated with higher mortality risk after adjustment for age, race, hepatitis C serostatus, AIDS history, and CD4+ cell count (hazard ratio [HR]=2.23, 95% confidence interval [CI]: 1.45–3.43). Additional adjustment for hypertension and diabetes history attenuated this association (HR=1.89, CI: 0.94–3.80). Lower kidney function at HAART initiation was weakly associated with increased mortality risk in women with prior AIDS (HR=1.09, CI: 1.00–1.19, per 20% decrease in eGFR).
Kidney function at HAART initiation remains an independent predictor of death in HIV-infected individuals, especially in those with a history of AIDS. Our study emphasizes the necessity of monitoring kidney function in this population. Additional studies are needed to determine mechanisms underlying the increased mortality risk associated with CKD in HIV-infected persons.
kidney disease; mortality; HIV; WIHS; antiretroviral therapy
The clinical importance of the association of HIV infection and antiretroviral therapy (ART) with low bone mineral density (BMD) in premenopausal women is uncertain because BMD stabilizes on established ART and fracture data are limited.
We measured time to first new fracture at any site with median follow-up of 5.4 years in 2391 (1728 HIV-infected, 663 HIV-uninfected) participants in the Women’s Interagency HIV Study (WIHS). Self-report of fracture was recorded at semiannual visits. Proportional hazard models assessed predictors of incident fracture.
At baseline, HIV-infected women were older (40 ± 9 vs. 36 ± 10 years, P <0.0001), more likely to report postmenopausal status and be hepatitis C virus-infected, and weighed less than HIV-uninfected women. Among HIV-infected women, mean CD4+ cell count was 482 cells/μl; 66% were taking ART. Unadjusted incidence of fracture did not differ between HIV-infected and uninfected women (1.8 vs. 1.4/100 person-years, respectively, P = 0.18). In multivariate models, white (vs. African-American) race, hepatitis C virus infection, and higher serum creatinine, but not HIV serostatus, were statistically significant predictors of incident fracture. Among HIV-infected women, older age, white race, current cigarette use, and history of AIDS-defining illness were associated with incidence of new fracture.
Among predominantly premenopausal women, there was little difference in fracture incidence rates by HIV status; rather, traditional risk factors were important predictors of incident fracture. Further research is necessary to characterize fracture risk in HIV-infected women during and after the menopausal transition.
fracture; fragility fracture; HIV-infected women; premenopausal
Prevalence of microalbuminuria is increased in patients with HIV. Microalbuminuria is associated with increased mortality in other populations, including diabetics, for whom microalbuminuria testing is standard of care. We investigated whether microalbuminuria is associated with mortality in HIV-infected women not receiving antiretroviral therapy.
Urinalysis for proteinuria and semi-quantitative testing for microalbuminuria were performed in specimens from two consecutive visits in 1,547 HIV-infected women enrolled in the Women’s Interagency HIV Study in 1994–1995. Time to death was modeled using proportional hazards analysis.
Compared to women without albuminuria, the hazard ratio (HR) for all-cause mortality was increased in women with one (HR 3.4; 95% CI 2.2–5.2) or two specimens positive for either proteinuria or microalbuminuria (HR 3.9; 95% CI 2.1–7.0). The highest risk was observed in women with both specimens positive for proteinuria (HR 5.8; 95% CI 3.4–9.8). The association between albuminuria and all-cause mortality risk remained significant after adjustment for demographics, HIV disease severity, and related comorbidities. Similar results were obtained for AIDS death.
We identified a graded relationship between albuminuria and the risk of all-cause and AIDS mortality.
HIV; microalbuminuria; proteinuria; mortality
To examine changes in the causes of death and mortality in women with human immunodeficiency virus (HIV) infection in the era of combination antiretroviral therapy.
Among women with, or at risk of, HIV infection, who were enrolled in a national study from 1994 to 1995, we used an algorithm that classified cause of death as due to acquired immunodeficiency syndrome (AIDS) or non-AIDS causes based on data from death certificates and the CD4 count. Poisson regression models were used to estimate death rates and to determine the risk factors for AIDS and non-AIDS deaths.
Of 2059 HIV-infected women and 569 who were at risk of HIV infection, 468 (18%) had died by April 2000 (451 HIV-infected and 17 not infected). Causes of death were available for 428 participants (414 HIV-infected and 14 not infected). Among HIV-infected women, deaths were classified as AIDS (n = 294), non-AIDS (n = 91), or indeterminate (n = 29). The non-AIDS causes included liver failure (n = 19), drug overdose (n = 16), non-AIDS malignancies (n = 12), cardiac disease (n = 10), and murder, suicide, or accident (n = 10). All-cause mortality declined an average of 26% per year (P = 0.03) and AIDS-related mortality declined by 39% per year (P = 0.01), whereas non-AIDS-related mortality remained stable (10% average annual decrease, P = 0.73). Factors that were independently associated with non-AIDS-related mortality included depression, history of injection drug use with hepatitis C infection, cigarette smoking, and age.
A substantial minority (20%) of deaths among women with HIV was due to causes other than AIDS. Our data suggest that to decrease mortality further among HIV-infected women, attention must be paid to treatable conditions, such as hepatitis C, depression, and drug and tobacco use.
We assessed the prevalence and predictors of latent Toxoplasma infection in a large group of human immunodeficiency virus (HIV)–infected and HIV-uninfected at-risk US women. The prevalence of latent Toxoplasma infection was 15% (380 of 2525 persons) and did not differ by HIV infection status. HIV-infected women aged ≥50 years and those born outside of the United States were more likely to have latent Toxoplasma infection, with prevalences of 32% and 41%, respectively.
Opiate use is common in HIV- and hepatitis C virus (HCV)-infected individuals; however, its contribution to the risk of diabetes mellitus is not well understood.
Prospective study of 1,713 HIV-infected and 652 uninfected participants from the Women’s Interagency HIV Study between October 2000 and March 2006. Diabetes was defined as fasting glucose ≥126 mg/dl, self-reported use of diabetes medication, or a confirmed diabetes diagnosis. Opiate use was determined using an interviewer-administered questionnaire. Detectable plasma HCV RNA confirmed HCV infection.
Current opiate users had a higher prevalence of diabetes (15%) than non-users (10%, p=.03), as well as a higher risk of incident diabetes (adjusted relative hazard [RHadj] 1.58, 95% CI 1.01, 2.46), after controlling for HCV infection, HIV/antiretroviral therapy status and diabetes risk factors including age, race/ethnicity, family history of diabetes and body mass index. HCV infection was also an independent risk factor for diabetes (RHadj 1.61, 95% CI 1.02, 2.52). HCV-infected women reporting current opiate use had the highest diabetes incidence (4.83 cases/100 person-years).
Among women with or at risk for HIV, opiate use is associated with increased diabetes risk independently of HCV infection. Diabetes screening should be part of care for opiate users and for those infected with HCV.
opiate use; diabetes mellitus; fasting glucose; Hepatitis C virus; HIV; women