HIV-infected persons have substantially higher risk of kidney failure than persons without HIV, but serum creatinine levels are insensitive for detecting declining kidney function. We hypothesized that urine markers of kidney injury would be associated with declining kidney function among HIV-infected women.
In the Women's Interagency HIV Study (WIHS), we measured concentrations of albumin-to-creatinine ratio (ACR), interleukin-18 (IL-18), kidney injury molecule-1 (KIM-1), and neutrophil gelatinase-associated lipocalin (NGAL) from stored urine among 908 HIV-infected and 289 uninfected participants. Primary analyses used cystatin C-based estimated glomerular filtration rate (CKD-EPI eGFRcys) as the outcome, measured at baseline and two follow-up visits over eight years; secondary analyses used creatinine-based eGFR (CKD-EPI eGFRcr). Each urine biomarker was categorized into tertiles, and kidney decline was modeled with both continuous and dichotomized outcomes.
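A minimal sketch of this analytic setup, using simulated data and hypothetical variable names rather than WIHS data or the authors' code, illustrates the tertile categorization and the dichotomized decline outcomes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one urine biomarker plus baseline and follow-up eGFRcys.
# In this simulation the biomarker is independent of decline, so the crude
# risk ratios should hover near 1; the point is the mechanics, not the result.
biomarker = rng.lognormal(mean=2.0, sigma=0.8, size=500)
egfr_baseline = rng.normal(95, 15, size=500)
egfr_followup = egfr_baseline - rng.normal(5, 8, size=500)

# Categorize the biomarker into tertiles (0 = lowest, 2 = highest).
cutpoints = np.quantile(biomarker, [1 / 3, 2 / 3])
tertile = np.digitize(biomarker, cutpoints)

# Dichotomize kidney function decline at the thresholds used in the abstract.
pct_decline = 100 * (egfr_baseline - egfr_followup) / egfr_baseline
for threshold in (3, 5, 10):
    declined = pct_decline >= threshold
    risk_high = declined[tertile == 2].mean()  # crude risk, highest tertile
    risk_low = declined[tertile == 0].mean()   # crude risk, lowest tertile
    print(f">={threshold}% decline: crude RR = {risk_high / risk_low:.2f}")
```

The published analysis adjusted these comparisons for covariates and for the other biomarkers; the crude ratios above are only the unadjusted starting point.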
Compared with the lowest tertiles, the highest tertiles of ACR (−0.15 ml/min/1.73 m2, p<0.0001), IL-18 (−0.09 ml/min/1.73 m2, p<0.0001), and KIM-1 (−0.06 ml/min/1.73 m2, p<0.001) were independently associated with faster eGFRcys decline after multivariate adjustment including all three biomarkers among HIV-infected women. Among these biomarkers, only IL-18 was associated with each dichotomized eGFRcys outcome for the highest versus lowest tertile: ≥3% (relative risk 1.40; 95% CI 1.04–1.89); ≥5% (1.88; 1.30–2.71); and ≥10% (2.16; 1.20–3.88). In alternative models using eGFRcr, the high tertile of KIM-1 had independent associations with ≥5% (1.71; 1.25–2.33) and ≥10% (1.78; 1.07–2.96) decline, and the high IL-18 tertile with ≥10% decline (1.97; 1.00–3.87).
Among HIV-infected women in the WIHS cohort, novel urine markers of kidney injury detect risk for subsequent declines in kidney function.
HIV; KIM-1; NGAL; IL-18; albumin-to-creatinine ratio; cystatin C; kidney injury
Chronic kidney disease and HIV infection both independently increase the risk of anemia. It is not known if individuals with both HIV infection and kidney dysfunction are at greater than expected risk of anemia resulting from the combined effect of these factors. Men from the Multicenter AIDS Cohort Study with AIDS-free time after 1996 were included in the analysis if they had an initial hemoglobin value greater than 13 g/dl and available serum creatinine measurements for the estimation of glomerular filtration rate. Hemoglobin data were fit parametrically using a linear mixed-effects model, and effects of medication use on hemoglobin levels were removed using censoring methods. The effect of both HIV infection and glomerular filtration rate less than 60 ml/min/1.73 m2 on the mean hemoglobin value was assessed. The risk of having anemia (hemoglobin level falling below 13 g/dl) was estimated. There were 862 HIV-infected and 1,214 HIV-uninfected men who contributed to the analysis. Adjusting for age, hemoglobin values across all 17,341 person-visits differed by −0.22 g/dl (95% CI: −0.42, −0.03) in HIV-infected AIDS-free men with impaired kidney function compared with men who had either HIV infection or impaired kidney function, but not both. HIV-infected AIDS-free men with impaired kidney function had a 1.2% higher risk of anemia than HIV-uninfected men with normal kidney function. Comorbid conditions and medication use did not explain this increase in risk. HIV infection and impaired kidney function have a combined impact on lowering hemoglobin levels, resulting in a higher risk of anemia.
Serum ferritin and transferrin saturation (TSAT) are used to assess iron status in children with chronic kidney disease (CKD), but their sensitivity in identifying those at risk of lower hemoglobin (HGB) values is unclear.
We assessed the association of iron status markers (ferritin, TSAT, and serum iron) with age- and gender-related HGB percentile in mild-to-moderate CKD in 304 children in the Chronic Kidney Disease in Children (CKiD) Study. Standardized HGB percentile values were examined by KDOQI-recommended ferritin (≥100 ng/ml) and TSAT (≥20%) thresholds. Regression tree methods were used to identify the iron status markers and clinical characteristics most associated with lower HGB percentiles.
The cohort was 62% male, 23% African American, and 12% Hispanic, with a median age of 12 years and median HGB of 12.9 g/dl. By the KDOQI definitions, 34% had low TSAT and 93% had low ferritin. The distribution of HGB percentile values was lower in those with ferritin ≥100 ng/ml, while TSAT ≥20% was associated with only a modest increase in HGB percentile. In regression tree analysis, lower glomerular filtration rate (GFR), serum iron <50 μg/dl, and ferritin ≥100 ng/ml were most strongly associated with lower HGB percentile.
The level of GFR was significantly associated with HGB. Higher serum ferritin was associated with lower HGB in this cohort. Low serum iron in the context of normal/increased ferritin and low HGB may be a useful indicator of iron-restricted erythropoiesis.
Chronic kidney disease; Hemoglobin; Iron deficiency; Anemia
More than two-thirds of the world's HIV-positive individuals live in sub-Saharan Africa, where genetic susceptibility to kidney disease is high and resources for kidney disease screening and antiretroviral therapy (ART) toxicity monitoring are limited. Equations to estimate glomerular filtration rate (GFR) from serum creatinine were derived in Western populations and may be less accurate in this population.
We compared results from published GFR estimating equations with a direct measure of GFR by iohexol clearance in 99 HIV-infected, ART-naïve Kenyan adults. Iohexol concentration was measured from dried blood spots on filter paper. The bias ratio (mean of the ratio of estimated to measured GFR) and accuracy (percentage of estimates within 30% of the measured GFR) were calculated.
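The two performance metrics defined above are straightforward to compute; a minimal sketch with hypothetical paired values (not study data):

```python
import numpy as np

def bias_ratio(estimated, measured):
    """Mean of the ratio of estimated to measured GFR (1.0 = unbiased)."""
    return np.mean(np.asarray(estimated) / np.asarray(measured))

def p30_accuracy(estimated, measured):
    """Percentage of estimates falling within 30% of the measured GFR."""
    ratio = np.asarray(estimated) / np.asarray(measured)
    return 100 * np.mean((ratio >= 0.7) & (ratio <= 1.3))

# Hypothetical paired values in mL/min/1.73 m2.
measured = np.array([115.0, 98.0, 130.0, 72.0, 105.0])
estimated = np.array([108.0, 110.0, 118.0, 60.0, 99.0])
print(bias_ratio(estimated, measured))    # ~0.95 for these values
print(p30_accuracy(estimated, measured))  # 100.0 for these values
```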
The median age was 35 years, and 60% were women. The majority had asymptomatic HIV, with median CD4+ cell count of 355 cells/mm3. Median measured GFR was 115 mL/min/1.73 m2. Overall accuracy was highest for the Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) equation. Consistent with a prior report, bias and accuracy were improved by eliminating the coefficient for black race (85% of estimates within 30% of measured GFR). Accuracy of all equations was poor in participants with GFR 60–90 mL/min/1.73 m2 (<65% of estimates within 30% of measured GFR), although this subgroup was too small to reach definitive conclusions.
Overall accuracy was highest for the CKD-EPI equation. Eliminating the coefficient for race further improved performance. Future studies are needed to determine the most accurate GFR estimate for use in individuals with GFR <90 mL/min/1.73 m2, in whom accurate estimation of kidney function is important to guide drug dosing. Direct measurement of GFR by iohexol clearance using a filter paper based assay is feasible for research purposes in resource-limited settings, and could be used to develop more accurate GFR estimates in African populations.
In adults with chronic kidney disease (CKD), cigarette smoking is associated with an increased risk for CKD progression and transplant failure. In children, secondhand smoke (SHS) exposure has been associated with elevated blood pressure. There are no studies on the prevalence and effects of SHS exposure in children with CKD.
Subjects were enrolled in the Chronic Kidney Disease in Children (CKiD) Study, an observational cohort of 366 children aged 1 to 16 years with CKD. Secondhand smoke exposure was ascertained via questionnaire and was also determined from urine cotinine (Ucot) measurements (1 ng/mL ≤ Ucot < 75 ng/mL). The cross-sectional association of SHS exposure with proteinuria was assessed.
Using Ucot, 22% of subjects were exposed to SHS. SHS exposure was significantly associated with lower maternal education and African American race, and with a greater prevalence of nephrotic-range proteinuria and left ventricular hypertrophy. In a multivariate model (including sex, age, race, maternal education, income level, private insurance status, abnormal birth history, and CKD diagnosis), the prevalence odds of nephrotic-range proteinuria were 2.64 times higher (95% confidence interval 1.08, 6.42) in children exposed to SHS compared to those unexposed.
In our cohort of children with CKD, SHS exposure was common (22%) and independently associated with nephrotic-range proteinuria. Exposure to SHS may be an important factor to consider in CKD progression.
Proteinuria; Tobacco use; Chronic kidney disease progression; Secondhand smoke exposure; Urine cotinine; Pediatric chronic kidney disease
Background: The role of environmental exposure to lead as a risk factor for chronic kidney disease (CKD) and its progression remains controversial, and most studies have been limited by a lack of direct glomerular filtration rate (GFR) measurement.
Objective: We evaluated the association between lead exposure and GFR in children with CKD.
Methods: In this cross-sectional study, we examined the association between blood lead levels (BLLs) and GFR measured by the plasma disappearance of iohexol among 391 participants in the Chronic Kidney Disease in Children (CKiD) prospective cohort study.
Results: Median BLL and GFR were 1.2 µg/dL and 44.4 mL/min per 1.73 m2, respectively. The average percent change in GFR for each 1-µg/dL increase in BLL was –2.1 (95% CI: –6.0, 1.8). In analyses stratified by CKD diagnosis, the association between BLL and GFR was stronger among children with glomerular disease underlying CKD; in this group, each 1-µg/dL increase in BLL was associated with a –12.1 (95% CI: –22.2, –1.9) percent change in GFR. In analyses stratified by anemia status, each 1-µg/dL increase in BLL among those with and without anemia was associated with a –0.3 (95% CI: –7.2, 6.6) and –4.6 (95% CI: –8.9, –0.3) percent change in GFR, respectively.
Conclusions: There was no significant association between BLL and directly measured GFR in this relatively large cohort of children with CKD, although associations were observed in some subgroups. Longitudinal analyses are needed to examine the temporal relationship between lead and GFR decline, and to further examine the impact of underlying cause of CKD and anemia/hemoglobin status among patients with CKD.
children; chronic kidney disease; kidney; lead; nephrotoxicity; pediatric
Data from 1790 HIV-infected and uninfected men in the Multicenter AIDS Cohort Study (MACS) were analyzed to evaluate relationships between physical function, incident diabetes mellitus (DM) and insulin resistance among HIV-infected and -uninfected men. DM was defined in two ways, using less stringent and more stringent criteria. The 10-item Physical Functioning Scale from the Short Form-36 Health Survey measured baseline physical function. Cumulative DM incidence was highest among HIV-uninfected and HIV-infected men with low physical function. Physical function was a risk factor for DM in HIV-uninfected men and remained so after controlling for BMI, DM family history and race. Among HIV-infected men, physical function was an independent risk factor for DM using the less stringent diabetes definition. This study supports our previous findings that low physical function is an important risk factor for DM in the MACS cohort.
AIDS; diabetes mellitus; HIV; insulin resistance; physical function
To compare the health-related quality of life (HRQOL) of children with chronic kidney disease (CKD) to that of healthy children; to evaluate the association between CKD severity and HRQOL; and to identify demographic, socioeconomic, and health-status variables associated with impairment in HRQOL in children with mild to moderate CKD.
This is a cross-sectional assessment of HRQOL in children aged 2–16 years with mild to moderate CKD using the Varni PedsQL™. Overall HRQOL and PedsQL domain means for parents and youth were compared to previously published norms using independent-sample t-tests. Study participants were categorized according to kidney disease stage (measured by iohexol-based glomerular filtration rate, iGFR) and group differences in HRQOL were evaluated using ANOVA and Cuzick trend tests. The association between hypothesized predictors of HRQOL and PedsQL scores was evaluated with linear and logistic regression analyses.
The study sample comprised 402 participants (mean age 11 years; 60% male; 70% Caucasian; 40% anemic; median iGFR 42.5 ml/min/1.73 m2; median CKD duration 7 years). Youth with CKD had significantly lower physical, school, emotional, and social domain scores than healthy youth (p<.001). iGFR was not associated with HRQOL. Longer disease duration and older age were associated with higher PedsQL scores in the domains of physical, emotional, and social functioning (p<.05). Older age was associated with lower school functioning domain scores (p<.05). Maternal education ≥16 years was associated with higher PedsQL scores in the domains of physical, school, and social functioning (p<.05). Short stature was associated with lower scores in the physical functioning domain (p<.05).
Children with mild to moderate CKD, in comparison to healthy children, report poorer overall HRQOL as well as poorer physical, school, emotional and social functioning. Early intervention to improve linear growth and to address school functioning difficulties is recommended.
HRQOL; kidney disease; QOL; short-stature
Fungal load quantification is a critical component of fungal community analyses. Limitations of current approaches for quantifying the fungal component of the human microbiome suggest the need for new broad-coverage techniques.
We analyzed 2,085 18S rRNA gene sequences from the SILVA database for assay design. We generated and quantified plasmid standards using a qPCR-based approach. We evaluated assay coverage against 4,968 sequences and performed assay validation following the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines.
We designed FungiQuant, a TaqMan® qPCR assay targeting a 351-bp region of the fungal 18S rRNA gene. Our in silico analysis showed that FungiQuant is a perfect sequence match to 90.0% of the 2,617 fungal species analyzed. We showed that FungiQuant is 100% sensitive, with amplification efficiencies ranging from 76.3% to 114.5% and r2-values >0.99 against the 69 fungal species tested. Additionally, FungiQuant inter- and intra-run coefficients of variance were <10% and <20%, respectively. We further showed that FungiQuant has a limit of quantification of 25 copies and a limit of detection of 5 copies. Lastly, by comparing results from human-only background DNA with low-level fungal DNA, we showed that amplification in two or three replicates of a FungiQuant assay performed in triplicate is statistically significant for true positive fungal detection.
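The amplification efficiency and r2 values reported here come from standard-curve regression; the sketch below shows the standard qPCR calculation (efficiency = 10^(−1/slope) − 1) on a hypothetical dilution series, not FungiQuant data:

```python
import numpy as np

# Hypothetical standard curve: quantification cycle (Cq) values measured on a
# 10-fold dilution series of plasmid standards (copies per reaction).
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3, 1e2])
cq = np.array([13.1, 16.5, 19.9, 23.3, 26.8, 30.2])

# Fit Cq = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(copies), cq, 1)

# Standard qPCR metrics: efficiency from the slope, r2 from the correlation.
efficiency = 10 ** (-1 / slope) - 1
r2 = np.corrcoef(np.log10(copies), cq)[0, 1] ** 2
print(f"slope = {slope:.3f}, efficiency = {100 * efficiency:.1f}%, r2 = {r2:.4f}")
```

A slope near −3.32 corresponds to 100% efficiency; the hypothetical values above give roughly 96%.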
FungiQuant has comprehensive coverage against diverse fungi and is a robust quantification and detection tool for delineating between true fungal detection and non-target human DNA.
Determination of the prevalence of accumulated antiretroviral drug resistance among persons infected with human immunodeficiency virus (HIV) is complicated by the lack of routine measurement in clinical care. By using data from 8 clinic-based cohorts from the North American AIDS Cohort Collaboration on Research and Design, drug-resistance mutations from those with genotype tests were determined and scored using the Genotypic Resistance Interpretation Algorithm developed at Stanford University. For each year from 2000 through 2005, the prevalence was calculated using data from the tested subset, assumptions that incorporated clinical knowledge, and multiple imputation methods to yield a complete data set. A total of 9,289 patients contributed data to the analysis; 3,959 had at least 1 viral load above 1,000 copies/mL, of whom 2,962 (75%) had undergone at least 1 genotype test. Using these methods, the authors estimated that the prevalence of accumulated resistance to 2 or more antiretroviral drug classes had increased from 14% in 2000 to 17% in 2005 (P < 0.001). In contrast, the prevalence of resistance in the tested subset declined from 57% to 36% for 2 or more classes. The authors’ use of clinical knowledge and multiple imputation methods revealed trends in HIV drug resistance among patients in care that were markedly different from those observed using only data from patients who had undergone genotype tests.
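As a generic illustration of estimating prevalence by multiple imputation when an outcome is observed only in a tested subset, here is a simplified sketch on simulated data (improper imputation with a single covariate; not the authors' model or cohort data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

def mi_prevalence(status, tested, covariate, m=20):
    """Impute missing statuses from a logistic model fit on tested patients,
    then pool prevalence estimates across m imputations with Rubin's rules."""
    X = covariate.reshape(-1, 1)
    model = LogisticRegression().fit(X[tested], status[tested])
    p_missing = model.predict_proba(X[~tested])[:, 1]
    n = status.size
    estimates, within_var = [], []
    for _ in range(m):
        imputed = status.astype(float)  # copy; untested entries overwritten
        imputed[~tested] = rng.random(p_missing.size) < p_missing
        p_hat = imputed.mean()
        estimates.append(p_hat)
        within_var.append(p_hat * (1 - p_hat) / n)  # binomial variance
    q_bar = float(np.mean(estimates))
    total_var = np.mean(within_var) + (1 + 1 / m) * np.var(estimates, ddof=1)
    return q_bar, float(np.sqrt(total_var))

# Simulated example: resistance is likelier at higher viral load, and only
# ~75% of patients were genotyped.
log_vl = rng.normal(4.0, 1.0, 1000)
status = (rng.random(1000) < 1 / (1 + np.exp(-(log_vl - 4.0)))).astype(int)
tested = rng.random(1000) < 0.75
print(mi_prevalence(status, tested, log_vl))
```

A production analysis would draw the imputation-model parameters as well ("proper" imputation) and condition on richer clinical covariates, as the authors describe.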
antiretroviral therapy, highly active; drug resistance; genotype; HIV
Immune responses to Pneumocystis jirovecii are not well understood in HIV infection, but antibody responses to Pneumocystis proteins may be useful as markers of risk for, or presence of, Pneumocystis pneumonia (PcP).
Retrospective analysis of a prospective cohort
Enzyme-linked immunosorbent assays were used to measure antibodies to recombinant Pneumocystis major surface glycoprotein fragments (MsgC1, C3, C8, and C9) and antibody titers to recombinant kexin protein (KEX1) in three sequential serum samples obtained up to 18 months before, and three samples after, the first AIDS-defining illness in Multicenter AIDS Cohort Study participants; results were compared between those who had PcP and those who had a non-PcP AIDS-defining illness.
Fifty-four participants had PcP and 47 had a non-PcP AIDS-defining illness. IgG levels to MsgC fragments were similar between groups prior to the first AIDS-defining illness, but the PcP group had higher levels of IgG to MsgC9 (median 50.2 vs. 22.2 units/ml, p=0.047) post-illness. Participants with PcP were more likely to have an increase in MsgC3 (OR 3.9, p=0.02), MsgC8 (OR 5.5, p=0.001), and MsgC9 (OR 4.0, p=0.007). The PcP group was more likely to have low KEX1 IgG prior to development of PcP (OR 3.6, p=0.048), independent of CD4 cell count, and to have an increase to high IgG titers to KEX1 after PcP.
HIV-infected individuals develop immune responses to both Msg and kexin proteins after PcP. Low KEX1 IgG titers may be a novel marker of future PcP risk before CD4 cell count has declined below 200 cells/μl.
HIV; Acquired Immunodeficiency Syndrome; Pneumocystis; serology
In the early highly active antiretroviral therapy (HAART) era, kidney dysfunction was strongly associated with death among HIV-infected individuals. We re-examined this association in the later HAART period to determine whether chronic kidney disease (CKD) remains a predictor of death after HAART initiation.
To evaluate the effect of kidney function at the time of HAART initiation on time to all-cause mortality, we evaluated 1415 HIV-infected women initiating HAART in the Women’s Interagency HIV Study (WIHS). Multivariable proportional hazards models with survival times calculated from HAART initiation to death were constructed; participants were censored at the time of the last available visit or December 31, 2006.
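A generic sketch of this type of survival analysis, using the lifelines library on simulated data (column names, covariates, and effect sizes are invented, not WIHS values):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 300

# Simulated cohort: CKD at HAART initiation roughly doubles the hazard (an
# assumed effect size for illustration only); age has a small effect.
ckd = (rng.random(n) < 0.10).astype(int)
age = rng.normal(40, 8, n)
hazard = 0.05 * np.exp(0.7 * ckd + 0.02 * (age - 40))
time_to_death = rng.exponential(1 / hazard)
censor_time = rng.uniform(1, 10, n)  # last available visit / admin cutoff

df = pd.DataFrame({
    "years": np.minimum(time_to_death, censor_time),
    "died": (time_to_death <= censor_time).astype(int),
    "ckd": ckd,
    "age": age,
})

cph = CoxPHFitter().fit(df, duration_col="years", event_col="died")
cph.print_summary()  # hazard ratios with 95% CIs
```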
CKD (eGFR <60 ml/min/1.73 m2) at HAART initiation was associated with higher mortality risk after adjustment for age, race, hepatitis C serostatus, AIDS history, and CD4+ cell count (hazard ratio [HR]=2.23, 95% confidence interval [CI]: 1.45–3.43). Adjustment for history of hypertension and diabetes attenuated this association (HR=1.89, CI: 0.94–3.80). Lower kidney function at HAART initiation was weakly associated with increased mortality risk in women with prior AIDS (HR=1.09, CI: 1.00–1.19, per 20% decrease in eGFR).
Kidney function at HAART initiation remains an independent predictor of death in HIV-infected individuals, especially in those with a history of AIDS. Our study emphasizes the necessity of monitoring kidney function in this population. Additional studies are needed to determine mechanisms underlying the increased mortality risk associated with CKD in HIV-infected persons.
kidney disease; mortality; HIV; WIHS; antiretroviral therapy
This study investigated the effect of ultraviolet B (UVB) light exposure on the risk of cortical cataract as a function of lens region, and assessed the degree to which the lower nasal predominance of cortical cataract is a result of UVB exposure.
In studies of cortical cataract, a severity score representing the area covered by cataract is often used as the primary outcome. However, additional disease information may exist in the spatial distribution of opacities. Further, it has been hypothesized that the lower nasal region of the lens is the most susceptible to damage by environmental ultraviolet light exposure.
In a sample of 107 lens images from the Salisbury Eye Evaluation Study, a digital cortical cataract grading algorithm was used to capture the location of opacities in binary images. These images were used to estimate the severity of cataract in 16 regions around the lens. Individual cumulative lifetime ocular UVB exposure was estimated using an empiric model together with baseline occupation and leisure-activity data, and its effect on cortical cataract risk in each lens region was examined in a linear mixed-effects model.
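A simplified sketch of how severity might be summarized in 16 angular regions from a binary opacity image (the study's actual grading algorithm and region-numbering convention are not reproduced here):

```python
import numpy as np

def regional_severity(mask, center, n_regions=16):
    """Mean opacity in each of n_regions angular wedges around the lens
    center, given a binary image (True = opaque pixel)."""
    ys, xs = np.indices(mask.shape)
    angles = np.arctan2(ys - center[0], xs - center[1])        # range [-pi, pi]
    wedge = np.floor((angles + np.pi) / (2 * np.pi) * n_regions).astype(int)
    wedge = np.clip(wedge, 0, n_regions - 1)                   # edge case at +pi
    return np.array([mask[wedge == k].mean() for k in range(n_regions)])

# Example on a random 200x200 binary "opacity" image centered at (100, 100).
rng = np.random.default_rng(3)
mask = rng.random((200, 200)) < 0.05
print(regional_severity(mask, center=(100, 100)))
```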
The lower nasal regions had the highest cortical cataract severity in both the right and left eyes. In the combined data, region 9 (the lower nasal corner of the lens) was estimated to have the highest severity. In an assessment of the high- and low-exposure ultraviolet light groups (dichotomized at the median exposure level), higher exposure had the greatest effect in the lower regions of the lens.
These results indicate that there are regional lens differences in the association between cataract and exposure to ultraviolet light but that ultraviolet light may not entirely explain the variations in cortical cataract severity across the lens.
Measurement of GFR by iohexol plasma disappearance in children requires optimization of study duration and sampling times; shortening the plasma disappearance study may overestimate GFR. We examined iohexol plasma disappearance curves in 27 children to determine the degree of overestimation in GFR due to shortening studies from 6 to 5 and to 4 hours. GFR measured after 5 hours was comparable to that after 6 hours, but shortening to 4 hours resulted in a 3% (p<0.01) overestimation of GFR. Another simplification would be reducing the number of blood samples used to determine GFR. Following the lead of Brøchner-Mortensen, we derived the relationship between a single-compartment (slowGFR) disappearance curve and that from a double-exponential analysis (two compartments sampled at 10, 30, 120, and 300 minutes), using 489 GFR measurements (median 44 ml/min per 1.73 m2) from visit 1 of the Chronic Kidney Disease in Children (CKiD) study. Using polynomial regression methods, we developed coefficients to accurately measure GFR from a single compartment: GFR2 = K1 × slowGFR + K2 × (slowGFR)². The coefficients (K1 = 1.0019; K2 = −0.001258) were then used to generate GFR2 for comparison with 361 four-point GFRs at visit 2. There was excellent correlation (r=0.999) and no bias or change in between-individual dispersion. Conclusion: GFR can be accurately measured in children with CKD using the slow component of the iohexol plasma disappearance curve, provided the duration of the study is at least 5 hours.
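Since the quadratic correction and its coefficients are given above, it can be applied directly; a minimal sketch:

```python
def gfr_from_slow_component(slow_gfr, k1=1.0019, k2=-0.001258):
    """Two-compartment GFR approximated from the single (slow) compartment
    via the quadratic correction above: GFR2 = K1*slowGFR + K2*slowGFR^2."""
    return k1 * slow_gfr + k2 * slow_gfr ** 2

# Example: a slow-compartment GFR of 44 ml/min per 1.73 m2 (the CKiD median)
# yields a corrected GFR of about 41.6 ml/min per 1.73 m2.
print(gfr_from_slow_component(44.0))
```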
The optimal time for the initiation of antiretroviral therapy for asymptomatic patients with human immunodeficiency virus (HIV) infection is uncertain.
We conducted two parallel analyses involving a total of 17,517 asymptomatic patients with HIV infection in the United States and Canada who received medical care during the period from 1996 through 2005. None of the patients had undergone previous antiretroviral therapy. In each group, we stratified the patients according to the CD4+ count (351 to 500 cells per cubic millimeter or >500 cells per cubic millimeter) at the initiation of antiretroviral therapy. In each group, we compared the relative risk of death for patients who initiated therapy when the CD4+ count was above each of the two thresholds of interest (early-therapy group) with that of patients who deferred therapy until the CD4+ count fell below these thresholds (deferred-therapy group).
In the first analysis, which involved 8362 patients, 2084 (25%) initiated therapy at a CD4+ count of 351 to 500 cells per cubic millimeter, and 6278 (75%) deferred therapy. After adjustment for calendar year, cohort of patients, and demographic and clinical characteristics, among patients in the deferred-therapy group there was an increase in the risk of death of 69%, as compared with that in the early-therapy group (relative risk in the deferred-therapy group, 1.69; 95% confidence interval [CI], 1.26 to 2.26; P<0.001). In the second analysis involving 9155 patients, 2220 (24%) initiated therapy at a CD4+ count of more than 500 cells per cubic millimeter and 6935 (76%) deferred therapy. Among patients in the deferred-therapy group, there was an increase in the risk of death of 94% (relative risk, 1.94; 95% CI, 1.37 to 2.79; P<0.001).
The early initiation of antiretroviral therapy before the CD4+ count fell below two prespecified thresholds significantly improved survival, as compared with deferred therapy.
Diagnostic images are often assessed for clinical outcomes using subjective methods, which are limited by the skill of the reviewer. Computer-aided diagnosis (CAD) algorithms that assist reviewers in their decisions concerning outcomes have been developed to increase sensitivity and specificity in the clinical setting. However, these systems have not been well utilized in research settings to improve the measurement of clinical endpoints. Reductions in bias through their use could have important implications for etiologic research.
Using the example of cortical cataract detection, we developed an algorithm for assisting a reviewer in evaluating digital images for the presence and severity of lesions. Readily implementable image-processing and statistical methods were used as the basis for the CAD algorithm. The performance of the system was compared to the subjective assessment of five reviewers using 60 simulated images. Cortical cataract severity scores from 0 to 16 were assigned to the images by the reviewers and the CAD system, with each image assessed twice to obtain a measure of variability. Image characteristics that affected reviewer bias were also assessed by systematically varying the appearance of the simulated images.
The algorithm yielded severity scores with smaller bias on images where cataract severity was mild to moderate (scores of approximately 6 or less out of 16). On high-severity images, the bias of the CAD system exceeded that of the reviewers. The variability of the CAD system was zero on repeated images but ranged from 0.48 to 1.22 for the reviewers. The direction and magnitude of the bias exhibited by the reviewers was a function of the number of cataract opacities, the shape, and the contrast of the lesions in the simulated images.
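A minimal sketch of the bias and test-retest variability computations described here, using hypothetical scores on the 0–16 scale (not the study's data):

```python
import numpy as np

def bias_and_repeatability(first_pass, second_pass, truth):
    """Bias = mean deviation of the averaged repeat readings from the known
    severity of each simulated image; repeatability = SD of the difference
    between the two readings of the same image."""
    first, second = np.asarray(first_pass), np.asarray(second_pass)
    bias = np.mean((first + second) / 2 - np.asarray(truth))
    variability = np.std(first - second, ddof=1)
    return bias, variability

# Hypothetical severity scores for five simulated images of known severity.
truth = [2, 4, 6, 10, 14]
reviewer_first = [2, 5, 7, 12, 16]
reviewer_second = [3, 4, 6, 11, 15]
print(bias_and_repeatability(reviewer_first, reviewer_second, truth))
```

For a deterministic CAD system the repeat readings are identical, so the variability term is zero by construction, matching the result reported above.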
CAD systems are feasible to implement with available software and can be valuable when medical images contain exposure or outcome information for etiologic research. Our results indicate that such systems have the potential to decrease bias and discriminate very small changes in disease severity. Simulated images are a tool that can be used to assess performance of a CAD system when a gold standard is not available.