Background. The role of active hepatitis C virus (HCV) replication in chronic kidney disease (CKD) risk has not been clarified.
Methods. We compared CKD incidence in a large cohort of HIV-infected subjects who were HCV seronegative, HCV viremic (detectable HCV RNA), or HCV aviremic (HCV seropositive, undetectable HCV RNA). Stages 3 and 5 CKD were defined according to standard criteria. Progressive CKD was defined as a sustained 25% glomerular filtration rate (GFR) decrease from baseline to a GFR < 60 mL/min/1.73 m2. We used Cox models to calculate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs).
Results. A total of 52 602 HCV seronegative, 9508 HCV viremic, and 913 HCV aviremic subjects were included. Compared with HCV seronegative subjects, HCV viremic subjects were at increased risk for stage 3 CKD (adjusted HR 1.36 [95% CI, 1.26, 1.46]), stage 5 CKD (1.95 [1.64, 2.31]), and progressive CKD (1.31 [1.19, 1.44]), while HCV aviremic subjects were also at increased risk for stage 3 CKD (1.19 [0.98, 1.45]), stage 5 CKD (1.69 [1.07, 2.65]), and progressive CKD (1.31 [1.02, 1.68]).
Conclusions. Compared with HIV-infected subjects who were HCV seronegative, both HCV viremic and HCV aviremic individuals were at increased risk for moderate and advanced CKD.
HIV; hepatitis C virus; chronic kidney disease; hepatitis C RNA; cohort study; glomerular filtration rate; injection drug use
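The hazard ratios above are Cox-model estimates reported on the ratio scale. As a generic arithmetic sketch (not the study's code), a log-hazard coefficient β with standard error SE converts to an HR and 95% CI via HR = exp(β) and exp(β ± 1.96·SE); the SE below is back-calculated from the published stage 3 CKD interval and is therefore an assumption:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a Cox log-hazard coefficient and its standard error into a
    hazard ratio with a 95% confidence interval:
    HR = exp(beta); CI = exp(beta - z*SE), exp(beta + z*SE)."""
    hr = math.exp(beta)
    lo = math.exp(beta - z * se)
    hi = math.exp(beta + z * se)
    return hr, lo, hi

# Illustrative inputs chosen to approximate the stage 3 CKD estimate
# (HR 1.36, 95% CI 1.26-1.46); the SE is back-calculated, not reported.
hr, lo, hi = hazard_ratio_ci(math.log(1.36), 0.0376)
```

With these inputs the function returns approximately 1.36 (1.26-1.46), matching the published figures to two decimals.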
In the last decade, timely initiation of antiretroviral therapy and resulting virologic suppression have greatly improved in North America concurrent with the development of better tolerated and more potent regimens, but significant barriers to treatment uptake remain.
Background. Since the mid-1990s, effective antiretroviral therapy (ART) regimens have improved in potency, tolerability, ease of use, and class diversity. We sought to examine trends in treatment initiation and resulting human immunodeficiency virus (HIV) virologic suppression in North America between 2001 and 2009, and demographic and geographic disparities in these outcomes.
Methods. We analyzed data on HIV-infected individuals newly clinically eligible for ART (ie, first reported CD4+ count <350 cells/µL or AIDS-defining illness, based on treatment guidelines during the study period) from 17 North American AIDS Cohort Collaboration on Research and Design cohorts. Outcomes included timely ART initiation (within 6 months of eligibility) and virologic suppression (≤500 copies/mL, within 1 year). We examined time trends and considered differences by geographic location, age, sex, transmission risk, race/ethnicity, CD4+ count, and viral load, and documented psychosocial barriers to ART initiation, including non–injection drug abuse, alcohol abuse, and mental illness.
Results. Among 10 692 HIV-infected individuals, the cumulative incidence of 6-month ART initiation increased from 51% in 2001 to 72% in 2009 (Ptrend < .001). The cumulative incidence of 1-year virologic suppression increased from 55% to 81%, and among ART initiators, from 84% to 93% (both Ptrend < .001). A greater number of psychosocial barriers were associated with decreased ART initiation, but not virologic suppression once ART was initiated. We found significant heterogeneity by state or province of residence (P < .001).
Conclusions. In the last decade, timely ART initiation and virologic suppression have greatly improved in North America concurrent with the development of better-tolerated and more potent regimens, but significant barriers to treatment uptake remain, both at the individual level and systemwide.
antiretroviral therapy; healthcare disparities; HIV; time factors; viral load
In resource-constrained settings, tuberculosis (TB) is a common opportunistic infection and cause of death in HIV-infected persons. TB may be present at the start of antiretroviral therapy (ART), but it is often under-diagnosed. We describe approaches to the diagnosis and screening of TB in ART programs in low- and middle-income countries.
Methods and findings
We surveyed ART programs treating HIV-infected adults in sub-Saharan Africa, Asia and Latin America in 2012 using online questionnaires to collect program-level and patient-level data. Forty-seven sites from 26 countries participated. Patient-level data were collected on 987 adult TB patients from 40 sites (median age 34.7 years; 54% female). Sputum smear microscopy and chest radiograph were available in 47 (100%) sites, TB culture in 44 (94%), and Xpert MTB/RIF in 23 (49%). Xpert MTB/RIF was rarely available in Central Africa and South America. In sites with access to these diagnostics, microscopy was used in 745 (76%) patients diagnosed with TB, culture in 220 (24%), and chest X-ray in 688 (70%) patients. When culture was free of charge, it was done in 27% of patients, compared to 21% when there was a fee (p = 0.033). Corresponding percentages for Xpert MTB/RIF were 26% and 15% of patients (p = 0.001). Screening practices for active disease before starting ART included symptom screening (46 sites, 98%), chest X-ray (38, 81%), sputum microscopy (37, 79%), culture (16, 34%), and Xpert MTB/RIF (5, 11%).
Mycobacterial culture was infrequently used despite its availability at most sites, while Xpert MTB/RIF was not generally available. Use of available diagnostics was higher when offered free of charge.
25-hydroxyvitamin D [25(OH)D] levels after recovery from tuberculosis (TB) may reflect pre-morbid levels and therefore provide insight into pathogenesis. We assessed 25(OH)D levels after recovery from TB disease and compared them with levels in persons without TB disease.
Case-control study. Cases were persons who had recovered from culture-confirmed Mycobacterium tuberculosis disease. Controls were persons without TB disease. Total 25(OH)D was measured from stored plasma specimens using liquid chromatography-mass spectrometry.
29 persons with prior TB disease and 36 controls were included. Median 25(OH)D levels were 24.7 ng/mL (IQR, 18.3–34.1) in prior TB disease, and 33.6 ng/mL (IQR, 26.2–42.4) in controls (Mann-Whitney; P=0.01). Multivariable linear regression analysis showed that black race (adjusted mean difference [β]=−8.3 ng/mL; 95% CI −14.5, −2.2; P<0.01), enrollment in winter (β=−10.4 ng/mL; 95% CI −17.0, −3.8; P<0.01) and prior TB disease (β=−5.8 ng/mL; 95% CI −11.4, −0.3; P=0.05) were associated with lower 25(OH)D levels.
Persons who had recovered from TB disease had lower 25(OH)D levels compared to controls without TB disease, after adjusting for important confounders. Larger, longitudinal studies are needed to further characterize the possible role of low 25(OH)D in the pathogenesis of TB disease and TB recurrence after recovery.
25-hydroxyvitamin D; vitamin D; tuberculosis; Mycobacterium tuberculosis
Fluoroquinolone resistance in Mycobacterium tuberculosis can be conferred by mutations in gyrA or gyrB. The prevalence of resistance mutations outside the quinolone resistance-determining region (QRDR) of gyrA or gyrB is unclear, since such regions are rarely sequenced. M. tuberculosis isolates from 1,111 patients with newly diagnosed, culture-confirmed tuberculosis in Tennessee from 2002 to 2009 were screened for phenotypic ofloxacin resistance (>2 μg/ml). For each resistant isolate, two ofloxacin-susceptible isolates were selected: one with antecedent fluoroquinolone exposure and one without. The complete gyrA and gyrB genes were sequenced and compared with M. tuberculosis H37Rv. Of 25 ofloxacin-resistant isolates, 11 (44%) did not have previously reported resistance mutations. Of these, 10 had novel polymorphisms: 3 in the QRDR of gyrA, 1 in the QRDR of gyrB, and 6 outside the QRDR of gyrA or gyrB; 1 did not have any gyrase polymorphisms. Polymorphisms in gyrA codons 1 to 73 were more common in fluoroquinolone-susceptible than in fluoroquinolone-resistant strains (20% versus 0%; P = 0.016). In summary, almost half of fluoroquinolone-resistant M. tuberculosis isolates did not have previously described resistance mutations, which has implications for genotypic diagnostic tests.
Background. Screening for tuberculosis prior to highly active antiretroviral therapy (HAART) initiation is not routinely performed in low-incidence settings. Identifying factors associated with developing tuberculosis after HAART initiation could focus screening efforts.
Methods. Sixteen cohorts in the United States and Canada contributed data on persons infected with human immunodeficiency virus (HIV) who initiated HAART December 1995–August 2009. Parametric survival models identified factors associated with tuberculosis occurrence.
Results. Of 37 845 persons in the study, 145 were diagnosed with tuberculosis after HAART initiation. Tuberculosis risk was highest in the first 3 months of HAART (20 cases; 215 cases per 100 000 person-years; 95% confidence interval [CI]: 131–333 per 100 000 person-years). In a multivariate Weibull proportional hazards model, baseline CD4+ lymphocyte count <200 cells/mm3, black race, other nonwhite race, Hispanic ethnicity, and history of injection drug use were independently associated with tuberculosis risk. In addition, in a piecewise Weibull model, increased baseline HIV-1 RNA was associated with increased tuberculosis risk in the first 3 months; male sex tended to be associated with increased risk.
Conclusions. Screening for active tuberculosis prior to HAART initiation should be targeted to persons with baseline CD4 <200 lymphocytes/mm3 or increased HIV-1 RNA, persons of nonwhite race or Hispanic ethnicity, persons with a history of injection drug use, and possibly men.
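The exact Poisson interval quoted in the Results (131–333 per 100 000 person-years for 20 cases) can be approximated with Byar's formula. The person-time denominator is not reported in the abstract and is back-calculated here (about 9 302 person-years), so this is an illustrative sketch rather than the study's computation:

```python
import math

def byar_rate_ci(cases, person_years, per=100_000, z=1.96):
    """Byar's approximation to the exact Poisson confidence interval for
    an incidence rate, expressed per `per` units of person-time."""
    lo = cases * (1 - 1 / (9 * cases) - z / (3 * math.sqrt(cases))) ** 3
    hi = (cases + 1) * (1 - 1 / (9 * (cases + 1)) + z / (3 * math.sqrt(cases + 1))) ** 3
    rate = cases / person_years * per
    return rate, lo / person_years * per, hi / person_years * per

# Person-years back-calculated from 20 cases at 215 per 100 000
# person-years (an assumption; the abstract reports only the rate).
rate, lo, hi = byar_rate_ci(20, 20 / 215 * 100_000)
```

This yields roughly 215 (131–332) per 100 000 person-years, closely matching the reported interval.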
Injection drug use is associated with poor HIV outcomes even among persons receiving highly active antiretroviral therapy (HAART), but there are limited data on the relationship between non-injection drug use and HIV disease progression.
We conducted an observational study of HIV-infected persons entering care between January 1, 1999 and December 31, 2004, with follow-up through December 31, 2005.
There were 1,712 persons in the study cohort: 262 with a history of injection drug use (IDU), 785 with a history of non-injection drug use, and 665 with no history of drug use; 56% were white, and 24% were female. Median follow-up was 2.1 years; 33% had received HAART prior to their first visit, 40% initiated their first HAART during the study period, and 306 (17.9%) had an AIDS-defining event or died. Adjusting for sex, age, race, prior antiretroviral use, CD4 cell count, and HIV-1 RNA, patients with a history of injection drug use were more likely to progress to AIDS or death than non-users (adjusted hazard ratio (HR) = 1.97, 95% confidence interval (CI) 1.43-2.70, P<0.01). There was no statistically significant difference in disease progression between non-injection drug users and non-users (HR=1.19, 95% CI 0.92-1.56, P=0.19). An analysis among the subgroup who initiated their first HAART during the study period (n=687) showed a similar pattern (IDUs: 1.83, 1.09-3.06, P=0.02; non-IDUs: 1.21, 0.81-1.80, P=0.35). Seventy-four patients had active IDU during the study period, 768 had active non-injection drug use, and 870 had no substance use. Analyses based on active drug use during the study period did not differ substantially from those based on history of drug use.
This study found no association between non-injection drug use and HIV disease progression. It is limited by its reliance on history of drug use and by grouping different types of drugs together. Further studies that ascertain the specific type and extent of non-injection drug use prospectively, and with longer follow-up, are needed.
Injection drug use; non-injection drug use; CD4 cell count; HIV viral load; HIV disease progression; antiretroviral therapy
With successful antiretroviral therapy, non-communicable diseases, including malignancies, are increasingly contributing to morbidity and mortality among HIV-infected persons. The epidemiology of AIDS-defining cancers (ADCs) and non-AIDS-defining cancers (NADCs) in HIV-infected populations in Brazil has not been well described. It is not known if cancer trends in HIV-infected populations in Brazil are similar to those of other countries where antiretroviral therapy is also widely available.
We performed a retrospective analysis of clinical cohorts at Instituto Nacional de Infectologia Evandro Chagas (INI) in Rio de Janeiro and Vanderbilt Comprehensive Care Clinic (VCCC) in Nashville from 1998 to 2010. We used Poisson regression and standardized incidence ratios (SIRs) to examine incidence trends. Clinical and demographic predictors of ADCs and NADCs were examined using Cox proportional hazards models.
This study included 2,925 patients at INI and 3,927 patients at VCCC. There were 57 ADCs at INI (65% Kaposi sarcoma), 47 at VCCC (40% Kaposi sarcoma), 45 NADCs at INI, and 82 at VCCC. From 1998 to 2004, incidence of ADCs remained statistically unchanged at both sites. From 2005 to 2010, ADC incidence decreased in both cohorts (INI incidence rate ratio per year = 0.74, p < 0.01; VCCC = 0.75, p < 0.01). Overall Kaposi sarcoma incidence was greater at INI than VCCC (3.0 vs. 1.2 cases per 1,000 person-years, p < 0.01). Incidence of NADCs remained constant throughout the study period (overall INI incidence 3.6 per 1,000 person-years and VCCC incidence 5.3 per 1,000 person-years). Compared to general populations, overall risk of NADCs was increased at both sites (INI SIR = 1.4 [95% CI 1.1-1.9] and VCCC SIR = 1.3 [1.0-1.7]). After non-melanoma skin cancers, the most frequent NADCs were anal cancer at INI (n = 7) and lung cancer at VCCC (n = 11). In multivariate models, risk of ADC was associated with male sex and immunosuppression. Risk of NADC was associated with increased age.
In both cohorts, ADCs have decreased over time, though the incidence of Kaposi sarcoma was higher at INI than at VCCC. Rates of NADCs remained constant over time at both sites.
Electronic supplementary material
The online version of this article (doi:10.1186/1750-9378-10-4) contains supplementary material, which is available to authorized users.
HIV; Malignancy; Cancer; Brazil; Anal cancer; Kaposi sarcoma; Non-Hodgkin lymphoma; Lung cancer; Age; Sex
Despite the success of antiretroviral therapy (ART), excess mortality continues for those with HIV infection. A comprehensive approach to risk assessment, addressing multiorgan system injury on ART, is needed. We sought to develop and validate a practical and generalizable mortality risk index for HIV-infected individuals on ART.
Design and methods
The Veterans Aging Cohort Study (VACS) was used to develop the VACS Index, based on age, CD4 cell count, HIV-1 RNA, hemoglobin, aspartate and alanine transaminase, platelets, creatinine, and hepatitis C status, and a Restricted Index based on age, CD4 cell count, and HIV-1 RNA; the outcome was death up to 6 years after ART initiation. Validation was performed in six independent cohorts participating in the ART Cohort Collaboration (ART-CC).
In both the development (4932 patients, 656 deaths) and validation cohorts (3146 patients, 86 deaths) the VACS Index had better discrimination than the Restricted Index (c-statistics 0.78 and 0.72 in VACS, 0.82 and 0.78 in ART-CC). The VACS Index also demonstrated better discrimination than the Restricted Index for HIV deaths and non-HIV deaths, in men and women, those younger and older than 50 years, with and without detectable HIV-1 RNA, and with or without HCV coinfection.
Among HIV-infected patients treated with ART, the VACS Index more accurately discriminates mortality risk than traditional HIV markers and age alone. By accounting for multiorgan system injury, the VACS Index may prove a useful tool in clinical care and research.
anemia; cohort study; comorbidity; FIB-4; HIV; mortality; prognostic index
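The c-statistics compared above measure discrimination: the probability that a model assigns a higher risk score to a subject who dies than to one who survives. The published indices use survival-time concordance (Harrell's C, which accounts for censoring); the sketch below shows the simpler binary-outcome version for illustration only:

```python
def c_statistic(scores, events):
    """Concordance (c) statistic for a binary outcome: the fraction of
    case/non-case pairs in which the case has the higher risk score,
    counting ties as half. Equivalent to the ROC area under the curve."""
    cases = [s for s, e in zip(scores, events) if e]
    controls = [s for s, e in zip(scores, events) if not e]
    pairs = concordant = 0.0
    for c in cases:
        for k in controls:
            pairs += 1
            if c > k:
                concordant += 1
            elif c == k:
                concordant += 0.5
    return concordant / pairs

# Toy risk scores and outcomes (hypothetical, not study data):
auc = c_statistic([0.9, 0.8, 0.6, 0.6, 0.3], [1, 1, 0, 1, 0])
```

A c-statistic of 0.5 is chance-level discrimination; the VACS Index values of 0.78-0.82 above indicate strong separation of decedents from survivors.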
Heteroresistance is the coexistence of populations with differing nucleotides at a drug resistance locus within a sample of organisms. Although Sanger sequencing is considered the gold standard, it may be less sensitive than deep sequencing for detecting fluoroquinolone heteroresistance in Mycobacterium tuberculosis. Twenty-seven fluoroquinolone-monoresistant and 11 fluoroquinolone-susceptible M. tuberculosis isolates were analyzed by Sanger and Illumina deep sequencing. Individual sequencing reads were analyzed to detect heteroresistance in the gyrA and gyrB genes. Heteroresistance to fluoroquinolones was identified in 10/26 (38%) phenotypically fluoroquinolone-resistant samples and 0/11 (P = 0.02) fluoroquinolone-susceptible controls. One resistant sample was excluded because of contamination with the laboratory strain M. tuberculosis H37Rv. Sanger sequencing revealed resistance-conferring mutations in 15 isolates, while deep sequencing revealed mutations in 20 isolates. Isolates with fluoroquinolone resistance-conferring mutations by Sanger sequencing all had at least those same mutations identified by deep sequencing. By deep sequencing, 10 isolates had a single fixed (defined as >95% frequency) mutation, while 10 were heteroresistant, 5 of which had a single unfixed (defined as <95% frequency) mutation and 5 had multiple unfixed mutations. Illumina deep sequencing identified a higher proportion of fluoroquinolone-resistant M. tuberculosis isolates with heteroresistance than did Sanger sequencing. The heteroresistant isolates frequently demonstrated multiple mutations, but resistant isolates with fixed mutations each had only a single mutation.
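The fixed-versus-unfixed distinction used above (>95% read frequency = fixed) can be expressed as a simple classification rule over per-mutation allele frequencies from deep-sequencing reads. This is a sketch of the definition, not the study's analysis pipeline:

```python
def classify_isolate(mutation_freqs, fixed_threshold=0.95):
    """Classify an isolate from the read frequencies of its
    resistance-conferring mutations, following the definitions above:
    mutations at >95% frequency are 'fixed'; any mutation below the
    threshold makes the isolate heteroresistant."""
    if not mutation_freqs:
        return "no resistance mutations"
    if all(f > fixed_threshold for f in mutation_freqs):
        return "fixed resistance"
    return "heteroresistant"
```

For example, an isolate carrying one mutation at 99% frequency is classified as fixed, while one carrying mutations at 99% and 40% is heteroresistant.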
To determine the rate of, and risk factors for, discontinuation of isoniazid due to adverse effects during the treatment of latent tuberculosis infection (LTBI) in a large, multi-site study.
The Tuberculosis Epidemiologic Studies Consortium (TBESC) conducted a prospective study from March 2007–September 2008 among adults initiating isoniazid for treatment of LTBI at 12 sites in the US and Canada. The relative risk for isoniazid discontinuation due to adverse effects was determined using negative binomial regression. Adjusted models were constructed using forward stepwise regression.
Of 1,306 persons initiating isoniazid, 617 (47.2%, 95% CI 44.5–50.0%) completed treatment and 196 (15.0%, 95% CI 13.1–17.1%) discontinued due to adverse effects. In multivariable analysis, female sex (RR 1.67, 95% CI 1.32–2.10, p<0.001) and current alcohol use (RR 1.41, 95% CI 1.13–1.77, p=0.003) were independently associated with isoniazid discontinuation due to adverse effects.
The rate of discontinuation of isoniazid due to adverse effects was substantially higher than previously reported. Women were at increased risk of discontinuing isoniazid due to adverse effects; close monitoring of women for adverse effects may be warranted. Current alcohol use was also associated with isoniazid discontinuation; counseling patients to abstain from alcohol could decrease discontinuation due to adverse effects.
Mycobacterium tuberculosis; latent tuberculosis infection; female sex; alcohol use; adverse effects
Treatment of latent Mycobacterium tuberculosis infection with isoniazid can cause hepatotoxicity, but the risk of isoniazid-associated hepatotoxicity among persons coinfected with hepatitis C virus (HCV) is unknown. We conducted a prospective study among 146 injection drug users with M. tuberculosis infection and normal baseline hepatic transaminase values who were treated with isoniazid. Of 146 participants, 138 (95%) were HCV-seropositive. Thirty-seven participants (25%) were human immunodeficiency virus (HIV)-seropositive. Thirty-two (22%; 95% confidence interval [CI], 16%–30%) of 146 participants developed transaminase value elevations to >3 times the upper limit of normal. Transaminase value elevation was associated with concurrent alcohol use but not with race, age, presence of hepatitis B surface antigen, HIV-1 infection, or current injection drug use. Isoniazid was withdrawn from 11 participants (8%; 95% CI, 4%–13%). Of 8 deaths during follow-up, none were attributed to isoniazid-associated hepatotoxicity. The risk of transaminase value elevation and drug discontinuation for HCV-infected persons receiving isoniazid was within the range reported for populations with lower HCV prevalence.
It is estimated that more than two billion people have latent M. tuberculosis infection, and this population serves as an important reservoir for future tuberculosis cases. Prevalence estimates are limited by difficulties in diagnosing the infection, including the lack of an ideal test, and an incomplete understanding of latency. Current tests include the tuberculin skin test and two interferon-γ release assays: QuantiFERON Gold In-Tube and T-SPOT.TB. This update focuses on recent publications regarding the ability of these tests to predict tuberculosis disease, their reproducibility over serial tests, and discordance between tests. We also discuss recent advances in the treatment of latent M. tuberculosis infection, including the three-month regimen of once-weekly rifapentine plus isoniazid, and prolonged isoniazid therapy for HIV-infected persons living in high-tuberculosis-incidence settings. We provide an update on the tolerability of the three-month regimen.
Latent tuberculosis infection; Tuberculin skin test; Interferon gamma release assay; QuantiFERON-TB Gold In-Tube assay; T-SPOT.TB assay
To characterize risk factors for non-completion of treatment for latent tuberculosis infection (LTBI). Secondarily, to assess the impact of LTBI treatment regimen on subsequent risk of tuberculosis.
Close contacts of adults (≥15 years) with pulmonary tuberculosis were prospectively enrolled in a multi-center study in the U.S. and Canada from January 2002–December 2006. Close contacts to TB patients were screened and cross-matched with tuberculosis registries to identify those who developed active tuberculosis.
Of the 3,238 contacts screened, 1,714 (53%) were diagnosed with LTBI. Preventive therapy was recommended in 1,371 (80%); 1,147 contacts (84%) initiated therapy, of whom 723 (63%) completed treatment. In multivariate analysis, study site, initial interview sites other than a home or healthcare setting, and treatment with isoniazid were significantly associated with LTBI treatment non-completion. Fourteen tuberculosis cases were identified in contacts, all of whom initiated isoniazid. There were two cases among persons who received six or more months of isoniazid (66 cases/100,000 person-years), and nine cases among persons who received 0–5 months (median 2 months) of isoniazid (792 cases/100,000 person-years; p<0.001); data on duration of isoniazid for three cases were not available.
Only 53% (723 of 1,371) of close contacts for whom preventive therapy was recommended actually completed treatment. Close contacts of TB patients were significantly less likely to complete LTBI treatment if they took isoniazid. Less than six months of isoniazid therapy was associated with increased risk of active TB.
latent tuberculosis infection; treatment non-completion; treatment effectiveness
There have been inconsistent findings on the association between current drug use and HIV disease progression and virologic suppression. Drug use was often measured using self-report of historical use. Objective measurement of current drug use is preferred.
In this cross-sectional study, we assessed drug use through Computer-Assisted Self Interviews (CASI) and point-of-care urine drug screen (UDS) among 225 HIV-infected patients, and evaluated the association between current drug use and virologic suppression.
About half (54%) of participants had a positive UDS, with a lower self-reported rate by CASI (42%) (Kappa score = 0.59). By UDS, 36.0% were positive for marijuana, 25.8% for cocaine, 7.6% for opiates, and 2.2% for methamphetamine or amphetamine. Factors associated with virologic suppression (plasma HIV RNA <50 copies/mL) were Caucasian race (P = 0.03), higher CD4 count (P < 0.01), current use of antiretroviral therapy (ART) (P < 0.01), and a negative UDS (P < 0.01). Among 178 current ART users, a positive UDS remained significantly associated with lower likelihood of virologic suppression (P = 0.04).
UDS had good agreement with CASI in detecting frequently used drugs such as marijuana and cocaine. UDS at routine clinic visits may provide “real-time” prognostic information to optimize management.
Drug use; HIV; Computer-assisted self-interview; Urine drug screen; Antiretroviral therapy; Virologic suppression
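The agreement statistic reported above (kappa = 0.59 between UDS and CASI) is Cohen's kappa: observed agreement corrected for the agreement expected by chance. A minimal sketch for two binary raters follows, using hypothetical counts, since the abstract does not report the full 2×2 table:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two binary raters (e.g., urine drug screen
    vs. computer-assisted self-interview): kappa = (po - pe) / (1 - pe),
    where po is observed agreement and pe is chance agreement."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n
    pa, pb = sum(a) / n, sum(b) / n
    pe = pa * pb + (1 - pa) * (1 - pb)
    return (po - pe) / (1 - pe)

# Hypothetical 2x2 table: 20 positive/positive, 5 positive/negative,
# 10 negative/positive, 15 negative/negative (not the study's data).
a = [1] * 25 + [0] * 25
b = [1] * 20 + [0] * 5 + [1] * 10 + [0] * 15
k = cohens_kappa(a, b)
```

Values between 0.41 and 0.60 are conventionally read as moderate agreement, consistent with the study's interpretation of 0.59.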
Decision analysis techniques can compare management strategies when there are insufficient data from clinical studies to guide decision-making. We compared the outcomes of decision analyses and subsequent clinical studies in the infectious disease literature to assess the validity of the conclusions of the decision analyses.
A search strategy to identify decision analyses in infectious diseases topics published from 1990-2005 was developed and performed using PubMed. Abstracts of all identified articles were reviewed and infectious diseases-related decision analyses were retained. Subsequent clinical trials and observational studies that corresponded to these decision analyses were identified using pre-specified search strategies. Clinical studies were considered a match for the decision analysis if they assessed the same patient population, intervention, and outcome. Agreement or disagreement between the conclusions of the decision analysis and the clinical study was determined by author review.
The initial PubMed search yielded 318 references. Forty decision analyses pertaining to 29 infectious diseases topics were identified. Of the 40, 16 (40%) from 13 infectious diseases topics had matching clinical studies. In 12/16 (75%), conclusions of at least one clinical study agreed with those of the decision analysis. Three of the four decision analyses in which conclusions disagreed were from the same topic (management of febrile children).
There was substantial agreement between the conclusions of decision analyses and clinical studies in infectious diseases, supporting the validity of decision analysis and its utility in guiding management decisions.
Infectious diseases; Decision support techniques; Clinical trials
Fluoroquinolone exposure before tuberculosis (TB) diagnosis is common. We hypothesized that exposure to older-generation fluoroquinolones would be associated with greater fluoroquinolone MICs in Mycobacterium tuberculosis than exposure to newer agents. A nested case–control study was performed among newly diagnosed TB patients reported to the Tennessee Department of Health (January 2002–December 2009). Each fluoroquinolone-resistant case (n = 25) was matched to two fluoroquinolone-susceptible controls (n = 50). Ciprofloxacin and ofloxacin were classified as older-generation fluoroquinolones; levofloxacin, moxifloxacin and gatifloxacin were considered newer agents. There was no difference between median ofloxacin MIC for isolates from 9 patients exposed only to older fluoroquinolones, 25 exposed only to newer fluoroquinolones, 6 exposed to both and 35 fluoroquinolone-unexposed patients (Kruskal–Wallis, P = 0.35). Using multivariate proportional odds logistic regression adjusting for age and sex, duration of exposure to newer fluoroquinolones was independently associated with higher MIC (OR = 1.79, 95% CI 1.22–2.64), but duration of exposure to older fluoroquinolones was not (OR = 0.94, 95% CI 0.50–1.78). Isolates from patients exposed only to newer fluoroquinolones tended to have mutations at gyrA codons 90, 91 or 94 more frequently than those exposed only to older fluoroquinolones (44% vs. 11%). We were surprised to find that duration of exposure to newer fluoroquinolones, but not older ones, was independently associated with higher ofloxacin MIC. This suggests that the mutant selection window lower boundary is likely to have clinical relevance; caution is warranted when newer fluoroquinolones are prescribed to patients with TB risk factors.
Drug-resistant tuberculosis; Genotypic resistance; Moxifloxacin; Levofloxacin; Ciprofloxacin
Highly active antiretroviral therapy (HAART) has been shown to be effective in different populations, but data among injection drug users are limited. Human immunodeficiency virus-infected injection drug users recruited into the Acquired Immunodeficiency Syndrome Link to Intravenous Experiences (ALIVE) Study as early as 1988 were tested semiannually to identify their first CD4-positive T-lymphocyte cell count below 200/μl; they were followed for mortality through 2002. Visits were categorized into the pre-HAART (before mid-1996) and the HAART eras and further categorized by HAART use. Survival analysis with staggered entry was used to evaluate the effect of HAART on acquired immunodeficiency syndrome-related mortality, adjusting for other medications and demographic, clinical, and behavioral factors. Among 665 participants, 258 died during 2,402 person-years of follow-up. Compared with survival in the pre-HAART era, survival in the HAART era was shown by multivariate analysis to be improved for both those who did and did not receive HAART (relative hazards = 0.06 and 0.33, respectively; p < 0.001). Inferences were unchanged after restricting analyses to data starting with 1993 and considerations of lead-time bias and human immunodeficiency viral load. The annual CD4-positive T-lymphocyte cell decline was less in untreated HAART-era participants than in pre-HAART-era participants (−10/μl vs. −37/μl, respectively), suggesting that indications for treatment may have contributed to improved survival and that analyses restricted to the HAART era probably underestimate HAART effectiveness.
antiretroviral therapy, highly active; HIV; substance abuse, intravenous; substance-related disorders; survival; treatment outcome
Obesity and chronic, treated HIV infection are both associated with persistent systemic inflammation and a similar constellation of metabolic and cardiovascular diseases, but the combined effects of excess adiposity and HIV on circulating proinflammatory cytokines and other biomarkers previously shown to predict disease risk are not well described. We measured inflammation biomarker levels in 158 predominantly virologically suppressed adults on long-term antiretroviral therapy (ART) with a range of body mass index (BMI) values from normal to morbidly obese. We assessed the relationship between BMI and each biomarker using multivariable linear regression adjusted for age, sex, race, CD4+ count, tobacco use, data source, protease inhibitor use, and routine nonsteroidal anti-inflammatory drug (NSAID) or aspirin use. Among normal-weight (n=48) and overweight participants (n=41; BMI <30 kg/m2), incremental BMI increases were associated with significantly higher serum high-sensitivity C-reactive protein (hsCRP; β=2.47, p=0.02) and tumor necrosis factor (TNF)-α receptor 1 levels (β=1.53, p=0.03), and significantly lower CD14 levels (β=0.84, p=0.01), but similar associations were not observed in the obese participants. Among the obese (n=69; BMI ≥30 kg/m2), however, higher serum levels of interleukin-6 (IL-6; β=1.30, p=0.02) and macrophage inflammatory protein-1α (β=1.77, p<0.01) were associated with higher BMI, a finding not observed among the nonobese. Among all participants, IL-6 and TNF-α receptor 1 levels were most closely associated with hsCRP (p<0.01). Further studies are needed to determine whether higher serum inflammation biomarker levels found in obese HIV-infected individuals on ART reflect an increased likelihood of adverse health outcomes, or if novel markers to estimate mortality and disease risk are needed in this population.
Among HIV-infected patients who initiated antiretroviral therapy (ART), patterns of cause-specific death varied by ART duration and were strongly related to age, sex, and transmission risk group. Deaths from non-AIDS malignancies were much more frequent than those from cardiovascular disease.
Background. Patterns of cause-specific mortality in individuals infected with human immunodeficiency virus type 1 (HIV-1) are changing dramatically in the era of antiretroviral therapy (ART).
Methods. Sixteen cohorts from Europe and North America contributed data on adult patients followed from the start of ART. Procedures for coding causes of death were standardized. Estimated hazard ratios (HRs) were adjusted for transmission risk group, sex, age, year of ART initiation, baseline CD4 count, viral load, and AIDS status, before and after the first year of ART.
Results. A total of 4237 of 65 121 (6.5%) patients died (median follow-up, 4.5 years). Rates of AIDS death decreased substantially with time since starting ART, but mortality from non-AIDS malignancy increased (rate ratio, 1.04 per year; 95% confidence interval [CI], 1.0–1.1). Higher mortality in men than in women during the first year of ART was mostly due to non-AIDS malignancy and liver-related deaths. Associations with age were strongest for cardiovascular (heart/vascular) and malignancy deaths. Patients with presumed transmission through injection drug use had higher rates of all causes of death, particularly liver-related causes (HRs compared with men who have sex with men: 18.1 [95% CI, 6.2–52.7] during the first year of ART and 9.1 [95% CI, 5.8–14.2] thereafter). CD4 count at baseline and at 12 months remained predictive of AIDS, non-AIDS infection, and non-AIDS malignancy deaths. Lack of viral suppression on ART was associated with AIDS, non-AIDS infection, and other causes of death.
Conclusions. Better understanding of patterns of and risk factors for cause-specific mortality in the ART era can aid in development of appropriate care for HIV-infected individuals and inform guidelines for risk factor management.
HIV; cause-specific mortality; antiretroviral therapy
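The adjusted HRs and CIs above are reported on the hazard-ratio scale, but Cox models estimate them on the log scale. As a minimal sketch (illustrative only, not the collaboration's analysis code), a published 95% CI can be converted back to a log-scale standard error, and an interval reconstructed from the point estimate:

```python
import math

def se_from_ci(lower: float, upper: float, z: float = 1.96) -> float:
    """Recover the log-scale standard error from a published 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def hr_ci(log_hr: float, se: float, z: float = 1.96):
    """Return (HR, lower, upper) from a log-hazard-ratio and its SE,
    as a Cox model would report them: exp(beta -/+ z*SE)."""
    return (math.exp(log_hr),
            math.exp(log_hr - z * se),
            math.exp(log_hr + z * se))

# Values from the abstract: HR 18.1 (95% CI, 6.2-52.7) for liver-related
# death (injection drug use vs. MSM) during the first year of ART.
se = se_from_ci(6.2, 52.7)
hr, lo, hi = hr_ci(math.log(18.1), se)
```

A quick consistency check on a published interval: the point estimate should sit at roughly the geometric mean of the CI bounds, which holds for the values above.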
HIV infection and low CD4+ T-cell count are associated with an increased risk of persistent oncogenic human papillomavirus (HPV) infection, the major risk factor for cervical cancer. Few prospective cohort studies have characterized the incidence of invasive cervical cancer (ICC) in HIV-infected women.
Data were obtained from HIV-infected and -uninfected female participants in the NA-ACCORD with no history of ICC at enrollment. Participants were followed from study entry or January 1996 through ICC, loss to follow-up, or December 2010. The relationship of HIV infection and CD4+ T-cell count with risk of ICC was assessed using age-adjusted Poisson regression models and standardized incidence ratios (SIR). All cases were confirmed by cancer registry records and/or pathology reports. Cervical cytology screening history was assessed through medical record abstraction.
A total of 13,690 HIV-infected and 12,021 HIV-uninfected women contributed 66,249 and 70,815 person-years (pys) of observation, respectively. Incident ICC was diagnosed in 17 HIV-infected and 4 HIV-uninfected women (incidence rates of 26 and 6 per 100,000 pys, respectively). HIV-infected women with baseline CD4+ T-cell counts of ≥350, 200–349, and <200 cells/µL had 2.3-fold, 3.0-fold, and 7.7-fold increases in ICC incidence, respectively, compared with HIV-uninfected women (Ptrend = 0.001). Of the 17 HIV-infected cases, medical records for the 5 years prior to diagnosis showed that 6 had no documented screening, 5 had screening with low-grade or normal results, and 6 had high-grade results.
This study found an elevated incidence of ICC in HIV-infected compared with HIV-uninfected women, and ICC rates increased with increasing immunosuppression.
Human papilloma virus; HIV-infection; Invasive Cervical Cancer; Immunosuppression
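The crude incidence rates above follow directly from the reported case counts and person-time; a minimal sketch of the arithmetic (not the study's Poisson regression, which was additionally age-adjusted):

```python
def incidence_per_100k(cases: int, person_years: float) -> float:
    """Crude incidence rate per 100,000 person-years."""
    return cases / person_years * 100_000

# Counts reported in the abstract.
rate_hiv = incidence_per_100k(17, 66_249)   # HIV-infected women
rate_neg = incidence_per_100k(4, 70_815)    # HIV-uninfected women
crude_rate_ratio = rate_hiv / rate_neg      # unadjusted, ~4.5
```

These reproduce the quoted rates of 26 and 6 per 100,000 pys after rounding; the unadjusted rate ratio of roughly 4.5 is consistent with the CD4-stratified (2.3- to 7.7-fold) comparisons reported.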
Data on the effectiveness of second-line combination antiretroviral therapy (cART) are limited. We evaluated virologic outcomes of second cART in a multicenter cohort collaboration. The study population initiated first and second modern cART between 1996 and 2010. The second cART required a switch in at least the anchor agent of first cART. We evaluated time to virologic failure of second cART and factors associated with greater risk of failure using multivariable Cox proportional hazards models. Of 488 patients who switched to second-line cART, 67% were black and 32% were women. The median HIV-1 RNA at second cART initiation was 9,565 copies/ml [interquartile range (IQR), 123–94,108]. The time to virologic failure of second cART was longer if HIV-1 RNA was undetectable at switch (p=0.001), although 12% and 17% of patients with undetectable and detectable HIV-1 RNA, respectively, experienced virologic failure within 6 months of second cART initiation. A lower CD4 cell count at second cART initiation was associated with a greater risk of virologic failure. Failure rates decreased in more recent calendar years [adjusted relative hazard of 0.40 comparing 2008–2010 with 1996–1998 (95% confidence interval, 0.15–1.00)]; however, type of anchor agent was not associated with failure. In conclusion, virologic failure of second cART was less likely if patients switched with undetectable HIV-1 RNA, although risk of early failure was similar. The effectiveness of second cART regimens improved over calendar time and was independent of the anchor agent in the regimen.
Retention in care is key to improving HIV outcomes. Our goal was to describe “churn” in patterns of entry, exit, and retention in HIV care in the US and Canada.
Adults contributing ≥1 CD4 count or HIV-1 RNA (HIV-lab) from 2000–2008 in North American Cohort Collaboration on Research and Design (NA-ACCORD) clinical cohorts were included. Incomplete retention was defined as lack of 2 HIV-labs (≥90 days apart) within 12 months, summarized by calendar year. We used beta-binomial regression models to estimate adjusted odds ratios (OR) and 95% confidence intervals (CI) of factors associated with incomplete retention.
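The retention definition above (two HIV labs at least 90 days apart within a 12-month window, summarized by calendar year) can be sketched as a small function. This is an illustrative simplification that treats each calendar year as the 12-month window; the cohort's actual windowing may differ:

```python
from datetime import date

def retained_in_year(lab_dates: list[date], year: int) -> bool:
    """Complete retention for a calendar year: at least two HIV labs
    (CD4 count or HIV-1 RNA) in that year, separated by >= 90 days.
    Any qualifying pair exists iff the earliest and latest labs in the
    year are >= 90 days apart, so only those two need checking."""
    in_year = sorted(d for d in lab_dates if d.year == year)
    return bool(in_year) and (in_year[-1] - in_year[0]).days >= 90
```

Incomplete retention in a year is then simply `not retained_in_year(...)` for a participant under observation that year.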
Among 61,438 participants, 15,360 (25%) with incomplete retention significantly differed in univariate analyses (p<0.001) from 46,078 (75%) consistently retained by age, race/ethnicity, HIV risk, CD4, ART use, and country of care (US vs. Canada). From 2000–2004, females (OR=0.82, CI:0.70–0.95), older individuals (OR=0.78, CI:0.74–0.83 per 10 years), and ART users (OR=0.61, CI:0.54–0.68 vs. all others) were less likely to have incomplete retention, while black individuals (OR=1.31, CI:1.16–1.49 vs. white), those with injection drug use (IDU) HIV risk (OR=1.68, CI:1.49–1.89 vs. non-IDU), and those in care longer (OR=1.09, CI:1.07–1.11 per year) were more likely to have incomplete retention. Results from 2005–2008 were similar.
From 2000 to 2008, 75% of the NA-ACCORD population was consistently retained in care with 25% experiencing some change in status, or churn. In addition to the programmatic and policy implications, our findings identify patient groups who may benefit from focused retention efforts.
retention; churn; HIV clinical care; North America; HRSA HAB; National HIV/AIDS Strategy
Clarifying the relationship between illicit drug use and HIV-1 virologic suppression requires characterization of both illicit drug use activity and adherence to antiretroviral therapy (ART). We developed a rapid clinical questionnaire to assess prior 7-day illicit drug use and ART adherence in a cross-sectional study among 1,777 HIV-infected persons in care. Of these, 76% were male, 35% were African-American, and 8% reported injection drug use as their probable route of HIV-1 infection. Questionnaire-reported frequencies of cocaine and marijuana use within the previous 7 days were 3.3% and 12.1%, respectively. Over three quarters (77.8%) of participants were on ART, of whom 69.7% had HIV-1 virologic suppression (HIV-1 RNA<48 copies/mL). Univariate analyses revealed that compared to no use, cocaine and marijuana use were both associated with missed ART doses (P<0.01). Multivariable logistic regression analysis adjusting for non-adherence demonstrated that cocaine use was independently associated with decreased odds of virologic suppression (adjusted odds ratio (aOR), 0.46; 95% confidence interval (CI), 0.22–0.98), but marijuana use was not (aOR, 1.08; 95% CI, 0.72–1.62). This result strengthens the evidence of a direct effect of cocaine on virologic control, independent of non-adherence to ART.
Drug use; cocaine; marijuana; antiretroviral therapy; HIV-1 virologic suppression
Combination antiretroviral therapy (ART) has significantly increased survival among HIV-positive adults in the United States (U.S.) and Canada, but gains in life expectancy for this region have not been well characterized. We aimed to estimate temporal changes in life expectancy among HIV-positive adults on ART from 2000 to 2007 in the U.S. and Canada.
Participants were from the North American AIDS Cohort Collaboration on Research and Design (NA-ACCORD), were aged ≥20 years, and were on ART. Mortality rates were calculated using participants' person-time from January 1, 2000, or ART initiation until death, loss to follow-up, or administrative censoring on December 31, 2007. Life expectancy at age 20, defined as the average number of additional years a person of that age will live provided current age-specific mortality rates remain constant, was estimated using abridged life tables.
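The abridged life table calculation can be sketched in a few lines. This is an illustrative implementation under standard actuarial assumptions (deaths at mid-interval), not the cohort's analysis code, and the rates passed in are hypothetical:

```python
def life_expectancy(rates, width=5):
    """Abridged life table.

    `rates` holds age-specific mortality rates (deaths per person-year)
    for successive `width`-year age bands starting at the index age
    (e.g., 20); the final rate covers the open-ended last band.
    Deaths are assumed to occur, on average, at mid-interval.
    Returns expected additional years of life at the index age.
    """
    survivors = 1.0          # radix: proportion alive at the index age
    person_years = 0.0
    for m in rates[:-1]:
        q = width * m / (1 + width * m / 2)   # prob. of dying in the band
        deaths = survivors * q
        person_years += width * (survivors - deaths / 2)
        survivors -= deaths
    person_years += survivors / rates[-1]     # open-ended final band
    return person_years
```

A useful sanity check: with a constant mortality rate m in every band, the table reduces exactly to the exponential result 1/m years of remaining life.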
Among 22,937 individuals contributing 82,022 person-years, there were 1,622 deaths, yielding a crude mortality rate of 19.8/1,000 person-years. Life expectancy increased from 36.1 [standard error (SE) 0.5] to 51.4 [SE 0.5] years from 2000–2002 to 2006–2007. Men and women had comparable life expectancies in all periods except the last (2006–2007). Life expectancy was lower for individuals with a history of injection drug use, for non-whites, and for patients with baseline CD4 counts <350 cells/mm3.
A 20-year-old HIV-positive adult on ART in the U.S. or Canada is expected to live into their early 70s, a life expectancy approaching that of the general population. Differences by sex, race, HIV transmission risk group, and CD4 count remain.