Quantifying pediatric immunologic recovery after initiation of highly active antiretroviral therapy (HAART) at different CD4 percentage (CD4%) and age thresholds may inform decisions about the timing of treatment initiation.
HIV-1-infected, HAART-naive children in Europe and the Americas were followed from 2002 through 2009 in PENPACT-1. Data from 162 vertically infected children, with at least World Health Organization “mild” immunosuppression and CD4% <10th percentile, were analyzed for improvement to a normal CD4% (≥10th percentile) within 4 years after HAART initiation. Data from 209 vertically infected children, regardless of immune status, were analyzed for CD4% outcomes at 4 years and viral failure within 4 years.
Seventy-two percent of children immunosuppressed at baseline recovered to a normal CD4% within 4 years. Compared with “severe” immunosuppression, more children with “mild” immunosuppression (difference 36%, 95% confidence interval [CI]: 22% to 49%) or “advanced” immunosuppression (difference 20.8%, 95% CI: 5.8% to 35.9%) recovered a normal CD4%. For each 5-year increase in baseline age, the proportion of children achieving a normal CD4% declined by 19% (95% CI: 11% to 27%). Combining baseline CD4% and age effects resulted in >90% recovery when initiating HAART with “mild” immunosuppression at any age or “advanced” immunosuppression at age <3 years. Baseline CD4% effects became greater with increasing age (P = .02). At 4 years, most immunologic benefits were still significant but diminished. Viral failure was highest in infancy (56%) and adolescence (63%).
Initiating HAART at higher CD4% and younger ages maximizes potential for immunologic recovery. Guidelines should weigh immunologic benefits against long-term risks.
child; HIV; immunologic; reconstitution; treatment failure
Viral suppression is a key indicator of antiretroviral therapy (ART) response among HIV-infected patients. Dried blood spots (DBS) are an appealing alternative to conventional plasma-based virologic testing, improving access to monitoring in resource-limited settings. However, the validity of fingerstick DBS collected in field settings remains unknown.
To investigate the feasibility and accuracy of DBS versus plasma collected by healthcare workers in the real-world setting of remote hospitals in Malawi, and to compare venous DBS with fingerstick DBS for identifying treatment failure.
We recruited patients from ART clinics at two district hospitals in Malawi, collecting plasma, venous DBS (vDBS), and fingerstick DBS (fsDBS) cards for the first 149 patients, and vDBS and fsDBS only for the subsequent 398 patients. Specimens were tested using the Abbott RealTime HIV-1 Assay (lower detection limit: 40 copies/ml for plasma and 550 copies/ml for DBS).
21/149 (14.1%) had detectable viremia (>1.6 log copies/ml), 13 of which were detectable in all three specimen types (plasma, vDBS, and fsDBS). Linear regression demonstrated high correlation for plasma vs. DBS (vDBS: β=1.19, R2 0.93, p<0.0001; fsDBS: β=1.20, R2 0.90, p<0.0001) and for vDBS vs. fsDBS (β=0.88, R2 0.73, p<0.0001). The mean difference between plasma and vDBS was 0.51 log copies/ml [SD: 0.33], and between plasma and fsDBS 0.46 log copies/ml [SD: 0.30]. At 5000 copies/ml, sensitivity was 100%, and specificity was 98.6% and 97.8% for vDBS and fsDBS, respectively, compared to plasma.
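The 5000 copies/ml comparison above reduces to a standard 2×2 classification of paired results. The following is a minimal sketch of how such sensitivity, specificity, and mean log-difference figures are typically computed; the paired measurements are hypothetical illustrations, not the study's data.

```python
import math

FAILURE_THRESHOLD = 5000  # copies/ml, the treatment-failure cutoff

def is_failure(vl, threshold=FAILURE_THRESHOLD):
    """Classify a viral load as treatment failure at the given threshold."""
    return vl >= threshold

def sens_spec(plasma, dbs, threshold=FAILURE_THRESHOLD):
    """Sensitivity and specificity of DBS with plasma as the reference."""
    tp = fn = tn = fp = 0
    for p, d in zip(plasma, dbs):
        if is_failure(p, threshold):
            if is_failure(d, threshold):
                tp += 1
            else:
                fn += 1
        else:
            if is_failure(d, threshold):
                fp += 1
            else:
                tn += 1
    return tp / (tp + fn), tn / (tn + fp)

def mean_log10_diff(plasma, dbs):
    """Mean of log10(plasma) - log10(DBS) over detectable pairs."""
    diffs = [math.log10(p) - math.log10(d) for p, d in zip(plasma, dbs)]
    return sum(diffs) / len(diffs)

# Hypothetical paired measurements (copies/ml), not study data:
plasma = [120000, 8000, 60000, 900, 300, 45000]
dbs    = [40000,  6000, 20000, 700, 200, 15000]
sens, spec = sens_spec(plasma, dbs)
bias = mean_log10_diff(plasma, dbs)
```

Real evaluations must also handle results censored at the assay detection limit; that complication is omitted here for brevity.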
DBS collected by venipuncture and by fingerstick both perform well at the treatment-failure threshold of 5000 copies/ml. Fingerstick specimens may improve access to virologic treatment monitoring in resource-limited settings, given task-shifting in high-volume, low-resource facilities.
The current gold standard for infant diagnosis of HIV-1 is the Roche Amplicor Qualitative DNA assay, but it is being phased out.
Compare the Abbott qualitative assay and the Gen-Probe Aptima assay to the gold standard Roche DNA assay using dried blood spots (DBS).
The Gen-Probe Aptima and Abbott qualitative HIV-1 assays were compared to the Roche DNA assay for early infant diagnosis. Specificity was determined for the three assays using DBS from 50 HIV-exposed uninfected infants, and sensitivity using DBS from 269 HIV-1-infected adults from North Carolina. All of the negative and 151 of the positive DBS had valid results on the 3 different assays, and an additional 118 positive DBS had valid results on the Roche DNA and Aptima assays.
All three assays were highly specific. The Roche DNA assay was the most sensitive (96.7%) over a wide range of HIV plasma viral loads (PVL), including samples with PVL <400 copies/ml. When restricted to samples with PVL >400 copies/ml, the Gen-Probe Aptima assay had sensitivity (96.5%) comparable to that of the Roche DNA assay (98.8%). The Abbott qualitative assay was the least sensitive, achieving sensitivity above 95% only among samples with PVL over 1000 copies/ml.
The Abbott HIV-1 Qualitative assay was not as sensitive as the comparator assays, so it would not be a useful replacement assay, especially for infants taking antiretroviral prophylaxis. The Gen-Probe Aptima assay is an adequate replacement option for infant diagnosis using DBS.
The Center for HIV/AIDS Vaccine Immunology (CHAVI) consortium was established to determine the host and virus factors associated with HIV transmission, infection, and containment of virus replication, with the goal of advancing the development of an HIV protective vaccine. Studies to meet this goal required the use of cryopreserved Peripheral Blood Mononuclear Cell (PBMC) specimens, and therefore it was imperative that a quality assurance (QA) oversight program be developed to monitor PBMC samples obtained from study participants at multiple international sites. Nine site-affiliated laboratories in Africa and the USA collected and processed PBMCs, and cryopreserved PBMCs were shipped to CHAVI repositories in Africa and the USA for long-term storage. A three-stage program was designed, based on Good Clinical Laboratory Practices (GCLP), to monitor PBMC integrity at each step of this process. Stage 1 evaluated the integrity of fresh PBMCs for initial viability, overall yield, and processing time at the site-affiliated laboratories; in Stage 2, the repositories determined post-thaw viability and cell recovery of cryopreserved PBMCs received from the site-affiliated laboratories; Stage 3 assessed long-term specimen storage at each repository. Overall, the CHAVI PBMC QA oversight program results highlight the relative importance of each of these stages to the ultimate goal of preserving specimen integrity from peripheral blood collection to long-term repository storage.
cellular immunity; cryopreservation; vaccine; human clinical trials; PBMC; HIV; biorepository
Women with human immunodeficiency virus (HIV)–1 subtype C had significantly higher genital tract viral loads compared to women with HIV-1 subtype B and men with HIV-1 subtype C or B. Women in general were significantly less likely to have genital tract viral load below the lower limit of quantification compared to men.
Background. Combination antiretroviral therapy (cART) reduces genital tract human immunodeficiency virus type 1 (HIV-1) load and the risk of sexual transmission, but little is known about the efficacy of cART for decreasing genital tract viral load (GTVL) or about differences by sex or HIV-1 subtype.
Methods. HIV-1 RNA from blood plasma, seminal plasma, or cervical wicks was quantified at baseline and at weeks 48 and 96 after entry in a randomized clinical trial of 3 cART regimens.
Results. One hundred fifty-eight men and 170 women from 7 countries were studied (men: 55% subtype B and 45% subtype C; women: 24% subtype B and 76% subtype C). Despite similar baseline CD4+ cell counts and blood plasma viral loads, women with subtype C had the highest GTVL (median, 5.1 log10 copies/mL) compared to women with subtype B and men with subtype C or B (4.0, 4.0, and 3.8 log10 copies/mL, respectively; P < .001). The proportion of participants with a GTVL below the lower limit of quantification (LLQ) at week 48 (90%) and week 96 (90%) was increased compared to baseline (16%; P < .001 at both times). Women were significantly less likely to have GTVL below the LLQ compared to men (84% vs 94% at week 48, P = .006; 84% vs 97% at week 96, P = .002), despite a more sensitive assay for seminal plasma than for cervical wicks. No difference in GTVL response across the 3 cART regimens was detected.
Conclusions. The female genital tract may serve as a reservoir of persistent HIV-1 replication during cART and affect the use of cART to prevent sexual and perinatal transmission of HIV-1.
HIV-1 genital tract RNA; HIV-1 subtypes B and C; antiretroviral drugs
The qualitative Roche HIV-1 DNA Amplicor assay has been used for the past 20 years to diagnose HIV infection in infants and young children but is being phased out; hence, alternative assays must be found. The Gen-Probe Aptima qualitative HIV-1 RNA assay is currently the only FDA-cleared HIV-1 nucleic acid assay approved for diagnosis, but data on the use of this assay with infant plasma are limited. We assessed Aptima's performance using control material for reproducibility and limit of detection and 394 plasma samples (0.2 to 0.5 ml) from HIV-exposed infected and uninfected infants and children for analytical sensitivity and specificity. Assays to assess within-run repeatability and between-run reproducibility indicated that the controls with 10,000 (5 of 5), 200 (5 of 5), 100 (16 of 16), 50 (12 of 12), and 25 (20 of 20) HIV-1 RNA copies/ml (cp/ml) were always positive, and negatives were always negative (20 of 20). The limit of detection was 14 cp/ml, as determined by probit analysis. The analytic sensitivity of the assay was 99.5% (189/190 samples; 95% confidence interval [CI], 97.1 to 99.9%) and specificity was 99.5% (199/200 samples; 95% CI, 97.2 to 99.9%). These results suggest that the assay is suitable for early infant diagnosis of HIV-1.
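The limit of detection above was estimated by probit analysis: detection probability is modeled against log concentration on a dilution panel and the fitted curve is inverted at a target probability. A simplified sketch follows, using unweighted least squares on probit-transformed hit rates and a hypothetical dilution panel (full analyses use weighted maximum likelihood with confidence intervals).

```python
import math
from statistics import NormalDist

nd = NormalDist()

# Hypothetical dilution panel, not the study's data:
# (concentration in copies/ml, replicates detected, replicates tested)
panel = [(100, 20, 20), (50, 19, 20), (25, 16, 20), (12, 10, 20), (6, 4, 20)]

xs, ys = [], []
for conc, hits, reps in panel:
    rate = hits / reps
    # Clamp 0%/100% hit rates so the probit (inverse normal CDF) stays finite.
    rate = min(max(rate, 0.5 / reps), 1 - 0.5 / reps)
    xs.append(math.log10(conc))
    ys.append(nd.inv_cdf(rate))

# Ordinary least-squares fit: probit(detection) = a + b * log10(concentration)
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

# 95% limit of detection: concentration at which the fitted detection
# probability reaches 0.95.
lod95 = 10 ** ((nd.inv_cdf(0.95) - a) / b)
```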
Detectable HIV-1 in body compartments can lead to transmission and antiretroviral resistance. Although sex differences in viral shedding have been demonstrated, their mechanisms and magnitude are unclear. We compared RNA levels in blood, genital-secretions, and saliva, and drug resistance in plasma and genital-secretions, of men and women starting or changing antiretroviral therapy (ART) in the AIDS Clinical Trials Group (ACTG) 5077 study.
Blood, saliva and genital-secretions (compartment fluids) were collected from HIV-infected adults (≥13 years) at 14 United-States sites, who were initiating or changing ART with plasma viral load (VL) ≥2,000 copies/mL. VL testing was performed on all compartment fluids and HIV resistance genotyping on plasma and genital-secretions. Spearman rank correlations were used to evaluate concordance and Fisher’s and McNemar’s exact tests to compare VL between sexes and among compartments.
Samples were available for 143 subjects: 36% treated (23 men, 29 women) and 64% ‘untreated’ (40 men, 51 women). RNA detection was significantly more frequent in plasma (100%) than in genital-secretions (57%) or saliva (64%) (P<0.001). A higher proportion of men than women had genital shedding (78% versus 41%), and RNA detection was more frequent in saliva than in genital-secretions in women when adjusted for censoring at the limit of assay detection. Inter-compartment fluid VL concordance was low in both sexes. In 22 paired plasma–genital-secretion genotypes from treated subjects (13 men, 9 women), most had detectable resistance in both plasma (77%) and genital-secretions (68%). Resistance discordance between compartments was observed in 14% of subjects.
HIV shedding and drug resistance detection prior to initiation/change of ART in ACTG 5077 subjects differed among tissues and between sexes, making the gold standard blood-plasma compartment assessment not fully representative of HIV at other tissue sites. Mechanisms of potential sex-dependent tissue compartmentalization should be further characterized to aid in optimizing treatment and prevention of HIV transmission.
Dried blood spots (DBS) have been used as alternative specimens to plasma to increase access to HIV viral load (VL) monitoring and early infant diagnosis (EID) in remote settings. We systematically reviewed evidence on the performance of DBS compared to plasma for VL monitoring and EID.
Methods and Findings
Thirteen peer-reviewed HIV VL publications and five HIV EID papers were included. Depending on the technology and the viral load distribution in the study population, the percentage of DBS samples within 0.5 log of the plasma VL ranged from 52–100%. Because the input sample volume is much smaller for a blood spot, there is a risk of false negatives with DBS. Sensitivity of DBS VL compared to plasma was 78–100% at a threshold of 1000 copies/ml but increased to 100% at a threshold of 5000 copies/ml. Unlike a plasma VL test, which measures only cell-free HIV RNA, a DBS VL also measures proviral DNA and cell-associated RNA, potentially leading to false-positive results. The systematic review showed that specificity was close to 100% at DBS VL above 5000 copies/ml, making this threshold the most reliable for predicting true virologic failure using DBS. For early infant diagnosis, DBS had a sensitivity of 100% compared to fresh whole blood or plasma in all studies.
Although limited data are available for EID, DBS offer a highly sensitive and specific sampling strategy to make viral load monitoring and early infant diagnosis more accessible in remote settings. A standardized approach for sampling, storing, and processing DBS samples would be essential to allow successful implementation.
PROSPERO Registration #: CRD42013003621.
Viral load (VL) monitoring is the standard of care in developing country settings for detecting HIV treatment failure. Since 2010, the World Health Organization has recommended a phase-in approach to VL monitoring in resource-limited settings. We conducted a systematic review of the accuracy and precision of HIV VL technologies for treatment monitoring.
Methods and Findings
A search of Medline and Embase was conducted for studies evaluating the accuracy or reproducibility of commercially available HIV VL assays. Thirty-seven studies were included for review, including evaluations of the Amplicor Monitor HIV-1 v1.5 (n = 25), Cobas TaqMan v2.0 (n = 11), Abbott RealTime HIV-1 (n = 23), Versant HIV-1 RNA bDNA 3.0 (n = 15), Versant HIV-1 RNA kPCR 1.0 (n = 2), ExaVir Load v3 (n = 2), and NucliSens EasyQ v2.0 (n = 1). All currently available HIV VL assays are of sufficient sensitivity to detect plasma virus levels at a lower detection limit of 1,000 copies/mL. Bias data comparing the Abbott RealTime HIV-1 and TaqMan v2.0 to the Amplicor Monitor v1.5 showed a tendency of the Abbott RealTime HIV-1 to underestimate results, while the TaqMan v2.0 overestimated VL counts. Compared to the Amplicor Monitor v1.5, 2–26% and 9–70% of results from the Versant bDNA 3.0 and Abbott RealTime HIV-1, respectively, differed by greater than 0.5 log10. The average intra- and inter-assay variation of the Abbott RealTime HIV-1 was 2.95% (range 2.0–5.1%) and 5.44% (range 1.17–30.00%), respectively, across the range of VL counts (2 log10–7 log10).
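The intra- and inter-assay variation quoted above is conventionally expressed as a coefficient of variation (CV%) over replicate results. A minimal sketch, with hypothetical replicate log10 VL values rather than any study's data:

```python
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100 * stdev(values) / mean(values)

# Hypothetical replicate log10 VL results for one control material:
within_run  = [4.02, 3.98, 4.05, 4.00, 3.97]  # replicates in a single run
between_run = [4.02, 3.90, 4.11, 3.95, 4.08]  # one result from each of 5 runs

intra_cv = cv_percent(within_run)   # intra-assay (within-run) precision
inter_cv = cv_percent(between_run)  # inter-assay (between-run) precision
```

Between-run CV is expected to exceed within-run CV, since it adds run-to-run sources of variability (operator, reagent lot, calibration) on top of replicate noise.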
This review found that all currently available HIV VL assays are of sufficient sensitivity to detect plasma VL of 1,000 copies/mL as a threshold to initiate investigations of treatment adherence or possible treatment failure. Sources of variability between VL assays include differences in technology platform, plasma input volume, and ability to detect HIV-1 subtypes. Monitoring of individual patients should be performed on the same technology platform to ensure appropriate interpretation of changes in VL.
PROSPERO Registration #: CRD42013003603.
In 1998 a collaboration between Duke University and the University of North Carolina, Chapel Hill (UNC) was founded to enhance identification of persons with acute HIV-1 infection (AHI). The Duke-UNC AHI Research Consortium Cohort consists of patients ≥18 years old with a positive nucleic acid amplification test (NAAT) and either a negative enzyme immunoassay (EIA) test or a positive EIA with a negative/indeterminate Western blot. Patients were referred to the cohort from acute care settings and state-funded HIV testing sites that use NAAT testing on pooled HIV-1 antibody-negative samples. Between 1998 and 2010, 155 patients with AHI were enrolled: 81 (52%) African-Americans, 63 (41%) white non-Hispanics, 137 (88%) males, 108 (70%) men who have sex with men (MSM), and 18 (12%) females. The median age was 27 years (IQR 22–38). Most (n=138/155) reported symptoms, with a median duration of 17.5 days. The median nadir CD4 count was 408 cells/mm3 (IQR 289–563); the median observed peak HIV-1 level was 726,859 copies/ml (IQR 167,585–3,565,728). The emergency department was the most frequent site of initial presentation (n=55/152; 3 missing data). AHI diagnosis was made at the time of first contact in 62/137 (45%; 18 missing data) patients. This prospectively enrolled cohort is the largest group of patients with AHI reported from the southeastern United States. The demographics reflect the epidemic of this geographic area, with a high proportion of African-Americans, including young black MSM. Highlighting the challenges of diagnosing AHI, fewer than half of the patients were diagnosed at the first healthcare visit. Women made up a small proportion despite increasing numbers in our clinics.
Understanding human immunodeficiency virus type 1 (HIV-1) transmission is central to developing effective prevention strategies, including a vaccine. We compared phenotypic and genetic variation in HIV-1 env genes from subjects in acute/early infection and subjects with chronic infections in the context of subtype C heterosexual transmission. We found that the transmitted viruses all used CCR5 and required high levels of CD4 to infect target cells, suggesting selection for replication in T cells and not macrophages after transmission. In addition, the transmitted viruses were more likely to use a maraviroc-sensitive conformation of CCR5, perhaps identifying a feature of the target T cell. We confirmed an earlier observation that the transmitted viruses were, on average, modestly underglycosylated relative to the viruses from chronically infected subjects. This difference was most pronounced in comparing the viruses in acutely infected men to those in chronically infected women. These features of the transmitted virus point to selective pressures during the transmission event. We did not observe a consistent difference either in heterologous neutralization sensitivity or in sensitivity to soluble CD4 between the two groups, suggesting similar conformations between viruses from acute and chronic infection. However, the presence or absence of glycosylation sites had differential effects on neutralization sensitivity for different antibodies. We suggest that the occasional absence of glycosylation sites encoded in the conserved regions of env, further reduced in transmitted viruses, could expose specific surface structures on the protein as antibody targets.
The contribution of immune activation to accelerated HIV disease progression in older individuals has not been delineated.
Prospective multicenter cohort of older (≥45 years) and younger (18–30 years) HIV-infected adults initiating 192 weeks of antiretroviral therapy (ART). Longitudinal models of CD4 cell restoration examined associations with age-group, thymic volume, immune activation, and viral load.
Forty-five older and 45 younger adults (median age 50 and 26 years, respectively) were studied. Older patients had fewer naive CD4 cells (P <0.001) and higher HLA-DR/CD38 expression on CD4 (P = 0.05) and CD8 cells (P = 0.07) than younger patients at any time on ART. The rate of naive and total CD4 cell increase was similar between age groups, but older patients had a faster mean rate of B-cell increase (by +0.7 cells/week; P = 0.01), to higher counts than healthy controls after 192 weeks (P = 0.003). Naive CD4 increases from baseline were associated with immune activation reductions (as declines from baseline of %CD8 cells expressing HLA-DR/CD38; P <0.0001), but these increases were attenuated in older patients, or in those with small thymuses. A 15% reduction in activation was associated with naive gains of 29.9 and 6.2 cells/μl in younger, versus older patients, or with gains of 25.7, 23.4, and 2.1 cells/μl in patients with the largest, intermediate, and smallest thymuses, respectively (P <0.01 for interactions between activation reduction and age-group or thymic volume).
Older patients had significant B-cell expansion, higher levels of immune activation markers, and significantly attenuated naive CD4 cell gains associated with activation reduction.
aging; immune activation; immunosenescence; thymus
BACKGROUND AND OBJECTIVE:
The impact of maternal antiretrovirals (ARVs) during pregnancy, labor, and postpartum on infant outcomes is unclear.
Infants born to HIV-infected mothers in ARV studies were followed for 18 months.
Between June 2006 and December 2008, 236 infants were enrolled from Africa (n = 36), India (n = 47), Thailand (n = 152), and Brazil (n = 1). Exposure to ARVs in pregnancy included ≥3 ARVs (10%), zidovudine/intrapartum ARV (81%), and intrapartum ARV (9%). There were 4 infant infections (1 in utero, 3 late postpartum) and 4 deaths, with 1.8% mortality (95% confidence interval [CI], 0.1%–3.5%) and 96.4% HIV-1–free survival (95% CI, 94.0%–98.9%). Birth weight was ≥2.5 kg in 86%. In the first 6 months, Indian infants (nonbreastfed) had the lowest median weights and lengths and the smallest increases in growth. After 6 months, African infants had the lowest median weight and weight-for-age z scores. Infants exposed to the highest maternal viral loads had the lowest height and height-for-age z scores. Serious adverse events occurred in 38% of infants, did not differ by country, and correlated with less maternal ARV exposure. Clinical diagnoses were seen in 84% of Thai, 31% of African, and 9% of Indian infants. Congenital defects/inborn errors of metabolism were seen in 18 (7.6%) infants, of whom 17 were Thai (11%: 95% CI, 6.7%–17.0%); none had first trimester ARV exposure.
Infant follow-up in large international cohorts is feasible and provides important safety and HIV transmission data following maternal ARV exposure. Increased surveillance increases identification of congenital/inborn errors.
maternal ARV exposure; infant safety; ARV toxicities; A5190; P1054; MTCT; HIV
In resource-limited settings where no safe alternative to breastfeeding exists, WHO recommends that antiretroviral prophylaxis be given to either HIV-infected mothers or infants throughout breastfeeding. We assessed the effect of 28 weeks of maternal or infant antiretroviral prophylaxis on postnatal HIV infection at 48 weeks.
The Breastfeeding, Antiretrovirals, and Nutrition (BAN) Study was undertaken in Lilongwe, Malawi, between April 21, 2004, and Jan 28, 2010. 2369 HIV-infected breastfeeding mothers with a CD4 count of 250 cells per μL or more and their newborn babies were randomly assigned with a variable-block design to one of three 28-week regimens: maternal triple antiretroviral (n=849); daily infant nevirapine (n=852); or control (n=668). Patients and local clinical staff were not masked to treatment allocation, but other study investigators were. All mothers and infants received one dose of nevirapine (mothers 200 mg; infants 2 mg/kg) and 7 days of twice-daily zidovudine (mothers 300 mg; infants 2 mg/kg) and lamivudine (mothers 150 mg; infants 4 mg/kg). Mothers were advised to wean between 24 weeks and 28 weeks after birth. The primary endpoint was HIV infection by 48 weeks in infants who were not infected at 2 weeks and in all infants randomly assigned, with censoring at loss to follow-up. This trial is registered with ClinicalTrials.gov, number NCT00164736.
676 mother–infant pairs completed follow-up to 48 weeks or reached an endpoint in the maternal-antiretroviral group, 680 in the infant-nevirapine group, and 542 in the control group. By 32 weeks post partum, 96% of women in the intervention groups and 88% of those in the control group reported no breastfeeding since their 28-week visit. 30 infants in the maternal-antiretroviral group, 25 in the infant-nevirapine group, and 38 in the control group became HIV infected between 2 weeks and 48 weeks of life; 28 (30%) infections occurred after 28 weeks (nine in maternal-antiretroviral, 13 in infant-nevirapine, and six in control groups). The cumulative risk of HIV-1 transmission by 48 weeks was significantly higher in the control group (7%, 95% CI 5–9) than in the maternal-antiretroviral (4%, 3–6; p=0·0273) or the infant-nevirapine (4%, 2–5; p=0·0027) groups. The rate of serious adverse events in infants was significantly higher during 29–48 weeks than during the intervention phase (1·1 [95% CI 1·0–1·2] vs 0·7 [0·7–0·8] per 100 person-weeks; p<0·0001), with increased risk of diarrhoea, malaria, growth faltering, tuberculosis, and death. Nine women died between 2 weeks and 48 weeks post partum (one in maternal-antiretroviral group, two in infant-nevirapine group, six in control group).
In resource-limited settings where no suitable alternative to breastfeeding is available, antiretroviral prophylaxis given to mothers or infants might decrease HIV transmission. Weaning at 6 months might increase infant morbidity.
US Centers for Disease Control and Prevention.
(See the editorial commentary by Branson and Stekler, on pages 521–4.)
Background. Most human immunodeficiency virus (HIV) point-of-care tests detect antibodies (Ab) but not p24 antigen (Ag) or RNA. In the absence of antibodies, p24 antigen and RNA typically indicate acute HIV infection. We conducted a field evaluation of the Determine® HIV-1/2 Ag/Ab Combo rapid test (Combo RT).
Methods. The antigen portion of the Combo RT (for acute HIV infection) was compared with the Roche Monitor HIV RNA polymerase chain reaction assay. The antibody portion of the Combo RT (for established HIV infection) was compared with rapid test algorithms. Participants were enrolled at a sexually transmitted infection clinic and an HIV testing and counseling center in Lilongwe, Malawi. Rapid testing was conducted with a parallel algorithm in the clinic and a serial algorithm in the center. The Combo RT was performed in clinic participants with negative or discordant antibody results and in all center participants.
Results. Of the participants, 838 were HIV negative, 163 had established HIV infection, and 8 had acute HIV infection. For detecting acute HIV infection, the antigen portion had a sensitivity of 0.000 and a specificity of 0.983. For detecting established HIV infection, the antibody portion had a sensitivity of 0.994 and a specificity of 0.992.
Conclusions. Combo RT displayed excellent performance for detecting established HIV infection and poor performance for detecting acute HIV infection. In this setting, Combo RT is no more useful than current algorithms.
To characterize responses to an NNRTI-based antiretroviral treatment (ART) regimen initiated during acute HIV infection (AHI).
This was a prospective, single-arm evaluation of once daily, co-formulated emtricitabine/tenofovir/efavirenz initiated during AHI.
The primary endpoint was the proportion of responders with HIV RNA <200 copies/mL by week 24. We examined time to viral suppression and CD8 cell activation in relation to baseline participant characteristics. Using linear mixed-effects models, we compared time to viral suppression and viral dynamics between acutely infected participants and chronically infected controls.
Between January 2005 and May 2009, 61 AHI participants were enrolled. Of participants whose enrollment date allowed 24 and 48 weeks of follow-up, 47 of 51 (92%) achieved viral suppression to <200 copies/mL by week 24, and 35 of 41 (85.4%) to <50 copies/mL by week 48. The median time from ART initiation to suppression <50 copies/mL was 93 days (range 14–337). Higher HIV RNA levels at ART initiation (p=0.02), but not time from estimated date of infection to ART initiation (p=0.86), were associated with longer time to viral suppression. The median baseline frequency of activated CD8+CD38+HLA-DR+ T-cells was 67% (range 40–95) and was not significantly associated with longer time to viral load suppression (p=0.15). Viremia declined to <50 copies/mL more rapidly in acutely infected than chronically infected participants. Mixed model analysis demonstrated similar phase I HIV RNA decay rates between acutely and chronically infected participants, and more rapid viral decline in acutely infected participants in phase II.
Once daily emtricitabine/tenofovir/efavirenz initiated during AHI achieves rapid and sustained HIV suppression during this highly infectious period.
Acute HIV infection; NNRTIs; antiretroviral therapy; immune activation; viral dynamics
HIV-1 RNA quantitation continues to be extremely important for monitoring patients infected with HIV-1, and a number of assays have been utilized for this purpose. Differences in assay performance with respect to log10 recovery and HIV-1 subtype specificity have been well documented for commercially available assays, although comparisons are usually limited to one or two assay platforms. Two new FDA-approved assays, the Roche Cobas AmpliPrep/Cobas TaqMan HIV-1 test (RT) and the Abbott RealTime HIV-1 assay (AR), that utilize real-time PCR have replaced previous HIV-1 RNA platforms. Inadequate detection of some strains of HIV-1 resulted in the addition of a new primer/probe set and the introduction of a second version of the RT assay. In this study, comparisons of assay performance between the different FDA-approved HIV-1 RNA assay platforms (both new and existing) were performed by using validation data that included both well-characterized virus stock and locally collected clinical samples. Laboratories across diverse geographical regions performed the validation testing and submitted data to the Virology Quality Assurance program (VQA) for analysis. Correlation values for clinical sample testing varied across the assay platforms (r = 0.832 to 0.986), and average log10 recoveries for HIV-1 RNA controls (compared to the nominal value) ranged from −0.215 to 0.181. These data demonstrate the need for use of one assay platform for longitudinal patient monitoring, but the data also reinforce the notion that no one assay is superior and that testing across platforms may be required for discordance reconciliation.
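The "log10 recovery" figures above compare measured control results to their nominal (expected) values on the log10 scale, with values near zero indicating accurate quantitation. A short sketch, using hypothetical replicate results rather than VQA data:

```python
import math

def log10_recovery(measured_cp_ml, nominal_cp_ml):
    """Difference between measured and nominal VL on the log10 scale."""
    return math.log10(measured_cp_ml) - math.log10(nominal_cp_ml)

# Hypothetical replicate results for a 10,000 copies/ml control:
measured = [8200, 9800, 12100, 10500]
recoveries = [log10_recovery(m, 10000) for m in measured]
avg_recovery = sum(recoveries) / len(recoveries)
```

An average recovery of −0.215 to 0.181 log10, as reported above, means the assays quantified the controls within roughly a factor of 0.6 to 1.5 of the nominal concentration on average.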
Breastfeeding is a leading cause of infant HIV-1 infection in the developing world, yet only a minority of infants exposed to HIV-1 via breastfeeding become infected. As a genetic bottleneck severely restricts the number of postnatally-transmitted variants, genetic or phenotypic properties of the viral envelope (Env) could be important for the establishment of infant infection. We examined the efficiency of virologic functions required for initiation of infection in the gastrointestinal tract and the neutralization sensitivity of HIV-1 Env variants isolated from milk of three postnatally-transmitting mothers (n = 13 viruses), five clinically-matched nontransmitting mothers (n = 16 viruses), and seven postnatally-infected infants (n = 7 postnatally-transmitted/founder (T/F) viruses).
There was no difference in the efficiency of epithelial cell interactions between Env virus variants from the breast milk of transmitting and nontransmitting mothers. Moreover, there was similar efficiency of DC-mediated trans-infection, CCR5 usage, target cell fusion, and infectivity between HIV-1 Env-pseudoviruses from nontransmitting mothers and postnatal T/F viruses. Milk Env-pseudoviruses were generally sensitive to neutralization by autologous maternal plasma and resistant to breast milk neutralization. Infant T/F Env-pseudoviruses were as sensitive to neutralization by broadly-neutralizing monoclonal and polyclonal antibodies as nontransmitted breast milk Env variants.
Postnatally transmitted/founder Env variants do not appear to possess either a superior ability to interact with and cross a mucosal barrier or an exceptional resistance to neutralization that would explain their ability to initiate infection across the infant gastrointestinal tract in the setting of preexisting maternal antibodies.
HIV; Mother to child transmission; Galcer; Dendritic cells; Neutralizing antibodies
(See the editorial commentary by Tossonian and Conway, on pages 10–12.)
Background. The benefits of antiretroviral therapy during early human immunodeficiency virus type 1 (HIV-1) infection remain unproved.
Methods. The A5217 study team randomized patients within 6 months of HIV-1 seroconversion to receive either 36 weeks of antiretrovirals (immediate treatment [IT]) or no treatment (deferred treatment [DT]). Patients were to start or restart antiretroviral therapy if they met predefined criteria. The primary end point was a composite of requiring treatment or retreatment and the log10 HIV-1 RNA level at week 72 (both groups) and week 36 (DT group).
Results. At the June 2009 Data Safety Monitoring Board (DSMB) review, 130 of 150 targeted participants had enrolled. Efficacy analysis included 79 individuals randomized ≥72 weeks previously. For the primary end point, the IT group at week 72 had a better outcome than the DT group at week 72 (P = .005) and the DT group at week 36 (P = .002). The differences were primarily due to the higher rate of progression to needing treatment in the DT group (50%) than in the IT group (10%). The DSMB recommended stopping the study because further follow-up was unlikely to change these findings.
Conclusions. Progression to meeting criteria for antiretroviral initiation in the DT group occurred more frequently than anticipated, limiting the ability to evaluate virologic set point. Antiretrovirals during early HIV-1 infection modestly delayed the need for subsequent treatment.
Clinical Trials Registration. NCT00090779.
We evaluated the efficacy of a maternal triple-drug antiretroviral regimen or infant nevirapine prophylaxis for 28 weeks during breast-feeding to reduce postnatal transmission of human immunodeficiency virus type 1 (HIV-1) in Malawi.
We randomly assigned 2369 HIV-1–positive, breast-feeding mothers with a CD4+ lymphocyte count of at least 250 cells per cubic millimeter and their infants to receive a maternal antiretroviral regimen, infant nevirapine, or no extended postnatal antiretroviral regimen (control group). All mothers and infants received perinatal prophylaxis with single-dose nevirapine and 1 week of zidovudine plus lamivudine. We used the Kaplan–Meier method to estimate the cumulative risk of HIV-1 transmission or death by 28 weeks among infants who were HIV-1–negative 2 weeks after birth. Rates were compared with the use of the log-rank test.
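The Kaplan–Meier cumulative risk described above is one minus the product-limit survival estimate: at each observed event time, the running survival probability is multiplied by the fraction of at-risk infants who did not have an event. A minimal sketch with hypothetical event and censoring times (not trial data):

```python
def kaplan_meier_cumulative_risk(event_times, censor_times, t):
    """1 - S(t), with S(t) the Kaplan-Meier product-limit estimate.

    At each distinct event time u <= t, multiply the running survival
    probability by (1 - d/n): d events among n subjects still at risk.
    """
    surv = 1.0
    for u in sorted(set(event_times)):
        if u > t:
            break
        d = event_times.count(u)                       # events at time u
        n = (sum(1 for e in event_times if e >= u)     # still at risk at u
             + sum(1 for c in censor_times if c >= u))
        surv *= 1 - d / n
    return 1 - surv

# Hypothetical cohort of 10 infants: events at weeks 4, 10, 10, and 20;
# the remaining 6 followed event-free through week 28.
risk_28 = kaplan_meier_cumulative_risk([4, 10, 10, 20], [28] * 6, 28)  # ~0.4
```

Handling censoring this way, rather than as simple proportions, is what lets the trial compare groups with incomplete follow-up via the log-rank test.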
Among mother–infant pairs, 5.0% of infants were HIV-1–positive at 2 weeks of life. The estimated risk of HIV-1 transmission between 2 and 28 weeks was higher in the control group (5.7%) than in either the maternal-regimen group (2.9%, P = 0.009) or the infant-regimen group (1.7%, P<0.001). The estimated risk of infant HIV-1 infection or death between 2 and 28 weeks was 7.0% in the control group, 4.1% in the maternal-regimen group (P = 0.02), and 2.6% in the infant-regimen group (P<0.001). The proportion of women with neutropenia was higher in the maternal-regimen group (6.2%) than in either the infant-regimen group (2.6%) or the control group (2.3%). Among infants receiving nevirapine, 1.9% had a hypersensitivity reaction.
The use of either a maternal antiretroviral regimen or infant nevirapine for 28 weeks was effective in reducing HIV-1 transmission during breast-feeding. (ClinicalTrials.gov number, NCT00164736.)
Tenofovir (TFV) is effective in preventing simian immunodeficiency virus (SIV) transmission in a macaque model, is available as the oral agent tenofovir disoproxil fumarate (TDF), and may be useful in the prevention of mother-to-child transmission of human immunodeficiency virus (HIV). We conducted a trial of TDF and TDF-emtricitabine (FTC) in HIV-infected pregnant women and their infants. Women received a single dose of either 600 mg TDF, 900 mg TDF, or 900 mg TDF-600 mg FTC at labor onset or prior to a cesarean section. Infants received no drug or a single dose of TDF at 4 mg/kg of body weight or of TDF at 4 mg/kg plus FTC at 3 mg/kg as soon as possible after birth. All regimens were safe and well tolerated. Maternal areas under the serum concentration-time curve (AUC) and concentrations at the end of sampling after 24 h (C24) were similar between the two doses of TDF; the maximum concentrations of the drugs in serum (Cmax) and cord blood concentrations were higher in women delivering via cesarean section than in those who delivered vaginally (P = 0.04 and 0.046, respectively). The median ratio of the TFV concentration in cord blood to that in the maternal plasma at delivery was 0.73 (range, 0.26 to 1.95). Without TDF administration, infants had a median TFV concentration of 12 ng/ml 12 h after birth. Following administration of a single dose of TDF at 4 mg/kg, infant TFV concentrations fell below the targeted level, 50 ng/ml, by 24 h postdose. In HIV-infected pregnant women and their infants, 600 mg of TDF is acceptable as a single dose during labor. Low concentrations at birth support infant dosing as soon after birth as possible. Rapidly decreasing TFV levels in infants suggest that multiple or higher doses of TDF will be necessary to maintain concentrations that are effective for viral suppression.
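An AUC from sparse serum sampling, as in the study above, is conventionally computed with the linear trapezoidal rule over the observed concentration-time points. A minimal sketch with a hypothetical concentration profile (the times and concentrations below are illustrative, not study data):

```python
def auc_trapezoid(times_h, conc_ng_ml):
    """AUC (ng*h/mL) over the sampled interval by the linear trapezoidal rule."""
    pairs = list(zip(times_h, conc_ng_ml))
    return sum((t2 - t1) * (c1 + c2) / 2
               for (t1, c1), (t2, c2) in zip(pairs, pairs[1:]))

# Hypothetical serum TFV profile: sampling times (h) and concentrations (ng/mL).
times = [0, 1, 2, 4, 8, 12, 24]
conc = [0, 250, 300, 220, 120, 60, 15]
auc_0_24 = auc_trapezoid(times, conc)  # 2410.0 ng*h/mL
```

The last sampled concentration in such a profile plays the role of the C24 reported in the abstract, and the largest observed value the role of Cmax.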
There is a worldwide need for a pediatric HIV-1 diagnostic test that is highly accurate, technically simple, and cost-efficient. The Up24 HIV-1 assay, which requires both the HIV-1 p24 ELISA and the ELAST signal amplification kit, has previously been shown to be a robust tool for diagnosing pediatric HIV-1 from dried whole blood spots (DBS) (Cachafeiro et al., JCM 2009;47:459–62). To make the assay more accessible to resource-limited clinical settings, we eliminated the ELAST system, which simplified the Up24 assay and reduced its cost, and we tested the accuracy of the modified assay in a rural Malawian hospital.
In this proof of concept study, we tested the ability of a simplified Up24 antigen assay, without ELAST, to detect HIV-1 on DBS obtained via heel prick from 6-week-old Malawian infants.
A case–control study of DBS collected from 113 HIV-infected and 109 HIV-negative infants, using the HIV-1 DNA PCR assay as the reference standard.
The simplified HIV-1 Up24 assay had a sensitivity and specificity of 84% and 98%, respectively. At an HIV-1 prevalence of 15%, the positive and negative predictive values are 89% and 97%, respectively.
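The predictive values above follow from Bayes' rule applied to sensitivity, specificity, and prevalence. A minimal sketch using the rounded 84%/98% figures (the results therefore differ slightly from values computed from the exact study counts):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """PPV and NPV from test characteristics and prevalence (Bayes' rule)."""
    tp = sensitivity * prevalence              # true-positive fraction
    fp = (1 - specificity) * (1 - prevalence)  # false-positive fraction
    tn = specificity * (1 - prevalence)        # true-negative fraction
    fn = (1 - sensitivity) * prevalence        # false-negative fraction
    return tp / (tp + fp), tn / (tn + fn)

# Rounded assay characteristics at 15% prevalence.
ppv, npv = predictive_values(0.84, 0.98, 0.15)  # ~0.88 and ~0.97
```

Because predictive values depend on prevalence, the same assay would show a lower PPV in a population with less HIV-1 infection, which is why the abstract states the prevalence alongside them.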
The simplified Up24 assay has a good positive and a robust negative predictive value, is easier to perform, and costs less than both the HIV DNA PCR and Up24 assays. With additional testing, the simplified Up24 assay has the potential to increase global access to pediatric HIV-1 diagnostics.
Human immunodeficiency virus type 1 (HIV-1); Pediatrics; Diagnostics; Malawi; HIV-1 p24 antigen detection assay; Dried blood spot (DBS)
The Cavidi viral load assay and the ultra-sensitive p24 antigen assay (Up24 Ag) have been suggested as more feasible alternatives to PCR-based HIV viral load assays for use in monitoring patients infected with HIV-1 in resource-limited settings.
To describe the performance of the Cavidi ExaVir Load™ assay (version 2.0) and two versions of the Up24 antigen assay and to characterize their agreement with the Roche Monitor HIV-1 RNA assay (version 1.5).
Observational study using a convenience sample of 342 plasma specimens from 108 patients enrolled in two ACTG clinical trials to evaluate the performance characteristics of the Up24 Ag assay using two different lysis buffers and the Cavidi ExaVir Load™ assay.
In analysis of agreement with the Roche assay, the Cavidi assay demonstrated superiority to the Up24 Ag assays in accuracy and precision, as well as sensitivity, specificity, and positive and negative predictive values for HIV-1 RNA ≥400, ≥1000 and ≥5000 copies/mL. Logistic performance curves indicated that the Cavidi assay was superior to the Up24 assays for viral loads greater than 650 copies/mL.
The results suggest that the Cavidi ExaVir Load assay could be used for monitoring HIV-1 viral load in resource-limited settings.
Cavidi; p24 antigen; viral load; resource limited setting
We assessed whether 7 days of zidovudine+lamivudine postpartum with single-dose nevirapine at labor decreases nevirapine resistance in HIV-infected women in Malawi.
HIV-infected pregnant women receiving intrapartum single-dose nevirapine and 7 days of zidovudine+lamivudine (n=132), and women receiving intrapartum single-dose nevirapine alone (n=66) were followed from an antenatal visit through 6 weeks postpartum. Plasma specimens at 2 and 6 weeks postpartum were tested for genotypic resistance to nevirapine by population sequencing and sensitive real-time PCR. Poisson regression was used to determine predictors of postpartum nevirapine resistance.
Median HIV RNA was similar at entry (4.27 log vs. 4.35 log, p=0.87) and at 6 weeks postpartum (4.49 log vs. 4.40 log, p=0.79), but differed at 2 weeks postpartum (2.67 log vs. 3.58 log, p<0.0001), between the single-dose nevirapine/zidovudine+lamivudine and single-dose nevirapine groups, respectively. Nevirapine resistance, measured by population sequencing and sensitive real-time PCR, was significantly less common with single-dose nevirapine/zidovudine+lamivudine than with single-dose nevirapine at both 2 weeks [10% (4/40) vs. 74% (31/42), p<0.0001] and 6 weeks postpartum [10% (11/115) vs. 64% (41/64), p<0.0001; adjusted relative risk=0.18, 95% confidence interval (0.10–0.34)].
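The adjusted relative risk above comes from the Poisson regression; as a rough cross-check, a crude (unadjusted) risk ratio with a 95% Wald confidence interval can be computed directly from the 6-week counts. A minimal sketch; this reproduces the unadjusted estimate, not the model-based one:

```python
import math

def relative_risk(events1, n1, events2, n2):
    """Crude risk ratio (group 1 / group 2) with a 95% Wald CI on the log scale."""
    rr = (events1 / n1) / (events2 / n2)
    # Standard error of log(RR) for binomial counts.
    se_log = math.sqrt(1 / events1 - 1 / n1 + 1 / events2 - 1 / n2)
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(1.96 * se_log)
    return rr, lo, hi

# 6-week resistance counts from the abstract: 11/115 vs. 41/64.
rr, lo, hi = relative_risk(11, 115, 41, 64)  # ~0.15 (0.08 to 0.27)
```

The crude ratio of roughly 0.15 is in the same range as the adjusted 0.18 (0.10–0.34), consistent with covariate adjustment only modestly shifting the estimate.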
The significant decrease in nevirapine resistance conferred by one week of zidovudine+lamivudine should help policymakers optimize peripartum HIV prophylaxis recommendations.