In HIV-1 subtype C infected populations in south India, we searched for novel mutations associated with failing antiretroviral therapy that included nucleoside reverse transcriptase (RT) inhibitors. HIV-1 RT sequences were generated from treated and untreated groups, and each nucleotide position was analysed with appropriate corrections for multiple testing. We found that nonsynonymous mutations at positions 208 and 228 were strongly associated with the presence of thymidine analogue mutations in the treated group and were entirely absent from the treatment-naïve group. The role of these substitutions in treatment outcomes and in the evolution of drug resistance in HIV-1 subtype C infected populations warrants further investigation.
HIV-1 Drug Resistance Mutation; HIV-1 Subtype C; HIV in India; Thymidine Analogue Mutations (TAMs); HIV-1 RT mutations at codons 208 and 228; HIV Drug Resistance in India
To develop less costly methods of virologic monitoring for patients receiving antiretroviral therapy, we evaluated approaches that use pooled blood samples and the quantitative information available from viral load assays to monitor a cohort of patients on first-line antiretroviral therapy for virologic failure.
We evaluated 150 blood samples collected after 6 months of therapy from participants enrolled in a San Diego primary infection program between January 1998 and January 2007. Samples were screened for virologic failure with individual viral load testing, 10 × 10 matrix pools and minipools of five samples. For the pooled platforms (matrix and minipools), we used a search and retest algorithm based on the quantitative viral load data to resolve samples that remained ambiguous for virologic failure. Viral load thresholds were more than 500 and more than 1500 copies/ml for the matrix and more than 250 and more than 500 copies/ml for the minipool. Efficiency, accuracy and result turnaround times were evaluated.
Twenty-three percent of cohort samples were detectable at more than 50 HIV RNA copies/ml. At an algorithm threshold of more than 500 HIV RNA copies/ml, both minipool and matrix methods used less than half the number of viral load assays to screen the cohort, compared with testing samples individually. Both pooling platforms had negative predictive values of 100% for viral loads of more than 500 HIV RNA copies/ml and at least 94% for viral loads of more than 250 HIV RNA copies/ml.
In this cohort, both pooling methods improved the efficiency of virologic monitoring over individual testing with a minimal decrease in accuracy. These methods may allow for the induction and sustainability of the virologic monitoring of patients receiving antiretroviral therapy in resource-limited settings.
antiretroviral therapy; drug resistance; HIV; monitoring; pooling; viral load
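The quantitative search-and-retest idea described above can be sketched in code. This is an illustrative simplification rather than the published algorithm: it assumes an ideal assay in which the measured pool concentration equals the mean of its members, and the function and variable names are ours.

```python
def screen_minipool(sample_vls, threshold=500):
    """Screen a minipool for virologic failure (viral load >= threshold).

    Returns (indices of flagged samples, number of viral load assays used).
    Idealized assumption: the pool measurement equals the mean of the member
    viral loads, so a pool total below threshold rules out every member.
    """
    assays = 1                           # one assay on the pool itself
    pool_total = sum(sample_vls)         # pool concentration x pool size
    if pool_total < threshold:
        return [], assays                # no member can reach the threshold
    failures, resolved = [], 0.0
    for i, vl in enumerate(sample_vls):  # retest members one at a time
        assays += 1
        if vl >= threshold:
            failures.append(i)
        resolved += vl
        if pool_total - resolved < threshold:
            break                        # untested members cannot reach threshold
    return failures, assays
```

With five suppressed samples the pool resolves in a single assay, and a single high-titer member is typically resolved after one or two retests; this is where the savings over individual testing arise.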
Efforts to identify all persons infected with HIV in the United States are driven by the hope that early diagnosis will lower risk behaviors and decrease HIV transmission. Identification of HIV-infected people earlier in the course of their infection with HIV antigen/antibody (Ag/Ab) combination assays (4th-generation HIV assays) should help achieve this goal. We compared HIV RNA nucleic acid test (NAT) results to the results of a 4th-generation Ag/Ab assay (Architect HIV Ag/Ab Combo [HIV Combo] assay; Abbott Diagnostics) in 2,744 HIV antibody-negative samples. Fourteen people with acute HIV infection (HIV antibody negative/NAT positive) were identified; the HIV Combo assay detected nine of these individuals and was falsely negative in the remaining five. All five persons missed by the HIV Combo assay were in the stage of exponential increase in plasma virus associated with acute HIV infection (3, 7, 20, 35, 48). In contrast, most acutely infected persons detected by the HIV Combo assay demonstrated either a plateauing or decreasing plasma viral load. The HIV Combo assay also classified as positive five other samples which were negative by NAT. Taken together, the HIV Combo assay had a sensitivity of 73.7% and a specificity of 99.8%. Using published data, we estimated the number of secondary transmission events that would have occurred had HIV infection in these five individuals remained undiagnosed. Screening of our population with NAT cost more than screening with the HIV Combo assay but achieved new diagnoses that we predict resulted in health care savings that far exceed screening costs. These findings support the use of more sensitive assays, like NAT, in HIV screening of populations with a high prevalence of acute HIV infection.
Malaria and HIV co-infection adversely impact the outcome of both diseases and previous studies have mostly focused on falciparum malaria. Plasmodium vivax contributes to almost half of the malaria cases in India, but the disease burden of HIV and P. vivax co-infection is unclear.
HIV-infected subjects (n=460) were randomly selected from the 4,611 individuals seen at a Voluntary Counseling and Testing Center in Chennai, India between Jan 2 and Dec 31, 2008. Malaria testing was performed on stored plasma samples by nested PCR, using both genus-specific and species-specific primers, and by an immunochromatography-based rapid diagnostic test detecting antibodies against Plasmodium falciparum and P. vivax.
Recent malaria co-infection, defined by the presence of antibodies, was detected in 9.8% (45/460) of participants. Plasmodium vivax accounted for the majority of infections (60%), followed by P. falciparum (27%) and mixed infections (13%). Individuals with HIV and malaria co-infection were more likely to be men (p=0.01). Between those with and without malaria, there was no difference in age (p=0.14), CD4+ T-cell counts (p=0.19) or the proportion with CD4+ T-cell counts below 200 cells/μL (p=0.51).
Retrospective testing of stored plasma samples for malaria antibodies can facilitate identification of populations with high rates of co-infection, and in this southern India HIV-infected cohort there was a considerable burden of malaria co-infection, predominantly due to P. vivax. However, the rate of P. falciparum infection was more than 6-fold higher among HIV-infected individuals than what would be expected in the general population in the region. Interestingly, individuals co-infected with malaria and HIV were not more likely to be immunosuppressed than individuals with HIV infection alone.
Plasmodium vivax; Plasmodium falciparum; Malaria; HIV; Co-infection; Malaria antibody; Retrospective test
To determine the influence of asymptomatic genital viral infections on the cellular components of semen and blood, we evaluated the associations between the numbers and activation statuses of CD4+ and CD8+ T lymphocytes in both compartments and the seminal levels of cytomegalovirus (CMV), herpes simplex virus (HSV), and human immunodeficiency virus 1 (HIV). Paired blood and semen samples were collected from 36 HIV-infected antiretroviral-naïve individuals and from 40 HIV-uninfected participants. We performed multiparameter flow cytometry analysis (CD45, CD45RA, CD3, CD4, CD8, and CD38) of seminal and blood cellular components and measured HIV RNA and CMV and HSV DNA levels in seminal and blood plasma by real-time PCR. Compared to HIV-uninfected participants, in the seminal compartment HIV-infected participants had higher levels of CMV (P < 0.05), higher numbers of total CD3+ (P < 0.01) and CD8+ subset (P < 0.01) T lymphocytes, and higher CD4+ and CD8+ T lymphocyte activation (RA-CD38+) (P < 0.01). Seminal CMV levels positively correlated with absolute numbers of CD4+ and CD8+ T cells in semen (P < 0.05) and with the activation status of CD4+ T cells in semen and in blood (P < 0.01). HIV levels in semen (P < 0.05) and blood (P < 0.01) were positively associated with T-cell activation in blood. Activation of CD8+ T cells in blood remained an independent predictor of HIV levels in semen in multivariate analysis. The virologic milieu in the male genital tract strongly influences the recruitment and activation of immune cells in semen and may also modulate T-cell immune activation in blood. These factors likely influence replication dynamics, sexual transmission risk, and disease outcomes for all three viruses.
During the late 1980s and early 1990s, an estimated 10,000 Romanian children were infected with HIV-1 subtype F nosocomially through contaminated needles and blood transfusions. However, the geographic source and origins of this epidemic remain unclear.
Here we used phylogenetic inference and “relaxed” molecular clock dating analysis to further characterize the Romanian HIV-1 subtype F epidemic.
These analyses revealed a major lineage of Romanian HIV sequences consisting almost entirely of virus sampled from adolescents and children, and a distinct cluster that included a much higher proportion of adult sequences. Divergence time estimates placed the most recent common ancestor of the subtype F1 sequences at 1973 (1966–1980) and of all Angolan sequences at 1975 (1968–1980). The most recent common ancestor of the Romanian sequences was dated to 1978 (1972–1983), with pediatric and adolescent sequences interspersed throughout the lineage. The phylogenetic structure of the entire subtype F epidemic suggests multiple introductions of subtype F into Romania, either from the Angolan epidemic or from more distant ancestors. Because historical records indicate that the Romanian pediatric epidemic did not begin until the late 1980s, the inferred 1978 date for the most recent common ancestor of the Romanian lineage suggests that multiple introductions of subtype F into the pediatric population occurred from HIV already circulating in Romania.
Analysis of the subtype F HIV-1 epidemic in its historical context allows a deeper appreciation of how the HIV pandemic has been shaped by socio-political events.
Phylogeography; Romania; Subtype F; Socio-political; HIV
Current HIV screening guidelines in the United States recommend expanding the scope of HIV screening to include routine screening in health care settings; however, this will require increased resources. Since testing of pooled samples can decrease costs, the test characteristics of pooled rapid antibody testing were determined and optimal pool sizes were estimated for populations with HIV prevalence ranging from 0.25% to 10%. Based on these results, pooled testing methods were evaluated for screening patients admitted to hospital in San Diego, California. Evaluation of pooled antibody testing on samples collected from individuals with known HIV infection found only a modest reduction in sensitivity. False negative results occurred only among samples with very low optical density readings (<0.125 by the ADVIA Centaur® HIV assay); such readings are considered HIV negative by the assay and therefore likely correspond to samples collected during acute infection. Further evaluation of pooled testing of samples collected during recent infection found that mini-pool testing of five samples detected HIV antibody in 86% of samples taken within 60 days of initial infection and 92% of samples taken within 90 days. Based on estimates of optimal pool sizes for low-prevalence populations, mini-pools of 10 samples were evaluated for screening the study's hospitalized patients. During this evaluation, the HIV prevalence among hospitalized patients was 0.8%, and 10-sample mini-pool testing had 100% sensitivity and specificity. Additionally, pooled testing reduced the number of rapid HIV antibody tests needed by 84.5% compared with testing each sample individually. Even after incorporating the increased cost of technician time, mini-pool testing would have resulted in a net savings of 8760 USD for the 523 samples tested in the study.
Taken together, these results indicate that pooled rapid antibody testing may substantially reduce the costs of HIV screening in low-prevalence populations without a loss of accuracy.
HIV screening; rapid antibody testing; pooled testing; cost-effective screening
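The pool-size question above has a classic closed form for two-stage (Dorfman) pooling: with prevalence p and pool size k, the expected number of tests per sample is 1/k + 1 − (1 − p)^k. The sketch below is this generic calculation, not the exact estimation procedure used in the study.

```python
def expected_tests_per_sample(p, k):
    """Dorfman two-stage pooling: one pooled test per k samples, plus
    k individual retests whenever the pool tests positive."""
    return 1.0 / k + (1.0 - (1.0 - p) ** k)

def optimal_pool_size(p, k_max=50):
    """Pool size minimizing the expected number of tests per sample
    at prevalence p (searched over 2..k_max)."""
    return min(range(2, k_max + 1), key=lambda k: expected_tests_per_sample(p, k))
```

At the 0.8% prevalence observed among the hospitalized patients, pools of 10 give roughly 0.18 expected tests per sample, i.e. about an 82% reduction, consistent with the 84.5% reduction observed.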
To develop a low cost method to screen for virologic failure of antiretroviral therapy (ART) and HIV-1 drug resistance, we performed a retrospective evaluation of a screening assay using serial dilutions of HIV-1 RNA-spiked blood plasma and samples from patients receiving >6 months of first-line ART.
Serial dilution testing was used to assess sensitivity of a simple PCR-based assay (targeted at ≥1,000 HIV RNA copies/mL). We created blood plasma minipools of five samples, extracted HIV RNA from the pools, PCR amplified the reverse transcriptase (RT) coding region of the HIV-1 pol gene from extracted RNA, sequenced PCR product of positive pools, and used sequences to determine drug resistance. Sensitivity, specificity, and predictive values were determined for different levels of virologic failure based on maximum viral loads of individual samples within a pool.
Of 295 samples analyzed, 43 (15%) had virologic failure at ≥50 copies/mL (range 50–10,500 copies/mL, four at ≥1,000 copies/mL). The assay demonstrated 100% sensitivity to detect virus from these four samples, requiring only one round of PCR, and 56% and 89% sensitivity to detect samples with ≥50 and ≥500 copies/mL using two rounds. Amplified PCR products of all positive pools were successfully sequenced and 30% harbored ≥1 major resistance mutation. This method would have cost 10% of the combined costs of individual viral load and resistance testing.
We present a novel method that can screen for both virologic failure of first-line ART and drug resistance. The method is much less expensive than current methods, which may offer sustainability in resource-limited settings.
Similar to other resource-limited settings, cost restricts availability of viral load monitoring for most patients receiving antiretroviral therapy in Tijuana, Mexico. We evaluated if a pooling method could improve efficiency and reduce costs while maintaining accuracy.
We evaluated 700 patient blood plasma specimens at a reference laboratory in Tijuana for detectable viremia, individually and in 10 × 10 matrix pools. Thresholds for virologic failure were set at ≥500, ≥1000 and ≥1500 HIV RNA copies per milliliter. Detectable pools were deconvoluted using pre-set algorithms. Accuracy and efficiency of the pooling method were compared with individual testing. Quality assurance (QA) measures were evaluated after 1 matrix demonstrated low efficiency relative to individual testing.
Twenty-two percent of the cohort had detectable HIV RNA (≥50 copies/mL). Pooling methods saved approximately one third of viral load assays over individual testing, while maintaining negative predictive values of >90% to detect samples with virologic failure (≥50 copies/mL). One matrix with low relative efficiency would have been detected earlier using the developed QA measures, but its exclusion would have only increased relative efficiency from 39% to 42%. These methods would have saved between $13,223 and $14,308 for monitoring this cohort.
Despite limited clinical data, high prevalence of detectable viral loads and a contaminated matrix, pooling greatly improved efficiency of virologic monitoring while maintaining accuracy. By improving cost-effectiveness, these methods could provide sustainability of virologic monitoring in resource-limited settings, and incorporation of developed QA measures will most likely maximize pooling efficiency in future uses.
HIV; pooling; resource-limited settings; viral loads; virologic failure
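The deconvolution step of a 10 × 10 matrix can be sketched as follows (names are illustrative): twenty pool assays (10 rows plus 10 columns) replace 100 individual assays, and only samples at the intersection of a positive row pool and a positive column pool need individual retesting.

```python
def matrix_candidates(row_positive, col_positive):
    """Return (row, col) coordinates of samples needing individual retesting.

    Each sample contributes to exactly one row pool and one column pool, so
    a sample can harbor virologic failure only if both of its pools are
    positive; all other samples are resolved by the 20 pool assays alone.
    """
    return [(r, c)
            for r, rp in enumerate(row_positive) if rp
            for c, cp in enumerate(col_positive) if cp]
```

A single failing sample lights up one row and one column and is resolved with 21 assays instead of 100; efficiency degrades as more pools turn positive, which is why relative efficiency was lower in this cohort with 22% detectable viremia.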
Reports of a high frequency of the transmission of minority viral populations with drug-resistant mutations (DRM) are inconsistent with evidence that HIV-1 infections usually arise from mono- or oligoclonal transmission. We performed ultradeep sequencing (UDS) of partial HIV-1 gag, pol, and env genes from 32 recently infected individuals. We then evaluated overall and per-site diversity levels, selective pressure, sequence reproducibility, and presence of DRM and accessory mutations (AM). To differentiate biologically meaningful mutations from those caused by methodological errors, we obtained multinomial confidence intervals (CI) for the proportion of DRM at each site and fitted a binomial mixture model to determine background error rates for each sample. We then examined the association between detected minority DRM and the virologic failure of first-line antiretroviral therapy (ART). Similar to other studies, we observed increased detection of DRM at low frequencies (average, 0.56%; 95% CI, 0.43 to 0.69; expected UDS error, 0.21 ± 0.08% mutations/site). For 8 duplicate runs, there was variability in the proportions of minority DRM. There was no indication of increased diversity or selection at DRM sites compared to other sites and no association between minority DRM and AM. There was no correlation between detected minority DRM and clinical failure of first-line ART. It is unlikely that minority viral variants harboring DRM are transmitted and maintained in the recipient host. The majority of low-frequency DRM detected using UDS are likely errors inherent to UDS methodology or a consequence of error-prone HIV-1 replication.
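A minimal version of the background-error reasoning above, replacing the fitted binomial mixture model with an exact binomial tail test against the reported mean per-site error rate (0.21%), could look like the following; the alpha cutoff and function names are illustrative.

```python
from math import exp, lgamma, log

def binom_sf(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p), summed upward from the pmf at k.

    The pmf at k is built in log space (lgamma) to avoid overflow at large n,
    then extended with the recurrence pmf(i+1) = pmf(i) * (n-i)/(i+1) * p/(1-p).
    """
    if k <= 0:
        return 1.0
    q = 1.0 - p
    log_pmf = (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
               + k * log(p) + (n - k) * log(q))
    pmf, total = exp(log_pmf), 0.0
    for i in range(k, n + 1):
        total += pmf
        if pmf < total * 1e-17:
            break                     # remaining tail is negligible
        pmf *= (n - i) / (i + 1) * p / q
    return min(total, 1.0)

def above_background(depth, variant_reads, error_rate=0.0021, alpha=1e-3):
    """Flag a minority variant only when its read count is implausible under
    the per-site UDS error rate (0.21% reported in the study)."""
    return binom_sf(depth, variant_reads, error_rate) < alpha
```

By this criterion a site with 100 variant reads out of 10,000 stands far above background, whereas 22 of 10,000 (0.22%) is indistinguishable from the expected error.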
Phylogeography can improve the understanding of local and worldwide HIV epidemics, including the migration of subepidemics across national borders. We analyzed HIV-1 sequences sampled from Mexico and San Diego, California to determine the relatedness of these epidemics. We sampled the HIV epidemics in (1) Mexico by downloading all publicly available HIV-1 pol sequences from antiretroviral-naive individuals in GenBank (n = 100) and generating similar sequences from cohorts of injection drug users and female sex workers in Tijuana, Mexico (n = 27) and (2) in San Diego, California by pol sequencing well-characterized primary (n = 395) and chronic (n = 267) HIV infection cohorts. Estimates of population structure (FST), genetic distance cluster analysis, and a cladistic measure of migration events (Slatkin–Maddison test) were used to assess the relatedness of the epidemics. Both a test of population differentiation (FST = 0.06; p < 0.01) and a cladistic estimate of migration events (84 migrations, p < 0.01) indicated that the Tijuana and San Diego epidemics were not freely mixing. A conservative cluster analysis identified 72 clusters (two or more sequences), with two clusters containing both Mexican and San Diego sequences (permutation p < 0.01). Analysis of this very large dataset of HIV-1 sequences suggested that the HIV-1 epidemics in San Diego, California and Tijuana, Mexico are distinct. Larger epidemiological studies are needed to quantify the magnitude and associations of cross-border mixing.
Background
Nucleic acid testing (NAT) in routine HIV testing programs can increase the detection of infected individuals, but the most effective implementation of NAT remains unclear.
Objective
To determine how many HIV cases can be identified with NAT and how many persons can be contacted, to identify predictors of acute and early HIV infection, and to test reporting of negative results by automated Internet and voicemail systems.
Setting
San Diego County, California.
Participants
Persons seeking HIV testing.
Measurements
Rates and predictors of HIV infection by stage, notification of positive NAT results, use of automated Internet or voicemail systems to access negative NAT results, and estimated HIV infections prevented.
Results
Of 3151 persons tested, 79 had newly diagnosed HIV infection: 64 had positive results on rapid HIV testing, and 15 had positive results only by NAT (that is, NAT increased the HIV detection yield by 23%). Of all HIV infections, 44% (in 35 persons) were in the acute and early stages. Most participants (56%) and most persons with HIV (91%) were men who have sex with men (MSM). All persons with NAT-positive results were notified within 1 week. Of the 3070 uninfected persons, 2105 (69%) retrieved their negative NAT results, 1358 of them using the Internet system. After adjustment for covariates, persons reporting MSM behavior, higher incomes, younger age, no testing at substance abuse rehabilitation centers, no recent syphilis, and no methamphetamine use were more likely to access negative NAT results by either the Internet or voicemail system.
Limitations
Findings may not be generalizable to other populations and testing programs.
Conclusion
Nucleic acid testing programs that include automated systems for result reporting can increase case yield, especially in settings that serve MSM.
Primary Funding Source
California HIV/AIDS Research Program and the National Institutes of Health.
As described elsewhere in this supplement, developing effective methods to prevent human immunodeficiency virus (HIV) infection has proven more challenging than developing effective treatment for the disease. New strategies to control the HIV epidemic are urgently needed, and this urgency has generated interest in using antiretroviral treatment in combination with other modalities to control the epidemic. This article summarizes current knowledge of prevention modalities in the context of the drivers of the HIV epidemic in specific communities, describes challenges in investigating test-and-treat strategies, and proposes research directions for addressing these challenges in order to evaluate the impact of prevention strategies on epidemic mitigation.
When antiretroviral therapy does not fully suppress HIV replication, suboptimal levels of antiretrovirals can select for antiretroviral resistant variants of HIV. These variants may exhibit reduced replication capacity and result in lower viral loads in blood. Our study evaluated whether antiretroviral resistance was associated with lower viral loads in the cerebrospinal fluid (CSF) and better neuropsychological (NP) performance.
We enrolled 94 participants, each of whom underwent a comprehensive neuromedical evaluation that included structured clinical assessment of medical history, ART and other medication use, comprehensive NP testing, and neurological and general physical signs of disease. Blood was collected by venipuncture, and all participants were offered lumbar puncture. Univariate and multivariate statistical methods were used to analyze the relationships among antiretroviral resistance, blood and CSF HIV RNA levels, substance use, and NP performance.
Antiretroviral resistance, detected in blood, was associated with lower CSF viral loads (p<0.01) and better NP performance (p=0.04) in multivariate analyses, independent of past and current ARV use and blood viral loads (Model: p< 0.01). However, HIV RNA levels in CSF did not independently correlate with NP performance. Low viral loads in the CSF limited our ability to investigate the relationship between antiretroviral resistance detected in CSF and NP performance.
Even in the absence of ART, antiretroviral resistance-associated mutations correlate with better NP performance possibly because these mutations reflect reduced neurovirulence compared with wild-type HIV.
Although it is known that most HIV-1 infections worldwide result from exposure to virus in semen, it has not yet been established whether transmitted strains originate as RNA virions in seminal plasma or as integrated proviral DNA in infected seminal leukocytes. We present phylogenetic evidence that among six transmitting pairs of men who have sex with men, blood plasma virus in the recipient is consistently more closely related to the seminal plasma virus in the source. All sequences were subtype B, and the env C2V3 of transmitted variants tended to have higher mean isoelectric points, contain potential N-linked glycosylation sites, and favor CCR5 co-receptor usage. A statistically robust phylogenetically corrected analysis did not detect genetic signatures reliably associated with transmission, but further investigation of larger samples of transmitting pairs holds promise for determining which structural and genetic features of viral genomes are associated with transmission.
Pooling strategies have been used to reduce the costs of polymerase chain reaction-based screening for acute HIV infection in populations in which the prevalence of acute infection is low (less than 1%). Only limited research has been done for conditions in which the prevalence of screening positivity is higher (greater than 1%).
Methods and Results
We present data on a variety of pooling strategies that incorporate the use of polymerase chain reaction-based quantitative measures to monitor for virologic failure among HIV-infected patients receiving antiretroviral therapy. For a prevalence of virologic failure between 1% and 25%, we demonstrate relative efficiency and accuracy of various strategies. These results could be used to choose the best strategy based on the requirements of individual laboratory and clinical settings such as required turnaround time of results and availability of resources.
Virologic monitoring during antiretroviral therapy is not currently being performed in many resource-constrained settings largely because of costs. The presented pooling strategies may be used to significantly reduce the cost compared with individual testing, make such monitoring feasible, and limit the development and transmission of HIV drug resistance in resource-constrained settings. They may also be used to design efficient pooling strategies for other settings with quantitative screening measures.
AIDS; efficiency; matrix
Most of our knowledge about how antiretrovirals and host immune responses influence the HIV-1 protease gene is derived from studies of subtype B virus. We investigated the effect of protease resistance-associated mutations (PRAMs) and population-based HLA haplotype frequencies on polymorphisms found in CRF01_AE pro.
We used all CRF01_AE protease sequences retrieved from the LANL database and obtained regional HLA frequencies from the dbMHC database. Polymorphisms and major PRAMs in the sequences were identified using the Stanford HIV Drug Resistance Database, and we performed phylogenetic and selection analyses using HyPhy. HLA binding affinities were estimated using the Immune Epitope Database and Analysis Resource.
Overall, 99% of CRF01_AE sequences had at least one polymorphism and 10% had at least one major PRAM. Three polymorphisms (L10V, K20R/M/I and I62V) were associated with the presence of a major PRAM (P < 0.05). Compared to the subtype B consensus, six additional polymorphisms (I13V, E35D, M36I, R41K, H69K, L89M) were identified in the CRF01_AE consensus; all but L89M were located within epitopes recognized by HLA class I alleles. Of the predominant HLA haplotypes in the Asian regions of CRF01_AE origin, 80% were positively associated with the observed polymorphisms, and HLA binding affinity was estimated to decrease 19- to 40-fold with the observed polymorphisms at positions 35, 36 and 41.
Polymorphisms in the CRF01_AE protease gene were common; polymorphisms at residues 10, 20 and 62 most likely reflect selection by protease inhibitor use, whereas R41K and H69K were more likely attributable to recognition of epitopes by the HLA haplotypes of the host population.
CRF01_AE; HIV; HLA; polymorphisms; protease; resistance
To identify a pre-HAART gene expression signature in peripheral blood mononuclear cells (PBMCs) predictive of CD4+ T-cell recovery during HAART in HIV-infected individuals.
This retrospective study evaluated PBMC gene expression in 24 recently HIV-infected individuals before the initiation of HAART to identify genes whose expression is predictive of CD4+ T-cell recovery after 48 weeks of HAART.
The change in CD4+ T-cell count (ΔCD4) over the 48-week study period was calculated for each of the 24 participants. Twelve participants were assigned to the ‘good’ (ΔCD4 ≥ 200 cells/μl) and 12 to the ‘poor’ (ΔCD4 < 200 cells/μl) CD4+ T-cell recovery group. Gene expression profiling of the entire transcriptome using Illumina BeadChips was performed with PBMC samples obtained before HAART. Gene expression classifiers capable of predicting CD4+ T-cell recovery group (good vs. poor), as well as the specific ΔCD4 value, at week 48 were constructed using methods of Class Prediction.
The expression of 40 genes in PBMC samples taken before HAART predicted CD4+ T-cell recovery group (good vs. poor) at week 48 with 100% accuracy. The expression of 22 genes predicted a specific ΔCD4 value for each HIV-infected individual that correlated well with actual values (R = 0.82). Predicted ΔCD4 values were also used to assign individuals to good vs. poor CD4+ T-cell recovery groups with 79% accuracy.
Gene expression in PBMCs can be used as biomarkers to successfully predict disease outcomes among HIV-infected individuals treated with HAART.
CD4; gene expression; HIV; immune reconstitution; pathogenesis; prognosis
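As a toy illustration of the class-prediction step (not the study's actual classifier, gene list, or Illumina pipeline), a nearest-centroid rule assigns a new pre-HAART expression profile to whichever recovery group's mean profile it is closest to:

```python
def centroid(profiles):
    """Per-gene mean of a list of equally sized expression profiles."""
    n = len(profiles)
    return [sum(p[i] for p in profiles) / n for i in range(len(profiles[0]))]

def predict_recovery_group(train, profile):
    """train maps a group label ('good'/'poor') to its training profiles;
    the new profile is assigned to the nearest class centroid (Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    centroids = {group: centroid(ps) for group, ps in train.items()}
    return min(centroids, key=lambda g: dist2(centroids[g], profile))
```

In practice such classifiers are evaluated with cross-validation, as reflected in the accuracy figures reported above.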
We examined neurocognitive functioning among persons with acute or early HIV infection (AEH) and hypothesized that the neurocognitive performance of AEH individuals would be intermediate between HIV seronegatives (HIV−) and those with chronic HIV infection. Comprehensive neurocognitive testing was accomplished with 39 AEH, 63 chronically HIV infected, and 38 HIV− participants. All AEH participants were HIV infected for less than 1 year. Average domain deficit scores were calculated in seven neurocognitive domains. HIV−, AEH, and chronically HIV infected groups were ranked from best (rank of 1) to worst (rank of 3) in each domain. All participants received detailed substance use, neuromedical, and psychiatric evaluations, and HIV infected persons provided information on antiretroviral treatment and completed laboratory evaluations including plasma and CSF viral loads. A nonparametric test of ordered alternatives (the Page test), with appropriate nonparametric follow-up tests, was used to evaluate the level of neuropsychological (NP) functioning across and between groups. The median duration of infection for the AEH group was 16 weeks [interquartile range, IQR: 10.3–40.7] as compared to 4.9 years [2.8–11.1] in the chronic HIV group. A Page test using ranks of average scores in the seven neurocognitive domains showed a significant monotonic trend with the best neurocognitive functioning in the HIV− group (mean rank = 1.43), intermediate neurocognitive functioning in the AEH group (mean rank = 1.71), and the worst in the chronically HIV infected (mean rank = 2.86; L statistic = 94, p < 0.01). Post-hoc testing comparing each group against the others, however, showed that the chronically infected group differed significantly from both the HIV− and AEH groups in neurocognitive performance, whereas the AEH group was statistically indistinguishable from the HIV− group.
Regression models among HIV infected participants were unable to identify significant predictors of neurocognitive performance. Neurocognitive functioning was worst among persons with chronic HIV infection. Although a significant monotonic trend existed and patterns of the data suggest the AEH individuals may fall intermediate to HIV− and chronic participants, we were not able to statistically confirm this hypothesis.
HIV infection; HIV-associated neurocognitive disorders; Acute or early HIV; Primary HIV
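Page's L statistic is simple to compute from the per-domain ranks. The rank matrix below is illustrative, constructed only so that its column sums (10, 12, 20) match the reported mean ranks (10/7 ≈ 1.43, 12/7 ≈ 1.71, 20/7 ≈ 2.86 over seven domains) and reproduce L = 94:

```python
def page_L(ranks):
    """Page's L statistic: ranks has one row per block (NP domain); columns
    are the groups in the hypothesized order (HIV-, AEH, chronic), each row
    containing the within-block ranks 1..k. L = sum_j j * (rank sum of column j)."""
    k = len(ranks[0])
    col_sums = [sum(row[j] for row in ranks) for j in range(k)]
    return sum((j + 1) * s for j, s in enumerate(col_sums))

# Illustrative per-domain ranks, chosen to match the reported column sums
domain_ranks = [[1, 2, 3]] * 5 + [[2, 1, 3], [3, 1, 2]]
```

Here L = 1·10 + 2·12 + 3·20 = 94, the value reported above; the maximum possible L for 7 blocks and 3 groups is 98, reached when every domain follows the hypothesized order.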
Typically, population-based sequencing of HIV does not detect minority variants present at levels below 20-30%. Single genome amplification (SGA) and sequencing improves detection, but it requires many PCRs to find the optimal terminal dilution. A novel method for guiding the selection of a terminal dilution was developed and compared to the standard approach. A quantitative real-time PCR (qRT-PCR) protocol was developed: HIV RNA was extracted, reverse transcribed, and quantified, and a web-based bioinformatics application was created to calculate the optimal cDNA concentration from the results of a single trial PCR at the dilution suggested by the qRT-PCR. Using the standard protocol, the mean number of PCRs needed to yield an average of 30 (26-34, SD=3) SGAs per sample was 245 (218-266, SD=20), after an average of 8 trial dilutions. Using the new method, 135 PCRs (135-135, SD=0) produced 30 (27-30, SD=1) SGAs using exactly two dilutions, reducing turnaround time from 8 to 2 days.
Standard methods of SGA sequencing can be costly and both time- and labor-intensive. By choosing a terminal dilution concentration with the proposed method, the number of PCRs required is decreased and efficiency improved.
Single Genome Amplification; quantitative real-time PCR; population-based sequencing; single genome sequencing; minority variant detection
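The dilution-selection step rests on Poisson statistics: if wells are seeded so that only about 30% of PCRs are positive, most positive wells contain a single template. A sketch of that arithmetic follows; the function names and the per-reaction volume parameter are ours, not the published application.

```python
from math import exp, log

def dilution_for_positive_rate(cdna_copies_per_ul, target_positive=0.30,
                               ul_per_reaction=1.0):
    """Fold-dilution of a quantified cDNA stock so that the expected fraction
    of positive PCRs is target_positive (Poisson: P(positive) = 1 - e^-lam)."""
    lam = -log(1.0 - target_positive)      # expected templates per reaction
    target_conc = lam / ul_per_reaction    # templates per microliter
    return cdna_copies_per_ul / target_conc

def prob_single_given_positive(target_positive=0.30):
    """P(exactly one template | well is positive) under the same Poisson model."""
    lam = -log(1.0 - target_positive)
    return lam * exp(-lam) / target_positive
```

At a 30% positive rate, about 83% of positive wells carry a single template, which is the usual statistical justification for terminal-dilution SGA.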
Pooling clinical specimens reduces the number of assays needed when screening for infectious diseases. Polymerase chain reaction (PCR)-based assays are the most sensitive tests for diagnosing malaria, but their high cost limits their use. We adapted a pooling platform to reduce the number of assays needed to detect malaria infection. To evaluate this platform, two sets of 100 serum samples, with 1% and 5% malaria prevalence, were tested. DNA extracted from pooled samples was amplified by malaria-specific PCR. Additional validation was performed by determining the PCR detection limit at 1:10 and 1:100 dilutions. The platform correctly detected all malaria samples in both test matrices. The use of stored serum samples also has important implications for studies investigating malaria prevalence rates retrospectively. Field studies using serum and whole-blood specimens are needed to validate this technique for clinical use.
A novel combination of three codon inserts in the pol coding region of HIV-1 RNA was identified in a highly antiretroviral-experienced study subject with HIV-1 infection. A one-codon insert was observed in the protease region between codons 40 and 41, together with a two-codon insert in the reverse transcriptase region at codon 69.