Higher plasma D-dimer levels are strong predictors of mortality in HIV+ individuals. The factors associated with D-dimer levels during HIV infection, however, remain poorly understood.
In this cross-sectional study, participants in three randomized controlled trials with measured D-dimer levels were included (N = 9,848). Factors associated with D-dimer were identified by linear regression. Covariates investigated were: age, gender, race, body mass index, nadir and baseline CD4+ count, plasma HIV RNA levels, markers of inflammation (C-reactive protein [CRP], interleukin-6 [IL-6]), antiretroviral therapy (ART) use, ART regimens, co-morbidities (hepatitis B/C, diabetes mellitus, prior cardiovascular disease), smoking, renal function (estimated glomerular filtration rate [eGFR] and cystatin C) and cholesterol.
Women in all age groups had higher D-dimer levels than men, though a steeper increase of D-dimer with age occurred in men. Hepatitis B/C co-infection was the only co-morbidity associated with higher D-dimer levels. In this subgroup, the degree of hepatic fibrosis, as demonstrated by higher hyaluronic acid levels, but not viral load of hepatitis viruses, was positively correlated with D-dimer. Other factors independently associated with higher D-dimer levels were black race, higher plasma HIV RNA levels, being off ART at baseline, and increased levels of CRP, IL-6 and cystatin C. In contrast, higher baseline CD4+ counts and higher high-density lipoprotein cholesterol were negatively correlated with D-dimer levels.
D-dimer levels increase with age in HIV+ men, but are already elevated in women at an early age due to reasons other than a higher burden of concomitant diseases. In hepatitis B/C co-infected individuals, hepatic fibrosis, but not hepatitis viral load, was associated with higher D-dimer levels.
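The covariate screen described in the abstract above can be illustrated with a minimal ordinary-least-squares sketch. This is a hedged toy, not the study's analysis: the data are simulated, and the log-transformation of D-dimer and the three covariates shown are illustrative assumptions.

```python
import random

# Toy OLS via the normal equations; data and coefficients are simulated,
# not the study's. Outcome: log D-dimer; covariates: age, sex, IL-6.

def ols(X, y):
    """Solve (X'X) beta = X'y by Gaussian elimination (no pivoting)."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for p in range(k):                      # forward elimination
        for q in range(p + 1, k):
            f = A[q][p] / A[p][p]
            for r in range(p, k):
                A[q][r] -= f * A[p][r]
            b[q] -= f * b[p]
    beta = [0.0] * k
    for p in reversed(range(k)):            # back-substitution
        beta[p] = (b[p] - sum(A[p][q] * beta[q]
                              for q in range(p + 1, k))) / A[p][p]
    return beta

random.seed(1)
X, y = [], []
for _ in range(500):
    age = random.uniform(20, 70)
    female = random.randint(0, 1)
    il6 = random.uniform(0.5, 10.0)         # pg/mL, simulated
    # simulated truth: women higher at all ages; rises with age and IL-6
    y.append(0.5 + 0.01 * age + 0.2 * female + 0.05 * il6
             + random.gauss(0, 0.1))
    X.append([1.0, age, female, il6])

beta = ols(X, y)   # [intercept, age, female, IL-6] coefficients
```

With enough observations the recovered coefficients approach the simulated ones; in the real analysis, terms for race, CD4+ count, ART status and the other listed covariates would enter the design matrix in the same way.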
The interaction of positive and negative relationships (i.e. I like you, but you dislike me – referred to as relational dissonance) is an underexplored phenomenon. Further, it is often only poor (or negative) mental health that is examined in relation to social networks, with little regard for positive psychological wellbeing. Finally, these issues are compounded by methodological constraints. This study explores a new concept of relational dissonance alongside mutual antipathies and friendships and their association with mental health, using multivariate exponential random graph models with an Australian sample of secondary school students. Results show that male students with relationally dissonant ties have lower scores on positive mental health measures. Girls with relationally dissonant ties have lower depressed mood, but girls targeted by negative ties are more likely to have depressed mood. These findings have implications for the development of interventions focused on promoting adolescent wellbeing, and for consideration of the appropriate measurement of wellbeing and mental illness.
Introduction: Abnormalities in the neurophysiological measures P300 amplitude and latency constitute endophenotypes for psychosis. Disrupted-in-Schizophrenia-1 (DISC1) has been proposed as a promising susceptibility gene for schizophrenia, and a previous study has suggested that it is associated with P300 deficits in schizophrenia. Methods: We examined the role of variation in DISC1 polymorphisms on the P300 endophenotype in a large sample of patients with schizophrenia or psychotic bipolar disorder (n = 149), their unaffected relatives (n = 130), and unrelated healthy controls (n = 208) using linear regression and haplotype analysis. Results: Significant associations between P300 amplitude and latency and DISC1 polymorphisms/haplotypes were found. Those homozygous for the A allele of single-nucleotide polymorphism (SNP) rs821597 displayed significantly reduced P300 amplitudes in comparison with those homozygous for the G allele (P = .009) and the heterozygous group (P = .018). Haplotype analysis showed a significant association between DISC1 haplotypes (rs3738401|rs6675281|rs821597|rs821616|rs967244|rs980989) and P300 latency. Haplotypes GCGTCG and ACGTTT were associated with shorter latencies. Discussion: The P300 waveform appears to be modulated by variation in individual SNPs and haplotypes of DISC1. Because DISC1 is involved in neurodevelopment, one hypothesis is that disruption in neural connectivity impairs the cognitive processes reflected in the P300 deficits observed in this sample.
psychosis; schizophrenia; bipolar disorder; EEG; ERP; P300; DISC1; endophenotype; neurophysiology; family study; haplotype analysis; biomarker
Accurate models of proprioceptive neural patterns could one day play an important role in the creation of an intuitive proprioceptive neural prosthesis for amputees. This paper looks at combining efficient implementations of biomechanical and proprioceptor models in order to generate signals that mimic human muscular proprioceptive patterns for future experimental work in prosthesis feedback. A neuro-musculoskeletal model of the upper limb with 7 degrees of freedom and 17 muscles is presented, which generates real-time estimates of muscle spindle and Golgi tendon organ neural firing patterns. Unlike in previous neuro-musculoskeletal models, muscle activation and excitation levels are unknowns in this application, and an inverse dynamics tool (static optimization) is integrated to estimate these variables. A proprioceptive prosthesis will need to be portable, which is incompatible with the computationally demanding nature of standard biomechanical and proprioceptor modeling. This paper uses and proposes a number of approximations and optimizations to make real-time operation on portable hardware feasible. Finally, technical obstacles to mimicking natural feedback for an intuitive proprioceptive prosthesis, as well as issues and limitations with existing models, are identified and discussed.
proprioceptive feedback; neuroprosthesis; neuromusculoskeletal model; upper limb; biomechanics; muscle spindles; Golgi tendon organ; static optimization
Following mucosal human immunodeficiency virus type 1 (HIV-1) transmission, type 1 interferons (IFNs) are rapidly induced at sites of initial virus replication in the mucosa and draining lymph nodes. However, the role played by IFN-stimulated antiviral activity in restricting HIV-1 replication during the initial stages of infection is not clear. We hypothesized that if type 1 IFNs exert selective pressure on HIV-1 replication in the earliest stages of infection, the founder viruses that succeed in establishing systemic infection would be more IFN-resistant than viruses replicating during chronic infection, when type 1 IFNs are produced at much lower levels. To address this hypothesis, the relative resistance of virus isolates derived from HIV-1-infected individuals during acute and chronic infection to control by type 1 IFNs was analysed.
The replication of plasma virus isolates generated from subjects acutely infected with HIV-1 and molecularly cloned founder HIV-1 strains could be reduced but not fully suppressed by type 1 IFNs in vitro. The mean IC50 value for IFNα2 (22 U/ml) was lower than that for IFNβ (346 U/ml), although at maximally-inhibitory concentrations both IFN subtypes inhibited virus replication to similar extents. Individual virus isolates exhibited differential susceptibility to inhibition by IFNα2 and IFNβ, likely reflecting variation in resistance to differentially up-regulated IFN-stimulated genes. Virus isolates from subjects acutely infected with HIV-1 were significantly more resistant to in vitro control by IFNα than virus isolates generated from the same individuals during chronic, asymptomatic infection. Viral IFN resistance declined rapidly after the acute phase of infection: in five subjects, viruses derived from six-month consensus molecular clones were significantly more sensitive to the antiviral effects of IFNs than the corresponding founder viruses.
The establishment of systemic HIV-1 infection by relatively IFNα-resistant founder viruses lends strong support to the hypothesis that IFNα plays an important role in the control of HIV-1 replication during the earliest stages of infection, prior to systemic viral spread. These findings suggest that it may be possible to harness the antiviral activity of type 1 IFNs in prophylactic and potentially also therapeutic strategies to combat HIV-1 infection.
Human immunodeficiency virus type 1; Type 1 interferon; Viral inhibition; Founder virus; Acute infection
Therapies to decrease immune activation might be of benefit in slowing HIV disease progression.
To determine whether hydroxychloroquine decreases immune activation and slows CD4 cell decline.
Design, Setting, and Patients
Randomized, double-blind, placebo-controlled trial performed at 10 HIV outpatient clinics in the United Kingdom between June 2008 and February 2011. The 83 patients enrolled had asymptomatic HIV infection, were not taking antiretroviral therapy, and had CD4 cell counts greater than 400 cells/μL.
Hydroxychloroquine, 400 mg, or matching placebo once daily for 48 weeks.
Main Outcome Measures
The primary outcome measure was change in the proportion of activated CD8 cells (measured by the expression of CD38 and HLA-DR surface markers), with CD4 cell count and HIV viral load as secondary outcomes. Analysis was by intention to treat using mixed linear models.
There was no significant difference in CD8 cell activation between the 2 groups (−4.8% and −4.2% in the hydroxychloroquine and placebo groups, respectively, at week 48; difference, −0.6%; 95% CI, −4.8% to 3.6%; P=.80). Decline in CD4 cell count was greater in the hydroxychloroquine than placebo group (−85 cells/μL vs −23 cells/μL at week 48; difference, −62 cells/μL; 95% CI, −115 to −8; P=.03). Viral load increased in the hydroxychloroquine group compared with placebo (0.61 log10 copies/mL vs 0.23 log10 copies/mL at week 48; difference, 0.38 log10 copies/mL; 95% CI, 0.13 to 0.63; P=.003). Antiretroviral therapy was started in 9 patients in the hydroxychloroquine group and 1 in the placebo group. Trial medication was well tolerated, but more patients reported influenza-like illness in the hydroxychloroquine group compared with the placebo group (29% vs 10%; P=.03).
Among HIV-infected patients not taking antiretroviral therapy, the use of hydroxychloroquine compared with placebo did not reduce CD8 cell activation but did result in a greater decline in CD4 cell count and increased viral replication.
isrctn.org Identifier: ISRCTN30019040
There are few data on the persistence of individual human immunodeficiency virus type 1 (HIV-1) transmitted drug resistance (TDR) mutations in the absence of selective drug pressure. We studied 313 patients in whom TDR mutations were detected at their first resistance test and who had a subsequent test performed while ART-naive. The rate at which mutations became undetectable was estimated using exponential regression accounting for interval censoring. Most thymidine analogue mutations (TAMs) and T215 revertants (but not T215F/Y) were found to be highly stable, with NNRTI and PI mutations being relatively less persistent. Our estimates are important for informing HIV transmission models.
persistence; transmitted; HIV-1; resistance; mutations
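The "exponential regression accounting for interval censoring" used in the persistence abstract above can be sketched in a few lines. This is a hedged illustration: the likelihood is the standard interval-censored exponential one, but the grid-search fitter and the observation times are invented, not the study's code or data.

```python
import math

# Each mutation contributes an interval (L, R): last test at which it was
# detected (time L) and first test at which it was absent (time R).
# R = None means it was still detectable at the last test (right-censored).

def log_likelihood(rate, intervals):
    ll = 0.0
    for L, R in intervals:
        if R is None:
            ll += -rate * L                      # survival S(L)
        else:                                    # lost in (L, R]: S(L) - S(R)
            ll += math.log(math.exp(-rate * L) - math.exp(-rate * R))
    return ll

def fit_rate(intervals):
    # one-parameter model: a simple grid-search MLE is adequate here
    grid = [i / 1000 for i in range(1, 2001)]
    return max(grid, key=lambda r: log_likelihood(r, intervals))

# invented follow-up times in years: three mutations lost between tests,
# two still detectable at the last test
data = [(0.5, 1.2), (1.0, 2.5), (0.3, 0.9), (2.0, None), (3.1, None)]
rate = fit_rate(data)
median_persistence = math.log(2) / rate          # years until half are lost
```

A highly stable mutation (e.g. a TAM or T215 revertant) would show up in this framework as a very small fitted rate, i.e. a long median persistence time.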
To determine protease mutations that develop at viral failure for protease inhibitor (PI)-naive patients on a regimen containing the PI atazanavir.
Resistance tests on patients failing atazanavir, conducted as part of routine clinical care in a multicentre observational study, were randomly matched by subtype to resistance tests from PI-naive controls to account for natural polymorphisms. Mutations from the consensus B sequence across the protease region were analysed for association and defined using the IAS-USA 2011 classification list.
Four hundred and five of 2528 (16%) patients failed therapy containing atazanavir as a first PI over a median (IQR) follow-up of 1.76 (0.84–3.15) years and 322 resistance tests were available for analysis. Recognized major atazanavir mutations were found in six atazanavir-experienced patients (P < 0.001), including I50L and N88S. The minor mutations most strongly associated with atazanavir experience were M36I, M46I, F53L, A71V, V82T and I85V (P < 0.05). Multiple novel mutations, I15S, L19T, K43T, L63P/V, K70Q, V77I and L89I/T/V, were also associated with atazanavir experience.
Viral failure on atazanavir-containing regimens was not common and major resistance mutations were rare, suggesting that adherence may be a major contributor to viral failure. Novel mutations were described that have not been previously documented.
HIV; drug resistance mutations; naive patients; protease inhibitors; virological failure
Rare Earth Elements (REE) are essential to modern society, but the origins of many large REE deposits remain unclear. The U-Th-Pb ages, chemical compositions and C, O and Mg isotopic compositions of Bayan Obo, the world's largest REE deposit, indicate a protracted mineralisation history with unusual chemical and isotopic features. Coexisting calcite and dolomite are in O isotope disequilibrium; some calcitic carbonatite samples show highly varied δ26Mg, which increases with increasing Si and Mg; and ankerite crystals show decreases in Fe and REE from rim to centre, with highly varied REE patterns. These and many other observations are consistent with an unusual mineralisation process not previously considered: protracted fluxing of calcitic carbonatite by subduction-released high-Si fluids during the closure of the Palaeo-Asian Ocean. The fluids leached Fe and Mg from the mantle wedge and scavenged REE, Nb and Th from carbonatite, forming the deposit through metasomatism of overlying sedimentary carbonate.
Shiga toxin-producing Escherichia coli (STEC) O157:H7 is the causative agent of more than 96,000 cases of diarrheal illness and 3,200 infection-attributable hospitalizations annually in the United States.
Materials and Methods
We defined a confirmed case as a compatible illness in a person with the outbreak strain during 10/07/2011-11/30/2011. Investigation included hypothesis generation, a case-control study utilizing geographically-matched controls, and a case series investigation. Environmental inspections and tracebacks were conducted.
We identified 58 cases in 10 states; 67% were hospitalized and 6.4% developed hemolytic uremic syndrome. Any romaine consumption was significantly associated with illness (matched Odds Ratio (mOR) = 10.0, 95% Confidence Interval (CI) = 2.1–97.0). Grocery Store Chain A salad bar was significantly associated with illness (mOR = 18.9, 95% CI = 4.5–176.8). Two separate traceback investigations for romaine lettuce converged on Farm A. Case series results indicate that cases (64.9%) were more likely than the FoodNet population (47%) to eat romaine lettuce (p-value = 0.013); 61.3% of cases reported consuming romaine lettuce from the Grocery Store Chain A salad bar.
This multistate outbreak of STEC O157:H7 infections was associated with consumption of romaine lettuce. Traceback analysis determined that a single common lot of romaine lettuce harvested from Farm A was used to supply Grocery Store Chain A and a university campus linked to a case with the outbreak strain. An investigation at Farm A did not identify the source of contamination. Improved ability to trace produce from the growing fields to the point of consumption will allow more timely prevention and control measures to be implemented.
To investigate the cost and effects of a single-pill versus two- or three-pill first-line antiretroviral combinations in reducing viral load and increasing CD4 counts, and the first-line failure rates associated with the respective regimens at 6 and 12 months.
Patients on first-line TDF+3TC+EFV, TDF+FTC+EFV, Truvada®+EFV or Atripla® between 1996 and 2008 were identified, and viral load and CD4 counts were measured at baseline, six and twelve months respectively. Factors that independently predicted treatment failure at six and twelve months were derived using multivariate Cox proportional hazards regression analyses. Use and cost of hospital services were calculated at six and twelve months respectively.
All regimens reduced viral load to below the limit of detection, and CD4 counts increased to similar levels at six and twelve months for all treatment regimens. No statistically significant differences were observed in the rate of treatment failure at six and twelve months. People on Atripla® generated lower healthcare costs for non-AIDS patients, at £5,340 (£5,254 to £5,426) per patient-semester and £9,821 (£9,719 to £9,924) per patient-year, which was £1,344 (95%CI £1,222 to £1,465) less per patient-semester and £1,954 (95%CI £1,801 to £2,107) less per patient-year compared with Truvada®+EFV; healthcare costs for AIDS patients were similar across all regimens.
The single-pill regimen is as effective as the two- and three-pill regimens of the same drugs, but if started as first-line induction therapy it would save 20% of healthcare costs at six months and 17% at twelve months compared with Truvada®+EFV, the regimen that generated the next-lowest costs.
Pharmacists are viewed as highly trained yet underutilised, and there is growing support to extend the role of the pharmacist within the primary health care sector. The integration of a pharmacist into a general practice medical centre is not a new concept; however, it is a novel approach in Australia, and evidence supporting this role is currently limited. This study aimed to describe the opinions of local stakeholders in South-East Queensland on the integration of a pharmacist into the Australian general practice environment.
A sample of general practitioners, health care consumers, pharmacists and practice managers in South-East Queensland were invited to participate in focus groups or semi-structured interviews. Seeding questions common to all sessions were used to facilitate discussion. Sessions were audio recorded and transcribed verbatim. Leximancer software was used to qualitatively analyse responses.
A total of 58 participants took part in five focus groups and eighteen semi-structured interviews. Concepts relating to six themes based on the seeding questions were identified. These included positively viewed roles such as medication reviews and prescribing, negatively viewed roles such as dispensing and diagnosing, barriers to pharmacist integration such as medical culture and remuneration, facilitators to pharmacist integration such as remuneration and training, benefits of integration such as access to the patient’s medical file, and potential funding models.
These findings and future research may aid the development of a new model of integrated primary health care services involving pharmacist practitioners.
To calculate the use, cost and cost-effectiveness of routine treatment and care for people living with HIV (PLHIV) before starting combination antiretroviral therapy (cART), and for PLHIV starting first-line 2NRTIs+NNRTI or 2NRTIs+PIboosted, comparing PLHIV with CD4≤200 cells/mm3 and CD4>200 cells/mm3. Few studies have calculated the use, cost and cost-effectiveness of routine treatment and care before starting cART, or of starting cART above and below CD4 200 cells/mm3.
Use, costs and cost-effectiveness were calculated for PLHIV in routine pre-cART care and starting first-line cART, comparing CD4≤200 cells/mm3 with CD4>200 cells/mm3 (2008 UK prices).
cART-naïve patients with CD4≤200 cells/mm3 had an annual cost of £6,407 (95%CI £6,382 to £6,425) PPY compared with £2,758 (95%CI £2,752 to £2,761) PPY for those with CD4>200 cells/mm3; the cost per life-year gained of pre-cART treatment and care for those with CD4>200 cells/mm3 was £1,776 (cost-saving to £2,752). The annual cost of starting 2NRTIs+NNRTI or 2NRTIs+PIboosted with CD4≤200 cells/mm3 was £12,812 (95%CI £12,685–£12,937) compared with £10,478 (95%CI £10,376–£10,581) for PLHIV with CD4>200 cells/mm3. The cost per additional life-year gained on first-line therapy for those with CD4>200 cells/mm3 was £4,639 (£3,967 to £2,960).
PLHIV starting to use HIV services before their CD4 count falls to ≤200 cells/mm3 can be monitored so that they start cART with a CD4 count >200 cells/mm3, which results in better outcomes and is cost-effective. However, 25% of PLHIV accessing services continue to present with CD4≤200 cells/mm3. This highlights the need to investigate the cost-effectiveness of testing and early treatment programs for key populations in the UK.
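The "cost per life-year gained" figures in the abstract above are incremental cost-effectiveness ratios (ICERs). A minimal sketch of the arithmetic, using invented placeholder numbers rather than the study's data:

```python
# ICER = incremental cost / incremental life-years; all numbers invented
def cost_per_life_year_gained(cost_new, cost_comparator, ly_new, ly_comparator):
    d_ly = ly_new - ly_comparator
    if d_ly <= 0:
        raise ValueError("strategy gains no life-years over the comparator")
    return (cost_new - cost_comparator) / d_ly

# e.g. earlier care costs £2,000 more per person and adds 0.8 life-years
icer = cost_per_life_year_gained(12_000, 10_000, 10.8, 10.0)  # £ per life-year
```

A negative incremental cost combined with positive life-years gained is the "cost-saving" case reported as the lower bound in the abstract.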
The rapid and continual viral escape from neutralizing antibodies is well documented in HIV-1 infection. Here we report in vivo emergence of viruses with heightened sensitivity to neutralizing antibodies, sometimes paralleling the development of neutralization escape.
Sequential viral envs were amplified from seven HIV-1 infected men monitored from seroconversion up to 5 years after infection. Env-recombinant infectious molecular clones were generated and tested for coreceptor use, macrophage tropism and neutralization sensitivity to homologous and heterologous serum, soluble CD4 and monoclonal antibodies IgG1b12, 2G12 and 17b. We found that HIV-1 evolves sensitivity to contemporaneous neutralizing antibodies during infection. Neutralization sensitive viruses grow out even when potent autologous neutralizing antibodies are present in patient serum. Increased sensitivity to neutralization was associated with susceptibility of the CD4 binding site or epitopes induced after CD4 binding, and mediated by complex envelope determinants including V3 and V4 residues. The development of neutralization sensitive viruses occurred without clinical progression, coreceptor switch or change in tropism for primary macrophages.
We propose that an interplay of selective forces for greater virus replication efficiency without the need to resist neutralizing antibodies in a compartment protected from immune surveillance may explain the temporal course described here for the in vivo emergence of HIV-1 isolates with high sensitivity to neutralizing antibodies.
CD8+ T cells play an important role in control of viral replication during acute and early human immunodeficiency virus type 1 (HIV-1) infection, contributing to containment of the acute viral burst and establishment of the prognostically-important persisting viral load. Understanding mechanisms that impair CD8+ T cell-mediated control of HIV replication in primary infection is thus of importance. This study addressed the relative extent to which HIV-specific T cell responses are impacted by viral mutational escape versus reduction in response avidity during the first year of infection.
Eighteen patients presenting with symptomatic primary HIV-1 infection, most of whom subsequently established moderate-to-high persisting viral loads, were studied. HIV-specific T cell responses were mapped in each individual, and responses to a subset of optimally-defined CD8+ T cell epitopes were followed from acute infection onwards to determine whether they were escaped or declined in avidity over time. During the first year of infection, sequence variation occurred in/around 26/33 epitopes studied (79%). In 82% of cases of intra-epitopic sequence variation, the mutation was confirmed to confer escape, although T cell responses were subsequently expanded to variant sequences in some cases. In contrast, < 10% of responses to index sequence epitopes declined in functional avidity over the same time-frame, and a similar proportion of responses actually exhibited an increase in functional avidity during this period.
Escape appears to constitute a much more important means of viral evasion of CD8+ T cell responses in acute and early HIV infection than decline in functional avidity of epitope-specific T cells. These findings support the design of vaccines to elicit T cell responses that are difficult for the virus to escape.
To calculate time to first-line treatment failure, annual cost and cost-effectiveness of NNRTI versus PIboosted first-line HAART regimens in the UK, 1996–2006.

Population costs for HIV services are increasing in the UK, and interventions need to be effective and efficient to reduce or stabilize costs. 2NRTIs+NNRTI regimens are cost-effective for first-line HAART, but these regimens have not been compared with first-line PIboosted regimens.

Times to first-line treatment failure and annual costs were calculated for first-line HAART regimens by CD4 count when starting HAART (2006 UK prices). Cost-effectiveness of 2NRTIs+NNRTI versus 2NRTIs+PIboosted regimens was calculated for four CD4 strata.

55% of 5,541 people living with HIV (PLHIV) started HAART with a CD4 count ≤200 cells/mm3, many of whom were Black Africans. Annual treatment cost decreased as CD4 count increased; the most marked differences were observed between starting HAART with CD4 ≤200 cells/mm3 and with CD4 >200 cells/mm3. 2NRTI+PIboosted and 2NRTI+NNRTI regimens were the most effective regimens across the four CD4 strata; 2NRTI+NNRTI was cost-saving or cost-effective compared with 2NRTI+PIboosted regimens.

To ensure more effective and efficient provision of HIV services, 2NRTI+NNRTI should be started as the first-line HAART regimen at CD4 counts ≤350 cells/mm3, unless specific contra-indications exist. This will increase the number of PLHIV receiving HAART and will initially increase the population costs of providing HIV services. However, starting PLHIV earlier on cost-effective regimens will maintain them in better health; they will use fewer health or social services, thereby generating fewer treatment and care costs, and will be able to remain socially and economically active members of society. This does raise a number of ethical issues, which will have to be acknowledged and addressed, especially in countries with limited resources.
Intervention with antiretroviral treatment (ART) and control of viral replication at the time of HIV-1 seroconversion may curtail cumulative immunological damage. We have therefore hypothesized that ART maintenance over a very prolonged period in HIV-1 seroconverters could induce an immuno-virological status similar to that of HIV-1 long-term non-progressors (LTNPs).
We have investigated a cohort of 20 HIV-1 seroconverters on long-term ART (LTTS) and compared it with a cohort of 15 LTNPs. Residual viral replication and reservoirs in peripheral blood, as measured by cell-associated HIV-1 RNA and DNA, respectively, were demonstrated to be similarly low in both cohorts. These two virologically matched cohorts were then comprehensively analysed by polychromatic flow cytometry for HIV-1-specific CD4+ and CD8+ T-cell functional profile in terms of cytokine production and cytotoxic capacity using IFN-γ, IL-2, TNF-α production and perforin expression, respectively. Comparable levels of highly polyfunctional HIV-1-specific CD4+ and CD8+ T-cells were found in LTTS and LTNPs, with low perforin expression on HIV-1-specific CD8+ T-cells, consistent with a polyfunctional/non-cytotoxic profile in a context of low viral burden.
Our results indicate that prolonged ART initiated at the time of HIV-1 seroconversion is associated with immuno-virological features which resemble those of LTNPs, strengthening the recent emphasis on the positive impact of early treatment initiation and paving the way for further interventions to promote virological control after treatment interruption.
The hallmark of chronic viral infections is a progressive exhaustion of antigen specific CD8+ T cells that leads to persisting viral replication. It is generally believed that exhaustion is a consequence of the accumulation of multiple inhibitory receptors on CD8+ T cells that makes them dysfunctional. Here we show that during human chronic HIV-1 infection a CD8+ T cell positive costimulatory pathway mediated by DNAM-1 is also disrupted. Thus, DNAM-1 downregulation on CD8+ T cells aggravates the impairment of CTL effector function in chronic HIV-1 infection.
HIV-1; exhaustion; co-stimulation
Non-neutralising antibodies to the envelope glycoprotein are elicited during acute HIV-1 infection and are abundant throughout the course of disease progression. Although these antibodies appear to have negligible effects on HIV-1 infection when assayed in standard neutralisation assays, they have the potential to exert either inhibitory or enhancing effects through interactions with complement and/or Fc receptors. Here we report that non-neutralising antibodies produced early in response to HIV-1 infection can enhance viral infectivity.
We investigated this complement-mediated antibody-dependent enhancement (C'-ADE) of early HIV infection by carrying out longitudinal studies with primary viruses and autologous sera derived sequentially from recently infected individuals, using a T cell line naturally expressing the complement receptor 2 (CR2; CD21). The C'-ADE was consistently observed and in some cases achieved infection-enhancing levels of greater than 350-fold, converting a low-level infection to a highly destructive one. C'-ADE activity declined as a neutralising response to the early virus emerged, but later virus isolates that had escaped the neutralising response demonstrated an increased capacity for enhanced infection by autologous antibodies. Moreover, sera with autologous enhancing activity were capable of C'-ADE of heterologous viral isolates, suggesting the targeting of conserved epitopes on the envelope glycoprotein. Ectopic expression of CR2 on cell lines expressing HIV-1 receptors was sufficient to render them sensitive to C'-ADE.
Taken together, these results suggest that non-neutralising antibodies to the HIV-1 envelope that arise during acute infection are not 'passive', but in concert with complement and complement receptors may have consequences for HIV-1 dissemination and pathogenesis.
In May 2008, PulseNet detected a multistate outbreak of Salmonella enterica serotype Saintpaul infections. Initial investigations identified an epidemiologic association between illness and consumption of raw tomatoes, yet cases continued. In mid-June, we investigated two clusters of outbreak strain infections in Texas among patrons of Restaurant A and two establishments of Restaurant Chain B to determine the outbreak's source.
We conducted independent case-control studies of Restaurant A and B patrons. Patients were matched to well controls by meal date. We conducted restaurant environmental investigations and traced the origin of implicated products. Forty-seven case-patients and 40 controls were enrolled in the Restaurant A study. Thirty case-patients and 31 controls were enrolled in the Restaurant Chain B study. In both studies, illness was independently associated with only one menu item, fresh salsa (Restaurant A: matched odds ratio [mOR], 37; 95% confidence interval [CI], 7.2–386; Restaurant B: mOR, 13; 95% CI 1.3–infinity). The only ingredient in common between the two salsas was raw jalapeño peppers. Cultures of jalapeño peppers collected from an importer that supplied Restaurant Chain B and serrano peppers and irrigation water from a Mexican farm that supplied that importer with jalapeño and serrano peppers grew the outbreak strain.
Jalapeño peppers, contaminated before arrival at the restaurants and served in uncooked fresh salsas, were the source of these infections. Our investigations, critical in understanding the broader multistate outbreak, exemplify an effective approach to investigating large foodborne outbreaks. Additional measures are needed to reduce produce contamination.
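The matched odds ratios (mORs) reported above come from matched case-control analyses. For 1:1 matching, the conditional maximum-likelihood mOR reduces to the ratio of exposure-discordant pairs, which can be sketched as follows; the pair counts are invented for illustration, not the outbreak data:

```python
# 1:1 matched case-control: only exposure-discordant pairs are informative
def matched_odds_ratio(pairs):
    """pairs: list of (case_exposed, control_exposed) booleans."""
    case_only = sum(1 for case, ctrl in pairs if case and not ctrl)
    ctrl_only = sum(1 for case, ctrl in pairs if ctrl and not case)
    if ctrl_only == 0:
        return float("inf")   # estimate unbounded above
    return case_only / ctrl_only

# invented pair counts: 12 case-exposed-only, 2 control-exposed-only,
# 9 concordant exposed, 7 concordant unexposed
pairs = ([(True, False)] * 12 + [(False, True)] * 2
         + [(True, True)] * 9 + [(False, False)] * 7)
mor = matched_odds_ratio(pairs)
```

Concordant pairs drop out of the estimate, which is why confidence intervals with an upper bound of "infinity" (as for Restaurant Chain B above) arise when very few control-exposed-only pairs are observed.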
In the present study, we analyzed the functional profile of CD8+ T-cell responses directed against autologous transmitted/founder HIV-1 isolates during acute and early infection, and examined whether multifunctionality is required for selection of virus escape mutations. Seven anti-retroviral therapy-naïve subjects were studied in detail between 1 and 87 weeks following onset of symptoms of acute HIV-1 infection. Synthetic peptides representing the autologous transmitted/founder HIV-1 sequences were used in multiparameter flow cytometry assays to determine the functionality of HIV-1-specific CD8+ T memory cells. In all seven patients, the earliest T cell responses were predominantly oligofunctional, although the relative contribution of multifunctional cell responses increased significantly with time from infection. Interestingly, only the magnitude of the total and not of the poly-functional T-cell responses was significantly associated with the selection of escape mutants. However, the high contribution of MIP-1β-producing CD8+ T-cells to the total response suggests that mechanisms not limited to cytotoxicity could be exerting immune pressure during acute infection. Lastly, we show that epitope entropy, reflecting the capacity of the epitope to tolerate mutational change and defined as the diversity of epitope sequences at the population level, was also correlated with rate of emergence of escape mutants.
An important role for the polyfunctional T-cell fraction of anti-HIV CD8 responses during chronic HIV infection has previously been suggested. This study characterized the role of polyfunctional T-cells directed against the transmitted/founder virus in the selection of viral escape mutants during acute HIV-1 infection within a unique cohort of individuals recruited within 3 weeks of the onset of symptoms, at a time when the viral load was still declining. For the first time, the sequences of the transmitted/founder virus isolated from each patient were used. Interestingly, polyfunctionality was not found to be a prerequisite for selection of escape mutations. A novel significant correlation was found between the order of appearance of escape mutations in different epitope sequences and both the magnitude of the CD8+ T-cell responses and the degree of entropy of the individual epitopes. A high proportion of the T-cells participating in the total response produced MIP-1β, suggesting that mechanisms not limited to the killing of infected cells might play a relevant role in early infection. This highlights the importance of measuring the quality of the CD8+ lymphocyte response and the sequence of the transmitted virus isolates to better understand the mechanisms of control of HIV replication during acute infection.
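Epitope entropy, defined above as the diversity of epitope sequences at the population level, is commonly quantified as Shannon entropy over variant frequencies. The sketch below assumes that definition (in bits, base-2 logarithm); the variant strings are hypothetical examples, not sequences from this cohort.

```python
from collections import Counter
from math import log2

def epitope_entropy(sequences):
    """Shannon entropy (bits) of epitope variant frequencies: one way to
    quantify the diversity of epitope sequences in a population sample."""
    counts = Counter(sequences)
    n = len(sequences)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Hypothetical epitope variants observed at the population level:
variants = ["KAFSPEVI", "KAFSPEVI", "KGFSPEVI", "KAFNPEVI"]
print(round(epitope_entropy(variants), 3))  # -> 1.5
```

A fully conserved epitope (all sequences identical) has entropy 0; higher values indicate greater tolerance for mutational change.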
The number of people living with HIV (PLHIV) is increasing in the UK. This study estimated the annual population cost of providing HIV services in the UK for 1997–2006 and projected costs for 2007–2013.
The annual cost of HIV treatment per PLHIV, by stage of HIV infection and type of ART, was calculated (UK pounds, 2006 prices). Population costs were derived by multiplying the number of PLHIV by their annual per-person cost for 1997–2006, and were projected for 2007–2013.
Average annual treatment costs across all stages of HIV infection ranged from £17,034 in 1997 to £18,087 in 2006 for PLHIV on mono-therapy, and from £27,649 in 1997 to £32,322 in 2006 for those on quadruple-or-more ART. The number of PLHIV using NHS services rose from 16,075 in 1997 to 52,083 in 2006 and was projected to increase to 78,370 by 2013. The annual population cost rose from £104 million in 1997 to £483 million in 2006, with a projected annual cost of between £721 and £758 million by 2013. When community care costs were included, costs increased from £164 million in 1997 to £683 million in 2006, and to between £1,019 and £1,065 million in 2013.
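The population-cost method described above (per-person annual cost by stage of infection and ART type, multiplied by the number of PLHIV in each stratum and summed) can be sketched minimally as follows. The strata, head counts, and the third unit cost are hypothetical illustrations, not the study's actual cost model; the two ART unit costs reuse the 2006 figures quoted above.

```python
def population_cost(strata):
    """Total annual population cost (GBP) given a list of
    (n_people, annual_cost_per_person_gbp) strata."""
    return sum(n * cost for n, cost in strata)

# Hypothetical strata (counts are illustrative, not the study's data):
strata_2006 = [
    (1_000, 18_087),  # on mono-therapy (2006 unit cost from the abstract)
    (2_000, 32_322),  # on quadruple-or-more ART (2006 unit cost)
    (500, 10_000),    # hypothetical: diagnosed, not yet on ART
]
print(population_cost(strata_2006))  # -> 87731000 (GBP)
```

Projections then follow the same multiplication with forecast head counts and unit costs for each future year.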
The increased number of PLHIV using NHS services resulted in rising UK population costs. Population costs are expected to continue to increase, partly because of PLHIV's longer survival on ART and the relative lack of success of HIV prevention programs. Where possible, the cost of HIV treatment and care needs to be reduced without reducing the quality of services, and prevention programs need to become more effective. While high-income countries are struggling to meet these increasing costs, middle- and lower-income countries with larger epidemics are likely to find it even more difficult to meet these increasing demands, given that they have fewer resources.