We evaluated the limits of detection (LoD) for an 11-plex PCR-Luminex assay performed on Whatman FTA Elute cards smeared with stool containing pathogens associated with travelers’ diarrhea. LoDs ranged from 10² to 10⁵ CFU, PFU, or cysts/g for most pathogens except Cryptosporidium. Campylobacter and norovirus LoDs increased with prolonged storage of cards.
PCR; travelers’ diarrhea; stool card
Adenovirus is a recognized cause of influenza‐like illness (ILI). The proportion of ILI attributable to adenovirus is not known. Moreover, knowledge gaps remain with respect to the epidemiologic, virologic, and clinical characteristics of adenovirus‐associated ILI among otherwise healthy individuals.
An observational, longitudinal study of <65‐year‐old patients with febrile ILI at five medical centers was conducted from 2009 to 2014. Nasopharyngeal specimens obtained at enrollment were first tested by single‐reaction PCR for adenovirus, then further evaluated by a multiplex PCR assay for other respiratory viral pathogens. Symptoms over a 28‐day period were collected.
We enrolled 1536 individuals, among whom 43 (2·8%) were positive for adenovirus. The median age of cases was 3·4 years (range: 4 months to 41 years). Three were hospitalized. Species and serotype information was available for 33 (76·7%) cases. Species C (n = 21) was the most common, followed by B3 (n = 9) and one each of E4a, D46, and A. Species C infections were more frequent in children (P < 0·01). Half of the cases were positive for at least one other respiratory viral pathogen. Symptoms were generally mild and most commonly included cough (90%), fatigue (79%), rhinorrhea (74%), loss of appetite (71%), and sore throat (64%). Children with non‐C adenovirus infection were more likely to report sore throat (P = 0·05) and hoarseness (P = 0·06) than those with species C infection.
Adenovirus is frequently detected with other respiratory viruses. Persons with non‐C adenovirus infections reported more severe symptoms, suggesting there may be species‐specific differences in virulence and/or host response to infection.
Adenovirus; influenza‐like illness; military
The principal goal of HAART is sustained viral load (VL) suppression resulting in immune reconstitution and improved HIV outcomes. We studied the factors associated with 10 years of continuous VL suppression on HAART in the US Military HIV Natural History Study.
Participants with continuous VL suppression for ≥10 years on HAART (CS, n = 149) were compared to those without continuous VL suppression (NCS, n = 127). Factors associated with ≥10 years of VL suppression were evaluated by multivariate logistic regression. Additionally, the association between CS and CD4 reconstitution was analyzed with a mixed effects model.
Compared to NCS participants, a lower proportion of CS participants started HAART in the early HAART era (66 vs 90% for years 1996–1999; p < 0.001) and had less antiretroviral use prior to HAART (37 vs 83%; p < 0.001). At initial HAART, the median CD4 cell count was higher and VL was lower for CS compared to NCS participants (375 cells/µL [256, 499] vs 261 cells/µL [146, 400]; p < 0.001 and 4.4 log10 copies/mL [3.5, 4.9] vs 4.5 log10 copies/mL [3.8, 5.0]; p = 0.048, respectively). New AIDS events were lower during HAART (5 vs 13%; p = 0.032) and post-HAART CD4 trajectories were greater for the CS compared to NCS group. Factors negatively associated with ≥10 years of VL suppression included log10 VL at first HAART (OR 0.61, 95% CI 0.4, 0.92; p = 0.020) and antiretroviral use prior to HAART (OR 0.16, 95% CI 0.06, 0.38; p < 0.001).
Sustained VL suppression is key to long-term health in HIV-infected patients, as demonstrated by the lower proportion of AIDS events observed 10 years after HAART initiation. The current use of more potent and better-tolerated regimens may mitigate the negative factors of pre-HAART VL and prior ARV use encountered when treatment was initiated in the early HAART era.
HIV; AIDS; Viral load; Suppression; HAART; CD4 cell count
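As a rough illustration of the unadjusted effect size reported above for prior antiretroviral use, the odds ratio can be recomputed from a 2×2 table. The counts below are approximated from the reported percentages (37% of 149 CS vs. 83% of 127 NCS participants), so this is a back-of-the-envelope sketch, not the study's adjusted model:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Approximate counts: prior ARV use among CS (~55/149) vs NCS (~105/127)
or_, lo, hi = odds_ratio_ci(55, 105, 94, 22)
print(f"OR {or_:.2f} (95% CI {lo:.2f}, {hi:.2f})")  # OR 0.12 (95% CI 0.07, 0.22)
```

The unadjusted estimate lands in the same direction as the adjusted OR of 0.16 reported in the abstract, which is what one would expect before covariate adjustment.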
Infectious diarrhea is a common problem among travelers. Expert guidelines recommend the prompt use of antibiotics for self-treatment of moderate or severe traveler’s diarrhea (TD). There are limited data on whether travelers follow these self-treatment guidelines. We evaluated the risk factors associated with TD, use of TD self-treatment, and risk of irritable bowel syndrome (IBS) during travel.
Department of Defense beneficiaries traveling outside the US for ≤ 6.5 months were enrolled in a prospective cohort study. Participants received pre- and post-travel surveys, and could opt into a travel illness diary and follow-up surveys for symptoms of IBS. Standard definitions were used to assess for TD and IBS. Sub-optimal self-treatment was defined as use of antibiotics (with or without antidiarrheal agents) for mild TD, or use of antidiarrheals alone or no self-treatment in cases of moderate or severe TD.
Twenty-four percent of participants (270/1120) met criteria for TD. The highest incidence was recorded in Africa (8.6 cases/100 person-weeks, 95% CI: 6.7–10.5). Two hundred and twelve TD cases provided information regarding severity and self-treatment: 89 (42%) had mild TD and 123 (58%) had moderate or severe TD. Moderate or severe TD was independently associated with suboptimal self-treatment (OR 10.4 [95% CI: 4.92–22.0]). Time to last unformed stool did not differ between optimal and suboptimal self-treatment. IBS occurred in 4.5% (7/154) of TD cases and 3.1% (16/516) of patients without TD (p=0.39). Among TD cases, a lower incidence of IBS was noted in participants who took antibiotics (4.8% [5/105] vs. 2.2% [1/46]), but the difference did not reach statistical significance (p=0.60).
Our results suggest the underutilization of antibiotics in travelers with moderate or severe TD. Further studies are needed to systematically evaluate pre-travel instruction and traveler adherence to self-treatment guidelines, and the impact of suboptimal self-treatment on outcomes.
Whether poor virologic control is associated with incident cancers after initiation of combination antiretroviral therapy (ART) remains unclear. In a large cohort, time-updated HIV RNA levels ≥1,000 copies/ml predicted AIDS-defining cancers (ADCs), non-AIDS-defining cancers (NADCs), and skin cancers. Virologic control may be an important strategy in reducing cancer events among HIV-infected persons.
Host factors and complications have been associated with higher mortality in infective endocarditis (IE). We sought to develop and validate a model of clinical characteristics to predict 6‐month mortality in IE.
Methods and Results
Using a large multinational prospective registry of definite IE (International Collaboration on Endocarditis [ICE]–Prospective Cohort Study [PCS], 2000–2006, n=4049), a model to predict 6‐month survival was developed by Cox proportional hazards modeling with inverse probability weighting for surgery treatment and was internally validated by the bootstrapping method. This model was externally validated in an independent prospective registry (ICE‐PLUS, 2008–2012, n=1197). The 6‐month mortality was 971 of 4049 (24.0%) in the ICE‐PCS cohort and 342 of 1197 (28.6%) in the ICE‐PLUS cohort. Surgery during the index hospitalization was performed in 48.1% and 54.0% of the cohorts, respectively. In the derivation model, variables related to host factors (age, dialysis), IE characteristics (prosthetic or nosocomial IE, causative organism, left‐sided valve vegetation), and IE complications (severe heart failure, stroke, paravalvular complication, and persistent bacteremia) were independently associated with 6‐month mortality, and surgery was associated with a lower risk of mortality (Harrell's C statistic 0.715). In the validation model, these variables had similar hazard ratios (Harrell's C statistic 0.682), with a similar, independent benefit of surgery (hazard ratio 0.74, 95% CI 0.62–0.89). A simplified risk model was developed by weight adjustment of these variables.
Six‐month mortality after IE is ≈25% and is predicted by host factors, IE characteristics, and IE complications. Surgery during the index hospitalization is associated with lower mortality but is performed less frequently in the highest risk patients. A simplified risk model may be used to identify specific risk subgroups in IE.
infection; mortality; prognosis; surgery; valves; Infectious Endocarditis; Valvular Heart Disease; Mortality/Survival; Clinical Studies
Using appropriate analytical methods to examine data from the International Collaboration on Endocarditis–Prospective Cohort Study, we found that early valve surgery was not associated with reduced 1-year mortality in Staphylococcus aureus prosthetic valve infective endocarditis.
Background. The impact of early valve surgery (EVS) on the outcome of Staphylococcus aureus (SA) prosthetic valve infective endocarditis (PVIE) is unresolved. The objective of this study was to evaluate the association between EVS, performed within the first 60 days of hospitalization, and outcome of SA PVIE within the International Collaboration on Endocarditis–Prospective Cohort Study.
Methods. Participants were enrolled between June 2000 and December 2006. Cox proportional hazards modeling that included surgery as a time-dependent covariate and propensity adjustment for likelihood to receive cardiac surgery was used to evaluate the impact of EVS on 1-year all-cause mortality in patients with definite left-sided S. aureus PVIE and no history of injection drug use.
Results. EVS was performed in 74 of the 168 (44.3%) patients. One-year mortality was significantly higher among patients with S. aureus PVIE than in patients with non–S. aureus PVIE (48.2% vs 32.9%; P = .003). Staphylococcus aureus PVIE patients who underwent EVS had a significantly lower 1-year mortality rate (33.8% vs 59.1%; P = .001). In multivariate, propensity-adjusted models, EVS was not associated with 1-year mortality (risk ratio, 0.67 [95% confidence interval, .39–1.15]; P = .15).
Conclusions. In this prospective, multinational cohort of patients with S. aureus PVIE, EVS was not associated with reduced 1-year mortality. The decision to pursue EVS should be individualized for each patient, based upon infection-specific characteristics rather than solely upon the microbiology of the infection causing PVIE.
endocarditis; prosthetic valve; surgery; 1-year mortality
HIV-infected persons have increased risk of MRSA colonization and skin and soft-tissue infections (SSTI). However, no large clinical trial has examined the utility of decolonization procedures in reducing MRSA colonization or infection among community-dwelling HIV-infected persons.
A total of 550 HIV-infected adults at four geographically diverse US military HIV clinics were prospectively screened for MRSA colonization at five body locations every 6 months during a 2-year period. Those colonized were randomized in a double-blind fashion to nasal mupirocin (Bactroban) twice daily and hexachlorophene (pHisoHex) soap daily for 7 days, compared to placebos similar in appearance but without specific antibacterial activity. The primary endpoint was MRSA colonization at 6 months post-randomization; secondary endpoints were time to MRSA clearance, subsequent MRSA infections/SSTI, and predictors of MRSA clearance at the 6-month time point.
Forty-nine (9%) HIV-infected persons were MRSA colonized and randomized. Among those with 6-month colonization data (80% of those randomized), 67% were negative for MRSA colonization in both groups (p = 1.0). Analyses accounting for missing 6-month data showed that no significant differences between groups could have been achieved. In the multivariate adjusted models, randomization group was not associated with 6-month MRSA clearance. The median time to MRSA clearance was similar in the treatment vs. placebo groups (1.4 vs. 1.8 months, p = 0.35). There was no difference in subsequent development of MRSA infections/SSTI (p = 0.89). In a multivariable model, treatment group, demographics, and HIV-specific factors were not predictive of MRSA clearance at the 6-month time point.
A one-week decolonization procedure had no effect on MRSA colonization at the 6-month time point or subsequent infection rates among community-dwelling HIV-infected persons. More aggressive or novel interventions may be needed to reduce the burden of MRSA in this population.
Background. Few data exist on the incidence and risk factors of Staphylococcus aureus colonization and skin and soft tissue infections (SSTIs) among patients infected with human immunodeficiency virus (HIV).
Methods. Over a 2-year period, we prospectively evaluated adults infected with HIV for incident S aureus colonization at 5 body sites and SSTIs. Cox proportional hazard models using time-updated covariates were performed.
Results. The 322 participants had a median age of 42 years (interquartile range [IQR], 32–49) and a median HIV duration of 9.4 years (IQR, 2.7–17.4); 58% were on highly active antiretroviral therapy (HAART). Overall, 102 patients (32%) became colonized with S aureus, with an incidence rate of 20.6 (95% confidence interval [CI], 16.8–25.0) per 100 person-years (PYs). Predictors of colonization in the final multivariable model included illicit drug use (hazard ratio [HR], 4.26; 95% CI, 1.33–13.69) and public gym use (HR, 1.66; 95% CI, 1.04–2.66), whereas antibacterial soap use was protective (HR, 0.50; 95% CI, 0.32–0.78). In a separate model, perigenital colonization was associated with recent syphilis infection (HR, 4.63; 95% CI, 1.01–21.42). Fifteen percent of participants developed an SSTI (incidence rate, 9.4 cases [95% CI, 6.8–12.7] per 100 PYs). Risk factors for an SSTI included incident S aureus colonization (HR, 2.52; 95% CI, 1.35–4.69), public shower use (HR, 2.59; 95% CI, 1.48–4.56), and hospitalization (HR, 3.54; 95% CI, 1.67–7.53). The perigenital location of S aureus colonization was predictive of SSTIs. Human immunodeficiency virus-related factors (CD4 count, HIV RNA level, and HAART) were not associated with colonization or SSTIs.
Conclusions. Specific behaviors, but not HIV-related factors, are predictors of colonization and SSTIs. Behavioral modifications may be the most important strategies in preventing S aureus colonization and SSTIs among persons infected with HIV.
behaviors; colonization; HIV; human immunodeficiency virus; MRSA; risk factors; skin and soft tissue infections; Staphylococcus aureus
The well described biological and epidemiologic associations of syphilis and HIV are particularly relevant to the military, as service members are young and at risk for sexually transmitted infections. We therefore used the results of serial serologic testing to determine the prevalence, incidence, and risk factors for incident syphilis in a cohort of HIV-infected Department of Defense beneficiaries.
Participants with a positive non-treponemal test at HIV diagnosis that was confirmed on treponemal testing were categorized as prevalent cases, whereas participants with an initial negative non-treponemal test who subsequently developed a confirmed positive non-treponemal test were categorized as incident cases.
At HIV diagnosis, the prevalence of syphilis was 5.8% (n=202). A total of 4239 participants contributed 27,192 person-years (PY) to the incidence analysis, and 347 (8%) developed syphilis (rate 1.3/100 PY [1.1, 1.4]). Syphilis incidence was highest during the calendar years 2006–2009 (2.5/100 PY [2.0, 2.9]). In multivariate analyses, younger age (HR 0.8 per 10-year increase in age [0.8–0.9]), male gender (HR 5.6 [2.3–13.7]), non-European-American ethnicity (African-American: HR 3.2 [2.5–4.2]; Hispanic: HR 1.9 [1.2–3.0]), and history of hepatitis B (HR 1.5 [1.2–1.9]) or gonorrhea (HR 1.4 [1.1–1.8]) were associated with syphilis.
The significant burden of disease both at and after HIV diagnosis observed in this cohort suggests that the cost-effectiveness of extending syphilis screening to at-risk military members should be assessed. In addition, HIV-infected persons continue to acquire syphilis, emphasizing the continued importance of prevention-for-positives programs.
Seroincidence; Seroprevalence; Risk Factors; Syphilis; HIV infected persons
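The incidence figures quoted in the syphilis abstract above follow from simple person-time arithmetic; a minimal sketch, using a normal approximation to the Poisson confidence interval, reproduces the reported rate of 1.3/100 PY [1.1, 1.4]:

```python
import math

def incidence_rate_ci(events, person_years, per=100, z=1.96):
    """Incidence rate per `per` person-years with a normal
    approximation to the Poisson 95% confidence interval."""
    rate = events / person_years * per
    se = math.sqrt(events) / person_years * per  # SE of a Poisson count
    return rate, rate - z * se, rate + z * se

# 347 incident syphilis cases over 27,192 person-years of follow-up
rate, lo, hi = incidence_rate_ci(347, 27192)
print(f"{rate:.1f}/100 PY [{lo:.1f}, {hi:.1f}]")  # 1.3/100 PY [1.1, 1.4]
```

An exact Poisson interval would be slightly asymmetric, but with 347 events the normal approximation is adequate, which is why it matches the published figures.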
The HACEK organisms (Haemophilus species, Aggregatibacter species, Cardiobacterium hominis, Eikenella corrodens, and Kingella species) are rare causes of infective endocarditis (IE). The objective of this study was to describe the clinical characteristics and outcomes of patients with HACEK endocarditis (HE) in a large multinational cohort. Patients hospitalized with definite or possible infective endocarditis in the International Collaboration on Endocarditis Prospective Cohort Study in 64 hospitals from 28 countries were included, and the characteristics of HE patients were compared with those of IE due to other pathogens. Of 5591 patients enrolled, 77 (1.4%) had HE. HE was associated with a younger age (47 vs. 61 years; p<0.001) and a higher prevalence of immunologic/vascular manifestations (32% vs. 20%; p<0.008) and stroke (25% vs. 17%; p = 0.05), but a lower prevalence of congestive heart failure (15% vs. 30%; p = 0.004), death in hospital (4% vs. 18%; p = 0.001), or death after 1 year of follow-up (6% vs. 20%; p = 0.01) than IE due to other pathogens (n = 5514). On multivariable analysis, stroke was associated with mitral valve vegetations (OR 3.60; CI 1.34–9.65; p<0.01) and younger age (OR 0.62; CI 0.49–0.90; p<0.01). The overall outcome of HE was excellent, with in-hospital mortality (4%) significantly better than for non-HE (18%; p<0.001). Prosthetic valve endocarditis was more common in HE (35%) than non-HE (24%). The outcome of both prosthetic valve and native valve HE was excellent whether treated medically or with surgery. Current treatment is very successful for the management of both native valve and prosthetic valve HE, but further studies are needed to determine why HE has a predilection for younger people and a propensity to cause stroke. The small number of patients and observational design limit inferences on treatment strategies. Self-selection of study sites limits epidemiological inferences.
The typical clinical presentation of several spotted fever group Rickettsia infections includes eschars. Clinical diagnosis of the condition is usually made by analysis of blood samples. We describe a more sensitive, noninvasive means of obtaining a sample for diagnosis by using an eschar swab specimen from patients infected with Rickettsia parkeri.
Rickettsia; Swab; Eschar; R. parkeri; diagnostic; diagnosis; bacteria; vector-borne infections
The impact of early surgery on mortality in patients with native valve endocarditis (NVE) is unresolved. This study seeks to evaluate valve surgery compared to medical therapy for NVE, and to identify characteristics of patients who are most likely to benefit from early surgery.
Methods and Results
Using a prospective, multinational cohort of patients with definite NVE, the effect of early surgery on in-hospital mortality was assessed by propensity-based matching with adjustment for survivor bias and by instrumental variable analysis. Patients were stratified by propensity quintile, paravalvular complications, valve perforation, systemic embolization, stroke, Staphylococcus aureus infection, and congestive heart failure.
Of the 1552 patients with NVE, 720 (46%) underwent early surgery and 832 (54%) were treated with medical therapy. Compared to medical therapy, early surgery was associated with a significant reduction in mortality in the overall cohort (12.1% [87/720] vs. 20.7% [172/832]) and after propensity-based matching and adjustment for survivor bias (absolute risk reduction [ARR] = −5.9%; p<0.001). Using a combined instrument, the instrumental variable-adjusted ARR in mortality associated with early surgery was −11.2% (p<0.001). In sub-group analysis, surgery was found to confer a survival benefit compared to medical therapy among patients with a higher propensity for surgery (ARR = −10.9% for quintiles 4 and 5; p=0.002) and those with paravalvular complications (ARR = −17.3%; p<0.001), systemic embolization (ARR = −12.9%; p=0.002), S aureus NVE (ARR = −20.1%; p<0.001), and stroke (ARR = −13%; p=0.02), but not with valve perforation or congestive heart failure.
Early surgery for NVE is associated with an in-hospital mortality benefit compared to medical therapy alone.
early surgery; infective endocarditis; medical therapy; in hospital mortality
Human immunodeficiency virus (HIV)-infected persons are at risk for severe influenza infections. Although vaccination against the H1N1 pandemic influenza strain is recommended, currently, there are no data on the durability of post-vaccination antibody responses in this population.
HIV-infected and HIV-uninfected adults (18–50 years old) received a single dose of monovalent 2009 influenza A (H1N1) vaccine (strain A/California/7/2009H1N1). Antibody levels to the 2009 H1N1 pandemic strain were determined at day 0, day 28, and 6 months by hemagglutination-inhibition assay. A seroprotective response was defined as a post-vaccination titer of ≥1:40 among those with a pre-vaccination level of ≤1:10. Geometric mean titers (GMT) and factors associated with higher levels were also evaluated.
We studied 127 participants with a median age of 35 (interquartile range (IQR) 28, 42) years. In the HIV-infected arm (n=63), the median CD4 count was 595 (IQR 476, 819) cells/mm3 and 83% were receiving HAART. Thirty-five percent of all participants had a pre-vaccination level of >1:10. HIV-infected compared to HIV-uninfected adults were less likely to generate a seroprotective response at day 28 (54% vs. 75%, adjusted OR 0.23, p=0.021) or to have a durable response at 6 months post-vaccination (28% vs. 56%, adjusted OR 0.19, p=0.005). Additionally, although pre-vaccination GMTs were similar in both arms (median 7 vs. 8, p=0.11), the GMT at 6 months was significantly lower among HIV-infected versus HIV-uninfected adults (median 20 vs. 113, p=0.003). Among HIV-infected persons, younger age (p=0.035) and receipt of HAART (p=0.028) were associated with higher GMTs at 6 months.
Despite vaccination, most HIV-infected adults do not have durable seroprotective antibody responses to the 2009 influenza A (H1N1) virus, and hence may remain vulnerable to infection. In addition to HAART use, more immunogenic vaccines are likely needed for improving protection against influenza in this population.
influenza; pandemic 2009 H1N1; vaccine responses; HIV; durability; long-term immunity
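The hemagglutination-inhibition endpoints used in the vaccine abstracts above (geometric mean titer and seroprotection among the initially seronegative) reduce to simple arithmetic. A minimal sketch with invented titers, not study data:

```python
import math

def geometric_mean_titer(titers):
    """Geometric mean of reciprocal HI titers (e.g. 1:40 -> 40)."""
    return math.exp(sum(math.log(t) for t in titers) / len(titers))

def seroprotection_rate(pre, post):
    """Fraction of initially seronegative subjects (pre-vaccination
    titer <= 1:10) reaching a post-vaccination titer >= 1:40."""
    eligible = [(p, q) for p, q in zip(pre, post) if p <= 10]
    responders = sum(1 for p, q in eligible if q >= 40)
    return responders / len(eligible)

# Hypothetical reciprocal titers for five subjects
pre = [10, 10, 5, 10, 20]
post = [40, 20, 80, 160, 320]
print(round(geometric_mean_titer(post)))  # 80
print(seroprotection_rate(pre, post))     # 0.75
```

Because titers are measured on a doubling-dilution scale, the geometric (not arithmetic) mean is the standard summary, which is why GMT appears throughout these abstracts.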
Whether seroresponse to a vaccine such as hepatitis B virus (HBV) vaccine can provide a measure of the functional immune status of HIV-infected persons is unknown. This study evaluated the relationship between HBV vaccine seroresponses and progression to clinical AIDS or death.
Methods and Findings
From a large HIV cohort, we evaluated those who received HBV vaccine only after HIV diagnosis and had anti-HBs determination 1–12 months after the last vaccine dose. Non-response and positive response were defined as anti-HBs <10 and ≥10 IU/L, respectively. Participants were followed from the date of last vaccination to clinical AIDS, death, or last visit. The univariate and multivariable risk of progression to clinical AIDS or death was evaluated with Cox regression models. A total of 795 participants vaccinated from 1986–2010 were included, of whom 41% were responders. During 3,872 person-years of observation, 122 AIDS or death events occurred (53% after 1995). Twenty-two percent of non-responders experienced clinical AIDS or death compared with 5% of responders (p<0.001). Non-response to HBV vaccine was associated with a greater than 2-fold increased risk of clinical AIDS or death (HR 2.47; 95% CI, 1.38–4.43) compared with a positive response, after adjusting for CD4 count, HIV viral load, HAART use, and delayed-type hypersensitivity skin test responses (an in vivo marker of cell-mediated immunity). This association remained evident among those with CD4 count ≥500 cells/mm3 (HR 3.40; 95% CI, 1.39–8.32).
HBV vaccine responses may have utility in assessing functional immune status and risk-stratifying HIV-infected individuals, including those with CD4 count ≥500 cells/mm3.
Background. Limited data exist on the immunogenicity of the 2009 influenza A (H1N1) vaccine among immunocompromised persons, including those with human immunodeficiency virus (HIV) infection.
Methods. We compared the immunogenicity and tolerability of a single dose of the monovalent 2009 influenza A (H1N1) vaccine (strain A/California/7/2009H1N1) between HIV-infected and HIV-uninfected adults 18–50 years of age. The primary end point was an antibody titer of ≥1:40 at day 28 after vaccination in those with a prevaccination level of ≤1:10, as measured by hemagglutination-inhibition assay. Geometric mean titers, influenza-like illnesses, and tolerability were also evaluated.
Results. One hundred thirty-one participants were evaluated (65 HIV-infected and 66 HIV-uninfected patients), with a median age of 35 years (interquartile range, 27–42 years). HIV-infected persons had a median CD4 cell count of 581 cells/mm3 (interquartile range, 476–814 cells/mm3), and 82% were receiving antiretroviral medications. At baseline, 35 patients (27%) had antibody titers of >1:10. HIV-infected patients (29 [56%] of 52), compared with HIV-uninfected persons (35 [80%] of 44), were significantly less likely to develop an antibody response (odds ratio, 0.20; P = .003). Changes in the median geometric mean titer from baseline to day 28 were also significantly lower in HIV-infected patients than in HIV-uninfected persons (75 vs 153; P = .001). Five influenza-like illnesses occurred (2 cases in HIV-infected persons), but none was attributable to the 2009 influenza H1N1 virus. The vaccine was well tolerated in both groups.
Conclusions. Despite high CD4 cell counts and receipt of antiretroviral medications, HIV-infected adults generated significantly poorer antibody responses, compared with HIV-uninfected persons. Future studies evaluating a 2-dose series or more-immunogenic influenza A (H1N1) vaccines among HIV-infected adults are needed (ClinicalTrials.gov NCT00996970).
Knowing one's HIV status is particularly important in the setting of recent tuberculosis (TB) exposure. Blood tests for assessment of tuberculosis infection, such as the QuantiFERON Gold in-tube test (QFT; Cellestis Limited, Carnegie, Victoria, Australia), offer the possibility of simultaneous screening for TB and HIV with a single blood draw. We performed a cross-sectional analysis of all contacts of a highly infectious TB case in a large meatpacking factory. Twenty-two percent were foreign-born and 73% were black. Contacts were tested with both tuberculin skin testing (TST) and QFT. HIV testing was offered on an opt-out basis. Persons with TST ≥10 mm, a positive QFT, and/or a positive HIV test were offered latent TB treatment. Three hundred twenty-six contacts were screened: TST results were available for 266 people, and an additional 24 reported a prior positive TST, for a total of 290 persons (89.0%) with any TST result. Adequate QFT specimens were obtained for 312 (95.7%) persons. Thirty-two persons had QFT results but did not return for TST reading. Twenty-two percent met the criteria for latent TB infection. Eighty-eight percent accepted HIV testing. Two (0.7%) were HIV seropositive; both individuals were already aware of their HIV status, but one had stopped care a year previously. Neither of the HIV-seropositive persons had latent TB, but both were offered latent TB treatment per standard guidelines. This demonstrates that opt-out HIV testing combined with QFT in a large TB contact investigation was feasible and useful. HIV testing was also widely accepted. Pairing QFT with opt-out HIV testing should be strongly considered when possible.
To evaluate the efficacy of highly active antiretroviral therapy (HAART) in individuals taking cytochrome P450 enzyme-inducing antiepileptic drugs (EI-AEDs), we compared the virologic response to HAART with and without concurrent antiepileptic use.
Participants in the US Military HIV Natural History Study were included if taking HAART for ≥6 months with concurrent use of EI-AEDs phenytoin, carbamazepine, or phenobarbital for ≥28 days. Virologic outcomes were compared to HAART-treated participants taking AEDs that are not CYP450 enzyme-inducing (NEI-AED group) as well as to a matched group of individuals not taking AEDs (non-AED group). For participants with multiple HAART regimens with AED overlap, the first 3 overlaps were studied.
EI-AED participants (n = 19) had greater virologic failure (62.5%) compared to NEI-AED participants (n = 85; 26.7%) for the first HAART/AED overlap period (OR 4.58 [1.47–14.25]; P = 0.009). Analysis of multiple overlap periods yielded consistent results (OR 4.29 [1.51–12.21]; P = 0.006). Virologic failure was also greater in the EI-AED versus NEI-AED group with multiple HAART/AED overlaps when adjusted for both year of and viral load at HAART initiation (OR 4.19 [1.54–11.44]; P = 0.005). Compared to the non-AED group (n = 190), EI-AED participants had greater virologic failure (62.5% vs. 42.5%; P = 0.134); however, this result was significant only when adjusted for viral load at HAART initiation (OR 4.30 [1.02–18.07]; P = 0.046).
Consistent with data from pharmacokinetic studies demonstrating that EI-AED use may result in subtherapeutic levels of HAART, EI-AED use is associated with greater risk of virologic failure compared to NEI-AEDs when co-administered with HAART. Concurrent use of EI-AEDs and HAART should be avoided when possible.
The results from two methodologically identical double-blind studies indicate that telavancin is noninferior to vancomycin based on clinical response in the treatment of hospital-acquired pneumonia due to Gram-positive pathogens.
Background. Telavancin is a lipoglycopeptide bactericidal against gram-positive pathogens.
Methods. Two methodologically identical, double-blind studies (0015 and 0019) were conducted involving patients with hospital-acquired pneumonia (HAP) due to gram-positive pathogens, particularly methicillin-resistant Staphylococcus aureus (MRSA). Patients were randomized 1:1 to telavancin (10 mg/kg every 24 h) or vancomycin (1 g every 12 h) for 7–21 days. The primary end point was clinical response at follow-up/test-of-cure visit.
Results. A total of 1503 patients were randomized and received study medication (the all-treated population). In the pooled all-treated population, cure rates with telavancin versus vancomycin were 58.9% versus 59.5% (95% confidence interval [CI] for the difference, –5.6% to 4.3%). In the pooled clinically evaluable population (n = 654), cure rates were 82.4% with telavancin and 80.7% with vancomycin (95% CI for the difference, –4.3% to 7.7%). Treatment with telavancin achieved higher cure rates in patients with monomicrobial S. aureus infection and comparable cure rates in patients with MRSA infection; in patients with mixed gram-positive/gram-negative infections, cure rates were higher in the vancomycin group. Incidence and types of adverse events were comparable between the treatment groups. Mortality rates for telavancin-treated versus vancomycin-treated patients were 21.5% versus 16.6% (95% CI for the difference, –0.7% to 10.6%) for study 0015 and 18.5% versus 20.6% (95% CI for the difference, –7.8% to 3.5%) for study 0019. Increases in serum creatinine level were more common in the telavancin group (16% vs 10%).
Conclusions. The primary end point of the studies was met, indicating that telavancin is noninferior to vancomycin on the basis of clinical response in the treatment of HAP due to gram-positive pathogens.
Sepsis is caused by a heterogeneous group of infectious etiologies. Early diagnosis and the provision of appropriate antimicrobial therapy correlate with positive clinical outcomes. Current microbiological techniques are limited in their diagnostic capacities and timeliness. Multiplex PCR has the potential to rapidly identify bloodstream infections and fill this diagnostic gap. We identified patients from two large academic hospital emergency departments with suspected sepsis. The results of a multiplex PCR that could detect 25 bacterial and fungal pathogens were compared to those of blood culture. The results were analyzed with respect to the likelihood of infection, sepsis severity, the site of infection, and the effect of prior antibiotic therapy. We enrolled 306 subjects with suspected sepsis. Of these, 43 were later determined not to have infectious etiologies. Of the remaining 263 subjects, 70% had sepsis, 16% had severe sepsis, and 14% had septic shock. The majority had a definite infection (41.5%) or a probable infection (30.7%). Blood culture and PCR performed similarly with samples from patients with clinically defined infections (areas under the receiver operating characteristic curves, 0.64 and 0.60, respectively). However, blood culture identified more cases of septicemia than PCR among patients with an identified infectious etiology (66 and 46, respectively; P = 0.0004). The two tests performed similarly when the results were stratified by sepsis severity or infection site. Blood culture tended to detect infections more frequently among patients who had previously received antibiotics (P = 0.06). Conversely, PCR identified an additional 24 organisms that blood culture failed to detect. Real-time multiplex PCR has the potential to serve as an adjunct to conventional blood culture, adding diagnostic yield and shortening the time to pathogen identification.
We investigated associations between the genotypic and phenotypic features of Staphylococcus aureus bloodstream isolates and the clinical characteristics of bacteremic patients enrolled in a phase III trial of S. aureus bacteremia and endocarditis. Isolates underwent pulsed-field gel electrophoresis, PCR for 33 putative virulence genes, and screening for heteroresistant glycopeptide intermediate S. aureus (hGISA). A total of 230 isolates (141 methicillin-susceptible S. aureus and 89 methicillin-resistant S. aureus [MRSA]) were analyzed. North American and European S. aureus isolates differed in their genotypic characteristics. Overall, 26% of the MRSA bloodstream isolates were USA 300 strains. Patients with USA 300 MRSA bacteremia were more likely to be injection drug users (61% versus 15%; P < 0.001), to have right-sided endocarditis (39% versus 9%; P = 0.002), and to be cured of right-sided endocarditis (100% versus 33%; P = 0.01) than patients with non-USA 300 MRSA bacteremia. Patients with persistent bacteremia were less likely to be infected with Panton-Valentine leukocidin gene (pvl)-constitutive MRSA (19% versus 56%; P = 0.005). Although 7 of 89 MRSA isolates (8%) exhibited the hGISA phenotype, no association with persistent bacteremia, daptomycin resistance, or bacterial genotype was observed. This study suggests that the virulence gene profiles of S. aureus bloodstream isolates from North America and Europe differ significantly. In this study of bloodstream isolates collected as part of a multinational randomized clinical trial, USA 300 and pvl-constitutive MRSA strains were associated with better clinical outcomes.