Tumor markers with increased sensitivity and specificity for endometrial cancer are needed to help monitor response to therapy and to detect recurrent disease. Currently, the tumor marker CA125 is used in this role but has limited value. The objective of this study was to examine levels of the novel tumor markers HE4, SMRP, and CA72-4, along with CA125, as potential markers in patients diagnosed with endometrioid adenocarcinoma of the uterus.
Preoperative serum samples from surgically staged patients with endometrioid adenocarcinoma of the uterus were analyzed for levels of HE4, SMRP, CA72-4 and CA125. Control samples were obtained from healthy postmenopausal women. Logistic regression models and receiver operating characteristic (ROC) curves were constructed for each tumor marker and for all combinations, with cross-validation analyses to obtain average sensitivities at set specificities of 90%, 95%, and 98%.
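The core ROC quantity reported above, marker sensitivity at a fixed specificity, can be sketched as follows. This is an illustrative computation only, not the study's code: the marker values are synthetic, and the simple quantile threshold stands in for the full logistic regression and cross-validation analysis described in the methods.

```python
# Illustrative sketch (not the study's analysis code): sensitivity at a
# fixed specificity, estimated from case and control marker values.
import numpy as np

def sensitivity_at_specificity(controls, cases, specificity=0.95):
    """Set the positivity threshold at the `specificity` quantile of the
    healthy controls; sensitivity is the fraction of cases above it."""
    threshold = np.quantile(controls, specificity)
    return float(np.mean(np.asarray(cases) > threshold))

# Hypothetical marker distributions sized like the study's groups.
rng = np.random.default_rng(0)
controls = rng.normal(50, 10, 156)   # hypothetical healthy values
cases = rng.normal(70, 20, 171)      # hypothetical cancer values
sens = sensitivity_at_specificity(controls, cases, 0.95)
print(f"sensitivity at 95% specificity: {sens:.1%}")
```

In the study itself, average sensitivities at 90%, 95%, and 98% specificity were obtained by cross-validation over fitted logistic regression models rather than a single threshold as in this sketch.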
Serum samples from 156 healthy subjects and 171 patients with endometrial cancer (122 stage I, 17 stage II, 26 stage III, and 6 stage IV) were analyzed. At a 95% specificity, the sensitivities for differentiating between healthy subjects and all stages of cancer were 45.5% for HE4 and 24.6% for CA125. For stage I disease, HE4 yielded a 17.1% improvement in sensitivity compared with CA125.
HE4 is elevated in all stages of endometrial cancer and is more sensitive in early-stage endometrial cancer compared to CA125. Further investigation of HE4 as a marker for early detection of recurrent endometrial cancer and monitoring response to therapy is warranted.
Patients diagnosed with epithelial ovarian cancer (EOC) have improved outcomes when cared for at centers experienced in the management of EOC. The objective of this trial was to validate a predictive model to assess the risk for EOC in women with a pelvic mass.
Women diagnosed with a pelvic mass and scheduled to have surgery were enrolled in a multicenter prospective study. Preoperative serum levels of HE4 and CA125 were measured. Separate logistic regression algorithms for premenopausal and postmenopausal women were used to categorize patients into low- and high-risk groups for EOC.
Twelve sites enrolled 531 evaluable patients: 352 benign tumors, 129 EOC, 22 low malignant potential (LMP) tumors, 6 non-EOC ovarian cancers, and 22 nonovarian cancers. The postmenopausal group contained 150 benign cases, of which 112 were classified as low risk, giving a specificity of 75.0% (95% CI = 66.9–81.4), and 111 EOC and 6 LMP tumors, of which 108 were classified as high risk, giving a sensitivity of 92.3% (95% CI = 85.9–96.4). The premenopausal group had 202 benign cases, of which 151 were classified as low risk, providing a specificity of 74.8% (95% CI = 68.2–80.6), and 18 EOC and 16 LMP tumors, of which 26 were classified as high risk, providing a sensitivity of 76.5% (95% CI = 58.8–89.3).
An algorithm utilizing HE4 and CA125 successfully classified patients into high and low risk groups with 93.8% of EOC correctly classified as high risk. This model can be used to effectively triage patients to centers of excellence.
To compare the Risk of Malignancy Index (RMI) to the Risk of Ovarian Malignancy Algorithm (ROMA) to predict EOC in women with a pelvic mass.
457 women with imaging results from ultrasound, CT, or MRI and serum HE4 and CA125 determined prior to surgery for a pelvic mass were evaluable. RMI values were determined using CA125, imaging score, and menopausal status. ROMA values were determined using HE4, CA125, and menopausal status.
At a set specificity of 75%, ROMA had a sensitivity of 94.3% and RMI had a sensitivity of 84.6% for distinguishing benign from EOC (p=0.0029). In patients with stage I and II disease, ROMA achieved a sensitivity of 85.3% compared with 64.7% for RMI (p<0.0001).
The dual marker algorithm utilizing HE4 and CA125 to calculate a ROMA value achieves a significantly higher sensitivity for identifying women with EOC than does RMI.
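The two indices compared above can be sketched in code. The functional forms below are the widely cited published versions (the Jacobs RMI product and the Moore et al. ROMA logistic model); the specific ROMA coefficients are assumptions taken from the commonly reported values, not from this abstract, and should be verified against the original reports before any use.

```python
# Sketch of the two indices: RMI (Risk of Malignancy Index) and ROMA.
# Coefficients below are the commonly published values (an assumption,
# not taken from this study) -- verify before any clinical use.
import math

def rmi(ca125, ultrasound_score, postmenopausal):
    """RMI = U x M x CA125, with ultrasound score U in {0, 1, 3}
    and menopausal score M = 3 if postmenopausal else 1."""
    m = 3 if postmenopausal else 1
    return ultrasound_score * m * ca125

def roma(he4, ca125, postmenopausal):
    """ROMA predictive probability (%) from HE4 and CA125 via a
    menopause-specific logistic predictive index (PI)."""
    if postmenopausal:
        pi = -8.09 + 1.04 * math.log(he4) + 0.732 * math.log(ca125)
    else:
        pi = -12.0 + 2.38 * math.log(he4) + 0.0626 * math.log(ca125)
    return 100.0 * math.exp(pi) / (1.0 + math.exp(pi))
```

The contrast in the study follows from the inputs: RMI depends on an imaging score that is unavailable or equivocal in some patients, whereas ROMA uses only the two serum markers plus menopausal status.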
ovarian cancer; pelvic mass; HE4; CA125; ROMA
It is often difficult to distinguish a benign pelvic mass from a malignancy, and tools to help referring physicians are needed. The purpose of this study was to validate the Risk of Ovarian Malignancy Algorithm (ROMA) in women presenting with a pelvic mass.
This was a prospective, multicenter, blinded clinical trial that included women who presented to a gynecologist, family practitioner, internist, or general surgeon with an adnexal mass. Serum HE4 and CA125 were determined preoperatively. A ROMA score was calculated and used to classify patients into high- and low-risk groups for malignancy. The sensitivity, specificity, negative predictive value (NPV), and positive predictive value (PPV) of ROMA were estimated.
A total of 472 patients were evaluated, with 383 women diagnosed with benign disease and 89 with a malignancy. The prevalence of all cancers was 15%, and that of ovarian cancer was 10%. For detecting ovarian cancer, ROMA had a sensitivity of 92.3% and a specificity of 76.0% in the postmenopausal group, and a sensitivity of 100% and a specificity of 74.2% in the premenopausal group. When considering all women together, ROMA had a sensitivity of 93.8%, a specificity of 74.9%, and an NPV of 99.0%.
The use of the serum biomarkers HE4 and CA125 with ROMA has a high sensitivity for the prediction of ovarian cancer in women with a pelvic mass. These findings support the use of ROMA as a tool for the triage of women with an adnexal mass to gynecologic oncologists.
The Post-exposure Prophylaxis in Infants (PEPI)-Malawi trial evaluated infant antiretroviral regimens for prevention of post-natal HIV transmission. A multi-assay algorithm (MAA) that includes the BED capture immunoassay, an avidity assay, CD4 cell count, and viral load was used to identify women who were vs. were not recently infected at the time of enrollment (MAA recent, N = 73; MAA non-recent, N = 2,488); a subset of the women in the MAA non-recent group was known to have been HIV infected for at least 2 years before enrollment (known non-recent, N = 54). Antibody maturation and viral diversification were examined in these women.
Samples collected at enrollment (N = 2,561) and 12–24 months later (N = 1,306) were available for serologic analysis using the BED and avidity assays. A subset of those samples was used for analysis of viral diversity, which was performed using a high resolution melting (HRM) diversity assay. Viral diversity analysis was performed using all available samples from women in the MAA recent group (61 enrollment samples, 38 follow-up samples) and the known non-recent group (43 enrollment samples, 22 follow-up samples). Diversity data from PEPI-Malawi were also compared to similar data from 169 adults in the United States (US) with known recent infection (N = 102) and known non-recent infection (N = 67).
In PEPI-Malawi, results from the BED and avidity assays increased over time in the MAA recent group, but did not change significantly in the MAA non-recent group. At enrollment, HIV diversity was lower in the MAA recent group than in the known non-recent group. HRM diversity assay results from women in PEPI-Malawi were similar to those from adults in the US with known duration of HIV infection.
Antibody maturation and HIV diversification patterns in African women provide additional support for use of the MAA to identify populations with recent HIV infection.
Viral suppression and viral breakthrough impact the humoral immune response to HIV infection. We evaluated the impact of viral suppression and viral breakthrough on results obtained with two cross-sectional HIV incidence assays.
All samples were collected from adults in the US who had been HIV infected for >2 years. Samples were tested with the BED capture enzyme immunoassay (BED-CEIA), which measures the proportion of IgG that is HIV-specific, and with an antibody avidity assay based on the Genetic Systems 1/2+ O ELISA. We tested 281 samples: (1) 30 samples from 18 patients with natural control of HIV-1 infection, known as elite controllers or suppressors; (2) 72 samples from 18 adults on antiretroviral therapy (ART), with 1 sample before and 2–6 samples after ART initiation; and (3) 179 samples from 20 adults on ART who were virally suppressed, had evidence of viral breakthrough (>400 copies/mL HIV RNA), and subsequently re-established viral suppression.
For elite suppressors, 10/18 had BED-CEIA values <0.8 normalized optical density units (OD-n) and these values did not change significantly over time. For patients receiving ART, 14/18 had BED-CEIA values that decreased over time, with a median decrease of 0.42 OD-n (range 0.10 to 0.63)/time point receiving ART. Three patterns of BED-CEIA values were observed during viral breakthrough: (1) values that increased then returned to pre-breakthrough values when viral suppression was re-established, (2) values that increased after viral breakthrough, and (3) values that did not change with viral breakthrough.
Viral suppression and viral breakthrough were associated with changes in BED-CEIA values, reflecting changes in the proportion of HIV-specific IgG. These changes can result in misclassification of patients with long-term HIV infection as recently infected by the BED-CEIA, thereby inflating cross-sectional incidence estimates.
Geographic location may be related to the receipt of quality HIV healthcare services. Clinical outcomes and healthcare utilization were evaluated in rural, urban and peri-urban patients seen at high-volume U.S. urban-based HIV care sites.
Zip codes for 8,773 HIV patients followed in 2005 at 7 HIV Research Network sites were categorized as rural (population <10K), peri-urban (10K–100K), and urban (>100K). Clinical and demographic characteristics, inpatient and outpatient (OP) utilization, AIDS-defining illness rates, receipt of highly active antiretroviral therapy (HAART), opportunistic infection (OI) prophylaxis usage, and virologic suppression were compared among patients using χ2 tests for categorical variables, t-tests for means, and logistic regression for HAART utilization.
HIV-infected rural (n=170) and peri-urban (n=215) patients were less likely to be Black or Hispanic than urban HIV patients. Peri-urban subjects were more likely to report MSM as their HIV risk factor than rural or urban subjects. Age, gender, CD4 and HIV-RNA distributions, virologic suppression, HAART usage, and OI prophylaxis did not differ by geographic location. In multivariate analysis, rural and peri-urban patients were less likely to have ≥4 annual outpatient visits than urban patients. Rural patients were less likely to receive HAART if they were Black. Overall, geographic location (as defined by home zip code) did not affect receipt of HAART or OI prophylaxis.
Although demographic and healthcare utilization differences were seen among rural, peri-urban, and urban HIV patients, most HIV outcomes and medication use were comparable across geographic areas. As with HIV care for urban-dwelling patients, areas for improvement for non-urban HIV patients include access to HAART among minorities and IDUs.
rural; HIV/AIDS Care; HAART; Outcomes; Quality of care; highly active antiretroviral therapy; HIV Research Network
To assess the effect of HIV-infection on postoperative survival among non-small cell lung cancer (NSCLC) patients.
A retrospective cohort study compared 22 HIV-infected lung cancer patients to 2,430 lung cancer patients with HIV-unspecified status resected at Johns Hopkins Hospital from 1985-2009. Sub-cohort comparative analyses were performed using individual matching methods.
Thirty-day mortality rates did not differ between HIV-infected and HIV-unspecified patients. Survival in HIV-infected lung cancer patients was significantly shorter than in HIV-unspecified patients (median 26 vs. 48 months; p=0.001). After adjustment, the relative hazard of mortality among HIV-infected NSCLC patients was ≥3-fold that of HIV-unspecified patients (aHR=3.08, 95% CI 1.85–5.13). When additional surgical characteristics were modeled in a matched sub-cohort, the association remained statistically significant (aHR=2.31, 95% CI 1.11–4.81). Moreover, HIV-infected lung cancer patients with CD4 counts <200 cells/mm3 had shorter median survival than those with CD4 ≥200 cells/mm3 (8 vs. 40 months; p=0.031). Postoperative pulmonary and infectious complications were also elevated in the HIV-infected group (p=0.001 and p<0.001, respectively). After surgery, median time to cancer progression was shorter among HIV-infected patients (20.4 months) than among HIV-unspecified patients (p=0.061).
HIV infection in NSCLC patients is associated with more postoperative complications, more rapid progression to disease recurrence, and poorer postoperative survival. Optimizing immune status prior to surgery and careful patient selection based on DLCO may improve patient outcomes.
Lung Cancer Surgery; Infection; Outcomes
Insulin and its plasma membrane receptor constitute an ancient response system critical to cell growth and differentiation. Studies using intact Rana pipiens oocytes have shown that insulin can act at receptors on the oocyte surface to initiate resumption of the first meiotic division. We have reexamined the insulin-induced cascade of electrical and ion transport-related plasma membrane events using both oocytes and intact plasma membranes in order to characterize the insulin receptor-steroid response system associated with the meiotic divisions.
[125I]Insulin binding (Kd = 54 ± 6 nM) at the oocyte plasma membrane activates membrane serine protease(s), followed by the loss of low affinity ouabain binding sites, with a concomitant 3–4 fold increase in high affinity ouabain binding sites. The changes in protease activity and ouabain binding are associated with increased Na+/Ca2+ exchange, increased endocytosis, decreased Na+ conductance resulting in membrane hyperpolarization, increased 2-deoxy-D-glucose uptake and a sustained elevation of intracellular pH (pHi). Hyperpolarization is largely due to Na+-channel inactivation and is the main driving force for glucose uptake by the oocyte via Na+/glucose cotransport. The Na+ sym- and antiporter systems are driven by the Na+ free energy gradient generated by Na+/K+-ATPase. Shifts in α and/or β Na+-pump subunits to caveolar (lipid raft) membrane regions may activate Na/K-ATPase and contribute to the Na+ free energy gradient and the increase in both Na+/glucose co-transport and pHi.
Under physiological conditions, resumption of meiosis results from the concerted action of insulin and progesterone at the cell membrane. Insulin inactivates Na+ channels and mobilizes fully functional Na+-pumps, generating a Na+ free energy gradient which serves as the energy source for several membrane anti- and symporter systems.
Insulin; Progesterone; Meiosis; Oocyte; Ouabain; Na/K-ATPase; Protease
cost; HIV; utilization; CD4 count; HAART; HIV Research Network
The U.S. National HIV/AIDS Strategy targets for 2015 include increasing access to care and improving health outcomes for persons living with HIV in the United States (PLWH-US).
To demonstrate the utility of the NA-ACCORD (North American AIDS Cohort Collaboration on Research and Design) for monitoring trends in the HIV epidemic in the United States and to present trends in HIV treatment and related health outcomes.
Trends from annual cross-sectional analyses comparing patients from pooled, multicenter, prospective, clinical HIV cohort studies with PLWH-US, as reported to national surveillance systems in 40 states.
U.S. HIV outpatient clinics.
HIV-infected adults with 1 or more HIV RNA plasma viral load (HIV VL) or CD4 T-lymphocyte (CD4) cell count measured in any calendar year from 1 January 2000 to 31 December 2008.
Annual rates of antiretroviral therapy use, HIV VL, and CD4 cell count at death.
45 529 HIV-infected persons received care in an NA-ACCORD–participating U.S. clinical cohort from 2000 to 2008. In 2008, the 26 030 NA-ACCORD participants in care and the 655 966 PLWH-US had qualitatively similar demographic characteristics. From 2000 to 2008, the proportion of participants prescribed highly active antiretroviral therapy increased by 9 percentage points to 83% (P < 0.001), whereas the proportion with suppressed HIV VL (≤2.7 log10 copies/mL) increased by 26 percentage points to 72% (P < 0.001). Median CD4 cell count at death more than tripled to 0.209 × 10⁹ cells/L (P < 0.001).
The usual limitations of observational data apply.
The NA-ACCORD is the largest cohort of HIV-infected adults in clinical care in the United States that is demographically similar to PLWH-US in 2008. From 2000 to 2008, increases were observed in the percentage of prescribed HAART, the percentage who achieved a suppressed HIV VL, and the median CD4 cell count at death.
Primary Funding Source
National Institutes of Health, Centers for Disease Control and Prevention, Canadian Institutes of Health Research, Canadian HIV Trials Network, and the government of British Columbia, Canada.
Oligodendroglioma is characterized by unique clinical, pathological, and genetic features. Recurrent losses of chromosomes 1p and 19q are strongly associated with this brain cancer but knowledge of the identity and function of the genes affected by these alterations is limited. We performed exome sequencing on a discovery set of 16 oligodendrogliomas with 1p/19q co-deletion to identify new molecular features at base-pair resolution. As anticipated, there was a high rate of IDH mutations: all cases had mutations in either IDH1 (14/16) or IDH2 (2/16). In addition, we discovered somatic mutations and insertions/deletions in the CIC gene on chromosome 19q13.2 in 13/16 tumours. These discovery set mutations were validated by deep sequencing of 13 additional tumours, which revealed 7 others with CIC mutations, thus bringing the overall mutation rate in oligodendrogliomas in this study to 20/29 (69%). In contrast, deep sequencing of astrocytomas and oligoastrocytomas without 1p/19q loss revealed that CIC alterations were otherwise rare (1/60; 2%). Of the 21 non-synonymous somatic mutations in 20 CIC-mutant oligodendrogliomas, 9 were in exon 5 within an annotated DNA interacting domain and 3 were in exon 20 within an annotated protein interacting domain. The remaining 9 were found in other exons and frequently included truncations. CIC mutations were highly associated with oligodendroglioma histology, 1p/19q co-deletion and IDH1/2 mutation (p<0.001). Although we observed no differences in the clinical outcomes of CIC mutant versus wild-type tumors, in a background of 1p/19q co-deletion, hemizygous CIC mutations are likely important. We hypothesize that the mutant CIC on the single retained 19q allele is linked to the pathogenesis of oligodendrogliomas with IDH mutation. Our detailed study of genetic aberrations in oligodendroglioma suggests a functional interaction between CIC mutation, IDH1/2 mutation and 1p/19q co-deletion.
Glioma; Oligodendroglioma; Next Generation Sequencing; Capicua; IDH1
To examine long-term effects of antiretroviral therapy (ART) on kidney function, we evaluated the incidence and risk factors for chronic kidney disease (CKD) among ART-naive, HIV-infected adults and compared changes in estimated glomerular filtration rates (eGFR) before and after starting ART.
Multicenter observational cohort study of patients with at least one serum creatinine measurement before and after initiating ART. Cox proportional hazard models, and marginal structure models examined CKD risk factors; mixed-effects linear models examined eGFR slopes.
Three thousand three hundred and twenty-nine patients met entry criteria, contributing 10 099 person-years of observation on ART. ART was associated with a significantly slower rate of eGFR decline (from −2.18 to −1.37 ml/min per 1.73 m2 per year; P = 0.02). The incidence of CKD defined by eGFR thresholds of 60, 45 and 30 ml/min per 1.73 m2 was 10.5, 3.4 and 1.6 per 1000 person-years, respectively. In adjusted analyses, black race, hepatitis C coinfection, lower time-varying CD4 cell count and higher time-varying viral load on ART were associated with higher CKD risk, and the magnitude of these risks increased with more severe CKD. The combination of tenofovir and a ritonavir-boosted protease inhibitor (rPI) was also associated with higher CKD risk [hazard ratio for an eGFR threshold <60 ml/min per 1.73 m2: 3.35 (95% CI 1.40–8.02)]; CKD at this threshold developed in 5.7% of patients after 4 years of exposure to this regimen type.
ART was associated with reduced CKD risk in association with CD4 cell restoration and plasma viral load suppression, despite an increased CKD risk that was associated with initial regimens that included tenofovir and rPI.
antiretroviral therapy; chronic kidney disease; tenofovir
With improved survival related to combination antiretroviral therapy (ART), diabetes and hypertension increasingly contribute to morbidity and mortality among individuals with HIV. However, there are limited data on diabetes and blood pressure control in this population. We examined whether virologic control is associated with control of diabetes and hypertension.
We examined HIV viral load, hemoglobin A1c (HbA1c), and blood pressure measurements from 70 diabetics and 291 hypertensives in the Johns Hopkins HIV Clinical Cohort, an urban, university-based cohort. All patients were treated for HIV and diabetes or hypertension. HbA1c and HIV-1 RNA were captured electronically from laboratory data, and blood pressure was collected electronically from vital signs taken at clinic visits. We used HIV-1 RNA values within 30 days of the HbA1c measurement or blood pressure measurement. The relationships between HIV-1 RNA and HbA1c and HIV-1 RNA and blood pressure were examined using separate random effects generalized least squares linear regression models.
The study sample was predominantly male and black, with a high prevalence of comorbid hepatitis C virus infection and psychiatric illness. In multivariable analysis, each log10 increase in HIV-1 RNA was associated with higher HbA1c (β=0.47 units, p<0.001) among diabetics and higher mean arterial pressure (MAP) among hypertensive patients (β=1.95 mmHg, p<0.001).
Suboptimal control of HIV, indicated by detectable viral load, correlates with suboptimal control of diabetes and hypertension, indicated by higher HbA1c and MAP. Achieving control of multiple medical comorbidities and HIV simultaneously may require expansion of current adherence interventions focused primarily on antiretroviral therapy.
HIV; Diabetes; Hypertension
The human leukocyte antigen (HLA) is key to many aspects of human physiology and medicine. All current sequence-based HLA typing methodologies are targeted approaches requiring the amplification of specific HLA gene segments. Whole genome, exome and transcriptome shotgun sequencing can generate prodigious data but due to the complexity of HLA loci these data have not been immediately informative regarding HLA genotype. We describe HLAminer, a computational method for identifying HLA alleles directly from shotgun sequence datasets (http://www.bcgsc.ca/platform/bioinfo/software/hlaminer). This approach circumvents the additional time and cost of generating HLA-specific data and capitalizes on the increasing accessibility and affordability of massively parallel sequencing.
The ability to conditionally rewire pathways in human cells holds great therapeutic potential. Transcription activator-like effectors (TALEs) are a class of naturally occurring specific DNA binding proteins that can be used to introduce targeted genome modifications or control gene expression. Here we present TALE hybrids engineered to respond to endogenous signals and capable of controlling transgenes by applying a predetermined and tunable action at the single-cell level. Specifically, we first demonstrate that combinations of TALEs can be used to modulate the expression of stably integrated genes in kidney cells. We then introduce a general purpose two-hybrid approach that can be customized to regulate the function of any TALE either using effector molecules or a heterodimerization reaction. Finally, we demonstrate the successful interface of TALEs to specific endogenous signals, namely hypoxia signaling and microRNAs, essentially closing the loop between cellular information and chromosomal transgene expression.
Viremia copy-years predicted all-cause mortality independent of traditional, cross-sectional viral load measures and time-updated CD4+ T-lymphocyte count in antiretroviral therapy-treated patients suggesting cumulative human immunodeficiency virus replication causes harm independent of its effect on the degree of immunodeficiency.
Background. Cross-sectional plasma human immunodeficiency virus (HIV) viral load (VL) measures have proven invaluable for clinical and research purposes. However, cross-sectional VL measures fail to capture cumulative plasma HIV burden longitudinally. We evaluated the cumulative effect of exposure to HIV replication on mortality following initiation of combination antiretroviral therapy (ART).
Methods. We included treatment-naive HIV-infected patients starting ART from 2000 to 2008 at 8 Center for AIDS Research Network of Integrated Clinical Systems sites. Viremia copy-years, a time-varying measure of cumulative plasma HIV exposure, were determined for each patient using the area under the VL curve. Multivariable Cox models were used to evaluate the independent association of viremia copy-years for all-cause mortality.
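The viremia copy-years measure described above is the area under the viral-load curve. A minimal sketch of that computation, using the trapezoidal rule on a hypothetical patient's measurements (not data from the study), is:

```python
# Illustrative sketch (not the study's code): viremia copy-years as the
# trapezoidal area under the viral-load (VL) curve over time.
import math

def viremia_copy_years(times_years, vl_copies_per_ml):
    """Trapezoidal area under the VL curve, in copies/mL x years."""
    area = 0.0
    for (t0, v0), (t1, v1) in zip(
            zip(times_years, vl_copies_per_ml),
            zip(times_years[1:], vl_copies_per_ml[1:])):
        area += (t1 - t0) * (v0 + v1) / 2.0
    return area

# Hypothetical patient: VL measured over 3 years after starting ART.
t = [0.0, 0.5, 1.0, 2.0, 3.0]
vl = [100_000, 5_000, 400, 50, 50]
vcy = viremia_copy_years(t, vl)
print(f"viremia copy-years: {math.log10(vcy):.2f} log10 copy x y/mL")
```

Because the area is accumulated over follow-up, viremia copy-years is time-varying: two patients with the same current VL can differ greatly in cumulative exposure, which is what distinguishes this measure from the cross-sectional VL measures it is compared against.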
Results. Among 2027 patients contributing 6579 person-years of follow-up, the median viremia copy-years was 5.3 log10 copy × y/mL (interquartile range: 4.9–6.3 log10 copy × y/mL), and 85 patients (4.2%) died. When evaluated separately, viremia copy-years (hazard ratio [HR] = 1.81 per log10 copy × y/mL; 95% confidence interval [CI], 1.51–2.18 per log10 copy × y/mL), 24-week VL (1.74 per log10 copies/mL; 95% CI, 1.48–2.04 per log10 copies/mL), and most recent VL (HR = 1.89 per log10 copies/mL; 95% CI: 1.63–2.20 per log10 copies/mL) were associated with increased mortality. When simultaneously evaluating VL measures and controlling for other covariates, viremia copy-years increased mortality risk (HR = 1.44 per log10 copy × y/mL; 95% CI, 1.07–1.94 per log10 copy × y/mL), whereas no cross-sectional VL measure was independently associated with mortality.
Conclusions. Viremia copy-years predicted all-cause mortality independent of traditional, cross-sectional VL measures and time-updated CD4+ T-lymphocyte count in ART-treated patients, suggesting cumulative HIV replication causes harm independent of its effect on the degree of immunodeficiency.
The extracellular biofilm matrix includes primarily DNA and exopolysaccharides (EPS), which function to maintain aggregate structures and to protect biofilms from antibiotics and the immune response. Both polymers are anionic and have cation binding activity; however, the impact of this activity on biofilms is not fully understood. Host cell contact is considered the primary signal for activation of most type III secretion systems (T3SS), although calcium limitation is frequently used as a trigger of contact-independent T3SS expression. We hypothesized that alginate, a known calcium-binding exopolysaccharide produced in mucoid Pseudomonas aeruginosa isolates, can activate the T3SS in biofilms. The addition of exogenous purified alginate to planktonic, non-mucoid PAO1 cultures induced expression of exoS, exoT and exoY-lux reporters of the T3SS in a concentration-dependent manner. Induction by alginate was comparable to induction by the calcium chelator NTA. We extended our analysis of the T3SS to flow chamber-cultivated biofilms and showed that hyperproduction of alginate in mucA22 mucoid isolates resulted in induction of the exoS-gfp transcriptional reporter compared to non-mucoid paired isolates. We confirmed the transcriptional effects of alginate on T3SS expression using a FlAsH fluorescence method and showed high levels of the ExoT-Cys4 protein in mucoid biofilms. Induction of the T3SS could be prevented in planktonic cultures and mucoid biofilms treated with excess calcium, indicating that Ca2+ chelation by the EPS matrix caused contact-independent induction. However, mucoid isolates generally had reduced exoS-lux expression in comparison to paired, non-mucoid isolates when grown as planktonic cultures and agar colonies. In summary, we have shown a mucoid biofilm-specific induction of the type III secretion system and highlight a difference between planktonic and biofilm cultures in the production of virulence factors.
To evaluate the prevalence and risk factors for low bone mineral density (BMD) in persons co-infected with HIV and Hepatitis C.
HIV/HCV co-infected study participants (n=179) were recruited into a prospective cohort and underwent dual-energy X-ray absorptiometry (DXA) within 1 year of a liver biopsy. Fibrosis staging was evaluated according to the METAVIR system. Osteoporosis was defined as a T-score ≤ −2.5. Z-scores at the total hip, femoral neck, and lumbar spine were used as the primary outcome variables to assess the association between degree of liver disease, HIV-related variables, and BMD.
The population was 65% male and 85% Black, with a mean age of 50.3 years. The prevalence of osteoporosis at the total hip, femoral neck, or lumbar spine was 28%, with 5% having osteoporosis at the total hip, 6% at the femoral neck, and 25% at the spine. The mean Z-scores (standard deviation) were −0.42 (1.01) at the total hip, −0.16 (1.05) at the femoral neck, and −0.82 (1.55) at the lumbar spine. In multivariable models, controlled HIV replication (HIV RNA <400 copies/mL vs. ≥400 copies/mL) was associated with lower Z-scores (mean ± standard error) at the total hip (−0.44±0.17, p=0.01), femoral neck (−0.59±0.18, p=0.001), and spine (−0.98±0.27, p=0.0005). There was no association between degree of liver fibrosis and Z-score.
Osteoporosis was very common in this population of predominately African-American HIV/HCV co-infected patients, particularly at the spine. Lower BMD was associated with controlled HIV replication, but not liver disease severity.
hepatitis C; bone mineral density; hepatic fibrosis; HIV
Since 2003, U.S. organizations have recommended universal screening, rather than targeted screening, of HIV-infected persons for gonorrhea (NG) and Chlamydia (CT). Our objective was to determine whether wider testing resulting from these guidelines would produce an increase in NG/CT diagnoses.
We studied 3,283 patients receiving HIV care 1999–2007 in the Johns Hopkins Hospital HIV clinic. The two primary outcomes were: 1) the occurrence of any NG/CT testing in each year of care and 2) the occurrence of any positive result(s) in years of testing. The proportion of all patients in care who were diagnosed with NG/CT was defined as the number of patients with positive results divided by the number of patients in care. Trends were analyzed with repeated measures logistic regression.
The proportion of patients tested for NG/CT increased steadily from 0.12 in 1999 to 0.33 in 2007 (OR per year for being tested, 1.17 [1.15, 1.19]). The proportion positive among those tested decreased significantly after 2003 (OR per year 0.67 [0.55, 0.81]). The proportion of all patients in care diagnosed with NG/CT therefore remained generally stable 1999–2007 (OR per year 0.97 [0.91, 1.04]).
Universal annual screening, as implemented, did not increase the proportion of all patients in care who were diagnosed with NG/CT. Similarly low implementation rates have been reported in cross-sectional studies. If future efforts to enhance implementation do not yield increases in diagnoses, then guidelines focusing on targeted screening of high risk groups rather than universal screening may be warranted.
HIV prevention; health service research; Neisseria gonorrhoeae; Chlamydia trachomatis; screening; guidelines
Background. Screening for tuberculosis prior to highly active antiretroviral therapy (HAART) initiation is not routinely performed in low-incidence settings. Identifying factors associated with developing tuberculosis after HAART initiation could focus screening efforts.
Methods. Sixteen cohorts in the United States and Canada contributed data on persons infected with human immunodeficiency virus (HIV) who initiated HAART December 1995–August 2009. Parametric survival models identified factors associated with tuberculosis occurrence.
Results. Of 37,845 persons in the study, 145 were diagnosed with tuberculosis after HAART initiation. Tuberculosis risk was highest in the first 3 months of HAART (20 cases; 215 cases per 100,000 person-years; 95% confidence interval [CI]: 131–333 per 100,000 person-years). In a multivariate Weibull proportional hazards model, baseline CD4+ lymphocyte count <200 cells/mm3, black race, other nonwhite race, Hispanic ethnicity, and history of injection drug use were independently associated with tuberculosis risk. In addition, in a piecewise Weibull model, increased baseline HIV-1 RNA was associated with increased tuberculosis risk in the first 3 months; male sex tended to be associated with increased risk.
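The early-HAART incidence rate and its interval can be reconstructed from the reported counts. A sketch, assuming the standard exact (Garwood) Poisson confidence limits and back-calculating person-years from the reported rate:

```python
# Reconstructing the first-3-months tuberculosis rate and its exact
# Poisson 95% CI from the counts in the abstract: 20 cases at 215 per
# 100,000 person-years implies ~9,302 person-years of follow-up.
from scipy.stats import chi2

cases = 20
person_years = cases * 100_000 / 215           # back-calculated from the rate

rate = cases / person_years * 100_000          # cases per 100,000 PY
# Exact (Garwood) Poisson confidence limits for an observed count k:
#   lower = chi2.ppf(0.025, 2k) / 2,  upper = chi2.ppf(0.975, 2(k+1)) / 2
lower = chi2.ppf(0.025, 2 * cases) / 2 / person_years * 100_000
upper = chi2.ppf(0.975, 2 * (cases + 1)) / 2 / person_years * 100_000

print(f"{rate:.0f} per 100,000 PY (95% CI {lower:.0f}-{upper:.0f})")
```

This reproduces roughly 131–332 per 100,000 person-years, close to the reported 131–333; the small difference in the upper limit reflects rounding of the back-calculated person-years.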
Conclusions. Screening for active tuberculosis prior to HAART initiation should be targeted to persons with a baseline CD4 count <200 cells/mm3 or increased HIV-1 RNA, persons of nonwhite race or Hispanic ethnicity, persons with a history of injection drug use, and possibly men.
Determination of the prevalence of accumulated antiretroviral drug resistance among persons infected with human immunodeficiency virus (HIV) is complicated by the lack of routine measurement in clinical care. By using data from 8 clinic-based cohorts from the North American AIDS Cohort Collaboration on Research and Design, drug-resistance mutations from those with genotype tests were determined and scored using the Genotypic Resistance Interpretation Algorithm developed at Stanford University. For each year from 2000 through 2005, the prevalence was calculated using data from the tested subset, assumptions that incorporated clinical knowledge, and multiple imputation methods to yield a complete data set. A total of 9,289 patients contributed data to the analysis; 3,959 had at least 1 viral load above 1,000 copies/mL, of whom 2,962 (75%) had undergone at least 1 genotype test. Using these methods, the authors estimated that the prevalence of accumulated resistance to 2 or more antiretroviral drug classes had increased from 14% in 2000 to 17% in 2005 (P < 0.001). In contrast, the prevalence of resistance in the tested subset declined from 57% to 36% for 2 or more classes. The authors’ use of clinical knowledge and multiple imputation methods revealed trends in HIV drug resistance among patients in care that were markedly different from those observed using only data from patients who had undergone genotype tests.
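The multiple-imputation approach described ultimately yields one prevalence estimate per completed data set, which are then combined. A minimal sketch of the standard combining step (Rubin's rules), using made-up per-imputation estimates and variances that are not the study's:

```python
# Minimal sketch of Rubin's rules for pooling estimates across multiply
# imputed data sets. The per-imputation prevalence estimates and their
# variances below are MADE UP for illustration only.
import statistics

estimates = [0.16, 0.17, 0.15, 0.18, 0.17]  # hypothetical prevalence per imputation
variances = [4e-4, 5e-4, 4e-4, 6e-4, 5e-4]  # hypothetical within-imputation variances
m = len(estimates)

pooled = statistics.fmean(estimates)         # pooled point estimate
within = statistics.fmean(variances)         # average within-imputation variance
between = statistics.variance(estimates)     # between-imputation (sample) variance
total_var = within + (1 + 1 / m) * between   # Rubin's total variance

se = total_var ** 0.5
print(f"pooled prevalence {pooled:.3f} (SE {se:.4f})")
```

The between-imputation term inflates the variance to reflect uncertainty about the unobserved genotypes, which is why imputation-based prevalence estimates carry wider intervals than estimates restricted to the tested subset.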
antiretroviral therapy, highly active; drug resistance; genotype; HIV
We showed a dramatic decrease in the human immunodeficiency virus type 1 (HIV-1) ribonucleic acid (RNA) level in a community of patients who receive care in an urban region in the United States, suggesting a decreased risk of HIV transmission as treatment has improved in those under care.
(See the Editorial Commentary by Sax, on pages 605–608.)
Background. We have previously shown that as antiretroviral therapy has improved since 1995–1996, the likelihood of achieving virologic suppression has also improved. Antiretroviral therapy and treatment guidelines have continued to evolve, and we wished to determine the trend in human immunodeficiency virus type 1 (HIV-1) RNA levels over time among HIV-infected persons receiving care in our large urban HIV clinical practice in Baltimore, Maryland.
Methods. The HIV-1 RNA level was assessed each year from 1996 through 2010 at the date closest to 1 July for all patients in care and followed up in the Johns Hopkins HIV Clinical Cohort. The clinic population’s median HIV-1 RNA level and stratified threshold levels were plotted. The demographic characteristics of the population were also assessed over time.
Results. From 1996 (shortly after highly active antiretroviral therapy [HAART] was introduced) to 2010, the median HIV-1 RNA level decreased from 10,400 to <200 copies/mL. The proportion of patients with an HIV-1 RNA level >500 copies/mL decreased from 75% to only 16% during this same period. The population itself became older, included a higher proportion of women and a lower proportion of patients with injection drug use as a transmission risk, but was geographically stable. There was an increase in HAART use over time.
Discussion. Our results demonstrate the remarkable impact of increased use of and improved management with HAART in this urban HIV-infected population.
Late presentation to HIV clinical care increases an individual's risk of clinical events and death and decreases the likelihood of a successful response to highly active antiretroviral therapy (HAART). In Brazil, provision of HAART free of charge to all HIV-infected individuals could lead to increased testing and linkage to care.
We assessed the immune status of 2,555 patients who newly presented for HIV clinical care between 1997 and 2009 at the Johns Hopkins Clinical Cohort in Baltimore, USA, and at the Instituto de Pesquisa Clinica Evandro Chagas Clinical Cohort in Rio de Janeiro, Brazil. The mean change in the CD4 cell count per year was estimated using multivariate linear regression models.
Overall, from 1997 to 2009, 56% and 54% of the patients presented for HIV clinical care with CD4 count ≤ 350 cells/mm3 in Baltimore and Rio de Janeiro, respectively. On average, 75% of the patients presented with viral load > 10,000 copies/mL. In Rio de Janeiro only, the overall adjusted per year increase in the mean CD4 cell count was statistically significant [5 cells/mm3 (95% CI 1, 10 cells/mm3)].
We found that, over the years, the majority of patients presented late, that is, with a CD4 count ≤ 350 cells/mm3. Our findings indicate that, despite the availability of HAART for more than a decade and mass media campaigns encouraging HIV testing in both countries, the proportion of patients who start therapy at an advanced stage of disease remains high.
HIV/AIDS; late diagnosis/presentation; immunodeficiency; care