Recently, increasing attention has focused on making causal inference when interference is possible. In the presence of interference, treatment may have several types of effects. In this paper, we consider inference about such effects when the population consists of groups of individuals where interference is possible within groups but not between groups. A two stage randomization design is assumed where in the first stage groups are randomized to different treatment allocation strategies and in the second stage individuals are randomized to treatment or control conditional on the strategy assigned to their group in the first stage. For this design, the asymptotic distributions of estimators of the causal effects are derived when either the number of individuals per group or the number of groups grows large. Under certain homogeneity assumptions, the asymptotic distributions provide justification for Wald-type confidence intervals (CIs) and tests. Empirical results demonstrate the Wald CIs have good coverage in finite samples and are narrower than CIs based on either the Chebyshev or Hoeffding inequalities provided the number of groups is not too small. The methods are illustrated by two examples which consider the effects of cholera vaccination and an intervention to encourage voting.
causal inference; confidence interval; interference; Normal mixture; randomization
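The comparison of interval widths can be sketched with a toy calculation. Assuming i.i.d. outcomes bounded in [0, 1] (a simplification; the paper's variance estimators account for the two-stage randomization design), the three half-widths are:

```python
import math

def ci_half_widths(var, n, alpha=0.05):
    """Half-widths of three (1 - alpha) CIs for a mean of outcomes in [0, 1].

    Wald uses a normal approximation; Chebyshev and Hoeffding are
    distribution-free. Illustrative only -- the paper's Wald intervals rest
    on the derived asymptotic distributions, not on an i.i.d. assumption.
    """
    z = 1.959963984540054  # 97.5th percentile of the standard normal
    wald = z * math.sqrt(var / n)
    chebyshev = math.sqrt(var / (n * alpha))              # P(|err| > t) <= var / (n t^2)
    hoeffding = math.sqrt(math.log(2 / alpha) / (2 * n))  # outcomes bounded in [0, 1]
    return wald, chebyshev, hoeffding

# Worst-case variance 0.25 for a [0, 1] outcome, n = 100 individuals:
w, c, h = ci_half_widths(var=0.25, n=100)
# Here Wald < Hoeffding < Chebyshev, matching the abstract's observation
# that the Wald CIs are narrowest.
```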
A pooled analysis of individual data from >5000 human immunodeficiency virus type 1 (HIV-1)–infected mothers and their infants from Africa and India who participated in 5 randomized trials shows that extended prophylaxis with nevirapine or with nevirapine and zidovudine significantly reduces postnatal HIV-1 infection.
Background. In resource-limited settings, mothers infected with human immunodeficiency virus type 1 (HIV-1) face a difficult choice: breastfeed their infants but risk transmitting HIV-1 or not breastfeed their infants and risk the infants dying of other infectious diseases or malnutrition. Recent results from observational studies and randomized clinical trials indicate daily administration of nevirapine to the infant can prevent breast-milk HIV-1 transmission.
Methods. Data from 5396 mother-infant pairs who participated in 5 randomized trials where the infant was HIV-1 negative at birth were pooled to estimate the efficacy of infant nevirapine prophylaxis to prevent breast-milk HIV-1 transmission. Four daily regimens were compared: nevirapine for 6 weeks, 14 weeks, or 28 weeks, or nevirapine plus zidovudine for 14 weeks.
Results. The estimated 28-week risk of HIV-1 transmission was 5.8% (95% confidence interval [CI], 4.3%–7.9%) for the 6-week nevirapine regimen, 3.7% (95% CI, 2.5%–5.4%) for the 14-week nevirapine regimen, 4.8% (95% CI, 3.5%–6.7%) for the 14-week nevirapine plus zidovudine regimen, and 1.8% (95% CI, 1.0%–3.1%) for the 28-week nevirapine regimen (log-rank test for trend, P < .001). Cox regression models with nevirapine as a time-varying covariate, stratified by trial site and adjusted for maternal CD4 cell count and infant birth weight, indicated that nevirapine reduces the rate of HIV-1 infection by 71% (95% CI, 58%–80%; P < .001) and reduces the rate of HIV infection or death by 58% (95% CI, 45%–69%; P < .001).
Conclusions. Extended prophylaxis with nevirapine or with nevirapine and zidovudine significantly reduces postnatal HIV-1 infection. Longer duration of prophylaxis results in a greater reduction in the risk of infection.
breast milk; HIV; nevirapine
An intensive, prospective, open-label pharmacokinetic (PK) study in a subset of HIV-infected mothers and their uninfected infants enrolled in the Breastfeeding, Antiretroviral, and Nutrition study was performed to describe drug exposure and antiviral response.
Women using Combivir® [zidovudine (ZDV) + lamivudine (3TC)] + Aluvia® [lopinavir/ritonavir (LPV/RTV)] were enrolled. Breast milk (BM) and mother and infant plasma (MP, IP) samples were obtained over 6 hours after observed dosing at 6, 12, or 24 weeks post-partum for drug concentrations and HIV RNA.
Thirty mother/infant pairs (10 each at 6, 12, and 24 weeks post-partum) were enrolled. Relative to MP, BM concentrations of ZDV and 3TC were 35% and 21% higher, while LPV and RTV were 80% lower. Only 3TC was detected in IP, with concentrations 96% and 98% lower than MP and BM, respectively. Concentrations in all matrices were similar at 6–24 weeks. The majority (98.3%) of BM concentrations exceeded the wild-type HIV IC50, with one sample having detectable virus. There was no association between PK parameters and MP or BM HIV RNA.
ZDV and 3TC concentrated in BM while LPV and RTV did not, possibly due to protein binding and drug transporter affinity. Undetectable to low ARV concentrations in IP suggest that prevention of transmission while breastfeeding may be due to ARV effects on systemic or BM HIV RNA in the mother. Low IP 3TC exposure may predispose an infected infant to HIV resistance, necessitating testing and treating infants early.
In many biological and environmental studies, measured data are subject to a limit of detection. The limit of detection is generally defined as the lowest concentration of analyte that can be differentiated from a blank sample with some certainty. Data falling below the limit of detection are left-censored: they are known only to lie below the lowest level the measuring device can reliably quantify. A great deal of interest lies in estimating the limit of detection for a particular measurement device. In this paper we propose a change-point model to estimate the limit of detection using data from an experiment with known analyte concentrations. Estimation of the limit of detection proceeds by a two-stage maximum likelihood method. Extensions are considered that allow for censored measurements and data from multiple experiments. A simulation study demonstrates that in some settings the change-point model provides less biased estimates of the limit of detection than conventional methods. The proposed method is then applied to data from an HIV pilot study.
change point; limit of detection; linear calibration curve; two-stage maximum likelihood
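A minimal sketch of the two-stage (profiled) maximum likelihood idea, assuming a flat-then-linear mean model with Gaussian errors; the paper's exact model, likelihood, and censoring handling may differ:

```python
import numpy as np

def fit_change_point(x, y, grid):
    """Profile least squares for a flat-then-linear change-point model.

    Assumes E[Y | x] = b0 for x <= c and b0 + b1 * (x - c) for x > c, with
    Gaussian errors, so minimizing the RSS over a grid of candidate change
    points c is equivalent to maximum likelihood. Simplified sketch: the
    paper also accommodates censored measurements and multiple experiments.
    """
    best = (np.inf, None, None)
    for c in grid:
        z = np.maximum(x - c, 0.0)                    # hinge basis at candidate c
        X = np.column_stack([np.ones_like(x), z])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # stage 1: fit given c
        resid = y - X @ beta
        rss = float(resid @ resid)
        if rss < best[0]:                             # stage 2: profile over c
            best = (rss, float(c), beta)
    return best[1], best[2]                           # change point, (b0, b1)

# Synthetic calibration data with a true change point at concentration 3:
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.where(x <= 3, 1.0, 1.0 + 0.8 * (x - 3)) + rng.normal(0, 0.1, x.size)
c_hat, _ = fit_change_point(x, y, grid=np.linspace(0.5, 9.5, 181))
```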
If a vaccine does not protect individuals completely against infection, it could still reduce infectiousness of infected vaccinated individuals to others. Typically, vaccine efficacy for infectiousness is estimated based on contrasts between the transmission risk to susceptible individuals from infected vaccinated individuals compared with that from infected unvaccinated individuals. Such estimates are problematic, however, because they are subject to selection bias and do not have a causal interpretation. Here, we develop causal estimands for vaccine efficacy for infectiousness for four different scenarios of populations of transmission units of size two. These causal estimands incorporate both principal stratification, based on the joint potential infection outcomes under vaccine and control, and interference between individuals within transmission units. In the most general scenario, both individuals can be exposed to infection outside the transmission unit and both can be assigned either vaccine or control. The three other scenarios are special cases of the general scenario where only one individual is exposed outside the transmission unit or can be assigned vaccine. The causal estimands for vaccine efficacy for infectiousness are well defined only within certain principal strata and, in general, are identifiable only with strong unverifiable assumptions. Nonetheless, the observed data do provide some information, and we derive large sample bounds on the causal vaccine efficacy for infectiousness estimands. An example of the type of data observed in a study to estimate vaccine efficacy for infectiousness is analyzed in the causal inference framework we developed.
causal inference; principal stratification; interference; infectious disease; vaccine
We evaluated the efficacy of a peer-educator network intervention as a strategy to reduce HIV acquisition among injection drug users (IDUs) and their drug and/or sexual networks. A randomized controlled trial was conducted in St. Petersburg, Russia among IDU index participants and their risk network participants. Network units were randomized to the control or experimental intervention. Only the experimental index participants received training sessions to communicate risk reduction techniques to their network members. The analysis includes 76 index and 84 network participants who were HIV uninfected. The main outcome measure was HIV seroconversion. The incidence rates in the control and experimental groups were 19.57 (95% CI 10.74–35.65) and 7.76 (95% CI 3.51–17.19) cases per 100 person-years, respectively. The incidence rate ratio (IRR) was 0.41 (95% CI 0.15–1.08), without a statistically significant difference between the two groups (log-rank test statistic χ2 = 2.73, permutation p value = 0.16). The retention rate was 67%, with a third of the loss due to incarceration or death. The results suggest a promising trend that this strategy may reduce the acquisition of HIV among IDUs.
Injection drug users; Russia; HIV prevention; Network
We estimated the incidence of watery diarrhea in the community before and after introduction of the pentavalent rotavirus vaccine in León, Nicaragua. A random sample of households was selected before and after rotavirus vaccine introduction. All children < 5 years of age in selected households were eligible for inclusion. Children were followed every 2 weeks for watery diarrhea episodes. The incidence rate was estimated as numbers of episodes per 100 child-years of exposure time. A mixed effects Poisson regression model was fit to compare incidence rates in the pre-vaccine and vaccine periods. The pre-vaccine cohort (N = 726) experienced 36 episodes per 100 child-years, and the vaccine cohort (N = 826) experienced 25 episodes per 100 child-years. The adjusted incidence rate ratio was 0.60 (95% confidence interval [CI] 0.40, 0.91) during the vaccine period versus the pre-vaccine period, indicating a lower incidence of watery diarrhea in the community during the vaccine period.
Estimation of treatment effects in randomized studies is often hampered by possible selection bias induced by conditioning on or adjusting for a variable measured post-randomization. One approach to obviate such selection bias is to consider inference about treatment effects within principal strata, i.e., principal effects. A challenge with this approach is that, without strong assumptions, principal effects are not identifiable from the observable data. In settings where such assumptions are dubious, identifiable large sample bounds may be the preferred target of inference. In practice these bounds may be wide and not particularly informative. In this work we consider whether bounds on principal effects can be improved by adjusting for a categorical baseline covariate. We consider adjusted bounds, which are shown never to be wider than the unadjusted bounds. Necessary and sufficient conditions are given under which the adjusted bounds are sharper (i.e., narrower) than the unadjusted bounds. The methods are illustrated using data from a recent, large study of interventions to prevent mother-to-child transmission of HIV through breastfeeding. Using a baseline covariate indicating low birth weight, the estimated adjusted bounds for the principal effect of interest are 63% narrower than the estimated unadjusted bounds.
Bounds; Causal Effects; Partial Identifiability; Potential Outcomes; Principal Strata
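The unadjusted large-sample bounds can be illustrated with a Zhang-Rubin-style trimming sketch. The monotonicity assumption and the toy numbers below are illustrative only; the paper's bounds are more general, and its covariate-adjusted bounds apply the same trimming within strata of a baseline covariate before averaging:

```python
import numpy as np

def trimming_bounds(y_treated_survivors, p_surv_treated, p_surv_control):
    """Sample-analogue trimming bounds for E[Y(1) | always-survivor].

    Assumes monotonicity (treatment never causes the post-randomization
    selection event), so a fraction q = p0/p1 of treated "survivors" are
    always-survivors. The bounds are the means of the lowest and highest
    q-fractions of their outcomes.
    """
    y = np.sort(np.asarray(y_treated_survivors, dtype=float))
    q = p_surv_control / p_surv_treated      # always-survivor share among treated survivors
    k = max(1, int(round(q * y.size)))       # number of outcomes retained in each tail
    return y[:k].mean(), y[-k:].mean()       # (lower bound, upper bound)

# Hypothetical outcomes among treated survivors, survival rates 0.9 vs 0.6:
lo, hi = trimming_bounds([0.1, 0.4, 0.5, 0.7, 0.9, 1.0],
                         p_surv_treated=0.9, p_surv_control=0.6)
# -> bounds (0.425, 0.775): the mean of the bottom 4 and top 4 of 6 outcomes.
```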
The healthy worker survivor bias is well-recognized in occupational epidemiology. Three component associations are necessary for this bias to occur: i) prior exposure and employment status; ii) employment status and subsequent exposure; and iii) employment status and mortality. Together, these associations result in time-varying confounding affected by prior exposure. We illustrate how these associations can be assessed using standard regression methods.
We use data from 2975 asbestos textile factory workers hired between January 1940 and December 1965 and followed for lung cancer mortality through December 2001.
At entry, median age was 24 years, with 42% female and 19% non-Caucasian. Over follow-up, 21% and 17% of person-years were classified as at work and exposed to any asbestos, respectively. For a 100 fiber-year/mL increase in cumulative asbestos, the covariate-adjusted hazard of leaving work decreased by 52% (95% confidence interval [CI], 46–58). The association between employment status and subsequent asbestos exposure was strong due to nonpositivity: 88.3% of person-years at work (95% CI, 87.0–89.5) were classified as exposed to any asbestos; no person-years were classified as exposed to asbestos after leaving work. Finally, leaving active employment was associated with a 48% (95% CI, 9–71) decrease in the covariate-adjusted hazard of lung cancer mortality.
We found strong associations for the components of the healthy worker survivor bias in these data. Standard methods, which fail to properly account for time-varying confounding affected by prior exposure, may provide biased estimates of the effect of asbestos on lung cancer mortality under these conditions.
Epidemiologic methods; Occupational health; Healthy worker effect; Bias; Lung cancer; Mortality
To investigate the intraindividual pharmacokinetics of total (protein bound + unbound) and unbound lopinavir/ritonavir (LPV/RTV) and to assess whether the pediatric formulation (100 mg/25 mg) can overcome any pregnancy-associated changes.
Prospective longitudinal pharmacokinetic (PK) study
HIV-infected pregnant antiretroviral therapy-naïve and experienced women receiving LPV/RTV 400 mg/100 mg tablets twice daily. Intensive PK evaluations were performed at 20–24 weeks (PK1), 30 weeks (PK2), followed by an empiric dose increase using the pediatric formulation (100 mg/25 mg twice daily), 32 weeks (PK3), and 8 weeks postpartum (PK4).
Twelve women completed pre-specified PK evaluations. Median (range) age was 28 (1–35) years and baseline BMI was 32 (19–41) kg/m2. During pregnancy, the total area under the concentration–time curve (AUC0–12hr) for LPV was significantly lower than postpartum [PK1, PK2, or PK3 vs. PK4, p = 0.005]. Protein-unbound LPV AUC0–12hr remained unchanged during pregnancy [PK1: 1.6 (1.3–1.9) vs. PK2: 1.6 (1.3–1.9) µg*hr/mL, p = 0.4] despite a 25% dose increase [PK2 vs. PK3: 1.8 (1.3–2.1) µg*hr/mL, p = 0.5]. Protein-unbound LPV predose concentrations (C12h) did not significantly change despite the dose increase [PK2: 0.10 (0.08–0.15) vs. PK3: 0.12 (0.10–0.15) µg/mL, p = 0.09]. Albumin and LPV AUC0–12h fraction unbound were correlated (rs = 0.3, p = 0.03).
Total LPV exposure was significantly decreased throughout pregnancy despite the increased dose. However, the exposure of unbound LPV did not change significantly regardless of trimester or dose. Predose concentrations of unbound LPV were not affected by the additional dose and were 70-fold greater than the minimum efficacy concentration. These findings suggest dose adjustments may not be necessary in all HIV-infected pregnant women.
HIV; pregnancy; lopinavir/ritonavir; protein unbound; pharmacokinetics
Obese adults have a greater risk of morbidity and mortality from infection with pandemic H1N1 influenza A virus (pH1N1). The objective of the present study was to elucidate the specific mechanisms by which obesity and overweight impact the cellular immune response to pH1N1.
Design and Methods
We stimulated peripheral blood mononuclear cells from healthy weight, overweight, and obese individuals ex vivo with live pH1N1 and then measured markers of activation and function using flow cytometry and cytokine secretion using cytometric bead array assays.
Our data indicate that CD4+ and CD8+ T cells from overweight and obese individuals expressed lower levels of CD69, CD28, CD40 ligand, and interleukin-12 receptor, and produced lower levels of interferon-γ and granzyme B, compared to healthy weight individuals, suggesting deficiencies in activation and function. Dendritic cells from the three groups expressed similar levels of major histocompatibility complex-II, CD40, CD80, and CD86, and produced similar levels of interleukin-12.
The defects in CD4+ and CD8+ T cells may contribute to the increased morbidity and mortality from pH1N1 in obese individuals. These data also provide evidence that both overweight and obesity cause impairments in immune function.
Obesity; overweight; pH1N1 influenza; CD4+ T cells; CD8+ T cells; dendritic cells
The relative contribution of helminth and malaria infections to anemia in pregnancy is uncertain. While measures to protect pregnant women against malaria have been scaled up, interventions against helminths have received much less attention. In this study, we determine the relative impact of helminths and malaria on maternal anemia.
A prospective observational study was conducted in coastal Kenya among a cohort of pregnant women who were recruited at their first antenatal care (ANC) visit and tested for malaria, hookworm, and other parasitic infections and anemia at enrollment. All women enrolled in the study received presumptive treatment with sulfadoxine-pyrimethamine, iron and multi-vitamins and women diagnosed with helminthic infections were treated with albendazole. Women delivering a live, term birth, were also tested for maternal anemia, fetal anemia and presence of infection at delivery.
Of the 706 women studied, 27% had moderate/severe anemia at the first ANC visit and 71% were anemic overall. The most prevalent infections were hookworm (24%), urogenital schistosomiasis (17%), trichuriasis (10%), and malaria (9%). In adjusted and unadjusted analyses, moderate/severe anemia at the first ANC visit was associated with higher intensities of hookworm infection and of microscopy-detected P. falciparum malaria. At delivery, 34% of women had moderate/severe anemia, and 18% of infants had cord hemoglobin levels consistent with fetal anemia. While none of the maternal infections was significantly associated with fetal anemia, moderate/severe maternal anemia was associated with fetal anemia.
More than one quarter of women receiving standard ANC with intermittent preventive treatment in pregnancy (IPTp) for malaria had moderate/severe anemia in pregnancy, and rates of parasitic infection were high. Thus, addressing the role of co-infections such as hookworm, as well as under-nutrition, and their contribution to anemia is needed.
International guidelines recommend routine prevention and treatment measures that are safe and effective during pregnancy to reduce hookworm, malaria, and other infections among pregnant women living in geographic areas where these infections are prevalent. Despite their effectiveness, programs to address common infections such as hookworm, schistosomiasis, and malaria during pregnancy have not been widely adopted. Hookworm, malaria, and other infections have been associated with anemia in children, but studies of the impact of these infections on anemia in pregnancy have been less conclusive. This study was undertaken to evaluate the prevalence of parasitic infections among women attending antenatal care that provided the nationally recommended malaria preventive treatment program in coastal Kenya. At the first ANC visit, more than 70% of women were anemic, nearly one-fourth had hookworm, and about 10% had malaria. Women with high levels of hookworm or malaria infection were at risk of anemia.
We consider the optimal configuration of a square array group testing algorithm (denoted A2) to minimize the expected number of tests per specimen. For prevalence greater than 0.2498, individual testing is shown to be more efficient than A2. For prevalence less than 0.2498, closed form lower and upper bounds on the optimal group sizes for A2 are given. Arrays of dimension 2 × 2, 3 × 3, and 4 × 4 are shown to never be optimal. The results are illustrated by considering the design of a specimen pooling algorithm for detection of recent HIV infections in Malawi.
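A Monte Carlo sketch of the expected number of tests per specimen under the square array algorithm, assuming a perfect assay (the paper works with exact expressions and closed-form bounds rather than simulation):

```python
import numpy as np

def a2_tests_per_specimen(n, p, reps=2000, seed=1):
    """Monte Carlo estimate of expected tests per specimen for the square
    array algorithm A2 on an n x n array with prevalence p.

    Assumes a perfect assay: all n row pools and n column pools are tested
    (2n tests), then every specimen lying in both a positive row and a
    positive column is tested individually.
    """
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(reps):
        a = rng.random((n, n)) < p            # true infection statuses
        rows = a.any(axis=1)                  # positive row pools
        cols = a.any(axis=0)                  # positive column pools
        total += 2 * n + int((rows[:, None] & cols[None, :]).sum())
    return total / (reps * n * n)

# At prevalence 0.01 a 10 x 10 array needs far fewer than one test per
# specimen, versus exactly one under individual testing.
eff = a2_tests_per_specimen(n=10, p=0.01)
```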
Antiretroviral therapy (ART) can reduce HIV levels in plasma to undetectable levels, but rather little is known about the effects of ART outside of the peripheral blood regarding persistent virus production in tissue reservoirs. Understanding the dynamics of ART-induced reductions in viral RNA (vRNA) levels throughout the body is important for the development of strategies to eradicate infectious HIV from patients. Essential to a successful eradication therapy is a component capable of killing persisting HIV infected cells during ART. Therefore, we determined the in vivo efficacy of a targeted cytotoxic therapy to kill infected cells that persist despite long-term ART. For this purpose, we first characterized the impact of ART on HIV RNA levels in multiple organs of bone marrow-liver-thymus (BLT) humanized mice and found that antiretroviral drug penetration and activity was sufficient to reduce, but not eliminate, HIV production in each tissue tested. For targeted cytotoxic killing of these persistent vRNA+ cells, we treated BLT mice undergoing ART with an HIV-specific immunotoxin. We found that compared to ART alone, this agent profoundly depleted productively infected cells systemically. These results offer proof-of-concept that targeted cytotoxic therapies can be effective components of HIV eradication strategies.
Antiretroviral therapy (ART) improves the quality of life for HIV infected individuals. However, ART is currently a lifelong commitment because HIV persists during treatment despite being suppressed below detection. If therapy is stopped, the HIV reappears. A concerted effort is ongoing to develop new eradication therapies to prevent virus rebound, but there are challenges to be overcome. Our work is a major step forward in this process. We measured persistent HIV throughout the body during ART using bone marrow/liver/thymus (BLT) humanized mice, a model validated to study HIV persistence. HIV infected BLT mice were treated with tenofovir, emtricitabine and raltegravir. Despite documented tissue penetration by these drugs, we found that HIV expression persists in cells isolated from all the tissues analyzed (bone marrow, thymus, spleen, lymph nodes, liver, lung, intestines and peripheral blood cells). We therefore complemented ART with an immunotoxin that specifically kills HIV expressing cells while leaving other cells untouched. Our results demonstrate a dramatic reduction in persistent HIV throughout the body resulting from the killing of virus producing cells. Thus, our study provides new insights into the locations of HIV persistence during ART and a demonstration that persistent HIV can be successfully targeted inside the body.
Simulation studies were conducted to estimate the statistical power of repeated low-dose challenge experiments in non-human primates to detect a candidate HIV vaccine’s effect. The effect of various design parameters on power was explored. Simulation results indicate repeated low-dose challenge studies with total sample size 50 (25 per arm) typically provide adequate power to detect a 50% reduction in the per-exposure probability of infection due to vaccination. Power generally increases with the maximum number of allowable challenges per animal, the per-exposure risk of infection in controls, and the proportion susceptible to infection.
HIV; Macaque; Vaccine
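A simplified version of such a power simulation. The analysis here is a two-sample z-test on per-exposure infection probabilities, one illustrative choice among the analyses a real study might use, and the design parameters are assumptions drawn from the abstract's example (25 animals per arm, 50% vaccine effect):

```python
import math
import random

def rldc_power(n_per_arm=25, max_challenges=10, p_control=0.30, ve=0.5,
               alpha=0.05, sims=1000, seed=2):
    """Simulated power of a repeated low-dose challenge study.

    Each animal is challenged until infection or until max_challenges is
    reached; the per-exposure infection probability is p_control in the
    control arm and (1 - ve) * p_control under vaccine.
    """
    rng = random.Random(seed)
    z_crit = 1.959963984540054               # two-sided alpha = 0.05
    rejections = 0
    for _ in range(sims):
        counts = []
        for p in (p_control, (1 - ve) * p_control):
            infections = exposures = 0
            for _ in range(n_per_arm):
                for _ in range(max_challenges):
                    exposures += 1
                    if rng.random() < p:     # infection ends this animal's challenges
                        infections += 1
                        break
            counts.append((infections, exposures))
        (x0, n0), (x1, n1) = counts
        pooled = (x0 + x1) / (n0 + n1)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n0 + 1 / n1))
        if se > 0 and abs(x0 / n0 - x1 / n1) / se > z_crit:
            rejections += 1
    return rejections / sims

power = rldc_power()
```

Varying `max_challenges`, `p_control`, or `n_per_arm` in this sketch reproduces the qualitative trends the simulations report: power rises with each of them.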
Assessing per-protocol treatment efficacy on a time-to-event endpoint is a common objective of randomized clinical trials. The typical analysis applies the same method used for the intention-to-treat analysis (e.g., standard survival analysis) to the subgroup meeting protocol adherence criteria. However, due to potential post-randomization selection bias, this analysis may be misleading about treatment efficacy. Moreover, while there is extensive literature on methods for assessing causal treatment effects in compliers, these methods do not apply to a common class of trials where a) the primary objective compares survival curves, b) it is inconceivable to assign participants to be adherent and event-free before adherence is measured, and c) the exclusion restriction assumption fails to hold. HIV vaccine efficacy trials, including the recent RV144 trial, exemplify this class, because many primary endpoints (e.g., HIV infections) occur before adherence is measured, and nonadherent subjects who receive some of the planned immunizations may be partially protected. Therefore, we develop methods for assessing per-protocol treatment efficacy for this problem class, considering three causal estimands of interest. Because these estimands are not identifiable from the observable data, we develop nonparametric bounds and semiparametric sensitivity analysis methods that yield estimated ignorance and uncertainty intervals. The methods are applied to RV144.
As-treated; Bounds; Causal inference; Exclusion restriction; Ignorance region; Intention to treat; Principal stratification; Selection bias; Survival analysis
Background. Limited data exist on cotrimoxazole prophylactic treatment (CPT) in pregnant women, including protection against malaria versus standard intermittent preventive therapy with sulfadoxine-pyrimethamine (IPTp). Methods. Using observational data, we examined the effect of CPT in HIV-infected pregnant women on malaria during pregnancy, low birth weight, and preterm birth using proportional hazards, logistic, and log binomial regression, respectively. We used linear regression to assess the effect of CPT on CD4 count. Results. Data from 468 CPT-exposed and 768 CPT-unexposed women were analyzed. CPT was associated with protection against malaria versus IPTp (hazard ratio: 0.35, 95% confidence interval [CI]: 0.20, 0.60). After adjustment for time period this effect was not statistically significant (adjusted hazard ratio: 0.66, 95% CI: 0.28, 1.52). Among women receiving and not receiving CPT, rates of low birth weight (7.1% versus 7.6%) and preterm birth (23.5% versus 23.6%) were similar. CPT was associated with lower CD4 counts 24 weeks postpartum in women receiving antiretrovirals (−77.6 cells/μL, 95% CI: −125.2, −30.1) and in women not receiving antiretrovirals (−33.7 cells/μL, 95% CI: −58.6, −8.8). Conclusions. Compared to IPTp, CPT provided comparable protection against malaria in HIV-infected pregnant women, with similar rates of preterm birth and low birth weight. Possible implications of CPT-associated lower CD4 counts postpartum warrant further examination.
In randomized trials to prevent breast milk transmission of human immunodeficiency virus (HIV) from mother to infant, investigators are often interested in assessing the effect of a treatment or intervention on the cumulative risk of HIV infection by time (age) t in infants who are alive and uninfected at a certain time point τ0 < t. Such comparisons are challenging for two reasons. First, infants are typically randomized at birth (time 0 < τ0) such that comparisons between trial arms among the subset of infants alive and uninfected at τ0 are subject to selection bias. Second, in most mother-to-child transmission (MTCT) trials competing risks are often present, such as death or cessation of breastfeeding prior to HIV infection. In this paper we present methods for assessing the causal effect of a treatment on competing risk outcomes within principal strata. In MTCT trials, the causal effect of interest is that of treatment on the risk of HIV infection by time t > τ0 within the principal stratum of infants who would be alive and uninfected by τ0 regardless of randomization assignment. Large sample non-parametric bounds and a semi-parametric sensitivity analysis model are developed for drawing inference about this causal effect. A simulation study is presented demonstrating that the proposed methods perform well in finite samples. The proposed methods are applied to a large, recent MTCT trial.
Causal inference; Infectious diseases; Principal stratification; Sensitivity analysis
The p16INK4a tumor suppressor gene is a mediator of cellular senescence and has been suggested to be a biomarker of ‘molecular’ age in several tissues including T-cells. To determine the association of both active and suppressed HIV infection with T-cell aging, T-cell p16INK4a expression was compared between 60 HIV+ suppressed subjects, 23 HIV+ untreated subjects, and 18 contemporaneously collected HIV-negative controls, as well as 148 HIV-negative historical samples. Expression did not correlate with chronologic age in untreated HIV+ patients, consistent with an effect of active HIV replication on p16INK4a expression. In patients on cART with suppressed viral loads, however, p16INK4a levels were similar to uninfected controls and correlated with chronologic age, with a trend toward an inverse correlation with CD4 count. These data show that p16INK4a is a reliable biomarker of T cell aging in HIV+ patients with suppressed viral loads and suggest that poor CD4 cell recovery on cART may be associated with increased T-cell expression of p16INK4a, a marker of cellular senescence.
Background. Little is known about type-specific associations between prevalent human papillomavirus (HPV) infections and risk of acquiring other HPV types in men. Data on natural clustering of HPV types are needed as a prevaccine distribution to which postvaccine data can be compared.
Methods. Using data from a randomized controlled trial of male circumcision in Kisumu, Kenya, adjusted mean survival ratios were estimated for acquisition of any-HPV, high-risk (HR) HPV, and individual HR-HPV types among men uninfected as compared to those infected with vaccine-relevant HPV types 16, 18, 31, 45, 6, or 11 at baseline.
Results. Among 1097 human immunodeficiency virus–negative, uncircumcised men, 2303 incident HPV infections were detected over 2534 person-years of follow-up. Although acquisition of individual HR-HPV types varied by baseline HPV type, there was no clear evidence of shorter times to acquisition among men without vaccine-relevant HPV-16, -18, -31, -45, -6, or -11 infections at baseline, as compared to men who did have these infections at baseline.
Conclusions. These prospective data on combinations of HPV infections over time do not suggest the potential for postvaccination HPV type replacement. Future surveillance studies are needed to definitively determine whether elimination of HPV types by vaccination will alter the HPV type distribution in the population.
A fundamental assumption usually made in causal inference is that of no interference between individuals (or units); that is, the potential outcomes of one individual are assumed to be unaffected by the treatment assignment of other individuals. However, in many settings, this assumption obviously does not hold. For example, in the dependent happenings of infectious diseases, whether one person becomes infected depends on who else in the population is vaccinated. In this article, we consider a population of groups of individuals where interference is possible between individuals within the same group. We propose estimands for direct, indirect, total, and overall causal effects of treatment strategies in this setting. Relations among the estimands are established; for example, the total causal effect is shown to equal the sum of direct and indirect causal effects. Using an experimental design with a two-stage randomization procedure (first at the group level, then at the individual level within groups), unbiased estimators of the proposed estimands are presented. Variances of the estimators are also developed. The methodology is illustrated in two different settings where interference is likely: assessing causal effects of housing vouchers and of vaccines.
Group-randomized trials; Potential outcomes; Stable unit treatment value assumption; SUTVA; Vaccine
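The unbiased estimators under the two-stage design can be sketched as follows. The allocation labels `psi` and `phi` and the simulated data are hypothetical; each population average potential outcome Y(z; s) is estimated by averaging, over groups assigned strategy s in stage one, the group-level mean outcome among individuals assigned treatment z in stage two:

```python
import numpy as np

def two_stage_estimates(groups):
    """Estimators of direct, indirect, total, and overall effects under a
    two-stage randomization design (a sketch with hypothetical strategy
    labels 'psi' and 'phi').

    `groups` is a list of (allocation, treat, y) tuples: the stage-one
    strategy label, a 0/1 array of stage-two individual assignments, and
    the individual outcomes.
    """
    def ybar(z, s):  # estimate of Y(z; s): group-level means, then average
        means = [y[treat == z].mean()
                 for alloc, treat, y in groups if alloc == s]
        return float(np.mean(means))

    direct = ybar(1, 'phi') - ybar(0, 'phi')      # DE(phi)
    indirect = ybar(0, 'phi') - ybar(0, 'psi')    # IE(psi, phi)
    total = ybar(1, 'phi') - ybar(0, 'psi')       # TE(psi, phi)
    overall = (np.mean([y.mean() for a, t, y in groups if a == 'phi'])
               - np.mean([y.mean() for a, t, y in groups if a == 'psi']))
    return direct, indirect, total, overall

# Hypothetical population: 20 groups of 50; 'phi' allocates 70% to
# treatment, 'psi' allocates 30%; treatment halves outcomes on average.
rng = np.random.default_rng(3)
groups = []
for g in range(20):
    alloc = 'phi' if g < 10 else 'psi'
    cover = 0.7 if alloc == 'phi' else 0.3
    treat = (rng.random(50) < cover).astype(int)
    y = rng.random(50) * (1 - 0.5 * treat)
    groups.append((alloc, treat, y))
de, ie, te, oe = two_stage_estimates(groups)
# The estimators telescope exactly as the estimands do: TE = DE + IE.
```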
Racial differences in antiretroviral treatment responses remain incompletely explained and may be a consequence of differential pharmacokinetics (PK) associated with race. Raltegravir, an inhibitor of HIV-1 integrase, is commonly used in the treatment of HIV-infected patients, many of whom are African-American. However, there are few data regarding the PK of raltegravir in African-Americans. HIV-infected men and women, self-described as African-American and naive to antiretroviral therapy were treated with raltegravir (RAL) at 400 mg twice a day, plus a fixed dose of tenofovir-emtricitabine (TDF/FTC) at 300 mg/200 mg once daily. Intensive PK sampling was conducted over 24 h at week 4. Drug concentrations at two trough values of 12 and 24 h after dosing (C12 and C24), area under the concentration-curve values (AUC), maximum drug concentration (Cmax), and the time at which this concentration occurred (Tmax) in plasma were estimated with noncompartmental pharmacokinetic methods and compared to data from a subset of white subjects randomized to the RAL twice a day (plus TDF/FTC) arm of the QDMRK study, a phase III study of the safety and efficacy of once daily versus twice daily RAL in treatment naive patients. A total of 38 African-American participants were enrolled (90% male) into the REAL cohort with the following median baseline characteristics: age of 36 years, body mass index (BMI) of 23 kg/m2, and a CD4 cell count of 339/ml. Plasma HIV RNA levels were below 200 copies/ml in 95% of participants at week 4. The characteristics of the 16 white QDMRK study participants were similar, although fewer (69%) were male, the median age was higher (45 years), and BMI was lower (19 kg/m2). There was considerable interindividual variability in RAL concentrations in both cohorts. Median C12 in REAL was 91 ng/ml (range, 10 to 1,386) and in QDMRK participants was 128 ng/ml (range, 15 to 1,074). 
The Cmax median concentration was 1,042 ng/ml (range, 196 to 10,092) for REAL and 1,360 ng/ml (range, 218 to 9,701) for QDMRK. There were no significant differences in any RAL PK parameter between these cohorts of African-American and white individuals. Based on plasma PK, and with similar adherence rates, the performance of RAL among HIV-infected African-Americans should be no different than that of infected patients who are white.