1.  On a Closed-Form Doubly Robust Estimator of the Adjusted Odds Ratio for a Binary Exposure 
American Journal of Epidemiology  2013;177(11):1314-1316.
Epidemiologic studies often aim to estimate the odds ratio for the association between a binary exposure and a binary disease outcome. Because confounding bias is of serious concern in observational studies, investigators typically estimate the adjusted odds ratio in a multivariate logistic regression which conditions on a large number of potential confounders. It is well known that modeling error in specification of the confounders can lead to substantial bias in the adjusted odds ratio for exposure. As a remedy, Tchetgen Tchetgen et al. (Biometrika. 2010;97(1):171–180) recently developed so-called doubly robust estimators of an adjusted odds ratio by carefully combining standard logistic regression with reverse regression analysis, in which exposure is the dependent variable and both the outcome and the confounders are the independent variables. Double robustness implies that only one of the 2 modeling strategies needs to be correct in order to make valid inferences about the odds ratio parameter. In this paper, I aim to introduce this recent methodology into the epidemiologic literature by presenting a simple closed-form doubly robust estimator of the adjusted odds ratio for a binary exposure. A SAS macro (SAS Institute Inc., Cary, North Carolina) is given in an online appendix to facilitate use of the approach in routine epidemiologic practice, and a simulated data example is also provided for the purpose of illustration.
doi:10.1093/aje/kws377
PMCID: PMC3664333  PMID: 23558352
case-control sampling; doubly robust estimator; logistic regression; odds ratio; SAS macro
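As a hedged illustration of the idea behind reverse regression (Python with statsmodels, simulated data; this is not the paper's closed-form doubly robust estimator itself): for a binary exposure and binary outcome, the conditional odds ratio is symmetric, so the exposure coefficient from the standard logistic model and the outcome coefficient from the reverse model target the same log odds ratio, which is what allows the two modeling strategies to back each other up.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 5000
    z = rng.normal(size=n)                                            # measured confounder
    x = rng.binomial(1, 1 / (1 + np.exp(-0.5 * z)))                   # binary exposure
    y = rng.binomial(1, 1 / (1 + np.exp(-(-1 + 0.7 * x + 0.5 * z))))  # binary outcome
    df = pd.DataFrame({"y": y, "x": x, "z": z})

    std = smf.logit("y ~ x + z", data=df).fit(disp=False)  # standard regression
    rev = smf.logit("x ~ y + z", data=df).fit(disp=False)  # reverse regression

    # Both coefficients target the same conditional log odds ratio (agreement is
    # approximate here, since the reverse model's confounder term is not exactly linear)
    print(std.params["x"], rev.params["y"])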
2.  Alternatives for logistic regression in cross-sectional studies: an empirical comparison of models that directly estimate the prevalence ratio 
Background
Cross-sectional studies with binary outcomes analyzed by logistic regression are frequent in the epidemiological literature. However, the odds ratio can substantially overestimate the prevalence ratio, the measure of choice in these studies. Also, controlling for confounding is not equivalent for the two measures. In this paper we explore alternatives for modeling data of such studies with techniques that directly estimate the prevalence ratio.
Methods
We compared Cox regression with constant time at risk, Poisson regression and log-binomial regression against the standard Mantel-Haenszel estimators. Models with robust variance estimators in Cox and Poisson regressions and variance corrected by the scale parameter in Poisson regression were also evaluated.
Results
Three outcomes, from a cross-sectional study carried out in Pelotas, Brazil, with different levels of prevalence were explored: weight-for-age deficit (4%), asthma (31%) and mother in a paid job (52%). Unadjusted Cox/Poisson regression and Poisson regression with scale parameter adjusted by deviance performed worst in terms of interval estimates. Poisson regression with scale parameter adjusted by χ2 showed variable performance depending on the outcome prevalence. Cox/Poisson regression with robust variance, and log-binomial regression performed equally well when the model was correctly specified.
Conclusions
Cox or Poisson regression with robust variance and log-binomial regression provide correct estimates and are a better alternative for the analysis of cross-sectional studies with binary outcomes than logistic regression, since the prevalence ratio is more interpretable and easier to communicate to non-specialists than the odds ratio. However, precautions are needed to avoid estimation problems in specific situations.
doi:10.1186/1471-2288-3-21
PMCID: PMC521200  PMID: 14567763
Cox regression; cross-sectional studies; logistic regression; odds ratio; Poisson regression; prevalence ratio; robust variance; statistical models
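A minimal sketch of the approaches compared above, assuming Python with statsmodels and simulated cross-sectional data: Poisson regression with a robust (sandwich) variance and log-binomial regression estimate the prevalence ratio directly, while logistic regression returns an odds ratio that overstates the prevalence ratio when the outcome is common.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 2000
    x = rng.binomial(1, 0.5, size=n)            # binary exposure
    y = rng.binomial(1, 0.2 * np.exp(0.5 * x))  # common outcome; true PR = exp(0.5)
    df = pd.DataFrame({"y": y, "x": x})

    # Poisson regression with robust (sandwich) variance: exp(coef) is a prevalence ratio
    pois = smf.glm("y ~ x", data=df, family=sm.families.Poisson()).fit(cov_type="HC0")
    # Log-binomial regression: same target, but can fail to converge in richer models
    logbin = smf.glm("y ~ x", data=df,
                     family=sm.families.Binomial(link=sm.families.links.Log())).fit()
    # Logistic regression: exp(coef) is an odds ratio, which overstates the PR here
    logit = smf.logit("y ~ x", data=df).fit(disp=False)

    print("PR (Poisson, robust):", np.exp(pois.params["x"]))
    print("PR (log-binomial):   ", np.exp(logbin.params["x"]))
    print("OR (logistic):       ", np.exp(logit.params["x"]))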
3.  Re-evaluation of link between interpregnancy interval and adverse birth outcomes: retrospective cohort study matching two intervals per mother 
Objective To re-evaluate the causal effect of interpregnancy interval on adverse birth outcomes, on the basis that previous studies relying on between mother comparisons may have inadequately adjusted for confounding by maternal risk factors.
Design Retrospective cohort study using conditional logistic regression (matching two intervals per mother so each mother acts as her own control) to model the incidence of adverse birth outcomes as a function of interpregnancy interval; additional unconditional logistic regression with adjustment for confounders enabled comparison with the unmatched design of previous studies.
Setting Perth, Western Australia, 1980-2010.
Participants 40 441 mothers who each delivered three liveborn singleton neonates.
Main outcome measures Preterm birth (<37 weeks), small for gestational age birth (<10th centile of birth weight by sex and gestational age), and low birth weight (<2500 g).
Results Within mother analysis of interpregnancy intervals indicated a much weaker effect of short intervals on the odds of preterm birth and low birth weight compared with estimates generated using a traditional between mother analysis. The traditional unmatched design estimated an adjusted odds ratio for an interpregnancy interval of 0-5 months (relative to the reference category of 18-23 months) of 1.41 (95% confidence interval 1.31 to 1.51) for preterm birth, 1.26 (1.15 to 1.37) for low birth weight, and 0.98 (0.92 to 1.06) for small for gestational age birth. In comparison, the matched design showed a much weaker effect of short interpregnancy interval on preterm birth (odds ratio 1.07, 0.86 to 1.34) and low birth weight (1.03, 0.79 to 1.34), and the effect for small for gestational age birth remained small (1.08, 0.87 to 1.34). Both the unmatched and matched models estimated a high odds of small for gestational age birth and low birth weight for long interpregnancy intervals (longer than 59 months), but the estimated effect of long interpregnancy intervals on the odds of preterm birth was much weaker in the matched model than in the unmatched model.
Conclusion This study questions the causal effect of short interpregnancy intervals on adverse birth outcomes and points to the possibility of unmeasured or inadequately specified maternal factors in previous studies.
doi:10.1136/bmj.g4333
PMCID: PMC4137882  PMID: 25056260
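A minimal sketch of the matched design described above, assuming Python with statsmodels and hypothetical column names: with two intervals per mother and the mother as the stratum, conditional logistic regression differences out all stable maternal characteristics by design.

    import numpy as np
    import pandas as pd
    from statsmodels.discrete.conditional_models import ConditionalLogit

    rng = np.random.default_rng(2)
    n_mothers = 1000
    df = pd.DataFrame({
        "mother_id": np.repeat(np.arange(n_mothers), 2),           # two intervals each
        "short_interval": rng.binomial(1, 0.3, size=2 * n_mothers),
    })
    frailty = np.repeat(rng.normal(size=n_mothers), 2)             # unmeasured maternal risk
    p = 1 / (1 + np.exp(-(-2 + 0.1 * df["short_interval"] + frailty)))
    df["preterm"] = rng.binomial(1, p)

    # Each mother is her own stratum, so the maternal frailty cancels out;
    # only mothers with discordant outcomes contribute information
    fit = ConditionalLogit(df["preterm"], df[["short_interval"]],
                           groups=df["mother_id"]).fit()
    print(fit.summary())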
4.  Long-Term Exposure to Silica Dust and Risk of Total and Cause-Specific Mortality in Chinese Workers: A Cohort Study 
PLoS Medicine  2012;9(4):e1001206.
A retro-prospective cohort study by Weihong Chen and colleagues provides new estimates for the risk of total and cause-specific mortality due to long-term silica dust exposure among Chinese workers.
Background
Human exposure to silica dust is very common in both working and living environments. However, the potential long-term health effects have not been well established across different exposure situations.
Methods and Findings
We studied 74,040 workers who worked at 29 metal mines and pottery factories in China for 1 y or more between January 1, 1960, and December 31, 1974, with follow-up until December 31, 2003 (median follow-up of 33 y). We estimated the cumulative silica dust exposure (CDE) for each worker by linking work history to a job–exposure matrix. We calculated standardized mortality ratios for underlying causes of death based on Chinese national mortality rates. Hazard ratios (HRs) for selected causes of death associated with CDE were estimated using the Cox proportional hazards model. The population attributable risks were estimated based on the prevalence of workers with silica dust exposure and HRs. The number of deaths attributable to silica dust exposure among Chinese workers was then calculated using the population attributable risk and the national mortality rate. We observed 19,516 deaths during 2,306,428 person-years of follow-up. Mortality from all causes was higher among workers exposed to silica dust than among non-exposed workers (993 versus 551 per 100,000 person-years). We observed significant positive exposure–response relationships between CDE (measured in milligrams/cubic meter–years, i.e., the sum of silica dust concentrations multiplied by the years of silica exposure) and mortality from all causes (HR 1.026, 95% confidence interval 1.023–1.029), respiratory diseases (1.069, 1.064–1.074), respiratory tuberculosis (1.065, 1.059–1.071), and cardiovascular disease (1.031, 1.025–1.036). Significantly elevated standardized mortality ratios were observed for all causes (1.06, 95% confidence interval 1.01–1.11), ischemic heart disease (1.65, 1.35–1.99), and pneumoconiosis (11.01, 7.67–14.95) among workers exposed to respirable silica concentrations equal to or lower than 0.1 mg/m3. After adjustment for potential confounders, including smoking, silica dust exposure accounted for 15.2% of all deaths in this study. We estimated that 4.2% of deaths (231,104 cases) among Chinese workers were attributable to silica dust exposure. The limitations of this study included a lack of data on dietary patterns and leisure time physical activity, possible underestimation of silica dust exposure for individuals who worked at the mines/factories before 1950, and a small number of deaths (4.3%) where the cause of death was based on oral reports from relatives.
Conclusions
Long-term silica dust exposure was associated with substantially increased mortality among Chinese workers. The increased risk was observed not only for deaths due to respiratory diseases and lung cancer, but also for deaths due to cardiovascular disease.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Walk along most sandy beaches and you will be walking on millions of grains of crystalline silica, one of the commonest minerals on earth and a major ingredient in glass and in ceramic glazes. Silica is also used in the manufacture of building materials, in foundry castings, and for sandblasting, and respirable (breathable) crystalline silica particles are produced during quarrying and mining. Unfortunately, silica dust is not innocuous. Several serious diseases are associated with exposure to this dust, including silicosis (a chronic lung disease characterized by scarring and destruction of lung tissue), lung cancer, and pulmonary tuberculosis (a serious lung infection). Moreover, exposure to silica dust increases the risk of death (mortality). Worryingly, recent reports indicate that in the US and Europe, about 1.7 and 3.0 million people, respectively, are occupationally exposed to silica dust, figures that are dwarfed by the more than 23 million workers who are exposed in China. Occupational silica exposure, therefore, represents an important global public health concern.
Why Was This Study Done?
Although the lung-related adverse health effects of exposure to silica dust have been extensively studied, silica-related health effects may not be limited to these diseases. For example, could silica dust particles increase the risk of cardiovascular disease (diseases that affect the heart and circulation)? Other environmental particulates, such as the products of internal combustion engines, are associated with an increased risk of cardiovascular disease, but no one knows if the same is true for silica dust particles. Moreover, although it is clear that high levels of exposure to silica dust are dangerous, little is known about the adverse health effects of lower exposure levels. In this cohort study, the researchers examined the effect of long-term exposure to silica dust on the risk of all cause and cause-specific mortality in a large group (cohort) of Chinese workers.
What Did the Researchers Do and Find?
The researchers estimated the cumulative silica dust exposure for 74,040 workers at 29 metal mines and pottery factories from 1960 to 2003 from individual work histories and more than four million measurements of workplace dust concentrations, and collected health and mortality data for all the workers. Death from all causes was higher among workers exposed to silica dust than among non-exposed workers (993 versus 551 deaths per 100,000 person-years), and there was a positive exposure–response relationship between silica dust exposure and death from all causes, respiratory diseases, respiratory tuberculosis, and cardiovascular disease. For example, the hazard ratio for all cause death was 1.026 for every increase in cumulative silica dust exposure of 1 mg/m3-year; a hazard ratio is the incidence of an event in an exposed group divided by its incidence in an unexposed group. Notably, there was significantly increased mortality from all causes, ischemic heart disease, and silicosis among workers exposed to respirable silica concentrations at or below 0.1 mg/m3, the workplace exposure limit for silica dust set by the US Occupational Safety and Health Administration. For example, the standardized mortality ratio (SMR) for silicosis among people exposed to low levels of silica dust was 11.01; an SMR is the ratio of observed deaths in a cohort to expected deaths calculated from recorded deaths in the general population. Finally, the researchers used their data to estimate that, in 2008, 4.2% of deaths among industrial workers in China (231,104 deaths) were attributable to silica dust exposure.
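A small worked sketch, in Python, of the SMR arithmetic defined above (observed deaths divided by deaths expected from general-population rates), with an exact Poisson confidence interval attached to the observed count; the numbers are illustrative rather than taken from the study.

    from scipy.stats import chi2

    def smr_with_ci(observed, expected, alpha=0.05):
        """Standardized mortality ratio with an exact Poisson CI for the observed count."""
        smr = observed / expected
        lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
        upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
        return smr, lower, upper

    # Illustrative only: 55 observed deaths against 5.0 expected gives SMR = 11.0
    print(smr_with_ci(55, 5.0))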
What Do These Findings Mean?
These findings indicate that long-term silica dust exposure is associated with substantially increased mortality among Chinese workers. They confirm that there is an exposure–response relationship between silica dust exposure and a heightened risk of death from respiratory diseases and lung cancer. That is, the risk of death from these diseases increases as exposure to silica dust increases. In addition, they show a significant relationship between silica dust exposure and death from cardiovascular diseases. Importantly, these findings suggest that even levels of silica dust that are considered safe increase the risk of death. The accuracy of these findings may be affected by the accuracy of the silica dust exposure estimates and/or by confounding (other factors shared by the people exposed to silica such as diet may have affected their risk of death). Nevertheless, these findings highlight the need to tighten regulations on workplace dust control in China and elsewhere.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001206.
The American Lung Association provides information on silicosis
The US Centers for Disease Control and Prevention provides information on silica in the workplace, including links to relevant US National Institute for Occupational Safety and Health publications, and information on silicosis and other pneumoconioses
The US Occupational Safety and Health Administration also has detailed information on occupational exposure to crystalline silica
"What does silicosis mean to you" is a video provided by the US Mine Safety and Health Administration that includes personal experiences of silicosis; "Don't let silica dust you" is a video produced by the Association of Occupational and Environmental Clinics that identifies ways to reduce silica dust exposure in the workplace
The MedlinePlus encyclopedia has a page on silicosis (in English and Spanish)
The International Labour Organization provides information on health surveillance for those exposed to respirable crystalline silica
The World Health Organization has published a report about the health effects of crystalline silica and quartz
doi:10.1371/journal.pmed.1001206
PMCID: PMC3328438  PMID: 22529751
5.  A case-control study to investigate the risk of leukaemia associated with exposure to benzene in petroleum marketing and distribution workers in the United Kingdom. 
OBJECTIVES: To investigate the risk of leukaemia in workers in the petroleum distribution industry who were exposed to low levels of benzene. METHODS: From the cohort of distribution workers, 91 cases were identified as having leukaemia on either a death certificate or on cancer registration. These cases were compared with controls (four per case) randomly selected from the cohort, who were from the same company as the respective case, matched for age, and alive and under follow up at the time of case occurrence. Work histories were collected for the cases and controls, together with information about the terminals at which they had worked, fuel compositions, and occupational hygiene measurements of benzene. These data were used to derive quantitative estimates of personal exposure to benzene. Odds ratios (OR) were calculated conditional on the matching, to identify those variables in the study which were associated with risk of leukaemia. Examination of the potential effects of confounding and other variables was carried out with conditional logistic regression. Analyses were carried out for all leukaemia and separately for acute lymphoblastic, chronic lymphocytic, acute myeloid and monocytic, and chronic myeloid leukaemias. RESULTS: There was no significant increase in the overall risk of all leukaemias with higher cumulative exposure to benzene or with intensity of exposure, but risk was consistently doubled in subjects employed in the industry for > 10 years. Acute lymphoblastic leukaemia tended to occur in workers employed after 1950, who started work after the age of 30, worked for a short duration, and experienced low cumulative exposure with few peaks. The ORs did not increase with increasing cumulative exposure. The risk of chronic lymphocytic leukaemia seemed to be related most closely to duration of employment and the highest risk occurred in white collar workers with long service. These workers had only background levels of benzene exposure. There was no evidence of an association of risk with any exposure variables, and no evidence of an increasing risk with increasing cumulative exposure, mean intensity, or maximum intensity of exposure. The patterns of risk for acute myeloid and monocytic leukaemia were different from those of the lymphoid subgroups, in which duration of employment was the variable most closely related to risk. Risk was increased to an OR of 2.8 (95% confidence interval (95% CI) 0.8 to 9.4) for a cumulative exposure between 4.5 and 45 ppm-years compared with < 0.45 ppm-years. For mean intensity between 0.2 and 0.4 ppm an OR of 2.8 (95% CI 0.9 to 8.5) was found compared with < 0.02 ppm. Risk did not increase with cumulative exposure, maximum intensity, or mean intensity of exposure when treated as continuous variables. Cases of acute myeloid and monocytic leukaemia were more often classified as having peaked exposures than controls, and when variables characterising peaks, particularly daily and weekly peaks, were included in the analysis these tended to dominate the other exposure variables. However, because of the small numbers it is not possible to distinguish the relative influence of peaked and unpeaked exposures on risk of acute myeloid and monocytic leukaemia. There was no evidence of an increased risk of chronic myeloid leukaemia with increases in cumulative exposure, maximum intensity, mean intensity, and duration of employment, either as continuous or categorical variables. 
Analyses exploring the sensitivity of the results to the source and quality of the work histories showed similar patterns in general. However, no increases in ORs for categories of cumulative exposure were found for acute myeloid and monocytic leukaemia in the data set which included work histories obtained from personnel records still in existence, although numbers were reduced. Analyses excluding the last five and 10 years of exposure showed a tendency for ORs to reduce for chronic lymphocytic leukaemia and chronic myeloid leukaemia, and to increase for acute myeloid and monocytic leukaemia. Limitations of the study include uncertainties and gaps in the information collected, and small numbers in subcategories of exposure which can lead to wide CIs around the risk estimates and poor fit of the mathematical models. CONCLUSIONS: There is no evidence in this study of an association between exposure to benzene and lymphoid leukaemia, either acute or chronic. There is some suggestion of a relation between exposure to benzene and myeloid leukaemia, in particular for acute myeloid and monocytic leukaemia. Peaked exposures seemed to be experienced for this disease. However, in view of the limitations of the study, doubt remains as to whether the risk of acute myeloid and monocytic leukaemia is increased by cumulative exposures of < 45 ppm-years. Further work is recommended to review the work histories and redefine their quality, to explore the discrepancies between results for categorical and continuous variables, and to develop ranges around the exposure estimates to enable further sensitivity analyses to be carried out.
PMCID: PMC1128678  PMID: 9155776
6.  What's the Risk? A Simple Approach for Estimating Adjusted Risk Measures from Nonlinear Models Including Logistic Regression 
Health Services Research  2009;44(1):288-302.
Objective
To develop and validate a general method (called regression risk analysis) to estimate adjusted risk measures from logistic and other nonlinear multiple regression models. We show how to estimate standard errors for these estimates. These measures could supplant various approximations (e.g., adjusted odds ratio [AOR]) that may diverge, especially when outcomes are common.
Study Design
Regression risk analysis estimates were compared with internal standards as well as with Mantel–Haenszel estimates, Poisson and log-binomial regressions, and a widely used (but flawed) equation to calculate adjusted risk ratios (ARR) from AOR.
Data Collection
Data sets produced using Monte Carlo simulations.
Principal Findings
Regression risk analysis accurately estimates ARR and differences directly from multiple regression models, even when confounders are continuous, distributions are skewed, outcomes are common, and effect size is large. It is statistically sound and intuitive, and has properties favoring it over other methods in many cases.
Conclusions
Regression risk analysis should be the new standard for presenting findings from multiple regression analysis of dichotomous outcomes for cross-sectional, cohort, and population-based case–control studies, particularly when outcomes are common or effect size is large.
doi:10.1111/j.1475-6773.2008.00900.x
PMCID: PMC2669627  PMID: 18793213
Multiple regression analysis; logistic regression; nonlinear models; odds ratio; relative risk; risk adjustment; risk ratio
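The abstract does not spell out the estimator, so as a hedged sketch of the general idea of reading adjusted risk measures directly off a logistic model, here is marginal standardization in Python with statsmodels: average the predicted risks with exposure set to 1 and to 0 and contrast them, rather than reporting the odds ratio.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    n = 5000
    z = rng.normal(size=n)                                 # continuous confounder
    x = rng.binomial(1, 1 / (1 + np.exp(-z)))              # binary exposure
    y = rng.binomial(1, 1 / (1 + np.exp(-(-0.5 + 0.8 * x + 0.6 * z))))  # common outcome
    df = pd.DataFrame({"y": y, "x": x, "z": z})

    fit = smf.logit("y ~ x + z", data=df).fit(disp=False)

    # Marginal standardization: average predicted risk with everyone exposed vs
    # everyone unexposed, leaving the confounder at its observed values
    risk1 = fit.predict(df.assign(x=1)).mean()
    risk0 = fit.predict(df.assign(x=0)).mean()
    print("adjusted risk ratio:     ", risk1 / risk0)
    print("adjusted risk difference:", risk1 - risk0)
    print("adjusted odds ratio:     ", np.exp(fit.params["x"]))  # diverges from the ARR here

Standard errors for such marginal quantities are typically obtained by the delta method or by bootstrapping, in the spirit of the paper's variance results.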
7.  Association between Prenatal Exposure to Antiretroviral Therapy and Birth Defects: An Analysis of the French Perinatal Cohort Study (ANRS CO1/CO11) 
PLoS Medicine  2014;11(4):e1001635.
Jeanne Sibiude and colleagues use the French Perinatal Cohort to estimate the prevalence of birth defects in children born to HIV-infected women receiving antiretroviral therapy during pregnancy.
Background
Antiretroviral therapy (ART) has major benefits during pregnancy, both for maternal health and to prevent mother-to-child transmission of HIV. Safety issues, including teratogenic risk, need to be evaluated. We estimated the prevalence of birth defects in children born to HIV-infected women receiving ART during pregnancy, and assessed the independent association of birth defects with each antiretroviral (ARV) drug used.
Methods and Findings
The French Perinatal Cohort prospectively enrolls HIV-infected women delivering in 90 centers throughout France. Children are followed by pediatricians until 2 y of age according to national guidelines.
We included 13,124 live births between 1994 and 2010, among which, 42% (n = 5,388) were exposed to ART in the first trimester of pregnancy. Birth defects were studied using both European Surveillance of Congenital Anomalies (EUROCAT) and Metropolitan Atlanta Congenital Defects Program (MACDP) classifications; associations with ART were evaluated using univariate and multivariate logistic regressions. Correction for multiple comparisons was not performed because the analyses were based on hypotheses emanating from previous findings in the literature and the robustness of the findings of the current study. The prevalence of birth defects was 4.4% (95% CI 4.0%–4.7%), according to the EUROCAT classification. In multivariate analysis adjusting for other ARV drugs, maternal age, geographical origin, intravenous drug use, and type of maternity center, a significant association was found between exposure to zidovudine in the first trimester and congenital heart defects: 2.3% (74/3,267), adjusted odds ratio (AOR) = 2.2 (95% CI 1.3–3.7), p = 0.003, absolute risk difference attributed to zidovudine +1.2% (95% CI +0.5; +1.9%). Didanosine and indinavir were associated with head and neck defects, respectively: 0.5%, AOR = 3.4 (95% CI 1.1–10.4), p = 0.04; 0.9%, AOR = 3.8 (95% CI 1.1–13.8), p = 0.04. We found a significant association between efavirenz and neurological defects (n = 4) using the MACDP classification: AOR = 3.0 (95% CI 1.1–8.5), p = 0.04, absolute risk +0.7% (95% CI +0.07%; +1.3%). But the association was not significant using the less inclusive EUROCAT classification: AOR = 2.1 (95% CI 0.7–5.9), p = 0.16. No association was found between birth defects and lopinavir or ritonavir with a power >85% for an odds ratio of 1.5, nor for nevirapine, tenofovir, stavudine, or abacavir with a power >70%. Limitations of the present study were the absence of data on termination of pregnancy, stillbirths, tobacco and alcohol intake, and concomitant medication.
Conclusions
We found a specific association between in utero exposure to zidovudine and heart defects; the mechanisms need to be elucidated. The association between efavirenz and neurological defects must be interpreted with caution. For the other drugs not associated with birth defects, the results were reassuring. Finally, whatever the impact that some ARV drugs may have on birth defects, it is surpassed by the major role of ART in the successful prevention of mother-to-child transmission of HIV.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
AIDS and HIV infection are commonly treated with antiretroviral therapy (ART), a combination of individual drugs that work together to prevent the replication of the virus and further spread of the infection. Starting in the 1990s, studies have shown that ART of HIV-infected women can substantially reduce transmission of the virus to the child during pregnancy and birth. Based on these results, ART was subsequently recommended for pregnant women. Since 2004, ART has been standard therapy for pregnant women with HIV/AIDS in high-income countries, and it is now recommended for all HIV-infected women worldwide. Several different antiviral drug combinations have been shown to be effective and are used to prevent mother-to-infant transmission. However, as with any other drugs taken during pregnancy, there is concern that ART can harm the developing fetus.
Why Was This Study Done?
Several previous studies have assessed the risk that ART taken by a pregnant woman might pose to her developing fetus, but the results have been inconsistent. Animal studies suggested an elevated risk for some drugs but not others. While some clinical studies have reported increases in birth defects in children born to mothers on ART, others have shown no such increase.
The discrepancy may be due to differences between the populations included in the studies and the different methods used to diagnose birth defects. Additional large studies are therefore necessary to obtain more and better evidence on the potential harm of individual anti-HIV drugs to children exposed during pregnancy. So in this study, the authors conducted a large cohort study in France to assess the relationship between different antiretroviral drugs and specific birth defects.
What Did the Researchers Do and Find?
The researchers used a large national health database known as the French Perinatal Cohort that contains information on HIV-infected mothers who delivered infants in 90 centers throughout France. Pediatricians follow all children, whatever their HIV status, to two years of age, and health statistics are collected according to national health-care guidelines. Analyzing the records, the researchers estimated the rate at which birth defects occurred in children exposed to antiretroviral drugs during pregnancy.
The researchers included 13,124 children who were born alive between 1994 and 2010 and had been exposed to ART during pregnancy. Children exposed in the first trimester of pregnancy, and those exposed during the second or third trimester, were compared to a control group (children not exposed to the drug during the whole pregnancy). Using two birth defect classification systems (EUROCAT and MACDP—MACDP collects more details on disease classification than EUROCAT), the researchers sought to detect a link between the occurrence of birth defects and exposure to individual antiretroviral drugs.
They found a small increase in the risk for heart defects in children with exposure to zidovudine. They also found an association between efavirenz exposure and a small increase in neurological defects, but only when using the MACDP classification system. The authors found no association between other antiretroviral drugs, including nevirapine (which acts similarly to efavirenz); tenofovir, stavudine, and abacavir (all three acting similarly to zidovudine); and lopinavir and ritonavir (proteinase inhibitors) and any type of birth defect.
What Do These Findings Mean?
These findings show that, overall, the risks of birth defects in children exposed to antiretroviral drugs in utero are small when considering the clear benefit of preventing mother-to-child transmission of HIV. However, where there are safe and effective alternatives, it might be appropriate to avoid use by pregnant women of those drugs that are associated with elevated risks of birth defects.
Worldwide, a large number of children are exposed to zidovudine in utero, and these results suggest (though cannot prove) that these children may be at a slightly higher risk of heart defects. Current World Health Organization (WHO) guidelines for the prevention of mother-to-child transmission no longer recommend zidovudine for first-line therapy.
The implications of the higher rate of neurological birth defects observed in infants exposed to efavirenz in the first trimester are less clear. The EUROCAT classification excludes minor neurological abnormalities without serious medical consequences, and so the WHO guidelines that stress the importance of careful clinical follow-up of children with exposure to efavirenz seem adequate, based on the findings of this study. The study is limited by the lack of data on the use of additional medication and alcohol and tobacco use, which could have a direct impact on fetal development, and by the absence of data on birth defects and antiretroviral drug exposure from low-income countries. However, the findings of this study overall are reassuring and suggest that apart from zidovudine and possibly efavirenz, other antiretroviral drugs are not associated with birth defects, and their use during pregnancy does not pose a risk to the infant.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001635.
This study is further discussed in a PLOS Medicine Perspective by Mofenson and Watts
The World Health Organization has a webpage on mother-to-child transmission of HIV
The US National Institutes of Health provides links to additional information on mother-to-child transmission of HIV
The Elizabeth Glaser Pediatric AIDS Foundation also has a webpage on mother-to-child transmission
The French Perinatal Cohort has a webpage describing the cohort and its main publications (in French, with a summary in English)
doi:10.1371/journal.pmed.1001635
PMCID: PMC4004551  PMID: 24781315
8.  Case-control study of oral contraceptives and risk of thromboembolic stroke: results from International Study on Oral Contraceptives and Health of Young Women. 
BMJ : British Medical Journal  1997;315(7121):1502-1504.
OBJECTIVE: To determine the influence of oral contraceptives (particularly those containing modern progestins) on the risk for ischaemic stroke in women aged 16-44 years. DESIGN: Matched case-control study. SETTING: 16 Centres in the United Kingdom, Germany, France, Switzerland, and Austria. SUBJECTS: Cases were 220 women aged 16-44 who had an incident ischaemic stroke. Controls were 775 women (at least one hospital and one community control per case) unaffected by stroke who were matched with the corresponding case for 5 year age band and for hospital or community setting. Information on exposure and confounding variables was collected in a face to face interview. MAIN OUTCOME MEASURES: Odds ratios derived with stratified analysis and unconditional logistic regression to adjust for potential confounding. RESULTS: Adjusted odds ratios (95% confidence intervals) for ischaemic stroke (unmatched analysis) were 4.4 (2.0 to 9.9), 3.4 (2.1 to 5.5), and 3.9 (2.3 to 6.6) for current use of first, second, and third generation oral contraceptives, respectively. The risk ratio for third versus second generation was 1.1 (0.7 to 2.0) and was similar in the United Kingdom and other European countries. The risk estimates were lower if blood pressure was checked before prescription. CONCLUSION: Although there is a small relative risk of occlusive stroke for women of reproductive age who currently use oral contraceptives, the attributable risk is very small because the incidence in this age range is very low. There is no difference between the risk of oral contraceptives of the third and second generation; only first generation oral contraceptives seem to be associated with a higher risk. This small increase in risk may be further reduced by efforts to control cardiovascular risk factors, particularly high blood pressure.
PMCID: PMC2127931  PMID: 9420491
9.  Is a Cutoff of 10% Appropriate for the Change-in-Estimate Criterion of Confounder Identification? 
Journal of Epidemiology  2014;24(2):161-167.
Background
When using the change-in-estimate criterion, a cutoff of 10% is commonly used to identify confounders. However, the appropriateness of this cutoff has never been evaluated. This study investigated cutoffs required under different conditions.
Methods
Four simulations were performed to select cutoffs that achieved a significance level of 5% and a power of 80%, using linear regression and logistic regression. A total of 10 000 simulations were run to obtain the percentage differences of the 4 fitted regression coefficients (with and without adjustment).
Results
In linear regression, larger effect size, larger sample size, and lower standard deviation of the error term led to a lower cutoff point at a 5% significance level. In contrast, larger effect size and a lower exposure–confounder correlation led to a lower cutoff point at 80% power. In logistic regression, a lower odds ratio and larger sample size led to a lower cutoff point at a 5% significance level, while a lower odds ratio, larger sample size, and lower exposure–confounder correlation yielded a lower cutoff point at 80% power.
Conclusions
Cutoff points for the change-in-estimate criterion varied according to the effect size of the exposure–outcome relationship, sample size, standard deviation of the regression error, and exposure–confounder correlation.
doi:10.2188/jea.JE20130062
PMCID: PMC3983286  PMID: 24317343
causality; confounding factors; regression; simulation; statistical models
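A minimal sketch of the change-in-estimate criterion itself, in Python with statsmodels on simulated data: fit the exposure model with and without the candidate confounder and compare the relative change in the exposure coefficient to the 10% cutoff whose appropriateness the paper questions.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(4)
    n = 1000
    c = rng.normal(size=n)                    # candidate confounder
    x = 0.5 * c + rng.normal(size=n)          # exposure associated with c
    y = rng.binomial(1, 1 / (1 + np.exp(-(0.4 * x + 0.4 * c))))  # c also affects outcome
    df = pd.DataFrame({"y": y, "x": x, "c": c})

    crude = smf.logit("y ~ x", data=df).fit(disp=False)
    adjusted = smf.logit("y ~ x + c", data=df).fit(disp=False)

    # Relative change in the exposure coefficient, compared to the 10% cutoff
    rel_change = abs(crude.params["x"] - adjusted.params["x"]) / abs(adjusted.params["x"])
    print(f"relative change: {rel_change:.1%}; flagged: {rel_change > 0.10}")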
10.  Estimates of Pandemic Influenza Vaccine Effectiveness in Europe, 2009–2010: Results of Influenza Monitoring Vaccine Effectiveness in Europe (I-MOVE) Multicentre Case-Control Study 
PLoS Medicine  2011;8(1):e1000388.
Results from a European multicentre case-control study reported by Marta Valenciano and colleagues suggest good protection by the pandemic monovalent H1N1 vaccine against pH1N1 and no effect of the 2009–2010 seasonal influenza vaccine on H1N1.
Background
A multicentre case-control study based on sentinel practitioner surveillance networks from seven European countries was undertaken to estimate the effectiveness of 2009–2010 pandemic and seasonal influenza vaccines against medically attended influenza-like illness (ILI) laboratory-confirmed as pandemic influenza A (H1N1) (pH1N1).
Methods and Findings
Sentinel practitioners swabbed ILI patients using systematic sampling. We included in the study patients meeting the European ILI case definition with onset of symptoms >14 days after the start of national pandemic vaccination campaigns. We compared pH1N1 cases to influenza laboratory-negative controls. A valid vaccination corresponded to >14 days between receiving a dose of vaccine and symptom onset. We estimated pooled vaccine effectiveness (VE) as 1 minus the odds ratio with the study site as a fixed effect. Using logistic regression, we adjusted VE for potential confounding factors (age group, sex, month of onset, chronic diseases and related hospitalizations, smoking history, seasonal influenza vaccinations, practitioner visits in previous year). We conducted a complete case analysis excluding individuals with missing values and a multiple multivariate imputation to estimate missing values. The multivariate imputation (n = 2902) adjusted pandemic VE (PIVE) estimates were 71.9% (95% confidence interval [CI] 45.6–85.5) overall; 78.4% (95% CI 54.4–89.8) in patients <65 years; and 72.9% (95% CI 39.8–87.8) in individuals without chronic disease. The complete case (n = 1,502) adjusted PIVE were 66.0% (95% CI 23.9–84.8), 71.3% (95% CI 29.1–88.4), and 70.2% (95% CI 19.4–89.0), respectively. The adjusted PIVE was 66.0% (95% CI −69.9 to 93.2) if vaccinated 8–14 days before ILI onset. The adjusted 2009–2010 seasonal influenza VE was 9.9% (95% CI −65.2 to 50.9).
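As a hedged sketch of the VE computation defined above (Python with statsmodels; data and variable names hypothetical): fit a logistic model for case status on vaccination and confounders, exponentiate the vaccination coefficient to get the adjusted odds ratio, and take VE = 1 minus that odds ratio.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical test-negative data: case = 1 if the swab is pH1N1 positive
    rng = np.random.default_rng(5)
    n = 3000
    vacc = rng.binomial(1, 0.2, size=n)                 # valid pandemic vaccination
    age65 = rng.binomial(1, 0.3, size=n)                # example confounder
    p = 1 / (1 + np.exp(-(-1 + np.log(0.3) * vacc + 0.3 * age65)))  # true OR = 0.3
    df = pd.DataFrame({"case": rng.binomial(1, p), "vacc": vacc, "age65": age65})

    # Study site would enter as a fixed effect, e.g. "+ C(site)", as in the paper
    fit = smf.logit("case ~ vacc + age65", data=df).fit(disp=False)
    ve = 1 - np.exp(fit.params["vacc"])                 # VE = 1 - adjusted odds ratio
    print(f"adjusted VE = {100 * ve:.1f}%")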
Conclusions
Our results suggest good protection of the pandemic monovalent vaccine against medically attended pH1N1 and no effect of the 2009–2010 seasonal influenza vaccine. However, the late availability of the pandemic vaccine and subsequent limited coverage with this vaccine hampered our ability to study vaccine benefits during the outbreak period. Future studies should include estimation of the effectiveness of the new trivalent vaccine in the upcoming 2010–2011 season, when vaccination will occur before the influenza season starts.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Following the World Health Organization's declaration of pandemic phase six in June 2009, manufacturers developed vaccines against pandemic influenza A 2009 (pH1N1). On the basis of the scientific opinion of the European Medicines Agency, the European Commission initially granted marketing authorization to three pandemic vaccines for use in European countries. During the autumn of 2009, most European countries included the 2009–2010 seasonal influenza vaccine and the pandemic vaccine in their influenza vaccination programs.
The Influenza Monitoring Vaccine Effectiveness in Europe network (established to monitor seasonal and pandemic influenza vaccine effectiveness) conducted seven case-control and three cohort studies in seven European countries in 2009–2010 to estimate the effectiveness of the pandemic and seasonal vaccines. Data from the seven pilot case-control studies were pooled to provide overall adjusted estimates of vaccine effectiveness.
Why Was This Study Done?
After seasonal and pandemic vaccines are made available to populations, it is necessary to estimate the effectiveness of the vaccines at the population level during every influenza season. Therefore, this study was conducted in European countries to estimate the pandemic influenza vaccine effectiveness and seasonal influenza vaccine effectiveness against people presenting to their doctor with influenza-like illness who were confirmed (by laboratory tests) to be infected with pH1N1.
What Did the Researchers Do and Find?
The researchers conducted a multicenter case-control study on the basis of practitioner surveillance networks from seven countries—France, Hungary, Ireland, Italy, Romania, Portugal, and Spain. Patients consulting a participating practitioner for influenza-like illness had a nasal or throat swab taken within 8 days of symptom onset. Cases were swabbed patients who tested positive for pH1N1. Patients presenting with influenza-like illness whose swab tested negative for any influenza virus were controls.
Individuals were considered vaccinated if they had received a dose of the vaccine more than 14 days before the date of onset of influenza-like illness and unvaccinated if they were not vaccinated at all, or if the vaccine was given less than 15 days before the onset of symptoms. The researchers analyzed pandemic influenza vaccination effectiveness in those vaccinated less than 8 days, 8-14 days, and more than 14 days before onset of symptoms, compared to those who had never been vaccinated.
The researchers used modeling (taking account of all potential confounding factors) to estimate adjusted vaccine effectiveness and stratified the adjusted pandemic influenza vaccine effectiveness and the adjusted seasonal influenza vaccine effectiveness in three age groups (<15, 15–64, and ≥65 years of age).
The adjusted results suggest that the 2009–2010 seasonal influenza vaccine did not protect against pH1N1 illness. However, one dose of the pandemic vaccines used in the participating countries conferred good protection (65.5%–100% according to various stratifications performed) against pH1N1 in people who attended their practitioner with influenza-like illness, especially in people aged <65 years and in those without any chronic disease. Furthermore, good pandemic influenza vaccine effectiveness was observed as early as 8 days after vaccination.
What Do These Findings Mean?
The results of this study provide early estimates of the pandemic influenza vaccine effectiveness suggesting that the monovalent pandemic vaccines have been effective. The findings also give an indication of the vaccine effectiveness for the Influenza A (H1N1) 2009 strain included in the 2010–2011 seasonal vaccines, although specific vaccine effectiveness studies will have to be conducted to verify whether similarly good effectiveness is observed with 2010–2011 trivalent vaccines. However, the results of this study should be interpreted with caution because of limitations in the pandemic context (late timing of the studies, low incidence, low vaccine coverage leading to imprecise estimates) and potential biases due to the study design, confounding factors, and missing values. The researchers recommend that in future season studies, the sample size per country should be enlarged in order to allow for precise pooled and stratified analyses.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000388.
The World Health Organization has information on H1N1 vaccination
The US Centers for Disease Control and Prevention provides a fact sheet on the 2009 H1N1 influenza virus
The US Department of Health and Human services has a comprehensive website on flu
The European Centre for Disease Prevention and Control provides information on 2009 H1N1 pandemic
The European Centre for Disease Prevention and Control presents a summary of the 2009 H1N1 pandemic in Europe and elsewhere
doi:10.1371/journal.pmed.1000388
PMCID: PMC3019108  PMID: 21379316
11.  Determining the Probability Distribution and Evaluating Sensitivity and False Positive Rate of a Confounder Detection Method Applied To Logistic Regression 
Background
In epidemiologic studies researchers are often interested in detecting confounding (when a third variable is both associated with and affects associations between the outcome and predictors). Confounder detection methods often compare regression coefficients obtained from “crude” models that exclude the possible confounder(s) and “adjusted” models that include the variable(s). One such method compares the relative difference in effect estimates to a cutoff of 10% with differences of at least 10% providing evidence of confounding.
Methods
In this study we derive the asymptotic distribution of the relative change in effect statistic applied to logistic regression and evaluate the sensitivity and false positive rate of the 10% cutoff method using the asymptotic distribution. We then verify the results using simulated data.
Results
When applied to logistic regression models with a dichotomous outcome, exposure, and possible confounder, we found the relative change statistic to have an asymptotic lognormal distribution. For sample sizes of at least 300, we found that when confounding existed, over 80% of models had >10% changes in odds ratios. When the confounder was not associated with the outcome, the false positive rate increased as the strength of the association between the predictor and confounder increased. When the confounder and predictor were independent of one another, false positives were rare (most < 10%).
Conclusions
Researchers must be aware of high false positive rates when applying change in estimate confounder detection methods to data where the exposure is associated with possible confounder variables.
doi:10.4172/2155-6180.1000142
PMCID: PMC3571096  PMID: 23420565
10% Rule; Variable Selection; Model Building; Sensitivity; False Positive Rate
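A minimal simulation sketch, in Python with statsmodels, of the false-positive scenario studied above: the covariate predicts the exposure but has no effect on the outcome, yet finite-sample noise (inflated by the exposure-covariate correlation) can still push the relative change in the coefficient past 10%.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(8)

    def flagged(n=300, assoc=1.0):
        """One simulated study: does the 10% rule flag a non-confounder?"""
        c = rng.normal(size=n)
        x = assoc * c + rng.normal(size=n)               # c predicts the exposure only
        y = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x)))  # c has no effect on the outcome
        b_crude = sm.Logit(y, sm.add_constant(x)).fit(disp=False).params[1]
        b_adj = sm.Logit(y, sm.add_constant(np.column_stack([x, c]))).fit(disp=False).params[1]
        return abs(b_crude - b_adj) / abs(b_adj) > 0.10

    for assoc in (0.0, 0.5, 1.0):
        rate = np.mean([flagged(assoc=assoc) for _ in range(500)])
        print(f"exposure-covariate association {assoc}: flagged in {rate:.0%} of runs")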
12.  Seasonal modification of the association between temperature and adult emergency department visits for asthma: a case-crossover study 
Environmental Health  2012;11:55.
Background
The objective of this study is to characterize the effect of temperature on emergency department visits for asthma and modification of this association by season. This association is of interest in its own right, and also important to understand because temperature may be an important confounder in analyses of associations between other environmental exposures and asthma. For example, the case-crossover study design is commonly used to investigate associations between air pollution and respiratory outcomes, such as asthma. This approach controls for confounding by month and season by design, and permits adjustment for potential confounding by temperature through regression modeling. However, such models may fail to adequately control for confounding if temperature effects are seasonal, since case-crossover analyses rarely account for interactions between matching factors (such as calendar month) and temperature.
Methods
We conducted a case-crossover study to determine whether the association between temperature and emergency department visits for asthma varies by season or month. Asthma emergency department visits among North Carolina adults during 2007–2008 were identified using a statewide surveillance system. Marginal as well as season- and month-specific associations between asthma visits and temperature were estimated with conditional logistic regression.
Results
The association between temperature and adult emergency department visits for asthma is near null when the overall association is examined [odds ratio (OR) per 5 degrees Celsius = 1.01, 95% confidence interval (CI): 1.00, 1.02]. However, significant variation in temperature-asthma associations was observed by season (chi-square = 18.94, 3 degrees of freedom, p <0.001) and by month of the year (chi-square = 45.46, 11 degrees of freedom, p <0.001). ORs per 5 degrees Celsius were increased in February (OR = 1.06, 95% CI: 1.02, 1.10), July (OR = 1.16, 95% CI: 1.04, 1.29), and December (OR = 1.04, 95% CI: 1.01, 1.07) and decreased in September (OR = 0.92, 95% CI: 0.87, 0.97).
Conclusions
Our empirical example suggests that there is significant seasonal variation in temperature-asthma associations. Epidemiological studies rarely account for interactions between ambient temperature and temporal matching factors (such as month of year) in the case-crossover design. These findings suggest that greater attention should be given to seasonal modification of associations between temperature and respiratory outcomes in case-crossover analyses of other environmental asthma triggers.
doi:10.1186/1476-069X-11-55
PMCID: PMC3489538  PMID: 22898319
Asthma; Temperature; Season; Case-crossover
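A minimal sketch of the kind of interaction test reported above, assuming Python with statsmodels and a hypothetical time-stratified case-crossover layout: because season is constant within a stratum, only season-by-temperature interaction terms enter the conditional logistic model, and a likelihood ratio test compares it against the temperature-only model.

    import numpy as np
    import pandas as pd
    from scipy.stats import chi2
    from statsmodels.discrete.conditional_models import ConditionalLogit

    # Hypothetical time-stratified layout: one case day plus three referent days
    rng = np.random.default_rng(6)
    n_strata, k = 800, 4
    df = pd.DataFrame({
        "stratum": np.repeat(np.arange(n_strata), k),
        "season": rng.integers(0, 4, size=n_strata).repeat(k),  # constant within stratum
        "temp": rng.normal(20, 8, size=n_strata * k),
        "case": np.tile([1, 0, 0, 0], n_strata),
    })

    # Season main effects are absorbed by the strata; only the
    # season-by-temperature interaction terms are estimable
    main = df[["temp"]]
    inter = main.join(pd.get_dummies(df["season"], prefix="s", drop_first=True)
                      .mul(df["temp"], axis=0))

    m0 = ConditionalLogit(df["case"], main, groups=df["stratum"]).fit()
    m1 = ConditionalLogit(df["case"], inter, groups=df["stratum"]).fit()
    lrt = 2 * (m1.llf - m0.llf)                  # 3 df for the three interaction terms
    print("LRT p-value:", chi2.sf(lrt, df=3))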
13.  Keeping children safe at home: protocol for a case–control study of modifiable risk factors for scalds 
Injury Prevention  2014;20(5):e11.
Background
Scalds are one of the most common forms of thermal injury in young children worldwide. Childhood scald injuries, which mostly occur in the home, result in substantial health service use and considerable morbidity and mortality. There is little research on effective interventions to prevent scald injuries in young children.
Objectives
To determine the relationship between a range of modifiable risk factors for medically attended scalds in children under the age of 5 years.
Design
A multicentre case-control study in UK hospitals and minor injury units with parallel home observation to validate parental reported exposures. Cases will be 0–4 years old with a medically attended scald injury which occurred in their home or garden, matched on gender and age with community controls. An additional control group will comprise unmatched hospital controls drawn from children aged 0–4 years attending the same hospitals and minor injury units for other types of injury. Conditional logistic regression will be used for the analysis of cases and matched controls, and unconditional logistic regression for the analysis of cases and unmatched controls to estimate ORs and 95% CI, adjusted and unadjusted for confounding variables.
Main exposure measures
Use of safety equipment and safety practices for scald prevention and scald hazards.
Discussion
This large case-control study will investigate modifiable risk factors for scalds injuries, adjust for potential confounders and validate measures of exposure. Its findings will enhance the evidence base for prevention of scalds injuries in young children.
doi:10.1136/injuryprev-2014-041255
PMCID: PMC4174015  PMID: 24842981
14.  Conditional Poisson models: a flexible alternative to conditional logistic case cross-over analysis 
Background
The time stratified case cross-over approach is a popular alternative to conventional time series regression for analysing associations between time series of environmental exposures (air pollution, weather) and counts of health outcomes. These are almost always analyzed using conditional logistic regression on data expanded to case–control (case crossover) format, but this has some limitations. In particular adjusting for overdispersion and auto-correlation in the counts is not possible. It has been established that a Poisson model for counts with stratum indicators gives identical estimates to those from conditional logistic regression and does not have these limitations, but it is little used, probably because of the overheads in estimating many stratum parameters.
Methods
The conditional Poisson model avoids estimating stratum parameters by conditioning on the total event count in each stratum, thus simplifying the computing and increasing the number of strata for which fitting is feasible compared with the standard unconditional Poisson model. Unlike the conditional logistic model, the conditional Poisson model does not require expanding the data, and can adjust for overdispersion and auto-correlation. It is available in Stata, R, and other packages.
Results
By applying the models to real data and using simulations, we demonstrate that conditional Poisson models are simpler to code and faster to run than conditional logistic analyses, and can be fitted to larger data sets than is possible with standard Poisson models. Allowing for overdispersion or autocorrelation was possible with the conditional Poisson model, but when not required this model gave identical estimates to those from conditional logistic regression.
Conclusions
Conditional Poisson regression models provide an alternative to case crossover analysis of stratified time series data with some advantages. The conditional Poisson model can also be used in other contexts in which primary control for confounding is by fine stratification.
Electronic supplementary material
The online version of this article (doi:10.1186/1471-2288-14-122) contains supplementary material, which is available to authorized users.
doi:10.1186/1471-2288-14-122
PMCID: PMC4280686  PMID: 25417555
Statistics; Conditional distributions; Poisson regression; Time series regression; Environment
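A minimal sketch of the equivalence discussed above, assuming Python with statsmodels and simulated stratified counts: Poisson regression with stratum indicators and conditional Poisson regression (conditioning on stratum totals) return the same exposure estimate, with the conditional fit avoiding the nuisance stratum parameters.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from statsmodels.discrete.conditional_models import ConditionalPoisson

    rng = np.random.default_rng(7)
    n_strata, k = 200, 10                       # e.g. time strata of daily counts
    stratum = np.repeat(np.arange(n_strata), k)
    temp = rng.normal(size=n_strata * k)
    base = np.repeat(rng.normal(2.0, 0.5, size=n_strata), k)   # stratum effects
    df = pd.DataFrame({"counts": rng.poisson(np.exp(base + 0.05 * temp)),
                       "temp": temp, "stratum": stratum})

    # Unconditional Poisson with one indicator per stratum
    fe = smf.glm("counts ~ temp + C(stratum)", data=df,
                 family=sm.families.Poisson()).fit()
    # Conditional Poisson: conditions the stratum effects out entirely
    cp = ConditionalPoisson(df["counts"], df[["temp"]], groups=df["stratum"]).fit()
    print(fe.params["temp"], cp.params["temp"])  # agree to numerical precision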
15.  A case-control study of malignant and non-malignant respiratory disease among employees of a fiberglass manufacturing facility. II. Exposure assessment. 
A case-control study of malignant and non-malignant respiratory disease among employees of the Owens-Corning Fiberglas Corporation's Newark, Ohio plant was undertaken. The aim was to determine the extent to which exposures to substances in the Newark plant environment, to non-workplace factors, or to a combination may play a part in the risk of mortality from respiratory disease among workers in this plant. A historical environmental reconstruction of the plant was undertaken to characterise the exposure profile for workers in this plant from its beginnings in 1934 to the end of 1987. The exposure profile provided estimates of cumulative exposure to respirable fibres, fine fibres, asbestos, talc, formaldehyde, silica, and asphalt fumes. Employment histories from Owens-Corning Fiberglas provided information on employment characteristics (duration of employment, year of hire, age at first hire) and an interview survey obtained information on demographic characteristics (birthdate, race, education, marital status, parent's ethnic background, and place of birth), lifetime residence, occupational and smoking histories, hobbies, and personal and family medical history. Matched, unadjusted odds ratios (ORs) were used to assess the association between lung cancer or non-malignant respiratory disease and the cumulative exposure history, demographic characteristics, and employment variables. Only the smoking variables and employment characteristics (year of hire and age at first hire) were statistically significant for lung cancer. For non-malignant respiratory disease, only the smoking variables were statistically significant in the univariate analysis. Of the variables entered into a conditional logistic regression model for lung cancer, only smoking (smoked for six months or more v never smoked: OR = 26.17, 95% confidence interval (95% CI) 3.316-206.5) and age at first hire (35 and over v less than 35: OR = 0.244, 95% CI 0.083-0.717) were statistically significant. There were, however, increased ORs for year of employment (first hired before 1945 v first hired after 1945: OR = 1.944, 95% CI 0.850-4.445), talc (cumulative exposure >1000 fibres/ml days v never exposed: OR = 1.355, 95% CI 0.407-5.515), and asphalt fumes (cumulative exposure >0.01 mg/m(3) days v never exposed: OR = 1.131, 95% CI 0.468-2.730). For non-malignant respiratory disease, only the smoking variable was significant in the conditional logistic regression analysis (OR = 2.637, 95% CI 1.146-6.069). There were raised ORs for the higher cumulative exposure categories for respirable fibres, asbestos, silica, and asphalt fumes. For both silica and asphalt fumes, ORs were more than double the reference groups for all exposure categories. A limited number of subjects were exposed to fine fibres. The scarcity of cases and controls limits the extent to which analyses for fine fibre may be carried out. Within those limitations, among those who had worked with fine fibre, the unadjusted, unmatched OR for lung cancer was 1.0 (95% CI 0.229-4.373) and for non-malignant respiratory disease, the OR was 1.5 (95% CI 0.336-6.702). The unadjusted OR for lung cancer for exposure to fine fibre was consistent with that for all respirable fibre and does not suggest an association. For non-malignant respiratory disease, the unadjusted OR for fine fibre was opposite in direction from that for all respirable fibres.
Within the limitations of the available data on fibre, there is no suggestion that exposure to fine fibre has resulted in an increase in risk of lung cancer. The increased OR for non-malignant respiratory disease is inconclusive. The results indicate that, for this population in this place and time, neither respirable fibres nor any of the substances investigated as part of the plant environment were statistically significant factors for lung cancer risk, although there were increased ORs for exposure to talc and asphalt fumes. Smoking was the most important factor in risk for lung cancer in this population. The situation is less clear for non-malignant respiratory disease. Unlike lung cancer, non-malignant respiratory disease represents a constellation of outcomes and not a single well defined end point. Although smoking was the only statistically significant factor for non-malignant respiratory disease in this analysis, the ORs for respirable fibres, asbestos, silica, and asphalt fumes were greater than unity for the highest exposure categories. Although the raised ORs for these substances may represent the results of a random process, they may be suggestive of an increased risk and require further investigation.
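The matched analyses described above rest on conditional logistic regression, which conditions the matched-set nuisance parameters out of the likelihood. As a hedged illustration only, a minimal Python sketch with statsmodels; the file name, column names, and exposure indicators are invented for this example and do not come from the study:

    import numpy as np
    import pandas as pd
    from statsmodels.discrete.conditional_models import ConditionalLogit

    # Assumed columns: case (1 = lung cancer case, 0 = control), smoker,
    # hired_35plus, talc_high, and matched_set (the stratum identifier)
    df = pd.read_csv("matched_sets.csv")

    model = ConditionalLogit(
        df["case"],
        df[["smoker", "hired_35plus", "talc_high"]],
        groups=df["matched_set"],
    )
    res = model.fit()

    # Exponentiated coefficients give matched ORs and 95% CIs
    print(np.exp(res.params))
    print(np.exp(res.conf_int()))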
PMCID: PMC1012175  PMID: 8398858
16.  Effectiveness of the Standard WHO Recommended Retreatment Regimen (Category II) for Tuberculosis in Kampala, Uganda: A Prospective Cohort Study 
PLoS Medicine  2011;8(3):e1000427.
Prospective evaluation of the effectiveness of the WHO-recommended standardized retreatment regimen for tuberculosis by Edward Jones-López and colleagues reveals an unacceptable proportion of unsuccessful outcomes.
Background
Each year, 10%–20% of patients with tuberculosis (TB) in low- and middle-income countries present with previously treated TB and are empirically started on a World Health Organization (WHO)-recommended standardized retreatment regimen. The effectiveness of this retreatment regimen has not been systematically evaluated.
Methods and Findings
From July 2003 to January 2007, we enrolled smear-positive, pulmonary TB patients into a prospective cohort to study treatment outcomes and mortality during and after treatment with the standardized retreatment regimen. Median time of follow-up was 21 months (interquartile range 12–33 months). A total of 29/148 (20%) HIV-uninfected and 37/140 (26%) HIV-infected patients had an unsuccessful treatment outcome. In a multiple logistic regression analysis to adjust for confounding, factors associated with an unsuccessful treatment outcome were poor adherence (adjusted odds ratio [aOR] associated with missing half or more of scheduled doses 2.39; 95% confidence interval (CI) 1.10–5.22), HIV infection (2.16; 1.01–4.61), age (aOR for 10-year increase 1.59; 1.13–2.25), and duration of TB symptoms (aOR for 1-month increase 1.12; 1.04–1.20). All patients with multidrug-resistant TB had an unsuccessful treatment outcome. HIV-infected individuals were more likely to die than HIV-uninfected individuals (p<0.0001). Multidrug-resistant TB at enrolment was the only common risk factor for death during follow-up for both HIV-infected (adjusted hazard ratio [aHR] 17.9; 6.0–53.4) and HIV-uninfected (14.7; 4.1–52.2) individuals. Other risk factors for death during follow-up among HIV-infected patients were CD4<50 cells/μl and no antiretroviral treatment (aHR 7.4, compared to patients with CD4≥200; 3.0–18.8) and Karnofsky score <70 (2.1; 1.1–4.1); and among HIV-uninfected patients were poor adherence (missing half or more of doses) (3.5; 1.1–10.6) and duration of TB symptoms (aHR for a 1-month increase 1.9; 1.0–3.5).
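To make the adjusted-odds-ratio step concrete, here is a minimal sketch of the kind of logistic model reported above, in Python with statsmodels; the data file and column names are assumptions for illustration, not the study's actual variables:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed columns: unsuccessful (0/1), missed_half_doses (0/1),
    # hiv (0/1), age_decades (age divided by 10), symptom_months
    df = pd.read_csv("retreatment_cohort.csv")

    fit = smf.logit(
        "unsuccessful ~ missed_half_doses + hiv + age_decades + symptom_months",
        data=df,
    ).fit()

    # Exponentiated coefficients give the aORs and their 95% CIs
    print(np.exp(fit.params))
    print(np.exp(fit.conf_int()))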
Conclusions
The recommended regimen for retreatment TB in Uganda yields an unacceptable proportion of unsuccessful outcomes. There is a need to evaluate new treatment strategies in these patients.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
One-third of the world's population is currently infected with Mycobacterium tuberculosis, the bacterium that causes tuberculosis (TB), and 5%–10% of HIV-uninfected individuals will go on to develop disease and become infectious. The risk of progression from infection to disease is much higher in HIV-infected individuals. If left untreated, each person with active TB may infect 10 to 15 people every year, reinforcing the public health priority of controlling TB through adequate treatment. Patients with a previous history of TB treatment are a major concern for TB programs throughout the world because these patients are at a much higher risk of harboring a form of TB that is resistant to the drugs most frequently used, resulting in poorer treatment outcomes and significantly complicating current management strategies. More than 1 million people in over 90 countries need to be “re-treated” after failing, interrupting, or relapsing from previous TB treatment.
Every year, 10%–20% of people with TB in low- and middle-income countries are started on a standardized five-drug retreatment regimen as recommended by the World Health Organization (WHO). Yet, unlike treatment regimens for newly diagnosed TB patients, the recommended retreatment regimen (also known as the category II regimen) has never been properly evaluated in randomized clinical trials or prospective cohort studies. Rather, this regimen was recommended by experts before the current situation of widespread drug-resistant TB and HIV infection.
Why Was This Study Done?
WHO surveillance data suggest that the retreatment regimen is successful in about 70% of patients, but retrospective studies that have evaluated the regimen's efficacy showed variable treatment responses with success rates ranging from 26% to 92%. However, these studies have generally only assessed outcomes at the completion of the retreatment regimen, and few have examined the risk of TB recurrence, especially in people who are also infected with HIV and so are more likely to experience TB recurrence—an issue of particular concern in sub-Saharan Africa. Therefore, in this study based in Kampala, Uganda, the researchers conducted a prospective cohort study to assess treatment and survival outcomes in patients previously treated for TB and to identify factors associated with poor outcomes. Given the overwhelming contribution of HIV infection to death, the researchers stratified their survival analysis by HIV status.
What Did the Researchers Do and Find?
The researchers recruited consecutive smear-positive TB patients who were admitted to Mulago Hospital, Kampala, Uganda, for the retreatment of TB with the standard retreatment regimen between July 2003 and January 2007. Eligible patients received daily directly observed therapy and, after hospital discharge, were seen every month during their 8-month TB-retreatment course. Home health visitors assessed treatment adherence through treatment card review, monthly pill counts, and patient self-report. After the completion of the retreatment regimen, patients were evaluated for TB recurrence every 3 months for a median of 21 months. The researchers then used statistical models to identify treatment outcomes and mortality in HIV-uninfected and HIV-infected patients.
The researchers found that 29/148 (20%) of HIV-uninfected and 37/140 (26%) of HIV-infected patients had an unsuccessful treatment outcome. Factors associated with an unsuccessful treatment outcome were poor adherence, HIV infection, increasing age, and duration of TB symptoms. All patients with multidrug-resistant TB, a form of TB that is resistant to the two most important drugs used to treat TB, had an unsuccessful treatment outcome. In addition, HIV-infected subjects were more likely to die than HIV-uninfected subjects (p<0.0001), and having multidrug-resistant TB at enrollment was the only common risk factor for death during follow-up for both HIV-infected and HIV-uninfected patients. Other risk factors for death among HIV-infected patients were CD4<50 cells/μl and no antiretroviral treatment, and among HIV-uninfected patients were poor adherence and duration of TB symptoms.
What Do These Findings Mean?
The researchers found that although 70%–80% of patients had a successful treatment outcome on completion of antituberculous therapy (a result that compares well with retrospective studies), the standard retreatment regimen had low treatment response rates and was associated with poor long-term outcomes in certain subgroups of patients, particularly those with multidrug resistant TB and HIV.
These findings indicate that the standard retreatment approach to TB as implemented in low- and middle-income settings is inadequate, and they stress the need for new, more effective strategies. Improved access to rapid diagnostics for TB drug-resistance, second-line TB treatment, and antiretroviral therapy is urgently needed, along with a strong evidence base to guide clinicians and policy makers on how best to use these tools.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000427.
The World Health Organization has information on TB, TB retreatment, and multidrug-resistant TB
WHO also provides information on TB/HIV coinfection
The Stop TB Partnership provides information on the global plan to stop TB
doi:10.1371/journal.pmed.1000427
PMCID: PMC3058098  PMID: 21423586
17.  Risk of Violent Crime in Individuals with Epilepsy and Traumatic Brain Injury: A 35-Year Swedish Population Study 
PLoS Medicine  2011;8(12):e1001150.
Seena Fazel and colleagues report findings from a longitudinal follow-up study in Sweden that evaluated the risks of violent crime subsequent to hospitalization for epilepsy or traumatic brain injury. The researchers control for familial confounding with sibling controls. The analyses call into question an association between epilepsy and violent crime, although they do suggest that there may be a relationship between traumatic brain injury and violent crime.
Background
Epilepsy and traumatic brain injury are common neurological conditions, with general population prevalence estimates around 0.5% and 0.3%, respectively. Although both conditions are associated with various adverse outcomes, and expert opinion has suggested increased criminality, links with violent behaviour remain uncertain.
Methods and Findings
We combined Swedish population registers from 1973 to 2009, and examined associations of epilepsy (n = 22,947) and traumatic brain injury (n = 22,914) with subsequent violent crime (defined as convictions for homicide, assault, robbery, arson, any sexual offense, or illegal threats or intimidation). Each case was age and gender matched with ten general population controls, and analysed using conditional logistic regression with adjustment for socio-demographic factors. In addition, we compared cases with unaffected siblings.
Among the traumatic brain injury cases, 2,011 individuals (8.8%) committed violent crime after diagnosis, which, compared with population controls (n = 229,118), corresponded to a substantially increased risk (adjusted odds ratio [aOR] = 3.3, 95% CI: 3.1–3.5); this risk was attenuated when cases were compared with unaffected siblings (aOR = 2.0, 1.8–2.3). Among individuals with epilepsy, 973 (4.2%) committed a violent offense after diagnosis, corresponding to a significantly increased odds of violent crime compared with 224,006 population controls (aOR = 1.5, 1.4–1.7). However, this association disappeared when individuals with epilepsy were compared with their unaffected siblings (aOR = 1.1, 0.9–1.2). We found heterogeneity in violence risk by age of disease onset, severity, comorbidity with substance abuse, and clinical subgroups. Case ascertainment was restricted to patient registers.
Conclusions
In this longitudinal population-based study, we found that, after adjustment for familial confounding, epilepsy was not associated with increased risk of violent crime, questioning expert opinion that has suggested a causal relationship. In contrast, although there was some attenuation in risk estimates after adjustment for familial factors and substance abuse in individuals with traumatic brain injury, we found a significantly increased risk of violent crime. The implications of these findings will vary for clinical services, the criminal justice system, and patient charities.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
News stories linking mental illness (diseases that appear primarily as abnormalities of thought, feeling or behavior) with violence frequently hit the headlines. But what about neurological conditions—disorders of the brain, spinal cord, and nerves? People with these disorders, which include dementia, Parkinson's disease, and brain tumors, often experience stigmatization and discrimination, a situation that is made worse by the media and by some experts suggesting that some neurological conditions increase the risk of violence. For example, many modern textbooks assert that epilepsy—a neurological condition that causes repeated seizures or fits—is associated with increased criminality and violence. Similarly, various case studies have linked traumatic brain injury—damage to the brain caused by a sudden blow to the head—with an increased risk of violence.
Why Was This Study Done?
Despite public and expert perceptions, very little is actually known about the relationship between epilepsy and traumatic brain injury and violence. In particular, few if any population-based, longitudinal studies have investigated whether there is an association between the onset of either of these two neurological conditions and violence at a later date. This information might make it easier to address the stigma that is associated with these conditions. Moreover, it might help scientists understand the neurobiological basis of violence, and it could help health professionals appropriately manage individuals with these two disorders. In this longitudinal study, the researchers begin to remedy the lack of hard information about links between neurological conditions and violence by investigating the risk of violent crime associated with epilepsy and with traumatic brain injury in the Swedish population.
What Did the Researchers Do and Find?
The researchers used the National Patient Register to identify all the cases of epilepsy and traumatic brain injury that occurred in Sweden between 1973 and 2009. They matched each case (nearly 23,000 for each condition) with ten members of the general population and retrieved data on all convictions for violent crime over the same period from the Crime Register. They then linked these data together using the personal identification numbers that identify Swedish residents in national registries. Overall, 4.2% of individuals with epilepsy had at least one conviction for violence after their diagnosis, but only 2.5% of the general population controls did. That is, epilepsy increased the absolute risk of a conviction for violence by 1.7%. Using a regression analysis that adjusted for age, gender, and various socio-demographic factors, the researchers calculated that the odds of individuals with epilepsy committing a violent crime were 1.5 times higher than for general population controls (an adjusted odds ratio [aOR] of 1.5). The strength of this association was reduced when further adjustment was made for substance abuse, and disappeared when individuals with epilepsy were compared with their unaffected siblings (a sibling control study). Similarly, 8.8% of individuals with traumatic brain injury were convicted of a violent crime after their diagnosis compared to only 3% of controls, giving an aOR of 3.3. Again, the strength of this association was reduced when affected individuals were compared to their unaffected siblings (aOR = 2.0) and when adjustment was made for substance abuse (aOR = 2.3).
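The unadjusted figures quoted here can be checked directly from the 2x2 table. A sketch using Woolf's confidence interval; the control conviction count is back-calculated from the reported 2.5% and is therefore only approximate:

    import numpy as np

    # Epilepsy cohort: 973 of 22,947 convicted; population controls:
    # ~2.5% of 224,006, i.e. about 5,600 (an assumed back-calculation)
    a, b = 973, 22947 - 973
    c, d = 5600, 224006 - 5600

    or_crude = (a * d) / (b * c)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)   # Woolf's method
    lo, hi = np.exp(np.log(or_crude) + np.array([-1.96, 1.96]) * se)
    print(f"crude OR {or_crude:.2f} (95% CI {lo:.2f}-{hi:.2f})")

The crude OR (about 1.7) sits above the reported aOR of 1.5, consistent with adjustment for socio-demographic factors pulling the estimate toward the null.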
What Do These Findings Mean?
Although some aspects of this study may have affected the accuracy of its findings, these results nevertheless challenge the idea that there are strong direct links between epilepsy and violent crime. The low absolute rate of violent crime and the lack of any association between epilepsy and violent crime in the sibling control study argue against a strong link, a potentially important finding given the stigmatization of epilepsy. For traumatic brain injury, the reduced association with violent crime in the sibling control study compared with the general population control study suggests that shared familial features may be responsible for some of the association between brain injury and violence. As with epilepsy, this finding should help patient charities who are trying to reduce the stigma associated with traumatic brain injury. Importantly, however, these findings also suggest that some groups of patients with these conditions (for example, patients with head injuries who abuse illegal drugs and alcohol) would benefit from being assessed for their risk of behaving violently and from appropriate management.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001150.
This study is further discussed in a PLoS Medicine Perspective by Jan Volavka
The US National Institute of Neurological Disorders and Stroke provides detailed information about traumatic brain injury and about epilepsy (in English and Spanish)
The UK National Health Service Choices website provides information about severe head injury, including a personal story about a head injury sustained in a motor vehicle accident, and information about epilepsy, including personal stories about living with epilepsy
Healthtalkonline has information on epilepsy, including patient perspectives
MedlinePlus provides links to further resources on traumatic brain injury and on epilepsy (available in English and Spanish)
doi:10.1371/journal.pmed.1001150
PMCID: PMC3246446  PMID: 22215988
18.  Smoking and high-risk mammographic parenchymal patterns: a case-control study 
Breast Cancer Research  1999;2(1):59-63.
Current smoking was strongly and inversely associated with high-risk patterns, after adjustment for concomitant risk factors. Relative to never smokers, current smokers were significantly less likely to have a high-risk pattern. Similar results were obtained when the analysis was confined to postmenopausal women. Past smoking was not related to the mammographic parenchymal patterns. The overall effect in postmenopausal women lost its significance when adjusted for other risk factors for P2/DY patterns that were found to be significant in the present study, although the results were still strongly suggestive. The present data indicate that adjustment for current smoking status is important when evaluating the relationship between mammographic parenchymal pattern and breast cancer risk. They also indicate that smoking is a prominent potential confounder when analyzing effects of other risk factors such as obesity-related variables. It appears that parenchymal patterns may act as an informative biomarker of the effect of cigarette smoking on breast cancer risk.
Introduction:
Overall, epidemiological studies [1,2,3,4] have reported no substantial association between cigarette smoking and the risk of breast cancer. Some studies [5,6,7] reported a significant increase of breast cancer risk among smokers. In recent studies that addressed the association between breast cancer and cigarette smoking, however, there was some suggestion of a decreased risk [8,9,10], especially among current smokers, ranging from approximately 10 to 30% [9,10]. Brunet et al [11] reported that smoking might reduce the risk of breast cancer by 44% in carriers of BRCA1 or BRCA2 gene mutations. Wolfe [12] described four different mammographic patterns created by variations in the relative amounts of fat, epithelial and connective tissue in the breast, designated N1, P1, P2 and DY. Women with either P2 or DY pattern are considered at greater risk for breast cancer than those with N1 or P1 pattern [12,13,14,15]. No published studies have assessed the relationship between smoking and mammographic parenchymal patterns.
Aims:
To evaluate whether mammographic parenchymal patterns as classified by Wolfe, which have been positively associated with breast cancer risk, are affected by smoking. In this case-control study, nested within the European Prospective Investigation on Cancer in Norfolk (EPIC-Norfolk) cohort [16], the association between smoking habits and mammographic parenchymal patterns is examined. The full results will be published elsewhere.
Methods:
Study subjects were members of the EPIC cohort in Norwich who also attended the prevalence screening round at the Norwich Breast Screening Centre between November 1989 and December 1997, and were free of breast cancer at that screening. Cases were defined as women with a P2/DY Wolfe's mammographic parenchymal pattern on the prevalence screen mammograms. A total of 203 women with P2/DY patterns were identified as cases and were individually matched by date of birth (within 1 year) and date of prevalence screening (within 3 months) with 203 women with N1/P1 patterns who served as control individuals.
Two views, the mediolateral and craniocaudal mammograms, of both breasts were independently reviewed by two of the authors (ES and RW) to determine the Wolfe mammographic parenchymal pattern.
Considerable information on health and lifestyle factors was available from the EPIC Health and Lifestyle Questionnaire [16]. In the present study we examined the subjects' personal history of benign breast diseases, menstrual and reproductive factors, oral contraception and hormone replacement therapy, smoking, and anthropometric information such as body mass index and waist:hip ratio.
Odds ratios (ORs) and their 95% confidence intervals (CIs) were calculated by conditional logistic regression [17], and were adjusted for possible confounding factors.
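For a 1:1 matched design with a binary exposure, the conditional-likelihood OR estimate reduces to the ratio of discordant pairs, which makes the calculation easy to verify by hand. A sketch with invented counts; these numbers are not taken from the study:

    import numpy as np

    n10 = 18   # pairs in which the case smokes and the control does not (assumed)
    n01 = 42   # pairs in which the control smokes and the case does not (assumed)

    or_matched = n10 / n01
    se = np.sqrt(1/n10 + 1/n01)
    lo, hi = np.exp(np.log(or_matched) + np.array([-1.96, 1.96]) * se)
    print(f"matched OR {or_matched:.2f} (95% CI {lo:.2f}-{hi:.2f})")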
Results:
The characteristics of the cases and controls are presented in Table 1. Cases were leaner than controls. A larger percentage of cases were nulliparous, premenopausal, current hormone replacement therapy users, had a personal history of benign breast diseases, and had had a hysterectomy. A larger proportion of controls had more than three births and were current smokers.
Table 2 shows the unadjusted and adjusted OR estimates for Wolfe's high-risk mammographic parenchymal patterns and smoking in the total study population and in postmenopausal women separately. Current smoking was strongly and inversely associated with high-risk patterns, after adjustment for concomitant risk factors. Relative to never smokers, current smokers were significantly less likely to have a high-risk pattern (OR 0.37, 95% CI 0.14-0.94). Similar results were obtained when the analysis was confined to postmenopausal women. Past smoking was not related to mammographic parenchymal patterns. The overall effect in postmenopausal women lost its significance when adjusted for other risk factors for P2/DY patterns that were found to be significant in the present study, although the results were still strongly suggestive. There was no interaction between cigarette smoking and body mass index.
Discussion:
In the present study we found a strong inverse relationship between current smoking and high-risk mammographic parenchymal patterns of breast tissue as classified by Wolfe [12]. These findings are not completely unprecedented; Greendale et al [18] found a reduced risk of breast density in association with smoking, although the magnitude of the reduction was unclear. The present findings suggest that this reduction is large.
Recent studies [9,10] have suggested that breast cancer risk may be reduced among current smokers. In a multicentre Italian case-control study, Braga et al [10] found that, relative to nonsmokers, current smokers had a reduced risk of breast cancer (OR 0.84, 95% CI 0.7-1.0). These findings were recently supported by Gammon et al [9], who reported that breast cancer risk in younger women (younger than 45 years) may be reduced among current smokers who began smoking at an early age (OR 0.59, 95% CI 0.41-0.85 for age 15 years or younger) and among long-term smokers (OR 0.70, 95% CI 0.52-0.94 for those who had smoked for 21 years or more).
The possible protective effect of smoking might be due to its anti-oestrogenic effect [1,2,19]. Recently there has been renewed interest in the potential effect of smoking on breast cancer risk, and whether individuals may respond differently on the basis of differences in metabolism of bioproducts of smoking [20,21]. Different relationships between smoking and breast cancer risk have been suggested that are dependent on the rapid or slow status of acetylators of aromatic amines [20,21]. More recent studies [22,23], however, do not support these findings.
The present study design minimized the opportunity for bias to influence the findings. Because subjects were unaware of their own case-control status, the possibility of recall bias in reporting smoking status was minimized. Systematic error in the assessment of mammograms was avoided because reading was done without knowledge of the risk factor data. Furthermore, the associations observed are unlikely to be explained by the confounding effect of other known breast cancer risk factors, because we adjusted for these in the analysis. We did not have information on passive smoking status, however, which has recently been reported to be a possible confounder [5,6,21,24].
The present data indicate that adjustment for current smoking status is important when evaluating the relationship between mammographic parenchymal pattern and breast cancer risk. They also indicate smoking as a prominent potential confounder when analyzing effects of other risk factors such as obesity-related variables. It seems that parenchymal patterns may act as an informative biomarker of the effect of cigarette smoking on breast cancer risk.
PMCID: PMC13911  PMID: 11056684
mammography; screening; smoking; Wolfe's parenchymal patterns
19.  Confounder Summary Scores When Comparing the Effects of Multiple Drug Exposures 
Purpose
Little information is available comparing methods to adjust for confounding when considering multiple drug exposures. We compared three analytic strategies to control for confounding based on measured variables: conventional multivariable, exposure propensity score (EPS) and disease risk score (DRS).
Methods
Each method was applied to a dataset (2000–2006) recently used to examine the comparative effectiveness of four drugs. The relative effectiveness of risedronate, nasal calcitonin, and raloxifene in preventing nonvertebral fracture was compared, in each case, to that of alendronate. EPSs were derived both by using multinomial logistic regression (single model EPS) and by three separate logistic regression models (separate model EPS). DRSs were derived and event rates compared using Cox proportional hazard models. DRSs derived among the entire cohort (full cohort DRS) were compared to DRSs derived only among the referent alendronate users (unexposed cohort DRS).
Results
Less than 5 percent deviation from the base estimate (conventional multivariable) was observed applying single model EPS, separate model EPS or full cohort DRS. Applying the unexposed cohort DRS when background risk for fracture differed between comparison drug exposure cohorts resulted in −7% to +13% deviation from our base estimate.
Conclusions
With sufficient numbers of exposed subjects and outcomes, any of conventional multivariable adjustment, EPS, or full cohort DRS may be used to adjust for confounding when comparing the effects of multiple drug exposures. However, our data also suggest that the unexposed cohort DRS may be problematic when background risks differ between referent and exposed groups. Further empirical and simulation studies will help to clarify the generalizability of our findings.
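A minimal sketch of how the two summary scores could be derived in Python (statsmodels), using logistic stand-ins for the paper's Cox outcome models; the file and every column name are assumptions for illustration:

    import pandas as pd
    import statsmodels.api as sm

    # Assumed columns: drug (four categories), fracture (0/1), covariates
    df = pd.read_csv("bone_drug_cohort.csv")
    X = sm.add_constant(df[["age", "prior_fracture", "steroid_use"]])

    # Single-model EPS: one multinomial logistic regression yields a
    # predicted probability for every exposure category
    eps = sm.MNLogit(df["drug"], X).fit().predict(X)

    # Unexposed-cohort DRS: fit the outcome model among the referent
    # drug only, then score the entire cohort with it
    ref = df["drug"] == "alendronate"
    drs_fit = sm.Logit(df.loc[ref, "fracture"], X.loc[ref]).fit()
    df["drs"] = drs_fit.predict(X)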
doi:10.1002/pds.1845
PMCID: PMC2800174  PMID: 19757416
Drug Evaluation; Epidemiology; Pharmaceutical; Epidemiological Methods; Population Studies
20.  Impact of occupational carcinogens on lung cancer risk in a general population 
Background Exposure to occupational carcinogens is an important preventable cause of lung cancer. Most of the previous studies were in highly exposed industrial cohorts. Our aim was to quantify lung cancer burden attributable to occupational carcinogens in a general population.
Methods We applied a new job–exposure matrix (JEM) to translate lifetime work histories, collected by personal interview and coded into standard job titles, into never, low, and high exposure levels for six known/suspected occupational lung carcinogens in the Environment and Genetics in Lung cancer Etiology (EAGLE) population-based case–control study, conducted in the Lombardy region, Italy, in 2002–05. Odds ratios (ORs) and 95% confidence intervals (CIs) were calculated in men (1537 cases and 1617 controls) by logistic regression adjusted for potential confounders, including smoking and co-exposure to JEM carcinogens. The population attributable fraction (PAF) was estimated as the impact measure.
Results Men showed an increased lung cancer risk even at low exposure to asbestos (OR: 1.76; 95% CI: 1.42–2.18), crystalline silica (OR: 1.31; 95% CI: 1.00–1.71) and nickel–chromium (OR: 1.18; 95% CI: 0.90–1.53); risk increased with exposure level. For polycyclic aromatic hydrocarbons, an increased risk (OR: 1.64; 95% CI: 0.99–2.70) was found only for high exposures. The PAFs for any exposure to asbestos, silica and nickel–chromium were 18.1, 5.7 and 7.0%, respectively, equivalent to an overall PAF of 22.5% (95% CI: 14.1–30.0). This corresponds to about 1016 (95% CI: 637–1355) male lung cancer cases/year in Lombardy.
Conclusions These findings support the substantial role of selected occupational carcinogens on lung cancer burden, even at low exposures, in a general population.
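The PAF arithmetic can be illustrated with Miettinen's case-based formula, PAF = p_c * (OR - 1) / OR, where p_c is the exposure prevalence among cases. Whether the study used exactly this estimator is not stated here, and the prevalence below is back-calculated to match the asbestos figure, so treat it as illustration only:

    # Miettinen's case-based formula for the population attributable fraction
    def paf(odds_ratio, p_cases_exposed):
        return p_cases_exposed * (odds_ratio - 1.0) / odds_ratio

    # With OR = 1.76 for any asbestos exposure, an assumed case-exposure
    # prevalence of ~42% reproduces the reported PAF of 18.1%
    print(paf(1.76, 0.42))   # ~0.18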
doi:10.1093/ije/dys042
PMCID: PMC3396321  PMID: 22467291
lung neoplasms; case–control study; carcinogens; occupational health
21.  Network-based Regularization for Matched Case-Control Analysis of High-dimensional DNA Methylation Data 
Statistics in medicine  2012;32(12):2127-2139.
Matched case-control designs are commonly used to control for potential confounding factors in genetic epidemiology studies, especially in epigenetic studies of DNA methylation. Compared with unmatched case-control studies with high-dimensional genomic or epigenetic data, there have been few variable selection methods for matched sets. In an earlier article, we proposed a penalized logistic regression model for the analysis of unmatched DNA methylation data using a network-based penalty. However, for the matched designs widely applied in epigenetic studies, which compare DNA methylation between tumor and adjacent non-tumor tissues or between pre-treatment and post-treatment conditions, applying ordinary logistic regression that ignores matching is known to introduce serious bias in estimation. In this article, we developed a penalized conditional logistic model using the network-based penalty that encourages a grouping effect among 1) linked CpG sites within a gene or 2) linked genes within a genetic pathway for the analysis of matched DNA methylation data. In our simulation studies, we demonstrated the superiority of the conditional logistic model over the unconditional logistic model in high-dimensional variable selection problems for matched case-control data. We further investigated the benefits of utilizing biological group or graph information for matched case-control data. The proposed method was applied to a genome-wide DNA methylation study of hepatocellular carcinoma (HCC) in which DNA methylation levels of tumor and adjacent non-tumor tissues from HCC patients were investigated using the Illumina Infinium HumanMethylation27 BeadChip. Several new CpG sites and genes known to be related to HCC were identified but had been missed by the standard method in the original paper.
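To make the penalty concrete: for 1:1 matched pairs the conditional log-likelihood depends only on case-minus-control covariate differences, and the network enters as a quadratic graph-Laplacian term. A toy sketch under those assumptions (the paper's penalty also includes an L1 component, omitted here to keep the objective smooth; all data below are simulated):

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    d = rng.normal(size=(100, 10))   # case-minus-control differences (toy data)
    L = np.eye(10)                   # graph Laplacian of the CpG network (toy)
    lam = 1.0                        # network penalty weight (assumed)

    def penalized_nll(beta):
        # 1:1 conditional logistic log-likelihood: sum of log sigmoid(d @ beta)
        eta = d @ beta
        nll = np.sum(np.logaddexp(0.0, -eta))
        return nll + lam * beta @ L @ beta   # network smoothness penalty

    beta_hat = minimize(penalized_nll, np.zeros(10), method="BFGS").x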
doi:10.1002/sim.5694
PMCID: PMC4038397  PMID: 23212810
DNA methylation; Genetic pathways; Matched case-control; Network-based regularization; Penalized conditional logistic; Variable selection
22.  Problem drinking as a risk factor for tuberculosis: a propensity score matched analysis of a national survey 
BMC Public Health  2013;13:871.
Background
Epidemiological and other evidence strongly supports the hypothesis that problem drinking is causally related to the incidence of active tuberculosis and to a worsening of the disease course. The presence of a large number of potential confounders, however, complicates the assessment of the actual size of this causal effect, leaving room for a substantial amount of bias. This study aims to contribute to the understanding of the role of confounding in the observed association between problem drinking and tuberculosis by assessing the effect of adjustment for a relatively large number of potential confounders on the estimated prevalence odds ratio of tuberculosis among problem drinkers vs. moderate drinkers/abstainers in a cross-sectional, nationally representative sample of the South African adult population.
Methods
A propensity score approach was used to match each problem drinker in the sample with a subset of moderate drinkers/abstainers with similar characteristics with respect to a set of potential confounders. The prevalence odds ratio of tuberculosis between the matched groups was then calculated using conditional logistic regression. Sensitivity analyses were conducted to assess the robustness of the results with respect to misspecification of the model.
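A sketch of the matching step, simplified to greedy 1:1 nearest-neighbour matching on a logistic propensity score (the study matched each problem drinker to a subset of comparators, so this is a simplification); the file and column names are assumptions:

    import pandas as pd
    import statsmodels.api as sm

    # Assumed columns: drinker (0/1), tb (0/1), and confounders
    df = pd.read_csv("survey.csv")
    X = sm.add_constant(df[["age", "sex", "income", "urban"]])
    ps = sm.Logit(df["drinker"], X).fit().predict(X)

    treated = df.index[df["drinker"] == 1]
    available = set(df.index[df["drinker"] == 0])

    # Greedy nearest-neighbour matching without replacement
    matches = {}
    for i in treated:
        j = min(available, key=lambda k: abs(ps[k] - ps[i]))
        matches[i] = j
        available.remove(j)

Conditional logistic regression on the resulting matched sets then yields the prevalence odds ratio described above.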
Results
The prevalence odds ratio of tuberculosis between problem drinkers and moderate drinkers/abstainers was 1.97 (95% CI: 1.40 to 2.77), and the result was robust with respect to the matching procedure as well as to incorrect adjustment for potential mediators and to the possible presence of unmeasured confounders. Sub-population analysis did not provide noteworthy evidence for the presence of interaction between problem drinking and the observed confounders.
Conclusion
In a cross-sectional national survey of the adult population of a middle-income country with a high tuberculosis burden, problem drinking was associated with a twofold increase in the odds of past TB diagnosis after controlling for a large number of socio-economic and biological confounders. Within the limitations of a cross-sectional study design with self-reported tuberculosis status, these results add to previous evidence of a causal link between problem drinking and tuberculosis, and suggest that the higher prevalence of tuberculosis among problem drinkers commonly found in population studies cannot be attributed to the confounding effect of the uneven distribution of other risk factors.
doi:10.1186/1471-2458-13-871
PMCID: PMC3852702  PMID: 24053258
Tuberculosis; Alcohol; Problem drinking; Multiple confounders; Propensity score; Sensitivity analysis
23.  Measurement error adjustment in essential fatty acid intake from a food frequency questionnaire: alternative approaches and methods 
Background
We aimed to assess the degree of measurement error in essential fatty acid intakes from a food frequency questionnaire and the impact of correcting for such error on the precision and bias of odds ratios in logistic models. To assess these impacts, and for illustrative purposes, alternative approaches and methods were used with the binary outcome of cognitive decline in verbal fluency.
Methods
Using the Atherosclerosis Risk in Communities (ARIC) study, we conducted a sensitivity analysis. The error-prone exposure – visit 1 fatty acid intake (1987–89) – was available for 7,814 subjects 50 years or older at baseline with complete data on cognitive decline between visits 2 (1990–92) and 4 (1996–98). Our binary outcome of interest was clinically significant decline in verbal fluency. Point estimates and 95% confidence intervals were compared between naïve and measurement-error adjusted odds ratios of decline with every SD increase in fatty acid intake as % of energy. Two approaches were explored for adjustment: (A) external validation against biomarkers (plasma fatty acids in cholesteryl esters and phospholipids) and (B) internal repeat measurements at visits 2 and 3. The main difference between the two is that Approach B makes a stronger assumption regarding the lack of error correlations in the structural model. Additionally, we compared results from regression calibration (RCAL) to those from simulation extrapolation (SIMEX). Finally, using structural equations modeling, we estimated attenuation factors associated with each dietary exposure to assess the degree of measurement error in a bivariate scenario for regression calibration of the logistic regression model.
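A self-contained sketch of univariate regression calibration on simulated data, in which the slope of the calibration model is the attenuation factor; everything below is synthetic and only mimics the FFQ/biomarker setup:

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 2000
    x = rng.normal(size=n)                        # true intake (unobserved)
    w = x + rng.normal(scale=1.0, size=n)         # error-prone FFQ measure
    x_star = x + rng.normal(scale=0.3, size=n)    # biomarker surrogate
    y = rng.binomial(1, 1 / (1 + np.exp(1 - 0.5 * x)))
    df = pd.DataFrame({"w": w, "x_star": x_star, "decline": y})

    cal = smf.ols("x_star ~ w", data=df).fit()    # calibration model
    df["x_cal"] = cal.predict(df)

    naive = smf.logit("decline ~ w", data=df).fit()
    rcal = smf.logit("decline ~ x_cal", data=df).fit()

    print("attenuation factor:", cal.params["w"])       # ~0.5 here
    print("naive log-OR:", naive.params["w"])           # biased toward 0
    print("calibrated log-OR:", rcal.params["x_cal"])   # ~0.5, the truth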
Results and conclusion
Attenuation factors for Approach A were smaller than those for Approach B, suggesting a larger amount of measurement error in the dietary exposure. Replicate measures (Approach B), unlike concentration biomarkers (Approach A), may lead to imprecise odds ratios due to larger standard errors. Using SIMEX rather than RCAL models tends to preserve the precision of odds ratios. We found in many cases that bias in naïve odds ratios was towards the null. RCAL tended to correct for a larger amount of effect bias than SIMEX, particularly for Approach A.
doi:10.1186/1471-2288-7-41
PMCID: PMC2048969  PMID: 17868465
24.  Adjusting for geographic variation in observational comparative effectiveness studies: a case study of antipsychotics using state Medicaid data 
Background
Area-level variation in treatment and outcomes may be a potential source of confounding bias in observational comparative effectiveness studies. This paper demonstrates how to use exploratory spatial data analysis (ESDA) and spatial statistical methods to investigate and control for these potential biases. The case presented compares the effectiveness of two antipsychotic treatment strategies: oral second-generation antipsychotics (SGAs) vs. long-acting paliperidone palmitate (PP).
Methods
A new-start cohort study was conducted analyzing patient-level administrative claims data (8/1/2008–4/30/2011) from Missouri Medicaid. ESDA techniques were used to examine spatial patterns of antipsychotic prescriptions and outcomes (hospitalization and emergency department (ED) visits). The likelihood of mental health-related outcomes was compared between patients starting PP (N = 295) and oral SGAs (N = 8,626) using multilevel logistic regression models adjusting for patient composition (demographic and clinical factors) and geographic region.
Results
ESDA indicated significant spatial variation in antipsychotic prescription patterns and moderate variation in hospitalization and ED visits thereby indicating possible confounding by geography. In the multilevel models for this antipsychotic case example, patient composition represented a stronger source of confounding than geographic context.
Conclusion
Because geographic variation in health care delivery is ubiquitous, it could be a comparative effectiveness research (CER) best practice to test for possible geographic confounding in observational data. Though the magnitude of the area-level geography effects was small in this case, they were still statistically significant and should therefore be examined as part of this observational CER study. More research is needed to better estimate the range of confounding due to geography across different types of observational comparative effectiveness studies and healthcare utilization outcomes.
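One way to test for geographic confounding is a region-level random-intercept logistic model; a hedged sketch with statsmodels' variational-Bayes mixed GLM, where the claims file and all column names are assumptions, not the study's data:

    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    # Assumed columns: hospitalized (0/1), pp (1 = paliperidone palmitate,
    # 0 = oral SGA), patient covariates, and a region identifier
    df = pd.read_csv("claims_cohort.csv")

    model = BinomialBayesMixedGLM.from_formula(
        "hospitalized ~ pp + age + male + prior_ed_visits",
        {"region": "0 + C(region)"},   # random intercept for each region
        data=df,
    )
    result = model.fit_vb()
    print(result.summary())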
Electronic supplementary material
The online version of this article (doi:10.1186/1472-6963-14-355) contains supplementary material, which is available to authorized users.
doi:10.1186/1472-6963-14-355
PMCID: PMC4161848  PMID: 25164423
Comparative effectiveness research; Antipsychotics; Mental health; Small area variation; Geographic information systems; Spatial data analysis; Multilevel modeling
25.  Active or Passive Exposure to Tobacco Smoking and Allergic Rhinitis, Allergic Dermatitis, and Food Allergy in Adults and Children: A Systematic Review and Meta-Analysis 
PLoS Medicine  2014;11(3):e1001611.
In a systematic review and meta-analysis, Bahi Takkouche and colleagues examine the associations between exposure to tobacco smoke and allergic disorders in children and adults.
Background
Allergic rhinitis, allergic dermatitis, and food allergy are extremely common diseases, especially among children, and are frequently associated with each other and with asthma. Smoking is a potential risk factor for these conditions, but so far results from individual studies have been conflicting. The objective of this study was to examine the evidence for an association between active smoking (AS) or passive exposure to secondhand smoke (SHS) and allergic conditions.
Methods and Findings
We retrieved studies published in any language up to June 30th, 2013 by systematically searching Medline, Embase, the five regional bibliographic databases of the World Health Organization, and ISI-Proceedings databases, by manually examining the references of the original articles and reviews retrieved, and by establishing personal contact with clinical researchers. We included cohort, case-control, and cross-sectional studies reporting odds ratio (OR) or relative risk (RR) estimates and confidence intervals of smoking and allergic conditions, first among the general population and then among children.
We retrieved 97 studies on allergic rhinitis, 91 on allergic dermatitis, and eight on food allergy, published in 139 different articles. When all studies were analyzed together (showing random effects model results and pooled ORs expressed as RR), allergic rhinitis was not associated with active smoking (pooled RR, 1.02 [95% CI 0.92–1.15]), but was associated with passive smoking (pooled RR 1.10 [95% CI 1.06–1.15]). Allergic dermatitis was associated with both active (pooled RR, 1.21 [95% CI 1.14–1.29]) and passive smoking (pooled RR, 1.07 [95% CI 1.03–1.12]). In children and adolescents, allergic rhinitis was associated with active (pooled RR, 1.40 [95% CI 1.24–1.59]) and passive smoking (pooled RR, 1.09 [95% CI 1.04–1.14]). Allergic dermatitis was associated with active (pooled RR, 1.36 [95% CI 1.17–1.46]) and passive smoking (pooled RR, 1.06 [95% CI 1.01–1.11]). Food allergy was associated with SHS (pooled RR, 1.43 [95% CI 1.12–1.83]) when cohort studies only were examined, but not when all studies were combined.
The findings are limited by the potential for confounding and bias given that most of the individual studies used a cross-sectional design. Furthermore, the studies showed a high degree of heterogeneity and the exposure and outcome measures were assessed by self-report, which may increase the potential for misclassification.
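Pooled RRs of this kind are typically obtained with DerSimonian-Laird random-effects weighting; a self-contained sketch with toy inputs (the log-RRs and standard errors below are invented, not the review's data):

    import numpy as np

    log_rr = np.log(np.array([1.25, 0.98, 1.12, 1.40]))   # toy study estimates
    se = np.array([0.10, 0.15, 0.08, 0.20])               # toy standard errors

    w = 1 / se**2                                  # fixed-effect weights
    theta_fe = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - theta_fe) ** 2)       # Cochran's Q
    k = len(log_rr)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (se**2 + tau2)                      # random-effects weights
    theta = np.sum(w_re * log_rr) / np.sum(w_re)
    se_theta = np.sqrt(1 / np.sum(w_re))
    lo, hi = np.exp(theta + np.array([-1.96, 1.96]) * se_theta)
    print(f"pooled RR {np.exp(theta):.2f} (95% CI {lo:.2f}-{hi:.2f})")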
Conclusions
We observed very modest associations between smoking and some allergic diseases among adults. Among children and adolescents, both active smoking and passive exposure to SHS were associated with a modestly increased risk of allergic diseases, and passive smoking was associated with an increased risk of food allergy. Additional studies with detailed measurement of exposure and better case definition are needed to further explore the role of smoking in allergic diseases.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
The immune system protects the human body from viruses, bacteria, and other pathogens. Whenever a pathogen enters the body, immune system cells called T lymphocytes recognize specific molecules on its surface and release chemical messengers that recruit and activate other types of immune cells, which then attack the pathogen. Sometimes, however, the immune system responds to harmless materials (for example, pollen; scientists call these materials allergens) and triggers an allergic disease such as allergic rhinitis (inflammation of the inside of the nose; hay fever is a type of allergic rhinitis), allergic dermatitis (also known as eczema, a disease characterized by dry, itchy patches on the skin), and food allergy. Recent studies suggest that all these allergic (atopic) diseases are part of a continuous state called the “atopic march” in which individuals develop allergic diseases in a specific sequence that starts with allergic dermatitis during infancy, and progresses to food allergy, allergic rhinitis, and finally asthma (inflammation of the airways).
Why Was This Study Done?
Allergic diseases are extremely common, particularly in children. Allergic rhinitis alone affects 10%–30% of the world's population and up to 40% of children in some countries. Moreover, allergic diseases are becoming increasingly common. Allergic diseases affect the quality of life of patients and are financially costly to both patients and health systems. It is important, therefore, to identify the factors that cause or potentiate their development. One potential risk factor for allergic diseases is active or passive exposure to tobacco smoke. In some countries up to 80% of children are exposed to second-hand smoke so, from a public health point of view, it would be useful to know whether exposure to tobacco smoke is associated with the development of allergic diseases. Here, the researchers undertake a systematic review (a study that uses predefined criteria to identify all the research on a given topic) and a meta-analysis (a statistical approach for combining the results of several studies) to investigate this issue.
What Did the Researchers Do and Find?
The researchers identified 196 observational studies (investigations that observe outcomes in populations without trying to affect these outcomes in any way) that examined the association between smoke exposure and allergic rhinitis, allergic dermatitis, or food allergy. When all studies were analyzed together, allergic rhinitis was not associated with active smoking but was slightly associated with exposure to second-hand smoke. Specifically, compared to people not exposed to second-hand smoke, the pooled relative risk (RR) of allergic rhinitis among people exposed to second-hand smoke was 1.10 (an RR of greater than 1 indicates an increased risk of disease development in an exposed population compared to an unexposed population). Allergic dermatitis was associated with both active smoking (RR = 1.21) and exposure to second-hand smoke (RR = 1.07). In the populations of children and adolescents included in the studies, allergic rhinitis was associated with both active smoking and exposure to second-hand smoke (RRs of 1.40 and 1.09, respectively), as was allergic dermatitis (RRs of 1.36 and 1.06, respectively). Finally, food allergy was associated with exposure to second-hand smoke (RR = 1.43) when only cohort studies (a specific type of observational study) were examined, but not when all the studies were combined.
What Do These Findings Mean?
These findings provide limited evidence for a weak association between smoke exposure and allergic disease in adults but suggest that both active and passive smoking are associated with a modestly increased risk of allergic diseases in children and adolescents. The accuracy of these findings may be affected by the use of questionnaires to assess smoke exposure and allergic disease development in most of the studies in the meta-analysis and by the possibility that individuals exposed to smoke may have shared other characteristics that were actually responsible for their increased risk of allergic diseases. To shed more light on the role of smoking in allergic diseases, additional studies are needed that accurately measure exposure and outcomes. However, the present findings suggest that, in countries where many people smoke, 14% and 13% of allergic rhinitis and allergic dermatitis, respectively, among children may be attributable to active smoking. Thus, the elimination of active smoking among children and adolescents could prevent one in seven cases of allergic rhinitis and one in eight cases of allergic dermatitis in such countries.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001611.
The UK National Health Service Choices website provides information about allergic rhinitis, hay fever (including personal stories), allergic dermatitis (including personal stories), and food allergy (including personal stories)
The US National Institute of Allergy and Infectious Disease provides information about allergic diseases
The UK not-for-profit organization Allergy UK provides information about all aspects of allergic diseases and a description of the atopic march
MedlinePlus encyclopedia has pages on allergic rhinitis and allergic dermatitis (in English and Spanish)
MedlinePlus provides links to further resources about allergies, eczema, and food allergy (in English and Spanish)
doi:10.1371/journal.pmed.1001611
PMCID: PMC3949681  PMID: 24618794
