Though toxicological experiments demonstrate the teratogenicity of organic solvents in animal models, epidemiologic studies have reported inconsistent results. Using data from the population-based National Birth Defects Prevention Study, we examined the relation between maternal occupational exposure to aromatic solvents, chlorinated solvents and Stoddard solvent during early pregnancy and neural tube defects (NTDs) and orofacial clefts (OFCs).
Cases of NTDs (anencephaly, spina bifida and encephalocele) and OFCs (cleft lip ± cleft palate and cleft palate alone) delivered between 1997 and 2002 were identified by birth defect surveillance registries in 8 states; non-malformed control infants were selected using birth certificates or hospital records. Maternal solvent exposure was estimated by industrial hygienist review of self-reported occupational histories in combination with a literature-derived exposure database. Odds ratios (OR) and 95% confidence intervals (CI) for the association between solvent class and each birth defect group and component phenotype were estimated using multivariable logistic regression, adjusting for maternal age, race/ethnicity, education, pre-pregnancy body mass index, folic acid supplement use and smoking.
The prevalence of exposure to any solvent among mothers of NTD cases (n=511), OFC cases (n=1163) and controls (n=2977) was 13.1%, 9.6% and 8.2%, respectively. Exposure to chlorinated solvents was associated with increased odds of NTDs (OR=1.96; CI=1.34, 2.87), especially spina bifida (OR=2.26; CI=1.44, 3.53). No solvent class was strongly associated with OFCs in these data.
Our findings suggest that maternal occupational exposure to chlorinated solvents during early pregnancy is positively associated with the prevalence of NTDs in offspring.
congenital abnormalities; occupational exposure; solvents
Exposure lagging and exposure-time window analysis are 2 widely used approaches to allow for induction and latency periods in analyses of exposure-disease associations. Exposure lagging implies a strong parametric assumption about the temporal evolution of the exposure-disease association. An exposure-time window analysis allows for a more flexible description of temporal variation in exposure effects but may result in unstable risk estimates that are sensitive to how windows are defined. The authors describe a hierarchical regression approach that combines time window analysis with a parametric latency model. They illustrate this approach using data from 2 occupational cohort studies: studies of lung cancer mortality among 1) asbestos textile workers and 2) uranium miners. For each cohort, an exposure-time window analysis was compared with a hierarchical regression analysis with shrinkage toward a simpler, second-stage parametric latency model. In each cohort analysis, substantial stability was gained in time window-specific estimates of association by using the hierarchical regression approach. The proposed hierarchical regression model couples a time window analysis with a parametric latency model; this approach provides a way to stabilize risk estimates derived from a time window analysis and a way to reduce bias arising from misspecification of a parametric latency model.
cohort studies; hierarchical model; latency; neoplasms; regression
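The shrinkage described above can be sketched as a precision-weighted compromise between each window-specific estimate and the prediction of a second-stage latency model: imprecise first-stage estimates are pulled strongly toward the smooth latency curve, while precise ones move little. The numbers below are hypothetical placeholders, not the cited cohort results.

```python
# Illustrative sketch of hierarchical shrinkage of exposure-time-window
# estimates toward a second-stage parametric latency model.
# All numbers are hypothetical, not from the cited cohort analyses.

def shrink(beta_w, var_w, m_w, tau2):
    """Shrink a window-specific estimate beta_w (sampling variance var_w)
    toward the second-stage model prediction m_w (second-stage variance tau2).
    Returns the precision-weighted (approximate posterior mean) estimate."""
    w = (1.0 / var_w) / (1.0 / var_w + 1.0 / tau2)
    return w * beta_w + (1.0 - w) * m_w

# Hypothetical window-specific log rate ratio estimates and variances:
betas = [0.9, 0.2, 1.4]      # unstable first-stage window estimates
varis = [0.50, 0.10, 0.80]   # larger variance -> more shrinkage
preds = [0.6, 0.4, 0.3]      # smooth parametric latency-curve predictions

stabilized = [shrink(b, v, m, tau2=0.05) for b, v, m in zip(betas, varis, preds)]
```

The third window, with the largest variance, is shrunk almost entirely to the latency-model prediction; a tighter second-stage variance (`tau2`) forces all windows closer to the parametric curve.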
Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential exposure to magnetic fields and the development of childhood cancer. Results from rejection sampling (odds ratio (OR) = 1.69, 95% posterior interval (PI): 0.57, 5.00) were similar to MCMC results (OR = 1.69, 95% PI: 0.58, 4.95) and approximations from data-augmentation priors (OR = 1.74, 95% PI: 0.60, 5.06). In example 2, the authors apply rejection sampling to a cohort study of 315 human immunodeficiency virus seroconverters (1984–1998) to assess the relation between viral load after infection and 5-year incidence of acquired immunodeficiency syndrome, adjusting for (continuous) age at seroconversion and race. In this more complex example, rejection sampling required a notably longer run time than MCMC sampling but remained feasible and again yielded similar results. The transparency of the proposed approach comes at a price of being less broadly applicable than MCMC.
Bayes theorem; epidemiologic methods; inference; Monte Carlo method; posterior distribution; simulation
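The transparency of rejection sampling is easy to see on a toy problem: draw candidate parameter values from the prior and accept each with probability proportional to its likelihood. The example below uses hypothetical binomial data with a uniform prior (so the exact posterior is Beta(k+1, n−k+1)), not the case-control data analyzed in the paper.

```python
import math
import random

# Minimal rejection-sampling sketch of a Bayesian posterior:
# binomial data (k events in n trials) with a Uniform(0, 1) prior on p.
# Hypothetical data, for illustration only.
random.seed(1)
n, k = 20, 6

def log_lik(p):
    return k * math.log(p) + (n - k) * math.log(1.0 - p)

# The likelihood is maximized at the MLE p = k/n; use it to scale acceptance.
log_lik_max = log_lik(k / n)

draws = []
while len(draws) < 5000:
    p = random.random()          # candidate drawn from the prior
    if p == 0.0:
        continue                 # avoid log(0); probability-zero edge case
    # Accept with probability lik(p) / lik_max:
    if random.random() < math.exp(log_lik(p) - log_lik_max):
        draws.append(p)

post_mean = sum(draws) / len(draws)   # near the exact value (k+1)/(n+2)
```

Because every candidate is independent, no burn-in or convergence diagnostics are needed; the price, as the abstract notes, is a run time that grows quickly with model complexity.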
Background stratified Poisson regression is an approach that has been used in the analysis of data derived from a variety of epidemiologically important studies of radiation-exposed populations, including uranium miners, nuclear industry workers, and atomic bomb survivors. We describe a novel approach to fit Poisson regression models that adjust for a set of covariates through background stratification while directly estimating the radiation-disease association of primary interest. The approach makes use of an expression for the Poisson likelihood that treats the coefficients for stratum-specific indicator variables as ‘nuisance’ variables and avoids the need to explicitly estimate the coefficients for these stratum-specific parameters. Log-linear models, as well as other general relative rate models, are accommodated. This approach is illustrated using data from the Life Span Study of Japanese atomic bomb survivors and data from a study of underground uranium miners. The point estimate and confidence interval obtained from this ‘conditional’ regression approach are identical to the values obtained using unconditional Poisson regression with model terms for each background stratum. Moreover, it is shown that the proposed approach allows estimation of background stratified Poisson regression models of non-standard form, such as models that parameterize latency effects, as well as regression models in which the number of strata is large, thereby overcoming the limitations of previously available statistical software for fitting background stratified Poisson regression models.
Cohort studies; Poisson regression; Ionizing radiation; Survival analysis
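The key identity behind this approach can be sketched in a few lines: conditioning on each stratum's total case count turns the stratified Poisson likelihood into a multinomial likelihood whose cell probabilities involve only the dose coefficient, so the stratum intercepts drop out. The data and the crude grid search below are hypothetical illustrations, not the authors' fitting method.

```python
import math

# Sketch of the 'conditional' background-stratified Poisson likelihood.
# Each stratum is a list of (cases, person_time, dose) cells; the stratum
# intercepts cancel from the within-stratum multinomial probabilities.
# Hypothetical data, for illustration only.
strata = [
    [(10, 1000.0, 0.0), (8, 500.0, 1.0)],
    [(20, 4000.0, 0.0), (6, 800.0, 2.0)],
]

def cond_loglik(beta):
    ll = 0.0
    for cells in strata:
        denom = sum(t * math.exp(beta * d) for _, t, d in cells)
        for y, t, d in cells:
            ll += y * math.log(t * math.exp(beta * d) / denom)
    return ll

# Crude grid search for the maximum conditional-likelihood estimate of beta
# (a real fit would use a proper optimizer):
grid = [i / 1000.0 for i in range(-1000, 1001)]
beta_hat = max(grid, key=cond_loglik)
```

Because no stratum-specific intercepts are estimated, the likelihood stays low-dimensional no matter how many background strata there are, which is the point of the approach described above.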
Changes in the workforce during the civil rights movement may have impacted occupational exposures in the United States. We examined Savannah River Site (SRS) employee records (1951–1999) for changes in radiation doses and monitoring practices, by race and sex. Segregation of jobs by race and sex diminished but remained pronounced in recent years. Female workers were less likely than males to be monitored for occupational radiation exposure [odds ratio (OR) for being unmonitored = 3.11; 95% CI: 2.79, 3.47] even after controlling for job and decade of employment. Black workers were more likely than non-black workers to have a detectable radiation dose [OR = 1.36 (95% CI: 1.28, 1.43)]. Female workers have incomplete dose histories that would hinder compensation for illnesses related to occupational exposures. The persistence of job segregation and excess radiation exposures of black workers shows the need for further action to address disparities in occupational opportunities and hazardous exposures in the U.S. South.
Lagging exposure information is often undertaken to allow for a latency period in cumulative exposure-disease analyses. The authors first consider bias and confidence interval coverage when using the standard approaches of fitting models under several lag assumptions and selecting the lag that maximizes either the effect estimate or model goodness of fit. Next, they consider bias that occurs when the assumption that the latency period is a fixed constant does not hold. Expressions were derived for bias due to misspecification of lag assumptions, and simulations were conducted. Finally, the authors describe a method for joint estimation of parameters describing an exposure-response association and the latency distribution. Analyses of associations between cumulative asbestos exposure and lung cancer mortality among textile workers illustrate this approach. Selecting the lag that maximizes the effect estimate may lead to bias away from the null; selecting the lag that maximizes model goodness of fit may lead to confidence intervals that are too narrow. These problems tend to increase as the within-person exposure variation diminishes. Lagging exposure assignment by a constant will lead to bias toward the null if the distribution of latency periods is not a fixed constant. Direct estimation of latency periods can minimize bias and improve confidence interval coverage.
asbestos; cohort studies; latency; neoplasms; survival analysis
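The fixed-lag assignment examined above is simple to state: cumulative exposure at time t under an L-year lag counts only increments accrued before t − L. A minimal sketch, with a hypothetical exposure history:

```python
# Sketch of lagged cumulative exposure assignment. The exposure history is
# hypothetical: (age at exposure, annual dose increment).
history = [(40, 2.0), (41, 3.0), (42, 1.0), (43, 4.0)]

def lagged_cumulative(history, t, lag):
    """Cumulative exposure at age t, ignoring increments within `lag` years of t."""
    return sum(x for age, x in history if age <= t - lag)

cum_unlagged = lagged_cumulative(history, t=45, lag=0)  # all increments count
cum_lag5 = lagged_cumulative(history, t=45, lag=5)      # only age <= 40 counts
```

As the abstract argues, treating the lag L as a fixed constant when true latency periods vary across individuals biases the exposure-response estimate toward the null, which motivates the joint estimation approach described there.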
In occupational epidemiologic studies, the healthy-worker survivor effect refers to a process that leads to bias in the estimates of an association between cumulative exposure and a health outcome. In these settings, work status acts both as an intermediate and confounding variable, and may violate the positivity assumption (the presence of exposed and unexposed observations in all strata of the confounder). Using Monte Carlo simulation, we assess the degree to which crude, work-status adjusted, and weighted (marginal structural) Cox proportional hazards models are biased in the presence of time-varying confounding and nonpositivity. We simulate data representing time-varying occupational exposure, work status, and mortality. Bias, coverage, and root mean squared error (RMSE) were calculated relative to the true marginal exposure effect in a range of scenarios. For a base-case scenario, using crude, adjusted, and weighted Cox models, respectively, the hazard ratio was biased downward 19%, 9%, and 6%; 95% confidence interval coverage was 48%, 85%, and 91%; and RMSE was 0.20, 0.13, and 0.11. Although marginal structural models were less biased in most scenarios studied, neither standard nor marginal structural Cox proportional hazards models fully resolve the bias encountered under conditions of time-varying confounding and nonpositivity.
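The stabilized inverse-probability weights behind the marginal structural Cox model above can be sketched per time point: the numerator is the probability of the observed exposure given exposure history, the denominator adds the time-varying confounder (work status), and a worker's weight is the product of these contributions over follow-up. The probabilities below are hypothetical stand-ins for fitted model outputs.

```python
# Sketch of a stabilized inverse-probability-of-exposure weight, as used in a
# marginal structural Cox model for the healthy-worker survivor effect.
# p_num and p_den are hypothetical fitted probabilities of exposure:
#   p_num: given exposure history only (numerator model)
#   p_den: given exposure history AND work status (denominator model)

def stabilized_weight(exposed, p_num, p_den):
    """One time point's weight contribution; the person's cumulative weight
    is the product of contributions over follow-up."""
    if exposed:
        return p_num / p_den
    return (1.0 - p_num) / (1.0 - p_den)

# A worker exposed at two time points; at the second, work status makes
# exposure unlikely (p_den near 0), inflating the weight -- this is how
# near-nonpositivity produces the unstable weights discussed above.
contribs = [stabilized_weight(True, 0.5, 0.6),
            stabilized_weight(True, 0.5, 0.1)]
w = contribs[0] * contribs[1]
```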
The authors investigated the relation between ionizing radiation and lymphoma mortality in 2 cohorts: 1) 20,940 men in the Life Span Study, a study of Japanese atomic bomb survivors who were aged 15–64 years at the time of the bombings of Hiroshima and Nagasaki, and 2) 15,264 male nuclear weapons workers who were hired at the Savannah River Site in South Carolina between 1950 and 1986. Radiation dose-mortality trends were evaluated for all malignant lymphomas and for non-Hodgkin's lymphoma. Positive associations between lymphoma mortality and radiation dose under a 5-year lag assumption were observed in both cohorts (excess relative rates per sievert were 0.79 (90% confidence interval: 0.10, 1.88) and 6.99 (90% confidence interval: 0.96, 18.39), respectively). Exclusion of deaths due to Hodgkin's disease led to small changes in the estimates of association. In each cohort, evidence of a dose-response association was primarily observed more than 35 years after irradiation. These findings suggest a protracted induction and latency period for radiation-induced lymphoma mortality.
lymphoma; mortality; nuclear weapons; radiation, ionizing
The objective of this study is to characterize the effect of temperature on emergency department visits for asthma and modification of this association by season. This association is of interest in its own right, and also important to understand because temperature may be an important confounder in analyses of associations between other environmental exposures and asthma. For example, the case-crossover study design is commonly used to investigate associations between air pollution and respiratory outcomes, such as asthma. This approach controls for confounding by month and season by design, and permits adjustment for potential confounding by temperature through regression modeling. However, such models may fail to adequately control for confounding if temperature effects are seasonal, since case-crossover analyses rarely account for interactions between matching factors (such as calendar month) and temperature.
We conducted a case-crossover study to determine whether the association between temperature and emergency department visits for asthma varies by season or month. Asthma emergency department visits among North Carolina adults during 2007–2008 were identified using a statewide surveillance system. Marginal as well as season- and month-specific associations between asthma visits and temperature were estimated with conditional logistic regression.
The association between temperature and adult emergency department visits for asthma was near null when the overall association was examined [odds ratio (OR) per 5 degrees Celsius = 1.01, 95% confidence interval (CI): 1.00, 1.02]. However, significant variation in temperature-asthma associations was observed by season (chi-square = 18.94, 3 degrees of freedom, p <0.001) and by month of the year (chi-square = 45.46, 11 degrees of freedom, p <0.001). ORs per 5 degrees Celsius were increased in February (OR = 1.06, 95% CI: 1.02, 1.10), July (OR = 1.16, 95% CI: 1.04, 1.29), and December (OR = 1.04, 95% CI: 1.01, 1.07) and decreased in September (OR = 0.92, 95% CI: 0.87, 0.97).
Our empirical example suggests that there is significant seasonal variation in temperature-asthma associations. Epidemiological studies rarely account for interactions between ambient temperature and temporal matching factors (such as month of year) in the case-crossover design. These findings suggest that greater attention should be given to seasonal modification of associations between temperature and respiratory outcomes in case-crossover analyses of other environmental asthma triggers.
Asthma; Temperature; Season; Case-crossover
Urinary 1,6-hexamethylene diamine (HDA) may serve as a biomarker for systemic exposure to 1,6-hexamethylene diisocyanate (HDI) in occupationally exposed populations. However, the quantitative relationships between dermal and inhalation exposure to HDI and urine HDA levels have not been established. We measured acid-hydrolyzed urine HDA levels along with dermal and breathing-zone levels of HDI in 48 automotive spray painters. These measurements were conducted over the course of an entire workday for up to three separate workdays that were spaced approximately 1 month apart. One urine sample was collected before the start of work with HDI-containing paints and subsequent samples were collected during the workday. HDA levels varied throughout the day and ranged from nondetectable to 65.9 μg l⁻¹, with a geometric mean of 0.10 μg l⁻¹ and a geometric standard deviation of 6.68. Dermal exposure and inhalation exposure levels, adjusted for the type of respirator worn, were both significant predictors of urine HDA levels in the linear mixed models. Creatinine was a significant covariate when used as an independent variable along with dermal and respirator-adjusted inhalation exposure. Consequently, exposure assessment models must account for the water content of a urine sample. These findings indicate that HDA exhibits a biphasic elimination pattern, with a half-life of 2.9 h for the fast elimination phase. Our results also indicate that urine HDA level is significantly associated with systemic HDI exposure through both the skin and the lungs. We conclude that urinary HDA may be used as a biomarker of exposure to HDI, but biological monitoring should be tailored to reliably capture the intermittent exposure pattern typical in this industry.
biomarkers; creatinine; dermal exposure; 1,6-hexamethylene diamine; 1,6-hexamethylene diisocyanate; inhalation exposure; urine analysis
In April 2010, the U.S. Nuclear Regulatory Commission asked the National Academy of Sciences to update a 1990 study of cancer risks near nuclear facilities. Prior research on this topic has suffered from problems in hypothesis formulation and research design.
We review epidemiologic principles used in studies of generic exposure–response associations and in studies of specific sources of exposure. We then describe logical problems with assumptions, formation of testable hypotheses, and interpretation of evidence in previous research on cancer risks near nuclear facilities.
Advancement of knowledge about cancer risks near nuclear facilities depends on testing specific hypotheses grounded in physical and biological mechanisms of exposure and susceptibility while considering sample size and ability to adequately quantify exposure, ascertain cancer cases, and evaluate plausible confounders.
Next steps in advancing knowledge about cancer risks near nuclear facilities require studies of childhood cancer incidence, focus on in utero and early childhood exposures, use of specific geographic information, and consideration of pathways for transport and uptake of radionuclides. Studies of cancer mortality among adults, cancers with long latencies, large geographic zones, and populations that reside at large distances from nuclear facilities are better suited for public relations than for scientific purposes.
childhood cancer; environmental epidemiology; ionizing radiation; methodology; nuclear power
Cox proportional hazards regression analysis of survival data and conditional logistic regression analysis of matched case-control data are methods that are widely used by epidemiologists. Standard statistical software packages accommodate only log-linear model forms, which imply exponential exposure-response functions and multiplicative interactions. In this paper, the authors describe methods for fitting non-log-linear Cox and conditional logistic regression models. The authors use data from a study of lung cancer mortality among Colorado Plateau uranium miners (1950–1982) to illustrate these methods for fitting general relative risk models to matched case-control data, countermatched data with weights, d:m matching, and full cohort Cox regression using the SAS statistical package (SAS Institute Inc., Cary, North Carolina).
algorithms; cohort studies; conditional likelihood; dose-response function; linear trend; logistic models; models, statistical; software
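The conditional likelihood at the heart of these methods is easy to write down for 1:m matched sets: the case's relative risk divided by the sum of relative risks over the set, with a non-log-linear form such as the linear relative risk r(x) = 1 + βx substituted for the usual exp(βx). The paper works in SAS; the sketch below is a language-agnostic illustration with hypothetical data, maximized by a crude grid search rather than a real optimizer.

```python
import math

# Sketch of fitting a linear relative risk model, r(x) = 1 + beta * x, by
# maximizing the conditional likelihood over 1:2 matched case-control sets.
# Hypothetical data; each set lists the case's exposure first.
matched_sets = [
    [3.0, 1.0, 0.0],
    [2.0, 1.0, 1.0],
    [0.0, 1.0, 1.0],
]

def cond_loglik(beta):
    ll = 0.0
    for exposures in matched_sets:
        risks = [1.0 + beta * x for x in exposures]
        if min(risks) <= 0.0:
            return float('-inf')   # linear RR must stay positive
        ll += math.log(risks[0] / sum(risks))
    return ll

grid = [i / 1000.0 for i in range(0, 5001)]
beta_hat = max(grid, key=cond_loglik)
```

The positivity guard is the practical complication the log-linear form avoids: with r(x) = 1 + βx, the parameter space is bounded below, which is one reason standard packages do not fit these models directly.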
The effect of an increment of exposure on disease risk may vary with time since exposure. If the pattern of temporal variation is simple (e.g., a peak then decline in excess risk of disease) then this may be modeled efficiently via a parametric latency function. Estimation of the parameters for such a model can be difficult because the parameters are not a function of a simple summary of the exposure history. Typically such parameters are estimated via an iterative search that requires calculating a different time-weighted exposure for each combination of the latency function parameters. This paper describes a simple approach to fitting logistic regression models that include a parametric latency function. This approach is illustrated using data from a study of the association between radon exposure and lung cancer mortality among underground uranium miners. This approach should facilitate fitting models to assess variation with time since exposure in the effect of a protracted environmental or occupational exposure.
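The time-weighted exposure metric at the center of this approach can be sketched directly: each exposure increment is multiplied by a parametric weight evaluated at time since exposure, and the weighted sum enters the logistic model as the exposure variable. The lognormal weight and all data below are hypothetical illustrations, not the miners' cohort analysis.

```python
import math

# Sketch of a latency-weighted exposure metric: each dose increment is
# weighted by a parametric function of time since exposure (here a lognormal
# density, one common choice). Data and parameters are hypothetical.

def lognormal_weight(t, mu=3.0, sigma=0.5):
    """Lognormal density in time since exposure t (years); zero for t <= 0."""
    if t <= 0:
        return 0.0
    return (math.exp(-(math.log(t) - mu) ** 2 / (2 * sigma ** 2))
            / (t * sigma * math.sqrt(2 * math.pi)))

history = [(30, 5.0), (35, 2.0), (40, 1.0)]  # (age at exposure, dose increment)

def weighted_exposure(history, attained_age, weight):
    return sum(x * weight(attained_age - age) for age, x in history)

# In the iterative search described above, this metric is recomputed for each
# candidate pair of latency parameters (mu, sigma) and refit in the model:
z = weighted_exposure(history, attained_age=60, weight=lognormal_weight)
```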
The relative excess risk due to interaction (RERI) provides a useful metric of departure from additivity of effects on a relative risk scale. In this paper, the authors show that RERI is identical to the product term in a linear odds ratio or a linear relative risk model. SAS and Stata code is provided for fitting a linear odds ratio model that directly parameterizes RERI. In addition, this paper presents a method for obtaining likelihood-based 95% confidence bound estimates for RERI. The authors show that likelihood-based confidence intervals may differ substantially from the asymptotic confidence interval estimates advocated by previous authors. The approach presented in this paper should facilitate estimation of RERI and associated likelihood-based confidence bounds, by using standard statistical packages.
confidence intervals; interaction; logistic regression; risk ratio
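The identity the paper establishes is simple arithmetic once the four exposure-combination odds ratios are in hand: RERI = OR₁₁ − OR₁₀ − OR₀₁ + 1, which equals the product-term coefficient in the linear odds ratio model OR(x, z) = 1 + b₁x + b₂z + b₃xz. A sketch with hypothetical odds ratios:

```python
# Sketch of the RERI identity. or10, or01, or11 are hypothetical odds ratios
# for (exposed to X only), (exposed to Z only), and (exposed to both),
# each relative to the doubly unexposed group.
or10, or01, or11 = 1.5, 2.0, 4.0

reri = or11 - or10 - or01 + 1.0   # departure from additivity

# Equivalent linear odds ratio model, OR(x, z) = 1 + b1*x + b2*z + b3*x*z:
b1 = or10 - 1.0
b2 = or01 - 1.0
b3 = or11 - or10 - or01 + 1.0     # the product-term coefficient equals RERI

def linear_or(x, z):
    return 1.0 + b1 * x + b2 * z + b3 * x * z
```

Directly parameterizing b₃, as the paper's SAS and Stata code does, is what makes a likelihood-based confidence interval for RERI available from standard software.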
Methicillin resistant Staphylococcus aureus (MRSA) poses a threat to patient safety and public health. Understanding how MRSA is acquired is important for prevention efforts. This study investigates risk factors for MRSA nasal carriage among patients at an eastern North Carolina hospital in 2011.
Using a case-control design, hospitalized patients ages 18–65 years were enrolled between July 25, 2011 and December 15, 2011 at Vidant Medical Center, a tertiary care hospital that screens all admitted patients for nasal MRSA carriage. Cases, defined as MRSA nasal carriers, were age- and gender-matched to controls (non-MRSA carriers). In-hospital interviews were conducted, and medical records were reviewed to obtain information on medical and household exposures. Multivariable conditional logistic regression was used to derive odds ratio (OR) estimates of association between MRSA carriage and medical and household exposures.
In total, 117 cases and 119 controls were recruited to participate. Risk factors for MRSA carriage included having household members who took antibiotics or were hospitalized (OR: 3.27; 95% Confidence Interval (CI): 1.24–8.57) and prior hospitalization with a positive MRSA screen (OR: 3.21; 95% CI: 1.12–9.23). A lower proportion of cases than controls were previously hospitalized without a past positive MRSA screen (OR: 0.40; 95% CI: 0.19–0.87).
These findings suggest that household exposures are important determinants of MRSA nasal carriage in hospitalized patients screened at admission.
Benzene is a human carcinogen. Exposure to benzene occurs in occupational and environmental settings.
I evaluated variation in benzene-related leukemia with age at exposure and time since exposure.
I evaluated data from a cohort of 1,845 rubber hydrochloride workers. Benzene exposure–leukemia mortality trends were estimated by applying proportional hazards regression methods. Temporal variation in the impact of benzene on leukemia rates was assessed via exposure time windows and fitting of a multistage cancer model.
The association between leukemia mortality and benzene exposures was of greatest magnitude in the 10 years immediately after exposure [relative rate (RR) at 10 ppm-years = 1.19; 95% confidence interval (CI), 1.10–1.29]; the association was of smaller magnitude in the period 10 to < 20 years after exposure (RR at 10 ppm-years = 1.05; 95% CI, 0.97–1.13); and there was no evidence of association ≥ 20 years after exposure. Leukemia was more strongly associated with benzene exposures accrued at ≥ 45 years of age (RR at 10 ppm-years = 1.11; 95% CI, 1.04–1.17) than with exposures accrued at younger ages (RR at 10 ppm-years = 1.01; 95% CI, 0.92–1.09). Jointly, these temporal effects can be efficiently modeled as a multistage process in which benzene exposure affects the penultimate stage in disease induction.
Further attention should be given to evaluating the susceptibility of older workers to benzene-induced leukemia.
benzene; cohort study; leukemia; mortality; Ohio
During 1990–1991 a childhood leukemia cluster was observed in the sparsely populated region surrounding two nuclear establishments southeast of Hamburg, Germany. Since then, several new cases have been reported. Recently a possible accidental release of radionuclides in 1986 was hypothesized.
The objective of this study was to analyze the childhood leukemia incidence in this area since 1990.
All incident cases (< 15 years of age) were ascertained during 1990–2005 within a 5-km radius of the Krümmel nuclear power plant. We derived standardized incidence ratios (SIRs) using county and national leukemia incidence rates as referents. We stratified analyses by calendar period and attained age, and by subdividing the study region into areas north versus south of the Elbe river.
Fourteen cases were ascertained in the study area, whereas 4.0 were expected based on national referent rates [1990–2005: SIR = 3.5; 95% confidence interval (CI), 1.9–5.9]. The excess was not confined to the early 1990s; for the more recent time period 1999–2005, the SIR is still elevated (SIR = 2.7; 95% CI, 0.9–6.2). SIRs of greatest magnitude were observed for children 0–4 years of age (SIR = 4.9; 95% CI, 2.4–9.0) and for residents south of the Elbe (SIR = 7.5; 95% CI, 2.8–16.4).
The incidence in this region is significantly higher than the childhood leukemia incidence for Germany as a whole. To date, no unique hazards have been identified in this population. The fact that the elevated rates have persisted in this community for > 15 years warrants further investigation.
childhood leukemia; Germany; nuclear installations; standardized incidence ratio; time windows; vicinity
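The headline result above is a ratio of observed to expected cases, and the reported confidence limits are consistent with standard exact-Poisson approximations. The sketch below reproduces the 1990–2005 figures (14 observed vs. 4.0 expected) using Byar's approximation to the exact Poisson limits; the paper's own CI method is not stated, so agreement is approximate.

```python
import math

# Standardized incidence ratio with Byar's approximation to the exact
# Poisson confidence limits for the observed count.
def sir_with_ci(observed, expected, z=1.96):
    sir = observed / expected
    o = observed
    lower = o * (1 - 1 / (9 * o) - z / (3 * math.sqrt(o))) ** 3 / expected
    upper = ((o + 1)
             * (1 - 1 / (9 * (o + 1)) + z / (3 * math.sqrt(o + 1))) ** 3
             / expected)
    return sir, lower, upper

# 14 observed vs. 4.0 expected cases, as reported in the abstract:
sir, lo, hi = sir_with_ci(14, 4.0)   # close to the reported 3.5 (1.9, 5.9)
```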
The U.S. government recently implemented rules for awarding compensation to individuals with cancer who were exposed to ionizing radiation while working in the nuclear weapons complex. Under these rules, chronic lymphocytic leukemia (CLL) is considered to be a nonradiogenic form of cancer. In other words, workers who develop CLL automatically have their compensation claim rejected because the compensation rules hold that the risk of radiation-induced CLL is zero. In this article we review molecular, clinical, and epidemiologic evidence regarding the radiogenicity of CLL. We note that current understanding of radiation-induced tumorigenesis and the etiology of lymphatic neoplasia provides a strong mechanistic basis for expecting that ionizing radiation exposure increases CLL risk. The clinical characteristics of CLL, including prolonged latency and morbidity periods and a low case fatality rate, make it relatively difficult to evaluate associations between ionizing radiation and CLL risk via epidemiologic methods. The epidemiologic evidence of association between external exposure to ionizing radiation and CLL is weak. However, epidemiologic findings are consistent with a hypothesis of elevated CLL mortality risk after a latency and morbidity period that spans several decades. Our findings in this review suggest that there is not a persuasive basis for the conclusion that CLL is a nonradiogenic form of cancer.
chronic lymphocytic leukemia; compensation; ionizing radiation; radiogenicity