The Department of Defense (DoD) strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE) was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of economic and behavioral science perspectives, elucidated through empirically based studies examining relations among psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of data housed in personnel, financial, medical, training and education, deployment, and security systems to inform Army-wide policies and procedures.
big data; psychological strengths; cost analysis; healthcare utilization; personnel data
U.S. National Park Service employees may have prolonged exposure to wildlife and arthropods, placing them at increased risk of infection with endemic zoonoses. To evaluate possible zoonotic risks present at both Great Smoky Mountains (GRSM) and Rocky Mountain (ROMO) National Parks, we assessed park employees for baseline seroprevalence to specific zoonotic pathogens, followed by evaluation of incident infections over a 1-year study period. Park personnel showed evidence of prior infection with a variety of zoonotic agents, including California serogroup bunyaviruses (31.9%), Bartonella henselae (26.7%), spotted fever group rickettsiae (22.2%), Toxoplasma gondii (11.1%), Anaplasma phagocytophilum (8.1%), Brucella spp. (8.9%), flaviviruses (2.2%), and Bacillus anthracis (1.5%). Over a 1-year study period, we detected incident infections with leptospirosis (5.7%), B. henselae (5.7%), spotted fever group rickettsiae (1.5%), T. gondii (1.5%), B. anthracis (1.5%), and La Crosse virus (1.5%) in staff members at GRSM, and with spotted fever group rickettsiae (8.5%) and B. henselae (4.3%) in staff at ROMO. The risk of any incident infection was greater for employees who worked as resource managers (OR 7.4; 95% CI 1.4–37.5; p=0.02), and as law enforcement rangers/rescue crew (OR 6.5; 95% CI 1.1–36.5; p=0.03), relative to those who worked primarily in administration or management. The results of this study increase our understanding of the pathogens circulating within both parks, and can be used to inform the development of effective guidelines and interventions to increase visitor and staff awareness and help prevent exposure to zoonotic agents.
Incidence; National Park Service; Prevalence; Vector-borne; Zoonoses
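The occupational odds ratios and 95% confidence intervals reported above are conventionally derived from a 2×2 exposure-by-outcome table using the Woolf (log) method. A minimal sketch of that calculation, using hypothetical counts rather than the study's actual data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    # Standard error of ln(OR) under the Woolf approximation
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts, for illustration only (not the study's data)
or_, lo, hi = odds_ratio_ci(a=10, b=5, c=4, d=10)
```

A confidence interval whose lower bound exceeds 1.0, as in both occupational groups above, corresponds to a statistically significant elevation in risk at the 5% level.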
The recent U.S. Congressional mandate for creating drug-free learning environments in elementary and secondary schools stipulates that education reform rely on accountability, parental and community involvement, local decision making, and use of evidence-based drug prevention programs. By necessity, this charge has been paralleled by increased interest in demonstrating that drug prevention programs net tangible benefits to society. One pressing concern is precisely how to integrate traditional scientific methods of program evaluation with economic measures of “cost efficiency”. The languages and methods of each respective discipline do not necessarily converge on how to establish the true benefits of drug prevention. This article serves as a primer for conducting economic analyses of school-based drug prevention programs. The article provides the reader with a foundation in the relevant principles, methodologies, and benefits related to conducting economic analysis. Discussion revolves around how economists value the potential costs and benefits, both financial and personal, of implementing school-based drug prevention programs targeting youth. Application of heterogeneous costing methods coupled with widely divergent program evaluation findings influences the feasibility of these techniques and may hinder utilization of these practices. Determination of cost efficiency should undoubtedly become one of several markers of program success and contribute to the ongoing debate over health policy.
opportunity cost; cost benefit; valuation; cost effectiveness; program efficacy; statistical mediation; economic analysis
East Africa has been identified as a region where vector-borne and zoonotic diseases are most likely to emerge or re-emerge and where morbidity and mortality from these diseases is significant. Understanding when and where humans are most likely to be exposed to vector-borne and zoonotic disease agents in this region can aid in targeting limited prevention and control resources. Often, spatial and temporal distributions of vectors and vector-borne disease agents are predictable based on climatic variables. However, because of coarse meteorological observation networks, appropriately scaled and accurate climate data are often lacking for Africa. Here, we use a recently developed 10-year gridded meteorological dataset from the Advanced Weather Research and Forecasting Model to identify climatic variables predictive of the spatial distribution of human plague cases in the West Nile region of Uganda. Our logistic regression model revealed that within high elevation sites (above 1,300 m), plague risk was positively associated with rainfall during the months of February, October, and November and negatively associated with rainfall during the month of June. These findings suggest that areas that receive increased but not continuous rainfall provide ecologically conducive conditions for Yersinia pestis transmission in this region. This study serves as a foundation for similar modeling efforts for other vector-borne and zoonotic diseases in regions with sparse observational meteorological networks.
Climate and weather influence the occurrence, distribution, and incidence of infectious diseases, particularly those caused by vector-borne or zoonotic pathogens. Thus, models based on meteorological data have helped predict when and where human cases are most likely to occur. Such knowledge aids in targeting limited prevention and control resources and may ultimately reduce the burden of disease. Paradoxically, localities where such models could yield the greatest benefits, such as tropical regions where morbidity and mortality caused by vector-borne diseases is greatest, often lack high-quality in situ local meteorological data. Satellite- and model-based gridded climate datasets can be used to approximate local meteorological conditions in data-sparse regions; however, their accuracy varies. Here we investigate how the selection of a particular dataset can influence the outcomes of disease forecasting models. Our model system focuses on plague (Yersinia pestis infection) in the West Nile region of Uganda. The majority of recent human cases have been reported from East Africa and Madagascar, where meteorological observations are sparse and topography yields complex weather patterns. Using an ensemble of meteorological datasets and model-averaging techniques, we find that the number of suspected cases in the West Nile region was negatively associated with dry season rainfall (December–February) and positively associated with rainfall prior to the plague season. We demonstrate that ensembles of available meteorological datasets can be used to quantify climatic uncertainty and minimize its impacts on infectious disease models. These methods are particularly valuable in regions with sparse observational networks and high morbidity and mortality from vector-borne diseases.
From September through early December 2005, an outbreak of yellow fever (YF) occurred in South Kordofan, Sudan, resulting in a mass YF vaccination campaign. In late December 2005, we conducted a serosurvey to assess YF vaccine coverage and to better define the epidemiology of the outbreak in an index village. Of 552 persons enrolled, 95% reported recent YF vaccination, and 25% reported febrile illness during the outbreak period: 13% reported YF-like illness, 4% reported severe YF-like illness, and 12% reported chikungunya-like illness. Of 87 persons who provided blood samples, all had positive YF serologic results, including three who had never been vaccinated. There was also serologic evidence of recent or prior chikungunya virus, dengue virus, West Nile virus, and Sindbis virus infections. These results indicate that YF virus and chikungunya virus contributed to the outbreak. The high prevalence of YF antibody among vaccinees indicates that vaccination was effectively implemented in this remotely located population.
Plague, a life-threatening flea-borne zoonosis caused by Yersinia pestis, has most commonly been reported from eastern Africa and Madagascar in recent decades. In these regions and elsewhere, prevention and control efforts are typically targeted at fine spatial scales, yet risk maps for the disease are often presented at coarse spatial resolutions that are of limited value in allocating scarce prevention and control resources. In our study, we sought to identify sub-village level remotely sensed correlates of elevated risk of human exposure to plague bacteria and to project the model across the plague-endemic West Nile region of Uganda and into neighboring regions of the Democratic Republic of Congo. Our model yielded an overall accuracy of 81%, with sensitivities and specificities of 89% and 71%, respectively. Risk was higher above 1,300 meters than below, and the remotely sensed covariates that were included in the model implied that localities that are wetter, with less vegetative growth and more bare soil during the dry month of January (when agricultural plots are typically fallow) pose an increased risk of plague case occurrence. Our results suggest that environmental and landscape features play a large part in classifying an area as ecologically conducive to plague activity. However, it is clear that future studies aimed at identifying behavioral and fine-scale ecological risk factors in the West Nile region are required to fully assess the risk of human exposure to Y. pestis.
In Escherichia coli, Rob activates transcription of the SoxRS/MarA/Rob regulon. Previous work revealed that Rob resides in 3–4 immunostainable foci, that dipyridyl and bile salts are inducers of its activity, and that inducers bind to Rob’s C-terminal domain (CTD). We propose that sequestration inactivates Rob by blocking its access to the transcriptional machinery and that inducers activate Rob by mediating its dispersal, allowing interaction with RNA polymerase. To test “sequestration-dispersal” as a new mechanism for regulating the activity of transcriptional activators, we fused Rob’s CTD to SoxS and used indirect immunofluorescence microscopy to determine the effect of inducers on SoxS-Rob’s cellular localization. Unlike native SoxS, which is uniformly distributed throughout the cell, SoxS-Rob is sequestered without inducer, but is rapidly dispersed when cells are treated with inducer. In this manner, Rob’s CTD serves as an anti-sigma factor in regulating the co-sigma factor-like activity of SoxS when fused to it. Rob’s CTD also protects its N-terminus from Lon protease, since Lon’s normally rapid degradation of SoxS is blocked in the chimera. Accordingly, Rob’s CTD has novel regulatory properties that can be bestowed on another E. coli protein.
gene regulation; intracellular localization; immunofluorescence microscopy; anti-sigma factor; proteolysis
The ability to manipulate protein levels is useful for dissecting regulatory pathways, elucidating gene function, and constructing synthetic biological circuits. We engineered an inducible protein degradation system for use in Bacillus subtilis based on E. coli and C. crescentus ssrA tags and SspB adaptors that deliver proteins to ClpXP for proteolysis. In this system, modified ssrA degradation tags are fused onto the 3’ end of the genes of interest. Unlike wild-type ssrA, these modified tags require the adaptor protein SspB to target tagged proteins for proteolysis. In the absence of SspB, the tagged proteins accumulate to near physiological levels. By inducing SspB expression from a regulated promoter, the tagged substrates are rapidly delivered to the B. subtilis ClpXP protease for degradation. We used this system to degrade the reporter GFP and several native B. subtilis proteins, including the transcription factor ComA, two sporulation kinases (KinA and KinB), and the sporulation and chromosome partitioning protein Spo0J. We also used modified E. coli and C. crescentus ssrA tags to independently control the degradation of two different proteins in the same cell. These tools will be useful for studying biological processes in B. subtilis and can potentially be modified for use in other bacteria.
ssrA; SspB; sporulation; ComA; GFP
In Bacillus subtilis, the transcription factor ComA activates several biological processes in response to increasing population density. Extracellular peptide signaling is used to coordinate the activity of ComA with population density. At low culture densities, when the concentration of signaling peptides is lowest, ComA is largely inactive. At higher densities, when the concentration of signaling peptides is higher, ComA is active and activates transcription of at least 9 operons involved in the development of competence and the production of degradative enzymes and antibiotics. We found that ComA binds a degenerate tripartite sequence consisting of three DNA binding determinants or “recognition elements”. Mutational analyses showed that all three recognition elements are required for transcription activation in vivo and for specific DNA binding by ComA in vitro. Degeneracy of the recognition elements in the ComA binding site is an important regulatory feature for coordinating transcription with population density: promoters containing an optimized binding site had high activity at low culture density and were no longer regulated in the normal density-dependent manner. We found that purified ComA forms a dimer in solution, and we propose a model for how two dimers of ComA bind to an odd number of DNA binding determinants to activate transcription of target genes. This DNA-protein architecture for transcription activation appears to be conserved for ComA homologs in other Bacillus species.
quorum sensing; transcription; response regulator; DNA binding; Bacillus subtilis
As part of a fatal human plague case investigation, we showed that the plague bacterium, Yersinia pestis, can survive for at least 24 days in contaminated soil under natural conditions. These results have implications for defining plague foci, persistence, transmission, and bioremediation after a natural or intentional exposure to Y. pestis.
Yersinia pestis; soil; plague; dispatch
On November 20, 2001, inhalational anthrax was confirmed in an elderly woman from rural Connecticut. To determine her exposure source, we conducted an extensive epidemiologic, environmental, and laboratory investigation. Molecular subtyping showed that her isolate was indistinguishable from isolates associated with intentionally contaminated letters. No samples from her home or community yielded Bacillus anthracis, and she received no first-class letters from facilities known to have processed intentionally contaminated letters. Environmental sampling in the regional Connecticut postal facility yielded B. anthracis spores from 4 (31%) of 13 sorting machines. One extensively contaminated machine primarily processes bulk mail. A second machine that does final sorting of bulk mail for her zip code yielded B. anthracis on the column of bins for her carrier route. The evidence suggests she was exposed through a cross-contaminated bulk mail letter. Such cross-contamination of letters and postal facilities has implications for managing the response to future B. anthracis–contaminated mailings.
Bacillus anthracis; inhalational anthrax; bioterrorism; postal facilities; research
After inhalational anthrax was diagnosed in a Connecticut woman on November 20, 2001, postexposure prophylaxis was recommended for postal workers at the regional mail facility serving the patient’s area. Although environmental testing at the facility initially yielded negative results, subsequent testing confirmed the presence of Bacillus anthracis. We distributed questionnaires to 100 randomly selected postal workers within 20 days of initial prophylaxis. Ninety-four workers obtained antibiotics; 68 of them started postexposure prophylaxis, and 21 later discontinued it. Postal workers who stopped or never started taking prophylaxis cited as reasons disbelief regarding anthrax exposure, problems with adverse events, and initial reports of negative cultures. Postal workers with adverse events reported predominant symptoms of gastrointestinal distress and headache. The influence of these concerns on adherence suggests that communication about risks of acquiring anthrax, education about adverse events, and careful management of adverse events are essential elements in increasing adherence.
Anthrax; Bacillus anthracis; prophylaxis; adverse effects; ciprofloxacin; doxycycline; patient noncompliance; Connecticut
In October 2001, the first inhalational anthrax case in the United States since 1976 was identified in a media company worker in Florida. A national investigation was initiated to identify additional cases and determine possible exposures to Bacillus anthracis. Surveillance was enhanced through health-care facilities, laboratories, and other means to identify cases, which were defined as clinically compatible illness with laboratory-confirmed B. anthracis infection. From October 4 to November 20, 2001, 22 cases of anthrax (11 inhalational, 11 cutaneous) were identified; 5 of the inhalational cases were fatal. Twenty (91%) case-patients were either mail handlers or were exposed to worksites where contaminated mail was processed or received. B. anthracis isolates from four powder-containing envelopes, 17 specimens from patients, and 106 environmental samples were indistinguishable by molecular subtyping. Illness and death occurred not only at targeted worksites, but also along the path of mail and in other settings. Continued vigilance for cases is needed among health-care providers and members of the public health and law enforcement communities.
A novel human cellular structure has been identified that contains a unique autoimmune antigen and multiple messenger RNAs. This complex was discovered using an autoimmune serum from a patient with motor and sensory neuropathy and contains a protein of 182 kDa. The gene and cDNA encoding the protein indicated an open reading frame with glycine-tryptophan (GW) repeats and a single RNA recognition motif. Both the patient's serum and a rabbit serum raised against the recombinant GW protein costained discrete cytoplasmic speckles designated as GW bodies (GWBs) that do not overlap with the Golgi complex, endosomes, lysosomes, or peroxisomes. The mRNAs associated with GW182 represent a clustered set of transcripts that are presumed to reside within the GW complexes. We propose that the GW ribonucleoprotein complex is involved in the posttranscriptional regulation of gene expression by sequestering a specific subset of gene transcripts involved in cell growth and homeostasis.