The WHO-recommended intervention of Directly Observed Treatment, Short-course (DOTS) appears to have been less successful than expected in reducing the burden of TB in some high-prevalence settings. One strategy for enhancing DOTS is incorporating active case-finding through screening contacts of TB patients, as widely used in low-prevalence settings. Predictive models that incorporate population-level effects on transmission provide one means of predicting the impacts of such interventions. We aim to identify all TB transmission modelling studies addressing contact tracing and to describe and critically assess their modelling assumptions, parameter choices and relevance to policy. We searched the MEDLINE, SCOPUS, COMPENDEX, Google Scholar and Web of Science databases for relevant English-language publications up to February 2012. Of the 1285 studies identified, only 5 met our inclusion criteria of models of TB transmission dynamics in human populations designed to incorporate contact tracing as an intervention. Detailed implementation of contact processes was present in only two studies, while only one study presented a model for a high-prevalence, developing-world setting. Some use of relevant data for parameter estimation was made in each study; however, validation of the predicted impact of interventions was not attempted in any of the studies. Despite a large body of literature on TB transmission modelling, few published studies incorporate contact tracing. There is considerable scope for future analyses to make better use of data and to apply individual-based models to facilitate more realistic patterns of infectious contact. Combined with a focus on high-burden settings, this would greatly increase the potential for models to inform the use of contact tracing as a TB control policy.
Our findings highlight the potential for collaborative work between clinicians, epidemiologists and modellers to gather data required to enhance model development and validation and hence better inform future public health policy.
There is increasing interest in controlling or eradicating the major neglected tropical diseases. Accurate modelling of the geographic distributions of parasitic infections will be crucial to this endeavour. We used 664 community-level infection prevalence records collated from the published literature in conjunction with eight environmental variables, altitude and population density, and a multivariate Bayesian generalized linear spatial model that allows explicit accounting for spatial autocorrelation and incorporation of uncertainty in input data and model parameters, to construct the first spatially explicit map describing the prevalence distribution of lymphatic filariasis (LF) in Africa. We also ran the best-fit model against predictions made by the HADCM3 and CCCMA climate models for 2050 to predict the likely distributions of LF under future climate and population changes. We show that LF prevalence is strongly influenced by spatial autocorrelation between locations but is only weakly associated with environmental covariates. Infection prevalence, however, is found to be related to variations in population density. All associations with key environmental/demographic variables appear to be complex and non-linear. LF prevalence is predicted to be highly heterogeneous across Africa, with high prevalences (>20%) estimated to occur primarily along coastal West and East Africa, and the lowest prevalences predicted for the central part of the continent. Error maps, however, indicate a need for further surveys to overcome problems with data scarcity in the latter and other regions. Analysis of future changes in prevalence indicates that population growth rather than climate change per se will represent the dominant factor in the predicted increase/decrease and spread of LF on the continent.
We indicate that these results could play an important role in aiding the development of strategies that are best able to achieve the goals of parasite elimination locally and globally in a manner that may also account for the effects of future climate change on parasitic infection.
The mammalian nervous system exhibits fast synchronous oscillations, which are especially prominent in respiratory-related nerve discharges. In the phrenic nerve, they include high- (HFO), medium- (MFO), and low-frequency (LFO) oscillations. Because motoneurons firing at HFO-related frequencies had never been recorded, an epiphenomenological mechanism for their existence had been posited. We have recently recorded phrenic motoneurons firing at HFO-related frequencies in unanesthetized decerebrate rats and showed that they exhibit dynamic coherence with the phrenic nerve, validating synchronous motoneuronal discharge as a mechanism underlying the generation of HFO. In so doing, we have helped validate the conclusions of previous studies by us and other investigators who have used changes in fast respiratory oscillations to make inferences about central respiratory pattern generation. Here, we seek to review changes occurring in fast synchronous oscillations during non-eupneic respiratory behaviors, with special emphasis on gasping, and the inferences that can be drawn from these dynamics regarding respiratory pattern formation.
Breathing; Motor synchrony; Gasping; Apneusis
A striking characteristic of the past four influenza pandemic outbreaks in the United States has been the multiple waves of infections. However, the mechanisms responsible for the multiple waves of influenza or other acute infectious diseases are uncertain. Understanding these mechanisms could provide knowledge for health authorities to develop and implement prevention and control strategies.
Materials and Methods
We present five distinct mechanisms, each of which can generate two waves of infections for an acute infectious disease. The first two mechanisms capture changes in virus transmissibility and changes in behavior. The third mechanism involves population heterogeneity (e.g., demography, geography), where each wave spreads through one sub-population. The fourth mechanism is virus mutation, which causes delayed susceptibility of individuals. The fifth mechanism is waning immunity. Each mechanism is incorporated into a separate mathematical model, and outbreaks are then simulated. We use the models to examine the effects of the initial number of infected individuals (e.g., border control at the beginning of the outbreak) and of the timing and amount of available vaccinations.
Four models, individually or in any combination, reproduce the two waves of the 2009 H1N1 pandemic in the United States, both qualitatively and quantitatively. One model reproduces the two waves only qualitatively. All models indicate that significantly reducing or delaying the initial numbers of infected individuals would have little impact on the attack rate. Instead, this reduction or delay results in a single wave as opposed to two waves. Furthermore, four of these models also indicate that a vaccination program started earlier than October 2009 (when the H1N1 vaccine was initially distributed) could have eliminated the second wave of infection, while more vaccine available starting in October would not have eliminated the second wave.
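The behavioural-change mechanism described above can be sketched with a toy SIR model in which transmissibility is temporarily reduced mid-outbreak, producing two waves. All parameter values below are illustrative assumptions, not fitted to the 2009 H1N1 data.

```python
def simulate_two_wave_sir(beta0=0.5, gamma=0.25, dip=(20.0, 80.0),
                          dip_factor=0.2, days=365, dt=0.1, i0=1e-4):
    # s, i, r are population fractions; transmission rate beta drops to
    # beta0*dip_factor during the `dip` window (e.g. behavioural change),
    # then recovers, allowing a second wave among remaining susceptibles.
    s, i, r = 1.0 - i0, i0, 0.0
    steps_per_day = int(round(1 / dt))
    daily_i = [i]
    for step in range(days * steps_per_day):
        t = step * dt
        beta = beta0 * dip_factor if dip[0] <= t < dip[1] else beta0
        new_inf = beta * s * i * dt
        rec = gamma * i * dt
        s, i, r = s - new_inf, i + new_inf - rec, r + rec
        if (step + 1) % steps_per_day == 0:
            daily_i.append(i)
    return daily_i

def count_peaks(curve, floor=1e-5):
    # strict local maxima above a small noise floor
    return sum(1 for k in range(1, len(curve) - 1)
               if curve[k] > floor and curve[k - 1] < curve[k] > curve[k + 1])
```

With these settings the prevalence curve shows one peak just before the dip and a second, larger peak after transmissibility recovers.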
Chikungunya is a re-emerging arboviral disease transmitted by Aedes spp. mosquitoes. Although principally endemic to Africa and Asia, recent outbreaks have occurred in Europe following introductions by returning travellers. A particularly large outbreak occurred on Réunion Island in 2006, the published data from which forms the basis of the current study. A simple, deterministic mathematical model of the transmission of the virus between humans and mosquitoes was constructed and parameterised with the up-to-date literature on infection biology. The model is fitted to the large Réunion epidemic, resulting in an estimate of 4.1 for the type reproduction number of chikungunya. Although simplistic, the model provided a close approximation of both the peak incidence of the outbreak and the final epidemic size. Sensitivity analysis using Monte Carlo simulation demonstrated the strong influence that both the latent period of infection in humans and the pre-patent period have on these two epidemiological outcomes. We show why separating these variables, which are epidemiologically distinct in chikungunya infections, is not only necessary for accurate model fitting but also important in informing control.
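The type reproduction number for the human host in a Ross-Macdonald-style host-vector model is the expected number of secondary human cases generated by one infectious human via mosquitoes. A minimal sketch follows; the parameter names and default values are illustrative assumptions, not the parameterisation used for the Réunion fit.

```python
def human_type_reproduction_number(
    m=3.0,        # mosquitoes per human (assumed)
    a=0.3,        # bites per mosquito per day (assumed)
    b=0.5,        # prob. mosquito-to-human transmission per bite (assumed)
    c=0.5,        # prob. human-to-mosquito transmission per bite (assumed)
    mu_v=0.1,     # mosquito mortality rate, per day (assumed)
    sigma_v=0.5,  # 1 / extrinsic incubation period, per day (assumed)
    r_h=1 / 7,    # human recovery rate, per day (assumed)
):
    # vectors infected per infectious human:        m * a * c / r_h
    # fraction of infected vectors surviving the
    # extrinsic incubation period:                  sigma_v / (sigma_v + mu_v)
    # humans infected per infectious vector:        a * b / mu_v
    survive_eip = sigma_v / (sigma_v + mu_v)
    return (m * a * c / r_h) * survive_eip * (a * b / mu_v)
```

Because the quantity is linear in vector density m, halving mosquito abundance halves the type reproduction number, which is why vector control enters such models so directly.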
Early observations from countries that have introduced rotavirus vaccination suggest that there may be indirect protection for unvaccinated individuals, but it is unclear whether these benefits will extend to the long term. Transmission dynamic models have attempted to quantify the indirect protection that might be expected from rotavirus vaccination in developed countries, but results have varied. To better understand the magnitude and sources of variability in model projections, we undertook a comparative analysis of transmission dynamic models for rotavirus. We fit five models to reported rotavirus gastroenteritis (RVGE) data from England and Wales, and evaluated outcomes for short- and long-term vaccination effects. All of our models reproduced the important features of rotavirus epidemics in England and Wales. Models predicted that during the initial year after vaccine introduction, incidence of severe RVGE would be reduced 1.8–2.9 times more than expected from the direct effects of the vaccine alone (28–50% at 90% coverage), but over a 5-year period following vaccine introduction severe RVGE would be reduced only by 1.1–1.7 times more than expected from the direct effects (54–90% at 90% coverage). Projections for the long-term reduction of severe RVGE ranged from a 55% reduction at full coverage to elimination with at least 80% coverage. Our models predicted short-term reductions in the incidence of RVGE that exceeded estimates of the direct effects, consistent with observations from the United States and other countries. Some of the models predicted that the short-term indirect benefits may be offset by a partial shifting of the burden of RVGE to older unvaccinated individuals. Nonetheless, even when such a shift occurs, the overall reduction in severe RVGE is considerable. 
Discrepancies among model predictions reflect uncertainties about age variation in the risk and reporting of RVGE, and the duration of natural and vaccine-induced immunity, highlighting important questions for future research.
Antimicrobial use in food animals may contribute to antimicrobial resistance in bacteria of animals and humans. Commensal bacteria of the animal intestine may serve as a reservoir of resistance genes. To understand the dynamics of plasmid-mediated resistance to the cephalosporin ceftiofur in enteric commensals of cattle, we developed a deterministic mathematical model of the dynamics of ceftiofur-sensitive and ceftiofur-resistant commensal enteric Escherichia coli (E. coli) in the absence of and during parenteral therapy with ceftiofur. The most common treatment scenarios, including those using a sustained-release drug formulation, were simulated; the model outputs were in agreement with the available experimental data. The model indicated that a low but stable fraction of resistant enteric E. coli could persist in the absence of immediate ceftiofur pressure, being sustained by horizontal and vertical transfer of plasmids carrying resistance genes, and by ingestion of resistant E. coli. During parenteral therapy with ceftiofur, resistant enteric E. coli expanded in absolute number and relative frequency. This expansion was most influenced by the parameters of antimicrobial action of ceftiofur against E. coli. After treatment (>5 weeks from the start of therapy) the fraction of ceftiofur-resistant cells among enteric E. coli, similar to that in the absence of treatment, was most influenced by the parameters of the ecology of enteric E. coli, such as the frequency of transfer of plasmids carrying resistance genes, the rate of replacement of enteric E. coli by ingested E. coli, and the frequency of ceftiofur resistance in the latter.
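The qualitative dynamics described above — a small resistant fraction sustained by plasmid transfer, expanding sharply under treatment — can be sketched with a two-compartment logistic model. This is a simplified stand-in, not the authors' model; all rates and the treatment window are illustrative assumptions.

```python
def resistant_fraction_timeline(days=60, dt=0.01, treat=(7.0, 12.0)):
    # S: ceftiofur-sensitive cells, R: plasmid-bearing resistant cells.
    K, rS, rR = 1e7, 2.0, 1.9   # carrying capacity; resistant pay a small cost
    h = 1e-9                    # horizontal plasmid transfer, per cell per day
    kill = 3.0                  # ceftiofur kill rate on sensitive cells, per day
    S, R = 9.9e6, 1e5           # start with ~1% resistant
    out = [R / (S + R)]
    steps_per_day = int(round(1 / dt))
    for step in range(days * steps_per_day):
        t = step * dt
        k = kill if treat[0] <= t < treat[1] else 0.0
        N = S + R
        dS = (rS * (1 - N / K) - k) * S - h * S * R   # growth - kill - conversion
        dR = rR * (1 - N / K) * R + h * S * R          # growth + plasmid transfer
        S = max(S + dS * dt, 0.0)
        R = max(R + dR * dt, 0.0)
        if (step + 1) % steps_per_day == 0:
            out.append(R / (S + R))
    return out  # daily resistant fraction
```

During the treatment window the resistant fraction approaches one as sensitive cells are killed and resistant cells refill the niche; afterwards transfer keeps a resistant fraction in circulation.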
The relationship between species richness and the prevalence of vector-borne disease has been widely studied with a range of outcomes. Increasing the number of host species for a pathogen may decrease infection prevalence (dilution effect), increase it (amplification), or have no effect. We derive a general model, and a specific implementation, which show that when the number of vector feeding sites on each host is limiting, the effects of host population size on pathogen dynamics are more complex than previously thought. The model examines vector-borne disease in the presence of different host species that are either competent or incompetent (i.e. that cannot transmit the pathogen to vectors) as reservoirs for the pathogen. With a single host species present, the basic reproduction ratio R0 is a non-monotonic function of the population size of host individuals (H), i.e. a value of H exists that maximises R0. Surprisingly, a reduction in host population size may therefore actually increase R0. Extending this model to a two-host-species system, incompetent individuals from the second host species can alter the host population size that maximises R0, which may reverse the effect on pathogen prevalence of host population reduction. We argue that when vector feeding sites on hosts are limiting, the net effect of increasing host diversity might not be correctly predicted using simple frequency-dependent epidemiological models.
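One way limited feeding sites can make R0 non-monotonic in host population size H is if the per-vector biting rate saturates as hosts (and hence feeding sites) become scarce. The sketch below is an illustrative construction in that spirit, not the paper's model; every parameter value is an assumption.

```python
def r0(H, V=1000.0, a_max=0.3, b=0.5, c=0.5, mu=0.1, g=0.1, H_half=500.0):
    # Bites per vector per day saturate with host availability: when hosts
    # are scarce, vectors compete for feeding sites and bite less often.
    a = a_max * H / (H + H_half)
    m = V / H                          # vectors per host
    # Ross-Macdonald-style composite: m * a^2 * b * c / (mu * g)
    # => proportional to H / (H + H_half)^2, which peaks at H = H_half.
    return m * a * a * b * c / (mu * g)
```

Because R0 is maximised at an intermediate host density (H = H_half here), culling hosts from a population above that density moves the system toward the peak and raises R0, matching the counter-intuitive result in the abstract.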
Livestock movements in Great Britain are well recorded, have been extensively analysed with respect to their role in disease spread, and have been used in real time to advise governments on the control of infectious diseases. Typically, livestock holdings are treated as distinct entities that must observe movement standstills upon receipt of livestock, and must report livestock movements. However, there are currently two dispensations that can exempt holdings from either observing standstills or reporting movements: the Sole Occupancy Authority (SOA) and Cattle Tracing System (CTS) Links, respectively. In this report we have used a combination of data analyses and computational modelling to investigate the usage and potential impact of such linked holdings on the size of a Foot-and-Mouth Disease (FMD) epidemic. Our analyses show that although SOAs are abundant, their dynamics appear relatively stagnant. CTS Links are also abundant, and increasing rapidly in number. Although most linked holdings are involved in only a single CTS Link, some holdings are involved in numerous links that can be amalgamated to form “CTS Chains”, which can be both large and geographically dispersed. Our model predicts that under a worst-case scenario of “one infected – all infected”, SOAs do pose a risk of increasing the size (in terms of the number of infected holdings) of an FMD epidemic, but this increase is mainly due to intra-SOA infection spread events. Furthermore, although SOAs do increase the geographic spread of an epidemic, this increase is predominantly local. CTS Chains, by contrast, pose a risk of substantially increasing both the size and the geographical spread of the disease under a worst-case scenario. Our results highlight the need for further investigations into whether CTS Chains are transmission chains, and into intra-SOA movements and livestock distributions, given the lack of current data.
Cost-benefit is rarely combined with nonlinear dynamic models when evaluating control options for infectious diseases. The current strategy for scrapie in Great Britain requires that all genetically susceptible livestock in affected flocks be culled (Compulsory Scrapie Flock Scheme or CSFS). However, this results in the removal of many healthy sheep, and a recently developed pre-clinical test for scrapie now offers a strategy based on disease detection. We explore the flock level cost-effectiveness of scrapie control using a deterministic transmission model and industry estimates of costs associated with genotype testing, pre-clinical tests and the value of a sheep culled. Benefit was measured in terms of the reduction in the number of infected sheep sold on, compared to a baseline strategy of doing nothing, using Incremental Cost Effectiveness analysis to compare across strategies. As market data was not available for pre-clinical testing, a threshold analysis was used to set a unit-cost giving equal costs for CSFS and multiple pre-clinical testing (MT, one test each year for three consecutive years). Assuming a 40% within-flock proportion of susceptible genotypes and a test sensitivity of 90%, a single test (ST) was cheaper but less effective than either the CSFS or MT strategies (30 infected-sales-averted over the lifetime of the average epidemic). The MT strategy was slightly less effective than the CSFS and would be a dominated strategy unless preclinical testing was cheaper than the threshold price of £6.28, but may be appropriate for flocks with particularly valuable livestock. Though the ST is not currently recommended, the proportion of susceptible genotypes in the national flock is likely to continue to decrease; this may eventually make it a cost-effective alternative to the MT or CSFS.
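The Incremental Cost Effectiveness analysis described above compares strategies along an efficient frontier, discarding those that are dominated. A minimal sketch of simple and extended dominance follows; the cost and effect figures in the usage example are hypothetical, not the industry estimates used in the study.

```python
def icer(cost_a, eff_a, cost_b, eff_b):
    """Incremental cost per additional unit of effect of A over B."""
    return (cost_a - cost_b) / (eff_a - eff_b)

def efficient_frontier(strategies):
    """strategies: list of (name, cost, effect) tuples.
    Returns the non-dominated strategies in order of increasing cost."""
    pts = sorted(strategies, key=lambda s: (s[1], -s[2]))
    # simple dominance: drop anything costing more without more effect
    frontier = []
    for s in pts:
        if not frontier or s[2] > frontier[-1][2]:
            frontier.append(s)
    # extended dominance: ICERs along the frontier must be increasing
    changed = True
    while changed:
        changed = False
        for i in range(1, len(frontier) - 1):
            prev = icer(frontier[i][1], frontier[i][2],
                        frontier[i - 1][1], frontier[i - 1][2])
            nxt = icer(frontier[i + 1][1], frontier[i + 1][2],
                       frontier[i][1], frontier[i][2])
            if nxt < prev:          # a cheaper route to more effect exists
                del frontier[i]
                changed = True
                break
    return frontier
```

In a toy comparison where an MT-like option has a higher ICER than a slightly more expensive but more effective CSFS-like option, the MT-like option is extendedly dominated and drops off the frontier, mirroring the abstract's finding.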
We assessed the severity of the 2009 influenza pandemic by comparing pandemic mortality to seasonal influenza mortality. However, reported pandemic deaths were laboratory-confirmed – and thus an underestimation – whereas seasonal influenza mortality is often more inclusively estimated. For a valid comparison, our study used the same statistical methodology and data types to estimate pandemic and seasonal influenza mortality.
Methods and Findings
We used data on all-cause mortality (1999–2010, 100% coverage, 16.5 million Dutch population) and influenza-like-illness (ILI) incidence (0.8% coverage). Data were aggregated by week and age category. Using generalized estimating equation regression models, we attributed mortality to influenza by associating mortality with ILI incidence, while adjusting for annual shifts in the association. We also adjusted for respiratory syncytial virus, hot/cold weather, other seasonal factors and autocorrelation. For the 2009 pandemic season, we estimated 612 (range 266–958) influenza-attributed deaths; for seasonal influenza, 1,956 (range 0–3,990). An estimated 15,845 years-of-life-lost were attributed to the pandemic; for an average seasonal epidemic, 17,908. For the 0–4 yrs age group, the number of influenza-attributed deaths during the pandemic was higher than in any seasonal epidemic: 77 deaths (range 61–93) compared to 16 deaths (range 0–45). The ≥75 yrs age group showed a far-below-average number of deaths. Using pneumonia/influenza and respiratory/cardiovascular instead of all-cause deaths consistently resulted in relatively low total pandemic mortality, combined with a high impact in the youngest age category.
The pandemic had an overall moderate impact on mortality compared to the 10 preceding seasonal epidemics, with higher mortality in young children and low mortality in the elderly. This resulted in a total number of pandemic deaths far below the average for seasonal influenza, and a total number of years-of-life-lost somewhat below average. Comparing pandemic and seasonal influenza mortality as in our study will help in assessing the worldwide impact of the 2009 pandemic.
Methicillin-resistant Staphylococcus aureus (MRSA) is endemic in many hospital settings, including nursing homes. It is an important nosocomial pathogen that causes mortality and imposes an economic burden on patients, hospitals, and the community. The epidemiology of the bacteria in nursing homes is both hospital- and community-like. Transmission occurs via the hands of health care workers (HCWs) and through direct contacts among residents during social activities. In this work, mathematical modeling in both deterministic and stochastic frameworks is used to study the dissemination of MRSA among residents and HCWs, the persistence and prevalence of MRSA in a population, and possible means of controlling the spread of this pathogen in nursing homes. The model predicts that: without strict screening and decolonization of colonized individuals at admission, MRSA may persist; decolonization of colonized residents, improving hand hygiene in both residents and HCWs, reducing the duration of contamination of HCWs, and decreasing the resident∶staff ratio are possible control strategies; the mean time that a resident remains susceptible after admission may be prolonged by screening and decolonization treatment of colonized individuals; in the stochastic framework, the total number of colonized residents varies and may increase when the admission of colonized residents, the duration of colonization, the average number of contacts among residents, or the average number of contacts that each resident requires from HCWs increases; and an introduction of a colonized individual into an MRSA-free nursing home has a much higher probability of leading to a major outbreak than an introduction of a contaminated HCW.
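The stochastic side of such models can be illustrated with a daily chain-binomial sketch of colonized residents in a ward, including resident turnover and colonized admissions. This is a simplified stand-in for the paper's framework (no explicit HCW compartment); all rates are illustrative assumptions.

```python
import math
import random

def simulate_ward(days=200, N=100, c0=5, beta=0.1, mu=0.1,
                  discharge=0.02, admit_col=0.0, rng=None):
    # C colonized residents out of N; subcritical within-ward transmission,
    # optionally sustained by colonized admissions (admit_col > 0).
    rng = rng or random.Random(0)
    C = c0
    for _ in range(days):
        S = N - C
        p_col = 1 - math.exp(-beta * C / N)             # per-susceptible daily risk
        new_c = sum(rng.random() < p_col for _ in range(S))
        dec = sum(rng.random() < mu for _ in range(C))  # decolonization
        C = max(0, min(N, C + new_c - dec))
        # turnover: discharged residents are replaced, and each replacement
        # is colonized with probability admit_col
        leave_c = sum(rng.random() < discharge for _ in range(C))
        leave_s = sum(rng.random() < discharge for _ in range(N - C))
        admit_c = sum(rng.random() < admit_col for _ in range(leave_c + leave_s))
        C = max(0, min(N, C - leave_c + admit_c))
    return C
```

With within-ward transmission subcritical, colonization dies out unless colonized admissions keep reseeding it, echoing the model's prediction that MRSA may persist without admission screening and decolonization.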
Influenza surveillance was carried out in a subset of patients with influenza-like illness (ILI) presenting at an Employee Health Clinic (EHS) at the All India Institute of Medical Sciences (AIIMS), New Delhi (urban) and the pediatric outpatient department of the civil hospital at Ballabhgarh (peri-urban), under the Comprehensive Rural Health Services Project (CRHSP) of AIIMS, in the Delhi region from January 2007 to December 2010. Of the 3264 samples tested, 541 (17%) were positive for influenza viruses, of which 221 (41%) were pandemic influenza A(H1N1)pdm09, 168 (31%) were seasonal influenza A, and 152 (28%) were influenza B. While influenza viruses were detected year-round, their types/subtypes varied remarkably. While there was an equal distribution of seasonal A(H1N1) and influenza B in 2007, a predominance of influenza B was observed in 2008. At the beginning of 2009, circulation of influenza A(H3N2) viruses was observed, followed later by the emergence of influenza A(H1N1)pdm09 with co-circulation of influenza B viruses. Influenza B was the dominant subtype in early 2010, with a second wave of influenza A(H1N1)pdm09 in August–September 2010. With the exception of the pandemic H1N1 emergence in 2009, the peaks of influenza activity coincided primarily with the monsoon season, followed by a minor peak in winter at both urban and rural sites. Age-group analysis revealed that the percent positivity of influenza A(H1N1)pdm09 was highest in the >5–18 years age group (OR 2.5; CI = 1.2–5.0; p = 0.009) when compared to seasonal influenza. Phylogenetic analysis of influenza A(H1N1)pdm09 from urban and rural sites did not reveal any major divergence from other Indian strains or viruses circulating worldwide. Continued surveillance globally will help define regional differences in influenza seasonality, as well as determine optimal periods to implement influenza vaccination programs among priority populations.
We examined emotional memory enhancement (EEM) for negative and positive pictures while manipulating encoding and retrieval conditions. Two groups of 40 participants took part in this study. Both groups performed immediate implicit (categorization task) and explicit (recognition task) retrieval, but for one group the tasks were preceded by incidental encoding and for the other group by intentional encoding. As indicated by the sensitivity index (dʹ), after incidental encoding positive stimuli were easier to recognize than negative and neutral stimuli. Participants’ response criterion was more liberal for negative stimuli than for both positive and neutral ones, independent of encoding condition. In the implicit retrieval task, participants were slower in categorizing positive than negative and neutral stimuli. However, the priming effect was larger for emotional than for neutral stimuli. These results are discussed in the context of the idea that the effect of emotion on immediate memory enhancement may depend on the intentionality to encode and retrieve information.
emotional memory enhancement; explicit/implicit retrieval; intentional/incidental encoding
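The sensitivity index dʹ and the response criterion reported above come from signal detection theory. A minimal sketch of their computation follows; the hit and false-alarm rates in the usage example are hypothetical, not the study's data.

```python
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    """Signal-detection sensitivity d' = z(H) - z(FA) and response
    criterion c = -(z(H) + z(FA)) / 2, where z is the inverse of the
    standard normal CDF. A more liberal criterion yields a more
    negative c."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate), -(z(hit_rate) + z(fa_rate)) / 2
```

In practice hit and false-alarm rates of exactly 0 or 1 must first be adjusted (e.g. the log-linear correction), since z is undefined at the extremes.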
At least 10% of the 56,000 annual new HIV infections in the United States are caused by individuals with acute HIV infection (AHI). It is unknown whether the health benefits and costs of routine nucleic acid amplification testing (NAAT) are justified, given the availability of newer fourth-generation immunoassay tests.
Using a dynamic HIV transmission model instantiated with U.S. epidemiologic, demographic, and behavioral data, I estimated the number of acute infections identified, HIV infections prevented, quality-adjusted life years (QALYs) gained, and the cost-effectiveness of alternative screening strategies. I varied the target population (everyone aged 15-64, injection drug users [IDUs] and men who have sex with men [MSM], or MSM only), screening frequency (annually, or every six months), and test(s) utilized (fourth-generation immunoassay only, or immunoassay followed by pooled NAAT).
Annual immunoassay testing of MSM reduces incidence by 9.5% and costs <$10,000 per QALY gained. Adding pooled NAAT identifies 410 AHI per year, prevents 9.6% of new cases, costs $92,000 per QALY gained, and remains <$100,000 per QALY gained in settings where undiagnosed HIV prevalence exceeds 4%. Screening IDUs and MSM annually with fourth-generation immunoassay reduces incidence by 13% with cost-effectiveness <$10,000 per QALY gained. Increasing the screening frequency to every six months reduces incidence by 11% (MSM only) or 16% (MSM and IDUs) and costs <$20,000 per QALY gained.
Pooled NAAT testing every 12 months of MSM and IDUs in the United States prevents a modest number of infections, but may be cost-effective given sufficiently high HIV prevalence levels. However, testing via fourth-generation immunoassay every six months prevents a greater number of infections, is more economically efficient, and may obviate the benefits of acute HIV screening via NAAT.
Using time-kill methodology, we investigated the interactions of fosfomycin with meropenem, colistin, or gentamicin against 17 genetically distinct Klebsiella pneumoniae clinical isolates carrying blaKPC-2. Synergy was observed with meropenem or colistin against 64.7% and 11.8% of tested isolates, respectively, while the combination with gentamicin resulted in indifference. All studied combinations showed improved bactericidal activity compared to fosfomycin alone and prevented the development of fosfomycin resistance in 69.2%, 53.8%, and 81.8% of susceptible isolates, respectively.
The understanding of host-parasite systems in wildlife is of increasing interest in relation to the risk of emerging diseases in livestock and humans. In this respect, many efforts have been dedicated to controlling classical swine fever (CSF) in the European wild boar, but CSF eradication has not always been achieved even where vaccination has been implemented at a large scale. Piglets have been assumed to be the main cause of CSF persistence in the wild, since they appeared to be more often infected and less often immune than older animals. However, this assumption emerged from laboratory trials or from cross-sectional surveys based on hunting bags.
In the present paper we conducted a capture-mark-recapture study of free-ranging wild boar piglets that experienced both CSF infection and vaccination under natural conditions. We used multi-state capture-recapture models to estimate the immunization and infection rates, and their variations between periods with and without vaccination. According to the model predictions, 80% of the infected piglets did not survive more than two weeks, while the other 20% quickly recovered. The probability of becoming immune did not increase significantly during the summer vaccination sessions, and the proportion of immune piglets was not higher after the autumn vaccination.
Given the high lethality of CSF in piglets highlighted in our study, we consider it unlikely that piglets could maintain the chain of CSF virus transmission. Our study also revealed the low efficacy of vaccination in piglets in summer and autumn, possibly due to the low palatability of baits to that age class, but also to competition between baits and alternative food sources. Based on this new information, we discuss prospects for the improvement of CSF control and the value of the capture-recapture approach for improving the understanding of wildlife diseases.
Deployment of limited resources is an issue of major importance for decision-making in crisis events. This is especially true for large-scale outbreaks of infectious diseases. Little is known when it comes to identifying the most efficient way of deploying scarce resources for control when disease outbreaks occur in different but interconnected regions. The policy maker is frequently faced with the challenge of optimizing efficiency (e.g. minimizing the burden of infection) while accounting for social equity (e.g. equal opportunity for infected individuals to access treatment). For a large range of diseases described by a simple SIRS model, we consider strategies that should be used to minimize the discounted number of infected individuals during the course of an epidemic. We show that when faced with the dilemma of choosing between socially equitable and purely efficient strategies, the choice of the control strategy should be informed by key measurable epidemiological factors such as the basic reproductive number and the efficiency of the treatment measure. Our model provides new insights for policy makers in the optimal deployment of limited resources for control in the event of epidemic outbreaks at the landscape scale.
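The objective described above — the discounted number of infected individuals under an SIRS model, compared across ways of splitting a fixed treatment budget between regions — can be sketched as follows. Treating the regions as uncoupled, and all parameter values, are simplifying assumptions for illustration.

```python
import math

def sirs_discounted_burden(beta, treat_rate, days=200, dt=0.05,
                           gamma=0.1, wane=0.01, delta=0.02, i0=0.01):
    # J = sum over time of e^{-delta t} * I(t) * dt: discounted infection
    # burden; treatment adds treat_rate to the removal rate.
    s, i, r = 1 - i0, i0, 0.0
    J = 0.0
    for k in range(int(days / dt)):
        t = k * dt
        J += math.exp(-delta * t) * i * dt
        new = beta * s * i
        rec = (gamma + treat_rate) * i
        ret = wane * r
        s += (ret - new) * dt
        i += (new - rec) * dt
        r += (rec - ret) * dt
    return J

def two_region_burden(alloc, total=0.2, betas=(0.3, 0.2)):
    # split a fixed treatment budget: fraction `alloc` to the
    # higher-transmission region, the rest to the other
    return (sirs_discounted_burden(betas[0], alloc * total)
            + sirs_discounted_burden(betas[1], (1 - alloc) * total))
```

Scanning `alloc` over a grid then reveals whether the efficiency-optimal split concentrates resources in one region or shares them, which is exactly the equity-versus-efficiency tension the abstract discusses.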
The global spread of infectious diseases is facilitated by the ability of infected humans to travel thousands of miles in short time spans, rapidly transporting pathogens to distant locations. Mathematical models of the actual and potential spread of specific pathogens can assist public health planning in the case of such an event. Models should generally be parsimonious, but must consider all potentially important components of the system to the greatest extent possible. We demonstrate and discuss important assumptions relative to the parameterization and structural treatment of airline travel in mathematical models. Among other findings, we show that the most common structural treatment of travelers leads to underestimation of the speed of spread and that connecting travel is critical to a realistic spread pattern. Models involving travelers can be improved significantly by relatively simple structural changes but also may require further attention to details of parameterization.
Some epidemics have been empirically observed to exhibit outbreaks of all possible sizes, i.e., to be scale-free or scale-invariant. Different explanations for this finding have been put forward; among them is a model for “accidental pathogens” which leads to power-law distributed outbreaks without any apparent need for parameter fine-tuning. This model has been claimed to be related to self-organized criticality, and its critical properties have been conjectured to be related to directed percolation. Instead, we show that this is a (quasi-)neutral model, analogous to those used in population genetics and ecology, with the same critical behavior as the voter model; i.e., the theory of accidental pathogens is a (quasi-)neutral theory. This analogy allows us to explain all of the system's phenomenology, including generic scale invariance and the associated scaling exponents, in a parsimonious and simple way.
Seasonal variation in serum concentration of the vitamin D metabolite 25-hydroxyvitamin D [25(OH)D], which contributes to host immune function, has been hypothesized to be the underlying source of observed influenza seasonality in temperate regions. The objective of this study was to determine whether observed 25(OH)D levels could be used to simulate observed influenza infection rates. Data on the mean and variance of 25(OH)D serum levels by month were obtained from the Health Professionals Follow-up Study and used to parameterize an individual-based model of influenza transmission dynamics in two regions of the United States. Simulations were compared with observed daily influenza excess mortality data. Best-fitting simulations could reproduce the observed seasonal cycle of influenza; however, these best-fit simulations were highly sensitive to stochastic processes within the model and were unable to reproduce observed seasonal patterns consistently. In this respect, the simulations with the vitamin D-forced model were inferior to similar modeling efforts using absolute humidity and the school calendar as seasonal forcing variables. These model results indicate it is unlikely that seasonal variations in vitamin D levels principally determine the seasonality of influenza in temperate regions.
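The idea of driving transmission with a seasonally varying host proxy can be sketched with a deterministic, seasonally forced SIR model. The sinusoidal forcing term below is a stand-in for month-to-month variation in 25(OH)D, and all parameter values are illustrative, not fitted values from the study.

```python
import math

def sir_seasonal(days=730, N=1e6, beta0=0.5, amp=0.3, gamma=0.25, i0=10):
    """Discrete-time SIR with sinusoidal seasonal forcing of transmission.

    beta(t) peaks at t = 0, taken here (by assumption) to be midwinter,
    mimicking higher transmissibility when 25(OH)D levels are lowest.
    Returns the daily infectious count.
    """
    S, I, R = N - i0, float(i0), 0.0
    infectious = []
    for t in range(days):
        beta = beta0 * (1.0 + amp * math.cos(2.0 * math.pi * t / 365.0))
        new_inf = beta * S * I / N
        new_rec = gamma * I
        S -= new_inf
        I += new_inf - new_rec
        R += new_rec
        infectious.append(I)
    return infectious
```

A deterministic sketch like this always shows the forced cycle; the study's finding concerns the stochastic, individual-based version, where demographic noise swamped the comparatively weak vitamin D signal.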
We analyse data from the early epidemic of H1N1-2009 in New Zealand and estimate the reproduction number R. We employ a renewal process which accounts for imported cases, illustrate some technical pitfalls, and propose a novel estimation method to address these pitfalls. Explicitly accounting for the infection-age distribution of imported cases and for the delay in transmission dynamics due to international travel, R was estimated to be (95% confidence interval: ). Hence we show that a previous study, which did not account for these factors, overestimated R. Our approach also permitted us to examine the infection-age at which secondary transmission occurs as a function of calendar time, demonstrating the downward bias during the beginning of the epidemic. These technical issues may compromise the usefulness of a well-known estimator of R: the inverse of the moment-generating function of the generation time, given the intrinsic growth rate. Explicit modelling of the infection-age distribution among imported cases and examination of the time dependency of the generation time play key roles in avoiding a biased estimate of R, especially when one only has data covering a short time interval during the early growth phase of the epidemic.
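A renewal-process estimator of this general kind can be sketched as follows. The discretized generation-interval weights and the split between local and imported cases are illustrative; this is the textbook form of the estimator, not the authors' corrected method.

```python
def renewal_R(local, imported, gi):
    """Crude renewal-process estimate of the reproduction number R.

    local, imported : daily counts of local and imported cases
    gi              : discretized generation-interval pmf, gi[s-1] = w(s)

    R(t) = local(t) / sum_s w(s) * (local(t-s) + imported(t-s)),
    so imported cases contribute infection pressure but are never
    counted as locally generated incidence.
    """
    total = [l + m for l, m in zip(local, imported)]
    R = [None]  # undefined on day 0: no prior cases to generate it
    for t in range(1, len(local)):
        window = min(t, len(gi))
        pressure = sum(gi[s - 1] * total[t - s]
                       for s in range(1, window + 1))
        R.append(local[t] / pressure if pressure > 0 else None)
    return R
```

The abstract's point is that this naive form still biases R unless the infection-age distribution of the imported cases themselves (how long they had already been infectious on arrival) is modelled explicitly.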
Current post-epidemic sero-surveillance uses random selection of animal holdings. A better strategy may be to estimate the benefits gained by sampling each farm and use this to target selection. In this study we estimate the probability of undiscovered infection for sheep farms in Devon after the 2001 foot-and-mouth disease outbreak using the combination of a previously published model of daily infection risk and a simple model of probability of discovery of infection during the outbreak. This allows comparison of the system sensitivity (ability to detect infection in the area) of arbitrary, random sampling compared to risk-targeted selection across a full range of sampling budgets. We show that it is possible to achieve 95% system sensitivity by sampling, on average, 945 farms with random sampling and 184 farms with risk-targeted sampling. We also examine the effect of ordering samples by risk to expedite return to a disease-free status. Risk ordering the sampling process results in detection of positive farms, if present, 15.6 days sooner than with randomly ordered sampling, assuming 50 farms are tested per day.
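The random-versus-targeted comparison rests on a simple system-sensitivity calculation. A minimal sketch, assuming each farm i has an estimated probability p_i of undiscovered infection and a common test sensitivity se; the function names and the independence assumption are illustrative, not the paper's detection model.

```python
def system_sensitivity(p, se=0.95):
    """Probability that a sampled set detects at least one infected farm,
    assuming independent farms and per-farm test sensitivity `se`."""
    miss = 1.0
    for pi in p:
        miss *= 1.0 - pi * se
    return 1.0 - miss

def farms_needed_targeted(p, target=0.95, se=0.95):
    """Smallest number of farms, sampled in descending risk order,
    needed to reach the target system sensitivity (None if unreachable)."""
    miss = 1.0
    for n, pi in enumerate(sorted(p, reverse=True), start=1):
        miss *= 1.0 - pi * se
        if 1.0 - miss >= target:
            return n
    return None
```

Because high-risk farms contribute most of the detection probability, sampling them first reaches the target with far fewer tests than random selection, which is the mechanism behind the 945-versus-184 result.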
Assessment of the relative impact of diseases and pathogens is important for agencies and other organizations charged with providing disease surveillance, management and control. It also helps funders of disease-related research to identify the most important areas for investment. Decisions as to which pathogens or diseases to target are often made using complex risk assessment approaches; however, these usually involve evaluating a large number of hazards, and it is rarely feasible to conduct an in-depth appraisal of each. Here we propose the use of the H-index (or Hirsch index) as an alternative rapid, repeatable and objective means of assessing pathogen impact. H-index scores for 1,414 human pathogens were obtained from the Institute for Scientific Information's Web of Science (WOS) in July/August 2010. Scores were compared for zoonotic/non-zoonotic and emerging/non-emerging pathogens, and across taxonomic groups. H-indices for a subset of pathogens were compared with Disability Adjusted Life Year (DALY) estimates for the diseases they cause. H-indices ranged from 0 to 456, with a median of 11. Emerging pathogens had higher H-indices than non-emerging pathogens. Zoonotic pathogens tended to have higher H-indices than human-only pathogens, although the opposite was observed for viruses. There was a significant correlation between the DALY burden of a disease and the H-index of the pathogen(s) that cause it. Therefore, scientific interest, as measured by the H-index, appears to be a reflection of the true impact of pathogens. The H-index method can be utilized to set up an objective, repeatable and readily automated system for assessing pathogen or disease impact.
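The H-index itself is straightforward to compute from a pathogen's per-publication citation counts; a minimal sketch:

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each.

    `citations` is an iterable of per-publication citation counts for
    one pathogen's literature.
    """
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h
```

Automating this per pathogen over a bibliographic database is what makes the approach "rapid, repeatable and readily automated" compared with multi-criteria risk assessment.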
During summer 2007, Italy experienced an epidemic caused by Chikungunya virus, the first large outbreak documented in a temperate-climate country, with approximately 161 laboratory-confirmed cases concentrated in two bordering villages in north-eastern Italy comprising 3,968 inhabitants. The seroprevalence was recently estimated to be 10.2%. In this work we provide estimates of the transmission potential of the virus and assess the efficacy of the measures undertaken by public health authorities to control the epidemic spread. To this aim, we developed a model describing the temporal dynamics of the competent vector, Aedes albopictus, explicitly depending on climatic factors, coupled to an epidemic transmission model describing the spread of the epidemic in both humans and mosquitoes. The cumulative number of notified cases predicted by the model was 185 on average (95% CI 117–278), in good agreement with observed data. The probability of observing a major outbreak after the introduction of an infective human case was estimated to be in the range 32%–76%. We found that the basic reproduction number was in the range 1.8–6, but it could have been even larger, depending on the density of mosquitoes, which in turn depends on seasonal meteorological effects, besides other local abiotic factors. These results confirm the increasing risk of tropical vector-borne diseases in temperate-climate countries as a consequence of globalization. However, our results show that an epidemic can be controlled by a timely intervention, even when the transmission potential of Chikungunya virus is considerably high.
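For intuition on how mosquito density drives the basic reproduction number, a classic Ross-Macdonald-style expression can be evaluated directly. This is a textbook formula with illustrative parameter values, not the climate-driven model fitted in the study.

```python
import math

def vector_borne_R0(m, a, b, c, mu, gamma, eip):
    """Ross-Macdonald basic reproduction number for a vector-borne pathogen.

    m     : female mosquitoes per human
    a     : bites per mosquito per day
    b, c  : mosquito-to-human and human-to-mosquito transmission probabilities
    mu    : daily mosquito mortality rate
    gamma : human recovery rate (per day)
    eip   : extrinsic incubation period in the mosquito (days)
    """
    return (m * a ** 2 * b * c * math.exp(-mu * eip)) / (mu * gamma)
```

Because R0 scales linearly with m, seasonal meteorological effects that change mosquito density translate directly into the wide 1.8-6 range reported in the abstract.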