Cryptosporidiosis is predominantly caused by two closely related species of protozoan parasites: the zoonotic Cryptosporidium parvum and the anthroponotic Cryptosporidium hominis, which diverge phenotypically with respect to host range and virulence. Using comparative genomics, we identified two genes displaying overt heterogeneity between the species. Although initial work suggested both were species specific, Cops-1 for C. parvum and Chos-1 for C. hominis, subsequent study identified an abridged ortholog of Cops-1 in C. hominis. Cops-1 and Chos-1 showed limited, but significant, similarity to each other and share common features: (i) a telomeric location (Cops-1 is the last gene on chromosome 2, whilst Chos-1 is the first gene on chromosome 5); (ii) both encode circa 50-kDa secreted proteins with isoelectric points above 10; (iii) both are serine rich; and (iv) both contain internal nucleotide repeats. Importantly, the Cops-1 sequence contains specific SNPs with good discriminatory power that are useful epidemiologically. Sera from C. parvum-infected patients recognized a 50-kDa protein in antigen preparations of C. parvum but not C. hominis, consistent with Cops-1 being antigenic in patients. Interestingly, an anti-Cops-1 monoclonal antibody (9E1) stained the oocyst content and sporozoite surface of C. parvum only. This study provides a new example of protozoan telomeres as rapidly evolving contingency loci encoding putative virulence factors.
bioinformatics/phyloinformatics; biomedicine; genomics/proteomics; host-parasite interactions; microbial biology; molecular evolution; parasitology; virulence
Background: Anthropogenic climate change will affect global food production, with uncertain consequences for human health in developed countries.
Objectives: We investigated the potential impact of climate change on food security (nutrition and food safety) and the implications for human health in developed countries.
Methods: Expert input and structured literature searches were combined and synthesized to produce overall assessments of the likely impacts of climate change on global food production, together with recommendations for future research and policy changes.
Results: Increasing food prices may lower the nutritional quality of dietary intakes, exacerbate obesity, and amplify health inequalities. Altered conditions for food production may result in emerging pathogens, new crop and livestock species, and altered use of pesticides and veterinary medicines, and may affect the main transfer mechanisms through which contaminants move from the environment into food. All of these have implications for food safety and the nutritional content of food. Climate change mitigation may shift consumption towards foods whose production generates lower greenhouse gas emissions. Impacts may include reduced red meat consumption (with positive effects on saturated fat intake, but negative impacts on zinc and iron intake) and reduced winter fruit and vegetable consumption. Developed countries have complex structures in place that may be used to adapt to the food safety consequences of climate change, although their effectiveness will vary between countries, and the ability to respond to nutritional challenges is less certain.
Conclusions: Climate change will have notable impacts upon nutrition and food safety in developed countries, but further research is necessary to accurately quantify these impacts. Uncertainty about future impacts, coupled with evidence that climate change may lead to more variable food quality, emphasizes the need to maintain and strengthen existing structures and policies to regulate food production, monitor food quality and safety, and respond to nutritional and safety issues that arise.
adaptation; climate change; food safety; food security; nutrition; regulation
The aim of this study was to identify whether there is a relationship between the distance people have to carry water home and ill health. We conducted a systematic review of papers reporting on the association between diarrheal risk and distance to the water source. Six papers were identified for inclusion in the meta-analysis. These were all observational studies, and only two reported effect sizes adjusted for possible confounding. Multiple different types of water source supplied the study communities. The combined odds ratio (OR) showed a significant increase in illness risk in people living farther from their water source (OR = 1.45; 95% confidence interval [CI] = 1.04–1.68). Better designed studies are needed to further elucidate the health impacts of having to carry water home.
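As an illustration of the pooling step behind such a combined odds ratio, the sketch below shows a standard inverse-variance (fixed-effect) meta-analysis of log odds ratios in Python; the three studies and all of their numbers are invented placeholders, not data from this review.

```python
# Illustrative sketch: inverse-variance pooling of odds ratios on the log scale.
import numpy as np
from scipy import stats

# Hypothetical per-study odds ratios with 95% confidence intervals (OR, lower, upper).
studies = {
    "study_a": (1.3, 0.9, 1.9),
    "study_b": (1.8, 1.1, 2.9),
    "study_c": (1.2, 0.8, 1.8),
}

log_or = np.array([np.log(or_) for or_, _, _ in studies.values()])
# Standard errors recovered from the CI width on the log scale.
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for _, lo, hi in studies.values()])

w = 1 / se**2                                   # inverse-variance weights
pooled_log_or = np.sum(w * log_or) / np.sum(w)
pooled_se = np.sqrt(1 / np.sum(w))

ci = stats.norm.interval(0.95, loc=pooled_log_or, scale=pooled_se)
print(f"Pooled OR = {np.exp(pooled_log_or):.2f} "
      f"(95% CI {np.exp(ci[0]):.2f}-{np.exp(ci[1]):.2f})")
```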
This study sought to identify whether consumers of contaminated small water supplies are at elevated risk of infectious intestinal disease (IID) compared with consumers of small supplies that comply with current standards, and whether any such effect is modified by age.
Methodology and Principal Findings
A prospective cohort study of 611 individuals receiving small supplies in England was conducted. Water supplies received sanitary inspection and examination for indicator bacteria, and participants maintained a daily record of IID. Regression modeling with generalized estimating equations, including interaction terms between age and indicators of fecal pollution, was performed. Crude IID prevalence was 9.3 days with symptoms per 1000 person-days (95% CI: 8.4, 10.1) and incidence was 3.2 episodes per 1000 person-days (95% CI: 2.7, 3.7), or 1.2 episodes per person-year. Although there was no overall association between IID risk and indicator presence, there was a strong interaction between age and indicator presence. In children under ten, the relative risk (RR) of IID in those drinking from enterococci-contaminated supplies was 4.8 (95% CI: 1.5, 15.3) for incidence and 8.9 (95% CI: 2.8, 27.5) for prevalence. In those aged 10 to 59, the increase in IID risk was smaller and not statistically significant.
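The sketch below illustrates, on simulated data, the general form of a generalized estimating equation model with an age-by-indicator interaction and clustering on participant; the variable names, the simulated data and the Poisson working model are assumptions for illustration rather than the study's actual code.

```python
# Sketch of a GEE with an age x indicator interaction, clustered on participant.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_people, n_days = 200, 28
df = pd.DataFrame({
    "person_id": np.repeat(np.arange(n_people), n_days),
    "age_group": np.repeat(rng.choice(["child", "adult"], n_people), n_days),
    "enterococci": np.repeat(rng.integers(0, 2, n_people), n_days),
})
# Simulated daily illness indicator with a higher rate in exposed children.
risk = 0.005 + 0.02 * ((df["age_group"] == "child") & (df["enterococci"] == 1))
df["iid_day"] = rng.binomial(1, risk)

model = smf.gee(
    "iid_day ~ C(age_group) * enterococci",   # interaction between age group and indicator presence
    groups="person_id",                       # repeated daily records per participant
    data=df,
    family=sm.families.Poisson(),             # log link: coefficients exponentiate to rate ratios
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```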
Contaminated small water supplies pose a substantial risk of IID to young children living in homes reliant on these supplies. By contrast, older children and adults do not appear to be at increased risk. Health care professionals with responsibility for children living in homes served by very small water supplies should make parents aware of this risk.
The effects of various dengue control measures have been investigated in previous studies. The aim of this review was to investigate the relative effectiveness (RE) of different educational messages embedded in a community-based approach on the incidence of Aedes aegypti larvae using entomological measures as outcomes.
Methods and Findings
A systematic electronic search using Medline, Embase, Web of Science and the Cochrane Library was carried out to March 2010. Previous systematic reviews were also assessed. Data concerning interventions, outcomes, effect size and study design were extracted. Basic meta-analyses for pooled effect size, heterogeneity and publication bias were done using Comprehensive Meta-analysis. Further analysis of heterogeneity was done by multi-level modelling using MLwiN. Twenty-one publications comprising 22 separate studies were included in this review. Meta-analysis of these 22 pooled studies showed an RE of 0.25 (95% CI 0.17–0.37), but with substantial heterogeneity (Cochran's Q = 1254, df = 21, p < 0.001). Further analysis of this heterogeneity showed that over 60% of between-study variance could be explained by just two variables: whether studies used historic or contemporary controls, and time from intervention to assessment. When analyses were restricted to studies using contemporary controls, there was a polynomial relationship between effectiveness and time to assessment. Whether or not chemical or other control measures were used did not appear to have any effect on intervention effectiveness.
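A weighted meta-regression of log relative effectiveness on study design and time to assessment, with a quadratic time term for the polynomial relationship, is sketched below on invented numbers; the review itself used Comprehensive Meta-analysis and MLwiN, so this Python version is only an illustration of the general approach.

```python
# Sketch of an inverse-variance weighted meta-regression on study-level covariates.
import numpy as np
import pandas as pd
import statsmodels.api as sm

studies = pd.DataFrame({
    "rel_effect": [0.15, 0.30, 0.45, 0.20, 0.60, 0.35],   # hypothetical relative effectiveness values
    "se_log_re": [0.20, 0.25, 0.30, 0.15, 0.35, 0.25],    # hypothetical SE of log(RE)
    "historic_control": [1, 0, 0, 1, 0, 0],               # 1 = historic control, 0 = contemporary
    "months": [6, 12, 24, 9, 36, 18],                     # time from intervention to assessment
})
studies["months_sq"] = studies["months"] ** 2             # quadratic term for the polynomial trend

y = np.log(studies["rel_effect"])
X = sm.add_constant(studies[["historic_control", "months", "months_sq"]])
weights = 1 / studies["se_log_re"] ** 2                   # inverse-variance weights

meta_reg = sm.WLS(y, X, weights=weights).fit()
print(meta_reg.params)
```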
The results suggest that such measures do appear to be effective at reducing entomological indices. However, studies that use historical controls almost certainly overestimate the value of interventions. There is evidence that interventions are most effective some 18 to 24 months after implementation, with effectiveness subsequently declining.
Dengue fever is a mosquito-borne viral infection that is widespread in the tropics. Each year there are an estimated 50 million infections worldwide. Preventing infection relies on controlling the mosquitoes that spread disease. Unfortunately it is still not clear what does and does not work in the control of the mosquito vector. There have been several systematic reviews into control of dengue fever but still no consensus has been reached. This lack of consensus reflects the substantial heterogeneity in published effectiveness of studies of dengue control interventions. Prior systematic reviews have not adequately addressed this heterogeneity. We used multi-level modelling meta regression to investigate what variables modify the effectiveness of studies of educational messages embedded in a community-based approach. Most of the between study variation was explained by two variables, study design and time from intervention to assessment. In particular, studies using historic controls substantially overestimated the effectiveness of the intervention compared to those studies using contemporary controls. When the analysis was restricted to just those studies using contemporary controls, effectiveness was highest about 12 to 24 months after intervention but then declined.
To estimate, overall and by organism, the incidence of infectious intestinal disease (IID) in the community, presenting to general practice (GP) and reported to national surveillance.
Prospective, community cohort study and prospective study of GP presentation conducted between April 2008 and August 2009.
Eighty-eight GPs across the UK recruited from the Medical Research Council General Practice Research Framework and the Primary Care Research Networks.
6836 participants registered with the 88 participating practices in the community study; 991 patients with UK-acquired IID presenting to one of 37 practices taking part in the GP presentation study.
Main outcome measures
IID rates in the community, presenting to GP and reported to national surveillance, overall and by organism; annual IID cases and GP consultations by organism.
The overall rate of IID in the community was 274 cases per 1000 person-years (95% CI 254 to 296); the rate of GP consultations was 17.7 per 1000 person-years (95% CI 14.4 to 21.8). There were 147 community cases and 10 GP consultations for every case reported to national surveillance. Norovirus was the most common organism, with incidence rates of 47 community cases per 1000 person-years and 2.1 GP consultations per 1000 person-years. Campylobacter was the most common bacterial pathogen, with a rate of 9.3 cases per 1000 person-years in the community, and 1.3 GP consultations per 1000 person-years. We estimate that there are up to 17 million sporadic, community cases of IID and 1 million GP consultations annually in the UK. Of these, norovirus accounts for 3 million cases and 130 000 GP consultations, and Campylobacter is responsible for 500 000 cases and 80 000 GP consultations.
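The back-of-envelope sketch below shows how the quoted rates, under-ascertainment ratios and national totals relate to one another arithmetically; the UK population figure is an assumption for illustration, not a number taken from the study.

```python
# Simple arithmetic relating the quoted rates and ratios; not the study's analysis code.
community_rate = 274.0                       # community cases per 1000 person-years
gp_rate = 17.7                               # GP consultations per 1000 person-years
surveillance_rate = community_rate / 147     # implied reports per 1000 person-years

print(f"Community cases per surveillance report: {community_rate / surveillance_rate:.0f}")
print(f"GP consultations per surveillance report: {gp_rate / surveillance_rate:.1f}")

# Scaling a rate to an annual national estimate (population size is illustrative).
uk_population = 62_000_000
annual_cases = community_rate / 1000 * uk_population
print(f"Approximate annual community cases: {annual_cases / 1e6:.0f} million")
```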
IID poses a substantial community and healthcare burden in the UK. Control efforts must focus particularly on reducing the burden due to Campylobacter and enteric viruses.
Campylobacter; diarrhoeal disease; epidemiology; infectious diarrhoea; salmonella
During 2001, a large outbreak of foot and mouth disease occurred in the United Kingdom, during which approximately 2,030 confirmed cases of the disease were reported, >6 million animals were slaughtered, and strict restrictions on access to the countryside were imposed. We report a dramatic decline in the reported incidence of human cryptosporidiosis in northwest England during weeks 13–38 in 2001, compared with the previous 11 years. This decline coincided with the period of foot and mouth restrictions. No similar reduction occurred in the other 26 weeks of the year. We also noted a substantial decline in the proportion of human infections caused by the bovine strain (genotype 2) of Cryptosporidium parvum during weeks 13–38 that year but not during the other weeks.
cryptosporidiosis; Cryptosporidium parvum; foot and mouth disease; incidence; zoonosis
An outbreak of cryptosporidiosis occurred in and around Clitheroe, Lancashire, in northwest England, during March 2000. Fifty-eight cases of diarrhea with Cryptosporidium identified in stool specimens were reported. Cryptosporidium oocysts were identified in samples from the water treatment works as well as domestic taps. Descriptive epidemiology suggested that drinking unboiled tap water in a single water zone was the common factor linking cases. Environmental investigation suggested that contamination with animal feces was the likely source of the outbreak. This outbreak was unusual in that hydrodynamic modeling was used to give a good estimate of the peak oocyst count at the time of the contamination incident. The oocysts’ persistence in the water distribution system after switching to another water source was also unusual. This persistence may have been due to oocysts being entrapped within biofilm. Despite the continued presence of oocysts, epidemiologic evidence suggested that no one became ill after the water source was changed.
Cryptosporidium; outbreak; oocysts; water; zoonosis; biofilm
As one article in a four-part PLoS Medicine series on water and sanitation, Paul Hunter and colleagues argue that much more effort is needed to improve access to safe and sustainable water supplies.
During times of public health emergency, effective communication between emergency response agencies and the affected public is important to ensure that people protect themselves from injury or disease. To investigate compliance with public health advice during natural disasters, we examined consumer behaviour during two water notices issued as a result of serious flooding. During the summer of 2007, 140,000 homes in Gloucestershire, United Kingdom, supplied with water from the Mythe treatment works lost their drinking water for up to 17 days. Consumers were issued a 'Do Not Drink' notice when the water was restored, which was subsequently replaced with a 'Boil Water' notice. The rare occurrence of two successive water notices provided a unique opportunity to compare compliance with public health advice. Information source use and other factors that may affect consumer perception and behaviour were also explored.
A postal questionnaire was sent to 1,000 randomly selected households. Chi-square tests, ANOVA, MANOVA and generalised estimating equations (with and without prior factor analysis) were used for quantitative analysis.
In terms of information sources, we found high use of, and a clear preference for, local radio throughout the incident, but family/friends/neighbours also proved crucial at the onset. Local newspapers and the water company were associated with clarity of advice and with feeling informed, respectively. Older consumers and those in paid employment were particularly unlikely to read the official information leaflets. We also found a high degree of confusion about which notice was in place at which time, with correct recall varying between 23.2% and 26.7%, and many consumers believed the two notices were in place simultaneously. In terms of behaviour, overall non-compliance was significantly higher for the 'Do Not Drink' notice (62.9%) than for the 'Boil Water' notice (48.3%); consumers in paid employment were particularly unlikely to comply with advice. Non-compliance with the general advice to boil bowser water was noticeably lower (27.3%).
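As a rough illustration of how the difference in non-compliance between the two notices could be tested, the sketch below runs a chi-square test on a 2 x 2 table; the number of respondents is a placeholder, so the resulting p-value is purely illustrative and not a result from the study.

```python
# Illustrative chi-square comparison of non-compliance under the two notices.
import numpy as np
from scipy.stats import chi2_contingency

n_respondents = 400                                      # hypothetical responders per notice
non_comply = np.array([0.629, 0.483]) * n_respondents    # 'Do Not Drink' vs 'Boil Water'
comply = n_respondents - non_comply

table = np.array([non_comply, comply])                   # 2 x 2 contingency table
chi2, p, dof, expected = chi2_contingency(table.round())
print(f"chi2 = {chi2:.1f}, p = {p:.4f}")
```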
Higher non-compliance during the 'Do Not Drink' notice was traced to the public's limited knowledge of water notices and to folk beliefs about the protection offered by boiling water. We suggest that future information dissemination plans reduce reliance on official leaflets and maximise the potential of local media and personal networks. We also recommend that public health education programmes address insufficient and incorrect public knowledge about precautionary actions.
Lack of access to safe water remains a significant risk factor for poor health in developing countries. There has been little research into the health effects of frequently carrying containers of water. The aims of this study were to better understand how domestic water carrying is performed, identify potential health risk factors and gain insight into the possible health effects of the task.
Mixed methods of data collection were used to explore water carrying performed by people in six rural villages of Limpopo Province, South Africa. Data were collected through semi-structured interviews and through observation and measurement. Linear regression modelling was used to identify significant correlations between potential risk factors and rating of perceived exertion (RPE) or self-reported pain. Independent t-tests were used to compare the mean values of potential risk factors and RPE between sub-groups reporting pain and those not reporting pain.
Water carrying was mainly performed by women or children carrying containers on their heads (mean container weight 19.5 kg) over a mean distance of 337 m. The prevalence of spinal (neck or back) pain was 69% and of back pain was 38%. Of participants who carried water by head loading, the distance walked by those who reported spinal pain was significantly less than by those who did not (173 m, 95% CI 2 to 343; p = 0.048). For head loaders reporting head or neck pain compared with those who did not, the differences in weight of water carried (4.6 kg, 95% CI −9.7 to 0.5; p = 0.069) and in RPE (2.5, 95% CI −5.1 to 0.1; p = 0.051) were borderline statistically significant. For head loaders, RPE was significantly correlated with container weight (r = 0.52; p = 0.011) and incline (r = 0.459; p = 0.018).
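The sketch below reproduces, on simulated data, the general shape of the analyses described: an ordinary least squares regression of RPE on container weight and incline, and an independent t-test of distance walked between pain and no-pain sub-groups. All values and column names are invented for illustration.

```python
# Sketch of the regression and t-test analyses on simulated water-carrying data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(42)
n = 30
df = pd.DataFrame({
    "container_weight_kg": rng.normal(19.5, 3, n),
    "incline_deg": rng.uniform(0, 10, n),
    "spinal_pain": rng.integers(0, 2, n),
    "distance_m": rng.normal(337, 120, n),
})
df["rpe"] = 6 + 0.3 * df["container_weight_kg"] + 0.4 * df["incline_deg"] + rng.normal(0, 1, n)

# Linear regression of RPE on potential risk factors.
print(smf.ols("rpe ~ container_weight_kg + incline_deg", data=df).fit().summary())

# Independent t-test: distance walked in the pain vs no-pain sub-groups.
pain = df.loc[df["spinal_pain"] == 1, "distance_m"]
no_pain = df.loc[df["spinal_pain"] == 0, "distance_m"]
t, p = stats.ttest_ind(pain, no_pain, equal_var=False)
print(f"t = {t:.2f}, p = {p:.3f}")
```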
Typical water carrying methods impose physical loading with potential to produce musculoskeletal disorders and related disability. This exploratory study is limited by a small sample size and future research should aim to better understand the type and strength of association between water carrying and health, particularly musculoskeletal disorders. However, these preliminary findings suggest that efforts should be directed toward eliminating the need for water carrying, or where it must continue, identifying and reducing risk factors for musculoskeletal disorders and physical injury.
Cryptosporidium is a protozoan parasite that causes diarrheal illness in a wide range of hosts, including humans. Two species, C. parvum and C. hominis, are of primary public health relevance. Genome sequences of these two species are available and show only 3–5% sequence divergence. We investigated this sequence variability, which could correspond either to gaps in the published genome sequences or to the presence of species-specific genes. Comparative genomic tools were used to identify putative species-specific genes, and a subset of these genes was tested by PCR in a collection of Cryptosporidium clinical isolates and reference strains.
The majority of the putative species-specific genes examined were in fact common to C. parvum and C. hominis. Sequence analysis of the PCR products revealed interesting SNPs, the majority of which were species specific. These genetic loci allowed us to construct a robust multi-locus analysis. The resulting neighbour-joining phylogenetic tree clearly discriminated the previously described lineages of Cryptosporidium species and subtypes.
Most of the genes identified as species specific by bioinformatic analysis in Cryptosporidium spp. are in fact present in multiple species and only appear species specific because of gaps in the published genome sequences. Nevertheless, SNPs may offer a promising approach to studying the taxonomy of closely related Cryptosporidium species.
Infectious intestinal disease (IID), usually presenting as diarrhoea and vomiting, is frequently preventable. Though often mild and self-limiting, its commonness makes IID an important public health problem. In the mid-1990s, around 1 in 5 people in England suffered from IID each year, costing around £0.75 billion. No routine information source describes the UK's current community burden of IID. We present here the methods for a study to determine the rates and aetiology of IID in the community, presenting to primary care, and recorded in national surveillance statistics. We also outline methods to determine whether incidence has declined since the mid-1990s.
The Second Study of Infectious Intestinal Disease in the Community (IID2 Study) comprises several separate but related studies. We use two methods to describe IID burden in the community - a retrospective telephone survey of self-reported illness and a prospective, all-age, population-based cohort study with weekly follow-up over a calendar year. Results from the two methods will be compared. To determine IID burden presenting to primary care we perform a prospective study of people presenting to their General Practitioner with symptoms of IID, in which we intervene in clinical and laboratory practice, and an audit of routine clinical and laboratory practice in primary care. We determine aetiology of IID using molecular methods for a wide range of gastrointestinal pathogens, in addition to conventional diagnostic microbiological techniques, and characterise isolates further through reference typing. Finally, we combine all our results to calibrate national surveillance data.
Researchers disagree about the best method(s) to ascertain disease burden. Our study will allow an evaluation of methods to determine the community burden of IID by comparing the different approaches to estimate IID incidence in its linked components.
Waterborne disease is a major risk for small water supplies in rural settings. To assess the impact of an educational intervention designed to improve water quality and to estimate the contribution of water to the incidence of diarrhoeal disease in poor rural communities in Puerto Rico, a two-part study was undertaken.
An educational intervention was delivered to communities relying on community water supplies. The intervention consisted of student operators and administrators supervising and assisting community members who voluntarily "operate" these systems. These voluntary operators had no previous training and were principally concerned with seeing that some water was delivered; the quality of that water was not something they either understood or addressed. The impact of the intervention was measured through water sampling for standard bacteriological indicators and a frank pathogen. In addition, face-to-face epidemiological studies designed to determine the baseline occurrence of diarrhoeal disease in the communities were conducted. Some 15 months after the intervention, a further epidemiological study was conducted in both the intervention communities and in control communities that had not received any intervention.
Diarrhoeal illness rates over the four-week period prior to the intervention were 3.5%. Salmonella was isolated from all of 5 distributed-water samples prior to the intervention and from only 2 of 12 samples after the intervention. In the 15-month follow-up study, illness rates were lower in the intervention communities than in the control communities (2.5% vs 3.6%; RR = 0.70, 95% CI 0.43, 1.15), though this difference was not statistically significant. However, in the final Poisson regression model, living in an intervention system (RR = 0.318; 95% CI 0.137–0.739) and owning a dog (RR = 0.597; 95% CI 0.145–0.962) were negatively associated with illness, whilst size of system (RR = 1.006; 95% CI 1.001–1.010) and reporting problems with the sewage system (RR = 2.973; 95% CI 1.539–5.744) were positively associated with illness.
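A minimal sketch of a Poisson regression with a person-time exposure, of the kind summarised above, is given below on simulated data; the covariate names mirror those in the text, but the data and coefficients are invented, not the study's own model.

```python
# Sketch of a Poisson regression of illness on household-level covariates.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({
    "intervention": rng.integers(0, 2, n),
    "owns_dog": rng.integers(0, 2, n),
    "system_size": rng.integers(20, 200, n),      # hypothetical number of connections
    "sewage_problem": rng.integers(0, 2, n),
    "person_weeks": rng.integers(2, 10, n),
})
rate = np.exp(-3 - 0.8 * df["intervention"] + 1.0 * df["sewage_problem"])
df["illness_count"] = rng.poisson(rate * df["person_weeks"])

model = smf.glm(
    "illness_count ~ intervention + owns_dog + system_size + sewage_problem",
    data=df,
    family=sm.families.Poisson(),
    exposure=df["person_weeks"],   # person-time exposure; coefficients exponentiate to rate ratios
).fit()

print(np.exp(model.params))        # rate ratios (RR)
print(np.exp(model.conf_int()))    # 95% CIs on the RR scale
```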
Educational interventions directed both at identified individuals and at the community in general are a way of giving small communities with poor water quality the skills and knowledge to manage their own drinking water quality. Such interventions may also have important and sustainable health benefits, though further research, preferably using a randomised controlled trial design, is needed.
Understanding how risks to human health change as a result of seasonal variations in environmental conditions is likely to become of increasing importance in the context of climatic change, especially in lower-income countries. A multi-disciplinary approach can be a useful tool for improving understanding, particularly in situations where existing data resources are limited but the environmental health implications of seasonal hazards may be high. This short article describes a multi-disciplinary approach combining analysis of changes in levels of environmental contamination, seasonal variations in disease incidence and a social scientific analysis of health behaviour. The methodology was field-tested in a peri-urban environment in the Mekong Delta, Vietnam, where poor households face alternate seasonal extremes in the local environment as the water level in the Delta changes from flood to dry season. Low-income households in the research sites rely on river water for domestic uses, including provision of drinking water, and it is commonly perceived that the seasonal changes alter risk from diarrhoeal diseases and other diseases associated with contamination of water. The discussion focuses on the implementation of the methodology in the field, and draws lessons from the research process that can help in refining and developing the approach for application in other locations where seasonal dynamics of disease risk may have important consequences for public health.
Renal tract involvement is implicated in both early and late schistosomiasis, leading to increased disease burden. Although there are good estimates of the global disease burden due to renal tract disease secondary to schistosomiasis, it is often difficult to translate these estimates to local communities. The aim of this study was to assess the burden of urinary tract pathology and morbidity due to schistosomiasis in Zanzibar and to identify reliable clinical predictors of schistosomiasis-associated renal disease.
A cross-sectional comparison of Ungujan men and women living in areas of either high or low endemicity for urinary schistosomiasis was conducted using urine analysis with reagent strips, parasitological egg counts, portable ultrasonography and a qualitative case-history questionnaire. Data analysis used single- and multiple-predictor-variable logistic regression.
One hundred and sixty people were examined in the high endemic area (63% women and 37% men), and 101 people in the low endemic area (61% women and 39% men). In the high endemic area, egg-patent schistosomiasis and urinary tract pathology were much more common (p = 1 × 10⁻³ and 8 × 10⁻⁶, respectively) than in the low endemic area. Self-reported frothy urine, self-reported haematuria, dysuria and urgency to urinate were associated with urinary tract pathology (p = 1.8 × 10⁻², 1.1 × 10⁻⁴, 1.3 × 10⁻⁶ and 1.1 × 10⁻⁷, respectively) as assessed by ultrasonography. In a multi-variable logistic regression model, self-reporting of schistosomiasis in the past year, self-reporting of urgency to urinate and having an egg-positive urine sample were all independently associated with detectable urinary tract abnormality, consistent with schistosomiasis-specific disease. Having two or more of these features was moderately sensitive (70%) as a predictor of urinary tract abnormality, with high specificity (92%).
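The sketch below shows how such a "two or more of three features" rule can be scored against an ultrasound reference standard to give sensitivity and specificity; the small data set is invented, so the printed figures are not those of the study.

```python
# Scoring a simple two-of-three prediction rule against an ultrasound reference standard.
import pandas as pd

df = pd.DataFrame({
    "reported_schisto_past_year": [1, 1, 0, 0, 1, 0, 1, 0],
    "urgency_to_urinate":         [1, 0, 1, 0, 1, 0, 1, 0],
    "egg_positive":               [1, 1, 0, 0, 0, 0, 1, 1],
    "us_abnormal":                [1, 1, 0, 0, 1, 0, 1, 0],   # ultrasound-detected abnormality
})

features = ["reported_schisto_past_year", "urgency_to_urinate", "egg_positive"]
predicted = df[features].sum(axis=1) >= 2     # rule: at least two features present
truth = df["us_abnormal"] == 1

sensitivity = (predicted & truth).sum() / truth.sum()
specificity = (~predicted & ~truth).sum() / (~truth).sum()
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```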
Having at least two of the following three features, urgency to urinate, self-reported previous infection, and detection of eggs in the urine, was a good proxy predictor of urinary tract abnormality as detected by ultrasound.
Tonoplast intrinsic proteins (TIPs) are widely used as markers for vacuolar compartments in higher plants. Ten TIP isoforms are encoded by the Arabidopsis genome. For several isoforms, the tissue- and cell-specific patterns of expression are not known.
We generated fluorescent protein fusions to the genomic sequences of all members of the Arabidopsis TIP family whose expression is predicted to occur in root tissues (TIP1;1 and 1;2; TIP2;1, 2;2 and 2;3; TIP4;1) and expressed these fusions, both individually and in selected pairwise combinations, in transgenic Arabidopsis. Analysis by confocal microscopy revealed that TIP distribution varied between different cell layers within the root axis, with extensive co-expression of some TIPs and more restricted expression patterns for other isoforms. TIP isoforms whose expression overlapped appeared to localise to the tonoplast of the central vacuole, vacuolar bulbs and smaller, uncharacterised structures.
We have produced a comprehensive atlas of TIP expression in Arabidopsis roots, which reveals novel expression patterns for previously unstudied TIP isoforms.
The importance of a person's perceptions about the causes of their disease has been emphasised by research on various diseases. Several studies have found perception may be linked to protective behaviours.
This study aims to identify the main perceived causes of sporadic cryptosporidiosis and to analyse some of the factors that may influence respondents' perceptions. The role of respondents' attributions, the scientific plausibility of perceptions, and the importance of specific information sources are also explored.
Quantitative and qualitative analyses of data from a case‐control study.
General population in Wales and north west England.
The study is based on a sample of 411 respondents from Wales and north west England whose cryptosporidiosis diagnosis was confirmed by a laboratory.
The results show that the most frequently perceived causes are water (through drinking it or swimming in it), contagion (mostly from children), and contaminated food. Perceived causes are qualitatively similar to those described in the scientific literature, but some quantitative differences are evident. Respondents' certainty about the cause of their illness is directly related to plausibility. The most frequent information sources used by respondents were stool test results, environmental health officers, and doctors or nurses. The results suggest that information sources may influence perceptions of the causes of cryptosporidiosis. Qualitative data provided a few clues about situations in which sporadic and outbreak cases may be confused.
In contrast with outbreaks, people with sporadic cryptosporidiosis use various information sources in addition to the media, which in turn affect their perception of aetiology. This has implications for the dissemination of information about control measures for cryptosporidiosis and for surveillance activities.
cryptosporidiosis; perception; disease attribution; communication
Cryptosporidium parvum and Cryptosporidium hominis are two related species of apicomplexan protozoa responsible for the majority of human cases of cryptosporidiosis. In spite of their considerable public health impact, little is known about the population structures of these species. In this study, a battery of C. parvum and C. hominis isolates from seven countries was genotyped using a nine-locus DNA subtyping scheme. To assess the existence of geographical partitions, the multilocus genotype data were mined using a cluster analysis based on the nearest-neighbor principle. Within each country, the population genetic structures were explored by combining diversity statistical tests, linkage disequilibrium, and eBURST analysis. For both parasite species, a quasi-complete phylogenetic segregation was observed among the countries. Cluster analysis accurately identified recently introduced isolates. Rather than conforming to a strict paradigm of either a clonal or a panmictic population structure, data are consistent with a flexible reproductive strategy characterized by the cooccurrence of both propagation patterns. The relative contribution of each pattern appears to vary between the regions, perhaps dependent on the prevailing ecological determinants of transmission.
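A minimal sketch of nearest-neighbour (single-linkage) clustering of multilocus genotypes, in the spirit of the cluster analysis described above, is given below; the isolates, allele codes and distance threshold are all invented for illustration and do not represent the study's data or exact method.

```python
# Nearest-neighbour (single-linkage) clustering of multilocus genotypes.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Rows: isolates; columns: allele calls at each of nine typing loci (made-up values).
genotypes = np.array([
    [1, 2, 1, 3, 1, 1, 2, 1, 1],   # isolate A
    [1, 2, 1, 3, 1, 1, 2, 1, 2],   # isolate B (differs from A at one locus)
    [2, 1, 3, 1, 2, 2, 1, 3, 3],   # isolate C
    [2, 1, 3, 1, 2, 2, 1, 3, 3],   # isolate D (identical to C)
])

dist = pdist(genotypes, metric="hamming")    # fraction of mismatched loci between isolates
tree = linkage(dist, method="single")        # single linkage = nearest-neighbour joining
clusters = fcluster(tree, t=0.3, criterion="distance")
print(clusters)                              # e.g. [1 1 2 2]
```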
This paper presents the results of a meta-analysis of published transfer rates of antimicrobial resistance genes. A total of 34 papers were identified, of which 28 contained rates estimated in relation to either donor or recipient bacterial counts. The published rates ranged from 10⁻² to 10⁻⁹. Generalized linear modeling was conducted to identify the factors influencing this variation. Highly significant associations between transfer frequency and both the donor (P = 1.2 × 10⁻⁴) and recipient (P = 1.0 × 10⁻⁵) genera were found. Also significant were whether the donor and recipient strains were of the same genus (P = 0.023) and the nature of the genetic element (P = 0.0019). The type of experiment, in vivo or in vitro, approached statistical significance (P = 0.12). Parameter estimates from a general linear model were used to estimate the probability of transfer of antimicrobial resistance genes to potential pathogens in the intestine following oral ingestion. The mean logarithms of these probabilities are in the range of [−7.0, −3.1]. These probability distributions are suitable for use in the quantitative assessment of the risk of transfer of antimicrobial resistance genes to the intestinal flora of humans and animals.
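As an illustration of how such log-scale transfer probabilities might feed into a quantitative risk assessment, the sketch below draws Monte Carlo samples from an assumed log10-normal distribution of transfer probabilities; the distributional form, its parameters and the summary chosen are assumptions for illustration, not the paper's fitted model.

```python
# Monte Carlo sampling of an assumed distribution of log10 transfer probabilities.
import numpy as np

rng = np.random.default_rng(1)

# Assumed mean and spread on the log10 scale for one donor/recipient combination
# (the paper reports mean log10 values in roughly the range -7.0 to -3.1 overall).
mean_log10_p, sd_log10_p = -5.0, 1.0
transfer_prob = 10.0 ** rng.normal(mean_log10_p, sd_log10_p, size=10_000)

# Summary of the sampled probabilities, as might feed into a risk assessment model.
print(f"median = {np.median(transfer_prob):.2e}")
print(f"95% interval = {np.percentile(transfer_prob, [2.5, 97.5])}")
```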
Infectious intestinal disease (IID) surveillance data are an under-utilised information source on illness geography. This paper uses a case study of cryptosporidiosis in England and Wales to demonstrate how these data can be converted into area-based rates and the factors underlying illness geography investigated. Ascertainment bias is common in surveillance datasets, and we develop techniques to investigate and control this. Rural areas, locations with many livestock and localities with poor water treatment had elevated levels of cryptosporidiosis. These findings accord with previous research validating the techniques developed. Their use in future studies investigating IID geography is therefore recommended.
Surveillance data; GIS; Cryptosporidiosis; Environment; Water supply
To assess whether domestic kitchen hygiene is an important contributor to the development of diarrhoea in the developed world.
Electronic searches were carried out in October 2006 in EMBASE, MEDLINE, Web of Knowledge, the Cochrane central register of clinical trials and CINAHL. All publications, irrespective of study design, assessing food hygiene practices with an outcome measure of diarrhoea were included in the review. All included studies underwent data extraction, and the data were subsequently analysed by qualitative synthesis of the results. Given the substantial heterogeneity in study design and outcome measures, meta-analysis was not performed. In addition, the existing dataset of the UK IID study was reanalysed to investigate possible associations between self-reported diarrhoea and variables indicative of poor domestic kitchen hygiene.
Fourteen studies were included in the final analyses: 11 case-control studies, 2 cross-sectional surveys, and 1 randomised controlled trial. Very few studies identified any significant association between illness and environmental kitchen hygiene. Although some of the variables in the reanalysis of the UK IID study were statistically significant, no obvious trend was seen.
The balance of the available evidence does not support the hypothesis that poor domestic kitchen hygiene practices are important risk factors for diarrhoeal disease in developed countries.
With the increasing move towards outpatient therapeutic feeding for moderately and severely malnourished children, the home environment has become an important factor in achieving good program outcomes. Infections, including water-borne infections, may significantly delay weight gain in a therapeutic feeding program. This study examined the relationship between adequacy of water supply and children's length of stay in a therapeutic feeding program in Niger.
The length of stay in a therapeutic feeding program of Médecins Sans Frontières in Niger was registered for 1518 children from 20 villages in the region. In parallel, the quality and quantity of the water source in each village were documented, and the association between adequacy of the water supply and length of stay in the program was assessed through Generalized Estimating Equation analysis.
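The sketch below illustrates, on simulated data, a generalized estimating equation of length of stay on water-supply adequacy with clustering by village; the Gaussian working model, variable names and data are assumptions for illustration only, not the study's analysis.

```python
# Sketch of a GEE of length of stay on water-supply adequacy, clustered by village.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_villages, children_per_village = 20, 40
village = np.repeat(np.arange(n_villages), children_per_village)
water_adequate = np.repeat(rng.integers(0, 2, n_villages), children_per_village)
length_of_stay = rng.normal(30 - 6 * water_adequate, 8)   # shorter stays where water is adequate

df = pd.DataFrame({
    "village": village,
    "water_adequate": water_adequate,
    "length_of_stay_days": length_of_stay,
})

model = smf.gee(
    "length_of_stay_days ~ water_adequate",
    groups="village",                        # children clustered within village of origin
    data=df,
    family=sm.families.Gaussian(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```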
Thirty-six percent of the children presented with a secondary infection, 69% of which were water-related. When children were stratified by the adequacy of the quantity and/or quality of the water supply in their village of origin, an inadequate water supply was clearly associated with a higher prevalence of secondary water-related infections and with much longer lengths of stay of malnourished children in the therapeutic feeding program.
This study suggests that therapeutic feeding programs using an outpatient model should routinely evaluate the water supply in their target children’s villages if they are to provide optimal care.