Compliance with guideline removal targets for Cryptosporidium, which do not provide any credit for the inactivation of oocysts through wastewater treatment processes, can considerably increase the cost of providing recycled water. Here we present the application of an integrated assay to quantify both oocyst numbers and infectivity levels after various treatment stages at three Victorian and two South Australian (SA) wastewater treatment plants (WWTPs). Oocyst density in the raw sewage was commensurate with community disease burden, with early rounds of sampling capturing a widespread cryptosporidiosis outbreak in Victoria. The level of infectivity of oocysts in sewage was stable throughout the year but was significantly lower at the SA WWTPs. Removals across secondary treatment processes were seasonal, with poorer removals associated with inflow variability; however, no decrease in oocyst infectivity was identified. For SA WWTPs, those oocysts remaining within the secondary treatment-clarified effluent were proportionally more infectious than those in raw sewage. Lagoon systems demonstrated significant inactivation or removal of oocysts, with attenuation being seasonal. Examination of a UV system emphasized its efficacy as a disinfection barrier but conversely confirmed the importance of a multibarrier approach with the detection of infectious oocysts postdisinfection. The ability to characterize risk from infectious oocysts revealed that the risk from Cryptosporidium is significantly lower than previously thought and that its inclusion in quantitative risk assessments of reuse systems will more accurately direct the selection of treatment strategies and capital expenditure, influencing the sustainability of such schemes.
IMPORTANCE Here we present the application of a recently developed integrated assay not only to quantify the removal of Cryptosporidium oocysts but also to quantify their infectivity across various treatment stages at five wastewater treatment plants (WWTPs), thereby better measuring the “true effect” of the treatment train on oocyst risk reduction. For a number of the WWTPs analyzed in this study, the risk is significantly lower than previously thought. Therefore, the inclusion of oocyst infectivity in guideline values and in quantitative microbial risk assessment (QMRA) has the potential to affect future treatment directions and capital expenditure.
Reusing water traditionally seen as wastewater can provide additional sources of the resource while also reducing unwanted inputs into the environment (1, 2). Nevertheless, it is paramount that the health of both the public and the environment is protected, as reuse water has the potential to pose a high risk from viruses, bacteria, and protozoans (1, 3, 4). Therefore, water destined for reuse must be treated accordingly and be fit for purpose (5). The protozoan parasite Cryptosporidium is problematic for the water industry, having a significant impact on wastewater treatment requirements for producing reuse water due to the potential exposure of the community to this parasite (6–9). The ever-present risk of Cryptosporidium has therefore required the implementation of additional treatment processes where recycling is utilized, which may include alternative disinfection methods such as the use of UV and ozone (due to the innate resistance of Cryptosporidium oocysts to chlorine-based disinfection) or filtration for effective oocyst removal (10, 11).
Guideline values have traditionally set log10 removal targets based on end-use application and, where inadequate or no specific monitoring data exist, have set default values that can assist providers in safely and sustainably managing recycled-water schemes and wastewater discharge (12, 13). This requirement enhances the need for the multiple-barrier approach as well as hazard analysis and critical control point (HACCP) principles. However, the removal of Cryptosporidium through treatment processes has been demonstrated to be variable and often plant dependent (3). This can significantly increase the cost of providing recycled water that satisfies guidelines by requiring the inclusion of additional treatment processes and, in some cases, necessitating a retrofit of additional processes to existing treatment plants (5). In contrast, the inactivation of Cryptosporidium oocysts that may occur throughout the wastewater treatment train is not credited in these guidelines. This has been due to the absence of a satisfactory tool capable of determining oocyst density (number density) and infectivity from a single-sample concentrate. This has meant that multiple grab samples must be processed to conduct different analyses, increasing analytical costs, or that sample concentrates need to be split, compromising detection limits and accuracy. Forgoing comprehensive risk analysis means that oocyst risk may be significantly overestimated, resulting in the inclusion of additional treatment barriers which may be costly and redundant.
Cryptosporidium species are currently monitored in wastewaters using standard detection methodologies (methods 1622 and 1623 [14, 15]). However, these methods provide no information on the infectivity of the detected oocysts. Recently, we developed an integrated Cryptosporidium assay capable of determining oocyst density, infectivity, and genotype from a single-sample concentrate that can be used for source waters, wastewaters, and reuse water (16, 17). This assay obviates the need for processing multiple grab samples or splitting sample concentrates for separate analyses, enabling comprehensive risk assessment to be undertaken.
Here we present the application of this assay to quantify both the removal and inactivation of oocysts at various treatment stages across five wastewater treatment plants (WWTPs), thereby more accurately measuring the “true effect” of the treatment train on oocyst risk reduction. The impact of seasonality on both oocyst challenge and attenuation was examined, as well as the influence that operational parameters have on the fate of oocysts across the treatment train. Additionally, we compare data generated from this work to default removal values given in the Australian Guidelines for Water Recycling (AGWR) (12) to illustrate how the incorporation of infectivity into risk analysis can significantly change the Cryptosporidium risk. Finally, we provide a worked example of how infectivity data can be included into a hypothetical operational quantitative risk assessment at a major South Australian WWTP to determine if recycled water is fit for purpose under various scenarios.
A review of the data generated after the first round of sampling was undertaken to examine the effectiveness of the sampling strategy, i.e., whether the sampling points chosen were sound and if there was sufficient sample replication employed. While oocyst densities were comparable between duplicate samples, the number of oocysts recovered across the WWTPs from mixed liquor (a combination of raw or unsettled wastewater and activated sludge [AS]) was low. This was because only small sample volumes could be processed, owing to the poor quality of the sample matrix (large bacterial flocs that could not be dissolved) and the increased amount of immunomagnetic separation (IMS) reagent needed to capture oocysts, which made processing cost prohibitive. Therefore, it was decided to remove the mixed liquor sampling points from the subsequent sampling rounds and increase replication at other sample points to generate more-comprehensive data sets. Oocyst recoveries across five WWTPs for raw sewage, primary effluent, and secondary and tertiary treatment effluent were all within an acceptable range (Western Treatment Plant, 40% ± 14%; Altona, 31% ± 13%; Mt. Martha, 35% ± 13%; Aldinga, 37% ± 14%; Glenelg, 39% ± 17%).
A large increase in the cryptosporidiosis notification rate occurred between January 2013 and May 2013 for Victoria (Fig. 1). South Australia experienced a smaller spike in the notification rate during the same period. During March 2013, a number of announcements in relation to cryptosporidiosis were issued to the public by the Victorian Department of Health, which subsequently determined that the outbreak originated from recreational swimming pools.
This increased burden of cryptosporidiosis in both Victoria and South Australia was reflected by concomitant spikes in the raw sewage oocyst densities challenging the WWTPs. Mean oocyst density for the raw sewage for each individual WWTP was mapped to date, season, and monthly notification rate for each state (Table 1). The greatest oocyst densities occurred during the periods when notification rates were highest. Interestingly, large oocyst densities were evident as early as 6 February at Mt. Martha, coincident with an increase in the notification rate and prior to any statements issued to the public by the Victorian Department of Health relating to cryptosporidiosis. A second, smaller spike in raw sewage oocyst numbers was also evident for both Altona and the Western Treatment Plant in July which occurred while the Victorian notification rate was still considerably greater than winter levels from the previous decade.
For all WWTPs, oocyst density in the raw sewage after the initial spike progressively decreased throughout the year, coinciding with a reduction in the cryptosporidiosis notification rates. Accordingly, mean oocyst density in the raw sewage at each individual WWTP exhibited a robust linear relationship with the state-wide cryptosporidiosis notification rate for the month in which the raw sewage was sampled (Western Treatment Plant r2 = 0.93; Altona r2 = 0.64; Mt. Martha r2 = 0.78; Aldinga r2 = 0.87; Glenelg r2 = 0.88). However, this relationship would be substantially weaker or would not hold if data from the spikes in oocyst density were removed from the data set.
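The per-plant regressions above are ordinary least-squares fits of mean raw sewage oocyst density against the monthly notification rate. A minimal sketch of the calculation follows; the monthly pairs are made up for illustration, since the underlying measurements are not reproduced in the text:

```python
import numpy as np

# Hypothetical monthly pairs (state-wide notification rate vs. mean oocysts/L
# in raw sewage) -- illustrative values only, not the study data.
rate = np.array([2.1, 4.5, 9.8, 7.2, 3.3, 1.8])         # notifications/100,000/month
oocysts = np.array([800, 2100, 5200, 3900, 1500, 700])  # mean oocysts/L

# Fit a straight line and compute the coefficient of determination (r^2).
slope, intercept = np.polyfit(rate, oocysts, 1)
predicted = slope * rate + intercept
ss_res = ((oocysts - predicted) ** 2).sum()
ss_tot = ((oocysts - oocysts.mean()) ** 2).sum()
r_squared = 1 - ss_res / ss_tot
print(f"r^2 = {r_squared:.2f}")
```

As the text notes, such a fit is sensitive to high-leverage points: removing the outbreak-driven spikes from the data would markedly weaken the relationship.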
Whereas oocyst density varied according to the level of cryptosporidiosis in the community, the level of infectivity of the oocysts in the raw sewage challenging each of the plants was stable throughout the year, with seasonality exhibiting no discernible effect on oocyst infectivity. However, the oocyst infectivity in the raw sewage was substantially lower for the South Australian plants than in the raw sewage challenging the Victorian plants (P < 0.05) (Table 2).
The oocyst challenge (density) in the raw sewage across all five WWTPs was compared to values stated in the Australian Guidelines for Water Recycling (12) (Table 3). The AGWR (12) identify values ranging from 0 to 10,000 oocysts per liter. The data generated from this study suggest that the oocyst range quoted in the guidelines is appropriate, with oocyst densities in the raw sewage at 4 of the 5 WWTPs either below or not markedly exceeding the upper value (10,000 oocysts/liter) quoted in the guidelines. As pathogen concentrations can differ over a wide range, 95th percentiles are often used in determining health-based targets, and the AGWR (12) currently set the default value at 2,000 oocysts/liter. While the 95th percentile for total oocysts (oocysts/liter) from data generated from this project from the South Australian WWTPs was very similar to the AGWR (12) default value, all three Victorian WWTPs had greater 95th percentiles for total oocysts (oocysts/liter) in the raw sewage, with the 95th percentile for the Mt. Martha WWTP being an order of magnitude higher than the default value (Table 3). There is currently no reference in the AGWR (12) to the number of infective oocysts in the raw sewage. However, applying the methodology described above to only infective oocysts, the magnitude of the risk profile within the raw sewage decreased across all plants (Table 3). In terms of risk reduction values (RRV) (log10 values for total oocysts versus infectious oocysts), the reduction in risk ranged between 0.27 log10 and 1.36 log10 for the five WWTPs investigated. Examining the combined data, the risk reduction was greatest for the South Australian context (0.76 log10) as the infectivity in the raw sewage was low for both SA plants. The reduction in risk for the combined Victorian data was more moderate (0.42 log10) due to a higher oocyst infectivity fraction in the sewage.
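The risk reduction value is simply the log10 ratio of total to infectious oocysts. A small sketch of the arithmetic, using the AGWR default density and an assumed infectious fraction chosen only to illustrate the scale of the SA result:

```python
import math

def risk_reduction_value(total_oocysts, infectious_oocysts):
    """RRV: log10 difference between total and infectious oocyst densities."""
    return math.log10(total_oocysts / infectious_oocysts)

# AGWR default of 2,000 oocysts/L with an assumed 17% infectious fraction
# (hypothetical; chosen to be near the ~0.76 log10 reported for SA).
total = 2000.0
infectious = total * 0.17
print(f"RRV = {risk_reduction_value(total, infectious):.2f} log10")
```

Because the RRV depends only on the infectious fraction, it can be applied to any raw sewage density once that fraction has been characterized for a given plant.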
Oocyst removals for secondary treatment processes were highly variable for the five WWTPs, ranging from as low as 0.21 log10 to as high as 3.27 log10 (Table 4). The mean removal for secondary treatment across all five WWTPs was 1.43 log10 with a calculated 5th percentile of 0.46 log10, comparable with the indicative values for secondary treatment for Cryptosporidium stated in the AGWR (12) (range, 0.5 to 1.05 log10). Water quality, operational, and climate data for all WWTPs were obtained for the period of the investigation and were examined to determine if trends or variations occurred that may explain treatment performance. The majority of poorer removals for secondary treatment occurred in winter and early spring (Table 4), coinciding with substantial increases in volume and/or significant flow variability in the influent (see Text S3 and Fig. S1 to S4 in the supplemental material).
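The log removal values above are computed as the log10 ratio of influent to effluent oocyst density. A minimal helper, with illustrative densities chosen to span the reported range for secondary treatment:

```python
import math

def log_reduction(influent_per_litre, effluent_per_litre):
    """LRV across a treatment barrier: log10(influent / effluent)."""
    return math.log10(influent_per_litre / effluent_per_litre)

# Illustrative (not measured) densities spanning the reported LRV range:
print(round(log_reduction(1000, 617), 2))   # weakest removal, ~0.21 log10
print(round(log_reduction(1000, 0.54), 2))  # strongest removal, ~3.27 log10
```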
The Altona WWTP experienced only moderate flow variability (see Text S3 and Fig. S5), but a strong correlation was found between the oocyst log10 reduction values (LRVs) and variations in inflow and outflow, with Cryptosporidium LRVs decreasing proportionally with increased deviation of the daily flow from the yearly average daily flow (Fig. 2). This flow variability was reflected in the suspended solids of the raw sewage, with increasing levels of suspended solids correlating with decreasing oocyst LRVs (r2 = 0.81). Environmental temperature was also found to correlate with oocyst removals across the secondary treatment process at a number of the WWTPs investigated (see Text S4 and Fig. S6 and S7).
Though oocysts were effectively removed by secondary treatment processes, there was no decrease in the infectious oocyst fraction following activated sludge treatment (Table 2). With the exception of the Western Treatment Plant, there was a trend for the infectious fraction of those oocysts remaining in the secondary clarified effluent to be higher than that of the oocysts in raw sewage, and this was significant for both the Glenelg and Aldinga WWTPs (P < 0.05). There was no discernible correlation between oocyst infectivity following secondary treatment and any of the operational, water quality, or climatic parameters investigated.
Two lagoon systems were investigated: a single lagoon at Aldinga WWTP receiving chlorinated secondary treatment-clarified effluent and a connected series of lagoons at the Western Treatment Plant. For the Aldinga WWTP, removals ranged between 0.54 log10 and 1.75 log10 (mean, 1.04 log10), with no clear seasonal effect. For the Western Treatment Plant, samples were taken across a linear treatment train encompassing raw sewage, anaerobic ponds, and a subsequent series of lagoons (see Text S1). No oocyst removals from the raw sewage through the anaerobic ponds were apparent. However, substantial removals through the ensuing lagoon system were evident (Fig. 3), typically increasing incrementally along the treatment train (for raw sewage through to lagoon 6: range, 1.57 log10 to 3.84 log10; mean, 2.62 log10). Removals within the lagoon system at the Western Treatment Plant were seasonal, with poorer performances occurring during winter and spring. Additionally, removals through lagoons 4 and 6 were found to demonstrate robust linear relationships with the water temperature of each lagoon (r2 = 0.78 and 0.88, respectively; water temperature data for lagoon 2 were not available).
In addition to extensive oocyst removals occurring for both lagoon systems, substantial inactivation of those oocysts remaining was evident. For Aldinga WWTP lagoon 1, oocyst inactivation ranged between 0.23 log10 and >1.53 log10, with no apparent correlation with season. For the Western Treatment Plant, small but consistent reductions in oocyst infectivity were apparent after the anaerobic ponds (data not presented), with inactivation levels as high as 0.62 log10. Additionally, substantial oocyst inactivation was identified post-lagoon 2 at the Western Treatment Plant; notably, for the first round of sampling, 268 oocysts were applied to cell culture but only 2 were infective (Fig. 4), with no infective oocysts detected in the subsequent lagoons for this sampling round. Overall, the inactivation ranged from 0.36 log10 to 1.56 log10. Unlike the Aldinga WWTP, the level of inactivation was influenced by seasonal factors, with strong linear relationships with air temperature (r2 = 0.80 [maximum average] and 0.79 [minimum average]; water temperature data not available) and solar radiation (r2 = 0.75) (data not shown). Inclusion of infectivity data in the log removal calculations considerably increased the values for oocyst attenuation in lagoons compared with using total oocyst numbers alone (Fig. 5).
The Altona WWTP was included in this study because it utilizes UV disinfection, which is known to be highly efficacious for inactivating Cryptosporidium. While secondary treatment at the Altona WWTP resulted in moderate oocyst removals (range, 0.49 log10 to 1.33 log10) and no oocyst inactivation, UV disinfection provided a substantial barrier to oocyst risk, providing an additional >1.48 log10 to >2.77 log10 reduction in oocyst infectivity on top of the removals through the intermittently decanted extended aeration (IDEA) reactor tanks. Two large oocyst density challenges to the plant were captured during the project (Table 1). No infectious oocysts were detected after UV disinfection for the first and largest challenge. However, after the second challenge, a small number of infectious oocysts were detected post-UV disinfection, coinciding with poorer oocyst removals across the secondary treatment and spikes in suspended solids and turbidity in the effluent. During this round, however, the UV disinfection plant still provided a 2.24 log10 reduction in oocyst infectivity. No infectious oocysts were detected in any of the other sampling rounds.
The variability in sewage concentrations of total and infectious oocysts for South Australian data was further characterized by fitting probability density functions (PDFs). Results showed that both the log10-transformed numbers of total oocysts per liter in sewage and the infectious fraction (percentage of total oocysts that were infectious) followed a normal distribution (see Text S5 and Fig. S8). By using these two PDFs, a Monte Carlo simulation was run to compute a PDF to describe the number of infectious oocysts in raw sewage, which subsequently represented the “risk reduction value” that could be assigned for describing the difference in the numbers of total oocysts and infectious oocysts in raw sewage (see Text S5 and Fig. S9). This approach demonstrated that the mean difference in total and infectious oocyst numbers in sewage was 0.95 log10 units, with 5th and 95th percentile values of 0.49 log10 and 1.76 log10 units, respectively.
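The Monte Carlo step described above can be sketched as follows. The normal-distribution parameters here are placeholders (the fitted values are given in Text S5), so the computed percentiles will not reproduce the reported 0.49/0.95/1.76 log10 figures:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Assumed PDF parameters -- placeholders for the fitted values in Text S5.
log10_total = rng.normal(loc=3.0, scale=0.6, size=n)              # log10 oocysts/L
infectious_frac = np.clip(rng.normal(0.15, 0.07, size=n), 1e-3, 1.0)

# Infectious oocysts/L and the resulting risk reduction value (RRV):
log10_infectious = log10_total + np.log10(infectious_frac)
rrv = log10_total - log10_infectious   # equals -log10(infectious fraction)

for p in (5, 50, 95):
    print(f"{p}th percentile RRV: {np.percentile(rrv, p):.2f} log10")
```

Sampling the two fitted distributions independently, as sketched here, assumes oocyst density and infectious fraction are uncorrelated, which is consistent with the lack of a seasonal effect on infectivity reported above.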
A quantitative microbial risk assessment (QMRA) was performed to assess the potential health impacts for a number of hypothetical scenarios occurring at the Bolivar WWTP with the QMRA input assumptions described in Table 5. Figure 6 summarizes the estimated disability-adjusted life year (DALY) values for each scenario. The estimated DALY value for scenario A (95th percentile, ≤10⁻⁷) was well within the benchmark DALY threshold, considered to represent a “tolerable risk.” Incorporation of the RRV (scenario B) to account for oocyst infectivity further decreased the estimated DALY value by 1 order of magnitude. By modifying the Monte Carlo model, we explored two hazardous event scenarios (C and D) which assumed a partial failure of the dissolved air flotation filtration (DAFF) process. For scenario C, the estimated DALY value with DAFF treatment failure was just within the target benchmark, where the 95th percentile DALY value was estimated to be 10⁻⁶. Under the same conditions but incorporating the RRV (scenario D), the estimated DALY value was still well within the target benchmark specified in the AGWR (12), illustrating the benefit of accurately measuring oocyst infectivity to better inform risk data.
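A skeletal version of such a Monte Carlo QMRA is sketched below. Every parameter here (raw sewage distribution, treatment-train LRV, dose-response coefficient, exposure volume and frequency, illness probability, DALY weighting) is an assumption for illustration only, not a Table 5 input:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Assumed inputs (illustrative only):
raw = 10 ** rng.normal(3.0, 0.5, n)      # total oocysts/L in raw sewage
lrv = rng.normal(5.0, 0.5, n)            # overall treatment-train LRV
volume_l = 0.001                         # litres ingested per exposure event
events = 50                              # exposure events per year

# Dose per event, then annual probability of infection via an exponential
# dose-response model (coefficient r = 0.2 is assumed).
dose = raw * 10 ** (-lrv) * volume_l
p_inf = 1 - np.exp(-0.2 * dose)
p_inf_year = 1 - (1 - p_inf) ** events

# Assumed P(illness | infection) = 0.7 and 1.5e-3 DALYs per illness case.
daly = p_inf_year * 0.7 * 1.5e-3

print(f"95th percentile DALY per person-year: {np.percentile(daly, 95):.1e}")
```

A plant-level credit such as the RRV would enter this model as an additional log10 term subtracted from the effective oocyst challenge, which is why scenarios B and D sit an order of magnitude below their counterparts.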
Guideline removal targets for Cryptosporidium can significantly increase the cost of providing recycled water. However, guidelines do not provide credit for the inactivation of Cryptosporidium oocysts by wastewater treatment, resulting in probable overestimation of risk. In part, this has been due to the absence of a satisfactory tool to measure oocyst infectivity and hence to a resultant dearth of knowledge of the true effect of treatment on Cryptosporidium oocysts. Here we applied a recently developed integrated Cryptosporidium assay capable of measuring oocyst numbers and infectivity from a single grab sample (16) to quantify both removal and inactivation of oocysts at various treatment stages across five wastewater treatment plants.
Estimation of Cryptosporidium risk from wastewater and determination of treatment requirements for producing reuse water are greatly influenced by the number of oocysts in the raw sewage. As evident from this work and previous studies (11), oocyst numbers can differ markedly; therefore, the magnitude of the challenge to a WWTP can also fluctuate considerably. This challenge is related to the burden of cryptosporidiosis within the community. In the case of South Australia and Victoria, this has often been linked to recreational swimming, with concomitant spikes in disease notification rates from summer through autumn. This seasonality was reflected in the raw sewage oocyst densities for all five WWTPs, showing that the cryptosporidiosis notification rate is a robust predictor of the oocyst density within raw sewage. Consequently, historical data from disease surveillance could provide a valuable resource to utilities operating WWTPs, representing the impact of seasonality on the oocyst challenge and, in concert with sampling data from raw sewage, allowing the development of predictive models.
This study captured a widespread cryptosporidiosis outbreak within Victoria, reflected by a sizable increase in oocyst numbers within the raw sewage. Significantly, this large increase in oocyst numbers was identified before any statements were issued by the relevant health authority. This offers the prospect of using sewage epidemiology to identify potential cryptosporidiosis outbreaks as a faster method than traditional surveillance. Moreover, an opportunity also exists to use the integrated Cryptosporidium assay to identify the genotypes and species of Cryptosporidium infecting the community, assisting authorities with disease surveillance and identifying potential sources of infection. Additionally, sewage epidemiology may be a more accurate predictor of community disease burden due to significant underreporting of certain diseases in the community (18–20). This may be the cause for the weaker relationship between notification rate and oocyst density seen when oocyst density spikes were removed from the data. Sewage epidemiology previously gained prominence because of its use in quantifying illicit drug consumption at the community level (21–23). However, it has been highlighted more recently as an avenue to quantify community health across a much broader range of parameters, with such information providing an early warning to facilitate intervention for improvement of community health (24–26).
Based on the monitoring data from the South Australian WWTPs, the current guideline values (12) reasonably capture the variability of oocyst numbers in raw sewage. Future revisions of these guidelines, however, may need to reflect the impacts of seasonality on oocyst density and treatment performance, as well as the sizeable increase in raw sewage oocyst numbers that can occur due to a cryptosporidiosis outbreak. Similar high levels of oocysts in sewage have also been reported in the literature (8, 27), but to our knowledge this is the first report directly relating oocyst density in raw sewage to cryptosporidiosis notification rates for a community. As suggested above, there is an opportunity for predictive modeling to assist in the determination of any revised default values without the need for exhaustive sampling and analysis.
Aside from using total oocyst numbers in the raw sewage, the estimation of risk confronting WWTPs can be further refined by considering oocyst infectivity. We initially hypothesized that, as was found for oocyst density, oocyst infectivity would have a seasonal dimension at each of the WWTPs. However, there was little evidence for this, with no commensurate change in the infectious oocyst fraction as a function of season or with increased disease burden in the community. Interestingly, a large difference between the South Australian and Victorian raw sewage samples in the levels of oocyst infectivity was apparent. The factors governing oocyst infectivity in raw sewage remain unknown and may be specific to each individual WWTP. Possible factors influencing oocyst infectivity in raw sewage include sewer detention time, catchment size/complexity, environmental factors, and the relative proportions of industrial and domestic effluent. The SA sewer networks have longer residence times, and increased holding times may prolong oocyst exposure to hostile conditions in the sewer such as the presence of hydrogen sulfide and sulfuric acid.
Guidelines (12, 13) typically have not incorporated considerations of oocyst infectivity in deriving water quality targets to protect public health. Inclusion of infectious oocyst risk as a simple “risk reduction value” substantially reduced the apparent risk of the challenge confronting the WWTP for all plants investigated, particularly for the South Australian WWTPs due to the lower infectious fraction identified in the raw sewage. This reduction in risk can have significant benefits for recycled-water schemes as it may demonstrate that treated water is fit for purpose without the requirement for further treatment, and it may enable changes to some treatment processes to reduce operating costs or relaxation of some on-site controls and use restrictions or provide an additional barrier to protect human health at no additional cost in the event of a process upset as demonstrated here using QMRA. Nevertheless, the broader inclusion of oocyst infectivity data in reuse risk assessments would warrant a comprehensive understanding of locale-specific factors affecting oocyst infectivity.
Recycled-water guidelines often specify combinations of treatment processes together with on-site controls and usage restrictions to provide water of acceptable quality for identified uses. In this study, we covered a selection of treatment processes, including a variety of secondary treatment processes, lagoon storage, and UV disinfection.
Secondary treatment processes have been reported to be highly variable and, by some accounts, ineffective in removing Cryptosporidium oocysts (11). This has influenced the conservatism within guidelines, such as in the AGWR (12), where indicative log removal values for secondary treatment range between 0.5 and 1.05 log10. However, though this study also confirmed the highly variable nature of secondary treatment in oocyst removal, we identified a substantial seasonal influence contributing to this variability, in particular, flow variability and groundwater intrusions into sewer systems following winter/spring rainfall. As secondary treatment is essentially a large bioreactor designed to achieve nitrification and denitrification, it is not surprising that its biomass may be susceptible to being washed out or otherwise affected by rapid changes in the water quality of the influent, reducing the possibility of the bacterial floc interacting with and removing oocysts. Significantly for the sites studied, poorer oocyst removals across secondary treatment coincided with diminished oocyst challenges, while improved oocyst removals corresponded with the larger challenges that occurred during the summer/autumn months. The nexus between oocyst challenge, removal performance, and seasonal drivers has the potential to significantly change the risk associated with recycled water and presents opportunities for reducing costs of production without compromising human health. However, this may be locale dependent, and, in some settings, increased oocyst challenges may coincide with poorer removal performances as a result of differences arising from community disease dynamics and seasonality.
Although oocysts were removed by activated sludge treatment, the infectivity of the remaining oocysts in the clarified effluent was higher than that of the oocysts in raw sewage for 4 of the WWTPs investigated and was significantly so for Aldinga and Glenelg. This may have been due to dead or damaged oocysts being more readily removed by the activated sludge process (ASP), possibly due to changes in oocyst wall proteins and/or oocyst density affecting interactions with biological flocs. Notably, this emphasizes the importance of tertiary treatment due to the risk posed by infectious oocysts remaining in secondary treatment effluent.
Lagoons (wastewater stabilization ponds) and constructed wetlands provide attractive low-technology and low-energy-use solutions for treating wastewater. Provided that detention times are adequate and sediment resuspension is minimal, lagoons and constructed wetlands can provide excellent removal of oocysts from wastewater. Reflecting this, the AGWR (12) state an indicative removal range of 1.0 to 3.5 log10. In this investigation, removals achieved for the single lagoon at Aldinga WWTP were closer to the lower indicative value, whereas removals achieved at the Western Treatment Plant for multiple lagoons were closer to the upper indicative value. While sedimentation is thought to be the major mechanism responsible for removal, predation by zooplankton may also make a substantial contribution. This is most likely the explanation for the highly seasonal removals at the Western Treatment Plant, which were restricted by water temperature, a factor constraining biotic antagonism. Notably, the removal of oocysts at Aldinga WWTP was not affected by season. However, the secondary treatment influent entering the lagoon is chlorinated, a condition which would greatly reduce biotic activity. If this were the case, sedimentation might be more significant in contributing to oocyst removal at the Aldinga WWTP as a result of process design.
Unlike secondary treatment processes, which facilitate only physical removal of oocysts, the lagoon systems investigated in this study provided both oocyst removal and substantial oocyst inactivation. This was strongly seasonal for the Western Treatment Plant, with the greatest inactivation occurring in the summer months. Available climatic and water quality data from the Western Treatment Plant suggested that this inactivation was due to the combined effects of temperature, solar radiation, and biotics. All three parameters have previously been demonstrated to rapidly inactivate and remove oocysts (28–30). These results confirm previous reports identifying the importance of lagooning for both the removal and the inactivation of Cryptosporidium oocysts (11). By incorporating infectivity into risk calculations, the effectiveness of lagoons as a barrier was confirmed, emphasizing the attractiveness of this technology for treating wastewater.
Attenuation of oocysts across secondary treatment and lagoon storage needs to be placed into the context of future climate variability, as climate change can affect the drivers behind both treatment performance and oocyst challenge. For example, while peak oocyst densities and poor secondary treatment removals occur in different seasons, the temporal separation between these two periods was small. A climate scenario where significant rainfall occurs earlier in the season, combined with longer periods of recreational swimming stemming from increased temperatures, might result in an overlap between high levels of oocyst challenge and poor treatment performance, exacerbating risk. In contrast, for lagoon storage, warmer temperatures would augment temperature-derived oocyst inactivation and possibly increase removal via enhanced biotic predation. However, as a multitude of interrelated biotic and abiotic factors are capable of affecting oocyst attenuation, careful modeling would need to be undertaken to determine future climate-driven scenarios.
UV disinfection is highly effective at inactivating Cryptosporidium and so is often incorporated within treatment trains (31–33), with the AGWR (12) stating an indicative inactivation value of >3.0 log10 for UV light. While densities in individual replicates of raw sewage were as high as 11,000 oocysts/liter at Altona, the number of oocysts directly challenging the UV disinfection plant was reduced substantially as a result of secondary treatment removal. This diminished our ability to quantify oocyst inactivation values greater than 3.0 log10 as a result of UV disinfection, with inactivation quantified in this study at levels ranging between >1.48 log10 and >2.77 log10. However, two distinct high-oocyst-challenge events occurred; during the second, infective oocysts were detected in the UV effluent and a precise inactivation value of 2.24 log10 was calculated. While UV disinfection is an effective barrier, the detection of infective oocysts within the effluent under an adverse water quality scenario emphasizes the importance of a multibarrier approach. In this case, the UV-treated effluent had the additional treatment barrier of reverse osmosis. For managing human health risks, treatment validation is essential because of the magnitude of the risks to health from using recycled water. This means that log reduction data provided by designers or manufacturers of treatment systems cannot be assumed to be valid; some objective empirical evidence of performance is required.
To illustrate the value of incorporating oocyst infectivity data into risk management of a reuse scheme, we explored two hazardous event scenarios based on the partial failure of a treatment barrier at a WWTP in South Australia. The incorporation of the infectivity data into the QMRA demonstrated that the plant would still be within safety limits as defined by DALY values for reuse water for municipal dual reticulation. However, without considering oocyst infectivity data, the reuse water would be only just within the required safety threshold. Under such scenarios, a utility undertaking QMRA without incorporating oocyst infectivity considerations might invest in an unnecessary treatment barrier and incur considerable costs. For example, inclusion of an additional barrier such as a UV plant providing a dose of 40 mJ/cm2 has estimated operational and maintenance costs ranging from US $19,000 a year for a 41.6 megaliters (ML)/day facility to $600,000 a year for a 794.9 ML/day facility, with capital costs ranging from $1.1 million to $22 million (34). Conducting such hypothetical exercises might also identify opportunities for changes to operational parameters (e.g., relaxation of turbidity targets for DAFF and therefore reduced coagulation expenditure) when extra treatment performance is not warranted. It should be noted that the QMRA example presented here is rudimentary and that such a hypothetical scenario could be further expanded to include other hazardous events such as increases in oocyst numbers following an outbreak or further treatment credits, including the substantial oocyst inactivation observed in lagoon systems. While Monte Carlo analysis was used in this QMRA exercise, more-sophisticated statistical tools such as Bayesian frameworks may offer the ability to account for the significant seasonal elements (treatment and challenge) identified in this study, adding additional precision and flexibility to risk analysis.
In conclusion, this project quantified both the removal and inactivation of oocysts by various stages of the wastewater treatment train, thereby accurately quantifying the “true effect” of a particular treatment train on oocyst risk reduction. For a number of WWTPs, the results demonstrated that the risk is significantly less than previously thought. Therefore, the inclusion of oocyst infectivity data into LRVs has the potential to direct the selection of treatment strategies and capital expenditure. Furthermore, we have shown that removals across the plants investigated were highly seasonal, with better removals outside the winter months. This is significant as the greatest oocyst challenges in these locales are more likely to occur in the summer/autumn months, when plant performance was greatest for the plants investigated and when demand for reuse is highest. Generating the most accurate data on which risk models are built will not only protect public health but also eliminate treatment process redundancy, thereby enhancing the sustainability and future of recycled-water schemes. However, overly conservative approaches incurring additional costs may jeopardize their future. Finally, we highlight the opportunity that sewage epidemiology presents as an avenue for quantifying community health, in addition to identifying the prospect of using historical data from disease surveillance systems to assist in predictive modeling of oocyst challenges to wastewater treatment plants.
Three Victorian and two South Australian WWTPs (Table 6; see also Text S1 in the supplemental material) were chosen to represent a variety of treatment processes. Samples were collected from across the treatment trains to include raw sewage, primary effluent, and secondary and tertiary treatment effluent (where applicable), covering the major treatment processes applied at each plant (see Text S2 and Tables S1 to S5 in the supplemental material). Wastewater samples were collected in duplicate or triplicate over a 12-month period between January and December 2013, with a minimum of six sampling rounds undertaken at each wastewater treatment plant. Sampling rounds were spaced throughout the year to collect samples from each of the WWTPs in summer, autumn, winter, and spring to account for any seasonal impacts on oocyst numbers or treatment plant performance.
A review of the data generated after the first round of sampling was undertaken to examine the soundness of the sampling points chosen and the degree of sample replication employed. This indicated that, with the exception of the Western Treatment Plant, a change to some sampling locations, as well as an increase in sample replication, was prudent. Subsequent to round 1, samples were collected in triplicate for each treatment process investigated, with the exception of samples from the Western Treatment Plant, which were collected in duplicate (see Text S2 and Tables S1 to S5). Samples collected from Victorian WWTPs were transported by overnight courier to the laboratory for processing, while samples collected from South Australian WWTPs were transported on the day of collection to the laboratory for processing.
Concentration, isolation, and analysis of environmental oocyst density and infectivity from wastewaters were undertaken using the integrated Cryptosporidium assay as previously described (16). In brief, all samples were spiked with ColorSeed (BTF Bio/bioMérieux), an internal oocyst recovery control. Samples were then initially concentrated using calcium carbonate precipitation (35) followed by oocyst purification using a Dynabead Cryptosporidium immunomagnetic separation (IMS) kit following the instructions of the manufacturer (ThermoFisher Scientific). Concentrated oocysts were then treated to mimic ingestion by a host and then applied to host cells. The oocysts (including the added ColorSeed) were then recovered from the cell culture wells and counted to provide an oocyst recovery rate (determined from the ColorSeed count) and an indigenous oocyst count. Additional controls were included in each batch of samples to account for oocyst losses on the plate, which is critical for estimating the number of oocysts inoculated into culture. After 48 h, the cells were stained using a fluorescent antibody to detect Cryptosporidium infectious stages, counting the number of zones of infection, termed “foci,” each of which indicates an infection caused by a single oocyst. An infectious oocyst fraction was then calculated from the number of foci detected and the number of oocysts inoculated onto cell culture.
Standard water quality and operational data collected as part of routine sampling programs at each WWTP were made available for analysis in concert with any measured reductions in Cryptosporidium oocyst numbers and infectivity. This was undertaken in order to examine if any of these measured factors influenced either oocyst infectivity or removal. Furthermore, climate data for air temperature, rainfall, and solar radiation for the year 2013 were downloaded from the Bureau of Meteorology (BOM) website (http://www.bom.gov.au/climate/data/index.shtml).
Cryptosporidiosis notifications were collected from the Australian Government Department of Health National Notifiable Diseases Surveillance System website (http://www9.health.gov.au/cda/source/rpt_1_sel_a.cfm).
For calculation of oocyst inactivation, the weighted average percent oocyst infectivity was calculated for each sampling location as follows: [(total number of sample 1 oocysts × percent sample 1 oocyst infectivity) + (total number of sample 2 oocysts × percent sample 2 oocyst infectivity)]/(total number of sample 1 oocysts + total number of sample 2 oocysts).
The oocyst LI value between sampling points was calculated using the log10 value of the weighted average percent oocyst infectivity. This calculation does not take into account sample recovery rates. When there were no infectious oocysts detected at the final point of analysis, a greater-than (>LI) value was calculated based on the conservative assumption that 1 infectious oocyst had been detected in each replicate sample.
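The weighted-average infectivity and LI calculations described above can be sketched as follows. This is a minimal illustration with hypothetical oocyst counts and infectivity percentages, generalized to any number of replicate samples; it is not the authors' analysis code.

```python
import math

def weighted_avg_infectivity(oocyst_counts, pct_infectivities):
    """Weighted average percent oocyst infectivity across replicate samples:
    sum(count_i * percent_i) / sum(count_i)."""
    total = sum(oocyst_counts)
    return sum(n * p for n, p in zip(oocyst_counts, pct_infectivities)) / total

def log_inactivation(pct_upstream, pct_downstream):
    """LI between two sampling points, taken here as the log10 ratio of the
    upstream to downstream weighted average percent infectivities."""
    return math.log10(pct_upstream / pct_downstream)

# Hypothetical: two raw-sewage replicates of 200 and 100 oocysts with 30% and
# 15% infectivity give a weighted average of 25%.
pct_raw = weighted_avg_infectivity([200, 100], [30.0, 15.0])
```

When no infectious oocysts are detected downstream, the downstream percentage would be computed from the conservative assumption of 1 infectious oocyst per replicate, yielding a greater-than (>LI) value.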
A LRV for total oocysts was calculated using the total number of oocysts/liter in the raw sewage and the total number of oocysts/liter at the final point of analysis. The oocyst counts were corrected using oocyst recovery rates to account for oocyst losses that occurred during sample processing. While this calculation gives an accurate measure of the capacity of a treatment process to remove all oocysts, it does not quantify any oocyst inactivation resulting from the treatment process. When no oocysts were detected in the treated effluent, a greater-than (>) LRV was calculated based on the conservative assumption that 1 oocyst had been detected in each replicate effluent sample.
The LRV for infectious oocysts was calculated using the number of infectious oocysts/liter in the raw sewage and the number of infectious oocysts/liter at the final point of analysis. This calculation incorporated oocyst recovery rates to account for oocyst losses during sample processing. While this calculation gives an accurate measure of the “real risk” posed by infectious oocysts along a treatment train, it does not quantify the capacity of a treatment process to remove all oocysts (i.e., both infectious and noninfectious oocysts). When no infectious oocysts were detected, a greater-than (>) LRV was calculated based on the conservative assumption that 1 infectious oocyst had been detected in each replicate sample.
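The recovery-corrected LRV calculation applies identically to total and infectious oocyst counts. The sketch below uses hypothetical counts and recovery rates to illustrate the arithmetic; it is not the study's analysis code.

```python
import math

def recovery_corrected(count_per_liter, recovery_rate):
    """Correct an observed oocyst count/liter for processing losses using the
    ColorSeed-derived recovery rate (e.g., 0.40 = 40% recovery)."""
    return count_per_liter / recovery_rate

def lrv(raw_per_liter, effluent_per_liter):
    """Log10 reduction value between raw sewage and the final point of
    analysis, after recovery correction of both counts."""
    return math.log10(raw_per_liter / effluent_per_liter)

# Hypothetical: 1,000 oocysts/liter observed in raw sewage at 40% recovery,
# 2 oocysts/liter observed in treated effluent at 50% recovery.
raw = recovery_corrected(1000, 0.40)  # 2,500 oocysts/liter
eff = recovery_corrected(2, 0.50)     # 4 oocysts/liter
reduction = lrv(raw, eff)
```

As in the text, an effluent count of zero would be replaced by the conservative assumption of 1 oocyst per replicate sample, and the result reported as a greater-than (>) LRV.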
Treatment targets specified by the AGWR (12) to manage the risk from Cryptosporidium for the production of reuse water are based on total oocyst counts at the inlet to the plant (measured from either raw sewage or primary effluent, depending on the plant). This may overestimate the true risk, since not all oocysts are infectious. To document the effective reduction in risk using total and infectious oocyst counts, we used a risk reduction value calculated as the difference between the log10 raw sewage total oocyst count and the log10 raw sewage infectious oocyst count.
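Read as a log10 credit, the risk reduction value reduces to the log10 of the total-to-infectious ratio in raw sewage. A minimal sketch, with hypothetical counts and assuming this log-difference interpretation:

```python
import math

def risk_reduction_value(total_per_liter, infectious_per_liter):
    """Credit for the non-infectious oocyst fraction in raw sewage, taken as
    log10(total) - log10(infectious), i.e., log10(total/infectious)."""
    return math.log10(total_per_liter) - math.log10(infectious_per_liter)

# Hypothetical: 1,000 total oocysts/liter of which 30/liter are infectious
# gives a credit of about 1.5 log10.
rrv = risk_reduction_value(1000, 30)
```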
A hypothetical risk assessment incorporating oocyst infectivity data was conducted for the Bolivar WWTP because it is a well-characterized system with each of the treatment steps previously validated in relation to all reference pathogens, including protozoa. Bolivar is the largest wastewater treatment plant in South Australia, processing >60% of metropolitan Adelaide's raw sewage and treating approximately 46,000 ML of wastewater annually. It uses four process stages before effluent is discharged to the environment or diverted for reuse. The preliminary treatment consists of mechanical screening, followed by preaeration, grit removal, and primary sedimentation. The secondary treatment includes an activated sludge process (ASP). Three wastewater lagoons, followed by dissolved air flotation filtration (DAFF), comprise the tertiary treatment stages for the production of reuse water.
Using the raw sewage infectivity data collected from the Glenelg and Aldinga WWTPs in this investigation, a quantitative microbial risk assessment (QMRA) was performed to assess the risk for a number of hypothetical scenarios occurring at the Bolivar WWTP. The four illustrative scenarios assessed included (i) baseline barrier LRV performance (scenario A), (ii) baseline barrier LRV performance and sewage oocyst infectivity as a risk reduction value (scenario B), (iii) partial failure of the DAFF process (scenario C), and (iv) partial failure of the DAFF process and sewage oocyst infectivity as a risk reduction value (scenario D).
The variability in sewage concentrations of total and infectious oocysts for South Australian data was characterized by fitting probability density functions (PDFs) to cumulative data using @Risk software (Palisade Corporation, version 5.5) and examined for goodness of fit using root mean square error values. PDF fitting of total oocysts and the infectious fraction (percent) was undertaken in preparation for a Monte Carlo simulation to compute a final distribution to estimate the density of infectious oocyst numbers in the source (raw sewage). This was subsequently used to calculate a PDF value describing risk and to generate a risk reduction value that could be assigned for describing the difference in total and infectious oocyst numbers in raw sewage.
This risk reduction value was used in combination with the concentration of total oocysts in sewage at Bolivar obtained from previous validation work, in addition to barrier effectiveness (LRV) for the ASP, lagoon, and DAFF treatment steps, all described in the form of PDFs. Monte Carlo simulations were employed to integrate these PDFs to estimate risk and disease burden in the form of disability-adjusted life year (DALY) values. For this exercise, the major exposure pathway was ingestion from municipal dual reticulation (12). A Cryptosporidium dose-response model and the DALY value (12) were used to generate illness and DALY estimates for each scenario.
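The Monte Carlo structure of such a QMRA can be sketched in a few lines. The study used @Risk with fitted PDFs; the stand-in below uses Python's random module with illustrative distributions, an exponential Cryptosporidium dose-response model, and assumed constants (dose-response parameter, DALYs per case, ingestion volume) that are not taken from the study.

```python
import math
import random

random.seed(1)

# Illustrative constants (assumed for this sketch, not from the study):
R_DOSE_RESPONSE = 0.0042  # exponential dose-response parameter for Cryptosporidium
DALY_PER_CASE = 1.5e-3    # disease burden per case of cryptosporidiosis
INGESTION_L = 1e-3        # liters ingested per exposure (dual reticulation use)
EXPOSURES_PER_YEAR = 365

def annual_daly(raw_oocysts_per_l, treatment_lrv, infectivity_rrv=0.0, n=100_000):
    """Monte Carlo estimate of annual disease burden (DALYs/person/year).
    raw_oocysts_per_l and treatment_lrv are callables drawing from the source
    and combined-barrier PDFs; infectivity_rrv is a fixed log10 credit for
    the non-infectious oocyst fraction in raw sewage."""
    burden = 0.0
    for _ in range(n):
        conc = raw_oocysts_per_l() * 10 ** -(treatment_lrv() + infectivity_rrv)
        dose = conc * INGESTION_L
        p_inf = 1.0 - math.exp(-R_DOSE_RESPONSE * dose)  # exponential model
        p_annual = 1.0 - (1.0 - p_inf) ** EXPOSURES_PER_YEAR
        burden += p_annual * DALY_PER_CASE
    return burden / n

# Hypothetical scenario: lognormal source term, combined ASP+lagoon+DAFF LRV,
# plus a 1.5 log10 infectivity credit (cf. scenarios B and D).
daly = annual_daly(
    raw_oocysts_per_l=lambda: random.lognormvariate(math.log(2000), 0.8),
    treatment_lrv=lambda: random.gauss(6.0, 0.5),
    infectivity_rrv=1.5,
)
```

Dropping `infectivity_rrv` back to 0.0 reproduces the total-oocyst-only view (scenarios A and C), so the two runs can be compared directly against a DALY target such as the 10⁻⁶ DALYs/person/year benchmark.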
We thank the project advisory committee, especially Judy Blackbeard and David Cunliffe, and all other participating water utilities, including Melbourne Water, City West Water, South East Water, and Allwater. WWTP operational and technical support was provided by Sam Costello, Shelly Koh, Rudi Regel, Alex Keegan, Teresa Qiu, Ann Gooding, and Ben Murdoch. Protozoology technical expertise was provided by Suzanne Hayes, Mira Maric, Ben Chong, and Chris Spry. Project administrative support was provided by Gareth Roeszler, Damien Connell, and Euan Hind.
This work was funded by the Victorian Smart Water Fund, Water Research Australia, and the South Australian Water Corporation.
Supplemental material for this article may be found at https://doi.org/10.1128/AEM.03068-16.