Introduction

Waterborne diseases are a major threat to public health globally. Worldwide, it is estimated that ~80% of wastewater is released to the environment without sufficient treatment (UN 2017) and at least 2 billion people use a drinking water source contaminated with faeces (WHO 2019). Waterborne diseases are thought to be responsible for between 1.6 and 12 million deaths annually (Gleick 2002; Troeger et al. 2018; Xagoraraki and O’Brien 2020). Although the burden is highest in developing countries, outbreaks of disease still occur in developed countries, and the global economic burden is estimated at US$12 billion per year (Alhamlan et al. 2015). Waterborne pathogens include bacteria (e.g. Escherichia coli, Salmonella spp., Campylobacter spp., Vibrio cholerae), viruses (e.g. norovirus, adenovirus, poliovirus), protozoa (e.g. Cryptosporidium spp. and Giardia spp.) and helminths (e.g. Ascaris spp. and Trichuris spp.).

Clinical surveillance and monitoring of waterborne pathogens are essential tools for detecting and preventing further spread and for minimising the extent of an outbreak. However, clinical testing is usually limited to individuals who are ill enough to seek treatment and testing, resulting in underreporting of disease prevalence (Cacciò and Chalmers 2016) and making clinical data a lagging indicator of a community outbreak. Similarly, screening for pathogens directly to verify water safety in source and treated waters used for drinking is problematic because of the low pathogen concentrations that are considered acceptable, which requires the analysis of large volumes of water. For instance, acceptable viral pathogen concentrations in treated drinking water range from one infectious virion per 500 kL to one per 5 mL (Regli et al. 1991; Schijven and Hassanizadeh 2000; Schijven et al. 2006; Moore et al. 2010). Such concentrations are beyond what is practicably detectable. One solution is to monitor pathogens where they are present at higher concentrations in source waters and make assumptions about pathogen reduction. However, in relatively clean source waters that place little or no reliance on treatment barriers, the acceptable source water concentrations can still be well below the practicable limits of detection, which range from one infectious pathogen per litre to one per kilolitre, depending on the pathogen, assay type and quality of the water. Indeed, most of the standard approaches to pathogen monitoring for both clinical disease and water testing are costly, often pathogen-specific, frequently rely on passive monitoring, are only practicable and affordable at inadequately low frequencies, are subject to biases, and vary widely from country to country depending on the resources and funding available (Ramírez-Castillo et al. 2015; Sims and Kasprzyk-Hordern 2020).
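
To give a sense of the volumes involved, the sample volume needed for direct detection can be approximated from Poisson counting statistics. The following is a minimal sketch; the 95% detection probability and the concentrations used are illustrative assumptions, not values taken from the studies cited above.

```python
import math

def volume_for_detection(concentration_per_L: float, prob_detect: float = 0.95) -> float:
    """Volume (L) to analyse so that, under Poisson sampling, at least one
    organism is captured with probability `prob_detect`."""
    # P(>=1 organism in volume V) = 1 - exp(-c * V)  =>  V = -ln(1 - p) / c
    return -math.log(1.0 - prob_detect) / concentration_per_L

# Illustrative acceptable concentration: 1 infectious virion per 500 kL (2e-6/L)
print(f"{volume_for_detection(1 / 500_000):,.0f} L")  # ~1,500,000 L
# Versus a practicable assay limit of ~1 organism per kL (1e-3/L)
print(f"{volume_for_detection(1 / 1_000):,.0f} L")    # ~3,000 L
```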

Improved monitoring systems that can detect multiple waterborne diseases across broad communities in a cost-effective manner, preferably in real time, are therefore urgently required. In this regard, wastewater-based epidemiology (WBE), as an early warning system for a variety of waterborne infectious diseases, has received much recent attention. Initially used for monitoring poliovirus prevalence (Pöyry et al. 1988; Berchenko et al. 2017), community-wide drug abuse (Castiglioni et al. 2006) and other chemical pollutants (Choi et al. 2018), WBE relates broadly to the analysis of wastewater for the presence of nucleic acids or other biomarkers excreted in faeces and urine to provide comprehensive community health information (Mao et al. 2020a). The methods will also detect pathogens secreted in saliva, sputum, mucus, vomitus and phlegm—all of which are often captured in wastewater (Deere et al. 2020). Thus, WBE is equivalent to obtaining and analysing a large community-based composite sample of faeces, saliva, vomitus, sputum, urine, shed skin and other material shed during personal cleansing, washing, bathing and excreting, providing a sensitive means of monitoring temporal changes in pathogen concentrations and diversity within a community (Xagoraraki and O’Brien 2020). A further advantage of sampling wastewater directly is that pathogen numbers are higher in wastewater than in the receiving environments into which wastewaters are discharged.

The emergence in 2020 of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the cause of coronavirus disease 2019 (COVID-19), has heightened the focus on WBE as a surveillance tool for early detection of disease in the community, particularly because of the time lag between the development of symptoms, clinical diagnosis and any action required by health authorities to contain a disease cluster. Although SARS-CoV-2 typically causes respiratory symptoms, and is shed in nasal, buccal, oesophageal and respiratory discharges into wastewater, it can also cause gastrointestinal symptoms and/or be shed in faeces (Wu et al. 2020a, 2020b; Xu et al. 2020): a meta-analysis of COVID-19 studies found that 17.6% of COVID-19 patients had gastrointestinal symptoms and 48.1% had SARS-CoV-2 RNA detected in their faeces (Cheung et al. 2020). Monitoring the presence of SARS-CoV-2 RNA in wastewater is therefore becoming widely used to track changes in COVID-19 case numbers in communities (e.g. Ahmed et al. 2020a; Bar-Or et al. 2020; Kocamemi et al. 2020; La Rosa et al. 2020a; Medema et al. 2020; Nemudryi et al. 2020; Peccia et al. 2020; Randazzo et al. 2020a, 2020b; Wu et al. 2020a; Wurtzer et al. 2020a, 2020b) (Table 1).

Table 1 Summary of studies that used molecular techniques to detect SARS-CoV-2 in wastewater in 2020

The protozoan parasites Cryptosporidium and Giardia are also important enteric pathogens of public health concern and major waterborne pathogens. Between 2011 and 2016, Cryptosporidium and Giardia were responsible for all reported waterborne outbreaks attributed to protozoa (n = 381) worldwide (Efstratiou et al. 2017). Cryptosporidium is the second most important cause of moderate to severe diarrhoea and mortality in children under 5 years of age in developing countries, and both symptomatic and asymptomatic cryptosporidial infections in children are associated with malnutrition and stunted growth (Khalil et al. 2018). Oocysts are shed in faeces in high numbers (up to 10⁹ per stool); reported median infectious doses range from approximately 1 to 125 oocysts, depending on species, with a recommended consensus probability of infection of 20% per ingested oocyst; and oocysts can remain infectious in the environment for more than 6 months under cool, dark, moist conditions (DuPont et al. 1995; Chappell et al. 2006; Shirley et al. 2012; WHO 2016). The global prevalence of Cryptosporidium has been estimated at 7.6%, with an average prevalence of 4.3% in developed countries and 10.4% in developing countries (and prevalences as high as 69.6% in some countries) (Dong et al. 2020). There is evidence that, due to under-reporting, the true number of cases in the community may be as much as 500 times higher than estimates based on routine clinical stool isolates (Hall et al. 2006). Currently, 43 Cryptosporidium species are considered valid (Bolland et al. 2020; Holubová et al. 2020; Ježková et al. 2020), with the majority of human infections caused by C. hominis and C. parvum, although > 20 species and genotypes have been reported in humans (Feng et al. 2018; Zahedi and Ryan 2020).
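
The consensus per-oocyst infection probability lends itself to a simple single-hit dose-response illustration. The sketch below assumes each ingested oocyst initiates infection independently; this model form is a common choice in quantitative microbial risk assessment rather than one prescribed by the references above.

```python
def p_infection(dose: int, r: float = 0.20) -> float:
    """Single-hit dose-response: probability of infection after ingesting
    `dose` oocysts, each independently infecting with probability r.
    r = 0.20 is the consensus per-oocyst value cited in the text (WHO 2016)."""
    return 1.0 - (1.0 - r) ** dose

for dose in (1, 5, 10, 25):
    print(dose, round(p_infection(dose), 3))  # 0.2, 0.672, 0.893, 0.996
```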

Giardiasis is the most common enteric protozoan parasitic infection worldwide, with an estimated 280 million people infected annually. The species that infects humans, Giardia duodenalis, is a species complex consisting of eight assemblages (A–H), with assemblages A and B dominant in humans and assemblages C–H in animals, although sporadic cases of assemblages C, D, E and F have been reported in humans (Ryan and Zahedi 2019). Giardia infections can be asymptomatic or result in diarrhoea that can become chronic and has also been associated with irritable bowel syndrome, chronic fatigue and joint pain (Coffey et al. 2020). In infants and children, infections can result in failure to thrive and malnutrition (Dunn and Juergens 2020). Up to 33% of individuals in developing countries and up to 8% in developed countries have been infected (Cacciò and Sprong 2014; Dunn and Juergens 2020). As with Cryptosporidium, cysts are shed in faeces in high numbers (up to 10¹⁰ cysts per day) with a median infectious dose of approximately 25 cysts (Rendtorff 1954, 1979).

Both parasites are prevalent in wastewater (Hamilton et al. 2018; Zahedi et al. 2018; Adeyemo et al. 2019; Razzolini et al. 2020) (Tables 2 and 3), with concentrations as high as 60,000 Cryptosporidium oocysts and 100,000 Giardia cysts L⁻¹ (Hamilton et al. 2018), and Cryptosporidium oocysts are resistant to chemical disinfection (Campbell et al. 1995). Urban wastewater discharge is known to play an important role in pathogen transmission. For example, the largest cryptosporidiosis outbreak to date, in Milwaukee, USA, in 1993, which affected over 400,000 individuals, was due to drinking water becoming contaminated with wastewater as a result of extreme weather and water treatment failure (MacKenzie et al. 1994). This review focuses on the surveillance and detection of Cryptosporidium, Giardia and SARS-CoV-2 in wastewater and the benefits and challenges of WBE for public health.

Table 2 Summary of studies that used molecular techniques to detect Cryptosporidium in wastewater
Table 3 Summary of studies that used molecular techniques to detect Giardia in wastewater

Occurrence of SARS-CoV-2 in wastewater

Very limited data are available on the occurrence of the closely related SARS-CoV-1 in wastewater (Peiris et al. 2003; Wang et al. 2005a; Gundy et al. 2009; Wigginton et al. 2015), although connections with wastewater were identified in a 2003 outbreak in Hong Kong linked to a faulty sewage system (Peiris et al. 2003). Since the first report of SARS-CoV-2 in human waste (Holshue et al. 2020), the presence of SARS-CoV-2 in wastewater has drawn substantial attention. Globally, an increasing number of studies have detected SARS-CoV-2 in untreated and/or treated urban wastewater at wastewater treatment plants (WWTPs) to track the spatial and temporal dynamics of the virus, the removal efficiency of wastewater treatment processes (Table 1) and the potential public health risks associated with SARS-CoV-2 in wastewater (Michael-Kordatou et al. 2020). Reported prevalence rates range from 11 to 100% in untreated wastewater (raw influent), at concentrations up to 4.6 × 10⁷ genome copies/L, and from 0 to 100% in treated wastewater (final effluent), at concentrations up to 10⁵ genome copies/L (Table 1).

Occurrence of protozoans in wastewater

The protozoan parasites Cryptosporidium and Giardia are among the most common parasites reported in wastewater worldwide and are significant contributors to the global waterborne disease burden (Zahedi et al. 2018) (Tables 2 and 3). The occurrence and distribution of Cryptosporidium oocysts and Giardia cysts in untreated wastewater generally correlate with infection and excretion rates in the population served, and may also be influenced by the contribution of infected domestic or wild animals to the Cryptosporidium and Giardia load in the raw wastewater (Castro-Hermida et al. 2008; Deere and Khan 2016). To date, studies conducted globally have reported more than 20 species/genotypes of Cryptosporidium and G. duodenalis assemblages A, B, C, E and G in wastewater, with prevalence rates of 11.4 to 100% for Cryptosporidium and 18.8 to 100% for Giardia spp., often at concentrations over 10 oocysts/L and 100 cysts/L, respectively (Tables 2 and 3).

Several studies across the world have reported C. hominis (the predominant species in humans) among the most prevalent Cryptosporidium species detected in wastewater, e.g. in Australia (King et al. 2015a; Zahedi et al. 2018), Brazil and Peru (Ulloa-Stanojlović et al. 2016; Martins et al. 2019), China (Feng et al. 2009; Li et al. 2012; Huang et al. 2017), Japan (Hashimoto et al. 2006; Hirata and Hashimoto 2006), Switzerland and Germany (Ward et al. 2002), the USA (Xiao et al. 2001; Zhou et al. 2003) and Tunisia (Ben Ayed et al. 2012) (Table 2). In Europe, by contrast, a number of studies have reported that C. parvum is the dominant species in wastewater (Hänninen et al. 2005; Spanakos et al. 2015; Imre et al. 2017; Ramo et al. 2017). In other countries such as China, Iran, Tunisia and the USA, livestock-associated species such as C. andersoni and C. xiaoi dominate in wastewater samples (Xiao et al. 2001; Liu et al. 2011; Ben Ayed et al. 2012; Hatam-Nahavandi et al. 2016; Ma et al. 2019) (Table 2).

Compared with other assemblages of G. duodenalis, assemblages A and B have been predominantly reported in wastewater globally, while assemblages C (Yamashiro et al. 2019), E (Castro-Hermida et al. 2011; Ben Ayed et al. 2012; Hatam-Nahavandi et al. 2017) and G (Huang et al. 2017; Ma et al. 2019) have been reported only sporadically (Table 3).

Current protozoan and SARS-CoV-2 detection/surveillance systems in wastewater

SARS-CoV-2 is most commonly detected using quantitative reverse transcription polymerase chain reaction (RT-qPCR) assays targeting SARS-CoV-2 RNA. To date, more than 17 RT-qPCR assays have been developed for the detection of SARS-CoV-2 in clinical samples, including the CDC-recommended assays targeting three segments of SARS-CoV-2 RNA (N1, N2 and N3) (CDC 2020) and the envelope protein (E) gene (E_Sarbeco target) (Corman et al. 2020). Some of these assays have been used to detect SARS-CoV-2 in wastewater samples from a wide range of countries including Australia (Ahmed et al. 2020a), Chile (Ampuero et al. 2020), Germany (Westhaus et al. 2020), Israel (Bar-Or et al. 2020), India (Kumar et al. 2020), Italy (La Rosa et al. 2020a), Japan (Hata et al. 2020), France (Wurtzer et al. 2020a, 2020b), the Netherlands (Medema et al. 2020), Spain (Randazzo et al. 2020a, 2020b), Turkey (Kocamemi et al. 2020) and the USA (Green et al. 2020; Nemudryi et al. 2020; Wu et al. 2020a). While RT-qPCR is the most reliable method to detect SARS-CoV-2, a variety of serological tests (ELISAs, lateral flow assays, etc.) have also been developed; these provide additional important information on the kinetics of the immune response and the detection of asymptomatic infections, and have the advantage that virus proteins are more stable than RNA (La Marca et al. 2020). Antibody-based methods have been applied for the detection of SARS-CoV-2 protein in wastewater using immunoblotting and immune-linked PCR (Neualt et al. 2020).
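
Quantification by RT-qPCR typically proceeds via a standard curve relating the quantification cycle (Cq) to gene copies. The sketch below illustrates the arithmetic only; the slope, intercept and volumes are illustrative assumptions and must be calibrated for each assay and workflow.

```python
def copies_from_cq(cq: float, slope: float = -3.32, intercept: float = 37.0) -> float:
    """Convert a quantification cycle (Cq) to gene copies per reaction using a
    standard curve Cq = slope * log10(copies) + intercept. A slope of -3.32
    corresponds to ~100% amplification efficiency; both values are illustrative."""
    return 10 ** ((cq - intercept) / slope)

def copies_per_litre(cq: float, template_uL: float, eluate_uL: float,
                     sample_volume_L: float) -> float:
    """Scale copies per reaction up to copies per litre of wastewater processed."""
    return copies_from_cq(cq) * (eluate_uL / template_uL) / sample_volume_L

# e.g. Cq of 34 from 5 uL of a 100 uL RNA extract derived from 0.1 L of sample
print(f"{copies_per_litre(34.0, 5, 100, 0.1):,.0f} copies/L")  # ~1,600 copies/L
```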

Prior to detection, studies have used a variety of viral concentration methods including ultrafiltration, polyethylene glycol (PEG) precipitation, filtration with an electronegative membrane and centrifugation (Lu et al. 2020) (Table 1). To enable accurate measurements of SARS-CoV-2 in wastewater, it is important to determine the recovery efficiencies of these methods. A recent study compared the efficiency of different viral concentration methods for wastewater using murine hepatitis virus as a human coronavirus (CoV) surrogate (Ahmed et al. 2020b, 2020c). The highest mean recovery (65.7%) was achieved using an adsorption-extraction method supplemented with MgCl₂, followed by an adsorption-extraction method without MgCl₂ (60.5%). Mean recovery efficiencies for PEG precipitation (44%) and ultrafiltration (Amicon® Ultra-15, 28%; Centricon Plus-70, 56%) were lower (Ahmed et al. 2020b). Concentrating both the liquid and solid fractions of wastewater samples (because viral particles adsorb to organic matter) and avoiding acidification of samples were identified as important for viral recovery (Ahmed et al. 2020b).
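
Where a surrogate recovery has been measured, it can be used to correct measured concentrations. A minimal sketch follows, assuming the target virus behaves like the surrogate (an assumption that may not hold exactly); the input concentration is illustrative.

```python
def recovery_efficiency(surrogate_recovered: float, surrogate_seeded: float) -> float:
    """Recovery efficiency of a concentration workflow, estimated from a
    surrogate (e.g. murine hepatitis virus) seeded into the sample matrix."""
    return surrogate_recovered / surrogate_seeded

def corrected_concentration(measured_per_L: float, recovery: float) -> float:
    """Adjust a measured wastewater concentration for incomplete recovery."""
    if not 0.0 < recovery <= 1.0:
        raise ValueError("recovery must be in (0, 1]")
    return measured_per_L / recovery

# Illustrative: 65.7% mean recovery (adsorption-extraction with MgCl2; Ahmed et al. 2020b)
r = recovery_efficiency(65.7, 100.0)
print(f"{corrected_concentration(1.0e4, r):,.0f} copies/L")  # ~15,200 copies/L
```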

Standard detection methods for Cryptosporidium and Giardia in wastewater involve concentration (using filtration or flocculation) and purification of the (oo)cysts (usually using immunomagnetic separation, IMS), followed by immunofluorescent microscopy and enumeration, based on USEPA Method 1693 (USEPA 2014). The recovery efficiency from wastewater varies widely, from 5.5% to as high as 100%, with mean recoveries of 62% (Cryptosporidium) and 45% (Giardia) (Gennaccaro 2003; Quintero-Betancourt et al. 2003; Robertson et al. 2006; Nasser et al. 2012; Nasser 2016; Yamashiro et al. 2019). However, a major limitation of standard microscopy-based detection methods is that they do not provide information on the species/assemblages present. Vital dyes have been used to determine viability but are problematic and prone to overestimating (oo)cyst viability (Sammarro Silva and Sabogal-Paz 2020). As a consequence, more recent studies have employed molecular detection methods for genetic characterisation, or cell culture infectivity assays.

Relatively few studies have genetically characterised Cryptosporidium and Giardia in wastewater (Tables 2 and 3), and most have utilised Sanger sequencing of PCR amplicons, with only two studies using next-generation sequencing (NGS) of amplicons (Zahedi et al. 2018, 2019). A custom microarray targeting a range of viral, bacterial and protozoan pathogens has also been tested against DNA obtained from whole genome amplification (WGA) of RNA and DNA from wastewater and animal faeces, which detected Giardia but not Cryptosporidium (Li et al. 2015).

A wide diversity of Cryptosporidium and Giardia species and assemblages has been detected in wastewater, with many studies reporting C. hominis as well as C. parvum, C. muris, C. meleagridis and G. duodenalis assemblages A and B among the most prevalent (King et al. 2015a; Taran-Benshoshan et al. 2015; Ulloa-Stanojlović et al. 2016; Huang et al. 2017; Ramo et al. 2017; Zahedi et al. 2018; Yamashiro et al. 2019) (Tables 2 and 3). In addition, a few studies have utilised subtyping tools to further investigate Cryptosporidium gp60 subtypes in wastewater (Feng et al. 2009; Ben Ayed et al. 2012; Li et al. 2012; Ma et al. 2016; Huang et al. 2017; Jiang et al. 2020). Amongst the C. hominis subtype families identified in wastewater to date, Ib was the most frequently reported (83% of studies that used subtyping), followed by Ia (66%), Id and Ie (50% each) and If (33%). For C. parvum, only three studies have used subtyping tools, reporting subtype families IIa (Tunisia and the USA), IIc (Tunisia) and IId (China) (Zhou et al. 2003; Ben Ayed et al. 2012; Li et al. 2012; Huang et al. 2017). In addition, subtyping of C. meleagridis, C. viatorum and C. ubiquitum in wastewater samples at the gp60 locus has identified subtype IIIbA22G1R1c, subtype XVaA6 and two distinct subtype families, XIIg and XIIh, respectively (Ma et al. 2016; Huang et al. 2017) (Table 2).

Fate/survival/removal of protozoans and SARS-CoV-2 in wastewater

After being shed in nasal, buccal, oesophageal, respiratory and faecal discharges into wastewater, pathogens are exposed to the wastewater environment for hours to days before they reach WWTPs. The fate and survival of pathogens in wastewater systems depend on a variety of factors, including wastewater characteristics, the presence of biofilms, temperature, pH, average in-sewer travel time, per-capita water use, and the processes used to treat and disinfect the wastewater (Curtis 2003; Cao et al. 2020; Hart and Halden 2020; Mandal et al. 2020). Wastewater treatment usually involves a combination of physical (sedimentation, filtration, inactivation by solar or UV radiation), biological (activated sludge, algae) and chemical (coagulation-flocculation, inactivation by oxidants such as chlorine) processes for pathogen removal, with some of these processes occurring concurrently (Bhatt et al. 2020; Fu et al. 2010; Nasser et al. 2012).

In general, secondary wastewater treatment is capable of removing an average of 1 log₁₀ (90%) of viruses, although removal levels are highly variable and additional treatment such as chlorination is required to reduce virus levels to concentrations safe for release to the environment (McLellan et al. 2020). Relatively little is known about the fate of SARS-CoV-2 in WWTPs. In one study, the time from stool emission to arrival at the WWTP was estimated at 6–8 h for SARS-CoV-2 (Rimoldi et al. 2020), and it has previously been reported that SARS-CoV-1 can remain infectious in wastewater for up to 14 days (at 4 °C) (Wang et al. 2005a). Coronaviruses are enveloped viruses, meaning that the virus genome and associated proteins are covered by a lipid membrane taken from the host cell during virus reproduction (Casanova et al. 2009; Schoeman and Fielding 2019). In contrast, enteric viruses such as noroviruses and enteroviruses are non-enveloped, with genomes encapsulated by a protein coat. These structural differences alter their behaviour, with enveloped viruses binding more readily to particulates in wastewater than non-enveloped viruses, which tend not to be particle-associated (Ye et al. 2016). Enveloped viruses are also considered more fragile than non-enveloped viruses because compounds such as solvents and detergents in wastewater can damage the virus envelope, rendering them non-infectious (Gundy et al. 2009). Wastewater temperature varies seasonally, and it has been estimated that at 20 °C, at least 25% of SARS-CoV-2 RNA in wastewater should persist even with an in-sewer transit time of 10 h and low virus stability (Hart and Halden 2020). Chlorination is the most commonly used disinfection technique in WWTPs, and previous studies have shown that SARS-CoV-1 is more sensitive to disinfection than Escherichia coli, with complete inactivation at a dose of 10 mg/L chlorine or 20 mg/L chlorine dioxide (Wang et al. 2005b). A study in Italy detected SARS-CoV-2 RNA in raw, but not in tertiary-treated, wastewater, and none of the positive samples contained infectious virus (Rimoldi et al. 2020), similar to findings from Spain (Randazzo et al. 2020b). A study in Paris identified SARS-CoV-2 RNA in raw (23/23) and treated (6/8) wastewater, but with a 100-fold reduction in viral load in treated compared with raw water (Wurtzer et al. 2020b).
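
In-sewer persistence estimates of this kind follow from simple first-order decay. The sketch below is illustrative only; the T90 value is an assumption chosen to mirror the low-stability scenario described by Hart and Halden (2020), not a measured parameter.

```python
import math

def fraction_remaining(transit_h: float, t90_h: float) -> float:
    """First-order decay: fraction of signal remaining after `transit_h` hours
    in-sewer, given T90, the time for 90% (1 log10) loss."""
    k = math.log(10) / t90_h  # decay rate constant (1/h)
    return math.exp(-k * transit_h)

# A pessimistic (illustrative) T90 of 16 h still leaves ~24% after a 10 h transit
print(f"{fraction_remaining(10, 16):.2f}")
```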

The removal of Cryptosporidium and Giardia (oo)cysts at WWTPs can be highly variable and is often dependent on the temperature and the type of wastewater treatment processes used (Emelko 2003; Nasser et al. 2012; Nasser 2016; King et al. 2017; Hamilton et al. 2018; Schmitz et al. 2018). Seasonality and inflow also affect removal (King et al. 2017), and many studies have reported variable removal of both Cryptosporidium and Giardia at WWTPs, particularly those using activated sludge (Nasser et al. 2012; Nasser 2016). Log₁₀ reduction values (LRVs) of 0.5–4.0 for Giardia (Taran-Benshoshan et al. 2015; Soller et al. 2017; Hamilton et al. 2018; Yamashiro et al. 2019) and 0.21–3.08 for Cryptosporidium (King et al. 2017; Soller et al. 2017; Hamilton et al. 2018) have been reported from various WWTPs. WWTPs using Bardenpho processes (similar to activated sludge but incorporating additional aerobic (oxic) and anoxic stages) have been reported to achieve significantly greater LRVs for Cryptosporidium and Giardia than WWTPs using activated sludge or other methods (Schmitz et al. 2018).
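
An LRV is simply the log₁₀ ratio of influent to effluent concentrations. A minimal sketch, with illustrative concentrations:

```python
import math

def log_reduction_value(influent_per_L: float, effluent_per_L: float) -> float:
    """LRV = log10(C_in / C_out); 1 LRV = 90% removal, 2 LRV = 99%, etc."""
    return math.log10(influent_per_L / effluent_per_L)

def percent_removal(lrv: float) -> float:
    """Convert an LRV back to a percentage removal."""
    return 100.0 * (1.0 - 10 ** (-lrv))

# Illustrative: 2,000 oocysts/L in influent reduced to 25 oocysts/L in effluent
lrv = log_reduction_value(2_000, 25)
print(f"LRV = {lrv:.2f} ({percent_removal(lrv):.2f}% removal)")  # LRV = 1.90 (98.75%)
```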

Few studies have measured the extent of protozoan inactivation occurring across treatment processes. An integrated Cryptosporidium assay that determines oocyst density, infectivity and genotype has been developed (Swaffer et al. 2014; King et al. 2015a, 2017) and applied to wastewater (King et al. 2015b, 2017). Using this assay, King et al. (2017) showed that Cryptosporidium oocyst infectivity in wastewater in two Australian states was stable throughout the year but that removal across secondary treatment processes was seasonal and highly variable. Interestingly, for some WWTPs analysed, the infectivity of oocysts remaining in the effluent was higher than in inlet samples, possibly due to the preferential removal of damaged/non-infectious oocysts. Another study reported that while activated sludge removed ~80% of oocysts, the remaining oocysts were still infectious in mice (Villacorta-Martínez de Maturana et al. 1992), which highlights the importance of incorporating routine infectivity testing in wastewater monitoring (King et al. 2017). Ultrafiltration (Cryptosporidium: 4.4–6.0 LRV; Giardia: 4.7–7.4 LRV) and UV disinfection combined with advanced oxidation (~6.0 LRV for both Cryptosporidium and Giardia) have been reported as the most efficient methods for removal and disinfection of Cryptosporidium and Giardia (oo)cysts at WWTPs (Soller et al. 2017). Future studies on environmental conditions, including temperature and pH, on other wastewater treatment processes and on disinfection are necessary to better understand the removal of a range of pathogens at WWTPs (Bhatt et al. 2020).

Benefits of wastewater-based epidemiology

Normally, disease outbreaks are detected and their progression monitored by the clinical testing of symptomatic individuals. However, particularly in the case of enteric pathogens, outbreaks can be missed or disease incidence under-reported because there is a reliance on infected people presenting for medical care, and on medical practitioners requesting clinical testing to confirm infection and reporting the results (Cacciò and Chalmers 2016). In the case of the COVID-19 pandemic, many countries adopted large-scale screening of people with flu-like symptoms to identify COVID-19 cases and assist with disease containment, overwhelming the testing capacity of many public health systems and causing global shortages of testing reagents. As pathogens such as viruses (e.g. SARS-CoV-2) and protozoa (Cryptosporidium and Giardia) are shed through faeces into wastewater, continuous and systematic monitoring of WWTPs can clearly benefit public health by providing early warning signs and information about the temporal and spatial spread of infection in different localities at a population level (Kitajima et al. 2020).

Several WBE studies have detected SARS-CoV-2 in wastewater, indicating local community transmission, before the first autochthonous SARS-CoV-2 cases were notified (La Rosa et al. 2020a; Medema et al. 2020; Randazzo et al. 2020a, 2020b). Had such testing been in place at the time, it would have provided public health officials with more time to coordinate and implement actions to slow the spread of disease. A study in the UK reported that clinical testing underestimated the prevalence of COVID-19 and that large reductions in SARS-CoV-2 RNA in wastewater coincided with lockdowns (Martin et al. 2020).

Similarly, analysis of wastewater in Australia identified a large increase in Cryptosporidium oocyst numbers associated with an outbreak of cryptosporidiosis before the outbreak was reported by public health officials (King et al. 2017). WBE has also been used in several studies to show that the community-level prevalence of Giardia is underestimated (Jakubowski et al. 1991; Oda et al. 2005; Nasser et al. 2012).

The lag time between symptom development and clinical testing varies depending on a number of factors, including the willingness of individuals to present for testing and workloads in testing facilities, but is usually 3–9 days after symptom onset. One study in the USA reported that WBE for SARS-CoV-2 foreshadowed new clinical case reports by 2–4 days (Nemudryi et al. 2020), and another that viral titre trends appeared 4–10 days earlier in wastewater than in clinical data (Wu et al. 2020b). In addition to this lag time, clinical testing for SARS-CoV-2 underestimates the true scale of the pandemic: another US study estimated that only 32% of SARS-CoV-2-infected individuals sought medical care (Silverman et al. 2020). WBE overcomes this by capturing contributions from all individuals in the community. WBE can also detect asymptomatic community infections and rapidly identify emerging clusters, which can be used to alert public health officials to undetected transmission events (Tang et al. 2020).

In addition, WBE can be used to monitor the effectiveness of public health interventions. For example, a study in Cuba detected poliovirus in 100% of wastewater samples prior to an immunisation campaign, but 15 weeks after the campaign, no virus was detected (Más Lago et al. 2003). Similarly, WBE could be used to monitor the ongoing effectiveness of public health campaigns against COVID-19 by tracking increases or decreases in disease burden, or to detect the re-emergence of disease in communities with no active COVID-19 cases. Carefully designed spatial sampling and nationwide WBE monitoring could be used to identify and monitor sensitive locations, such as aged care facilities, or to generate maps of disease clusters, reveal patterns of disease and identify which public health interventions are more effective than others (Daughton 2020). Communities in which high numbers of a particular pathogen are identified could be targeted for more focussed testing and, in the longer term, for identifying and mitigating underlying causes, e.g. socioeconomic status, age demographics, etc. (Sims and Kasprzyk-Hordern 2020).

Sequencing and phylogenetic analysis of pathogens in wastewater allows comparisons between regions and the identification of sources of infection and transmission dynamics. This is very much in its infancy for SARS-CoV-2 in wastewater but is actively being used to identify and trace sources of COVID-19 as part of infection control strategies (Rockett et al. 2020). Such an approach may be particularly useful in settings with low disease incidence when the source of new infection clusters is being tracked (Eden et al. 2020). A comprehensive study using whole genome sequencing (WGS) assessed the geographic and temporal distribution of SARS-CoV-2 lineages across Europe (Alm 2020), and this approach was used in the Netherlands to identify separate introductions to mink farms (Oreshkova et al. 2020). A study in England used WBE with WGS to detect virus variants that were particularly prevalent in the UK and also identified the increasing dominance of the Spike protein G614 variant (Martin et al. 2020). Similarly, phylogenetic analysis of a SARS-CoV-2 genome from a WWTP in Bozeman, Montana (USA), showed that it was more closely related to isolates from California and Australia than to the Wuhan WA1 lineage (Nemudryi et al. 2020). Surveillance using WGS has also shown that infections in California have resulted from multiple introductions from interstate and international sources (Deng et al. 2020).

Subtyping of Cryptosporidium from wastewater in China has been used to identify differences in the transmission dynamics of C. hominis between cities (Li et al. 2012). In the same way, molecular analyses have identified hospitals as important contributors of Cryptosporidium and Giardia to urban wastewater (Jiang et al. 2020). Molecular typing of Cryptosporidium in wastewater has also been used to identify the contribution of abattoirs, as species from livestock such as C. andersoni (Zhou et al. 2003; Ben Ayed et al. 2012) and species from poultry (C. galli, C. baileyi and C. meleagridis) (Huang et al. 2017; Ramo et al. 2017; Zahedi et al. 2018) are more frequently detected in cities with large abattoirs. WBE has also identified the persistence of the C. hominis IbA10G2 subtype, which was responsible for the 1993 Milwaukee outbreak: a study conducted 7 years after the outbreak found that, despite the complexity of Cryptosporidium in wastewater, IbA10G2 was still the predominant subtype, indicating its persistence even in the absence of another outbreak (Zhou et al. 2003). WBE has also been used to show that anthroponotic rather than zoonotic transmission of Giardia dominates in cities in China, based on the absence of animal-specific Giardia assemblages; even when the potentially zoonotic assemblage A was detected, subtyping identified sub-assemblage AII, which is mainly found in humans (Li et al. 2012). Given the diversity of pathogens from different sources in wastewater, NGS has advantages over conventional Sanger sequencing in capturing the extent of that diversity and in detecting low-abundance species that might otherwise be missed. For example, NGS detected a larger diversity of Cryptosporidium species and subtypes in Australian wastewater than Sanger sequencing and identified striking differences between states, reflecting differing contributions from humans, livestock, wildlife, birds and abattoirs to wastewater (Zahedi et al. 2018).

Challenges, risks and future prospects

Despite the obvious benefits of WBE, many challenges remain. Concentrations of pathogens in wastewater can vary seasonally and daily, depending on a wide variety of factors including disease prevalence and the age and health status of communities, the rate at which pathogens are shed into the wastewater in nasal, buccal, oesophageal, respiratory and faecal discharges, climate and environmental factors including rainfall, the relative proportions of industrial and domestic effluent, water use, and wastewater management practices including sewer residence and holding times. The impacts of all these factors need to be better understood to improve the predictive value of WBE.

It is particularly important to better understand how, and in what quantities, pathogens are shed in the nasal, buccal, oesophageal, respiratory and faecal discharges from infected individuals that might enter wastewater streams, in order to model the number of infections in the community from the numbers of pathogens detected in wastewater. For example, defecation frequency is highest in the morning (Heaton et al. 1992) and the timing of sampling is therefore important, as morning samples are likely to contain higher numbers of faecal-oral pathogens. In addition, pathogen shedding is frequently sporadic. For instance, not all COVID-19 patients shed virus in their faeces. A recent meta-analysis of 95 studies reported that 43% (934/2149) of patients (including asymptomatic patients) tested positive for SARS-CoV-2 in stool samples, but the prevalence of positive faecal samples varied widely across studies (van Doorn et al. 2020). The viral load of SARS-CoV-2 in the faeces of patients also varies widely with the course of infection, reaching up to 10⁸ copies per gram of faeces (Foladori et al. 2020; Lescure et al. 2020; Pan et al. 2020; Wölfel et al. 2020). Similarly, both Cryptosporidium and Giardia exhibit sporadic (oo)cyst shedding in faeces (Danciger and Lopez 1975; Chappell et al. 1996) and, as with SARS-CoV-2, the faecal (oo)cyst load also varies widely, with up to 10⁵–10⁷ Cryptosporidium oocysts per gram of faeces (Chappell et al. 1996).
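
These shedding parameters feed directly into the back-calculation commonly used in WBE to estimate the number of shedding individuals in a catchment. The sketch below uses illustrative values throughout (including the assumed mean faecal mass per person per day); real estimates carry orders-of-magnitude uncertainty in shedding rates and recovery.

```python
def estimated_shedders(conc_copies_per_L: float,
                       flow_L_per_day: float,
                       shedding_copies_per_g: float,
                       faeces_g_per_person_day: float = 128.0) -> float:
    """Back-calculate the number of shedding individuals in a catchment:
    (copies/L * L/day) / (copies/g * g/person/day). The default faecal mass
    of 128 g/person/day is an assumed mean, not taken from the cited studies."""
    daily_load = conc_copies_per_L * flow_L_per_day
    copies_per_person = shedding_copies_per_g * faeces_g_per_person_day
    return daily_load / copies_per_person

# Illustrative: 100 copies/L at a 50 ML/day plant, 10^6 copies/g faeces shed
print(f"~{estimated_shedders(100, 5.0e7, 1.0e6):.0f} shedders")  # ~39 shedders
```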

It is also unclear how long shedding continues in faeces once other symptoms have resolved. For primarily upper respiratory and nasopharyngeal pathogens, shedding from nasal, buccal, oesophageal and respiratory discharges into wastewater is relevant, as well as faecal inputs. Other pathogens, such as norovirus, are often shed in vomitus (Kirby et al. 2016). For SARS-CoV-2, initial studies report that faecal shedding is relevant to wastewater sampling, since faecal samples were positive between 1 and > 30 days (up to 7 weeks) post onset of illness and the median duration of positive viral signals was significantly longer in faecal samples than in oropharyngeal swabs (Amirian 2020; Wang et al. 2020). With Cryptosporidium, oocyst shedding after the cessation of diarrhoea is very variable and can extend for up to 60 days (Jokipii and Jokipii 1986; Stehr-Green et al. 1987), and with Giardia for up to 6 months (Hanevik et al. 2007).

Efficient recovery and concentration of pathogens from wastewater prior to identification is central to reliable detection. Currently, studies differ in the types and volumes of samples analysed and in the concentration, processing and detection methods used. To make WBE studies more comparable, a standard approach to WBE, including robust sample design and quality assurance protocols, is essential (Ahmed et al. 2020d; Farkas et al. 2020). Studies to develop a simple, effective primary concentration method that can be used to concentrate viral, bacterial and eukaryotic pathogens alike are also vital. Central to this is the ability to determine recovery efficiencies for the different pathogens monitored; without this, accurate quantitation of the numbers of pathogens present at WWTPs is not possible. Whatever methods are developed for real-time WBE detection in the future, they need to be fully quantitative to allow comparison across communities. If detection methods are nucleic acid based, then standardised extraction and PCR-based diagnostic methods should be used.

Understanding the detection limits of WBE is another area that requires more study (i.e. how many cases must be present in a community before the pathogen can be confidently detected at a WWTP). Modelling suggests that for SARS-CoV-2, detection in community wastewater of one positive case per 100 to 2,000,000 non-infected people is theoretically feasible (Hart and Halden 2020; Kitajima et al. 2020). A study in Japan reported that SARS-CoV-2 RNA could be detected at WWTPs when the number of total confirmed cases was as low as 1 in 100,000 people, but that detection frequency increased and became more reliable once cases reached 10 in 100,000 people or higher (Hata et al. 2020). The detection limits for other pathogens remain unknown.
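
The same mass-balance logic as above can be inverted to gauge the prevalence at which a given assay should begin to detect a pathogen. A minimal sketch, assuming complete mixing, no in-sewer decay and uniform shedding (all simplifications), with illustrative parameter values:

```python
def detectable_prevalence_per_100k(lod_copies_per_L: float,
                                   water_L_per_person_day: float,
                                   shedding_copies_per_g: float,
                                   faeces_g_per_person_day: float = 128.0) -> float:
    """Prevalence (cases per 100,000 people) at which the mean wastewater
    concentration reaches the assay limit of detection (LOD)."""
    copies_per_case_day = shedding_copies_per_g * faeces_g_per_person_day
    # Concentration contributed by one case, diluted by 100,000 people's wastewater
    conc_per_case = copies_per_case_day / (100_000 * water_L_per_person_day)
    return lod_copies_per_L / conc_per_case

# Illustrative: LOD of 10 copies/L, 200 L/person/day water use, 10^6 copies/g shed
print(f"~{detectable_prevalence_per_100k(10, 200, 1.0e6):.1f} cases per 100,000")  # ~1.6
```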

While some studies have reported that SARS-CoV-2 levels in wastewater correlated well with reported COVID-19 community prevalence (Ahmed et al. 2020a; Medema et al. 2020; Wurtzer et al. 2020a, 2020b), another study reported that SARS-CoV-2 concentrations in wastewater implied infection numbers orders of magnitude greater than the number of confirmed clinical cases (Wu et al. 2020a). The impact of confounding variables, such as the rate of asymptomatic cases, variation in the numbers of individuals presenting for testing, and the testing and quantitation methods used, requires further study to provide more robust data in this area. Another challenge associated with WBE is estimating the population size of individual WWTP catchments and the contribution of tourists or commuters in smaller communities (Sims and Kasprzyk-Hordern 2020). Ethical considerations, including privacy and the stigmatisation of ethnic and vulnerable populations, are also issues that will need to be managed. Analyses based on populations of > 10,000 are thought to provide anonymity; however, reporting the emergence and/or spread of disease in small populations or sub-populations by WBE must be done with care and needs to be sensitive to different social, ethnic and economic circumstances (Sims and Kasprzyk-Hordern 2020).

Although RT-qPCR is the most widely used method for detecting SARS-CoV-2, it can be expensive and time-consuming, requires skilled technicians, and is therefore not conducive to real-time WBE. A variety of point-of-care (POC) options are being explored, including paper-based devices (e.g. those that use inexpensive isothermal nucleic acid amplification on a paper substrate) (Mao et al. 2020b). However, available data indicate that current isothermal amplification of SARS-CoV-2 lacks the required sensitivity and throughput and still requires sample concentration prior to analysis. CRISPR (clustered regularly interspaced short palindromic repeats)-based isothermal RNA detection assays have been developed to help overcome some of these issues but are expensive, and their sensitivity remains to be fully evaluated (Broughton et al. 2020; Huang et al. 2020). Small-scale lab-on-a-chip biosensor devices, which use a bio-recognition element (e.g. antibodies, aptamers, peptides, proteins) to generate physicochemical signals (optical, electrochemical, etc.), are increasingly being developed for pathogen detection (Ryan et al. 2017; Cesewski and Johnson 2020), including for Cryptosporidium (Luka et al. 2019) and SARS-CoV-2 (Funari et al. 2020; Mavrikou et al. 2020; Qiu et al. 2020; Seo et al. 2020). Biosensors have the potential for rapid, real-time WBE and have been applied to wastewater (Yang et al. 2017), but still present many technical challenges, including sensitivity, specificity and detection limits (Ryan et al. 2017; Cesewski and Johnson 2020; Mao et al. 2020c).

Moreover, as discussed above, we need to better understand the infectiousness, half-life and survival of various pathogens in wastewater, as well as travel time to the treatment facility, per-capita water use and the effectiveness of various WWTP processes and disinfection technologies (chlorine, UV, ozone, etc.) in removing a wide range of pathogens, to better inform computational models (Ahmed et al. 2020a, 2020d; Hart and Halden 2020; Mandal et al. 2020).

Conclusions

WBE has the potential to be a powerful and effective early warning tool for community-wide monitoring of public health. However, improved assays for pathogen concentration, detection, quantitation and infectivity are needed for continuous and systematic monitoring of WWTPs. WBE also needs to be integrated with clinical testing, case reporting and public health campaigns, including coordination of testing methods, so that data generated from WBE and clinical testing are comparable. Recently, a global COVID-19 WBE Collaborative project has been launched (www.covid19wbec.org/) in collaboration with the Sewage Analysis CORe group Europe (SCORE) network and the Global Water Pathogen Project to coordinate methodological research and reporting on WBE. Based on this precedent, similar collaboration on the monitoring of protozoan and other pathogens in wastewater is highly desirable. Molecular sequencing and typing of pathogens in wastewater hold great promise for identifying sources of infection and determining transmission dynamics.