Being infected with a respiratory virus doesn’t necessarily imply being infectious. A rapid answer to whether someone may be infectious is more useful than waiting for a definitive result1. Frequent testing requires diagnostics that are easy to use, and cheap, robust technology that can be scaled up quickly when needed. These are some of the lessons being learned as the COVID-19 pandemic evolves.

Inserts, embedded in a flexible device, with freeze-dried genetic circuits for the colorimetric detection of pathogens when the device is exposed to a contaminated splash. Reproduced with permission from ref. 7, Springer Nature America, Inc.

Early in the pandemic, in many geographies the rising case counts correlated more closely with the processing capacity for PCR tests, whose availability was highly constrained, than with the actual spread of the virus. Antigen tests, which are less sensitive yet cheaper and more rapid, came to market too late in most countries (in the United States, the Food and Drug Administration provisionally authorized the use of the first over-the-counter antigen test in mid-December 2020), partly because their utility was considered debatable: the high rates of false negatives expected when these tests are used to detect early infection were deemed problematic. Although it quickly became evident that infectiousness correlates with viral load in the respiratory tract2, governments insisted on the need for PCR testing; yet learning that someone was infected only after they had passed on the infection did little to cut transmission chains. Similarly, shortening the standard quarantine period without requiring a rapid test for infectiousness, as recently recommended by the United States Centers for Disease Control and Prevention, may not serve public health well; some individuals remain infectious 5 days after the start of self-isolation, particularly if infected with the Omicron variant3.

In contrast, the Test-to-Treat initiative launched this month by the Biden administration emphasizes the need for rapid testing to maximize the utility of antiviral pills (in particular, Pfizer’s protease inhibitor Paxlovid and Merck’s ribonucleoside prodrug molnupiravir) that, when taken soon after symptom onset, substantially reduce the odds of COVID-19 progressing to severe illness and hospitalization. In England, any individual has been able, since April 2021, to order free rapid COVID-19 tests and take them twice a week (free mass testing for the general public is ending this month, however).

When public health is imperative, as it is during a pandemic, the readiness and availability of suitable diagnostic technology strongly determine the types of public-health intervention that governments can deliver. When cheap and rapid tests are not widely available or accessible, severing transmission chains and limiting superspreading may require imposing generalized quarantines. Slow scale-up of testing, because of constraints in the availability of reagents or technology (as occurred early in the COVID-19 pandemic with PCR diagnostics, and as continues to be the case with nucleic-acid sequencing of viral samples to detect variants of the virus), can accrue enormous long-term costs, not only in lives, but also economically and societally. An overcautious regulatory environment, a misguided understanding of the utility of diagnostics, a lack of standardized protocols for the collection, aggregation and reporting of results, and many other forms of unpreparedness for a new infectious-disease outbreak can also have extraordinary long-term consequences.

Learning from the mistakes of the past two years is, therefore, crucial. Testing tools should be designed, properly validated and suitably regulated as public-health tools rather than solely as medical technology or as devices to protect their users. Three examples illustrate this point: (1) the most successful smartphone-enabled exposure-notification systems, which were shown to avert cases4, were designed for privacy, broad compatibility and wide adoption; (2) most prediction models (many based on artificial intelligence) for COVID-19 diagnosis or prognosis in patients, or for the detection of people at increased risk of infection5,6, were not fit for clinical use, owing to poor data quality or data curation, or to avoidable errors in the training of the models (one such error is sketched below); and (3) the utility of at-home lateral-flow antigen tests for determining when an individual should self-isolate went unappreciated for too long.
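To make point (2) concrete, the sketch below illustrates one widely reported and avoidable training error, information leakage, on entirely synthetic data; the dataset, features and model are hypothetical and are not drawn from the studies cited above. Selecting features using the full dataset before cross-validation lets label information leak into the evaluation and inflates the apparent accuracy, whereas refitting the selection inside each training fold returns the honest, near-chance estimate.

```python
# Illustrative sketch of an avoidable training error (hypothetical data, not
# from the cited COVID-19 prediction studies): information leakage caused by
# selecting features on the full dataset before cross-validation.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2000))   # 60 "patients", 2,000 pure-noise "biomarkers"
y = rng.integers(0, 2, size=60)   # random "outcomes": there is no real signal

# Leaky protocol: features are chosen using *all* labels, then cross-validated.
X_leaky = SelectKBest(f_classif, k=20).fit_transform(X, y)
leaky = cross_val_score(LogisticRegression(max_iter=1000), X_leaky, y, cv=5)
print(f'leaky accuracy:  {leaky.mean():.2f}')   # spuriously high

# Honest protocol: feature selection is refitted inside every training fold.
pipe = make_pipeline(SelectKBest(f_classif, k=20),
                     LogisticRegression(max_iter=1000))
honest = cross_val_score(pipe, X, y, cv=5)
print(f'honest accuracy: {honest.mean():.2f}')  # close to chance (~0.5)
```

The same reasoning applies to any preprocessing step: anything fitted on data that the model will later be evaluated on will bias the reported performance upwards.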

As for diagnostic devices, the COVID-19 pandemic has heightened the need to design for faster time-to-result, usability, lower cost, portability, adaptability, rapid scale-up, robustness and accuracy. Recent advances make it possible to meet all of these requirements to a satisfactory degree. A fast time-to-result doesn’t necessarily entail low sensitivity or the need for expensive centralized equipment or reagents. High accuracy doesn’t need to involve a complex protocol or a great deal of technical expertise.

What does the future hold? Possibly, the availability of ubiquitous technology to detect exposure, infection and infectiousness; separately, and with little friction. Sensors of exposure might be embedded in textiles — freeze-dried paper-based sensors (pictured) are being designed for such purposes7, and could be adapted to detect multiple disease biomarkers in exhaled aerosols or in sweat, as Yuan Lu notes in a News & Views article in this issue of Nature Biomedical Engineering. The issue also includes a report of the findings of double-blinded studies of paper-based tests for the diagnosis of the Zika and chikungunya viruses in serum samples, and of a companion portable device designed for use in low-resource settings. Paper-based cell-free colorimetric assays can also be used to accurately identify pathogens in amplified RNA from saliva samples by using ribocomputing systems (which use RNA molecules as input and proteins as output) with sequence-independent molecular logic, as shown in another Article in this issue.
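As a conceptual sketch only, the logic layer of such a ribocomputing assay can be pictured as boolean gates evaluated over which RNA triggers are present in the amplified sample, with a TRUE output standing in for expression of a colour-producing reporter; the trigger names, gates and panel below are hypothetical and are not taken from the Article.

```python
from typing import Callable, Set

# A "gate" maps the set of RNA triggers detected in a sample to TRUE/FALSE;
# TRUE stands in for expression of a colour-producing reporter protein.
Gate = Callable[[Set[str]], bool]

def AND(*triggers: str) -> Gate:
    return lambda present: all(t in present for t in triggers)

def OR(*triggers: str) -> Gate:
    return lambda present: any(t in present for t in triggers)

def NOT(trigger: str) -> Gate:
    return lambda present: trigger not in present

# Hypothetical panel: the gate wiring is independent of the trigger sequences,
# so the same logic can be rewired for different pathogens.
panel = {
    'pathogen A': AND('A-amplicon-1', 'A-amplicon-2'),  # both targets required
    'pathogen B': OR('B-amplicon-1', 'B-amplicon-2'),   # either target suffices
    'valid run':  NOT('carryover-control'),             # flag contamination
}

detected = {'A-amplicon-1', 'A-amplicon-2', 'B-amplicon-2'}  # example readout
for name, gate in panel.items():
    print(f"{name}: {'colour change' if gate(detected) else 'no colour'}")
```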

To detect actual infection, most tests may become portable, inexpensive, fast and sensitive. Another Article in this issue reports a 20-minute one-pot CRISPR-based assay8 for nucleic-acid testing of nasopharyngeal samples that is as sensitive as PCR and that allows for flexibility in assay design. And an electromechanical chip leveraging DNA structures immobilized on field-effect transistors and functioning as molecular cantilevers, as described in another Article, also promises wide applicability in the sensing of biomolecules; the chip approaches single-molecule sensitivity with unprocessed biofluids, and detection occurs within minutes.

Determining infectiousness may become more precise. Rapid point-of-care tests leveraging lab-on-a-chip and CRISPR technologies might simultaneously quantify9, in a few steps, viral levels and the levels of host antibodies (and hence seroconversion), or determine10 the actual viral strain. These tests may use lyophilized reagents (and hence forgo the need for cold storage) and might not require heating at high temperatures (detection at body temperature, or even at room temperature, may provide reasonable accuracy).

The hope is that such new diagnostic technology makes it into actual products that help the world prepare to quash future outbreaks of infectious diseases. Of course, rethinking diagnostics won’t be sufficient; measures to reduce economic and societal inequalities, to stifle misinformation and to correct for knowledge gaps also require rethinking. Pandemics are seismic; they should be met with systemic preparedness.