The scientific revolution has led to many amazing advances for humankind: from modern computers to space travel and RNA vaccines, our lives have been drastically affected by scientific discoveries made during the last 500 years. In his bestseller Sapiens: A Brief History of Humankind, Yuval Noah Harari argues that the key to unleashing this success can be found in the connection between science and technology: “We often think that it is impossible to develop new technologies without scientific research and that there is little point in research if it does not result in new technologies.” To some, this may imply that basic research is unnecessary unless it serves technological development and growth.


The productive relationship between science and technology was pointed out in the seventeenth century by Francis Bacon with his famous scientia potentia est aphorism and has since proved its power time and again. But has it gone too far? Is science becoming a mere servant to technology? For many of us grant-writers, it seems that basic scientific research is more likely to be funded if a convincing ‘broader impact’ argument that illustrates potential technological advances is provided. Similarly, scientific publications often begin with ‘motivation’ paragraphs that relate the reported work to some technology, such as drug discovery or renewable energy conversion. It appears that scientists must continuously justify the applicability of their basic research.

Although there are many reasons why scientists should keep evaluating whether new technological advances may emerge from their science, we argue that the scientific process itself should largely be decoupled from such considerations, to give researchers greater freedom to mine for new ideas in areas that are not directly connected to obvious applications. Narrowly focused research may limit our progress, as Arthur L. Schawlow, one of the co-inventors of the laser, warns us: “We had no application in mind. If we had, it might have hampered us and not worked out as well” (https://go.nature.com/3mQgXkw). To illustrate our point, we outline several examples of fundamental chemistry findings that had no obvious application at the time of discovery. We follow how these discoveries later became key ideas for significant technological advances. Our examples clearly show that a strong connection exists between science and technology. But they also highlight the problem: resources reserved for basic research are often scarce. We suggest that the equilibrium between basic and applied research should be pushed towards basic research, and we propose ways to achieve this.

Despite the present-day excitement concerning the electronic and optoelectronic applications of nanocarbons, their discovery and the early studies on them were driven by fundamental science. For example, buckminsterfullerene (C60) was discovered in an attempt to understand the origins of carbon-based structures that astronomers had observed in interstellar space. To provide evidence that such carbon structures are formed in red-giant stars, Richard Smalley and Harry Kroto simulated the environment of hot stars using lasers to generate local temperatures exceeding tens of thousands of degrees Celsius. To their surprise, they discovered a new carbon allotrope made up of five- and six-membered rings. The unusual shape of C60 and other ball-like fullerenes discovered afterwards inspired further fundamental studies that eventually paved the way towards the discovery of new carbon allotropes, such as carbon nanotubes and nanographenes. Fundamental studies of the shape-dependent mechanical strength, electronic conductivity and optical response of various nanocarbons have opened the door towards numerous present-day applications, including electrodes for batteries, photodetector materials, biosensors and carbon-fibre materials for bicycles, cars and planes.

Similarly, the story behind the discovery of green fluorescent protein (GFP) reflects the fascinating devotion of scientists in their pursuit of fundamental knowledge. It begins with Osamu Shimomura’s curiosity about “the brilliant luminescence” of the jellyfish. He spent two decades, with his colleagues and family members, catching hundreds of thousands of jellyfish in order to collect enough material to elucidate the mechanism of luminescence. Notably, this work was not driven by the thought of any particular application; the research was guided by fundamental questions rather than by desired practical outcomes.

The first person who recognized the potential of using GFP as a gene marker, Douglas Prasher, could not convince the National Institutes of Health to support his work (https://go.nature.com/3YWTwn8). With only limited research funding available, Prasher managed to clone the GFP gene and shared it with Roger Tsien and Martin Chalfie, who later successfully demonstrated the applicability of GFP as a fluorescent marker of gene expression. Prasher lost his job in science and became a courtesy-shuttle-bus driver, a cautionary tale of how our current academic review process can fail to recognize impactful science; Tsien and Chalfie went on to share the 2008 Nobel Prize in Chemistry with Shimomura.

Paradigm-shifting discoveries are rarely appreciated in their early stages, and the long-term impact of a scientific study cannot be predicted using blunt metrics such as journal impact factors or other citation-based quantifiers1. Today, a wide range of GFP mutants have been engineered to improve the brightness and to change the colour of the fluorescence. From its modest origins as an obscure protein found in jellyfish, GFP has become a versatile tool to shed light on gene expression in cells. And even more than that: mutagenesis and engineering of GFP have brought about new protein-based fluorescent sensors that allow real-time measurements of various parameters of cell physiology, such as pH and membrane voltage, the levels of species such as calcium ions, ATP, NADH and reactive oxygen species, and the activities of several enzymes.

It can be argued that electrochemistry has historically been considered a niche area of chemistry, a view perhaps reinforced by the fact that the first Nobel Prize honouring electrochemistry since Jaroslav Heyrovsky’s 1959 award for his work on polarographic methods was awarded only a few years ago, in 2019. Today, electrochemical research is at the forefront of applied chemistry, driven by the need for high-energy-density batteries that can power our mobile phones, laptops and electric vehicles. As with many other technologies, the early experiments that led to the discovery of Li-ion batteries can be traced to fundamental scientific work that many at the time may have considered to be useless. For example, when the electrochemically driven intercalation of ions into carbon-based electrodes was first discovered, it was considered a nuisance because it involved chemistry of the carbon electrodes themselves rather than the chemistry occurring at their surfaces. Jürgen Besenhard was nonetheless intrigued by this behaviour and published a series of studies investigating the factors that control the intercalation chemistry2. This fundamental knowledge served as important groundwork for Akira Yoshino’s development of the Li-ion intercalating anodes found in present-day Li-ion batteries. Similarly, the years of experiments on intercalation into layered chalcogenides by Theodore Geballe and others3 are key fundamental studies that have underpinned M. Stanley Whittingham’s and John Goodenough’s development of Li-ion intercalating cathodes for Li-ion batteries.

Organic chemistry is rife with haphazard discoveries that ultimately give rise to important applications. This is perhaps not so surprising for a discipline whose very beginning was marked by serendipity, considering that the first synthesis of an organic compound (urea, 1828) was the unintended outcome of Friedrich Wöhler’s effort to prepare pure ammonium cyanate. The history of organic synthesis contains a long list of important compounds synthesized by chance, against the expectations of the researchers. The discovery of crown ethers was a landmark that marked the beginning of supramolecular chemistry and the subsequent development of nanotechnology and molecular machines. And yet, it was kickstarted by a serendipitous observation by Charles Pedersen, whose curiosity was ignited by a 0.4% impurity in a reaction product, caused by contamination of the starting material. As a leading industrial chemist, Pedersen knew that the impurity would not serve the applied aims of the initial research; however, curiosity prevailed and, in his own words, “one of my first actions was motivated by aesthetics more than science” (https://go.nature.com/42hHQhA). Twenty years later, this attitude was rewarded with the Nobel Prize in Chemistry (shared with Donald J. Cram and Jean-Marie Lehn).

Numerous useful chemical reactions are the surprising results of research initially driven by another focus; the most prominent examples are Wittig olefination and Brown hydroboration (both of which earned Nobel Prizes for their discoverers), as well as the Friedel–Crafts reaction (with its immense industrial application). Derek Barton, a Nobel laureate who discovered several new reactions, considered the most important of them to be accidental discoveries (notwithstanding the fact that “chance favours only the prepared mind”, as Louis Pasteur suggested). In line with the distinction between ‘normal’ and ‘revolutionary’ science, as outlined by Thomas Kuhn in his book The Structure of Scientific Revolutions, reactions developed by systematic effort tend to be associated with, or extensions of, known principles, whereas unexpected breakthroughs may lead to novel insights into chemical reactivity. Without neglecting the importance of having a research plan, narrowing its scope with the burden of applicability and purpose may limit the chances of serendipitous discovery4.

Perhaps the most transformative among recent chemistry discoveries is the CRISPR method for gene editing, developed by Emmanuelle Charpentier and Jennifer Doudna, for which they shared the 2020 Nobel Prize in Chemistry. As is often the case, the groundwork for these discoveries was laid by a fundamental study fuelled not by an application, but by Francisco Mojica’s curiosity over strange gene sequences he discovered in bacterial DNA. Even though he struggled (for quite a while without success) to attract the attention of the scientific community, publish in high-impact journals or secure funding for his projects, Mojica kept pursuing the mystery behind these unusual repeating gene segments, which are separated by spacers of differing sequence. The puzzle was solved thanks to Mojica’s sequencing efforts and database searches: the unusual sequences are part of the bacteria’s defence mechanism against viruses. Genes adjacent to the repeats code for the protein that cuts genetic material, while the spacers match fragments of viral DNA and direct the cutting protein to the parts of the viral genome with the corresponding (that is, complementary to the spacer) sequence. Combined, these elements enable the bacteria to recognize and break apart genetic material injected by viruses. The subsequent simplification of the technique by Charpentier and Doudna has created a powerful tool for gene editing, with the potential to revolutionize food production, pest control and, most notably, medicine. Indeed, previously unimaginable avenues have opened up in organ transplantation, regenerative medicine and the treatment of inherited diseases: recently, a CRISPR-based treatment was administered inside the human body for the first time to treat hereditary blindness5. The financial potential of gene-editing technology is estimated to run into the hundreds of billions of dollars.

All of these examples point to an important fact: many of the most important technologies have been derived from fundamental scientific knowledge, which ultimately enables practical applications and is essential for their development. The COVID-19 global health crisis provides us with two cautionary tales. First, fundamental research on mRNA vaccines allowed anti-COVID-19 vaccines to be developed at ‘warp speed’, but the pioneers in the field struggled for funding in the early stages of their discoveries: Robert Malone, the first to use positively charged liposomes for mRNA delivery into cells, saw his 1996 proposal to develop an mRNA-based vaccine against coronavirus infections rejected6. The rejection was likely because the proposed research was considered inapplicable, owing to the well-known instability of RNA, so he switched his focus to DNA vaccines, which showed greater promise at that time. Second, and in sharp contrast to cardiovascular diseases, diabetes or cancer, research in large pharmaceutical companies has typically neglected viral diseases (except for flu, HIV and hepatitis C virus, which are widespread in the developed world, and, more recently, Ebola, whose high lethality and virulence threatened to quickly transform a local epidemic into a global health issue). The coronavirus outbreaks of 2003 (SARS) and 2012 (MERS) instigated research in this domain; however, it subsided as soon as their epidemic potential was estimated to be low7. Continued research in this realm might have enabled us to tackle the recent pandemic with a portfolio of anti-COVID drugs already at hand.

Changes to the current science–technology equilibrium to favour curiosity-driven fundamental research can be made in the long term and will require support from many areas, not least the public. This is most obvious in the energy field, where the increased scientific literacy of non-scientists regarding the role that greenhouse gases play in climate change has resulted in significant investments in fundamental chemistry research. Climate activists, such as Jamie Margolin and Greta Thunberg, have inspired many citizens to put pressure on companies and governments to do better and, because cost-competitive renewable energy technologies are not yet available, the response has been to make significant investments in fundamental research in the areas of carbon capture and utilization, solar energy, energy storage and other renewables. For example, the Basic Energy Sciences division of the US Department of Energy proposed a 2022 budget to the Senate that requests a 2.4% increase in basic research investments in climate change and clean energy (https://go.nature.com/3LstyVh).

Moreover, academic institutions play an important role in the education of the public as well as in the promotion of basic research. Although the academy has traditionally been a stronghold of fundamental research, in a race for reputation and money many universities have introduced a corporate-like culture in which sheer output is often valued over good research. This is detrimental for fundamental science, which often takes time to bear fruit in terms of citations, h-index, funding and possible applications, as sadly exemplified by the aforementioned Douglas Prasher who, instead of sharing a Nobel Prize, continued his career as a courtesy driver. Instead of being “obsessed with their position in global rankings”8, academics should preserve the spirit of curiosity and serendipity that maintains the vitality of research and unexpectedly brings about ground-breaking discoveries in unpredicted realms. Scientists should be encouraged (and funded) to look for inspiration wherever they might find it, even in the stains of inadvertently spilled coffee; years later, a deep analysis of such a mundane phenomenon may bring about both citations and applications9. This would help the students of today and the policymakers of tomorrow to become more aware of non-material values in science and, consequently, create a more appropriate balance of fundamental and applied research in funding schemes.

In practical terms, the change can be implemented by introducing policies that fund basic research and support public–private partnerships. For example, a recent report by the IMF (Research and Innovation: Fighting the Pandemic and Boosting Long-term Growth) points out that “basic scientific research in advanced economies is underfunded”. The report also estimates that a 10% increase in basic research investments would increase economic productivity by 0.3%, meaning that investment in basic research would start to pay off within a decade. The pay-off is expected to be even higher in medicine, where every dollar of federal investment is projected to yield 8 dollars in economic growth10. Some countries recognize the value of investing in basic research. For example, South Korea’s basic research funding is on a 5-year plan to double by 2022 (to US$2 billion)11. China has also announced a 5-year plan with significant increases in basic research spending, which comes on top of the impressive increase in research spending that the country has experienced since 199512. However, some countries are slow to recognize the need for basic research. The US Congress is currently considering a bill that, if approved, would make significant changes to the National Science Foundation, the country’s main sponsor of basic research. The change would give the organization a new name, the National Science and Technology Foundation, and a significant boost in funding (a $100 billion increase over 5 years). Although an increased budget is very welcome, there is some concern that the ‘upgraded’ agency would steer funding resources away from basic research towards applied research and technology development13.
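To see why a 0.3% productivity gain can repay a 10% boost in basic research spending within roughly a decade, consider a back-of-the-envelope estimate; the figure of basic research spending amounting to about 0.5% of GDP is our illustrative assumption, not a number taken from the IMF report:

\[
\underbrace{0.10 \times 0.005\,Y}_{\text{extra annual spending}} = 0.0005\,Y
\qquad \text{versus} \qquad
\underbrace{0.003\,Y}_{\text{annual output gain}}
\]

where $Y$ denotes annual GDP. Once the productivity effect is fully realized, the yearly gain is roughly six times the yearly cost, so even if the effect takes several years to build up, the cumulative benefit overtakes the cumulative cost well within ten years.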

In his book The Usefulness of Useless Knowledge, Abraham Flexner, the founder of the Institute for Advanced Study in Princeton, also emphasized: “Institutions of learning should be devoted to the cultivation of curiosity and the less they are deflected by considerations of immediacy of application, the more likely they are to contribute not only to human welfare but to the equally important satisfaction of intellectual interest which may indeed be said to have become the ruling passion of intellectual life in modern times.” So, to the members of an evaluation panel who wish to know what we are going to discover, by what methodology, and when, we may reply: “If we knew what we were doing, it wouldn’t be called research, would it?” If the panel finds such an answer unacceptable, even insolent, may we remind them that this quote is attributed to Albert Einstein; one should think twice before rejecting his opinion as nonsense.