
Biases and Feedbacks in the Knowledge System: from Academia to the Public and Back

Published online by Cambridge University Press:  19 October 2023

Yonatan Dubi*
Affiliation:
Department of Chemistry, Ilse Katz Center for Nanoscale Science and Technology and School for Sustainability & Climate Change, Ben-Gurion University of the Negev, Beer Sheva 81040501, Israel. Email jdubi@bgu.ac.il

Abstract

In the philosophy of science, there are multiple concepts trying to answer the question of how scientists ‘know’ things, all circling around the notions of observation, thesis, falsification and corroboration – namely, the usual concepts of scientific practice. A wholly different question, however, is ‘how does the public know things?’. Understanding the answer to this question is crucial, since (at least in Western democracies) the public is the entity which funds, and through funding directs to a certain extent, the course of science. Here I discuss ‘the knowledge system’, a concept (proposed by the American writer Alex Epstein) which can be thought of as the set of institutions and processes through which the public becomes knowledgeable about certain (scientific) topics. I argue that the knowledge system contains two inherent and almost unavoidable flaws, namely (i) the accumulation of biases; and (ii) strong feedback loops. I demonstrate these flaws with some examples and show how they can (and already do) lead to policy suggestions that de facto abolish academic freedom. Finally, I discuss possible ways to overcome – or at least minimize – the effect of these flaws on science and the scientific community.

Type
Focus
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2023. Published by Cambridge University Press on behalf of Academia Europaea Ltd

Introduction: Truth and Knowledge

When I was young, I used to accompany my father to his office at the university (Ben-Gurion University, the same university where I am now a professor). The long corridors, the strange machines (computers) and the people with concentrated faces made a great impression on me. Yet, for the life of me, I could not understand what it was they were doing there. To me it was just ‘Dad’s workplace’. Finally, when I was a bit older (maybe 12 or 13), I remember asking my dad what this place was, what ‘a university’ is. ‘Well, my son,’ Dad answered with a serious face, ‘the university is a place that accumulates knowledge.’ ‘OK,’ I answered, ‘but how does it do it?’ ‘Well,’ replied Dad, still keeping his face serious, ‘when students come to learn at the university, they know very little. When they leave the university, they know absolutely nothing. The difference was accumulated by the university…’

This story (absolutely true, my mother can vouch for that) is funny, I hope. But it also makes one think more deeply – indeed, what is this place called a university? Of course, answers vary. Some would say it is a community of scholars (see, for example, the Kalven report from the University of Chicago (Kalven 1967)), others might argue that the key role of a university is the training of students. But many would agree that the telos, the heart and soul of a university, is scientific research or (more broadly) academic inquiry. But what is that? I think a good answer actually comes from my own university’s ‘code of ethics’ (Ben-Gurion University 2007), which states that:

The fundamental aim of the University is to seek, investigate and teach the truth, to promote all fields of knowledge and scholarship… (Emphasis added)

Two words which require explanation immediately stick out, namely ‘truth’ and ‘knowledge’. Let me dwell on these for a bit. ‘Truth’ is, of course, an immensely complex notion. Yet, in the realm of scientific scholarship, it has some relatively simple definitions. For example, in mathematics (which many, me included, do not consider a natural science but a field of its own) the definition of a ‘true’ statement is quite simple – anything which can be derived from the axioms. Of course, this definition has its own strangeness (for instance, the same statement can be true or false under different sets of axioms), but it works quite well as a definition of truth.

In the physical sciences, one can define as ‘true’ any statement which reflects observations about the natural world. Namely, the truth consists of the results of experiments and observations made in the most accurate and detailed way possible. Put differently, what is true is the face of nature (Spinoza’s facie naturae), namely the collective set of observations we have about the natural world. So, for example, if one performs an experiment, the report on that experiment (e.g. ‘this apple fell from that tree to earth’) is the truth. The statement ‘all apples fall to earth when they detach from the tree’ is a generalization, and the statement ‘all apples fall to earth when they detach from the tree because of the law of gravitation’ is already a hypothesis. And while some hypotheses are so entrenched in our life, experience and thinking that we confuse them with the truth, they are still only hypotheses. The difference may seem delicate, yet it is important.

Now that we have clarified the idea of truth (as much as one can clarify it in two paragraphs), let us think a bit about ‘knowledge’, or the question of ‘how do we know’. I steal some ideas here from the excellent lectures of Yale University’s Professor J. Michael McBride on organic chemistry (McBride 2009). Basically (according to McBride), there are four ways of knowing things. The first way is knowledge from the divine. If you have been lucky enough to learn things from the almighty him-/herself, then you can surely trust that you know what you have been told. Unfortunately, this way of knowing is quite rare and relatively hard to verify. A much more common way of knowing is obtaining knowledge from authority, for example by listening to the person who gained his knowledge through the aforementioned path. This way of knowing is far more widespread, and if generalized to any kind of authority (rather than only the authority of those who speak directly to the divine), it is probably the most common way of knowing things in the world.

However, neither of these two methods of knowing is particularly useful for science and scientific discovery (although the authority of teachers and textbooks is important to students when they start doing research). Rather, the scientific method is based on two other, very distinct ways of knowing. The first is, of course, knowing by observation. We know that something is so by measuring it. Measuring is simply observing facie naturae, which makes an interesting connection between knowing by observation and the truth – if we learned something by observing it, then we have learned something truthful. Thus, a scientist can know the shape of gold nanoparticles, or the absorption spectrum of methane, by observing and measuring them. This way of knowing (which goes back to Francis Bacon) is essential to the scientific method.

The fourth way of knowing is through logic. We know things because they are logical consequences of other things which we know (we may have learned those by observation, or from the divine; it doesn’t matter, as long as we know them). Logic is what guides us when we make inductive generalizations, from observing one apple falling off a tree to saying ‘all apples which are detached from the tree will fall’ (using the logical assumption that there is nothing special about our apple), and logic is what guided Newton in generalizing from apples to a complete theory of gravitation.

The scientific method, which aims at understanding the natural world, is based on these last two ways of knowing. It starts with some observation; scientists then use logic to formulate a thesis that explains this observation (i.e., what caused it), and use logic again to derive from that explanation a prediction that experiments can verify. They then test the hypothesis by further observation, which can either falsify the thesis or corroborate it. The new observation then becomes the basis for yet another thesis, and so on and so forth.

This is the never-ending cycle of scientific inquiry (in a nutshell). It has worked quite well, and still does, at the level of a single scientist sitting in her lab or office, thinking about why the world behaves the way it does. However, scientists and their ways of knowing things are only a small part of the world of knowledge. After all, most of us still know things, even about scientific matters, although we are not scientists. How do we know things?

Epstein’s Knowledge System

The question of ‘how the public knows things about science’ is an old one, discussed in the sociology of knowledge, for example by Karl Mannheim (1936) and Robert K. Merton (1937) (see also Hamilton 1974), who also deal with the roles of ideology and relationism. It was recently revisited in a popular book by Alex Epstein (2022). Epstein suggests that knowledge on scientific topics reaches the public through a system of institutions and processes which he calls ‘the knowledge system’, composed of four central parts.

The first link in the knowledge system chain is the researchers. These are the professionals who directly engage in the scientific method of inquiry and focus on knowing from observation or logic (or both). They typically work at universities, government research centres or R&D groups in industry, and publish their results in the scientific literature (which typically only other scientists read). The second link in the chain is the synthesizers. As the amount of scientific knowledge becomes ever larger, synthesizers are the entities which take the scientific knowledge in a certain field and synthesize it into a single framework – typically a report (sometimes a book, or another document of that sort). Synthesizers are typically appointed specifically for this purpose. An important example of a synthesizer is the IPCC, the Intergovernmental Panel on Climate Change, the UN body created by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP) in order to synthesize scientific knowledge on climate change; its assessments inform the UNFCCC (United Nations Framework Convention on Climate Change). Once every few years the IPCC publishes a report on climate change (known as an assessment report, or AR), which is widely discussed and is considered probably the central document portraying our understanding of the climate.

Of course, there are many others (and I am still talking about climate): the national climate report from the US National Centers for Environmental Information, the OECD environmental working group, and many more (essentially every government in the Western world has a climate change synthesizer). Synthesizers can also come from industry – an example is the Lazard Climate Center (Lazard 2020).

Of course, climate is just one example out of numerous others – in essentially every aspect of modern life where science is relevant there are synthesizers, most notably for health issues, but also for various regulatory issues and other topics of public interest. From private institutions to governments to the UN, synthesizers form a crucial step in the knowledge system, as they are charged with addressing a specific topic, surveying the scientific literature (which is sometimes vast and always inaccessible to the general public) and asking: out of all this knowledge, what is relevant, what is important and what is valid?

The third link in the knowledge system chain is the disseminators. These are the bodies that take the reports of the synthesizers (and sometimes from the researchers themselves) and translate them into information that the uninformed, non-professional public can absorb. I am mainly talking about the media: news outlets, TV channels, leaders in social media, etc. Of course, the synthesizers themselves can have disseminators – spokespersons, for example (see the discussion below about the IPCC’s summary).

One may ask: why doesn’t the public simply read the synthesizers’ reports? The answer is that, even though they are slightly easier to read than the original scientific literature, these reports are typically still challenging for the non-expert. Again, let us take the IPCC as an example. The last IPCC assessment report (AR6) was composed of three sub-reports: (i) the physical science basis; (ii) impacts, adaptation and vulnerability; and (iii) mitigation of climate change. Each of these sub-reports contains thousands of pages (the 2021 physical science basis report has 2409 pages including titles (IPCC 2021a)) and is written in technical language. They are very hard to read, and therefore the IPCC itself publishes a ‘summary for policymakers’ (32 pages for the 2021 AR6 physical science basis report (IPCC 2021b)). However, the public often will not read even this part – but journalists hopefully do, and can disseminate its content to the public via their news outlets.

The final link in Epstein’s knowledge system is the evaluators. These are the people who are supposed to tell the public what to do, or which policy would be most desirable in light of the gathered knowledge. While Epstein gives the example of newspaper editors as evaluators, I believe that this needs to be taken a step further – the true evaluators are the politicians, the policymakers themselves. This is because, in many cases, the public is informed by the politicians (think about President Obama asserting to the nation that ‘the science [i.e. climate science] is settled’ and that 97% of scientists agree (Obama 2013)). Much of what the public knows about matters with scientific content comes from listening to politicians.

Biases and Incentives in the Knowledge System

In Epstein’s description, the knowledge system is linear: the researchers provide knowledge to the synthesizers, who provide knowledge to the disseminators, who provide knowledge to the evaluators (see Figure 1). Before I go on to describe in more detail why I think this is not a full description of the knowledge system, it is important to recognize that even as a linear structure, the knowledge system is far from being a perfect system. The reason is that within every link, there are biases and incentives which may skew the output of that link.

Figure 1. Alex Epstein’s ‘knowledge system’

Let us naturally start with the researchers. Clearly, they are prone to a great many biases and incentives. The public may hold a naïve image of the scientist as an objective, immune-from-bias scholar, yet this could not be further from the truth. This has been widely discussed in the history and philosophy of science; for example, in their historical analysis of the concepts of subjectivity and objectivity, Daston and Galison (2007) emphasize their mutable and intangible nature and multiple meanings, as well as the continued influence of biases. Scientists are first and foremost human beings, and they are susceptible to biases like any other person. One of the central biases is the so-called ‘confirmation bias’ (Guiney et al. 2020), which drives researchers to find evidence that corroborates their theory rather than falsifies it, but this is just one example out of the many cognitive biases that affect researchers (Nuzzo 2015).

Researchers are also affected by academic pressure, mainly in the form of publication pressure (Kiai 2019), which may lead to fast publication at the expense of deep (and sometimes crucial) verification of results. Some claim that publication pressure (which in the modern scientific establishment is tied to funding, promotions and status) has led to the replication crisis (Rajtmajer et al. 2022).

Another form of incentive is peer pressure. While the public may have an (again, mostly false) image of the scientist as a dissident, a scholar who cares only about the scientific truth at all costs, in reality scientists – like everyone else – want to fit in, and typically do not wish to go against their community. This is probably why the scientists who did so are so commemorated and celebrated.

Researchers are also highly biased by their own political views. This is not much of an issue if one is researching plasmonic photocatalysis or the chiral-induced spin selectivity effect (which are some of the things I am interested in), but if you are researching climate change, psychology or economics, your political background may well bias which projects you pursue and even how the results of the research are presented.

Finally, a clear source of bias is funding. In the modern world, science is mostly funded by governments (except perhaps research in machine learning, which is mainly funded by private companies such as Google, Microsoft, Facebook and Amazon). The researcher may therefore be limited by the funding source. As an example, consider a climate scientist who is interested in the effect of, say, sunspots or the solar wind on the Earth’s climate. If he is funded by a grant which specifically calls for research on the role of methane in the Earth’s climate, he will abandon his passion and focus on methane. And if his research was funded to find evidence that methane has an important role in the Earth’s climate, then he will most likely find that this is indeed so. Why? Not because the researcher is wrong, but because such problems tend to be very complicated, with evidence both supporting and detracting from the claim. However, if a climate researcher thinks that methane is not a big factor in the Earth’s climate, he will not submit the grant proposal in the first place (or, if he does submit, he will most likely not receive the funding). This is a very clear bias, which is both understandable and well known.

What are possible biases and incentives for synthesizers? Well, first, we should recognize that, in many cases, the synthesizers are literally the same people as (or, more accurately, a sub-group of) the ‘researchers’, and hence are subject to the same biases and incentives as researchers. However, synthesizers have their own additional set of incentives and biases, with two very important examples. The first is the need for public attention; in many cases, synthesizers address the public in one way or another, and wish to have as much public exposure and influence as possible. The second is what I call ‘self-preservation’. In many cases, synthesizers are appointed with a specific goal. An obvious example is, again, the IPCC, whose declared charter is to examine the dangers of (man-made) climate change. With a direct budget of millions of dollars, a global travel itinerary and world-renowned influence, there is a clear incentive for the IPCC to preserve itself, for instance by emphasizing results in its reports which support its a priori charter. Imagine a scenario in which there is little evidence at all for man-made global warming, or in which new, strong evidence emerges that global warming is, say, 90% natural. Can one imagine an IPCC report which says that ‘there is no longer a use for the IPCC’? I wish to clarify: I am not talking about (nor accusing anyone of) dishonesty or direct data ‘cherry-picking’ (although it is surely there). Rather, I am pointing out that this is an inherent incentive in many synthesizers, which may lead to biases.

The next level is the disseminators. Clearly, disseminators have a very different set of incentives (and hence biases) than both researchers and synthesizers. Two clear biases are political agenda and financial incentives. Political biases are very clear; for instance, a ‘right-wing’ reporter might choose not to report on a climate-related event, while her ‘left-wing’ colleague might choose to report any minor weather-related incident. The second incentive is as obvious as the first – after all, newspapers, as a hallmark example of disseminators, are typically commercial enterprises. They are therefore incentivized to maximize their readership by using, for example, exaggerated titles (‘click bait’) or by picking stories that do not represent the norm. Furthermore, a newspaper might choose not to publish a story not because it is unsound, but because it might offend its readership. Finally, a news outlet might simply do a lousy journalistic job in verifying some issue, simply because the story serves the agenda. One amazing and hilarious example is the widely repeated claim that, during the 2022 floods in Pakistan, one-third of the country was flooded. While it is not clear where this rumour came from, it was very widely cited in mainstream media (see, for example, Columbia Climate School 2022). Yet a simple check (UNOSAT 2022) would have led the journalists (and editors) responsible for the headline to see that Pakistan is a country of 796,095 square kilometres, out of which about 55,000 were flooded. Still a great disaster, with horrible damage to human life and property, but the flooded area is very far from a third – closer to 7% (see the simple arithmetic below). There are numerous other examples in modern-day media. There is nothing new or secret about these incentives, but one must now think of them in the new context of the press as a part of the knowledge system.
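The arithmetic behind that 7%, using the two numbers just quoted, is simple:

$$\frac{55{,}000\ \mathrm{km}^2\ \text{flooded}}{796{,}095\ \mathrm{km}^2\ \text{total}} \approx 6.9\%,$$

about a fifth of the claimed ‘one-third’.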

Finally, we have the evaluators. Clearly, they are biased by their own political agendas. Even more so, since the evaluators include politicians, they are also subject (at least in the Western world) to the need to be re-elected. They may therefore choose not to support a statement if it may hurt their chances of re-election, even if they believe it to be scientifically correct.

I stress again: I am not accusing anyone of dishonesty, cherry-picking, or anything of the sort (although these things clearly do exist). I am simply pointing out that the knowledge system is, after all, composed of individuals, and individuals are subject to biases, even if they are scientists or journalists (who are, at least in principle, perceived as objective). We are all human.

The Invisible Flaws of the Knowledge System

What I would like to do now is point out two inherent characteristics of the knowledge system which I believe were overlooked by Epstein. These characteristics amplify the problems with the knowledge system that were addressed by Epstein.

The first point is the accumulation of bias. Even if one considers the knowledge system to be linear, one must accept that biases will accumulate – and be amplified – as knowledge progresses through the system. To illustrate this point, let us take a theoretical example. Assume that there is an interesting theoretical debate on whether walking barefoot prevents heart disease. This is a well-defined scientific question, but answering it is rather difficult, because many factors contribute to heart disease, so the multivariate analysis is not simple, nor is it easy to conduct well-controlled experiments. The scientific community is divided, say 70–30, in favour of barefoot walking.

Now, the Ministry of Health wants a report on heart disease, and one section is devoted to the connection with walking with or without shoes. It is likely that the head of this committee is one of the 70%, since they are more numerous and possibly more publicly visible. She is naturally biased, and therefore the report reads not 70–30, but more like 85–15. By some chance a journalist reads the report, and maybe even calls the head of the committee for an interview, where he hears something like ‘yes, some people think there is no relation, but they are a minority – the consensus is that barefoot walking prevents heart disease’. The journalist might then write a long front-page story about the benefits of walking barefoot (or, more likely, the dangers of walking with your shoes on), citing the committee head. Now, what are the chances that some politician who reads the article will propose a new bill, forcing everyone to walk barefoot for at least an hour a day?

I know, it’s a made-up story, with a made-up scientific thesis and a made-up storyline. But is it so unrealistic? Our daily experience tells us that this is a rather realistic scenario. And notice – no one lied, no one cherry-picked, no one was an evil conspirator – it’s just a flaw in the knowledge system combined with human nature.
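For the quantitatively inclined, the accumulation can be expressed as a toy model. This is entirely my own illustrative construction, not Epstein’s: the per-link ‘bias factor’ is an arbitrary assumption, chosen here so that a single link turns the 70–30 split into the 85–15 of the story above.

```python
# Toy model of bias accumulation along the knowledge-system chain.
# Assumption (illustrative only): each link passes on the majority view
# amplified, shifting a fixed fraction of the minority share over to the
# majority, instead of reporting the 70-30 split faithfully.

def amplify(majority_share: float, bias_factor: float) -> float:
    """Shift a fraction `bias_factor` of the minority share to the majority."""
    minority_share = 1.0 - majority_share
    return majority_share + bias_factor * minority_share

share = 0.70  # researchers: the community's actual 70-30 split
for link in ("synthesizer", "disseminator", "evaluator"):
    share = amplify(share, bias_factor=0.5)  # assumed bias per link
    print(f"after the {link}: {100 * share:.1f} vs {100 * (1 - share):.1f}")

# Prints roughly 85-15 after the synthesizer, 93-7 after the
# disseminator and 96-4 after the evaluator: a divided community
# reaches the public as a near-unanimous one. No one lied anywhere.
```

The exact numbers mean nothing; the point is structural – any chain in which each link systematically leans towards the majority it inherited will drive a split opinion towards apparent unanimity.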

Another, possibly even more important, feature of the knowledge system is that it is, in fact, not a linear chain. Rather, each element in the knowledge system is multiply connected with the other elements in a complicated set of interplays and feedbacks, as depicted in Figure 2.

Figure 2. The knowledge system as a feedback system

Let us think about a few examples. First, consider the relation between researchers and synthesizers. Besides the obvious unidirectional connection – the synthesizers obtain their data from the researchers – there is the fact that these groups are composed of literally the same people, and the synthesizers interact with the researchers beyond the limits of their role as synthesizers. Thus, for instance, synthesizers can invite researchers to conferences (or vice versa), ask them to give expert testimony, or simply consult them in the university department hallway. Might this interplay generate possible biases in the synthesizers? Probably.

Next, consider the interplay between synthesizers and disseminators. Many synthesizers want their voices to be heard, so they depend on the disseminators for interaction with the public. The disseminators, in turn, need the synthesizers to explain the science to them. Of course, disseminators might (and often do) communicate with researchers who were not part of the synthesizing group. An interesting example is a recent paper written by researchers from the Weizmann Institute in Israel, which made headlines (Israel Government Press Office 2022). The headline states ‘Earth’s Climate Change Apparently More Rapid than Expected’. Yet, when one reads the research article itself (Chemke et al. 2022), the story is completely different. The authors actually demonstrated the inability of climate models to reproduce the observational data regarding winter storms in the southern hemisphere. A central conclusion of the original paper is: ‘The inability of climate models to adequately capture the storm-track intensification, which delays the detection of the intensification in models by several decades, questions the skill of climate models to accurately assess the future changes in the Southern Hemisphere extra-tropics’.

Why is this relevant to the disseminator–researcher interplay? Because the journalists interviewed the lead researcher, and he is widely quoted in the news story (which also appeared on the website of the Weizmann Institute). Yet the researcher did not correct the journalists’ mistake. Why? Interplay and feedback.

The interplay between disseminators and evaluators (plainly – journalists and politicians) needs basically no introduction and has been depicted, for instance, in many Hollywood movies. My favourite in this context is The Ides of March with George Clooney and Ryan Gosling. If you haven’t seen it, you must; it’s a beautiful and sad depiction of politics and has a very good example of the interplay between the press and the politicians.

Finally, the evaluators have interplay with both the synthesizers and the researchers, mainly through funding. In the modern world, most research funding comes from the state, and various programmes and initiatives funnel funding into topics which are of interest to the evaluators. This creates a critical feedback loop. One can actually see this feedback at play, and climate science is a good example. Up until the 1990s, climate and Earth science was a small and not especially prominent branch of the natural sciences. This changed as the topic of climate change rose in public attention (see Hansen 1988 for an example of interplay between researchers and evaluators). Money started to be shifted towards funding climate research (see Figure 3(a)), which in turn naturally led to an increase in the number of researchers in the field of climate science. Since these researchers need to publish their results, this rise in the number of researchers was followed by a rise in the number of journals on climate science and in the number of publications (Figure 3(b)), and – of course – in the number of citations in news outlets (Figure 3(b)). This is an example of a perfect feedback loop.
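The exponential growth visible in Figure 3(b) is exactly what such a loop produces. Here is a minimal sketch; the coefficients are invented purely for illustration (only the loop structure matters, none of the values are measured quantities):

```python
# Toy positive-feedback loop: funding -> researchers -> publications
# -> media attention -> next year's funding. All coefficients are
# illustrative assumptions, not data.

funding = 1.0  # research budget, arbitrary units
for year in range(1, 11):
    researchers = 2.0 * funding         # hiring scales with funding
    publications = 3.0 * researchers    # output scales with headcount
    media_stories = 0.5 * publications  # coverage scales with output
    funding = 0.4 * media_stories       # attention sets the next budget
    print(f"year {year}: funding = {funding:.2f}")

# The loop gain is 2.0 * 3.0 * 0.5 * 0.4 = 1.2 per cycle; any gain
# above 1 yields exponential growth, of the kind seen in Figure 3(b).
```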

Figure 3. (a) Funding for climate research in the natural and technical sciences (versus the social sciences and humanities). Figure reprinted from Overland and Sovacool (2020) under CC BY 4.0. (b) Number of journals exclusively dedicated to climate change actively published each year (dashed line), Access World News (dotted line, divided by 12,000), and number of published articles on climate science (solid line). Note the exponential growth. Figure reprinted from Grieneisen and Zhang (2011) with permission

Of course, some may argue that there is nothing wrong with this feedback – that it is the natural order of things, and that society may decide which topics are of societal interest and promote them through funding.

However, when a scientific topic has policy consequences, I claim that this feedback loop can be destructive to scientific progress. Indeed, many more researchers now work on climate science, but they are limited to a rather narrow scope, namely man-made climate change. Research which strays even slightly from this mark is difficult to get funded and published (you can read many examples in Professor Nir Shaviv’s blog (Shaviv 2006)). But still, what is the harm? Perhaps, since man-made climate change is the greatest problem humanity has ever encountered and we only have, what, 12 years to solve it, we should indeed invest all our scientific efforts and skills in understanding it. I am not going to argue here against this statement. What I want to show in the next section is that this self-enhancing feedback loop leads to a real peril to science, academia and free thinking.

The UNESCO – Sustainable Development Goals (SDG) Agenda and the Role of Higher Education Institutions

UNESCO is the United Nations Educational, Scientific and Cultural Organization, a specialized agency of the United Nations (UN) aimed at promoting world peace and security through international cooperation in education, art, science and culture. In the context of the knowledge system, UNESCO, like other branches of the UN, is a mix of synthesizer and evaluator; the organization spends millions on writing scientific reports and suggests policy steps towards achieving its goals. Although it has no formal political power, the UN can directly influence policy through advocacy and through direct funding of programmes (the total UN annual budget is ∼$3 billion, about 20% of which is devoted to political missions).

In 2022, UNESCO issued a document (UNESCO 2022a) in which it formulated what higher education institutions (HEIs) should do to support the Sustainable Development Goals (SDGs) announced by the UN in its Agenda 2030 campaign (UNESCO 2022b). The goals have been formulated in such a way that they sound very positive (who would resist eliminating poverty while promoting peace, justice and quality education?). But, in truth, these goals are the end result of a long feedback loop of information circulating in the knowledge system. In the case of climate (pertaining mainly to goal 13, ‘climate action’, but also to others), this information is mostly (at least in the popular press) catastrophically frightening. Nevertheless, the SDGs sound really good. Why not harness the intellectual power of the HEIs towards achieving them?

In fact, however, the UNESCO statement is a dangerous document, and HEIs or countries which endorse it will put academic freedom, free thinking and the entire scientific endeavour at dire risk. Let me give a few examples.

In the Executive Summary, the document states:

Universities and, more broadly, HEIs, need to use the knowledge they produce and their education of new professionals, to help solve some of the world’s greatest problems, as addressed by the SDGs set out by the UN.

and a few lines later,

The call [bold in original] this report makes is for universities and HEIs to play an active part in an agenda…

Read this again, and then reread the opening section of this article. The telos of universities and other HEIs is no longer ‘to seek, investigate and teach the truth, to promote all fields of knowledge and scholarship’, but rather to promote an agenda – and not just any agenda, but the agenda pushed by the UN. This is an unbelievable change of paradigm.

And there is a cost to promoting an agenda, and that cost is free thinking and academic freedom. Here is what the UNESCO document says about academic freedom:

HEIs should not cease to protect and expand academic freedom for the promotion of systemic change.

So academic freedom is no longer a value in its own right, but a tool to promote systemic change and the agenda. Moreover (and I quote again – I mention this here because it is quite unbelievable):

Basic and curiosity-driven research should also be maintained as a core principle where relevant.

According to this report, basic and curiosity-driven research are no longer the core drivers of academia; they can be pursued only ‘where relevant’. And who will decide whether academic freedom works for the promotion of systemic change, and whether curiosity-driven research is relevant to the cause? The UNESCO agenda has a solution – specially designated SDG officers and committees (in Russian, by direct translation: commissars and soviets). The UNESCO document states, quite clearly, that:

To anchor and monitor sustainability activities in HEI governance structure, HEIs should consider establishing the post of Chief Sustainability or SDG officer and/or a sustainability committee at the top level. [Bold in original]

And as for academic freedom? Well, according to the UNESCO document:

HEIs must refuse to engage in research that supports non-sustainable practices (for example, the fossil fuel industry)… [Bold in original]

So now, according to the UNESCO document, a university administrator, holding the high rank of ‘Chief sustainability officer’, or (possibly even worse) a committee, will determine which research activity supports or does not support non-sustainable practices. Will I be able to continue my research on spin-transport in molecular junctions? Well, that would depend on the whims of the SDG commissar and his merry band of committee members. Goodbye academic freedom, welcome Lysenkoism.

Two questions should be asked. The first is: how did we come to this? How is it that an official UN body can publish a document which is so bluntly anti-academic-freedom? In my opinion, the answer is deeply rooted in the flaws of the knowledge system. If one examines the SDGs, they seem almost scientific, and they are certainly backed by numerous published papers, official reports, news media articles and public opinion polls. The whole 2030 agenda is fuelled by biases in the way the public gains knowledge, leading to public support for unsound goals – goals not supported by scientific evidence and formulated without taking scientific unknowns and error bars into account. But who will be against ‘sustainable development’? Will the public be informed of the true content of the UNESCO document, or will that, too, be skewed by the disseminators?

The second question was asked by a member of the audience at a talk I gave on this topic. The question is: surely, the pursuit of truth and knowledge should be the central goal of the university. However, since many universities are funded by public money (this is certainly the case in Israel, but also in most of the Western world – and even if the university itself is a private entity, much of its research funding still comes from the state), why not add to this goal a second one: to make society better and to improve people’s lives? What could be a more noble cause than that?

The answer is simple: while truth and knowledge are largely objective, the answer to the question of how to improve society is a subjective, political one, and clearly different people will have different answers. That is what politics is all about, and there is no correct answer. And so, when trying to direct universities towards a political agenda, one can be full of good intentions and still cause great harm. I am sure that the people who advocated for Jewish quotas at the US Ivy League universities in the 1920s were certain that they were improving society and reducing injustice.

What Can I Do?

In the previous sections, I have outlined what I believe are the central flaws in the knowledge system, and the possible consequences of these flaws. I want to stress here again that I am not accusing any person or organization of fraud, cherry-picking, or anything like that, and I don’t think there is a conspiracy. In most cases, the flaws and biases are inherent to the knowledge system and are not the consequence of deliberate actions by individuals. They are just part of our human nature.

And still, the flaws are there, and they have dire consequences. It is therefore important to ask: what can be done?

One thing that I do not think can be done is to ‘fix the system’. I have a hard time imagining how the knowledge system – a somewhat independent emergent structure, which was not planned by anyone – can be fixed from the outside. Of course, one can try to fix certain parts of it, but that is a different thing.

However, the first steps towards addressing these flaws are (i) to be aware of them; and (ii) to at least try to correct for them. At the societal level, a necessary condition for this to happen is that the level of scientific literacy in the general populace must increase. The public must understand what science is and what it can – and, importantly, cannot – do. It must be aware of the biases in the knowledge system, which in turn requires a level of scientific, historical and philosophical literacy that is not available to large portions of society in any country.

I have children of different ages, from elementary to high-school level, and I watch how they are being taught science. I can summarize the problem as follows: they are not being taught science, but rather how to solve exam questions in science. Do they understand the basics? The different definitions of truth? The Popper versus Kuhn argument on the nature of scientific inquiry? Or how to corroborate or falsify scientific claims? No, they don’t. But in our modern world these are, in my opinion, highly necessary tools for avoiding the biases inherent to the knowledge system. So, push for science literacy – at your children’s schools, at your university, everywhere you can.

What else can be done? Let me divide the answer into different scales: the personal, the university, and the state levels.

At the personal level, each member of the knowledge system and of the public (that is pretty much everyone) must first think about the biases in the knowledge system and talk about them. Another important thing to do is to study the history and philosophy of science. While some of its ideas might seem vague and useless, I have found that studying the philosophy of science has dramatically improved my own scientific research.

Even more importantly, each member of the knowledge system must be aware of his or her own biases. Whether you are a researcher, a synthesizer or a disseminator, you should always ask yourself: am I writing this because I am biased, or does the data really show it? We are all subject to biases, and there is not much we can do about that, but being aware of this fact can already do wonders in reducing the chance of our biases controlling the knowledge and information we promote.

At the institutional level – and I am mainly focused on universities here – there are many things to do. First, the discussion must be raised: academics must engage in discussion of the knowledge system and its inherent biases. Second, those who are aware of the dangers should be active – through university committees, for example – in defending academic freedom. Finally, if one suspects that politicization is occurring, one must vocally resist!

At the national and international level, things are clearly harder. Nonetheless, we must try. Participate in national discussions on research funding, join committees, talk to politicians about these issues – these can all help. Engage in global discussion – write papers, op-eds, and articles, and use social media, to raise awareness of the structure and flaws of the knowledge system. And, of course, show vocal opposition to clear biases and to mistakes you identify in the knowledge system, at all levels, and to attacks on academic freedom. They are all connected.

Epilogue

Dwight Eisenhower, the 34th president of the United States, was a smart man with vision and foresight, which served him well as the architect of numerous victories of the Allied forces in the Second World War, and later as president. In 1961, after two terms in office, he gave his final public speech as president, a speech which is remembered to this day for its warning about the military–industrial complex.

But there was another, slightly less well-known warning in that speech. There, Eisenhower warned about the corruption of the scientific process through the centralization of research funding in the hands of government. In his words:

Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity… The prospect of domination of the nation’s scholars by Federal employment, project allocations, and the power of money is ever present and is gravely to be regarded.

Yet Eisenhower did not end there. He already understood that the knowledge system is a closed feedback loop, and that power can flow not only from the government down to the scientists, but also the other way around. In his words:

Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of the scientific-technological elite.

Eisenhower was right, but he did not go all the way. All these systems – the government funding agencies, the universities (with their intellectual elites), the press – are part of a complicated network responsible for bringing knowledge to the public. And while this system has had tremendous success in advancing science and technology in many areas, it is nonetheless seriously skewed and flawed, which presents many dangers. If we are not careful, the end result will be nothing short of the loss of academic freedom, and with it the huge benefits it has brought to mankind. We should all be worried.

About the Author

Yonatan Dubi is a full professor in the Department of Chemistry at Ben-Gurion University of the Negev (BGU), Israel. A theoretical physicist by training, Dubi graduated in 2007 from the BGU Physics Department and spent three years as a post-doctoral scholar at UC San Diego and at Los Alamos National Laboratory, USA. After a brief stint in industry, he returned to BGU in 2012 as a faculty member. His research spans a variety of topics, from quantum effects in biology to nanotechnology and energy conversion, and he teaches courses in quantum chemistry and the philosophy of science. He is the recipient of the 2017 Krill Prize for excellence for young scholars, awarded by the Wolf Foundation.

References

Ben-Gurion University (2007) Code of Academic Ethics. Available at https://in.bgu.ac.il/acadsec/DocLib/Pages/ethics/v1-קוד_אתי-_גרסה_באנגלית.pdf (accessed June 2023).
Chemke, R, Ming, Y and Yuval, J (2022) The intensification of winter mid-latitude storm tracks in the Southern Hemisphere. Nature Climate Change 12(6), 553–557. doi: 10.1038/s41558-022-01368-8.
Columbia Climate School (2022) The Flood Seen From Space: Pakistan’s Apocalyptic Crisis. Available at https://news.climate.columbia.edu/2022/09/12/the-flood-seen-from-space-pakistans-apocalyptic-crisis/ (accessed June 2023).
Daston, L and Galison, P (2007) Objectivity. Cambridge, MA: Zone Books.
Epstein, A (2022) Fossil Future: Why Global Human Flourishing Requires More Oil, Coal, and Natural Gas – Not Less. New York: Portfolio/Penguin.
Grieneisen, ML and Zhang, M (2011) The current status of climate change research. Nature Climate Change 1(2), 72–73. doi: 10.1038/nclimate1093.
Guiney, PD, Goodfellow, WL and Canfield, TJ (2020) An Overview of Confirmation Bias in Science: Examples and Opportunities for Improvement. SETAC North America 41st Annual Meeting, SETAC.
Hamilton, P (1974) Knowledge and Social Structure: An Introduction to the Classical Argument in the Sociology of Knowledge. Oxfordshire, UK: Routledge.
Hansen, J (1988) US Senate testimony. Available at https://www.sealevel.info/1988_Hansen_Senate_Testimony.html (accessed June 2023).
IPCC (2021a) Climate Change 2021: The Physical Science Basis. Available at https://www.ipcc.ch/report/sixth-assessment-report-working-group-i/ (accessed June 2023).
IPCC (2021b) Summary for Policymakers. Available at https://www.ipcc.ch/report/ar6/wg1/downloads/report/IPCC_AR6_WGI_SPM.pdf (accessed June 2023).
Israel Government Press Office (2022) Israeli-led Study: Earth’s Climate Change Apparently More Rapid than Expected. Available at https://www.gov.il/en/departments/news/climate_change26052022 (accessed June 2023).
Kalven, H (1967) Report on the University’s Role in Political and Social Action. Available at https://provost.uchicago.edu/reports/report-universitys-role-political-and-social-action (accessed June 2023).
Kiai, A (2019) To protect credibility in science, banish ‘publish or perish’. Nature Human Behaviour 3(10), 1017–1018. doi: 10.1038/s41562-019-0741-0.
Lazard Climate Center (2020) Available at https://www.lazard.com/financial-advisory/specialized-advisory/climate-center/ (accessed June 2023).
Mannheim, K (1936) Ideology and Utopia: An Introduction to the Sociology of Knowledge. Translated by Wirth, L and Shils, E. New York: Harcourt, Brace and Company; London: Kegan Paul, Trench, Trubner & Co.
McBride, JM (2009) How Do You Know? Available at https://www.youtube.com/watch?v=WSYEApgJkh0.
Merton, RK (1937) The sociology of knowledge. Isis 27(3), 493–503. doi: 10.1086/347276.
Nuzzo, R (2015) How scientists fool themselves – and how they can stop. Nature 526(7572), 182–185. doi: 10.1038/526182a.
Obama, B (2013) Ninety-seven percent of scientists agree… Available at https://twitter.com/barackobama/status/335089477296988160 (accessed June 2023).
Overland, I and Sovacool, BA (2020) The misallocation of climate research funding. Energy Research & Social Science 62, 101349. doi: 10.1016/j.erss.2019.101349.
Rajtmajer, SM, Errington, TM and Hillary, FG (2022) How failure to falsify in high-volume science contributes to the replication crisis. eLife 11, e78830. doi: 10.7554/eLife.78830.
Shaviv, N (2006) ScienceBits blog. Available at http://www.sciencebits.com/ (accessed June 2023).
UNESCO (2022a) Knowledge-driven actions: transforming higher education for global sustainability. Available at https://unesdoc.unesco.org/ark:/48223/pf0000380519 (accessed June 2023).
UNESCO (2022b) UNESCO and Sustainable Development Goals. Available at https://en.unesco.org/sustainabledevelopmentgoals (accessed June 2023).