1 Introduction

The early twentieth century brought two revolutions, which ended the age of classical physics and formed the beginning of the age of modern physics. The first revolution was brought about by the introduction of Einstein’s relativity theory in 1905, which fundamentally changed the notions of space and time in physics. The second revolution, the development of quantum physics in the first decades of the twentieth century, is generally seen as having brought an end to classical determinism.

But was there already a clear notion of ‘classical physics’ at this point? Staley (2005, 2008a) has pointed out the obvious by noting that the classical only exists in contrast to the non-classical, and has shown that it was only in the context of these two revolutions that the concept of ‘classical physics’ was developed.Footnote 1 In the early twentieth century, an image of classical physics was created which formed a contrast with the modern physics of relativity theory and quantum mechanics; Staley therefore speaks of the ‘co-creation’ of classical and modern physics. Central aspects of this image of classical physics are (1) the use of Euclidean space and time and (2) determinism, which found its ultimate expression in the work of Laplace (1814). In addition, the physics of the late nineteenth century has often been characterized in terms of a sense of complacency: the idea that the basic features of physics had been figured out and that the remaining task was to fill in the details. This feature of ‘classical physics’ has been contested, however, for it has also often been noted that the last decade of the nineteenth century in fact saw lively debate and disagreement about foundational issues in physics (on this topic, see Kragh 2014; Seth 2007).

We can ask to what extent the conception of classical physics, as it was formed in the first decades of the twentieth century, provides a correct image of the physics of the eighteenth and nineteenth centuries. In this paper, I focus on the aspect of determinism. Determinism forms an essential part of the image of classical physics; however, there are a few reasons to question whether physics was indeed deterministic up until the introduction of quantum mechanics.

First, it has been pointed out that during the 1910s and early 1920s, when quantum physics was still in its early stages and its implications for causality and determinism were not yet clear, several physicists already doubted or even abandoned determinism. Moreover, they did so for reasons that were at least partly independent of the developing quantum theories. Whereas Forman (1971) has argued that the cause for this turn against determinism can be found in the cultural milieu of the Weimar republic, others (including Brush 1976b; Ben-Menahem 1989; Stöltzner 1999) have looked especially at the context of statistical mechanics. Within statistical mechanics, the conception of statistical laws of nature arose, which gave rise to the idea that all laws of nature may be statistical: the laws of physics may arise as statistical averages, with processes at the fundamental level taking place by pure chance. This idea was most notably expressed by Franz Serafin Exner and Erwin Schrödinger (Ben-Menahem 1989; Stöltzner 1999). These early acceptances of indeterminism in the context of statistical mechanics, however, do not have to be interpreted as showing that indeterminism already arose within classical physics: one might argue that statistical mechanics itself became non-classical as soon as it allowed for fundamental chance, and that we therefore have to draw the classical/modern boundary already at this point.

Secondly, in recent decades, philosophers of physics including Earman (1986, 2007) and Norton (2008) have claimed that it is possible to construct systems within classical mechanics in which determinism fails. This would mean that the idea that classical physics is deterministic has been a mistake, based on the fact that these possibilities were overlooked. One may suppose that these subtleties were unknown during the eighteenth and nineteenth centuries, and that therefore, at least historically, it is correct to state that physics was deterministic during this period. However, the failure of determinism described by Norton (2008), now known as ‘Norton’s dome,’ was known and discussed by at least a few physicists in nineteenth-century France (see Sect. 2.2).

In this paper, I aim to answer the question to what extent mechanics, and physics as a whole, was regarded as deterministic before the twentieth century. Within physics, determinism is usually defined in terms of laws and initial conditions: determinism is essentially the claim that all processes can be fully described through a set of fundamental laws of nature which always have a unique solution for given initial conditions. I argue that during the eighteenth and nineteenth centuries, this claim could not easily be established, and it seems that only a limited number of physicists explicitly adhered to it. This, however, does not mean that there was a widespread acceptance of indeterminism. It is possible to take for granted that everything that happens is uniquely determined, without thinking that this has been established by physics. It is also possible to think that physics should ultimately aim at describing all processes through deterministic laws, without thinking that this aim has already been accomplished. My main claim is that during the period which we now describe as classical, determinism was not so much an established result of physics as an expectation, and that during the late nineteenth and early twentieth century, it increasingly took the form of a methodological principle or necessary presupposition of science, rather than an ontological claim.

Section 2 of this paper deals with the question to what extent physics was in fact deterministic during the eighteenth and nineteenth centuries. Claims that physics was deterministic in this period are usually based on the idea that the laws of mechanics uniquely determine all processes within classical physics. Thus, we first have to consider the question whether the mechanics of the eighteenth and nineteenth centuries was in fact deterministic (Sects. 2.1–2.3), and then the question whether mechanics sufficed to account for all phenomena within physics (or alternatively, whether another unifying framework was possible) (Sect. 2.4).

Section 3 deals with physicists’ reflections on the issue of determinism. Sect. 3.1 shows that in the late nineteenth century, many physicists adopted a position of modesty toward the ontological implications of their theories; this position undermined the idea that physics could decide whether nature is deterministic. Sect. 3.2 shows how, during the late nineteenth and early twentieth century, physicists could argue for determinism as a presupposition of science while remaining agnostic about whether nature itself is deterministic.

2 Foundations of determinism in classical physics

2.1 Varieties of mechanics

One might think that determinism in classical mechanics is rather straightforward: mechanics describes how matter moves according to laws of motion. In classical mechanics, these laws are for example Newton’s laws of motion, according to which the motion of each particle is described through a second-order differential equation \(F=m\frac{\mathrm{d}^{2}x}{\mathrm{d}t^{2}}\). These equations always have a unique solution for given initial conditions, thus ensuring that for a given initial configuration of matter, there is only one possible way the system can evolve in time. (In fact, uniqueness of solutions only follows under an additional assumption of continuity which is often overlooked, namely that the force function F(x) is Lipschitz continuous; see below.)
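As a standard textbook illustration (added here; it is not tied to any of the historical authors discussed below), consider a particle of mass \(m\) subject to a linear restoring force \(F=-kx\). The equation of motion and its solution are

$$\begin{aligned} m\frac{\mathrm{d}^{2}x}{\mathrm{d}t^{2}}=-kx, \qquad x(t)=x_{0}\cos \omega t+\frac{v_{0}}{\omega }\sin \omega t, \qquad \omega =\sqrt{k/m}, \end{aligned}$$

so that the initial position \(x_{0}\) and initial velocity \(v_{0}\) fix the entire trajectory. Here the force function is Lipschitz continuous, and uniqueness of solutions, and hence determinism, holds without qualification.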

Thus, if all change takes place through motion of matter, as is generally presupposed in mechanics, and if all motion takes place according to these laws of motion, and if these indeed have unique solutions for given initial conditions, then determinism holds. The usual historical reference is Laplace, who in 1814 gave the most famous expression of determinism in physics:

An intelligence which, for one given instant, would know all the forces by which nature is animated and the respective locations of the entities which compose it, if besides it were sufficiently vast to submit all these data to mathematical analysis, would encompass in the same formula the movements of the largest bodies in the universe and those of the lightest atom; for it, nothing would be uncertain and the future, as the past, would be present to its eyes (Laplace 1814, 3–4).Footnote 2

However, there are good reasons to think that mechanics was never that simple. Generally, it is a simplification to think of classical mechanics as being constituted completely by Newton’s laws of motion. In fact, the conception of the laws of mechanics as dynamical laws, expressed by means of differential equations, was developed only during the eighteenth century. Furthermore, from the eighteenth century until the present, there have been various formulations of mechanics and various proposals for the basic mechanical principles, and attempts to reduce all of this to one well-defined and consistent theory with one basic set of laws have been only partially successful (Truesdell 1968; Stan 2017; Stan forthcoming-a; Stan forthcoming-b). Stan describes how during the eighteenth century, there was no general agreement on what exactly the laws of motion were, and various mechanical principles, laws of motion, and formulations of mechanics were developed: ‘this diversity of foundational perspectives defies any attempt to show that post-Newtonian mechanics is a unified theory’ (Stan forthcoming-a).

The most pressing issue was that mechanics dealt with several distinct conceptions of matter (Stan 2017; see also Wilson 2013). The main conceptions of matter used in mechanics are mass points, rigid bodies and deformable continua. These are fundamentally different and are subject to different dynamics:

  • Mass points are unextended and can only undergo translational motion. Collisions between mass points are rare enough that they can plausibly be ignored, so that mass points only interact through ‘action at a distance’: forces which act between pairs of particles, are directed along the line connecting them, and depend on their distance.

  • Rigid bodies are extended and generally have well-defined geometrical shapes which cannot be deformed. They undergo both translational and rotational motion, and they interact both through contact forces (one body pushing against another) and through body forces (e.g., gravity and electromagnetic forces).

  • Deformable continua are also extended and can be divided into volume elements of infinitesimal extension. They can undergo translation, rotation and deformation, and they can sustain internal stresses.

All three of these conceptions of matter have turned out to be needed to describe the range of phenomena that fall under the domain of mechanics. This fragmented ontology of mechanics persisted throughout the nineteenth century and still persists today: textbook examples in mechanics often use a mixture of the above conceptions of matter. Most physicists content themselves with the idea that in classical mechanics, matter can be modeled in different ways, depending on the situation and context.

Those who seek to base mechanics on solid foundations and on a single ontological picture can argue that only one of these conceptions of matter is foundational, and that the others are used merely as approximations. For example, one could argue that what really exists are mass points, but that certain problems within mechanics can be solved much more easily if you model certain configurations of mass points as rigid bodies or as deformable continua. Many such proposals for ontological unification have been made—more on this in the next section—but up until the late nineteenth century, the issue remained under debate, and there was no general consensus about the constitution of matter (Wilholt 2008). In the twentieth century, it became clear that to answer the question of what matter is like at a foundational level, one has to look at quantum mechanics, and that classical mechanics can only offer descriptions of matter at a higher scale.

Here, it may be objected that none of this has any implications for the issue of determinism: the fact that there are various formulations of mechanics and that mechanics works with various conceptions of matter in itself does not give us any reason to think that there could be indeterministic processes in mechanics. This is fair enough, but at least it shows that the issue of determinism in classical mechanics is not that straightforward after all: the conception according to which mechanics deals with a single type of matter of which the motion is uniquely determined through a single set of laws is too simple. To properly demonstrate that (classical) mechanics is deterministic, one would need to engage with mechanics in its full complexity. One possibility would be to establish that there are abstract principles which can be applied to all conceptions of matter, such as the principle of virtual work and d’Alembert’s principle, and establish that these principles determine the course of any process in full detail.Footnote 3 Another possibility would be to establish that at bottom there is a single type of matter and one set of laws of motion and that this ‘bottom level’ is deterministic; however, one then has to make plausible that the rest of mechanics can (at least in principle) ultimately be reduced to this bottom level.

It is no coincidence that the most well-known statements of determinism in physics have been made by scientists who argued that mechanics should be based on a single conception of matter; and it is also no coincidence that the conception of matter they adhered to was that of the mass point, which is the conception for which the chances of establishing determinism look best. In the next section, we will look at the status of determinism within point particle mechanics; in Sect. 2.3, we will look at the situation in other areas of mechanics.

2.2 Point particle mechanics

As was mentioned in the previous section, by far the most well-known statement of determinism in physics was given by Laplace (1814). However, similar statements of determinism can already be found earlier, in the work of a number of contemporaries of Laplace in eighteenth-century France, notably Maupertuis, Condorcet and d’Holbach; moreover, Laplace himself had already expressed the idea of determinism in a lecture in 1773 (Wolfe 2007; Van Strien 2014a). These early expressions of determinism can be understood in the context of the French enlightenment and its optimism about science. Perhaps the most notable precursor of Laplace’s determinism, however, is the Jesuit mathematician and natural philosopher Roger Boscovich: Kožnjak (2015) has argued that in the work of Boscovich, one finds a statement of determinism that is both earlier (1758) and stronger than that of Laplace, yet has remained largely unknown.Footnote 4 Like those of his predecessors, Laplace’s statement of determinism did not have an immediate impact: in fact, it only became well known and much debated after being popularized in a famous lecture by Emil Du Bois-Reymond in 1872 (Du Bois-Reymond 1898 [1872]; see Cassirer 1956 [1936]; Hacking 1983).

Boscovich, Laplace, and Du Bois-Reymond are thus perhaps the three most significant sources of the idea of determinism in classical physics (even though the historical influence of Boscovich’s determinism may have been limited). All three strictly adhered to a conception of matter as constituted by point particles, or by very small particles whose internal structure need not be studied, with central forces acting between these particles; and according to all three, the task of physics consists in explaining all natural processes in terms of the motions of these particles. In fact, the conception of mass points finds its origin in Boscovich’s work. In this section, I show how the statements of determinism of Boscovich, Laplace and Du Bois-Reymond were each embedded in a research program which sought to reduce physics to point particle mechanics, and then examine to what extent it was possible for them to establish determinism within point particle mechanics.

Boscovich is best known for his idea that matter is constituted by point particles with central forces acting between them; the force acting between point particles is repulsive at short distances, represents gravity at large distances, and in between oscillates a few times between repulsive and attractive. The exact form of the force function should account for various properties of matter, including electricity and magnetism; Boscovich’s theory thus represents an ambitious attempt at unification. Boscovich argues for determinism as follows:

Any point of matter, setting aside free motions that arise from the action of arbitrary will, must describe some continuous curved line, the determination of which can be reduced to the following general problem. Given a number of points of matter, & given, for each of them, the point of space that it occupies at any given instant of time; also, given the direction & velocity of the initial motion if they were projected, or the tangential velocity if they are already in motion; & given the law of forces expressed by some continuous curve, such as that of Fig. 1, which contains this Theory of mine; it is required to find the path of each of the points, that is to say, the line along which each of them moves.

(...) Now, if the law of forces were known, & the position, velocity & direction of all the points at any given instant, it would be possible [for a mind that is brilliant enough] to foresee all the necessary subsequent motions & states, & to predict all the phenomena that necessarily followed from them. (Boscovich 1922 [1763]).

This statement of determinism is carefully formulated. It explicitly applies to point particles and holds as long as the forces between them are continuous; this holds for the specific force function Boscovich proposed to act between particles. (Note also that he argues that physical systems are only deterministic as long as no ‘arbitrary will’ intervenes; he thus argues for determinism within the domain of physics, but is not an absolute determinist.)

Laplace, too, adhered to a particle conception of matter; in fact, he was the leading figure of a research program which Fox (1974) has termed ‘Laplacian physics.’ The main feature of Laplacian physics was that it sought to account for all phenomena in terms of molecules, whose extension could be neglected for most practical purposes and which could thus be treated as point particles, and central forces, which could be attractive or repulsive. On this basis, all properties and interactions of matter, including chemical reactions, were to be accounted for. Heat, light, electricity and magnetism were conceived of as imponderable fluids consisting of molecules. Thus, physics and chemistry were to be given strong, unitary foundations in terms of molecules and central forces. According to Fox, Laplacian physics constituted a highly unified and ambitious research program: ‘In the years of its greatest success, from 1805 to 1815, the program both raised problems and laid down the general principles for solutions; and, by doing so, it gave French physical science a most uncommon unity of style and purpose’ (Fox 1974, 91).

Point particles play a central role in celestial mechanics, in which planets, moons and the sun are usually modeled as point masses in calculations of planetary orbits. This approach was extraordinarily successful in the eighteenth century and provided a model for Laplace’s physics. In celestial mechanics, it had become possible to make impressively accurate long-term predictions; Laplace (1814) refers to Halley’s prediction of the return of a comet in 1759 as an impressive feat of prediction. After writing that an intelligence which had exact knowledge of the present state of the universe would, if it were ‘sufficiently vast to submit all these data to mathematical analysis,’ be able to predict anything that would happen in the future, Laplace continues: ‘The human mind offers, in the perfection which it has been able to give to astronomy, a feeble idea of this intelligence’ (Laplace 1814, 4). If all of physics can indeed be reduced to point particle mechanics, then the problem of predicting any process within physics is merely a (much) more complex variation on the problem of predicting planetary orbits and orbits of comets; although predicting these motions far surpasses our calculation skills, there is nevertheless good reason to assume that the motions themselves are deterministic. Laplace concludes his reflections on determinism by stating: ‘The curve described by a simple molecule of air or vapor is regulated in a manner just as certain as the planetary orbits; the only difference between them is that which stems from our ignorance’ (Laplace 1814, 6).

The fact that Laplace was the leading figure of a research program which sought to reduce physics to the mechanics of particles raises the question to what extent his determinism was bound to this specific program. This question becomes all the more pressing if we consider that, according to Fox (1974), the program of Laplacian physics collapsed quite suddenly between 1815 and 1825 and was abandoned by almost all of its former adherents. Fox explains this collapse through both institutional factors and new challenges within physics: most scientists came to agree that the scheme provided by Laplacian physics was too rigid and too limited to account for all phenomena within physics. He mentions a number of scientific developments of the early nineteenth century which could not easily be integrated into the program of Laplacian physics, including the wave theory of light, the vibrational theory of heat, Dalton’s atomic theory of chemistry and the rational mechanics of Fourier, as well as phenomena of elasticity and electrodynamics.

Although the research program of Laplacian physics lost many of its adherents, its basic concepts and approach continued to be influential. Laplace’s determinism was popularized in Du Bois-Reymond’s well-known lecture ‘On the limits of our knowledge of nature’ in 1872. In this lecture, in which Du Bois-Reymond sought to determine the potential domain and the limits of natural science, he argued that to understand something scientifically means to reduce it to the motion of atoms:

Natural science—or, more definitely, knowledge of the physical world with the aid of and in the sense of theoretical natural science—means the reduction of all change in the physical world to movements of atoms produced independently of time by their central forces; or, in other words, natural science is the resolution of natural processes into the mechanics of atoms. (Du Bois-Reymond 1898 [1872], 18).

For Du Bois-Reymond, understanding natural processes in terms of the motion of atoms is a definition of scientific knowledge. A similar definition can be found in von Helmholtz (1847), who likewise argued that the only way to completely understand nature is to reduce natural phenomena to the motion of material particles subject to central forces. According to Du Bois-Reymond, this sets the boundaries of science: everything which can in principle be reduced to the motion of atoms falls within the domain of what can be known scientifically, and anything that cannot be understood in terms of motions of atoms is fundamentally unknowable. If all natural processes can be reduced to the motion of atoms, Laplacian determinism follows:

If we were to suppose all changes in the physical world resolved into atomic motions, produced by constant central forces, then we should know the universe scientifically. The condition of the world at any given moment would then appear to be the direct result of its condition in the preceding moment and the direct cause of its condition in the subsequent moment. Law and chance would be only different names for mechanical necessity. Nay, we may conceive of a degree of natural science wherein the whole process of the universe might be represented by one mathematical formula, by one infinite system of simultaneous differential equations, which should give the location, the direction of movement, and the velocity, of each atom in the universe at each instant. (Du Bois-Reymond 1898 [1872], 18).

Du Bois-Reymond uses the term ‘astronomical knowledge’ for knowledge of the positions and motions of atoms; this shows that he, like Laplace, models science on celestial mechanics (Du Bois-Reymond 1898 [1872], 26).

As in the cases of Boscovich and Laplace, Du Bois-Reymond’s statement of determinism thus appeared within the context of a broader program of reduction to point particle mechanics. It is relevant to note here that Du Bois-Reymond himself was not a physicist but a physiologist, although one for whom physics was central to physiology; with Johannes Müller and Hermann von Helmholtz, he developed an approach to physiology which was heavily based on the methods and theories of physics and chemistry, and which aimed at mechanistic explanation of organic processes (Finkelstein 2013). Du Bois-Reymond was particularly concerned with arguing against any vitalist notions within physiology; he argued that physiological processes are determined by material conditions, without intervention of the mind or any type of vital force (Van Strien 2014c). In this context, it was important for him to argue that all natural processes, including physiological ones, could be understood in terms of mere matter and motion. It is also relevant to note that the lecture in which Du Bois-Reymond argued for determinism was a popular lecture; Romizi (2019) has argued that Du Bois-Reymond’s determinism should be understood in the context of science popularization, and that the public demand for an all-encompassing scientific world view contributed to the extension of determinism to all of nature. Thus, Du Bois-Reymond’s determinism, too, was embedded within a specific context and within an ambitious research program, in which the idea that all natural processes can be reduced to atoms and central forces played an essential role.

If one assumes that all phenomena within physics can indeed be reduced to the motion of point particles, determinism seems to follow quite straightforwardly. Of course, it is not possible for us to calculate exactly what the future will hold: this would require exact knowledge of instantaneous values of the position and momentum of all particles—either all particles in the universe, or within a perfectly isolated system. Nevertheless, it is in principle possible to describe the motion of each particle through differential equations which, although extremely complicated and practically unsolvable for systems of more than a handful of particles, are nevertheless of a relatively straightforward form and can be expected to have a unique solution for given initial conditions.

However, there are a few caveats. First, it remains to be specified which forces act between atoms, besides gravity. In celestial mechanics, the motion of planets, moons and comets is determined purely by gravitational force; but this does not suffice to explain all interactions of matter on smaller scales. If all processes in physics are to be reduced to point particle mechanics, additional force functions have to be introduced to account for, e.g., properties of different materials and electric and magnetic phenomena.

During the nineteenth century, it was established that in order for the differential equation describing the motion of a point particle to yield a unique solution for given initial conditions, the force function needs to fulfill a continuity condition. The exact condition was formulated by Lipschitz in 1876: a force function F(x) needs to fulfill the condition that there is a constant \(K>0\) such that for all \(x_{1}\) and \(x_{2}\) in the domain of F,

$$\begin{aligned} \left| F(x_{1})-F(x_{2}) \right| \le K\left| x_{1}-x_{2} \right| \end{aligned}$$

(see Van Strien 2014b). This condition has come to be known as ‘Lipschitz continuity.’ The corresponding uniqueness theorem was not yet available to Boscovich or Laplace; therefore, strictly speaking, they would not have been able to prove mathematically that their laws of motion always yield a unique solution for given initial conditions. This is not to say that they had no insight into whether equations of motion in mechanics could be expected to have a unique solution: they can be expected to have had reliable mathematical intuitions on this matter, and in fact both seem to have realized that determinism depends on a continuity condition. Boscovich explicitly requires that the force function has to be continuous in order for his theory to be deterministic (Boscovich 1922 [1763], 281), and Laplace appeals to a version of Leibniz’s law of continuity in his argument for determinism (Van Strien 2014a; Israel 1992).
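To see what a violation of the Lipschitz condition looks like, consider a standard example (added here for illustration): the force function \(F(x)=\sqrt{x}\) near \(x=0\). Since

$$\begin{aligned} \frac{\left| F(x_{1})-F(0) \right| }{\left| x_{1}-0 \right| }=\frac{1}{\sqrt{x_{1}}}\rightarrow \infty \quad \text {as } x_{1}\rightarrow 0, \end{aligned}$$

no finite constant K will do, and uniqueness of solutions is no longer guaranteed at \(x=0\). A force of precisely this form figures in the dome-type constructions discussed below.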

The possibility that a point particle may be subjected to a force for which the equation of motion fails to have a unique solution was already raised by Poisson (1806), as well as by Duhamel (1845), Boussinesq (1879) and Joseph Bertrand (1878). In their examples of non-uniqueness of solutions to mechanical equations of motion, the Lipschitz condition is in fact violated; but they did not explicitly rely on the Lipschitz condition, or on earlier results by Cauchy on conditions for the uniqueness of solutions to differential equations. Poisson and Duhamel consider the possibility that point particles attract each other with a force which allows for non-unique solutions to the equation of motion; Boussinesq also considers this possibility and in addition designs a situation in which a point particle is placed on top of a dome-shaped surface and subjected to gravitational force; because of the particular shape of the surface, the equation of motion fails to determine if and when the point particle will roll down the surface. The latter way of constructing an indeterministic system in classical mechanics was rediscovered much later by John Norton (Norton 2008) and is now known as ‘the Norton dome.’ It is an interesting case because in this system, indeterminism arises through gravitational force.Footnote 5 (Note that this is not a system within point particle mechanics, as it involves both a point particle and a rigid surface).
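In Norton’s version of the construction (Norton 2008; constants are suppressed here for readability), a point mass initially at rest at the apex of the dome obeys

$$\begin{aligned} \frac{\mathrm{d}^{2}r}{\mathrm{d}t^{2}}=\sqrt{r}, \end{aligned}$$

where r is the radial distance from the apex along the surface. With initial conditions \(r(0)=0\) and \(\dot{r}(0)=0\), this equation is solved both by the particle remaining at rest forever, \(r(t)=0\), and by

$$\begin{aligned} r(t)=\frac{1}{144}(t-T)^{4} \quad \text {for } t\ge T \quad (\text {with } r(t)=0 \text { for } t\le T), \end{aligned}$$

for an arbitrary time T: the particle may spontaneously begin to slide off the dome at any moment and in any direction. The force function \(\sqrt{r}\) fails to be Lipschitz continuous at the apex, which is exactly what makes the non-uniqueness possible.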

Interestingly, most of the above authors did not conclude that there could be indeterminism in nature. Poisson and Duhamel argued that when an equation of motion has more than one solution, only one of these solutions can be correct, and the problem is how to identify the correct solution. Bertrand located the problem in the relation between theory and reality and argued that when a theory allows for multiple possible future evolutions of a system, the theory cannot be entirely accurate (see Sect. 3.2).Footnote 6

Collisions are a further caveat to the claim that determinism is relatively straightforward within point particle mechanics: when two or more particles collide, there is a singularity in the equations of motion, and the equations do not say what happens next; one has to find a way to continue the solutions past the collision. One way to avoid this problem is to argue that for particles without extension, the probability of collision is infinitely small and can therefore be ignored. Boscovich in fact postulated that at very short distances between particles, there is a repulsive force which goes to infinity as the distance between particles goes to zero, so that collisions never take place.Footnote 7
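The difficulty can be made explicit with a schematic example (added here; the historical discussions took various forms): for two point masses \(m_{1}\) and \(m_{2}\) interacting gravitationally, the force

$$\begin{aligned} F=\frac{Gm_{1}m_{2}}{r^{2}} \end{aligned}$$

diverges as the separation \(r\rightarrow 0\), so that at the moment of collision the equations of motion develop a singularity and no longer prescribe the subsequent motion. Boscovich’s short-range repulsion removes the problem by ensuring that \(r=0\) is never reached.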

We thus arrive at the following conclusions. First, the statements of determinism of Boscovich, Laplace and Du Bois-Reymond, three of the most significant proponents of determinism in physics in the eighteenth and nineteenth century, were each embedded within a specific research program, aimed at reducing all natural phenomena to the motion of atoms which can be regarded as mass points. These research programs were very influential, but they did not represent a general consensus in physics, and they were ambitious and ongoing: the endpoint of being able to account for all natural phenomena in terms of the motion of atoms was never reached. Furthermore, although within a system of point particle mechanics, determinism seems relatively straightforward, it is not at all trivial to rigorously establish that a system of point particles is deterministic, and there may be failures of determinism even within point particle mechanics—and one can in fact find a few examples of physicists in the nineteenth century who were aware of this.

The point particle conception of matter was very influential throughout the nineteenth century, but it always existed alongside other conceptions of matter. In the second half of the nineteenth century, an increasing number of physicists doubted whether point particles could be regarded as foundational. Besides a generally increasing resistance to reductionism in physics (about which more in Sect. 2.4), point particles raised a number of conceptual problems. Even Du Bois-Reymond, while arguing for a program of reduction of all natural phenomena to the motion of atoms, argued that we cannot know the nature of atoms; in fact, he argued that our conception of atoms is ultimately contradictory. One problem lies in the question whether atoms are extended: whereas Du Bois-Reymond thought that atoms can be considered as point particles for purposes of calculation, he argued that they did in fact have to occupy at least a small space, for something cannot exist without being in space. At the same time, however, he argued that for an atom to be a foundational element in science, it would have to be indivisible, and an atom can only be truly indivisible if it is unextended.Footnote 8 Because of such puzzles, Du Bois-Reymond argued that the nature of matter will always remain unknown to us.

Other physicists also regarded the point particle conception of matter as problematic. James Clerk Maxwell argued that results from spectroscopy showed that atoms had to be elastic, but if atoms are foundational, this means that the property of elasticity cannot be further explained. Moreover, this result from spectroscopy was hard to reconcile with information about the inner structure of atoms obtained within the kinetic theory of gases, as well as with the point particle conception of atoms according to which they do not have an internal structure (Maxwell 1875, 471).Footnote 9 Duhem argued in 1905 that attempts to reduce all of physics to point particle mechanics had led to increasingly complicated and artificial theories, and that it had ultimately turned out that it is not possible to account for, e.g., elasticity in terms of point particles (Duhem 1905, 88). Henri Poincaré similarly argued that whereas the conception of mass points and central forces had been very useful in the historical development of physics, at some point, it seemed no longer methodologically adequate to attempt to base all of physics on this foundation (Poincaré 1921, 297–299; Liston 2017).

By the late nineteenth century, the question of the inner structure of atoms was an open problem, and the very existence of atoms was debated. The point particle conception of matter was confronted with both conceptual problems and empirical challenges, and it seemed increasingly likely that it would not be possible to account for all natural processes in terms of point particles and central forces.

2.3 Mechanics of rigid bodies, continua and fluids

Without the assumption that all of physics is reducible to point particle mechanics, the picture becomes more complicated. The two main alternatives to the point particle conception of matter were rigid bodies and deformable continua.

In order to develop a deterministic mechanics based on rigid bodies, one has to formulate rules for collisions of rigid bodies, or determine that no collisions take place. The formulation of rules for collisions turned out to be a problematic issue: in order for motion not to be lost, collisions would have to be elastic, but this seemed hard to reconcile with the property of rigidity. In particular, an elastic collision between hard bodies would involve an instantaneous change of motion, which would require an infinite force. This problem of collisions of hard bodies was much debated during the eighteenth and nineteenth century (Scott 1970; Darrigol 2001).Footnote 10
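The underlying difficulty can be stated compactly (a schematic illustration, added here): if a perfectly rigid body is to reverse a finite velocity, changing its momentum by \(m\,\Delta v\) in a collision that takes no time, the required force

$$\begin{aligned} F=m\frac{\Delta v}{\Delta t}\rightarrow \infty \quad \text {as } \Delta t\rightarrow 0, \end{aligned}$$

since rigidity excludes the finite deformation time over which elastic bodies spread out the change of momentum.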

If matter is allowed to deform during collisions, we arrive at the conception of matter as deformable continua, which is described by continuum mechanics. But by taking this step, the possibility of giving a complete and exact description of natural phenomena gets much further out of reach. Whereas in point particle mechanics, processes are (usually) determined by a finite number of initial conditions, namely the position and velocity of each mass point, plus the force laws acting between particles, in continuum mechanics you need the values of physical quantities (such as pressure and density) over entire surfaces and volumes. Mathematically speaking, in continuum mechanics you need to work with partial differential equations, which require data specified over entire surfaces and volumes as initial and boundary conditions. In order to deal with the properties of different substances, one needs to specify internal stresses, elasticity, friction and viscosity. In order to make problems mathematically tractable, it is unavoidable to work with idealizations, e.g., to take surfaces to be smooth and volumes to be homogeneous.
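To give a sense of the mathematical form involved, a standard modern formulation (added here for illustration) is the balance of momentum for a deformable continuum,

$$\begin{aligned} \rho \left( \frac{\partial \mathbf {v}}{\partial t}+(\mathbf {v}\cdot \nabla )\mathbf {v}\right) =\nabla \cdot \varvec{\sigma }+\rho \mathbf {b}, \end{aligned}$$

where \(\rho \) is the density, \(\mathbf {v}\) the velocity field, \(\varvec{\sigma }\) the internal stress tensor and \(\mathbf {b}\) the body force per unit mass. The equation is not closed as it stands: a constitutive relation specifying \(\varvec{\sigma }\) for the material in question (elastic, viscous, etc.) has to be added, together with initial and boundary data specified over entire surfaces and volumes.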

Physicists in the nineteenth century certainly realized the necessity of simplifying assumptions and idealizations in continuum mechanics. Maxwell, when comparing the atomic and the continuum conceptions of matter, noted that the continuity view can be used as long as bodies can be assumed to be homogeneous:

[A] theory that some particular substance, say water, is homogeneous and continuous may be a good working theory up to a certain point, but may fail when we come to deal with quantities so minute or so attenuated that their heterogeneity of structure comes into prominence. (Maxwell 1875, 450).

It was not obvious that equations in continuum mechanics would always yield exact solutions for given initial conditions. Existence and uniqueness theorems for solutions to partial differential equations were largely unavailable during the eighteenth and nineteenth century; therefore, if physicists at the time were concerned with establishing whether equations in continuum mechanics have unique solutions for given initial conditions and thus yield deterministic descriptions, it would have been difficult, if not impossible, to rigorously establish this. Fluid mechanics brought particular challenges, especially regarding turbulent flow, and physicists in the nineteenth century extensively debated the possibility of discontinuity and unstable solutions in fluid mechanics (Darrigol 2002).Footnote 11 While the existence of unstable solutions would not demonstrate a failure of determinism, it would be a further obstacle to establishing a strict determinism.

Other areas of mechanics required yet other types of mathematical equations. In order to deal with stress on physical bodies, Ludwig Boltzmann, Vito Volterra and Émile Picard developed an approach called hereditary mechanics in the late nineteenth century; the idea is that for a body subjected to stress, its deformation may depend not only on the stress applied to it at that particular moment, but also on the stress that has been applied to it at earlier times (Dörries 1991; Ianniello and Israel 1993). In order to describe this mathematically, they used integro-differential equations. This means that in these cases, in order to make predictions about future behavior, it does not suffice to know the current state of the system; one also needs knowledge about past states. In 1910, Paul Painlevé, noting that this went against Laplacian determinism, drew a dramatic conclusion: ‘The conception according to which, in order to predict the future of a material system, one has to know its entire past, is the very negation of science’ (Painlevé, quoted in Israel 1992, 269). Most other physicists who concerned themselves with hereditary mechanics, however, were not particularly bothered by its implications for determinism (Ianniello and Israel 1993).
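Schematically (a simplified modern rendering of the hereditary constitutive relations, added here for illustration), the deformation \(\varepsilon \) of a body at time t is written as a functional of the entire stress history \(\sigma \):

$$\begin{aligned} \varepsilon (t)=\frac{\sigma (t)}{E}+\int _{-\infty }^{t}\phi (t-s)\,\sigma (s)\,\mathrm{d}s, \end{aligned}$$

where E is an elastic modulus and \(\phi \) a memory kernel characterizing the material. The instantaneous state of the system no longer suffices for prediction: the integral reaches arbitrarily far into the past, which is precisely what provoked Painlevé’s complaint.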

A more recent perspective on rigid body mechanics and continuum mechanics is given by Mark Wilson, who has summarized the situation by noting that classical mechanics ‘must inevitably compress swatches of very complicated physical behavior into simplified rules of thumb’ (Wilson 2009, 175). In a paper titled ‘What is classical mechanics anyway?’, Wilson notes:

As matters are commonly represented within modern college primers, ‘classical physics’ appears to be a transparent subject matter firmly founded upon Newton’s venerable laws of motion. But this placid appearance is deceptive. Any purchaser of an old home is familiar with parlor walls that seem sound except for a few imperfections that ‘only require a little spackle and paint.’ When those innocent dimples are opened up, the ancient gerry-rigged structure comes tumbling down and our hapless fix-it man finds himself confronted with months of dusty reconstruction. So it is with our subject, whose basic concepts can seem so ‘clear and distinct’ on first acquaintance that unwary thinkers have mistaken them for a priori verities. But the true lesson of ‘classical mechanics’ for philosophy should be exactly the opposite: the conceptual matters that initially strike us as simple and pellucid often unwind into hidden complexities when probed more adequately (Wilson 2013, 43).

To sum up: in order to deal with the full range of problems in mechanics, physicists in the nineteenth century had to work with various mathematical techniques, and often with simplifying assumptions and idealizations in order to make problems mathematically tractable. Determinism was often not a point of special concern, but it would in fact have been far from trivial to establish rigorously: a demonstration that the equations of motion that were used would always yield unique solutions for given initial conditions was far beyond what was mathematically feasible. This does not mean that there was reason to think that mechanics allowed for indeterministic processes; but it does show that the idea that all mechanical processes can be fully described through a single set of (second-order differential) equations was far removed from scientific practice.

2.4 From mechanics to physics

So far, we have examined the foundations of determinism within mechanics; in order to argue on the basis of determinism in mechanics that physics as a whole is deterministic, one would need to assume that all of physics can be reduced to mechanics. That this can be done was part of the research programs of Boscovich and Laplace, and Du Bois-Reymond also adhered to this idea.

The nineteenth century saw enormous development in different domains of physics, including electrodynamics and thermodynamics—in fact, physics as a discipline was only established during the nineteenth century. There were extensive attempts to reduce electrodynamics and thermodynamics to mechanics, e.g., by finding a mechanical model for the electromagnetic ether, and by giving a mechanical derivation of the second law of thermodynamics—there was a prevalent idea that natural phenomena could only really be understood through reduction to matter and motion. Thus, J. J. Thomson argued in 1888 that the belief in the possibility of mechanical explanation of natural phenomena was ‘the axiom on which all Modern Physics is founded’ (Thomson 1888, 1). Some important results were reached: probably the main success was the development of the kinetic theory of gases, which made it possible to define heat in terms of the motion of atoms. However, by the end of the century, there was also a movement against the idea that physicists should actively try to account for all phenomena through mechanical models. This movement was led by physicists such as Ernst Mach, Pierre Duhem and Wilhelm Ostwald, who argued that the attempts to give mechanical explanations of all physical phenomena were often cumbersome and unfruitful, leading physicists to devise mechanical models that were speculative and needlessly complicated, and that it was often methodologically more fruitful to examine relations between observable phenomena than to attempt to reduce them to configurations of matter and motion. Mach, Duhem and Ostwald furthermore argued that there may well be natural phenomena and natural laws which are fundamentally irreducible to mechanics; an important case was the second law of thermodynamics, which described irreversible behavior that could not be derived from mechanics (Klein 1973; Van Strien 2013).

As early as 1872, Mach argued that the foundational status often given to mechanics could be explained by the contingent historical fact that mechanics was the first area of science to develop into an exact science:

It is the result of a misconception, to believe, as people do at the present time, that mechanical facts are more intelligible than others, and that they can provide the foundation for other physical facts. This belief arises from the fact that the history of mechanics is older and richer than that of physics, so that we have been on terms of intimacy with mechanical facts for a longer time. Who can say that, at some future time, electrical and thermal phenomena will not appear to us like that, when we have come to know and to be familiar with their simplest rules? (Mach 1911 [1872], 56–57)

In fact, in the late nineteenth and early twentieth century, there were proposals to take another area of physics as foundational: specifically, there were attempts at a unification of physics based on electrodynamics, or on energy and thermodynamics. In the ‘electrodynamic world view,’ proposed in 1900 by Wilhelm Wien, matter was conceived of as structures in an electromagnetic ether, and the laws of mechanics were to be explained electrodynamically—this attempted reduction of physics to electrodynamics looked promising for a while, but ultimately its success remained limited (Kragh 2002b, 2014). Wilhelm Ostwald proposed to take energy as fundamental and to unify physics by reducing it to the science of energy. This approach, which went under the name ‘energeticism,’ drew much attention but also harsh criticism, among others from Max Planck and Mach; in the end, it was not very successful (Hiebert 1971; Kragh 2014).

Boltzmann positioned himself against the trend of anti-mechanism and persevered in his work on the kinetic theory of gases, with the aim of giving a mechanical account of thermodynamic phenomena. The kinetic theory of gases, developed from the mid-nineteenth century by Rudolf Clausius, Maxwell and Boltzmann among others, explained the behavior of gases through the motions of the atoms composing them. It started out as a fully mechanical theory; however, in the course of time, Boltzmann found it necessary to give a more central role to probabilities in the theory. Whereas in Boltzmann’s earlier work, the probabilities used within the kinetic theory of gases were determined by the mechanical properties of the gas, at some point he introduced probabilities which relied on assumptions about the state of the gas as a whole and could not directly be reduced to its mechanical properties. According to Uffink (2007, 928), this marks the transition from the kinetic theory of gases to statistical mechanics. Even though Boltzmann’s statistical mechanics is mechanical in the sense that it describes systems composed of moving particles, the macroscopic evolution of these systems cannot be fully derived from the equations of motion of these particles without probabilistic assumptions (Uffink 2007, 972).

Thus, during the nineteenth century, physics became more complex and plural. By the late nineteenth century, the range of phenomena studied and the types of mathematical equations and methods used to describe these phenomena had become very diverse, and the idea that all of physics could be reduced to mechanics and all of mechanics could in turn be reduced to point particle mechanics was increasingly criticized. This does not necessarily indicate a move away from determinism: of course, the fact that there are theories in physics which cannot be reduced to mechanics does not imply that there are cases in which determinism fails. But the only feasible way to demonstrate that all processes in physics are deterministic would be to find a relatively small set of entities and laws of nature, to which everything else reduces. The electrodynamic and energetic world views were attempts at such a unification; however, they remained ambitious research programs with a limited number of adherents, there was no general consensus about how promising they were and ultimately their goals of unification were never achieved. Generally, by the end of the nineteenth century, it had become increasingly unfeasible to reduce all of physics to a basic set of equations and to establish that these equations would always have a unique solution for given initial conditions.

To what extent determinism was generally accepted in nineteenth-century physics is not easy to judge. As Brush (1976b) notes, in the nineteenth and early twentieth century, there was often a certain ambiguity in the use of terms like ‘randomness,’ ‘indeterminism,’ ‘chance,’ ‘spontaneous,’ and ‘probabilistic,’ and it is often not clear whether physicists argued that there is genuine indeterminism, or whether they were merely expressing ignorance of the exact paths of particles or of the causes of movement. Moreover, there were many physicists who were simply not concerned with the question whether or not physics is deterministic. But at least some cases can be found of physicists who doubted or explicitly rejected determinism. Notable examples include Maxwell, who argued that unstable mechanical systems could allow for the intervention of an undetermined free will (Maxwell 1995 [1873]), and Boussinesq, who used the possibility of failures of determinism in point particle mechanics as the basis for an elaborate theory of free will and organic life (see Sect. 2.2). Generally, those who rejected determinism in physics often did so in the context of religion, and often argued for the possibility of free will and/or for a vitalist conception of organic life (on indeterminism in the nineteenth century, see Nye 1976; Hacking 1983; Van Strien 2014b, c; Romizi 2019). At the same time, many physicists seem to have taken for granted that all physical processes are uniquely determined, often without explicitly arguing for this, and without necessarily being committed to the idea that current theories of physics could yield a deterministic description of all natural processes.

3 Determinism around 1900

3.1 Descriptionism and determinism

In the late nineteenth century, many physicists argued, in different ways, that natural science cannot answer ontological questions and cannot provide us with a true representation of nature. It is possible to describe and predict phenomena by means of scientific theories, but one should not assume that these theories correspond with nature in every respect. Heilbron (1982) has proposed the term ‘descriptionism’ for this trend. It is a broad term describing a general tendency which, according to Heilbron, can be found in the work of physicists as diverse as Mach, Poincaré, Duhem and Boltzmann: despite significant differences in their philosophies of science, they all share a certain ontological humility and emphasize that science cannot yield true representations of nature.

This trend of descriptionism was not a break with the past, but was continuous with earlier developments. Already in 1876, Gustav Kirchhoff argued in the introduction to his Vorlesungen über Mechanik (Kirchhoff 1876) that the aim of mechanics is merely to describe the motions of matter, and not to inquire into the causes of motion. Mach argued in the 1880s that science can merely describe the relations between observable phenomena, and that the aim of science is merely to arrive at efficient ways to report experiences.Footnote 12 Whereas Mach and Kirchhoff stressed that we should stick to the phenomena and not speculate about underlying causes, a different strand of descriptionism was based on the liberal use of mechanical models. Maxwell and Kelvin devised detailed mechanical models, but used these models freely, switching between different models and not being too concerned about inconsistencies between them. This free attitude stemmed from the fact that they did not demand that these models give a true and exact representation of nature. Maxwell thought that it was generally fruitful to work with concrete mechanical models, and argued for a method of analogies: analogies enable us to work with a ‘clear physical conception’ without being committed to specific hypotheses about, e.g., the inner structure of matter (Maxwell 1856). Maxwell’s use of models was praised by Boltzmann, who argued for the heuristic value of visualizable models, or ‘pictures,’ of reality, which do not have to be taken as true representations of nature (Boltzmann 1902). In a lecture in 1899, Boltzmann argued that

[N]o theory can be objective, actually coinciding with nature, but rather (...) each theory is only a mental picture of phenomena, related to them as sign is to designatum.

From this it follows that it cannot be our task to find an absolutely correct theory but rather a picture that is as simple as possible and represents phenomena as accurately as possible. One might even conceive of two quite different theories both equally simple and equally congruent with phenomena, which therefore in spite of their difference are equally correct.

Many questions that used to appear unfathomable thus fall away of themselves. How, it used to be said, can a material point which is only a mental construct, emit a force, how can points come together and furnish extension, and so on? Now we know that both material points and forces are mere mental pictures. The former cannot be identical with something extended, but can approximate as closely as we please to a picture of it. The question whether matter consists of atoms or is continuous reduces to the much clearer one, whether the continuum is able to furnish a better picture of phenomena (Boltzmann 1974, 91; and see De Regt 1999; De Courtenay 2002).

More origins of descriptionism can be found, and they can be traced further back in time. Schiemann (2008) and Pulte (2000, 2009) have described how the ideal of absolute truth in science was abandoned during the nineteenth century.Footnote 13 Pulte (2009) argues that this abandonment of absolute truth was partly brought about by internal problems within mechanics, which led to an abandonment of the idea that mechanics is built up deductively from absolutely certain axioms. Romizi (2019) uses the term ‘Ent-Ontologisierung’ (de-ontologization) to describe the shift in the attitude of scientists toward scientific theories in the late nineteenth century, and traces the origin of this development all the way back to Kant, who already argued that science cannot describe the world as it really is, and that, for example, causality is not a feature of the world in itself but rather a category of understanding.

The development which Heilbron describes as ‘descriptionism’ was thus a broad movement, encompassing various other ‘-isms,’ such as conventionalism, positivism, and instrumentalism. There were significant differences between the positions of, e.g., Maxwell, Mach, Poincaré, Duhem, and Boltzmann: they had different attitudes to atomism, made different judgments about the heuristic value of hypotheses and of detailed mechanical models, and attached different values to unification and consistency in science. How broadly the idea that scientific theories cannot be taken as true representations of nature was shared can be seen from the fact that even Du Bois-Reymond, despite his strong mechanical reductionism, argued that matter and force cannot be taken to exist in the way we conceive of them. We have already seen in Sect. 2.2 that Du Bois-Reymond thought that the nature of matter and force remains unknown to us; in fact, he argued already in 1848 that both our concept of matter and that of force are abstractions from reality (Du Bois-Reymond 1848, xlii; Finkelstein 2013, 283). A similar point had previously been made by Helmholtz (von Helmholtz 1847). For Du Bois-Reymond, this means that the aim of science can merely be to describe the motions of matter, not to investigate their causes.

But despite the broadness of ‘descriptionism,’ and the fact that it can be found throughout the nineteenth century and across many authors who otherwise had significantly different views on natural science, it does point to a general tendency which was increasingly emphasized toward the end of the century: a rejection of metaphysical explanation and of claims to absolute truth, and a rhetoric of modesty in the aims of science.Footnote 14

This descriptionist rhetoric is often found in lectures for a general audience; in this period, scientists were expected to give public lectures regularly, and physicists often used these occasions for philosophical reflections on their field (Stöltzner 2011). Heilbron has argued that descriptionism should primarily be understood in the context of scientists’ engagement with the general public, as largely a reaction to a negative public image of natural science. Natural science was held responsible for uncontrolled technological development and was seen as materialistic, atheistic, crude, and arrogant, removing all mystery, emotion, and value from the world (Heilbron 1982, 57–58; and see MacLeod 1982). Heilbron argues that in order to counteract this public image, many physicists in the late nineteenth century distanced themselves from broad scientistic and materialistic world views and argued that their aim was merely to describe the phenomena. These public aspects are indeed significant, but they do not give the full picture: as we have seen, there were also strong internal causes for descriptionism. Physicists in this period partly used public lectures to debate the aims and methods of physics, and the shift in ultimate aims and explanatory ideals that can be seen in these lectures also shaped scientific practice in various ways.

The modesty of descriptionism seems far removed from the Laplacian ideal of predicting all occurrences in nature by tracing the motion of atoms and goes against the confidence and optimism often associated with classical physics. Indeed, in the late nineteenth and early twentieth century, it is rare to find physicists making broad claims in the style of Laplace or Du Bois-Reymond. However, despite the rhetoric of modesty, descriptionism could also convey a greater degree of certainty to physics: ‘By giving up metaphysics and relying instead on mathematical description, physics could eliminate whatever was doubtful and attain to almost perfect rigor and certainty’ (Porter 1994, 135; see also Staley 2008b). This point was already made by Duhem, who argued in 1906 that by restricting the aim of physics to describing and classifying the phenomena, rather than explaining the underlying causes, physics can be made independent of metaphysics, and thereby a higher degree of certainty can be reached (Duhem 1991).

If descriptionism implies a focus on mathematical description rather than metaphysical explanation, this does not have to stand in the way of determinism. However, there are other reasons why descriptionism could undermine commitments to scientific determinism. Among physicists in the late nineteenth century, there was a widely shared idea that scientific theories should be understood as models which do not correspond with nature in every way, and that laws of nature always involve some degree of idealization and abstraction. For example, Mach, Boltzmann and Poincaré all argued that laws of nature can only hold with good approximation and not absolutely, and that we cannot formulate laws of nature without idealizing. Mach argued that in order to formulate laws of nature, we have to look for regularities within the observed phenomena, and in order to do this we must always abstract away from the details and single out that which seems important to us; therefore, there is always a certain loss of information and a lack of complete accuracy (Mach 1903, 222). According to Boltzmann, ‘No equation represents any process with absolute precision; each idealizes them, emphasizes common features and disregards differences, and thus goes beyond experience’ (Boltzmann 1905, 222; see also Poincaré 1921, 340ff).Footnote 15

Careful consideration of the relation between theory and reality can also be seen in Poincaré’s attitude to the problem of the stability of the solar system. Poincaré is commonly regarded as the grandfather of modern chaos theory, mostly because of his work on the n-body problem: the problem of the motion of a small number of mass points attracting each other with gravitational force. In the late nineteenth and early twentieth century, this problem drew attention because of its relevance for the stability of the solar system: if the sun, planets, and moons are modeled as mass points, can one prove mathematically that the solar system is stable and will continue to hold together even in the far future? Mathematically, this turns out to be a very complicated problem, and as a mathematician, Poincaré made important contributions to it (Parker 1998). Poincaré noted, however, that the problem was of limited physical interest. He argued that, however successful the models based on mass points and gravitational force were in astronomy, they could never yield an exact representation of the solar system: the sun, planets, and moons are of course not actual mass points, Newton’s law of gravity may not be rigorously exact, and in our solar system, there are other forces at play than gravitational forces between the centers of mass (Poincaré 1898).Footnote 16 These issues became all the more pertinent when it turned out that in the mathematical model of the solar system, stability depends sensitively on initial conditions: even small differences in initial states could lead to very different evolutions. As the actual state of the solar system cannot be determined with mathematical precision, and the mathematical model can never be an exact representation of the actual system, the question of the stability of our solar system had to remain unanswered.Footnote 17
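The sensitivity Poincaré identified can be made concrete with a toy computation. The following sketch is a modern illustration, not a computation from Poincaré, and uses the simple logistic map as a stand-in for the far more complex n-body dynamics: two trajectories whose initial states differ by one part in ten billion diverge within a few dozen iterations, so that any uncertainty in the initial state swamps the prediction.

# Illustrative sketch: sensitive dependence on initial conditions in the
# logistic map x -> r*x*(1-x), a standard toy model of chaotic dynamics.
# (A modern illustration, not Poincaré's own computation.)

def logistic(x, r=4.0):
    # One iteration of the logistic map; chaotic for r = 4.
    return r * x * (1.0 - x)

x, y = 0.4, 0.4 + 1e-10  # two initial states differing by 10^-10
for step in range(1, 61):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")

After roughly forty iterations the two trajectories are effectively uncorrelated, even though the map itself is perfectly deterministic; this mirrors Poincaré’s point that exact equations do not guarantee predictability when initial states are known only approximately.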

For the issue of determinism, the claim that scientific theories never exactly correspond with reality implies that determinism on the level of scientific theories does not necessarily have to correspond with determinism on the ontological level. If the laws of nature are taken to be rigorously valid, then the fact that these laws always yield unique solutions for given initial conditions can be used to demonstrate that nature itself is deterministic—even if in practice, we can never make absolutely certain predictions, because we can never know initial conditions with absolute accuracy. But if the laws of nature only hold approximately (even with a very good approximation), the fact that they yield exact and unique predictions does not necessarily have to imply that nature itself is deterministic.

Romizi (2019, 259) has recently argued that the tendency of late nineteenth century scientists to problematize the relation between theory and reality, and to regard scientific theories as only providing (possibly very good) models of reality, led to a weakening of determinism in this period. She notes, however, that this ‘gap’ between theory and reality could in certain cases also be used to defend determinism against possible counterexamples. In fact, Joseph Bertrand appealed to the idea of a discrepancy between theory and reality in his criticism of Boussinesq’s arguments for indeterminism (Bertrand 1878, 520; Van Strien 2014b). But although in this case a problematization of the relation between theory and reality provided a way to defend determinism, in general it probably had primarily the effect of undermining determinism, by undermining the inference from theories to reality. Although most physicists at the time did not explicitly reflect on the issue, the fact that scientific theories were generally conceived of as highly reliable but still idealized models, which could not be taken to correspond directly to reality, could be taken to imply that even if there were a unified theory of physics which always yielded unique solutions for given initial conditions, one would still not be justified in inferring that nature itself is deterministic.

3.2 Determinism as a presupposition of science

By the late nineteenth century, broad, sweeping claims about universal determinism had gone out of fashion. However, the ideal of determinism did not disappear from physics altogether; increasingly, it took the shape of a necessary presupposition of science, rather than an established result within physics or a broad metaphysical principle.Footnote 18 This section focuses on the conceptions of determinism of Mach, Boltzmann and Poincaré, three of the major physicists of this period, all of whom formulated general ideas on philosophy of science. It argues that although their philosophies of science differ considerably, there is a striking similarity in their ideas on the status of determinism in physics: all three thought of determinism as an assumption which one must make when doing scientific research.

In his book on the history of mechanics, Mach argues that within mechanics, any process is uniquely determined by Newton’s laws of motion (Mach 1897 [1883], 257). Yet, the same book contains a sharp criticism of Laplacian determinism:

If the French encyclopedists of the eighteenth century believed they were close to the goal of explaining all of nature physico-mechanically, if Laplace imagined a spirit who could indicate the course of the world in the whole future, if only it knew all the masses with their positions and initial velocities, then this joyful overestimation of the scope of the physical-mechanical insights gained in the eighteenth century is forgivable, even a charming, noble, uplifting spectacle, and we can vividly sympathize with this intellectual joy, which is unique in history.

But after a century, after we have become more level-headed, the projected world view of the encyclopedists appears to us as a mechanical mythology in contrast to the animistic mythology of the old religions. Both views contain improper and fantastic exaggerations of a one-sided knowledge. (Mach 1897 [1883], 455; see also Mach 1903, 217).

There are several reasons why, for Mach, Laplacian determinism is not tenable. Although according to Mach mechanics itself, as a scientific theory, is deterministic in the sense that it describes processes which are uniquely determined by the laws of motion, Mach rejects the idea that all of science is reducible to mechanics: mechanics only describes certain aspects of nature (Mach 1897 [1883], 499). He particularly objects to the idea that physiology and psychology are reducible to mechanics, arguing that mechanics is ultimately based on sensory perception which is to be understood physiologically, and therefore mechanics cannot be an ultimate foundation. As we have seen in Sect. 2.4, Mach also objected to the idea that all domains of physics should be reducible to mechanics. Thus, even if mechanics is deterministic, this says little about whether nature as a whole is deterministic.

Mach furthermore objected to the universal scope and the temporality of Laplace’s determinism, according to which the current state of the universe, together with the laws of nature, determines all future and past states. According to Mach, this depends on a notion of absolute time which is not warranted. Mach argues that time is relational, which means that to describe how something changes in time, one has to describe how it changes relative to a part of the world functioning as a clock; e.g., describing the motion of the heavenly bodies in time is equivalent to describing the motion of the heavenly bodies relative to the rotation of the earth. But for the universe as a whole, there is no clock relative to which we can describe its changes. Therefore, Laplace’s statement of determinism depends on the false assumption that we have a notion of time which can be applied to the universe as a whole (Mach 1872, 36–37). More generally, in Mach’s account, laws of nature are essentially summaries of experience, and this entails that they cannot be extrapolated too far beyond experience. For these reasons, it would, according to Mach, be overconfident to assume that our laws of nature hold absolutely and are applicable to the universe as a whole. Laplacian determinism oversteps the boundaries of science.

Nevertheless, there are no indications that Mach would accept a role for fundamental chance in physics. In fact, he still adhered to determinism in a weaker sense. According to Mach, the aim of science is to find dependencies between observable phenomena, and he argues that the ‘law of causality’ should be interpreted as the statement that there are indeed dependencies between the phenomena to be found, or in other words, that we can find laws of nature (Mach 1872, 34; Mach 1897 [1883], 492). According to Mach, this is an assumption we need to make when doing science: to do scientific research is by definition to look for dependencies between phenomena, and thus, the assumption that there are such dependencies to be found is the assumption that science is possible. However, this assumption has to do with our scientific activity, rather than with nature: Mach stresses ‘that all forms of the law of causality arise from subjective instincts, to which there is no need for nature to correspond’ (Mach 1897 [1883], 495).

In later years, Mach argued that it is better to avoid the terms ‘cause’ and ‘causality’ in the context of physics and to speak only about functional dependence. He argued that the notion of cause is imprecise, and that to specify functional dependencies between phenomena is both more precise and more informative than to state causes of phenomena; therefore, an advanced science like physics deals (or at least should deal) with functions rather than with causes. In line with this, in his book Erkenntnis und Irrtum (1905), he no longer speaks about the ‘law of causality’ but rather uses the term ‘determinism’ to express the assumption that there are functional dependencies between phenomena. Mach presents this as a postulate which is needed when doing research, even though it cannot be proven:

The correctness of the position of ‘determinism’ or ‘indeterminism’ cannot be proven. It could only be decided if science were completed or demonstrably impossible. We are dealing here with presuppositions that one brings to the consideration of things, depending on whether one attaches a greater subjective weight to the successes or to the failures that research has attained so far. But during research, every thinker is necessarily a theoretical determinist (Mach 1905, 277).

However, scientists must always be aware of the possibility that their theories fail:

The researcher must (...) always be ready for disappointment. He never knows whether he has already taken into account all the dependencies that may be considered in a given case. His experience is limited in space and time and only offers him a small section of world events. No fact of experience is exactly repeated. Every new discovery reveals flaws in our understanding and reveals a previously neglected remainder of dependencies. So also he who advocates an extreme determinism in theory must in practice remain an indeterminist, especially if he does not want to speculate away the most important discoveries. (Mach 1905, 278).

Thus, determinism is an assumption which is needed when doing scientific research, but which is to be avoided when reflecting on the scope of current scientific theories. Science is about finding dependencies between phenomena, and this is an ongoing process. When doing science, one should look for dependencies between phenomena and assume that such dependencies are to be found. At the same time, scientists should allow for the possibility that the dependencies that have been found do not hold absolutely at every scale; they should be aware of the limits of their theories and open to the possibility that they have missed something. Ultimately, there is no guarantee that we can find laws of nature determining every natural process in full detail. The issue of determinism versus indeterminism is therefore, for Mach, not a metaphysical issue. It is not about whether nature is fully determined or whether there is pure chance; that question remains unanswerable. Rather, the issue concerns the assumptions made by scientists and their attitude toward their work.

A similar conception of determinism can be found in the work of Boltzmann. Boltzmann is mostly known for his work in statistical mechanics, which is usually seen as the main source of indeterminism in the early twentieth century; therefore, he is often seen as a forerunner of later indeterminism in physics (Stöltzner 1999). In his work on statistical mechanics, Boltzmann introduced probabilistic assumptions which could not be derived from the dynamical laws and worked with a notion of ‘molecular disorder’ (see Uffink 2007). Moreover, he argued for the idea that the second law of thermodynamics is a law of nature which holds statistically rather than absolutely, thus supporting the idea that laws of nature can be statistical.

Nevertheless, in a lecture in 1899, Boltzmann argued that the assumption that natural processes are uniquely determined is a precondition for science:

A precondition for all scientific knowledge is the principle of unique determination of natural processes; in the case of mechanics, the unique determination of all motions. This means that bodies do not move purely by chance, now this way and now that way, but that the motions are uniquely determined by the circumstances under which the body is located. If each body moved however it wanted, if under the same circumstances now this, now that motion would follow by chance, we could only curiously observe the course of the phenomena, not investigate it (Boltzmann 1905, 276–277).

Boltzmann notes that this principle becomes unusable if the ‘circumstances under which the body is located’ include the universe as a whole. If the movement of an object on earth could just as well be determined by processes taking place in another solar system, scientific investigation of the movement would be impossible. A restriction to local circumstances is therefore needed: we should assume ‘that the same movement always occurs when the direct environment is in the same state’ (Boltzmann 1905, 276–277). Boltzmann’s principle of determinism is thereby stronger than Mach’s, since Mach did not impose such a locality restriction.

Boltzmann argues for the Kantian idea that we have laws of thought, without which knowledge would be impossible and our observations would be without any connection. However, he argues against Kant that these laws of thought have no a priori certainty (see Boltzmann 1905, 398–399; Fasol-Boltzmann 1990, 160). Rather, our laws of thought are evolutionary in origin. As such, they are usually very effective in helping us make sense of the world, but because they have become such strong habits, they can in certain cases overshoot the mark (‘über das Ziel hinausschießen’—Boltzmann 1905, 399). Boltzmann compares this with a baby’s instinct to suck, which keeps the baby alive but is not effective in every instance. One of these laws of thought is the law of causality, which according to Boltzmann is the ‘foundation of all knowledge’ (Boltzmann 1905, 321). However, our tendency to look for causes can itself overshoot the mark: it can lead to superstition when we look for causes of chance events, and to useless philosophical puzzles, for example when we ask why we exist, why the world exists, or why the law of causality holds (Boltzmann 1905, 321, 354, 398–399).

Boltzmann’s principle of unique determination may likewise be interpreted as a law of thought: it is required for doing science, but there is no a priori guarantee that it holds. A problematic issue, according to Boltzmann, is how to determine what counts as local circumstances. If we really assume that the motion of bodies is always determined by their immediate surroundings, this excludes action at a distance, which makes it problematic to account for gravity. Alternatively, one could consider a larger area around the body: in the case of an object falling toward the earth, the earth can be included in its local surroundings, while distant stars are excluded. However, there is no strict criterion deciding which circumstances should be taken into account, and there is no guarantee that we can always find a set of local circumstances through which occurrences are determined (Boltzmann 1905, 278).

Furthermore, we may ask how Boltzmann’s claim that scientists need to assume that the motion of all bodies is locally determined relates to his statistical physics, in which molecules move around randomly. In fact, Boltzmann suggested the possibility that the laws of motion according to which molecules move around in a gas may be statistical regularities which hold on average for large numbers of molecules, while the motion of individual molecules is undetermined (Boltzmann 1898, 260). Boltzmann does not explicitly address how this relates to the principle of unique determination, but a plausible answer is that it depends on what one wants to investigate. In statistical physics, the behavior of a gas as a whole is studied, and then it is permissible to assume that the individual molecules move around randomly; but if the goal is to investigate the movement of an individual molecule, this is only possible under the assumption that it does not move around by chance but that its movement depends on circumstances.
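How random individual motions can nevertheless yield stable aggregate regularities can be illustrated with a minimal sketch (a toy example in modern terms, not Boltzmann’s own formalism): molecular speeds are drawn at random from an arbitrary distribution, and the sample mean settles down as the number of molecules grows, in line with the law of large numbers.

# Illustrative sketch: individually random molecular speeds produce a
# stable average as the number of molecules grows (law of large numbers).
# The exponential distribution here is an arbitrary stand-in.
import random

random.seed(0)
for n in (10, 1_000, 100_000):
    speeds = [random.expovariate(1.0) for _ in range(n)]
    mean = sum(speeds) / n
    print(f"n = {n:>7,}: mean speed = {mean:.4f}")

For small samples the mean fluctuates noticeably, but for large samples it approaches the distribution’s expected value; a ‘law’ stated at the level of the gas as a whole can thus hold reliably even if nothing determines the speed of any single molecule.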

Boltzmann’s principle of local determination is thus neither a priori certain nor derivable from experience. When doing science, we have to look for local determining circumstances of processes, but whether we can in fact establish a theory in which all processes are fully locally determined is an empirical question which Boltzmann leaves open.

In contrast to Boltzmann and Mach, Poincaré does seem to argue explicitly for a strong, universal Laplacian determinism. For example, he writes: ‘Knowing the present state of each part of the universe, the ideal scientist who knew all the laws of nature would possess fixed rules to deduce the state that these same parts will have the next day; it is conceivable that this process can be continued indefinitely’ (Poincaré 1917 [1913], 7). Like Laplace, Poincaré especially emphasizes determinism in the context of astronomy: he writes that the accurate predictions that are possible within astronomy provide a model for other sciences (Poincaré 1921, 289ff).

Nevertheless, in my view, Poincaré too ultimately regards determinism as an assumption we have to make when doing scientific research, rather than as an ontological claim about nature or an established feature of our current scientific theories. According to Poincaré, the idea that our universe is deterministic depends on certain assumptions which scientists (in particular physicists) make, and which are needed either to make science possible or to make specific scientific problems much easier to deal with. In his writings, Poincaré discusses a number of these assumptions.

First, as we have seen in Sect. 3.1, Poincaré argues that laws of nature are always approximate and incomplete. However, in order to do science, scientists need to work with the assumption that it is possible to find ever more accurate laws of nature, which will enable them to make better predictions:

[N]o particular law will ever be more than approximate and probable. Scientists have never failed to recognize this truth; only they believe, right or wrong, that every law may be replaced by another closer and more probable, that this new law will itself be only provisional, but that the same movement can continue indefinitely, so that science in progressing will possess laws more and more probable, that the approximation will end by differing as little as you choose from exactitude and the probability from certitude. (Poincaré 1921, 341)

If we find that a certain law fails, whether beyond a certain limit or by leading to contradictions when extrapolated, we assume that it can be replaced by some other, more accurate law.

Second, like Boltzmann, Poincaré argues that physicists generally assume that only the local environment of the phenomena they are studying has an effect (Poincaré 1921, 346). In addition to this assumption of spatial locality, physicists generally assume that an event is determined by the situation immediately before it, and that one does not need to know what happened in the distant past in order to predict future occurrences. The assumption that the state of a system at an instant can be derived from the state at the preceding instant makes it possible to work with differential equations (Poincaré 1921, 136).

Furthermore, physicists generally work with the assumption that from similar antecedents follow similar consequents; Poincaré states that this can be made mathematically exact through the assumption that the consequent is a continuous function of the antecedent (Poincaré 1921, 346). More generally, he claims that we can always express the relations between physical quantities by mathematical functions which are continuous, differentiable and analytic. According to Poincaré, this assumption cannot be falsified, since any function can be approximated closely enough by an analytic function to be observationally indistinguishable (Poincaré 1905, 296–297; Poincaré 1921, 288; Van Strien 2015). As Poincaré points out, these assumptions automatically lead to a deterministic conception of the world (Poincaré 1905, 295). The assumption that the relations between physical quantities can always be expressed by analytic functions guarantees that any process can be described by differential equations, and, although Poincaré does not make the point explicit, because of the property of analyticity these equations are guaranteed to have a unique solution for given initial conditions and thus to uniquely determine what will happen.
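The implicit step can be reconstructed in modern terms (Poincaré does not spell it out): a real-analytic right-hand side is in particular locally Lipschitz continuous, and the Picard–Lindelöf theorem then guarantees a unique local solution for any initial condition,

\[
\dot{x} = f(x, t), \qquad x(t_0) = x_0,
\]
\[
f \ \text{analytic} \;\Longrightarrow\; f \ \text{locally Lipschitz in } x \;\Longrightarrow\; \exists!\ x(t) \ \text{on some interval} \ (t_0 - \varepsilon,\ t_0 + \varepsilon),
\]

so that, under Poincaré’s assumptions, the present state of a system fixes its evolution uniquely, at least locally in time.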

To summarize, according to Poincaré, physicists work with, among others, the following assumptions: (1) we can always find more accurate laws of nature; (2) phenomena are determined by circumstances which are close to them in both space and time; (3) physical quantities take continuous values, and the relations between them can be expressed by continuous functions. These assumptions are not empirically verifiable or falsifiable, and none of them can be taken as an a priori truth; nevertheless, we have to make such assumptions in order to make progress in science.

Generally, according to Poincaré, scientific theories are based on experience as well as on various types of hypotheses. Among the types of hypotheses he distinguishes is that of ‘natural hypothesis’:

There are first those [hypotheses] which are perfectly natural and from which one can scarcely escape. It is difficult not to suppose that the influence of bodies very remote is quite negligible, that small movements follow a linear law, that the effect is a continuous function of its cause. I will say as much of the conditions imposed by symmetry. All these hypotheses form, as it were, the common basis of all the theories of mathematical physics. They are the last that ought to be abandoned. (Poincaré 1921, 135)

The examples which Poincaré here gives of natural hypotheses are very similar to, and in part identical with, the above-mentioned assumptions. Natural hypotheses are hypotheses which we cannot test empirically, and of which we have no a priori guarantee that they hold; nevertheless, it would be very difficult or even impossible to do science without such hypotheses.

Poincaré is known for his conventionalism and neo-Kantianism in philosophy of science. De Paz (2014) has argued that natural hypotheses can be understood as a type of convention: despite the fact that they cannot be verified, the scientist can decide to adopt these hypotheses because they are useful for the constitution of scientific theories. It is clear that to label these assumptions as ‘conventions’ does not mean that they are arbitrary. In general, Poincaré argues that although conventions can be freely chosen, they are not arbitrary: ‘Experiment leaves us our freedom of choice, but it guides us by aiding us to discern the easiest way. Our decrees are therefore like those of a prince, absolute but wise, who consults his council of state’ (Poincaré 1921, 28). The choice for conventions is thus guided by practical considerations and experience (Poincaré 1921, 352; on Poincaré’s conventionalism, see, e.g., Psillos 2014; Ivanova 2015). It can also be argued that natural hypotheses are stronger than conventions, as they are necessary preconditions for science, and that they should rather be seen as playing the role of synthetic a priori principles (Heinzmann and Stump 2017).Footnote 19

Now we can come back to Poincaré’s conception of determinism. In The Value of Science (first published in 1905), Poincaré reacts to the views of the philosopher Édouard Le Roy, who had used Poincaré’s ideas to argue for a more extreme version of conventionalism, according to which science is merely a human construct (see Psillos 2014). Le Roy argues that determinism depends on the assumption that our laws hold absolutely, but since this is an assumption which we make freely, it is of our own free will that we end up with a deterministic world view; thus, determinism is inherently contradictory. Poincaré is critical of Le Roy’s anti-intellectualism and skepticism about science, but admits that, in a sense, ‘we are determinists voluntarily’ (Poincaré 1921, 347). In his final work, Dernières Pensées, we find the following remark:

Science is deterministic; it is deterministic a priori; it postulates determinism, because without it, science could not exist. It is also deterministic a posteriori; if it started out by postulating determinism, as an indispensable condition for its existence, it then demonstrates determinism precisely by existing, and each of its conquests is a victory for determinism. (Poincaré 1917 [1913], 244).

For Poincaré, science is about finding laws of nature through which events are determined. Scientific theories are thus necessarily deterministic in the sense that they specify the conditions under which occurrences take place. However, this does not mean that there is a guarantee that determinism holds absolutely and that we can be sure that every event that takes place in nature is uniquely determined: although determinism must be presupposed in order for science to be possible, the possibility of science is not guaranteed, and we have to find out empirically how far we can get with the development of scientific theories. Thus, we can only confirm determinism insofar as science is successful. In Poincaré’s words, ‘science, rightly or wrongly, is deterministic; wherever it enters, it brings in determinism’ (Poincaré 1917 [1913], 245).

Despite significant differences in the philosophies of science of Mach, Boltzmann and Poincaré, there is thus a striking similarity in their positions on determinism. All three argue that science is essentially about finding the conditions under which phenomena occur (or, in Mach’s terms, about finding functional dependencies between the phenomena); therefore, in order to do scientific research, one has to assume that such conditions can be found, and that phenomena do not just take place by chance. In this sense, determinism is a necessary feature of scientific theories. But all three of them ultimately remain agnostic about whether nature is deterministic at the ontological level. For these authors, the issue of determinism is not about ontology, but rather about the presuppositions of science, the methods and assumptions that scientists use in their research, and the attitude of scientists toward their theories.

4 Conclusion: the image of classical physics

The use of the term ‘classical physics’ may have colored our perception of physics and mechanics in the eighteenth and nineteenth century: the term suggests a finished theoretical framework and may have contributed to a perception of physics in this period as a static and finished whole, rather than an active research area which was changing and developing, and in which there was a diversity of approaches and lively debate about the foundations of the field. The same holds for ‘classical mechanics.’ I think that it is only through the conception of classical physics as a unified and essentially complete theoretical framework that the idea that ‘nineteenth century physics was deterministic’ can appear as straightforwardly true.Footnote 20

Retrospectively, we can say that the aim of accounting for all phenomena within physics through a basic set of laws of motion was never reached. By the late nineteenth century, the idea that all processes within physics are uniquely determined by the laws of mechanics had lost a significant part of its plausibility: new developments in physics revealed a complex and diverse set of phenomena, which were described through a range of mathematical techniques, and it became increasingly unfeasible to reduce all of physics to a basic set of equations and to establish that these equations uniquely determine the course of any process. Moreover, it was often necessary to work with idealizations, which in the late nineteenth century led to the common acceptance of the idea that scientific theories merely offer idealized models: models with which it may be possible to formulate highly reliable predictions, but which cannot be taken to correspond exactly to reality.

Nevertheless, the claim that physics was never deterministic may be an overstatement. Among nineteenth century physicists, one finds some who explicitly argued for or against determinism, and quite a few who were silent on the issue. Overall, the majority of physicists in this period seem to have taken for granted that natural processes are uniquely determined and do not take place by chance; even though they could not demonstrate that this was indeed the case, there were also no compelling reasons to abandon this belief.Footnote 21 If determinism could not be demonstrated, neither could it be shown that there was fundamental chance. With the ontological humility that was characteristic of the late nineteenth century, the question whether nature is deterministic became undecidable. But even if physicists remained agnostic about determinism at the ontological level, it was still possible to hold on to determinism as an assumption that has to be made when doing scientific research. Examples can be found in the work of Mach, Boltzmann and Poincaré, each of whom conceived of determinism as a presupposition of science: if science is about finding out what happens under which conditions, then the working scientist, when investigating some phenomenon, should assume that its determining conditions can be found.