Reason: Discussing Laplace's Demon, an Old Thought Experiment
Still on the trail, trying to find the unknown and probably unknowable, I discuss an old thought experiment about determinism.
Author's Preface
Laplace's Demon is a thought experiment from the early modern period of science and philosophy. Laplace proposed that if a being knew the precise location and momentum of every atom in the universe at one particular time, then the entire past and future could be calculated exactly.
Although I find big problems with the coherence of Laplace's thought experiment, I note that in an unpublished set of musings I have argued that positing randomness as uncaused phenomena is self-defeating: all matter and events have structure and extent, which shows that constraints exist, even if the constraints themselves are not fully known. Events have frequencies; measurements have bounds of precision and accuracy; and matter is bounded in space and time. So, to assert that things are both uncaused and constrained violates everyday logic. Maybe this just shows the limitations of language; maybe it shows the limitations of human intellect.
I argue that even things regarded as possibly random in the subatomic world seem subject to constraints. Some invoke the magic of hidden "variables." I contend that "unknown factors" might be a better way of expressing this idea.
As a pragmatist who thinks that determinism is a better explanation than uncaused events and outcomes, I rest my case.
Introduction
Laplace's Demon is one of the most enduring thought experiments from the early modern period of science and philosophy. In 1814, Pierre-Simon Laplace proposed that if an intelligence—later dubbed "the demon"—knew the precise location and momentum of every atom in the universe at one instant, then the entire past and future could be calculated with absolute certainty. This vision exemplified the Newtonian mechanistic worldview, where the universe was imagined as a vast, perfectly predictable machine, operating according to immutable laws.
At the time, such a vision was plausible. The physical sciences were uncovering regularities in planetary motion, mechanics, and thermodynamics, suggesting a deeply ordered cosmos. Human ignorance was seen as the only obstacle to perfect prediction—not any limitation inherent in nature itself.
However, scientific developments in the twentieth century severely challenged this picture. Quantum mechanics, in particular, introduced the idea that uncertainty might be not merely a matter of ignorance, but a fundamental feature of reality itself. According to standard interpretations like the Copenhagen interpretation, events at the quantum level are governed by intrinsic probabilistic laws: an electron's position and momentum cannot both be known precisely, not because of practical limitations, but because nature itself does not possess simultaneously definite values for both.
Despite these scientific developments, it is crucial to recognize that interpretations of quantum experiments vary immensely. While some interpretations see indeterminacy as fundamental, others—such as the de Broglie–Bohm pilot-wave theory and the many-worlds interpretation—retain a form of determinism, suggesting that quantum uncertainty is a feature of our epistemic perspective rather than of the underlying ontology. No experimental result has unambiguously favored one interpretation over others.
Thus, while quantum mechanics challenges Laplacian determinism, it does so in a way that is itself interpretative. The mere majority view of physicists does not determine truth. Scientific consensus is historically contingent, vulnerable to paradigmatic shifts. History shows that once widely accepted scientific views—such as geocentrism, phlogiston theory, and classical ether theory—have later been discarded.
Laplace’s Demon remains, therefore, not merely an antiquated relic, but a living invitation to scrutinize our assumptions about causality, determinism, randomness, and the structure of explanation.
Discussion
The Superficial Plausibility of Laplace’s Demon
At first glance, the Demon thought experiment seems compelling. After all, everyday experiences show causality operating in familiar, law-governed ways: billiard balls collide predictably, planetary orbits follow calculable paths, and chemical reactions proceed according to well-established regularities. It seems natural to extrapolate from such examples to the entire cosmos.
However, deeper reflection exposes the fragility of Laplace's assumptions. For the Demon to predict everything, it must possess an absolutely complete specification of the universe's state at a given instant. This demands not only complete knowledge of every relevant factor but also an infinitely precise determination of each factor's value. Infinite precision is not merely difficult to attain; it is conceptually incoherent.
Furthermore, modern knowledge about complex systems, such as turbulence, weather systems, and chaotic dynamics, shows that even systems governed by deterministic laws can exhibit unpredictability due to extreme sensitivity to initial conditions. Tiny, unmeasurable differences can amplify over time, leading to wildly divergent outcomes—a phenomenon known as the "butterfly effect." Thus, determinism does not imply practical predictability.
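To make this concrete, here is a minimal sketch in Python (my own illustration, not anything from Laplace) using the logistic map, a standard one-line deterministic system. Two runs whose starting values differ by one part in a billion agree closely at first and then diverge completely, showing how deterministic rules combined with finite-precision knowledge still yield unpredictable outcomes.

```python
# Sensitive dependence on initial conditions in a deterministic system.
# The logistic map x_{n+1} = r * x_n * (1 - x_n) is fully deterministic,
# yet in its chaotic regime (r = 4.0) a perturbation of one part in a
# billion grows until the two trajectories become unrelated.

r = 4.0                     # parameter value in the chaotic regime
x, y = 0.2, 0.2 + 1e-9      # two nearly identical initial conditions

for n in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if n % 10 == 0:
        print(f"step {n:2d}: x={x:.9f} y={y:.9f} gap={abs(x - y):.2e}")
```

After roughly thirty iterations the gap is of order one: the two histories share nothing, even though every step of each was fully determined.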
Example: Weather Prediction
A concrete example is weather forecasting. The equations of fluid dynamics governing atmospheric motion are deterministic. Nevertheless, weather predictions become unreliable beyond a few days because small uncertainties in initial measurements mushroom into large-scale differences. Even with complete access to the governing equations, prediction is limited by practical and theoretical constraints on measurement precision.
Laplace's Demon, by contrast, would require complete knowledge at infinite resolution, a demand that no physical measurement process can satisfy.
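The same forecasting limit can be sketched with the Lorenz system, the simplified convection model in which the butterfly effect was first identified. The code below is a crude Euler integration with the conventional parameter values (sigma = 10, rho = 28, beta = 8/3); it is an illustration I am adding, not a real weather model. Two runs whose initial states differ by one part in a million shadow each other for a while, then separate by the full width of the attractor.

```python
# Two runs of the deterministic Lorenz system with almost identical
# starting states, integrated with a simple Euler step.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)          # the "true" atmosphere
b = (1.0 + 1e-6, 1.0, 1.0)   # the measured atmosphere, off by a millionth

for step in range(3001):
    if step % 500 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t={step * 0.01:5.1f}  separation={gap:.3e}")
    a = lorenz_step(a)
    b = lorenz_step(b)
```

The separation grows exponentially until it saturates at the size of the attractor itself: past that horizon, the "forecast" run tells us nothing about the "true" one.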
The Epistemological Problem: State vs. Knowledge
Laplace's formulation conflates two distinct concepts: the physical state of the world and the knowledge of that state. The Demon is imagined as a knower—a mind capable of possessing complete information. But complete information is not the same as the state itself. This confusion introduces epistemological assumptions into what purports to be a metaphysical claim about determinism.
Modern approaches suggest that discussions of determinism should refer only to the properties of physical systems themselves, without invoking hypothetical knowers. The issue is whether the current state of a system fully determines its future, not whether some intelligence can compute that future.
When reformulated properly, determinism concerns the structure of reality—not the cognitive abilities of demons.
Beyond Randomness: The Circularity of Standard Objections
A common rebuttal to Laplace’s Demon invokes quantum randomness. According to the standard interpretation, processes like radioactive decay occur without cause: an atom "chooses" to decay at a particular moment without any underlying reason.
Yet this rebuttal risks circularity. It presumes that indeterminacy is real and fundamental, but that is precisely the point in dispute. Invoking randomness as a given amounts to begging the question.
Moreover, quantum randomness itself is statistical. While individual events are unpredictable, the distribution of events follows statistical laws. Half-lives of radioactive isotopes, for instance, are known with great precision. This strongly suggests underlying constraints—even if the mechanisms remain obscure.
Example: Radioactive Decay
Consider the decay of a radioactive isotope. The timing of any single atom's decay appears unpredictable. But across a large sample, the proportion of atoms decaying over a given period conforms to a well-defined exponential curve. There is clear regularity in aggregate behavior, despite apparent randomness at the individual level.
Constraint is evident; the decay is not utterly chaotic. If decay events were truly unconstrained, no regular statistical pattern would emerge. Thus, invoking "randomness" alone does not eliminate structure or the need for explanation.
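A small simulation illustrates this aggregate regularity. In the sketch below (my own illustration; the per-tick decay probability is an arbitrary stand-in, not a measured constant), the moment at which any single atom decays is unpredictable, yet the count of survivors closely tracks the deterministic exponential law N(t) = N0 * exp(-lambda * t), with a stable implied half-life.

```python
import math
import random

# Each of N0 atoms decays independently with probability p per tick.
# Any single atom's decay time is unpredictable, yet the surviving
# population tracks the exponential law N(t) = N0 * exp(-lam * t).

N0 = 10_000             # initial atom count (arbitrary)
p = 0.05                # per-tick decay probability (arbitrary)
lam = -math.log(1 - p)  # matching continuous decay constant
half_life = math.log(2) / lam

alive = N0
for t in range(61):
    if t % 10 == 0:
        predicted = N0 * math.exp(-lam * t)
        print(f"t={t:2d}  survivors={alive:5d}  exponential law={predicted:8.1f}")
    # binomial thinning: each surviving atom independently decays or not
    alive = sum(1 for _ in range(alive) if random.random() > p)

print(f"half-life implied by p: {half_life:.2f} ticks")
```

The simulated counts hug the exponential curve tick after tick: precisely the kind of constrained regularity that the label "uncaused" leaves unexplained.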
The Self-Defeating Nature of "Uncaused Yet Constrained"
This leads to a more powerful critique: the claim that phenomena are both uncaused and constrained is logically self-defeating. Constraint implies structure—limits on what is possible, relations among properties, regularities across instances. But structure requires some form of causal or lawful underpinning. Otherwise, why would constraints exist rather than utter anarchy?
If decay times, particle interactions, or quantum fluctuations exhibit regularities, then those regularities demand explanation. Simply labeling them "uncaused" sidesteps the explanatory task. True uncausedness would entail utter absence of pattern, structure, or repeatability—which is not what we observe, except in some dreams.
Thus, even the phenomena most often cited as examples of randomness—quantum events, radioactive decay, vacuum fluctuations—show evidence of constraint. "Unknown factors" may better characterize the situation than "uncaused events." The unknown does not imply the uncaused.
Example: Brownian Motion
Brownian motion—the seemingly random movement of particles suspended in a fluid—once appeared to be a paradigm of randomness. However, deeper analysis showed that the random motion results from countless molecular collisions governed by physical laws. What looked random on the surface was structured beneath.
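A rough sketch of this point (mine, not Einstein's original analysis) models each suspended particle as receiving many tiny, independent molecular kicks per time step; the kick size and count are arbitrary illustrative values. Any single path looks erratic, but averaged over many particles the mean squared displacement grows linearly with time, the lawful diffusive signature hidden beneath the surface randomness.

```python
import random

# Each suspended particle receives many small independent "molecular
# kicks" per time step. Individual paths look erratic, but the mean
# squared displacement across particles grows linearly with time.

KICKS_PER_STEP = 50   # collisions per time step (arbitrary)
KICK_SIZE = 0.01      # push from a single collision (arbitrary)
PARTICLES = 1_000

positions = [0.0] * PARTICLES
for t in range(1, 51):
    for i in range(PARTICLES):
        positions[i] += sum(
            random.choice((-KICK_SIZE, KICK_SIZE))
            for _ in range(KICKS_PER_STEP)
        )
    if t % 10 == 0:
        msd = sum(x * x for x in positions) / PARTICLES
        theory = t * KICKS_PER_STEP * KICK_SIZE ** 2
        print(f"t={t:2d}  mean squared displacement={msd:.4f}  theory={theory:.4f}")
```

The lawful linear growth emerges from, and is explained by, the underlying collisions: structure beneath apparent randomness.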
Quantum phenomena may similarly reflect underlying factors not yet fully understood. To assert absolute uncausedness prematurely is not cautious science but metaphysical overreach.
The Problems of Infinite Divisibility and Extent
Laplace’s Demon, in its original formulation, does not explicitly depend on claims about infinite divisibility or infinite spatial extent. It operates on the premise that, given a complete specification of the current state of all matter in the universe—however defined—an intelligence could, in principle, compute all future and past states using known physical laws. The thought experiment does not engage directly with the metaphysical questions of whether the universe is infinitely large or whether matter is infinitely divisible.
Nonetheless, contemporary discussions often project these issues onto the thought experiment. Modern physics introduces the notion of a minimum length scale—the Planck length—beyond which classical concepts of space and time may lose meaning. Similarly, there are models in cosmology that suggest the universe might be spatially finite or topologically bounded. But these remain interpretive frameworks rather than empirically settled facts. We are dealing with the “high strangeness” of modern physics and cosmology.
The limits of the very small and the very large are not established by observation alone; they are inferred within particular theoretical models. Experimental data are interpreted through the lens of existing frameworks, and different models may yield divergent implications. Thus, to claim that such limits definitively undermine Laplace's thought experiment overreaches what the evidence can warrant. If anything, the thought experiment is undermined by its own overall lack of clarity.
Evidence does not "speak" independently of interpretation; it must always be embedded within a conceptual framework that defines what counts as a limit, a state, or even a prediction.
Summary
Laplace’s Demon once seemed a compelling embodiment of classical determinism. It promised a vision of total order, predictability, and intelligibility.
Today, the thought experiment stands revealed as deeply problematic. It rests on incoherent assumptions about infinite precision, conflates physical state with epistemic knowledge, and underestimates the complexities of structured but unpredictable phenomena.
Modern quantum mechanics complicates but by no means resolves these issues. Appeals to randomness do not defeat determinism; they merely reframe the debate. More fundamentally, the very notion of uncaused yet constrained phenomena collapses under logical scrutiny. Constraint implies structure; structure demands explanation, under any understanding of language and of the world.
Thus, whether one inclines toward deterministic or indeterministic interpretations, Laplace’s Demon no longer serves as a viable model. The world, it seems, is structured—but the nature of that structure, its causes, and our capacity to know them remain among the deepest mysteries.
Readings
Albert, D. Z. (2000). Time and chance. Harvard University Press.
Explores the relationship between determinism, chance, and the direction of time, challenging classical assumptions in light of quantum theory.
Barrett, J. A. (1999). The quantum mechanics of minds and worlds. Oxford University Press.
Provides a systematic analysis of quantum mechanics interpretations, especially many-worlds and Bohmian mechanics, with implications for determinism.
Canales, J. (2020). Bedeviled: A shadow history of demons in science. Princeton University Press.
Investigates how thought experiments involving demons—Laplace's among them—have shaped scientific epistemology and philosophical reflection.
Collier, J. (2002). Holism and emergence: Dynamical complexity defeats Laplace’s demon. Konrad Lorenz Institute for Evolution and Cognition Research (Working Paper No. 4/2002).
Argues that emergent phenomena and nonlinear dynamics in complex systems challenge the predictive power assumed by Laplace’s Demon.
Earman, J. (1986). A primer on determinism. D. Reidel.
Comprehensive and technically rigorous introduction to philosophical and physical determinism, including extensive discussion of Laplace's model.
Hoefer, C. (2016). Causal determinism. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Fall 2016 ed.). Stanford University. https://plato.stanford.edu/archives/fall2016/entries/determinism-causal/
Explores the conceptual foundations of causal determinism, the varieties of its interpretation, and its relevance to classical and quantum physics.
Popper, K. R. (1982). The open universe: An argument for indeterminism. Routledge.
Classic work arguing that science must reject determinism and embrace the openness of the future, based on logical and empirical grounds.
Prunkl, C., & Timpson, C. G. (2017). Quantum uncertainty and the impossibility of Laplace’s demon. Studies in History and Philosophy of Modern Physics, 58, 1–14. https://doi.org/10.1016/j.shpsb.2016.12.003
Critically examines whether quantum mechanics, under any interpretation, permits the full predictability envisioned by Laplace’s Demon.
Rovelli, C. (2016). Reality is not what it seems: The journey to quantum gravity (S. Carnell & E. Segre, Trans.). Riverhead Books. (Original work published 2014)
Discusses modern physics’ view of space and time as relational and discrete, challenging assumptions of classical continuous determinism.
Stöltzner, M. (2011). Laplace’s demon and its shadow: Pre-determined randomness. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 369(1956), 4097–4115. https://doi.org/10.1098/rsta.2011.0160
Explores how modern physics incorporates randomness into deterministic structures, complicating traditional distinctions.
Wolpert, D. H. (2008). Physical limits of inference. Physica D: Nonlinear Phenomena, 237(9), 1257–1281. https://doi.org/10.1016/j.physd.2007.12.014
Discusses theoretical bounds on inference and prediction, offering formal reasons why no system—demon or otherwise—can predict all outcomes, even in deterministic universes.
Yeah, quantum mechanics subtly puts any classical mechanistic theory to rest. Classical (macro-world) mechanics works nearly 100% of the time, with strong, calculable determinism, until it doesn't, for reasons that cannot be explained except by quantum effects becoming observable in the classical macro world.