by Stefan Morales
Last Updated: March 17, 2024
So these are some irreverent explorations around the dualism between a stochastic and deterministic view of the universe – where theories like Assemblage Theory (AT), Actor Network Theory (ANT) and many others live in that dualism – and how we might begin holding this dualism together in paradox as a complementary pair – “those things, events and processes in nature that may appear to be contraries but are mutually related and inextricably connected” (Kelso & Engstrøm).
- The dualism between a stochastic and deterministic view of the universe is a central bias in this work, and I’m trying to hold it as a wicked question along with a few other folks I’m in dialogue with. That this stretches as far back as ancient philosophy is no surprise, and we may never really easily break out of the central cultural, linguistic, etc. patterning laid down by the canon (we can only wonder: would our dominant ways of knowing be different if they had their historical and cultural roots elsewhere?). With that in mind, let’s try anyways: to loosen things up a bit, reveal some cracks in our shared models of reality, and create the conditions for the creation of new concepts.
- While AT and ANT are both fairly oriented to the social sciences there are many other theories attending to one or the other side of this dichotomy, that come from biology, chemistry, physics, etc. I’ll outline a few here, and deconstruct the dichotomy a bit as I go. I will say that while we can deconstruct the dichotomy, we are still always working within it, alongside it, or against it to some degree, so the conceptual blind spot is still glaring, or unthinkable, without the dichotomy and its elements. And I believe this sort of deconstructive, reconstructive, creative exploration matters because we are reinforcing our tacit knowledge about our understanding and experience of reality (social and otherwise) simply by holding these questions open, exploring the uncertainty of them without the need to codify or quantify, and with an eye towards developing new concepts and language. OK, let’s go!
Chaos Theory
- Chaos theory studies systems that are deterministic in nature but exhibit unpredictable behaviour due to their sensitivity to initial conditions. Chaos theory suggests that small differences in the starting point of a system can lead to vastly different outcomes, making long-term prediction difficult despite the system being governed by deterministic laws. Chaos theory bridges the gap by showing how deterministic rules can result in behaviour that appears random and unpredictable.
- Its contribution to a deconstruction of the dualism? Unpredictability (a trait associated with stochasticity) emerges not in spite of determinism but because of it. Chaos theory reveals the inherent indeterminacy within deterministic systems, challenging the supremacy of predictability in understanding complex behaviours.
- The deterministic nature of the system’s laws does not eliminate unpredictability but rather, through sensitivity to initial conditions, generates a complexity that mimics stochastic behaviour.
- This unpredictability does not arise from the lack of deterministic laws but from the practical limits of measurement and the inherent complexity of the system – we then also get to the question of how well we know and understand the instruments we use to measure (the influence and noise of complexity vs. the instrument, etc.), as well as the limits of quantification itself.
- This aspect of instrumentation also reveals the reverse: is there something in the initial conditions that is unpredictable, random, unforeseen, etc. that nonetheless ends up being a determining factor in the outcome? Is this simply because of limitations of our instrumentation, our ways of measuring? Or is simple and supposedly apparent reality fundamentally opaque?
- Conjecture Corner: As humans, going about our day-to-day lives, are we perhaps stuck in a ‘messy middle’? Almost as if our extensive scale of reality is enclosed in a package, or ‘bounded on its sides’ by thresholds… the other side of which lies a deterministic universe? Put another way: as if we are in a bubble or a band, whose inside dynamics are characterized by unpredictability, but whose outside dynamics are characterized by determinism? But what then of the emergence of determinism within the band or bubble? This is a fairly spatial representation of an idea though, and there could be other ways to characterize this idea of layered or blended duality, or non-duality.
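A tiny Python sketch of that sensitivity, using the logistic map (the parameter and starting values here are just illustrative):

```python
# Two runs of a fully deterministic rule, x' = r*x*(1 - x), started a
# billionth apart: the trajectories diverge completely within a few dozen
# steps. Determinism, yet practical unpredictability.

def logistic_trajectory(x0, r=3.9, steps=60):
    """Iterate the deterministic logistic map from x0."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)  # initial difference: 1e-9

gap = [abs(x - y) for x, y in zip(a, b)]
print(f"start gap: {gap[0]:.1e}, largest later gap: {max(gap):.2f}")
```

There is no randomness anywhere in the code, and yet knowing the starting point to nine decimal places is not enough to predict where the system ends up – the practical limits of measurement doing their work.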
Complexity Science and Complex Adaptive Systems
- Complexity science studies how relationships between parts give rise to the collective behaviors of a system and how the system interacts and evolves with its environment. You can see the obvious linkage with the concept of the assemblage. Complex adaptive systems (CAS), such as ecosystems, economies, and social systems, are characterized by self-organization, emergence, and adaptation, displaying both deterministic and stochastic features. These systems adapt and evolve through local interactions among agents following simple rules, without central control, leading to emergent behaviour that cannot be predicted solely by deterministic or stochastic models.
- Its contribution? The unpredictability and adaptation of complex systems illustrate the co-presence of chance and necessity, undermining the clear separation between deterministic and stochastic models. So, a constant interplay, or oscillation between chance and necessity.
- Conjecture Corner: Or are we bumping up against frequency and interference patterns? This may hint at Moiré patterning effects in the nature of things, harmonics, etc. And does this hint at a fully deterministic universe whose stochasticity appears only because we observe a system through the lens of point-form measurements, when we could instead observe it through the lens of wave-form? Where the interference patterns between each create observed chaos and complexity? And if so, are the wave forms all part of the same universe? Or are they creating interference patterns that criss-cross our reality, or influence one another as in the Moiré pattern effects? Hmmmm….
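A minimal sketch of “simple local rules, no central control” is an elementary cellular automaton – each cell updates from only itself and its two neighbours (Rule 110 is just one illustrative rule among the 256 possible):

```python
RULE = 110  # the update table, packed into one byte: 3-cell neighbourhood -> next state

def step(cells):
    """One synchronous update: each cell looks only at (left, self, right)."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 40
row[20] = 1  # a single 'on' cell to start
history = [row]
for _ in range(20):
    history.append(step(history[-1]))

for r in history:
    print("".join("#" if c else "." for c in r))
```

Every agent follows the same three-neighbour rule, yet the global pattern that unfolds is not obvious from the rule itself – a toy version of emergence without central control.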
Statistical Mechanics
- Statistical mechanics links the microscopic behavior of particles, which can be stochastic and unpredictable, with the macroscopic properties of systems, which follow deterministic thermodynamic laws. By using probabilistic methods to describe the collective behavior of a vast number of particles, statistical mechanics provides a framework for understanding how deterministic macroscopic laws emerge from the stochastic behavior of microscopic components.
- The emergence of thermodynamic laws from the random motions of particles illustrates how stochastic micro-level interactions give rise to predictable macro-level patterns, blurring the line between chance and determinism. A perfect example of some characteristics of intensive science: by studying a vast population of particles, we see how a pattern emerges in the form of thermodynamic laws.
- I also can’t help but share the story about Sir Francis Galton at the county fair: about 800 people at the fair guessed an ox’s weight, with a prize for the closest guess. Galton took the data and calculated the average of the 800 guesses, and the average was basically perfect. So, a bit messier than particles, but a similar pattern: stochastic micro-level guesses averaging out to the correct answer.
- Conjecture Corner: if the outcome of all the stochastic behaviour is nonetheless going to follow a law, is what happens below the threshold of the expression of the law meaningful or important? Or is it simply noise with a trajectory? How much of life itself is simply noise with an endlessly repeating trajectory – having no impact beyond its Critical Zone?
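The Galton story can be sketched in a few lines of Python – the true weight, error size, and crowd size here are illustrative stand-ins, assuming only that individual errors are roughly unbiased:

```python
import random
import statistics

random.seed(7)

TRUE_WEIGHT = 1198   # an illustrative 'ox weight' in lbs
N_GUESSERS = 800

# Each guess is the truth plus an individual, roughly unbiased error.
guesses = [TRUE_WEIGHT + random.gauss(0, 75) for _ in range(N_GUESSERS)]

crowd_estimate = statistics.mean(guesses)
print(f"crowd average: {crowd_estimate:.0f} (truth: {TRUE_WEIGHT})")
```

Individually the guesses scatter by ±75 lbs or so; averaged over 800 of them, the error shrinks by roughly a factor of √800 ≈ 28 – stochastic micro-guesses, near-deterministic macro-estimate.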
Evolutionary Theory and Evolutionary Dynamics
- Evolutionary theory, particularly in the context of evolutionary dynamics, models the change in populations over time using both deterministic and stochastic elements. Genetic drift, for example, introduces randomness into the evolution of populations, while natural selection can be modelled by deterministic equations (e.g., the breeder’s equation). This integration helps explain the diversity of life and the adaptation of species to their environments.
- This interweaving of chance and necessity in the development of life on Earth troubles our stochastic/deterministic binary by showing how randomness and determinism are co-constitutive forces in natural processes.
- Conjecture Corner: when we look at evolutionary history we see that the evolution of a biological organism has no objective. Instead we see the increasing ability of successive generations to survive and reproduce in their environment – and this increase in ability to survive and reproduce is defined in relation to that environment. If we zoom out from all of the myriad forms of species, all the dynamic variation of life’s expression over the span of millennia, all of the different ways that survival and relation to an environment play out (namely, the messy, stochastic realm of chaos and complexity), we see that the process of reproduction is what has not changed very much. Asexual reproduction was dominant up until about 2 billion years ago, when sexual reproduction evolved (and there are many theories why). Throughout life’s history, the process of reproducing generation after generation has been the consistent model of life itself – from this view, reproduction is the static, deterministic geometry within/outside/virtual to a highly variable history of biological form and function. Where non-organic ‘life’ is law- and threshold-bound, is organic life also bounded similarly, but also by the process of reproduction – copy-bound? And would AI then be anything like life? – or is its intelligence fundamentally outside of ‘copy-boundedness’?
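The chance/necessity interplay above can be sketched as a toy Wright–Fisher model (the population sizes, selection strength, and generation counts are arbitrary illustrations): selection supplies the deterministic pull, finite-population sampling supplies the drift.

```python
import random

random.seed(42)

def wright_fisher(p0, n, s, generations):
    """Allele frequency under deterministic selection (strength s)
    plus stochastic genetic drift (sampling a finite population of size n)."""
    p = p0
    for _ in range(generations):
        # necessity: selection shifts the *expected* frequency
        p_exp = p * (1 + s) / (p * (1 + s) + (1 - p))
        # chance: the next generation is a finite random sample
        p = sum(random.random() < p_exp for _ in range(n)) / n
    return p

# Small population: drift dominates, runs scatter (some alleles fix, some vanish).
small = [wright_fisher(0.5, 20, 0.05, 100) for _ in range(10)]
# Large population: selection dominates, runs cluster near fixation.
large = [wright_fisher(0.5, 2000, 0.05, 100) for _ in range(10)]
print("small pop outcomes:", [round(p, 2) for p in small])
print("large pop outcomes:", [round(p, 2) for p in large])
```

Same equation, same selective advantage – whether the outcome looks deterministic or stochastic depends on how much sampling noise the population size admits.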
Nonlinear Dynamical Systems
- Nonlinear dynamics studies systems in which the output is not directly proportional to the input, leading to phenomena such as bifurcations and phase transitions. These systems can exhibit both predictable (deterministic) and unpredictable (stochastic or chaotic) behavior under different conditions, providing insights into how complex patterns and structures emerge in nature and society.
- Deterministic equations can produce a vast array of behaviors, from predictable to chaotic, depending on initial conditions and parameters. This field exemplifies the deconstruction of the deterministic/stochastic binary by illustrating how predictability and unpredictability are not inherent qualities of systems but emerge from the interplay of system dynamics.
- Conjecture corner: In many ways, the key phrases here are ‘non-linear’ and ‘exhibit’. I’ll say more about linear time, etc. below under “conjectures and conceptures”, and for here I’ll say the word ‘exhibit’ is important because it demonstrates another fundamental bias towards what can be quantified, observed, etc. But just because we cannot compute what has led to an outcome does not mean that there is simply nothing happening below a threshold of observation other than complexity, chaos, stochasticity, etc. We have to be mindful in our use of words like “complexity” and “stochasticity” as perhaps fancier ways of saying “I don’t know, and I don’t know how I can come to know.” What if bifurcation points, phase transitions, etc. are what we observe in our dimension, scale, etc. of reality, and that they reveal the existence of a ‘geometry’ or static~dynamic plane – not transcendent to reality, but immanent yet virtual? The concept of the machinic phylum is helpful to puzzle out in relation to this, but I won’t go into it here; I suggest reading some early DeLanda on the topic.
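A short sketch of how a single deterministic equation passes through qualitatively different regimes as a parameter crosses bifurcation points (the logistic map again, with illustrative parameter values):

```python
def attractor_sample(r, x0=0.5, burn=500, keep=8):
    """Iterate r*x*(1-x) past its transient, then record where the orbit settles."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    seen = []
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

fixed = attractor_sample(2.8)   # one value: a stable fixed point
cycle = attractor_sample(3.2)   # two values: a period-2 cycle past a bifurcation
chaos = attractor_sample(3.9)   # many values: the chaotic regime
print(fixed, cycle, chaos)
```

Nothing about the equation changes between the three calls except the parameter r – predictability and unpredictability here are regimes of one system, not properties of different systems.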
Quantum Mechanics
- Quantum mechanics fundamentally incorporates indeterminacy at a microscopic level, with phenomena like superposition and entanglement challenging classical deterministic notions. However, it also employs deterministic equations (e.g., Schrödinger’s equation) to describe the evolution of quantum states over time. The probabilistic interpretation of quantum states—where probabilities are determined by deterministic wave functions—illustrates a conceptual bridge between determinism and stochasticity.
- Perhaps the best known ‘deconstruction via a merger’ model! LOL By introducing indeterminacy at the fundamental level of physical reality, where deterministic equations (e.g., Schrödinger’s equation) govern the evolution of probabilities rather than certainties, we have a deterministic model that produces a probabilistic, stochastic output (rather than a predictable one), at the instant of measurement. This theory merges determinism with indeterminacy, suggesting that the fabric of reality itself is woven from threads of both chance and necessity. But again, we are at a ‘fabric of reality’ extensive scale of reality, which may be outside of the bubble we happen to be in, or the phase transition that lies between the extremely small quantum scale and other extensive scales (from the human to the cosmological).
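That ‘instant of measurement’ structure can be sketched for a single qubit – deterministic evolution of amplitudes, stochastic collapse only when we measure (a toy stdlib simulation, not a real quantum library):

```python
import random

random.seed(1)

def hadamard(state):
    """Deterministic unitary step: rotate |0> into an equal superposition."""
    a, b = state
    s = 1 / 2 ** 0.5
    return (s * (a + b), s * (a - b))

def measure(state):
    """The stochastic step: collapse to 0 or 1 with Born-rule probability |amp|^2."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

state = hadamard((1 + 0j, 0 + 0j))   # always produces the exact same state
outcomes = [measure(state) for _ in range(10_000)]
print(f"fraction of 1s: {sum(outcomes) / len(outcomes):.3f}")
```

Everything before measure() is exactly repeatable; chance enters only at the instant of measurement, with the probabilities themselves fixed deterministically by the wave function.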
Summary:
- Summarizing some main aspects of this deconstruction:
- Purity: a central assumption of pure determinism or pure stochasticity is at work here. (pure-impure). However, when presenting a mixture of the two as I’ve done in a few places above, the very notion of a mixture leans in the direction of stochastic, dynamic, intensive, fluid, etc.
- Separateness: a central assumption of no relationship between the two, when in fact they are highly relational – the question is more one of when, how, and why. (separate-relational)
- Dichotomy: this goes without saying, and so the continuum is then a converse assumption to work within (again, however, another dualism). (dichotomy-continuum)
- Real/Ideal: again, a bit implicit here, but this assumes that one operates in the messy space of the real, whereas the other in the arid space of laws, forms, etc. The notion of the virtual is an interesting third to think with, same with Moiré Patterning. The very idea that determinism relies on stochastic elements to function and that stochastic processes can lead to deterministic patterns over time points to a virtual space that both bounds and criss-crosses both in process (real-ideal-virtual) AND to the methods of instrumentation and modes of observation themselves.
- Static: by assuming that there are two positions or ways into reality, we assume that this is a static world. What happens when we prioritize the process, as opposed to the product? Evolution then becomes a long history of reproductive process, etc. (static-process)
- Brian Massumi has a helpful caveat on deconstruction in his preface to the new 20th anniversary edition of an old favourite of mine, Parables of the Virtual, which I’ll quote in full: “You can use a distinction like abstract/concrete, or interactive/relational, provided you deconstruct it while you’re using it so that it no longer holds water as a binary opposition. If you don’t do that, you cleave the two sides of process— the conditions of emergence and the emerged— from each other. This vivisects process. But if you content yourself just with deconstructing them, you lose the ability to generate precise, operative concepts. At every turn, you find yourself falling into aporia because both terms have been disqualified. This gets you nowhere processually. Or only so far. It paralyzes the movement of thought at the same point in every line of thinking. So you construe the two seemingly opposed terms as qualitative degrees of each other. Because the world has both dimensions, the constituted and the constituent, the emerged and the emerging, the being and the becoming, the concrete and the abstract, interaction and relation. One makes no sense without the other. You have to take them both as they come (together). Now you’re on the way to constructing concepts for how they come together (“how?” being the process question par excellence). You’re on the way to making what was a paralyzing aporia a productive paradox. You juggle the terms. You go back and forth between their respective logics. The focus shifts and the exploration moves between levels. You’re always making sure to feed their rise and fall back and forward into each other— until they become one, mutually modulating co-movement. So that they relate (instead of interacting). In other words, you have not only to recognize but also to perform a primacy of relation. 
Mutual inclusion, the logic of relation, then loops around to encompass both itself and its “opposite”— which is now not its opposite but its processual correlate, riding it like a burr that has worked its way into the fabric. This wrap-around logic is the third logic: that of the primacy of relation (the ultimateness of the more-than concreteness of the abstract—in case you hadn’t noticed, this is where it gets metaphysical).” (xxiii-xxiv)
- So then, in terms of new concepts to play with that have been outlined above:
- Messy Middle
- Threshold Bounded Reality
- Oscillation
- Frequency and Interference Patterning
- Moiré Patterning
- Noise with a Trajectory
- Critical Zone
- Copy-Bound
- Virtual
- Machinic Phylum
- The Instant of Measurement
Conjectures & Conceptures
- So, although we’ve been deconstructing the dualism between stochastic and deterministic throughout the above, deconstruction itself involves an oscillation. Deconstruction is a dynamic picture of the interplay between supposedly opposed qualities: dynamic/static; chance/necessity; point-form/wave-form; etc. What happens when we posit that the concept of time itself – through which we are able to make sense of the dynamic, stochastic conception of the universe (and oscillation itself) – is a foundational bias of all living organisms? What happens if ‘time’, as we experience it, and conceptualize it, is wrong (or as Einstein strongly put it, an illusion)?
- Organic life, in its constant effort to resist entropy, builds upon itself, selects, evolves, etc. ‘over time.’ So, how could we not perceive time everywhere? And yet, it is not so important to physics. Organic life, on the other hand, has measurable memory in the form of how many component parts have been assembled, and we can measure this in a lab as the number of assembly steps contained within an object’s formation: organic objects can pass a threshold of 15 steps, while non-organic objects cannot pass a threshold of 13. But is this proof of time, or simply another singularity or threshold in yet another ‘flow’?
- To us it seems to prove a very intuitive concept of ours. But is it simply a record of an assembly threshold that distinguishes non-organic ‘life’ from organic life? This is why I’m currently curious about how we can think of memory without time: is life simply a more expansive record of different ways that heat and entropy can interplay with matter?
- And does our concept of ‘record’ or ‘memory’ even make sense without time? Or do we need different concepts? If time is simply a foundational bias of life, if time is simply how life reads itself and the universe around it, then perhaps we have to ‘back into’ a concept of reality that seems strange to us. Here’s a few thoughts on that reversal:
- Strange attractors, singularities, thresholds, etc. although seemingly virtual, perhaps only appear so because of the position through which we observe them. They seem to ‘emerge’ from dynamic, intensive processes, etc. that we are embedded within, and that we think and behave within as living beings. And we are embedded within a specific type of assembly – one that has more memory, say, than a non-organic molecule.
- But what if instead of the result of intensive processes, these virtual attractors have always, already been? What if attractors are static, eternal – like an immanent, invisible Platonic geometry? They don’t result from flow, but are immovable pillars within ‘flow’. Are we now just coming back to ‘forms’ – not transcendent, but immanent?
- What if we reverse the immanence/transcendence dualism and posit that life-itself is what truly transcends a threshold of in-built memory, transcending eternal time by expanding memory? ‘Carving out’ a cavity of possibility and potentiality within some eternal, immanent substance – one which nevertheless eventually collapses back into the eternal substance?
- [An aside: perhaps enlightenment is a reversal? Not access to an ultimate transcendent truth, but rather access to an ultimate immanence beyond the time bias that lands our perception and modes of understanding in a transcendent relationship to pure immanence? Enlightenment being where one truly perceives reality outside the useful-to-life bias of time?]
- Dynamic, intensive processes, flow (laminar or turbulent), movement, etc., although real – we can measure and monitor them, we can compute them, etc. – perhaps only appear so… again because of our position within these systems.
- What if instead of change, movement, chance, etc. we are zoomed in too close to something that is fairly static, consistent, opaque – like television static or noise? What if flow is also static?
- What if a threshold is simply a contour between states – the vast majority of which contain limited memory, and an undetermined minority containing more expansive memory capability (provided by ‘life’)?
- Reasoning via tacit knowledge becomes vital then, and it’s perhaps why well-facilitated group process can be so helpful: metaphysical conviction without guarantee, catching sight of group intuition together, uncovering presuppositions and strategically making new explicit knowledge together, oscillations between deductive and inductive knowing, etc. But I’m also limiting myself to a single event (a facilitated moment): what of the use of multiple modes of consciousness in a single individual to solve problems, or the tacit culture and team dynamism of a Menlo Park, or the emergent creativity of a scenius, or whole communities, cities and regions, like a Silicon Valley, or a Mohenjo-daro?
thx,
Stef