From: jimruttshow8596

In a discussion on the Jim Rutt Show, guests Sarah Walker and Lee Cronin provided a critical perspective on traditional complexity theory and introduced their “Assembly Theory” as an alternative approach that redefines the understanding of complexity.

Traditional Measures of Complexity

Sarah Walker, an astrobiologist and theoretical physicist, notes that traditional measures from computer science, such as Kolmogorov complexity, focus on “program size complexity” – the minimal machine required to run a program [01:28:24]. This approach is often “uncomputable” because it requires searching over all possible machines for the minimal description [01:36:37].

Critiques of Traditional Complexity Theory

Lee Cronin, a chemist, offers a blunt assessment of existing complexity theory:

[02:25:05] “Complexity theory is not… none of it is correct, and it’s one of the biggest errors that a lot of people have made.”

Cronin cites a letter from Freeman Dyson, who wrote:

[02:32:00] “I should add that the whole literature of complexity theory suffers from the same deficiency: the experts call it complexity theory, but in fact there is no theory. There are a lot of interesting examples of complex objects and complex systems, but no general understanding of their behavior, and a collection of examples is not a theory if it provides no understanding.”

Cronin suggests that complexity theory “was generated by computationalists fascinated with computation and Turing machines” [02:04:06], acting as an “excuse to not actually do your accounting properly” by collecting and binning things to produce a number [02:25:25]. Jim Rutt, the host, largely agrees, viewing the field as “exploring” phenomena without a “unified theory yet” [02:44:00].

A key limitation highlighted by Sarah Walker is that traditional measures of complexity “can’t distinguish random from complex” [01:56:58], as random things can appear very complex due to their unstructured nature [02:02:01].

Assembly Theory: A New Measure of Complexity

In contrast to these traditional views, Assembly Theory proposes a different way to measure and understand complexity, rooted in the physical properties and causal history of objects.

Core Concepts

Assembly Theory assesses an object’s complexity based on two main components [01:46:46]:

  1. Number of parts: The number of distinct components an object possesses [01:48:48].
  2. Number of identical objects: How many identical copies of that object exist [01:49:50].

The theory asks: “How unlikely is this object to form probabilistically by chance?” [01:10:08]. For example, finding ten identical iPhone 14s on Mars would strongly suggest they did not form by chance [01:36:37].
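
The two ingredients above can be folded into a single number for a whole ensemble of objects. The sketch below is an illustrative, assumed form rather than a formula quoted in the episode: each object contributes in proportion to its copy count, weighted exponentially by the number of construction steps needed to build it (its “assembly index”, discussed in the next subsection), so a handful of identical, deeply assembled objects dominates any number of simple one-offs.

```python
import math

def ensemble_assembly(objects: list[tuple[int, int]], total_count: int) -> float:
    """Illustrative sketch: combine each object's construction-step count (a_i)
    and copy number (n_i) into one number for the whole ensemble. Contributions
    grow exponentially with a_i, and one-off objects (n_i == 1) contribute
    nothing, so the value is driven by complex objects seen in many copies."""
    return sum(math.exp(a) * (n - 1) / total_count for a, n in objects)

# Ten identical, deeply assembled objects (a_i = 40) dominate a background of
# simple rocks, even though the rocks are individually easy to form.
print(ensemble_assembly([(40, 10), (3, 2), (2, 1)], total_count=13))
```

With numbers like these, the ten hypothetical identical objects swamp the contribution of everything else, matching the intuition that such an observation could not be chance.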

Assembly Steps and Causal Pathways

Assembly Theory focuses on the “minimal set of causal pathways for making the object” [01:19:13]. This involves recursively decomposing an object into its components and identifying the “shortest route”, or least number of “steps”, required to reform it, allowing for the reuse of components [01:25:27].

  • Recursivity: Only parts already built in the past can be used [01:22:15]. This implies that “every object encodes its own memory” [01:22:25].
  • Physical Attribute: The minimal path is considered a “physical attribute of the object” [01:34:36], meaning the object is “extended in time” [01:38:40].
  • Measurement: Assembly Theory began with the “profound insight” that it could be experimentally measured [02:09:11]. Techniques like spectroscopy, magnetic resonance, and mass spectrometry can reveal the number of different parts in a molecule, providing a measurable “assembly index” [03:00:27].
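
As a concrete, minimal sketch of the “shortest route” idea, the function below computes an assembly index for short character strings by brute force: joins are concatenations, single characters are the free building blocks, and any fragment built along the way can be reused. Strings stand in for molecules purely for illustration, and the exponential search is an assumption of this sketch, not the laboratory measurement method described above.

```python
def assembly_index(target: str) -> int:
    """Brute-force assembly index of a string: the minimum number of join
    operations needed to build `target` from its individual characters,
    where any fragment built along the way may be reused for free.
    Exponential-time search, so only suitable for short strings."""
    best = [max(len(target) - 1, 0)]  # upper bound: join one character at a time

    def search(pool: frozenset, steps: int) -> None:
        if target in pool:
            best[0] = min(best[0], steps)
            return
        if steps >= best[0]:
            return  # prune: cannot beat the best path already found
        # Try joining any two fragments already available (reuse is allowed).
        for a in pool:
            for b in pool:
                candidate = a + b
                # Only keep fragments that still occur inside the target.
                if candidate not in pool and candidate in target:
                    search(pool | {candidate}, steps + 1)

    search(frozenset(target), 0)
    return best[0]

print(assembly_index("banana"))  # 4: build "an" once, reuse it in "ana" and "ban"
```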

Distinguishing Random from Complex

Unlike traditional complexity measures, Assembly Theory differentiates between random and complex objects:

  • “Random objects are really hard to make” in Assembly Theory [02:08:10].
  • It focuses on objects that demonstrate “reuse of parts,” a characteristic seen in evolution [02:21:23]. High-complexity objects arise from a “memory of what existed in the past and reusing those features” [02:29:34].
  • This approach is concerned with “the causal chain and how complex is that, and can that even form based on finite resources, finite time” [02:02:01].
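
Using the string sketch from the previous subsection, the distinction is easy to see: a string assembled by reusing a motif has a much lower index than an equally long string with no repeated parts, which is exactly the difference that program-size measures blur.

```python
print(assembly_index("abcabc"))  # 3: build "abc" once, then reuse the whole fragment
print(assembly_index("qwxzvk"))  # 5: nothing repeats, so every character costs a join
```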

Memory, Selection, and the Origin of Life

Assembly Theory posits that objects encode their own memory through the recursive process of their construction [01:22:25]. This concept of memory is crucial for the creation of high-complexity structures, such as large organic molecules, which are dependent on the “deep memory in DNA and the local memory in the cytoplasm” [03:22:00]. This aligns with the idea that the universe’s evolution involves an increasing ability to take more steps as memory gets deeper and more complicated [03:52:00].

A key outcome of Assembly Theory is the idea that “selection has to predate biology as we know it” [03:52:00]. Without pre-biological selection, the emergence of complexity would require a “miracle” [03:52:00]. Assembly Theory suggests a mechanism in which selection initially builds steps at random, and the universe then builds up mechanisms for memory [03:52:00].

The theory also reveals a “sharp phase transition” between non-biotic and biotic chemistry, which appears around 13 or 14 “steps” [04:25:27]. Objects with 15 steps or above are considered to require a biotic origin [04:25:27]. This threshold represents a point where a “mechanism must be in place in order to observe those high assembly objects” [04:39:57], arising from the combinatorial explosion of possibility space [04:48:40]. Assembly Theory provides a “meaningful way of talking about when systems cross that boundary” between non-living and living, using a unified descriptive language [04:54:55].
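
As a toy illustration of that detection rule (the threshold and the need for multiple identical copies come from the discussion above; the function name and interface are invented here), the biosignature test reduces to a simple predicate:

```python
BIOTIC_THRESHOLD = 15  # steps; the episode places the transition around 13-15

def likely_biotic(steps: int, copy_number: int) -> bool:
    """Toy detection rule: an object observed in multiple identical copies,
    with an assembly index at or above the threshold, is taken to require
    selection (a biotic origin) rather than chance assembly."""
    return copy_number > 1 and steps >= BIOTIC_THRESHOLD

print(likely_biotic(steps=17, copy_number=10_000))  # True: high assembly, many copies
print(likely_biotic(steps=9, copy_number=10_000))   # False: below the threshold
```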

Assembly Theory as a Theory of Time

Sarah Walker argues that Assembly Theory is “a theory of time” [00:58:19]. Unlike abstract concepts of information, Assembly Theory aims to make information material, as it is “accumulated over time” and is a “temporally embedded structure” [00:59:53]. This means the “molecule is not the thing I hold in my hand; it’s actually this extended temporal structure that captures these evolutionary informational properties” [01:00:31].

This perspective challenges the “block universe” view of Einstein, where all times exist at once and novelty does not exist [01:03:01]. Assembly Theory suggests that “some things have to happen before other things can happen” [01:02:09], making it fundamentally incompatible with the block universe concept [01:02:14]. Cronin highlights that the universe expands in a way that “creates options,” meaning “there are more options in the future than around the past” [01:04:03]. This inherent asymmetry is a core tenet, supported by particle physics’ observation of CP violation [01:17:17]. Thus, Assembly Theory reveals time by showing that complex objects have “depth and time” – a cell, for instance, has a lineage stretching back billions of years [01:05:43].

Ultimately, Assembly Theory offers a framework that grounds complexity in measurable physical attributes and causal histories, presenting a critical alternative to established complexity theory paradigms and even proposing a new understanding of time itself.