From: jimruttshow8596
Complexity can be measured in numerous ways, and complexity theorists often develop specific metrics suited to different applications. Among these, logical depth and thermodynamic depth offer distinct yet related perspectives on how difficult a system is to produce or characterize.

Logical Depth

Proposed by Charles Bennett, logical depth has been called a “beautiful measure of complexity”. It is a computational measure that applies to bit strings and the computers that process them.

Logical depth quantifies the number of computational steps a computer must execute to produce a given output from its shortest possible program.
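The distinction between a program’s length and its run time can be sketched in a few lines of Python (a toy illustration only; true logical depth is defined over a universal computer and is uncomputable in general):

```python
# Toy sketch: the same output produced by a short program and by a
# literal one. Logical depth tracks the run time of the SHORTEST program.

N = 1_000_000  # stand-in for "a billion"

def short_program():
    # Tiny description: "print '1' N times". Running it takes N steps,
    # but the description itself is only a few characters long.
    return "1" * N

# The "literal" program: a description as long as the output itself.
literal_digits = "1" * N  # imagine this written out digit by digit
def literal_program():
    return literal_digits

assert short_program() == literal_program()
```

Because the shortest program here is trivially short and its run time grows only linearly with the output, the string of ones counts as shallow rather than deep.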

  • Simple Outputs:
    • A sequence of a billion ones (111...): This sequence is easy to describe because a short program can simply instruct the computer to print ‘1’ a billion times. Although execution takes a billion steps, the program itself is very short, leading to low logical depth.
    • A truly random bit string (e.g., 0110010101...): Such a string has high algorithmic complexity (or Kolmogorov complexity), because its shortest description is essentially the string itself, yet it is not logically deep: the shortest program is just “print” followed by the digits, and it runs very fast.
  • Logically Deep Outputs:
    • The first billion digits of Pi (3.1415926...): Pi’s digits appear random yet are generated by compact underlying rules. A short program (such as the ancient Greek method of inscribing polygons in a circle) can produce them, but it requires a very long time to execute. The digits of Pi are therefore logically deep.
    • Patterns from certain cellular automata rules (e.g., Rule 110): These simple rules can produce patterns that are “extremely complex” and that, as far as anyone knows, can only be generated by executing every step of the automaton. Such patterns are effectively logically deep.
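A Rule 110 automaton itself takes only a few lines to write, which makes the point concrete: the rule table is tiny, yet the pattern at step t is obtained, as far as anyone knows, only by running all t steps. A minimal sketch (the width and step count here are arbitrary choices for illustration):

```python
# Minimal Rule 110 cellular automaton on a circular row of cells.
# The rule number's bits give the next state for each 3-cell neighborhood.

RULE = 110
TABLE = {(a, b, c): (RULE >> (a * 4 + b * 2 + c)) & 1
         for a in (0, 1) for b in (0, 1) for c in (0, 1)}

def step(cells):
    """Apply one synchronous update with wrap-around boundaries."""
    n = len(cells)
    return [TABLE[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

width, steps = 64, 32
row = [0] * width
row[-1] = 1  # a single live cell seeds the growing pattern
for _ in range(steps):
    row = step(row)
```

The program is a few lines long (low algorithmic complexity), but there is no known shortcut past iterating `step`, which is exactly the situation logical depth is designed to capture.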

Thermodynamic Depth

Co-defined by Seth Lloyd and Heinz Pagels for Lloyd’s PhD thesis, thermodynamic depth is a physical analog of logical depth. It measures the amount of physical resources, specifically free energy, that had to be consumed or “burned up” to construct a system in the way it was actually put together.

  • Example: Bacterial Metabolism: The metabolism of a bacterium is thermodynamically “humongous”: reaching its sophisticated, naturally selected state took billions of years of evolution, with countless bacteria sacrificing their lives along the way.

Relationship and Broader Context

Logical depth and thermodynamic depth are connected via the physics of computation; where they overlap, they are “essentially the same thing”. Thermodynamic depth is the most physical of these notions.

These measures contribute to the broader understanding of complexity:

  • They relate to measures of “how hard it is to do something,” such as computational complexity (how many elementary logical operations are needed) and spatial computational complexity (how much memory space is needed).
  • They are also related to effective complexity, which combines physical and computational notions by distinguishing between the random and non-random aspects of a system.
  • In engineered systems, such as a car, effective complexity (and, by extension, the related logical and thermodynamic depth) is pinned down by the functional requirements and the blueprints needed for manufacturing.
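The random/non-random distinction behind effective complexity can be loosely illustrated with a general-purpose compressor as a crude, computable stand-in for description length (an informal proxy, not a measure the discussion itself defines):

```python
# Sketch: compressed size as a rough proxy for description length.
import zlib
import random

# Highly regular data: its description is short ("repeat '10' 50,000
# times"), so nearly all of its bytes compress away.
regular = b"10" * 50_000

# Pseudo-random data: no regularities for the compressor to exploit,
# so the compressed size stays close to the original 100,000 bytes.
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(100_000))

print(len(zlib.compress(regular)))  # tiny: the data is almost all regularity
print(len(zlib.compress(noisy)))    # large: the data is almost all randomness
```

Effective complexity counts only the description of the regularities, which is why a random string, despite being incompressible, has low effective complexity.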

Ultimately, both logical and thermodynamic depth quantify the effort required to bring a complex system into existence, whether that effort is measured in computational steps or in physical free energy consumed.