From: jimruttshow8596

Measuring complexity is a difficult task, with many different approaches and measures available, each suited for particular applications [01:28:44]. When examining systems like networks and fractals, specific measures become particularly useful for understanding their intricate structures and behaviors.

Fractal Dimensions

Complex systems can be understood through the lens of nonlinear dynamical systems, which often include the study of fractals [00:30:22]. Fractals are patterns, like snowflakes, that are self-similar, meaning they look similar at different scales [00:30:40].
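Self-similarity can be made quantitative with a fractal dimension. As a minimal illustration (the helper names are invented for this sketch), box-counting on the classic Cantor set recovers its dimension of log 2 / log 3 ≈ 0.63:

```python
import math

def cantor_points(depth):
    """Left endpoints of the intervals left after `depth` middle-third removals."""
    intervals = [(0.0, 1.0)]
    for _ in range(depth):
        nxt = []
        for a, b in intervals:
            third = (b - a) / 3.0
            nxt.append((a, a + third))       # keep the left third
            nxt.append((b - third, b))       # keep the right third
        intervals = nxt
    return [a for a, _ in intervals]

def box_count(points, eps):
    """Number of boxes of width eps needed to cover the points.
    The tiny offset guards against floating-point boundary errors."""
    return len({int(p / eps + 1e-9) for p in points})

points = cantor_points(10)                   # 2**10 intervals of width 3**-10
# Dimension estimate: slope of log N(eps) against log(1/eps),
# using box sizes eps = 3**-k that align with the construction.
dims = [math.log(box_count(points, 3.0 ** -k)) / math.log(3.0 ** k)
        for k in range(2, 9)]
print(dims)   # each value equals log(2)/log(3) ≈ 0.6309
```

Because the box sizes align with the construction, the count at scale 3**-k is exactly 2**k, so the estimate is exact here; for empirical data one would instead fit the slope of a log-log plot.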

A classic example is the weather, as studied by Ed Lorenz at MIT [00:30:55]. Lorenz observed that weather equations were chaotic, where a tiny initial difference could lead to a large divergence later, making them intrinsically unpredictable [00:31:11]. However, this nonlinearity drives the dynamics to a “strange attractor,” which is a fractal structure [00:31:26]. The confinement of weather dynamics to this strange attractor, despite its unpredictability, reveals a lot about its behavior [00:31:41].
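Lorenz's observation is easy to reproduce: integrate his three equations from two initial conditions that differ by one part in 10^8, and the trajectories diverge to macroscopically different states while each stays bounded on the attractor. This is a rough forward-Euler sketch, not a production integrator:

```python
def lorenz_step(state, dt, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (crude but adequate here)."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

def separation(p, q):
    """Euclidean distance between two states."""
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

dt = 0.001
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)        # identical except for one part in 10^8
for step in range(1, 30001):
    a = lorenz_step(a, dt)
    b = lorenz_step(b, dt)
    if step % 10000 == 0:
        print(f"t = {step * dt:4.1f}  separation = {separation(a, b):.3e}")
```

The separation grows by many orders of magnitude, yet both trajectories remain confined to the same bounded region of state space, which is exactly the combination of unpredictability and confinement described above.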

While calculating the weather by tracking every atom is impossible, its fractal structure at various scales allows for simulations and predictions [00:35:06]. Complex systems like the weather are therefore hard to predict in detail, but not intrinsically beyond all prediction [00:35:17].

Network Complexity

Network complexity is a broad class of ideas for dealing with complex networks [00:57:21].

The structure of a network contributes to its complexity [00:58:02]. For instance, the power grid comprises diverse power plants (coal, nuclear, wind, solar) connected in varied ways, with electricity spread across long distances through different transmission lines [00:58:09]. This intricate structure leads to dynamic and often unforeseen behaviors [00:58:37].

Networks can enter chaotic regimes, which is undesirable for systems like the electrical grid [00:58:50]. Efforts are made to tune them to non-chaotic regimes [00:58:56]. However, when complex electromechanical systems are pushed to their limits, even small changes can lead to emergent, often unpleasant, behavior at the “edge of chaos” [00:59:00]. Research suggests that, given the fluctuations in demand and the grid’s topology, arbitrary levels of failure, including complete collapse, are theoretically possible [00:59:46].
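The possibility of arbitrarily large failures can be illustrated with a toy load-shedding model; this is not a real power-flow simulation, and `build_grid`, the unit loads, and the capacity rule are all invented for the sketch. Each station carries one unit of load with a safety margin of spare capacity; when a station fails, its load is shed equally onto its live neighbors, and any station pushed past capacity fails in turn:

```python
import random

def build_grid(n, extra_links, seed=0):
    """A toy transmission network: a ring of n stations plus random long-range lines."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):                       # ring backbone
        adj[i].add((i + 1) % n)
        adj[(i + 1) % n].add(i)
    for _ in range(extra_links):             # long-distance lines
        a, b = rng.sample(range(n), 2)
        adj[a].add(b)
        adj[b].add(a)
    return adj

def cascade(adj, start, margin):
    """Fail `start`; shed its load equally onto live neighbours; any station
    pushed past its capacity fails in turn. Returns the number of failures."""
    load = {i: 1.0 for i in adj}
    cap = {i: 1.0 + margin for i in adj}     # capacity = base load plus a margin
    failed, queue = set(), [start]
    while queue:
        node = queue.pop()
        if node in failed:
            continue
        failed.add(node)
        live = [nb for nb in adj[node] if nb not in failed]
        for nb in live:
            load[nb] += load[node] / len(live)
            if load[nb] > cap[nb]:
                queue.append(nb)
    return len(failed)

grid = build_grid(50, 15)
for margin in (0.05, 0.5, 2.0):
    print(f"margin {margin:.2f}: {cascade(grid, 0, margin)} of 50 stations fail")
```

With a generous margin the initial failure stays local; with a thin margin the same single failure propagates into complete collapse, the kind of emergent behavior that appears when the system is run close to its limits.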

Multiscale Entropy

Related to coarse-graining, multiscale entropy addresses how much information exists in a system at different scales [01:00:20]. Coarse-graining involves looking at a system at a particular scale, effectively discarding information below that scale (e.g., focusing on a gas's temperature and pressure rather than the movements of individual molecules) [01:00:27].
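For a time series, coarse-graining is simple to sketch: average non-overlapping windows so that all detail below the window size is discarded. The signal below is invented for illustration, a fast wiggle riding on a slower square wave:

```python
def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale`,
    discarding structure below that scale."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

signal = [0, 1, 0, 1, 4, 5, 4, 5, 0, 1, 0, 1]
print(coarse_grain(signal, 2))  # → [0.5, 0.5, 4.5, 4.5, 0.5, 0.5]
print(coarse_grain(signal, 4))  # → [0.5, 4.5, 0.5]
```

At scale 2 the fast wiggle is gone but the slow square wave survives; at scale 4 only the slow structure remains, which is the sense in which each scale carries its own information.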

Complex systems, by most definitions, typically contain significant amounts of information at each scale [01:01:10]. Living systems serve as prime examples:

  • Humans exhibit macroscopic behaviors like conversations [01:01:31].
  • At smaller scales, individual human cells are highly complicated [01:01:49].
  • Even tiny mechanisms within a cell, such as mitochondria (involved in energy production), are extremely complex [01:01:53].

Multiscale entropy quantifies this information across different scales [01:02:17]. Systems like networks or biological organisms display a large amount of multiscale information or entropy [01:02:24]. Simple fractals, however, can also possess high multiscale information without being very complex systems [01:02:43], so, like mutual information or integrated information, multiscale entropy is considered a symptom of complexity rather than its sole cause [01:02:50].
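One common way to compute multiscale entropy is to evaluate sample entropy on successively coarse-grained copies of a signal. The sketch below is a minimal pure-Python version that uses an absolute match tolerance `r` (a simplification; in practice `r` is usually set to about 0.2 times the signal's standard deviation). For plain white noise, the entropy falls off as the scale grows, showing that noise carries little fresh structure at coarse scales:

```python
import math
import random

def coarse_grain(series, scale):
    """Average non-overlapping windows of length `scale`."""
    return [sum(series[i * scale:(i + 1) * scale]) / scale
            for i in range(len(series) // scale)]

def sample_entropy(series, m=2, r=0.2):
    """SampEn: -ln(fraction of length-m template matches that still
    match at length m+1), with absolute tolerance r per component."""
    def matches(length):
        temps = [series[i:i + length] for i in range(len(series) - length + 1)]
        return sum(
            1
            for i in range(len(temps))
            for j in range(i + 1, len(temps))
            if max(abs(a - b) for a, b in zip(temps[i], temps[j])) <= r
        )
    b, a = matches(m), matches(m + 1)
    return float('inf') if a == 0 or b == 0 else -math.log(a / b)

rng = random.Random(1)
noise = [rng.gauss(0, 1) for _ in range(400)]
for scale in (1, 2, 4):
    print(f"scale {scale}: SampEn = {sample_entropy(coarse_grain(noise, scale)):.2f}")
```

A strictly periodic signal scores near zero at every scale, and white noise scores high only at fine scales; a genuinely complex system would keep producing substantial entropy as the coarse-graining scale increases.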

Ultimately, there is no single, universal measure of complexity that applies to all domains; rather, many different measures exist, and the most appropriate one depends on the specific context and purpose [01:03:14].