From: veritasium

Entropy is described as one of the most important, yet least understood, concepts in all of physics [00:00]. It governs phenomena from molecular collisions to massive storms [00:07], and influences the universe’s origin, evolution, and eventual end [00:12]. Entropy may even determine the direction of time and be fundamental to the existence of life [00:19].

Defining Entropy

To understand entropy, it helps to consider the Earth’s interaction with the Sun. Many people assume the Earth keeps the energy it receives from the Sun, but in reality, for most of its history, the Earth has radiated essentially the same amount of energy back into space as it receives [01:37]. If this weren’t the case, the Earth would continuously heat up [01:46]. The crucial difference lies not in the amount of energy, but in its quality, or usability [02:03].

Energy cannot truly be “used up” because it never goes away [01:24]. Instead, it becomes less usable as it spreads out [09:39]. Entropy is the quantity that measures how spread out energy is [09:51]. Concentrated energy represents low entropy; as that energy spreads into the surroundings, entropy increases [10:02]. Although entropy is often described as disorder, it is better understood as the tendency of energy to spread out [11:14].

Origins in Thermodynamics

The concept of entropy emerged from the study of heat engines in the 19th century [02:09]. Sadi Carnot, a 17-year-old student in 1813, sought to improve French steam engines, which were less efficient than those in other countries like Britain [03:33]. At the time, even the best steam engines converted only about 3% of thermal energy into useful mechanical work [03:45].

Carnot’s key insight was to consider how an ideal heat engine, one without friction or losses to the environment, would work [04:03]. Such an engine, operating between a hot and a cold reservoir, is completely reversible [05:30]. Yet even this ideal engine is not 100% efficient [06:25], because some heat must always be dumped into the cold reservoir to return the piston to its original position [08:41]. The efficiency of an ideal heat engine depends only on the temperatures of the hot and cold sides [08:15]. Achieving 100% efficiency would require infinite temperature on the hot side or absolute zero on the cold side, both practically impossible [08:26].
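In modern notation (a standard textbook form, added here for reference), the efficiency of an ideal heat engine running between absolute temperatures $T_h$ and $T_c$ is

$$\eta = 1 - \frac{T_c}{T_h}$$

For example, between boiling water at $T_h = 373\,\text{K}$ and a room at $T_c = 293\,\text{K}$, even a perfect, frictionless engine converts only $1 - 293/373 \approx 21\%$ of the heat into work. Reaching $\eta = 1$ would demand $T_c = 0$ or $T_h \to \infty$, matching the limits described above.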

Decades later, German physicist Rudolf Clausius studied Carnot’s engine and developed a way to measure how spread out energy is, calling this quantity entropy [09:48].
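Clausius’s measure, in its standard form (added here for reference): when an amount of heat $Q$ is transferred reversibly at absolute temperature $T$, the entropy change is

$$\Delta S = \frac{Q}{T}$$

This makes the one-way character of heat flow quantitative: when heat $Q$ leaks from a hot body at $T_h$ to a cold one at $T_c$, the total entropy change $Q/T_c - Q/T_h$ is positive, because $T_c < T_h$.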

The Second Law of Thermodynamics

In 1865, Clausius summarized the first two laws of thermodynamics: the energy of the universe is constant, and the entropy of the universe tends toward a maximum.

The Second Law of Thermodynamics explains why hot things cool down, gas expands to fill containers, and perpetual motion machines are impossible [10:48]. The amount of usable energy in a closed system is always decreasing [10:57].

Entropy and the Arrow of Time

Most laws of physics work the same forwards or backwards in time; the clear direction of time observed in the universe comes from entropy [11:22]. For instance, heat flowing from cold to hot is not impossible, merely improbable [12:57]. Ludwig Boltzmann’s insight was that a system is far more likely to be in a configuration where energy is evenly spread out than in a concentrated, ordered one [13:16]. As the number of particles increases, the probability of energy staying concentrated becomes vanishingly small [13:52].
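Boltzmann’s entropy formula, $S = k_B \ln W$, makes this counting explicit: $W$ is the number of microscopic arrangements consistent with a macroscopic state. As a toy illustration (my own sketch, not a calculation from the video), consider the probability that all $N$ molecules of a gas happen to occupy the left half of a box, a stand-in for any concentrated, ordered configuration:

```python
# Toy illustration of Boltzmann's statistical argument: each molecule
# independently has probability 1/2 of being in the left half, so the
# chance that ALL N molecules are there at once is (1/2)^N.

for n in [10, 100, 1000]:
    p = 0.5 ** n
    print(f"N = {n:>4}: P(all in one half) = {p:.3e}")

# N = 1000 already gives ~9e-302; a macroscopic gas has ~1e23 molecules,
# so spontaneous concentration is not forbidden, just absurdly unlikely.
```

Nothing in the laws of motion prevents the reverse; it is simply never observed, because spread-out configurations overwhelmingly outnumber concentrated ones.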

The universe consistently moves from highly unlikely, ordered states to more likely, disordered ones [14:36]. This statistical irreversibility, like a Rubik’s cube that random twists scramble but essentially never solve, is why there is an “arrow of time” [24:00].

Entropy and Life on Earth

While local decreases in entropy (like a house getting colder due to air conditioning) are possible, they are only achieved by increasing entropy by a greater amount elsewhere, such as at a power plant converting chemical energy into heat [15:12].
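In symbols (standard entropy bookkeeping, added for clarity): the Second Law constrains only the total,

$$\Delta S_{\text{total}} = \Delta S_{\text{house}} + \Delta S_{\text{elsewhere}} \ge 0$$

so the house’s entropy may fall ($\Delta S_{\text{house}} < 0$) only if the power plant and its surroundings generate at least that much entropy in compensation.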

Life on Earth exists because the Earth is not a closed system [16:17]; it receives a steady stream of low-entropy, concentrated energy from the Sun [16:28]. The energy received from the Sun is more useful and concentrated than the energy radiated back into space [16:35]. Plants capture this low-entropy energy to grow; animals eat plants (and other animals), using that energy to maintain their bodies [16:44]. At each step, the energy becomes more spread out, increasing entropy [16:58].

Ultimately, all energy reaching Earth is converted into thermal energy and radiated back into space as lower-energy photons [17:08]. For each photon received from the Sun, approximately 20 photons are emitted from Earth [17:34]. All processes on Earth, including life itself, contribute to converting fewer, higher-energy photons into many more lower-energy photons [17:51]. Without a source of concentrated energy and a way to discard spread-out energy, life would not be possible [18:01].
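The factor of roughly 20 can be checked with a back-of-the-envelope estimate (my own sketch; the temperatures and the blackbody mean photon energy of about $2.7\,k_B T$ are standard values, not taken from the video). Since the energy in equals the energy out, the photon-count ratio reduces to the ratio of the two temperatures:

```python
# Estimate photons emitted by Earth per photon absorbed from the Sun,
# treating both as blackbody radiators. The mean photon energy of
# blackbody radiation is ~2.70 * k_B * T, so with energy balance
# (power in = power out) the count ratio reduces to T_sun / T_earth.

K_B = 1.380649e-23   # Boltzmann constant, J/K
T_SUN = 5772.0       # solar surface temperature, K
T_EARTH = 255.0      # Earth's effective radiating temperature, K

e_in = 2.70 * K_B * T_SUN     # mean energy of an incoming solar photon, J
e_out = 2.70 * K_B * T_EARTH  # mean energy of an outgoing infrared photon, J

ratio = e_in / e_out
print(f"~{ratio:.0f} lower-energy photons out per solar photon in")  # ~23
```

The 2.70 factor cancels, so the estimate is robust to its exact value; with these temperatures it gives roughly 20 photons out per photon in, matching the figure above.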

It has been suggested that life itself may be a consequence of the Second Law of Thermodynamics [18:10]. If the universe tends toward maximum entropy, then life accelerates this natural tendency, since life is highly effective at converting low entropy into high entropy [18:16]. Jeremy England proposed that a constant stream of concentrated energy could favor structures that dissipate that energy, driving the emergence of increasingly efficient energy dissipators and, eventually, life [18:42].

Entropy of the Universe

The Sun’s low entropy originates from the universe itself [19:15]. Since the total entropy of the universe is always increasing, it must have been lower in the past, leading back to the Big Bang as the point of lowest entropy [19:22]. This is known as the “past hypothesis” [19:41].

While the early universe was hot, dense, and nearly uniform (everything mixed), its entropy was low because of gravity [19:51]. Gravity tends to clump matter together, so matter spread out uniformly is an extremely unlikely, and therefore low-entropy, state [20:09]. As the universe expanded and cooled, matter clumped together, converting gravitational potential energy into kinetic energy, and then into heat through collisions, thereby increasing entropy [20:26]. This spending of useful energy drove the formation of stars, planets, galaxies, and life [20:58].

In 1972, Jacob Bekenstein proposed that black holes carry enormous entropy, proportional to the surface area of their event horizons [21:36]. Stephen Hawking later showed that black holes emit radiation (Hawking radiation) and have a temperature, validating Bekenstein’s proposal [22:12]. Black holes account for almost all the entropy in the universe [23:14]. The supermassive black hole at the center of the Milky Way, for example, holds 1,000 times as much entropy as the early observable universe [22:56].
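To get a feel for the scale, here is a rough calculation using the Bekenstein–Hawking formula $S = k_B c^3 A / (4 G \hbar)$, where $A$ is the horizon area (the formula is standard; the mass of Sagittarius A*, about four million suns, is a commonly quoted value, not taken from the video):

```python
import math

# Bekenstein-Hawking entropy of a Schwarzschild black hole:
#   S = k_B * c^3 * A / (4 * G * hbar), with A the event-horizon area.
# Applied to Sagittarius A*, the Milky Way's central black hole.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
HBAR = 1.055e-34   # reduced Planck constant, J*s
M_SUN = 1.989e30   # solar mass, kg

m = 4.0e6 * M_SUN                # assumed Sgr A* mass, ~4 million suns
r_s = 2 * G * m / C**2           # Schwarzschild radius, ~1.2e10 m
area = 4 * math.pi * r_s**2      # horizon area, m^2

s_over_kB = area * C**3 / (4 * G * HBAR)  # entropy in units of k_B
print(f"S/k_B ~ {s_over_kB:.1e}")         # ~1.7e90, a staggering number
```

Because the entropy grows with the horizon area, and hence with the square of the mass, a single supermassive black hole dwarfs every other entropy reservoir in the universe.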

Heat Death of the Universe

The increasing entropy will continue until, eventually, energy is spread out so completely that nothing interesting can happen again [24:16]. This predicted end state is known as the heat death of the universe [24:26]. In this distant future, after all black holes have evaporated, the universe will be in its most probable state, and the arrow of time will disappear, since it would be impossible to tell the difference between time moving forwards and backwards [24:29].

Entropy and Complexity

Maximum entropy implies low complexity, but low entropy does not imply high complexity either [25:00]. Complex structures, such as the swirling patterns formed when milk mixes with tea, appear and thrive in the middle range of entropy, between highly ordered and completely disordered states [25:07]. The universe currently sits in this complex middle stage, allowing intricate structures and phenomena, including life, to form [25:35].