From: veritasium

The history of analog computers dates back to antiquity, with one of the earliest known examples being used for astronomical predictions. These machines later evolved to tackle complex engineering problems, most notably the prediction of tides.

Early Analog Computing: The Antikythera Mechanism

Recovered in 1901 from a shipwreck off the island of Antikythera, an ancient Greek artifact constructed between roughly 200 and 100 BC represents a remarkably sophisticated early computer [00:00:00]. Three-dimensional X-ray scans have revealed that this device, known as the Antikythera mechanism, contains 37 interlocking bronze gears [00:00:06]. These gears allowed it to model the motions of the sun and moon and to predict eclipses decades in advance [00:00:12].

Unlike modern digital computers, the Antikythera mechanism operated by analogy [00:00:32]. Its gears were designed so that the motions of certain dials were analogous to the celestial bodies they represented, making it an analog computer [00:00:37].

Analog vs. Digital Computers

The fundamental differences between analog and digital computers can be illustrated with simple examples:

  • An analog computer for addition might use rotating wheels where the sum of two rotations is shown on a third wheel [00:00:49]. Analog computers have a continuous range of inputs and outputs [00:01:24]. Quantities are represented by physical attributes, such as the amount a wheel has turned [00:01:32].
  • A digital mechanical computer, in contrast, would work with discrete values, like adding single-bit numbers (e.g., 0+0=0, 0+1=1, 1+1=10 in binary) [00:01:03]. Digital computers operate on symbols, typically zeros and ones [00:01:40].
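To make this distinction concrete, here is a minimal Python sketch (an illustration, not something from the video) contrasting the two styles of addition: an "analog" adder that represents quantities as continuous amounts of rotation, and a "digital" adder that operates on discrete symbols.

```python
# Illustrative contrast between analog and digital addition (hypothetical example).

def analog_add(turns_a: float, turns_b: float) -> float:
    """'Analog' addition: quantities are continuous physical amounts
    (e.g., how far a wheel has turned), so the output is a continuous value.
    A real machine would also add a little physical error on every operation."""
    return turns_a + turns_b

def digital_add(bit_a: int, bit_b: int) -> tuple[int, int]:
    """Digital addition of two single-bit numbers: the inputs are symbols (0 or 1)
    and the result is expressed exactly as discrete symbols (carry, sum)."""
    return (bit_a & bit_b, bit_a ^ bit_b)   # 1 + 1 -> (1, 0), i.e. binary 10

print(analog_add(0.73, 1.41))   # 2.14 turns: a continuous quantity
print(digital_add(1, 1))        # (1, 0): exact, discrete symbols
```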

For thousands of years, both analog devices (like the Antikythera mechanism or slide rules) and digital devices (like abacuses) were in use [00:01:53]. Up until the 1960s, the most powerful computers were actually analog [00:02:03].

The Challenge of Tide Prediction

Predicting tides has been a critical problem for millennia [00:02:48]. Historically, miscalculations could lead to disaster, as in Napoleon’s near-death experience crossing the Red Sea [00:02:54], or ships running aground [00:03:01].

Most coastal locations experience two high and two low tides daily, but their exact timing and magnitude vary due to local factors like sea bed depth and shoreline shape [00:03:07].

Laplace’s Equations and Fourier Analysis

In the late 1700s, Pierre-Simon Laplace derived complex differential equations describing the flow of tides in the oceans [00:03:24]. While these equations had no analytical solution at the time, Laplace made a key finding: tides are driven by a small number of specific astronomical frequencies, set by the motions of the moon and sun and by the eccentricity of the lunar orbit [00:03:38]. Each factor contributes a sine wave of a particular amplitude and phase to the total tide curve [00:03:53]. The challenge became how to correctly combine these frequency components to predict tides [00:04:01].
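In modern terms, Laplace's finding amounts to modeling the tide height at a given port as a sum of sinusoids, one per astronomical constituent. The short Python sketch below illustrates that model; the constituent frequencies, amplitudes, and phases used here are made-up, illustrative values, not real harmonic constants for any location.

```python
import math

# Hypothetical harmonic constituents: (frequency in cycles/hour, amplitude in m, phase in rad).
# Real values differ for every port; these numbers are purely illustrative.
CONSTITUENTS = [
    (1 / 12.42, 1.20, 0.3),   # near the principal lunar semidiurnal period (~12.42 h)
    (1 / 12.00, 0.40, 1.1),   # near the principal solar semidiurnal period (12.00 h)
    (1 / 25.82, 0.15, 2.0),   # near a diurnal lunar period (~25.82 h)
]

def tide_height(t_hours: float, mean_level: float = 0.0) -> float:
    """Tide height as a sum of sine waves: h(t) = h0 + sum_i A_i * cos(2*pi*f_i*t + phi_i)."""
    return mean_level + sum(
        amp * math.cos(2 * math.pi * freq * t_hours + phase)
        for freq, amp, phase in CONSTITUENTS
    )

print([round(tide_height(t), 2) for t in range(0, 25, 6)])   # heights over the first day
```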

It took nearly a century, but in the 1860s, William Thomson, later known as Lord Kelvin, took on the challenge [00:04:09]. Inspired by his work laying the first transatlantic telegraph cable, he dedicated himself to measuring and predicting tides [00:04:17].

Kelvin applied the work of the French mathematician Joseph Fourier, who had shown how any function can be decomposed into a sum of sine waves [00:04:45]. Although applying Fourier's analysis to tidal curves was conceptually straightforward, the computation required was enormous [00:05:04]. To characterize the tides at a single location, Kelvin needed 10 different frequency components, each requiring extensive multiplication and addition to extract [00:05:38]. This work had to be repeated for every new location [00:05:50]. Once these coefficients were known, the next step was to add up the sine functions to predict future tides [00:05:56].
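To give a rough sense of the arithmetic involved, one common way to extract a single constituent is to multiply the measured tide record by a cosine and a sine at that constituent's frequency and average over a long record; the two averages yield the amplitude and phase. The sketch below does this numerically on a synthetic record. It is a simplified stand-in for the period's hand-computation procedures, not a reconstruction of them.

```python
import math

def analyze_constituent(record, dt_hours, freq):
    """Estimate the amplitude and phase of one constituent (freq in cycles/hour)
    by correlating the record with cos and sin at that frequency.
    This is the multiply-and-sum drudgery done by hand before the harmonic analyzer."""
    n = len(record)
    c = sum(h * math.cos(2 * math.pi * freq * k * dt_hours) for k, h in enumerate(record))
    s = sum(h * math.sin(2 * math.pi * freq * k * dt_hours) for k, h in enumerate(record))
    c, s = 2 * c / n, 2 * s / n
    return math.hypot(c, s), math.atan2(-s, c)   # (amplitude, phase)

# Synthetic hourly record containing one constituent: amplitude 1.2 m, phase 0.3 rad.
dt, freq = 1.0, 1 / 12.42
record = [1.2 * math.cos(2 * math.pi * freq * k * dt + 0.3) for k in range(24 * 365)]
print(analyze_constituent(record, dt, freq))   # approximately (1.2, 0.3)
```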

Lord Kelvin’s Tide Predicting Machines

Lord Kelvin spent years manually analyzing and predicting tides before conceiving the idea of designing a machine to automate these calculations [00:06:05]. His goal was to “substitute brass for brains” [00:06:16]. The resulting analog computers were used for nearly a century [00:06:20].

The Tide Predictor (Addition Machine)

Kelvin first addressed the prediction problem: adding sine waves together given their amplitudes and phases [00:06:29]. He knew a Scotch yoke could convert uniform circular motion into sinusoidal motion [00:06:37]. The means of mechanically adding 10 sine waves came from a chance encounter on a train with the inventor Beauchamp Tower in 1872 [00:06:50]. Tower suggested using Wheatstone's plan of a chain passing around multiple pulleys [00:07:09].

By attaching a pulley to each Scotch yoke and running a weighted cord around them, Kelvin could mechanically add all of their contributions simultaneously [00:07:19]. He sketched the entire plan for the predictor machine on the train and secured funding to build it [00:07:28]. This machine automated the tedious task of predicting future tides: four hours of cranking yielded a full year of predictions [00:07:43].
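As a rough software analogue of cranking the predictor, the self-contained sketch below sums a few sinusoidal constituents and reads off the turning points of the curve, i.e. the predicted times of high and low water. The constituent values are again hypothetical.

```python
import math

# Hypothetical constituents: (cycles/hour, amplitude in m, phase in rad); illustrative only.
CONSTITUENTS = [(1 / 12.42, 1.20, 0.3), (1 / 12.00, 0.40, 1.1)]

def tide_height(t_hours):
    return sum(a * math.cos(2 * math.pi * f * t_hours + p) for f, a, p in CONSTITUENTS)

def predicted_extremes(hours, step=0.1):
    """Sweep the summed curve and report its turning points: the software equivalent
    of reading high and low water times off the predictor machine's plotted output."""
    times = [k * step for k in range(int(hours / step) + 1)]
    heights = [tide_height(t) for t in times]
    out = []
    for i in range(1, len(heights) - 1):
        if heights[i - 1] < heights[i] > heights[i + 1]:
            out.append(("high", round(times[i], 1), round(heights[i], 2)))
        elif heights[i - 1] > heights[i] < heights[i + 1]:
            out.append(("low", round(times[i], 1), round(heights[i], 2)))
    return out

print(predicted_extremes(48))   # predicted highs and lows over the first two days
```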

The Harmonic Analyzer (Integration Machine)

The harder part of the problem – decomposing an existing tide curve into its component frequencies – remained manual for many years [00:07:58]. To automate this, Kelvin needed a machine capable of multiplying the tide curve by a sine wave and then integrating the result [00:08:06].

With his older brother, James Thomson, Kelvin developed a mechanical integrator based on a ball resting on a rotating disk [00:08:18]. The ball's rotation speed depends on its distance from the disk's center [00:08:27]. A stylus traces the function to be integrated, setting the ball's position and therefore its speed [00:08:52]. The ball's motion is transferred to an output roller, which plots the integral [00:09:01].

To decompose a tide curve, the disk was made to oscillate back and forth at a specific frequency [00:09:10]. As the stylus traced the tide curve, the roller accumulated the integral of the tide curve multiplied by a sine wave of that frequency [00:09:32]. Multiple ball-and-disk integrators could be connected in parallel, each oscillating at a different frequency, to calculate coefficients for many frequency components simultaneously [00:09:46].
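The principle can be captured in a few lines: at each step, the roller advances by the ball's distance from the disk's centre multiplied by the increment of disk rotation, so an oscillating disk accumulates the integral of the traced curve times a sinusoid. The sketch below is a crude numerical illustration of that principle, not a model of the actual hardware.

```python
import math

def ball_and_disk_integral(traced_values, disk_angles):
    """Simulate a ball-and-disk integrator: each roller increment equals
    (ball's distance from the disk centre) x (increment of disk rotation)."""
    output = 0.0
    for i in range(1, len(traced_values)):
        output += traced_values[i - 1] * (disk_angles[i] - disk_angles[i - 1])
    return output

# Synthetic tide record: one constituent of amplitude 1.2 m at an M2-like frequency.
dt, freq = 0.1, 1 / 12.42
t = [k * dt for k in range(int(24 * 30 / dt))]                  # a month of readings
tide = [1.2 * math.cos(2 * math.pi * freq * tk) for tk in t]    # what the stylus traces
disk = [math.sin(2 * math.pi * freq * tk) for tk in t]          # disk oscillating at that frequency

# The accumulated output is proportional to that constituent's coefficient;
# dividing by pi * freq * total_time recovers the amplitude (~1.2 here).
total = ball_and_disk_integral(tide, disk)
print(round(total / (math.pi * freq * t[-1]), 2))
```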

Impact and Legacy

Kelvin’s analog computers revolutionized tide prediction [00:10:00]. Tidal curves from anywhere could be analyzed into sinusoidal coefficients using the harmonic analyzer, and these could then be added by the predictor machine to forecast future tides [00:10:05].

Kelvin’s tide predicting machines were used well into the 1960s [00:10:27]. They were even overhauled to include 26 frequency components and played a critical role in planning the Allied invasion on D-Day [00:10:32]. The invasion times were staggered across the five landing beaches according to precise tide predictions, allowing demolition teams to clear obstacles at low tide before the main forces landed as the water rose [00:11:15].

Other Military Applications During World War II

Beyond tide prediction, analog computers were crucial in World War II for applications such as aiming anti-aircraft guns [00:11:26]. The M9 Gun Director, developed by Bell Labs using operational amplifiers, could rapidly calculate the correct trajectory and fuse settings for anti-aircraft guns based on radar and optical data [00:13:52]. This dramatically improved accuracy, reducing the average rounds needed to take down an airplane from 17,000 in World War I to just 90 in 1943 [00:14:21].

However, not all analog computers were successful. The Norden bombsight, an incredibly complex mechanical analog computer and the third-largest single expense in the U.S. military budget during the war, failed to deliver its promised high precision [00:14:33]. Its 2,000 fine parts meant that any manufacturing inaccuracy translated directly into computational inaccuracy, and repeat calculations would not yield the exact same answer [00:15:29]. This highlights one of the inherent challenges of analog computing: physical imperfections affect results [00:15:35].

The Rise of Digital Computing

As World War II progressed, digital computers began to gain traction [00:16:13]. The electronic Colossus machines at Bletchley Park were crucial for breaking German codes [00:16:17]. In the U.S., the ENIAC, designed to speed up artillery firing table calculations previously done by analog differential analyzers, demonstrated the power of digital computing and is considered by many to be the first modern computer [00:16:24].

Claude Shannon’s 1936 master’s thesis, which showed that any numerical operation could be performed using Boolean algebra (true/false or one/zero, and/or/not operations), truly opened the door to the digital revolution [00:16:55]. This made digital computers ideal versatile machines, resilient to noise since it takes a large error to mistake a one for a zero [00:17:16]. In contrast, each analog computer is typically designed for only one type of problem, and small errors can accumulate [00:17:21].

Today, nearly everything is digital [00:17:44]. Digital devices provide exact, repeatable answers and are robust to noise [00:17:55]. Their components have been miniaturized and optimized, making them universal computing machines [00:18:12]. Despite this dominance, analog computing may now be making a comeback, with startups actively working on analog devices, hinting at potential benefits in areas like artificial intelligence [00:18:26].