Analog Computers

Analog computers are computing devices that use the continuous variation of physical phenomena to model the problem being solved. In contrast, digital computers represent varying quantities with values that are discrete in both time and amplitude.

The complexity of analog computers ranges from simple slide rules and nomograms to intricate naval gunfire control computers and large hybrid digital/analog machines. Analog computation has also been built into process-control equipment and protective relays to carry out control and protective functions.

As digital computers matured in the 1950s and 1960s, analog computers became increasingly obsolete.

However, they remained in use for specific applications such as aircraft flight simulators, flight computers, and the teaching of control systems in universities.

Mechanical watches provide a relatable example of analog computing: the periodic rotation of interlinked gears drives the second, minute, and hour hands of the watch.
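
The relationship between the hands is fixed entirely by gear ratios: the minute hand turns at 1/60 the rate of the second hand, and the hour hand at 1/12 the rate of the minute hand. A small illustrative sketch of that mapping (the function name and values are invented for the example):

```python
def hand_angles(elapsed_seconds):
    """Hand positions in degrees, fixed by the gear ratios:
    the minute hand turns at 1/60 the rate of the second hand,
    and the hour hand at 1/12 the rate of the minute hand."""
    second = (elapsed_seconds % 60) / 60 * 360
    minute = (elapsed_seconds % 3600) / 3600 * 360
    hour = (elapsed_seconds % 43200) / 43200 * 360
    return second, minute, hour

print(hand_angles(3661))  # 1 h 1 min 1 s -> roughly (6.0, 6.1, 30.5) degrees
```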

More intricate applications, such as aircraft flight simulators and synthetic-aperture radar, continued to be powered by analog computing (and hybrid computing) well into the 1980s, because digital computers were not yet powerful enough for these tasks.

Timeline of Analog Computers

Precursor

Analog computers have a long history dating back to ancient times when mechanical aids were constructed for astronomical and navigation purposes. The planisphere and astrolabe were early examples of analog computing that could solve problems in spherical astronomy.

The sector, a calculating instrument used for solving problems in proportion, trigonometry, and multiplication and division, was developed in the late 16th century and found application in gunnery, surveying, and navigation.

The slide rule, a hand-operated analog computer for doing multiplication and division, was invented around 1620-1630.
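
The slide rule exploits the identity log(ab) = log(a) + log(b): sliding two logarithmic scales against each other adds lengths, which multiplies the numbers they represent. A minimal sketch of that idea (the function below is an illustrative stand-in, not a description of any particular rule):

```python
import math

def slide_rule_multiply(a, b):
    """Multiply the way a slide rule does: add lengths proportional
    to the logarithms of the factors, then read back the antilog."""
    pos_a = math.log10(a)          # position of a on one logarithmic scale
    pos_b = math.log10(b)          # position of b on the sliding scale
    combined = pos_a + pos_b       # sliding the scales adds the two lengths
    return 10 ** combined          # reading the result is the inverse (antilog)

print(slide_rule_multiply(2.5, 3.2))  # ~8.0; a real rule is limited by reading precision
```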

Aviation is one of the few fields where slide rules are still used, particularly for solving time-distance problems in light aircraft. In the 19th century, mathematician and engineer Giovanni Plana devised a perpetual-calendar machine that could produce the calendar for every year from AD 0 to AD 4000, keeping track of leap years and varying day length.

Sir William Thomson’s tide-predicting machine, invented in 1872, used a system of pulleys and wires to automatically calculate predicted tide levels for a set period at a particular location.
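
Kelvin's machine mechanically summed a set of sinusoidal tidal constituents, each pulley contributing one amplitude, speed, and phase. Roughly the same summation can be sketched numerically; the constituent values below are invented placeholders, not real harmonic constants for any port:

```python
import math

# Hypothetical constituents: (amplitude in metres, speed in degrees/hour, phase in degrees)
constituents = [
    (1.20, 28.984, 40.0),   # placeholder semidiurnal term
    (0.35, 30.000, 60.0),   # placeholder solar semidiurnal term
    (0.15, 15.041, 100.0),  # placeholder diurnal term
]

def tide_height(t_hours, mean_level=2.0):
    """Predicted tide = mean level + sum of cosine constituents,
    the quantity the tide-predicting machine traced out on its drum."""
    total = mean_level
    for amplitude, speed_deg_per_hour, phase_deg in constituents:
        angle = math.radians(speed_deg_per_hour * t_hours - phase_deg)
        total += amplitude * math.cos(angle)
    return total

for hour in range(0, 25, 6):
    print(f"t = {hour:2d} h, predicted level = {tide_height(hour):.2f} m")
```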

The differential analyzer, designed to solve differential equations by integration, used wheel-and-disc mechanisms to perform the integration. Mechanical differential analyzers were developed in the 1920s by Vannevar Bush and others. Related analog machines were built by Spanish engineer Leonardo Torres y Quevedo, who constructed several devices for finding the real and complex roots of polynomials.

The harmonic analyzer, developed by Michelson and Stratton, performed Fourier analysis using an array of 80 springs rather than Kelvin integrators. This work led to the mathematical understanding of the Gibbs phenomenon of overshoot in Fourier representation near discontinuities.
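
The overshoot is easy to reproduce numerically: a truncated Fourier series for a square wave exceeds the value of the wave near the discontinuity by roughly 9% of the jump height, no matter how many terms are kept. A short sketch (the term count and sampling grid are arbitrary choices):

```python
import math

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of a unit square wave:
    (4/pi) * sum of sin((2k-1)x)/(2k-1) for k = 1..n_terms."""
    return (4 / math.pi) * sum(
        math.sin((2 * k - 1) * x) / (2 * k - 1) for k in range(1, n_terms + 1)
    )

# Sample just to the right of the jump at x = 0 and look for the overshoot.
n_terms = 80  # comparable to the 80 elements of the Michelson-Stratton analyzer
peak = max(square_wave_partial_sum(i * 1e-4, n_terms) for i in range(1, 2000))
print(f"peak of partial sum: {peak:.3f}")  # about 1.179: ~9% of the jump (size 2) above 1
```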

In a differential analyzer, the output of one integrator drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that allowed these machines to work.
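
The same chaining can be sketched in software: two integrators wired in a loop solve y'' = -y, much as a differential analyzer would be set up for simple harmonic motion (the step size and initial conditions here are illustrative choices):

```python
# Two chained "integrators" solving y'' = -y, as a differential analyzer
# would be wired: the output of each integrator feeds the next.
dt = 0.001          # integration step (illustrative)
y, dy = 1.0, 0.0    # initial displacement and velocity

t = 0.0
while t < 6.28:     # roughly one period of the oscillation
    d2y = -y        # the machine's interconnection: acceleration = -displacement
    dy += d2y * dt  # first integrator: acceleration -> velocity
    y += dy * dt    # second integrator: velocity -> displacement
    t += dt

print(round(y, 3))  # close to 1.0 after one full period, as expected for cos(t)
```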

In summary, analog computers have a rich history that spans many centuries, with numerous inventors and mathematicians contributing to its development over time. From the planisphere and astrolabe to the slide rule and differential analyzer, these early forms of computing paved the way for modern-day analog and digital computers.

Modern Day

During the modern era, many different kinds of analog computers were used. The Dumaresq, invented by Lieutenant John Dumaresq of the Royal Navy in 1902, was an analog computer that related vital variables of the fire control problem to the movement of one’s own ship and that of a target ship.
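
In essence, the Dumaresq resolved the relative motion of the two ships along and across the line of bearing, giving the rate of change of range and the speed across. A simplified flat-plane sketch of that calculation (the function name, units, and conventions are assumptions made for the example):

```python
import math

def dumaresq_rates(own_speed, own_course_deg, tgt_speed, tgt_course_deg, bearing_deg):
    """Resolve own-ship and target motion along and across the line of bearing.
    Speeds in knots, courses and bearing in degrees (compass convention)."""
    bearing = math.radians(bearing_deg)
    # Relative velocity of the target with respect to own ship (north/east components).
    rel_n = tgt_speed * math.cos(math.radians(tgt_course_deg)) - own_speed * math.cos(math.radians(own_course_deg))
    rel_e = tgt_speed * math.sin(math.radians(tgt_course_deg)) - own_speed * math.sin(math.radians(own_course_deg))
    # Component along the bearing (range opening/closing) and across it (deflection).
    range_rate = rel_n * math.cos(bearing) + rel_e * math.sin(bearing)
    speed_across = -rel_n * math.sin(bearing) + rel_e * math.cos(bearing)
    return range_rate, speed_across

# Own ship steaming north at 20 kn, target heading east at 15 kn, bearing 045.
print(dumaresq_rates(20, 0, 15, 90, 45))  # range closing at ~3.5 kn, ~24.7 kn across
```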

In 1912, Arthur Pollen developed an electrically driven mechanical analog computer for fire-control systems, based on the differential analyzer. AC network analyzers were constructed starting in 1929 to solve calculation problems related to electrical power systems that were too large to solve with numerical methods at the time.

During World War II, mechanical analog computers were used in gun directors, gun data computers, and bomb sights. Helmut Hölzer built a fully electronic analog computer in 1942 as an embedded control system to calculate V-2 rocket trajectories and to stabilize and guide the missile.

In the Netherlands, Johan van Veen developed an analog computer to calculate and predict tidal currents when the geometry of the channels is changed. In 1947, physicist Enrico Fermi invented the FERMIAC, an analog computer that aided his studies of neutron transport. In 1952, RCA developed Project Typhoon, an analog computer consisting of over 4,000 electron tubes and programmed using 100 dials and 6,000 plug-in connectors.

Analog computers also played a significant role in educational settings, with devices like the Heathkit EC-1 and General Electric’s “educational” analog computer kit helping to illustrate the principles of analog calculation. In industrial process control, analog loop controllers were used to regulate temperature, flow, pressure, or other process conditions.
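
As a rough idea of what such a loop controller computes, here is a minimal discrete sketch of a proportional-integral (PI) temperature loop; the plant model, gains, and setpoint are invented for illustration and are not taken from any real controller:

```python
# Minimal PI temperature loop, mimicking what an analog loop controller does
# continuously with op-amp circuits. All values below are illustrative.
setpoint = 80.0        # desired temperature (degrees C)
temperature = 20.0     # current process temperature
kp, ki = 2.0, 0.1      # proportional and integral gains (assumed)
integral = 0.0
dt = 1.0               # control interval in seconds

for step in range(200):
    error = setpoint - temperature
    integral += error * dt
    heater_power = kp * error + ki * integral            # controller output
    heater_power = max(0.0, min(100.0, heater_power))    # actuator limits
    # Crude first-order plant: heating proportional to power, loss to ambient.
    temperature += (0.05 * heater_power - 0.02 * (temperature - 20.0)) * dt

print(f"temperature after {200 * dt:.0f} s: {temperature:.1f} C")  # settles near the setpoint
```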

Analog computers continued to be used in scientific and industrial applications even after the advent of digital computers. In industrial settings they controlled and regulated manufacturing processes, while in scientific research they were used to model complex physical systems. However, digital computers gradually replaced them in most applications as digital machines became more powerful and affordable.

Analog computing has experienced renewed interest in recent times, with a particular focus on neuromorphic computing, and it has found new applications in the field of machine learning.

Analog hardware can model complex systems with a high degree of parallelism and non-linearity. Given these developments, analog computing remains a significant area of research.

Conclusion

In conclusion, analog computing uses the continuous variation of physical phenomena to model the problem being solved, whereas digital computers represent varying quantities with discrete values.

Analog computing has had a rich history, dating back to ancient times, with many inventors and mathematicians contributing to its development over time.

While digital computers have replaced analog computers in most applications, analog computing has seen a resurgence of interest in recent times, particularly in the field of neuromorphic computing and machine learning, where it has found new applications.
