Digital Revolution — The Timeline

Sterin Thalonikkara Jose
10 min read · Aug 9, 2020
Image Source: Pixabay.com

Digital computers have changed the world we live in. The tendency of humans has always been towards making life easier, towards delegating effort. While early man tamed animals to lend a hand in his work, modern man is forever trying to automate his work with machines. The Industrial Revolution (1700s to 1800s) spawned a new era of mechanization. In manufacturing, transport, and utilities, machines took over much of the work that had previously required human involvement. The textile industry, among many others, demanded cheaper labor, which accelerated the mechanization process. In the field of computing, a similar pressure, the need for mechanical calculators, built up in the early 1900s and paved the way for the Digital Revolution.

This article outlines the stages of that process, from the early calculator to the modern-day computer and the Age of Information.

The Mechanization in Calculations

‘Necessity is the mother of Invention.’

Much like the Industrial Revolution was spurred by necessities in manufacturing and commerce, the digital revolution was driven by the need to relieve the drudgery of repetitive mathematical calculations. Blaise Pascal invented his mechanical (hand-operated) calculator as early as the 1640s to aid his father's tax calculations. Charles Babbage (1791–1871) borrowed Pascal's idea of a calculating machine and refined it with a few ideas of his own. Babbage was probably the first to envision a scientific calculator that could perform complex mathematical operations: logarithms, sines and cosines, and the like. He applied the concept of 'division of labor', borrowed from manufacturing, to simplify complicated tasks. Babbage set out to break complex mathematical operations down into sequences of repeatable additions and subtractions, and to implement these with wheels, rods, gears, and cams. His idea was outright simple, and it carried over into the digital age: repeat a procedural loop, then engineer its implementation.
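As a rough illustration of that principle (a minimal Python sketch, with a polynomial chosen for this article rather than taken from Babbage's designs), the method of finite differences lets a machine tabulate a polynomial using nothing but additions once an initial row of differences is seeded:

```python
# Tabulating p(x) = 2x^2 + 3x + 5 by the method of finite differences:
# after seeding the first row, every further value needs only additions.

def difference_table(coeffs, n):
    """Return p(0), p(1), ..., p(n-1), using additions only after seeding."""
    p = lambda x: sum(c * x**k for k, c in enumerate(coeffs))
    degree = len(coeffs) - 1
    # Seed: the value and its forward differences at the starting points.
    row = [p(x) for x in range(degree + 1)]
    diffs = []
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    values = []
    for _ in range(n):
        values.append(diffs[0])
        # Propagate: each difference absorbs the one below it (pure addition).
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

print(difference_table([5, 3, 2], 6))  # [5, 10, 19, 32, 49, 70]
```

Babbage's gears and wheels carried out, mechanically, roughly what the inner loop above does in software.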

Babbage's collaborator on his project of the Analytical Engine was a visionary in her own right: Lady Augusta Ada Lovelace. She extended his idea into that of a programmable computer as we know it today. Her brilliance in envisaging a 'Combining Faculty' in the 1840s (in association with the Analytical Engine) was a century ahead of her time. This would be the general-purpose computer, something more than a mathematical calculator: a general-purpose machine would operate upon symbols, not just numbers. Though the Analytical Engine embodied her philosophy in concept, as far as the technology of the time permitted, it was never realized; machines of that kind arrived only a century later. This machine, with its cranks, rotors, and wheels, was the first to capture the concepts of re-programming, using punched cards as input, and the programming construct of looping.

The Analytical Engine — Image Source: Wikimedia

Lady Augusta Ada had posed a question, can machines think?, and had reached a conclusion of her own.

‘A machine is only as intelligent as we instruct it to be.’

The World Wars

The 1900s saw revolutionary changes in the world: political, cultural, and technological. Much of the first half of the twentieth century was marked by political disturbances and the world wars. These episodes cost humanity dearly, far beyond anything the victors gained or the defeated lost. They did, however, influence technology and its pace: war accelerated technological advancement.

Much research and development happens in pursuit of supremacy in the military arsenal. As a result, military research is heavily funded and drives the course of technology, in any year of any century. The expensive analog computers gave way to digital apparatus, which brought down costs and promised scalability into the future. Analog computers took the back seat while digital technology drove the world forward. Around 1937, on the eve of the Second World War, a few technological leaps took place.

Claude Shannon & Alan Turing

If there is one exceptional figure of the Information Age, it may be Claude Shannon; indeed, he is worthy of being called the founding father of the Age of Information. In his master's thesis at MIT in the fall of 1937, entitled 'A Symbolic Analysis of Relay and Switching Circuits', Shannon showed how two-state Boolean algebra could be implemented with switches (electromechanical relay circuits), the 'on' and 'off' states of a switch representing the two states of the binary Boolean system, 1 and 0. Shannon's seminal work turned out to be the dawn of the Information Age of digital computing, and it laid the foundations of 'switching theory' in digital circuitry.
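To make the correspondence concrete, here is a small Python sketch (an illustration of the idea, not Shannon's own notation or circuits): switches wired in series behave like AND, switches wired in parallel behave like OR, and a normally-closed contact behaves like NOT.

```python
# A toy rendering of Shannon's insight: treat a relay as a two-state switch
# (1 = closed/on, 0 = open/off). Series wiring realizes AND, parallel wiring
# realizes OR, and a normally-closed contact realizes NOT.

def series(a, b):    # current flows only if both switches are closed -> AND
    return a & b

def parallel(a, b):  # current flows if either switch is closed -> OR
    return a | b

def inverted(a):     # a normally-closed contact opens when energized -> NOT
    return 1 - a

# Any Boolean expression can now be built from wired switches, e.g. XOR:
def xor(a, b):
    return parallel(series(a, inverted(b)), series(inverted(a), b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```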

Claude Shannon — Image Source: Flickr

Alan Turing, at about the same time, had conceptualized his 'Turing Machines' (as we shall discuss in a future post). Turing Machines were concepts of operation, algorithmic processes awaiting implementation, and Turing was impressed by Shannon's switching theory as a means of implementing them. Turing now had a mechanism: write a simple input in binary code (a string of ones and zeroes), feed it into a suitably wired circuit of switches, and produce the desired output. Such circuits could also accept instructions (a program), process inputs, and produce outputs.
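For a taste of what such a 'concept of operation' looks like, here is a deliberately tiny sketch in the spirit of a Turing Machine (a simplification for illustration, not Turing's original formulation): a table of rules drives a head along a tape of ones and zeroes.

```python
# A minimal Turing-machine-style sketch: a table of rules drives a head over
# a tape of ones and zeroes. This particular machine inverts each bit and
# halts when it runs off the end of the input.

def run(tape):
    tape = list(tape)
    head, state = 0, "scan"
    # (state, symbol read) -> (symbol to write, head movement, next state)
    rules = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
    }
    while state != "halt":
        symbol = tape[head] if head < len(tape) else None
        if (state, symbol) not in rules:
            state = "halt"
            continue
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape)

print(run("10110"))  # -> 01001
```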

Alan Turing — Image Source: Wikimedia

During the war Alan Turing succeeded in building an Enigma code-breaker, dubbed 'the Bombe'. The Bombe used electromechanical relays as switches and helped decode intercepted Enigma messages. It was a special-purpose digital machine, an early member of the breed.

Vacuum Tubes for Electromechanical Relays

Later in the 1940s, a fully electronic code-breaking machine, the Colossus, used vacuum tubes (thermionic valves) as its switches, and it is regarded as the first programmable electronic digital computer.

The Colossus — Image Source: Wikipedia

Thermionic emitters, or vacuum tubes, had been invented as early as the late 19th century, but it was Lee de Forest's invention of the 'triode' tube in 1907 that established the vacuum tube as an amplifier, an electronic device that could strengthen weak signals. By the early 1940s the vacuum triode had been adopted as the fundamental switch of the digital computer, largely replacing the clanky electromechanical relays. The result was non-mechanical (electromechanical devices needed moving parts to operate), compact, high-speed circuits. The evolving digital computer would adopt vacuum tubes as its switches, adding to its speed.

Early Electromechanical and Electronic Computers

We shall mention a few early digital computers that were built around the period of transition from relays to tubes.

  • The Complex Number Calculator (1939) — George Stibitz at Bell Labs built his Complex Number Calculator out of the need to compute amplitudes and phases for telephony. Calculations involving phase are done easily and compactly using complex numbers; there is nothing complex about complex numbers, they simply provide a compact way of expressing and manipulating such quantities (see the short sketch after this list). Stibitz realized his machine with over 400 electromechanical relays, capable of open-shut operations 20 times per second. It was not re-wireable, and hence not programmable.
  • The Z3 (1941) — Konrad Zuse, a German engineer, built the Z3 with electromechanical relays for its arithmetic, memory, and control units. The Z3 was general-purpose, programmable, and digital. An electronic counterpart using tubes was never realized, for lack of funding.
  • Atanasoff-Berry Computer (1942) — John Vincent Atanasoff was probably the first to invent the concept of RAM (Random Access Memory), since the technology of the day suffered latency in reading data from the memory unit into the arithmetic and logic unit for processing. The machine employed around 300 vacuum tubes; the tube circuits made for fast calculations, while the mechanically rotated memory units slowed them down. Atanasoff built the ABC in partnership with Clifford Berry. It was left incomplete and ended up in the basement of the physics building at Iowa State University. It was a hard-wired machine designed for solving systems of simultaneous linear equations, and hence not re-programmable.
  • Mark I (1942) — Howard Aiken at Harvard built the Mark I, funded by IBM and inspired by Babbage's work. It was fully automatic in that it could accept programs and data punched on paper tape and carry out the computations by itself, which made up for 'her' sluggishness, the machine being electromechanical rather than electronic. The Mark I was referred to as 'she', as though she were a ship, since many of the hands that built and ran her, including Aiken, served in the US Navy.
The Mark I — Image Source: Twitter
  • ENIAC (1945) — The Electronic Numerical Integrator and Computer was designed by John Mauchly and engineered by Presper Eckert. It used the decimal number system instead of binary (ten states instead of two), and it could handle subroutines (small chunks of repetitive instructions that could be called from anywhere in the main instruction list, like a function). It was programmable, by plugging and unplugging the cables connecting its different units, and was a hundred times faster than any machine of its time. It qualified as a general-purpose computer and would remain in use for another ten years. It was a mammoth machine: it filled up a three-bedroom apartment and used over 17,000 vacuum tubes.
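Returning to the Complex Number Calculator entry above, here is the sketch promised there: a single complex number carries both amplitude and phase, so one multiplication does the work of two separate calculations (the values below are illustrative, not Stibitz's).

```python
import cmath

# Why complex numbers suit telephony math: one complex value carries both
# amplitude and phase, and multiplication composes them in a single step.
signal = cmath.rect(2.0, cmath.pi / 6)    # amplitude 2, phase 30 degrees
network = cmath.rect(0.5, cmath.pi / 4)   # halves the amplitude, shifts 45 degrees

out = signal * network                    # one multiplication does both jobs
print(abs(out))                           # amplitude: ~1.0
print(cmath.phase(out) * 180 / cmath.pi)  # phase: ~75 degrees
```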

The machines mentioned above could be called the forefathers of the modern-day computer. They were huge, clumsy, noisy, expensive, and slow, yet they set the course for the road ahead. They were designed and engineered with the purpose of winning a war; even so, the digital age was now a reality.

The Transistor

The end of the war also marked advances in the field of semiconductor technology. Semiconductors are materials that sit at the cusp between conductors and insulators. Substances like the abundantly available silicon (the same silicon of 'Silicon Valley') become conductive after being taken through a process of doping (with an impurity element such as phosphorus). Semiconductor technology made circuit implementations compact, reliable, less power-hungry, and lightning fast. The basic unit, the transistor, was to the digital revolution what the steam engine was to the industrial revolution.

The transistor may be seen as a solid-state replacement for the vacuum triode used in the earlier computers. The invention was made by John Bardeen and Walter Brattain at Bell Labs in 1947. The transistor in its modern-day form was an improvement on that invention, brought about by William Shockley.

Electronics Magazine, 1948 — Image Source: ComputerHistory.org

Shockley Semiconductor Laboratory, which Shockley founded in 1956, would lay the first flagstone of what became Silicon Valley. Robert Noyce and Gordon Moore carried the legacy forward, founding Fairchild Semiconductor a year later and then Intel in 1968.

The Microchip

The advent of solid-state devices not only made computers fast, it shrank the size of their circuits. What had weighed tons and occupied enormous volumes of space quietly shrank at a rate given by Moore's Law (after Gordon Moore of Intel): 'The complexity for minimum component costs has increased at a rate of roughly a factor of two per year.' Though it was a casual prediction made for the following ten years, it became the watch-phrase of the semiconductor industry.
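A quick back-of-the-envelope illustration of what such doubling means (the starting figure is invented for the example):

```python
# Doubling every period compounds quickly: starting from a hypothetical
# 1,000-transistor chip, project forward ten doubling periods.
count = 1_000
for period in range(1, 11):
    count *= 2
    print(f"after {period:2d} doublings: {count:,} transistors")
# After 10 doublings the count has grown by a factor of 1,024.
```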

Moore’s Law — Image Source: Wikimedia

Again, necessity paved the way for invention. In 1957, a Bell Labs executive with an eye for improvement identified what became known as 'the tyranny of numbers': as the number of components in a circuit increased, the number of externally wired connections increased far faster. Jack Kilby, an electrical engineer at Texas Instruments, came up with the 'monolithic idea' in 1958: circuit elements fabricated in one single piece of semiconductor material. The highly portable microchip was born. With subsequent advances in integrated-circuit technology we have reached ULSI (Ultra Large-Scale Integration), packing more than a million transistors into a single chip.
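One rough way to see the 'tyranny of numbers' (a simplification, since real circuits are never fully interconnected): if every component may need a wire to every other component, the number of hand-soldered connections grows roughly with the square of the component count.

```python
# Worst-case pairwise interconnections among n discrete components: n*(n-1)/2.
# The wiring burden grows far faster than the component count itself.
for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} components -> up to {n * (n - 1) // 2:,} connections")
```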

The Microprocessor

The invention of the microchip led further to a whole computer residing in a single unit: the microprocessor. The microchip way of designing circuitry was inelegant in that it meant a different chip for every function. Ted Hoff, an Intel engineer working on chip design, came up with a far more economical innovation: a single multipurpose unit that could be programmed to perform a wide variety of tasks. It meant an entire computer on a single chip. The processor architecture came with a documented instruction set, programmed through what is known as assembly language. The modern-day programmable, portable computer was born.
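To make the idea of one programmable unit replacing many fixed-function ones concrete, here is a toy fetch-and-execute loop in Python (the three-instruction set is invented purely for illustration and belongs to no real processor):

```python
# A toy general-purpose processor: one unit, reprogrammed by changing the
# instruction list rather than the wiring. The instruction set is invented
# purely for illustration.

def run(program):
    acc = 0  # a single accumulator register
    for op, arg in program:
        if op == "LOAD":     # put a constant into the accumulator
            acc = arg
        elif op == "ADD":    # add a constant to the accumulator
            acc += arg
        elif op == "MUL":    # multiply the accumulator by a constant
            acc *= arg
        else:
            raise ValueError(f"unknown instruction: {op}")
    return acc

# Same hardware, two different programs:
print(run([("LOAD", 2), ("ADD", 3)]))   # 5
print(run([("LOAD", 2), ("MUL", 10)]))  # 20
```

The point is that the hardware stays the same; only the instruction list changes.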

The microprocessor has been evolving ever since in conformity with Moore's Law. Today we have super-fast devices as small as a cigarette case. As technology evolves, so does computational power.

Coda

The Digital Revolution has mechanized much of human labor, physical and mental. The question we shall address next: can computational excellence promise thinking ability?

Next Week: Turing Machines.

Previous week: Digital Revolution — The Timeline.

First week: Can Machines Think?


Sterin Thalonikkara Jose

My friend Roshan Menon and I are researching the subject of “Thinking Machines” and the possibilities of building one. We would like to pen down our thoughts here.