History of Computing, Part I

Cara · Published in The Startup · Aug 17, 2020 · 10 min read

We learn and know (hopefully) a basic history of the world, particularly major events like the French Revolution, the American Civil War, World War I, World War II (wow, lots of wars), the Space Age and so on. It is important to understand the concepts behind these and many other historical events. Being able to recall the start year or the exact details of how such events unfolded is one thing, but on a human level it is more important to understand the rationale, lessons and philosophy of major events. Ultimately, history teaches us what makes us innately human. Furthermore, understanding history helps us realise how and why we operate today. History provides the context for today. It makes today seem ‘obvious,’ ‘justifiable’ and ‘logical’ given the events that came before.

So, following this thread of logic, understanding the history of computers should help us understand how we have got to today. A today when computers mediate much of our communication with one another. A today where computers and screens are stared at for many (and often a majority) of our waking hours (especially during Covid). A today where the thought of working, socialising or learning without a computer would be an affront and a disadvantage. Just as major events like World War II and the Cold War have greatly contributed to today’s political and social climate, I would argue computers influence just as much (if not more) of our daily lives.

Therefore, it is important to understand the evolution of computers so that we can see where we may be heading in our relationship with them.

I would like to preface the following articles outlining the history of computers by saying that this is in no way an exhaustive history of the origin of computers. Some major events have been glossed over, while other meaningful contributions have been omitted entirely.

Whilst the thought of history may make some eyes automatically glaze over, I will try to make the following series as painless and exciting as possible. While I paint a story of linear progress in computation, this is hindsight bias in action. We like to create a story of history, attributing importance to some events and not others, when in reality, as these events were unfolding (and continue to unfold), it was not always obvious what was a gigantic discovery. It is only now, with some distance, that we can appreciate past events. This means that perhaps in ten years this recounting will emphasise other features and neglect some of the stories we today find so foundational to the computer’s creation.

With all this in mind, let’s begin!

The first computers

Since their inception, computers have taken over human work by performing tedious, complex and repetitive tasks. Interestingly, the word computer initially described humans! Computers were originally humans (often women) who were able to perform complex mathematical computations, usually with pen and paper. Often, teams would work on the same calculation independently to confirm the end results. It is interesting to note that when electronic computers were first developed, they were referred to as exactly that: electronic computers. With time, as electronic computers became more and more pervasive and powerful, it was the human computer that was deemed obsolete and inefficient. The ‘electronic’ was dropped, and now when we discuss computers we think of nothing besides our graceful and versatile electronic tools. It is important to keep the computer’s mathematical origins in mind; as we will see, they only emphasise how the pervasiveness and uses of computers today could never have been imagined.

Our story begins with the humble abacus, generally considered the first computer. When researching, I was puzzled as to how an abacus could be considered a computer. Luckily, my curiosity was settled by a quick Google search (thank you, Google). Google was even able to suggest my search before I finished typing ‘Why is the abacus considered the first computer’! I ended up on trusty Quora, where one user, Chrissie Nysen, put things simply: “Because it is used to compute things.” Though estimates vary, the abacus is thought to have originated in Babylon approximately 5,000 years ago. The role of the abacus was to ease simple mathematical calculations: addition, subtraction, division and multiplication. In this sense, we can consider the abacus a simple calculator. As farming, produce and populations increased in size, the abacus allowed the educated to more easily manage logistics. After the abacus, the computer’s evolution remained dormant for some time…

The first computer program: Charles Babbage and Ada Lovelace

Approximately 4,800 years later, the English mathematician Charles Babbage enters the scene. Today, Babbage is referred to as the ‘father of computing.’ Babbage’s computer, the Difference Engine (1822), was composed entirely of mechanical components and was designed to automate the calculation of mathematical tables for fields such as logarithms, tides and astronomy. While a small prototype was built, the machine was never constructed at full scale.

Later, and more ambitiously, Babbage envisioned the Analytical Engine, which was to have a memory store, a central processing unit and the ability to select different actions and computations for the engine to perform. This machine was to be programmable by punch cards; it worked similarly to the weaving looms that were Babbage’s source of inspiration. Aspects such as punch cards and the term ‘store’ were taken directly from the textile industry! Babbage’s Analytical Engine mirrors our understanding of computers today (minus the fact that this computer was meant to be entirely mechanical). Today, our computers do possess memory and a processor and are reprogrammable. Babbage was indeed a man ahead of his time, as it took over a century for the computer of his vision to come to fruition.

Among Babbage’s close colleagues was Ada Lovelace, a skilled mathematician. Lovelace is considered by many to be the first computer programmer, which is quite striking considering the stereotypes surrounding women that existed at the time (and, some argue, continue today). With an uncommon upbringing for a young aristocratic girl, Ada was taught science and maths, and she excelled. Her talent led to her being introduced to Babbage, who became a mentor. When Babbage became disenchanted by the lack of support from the British scientific community, he went abroad to Italy, where he presented a lecture on his Analytical Engine. It was Lovelace who translated the first published account of the engine (by Luigi Federico Menabrea, who later became Prime Minister of Italy, but that is another story) from French to English in 1842. However, Lovelace not only translated Menabrea’s manuscript but also added her own detailed notes. In fact, the final paper, published in 1843 in Taylor’s Scientific Memoirs, contained sixty-six pages, forty-one of which (just under two thirds) were Lovelace’s own appendices and notes.

The most famous and groundbreaking part of this publication was the final appendix, Note G, which demonstrated the machine’s operation by giving an example of how it would calculate Bernoulli numbers. Bernoulli numbers are considered important in the expansion of trigonometric functions and in number theory. Their most interesting feature in the context of computation is that they are recursive: each term is determined by the terms that come before it. This final appendix is considered by many to be the first program (more accurately, the punch cards that would have performed the steps Lovelace described would have been the program). Nevertheless, the title of writer of the first program is bestowed upon Lovelace. Lovelace’s fascination and obsession with Babbage’s machine led her to recognise its potential to expand past numbers and mathematics. Today, nearly two centuries after Lovelace’s paper, many of her ideas remain influential within computer science.
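To give a feel for what Note G computed, here is a minimal modern sketch of the Bernoulli recurrence in Python. To be clear, this is my own hedged illustration, not Lovelace’s algorithm: she worked out the steps for Babbage’s punch-card machine, but the recursive structure (each new number built from all the earlier ones) is the same idea.

```python
from fractions import Fraction
from math import comb  # Python 3.8+

def bernoulli_numbers(n):
    """Return [B_0, ..., B_n] using the classic recurrence
    B_m = -1/(m+1) * sum_{j=0}^{m-1} C(m+1, j) * B_j, with B_0 = 1.
    Each term depends on all the terms before it, which is the
    recursive structure Lovelace highlighted in Note G."""
    B = [Fraction(1)] + [Fraction(0)] * n
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B[m] = -acc / (m + 1)
    return B

print([str(b) for b in bernoulli_numbers(6)])
# -> ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```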

The forgotten computers: analog computers

Interestingly, the next computers to be developed were the now-forgotten analog computers. Analog computers used changeable physical properties of electricity, hydraulics and mechanics to represent an output. The best way to think about analog computers is by considering an analog clock: analog clocks indicate the time by the position of their hands, so an analog computer could represent a number by, say, the angle of rotation of a part.

The first large-scale automatic analog machine was the Differential Analyser, created by Vannevar Bush at MIT in 1931. Bush was an extremely impressive guy: he was not only the dean of the MIT School of Engineering but later headed the Office of Scientific Research and Development (OSRD) during World War II. The OSRD oversaw the vast majority of US R&D during the war, so Bush was involved in the initiation of projects such as radar and the Manhattan Project. His mechanical analog computer could solve differential equations and inspired similar versions worldwide. However, it needed to be laboriously set up for each equation by a skilled mechanic, which led Bush and his colleagues to continually replace the mechanical components with electrical devices. The high cost associated with analog computers remained a major drawback, and despite the creation of hybrid analog/digital computers, today analog computers are a rarity.
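To make ‘solving differential equations’ a little more concrete, here is a rough modern sketch in Python. This is only an illustration by analogy: the Differential Analyser integrated continuously with rotating wheel-and-disc integrators rather than in discrete software steps, but the job, accumulating a solution to dy/dt = f(t, y) bit by bit, is the same.

```python
def euler_solve(f, y0, t0, t1, steps=1000):
    """Numerically integrate dy/dt = f(t, y) from t0 to t1 with Euler's
    method: repeatedly nudge y along its current slope. The Differential
    Analyser performed the equivalent accumulation mechanically."""
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y += h * f(t, y)
        t += h
    return y

# Example: dy/dt = -y with y(0) = 1 has the exact solution y(t) = e^(-t).
print(euler_solve(lambda t, y: -y, 1.0, 0.0, 1.0))  # ~0.368, i.e. e^-1
```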

A multipurpose computer: Alan Turing

A few years after Bush’s Differential Analyser, Alan Turing, of Cambridge University, entered the scene. Today, Turing is widely considered the father of computer science and artificial intelligence, having invented many of the governing principles of modern computing. As shown throughout history, necessity is the mother of invention: with the advent of World War II, armies invested in computers as a means of code cracking. If the name rings a bell, Turing began to permeate popular culture through the Oscar-winning movie The Imitation Game (2014), which depicts Turing’s role in helping the Allies crack the German Enigma code during World War II and, with it, win the war.

Turing was the first to propose the idea of a multi-purpose computer, one which had memory and worked on instructions dictated by a program: a Turing machine. This simple concept was a computer that read a tape containing code and translated that code into instructions and actions. This is essentially how our computers operate today. A (successful) Turing machine could essentially perform any algorithm, unlike Babbage’s Analytical Engine or Bush’s Differential Analyser (both envisaged to deal with maths only).
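As a rough illustration of the idea (a hedged sketch of my own, not Turing’s 1936 formalism), here is a tiny Turing machine simulator in Python: a table of rules tells the machine, for each state and tape symbol, what to write, which way to move, and which state to enter next.

```python
def run_turing_machine(tape, rules, state="start", halt="halt"):
    """Simulate a one-tape Turing machine. `rules` maps (state, symbol)
    to (symbol_to_write, move, next_state), where move is "L" or "R".
    The machine reads one cell at a time and blindly follows the table,
    which is all the 'program' a Turing machine has."""
    cells = dict(enumerate(tape))  # sparse tape; missing cells are blank
    pos = 0
    while state != halt:
        symbol = cells.get(pos, "_")  # "_" stands for a blank cell
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# A toy rule table that flips every bit, then halts at the first blank.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_turing_machine("10110", flip_bits))  # -> 01001_
```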

In this sense, Turing was the first to begin expanding computers past performing solely mathematical tasks. Recall that Babbage’s computer, and even further back the abacus, were invented with the sole purpose of arithmetical manipulation. It is here that the computer, with its initial purpose of maths, begins to transform into Turing’s vision of a multi-purpose computer with potentially limitless abilities. However, it still took some time for this vision to become a reality, as our story is still only in 1936, and after all, today computers’ powers are still (thankfully) not limitless…

Ahead of his time, Turing (like Babbage and many major players within computing) also discussed the possibility of computers eventually programming themselves, what we now understand as artificial intelligence and machine learning, huge areas of growth and hype within computation today. Speaking in London in 1947, Turing explained his concept of:

“a machine that can learn from experience”, one with the “possibility of letting the machine alter its own instructions…”

After all, the Turing Test was invented by, and named after, Turing himself. In this test, a human interacts with a computer and assesses the computer’s intelligence and behaviour. If the computer is deemed to act and respond in a manner indistinguishable from a human, it passes the test.

Relays, switches and electronic computers

At this point in our story, the majority of computers were electromechanical: controlled by a mixture of electronics and mechanical switches known as relays. The combination of mechanical and electronic parts meant that they operated quite slowly, were tedious to set up and consumed lots of energy. During WWII, countries battled to create the first operational multi-purpose computer. The victor was Germany, with Konrad Zuse’s Z3 computer functioning by 1941. Reprogramming these early machines to perform new tasks was a tedious process, often involving the physical re-routing of wires and plugs. Naturally, we can see the problem with electromechanical computers: they do not scale very well. Sure, with some time and effort they may be able to perform simple tasks; however, complex tasks require huge memory storage and many steps, and that would require lots of manual re-routing of wires…

This poor functionality led computer scientists to consider moving to a solely electronic computer. Soon the wires and plugs were replaced by vacuum tubes, which had been developed by John Fleming in the early 20th century but were only much later added to computers. With the integration of vacuum tubes, the second generation of computers was created. Even with vacuum tubes, computation was unreliable and energy-consuming, and the computers were gigantic, weighing tonnes and taking up entire rooms; these computers would never be commercial and compact while they relied on vacuum tubes.

Nevertheless, vacuum tubes powered some important early computers, including the Atanasoff-Berry Computer (1939) and Thomas Flowers’ Colossus I (1943), which assisted in British code breaking during WWII; Flowers had earlier used vacuum tubes to replace telephone relays in Britain’s telephone exchanges.

It is interesting to note that although computation played a major role in the Allies’ victory over the Axis, the work of both Turing and Flowers at Bletchley Park was classified under the Official Secrets Act. In fact, most of the code-cracking computers were destroyed at the war’s completion (quite upsetting now, considering not only their aid during WWII but also their place centre stage in the story of computer history). Only relatively recently, in 2000, was Colossus’ role in the war fully declassified. These actions are a shame, as not only did they mean that Flowers and his colleagues did not receive the recognition they very rightly deserved, but perhaps if these secrets had been shared, the evolution of computers could have been fast-tracked.

The great minds behind the code breaking at Bletchley Park continued to make waves in computer science. Turing’s Automatic Computing Engine (ACE), designed in 1945 and finally executing its first code in 1950, became the basis of the first personal computer. Meanwhile, the earliest computer in the sense we can comprehend today (stored-program and general purpose, though still of course quite primitive by today’s standards) was built at Manchester University, commenced operations in 1948 and became known as the Manchester ‘Baby.’ Within the United States, the first fully functioning electronic digital computer was the ENIAC, completed in 1945. Constructed by the Moore School of Electrical Engineering, the ENIAC was for many years believed (incorrectly, recall the secret Colossus) to be the first functioning electronic digital computer.

It was only in 1951 that the UNIVAC, the first ‘commercially’ available computer, was created (when I say ‘commercially available,’ I mean available only to the biggest and wealthiest firms, which deemed this strange new invention to hold some value). A total of 46 UNIVACs were delivered. Among the first recipients were the U.S. Government Census Bureau, the Prudential Insurance Company, General Electric’s Appliance Division and the Westinghouse Electric Company. The UNIVAC was the first computer to perform commercially helpful tasks; for example, both electric companies used the UNIVAC to calculate payrolls. The UNIVAC also filed sales records and could analyse competitors’ performance. In fact, the UNIVAC was famously used to correctly predict the outcome of the 1952 presidential election for Eisenhower, an outcome deemed so unlikely that it was initially dismissed as a computer error.

Finally, society was starting to see and recognise the immense power of computers… we will return soon to continue the story of progress!
