History of Computer Science

Christopher J. Mayfield
4 min read · Aug 2, 2017


In this post, I want to take you on a quick tour of computer science: what it is and some of its history. I hope it piques your interest.

What is CS?
Well, the definition from Wikipedia is verbose, so I am going to paraphrase: computer science is the study of automatic algorithmic processes that scale.

Okay, so what does that mean?

Simply put, it means letting computers do what we tell them to, but making sure they do it efficiently.
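
To make "efficiently" concrete, here is a small sketch of my own (not from any particular source) comparing two ways to find a number in a sorted Python list. Both get the right answer; one scales far better.

```python
# Two ways to find a target in a sorted list.

def linear_search(items, target):
    # Checks elements one by one: up to len(items) comparisons.
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(items, target):
    # Halves the search space each step: about log2(len(items)) comparisons.
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

numbers = list(range(1_000_000))
print(linear_search(numbers, 765_432))  # up to a million comparisons
print(binary_search(numbers, 765_432))  # about 20 comparisons
```

On a million items, the first approach can take up to a million steps while the second takes about twenty. That difference is what "processes that scale" is all about.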

Many fields apply insights from CS to their respective areas of study. Some examples are Linguistics, Psychology, Neuroscience, Biology, and Physics.

One of the key concepts in computer science is that computers aren’t smart. They will only do what we tell them. This is where the acronym GIGO comes from, which stands for Garbage In, Garbage Out. It means that if you put nonsensical information into a function, what you get back is exactly that: nonsensical information.
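
As a toy illustration (my own sketch, not tied to any real library), consider a function that averages numbers. It does exactly what it is told, whether or not the input makes sense:

```python
def average(values):
    # Blindly trusts its input: no sanity checks at all.
    return sum(values) / len(values)

print(average([80, 90, 100]))     # 90.0 -- sensible in, sensible out
print(average([80, 90, -99999]))  # -33276.33... -- garbage in, garbage out
```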

History of CS
The earliest computing tool was the abacus, which dates back to the period 2700–2300 BC.

Fast forwarding a few millennia, John Napier discovered the logarithm, which pushed forward the development of computational tools. In 1623, Wilhelm Schickard (yep, that’s how you spell it) developed a machine to calculate numbers, but a fire destroyed his prototype the following year. In 1642, Blaise Pascal developed a machine that could add. In 1672, Gottfried Leibniz created the Stepped Reckoner (pretty cool name), which was finally completed and released to the world in 1694.

The Stepped Reckoner

The world was quiet for a while, with no major breakthroughs on the horizon. Then, in 1837, Charles Babbage more or less laid the groundwork for what would be described as the first modern computer, the Analytical Engine (AE). The AE included much of what we see in the standard von Neumann architecture (I’ll discuss that shortly), such as an arithmetic logic unit, control flow, and integrated memory (I’ll discuss what those are in more depth in a future post).

The Analytical Engine

A few decades later, digital electronics began to emerge. The first documented use of digital electronics for computing was a 1931 paper entitled “The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena”. Here’s a link to the paper, if you are interested.

Conceived in 1937, the Atanasoff-Berry computer was the first digital electronic computer, but it was not programmable. The Z3, built in 1941 by Konrad Zuse, was the first working, fully automatic, programmable computing machine. Shortly after, in 1946, came the ENIAC (Electronic Numerical Integrator and Computer), the first electronic general-purpose computer and the first to be Turing complete. For simplicity, Turing completeness can be described as being able to compute anything a programming language can express. There are other, more precise definitions and ways a system can be Turing complete; if you want more information, I encourage you to read this.

Around the same time, in 1945, a major milestone was achieved in the field: John von Neumann introduced a uniform computer architecture, now known as the von Neumann architecture. It consists of three main parts: the arithmetic logic unit (ALU), the memory, and the instruction processing unit (IPU). In the standard von Neumann design, the IPU passes addresses to memory, and memory routes the result either back to the IPU if an instruction is being fetched, or to the ALU if data is being fetched.
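
To give a feel for how those parts interact, here is a deliberately tiny, hypothetical sketch of a von Neumann-style machine in Python. The instruction names and encoding are made up for illustration; the point is that instructions and data live in the same memory, and the program counter walks through it.

```python
# A toy von Neumann machine: program and data share one memory.
# Instruction encoding (made up for this sketch): (opcode, address)

memory = [
    ("LOAD", 5),    # 0: load memory[5] into the accumulator
    ("ADD", 6),     # 1: add memory[6] to the accumulator
    ("STORE", 7),   # 2: store the accumulator into memory[7]
    ("HALT", 0),    # 3: stop
    None,           # 4: unused
    40,             # 5: data
    2,              # 6: data
    0,              # 7: result goes here
]

accumulator = 0     # the ALU's working register
pc = 0              # program counter, tracked by the "IPU"

while True:
    opcode, addr = memory[pc]   # fetch: the IPU sends an address to memory
    pc += 1
    if opcode == "LOAD":
        accumulator = memory[addr]      # data is routed to the ALU side
    elif opcode == "ADD":
        accumulator += memory[addr]
    elif opcode == "STORE":
        memory[addr] = accumulator
    elif opcode == "HALT":
        break

print(memory[7])  # 42
```

Notice that nothing stops a program from overwriting its own instructions; that shared memory is the defining trait (and famous bottleneck) of the design.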

Basically, all the smartphones and modern laptops that we use today still follow the von Neumann architecture.

The future of computing is what is called quantum computing, but that is a topic for a future blog post.
