Blogbuster HIT 1: Computer Goes Beep Boop

IEEE CS MUJ Press
4 min read · Jul 18, 2023


Author: Saumay Rustagi

There hasn’t been a change more significant to human life than the introduction of computers. We started with the abacus, a primitive tool designed to make arithmetic simpler and faster. Compared to that, the capabilities of a modern computer seem so foreign that it might as well be performing magic under the hood, even threatening the foundations of society in the process. So how did we get from “1+1=2” to “don’t automate my job away”?

Once George Boole showed that almost all information could be represented as a series of “yes” or “no” answers, computing was never the same. We went from employing human computers, people who performed calculations by hand, to the ENIAC, dubbed a “Giant Brain” by the press, which used a hybrid of decimal and binary for computation. Even so, two major leaps still separated ENIAC from where we are today. ENIAC may have been digital, but it had to be programmed by hand, by physically rewiring the circuitry underneath. It wasn’t until the EDVAC that the von Neumann architecture was used. EDVAC was arguably the first general-purpose stored-program computer: you could change the program on the fly while the underlying circuitry stayed the same! That was the first leap.

(The von Neumann architecture wasn’t actually conceived by John von Neumann, great polymath though he was. He took notes on a concept developed by Eckert and Mauchly at UPenn and got sole credit for the idea due to a clerical error.)
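
To make that first leap a little more concrete, here is a minimal Python sketch of the stored-program idea. The three-instruction machine below is entirely made up for illustration, nothing like EDVAC’s real instruction set: the point is that the “hardware” (the fixed run() loop) never changes, only the program data sitting in memory does.

```python
# A toy illustration of the stored-program idea (my own made-up machine,
# nothing like EDVAC's real instruction set). The "hardware" is the fixed
# run() loop; the program is just data in memory that can be swapped out.

def run(program: list, memory: list) -> list:
    """A fixed fetch-decode-execute loop over a tiny made-up instruction set."""
    pc = 0  # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "set":        # set: memory[addr] = value
            memory[args[0]] = args[1]
        elif op == "add":      # add: memory[addr_a] += memory[addr_b]
            memory[args[0]] += memory[args[1]]
        elif op == "print":    # print: show memory[addr]
            print(memory[args[0]])
        pc += 1
    return memory

if __name__ == "__main__":
    # "1 + 1 = 2", expressed as a program the machine fetches from memory.
    add_program = [("set", 0, 1), ("set", 1, 1), ("add", 0, 1), ("print", 0)]
    run(add_program, memory=[0, 0])  # prints 2; swap the program, not the wires
```

Hand run() a different list of instructions and the same machine does something entirely different, no rewiring required.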

The next leap, and one foundational to modern computing, was the invention of the Transistor at Bell Labs. It spawned entirely new fields like solid-state electronics and made it possible to shrink computers by orders of magnitude. Eventually, you could fit many transistors onto a single miniature circuit: an Integrated Circuit. That led to the observation now known as Moore’s Law: roughly every two years, the number of transistors we can fit on a chip doubles, and each one becomes cheaper to make (paraphrased).
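
To put some rough numbers on that paraphrase, here is a tiny Python sketch that projects transistor counts under a strict doubling every two years. The only historical figure in it is the 1971 starting point, the Intel 4004’s roughly 2,300 transistors; everything it prints is an idealised projection rather than real chip data.

```python
# A back-of-the-envelope sketch of Moore's Law as paraphrased above:
# transistor counts double roughly every two years. The 1971 starting point
# (the Intel 4004's ~2,300 transistors) is the only historical number here;
# everything printed is an idealised projection, not real chip data.

START_YEAR = 1971
START_TRANSISTORS = 2_300  # Intel 4004

def projected_transistors(year: int) -> int:
    """Project a transistor count assuming a strict doubling every two years."""
    doublings = (year - START_YEAR) / 2
    return round(START_TRANSISTORS * 2 ** doublings)

if __name__ == "__main__":
    for year in range(1971, 2022, 10):
        print(year, f"{projected_transistors(year):,}")
```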

Now computing, much like the pace of this article, could accelerate. Chemical advancements made thick but relatively portable CRT monitors possible. Storage had moved on to magnetic media like floppy disks. But the modern computer had one more step left to take, one it had already taken, waiting to be discovered by the world. Until now, computers were operated with keyboards and joysticks, and it took significant effort for a layperson to learn to use them. The biggest player in the computer market was IBM, with Apple and Microsoft looking to dethrone it, Microsoft hoping to do so from the inside. Then a group of computer science researchers at Xerox PARC made the GUI.

Computers went from DOS prompts and terminals to full-fledged graphical desktops, with the kind of Start Menus and home screens we would later see in Windows 98. The Xerox PARC team had developed the Mouse and the Graphical User Interface, which allowed easy and clear operation of a computer, but Xerox didn’t see the appeal and the technology went to Apple, from whom Microsoft in turn appropriated it. You got revolutions like the Mac II and the Personal Computer. Computers were now a lot more familiar.

Ever since then, Moore’s Law has been slowing, and in certain cases even reversing, as we near the physical limits of what is possible. CPUs on their own have basically run their course. But a CPU is built for very general computation; what if you need to render graphics or shapes on the screen? Enter GPUs.

We went from a Central Processing Unit doing all the work to delegating most of the repetitive but cumbersome work to a Graphics Processing Unit, freeing up the CPU for other tasks. Computers sped up again as a result, and parallelization pushed the eventual end of Moore’s Law further into the future. Now, in bleeding-edge areas like quantum computing, we hope to achieve even greater goals by trading the principles that limit us in classical physics for their quantum counterparts.
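
Here is a loose Python sketch of that CPU-to-GPU delegation. There is no GPU in it; a pool of worker processes simply stands in for one. The point it illustrates is that when each unit of work is small and independent, it can be handed off in bulk and done in parallel, which is the principle a GPU pushes to thousands of cores.

```python
# A loose sketch of the delegation idea. No GPU here: a pool of worker
# processes merely stands in for one. The point is that each pixel's work is
# independent, so it can be farmed out in bulk instead of done one by one,
# which is the same principle a GPU exploits with thousands of small cores.

from multiprocessing import Pool

def brighten(pixel: int) -> int:
    """One tiny, independent unit of work per pixel."""
    return min(pixel + 40, 255)

if __name__ == "__main__":
    image = list(range(256)) * 1_000          # a fake 256,000-pixel image

    # Serial: the "CPU does everything" approach.
    serial = [brighten(p) for p in image]

    # Parallel: delegate the same repetitive work to a pool of workers.
    with Pool() as pool:
        parallel = pool.map(brighten, image)

    assert serial == parallel                  # same answer, work shared out
```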

And yet, even after all this time, the most powerful computer we can conceive of is no more capable than a Turing Machine, described by Alan Turing in 1936. One could say computers have come a long way since then, and yet not long enough.
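
For the curious, here is a toy Turing Machine simulator in Python. It is the usual tape-plus-transition-table model rather than Turing’s 1936 formalism verbatim, and the example machine, which flips every bit and halts at the first blank cell, is just my own small illustration.

```python
# A toy Turing Machine simulator (the usual tape-plus-rules model, not
# Turing's 1936 paper verbatim). The example machine below is my own tiny
# illustration: it walks right along the tape, flipping 0s and 1s, and halts
# at the first blank cell.

from collections import defaultdict

def run(tape: str, rules: dict, state: str = "scan", blank: str = " ") -> str:
    """Run a Turing machine until it reaches the 'halt' state."""
    cells = defaultdict(lambda: blank, enumerate(tape))  # infinite-ish tape
    head = 0
    while state != "halt":
        symbol = cells[head]
        state, write, move = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# (current state, symbol read) -> (next state, symbol to write, head move)
FLIP = {
    ("scan", "0"): ("scan", "1", "R"),
    ("scan", "1"): ("scan", "0", "R"),
    ("scan", " "): ("halt", " ", "R"),
}

if __name__ == "__main__":
    print(run("10110", FLIP))  # -> 01001
```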

Follow IEEE CS MUJ Press for more!

Follow us on Instagram, LinkedIn and Twitter.

