From the Antikythera Mechanism to Spacewar!: Computer Programming History for Dunces
Without programming, the software and the code, the hardware wouldn’t have meant a thing. From the ancient Greeks to the modern era, we’ve got a lot to be thankful for
When I started my short career as a ten-year-old ‘computer programmer’ back in the mid-1980s, when the vibe was certainly ‘Stranger Things’ in spirit at least, I knew nothing about the history behind the scenes: how the Amstrad CPC 464 I used actually worked, or how the geniuses who had lived before me had laid the groundwork for a mega technological revolution of bits and bytes. One that is still going on. And one that will, I hope, continue.
Since the early days, programming and computers have changed beyond their first models, mutated and morphed into so many areas of specialization that to be a modern-day programmer takes more skill and education than I could ever accomplish.
I don’t know Russian, either — yet that doesn’t mean I don’t want to know it, or that I can’t admire it for what it is and for the pleasure it has given me while reading Dostoyevsky and Solzhenitsyn.
But that’s something else, for another day, another publication.
What I do know, though — and this took root in my nerdy love of history — is how computers started, what code they used in their earliest forms and, more importantly, how they evolved through the ages.
We could start, in essence, with the toga-garbed Greeks and their ancient analogue ‘Antikythera Mechanism’, designed to predict astronomical positions. Yet that would be cheating in some way: giving them credit for computing when the real revolution was still two thousand years away.
French Steampunk Hero
So let’s fast forward to France. The year is 1804. Napoleon has been crowned Emperor of the French at Notre-Dame. Meriwether Lewis and William Clark begin their epic adventure across the American West. The Industrial Revolution is in full swing. A time and epoch of great men and greater inventions, none more so than Joseph Marie Jacquard’s programmable loom.
This little-known genius, born in Lyon in 1752, improved on a device at the centre of the age’s economy, an innovation so important that it changed the lives of countless people. For centuries before, looms had been worked entirely by hand; textile production was slow, hard, time-consuming work. Jacquard’s programmable loom changed everything.
Jacquard’s invention used a chain of punched cards mounted on top of the loom, which could ‘program’ intricate and complex patterns into the fabric while saving production time.
This was the introduction of the punchcard to the world.
And then came the vacuum. A dark space of nothingness, where humanity’s mind was blank to the future of computers.
The Georgian period passed to the Regency era, which handed the baton to the Victorians, where Babbage and Lovelace took the stage with the Analytical Engine.
Change was afoot.
In 1889 Herman Hollerith invented the ‘electric tabulating system’, which used his own version of the punched card, the Hollerith card. This ingenious device could read and count the data punched into the cards.
Wow indeed. His company went on to be one of the founding firms that merged to create IBM.
Punchcards ruled until the mid-20th century, when Alan Turing, the codebreakers of the Polish Cipher Bureau, IBM and ENIAC took the stage.
The way these machines worked was simple: a human programmer punched the data onto cards, which were then fed into the machine to carry out the required operation.
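To make that workflow concrete, here is a toy sketch in Python. The ‘card’ format is my own invention for illustration, not the real Hollerith encoding: each card is a row of punch positions that the ‘machine’ reads back as a number.

```python
# Toy punch-card reader: each card is a row of 8 positions,
# '*' = hole punched (treated as bit 1), '.' = no hole (bit 0).
# An illustrative simplification, NOT the real Hollerith code.

def read_card(card: str) -> int:
    """Interpret a punched row as a binary number."""
    bits = "".join("1" if ch == "*" else "0" for ch in card)
    return int(bits, 2)

# A small 'deck' of cards fed to the machine in order.
deck = [
    "....*..*",  # 0b00001001 = 9
    "...*....",  # 0b00010000 = 16
]

values = [read_card(card) for card in deck]
print(values)  # [9, 16]
```

The point is the workflow, not the encoding: the program lived entirely in the physical holes, and the machine did nothing but read and act on them.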
Colossus, too: the World War II machine designed to break German encrypted messages.
A Bug’s Life
Know where the word ‘bug’ comes from in computer language?
Blame the moth, I say. In 1947, Grace Hopper’s team at Harvard found an actual moth trapped in a relay of the Mark II computer and taped it into the logbook as the ‘first actual case of bug being found’.
Oh, and we can’t forget the Atanasoff-Berry Computer, or the ABC, of 1942, the first electronic digital computer. Its main function was to solve systems of linear equations. Unfortunately, the machine wasn’t programmable, but it could boast separate memory and parallel processing, among other things.
The EDSAC followed in 1949, the world’s first practical stored-program electronic computer, which announced itself by calculating a table of squares and a list of prime numbers.
A few years later, in 1957, the first widely used high-level programming language appeared. Invented by the self-confessed laziest man in computer science, John Backus, FORTRAN — short for ‘formula translation’ — was unique in its user-friendliness, far removed from what had preceded it.
‘Much of my work has come from being lazy. I didn’t like writing programs, and so, when I was working on the IBM 701 (an early computer), writing programs for computing missile trajectories, I started work on a programming system to make it easier to write programs.’
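The ‘formula translation’ idea was exactly that: letting scientists write mathematics nearly as they would on paper, instead of hand-coding machine instructions. As a rough modern illustration (in Python, using an idealized physics formula of my own choosing, not actual code from the IBM 701 era), this is the kind of trajectory calculation Backus’s colleagues were writing out by hand:

```python
import math

# The kind of formula early FORTRAN users wanted the machine to
# translate for them: idealized projectile range, no air resistance.
# Illustrative sketch only; not period code.

G = 9.81  # gravitational acceleration in m/s^2

def projectile_range(speed: float, angle_deg: float) -> float:
    """Range in metres of a projectile launched at `speed` m/s
    and `angle_deg` degrees above the horizontal."""
    angle = math.radians(angle_deg)
    return speed ** 2 * math.sin(2 * angle) / G

print(round(projectile_range(100.0, 45.0), 1))  # 1019.4
```

One line of formula instead of pages of assembly: that was the revolution Backus’s laziness bought us.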
There were others, as well — ALGOL, BASIC.
Cold War or Spacewar!
The 1950s moved into the 1960s. The Cold War was in full freeze. People were scared of the Mushroom Nightmare; humanity thought a nuclear war was imminent. Amid this tension, the first famous computer game was born: Spacewar!, in 1962.
Its programmer, Steve Russell, wrote it on a DEC PDP-1. What’s more important, however, is what followed: Russell met one Nolan Bushnell at Stanford University, showed him Spacewar!, and a revolution began. Bushnell, an entrepreneurial and technical genius, went on to co-found Atari, as well as designing the first coin-operated arcade video game.
Then came the 1970s: the decade of hippie Jobs’ Apple and the bespectacled nerd Gates’ Microsoft. Of Watergate and the end of the Vietnam War.
Like computer viruses?
But there’s one man to blame for those, computer scientist Fred Cohen. In 1983, while a student at the University of Southern California, he designed a program that could infect a host computer and copy itself before spreading to other machines via floppy disk.
I bet that hurt!
But exhale a sigh of relief, because Cohen was a caped crusader, a geek with good intentions. His virus was never meant to be harmful. He only wanted to prove he could do it.
And he did.
Brownie points, boy.
The 1980s gave us C++ and its object-oriented programming, building on the previous decade’s Pascal, which Niklaus Wirth designed to encourage good ‘programming practices using structured programming and data structuring’.
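For the uninitiated, ‘object-oriented programming’ means bundling data together with the procedures that operate on it, rather than letting the two float free. A minimal sketch (shown in Python rather than C++ purely for readability; the class below is my own toy example):

```python
# Minimal object-oriented sketch: the data (balance) and the
# behaviour that may touch it (deposit/withdraw) live together
# in one class, instead of as loose variables and functions.

class Account:
    def __init__(self, owner: str, balance: float = 0.0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount: float) -> None:
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self.balance += amount

    def withdraw(self, amount: float) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount

acct = Account("Ada")
acct.deposit(100.0)
acct.withdraw(30.0)
print(acct.balance)  # 70.0
```

The object polices its own rules: no code outside the class needs to remember that a balance can’t go negative.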
There was a cascade of new programming languages. A shift had occurred: programmers started thinking about software in a new way. Application development cycles got faster, and code reviews became the norm to reduce or eliminate errors.
We pierced the new millennium just before the dot-com bubble burst. Yet that didn’t hinder the developers, those futurists with an eye for what was not yet reality.
The internet gained more users while companies, hungry for sales, adopted the web for their commercial needs. The economics made sense, but only the ballsiest CEOs made a move.
AJAX and Web 2.0, a term some say was popularized because of the business potential it promised, made all this possible.
‘Web 2.0 is the network as platform, spanning all connected devices; Web 2.0 applications … [are] delivering software as a continually-updated service that gets better the more people use it, consuming and remixing data from multiple sources, including individual users, while providing their own data and services in a form that allows remixing by others, creating network effects through an “architecture of participation,” and … deliver[ing] rich user experiences.’
- Tim O’Reilly, who popularized the term Web 2.0
Next came the smartphone era: iOS, Android, Windows Phone.
A Thank-You Note
So, thank you, Greeks for the start.
Babbage — and your lovely Lovelace.
Jacquard with your loom, in bloom.
And all the rest of you, too — for the world owes you big time!