Artificial Intelligence: The Death of the Coder

Sam Scott
5 min read · Jun 3, 2018


I sit here on a gloriously hot summer's day in London, VR specs on, "plugged in" as we now like to say. The date is June 3rd, 2028, and I gaze back with heartfelt sorrow at the death of what I once held dear: coding.

I used to love coding. I did it for my job as a game developer and then at home on various small projects. Tinkering at the edges of my knowledge was a form of exploration and problem solving, all wrapped up in a virtual world that I felt I ruled, that I controlled.

I remember lying on my bed for hours, staring up at a blank white ceiling, trying to imagine the best implementation for some complex problem, one that would balance a set of conflicting requirements: robustness, performance, memory consumption and security, to name but a few.

It was a balancing act I enjoyed: an infinite problem domain with a near-infinite set of solutions. Admittedly I never got close to solving it in the general sense. It was a difficult task, right!?! The search space was at least four-dimensional, and at best I had limited information about the inputs, and therefore the outputs. The rough terrain of that space was like a 4D map of the Himalayas: plenty of local minima and maxima to get stuck in, or, metaphorically, to fall from to your doom.

No wonder it was all doomed, much like my earlier pursuit of collecting music down a rain-swept Berwick Street in the cold winters of the 1990s. The writing was on the wall even if I didn't know it at the time.

Software was getting ever more complex, ever larger. Teams of engineers were ballooning. The individuals were brilliant, but no one felt they had control anymore. The software was getting buggier, the code slower, guzzling gargantuan amounts of memory, only for hackers to attack the soft underbelly exposed by all this complexity.

Coders would battle in vain as they waded through layer upon layer of abstraction just so they could understand the damn thing they were writing. At times it felt much like riding a mighty beast that was rampaging out of control.

Being engineers, we won the battle, but at a terrible cost.

I'm not quite sure who first thought up the idea of training a neural network to output machine code, or assembly as you might like to call it. Maybe it just fell out of the collective consciousness of the time. Deep learning was advancing at an incredible rate back then. Arguably it still is. This was the golden age, though. There were machines beating Go masters. Machines writing music. Machines drawing pictures. Machines hearing our words, seeing our world and detecting our emotions. The now-ubiquitous Waymo cars were hinting at the start of what was to become the Great Redundancy of the mid-2020s.

Anyway, whoever it was decided that we humans were better at directing than programming. That was the moment that changed the game, the moment that would turn our world upside down.

Looking back, it was obvious for everyone to see. To be quite blunt, I'm not sure why it wasn't pursued before. For all of human time, mankind had directed. We told our children "Do this, don't do that". We never told them "If this, then do that". Now we would direct an A.I. to build our code. We would provide a hierarchical set of actions and constraints; "do this, don't do that" just came naturally.

What was difficult at first was getting an A.I. to write assembly. We started by teaching it basic math: which machine instructions to use to add various numbers together, and in what order to add, subtract, multiply and divide. Just as we did with primary school children. Slowly but surely we grew the A.I. to do more and more complex things, like loading and storing information to memory and calling functions. We did this until we had an A.I. that could replace the C compiler and linker, and then we moved on to its bigger brother, C++.
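To make the idea concrete, imagine what that "basic math" training data might have looked like. Everything below is a hypothetical sketch of my own, not anything from the time: a toy stack machine and a generator producing (expression, instruction-sequence) pairs, the kind of supervised examples an A.I. learning arithmetic-as-machine-instructions could have been fed.

```python
import random

# Hypothetical toy instruction set -- assumed for illustration only.
OPS = {"+": "ADD", "-": "SUB", "*": "MUL", "//": "DIV"}

def compile_expr(a, op, b):
    """'Compile' a binary arithmetic expression to toy stack-machine code."""
    return [f"PUSH {a}", f"PUSH {b}", OPS[op]]

def run(program):
    """Execute toy instructions and return the top of the stack."""
    stack = []
    for instr in program:
        parts = instr.split()
        if parts[0] == "PUSH":
            stack.append(int(parts[1]))
        else:
            b, a = stack.pop(), stack.pop()  # second operand popped first
            stack.append({"ADD": a + b, "SUB": a - b,
                          "MUL": a * b, "DIV": a // b}[parts[0]])
    return stack.pop()

def make_training_pair():
    """Generate one (expression, instruction-sequence) supervision example."""
    a, b = random.randint(0, 99), random.randint(1, 99)  # b >= 1 avoids div by zero
    op = random.choice(list(OPS))
    expr = f"{a} {op} {b}"
    program = compile_expr(a, op, b)
    # The supervision signal: the emitted code must compute the right answer.
    assert run(program) == eval(expr)
    return expr, program

if __name__ == "__main__":
    for _ in range(3):
        print(make_training_pair())
```

A real system would learn these mappings from data rather than have them hand-written, of course; the point of the sketch is only the shape of the supervision, checking the generated instructions against the arithmetic they are meant to compute.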

This wasn’t the big breakthrough though.

This just laid the foundations for what was to come. For what was truly revolutionary.

Next we started stripping away needless language features. Who needed types when the A.I. could decide them for you? Who needed memory management when the neural net chose the best allocation? Who needed abstraction layers when we no longer needed to know the hardware? Who needed security layers when the only coder programming the OS was a single trusted A.I.? Sure, we still needed secured access to the blockchain to verify the hashes, but this was unparalleled security and robustness, at speed.

It was 2024 when natural language replaced the code. I say replaced; it's still there, it's just that the A.I. interprets what we ask it to do and converts it into the ultra-high-level programming language that is hidden away today. Of course, people still 'program' in that language, but it's mostly those who were what I would call programmers back in the day.

Now everybody codes. The revolution has come. We can all create.

My old boss now sits in an empty Warner Brothers studio (in his pants, by all accounts; well, why not?), VR specs on, a bank of whirring machines to his left, madly muttering to himself "do this, don't do that" as he builds the next exciting installment. My mum has even got in on the act and taught Alexa how to build the new house extension.

But when I say code, I don't really mean code. Not like in the old days of C++ programming: trying to squeeze cycles out of an n-dimensional matrix multiply, or battling to fit an executable into a 64KB virtual page (not that either is needed anymore).

It is sad. But I suppose time moves on and without all these advances I wouldn’t have been able to write this A.I. The A.I. that wrote this article. The A.I. I taught to continue my legacy after my untimely death.
