AI means a lot of things to a lot of people. Usually what it means is not very well thought out. It is felt, it is intuited. It is either adored, worshipped or deemed blasphemous, profane, to be feared.
In this article, I explore what society at large really means by artificial intelligence as opposed to what researchers or computer scientists mean. I want to clarify for the non-technical audience what can realistically be expected from AI, and more importantly, what is just unrealistic pie-in-the-sky speculation.
I am worried that blind fear — or in some cases worship — of AI is being used to manipulate society. Politicians, business people, and media personalities craft narratives around AI that stir up deep emotions that they use to their advantage. Meanwhile truth is only to be found in dense technical literature that is out of reach for the ordinary person. …
Current common wisdom has it that everyone will be losing their job to automation real soon now. AI is going to eat the world, and the US needs universal basic income because all the truck drivers are going to lose their jobs this week, or next week at the latest.
I have my doubts.
I suspect that many readers will be difficult to convince, so before I get to my argument I’ll just put forward a few facts.
In its 2017 executive summary, the International Federation of Robotics informs us that 35% of the world’s industrial robots are purchased by the automotive industry, putting it ahead of even the semiconductor and electronics industries. Auto manufacturers’ use of robots has been steadily increasing since 2010. …
Programmers tend to make a big deal over the supposed difference between compiled languages and interpreted ones. Or dynamic languages vs. statically typed languages.
The conventional wisdom goes like this: a compiled language is stored as machine code and executed directly by the CPU with no translation overhead; an interpreted language is translated to machine operations one instruction at a time, which makes it run slowly; and dynamic languages are slower still because of the overhead of figuring out types at runtime.
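As a quick reality check on that wisdom, consider that even CPython, the reference implementation of an archetypally “interpreted” language, compiles a whole function to bytecode before running anything. This is easy to see with the standard `dis` module (a minimal illustration, not a benchmark):

```python
import dis

def add(a, b):
    # CPython compiled this function to bytecode when it was defined;
    # nothing is translated "one instruction at a time" from the source text.
    return a + b

dis.dis(add)  # prints the compiled bytecode for add
```

The exact opcodes printed vary by Python version, but the point stands: the line between “compiled” and “interpreted” is blurrier than the conventional wisdom suggests.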
A long long time ago, when programmers wrote to the bare metal and compilers were unsophisticated, this simplistic view may have been somewhat true. …
But first a word on art: Programming is like writing music. There is no one true way to write a melody, but there are plenty of wrong ones. There are many combinations of notes that will be absolutely displeasing to all who listen to them, but amongst the many combinations are a few that will please almost anyone.
Similarly, there is not one true way to write a program.
A programmer’s coding style is unique and identifiable. Programmers can often recognize the author of a piece of code once they are familiar with that author’s style.
Code can be funny. I have spotted more than one joke in reading other people’s code, and I have left a few of my own when writing my code. But I will never know if anyone saw them and laughed. …
I have been working with an Agile development shop of late, and there is something about it that I definitely do not like. I do not enjoy being “educated” by well-meaning but nonetheless smug Scrum Masters and Product Owners who assume that anyone in corporate IT has not the slightest clue. These are not ad hoc lectures, they are formal presentations: PowerPoint decks approved by the development shop’s executives. The implication is that the company assumes we have never even heard of Agile.
The reality is that many people in corporate IT are as familiar with Scrum (because that is the only kind of Agility anyone seems aware of these days) as they are. And some, like myself, have significantly more experience, having worked with other Agile practices such as XP, CRC cards, Crystal, et al. I started Agile practices (XP in 1996) before they were officially “Agile”, and for that matter before some of these Scrum Masters were born. …
The man in the back without a jacket is Mel Kaye, about whom Ed Nather wrote his timeless classic of programmer lore, The Story of Mel.
The year is 1960 and although a wealth of information exists about the hardware he programmed on, nothing remains of his programming other than Ed Nather’s delightful tale.
For the first twenty years of programming history, most programmers in the field, the ones writing the programs that got used in real life, were self-taught. There were no schools for programmers, no formal practices, no body of knowledge, no discipline to speak of. …
If you want 3 or 4 very different opinions on this, just ask 1 or 2 programmers. On this particular subject, some of my colleagues feel so strongly that they might even be able to flame themselves.
And the meanings of words…
If we were to stick to dictionary definitions there is no question. Art is defined by the OED as “The expression or application of human creative skill and imagination…”, and Webster’s has this to say: “skill acquired by experience, study, or observation”.
Programming is clearly a human creative skill acquired by experience, so that is that. Bam!
But let’s take a more nuanced approach. …
I am an over-eager programmer. I always have been.
When a friend or colleague comes to me in distress and asks me if I can help by writing a “quick” program, in spite of myself I always say yes.
What happens next follows three very predictable stages:
At first I am a hero. So unlike all those other stick-in-the-mud programmers who hem and haw, who hedge and dodge and evade. So what if there are rough edges? I warned them there would be, and speed trumps perfection.
After the initial delight has worn off, we quickly move to a phase of rapid-fire change requests. Can you do this? Can you do that? Can you make this a little different? …
“Most software today is very much like an Egyptian pyramid with millions of bricks piled on top of each other, with no structural integrity, but just done by brute force and thousands of slaves.”
Alan Kay spoke these words in an interview that he gave to the ACM’s Queue magazine in 2004. Things have only become more so since then.
For this article, let’s agree that when Kay says “most software today” he is not talking about Google, Facebook, or Netflix (especially not in 2004); I am excluding them as well. As huge as Google’s codebase is, it is a drop in the bucket. …
Let me describe to you a recent programming job I did for a friend.
He asked me if I could create a web application that took a list of names, sent each name to a third-party web service, and collected the results into a list that could be imported into a spreadsheet. It bears mentioning that he already had an application that did this, but it was so difficult to use (another failed enterprise IT project) that he asked for my help in creating a replacement. This is a classic example of shadow IT at its best.
Requirements (from his IT department) were that it be written in the Python programming language, and that any libraries or code snippets that I did not write myself carry the MIT license (a type of open source license). …
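The core of such an application can be sketched in a few lines. To be clear, this is my own illustrative sketch, not the actual code I delivered: the function name and the injected `lookup` callable are inventions for this example, and the real third-party service’s URL and response format are not shown.

```python
import csv
import io


def names_to_csv(names, lookup):
    """Run each name through a lookup service; return spreadsheet-ready CSV.

    `lookup` is any callable that takes a name and returns a list of result
    strings. In the real application it would wrap the third-party web
    service (via an HTTP client); injecting it here keeps the sketch
    self-contained and testable without network access.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["name", "result"])  # header row for the spreadsheet
    for name in names:
        for result in lookup(name):
            writer.writerow([name, result])
    return buf.getvalue()
```

Keeping the web-service dependency at the edge like this is what makes a “quick” program of this kind maintainable: the CSV-building core never needs to know which service is on the other end.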