WTF is … Artificial Intelligence?

Simon Copsey
The Curious Coffee Club
4 min read · Jul 31, 2018

Everyone seems to be talking about Artificial Intelligence (AI) these days, even my hairdresser. Some talk about AI as if it’s the fifth horseman of the apocalypse, whilst others chant its name with a twinkle in their eye.

Here’s a human-readable overview to prevent any Artificial Intelligence faux pas. See you on the other side.

Artificial Intelligence: our poor, misunderstood friend. Image credit: @ilferrets

Bear with me. This is a lot less painful if we first define Machine Learning, the most important element of Artificial Intelligence:

Machine Learning enables computers to continually find better solutions to a particular problem.

Google Translate has used Machine Learning to constantly improve its translation to near-human accuracy. NASA used a related technique to automatically design an aerial with the best radiation pattern.

Now that you’re Machine Learning experts, we can define Artificial Intelligence:

Artificial Intelligence is a set of technologies, underpinned by Machine Learning, that allow computers to learn to solve a narrow problem better through experience, rather than being explicitly programmed by a human.

Phew, that was a long sentence. Let’s talk about some of the new concepts.

Firstly, we explicitly mentioned that AI focuses on solving narrow problems. The reason for making this explicit is that Hollywood often polarises our view of AI, encouraging us to imagine a single machine that can learn to do anything and everything that a human can. This is Artificial General Intelligence (AGI) and is some way off. In contrast, however, machines that can learn to solve narrow problems are already [walking?] among us.

Artificial General Intelligence, like our friend Walter, is thankfully some way off. Image credit: meetwalter.com

Secondly, what do we mean by learning through experience? Typically, when programmers write a piece of software, it’s like writing a mathematical equation: they tell the computer exactly what operations to perform on an input (e.g. circle_radius) to create an output (e.g. circle_area).

circle_area = π * circle_radius²
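
In Python, that explicitly-programmed approach might look like this (a throwaway sketch, just to make the idea concrete):

import math

def circle_area(circle_radius):
    # The programmer spells out the exact operation: π multiplied by the radius squared.
    return math.pi * circle_radius ** 2

print(circle_area(2))  # 12.566...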

Machine Learning turns that on its head by using a method called training: we show the computer both the input and the desired output, and let the computer derive the operations that need to be performed to get from the input to the output.
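
To make that flip concrete, here’s a rough sketch (assuming Python and the NumPy library) in which we never tell the computer the circle formula. We only show it example inputs and outputs and let it fit the relationship itself:

import numpy as np

radius = np.linspace(1, 10, 50)   # example inputs we show the computer
area = np.pi * radius ** 2        # the desired output for each input

# The computer fits a degree-2 polynomial to the examples and, in doing so,
# derives the operation "multiply radius squared by roughly 3.14159" itself.
coefficients = np.polyfit(radius, area, deg=2)
print(coefficients)  # approximately [3.14159, 0, 0]

The computer has, in effect, rediscovered the formula from the examples alone.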

Do any of these photos contain cats? Probably not, but let’s ask AI. Image credit: @joannakosinska

An example would make this clearer. Let’s say we want a computer to tell us, given a set of photos, which of those photos contain a cat. We could let a programmer write some clever software that tries to identify if a photo contains cat-like shapes, but that would get very complex very quickly: the photo could contain a cat in any position, or from any angle, or could contain more than one cat, or could contain a soft toy cat but not a real cat. Trying to accommodate each of these edge cases would be very difficult.

However, there is another way. If we give a computer a huge set of photos (the input) and let it know which of those images contain cats (the output), the computer can start to derive the ‘operations’ by itself — in this case, recognise the different patterns of pixels that represent cats.

When we’ve fed the computer enough training data containing the input (photo) and output (does it contain a cat?), the AI has [machine] learnt enough to be put to real use — in this case, categorising photos it hasn’t seen before based on whether they contain a cat. This kind of training process — where the computer is provided with both the input and output data — is called supervised learning.
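
In code, that train-then-use loop might look something like the sketch below. It assumes Python and the scikit-learn library, and the “photos” are just made-up numbers standing in for real pixel data; a real Computer Vision system would use far bigger images and a far more capable model:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training data: 200 "photos" of 64 pixel values each (the input),
# plus a label for each one saying whether it contains a cat (the output).
photos = rng.random((200, 64))
contains_cat = rng.integers(0, 2, size=200)

# Training (supervised learning): the computer derives the mapping from pixels to "cat or not".
model = LogisticRegression(max_iter=1000)
model.fit(photos, contains_cat)

# Real use: ask about a photo the model has never seen before.
new_photo = rng.random((1, 64))
print(model.predict(new_photo))  # e.g. [1] for "contains a cat", [0] for "no cat"

The important part is the shape of the process: fit the model on labelled examples, then ask it about photos it has never seen.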

The feline example demonstrated the AI technology Computer Vision, but let’s list the common AI technologies:

  • Computer Vision: identifies objects, scenes, and activities in images.
  • Natural Language Processing: understands and manipulates language.
  • Speech Recognition: transcribes human speech.

By definition, each of these technologies is underpinned by Machine Learning, allowing a computer to continually improve its ability. One common Machine Learning technique you’ve probably already heard of is the Neural Network.
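
For the curious, here’s what dropping a (very small) Neural Network into the earlier cat sketch might look like, again assuming Python and scikit-learn, with made-up data rather than real photos:

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
photos = rng.random((200, 64))               # the same kind of made-up pixel data as before
contains_cat = rng.integers(0, 2, size=200)  # made-up "cat or no cat" labels

# A small Neural Network: one hidden layer of 16 units whose connection strengths
# are adjusted during training until its answers match the labels.
network = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
network.fit(photos, contains_cat)
print(network.predict(photos[:3]))  # predictions for the first three "photos"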

Though AI has been around since the 1940s, it has gained huge momentum with the recent explosion of cloud computing. Machine Learning generally requires large amounts of computing power, and cloud computing has now put this into the hands of anyone with a credit card.

So, remember: Machine Learning is like letting a computer write the mathematical equation itself. Artificial Intelligence is the collection of cool things the computer can do as a result.
