Enter the Mutant: Genetic algorithms, part III

Andy Elmsley
Published in The Sound of AI
4 min read · Jun 19, 2019

Welcome back, chromosomes! Over the last two posts, we’ve been exploring Genetic Algorithms (GAs). You might want to review those pieces before reading this one. We’ll start off with some problems you may have noticed in last week’s code, and address how to solve them by adding the final aspect of GAs: mutation.

The X-Men get some pretty awesome mutations. Image: Nerdist

What’s in a search?

Last week I asked you to investigate the effect of the number of generations on the discovered solution. Setting the number of generations to 20 might give an output something like this:

Generation 0 best fitness: 272
Generation 1 best fitness: 278
Generation 2 best fitness: 278
Generation 3 best fitness: 294
Generation 4 best fitness: 297
Generation 5 best fitness: 317
Generation 6 best fitness: 317
Generation 7 best fitness: 334
Generation 8 best fitness: 334
Generation 9 best fitness: 337
Generation 10 best fitness: 337
Generation 11 best fitness: 337
Generation 12 best fitness: 350
Generation 13 best fitness: 350
Generation 14 best fitness: 350
Generation 15 best fitness: 350
Generation 16 best fitness: 350
Generation 17 best fitness: 350
Generation 18 best fitness: 350
Generation 19 best fitness: 350
Generation 20 best fitness: 350

As you can see, the fitness stops increasing (‘converges’) after 12 generations — but why does this happen?

Well, with selection and crossover as the only search operations in our GA, we have a very limited gene pool to search. Worse, as the GA iterates from one generation to the next, the gene pool shrinks further. After a while, the entire population becomes exact clones of one another, at which point the selection and crossover operations don't actually do anything!

You may have also noticed that the GA search isn't exhaustive: it won't always discover the optimal solution to the problem the way a brute-force method will. This is also partly down to the limited gene pool. The optimal solution will only be discovered if two criteria apply: the initial random population contains that magic set of genes, and they're not lost in the crossover process. Quite a small chance of success, then.

In order to fix these two issues, we need to introduce the final weapon in the GA’s arsenal: mutation.

Mutating the search

In the theory of natural selection, forward leaps in fitness can only be achieved by a chance change in an organism’s genes. These genetic mutations may have allowed the first fish to breathe air, or the first mammals to walk upright — sometimes even spawning entirely new species.

Genetic mutation: nature’s answer to novelty in search. Image: CC0 Public Domain

In GAs, mutation is implemented by altering a few genes in a chromosome, based on some random chance. So far, we’ve looked at binary chromosomes — chromosomes containing zeros and ones — and for these a simple mutation function can be implemented like so:
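The original embedded snippet isn't included here, so this is a minimal sketch; the function name `mutate` and the `mutation_chance` parameter are illustrative assumptions rather than the series' exact code.

```python
import random

def mutate(chromosome, mutation_chance=0.01):
    """Flip each bit of a binary chromosome independently with
    probability mutation_chance, returning a new chromosome."""
    return [1 - gene if random.random() < mutation_chance else gene
            for gene in chromosome]
```

With `mutation_chance=0` the chromosome comes back unchanged; with `mutation_chance=1` every bit is flipped. Typical values sit somewhere in between, as we'll see below.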

Since mutation has the chance to produce a worse solution, it is usually paired with elitism, where the best-performing chromosome is passed on to the next generation unmutated:
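Again, the original gist isn't reproduced here, so the following is a sketch under the same assumptions (binary chromosomes, a `mutate` helper that flips bits at random); `mutate_population` and its parameters are hypothetical names.

```python
import random

def mutate(chromosome, mutation_chance=0.01):
    # Flip each bit independently with probability mutation_chance.
    return [1 - gene if random.random() < mutation_chance else gene
            for gene in chromosome]

def mutate_population(population, fitnesses, mutation_chance=0.01):
    """Mutate every chromosome except the fittest one (elitism)."""
    elite_index = max(range(len(population)), key=lambda i: fitnesses[i])
    return [chromosome if i == elite_index
            else mutate(chromosome, mutation_chance)
            for i, chromosome in enumerate(population)]
```

Keeping the elite chromosome untouched guarantees that the best fitness found so far can never decrease from one generation to the next.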

Evolution and balance

What’s really happening with mutation is that we’re adding some randomness back into the population in order to keep variety in the search space, and potentially explore new spaces that weren’t in the initial population or have been lost over the generations. However, as with many things in AI, mutation isn’t a magic bullet. The GA will be sensitive to the balance between the population size and the mutation chance.

The following table shows this in action:

The larger the population size, the smaller the mutation chance should be. In the above example, a population of 10 got the best result with a mutation chance of 0.03, a population of 20 worked best with 0.01, and a population of 50 did best with no mutation at all.
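The series' full GA isn't reproduced in this post, but you can observe the trade-off yourself with a toy GA. The sketch below assumes a simple "one-max" fitness (counting the ones in a binary chromosome) purely for illustration, with tournament selection, single-point crossover, elitism and per-bit mutation:

```python
import random

def one_max(chromosome):
    # Toy fitness for illustration: count the ones.
    return sum(chromosome)

def run_ga(pop_size, mutation_chance, gene_count=30, generations=40):
    """Run a minimal GA and return the best fitness found."""
    population = [[random.randint(0, 1) for _ in range(gene_count)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        fitnesses = [one_max(c) for c in population]

        def select():
            # Tournament selection: pick the fitter of two random chromosomes.
            a, b = random.sample(range(pop_size), 2)
            return population[a] if fitnesses[a] >= fitnesses[b] else population[b]

        # Elitism: carry the best chromosome over unmutated.
        children = [max(population, key=one_max)]
        while len(children) < pop_size:
            mum, dad = select(), select()
            point = random.randrange(1, gene_count)  # single-point crossover
            child = mum[:point] + dad[point:]
            child = [1 - g if random.random() < mutation_chance else g
                     for g in child]  # per-bit mutation
            children.append(child)
        population = children
    return max(one_max(c) for c in population)

for pop_size in (10, 20, 50):
    for chance in (0.0, 0.01, 0.03):
        print(pop_size, chance, run_ga(pop_size, chance))
```

Because the search is stochastic, individual runs vary; averaging several runs per setting gives a clearer picture of which mutation chance suits which population size.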

Convergence

That’s it for this week! We’ve covered mutation and finished our implementation of a GA. The full code for the GA is on our GitHub.

We’re going to be taking a break for a little while on Code of AI, while I do some genetic evolution of my own (I’m having a baby). So be sure to subscribe to our mailing list here or give us a follow to stay up to date and receive updates on our latest posts. Please feel free to send us some feedback on what you might like to learn when Code of AI returns later this year!


Founder & CTO @melodrivemusic. AI video game music platform. Tech leader, programmer, musician, generative artist and speaker.