Selection, Crossover, I choose you! Genetic algorithms, part II

Andy Elmsley
The Sound of AI
Jun 5, 2019 · 3 min read

Welcome back, phenotypes! Last time we introduced Genetic Algorithms (GAs), an AI method for searching complex spaces, inspired by natural selection. I’d recommend you review that article before you get started on this one, as we’re picking up where we left off. This week we’re going to dig into the evolution process behind GAs and learn how to use a fitness function to evolve fitter solutions.

Everything I know about evolution I learned from Pokémon.

The fitness function

As any Pokémon trainer knows, you need to build your Pokémon’s experience to make them evolve and become stronger. In GA terms, a Pokémon’s experience in battle would be known as its fitness. More generally, we can measure the fitness of any chromosome with a problem-specific fitness function. For instance, here’s my fitness function for the knapsack problem:
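(The canonical version is in the GitHub script linked below; the sketch here assumes a bit-list chromosome encoding and an illustrative `ITEMS`/`MAX_WEIGHT` knapsack setup.)

```python
import random

# Assumed problem setup for illustration: each item is a (weight, value)
# pair and the knapsack has a maximum weight capacity.
ITEMS = [(random.randint(1, 10), random.randint(1, 20)) for _ in range(20)]
MAX_WEIGHT = 40


def fitness(chromosome):
    """Score a chromosome: the total value packed, or 0 if over capacity."""
    # Decode the genotype (a bit list) into a phenotype (the chosen items).
    phenotype = [item for gene, item in zip(chromosome, ITEMS) if gene]
    total_weight = sum(weight for weight, _ in phenotype)
    total_value = sum(value for _, value in phenotype)
    # Overweight knapsacks are invalid solutions, so they score zero.
    return total_value if total_weight <= MAX_WEIGHT else 0
```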

We first create a phenotype solution out of the chromosome, and then calculate the fitness based on weight and value. The more value we have in the knapsack, the higher the fitness.

The evolutionary process

Natural evolution is a process that happens among a population of entities. If you really want to be the very best, one Pokémon in your roster just won’t cut it — you need a team of Pokémon, each with their own unique abilities. In a GA, searching your space with a population of candidate chromosomes leads to better results, faster. In the function below, we create a population of chromosomes by simply making a list of lists:
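(A minimal sketch; the population size and genome length are illustrative values chosen to match the fitness sketch above.)

```python
import random

GENOME_LENGTH = 20    # one gene per knapsack item (illustrative)
POPULATION_SIZE = 10  # illustrative population size


def create_population(size=POPULATION_SIZE, length=GENOME_LENGTH):
    """A population is just a list of random bit-list chromosomes."""
    return [[random.randint(0, 1) for _ in range(length)]
            for _ in range(size)]
```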

Only the fittest members of this population will survive to the next generation. This is known as selection. First, the entire population is ranked by fitness, and then a certain proportion of it (50% in this example) is selected.
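In code, the selection step could look something like this sketch, which uses the fitness function sketched above:

```python
def select(population, ratio=0.5):
    """Rank the population by fitness (fittest first) and keep the top slice."""
    ranked = sorted(population, key=fitness, reverse=True)
    return ranked[:int(len(ranked) * ratio)]
```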

One part of evolution that Pokémon conveniently leaves out is exactly how genes are passed down from one generation to the next (NSFW: when a mummy and a daddy Pikachu love each other very much…). In GAs, we simplify the act of creating ‘children’ down to mixing genes, in a process known as crossover. We take two parents and create a child by mixing the chromosomes together:
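(The sketch below uses single-point crossover, one common scheme, as an illustration.)

```python
import random


def crossover(parent_a, parent_b):
    """Single-point crossover: the child takes the head of one parent
    and the tail of the other."""
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]
```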

This is one of the main search operations in a GA (there’s another one, but we’ll cover it next time). The intuition behind this is that by mixing chromosomes together, we can explore the search space in directions we already know work well.

Finally, the resulting children from the crossover process are added back into the population, replacing the rejected chromosomes, and the process continues. The loop of selection and crossover is then repeated for multiple generations.
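Tying the sketches above together, one way the generational loop could look (again illustrative; the canonical version is in the GitHub script):

```python
import random

NUM_GENERATIONS = 10  # illustrative number of generations


def evolve(generations=NUM_GENERATIONS):
    """Run the selection/crossover loop, printing the best fitness each generation."""
    population = create_population()
    for generation in range(generations):
        survivors = select(population)
        # Breed children from random pairs of survivors until the population is refilled.
        children = []
        while len(survivors) + len(children) < len(population):
            parent_a, parent_b = random.sample(survivors, 2)
            children.append(crossover(parent_a, parent_b))
        population = survivors + children
        best = max(fitness(chromosome) for chromosome in population)
        print(f"Generation {generation}: best fitness = {best}")


if __name__ == "__main__":
    evolve()
```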

If you run all the above code snippets (also available as a single script on our GitHub), you should notice fitness improving over time.

Next generation

That’s it for this week! We’ve covered the process of a GA’s evolution from population, to fitness, selection and crossover. I have a couple of challenges for you that won’t endanger your survival:

  1. What happens when you increase the number of generations in the example code (to e.g. 20)? Why do you think this happens?
  2. Compare the execution time and results of the previous week’s brute force approach and this week’s GA approach. Run them each a few times, changing the number of items. What do you notice about:
  • The discovered solutions?
  • The execution time?

As always, you can find the source code for all the above examples on our GitHub.

To begin your AI-coding training at day one, go here. And give us a follow to receive updates on our latest posts.
