The Essence of Genetic Algorithms

Chi-Keong Goh
AI2 Labs
Published May 3, 2020 · 4 min read

Everyone has been talking about Artificial Intelligence (AI) for quite some time, and the interest does not seem to be abating. From politicians to farmers, and from smart cities to categorizing cucumbers, AI is appearing virtually everywhere.

Anyone digging further into AI will find themselves drowning in terms like Deep Learning, Convolutional Networks, Long Short-Term Memory, Generative Adversarial Networks, and so on. Amongst these techniques, the genetic algorithm (GA) is a relative unknown. Apart from being a good (if not better) alternative to gradient-based training approaches for deep neural networks, GA is a crucial element in quite a few automated or self-service data analytics platforms.

The purpose of this post is to introduce this powerful, low-profile technique, cutting through the seeming complexity to lay out what really works.

So what is it?

GA is a class of stochastic optimization algorithms that finds inspiration in biology. It falls within the field of Computational Intelligence, a field of study in its own right with dedicated journals and conferences from the likes of IEEE, IET, and so on. Yes, it is quite big within academia, though not as trendy anymore since it is not “deep”. The field has many related algorithms, including Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), Evolution Strategies (ES), Bacteria Foraging Optimization, Lion Optimization, and many more. No, that is no exaggeration, and there is absolutely no stopping the creativity of researchers when it comes to inspiration.

Let’s start by deciphering the definitions of the two more popular algorithms:

Genetic Algorithm is a class of population-based stochastic optimization algorithms inspired by the process of biological evolution. A simple interpretation of biological evolution is: alpha male meets alpha female, they produce a group of alpha kids who then find their respective alpha partners, and the process continues with an increasingly alpha population.

Particle Swarm Optimization is a class of population-based stochastic optimization techniques inspired by the social behavior of bird flocking or fish schooling. A simple interpretation of flocking or schooling is: follow the head honcho, whoever can lead us to safety and food. If you have not seen the flocking behavior that emerges just by following a temporary leader, here it is:

Flocking Behavior just by following the leader. Source: https://gifer.com/en/9MWY
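To make the "follow the head honcho" idea concrete, here is a minimal PSO sketch in Python minimizing a toy sphere function. The objective, parameter values (inertia `w`, pull coefficients `c1`, `c2`), and function names are my own illustrative choices, not code from this post:

```python
import random

def sphere(x):
    # Toy objective: minimize the sum of squares (optimum at the origin)
    return sum(v * v for v in x)

def pso(dim=2, swarm_size=20, iterations=100, w=0.7, c1=1.5, c2=1.5):
    # Each particle tracks its position, velocity, and personal best
    positions = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm_size)]
    velocities = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in positions]
    gbest = min(pbest, key=sphere)  # the "head honcho" the whole swarm follows
    for _ in range(iterations):
        for i, pos in enumerate(positions):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + pull toward personal best + pull toward global best
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (pbest[i][d] - pos[d])
                                    + c2 * r2 * (gbest[d] - pos[d]))
                pos[d] += velocities[i][d]
            if sphere(pos) < sphere(pbest[i]):
                pbest[i] = pos[:]
        gbest = min(pbest, key=sphere)
    return gbest

best = pso()
```

Note how every particle is nudged toward both its own best-known position and the swarm's best-known position; the temporary leader changes whenever some particle finds a better spot.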

I know it sounds abstract and very un-computer-science. The pseudocode for GA and PSO is listed below, which should provide a little more clarity.

Pseudocode for GA and PSO
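As a concrete companion to the pseudocode, here is a minimal runnable GA in Python on the classic OneMax toy problem (maximize the number of 1-bits in a string). The problem, parameters, and helper names are my own illustrative choices, not code from this post:

```python
import random

def evaluate(individual):
    # Fitness: count of 1-bits (the classic OneMax toy problem)
    return sum(individual)

def select(population, n):
    # Rank by fitness and keep the top n
    return sorted(population, key=evaluate, reverse=True)[:n]

def crossover(a, b):
    # Single-point crossover: splice a prefix of one parent onto a suffix of the other
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(individual, rate=0.01):
    # Flip each bit with a small probability
    return [bit ^ 1 if random.random() < rate else bit for bit in individual]

def genetic_algorithm(bits=20, pop_size=30, generations=50):
    population = [[random.randint(0, 1) for _ in range(bits)] for _ in range(pop_size)]
    for _ in range(generations):
        parents = select(population, pop_size // 2)           # Selection
        children = [mutate(crossover(random.choice(parents),  # Adaptation
                                     random.choice(parents)))
                    for _ in range(pop_size)]
        population = select(population + children, pop_size)  # Evaluation + survival
    return max(population, key=evaluate)

best = genetic_algorithm()
```

The loop is nothing more than: score the population, keep the good ones, and recombine them with a little random variation.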

My own classroom explanation is to picture yourself in a meeting room brainstorming ideas. The good ideas survive, some ideas get combined in different ways into better ideas, and finally you arrive at an idea that is accepted by most in the room. Regardless of what you see on the surface, GAs, PSOs, ACOs, etc. fit into the same algorithmic framework with the following three basic components:

  • Evaluation: Evaluate or assess the quality of ideas/solutions. This could mean minimizing some cost or maximizing some objective.
  • Selection: Some way of selecting good solutions from the jungle out there. One example is to rank the solutions and then select the top N.
  • Adaptation: Some way of combining elements of selected solutions to form (potentially) better solutions. This can be a simple mix-and-match of parts from two different solutions to form a new one, also known as crossover.
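The mix-and-match in the Adaptation bullet can be shown in a few lines. Below is a hypothetical single-point crossover on two hand-picked bit strings; the parents and variable names are my own illustration:

```python
import random

parent_a = [1, 1, 1, 1, 0, 0, 0, 0]
parent_b = [0, 0, 0, 0, 1, 1, 1, 1]

# Single-point crossover: pick a cut point and splice the two parents together,
# so the child inherits a prefix from one parent and a suffix from the other
point = random.randint(1, len(parent_a) - 1)
child = parent_a[:point] + parent_b[point:]
```

For example, a cut point of 2 would yield `[1, 1, 0, 0, 1, 1, 1, 1]` — a new solution built entirely from parts of the two selected ones.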

So let’s take a look at the pseudocode again. Different terminology, different implementations, but they perform the same functions.

Pseudocode for GA and PSO (rehashed)

The underlying assumption of these algorithms is that you cannot do worse when you combine good things together. Well, that is not always true, but it has led to many interesting implementations out there.

The key message of this post is that biologically inspired optimization algorithms such as GA and PSO look intimidating but really are not. Most of the variation among the algorithms out there centers on how Evaluation, Selection, and Adaptation are implemented. There is no one way of doing it, only an appropriate way of implementing it for your problem. I will follow up with a little more on each of these components in the near future.
