Genetic Algorithm

Genetic algorithms (GAs) are an amazing way to solve optimization problems. I want to share with you some of the theory behind GAs and walk through the implementation of a simple GA.

The Theory

GAs are adaptive heuristic search and optimization methods based on the evolutionary ideas of natural selection and genetics. The basic technique is designed to simulate the principle of “survival of the fittest”: the fittest individuals competing for scarce resources dominate over the weaker ones.

GAs are more robust than conventional search and optimization algorithms. They do not break easily and can handle a reasonable amount of noise. As problem complexity rises in terms of large state spaces, multi-modal landscapes, or n-dimensional surfaces, GAs offer more benefits than traditional algorithms.

To simulate the survival of the fittest, individuals are placed in a population. Each individual represents a possible solution to the problem. The individuals in the population then go through a process of evolution.

  • Individuals compete for resources and mate.
  • The fitter the individual, the higher its probability of producing offspring.
  • Genes from fit individuals propagate throughout the population; two fit parents will sometimes produce offspring that are better than either parent.
  • Each generation becomes more suited to its environment.

The Implementation

After the initial random population is generated, the algorithm evolves through three operators:

  • Selection: solutions with higher fitness have a better chance of mating.
  • Crossover: mating between individuals produces offspring whose genes are shared between the parents.
  • Mutation: a chance for a gene to mutate at random during mating.
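
Before looking at each operator in detail, here is a minimal sketch of how a population might be represented. It assumes individuals are fixed-length bit strings and uses a toy fitness function that simply counts the number of ones (the classic OneMax problem); both choices are illustrative assumptions, not part of the original description.

    import random

    GENOME_LENGTH = 6     # bits per individual (assumed for illustration)
    POPULATION_SIZE = 20  # assumed population size

    def random_individual():
        # An individual is a fixed-length list of bits.
        return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

    def fitness(individual):
        # Toy fitness (OneMax): count the ones. A real problem would
        # score the bit string against its own criteria.
        return sum(individual)

    # Randomly initialize a population.
    population = [random_individual() for _ in range(POPULATION_SIZE)]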

Selection

A fitness metric based on the problem criteria measures the strength of a solution. The fitter individuals have a higher probability of mating with other solutions to generate new solutions. A typical approach is to use a roulette wheel with slots sized according to fitness to pick which individuals go into the mating pool. In this pool we would expect individuals with higher fitness to show up more often than individuals with lower fitness.
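
Below is a sketch of roulette-wheel selection, reusing the bit-string population and fitness function assumed in the earlier sketch. Each individual's slice of the wheel is proportional to its fitness, so fitter individuals are drawn into the mating pool more often.

    import random

    def roulette_wheel_select(population, fitnesses):
        # Spin the wheel once: pick an individual with probability
        # proportional to its fitness.
        total = sum(fitnesses)
        pick = random.uniform(0, total)
        running = 0.0
        for individual, fit in zip(population, fitnesses):
            running += fit
            if running >= pick:
                return individual
        return population[-1]  # guard against floating-point rounding

    def build_mating_pool(population, fitnesses, pool_size):
        # Fitter individuals are expected to appear more than once.
        return [roulette_wheel_select(population, fitnesses)
                for _ in range(pool_size)]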

Crossover

Two individuals from the mating pool are chosen at random to mate. A crossover site along the bit strings is randomly chosen, and the values of the two strings are exchanged up to this point. For example, if x1 = 010101 and x2 = 000111 and the crossover site is 2, then x1' = 000101 and x2' = 010111. The two offspring from this mating are placed into the next generation of the population. By recombining portions of fit individuals, crossover is likely to create even better individuals.
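
Here is a sketch of the single-point crossover described above, on the same bit-string representation. Fixing the crossover site at 2 reproduces the x1/x2 example from the text; normally the site would be chosen at random.

    import random

    def crossover(parent1, parent2, site=None):
        # Exchange the values of the two strings up to the crossover site.
        if site is None:
            site = random.randint(1, len(parent1) - 1)
        child1 = parent2[:site] + parent1[site:]
        child2 = parent1[:site] + parent2[site:]
        return child1, child2

    # The example from the text: x1 = 010101, x2 = 000111, site = 2
    x1 = [0, 1, 0, 1, 0, 1]
    x2 = [0, 0, 0, 1, 1, 1]
    print(crossover(x1, x2, site=2))
    # ([0, 0, 0, 1, 0, 1], [0, 1, 0, 1, 1, 1]) -> x1' = 000101, x2' = 010111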

Mutation

With a low probability, a portion of the new individuals will have a bit flipped. The purpose of mutation is to maintain diversity within the population and inhibit premature convergence.
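
A sketch of bit-flip mutation on the same bit-string representation follows. The per-bit mutation rate of 0.01 is an assumed value chosen for illustration, not one from the post.

    import random

    def mutate(individual, rate=0.01):
        # Flip each bit independently with a small probability.
        return [1 - bit if random.random() < rate else bit
                for bit in individual]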

The Algorithm

An outline of a basic GA is below; a code sketch that ties the earlier pieces together follows the outline.

  1. randomly initialize a population
  2. determine the fitness of the population
  3. repeat:
  • select parents from the population into a mating pool using a roulette wheel based on fitness
  • perform crossover on the parents to create population (n+1)
  • perform mutation on population (n+1)
  • determine the fitness of population (n+1)
  4. stop when the best individual meets some metric
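
Putting the pieces together, here is a sketch of the full loop, reusing the helpers and the GENOME_LENGTH constant from the earlier sketches (random_individual, fitness, roulette_wheel_select, crossover, mutate). The crossover and mutation rates and the stopping metric (a perfect OneMax score or a generation cap) are assumptions for illustration.

    import random

    def evolve(population_size=20, generations=100,
               crossover_rate=0.7, mutation_rate=0.01):
        # 1. Randomly initialize a population.
        population = [random_individual() for _ in range(population_size)]
        for generation in range(generations):
            # 2. Determine the fitness of the population.
            fitnesses = [fitness(ind) for ind in population]
            best = max(population, key=fitness)
            # 4. Stop when the best individual meets some metric
            #    (here: a perfect OneMax score).
            if fitness(best) == GENOME_LENGTH:
                return best, generation
            # 3. Select parents, then perform crossover and mutation
            #    to create population (n+1).
            next_population = []
            while len(next_population) < population_size:
                parent1 = roulette_wheel_select(population, fitnesses)
                parent2 = roulette_wheel_select(population, fitnesses)
                if random.random() < crossover_rate:
                    child1, child2 = crossover(parent1, parent2)
                else:
                    child1, child2 = parent1[:], parent2[:]
                next_population.append(mutate(child1, mutation_rate))
                next_population.append(mutate(child2, mutation_rate))
            population = next_population[:population_size]
        return max(population, key=fitness), generations

    best, generation = evolve()
    print(best, generation)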

Summary

Through the operations of selection, crossover, and mutation, the GA will converge over successive generations towards a global or near-global optimum. This is largely because GAs combine direction and chance in the search in an effective manner: fitter parents are combined to produce even fitter offspring, eventually leading to an optimal solution. GAs are powerful and robust optimization techniques because they explore and exploit the search space simultaneously.
