Simple Genetic Algorithm via Python, DEAP

dp
3 min read · Oct 28, 2018


I am currently reading “Genetic Algorithms and Investment Strategies” by Richard Bauer Jr. In the sixth chapter of his book, Bauer walks through a simple example of using a GA to optimize a basic business problem.

The fitness function to optimize is as follows:

Fitness = $600,000 - ( (F + V) + $350,000 )

The F and V variables are functions of our GA individual, which for this problem encodes Quantity (the number of items to produce). $600,000 is total revenue and $350,000 covers additional expenses.

Equation F = ( 20,000 / Quantity ) * $6,000 represents the total fixed cost of production runs: 20,000 / Quantity is the number of runs needed, and each run costs us $6,000.

Equation V = ( Quantity * $6 ) / 2 represents the variable (carrying) cost: on average Quantity / 2 units sit in inventory, at $6 per unit.
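Putting the pieces together, we can sanity-check the profit curve directly (the `profit` name is mine; this mirrors the fitness function built in Step 1 below, without the rounding):

```python
def profit(quantity):
    # F: number of runs (20,000 / quantity), at $6,000 per run
    f = (20000 / quantity) * 6000
    # V: average inventory (quantity / 2), at $6 per unit
    v = (quantity * 6) / 2
    # revenue minus run costs, carrying costs, and additional expenses
    return 600000 - ((f + v) + 350000)
```

F falls and V rises as Quantity grows; the standard EOQ trade-off puts the optimum at sqrt(2 * 20,000 * $6,000 / $6), roughly 6,325 units.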

Step 1: Build Fitness

def EOQ(individual):
    def to_int(b):
        return int(b, 2)

    # decode the bit string into an integer Quantity
    i = to_int(''.join(str(xi) for xi in individual))

    # guard against a zero quantity (division by zero below)
    if i == 0:
        return (-1) * 350000,  # DEAP expects a fitness tuple

    f = round((20000 / i) * 6000, 0)
    v = (i * 6) / 2

    return 600000 - ((f + v) + 350000),

Step 2: Set up the GA using DEAP.

import random

import numpy as np
from deap import base, creator, tools

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)
  • Max Optimization of our Fitness.
  • Our Individual will be represented as an Array.
tbx = base.Toolbox()

INDIVIDUAL_SIZE = 20

tbx.register("attr_int", random.randint, 0, 1)
tbx.register("individual",
             tools.initRepeat,
             creator.Individual,
             tbx.attr_int,
             n=INDIVIDUAL_SIZE)
tbx.register("population", tools.initRepeat, list, tbx.individual)
tbx.register("evaluate", EOQ)
tbx.register("mate", tools.cxOnePoint)
tbx.register("mutate", tools.mutFlipBit, indpb=0.01)
tbx.register("select", tools.selTournament, tournsize=5)
  • Each attribute in our individual will be either a 0 or 1. Each individual will be 20 bits in length.
  • Mating will use One Point Crossover, Mutation will flip individual bits (each with probability 0.01), and Selection will use Tournament Selection.
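To make those operators concrete, here is a minimal pure-Python sketch of what one-point crossover and flip-bit mutation do (DEAP's `cxOnePoint` and `mutFlipBit` modify the individuals in place; these stand-ins return new lists):

```python
import random

def one_point_crossover(p1, p2):
    # choose a cut point strictly inside the chromosome,
    # then swap the tails of the two parents
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def flip_bit_mutation(individual, indpb=0.01):
    # flip each bit independently with probability indpb
    return [1 - bit if random.random() < indpb else bit
            for bit in individual]
```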

Step 3: Running the GA.

## create random population,
population = tbx.population(n=100)

## set fitness,
set_fitness(population)

A look at individuals in our Population,

## quick look at the initial population,
population[:5]
[[0, 1, 0, 1, 0, 1, 1, 0, 1, 0, 0, 0, 1, 1, 1, 0, 1, 0, 0, 1],
[0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 1, 1, 1],
[0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 1, 1, 0],
[1, 1, 1, 0, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0, 0],
[1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1]]

The rest… Additional code, which sets the fitness on our individuals and collects per-iteration statistics for the population, was left out.

iteration = 1
while iteration < 100:

    current_population = list(map(tbx.clone, population))

    offspring = []
    for _ in range(10):
        i1, i2 = np.random.choice(range(len(population)),
                                  size=2, replace=False)
        # clone the parents so mating does not modify
        # the current population in place
        parent1 = tbx.clone(population[i1])
        parent2 = tbx.clone(population[i2])
        offspring1, offspring2 = tbx.mate(parent1, parent2)
        offspring.append(tbx.mutate(offspring1)[0])
        offspring.append(tbx.mutate(offspring2)[0])

    for child in offspring:
        current_population.append(child)

    ## reset fitness,
    set_fitness(current_population)

    ## tourn. select,
    population[:] = tbx.select(current_population, len(population))

    iteration += 1

After 50 iterations, our population converged around a Quantity of 6,300, which is close to the optimal solution (roughly 6,325) for this problem. A notebook with the full example can be found here.
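To read the converged answer out of an individual, the bit string just needs decoding back to an integer, the same way the fitness function does it (the helper name is mine):

```python
def to_quantity(individual):
    # interpret the list of bits as a big-endian binary integer
    return int(''.join(str(bit) for bit in individual), 2)
```

For example, the 20-bit string 00000001100010110101 decodes to 6,325.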
