Differential Evolution Algorithm (DE): A Comprehensive Guide

Tahsin Soyak
2 min read · Aug 4, 2024


The Differential Evolution Algorithm (DE) is a population-based heuristic optimization technique. It’s a simple yet powerful method for global optimization, particularly suited to nonlinear problems over continuous variables. The algorithm draws inspiration from genetic algorithms and operates stochastically.

Differential Evolution Algorithm

Key Concepts

  1. Population Initialization: Initialize a population of candidate solutions randomly within the search bounds.
  2. Mutation: For each candidate solution, create a mutant vector by adding the weighted difference of two randomly selected solutions to a third.
  3. Crossover: Mix the candidate solution with its mutant vector, coordinate by coordinate, to create a trial vector.
  4. Selection: Compare the trial vector to the candidate solution and keep whichever has the better fitness value.
  5. Repeat: Iterate through the mutation, crossover, and selection steps until a stopping criterion is met (e.g., a maximum number of generations).
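The mutation and crossover steps above can be sketched in isolation. The vectors and the random seed below are hypothetical, chosen just to illustrate one trial-vector construction:

```python
import numpy as np

rng = np.random.default_rng(42)

# Three distinct population members, different from the target (hypothetical 2-D vectors)
a = np.array([0.2, 0.8])
b = np.array([0.5, 0.1])
c = np.array([0.9, 0.4])

F = 0.8   # mutation factor
CR = 0.7  # crossover probability

# Mutation: perturb a with the scaled difference of b and c
mutant = a + F * (b - c)

# Binomial crossover: take each coordinate from the mutant with probability CR,
# otherwise keep the target's coordinate
target = np.array([0.3, 0.6])
mask = rng.random(2) < CR
trial = np.where(mask, mutant, target)
```

The trial vector then competes with the target in the selection step: whichever has the lower objective value survives into the next generation.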

Example Code

Here’s an implementation of the Differential Evolution Algorithm in Python:

import numpy as np

def differential_evolution(func, bounds, pop_size=10, mutation_factor=0.8,
                           crossover_prob=0.7, max_iter=1000):
    dim = len(bounds)
    # Keep the population normalized to [0, 1]; denormalize for evaluation
    pop = np.random.rand(pop_size, dim)
    min_b, max_b = np.asarray(bounds).T
    diff = np.fabs(min_b - max_b)
    pop_denorm = min_b + pop * diff
    fitness = np.asarray([func(ind) for ind in pop_denorm])
    best_idx = np.argmin(fitness)
    best = pop_denorm[best_idx]

    for i in range(max_iter):
        for j in range(pop_size):
            # Mutation: combine three distinct members other than j
            idxs = [idx for idx in range(pop_size) if idx != j]
            a, b, c = pop[np.random.choice(idxs, 3, replace=False)]
            mutant = np.clip(a + mutation_factor * (b - c), 0, 1)
            # Crossover: build the trial vector, guaranteeing at least
            # one coordinate comes from the mutant
            cross_points = np.random.rand(dim) < crossover_prob
            if not np.any(cross_points):
                cross_points[np.random.randint(0, dim)] = True
            trial = np.where(cross_points, mutant, pop[j])
            trial_denorm = min_b + trial * diff
            # Selection: keep the trial only if it improves on the target
            f = func(trial_denorm)
            if f < fitness[j]:
                fitness[j] = f
                pop[j] = trial
                if f < fitness[best_idx]:
                    best_idx = j
                    best = trial_denorm
        yield best, fitness[best_idx]

# Example usage
def objective_function(x):
    return sum(x**2)

bounds = [(-5, 5), (-5, 5)]
result = list(differential_evolution(objective_function, bounds, max_iter=1000))
print(f"Best solution: {result[-1][0]}")
print(f"Best value: {result[-1][1]}")

Output (the run is stochastic, so exact values may vary slightly):

Best solution: [0. 0.]
Best value: 0.0
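For production use, you don’t have to maintain your own implementation: SciPy ships a tuned version of this same algorithm as `scipy.optimize.differential_evolution`. A minimal sketch on the same objective (assuming SciPy is installed):

```python
from scipy.optimize import differential_evolution

def objective_function(x):
    return sum(x**2)

bounds = [(-5, 5), (-5, 5)]
# seed fixes the random state for reproducibility
result = differential_evolution(objective_function, bounds, seed=42)
print(result.x, result.fun)
```

SciPy’s version adds conveniences such as convergence tolerances, parallel evaluation via `workers`, and an optional local polishing step on the best member.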

Differential Evolution Algorithm (DE), in summary:

  • Population-based: Uses a population of candidate solutions.
  • Mutation and Crossover: Generates new solutions by combining existing ones.
  • Selection: Chooses the best solutions for the next generation.

Artificial Intelligence — Tutorial #8 “Differential Evolution Algorithm”

For previous subject go here -> https://medium.com/p/46e33f1ecc02

For next subject go here -> https://medium.com/p/ec9e905e2789

Let me know if you’d like any further refinements or additions to this post! tahsinsoyakk@gmail.com
