Optimize in No Time (NumPy Vectorized) - Particle Swarm Optimization (PSO)
If you want to learn how to implement Particle Swarm Optimization (PSO) in Python using NumPy vectorization, then you’re at the right place. Particle Swarm Optimization is a type of evolutionary computation. According to Wikipedia’s definition, in computer science, evolutionary computation is a family of algorithms for global optimization inspired by biological evolution, and the subfield of artificial intelligence that studies these algorithms. In evolutionary computation, an initial set of candidate solutions is generated and iteratively updated. Each new generation is produced by stochastically (randomly) removing less desired solutions and introducing small random changes.
Particle Swarm Optimization is an optimization method introduced by Dr. Eberhart and Dr. Kennedy in 1995. The algorithm was inspired by the behavior of bird flocking and fish schooling. It is similar to the Genetic Algorithm but uses different parameters. The algorithm is represented as a swarm containing many particles, where each individual particle is a complete candidate solution. Each particle in the swarm has its own position, its own velocity, and a record of its own best position. PSO is very different from gradient-based optimization algorithms because it does not use complex mathematics to calculate gradients in order to move in the optimal direction.
Let’s take a look at the algorithm (pseudocode) of PSO.
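The pseudocode below is the standard, generic formulation of PSO; it is a sketch rather than the author’s exact listing.

initialize each particle i with a random position x_i and velocity v_i, and set pbest_i = infinity
repeat until the stopping criterion is met (for example, a fixed number of iterations):
    for each particle i:
        compute the fitness f(x_i)
        if f(x_i) < pbest_i: update pbest_i and pbestpos_i
        if f(x_i) < gbest:   update gbest and gbestpos
    for each particle i:
        v_i = w * v_i + c1 * r1 * (pbestpos_i - x_i) + c2 * r2 * (gbestpos - x_i)
        x_i = x_i + v_i
return gbestpos and gbest

Here w is the inertia weight, c1 and c2 are the cognitive and social coefficients, and r1, r2 are random numbers drawn uniformly from [0, 1] for every update.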
Parameters
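In the implementations below, the swarm is controlled by a handful of hyperparameters: the number of particles, the number of dimensions of the search space, the ranges used to initialize positions and velocities, the range the inertia weight is sampled from, the cognitive and social coefficients (c1, c2), and the number of iterations.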
The algorithm can be stopped based on the number of iterations or on the value of the loss function being optimized, but in the code below we stop only after a fixed number of iterations.
We can use various benchmark objective functions to test our optimization algorithm; you can visit the link to check a list of objective functions commonly used for testing. In this tutorial, though, we’re going to optimize a simple objective function known as the sphere function.
Easier Version (NumPy Array)
Particle
Create a Particle class as follows.
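The original listing is not reproduced here, but a minimal sketch consistent with the description below (the constructor arguments dim, x_range and v_range are assumptions on my part) could look like this:

import numpy as np

class Particle:
    # A single candidate solution: a position, a velocity and a memory of the
    # best position it has visited so far.
    def __init__(self, dim, x_range, v_range):
        # Position and velocity are initialized randomly within the given ranges.
        self.x = np.random.uniform(x_range[0], x_range[1], dim)
        self.v = np.random.uniform(v_range[0], v_range[1], dim)
        # pbest starts at infinity so the first evaluation always improves it;
        # pbestpos starts at the origin as a placeholder starting position.
        self.pbest = np.inf
        self.pbestpos = np.zeros(dim)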
Here we defined the position (x), velocity (v), personal best (pbest) and personal best position (pbestpos) of a particle. The x and v are randomly initialized, while pbest is initially set to infinity and pbestpos is initialized to zeros as a starting position.
Swarm
Create a Swarm class.
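Again, a minimal sketch rather than the author’s exact code, using the parameter names from the description that follows:

class Swarm:
    # A collection of particles plus the best solution the whole swarm has found.
    def __init__(self, no_particle, dim_shape, x_range, v_range, iw_range, c):
        self.p = [Particle(dim_shape, x_range, v_range) for _ in range(no_particle)]
        self.dim = dim_shape
        self.gbest = np.inf                   # best loss any particle has achieved so far
        self.gbestpos = np.zeros(dim_shape)   # position that produced gbest
        self.iw_range = iw_range              # the inertia weight is sampled from this range
        self.c = c                            # (cognitive, social) acceleration coefficients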
Here we need to pass the number of particles (no_particle), the number of dimensions (dim_shape), the position range (x_range), the velocity range (v_range), the inertia weight range (iw_range), and the cognitive and social parameters (c).
Optimize Function
The optimize function for this version is a little different from, and simpler than, the later vectorized one: it performs all of the work in a single loop, computing each particle’s fitness, updating each particle’s velocity and position, and updating the personal best (pbest) and global best (gbest) of the particles and the swarm respectively.
It takes a function parameter, which is the objective function we want to optimize; print_step, which determines after how many iterations we log the loss value of the objective function; and iter, which is the number of iterations/epochs.
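A sketch of such an optimize method, written as part of the Swarm class above (the logging format and the per-update sampling of the inertia weight are assumptions):

    def optimize(self, function, print_step, iter):
        for i in range(iter):
            # Evaluate every particle and refresh the personal and global bests.
            for particle in self.p:
                fitness = function(particle.x)
                if fitness < particle.pbest:
                    particle.pbest = fitness
                    particle.pbestpos = particle.x.copy()
                if fitness < self.gbest:
                    self.gbest = fitness
                    self.gbestpos = particle.x.copy()

            # Move every particle using the standard PSO velocity update.
            for particle in self.p:
                iw = np.random.uniform(self.iw_range[0], self.iw_range[1])
                r1 = np.random.uniform(0.0, 1.0, self.dim)
                r2 = np.random.uniform(0.0, 1.0, self.dim)
                particle.v = (iw * particle.v
                              + self.c[0] * r1 * (particle.pbestpos - particle.x)
                              + self.c[1] * r2 * (self.gbestpos - particle.x))
                particle.x = particle.x + particle.v

            if i % print_step == 0:
                print("iteration#:", i, " loss:", self.gbest)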
Get Best Solution (Pair of Weights/Parameters)
It returns the best set of weights/parameters, i.e. the position that achieved the lowest loss over the whole run.
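As a method of the Swarm class, a sketch is simply:

    def get_best_solution(self):
        # The global best position is the best set of weights/parameters found so far.
        return self.gbestpos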
Sphere (Objective Function)
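The sphere function sums the squares of its inputs, f(x) = Σ xᵢ², so its global minimum is 0 at the origin:

def sphere(x):
    return np.sum(np.square(x))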
Run Algorithm
Now we just need to specify the parameters of the Swarm and then we’re good to run the PSO.
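For example (the ranges and coefficients below are illustrative values, not the author’s exact settings):

swarm = Swarm(no_particle=100, dim_shape=2, x_range=(-5.12, 5.12),
              v_range=(0.0, 0.1), iw_range=(0.4, 0.9), c=(0.5, 0.3))
swarm.optimize(sphere, print_step=100, iter=1000)
print("best solution:", swarm.get_best_solution())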
Results
Implementation (NumPy Vectorized Version)
Let’s create the Particle class for this version.
Particle
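The particle container itself can stay essentially the same as in the easier version; only the way the updates are applied across the swarm changes. A sketch:

import numpy as np

class Particle:
    def __init__(self, dim, x_range, v_range):
        self.x = np.random.uniform(x_range[0], x_range[1], dim)
        self.v = np.random.uniform(v_range[0], v_range[1], dim)
        self.pbest = np.inf
        self.pbestpos = np.zeros(dim)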
Swarm
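A sketch of the Swarm for this version. Storing the particles in a NumPy object array (my assumption about the original design) is what lets np.vectorize apply per-particle update functions to the whole swarm in one call:

class Swarm:
    def __init__(self, no_particle, dim_shape, x_range, v_range, iw_range, c):
        self.p = np.array([Particle(dim_shape, x_range, v_range)
                           for _ in range(no_particle)])
        self.dim = dim_shape
        self.gbest = np.inf
        self.gbestpos = np.zeros(dim_shape)
        self.iw_range = iw_range
        self.c = c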
Update Particle Position
It checks whether the fitness of a particle at the current iteration is less than its personal best (pbest); if it is, the particle’s pbest and pbest position are updated.
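A per-particle sketch of that logic; the function name and the choice to return the particle’s pbest (so the swarm can collect all of them) are assumptions:

def update_position(p, objective):
    # Evaluate the particle at its current position ...
    fitness = objective(p.x)
    if fitness < p.pbest:
        # ... and refresh its personal best if the fitness improved.
        p.pbest = fitness
        p.pbestpos = p.x.copy()
    return p.pbest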
Update Particle Velocity
It takes a particle (p) as a parameter, first calculates its new velocity, and then uses that velocity to update its position.
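A per-particle sketch; gbestpos, the inertia weight iw and the coefficients c are passed in from the swarm:

def update_velocity(p, gbestpos, iw, c):
    r1 = np.random.uniform(0.0, 1.0, p.x.shape)
    r2 = np.random.uniform(0.0, 1.0, p.x.shape)
    # Standard PSO rule: inertia + cognitive pull towards pbest + social pull towards gbest.
    p.v = iw * p.v + c[0] * r1 * (p.pbestpos - p.x) + c[1] * r2 * (gbestpos - p.x)
    p.x = p.x + p.v
    return p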
Algorithm Loop
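Here the two per-particle functions above are applied to the whole swarm with np.vectorize instead of an explicit Python loop over particles. A sketch of the optimize method inside the vectorized Swarm class (again an assumption, not the author’s exact code):

    def optimize(self, function, print_step, iter):
        for i in range(iter):
            # Apply the personal-best update to every particle at once.
            # otypes is given so np.vectorize does not make an extra probing call.
            pbests = np.vectorize(lambda p: update_position(p, function),
                                  otypes=[np.float64])(self.p)

            # Refresh the global best from the collected personal bests.
            best = np.argmin(pbests)
            if pbests[best] < self.gbest:
                self.gbest = pbests[best]
                self.gbestpos = self.p[best].pbestpos.copy()

            # Sample an inertia weight and move every particle.
            iw = np.random.uniform(self.iw_range[0], self.iw_range[1])
            np.vectorize(lambda p: update_velocity(p, self.gbestpos, iw, self.c),
                         otypes=[object])(self.p)

            if i % print_step == 0:
                print("iteration#:", i, " loss:", self.gbest)

    def get_best_solution(self):
        return self.gbestpos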
Sphere Function
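The objective function is unchanged from the easier version:

def sphere(x):
    return np.sum(np.square(x))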
Run Algorithm
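Running this version looks the same as before (again with illustrative parameter values):

swarm = Swarm(no_particle=100, dim_shape=2, x_range=(-5.12, 5.12),
              v_range=(0.0, 0.1), iw_range=(0.4, 0.9), c=(0.5, 0.3))
swarm.optimize(sphere, print_step=100, iter=1000)
print("best solution:", swarm.get_best_solution())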
Results
Important Note
This vectorized code might look complex because of the NumPy vectorized functions, but it is the most optimized version: it came out fastest when compared against the other implementations of PSO using NumPy arrays, Python lists, and C++.
GitHub
The code for PSO (NumPy Vectorized), PSO (Python Lists), PSO (NumPy Arrays) and PSO implemented in C++ is available on GitHub.
I hope you enjoyed reading and found it helpful.
Your feedback will be appreciated.