AI - Simulated Annealing Algorithm to solve Magic Square Problem
Using AI to get a magic square of order n = 3. Simulated annealing concepts like badness, temperature and acceptance probability are discussed with interactive examples.
Introduction
This blog is a continuation of the previous blog on the Hill Climbing Algorithm. We have seen that, in the state space diagram, hill climbing can get stuck at a local maximum (a local minimum if the objective is minimization), a shoulder or a ridge, because the algorithm works greedily. To overcome this, we use simulated annealing.
Simulated Annealing
- It is a probabilistic, local search technique for optimizing a function.
- The algorithm is inspired by annealing in metallurgy, where a metal is heated to a high temperature and then cooled slowly, which increases its strength.
Algorithm
- It is similar to the hill climbing algorithm: if a move improves the situation, it is always accepted (greedy); otherwise the move is accepted with some probability less than 1 (exploration).
- The probability decreases exponentially with the “badness” of the move — the amount ΔE by which the evaluation is worsened.
- The probability also decreases as the “temperature” T goes down: “bad” moves are more likely to be allowed at the start when T is high, and they become more unlikely as T decreases.
- If the schedule lowers T slowly enough, the algorithm will find a global optimum with probability approaching 1.
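The acceptance rule described above can be sketched as a small helper. This is a minimal illustration, not code from the post; the function name `accept` and the sign convention (ΔE > 0 means the move worsens the evaluation) are my assumptions.

```python
import math
import random

def accept(delta_e, temperature):
    """Decide whether to accept a move that changes the score by delta_e.

    delta_e < 0 means the move improves the solution and is always
    accepted. A worsening move is accepted with probability
    exp(-delta_e / temperature), which shrinks as the move gets worse
    (larger delta_e) or as the temperature drops.
    """
    if delta_e < 0:
        return True
    if temperature <= 0:
        return False
    return random.random() < math.exp(-delta_e / temperature)
```

Note how both properties from the bullets fall out of the single expression `exp(-ΔE/T)`: a larger ΔE or a smaller T both drive the acceptance probability toward 0.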
Our problem: Magic Square
A magic square of order n is a square array of numbers consisting of the distinct positive integers 1, 2, …, n². It has the property that the sum of the n numbers in any row, column, or main diagonal is always the same number, the magic constant n(n² + 1)/2. For example, in an order-3 magic square every such line sums to 15.
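A small checker makes this definition concrete. The function name `is_magic` is mine; the example square is the classic order-3 (Lo Shu) magic square, in which every line sums to 15.

```python
def is_magic(square):
    """Check whether every row, column and main diagonal has the magic sum."""
    n = len(square)
    magic_sum = n * (n * n + 1) // 2  # magic constant n(n^2 + 1)/2
    rows = [sum(row) for row in square]
    cols = [sum(square[r][c] for r in range(n)) for c in range(n)]
    diagonals = [sum(square[i][i] for i in range(n)),
                 sum(square[i][n - 1 - i] for i in range(n))]
    return all(s == magic_sum for s in rows + cols + diagonals)

# The classic order-3 magic square; every line sums to 15.
lo_shu = [[2, 7, 6],
          [9, 5, 1],
          [4, 3, 8]]
print(is_magic(lo_shu))  # True
```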
Coding
What have we coded?
- We generated an initial batch of 50 candidate magic squares, then selected the one with the fewest violations.
- At every iteration we select the best candidate from a new batch and compare it with the best one from the previous generation.
- If the successor is better than its predecessor, it is always accepted; otherwise it is accepted with a small probability.
- This probability depends on the badness and the temperature, which act as hyperparameters. The temperature is updated at each step using one of these schedules: linear, t = t − a; geometric, t = t·a with 0 < a < 1; slow decrease, t = t/(1 + b·t).
- The search terminates when the temperature reaches 0 or a square with 0 violations is found.
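Under the assumptions stated in its comments, the loop described above can be sketched as follows. This is a simplified single-candidate version, not the post's exact batch-of-50 code; the violation count as the energy, the random two-cell swap as the neighbour move, and the geometric cooling rate are all my choices for illustration.

```python
import math
import random

def violations(square):
    """Count how many rows, columns and diagonals miss the magic sum."""
    n = len(square)
    magic_sum = n * (n * n + 1) // 2
    lines = [list(row) for row in square]
    lines += [[square[r][c] for r in range(n)] for c in range(n)]
    lines += [[square[i][i] for i in range(n)],
              [square[i][n - 1 - i] for i in range(n)]]
    return sum(1 for line in lines if sum(line) != magic_sum)

def random_square(n):
    """A random arrangement of 1..n^2 as an n x n grid."""
    nums = list(range(1, n * n + 1))
    random.shuffle(nums)
    return [nums[i * n:(i + 1) * n] for i in range(n)]

def neighbour(square):
    """Swap two random cells to produce a successor candidate."""
    n = len(square)
    new = [row[:] for row in square]
    cells = [(r, c) for r in range(n) for c in range(n)]
    (r1, c1), (r2, c2) = random.sample(cells, 2)
    new[r1][c1], new[r2][c2] = new[r2][c2], new[r1][c1]
    return new

def simulated_annealing(n=3, t=10.0, alpha=0.999, t_min=1e-6):
    current = random_square(n)
    best = current
    while t > t_min and violations(best) > 0:
        candidate = neighbour(current)
        delta = violations(candidate) - violations(current)  # "badness" of the move
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
            if violations(current) < violations(best):
                best = current
        t *= alpha  # geometric cooling: t = t * a, with 0 < a < 1
    return best

square = simulated_annealing()
print(square, violations(square))
```

With this slow geometric schedule the order-3 search usually reaches 0 violations, but a single run is not guaranteed to; the post's batch-of-50 approach is one way to make the search more robust.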
If we want to see how the number of violations decreases over the iterations, we can plot a graph using the lists x and y.
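Assuming `x` holds the iteration numbers and `y` the violation counts recorded during the run, a minimal matplotlib sketch would be:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script also runs without a display
import matplotlib.pyplot as plt

# Stand-ins for the lists built during the run; these sample values
# are illustrative, not output from an actual run.
x = [0, 1, 2, 3, 4, 5]
y = [8, 6, 6, 4, 2, 0]

plt.plot(x, y)
plt.xlabel("Iteration")
plt.ylabel("Number of violations")
plt.title("Violations vs. iterations during simulated annealing")
plt.savefig("violations.png")
```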
In the next article, we'll talk about genetic algorithms. Drop your comments below. End of blog.