How to Implement Simulated Annealing Algorithm in Python

AI Optimization Algorithm

The Simulated Annealing algorithm is commonly used when an optimization method such as Hill Climbing gets stuck in a local minimum or local maximum. Simulated Annealing gives us a better chance of escaping those local optima and finding the global maximum or global minimum.

Why Simulated Annealing?

We’re going to simulate the physical annealing process: at high temperature, the system can move around quite freely, but as we decrease that temperature over time, movement slows down until we eventually settle on a final solution.

How does it work?

First, we have to determine how we will reduce the temperature on each iteration. In this example, we will start with a temperature of 90 degrees, and we will decrease the current temperature by 0.01 linearly until we reach the final temperature of 0.1 degrees.

import random  # used later to pick a random neighbor and draw the acceptance probability
import math    # used later for the exponential in the acceptance probability

initial_temp = 90
final_temp = .1
alpha = 0.01
current_temp = initial_temp

Then we will set the initial state and treat it as the current solution. You can pick a particular state or generate one randomly.

current_state = initial_state
solution = current_state
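For instance, a purely illustrative way to generate a random starting point (the one-dimensional state and its range here are assumptions, not part of the original walkthrough) would be:

current_state = random.uniform(-10, 10)  # hypothetical random starting point in an assumed range
solution = current_state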

Now, we will repeat this process until the current temperature is less than the final temperature.

while current_temp > final_temp:

For each iteration, we will pick a random neighbor of the current state (a state that we can reach from the current state).

neighbor = random.choice(get_neighbors(current_state))
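The get_neighbors helper is problem-specific and is not defined in this walkthrough. As a purely hypothetical sketch for a one-dimensional numeric state, it might return a handful of small random perturbations of the current value:

def get_neighbors(state):
    # Hypothetical neighborhood: ten small random perturbations of the state.
    return [state + random.uniform(-1, 1) for _ in range(10)]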

Then we calculate the cost difference between the current state and the neighbor; a positive difference means the neighbor has a lower cost, i.e. it is better.

cost_diff = get_cost(current_state) - get_cost(neighbor)

If the new solution is better, we will accept it.

if cost_diff > 0:
    solution = neighbor

If the new solution is not better, we may still accept it, and the chance of doing so is higher when the temperature is high. With this approach, we occasionally accept a worse solution in order to avoid getting stuck in a local minimum, and a neighbor that is only slightly worse than the current state is more likely to be accepted than one that is much worse. We can express that with the following acceptance probability:

if random.uniform(0, 1) < math.exp(cost_diff / current_temp):
    solution = neighbor
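To get a feel for how the temperature controls this, consider a hypothetical cost difference of -5 (the neighbor is worse by 5):

import math

print(math.exp(-5 / 90))  # ≈ 0.95 — at a high temperature we accept the worse neighbor most of the time
print(math.exp(-5 / 1))   # ≈ 0.007 — at a low temperature we almost always reject it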

The next step is to decrement the current temperature according to the alpha value.

current_temp -= alpha

So at the very end, we simply return whatever the solution happens to be at that point.
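Putting the pieces together, here is a minimal end-to-end sketch. The objective function (minimize x²), the neighbor generator, and the starting point are illustrative assumptions rather than part of the original walkthrough, and solution is kept as the best state seen so far while current_state is the point the search actually moves from:

import math
import random


def get_cost(state):
    # Hypothetical objective: minimize x^2 (the minimum is at x = 0).
    return state ** 2


def get_neighbors(state):
    # Hypothetical neighborhood: ten small random perturbations of the state.
    return [state + random.uniform(-1, 1) for _ in range(10)]


def simulated_annealing(initial_state):
    initial_temp = 90
    final_temp = .1
    alpha = 0.01

    current_temp = initial_temp
    current_state = initial_state
    solution = current_state  # best state seen so far

    while current_temp > final_temp:
        neighbor = random.choice(get_neighbors(current_state))

        # Positive when the neighbor has a lower cost than the current state.
        cost_diff = get_cost(current_state) - get_cost(neighbor)

        if cost_diff > 0:
            # The neighbor is better, so always move there.
            current_state = neighbor
        elif random.uniform(0, 1) < math.exp(cost_diff / current_temp):
            # The neighbor is worse, but move there anyway with a probability
            # that shrinks as the temperature drops and as the gap widens.
            current_state = neighbor

        # Remember the best state encountered so far.
        if get_cost(current_state) < get_cost(solution):
            solution = current_state

        # Decrement the temperature according to the alpha value.
        current_temp -= alpha

    return solution


print(simulated_annealing(initial_state=15))

Under these assumptions, running the sketch should typically print a value close to 0, the minimum of the assumed objective.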

Conclusion

This is the big picture of the Simulated Annealing algorithm: we take the problem and keep generating random neighbors. We will always move to a neighbor if it is better than our current state. But even if the neighbor is worse than our current state, we will sometimes move there anyway, depending on the temperature and on how much worse it is.
