# Are you ready to solve the traveling salesman problem?

Recently, I encountered a traveling salesman problem (TSP) on LeetCode: 943. Find the Shortest Superstring. You can find the problem here.


TSP is a famous NP-hard problem. The naive solution's complexity is O(n!). The DP (dynamic programming) version (the Bellman-Held-Karp algorithm) has a complexity of O(2^n * n²). By reducing the complexity from factorial to exponential, the problem becomes solvable when n is relatively small.
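To get a feel for this gap, we can compare n! against 2^n * n² directly (a quick sanity check, not part of any solution):

```python
import math

# Compare the naive factorial growth with the Held-Karp exponential bound.
for n in [5, 10, 15, 20]:
    naive = math.factorial(n)      # O(n!) complete search
    held_karp = 2 ** n * n ** 2    # O(2^n * n^2) subset DP
    print(n, naive, held_karp, naive // held_karp)
```

Already at n = 20, the factorial count is billions of times larger than the DP one.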

This problem interests me because I like DP problems. Most DP problems reduce the time complexity from exponential to polynomial by avoiding recomputing shared subproblems. This problem is an example where DP instead reduces the time complexity from factorial to exponential.

For why the naive solution is O(n!) while the DP one is O(2^n * n²), there is a super nice and clear video here.

There is also a nice post based on the video above. I love the video's clear explanation, and I find the post's visualization of the problem super nice.

However, if you compare the explanation video and the LeetCode problem carefully, you will find a slight difference in how the TSP is described, which gives us two cases. I am going to use a baby example to illustrate the details of these algorithms. This baby example contains 4 cities, A, B, C, and D, and is shown in Figure 1. To keep it simple, we use a *symmetric* TSP: the distance between two cities is the same in each opposite direction.

- Visit each city once and go back to the first visited city. ABCDA will be the solution for this baby example.
- Visit each city only once, without returning. ADCB will be the solution.

I am going to give a step-by-step analysis of these two cases.

Case 1: visit each city once and go back to the first visited city (ABCDA)

Step 1: Why does the naive solution have a complexity of O(n!)?

A complete search approach gives us (n-1)! possibilities. Why? We can visualize the possible choices using Figure 1. If we have n cities, we fix one city as the first and last stop of the tour, and the possible paths are the permutations of the remaining n-1 cities. So there are (n-1)! paths, and the complexity is O(n!).

If the path cost is considered, the best path will be ABCDA or ADCBA. Actually, these two are the same tour, since we are analyzing a *symmetric* TSP.
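The brute-force search itself is tiny. The distance matrix below is only an illustrative stand-in, since Figure 1's actual weights are not reproduced here; it is chosen so that ABCDA comes out optimal:

```python
from itertools import permutations

# Illustrative symmetric distances (not Figure 1's actual values):
# neighbors on the ABCD cycle are close, diagonals are far.
dist = {
    ('A', 'B'): 1, ('B', 'C'): 1, ('C', 'D'): 1, ('D', 'A'): 1,
    ('A', 'C'): 5, ('B', 'D'): 5,
}
d = lambda x, y: dist.get((x, y)) or dist[(y, x)]  # symmetric lookup

def brute_force_tsp(cities, start='A'):
    # Fix the start city; try all (n-1)! orders of the remaining cities.
    rest = [c for c in cities if c != start]
    best = None
    for perm in permutations(rest):
        tour = (start,) + perm + (start,)
        cost = sum(d(tour[i], tour[i + 1]) for i in range(len(tour) - 1))
        if best is None or cost < best[0]:
            best = (cost, tour)
    return best

print(brute_force_tsp('ABCD'))  # (4, ('A', 'B', 'C', 'D', 'A'))
```

With 4 cities this enumerates only 3! = 6 tours, but the count explodes factorially as cities are added.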

Step 2: Can we speed up?

Yes.

From Figure 4, we see we can prune branches that are too expensive.

Figure 5 shows 3 pruned branches when generating paths that already contain {A,B,C,D} and extending them to connect back to A. From this figure we cannot see significant pruning; however, if we introduce more cities, the pruning will greatly improve efficiency.

This process is exactly the critical part of the DP algorithm named Bellman–Held–Karp. As explained in the video, its complexity is O(n² * 2^n), much faster than the factorial one.

I implemented a straightforward Bellman–Held–Karp algorithm. The general idea is here:

If we have A, B, C, D

all the paths that contain 4 elements and end with D can be written as:

{A,B,C,D} ends with D

It can be generated from

{A,B,C} ends with A + AD

{A,B,C} ends with B + BD

{A,B,C} ends with C + CD

Then comes the solution below. I am using (A, B, C) + (A,) to encode "{A,B,C} ends with A".
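That recurrence can be sketched directly with a (subset, last city) encoding. The distances here are again only illustrative stand-ins for Figure 1, chosen so that the tour ABCDA is optimal:

```python
from itertools import combinations

# Illustrative symmetric distances, not Figure 1's actual values.
dist = {
    ('A', 'B'): 1, ('B', 'C'): 1, ('C', 'D'): 1, ('D', 'A'): 1,
    ('A', 'C'): 5, ('B', 'D'): 5,
}
d = lambda x, y: dist.get((x, y)) or dist[(y, x)]  # symmetric lookup

def held_karp(cities='ABCD', start='A'):
    rest = [c for c in cities if c != start]
    # dp[(subset, last)] = cheapest path start -> ... -> last covering subset
    dp = {(frozenset([c]), c): d(start, c) for c in rest}
    for size in range(2, len(rest) + 1):
        for subset in combinations(rest, size):
            s = frozenset(subset)
            for last in subset:
                # "{..., last} ends with last" is built from the subset
                # without last, ending with some j, plus the edge j -> last
                dp[(s, last)] = min(dp[(s - {last}, j)] + d(j, last)
                                    for j in s - {last})
    full = frozenset(rest)
    # close the tour by returning to the start city
    return min(dp[(full, last)] + d(last, start) for last in rest)

print(held_karp())  # 4, the cost of the tour ABCDA
```

Each of the O(2^n * n) states is computed once from at most n predecessors, which is where the O(2^n * n²) bound comes from.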

Solution 1: Straightforward Bellman–Held–Karp algorithm (844 ms)

Several solutions from the LeetCode discussion can be used to solve this TSP problem.

Solution 2 (Bellman-Held-Karp algorithm, runtime 936 ms). It is based on the DP TSP solution.

Reference for this solution:

```python
from typing import List

class Solution:
    def shortestSuperstring(self, A: List[str]) -> str:
        ''' TSP: DP '''
        n, N = len(A), 1 << len(A)
        # w[i][j]: length of the longest suffix of A[i] that is a prefix of A[j]
        w = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(min(len(A[i]), len(A[j])), 0, -1):
                    if A[j].startswith(A[i][-k:]):
                        w[i][j] = k
                        break
        # f[mask][k]: shortest superstring covering the words in mask,
        # ending with word k
        f = [[None] * n for _ in range(N)]
        for i in range(N):
            for k in (t for t in range(n) if (1 << t) & i):
                i1 = i ^ (1 << k)
                f[i][k] = min([f[i1][j] + A[k][w[j][k]:]
                               for j in filter(lambda x: (1 << x) & i1, range(n))],
                              key=len, default=A[k])
        return min(filter(None, f[-1]), key=len)
```

Solution 3 (A*, 50 ms)

I am still looking into this code and will add an explanation once I fully understand it.

Reference:

```python
from heapq import heappush, heappop

def dist(v, w, eq):
    # cost (extra characters) of appending word w after word v
    if eq:
        return 10000
    for i in range(1, len(w)):
        if v.endswith(w[:-i]):
            return i
    return len(w)

def construct_seq(s, d, w):
    # rebuild the superstring from the word order s
    t = w[s[0]]
    for i in range(1, len(s)):
        t = t + w[s[i]][-d[s[i-1]][s[i]]:]
    return t

def heuristic(x, mdj):
    # lower bound: every unused word costs at least its cheapest join
    return sum(mdj[i] for i in range(len(x)) if x[i] == 0)

def adjacent_nodes(x):
    ret = []
    for i in range(len(x)):
        if x[i] == 0:
            y = list(x)
            y[i] = 1
            ret.append((i, tuple(y)))
    return ret

class Solution(object):
    def shortestSuperstring(self, A):
        n = len(A)
        # special case
        if n == 1:
            return A[0]
        # assert n > 1
        # distance between words
        # dij := the cost in addition to add j after i
        dij = [[dist(A[i], A[j], i == j) for j in range(n)] for i in range(n)]
        # minimum cost to add j
        mdj = [min(dij[i][j] for i in range(n)) for j in range(n)]
        # A* search
        # init
        q = []  # priority queue with estimated cost
        for i in range(n):
            x = tuple(1 if j == i else 0 for j in range(n))
            g = len(A[i])  # actual cost from start
            h = heuristic(x, mdj)  # lower bound of cost till the goal
            heappush(q, (g + h, g, h, x, [i]))
        best_f = None
        best_p = None
        while len(q) > 0:
            # f, g, h, node, path
            f, g, h, x, p = heappop(q)
            if best_f is not None and f >= best_f:
                break
            for j, y in adjacent_nodes(x):
                gy = g + dij[p[-1]][j]
                py = p + [j]
                if sum(y) == n:  # is goal
                    if best_f is None or gy < best_f:
                        best_f = gy
                        best_p = py
                else:
                    hy = heuristic(y, mdj)
                    heappush(q, (gy + hy, gy, hy, y, py))
        return construct_seq(best_p, dij, A)
```

Solution 4 (method unclear yet, 40 ms). This code is collected from one of the fastest submissions; I am not sure who the original author is. I also do not fully understand the solution yet, but it is super fast and short. I will explain it later when I fully understand it.

```python
from typing import List

class Solution:
    def shortestSuperstring(self, words: List[str]) -> str:
        def concat(s, t, mink):
            # largest overlap (> mink) between s and t, in either order
            for k in range(min(len(s), len(t)) - 1, 0, -1):
                if k <= mink:
                    break
                if s[-k:] == t[:k]:
                    return k, s + t[k:]
                if t[-k:] == s[:k]:
                    return k, t + s[k:]
            return 0, s + t

        if not words:
            return ''
        while len(words) > 1:
            sharedsize = a = b = -1
            concatstr = ''
            for j in range(len(words)):
                for i in range(j):
                    k, s = concat(words[i], words[j], sharedsize)
                    if k > sharedsize:
                        sharedsize, concatstr = k, s
                        a, b = i, j
            if sharedsize > 0:
                words[b] = concatstr
                words[a] = words[-1]
            else:
                words[0] += words[-1]
            words.pop()
        return words[0]
```
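From reading it, this code appears to repeatedly merge the pair of words with the largest overlap, i.e. a greedy heuristic rather than an exact DP, which may explain the speed (greedy merging is not guaranteed optimal in general). A minimal standalone sketch of that greedy idea, under that reading:

```python
def overlap(s, t):
    # largest k such that the last k chars of s equal the first k chars of t
    for k in range(min(len(s), len(t)), 0, -1):
        if s.endswith(t[:k]):
            return k
    return 0

def greedy_superstring(words):
    words = list(words)
    while len(words) > 1:
        # find the ordered pair (i, j) with the largest overlap
        best_k, bi, bj = -1, 0, 1
        for i in range(len(words)):
            for j in range(len(words)):
                if i != j:
                    k = overlap(words[i], words[j])
                    if k > best_k:
                        best_k, bi, bj = k, i, j
        # merge j onto i, dropping the overlapping prefix of j
        merged = words[bi] + words[bj][best_k:]
        words = [w for idx, w in enumerate(words) if idx not in (bi, bj)]
        words.append(merged)
    return words[0]

print(greedy_superstring(["catg", "atgcatc"]))  # catgcatc
```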

# 847. Shortest Path Visiting All Nodes

This problem is similar to TSP; however, it is not a TSP, since we are allowed to visit a node multiple times. Even so, we can use the same method as in the TSP solutions to prune unneeded (visited set, current node) states.

Without pruning (TLE)

```python
from typing import List
from collections import deque

class Solution:
    def shortestPathLength(self, graph: List[List[int]]) -> int:
        if any(len(g) == 0 for g in graph):
            return 0
        N = len(graph)

        def bfs():
            Q = deque([(i, 0, {i}) for i in range(N)])
            while Q:
                i, d, seen = Q.popleft()
                if len(seen) == N:
                    return d
                for j in graph[i]:
                    this_seen = seen | {j}
                    Q.append((j, d + 1, this_seen))

        return bfs()
```

With pruning (AC 348 ms)

```python
import collections
from typing import List
from collections import deque

class Solution:
    def shortestPathLength(self, graph: List[List[int]]) -> int:
        if any(len(g) == 0 for g in graph):
            return 0
        N = len(graph)

        def bfs():
            Q = deque([(i, 0, {i}) for i in range(N)])
            # best distance found so far for each (visited set, node) state
            pruning = collections.defaultdict(lambda: float('inf'))
            while Q:
                i, d, seen = Q.popleft()
                if len(seen) == N:
                    return d
                for j in graph[i]:
                    this_seen = seen | {j}
                    this_key = tuple(sorted(this_seen)) + (j,)
                    if pruning[this_key] > d + 1:
                        pruning[this_key] = d + 1
                        Q.append((j, d + 1, this_seen))

        return bfs()
```
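The `tuple(sorted(seen)) + (j,)` key can be encoded more compactly as a bitmask of visited nodes, the usual trick in TSP-style DP. A sketch of the same BFS with bitmask states, written as a standalone function rather than a LeetCode class for illustration:

```python
from collections import deque

def shortest_path_length(graph):
    n = len(graph)
    full = (1 << n) - 1  # bitmask with every node visited
    # state: (current node, bitmask of visited nodes)
    q = deque((i, 1 << i, 0) for i in range(n))
    seen = {(i, 1 << i) for i in range(n)}
    while q:
        node, mask, d = q.popleft()
        if mask == full:
            return d
        for nxt in graph[node]:
            m2 = mask | (1 << nxt)
            if (nxt, m2) not in seen:  # prune revisited states
                seen.add((nxt, m2))
                q.append((nxt, m2, d + 1))
    return 0

print(shortest_path_length([[1, 2, 3], [0], [0], [0]]))  # 4
```

The number of states is at most n * 2^n, so the BFS terminates quickly even when nodes are revisited many times along a path.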