Where should I locate my store?
The Maximal Covering Location Problem with Quantum Computing
by: Daniel Sierra-Sosa, Alejandro Giraldo Quintero, Juan Guillermo Lalinde, Emilio Pelaez and Ved Dharkar
based on: Using quantum computing to solve the maximal covering location problem
Nowadays, we have grown accustomed to having a variety of essential services located close to our homes, especially in urban areas. Grocery stores, markets, healthcare facilities, and more are easily accessible to us. However, the placement of these facilities is rarely random: specific considerations often go into locating them as close as possible to the greatest number of customers.
When we aim to find the “greatest amount” or the “closest service,” we are stepping into the realm of what mathematicians call an optimization problem. Optimization problems are ubiquitous in our daily lives, even if we are not fully aware of them. For example, when we navigate our way to a friend’s house, we subconsciously evaluate various routes in our minds to determine the best path to take.
When a problem is small, it’s easy for us to figure out which option is the best. But what happens when the problem gets bigger or more complicated? We’re researching whether quantum computers will be able to step in and help solve complex optimization problems that are too hard for classical computers. In this blog we’re going to explore the Maximal Covering Location Problem (MCLP) using Qiskit and Qiskit Optimization.
Problem Overview
Let’s go back to our earlier example of traveling to our friend’s house. Suppose we also want to pick up some donuts, make a payment on a bill, and avoid traffic. Now the problem has become more complicated. In the language of mathematicians, the new elements we just added are called “constraints” (or restrictions), meaning there are limits on what we can do. And our main goal, getting to our friend’s house, is captured by what is called the “objective function”.
Mathematicians use constraints and objective functions to solve problems, even complicated ones. They create models of the problem and use mathematical tools to find the best solution.
To solve this problem we need to consider the Maximal Covering Location Problem (MCLP). This optimization problem is one of the best-known NP-hard problems in the field of regional science, and it was introduced by Church and ReVelle [1]. It starts by trying to answer the question: Where should I locate my store? The natural answer to this question is where the greatest number of people can access it!
Locations with a significant concentration of customers are known as demand nodes. People in a node are satisfied if the closest store is within a predefined distance. In more rigorous terms (later on we will define the exact variables and constraints that make up the problem), the MCLP takes a population located in a set of demand nodes I, a fixed number of facilities p to be placed among a set of potential sites J, and a value S that determines the distance beyond which a demand node is not covered, and tries to find the combination of facilities for which some measure is maximized or minimized. In this problem, the measure we seek to maximize is the following:
- Population covered within a distance S of the placed facilities.
An extra constraint can be added to the problem:
- All population nodes are within a distance T of some facility, taking into account that T > S.
This constraint tries to preserve some degree of equity for those people not covered within a distance S (remember that this problem originated in public facility location analysis), and it converts the problem into what we call the maximal covering location problem with mandatory closeness constraints. Here, we will not solve the MCLP with this additional constraint, but you can read more about this variation in [1].
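In symbols (using the notation we introduce in the next section, and writing Mᵢ = {j ∈ J | dᵢⱼ ≤ T} for the sites within the larger distance T of node i), this extra requirement would simply read Σ_{j ∈ Mᵢ} xⱼ ≥ 1 for every demand node i.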
To solve the MCLP with quantum computing, first we are going to define a linear optimization problem, which is known as the “minimization version” of the MCLP. Once we have this problem defined, we can convert it into a quadratic unconstrained binary optimization (QUBO) problem, which is again an expression we want to optimize, but this time with quadratic terms, only binary variables, and no constraints. Once we have a QUBO problem, we can map it into an annealing process or solve it with the quantum approximate optimization algorithm (QAOA) [2]. We will opt for the latter. After running the algorithm, we will just have to convert the solution vector into a solution for the MCLP.
Minimization version
To get to the minimization version of the MCLP, we need to start off with the more intuitive maximization version. First, let’s formally define the variables that we are going to use:
- xⱼ = 1 if a facility is placed at potential site j ∈ J, and 0 otherwise;
- yᵢ = 1 if demand node i ∈ I is covered by at least one facility, and 0 otherwise;
- aᵢ = the population at demand node i;
- Nᵢ = {j ∈ J | dᵢⱼ ≤ S}, the set of potential sites within the service distance S of node i, where dᵢⱼ is the distance between node i and site j.
With this, the MCLP can be formulated as maximizing the objective function defined by the equation
maximize Σ_{i ∈ I} aᵢ yᵢ
such that
Σ_{j ∈ Nᵢ} xⱼ ≥ yᵢ for every i ∈ I,
Σ_{j ∈ J} xⱼ = p,
xⱼ ∈ {0, 1} for every j ∈ J, and yᵢ ∈ {0, 1} for every i ∈ I,
where we have introduced a new variable yᵢ, which, given the first constraint, will be equal to 1 only when there exists at least one xⱼ with j ∈ Nᵢ set to 1. In other words, yᵢ equals 1 when there is a facility covering the demand node i, and is 0 otherwise. To get the minimization version of the MCLP, we will define a new variable:
ȳᵢ = 1 − yᵢ.
This means that our new variable will equal 1 whenever node i is not covered and 0 when it is covered. Substituting this variable into the original objective function, we get the equation
Σ_{i ∈ I} aᵢ yᵢ = Σ_{i ∈ I} aᵢ (1 − ȳᵢ) = Σ_{i ∈ I} aᵢ − Σ_{i ∈ I} aᵢ ȳᵢ.
The first term in the equation above is simply the total population, which is a constant we know beforehand. Therefore, the problem can be interpreted as minimizing the second term of the equation (since it is subtracted from the first, constant term). Thus, also substituting ȳᵢ into the constraints of the original formulation, we arrive at the minimization version of the MCLP, which is defined as follows:
minimize Σ_{i ∈ I} aᵢ ȳᵢ
such that
Σ_{j ∈ Nᵢ} xⱼ + ȳᵢ ≥ 1 for every i ∈ I,
Σ_{j ∈ J} xⱼ = p,
xⱼ ∈ {0, 1} for every j ∈ J, and ȳᵢ ∈ {0, 1} for every i ∈ I.
Putting these equations in words, we are attempting to minimize the population not covered by any facility.
Qiskit Implementation of the MCLP Problem
The first step is to import an MCLP problem. For simplicity, in this tutorial we will store a single MCLP problem in an Excel file, whose structure is described below.
The name column is not used when reading the problem, but it may be helpful for organizing. The next two columns, s and p, correspond to the variables S and p, respectively. Then, the row and col columns specify how many rows and columns there are in the map corresponding to our problem. The maps we are going to deal with here will be square, but you could leave some nodes with no population to create non-square shapes. Then, we will have row x col columns with indices 0, 1, 2, …, (row x col − 1). These columns correspond to the nodes that make up the map when read from left to right and top to bottom. In the first row, these columns indicate whether or not the node is a potential facility site. Anything can be written in the corresponding cell; here we chose to write xi, where i is the column index. The second row indicates the population at each node, so the value in these cells must be an integer.
As you can see, there is redundancy in the first four columns, as these will always have the same information in the first and second row. Therefore, you can leave these columns empty in the second row, as they will not be accessed by the reader.
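If you prefer to build the spreadsheet programmatically rather than by hand, here is a minimal sketch that generates an equivalent file with pandas. The node values match the output printed below; the “example” name and the repetition of the metadata values in the second row are our own choices, not taken from the original file.
import pandas as pd
# Columns: problem metadata first, then one column per node index 0..8
columns = ["name", "s", "p", "row", "col"] + list(range(9))
sites = {0: "x0", 2: "x2", 6: "x6"}   # potential facility sites (the marker text is arbitrary)
pop = [4, 5, 2, 2, 1, 4, 2, 2, 1]     # population at each node
meta = ["example", 1, 2, 3, 3]        # name, S, p, rows, cols
first_row = meta + [sites.get(j) for j in range(9)]   # facility-site markers (blank = not a site)
second_row = meta + pop                               # populations
pd.DataFrame([first_row, second_row], columns=columns).to_excel("example.xlsx", index=False)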
We’ll use the following code to import all the information we need to define the problem programmatically:
import pandas as pd
import numpy as np
import qiskit

df = pd.read_excel("example.xlsx")
rows = df.iloc[0]['row']
cols = df.iloc[0]['col']
p = df.iloc[0]['p'] # Number of facilities
S = df.iloc[0]['s'] # Service distance
I = range(rows * cols) # Set of demand nodes
J = [1 if str(df.iloc[0][j]) != 'nan' else 0 for j in I] # Potential facility sites map
pop = [int(df.iloc[1][j]) for j in I] # Population in each node

print(f"{p} facilities to place. Service distance is {S}.")
print(f"Map has {rows} rows and {cols} columns, for a total {len(I)} nodes.")
print("Potential facility map: ", J)
print("Population at each node: ", pop)
output:
2 facilities to place. Service distance is 1.
Map has 3 rows and 3 columns, for a total 9 nodes.
Potential facility map: [1, 0, 1, 0, 0, 0, 1, 0, 0]
Population at each node: [4, 5, 2, 2, 1, 4, 2, 2, 1]
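If you want to see what this toy map looks like, here is a small optional sketch (our own addition, not part of the original workflow) that draws the grid with matplotlib, reusing the I, J, pop and cols values loaded above:
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(4, 4))
for idx in I:
    r, c = divmod(idx, cols)  # nodes are read left to right, top to bottom
    is_site = J[idx] == 1
    ax.scatter(c, -r, s=600, color="orange" if is_site else "lightblue", edgecolors="black")
    ax.text(c, -r - 0.35, f"node {idx}\npop {pop[idx]}", ha="center", va="top", fontsize=8)
ax.set_title("Orange nodes are potential facility sites")
ax.set_axis_off()
plt.show()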
Next, before starting to define the minimization model, we will create a helper function that lets us easily access the set Nᵢ (the potential facility sites within the service distance S of node i) for each node later on.
def get_neighbors(rows, cols, J, S):
    # Map each node index to its (row, column) coordinates on the grid
    sites = {}
    counter = 0
    for x in range(rows):
        for y in range(cols):
            sites[counter] = (x, y)
            counter += 1
    # For each demand node i, collect the potential sites within distance S of it
    N = {}
    for i in range(rows * cols):
        for j in [i for i, e in enumerate(J) if e == 1]:
            distance = np.sqrt((sites[i][0] - sites[j][0]) ** 2 + (sites[i][1] - sites[j][1]) ** 2)
            if distance <= S:
                try:
                    N[i].append(j)
                except KeyError:
                    N[i] = [j]
    return N

get_neighbors(rows, cols, J, S)
output:
{0: [0], 1: [0, 2], 2: [2], 3: [0, 6], 5: [2], 6: [6], 7: [6]}
Note that nodes 4 and 8 do not appear in the dictionary: with S = 1, no potential facility site is within the service distance of either of them.
With this in place, we can go ahead and use Gurobi to create the minimization problem. The following piece of code creates a Gurobi Model, adds the binary variables ȳᵢ (named y_i in the code) and xⱼ, and specifies the objective function and constraints as we described them earlier.
import gurobipy as gp
from gurobipy import GRB

m = gp.Model("MCLPmin")

## Add variables
# Add y_i
y = {}
for i in I:
    y[i] = m.addVar(vtype = GRB.BINARY, name="y_" + str(i))
# Add x_j
x = {}
# go through the list to get indices of potential facility sites
for j in [i for i, e in enumerate(J) if e == 1]:
    x[j] = m.addVar(vtype = GRB.BINARY, name="x_" + str(j))
m.update()

## Add equations
# Objective function
m.setObjective(np.sum([pop[i] * y[i] for i in I]), GRB.MINIMIZE)
# Constraints 1
N = get_neighbors(rows, cols, J, S)
for i in I:
    temp = []
    if i in N.keys():
        for j in N[i]:
            temp.append(x[j])
    if len(temp) >= 1:
        m.addConstr(np.sum(temp) + y[i] >= 1, "c1a_" + str(i))
    else:
        m.addConstr(y[i] >= 1, "c1b_" + str(i))
# Constraints 2
temp = []
for j in [i for i, e in enumerate(J) if e == 1]:
    temp.append(x[j])
m.addConstr(np.sum(temp) == p, "c2_" + str(i))
m.update()

## Save .lp file with model
m.write("example_min_mclp.lp")

with open("example_min_mclp.lp", "r") as f:
    print(f.read())
output:
\ Model MCLPmin
\ LP format - for model browsing. Use MPS format to capture full model detail.
Minimize
4 y_0 + 5 y_1 + 2 y_2 + 2 y_3 + y_4 + 4 y_5 + 2 y_6 + 2 y_7 + y_8
Subject To
c1a_0: y_0 + x_0 >= 1
c1a_1: y_1 + x_0 + x_2 >= 1
c1a_2: y_2 + x_2 >= 1
c1a_3: y_3 + x_0 + x_6 >= 1
c1b_4: y_4 >= 1
c1a_5: y_5 + x_2 >= 1
c1a_6: y_6 + x_6 >= 1
c1a_7: y_7 + x_6 >= 1
c1b_8: y_8 >= 1
c2_8: x_0 + x_2 + x_6 = 2
Bounds
Binaries
y_0 y_1 y_2 y_3 y_4 y_5 y_6 y_7 y_8 x_0 x_2 x_6
End
The text output above represents the optimization model we have created. If you go through the objective function and the constraints carefully, you will notice that these indeed correspond to how we defined the MCLP minimization problem earlier. With the model in place, we can go ahead and run it to find the optimal solution. First, let’s see how to solve this problem classically, using Gurobi’s classical method to get a baseline solution.
m.setParam('IntFeasTol', 1e-5) # tolerance to non-integer solutions
m.params.timeLimit = 1800
m.optimize()
output:
Set parameter TimeLimit to value 1800
Gurobi Optimizer version 10.0.1 build v10.0.1rc0 (win64)
CPU model: 12th Gen Intel(R) Core(TM) i9-12900K, instruction set [SSE2|AVX|AVX2]
Thread count: 16 physical cores, 24 logical processors, using up to 24 threads
Optimize a model with 10 rows, 12 columns and 21 nonzeros
Model fingerprint: 0x76ea8c0d
Variable types: 0 continuous, 12 integer (12 binary)
Coefficient statistics:
Matrix range [1e+00, 1e+00]
Objective range [1e+00, 5e+00]
Bounds range [1e+00, 1e+00]
RHS range [1e+00, 2e+00]
Found heuristic solution: objective 6.0000000
Presolve removed 10 rows and 12 columns
Presolve time: 0.00s
Presolve: All rows and columns removed
Explored 0 nodes (0 simplex iterations) in 0.00 seconds (0.00 work units)
Thread count was 1 (of 24 available processors)
Solution count 1: 6
Optimal solution found (tolerance 1.00e-04)
Best objective 6.000000000000e+00, best bound 6.000000000000e+00, gap 0.0000%
The best value for the objective function that the model found is 6. This means that only a population of 6 is not covered by some facility. To see what facility assignments achieve this, we can print the variables xⱼ (remember that xⱼ=1 means there is a facility at node j).
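In a notebook, evaluating the dictionary x that holds these Gurobi variables displays each one together with its optimized value:
x
output: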
{0: <gurobi.Var x_0 (value 0.0)>,
2: <gurobi.Var x_2 (value 1.0)>,
6: <gurobi.Var x_6 (value 1.0)>}
Thus, the optimal solution is to assign facilities to nodes 2 and 6!
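Since this instance is tiny, we can also double-check the result by brute force. The following sketch (our own addition, not part of the original workflow) enumerates every possible pair of candidate sites and computes the uncovered population directly, reusing N, pop, I, J and p from above:
from itertools import combinations

candidate_sites = [j for j, e in enumerate(J) if e == 1]
for placement in combinations(candidate_sites, p):
    # a node counts as covered if any of its nearby sites N[i] received a facility
    uncovered = sum(pop[i] for i in I if not any(j in placement for j in N.get(i, [])))
    print(placement, uncovered)
The minimum uncovered population is indeed 6. Note that the pair (0, 2) also leaves a population of 6 uncovered, so the optimum is not unique; Gurobi simply reports one of the optimal placements.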
Converting to QUBO
QAOA is a popular quantum algorithm for optimization problems, but it requires the problem to be in QUBO form. As the name suggests, a QUBO program has no constraints, only a quadratic objective function over binary variables. However, our minimization problem uses several constraints. Thus, we need to convert it into QUBO format in order to solve it with quantum computing methods. Fortunately, the qiskit_optimization module has functions that help us achieve this. The process is as follows: first we convert all inequality constraints in our model into equality constraints by introducing new integer slack variables, then these equality constraints are converted into penalty terms in the objective function, and finally we convert the integer variables that were introduced into binary variables.
from qiskit_optimization.converters import InequalityToEquality
from qiskit_optimization.converters import IntegerToBinary
from qiskit_optimization.converters import LinearEqualityToPenalty
from qiskit_optimization.problems import QuadraticProgram

qp = QuadraticProgram()
iq_to_eq = InequalityToEquality()
lin_eq_to_pen = LinearEqualityToPenalty()
int_to_bin = IntegerToBinary()

# load problem with constraints
qp.read_from_lp_file("example_min_mclp.lp")
# convert inequalities into equalities
qp = iq_to_eq.convert(qp)
# remove equality constraints
qp = lin_eq_to_pen.convert(qp)
# convert introduced variables into binary
qp = int_to_bin.convert(qp)
qp.name = "example_min_mclp_qubo"

# Replace "@" in the generated variable names with "_" and save to a new file
txt = qp.export_as_lp_string()
txt = txt.replace("@", "_")
with open("example_min_mclp_qubo.lp", "w") as f:
    f.write(txt)

# Create new variable with the QUBO problem
qubo = QuadraticProgram()
qubo.read_from_lp_file("example_min_mclp_qubo.lp")

print(qubo.export_as_lp_string())
output:
\ This file has been generated by DOcplex
\ ENCODING=ISO-8859-1
\Problem name: example_min_mclp_qubo
Minimize
obj: - 44 y_0 - 43 y_1 - 46 y_2 - 46 y_3 - 47 y_4 - 44 y_5 - 46 y_6 - 46 y_7
- 47 y_8 - 240 x_0 - 240 x_2 - 240 x_6 + 48 c1a_0_int_slack_0
+ 48 c1a_1_int_slack_0 + 48 c1a_1_int_slack_1 + 48 c1a_2_int_slack_0
+ 48 c1a_3_int_slack_0 + 48 c1a_3_int_slack_1 + 48 c1a_5_int_slack_0
+ 48 c1a_6_int_slack_0 + 48 c1a_7_int_slack_0 + [ 48 y_0^2 + 96 y_0*x_0
- 96 y_0*c1a_0_int_slack_0 + 48 y_1^2 + 96 y_1*x_0 + 96 y_1*x_2
- 96 y_1*c1a_1_int_slack_0 - 96 y_1*c1a_1_int_slack_1 + 48 y_2^2
+ 96 y_2*x_2 - 96 y_2*c1a_2_int_slack_0 + 48 y_3^2 + 96 y_3*x_0
+ 96 y_3*x_6 - 96 y_3*c1a_3_int_slack_0 - 96 y_3*c1a_3_int_slack_1
+ 48 y_4^2 + 48 y_5^2 + 96 y_5*x_2 - 96 y_5*c1a_5_int_slack_0 + 48 y_6^2
+ 96 y_6*x_6 - 96 y_6*c1a_6_int_slack_0 + 48 y_7^2 + 96 y_7*x_6
- 96 y_7*c1a_7_int_slack_0 + 48 y_8^2 + 192 x_0^2 + 192 x_0*x_2
+ 192 x_0*x_6 - 96 x_0*c1a_0_int_slack_0 - 96 x_0*c1a_1_int_slack_0
- 96 x_0*c1a_1_int_slack_1 - 96 x_0*c1a_3_int_slack_0
- 96 x_0*c1a_3_int_slack_1 + 192 x_2^2 + 96 x_2*x_6
- 96 x_2*c1a_1_int_slack_0 - 96 x_2*c1a_1_int_slack_1
- 96 x_2*c1a_2_int_slack_0 - 96 x_2*c1a_5_int_slack_0 + 192 x_6^2
- 96 x_6*c1a_3_int_slack_0 - 96 x_6*c1a_3_int_slack_1
- 96 x_6*c1a_6_int_slack_0 - 96 x_6*c1a_7_int_slack_0
+ 48 c1a_0_int_slack_0^2 + 48 c1a_1_int_slack_0^2
+ 96 c1a_1_int_slack_0*c1a_1_int_slack_1 + 48 c1a_1_int_slack_1^2
+ 48 c1a_2_int_slack_0^2 + 48 c1a_3_int_slack_0^2
+ 96 c1a_3_int_slack_0*c1a_3_int_slack_1 + 48 c1a_3_int_slack_1^2
+ 48 c1a_5_int_slack_0^2 + 48 c1a_6_int_slack_0^2 + 48 c1a_7_int_slack_0^2
]/2 + 312
Subject To
Bounds
0 <= y_0 <= 1
0 <= y_1 <= 1
0 <= y_2 <= 1
0 <= y_3 <= 1
0 <= y_4 <= 1
0 <= y_5 <= 1
0 <= y_6 <= 1
0 <= y_7 <= 1
0 <= y_8 <= 1
0 <= x_0 <= 1
0 <= x_2 <= 1
0 <= x_6 <= 1
0 <= c1a_0_int_slack_0 <= 1
0 <= c1a_1_int_slack_0 <= 1
0 <= c1a_1_int_slack_1 <= 1
0 <= c1a_2_int_slack_0 <= 1
0 <= c1a_3_int_slack_0 <= 1
0 <= c1a_3_int_slack_1 <= 1
0 <= c1a_5_int_slack_0 <= 1
0 <= c1a_6_int_slack_0 <= 1
0 <= c1a_7_int_slack_0 <= 1
Binaries
y_0 y_1 y_2 y_3 y_4 y_5 y_6 y_7 y_8 x_0 x_2 x_6 c1a_0_int_slack_0
c1a_1_int_slack_0 c1a_1_int_slack_1 c1a_2_int_slack_0 c1a_3_int_slack_0
c1a_3_int_slack_1 c1a_5_int_slack_0 c1a_6_int_slack_0 c1a_7_int_slack_0
End
As you can see, the newly defined optimization problem does follow the QUBO format: the objective function has many more terms now, but all the constraints have been eliminated and all variables are binary.
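A quick optional check (a sketch using standard QuadraticProgram methods from qiskit_optimization) confirms this programmatically:
# For this example we expect 21 binary variables and no remaining constraints
print("variables:", qubo.get_num_vars())
print("binary variables:", qubo.get_num_binary_vars())
print("linear constraints:", qubo.get_num_linear_constraints())
print("quadratic constraints:", qubo.get_num_quadratic_constraints())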
Running QAOA
With our problem re-defined as a QUBO, we can turn our attention to the quantum algorithm that will help us solve it. The quantum approximate optimization algorithm (QAOA) is a variational algorithm that uses a parametrized unitary 𝑈(𝛽, 𝛾) to prepare and optimize the state
|ψ(𝛽, 𝛾)⟩ = 𝑈(𝛽, 𝛾) |ψ₀⟩
until the optimal parameters 𝛽ₒₚₜ and 𝛾ₒₚₜ are found. Once these parameters are found, the state
|ψ(𝛽ₒₚₜ, 𝛾ₒₚₜ)⟩ = 𝑈(𝛽ₒₚₜ, 𝛾ₒₚₜ) |ψ₀⟩
encodes the solution vector to the problem.
The unitary 𝑈(𝛽, 𝛾) is in turn composed of two other unitaries,
𝑈(𝛽, 𝛾) = e^(−i𝛽Hₘ) e^(−i𝛾Hₚ),
where Hₘ is the mixing Hamiltonian and Hₚ the problem Hamiltonian. As the name indicates, Hₚ encodes the structure of the problem we want to solve into our quantum circuit. Given this, the quantum state (for a single QAOA layer, which is what we use below with reps = 1) is defined as
|ψ(𝛽, 𝛾)⟩ = e^(−i𝛽Hₘ) e^(−i𝛾Hₚ) |ψ₀⟩,
where
|ψ₀⟩ = |+⟩ ⊗ … ⊗ |+⟩
is the uniform superposition over all qubits (one qubit per binary variable). With this in place, we will define the optimal parameters as those such that the expectation value
⟨ψ(𝛽, 𝛾)| Hₚ |ψ(𝛽, 𝛾)⟩
is minimized.
For a deeper review of how to define the mixing Hamiltonian, check out [3]; for the problem Hamiltonian of QUBO problems, look at [4]. Implementing QAOA from scratch is complicated, but fortunately Qiskit already has many tools in place that allow us to take our QUBO problem and solve it with QAOA.
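As a quick illustration of what happens under the hood, the QUBO can be translated into an Ising Hamiltonian (the problem Hamiltonian Hₚ) plus a constant offset. Here is a short sketch using qiskit_optimization’s translator; it is our own addition and is not required for the workflow below:
from qiskit_optimization.translators import to_ising

# H_P is a weighted sum of Pauli Z / ZZ terms, offset is a constant energy shift
H_P, offset = to_ising(qubo)
print(H_P)
print(offset)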
from qiskit import Aer
from qiskit.providers.aer import QasmSimulator
from qiskit.utils import QuantumInstance
from qiskit.algorithms import QAOA, NumPyMinimumEigensolver
from qiskit.utils.algorithm_globals import algorithm_globals
from qiskit_optimization.algorithms import MinimumEigenOptimizer

algorithm_globals.random_seed = 12345
simulator = QasmSimulator(method = "matrix_product_state")
quantum_instance = QuantumInstance(
    simulator,
    seed_simulator = algorithm_globals.random_seed,
    seed_transpiler = algorithm_globals.random_seed,
)
qaoa_mes = QAOA(quantum_instance = quantum_instance, initial_point = [np.pi / 2, np.pi / 3], reps = 1)
qaoa = MinimumEigenOptimizer(qaoa_mes)

qaoa_result = qaoa.solve(qubo)

for i, var in enumerate(qaoa_result.variables):
    print(f"{var.name}: {qaoa_result[i]}")
output:
y_0: 1.0
y_1: 0.0
y_2: 1.0
y_3: 1.0
y_4: 1.0
y_5: 1.0
y_6: 0.0
y_7: 0.0
y_8: 1.0
x_0: 0.0
x_2: 1.0
x_6: 1.0
c1a_0_int_slack_0: 0.0
c1a_1_int_slack_0: 0.0
c1a_1_int_slack_1: 1.0
c1a_2_int_slack_0: 1.0
c1a_3_int_slack_0: 0.0
c1a_3_int_slack_1: 1.0
c1a_5_int_slack_0: 1.0
c1a_6_int_slack_0: 0.0
c1a_7_int_slack_0: 0.0
Qiskit’s QAOA gives us a solution vector indicating the value of each variable in the QUBO problem. To get our solution, we need to look at the xⱼ variables, which indicate which facility locations will be occupied (if x_j: 1.0, then a facility should be placed at node j). As you can see, just as in the classical solution, we find that the optimal configuration is to put facilities at nodes 2 and 6.
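If you want the chosen nodes programmatically rather than by reading the printout, a small convenience sketch (our own addition, not from the original post) filters the result for the xⱼ variables:
# Keep only the x_j variables whose value rounds to 1; their indices are the chosen nodes
chosen = [int(var.name.split("_")[1])
          for var, val in zip(qaoa_result.variables, qaoa_result.x)
          if var.name.startswith("x_") and round(val) == 1]
print("Place facilities at nodes:", chosen)   # [2, 6] for the run above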
References
[1]: Church, R., ReVelle, C. The maximal covering location problem. Papers of the Regional Science Association 32, 101–118 (1974). https://doi.org/10.1007/BF01942293
[2]: Farhi, E., Goldstone, J., & Gutmann, S. (2014). A Quantum Approximate Optimization Algorithm. arXiv: Quantum Physics.
[3]: Solving combinatorial optimization problems using QAOA. Learn Quantum Computation using Qiskit. https://qiskit.org/textbook/ch-applications/qaoa.html#QAOA.
[4]: Solving a simple QUBO with QAOA. Entropica Labs. https://docs.entropicalabs.io/qaoa/notebooks/6_solvingqubowithqaoa.