Pizza With 3N Slices

Deepak Gupta
8 min read · Sep 17, 2020


Let’s discuss an algorithm problem. You can find the problem here on leetcode: https://leetcode.com/problems/pizza-with-3n-slices/. There is no editorial, so I am writing one.

The Problem Statement

You are given a pizza consisting of 3*N slices, where N is a positive integer. Each slice of pizza has some reward points associated with it. Alice wants to eat exactly N slices; whenever she eats a slice, the neighbouring slices on both sides disappear (or get eaten by someone else). Please note that a neighbouring slice doesn't have to be adjacent in the original pizza: it is the closest slice still available in the clockwise and anticlockwise directions. Alice collects the reward points of every slice she eats. What is the maximum number of points that Alice can collect?

Let’s understand the problem with an example:

There are 6 slices, labelled A to F, so Alice needs to choose 2 of them. Say she chooses C (worth 100 points): slices B and D disappear and Alice bags 100 points. She then chooses E (also worth 100 points), bagging a total of 200 points.

Please try to solve the problem yourself before scrolling down to the solution. And remember, pen and paper always help!

The Greedy Approach

I always start by trying out the greedy approach. If it works, it leads to the easiest and quickest implementation. The only catch is that it takes some proving and reasoning skill to feel confident that your greedy approach will work for all cases. One approach here could be:

total_points = 0
for i in range(N):
    selected_slice = slice with max points from available slices
    total_points += selected_slice.points
return total_points

The question is, would this approach always help Alice fetch the maximum possible score? No, it won't. It is not hard to find an example where this approach misses the optimal result. The problem is that selecting a slice discards the two slices around it, and this approach discards those neighbours without ever considering their worth. It is possible that the combined worth of the two neighbours exceeds the worth of the selected slice, and that there is nothing in the rest of the pizza to compensate for the loss. Here is a concrete example:

Now you can clearly see that selecting the 100-point slice discards both 90-point slices, grabbing a total of only 110 points, whereas selecting the two 90-point slices would have fetched 180 points.
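To make this concrete, here is a quick sketch of the greedy strategy. The exact slice values below are my own assumed arrangement in the spirit of the picture: a 100-point slice flanked by two 90-point slices.

def greedy_points(slices):
    # Greedily eat the most valuable remaining slice; its two current
    # neighbours disappear each time. Returns the total points collected.
    n = len(slices)
    nxt = [(i + 1) % n for i in range(n)]   # clockwise neighbour
    prv = [(i - 1) % n for i in range(n)]   # anticlockwise neighbour
    alive = [True] * n
    total = 0
    for _ in range(n // 3):
        # the most valuable slice still on the pizza
        best = max((i for i in range(n) if alive[i]), key=lambda i: slices[i])
        total += slices[best]
        left, right = prv[prv[best]], nxt[nxt[best]]
        for j in (prv[best], best, nxt[best]):
            alive[j] = False
        nxt[left], prv[right] = right, left   # stitch the circle back together
    return total


print(greedy_points([90, 100, 90, 10, 5, 5]))   # prints 110

Greedy grabs the 100 and is then left with only 10 more, while taking the two 90-point slices first would have scored 180.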

The Brute Force Approach

This is the second approach you always want to try when greedy doesn't help. It entails enumerating all possibilities, calculating the reward points Alice would get for each of them, and comparing them to find the maximum.

How many possibilities: To start with, Alice has 3N slices to choose from. After she selects a slice, she has 3N - 3 slices to choose from. Extending the series, she has a total of 3N * (3N - 3) * (3N - 6) * … (N terms) possible orders of picking, which is upper-bounded by (3N)^N possibilities.

How much time to compute the points gathered for one of those possibilities: It takes O(N) time to go through the chosen slices and add up their points. You can keep track of the running maximum as you go.

Total running time complexity: O(N * (3N)^N)
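Just to make the blow-up concrete, here is one way that enumeration could be written (a toy sketch of my own, usable only for very small pizzas):

def brute_force(slices):
    # Try every possible sequence of picks and return the best total.
    n = len(slices)

    def best(alive, picks_left):
        if picks_left == 0:
            return 0
        result = 0
        ring = sorted(alive)
        for pos, i in enumerate(ring):
            # the current neighbours are the closest slices still alive
            left, right = ring[pos - 1], ring[(pos + 1) % len(ring)]
            result = max(result,
                         slices[i] + best(alive - {left, i, right}, picks_left - 1))
        return result

    return best(frozenset(range(n)), n // 3)


print(brute_force([1, 2, 3, 4, 5, 6]))   # prints 10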

Clearly, the brute force solution doesn't scale and stops working beyond very small values of N. Practically speaking, exponential algorithms are about as useful as the Swiss knife on a nail-cutter.

The Optimal Approach

When you're stuck trying to find an optimal solution, it always helps to start writing down observations and use them to cut down (a.k.a. prune) the space of possibilities we considered in the brute force solution.

One of the most obvious observations is that, given any state of the pizza, Alice can never eat two neighbouring slices. Given this observation, the best we can do is pick the subset of "non-neighbouring" slices whose total points are as large as possible. For the sake of simplicity (and for lack of creativity), let's call such a subset the candidate subset. There are a few important things to remember about the candidate subset: (a) it contains exactly N slices, (b) no two of its slices are adjacent in the original pizza, and (c) it has the maximum total points among all subsets satisfying (a) and (b). An important question immediately follows:

Is it always possible to actually pick up the candidate subset? It is certainly not obvious. Please note that the neighbouring slices keep changing as we pick slices, and an intermediate pick may end up making two slices of the candidate subset neighbours of each other. Example: say the pizza is (A, B, C, D, E, F, G, H, I) and our candidate subset is (C, A, E). It is a valid candidate subset because no two of these are adjacent, but if you pick C first, A and E become neighbours and you will never be able to pick both of them. For the same example, if we pick the slices in the order (A, C, E), we are able to pick all of them. So we need to prove that there always exists an order that never makes two selected slices neighbours of each other in any intermediate state.

The proof is fairly simple. Consider the slices which are not in the candidate subset. There are 2N of them, spread among our N selected slices. If you keep only the selected slices in place on the pizza box, there are N "gaps" between them that are filled by these 2N non-selected slices. By the pigeonhole principle, at least one gap contains ≥2 non-selected slices. Pick the selected slice at the end of such a gap: its two disappearing neighbours take one slice from each adjacent gap, and since one of those gaps had ≥2 slices, at least one non-selected slice is left between the surrounding selected slices, so no two selected slices become adjacent. After the pick we are left with N-1 selected slices and 2(N-1) non-selected ones, so the same pigeonhole argument applies again. Applying this reasoning inductively (induction is to the world of proofs what recursion is to programming), there always exists an order in which the whole candidate subset can be picked up.
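If the pigeonhole argument feels abstract, here is a small illustrative sketch (names and structure are my own) that follows exactly this rule: always eat a selected slice that has at least two unselected slices on one of its sides.

def eating_order(n, chosen):
    # n: number of slices on the pizza; chosen: pairwise non-adjacent indices.
    # Returns an order in which all chosen slices can actually be eaten.
    alive = set(range(n))
    remaining = set(chosen)
    order = []

    def neighbours(ring, i):
        pos = ring.index(i)
        return ring[pos - 1], ring[(pos + 1) % len(ring)]

    def has_big_gap(ring, i):
        # is there a gap of >= 2 unselected slices on some side of i?
        pos = ring.index(i)
        left, left2 = ring[pos - 1], ring[pos - 2]
        right, right2 = ring[(pos + 1) % len(ring)], ring[(pos + 2) % len(ring)]
        return (left not in remaining and left2 not in remaining) or \
               (right not in remaining and right2 not in remaining)

    while remaining:
        ring = sorted(alive)
        # by the pigeonhole argument, such a slice always exists
        i = next(i for i in remaining if has_big_gap(ring, i))
        left, right = neighbours(ring, i)
        order.append(i)
        alive -= {left, i, right}
        remaining.remove(i)
    return order


print(eating_order(9, {0, 2, 4}))   # a valid order, e.g. [0, 2, 4]

For the (A, B, C, …, I) pizza above with candidate subset (A, C, E), i.e. indices 0, 2 and 4, this yields an order like A, C, E, never making two selected slices neighbours along the way.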

Great. Now all we need is a way to find the candidate subset. Let's move on…

The problem we now have at hand is: from a circular array of 3N integers, find a subset of exactly N integers, no two of which are adjacent, with the maximum possible sum.

It is a standard DP problem that you might have already solved. But have you? In our case, the problem has to be solved on a circular array, which makes it a little different from solving it on a linear array. The difference is that the first and the last elements can't be chosen together, whereas in a linear array there is nothing extra we need to do to prevent that. BTW, if you haven't solved the linear version, please go through this: https://medium.com/@arunistime/maximum-sum-of-non-adjacent-numbers-algorithm-explained-159f08b5790a
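If you'd like the gist without leaving the page, here is a compact memoized sketch of the linear version, i.e. picking at most k pairwise non-adjacent elements of a plain list (the function name is my own):

from functools import lru_cache


def max_linear(arr, k):
    # maximum sum of at most k pairwise non-adjacent elements of arr
    @lru_cache(maxsize=None)
    def go(i, left):
        if i >= len(arr) or left == 0:
            return 0
        # either take arr[i] and skip its neighbour, or skip arr[i]
        return max(arr[i] + go(i + 2, left - 1), go(i + 1, left))
    return go(0, k)


print(max_linear([3, 10, 2, 8, 5], 2))   # prints 18 (take 10 and 8)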

How do we solve the same problem for a circular array? A cheap solution is to consider two cases: (a) the first element is not selected, so solve the linear problem on arr[1:N-1], and (b) the last element is not selected, so solve the linear problem on arr[0:N-2]. Take the maximum of the two cases and you are done. But wait, let's not settle for the cheap solution. There is a more interesting way to solve this.
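For the record, that two-case trick is only a couple of lines if we reuse the max_linear sketch from above (again, illustrative code of my own):

def max_circular_two_pass(slices):
    # case a: the first slice is never taken; case b: the last slice isn't
    k = len(slices) // 3
    return max(max_linear(slices[1:], k), max_linear(slices[:-1], k))


print(max_circular_two_pass([1, 2, 3, 4, 5, 6]))   # prints 10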

I am going to make a statement: the minimum element of the array never needs to be in the candidate subset (and if there are multiple minimum elements, at least one of them never needs to be). If this is true, the solution is simple: open up the circular array so that the minimum element becomes the last element. Since we never need to select the minimum element, we never need to select the last element, and we can just solve the linear problem on arr[0:N-2]. Voila!

But how on earth do we prove that the minimum element is never needed in the candidate subset? You can prove it by contradiction. Assume that the minimum element IS part of the candidate subset. Starting from that assumption, we will make logical arguments and reach a contradiction. Reaching a contradiction means our assumption was wrong, proving that the minimum element doesn't need to be part of the candidate subset. The proof is too interesting for me to just write down here for you guys to read; please try it yourself and share it in the comments. I will edit the post and add the proof here next week!

Have a great day!

Edit 1: Putting up the remaining proof. As we said, it can be proved by contradiction. Suppose the minimum element is part of the candidate subset. There are two "gaps" (runs of non-selected elements) around this minimum element. Two cases are possible from here:

a) At least one of the two gaps contains more than one non-selected element. For simplicity, picture this locally as A B X C, where X is the selected minimum element and A, B sit on the side whose gap contains more than one element. What if you had chosen B instead of X? The solution can't get worse, because X is the minimum element, and the rest of the candidate subset remains unchanged (B's neighbours are A and X, neither of which is selected any more).

b) Both gaps contain exactly one element each. The funny thing is that, in this case, the minimum element must be the last one to be picked according to the picking strategy we came up with (pick a slice next to a gap with more than one element first). At that last step we should pick max(A, B, X), and given that X is the minimum, it will never be chosen.

So we have proved that selecting the minimum element is never required, and we can just open up the array around the minimum element and solve the linear version of the problem to get the correct answer.

Here is an implementation in Python:

from typing import List


class Solution:

    def __init__(self):
        # memoization table: (index, slices left to choose) -> best score
        self.dp = {}

    def solve_linear(self, slices, ind, to_choose):
        # maximum sum of at most `to_choose` pairwise non-adjacent elements
        # of the linear array `slices`, considering only indices >= ind
        if (ind, to_choose) not in self.dp:
            N = len(slices)
            if N <= ind or to_choose == 0:
                return 0
            # either take slices[ind] and skip its neighbour, or skip it
            result_choose = self.solve_linear(slices, ind + 2, to_choose - 1)
            result_not_choose = self.solve_linear(slices, ind + 1, to_choose)
            self.dp[(ind, to_choose)] = max(result_choose + slices[ind],
                                            result_not_choose)
        return self.dp[(ind, to_choose)]

    def maxSizeSlices(self, slices: List[int]) -> int:
        # find (one of) the minimum slice(s)
        min_elem = min(slices)
        min_ind = 0
        while slices[min_ind] != min_elem:
            min_ind += 1
        N = len(slices)
        # rotate the circular array so that the minimum slice becomes the
        # last element; as proved above, it never needs to be selected
        min_ind += 1
        arr = [slices[(min_ind + i) % N] for i in range(N)]
        return self.solve_linear(arr, 0, N // 3)
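As a quick sanity check, Solution().maxSizeSlices([1, 2, 3, 4, 5, 6]) returns 10, which matches the two-case version sketched earlier (pick the slices worth 4 and 6).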
