Randomized algorithms

Pritesh Gaikwad
5 min read · Jan 11, 2022


An algorithm that uses random numbers to decide what to do next at any point in its logic is called a randomized algorithm. It takes uniform random bits, in practice pseudorandom numbers, as an auxiliary input to guide its behaviour. One example is randomized quicksort, which uses a random number to pick the next pivot element. Randomization is typically used to reduce the expected time or space complexity, or to make worst-case inputs unlikely.

Classification

Randomized algorithms are classified into two categories.

Las Vegas: These algorithms always produce a correct or optimal result. Their running time depends on the random choices made, so it is analyzed as an expected value. For example, randomized quicksort always sorts its input array, and its expected worst-case time complexity is O(n log n).
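As a minimal illustration of the Las Vegas pattern (a toy routine of my own, not from the article), the answer is always correct, but the running time is a random variable:

import random

def random_index_of(A, target):
    # Las Vegas: keeps sampling until it hits a correct answer.
    # The result is always correct (assuming target occurs in A),
    # but the number of iterations taken is random; if target occurs
    # k times among n elements, the expected iteration count is n/k.
    while True:
        i = random.randrange(len(A))
        if A[i] == target:
            return i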

Monte Carlo: These algorithms produce a correct or optimal result only with some probability. They have a deterministic running time, and it is generally easier to work out their worst-case time complexity. For example, a single run of Karger’s algorithm produces the minimum cut with probability greater than or equal to 1/n² (where n is the number of vertices) and has worst-case time complexity O(E). Another example is the Fermat method for primality testing.
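Here is a minimal Monte Carlo sketch in Python of the classic Fermat test (the function name and trial count are my own choices, not from the article):

import random

def fermat_is_probably_prime(n, trials=20):
    # Monte Carlo: may wrongly report "prime" (e.g. for Carmichael
    # numbers), but never wrongly reports "composite". The running
    # time is deterministic in n and trials.
    if n < 4:
        return n in (2, 3)
    for _ in range(trials):
        a = random.randint(2, n - 2)
        # Fermat's little theorem: if n is prime, a**(n-1) % n == 1.
        if pow(a, n - 1, n) != 1:
            return False   # a is a witness: n is definitely composite
    return True            # n is probably prime

print(fermat_is_probably_prime(97))    # True: 97 is prime
print(fermat_is_probably_prime(100))   # False: 100 is composite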

Example 1: Randomized Quicksort

In quicksort, we split the list into two parts around a pivot element chosen by the partitioning algorithm: the part to the left of the pivot holds values smaller than the pivot, and the part to the right holds larger values. After partitioning, each part is sorted recursively by the same procedure.

In the randomized variant, we simply pick the pivot element uniformly at random before partitioning, then recursively sort the two parts.

Algorithm (the original pseudocode made runnable in Python; partition uses the Lomuto scheme with A[q] as the pivot):

import random

def partition(A, p, q):
    # Lomuto partition: A[q] is the pivot; returns its final index.
    pivot = A[q]
    i = p - 1
    for j in range(p, q):
        if A[j] <= pivot:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[q] = A[q], A[i + 1]
    return i + 1

def randomized_quick_sort(A, p, q):
    if p < q:
        i = random.randint(p, q)   # pick a random index in [p, q]
        A[i], A[q] = A[q], A[i]    # move the random pivot to the end
        q1 = partition(A, p, q)
        randomized_quick_sort(A, p, q1 - 1)
        randomized_quick_sort(A, q1 + 1, q)
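A quick usage sketch for the function above (my own example data):

data = [9, 3, 7, 1, 8, 2]
randomized_quick_sort(data, 0, len(data) - 1)
print(data)   # [1, 2, 3, 7, 8, 9]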

Complexity Analysis:-

The worst case for quicksort is O(n²), which occurs when every partition is maximally unbalanced. With a random pivot, however, repeatedly getting very unbalanced splits is extremely unlikely. Suppose, pessimistically, that each partition divides the array so that one side gets n/4 elements and the other 3n/4. Then the height of the recursion tree is log n to the base 4/3, which is O(log n), and the running time satisfies

T(n) ≤ T(n/4) + T(3n/4) + O(n)

Each level of the recursion tree does O(n) work in total, and there are O(log n) levels, so the solution of the recurrence is O(n log n).

So the expected time complexity of randomized quicksort is O(n log n) for every input; the true worst case is still O(n²), but it occurs with vanishingly small probability.

Example 2: Karger’s Algorithm to Find the Minimum Cut

What is the minimum cut?

The minimum cut of a graph is the smallest number of edges that must be removed to split the graph into two disjoint components. Let’s take a few pictorial examples to get a clearer idea.

In this example, the minimum cut is 2, because it can be achieved by severing the edges E and G.

It can also be achieved by severing the edges B and D.

Here, the minimum cut is achieved by severing the edges A and B, which splits the graph into two symmetric halves. So the minimum cut is again 2.

Note that the sizes of the two disjoint components do not matter: the two components can differ greatly in size or be almost equal.

Karger’s algorithm:-

Karger’s algorithm is a randomized algorithm for finding the minimum cut in an unweighted, undirected graph. We repeatedly pick a random edge and contract (coalesce) its two endpoints into a single node, so nodes gradually combine into super-nodes and the graph keeps shrinking; any self-loops created along the way are removed. At the end, we are left with exactly two super-nodes, and the number of edges connecting them gives the cut found by the run.

Algorithm:

While there are more than 2 vertices:

a) Pick a random edge (x, y) in the contracted graph.

b) Merge the endpoints x and y into a single vertex (updating the contracted graph).

c) Remove self-loops.

Return the cut represented by the two remaining vertices (a runnable sketch of these steps follows below).
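Here is a minimal Python sketch of these steps, assuming a connected, undirected multigraph given as an edge list (the function name, union-find representation, and rejection sampling of self-loop edges are my own choices, not from the article):

import random

def karger_min_cut(edges, num_vertices):
    # One contraction run of Karger's algorithm.
    # edges: list of (u, v) pairs, vertices labelled 0 .. num_vertices-1.
    # Returns the size of the cut found (always >= the true minimum cut).
    parent = list(range(num_vertices))      # union-find forest

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    remaining = num_vertices
    while remaining > 2:
        u, v = random.choice(edges)         # uniform over original edges;
        ru, rv = find(u), find(v)           # rejecting self-loops keeps the
        if ru != rv:                        # choice uniform over surviving edges
            parent[ru] = rv                 # contract: merge the super-nodes
            remaining -= 1

    # Edges whose endpoints lie in different super-nodes cross the cut.
    return sum(1 for u, v in edges if find(u) != find(v))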

Let us go back to our first example and see this algorithm in action visually.

· In the first step, we merge the vertices at the two ends of edge B, ending up with a super-node that now forms one end of the edges A, C and D. The edge B becomes a self-loop, since both of its ends have converged, so we omit it.

· Next, we contract the edge C (or D, equivalently). Both C and D become self-loops and are omitted. We now have a super-node that forms the other end point of the edges A and F.

· We pick the edge A (or F) and merge the vertices at its ends. Both A and F become self-loops and are removed from the graph. We end up with two vertices connected by the edges E and G.

· Since only two vertices remain, the algorithm terminates, and 2 (the number of edges remaining in the graph) is returned as the minimum cut.

Success Probability and Time complexity:

Since Karger’s algorithm is a randomized algorithm, it does not always arrive at the right answer in a single run. In fact, the probability that one run finds the minimum cut is at least 2/(n(n-1)), which is low. So, to find the minimum cut with high probability, we must run the algorithm enough times: it can be shown that about n² log(n) runs suffice to arrive at the optimal solution with high probability, where n is the number of vertices in the graph.

Merging a pair of vertices, as well as removing self-loops, takes O(n) time. Since these operations are repeated on the graph until only 2 vertices remain, a single run takes O(n²). For a reliable answer we run the algorithm n² log(n) times, so the overall complexity rises to O(n⁴ log(n)).
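A minimal sketch of this repetition, reusing the hypothetical karger_min_cut function from the earlier sketch:

import math

def min_cut(edges, n):
    # Amplification: repeat the single-run contraction roughly
    # n^2 * log(n) times and keep the smallest cut seen. Every run
    # returns some valid cut, so taking the minimum only improves.
    runs = int(n * n * math.log(n)) + 1
    return min(karger_min_cut(edges, n) for _ in range(runs))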

Applications of randomized algorithms:-

· Randomized algorithms have wide applications in cryptography.

· They are also used in load balancing and primality testing.

· They appear throughout data structures and algorithms: hashing, sorting, searching, order statistics, and computational geometry.

· They are also used in parallel and distributed computing.

Conclusions:-

· Randomized algorithms have the potential to provide efficient and elegant solutions to many problems.

· One concern about randomized algorithms is their dependence on the random bits they use.

· In practice, these bits are not truly random, since they are generated by a pseudorandom number generator.

· Given a fixed seed, the algorithm is therefore fully deterministic: the entire computation is completely fixed by the seed.

Authors:

Rajwardhan Patil

Prasad Pokale

Pritesh Gaikwad

Kaustubh Punekar
