Complexity — It is not that complex

Aditya Agrawal
Programming Club, NIT Raipur
Sep 18, 2018

Complexity : The complexity of an algorithm is a measure of the amount of time and/or space it requires for an input of a given size (n).

In analysing an algorithm, rather than a piece of code, we try to predict the number of times “the principal activity” of that algorithm is performed. For example, when analysing a sorting algorithm we might count the number of comparisons performed.

Do not be startled if you haven’t written any code before. Just observe and try to connect the pieces.

C++:
int sum=0;
for(int i=0;i<n;i++) {
sum = sum + arr[i];
}

Python
sum = 0
for i in range(n):
    sum = sum + arr[i]

If you understand the above code, you can skip this paragraph.
The two snippets above do the same work in two different languages, so you can compare them and figure out what the code is supposed to do. With a little inspection, you can see that

  1. we are adding values into a variable named ‘sum’.
  2. both pieces involve a loop where the variable ‘i’ takes values from 0 to n-1
    (do not worry if you don’t understand how a loop works; getting the gist of it is all that is required)

If adding a number takes 1 operation, doing it n times takes n operations.
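To make that counting concrete, here is a small sketch (the helper name `count_additions` is my own, not from the article) that tallies the principal activity of the summing loop, one addition per element:

```cpp
#include <cassert>
#include <vector>

// Count the "principal activity" (additions) performed while summing
// an array of size n. One addition per element means n operations.
long long count_additions(const std::vector<int>& arr) {
    long long sum = 0, ops = 0;
    for (std::size_t i = 0; i < arr.size(); i++) {
        sum = sum + arr[i];  // the principal activity
        ops++;               // count it
    }
    return ops;
}
```

For an array of 10 elements, this reports exactly 10 operations, matching the n-operations claim above.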

“Complexity is analysed in terms of time and space both.”

What affects the running time of an algorithm?

(a) the computer used, the hardware platform
(b) representation of abstract data types (ADTs)
(c) efficiency of the compiler
(d) competence of the implementer (programming skills)
(e) complexity of the underlying algorithm
(f) size of the input

We will show that, of those above, (e) and (f) are generally the most significant.

Cases:

The number of operations may also depend on the size and values of the input.

e.g.: In this code, the number of operations depends on the value of ‘ a ‘.

int a = 5, sum = 0;

for(int i=0;i<a;i++)
sum+=i;

  1. Best Case (Ω : Omega) — Lower Bound
    In this case, we look at specific instances of input of size n. For example, we might get the best behaviour from a sorting algorithm if the input to it is already sorted.
  2. Worst Case (O : Big O) — Upper Bound
    It is the maximum run time over all inputs of size n, ignoring effects (a) through (d) above. That is, we only consider the “number of times the principal activity of that algorithm is performed”.
  3. Average Case (Θ : Theta)
    Arguably, the average case is the most useful measure, but it is also the most difficult to calculate.
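As an illustration of best versus worst case, consider linear search (this sketch is my own, not from the article): the number of comparisons depends on where, or whether, the target appears in the input of size n.

```cpp
#include <cassert>
#include <vector>

// Linear search, returning how many comparisons were performed.
// Best case (target at index 0): 1 comparison.
// Worst case (target last or absent): n comparisons.
int comparisons_in_search(const std::vector<int>& a, int target) {
    int comparisons = 0;
    for (std::size_t i = 0; i < a.size(); i++) {
        comparisons++;           // count the principal activity
        if (a[i] == target) break;
    }
    return comparisons;
}
```

The same algorithm on the same input size thus costs anywhere from 1 to n operations, which is exactly why we distinguish Ω, O, and Θ.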

While comparing two algorithms or approaches, we always use O-notation, i.e. worst-case analysis. If the code passes for the worst case, it will work for the average case too.

Let’s do a small exercise: find the complexity of the following code snippets.

1:
for(int i=0;i<n;i++)
sum+=a[i];

Complexity : O(n)

2:
for(int i=0;i<n;i+=2)
sum+=a[i];

Complexity : n/2 operations → O(n) (constant factors are dropped)

3:
for(int i=0;i<n/2;i++)
sum+=a[i];

Complexity : n/2 operations → O(n)

4:
for(int i=0;i<n;i*=2)
sum+=a[i];

Complexity : unbounded; since 0*2 = 0, i never changes, so this is an infinite loop.

5:
for(int i=1;i<n;i*=2)
sum+=a[i];

Complexity : O(log(n))
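To verify the logarithmic count, here is a sketch (the helper name is mine) that counts iterations of the doubling loop; since i takes the values 1, 2, 4, 8, …, the loop runs exactly ⌈log₂(n)⌉ times for n ≥ 1.

```cpp
#include <cassert>

// Count iterations of "for(int i=1;i<n;i*=2)".
// i doubles each pass, so the count is ceil(log2(n)) for n >= 1.
int doubling_iterations(int n) {
    int count = 0;
    for (int i = 1; i < n; i *= 2)
        count++;
    return count;
}
```

Doubling n adds only one more iteration, which is the hallmark of O(log(n)) growth.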

6:
for(int i=0;i<n;i++)
for(int j=0;j<n;j++)
sum+=a[i];

Complexity : O(n²)

7:
for(int i=0;i<n;i++)
for(int j=i;j<n;j++)
sum+=a[i];

Complexity : n²/2 → O(n²); the actual count is about n²/2, but constant factors are dropped, so it is on par with n².
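The triangular count can be checked directly (a counting sketch of my own): for each i the inner loop runs n-i times, so the total is n + (n-1) + … + 1 = n(n+1)/2, which is about n²/2.

```cpp
#include <cassert>

// Count inner-loop iterations of the triangular double loop
// "for(i=0;i<n;i++) for(j=i;j<n;j++)".
// Total = n + (n-1) + ... + 1 = n(n+1)/2, about n^2/2 -> O(n^2).
long long triangular_iterations(int n) {
    long long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i; j < n; j++)
            count++;
    return count;
}
```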

8:
for(int i=0;i<n;i++)
for(int j=i;j<n;j++)
sum+=a[i];

for(int j=0;j<n;j++)
sum+=a[j];

Complexity : n²/2 + n → O(n²). For larger values of n, only the highest-order terms contribute significantly. These assumptions hold in almost all cases.

You will come across many different kinds of complexities like
n, n², n³, n log(n), 2^n, 3^n, n!, n^n.
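To get a feel for how differently these grow, a quick numeric check (the helper functions are mine): already at n = 10, the exponential and factorial terms dwarf the polynomials.

```cpp
#include <cassert>

// Evaluate a few growth functions at small n, e.g. at n = 10:
//   n = 10, n^2 = 100, 2^n = 1024, n! = 3628800.
long long two_pow(int n) { return 1LL << n; }  // 2^n for small n

long long factorial(int n) {
    long long f = 1;
    for (int i = 2; i <= n; i++) f *= i;
    return f;
}
```

An algorithm with 2^n or n! cost is unusable for all but tiny inputs, which is what the graph below visualises.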

Keep this graph in mind; it will be useful later.

Pic credit: HackerEarth

Complexity is a great tool, and anyone involved with programming and computers should know about it. Using the above graph, you can now compare two approaches without actually executing the code.

I highly recommend reading more about this.
