Time & Space Complexity | Overview | Practice Problems
1. Time - What & Why
What is time complexity?
According to Wikipedia,
In computer science, the time complexity is the computational complexity that describes the amount of time it takes to run an algorithm.
In simple words,
Time complexity of a program is a simple measure of how fast the time taken by the program grows as its input grows.
Why should we care about time complexity?
There can be more than one way to solve a problem.
Consider this example
Check whether a given number is prime.
Method 1
function isPrime(n) {
  if (n < 2) {
    return false; // 0 and 1 are not prime
  }
  for (let i = 2; i < n; ++i) {
    if (n % i === 0) {
      return false;
    }
  }
  return true;
}
Here the loop will run at most n - 2 times
Method 2
function isPrime(n) {
  if (n < 2) {
    return false; // 0 and 1 are not prime
  }
  for (let i = 2; i <= Math.sqrt(n); ++i) {
    if (n % i === 0) {
      return false;
    }
  }
  return true;
}
Here the loop will run at most about √n - 1 times
Conclusion
The second method is faster. That’s why time complexity matters: in real life we want software to be fast and smooth.
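To see the difference concretely, we can count how many loop iterations each method performs. The function names isPrimeNaive and isPrimeSqrt below are ours, restating Method 1 and Method 2 with an iteration counter added purely for illustration:

```javascript
// Restatements of Method 1 and Method 2 with an iteration counter,
// so we can compare how much work each does for the same input.
function isPrimeNaive(n) {
  let steps = 0;
  for (let i = 2; i < n; ++i) {
    ++steps;
    if (n % i === 0) {
      return { prime: false, steps };
    }
  }
  return { prime: true, steps };
}

function isPrimeSqrt(n) {
  let steps = 0;
  for (let i = 2; i <= Math.sqrt(n); ++i) {
    ++steps;
    if (n % i === 0) {
      return { prime: false, steps };
    }
  }
  return { prime: true, steps };
}

console.log(isPrimeNaive(1000003).steps); // 1000001 iterations
console.log(isPrimeSqrt(1000003).steps);  // 999 iterations
```

For the same prime input, the second method does roughly a thousand times less work.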
2. How to calculate time complexity
General Rules
The time taken by simple statements is constant, like:
let i = 0;
i = i + 1;
This constant time is considered as Big O of 1 i.e. O(1)
What is Big O? Don’t worry, we’ll cover this later in the series.
How to calculate time complexity?
code 1
for (let i = 0; i < n; ++i) {
  console.log(i);
}
Here the loop will run n times
Time Complexity: O(n)
code 2
for (let i = 0; i < n; ++i) {
  for (let j = 0; j < n; ++j) {
    console.log(i, j);
  }
}
Here the inner statement will run n² times
Time Complexity: O(n²)
code 3
for (let i = 1; i < n; i *= 2) {
  console.log(i);
}
Here the loop will run about log₂ n times
1st iteration, i = 1
2nd iteration, i = 2
3rd iteration, i = 4
4th iteration, i = 8
….
kth iteration, i = 2 ^ (k - 1)
The loop continues while i < n, i.e. while 2 ^ (k - 1) < n
Taking log₂ on both sides:
k - 1 < log₂ n
So the number of iterations k is about log₂ n
Time Complexity: O(log n)
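As a sanity check of this derivation, we can count the iterations directly and compare with log₂ n (doublingSteps is our helper name, not part of the original code):

```javascript
// Count the iterations of the doubling loop and compare with log2(n).
function doublingSteps(n) {
  let steps = 0;
  for (let i = 1; i < n; i *= 2) {
    ++steps;
  }
  return steps;
}

console.log(doublingSteps(1024)); // 10, and Math.log2(1024) is 10
console.log(doublingSteps(1000)); // 10, i.e. ceil(log2(1000))
```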
Conclusion
Time complexity is expressed using mathematical notations. These notations represent the time required by an algorithm.
3. Space - What & Why
What is space complexity?
According to Wikipedia,
In computer science, the space complexity of an algorithm or a computer program is the amount of memory space required to solve an instance of the computational problem as a function of the size of the input.
In simple words,
Space complexity of a program is a simple measure of how fast the space taken by the program grows as its input grows.
Why should we care about space complexity?
A good algorithm keeps space complexity as low as possible. Other things being equal, an algorithm with lower space complexity is preferable.
There is often a time-space tradeoff: an algorithm may use more space to run faster, or save space at the cost of extra time.
Examples
Method 1
function fibonacci(n) {
  const arr = [0, 1];
  for (let i = 2; i <= n; ++i) {
    arr.push(arr[i - 2] + arr[i - 1]);
  }
  return arr[n];
}
Here we are using an array of size n and a fixed space for iteration. Hence the space complexity is O(n).
Method 2
function fibonacci(n) {
  let x = 0, y = 1, z;
  if (n === 0) {
    return x;
  }
  if (n === 1) {
    return y;
  }
  for (let i = 2; i <= n; ++i) {
    z = x + y;
    x = y;
    y = z;
  }
  return z;
}
Here we are using a fixed space of four variables (x, y, z and i). Hence the space complexity is O(1).
Conclusion
The second method is better.
There is no point in using more space to solve a problem if we can do the same with less.
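As a quick check that the two approaches agree, here is a sketch that restates both (fibonacciArray and fibonacciIter are our names, with fib(0) = 0 and fib(1) = 1) and compares their outputs:

```javascript
// Method 1 restated: builds the whole sequence, O(n) extra space.
function fibonacciArray(n) {
  const arr = [0, 1];
  for (let i = 2; i <= n; ++i) {
    arr.push(arr[i - 2] + arr[i - 1]);
  }
  return arr[n];
}

// Method 2 restated: keeps only the last two values, O(1) extra space.
function fibonacciIter(n) {
  let x = 0, y = 1, z = x;
  for (let i = 2; i <= n; ++i) {
    z = x + y;
    x = y;
    y = z;
  }
  return n === 0 ? x : (n === 1 ? y : z);
}

console.log(fibonacciArray(10), fibonacciIter(10)); // 55 55
```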
4. How to calculate space complexity
General rules
The space taken by a variable declaration is fixed (constant), like:
let i = 0;
This constant space requirement is considered as Big O of 1, i.e. O(1)
Our focus is more on the non-constant space requirement, which grows with input size.
How to calculate space complexity?
code 1
const arr = [];
for (let i = 0; i < n; ++i) {
  arr.push(i);
}
Here the array will take n space
Space Complexity: O(n)
code 2
const arr = [];
for (let i = 0; i < n; ++i) {
  for (let j = 0; j < n; ++j) {
    arr.push(i + j);
  }
}
Here the array will take n² space
Space Complexity: O(n²)
code 3
const arr = [];
for (let i = 1; i < n; i *= 2) {
  arr.push(i);
}
Here the array will take about log₂ n space
Space Complexity: O(log n)
Conclusion
Similar to Time complexity, Space complexity also plays a crucial role in determining the efficiency of an algorithm/program.
If an algorithm takes some extra time, you can still wait for it to finish.
But if a program needs more memory than the machine can provide, it will not run at all.
5. Practice Problems
Problem 1
What is the time & space complexity of the following code:
let a = 0, b = 0;
for (let i = 0; i < n; ++i) {
  a = a + i;
}
for (let j = 0; j < m; ++j) {
  b = b + j;
}
Problem 2
What is the time & space complexity of the following code:
let a = 0, b = 0;
for (let i = 0; i < n; ++i) {
  for (let j = 0; j < n; ++j) {
    a = a + j;
  }
}
for (let k = 0; k < n; ++k) {
  b = b + k;
}
Problem 3
What is the time and space complexity of the following code:
let a = 0;
for (let i = 0; i < n; ++i) {
  for (let j = n; j > i; --j) {
    a = a + i + j;
  }
}
Solutions
Problem 1:
Time Complexity: O(n + m)
Space Complexity: O(1)
Problem 2:
Time Complexity: O(n²)
Space Complexity: O(1)
Problem 3:
Time Complexity: O(n²)
Space Complexity: O(1)
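For Problem 3, the inner loop runs n - i times for each i, so the total work is n + (n - 1) + ... + 1 = n(n + 1)/2, which is O(n²). A small counter (problem3Ops, our name, added for illustration) confirms the closed form:

```javascript
// Count the exact number of inner-loop operations in Problem 3 and
// compare with the closed form n * (n + 1) / 2, which is O(n^2).
function problem3Ops(n) {
  let a = 0, ops = 0;
  for (let i = 0; i < n; ++i) {
    for (let j = n; j > i; --j) {
      a = a + i + j;
      ++ops;
    }
  }
  return ops;
}

console.log(problem3Ops(100));      // 5050
console.log((100 * (100 + 1)) / 2); // 5050
```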
6. Practice Problems II
Problem 1
What is the time complexity of the following code:
for (let i = n; i > 0; i = parseInt(i / 2)) {
  console.log(i);
}
Problem 2
What is the time complexity of the following code:
for (let i = 1; i < n; i = i * 2) {
  console.log(i);
}
Problem 3
What is the time complexity of the following code:
for (let i = 0; i < n; ++i) {
  for (let j = 1; j < n; j = j * 2) {
    console.log(j);
  }
}
Solutions
Problem 1:
Time Complexity: O(log n)
Problem 2:
Time Complexity: O(log n)
Problem 3:
Time Complexity: O(n log n)
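For Problem 3 here, the outer loop runs n times and the inner doubling loop about log₂ n times, so the total is about n · log₂ n. Counting the iterations (nestedLogCount is our helper name) confirms this:

```javascript
// Count iterations of the nested loop from Problem 3: n outer passes,
// each doing about log2(n) inner passes.
function nestedLogCount(n) {
  let count = 0;
  for (let i = 0; i < n; ++i) {
    for (let j = 1; j < n; j = j * 2) {
      ++count;
    }
  }
  return count;
}

console.log(nestedLogCount(1024)); // 10240, i.e. 1024 * log2(1024)
```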
7. Practice Problems III (Recursion)
Problem 1
What is the time complexity of the following code:
// search an element in an array
// list is already sorted
function search(list, item, start, end) {
  if (start > end) {
    return false;
  }
  const mid = Math.floor((start + end) / 2);
  if (list[mid] < item) {
    return search(list, item, mid + 1, end);
  }
  if (list[mid] > item) {
    return search(list, item, start, mid - 1);
  }
  return true;
}
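A small usage example (the function is restated here so the snippet runs on its own; the sample array is ours):

```javascript
// Recursive search over a sorted array, restated from Problem 1.
function search(list, item, start, end) {
  if (start > end) {
    return false;
  }
  const mid = Math.floor((start + end) / 2);
  if (list[mid] < item) {
    return search(list, item, mid + 1, end);
  }
  if (list[mid] > item) {
    return search(list, item, start, mid - 1);
  }
  return true;
}

const sorted = [1, 3, 5, 7, 9, 11];
console.log(search(sorted, 7, 0, sorted.length - 1)); // true
console.log(search(sorted, 4, 0, sorted.length - 1)); // false
```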
Problem 2
What is the time complexity of the following code:
// count the occurrence of an element in an array
// list is already sorted
function count(list, item, start, end) {
  if (start > end) {
    return 0;
  }
  const mid = Math.floor((start + end) / 2);
  if (list[mid] < item) {
    return count(list, item, mid + 1, end);
  }
  if (list[mid] > item) {
    return count(list, item, start, mid - 1);
  }
  return count(list, item, start, mid - 1) + 1 + count(list, item, mid + 1, end);
}
Problem 3
What is the time complexity of the following code:
// Fibonacci of nth element
function fibonacci(n) {
  if (n <= 1) {
    return 1;
  }
  return fibonacci(n - 1) + fibonacci(n - 2);
}
Solutions
Problem 1:
Time Complexity: O(log n)
Problem 2:
Time Complexity: O(n)
Problem 3:
Time Complexity: O(2^n)
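To see why Problem 3 is exponential, we can count the recursive calls (the counter is our addition, not part of the original code). The count roughly doubles each time n grows by 1:

```javascript
// Naive Fibonacci from Problem 3 with a call counter added.
let calls = 0;
function fib(n) {
  ++calls;
  if (n <= 1) {
    return 1;
  }
  return fib(n - 1) + fib(n - 2);
}

calls = 0; fib(10); console.log(calls); // 177 calls
calls = 0; fib(15); console.log(calls); // 1973 calls
calls = 0; fib(20); console.log(calls); // 21891 calls
```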
8. Asymptotic Analysis
What is Asymptotic Analysis?
According to Wikipedia,
In mathematical analysis, asymptotic analysis, also known as asymptotics, is a method of describing limiting behavior.
In simple words,
Evaluation of the performance of an algorithm in terms of input size. How does the time/space taken by an algorithm change with the input size?
Cases to analyze an algorithm
- Worst Case Analysis
In this case, we calculate the upper bound on the running time of an algorithm.
We consider inputs that make the algorithm execute the maximum number of operations.
- Best Case Analysis
In this case, we calculate the lower bound on the running time of an algorithm.
We consider inputs that make the algorithm execute the minimum number of operations.
- Average Case Analysis
In this case, we take all possible inputs, calculate the running time for each, and average them.
Example
// linear search algorithm
function linearSearch(list, item) {
  for (let i = 0; i < list.length; ++i) {
    if (list[i] === item) {
      return true;
    }
  }
  return false;
}
In-depth
For a linear search algorithm,
where N = length of array (list.length)
Worst Case:
When the element to be searched is either not present in the array or is present at the end of the array.
Time Complexity: O(n)
Best Case:
When the element to be searched is present at the first location of the array.
Time Complexity: O(1)
Average Case:
When the element is equally likely to be at any of the N positions, we average the number of comparisons over all positions.
Comparisons: (N + (N - 1) + (N - 2) + ... + 1) / N = (N + 1) / 2
Time Complexity: O(N)
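The average-case figure can be checked empirically. The sketch below (our code, with a variant of linearSearch that returns a comparison count) averages the counts over every possible position of the item:

```javascript
// Restated linear search that returns the number of comparisons made,
// so we can average it over every possible position of the item.
function comparisons(list, item) {
  for (let i = 0; i < list.length; ++i) {
    if (list[i] === item) {
      return i + 1;
    }
  }
  return list.length;
}

const N = 100;
const list = Array.from({ length: N }, (_, i) => i);
let total = 0;
for (const item of list) {
  total += comparisons(list, item);
}
console.log(total / N); // 50.5, i.e. (N + 1) / 2
```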
9. Asymptotic Notations
What are Asymptotic Notations?
Asymptotic notations are used to represent the complexities (time & space) of algorithms for asymptotic analysis.
There are three notations that are mostly used:
- Big Oh Notation (O)
- Big Omega Notation (Ω)
- Big Theta Notation (Θ)
Big Oh (O)
The Big O notation defines the upper bound of an algorithm.
Mathematically defined as:
O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= f(n) <= c * g(n) for all n >= n0 }
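As a concrete instance of the definition, f(n) = 2n + 3 is O(n): the constants c = 5 and n0 = 1 witness the inequality. The choice of f, c and n0 below is ours, purely for illustration:

```javascript
// Check 0 <= f(n) <= c * g(n) for f(n) = 2n + 3, g(n) = n, with c = 5
// and n0 = 1, over a finite range, to illustrate the definition above.
const f = (n) => 2 * n + 3;
const g = (n) => n;
const c = 5;
const n0 = 1;

let holds = true;
for (let n = n0; n <= 1000; ++n) {
  if (!(0 <= f(n) && f(n) <= c * g(n))) {
    holds = false;
  }
}
console.log(holds); // true
```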
Big Omega (Ω)
The Big Omega notation defines the lower bound of an algorithm.
Mathematically defined as:
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 <= c * g(n) <= f(n) for all n >= n0 }
Big Theta (Θ)
The Big Theta notation defines both, upper & lower bound of an algorithm.
Mathematically defined as:
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 <= c1 * g(n) <= f(n) <= c2 * g(n) for all n >= n0 }
Thanks for reading ❤
Say Hello! manishsundriyal.in | Twitter