Asymptotic Notations
Whenever we want to analyze an algorithm, we need to calculate its complexity. However, calculating the complexity of an algorithm does not give the exact amount of resources required; instead, it describes how the resource usage grows with the input size.
- Asymptotic Notations are mathematical tools that allow you to analyze an algorithm’s running time by identifying its behavior as its input size grows.
- Asymptotic notation of an algorithm is a mathematical representation of its complexity.
There are three cases in which time complexity is calculated:
- Best Case − Minimum time required for program execution.
- Average Case − Average time required for program execution.
- Worst Case − Maximum time required for program execution.
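The three cases can be illustrated with a small sketch. Linear search is used here purely as an assumed example; the function and data below are hypothetical, and the comparison counter shows how the amount of work differs between the best and worst case.

```python
# Hypothetical example: linear search, counting comparisons to show
# how running time varies between best and worst case.

def linear_search(items, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = [7, 3, 9, 1, 5]

# Best case: target is the first element -> 1 comparison.
_, best = linear_search(data, 7)   # best == 1

# Worst case: target is absent -> n comparisons.
_, worst = linear_search(data, 42)  # worst == 5
```

The average case falls between these two extremes, depending on the distribution of inputs.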
We mainly use three types of asymptotic notations, which are as follows:
- Big – Oh (O)
- Big – Omega (Ω)
- Big – Theta (Θ)
Big – Oh Notation (O)
Big – Oh notation is used to define the upper bound of an algorithm in terms of time complexity. It guarantees that, for sufficiently large inputs, the algorithm takes at most a constant multiple of g(n) time. For this reason, Big – Oh notation is most commonly used to describe the worst case of an algorithm's time complexity.
Big – Oh Notation can be defined as follows…
Consider a function f(n) that represents the time complexity of an algorithm, and let g(n) be its most significant term. If f(n) <= C g(n) for all n >= n0, where C > 0 and n0 >= 1, then we can represent f(n) as O(g(n)).
f(n) = O(g(n))
Consider the following graph, drawn for the values of f(n) and C g(n), with the input size (n) on the X-axis and the time required on the Y-axis.
In the above graph, beyond a particular input value n0, C g(n) is always greater than f(n), which indicates the algorithm's upper bound.
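The definition can be checked numerically. As an assumed example, take f(n) = 3n + 2 and g(n) = n; choosing C = 4 and n0 = 2 makes f(n) <= C g(n) hold for all n >= n0, so f(n) = O(n). The constants below are one valid choice, not the only one.

```python
# Numeric check of the Big-Oh definition for the assumed example
# f(n) = 3n + 2 with g(n) = n, using constants C = 4 and n0 = 2.

def f(n):
    return 3 * n + 2

def g(n):
    return n

C, n0 = 4, 2

# f(n) <= C * g(n) must hold for every n >= n0.
holds = all(f(n) <= C * g(n) for n in range(n0, 1000))
print(holds)  # True, so f(n) = O(n)
```

Note that the inequality fails for n = 1 (5 > 4), which is why the definition only requires it to hold beyond some threshold n0.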
Big – Omega Notation (Ω)
Big – Omega notation is used to define the lower bound of an algorithm in terms of time complexity. It guarantees that, for sufficiently large inputs, the algorithm takes at least a constant multiple of g(n) time. For this reason, Big – Omega notation is commonly used to describe the best case of an algorithm's time complexity.
Big – Omega Notation can be defined as follows…
Consider a function f(n) that represents the time complexity of an algorithm, and let g(n) be its most significant term. If f(n) >= C g(n) for all n >= n0, where C > 0 and n0 >= 1, then we can represent f(n) as Ω(g(n)).
f(n) = Ω(g(n))
Consider the following graph, drawn for the values of f(n) and C g(n), with the input size (n) on the X-axis and the time required on the Y-axis.
In the above graph, beyond a particular input value n0, C g(n) is always less than f(n), which indicates the algorithm's lower bound.
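The same kind of numeric check works for the lower bound. Using the assumed example f(n) = 3n + 2 with g(n) = n, the choice C = 1 and n0 = 1 satisfies f(n) >= C g(n) for all n >= n0, so f(n) = Ω(n).

```python
# Numeric check of the Big-Omega definition for the assumed example
# f(n) = 3n + 2 with g(n) = n, using constants C = 1 and n0 = 1.

def f(n):
    return 3 * n + 2

def g(n):
    return n

C, n0 = 1, 1

# f(n) >= C * g(n) must hold for every n >= n0.
holds = all(f(n) >= C * g(n) for n in range(n0, 1000))
print(holds)  # True, so f(n) = Ω(n)
```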
Big – Theta Notation (Θ)
Big – Theta notation is used to define the tight bound of an algorithm in terms of time complexity. It bounds the running time from both above and below by constant multiples of g(n), so it captures the exact order of growth of the algorithm's time complexity.
Big – Theta Notation can be defined as follows…
Consider a function f(n) that represents the time complexity of an algorithm, and let g(n) be its most significant term. If C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, where C1 > 0, C2 > 0 and n0 >= 1, then we can represent f(n) as Θ(g(n)).
f(n) = Θ(g(n))
Consider the following graph, drawn for the values of f(n), C1 g(n) and C2 g(n), with the input size (n) on the X-axis and the time required on the Y-axis.
In the above graph, beyond a particular input value n0, C1 g(n) is always less than f(n) and C2 g(n) is always greater than f(n), which indicates the algorithm's tight bound.
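Combining the two bounds gives the tight bound. For the assumed example f(n) = 3n + 2 with g(n) = n, the constants C1 = 3, C2 = 4 and n0 = 2 satisfy C1 g(n) <= f(n) <= C2 g(n) for all n >= n0, so f(n) = Θ(n).

```python
# Numeric check of the Big-Theta definition for the assumed example
# f(n) = 3n + 2 with g(n) = n, using C1 = 3, C2 = 4 and n0 = 2.

def f(n):
    return 3 * n + 2

def g(n):
    return n

C1, C2, n0 = 3, 4, 2

# C1 * g(n) <= f(n) <= C2 * g(n) must hold for every n >= n0.
holds = all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(n0, 1000))
print(holds)  # True, so f(n) = Θ(n)
```

Because f(n) is sandwiched between two constant multiples of n, Big – Theta pins down the exact order of growth rather than just an upper or lower limit.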