Asymptotic Notations and how to calculate them


In mathematics, asymptotic analysis, also known as asymptotics, is a method of describing the limiting behavior of a function. In computing, asymptotic analysis of an algorithm refers to defining the mathematical bounds of its run-time performance as a function of the input size. For example, suppose the running time of one operation is f(n) = n and that of another is g(n) = n^2. Then the running time of the first operation grows linearly as n increases, while that of the second grows quadratically; for small values of n, however, the two running times are nearly the same.
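To make the comparison concrete, here is a quick numeric sketch (an illustrative program, not part of the original article) that prints a linear cost next to a quadratic one:

#include <stdio.h>

// Compares a linear cost f(n) = n with a quadratic cost g(n) = n^2.
// For small n the two are close; as n grows, g(n) pulls far ahead.
int main(void) {
    for (int n = 1; n <= 100000; n *= 10)
        printf("n = %6d   f(n) = %6d   g(n) = %lld\n",
               n, n, (long long)n * n);
    return 0;
}

At n = 10 the costs are 10 versus 100; at n = 100000 they are 100,000 versus 10,000,000,000.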

Usually, the analysis of an algorithm is done based on three cases, each of which is commonly paired with one of the notations:

  1. Best Case (Omega Notation (Ω))
  2. Average Case (Theta Notation (Θ))
  3. Worst Case (Big-O Notation (O))

All of these notations are discussed below in detail:

Omega (Ω) Notation:

Omega (Ω) notation specifies the asymptotic lower bound for a function f(n). For a given function g(n), Ω(g(n)) is denoted by:

Ω (g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c*g(n) ≤ f(n) for all n ≥ n0}. 

This means that f(n) = Ω(g(n)) if there exist positive constants c and n0 such that, to the right of n0, f(n) always lies on or above c*g(n).
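For example, f(n) = 3n^2 + 2n satisfies 0 ≤ 3n^2 ≤ f(n) for all n ≥ 1, so choosing c = 3 and n0 = 1 shows that f(n) = Ω(n^2).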

[Figure: graphical representation of f(n) = Ω(g(n))]

Follow the steps below to calculate Ω for a program:

  1. Break the program into smaller segments.
  2. Find the number of operations performed for each segment (in terms of the input size), assuming the given input is such that the program takes the least amount of time.
  3. Add up all the operations and simplify the result; let's say it is f(n).
  4. Remove the constants and pick the term with the highest order, or any other function that always stays below f(n) as n tends to infinity; call it g(n). Then Omega of f(n) is Ω(g(n)).

For example, consider the pseudocode below.

Pseudo Code

void fun(int n) {
   for (int i = 0; i < n; i++) {
      for (int j = 0; j < n; j++) {
         // Do some constant work (no loop or
         // function call)
      }
   }

   for (int i = 0; i < n; i++) {
      // Do some constant work (no loop or
      // function call)
   }
}


The time taken by the above code can be written as a*n^2 + b*n + c, where a, b and c are machine-specific constants: the nested loops account for a*n^2, the single loop for b*n, and c for the constant overhead. The highest-growing term is a*n^2, so the time complexity of the code is Ω(n^2). Since Ω only gives a lower bound, Ω(n), Ω(Log n) and Ω(1) are also true, but weaker, statements.
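To make this shape visible, here is a minimal instrumented sketch (an illustrative program, not part of the original article) that counts how many times the constant-work bodies execute:

#include <stdio.h>

// Counts the loop-body executions of fun() above, making the
// a*n^2 + b*n + c shape visible: the nested loops contribute
// n^2 executions and the single loop contributes n.
long long countOps(int n) {
    long long ops = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops++;              // executes n * n times
    for (int i = 0; i < n; i++)
        ops++;                  // executes n times
    return ops;                 // n^2 + n in total
}

int main(void) {
    for (int n = 1; n <= 1000; n *= 10)
        printf("n = %4d -> ops = %lld\n", n, countOps(n));
    return 0;
}

For n = 100 this prints 10100 operations; the n^2 term already dominates.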

Omega notation is rarely used on its own to analyze an algorithm, because judging an algorithm by its best-case inputs says little about how it behaves in general.
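As an illustration (a hypothetical example, not from the article), consider linear search. If the key happens to sit at the first position, the function returns after a single comparison, so its best-case running time is Ω(1), even though in the worst case it scans the whole array:

#include <stdio.h>

// Linear search: best case Omega(1) (key at index 0),
// worst case O(n) (key at the last index, or absent).
int linearSearch(const int *arr, int n, int key) {
    for (int i = 0; i < n; i++)
        if (arr[i] == key)
            return i;   // best case: found on the first comparison
    return -1;          // worst case: n comparisons performed
}

int main(void) {
    int a[] = {7, 3, 9, 1};
    printf("%d\n", linearSearch(a, 4, 7));  // best case, prints 0
    printf("%d\n", linearSearch(a, 4, 1));  // worst case, prints 3
    return 0;
}

Saying "linear search is Ω(1)" is true but tells us almost nothing about its typical cost, which is the point made above.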

Theta (Θ) Notation:

Big-Theta (Θ) notation specifies an asymptotically tight bound for a function f(n): it bounds the function from both above and below. For a given function g(n), Θ(g(n)) is denoted by:

Θ (g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ n0}. 

This means that f(n) = Θ(g(n)) if there exist positive constants c1, c2 and n0 such that, to the right of n0, f(n) always lies on or above c1*g(n) and on or below c2*g(n).
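For example, f(n) = 3n^2 + 2n satisfies 3n^2 ≤ f(n) ≤ 5n^2 for all n ≥ 1 (since 2n ≤ 2n^2 when n ≥ 1), so choosing c1 = 3, c2 = 5 and n0 = 1 shows that f(n) = Θ(n^2).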

[Figure: graphical representation of f(n) = Θ(g(n))]

Follow the steps below to calculate Θ for a program:

  1. Break the program into smaller segments.
  2. Find the number of operations performed for each segment (in terms of the input size). For a Θ bound, the operation count must be the same for every input of size n, not just for the best or worst one.
  3. Add up all the operations and simplify the result; let's say it is f(n).
  4. Remove the constants and pick the term with the highest order; call it g(n). Then Theta of f(n) is Θ(g(n)).

As an example, let us consider the above pseudocode. Its running time a*n^2 + b*n + c is the same for every input of size n, and the highest-growing term is a*n^2, so the time complexity of the code is Θ(n^2).
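Another simple illustration (a hypothetical example, not from the article) is summing an array: the loop body runs exactly n times for every input of size n, so the running time is Θ(n).

#include <stdio.h>

// Sums an array of n elements. The loop body executes exactly
// n times regardless of the array's contents, so the running
// time is Theta(n): bounded above and below by the same function.
long long sum(const int *arr, int n) {
    long long s = 0;
    for (int i = 0; i < n; i++)
        s += arr[i];
    return s;
}

int main(void) {
    int a[] = {1, 2, 3, 4};
    printf("%lld\n", sum(a, 4));  // prints 10
    return 0;
}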

Big-O Notation:

Big-O notation specifies the asymptotic upper bound for a function f(n). For a given function g(n), O(g(n)) is denoted by:

O (g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c*g(n) for all n ≥ n0}. 

This means that f(n) = O(g(n)) if there exist positive constants c and n0 such that, to the right of n0, f(n) always lies on or below c*g(n).
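For example, f(n) = 3n^2 + 2n satisfies f(n) ≤ 5n^2 for all n ≥ 1, so choosing c = 5 and n0 = 1 shows that f(n) = O(n^2).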

[Figure: graphical representation of f(n) = O(g(n))]

Follow the steps below to calculate O for a program:

  1. Break the program into smaller segments.
  2. Find the number of operations performed for each segment (in terms of the input size), assuming the given input is such that the program takes the maximum time, i.e., the worst-case scenario.
  3. Add up all the operations and simplify the result; let's say it is f(n).
  4. Remove the constants and pick the term with the highest order, since as n tends to infinity the constants and lower-order terms in f(n) become insignificant; call it g(n). Then the big-O of f(n) is O(g(n)), or O(h(n)) for any h(n) with a higher order of growth than g(n).

As an example, let us consider the above pseudocode again. The highest-growing term is a*n^2, so the time complexity of the code is O(n^2). Any term that grows faster than n^2, such as n^3 or 2^n, could also be put inside the O, but O(n^2) is the tightest bound.

It is the most widely used notation because it is the easiest to calculate: unlike Theta notation, there is no need to check how the running time behaves for every type of input. And since the worst case is taken into account, it gives an upper bound on the time the program will take to execute for any input.
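To see all three notations side by side, here is one more hypothetical example (not from the article) whose best and worst cases differ:

#include <stdbool.h>
#include <stdio.h>

// Checks whether the array contains two equal elements.
// Best case: the first two elements match  -> Omega(1).
// Worst case: every pair gets compared     -> O(n^2).
// Because the best and worst cases differ, no single Theta
// bound describes its running time over all inputs of size n.
bool hasDuplicate(const int *arr, int n) {
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++)
            if (arr[i] == arr[j])
                return true;  // early exit in favourable cases
    return false;             // all pairs compared
}

int main(void) {
    int best[]  = {5, 5, 1, 2};
    int worst[] = {1, 2, 3, 4};
    printf("%d %d\n", hasDuplicate(best, 4), hasDuplicate(worst, 4));
    return 0;
}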


