How to Analyse Loops for Complexity Analysis of Algorithms


We have discussed Asymptotic Analysis, Worst, Average and Best Cases, and Asymptotic Notations in previous posts. In this post, the analysis of iterative programs is discussed with simple examples.

Constant Time Complexity O(1):

The time complexity of a function (or set of statements) is considered O(1) if it doesn't contain a loop, recursion, or a call to any other non-constant-time function, i.e., it is a set of non-recursive, non-loop statements.

Example: 

  • swap() function has O(1) time complexity (a minimal sketch is shown after the loop examples below). 
  • A loop or recursion that runs a constant number of times is also considered O(1). For example, the following loop is O(1).

C




// Here c is a constant
for (int i = 1; i <= c; i++) {
    // some O(1) expressions
}

Java




// Here c is a constant
for (int i = 1; i <= c; i++) {
    // some O(1) expressions
}
 
// This code is contributed by Utkarsh

Python3




# Here c is a constant
for i in range(1, c+1):
    pass  # some O(1) expressions

# This code is contributed by Pushpesh Raj.

Javascript




// Here c is a constant
for (var i = 1; i <= c; i++) {
    // some O(1) expressions
}
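
The swap() function mentioned above is a typical constant-time routine. Here is a minimal sketch in C (the signature is an assumption for illustration, not taken from the original snippets): it performs a fixed number of assignments regardless of the input values, so it runs in O(1) time.

C

// Swaps the values pointed to by a and b using a temporary variable.
// Exactly three assignments run no matter what the inputs are,
// so the time complexity is O(1).
void swap(int* a, int* b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}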

Linear Time Complexity O(n):

The time complexity of a loop is considered O(n) if the loop variable is incremented/decremented by a constant amount. For example, the following functions have O(n) time complexity.

C




// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
 
for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}

Java




// Here c is a positive integer constant
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
   
for (int i = n; i > 0; i -= c) {
    // some O(1) expressions
}
 
// This code is contributed by Utkarsh

Python3




# Here c is a positive integer constant
for i in range(1, n+1, c):
    pass  # some O(1) expressions

for i in range(n, 0, -c):
    pass  # some O(1) expressions

# This code is contributed by Pushpesh Raj

Javascript




// Here c is a positive integer constant
for (var i = 1; i <= n; i += c) {
    // some O(1) expressions
}
 
for (var i = n; i > 0; i -= c) {
    // some O(1) expressions
}
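
To make this concrete, here is a small sketch in C (the function name and array are chosen for this example): the loop visits each element once, doing O(1) work per element, so the total work is O(n). More generally, a loop that steps by a constant c runs about n/c times, which is still O(n).

C

// Returns the largest element of arr[0..n-1].
// The loop body is O(1) and executes n - 1 times, so the
// function runs in O(n) time.
int findMax(int arr[], int n)
{
    int max = arr[0];
    for (int i = 1; i < n; i++) {
        if (arr[i] > max)
            max = arr[i];
    }
    return max;
}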

Quadratic Time Complexity O(n^c):

In nested loops, the time complexity equals the number of times the innermost statement is executed; an algorithm whose running time is directly proportional to the square of the input size has O(n^2) time complexity. For example, the following sample loops have O(n^2) time complexity.

C




for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
 
for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

Java




for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
   
for (int i = n; i > 0; i -= c) {
    for (int j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
 
// This code is contributed by Utkarsh

Python3




for i in range(1, n+1, c):
    for j in range(1, n+1, c):
        pass  # some O(1) expressions

for i in range(n, 0, -c):
    for j in range(i+1, n+1, c):
        pass  # some O(1) expressions

# This code is contributed by Pushpesh Raj

Javascript




for (var i = 1; i <= n; i += c) {
    for (var j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
 
for (var i = n; i > 0; i -= c) {
    for (var j = i + 1; j <= n; j += c) {
        // some O(1) expressions
    }
}

Example: Selection Sort and Insertion Sort have O(n^2) time complexity. 
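
To count the work in the loops above: in the first pair of nested loops the innermost statement executes about (n/c) * (n/c) times; in the second, the inner loop runs roughly 0, 1, 2, ..., up to n/c times as i decreases, and that sum is about (n/c)^2 / 2. Both counts grow proportionally to n^2, so both snippets are O(n^2).

As a quick sketch of why Selection Sort is O(n^2) (a minimal version for illustration, not a full reference implementation): the inner loop runs n - 1, n - 2, ..., 1 times, and that sum is n(n - 1)/2 = O(n^2).

C

// Selection sort: repeatedly moves the smallest remaining element
// to the front of the unsorted part. The nested loops perform about
// n*(n-1)/2 comparisons, so the running time is O(n^2) in every case.
void selectionSort(int arr[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int minIdx = i;
        for (int j = i + 1; j < n; j++) {
            if (arr[j] < arr[minIdx])
                minIdx = j;
        }
        int temp = arr[i];
        arr[i] = arr[minIdx];
        arr[minIdx] = temp;
    }
}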

Logarithmic Time Complexity O(Log n):

The time complexity of a loop is considered O(Log n) if the loop variable is divided/multiplied by a constant amount. Similarly, a recursive function whose argument shrinks by a constant factor on each call (as in the recursive example below) has O(Log n) time complexity.

C




for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}

Java




for (int i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (int i = n; i > 0; i /= c) {
    // some O(1) expressions
}
 
// This code is contributed by Utkarsh

Python3




i = 1
while(i <= n):
    # some O(1) expressions
    i = i*c
 
i = n
while(i > 0):
    # some O(1) expressions
    i = i//c
 
# This code is contributed by Pushpesh Raj

Javascript




for (var i = 1; i <= n; i *= c) {
    // some O(1) expressions
}
for (var i = n; i > 0; i /= c) {
    // some O(1) expressions
}
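
To see why these loops are O(Log n): in the first loop, i equals c^k after k iterations, so the loop stops once c^k exceeds n, i.e. after about Log_c(n) iterations. The second loop is symmetric, dividing i by c until it reaches 0.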

C




// Recursive function
// Here c is a constant greater than 1; n shrinks by a constant
// factor on every call, so O(Log n) calls are made in total
void recurse(int n)
{
    if (n == 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(n / c);
}

Java




// Recursive function
// Here c is a constant greater than 1; n shrinks by a constant
// factor on every call, so O(Log n) calls are made in total
void recurse(int n)
{
    if (n == 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(n / c);
}
// This code is contributed by Utkarsh

Python3




# Recursive function
# Here c is a constant greater than 1; n shrinks by a constant
# factor on every call, so O(Log n) calls are made in total
def recurse(n):
    if n == 0:
        return
    else:
        pass  # some O(1) expressions
    recurse(n // c)
 
# This code is contributed by Pushpesh Raj

Javascript




// Recursive function
// Here c is a constant greater than 1; n shrinks by a constant
// factor on every call, so O(Log n) calls are made in total
function recurse(n)
{
    if (n == 0)
        return;
    else {
        // some O(1) expressions
    }
    recurse(Math.floor(n / c));
}

Example: Binary Search (iterative implementation) has O(Log n) time complexity.
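
A minimal iterative binary search sketch in C (assuming a sorted array arr of length n; the names are chosen for this example): the search range is halved on every iteration, so the loop runs O(Log n) times.

C

// Iterative binary search on a sorted array arr[0..n-1].
// Each iteration halves the remaining range, so the loop executes
// O(Log n) times. Returns the index of x, or -1 if x is absent.
int binarySearch(int arr[], int n, int x)
{
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (arr[mid] == x)
            return mid;
        else if (arr[mid] < x)
            low = mid + 1;
        else
            high = mid - 1;
    }
    return -1;
}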

Double Logarithmic Time Complexity O(Log Log n):

The time complexity of a loop is considered O(Log Log n) if the loop variable is changed exponentially on each iteration, i.e., raised to a constant power when increasing or reduced to a constant root when decreasing.

C




// Here c is a constant greater than 1
for (int i = 2; i <= n; i = pow(i, c)) {
    // some O(1) expressions
}
// Here fun is sqrt or cuberoot or any other constant root
for (int i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}

Java




// Here c is a constant greater than 1
for (int i = 2; i <= n; i = (int) Math.pow(i, c)) {
    // some O(1) expressions
}
// Here fun is sqrt or cuberoot or any other constant root
for (int i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}
 
// This code is contributed by Utkarsh

Python3




# Here c is a constant greater than 1
i = 2
while(i <= n):
    # some O(1) expressions
    i = i**c
 
# Here fun is sqrt or cuberoot or any other constant root
i = n
while(i > 1):
    # some O(1) expressions
    i = fun(i)
 
# This code is contributed by Pushpesh Raj

Javascript




// Here c is a constant greater than 1
for (var i = 2; i <= n; i = i**c) {
    // some O(1) expressions
}
// Here fun is sqrt or cuberoot or any other constant root
for (var i = n; i > 1; i = fun(i)) {
    // some O(1) expressions
}

See this for mathematical details. 
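
In brief: starting from i = 2 and replacing i by i^c on every iteration, after k iterations i = 2^(c^k). The loop stops once 2^(c^k) exceeds n, i.e. once c^k exceeds Log2(n), which happens after about Log_c(Log2(n)) = O(Log Log n) iterations. The decreasing loop, which takes a constant root of i each time, is analysed the same way.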

How to combine the time complexities of consecutive loops? 

When there are consecutive loops, we calculate time complexity as a sum of the time complexities of individual loops. 

C




for (int i = 1; i <= m; i += c) {
    // some O(1) expressions
}
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
 
// Time complexity of above code is O(m) + O(n) which is O(m + n)
// If m == n, the time complexity becomes O(2n) which is O(n).

Java




for (int i = 1; i <= m; i += c) {
    // some O(1) expressions
}
for (int i = 1; i <= n; i += c) {
    // some O(1) expressions
}
   
// Time complexity of above code is O(m) + O(n) which is O(m + n)
// If m == n, the time complexity becomes O(2n) which is O(n).
 
// This code is contributed by Utkarsh

Python3




for i in range(1, m+1, c):
    pass  # some O(1) expressions

for i in range(1, n+1, c):
    pass  # some O(1) expressions

# Time complexity of above code is O(m) + O(n) which is O(m + n)
# If m == n, the time complexity becomes O(2n) which is O(n).

Javascript




for (var i = 1; i <= m; i += c) {
    // some O(1) expressions
}
for (var i = 1; i <= n; i += c) {
    // some O(1) expressions
}
 
// Time complexity of above code is O(m) + O(n) which is O(m + n)
// If m == n, the time complexity becomes O(2n) which is O(n).

  
How to calculate time complexity when there are many if, else statements inside loops? 

As discussed in the post on worst, average and best cases, the worst-case time complexity is the most useful of the three, so we consider the worst case: the situation in which the values in the if-else conditions cause the maximum number of statements to be executed. 
For example, consider the linear search function, where the worst case occurs when the element is present at the end or not present at all (a sketch is shown below). 
When the code is too complex to consider all if-else cases, we can get an upper bound by ignoring if-else and other complex control statements. 
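
A minimal linear search sketch in C (the names are chosen for this example) illustrating the worst case: when x is at the last position or not present at all, the condition inside the loop never succeeds early, the loop runs all n iterations, and the function takes O(n) time.

C

// Linear search: returns the index of x in arr[0..n-1], or -1 if absent.
// Best case: x is at index 0 and the loop body runs once.
// Worst case: x is last or absent and the loop runs n times -> O(n).
int linearSearch(int arr[], int n, int x)
{
    for (int i = 0; i < n; i++) {
        if (arr[i] == x)
            return i;
    }
    return -1;
}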

How to calculate the time complexity of recursive functions? 

The time complexity of a recursive function can be written as a mathematical recurrence relation. To calculate time complexity, we must know how to solve recurrences. We will soon be discussing recurrence-solving techniques as a separate post. 
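
For example, the recursive function shown earlier, which calls itself on n / c, satisfies the recurrence T(n) = T(n / c) + O(1), which solves to O(Log n). Similarly, Merge Sort satisfies T(n) = 2T(n / 2) + O(n), which solves to O(n Log n).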

Algorithms Cheat Sheet:

Algorithm         Best Case      Average Case    Worst Case
Selection Sort    O(n^2)         O(n^2)          O(n^2)
Bubble Sort       O(n)           O(n^2)          O(n^2)
Insertion Sort    O(n)           O(n^2)          O(n^2)
Tree Sort         O(n Log n)     O(n Log n)      O(n^2)
Radix Sort        O(dn)          O(dn)           O(dn)
Merge Sort        O(n Log n)     O(n Log n)      O(n Log n)
Heap Sort         O(n Log n)     O(n Log n)      O(n Log n)
Quick Sort        O(n Log n)     O(n Log n)      O(n^2)
Bucket Sort       O(n + k)       O(n + k)        O(n^2)
Counting Sort     O(n + k)       O(n + k)        O(n + k)

Quiz on Analysis of Algorithms 
For more details, please refer: Design and Analysis of Algorithms.

Please write comments if you find anything incorrect, or you want to share more information about the topic discussed above.

