Understanding Time Complexity with Simple Examples

Last Updated : 22 Jun, 2022
Many students get confused when trying to understand the concept of time complexity, so in this article we will explain it with a very simple example.

Q. Imagine a classroom of 100 students in which you gave your pen to one person. You have to find that pen without knowing to whom you gave it. 

Here are some ways to find the pen, along with the order (the big-O) of each approach.

  • O(n²): You go and ask the first person in the class whether he has the pen. You also ask this person about each of the other 99 students in the classroom ("Does student 2 have it? Does student 3 have it? ..."), and then you repeat the whole process for every student. 
    This is what we call O(n²). 
  • O(n): Going and asking each student individually is O(n). 
  • O(log n): Now you divide the class into two groups and ask: "Is the pen on the left side, or the right side of the classroom?" Then you take that group, divide it into two, and ask again, and so on. Repeat the process till you are left with one student who has your pen. This is what we mean by O(log n). 

Depending on what the students know, I might need:

  • the O(n²) search if only one student knows which student the pen is hidden with (so I have to ask every student about every other student),
  • the O(n) search if only the student who has the pen knows it,
  • the O(log n) search if all the students know where the pen is, but will only tell me whether I guessed the correct half of the room. 

The O above is called Big-O, which is an asymptotic notation. There are other asymptotic notations, such as Theta (Θ) and Omega (Ω).

NOTE: We are interested in the rate of growth of the running time with respect to the size of the input given to the program.
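To connect the analogy to code, here is a minimal C sketch (not from the original article; the array of roll numbers is a made-up example) that contrasts the O(n) approach (ask every student one by one) with the O(log n) approach (repeatedly halve a sorted group):

C

#include <stdio.h>

// O(n): ask every "student" one by one.
int linearSearch(int arr[], int n, int target)
{
    for (int i = 0; i < n; i++)
        if (arr[i] == target)
            return i;
    return -1;
}

// O(log n): keep halving the (sorted) group until the target is found.
int binarySearch(int arr[], int n, int target)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (arr[mid] == target)
            return mid;
        else if (arr[mid] < target)
            lo = mid + 1;
        else
            hi = mid - 1;
    }
    return -1;
}

int main()
{
    int rollNumbers[] = { 3, 8, 15, 23, 42, 56, 78, 91 }; // sorted
    int n = sizeof(rollNumbers) / sizeof(rollNumbers[0]);
    printf("Linear search found index %d\n", linearSearch(rollNumbers, n, 42));
    printf("Binary search found index %d\n", binarySearch(rollNumbers, n, 42));
    return 0;
}

Both calls return index 4 here; the difference is how many elements each approach has to look at as n grows.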

Is the Time Complexity of an Algorithm/Code the same as the Running/Execution Time of Code?

The Time Complexity of an algorithm/code is not equal to the actual time required to execute that code; rather, it is the number of times each statement executes. We can demonstrate this by using the time command.

For example: Write code in C/C++ or any other language to find the maximum among N numbers, where N varies over 10, 100, 1000, and 10000. On a Linux-based operating system (Fedora or Ubuntu), use the commands below: 

To compile the program: gcc program.c -o program
To execute the program: time ./program
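For instance, the program to be timed could look like the following C sketch (the use of rand() to fill the array and the fixed macro N are illustrative assumptions; vary N across 10, 100, 1000, and 10000 and rebuild):

C

#include <stdio.h>
#include <stdlib.h>

#define N 10000 // change to 10, 100, 1000, 10000 and recompile

int main()
{
    int *a = malloc(N * sizeof(int));
    if (a == NULL)
        return 1;

    // Fill the array with arbitrary values.
    for (int i = 0; i < N; i++)
        a[i] = rand();

    // Single pass to find the maximum of the N numbers.
    int max = a[0];
    for (int i = 1; i < N; i++)
        if (a[i] > max)
            max = a[i];

    printf("Maximum of %d numbers: %d\n", N, max);
    free(a);
    return 0;
}

Compile it with gcc program.c -o program and run time ./program for each value of N.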

You will get surprising results, i.e.:

  • For N = 10, you may get 0.5 ms, 
  • For N = 10,000, you may get 0.2 ms. 
  • You will also get different timings on different machines, and you may not even get the same timing on the same machine for the same code; the reason behind that is the current load on the machine, such as the network load.

So, we can say that the actual time required to execute code is machine-dependent (whether you are using a Pentium 1 or a Pentium 5) and also depends on the network load if your machine is on a LAN/WAN. 

What is meant by the Time Complexity of an Algorithm?

Now the question arises: if time complexity is not the actual time required to execute the code, then what is it? 

The answer is: 

Instead of measuring the actual time required to execute each statement in the code, time complexity considers how many times each statement executes. 

Example 1: Consider the below simple code to print Hello World

C++




#include <iostream>
using namespace std;
 
int main()
{
    cout << "Hello World";
    return 0;
}
 
// This code is contributed by vikash36905.

C




#include <stdio.h>
 
int main()
{
    printf("Hello World");
    return 0;
}

Java




import java.io.*;
 
class GFG {
    public static void main(String[] args)
    {
        System.out.print("Hello World");
    }
}
 
// This code is contributed by vikash36905.

Output

Hello World

Time Complexity: In the above code, “Hello World” is printed only once on the screen. 
So, the time complexity is constant: O(1), i.e. a constant amount of time is required to execute the code every time, no matter which operating system or machine configuration you are using. 

Example 2:

C++




#include <iostream>
using namespace std;
 
int main()
{
 
    int i, n = 8;
    for (i = 1; i <= n; i++) {
        cout << "Hello World !!!\n";
    }
    return 0;
}
 
// This code is contributed by vikash36905.

C




#include <stdio.h>
int main()
{
    int i, n = 8;
    for (i = 1; i <= n; i++) {
        printf("Hello World !!!\n");
    }
}

Java




class GFG {
 
    public static void main(String[] args)
    {
        int i, n = 8;
        for (i = 1; i <= n; i++) {
            System.out.printf("Hello World !!!\n");
        }
    }
}
 
// This code is contributed by Rajput-Ji

C#




using System;
public class GFG {
 
    public static void Main(String[] args)
    {
        int i, n = 8;
        for (i = 1; i <= n; i++) {
            Console.Write("Hello World !!!\n");
        }
    }
}
 
// This code contributed by Rajput-Ji

Javascript




<script>
 
        var i, n = 8;
        for (i = 1; i <= n; i++) {
            document.write("Hello World !!!<br/>");
        }
 
// This code is contributed by Rajput-Ji
</script>

Output

Hello World !!!
Hello World !!!
Hello World !!!
Hello World !!!
Hello World !!!
Hello World !!!
Hello World !!!
Hello World !!!

Time Complexity: In the above code, “Hello World !!!” is printed n times on the screen, where n can vary. 
So, the time complexity is linear: O(n), i.e. the time required to execute the code grows linearly with n.

Example 3:

C++




#include <iostream>
using namespace std;
 
int main()
{
 
    int i, n = 8;
    for (i = 1; i <= n; i=i*2) {
        cout << "Hello World !!!\n";
    }
    return 0;
}
 
// This code is contributed by Suruchi Kumari

C




#include <stdio.h>
int main()
{
    int i, n = 8;
    for (i = 1; i <= n; i=i*2) {
        printf("Hello World !!!\n");
    }
}
// This code is contributed by Suruchi Kumari

Java




class GFG {
 
    public static void main(String[] args)
    {
        int i, n = 8;
        for (i = 1; i <= n; i=i*2) {
            System.out.printf("Hello World !!!\n");
        }
    }
}
 
// This code is contributed by Suruchi Kumari

Output

Hello World !!!
Hello World !!!
Hello World !!!
Hello World !!!

Time Complexity: O(log2(n)). Here i doubles on every iteration (1, 2, 4, 8, ...), so after k iterations i = 2^k; the loop stops once i exceeds n, i.e. after about log2(n) iterations.

Example 4:

C++




#include <iostream>
#include <cmath>
using namespace std;
 
int main()
{
 
    int i, n = 8;
    for (i = 2; i <= n; i=pow(i,2)) {
        cout << "Hello World !!!\n";
    }
    return 0;
}
 
// This code is contributed by Suruchi Kumari

C




#include <stdio.h>
#include <math.h>
int main()
{
    int i, n = 8;
    for (i = 2; i <= n; i=pow(i,2)) {
        printf("Hello World !!!\n");
    }
}
// This code is contributed by Suruchi Kumari

Output

Hello World !!!
Hello World !!!

Time Complexity: O(log(log n)). Here i is squared on every iteration (2, 4, 16, 256, ...), i.e. after k iterations i = 2^(2^k), so the loop runs only about log(log n) times before i exceeds n.

How To Find The Time Complexity Of An Algorithm?

Now let us see some other examples and the process to find the time complexity of an algorithm:

Example: Let us consider a model machine that has the following specifications: 

  • Single processor 
  • 32 bit 
  • Sequential execution 
  • 1 unit time for arithmetic and logical operations 
  • 1 unit time for assignment and return statements 

Q1. Find the Sum of 2 numbers on the above machine:

For any machine, the pseudocode to add two numbers will be something like this:

C




Pseudocode : Sum(a, b) { return a + b }

Time Complexity:  

  • The above code will take 2 units of time (constant): 
    • one for the arithmetic operation and 
    • one for the return statement (as per the above conventions). 
  • Therefore, the total cost to perform the sum operation (Tsum) = 1 + 1 = 2
  • Time Complexity = O(2) = O(1), since 2 is a constant (see the sketch below)
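For completeness, a direct C translation of the Sum pseudocode might look like this (a sketch, not part of the original article):

C

#include <stdio.h>

// One arithmetic operation (a + b) and one return statement: 2 units of time.
int sum(int a, int b)
{
    return a + b;
}

int main()
{
    printf("%d\n", sum(3, 4)); // prints 7
    return 0;
}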

Q2. Find the sum of all elements of a list/array

The pseudocode to do so can be given as:

C




Pseudocode : list_Sum(A, n)
// A->array and
// n->number of elements in array
{
sum = 0          
for i = 0 to n-1   
    sum = sum + A[i]
return sum
}

To understand the time complexity of the above code, let’s see how much time each statement will take:

C




Pseudocode : list_Sum(A, n)
{
sum = 0                    // cost=1  no of times=1
for i=0 to n-1             // cost=2  no of times=n+1 (+1 for the final false condition)
    sum = sum + A[i]       // cost=2  no of times=n
return sum                 // cost=1  no of times=1
}

Therefore, the total cost to perform the sum operation is

Tsum = 1 + 2*(n+1) + 2*n + 1 = 4n + 4 = C1*n + C2 = O(n)

Therefore, the time complexity of the above code is O(n).
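A runnable C version of the list_Sum pseudocode (a sketch; the sample array is made up) could be:

C

#include <stdio.h>

// Sum of the n elements of A: the loop body runs n times, so the cost grows as O(n).
int listSum(int A[], int n)
{
    int sum = 0;
    for (int i = 0; i < n; i++)
        sum = sum + A[i];
    return sum;
}

int main()
{
    int A[] = { 5, 1, 4, 2, 3 };
    printf("%d\n", listSum(A, 5)); // prints 15
    return 0;
}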

Q3. Find the sum of all elements of a matrix

For this one, the complexity is a polynomial (a quadratic for a square matrix):

  • For a matrix of size n*n: Tsum = a·n² + b·n + c
  • Since Tsum is of the order of n², the time complexity is O(n²) (a code sketch follows below)
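As a sketch of why the matrix case is quadratic (assuming a square n×n matrix stored as a 2-D array; the 3×3 matrix is a made-up example), the C version below uses two nested loops, so the addition statement executes n·n times:

C

#include <stdio.h>

#define N 3 // size of the square matrix

// Sum of all elements of an N x N matrix: the inner statement runs N*N times, so O(n^2).
int matrixSum(int M[N][N])
{
    int sum = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum = sum + M[i][j];
    return sum;
}

int main()
{
    int M[N][N] = { { 1, 2, 3 }, { 4, 5, 6 }, { 7, 8, 9 } };
    printf("%d\n", matrixSum(M)); // prints 45
    return 0;
}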

So, from the above examples, we can conclude that the running time increases with the size of the input, at a rate determined by the kind of operations the code performs on that input.
 

How to Compare Algorithms?

To compare algorithms, let us define a few objective measures:

  • Execution times: Not a good measure as execution times are specific to a particular computer.
  • The number of statements executed: Not a good measure, since the number of statements varies with the programming language as well as the style of the individual programmer.
  • Ideal solution:  Let us assume that we express the running time of a given algorithm as a function of the input size n (i.e., f(n)) and compare these different functions corresponding to running times. This kind of comparison is independent of machine time, programming style, etc. 
    Therefore, an ideal solution can be used to compare algorithms.
