Understanding Time Complexity with Simple Examples

A lot of students get confused by the concept of time complexity, so in this article we will explain it with a very simple example:

Imagine a classroom of 100 students in which you gave your pen to one student. Now you want that pen back. Here are some ways to find the pen, and what the O order of each one is.

O(n²): You go and ask the first person in the class whether they have the pen. You also ask this person about each of the other 99 people in the classroom, whether they have that pen, and you then repeat the same questioning for every remaining student.
This is what we call O(n²).

O(n): Going and asking each student individually is O(n).

O(log n): Now I divide the class into two groups, then ask: “Is it on the left side, or the right side of the classroom?” Then I take that group, divide it into two, and ask again, and so on. I repeat the process until I am left with the one student who has my pen. This is what we mean by O(log n).



I might need to do the O(n²) search if only one student knows with which student the pen is hidden. I’d use the O(n) search if one student had the pen and only they knew it. I’d use the O(log n) search if all the students knew, but would only tell me whether I guessed the correct side.
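To connect the analogy to code, here is a minimal C sketch (my own illustration; it is not part of the original article) of the three strategies: a nested scan that asks every student about every student (O(n²)), a linear scan that asks each student in turn (O(n)), and a binary search that repeatedly halves the group, assuming the students are arranged as a sorted array of IDs (O(log n)).

#include <stdio.h>

/* O(n*n): ask each of the n students about every one of the n students. */
int quadratic_search(const int ids[], int n, int pen_holder)
{
    int found = -1;
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            if (ids[j] == pen_holder)   /* this check runs n * n times */
                found = j;
        }
    }
    return found;
}

/* O(n): ask each student in turn whether they hold the pen. */
int linear_search(const int ids[], int n, int pen_holder)
{
    for (int i = 0; i < n; i++) {
        if (ids[i] == pen_holder)
            return i;                   /* at most n questions */
    }
    return -1;
}

/* O(log n): repeatedly halve the group; requires ids[] to be sorted. */
int binary_search(const int ids[], int n, int pen_holder)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;
        if (ids[mid] == pen_holder)
            return mid;                 /* at most about log2(n) halvings */
        else if (ids[mid] < pen_holder)
            lo = mid + 1;               /* the pen is on the "right side" */
        else
            hi = mid - 1;               /* the pen is on the "left side" */
    }
    return -1;
}

int main(void)
{
    int ids[100];
    for (int i = 0; i < 100; i++)
        ids[i] = i;                     /* 100 students, numbered 0..99 (sorted) */

    printf("quadratic: %d\n", quadratic_search(ids, 100, 73));
    printf("linear:    %d\n", linear_search(ids, 100, 73));
    printf("binary:    %d\n", binary_search(ids, 100, 73));
    return 0;
}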

NOTE:

We are interested in the rate of growth of running time with respect to the size of the input given to the program.

Another Example

The time complexity of an algorithm/code is not equal to the actual time required to execute that code; it is the number of times a statement executes. We can demonstrate this by using the time command. For example, write code in C/C++ or any other language to find the maximum among N numbers, where N varies from 10 to 100, 1000, and 10000. Then compile and run that code on a Linux-based operating system (Fedora or Ubuntu) with the commands below:

gcc program.c -o program
time ./program
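For concreteness, here is one possible sketch of such a max-of-N program in C (my own illustration; fixing N in the code and filling the array with rand() are assumptions, since the article does not show the program itself):

#include <stdio.h>
#include <stdlib.h>

#define N 10000   /* change to 10, 100, 1000, 10000 and re-time the runs */

int main(void)
{
    int a[N];
    for (int i = 0; i < N; i++)
        a[i] = rand();              /* fill with arbitrary numbers */

    int max = a[0];
    for (int i = 1; i < N; i++)     /* one comparison per remaining element */
        if (a[i] > max)
            max = a[i];

    printf("max = %d\n", max);
    return 0;
}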

You will get surprising results, i.e. for N = 10 you may get 0.5 ms, and for N = 10,000 you may get 0.2 ms. Also, you will get different timings on different machines. So, we can say that the actual time required to execute code is machine dependent (whether you are using a Pentium 1 or a Pentium 5), and it is also affected by network load if your machine is on a LAN/WAN. You will not even get the same timings on the same machine for the same code, the reason being the current load on the machine and network.

Now, the question arises: if time complexity is not the actual time required to execute the code, then what is it?
The answer is: instead of measuring the actual time required to execute each statement in the code, we consider how many times each statement executes.
For example:


#include <stdio.h>
int main()
{
    printf("Hello World");
}



Output:

Hello World

In the above code, “Hello World” is printed only once on the screen. So, the time complexity is constant: O(1), i.e. a constant amount of time is required to execute the code every time, no matter which operating system or machine configuration you are using.
Now consider another code:


#include <stdio.h>
int main()
{
    int i, n = 8;
    for (i = 1; i <= n; i++) {
        printf("Hello World !!!");
    }
    return 0;
}



Output:



Hello World !!!Hello World !!!Hello World !!!Hello World !!!
Hello World !!!Hello World !!!Hello World !!!Hello World !!!

In the above code, “Hello World !!!” is printed n times (here n = 8). So, the time complexity of the above code is O(n).

Source : Reddit

ADDITIONAL INFORMATION :

For example:
Let us consider a model machine which has the following specifications:
- Single processor
- 32-bit
- Sequential execution
- 1 unit of time for arithmetic and logical operations
- 1 unit of time for assignment and return statements

1. Sum of 2 numbers:


Pseudocode:
Sum(a, b) {
    return a + b   // Takes 2 units of time (constant): one for the arithmetic operation and one for the return (as per the conventions above).  cost = 2, number of times = 1
}



Tsum = 2 = C = O(1)
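As a runnable counterpart (my own sketch; the article itself only gives pseudocode), the same function in C:

#include <stdio.h>

/* One arithmetic operation + one return: 2 units of time, regardless of input => O(1). */
int sum(int a, int b)
{
    return a + b;
}

int main(void)
{
    printf("%d\n", sum(3, 4));   /* prints 7 */
    return 0;
}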

2. Sum of all elements of a list:


Pseudocode:
list_Sum(A, n) {           // A -> array, n -> number of elements in the array
    sum = 0                // cost = 1, number of times = 1
    for i = 0 to n-1       // cost = 2, number of times = n+1 (+1 for the final false condition)
        sum = sum + A[i]   // cost = 2, number of times = n
    return sum             // cost = 1, number of times = 1
}



Tsum = 1 + 2 * (n+1) + 2 * n + 1 = 4n + 4 = C1 * n + C2 = O(n)
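A runnable C counterpart of this pseudocode (my own sketch; the cost annotations mirror the ones above):

#include <stdio.h>

/* Sums the n elements of A: 1 + 2*(n+1) + 2*n + 1 = 4n + 4 units of time => O(n). */
int list_sum(const int A[], int n)
{
    int sum = 0;                     /* cost = 1, runs 1 time */
    for (int i = 0; i < n; i++)      /* cost = 2, runs n+1 times (last check is false) */
        sum = sum + A[i];            /* cost = 2, runs n times */
    return sum;                      /* cost = 1, runs 1 time */
}

int main(void)
{
    int A[] = { 1, 2, 3, 4, 5 };
    printf("%d\n", list_sum(A, 5));  /* prints 15 */
    return 0;
}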

3. Sum of all elements of a matrix:

For this one, the running time is a polynomial in n (quadratic for a square matrix):
Matrix of size n x n => Tsum = a*n² + b*n + c



For this, Tsum is of the order of n², i.e. O(n²).
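A runnable C sketch of the matrix case (my own illustration), where the nested loops make the innermost statement execute n * n times:

#include <stdio.h>

#define N 3

/* The innermost addition runs N*N times, so Tsum = a*N^2 + b*N + c => O(N^2). */
int matrix_sum(const int M[N][N])
{
    int sum = 0;
    for (int i = 0; i < N; i++) {        /* outer loop check: runs N+1 times */
        for (int j = 0; j < N; j++) {    /* inner loop check: runs N*(N+1) times in total */
            sum = sum + M[i][j];         /* runs N*N times */
        }
    }
    return sum;
}

int main(void)
{
    int M[N][N] = { { 1, 2, 3 },
                    { 4, 5, 6 },
                    { 7, 8, 9 } };
    printf("%d\n", matrix_sum(M));       /* prints 45 */
    return 0;
}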

The pseudocode snippets above will not run in an IDE, as they are pseudocode and do not correspond to any particular programming language.

So from the above, we can conclude that the execution time grows with the size of the input, at a rate determined by the operations we perform on that input.

The O above is called Big-O, which is an asymptotic notation. There are other asymptotic notations, such as Theta and Omega.

For more, you can read about asymptotic notations.

The additional information in this article was provided by Pathange Balaji Rao.


