Top MCQs on Complexity Analysis of Algorithms with Answers

Complexity analysis is a technique for characterising the time taken by an algorithm as a function of its input size, independent of the machine, programming language, and compiler. It is used to compare how the running times of different algorithms grow as the input size grows.

Question 1
What is the recurrence for the worst case of QuickSort, and what is its worst-case time complexity?
Cross
Recurrence is T(n) = T(n-2) + O(n) and time complexity is O(n^2)
Tick
Recurrence is T(n) = T(n-1) + O(n) and time complexity is O(n^2)
Cross
Recurrence is T(n) = 2T(n/2) + O(n) and time complexity is O(nLogn)
Cross
Recurrence is T(n) = T(n/10) + T(9n/10) + O(n) and time complexity is O(nLogn)


Question 1-Explanation: 
The worst case of QuickSort occurs when the picked pivot is always an extreme (smallest or largest) element of the current subarray; with the last element as pivot, as in the code below, this happens when the input array is already sorted. In the worst case, QuickSort recursively calls one subproblem of size 0 and another of size (n-1), so the recurrence is T(n) = T(n-1) + T(0) + O(n), which can be rewritten as T(n) = T(n-1) + O(n).

C

void exchange(int *a, int *b)
{
    int temp;
    temp = *a;
    *a = *b;
    *b = temp;
}

int partition(int arr[], int si, int ei)
{
    int x = arr[ei];
    int i = (si - 1);
    int j;
    for (j = si; j <= ei - 1; j++)
    {
        if (arr[j] <= x)
        {
            i++;
            exchange(&arr[i], &arr[j]);
        }
    }
    exchange(&arr[i + 1], &arr[ei]);
    return (i + 1);
}

/* Implementation of Quick Sort
   arr[] --> Array to be sorted
   si    --> Starting index
   ei    --> Ending index */
void quickSort(int arr[], int si, int ei)
{
    int pi; /* Partitioning index */
    if (si < ei)
    {
        pi = partition(arr, si, ei);
        quickSort(arr, si, pi - 1);
        quickSort(arr, pi + 1, ei);
    }
}
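Expanding this recurrence (writing the O(n) term as c*n for some constant c) shows why the bound is quadratic:

T(n) = c*n + T(n-1)
     = c*n + c*(n-1) + T(n-2)
     = ...
     = c*(n + (n-1) + ... + 1)
     = O(n^2)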
Question 2

Suppose we have an O(n) time algorithm that finds the median of an unsorted array. Now consider a QuickSort implementation where we first find the median using the above algorithm, then use the median as a pivot. What will be the worst-case time complexity of this modified QuickSort?

Cross

O(n^2 Logn)

Cross

O(n^2)

Cross

O(n Logn Logn)

Tick

O(nLogn)



Question 2-Explanation: 

If we use the median as the pivot element, then the recurrence for all cases becomes T(n) = 2T(n/2) + O(n), where the O(n) term accounts for both finding the median and partitioning the array.

The above recurrence can be solved using the Master method; it falls under case 2.
So, the worst-case time complexity of this modified QuickSort is O(nLogn). 
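For reference, writing the recurrence in the standard Master-method form T(n) = a*T(n/b) + f(n) gives a = 2, b = 2 and f(n) = Theta(n). Since n^(log_b(a)) = n^(log_2(2)) = n matches f(n), case 2 applies:

T(n) = Theta(n^(log_b(a)) * log(n)) = Theta(n*log(n))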

Question 3

Given an unsorted array with the property that every element is at most k positions away from its position in the sorted array, where k is a positive integer smaller than the size of the array. Which sorting algorithm can be easily modified to sort this array, and what is the obtainable time complexity?

Cross

Insertion Sort with time complexity O(kn)

Tick

Heap Sort with time complexity O(nLogk)

Cross

Quick Sort with time complexity O(kLogk)

Cross

Merge Sort with time complexity O(kLogk)



Question 3-Explanation: 

We can do this in O(n*log(k)) time using a heap:

First, create a min-heap with the first k+1 elements. Since every element is at most k positions away from its sorted position, the smallest element of the whole array must be among these k+1 elements. Remove the smallest element from the min-heap (its root) and put it in the result array, then insert the next element from the unsorted array into the min-heap; the root is now the second smallest element. Extract it and continue this way until no more elements are left in the unsorted array. Finally, extract the remaining heap elements in sorted order, as in an ordinary heap sort (a C sketch of the whole procedure is given at the end of this explanation).

Time Complexity:

O(k) to build the initial min-heap
O((n-k)*log(k)) for the remaining elements

Thus we get O(n*log(k)). Hence, option (B) is correct.
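As an illustration of the procedure above, here is a minimal C sketch (the helper names sortK, minHeapify and swapInt are chosen for this example and are not from the original text). It builds a min-heap from the first k+1 elements, then repeatedly writes the heap's minimum to the next output position while feeding in the next unread element:

C

#include <stdio.h>

/* Swap two integers (helper for this sketch). */
static void swapInt(int *a, int *b)
{
    int t = *a;
    *a = *b;
    *b = t;
}

/* Restore the min-heap property for the subtree rooted at index i. */
static void minHeapify(int heap[], int size, int i)
{
    int smallest = i;
    int left = 2 * i + 1;
    int right = 2 * i + 2;
    if (left < size && heap[left] < heap[smallest])
        smallest = left;
    if (right < size && heap[right] < heap[smallest])
        smallest = right;
    if (smallest != i)
    {
        swapInt(&heap[i], &heap[smallest]);
        minHeapify(heap, size, smallest);
    }
}

/* Sort arr[0..n-1], assuming every element is at most k positions away
   from its sorted position. heap[] must have room for k+1 integers.
   Runs in O(n*log(k)) time. */
void sortK(int arr[], int n, int k, int heap[])
{
    int heapSize = (k + 1 < n) ? (k + 1) : n;
    int i, next, target;

    /* Build a min-heap from the first k+1 elements: O(k). */
    for (i = 0; i < heapSize; i++)
        heap[i] = arr[i];
    for (i = heapSize / 2 - 1; i >= 0; i--)
        minHeapify(heap, heapSize, i);

    /* Repeatedly move the heap's minimum to the next output position
       and insert the next unread element: O(n*log(k)) overall. */
    next = heapSize;
    for (target = 0; target < n; target++)
    {
        arr[target] = heap[0];            /* smallest remaining element */
        if (next < n)
        {
            heap[0] = arr[next++];        /* replace the root with the next element */
        }
        else
        {
            heap[0] = heap[heapSize - 1]; /* no more input: shrink the heap */
            heapSize--;
        }
        if (heapSize > 0)
            minHeapify(heap, heapSize, 0);
    }
}

int main(void)
{
    int arr[] = {6, 5, 3, 2, 8, 10, 9}; /* every element is at most 3 away */
    int heap[4];                        /* k + 1 slots */
    int n = sizeof(arr) / sizeof(arr[0]);
    int i;

    sortK(arr, n, 3, heap);
    for (i = 0; i < n; i++)
        printf("%d ", arr[i]);
    printf("\n");
    return 0;
}

Since the heap never holds more than k+1 elements, each minHeapify call costs O(log(k)), which gives the overall O(n*log(k)) bound.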

Question 4

Which of the following is not true about comparison-based sorting algorithms?

Cross

The minimum possible time complexity of a comparison-based sorting algorithm is O(n*log(n)) for a random input array

Cross

Any comparison-based sorting algorithm can be made stable by using position as a criterion when two elements are compared

Cross

Counting Sort is not a comparison based sorting algorithm

Tick

Heap Sort is not a comparison based sorting algorithm.



Question 4-Explanation: 

Heap Sort is a comparison-based sorting algorithm: heapify works by comparing elements. So the statement "Heap Sort is not a comparison based sorting algorithm" is the one that is not true; the other three statements are true.

Question 5

What is the time complexity of fun()? 

C

int fun(int n)
{
    int count = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            count += 1;
    return count;
}
Cross

O(n^2)

Tick

O(n*log(n))

Cross

O(n)

Cross

O(n*log(n*log(n)))



Question 5-Explanation: 

For an input integer n:

the outer loop of fun() runs O(log(n)) times, since i is halved on each iteration;

for each value of i, the inner loop runs i times, which is at most n per outer iteration.

So the time complexity T(n) can be bounded as T(n) = O(log(n)) * O(n) = O(n*log(n)).

The value returned in count is the total number of inner-loop iterations: n + n/2 + n/4 + ... + 1.

Question 6

What is the time complexity of fun()? 

C

int fun(int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i; j > 0; j--)
            count = count + 1;
    return count;
}
Cross

Theta (n)

Tick

Theta (n^2)

Cross

Theta (n*log(n))

Cross

Theta (n*(log(n*log(n))))



Question 6-Explanation: 

The time complexity can be calculated by counting the number of times the expression "count = count + 1;" is executed. The expression is executed 0 + 1 + 2 + 3 + 4 + .... + (n-1) times.

Time complexity = Theta(0 + 1 + 2 + 3 + .. + n-1) = Theta(n*(n-1)/2) = Theta(n^2)
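As a quick sanity check (this small harness is not part of the original explanation), the value returned by fun() can be compared with the closed form n*(n-1)/2 for small n:

C

#include <stdio.h>

/* Same function as in the question. */
int fun(int n)
{
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i; j > 0; j--)
            count = count + 1;
    return count;
}

int main(void)
{
    /* Compare the returned count with the closed form n*(n-1)/2. */
    for (int n = 1; n <= 10; n++)
        printf("n = %2d  fun(n) = %2d  n*(n-1)/2 = %2d\n",
               n, fun(n), n * (n - 1) / 2);
    return 0;
}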

Question 7

The recurrence relation capturing the optimal time of the Tower of Hanoi problem with n discs is: (GATE CS 2012)

Cross

T(n) = 2T(n - 2) + 2

Cross

T(n) = 2T(n - 1) + n

Cross

T(n) = 2T(n/2) + 1

Tick

T(n) = 2T(n - 1) + 1



Question 7-Explanation: 

Following are the steps to solve the Tower of Hanoi problem recursively.

Let the three pegs be A, B and C. The goal is to move n discs from A to C.
To move n discs from peg A to peg C:
    move n-1 discs from A to B. This leaves disc n alone on peg A
    move disc n from A to C
    move n-1 discs from B to C so they sit on disc n

The recurrence function T(n) for the time complexity of the above recursive solution can be written as follows. 
T(n) = 2T(n-1) + 1
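The steps above translate directly into a short recursive function (a minimal C sketch; the name towerOfHanoi is chosen for this example). Each call makes two recursive calls on n-1 discs plus one move, which is exactly the recurrence T(n) = 2T(n-1) + 1; solving it gives 2^n - 1 moves for n discs:

C

#include <stdio.h>

/* Move n discs from peg 'from' to peg 'to', using 'aux' as the spare peg. */
void towerOfHanoi(int n, char from, char to, char aux)
{
    if (n == 0)
        return;
    towerOfHanoi(n - 1, from, aux, to);                   /* T(n-1) */
    printf("Move disc %d from %c to %c\n", n, from, to);  /* + 1 move */
    towerOfHanoi(n - 1, aux, to, from);                   /* T(n-1) */
}

int main(void)
{
    towerOfHanoi(3, 'A', 'C', 'B'); /* prints 2^3 - 1 = 7 moves */
    return 0;
}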

Question 8

If O(n^2) is the worst-case time complexity of an algorithm, then, among the given options, which running times can it also represent?

Cross

O(n)

Cross

O(1)

Cross

O(nLogn)

Tick

All of the above



Question 8-Explanation: 

Big-O gives an upper bound on the growth rate. For n >= 1 we have 1 <= n^2, n <= n^2 and n*log(n) <= n^2, so any running time that is O(1), O(n) or O(nLogn) is also O(n^2). Hence, if the worst-case time complexity is O(n^2), all of the given options can be represented by it.

Question 9

Which of the following is not O(n^2)?

Cross

(15) * n^2

Cross

n^1.98

Tick

n^3/sqrt(n)

Cross

(20) * n^2



Question 9-Explanation: 

Option (C) is n^3/sqrt(n) = n^(3 - 1/2) = n^2.5, whose order of growth is higher than n^2, so it is not O(n^2). Options (A) and (D) are constant multiples of n^2, and n^1.98 grows more slowly than n^2, so each of those is O(n^2).

Question 10

Which of the given options provides the increasing order of asymptotic complexity of functions f1, f2, f3, and f4?

  f1(n) = 2^n
  f2(n) = n^(3/2)
  f3(n) = n*log(n)
  f4(n) = n^log(n)
Tick

f3, f2, f4, f1

Cross

f3, f2, f1, f4

Cross

f2, f3, f1, f4

Cross

f2, f3, f4, f1



Question 10-Explanation: 
  f1(n) = 2^n
  f2(n) = n^(3/2)
  f3(n) = n*log(n)
  f4(n) = n^log(n)

f3 = n*log(n) and f2 = n^(3/2) are polynomially bounded, while f4 = n^log(n) is super-polynomial and f1 = 2^n is exponential, so f3 and f2 come before f4 and f1. Among the first two, f3 grows more slowly: n^(3/2) = n*sqrt(n), and log(n) grows more slowly than sqrt(n), so f3 comes first and f2 second. One way to compare f1 and f4 is to take the log of both functions. The order of growth of log(f1(n)) is Θ(n) and the order of growth of log(f4(n)) is Θ(log(n) * log(n)). Since Θ(n) has higher growth than Θ(log(n) * log(n)), f1(n) grows faster than f4(n). This gives the increasing order f3, f2, f4, f1.

Another way to compare f1 and f4 is to evaluate both at a few values of n:

n = 32, f1 = 2^32, f4 = 32^5 = 2^25
n = 64, f1 = 2^64, f4 = 64^6 = 2^36
...............
............... 
