Analysis of Algorithms

Question 1
What is the recurrence for the worst case of QuickSort, and what is the worst-case time complexity?
A
Recurrence is T(n) = T(n-2) + O(n) and time complexity is O(n^2)
B
Recurrence is T(n) = T(n-1) + O(n) and time complexity is O(n^2)
C
Recurrence is T(n) = 2T(n/2) + O(n) and time complexity is O(nLogn)
D
Recurrence is T(n) = T(n/10) + T(9n/10) + O(n) and time complexity is O(nLogn)


Question 1 Explanation: 
The worst case of QuickSort occurs when the picked pivot is always one of the corner (smallest or largest) elements of the current subarray. In that case, QuickSort recursively calls one subproblem of size 0 and another of size (n-1), so the recurrence is T(n) = T(n-1) + T(0) + O(n), which can be rewritten as T(n) = T(n-1) + O(n) and solves to O(n^2).

void exchange(int *a, int *b)
{
    int temp = *a;
    *a = *b;
    *b = temp;
}

/* Lomuto partition: uses arr[ei] as the pivot, places it at its
   final sorted position and returns that position. */
int partition(int arr[], int si, int ei)
{
    int x = arr[ei];
    int i = (si - 1);
    int j;
    for (j = si; j <= ei - 1; j++)
    {
        if (arr[j] <= x)
        {
            i++;
            exchange(&arr[i], &arr[j]);
        }
    }
    exchange(&arr[i + 1], &arr[ei]);
    return (i + 1);
}

/* Implementation of Quick Sort
   arr[] --> Array to be sorted
   si    --> Starting index
   ei    --> Ending index */
void quickSort(int arr[], int si, int ei)
{
    int pi; /* Partitioning index */
    if (si < ei)
    {
        pi = partition(arr, si, ei);
        quickSort(arr, si, pi - 1);
        quickSort(arr, pi + 1, ei);
    }
}
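A quick way to see the O(n^2) bound is to unroll the worst-case recurrence (a sketch of the standard telescoping argument in LaTeX notation, where c is the constant hidden in the O(n) term):

\begin{aligned}
T(n) &= T(n-1) + cn \\
     &= T(n-2) + c(n-1) + cn \\
     &\;\vdots \\
     &= T(0) + c(1 + 2 + \dots + n) = \Theta(n^2)
\end{aligned}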
Question 2
Suppose we have an O(n) time algorithm that finds the median of an unsorted array. Now consider a QuickSort implementation where we first find the median using the above algorithm, then use the median as the pivot. What will be the worst case time complexity of this modified QuickSort?
A
O(n^2 Logn)
B
O(n^2)
C
O(n Logn Logn)
D
O(nLogn)


Question 2 Explanation: 
If we use the median as the pivot element, then the recurrence for all cases becomes T(n) = 2T(n/2) + O(n), where the O(n) term covers both finding the median and partitioning. The recurrence can be solved using the Master Method; it falls in case 2 and gives O(nLogn). A minimal sketch is given below.
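The sketch below assumes the premise of the question. The helper findMedianIndex is only a naive placeholder (a Theta(n^2) rank scan) standing in for the assumed O(n) median algorithm such as median-of-medians; the names medianQuickSort, partitionAround and findMedianIndex are illustrative, not from the original text.

static void swapInt(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Placeholder median finder: returns the index of the lower median of
   arr[lo..hi] by counting ranks. The question assumes an O(n) algorithm
   (e.g. median-of-medians) instead of this Theta(n^2) scan. */
int findMedianIndex(int arr[], int lo, int hi)
{
    int k = (hi - lo) / 2;   /* rank of the lower median within [lo, hi] */
    int i, j;
    for (i = lo; i <= hi; i++) {
        int rank = 0;
        for (j = lo; j <= hi; j++)
            if (arr[j] < arr[i] || (arr[j] == arr[i] && j < i))
                rank++;
        if (rank == k)
            return i;
    }
    return lo;   /* not reached for a valid range */
}

/* Lomuto partition around the value at index p; returns its final index. */
int partitionAround(int arr[], int lo, int hi, int p)
{
    int i = lo - 1, j;
    swapInt(&arr[p], &arr[hi]);          /* move the pivot to the end */
    for (j = lo; j < hi; j++)
        if (arr[j] <= arr[hi])
            swapInt(&arr[++i], &arr[j]);
    swapInt(&arr[i + 1], &arr[hi]);
    return i + 1;
}

/* With the median as pivot (and distinct keys), each recursive call gets
   at most about n/2 elements, so T(n) = 2T(n/2) + O(n) = O(nLogn). */
void medianQuickSort(int arr[], int lo, int hi)
{
    if (lo < hi) {
        int m  = findMedianIndex(arr, lo, hi);
        int pi = partitionAround(arr, lo, hi, m);
        medianQuickSort(arr, lo, pi - 1);
        medianQuickSort(arr, pi + 1, hi);
    }
}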
Question 3
Given an unsorted array with the property that every element is at most k positions away from its position in the sorted array, where k is a positive integer smaller than the size of the array. Which sorting algorithm can be easily modified to sort this array, and what is the obtainable time complexity?
A
Insertion Sort with time complexity O(kn)
B
Heap Sort with time complexity O(nLogk)
C
Quick Sort with time complexity O(kLogk)
D
Merge Sort with time complexity O(kLogk)


Question 3 Explanation: 
Maintain a min-heap of the first k+1 elements; the minimum of this window must be the overall smallest remaining element, so repeatedly extract it and insert the next array element. Each heap operation costs O(Logk), so the total time is O(nLogk). See http://www.geeksforgeeks.org/nearly-sorted-algorithm/ for a full explanation and implementation; a sketch is given below.
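A minimal sketch of the heap-based approach, assuming a C99 compiler (variable-length array); names such as sortK and minHeapify are illustrative, not from the linked article.

static void swapHeap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

/* Restore the min-heap property for the subtree rooted at i in heap[0..size-1]. */
static void minHeapify(int heap[], int size, int i)
{
    int smallest = i;
    int left = 2 * i + 1, right = 2 * i + 2;
    if (left < size && heap[left] < heap[smallest])
        smallest = left;
    if (right < size && heap[right] < heap[smallest])
        smallest = right;
    if (smallest != i) {
        swapHeap(&heap[i], &heap[smallest]);
        minHeapify(heap, size, smallest);
    }
}

/* Sort arr[0..n-1] assuming every element is at most k positions away
   from its sorted position. Runs in O(nLogk) time and O(k) extra space. */
void sortK(int arr[], int n, int k)
{
    if (n <= 0)
        return;
    int heapSize = (k + 1 < n) ? k + 1 : n;
    int heap[heapSize];                       /* C99 variable-length array */
    int i, next, target = 0;

    for (i = 0; i < heapSize; i++)            /* window of the first k+1 elements */
        heap[i] = arr[i];
    for (i = heapSize / 2 - 1; i >= 0; i--)   /* build the min-heap */
        minHeapify(heap, heapSize, i);

    for (next = heapSize; next < n; next++) {
        arr[target++] = heap[0];              /* smallest element of the window */
        heap[0] = arr[next];                  /* slide the window forward */
        minHeapify(heap, heapSize, 0);
    }
    while (heapSize > 0) {                    /* drain the remaining heap */
        arr[target++] = heap[0];
        heap[0] = heap[--heapSize];
        minHeapify(heap, heapSize, 0);
    }
}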
Question 4
Which of the following is not true about comparison based sorting algorithms?
A
The minimum possible time complexity of a comparison based sorting algorithm is O(nLogn) for a random input array
B
Any comparison based sorting algorithm can be made stable by using position as a criterion when two elements are compared
C
Counting Sort is not a comparison based sorting algorithm
D
Heap Sort is not a comparison based sorting algorithm.


Question 4 Explanation: 
Heap Sort is a comparison based sorting algorithm, so statement D is false. The other statements are true: the nLogn lower bound applies to comparison based sorting, Counting Sort is not comparison based, and any comparison based sort can be made stable by breaking ties using the original positions of the elements.
Question 5
What is the time complexity of fun()?
int fun(int n)
{
  int count = 0;
  for (int i = n; i > 0; i /= 2)
     for (int j = 0; j < i; j++)
        count += 1;
  return count;
}
A
O(n^2)
B
O(nLogn)
C
O(n)
D
O(nLognLogn)


Question 5 Explanation: 
For an input integer n, the innermost statement of fun() is executed n + n/2 + n/4 + ... + 1 times. So the time complexity T(n) can be written as T(n) = O(n + n/2 + n/4 + ... + 1) = O(n). The final value of count is also n + n/2 + n/4 + ... + 1.
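The sum is bounded by a geometric series (a short worked bound, in LaTeX notation):

n + \frac{n}{2} + \frac{n}{4} + \dots + 1 \;\le\; n \sum_{i=0}^{\infty} \frac{1}{2^i} \;=\; 2n \;=\; O(n)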
Question 6
What is the time complexity of fun()?
int fun(int n)
{
  int count = 0;
  for (int i = 0; i < n; i++)
     for (int j = i; j > 0; j--)
        count = count + 1;
  return count;
} 
A
Theta (n)
B
Theta (n^2)
C
Theta (n*Logn)
D
Theta (nLognLogn)


Question 6 Explanation: 
The time complexity can be calculated by counting the number of times the expression "count = count + 1;" is executed. The expression is executed 0 + 1 + 2 + 3 + 4 + ... + (n-1) times. Time complexity = Theta(0 + 1 + 2 + 3 + ... + (n-1)) = Theta(n*(n-1)/2) = Theta(n^2).
Question 7
The recurrence relation capturing the optimal time of the Tower of Hanoi problem with n discs is: (GATE CS 2012)
A
T(n) = 2T(n – 2) + 2
B
T(n) = 2T(n – 1) + n
C
T(n) = 2T(n/2) + 1
D
T(n) = 2T(n – 1) + 1


Question 7 Explanation: 
Following are the steps to solve the Tower of Hanoi problem recursively.
Let the three pegs be A, B and C. The goal is to move n discs from A to C.
To move n discs from peg A to peg C:
    move n-1 discs from A to B. This leaves disc n alone on peg A
    move disc n from A to C
    move n-1 discs from B to C so they sit on disc n
The recurrence T(n) for the time complexity of this recursive solution can be written as T(n) = 2T(n-1) + 1, as sketched in the code below.
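A minimal recursive sketch in C; the function name hanoi and the peg labels are illustrative. Printing a move is the constant work that contributes the "+1" term, and the two recursive calls contribute the 2T(n-1) term.

#include <stdio.h>

/* Move n discs from peg 'from' to peg 'to', using 'aux' as the spare peg.
   The number of moves M(n) satisfies M(n) = 2M(n-1) + 1, i.e. M(n) = 2^n - 1. */
void hanoi(int n, char from, char to, char aux)
{
    if (n == 0)
        return;
    hanoi(n - 1, from, aux, to);                          /* clear the n-1 smaller discs */
    printf("Move disc %d from %c to %c\n", n, from, to);  /* move the largest disc */
    hanoi(n - 1, aux, to, from);                          /* restack the n-1 discs on top */
}

For example, hanoi(3, 'A', 'C', 'B') prints the 7 moves needed for three discs.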
Question 8
Let W(n) and A(n) denote, respectively, the worst case and average case running time of an algorithm executed on an input of size n. Which of the following is ALWAYS TRUE? (GATE CS 2012)
(A) A(n) = Omega(W(n))
(B) A(n) = Theta(W(n))
(C) A(n) = O(W(n))
(D) A(n) = o(W(n))
A
A
B
B
C
C
D
D


Question 8 Explanation: 
The worst case running time is always greater than or equal to the average case running time, so A(n) = O(W(n)) always holds. The other relations need not hold: QuickSort has A(n) = Theta(nLogn) and W(n) = Theta(n^2), so A(n) is not Omega(W(n)), while Selection Sort has A(n) = W(n) = Theta(n^2), so A(n) is not o(W(n)).
Question 9
Which of the following is not O(n^2)?
A
(15^10) * n + 12099
B
n^1.98
C
n^3 / (sqrt(n))
D
(2^20) * n


Question 9 Explanation: 
The order of growth of option C is n^3 / sqrt(n) = n^(3 - 0.5) = n^2.5, which is higher than n^2. The other three options are O(n^2): options A and D are linear in n (the constants 15^10 and 2^20 do not affect the order of growth), and n^1.98 grows slower than n^2.
Question 10
Which of the given options provides the increasing order of asymptotic complexity of functions f1, f2, f3 and f4?
  f1(n) = 2^n
  f2(n) = n^(3/2)
  f3(n) = nLogn
  f4(n) = n^(Logn)
A
f3, f2, f4, f1
B
f3, f2, f1, f4
C
f2, f3, f1, f4
D
f2, f3, f4, f1


Question 10 Explanation: 
f3 = nLogn and f2 = n^(3/2) are polynomially bounded, and nLogn grows slower than n^(3/2), so f3 comes first and f2 second. Both f1 = 2^n and f4 = n^(Logn) grow faster than any polynomial. One way to compare f1 and f4 is to take the Log of both functions: the order of growth of Log(f1(n)) is Θ(n), while that of Log(f4(n)) is Θ(Logn * Logn). Since Θ(n) has higher growth than Θ(Logn * Logn), f1(n) grows faster than f4(n). Plugging in a few values shows the same trend:
n = 32, f1 = 2^32, f4 = 32^5 = 2^25
n = 64, f1 = 2^64, f4 = 64^6 = 2^36
Also see http://www.wolframalpha.com/input/?i=2^n+vs+n^%28log+n%29 Thanks to fella26 for suggesting the above explanation.