Question 1: What is the asymptotic boundary of T(n)?
 θ( n*log(n) )
 θ( n^{2} )
 θ( n )
 θ( n*log^{2}(n) )
 θ( n^{2}*log^{2}(n) )

It is obvious that for any k greater than √n, each term log_{k}n of the sum T(n) = Σ_{k=2}^{n} log_{k}n is less than log_{√n}n = 2, while it is at least log_{n}n = 1. In mathematical terms:
 A hint on the UPPER boundary, for k > √n: log_{k}n < log_{√n}n = 2, so Σ_{k>√n} log_{k}n < 2·(n − √n) ∈ O(n)
 A hint on the LOWER boundary, for k > √n: log_{k}n ≥ log_{n}n = 1, so Σ_{k>√n} log_{k}n ≥ 1·(n − √n) ∈ Ω(n)

Besides that, as the base of a logarithm increases, its value decreases; so none of the terms resulting from the expansion of the first sigma can be more than the first term, log_{2}n, nor less than the last one, which is log_{√n}n; in other words,
 Another hint on the UPPER boundary, this time for k < √n: log_{k}n ≤ log_{2}n, so Σ_{k<√n} log_{k}n ≤ √n·log_{2}n
 Another hint on the LOWER boundary, this time for k < √n: log_{k}n ≥ log_{√n}n = 2, so Σ_{k<√n} log_{k}n ≥ 2·√n
 Lower boundary: T(n) ≥ 2·√n + 1·(n − √n) ∈ Ω(n)
 Upper boundary: T(n) ≤ √n·log_{2}n + 2·(n − √n) ∈ O(n)
 Therefore T(n) ∈ θ(n).
Question 2: What is the running time order of the given program?
C PROGRAM: Input n of type integer
for(i = 2; i < n; i = i + 1)
    for(j = 1; j < n; j = j * i)
        // A line of code of Θ(1)
 θ( n )
 θ( n*log(n) )
 θ( n^{2} )
 θ( n*log^{2}log(n) )
 θ( n^{2}*log^{2}(n) )
 The first code line, t_{1}(n), is: for(i = 2; i < n; i = i + 1) // it runs (n − 2) times, so the time complexity of this line is of θ(n)
 The second code line, t_{2}(n), is: for(j = 1; j < n; j = j * i) // log_{2}n + log_{3}n + … + log_{n−1}n = Σ log_{i}n ∈ Θ(n), according to the previous question of this article (refer to Question 1)
 The third code line, t_{3}(n), is: // A line of code of Θ(1). It lies inside both loops, so its total running time is of the same order as that of the previous line, Θ(n)
Question 3: The following recurrence equation T(n) is given. For how many of the proposed functions g_{i}(n), i = 1..5, does setting f(n) = g_{i}(n) yield T(n) ∈ θ(f(n))?
 1
 2
 3
 4
 5
In order to use the master theorem, one needs to check which case of the theorem the given problem, with its specific “a”, “b”, and f(n), satisfies. The three cases of the master theorem and their conditions are:

case 1: This case happens when the recursion tree is leaf-heavy (the work to split/recombine a problem is dwarfed by the subproblems). Formally, f(n) ∈ O(n^{log_{b}a − ε}) for some ε > 0, which gives T(n) ∈ Θ(n^{log_{b}a}).

case 2: This case occurs when the work to split/recombine a problem is comparable to the subproblems. Formally, f(n) ∈ Θ(n^{log_{b}a} · log^{k}n) for some k ≥ 0, which gives T(n) ∈ Θ(n^{log_{b}a} · log^{k+1}n).

case 3: This case takes place when the recursion tree is root-heavy (the work to split/recombine a problem dominates the subproblems). Formally, f(n) ∈ Ω(n^{log_{b}a + ε}) for some ε > 0, and f satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1; this gives T(n) ∈ Θ(f(n)).
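Since the exam's actual recurrence is not reproduced above, here are stand-in examples (not the exam's T(n)) showing one recurrence per case, all with a = 2 and b = 2, so n^{log_b a} = n:

```latex
% Stand-in examples (a = 2, b = 2, so n^{\log_b a} = n); not the exam's recurrence.
T(n) = 2\,T(n/2) + 1     \;\Rightarrow\; f(n) \in O(n^{1-\varepsilon})      \;\Rightarrow\; \text{case 1} \;\Rightarrow\; T(n) \in \Theta(n)
T(n) = 2\,T(n/2) + n     \;\Rightarrow\; f(n) \in \Theta(n^{1}\log^{0} n)   \;\Rightarrow\; \text{case 2} \;\Rightarrow\; T(n) \in \Theta(n \log n)
T(n) = 2\,T(n/2) + n^{2} \;\Rightarrow\; f(n) \in \Omega(n^{1+\varepsilon}) \;\Rightarrow\; \text{case 3} \;\Rightarrow\; T(n) \in \Theta(n^{2})
```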
Question 4: What is the running time order of the given program? [It is given that m ∈ Θ(n).]
C PROGRAM: inputs m and n of type integer
for(i = 1; i <= n; i = i + 1)
    for(j = 1; j <= m; j = j * 2)
        for(k = 1; k <= j; k = k + 1)
            // A line of code of Θ(1)
 θ( n * m*(m+1)/2 )
 θ( n*m + n*log^{2}(m) )
 θ( m^{3} )
 θ( n^{2} )
 θ( n^{2}*log^{2}(n) )
 for(i = 1; i <= n; i = i + 1) // it runs n times
 for(j = 1; j <= m; j = j * 2) // it iterates ⌊log_{2}(m)⌋ + 1 times, and it is inside the outer loop, which repeats it n times
 for(k = 1; k <= j; k = k + 1) // it runs 1 + 2 + 4 + … + 2^{⌊log(m)⌋} ∈ Θ(m) times per iteration of the first “for” loop, which itself iterates n times; so the total is Θ(n·m)
 // A line of code of Θ(1) // the same as the previous line, Θ(m·n)
It can be simplified even further according to the given prior knowledge that m ∈ Θ(n) (equivalently, n ∈ Θ(m)): Θ(n·m) = Θ(n·n) = Θ(n^{2}).
Question 5: There is a vector of integer numbers, called V[], which is of length “N”. For a specific problem (program), it is given that Σ_{i=1}^{N} V[i] = P. What is the time complexity of the following code snippet? [Needless to say, P is also an integer.]
Tmp = 1;
For r = 1 to N
    For S = 1 to V[r]
        Tmp = Tmp + 20;
 O( N + 2*N*P )
 O( N * P )
 O( N^{2} )
 O( P^{2} )
 O( 2*P + N )
 Each V[r] can take any integer value (even zero or a negative one), but this does not matter much: a negative value simply means the second loop never executes in programming languages like C. Some languages do allow counting down to (or iterating over) negative numbers, but algorithm analysis does not depend on a programming language; it is based on the algorithm itself. All that can be said for sure is what the question states, so a shrewd move is to consider the absolute value of V[r] and to use O() notation in order not to get stuck. Otherwise, one can only say that the program runs at least as long as the execution of the first loop alone, i.e., Ω(N).
 Although the running time order of this program may not seem to depend on two variables, there is no further information with which to compare P and N; so the asymptotic complexity depends on the values of both P and N. In other words, the complexity function is T(N, P) rather than T(N).
 The O() notation defines a looser boundary than the tight boundary specified by θ() notation; therefore, the sum of θ() and O() terms is of O() type. In this problem, the total time complexity of the program, which is the sum of the complexities of all code lines, θ(1) + θ(N) + O(P) + O(P), belongs to the set O(2·P + N + 1) = O(2·P + N).
Source:

A compilation of Iranian university exams (with some summarization, modification, and translation)
Share your thoughts in the comments