The main idea of **asymptotic analysis** is to measure the efficiency of algorithms in a way that does not depend on machine-specific constants: the analysis requires neither implementing the algorithms nor comparing the running times of actual programs. We have already discussed three main asymptotic notations. The following two additional asymptotic notations are also used to represent the time complexity of algorithms.

**Little o Asymptotic Notation:**

Big O is used as a tight upper bound on the growth of an algorithm’s effort (this effort is described by the function **f(n)**), even though, as written, it can also be a loose upper bound. “**Little o**” (**o**) notation is used to describe an upper bound that cannot be tight.

In the domain of algorithm analysis, the little o notation is a valuable tool used to describe the behavior of functions as they approach certain limits. When we say that a function **f(n)** is **o(g(n))**, we are essentially stating that **f(n)** grows strictly slower than **g(n)** as n approaches infinity. In simpler terms, if **f(n) = o(g(n))**, it means that **g(n)** grows faster than **f(n)**. Thus, **Little o** means a **loose upper bound** on **f(n)**. Little o is a rough estimate of the maximum order of growth, whereas Big-O may be the actual order of growth.

### Mathematical Relation:

`f(n) = o(g(n)) means lim (n→∞) f(n)/g(n) = 0`

**Examples:**


Is 7n + 8 ∈ o(n^{2})?

In order for that to be true, for any c > 0, we have to be able to find an n0 that makes f(n) < c * g(n) asymptotically true.

Let’s take some examples. If c = 100, the inequality is clearly true. If c = 1/100, we’ll have to use a little more imagination, but we’ll still be able to find an n0. (Try n0 = 1000.) From these examples, the conjecture appears to be correct.
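The search for such an n0 can be sketched numerically. This is an illustrative helper (the function name `find_n0` is an assumption, not standard library code): it scans for the first n where f(n) < c·g(n); because c·n² − 7n − 8 is an upward-opening quadratic, the inequality stays true for every larger n once it first holds.

```python
def f(n):
    return 7 * n + 8

def g(n):
    return n ** 2

def find_n0(c, limit=10**6):
    """Return the first n with f(n) < c * g(n); past this point the
    quadratic term dominates, so the inequality holds for all larger n."""
    for n0 in range(1, limit):
        if f(n0) < c * g(n0):
            return n0
    return None

print(find_n0(100))    # a tiny n0 suffices when c is large
print(find_n0(1 / 100))  # a larger n0 is needed when c = 1/100
```

For c = 1/100 the crossover happens in the low hundreds, comfortably below the n0 = 1000 suggested above.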

Then check the limit:

lim (n→∞) f(n)/g(n) = lim (n→∞) (7n + 8)/(n^{2}) = lim (n→∞) 7/(2n) = 0 (by L’Hôpital’s rule)

hence

7n+8 ∈ o(n^{2})
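The same limit can be checked symbolically, assuming the sympy library is available:

```python
import sympy

# Symbolic check that lim (n→∞) (7n + 8)/n^2 = 0, i.e. 7n + 8 ∈ o(n^2).
n = sympy.symbols('n', positive=True)
ratio = (7 * n + 8) / n**2
limit_value = sympy.limit(ratio, n, sympy.oo)
print(limit_value)  # 0
```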

**Little ω Asymptotic Notation:**

On the other hand, little ω notation is used to describe the relationship between two functions when one grows strictly faster than the other. If a function f(n) is ω(g(n)), it means that g(n) grows slower than f(n) as n approaches infinity: **f(n)** has a strictly higher growth rate than **g(n)**.

The main difference between **Big Omega (Ω)** and **Little omega (ω)** lies in their definitions. In the case of Big Omega, **f(n) = Ω(g(n))** and the bound is **0 <= c*g(n) <= f(n)** for some constant c > 0, but in the case of little omega, **f(n) = ω(g(n))** means that **0 <= c*g(n) < f(n)** holds (for sufficiently large n) for every constant c > 0.

The relationship between **Big Omega (Ω)** and **Little Omega (ω)** is similar to that of Big-O and **Little o**, except that now we are looking at lower bounds. Little Omega (**ω**) is a rough estimate of the order of growth, whereas Big Omega (Ω) may represent the exact order of growth. We use **ω** notation to denote a lower bound that is not asymptotically tight, and **f(n) ∈ ω(g(n))** if and only if **g(n) ∈ o(f(n))**.

### Mathematical Relation:

`if f(n) ∈ ω(g(n)) then lim (n→∞) f(n)/g(n) = ∞`

**Example:**


Prove that 4n + 6 ∈ ω(1).

The claim can be proven by applying the limit formula for little omega (ω) given above: if lim (n→∞) f(n)/g(n) = ∞, then f(n) is ω(g(n)).

Here, we have the functions f(n) = 4n + 6 and g(n) = 1, and

lim (n→∞) (4n + 6)/1 = ∞

Also, for any c we can find an n0 satisfying the inequality 0 <= c*g(n) < f(n), i.e. 0 <= c*1 < 4n + 6. Hence proved.
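Both parts of the argument can be checked in Python; the symbolic step assumes the sympy library is available, and c = 1000 is just a sample constant chosen for illustration:

```python
import sympy

# Symbolic check: with f(n) = 4n + 6 and g(n) = 1, the ratio f(n)/g(n) diverges.
n = sympy.symbols('n', positive=True)
limit_value = sympy.limit((4 * n + 6) / 1, n, sympy.oo)
print(limit_value)  # oo

# Numeric spot check for one constant: c*g(n) < f(n) holds for all
# sufficiently large n (here, as soon as n > (c - 6)/4).
c = 1000
ok = all(c * 1 < 4 * m + 6 for m in range(249, 2000))
print(ok)  # True
```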

**Conclusion:**

In conclusion, little o and little ω notations are essential tools in algorithm analysis that allow us to compare the growth rates of functions in a precise manner. By understanding these notations, we can better analyze and predict the performance of algorithms as their input sizes grow.