Growth of Functions - Asymptotic Notation

 

Growth of Functions


When solving a problem, we want the algorithm that takes the least time. Suppose we design seven algorithms, A1 to A7, using different methods, and algorithm A5 takes the least time, log n. Then we would pick A5 to write the code.

For large n, the common time complexities grow in this order:

log n < √n < n < n log n < n^2 < n^3 < 2^n < 3^n < n! < n^n
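
As a quick sanity check, the small Python sketch below (illustrative only, not part of the original material) evaluates each of these functions at a single sample size, n = 20, and prints the values; they come out in the same increasing order.

# Sketch: evaluate each growth function at n = 20 and print the values.
import math

n = 20
growth = [
    ("log n",   math.log2(n)),
    ("sqrt n",  math.sqrt(n)),
    ("n",       n),
    ("n log n", n * math.log2(n)),
    ("n^2",     n ** 2),
    ("n^3",     n ** 3),
    ("2^n",     2 ** n),
    ("3^n",     3 ** n),
    ("n!",      math.factorial(n)),
    ("n^n",     n ** n),
]
for name, value in growth:
    # values print in increasing order, matching the ordering above
    print(f"{name:10s} {value:.3e}")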

Asymptotic Notation

To represent complexity, we use an approximation method known as asymptotic notation. The idea is to approximate a function by keeping only what matters: if the running time works out to 5n + 2n^2 + n log n, you focus on what is important by abstracting away the low-order terms (5n and n log n) and the constant factors (like the 2), leaving only the dominant n^2 term.

The amount of time and storage required to execute an algorithm determines its efficiency. Asymptotic notations are used to describe this efficiency. An algorithm's performance may differ depending on the type of input, and it changes as the input size increases.
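
For example, here is a minimal sketch (using the sample function 5n + 2n^2 + n log n from above, with log taken base 2 as an assumption) showing that the ratio f(n)/n^2 settles near the constant 2 as n grows, which is why only the dominant n^2 term matters.

# Sketch: the ratio f(n) / n^2 approaches the constant 2 for large n,
# so the low-order terms and the constant factor can be dropped.
import math

def f(n):
    return 5 * n + 2 * n ** 2 + n * math.log2(n)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, round(f(n) / n ** 2, 4))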




Asymptotic Notation    Symbol    Approximately Equal To    Nature
1. Big Oh              O         O ≈ ≤                     Upper Bound – Worst Case
2. Big Omega           Ω         Ω ≈ ≥                     Lower Bound – Best Case
3. Theta               Θ         Θ ≈ =                     Tight Bound – Average Case (Exact Order)
4. Small oh            o         o ≈ <                     Strict Upper Bound
5. Small omega         ω         ω ≈ >                     Strict Lower Bound


1. Big-Oh (O): O(g(n)) = {f(n): there exist positive constants C and n0 such that 0 ≤ f(n) ≤ C·g(n) for all n ≥ n0}

g(n) is an asymptotic upper bound for f(n) for large n (n ≥ n0).
Suppose a married person wants to get ready for a party; you should allow him more than half an hour. He keeps saying he needs only 10 minutes to get ready, but the actual time will be more than 10 minutes. The generous time allowance is g(n), the upper bound, and the time he actually takes to get ready is f(n).

Examples
1. If f(n) = 2n^2 and g(n) = n^3, then what could C and n0 be if f(n) = O(g(n))?

Ans. If C = 1 and n0 >= 2,
         then 2n^2 = O(n^3)

Putting n = 2:           2(2)^2 = 8 and (2)^3 = 8, so 8 ≤ 8
For n > 2:               2n^2 < n^3
                       

2. If f(n) = 4n + 3 and g(n) = n, then what could C and n0 be if f(n) = O(g(n))?

Ans. If C = 5 and n0 >= 3,
         then 4n + 3 ≤ 5n, i.e. 4n + 3 = O(n)

Putting n = 3:           4(3) + 3 = 15 and 5(3) = 15, so 15 ≤ 15
For n > 3:               4n + 3 < 5n

3. If f(n) = 2n^3 + 6, then what is a big-oh bound g(n)?

Ans. If C·g(n) is greater than or equal to f(n) for all large n, then f(n) = O(g(n)).
         2n^3 + 6 <= 2n^3 + 6n^3     (for n >= 1)
                         <= 8n^3
         So f(n) = O(n^3) with C = 8 and n0 >= 1.
         Also, f(n) = O(n^3 √n), O(n^4), or O(n^3 log n) for suitable values of C and n0 (these are looser upper bounds).
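
These answers can be backed up with a small numeric check. The helper name check_big_oh below is my own illustrative choice, not a library function; it simply tests f(n) <= C * g(n) for every n from n0 up to some limit.

# Sketch: numeric sanity check of f(n) <= C * g(n) for n >= n0.
def check_big_oh(f, g, C, n0, limit=1_000):
    return all(f(n) <= C * g(n) for n in range(n0, limit))

# Example 1: 2n^2 = O(n^3) with C = 1, n0 = 2
print(check_big_oh(lambda n: 2 * n ** 2, lambda n: n ** 3, C=1, n0=2))       # True

# Example 2: 4n + 3 = O(n) with C = 5, n0 = 3
print(check_big_oh(lambda n: 4 * n + 3, lambda n: n, C=5, n0=3))             # True

# Example 3: 2n^3 + 6 = O(n^3) with C = 8, n0 = 1
print(check_big_oh(lambda n: 2 * n ** 3 + 6, lambda n: n ** 3, C=8, n0=1))   # True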


2. Big-Omega (Ω): Ω(g(n)) = {f(n): there exist positive constants C and n0 such that 0 ≤ C·g(n) ≤ f(n) for all n ≥ n0}


g(n) is an asymptotic lower bound for f(n) for large n (n ≥ n0).
Suppose a bachelor wants to get ready for a party and is given only 10 minutes, but he is actually ready in just 4-5 minutes; the 10 minutes allotted is more than he really needs. The 4-5 minutes he can manage is g(n), the lower bound, and the 10 minutes allotted is f(n).

Examples
1. If f(n) = √n and g(n) = log n, then what could C and n0 be if f(n) = Ω(g(n))?

Ans. If C = 1 and n0 = 16,
         then √n = Ω(log n)

Putting n = 16:           √16 = 4 and log2(16) = 4, so 4 ≥ 4
For n > 16:               √n > log n

2. If f(n) = 2n^3 + 6, then what is a big-omega bound g(n)?

Ans. 2n^3 + 6 >= n^3 for all n >= 1, so f(n) = Ω(n^3).
         It is also >= n and >= √n, so f(n) = Ω(n) and Ω(√n) as well,
         along with many more functions that stay less than or equal to 2n^3 + 6.
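
The same kind of numeric check works for the lower bound; check_big_omega is again just an illustrative helper name I am using for this sketch.

# Sketch: numeric sanity check of C * g(n) <= f(n) for n >= n0.
import math

def check_big_omega(f, g, C, n0, limit=1_000):
    return all(C * g(n) <= f(n) for n in range(n0, limit))

# Example 1: sqrt(n) = Omega(log n) with C = 1, n0 = 16
print(check_big_omega(math.sqrt, math.log2, C=1, n0=16))                         # True

# Example 2: 2n^3 + 6 = Omega(n^3) with C = 1, n0 = 1
print(check_big_omega(lambda n: 2 * n ** 3 + 6, lambda n: n ** 3, C=1, n0=1))    # True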

3. Theta (Θ): Θ(g(n)) = {f(n): there exist positive constants C1, C2 and n0 such that 0 ≤ C1·g(n) ≤ f(n) ≤ C2·g(n) for all n ≥ n0}


g(n) is an asymptotically tight bound for f(n) for large n (n ≥ n0).
In other words, Θ = O ∩ Ω.

Example
1. If f(n) = (n^2)/2 - 2n and g(n) = n^2, then what could C1, C2 and n0 be if f(n) = Θ(g(n))?

Ans. C1 = 1/4,     C2 = 1/2,     n0 = 8
         1/4 g(n) <= f(n) <= 1/2 g(n)
         1/4 (n^2) <= (n^2)/2 - 2n <= 1/2 (n^2)
Putting n0 = 8:
         16 <= 16 <= 32
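
A quick sketch confirms both constants at once; check_theta is another illustrative helper name, not a library function.

# Sketch: numeric sanity check of C1 * g(n) <= f(n) <= C2 * g(n) for n >= n0.
def check_theta(f, g, C1, C2, n0, limit=1_000):
    return all(C1 * g(n) <= f(n) <= C2 * g(n) for n in range(n0, limit))

# f(n) = n^2/2 - 2n, g(n) = n^2, with C1 = 1/4, C2 = 1/2, n0 = 8
print(check_theta(lambda n: n ** 2 / 2 - 2 * n,
                  lambda n: n ** 2,
                  C1=0.25, C2=0.5, n0=8))   # True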



4. Small-oh (o): o(g(n)) = {f(n): for every positive constant C > 0 there exists a constant n0 > 0 such that 0 ≤ f(n) < C·g(n) for all n ≥ n0}

The difference between Big-Oh and small-oh: the Big-Oh bound f(n) ≤ C·g(n) only has to hold for some constant C, while the small-oh bound f(n) < C·g(n) is strict and must hold for every positive constant C.

Examples
1. n^1.9999 = o(n^2)

2. n^2 / log n = o(n^2)

3. n^2 ≠ o(n^2)
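
One way to see why the third line fails while the first two hold is the ratio f(n)/g(n): for small-oh it must head toward 0, so that f(n) < C·g(n) eventually holds no matter how small C is. A rough sketch (base-2 log assumed):

# Sketch: the ratio f(n) / g(n) for each example at increasing n.
import math

cases = [
    ("n^1.9999 / n^2",    lambda n: n ** 1.9999 / n ** 2),       # shrinks, very slowly
    ("(n^2/log n) / n^2", lambda n: (n ** 2 / math.log2(n)) / n ** 2),  # shrinks clearly
    ("n^2 / n^2",         lambda n: n ** 2 / n ** 2),             # stuck at 1, so not o(n^2)
]
for name, ratio in cases:
    print(name, [round(ratio(n), 4) for n in (10, 10_000, 10_000_000)])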


5. Small-omega (ω): ω(g(n)) = {f(n): for every positive constant C > 0 there exists a constant n0 > 0 such that 0 ≤ C·g(n) < f(n) for all n ≥ n0}

The difference between Big-Omega and small-omega: the Big-Omega bound C·g(n) ≤ f(n) only has to hold for some constant C, while the small-omega bound C·g(n) < f(n) is strict and must hold for every positive constant C.
Examples
1. n^2.0001 = ω(n^2)

2. n^2 log n = ω(n^2)

3. n^2 ≠ ω(n^2)
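
Symmetrically, for small-omega the ratio f(n)/g(n) must grow without bound, so that C·g(n) < f(n) eventually holds for every constant C. A rough sketch (base-2 log assumed):

# Sketch: the ratio f(n) / g(n) for each example at increasing n.
import math

cases = [
    ("n^2.0001 / n^2",    lambda n: n ** 2.0001 / n ** 2),              # grows, very slowly
    ("(n^2 log n) / n^2", lambda n: (n ** 2 * math.log2(n)) / n ** 2),  # grows clearly
    ("n^2 / n^2",         lambda n: n ** 2 / n ** 2),                   # stuck at 1, so not omega(n^2)
]
for name, ratio in cases:
    print(name, [round(ratio(n), 4) for n in (10, 10_000, 10_000_000)])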





