Asymptotic Notations
The execution time of an algorithm depends on the instruction set, processor speed, disk I/O speed, and so on. Hence, we estimate the efficiency of an algorithm asymptotically.
The time function of an algorithm is represented by T(n), where n is the input size.
The following asymptotic notations are used to represent the running-time complexity of an algorithm.
O − Big Oh
Ω − Big Omega
θ − Big Theta
O: Asymptotic Upper Bound
‘O’ (Big Oh) is the most commonly used notation. A function f(n) can be represented as the order of g(n), that is O(g(n)), if there exist a positive integer n0 and a positive constant c such that −
f(n) ≤ c⋅g(n) for all n > n0
Hence, function g(n) is an upper bound for function f(n), as f(n) never grows faster than c⋅g(n) beyond n0.
Example
Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1
Considering g(n) = n³,
f(n) ≤ 5⋅g(n) for all values of n ≥ 11
Hence, the complexity of f(n) can be represented as O(g(n)), i.e. O(n³)
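As a quick sanity check (not part of the formal definition), the inequality can be tested numerically. The Python sketch below assumes the example function f(n) = 4n³ + 10n² + 5n + 1 and g(n) = n³ from above and checks f(n) ≤ 5⋅g(n) over a finite range of n; it illustrates the bound rather than proving it.

```python
# Upper-bound check for f(n) = 4n^3 + 10n^2 + 5n + 1 with g(n) = n^3, c = 5, n0 = 11.
def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

c, n0 = 5, 11
assert all(f(n) <= c * g(n) for n in range(n0, 1000)), "upper bound violated"
print("f(n) <= 5*g(n) for 11 <= n < 1000, consistent with f(n) = O(n^3)")
```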
Ω: Asymptotic Lower Bound
We say that f(n) = Ω(g(n)) when there exists a constant c such that f(n) ≥ c⋅g(n) for all sufficiently large values of n. Here n is a positive integer. It means function g is a lower bound for function f; after a certain value of n, f will never go below c⋅g.
Example
Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1
Considering g(n) = n³,
f(n) ≥ 4⋅g(n) for all values of n > 0
Hence, the complexity of f(n) can be represented as Ω(g(n)), i.e. Ω(n³)
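The lower bound can be illustrated the same way. This minimal Python sketch, again assuming the example f(n) and g(n) above, checks f(n) ≥ 4⋅g(n) and prints the ratio f(n)/g(n), which approaches the leading coefficient 4 as n grows.

```python
# Lower-bound check: f(n) >= 4*g(n) for all n > 0, with g(n) = n^3.
def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

assert all(f(n) >= 4 * g(n) for n in range(1, 1000)), "lower bound violated"
for n in (10, 100, 1000):
    print(n, f(n) / g(n))  # ratio tends towards the leading coefficient 4
```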
θ: Asymptotic Tight Bound
We say that f(n) = θ(g(n)) when there exist constants c1 and c2 such that c1⋅g(n) ≤ f(n) ≤ c2⋅g(n) for all sufficiently large values of n. Here n is a positive integer.
This means function g is a tight bound for function f.
Example
Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1
Considering g(n) = n³,
4⋅g(n) ≤ f(n) ≤ 5⋅g(n) for all large values of n
Hence, the complexity of f(n) can be represented as θ(g(n)), i.e. θ(n³)
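Combining both constants gives the tight bound. The sketch below, under the same assumptions as the earlier checks, verifies 4⋅g(n) ≤ f(n) ≤ 5⋅g(n) over a finite range, matching the θ(n³) claim.

```python
# Tight-bound check: 4*g(n) <= f(n) <= 5*g(n) for all n >= 11, with g(n) = n^3.
def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

c1, c2, n0 = 4, 5, 11
assert all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 1000)), "tight bound violated"
print("4*g(n) <= f(n) <= 5*g(n) for n >= 11, consistent with f(n) = theta(n^3)")
```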