Asymptotic Analysis
Definition:
Asymptotic analysis is the big idea that addresses the issues above in
analyzing algorithms. In asymptotic analysis, we evaluate the performance of
an algorithm in terms of its input size (we don't measure the actual running
time). We calculate how the time (or space) taken by an algorithm grows as the
input size grows.
Explanation:
Asymptotic analysis is the mechanism for estimating an algorithm's efficiency
in terms of the time and memory it consumes. The asymptotic bounds are
expressed with specific mathematical notations, such as theta notation, big O
notation, and little o notation, which are used to compare and measure
algorithms' growth rates. Asymptotic analysis is the best approach for checking
an algorithm's efficiency before implementing it in a programming language.
The results of asymptotic analysis are expressed as orders of growth (such as
O(n) or O(n log n)) rather than in seconds, which standardizes the measurement
of an algorithm's performance and makes it machine-independent. Computer
algorithms such as sorting algorithms are typical use cases for asymptotic
analysis.
Here we calculate an algorithm's running time in terms of input size to judge
its performance and efficiency, and we represent the result using mathematical
notation.
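As an illustrative sketch (not from the original text), the idea of counting basic operations instead of measuring wall-clock time can be shown in Python; the linear-search example and its step counter are assumptions chosen for illustration:

```python
# Sketch: measure an algorithm by counting its basic operations as a
# function of input size n, rather than timing it on a particular machine.
def linear_search_steps(arr, target):
    """Linear search that also returns the number of comparisons made."""
    comparisons = 0
    for value in arr:
        comparisons += 1
        if value == target:
            return True, comparisons
    return False, comparisons

# Worst case (target absent): the comparison count equals n exactly,
# on any machine -- the measurement is machine-independent.
for n in (10, 100, 1000):
    found, steps = linear_search_steps(list(range(n)), -1)
    print(n, steps)
```

The count depends only on the input size and the algorithm's structure, not on hardware, which is exactly the machine-independence the analysis aims for.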
Asymptotic Notations:
The commonly used asymptotic notations are explained below:
1. Θ Notation
Often called 'theta' notation. This notation gives both an upper bound and a
lower bound of an algorithm's growth. E.g., if an algorithm's running time is
represented as an equation in terms of g(n),
say, g(n) = 3n³ + 2n² + 5n + 7, then g(n) can also be written as Θ(n³) after
dropping the constant factors as well as the lower-degree terms of the
equation. Thus, in general, if g(n) is a function representing the run-time
complexity of an algorithm, where n is the number of inputs, then
g(n) = Θ(f(n)) means there exist positive constants c1, c2, and n0 such that
0 ≤ c1·f(n) ≤ g(n) ≤ c2·f(n) for all n ≥ n0.
Here f(n) gives the exact asymptotic behavior of g(n) with changing inputs,
i.e., g(n) always lies between c1·f(n) and c2·f(n). Note that each case of an
algorithm gets its own tight bound when we use this notation; for example, an
algorithm might have
Best case: Θ(n)
Worst case: Θ(n³)
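The sandwich condition can be checked numerically. A minimal sketch, where the constants c1 = 3, c2 = 17, n0 = 1 are one valid choice for this g(n) (assumed here, not given in the text):

```python
def g(n):
    # The example polynomial from the text.
    return 3 * n**3 + 2 * n**2 + 5 * n + 7

# g(n) = Θ(n^3): c1*n^3 <= g(n) <= c2*n^3 for all n >= n0.
# c2 = 17 works because 3n^3 + 2n^2 + 5n + 7 <= (3+2+5+7)*n^3 for n >= 1.
c1, c2, n0 = 3, 17, 1
assert all(c1 * n**3 <= g(n) <= c2 * n**3 for n in range(n0, 1000))
print("sandwich holds for n in [1, 1000)")
```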
2. Big O Notation
This notation gives an upper bound of an algorithm; it bounds the function
from above. For most algorithms we only have an upper bound; thus, this is the
most commonly used notation. Formally, g(n) = O(f(n)) means there exist
positive constants c and n0 such that 0 ≤ g(n) ≤ c·f(n) for all n ≥ n0. The
bound is inclusive of the boundary: g(n) may grow at the same rate as f(n), so
big O can serve as a tight upper bound.
For e.g.: g(n) = 3n³ + 2n² + 5n + 7
g(n) = O(n³), since 0 ≤ g(n) ≤ 17n³ for all n ≥ 1.
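A quick numeric check (an illustrative sketch; the constant c = 17 is an assumed choice) that the bound holds, and that looser upper bounds also qualify as big O:

```python
def g(n):
    return 3 * n**3 + 2 * n**2 + 5 * n + 7

# g(n) = O(n^3): for c = 17 and n0 = 1, g(n) <= c * n^3 holds.
assert all(g(n) <= 17 * n**3 for n in range(1, 1000))

# Big O need not be tight: every higher power is also an upper bound,
# so g(n) = O(n^4) is also true -- just less informative.
assert all(g(n) <= 17 * n**4 for n in range(1, 1000))
print("both upper bounds hold")
```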
3. Ω Notation
This notation is analogous to the one above, but it provides the lower bound
of an algorithm's growth. Thus, it is commonly used to describe the best-case
running time of an algorithm.
Consider a function g(n) that represents the run-time behavior of an
algorithm, where n is the number of inputs.
Then g(n) = Ω(f(n)) means there exist positive constants c and n0 such that
0 ≤ c·f(n) ≤ g(n) for all n ≥ n0; f(n) is then a lower bound for g(n). Like
big O, the bound is inclusive of the boundary, so Ω can serve as a tight lower
bound.
For e.g.: g(n) = 3n³ + 2n² + 5n + 7
g(n) = Ω(n), since 0 ≤ 5n ≤ g(n) for all n ≥ 0. (The tight lower bound here is
Ω(n³), since 3n³ ≤ g(n) as well.)
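Both lower bounds can be verified numerically; a minimal sketch, with the constants c = 5 and c = 3 taken from the polynomial's own coefficients:

```python
def g(n):
    return 3 * n**3 + 2 * n**2 + 5 * n + 7

# g(n) = Ω(n): with c = 5 and n0 = 0, c * n <= g(n) everywhere.
assert all(5 * n <= g(n) for n in range(0, 1000))

# The tight lower bound is Ω(n^3): 3 * n^3 <= g(n) also holds.
assert all(3 * n**3 <= g(n) for n in range(0, 1000))
print("both lower bounds hold")
```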
4. Little o Notation:
Little o notation provides an upper bound on a function that is not tight:
f(n) = o(g(n)) means f(n) grows strictly slower than g(n). Formally,
o(g(n)) = {f(n): for every real constant c > 0 there exists an integer
constant n0 ≥ 1 such that 0 ≤ f(n) < c·g(n) for all n ≥ n0}.
Thus g(n) gives an upper bound for f(n) that f(n) never reaches
asymptotically. In terms of limits, f(n) = o(g(n)) means
lim f(n)/g(n) = 0
n→∞
For e.g.: g(n) = 3n³ + 2n² + 5n + 7
g(n) = o(n⁴), since g(n)/n⁴ → 0 as n → ∞. (Note that g(n) is not o(n³),
because g(n)/n³ → 3, which is not 0.)
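The limit definition can be illustrated by watching the ratio at a few sample sizes; a sketch, with the sample points chosen arbitrarily:

```python
def g(n):
    return 3 * n**3 + 2 * n**2 + 5 * n + 7

# g(n) = o(n^4): the ratio g(n)/n^4 shrinks toward 0 as n grows.
ratios_n4 = [g(n) / n**4 for n in (10, 1000, 100000)]
assert ratios_n4[0] > ratios_n4[1] > ratios_n4[2]
assert ratios_n4[-1] < 1e-4

# g(n) is NOT o(n^3): g(n)/n^3 approaches the constant 3, not 0.
assert abs(g(100000) / 100000**3 - 3) < 1e-3
print("ratio to n^4 vanishes; ratio to n^3 stays near 3")
```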
5. Little omega Notation (ω):
This notation gives a loose (strict) lower bound for a function f(n):
f(n) = ω(g(n)) means f(n) grows strictly faster than g(n). Formally,
ω(g(n)) = {f(n): for every real constant c > 0 there exists an integer
constant n0 ≥ 1 such that 0 ≤ c·g(n) < f(n) for all n ≥ n0}.
In terms of limits, if f(n) ∈ ω(g(n)) then
lim f(n)/g(n) = ∞
n→∞
For e.g.: g(n) = 3n³ + 2n² + 5n + 7
g(n) = ω(n), since g(n)/n → ∞ as n → ∞. (Note that g(n) is not ω(n³), because
g(n)/n³ → 3, which is finite.)
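The same ratio test distinguishes ω from Ω; a sketch with arbitrarily chosen sample points:

```python
def g(n):
    return 3 * n**3 + 2 * n**2 + 5 * n + 7

# g(n) = ω(n): the ratio g(n)/n grows without bound.
ratios = [g(n) / n for n in (10, 100, 1000)]
assert ratios[0] < ratios[1] < ratios[2]
assert ratios[2] > 1000  # already exceeds this fixed constant

# g(n) is NOT ω(n^3): g(n)/n^3 levels off near the constant 3,
# so g(n) = Θ(n^3) (and Ω(n^3)), but not ω(n^3).
assert g(10**6) / (10**6)**3 < 4
print("ratio to n diverges; ratio to n^3 stays bounded")
```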
Cases:
Depending on the input, an algorithm's running time falls into one of the
categories below:
1. Best Case: The minimum time the algorithm requires over all inputs of a
given size. E.g., in a sorting algorithm, when the given input numbers are
already sorted.
2. Average Case: The expected time over all possible inputs of a given size,
often computed assuming each input is equally likely. E.g., in a sorting
algorithm, when the given input numbers are in random order.
3. Worst Case: The maximum time the algorithm requires over all inputs of a
given size. E.g., in a sorting algorithm, when the given input numbers are
given in reverse of the required order.
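The three cases can be made concrete with a small experiment; a sketch using insertion sort, chosen here as one example of a sorting algorithm whose comparison count differs sharply across the cases:

```python
import random

def insertion_sort_comparisons(values):
    """Insertion-sort a copy of values, returning the comparison count."""
    a, comparisons = list(values), 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comparisons += 1          # one key comparison
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 200
best = insertion_sort_comparisons(range(n))                    # already sorted
avg = insertion_sort_comparisons(random.sample(range(n), n))   # random order
worst = insertion_sort_comparisons(range(n - 1, -1, -1))       # reverse sorted
print(best, avg, worst)  # best is n-1 = 199; worst is n*(n-1)/2 = 19900
```

The best case is Θ(n) (one comparison per element), the worst case Θ(n²), and a random input typically falls roughly in between, matching the three categories above.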