Asymptotic Notations
• Goal: to simplify the analysis of running time.
• Useful for identifying how the running time of an
algorithm grows with the size of the input, in the
limit.
• An asymptote is a line that approaches a curve but
never touches it.
Algorithm complexity is a rough estimate of the
number of steps performed by a given computation
as a function of the size of the input data.
It does not depend on the machine,
programming language, etc.
There is no need to implement the algorithm; we can
analyze it from its description.
Special Classes of Algorithms
• Logarithmic: O(log n)
• Linear: O(n)
• Quadratic: O(n^2)
• Polynomial: O(n^k), k >= 1
• Exponential: O(a^n), a > 1
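To see how far apart these classes really are, the sketch below tabulates hypothetical step counts for each class at a few input sizes (the specific sizes 8, 16, 32 are my own illustrative choices, not from the slides):

```python
import math

# Hypothetical step counts per complexity class, to show how
# quickly the classes diverge as n grows.
classes = {
    "O(log n)": lambda n: math.log2(n),
    "O(n)":     lambda n: n,
    "O(n^2)":   lambda n: n ** 2,
    "O(n^3)":   lambda n: n ** 3,   # polynomial with k = 3
    "O(2^n)":   lambda n: 2 ** n,   # exponential with a = 2
}

for n in (8, 16, 32):
    row = ", ".join(f"{name}={f(n):,.0f}" for name, f in classes.items())
    print(f"n={n}: {row}")
```

Even at n = 32 the exponential class is already in the billions while the logarithmic class is single digits.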
Big-Oh (O) Notation
• Asymptotic upper bound
• f(n) = O(g(n)), if there exist
constants c > 0 and n0 such that
• f(n) <= c g(n) for all n >= n0
• f(n) and g(n) are functions over non-
negative integers.
• Used for Worst-case analysis.
Big-Oh (O) Notation
• Simple Rule:
Drop lower order terms and constant factors.
Example:
• 50n log n is O(n log n)
• 8n^2 log n + 5n^2 + n is O(n^2 log n)
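The second example can be checked numerically against the definition. This is a sketch, not a proof: the witness constants c = 14 and n0 = 2 are my own choice of one valid pair (using log base 2), not values given in the slides.

```python
import math

# Numerical check that f(n) = 8 n^2 log n + 5 n^2 + n is O(n^2 log n):
# with c = 14 and n0 = 2, f(n) <= c * g(n) for every tested n >= n0.
def f(n):
    return 8 * n**2 * math.log2(n) + 5 * n**2 + n

def g(n):
    return n**2 * math.log2(n)

c, n0 = 14, 2
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
print("f(n) <= 14 * g(n) holds for all tested n >= 2")
```

The ratio f(n)/g(n) = 8 + 5/log n + 1/(n log n) only shrinks as n grows, which is why a single constant c suffices once n is large enough.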
O (Big-Oh / Upper Bound) Notation
The function f(n) = O(g(n)) (read as "f of n is big-oh of g
of n") if and only if there exist positive constants c
and n0 such that f(n) ≤ c*g(n) for all n ≥ n0.
In other words, f(n) = O(g(n)) if and only if there exist
positive constants c and n0 such that the inequality
0 ≤ f(n) ≤ c*g(n) is satisfied for all n ≥ n0.
The statement f(n) = O(g(n)) states only that g(n) is
an upper bound for f(n).
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) <
O(2^n), where
O(1) is called Constant computing time
O(log n) is called Logarithmic
O(n) is called Linear
O(n^2) is called Quadratic
O(n^3) is called Cubic
O(2^n) is called Exponential
f(n) = O(g(n)) states only that g(n) is an upper bound on the
value of f(n) for all n ≥ n0.
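The ordering in this chain can be illustrated at a single concrete input size (n = 64 is my own illustrative choice): evaluating each function there produces values in exactly the order the chain claims.

```python
import math

# At n = 64, the representative functions line up in the same order as
# O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n).
n = 64
values = [1, math.log2(n), n, n * math.log2(n), n**2, n**3, 2**n]
assert values == sorted(values)   # strictly increasing at n = 64
print(values)
```

Note this single data point illustrates the ordering; the chain itself is an asymptotic statement about sufficiently large n.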
CSE 205 @ Lovely Professional University
Big-Omega (Ω) Notation
• Asymptotic lower bound
• f(n) = Ω(g(n)), if there
exist constants c > 0 and n0 such
that
c g(n) <= f(n) for all n >= n0
• Used to describe Best-case
running time.
Ω (Omega / Lower Bound) Notation
The function f(n) = Ω(g(n)) (read as "f of n is omega of
g of n") iff there exist positive constants c and n0
such that f(n) ≥ c*g(n) for all n ≥ n0.
The statement f(n) = Ω(g(n)) states only that g(n) is a
lower bound for f(n).
Example
When we say that the running time (no
modifier) of an algorithm is Ω(g(n)),
we mean that no matter what particular input
of size n is chosen for each value of n, the
running time on that input is at least a
constant times g(n), for sufficiently large n.
n^3 + 20n ∈ Ω(n^2)
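This membership can be checked numerically against the Ω definition. The witness constants c = 1 and n0 = 1 are my own choice (any n ≥ 1 already has n^3 ≥ n^2), not values stated in the slides.

```python
# Numerical check that n^3 + 20n is Omega(n^2): with the witness
# constants c = 1 and n0 = 1, c * n^2 <= n^3 + 20n for every tested n.
c, n0 = 1, 1
assert all(c * n**2 <= n**3 + 20 * n for n in range(n0, 10_000))
print("n^3 + 20n >= n^2 for all tested n >= 1")
```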
Big-Theta (Θ) Notation
• Asymptotic tight bound
• f(n) = Θ(g(n)), if there exist
constants c1, c2 and n0 such that
• c1 g(n) <= f(n) <= c2 g(n) for all n
>= n0
• f(n) = Θ(g(n)) iff f(n) = O(g(n)) and
f(n) = Ω(g(n))
Θ (Theta/ Exact) Notation
The function f(n) = Θ(g(n)) if there exist positive constants n0,
c1, and c2 such that to the right of n0, the value of f(n) always
lies between c1*g(n) and c2*g(n) inclusive,
i.e., c1*g(n) ≤ f(n) ≤ c2*g(n).
The statement f(n) = Θ(g(n)) means g(n) is both an upper and a
lower bound on f(n).
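A sandwich of this kind can be demonstrated numerically. The example below is my own (f(n) = 3n^2 + 10n and the witnesses c1 = 3, c2 = 4, n0 = 10 are not from the slides); it is a sketch of the definition, not a proof.

```python
# Illustration: f(n) = 3n^2 + 10n is Theta(n^2), with the witness
# constants c1 = 3, c2 = 4, n0 = 10, since 10n <= n^2 once n >= 10.
def f(n):
    return 3 * n**2 + 10 * n

c1, c2, n0 = 3, 4, 10
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 10_000))
print("c1*g(n) <= f(n) <= c2*g(n) holds for all tested n >= 10")
```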
Little-Oh (o) Notation
• Non-tight analogue of Big-Oh.
• f(n) = o(g(n)), if for every constant c > 0 there
exists n0 such that
f(n) < c g(n) for all n >= n0
• Used for comparisons of running times.
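The key difference from Big-Oh is the quantifier: the inequality must hold for every c, not just one. The sketch below illustrates this with n = o(n^2), an example of my own choosing: however small c gets, n < c*n^2 eventually holds (namely once n > 1/c).

```python
import math

# Little-oh demo: n = o(n^2) because for ANY constant c > 0, however
# small, n < c * n^2 for all n beyond some threshold n0 > 1/c.
for c in (1.0, 0.1, 0.01, 0.001):
    n0 = math.floor(1 / c) + 1          # n > 1/c  implies  n < c * n^2
    assert all(n < c * n**2 for n in range(n0, n0 + 1000))
print("for each c, n < c*n^2 holds for all n beyond some n0")
```

Equivalently, f(n) = o(g(n)) when the ratio f(n)/g(n) tends to 0; here n/n^2 = 1/n → 0.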
Which notation is lower bound ?
a. Big O
b. Omega
c. Theta
d. None of the above
What does it mean when we say that an
algorithm X is asymptotically more efficient
than Y?
A. X will always be a better choice for large
inputs
B. X will always be a better choice for small
inputs
C. Y will always be a better choice for small
inputs
D. X will always be a better choice for all
inputs