Module I Notes
Module I
Introduction to Algorithm Analysis
o Characteristics of Algorithms
o Criteria for Analysing Algorithms
Time and Space Complexity
o Best, Worst and Average Case Complexities
o Asymptotic Notations
Big-Oh (O), Big-Omega (Ω), Big-Theta (Θ), Little-oh (o) and Little-Omega (ω) and
their properties.
Classifying functions by their asymptotic growth rate
o Time and Space Complexity Calculation of simple algorithms
o Analysis of Recursive Algorithms:
Recurrence Equations
Solving Recurrence Equations
Master’s Theorem
Iteration Method
Recursion Tree Method
Substitution method
Computational Procedures
o Algorithms that are definite and effective.
o Example: Operating system of a digital computer. (When no jobs are available, it does not
terminate but continues in a waiting state until a new job is entered.)
Program: It is the expression of an algorithm in a programming language
Recursive Algorithms
A recursive function is a function that is defined in terms of itself.
An algorithm is said to be recursive if the same algorithm is invoked in the body.
Two types of recursive algorithms
Direct Recursion: An algorithm that calls itself is direct recursive.
Indirect Recursion: Algorithm A is said to be indirect recursive if it calls another
algorithm which in turn calls A.
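For example, a minimal C sketch (the function names fact, isEven and isOdd are illustrative, not from the notes):

/* Direct recursion: fact calls itself. */
int fact(int n)
{
    return (n <= 1) ? 1 : n * fact(n - 1);
}

/* Indirect recursion: isEven calls isOdd, which in turn calls isEven. */
int isOdd(int n);

int isEven(int n) { return (n == 0) ? 1 : isOdd(n - 1); }
int isOdd(int n)  { return (n == 0) ? 0 : isEven(n - 1); }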
Space Complexity
The space complexity of an algorithm is the amount of memory it needs to run to
completion
Space Complexity = Fixed Part + Variable Part
S(P) = c + S_P(instance characteristics), where P is any algorithm, c is the fixed part and S_P is the variable part
A fixed part:
It is independent of the characteristics of the inputs and outputs.
Eg:
o Instruction space(i.e., space for the code)
o space for simple variables and fixed-size component variables
o space for constants
A variable part:
It is dependent on the characteristics of the inputs and outputs.
Eg:
o Space needed by component variables whose size is dependent on the
particular problem instance being solved
o Space needed by referenced variables
o Recursion stack space.
Time Complexity
The time complexity of an algorithm is the amount of computer time it needs to run to
completion. Compilation time is excluded.
Time Complexity = Frequency Count × Time for executing one statement
Frequency Count: the number of times a particular statement will execute
Eg1: Find the time and space complexity of matrix addition algorithm
Statement                          Step/Execution   Frequency Count   Total Frequency Count
Algorithm mAdd(m,n,a,b,c)                 0                 0                   0
{                                         0                 0                   0
  for i = 1 to m do                       1                m+1                 m+1
    for j = 1 to n do                     1               m(n+1)               mn+m
      c[i,j] := a[i,j] + b[i,j];          1                mn                   mn
}                                         0                 0                   0
                                                                    Total = 2mn + 2m + 1
Time Complexity = 2mn + 2m + 1
Space Complexity = space for parameters + space for local variables
m → 1, n → 1, a[] → mn, b[] → mn, c[] → mn, i → 1, j → 1
Space Complexity = 3mn + 4
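A C rendering of the pseudocode (a sketch; the fixed bound MAX is an assumption so that the 2-D parameters have a complete type):

#define MAX 100

/* c = a + b for m x n matrices; the innermost assignment executes mn times,
   matching the frequency count in the table above. */
void mAdd(int m, int n, int a[MAX][MAX], int b[MAX][MAX], int c[MAX][MAX])
{
    for (int i = 0; i < m; i++)
        for (int j = 0; j < n; j++)
            c[i][j] = a[i][j] + b[i][j];
}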
Eg2: Find the time and space complexity of recursive sum algorithm
Statement                        Step/Execution   Frequency Count     Total Frequency Count
                                                   n ≤ 0    n > 0       n ≤ 0     n > 0
Algorithm RSum(a,n)                    0             0        0           0         0
{                                      0             0        0           0         0
  if n ≤ 0 then                        1             1        1           1         1
    return 0                           1             1        0           1         0
  else                                 0             0        0           0         0
    return a[n] + RSum(a,n-1)      1 + T(n-1)        0        1           0      1 + T(n-1)
}                                      0             0        0           0         0
                                                            Total         2      2 + T(n-1)
Time Complexity: T(n) = 2, if n ≤ 0
                 T(n) = 2 + T(n-1), otherwise
T(n) = 2 + T(n-1)
     = 2 + 2 + T(n-2)
     = 2 + 2 + 2 + T(n-3) = 2×3 + T(n-3)
     . . .
     = 2×n + T(n-n)
     = 2n + 2 = O(n)
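The same algorithm as a C function (a sketch; with 0-based indexing, a[n] of the pseudocode becomes a[n-1]):

/* Sums the first n elements of a; each call does constant work plus one
   recursive call on n-1, which gives the recurrence T(n) = 2 + T(n-1). */
int RSum(int a[], int n)
{
    if (n <= 0)
        return 0;
    return a[n - 1] + RSum(a, n - 1);
}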
University Questions:
1. Discuss the time complexity of the following three functions
int fun1(int n)
{
    if (n <= 1)
        return n;
    return 2 * fun1(n - 1);
}

int fun2(int n)
{
    if (n <= 1)
        return n;
    return fun2(n - 1) * fun2(n - 1);
}
Answer:
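A sketch of the standard analysis: fun1 makes one recursive call on n-1 and does constant work per call, so T(n) = T(n-1) + c, which unrolls to O(n) time (and O(n) recursion-stack space). fun2 makes two recursive calls on n-1, so T(n) = 2T(n-1) + c; the number of calls is 1 + 2 + 4 + … + 2^(n-1), giving O(2^n) time.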
2. Analyze the complexity of the following program
int fun3(int n)
{
    if (n <= 1)
        return n;
    return fun3(n - 1) + fun3(n - 1);
}
Answer:
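A sketch: fun3 has the same call structure as fun2, two recursive calls on n-1 with constant extra work, so T(n) = 2T(n-1) + c and the running time is O(2^n).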
4. Analyse the complexity of the following function
void function(int n)
{
int count=0;
for(int i=n/2; i<=n; i++)
for(int j=1; j<=n; j=2*j)
for(int k=1; k<=n; k=k*2)
count++;
}
Answer:
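A sketch: the outer loop runs about n/2 times; j doubles on every pass, so the middle loop runs about log₂ n times, and the innermost loop likewise runs about log₂ n times. count++ therefore executes roughly (n/2) × log n × log n times, so the complexity is O(n log² n).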
7. Find the time complexity of the given code segment
for(int i=1;i<=n;i++)
{ for(int j=i+1;j<=n;j++)
{
//code
}
}
Answer: The most frequently executed statement is the innermost code. During the first iteration of the outer loop that code executes n-1 times; during the 2nd iteration it executes n-2 times, and so on; during the last iteration it does not execute at all.
So the frequency count of that code = (n-1) + (n-2) + …… + 0 = (n-1)n/2 = (n² - n)/2
Time Complexity = O(n²)
10. Consider the following C function
int check(int n)
{
int i,j;
for (i=1;i<=n;i++)
{
for (j=1;j<n;j+=i)
{
printf("%d",i+j);
}
}
}
Find the time complexity of check in terms of Θ notation.
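A sketch: for a fixed i the inner loop increases j in steps of i, so it runs about n/i times. Summing over i gives n(1 + 1/2 + 1/3 + … + 1/n) ≈ n ln n, so the time complexity of check is Θ(n log n).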
4. Illustrate best case, average case and worst case complexity of Insertion sort algorithm.
Answer:
Algorithm InsertionSort(A,n)
{
for i=1 to n-1 do
{
j=i
while j>0 and A[j-1] > A[j] do
{
Swap(A[j], A[j-1])
j=j-1
}
}
}
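A C version of the same algorithm (a sketch; the Swap from the pseudocode is written inline):

/* Sorts A[0..n-1] in non-decreasing order. A[0..i-1] is already sorted;
   A[i] is swapped backwards until it reaches its correct position. */
void insertionSort(int A[], int n)
{
    for (int i = 1; i < n; i++) {
        int j = i;
        while (j > 0 && A[j - 1] > A[j]) {
            int t = A[j - 1];
            A[j - 1] = A[j];
            A[j] = t;
            j--;
        }
    }
}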
Best case complexity: The best case occurs when the array is already in sorted order. In this case the while-loop condition fails immediately, so its body never executes, and the running time is proportional to the number of times the for loop executes, i.e. O(n) times.
The best case time complexity = Ω(n)
Worst case complexity: The worst case occurs when the array is in reverse sorted order. In this case the while loop executes 1 + 2 + 3 + …… + (n-1) = n(n-1)/2 times. The running time is proportional to the number of times the while loop executes, i.e. O(n²) times.
The worst case time complexity = O(n²)
Average case complexity: On average the while loop performs about half of its maximum number of iterations, i.e. [1 + 2 + 3 + …… + (n-1)]/2 = n(n-1)/4 times. The running time is proportional to the number of times the while loop executes, i.e. O(n²) times.
The average case time complexity = Θ(n²)
5. Write an algorithm for insertion sort. Calculate the worst case time complexity of insertion
sort.
Asymptotic Notations
These are mathematical notations used to represent the frequency count (running time) of an algorithm. There are 5 types of asymptotic notations:
Big Oh (O)
The function f(n) = O(g(n)) iff there exist two positive constants c and n0 such that
0 ≤ f(n) ≤ c g(n) for all n ≥ n0
It is the measure of the longest amount of time taken by an algorithm (worst case).
It is an asymptotically tight upper bound.
O(1) : Computational time is constant
O(n) : Computational time is linear
O(n²) : Computational time is quadratic
O(n³) : Computational time is cubic
O(2^n) : Computational time is exponential
Omega (Ω)
The function f(n) = Ω(g(n)) iff there exist two positive constants c and n0 such that
f(n) ≥ c g(n) ≥ 0 for all n ≥ n0
It is the measure of the smallest amount of time taken by an algorithm (best case).
It is an asymptotically tight lower bound.
Theta (Θ)
The function f(n) = Θ(g(n)) iff there exist three positive constants c1, c2 and n0 such that
0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0
It is the measure of the average amount of time taken by an algorithm (average case).
Little Oh (o)
The function f(n) = o(g(n)) iff for any positive constant c>0, there exists a constant n0>0
such that 0 ≤ f(n) < c g(n) for all n ≥ n0
It is asymptotically loose upper bound
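Little Omega (ω)
The function f(n) = ω(g(n)) iff for any positive constant c>0, there exists a constant n0>0
such that 0 ≤ c g(n) < f(n) for all n ≥ n0
It is asymptotically loose lower bound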
Examples:
1. Find the O notation of the following functions
a) f(n) = 3n + 2
3n + 2 ≤ 4n for all n ≥ 2
Here f(n) = 3n + 2, g(n) = n, c = 4, n0 = 2
Therefore 3n + 2 = O(n)
b) f(n) = 4n³ + 2n + 3
4n³ + 2n + 3 ≤ 5n³ for all n ≥ 2
Here f(n) = 4n³ + 2n + 3, g(n) = n³, c = 5, n0 = 2
Therefore 4n³ + 2n + 3 = O(n³)
c) f(n) = 2^(n+1)
2^(n+1) ≤ 2·2^n for all n ≥ 1
Here f(n) = 2^(n+1), g(n) = 2^n, c = 2, n0 = 1
Therefore 2^(n+1) = O(2^n)
d) f(n) = 2^n + 6n² + 3n
2^n + 6n² + 3n ≤ 7·2^n for all n ≥ 5
Here f(n) = 2^n + 6n² + 3n, g(n) = 2^n, c = 7, n0 = 5
Therefore 2^n + 6n² + 3n = O(2^n)
e) f(n) = 10n² + 7
f) f(n) = 5n³ + n² + 6n + 2
g) f(n) = 6n² + 3n + 2
h) f(n) = 100n + 6
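One possible choice of constants for each (c and n0 are not unique):
e) 10n² + 7 ≤ 11n² for all n ≥ 3, so 10n² + 7 = O(n²)
f) 5n³ + n² + 6n + 2 ≤ 6n³ for all n ≥ 4, so 5n³ + n² + 6n + 2 = O(n³)
g) 6n² + 3n + 2 ≤ 7n² for all n ≥ 4, so 6n² + 3n + 2 = O(n²)
h) 100n + 6 ≤ 101n for all n ≥ 6, so 100n + 6 = O(n)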
5. What is the smallest value of n such that an algorithm whose running time is 100n² runs
faster than an algorithm whose running time is 2^n on the same machine? (Univ Question)
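A sketch: we need the smallest n for which 100n² < 2^n. At n = 14, 100 × 14² = 19600 while 2^14 = 16384, so the quadratic algorithm is still slower; at n = 15, 100 × 15² = 22500 while 2^15 = 32768. So the smallest such value is n = 15.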
d) f(n) = 4·2^n + 3n
e) f(n) = 3n + 30
f) f(n) = 10n² + 4n + 2
b) f(n) = 3·2^n + 4n² + 5n + 2
3·2^n + 4n² + 5n + 2 ≤ 10·2^n for all n ≥ 1
3·2^n + 4n² + 5n + 2 = O(2^n)
c) f(n) = 2n² + 16
d) f(n) = 27n² + 16
Running Time Comparison (Classifying functions by their asymptotic growth rate)
Logarithmic functions grow very slowly.
Exponential and factorial functions grow very fast.
n        log n    n        n log n      n²       n³       2^n           n!
10       3.3      10       3.3 × 10     10²      10³      10³           3.6 × 10⁶
10²      6.6      10²      6.6 × 10²    10⁴      10⁶      1.3 × 10³⁰    9.3 × 10¹⁵⁷
10³      10       10³      10 × 10³     10⁶      10⁹      very large    very large
10⁴      13       10⁴      13 × 10⁴     10⁸      10¹²     very large    very large
10⁵      17       10⁵      17 × 10⁵     10¹⁰     10¹⁵     very large    very large
10⁶      20       10⁶      20 × 10⁶     10¹²     10¹⁸     very large    very large
O(1) < O(log n) < O(n) < O(n log n) < O(n^k) < O(2^n) < O(n!)
University Questions:
1. Explain asymptotic notations in algorithm analysis
2. Define Big Oh, Big Omega and Theta notations and illustrate them graphically.
3. What do you mean by asymptotic growth rate? Define Big Oh, Big Omega and Theta
notations.
4. Define asymptotic notation. Arrange the following functions in increasing order of
asymptotic growth rate: n³, 2^n, log n³, 2^100, n² log n, n^n, log n, n^0.3, 2^(log n)
Answer: 2^100 < log n < log n³ < n^0.3 < 2^(log n) < n² log n < n³ < 2^n < n^n
Recurrence Relations
A recurrence is an equation or inequality that describes a function in terms of its values on
smaller inputs.
There are several methods for solving recurrence relations:
Iteration Method
Recursion tree Method
Substitution Method
Master’s Method
Iteration Method
Assume that n/2^k = 2
T(n) = 2[1 + 2 + 2² + 2³ + …… + 2^(k-1)] + 2^k T(n/2^k)
7. T(n) = 2 T(n/2) + n² (Univ Question)
Answer:
T(n) = n² + 2T(n/2)
     = n² + 2[(n/2)² + 2T(n/2²)] = n² + n²/2 + 2² T(n/2²)
     = n² + n²/2 + 2²[(n/2²)² + 2T(n/2³)]
     = n² + n²/2 + n²/2² + 2³ T(n/2³)
     ……………………
     = n²[1 + (1/2) + (1/2)² + …. + (1/2)^(k-1)] + 2^k T(n/2^k)    (kth term)
Assume n/2^k = 1, i.e. 2^k = n, so k = log₂ n
T(n) = n²[(1 - (1/2)^k) / (1 - (1/2))] + n T(1)
     = 2n²[1 - (1/2^k)] + n = 2n²[1 - (1/2^(log n))] + n
     = 2n²[1 - (1/n^(log 2))] + n = 2n²[1 - (1/n)] + n
     = 2n² - 2n + n
     = O(n²)
8. T(n) = T(n/3) + n
Answer:
T(n) = n + T(n/3)
     = n + [(n/3) + T(n/3²)] = n + (n/3) + T(n/3²)
     = n + (n/3) + [(n/3²) + T(n/3³)] = n + (n/3) + (n/3²) + T(n/3³)
     ……………………
     = n + (n/3) + (n/3²) + …. + (n/3^(k-1)) + T(n/3^k)    (kth term)
Assume n/3^k = 1, i.e. 3^k = n, so k = log₃ n
T(n) = n[1 + (1/3) + (1/3)² + …. + (1/3)^(k-1)] + T(n/3^k)
     = n[(1 - (1/3)^k) / (1 - (1/3))] + T(1) = (3/2)n[1 - (1/3^k)] + T(1)
     = (3/2)n[1 - (1/n)] + 1 = (3/2)n - (3/2) + 1
     = O(n)
9. T(n) = 3 T(n/4) + n
Answer:
T(n) = n + 3T(n/4)
     = n + 3[(n/4) + 3T(n/4²)] = n + (3/4)n + 3² T(n/4²)
     = n + (3/4)n + 3²[(n/4²) + 3T(n/4³)]
     = n + (3/4)n + (3/4)²n + 3³ T(n/4³)
     ……………………
     = n + (3/4)n + (3/4)²n + …. + (3/4)^(k-1)n + 3^k T(n/4^k)    (kth term)
Assume n/4^k = 1, i.e. 4^k = n, so k = log₄ n
T(n) = n[1 + (3/4) + (3/4)² + …. + (3/4)^(k-1)] + 3^k T(1)
     = n[(1 - (3/4)^k) / (1 - (3/4))] + 3^k × 1
     = 4n[1 - (3^k/4^k)] + 3^k          (using a^(log_c b) = b^(log_c a))
     = 4n[1 - (3^(log₄ n) / n)] + 3^(log₄ n)
     = 4n[1 - (n^(log₄ 3) / n)] + n^(log₄ 3)
     = 4n - 4n^(log₄ 3) + n^(log₄ 3) = 4n - 3n^(log₄ 3)
     = O(n)
Recursion Tree Method
It is the pictorial representation of iteration method, which is in the form of a tree.
Solve the following recurrence relations using the Recursion Tree method.
1. T(n) = 3 T(n/4) + cn², T(1) = 1 (Univ Question)
Answer:
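A sketch for T(n) = 3 T(n/4) + cn²: the root of the tree costs cn², its 3 children together cost (3/16)cn², the next level (3/16)²cn², and so on, while the leaves contribute Θ(n^(log₄ 3)). The level costs form a decreasing geometric series, so the total is at most cn² / (1 - 3/16) plus the leaf term, giving T(n) = Θ(n²).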
T(n) = n + (11/10)n + (11/10)²n + …… + (11/10)^k n
     = n[1 + (11/10) + (11/10)² + …… + (11/10)^k]
     = n[((11/10)^(k+1) - 1) / ((11/10) - 1)]
     = 10n[(11/10)·(11/10)^k - 1]
     = 10n[(11/10)·(11/10)^(log_(10/9) n) - 1]
     = 10n[(11/10)·n^(log_(10/9) (11/10)) - 1]
     ≈ 10n[(11/10)·n - 1]
     = 11n² - 10n = O(n²)
Longest path: rightmost path
Longest path length: (9/10)^k n = 1, where k is the length of the longest path
(10/9)^k = n
k = log_(10/9) n
Shortest path: leftmost path
Shortest path length: n/10^k = 1, where k is the length of the shortest path
10^k = n
k = log₁₀ n
4. T(n) = T(n/3) + T(2n/3) + n, where n>1 (Univ Question)
T(n) = 1, Otherwise
Answer:
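A sketch: every level of the tree costs at most n (the subproblem sizes n/3 and 2n/3 add up to n), and the longest root-to-leaf path follows the 2n/3 branch, so the height is log_(3/2) n. Hence T(n) = O(n log n).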
7. T(n) = 2 T(n/2) + n², where n > 1 (Univ Question)
   T(n) = 0, otherwise
Answer:
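A sketch: the level sums are n², n²/2, n²/4, …, a decreasing geometric series bounded by 2n², so T(n) = Θ(n²), matching the iteration-method result above.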
8. T(n) = 3 T(n/3) + c n
Answer:
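A sketch: every level of the tree costs cn and there are log₃ n levels, so T(n) = Θ(n log n).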
9. T(n) = 4 T(n/2) + n
Answer:
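A sketch: the level sums are n, 2n, 4n, …, an increasing geometric series dominated by the last level; the tree has 4^(log₂ n) = n² leaves, so T(n) = Θ(n²).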
11. T(n) = 8 T(n/2) + n²
Answer:
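A sketch: the level sums are n², 2n², 4n², …, again dominated by the leaves; there are 8^(log₂ n) = n³ leaves, so T(n) = Θ(n³).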
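For a recurrence of the form T(n) = 3 T(n/2) + n, each level sum is (3/2) times the previous one, and summing the levels gives: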
= n[((3/2)·(3/2)^k - 1) / (1/2)]
= 2n[(3/2)·(3^k / 2^k) - 1]
= 2n[(3/2)·(3^(log₂ n) / 2^(log₂ n)) - 1]
= 2n[(3/2)·(n^(log₂ 3) / n^(log₂ 2)) - 1]
= 2n[(3/2)·(n^(log₂ 3) / n) - 1]
= 3n^(log₂ 3) - 2n
= O(n^(log₂ 3))
Master’s Method
T(n) = a T(n/b) + Θ(n^k log^p n),
where a ≥ 1, b > 1, k ≥ 0 and p is a real number
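The case analysis usually stated with this form of the theorem:
Case 1: if a > b^k, then T(n) = Θ(n^(log_b a))
Case 2: if a = b^k, then
    if p > -1, T(n) = Θ(n^(log_b a) log^(p+1) n)
    if p = -1, T(n) = Θ(n^(log_b a) log log n)
    if p < -1, T(n) = Θ(n^(log_b a))
Case 3: if a < b^k, then
    if p ≥ 0, T(n) = Θ(n^k log^p n)
    if p < 0, T(n) = O(n^k)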
4. T(n) = 7 T(n/2) + n² (Univ Question)
Answer:
T(n) = 7 T(n/2) + n²
a = 7, b = 2, n² = Θ(n² log⁰ n), so k = 2, p = 0
b^k = 2² = 4
Here a > b^k
T(n) = Θ(n^(log_b a))
     = Θ(n^(log₂ 7))
27. T(n) = 8 T(n/2) + Θ(n²)
28. T(n) = 2 T(n/4) + 1
29. T(n) = 2 T(n/4) + n
30. T(n) = 2 T(n/4) + n²
31. T(n) = T(n/2) + Θ(1)
32. T(n) = 4 T(n/2) + n² log n
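Possible answers using the cases above (a sketch):
27. a = 8, b = 2, k = 2, p = 0; b^k = 4 < a, so T(n) = Θ(n^(log₂ 8)) = Θ(n³)
28. a = 2, b = 4, k = 0, p = 0; b^k = 1 < a, so T(n) = Θ(n^(log₄ 2)) = Θ(√n)
29. a = 2, b = 4, k = 1, p = 0; b^k = 4 > a and p ≥ 0, so T(n) = Θ(n)
30. a = 2, b = 4, k = 2, p = 0; b^k = 16 > a and p ≥ 0, so T(n) = Θ(n²)
31. a = 1, b = 2, k = 0, p = 0; b^k = 1 = a and p > -1, so T(n) = Θ(log n)
32. a = 4, b = 2, k = 2, p = 1; b^k = 4 = a and p > -1, so T(n) = Θ(n² log² n)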
Substitution Method
The substitution method for solving recurrences comprises two steps
1. Guess the form of the solution
2. Use mathematical induction to find the constant & show that the solution works.
This method is powerful, but we must be able to guess the form of the answer in order to
apply it.
Example:
1. Give the general idea of the substitution method for solving recurrences. Solve the
following recurrences using substitution method
T(n) = 2 T(n/2) + n, T(1) = 1 (Univ Question)
Answer:
Guess that T(n) = O(n log n)
As per O notation definition T(n) <= c n log n
By mathematical induction
If n = 1: T(1) ≤ c × 1 × log 1, i.e. 1 ≤ 0. It is false.
If n = 2: T(2) ≤ c × 2 × log 2, i.e. 2T(1) + 2 ≤ 2c, i.e. 4 ≤ 2c. It is true (for c ≥ 2).
If n = 3: T(3) ≤ c × 3 × log 3, i.e. 2T(1) + 3 ≤ 3c log 3, i.e. 5 ≤ 3c log 3. It is true.
This relation is true for n = 2, 3, 4, . . . .
Assume T(k) ≤ c k log k for all k with 2 ≤ k < n; in particular
T(n/2) ≤ c (n/2) log (n/2)
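The inductive step can then be completed as follows (a sketch): T(n) = 2T(n/2) + n ≤ 2c(n/2) log(n/2) + n = cn log n - cn + n ≤ cn log n for every c ≥ 1 (taking log base 2), so T(n) = O(n log n).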
Answer:
Guess that T(n) = O(n)
As per O notation definition T(n) <= c n
By mathematical induction
If n = 1: T(1) ≤ c × 1, i.e. 1 ≤ c. It is true (for c ≥ 1).
If n = 2: T(2) ≤ c × 2, i.e. 2T(1) + 1 ≤ 2c, i.e. 3 ≤ 2c. It is true (for c ≥ 3/2).
This relation is true for n = 1, 2, 3, . . . .
Assume T(k) ≤ c k for all k with 1 ≤ k < n; in particular
T(n/2) ≤ c (n/2)
Answer:
Guess that T(n) = O(n log n)
As per O notation definition T(n) <= c n log n
By mathematical induction
If n = 1: T(1) ≤ c × 1 × log 1, i.e. 1 ≤ 0. It is false.
If n = 2: T(2) ≤ c × 2 × log 2, i.e. 2T(1) + k×2 ≤ 2c, i.e. 2 + 2k ≤ 2c. It is true (for c ≥ k + 1).
This relation is true for n = 2, 3, 4, . . . .
Assume T(m) ≤ c m log m for all m with 2 ≤ m < n; in particular
T(n/2) ≤ c (n/2) log (n/2)
Arithmetic Progression
Series: a, a+d, a+2d, a+3d, . . . . . . , a+(n-1)d
nth term: a + (n-1)d
Sum of first n terms: (n/2)[2a + (n-1)d]
Geometric Progression
Series: a, ar, ar², ar³, . . . . . . , ar^(n-1)
nth term: ar^(n-1)
Sum of first n terms: a(r^n - 1)/(r - 1), for r ≠ 1
log_a a = 1
If x^k = n, then k = log_x n