Analysis, Design of Algorithms
CS3610
Dr. Islam Hegazy
Associate Professor
F2023 Analysis, Design of Algorithms 1
Agenda
01 Introduction
02 Time complexity
03 Space complexity
04 Asymptotic notations
05 Common growth rates
What is an algorithm?
🡪 It is a sequence of unambiguous instructions for solving a problem, i.e., for
obtaining a required output for any legitimate input in a finite amount of time.
Algorithm efficiency
🡪 An algorithm is expected to solve a problem effectively.
🡪 Two aspects to consider when writing an algorithm:
🡪 Its correctness
🡪 Its efficiency
🡪 The efficiency of an algorithm is evaluated by:
🡪 Speed (in terms of execution time)
🡪 Resource consumption (storage space)
Case study: Fibonacci Algorithm
🡪 The Fibonacci Sequence is the series of numbers: 0, 1, 1, 2, 3, 5, 8, 13, 21, 34,
... The next number is found by adding up the two numbers before it.
🡪 Hence, we have a sequence; the numbers in the sequence are denoted by F(n),
and we formally define them as:
F(0) = 0
F(1) = 1
F(n) = F(n-1) + F(n-2), for n > 1
Case study: Fibonacci Algorithm
🡪 Approach 1: Iterative algorithm
int fibonacci_iterative(int n)
{
    if (n <= 1) {
        return n;
    }
    int a = 0;
    int b = 1;
    int c = 0;
    for (int i = 2; i <= n; i++)
    {
        c = a + b;
        a = b;
        b = c;
    }
    return c;
}
Case study: Fibonacci Algorithm
🡪 Approach 2: Array approach
int fibonacci_array (int n)
{
    int[] f = new int[n + 1];
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++)
        f[i] = f[i - 1] + f[i - 2];
    return f[n];
}
Case study: Fibonacci Algorithm
🡪 Approach 3: Recursive approach
int fibonacci_recursive(int n)
{
    if (n <= 1) {
        return n;
    } else {
        return fibonacci_recursive(n-1) + fibonacci_recursive(n-2);
    }
}
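As a sanity check, the three approaches can be run side by side. The sketch below uses compact C copies of them (my renaming; fib_array uses malloc in place of the slides' `new int[]`, since that is Java syntax):

```c
#include <assert.h>
#include <stdlib.h>

/* Compact copies of the three approaches, which must agree on every input. */

int fib_iterative(int n) {
    if (n <= 1) return n;
    int a = 0, b = 1, c = 0;
    for (int i = 2; i <= n; i++) { c = a + b; a = b; b = c; }
    return c;
}

int fib_array(int n) {
    int *f = malloc((n + 2) * sizeof *f);  /* n + 2 so f[1] exists even for n = 0 */
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++) f[i] = f[i - 1] + f[i - 2];
    int result = f[n];
    free(f);
    return result;
}

int fib_recursive(int n) {
    return n <= 1 ? n : fib_recursive(n - 1) + fib_recursive(n - 2);
}
```

All three compute the same sequence; they differ only in the time and space they use, which is the subject of the next sections.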
How do we measure an algorithm's speed?
🡪 We do not measure the duration in hours, minutes, or seconds, because that
depends on the machine.
🡪 Instead, abstract units of time proportional to the number of operations
performed are used.
Algorithms time complexity
🡪 The execution time of an algorithm is the number of steps executed by the
algorithm; often referred to as the complexity of the algorithm.
🡪 Example:
int factorial (n) :
    int fact;            // 1
    fact = 1;            // 1
    int i = 1;           // 2
    while (i<=n):        // n+1
        fact = fact * i; // 2n
        i++;             // 2n
    return fact;         // 1

Time Complexity = 5n + 6
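The tally above can be reproduced mechanically. The sketch below is a hypothetical instrumentation (not part of the course code) that charges the same unit cost per statement while running factorial and recovers 5n + 6:

```c
#include <assert.h>

/* Charges the slide's unit costs per statement while running factorial. */
long factorial_steps(int n) {
    long steps = 0;
    int fact;                      steps += 1;   /* int fact;        -> 1   */
    fact = 1;                      steps += 1;   /* fact = 1;        -> 1   */
    int i = 1;                     steps += 2;   /* int i = 1;       -> 2   */
    while (steps += 1, i <= n) {                 /* test runs n+1 times     */
        fact = fact * i;           steps += 2;   /* 2n in total             */
        i++;                       steps += 2;   /* 2n in total             */
    }
    (void)fact;                                  /* value unused here       */
    steps += 1;                                  /* return fact;     -> 1   */
    return steps;                                /* = 5n + 6                */
}
```

The comma operator in the while condition charges one unit for every evaluation of the test, including the final failing one, matching the n+1 count in the tally.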
Time complexity for Fibonacci – approach 1
int fibonacci_iterative(int n)
{
    if (n <= 1) {
        return n;
    }
    int a = 0;
    int b = 1;
    int c = 0;
    for (int i = 2; i <= n; i++)
    {
        c = a + b;
        a = b;
        b = c;
    }
    return c;
}
Time complexity for Fibonacci – approach 2
int fibonacci_array (int n)
{
    int[] f = new int[n + 1];
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++)
        f[i] = f[i - 1] + f[i - 2];
    return f[n];
}
Time complexity for Fibonacci – approach 3
int fibonacci_recursive(int n)
{
    if (n <= 1) {
        return n;
    } else {
        return fibonacci_recursive(n-1) + fibonacci_recursive(n-2);
    }
}

🡪 The algorithm is slow since it keeps recomputing the same calculations over
and over again!
🡪 The time complexity is O(1.618^n) - the calculations will be explained in
Chapter 2.
Time complexity for Fibonacci – approach 3
int fibonacci_recursive(int n)
{
    if (n <= 1) {
        return n;
    } else {
        return fibonacci_recursive(n-1) + fibonacci_recursive(n-2);
    }
}
🡪 Example for n = 5
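The repeated work is easy to make visible by counting calls. The sketch below adds a global counter (my addition, not from the slides) to the recursive function:

```c
#include <assert.h>

long calls = 0;   /* hypothetical counter: how many times the function is entered */

int fib_counted(int n) {
    calls++;
    if (n <= 1) return n;
    return fib_counted(n - 1) + fib_counted(n - 2);
}
```

For n = 5 the function is entered 15 times to produce the single value F(5) = 5, and the number of calls grows exponentially with n, which is why the recursion tree for even moderate n is so large.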
Space complexity
🡪 Space complexity is the amount of memory used by the algorithm (including
the input values to the algorithm) to execute and produce the result.
🡪 Space Complexity = Auxiliary Space + Input space
🡪 Auxiliary Space is the extra space or the temporary space used by the
algorithm during its execution.

int ArraySum (arr, n) :             // arr, n: (n+1) * 4 bytes
    int sum = 0;                    // 1 * 4 bytes
    for (int i = 0; i < n; i++)     // 1 * 4 bytes
        sum = sum + arr[i];
    return sum;

Space Complexity = 4n + 12
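The byte counts can be tallied directly. This is a sketch assuming the slide's convention of a 4-byte int (actual sizes are platform-dependent):

```c
#include <assert.h>

/* Adds up the bytes the slide charges to ArraySum's variables. */
long arraysum_space(int n) {
    long bytes = 0;
    bytes += (long)n * 4;   /* arr: n ints      */
    bytes += 4;             /* parameter n      */
    bytes += 4;             /* int sum          */
    bytes += 4;             /* loop counter i   */
    return bytes;           /* = 4n + 12        */
}
```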
Space complexity for Fibonacci – approach 1
int fibonacci_iterative(int n)
{
    if (n <= 1) {
        return n;
    }
    int a = 0;
    int b = 1;
    int c = 0;
    for (int i = 2; i <= n; i++)
    {
        c = a + b;
        a = b;
        b = c;
    }
    return c;
}
Space complexity for Fibonacci – approach 2
int fibonacci_array (int n)
{
    int[] f = new int[n + 1];
    f[0] = 0;
    f[1] = 1;
    for (int i = 2; i <= n; i++)
        f[i] = f[i - 1] + f[i - 2];
    return f[n];
}
Space complexity for Fibonacci – approach 3
int fibonacci_recursive(int n)
{
    if (n <= 1) {
        return n;
    } else {
        return fibonacci_recursive(n-1) + fibonacci_recursive(n-2);
    }
}
Space complexity
Approach     Time complexity    Space complexity
Iterative    7n + 7             20
Array        9n + 6             4n + 12
Recursive    ~2^n               4n + 4
Space complexity
Time complexity                            Space complexity
Calculates time needed                     Calculates memory space needed
Counts time for all statements             Counts memory for all variables (even input)
More important for solution optimization   Less important with modern hardware
Case study: Linear Search
LinearSearch(A, key)
1   i ← 1
2   while i ≤ n and A[i] != key
3       do i++
4   if i ≤ n
5       then return true
6   else return false
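The pseudocode maps directly to C. This sketch makes the array length n an explicit parameter and shifts the 1-based index to 0-based:

```c
#include <assert.h>
#include <stdbool.h>

/* C rendering of LinearSearch(A, key): scan until a match or the end. */
bool linear_search(const int A[], int n, int key) {
    int i = 0;                      /* i <- 1 in the pseudocode         */
    while (i < n && A[i] != key)    /* while i <= n and A[i] != key     */
        i++;
    return i < n;                   /* found iff the loop stopped early */
}
```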
Case study: Linear Search
🡪 Worst-case Complexity
🡪 The maximum number of steps the algorithm takes during execution
🡪 Average-case Complexity
🡪 The average number of steps the algorithm takes during execution
🡪 Best-case Complexity
🡪 The minimum number of steps the algorithm takes during execution
Case study: Linear Search
🡪 The linearSearch algorithm:
🡪 best case: 1 + 1 + 1 + 1 = 4
🡪 worst case: 1 + (n+1) + n + 1 + 1 = 2n + 4
🡪 average case: 1 + n/2 + n/2 + 1 + 1 = n + 3
Asymptotic Notations - Big Oh, Omega, and Theta
🡪 Asymptotic notations are a mathematical tool that can be used to determine
the time or space complexity of an algorithm without having to implement it in
a programming language.
🡪 This measure is unaffected by machine-specific constants. It is a way of
describing a significant part of the cost of the algorithm.
Big Oh Notation (O)
🡪 This notation is denoted by ‘O’ and pronounced “Big Oh”. Big Oh notation
defines an upper bound for the algorithm: the running time of the algorithm
cannot exceed its asymptotic upper bound for any sequence of input data. We
use it to express the worst-case complexity.
🡪 Example, for the linearSearch algorithm, we say:
🡪 Time taken to execute the algorithm in the worst case: 𝑇(n) = O(n)
Big Oh Notation (O)
🡪 Let f(n) and g(n) be two nonnegative functions indicating the running time
of two algorithms. We say g(n) is an upper bound of f(n)
🡪 if there exist positive constants c and n0
such that 0 ≤ f(n) ≤ c.g(n) for all n ≥ n0.
It is denoted as T(n) = O(g(n))
🡪 Example, the linearSearch algorithm:
🡪 T(n) = 2n + 4 ≤ 3n
🡪 T(n) = O(n) for all n ≥ 4 and c = 3
For small input sizes, there may be many crossovers between the growth rates
of f(n) and c.g(n), but once n becomes sufficiently large, f(n) always grows
no faster than c.g(n). This value of n is called the crossover point and is
denoted n0.
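The witness c = 3, n0 = 4 can be checked numerically. A throwaway sketch for this specific f and g:

```c
#include <assert.h>

/* Does f(n) = 2n + 4 fit under c.g(n) = 3n at this n? */
int big_oh_holds(int n) {
    return 2 * n + 4 <= 3 * n;
}
```

The inequality first holds at n = 4, which is exactly the crossover point n0 quoted above.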
Big Omega Notation (Ω)
🡪 This notation is denoted by ‘Ω’ and pronounced “Big Omega”. Big Omega
notation defines a lower bound for the algorithm: the running time of the
algorithm cannot be less than its asymptotic lower bound for any sequence of
input data. We use it to express the best-case complexity.
🡪 Example, for the linearSearch algorithm, we say:
🡪 Time taken to execute the algorithm in the best case: 𝑇(𝑛) = Ω(1)
Big Omega Notation (Ω)
🡪 Let f(n) and g(n) be two nonnegative functions indicating the running time
of two algorithms. We say g(n) is a lower bound of f(n)
🡪 if there exist positive constants c and n0
such that 0 ≤ c.g(n) ≤ f(n) for all n ≥ n0.
It is denoted as T(n) = Ω(g(n))
🡪 Example, the linearSearch algorithm:
🡪 In the best case, T(n) = 4 ≥ 1 (with c = 1)
🡪 T(n) = Ω(1)
Big Theta Notation (Θ)
🡪 This notation is denoted by ‘Θ’ and pronounced “Big Theta”. Big Theta
notation defines a tight bound for the algorithm: the running time of the
algorithm can neither be less than nor greater than its asymptotic tight
bound for any sequence of input data. We use it to express the average-case
complexity.
🡪 Example, for the linearSearch algorithm, we say:
🡪 Time taken to execute the algorithm in the average case: T(n) = Θ(n)
Big Theta Notation (Θ)
🡪 Let f(n) and g(n) be two nonnegative functions indicating the running time
of two algorithms. We say g(n) is a tight bound of f(n)
🡪 if there exist positive constants c1, c2, and n0
such that 0 ≤ c1.g(n) ≤ f(n) ≤ c2.g(n) for all n ≥ n0.
It is denoted as T(n) = Θ(g(n))
🡪 Example, the linearSearch algorithm:
🡪 n ≤ T(n) = n + 3 ≤ 2n ; for c1 = 1, c2 = 2, and n0 = 3
🡪 T(n) = Θ(n)
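Again the constants can be verified numerically. A throwaway sketch for f(n) = n + 3 and g(n) = n:

```c
#include <assert.h>

/* Does c1.g(n) <= f(n) <= c2.g(n) hold with c1 = 1, c2 = 2? */
int theta_holds(int n) {
    return 1 * n <= n + 3 && n + 3 <= 2 * n;
}
```

The lower inequality holds everywhere; the upper one first holds at n = 3, matching n0 = 3.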
Common growth rates
Constant Time
🡪 O(1) → Constant Time, independent of n
🡪 O(1) means that it takes a constant amount of time to run an algorithm,
regardless of the size of the input.
🡪 Examples:
🡪 Accessing an index of an array
🡪 Inserting/deleting a value at the head of a linked list
🡪 Looping a constant number of times
Linear Time
🡪 O(n) → Linear Time
🡪 O(n) means that the execution time increases at the same rate as the input.
🡪 Examples:
🡪 Printing the values of an array
🡪 Inserting at the end of a linked list
🡪 Computing factorial n
Logarithmic Time
🡪 O(log n) → Logarithmic Time
🡪 O(log n) means that the execution time increases in proportion to the
logarithm of the input size, which means that the execution time barely
increases when you increase the input.
🡪 Example:
🡪 let n = 20; we stop when 2^k ≥ 20 🡪 when k = 5

iteration:   1st    2nd    3rd        4th        5th
i:           1      2      2^2 = 4    2^3 = 8    2^4 = 16
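The table corresponds to a loop whose counter doubles on each pass. A sketch (my helper, not from the slides) that counts those passes:

```c
#include <assert.h>

/* Counts passes of a loop where i doubles until it reaches n. */
int doubling_iterations(int n) {
    int count = 0;
    for (long i = 1; i < n; i *= 2)   /* i = 1, 2, 4, 8, 16, ... */
        count++;
    return count;                     /* roughly log2(n) */
}
```

For n = 20 it returns 5, matching the five iterations in the table; doubling n adds only one more pass, which is the hallmark of logarithmic time.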
Quadratic Time
🡪 O(n²) → Quadratic Time
🡪 O(n²) means that the calculation runs in quadratic time, which is the squared
size of the input data.
🡪 Examples:
🡪 Selection sort
🡪 Multiplication table
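The multiplication-table example makes the n² growth concrete: two nested loops over n perform n * n inner steps. A sketch that counts the steps instead of printing the table:

```c
#include <assert.h>

/* Counts the inner-loop steps of an n-by-n multiplication table. */
long quadratic_steps(int n) {
    long steps = 0;
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= n; j++)
            steps++;            /* one table entry per step */
    return steps;               /* = n * n */
}
```

Doubling n quadruples the work, unlike the linear and logarithmic cases above.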