FDS (Complexity of Algorithm)

FDS Time Complexity

Unit 1
Introduction to Algorithm and Data Structure
Analysis of Algorithm
• Need of algorithm analysis:
– It is an important part of computational complexity theory, which provides a theoretical estimate of the resources an algorithm requires to solve a specific computational problem.
– It is the process of analyzing the problem-solving capability of an algorithm in terms of the time and space it requires.
– Analysis of algorithms is performed to decide which of several algorithms is better.
– The efficiency of an algorithm can be decided by measuring its performance.
Complexity of Algorithm
• What is algorithm complexity?
– Algorithm complexity measures how many steps an algorithm requires to solve the given problem.
– The performance of an algorithm can be measured by the requirement and utilization of resources (such as CPU time and memory) during execution.
– It focuses on calculating the memory requirement and the execution time taken by the algorithm.
• Space complexity -> memory requirement
• Time complexity -> time requirement
Space Complexity
• Space complexity:
– Definition -> The total amount of computer memory required by an algorithm to complete its execution is called the space complexity of that algorithm.
– The space complexity of an algorithm represents the amount of memory needed to store data during its life cycle.
– The memory requirement depends on the input data size and the number of variables used in the algorithm.
– For any algorithm, memory is required for the following purposes:
• To store program instructions.
• To store constant values.
• To store variable values.
Continued…
• The space needed by an algorithm is equal to the sum of the following two components:
– A fixed part:
• Space required to store certain data and variables (simple variables, constants, input data, etc.) that do not depend on the size of the problem.
– A variable part:
• Space required by components whose size depends entirely on the size of the problem, e.g. dynamically sized data, the recursion stack, and function-call overhead.
• The space complexity S(p) of any algorithm p is:
S(p) = A + V
where A -> fixed part and V -> variable part.
Example of Space Complexity
▪ Example #1:

#include <stdio.h>
int main()
{
    int a = 5, b = 5, c;
    c = a + b;
    printf("%d", c);
    return 0;
}

Space complexity calculation:
1. Integer variables -> 3 (a, b, c)
2. Size of the integer data type -> 4 bytes
3. Total space -> 4 * 3 = 12 bytes

S(P) -> O(1) / constant
Example of Space Complexity
▪ Example #2:

#include <stdio.h>
int main()
{
    int n, i, sum = 0;
    scanf("%d", &n);
    int arr[n];
    for (i = 0; i < n; i++)
    {
        scanf("%d", &arr[i]);
        sum = sum + arr[i];
    }
    printf("%d", sum);
    return 0;
}

Space complexity calculation:
1. Array size -> n
2. Space occupied by the array -> 4 * n = 4n bytes
3. Integer variables -> 3 (n, i, sum)
4. Size of integers -> 3 * 4 = 12 bytes
5. Total space -> 4n + 12 bytes

S(P) -> O(n) / linear
Time Complexity
• What is time complexity?
– The time complexity of an algorithm signifies the total time required by the program to run to completion.
– Time complexity can be affected by:
• The size of the input data
• The number of instructions
• The configuration of the machine used to execute the program
• The machine-language instruction set
• The translation time required by the compiler to convert the program to machine language
• The time required to execute each instruction
Time Complexity
• Frequency count:
– For the same algorithm, execution time on a faster computer will be less, whereas on a slower computer it will be more.
– Hence the actual measured time can differ from machine to machine.
– The solution to this problem is the frequency count.
– The frequency count denotes the number of times each instruction in the algorithm is executed.
Time Complexity
• Example #1 of frequency count:

void main()
{
    int x, y;
    x = 25;
    y = 50;
    x = x + y;
    printf("%d", x);
}

Instruction        Freq. count
x = 25             1
y = 50             1
x = x + y          1
printf("%d", x)    1

Frequency count -> 1 + 1 + 1 + 1 = 4

Asymptotic Notations
• Asymptotic analysis:
– Asymptotic analysis refers to computing the running time of any operation in mathematical units of computation.
– It refers to defining a mathematical bound / framing of an algorithm's run-time performance.
– Usually, the time required by an algorithm falls under three types:
• Best case − minimum time required for program execution.
• Average case − average time required for program execution.
• Worst case − maximum time required for program execution.
Asymptotic Notations
• Types of Asymptotic Notations:
– There are three commonly used asymptotic notations to
calculate the running time complexity of an algorithm.
• Ο Notation (Big O)
• Ω Notation (Omega)
• θ Notation (Theta)
Asymptotic Notations
• O notation (Big Oh notation):
– The notation Ο(n) is the formal way to express the upper bound of an algorithm's running time.
– It measures the worst-case time complexity, i.e. the longest time an algorithm can possibly take to complete.
– Definition:
• Let f(n) and g(n) be two non-negative functions.
• If there exist a positive integer n0 and a constant c > 0 such that
– f(n) <= c · g(n)
– for all n >= n0,
• then f(n) = O(g(n)). Function g(n) is an upper bound for f(n): c · g(n) grows at least as fast as f(n).
Big O Notation
O(g(n)) = { f(n) : there exist c > 0 and n0 such that f(n) ≤ c · g(n) for all n ≥ n0 }
Asymptotic Notations
• Omega notation (Ω):
– The notation Ω(n) is the formal way to express the lower bound of an algorithm's running time.
– It measures the best-case time complexity, i.e. the minimum time an algorithm can possibly take to complete.
– Definition:
• Let f(n) and g(n) be two non-negative functions.
• If there exist a positive integer n0 and a constant c > 0 such that
– f(n) >= c · g(n)
– for all n >= n0,
• then f(n) = Ω(g(n)). Function g(n) is a lower bound for f(n): c · g(n) grows no faster than f(n).
Omega Notation
Ω(g(n)) = { f(n) : there exist c > 0 and n0 such that f(n) ≥ c · g(n) for all n ≥ n0 }
Asymptotic Notations
• Theta notation (θ):
– The notation θ(n) is the formal way to express both the lower bound and the upper bound of an algorithm's running time.
– Definition:
• Let f(n) and g(n) be two non-negative functions.
• If there exist a positive integer n0 and constants c1 > 0 and c2 > 0 such that
– c1 · g(n) <= f(n) <= c2 · g(n)
– for all n >= n0, then f(n) = θ(g(n)).
– It is represented as follows:
θ(g(n)) = { f(n) : f(n) = O(g(n)) and f(n) = Ω(g(n)) }
Theta Notation, θ
f(n) = θ(g(n)) if and only if f(n) = Ο(g(n)) and f(n) = Ω(g(n))
