Performance Analysis
• Space Complexity:
The space complexity of an algorithm is the
amount of memory it needs to run to
completion.
• Time Complexity:
The time complexity of an algorithm is the
amount of computer time it needs to run to
completion.
• Performance evaluation can be divided into two
major phases:
1. A Priori estimates – performance analysis
- Analysis done before execution.
- A priori estimates provide estimated, uniform
values.
- A priori estimates are independent of the CPU,
OS and system architecture.
2. A Posteriori testing – performance
measurement
- A posteriori testing is done after execution.
- A posteriori testing provides exact values, and
these values are non-uniform.
- A posteriori values are dependent on the CPU,
OS and system architecture.
Space Complexity
The space needed by each of these algorithms is
the sum of the following components:
1. A fixed part that is independent of the characteristics
(e.g., number, size) of the inputs and outputs.
• This part typically includes the instruction space (i.e.,
space for the code), space for simple variables and fixed-
size component variables (also called aggregates), space
for constants, and so on.
2. A variable part that consists of the space needed by
component variables whose size is dependent on the
particular problem instance being solved, the space
needed by referenced variables and the recursion stack
space.
• The space requirement S(P) of any algorithm P may
therefore be written as
S(P) = c + Sp(instance characteristics),
where 'c' is a constant.
Example 1:
Algorithm abc(a, b, c)
{
return a + b + b*c + (a + b - c)/(a + b) + 4.0;
}
Assuming that one word is adequate to store the
values of each of a, b, c and the result, we see that the
space needed by abc is independent of the instance
characteristics. So, Sp(instance characteristics) = 0.
Example 2:
Algorithm Sum(a, n)
{
s = 0.0;
for i = 1 to n do
s = s + a[i];
return s;
}
• The problem instances for this algorithm are
characterized by n, the number of elements to be
summed.
• The space needed by 'n' is one word, since it is of
type integer.
• The space needed by ‘a’ is the space needed
by variables of type array of floating point
numbers.
• This is at least ‘n’ words, since ‘a’ must be
large enough to hold the ‘n’ elements to be
summed.
• So, we obtain SSum(n) >= n + 3
(n words for a[], one each for n, i and s).
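As a cross-check, the algorithm transcribes directly into Python (a sketch; the function name and the 0-based indexing are adaptations, not part of the slides):

```python
# Direct transcription of Algorithm Sum (a sketch). The space usage
# mirrors the analysis above: n words for the array a[], plus one word
# each for n, i and s.
def sum_iter(a, n):
    s = 0.0
    for i in range(1, n + 1):   # i = 1 to n
        s = s + a[i - 1]        # a[] is 0-based in Python
    return s
```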
Recursive algorithms for Sum
• The recursion stack space includes the space for all formal
parameters ,the local variables and the return address.
Assume that the return address requires only one word of
memory.
• Each call to RSum requires at least 3 words (space for
n, the return address and a pointer to a[]).
• Since the depth of recursion is (n+1) , the recursion stack
space needed is >= 3(n+1).
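The recursive version can be sketched in Python as follows (an adaptation; the base case n <= 0 is assumed). Each of the n + 1 active calls keeps n, the return address and a reference to a[] on the stack, which is where the >= 3(n + 1) bound comes from:

```python
# Recursive RSum (a sketch). The depth of recursion is n + 1, so the
# recursion stack holds roughly 3(n + 1) words: n, the return address
# and a reference to a[] per active call.
def r_sum(a, n):
    if n <= 0:
        return 0.0
    return r_sum(a, n - 1) + a[n - 1]
```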
Time Complexity
• The time T(P) taken by a program P is the sum of the
compile time and the run time.
The compile time does not depend on the instance
characteristics.
Also, we may assume that a compiled program will be
run several times without recompilation, so the run
time is the component of interest.
The run time is denoted by tP(instance
characteristics).
The number of steps any program statement is
assigned depends on the kind of statement.
• For example, comments count as 0 steps.
• An assignment statement counts as 1 step
(provided it does not involve any calls to other
algorithms).
• For iterative statements such as for, while and
repeat-until, we count only the control part of the
statement.
• Time Complexity is calculated by using either
Step Count Method or Step Table Method.
1. Step count method
• We introduce a variable, count, into the
program, with initial value 0. Statements to
increment count by the appropriate amount
are then introduced into the program.
• This is done so that each time a statement in
the original program is executed, count is
incremented by the step count of that
statement.
Algorithm Sum(a, n)
{
s = 0.0;
for i = 1 to n do
s = s + a[i];
return s;
}
With the count statements introduced:
Algorithm Sum(a, n)
{
s = 0.0; count = count + 1;
for i = 1 to n do
{
count = count + 1;
s = s + a[i]; count = count + 1;
}
count = count + 1;
count = count + 1;
return s;
}
If count is zero to start with, then it will be 2n + 3 on
termination. So each invocation of Sum executes a
total of 2n + 3 steps.
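The instrumented algorithm can be run directly; here is a Python sketch (count is local and returned rather than global, which is an adaptation):

```python
# Step-count instrumentation of Sum (a sketch): count is incremented by
# the step count of each executed statement, giving 2n + 3 on termination.
def sum_with_count(a, n):
    count = 0
    s = 0.0
    count += 1              # for the assignment s = 0.0
    for i in range(1, n + 1):
        count += 1          # for each true for-loop test
        s = s + a[i - 1]
        count += 1          # for the assignment to s
    count += 1              # for the last (false) for-loop test
    count += 1              # for the return
    return s, count
```

For n = 3 this returns count 9, which is 2*3 + 3, as claimed.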
Matrix Addition
TMatrixAdd(m, n) = 2mn + 2m + 1
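A count-instrumented matrix addition (a sketch; names and 0-based loops are Python adaptations) reproduces this formula:

```python
# Matrix addition with a step counter (a sketch). For an m x n result the
# count works out to m*(2n + 2) + 1 = 2mn + 2m + 1, matching the formula.
def matrix_add(a, b, m, n):
    count = 0
    c = [[0.0] * n for _ in range(m)]
    for i in range(m):
        count += 1          # true test of the outer for-loop
        for j in range(n):
            count += 1      # true test of the inner for-loop
            c[i][j] = a[i][j] + b[i][j]
            count += 1      # assignment to c[i][j]
        count += 1          # last (false) inner-loop test
    count += 1              # last (false) outer-loop test
    return c, count
```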
2. Step Table Method
• The second method to determine the step
count of an algorithm is to build a table in
which we list the total number of steps
contributed by each statement.
First determine the number of steps per
execution (s/e) of the statement and the total
number of times (i.e., frequency) each
statement is executed.
By combining these two quantities, the total
contribution of each statement is obtained;
summing these contributions gives the step
count for the entire algorithm.
Worked examples built with this method: the step
tables for Algorithm Sum(a, n), Algorithm RSum(a, n)
and matrix addition.
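As an illustration, the step table for the iterative Sum can be reconstructed in this style (a reconstruction consistent with the 2n + 3 count obtained earlier, not copied from the slides):

```
Statement               s/e   Frequency   Total steps
Algorithm Sum(a, n)      0       --            0
{                        0       --            0
  s = 0.0;               1        1            1
  for i = 1 to n do      1      n + 1        n + 1
    s = s + a[i];        1        n            n
  return s;              1        1            1
}                        0       --            0
-----------------------------------------------------
Total                                        2n + 3
```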
Algorithm Analysis
• We have three cases to analyze an algorithm:
1) Worst Case 2) Average Case 3) Best Case
• In worst-case analysis, we calculate an upper bound
on the running time of an algorithm.
• In average-case analysis, we take all possible inputs,
calculate the computing time for each of them, sum
all the calculated values and divide the sum by the
total number of inputs.
• In best-case analysis, we calculate a lower bound on
the running time of an algorithm.
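Linear search is a standard illustration of the three cases (an example assumed here, not taken from the slides): the best case finds the key at the first position, the worst case scans the whole array.

```python
# Linear search (a sketch) returning the index and the number of key
# comparisons made, to make the best/worst case difference visible.
def linear_search(a, key):
    comparisons = 0
    for i, x in enumerate(a):
        comparisons += 1
        if x == key:
            return i, comparisons   # best case: 1 comparison
    return -1, comparisons          # worst case: len(a) comparisons
```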
Asymptotic Analysis
• Expressing the complexity in terms of its relationship
to known functions is called asymptotic analysis.
• Asymptotic notations are the mathematical
notations used to describe the running time of an
algorithm when the input tends towards a particular
value or a limiting value.
• We have five asymptotic notations:
1. Big ‘oh’ (O) 2. Omega (Ω) 3. Theta (Θ)
4. Little ‘oh’ (o) 5. Little omega (ω)
Big “oh” (O)
Ex: 1. 3n + 2 = O(n) as 3n + 2 <= 4n for all n >= 2
2. 3n + 3 = O(n) as 3n + 3 <= 4n for all n >= 3
3. 100n + 6 = O(n) as 100n + 6 <= 101n for all n >= 6
4. 10n^2 + 4n + 2 = O(n^2) as 10n^2 + 4n + 2 <= 11n^2
for all n >= 5
5. 6*2^n + n^2 = O(2^n) as 6*2^n + n^2 <= 7*2^n for all n >= 4
• Big-O notation represents the upper
bound of the running time of an
algorithm.
• It gives the worst case complexity of an
algorithm.
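The inequalities quoted in the examples are easy to sanity-check numerically; a small Python sketch:

```python
# Numeric spot-check (a sketch) of two of the Big-O bounds above.
assert all(3 * n + 2 <= 4 * n for n in range(2, 1000))                    # 3n+2 = O(n)
assert all(10 * n * n + 4 * n + 2 <= 11 * n * n for n in range(5, 1000))  # = O(n^2)
```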
Poll Question?
• 1000*n^2 + 100*n - 6 = O(n^2): is it
correct? (yes/no)
Omega (Ω)
Ex: 1. 3n + 2 = Ω(n) as 3n + 2 >= 3n for all n >= 1
2. 3n + 3 = Ω(n) as 3n + 3 >= 3n for all n >= 1
3. 100n + 6 = Ω(n) as 100n + 6 >= 100n for all n >= 1
4. 10n^2 + 4n + 2 = Ω(n^2) as 10n^2 + 4n + 2 >= n^2
for all n >= 1
• Omega notation represents the lower
bound of the running time of an
algorithm.
• It provides best case complexity of an
algorithm.
Poll Question?
6*2^n + n^2 = Ω(2^n): is it correct? (Y/N)
Theta (Θ)
Ex: 1. 3n + 2 = Θ(n) as 3n + 2 >= 3n and
3n + 2 <= 4n for all n >= 2
2. 3n + 3 = Θ(n)
3. 100n + 6 = Θ(n)
4. 10n^2 + 4n + 2 = Θ(n^2)
5. 6*2^n + n^2 = Θ(2^n)
6. 10*log n + 4 = Θ(log n)
• Theta notation encloses the function
from above and below.
• It represents both the upper and the
lower bound of the running time of an
algorithm.
• It is used for analyzing the average-case
complexity of an algorithm.
Little “oh” (o)
• f(n) = o(g(n)) iff lim (n→∞) f(n)/g(n) = 0; that is,
g(n) is a strict (non-tight) upper bound on f(n).
Ex: 3n + 2 = o(n^2).
Little omega (ω)
• f(n) = ω(g(n)) iff lim (n→∞) g(n)/f(n) = 0; that is,
g(n) is a strict (non-tight) lower bound on f(n).
Ex: 3n^2 + 2 = ω(n).
Home Assignment
• Find the time complexity of the algorithms below
by using the step table method:
1. Matrix multiplication
2. A sorting technique
3. Computing x^n