CSE-205 Algorithms: Algorithm, Asymptotic Notation & Complexity Analysis

The bubble sort algorithm has a worst-case running time of O(N^2): it makes N-1 passes over the array, and pass K requires N-K comparisons. It sorts the input array in place, without using additional storage.

CSE-205 Algorithms

Algorithm, Asymptotic Notation & Complexity Analysis
Algorithm
• An algorithm is a finite, step-by-step list of well-defined instructions for solving a particular problem.
• The formal presentation of an algorithm consists of two parts. The first part is a paragraph that states the purpose of the algorithm, identifies the variables that occur in it, and lists the input data. The second part is the list of steps to be executed.
• Example : A nonempty array DATA with N numerical values is given. Find the location LOC
and the value MAX of the largest element of DATA.
• Algorithm: Given a nonempty array DATA with N numerical values, this algorithm finds the
location LOC and the value MAX of the largest element of DATA.
1. Set K := 1, LOC := 1 and MAX := DATA[1].
2. Repeat steps 3 and 4 while K <= N:
3.    If MAX < DATA[K], then:
         Set LOC := K and MAX := DATA[K].
      [End of if structure]
4.    Set K := K + 1.
   [End of step 2 loop]
5. Write: LOC, MAX.
6. Exit.

October 15, 2020 2


Analyzing Algorithms
• Predict the amount of resources required:
  – memory: how much space is needed?
  – computational time: how fast does the algorithm run?

• FACT: running time grows with the size of the input

• Input size (number of elements in the input)
  – size of an array, polynomial degree, number of elements in a matrix, number of bits in the binary representation of the input, vertices and edges in a graph

• Def: Running time = the number of primitive operations (steps) executed before termination
  – arithmetic operations (+, -, *), data movement, control and decision making (if, while), comparisons
Algorithm Analysis: Example
• Alg.: MIN (a[1], …, a[n])
m ← a[1];
for i ← 2 to n
if a[i] < m
then m ← a[i];
• Running time:
  – the number of primitive operations (steps) executed before termination
  – T(n) = 1 [first step] + n [for-loop tests] + (n-1) [if condition] + (n-1) [the assignment in then, worst case] = 3n - 1
• Order (rate) of growth:
– The leading term of the formula
– Expresses the asymptotic behavior of the algorithm



Typical Running Time Functions
• 1 (constant running time):
  – instructions are executed once or a few times

• log N (logarithmic):
  – a big problem is solved by repeatedly cutting the problem size down by a constant fraction at each step

• N (linear):
  – a small amount of processing is done on each input element

• N log N:
  – a problem is solved by dividing it into smaller subproblems, solving them independently, and combining the solutions

October 15, 2020 5


Typical Running Time Functions
• N^2 (quadratic):
  – typical for algorithms that process all pairs of data items (doubly nested loops)

• N^3 (cubic):
  – processing of triples of data items (triply nested loops)

• N^k (polynomial)

• 2^N (exponential):
  – few exponential algorithms are appropriate for practical use



Growth of Functions / Complexity Graphs

[Figures: plots comparing the growth of log(n), n, n log(n), n^2, n^3, and n^10; a log-scale plot also compares 1.1^n, n^10, n^20, 2^n, 3^n, and n^n]


Algorithm Complexity
• Worst Case Complexity:
– the function defined by the maximum number of steps
taken on any instance of size n
• Best Case Complexity:
– the function defined by the minimum number of steps
taken on any instance of size n
• Average Case Complexity:
– the function defined by the average number of steps
taken on any instance of size n



Best, Worst, and Average Case Complexity

[Figure: number of steps vs. input size N, with the worst-case complexity curve above the average-case curve, which lies above the best-case curve]


Doing the Analysis
• It’s hard to estimate the running time exactly
– Best case depends on the input
– Average case is difficult to compute
– So we usually focus on worst case analysis
• Easier to compute
• Usually close to the actual running time
• Strategy: find a function (an equation) that, for large n, is an
upper bound to the actual function (actual number of steps,
memory usage, etc.)
[Figure: an upper bound curve above the actual function, and a lower bound curve below it]



Motivation for Asymptotic Analysis

• An exact computation of worst-case running time can be difficult:
  – the function may have many terms:
    • 4n^2 - 3n log n + 17.5n - 43n^(2/3) + 75
• An exact computation of worst-case running time is unnecessary


Classifying functions by their
Asymptotic Growth Rates (1/2)
• Asymptotic growth rate, asymptotic order, or order of functions
  – comparing and classifying functions while ignoring
    • constant factors and
    • small inputs
• For this we have big-oh O(g), big-theta Θ(g), and big-omega Ω(g)



Classifying functions by their
Asymptotic Growth Rates (2/2)
 O(g(n)), Big-Oh of g of n, the Asymptotic Upper Bound;

 Θ(g(n)), Theta of g of n, the Asymptotic Tight Bound; and

 Ω(g(n)), Omega of g of n, the Asymptotic Lower Bound.



Big-O
f  n   O g  n   : there exist positive constants c and n0 such that
0  f  n   cg  n  for all n  n0

• What does it mean?


– If f(n) = O(n^2), then:
  • f(n) can be larger than n^2 sometimes, but…
  • we can choose some constant c and some value n0 such that for every value of n larger than n0: f(n) ≤ cn^2
  • that is, for values larger than n0, f(n) is never more than a constant multiplier greater than n^2
  • or, in other words, f(n) does not grow more than a constant factor faster than n^2



Visualization of O(g(n))

[Figure: for all n ≥ n0, f(n) lies below c·g(n)]


Examples

– 2n^2 = O(n^2): 2n^2 ≤ cn^2 holds with c = 2 and n0 = 1

– n^2 = O(n^2): n^2 ≤ cn^2 holds with c = 1 and n0 = 1

– 1000n^2 + 1000n = O(n^2): 1000n^2 + 1000n ≤ 1000n^2 + 1000n^2 = 2000n^2 for n ≥ 1, so c = 2000 and n0 = 1 (c = 1001 also works, but only with n0 = 1000)

– n = O(n^2): n ≤ cn^2 holds with c = 1 and n0 = 1



Big-O

2n^2 = O(n^2)

1,000,000n^2 + 150,000 = O(n^2)

5n^2 + 7n + 20 = O(n^2)

2n^3 + 2 ≠ O(n^2)

n^2.1 ≠ O(n^2)


Example
Algorithm Bubble_Sort (DATA, N):

1. Repeat steps 2 to 5 for K = 1 to N-1.
2.    Set PTR := 1. [Initializes pass pointer PTR]
3.    Repeat steps 4 and 5 while PTR <= N-K: [Executes pass]
4.       If DATA[PTR] > DATA[PTR+1], then:
            TEMP := DATA[PTR],
            DATA[PTR] := DATA[PTR+1],
            DATA[PTR+1] := TEMP.
         [End of if structure]
5.       Set PTR := PTR + 1.
      [End of inner loop]
   [End of step 1 outer loop]
6. Exit.
Visualization of Θ(g(n))

[Figure: for all n ≥ n0, f(n) lies between c1·g(n) and c2·g(n)]


Practice Example
• Code:
• a = b;

• Complexity: O(1) — a single assignment, independent of n.



Practice Example
• Code:
• for (i=1; i <=n; i++)
• printf(“Hello”);

• Complexity: O(n) — the loop body runs n times.



Practice Example
• Code:
• for (i=1; i<=n; i++)
• for (j=1; j<=n; j++)
• printf(“Hello”);
• Complexity: O(n^2) — the body runs n × n times.



Practice Example
• Code:
• for (j=1; j<=n; j++)
• for (i=1; i<=j; i++)
• printf(“Hello”);
• Complexity: O(n^2) — the inner loop runs j times, and 1 + 2 + … + n = n(n+1)/2.



Practice Example
• Code:
• for (s=1,i=1; s<=n; i++,s=s+i)
• printf(“Hello”);
• Complexity: O(√n) — after i iterations s = i(i+1)/2, and the loop stops once this exceeds n, i.e. after about √(2n) iterations.



Practice Example
• Code:
• for (k=1; k<=n; k*=2)
• for (j=1; j<=n; j++)
• printf(“Hello”);
• Complexity: O(n log n) — the outer loop doubles k, so it runs about log2(n) times; the inner loop runs n times for each.



Practice Example
• Code:
• for (i=1; i^2<=n; i++)
• printf(“Hello”);
• Complexity: O(√n) — the loop runs while i^2 ≤ n, i.e. about √n times.



Practice Example
• Code:
• for (i=n/2; i<=n; i++)
• for (j=1; j<=n; j*=2)
• for (k=1; k<=n; k*=2)
• printf(“Hello”);
• Complexity: O(n log^2 n) — the outer loop runs n/2 times, and each of the two inner loops doubles its counter, so each runs about log2(n) times.



Practice Example
• Code:
• for (i=1; i<=n; i++)
• for (j=1; j<=n; j+=i)
• printf(“Hello”);
• Complexity: O(n log n) — the inner loop runs about n/i times for each i, and n(1 + 1/2 + … + 1/n) = n·H_n = Θ(n log n).



Time analysis of recursive program
• Code:
• A(n){
• if(n>1)
• return (A(n-1));
• }
• Time function: T(n)=1 + T(n-1) ; if n>1
• =0; if n=1
• Complexity: O(n) — unrolling gives T(n) = 1 + 1 + … + 1 = n - 1.



Time analysis of recursive program
• Time function: T(n)=n + T(n-1) ; if n>1
• =1; if n=1
• Complexity: O(n^2) — unrolling gives T(n) = n + (n-1) + … + 1 = n(n+1)/2.



Time analysis of recursive program
• Time function:
• T(n) = c + 2T(n/2) ; n>1
•      = c ; n=1
• Complexity: O(n) — the recursion tree has 1 + 2 + 4 + … + n ≈ 2n nodes, each costing c.



Time analysis of recursive program
• Time function:
• T(n)=n + 2T(n/2) ; n>1
• =1 ; n=1
• Complexity: O(n log n) — each of the log2(n) levels of the recursion tree contributes n.



Master's Theorem
• T(n) = aT(n/b) + Θ(n^k log^p n)
• where a ≥ 1, b > 1, k ≥ 0 and p is a real number.
• There are three cases:
1. If a > b^k, then T(n) = Θ(n^(log_b a))
2. If a = b^k:
   • if p > -1, then T(n) = Θ(n^(log_b a) · log^(p+1) n)
   • if p = -1, then T(n) = Θ(n^(log_b a) · log log n)
   • if p < -1, then T(n) = Θ(n^(log_b a))
3. If a < b^k:
   • if p ≥ 0, then T(n) = Θ(n^k log^p n)
   • if p < 0, then T(n) = Θ(n^k)



Examples of Master's Theorem
• 1. T(n) = 3T(n/2) + n^2

• Here, a=3, b=2, k=2, p=0; so a < b^k and p ≥ 0.

• Therefore, rule no. 3(a) applies.
• Time complexity: T(n) = Θ(n^k log^p n) = Θ(n^2 log^0 n) = Θ(n^2)



Examples of Master's Theorem
• 2. T(n) = 4T(n/2) + n^2

• Here, a=4, b=2, k=2, p=0; so a = b^k and p > -1.

• Therefore, rule no. 2(a) applies.
• Time complexity: T(n) = Θ(n^(log_b a) · log^(p+1) n) = Θ(n^(log_2 4) · log n) = Θ(n^2 log n)



Examples of Master's Theorem
• 3. T(n) = 2^n T(n/2) + n^2

• Here, a = 2^n is not a constant (the theorem requires a constant a ≥ 1), so the recurrence does not fit the format and the Master Theorem does not apply.



Examples of Master's Theorem
• 4. T(n) = 2T(n/2) + n log n

• Here, a=2, b=2, k=1, p=1; so a = b^k and p > -1.

• Therefore, rule no. 2(a) applies.
• Time complexity: T(n) = Θ(n^(log_b a) · log^(p+1) n) = Θ(n^(log_2 2) · log^2 n) = Θ(n log^2 n)



Examples of Master's Theorem
• 5. T(n) = 2T(n/2) + n/log n

• Here, a=2, b=2, k=1, p=-1; so a = b^k and p = -1.

• Therefore, rule no. 2(b) applies.
• Time complexity: T(n) = Θ(n^(log_b a) · log log n) = Θ(n^(log_2 2) · log log n) = Θ(n log log n)



