lect3_Asymptotic Notation_Part 1

The document outlines a lecture on the design and analysis of algorithms, covering foundational concepts, sorting algorithms, graph algorithms, and advanced techniques like dynamic programming. It emphasizes the importance of analyzing algorithms in terms of running time and memory requirements, using asymptotic notation for comparison. Additionally, it discusses methods for solving recurrences and provides examples of different types of algorithm analysis.

Design and Analysis of Algorithms
Lecture 3
Dr. Metwally Rashad
2022
Table of Contents
1. Foundations
- The Role of Algorithms in Computing
- Design and Analysis of Algorithms
- Growth of Functions
2. Sorting
- Heapsort
- Quicksort
- Sorting in Linear Time
3. Graph Algorithms
- Elementary Graph Algorithms
- Minimum Spanning Trees
4. Advanced Design and Analysis Techniques
- Dynamic Programming
- Greedy Algorithms
5. Selected Topics
- Linear Programming
- String Matching
Ch3: Growth of Functions
Analysis of Algorithms
 An algorithm is a finite set of precise instructions for performing
a computation or for solving a problem

 What is the goal of analysis of algorithms?
 To compare algorithms, mainly in terms of running time but also in terms of
other factors (e.g., memory requirements, programmer's effort, etc.)

 What do we mean by running time analysis?
 Determine how running time increases as the size of the problem increases.
Types of Analysis
 Worst Case
 Provides an upper bound on running time
 An absolute guarantee that the algorithm will not run longer, no matter what the input is

 Best Case
 Provides a lower bound on running time
 Input is the one for which the algorithm runs the fastest

 Average Case
 Provides a prediction about the running time
 Assumes that the input is random

Lower Bound ≤ Running Time ≤ Upper Bound


How do we Compare Algorithms?
 We need to define a number of objective measures.
(1) Compare execution times?
Not good: times are specific to a particular computer.

(2) Count the number of statements executed?
Not good: the number of statements varies with the programming
language as well as the style of the individual programmer.
How do we Compare Algorithms? (cont.)
 Ideal Solution
 Express running time as a function of the input size 𝑛 (i.e., 𝑓(𝑛)).
 Compare different functions corresponding to running times.
 Such an analysis is independent of machine speed, programming style, etc.

 Associate a "cost" with each statement.
 Find the "total cost" by multiplying the cost by the total number of
times each statement is executed.
Asymptotic Notation
 To compare two algorithms with running times 𝑓(𝑛) and 𝑔(𝑛), we need
a rough measure that characterizes how fast each function grows with
respect to 𝑛

 Big 𝑂-notation: asymptotic upper bound (worst case)
𝑓(𝑛) = 𝑂(𝑔(𝑛)) means: 𝑓(𝑛) ≤ 𝑐 𝑔(𝑛) for some constant 𝑐 > 0 and all 𝑛 ≥ 𝑛0

 Ω-notation: asymptotic lower bound (best case)
𝑓(𝑛) = Ω(𝑔(𝑛)) means: 𝑓(𝑛) ≥ 𝑐 𝑔(𝑛) for some constant 𝑐 > 0 and all 𝑛 ≥ 𝑛0

 Θ-notation: asymptotic tight bound (average case)
𝑓(𝑛) = Θ(𝑔(𝑛)) means: 𝑐1 𝑔(𝑛) ≤ 𝑓(𝑛) ≤ 𝑐2 𝑔(𝑛) for some constants 𝑐1, 𝑐2 > 0 and all 𝑛 ≥ 𝑛0
Asymptotic Notation (cont.)
 Big 𝑂-notation (worst case)
• Intuitively: 𝑂(𝑔(𝑛)) = the set of functions with a smaller or same order of growth as 𝑔(𝑛)
Asymptotic Notation (cont.)
 Examples (𝑂-notation)
(Ex. 1) 2𝑛² = 𝑂(𝑛³): 2𝑛² ≤ 𝑐𝑛³ ⇒ 2 ≤ 𝑐𝑛 ⇒ 𝑐 = 1 and 𝑛0 = 2

(Ex. 2) 𝑛² = 𝑂(𝑛²): 𝑛² ≤ 𝑐𝑛² ⇒ 𝑐 ≥ 1 ⇒ 𝑐 = 1 and 𝑛0 = 1

(Ex. 3) 1000𝑛² + 1000𝑛 = 𝑂(𝑛²):
1000𝑛² + 1000𝑛 ≤ 1000𝑛² + 1000𝑛² = 2000𝑛² ⇒ 𝑐 = 2000 and 𝑛0 = 1

(Ex. 4) 𝑛 = 𝑂(𝑛²): 𝑛 ≤ 𝑐𝑛² ⇒ 𝑐𝑛 ≥ 1 ⇒ 𝑐 = 1 and 𝑛0 = 1
Asymptotic Notation (cont.)
 Ω-notation (best case)
• Intuitively: Ω(𝑔(𝑛)) = the set of functions with a larger or same order of growth as 𝑔(𝑛)
Asymptotic Notation (cont.)
 Examples (Ω-notation)
(Ex. 1) 5𝑛² = Ω(𝑛)
∃ 𝑐, 𝑛0 such that: 0 ≤ 𝑐𝑛 ≤ 5𝑛² ⇒ 𝑐 = 1 and 𝑛0 = 1

(Ex. 2) 100𝑛 + 5 ≠ Ω(𝑛²)
∄ 𝑐, 𝑛0 such that: 0 ≤ 𝑐𝑛² ≤ 100𝑛 + 5
since 100𝑛 + 5 ≤ 100𝑛 + 5𝑛 = 105𝑛 (for 𝑛 ≥ 1), we would need 𝑐𝑛² ≤ 105𝑛
⇒ 𝑛(𝑐𝑛 − 105) ≤ 0
Since 𝑛 is positive, 𝑐𝑛 − 105 ≤ 0 ⇒ 𝑛 ≤ 105/𝑐
⇒ contradiction: 𝑛 cannot stay below the constant 105/𝑐 for all 𝑛 ≥ 𝑛0
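Both Ω examples can be checked the same way as the 𝑂 examples. A minimal Python sketch (not from the slides; `bounded_below` is a hypothetical helper, and a finite range only illustrates the claim):

```python
# Spot-check f(n) >= c*g(n) for all n in [n0, n_max) -- finite-range check.
def bounded_below(f, g, c, n0, n_max=1000):
    return all(f(n) >= c * g(n) for n in range(n0, n_max))

# Ex. 1: 5n^2 = Omega(n) with c = 1, n0 = 1
print(bounded_below(lambda n: 5 * n**2, lambda n: n, c=1, n0=1))        # True

# Ex. 2: 100n + 5 != Omega(n^2) -- any fixed c fails once n > 105/c
print(bounded_below(lambda n: 100 * n + 5, lambda n: n**2, c=1, n0=1))  # False
```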
Asymptotic Notation (cont.)
 Θ-notation (average case)
• Intuitively: Θ(𝑔(𝑛)) = the set of functions with the same order of growth as 𝑔(𝑛)
Asymptotic Notation (cont.)
 Examples (Θ-notation)
(Ex. 1) 𝑛²/2 − 𝑛/2 = Θ(𝑛²)
½𝑛² − ½𝑛 ≤ ½𝑛² (𝑛 ≥ 0) ⇒ 𝑐2 = ½
½𝑛² − ½𝑛 ≥ ½𝑛² − ½𝑛 · ½𝑛 = ¼𝑛² (𝑛 ≥ 2) ⇒ 𝑐1 = ¼

(Ex. 2) 6𝑛 ≠ Θ(𝑛²)
𝑐1𝑛² ≤ 6𝑛 ≤ 𝑐2𝑛² ⇒ the left inequality only holds for: 𝑛 ≤ 6/𝑐1
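The two-sided Θ bound from Ex. 1 can also be spot-checked with the constants derived above. A minimal Python sketch (illustrative only; `theta_holds` is a hypothetical helper):

```python
# Check c1*g(n) <= f(n) <= c2*g(n) on [n0, n_max) -- finite-range check.
def theta_holds(f, g, c1, c2, n0, n_max=1000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

# Ex. 1: n^2/2 - n/2 = Theta(n^2) with c1 = 1/4, c2 = 1/2, n0 = 2
print(theta_holds(lambda n: n * n / 2 - n / 2, lambda n: n * n,
                  c1=0.25, c2=0.5, n0=2))   # True
```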
Properties of Asymptotic Notation
 Transitivity:
 𝑓(𝑛) = Θ(𝑔(𝑛)) and 𝑔(𝑛) = Θ(ℎ(𝑛)) ⇒ 𝑓(𝑛) = Θ(ℎ(𝑛))
 Same for 𝑂 and Ω
 Reflexivity:
 𝑓(𝑛) = Θ(𝑓(𝑛))
 Same for 𝑂 and Ω
 Symmetry:
 𝑓(𝑛) = Θ(𝑔(𝑛)) if and only if 𝑔(𝑛) = Θ(𝑓(𝑛))
 Transpose symmetry:
 𝑓(𝑛) = 𝑂(𝑔(𝑛)) if and only if 𝑔(𝑛) = Ω(𝑓(𝑛))
Logarithms
 In algorithm analysis we often use the notation "log 𝑛" without specifying
the base

Binary logarithm: lg 𝑛 = log2 𝑛
Natural logarithm: ln 𝑛 = loge 𝑛

logᵏ 𝑛 = (log 𝑛)ᵏ
log log 𝑛 = log(log 𝑛)
log 𝑥ʸ = 𝑦 log 𝑥
log 𝑥𝑦 = log 𝑥 + log 𝑦
log(𝑥/𝑦) = log 𝑥 − log 𝑦
log_a 𝑥 = log_a 𝑏 · log_b 𝑥
𝑎^(log_b 𝑥) = 𝑥^(log_b 𝑎)
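These identities can be verified for sample values with Python's standard `math` module (the values 8, 32, 2, 10 below are arbitrary choices; `math.isclose` handles floating-point rounding):

```python
import math

x, y, a, b = 8.0, 32.0, 2.0, 10.0
assert math.isclose(math.log(x * y), math.log(x) + math.log(y))       # log xy = log x + log y
assert math.isclose(math.log(x / y), math.log(x) - math.log(y))       # log x/y = log x - log y
assert math.isclose(math.log(x ** 3), 3 * math.log(x))                # log x^y = y log x
assert math.isclose(math.log(x, a), math.log(b, a) * math.log(x, b))  # change of base
assert math.isclose(a ** math.log(x, b), x ** math.log(a, b))         # a^(log_b x) = x^(log_b a)
print("all identities hold")
```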
Recurrences
 Recurrence: an equation that describes a function in terms of its
value on smaller inputs

 The expression
T(n) = c            if n = 1
T(n) = 2T(n/2) + cn if n > 1
is a recurrence
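The recurrence above can be evaluated directly by translating each case into a recursive function. A minimal Python sketch (not from the slides; `C = 1` is an arbitrary choice of the constant, and `n` is restricted to powers of 2 so that n/2 stays exact):

```python
from functools import lru_cache

C = 1  # the constant c from the recurrence (arbitrary choice)

@lru_cache(maxsize=None)
def T(n):
    # T(n) = c if n = 1; T(n) = 2*T(n/2) + c*n if n > 1
    if n <= 1:
        return C
    return 2 * T(n // 2) + C * n

# For powers of two, T(n) = n*lg(n) + n, which is Theta(n log n)
print([T(2**k) for k in range(4)])  # [1, 4, 12, 32]
```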
Recurrences Examples

s(n) = 0 if n = 0;  s(n) = c + s(n−1) if n > 0
s(n) = 0 if n = 0;  s(n) = n + s(n−1) if n > 0
T(n) = c if n = 1;  T(n) = 2T(n/2) + c if n > 1
T(n) = c if n = 1;  T(n) = aT(n/b) + cn if n > 1
Solving Recurrences
 Three methods for solving recurrences
 Substitution Method
 Recursion-tree Method
 Master Method
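For recurrences of the form T(n) = aT(n/b) + Θ(nᵏ), the last of these methods can be sketched in a few lines. The following Python snippet is an illustrative simplification (it handles only polynomial driving terms, ignores the regularity condition, and `master_method` is a hypothetical helper, not an implementation from the lecture):

```python
import math

def master_method(a, b, k):
    # Compare the critical exponent log_b(a) with the driving term's
    # exponent k and return the asymptotic bound for T(n) = aT(n/b) + n^k.
    crit = math.log(a, b)
    if math.isclose(crit, k):
        return f"Theta(n^{k} log n)"   # case 2: exponents match
    if k < crit:
        return f"Theta(n^{crit:g})"    # case 1: recursion dominates
    return f"Theta(n^{k})"             # case 3: driving term dominates

print(master_method(2, 2, 1))  # T(n) = 2T(n/2) + cn  ->  Theta(n^1 log n)
print(master_method(4, 2, 1))  # T(n) = 4T(n/2) + cn  ->  Theta(n^2)
```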
