Presentation on
Time and Space Complexity,
Average and worst case analysis,
and Asymptotic Notations
Presented By – Mr. Mansab Mirza
Guided By – Mr. M.A. Danish Ch
Roadmap
• Algorithmic Complexity
• Time and space complexity
• Need for Complexity Analysis
• Average and worst case analysis
• Why worst case analysis?
• The importance of Asymptotics
• Asymptotic Notations - Θ, O, Ω, etc.
• Relations between Θ, O, Ω
• Comparison of functions
Algorithmic Complexity
Algorithmic complexity is a very important topic in computer science.
Knowing the complexity of algorithms allows you to answer questions
such as
• How long will a program run on an input?
• How much space will it take?
• Is the problem solvable?
These are important bases of comparison between different
algorithms. An understanding of algorithmic complexity provides
programmers with insight into the efficiency of their code. Complexity
is also important to several theoretical areas in computer science,
including algorithms, data structures, and complexity theory.
Time and Space Complexity
Time Complexity –
The time complexity of an algorithm is the amount of time it requires to execute.
It is measured in terms of the number of operations rather than computer time, because computer time depends on the hardware, processor, etc.
A general ordering of growth rates that we may consider:
O(c) < O(log n) < O(n) < O(n log n) < O(nᶜ) < O(cⁿ) < O(n!) < O(nⁿ), where c > 1 is some constant.
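A quick numeric sketch of this ordering (with c = 2 as an illustrative constant, a choice made here, not fixed by the text):

```python
import math

# Evaluate each growth rate at a sample input size.
def growth_values(n):
    return {
        "O(c)":       1,
        "O(log n)":   math.log2(n),
        "O(n)":       n,
        "O(n log n)": n * math.log2(n),
        "O(n^c)":     n ** 2,            # c = 2
        "O(c^n)":     2 ** n,            # c = 2
        "O(n!)":      math.factorial(n),
        "O(n^n)":     n ** n,
    }

# At n = 16 the values are already strictly increasing left to right.
for name, v in growth_values(16).items():
    print(f"{name:12s} {v:,.0f}")
```

For small n the ordering can tie or even reverse (e.g. 2⁴ = 4² at n = 4); the inequalities hold asymptotically, i.e. for all sufficiently large n.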
Big-O Notation   Examples of Algorithms
O(1)             Push, Pop, Enqueue (if there is a tail reference), Dequeue, accessing an array element
O(log n)         Binary search
O(n)             Linear search
O(n log n)       Heap sort, Quick sort (average case), Merge sort
O(n²)            Selection sort, Insertion sort, Bubble sort
O(n³)            Matrix multiplication (naive)
O(2ⁿ)            Towers of Hanoi
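The gap between O(n) and O(log n) in the table can be made concrete by counting comparisons in linear versus binary search (a minimal sketch; the comparison-counting return values are added here for illustration):

```python
def linear_search(a, target):
    """O(n) time: scan elements one by one. Returns (index, comparisons)."""
    for i, x in enumerate(a):
        if x == target:
            return i, i + 1
    return -1, len(a)

def binary_search(a, target):
    """O(log n) time on a sorted list: halve the search range each step.
    Returns (index, comparisons)."""
    lo, hi, comps = 0, len(a) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comps += 1
        if a[mid] == target:
            return mid, comps
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comps

a = list(range(1_000_000))
_, lin_comps = linear_search(a, 999_999)   # near-worst case for linear search
_, bin_comps = binary_search(a, 999_999)
print(lin_comps, bin_comps)  # about n comparisons vs. about log2(n)
```

On a million sorted elements, linear search makes on the order of 10⁶ comparisons in the worst case while binary search needs at most about 20.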
Space Complexity –
The space complexity of an algorithm is the amount of memory it needs to run to completion.
Space complexity can be defined as the amount of computer memory required during program execution, as a function of the input size.
A difference between space complexity and time complexity is that space can be reused, whereas time cannot.
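As a sketch of this distinction, both functions below compute the same result in O(n) time, but one allocates a list proportional to n while the other reuses a single accumulator, i.e. O(1) extra space (function names are illustrative):

```python
def sum_of_squares_linear_space(n):
    # Materializes all n squares first: O(n) extra space.
    squares = [i * i for i in range(1, n + 1)]
    return sum(squares)

def sum_of_squares_constant_space(n):
    # Accumulates into one variable, reusing the same memory
    # on every iteration: O(1) extra space.
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

print(sum_of_squares_linear_space(1000) == sum_of_squares_constant_space(1000))  # True
```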
Complexity : Why Bother?
• Estimation/Prediction :
When you write or run a program, you need to be able to predict its
requirements.
• Usual Requirements :
- execution time
- memory space
• Quantities to estimate :
- execution time -> time complexity
- memory space -> space complexity
• It is pointless to run a program that requires:
- 64TB of RAM on a desktop machine.
- 10,000 years to run
• You do not want to wait for an hour :
- for the result of your query on Google.
- when you are checking your bank account online.
- when you are opening a picture file in Photoshop.
⇒ It is important to write efficient algorithms.
Average and Worst case Analysis
Worst-case complexity:
The worst-case complexity of an algorithm is its complexity on the worst possible input of a given size, i.e., the input that maximizes the resource being measured.
Average complexity:
The average complexity of an algorithm is its complexity averaged over all possible inputs of a given size (assuming a uniform distribution over the inputs).
[Figure: running times from 1 ms to 5 ms for inputs A–G, marking the best case, the worst case, and the average case over all inputs.]
Why Worst Case Analysis?
Worst-case running time : the longest running time for any input of
size n. We usually concentrate on finding only the worst-case running
time, because of the following reasons:
• The worst-case running time of an algorithm gives an upper bound
on the running time for any input. Knowing it provides a guarantee
that the algorithm will never take any longer.
• For some algorithms, the worst case occurs fairly often. For
example, in searching a database for a particular piece of
information, the searching algorithm’s worst case will often occur
when the information is not present in the database.
• The “average case” is often roughly as bad as the worst case.
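The points above can be sketched with linear search: the worst case (key absent) costs n comparisons, while the average over present keys, assumed uniform, costs (n + 1) / 2 (the helper name is made up here for illustration):

```python
def linear_search_comparisons(a, target):
    """Count comparisons made by a linear scan for `target`."""
    for i, x in enumerate(a):
        if x == target:
            return i + 1      # found after i + 1 comparisons
    return len(a)             # worst case: key absent, all n compared

n = 1000
a = list(range(n))

worst = linear_search_comparisons(a, -1)                      # absent key
avg = sum(linear_search_comparisons(a, k) for k in a) / n     # uniform over keys
print(worst, avg)  # n vs. (n + 1) / 2
```

Both quantities are Θ(n), which illustrates the last bullet: here the average case is only a constant factor better than the worst case.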
The Importance of Asymptotics
• Asymptotic notation has many important benefits, which might not be
immediately obvious.
• An algorithm with an asymptotically slower running time (for example, one
that is O(n²)) is beaten in the long run by an algorithm with an asymptotically
faster running time (for example, one that is O(n log n)), even if the
constant factor for the faster algorithm is worse.
Running Time    Maximum Problem Size (n)
                1 second    1 minute    1 hour
400n            2,500       150,000     9,000,000
20n ⌈log n⌉     4,096       166,666     7,826,087
2n²             707         5,477       42,426
n⁴              31          88          244
2ⁿ              19          25          31
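The 1-second column of the table can be reproduced by brute force, under the assumption (implied by 400n → 2,500) that the machine executes 10⁶ operations per second; the other columns follow by scaling the budget by 60 and 3,600:

```python
import math

# Operation-count formulas from the table's first column.
costs = {
    "400n":            lambda n: 400 * n,
    "20n ceil(lg n)":  lambda n: 20 * n * math.ceil(math.log2(n)),
    "2n^2":            lambda n: 2 * n * n,
    "n^4":             lambda n: n ** 4,
    "2^n":             lambda n: 2 ** n,
}

def max_problem_size(cost, ops_budget):
    """Largest n whose operation count fits within the budget."""
    n = 1
    while cost(n + 1) <= ops_budget:
        n += 1
    return n

for name, cost in costs.items():
    print(name, max_problem_size(cost, 10 ** 6))
```

Running this yields 2,500 / 4,096 / 707 / 31 / 19, matching the table's 1-second column.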
Asymptotic Analysis
• Goal : to simplify the analysis of running time by
getting rid of “details” which may be affected by the
specific implementation and hardware
– like “rounding” : 1,000,001 ≈ 1,000,000
– 3n² ≈ n².
• Capturing the essence : how the running time of an
algorithm increases with the size of the input in the
limit.
– Asymptotically more efficient algorithms are best for all
but small inputs.
Asymptotic Notations
• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers.
– Ex: f(n) = Θ(n²).
– Describes how f(n) grows in comparison to n².
• Each defines a set of functions; in practice they are used to
compare the growth of two functions.
• Asymptotic notation is useful because it allows us
to concentrate on the main factor determining a
function’s growth.
Θ-notation
• Θ-notation bounds a function to
within constant factors.
• Definition:
For a given function g(n), we denote by
Θ(g(n)) the set of functions
Θ(g(n)) = { f(n) : there exist positive
constants c1, c2, and n0 such that 0 ≤ c1
g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }.
• Explanation:
We write f(n) = Θ(g(n)) if there exist
positive constants n0, c1, and c2 such
that at and to the right of n0, the value of
f(n) always lies between c1 g(n) and
c2 g(n) inclusive.
• We say that g(n) is an asymptotically
tight bound for f(n).
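As a numeric sanity check of the definition (a sampled check, not a proof), take f(n) = 3n² + 2n and g(n) = n²; one possible choice of witness constants is c1 = 3, c2 = 5, n0 = 1:

```python
def f(n):
    return 3 * n * n + 2 * n   # f(n) = 3n^2 + 2n

def g(n):
    return n * n               # g(n) = n^2

# Candidate witnesses for f(n) = Theta(g(n)).
c1, c2, n0 = 3, 5, 1

# Check 0 <= c1*g(n) <= f(n) <= c2*g(n) over a sampled range n >= n0.
assert all(0 <= c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
print("3n^2 + 2n = Theta(n^2): witnesses hold on the sampled range")
```

The upper inequality 3n² + 2n ≤ 5n² reduces to 2n ≤ 2n², which holds exactly when n ≥ 1, so n0 = 1 suffices for all n, not just the sampled ones.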
O-notation
• We use O-notation to give an upper
bound on a function, to within a
constant factor.
• Definition:
For a given function g(n), we denote by
O(g(n)) the set of functions
O(g(n)) = { f(n) : there exist positive
constants c and n0 such that 0 ≤ f(n) ≤
c g(n) for all n ≥ n0 }.
• Explanation:
We write f(n) = O(g(n)) if there are
positive constants n0 and c such that at
and to the right of n0, the value of f(n)
always lies on or below c g(n).
Ω-notation
• Ω-notation provides an asymptotic
lower bound on a function.
• Definition:
For a given function g(n), we denote by
Ω(g(n)) the set of functions
Ω(g(n)) = { f(n) : there exist positive
constants c and n0 such that 0 ≤ c g(n) ≤
f(n) for all n ≥ n0 }.
• Explanation:
We write f(n) = Ω(g(n)) if there are
positive constants n0 and c such that at
and to the right of n0, the value of f(n)
always lies on or above c g(n).
Relations Between Θ, O, Ω
o-notation
• The upper bound provided by O-notation may or may not be
asymptotically tight. We use o-notation to denote an upper
bound that is not asymptotically tight.
• We formally define o(g(n)) as the set
o(g(n)) = { f(n) : for any positive constant c > 0, there exists
a constant n0 > 0 such that 0 ≤ f(n) < c g(n) for all n ≥ n0 }.
• In o-notation, the function f(n) becomes insignificant
relative to g(n) as n approaches infinity; that is,
lim (n→∞) f(n) / g(n) = 0.
ω-notation
• We use ω-notation to denote a lower bound that is
not asymptotically tight.
• Formal definition:
ω(g(n)) = { f(n) : for any positive constant c > 0,
there exists a constant n0 > 0 such that 0 ≤ c g(n) <
f(n) for all n ≥ n0 }.
• The relation f(n) = ω(g(n)) implies that
lim (n→∞) f(n) / g(n) = ∞.
Relations Between Θ, O, Ω
• Theorem : For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff
f(n) = O(g(n)) and f(n) = Ω(g(n)).
• i.e. Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
• In practice, asymptotically tight bounds are
obtained from asymptotic upper and lower
bounds.
Comparison of Functions
Comparing the growth of f and g is analogous to comparing two numbers a and b:
f(n) = O(g(n))  ≈  a ≤ b
f(n) = Ω(g(n))  ≈  a ≥ b
f(n) = Θ(g(n))  ≈  a = b
f(n) = o(g(n))  ≈  a < b
f(n) = ω(g(n))  ≈  a > b
Comp 122
Properties
Transitivity
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n))
Reflexivity
f(n) = Θ(f(n))
f(n) = O(f(n))
f(n) = Ω(f(n))
Properties
Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
Complementarity
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))
References
• Introduction to Algorithms, 3rd edition
• http://www.cs.cornell.edu/courses/cs312/2004fa/lectures/lecture16.htm – Cornell University
• http://en.wikipedia.org/wiki/Analysis_of_algorithms – Wikipedia
• http://www.slideshare.net/ANKKATIYAR/time-and-space-complexity – SlideShare
• NPTEL videos
Thank You