Analysis of Algorithms and Time and Space Complexity
ROAD MAP
• What is an algorithm ?
• Analysis of An Algorithm
• Asymptotic Notation
• Big Oh Notation
• Omega Notation
• Theta Notation
• Rules about Asymptotic Notation
What is An Algorithm?
• Definition: A finite, clearly specified sequence of
instructions to be followed to solve a problem.
Types of Algorithms
• Simple recursive algorithms. Ex: searching an element in a list
• Backtracking algorithms. Ex: depth-first recursive search in a tree
• Divide and conquer algorithms. Ex: quick sort and merge sort
• Dynamic programming algorithms. Ex: generation of a Fibonacci series
(see the sketch after this list)
• Greedy algorithms. Ex: counting currency
• Branch and bound algorithms. Ex: travelling salesman (visiting each
city once and minimizing the total distance travelled)
• Brute force algorithms. Ex: finding the best path for a travelling
salesman
• Randomized algorithms. Ex: using a random number to choose a pivot in
quick sort
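As a small illustration of how much the chosen technique matters, the sketch
below computes Fibonacci numbers first by plain recursion and then with
dynamic programming; the function names and the limit of 46 are assumptions
made for this example.

#include <stdio.h>

/* Plain recursive Fibonacci: recomputes the same subproblems over and
   over, so its running time grows exponentially with n. */
long fib_recursive(int n) {
    if (n < 2) return n;
    return fib_recursive(n - 1) + fib_recursive(n - 2);
}

/* Dynamic programming version: each Fibonacci number is computed once
   and stored, so the running time grows only linearly with n. */
long fib_dynamic(int n) {
    if (n < 2) return n;
    long memo[47];                 /* fib(46) still fits in a 32-bit long */
    memo[0] = 0;
    memo[1] = 1;
    for (int i = 2; i <= n; i++)
        memo[i] = memo[i - 1] + memo[i - 2];
    return memo[n];
}

int main(void) {
    printf("%ld %ld\n", fib_recursive(20), fib_dynamic(20));   /* 6765 6765 */
    return 0;
}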
Algorithm vs. Program
• A computer program is a sequence of instructions that complies with
the rules of a specific programming language
• Algorithms are general and have to be translated into a specific
programming language
• An algorithm may be expressed in a number of ways,
including:
• natural language: usually verbose and ambiguous
• flow charts: avoid most (if not all) issues of ambiguity;
difficult to modify w/o specialized tools; largely
standardized
• pseudo-code: also avoids most issues of ambiguity;
vaguely resembles common elements of programming
languages; no particular agreement on syntax
• programming languages: tend to require expressing
low-level details that are not necessary for a high-level
understanding (see the example below)
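For instance, a "find the largest value" algorithm reads in pseudo-code
as "set max to the first element; for each remaining element, if it is
larger than max, replace max". The hypothetical C version below has to
commit to the low-level details as well:

#include <stdio.h>

/* Pseudo-code: set max to the first element; for each remaining
   element, if it is larger than max, replace max. */
int find_max(const int a[], int n) {
    int max = a[0];                 /* detail: element type and indexing */
    for (int i = 1; i < n; i++)     /* detail: explicit loop bounds */
        if (a[i] > max)
            max = a[i];
    return max;
}

int main(void) {
    int data[] = {3, 9, 4, 7};
    printf("%d\n", find_max(data, 4));   /* prints 9 */
    return 0;
}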
Common Elements of Algorithms
• acquire data (input): some means of reading values from an external source; most algorithms
require data values to define the specific problem (e.g., coefficients of a polynomial)
• computation: some means of performing arithmetic computations, comparisons, testing logical
conditions, and so forth
• selection: some means of choosing among two or more possible courses of action, based upon
initial data, user input and/or computed results
• iteration: some means of repeatedly executing a collection of instructions, for a fixed number of
times or until some logical condition holds
• report results (output): some means of reporting computed results to the user, or requesting
additional data from the user
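A minimal sketch showing all five elements in one program; the prompt text
and the choice of summing non-negative values are assumptions made only
for illustration.

#include <stdio.h>

int main(void) {
    int n, value, sum = 0, count = 0;

    /* input: acquire data from an external source */
    printf("How many values? ");
    if (scanf("%d", &n) != 1) return 1;

    /* iteration: repeat a block of instructions n times */
    for (int i = 0; i < n; i++) {
        if (scanf("%d", &value) != 1) return 1;

        /* selection: choose a course of action based on the data */
        if (value >= 0) {
            /* computation: arithmetic on the values read */
            sum += value;
            count++;
        }
    }

    /* output: report the computed result to the user */
    printf("Sum of the %d non-negative values: %d\n", count, sum);
    return 0;
}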
The Process of Algorithm
Development
• Design
• divide&conquer, greedy, dynamic programming
• Validation
• check whether it is correct
• Analysis
• determine the properties of algorithm
• Implementation
• Testing
• check whether it works for all possible cases
Properties of an Algorithm
Input
• zero or more
Output
• one or more
Effectiveness
• simple
• can be carried out by pen and paper
Definiteness
• clear
• meaning is unique
Correctness
• give the right answer for all possible cases
Finiteness
• stop in reasonable time
Your comment……..
int i;
for (i = 0; i <= 32767; i++)
{
    printf("%d", i);
}

int i;
for (i = 0; i <= printf("hello"); i++)
{
    printf("hello");
}
Program Performance and
Asymptotic Notations
• Program performance is the amount of computer memory and time
needed to run a program.
• How is it determined?
• Analytically
• performance analysis
• Experimentally
• performance measurement
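As a rough illustration of the experimental approach, running time can be
measured with the standard C clock() function; the work being timed here
(an arbitrary summing loop) is only a placeholder.

#include <stdio.h>
#include <time.h>

/* Work to be measured: sum the integers 0..n-1. */
long long sum_to(long n) {
    long long s = 0;
    for (long i = 0; i < n; i++)
        s += i;
    return s;
}

int main(void) {
    clock_t start = clock();               /* processor time before the work */
    long long result = sum_to(100000000L);
    clock_t end = clock();                 /* processor time after the work */

    double seconds = (double)(end - start) / CLOCKS_PER_SEC;
    printf("result = %lld, measured time = %.3f sec\n", result, seconds);
    return 0;
}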
Analysis of Algorithm
• Analysis investigates
• What are the properties of the algorithm?
• in terms of time and space
• How good is the algorithm ?
• according to the properties
• How does it compare with others?
• not always exact
• Is it the best that can be done?
• difficult !
Mathematical
Background
• Assume the functions for the running times of two
algorithms are found!
• For input size N
• Running time of Algorithm A = TA(N) = 1000 N
• Running time of Algorithm B = TB(N) = N^2
• Which one is faster ?
If the unit of running time of algorithms A and B is µsec:

N        TA          TB
10       10^-2 sec   10^-4 sec
100      10^-1 sec   10^-2 sec
1000     1 sec       1 sec
10000    10 sec      100 sec
100000   100 sec     10000 sec
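The table can be reproduced with a short sketch, assuming as above that one
unit of TA(N) or TB(N) corresponds to 1 µsec:

#include <stdio.h>

int main(void) {
    long sizes[] = {10, 100, 1000, 10000, 100000};

    printf("%10s %15s %15s\n", "N", "TA (sec)", "TB (sec)");
    for (int i = 0; i < 5; i++) {
        double n = (double)sizes[i];
        double ta = 1000.0 * n / 1e6;   /* TA(N) = 1000 N usec, in seconds */
        double tb = n * n / 1e6;        /* TB(N) = N^2 usec, in seconds */
        printf("%10ld %15g %15g\n", sizes[i], ta, tb);
    }
    return 0;
}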
So which algorithm is faster ?
[Plot of TA(N) and TB(N) against input size N: the two curves cross at N = 1000;
for N < 1000, TA(N) > TB(N), otherwise TB(N) > TA(N).]
Compare their relative growth ?
Mathematical
Background
• Is it always possible to have definite results?
NO !
• The running times of algorithms can change
because of the platform, the properties of the
computer, etc.
• We use asymptotic notations
• to compare relative growth rates
• to compare the algorithms themselves, independently of the platform
Criteria for Measurement
• Space
• amount of memory program occupies
• usually measured in bytes, KB or MB
• Time
• execution time
• usually measured by the number of operations executed
Space Complexity
• Space complexity is defined as the amount of memory a
program needs to run to completion.
• Why is this of concern?
• We could be running on a multi-user system where
programs are allocated a specific amount of space.
• We may not have sufficient memory on our computer.
• There may be multiple solutions, each having different
space requirements.
• The space complexity may define an upper bound on the
data that the program can handle.
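To make the idea concrete, here is a hedged sketch of two ways to sum an
array: the iterative version needs only a constant amount of extra memory,
while the recursive version uses one stack frame per element, so its space
requirement grows with n.

#include <stdio.h>

/* Iterative sum: O(1) extra space (one accumulator and one index). */
long sum_iterative(const int a[], int n) {
    long total = 0;
    for (int i = 0; i < n; i++)
        total += a[i];
    return total;
}

/* Recursive sum: O(n) extra space, one stack frame per remaining element. */
long sum_recursive(const int a[], int n) {
    if (n == 0) return 0;
    return a[n - 1] + sum_recursive(a, n - 1);
}

int main(void) {
    int data[] = {1, 2, 3, 4};
    printf("%ld %ld\n", sum_iterative(data, 4), sum_recursive(data, 4));
    return 0;
}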
Time Complexity
• Time complexity is the amount of computer time a program
needs to run.
• Why do we care about time complexity?
• Some computers require upper limits for program
execution times.
• Some programs require a real-time response.
• If there are many solutions to a problem, typically we’d like
to choose the quickest.
Time Complexity
• How do we measure?
• Count a particular operation (operation counts)
• Count the number of steps (step counts)
• Asymptotic complexity
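A minimal sketch of an operation count (the function and parameter names are
hypothetical): counting how many element comparisons a linear search makes.
In the worst case the target is absent and the count equals n.

#include <stdio.h>

/* Linear search that also reports how many comparisons it made. */
int linear_search(const int a[], int n, int target, int *comparisons) {
    *comparisons = 0;
    for (int i = 0; i < n; i++) {
        (*comparisons)++;          /* one element comparison per iteration */
        if (a[i] == target)
            return i;              /* found: best case is 1 comparison */
    }
    return -1;                     /* not found: worst case is n comparisons */
}

int main(void) {
    int data[] = {7, 3, 9, 1, 5};
    int count;
    int pos = linear_search(data, 5, 42, &count);
    printf("position = %d, comparisons = %d\n", pos, count);   /* -1 and 5 */
    return 0;
}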
Asymptotic Notation
• Describes the behavior of the time or space complexity for
large instance characteristic
• Big Oh (O) notation provides an upper bound for the function f
• Omega (Ω) notation provides a lower-bound
• Theta (Θ) notation is used when an algorithm can be bounded
both from above and below by the same function
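In standard textbook form, these bounds can be written as:
• f(N) = O(g(N)) if there exist constants c > 0 and N0 such that f(N) <= c * g(N) for all N >= N0
• f(N) = Ω(g(N)) if there exist constants c > 0 and N0 such that f(N) >= c * g(N) for all N >= N0
• f(N) = Θ(g(N)) if f(N) = O(g(N)) and f(N) = Ω(g(N))
For example, TA(N) = 1000 N above is O(N), and TB(N) = N^2 is Θ(N^2).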
Big Oh Notation (O)
Growth-rate Functions
• O(1) – constant time; the time is independent of n, e.g. array look-up
• O(log n) – logarithmic time; usually the log is base 2, e.g. binary search
• O(n) – linear time, e.g. linear search
• O(n * log n) – e.g. efficient sorting algorithms
• O(n^2) – quadratic time, e.g. selection sort
• O(n^k) – polynomial time (where k is some constant)
• O(2^n) – exponential time, very slow!
Order of growth of some common functions
O(1) < O(log n) < O(n) < O(n * log n) < O(n^2) < O(n^3) < O(2^n)
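For instance, the O(log n) entry above can be seen in a standard binary
search, which halves the remaining range on every comparison (a minimal
sketch, assuming the array is already sorted):

#include <stdio.h>

/* Binary search in a sorted array: each iteration halves the search
   range, so at most about log2(n) + 1 iterations are needed. */
int binary_search(const int a[], int n, int target) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;   /* avoids overflow of (low + high) */
        if (a[mid] == target)
            return mid;
        else if (a[mid] < target)
            low = mid + 1;
        else
            high = mid - 1;
    }
    return -1;   /* not found */
}

int main(void) {
    int sorted[] = {1, 3, 5, 7, 9, 11};
    printf("%d\n", binary_search(sorted, 6, 9));   /* prints 4 */
    return 0;
}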
Thank You!