Algorithms
Lecture 2: Asymptotic Analysis
Asymptotic Complexity
Running time of an algorithm as a function of input size n, for large n.
Expressed using only the highest-order term in the expression for the exact running time.
◼ Instead of the exact running time, say Θ(n²).
Tuesday, October 8, 2024
Running Time
Most algorithms transform input objects into output objects.
The running time typically grows with the input size.
Average case time is often difficult to determine.
We focus on the worst case running time.
◼ Easier to analyze
◼ Crucial to applications such as games, finance and robotics
[Chart: best-case, average-case, and worst-case running time vs. input size]
Experimental Studies
Write a program implementing the algorithm
Run the program with inputs of varying size and composition
Use a method like System.currentTimeMillis() to get an accurate measure of the actual running time
Plot the results
[Chart: measured running time (ms) vs. input size]
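A minimal Java sketch of such an experiment. The array-summing workload, the class name, and the size range are illustrative assumptions, not part of the slide:

```java
// Measure wall-clock running time of a simple workload for growing input sizes,
// printing (input size, elapsed ms) pairs that can then be plotted.
public class TimingExperiment {
    // Workload to be timed: sum an array of n longs.
    static long workload(long[] a) {
        long sum = 0;
        for (long x : a) sum += x;
        return sum;
    }

    public static void main(String[] args) {
        for (int n = 100_000; n <= 1_600_000; n *= 2) {
            long[] a = new long[n];
            long start = System.currentTimeMillis();   // method named on the slide
            workload(a);
            long elapsed = System.currentTimeMillis() - start;
            System.out.println(n + "\t" + elapsed + " ms");  // plot these pairs
        }
    }
}
```

Millisecond resolution is coarse for small inputs; repeating each measurement and averaging gives a steadier curve.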
Limitations of Experiments
It is necessary to implement the
algorithm, which may be difficult
Results may not be indicative of the
running time on other inputs not included
in the experiment.
In order to compare two algorithms, the
same hardware and software
environments must be used
Theoretical Analysis
Uses a high-level description of the
algorithm instead of an implementation
Characterizes running time as a
function of the input size, n.
Takes into account all possible inputs
Allows us to evaluate the speed of an
algorithm independent of the
hardware/software environment
Pseudocode
High-level description of an algorithm
More structured than English prose
Less detailed than a program
Preferred notation for describing algorithms
Hides program design issues

Example: find max element of an array
Algorithm arrayMax(A, n)
  Input array A of n integers
  Output maximum element of A
  currentMax ← A[0]
  for i ← 1 to n − 1 do
    if A[i] > currentMax then
      currentMax ← A[i]
  return currentMax
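One possible Java rendering of the arrayMax pseudocode (the class name is an illustrative choice):

```java
// Direct Java translation of the arrayMax pseudocode.
public class ArrayMax {
    static int arrayMax(int[] a) {
        int currentMax = a[0];              // currentMax ← A[0]
        for (int i = 1; i < a.length; i++)  // for i ← 1 to n − 1 do
            if (a[i] > currentMax)          // if A[i] > currentMax then
                currentMax = a[i];          //   currentMax ← A[i]
        return currentMax;                  // return currentMax
    }
}
```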
Pseudocode Details
Control flow
◼ if … then … [else …]
◼ while … do …
◼ repeat … until …
◼ for … do …
◼ Indentation replaces braces
Method declaration
Algorithm method (arg [, arg…])
  Input …
  Output …
Method call
var.method (arg [, arg…])
Return value
return expression
Expressions
← Assignment (like = in Java)
= Equality testing (like == in Java)
n² Superscripts and other mathematical formatting allowed
Primitive Operations
Basic computations performed by an algorithm
Identifiable in pseudocode
Largely independent from the programming language
Exact definition not important (we will see why later)
Assumed to take a constant amount of time in the RAM model
Examples:
◼ Evaluating an expression
◼ Assigning a value to a variable
◼ Indexing into an array
◼ Calling a method
◼ Returning from a method
Counting Primitive Operations
By inspecting the pseudocode, we can determine the maximum number of primitive operations executed by an algorithm, as a function of the input size

Algorithm arrayMax(A, n)            # operations
  currentMax ← A[0]                 2
  for i ← 1 to n − 1 do             2n
    if A[i] > currentMax then       2(n − 1)
      currentMax ← A[i]             2(n − 1)
    { increment counter i }         2(n − 1)
  return currentMax                 1
                           Total    8n − 2
Estimating Running Time
Algorithm arrayMax executes 8n − 2 primitive operations in the worst case. Define:
a = Time taken by the fastest primitive operation
b = Time taken by the slowest primitive operation
Let T(n) be the worst-case time of arrayMax. Then
  a(8n − 2) ≤ T(n) ≤ b(8n − 2)
Hence, the running time T(n) is bounded by two linear functions
Growth Rate of Running Time
Changing the hardware/ software
environment
◼ Affects T(n) by a constant factor, but
◼ Does not alter the growth rate of T(n)
The linear growth rate of the running
time T(n) is an intrinsic property of
algorithm arrayMax
Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n₀ such that
  f(n) ≤ c·g(n) for n ≥ n₀
[Chart: the functions 2n + 10, 3n, and n vs. n on a log-log plot]
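A tiny numeric check of the definition for the functions shown in the chart: f(n) = 2n + 10 is O(n), witnessed by c = 3 and n₀ = 10 (the class and method names are illustrative assumptions):

```java
// Check that f(n) = 2n + 10 satisfies f(n) <= c*g(n) for g(n) = n,
// for every n from n0 up to some sample bound.
public class BigOhCheck {
    static boolean witnessHolds(int c, int n0, int upTo) {
        for (int n = n0; n <= upTo; n++)
            if (2 * n + 10 > c * n)   // f(n) > c*g(n) would falsify the witness
                return false;
        return true;
    }
}
```

A finite scan is only a sanity check, not a proof; the proof is the algebra 2n + 10 ≤ 3n ⟺ n ≥ 10.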
Big-Oh Example
Example: the function n² is not O(n)
◼ n² ≤ cn
◼ n ≤ c
◼ The above inequality cannot be satisfied, since c must be a constant
[Chart: the functions n², 100n, 10n, and n vs. n on a log-log plot]
Big-Oh and Growth Rate
The big-Oh notation gives an upper bound on the
growth rate of a function
The statement “f(n) is O(g(n))” means that the growth
rate of f(n) is no more than the growth rate of g(n)
We can use the big-Oh notation to rank functions
according to their growth rate
                     f(n) is O(g(n))    g(n) is O(f(n))
g(n) grows more      Yes                No
f(n) grows more      No                 Yes
Same growth          Yes                Yes
Some Rules
Transitivity
  f(n) = O(g(n)) and g(n) = O(h(n)) => f(n) = O(h(n))
Addition
  f(n) + g(n) = O(max{ f(n), g(n) })
Polynomials
  a₀ + a₁n + … + a_d nᵈ = O(nᵈ)
Hierarchy of functions
  n + log n = O(n); 2ⁿ + n³ = O(2ⁿ)
More Rules
1. Consecutive Statements
◼ The maximum statement is the one counted,
e.g. a fragment with a single for-loop followed by a double for-loop is O(n²)
If Block #1 takes time t1 and Block #2 takes time t2, together they take t1 + t2 = O(max(t1, t2))
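A sketch of the rule for consecutive blocks (the counting workload and class name are illustrative assumptions):

```java
// Consecutive blocks: an O(n) loop followed by an O(n^2) double loop.
// Total work is n + n^2 = O(max(n, n^2)) = O(n^2).
public class ConsecutiveBlocks {
    static long run(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)      // Block #1: t1 = O(n)
            count++;
        for (int i = 0; i < n; i++)      // Block #2: t2 = O(n^2)
            for (int j = 0; j < n; j++)
                count++;
        return count;                    // exactly n + n^2 operations counted
    }
}
```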
Rules for Analysis
2. If/Else
if cond then
    S1
else
    S2
If Block #1 (S1) takes time t1 and Block #2 (S2) takes time t2, the running time is at most max(t1, t2) plus the cost of evaluating cond
Rules for Analysis
3. For Loops
The running time of a for-loop is at most the running time of the statements inside the for-loop times the number of iterations
for (i = sum = 0; i < n; i++)
    sum += a[i];
The for loop iterates n times and executes 2 assignment statements each iteration
==> asymptotic complexity of O(n)
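The same fragment in Java (the class name and sample array are illustrative assumptions):

```java
// A constant amount of work per iteration, repeated n times => O(n).
public class LoopSum {
    static int sum(int[] a) {
        int s = 0;
        for (int i = 0; i < a.length; i++)  // n iterations
            s += a[i];                      // O(1) body
        return s;
    }
}
```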
Rules for Analysis
4. Nested For-Loops
Analyze inside-out. The total running time is the running time of the statement multiplied by the product of the sizes of all the for-loops
e.g. for (i = 0; i < n; i++)
         for (j = 0, sum = a[0]; j <= i; j++)
             sum += a[j];
         printf("sum for subarray 0 through %d is %d\n", i, sum);
The inner loop runs at most n times for each of the n outer iterations, so the fragment is O(n²)
Asymptotic Algorithm Analysis
The asymptotic analysis of an algorithm determines
the running time in big-Oh notation
To perform the asymptotic analysis
◼ We find the worst-case number of primitive operations
executed as a function of the input size
◼ We express this function with big-Oh notation
Example:
◼ We determine that algorithm arrayMax executes at most
8n − 2 primitive operations
◼ We say that algorithm arrayMax “runs in O(n) time”
Since constant factors and lower-order terms are
eventually dropped anyhow, we can disregard them
when counting primitive operations
Prefix Averages (Linear)
The following algorithm computes prefix averages in linear time by keeping a running sum

Algorithm prefixAverages2(X, n)
  Input array X of n integers
  Output array A of prefix averages of X    # operations
  A ← new array of n integers               n
  s ← 0                                     1
  for i ← 0 to n − 1 do                     n
    s ← s + X[i]                            n
    A[i] ← s / (i + 1)                      n
  return A                                  1
Algorithm prefixAverages2 runs in O(n) time
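One possible Java rendering of prefixAverages2 (the class name and the use of double for the averages are illustrative choices):

```java
// Java version of prefixAverages2: one pass with a running sum s => O(n).
public class PrefixAverages {
    static double[] prefixAverages2(int[] x) {
        int n = x.length;
        double[] a = new double[n];    // A ← new array of n elements
        double s = 0;                  // s ← 0
        for (int i = 0; i < n; i++) {  // for i ← 0 to n − 1 do
            s = s + x[i];              //   s ← s + X[i]
            a[i] = s / (i + 1);        //   A[i] ← s / (i + 1)
        }
        return a;
    }
}
```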
Algorithm
int count_0(int N)
{
    sum = 0                     O(1)
    for i = 1 to N {            O(N)
        for j = 1 to N {        O(N²)
            if i <= j then      O(N²)
                sum++           O(N²)
        }
    }
    return sum                  O(1)
}
The running time is O(N²)
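A runnable Java version of count_0 (the class name is an illustrative choice). The doubly nested loop always performs N·N iterations, so the running time is O(N²); the value returned counts the pairs (i, j) with 1 ≤ i ≤ j ≤ N, which is N(N + 1)/2:

```java
// Counts pairs (i, j) with 1 <= i <= j <= N via an O(N^2) double loop.
public class CountPairs {
    static int count0(int n) {
        int sum = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= n; j++)
                if (i <= j)
                    sum++;
        return sum;   // equals n*(n + 1)/2
    }
}
```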
Math you need to Review
Basic probability
Logarithms and Exponents
properties of logarithms:
  log_b(xy) = log_b(x) + log_b(y)
  log_b(x/y) = log_b(x) − log_b(y)
  log_b(x^a) = a·log_b(x)
  log_b(a) = log_x(a) / log_x(b)
properties of exponentials:
  a^(b+c) = a^b · a^c
  a^(bc) = (a^b)^c
  a^b / a^c = a^(b−c)
  b = a^(log_a b)
  b^c = a^(c·log_a b)
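A quick numeric check of the change-of-base rule log_b(a) = log_x(a) / log_x(b), taking x = e so that Java's natural log applies (the class and method names are illustrative assumptions):

```java
// Compute log base b of a via the change-of-base rule with natural logs.
public class LogRules {
    static double logBase(double b, double a) {
        return Math.log(a) / Math.log(b);   // log_b(a) = ln(a) / ln(b)
    }
}
```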
Exercise