Lecture 2
Fundamentals of the Analysis of
Algorithm Efficiency
ANALYSIS OF ALGORITHMS
Analysis of Algorithms
What is the goal?
▪ Analyze time requirements - predict how
running time increases as the size of the
problem increases:
time = f(size)
Why is it useful?
▪ To compare different algorithms.
Defining “problem size”
▪ Typically, it is straightforward to
identify the size of a problem, e.g.:
▪ size of array
▪ size of stack, queue, list etc.
▪ number of vertices and edges in a graph
▪ But not always …
Time Analysis
• Provides upper and lower bounds of running time.
• Different types of analysis:
- Worst case
- Best case
- Average case
Worst Case Analysis
▪ Provides an upper bound on
running time.
▪ An absolute guarantee that the
algorithm will not run longer,
no matter what the input is.
Best Case Analysis
▪ Provides a lower bound on running
time.
▪ Input is the one for which the algorithm
runs the fastest.
Average Case Analysis
▪ Provides an estimate of “average”
running time.
▪ Assumes that the input is random.
▪ Useful when best/worst cases do not
happen very often (i.e., few input
cases lead to best/worst cases).
Example: Searching
▪ Problem of searching an ordered list.
▪ Given a list L of n elements that
are sorted into a definite order
(e.g., numeric, alphabetical),
▪ And given a particular element x,
▪ Determine whether x appears in
the list, and if so, return its index
(i.e., position) in the list.
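The slides state the problem only; as a concrete reference point, here is a minimal C sketch of binary search on a sorted int array (the function name and signature are illustrative, not from the lecture):

    /* Return the index of x in the sorted array a[0..n-1], or -1 if absent. */
    int binary_search(const int a[], int n, int x) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;  /* midpoint without (lo+hi) overflow */
            if (a[mid] == x)
                return mid;                /* found: return its position */
            else if (a[mid] < x)
                lo = mid + 1;              /* x can only lie in the right half */
            else
                hi = mid - 1;              /* x can only lie in the left half */
        }
        return -1;                         /* x does not appear in the list */
    }

Each iteration halves the remaining range, so the worst case performs about log₂ n comparisons; the best case finds x at the first midpoint.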
How do we analyze an algorithm?
▪ Need to define objective measures.
(1) Compare execution times?
Not good: times are specific to a particular
machine.
(2) Count the number of statements?
Not good: number of statements varies with
programming language and programming
style.
Example
Algorithm 1          Algorithm 2
arr[0] = 0;          for (i = 0; i < N; i++)
arr[1] = 0;              arr[i] = 0;
arr[2] = 0;
...
arr[N-1] = 0;
How do we analyze an algorithm?
(cont.)
(3) Express running time t as a function of
problem size n (i.e., t=f(n) ).
- Given two algorithms having running
times f(n) and g(n), determine which
function grows faster.
- Such an analysis is independent of
machine time, programming style, etc.
How do we find f(n)?
(1) Associate a "cost" with each statement.
(2) Find total number of times each statement is executed.
(3) Add up the costs.
Algorithm 1        Cost     Algorithm 2                 Cost
arr[0] = 0;        c1       for (i = 0; i < N; i++)     c2
arr[1] = 0;        c1           arr[i] = 0;             c1
arr[2] = 0;        c1
...
arr[N-1] = 0;      c1
-------------               --------------------------
c1 + c1 + ... + c1          (N+1) × c2 + N × c1
= c1 × N                    = (c2 + c1) × N + c2
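As a quick sanity check, with illustrative unit costs c1 = c2 = 1 (an assumption, not from the slides):

    Algorithm 1: 1 + 1 + ... + 1   = N steps
    Algorithm 2: (N+1) × 1 + N × 1 = 2N + 1 steps

Both totals grow linearly with N; the two algorithms differ only in their constant factors.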
Comparing algorithms
▪ Given two algorithms having running times f(n) and g(n), how do we decide which one is faster?
▪ Compare “rates of growth” of f(n) and g(n)
The idea
▪ Write down an algorithm
▪ Using Pseudocode
▪ In terms of a set of primitive operations
▪ Count the # of steps
▪ In terms of primitive operations
▪ Considering worst case input
▪ Bound or “estimate” the running time
▪ Ignore constant factors
▪ Bound the fundamental running time (see the counting sketch below)
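As an illustration of this recipe, a small C function with per-line counts of primitive operations (the function and the exact counts are an assumed example, not from the slides):

    /* Find the largest value in a[0..n-1]. */
    int array_max(const int a[], int n) {
        int max = a[0];               /* 2 ops: one indexing, one assignment  */
        for (int i = 1; i < n; i++) { /* 1 init; n tests; n-1 increments      */
            if (a[i] > max)           /* n-1 times: one indexing, one compare */
                max = a[i];           /* worst case n-1 times: index + assign */
        }
        return max;                   /* 1 op                                 */
    }

In the worst case the total is c·n primitive operations for some small constant c, so the running time is bounded by a linear function of n.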
Which growth rate is best?
▪ T(n) = 1000n + n² or T(n) = 2n + n³
GROWTH RATES
Growth Rates
▪ Growth rates of functions:
▪ Linear ≈ n
▪ Quadratic ≈ n²
▪ Cubic ≈ n³
▪ In a log-log chart, the slope of the line corresponds to the growth rate of the function
[Chart: T(n) vs. n on log-log axes; linear, quadratic, and cubic growth appear as straight lines of increasing slope]
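For concreteness, a few computed values (not from the chart): at n = 1000, a linear function is about 10³, a quadratic about 10⁶, and a cubic about 10⁹, so each step up in growth rate costs a factor of 1000 here.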
Understanding Rate of Growth (cont’d)
▪ The low-order terms of a function are relatively insignificant for large n:
n⁴ + 100n² + 10n + 50
Approximation: n⁴
▪ The highest-order term determines the rate of growth!
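A quick computed check (not from the slides): at n = 100, n⁴ = 10⁸ while 100n² + 10n + 50 ≈ 10⁶, so the low-order terms contribute only about 1% of the total, and their share shrinks further as n grows.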
Example
▪ Suppose you are designing a website to process
user data (e.g., financial records).
▪ Suppose program A takes fA(n) = 30n+8 microseconds to process any n records, while program B takes fB(n) = n²+1 microseconds to process the n records.
▪ Which program would you choose, knowing you’ll want to support millions of users?
▪ Compare rates of growth: 30n+8 ~ n and n²+1 ~ n², so program A scales better for large n.
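To see where B actually falls behind, a tiny C check (illustrative, not part of the slides) that searches for the crossover point:

    #include <stdio.h>

    /* Find the smallest n where fB(n) = n*n + 1 exceeds fA(n) = 30n + 8. */
    int main(void) {
        for (long n = 1; ; n++) {
            if (n * n + 1 > 30 * n + 8) {
                printf("fB overtakes fA at n = %ld\n", n);  /* prints n = 31 */
                return 0;
            }
        }
    }

Beyond n = 31, program B is slower on every input size, and the gap widens quadratically.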
Visualizing Orders of Growth
▪ On a graph, as you go to the right, a faster growing function eventually becomes larger...
[Graph: value of function vs. increasing n; fB(n) = n²+1 eventually rises above fA(n) = 30n+8]
Rate of Growth ≡ Asymptotic Analysis
▪ Using rate of growth as a measure to
compare different functions implies
comparing them asymptotically (i.e., as n
→∞)
▪ If f(x) grows faster than g(x), then f(x)
eventually becomes larger than g(x)
in the limit (i.e., for all large enough
values of x).
ASYMPTOTIC NOTATION
Asymptotic Notation
▪ O notation: asymptotic “less than”:
f(n) = O(g(n)) implies: f(n) “≤” c·g(n) in the limit* (c is a constant)
(used in worst-case analysis)
Definition: f(n) is O(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≤ c·g(n) for n ≥ n0
Asymptotic Notation (Big-Omega)
▪ Ω notation: asymptotic “greater than”:
f(n) = Ω(g(n)) implies: f(n) “≥” c·g(n) in the limit* (c is a constant)
(used in best-case analysis)
◼ Definition: f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0
◼ An asymptotic lower bound
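For instance (an example in the style of the Big-Oh examples later in the lecture, not from the slides): 3n + 5 is Ω(n), since 3n + 5 ≥ 3·n for all n ≥ 1, so c = 3 and n0 = 1 witness the definition.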
Asymptotic Notation (Big-Theta)
▪ Θ notation: asymptotic “equality”:
f(n) = Θ(g(n)) implies: f(n) “=” c·g(n) in the limit* (c is a constant)
(provides a tight bound on running time)
(best and worst cases are the same)
Definition: f(n) is Θ(g(n)) if there are constants c′ > 0 and c′′ > 0 and an integer constant n0 ≥ 1 such that c′·g(n) ≤ f(n) ≤ c′′·g(n) for n ≥ n0
Here we say that g(n) is an asymptotically tight bound
for f(n).
Prove n²/2 + lg n = Θ(n²)
▪ Proof. To prove this claim, we must determine positive constants c1, c2, and n0 such that
c1·n² ≤ n²/2 + lg n ≤ c2·n²
▪ c1 ≤ 1/2 + (lg n)/n² ≤ c2 (divide through by n²)
▪ Pick c1 = 1/4, c2 = 3/4, and n0 = 2
▪ For n0 = 2: 1/4 ≤ 1/2 + (lg 2)/4 = 3/4 ≤ 3/4, TRUE
▪ When n > 2, the term 1/2 + (lg n)/n² decreases toward 1/2 but never drops below it, so it stays within [c1, c2]; therefore n²/2 + lg n = Θ(n²).
Big-O Notation - Examples
fA(n) = 30n+8 is O(n)
fB(n) = n²+1 is O(n²)
10n³ + 2n² is O(n³)
n³ − n² is O(n³)
1273 is O(1)
More Big-Oh Examples
◼ 7n−2
7n−2 is O(n)
need c > 0 and n0 ≥ 1 such that 7n−2 ≤ c·n for n ≥ n0
this is true for c = 7 and n0 = 1
◼ 3n³ + 20n² + 5
3n³ + 20n² + 5 is O(n³)
need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0
this is true for c = 4 and n0 = 21
◼ 3 log n + 5
3 log n + 5 is O(log n)
need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c·log n for n ≥ n0
this is true for c = 8 and n0 = 2
Big-O Notation - Examples
fA(n) = 30n+8 is O(n) or O(n²)
fB(n) = n²+1 is O(n²) or O(n⁴)
10n³ + 2n² is O(n³) or O(n⁴) or O(n⁵)
n³ − n² is O(n³)
1273 is O(1) or O(n)
But it is important to use as “tight” a bound as possible!
Common orders of magnitude
O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
Algorithm speed vs function growth
▪ An O(n²) algorithm will be slower than an O(n) algorithm (for large n).
▪ But an O(n²) function will grow faster than an O(n) function.
[Graph: value of function vs. increasing n; fB(n) = n²+1 grows above fA(n) = 30n+8]
Examples
i = 0;
while (i < N) {
    X = X + Y;            // O(1)
    result = mystery(X);  // O(N), just an example...
    i++;                  // O(1)
}
▪ The body of the while loop: O(N)
▪ Loop is executed: N times
N × O(N) = O(N²)
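The slides leave mystery undefined; one hypothetical body consistent with its stated O(N) cost is a simple linear loop (purely illustrative):

    /* Hypothetical stand-in for mystery(): any body doing N iterations
       of constant work matches the O(N) annotation above.
       N is the problem size, assumed visible as in the slide's code. */
    long mystery(long X) {
        long sum = 0;
        for (int k = 0; k < N; k++)   /* N iterations of O(1) work => O(N) */
            sum += X % (k + 1);       /* k+1 avoids division by zero */
        return sum;
    }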
Examples (cont.’d)
if (i < j)
    for (i = 0; i < N; i++)
        X = X + i;        // O(N)
else
    X = 0;                // O(1)

Max( O(N), O(1) ) = O(N)
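One more example in the same style (an illustrative sketch, not from the slides): nested loops multiply, so an O(N) inner loop executed N times gives O(N²):

    for (i = 0; i < N; i++)          // outer loop: N iterations
        for (j = 0; j < N; j++)      // inner loop: N iterations each
            sum += a[i] * a[j];      // O(1) body

    N × O(N) = O(N²)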
Questions?
That’s all, folks!