CS214-lec-3-4 Complexity
1. Set
A set is a collection of distinguishable members or elements.
2. Logarithm
• Definition: the logarithm of n to the base b, written log_b n, is the power x such that b^x = n (b > 1, n > 0).
3. Summation
4. Recursion
• Examples (see the code sketch below):
• 1. Compute n!: f(n) = n * f(n-1)
• 2. Fibonacci numbers: f(n) = f(n-1) + f(n-2)
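A minimal C++ sketch of these two recursions (added here for illustration, not from the slides; the base cases f(0) = 1 for the factorial and f(0) = 0, f(1) = 1 for Fibonacci are the usual assumptions):

#include <cstdio>

// Factorial via the recurrence f(n) = n * f(n-1).
long factorial(int n) {
    if (n <= 0) return 1;                        // assumed base case
    return n * factorial(n - 1);                 // recursive step
}

// Fibonacci via the recurrence f(n) = f(n-1) + f(n-2).
long fibonacci(int n) {
    if (n <= 1) return n;                        // assumed base cases f(0)=0, f(1)=1
    return fibonacci(n - 1) + fibonacci(n - 2);  // recursive step
}

int main() {
    std::printf("5! = %ld, fib(10) = %ld\n", factorial(5), fibonacci(10));
    return 0;
}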
5. Mathematical Proof
• Analysis (of Approach #1: measuring the actual execution time):
A very expensive approach.
It depends on the machine architecture, the current technology, the programming language used, the programmer's skill, etc.
• All of the above-mentioned reasons lead to one conclusion: this approach is not practical!
Estimation of Time Complexity
Approach #2
• Select the most fundamental operation performed by the algorithm, then count the number of times this operation is executed to solve a problem instance of size n.
Not all operations are equal: for example, a multiplication takes more CPU time than an addition.
Nevertheless, if additions occur much more frequently than multiplications, the total time spent executing the additions adds up and dominates the total time spent executing the multiplications, so "fundamental" means "dominant"!
Approach #2
• In this approach, we use the number of times the fundamental operation is executed as a measure of time complexity. It is not measured in seconds (or any other time unit).
• Example:
int Sum(int A[], int N) {
    int s = 0;
    for (int i = 0; i < N; i++)
        s = s + A[i];
    return s;
}
The complexity function of the algorithm is f(N) = 5N + 3.
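As a rough illustration (not from the slides), here is one plausible accounting that yields f(N) = 5N + 3, assuming each assignment, comparison, increment, addition, and array access counts as one fundamental operation; the exact total depends on which operations one chooses to count:

#include <cstdio>

// Sum() rewritten with an explicit operation counter (illustrative sketch).
int Sum(int A[], int N, long &ops) {
    int s = 0;        ops += 1;   // assignment s = 0
    int i = 0;        ops += 1;   // assignment i = 0
    while (true) {
        ops += 1;                 // comparison i < N (executed N + 1 times)
        if (!(i < N)) break;
        s = s + A[i]; ops += 3;   // array access + addition + assignment (N times)
        i++;          ops += 1;   // increment (N times)
    }
    return s;                     // total: 2 + (N + 1) + 3N + N = 5N + 3
}

int main() {
    int A[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
    long ops = 0;
    Sum(A, 10, ops);
    std::printf("N = 10: %ld operations counted (5*10 + 3 = 53)\n", ops);
    return 0;
}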
Approach #2
• Analysis:
People may choose different fundamental operations for the same algorithm, so you may get more than one time complexity function for the same algorithm!
Again, this approach depends on the architecture and the technology used: if, for example, we design a machine that executes the * operation faster than the + operation, our analysis will not be the same!
• Does it make any difference if someone tells you that the time complexity of an algorithm A is T(n) = 3n + 2 and somebody else insists that it is T(n) = 2n?
• Do you have to know which fundamental operation the analysis is based on?
Experiment #1
T(n) = 3n^2 + 8n + 10

 n     3n^2    8n    10    T(n)    C(3n^2)   C(8n)    C(10)
 1        3     8    10      21     14.29%   38.10%   47.62%
 5       75    40    10     126     59.52%   31.75%    8.73%
10      300    80    10     392     76.53%   20.41%    3.06%
15      675   120    10     808     83.54%   14.85%    1.61%
20     1200   160    10    1374     87.34%   11.64%    1.02%
25     1875   200    10    2090     89.71%    9.57%    0.72%
30     2700   240    10    2956     91.34%    8.12%    0.54%
35     3675   280    10    3972     92.52%    7.05%    0.43%
40     4800   320    10    5138     93.42%    6.23%    0.35%
45     6075   360    10    6454     94.13%    5.58%    0.29%
50     7500   400    10    7920     94.70%    5.05%    0.25%

(The accompanying chart plots C(3n^2), C(8n), and C(10) against the problem size n: the contribution of the 3n^2 term grows toward 100% while the other terms fade.)
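A small sketch (not part of the slides) that recomputes the contributions directly from the polynomial; because it evaluates 3n^2 + 8n + 10 exactly, its figures may differ marginally from the tabulated ones, but it shows the same trend:

#include <cstdio>

// Prints how much of T(n) = 3n^2 + 8n + 10 comes from each term as n grows.
int main() {
    for (int n = 1; n <= 50; n += (n < 5 ? 4 : 5)) {   // n = 1, 5, 10, ..., 50
        double a = 3.0 * n * n, b = 8.0 * n, c = 10.0;
        double t = a + b + c;
        std::printf("n=%2d  T(n)=%6.0f  3n^2: %6.2f%%  8n: %6.2f%%  10: %6.2f%%\n",
                    n, t, 100 * a / t, 100 * b / t, 100 * c / t);
    }
    return 0;
}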
• Proof (that f(n) = 3n + 2 is O(n)):
We need to find two real numbers n0 > 0 and c > 0 such that the inequality 0 ≤ f(n) ≤ cn is fulfilled for all n ≥ n0.
Take n0 = 1 and c = 5: for n ≥ 1 we have 3n + 2 ≤ 3n + 2n = 5n, so 0 ≤ 3n + 2 ≤ 5n.
Since the inequality is fulfilled with n0 = 1 and c = 5, f(n) ∈ O(n).
(The accompanying plot shows the time complexity curves 3n + 2 and 5n against the problem size n, with 5n above 3n + 2 from n = 1 onward.)
Example 2
• Show that f(n) = 3n^2 + 20 is O(n^2).
We need to find two real numbers n0 > 0 and c > 0 such that the inequality 0 ≤ 3n^2 + 20 ≤ cn^2 is fulfilled for all n ≥ n0.
Let n0 = 5 and c = 4: for n ≥ 5 we have n^2 ≥ 20, so 3n^2 + 20 ≤ 3n^2 + n^2 = 4n^2, and therefore 0 ≤ 3n^2 + 20 ≤ 4n^2.
Hence 3n^2 + 20 ∈ O(n^2).
(The accompanying plot shows the time complexity curves 3n^2 + 20 and 4n^2 against the problem size n, with 4n^2 above 3n^2 + 20 from about n = 5 onward.)
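As a quick sanity check (an addition, not from the slides), a few lines of code can confirm both inequalities numerically over a range of n:

#include <cstdio>

// Checks the two bounds numerically: 3n + 2 <= 5n for n >= 1,
// and 3n^2 + 20 <= 4n^2 for n >= 5.
int main() {
    bool ok1 = true, ok2 = true;
    for (long long n = 1; n <= 1000000; n++) {
        if (3 * n + 2 > 5 * n) ok1 = false;                    // Example 1 bound
        if (n >= 5 && 3 * n * n + 20 > 4 * n * n) ok2 = false; // Example 2 bound
    }
    std::printf("3n+2 <= 5n for all tested n >= 1: %s\n", ok1 ? "yes" : "no");
    std::printf("3n^2+20 <= 4n^2 for all tested n >= 5: %s\n", ok2 ? "yes" : "no");
    return 0;
}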
What is the time complexity of multiplying two
arrays of size n?
A. O(n^2)
B. O(n)
C. O(log n)
D. O(n log n)
Example 10
What is the time complexity of:
sum = 0;
for (k = 1; k <= n; k *= 2)    // Do log n times
    for (j = 1; j <= n; j++)   // Do n times
        sum++;
A. O(n^2)
B. O(n)
C. O(log n)
D. O(n log n)
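A short sketch (added here, not from the slides) that counts the inner increments for a few values of n and compares them against n * (log2(n) + 1):

#include <cstdio>
#include <cmath>

// Counts how many times sum++ executes in the nested loop from Example 10.
int main() {
    const long ns[] = {16, 1024, 65536};
    for (long n : ns) {
        long sum = 0;
        for (long k = 1; k <= n; k *= 2)    // about log2(n) + 1 iterations
            for (long j = 1; j <= n; j++)   // n iterations each
                sum++;
        std::printf("n=%6ld  increments=%9ld  n*(log2(n)+1)=%9.0f\n",
                    n, sum, n * (std::log2((double)n) + 1.0));
    }
    return 0;
}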
7 functions used in analysis of algorithms
1. The exponential function
f(n) = 2^n
We compare the growth of an algorithm's running time, as the input size grows, against the growth of these known functions.
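As an illustration (the choice of reference functions here is an assumption, not the slide's own list), the sketch below tabulates several commonly used growth functions so their relative growth can be compared:

#include <cstdio>
#include <cmath>

// Tabulates a few common reference functions for several input sizes.
int main() {
    const int ns[] = {8, 16, 32, 64};
    std::printf("%6s %10s %12s %12s %20s\n", "n", "log2(n)", "n*log2(n)", "n^2", "2^n");
    for (int n : ns) {
        double lg = std::log2((double)n);
        std::printf("%6d %10.0f %12.0f %12.0f %20.0f\n",
                    n, lg, n * lg, (double)n * n, std::pow(2.0, n));
    }
    return 0;
}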
Example 11: Comparing Algorithm Efficiency

Algorithm A:
sum = 0
for i = 1 to n
    sum = sum + i

Algorithm B:
sum = 0
for i = 1 to n
    for j = n to i
        sum = sum + 1

Algorithm C:
sum = n * (n + 1) / 2
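A runnable C++ sketch (not from the slides) of the three algorithms, assuming they are all meant to compute 1 + 2 + ... + n, and interpreting "for j = n to i" as counting down from n to i; A performs about n additions, B about n^2/2, and C a constant amount of work:

#include <cstdio>

// Three ways to compute 1 + 2 + ... + n with very different costs.
int main() {
    long n = 1000;

    long sumA = 0;                      // Algorithm A: one addition per i      -> O(n)
    for (long i = 1; i <= n; i++)
        sumA = sumA + i;

    long sumB = 0;                      // Algorithm B: inner loop runs n-i+1 times -> O(n^2)
    for (long i = 1; i <= n; i++)
        for (long j = n; j >= i; j--)
            sumB = sumB + 1;

    long sumC = n * (n + 1) / 2;        // Algorithm C: closed form             -> O(1)

    std::printf("A=%ld  B=%ld  C=%ld\n", sumA, sumB, sumC);
    return 0;
}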
Big-Omega
• The function g(n) is Ω(f(n)) iff there exist a real positive constant c > 0 and a positive integer n0 such that g(n) ≥ cf(n) for all n ≥ n0.
Big Omega is just the opposite of Big Oh.
It generalises the concept of "lower bound" (≥) in the same way as Big Oh generalises the concept of "upper bound" (≤).
If f(n) is O(g(n)) then g(n) is Ω(f(n)).
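As a worked illustration (added here, not on the slide): take g(n) = 3n + 2 and f(n) = n. With c = 3 and n0 = 1,
g(n) = 3n + 2 ≥ 3n = cf(n) for all n ≥ 1,
so 3n + 2 is Ω(n).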
Big-Theta
• The function g(n) is Θ(f(n)) iff there exist two real positive constants c1 > 0 and c2 > 0 and a positive integer n0 such that:
c1·f(n) ≤ g(n) ≤ c2·f(n) for all n ≥ n0.
Whenever two functions, f and g, are of the same order, i.e. g(n) is Θ(f(n)), they are each Big-Oh of the other:
g(n) is O(f(n)) AND f(n) is O(g(n)).
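Continuing the added illustration: 3n + 2 is Θ(n), since with c1 = 3, c2 = 5, and n0 = 1 we have 3n ≤ 3n + 2 ≤ 5n for all n ≥ 1 (the upper bound is exactly the one established in the proof above).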
Ω and Θ Notations
• f(n) is Ω(g(n)) iff g(n) is O(f(n))
To Do
• Read Chapter 2 in the book.