Topic 4 - Greedy Method
Greedy Method
Optimization Problems
• An Optimization Problem is a problem that involves searching through a set of
configurations to find one that minimizes or maximizes an objective function
defined on these configurations
• Most of these problems have n inputs and require us to obtain a subset that satisfies some
constraints
• Any subset that satisfies those constraints is called a feasible solution
• We need to find a feasible solution that either maximizes or minimizes a given objective function
• A feasible solution that does this is called an optimal solution
• Minimization problem:
associate cost to every solution, find min-cost solution
• Maximization problem:
associate profit to every solution, find max-profit solution
• Moral: We are being greedy for local optimization in the hope that it will
lead to a globally optimal solution. As expected, it may not always work. But in
many situations, it really does!
The Greedy Method ...
• The greedy method solves a given Optimization Problem by going through a
sequence of feasible choices
• The sequence starts from a well-understood starting configuration, and then
iteratively makes the decision that seems best from all those that are
currently possible
• It makes the choice that looks best at the moment and adds it to the current
subsolution
• If the inclusion of the next input into the partially-constructed optimal subsolution
will result in an infeasible solution, then this input is not added to the partial
subsolution; otherwise, it is added
• It makes a locally-optimal choice at each step in the hope that these choices
will lead to a globally-optimal solution of the problem
What Makes a Greedy Algorithm?
• Feasible
• Has to satisfy the problem’s constraints
• Locally Optimal
• The greedy part
• Has to make the best local choice among all feasible choices available at that step
• If this local choice results in a global optimum then the problem has optimal
substructure
• Irrevocable
• Once a choice is made it can’t be un-done on subsequent steps of the algorithm
• Simple examples:
• Playing chess by making the best move without lookahead
• Giving fewest number of coins as change
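The coin-change example can be sketched in a few lines of Python. This is an illustrative sketch, not part of the original slides; note that the greedy choice only gives the fewest coins for canonical coin systems such as US coins.

```python
# Greedy coin change: repeatedly take the largest coin that still fits.
# Optimal for canonical systems like US coins, but it can fail for
# arbitrary denominations, e.g. coins {4, 3, 1} and amount 6.
def greedy_change(amount, coins=(25, 10, 5, 1)):
    """Return a list of coins summing to amount, chosen greedily.
    coins is assumed sorted in descending order."""
    result = []
    for c in coins:
        while amount >= c:
            result.append(c)
            amount -= c
    return result

print(greedy_change(63))          # [25, 25, 10, 1, 1, 1] -- 6 coins, optimal
print(greedy_change(6, (4, 3, 1)))  # [4, 1, 1] -- 3 coins, but [3, 3] is better
```

The second call shows the "it may not always work" moral: the locally best coin (4) forces two extra pennies.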
Designing A Greedy Algorithm
• When we have a choice to make, make the one that looks best right now
• The i-th item is worth (gives a profit of) pi dollars and weighs wi pounds
• Let xi be the fraction of item i that will be put into the knapsack
Fractional Knapsack Problem
The Problem: Given a knapsack with capacity M and n items to be put into the
knapsack, where the items have weights w1, w2, …, wn and profits p1, p2, …, pn.
The Goal: Find (x1, x2, …, xn), where 0 ≤ xi ≤ 1, that maximizes

    Σ (i = 1 to n) pi·xi    subject to    Σ (i = 1 to n) wi·xi ≤ M

For example, (x1, x2, x3) = (0, 2/3, 1) gives

    Σ pi·xi = 25·0 + 24·(2/3) + 15·1 = 31
Example: Fractional Knapsack Problem
• n = 3, M = 20, (w1, w2, w3) = (18, 15, 10), (p1, p2, p3) = (25, 24, 15)
• n = 4, M = 25, (w1, w2, w3, w4) = (5, 15, 10, 12), (p1, p2, p3, p4) = (25, 21, 15, 6)
Time Complexity
1. Calculate vi = pi / wi for i = 1, 2, …, n — O(n)
2. Sort the items by nonincreasing vi (all wi are reordered correspondingly) — O(n log n)
3. Let M' be the current weight limit (initially, M' = M and all xi = 0). In each
   iteration, choose item i from the head of the unselected list — O(n) iterations,
   O(1) each:
   If M' ≥ wi, set xi = 1 and M' = M' − wi.
   If M' < wi, set xi = M'/wi and the algorithm is finished.
Total: O(n log n)
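The three steps above can be sketched in Python (an illustrative sketch; the function name and return format are my own). On the first example instance from the slides (n = 3, M = 20), greedy-by-ratio achieves profit 31.5, improving on the profit-31 solution (0, 2/3, 1):

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy by profit/weight ratio; returns (total profit, fractions x_i)."""
    n = len(profits)
    # Steps 1-2: order item indices by nonincreasing ratio p_i / w_i -- O(n log n)
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n
    remaining = capacity
    total = 0.0
    # Step 3: fill the knapsack from the best-ratio item down -- O(n)
    for i in order:
        if remaining >= weights[i]:
            x[i] = 1.0                      # item fits entirely
            remaining -= weights[i]
            total += profits[i]
        else:
            x[i] = remaining / weights[i]   # take a fraction; knapsack is full
            total += profits[i] * x[i]
            break
    return total, x

# Slide example: n = 3, M = 20, w = (18, 15, 10), p = (25, 24, 15)
total, x = fractional_knapsack([25, 24, 15], [18, 15, 10], 20)
print(total, x)   # 31.5 [0.0, 1.0, 0.5]
```

Item 2 has the best ratio (24/15 = 1.6), then item 3 (1.5), then item 1 (≈1.39), so the greedy takes all of item 2, half of item 3, and none of item 1.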
Correctness?
???
Proof: Correctness of Strategy #3
• Proved by contradiction
• Let X be the solution found by greedy strategy #3
• Assume that X is not optimal
• Then there is an optimal solution Y whose profit is greater than the profit of X
• Consider an item j that is in X but not (fully) in Y
• Remove from Y items of total weight wj (taking fractions if necessary) and add item j to Y
• The capacity used remains the same
• Since strategy #3 picks items in nonincreasing profit/weight ratio, item j's ratio is
  at least that of every item removed, so the total value does not decrease
• One more item of X is now in Y
• Repeat the process until Y contains exactly the items selected in X
• The total value never decreases
• The capacity remains the same
• So X is optimal, too
Contradiction!
0-1 Knapsack Problem (xi can be 0 or 1)
Knapsack capacity: 50

i    1    2    3
pi   60   100  120
wi   10   20   30

Three feasible selections:
{1, 2}: 60 + 100 = 160
{1, 3}: 60 + 120 = 180
{2, 3}: 100 + 120 = 220 ← optimal

Can 0-1 Knapsack be solved by a greedy algorithm?
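A quick experiment (an illustrative helper, not from the slides) shows why the answer is no: the ratio-greedy rule that is optimal for the fractional problem picks items 1 and 2 here, for profit 160, while the optimum is 220 with items 2 and 3.

```python
# Greedy by profit/weight ratio applied to the 0-1 knapsack
# (no fractions allowed) on the slide's instance.
def greedy_01(profits, weights, capacity):
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    total, remaining, chosen = 0, capacity, []
    for i in order:
        if weights[i] <= remaining:   # take the whole item or skip it
            chosen.append(i + 1)      # record 1-based item number
            remaining -= weights[i]
            total += profits[i]
    return total, chosen

total, chosen = greedy_01([60, 100, 120], [10, 20, 30], 50)
print(total, chosen)   # 160 [1, 2] -- but the optimum is 220 with {2, 3}
```

Item 1 has the best ratio (6) but committing to it blocks the better combination, which is exactly why 0-1 knapsack needs dynamic programming rather than a greedy rule.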
Apply Greedy Algorithms for Solving
Job Sequencing with Deadlines
JOB SEQUENCING WITH DEADLINES
The problem is stated as below.
• There are n jobs to be processed on a machine.
• Each job i has a deadline di ≥ 0 and a profit pi ≥ 0.
• The profit pi is earned iff the job is completed by its deadline.
• A job is completed if it is processed on a machine
for one unit of time.
• Only one machine is available for processing jobs.
• Only one job is processed at a time on the machine.
JOB SEQUENCING WITH DEADLINES
(Contd..)
• A feasible solution is a subset of jobs J such that
each job is completed by its deadline.
• An optimal solution is a feasible solution with
maximum profit value.
Example : Let n = 4,
(p1,p2,p3,p4) = (100,10,15,27),
(d1,d2,d3,d4) = (2,1,2,1)
JOB SEQUENCING WITH DEADLINES
(Contd..)
Sr.No.  Feasible Solution  Processing Sequence  Profit Value
(i)     (1, 2)             (2, 1)               110
(ii)    (1, 3)             (1, 3) or (3, 1)     115
(iii)   (1, 4)             (4, 1)               127 ← the optimal one
(iv)    (2, 3)             (2, 3)               25
(v)     (3, 4)             (4, 3)               42
(vi)    (1)                (1)                  100
(vii)   (2)                (2)                  10
(viii)  (3)                (3)                  15
(ix)    (4)                (4)                  27
GREEDY ALGORITHM TO OBTAIN AN
OPTIMAL SOLUTION
• Consider the jobs in nonincreasing order of profits, subject to the constraint
that the resulting job sequence J is a feasible solution.
• In the example considered before, the nonincreasing profit
vector is
(p1, p4, p3, p2) = (100, 27, 15, 10) with deadlines (d1, d4, d3, d2) = (2, 1, 2, 1)
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
J = {1} is feasible
J = {1, 4} is feasible with processing sequence (4, 1)
J = {1, 3, 4} is not feasible (three unit-time jobs, but no deadline exceeds 2)
J = {1, 2, 4} is not feasible (jobs 2 and 4 both have deadline 1)
J = {1, 4} is optimal
Job sequencing with deadlines
Problem: n jobs, S = {1, 2, …, n}; each job i has a
deadline di ≥ 0 and a profit pi ≥ 0. We need one unit
of time to process each job and we can do at most
one job at a time. We earn the profit pi if job i
is completed by its deadline.

i   1   2   3   4   5
pi  20  15  10  5   1
di  2   2   1   3   3
Algorithm:
Step 1: Sort the jobs into nonincreasing order of pi. After
sorting, p1 ≥ p2 ≥ p3 ≥ … ≥ pn.
Step 2: Add the next job i to the solution set if i
can be completed by its deadline. Assign i to
time slot [r−1, r], where r is the largest
integer such that 1 ≤ r ≤ di and [r−1, r] is free.
Step 3: Stop if all jobs are examined. Otherwise,
go to Step 2.
e.g.
i pi di
1 20 2 assign to [1, 2]
2 15 2 assign to [0, 1]
3 10 1 reject
4 5 3 assign to [2, 3]
5 1 3 reject
Solution = {1, 2, 4}
Total Profit = 20 + 15 + 5 = 40
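The three steps can be sketched in Python (an illustrative sketch; the function name and the slot array are my own choices). On the example above it reproduces the solution {1, 2, 4} with profit 40:

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing: consider jobs in nonincreasing profit order,
    placing each in the latest free unit slot [r-1, r] with r <= its deadline."""
    n = len(profits)
    # Step 1: sort job indices by nonincreasing profit
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)
    max_d = max(deadlines)
    slot = [None] * max_d                # slot[r-1] holds the job in [r-1, r]
    # Step 2: try the latest free slot not past the deadline; reject if none
    for i in order:
        for r in range(min(deadlines[i], max_d), 0, -1):
            if slot[r - 1] is None:
                slot[r - 1] = i + 1      # store 1-based job number
                break
    selected = [j for j in slot if j is not None]
    profit = sum(profits[j - 1] for j in selected)
    return sorted(selected), profit

# Slide example: p = (20, 15, 10, 5, 1), d = (2, 2, 1, 3, 3)
print(job_sequencing([20, 15, 10, 5, 1], [2, 2, 1, 3, 3]))  # ([1, 2, 4], 40)
```

Placing each job as late as its deadline allows keeps the earlier slots open for jobs with tighter deadlines, which is why job 2 (deadline 2) still fits after job 1 takes [1, 2].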
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
Proof:
• By the definition of a feasible solution, if the jobs in J can be
processed in some order without violating any deadline, then J
is a feasible solution.
• So we only have to prove that if J is feasible, then processing the jobs
of J in nondecreasing order of deadlines is a possible order in which the
jobs may be processed.
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
• Theorem 2: The greedy method obtains an optimal solution to the
job sequencing problem.
• Proof: Let (pi, di), 1 ≤ i ≤ n, define any instance of the job sequencing
problem.
• Let I be the set of jobs selected by the greedy method.
• Let J be the set of jobs in an optimal solution.
• Let us assume I ≠ J.
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
• So, there exist jobs a and b such that a ∈ I, a ∉ J, and b ∈ J, b ∉ I.
• Let a be a highest-profit job such that a ∈ I, a ∉ J.
• It follows from the greedy method that pa ≥ pb for all jobs
b ∈ J, b ∉ I. (If pb > pa, then the greedy method would consider
job b before job a and include it in I.)
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
• Let SI and SJ be feasible schedules for the job sets I and J,
respectively.
• Let i be a job such that i ∈ I and i ∈ J
(i.e., i is a job that belongs both to the schedule generated by the
greedy method and to the optimal solution).
• Let i be scheduled from t to t+1 in SI and from t1 to t1+1 in SJ.
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
• If t < t1, we may interchange the job scheduled in [t1,
t1+1] in SI with i; if no job is scheduled in [t1, t1+1] in
SI, then i is simply moved to that interval.
• With this, i is scheduled at the same time in SI
and SJ.
• The resulting schedule is also feasible.
• If t1 < t, then a similar transformation may be made in
SJ.
• In this way, we can obtain schedules SI' and SJ' with
the property that all the jobs common to I and J are
scheduled at the same time.
GREEDY ALGORITHM TO OBTAIN AN OPTIMAL
SOLUTION (Contd..)
• Consider the interval [ta, ta+1] in SI' in which job a is scheduled.
• Let b be the job scheduled in SJ' in this interval.
• As a is the highest-profit job in I but not in J, pa ≥ pb.
• Scheduling job a from ta to ta+1 in SJ' and discarding job b gives us a
feasible schedule for the job set J' = J − {b} ∪ {a}. Clearly, J' has a profit
value no less than that of J and differs from I in one job fewer than
J does.
GREEDY ALGORITHM FOR JOB SEQUENCING
WITH DEADLINE
Procedure Greedy_Job (D, J, n)
// J is the set of jobs to be completed //
// by their deadlines //
J ← {1}
FOR i ← 2 to n DO
    IF all jobs in J ∪ {i} can be completed
       by their deadlines
    THEN J ← J ∪ {i}
    END IF
END FOR
END Greedy_Job

J may be represented by a one-dimensional array J(1:k) in which the deadlines
satisfy D(J(1)) ≤ D(J(2)) ≤ … ≤ D(J(k)). To test whether J ∪ {i} is feasible,
we insert i into J in deadline order and verify that D(J(r)) ≥ r for 1 ≤ r ≤ k+1.
Apply Greedy Algorithms for Solving
Optimal Code Design Problem
Optimal text encoding: Huffman Code
“bla□bla …”
0100110000010000010011000001 …
Fixed-length code:
e = 000 n = 001 v = 010 0 = 011 r = 100 d = 101 l = 110 □ = 111
Length of encoded text: 12 x 3 = 36 bits
ALGO: HUFFMAN(C)
1  n ← |C|
2  Q ← C
3  for i ← 1 to n − 1
4      do allocate a new node z
5         left[z] ← x ← EXTRACT-MIN(Q)
6         right[z] ← y ← EXTRACT-MIN(Q)
7         f[z] ← f[x] + f[y]
8         INSERT(Q, z)
9  return EXTRACT-MIN(Q)
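The pseudocode can be rendered as runnable Python with the heapq module playing the role of the priority queue Q (an illustrative sketch; trees are represented as nested tuples rather than left/right node fields, and a counter breaks ties so the heap never compares trees). On the frequencies {a: 45, b: 13, c: 12, d: 16, e: 9, f: 5} it produces an optimal prefix code of total weighted length 224:

```python
import heapq

def huffman(freqs):
    """Build Huffman codes for a dict {symbol: frequency} using a min-heap.
    Mirrors the pseudocode: extract the two lowest-frequency trees,
    merge them, and reinsert, n - 1 times."""
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freqs.items())]
    heapq.heapify(heap)                  # Q <- C
    counter = len(heap)
    while len(heap) > 1:                 # n - 1 merges
        fx, _, x = heapq.heappop(heap)   # x <- EXTRACT-MIN(Q)
        fy, _, y = heapq.heappop(heap)   # y <- EXTRACT-MIN(Q)
        heapq.heappush(heap, (fx + fy, counter, (x, y)))  # f[z] = f[x] + f[y]
        counter += 1
    root = heap[0][2]
    codes = {}
    def walk(node, prefix):              # read codes off the tree: left=0, right=1
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix or "0"
    walk(root, "")
    return codes

codes = huffman({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
print(codes)   # the most frequent symbol "a" gets a 1-bit code
```

Each EXTRACT-MIN and INSERT costs O(log n) on the heap, giving the O(n log n) bound discussed below.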
Why??
The Steps of Huffman's Algorithm
[Figure: step-by-step Huffman tree construction for characters c1 … c8 with
frequencies 4, 2, 1, 1, 1, 1, 1, 1; at each step the two lowest-frequency
nodes are merged into a new node (e.g., a node b of frequency 2 = 1 + 1),
with left edges labeled 0 and right edges labeled 1.]
Proof. Let TOPT be an optimal tree for C. If ci, ck are siblings in TOPT then
the lemma obviously holds, so assume this is not the case.
We will show how to modify TOPT into a tree T* in which ci and ck are siblings.
[Figure: in TOPT, ci sits at depth d1 while cs and cm are siblings under a
node v at depth d2; exchanging leaves yields T* with ci and ck as siblings
at depth d2.]
Running time:
• O(n²)?!
• O(n log n) if implemented smartly (use a heap)
• Sorting + O(n) if implemented even smarter (hint: 2 queues)
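The two-queue hint can be made concrete (an illustrative sketch, assuming the frequencies arrive already sorted ascending): leaves wait in one FIFO queue and merged nodes are appended to a second. Because merged frequencies are produced in nondecreasing order, the minimum is always at the head of one of the two queues, so each of the n − 1 merges costs O(1).

```python
from collections import deque

def huffman_two_queues(sorted_freqs):
    """O(n) Huffman tree construction from a list of (frequency, symbol)
    pairs sorted by ascending frequency. Returns the root as a
    (total_frequency, tree) pair, where trees are nested tuples."""
    q1 = deque(sorted_freqs)   # leaves, in ascending frequency order
    q2 = deque()               # merged internal nodes, produced in ascending order

    def pop_min():
        # The smallest remaining tree is at the head of q1 or q2.
        if not q2 or (q1 and q1[0][0] <= q2[0][0]):
            return q1.popleft()
        return q2.popleft()

    while len(q1) + len(q2) > 1:
        fx, x = pop_min()
        fy, y = pop_min()
        q2.append((fx + fy, (x, y)))   # merged frequencies never decrease
    return (q2 or q1)[0]

root = huffman_two_queues(
    [(5, "f"), (9, "e"), (12, "c"), (13, "b"), (16, "d"), (45, "a")])
print(root[0])   # 100, the total frequency at the root
```

With this trick, the overall cost is dominated by the initial sort, matching the "Sorting + O(n)" bound above.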