UNIT – 2 (Module-2)
THE GREEDY METHOD
1. The General Method
2. Knapsack Problem
3. Job Sequencing with Deadlines
4. Minimum-Cost Spanning Trees
5. Prim's Algorithm
6. Kruskal’s Algorithm
7. Single Source Shortest Paths.
The method:
• Applicable to optimization problems ONLY
• Constructs a solution through a sequence of steps
• Each step expands the partially constructed solution, until a complete solution
to the problem is reached.
On each step, the choice made must be
• Feasible: it has to satisfy the problem's constraints
• Locally optimal: it has to be the best local choice among all feasible choices
available on that step
• Irrevocable: Once made, it cannot be changed on subsequent steps of the
algorithm
NOTE:
• Greedy method works best when applied to problems with the greedy-choice
property
• A globally-optimal solution can always be found by a series of local
improvements from a starting configuration.
• Optimal solutions:
Change making
Minimum Spanning Tree (MST)
Single-source shortest paths
Huffman codes
• Approximations:
Traveling Salesman Problem (TSP)
Fractional Knapsack problem
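As a first illustration of the method, change making can be solved greedily. The sketch below is illustrative (not from the source); the function name is mine, and greedy change making is only guaranteed optimal for canonical coin systems such as 1, 5, 10, 25 — not for arbitrary denominations.

```python
def make_change(amount, denominations):
    """Greedy change making: repeatedly take the largest coin that still fits.
    Optimal for canonical coin systems; may fail for arbitrary ones."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)       # locally optimal, feasible, irrevocable choice
            amount -= d
    return coins
```

For example, `make_change(63, [1, 5, 10, 25])` returns `[25, 25, 10, 1, 1, 1]`.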
2.2 Knapsack problem
2.2.1 One wants to pack n items in a luggage
2.2.1.1 The ith item is worth vi dollars and weighs wi pounds
2.2.1.2 Maximize the value but cannot exceed W pounds
2.2.1.3 vi , wi, W are integers
2.2.2 0-1 knapsack each item is taken or not taken
2.2.3 Fractional knapsack fractions of items can be taken
2.2.4 Both exhibit the optimal-substructure property
2.2.4.1 0-1: If item j is removed from an optimal packing, the remaining packing is an
optimal packing with weight at most W-wj
2.2.4.2 Fractional: If w pounds of item j are removed from an optimal packing, the
remaining packing is an optimal packing of weight at most W - w that can be taken
from the other n - 1 items plus wj - w pounds of item j
Greedy Algorithm for Fractional Knapsack problem
2.2.5 Fractional knapsack can be solvable by the greedy strategy
2.2.5.1 Compute the value per pound vi/wi for each item
2.2.5.2 Obeying a greedy strategy, take as much as possible of the item with the greatest
value per pound.
2.2.5.3 If the supply of that item is exhausted and there is still more room, take as
much as possible of the item with the next greatest value per pound, and so forth until
there is no more room
2.2.5.4 Runs in O(n lg n) time (we need to sort the items by value per pound)
0-1 knapsack is harder
2.2.6 0-1 knapsack cannot be solved by the greedy strategy
2.2.6.1 Greedy may be unable to fill the knapsack to capacity, and the empty space lowers
the effective value per pound of the packing
2.2.6.2 We must compare the solution to the sub-problem in which the item is included
with the solution to the sub-problem in which the item is excluded before we can make
the choice
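The greedy strategy for the fractional knapsack described above can be sketched in Python. This is an illustrative implementation (the function name and list-based interface are mine, not from the source); the O(n lg n) cost comes from the sort by value per pound.

```python
def fractional_knapsack(values, weights, capacity):
    """Greedy fractional knapsack: take items in non-increasing order of
    value per pound; take a fraction of the last item if needed."""
    # Sort item (value, weight) pairs by value per pound, greatest first.
    items = sorted(zip(values, weights),
                   key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for v, w in items:
        if capacity <= 0:
            break
        take = min(w, capacity)       # whole item, or the fraction that fits
        total += v * (take / w)
        capacity -= take
    return total
```

For instance, with values (60, 100, 120), weights (10, 20, 30) and W = 50, the greedy choice takes all of the first two items and 20/30 of the third, for a total value of 240.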
2.3.9 Consider the jobs in non-increasing order of profits, subject to the constraint that the
resulting job sequence J is a feasible solution.
2.3.10 In the example considered before, the non-increasing profit vector is
(p1 p4 p3 p2) = (100 27 15 10) with deadlines (d1 d4 d3 d2) = (2 1 2 1)
J = { 1} is a feasible one
J = { 1, 4} is a feasible one with processing sequence ( 4,1)
J = { 1, 3, 4} is not feasible
J = { 1, 2, 4} is not feasible
J = { 1, 4} is optimal
2.3.13 So, we have only to prove that if J is a feasible one, then Σ represents a possible order in
which the jobs may be processed.
2.3.14 Suppose J is a feasible solution. Then there exists Σ1 = (r1, r2, …, rk)
such that drj ≥ j for 1 ≤ j ≤ k,
i.e. dr1 ≥ 1, dr2 ≥ 2, …, drk ≥ k,
each job requiring one unit of time.
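The greedy procedure above can be sketched in Python. This is an illustrative implementation (names are mine): jobs are 0-indexed, and each job is placed in the latest free slot at or before its deadline, which is the standard way to realize the feasible processing sequence.

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines: consider jobs in non-increasing
    profit order; schedule each in the latest free unit-time slot at or
    before its deadline, skipping jobs that no longer fit."""
    n = len(profits)
    jobs = sorted(range(n), key=lambda i: profits[i], reverse=True)
    max_d = max(deadlines)
    slot = [None] * max_d             # slot[t] holds the job run in time t+1
    for i in jobs:
        # try the latest slot allowed by job i's deadline, then earlier ones
        for t in range(min(deadlines[i], max_d) - 1, -1, -1):
            if slot[t] is None:
                slot[t] = i
                break
    scheduled = [j for j in slot if j is not None]
    return scheduled, sum(profits[j] for j in scheduled)
```

On the source's example (profits 100, 10, 15, 27 and deadlines 2, 1, 2, 1 for jobs 1..4, here 0-indexed as 0..3), this schedules jobs 4 and 1 in the order (4, 1) for a total profit of 127, matching the optimal J = {1, 4}.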
Definition:
MST of a weighted, connected graph G is defined as: A spanning tree of G with
minimum total weight.
Example: Consider the example of spanning tree:
For the given graph there are three possible spanning trees. Among them the spanning
tree with the minimum weight 6 is the MST for the given graph
Algorithm:
ALGORITHM Prim(G)
//Prim's algorithm for constructing a MST
//Input: A weighted connected graph G = (V, E)
//Output: ET, the set of edges composing a MST of G
// the set of tree vertices can be initialized with any vertex
VT ← { v0 }
ET ← Ø
for i ← 1 to |V| - 1 do
    Find a minimum-weight edge e* = (v*, u*) among all the edges (v, u) such
    that v is in VT and u is in V - VT
    VT ← VT ∪ { u* }
    ET ← ET ∪ { e* }
return ET
STEP 1: Start with a tree, T0, consisting of one vertex
STEP 2: "Grow" the tree one vertex/edge at a time
2.5.2.1 Construct a series of expanding sub-trees T1, T2, …, Tn-1.
2.5.2.2 At each stage construct Ti+1 from Ti by adding the minimum-weight
edge connecting a vertex in the tree (Ti) to a vertex not yet in the tree, chosen
from the "fringe" edges (this is the "greedy" step!)
Algorithm stops when all vertices are included
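The pseudocode above can be sketched in Python using a min-heap as the priority queue. This is an illustrative implementation (the function name and the adjacency-list representation, a dict mapping each vertex to (neighbor, weight) pairs, are my choices):

```python
import heapq

def prim(graph, start):
    """Prim's MST: grow a tree from `start`, always adding the
    minimum-weight fringe edge, tracked in a min-heap."""
    in_tree = {start}
    mst, total = [], 0
    # heap entries are (weight, tree vertex, fringe vertex)
    heap = [(w, start, u) for u, w in graph[start]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, v, u = heapq.heappop(heap)
        if u in in_tree:
            continue                  # stale entry: u was added earlier
        in_tree.add(u)
        mst.append((v, u, w))
        total += w
        for nxt, nw in graph[u]:
            if nxt not in in_tree:
                heapq.heappush(heap, (nw, u, nxt))
    return mst, total
```

On the example graph below, starting from vertex a, this selects the edges ab, bc, bf, fe, fd for a total weight of 15.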
Example:
Apply Prim's algorithm for the following graph to find the MST.
[Graph figure: vertices a, b, c, d, e, f; edge weights
ab = 3, ae = 6, af = 5, bc = 1, bf = 4, cf = 4, cd = 6, df = 5, de = 8, ef = 2]
Solution:
Tree vertex added   Remaining fringe vertices (nearest tree vertex, weight)
b(a, 3)             c(b, 1), d(-, ∞), e(a, 6), f(b, 4)
c(b, 1)             d(c, 6), e(a, 6), f(b, 4)
f(b, 4)             d(f, 5), e(f, 2)
e(f, 2)             d(f, 5)
d(f, 5)             -
MST edges: ab, bc, bf, fe, fd; total weight = 3 + 1 + 4 + 2 + 5 = 15.
Efficiency:
Efficiency of Prim's algorithm depends on the data structure used to store the priority queue.
2.5.3 Unordered array: Efficiency: Θ(n²)
2.5.4 Binary heap: Efficiency: Θ(m log n)
2.5.5 Min-heap: for a graph with n nodes and m edges: Efficiency: Θ((n + m) log n)
Conclusion:
2.5.6 Prim's algorithm is a "vertex-based algorithm"
2.5.7 Prim's algorithm needs a priority queue for locating the nearest vertex.
The choice of priority queue matters in a Prim implementation.
o Array - optimal for dense graphs
o Binary heap - better for sparse graphs
o Fibonacci heap - best in theory, but not in practice
Algorithm:
The method:
STEP 1: Sort the edges by increasing weight
STEP 2: Start with a forest of |V| trees, one per vertex.
STEP 3: The number of trees is reduced by ONE at every inclusion of an edge
At each stage:
• Among the edges which are not yet included, select the one with minimum
weight AND which does not form a cycle.
• the edge will reduce the number of trees by one by combining two trees of
the forest
Algorithm stops when |V| -1 edges are included in the MST i.e : when the number of
trees in the forest is reduced to ONE.
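The steps above can be sketched in Python. This is an illustrative implementation (the function name and edge-list interface are mine); the cycle check, i.e. "do these two endpoints already belong to the same tree of the forest", uses a union-find (disjoint-set) structure:

```python
def kruskal(vertices, edges):
    """Kruskal's MST: scan edges in increasing weight order, adding an
    edge whenever it joins two different trees of the forest.
    `edges` is a list of (u, v, weight) triples."""
    parent = {v: v for v in vertices}   # each vertex starts as its own tree

    def find(v):                        # root of v's tree, with path halving
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, total = [], 0
    for u, v, w in sorted(edges, key=lambda e: e[2]):
        ru, rv = find(u), find(v)
        if ru != rv:                    # edge joins two trees: no cycle
            parent[ru] = rv             # union: merge the two trees
            mst.append((u, v, w))
            total += w
            if len(mst) == len(vertices) - 1:
                break                   # |V| - 1 edges: forest is one tree
    return mst, total
```

On the example graph below this accepts bc, ef, ab, bf, df (total weight 15) and rejects cf and af because they would form cycles.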
Example:
Apply Kruskal's algorithm for the following graph to find the MST.
[Graph figure: vertices a, b, c, d, e, f; edge weights
ab = 3, ae = 6, af = 5, bc = 1, bf = 4, cf = 4, cd = 6, df = 5, de = 8, ef = 2]
Solution:
The list of edges is:
Edge ab af ae bc bf cf cd df de ef
Weight 3 5 6 1 4 4 6 5 8 2
Edges considered in increasing order of weight:
Edge   Weight   Insertion status     Insertion order
bc     1        YES                  1
ef     2        YES                  2
ab     3        YES                  3
bf     4        YES                  4
cf     4        NO (forms a cycle)   -
af     5        NO (forms a cycle)   -
df     5        YES                  5
MST edges: bc, ef, ab, bf, df; total weight = 1 + 2 + 3 + 4 + 5 = 15.
Algorithm stops as |V| -1 edges are included in the MST
Efficiency:
Efficiency of Kruskal's algorithm is based on the time needed for sorting the edge
weights of a given graph.
2.6.1 With an efficient sorting algorithm: Efficiency: Θ(|E| log |E|)
Conclusion:
2.6.2 Kruskal's algorithm is an "edge-based algorithm"
2.6.3 Prim's algorithm with a heap is faster than Kruskal's algorithm.
2.7 Single Source Shortest Paths.
Dijkstra's algorithm: grow a tree of vertices whose shortest distances from the source
are already final, starting from VT ← Ø; at each step add the fringe vertex nearest to
the source, then update the tentative distances of its neighbors.
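Dijkstra's algorithm can be sketched in Python with a min-heap as the priority queue. This is an illustrative implementation (the function name, adjacency-list dict, and returned distance/predecessor maps are my choices); it assumes non-negative edge weights, as the algorithm requires:

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's single-source shortest paths (non-negative weights):
    repeatedly finalize the fringe vertex with the smallest tentative
    distance. Returns distance and predecessor (penultimate vertex) maps."""
    dist = {v: float('inf') for v in graph}
    prev = {v: None for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, v = heapq.heappop(heap)
        if d > dist[v]:
            continue                   # stale queue entry, skip
        for u, w in graph[v]:
            if d + w < dist[u]:        # relax edge (v, u)
                dist[u] = d + w
                prev[u] = v
                heapq.heappush(heap, (dist[u], u))
    return dist, prev
```

Following the predecessor map back from any vertex to the source reconstructs the shortest path, e.g. d → c → b → a in the worked example below.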
Example:
Apply Dijkstra's algorithm to find the single-source shortest paths, with vertex a as the
source.
[Graph figure: vertices a, b, c, d, e, f; edge weights
ab = 3, ae = 6, af = 5, bc = 1, bf = 4, cf = 4, cd = 6, df = 5, de = 8, ef = 2]
Solution:
Length Dv of shortest path from source (s) to other vertices v and Penultimate vertex Pv
for every vertex v in V:
Da = 0 , Pa = null
Db = ∞ , Pb = null
Dc = ∞ , Pc = null
Dd = ∞ , Pd = null
De = ∞ , Pe = null
Df = ∞ , Pf = null
Tree vertex added   Remaining fringe vertices (penultimate vertex, distance)
b(a, 3)             c(b, 3+1 = 4), d(-, ∞), e(a, 6), f(a, 5)
c(b, 4)             d(c, 4+6 = 10), e(a, 6), f(a, 5)
f(a, 5)             d(c, 10), e(a, 6)
e(a, 6)             d(c, 10)
d(c, 10)            -
Final distances and paths from source a:
Da = 0, Pa = null
Db = 3, Pb = [a, b]
Dc = 4, Pc = [a, b, c]
Dd = 10, Pd = [a, b, c, d]
De = 6, Pe = [a, e]
Df = 5, Pf = [a, f]
Conclusion:
2.7.5 Doesn't work with negative weights
2.7.6 Applicable to both undirected and directed graphs
2.7.7 Use unordered array to store the priority queue: Efficiency = Θ(n2)
2.7.8 Use min-heap to store the priority queue: Efficiency = O(m log n)