
Chapter 12

Greedy Approach
Greedy Approach
• A simple and easily applied technique.
• When solving a problem, evaluate the best alternative you have first, and only then evaluate the others.
• This approach also has weaknesses.

Greedy Algorithm
• A greedy algorithm for an optimization problem always makes the choice that looks best at the moment and adds it to the current subsolution, hoping that what is output at the end is an optimal solution.
• Examples already seen are Dijkstra's shortest-path algorithm and Prim's/Kruskal's MST algorithms.
• Greedy algorithms don't always yield optimal solutions but, when they do, they're usually the simplest and most efficient algorithms available.

The Greedy Method

• The greedy approach does not always lead to an optimal solution.
• Problems that admit a greedy solution are said to have the greedy-choice property.
• The greedy approach is also used on computationally hard problems to produce approximate solutions.

Fractional Knapsack Problem
• In the fractional knapsack problem, we are given a set S of n items, where each item i has a positive utility bi and a positive weight wi, and we want to find the maximum-utility selection of items whose total weight does not exceed a given capacity W.
• We are also allowed to take fractional parts of each item.

Fractional Knapsack Problem
• That is, we can take a quantity xi of each item i such that 0 ≤ xi ≤ wi and the total weight taken satisfies Σ(i∈S) xi ≤ W.

• The total utility of the items taken is given by the objective function Σ(i∈S) bi · (xi / wi), which we want to maximize.
Fractional Knapsack Problem

For an optimization problem, a greedy algorithm always makes the choice that currently seems best and adds it to the existing sub-solution.

Greedy algorithms do not always provide optimal solutions, but when they do, they are often the simplest and most efficient algorithms available.
Fractional Knapsack

Instead of making binary (0-1) choices for each item, fractions of items can be taken.

While the fractional knapsack problem can be solved optimally by a greedy strategy, the 0-1 problem cannot in general.
Fractional Knapsack
• Calculate the value per unit of weight for each item.
• Following the greedy strategy, we take as much as possible of the item with the highest value per unit.
• When that item is used up and capacity remains, we take as much as possible of the next most valuable item, and so on. A sketch of this strategy is given below.
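
The following is a minimal Python sketch of this greedy strategy, assuming items are given as (utility, weight) pairs; the function name and data format are illustrative, not taken from the slides.

def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack: items is a list of (utility, weight) pairs."""
    # Sort by utility per unit of weight, highest first.
    items = sorted(items, key=lambda bw: bw[0] / bw[1], reverse=True)
    total_utility = 0.0
    for utility, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)               # whole item, or the fraction that fits
        total_utility += utility * (take / weight)
        capacity -= take
    return total_utility

# Example with illustrative numbers: three items and capacity 50
print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))   # 240.0

Sorting dominates the running time, so the sketch runs in O(n log n) for n items.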
• Now fill the knapsack in decreasing order of value per unit pi.
• First, select item I1 with weight 5.
• Then select item I3 with weight 20. The total weight in the knapsack is now 5 + 20 = 25.
• The next item is I5 with weight 40, but only 35 units of capacity remain, so we take the corresponding fraction of it.
Example 2
A thief enters a store and sees the items listed in the figure.

The knapsack holds 4 kilos. What should the thief take to maximize profit?
Fractional knapsack problem: the thief can take a fraction of an item.
0-1 knapsack problem: the thief can only take or leave an item; fractions cannot be taken.
Greedy solution for Fractional Knapsack

List the items by value per pound from largest to smallest.

If the knapsack capacity is 5 pounds, the solution is as shown in the figure.
Dijkstra Algorithm
Greedy Approach

Single-Source Shortest Path Problem

Single-Source Shortest Path Problem - the problem of finding shortest paths from a source vertex v to all other vertices in the graph.
Examples:
- Maps (MapQuest, Google Maps)
- Routing systems (networks)
Dijkstra's algorithm
Dijkstra's algorithm is a solution to the single-source shortest path problem in graph theory.
It works on both directed and undirected graphs; however, all edges must have non-negative weights.

Input: a weighted graph G = (V, E) and a source vertex v ∈ V, where all edge weights are non-negative.

Output: the shortest-path lengths from the given source vertex to all other vertices.
Example

[The steps below were illustrated on a weighted graph with vertices A–G, with A as the source. Initially A has distance 0 and every other vertex has distance ∞.]

Updating the nearest neighbors of A:
Distance(B) = 2
Distance(D) = 1

Updating the neighbors of the closest unvisited vertex, D:
Distance(C) = 1 + 2 = 3
Distance(E) = 1 + 2 = 3
Distance(F) = 1 + 8 = 9
Distance(G) = 1 + 4 = 5

Updating the neighbors of C:
Distance(F) = 3 + 5 = 8

Updating the neighbors of G, comparing with the previous distance of F:
Distance(F) = min(8, 5 + 1) = 6

[Final distances from A: A = 0, B = 2, C = 3, D = 1, E = 3, F = 6, G = 5.]
Example 2

[A second worked example of Dijkstra's algorithm, shown step by step in figures.]
Dijkstra’s Algorithm
• Graph G, weight function w, root s

[Pseudocode figure: the algorithm repeatedly selects the closest unvisited vertex and relaxes its outgoing edges.]
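
The pseudocode from the original slides is not reproduced here; the following is a minimal Python sketch of Dijkstra's algorithm with edge relaxation, using a binary heap. The adjacency-list representation and names are illustrative assumptions.

import heapq

def dijkstra(graph, source):
    """graph: {vertex: [(neighbor, weight), ...]} with non-negative weights."""
    dist = {v: float("inf") for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                           # stale heap entry, skip it
        for v, w in graph[u]:
            if d + w < dist[v]:                # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

With a binary heap, this runs in O((V + E) log V) time.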
Prim Algorithm
• Prim's algorithm can be applied to both graphs and trees.
• The purpose of the algorithm is to reach every node at the least total cost.
• It was first formulated for trees.
• The working logic of the algorithm is to visit all nodes by building a Minimum Spanning Tree (MST).
• In other words, no matter how many nodes we have, the goal of Prim's algorithm is to reach all of them by building a spanning tree within the graph structure.
Prim Algorithm
• To start, an arbitrary node is selected.
• At each step, the node closest to the current tree (not previously included in the tree) is added to the tree.
• If two nodes are at equal distance, one is chosen according to a predetermined rule.
• For n nodes, n-1 iterations are performed.
Prim Algorithm
• Prim's algorithm grows a single tree T, one edge at a time, until it becomes a spanning tree.
• We initialize T to be a single node and no edges.
• At each step, Prim's algorithm adds the cheapest edge with one endpoint in T and the other not in T.
• Since each added edge brings one new vertex into T, after n − 1 additions T becomes a spanning tree. A sketch of this procedure follows.
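
Below is a minimal Python sketch of Prim's algorithm that keeps a heap of candidate edges leaving the tree; the adjacency-list format and names are illustrative assumptions, not taken from the slides.

import heapq

def prim(graph, start):
    """graph: {vertex: [(neighbor, weight), ...]}, undirected; returns MST edges."""
    in_tree = {start}
    mst_edges = []
    heap = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue                           # both endpoints already in T
        in_tree.add(v)                         # cheapest edge leaving T brings in v
        mst_edges.append((u, v, w))
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return mst_edges

With a binary heap, this runs in O(E log V) time.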
Prim Algorithm

[A step-by-step worked example of Prim's algorithm was shown in figures.]

Time Complexity

[Figure: running-time analysis of Prim's algorithm.]
Kruskal Algorithm
• It is one of the simplest graph algorithms.
• The aim is to obtain the minimum-cost tree covering all nodes in a graph.
• In Kruskal's algorithm, all edges of the graph are sorted from smallest to largest.
• The tree is built starting from the smallest edges.
• An edge that would cause a cycle is not added to the spanning tree, as it would break the tree structure.
• A cycle forms when drawing the edge would close a loop.
• The main criterion in this algorithm is to cover all nodes, joining components without ever creating a cycle.
Kruskal Algorithm
• Kruskal's algorithm sorts all edges of the given graph in increasing order, then keeps adding new edges and nodes to the MST as long as the newly added edge does not form a cycle.
• It picks the minimum-weight edge first and the maximum-weight edge last. Thus we can say that it makes a locally optimal choice at each step in order to find the optimal solution; hence it is a greedy algorithm.
How to find MST using
Kruskal’s algorithm?
1. Sort all the edges in non-decreasing order of their weight.
2. Pick the smallest edge. Check whether it forms a cycle with the spanning tree formed so far. If no cycle is formed, include the edge; otherwise, discard it.
3. Repeat step 2 until there are (V-1) edges in the spanning tree.
A sketch of this procedure is given below.
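
The following is a minimal Python sketch of Kruskal's algorithm using a union-find (disjoint-set) structure to detect cycles; the edge-list format and names are illustrative assumptions.

def kruskal(num_vertices, edges):
    """edges: list of (weight, u, v) with vertices numbered 0..num_vertices-1."""
    parent = list(range(num_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]      # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):              # step 1: sort by weight
        ru, rv = find(u), find(v)
        if ru != rv:                           # step 2: skip edges that would form a cycle
            parent[ru] = rv
            mst.append((u, v, w))
            if len(mst) == num_vertices - 1:   # step 3: stop at V-1 edges
                break
    return mst

Sorting dominates, giving O(E log E) time overall.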
Kruskal Algorithm
• Kruskal’s algorithm to find the minimum
cost spanning tree uses the greedy
approach. The Greedy Choice is to pick
the smallest weight edge that does not
cause a cycle in the MST constructed so
far. Let us understand it with an example:

Kruskal Algorithm

[Figure: example graph with 9 vertices and 14 edges.]
The graph contains 9 vertices and 14 edges, so the minimum spanning tree formed will have (9 – 1) = 8 edges.
After sorting [the sorted edge list is shown in the figure], pick the edges one by one from the sorted list.
Step 1: Pick edge 7-6. No cycle is formed, so include it.
Step 2: Pick edge 8-2. No cycle is formed, so include it.
Step 3: Pick edge 6-5. No cycle is formed, so include it.
Step 4: Pick edge 0-1. No cycle is formed, so include it.
Step 5: Pick edge 2-5. No cycle is formed, so include it.
Step 6: Pick edge 8-6. Including this edge would create a cycle, so discard it. Pick edge 2-3: no cycle is formed, so include it.
Step 7: Pick edge 7-8. Including this edge would create a cycle, so discard it. Pick edge 0-7: no cycle is formed, so include it.
Step 8: Pick edge 1-2. Including this edge would create a cycle, so discard it. Pick edge 3-4: no cycle is formed, so include it.
Note: Since the number of edges included in the MST now equals (V – 1), the algorithm stops here.
