UNIT 3: THE GREEDY METHOD
Knapsack Problem, Minimum Cost Spanning Tree, Single Source Shortest Path
THE GENERAL METHOD
The Greedy Method is a problem-solving approach used in designing algorithms. It works by making a series of choices, each of which is the best choice at the moment, with the hope that these locally optimal choices will lead to a globally optimal solution.
Key Steps in the Greedy Method:
1. Initialization: Start with an empty or initial solution.
2. Selection: At each step, choose the option that looks best.
3. Feasibility Check: Ensure that the choice is valid and keeps the solution feasible.
4. Repeat: Continue until the problem is solved or all choices are exhausted.
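The notes contain no code, so here is a minimal, generic sketch in Python of the four steps above; `select_best` and `is_feasible` are hypothetical placeholders that each concrete problem (knapsack, spanning tree, shortest path) would supply with its own logic.

```python
def greedy(candidates, select_best, is_feasible):
    """Generic greedy loop mirroring the four steps listed above."""
    solution = []                               # 1. Initialization: empty solution
    remaining = list(candidates)
    while remaining:                            # 4. Repeat until choices are exhausted
        best = select_best(remaining)           # 2. Selection: locally best option
        remaining.remove(best)
        if is_feasible(solution, best):         # 3. Feasibility check
            solution.append(best)
    return solution

# Tiny illustrative use: greedily pick the largest numbers whose sum stays <= 10.
print(greedy([4, 7, 2, 5], select_best=max,
             is_feasible=lambda sol, x: sum(sol) + x <= 10))   # [7, 2]
```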
When does the Greedy Method work?
The greedy method works if the problem satisfies the Greedy-Choice Property (locally optimal choices lead to a globally optimal solution) and Optimal Substructure (optimal solutions to subproblems combine to form an optimal solution to the whole problem).
KNAPSACK PROBLEM
The Knapsack Problem is a classic optimization problem where the goal is to maximize the value of items you can carry in a knapsack with a fixed weight capacity. Each item has a specific weight and value, and you must decide which items to include.
The Greedy Method solves the problem by making a series of choices, each of which looks best at the moment. For the Fractional Knapsack Problem, this approach works optimally. Here's how it works:
1. Calculate Value-to-Weight Ratio: For each item, compute its value-to-weight ratio (`value/weight`). This measures how "valuable" each unit of weight is.
2. Sort Items: Arrange the items in descending order of their value-to-weight ratios.
3. Pick Items:
   - Start with the item having the highest ratio.
   - If the entire item fits in the knapsack, take it completely.
   - If not, take as much as possible (a fractional part).
4. Stop When Full: Continue until the knapsack reaches its weight limit.
This greedy approach works only for the Fractional Knapsack Problem, where items can be split. For the 0/1 Knapsack Problem (items cannot be split), the greedy method does not always give the best solution.
Example:
- Items:
  - Item 1: weight = 10, value = 60
  - Item 2: weight = 20, value = 100
  - Item 3: weight = 30, value = 120
- Knapsack Capacity: 50
- Value-to-Weight Ratios:
  - Item 1: 6
  - Item 2: 5
  - Item 3: 4
- Greedy Selection:
  1. Take Item 1 completely (10 units, value = 60).
  2. Take Item 2 completely (20 units, value = 100).
  3. Take 20/30 of Item 3 (value = 120 × 20/30 = 80).
- Total Value: 60 + 100 + 80 = 240.
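Below is a minimal Python sketch of the greedy steps above (the notes give no code, so the function name and the (weight, value) tuple layout are illustrative assumptions); it reproduces the worked example.

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack; items are (weight, value) pairs."""
    # Steps 1-2: sort by value-to-weight ratio, highest ratio first.
    items = sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True)
    total_value = 0.0
    for weight, value in items:
        if capacity == 0:                     # Step 4: stop when the knapsack is full
            break
        take = min(weight, capacity)          # Step 3: whole item, or as much as fits
        total_value += value * (take / weight)
        capacity -= take
    return total_value

# The example above: capacity 50, items 1-3 -> 240.0
print(fractional_knapsack([(10, 60), (20, 100), (30, 120)], 50))
```

For the 0/1 version with these same items, greedy-by-ratio would take only Items 1 and 2 (value 160), while the true optimum is Items 2 and 3 (value 220), which is why the greedy method is reserved for the fractional case.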
MINIMUM COST SPANNING TREE
The Minimum Cost Spanning Tree (MCST) is a tree that connects all the nodes (vertices) in a graph with the least total edge cost, ensuring no cycles are formed. It is useful in scenarios like designing efficient network layouts (e.g., roads, cables).
A greedy algorithm builds the MCST step-by-step by always choosing the smallest available edge that doesn't form a cycle. Two popular greedy methods are:
1. Prim’s Algorithm:
   - Start from any vertex.
   - Repeatedly add the smallest edge that connects a vertex in the tree to a vertex outside the tree.
   - Continue until all vertices are included.
2. Kruskal’s Algorithm:
   - Sort all edges by weight (cost) in ascending order.
   - Add edges one by one to the tree, skipping any edge that forms a cycle.
   - Stop when all vertices are connected.
Both methods ensure the final tree is a minimum cost spanning tree because they always take the "locally optimal" choice, which leads to a globally optimal solution.
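Here are minimal sketches of both algorithms in Python (the notes contain no code; the adjacency-list and edge-list representations and the tiny three-vertex sample graph are assumptions made for illustration). Each returns only the total cost of the spanning tree.

```python
import heapq

def prim(adj, start):
    """Prim's algorithm; adj maps vertex -> list of (weight, neighbor)."""
    in_tree, total = {start}, 0
    heap = list(adj[start])                   # candidate edges leaving the tree
    heapq.heapify(heap)
    while heap and len(in_tree) < len(adj):
        w, v = heapq.heappop(heap)            # smallest candidate edge
        if v in in_tree:                      # both ends already in the tree -> skip
            continue
        in_tree.add(v)
        total += w
        for edge in adj[v]:
            heapq.heappush(heap, edge)
    return total

def kruskal(n, edges):
    """Kruskal's algorithm; edges are (weight, u, v) with vertices 0..n-1."""
    parent = list(range(n))                   # union-find for cycle detection
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    total = 0
    for w, u, v in sorted(edges):             # ascending order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                          # skip edges that would form a cycle
            parent[ru] = rv
            total += w
    return total

# Same sample graph both ways: edges A-B = 1, A-C = 4, B-C = 2 -> MST cost 3
adj = {'A': [(1, 'B'), (4, 'C')], 'B': [(1, 'A'), (2, 'C')], 'C': [(4, 'A'), (2, 'B')]}
print(prim(adj, 'A'))                                  # 3
print(kruskal(3, [(1, 0, 1), (4, 0, 2), (2, 1, 2)]))   # 3
```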
SINGLE SOURCE SHORTEST PATH
The Single Source Shortest Path (SSSP) problem aims to find the shortest paths from a given starting node (source) to all other nodes in a weighted graph. The weights on the edges represent costs, distances, or times.
The greedy approach to solve the SSSP problem is implemented in Dijkstra's Algorithm (for non-negative edge weights). Here's how it works:
1. Initialization: Start with the source node and assign it a distance of 0. Assign all other nodes a distance of infinity (∞) to indicate that they have not yet been reached.
2. Greedy Choice: At each step, select the unvisited node with the smallest distance (the "closest" node) from the source.
3. Relaxation: Update the distances of the neighboring nodes of the selected node. If the path through the selected node offers a shorter route to a neighbor, update the distance of that neighbor.
4. Repeat: Mark the selected node as "visited" (processed) and repeat the process until all nodes have been processed or the shortest paths to all reachable nodes are found.
5. End: When the algorithm finishes, the shortest distance from the source to every other node is known.
Example:
In a graph with nodes A, B, C and edges with weights:
- A to B: 1
- A to C: 4
- B to C: 2
1. Start at A: Distance to A = 0, B = ∞, C = ∞.
2. Choose A: Update distances (B = 1, C = 4).
3. Choose B (shortest distance): Update C (C = 1 + 2 = 3).
4. Choose C: No updates needed.
5. Result: Shortest paths are A to B = 1, A to C = 3.
Key Idea: The algorithm greedily selects the nearest unvisited node; because edge weights are non-negative, that node’s distance can never be improved later, so each greedy choice is final and correct.
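A minimal Dijkstra sketch in Python follows (the notes give no code; the heap-based priority queue and the adjacency-list representation are implementation choices assumed here, and the example graph is treated as undirected). It reproduces the worked example above.

```python
import heapq

def dijkstra(adj, source):
    """adj maps vertex -> list of (neighbor, weight); returns shortest distances."""
    dist = {v: float('inf') for v in adj}    # 1. Every node starts at infinity...
    dist[source] = 0                         #    ...except the source
    visited = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)           # 2. Greedy choice: closest unvisited node
        if u in visited:
            continue
        visited.add(u)                       # 4. Mark the node as processed
        for v, w in adj[u]:                  # 3. Relax the edges leaving u
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist                              # 5. Shortest distance to every node

# The example graph: A-B = 1, A-C = 4, B-C = 2
adj = {'A': [('B', 1), ('C', 4)], 'B': [('A', 1), ('C', 2)], 'C': [('A', 4), ('B', 2)]}
print(dijkstra(adj, 'A'))                    # {'A': 0, 'B': 1, 'C': 3}
```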