Heuristic Function
Example of Heuristic Functions
• The average solution cost for a randomly generated 8-puzzle instance
is about 22 steps.
• The branching factor is about 3.
– When the empty tile is in the middle, four moves are possible; when it is in a corner,
two; and when it is along an edge, three.
• The evaluation function is construed as a cost estimate
– So the node with the lowest evaluation is expanded first
• This means that an exhaustive tree search to depth 22 would look at
about 3^22 ≈ 3.1 × 10^10 states.
• A graph search would cut this down by a factor of about 170,000,
because only 9!/2 = 181,440 distinct states are reachable.
• This is a manageable number, but the corresponding number for the
15-puzzle is roughly 10^13,
• so the next order of business is to find a good heuristic function.
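A quick sanity check of these figures (a minimal Python sketch; the numbers 3^22, 9!/2, and the ratio between them are the ones quoted above):

import math

tree_nodes = 3 ** 22                       # exhaustive tree search to depth 22
distinct_states = math.factorial(9) // 2   # only half of the 9! permutations are reachable
print(tree_nodes)                          # 31381059609, i.e. about 3.1 x 10^10
print(distinct_states)                     # 181440
print(tree_nodes // distinct_states)       # 172955, the factor of about 170,000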
Two commonly used heuristic functions for the 8-puzzle:
• h1 = the number of misplaced tiles.
– For the figure given, all eight tiles are out of position,
so the start state would have h1 = 8.
– h1 is an admissible heuristic because it is clear that any tile
that is out of place must be moved at least once.
• h2 = the sum of the distances of the tiles from their
goal positions.
– Because tiles cannot move along diagonals, the distance we count
is the sum of the horizontal and vertical distances.
– h2 for the start state = 3 + 1 + 2 + 2 + 2 + 3 + 3 + 2 = 18
– h2 is also admissible because all any move can do is move one
tile one step closer to the goal.
– By contrast, for route-finding problems it takes a certain amount of
experience to know that hSLD (straight-line distance) is correlated
with actual road distances and is, therefore, a useful heuristic.
• h2 is sometimes called the city block distance or
Manhattan distance.
• As expected, neither of these overestimates the true
solution cost, which is 26.
• Let h*(n) be the cost of the optimal path from n to a goal node.
• A heuristic h(n) is admissible if 0 ≤ h(n) ≤ h*(n).
• An admissible heuristic is always optimistic.
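A minimal Python sketch of both heuristics, assuming a state is a tuple of nine entries read row by row with 0 standing for the blank, the goal layout is 0,1,2,...,8, and the start state is the standard textbook one (7 2 4 / 5 0 6 / 8 3 1), which is consistent with h1 = 8 and h2 = 18 as quoted above:

GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)       # blank in the top-left corner
START = (7, 2, 4, 5, 0, 6, 8, 3, 1)      # assumed start state from the figure

def h1(state):
    """Number of misplaced tiles (the blank is not counted)."""
    return sum(1 for i, tile in enumerate(state) if tile != 0 and tile != GOAL[i])

def h2(state):
    """Sum of Manhattan (city block) distances of each tile from its goal square."""
    total = 0
    for i, tile in enumerate(state):
        if tile == 0:
            continue
        goal_i = GOAL.index(tile)
        total += abs(i // 3 - goal_i // 3) + abs(i % 3 - goal_i % 3)
    return total

print(h1(START))   # 8
print(h2(START))   # 18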
• The effect of heuristic accuracy on performance
• Generating admissible heuristics from relaxed
problems
• Generating admissible heuristics from subproblems: Pattern databases
• Learning heuristics from experience
LOCAL SEARCH ALGORITHMS AND OPTIMIZATION PROBLEMS
• Local search algorithms care only about the final state, not the
path taken to reach it.
• For example, in the 8-queens problem, we care only about finding a
valid final configuration of 8 queens (8 queens arranged on a chessboard
so that no queen can attack another), not the path from the initial state
to the final state; a conflict-counting sketch follows after this list.
• Local search algorithms operate by moving from a start state to
neighboring states, without keeping track of the paths or of the set of
states that have been reached.
• They are not systematic - they might never explore a portion of the
search space where a solution actually resides.
• In short, they search for a goal state rather than for a path to it.
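For the 8-queens example mentioned above, a minimal sketch of the kind of evaluation a local search would use, assuming the common encoding in which state[c] gives the row of the queen in column c (zero attacking pairs means a valid final configuration):

def conflicts(state):
    """Number of pairs of queens attacking each other (0 = goal)."""
    n = len(state)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diag = abs(state[i] - state[j]) == abs(i - j)
            if same_row or same_diag:
                count += 1
    return count

print(conflicts((0, 4, 7, 5, 2, 6, 1, 3)))   # 0: a valid 8-queens configuration
print(conflicts((0, 1, 2, 3, 4, 5, 6, 7)))   # 28: every pair shares a diagonal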
Local Search Algorithms:
• Hill Climbing Algorithm
• Simulated Annealing
• Local Beam Search
• Evolutionary Algorithm – Genetic Algorithm
Hill Climbing Algorithm
• The hill climbing algorithm is a heuristic search algorithm that continually
moves in the direction of increasing value to find the peak of the mountain,
i.e., the best solution to the problem.
• It keeps track of one current state and on each iteration moves to the
neighboring state with the highest value – that is, it heads in the direction
that provides the steepest ascent.
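A minimal sketch of steepest-ascent hill climbing; the neighbors and value functions are illustrative placeholders supplied by the caller, not part of any particular library:

def hill_climbing(start, neighbors, value):
    """Keep one current state; always move to the best neighbor and stop
    when no neighbor improves on the current state."""
    current = start
    while True:
        best = max(neighbors(current), key=value, default=current)
        if value(best) <= value(current):
            return current          # a peak (possibly only a local maximum)
        current = best

# Example: maximize f(x) = -(x - 3)^2 over integer states with step 1.
f = lambda x: -(x - 3) ** 2
step = lambda x: [x - 1, x + 1]
print(hill_climbing(0, step, f))    # 3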
Simulated Annealing
• Simulated Annealing is a stochastic global search optimization algorithm;
it is a modified version of stochastic hill climbing.
• This algorithm is appropriate for nonlinear objective functions, where other
local search algorithms do not operate well.
• The simulated-annealing solution is to start by shaking hard (i.e., at a high temperature)
and then gradually reduce the intensity of the shaking (i.e., lower the temperature).
• Simulated Annealing (SA) is very useful for situations where there are a lot of local
minima.
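A minimal sketch of that idea: every uphill move is accepted, a downhill move is accepted with probability exp(Δ/T), where Δ is the (negative) change in value, and the temperature T is lowered gradually. The geometric cooling schedule and the bumpy 1-D objective below are illustrative choices, not prescribed by the algorithm:

import math
import random

def simulated_annealing(start, neighbor, value, t0=10.0, cooling=0.995, t_min=1e-3):
    """Accept every uphill move; accept a downhill move with probability
    exp(delta / T), where delta < 0. T shrinks geometrically each step."""
    current, t = start, t0
    while t > t_min:
        nxt = neighbor(current)
        delta = value(nxt) - value(current)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= cooling
    return current

# Example: maximize a bumpy 1-D function with many local maxima.
f = lambda x: math.sin(x) + math.sin(3 * x) / 3 - (x / 10) ** 2
move = lambda x: x + random.uniform(-0.5, 0.5)
print(simulated_annealing(0.0, move, f))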
Local beam search
• A heuristic search algorithm that examines a graph by extending the most
promising nodes in a limited set is known as a beam search algorithm.
• The number of nodes kept, n, is the beam width.
• The algorithm keeps only this fixed number of the most promising nodes on
the open list, pruning the rest.
Components of Beam Search
• A beam search takes three components as its input:
1. The problem, usually represented as a graph, containing a set of
nodes in which one or more of the nodes represents a goal.
2. A set of heuristic rules for pruning: rules specific to the problem
domain that prune unfavorable nodes from memory.
3. A memory with a limited available capacity.
• The memory is where the "beam" is stored; when the memory is full and a
node is to be added to the beam, the most costly node is deleted,
such that the memory limit is not exceeded.
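A minimal sketch of local beam search under the description above, assuming a fixed beam width k: keep only the k best states, expand them all, and retain the k best successors (the neighbors/value interface is an illustrative assumption):

import heapq

def local_beam_search(starts, neighbors, value, k=3, iterations=100):
    """Keep the k best states (the "beam"); on each iteration expand them all
    and retain only the k best of the resulting successors."""
    beam = heapq.nlargest(k, starts, key=value)
    for _ in range(iterations):
        candidates = [s for state in beam for s in neighbors(state)]
        if not candidates:
            break
        beam = heapq.nlargest(k, candidates, key=value)
    return max(beam, key=value)

# Example: maximize f(x) = -(x - 7)^2 starting from several points.
f = lambda x: -(x - 7) ** 2
step = lambda x: [x - 1, x + 1]
print(local_beam_search([0, 20, -5], step, f, k=3))   # 7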
Genetic Algorithm
• A genetic algorithm (or GA) is a search technique used to find exact or
approximate solutions to optimization and search problems.
• Genetic algorithms are categorized as global search heuristics.
• GAs are a particular class of evolutionary algorithms that use techniques
inspired by evolutionary biology such as inheritance, mutation, selection,
and crossover (also called recombination).
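A minimal sketch showing selection, crossover, and mutation on the 8-queens problem, with states encoded as one row per column and fitness counted as non-attacking pairs; the population size, generation count, and mutation rate are illustrative values:

import random

N, POP, GENERATIONS, MUTATION = 8, 100, 200, 0.1

def fitness(state):
    """Non-attacking pairs of queens; 28 (= 8 choose 2) means a solved board."""
    attacks = sum(1 for i in range(N) for j in range(i + 1, N)
                  if state[i] == state[j] or abs(state[i] - state[j]) == j - i)
    return 28 - attacks

def crossover(a, b):
    """Single-point recombination of two parent states."""
    cut = random.randint(1, N - 1)
    return a[:cut] + b[cut:]

def mutate(state):
    """With small probability, move one queen to a random row."""
    if random.random() < MUTATION:
        i = random.randrange(N)
        state = state[:i] + (random.randrange(N),) + state[i + 1:]
    return state

population = [tuple(random.randrange(N) for _ in range(N)) for _ in range(POP)]
for _ in range(GENERATIONS):
    weights = [fitness(s) for s in population]            # selection pressure
    parents = random.choices(population, weights=weights, k=2 * POP)
    population = [mutate(crossover(parents[2 * i], parents[2 * i + 1]))
                  for i in range(POP)]

best = max(population, key=fitness)
print(best, fitness(best))   # fitness 28 means no queen attacks another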
Local Search in Continuous Spaces