
Multi-Objective Genetic Algorithms

Group - 17

Kinjal Ray (510517001)
Souparno Bhattacharyya (B05-510417020)
Anjishnu Mukherjee (B05-511017020)
Balraj Gupta (B05-511117031)
Presentation Overview

- What do we mean by Multi-Objective Problems?
- The mathematics behind Multi-Objective Optimisation problems
- What is a Genetic Algorithm?
- Genetic Algorithms for Multi-Objective Optimisation
- Results obtained using PyMoo, a Python library
What do we mean by “Multi-Objective”?

Say you are the owner of a factory. What will be your objectives?

- You need maximum profits from your investments.
- Environmental damage needs to be minimized.
- Employees need a safe working environment.
- Legal constraints need to be followed.

Clearly, there are “multiple objectives”. This is an example of a multi-objective problem.
Understanding Multi-Objective Optimization

Let’s step into the shoes of a car buyer. While buying a car, a buyer has the following objectives:

● Minimum fuel consumption
● Minimum emissions
● Minimum on-road price
● Minimum maintenance cost (insurance, service)
● Maximum comfort
● Maximum power
● Maximum features
● Maximum longevity
● Maximum resale value
Conflicting Objectives

● Minimum fuel consumption vs. Maximum power
● Minimum emissions vs. Maximum power
● Minimum on-road price vs. Maximum comfort
● Minimum maintenance cost vs. Maximum features

Using Multi-Objective Optimization techniques, we need to find a set of solutions that defines the best trade-off between these conflicting objectives for this problem.
Mathematical Definitions for Multi-Objective Optimization Problems

Formal definition of multi-objective optimization problems

● Multiple-criteria decision making
● Concerned with mathematical optimization problems
● Typically involves more than one objective function that needs to be optimized simultaneously
● Optimal decision(s) need to be taken in the presence of trade-off(s) between two or more conflicting objectives
● The set of solutions that defines the best trade-off between the conflicting objectives is the answer
For a COVID vaccine, the set of objectives is:
- Maximise efficacy.
- Minimise undesirable side effects.

Mathematically, this can be modeled as a minimax problem whose solution is a saddle point.

[Figure: minimax problem with a saddle-point solution]
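For reference (this is not from the slide itself), the generic minimax formulation and the saddle-point condition it refers to can be written as

\min_{x \in X} \max_{y \in Y} f(x, y),

where a saddle point (x^*, y^*) satisfies

f(x^*, y) \le f(x^*, y^*) \le f(x, y^*) \quad \text{for all } x \in X,\; y \in Y.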
Mathematical definition of multi-objective optimization problems

- This definition does not take into account constraint functions, which have to be handled in practical applications.

- In evolutionary multi-objective optimisation algorithms, constraints are handled by penalties that increase the objective function values in proportion to the constraint violation, using Lagrange multipliers.
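The slide gives the definition as an image; the standard unconstrained formulation it refers to is

\min_{x \in \mathcal{X}} \; \mathbf{f}(x) = \big(f_1(x),\, f_2(x),\, \dots,\, f_m(x)\big),

and one common penalty scheme for constraints g_j(x) \le 0 (a sketch of the approach described above, not necessarily the exact form used in the slides) replaces each objective by

\tilde{f}_i(x) = f_i(x) + \sum_{j=1}^{k} \lambda_j \, \max\big(0,\, g_j(x)\big), \qquad \lambda_j \ge 0,

where the \lambda_j are the penalty coefficients (the "Lagrange multipliers" mentioned above).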
Considering the points in time when additional information is provided during the optimisation, there are three general approaches to MOO:

- A priori -> A total order is defined on the objective space, and the optimisation algorithm finds a minimal point and minimum value with respect to that order. The user states their preferences prior to the optimisation.

- A posteriori -> A partial order is defined on the objective space, and the algorithm searches for the minimal set with respect to this order over all feasible solutions. Here, the user states their preferences after being informed about the trade-offs among non-dominated solutions.

- Interactive -> The objective functions and constraints and their prioritisation are refined by requesting user feedback on preferences at multiple points in time during execution.
Our focus…
We will be focusing on a posteriori approaches to MOO (with the partial order in question being the Pareto order), with no explicit constraints.

But why?
This allows us to demonstrate the MO problem-solving approach without focusing on specific cases.
Dominance and Order

● Pareto Dominance
  ○ Allows precise comparison of objective function values
● Pareto Order
  ○ Strict partial order on the Objective Space
  ○ Relates the Decision Space and the Objective Space in Multi-Objective Optimisation
  ○ Induces a pre-order on the Decision/Search Space
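For completeness, the standard definition of Pareto dominance for minimization (given on the slide only in symbols) is: an objective vector y dominates y' iff

\mathbf{y} \prec \mathbf{y}' \;\iff\; y_i \le y'_i \;\;\text{for all } i \in \{1,\dots,m\} \;\;\text{and}\;\; y_j < y'_j \;\;\text{for at least one } j,

and a decision vector x^* is Pareto optimal if no feasible x exists with \mathbf{f}(x) \prec \mathbf{f}(x^*).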
Pareto Front

Pareto fronts can be classified by shape (convex / concave) and by continuity (continuous / discontinuous):

- Continuous, convex Pareto front (Problem: BNH)
- Continuous, concave Pareto front (Problem: ZDT2)
- Discontinuous Pareto front (Problem: ZDT3)

[Figure: example Pareto fronts for BNH, ZDT2 and ZDT3]
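As a quick way to reproduce these reference fronts (not the slides' own code; the module paths assume a recent pymoo release, roughly 0.6, where older versions exposed get_problem under pymoo.factory):

    from pymoo.problems import get_problem
    from pymoo.visualization.scatter import Scatter

    # Plot the analytical Pareto fronts of the three example problems.
    for name in ["bnh", "zdt2", "zdt3"]:
        pf = get_problem(name).pareto_front()        # array of shape (n_points, 2)
        Scatter(title=name.upper()).add(pf).show()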
Time Complexity to find Minimal Elements
Numerical Approaches

Two common approaches:
○ Scalarization
○ Karush-Kuhn-Tucker conditions
Scalarization
● Classical solution approach for MOO
● The objective functions are aggregated into a single objective
● The resulting (possibly constrained) single-objective problem is solved

● Scalarization Techniques:
  ○ Linear Scalarization
  ○ Chebyshev Scalarization
  ○ 𝛜-Constraint Method
  ○ Boundary Intersection Method
Linear Scalarization
● Definition (shown on the slide as an image; reconstructed below)
● The solution lies on the Pareto front

[Figure: the linear scalarization problem (LSP) with different weights, on a convex and on a concave Pareto front]
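The standard linear scalarization problem for weights w_i is

\min_{x \in \mathcal{X}} \; \sum_{i=1}^{m} w_i f_i(x), \qquad w_i \ge 0, \quad \sum_{i=1}^{m} w_i = 1.

For strictly positive weights the minimizer is Pareto optimal; however, on a concave (non-convex) Pareto front some Pareto-optimal points cannot be obtained by any choice of weights, which is what the two figures on the slide contrast.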


Chebyshev Scalarization
● Definition (shown on the slide as an image; reconstructed below)
● The solution lies on the Pareto front

[Figure: the Chebyshev scalarization problem (CSP) with different weights, on a convex and on a concave Pareto front]
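The standard Chebyshev scalarization with respect to a reference (ideal) point z^* is

\min_{x \in \mathcal{X}} \; \max_{i = 1,\dots,m} \; w_i \,\big| f_i(x) - z_i^{*} \big|, \qquad w_i \ge 0.

Unlike linear scalarization, with suitable weights this formulation can reach every Pareto-optimal point, including points on concave parts of the front.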


𝛜-Constraint Method and Boundary Intersection Method
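The slide presents both methods graphically; the standard form of the 𝛜-constraint method is to keep one objective and turn the others into constraints,

\min_{x \in \mathcal{X}} \; f_k(x) \quad \text{subject to} \quad f_i(x) \le \epsilon_i \;\;\text{for all } i \ne k,

where varying the bounds \epsilon_i traces out different parts of the Pareto front. (The boundary intersection method is not reproduced here.)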
Karush-Kuhn-Tucker Conditions
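For reference, the first-order Karush-Kuhn-Tucker conditions for Pareto optimality (assuming differentiable objectives f_i, inequality constraints g_j(x) \le 0, and a constraint qualification) state that at a Pareto-optimal point x^* there exist multipliers such that

\sum_{i=1}^{m} \lambda_i \nabla f_i(x^*) + \sum_{j=1}^{k} \mu_j \nabla g_j(x^*) = 0, \qquad
\lambda_i \ge 0, \;\; \sum_{i=1}^{m} \lambda_i = 1, \;\; \mu_j \ge 0, \;\; \mu_j \, g_j(x^*) = 0.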
Evolutionary Multi-Objective Optimization

Evolutionary algorithms are a branch of bio-inspired search heuristics. They use biological principles like selection, recombination and mutation to steer a population (a set) of individuals (decision vectors) towards an optimal solution.

Multi-objective evolutionary algorithms (MOEAs) generalise this idea to approach sets of Pareto-optimal solutions spread across the Pareto front.
What other algorithms are possible?

# Simple Genetic Algorithm (generational replacement)
evaluate(population)
for g in range(ngen):
    population = select(population, len(population))            # parent selection
    offspring = variation_and(population, Pcx, Pmut)             # crossover and mutation
    evaluate(offspring)
    population = offspring                                       # offspring replace the parents

# Evolution Strategy, (mu + lambda) selection
evaluate(population)
for g in range(ngen):
    offspring = variation_or(population, lambda_, Pcx, Pmut)     # each offspring by crossover or mutation
    evaluate(offspring)
    population = select(population + offspring, mu)              # best mu of parents + offspring survive

Pcx – the probability that an offspring is produced by crossover.
Pmut – the probability that an offspring is produced by mutation.
lambda_ – the number of offspring to produce.
mu – the number of individuals to select for the next generation.
There are 3 main paradigms for MOEA design:

- Pareto-based MOEAs
- Indicator-based MOEAs
- Decomposition-based MOEAs
Pareto-based MOEA

Pareto-based MOEAs use a two-level ranking scheme:

- The Pareto dominance relation governs the first-level ranking.
- The contribution of points to diversity is the principle of the second-level ranking.

Examples:
- NSGA-II
- SPEA2
NSGA-II
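As a rough illustration of the two-level ranking described above (a minimal sketch, not the code used for the experiments), NSGA-II's fast non-dominated sorting and crowding-distance assignment can be written as:

    import numpy as np

    def non_dominated_sort(F):
        """Rank points by Pareto dominance (rank 0 = first front). F: (n, m) objective values, minimization."""
        n = len(F)
        dominated_by = [[] for _ in range(n)]            # indices of the points that point i dominates
        domination_count = np.zeros(n, dtype=int)        # how many points dominate point i
        ranks = np.full(n, -1, dtype=int)
        for i in range(n):
            for j in range(n):
                if i != j and np.all(F[i] <= F[j]) and np.any(F[i] < F[j]):
                    dominated_by[i].append(j)
                    domination_count[j] += 1
        front, rank = [i for i in range(n) if domination_count[i] == 0], 0
        while front:
            next_front = []
            for i in front:
                ranks[i] = rank
                for j in dominated_by[i]:
                    domination_count[j] -= 1
                    if domination_count[j] == 0:
                        next_front.append(j)
            front, rank = next_front, rank + 1
        return ranks

    def crowding_distance(F):
        """Diversity measure within a single front: larger values mean more isolated points."""
        n, m = F.shape
        dist = np.zeros(n)
        for k in range(m):
            order = np.argsort(F[:, k])
            span = max(F[order[-1], k] - F[order[0], k], 1e-12)
            dist[order[0]] = dist[order[-1]] = np.inf    # boundary points are always preferred
            dist[order[1:-1]] += (F[order[2:], k] - F[order[:-2], k]) / span
        return dist

    # Example: rank a small 2-objective population.
    F = np.array([[1.0, 4.0], [2.0, 2.0], [3.0, 1.0], [3.0, 3.0]])
    print(non_dominated_sort(F))    # -> [0 0 0 1]

In the survival step, parents and offspring are merged and the best pop_size individuals are kept, ordered first by rank and then, within the last admitted front, by decreasing crowding distance.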
Indicator-based MOEA

Indicator-based MOEAs are guided by an indicator that measures the performance of a set, for example the hypervolume indicator.

Improvements with respect to the indicator determine the selection procedure or the ranking of the individuals.

Example:
- SMS-EMOA
SMS-EMOA
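To make the indicator concrete (again a sketch under the assumption of a two-objective minimization problem, not the slides' code), the hypervolume of a non-dominated set with respect to a reference point, and the per-point contributions SMS-EMOA ranks by, can be computed as:

    import numpy as np

    def hypervolume_2d(F, ref):
        """Area dominated by the non-dominated set F (shape (n, 2)) w.r.t. reference point ref; minimization."""
        if len(F) == 0:
            return 0.0
        F = F[np.argsort(F[:, 0])]                  # sort by f1 ascending (f2 then descends)
        right = np.append(F[1:, 0], ref[0])         # right edge of the slab owned by each point
        return float(np.sum((right - F[:, 0]) * (ref[1] - F[:, 1])))

    def hv_contributions(F, ref):
        """Hypervolume lost when each point is removed."""
        total = hypervolume_2d(F, ref)
        return np.array([total - hypervolume_2d(np.delete(F, i, axis=0), ref) for i in range(len(F))])

    # Example: two points and reference point (4, 4).
    F = np.array([[1.0, 3.0], [2.0, 1.0]])
    print(hypervolume_2d(F, np.array([4.0, 4.0])))   # -> 7.0

In SMS-EMOA's steady-state loop, a single offspring is produced per generation and, within the worst-ranked front, the individual with the smallest hypervolume contribution is discarded.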
Decomposition-based MOEA

Decomposition-based MOEAs decompose the problem into several scalar subproblems, each of which targets a different part of the Pareto front.

For each subproblem, a different weighting of a scalarization method is used.

The subproblems are solved simultaneously by dynamically assigning and re-assigning points to subproblems and exchanging information between solutions of neighbouring subproblems.

Examples:
- MOEA/D
- NSGA-III
MOEA/D
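A minimal sketch of the decomposition machinery behind MOEA/D (illustrative only; the weight count and neighbourhood size are assumptions, not values taken from the slides):

    import numpy as np

    def tchebycheff(f, w, z_star):
        """Chebyshev subproblem value g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|."""
        return float(np.max(w * np.abs(f - z_star)))

    # Uniformly spread weight vectors for a 2-objective problem: one scalar subproblem each.
    n_sub = 100
    t = np.linspace(0.0, 1.0, n_sub)
    weights = np.stack([t, 1.0 - t], axis=1)

    # For each subproblem, the indices of its T closest neighbours in weight space.
    T = 10
    dist = np.linalg.norm(weights[:, None, :] - weights[None, :, :], axis=-1)
    neighbours = np.argsort(dist, axis=1)[:, :T]

    # Example: value of subproblem 30 for an objective vector f and ideal point z*.
    f, z_star = np.array([0.4, 0.7]), np.array([0.0, 0.0])
    print(tchebycheff(f, weights[30], z_star))

In each iteration a subproblem mates with solutions drawn from its neighbourhood, the ideal point z* is updated with the offspring's objective values, and any neighbouring subproblem whose Chebyshev value would improve adopts the offspring; this is how information is exchanged between neighbouring subproblems.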
Examples

Algorithms:
1. NSGA-II
2. MOEA/D

MOO Problems:
1. ZDT-1
2. MW3
3. BNH
Parameters used by NSGA-II

01 Sampling: Random Sampling
02 Selection: Binary tournament
03 Crossover: Simulated Binary Crossover with eta=15, prob=0.9
04 Mutation: Polynomial Mutation with the same parameter values as in the crossover
05 Survival: Rank and Crowding Survival

[Figure: resulting exponential distribution for simulated binary crossover between 0.2 and 0.8, for eta = 30 and eta = 1]
Parameters used by MOEA/D

01 Sampling: Random Sampling
02 Selection: None
03 Crossover: Simulated Binary Crossover with eta=20, prob=1.0
04 Mutation: Polynomial Mutation with the same parameter values as in the crossover
05 Survival: None
ZDT-1 PROBLEM

Objective of the ZDT suite of problems:

● Min f1(x)
● Min f2(x) = g(x) · h(f1(x), g(x))

[Figure: ZDT-1 Pareto front]
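For ZDT-1 specifically (the standard formulation, reproduced here because the slide shows it only as an image), with x in [0, 1]^n and typically n = 30:

f_1(x) = x_1, \qquad
g(x) = 1 + \frac{9}{n - 1} \sum_{i=2}^{n} x_i, \qquad
h(f_1, g) = 1 - \sqrt{f_1 / g},

so the Pareto front is obtained for g(x) = 1 and is the convex curve f_2 = 1 - \sqrt{f_1}.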


ZDT-1 using NSGA-II

[Figure: code screenshot and resulting front; a sketch of an equivalent setup follows]
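The slide shows the code as a screenshot; a minimal pymoo setup matching the NSGA-II parameters listed earlier might look like this (module paths assume pymoo roughly 0.6; the population size and number of generations are assumptions, and the experiments in the linked notebook may differ):

    from pymoo.algorithms.moo.nsga2 import NSGA2
    from pymoo.operators.sampling.rnd import FloatRandomSampling
    from pymoo.operators.crossover.sbx import SBX
    from pymoo.operators.mutation.pm import PM
    from pymoo.problems import get_problem
    from pymoo.optimize import minimize
    from pymoo.visualization.scatter import Scatter

    problem = get_problem("zdt1")

    algorithm = NSGA2(
        pop_size=100,
        sampling=FloatRandomSampling(),        # 01 Sampling: random
        crossover=SBX(eta=15, prob=0.9),       # 03 Crossover: simulated binary crossover
        mutation=PM(eta=15),                   # 04 Mutation: polynomial mutation
    )                                          # binary tournament selection and rank-and-crowding
                                               # survival are NSGA2's built-in defaults

    res = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)

    # Plot the obtained solutions against the analytical Pareto front.
    Scatter().add(problem.pareto_front(), plot_type="line", color="black").add(res.F, color="red").show()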
ZDT-1 using MOEA/D

[Figure: code screenshot and resulting front; a sketch of an equivalent setup follows]
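Similarly, a minimal pymoo MOEA/D setup for ZDT-1 (again a sketch: the number of reference directions, the neighbourhood size and the mating probability are illustrative assumptions, not values taken from the slides):

    from pymoo.algorithms.moo.moead import MOEAD
    from pymoo.util.ref_dirs import get_reference_directions
    from pymoo.operators.crossover.sbx import SBX
    from pymoo.operators.mutation.pm import PM
    from pymoo.problems import get_problem
    from pymoo.optimize import minimize

    problem = get_problem("zdt1")

    # One weight vector (reference direction) per scalar subproblem.
    ref_dirs = get_reference_directions("uniform", 2, n_partitions=99)

    algorithm = MOEAD(
        ref_dirs,
        n_neighbors=15,                        # size of each subproblem's neighbourhood
        prob_neighbor_mating=0.7,              # probability of mating within the neighbourhood
        crossover=SBX(eta=20, prob=1.0),       # 03 Crossover, as listed on the parameter slide
        mutation=PM(eta=20),                   # 04 Mutation
    )

    res = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)
    print(len(res.F))                          # number of non-dominated solutions found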
MW3 PROBLEM

[Figure: problem definition and Pareto front of MW3]

MW3 using NSGA-II

[Figure: code screenshot and resulting front]

MW3 using MOEA/D

[Figure: code screenshot and resulting front]
BNH PROBLEM

[Figure: BNH Pareto front]
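For reference, the standard BNH (Binh and Korn) formulation, which the slide shows only graphically, is the constrained two-objective problem

\min \; f_1(x) = 4x_1^2 + 4x_2^2, \qquad
\min \; f_2(x) = (x_1 - 5)^2 + (x_2 - 5)^2,

subject to

(x_1 - 5)^2 + x_2^2 \le 25, \qquad
(x_1 - 8)^2 + (x_2 + 3)^2 \ge 7.7, \qquad
0 \le x_1 \le 5, \;\; 0 \le x_2 \le 3.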
BNH using NSGA-II

[Figure: code screenshot and resulting front]

BNH using MOEA/D

[Figure: code screenshot and resulting front]
References
Research Papers/Journals
1. Emmerich, M.T.M., Deutz, A.H. “A tutorial on multiobjective optimization: fundamentals and evolutionary methods”. Natural Computing 17, 585–609 (2018). Link: https://doi.org/10.1007/s11047-018-9685-y
2. Slowik, A., Kwasnicka, H. “Evolutionary algorithms and their applications to engineering problems”. Neural Computing and Applications 32, 12363–12379 (2020). Link: https://doi.org/10.1007/s00521-020-04832-8
3. Eckart Zitzler, Kalyanmoy Deb, and Lothar Thiele. “Comparison of Multiobjective Evolutionary Algorithms: Empirical Results”. Evolutionary Computation 8(2), 173–195 (2000). Link: https://www.mitpressjournals.org/doi/abs/10.1162/106365600568202
4. To Thanh Binh and Ulrich Korn. “MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems”. Proceedings of the Third International Conference on Genetic Algorithms (1997). Link: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.46.8661
5. Z. Ma and Y. Wang. “Evolutionary Constrained Multiobjective Optimization: Test Suite Construction and Performance Comparisons”. IEEE Transactions on Evolutionary Computation 23(6), 972–986, Dec. 2019. Link: https://ieeexplore.ieee.org/document/8632683
Web Resources
1. PyMOO: Multi-objective optimization in Python. Link: https://pymoo.org/
2. DEAP: A novel evolutionary computation framework for rapid prototyping and testing of ideas. Link: https://deap.readthedocs.io/en/master/

Code
The experiments and results used in these slides can be found in this Google Colaboratory notebook:
https://colab.research.google.com/drive/1P_14ZzDXtjaDLmi7KNmgSFMUdNDsRCF0?usp=sharing
