Multivariable Nonlinear Programming
Unconstrained Optimization Techniques
Unconstrained Optimization Techniques - Search Methods
Introduction
• This topic deals with the various methods of solving the unconstrained minimization problem:
Find $X = (x_1, x_2, \ldots, x_n)^T$ which minimizes $f(X)$.
• It is true that a practical design problem would rarely be unconstrained; still, a study
of this class of problems is important for the following reasons:
The constraints do not have significant influence in certain design problems.
Some of the powerful and robust methods of solving constrained minimization
problems require the use of unconstrained minimization techniques.
The unconstrained minimization methods can be used to solve certain complex
engineering analysis problems. For example, the displacement response (linear or
nonlinear) of any structure under any specified load condition can be found by
minimizing its potential energy.
Necessary and Sufficient Conditions
A point $X^*$ will be a relative minimum of $f(X)$ if the necessary conditions
$$\frac{\partial f}{\partial x_i}(X^*) = 0, \quad i = 1, 2, \ldots, n$$
are satisfied. The point $X^*$ is guaranteed to be a relative minimum if the Hessian matrix is positive definite, that is,
$$H(X^*) = \left[\frac{\partial^2 f}{\partial x_i \, \partial x_j}\right]_{X = X^*} = \text{positive definite}.$$
If the function is not differentiable, these conditions cannot be applied to
identify the optimum point.
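As an illustration of these conditions, the short sketch below (an assumption of this write-up, not part of the original slides) evaluates the gradient and Hessian of the quadratic $f(x_1, x_2) = 4x_1^2 + 6x_2^2 - 8x_1x_2$ (the same function used later in Example 2 of the steepest descent method) and checks positive definiteness numerically.

```python
import numpy as np

# Sample quadratic: f(x1, x2) = 4*x1**2 + 6*x2**2 - 8*x1*x2
def grad(x):
    # Analytical gradient: [df/dx1, df/dx2]
    return np.array([8*x[0] - 8*x[1], 12*x[1] - 8*x[0]])

def hessian(x):
    # Analytical Hessian (constant for a quadratic)
    return np.array([[8.0, -8.0], [-8.0, 12.0]])

x_star = np.array([0.0, 0.0])            # candidate stationary point
print(grad(x_star))                      # necessary condition: gradient = 0
eigvals = np.linalg.eigvalsh(hessian(x_star))
print(np.all(eigvals > 0))               # sufficient condition: all eigenvalues > 0 (positive definite)
```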
Classical Optimization Techniques and Search Techniques
• When optimizing a multivariable function, the efficiency of the optimization technique
depends strongly on the nature of the objective function.
• If the objective function is convex, the classical (derivative-based) approach yields the
global optimal solution.
• However, it is often difficult to verify whether the function is convex.
• Checking convexity requires the first- and second-order derivatives and further
calculations on them (for example, testing the Hessian for positive semidefiniteness).
• When the function is very complex, the first- and second-order derivatives are
difficult to obtain.
• Moreover, if the function is discontinuous within the domain, obtaining the extremum
through derivatives is also difficult.
• Search methods are available that find the optimum solution without requiring
derivatives of the function.
Classification of Unconstrained Minimization Methods
There are two distinct types of algorithms:
• Direct search methods use only objective function values to locate
the minimum point, and
• Gradient-based methods use the first and/or the second-order
derivatives of the objective function to locate the minimum point.
Classification of unconstrained minimization methods
Direct search methods
• Random search method
• Grid search method
• Univariate method
• Pattern search methods
  – Powell’s method
  – Hooke-Jeeves method
• Rosenbrock’s method
• Simplex method

Indirect search (descent) methods
• Steepest descent (Cauchy) method
• Fletcher-Reeves method
• Newton’s method
• Marquardt method
• Quasi-Newton methods
  – Davidon-Fletcher-Powell method
  – Broyden-Fletcher-Goldfarb-Shanno method
Direct search methods
They require only the objective function values, not the partial
derivatives of the function, in finding the minimum and hence are
often called nongradient methods.
The direct search methods are also known as zeroth-order methods
since they use only zeroth-order information about the function
(its values, not its derivatives).
These methods are most suitable for simple problems involving a
relatively small number of variables.
These methods are, in general, less efficient than the descent
methods.
Direct Search methods - Univariate method
Univariate method - Procedure
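Since the original slide body for the procedure is not reproduced here, the following is a minimal sketch of the standard univariate (one-variable-at-a-time) search: each coordinate direction is searched in turn, with a one-dimensional minimization of the step length along that direction, and the cycle is repeated until the point stops changing. The helper names and the use of scipy's scalar minimizer are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def univariate_search(f, x0, n_cycles=10, tol=1e-6):
    """Cyclic coordinate (univariate) search: minimize along one axis at a time."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(n_cycles):
        x_old = x.copy()
        for i in range(n):
            e = np.zeros(n); e[i] = 1.0               # i-th coordinate direction
            # One-dimensional minimization of f(x + lam * e) over the step length lam
            res = minimize_scalar(lambda lam: f(x + lam * e))
            x = x + res.x * e
        if np.linalg.norm(x - x_old) < tol:           # stop when a full cycle barely changes x
            break
    return x

# Example: the quadratic used in the steepest-descent Example 2 below
f = lambda x: 4*x[0]**2 + 6*x[1]**2 - 8*x[0]*x[1]
print(univariate_search(f, [1.0, 1.0]))               # approaches the minimum at (0, 0)
```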
Direct Search methods - Univariate method – Example 1
Descent methods
The descent techniques require, in addition to the function values,
the first and in some cases the second derivatives of the objective
function.
Since more information about the function being minimized is used
(through the use of derivatives), descent methods are generally more
efficient than direct search techniques.
The descent methods are also known as gradient methods.
Among the gradient methods, those requiring only first derivatives of
the function are called first-order methods; those requiring both first
and second derivatives of the function are termed second-order
methods.
General approach
• All unconstrained minimization methods are iterative in nature and hence they
start from an initial trial solution and proceed toward the minimum point in a
sequential manner.
• $X_{i+1} = X_i + \lambda^* S_i$   (1)
• where $X_i$ is the starting point, $S_i$ is the search direction, $\lambda^*$ is the optimal
step length, and $X_{i+1}$ is the final point in iteration $i$.
• It is important to note that all the unconstrained minimization methods
i. require an initial point
ii. differ from one another only in the method of generating
the new point and in testing the new point for optimality
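The sketch below (an illustration assumed here, not taken from the slides) implements this general scheme of equation (1): the search direction generator is passed in as a function, and the optimal step length $\lambda^*$ is found by a one-dimensional minimization along that direction.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def iterate(f, x0, direction_fn, n_iter=20, tol=1e-6):
    """Generic scheme X_{i+1} = X_i + lambda* S_i for any choice of search direction S_i."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        s = direction_fn(x)                               # search direction S_i
        lam = minimize_scalar(lambda t: f(x + t * s)).x   # optimal step length lambda*
        x_new = x + lam * s
        if np.linalg.norm(x_new - x) < tol:               # test the new point for convergence
            return x_new
        x = x_new
    return x
```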
Steepest Descent Method
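Since the slide body is not reproduced here, a brief summary of the standard (Cauchy) steepest descent method: the search direction is the negative gradient, $S_i = -\nabla f(X_i)$, the step length $\lambda^*$ is found by minimizing $f(X_i + \lambda S_i)$ along that direction, and the iterations stop when the gradient becomes sufficiently small. In terms of the generic loop sketched above, this corresponds to choosing direction_fn = lambda x: -grad(x).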
Steepest Descent Method : Example-1
Steepest Descent Method : Example-2
Perform at most three iterations of the steepest descent method for the following function:
$f(x_1, x_2) = 4x_1^2 + 6x_2^2 - 8x_1x_2$ with $X_1 = (1, 1)^T$.
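As a check (not part of the original slides), the first iteration works out as follows: $\nabla f = (8x_1 - 8x_2,\; 12x_2 - 8x_1)^T$, so $\nabla f(X_1) = (0, 4)^T$ and $S_1 = -(0, 4)^T$. Minimizing $f(1,\, 1 - 4\lambda)$ over $\lambda$ gives $\lambda^* = 1/12$, hence $X_2 = (1, 2/3)^T$; the true minimum of this quadratic is at $(0, 0)$.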
Fletcher-Reeves Method (Conjugate Gradient Method)
Fletcher-Reeves Method
Algorithm
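Since the algorithm slide body is not reproduced here, the following is a minimal sketch of the standard Fletcher-Reeves conjugate gradient iteration: start with $S_1 = -\nabla f(X_1)$, perform a line search along $S_i$, then update the direction as $S_{i+1} = -\nabla f(X_{i+1}) + \beta_i S_i$ with $\beta_i = \|\nabla f(X_{i+1})\|^2 / \|\nabla f(X_i)\|^2$. The function names and the use of scipy's scalar minimizer are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def fletcher_reeves(f, grad, x0, n_iter=50, tol=1e-6):
    """Fletcher-Reeves conjugate gradient method with an exact line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    s = -g                                                # initial direction: steepest descent
    for _ in range(n_iter):
        lam = minimize_scalar(lambda t: f(x + t * s)).x   # optimal step length
        x = x + lam * s
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:                   # stop when the gradient is nearly zero
            break
        beta = g_new @ g_new / (g @ g)                    # Fletcher-Reeves beta
        s = -g_new + beta * s                             # new conjugate direction
        g = g_new
    return x

# Usage on the quadratic used earlier: converges to (0, 0)
f = lambda x: 4*x[0]**2 + 6*x[1]**2 - 8*x[0]*x[1]
g = lambda x: np.array([8*x[0] - 8*x[1], 12*x[1] - 8*x[0]])
print(fletcher_reeves(f, g, [1.0, 1.0]))
```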
Fletcher-Reeves Method : Example-1
Fletcher-Reeves Method : Example-2
Minimize $f(x_1, x_2) = x_1^2 - x_1x_2 + x_1 + 3x_2 - 1$ with $X_1 = (1, 2)^T$.
Answer: (3, 7)
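As a quick check of the stated answer (not part of the original slides), the first-order conditions give $\partial f/\partial x_1 = 2x_1 - x_2 + 1 = 0$ and $\partial f/\partial x_2 = -x_1 + 3 = 0$, whose solution is $x_1 = 3$, $x_2 = 7$, i.e., the stationary point $(3, 7)$. Note that the Hessian of this function is indefinite, so $(3, 7)$ is a stationary point rather than a strict minimum.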