Global Optimization Toolbox
User's Guide
R2020a
How to Contact MathWorks
Phone: 508-647-7000
Getting Started
Write Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-6
Consult Optimization Toolbox Documentation . . . . . . . . . . . . . . . . . . 2-6
Set Bounds . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-6
Ensure ga Options Maintain Feasibility . . . . . . . . . . . . . . . . . . . . . . . 2-6
Gradients and Hessians . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-7
Vectorized Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-7
Visualize the Basins of Attraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-27
Speedup with Parallel Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-86
Search and Poll . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-38
Using a Search Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-38
Search Using a Different Solver . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4-40
Displaying Plots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-6
Resume ga . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-59
Resuming ga From the Final Population . . . . . . . . . . . . . . . . . . . . . . . . . 5-59
Resuming ga From a Previous Run . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-62
Additional Output Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5-65
Particle Swarm Optimization Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-9
Algorithm Outline . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-9
Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-9
Iteration Steps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-10
Stopping Criteria . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6-11
Surrogate Optimization
7
What Is Surrogate Optimization? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-2
Solve Nonlinear Problem with Integer and Nonlinear Constraints . . . . . 7-82
Multiobjective Optimization
9
What Is Multiobjective Optimization? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-2
Initialize Search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-13
Create Archive and Incumbents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-13
Poll to Find Better Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-13
Update archive and iterates Structures . . . . . . . . . . . . . . . . . . . . . . . . . . 9-14
Stopping Conditions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-14
Returned Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-15
Modifications for Parallel Computation and Vectorized Function Evaluation . . . . . . . 9-15
Run paretosearch Quickly . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9-15
Parallel Processing
10
How Solvers Compute in Parallel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10-2
Parallel Processing Types in Global Optimization Toolbox . . . . . . . . . . . . 10-2
How Toolbox Functions Distribute Processes . . . . . . . . . . . . . . . . . . . . . 10-3
Options Reference
11
GlobalSearch and MultiStart Properties (Options) . . . . . . . . . . . . . . . . . 11-2
How to Set Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-2
Properties of Both Objects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-2
GlobalSearch Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-5
MultiStart Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-6
Command-Line Display . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-53
Output Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-54
Plot Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-56
Parallel Computing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-56
Checkpoint File . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-57
Functions
12
Getting Started
Global Optimization Toolbox provides functions that search for global solutions to problems that
contain multiple maxima or minima. Toolbox solvers include surrogate, pattern search, genetic
algorithm, particle swarm, simulated annealing, multistart, and global search. You can use these
solvers for optimization problems where the objective or constraint function is continuous,
discontinuous, stochastic, does not possess derivatives, or includes simulations or black-box
functions. For problems with multiple objectives, you can identify a Pareto front using genetic
algorithm or pattern search solvers.
You can improve solver effectiveness by adjusting options and, for applicable solvers, customizing
creation, update, and search functions. You can use custom data types with the genetic algorithm and
simulated annealing solvers to represent problems not easily expressed with standard data types. The
hybrid function option lets you improve a solution by applying a second solver after the first.
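For example, a minimal sketch of the hybrid function option (the objective and option values here are illustrative; rastriginsfcn ships with the toolbox):

```matlab
% Sketch: after ga finishes, fminunc starts from the best ga point and
% refines it. HybridFcn is a standard ga option.
opts = optimoptions('ga','HybridFcn',@fminunc);
[x,fval] = ga(@rastriginsfcn,2,[],[],[],[],[],[],[],opts);
```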
Key Features
• Surrogate solver for problems with lengthy objective function execution times and bound
constraints
• Pattern search solvers for single and multiple objective problems with linear, nonlinear, and bound
constraints
• Genetic algorithm for problems with linear, nonlinear, bound, and integer constraints
• Multiobjective genetic algorithm for problems with linear, nonlinear, and bound constraints
• Particle swarm solver for bound constraints
• Simulated annealing solver for bound constraints
• Multistart and global search solvers for smooth problems with linear, nonlinear, and bound
constraints
Comparison of Six Solvers
Function to Optimize
This example shows how to minimize Rastrigin’s function with six solvers. Each solver has its own
characteristics. The characteristics lead to different solutions and run times. The results, examined in
“Compare Syntax and Solutions” on page 1-9, can help you choose an appropriate solver for your
own problems.
Rastrigin’s function has many local minima, with a global minimum at (0,0):

Ras(x) = 20 + x1^2 + x2^2 − 10(cos 2πx1 + cos 2πx2)
Usually you don't know the location of the global minimum of your objective function. To show how
the solvers look for a global solution, this example starts all the solvers around the point [20,30],
which is far from the global minimum.
The rastriginsfcn.m file implements Rastrigin’s function. This file comes with Global Optimization
Toolbox software. This example employs a scaled version of Rastrigin’s function with larger basins of
attraction. For information, see “Basins of Attraction” on page 1-19.
rf2 = @(x)rastriginsfcn(x/10);
1 Introducing Global Optimization Toolbox Functions
This example minimizes rf2 using the default settings of fminunc (an Optimization Toolbox™
solver), patternsearch, and GlobalSearch. The example also uses ga and particleswarm with
nondefault options to start with an initial population around the point [20,30]. Because
surrogateopt requires finite bounds, the example uses surrogateopt with lower bounds of -70
and upper bounds of 130 in each variable.
fminunc
To solve the optimization problem using the fminunc Optimization Toolbox solver, enter:
rf2 = @(x)rastriginsfcn(x/10); % objective
x0 = [20,30]; % start point away from the minimum
[xf,ff,flf,of] = fminunc(rf2,x0)
fminunc returns
xf =
19.8991 29.8486
ff =
12.9344
flf =
1
of =
iterations: 3
funcCount: 15
stepsize: 1.7776e-06
lssteplength: 1
firstorderopt: 5.9907e-09
algorithm: 'quasi-newton'
message: 'Local minimum found.…'
patternsearch
To solve the optimization problem using the patternsearch Global Optimization Toolbox solver,
enter:
rf2 = @(x)rastriginsfcn(x/10); % objective
x0 = [20,30]; % start point away from the minimum
[xp,fp,flp,op] = patternsearch(rf2,x0)
patternsearch returns
Optimization terminated: mesh size less than options.MeshTolerance.
xp =
19.8991 -9.9496
fp =
4.9748
flp =
1
op =
function: @(x)rastriginsfcn(x/10)
problemtype: 'unconstrained'
pollmethod: 'gpspositivebasis2n'
maxconstraint: []
searchmethod: []
iterations: 48
funccount: 174
meshsize: 9.5367e-07
rngstate: [1x1 struct]
message: 'Optimization terminated: mesh size less than options.MeshTolerance.'
ga
To solve the optimization problem using the ga Global Optimization Toolbox solver, enter:
initpop is a 20-by-2 matrix. Each row of initpop has mean [20,30], and each element is normally
distributed with standard deviation 10. The rows of initpop form an initial population matrix for the
ga solver.
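The commands that build initpop and call ga are not shown in this excerpt. A sketch consistent with the description above (the rng seed is an assumption added for reproducibility):

```matlab
rng default % for reproducibility (assumed seed)
rf2 = @(x)rastriginsfcn(x/10);                     % objective
initpop = 10*randn(20,2) + repmat([20 30],20,1);   % 20-by-2, mean [20,30], std 10
opts = optimoptions('ga','InitialPopulationMatrix',initpop);
[xga,fga,flga,oga] = ga(rf2,2,[],[],[],[],[],[],[],opts)
```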
ga uses random numbers, and produces a random result. In this case ga returns:
xga =
-0.0042 -0.0024
fga =
4.7054e-05
flga =
oga =
problemtype: 'unconstrained'
rngstate: [1×1 struct]
generations: 200
funccount: 9453
particleswarm
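The call that produces the output below is not shown in this excerpt. A sketch that starts the swarm near [20,30], mirroring the ga setup (the rng seed is an assumption):

```matlab
rng default % for reproducibility (assumed seed)
rf2 = @(x)rastriginsfcn(x/10);                     % objective
initpop = 10*randn(20,2) + repmat([20 30],20,1);   % initial swarm near [20,30]
opts = optimoptions('particleswarm','InitialSwarmMatrix',initpop);
[xpso,fpso,flgpso,opso] = particleswarm(rf2,2,[],[],opts)
```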
xpso =
9.9496 0.0000
fpso =
0.9950
flgpso =
opso =
surrogateopt
surrogateopt does not require a start point, but does require finite bounds. Set bounds of –70 to
130 in each component. To have the same sort of output as the other solvers, disable the default plot
function.
rng default % for reproducibility
lb = [-70,-70];
ub = [130,130];
rf2 = @(x)rastriginsfcn(x/10); % objective
opts = optimoptions('surrogateopt','PlotFcn',[]);
[xsur,fsur,flgsur,osur] = surrogateopt(rf2,lb,ub,opts)
xsur =
-0.0033 0.0005
fsur =
2.2456e-05
flgsur =
osur =
elapsedtime: 2.3877
funccount: 200
rngstate: [1×1 struct]
message: 'Surrogateopt stopped because it exceeded the function evaluation limit set by ↵
GlobalSearch
problem is an optimization problem structure. problem specifies the fmincon solver, the rf2
objective function, and x0=[20,30]. For more information on using createOptimProblem, see
“Create Problem Structure” on page 3-4.
Note You must specify fmincon as the solver for GlobalSearch, even for unconstrained problems.
gs is a default GlobalSearch object. The object contains options for solving the problem. Calling
run(gs,problem) runs problem from multiple start points. The start points are random, so the
following result is also random.
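The commands that build problem and gs are not shown in this excerpt. A sketch consistent with the description (the rng seed is an assumption):

```matlab
rng default % for reproducibility (assumed seed)
rf2 = @(x)rastriginsfcn(x/10); % objective
problem = createOptimProblem('fmincon','objective',rf2,'x0',[20,30]);
gs = GlobalSearch;
[xg,fg,flg,og] = run(gs,problem)
```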
All 10 local solver runs converged with a positive local solver exit flag.
xg =
1.0e-07 *
-0.1405 -0.1405
fg =
flg =
og =
funcCount: 2350
localSolverTotal: 10
localSolverSuccess: 10
localSolverIncomplete: 0
localSolverNoSolution: 0
message: 'GlobalSearch stopped because it analyzed all the trial points.↵↵All 10 local solver runs converged with a po
Compare Syntax and Solutions
• fminunc quickly reaches the local solution within its starting basin, but does not explore outside
this basin at all. fminunc has a simple calling syntax.
• patternsearch takes more function evaluations than fminunc, and searches through several
basins, arriving at a better solution than fminunc. The patternsearch calling syntax is the
same as that of fminunc.
• ga takes many more function evaluations than patternsearch. By chance it arrived at a better
solution. In this case, ga found a point near the global optimum. ga is stochastic, so its results
change with every run. ga has a simple calling syntax, but there are extra steps to have an initial
population near [20,30].
• particleswarm takes fewer function evaluations than ga, but more than patternsearch. In
this case, particleswarm found a point with lower objective function value than
patternsearch, but higher than ga. Because particleswarm is stochastic, its results change
with every run. particleswarm has a simple calling syntax, but there are extra steps to have an
initial population near [20,30].
• surrogateopt stops when it reaches a function evaluation limit, which by default is 200 for a
two-variable problem. surrogateopt has a simple calling syntax, but requires finite bounds.
surrogateopt attempts to find a global solution, and in this case succeeded. Each function
evaluation in surrogateopt takes a longer time than in most other solvers, because
surrogateopt performs many auxiliary computations as part of its algorithm.
• A GlobalSearch run takes the same order of magnitude of function evaluations as ga and
particleswarm, searches many basins, and arrives at a good solution. In this case,
GlobalSearch found the global optimum. Setting up GlobalSearch is more involved than
setting up the other solvers. As the example shows, before calling GlobalSearch, you must
create both a GlobalSearch object (gs in the example), and a problem structure (problem).
Then, you call the run method with gs and problem. For more details on how to run
GlobalSearch, see “Workflow for GlobalSearch and MultiStart” on page 3-3.
See Also
More About
• “Optimization Problem Setup”
• “Solver Behavior with a Nonsmooth Problem” on page 1-11
Solver Behavior with a Nonsmooth Problem
In general, the solver decision tables provide guidance on which solver is likely to work best for your
problem. For smooth problems, see “Optimization Decision Table” (Optimization Toolbox). For
nonsmooth problems, see “Table for Choosing a Solver” on page 1-23 first, and for more information
consult “Global Optimization Toolbox Solver Characteristics” on page 1-24.
The function f(x) = ‖x‖^(1/2) is nonsmooth at the point 0, which is the minimizing point. Here is a 2-D
plot using the matrix norm for the 4-D point [x(1) x(2); 0 0].
figure
x = linspace(-5,5,51);
[xx,yy] = meshgrid(x);
zz = zeros(size(xx));
for ii = 1:length(x)
for jj = 1:length(x)
zz(ii,jj) = sqrt(norm([xx(ii,jj),yy(ii,jj);0,0]));
end
end
surf(xx,yy,zz)
xlabel('x(1)')
ylabel('x(2)')
title('Norm([x(1),x(2);0,0])^{1/2}')
This example uses the matrix norm for a 2-by-6 matrix x. The matrix norm relates to the singular value
decomposition, which is not as smooth as the Euclidean norm. See “2-Norm of Matrix” (MATLAB).
Use patternsearch
patternsearch is the recommended first solver to try for nonsmooth problems. See “Table for
Choosing a Solver” on page 1-23. Start patternsearch from a nonzero 2-by-6 matrix x0, and
attempt to locate the minimum of f . For this attempt, and all others, use the default solver options.
Return the solution, which should be near zero, the objective function value, which should likewise be
near zero, and the number of function evaluations taken.
fun = @(x)norm([x(1:6);x(7:12)])^(1/2);
x0 = [1:6;7:12];
rng default
x0 = x0 + rand(size(x0))
x0 = 2×6
[xps,fvalps,eflagps,outputps] = patternsearch(fun,x0);
xps,fvalps,eflagps,outputps.funccount
xps = 2×6
10^-4 ×
fvalps = 0.0073
eflagps = 1
ans = 10780
patternsearch reaches a good solution, as evinced by exit flag 1. However, it takes over 10,000
function evaluations to converge.
Use fminsearch
The documentation states that fminsearch sometimes can handle discontinuities, so this is a
reasonable option.
[xfms,fvalfms,eflagfms,outputfms] = fminsearch(fun,x0);
xfms,fvalfms,eflagfms,outputfms.funcCount
xfms = 2×6
fvalfms = 3.1971
eflagfms = 0
ans = 2401
Using default options, fminsearch runs out of function evaluations before it converges to a solution.
Exit flag 0 indicates this lack of convergence. The reported solution is poor.
Use particleswarm
particleswarm is recommended as the next solver to try. See “Choosing Between Solvers for
Nonsmooth Problems” on page 1-26.
[xpsw,fvalpsw,eflagpsw,outputpsw] = particleswarm(fun,12);
xpsw,fvalpsw,eflagpsw,outputpsw.funccount
xpsw = 1×12
10^-12 ×
-0.0386 -0.1282 -0.0560 0.0904 0.0771 -0.0541 -0.1189 0.1290 -0.0032 0.0
fvalpsw = 4.5222e-07
eflagpsw = 1
ans = 37200
particleswarm finds an even more accurate solution than patternsearch, but takes over 35,000
function evaluations. Exit flag 1 indicates that the solution is good.
Use ga
ga is a popular solver, but is not recommended as the first solver to try. See how well it works on this
problem.
[xga,fvalga,eflagga,outputga] = ga(fun,12);
Optimization terminated: average change in the fitness value less than options.FunctionTolerance.
xga,fvalga,eflagga,outputga.funccount
xga = 1×12
-0.0061 -0.0904 0.0816 -0.0484 0.0799 -0.1925 0.0048 0.3581 0.0848 0.0
fvalga = 0.6257
eflagga = 1
ans = 65190
ga does not find as good a solution as patternsearch or particleswarm, and takes about twice as
many function evaluations as particleswarm. Exit flag 1 is misleading in this case.
Use fminunc
fminunc is not recommended for nonsmooth functions. See how it performs on this one.
[xfmu,fvalfmu,eflagfmu,outputfmu] = fminunc(fun,x0);
fminunc stopped because the size of the current step is less than
the value of the step size tolerance.
xfmu,fvalfmu,eflagfmu,outputfmu.funcCount
xfmu = 2×6
fvalfmu = 1.1269
eflagfmu = 2
ans = 442
The fminunc solution is not as good as the ga solution. However, fminunc reaches the rather poor
solution in relatively few function evaluations. Exit flag 2 means you should take care; the first-order
optimality conditions are not met at the reported solution.
Use fmincon
fmincon can sometimes minimize nonsmooth functions. See how it performs on this one.
[xfmc,fvalfmc,eflagfmc,outputfmc] = fmincon(fun,x0);
fmincon stopped because the size of the current step is less than
the value of the step size tolerance and constraints are
satisfied to within the value of the constraint tolerance.
xfmc,fvalfmc,eflagfmc,outputfmc.funcCount
xfmc = 2×6
10^-10 ×
fvalfmc = 1.3880e-05
eflagfmc = 2
ans = 1066
fmincon with default options produces an accurate solution after fewer than 1000 function
evaluations. Exit flag 2 does not mean that the solution is inaccurate, but that the first-order
optimality conditions are not met. This is because the gradient of the objective function is not zero at
the solution.
Summary of Results
Choosing the appropriate solver leads to better, faster results. This summary shows how disparate
the results can be. The solution quality is 'Poor' if the objective function value is greater than 0.1,
'Good' if the value is smaller than 0.01, and 'Mediocre' otherwise.
Solver = {'patternsearch';'fminsearch';'particleswarm';'ga';'fminunc';'fmincon'};
SolutionQuality = {'Good';'Poor';'Good';'Poor';'Poor';'Good'};
FVal = [fvalps,fvalfms,fvalpsw,fvalga,fvalfmu,fvalfmc]';
NumEval = [outputps.funccount,outputfms.funcCount,outputpsw.funccount,...
outputga.funccount,outputfmu.funcCount,outputfmc.funcCount]';
results = table(Solver,SolutionQuality,FVal,NumEval)
results=6×4 table
         Solver          SolutionQuality       FVal       NumEval
    _________________    _______________    __________    _______
    {'patternsearch'}       {'Good'}            0.0073     10780
    {'fminsearch'   }       {'Poor'}            3.1971      2401
    {'particleswarm'}       {'Good'}        4.5222e-07     37200
    {'ga'           }       {'Poor'}            0.6257     65190
    {'fminunc'      }       {'Poor'}            1.1269       442
    {'fmincon'      }       {'Good'}         1.388e-05      1066
figure
hold on
for ii = 1:length(FVal)
clr = rand(1,3);
plot(NumEval(ii),FVal(ii),'o','MarkerSize',10,'MarkerEdgeColor',clr,'MarkerFaceColor',clr)
text(NumEval(ii),FVal(ii)+0.2,Solver{ii},'Color',clr);
end
ylabel('FVal')
xlabel('NumEval')
title('Reported Minimum and Evaluations By Solver')
hold off
While particleswarm achieves the lowest objective function value, it does so by taking over three
times as many function evaluations as patternsearch, and over 30 times as many as fmincon.
fmincon is not generally recommended for nonsmooth problems. It is effective in this case, but this
case has just one nonsmooth point.
See Also
More About
• “Comparison of Six Solvers” on page 1-3
• “Table for Choosing a Solver” on page 1-23
• “Global Optimization Toolbox Solver Characteristics” on page 1-24
What Is Global Optimization?
In this section...
“Local vs. Global Optima” on page 1-18
“Basins of Attraction” on page 1-19
Local vs. Global Optima
• A local minimum of a function is a point where the function value is smaller than or equal to the
value at nearby points, but possibly greater than at a distant point.
• A global minimum is a point where the function value is smaller than or equal to the value at all
other feasible points.
Generally, Optimization Toolbox solvers find a local optimum. (This local optimum can be a global
optimum.) They find the optimum in the basin of attraction of the starting point. For more
information, see “Basins of Attraction” on page 1-19.
In contrast, Global Optimization Toolbox solvers are designed to search through more than one basin
of attraction. They search in various ways:
• GlobalSearch and MultiStart generate a number of starting points. They then use a local
solver to find the optima in the basins of attraction of the starting points.
• ga uses a set of starting points (called the population) and iteratively generates better points from
the population. As long as the initial population covers several basins, ga can examine several
basins.
• particleswarm, like ga, uses a set of starting points. particleswarm can examine several
basins at once because of its diverse population.
• simulannealbnd performs a random search. Generally, simulannealbnd accepts a point if it is
better than the previous point. simulannealbnd occasionally accepts a worse point, in order to
reach a different basin.
• patternsearch looks at a number of neighboring points before accepting one of them. If some
neighboring points belong to different basins, patternsearch in essence looks in a number of
basins at once.
• surrogateopt begins by quasirandom sampling within bounds, looking for a small objective
function value. surrogateopt uses a merit function that, in part, gives preference to points that
are far from evaluated points, which is an attempt to reach a global solution. After it cannot
improve the current point, surrogateopt resets, causing it to sample widely within bounds
again. Resetting is another way surrogateopt searches for a global solution.
Basins of Attraction
If an objective function f(x) is smooth, the vector –∇f(x) points in the direction where f(x) decreases
most quickly. The equation of steepest descent, namely
dx(t)/dt = −∇f(x(t)),
yields a path x(t) that goes to a local minimum as t gets large. Generally, initial values x(0) that are
close to each other give steepest descent paths that tend to the same minimum point. The basin of
attraction for steepest descent is the set of initial values leading to the same local minimum.
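As an illustration (not from the guide), a plain steepest-descent iteration on a 1-D double-well function shows two basins; the function, step size, and iteration count are all assumptions:

```matlab
f  = @(x) x.^4 - 4*x.^2 + x;   % double-well objective (illustrative)
df = @(x) 4*x.^3 - 8*x + 1;    % its derivative
for x0 = [-2 2]                % one start point in each basin
    x = x0;
    for k = 1:1000             % Euler steps of dx/dt = -f'(x)
        x = x - 0.01*df(x);
    end
    fprintf('start %g converges to the minimum near %g\n', x0, x)
end
```

Each start point descends to the local minimum of its own basin, not necessarily the global one.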
The following figure shows two one-dimensional minima. The figure shows different basins of
attraction with different line styles, and it shows directions of steepest descent with arrows. For this
and subsequent figures, black dots represent local minima. Every steepest descent path, starting at a
point x(0), goes to the black dot in the basin containing x(0).
The following figure shows how steepest descent paths can be more complicated in more dimensions.
The following figure shows even more complicated paths and basins of attraction.
Constraints can break up one basin of attraction into several pieces. For example, consider
minimizing y subject to:
• y ≥ |x|
• y ≥ 5 – 4(x–2)^2.
The figure shows the two basins of attraction with the final points.
The steepest descent paths are straight lines down to the constraint boundaries. From the constraint
boundaries, the steepest descent paths travel down along the boundaries. The final point is either
(0,0) or (11/4,11/4), depending on whether the initial x-value is above or below 2.
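A sketch reproducing the two constrained basins with fmincon (the start points are assumptions; the nonsmooth |x| constraint means fmincon is not guaranteed to behave well here):

```matlab
fun = @(x) x(2);  % minimize y
% c(x) <= 0 encodes y >= |x| and y >= 5 - 4*(x-2)^2
nonlcon = @(x) deal([abs(x(1)) - x(2); 5 - 4*(x(1)-2)^2 - x(2)], []);
opts = optimoptions('fmincon','Display','off');
xlow  = fmincon(fun,[1 10],[],[],[],[],[],[],nonlcon,opts)  % start below x = 2
xhigh = fmincon(fun,[3 10],[],[],[],[],[],[],nonlcon,opts)  % start above x = 2
```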
See Also
More About
• “Visualize the Basins of Attraction” on page 3-27
• “Comparison of Six Solvers” on page 1-3
Optimization Workflow
To solve an optimization problem:
1 Decide what type of problem you have, and whether you want a local or global solution (see
“Local vs. Global Optima” on page 1-18). Choose a solver per the recommendations in “Table for
Choosing a Solver” on page 1-23.
2 Write your objective function and, if applicable, constraint functions per the syntax in “Compute
Objective Functions” on page 2-2 and “Write Constraints” on page 2-6.
3 Set appropriate options using optimoptions, or prepare a GlobalSearch or MultiStart
problem as described in “Workflow for GlobalSearch and MultiStart” on page 3-3. For details,
see “Pattern Search Options” on page 11-7, “Particle Swarm Options” on page 11-46,
“Genetic Algorithm Options” on page 11-24, “Simulated Annealing Options” on page 11-59, or
“Surrogate Optimization Options” on page 11-52.
4 Run the solver.
5 Examine the result. For information on the result, see “Solver Outputs and Iterative Display”
(Optimization Toolbox) or Examine Results for GlobalSearch or MultiStart.
6 If the result is unsatisfactory, change options or start points or otherwise update your
optimization and rerun it. For information, see “Global Optimization Toolbox Solver
Characteristics” on page 1-24 or Improve Results. For information on improving solutions that
applies mainly to smooth problems, see “When the Solver Fails” (Optimization Toolbox), “When
the Solver Might Have Succeeded” (Optimization Toolbox), or “When the Solver Succeeds”
(Optimization Toolbox).
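The steps above, condensed into one sketch (the solver, objective, and option values are illustrative choices, not prescriptions):

```matlab
fun = @(x)rastriginsfcn(x/10);                           % step 2: objective
opts = optimoptions('patternsearch','Display','final');  % step 3: options
[x,fval,exitflag,output] = ...
    patternsearch(fun,[20,30],[],[],[],[],[],[],[],opts); % step 4: run
disp(output.message)                                      % step 5: examine
```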
See Also
More About
• “Optimization Problem Setup”
• “What Is Global Optimization?” on page 1-18
Table for Choosing a Solver
To start patternsearch at multiple points when you have finite bounds lb and ub on every
component, try:
x0 = lb + rand(size(lb)).*(ub - lb);
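For example, a sketch that draws several such start points and keeps the best patternsearch result (fun, lb, and ub are assumed to be defined already):

```matlab
rng default
fbest = Inf;
for k = 1:10
    x0 = lb + rand(size(lb)).*(ub - lb);  % random point within bounds
    [x,fval] = patternsearch(fun,x0,[],[],[],[],lb,ub);
    if fval < fbest
        fbest = fval;
        xbest = x;
    end
end
```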
Many other solvers provide different solution algorithms, including the genetic algorithm solver ga
and the particleswarm solver. Try some of them if the recommended solvers do not perform well on
your problem. For details, see “Global Optimization Toolbox Solver Characteristics” on page 1-24.
See Also
Related Examples
• “Solver Behavior with a Nonsmooth Problem”
More About
• “Optimization Workflow” on page 1-22
• “Global Optimization Toolbox Solver Characteristics” on page 1-24
Global Optimization Toolbox Solver Characteristics
Solver Choices
This section describes Global Optimization Toolbox solver characteristics. The section includes
recommendations for obtaining results more effectively.
To achieve better or faster solutions, first try tuning the recommended solvers on page 1-23 by
setting appropriate options or bounds. If the results are unsatisfactory, try other solvers.
f(x) = 100x^2(1 – x)^2 – x,
x2 = fminbnd(fun,0.9,1.1)
x2 =
1.0049
Term: Meaning
• Single local solution: Find one local solution, a point x where the objective function f(x) is a local minimum. For more details, see “Local vs. Global Optima” on page 1-18. In the example, both x1 and x2 are local solutions.
• Multiple local solutions: Find a set of local solutions. In the example, the complete set of local solutions is {x1,x2}.
• Single global solution: Find the point x where the objective function f(x) is a global minimum. In the example, the global solution is x2.
Choosing Between Solvers for Smooth Problems
1 Try GlobalSearch first. It is most focused on finding a global solution, and has an efficient local
solver, fmincon.
2 Try MultiStart next. It has efficient local solvers, and can search a wide variety of start points.
3 Try patternsearch next. It is less efficient, since it does not use gradients. However,
patternsearch is robust and is more efficient than the remaining local solvers. To search for a
global solution, start patternsearch from a variety of start points.
4 Try surrogateopt next. surrogateopt attempts to find a global solution using the fewest
objective function evaluations. surrogateopt has more overhead per function evaluation than
most other solvers. surrogateopt requires finite bounds, and accepts both integer constraints
and nonlinear inequality constraints.
5 Try particleswarm next, if your problem is unconstrained or has only bound constraints.
Usually, particleswarm is more efficient than the remaining solvers, and can be more efficient
than patternsearch.
6 Try ga next. It can handle all types of constraints, and is usually more efficient than
simulannealbnd.
7 Try simulannealbnd last. It can handle problems with no constraints or bound constraints.
simulannealbnd is usually the least efficient solver. However, given a slow enough cooling
schedule, it can find a global solution.
GlobalSearch and MultiStart both provide multiple local solutions. For the syntax to obtain
multiple solutions, see “Multiple Solutions” on page 3-20. GlobalSearch and MultiStart differ in
the following characteristics:
• MultiStart can find more local minima. This is because GlobalSearch rejects many generated
start points (initial points for the local solver). Essentially, GlobalSearch accepts a start point only
when it determines that the point has a good chance of obtaining a global minimum. In contrast,
MultiStart passes all generated start points to a local solver. For more information, see
“GlobalSearch Algorithm” on page 3-38.
• MultiStart offers a choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin.
The GlobalSearch solver uses only fmincon as its local solver.
• GlobalSearch uses a scatter-search algorithm for generating start points. In contrast,
MultiStart generates points uniformly at random within bounds, or allows you to provide your
own points.
• MultiStart can run in parallel. See “How to Use Parallel Processing in Global Optimization
Toolbox” on page 10-11.
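The differences above can be seen in a short sketch: MultiStart with fminunc as the local solver and user-supplied start points (the start-point recipe reuses the earlier Rastrigin example and is an assumption):

```matlab
rng default
problem = createOptimProblem('fminunc', ...
    'objective',@(x)rastriginsfcn(x/10),'x0',[20,30]);
ms = MultiStart;
pts = CustomStartPointSet(10*randn(20,2) + repmat([20 30],20,1));
[x,fval] = run(ms,problem,pts);
```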
Choosing Between Solvers for Nonsmooth Problems
1 Use fminbnd first on one-dimensional bounded problems only. fminbnd provably converges
quickly in one dimension.
2 Use patternsearch on any other type of problem. patternsearch provably converges, and
handles all types of constraints.
3 Try surrogateopt for problems that have time-consuming objective functions. surrogateopt
searches for a global solution. surrogateopt requires finite bounds, and accepts both integer
constraints and nonlinear inequality constraints.
4 Try fminsearch next for low-dimensional unbounded problems. fminsearch is not as general
as patternsearch and can fail to converge. For low-dimensional problems, fminsearch is
simple to use, since it has few tuning options.
5 Try particleswarm next on unbounded or bound-constrained problems. particleswarm has
little supporting theory, but is often an efficient algorithm.
6 Try ga next. ga has little supporting theory and is often less efficient than patternsearch or
particleswarm. ga handles all types of constraints. ga and surrogateopt are the only Global
Optimization Toolbox solvers that accept integer constraints.
7 Try simulannealbnd last for unbounded problems, or for problems with bounds.
simulannealbnd provably converges only for a logarithmic cooling schedule, which is
extremely slow. simulannealbnd takes only bound constraints, and is often less efficient than
ga.
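Steps 1-3 of this list can be sketched as follows, using assumed nonsmooth objective functions chosen only for illustration.

```matlab
% Step 1: fminbnd on a one-dimensional bounded problem.
x1 = fminbnd(@(t) abs(t - 2), 0, 5);

% Step 2: patternsearch on a general nonsmooth problem.
fun = @(x) abs(x(1)) + abs(x(2) - 1);   % nonsmooth at its minimum
x2 = patternsearch(fun, [3 3]);

% Step 3: surrogateopt, which requires finite bounds.
x3 = surrogateopt(fun, [-5 -5], [5 5]);
```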
Solver Characteristics

GlobalSearch
  Convergence: Fast convergence to local optima for smooth problems
  Characteristics:
  • Deterministic iterates
  • Gradient-based
  • Automatic stochastic start points
  • Removes many start points heuristically

MultiStart
  Convergence: Fast convergence to local optima for smooth problems
  Characteristics:
  • Deterministic iterates
  • Can run in parallel; see “How to Use Parallel Processing in Global Optimization Toolbox” on page 10-11
  • Gradient-based
  • Stochastic or deterministic start points, or combination of both
  • Automatic stochastic start points
  • Runs all start points
  • Choice of local solver: fmincon, fminunc, lsqcurvefit, or lsqnonlin

patternsearch
  Convergence: Proven convergence to local optimum; slower than gradient-based solvers
  Characteristics:
  • Deterministic iterates
  • Can run in parallel; see “How to Use Parallel Processing in Global Optimization Toolbox” on page 10-11
  • No gradients
1 Introducing Global Optimization Toolbox Functions
• Convergence — Solvers can fail to converge to any solution when started far from a local
minimum. When started near a local minimum, gradient-based solvers converge to a local
minimum quickly for smooth problems. patternsearch provably converges for a wide range of
problems, but the convergence is slower than gradient-based solvers. Both ga and
simulannealbnd can fail to converge in a reasonable amount of time for some problems,
although they are often effective.
• Iterates — Solvers iterate to find solutions. The steps in the iteration are iterates. Some solvers
have deterministic iterates. Others use random numbers and have stochastic iterates.
• Gradients — Some solvers use estimated or user-supplied derivatives in calculating the iterates.
Other solvers do not use or estimate derivatives, but use only objective and constraint function
values.
• Start points — Most solvers require you to provide a starting point for the optimization in order to
obtain the dimension of the decision variables. ga and surrogateopt do not require any starting
points, because they take the dimension of the decision variables as an input or infer dimensions
from bounds. These solvers generate a start point or population automatically, or they accept a
point or points that you supply.
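The start-point distinction above can be sketched as follows; the objective and bounds are assumed for illustration.

```matlab
fun = @(x) x(1)^2 + x(2)^2;
xp = patternsearch(fun, [2 3]);           % x0 supplies the problem dimension
xg = ga(fun, 2);                          % nvars given directly, no start point
xs = surrogateopt(fun, [-1 -1], [1 1]);   % dimension inferred from the bounds
```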
Compare the characteristics of Global Optimization Toolbox solvers to Optimization Toolbox solvers.
See Also
Related Examples
• “Solver Behavior with a Nonsmooth Problem”
More About
• “Optimization Workflow” on page 1-22
• “Table for Choosing a Solver” on page 1-23
The file that computes this function must accept a vector x of length 2, corresponding to the variables
x1 and x2, and return a scalar equal to the value of the function at x.
1 Select New > Script (Ctrl+N) from the MATLAB® File menu. A new file opens in the editor.
2 Enter the following two lines of code:
function z = my_fun(x)
z = x(1)^2 - 2*x(1)*x(2) + 6*x(1) + 4*x(2)^2 - 3*x(2);
3 Save the file in a folder on the MATLAB path.
4 Check that the file returns the correct value.
my_fun([2 3])
ans =
    31
For gamultiobj, suppose you have three objectives. Your objective function returns a three-element
vector consisting of the three objective function values:
function z = my_fun(x)
z = zeros(1,3); % allocate output
z(1) = x(1)^2 - 2*x(1)*x(2) + 6*x(1) + 4*x(2)^2 - 3*x(2);
z(2) = x(1)*x(2) + cos(3*x(2)/(2+x(1)));
z(3) = tanh(x(1) + x(2));
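A three-objective function file like the one above can be passed to gamultiobj as follows. This sketch assumes my_fun is saved on the MATLAB path; the bounds are chosen arbitrarily for illustration. gamultiobj returns points on the Pareto front rather than a single minimizer.

```matlab
nvars = 2;
lb = [-2 -2];  ub = [2 2];     % assumed bounds for illustration
[x, fval] = gamultiobj(@my_fun, nvars, [], [], [], [], lb, ub);
% Each row of x is a Pareto point; the matching row of fval holds its
% three objective values.
```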