
Optimization Algorithms in MATLAB

Maria G. Villarreal
ISE Department, The Ohio State University

February 03, 2011

Outline
Problem Description
Optimization Problems that can be solved in MATLAB
Optimization Toolbox solvers
Nonlinear Optimization
Multiobjective Optimization

Problem Description


Objective:
Determine the values of the controllable process variables (factors) that improve the output of a process (or system).

Facts:
We have a computer simulator (an input/output black box) that represents the output of a process (or system). The simulation program takes a very long time to run.

Procedure:
1. Evaluate a small set of input combinations (a DOE) in the computer code and obtain an output value for each one.
2. Construct a mathematical model relating inputs and outputs, which is easier and faster to evaluate than the actual computer code.
3. Use this model (the metamodel) and an optimization algorithm to obtain the values of the controllable variables (inputs/factors) that optimize a particular output (or outputs).

Optimization Problems that can be solved in MATLAB (Optimization Toolbox)

Constrained and unconstrained, continuous and discrete:
Linear
Quadratic
Binary Integer
Nonlinear
Multiobjective Problems

Optimization Toolbox solvers


Minimizers
This group of solvers attempts to find a local minimum of the objective function near a starting point x(0).

If you have a Global Optimization Toolbox license, use the GlobalSearch or MultiStart solvers. These solvers automatically generate random start points within bounds.

Optimization Toolbox solvers


Multiobjective minimizers
This group of solvers attempts to either minimize the maximum value of a set of functions (fminimax), or to find a location where a collection of functions is below some prespecified values (fgoalattain).

Nonlinear Optimization

Rastrigin's function

[Global Optimization Toolbox Guide]
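Rastrigin's function is a standard multimodal test case, which is why GlobalSearch or MultiStart is needed. The multistart idea can be sketched as follows (a Python/SciPy illustration of the concept, not MATLAB's MultiStart API; the loop and names are ours):

```python
import numpy as np
from scipy.optimize import minimize

def rastrigin(x):
    # Rastrigin's function: highly multimodal, global minimum f(0) = 0
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

# Multistart idea: run a local minimizer from many random start points
# within bounds and keep the best local minimum found.
rng = np.random.default_rng(0)
best = None
for _ in range(50):
    x0 = rng.uniform(-5.12, 5.12, size=2)
    res = minimize(rastrigin, x0, method="BFGS")
    if best is None or res.fun < best.fun:
        best = res
print(best.x, best.fun)
```

A single local run from a poor start point typically lands in one of the many local minima; the multistart loop makes finding the global basin far more likely.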

General Search Algorithm


Move from point x(k) to x(k+1) = x(k) + Δx(k), where Δx(k) = αk d(k), such that f(x(k+1)) < f(x(k)). Here d(k) is a desirable search direction and αk is called the step size. To find Δx(k) we need to solve two subproblems: one to find d(k) and one to find αk. These iterative procedures (techniques) are often called direction methods.

General Steps


1. Estimate a reasonable initial point x(0). Set k = 0 (iteration counter).
2. Compute a search direction d(k).
3. Check convergence.
4. Calculate the step size αk in the direction d(k).
5. Update x(k+1) = x(k) + αk d(k).
6. Go to step 2.
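The six steps above can be sketched directly in code. This is a minimal illustration (in Python rather than MATLAB) using the steepest-descent direction d(k) = -∇f and a simple step-halving line search for αk; the function names are ours:

```python
import numpy as np

def descent(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)        # step 1: initial point x(0), k = 0
    for _ in range(max_iter):
        g = grad(x)
        d = -g                             # step 2: search direction d(k)
        if np.linalg.norm(g) < tol:        # step 3: convergence check
            break
        alpha = 1.0                        # step 4: halve alpha until f decreases
        while f(x + alpha * d) >= f(x):
            alpha *= 0.5
        x = x + alpha * d                  # step 5: x(k+1) = x(k) + alpha_k d(k)
    return x                               # step 6: the loop repeats from step 2

# Quadratic example with minimum at (1, -2)
f = lambda x: (x[0] - 1)**2 + 2 * (x[1] + 2)**2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
print(descent(f, grad, [0.0, 0.0]))        # converges to [1, -2]
```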

Algorithms to Compute the Search Direction (d)


Steepest Descent Method (gradient method)
Conjugate Gradient Method
Newton's Method (uses second-order partial derivative information)
Quasi-Newton Methods (approximate the Hessian matrix and its inverse using first-order derivative information):
  DFP Method (approximates the inverse of the Hessian)
  BFGS Method (approximates the Hessian matrix)

Steepest Descent Method


Newton's Method


1. Estimate a starting point x(0). Set k = 0.
2. Calculate c(k), the gradient of f(x) at x(k).
3. Check convergence (if ||c(k)|| < ε, stop).
4. Calculate the Hessian matrix at x(k), H(k).
5. Calculate the search direction as d(k) = -[H(k)]^(-1) c(k). Use another algorithm to compute αk.
6. Update x(k+1) = x(k) + αk d(k).
7. Set k = k + 1 and go to step 2.

Drawbacks:
Needs to calculate second-order derivatives.
H needs to be positive definite to assure a descent direction.
H may be singular at some point.


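The seven steps can be sketched as follows (a Python illustration; for simplicity it takes the full step αk = 1 instead of calling a separate step-size algorithm, which is exact for quadratic functions):

```python
import numpy as np

def newton(grad, hess, x0, eps=1e-8, max_iter=50):
    x = np.asarray(x0, dtype=float)        # step 1: starting point x(0), k = 0
    for _ in range(max_iter):
        c = grad(x)                        # step 2: gradient c(k)
        if np.linalg.norm(c) < eps:        # step 3: convergence check
            break
        H = hess(x)                        # step 4: Hessian H(k)
        d = np.linalg.solve(H, -c)         # step 5: d(k) = -[H(k)]^(-1) c(k)
        x = x + d                          # steps 6-7: update with alpha_k = 1
    return x

# f(x, y) = x^2 + x*y + y^2 has its minimum at the origin
grad = lambda x: np.array([2 * x[0] + x[1], x[0] + 2 * x[1]])
hess = lambda x: np.array([[2.0, 1.0], [1.0, 2.0]])
print(newton(grad, hess, [3.0, -1.0]))     # one Newton step reaches (0, 0)
```

Note that solving H d = -c avoids explicitly forming the inverse of H, which is the usual practice.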

Quasi-Newton Method: BFGS Method (Used in MATLAB)

[Arora, J. (2004), Introduction to Optimum Design, 2nd Ed., p. 327]

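A minimal sketch of the BFGS idea (our own Python illustration, not the cited Arora pseudocode): maintain a Hessian approximation B, solve B d = -g for the direction, and update B from the step s and the gradient change y:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)                      # initial Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(B, -g)          # quasi-Newton direction
        alpha = 1.0                         # halve alpha until f decreases
        while f(x + alpha * d) >= f(x):
            alpha *= 0.5
        s = alpha * d                       # step s(k) = x(k+1) - x(k)
        x = x + s
        g_new = grad(x)
        y = g_new - g                       # gradient change y(k)
        if y @ s > 1e-12:                   # curvature condition keeps B positive definite
            B = (B + np.outer(y, y) / (y @ s)
                   - (B @ np.outer(s, s) @ B) / (s @ B @ s))
        g = g_new
    return x

f = lambda x: (x[0] - 2)**2 + 10 * (x[1] + 1)**2
grad = lambda x: np.array([2 * (x[0] - 2), 20 * (x[1] + 1)])
print(bfgs(f, grad, [0.0, 0.0]))            # converges to [2, -1]
```

Only first-order derivatives are needed, which is the main advantage over Newton's method.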

Methods to Calculate the Step Size (assuming d is known)

Equal interval search
Golden search
Polynomial interpolation
Inaccurate line search

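Of these, the golden search is simple to sketch. Assuming φ(α) = f(x(k) + α d(k)) is unimodal on the bracket, the interval shrinks by the golden ratio each iteration (a Python illustration; the names are ours):

```python
import math

def golden_section(phi, a, b, tol=1e-6):
    # Shrink [a, b] by the golden ratio each iteration,
    # keeping the interior point with the smaller phi value.
    r = (math.sqrt(5) - 1) / 2              # ~0.618
    c = b - r * (b - a)
    d = a + r * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):
            b, d = d, c
            c = b - r * (b - a)
        else:
            a, c = c, d
            d = a + r * (b - a)
    return (a + b) / 2

# Step size along d = -f'(x) for f(x) = x^2 from x = 3: phi(alpha) = (3 - 6*alpha)^2
print(golden_section(lambda t: (3 - 6 * t)**2, 0.0, 1.0))   # ~0.5, the exact step
```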

Optimization Toolbox for Nonlinear Optimization
Solvers:
fmincon (constrained nonlinear minimization). Algorithms:
  Trust-region-reflective (default): allows only bounds or linear equality constraints, but not both.
  Active-set: solves the Karush-Kuhn-Tucker (KKT) equations and uses a quasi-Newton method to approximate the Hessian matrix.
  Interior-point
  Sequential Quadratic Programming (SQP)

fminunc (unconstrained nonlinear minimization):
  Large-scale problems: trust-region method based on the interior-reflective Newton method.
  Medium-scale problems: BFGS quasi-Newton method with a cubic line search procedure.

fminsearch (unconstrained multivariable optimization, nonsmooth functions):
  Nelder-Mead simplex (a derivative-free method)

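For readers without MATLAB, the kind of constrained nonlinear problem fmincon handles can be posed the same way with SciPy's general NLP interface (an SQP-type method here; this is an analogue for illustration, not the toolbox itself):

```python
import numpy as np
from scipy.optimize import minimize

# minimize (x-1)^2 + (y-2.5)^2  subject to  x + y <= 2,  x >= 0,  y >= 0
f = lambda x: (x[0] - 1)**2 + (x[1] - 2.5)**2
cons = [{"type": "ineq", "fun": lambda x: 2 - x[0] - x[1]}]  # written as >= 0
bounds = [(0, None), (0, None)]
res = minimize(f, x0=np.array([2.0, 0.0]), method="SLSQP",
               bounds=bounds, constraints=cons)
print(res.x, res.fun)   # optimum at (0.25, 1.75), f = 1.125
```

In the same spirit, method="Nelder-Mead" plays the role of fminsearch, and an unconstrained call with method="BFGS" roughly matches fminunc's medium-scale path.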

Multiobjective Optimization
Solvers:
fminimax (minimize the maximum value of a set of functions).
fgoalattain (find a location where a collection of functions is below some prespecified values).

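A minimax problem of the fminimax type can also be rewritten for any general NLP solver with the standard epigraph trick: minimize t subject to F_i(x) ≤ t. A small Python sketch of that reformulation (our own, not MATLAB's fminimax):

```python
import numpy as np
from scipy.optimize import minimize

# min over x of max( (x-1)^2, (x+1)^2 ); the minimax point is x = 0, value 1
F = [lambda x: (x[0] - 1)**2, lambda x: (x[0] + 1)**2]

# Epigraph form: variables z = [x, t]; minimize t s.t. t - F_i(x) >= 0
cons = [{"type": "ineq", "fun": lambda z, Fi=Fi: z[1] - Fi(z[:1])} for Fi in F]
res = minimize(lambda z: z[1], np.array([0.5, 10.0]),
               method="SLSQP", constraints=cons)
print(res.x)   # ~[0, 1]: x = 0, worst-case value 1
```

The `Fi=Fi` default argument pins each function to its constraint, avoiding the usual late-binding pitfall with lambdas in a loop.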

Choosing a Solver (Optimization Decision Table)

[Optimization Toolbox Guide]

