What Is Optimization Toolbox?

2 Tutorial

   ceq = [];
   DCeq = [];

G contains the partial derivatives of the objective function, f, returned by objfungrad(x), with respect to each of the elements in x:

   G = [ f(x) + exp(x1)*(8*x1 + 4*x2)
         exp(x1)*(4*x1 + 4*x2 + 2) ]                        (2-4)

The columns of DC contain the partial derivatives for each respective constraint (i.e., the ith column of DC is the partial derivative of the ith constraint with respect to x). So in the above example, DC is

   DC = [ x2 - 1    -x2
          x1 - 1    -x1 ]                                   (2-5)

Since you are providing the gradient of the objective in objfungrad.m and the gradient of the constraints in confungrad.m, you must tell fmincon that these M-files contain this additional information. Use optimset to turn the options GradObj and GradConstr to 'on' in the example's existing options structure:

   options = optimset(options,'GradObj','on','GradConstr','on');

If you do not set these options to 'on' in the options structure, fmincon does not use the analytic gradients.

The arguments lb and ub place lower and upper bounds on the independent variables in x. In this example, there are no bound constraints, so they are both set to [].

Step 3: Invoke the constrained optimization routine.

   x0 = [-1,1];       % Starting guess
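As a cross-check outside MATLAB, the analytic gradient in (2-4) can be compared against central finite differences. The sketch below uses Python with NumPy rather than the Toolbox itself, and it assumes the example objective from the earlier listing in this tutorial, f(x) = exp(x1)*(4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1); the function names are illustrative, not part of any Toolbox API.

```python
import numpy as np

def objfun(x):
    # Example objective assumed from the earlier objfun listing:
    # f(x) = exp(x1) * (4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
    x1, x2 = x
    return np.exp(x1) * (4*x1**2 + 2*x2**2 + 4*x1*x2 + 2*x2 + 1)

def objgrad(x):
    # Analytic gradient, the analogue of G in objfungrad.m (equation 2-4)
    x1, x2 = x
    f = objfun(x)
    return np.array([f + np.exp(x1) * (8*x1 + 4*x2),
                     np.exp(x1) * (4*x1 + 4*x2 + 2)])

def fd_grad(fun, x, h=1e-6):
    # Central finite differences, for comparison with the analytic gradient
    g = np.zeros(len(x))
    for i in range(len(x)):
        e = np.zeros(len(x))
        e[i] = h
        g[i] = (fun(x + e) - fun(x - e)) / (2 * h)
    return g

x0 = np.array([-1.0, 1.0])
print(np.max(np.abs(objgrad(x0) - fd_grad(objfun, x0))))
```

If the printed discrepancy is not tiny, the analytic gradient formula (or the objective itself) has a mistake; this is the same idea the DerivativeCheck option automates inside fmincon.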

Examples That Use Standard Algorithms

   options = optimset('LargeScale','off');
   options = optimset(options,'GradObj','on','GradConstr','on');
   lb = []; ub = [];   % No upper or lower bounds
   [x,fval] = fmincon(@objfungrad,x0,[],[],[],[],lb,ub,...
                      @confungrad,options)
   [c,ceq] = confungrad(x)   % Check the constraint values at x

After 20 function evaluations, the solution produced is

   x =
      -9.5474    1.0474
   fval =
       0.0236
   c =
      1.0e-14 *
       0.1110
      -0.1776
   ceq =
        []

Gradient Check: Analytic Versus Numeric

When analytically determined gradients are provided, you can compare the supplied gradients with a set calculated by finite-difference evaluation. This is particularly useful for detecting mistakes in either the objective function or the gradient function formulation.

If you want such gradient checks, set the DerivativeCheck option to 'on' using optimset:

   options = optimset(options,'DerivativeCheck','on');

The first cycle of the optimization checks the analytically determined gradients (of the objective function and, if they exist, the nonlinear constraints). If they do not match the finite-differencing gradients within a given tolerance, a warning message indicates the discrepancy and gives the option to abort the optimization or to continue.
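For readers without MATLAB, the same constrained problem can be sketched with SciPy's minimize (method SLSQP), passing the analytic gradient and constraint Jacobians much as GradObj and GradConstr do for fmincon. The objective and constraints below are assumed from the earlier objfun/confun listings in this tutorial; note that SciPy expects inequality constraints as g(x) >= 0, the opposite sign convention of fmincon's c(x) <= 0, so the constraints and their Jacobians are negated.

```python
import numpy as np
from scipy.optimize import minimize

def objfun(x):
    # Assumed example objective from earlier in this tutorial
    x1, x2 = x
    return np.exp(x1) * (4*x1**2 + 2*x2**2 + 4*x1*x2 + 2*x2 + 1)

def objgrad(x):
    # Analytic gradient (equation 2-4)
    x1, x2 = x
    f = objfun(x)
    return np.array([f + np.exp(x1) * (8*x1 + 4*x2),
                     np.exp(x1) * (4*x1 + 4*x2 + 2)])

# MATLAB constraints (assumed from confun.m):
#   c(1) = 1.5 + x1*x2 - x1 - x2 <= 0
#   c(2) = -x1*x2 - 10           <= 0
# SciPy wants g(x) >= 0, so each constraint and its Jacobian is negated.
cons = [
    {"type": "ineq",
     "fun": lambda x: -(1.5 + x[0]*x[1] - x[0] - x[1]),
     "jac": lambda x: np.array([1 - x[1], 1 - x[0]])},
    {"type": "ineq",
     "fun": lambda x: x[0]*x[1] + 10,
     "jac": lambda x: np.array([x[1], x[0]])},
]

res = minimize(objfun, x0=[-1.0, 1.0], jac=objgrad,
               constraints=cons, method="SLSQP")
print(res.x, res.fun)
```

With the same starting guess [-1, 1], SLSQP should land near the solution fmincon reports above; the point of supplying jac in both places is the same as in MATLAB: fewer function evaluations and no finite-difference noise in the derivatives.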
