What Is Optimization Toolbox?

lsqnonlin

          upper            Upper bounds ub

output    Structure containing information about the optimization. The fields
          of the structure are:

          iterations       Number of iterations taken
          funcCount        Number of function evaluations
          algorithm        Algorithm used
          cgiterations     Number of PCG iterations (large-scale algorithm
                           only)
          stepsize         The final step size taken (medium-scale algorithm
                           only)
          firstorderopt    Measure of first-order optimality (large-scale
                           algorithm only). For large-scale bound constrained
                           problems, the first-order optimality is the infinity
                           norm of v.*g, where v is defined as in "Box
                           Constraints" on page 4-10, and g is the gradient
                           g = J^T F (see "Nonlinear Least-Squares" on page
                           4-12).

Note  The sum of squares should not be formed explicitly. Instead, your
function should return a vector of function values. See "Examples" on page
8-174.

Options

Optimization options. You can set or change the values of these options using
the optimset function. Some options apply to all algorithms, some are relevant
only when you are using the large-scale algorithm, and others are relevant
only when you are using the medium-scale algorithm. See "Optimization
Options" on page 6-8 for detailed information.
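The note above means fun must return the residual vector F(x); lsqnonlin forms the sum of squares internally. A minimal sketch (the Rosenbrock-style residuals and starting point here are illustrative, not taken from this page):

```matlab
% Return the vector of function values F(x); lsqnonlin computes
% sum(F(x).^2) internally -- do not square and sum yourself.
fun = @(x) [10*(x(2) - x(1)^2);   % first residual
            1 - x(1)];            % second residual

x0 = [-1.2; 1];                   % illustrative starting point
[x, resnorm, residual, exitflag, output] = lsqnonlin(fun, x0);

% output.iterations, output.funcCount, and output.firstorderopt are
% the fields of the output structure described in the table above.
```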

The LargeScale option specifies a preference for which algorithm to use. It
is only a preference because certain conditions must be met to use the
large-scale or medium-scale algorithm. For the large-scale algorithm, the
nonlinear system of equations cannot be underdetermined; that is, the number
of equations (the number of elements of F returned by fun) must be at least
as many as the length of x. Furthermore, only the large-scale algorithm
handles bound constraints:

          LargeScale       Use large-scale algorithm if possible when set to
                           'on'. Use medium-scale algorithm when set to 'off'.

Medium-Scale and Large-Scale Algorithms

These options are used by both the medium-scale and large-scale algorithms:

          DerivativeCheck  Compare user-supplied derivatives (Jacobian) to
                           finite-differencing derivatives.
          Diagnostics      Display diagnostic information about the function
                           to be minimized.
          DiffMaxChange    Maximum change in variables for finite
                           differencing.
          DiffMinChange    Minimum change in variables for finite
                           differencing.
          Display          Level of display. 'off' displays no output; 'iter'
                           displays output at each iteration; 'final'
                           (default) displays just the final output.
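As a sketch of how these options are set with optimset and passed to lsqnonlin (the residual function, starting point, and bounds are illustrative assumptions):

```matlab
options = optimset('LargeScale', 'on', ...      % prefer the large-scale algorithm
                   'Display', 'iter', ...       % show output at each iteration
                   'DerivativeCheck', 'on');    % compare Jacobian to finite differences

fun = @(x) [10*(x(2) - x(1)^2); 1 - x(1)];      % residual vector F(x)
x0 = [-1.2; 1];
lb = [-2; -2];  ub = [2; 2];                    % bounds: large-scale algorithm only

[x, resnorm] = lsqnonlin(fun, x0, lb, ub, options);
```

Because bounds are supplied here, lsqnonlin must use the large-scale algorithm; with bounds, the problem must also not be underdetermined, as noted above.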

