What Is Optimization Toolbox?


fminunc

(continued from the description of the HessMult option) Y is a matrix that has the same number of rows as there are dimensions in the problem. W = H*Y, although H is not formed explicitly. fminunc uses Hinfo to compute the preconditioner. The optional parameters p1, p2, ... can be any additional parameters needed by hmfun. See "Avoiding Global Variables via Anonymous and Nested Functions" on page 2-20 for information on how to supply values for the parameters.

Note: 'Hessian' must be set to 'on' for Hinfo to be passed from fun to hmfun. See "Nonlinear Minimization with a Dense but Structured Hessian and Equality Constraints" on page 2-61 for an example.

HessPattern
    Sparsity pattern of the Hessian for finite differencing. If it is not convenient to compute the sparse Hessian matrix H in fun, the large-scale method in fminunc can approximate H via sparse finite differences (of the gradient), provided the sparsity structure of H (i.e., the locations of the nonzeros) is supplied as the value for HessPattern. In the worst case, if the structure is unknown, you can set HessPattern to be a dense matrix and a full finite-difference approximation is computed at each iteration (this is the default). This can be very expensive for large problems, so it is usually worth the effort to determine the sparsity structure.

MaxPCGIter
    Maximum number of PCG (preconditioned conjugate gradient) iterations (see "Algorithms" on page 8-88).
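As a minimal sketch of supplying HessPattern and MaxPCGIter (the objective `myfun` is hypothetical and assumed to return both the function value and its gradient; a tridiagonal Hessian structure is assumed for illustration):

```matlab
% Hypothetical objective whose Hessian is tridiagonal.
n = 1000;
HessPattern = spdiags(ones(n,3), -1:1, n, n);    % nonzeros on the three central diagonals

options = optimset('GradObj','on', ...            % myfun supplies the gradient
                   'HessPattern',HessPattern, ... % sparse finite differencing of the gradient
                   'MaxPCGIter',50);              % cap inner PCG iterations

x0 = zeros(n,1);
[x,fval] = fminunc(@myfun, x0, options);
```

With the sparsity pattern supplied, each finite-difference Hessian approximation costs only a handful of gradient evaluations instead of n of them, which is the point of the HessPattern option.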

PrecondBandWidth
    Upper bandwidth of the preconditioner for PCG. By default, diagonal preconditioning is used (upper bandwidth of 0). For some problems, increasing the bandwidth reduces the number of PCG iterations. Setting PrecondBandWidth to 'Inf' uses a direct factorization (Cholesky) rather than conjugate gradients (CG). The direct factorization is computationally more expensive than CG, but produces a better quality step towards the solution.

TolPCG
    Termination tolerance on the PCG iteration.

Medium-Scale Algorithm Only

These options are used only by the medium-scale algorithm:
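A sketch of tuning the PCG-related options described above (the specific values 2 and 0.01 are illustrative assumptions, not recommendations from this reference):

```matlab
% Trade inner-iteration cost for a better-quality step.
options = optimset('GradObj','on', ...
                   'PrecondBandWidth',2, ...  % banded preconditioner, upper bandwidth 2
                   'TolPCG',0.01);            % looser termination tolerance for inner PCG
% Setting PrecondBandWidth to Inf would instead use a direct Cholesky
% factorization in place of conjugate gradients, as noted above.
```

Widening the preconditioner bandwidth or loosening TolPCG changes only the inner PCG solves; the outer fminunc convergence tests are governed by the usual TolFun/TolX options.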

