
$\alpha_k$. A one-dimensional line search is generally used to find the minimizing stepsize. In general, this value is not found exactly; instead, a termination criterion is used to determine when the line search has come sufficiently close. Bertsekas suggests implementations for the line search in [16].
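For concreteness, a backtracking (Armijo) line search is one common implementation. The sketch below is a minimal Python version; the function name and the default values of the sufficient-decrease parameter `sigma` and reduction factor `beta` are illustrative assumptions, not the specific implementations suggested in [16].

```python
import numpy as np

def armijo_line_search(f, grad_f, x, d, sigma=1e-4, beta=0.5, alpha0=1.0):
    """Backtracking line search with an Armijo sufficient-decrease test.

    Shrinks the trial stepsize by beta until
        f(x + alpha*d) <= f(x) + sigma * alpha * grad_f(x) @ d,
    which serves as the termination criterion. Assumes d is a descent
    direction, i.e., grad_f(x) @ d < 0.
    """
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ d  # directional derivative of f along d
    while f(x + alpha * d) > fx + sigma * alpha * slope:
        alpha *= beta  # cut the stepsize back and try again
    return alpha
```

For a continuously differentiable $f$ and a descent direction $d$, the loop terminates for any $\sigma, \beta \in (0, 1)$.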

A.2 Newton’s Method

In Newton’s method, a quadratic approximation of $f$ around the current iterate $x_k$ is minimized. The descent direction therefore includes second-order information:

\[
d_k = -\left[\nabla^2 f(x_k)\right]^{-1} \nabla f(x_k) \tag{A.12}
\]

The quantity $\nabla^2 f$ is the second-order derivative of the objective with respect to the design variables and is known as the Hessian:

\[
\nabla^2 f = \frac{\partial^2 f(x)}{\partial x^2} =
\begin{bmatrix}
\frac{\partial^2 f(x)}{\partial x_1^2} & \frac{\partial^2 f(x)}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f(x)}{\partial x_1 \partial x_n} \\
\frac{\partial^2 f(x)}{\partial x_2 \partial x_1} & \frac{\partial^2 f(x)}{\partial x_2^2} & \cdots & \frac{\partial^2 f(x)}{\partial x_2 \partial x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial^2 f(x)}{\partial x_n \partial x_1} & \frac{\partial^2 f(x)}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f(x)}{\partial x_n^2}
\end{bmatrix} \tag{A.13}
\]

As with the gradients, the Hessian may be calculated analytically or approximated through finite-difference or other methods. If the central-difference equation (Equation 4.4) is used to calculate the gradients, then the diagonal elements of the Hessian can be obtained at very little additional expense:

\[
\frac{\partial^2 f(x)}{\partial x_i^2} = \frac{f(x + \Delta x \hat{e}_i) + f(x - \Delta x \hat{e}_i) - 2 f(x)}{\Delta x^2} \tag{A.14}
\]

where $\hat{e}_i$ is the unit vector along the $i$th design variable.
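Equation A.14 translates almost directly into code. The sketch below is an illustrative Python version (the function name and perturbation size `dx` are assumptions); the shifted points it evaluates are the same ones a central-difference gradient already visits.

```python
import numpy as np

def hessian_diagonal(f, x, dx=1e-4):
    """Diagonal Hessian entries via the central-difference formula (Eq. A.14).

    f(x + dx*e_i) and f(x - dx*e_i) are already computed when forming a
    central-difference gradient, so only the single evaluation f(x) is new.
    """
    fx = f(x)
    diag = np.empty(x.size)
    for i in range(x.size):
        e_i = np.zeros(x.size)
        e_i[i] = 1.0  # unit vector along the i-th design variable
        diag[i] = (f(x + dx * e_i) + f(x - dx * e_i) - 2.0 * fx) / dx**2
    return diag
```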

Newton’s method is very popular due to its fast convergence properties. In fact, the method finds the global minimum of a positive definite quadratic function in only one iteration. For this reason, many modified algorithms are based on Newton’s method. One drawback of the technique is that the Hessian is required to determine the descent direction.
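The one-iteration property is easy to verify numerically. The following sketch (illustrative Python, with a full stepsize $\alpha_k = 1$ assumed) takes a single Newton step on a two-dimensional positive definite quadratic and lands exactly on the minimizer.

```python
import numpy as np

def newton_step(grad_f, hess_f, x):
    """One Newton iteration: d_k = -[hess f(x_k)]^{-1} grad f(x_k) (Eq. A.12)."""
    d = np.linalg.solve(hess_f(x), -grad_f(x))  # solve rather than invert
    return x + d

# Quadratic f(x) = 0.5*x'Ax - b'x with A positive definite; the minimizer solves Ax = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x1 = newton_step(lambda x: A @ x - b, lambda x: A, np.zeros(2))
print(np.allclose(x1, np.linalg.solve(A, b)))  # True: minimum reached in one step
```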

