
   A       a     Upper bound on δ = ((A/a − 1)/(A/a + 1))²   Number of iterations to reduce the
                                                             optimality gap by a factor of 10
   1.1     1.0   0.0023                                        1
   3.0     1.0   0.25                                          2
  10.0     1.0   0.67                                          6
 100.0     1.0   0.96                                         58
 200.0     1.0   0.98                                        116
 400.0     1.0   0.99                                        231

Table 1: Sensitivity of Steepest Descent Convergence Rate to the Eigenvalue Ratio
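The last column follows from the second-to-last: if each steepest descent iteration shrinks the optimality gap by at least the factor $\delta$, then the number of iterations needed to reduce the gap by a factor of 10 is the smallest $n$ with $\delta^n \le 1/10$. A minimal sketch reproducing the table (an illustration, not from the original notes; the only inputs are the ratios $A/a$):

```python
import math

# Eigenvalue ratios A/a from Table 1 (a = 1.0 throughout).
ratios = [1.1, 3.0, 10.0, 100.0, 200.0, 400.0]

for kappa in ratios:
    # Upper bound on the per-iteration contraction factor of the optimality gap.
    delta = ((kappa - 1.0) / (kappa + 1.0)) ** 2
    # Smallest n with delta**n <= 0.1, i.e., gap reduced by a factor of 10.
    n = math.ceil(math.log(0.1) / math.log(delta))
    print(f"A/a = {kappa:6.1f}   delta = {delta:.4f}   iterations = {n}")
```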

and in Step 2 we need to solve $\alpha_k = \arg\min_{\alpha} h(\alpha) = f(x^k + \alpha d^k)$. In this example we will be able to derive an analytic expression for $\alpha_k$. Notice that

$$h(\alpha) = f(x^k + \alpha d^k) = 5(x_1^k + \alpha d_1^k)^2 + (x_2^k + \alpha d_2^k)^2 + 4(x_1^k + \alpha d_1^k)(x_2^k + \alpha d_2^k) - 14(x_1^k + \alpha d_1^k) - 6(x_2^k + \alpha d_2^k) + 20,$$

and this is a simple quadratic function of the scalar $\alpha$. Setting $h'(\alpha) = 0$ and solving for $\alpha$ shows that it is minimized at

$$\alpha_k = \frac{(d_1^k)^2 + (d_2^k)^2}{2\left(5(d_1^k)^2 + 4 d_1^k d_2^k + (d_2^k)^2\right)}.$$

Using the steepest descent algorithm to minimize $f(x)$ starting from $x^1 = (x_1^1, x_2^1) = (0, 10)$, and using a tolerance of $\epsilon = 10^{-6}$, we compute the iterates shown in Table 2 and in Figure 2.
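Here is a minimal sketch of this computation (an illustration, assuming the stopping rule $\|\nabla f(x^k)\| \le \epsilon$): the direction is $d^k = -\nabla f(x^k)$ and the step size is the closed-form $\alpha_k$ derived above.

```python
import numpy as np

def grad_f(x):
    # Gradient of f(x) = 5*x1^2 + x2^2 + 4*x1*x2 - 14*x1 - 6*x2 + 20.
    return np.array([10.0 * x[0] + 4.0 * x[1] - 14.0,
                     4.0 * x[0] + 2.0 * x[1] - 6.0])

x = np.array([0.0, 10.0])    # starting point x^1
eps = 1e-6                   # gradient-norm tolerance
k = 1
while np.linalg.norm(grad_f(x)) > eps:
    d = -grad_f(x)           # steepest descent direction d^k
    # Exact line search: the closed-form alpha_k derived above.
    alpha = (d[0]**2 + d[1]**2) / (2.0 * (5.0 * d[0]**2 + 4.0 * d[0] * d[1] + d[1]**2))
    x = x + alpha * d
    k += 1
print(k, x)
```

Here $f$ corresponds to $Q = \begin{pmatrix} 10 & 4 \\ 4 & 2 \end{pmatrix}$ and $c = (14, 6)^T$, so the iterates approach the unique minimizer $x^* = (1, 1)$.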

For a convex quadratic function $f(x) = \frac{1}{2} x^T Q x - c^T x$, the contours of the function values will be shaped like ellipsoids, and the gradient vector $\nabla f(x)$ at any point $x$ will be perpendicular to the contour line passing through $x$; see Figure 1.
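This can be visualized directly; the sketch below (assuming matplotlib is available, with arbitrarily chosen grid limits and sample points) draws the contours of the example function together with a few gradient vectors, which meet the contours at right angles:

```python
import numpy as np
import matplotlib.pyplot as plt

# Contours of f(x) = 5*x1^2 + x2^2 + 4*x1*x2 - 14*x1 - 6*x2 + 20
# with gradient vectors overlaid at a few sample points.
X1, X2 = np.meshgrid(np.linspace(-4, 6, 200), np.linspace(-4, 10, 200))
F = 5*X1**2 + X2**2 + 4*X1*X2 - 14*X1 - 6*X2 + 20
plt.contour(X1, X2, F, levels=20)

pts = np.array([[0.0, 10.0], [2.0, 0.0], [-2.0, 4.0]])
G = np.column_stack([10*pts[:, 0] + 4*pts[:, 1] - 14,   # df/dx1
                     4*pts[:, 0] + 2*pts[:, 1] - 6])    # df/dx2
plt.quiver(pts[:, 0], pts[:, 1], G[:, 0], G[:, 1])
plt.gca().set_aspect("equal")
plt.show()
```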

