The Steepest Descent Algorithm for Unconstrained Optimization and ...

1 The Algorithm

The problem we are interested in solving is:

P : minimize f(x)
    s.t.     x ∈ ℝⁿ,

where f(x) is differentiable. If x = x̄ is a given point, f(x) can be approximated by its linear expansion

f(x̄ + d) ≈ f(x̄) + ∇f(x̄)ᵀd

if d is "small", i.e., if ‖d‖ is small. Now notice that if the approximation in the above expression is good, then we want to choose d so that the inner product ∇f(x̄)ᵀd is as small as possible. Let us normalize d so that ‖d‖ = 1.
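As a quick numerical illustration (a minimal sketch; the quadratic f, the matrix Q, and all variable names below are illustrative choices, not part of the notes), one can check that the error of the linear expansion shrinks as ‖d‖ does:

```python
import numpy as np

# Illustrative test function: f(x) = 0.5 * x^T Q x for a fixed SPD matrix Q.
Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])

def f(x):
    return 0.5 * x @ Q @ x

def grad_f(x):
    return Q @ x  # gradient of the quadratic above

x_bar = np.array([1.0, -1.0])
d = np.array([0.6, 0.8])  # a fixed direction with ||d|| = 1

# The gap |f(x_bar + t d) - (f(x_bar) + t * grad^T d)| should shrink
# like t^2 as the step t -> 0, confirming the first-order expansion.
for t in [1.0, 0.1, 0.01]:
    actual = f(x_bar + t * d)
    model = f(x_bar) + t * grad_f(x_bar) @ d
    print(t, abs(actual - model))
```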

Then among all directions d with norm ‖d‖ = 1, the direction

d̃ = −∇f(x̄) / ‖∇f(x̄)‖

makes the smallest inner product with the gradient ∇f(x̄). This fact follows from the Cauchy–Schwarz inequality: for any d with ‖d‖ = 1,

∇f(x̄)ᵀd ≥ −‖∇f(x̄)‖‖d‖ = ∇f(x̄)ᵀ(−∇f(x̄)/‖∇f(x̄)‖) = ∇f(x̄)ᵀd̃.
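The sketch below checks this claim empirically (a minimal, self-contained illustration; the vector g standing in for ∇f(x̄) and all names are assumptions of the sketch): among many random unit directions d, none attains an inner product with g below ∇f(x̄)ᵀd̃ = −‖∇f(x̄)‖.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative gradient vector at x_bar; any nonzero g works for this check.
g = np.array([3.0, -1.0, 2.0])

d_tilde = -g / np.linalg.norm(g)  # the normalized steepest descent direction

# Sample random unit-norm directions d and compare g^T d against g^T d_tilde.
# By the Cauchy-Schwarz bound, g^T d >= -||g|| = g^T d_tilde for every d.
samples = rng.standard_normal((10_000, 3))
unit_dirs = samples / np.linalg.norm(samples, axis=1, keepdims=True)

worst = (unit_dirs @ g).min()
print("min over random unit d of g^T d:", worst)
print("g^T d_tilde = -||g||          :", g @ d_tilde)
assert worst >= g @ d_tilde - 1e-12
```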

For this reason the un-normalized direction

d̄ = −∇f(x̄)

is called the direction of steepest descent at the point x̄.

Note that d̄ = −∇f(x̄) is a descent direction as long as ∇f(x̄) ≠ 0. To see this, simply observe that d̄ᵀ∇f(x̄) = −(∇f(x̄))ᵀ∇f(x̄) < 0 so long as ∇f(x̄) ≠ 0.
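As a final sketch (again with an illustrative smooth f chosen here, not one from the notes), this negativity means that a sufficiently small step along d̄ strictly decreases f:

```python
import numpy as np

def f(x):
    # Illustrative smooth function, not prescribed by the notes.
    return np.log(1.0 + x @ x) + 0.5 * x[0] ** 2

def grad_f(x):
    g = 2.0 * x / (1.0 + x @ x)
    g[0] += x[0]
    return g

x_bar = np.array([1.0, 2.0])
d_bar = -grad_f(x_bar)  # direction of steepest descent at x_bar

# Since d_bar^T grad_f(x_bar) = -||grad_f(x_bar)||^2 < 0, a small enough
# step t along d_bar must strictly reduce the objective value.
for t in [0.5, 0.1, 0.01]:
    print(t, f(x_bar + t * d_bar) < f(x_bar))  # True for each step size
```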
