C++ for Scientists - Technische Universität Dresden

14.6 THE NEWTON-RAPHSON METHOD FOR FINDING THE MINIMUM OF A CONVEX FUNCTION

Then $p'(x) = 0$ for
$$x = \tilde{x} - \frac{f'(\tilde{x})}{f''(\tilde{x})}.$$

The Newton method goes as follows. In this algorithm $\tau$ is a tolerance for the stopping criterion.

1. Given initial $\tilde{x} = x^{(0)}$.
2. Repeat for $j = 1, 2, \dots$
   2.1. Compute $x^{(j)} = x^{(j-1)} - f'(x^{(j-1)})/f''(x^{(j-1)})$.
3. Until $|f'(x^{(j-1)})/f''(x^{(j-1)})| < \tau$.

The iteration stops when the Newton step, i.e. the ratio of the first derivative to the second derivative, falls below the tolerance $\tau$. What happens when $f''(x^{(j-1)}) = 0$?

14.6.2 Multivariate functions

For multivariate functions the principle is the same, but the computation is more involved. A multivariate function $f$ has an argument $x \in \mathbb{R}^n$, i.e. a vector of size $n$. For example, $f = \sin(x_1) + x_2 \cos(x_1)$ is a multivariate function in the variables $x_1$ and $x_2$.

We use the same idea as for one variable. That is, we use the Taylor expansion to approximate the function:
$$f(x) \approx f(\tilde{x}) + \nabla f(\tilde{x})^T (x - \tilde{x}) + \frac{1}{2} (x - \tilde{x})^T H(f(\tilde{x})) (x - \tilde{x})$$

$$\nabla f(\tilde{x}) = \begin{pmatrix} \partial f/\partial x_1 \\ \vdots \\ \partial f/\partial x_n \end{pmatrix}, \qquad
H(f) = \begin{bmatrix}
\partial^2 f/\partial x_1 \partial x_1 & \cdots & \partial^2 f/\partial x_1 \partial x_n \\
\vdots & & \vdots \\
\partial^2 f/\partial x_n \partial x_1 & \cdots & \partial^2 f/\partial x_n \partial x_n
\end{bmatrix}$$
where $\nabla f(\tilde{x})$ is called the gradient vector and $H(f)$ the Hessian matrix.
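For the example function above, $f = \sin(x_1) + x_2 \cos(x_1)$, the gradient and Hessian can be written out explicitly (worked out here for illustration):

```latex
\nabla f = \begin{pmatrix} \cos x_1 - x_2 \sin x_1 \\ \cos x_1 \end{pmatrix},
\qquad
H(f) = \begin{bmatrix}
-\sin x_1 - x_2 \cos x_1 & -\sin x_1 \\
-\sin x_1 & 0
\end{bmatrix}
```

Note that the Hessian is symmetric, since the mixed partial derivatives $\partial^2 f/\partial x_1 \partial x_2$ and $\partial^2 f/\partial x_2 \partial x_1$ coincide for smooth $f$.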

The derivative becomes
$$f'(x) = \nabla f(\tilde{x}) + H(f(\tilde{x}))(x - \tilde{x})$$
so the derivative is zero when
$$x = \tilde{x} - \{H(f(\tilde{x}))\}^{-1} \nabla f(\tilde{x}).$$

This requires the solution of an $n \times n$ linear system in each iteration. The Newton algorithm is very similar to the univariate case.

14.6.3 Software for univariate functions

The task is to first develop the function
