
A subclass of optimization problems is the problem where the inequality constraints are made of bounds. The bound-constrained optimization problem is the following:

\begin{align}
\min_{x \in \mathbb{R}^n} \quad & f(x) \tag{7} \\
\textrm{s.t.} \quad & x_{i,L} \leq x_i \leq x_{i,U}, \quad i = 1, \ldots, m, \tag{8}
\end{align}

where $\{x_{i,L}\}_{i=1,\ldots,m}$ (resp. $\{x_{i,U}\}_{i=1,\ldots,m}$) are the lower (resp. upper) bounds. Many physical problems lead to this kind of problem, because physical parameters are generally bounded.
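In Scilab, such bound constraints can be passed to the optim function through its "b" option. The following is a minimal sketch of the calling sequence; the quadratic objective, the bounds and all variable names are made up for illustration:

function [f, g, ind] = quadcost ( x , ind )
    // A made-up objective with unconstrained minimum at x = (3,3)^T.
    f = sum((x - 3).^2)
    // Its gradient.
    g = 2*(x - 3)
endfunction

binf = [-1; -1]   // lower bounds x_{i,L}
bsup = [ 2;  2]   // upper bounds x_{i,U}
x0   = [ 0;  0]   // feasible starting point
[fopt, xopt] = optim(quadcost, "b", binf, bsup, x0)
// The upper bounds are active here: xopt should be close to (2,2)^T.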

1.3 Rosenbrock's function

We now present a famous optimization problem, used in many demonstrations of optimization methods. We will use this example consistently throughout this document and will analyze it with Scilab. Let us consider Rosenbrock's function [13], defined by

\[
f(x) = 100 (x_2 - x_1^2)^2 + (1 - x_1)^2. \tag{10}
\]

The classical starting point is $x_0 = (-1.2, 1)^T$, where the function value is $f(x_0) = 24.2$. The global minimum of this function is $x^\star = (1, 1)^T$, where the function value is $f(x^\star) = 0$. The gradient is

\begin{align}
g(x) &= \begin{pmatrix} -400 x_1 (x_2 - x_1^2) - 2 (1 - x_1) \\ 200 (x_2 - x_1^2) \end{pmatrix} \tag{11} \\
&= \begin{pmatrix} 400 x_1^3 - 400 x_1 x_2 + 2 x_1 - 2 \\ -200 x_1^2 + 200 x_2 \end{pmatrix}. \tag{12}
\end{align}
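At the global minimum $x^\star = (1,1)^T$, both terms $x_2 - x_1^2$ and $1 - x_1$ vanish, so that, as expected for an unconstrained minimum,

\[
g(x^\star) = \begin{pmatrix} -400 \cdot 1 \cdot (1 - 1) - 2 (1 - 1) \\ 200 (1 - 1) \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.
\]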

The Hessian matrix is

\[
H(x) = \begin{pmatrix} 1200 x_1^2 - 400 x_2 + 2 & -400 x_1 \\ -400 x_1 & 200 \end{pmatrix}. \tag{13}
\]
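At $x^\star = (1,1)^T$ this yields

\[
H(x^\star) = \begin{pmatrix} 802 & -400 \\ -400 & 200 \end{pmatrix},
\]

which is positive definite, since its trace is $1002 > 0$ and its determinant is $802 \cdot 200 - 400^2 = 400 > 0$. This confirms that $x^\star$ is a strict local minimum.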

The following rosenbrock function computes the function value f, the gradient g and the Hessian matrix H of Rosenbrock's function.

function [f, g, H] = rosenbrock ( x )
    // Function value, equation (10).
    f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2
    // Gradient, equation (11).
    g(1) = -400*(x(2) - x(1)^2)*x(1) - 2*(1 - x(1))
    g(2) = 200*(x(2) - x(1)^2)
    // Hessian matrix, equation (13); H(2,1) follows by symmetry.
    H(1,1) = 1200*x(1)^2 - 400*x(2) + 2
    H(1,2) = -400*x(1)
    H(2,1) = H(1,2)
    H(2,2) = 200
endfunction
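For instance, the following short session evaluates the function at the classical starting point and recovers the value given above:

// Evaluate Rosenbrock's function at the classical starting point.
x0 = [-1.2; 1];
[f, g, H] = rosenbrock(x0);
disp(f)   // prints 24.2, as stated above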

It is safe to say that the most common practical optimization issue is related to a bad implementation of the objective function. Therefore, we must be extra careful when we implement the objective function.
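A simple safeguard is to compare the analytical gradient with a finite-difference approximation. The following is a minimal sketch, assuming Scilab's numderivative function (available in Scilab 5.5 and later):

// Compare the analytical gradient with finite differences.
x0 = [-1.2; 1];
[f, g, H] = rosenbrock(x0);
J = numderivative(rosenbrock, x0);   // 1-by-2 finite-difference Jacobian of f
disp(norm(J' - g))                   // should be small if the gradient is correct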

