Introduction to Unconstrained Optimization - Scilab


1.2 What is an optimization problem?

In the current section, we present the basic vocabulary of optimization.

Consider the following general constrained optimization problem:

    min   f(x),  x ∈ R^n,                      (2)
    s.t.  h_i(x) = 0,   i = 1, ..., m',        (3)
          h_i(x) ≥ 0,   i = m' + 1, ..., m.    (4)
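To make the notation concrete, here is a small hypothetical instance of problem (2)-(4) with n = 2, one equality constraint (m' = 1) and one inequality constraint (m = 2). The functions below are illustrative choices, not taken from the text, and the sketch is in Python rather than Scilab:

```python
# Hypothetical instance of problem (2)-(4) with n = 2, m' = 1, m = 2.
def f(x):          # objective function f : R^2 -> R
    return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

def h1(x):         # equality constraint: h1(x) = 0
    return x[0] + x[1] - 1.0

def h2(x):         # inequality constraint: h2(x) >= 0
    return x[0]

# The point x = (0.25, 0.75) satisfies h1(x) = 0 and h2(x) >= 0,
# so it is feasible for this instance.
x = [0.25, 0.75]
print(f(x), h1(x), h2(x))  # 2.125 0.0 0.25
```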

The following list presents the names of the variables in the optimization problem.

• The variables x ∈ R^n can be called the unknowns, the parameters or, sometimes, the decision variables. This is because, in many physical optimization problems, some parameters are constants (the gravity constant, for example) and only a limited number of parameters can be optimized.

• The function f : R^n → R is called the objective function. Sometimes, we also call it the cost function.

• The number m ≥ 0 is the number of constraints, and the functions h_i are the constraint functions. More precisely, the functions {h_i}_{i=1,...,m'} are the equality constraint functions and {h_i}_{i=m'+1,...,m} are the inequality constraint functions.

Some additional notations are used in optimization; the following list presents some of the most useful.

• A point which satisfies all the constraints is feasible. The set of points which are feasible is called the feasible set. Assume that x ∈ R^n is a feasible point. Then the direction p ∈ R^n is called a feasible direction if the point x + αp ∈ R^n is feasible for all sufficiently small α > 0.

• The gradient of the objective function is denoted by g(x) ∈ R^n and is defined as

    g(x) = ∇f(x) = ( ∂f/∂x_1, ..., ∂f/∂x_n )^T.    (5)
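In practice, when the partial derivatives in equation (5) are not available in closed form, the gradient can be approximated numerically. The sketch below uses central finite differences in Python (the function `f` is a hypothetical example, not from the text, and this is one common approximation, not the document's own method):

```python
import numpy as np

def grad_fd(f, x, h=1e-6):
    """Central finite-difference approximation of the gradient g(x) = ∇f(x)."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        # i-th component: (f(x + h*e_i) - f(x - h*e_i)) / (2h)
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Hypothetical example: f(x) = x1^2 + 3*x2^2, so ∇f(x) = (2*x1, 6*x2)^T.
f = lambda x: x[0]**2 + 3.0 * x[1]**2
print(grad_fd(f, [1.0, 2.0]))  # close to [2., 12.]
```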

• The Hessian matrix is denoted by H(x) and is defined as

    H_ij(x) = (∇²f)_ij = ∂²f / (∂x_i ∂x_j).    (6)

If f is twice continuously differentiable, then the Hessian matrix is symmetric by equality of mixed partial derivatives, so that H_ij = H_ji.
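The entries H_ij of equation (6) can likewise be approximated by finite differences, and the symmetry H_ij = H_ji can be checked numerically. A Python sketch under the same assumptions as before (the function `f` is a hypothetical example, not from the text):

```python
import numpy as np

def hessian_fd(f, x, h=1e-4):
    """Finite-difference approximation of H_ij(x) = ∂²f / (∂x_i ∂x_j)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            # Central second difference in directions e_i and e_j.
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H

# Hypothetical example: f(x) = x1^2 * x2, whose Hessian is [[2*x2, 2*x1], [2*x1, 0]].
f = lambda x: x[0]**2 * x[1]
H = hessian_fd(f, [1.0, 2.0])
print(H)                    # close to [[4., 2.], [2., 0.]]
print(np.allclose(H, H.T))  # symmetric, as the text notes
```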

The problem of finding a feasible point is called the feasibility problem, and it may be quite difficult in itself.
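Checking whether a given point is feasible, on the other hand, is straightforward: evaluate each constraint function and compare against zero (up to a tolerance, when the evaluations are inexact). A minimal Python sketch, with hypothetical constraint functions not taken from the text:

```python
def is_feasible(x, eq_constraints, ineq_constraints, tol=1e-8):
    """Check h_i(x) = 0 for all equalities and h_i(x) >= 0 for all inequalities."""
    return (all(abs(h(x)) <= tol for h in eq_constraints)
            and all(h(x) >= -tol for h in ineq_constraints))

# Hypothetical instance: h1(x) = x1 + x2 - 1 = 0 and h2(x) = x1 >= 0.
eqs = [lambda x: x[0] + x[1] - 1.0]
ineqs = [lambda x: x[0]]
print(is_feasible([0.5, 0.5], eqs, ineqs))   # True
print(is_feasible([2.0, 0.0], eqs, ineqs))   # False: the equality is violated
```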
