Introduction to Unconstrained Optimization - Scilab
Exercise 2.5 (Convex function - 3) This exercise is associated with proposition 2.6, which gives the second form of the first order condition for a differentiable convex function. The first part of the proposition has already been proved in this chapter; the exercise is based on the proof of the second part. Let f be continuously differentiable on the convex set C. Prove that if f satisfies the inequality

(g(x) − g(y))^T (x − y) ≥ 0 (61)

for all x, y ∈ C, then f is convex over the convex set C.
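One possible route for the proof (a hint, not the text's own solution) applies the mean value theorem to φ(t) = f(x + t(y − x)) on [0, 1]:

```latex
% Mean value theorem: there exists \bar{t} \in (0,1), with
% z = x + \bar{t}\,(y - x), such that
f(y) - f(x) = g(z)^T (y - x).
% Applying (61) to the pair (z, x), where z - x = \bar{t}\,(y - x):
(g(z) - g(x))^T (z - x) = \bar{t}\,(g(z) - g(x))^T (y - x) \ge 0
\;\Longrightarrow\; g(z)^T (y - x) \ge g(x)^T (y - x).
% Combining the two displays yields
f(y) \ge f(x) + g(x)^T (y - x),
```

which is the first form of the first order condition, valid for all x, y ∈ C, so f is convex over C.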
Exercise 2.6 (Hessian of Rosenbrock's function) We have seen in exercise 1.1 that the Hessian matrix of a quadratic function f(x) = b^T x + (1/2) x^T H x is simply the matrix H. In this case, proposition 2.7 states that if the matrix H is positive definite, then f is convex. On the other hand, for a general function, there might be some points where the Hessian matrix is positive definite and other points where it is indefinite. Hence the Hessian positivity is only local, as opposed to the global behaviour of a quadratic function. Consider Rosenbrock's function defined by equation 10. Use Scilab to prove that the Hessian matrix is positive definite at the point x = (1, 1)^T. Check that it is indefinite at the point x = (0, 1)^T. Make a random walk in the interval [−2, 2] × [−1, 2] and check that many points are associated with an indefinite Hessian matrix.
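The checks above can be sketched in Scilab as follows. This is a minimal sketch, not the text's own solution: the helper `rosenbrockHessian`, the sample count `n`, and the analytic Hessian entries (derived here from f(x) = 100 (x2 − x1^2)^2 + (1 − x1)^2) are assumptions; `spec` returns the eigenvalues of a matrix.

```scilab
// Analytic Hessian of Rosenbrock's function (derived by hand, an assumption)
function H = rosenbrockHessian ( x )
    H = [
        1200*x(1)^2 - 400*x(2) + 2, -400*x(1)
        -400*x(1),                   200
    ];
endfunction

// At x = (1,1)^T the eigenvalues are both positive: positive definite
disp(spec(rosenbrockHessian([1;1])))
// At x = (0,1)^T one eigenvalue is negative: indefinite
disp(spec(rosenbrockHessian([0;1])))

// Random walk in [-2,2] x [-1,2]: count points with an indefinite Hessian
indefinite = 0;
n = 1000;
for i = 1 : n
    x = [grand(1,1,"unf",-2,2); grand(1,1,"unf",-1,2)];
    if min(spec(rosenbrockHessian(x))) < 0 then
        indefinite = indefinite + 1;
    end
end
mprintf("Indefinite Hessian at %d of %d sampled points\n", indefinite, n)
```

Since the (2,2) entry is the constant 200, the sign of the determinant 200 (1200 x1^2 − 400 x2 + 2) − 160000 x1^2 alone decides definiteness, which explains why a large region of the box yields indefinite Hessians.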