Introduction to Unconstrained Optimization - Scilab
[Figure: contour plot omitted]
Figure 27: The contours of a quadratic function.
In order to check that the computations are correct, we use the derivative function.
-->[gfd, Hfd] = derivative(quadratic, x, H_form='blockmat')
 Hfd  =
    2.    1.
    1.    4.
 gfd  =
    4.    7.
We finally compute the relative error between the computed gradient and Hessian and the finite difference formulas.
-->norm(g - gfd')/norm(g)
 ans  =
    3.435D-12
-->norm(H - Hfd)/norm(H)
 ans  =
    0.
The relative error for the gradient indicates that there are approximately 12 significant digits. Therefore, our gradient is accurate. The Hessian matrix is exact.
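The same consistency check can be reproduced outside Scilab. The sketch below, in Python, assumes a hypothetical quadratic f(x) = x1² + x1·x2 + 2·x2² + x1 + 2·x2 evaluated at x = (1, 1) — chosen only because it is consistent with the gradient [4, 7] and Hessian [[2, 1], [1, 4]] printed in the session above — and compares the analytic gradient against central finite differences, mimicking what the derivative function does internally.

```python
import math

def f(x):
    # Hypothetical quadratic chosen so that, at x = (1, 1),
    # the gradient is [4, 7] and the Hessian is [[2, 1], [1, 4]],
    # matching the Scilab session above.
    x1, x2 = x
    return x1**2 + x1*x2 + 2*x2**2 + x1 + 2*x2

def grad(x):
    # Analytic gradient of f
    x1, x2 = x
    return [2*x1 + x2 + 1, x1 + 4*x2 + 2]

def fd_grad(x, h=1e-6):
    # Central finite differences: (f(x + h*e_i) - f(x - h*e_i)) / (2h)
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def vnorm(v):
    # Euclidean norm of a vector
    return math.sqrt(sum(vi * vi for vi in v))

x = [1.0, 1.0]
g = grad(x)
gfd = fd_grad(x)
relerr = vnorm([a - b for a, b in zip(g, gfd)]) / vnorm(g)
print(g, gfd, relerr)
```

A small relative error (on the order of the square root of machine precision for this step size) confirms that the analytic gradient agrees with the finite-difference approximation.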
3.2 Answers for section 2.8
Answer of Exercise 2.1 (Convex hull - 1)
Before detailing the proof itself, we establish an auxiliary result which will help us in the design of the proof. We are going to prove that a convex combination of 2 points can be combined with a third point so that the result is a convex combination of the 3 points. Let us suppose that C is a convex set and let us assume that three points x1, x2, x3 are in C. Let us define x̄2 as a convex combination of x1 and x2, i.e.

x̄2 = θ2 x1 + (1 − θ2) x2,    (64)

with 0 ≤ θ2 ≤ 1. Let us define x̄3 as a convex combination of x̄2 and x3, i.e.

x̄3 = θ3 x̄2 + (1 − θ3) x3,    (65)
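Substituting the first combination into the second makes the auxiliary claim explicit (a one-step expansion, using only the definitions above):

```latex
\bar{x}_3 = \theta_3\bigl(\theta_2 x_1 + (1-\theta_2)x_2\bigr) + (1-\theta_3)x_3
          = \theta_3\theta_2\,x_1 + \theta_3(1-\theta_2)\,x_2 + (1-\theta_3)\,x_3 .
```

The three coefficients are nonnegative, since 0 ≤ θ2, θ3 ≤ 1, and they sum to θ3θ2 + θ3(1 − θ2) + (1 − θ3) = 1, so x̄3 is indeed a convex combination of the three points x1, x2, x3.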