

So convex problem (1260) is equivalent to the semidefinite program

$$
\begin{array}{cl}
\underset{D,\,Y}{\text{minimize}} & -\operatorname{tr}\bigl(V(D-2Y)V\bigr)\\[4pt]
\text{subject to} & \begin{bmatrix} d_{ij} & y_{ij}\\ y_{ij} & h_{ij}^{2} \end{bmatrix} \succeq 0\,,\qquad N\geq j>i=1\ldots N-1\\[6pt]
& Y\in\mathbb{S}_h^N\\[2pt]
& D\in\mathrm{EDM}^N
\end{array}
\tag{1264}
$$

where the constants $h_{ij}^{2}$ and $N$ have been dropped arbitrarily from the objective.
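For concreteness, here is a minimal numerical sketch of (1264), not taken from the text, using CVXPY. The problem size $N$ and the squared-distance data $H=[h_{ij}^{2}]$ are hypothetical (generated from a random point list), and membership $D\in\mathrm{EDM}^N$ is imposed via the Schoenberg-type condition $-VDV\succeq 0$ together with hollowness, where $V=I-\tfrac1N\mathbf{1}\mathbf{1}^{\mathrm T}$ is the geometric centering matrix.

```python
import numpy as np
import cvxpy as cp

# Hypothetical data: noisy squared-distance measurements H = [h_ij^2]
# generated from a random point list (purely illustrative).
N = 5
rng = np.random.default_rng(0)
X0 = rng.standard_normal((2, N))
H = np.square(np.linalg.norm(X0[:, :, None] - X0[:, None, :], axis=0))
H += 0.1 * rng.uniform(size=(N, N))
H = (H + H.T) / 2
np.fill_diagonal(H, 0)

V = np.eye(N) - np.ones((N, N)) / N          # geometric centering matrix

D = cp.Variable((N, N), symmetric=True)      # candidate EDM
Y = cp.Variable((N, N), symmetric=True)      # auxiliary variable from (1260)

constraints = [cp.diag(D) == 0,              # D in S^N_h (hollow)
               cp.diag(Y) == 0,              # Y in S^N_h
               -V @ D @ V >> 0]              # Schoenberg-type test: D in EDM^N
for i in range(N - 1):
    for j in range(i + 1, N):                # N >= j > i = 1 ... N-1
        constraints += [cp.bmat([[D[i, j], Y[i, j]],
                                 [Y[i, j], H[i, j]]]) >> 0]

prob = cp.Problem(cp.Minimize(-cp.trace(V @ (D - 2 * Y) @ V)), constraints)
prob.solve()
```

The double loop simply enumerates the $2\times 2$ blocks over all index pairs $N\geq j>i$.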

7.2.1.2 Gram-form semidefinite program, Problem 2, convex case

There is great advantage to expressing problem statement (1264) in Gram-form because Gram matrix G is a bidirectional bridge between point list X and distance matrix D; e.g., Example 5.4.2.2.4, Example 6.4.0.0.1. This way, problem convexity can be maintained while simultaneously constraining point list X, Gram matrix G, and distance matrix D at our discretion.

Convex problem (1264) may be equivalently written via linear bijective (5.6.1) EDM operator $\mathcal{D}(G)$ (810);

$$
\begin{array}{cl}
\underset{G\,\in\,\mathbb{S}_c^N,\;Y\,\in\,\mathbb{S}_h^N}{\text{minimize}} & -\operatorname{tr}\bigl(V(\mathcal{D}(G)-2Y)V\bigr)\\[4pt]
\text{subject to} & \begin{bmatrix} \langle\Phi_{ij}\,,\,G\rangle & y_{ij}\\ y_{ij} & h_{ij}^{2} \end{bmatrix} \succeq 0\,,\qquad N\geq j>i=1\ldots N-1\\[6pt]
& G\succeq 0
\end{array}
\tag{1265}
$$
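The Gram-form program (1265) admits a similar CVXPY sketch, again with hypothetical data. Here the EDM operator is assembled explicitly as $\mathcal{D}(G)=\delta(G)\mathbf{1}^{\mathrm T}+\mathbf{1}\,\delta(G)^{\mathrm T}-2G$ (the operator referenced as (810)), and $G\in\mathbb{S}_c^N$ is imposed as $G\mathbf{1}=0$.

```python
import numpy as np
import cvxpy as cp

# Hypothetical squared-distance data H = [h_ij^2], as before.
N = 5
rng = np.random.default_rng(1)
X0 = rng.standard_normal((2, N))
H = np.square(np.linalg.norm(X0[:, :, None] - X0[:, None, :], axis=0))

V = np.eye(N) - np.ones((N, N)) / N                    # geometric centering matrix
ones = np.ones((N, 1))

G = cp.Variable((N, N), symmetric=True)                # Gram matrix
Y = cp.Variable((N, N), symmetric=True)

dg = cp.reshape(cp.diag(G), (N, 1))                    # delta(G) as a column
DG = dg @ ones.T + ones @ dg.T - 2 * G                 # EDM operator D(G), cf. (810)

constraints = [G >> 0,                                 # G in S^N_+
               G @ np.ones(N) == 0,                    # G in S^N_c (geometrically centered)
               cp.diag(Y) == 0]                        # Y in S^N_h
for i in range(N - 1):
    for j in range(i + 1, N):
        constraints += [cp.bmat([[DG[i, j], Y[i, j]],  # <Phi_ij, G> = D(G)_ij
                                 [Y[i, j],  H[i, j]]]) >> 0]

prob = cp.Problem(cp.Minimize(-cp.trace(V @ (DG - 2 * Y) @ V)), constraints)
prob.solve()
```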

where distance-square $D=[d_{ij}]\in\mathbb{S}_h^N$ (794) is related to Gram matrix entries $G=[g_{ij}]\in\mathbb{S}_c^N\cap\mathbb{S}_+^N$ by

$$
d_{ij} \;=\; g_{ii}+g_{jj}-2g_{ij} \;=\; \langle\Phi_{ij}\,,\,G\rangle
\tag{809}
$$

where

$$
\Phi_{ij} \;=\; (e_i-e_j)(e_i-e_j)^{\mathrm T} \in \mathbb{S}_+^N
\tag{796}
$$
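As a quick numeric check, not from the text, identity (809) and its inner-product form $\langle\Phi_{ij},G\rangle$ can be verified for any centered point list; the list below is arbitrary.

```python
import numpy as np

# Hypothetical centered point list X (2 x N); G and D built directly from it.
rng = np.random.default_rng(2)
N = 4
X = rng.standard_normal((2, N))
X -= X.mean(axis=1, keepdims=True)            # center so that G lies in S^N_c
G = X.T @ X                                   # Gram matrix, G in S^N_c and S^N_+
D = np.square(np.linalg.norm(X[:, :, None] - X[:, None, :], axis=0))

I = np.eye(N)
for i in range(N):
    for j in range(N):
        e = I[:, i] - I[:, j]
        Phi = np.outer(e, e)                  # Phi_ij = (e_i - e_j)(e_i - e_j)^T
        assert np.isclose(D[i, j], G[i, i] + G[j, j] - 2 * G[i, j])   # (809)
        assert np.isclose(D[i, j], np.trace(Phi @ G))                 # <Phi_ij, G>
```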
