v2009.01.01 - Convex Optimization

580 APPENDIX B. SIMPLE MATRICES

Proof. Figure 132 shows the four fundamental subspaces for the dyad.

Linear operator Ψ : R^N → R^M provides a map between vector spaces that remain distinct when M = N ;

u ∈ R(uv^T)
u ∈ N(uv^T) ⇔ v^T u = 0
R(uv^T) ∩ N(uv^T) = ∅        (1501)

B.1.0.1 rank-one modification

If A ∈ R^{N×N} is any nonsingular matrix and 1 + v^T A^{-1} u ≠ 0, then
[192, App.6] [344, §2.3, prob.16] [127, §4.11.2] (Sherman-Morrison)

(A + uv^T)^{-1} = A^{-1} − (A^{-1} u v^T A^{-1}) / (1 + v^T A^{-1} u)        (1502)

B.1.0.2 dyad symmetry

In the specific circumstance that v = u , then uu^T ∈ R^{N×N} is symmetric, rank-one, and positive semidefinite having exactly N−1 0-eigenvalues. In fact, (Theorem A.3.1.0.7)

uv^T ≽ 0 ⇔ v = u        (1503)

and the remaining eigenvalue is almost always positive;

λ = u^T u = tr(uu^T) > 0 unless u = 0        (1504)

The matrix

[ Ψ     u ]
[ u^T   1 ]        (1505)

for example, is rank-1 positive semidefinite if and only if Ψ = uu^T.
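As a quick numerical illustration (not part of the original text, and assuming numpy is available), the Sherman-Morrison identity (1502) and the symmetric-dyad facts (1503)–(1505) can be checked on random data:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4
A = rng.standard_normal((N, N)) + N * np.eye(N)  # nonsingular, well conditioned
u = rng.standard_normal((N, 1))
v = rng.standard_normal((N, 1))

# Sherman-Morrison (1502): inverse of a rank-one modification of A
Ainv = np.linalg.inv(A)
denom = 1.0 + (v.T @ Ainv @ u).item()            # hypothesis: denom != 0
sm = Ainv - (Ainv @ u @ v.T @ Ainv) / denom      # right-hand side of (1502)
direct = np.linalg.inv(A + u @ v.T)              # left-hand side of (1502)
print(np.allclose(sm, direct))                   # True

# (1503)-(1504): uu^T is PSD with N-1 zero eigenvalues,
# and the remaining eigenvalue equals u^T u = tr(uu^T)
eigs = np.linalg.eigvalsh(u @ u.T)               # ascending order
print(np.allclose(eigs[:-1], 0.0))               # True: N-1 zero eigenvalues
print(np.isclose(eigs[-1], (u.T @ u).item()))    # True: largest eigenvalue = u^T u

# (1505): with Psi = uu^T, the bordered matrix [Psi u; u^T 1] is rank one
M = np.block([[u @ u.T, u], [u.T, np.ones((1, 1))]])
print(np.linalg.matrix_rank(M))                  # 1
```

The bordered matrix in the last check is rank one because it factors as the outer product of the vector [u; 1] with itself.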

B.1.1 Dyad independence<br />

Now we consider a sum of dyads like (1492) as encountered in diagonalization and singular value decomposition:

R( ∑_{i=1}^k s_i w_i^T ) = ∑_{i=1}^k R( s_i w_i^T ) = ∑_{i=1}^k R(s_i) ⇐ w_i ∀ i are l.i.        (1506)
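A hedged numerical sketch of (1506), again assuming numpy: when the w_i are linearly independent, the range of the sum of dyads ∑ s_i w_i^T coincides with the span of the s_i, which we can verify by comparing ranks.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, k = 5, 4, 3

S = rng.standard_normal((M, k))       # columns are the vectors s_i
W = rng.standard_normal((N, k))       # columns are the vectors w_i
assert np.linalg.matrix_rank(W) == k  # l.i. hypothesis of (1506)

sum_dyads = S @ W.T                   # equals sum_i s_i w_i^T

# R(sum_dyads) = span{s_1,...,s_k}: appending the s_i as extra
# columns must not increase the rank
r = np.linalg.matrix_rank(sum_dyads)
print(r)                              # dim span{s_i}
print(np.linalg.matrix_rank(np.hstack([sum_dyads, S])) == r)  # True
```

With random data the s_i are linearly independent almost surely, so the printed rank equals k; the essential point of (1506) is the second check, which holds whenever the w_i are linearly independent.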
