v2010.10.26 - Convex Optimization


634 APPENDIX B. SIMPLE MATRICES

B.1 Rank-one matrix (dyad)

Any matrix formed from the unsigned outer product of two vectors,
$$\Psi \;=\; uv^{\mathrm T} \in \mathbb{R}^{M\times N} \tag{1606}$$
where $u \in \mathbb{R}^M$ and $v \in \mathbb{R}^N$, is rank-one and called a dyad. Conversely, any rank-one matrix must have the form $\Psi$. [202, prob. 1.4.1] The product $-uv^{\mathrm T}$ is a negative dyad. For matrix products $AB^{\mathrm T}$, in general, we have
$$\mathcal{R}(AB^{\mathrm T}) \subseteq \mathcal{R}(A)\,, \qquad \mathcal{N}(AB^{\mathrm T}) \supseteq \mathcal{N}(B^{\mathrm T}) \tag{1607}$$
with equality when $B = A$ [331, §3.3, §3.6]$^{\text{B.1}}$ or, respectively, when $B$ is invertible and $\mathcal{N}(A) = 0$. Yet for all nonzero dyads we have
$$\mathcal{R}(uv^{\mathrm T}) = \mathcal{R}(u)\,, \qquad \mathcal{N}(uv^{\mathrm T}) = \mathcal{N}(v^{\mathrm T}) \equiv v^\perp \tag{1608}$$
where $\dim v^\perp = N-1$.

It is obvious a dyad can be 0 only when $u$ or $v$ is 0:
$$\Psi = uv^{\mathrm T} = 0 \;\Leftrightarrow\; u = 0 \ \text{or}\ v = 0 \tag{1609}$$

The matrix 2-norm for $\Psi$ is equivalent to the Frobenius norm:
$$\|\Psi\|_2 = \sigma_1 = \|uv^{\mathrm T}\|_F = \|uv^{\mathrm T}\|_2 = \|u\|\,\|v\| \tag{1610}$$

When $u$ and $v$ are normalized, the pseudoinverse is the transposed dyad. Otherwise,
$$\Psi^\dagger = (uv^{\mathrm T})^\dagger = \frac{vu^{\mathrm T}}{\|u\|^2\,\|v\|^2} \tag{1611}$$

$^{\text{B.1}}$ Proof. $\mathcal{R}(AA^{\mathrm T}) \subseteq \mathcal{R}(A)$ is obvious.
$$\mathcal{R}(AA^{\mathrm T}) = \{AA^{\mathrm T}y \mid y \in \mathbb{R}^m\} \supseteq \{AA^{\mathrm T}y \mid A^{\mathrm T}y \in \mathcal{R}(A^{\mathrm T})\} = \mathcal{R}(A) \quad \text{by (142)} \qquad \diamond$$
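The identities for the dyad's norms (1610) and pseudoinverse (1611) are easy to check numerically. The following sketch uses NumPy with arbitrary hypothetical vectors $u$ and $v$ (dimensions and seed chosen here for illustration, not taken from the text):

```python
import numpy as np

# Hypothetical example vectors; u ∈ R^M, v ∈ R^N with M = 4, N = 3.
rng = np.random.default_rng(0)
u = rng.standard_normal(4)
v = rng.standard_normal(3)

Psi = np.outer(u, v)  # dyad Ψ = u vᵀ, rank one by construction

# Rank-one: any nonzero outer product has matrix rank 1.
assert np.linalg.matrix_rank(Psi) == 1

# (1610): ‖Ψ‖₂ = σ₁ = ‖Ψ‖_F = ‖u‖‖v‖ for a dyad.
assert np.isclose(np.linalg.norm(Psi, 2), np.linalg.norm(Psi, 'fro'))
assert np.isclose(np.linalg.norm(Psi, 2),
                  np.linalg.norm(u) * np.linalg.norm(v))

# (1611): Ψ† = v uᵀ / (‖u‖² ‖v‖²), matching the general pseudoinverse.
Psi_pinv = np.outer(v, u) / (np.linalg.norm(u)**2 * np.linalg.norm(v)**2)
assert np.allclose(Psi_pinv, np.linalg.pinv(Psi))

print("all dyad identities verified")
```

Note that the Frobenius and 2-norms coincide here precisely because a dyad has only one nonzero singular value $\sigma_1 = \|u\|\,\|v\|$; for matrices of rank greater than one the two norms generally differ.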
