v2007.09.13 - Convex Optimization

E.3. SYMMETRIC IDEMPOTENT MATRICES

Proof is straightforward: the vector 2-norm is a convex function. Setting the gradient of the norm-square to 0 and applying D.2,

    (A^T A B Z^T − A^T (I − AA†)) xx^T A = 0
    ⇔
    A^T A B Z^T xx^T A = 0                                  (1672)

because A^T = A^T AA†. The projector P = AA† is therefore unique; the minimum-distance projector is the orthogonal projector, and vice versa.

Because P = AA†, this projection matrix must be symmetric. Then for any matrix A ∈ R^{m×n}, symmetric idempotent P projects a given vector x in R^m orthogonally on R(A). Under either condition (1670) or (1671), the projection Px is the unique minimum-distance projection; for subspaces, the perpendicularity and minimum-distance conditions are equivalent.

E.3.1 Four subspaces

We summarize the orthogonal projectors projecting on the four fundamental subspaces: for A ∈ R^{m×n}

    A†A     : R^n on R(A†A)     = R(A^T)
    AA†     : R^m on R(AA†)     = R(A)
    I − A†A : R^n on R(I − A†A) = N(A)
    I − AA† : R^m on R(I − AA†) = N(A^T)                    (1673)

For completeness: E.7 (1665)

    N(A†A)     = N(A)
    N(AA†)     = N(A^T)
    N(I − A†A) = R(A^T)
    N(I − AA†) = R(A)                                       (1674)

E.7 Proof is by singular value decomposition (A.6.2): N(A†A) ⊇ N(A) is obvious. Conversely, suppose A†Ax = 0. Then x^T A†Ax = x^T QQ^T x = ‖Q^T x‖² = 0, where A = UΣQ^T is the subcompact singular value decomposition. Because R(Q) = R(A^T), it follows that x ∈ N(A), which implies N(A†A) ⊆ N(A).
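The projector table (1673) and the nullspace identities (1674) are easy to check numerically. The sketch below is mine, not the author's: it assumes NumPy, builds a random rank-deficient A, forms A† via `numpy.linalg.pinv` (the Moore-Penrose pseudoinverse), and verifies that all four projectors are symmetric and idempotent and act on the claimed subspaces.

```python
import numpy as np

# Numerical sketch (illustrative only) of the four orthogonal projectors
# in (1673), using numpy.linalg.pinv for the pseudoinverse A†.
rng = np.random.default_rng(0)
m, n, r = 5, 4, 2
# Random A in R^{m x n} of rank r < min(m, n):
A = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))

A_pinv = np.linalg.pinv(A)
P_RAt = A_pinv @ A            # R^n on R(A†A) = R(A^T)
P_RA  = A @ A_pinv            # R^m on R(AA†) = R(A)
P_NA  = np.eye(n) - P_RAt     # R^n on N(A)
P_NAt = np.eye(m) - P_RA      # R^m on N(A^T)

for P in (P_RAt, P_RA, P_NA, P_NAt):
    assert np.allclose(P, P.T)       # symmetric
    assert np.allclose(P @ P, P)     # idempotent

x = rng.standard_normal(n)
# (I - A†A)x lies in N(A), consistent with (1673):
assert np.allclose(A @ (P_NA @ x), 0)
# A(A†A)x = Ax because AA†A = A, so x and A†Ax differ by an element of N(A):
assert np.allclose(A @ (P_RAt @ x), A @ x)
```

The same experiment also confirms N(A†A) = N(A) from (1674): A annihilates exactly those x that A†A annihilates.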
