Convex Optimization, v2009.01.01 (convexoptimization.com)


E.6.4.1.1 Example. Eigenvalues λ as coefficients of orthogonal projection.
Let $\mathcal{C}$ represent any convex subset of subspace $\mathbb{S}^M$, and let $C_1$ be any element of $\mathcal{C}$. Then $C_1$ can be expressed as the orthogonal expansion
\[
C_1 \;=\; \sum_{i=1}^{M}\,\sum_{\substack{j=1\\ j\ge i}}^{M} \langle E_{ij},\,C_1\rangle\, E_{ij} \;\in\; \mathcal{C} \subset \mathbb{S}^M \tag{1856}
\]
where $E_{ij}\in\mathbb{S}^M$ is a member of the standard orthonormal basis for $\mathbb{S}^M$ (52). This expansion is a sum of one-dimensional orthogonal projections of $C_1$; each projection on the range of a vectorized standard basis matrix. Vector inner-product $\langle E_{ij},\,C_1\rangle$ is the coefficient of projection of $\operatorname{svec} C_1$ on $\mathcal{R}(\operatorname{svec} E_{ij})$.

When $C_1$ is any member of a convex set $\mathcal{C}$ whose dimension is $L$, Carathéodory's theorem [96] [266] [173] [36] [37] guarantees that no more than $L+1$ affinely independent members from $\mathcal{C}$ are required to faithfully represent $C_1$ by their linear combination. [E.11] Dimension of $\mathbb{S}^M$ is $L = M(M+1)/2$ in isometrically isomorphic $\mathbb{R}^{M(M+1)/2}$. Yet because any symmetric matrix can be diagonalized (§A.5.2), $C_1\in\mathbb{S}^M$ is a linear combination of its $M$ eigenmatrices $q_i q_i^{\mathrm T}$ (§A.5.1) weighted by its eigenvalues $\lambda_i$;
\[
C_1 \;=\; Q\Lambda Q^{\mathrm T} \;=\; \sum_{i=1}^{M} \lambda_i\, q_i q_i^{\mathrm T} \tag{1857}
\]
where $\Lambda\in\mathbb{S}^M$ is a diagonal matrix having $\delta(\Lambda)_i = \lambda_i$, and $Q = [\,q_1 \cdots q_M\,]$ is an orthogonal matrix in $\mathbb{R}^{M\times M}$ containing corresponding eigenvectors.

[E.11] Carathéodory's theorem guarantees existence of a biorthogonal expansion for any element in $\operatorname{aff}\mathcal{C}$ when $\mathcal{C}$ is any pointed closed convex cone.
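As a concrete check of (1856) and (1857), the following minimal numpy sketch builds the standard orthonormal basis $\{E_{ij}\}$ for $\mathbb{S}^M$, recovers a symmetric matrix from its coefficients of projection, and compares with its eigendecomposition. The helper names `svec` and `standard_basis` are ad hoc, not from the text, and the basis construction assumes the usual $1/\sqrt{2}$ scaling of the off-diagonal elements consistent with (52).

```python
import numpy as np

def svec(A):
    """Symmetric vectorization: lower-triangle entries of A stacked, with
    off-diagonal entries scaled by sqrt(2), so <A,B> = svec(A) @ svec(B)."""
    i, j = np.tril_indices(A.shape[0])
    w = np.where(i == j, 1.0, np.sqrt(2.0))
    return w * A[i, j]

def standard_basis(M):
    """Standard orthonormal basis {E_ij} for S^M:
    E_ii = e_i e_i^T and E_ij = (e_i e_j^T + e_j e_i^T)/sqrt(2) for i < j."""
    I = np.eye(M)
    return [np.outer(I[i], I[i]) if i == j
            else (np.outer(I[i], I[j]) + np.outer(I[j], I[i])) / np.sqrt(2)
            for i in range(M) for j in range(i, M)]

M = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((M, M))
C1 = (A + A.T) / 2                            # an arbitrary element of S^M

# (1856): C1 equals the sum of its one-dimensional orthogonal projections
basis = standard_basis(M)
coeffs = [np.trace(E @ C1) for E in basis]    # <E_ij, C1>
assert np.allclose(sum(c * E for c, E in zip(coeffs, basis)), C1)

# each coefficient is the projection coefficient of svec C1 on R(svec E_ij)
assert np.allclose(coeffs, [svec(E) @ svec(C1) for E in basis])

# (1857): eigendecomposition of the same C1
lam, Q = np.linalg.eigh(C1)
assert np.allclose(sum(lam[k] * np.outer(Q[:, k], Q[:, k]) for k in range(M)), C1)
```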

To derive eigendecomposition (1857) from expansion (1856), the $M$ standard basis matrices $E_{ij}$ are rotated (§B.5) into alignment with the $M$ eigenmatrices $q_i q_i^{\mathrm T}$ of $C_1$ by applying a similarity transformation; [287, §5.6]
\[
\{\,Q E_{ij} Q^{\mathrm T}\,\} \;=\; \left\{\; q_i q_i^{\mathrm T},\ i=j=1\ldots M \;;\quad \tfrac{1}{\sqrt{2}}\bigl(q_i q_j^{\mathrm T} + q_j q_i^{\mathrm T}\bigr),\ 1\le i<j\le M \;\right\} \tag{1858}
\]
which remains an orthonormal basis for $\mathbb{S}^M$. Then remarkably
\[
\begin{aligned}
C_1 \;&=\; \sum_{\substack{i,j=1\\ j\ge i}}^{M} \langle Q E_{ij} Q^{\mathrm T},\,C_1\rangle\, Q E_{ij} Q^{\mathrm T}\\
&=\; \sum_{i=1}^{M} \langle q_i q_i^{\mathrm T},\,C_1\rangle\, q_i q_i^{\mathrm T} \;+\; \sum_{\substack{i,j=1\\ j> i}}^{M} \langle Q E_{ij} Q^{\mathrm T},\,Q\Lambda Q^{\mathrm T}\rangle\, Q E_{ij} Q^{\mathrm T}\\
&=\; \sum_{i=1}^{M} \langle q_i q_i^{\mathrm T},\,C_1\rangle\, q_i q_i^{\mathrm T}\\
&\triangleq\; \sum_{i=1}^{M} \langle P_i,\,C_1\rangle\, P_i \;=\; \sum_{i=1}^{M} q_i q_i^{\mathrm T}\, C_1\, q_i q_i^{\mathrm T} \;=\; \sum_{i=1}^{M} P_i\, C_1 P_i\\
&=\; \sum_{i=1}^{M} \lambda_i\, q_i q_i^{\mathrm T}
\end{aligned} \tag{1859}
\]
this orthogonal expansion becomes the diagonalization; still a sum of one-dimensional orthogonal projections. The eigenvalues
\[
\lambda_i \;=\; \langle q_i q_i^{\mathrm T},\,C_1\rangle \tag{1860}
\]
are clearly coefficients of projection of $C_1$ on the range of each vectorized eigenmatrix (confer §E.6.2.1.1). The remaining $M(M-1)/2$ coefficients ($i\neq j$) are zeroed by projection. When $P_i$ is rank-one symmetric as in (1859),
\[
\mathcal{R}(\operatorname{svec} P_i C_1 P_i) \;=\; \mathcal{R}(\operatorname{svec} q_i q_i^{\mathrm T}) \;=\; \mathcal{R}(\operatorname{svec} P_i) \quad\text{in } \mathbb{R}^{M(M+1)/2} \tag{1861}
\]
and
\[
P_i C_1 P_i - C_1 \;\perp\; P_i \quad\text{in } \mathbb{R}^{M(M+1)/2} \tag{1862}
\]

E.6.4.2 Positive semidefiniteness test as orthogonal projection
For any given $X\in\mathbb{R}^{m\times m}$, the familiar quadratic construct $y^{\mathrm T} X y \ge 0$, over broad domain, is a fundamental test for positive semidefiniteness (§A.2). It is a fact that $y^{\mathrm T} X y$ is always proportional to a coefficient of orthogonal projection; letting $z$ in formula (1851) become $y\in\mathbb{R}^m$, then $P_2 = P_1 = y y^{\mathrm T}/(y^{\mathrm T} y) = y y^{\mathrm T}/\|y y^{\mathrm T}\|_2$ (confer (1496)) and formula (1852) becomes
\[
\frac{\langle y y^{\mathrm T},\,X\rangle}{\langle y y^{\mathrm T},\,y y^{\mathrm T}\rangle}\; y y^{\mathrm T}
\;=\; \frac{y^{\mathrm T} X y}{y^{\mathrm T} y}\,\frac{y y^{\mathrm T}}{y^{\mathrm T} y}
\;=\; \frac{y y^{\mathrm T}}{y^{\mathrm T} y}\; X\; \frac{y y^{\mathrm T}}{y^{\mathrm T} y}
\;\triangleq\; P_1 X P_1 \tag{1863}
\]
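The chain (1858) through (1860) can likewise be checked numerically: the rotated basis $\{Q E_{ij} Q^{\mathrm T}\}$ stays orthonormal, the cross coefficients ($i\neq j$) vanish, and the surviving coefficients are the eigenvalues. A sketch under the same assumptions as before (numpy, ad hoc helper names):

```python
import numpy as np

M = 4
rng = np.random.default_rng(1)
A = rng.standard_normal((M, M))
C1 = (A + A.T) / 2
lam, Q = np.linalg.eigh(C1)

def rotated_basis(Q):
    """Rotated basis (1858): Q E_ij Q^T for each standard basis matrix E_ij."""
    M = Q.shape[0]
    B = {}
    for i in range(M):
        for j in range(i, M):
            if i == j:
                B[i, j] = np.outer(Q[:, i], Q[:, i])
            else:
                B[i, j] = (np.outer(Q[:, i], Q[:, j])
                           + np.outer(Q[:, j], Q[:, i])) / np.sqrt(2)
    return B

B = rotated_basis(Q)

# still an orthonormal basis for S^M: the Gram matrix of inner products is identity
keys = list(B)
G = np.array([[np.trace(B[a] @ B[b]) for b in keys] for a in keys])
assert np.allclose(G, np.eye(len(keys)))

# (1859)/(1860): cross coefficients are zeroed, diagonal coefficients are eigenvalues
for (i, j), Bij in B.items():
    c = np.trace(Bij @ C1)
    assert np.isclose(c, lam[i]) if i == j else np.isclose(c, 0.0)

# expansion in the rotated basis is exactly the diagonalization
assert np.allclose(sum(np.trace(Bij @ C1) * Bij for Bij in B.values()), C1)
```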

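Finally, identity (1863) behind the positive semidefiniteness test: projecting a symmetric $X$ on the range of a vectorized dyad $y y^{\mathrm T}$ yields $P_1 X P_1$, with coefficient proportional to $y^{\mathrm T} X y$. A small numerical sketch (numpy; the symmetrization of $X$ is for illustration only):

```python
import numpy as np

m = 5
rng = np.random.default_rng(2)
X = rng.standard_normal((m, m))
X = (X + X.T) / 2                 # symmetric, not necessarily PSD
y = rng.standard_normal(m)

P1 = np.outer(y, y) / (y @ y)     # projector y y^T / y^T y, as in the text

# (1863): three equivalent forms of the projection of X on R(svec y y^T)
lhs = (np.trace(np.outer(y, y) @ X)
       / np.trace(np.outer(y, y) @ np.outer(y, y))) * np.outer(y, y)
mid = (y @ X @ y) / (y @ y) * P1
rhs = P1 @ X @ P1
assert np.allclose(lhs, mid) and np.allclose(mid, rhs)

# y^T X y / y^T y is the coefficient of that rank-one projection;
# symmetric X is PSD exactly when this coefficient is nonnegative for every y
print("projection coefficient:", (y @ X @ y) / (y @ y))
print("min eigenvalue of X   :", np.linalg.eigvalsh(X).min())
```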
