Jolliffe, I. Principal Component Analysis (2nd ed., Springer, 2002)

2.3. Principal Components Using a Correlation Matrix

... as in allometry (Section 13.2) and for compositional data (Section 13.3).

We conclude this section by looking at three interesting properties which hold for PCs derived from the correlation matrix. The first is that the PCs depend not on the absolute values of the correlations, but only on their ratios. This follows because multiplication of all off-diagonal elements of a correlation matrix by the same constant leaves the eigenvectors of the matrix unchanged (Chatfield and Collins, 1989, p. 67).

The second property, which was noted by Hotelling (1933) in his original paper, is that if, instead of the normalization $\alpha_k'\alpha_k = 1$, we use

$$\tilde{\alpha}_k'\tilde{\alpha}_k = \lambda_k, \qquad k = 1, 2, \ldots, p, \tag{2.3.2}$$

then $\tilde{\alpha}_{kj}$, the $j$th element of $\tilde{\alpha}_k$, is the correlation between the $j$th standardized variable $x_j^*$ and the $k$th PC. To see this, note that for $k = 1, 2, \ldots, p$,

$$\tilde{\alpha}_k = \lambda_k^{1/2}\alpha_k, \qquad \operatorname{var}(z_k) = \lambda_k,$$

and the $p$-element vector $\Sigma\alpha_k$ has as its $j$th element the covariance between $x_j^*$ and $z_k$. But $\Sigma\alpha_k = \lambda_k\alpha_k$, so the covariance between $x_j^*$ and $z_k$ is $\lambda_k\alpha_{kj}$. Also $\operatorname{var}(x_j^*) = 1$, and the correlation between $x_j^*$ and $z_k$ is therefore

$$\frac{\lambda_k\alpha_{kj}}{[\operatorname{var}(x_j^*)\operatorname{var}(z_k)]^{1/2}} = \lambda_k^{1/2}\alpha_{kj} = \tilde{\alpha}_{kj},$$

as required.

Because of this property the normalization (2.3.2) is quite often used, in particular in computer packages, but it has the disadvantage that it is less easy to informally interpret and compare a set of PCs when each PC has a different normalization on its coefficients. This remark is, of course, relevant to sample, rather than population, PCs, but, as with some other parts of the chapter, it is included here to avoid a possibly disjointed presentation.

Both of these properties that hold for correlation matrices can be modified for covariance matrices, but the results are, in each case, less straightforward.

The third property is sufficiently substantial to deserve a label. It is included in this section because, at first sight, it is specific to correlation matrix PCA although, as we will see, its implications are much wider. Proofs of the result are available in the references cited below and will not be reproduced here.

Property A6. For any integer $q$, $1 \le q \le p$, consider the orthonormal linear transformation

$$y = B'x, \tag{2.3.3}$$

as defined in Property A1. Let $R^2_{j:q}$ be the squared multiple correlation between $x_j$ and the $q$ variables $y_1, y_2, \ldots, y_q$ defined by the elements of $y$.
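The first two correlation-matrix properties above lend themselves to a quick numerical check. The sketch below is not from the book; it uses NumPy on simulated data, and the sample size, the simulated data set, the random seed and the constant c are arbitrary assumptions chosen only for illustration. It verifies (a) that multiplying every off-diagonal element of a correlation matrix by the same constant leaves its eigenvectors unchanged, which holds because the rescaled matrix is $cR + (1-c)I$ and so shares the eigenvectors of $R$ with eigenvalues $c\lambda_k + (1-c)$, and (b) that coefficients rescaled as in (2.3.2), $\tilde{\alpha}_k = \lambda_k^{1/2}\alpha_k$, equal the sample correlations between the standardized variables and the PC scores.

```python
# Minimal numerical sketch (illustrative assumptions, not from the book).
import numpy as np

rng = np.random.default_rng(0)
n, p = 500, 4
# Simulated correlated data: n observations on p variables.
X = rng.standard_normal((n, p)) @ rng.standard_normal((p, p))

# Sample correlation matrix and its eigendecomposition
# (sample analogues stand in for the population quantities).
R = np.corrcoef(X, rowvar=False)
lam, A = np.linalg.eigh(R)              # columns of A are the alpha_k
order = np.argsort(lam)[::-1]           # sort eigenvalues in decreasing order
lam, A = lam[order], A[:, order]

# (a) Multiplying every off-diagonal element r_ij by the same constant c
# gives c*R + (1 - c)*I, so the eigenvectors are unchanged
# (assuming distinct eigenvalues, as here; signs remain arbitrary).
c = 0.5
R_scaled = c * R + (1 - c) * np.eye(p)  # off-diagonals become c*r_ij, diagonal stays 1
lam2, A2 = np.linalg.eigh(R_scaled)
lam2, A2 = lam2[::-1], A2[:, ::-1]      # descending order; preserved since c > 0
print(np.allclose(np.abs(A), np.abs(A2)))   # True (up to sign)

# (b) With normalization (2.3.2), alpha_tilde_k = lam_k^{1/2} * alpha_k equals
# the vector of correlations between the standardized variables x*_j and
# the PC scores z_k.
X_std = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
Z = X_std @ A                           # PC scores z_k
A_tilde = A * np.sqrt(lam)              # column k scaled by lam_k^{1/2}
corr_xz = np.array([[np.corrcoef(X_std[:, j], Z[:, k])[0, 1]
                     for k in range(p)] for j in range(p)])
print(np.allclose(corr_xz, A_tilde))    # True
```

Both printed checks return True on this simulated sample; the second equality is exact up to floating-point error because the sample correlation matrix, the scores and the rescaled coefficients are all computed from the same standardized data.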
