Jolliffe I. Principal Component Analysis (2nd ed., Springer, 2002)

14.6. Miscellanea

• Linear Approximation Asymmetric PCA. This leads to an equation that is equivalent to (9.3.2). Hence the technique is the same as redundancy analysis, one form of reduced rank regression and PCA of instrumental variables (Sections 9.3.3, 9.3.4, 14.3).

• Cross-correlation Asymmetric PCA. This reduces to finding the SVD of the matrix of covariances between two sets of variables, and so is equivalent to maximum covariance analysis (Section 9.3.3); a numerical sketch is given below.

• Constrained PCA. This technique finds 'principal components' that are constrained to be orthogonal to a space defined by a set of constraint vectors. It is therefore closely related to the idea of projecting orthogonally to the isometric vector for size and shape data (Section 13.2) and is similar to Rao's (1964) PCA uncorrelated with instrumental variables (Section 14.3). A soft-constraint version of this technique, giving a compromise between constrained PCA and ordinary PCA, is discussed in Diamantaras and Kung (1996, Section 7.3). A sketch of the hard-constraint idea is given below.

• Oriented PCA. In general terms, the objective is to find a_1, a_2, ..., a_k, ... that successively maximize the ratio a_k' S_1 a_k / a_k' S_2 a_k, where S_1, S_2 are two covariance matrices. Diamantaras and Kung (1996, Section 7.2) note that special cases include canonical discriminant analysis (Section 9.1) and maximization of a signal to noise ratio (Sections 12.4.3, 14.2.2). A sketch of this maximization is given below.

Xu and Yuille (1992) describe a neural network approach based on statistical physics that gives a robust version of PCA (see Section 10.4). Fancourt and Principe (1998) propose a network that is tailored to find PCs for locally stationary time series.

As well as using neural networks to find PCs, the PCs can also be used as inputs to networks designed for other purposes. Diamantaras and Kung (1996, Section 4.6) give examples in which PCs are used as inputs to discriminant analysis (Section 9.1) and image processing. McGinnis (2000) uses them in a neural network approach to predicting snowpack accumulation from 700 mb geopotential heights.

14.6.2 Principal Components for Goodness-of-Fit Statistics

The context of this application of PCA is testing whether or not a (univariate) set of data y_1, y_2, ..., y_n could have arisen from a given probability distribution with cumulative distribution function G(y); that is, we want a goodness-of-fit test. If the transformation

    x_i = G(y_i),    i = 1, 2, ..., n

is made, then we can equivalently test whether or not x_1, x_2, ..., x_n are from a uniform distribution on the range (0, 1) (a sketch of this transformation is given below). Assume, without loss of
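To illustrate the cross-correlation asymmetric PCA bullet above: the weight vectors for the two sets of variables are the left and right singular vectors of the matrix of covariances between the sets, and the leading singular value is the maximized covariance. A minimal sketch, assuming NumPy; the data matrices X and Y here are synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
X = rng.standard_normal((n, 4))                                            # first set of variables
Y = X[:, :3] @ rng.standard_normal((3, 6)) + rng.standard_normal((n, 6))   # second set of variables

Xc, Yc = X - X.mean(axis=0), Y - Y.mean(axis=0)
Sxy = Xc.T @ Yc / (n - 1)                       # matrix of covariances between the two sets

U, s, Vt = np.linalg.svd(Sxy, full_matrices=False)
a1, b1 = U[:, 0], Vt[0]                         # first pair of weight vectors

# a1' Sxy b1 = s[0] is the maximized covariance between Xc @ a1 and Yc @ b1
print(s[0], np.cov(Xc @ a1, Yc @ b1)[0, 1])
```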
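For the constrained PCA bullet, one way to realize the hard-constraint idea (a sketch under the interpretation that the components must be orthogonal to the span of the constraint vectors, not the book's exact algorithm) is to project onto the orthogonal complement of the constraint space and then diagonalize the projected covariance matrix. Assuming NumPy; the data X and the constraint matrix C are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((150, 6))     # illustrative data
C = rng.standard_normal((6, 2))       # hypothetical constraint vectors (columns)

# Projector onto the orthogonal complement of span(C)
Q, _ = np.linalg.qr(C)
P = np.eye(6) - Q @ Q.T

Xc = X - X.mean(axis=0)
S = np.cov(Xc @ P, rowvar=False)      # covariance after removing the constraint space

evals, evecs = np.linalg.eigh(S)      # eigenvalues in ascending order
order = np.argsort(evals)[::-1]
a = evecs[:, order]                   # constrained 'principal component' directions

print(np.abs(C.T @ a[:, 0]).max())    # ~0: leading direction is orthogonal to the constraints
```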
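The oriented PCA objective is a generalized Rayleigh quotient, so the successive maximizers a_k can be obtained from the generalized eigenproblem S_1 a = lambda S_2 a. A minimal sketch, assuming SciPy; the two covariance matrices are built from illustrative "signal" and "noise" samples.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))          # "signal" data (illustrative)
N = rng.standard_normal((200, 5)) * 0.5    # "noise" data (illustrative)
S1 = np.cov(X, rowvar=False)
S2 = np.cov(N, rowvar=False)

# Symmetric-definite generalized eigenproblem; eigenvalues come back ascending,
# so reverse the order to get the successive maximizers a_1, a_2, ...
evals, evecs = eigh(S1, S2)
order = np.argsort(evals)[::-1]
a = evecs[:, order]

# Check: the first column attains the largest generalized Rayleigh quotient
q = (a[:, 0] @ S1 @ a[:, 0]) / (a[:, 0] @ S2 @ a[:, 0])
print(q, evals[order][0])
```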
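Finally, to illustrate the probability integral transform x_i = G(y_i) that opens Section 14.6.2: a minimal sketch, assuming SciPy and taking G to be a standard normal c.d.f. (a hypothetical choice, just for illustration). Under the null hypothesis the transformed values are uniform on (0, 1), so any uniformity test then serves as a goodness-of-fit test for G; the Kolmogorov-Smirnov test used here is only a stand-in, not the PCA-based statistics the text goes on to develop.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
y = rng.normal(size=200)              # data y_1, ..., y_n to be tested

# Probability integral transform with the hypothesized c.d.f. G
x = stats.norm.cdf(y)                 # x_i = G(y_i)

# Under the null, x_1, ..., x_n are uniform on (0, 1); test that uniformity
print(stats.kstest(x, "uniform"))
```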
