Jolliffe, I. Principal Component Analysis (2nd ed., Springer, 2002)
14.6. Miscellanea

• Linear Approximation Asymmetric PCA. This leads to an equation that is equivalent to (9.3.2). Hence the technique is the same as redundancy analysis, one form of reduced rank regression and PCA of instrumental variables (Sections 9.3.3, 9.3.4, 14.3).

• Cross-correlation Asymmetric PCA. This reduces to finding the SVD of the matrix of covariances between two sets of variables, and so is equivalent to maximum covariance analysis (Section 9.3.3).

• Constrained PCA. This technique finds 'principal components' that are constrained to be orthogonal to a space defined by a set of constraint vectors. It is therefore closely related to the idea of projecting orthogonally to the isometric vector for size and shape data (Section 13.2) and is similar to Rao's (1964) PCA uncorrelated with instrumental variables (Section 14.3). A soft-constraint version of this technique, giving a compromise between constrained PCA and ordinary PCA, is discussed in Diamantaras and Kung (1996, Section 7.3).

• Oriented PCA. In general terms, the objective is to find a_1, a_2, ..., a_k, ... that successively maximize the ratio a'_k S_1 a_k / a'_k S_2 a_k, where S_1, S_2 are two covariance matrices. Diamantaras and Kung (1996, Section 7.2) note that special cases include canonical discriminant analysis (Section 9.1) and maximization of a signal-to-noise ratio (Sections 12.4.3, 14.2.2).

Xu and Yuille (1992) describe a neural network approach based on statistical physics that gives a robust version of PCA (see Section 10.4). Fancourt and Principe (1998) propose a network that is tailored to find PCs for locally stationary time series.

As well as using neural networks to find PCs, the PCs can also be used as inputs to networks designed for other purposes. Diamantaras and Kung (1996, Section 4.6) give examples in which PCs are used as inputs to discriminant analysis (Section 9.1) and image processing.
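The oriented PCA objective, successively maximizing the ratio a'_k S_1 a_k / a'_k S_2 a_k, is equivalent to solving the generalized eigenproblem S_1 a = λ S_2 a. A minimal numerical sketch of this equivalence (the toy matrices and the use of scipy are illustrative, not from the text):

```python
import numpy as np
from scipy.linalg import eigh

def oriented_pca(S1, S2, k):
    """Find directions a_1, ..., a_k that successively maximize the
    generalized Rayleigh quotient (a' S1 a) / (a' S2 a), by solving
    the generalized eigenproblem S1 a = lambda S2 a.
    S2 must be positive definite."""
    vals, vecs = eigh(S1, S2)           # generalized eigenvalues, ascending
    order = np.argsort(vals)[::-1][:k]  # take the k largest ratios
    return vals[order], vecs[:, order]

# Toy covariance matrices (illustrative values only)
S1 = np.array([[4.0, 1.0],
               [1.0, 2.0]])
S2 = np.array([[2.0, 0.0],
               [0.0, 1.0]])
ratios, A = oriented_pca(S1, S2, k=2)
```

With S_2 equal to the identity this reduces to ordinary PCA of S_1; with within- and between-group covariance matrices it reduces to canonical discriminant analysis, as noted above.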
McGinnis (2000) uses them in a neural network approach to predicting snowpack accumulation from 700 mb geopotential heights.

14.6.2 Principal Components for Goodness-of-Fit Statistics

The context of this application of PCA is testing whether or not a (univariate) set of data y_1, y_2, ..., y_n could have arisen from a given probability distribution with cumulative distribution function G(y); that is, we want a goodness-of-fit test. If the transformation

x_i = G(y_i), i = 1, 2, ..., n

is made, then we can equivalently test whether or not x_1, x_2, ..., x_n are from a uniform distribution on the range (0, 1). Assume, without loss of generality, that x_1 ≤ x_2 ≤ ... ≤ x_n, and define the sample distribution function as F_n(x) = i/n for x_i ≤ x < x_{i+1}.
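The probability integral transform described here is straightforward to demonstrate numerically. A minimal sketch, assuming a standard normal null distribution for G; the simulated data and the use of scipy.stats are illustrative, not from the text:

```python
import numpy as np
from scipy.stats import norm

# Simulated data to be tested against a hypothetical N(0, 1) null
rng = np.random.default_rng(0)
y = np.sort(rng.normal(size=100))

# Probability integral transform: x_i = G(y_i); under the null the
# x_i behave as an ordered sample from Uniform(0, 1)
x = norm.cdf(y)

# Sample distribution function F_n(x) = i/n for x_i <= x < x_{i+1},
# evaluated at the order statistics themselves
n = len(x)
F_n = np.arange(1, n + 1) / n

# A simple discrepancy between F_n and the uniform CDF
# (a Kolmogorov-Smirnov-type statistic)
D = np.max(np.abs(F_n - x))
```

Under the null hypothesis the discrepancy D should be small; the text's interest lies in goodness-of-fit statistics built from such comparisons of F_n with the uniform distribution function.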