Jolliffe I., Principal Component Analysis, 2nd ed., Springer, 2002.
11.2. Alternatives to Rotation

Figure 11.5. Loadings of second winter components for PCA, RPCA, SCoT, SCoTLASS and simple component analysis.
variables in the functions. The constraints are designed to make the resulting components simpler to interpret than PCs, but without sacrificing too much of the variance accounted for by the PCs. The first idea, discussed in Section 11.2.1, is a simple one, namely, restricting coefficients to a set of integers, though it is less simple to put into practice. The second type of technique, described in Section 11.2.2, borrows an idea from regression, that of the LASSO (Least Absolute Shrinkage and Selection Operator). By imposing an additional constraint in the PCA optimization problem, namely, that the sum of the absolute values of the coefficients in a component is bounded, some of the coefficients can be forced to zero. A technique from atmospheric science, empirical orthogonal teleconnections, is described in Section 11.2.3, and Section 11.2.4 makes comparisons between some of the techniques introduced so far in the chapter.

11.2.1 Components with Discrete-Valued Coefficients

A fairly obvious way of constructing simpler versions of PCs is to successively find linear functions of the p variables that maximize variance, as in PCA, but with the coefficients in those functions restricted to a small number of values. An extreme version of this was suggested by Hausmann (1982), in which the loadings are restricted to the values +1, −1 and 0. To implement the technique, Hausmann (1982) suggests the use of a branch-and-bound algorithm. The basic algorithm does not include an orthogonality constraint on the vectors of loadings of successive 'components,' but Hausmann (1982) adapts it to impose this constraint. This improves interpretability and speeds up the algorithm, but has the implication that it may not be possible to find as many as p components.
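The successive maximization with loadings restricted to {−1, 0, +1} can be sketched as follows. Hausmann's own implementation uses a branch-and-bound algorithm; the sketch below instead substitutes exhaustive enumeration, which is feasible only for small p, and uses an illustrative equi-correlated covariance matrix rather than Hausmann's data. Note how, exactly as described above, the orthogonality constraint may leave only the null vector available, so fewer than p components can be returned.

```python
# Sketch of components with loadings restricted to {-1, 0, +1},
# in the spirit of Hausmann (1982).  Hausmann uses branch-and-bound;
# here we simply enumerate all 3^p candidate loading vectors, which
# is practical only for small p.  The covariance matrix S below is
# illustrative, not taken from Hausmann's example.
import itertools
import numpy as np

def restricted_components(S, n_components):
    """Successively maximize a'Sa / a'a over a in {-1, 0, +1}^p,
    requiring orthogonality to previously found loading vectors."""
    p = S.shape[0]
    found = []
    for _ in range(n_components):
        best, best_var = None, -np.inf
        for cand in itertools.product((-1, 0, 1), repeat=p):
            a = np.array(cand, dtype=float)
            if not a.any():
                continue                      # skip the null vector
            if any(abs(a @ b) > 1e-9 for b in found):
                continue                      # enforce orthogonality
            var = (a @ S @ a) / (a @ a)       # normalized variance
            if var > best_var:
                best_var, best = var, a
        if best is None:
            break  # only the null vector remains: fewer than p components
        found.append(best)
    return found

# Illustrative 4-variable equi-correlated covariance matrix:
S = np.full((4, 4), 0.5) + 0.5 * np.eye(4)
comps = restricted_components(S, 2)
```

On this matrix the first restricted component is the all-ones 'size' vector (up to sign), matching the largest eigenvalue exactly, and the second is a zero-sum contrast orthogonal to it.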
In the 6-variable example given by Hausmann (1982), after 4 orthogonal components have been found with coefficients restricted to {−1, 0, +1}, the null vector is the only vector with the same restriction that is orthogonal to all four already found. In an unpublished M.Sc. project report, Brooks (1992) discusses some other problems associated with Hausmann's algorithm.

Further information is given on Hausmann's example in Table 11.2. Here the following can be seen:

• The first component is a straightforward average or 'size' component in both analyses.

• Despite a considerable simplification, and a moderately different interpretation, for the second constrained component, there is very little loss in the variance accounted for by the first two constrained components compared to the first two PCs.

A less restrictive method is proposed by Vines (2000), in which the coefficients are also restricted to integers. The algorithm for finding so-called simple components starts with a set of p particularly simple vectors of