Jolliffe, I. Principal Component Analysis (2nd ed., Springer, 2002)


11.2. Alternatives to Rotation

Table 11.3. Jeffers' pitprop data: coefficients and variance for the first component.

Variable        PCA      SCoT     SCoTLASS   Simple
1               0.40     0.44     0.50        1
2               0.41     0.44     0.51        1
3               0.12     0.10     0           0
4               0.17     0.14     0           0
5               0.06     0.04     0           1
6               0.28     0.25     0.05        1
7               0.40     0.40     0.35        1
8               0.29     0.27     0.23        1
9               0.36     0.35     0.39        1
10              0.38     0.38     0.41        1
11             −0.01    −0.01     0           0
12             −0.11    −0.09     0          −1
13             −0.11    −0.08     0          −1
Variance (%)   32.45    32.28    29.07       28.23

Table 11.4. Jeffers' pitprop data: coefficients and cumulative variance for the fourth component.

Variable                  PCA      SCoT     SCoTLASS   Simple
1                        −0.03     0.02    −0.07       −133
2                        −0.02     0.02    −0.11       −133
3                         0.02    −0.05     0.16        603
4                         0.01    −0.00     0.08        601
5                         0.25    −0.02     0            79
6                        −0.15    −0.01    −0.02       −333
7                        −0.13    −0.01    −0.05       −273
8                         0.29     0.02     0.47        250
9                         0.13     0.03     0           −68
10                       −0.20    −0.04     0           224
11                        0.81     1.00     0.51         20
12                       −0.30    −0.00    −0.68       −308
13                       −0.10     0.00    −0.04        −79
Cumulative variance (%)  74.0     59.9     70.0        68.7

the selected variables in the equation. Tibshirani (1996) proposes a new method, the 'least absolute shrinkage and selection operator' or LASSO, which is a hybrid of variable selection and shrinkage estimators. The procedure shrinks the coefficients of some of the variables not simply towards zero, but exactly to zero, giving an implicit form of variable selection. The LASSO idea can be transferred to PCA, as will now be shown.

In standard multiple regression we have the equation

\[
y_i = \beta_0 + \sum_{j=1}^{p} \beta_j x_{ij} + \varepsilon_i, \qquad i = 1, 2, \ldots, n,
\]

where $y_1, y_2, \ldots, y_n$ are measurements on a response variable $y$; $x_{ij}$, $i = 1, 2, \ldots, n$, $j = 1, 2, \ldots, p$, are corresponding values of $p$ predictor variables; $\beta_0, \beta_1, \beta_2, \ldots, \beta_p$ are parameters in the regression equation; and $\varepsilon_i$ is an error term. In least squares regression, the parameters are estimated by minimizing the residual sum of squares,

\[
\sum_{i=1}^{n} \Bigl( y_i - \beta_0 - \sum_{j=1}^{p} \beta_j x_{ij} \Bigr)^2 .
\]

The LASSO imposes an additional restriction on the coefficients, namely

\[
\sum_{j=1}^{p} |\beta_j| \leq t
\]

for some 'tuning parameter' $t$. For suitable choices of $t$ this constraint has the interesting property that it forces some of the coefficients in the regression equation to zero.

Now consider PCA, in which linear combinations $a_k' x$, $k = 1, 2, \ldots, p$, of the $p$ measured variables $x$ are found that successively have maximum variance $a_k' S a_k$, subject to $a_k' a_k = 1$ (and, for $k \geq 2$, $a_h' a_k = 0$, $h < k$).
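As a quick illustration of the zero-forcing property described above (a minimal sketch, not an example from the book), the following code fits ordinary least squares and the LASSO to simulated data using scikit-learn. Note that scikit-learn's Lasso solves the penalized (Lagrangian) form, minimizing the residual sum of squares plus $\alpha \sum_j |\beta_j|$, which corresponds to the constrained form $\sum_j |\beta_j| \leq t$ for a suitable pairing of $\alpha$ and $t$. The simulated data, the choice of true coefficients, and the $\alpha$ values are illustrative assumptions.

```python
# Sketch: LASSO drives some regression coefficients exactly to zero,
# whereas ordinary least squares leaves all of them nonzero.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 8
X = rng.normal(size=(n, p))

# Only the first three predictors truly enter the model; the rest are noise.
beta_true = np.array([3.0, -2.0, 1.5, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
print("OLS coefficients: ", np.round(ols.coef_, 2))  # all nonzero

# Tightening the penalty (larger alpha, i.e. smaller t in the constrained
# form) sets more and more coefficients exactly to zero.
for alpha in [0.01, 0.1, 0.5]:
    lasso = Lasso(alpha=alpha, max_iter=10_000).fit(X, y)
    n_zero = int(np.sum(lasso.coef_ == 0.0))
    print(f"alpha={alpha}: {np.round(lasso.coef_, 2)}  "
          f"({n_zero} coefficients exactly zero)")
```

Increasing $\alpha$ (equivalently, decreasing the tuning parameter $t$) zeroes out more coefficients, giving the implicit variable selection described in the text; it is this same zero-forcing behaviour that the chapter goes on to transfer to PCA loadings.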
