Principal Component Analysis Slides
Principal Component Analysis
Morgan Bengtsson (benmo417@student.liu.se)
Cliver Zardán (cliza724@student.liu.se)
Introduction: In this lecture
- What is PCA?
- Example: The Spring
- Mathematical basics: Linear Algebra, Statistics
- Example Continued: Steps
- Data mining and PCA
- Conclusions
- References
What is PCA? A "magic box":
- Understanding
- Refined data
- Data mining
- Visualization
What is PCA?
- Extraction of relevant information
- Finding patterns
- Relatively simple
- Reduction of dimensions
- Change of basis
- Removes some noise
Example: The Spring
- Artificially created data
- Could plausibly be real, though
- An ideal spring
Example: The Spring Data
A complicated multidimensional data set
Mathematical Basics
Linear algebra:
- Eigenvectors
- Eigenvalues
- Matrix algebra
Statistics:
- Standard deviation
- Variance
- Covariance
- Covariance matrix
Linear Algebra: Eigenvalues & Eigenvectors
- Eigenvectors can only be found for square matrices.
- Eigenvectors and eigenvalues come in pairs.
- For symmetric matrices (such as covariance matrices), the eigenvectors are orthogonal.
- Ex: a 3x3 matrix has (up to) 3 eigenvectors. The eigenvector with the highest eigenvalue is the most significant one.
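The eigenvalue/eigenvector pairing above can be checked numerically. A minimal sketch in Python/NumPy (the matrix A is a made-up symmetric example, not from the slides):

```python
import numpy as np

# A symmetric 3x3 matrix, e.g. a covariance matrix; symmetric
# matrices have real eigenvalues and orthogonal eigenvectors.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 3.0]])

# eigh is the solver for symmetric matrices; it returns the
# eigenvalues in ascending order, paired column-wise with vectors.
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Each eigenvector v satisfies A v = lambda v.
v = eigenvectors[:, -1]   # eigenvector of the largest eigenvalue
print(np.allclose(A @ v, eigenvalues[-1] * v))  # True
```

The columns of `eigenvectors` form an orthonormal basis, which is what makes the later change-of-basis step in PCA possible.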
Statistics: Covariance and Covariance Matrix
- Covariance = dependence between two data sets
- Covariance matrix = used when working with many dimensions, e.g. the covariance matrix of a 3-dimensional data set
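A covariance matrix for a small made-up 3-dimensional data set can be sketched like this (Python/NumPy; the numbers are illustrative only):

```python
import numpy as np

# Toy 3-dimensional data set: 3 variables, 5 observations each.
data = np.array([[2.1, 2.9, 1.0, 4.2, 3.8],
                 [1.0, 1.1, 0.4, 2.0, 1.9],
                 [0.5, 0.7, 0.2, 1.1, 0.9]])

# np.cov treats each row as a variable and normalises by N-1.
C = np.cov(data)

print(C.shape)               # (3, 3)
print(np.allclose(C, C.T))   # True: a covariance matrix is symmetric
```

The diagonal holds each variable's variance; the off-diagonal entries hold the pairwise covariances, i.e. the dependence between the sets.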
Example Continued
A complicated multidimensional data set
Example Continued: PCA Steps 1 - 7
1. Subtract the mean along each dimension
2. Calculate the covariance matrix: cov = 1/(N-1) * dataDist * dataDist';
3. Calculate the principal components of cov (eigs in Matlab), sorted with respect to eigenvalues
Example Continued: PCA Steps 1 - 7
4. Select the principal components that are relevant (here: three principal components; the data is transformed with respect to each PC)
Example Continued: PCA Steps
5. Transform the data with respect to the selected components (the feature vector): dataPC12 = PC(:,1:2)'*dataDist;
6. Choose: transform back to the original coordinates, or keep the new coordinate system
7. Re-add the mean values
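Steps 1-7 above can be sketched end to end. This is a Python/NumPy version of the slides' MATLAB snippets (the variable names dataDist, PC, and dataPC12 follow the slides; the generated data set is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 3 dimensions x 100 samples, with most variance
# concentrated along one direction plus a little noise.
t = rng.normal(size=100)
data = np.vstack([t + 0.1 * rng.normal(size=100),
                  2 * t + 0.1 * rng.normal(size=100),
                  0.1 * rng.normal(size=100)])

# Step 1: subtract the mean along each dimension.
mean = data.mean(axis=1, keepdims=True)
dataDist = data - mean

# Step 2: covariance matrix, cov = 1/(N-1) * dataDist * dataDist'.
N = dataDist.shape[1]
cov = dataDist @ dataDist.T / (N - 1)

# Step 3: principal components = eigenvectors of cov,
# sorted by decreasing eigenvalue.
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]
eigvals, PC = eigvals[order], eigvecs[:, order]

# Steps 4-5: keep the two strongest components and project
# (the Python equivalent of dataPC12 = PC(:,1:2)'*dataDist).
dataPC12 = PC[:, :2].T @ dataDist

# Steps 6-7: transform back to the original coordinates and
# re-add the mean; dropping the weakest component makes this
# a slightly lossy reconstruction.
reconstruction = PC[:, :2] @ dataPC12 + mean
```

Because nearly all the variance lies in the first two components, the reconstruction stays close to the original data even though a dimension was discarded.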
Example: Summary
- Complicated data set
- Found the principal components
- Reduced the dimension
- Transformed to a new basis
Data mining and PCA
- Better representation of the data
- More representative basis
- Improved accuracy of classification models
- Faster and better data processing
Data mining and PCA: Applications
- Military
- Medicine
- Experiments
- Neuroscience
- Computer graphics
- Infovis
- ...
Conclusions
Strengths:
- Easy
- Widely used
- Efficient
- No tweaking needed
Weaknesses:
- No tweaking possible
References
- Principal Components: http://csnet.otago.ac.nz/cosc453/student_tutorials/principal_components.pdf
- A Tutorial on Principal Component Analysis: http://www.snl.salk.edu/%7Eshlens/pub/notes/pca.pdf
- Principal Components Analysis: http://www.resample.com/xlminer/help/PCA/pca_intro.htm
Questions?