MACHINE LEARNING TECHNIQUES - LASA

$$X'_i = W\,(X_i - \mu) \qquad (2.6)$$

The components of $X'_i$ can be seen as the coordinates of $X_i$ in the orthogonal basis formed by the rows of $W$.

In order to reconstruct the original data vector $X_i$ from $X'_i$, one must compute, using the property of an orthogonal matrix, $W^{-1} = W^T$:

$$X_i = W^T X'_i + \mu$$
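The projection and its reconstruction can be sketched in NumPy as below; the example data, the variable names, and the use of `np.linalg.eigh` for the eigendecomposition are illustrative assumptions, not part of the notes:

```python
import numpy as np

# Illustrative sketch: PCA projection X'_i = W (X_i - mu) and its
# reconstruction X_i = W^T X'_i + mu, using W^{-1} = W^T.
# The data matrix X (one data point per row) is synthetic example data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ np.array([[2.0, 0.3, 0.1],
                                          [0.0, 1.0, 0.2],
                                          [0.0, 0.0, 0.5]])

mu = X.mean(axis=0)                   # mean of the data
C = np.cov(X - mu, rowvar=False)      # covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)  # eigendecomposition (C is symmetric)
W = eigvecs.T                         # rows are the orthonormal eigenvectors

X_prime = (X - mu) @ W.T              # row i holds X'_i = W (X_i - mu)
X_rec = X_prime @ W + mu              # row i holds X_i = W^T X'_i + mu
```

With the full set of eigenvectors, `W` is orthogonal, so the reconstruction is exact up to floating-point error: `np.allclose(X_rec, X)` holds.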

Now, instead of using all the eigenvectors of the covariance matrix, one may represent the data in terms of only a few basis vectors of the orthogonal basis. If we denote by $W_k$ the reduced transfer matrix that contains only the first $k$ eigenvectors, the reduced transformation is:

$$X'_i = W_k\,(X_i - \mu)$$

$X'_i$ now lives in a coordinate system of dimension $k$. Such a transformation minimizes the mean-square error between the original data point and its projection. If the data is concentrated in a linear subspace, this provides a way to compress the data without losing much information, while simplifying its representation. By picking the eigenvectors with the largest eigenvalues, we lose as little information as possible in the mean-square sense. One can, e.g., choose a fixed number of eigenvectors (and their respective eigenvalues) and obtain a consistent representation, or abstraction, of the data; this preserves a varying amount of the energy of the original data. Alternatively, one can choose to preserve approximately the same amount of energy with a varying number of eigenvectors. This would in turn give an approximately consistent amount of information, at the expense of representations whose dimension varies with the subspace.
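The two strategies above (fixed number of eigenvectors vs. fixed fraction of energy) can be sketched as follows; the 95% energy threshold, the synthetic data, and all variable names are assumptions for illustration:

```python
import numpy as np

# Illustrative sketch of the reduced transformation X'_i = W_k (X_i - mu),
# choosing k as the smallest number of eigenvectors preserving a chosen
# fraction of the energy (sum of eigenvalues). Data is synthetic.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5)) @ np.diag([3.0, 2.0, 1.0, 0.1, 0.05])

mu = X.mean(axis=0)
C = np.cov(X - mu, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]            # sort eigenvalues, largest first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

energy = np.cumsum(eigvals) / eigvals.sum()  # cumulative fraction of energy
k = int(np.searchsorted(energy, 0.95) + 1)   # smallest k preserving 95% energy

W_k = eigvecs[:, :k].T                       # reduced transfer matrix (k x d)
X_prime = (X - mu) @ W_k.T                   # coordinates in the k-dim subspace
X_rec = X_prime @ W_k + mu                   # rank-k reconstruction

mse = np.mean(np.sum((X - X_rec) ** 2, axis=1))
```

Fixing `k` directly instead (the first strategy) amounts to skipping the `searchsorted` step; the preserved energy then varies with the data, which is exactly the trade-off described above.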

© A.G.Billard 2004 – Last Update March 2011
