
4.2 Obtaining Eigenmodels in Range Space and Null Space of Within-class Scatter

In appearance-based face recognition techniques, a face image of $w \times h$ pixels is represented by a vector in a $d$ ($= wh$)-dimensional space. Therefore, each face image corresponds to a point in the $d$-dimensional image space. Let the training set be defined as $X = [x^1_1, x^1_2, \ldots, x^1_N, x^2_1, \ldots, x^C_N]$, where $C$ is the number of classes and $N$ is the number of samples per class. An element $x^i_j$ denotes the $j$th sample of class $i$ and is a vector in the $d$-dimensional space $\mathbb{R}^d$. The total number of training samples is $M = NC$. Then the within-class ($S_w$), between-class ($S_b$), and total ($S_t$) scatter matrices can be defined as

$$
S_w = \sum_{i=1}^{C} \sum_{j=1}^{N} (x^i_j - \mu_i)(x^i_j - \mu_i)^T, \tag{4.1}
$$

$$
S_b = \sum_{i=1}^{C} N (\mu_i - \mu)(\mu_i - \mu)^T, \tag{4.2}
$$

$$
S_t = \sum_{i=1}^{C} \sum_{j=1}^{N} (x^i_j - \mu)(x^i_j - \mu)^T = S_w + S_b. \tag{4.3}
$$
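A minimal NumPy sketch of these definitions (not part of the thesis; the array layout `(C, N, d)` and the function name are assumptions made for illustration):

```python
import numpy as np

def scatter_matrices(X):
    """Compute within-class, between-class and total scatter (Eqs. 4.1-4.3).

    X : array of shape (C, N, d) -- N samples per class, each a d-dimensional vector.
    """
    C, N, d = X.shape
    mu_i = X.mean(axis=1)                 # class means, shape (C, d)
    mu = X.reshape(-1, d).mean(axis=0)    # overall mean, shape (d,)

    # Within-class scatter: sum over classes and samples of (x - mu_i)(x - mu_i)^T
    Dw = (X - mu_i[:, None, :]).reshape(-1, d)
    Sw = Dw.T @ Dw

    # Between-class scatter: N * sum over classes of (mu_i - mu)(mu_i - mu)^T
    Db = mu_i - mu
    Sb = N * (Db.T @ Db)

    # Total scatter, which equals Sw + Sb
    Dt = X.reshape(-1, d) - mu
    St = Dt.T @ Dt
    return Sw, Sb, St
```

With this layout, `np.allclose(St, Sw + Sb)` reproduces the identity stated in (4.3) up to numerical precision.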

where $\mu$ is the overall mean and $\mu_i$ is the mean of the $i$th class. Methods that search for discriminative directions only in the null space of $S_w$ try to maximize a criterion given as

$$
J(W^{Null}_{opt}) = \arg\max_{|W^T S_w W| = 0} |W^T S_b W| = \arg\max_{|W^T S_w W| = 0} |W^T S_t W|. \tag{4.4}
$$

Similarly, the directions providing discrimination in the range space can be found by maximizing a criterion $J(W^{Range}_{opt})$, which is defined as

$$
J(W^{Range}_{opt}) = \arg\max_{|W^T S_w W| \neq 0} |W^T S_b W|. \tag{4.5}
$$
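Criterion (4.4) confines the projection directions to the subspace in which $S_w$ vanishes, while (4.5) works in the range space of $S_w$. As an illustrative sketch (not taken from the thesis), the two subspaces can be separated from the eigendecomposition of $S_w$; the numerical tolerance `tol` is an assumed implementation detail:

```python
import numpy as np

def split_null_and_range(Sw, tol=1e-10):
    """Split the eigenvectors of Sw into a null-space and a range-space basis.

    Eigenvectors with (numerically) zero eigenvalues span null(Sw), used by
    criterion (4.4); the remaining eigenvectors span range(Sw), used by (4.5).
    """
    eigvals, eigvecs = np.linalg.eigh(Sw)   # Sw is symmetric positive semi-definite
    null_mask = eigvals <= tol * eigvals.max()
    V_null = eigvecs[:, null_mask]          # columns: orthonormal basis of null(Sw)
    V_range = eigvecs[:, ~null_mask]        # columns: orthonormal basis of range(Sw)
    return V_null, V_range
```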

To find the optimal projection vectors in the null space, we project all samples onto the null space and then obtain $W^{Null}_{opt}$ by applying PCA. As the basis vectors constituting the null space do not contain intra-class variations, we obtain a single common vector for each class. Any sample, as well as the mean of a class, yields the same common vector for that class after projection onto the null space.
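A minimal sketch of this projection-then-PCA step, reusing the hypothetical helpers `scatter_matrices` and `split_null_and_range` from above; the number of directions kept at the end ($C-1$) is an assumption for illustration, not a detail taken from the thesis:

```python
import numpy as np

def null_space_common_vectors(X, tol=1e-10):
    """Project all samples onto null(Sw), then apply PCA to the projections.

    X : array of shape (C, N, d).  Every sample of a class collapses onto the
    same common vector in null(Sw); PCA on these common vectors yields a
    candidate W_opt^Null (the dimensionality choice below is an assumption).
    """
    C, N, d = X.shape
    Sw, _, _ = scatter_matrices(X)               # Eq. (4.1)
    V_null, _ = split_null_and_range(Sw, tol)    # orthonormal basis of null(Sw)

    # Orthogonal projector onto null(Sw) and projection of all samples.
    P = V_null @ V_null.T
    X_proj = X.reshape(-1, d) @ P

    # All projected samples of a class coincide: keep one common vector per class.
    common = X_proj.reshape(C, N, d)[:, 0, :]    # shape (C, d)

    # PCA on the common vectors (eigenvectors of their scatter, largest first).
    Dc = common - common.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(Dc.T @ Dc)
    W_opt = eigvecs[:, ::-1][:, :C - 1]          # assumed: keep C-1 directions
    return common, W_opt
```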

Such a single common vector per class is not obtained in the range space, since the range space contains the complete intra-class variations present in the face space. So, we

