Figure 2.3: An example of PCA and LDA projection for a two-class problem.
to a lower-dimensional space using PCA, and then LDA is applied in this PCA subspace.
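A minimal sketch of this two-stage projection, assuming a scikit-learn setup with illustrative data shapes and component counts (not the thesis implementation), is:

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 1024))   # 200 face vectors of 1024 pixels each (assumed shapes)
    y = rng.integers(0, 5, size=200)   # labels for 5 subjects (assumed)

    # Reduce with PCA first (keeping the within-class scatter non-singular),
    # then apply LDA inside the PCA subspace.
    pca_lda = make_pipeline(PCA(n_components=50), LinearDiscriminantAnalysis())
    pca_lda.fit(X, y)
    Z = pca_lda.transform(X)           # at most (number of classes - 1) LDA dimensions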
• DCV (Discriminative Common Vectors Approach): DCV [20] solves the “small sample size problem” of LDA by optimizing a variant of Fisher’s criterion. It searches for the optimal projection vectors in the null space of the within-class scatter matrix Sw (see Equation 2.4), satisfying the criterion
\[
J(W_{\text{opt}}) = \arg\max_{|W^T S_w W| = 0} |W^T S_b W| = \arg\max_{|W^T S_w W| = 0} |W^T S_t W|. \tag{2.7}
\]
Thus, to find the optimal projection vectors in the null space of Sw, the face samples are projected onto the null space of Sw to generate a common vector for each class, and the projection vectors are then obtained by performing PCA on these common vectors. A new set of vectors, called discriminative common vectors, is obtained by projecting the face samples onto these projection vectors; each class is thereby represented by a single discriminative common vector. Two algorithms are given for extracting the discriminative common vectors that represent each person in the training set: one uses the within-class scatter matrix of the training samples, while the other uses subspace methods and the Gram-Schmidt orthogonalization procedure to obtain the discriminative common vectors.
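As an illustration, the following NumPy sketch (assumed variable names and data layout; not the implementation from [20]) follows the first, scatter-matrix-based route: it builds Sw, takes an orthonormal basis of its null space, projects one sample per class to obtain the common vectors, and performs PCA on them to get the projection matrix W:

    import numpy as np

    def dcv_projection(X, y, tol=1e-10):
        # X: (n_samples, d) face vectors, y: (n_samples,) class labels (assumed layout).
        classes = np.unique(y)
        d = X.shape[1]

        # Within-class scatter matrix S_w (Equation 2.4).
        Sw = np.zeros((d, d))
        for c in classes:
            Xc = X[y == c]
            centered = Xc - Xc.mean(axis=0)
            Sw += centered.T @ centered

        # Orthonormal basis of the null space of S_w; in the small-sample-size
        # case (d larger than the number of training samples) it is non-empty.
        eigval, eigvec = np.linalg.eigh(Sw)
        null_basis = eigvec[:, eigval < tol * eigval.max()]

        # Projecting any single sample of a class onto null(S_w) yields that
        # class's common vector.
        commons = np.vstack([null_basis @ (null_basis.T @ X[y == c][0]) for c in classes])

        # PCA on the common vectors: keep eigenvectors of their scatter matrix
        # with nonzero eigenvalues (at most n_classes - 1 of them).
        centered = commons - commons.mean(axis=0)
        eigval, eigvec = np.linalg.eigh(centered.T @ centered)
        W = eigvec[:, eigval > tol * eigval.max()]

        dcv = commons @ W   # one discriminative common vector per class
        return W, dcv

A test face can then be classified by projecting it with W and assigning it to the class of the nearest discriminative common vector.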