Master Thesis - Department of Computer Science

a lower dimensional space using PCA, and LDA is then applied in this PCA subspace. However, the removed subspace may contain useful discriminative information, which is lost in this process.
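The two-stage PCA+LDA pipeline described above can be sketched as follows. This is a minimal illustration on synthetic data with arbitrary dimensions, not the implementation evaluated in this thesis:

```python
import numpy as np

def pca_lda(X, y, pca_dim):
    """Two-stage PCA+LDA: project to a PCA subspace, then run LDA there."""
    # --- PCA stage: keep the pca_dim directions of largest variance ---
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W_pca = Vt[:pca_dim].T            # d x pca_dim projection
    Z = Xc @ W_pca                    # samples in the PCA subspace

    # --- LDA stage: build Sw and Sb in the reduced space ---
    classes = np.unique(y)
    mu = Z.mean(axis=0)
    Sw = np.zeros((pca_dim, pca_dim))
    Sb = np.zeros((pca_dim, pca_dim))
    for c in classes:
        Zc = Z[y == c]
        mc = Zc.mean(axis=0)
        Sw += (Zc - mc).T @ (Zc - mc)
        d = (mc - mu)[:, None]
        Sb += len(Zc) * (d @ d.T)

    # Fisher directions: eigenvectors of Sw^{-1} Sb
    # (Sw is invertible after the PCA reduction, which is the point of stage 1)
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(-evals.real)
    W_lda = evecs[:, order[:len(classes) - 1]].real
    return W_pca @ W_lda              # combined d x (C-1) projection

# Toy example: 40 samples, 50-dim features, 4 classes (all made up)
rng = np.random.default_rng(0)
y = np.repeat(np.arange(4), 10)
X = rng.normal(size=(40, 50)) + y[:, None] * 2.0
W = pca_lda(X, y, pca_dim=10)
print(W.shape)   # (50, 3)
```

Note that any direction discarded by the PCA stage is unrecoverable in the LDA stage, which is precisely the information loss the text refers to.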

Chen et al. [25] suggested that the null space of the within-class scatter matrix Sw contains the most discriminative information, and hence proposed performing LDA in the null space of Sw, a method called N-LDA. However, when the number of training samples is large, the null space becomes small, resulting in the loss of discriminative information outside it. The performance of the method depends on the dimension of this null space, so any preprocessing that reduces the original sample space should be avoided.
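The null-space idea can be illustrated as follows: compute the null space of Sw, project the data into it, and pick the directions that maximize the between-class scatter there. This is a hedged sketch on synthetic data with made-up dimensions, not the algorithm of [25]:

```python
import numpy as np

def null_space_lda(X, y, tol=1e-10):
    """Sketch of null-space LDA: maximize Sb inside the null space of Sw."""
    n, d = X.shape
    classes = np.unique(y)
    mu = X.mean(axis=0)
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        dm = (mc - mu)[:, None]
        Sb += len(Xc) * (dm @ dm.T)

    # Null space of Sw: eigenvectors with (numerically) zero eigenvalue.
    evals, evecs = np.linalg.eigh(Sw)
    N = evecs[:, evals < tol]          # d x k basis of null(Sw)

    # Maximize between-class scatter inside that null space.
    Sb_null = N.T @ Sb @ N
    evals_b, evecs_b = np.linalg.eigh(Sb_null)
    top = evecs_b[:, np.argsort(-evals_b)[:len(classes) - 1]]
    return N @ top                     # d x (C-1) projection

# SSS setting: far fewer samples (12) than feature dimensions (30)
rng = np.random.default_rng(1)
y = np.repeat(np.arange(3), 4)
X = rng.normal(size=(12, 30)) + y[:, None] * 3.0
W = null_space_lda(X, y)
print(W.shape)   # (30, 2)
```

In the projected space the within-class scatter is exactly zero by construction, which is why this null space is so attractive for discrimination; but the eigendecomposition above operates on a d x d matrix, illustrating the cost complained about next.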

Another drawback of this approach is that it involves solving an eigenvalue problem for a very high dimensional matrix.

Yu et al. [138] proposed an algorithm based on the simultaneous diagonalization method [39]. First, the null space of the between-class scatter Sb is removed, under the assumption that it contains no discriminative information; the method then seeks a projection that minimizes Sw in the transformed subspace of Sb. Since the rank of Sb is smaller than that of Sw, removing the null space of Sb may discard the entire null space of Sw, so Sw is likely to be of full rank after this removal [14, 28, 45]. A further drawback is that the whitening of Sb is redundant in this method.

Another method, proposed by Huang et al. [45], uses a PCA+Null space approach. The core idea is that the null space of Sw is useful for discrimination, unlike that of Sb. Since the null space of the total scatter matrix St is the intersection of the null spaces of Sb and Sw, this method applies PCA to remove the null space of St and then uses the null space method on the reduced subspace of St.

Hakan et al. [20] proposed a scheme to solve the SSS problem called the Discriminant Common Vectors (DCV) method. Of the two algorithms for extracting the discriminant common vectors that represent each person in the training set of a face database, one uses the within-class scatter matrix of the training samples, while the other uses subspace methods and the Gram-Schmidt orthogonalization procedure. These DCVs are then used to classify new faces.
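The scatter-matrix variant of the common-vector idea can be sketched as follows: every training sample of a class, projected onto the null space of Sw, collapses to a single "common vector" for that class, which can then serve as the class template. The data and dimensions below are invented for illustration; this is not the procedure of [20] in full:

```python
import numpy as np

def discriminant_common_vectors(X, y, tol=1e-10):
    """Sketch of DCV: project samples onto null(Sw); each class collapses
    to one common vector usable as a classification template."""
    n, d = X.shape
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        diff = Xc - Xc.mean(axis=0)
        Sw += diff.T @ diff

    # Orthonormal basis of the null space of Sw.
    evals, evecs = np.linalg.eigh(Sw)
    N = evecs[:, evals < tol]

    # Projecting any sample of class c onto null(Sw) gives the same vector,
    # because within-class differences lie in range(Sw), orthogonal to null(Sw).
    common = {c: (X[y == c][0] @ N) @ N.T for c in np.unique(y)}
    return N, common

rng = np.random.default_rng(2)
y = np.repeat(np.arange(3), 4)
X = rng.normal(size=(12, 25)) + y[:, None] * 2.0
N, common = discriminant_common_vectors(X, y)

# All samples of a class share the common vector (up to round-off).
proj = (X[y == 1] @ N) @ N.T
print(np.allclose(proj, common[1], atol=1e-6))   # True
```

A new face would then be projected the same way and assigned to the class whose common vector it is nearest to.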

Most of the null space based methods, such as DCV [20] and the PCA+Null space approach [45], explored the null space of Sw, while others, such as PCA+LDA [120] and Fisherface [12], eliminated a part of the null space to obtain discriminative features. None of them ex-

