

statistical independence. We define a measure of this independence through $\lambda(K_1,\dots,K_N) = 1 - \rho(K_1,\dots,K_N)$. This quantity also lies between 0 and 1 and is equal to 1 if the variables $X_1,\dots,X_N$ are pairwise independent. Optimizing for independence can thus be done by maximizing $\lambda(K_1,\dots,K_N)$. This is equivalent to minimizing the following objective function:

$$J(K_1,\dots,K_N) = -\log \lambda(K_1,\dots,K_N) \qquad (5.24)$$
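To make the contrast concrete, the sketch below estimates $\rho$ for the two-variable case as the regularized first kernel canonical correlation between the two centred Gram matrices, and from it $\lambda = 1 - \rho$ and the objective $J = -\log\lambda$ of (5.24). The Gaussian kernel, its width `sigma`, and the regularization constant `reg` are illustrative assumptions, not values prescribed by the text.

```python
import numpy as np

def centered_gram(x, sigma=1.0):
    """Gaussian-kernel Gram matrix of a 1-D sample, centred in feature space."""
    diff = x[:, None] - x[None, :]
    K = np.exp(-diff**2 / (2.0 * sigma**2))
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    return H @ K @ H

def kernel_contrast(s1, s2, sigma=1.0, reg=1e-2):
    """lambda = 1 - rho, with rho the regularised first kernel canonical
    correlation between the two estimated source components."""
    K1, K2 = centered_gram(s1, sigma), centered_gram(s2, sigma)
    n = K1.shape[0]
    R1 = np.linalg.solve(K1 + reg * n * np.eye(n), K1)
    R2 = np.linalg.solve(K2 + reg * n * np.eye(n), K2)
    rho = np.linalg.svd(R1 @ R2, compute_uv=False)[0]   # largest singular value
    return 1.0 - rho

def objective(s1, s2):
    """Objective (5.24): J = -log lambda."""
    return -np.log(kernel_contrast(s1, s2))
```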

This optimization problem is solved iteratively. One starts with an initial guess for $W$ and then performs gradient descent on $J$. Details can be found in Bach & Jordan 2002. Interestingly, an implementation of Kernel-ICA whose computational complexity is linear in the number of data points is proposed there, which substantially reduces the computational cost.
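A minimal sketch of this iterative scheme for two sources, building on the `kernel_contrast` / `objective` helpers above: $W$ is restricted to a 2-D rotation (as for whitened data) and, instead of the analytic gradient derived by Bach & Jordan, a finite-difference gradient of $J$ is used purely for illustration. The learning rate, iteration count, and toy mixing angle are arbitrary choices.

```python
def J_of_angle(theta, X):
    """Evaluate J for the demixing rotation W(theta) applied to the mixtures X."""
    c, s = np.cos(theta), np.sin(theta)
    W = np.array([[c, -s], [s, c]])
    S = W @ X
    return objective(S[0], S[1])

# Toy demo: two independent sources mixed by a rotation of angle phi.
rng = np.random.default_rng(0)
S_true = np.vstack([rng.uniform(-1.0, 1.0, 200), rng.laplace(size=200)])
phi = 0.6
A = np.array([[np.cos(phi), -np.sin(phi)], [np.sin(phi), np.cos(phi)]])
X = A @ S_true

# Gradient descent on J, starting from an initial guess theta = 0.
theta, lr, eps = 0.0, 0.1, 1e-4
for _ in range(100):
    grad = (J_of_angle(theta + eps, X) - J_of_angle(theta - eps, X)) / (2.0 * eps)
    theta -= lr * grad
print("recovered demixing angle:", theta)   # moves toward -phi (modulo pi/2 and sign)
```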


