
Figure 2-6: Mixture of variables

Note that the random variables $x_1$ and $x_2$ are no longer independent, see Figure 2-6; an easy way to see this is to ask whether it is possible to predict the value of one of them, say $x_2$, from the value of the other. Clearly this is the case, as the points line up along a line of slope 1.

The problem of estimating the data model of ICA is now to estimate the mixing matrix $A$ using only the information contained in the mixtures $x_1$ and $x_2$. In the above example, it is easy to estimate $A$ by simply solving the inverse mapping for each of the four data points, assuming that we have at least four points. The problem is less trivial once we consider a mixture of two arbitrary continuous distributions.
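As an illustration of the mixing model $x = As$, the following minimal Python sketch (not part of the original notes; the source distribution, the matrix $A$, and the sample size are assumptions chosen for illustration) mixes two independent sources with a non-singular $2\times 2$ matrix and shows that the resulting mixtures become correlated, hence dependent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent, zero-mean, unit-variance sources (uniform here, chosen
# for illustration; any non-Gaussian distribution would do for ICA).
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, 10_000))

# Hypothetical non-singular mixing matrix A.
A = np.array([[1.0, 1.0],
              [0.5, 2.0]])

# Observed mixtures x = A s.
x = A @ s

# The sources are (empirically) uncorrelated, the mixtures are not:
print("corr(s1, s2) ~", np.corrcoef(s)[0, 1])   # close to 0
print("corr(x1, x2) ~", np.corrcoef(x)[0, 1])   # clearly non-zero
```

When the source values at a few sample points are also known (as in the four-point example above), $A$ can be recovered by solving the corresponding linear system, e.g. `A_hat = x[:, :2] @ np.linalg.inv(s[:, :2])` for two linearly independent samples; ICA addresses the harder case where only the mixtures are observed.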

Let us now assume that $s_1$ and $s_2$ were generated by the following uniform distribution:

$$
p(s_i) =
\begin{cases}
\dfrac{1}{2\sqrt{3}} & \text{if } |s_i| \le \sqrt{3} \\[4pt]
0 & \text{otherwise}
\end{cases}
\qquad (2.20)
$$

Figure 2-7: Joint distribution of $s_1$ and $s_2$
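A small sketch of this setup (the mixing matrix below is a hypothetical choice, not one given in the notes): drawing $s_1, s_2$ from the density in Eq. (2.20), whose joint distribution is the uniform square of Figure 2-7, and mixing them shows how the square is mapped onto a parallelogram whose edges point along the columns of $A$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Eq. (2.20): s_i uniform on [-sqrt(3), sqrt(3)] -> zero mean, unit variance.
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(2, 5_000))

# Hypothetical non-singular mixing matrix.
A = np.array([[2.0, 3.0],
              [2.0, 1.0]])
x = A @ s

# (s1, s2) fills the square of Figure 2-7; (x1, x2) fills a parallelogram
# whose edges are aligned with the columns of A. A quick way to look at it:
# import matplotlib.pyplot as plt
# plt.subplot(1, 2, 1); plt.scatter(s[0], s[1], s=1)
# plt.subplot(1, 2, 2); plt.scatter(x[0], x[1], s=1)
# plt.show()
```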

