EM is a description of a class of related algorithms, not of a particular algorithm: it is a recipe or meta-algorithm used to devise particular algorithms. The Baum-Welch algorithm is an example of an EM algorithm applied to Hidden Markov Models; another example is the K-means clustering algorithm. It can be shown that an EM iteration does not decrease the observed-data likelihood function, and that the only stationary points of the iteration are the stationary points of the observed-data likelihood function. In practice, this means that an EM algorithm will converge to a local maximum of the observed-data likelihood function.
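In symbols (the notation introduced here is for illustration only, not taken from the text): writing \theta^{(t)} for the parameter estimate at iteration t and \mathcal{L}(\theta) = \log p(X \mid \theta) for the observed-data log-likelihood, this monotonicity guarantee reads

    \mathcal{L}\bigl(\theta^{(t+1)}\bigr) \;\geq\; \mathcal{L}\bigl(\theta^{(t)}\bigr) ,

so the likelihood values form a non-decreasing sequence, which is why the iteration settles at a stationary point, in practice a local maximum.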

EM thus proceeds iteratively and is particularly suited to parameter estimation in incomplete-data or missing-data situations. In the EM procedure, the so-called marginal (or incomplete-data) likelihood is increased by computing the average or expectation of the complete-data likelihood with respect to the missing data, using the current parameter estimates (E-step); the new parameter estimates are then obtained by maximizing this expected complete-data likelihood (M-step).
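As a concrete illustration, the following is a minimal Python sketch of this E-step/M-step alternation for a two-component, one-dimensional Gaussian mixture, where the missing data are the component labels of each sample. All names (em_gmm_1d, resp, and so on) are illustrative choices introduced here, not taken from the text above.

import numpy as np


def em_gmm_1d(x, n_iter=100, tol=1e-6, seed=0):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Initial parameter estimates: mixing weights, means, variances.
    pi = np.array([0.5, 0.5])
    mu = rng.choice(x, size=2, replace=False).astype(float)
    var = np.array([x.var(), x.var()])

    def log_gauss(x, m, v):
        # Log-density of N(m, v) evaluated at every sample.
        return -0.5 * (np.log(2.0 * np.pi * v) + (x - m) ** 2 / v)

    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: with the current estimates, the expectation of the
        # complete-data likelihood reduces to the posterior probability
        # (responsibility) of each component for each sample.
        log_p = np.log(pi)[:, None] + np.stack([log_gauss(x, mu[k], var[k]) for k in range(2)])
        log_norm = np.logaddexp(log_p[0], log_p[1])   # per-sample log marginal likelihood
        resp = np.exp(log_p - log_norm)               # responsibilities, shape (2, N)

        # M-step: maximize the expected complete-data log-likelihood,
        # which gives the usual weighted re-estimation formulas.
        nk = resp.sum(axis=1)
        pi = nk / x.size
        mu = (resp @ x) / nk
        var = (resp * (x[None, :] - mu[:, None]) ** 2).sum(axis=1) / nk

        # The observed-data log-likelihood never decreases across iterations.
        ll = log_norm.sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return pi, mu, var, ll


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
    weights, means, variances, loglik = em_gmm_1d(data)
    print(weights, means, variances, loglik)

Each pass through the loop performs exactly one E-step and one M-step, and the observed-data log-likelihood returned by log_norm.sum() can be monitored to verify that it never decreases.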

