MACHINE LEARNING TECHNIQUES - LASA


EM is a description of a class of related algorithms, not of a particular algorithm. It is a recipe, or meta-algorithm, used to devise particular algorithms. The Baum-Welch algorithm is an example of an EM algorithm applied to Hidden Markov Models; another example is the K-means clustering algorithm. It can be shown that an EM iteration does not decrease the observed-data likelihood function, and that the only stationary points of the iteration are the stationary points of the observed-data likelihood function. In practice, this means that an EM algorithm will converge to a local maximum of the observed-data likelihood function. EM thus proceeds iteratively and is particularly suited to parameter estimation in incomplete-data or missing-data situations. In the EM procedure, the so-called marginal (or incomplete-data) likelihood is handled by computing the average, or expectation, of the complete-data likelihood with respect to the missing data using the current parameter estimates (E-step); new parameter estimates are then obtained by maximizing this expected likelihood (M-step).
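To make the E-step/M-step recipe concrete, the sketch below runs EM on a two-component, one-dimensional Gaussian mixture. It is only an illustration of the general procedure described above: the function name em_gmm_1d, the initialization choices and the synthetic data are assumptions of this sketch, not part of the lecture notes.

import numpy as np

def em_gmm_1d(x, n_iter=100, seed=0):
    """EM for a two-component 1D Gaussian mixture (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Initial guesses for the mixing weights, means and variances (an assumption of this sketch)
    w = np.array([0.5, 0.5])
    mu = rng.choice(x, size=2, replace=False)
    var = np.array([x.var(), x.var()])

    for _ in range(n_iter):
        # E-step: expectation over the missing component labels given the current
        # parameters, i.e. responsibilities r[i, k] = p(component k | x_i, current params)
        d = x[:, None] - mu[None, :]
        log_pdf = -0.5 * (d ** 2 / var + np.log(2.0 * np.pi * var))
        r = w * np.exp(log_pdf)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: closed-form updates that maximize the expected complete-data log-likelihood
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2) .sum(axis=0) / nk

    return w, mu, var

# Usage on synthetic data drawn from two Gaussians (hypothetical example data)
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 200)])
print(em_gmm_1d(x))

Each iteration of this loop cannot decrease the observed-data likelihood, in line with the convergence property stated above; in practice the estimates settle at a local maximum that depends on the initialization.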


10 References

• Machine Learning, Tom Mitchell, McGraw-Hill, 1997.
• Pattern Classification, Richard O. Duda, Peter E. Hart and David G. Stork, Wiley-Interscience, 2001.
• Information Theory, Inference and Learning Algorithms, David J.C. MacKay, Cambridge University Press, 2003.
• Independent Component Analysis, A. Hyvärinen, J. Karhunen and E. Oja, Wiley-Interscience, 2001.
• Artificial Neural Networks and Information Theory, Colin Fyfe, Tech. Report, Dept. of Computing and Information Science, The University of Paisley, 2000.
• Neural Networks, Simon Haykin, Prentice Hall International Editions, 1994.
• Self-Organizing Maps, Teuvo Kohonen, Springer Series in Information Sciences, Vol. 30, Springer, 2001.
• Learning with Kernels, B. Schölkopf and A. Smola, MIT Press, 2002.
• Reinforcement Learning: An Introduction, R. Sutton and A. Barto, A Bradford Book, MIT Press, 1998.
• Neural Networks for Pattern Recognition, C.M. Bishop, Oxford University Press, 1996.

10.1 Other Recommended Textbooks

• Elements of Machine Learning, Pat Langley, Morgan Kaufmann, 1996.
• Applied Nonlinear Control, J.-J.E. Slotine and W. Li, Prentice-Hall, 1991.
• Neural Networks, Simon Haykin, Prentice Hall International Editions, 1994.
• Cluster Analysis, StatSoft, Inc., 1984–2004.

© A.G.Billard 2004 – Last Update March 2011
