
MACHINE LEARNING TECHNIQUES - LASA


Figure 7-3: The forward procedure used in the Baum-Welch algorithm for estimating the HMM parameters

The forward procedure is thus iterative, with an initialization step, an induction step (typical of dynamic programming) and a termination step; see Figure 7-3, right.

It is easy to see that the same principle can be extended backwards and, thus, that one can infer the probability of being in a particular state by starting from the last state and building one's way back in time. This is referred to as the backward procedure and is illustrated in Figure 7-4.

Figure 7-4: The backward procedure in the Baum-Welch algorithm for estimating the HMM parameters

While solving with Equation (7.2) directly would take on the order of 2T·N^T computation steps, the forward or backward procedures reduce this computation greatly and take instead on the order of N²·T computation steps each. One may wonder what the advantage would be of doing the computation either forwards or backwards. There is no direct advantage to either. These two procedures are, however, crucial to the estimation of the parameters of the HMM. It is easy to see that, if one can compute the probability of being in state i at time t with the forward procedure and the probability of being in state j at time t+1 with the backward procedure, then one can combine the two.

© A.G.Billard 2004 – Last Update March 2011
