Figure 6-7: A difficult classification problem using a multi-layered feed-forward NN with increasing numbers of neurons in the hidden layer. 3 neurons are sufficient to solve this problem.
[DEMOS\CLASSIFICATION\MLP-NEURONS.ML]
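Readers without the accompanying demo can reproduce the qualitative effect with the short sketch below. It is a minimal illustration, assuming scikit-learn's MLPClassifier and the two-moons toy problem as stand-ins for the demo's own network and data.

# A hedged sketch of the Figure 6-7 experiment: the same kind of
# feed-forward network trained with a growing hidden layer.
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

# Two-moons: a difficult, nonlinearly separable 2-D classification problem.
X, y = make_moons(n_samples=400, noise=0.15, random_state=0)

for n_hidden in (1, 2, 3, 5):
    clf = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        activation="logistic",  # sigmoid units, as in the text
                        max_iter=5000, random_state=0)
    clf.fit(X, y)
    print(f"{n_hidden} hidden neurons: training accuracy = {clf.score(X, y):.2f}")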
6.4.3 The Backpropagation Algorithm
1. Initialize the weights to small random numbers.
2. Present an input pattern $X^p$ to the network.
3. Compute
$$a_i^p = f\Big(\sum_j w_{ij}^1\, x_j^p\Big) \qquad \text{and} \qquad y_i^p = f\Big(\sum_j w_{ij}^2\, a_j^p\Big),$$
the outputs of the hidden and output units respectively, where $f$ is the activation function (usually the sigmoid), and $w_{ij}^1$, $w_{ij}^2$ are the weights of the 1st layer (from input to hidden units) and of the 2nd layer (from hidden units to output units), respectively.
4. Compute the error according to (6.12).
5. Compute the gradient of the error along each weight direction, $\frac{\partial E^p}{\partial w_{ij}^2}$, for the second layer.
6. Compute the gradient of the error along the hidden units, $\frac{\partial E^p}{\partial w_{ij}^1} = \frac{\partial E^p}{\partial s_i}\,\frac{\partial s_i}{\partial w_{ij}^1}$, where $s_i = \sum_j w_{ij}^1\, x_j^p$. In order to do this, you need to backpropagate the error to compute $\frac{\partial E^p}{\partial s_i}$.
7. Update all weights according to (6.16).
8. Repeat from step 2, for all patterns, until the network has converged (reached a minimal error). A code sketch of the complete procedure is given below.
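The complete procedure can be summarized in a short NumPy sketch. This is a minimal illustration, not the course's own code: it assumes the squared error $E^p = \frac{1}{2}\sum_i (t_i^p - y_i^p)^2$ for (6.12), the gradient-descent update $w \leftarrow w - \eta\, \partial E^p / \partial w$ for (6.16), and XOR patterns with an appended constant bias input as toy data.

import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

rng = np.random.default_rng(0)

# XOR training patterns (an illustrative choice, not the text's demo data).
# A constant 1 is appended to each pattern so that bias terms are absorbed
# into the weights (one common convention; the text's equations omit biases).
X = np.array([[0., 0., 1.], [0., 1., 1.], [1., 0., 1.], [1., 1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

n_in, n_hid, n_out = 3, 3, 1         # n_in includes the constant bias input
eta = 0.5                            # learning rate (assumed value)

# Step 1: initialize the weights to small random numbers.
W1 = 0.1 * rng.standard_normal((n_hid, n_in))       # input -> hidden
W2 = 0.1 * rng.standard_normal((n_out, n_hid + 1))  # hidden (+bias) -> output

for epoch in range(20000):
    total_error = 0.0
    for x, t in zip(X, T):                 # Step 2: present pattern X^p.
        # Step 3: outputs of the hidden and output units.
        s1 = W1 @ x                        # s_i = sum_j w^1_ij x_j
        a = sigmoid(s1)                    # a_i = f(s_i)
        a1 = np.append(a, 1.0)             # append the bias unit
        y = sigmoid(W2 @ a1)               # y_i = f(sum_j w^2_ij a_j)

        # Step 4: squared error E^p = 1/2 sum_i (t_i - y_i)^2 (assumed (6.12)).
        e = y - t
        total_error += 0.5 * float(e @ e)

        # Step 5: gradient along the 2nd-layer weights, dE/dw^2_ij.
        delta2 = e * y * (1.0 - y)         # dE/ds^2_i, using f'(s) = f(1 - f)
        grad_W2 = np.outer(delta2, a1)

        # Step 6: backpropagate the error to get dE/ds_i at the hidden units,
        # then dE/dw^1_ij = (dE/ds_i)(ds_i/dw^1_ij) = delta1_i * x_j.
        delta1 = (W2[:, :n_hid].T @ delta2) * a * (1.0 - a)
        grad_W1 = np.outer(delta1, x)

        # Step 7: gradient-descent weight update (assumed (6.16)).
        W2 -= eta * grad_W2
        W1 -= eta * grad_W1

    # Step 8: repeat for all patterns until the error is minimal.
    if total_error < 1e-3:
        break

print("epochs:", epoch + 1, " final error:", round(total_error, 5))

The factors $y(1-y)$ and $a(1-a)$ in steps 5 and 6 come from the sigmoid derivative, $f'(s) = f(s)(1 - f(s))$.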
Exercise: Show that a 1-layer feed-forward NN cannot be used to compute the XOR problem.