

6.3.1 Learning rule for the Perceptron

Let $X = \{x^{(1)}, \dots, x^{(P)}\}$ be the set of input patterns and $\hat{y} = \{\hat{y}^{(1)}, \dots, \hat{y}^{(P)}\}$ the associated set of values for the output.

Learning Algorithm:

1. Initialize the weights to small random values $w_i(0)$, $0 \le i \le n$, and set $x_0 = 1$, with $w_0$ fixed (the bias).

2. Present the next pattern $x^{(p)} = \{x_1^{(p)}, \dots, x_n^{(p)}\}$.

3. Calculate $y(t) = f\left( \sum_{i=0}^{n} w_i(t) \cdot x_i^{(p)} \right)$.

4. Adjust the weights following

   $w_i(t) = w_i(t-1) + \eta \cdot \left( \hat{y}^{(p)} - y(t) \right) \cdot x_i(t)$   (6.2)

The learning rate $\eta$ modulates the speed at which the weights vary.

5. Move on to the next pattern and go back to 2.
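Steps 1–5 above can be sketched in Python as follows. This is a minimal illustration of Eq. (6.2), not the text's own code: the function names are my own, and unlike the algorithm above (which keeps $w_0$ fixed) this sketch also updates the bias weight, a common variant.

```python
import numpy as np

def step(a):
    """Threshold activation f: returns 1 if a >= 0, else 0."""
    return 1 if a >= 0 else 0

def train_perceptron(X, y_hat, eta=0.1, max_epochs=100, seed=0):
    """Perceptron learning rule, Eq. (6.2).

    X     : (P, n) array of input patterns
    y_hat : (P,) array of target outputs (0 or 1)
    eta   : learning rate modulating the size of each weight update
    """
    rng = np.random.default_rng(seed)
    P, n = X.shape
    # Step 1: small random weights; x_0 = 1 is prepended so that w_0 plays
    # the role of the bias (updated here, unlike in the text's fixed-w_0 variant)
    w = rng.uniform(-0.05, 0.05, size=n + 1)
    Xb = np.hstack([np.ones((P, 1)), X])
    for _ in range(max_epochs):
        errors = 0
        for p in range(P):                    # Step 2: present the next pattern
            y = step(w @ Xb[p])               # Step 3: y(t) = f(sum_i w_i(t) x_i)
            if y != y_hat[p]:
                # Step 4: w_i(t) = w_i(t-1) + eta * (y_hat^(p) - y(t)) * x_i(t)
                w += eta * (y_hat[p] - y) * Xb[p]
                errors += 1
        if errors == 0:                       # stop: all patterns have been learned
            break
    return w
```

For a linearly separable target such as the AND gate, the outer loop terminates once every pattern is classified correctly; on a non-separable target it runs until `max_epochs` is exhausted.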

The process stops once all patterns have been learned. However, it is not guaranteed that all patterns can be learned. The Perceptron acts as a classifier that separates data into two classes. Note that the dataset must be linearly separable for the learning to converge; see the illustration in Figure 6-3.

Figure 6-3: Learning in one neuron.

Exercise: Which of the AND, OR and XOR gates satisfy the condition of linear separability?
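One way to explore the exercise empirically is a brute-force search for a separating line $w_0 + w_1 x_1 + w_2 x_2 = 0$ over a coarse grid of weights (a sketch; the grid and the helper name are my own). Finding a line on the grid shows the gate is linearly separable; for XOR, a short contradiction argument confirms that no such line exists at all, not merely none on this grid.

```python
import itertools

def classifies(w0, w1, w2, truth_table):
    """True if the line w0 + w1*x1 + w2*x2 = 0 separates the gate's two classes."""
    return all((w0 + w1 * x1 + w2 * x2 >= 0) == bool(t)
               for (x1, x2), t in truth_table.items())

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]
AND = dict(zip(inputs, [0, 0, 0, 1]))
OR  = dict(zip(inputs, [0, 1, 1, 1]))
XOR = dict(zip(inputs, [0, 1, 1, 0]))

grid = [x / 2 for x in range(-4, 5)]   # coarse grid: -2.0, -1.5, ..., 2.0
for name, gate in [("AND", AND), ("OR", OR), ("XOR", XOR)]:
    found = any(classifies(w0, w1, w2, gate)
                for w0, w1, w2 in itertools.product(grid, repeat=3))
    print(name, "separable on this grid:", found)
```

For XOR the contradiction is quick: $(0,0)\mapsto 0$ forces $w_0 < 0$, the patterns $(0,1)$ and $(1,0)$ force $w_0 + w_2 \ge 0$ and $w_0 + w_1 \ge 0$, so $w_1 + w_2 \ge -2 w_0 > 0$, which makes $w_0 + w_1 + w_2 \ge -w_0 > 0$, contradicting $(1,1)\mapsto 0$.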

© A.G.Billard 2004 – Last Update March 2011
