MACHINE LEARNING TECHNIQUES - LASA

6.5 Willshaw net

David Willshaw developed one of the simplest associative memories in 1967.

Figure 6-8: The Willshaw Network - Learning phase

The update rule follows:

\Delta w_{ij} = \delta(x_i \cdot y_j), \qquad \delta(h) = \begin{cases} 1 & \text{if } h = 1 \\ 0 & \text{otherwise} \end{cases}    (6.17)
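The learning phase can be sketched in a few lines of NumPy. This is an illustrative implementation, not code from the original notes: a binary weight matrix is switched on wherever an input bit and an output bit are simultaneously active, which is exactly the outer product of the two binary patterns.

```python
import numpy as np

def willshaw_learn(patterns, n_in, n_out):
    """Store binary pattern pairs (x, y) in a Willshaw net.

    A weight w_ij is switched on (set to 1) whenever x_i = y_j = 1
    in some stored pair; once on, a weight is never switched off.
    """
    W = np.zeros((n_in, n_out), dtype=int)
    for x, y in patterns:
        # np.outer(x, y)[i, j] = x_i * y_j, which is 1 exactly when
        # both bits are active -- the update rule of Eq. (6.17).
        W |= np.outer(x, y)
    return W

# Store a single pair: weights are set where x[i] = 1 and y[j] = 1.
x = np.array([1, 0, 1, 0])
y = np.array([0, 1, 1])
W = willshaw_learn([(x, y)], 4, 3)
```

Because weights only ever turn on, storing further pairs can only add ones to `W`, which is the root of the saturation problem discussed below.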

The retrieval rule follows:

y_j = \vartheta\left( \sum_i w_{ij} \, x_i - w_0 \right), \qquad \vartheta(h) = \begin{cases} 1 & \text{if } h \geq 0 \\ 0 & \text{otherwise} \end{cases}    (6.18)

where w_0 is the firing threshold and \vartheta the step function.    (6.18)
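The retrieval rule can be sketched as follows (again an illustration, with names of my own choosing): each output unit sums its active inputs and fires when the sum reaches the threshold w_0, here taken as the number of active input bits, so that only units connected to every active input fire.

```python
import numpy as np

def willshaw_retrieve(W, x, w0=None):
    """Retrieve the output pattern for binary input x.

    Each output unit j computes h = sum_i w_ij * x_i - w0 and
    fires (y_j = 1) when h >= 0, as in Eq. (6.18).
    """
    if w0 is None:
        # Threshold = number of active input bits: a unit fires only
        # if all its connections from active inputs are switched on.
        w0 = int(x.sum())
    h = W.T @ x - w0
    return (h >= 0).astype(int)

# Retrieve a stored pair: W learned from (x, y) with the update rule,
# built here directly as the outer product of the single pair.
x = np.array([1, 0, 1, 0])
y = np.array([0, 1, 1])
W = np.outer(x, y)
y_out = willshaw_retrieve(W, x)
```

With this threshold choice, a pattern stored alone is always retrieved exactly; errors appear only once other stored pairs have switched on overlapping weights.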

Such a network suffers from two major drawbacks: the memory quickly fills up, and the network cannot learn patterns with overlapping inputs.

One can show that the fraction \rho of weights set to one after storing P random pattern pairs is:

\rho = 1 - \left( 1 - \frac{M_{IN}}{N_{IN}} \cdot \frac{M_{OUT}}{N_{OUT}} \right)^P    (6.19)

with N_{IN} the number of input lines, M_{IN} the number of input bits set to 1, N_{OUT} the number of output lines, and M_{OUT} the number of output bits set to 1. As \rho approaches 1 the memory saturates, which bounds the total number of patterns that can be stored.

A more generic learning rule for such associative networks is the Hebbian rule, which we will see next.

© A.G.Billard 2004 – Last Update March 2011
