

The continuous-time Hopfield network is a powerful tool to store sequences. It has been used in numerous applications in robotics, e.g. learning to recognize and reproduce complex patterns of motion. The strength of this model comes from the fact that mixing several first-order differential equations, as given in (6.76), very rapidly produces patterns whose complexity becomes intractable.

Consider the case whereby the neuron receives a single steady input S and has no self-connection, so that the recurrent term of (6.76) vanishes. The behavior of such a neuron follows a linear differential equation that can be solved analytically and is given by:

$$
m(t) = (m_0 - S)\,e^{-t/\tau} + S, \qquad x(t) = \frac{1}{1 + e^{-D\,(m(t) + b)}}
\tag{6.78}
$$

The membrane potential grows until it reaches a maximum, as shown in Figure 6-16.

Figure 6-16: Dynamics of a single Leaky-Integrator neuron with no self-connection
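As a quick check of (6.78), the sketch below integrates the neuron numerically with a forward-Euler scheme and compares the result with the closed-form solution, then passes the membrane potential through the sigmoid output. The parameter values (tau, S, m0, D, b) are illustrative assumptions, not values taken from the notes.

```python
# Minimal sketch: single leaky-integrator neuron with a constant input S and no
# self-connection, i.e. tau * dm/dt = -m + S (assumed one-neuron form of (6.76)).
import numpy as np

tau, S, m0 = 1.0, 2.0, 0.0   # time constant, steady input, initial membrane potential
D, b = 1.0, 0.0              # sigmoid slope and bias (assumed values)

dt = 0.01
t = np.arange(0.0, 10.0, dt)

# Forward-Euler integration of tau * dm/dt = -m + S
m_num = np.empty_like(t)
m_num[0] = m0
for k in range(1, len(t)):
    m_num[k] = m_num[k - 1] + dt / tau * (-m_num[k - 1] + S)

# Closed-form solution (6.78): m(t) = (m0 - S) * exp(-t / tau) + S
m_exact = (m0 - S) * np.exp(-t / tau) + S

# Neuron output: x(t) = 1 / (1 + exp(-D * (m(t) + b)))
x = 1.0 / (1.0 + np.exp(-D * (m_exact + b)))

print("max |numerical - analytical| :", np.max(np.abs(m_num - m_exact)))
print("final membrane potential     :", m_exact[-1], "(converges to S =", S, ")")
print("final output x(T)            :", x[-1])
```

The printout confirms the qualitative picture of Figure 6-16: the membrane potential converges exponentially from m0 toward the input S, and the output saturates accordingly.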

The behavior of a single leaky-integrator neuron with a self-connection is much more complex. It follows a nonlinear differential equation that cannot be solved analytically. To study the convergence of the neuron, one must instead find the equilibrium points of its membrane potential dynamics, given by:

$$
\frac{dm}{dt} = 0 \quad\Rightarrow\quad m - w_{11}\,\sigma(m + b) = S
\tag{6.79}
$$

where $\sigma(z) = \dfrac{1}{1 + e^{-Dz}}$ is the sigmoid function.
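Since (6.79) is transcendental, its roots are typically found numerically. The sketch below scans a range of membrane potentials for sign changes of f(m) = m − w11·σ(m + b) − S and refines each one with Brent's method; the parameter values (D, w11, b, S) are illustrative assumptions, chosen so that the self-connection is strong enough to create several equilibria.

```python
# Minimal sketch: locate the equilibrium points of a self-connected
# leaky-integrator neuron by solving (6.79) numerically.
import numpy as np
from scipy.optimize import brentq

D, w11, b, S = 2.0, 5.0, -2.5, 0.0   # assumed sigmoid slope, self-weight, bias, input

def sigma(z):
    """Sigmoid activation from the text: sigma(z) = 1 / (1 + exp(-D z))."""
    return 1.0 / (1.0 + np.exp(-D * z))

def f(m):
    """Equilibrium condition (6.79): dm/dt = 0  <=>  m - w11 * sigma(m + b) - S = 0."""
    return m - w11 * sigma(m + b) - S

# Scan a range of membrane potentials and refine every sign change with Brent's method.
m_grid = np.linspace(-10.0, 10.0, 2000)
values = f(m_grid)
equilibria = [brentq(f, m_grid[i], m_grid[i + 1])
              for i in range(len(m_grid) - 1)
              if values[i] * values[i + 1] < 0]

print("equilibrium membrane potentials:", equilibria)
```

With these particular values the neuron has three equilibria; the outer two are stable and the middle one is unstable, which is the usual signature of a bistable self-connected unit.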

