
6.6.4 Oja’s One-Neuron Model

In the 1980s, Oja proposed a model that extracts the largest principal component from the input data. The model uses a single output neuron with the classical activation rule:

$$y = \sum_i w_i x_i \qquad (6.32)$$

Oja’s variation of the Hebbian rule is as follows:

$$\Delta w_i = \alpha \left( x_i y - y^2 w_i \right) \qquad (6.33)$$
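As an illustration, a minimal NumPy sketch of equations (6.32) and (6.33) might read as follows; the function name `oja_update` and the learning rate `alpha` are illustrative choices, not part of the original text:

```python
import numpy as np

def oja_update(w, x, alpha=0.01):
    """One stochastic step of Oja's rule.

    Activation (6.32):  y = sum_i w_i x_i
    Update     (6.33):  dw_i = alpha * (x_i * y - y**2 * w_i)
    """
    y = w @ x                              # output of the single neuron (6.32)
    return w + alpha * (y * x - y**2 * w)  # Hebbian term minus decay term (6.33)
```

The subtractive term $y^2 w_i$ is what distinguishes (6.33) from the plain Hebbian update $\alpha\, x_i y$, and it is what keeps the weights bounded.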

Note that this rule is defined by a multiplicative constraint of the form $\gamma(\mathbf{w}) = y^2$, and so it will converge to the principal eigenvector of the input covariance matrix. The weight decay term has the simultaneous effect of making $\sum_i (w_i)^2$ tend towards 1, i.e. the weights are normalized.

This rule will allow the network to find only the first eigenvector. In order to determine the other principal components, one must let the neurons interact with one another.
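Both claims can be checked numerically with a small self-contained sketch; the synthetic data, random seed, learning rate, and sample count below are arbitrary choices made for the illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2-D data stretched along the first axis, so the principal
# eigenvector of the covariance is close to (1, 0).
X = rng.normal(size=(10_000, 2)) * np.array([3.0, 1.0])

w = rng.normal(size=2)
for x in X:
    y = w @ x
    w += 0.001 * (y * x - y**2 * w)   # Oja's rule (6.33)

C = X.T @ X / len(X)                  # sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(C)
v1 = eigvecs[:, -1]                   # eigenvector of the largest eigenvalue

print(np.linalg.norm(w))              # ~1: the weights are normalized
print(abs(w @ v1))                    # ~1: w aligns with the principal eigenvector
```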

6.6.5 Convergence of the Weight Decay Rule

The Hebbian rule with decay solves the problem of unbounded weight growth and eliminates very small, irregular noise. However, it does so at a price: the environment must continuously present all stimuli that carry associations, for without reinforcement the associations decay away. Moreover, it takes longer to eliminate the effect of noise.
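To make the decay effect explicit, suppose the rule has the common additive-decay form $\Delta w_i = \alpha\, x_i y - \gamma\, w_i$ (this generic form and the decay rate $\gamma$ are our assumption; the text’s exact rule may differ). When a stimulus is never re-presented, $x_i = 0$ and the update reduces to pure decay, which in the continuous-time limit gives

$$\dot{w}_i = -\gamma\, w_i \quad\Longrightarrow\quad w_i(t) = w_i(0)\, e^{-\gamma t}$$

so a stored association shrinks exponentially unless it is reinforced.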

As mentioned earlier, an important and distinctive property of the Hebbian learning rule is its stability. Work by Miller & MacKay made an important contribution by showing the convergence of the Hebbian learning rule with weight decay.

Let us consider the continuous-time learning rule with weight decay.
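Averaging Oja’s update (6.33) over the input distribution yields one standard continuous-time form (our reconstruction of the setting, not necessarily the exact equation analyzed in the text):

$$\tau \frac{d\mathbf{w}}{dt} = C\mathbf{w} - \left(\mathbf{w}^{\top} C\, \mathbf{w}\right)\mathbf{w}, \qquad C = E\!\left[\mathbf{x}\mathbf{x}^{\top}\right]$$

Its fixed points satisfy $C\mathbf{w} = (\mathbf{w}^{\top}C\mathbf{w})\,\mathbf{w}$, i.e. they are unit-norm eigenvectors of the input covariance $C$, and only the principal eigenvector is asymptotically stable.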
