Mind, Body, World: Foundations of Cognitive Science, 2013a

hidden unit computes its error by summing together all of the error signals that it receives from the output units to which it is connected. Fifth, once the hidden unit error has been computed, the weights of the hidden units can be modified using the same equation that was used to alter the weights of each of the output units.

This procedure can be repeated iteratively if there is more than one layer of hidden units. That is, the error of each hidden unit in one layer can be propagated backwards to an adjacent layer as an error signal once the hidden unit weights have been modified. Learning about this pattern stops once all of the connections have been modified. Then the next training pattern can be presented to the input units, and the learning process occurs again.
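To make the procedure concrete, here is a minimal sketch of one such training step for a network with a single hidden layer. It assumes sigmoid activation functions and the standard delta-rule weight update; the function and variable names are illustrative, not taken from the text, and bias terms are omitted for brevity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pattern(x, target, W_hidden, W_output, lr=0.1):
    """One generalized-delta-rule step for a single input pattern x.

    W_hidden: weights from input to hidden units (n_hidden x n_in)
    W_output: weights from hidden to output units (n_out x n_hidden)
    """
    # Forward pass: compute hidden and output activations.
    h = sigmoid(W_hidden @ x)
    o = sigmoid(W_output @ h)

    # Output error, scaled by the derivative of the sigmoid.
    delta_out = (target - o) * o * (1.0 - o)

    # Each hidden unit sums the error signals it receives from the
    # output units to which it is connected (the step described above).
    delta_hidden = (W_output.T @ delta_out) * h * (1.0 - h)

    # The hidden weights are then modified with the same equation
    # used to alter the output weights.
    W_output += lr * np.outer(delta_out, h)
    W_hidden += lr * np.outer(delta_hidden, x)
    return o
```

With a second hidden layer, the same pattern repeats: its error is computed from `delta_hidden` propagated back through `W_hidden`, exactly as described in the paragraph above.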

There are a variety of different ways in which the generic algorithm given above can be realized. For instance, in stochastic training, connection weights are updated after each pattern is presented (Dawson, 2004). This approach is called stochastic because each pattern is presented once per epoch of training, but the order of presentation is randomized for each epoch. Another approach, batch training, is to accumulate error over an epoch and to only update weights once at the end of the epoch, using accumulated error (Rumelhart, Hinton, & Williams, 1986a). As well, variations of the algorithm exist for different continuous activation functions. For instance, an elaborated error term is required to train units that have Gaussian activation functions, but when this is done, the underlying mathematics are essentially the same as in the original generalized delta rule (Dawson & Schopflocher, 1992b).
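The contrast between the two training regimes can be sketched as follows. This is a toy illustration: `weight_changes` is a hypothetical single-layer stand-in for the full backpropagation step, used only to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_changes(x, target, W, lr):
    """Delta-rule weight changes for one pattern (single-layer
    stand-in for the full backpropagation update)."""
    o = 1.0 / (1.0 + np.exp(-(W @ x)))
    delta = (target - o) * o * (1.0 - o)
    return lr * np.outer(delta, x)

def stochastic_epoch(patterns, targets, W, lr=0.1):
    # Each pattern is presented once per epoch, in a freshly
    # randomized order, and the weights are updated immediately
    # after each presentation.
    for i in rng.permutation(len(patterns)):
        W += weight_changes(patterns[i], targets[i], W, lr)

def batch_epoch(patterns, targets, W, lr=0.1):
    # Weight changes are accumulated over the whole epoch and
    # applied only once, at the end, using the accumulated error.
    total = np.zeros_like(W)
    for x, t in zip(patterns, targets):
        total += weight_changes(x, t, W, lr)
    W += total
```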

New Connectionism was born when the generalized delta rule was invented. Interestingly, the precise date of its birth and the names of its parents are not completely established. The algorithm was independently discovered more than once. Rumelhart, Hinton, and Williams (1986a, 1986b) are its most famous discoverers and popularizers. It was also discovered by David Parker in 1985 and by Yann LeCun in 1986 (Anderson, 1995). More than a decade earlier, the algorithm was reported in Paul Werbos’ (1974) doctoral thesis. The mathematical foundations of the generalized delta rule can be traced to an earlier decade, in a publication by Shun-Ichi Amari (1967).

In an interview (Anderson & Rosenfeld, 1998), neural network pioneer Stephen Grossberg stated that “Paul Werbos, David Parker, and Shun-Ichi Amari should have gotten credit for the backpropagation model, instead of Rumelhart, Hinton, and Williams” (pp. 179–180). Regardless of the credit assignment problem associated with the scientific history of this algorithm, it transformed cognitive science in the mid-1980s, demonstrating “how the lowly concepts of feedback and derivatives are the essential building blocks needed to understand and replicate higher-order phenomena like learning, emotion and intelligence at all levels of the human mind” (Werbos, 1994, p. 1).
