Mind, Body, World- Foundations of Cognitive Science, 2013a

unit activity via the existing connection weights. Second, output unit error is computed by taking the difference between actual output unit activity and desired output unit activity for each output unit in the network. This kind of training is called supervised learning, because it requires an external trainer to provide the desired output unit activities. Third, Hebb-style learning is used to associate input unit activity with output unit error: weight change is equal to a learning rate times input unit activity times output unit error. (In modern perceptrons, this triple product can also be multiplied by the derivative of the output unit's activation function, resulting in gradient descent learning [Dawson, 2004].)
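The three steps above can be sketched in a few lines of code. This is a minimal illustration, not Dawson's implementation: the function names are invented here, a logistic activation function is assumed, and the derivative term is included to show the gradient descent variant mentioned above.

```python
import numpy as np

def logistic(net):
    """Logistic activation function (an assumed choice for this sketch)."""
    return 1.0 / (1.0 + np.exp(-net))

def train_step(weights, bias, inputs, desired, rate=0.5, use_gradient=True):
    """One supervised training step for a single output unit."""
    # Step 1: propagate input activity through the existing connection weights.
    output = logistic(np.dot(weights, inputs) + bias)
    # Step 2: output unit error is desired activity minus actual activity.
    error = desired - output
    # Step 3: Hebb-style update -- learning rate times input activity times
    # output error; optionally scaled by the derivative of the logistic
    # activation (output * (1 - output)), yielding gradient descent learning.
    scale = output * (1.0 - output) if use_gradient else 1.0
    weights = weights + rate * error * scale * inputs
    bias = bias + rate * error * scale
    return weights, bias, error
```

A single call on an input-output pair returns the updated weights along with the error that drove the change, so the update can be verified to shrink the error on that pair.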

The supervised learning of a perceptron is designed to reduce output unit errors as training proceeds. Weight changes are proportional to the amount of generated error. If no errors occur, then weights are not changed. If a task's solution can be represented by a perceptron, then repeated training using pairs of input-output stimuli is guaranteed to eventually produce zero error, as proven in Rosenblatt's perceptron convergence theorem (Rosenblatt, 1962).
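The convergence behaviour can be demonstrated with a small training loop on a linearly separable task. The sketch below uses logical AND as an assumed example task and the classic threshold activation to which the convergence theorem applies; the learning rate and epoch limit are illustrative.

```python
import numpy as np

def step(net):
    """Classic threshold activation of the original perceptron."""
    return 1.0 if net > 0 else 0.0

# Pairs of input-output stimuli defining logical AND (linearly separable).
patterns = [(np.array([0.0, 0.0]), 0.0),
            (np.array([0.0, 1.0]), 0.0),
            (np.array([1.0, 0.0]), 0.0),
            (np.array([1.0, 1.0]), 1.0)]

weights = np.zeros(2)
bias = 0.0
rate = 0.1

for epoch in range(100):                 # repeated presentation of the pairs
    total_error = 0.0
    for inputs, desired in patterns:
        error = desired - step(np.dot(weights, inputs) + bias)
        weights += rate * error * inputs  # no error, no weight change
        bias += rate * error
        total_error += abs(error)
    if total_error == 0.0:                # zero error: training has converged
        break
```

Because AND is linearly separable, the loop reaches an epoch with zero total error after only a handful of passes, exactly as the convergence theorem guarantees.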

Because the perceptron is a product of Old Connectionism, there are limits to the range of input-output mappings that it can mediate. In their famous computational analyses of what perceptrons could and could not learn to compute, Minsky and Papert (1969) demonstrated that perceptrons could not learn to distinguish some basic topological properties easily discriminated by humans, such as the difference between connected and unconnected figures. As a result, interest in and funding for Old Connectionist research decreased dramatically (Medler, 1998; Papert, 1988).
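The connectedness predicate is hard to demonstrate compactly, but a simpler well-known mapping beyond a single perceptron's reach is exclusive-or (XOR), which is not linearly separable. The sketch below (with illustrative names and parameters) runs the same threshold-perceptron training loop on XOR; unlike a separable task, it never reaches an epoch with zero error, because no single line separates the two response classes.

```python
import numpy as np

def step(net):
    """Classic threshold activation of the original perceptron."""
    return 1.0 if net > 0 else 0.0

# Exclusive-or: a mapping that no single-layer perceptron can represent.
xor_patterns = [(np.array([0.0, 0.0]), 0.0),
                (np.array([0.0, 1.0]), 1.0),
                (np.array([1.0, 0.0]), 1.0),
                (np.array([1.0, 1.0]), 0.0)]

weights = np.zeros(2)
bias = 0.0
errors_last_epoch = 0.0

for epoch in range(1000):
    total_error = 0.0
    for inputs, desired in xor_patterns:
        error = desired - step(np.dot(weights, inputs) + bias)
        weights += 0.1 * error * inputs
        bias += 0.1 * error
        total_error += abs(error)
    errors_last_epoch = total_error
# The total error cycles but never falls to zero: at least one pattern is
# always misclassified, no matter how long training continues.
```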

However, perceptrons are still capable of providing new insights into phenomena of interest to cognitive science. The remainder of this section illustrates this by exploring the relationship between perceptron learning and classical conditioning.

The primary reason that connectionist cognitive science is related to empiricism is that the knowledge of an artificial neural network is typically acquired via experience. For instance, in supervised learning a network is presented with pairs of patterns that define an input-output mapping of interest, and a learning rule is used to adjust connection weights until the network generates the desired response to a given input pattern.

In the twentieth century, prior to the birth of artificial neural networks (McCulloch & Pitts, 1943), empiricism was the province of experimental psychology. A detailed study of classical conditioning (Pavlov, 1927) explored the subtle regularities of the law of contiguity. Pavlovian, or classical, conditioning begins with an unconditioned stimulus (US) that is capable, without training, of producing an unconditioned response (UR). Also of interest is a conditioned stimulus (CS) that when presented will not produce the UR. In classical conditioning, the CS is paired with the US for a number of trials. As a result of this pairing, which places the CS

190 Chapter 4
