Mind, Body, World: Foundations of Cognitive Science, 2013a


input pattern in an all-or-none fashion to a particular category. A second is function approximation: generating a continuous response to a set of input values.

Section 4.6 then proceeds to computational analyses of how capable networks are of accomplishing these tasks. These analyses prove that networks are as powerful as need be, provided that they include hidden units. They can serve as arbitrary pattern classifiers, meaning that they can solve any pattern classification problem with which they are faced. They can also serve as universal function approximators, meaning that they can fit any continuous function to an arbitrary degree of precision. This computational power suggests that artificial neural networks belong to the class of universal machines. The section ends with a brief review of computational analyses, which conclude that connectionist networks indeed can serve as universal Turing machines and are therefore computationally sophisticated enough to serve as plausible models for cognitive science.
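To make the role of hidden units concrete, here is a minimal sketch (not drawn from the text itself) of the textbook illustration: a network of threshold units that solves XOR, a pattern classification problem that no network without hidden units can solve. The weights and thresholds are hand-chosen for illustration, not learned.

```python
def step(net):
    # Heaviside threshold activation: the unit turns on iff net input exceeds 0
    return 1 if net > 0 else 0

def xor_net(x1, x2):
    # Hidden unit h1 computes OR (fires if at least one input is on);
    # hidden unit h2 computes AND (fires only if both inputs are on).
    h1 = step(x1 + x2 - 0.5)
    h2 = step(x1 + x2 - 1.5)
    # The output fires when OR is true but AND is false: exactly XOR.
    return step(h1 - h2 - 0.5)

for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pattern, "->", xor_net(*pattern))
```

The point of the sketch is the one made in the text: the hidden layer re-represents the input so that a problem that is not linearly separable at the input layer becomes separable at the output.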

Computational analyses need not limit themselves to considering the general power of artificial neural networks. Computational analyses can be used to explore more specific questions about networks. This is illustrated in Section 4.7, “What Do Output Unit Activities Represent?” in which we use formal methods to answer the question that serves as the section’s title. The section begins with a general discussion of theories that view biological agents as intuitive statisticians who infer the probability that certain events may occur in the world (Peterson & Beach, 1967; Rescorla, 1967, 1968). An empirical result is reviewed that suggests artificial neural networks are also intuitive statisticians, in the sense that the activity of an output unit matches the probability that a network will be “rewarded” (i.e., trained to turn on) when presented with a particular set of cues (Dawson et al., 2009).
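The flavour of this “intuitive statistician” result can be conveyed with a simplified sketch (this is not the network or procedure of Dawson et al., 2009): a single unit trained with a delta-rule update on one cue that is rewarded with an assumed probability of 0.7 ends up with an activity near that probability when the cue is presented.

```python
import random

random.seed(0)

# Assumed values for illustration; not taken from Dawson et al. (2009).
P_REWARD = 0.7   # probability the network is "rewarded" (trained to turn on)
LR = 0.01        # learning rate
w = 0.0          # associative weight for the single cue

for _ in range(5000):
    target = 1.0 if random.random() < P_REWARD else 0.0
    activity = w * 1.0             # the cue is present, so its input is 1
    w += LR * (target - activity)  # delta-rule update

print(round(w, 2))  # settles near P_REWARD = 0.7
```

In expectation the update is zero only when the unit's activity equals the reward probability, which is why the learned activity tracks the conditional probability of reward.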

The section then ends by providing an example computational analysis: a formal proof that output unit activity can indeed literally be interpreted as a conditional probability. This proof takes advantage of known formal relations between neural networks and the Rescorla-Wagner learning rule (Dawson, 2008; Gluck & Bower, 1988; Sutton & Barto, 1981), as well as known formal relations between the Rescorla-Wagner learning rule and contingency theory (Chapman & Robbins, 1990).
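The relation between the Rescorla-Wagner rule and contingency theory can also be sketched numerically (an illustrative simulation under assumed reward probabilities, not the formal proof referenced above): with a context cue present on every trial and a target cue present on half of them, the target cue's weight approaches the contingency ΔP = P(reward | cue) − P(reward | no cue).

```python
import random

random.seed(1)

# Assumed probabilities for illustration only.
P_REWARD_WITH_CUE = 0.8     # P(reward | cue present)
P_REWARD_WITHOUT_CUE = 0.2  # P(reward | cue absent)
LR = 0.01
v_context, v_cue = 0.0, 0.0  # associative weights; the context is always present

for _ in range(20000):
    cue_present = random.random() < 0.5
    p = P_REWARD_WITH_CUE if cue_present else P_REWARD_WITHOUT_CUE
    target = 1.0 if random.random() < p else 0.0
    prediction = v_context + (v_cue if cue_present else 0.0)
    error = target - prediction   # prediction error shared by all present cues
    v_context += LR * error       # the context cue is updated on every trial
    if cue_present:
        v_cue += LR * error

# v_cue approaches delta-P = 0.8 - 0.2 = 0.6; v_context approaches 0.2
print(round(v_context, 2), round(v_cue, 2))
```

At equilibrium the context weight absorbs the base rate of reward, leaving the target cue's weight to encode exactly the cue's contingency, which is the asymptotic result Chapman and Robbins (1990) established formally.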

4.6 Beyond the Terminal Meta-postulate<br />

Connectionist networks are associationist devices that map inputs to outputs, systems that convert stimuli into responses. However, we saw in Chapter 3 that classical cognitive scientists had established that the stimulus-response theories of behaviourist psychology could not adequately deal with the recursive structure of natural language (Chomsky, 1957, 1959b, 1965, 1966). In the terminal meta-postulate argument (Bever, Fodor, & Garrett, 1968), it was noted that the rules of associative theory defined a “terminal vocabulary of a theory, i.e., over the vocabulary in which

