
current PDP networks are not autonomous because their learning principles are not in fact directly realized in the network architecture. That is, networks governed by these principles require explicit signals from some external controller to determine when they will learn or when they will perform a learned task. (Dawson and Schopflocher 1992a, pp. 200–201)

This is not a principled limitation, for Dawson and Schopflocher presented a much more elaborate architecture that permits a standard pattern associator to learn and recall autonomously, that is, without the need for a user’s intervention. However, this architecture is not typical; standard pattern associators like the one in Figure 4-1 demand executive control.
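To make this concrete, the following is a minimal sketch in Python (with hypothetical names; it is not the architecture from Dawson and Schopflocher, nor the associator of Figure 4-1) of a standard Hebbian pattern associator. The network itself only stores weights and propagates signals; the decision about when to learn and when to recall comes entirely from the surrounding script, which plays the role of the external controller.

    import numpy as np

    class PatternAssociator:
        """Minimal distributed associative memory (illustrative sketch only).

        The network has no means of deciding whether an incoming pattern
        should be stored or used as a recall cue; the caller must choose.
        """

        def __init__(self, n_inputs, n_outputs, learning_rate=0.1):
            self.weights = np.zeros((n_outputs, n_inputs))
            self.learning_rate = learning_rate

        def learn(self, cue, target):
            # Hebbian update: strengthen connections between co-active units.
            self.weights += self.learning_rate * np.outer(target, cue)

        def recall(self, cue):
            # Pure feed-forward signal propagation; no weights are changed.
            return self.weights @ cue

    # The script below acts as the external controller: it, not the network,
    # issues the "learn now" and "recall now" signals.
    net = PatternAssociator(n_inputs=4, n_outputs=4)
    cue = np.array([1.0, -1.0, 1.0, -1.0])
    target = np.array([1.0, 1.0, -1.0, -1.0])

    net.learn(cue, target)     # executive decision: training phase
    print(net.recall(cue))     # executive decision: performance phase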

The need for such control is not limited to simple distributed memories. The same is true for a variety of popular and more powerful multilayered network architectures, including multilayered perceptrons and self-organizing networks (Roy, 2008). “There is clearly a central executive that oversees the operation of the back-propagation algorithm” (p. 1436). Roy (2008) proceeded to argue that such control is itself required by brain-like systems, and therefore biologically plausible networks demand not only an explicit account of data transformation, but also a biological theory of executive control.
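Roy’s observation can be illustrated with another small sketch (again hypothetical Python, not Roy’s own formulation): a multilayer perceptron trained by back-propagation. Every decision about sequencing (forward pass, error computation, weight update, when to stop) is made by the training loop outside the network; that loop is the central executive that the network architecture itself does not provide.

    import numpy as np

    rng = np.random.default_rng(0)

    # Minimal two-layer perceptron on the XOR patterns (illustrative only).
    W1 = rng.normal(0, 1, (4, 2)); b1 = np.zeros(4)
    W2 = rng.normal(0, 1, (1, 4)); b2 = np.zeros(1)

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    lr = 1.0

    # The loop below is the "central executive": it decides how many sweeps
    # occur, sequences the forward pass, error computation, and weight
    # updates, and decides when learning stops. None of this is realized
    # in the network architecture itself.
    for epoch in range(5000):
        h = sigmoid(X @ W1.T + b1)           # forward pass, hidden layer
        out = sigmoid(h @ W2.T + b2)         # forward pass, output layer
        err = out - y                        # error signal supplied externally
        delta_out = err * out * (1 - out)    # backpropagated gradients
        delta_h = (delta_out @ W2) * h * (1 - h)
        W2 -= lr * delta_out.T @ h           # weight updates ordered by the loop
        b2 -= lr * delta_out.sum(axis=0)
        W1 -= lr * delta_h.T @ X
        b1 -= lr * delta_h.sum(axis=0)

    print(np.round(sigmoid(sigmoid(X @ W1.T + b1) @ W2.T + b2), 2))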

In summary, connectionist networks generally require the same kind of control that is a typical component of a classical model. Furthermore, it was argued earlier that there does not appear to be any principled distinction between this kind of control and the type that is presumed in an embodied account of cognition. Control is a key characteristic of a cognitive theory, and different schools of thought in cognitive science are united in appealing to the same type of control mechanisms. In short, central control is not a mark of the classical.

7.4 Serial versus Parallel Processing

Classical cognitive science was inspired by the characteristics of digital computers; few would deny that the classical approach exploits the digital computer metaphor (Pylyshyn, 1979a). Computers are existence proofs that physical machines are capable of manipulating, with infinite flexibility, semantically interpretable expressions (Haugeland, 1985; Newell, 1980; Newell & Simon, 1976). Computers illustrate how logicism can be grounded in physical mechanisms.

The connectionist and the embodied reactions to classical cognitive science typically hold that the digital computer metaphor is not appropriate for theories of cognition. It has been argued that the operations of traditional electronic computers are qualitatively different from those of human cognition, and as a result the
