
in general, but because these errors are often systematic, and their systematicity reflects the underlying algorithm.

Because we rely upon their accuracy, we would hope that error evidence would be difficult to collect for most calculating devices. However, error evidence should be easily available for calculators that might be of particular interest to us: humans doing mental arithmetic. We might find, for instance, that overtaxed human calculators make mistakes by forgetting to carry values from one column of numbers to the next. This would provide evidence that mental arithmetic involved representing numbers in columnar form, and performing operations column by column (Newell & Simon, 1972). Very different kinds of errors would be expected if a different approach were taken to perform mental arithmetic, such as imagining and manipulating a mental abacus (Hatano, Miyake, & Binks, 1977).
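The diagnostic value of this kind of error can be made concrete with a small sketch (not from the text): a column-by-column adder in which the carry can be "forgotten," producing exactly the systematic error pattern described above. The function name and the `drop_carries` flag are illustrative assumptions, not anything from the book.

```python
def column_add(a, b, drop_carries=False):
    """Add two non-negative integers column by column, as in
    paper-and-pencil arithmetic. With drop_carries=True, the carry is
    'forgotten' at every column, mimicking one systematic error an
    overtaxed human calculator might make."""
    digits_a = [int(d) for d in str(a)][::-1]  # least significant digit first
    digits_b = [int(d) for d in str(b)][::-1]
    result, carry = [], 0
    for i in range(max(len(digits_a), len(digits_b))):
        da = digits_a[i] if i < len(digits_a) else 0
        db = digits_b[i] if i < len(digits_b) else 0
        total = da + db + carry
        result.append(total % 10)          # digit written in this column
        carry = 0 if drop_carries else total // 10
    if carry:
        result.append(carry)
    return int("".join(str(d) for d in reversed(result)))

print(column_add(58, 67))                     # correct algorithm: 125
print(column_add(58, 67, drop_carries=True))  # forgotten carries: 15
```

The point is that the erroneous answer (15) is not random noise: its specific form is predicted by a columnar representation plus a dropped carry, which is why such errors count as evidence about the algorithm.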

In summary, discovering and describing what algorithm is being used to calculate an input-output mapping involves the systematic examination of behaviour. That is, one makes and interprets measurements that provide relative complexity evidence, intermediate state evidence, and error evidence. Furthermore, the algorithm that will be inferred from such measurements is in essence a sequence of actions or behaviours that will produce a desired result.

The discovery and description of an algorithm thus involves empirical methods and vocabularies, rather than the formal ones used to account for input-output regularities. Just as it would seem likely that input-output mappings would be the topic of interest for formal researchers such as cyberneticists, logicians, or mathematicians, algorithmic accounts would be the topic of interest for empirical researchers such as experimental psychologists.

The fact that computational accounts and algorithmic accounts are presented in different vocabularies suggests that they describe very different properties of a device. From our discussion of black boxes, it should be clear that a computational account does not provide algorithmic details: knowing what input-output mapping is being computed is quite different from knowing how it is being computed. In a similar vein, algorithmic accounts are silent with respect to the computation being carried out.

For instance, in Understanding Cognitive Science, Dawson (1998) provides an example machine table for a Turing machine that adds pairs of integers. Dawson also provides examples of questions to this device (e.g., strings of blanks, 0s, and 1s) as well as the answers that it generates. Readers of Understanding Cognitive Science can pretend to be the machine head by following the instructions of the machine table, using pencil and paper to manipulate a simulated ticker tape. In this fashion they can easily convert the initial question into the final answer; they fully understand the algorithm. However, they are unable to say what the algorithm accomplishes until they read further in the book.
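The pencil-and-paper exercise can be sketched in code: a simulator that blindly follows a machine table, just as the reader playing the machine head does. The table below is a hypothetical one for unary addition (it is not the table from Dawson, 1998); all names here are illustrative assumptions.

```python
def run_turing_machine(table, tape_string, start_state="find_sep",
                       halt_state="halt", blank="B", max_steps=1000):
    """Follow a machine table step by step, exactly as a reader with
    pencil and paper would: read the symbol under the head, look up
    (state, symbol), write, move, and change state."""
    tape = dict(enumerate(tape_string))  # sparse tape: position -> symbol
    head, state = 0, start_state
    for _ in range(max_steps):
        if state == halt_state:
            break
        symbol = tape.get(head, blank)
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += {"R": 1, "L": -1, "N": 0}[move]
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Hypothetical machine table: the tape holds "m ones, a 0, n ones";
# the machine rewrites it into m + n ones.
TABLE = {
    ("find_sep", "1"): ("1", "R", "find_sep"),  # skip the first addend
    ("find_sep", "0"): ("1", "R", "find_end"),  # turn the separator into a 1
    ("find_end", "1"): ("1", "R", "find_end"),  # skip the second addend
    ("find_end", "B"): ("B", "L", "erase"),     # step back from the blank
    ("erase", "1"):    ("B", "N", "halt"),      # erase the surplus 1
}

print(run_turing_machine(TABLE, "11011"))  # 2 + 2 -> "1111"
```

Notice that nothing in the simulation loop "knows" this is addition: like Dawson's readers, it can carry the question to its answer while remaining silent about what the computation accomplishes.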

Multiple Levels of Investigation 45
