
Mind, Body, World: Foundations of Cognitive Science, 2013a


the only time that output unit activity will equal 1 is when both input units are activated with 1 (i.e., when p and q are both true). This is because this situation produces a net input of 2, which exceeds the threshold. In all other cases, the net input will be either 1 or 0, which is less than the threshold and therefore produces output unit activity of 0.
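The behaviour just described can be sketched as a small program. Figure 4-3 is not reproduced here, so the particular values below (connection weights of 1 and a threshold of 1.5, which sits between the sub-threshold net inputs of 0 and 1 and the supra-threshold net input of 2) are assumptions consistent with the text, not a transcription of the figure.

```python
# A minimal sketch of a single-unit perceptron computing AND.
# Weights of 1 and a threshold of 1.5 are assumed from the text above.

def heaviside(net, threshold):
    """Heaviside step activation: activity is 1 only if net input exceeds the threshold."""
    return 1 if net > threshold else 0

def and_unit(p, q, threshold=1.5):
    net = 1 * p + 1 * q   # each connection weight is assumed to be 1
    return heaviside(net, threshold)

for p in (0, 1):
    for q in (0, 1):
        print(p, q, and_unit(p, q))
```

Only the (1, 1) input drives the net input above the threshold, so only that pattern produces an output of 1.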

The ability of the Figure 4-3 perceptron to compute AND can be described in terms of the pattern space in Figure 4-2A. The threshold and the connection weights of the perceptron provide the location and orientation of the single straight cut that carves the pattern space into decision regions (the dashed line in Figure 4-2A). Activating the input units with some pattern presents a pattern space location to the perceptron. The perceptron examines this location to decide on which side of the cut the location lies, and responds accordingly.

This pattern space account of the Figure 4-3 perceptron also points to a limitation. When the Heaviside step function is used as an activation function, the perceptron defines only a single straight cut through the pattern space and therefore can only deal with linearly separable problems. A perceptron akin to the one illustrated in Figure 4-3 would not be able to compute XOR (Figure 4-2B), because the output unit is incapable of making the two required cuts in the pattern space.
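This limitation can be checked by brute force: no choice of two weights and a threshold for a single Heaviside unit reproduces the XOR truth table. The search grid below is an illustrative assumption, not something given in the text.

```python
# Brute-force check that no single straight cut computes XOR.
# The grid of candidate weights/thresholds is an assumed illustration.
import itertools

def unit(p, q, w1, w2, t):
    """One Heaviside unit: a single straight cut through the pattern space."""
    return 1 if w1 * p + w2 * q > t else 0

xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

grid = [x / 2 for x in range(-8, 9)]   # candidate values in [-4, 4], step 0.5
solutions = [
    (w1, w2, t)
    for w1, w2, t in itertools.product(grid, repeat=3)
    if all(unit(p, q, w1, w2, t) == out for (p, q), out in xor_table.items())
]
print(len(solutions))   # 0: no single-unit perceptron on this grid computes XOR
```

The same search finds many solutions for AND or OR, which are linearly separable; for XOR it finds none, as the pattern space argument predicts.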

How does one extend computational power beyond the perceptron? One approach is to add additional processing units, called hidden units, which are intermediaries between input and output units. Hidden units can detect additional features that transform the problem by increasing the dimensionality of the pattern space. As a result, the use of hidden units can convert a linearly nonseparable problem into a linearly separable one, permitting a single binary output unit to generate the correct responses.

Figure 4-4 shows how the AND circuit illustrated in Figure 4-3 can be added as a hidden unit to create a multilayer perceptron that can compute the linearly nonseparable XOR operation (Rumelhart, Hinton, & Williams, 1986a). This perceptron also has two input units whose activities respectively represent the states of the propositions p and q. Each of these input units sends a signal through a connection to an output unit; the figure indicates that the weight of each connection is 1. The threshold of the output unit's activation function is 0.5. If we were to ignore the hidden unit in this network, the output unit would be computing OR, turning on when one or both of the input propositions are true.

However, this network does not compute OR, because the input units are also connected to a hidden unit, which in turn sends a third signal to be added into the output unit's net input. The hidden unit is identical to the AND circuit from Figure 4-3. The signal that it sends to the output unit is strongly inhibitory; the weight of the connection between the two units is –2.
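The whole Figure 4-4 network can be sketched directly from this description. The output unit's weights (1, 1) and threshold (0.5) and the inhibitory weight (–2) come from the text; the hidden AND unit's threshold of 1.5 is an assumption consistent with the earlier description of Figure 4-3.

```python
# A sketch of the Figure 4-4 multilayer perceptron described above.
# Output weights 1, 1, threshold 0.5, and hidden-to-output weight -2 are from
# the text; the hidden unit's threshold of 1.5 is assumed.

def heaviside(net, threshold):
    return 1 if net > threshold else 0

def xor_network(p, q):
    hidden = heaviside(1 * p + 1 * q, 1.5)   # the AND unit from Figure 4-3
    net = 1 * p + 1 * q + (-2) * hidden      # strongly inhibitory hidden signal
    return heaviside(net, 0.5)

for p in (0, 1):
    for q in (0, 1):
        print(p, q, xor_network(p, q))
```

When only one proposition is true, the hidden unit stays off and the network behaves like OR; when both are true, the hidden unit fires and its –2 signal cancels the net input of 2, turning the output off, which is exactly XOR.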

Elements of Connectionist Cognitive Science 145
