
training set; if they are the same, the network has learned the correct computation for that input. If they are different, the second phase occurs. During the second phase, the weights between the neurons are adjusted by propagating the error backwards, using equation (13) to change the weights.
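
A minimal sketch of these two phases is given below, assuming a standard gradient-descent (delta-rule) form for the backward update; the small two-layer network, sigmoid activation, and learning rate are illustrative choices and are not equation (13) itself.

    # Sketch of one training step: forward comparison, then backward weight adjustment.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def train_step(w_hidden, w_out, x, target, lr=0.5):
        # Phase 1: forward pass -- compute the network's output for this input.
        h = sigmoid(w_hidden @ x)      # hidden-layer activations
        y = sigmoid(w_out @ h)         # output-layer activations

        if np.allclose(np.round(y), target):
            return w_hidden, w_out     # output matches the training set: no change

        # Phase 2: propagate the error backwards and adjust the weights.
        err_out = (target - y) * y * (1 - y)             # output-layer error term
        err_hid = (w_out.T @ err_out) * h * (1 - h)      # error pushed back to hidden layer
        w_out = w_out + lr * np.outer(err_out, h)        # each weight change is proportional
        w_hidden = w_hidden + lr * np.outer(err_hid, x)  # to the error propagated to it
        return w_hidden, w_out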

Weight space refers to the space of all possible weight combinations. It is generally true that the more weights available to adjust (i.e., the higher the weight space dimensionality), the more likely it is that a solution will be found. As a simple example, consider the mapping shown in the table in Figure C.7. Four patterns are mapped into a single value, computing an Exclusive-OR function. A binary representation is shown at the left. A ‘distributed’ representation is also shown that encodes the same information in a four-element, rather than two-element, vector. For each encoding, a network is shown in Figure C.7. For the binary representation, there are no connection weights such that a single neuron can achieve the mapping. However, with the distributed representation, it is a simple matter to find a set of weights which solve the problem, as the short check after the figure illustrates. There are several arguments for why this strategy tends to be successful, but it suffices to say that mappings are more likely to be realized in a higher-dimensional space.

    Input (Binary)    Input (Distributed)    Output
    0 0               0 0 0 1                0
    0 1               0 0 1 0                1
    1 0               0 1 0 0                1
    1 1               1 0 0 0                0

[Two network diagrams, one for the binary encoding and one for the distributed encoding.]

Figure C.7 Two encodings of the same function.
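
The following sketch checks the claim above: with the four-element distributed (one-hot) encoding from the table, a single linear threshold unit realizes the Exclusive-OR mapping, whereas no single-unit weight set exists for the two-bit binary encoding. The particular weights and threshold are illustrative choices, not the ones in Figure C.7.

    # Verify the distributed-encoding solution and search the binary case exhaustively.
    import itertools

    def unit(weights, bias, x):
        """Single linear threshold neuron: fires iff the weighted sum plus bias exceeds 0."""
        return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

    # Distributed (one-hot) encoding from the table: one element per input pattern.
    distributed = {(0, 0): (0, 0, 0, 1), (0, 1): (0, 0, 1, 0),
                   (1, 0): (0, 1, 0, 0), (1, 1): (1, 0, 0, 0)}
    xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

    # One easy solution: weight the two "mixed" pattern elements positively.
    w_dist, b_dist = (0, 1, 1, 0), -0.5
    assert all(unit(w_dist, b_dist, distributed[p]) == xor[p] for p in xor)

    # For the raw two-bit encoding, an exhaustive search over small integer weights
    # finds no single unit that realizes the mapping (XOR is not linearly separable).
    found = any(all(unit((w1, w2), b, p) == xor[p] for p in xor)
                for w1, w2, b in itertools.product(range(-3, 4), repeat=3))
    print("distributed encoding solved: True | binary encoding solvable:", found)

Running this prints that the binary case is not solvable by a single unit, while the distributed case is, which is the sense in which the higher-dimensional weight space makes the mapping easier to realize.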
