
Chapter 4 - Planning of Prehension

Table 4.1 Training set for neural network. A task is defined by task
requirements, which are stated in terms relative to a hand and its
capability. For the given task requirements, either pad or palm
opposition is selected (from Iberall, 1988; adapted by permission).

                                              TASK REQUIREMENTS             CHOSEN
TASK                             Surface     Object   Size of    Precision  OPPOS-
                                 Length      Width    Forces                ITION
                                                      Needed

Lift heavy beer mug              >4 fingers  large    large      low        PALM
Lift long steel cylinder         >4 fingers  medium   large      low        PALM
Lift short cylinder              3 fingers   medium   large      low        PALM
Place wide short steel cylinder  2 fingers   large    medium     high       PAD
Lift large glass                 >4 fingers  large    medium     medium     PAD
Place cylinder                   >4 fingers  medium   small      high       PAD
Lift small disk                  1 finger    small    small      medium     PAD
Lift med. disk                   1 finger    medium   small      low        PAD
Place med. disk                  1 finger    medium   small      high       PAD
Throw med. disk                  1 finger    medium   medium     low        PAD
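As a rough illustration of how the Table 4.1 task requirements could be presented to such a network, the sketch below encodes each row as a numeric input vector and a pad/palm target. The exact coding used in the original simulation is not given on this page, so the level values, the finger scaling, and the helper name encode are illustrative assumptions only.

```python
# Hypothetical numeric encoding of the Table 4.1 training set.
# Assumed coding (not from the text): small/low -> 0.0, medium -> 0.5,
# large/high -> 1.0; finger counts scaled into [0, 1]; PAD = 0, PALM = 1.

FINGERS = {"1 finger": 0.0, "2 fingers": 0.25, "3 fingers": 0.5, ">4 fingers": 1.0}
LEVEL = {"small": 0.0, "low": 0.0, "medium": 0.5, "large": 1.0, "high": 1.0}
OPPOSITION = {"PAD": 0.0, "PALM": 1.0}

# (surface length, object width, size of forces needed, precision) -> opposition
TRAINING_SET = [
    (">4 fingers", "large",  "large",  "low",    "PALM"),  # lift heavy beer mug
    (">4 fingers", "medium", "large",  "low",    "PALM"),  # lift long steel cylinder
    ("3 fingers",  "medium", "large",  "low",    "PALM"),  # lift short cylinder
    ("2 fingers",  "large",  "medium", "high",   "PAD"),   # place wide short steel cylinder
    (">4 fingers", "large",  "medium", "medium", "PAD"),   # lift large glass
    (">4 fingers", "medium", "small",  "high",   "PAD"),   # place cylinder
    ("1 finger",   "small",  "small",  "medium", "PAD"),   # lift small disk
    ("1 finger",   "medium", "small",  "low",    "PAD"),   # lift med. disk
    ("1 finger",   "medium", "small",  "high",   "PAD"),   # place med. disk
    ("1 finger",   "medium", "medium", "low",    "PAD"),   # throw med. disk
]

def encode(row):
    """Turn one table row into an input vector and a target value."""
    length, width, forces, precision, opposition = row
    x = [FINGERS[length], LEVEL[width], LEVEL[forces], LEVEL[precision]]
    y = [OPPOSITION[opposition]]
    return x, y
```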

computed output, a logistic activation function was used. A momentum term was added in order to increase the learning rate. An error cutoff of 0.05 was used to indicate that the network had learned the training set. It took 833 repetitions of the training data to converge on a solution. The weights into which the network settled are seen in Figure 4.8b. The first row of squares from the bottom represents the weights on the links from the input units to the first hidden neuron (the hidden neuron on the left in Figure 4.8a). The size of each square is proportional to the magnitude of the weight: larger squares mean a larger influence. A negative influence is shown by black squares, a positive one by white. The second row of squares from the bottom represents the weights to the second hidden neuron, and so on. As can be seen, the squares in the third row from the bottom are close to zero in size, and thus the third hidden unit from the left has little
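A minimal sketch of the training procedure just described is given below: a small feedforward network with logistic activations, trained by backpropagation with a momentum term, stopping once every training pattern falls within the 0.05 error cutoff. Only the logistic activation, the momentum term, and the error criterion come from the text; the layer sizes, learning rate, momentum constant, weight initialization, and batch-style update are assumptions, and the sketch reuses the hypothetical TRAINING_SET and encode defined above.

```python
import numpy as np

rng = np.random.default_rng(0)

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed architecture: 4 inputs (the Table 4.1 requirements), 4 hidden
# units, 1 output (pad vs. palm opposition). Sizes are not stated here.
n_in, n_hidden, n_out = 4, 4, 1
W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
b2 = np.zeros(n_out)

lr, momentum = 0.5, 0.9                      # assumed values, not from the text
dW1 = np.zeros_like(W1); db1 = np.zeros_like(b1)
dW2 = np.zeros_like(W2); db2 = np.zeros_like(b2)

X = np.array([encode(row)[0] for row in TRAINING_SET])
Y = np.array([encode(row)[1] for row in TRAINING_SET])

for epoch in range(10_000):
    # forward pass with logistic activation at both layers
    h = logistic(X @ W1.T + b1)
    y = logistic(h @ W2.T + b2)

    err = Y - y
    # stop once every pattern is within the 0.05 error cutoff
    if np.all(np.abs(err) < 0.05):
        print(f"converged after {epoch} passes through the training set")
        break

    # backpropagate the error through the logistic units
    delta_out = err * y * (1.0 - y)
    delta_hid = (delta_out @ W2) * h * (1.0 - h)

    # gradient step plus momentum term (a fraction of the previous update)
    dW2 = lr * delta_out.T @ h + momentum * dW2
    db2 = lr * delta_out.sum(axis=0) + momentum * db2
    dW1 = lr * delta_hid.T @ X + momentum * dW1
    db1 = lr * delta_hid.sum(axis=0) + momentum * db1
    W2 += dW2; b2 += db2
    W1 += dW1; b1 += db1
```

With settings like these the network typically reaches the cutoff in a few hundred to a few thousand passes; the text reports 833 repetitions for the original simulation.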
