
…reasons” as would the decision tree. This is why Dawson et al. hoped that this classical theory would literally be translated into the network.

Apart from the output unit behaviour, how could one support the claim that a classical theory had been translated into a connectionist network? Dawson et al. (2000) interpreted the internal structure of the network in an attempt to see whether such a network analysis would reveal an internal representation of the classical algorithm. If this were the case, then standard training practices would have succeeded in translating the classical algorithm into a PDP network.

One method that Dawson et al. (2000) used to interpret the trained network was a multivariate analysis of the network’s hidden unit space. They represented each mushroom as the vector of five hidden unit activation values that it produced when presented to the network. They then performed a k-means clustering of these data. K-means clustering is an iterative procedure that assigns data points to k different clusters in such a way that each member of a cluster is closer to the centroid of that cluster than to the centroid of any other cluster to which other data points have been assigned.
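As a rough sketch of this step (in Python, and not code from Dawson et al., 2000), the clustering can be expressed as follows. The array name, the file it is loaded from, and the use of scikit-learn’s KMeans are assumptions made for illustration; the value of k is the one Dawson et al. eventually settled on, and how it is chosen is discussed below.

```python
# Illustrative sketch: represent each mushroom by its vector of five hidden unit
# activations and cluster those vectors with k-means.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical (n_mushrooms, 5) array: row i holds the hidden unit activities
# produced when mushroom i is presented to the trained network.
hidden_activities = np.load("hidden_activities.npy")

k = 12  # the number of clusters ultimately used; the stopping rule is described below
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0)
labels = kmeans.fit_predict(hidden_activities)  # one cluster label per mushroom
```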

However, whenever cluster analysis is performed, one question that must be answered is how many clusters should be used; in other words, what should the value of k be? An answer to this question is called a stopping rule. Unfortunately, no single stopping rule has been agreed upon (Aldenderfer & Blashfield, 1984; Everitt, 1980). As a result, there exist many different methods for determining k (Milligan & Cooper, 1985).

While no general method exists for determining the optimal number of clusters, one can take advantage of heuristic information concerning the domain being clustered in order to come up with a satisfactory stopping rule for that domain. Dawson et al. (2000) argued that when the hidden unit activities of a trained network are being clustered, there must be a correct mapping from these activities to output responses, because the trained network itself has discovered one such mapping. They used this position to create the following stopping rule: “Extract the smallest number of clusters such that every hidden unit activity vector assigned to the same cluster produces the same output response in the network.” Applying this rule, they determined that the k-means analysis of the network’s hidden unit activity patterns required the use of 12 different clusters.
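The stopping rule itself is simple enough to sketch in code. The following is only an illustration of the idea, not the procedure actually used by Dawson et al. (2000); the function name, the cap on k, and the assumption that the network’s output responses have been collected into a NumPy array of labels are all choices made for this sketch.

```python
# Illustrative sketch of the stopping rule: increase k until every cluster contains
# hidden unit vectors that all produce the same output response, then stop.
import numpy as np
from sklearn.cluster import KMeans

def smallest_consistent_k(hidden_activities, output_responses, max_k=30):
    """Return the smallest k whose clusters are 'pure' with respect to output responses.

    hidden_activities: (n_patterns, n_hidden) array of activation vectors.
    output_responses: length-n_patterns NumPy array giving the network's output
    response for each pattern. Both names are assumptions for this sketch.
    """
    for k in range(1, max_k + 1):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(hidden_activities)
        pure = all(
            len(np.unique(output_responses[labels == c])) == 1  # one response per cluster?
            for c in range(k)
        )
        if pure:
            return k, labels
    return None, None
```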

Dawson et al. (2000) then proceeded to examine the mushroom patterns that belonged to each cluster in order to determine what they had in common. For each cluster, they determined the set of descriptive features that all of its mushrooms shared. They realized that each set of shared features they identified could be thought of as a condition, represented internally by the network as a vector of hidden unit activities, which results in the network producing a particular action, namely the edible/poisonous judgement represented by the first output unit.
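One way to picture this feature-extraction step is as a set intersection over the mushrooms assigned to each cluster. The sketch below is only an illustration of that idea under assumed variable names; it is not code from Dawson et al. (2000).

```python
# Illustrative sketch: find the descriptive features shared by every mushroom in a cluster.
# mushroom_features[i] is the set of features of mushroom i, and labels[i] is the cluster
# assigned to its hidden unit activity vector (both names are assumptions).

def shared_features_per_cluster(mushroom_features, labels):
    shared = {}
    for i, cluster in enumerate(labels):
        feats = set(mushroom_features[i])
        # Intersect this mushroom's features with those already accumulated for its cluster.
        shared[cluster] = feats if cluster not in shared else shared[cluster] & feats
    return shared  # each value is a candidate "condition" paired with that cluster's action
```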

