Embodiment in cognitive systems - David Vernon



Chapter 1

Embodiment in cognitive systems: on the mutual dependence of cognition and robotics

David Vernon, Giorgio Metta and Giulio Sandini

Abstract

Cognitive systems anticipate, assimilate, and adapt. They develop and learn, predict, explain, and imagine. In this chapter, we look briefly at the two principal paradigms of cognition, cognitivism and emergence, to determine what embodied form each entails, if any. We highlight one specific emergent approach to embodied cognition – enaction – and discuss the challenges it poses for the advancement of both robotics and cognition.

1.1 Introduction

In recent years, robotic systems have increasingly been deployed in the empirical study of cognition. In this introductory chapter, we will explore the basis for the relationship between these two areas, with the specific goal of teasing out exactly what role robotics plays in cognition research and whether or not it is a necessary role. To do this, we briefly identify the different approaches that have been taken to modelling and realising cognitive systems and the relevance of embodiment to each approach. We then examine what it actually means to be embodied and identify different forms of embodiment. By considering the different pairings of modes of cognition and modes of embodiment, we can reveal the different roles of robotics in cognition research. We focus on one particular pairing which forms the cornerstone of enaction, a far-reaching paradigm of adaptive intelligence founded on generative (i.e., self-constructed) autonomous development and learning [1, 2]. Our goal here in highlighting enactive systems is not to present a comprehensive review of the field but to identify its chief tenets and anchor the important relationship between cognition and robotics in a significant research paradigm. We conclude with an overview of the challenges facing both robotics and cognitive systems in pushing forward their related research agendas.

Nefti-Meziani_Chapter01_p001-012 15 February 2010; 16:49:41



Advances in cognitive systems

1.2 Cognition

Cognitive systems anticipate, assimilate, and adapt; they develop and they learn [3]. Cognitive systems predict future events when selecting actions; they subsequently learn from what actually happens when they do act, and thereby they modify subsequent predictions, in the process changing how things are perceived and what actions are appropriate. Typically, they operate autonomously. To do this, cognitive systems generate and use knowledge which can be deployed not just for prediction, looking forward in time, but also for explanation, looking back in time, and even for imagination, exploring counterfactual scenarios.

The hallmark of a cognitive system is that it can function effectively in circumstances that were not planned for explicitly when the system was designed. That is, it has a degree of plasticity and is resilient in the face of the unexpected [4]. This adaptive, anticipatory, autonomous viewpoint reflects the position of Freeman and Núñez who, in their book Reclaiming Cognition [5], assert the primacy of action, intention, and emotion in cognition. In the past, however, cognition was viewed in a very different light as a symbol-processing module of mind concerned with rational planning and reasoning.
Today, however, this is changing and even proponents of these early approaches now see a much tighter relationship between perception, action, and cognition (e.g., see [6, 7]).

So, if cognitive systems anticipate, assimilate, and adapt; if they develop and learn, predict, explain, and imagine, the question is how do they do this? Unsurprisingly, many approaches have been taken to address this problem. Among these, however, we can discern two broad classes: the cognitivist approach based on symbolic information-processing representational systems, and the emergent systems approach, embracing connectionist systems, dynamical systems, and enactive systems, all based to a lesser or greater extent on principles of self-organisation [8, 9].

Cognitivist approaches correspond to the classical and still common view that cognition is a type of computation [10], a process that is symbolic, rational, encapsulated, structured, and algorithmic [11]. In contrast, advocates of connectionist, dynamical, and enactive systems approaches argue in favour of a position that treats cognition as emergent, self-organising, and dynamical [12]. Although the use of symbols is often presented as the definitive difference between the cognitivist and emergent positions, they differ more deeply and more fundamentally, well beyond a simple distinction based on symbol manipulation.
For example, each adopts a different stance on computational operation, representational framework, semantic grounding, temporal constraints, inter-agent epistemology, embodiment, perception, action, anticipation, adaptation, motivation, and autonomy [3].

Cognitivism asserts that cognition involves computations defined over internal representations of knowledge, in a process whereby information about the world is abstracted by perception, represented using some appropriate symbolic data-structure, reasoned about, and then used to plan and act in the world. The approach has also been labelled by many as the information processing (or symbol manipulation) approach to cognition [8, 11–17].

For cognitivists, cognition is representational in a strong and particular sense: it entails the manipulation of explicit symbolic representations of the state and behaviour of the external world and the storage of the knowledge gained from experience to reason even more effectively in the future [18]. Perception is concerned with the abstraction of faithful spatio-temporal representations of the external world from sensory data.

In most cognitivist approaches concerned with building artificial cognitive systems, the symbolic representations are, initially at least, the product of a human designer: the designer's description of his view of the world. This is significant because it means that these representations can be directly accessed and interpreted by others. It has been argued that this is also the key limiting factor of cognitivist systems, as these programmer-dependent representations effectively blind the system [19], constraining it to an idealised description that is a consequence of the cognitive requirements of human activity. This approach works well as long as the assumptions that are implicit in the designer's model are valid. As soon as they are not, the system fails.
Sometimes, cognitivist systems deploy machine learning and probabilistic modelling in an attempt to deal with the inherently uncertain and incomplete nature of the sensory data that is being used to drive this representational framework. However, this doesn't alter the fact that the representational structure is still predicated on the descriptions of the designers.

Emergent approaches take a very different view of cognition. They view cognition as the process whereby an autonomous system becomes viable and effective in its environment. It does so through a process of self-organisation through which the system is continually reconstituting itself in real-time to maintain its operational identity. For emergent approaches, perception is concerned with the acquisition of sensory data to enable effective action [20] and is dependent on the richness of the action interface [21]. It is not a process whereby the structure of an absolute external environment is abstracted and represented in a more or less isomorphic manner.

In contrast to the cognitivist approach, many emergent approaches assert that the primary model for cognitive learning is anticipative skill construction rather than knowledge acquisition, and that processes which both guide action and improve the capacity to guide action while doing so are taken to be the root capacity for all intelligent systems [22].
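The anticipative loop just described – a process that guides action while improving its own capacity to guide action – can be sketched minimally. All class and variable names, and the toy two-action world, are hypothetical illustrations, not taken from the chapter:

```python
# Minimal sketch of anticipative skill construction (hypothetical names):
# the agent predicts each action's outcome, acts, observes what actually
# happens, and updates its predictions from the error -- improving its
# capacity to guide action while acting.

class AnticipativeAgent:
    def __init__(self, actions, learning_rate=0.5):
        # One running outcome estimate per action, initially neutral.
        self.predictions = {a: 0.0 for a in actions}
        self.learning_rate = learning_rate

    def select_action(self):
        # Anticipation guides action: choose the action whose
        # predicted outcome is currently best.
        return max(self.predictions, key=self.predictions.get)

    def update(self, action, observed_outcome):
        # Learn from what actually happened: move the prediction
        # towards the observed outcome.
        error = observed_outcome - self.predictions[action]
        self.predictions[action] += self.learning_rate * error


# Toy environment (an assumption for illustration): action "b"
# reliably yields the better outcome.
outcomes = {"a": 0.2, "b": 1.0}
agent = AnticipativeAgent(["a", "b"])
for _ in range(20):
    # Try every action so all predictions become experientially grounded.
    for action in outcomes:
        agent.update(action, outcomes[action])
```

After this experiential loop the agent's predictions converge on the actual contingencies and its action selection reflects them; the point of the sketch is only that prediction and skill improve through the same acting-and-observing cycle, not that this particular update rule is the one the emergent paradigm prescribes.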
Cognitivism entails a self-contained abstract model that is disembodied in principle, and the physical instantiation of the system plays no part in the model of cognition [4, 23]. In contrast, emergent approaches are intrinsically embodied and the physical instantiation plays a pivotal role in cognition.

1.3 Embodiment

What exactly is it to be embodied? One form of embodiment, and clearly the type envisaged by proponents of the emergent systems approach to cognition, is a physically active body capable of moving in space, manipulating its environment, altering the state of the environment, and experiencing the physical forces associated with that manipulation [24]. But there are other forms of embodiment. Ziemke introduced a framework to characterise five different types of embodiment [25, 26]:

1. Structural coupling between agent and environment, in the sense that a system can be perturbed by its environment and can in turn perturb its environment;
2. Historical embodiment as a result of a history of structural coupling;
3. Physical embodiment in a structure that is capable of forcible action (this excludes software agents);
4. Organismoid embodiment, i.e., organism-like bodily form (e.g., humanoid or rat-like robots); and
5. Organismic embodiment of autopoietic living systems.

This list is ordered by increasing specificity and physicality. Structural coupling entails that the system can influence and be influenced by the physical world. A computer controlling a power plant or a computerised cruise control in a passenger car would satisfy this level of embodiment; a computer game would not necessarily do so. Historical embodiment adds the incorporation of a history of structural coupling to this level of physical interaction, so that past interactions shape the embodiment.
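Because the five levels are ordered by increasing specificity and physicality, they can be rendered as an ordered enumeration. This is a hypothetical encoding, useful only for making the ordering explicit; the cruise-control example follows the text:

```python
from enum import IntEnum

class Embodiment(IntEnum):
    """Ziemke's five types of embodiment, ordered by increasing
    specificity and physicality (a hypothetical encoding)."""
    STRUCTURAL_COUPLING = 1  # perturbs and is perturbed by its environment
    HISTORICAL = 2           # shaped by a history of structural coupling
    PHYSICAL = 3             # capable of forcible action (no software agents)
    ORGANISMOID = 4          # organism-like bodily form (e.g., humanoid)
    ORGANISMIC = 5           # autopoietic living systems

# Because the levels are ordered, comparisons are direct:
cruise_control = Embodiment.STRUCTURAL_COUPLING
humanoid_robot = Embodiment.ORGANISMOID

# A conventional robot satisfies at least physical embodiment...
assert humanoid_robot >= Embodiment.PHYSICAL
# ...whereas a computerised cruise controller does not.
assert cruise_control < Embodiment.PHYSICAL
```

The `IntEnum` choice simply mirrors the chapter's claim that each level adds constraints to the previous one; nothing in the taxonomy itself mandates this representation.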
Physical embodiment is most closely allied to conventional robot systems, with organismoid embodiment adding the constraint that the robot morphology is modelled on a specific natural species or some feature of natural species. Organismic embodiment corresponds to living beings.

Despite the current emphasis on embodiment in cognitive system research, Ziemke argues that many current approaches in cognitive robotics and epigenetic robotics still adhere to the functionalist hardware/software distinction, in the sense that the computational model does not in principle involve a physical instantiation and, furthermore, that all physical instantiations are equivalent as long as they support the required computations. Ziemke notes that this is a significant problem because the fundamental idea underpinning embodiment is that the morphology of the system is actually a key component of the system's dynamics. The morphology of the cognitive system not only matters, it is a constitutive part of the cognitive process.

1.4 The Role of Robotics in Cognition

Cognitivist systems do not need to be embodied. They are functionalist in principle [5]: cognition comprises computational operations defined over symbolic representations, and these computational operations are not tied to any given instantiation. Although any computational system requires some physical realisation to effect its computations, the underlying computational model is independent of the physical platform on which it is implemented.
For this reason, it has also been noted that cognitivism exhibits a form of mind–body dualism [11, 24].

Cognitivism is also positivist in outlook: all cognitive systems – designer and designed – share a common universally accessible and universally representable world that is apprehended by perception. Consequently, symbolic knowledge about this world, framed in the concepts of the designer, can be programmed in directly and doesn't necessarily have to be developed by the system itself through exploration of the environment. Some cognitivist systems exploit learning to augment or even supplant the a priori designed-in knowledge and thereby achieve a greater degree of adaptiveness and robustness. Embodiment at any of the five levels identified in the previous section may offer an additional degree of freedom to facilitate this learning, but it is by no means necessary.

The perspective from emergent systems is diametrically opposed to the cognitivist position. Emergent systems, by definition, must be embodied and embedded in their environment in a situated historical developmental context [11]. Furthermore, the physical instantiation plays a direct constitutive role in the cognitive process [4, 27, 28].

To see why embodiment is a necessary condition of emergent cognition, consider again what cognition means in the emergent paradigm. It is the process whereby an autonomous system becomes viable and effective in its environment.
In this, there are two complementary things going on: one is the self-organisation of the system as a distinct entity, and the second is the coupling of that entity with its environment. 'Perception, action, and cognition form a single process' [24] of self-organisation in the specific context of environmental perturbations of the system. This gives rise to the ontogenic development of the system over its lifetime. This development is identically the cognitive process of establishing the space of mutually consistent couplings. These environmental perturbations don't control the system since they are not components of the system (and, by definition, don't play a part in the self-organisation) but they do play a part in the ontogenic development of the system. Through this ontogenic development, the cognitive system develops its own epistemology, i.e., its own system-specific knowledge of its world, knowledge that has meaning exactly because it captures the consistency and invariance that emerges from the dynamic self-organisation in the face of environmental coupling. Thus, we can see that, from this perspective, cognition is inseparable from 'bodily action' [24]: without physical embodied exploration, a cognitive system has no basis for development.

1.5 Enaction

Enactive systems [8, 19, 20, 29–32] take the emergent paradigm one step further. The five central concepts of enactive cognitive science are embodiment, experience, emergence, autonomy, and sense-making [2, 33]. In contradistinction to cognitivism, which involves a view of cognition that requires the representation of a given objective predetermined world [8, 34], enaction asserts that cognition is a process whereby the issues that are important for the continued existence of a cognitive entity are brought out or enacted: co-determined by the entity as it interacts with the environment in which it is embedded. The term co-determination [20] is laden with meaning. It implies that the cognitive agent is deeply embedded in the environment and specified by it. At the same time, it implies that the process of cognition determines what is real or meaningful for the agent. In other words, the system's actions define the space of perception. This space of perceptual possibilities is predicated not on an objective environment, but on the space of possible actions that the system can engage in whilst still maintaining the consistency of the coupling with the environment. Co-determination means that the agent constructs its reality (its world) as a result of its operation in that world. In this context, cognitive behaviour is inherently specific to the embodiment of the system and dependent on the system's history of interactions, i.e., its experiences. Thus, nothing is 'pre-given'.
Instead there is an enactive interpretation: a real-time context-based choosing of relevance (i.e., sense-making).

For cognitivism, the role of cognition is to abstract objective structure and meaning through perception and reasoning. For enactive systems, the purpose of cognition is to uncover unspecified regularity and order that can then be construed as meaningful because they facilitate the continuing operation and development of the cognitive system. In adopting this stance, the enactive position challenges the conventional assumption that the world as the system experiences it is independent of the cognitive system ('the knower'). Instead, knower and known 'stand in relation to each other as mutual specification: they arise together' [8].

For an enactive system, knowledge is the effective use of sensorimotor contingencies grounded in the structural coupling in which the nervous system exists. Knowledge is particular to the system's history of interaction. If that knowledge is shared among a society of cognitive agents, it is not because of any intrinsic abstract universality, but because of the consensual history of experiences shared between cognitive agents with similar phylogeny and compatible ontogeny.

The knowledge possessed by an enactive system is built on sensorimotor associations, achieved initially by exploration, and affordances.¹ However, this is only the beginning.
The enactive system uses the knowledge gained to form new knowledge which is then subjected to empirical validation to see whether or not it is warranted (we, as enactive beings, imagine many things but not everything we imagine is plausible or corresponds well with reality, i.e., our phenomenological experience of our environment).

¹ For true enaction, everything is affordance since everything that is experienced is contingent upon the system's own spatiotemporal experience and embodiment.
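The cycle of forming new knowledge from grounded associations and then validating it empirically can be sketched as follows. The toy world, the action names, and the `validate` helper are all hypothetical illustrations, not the chapter's own formulation:

```python
# Toy sketch: an agent records action -> outcome associations from
# exploration, hypothesises ("imagines") a new one, and keeps only
# those that prove empirically warranted when tried against the world.

def world(action):
    # Hypothetical environment: only "push" moves the object.
    return "moved" if action == "push" else "unchanged"

# 1. Ground knowledge in sensorimotor exploration.
knowledge = {}
for action in ["push", "look", "wave"]:
    knowledge[action] = world(action)

# 2. Imagine a new association that has not yet been experienced.
imagined = {"nudge": "moved"}

# 3. Empirically validate: an imagined association is kept only if
#    acting it out produces the predicted outcome.
def validate(hypotheses, environment):
    return {a: o for a, o in hypotheses.items() if environment(a) == o}

warranted = validate(imagined, world)
# Here "nudge" predicts "moved" but the world returns "unchanged",
# so the imagined association is discarded as unwarranted.
```

The sketch deliberately re-grounds imagined knowledge in interaction rather than accepting it on internal consistency alone, mirroring the point that not everything an enactive being imagines corresponds well with its phenomenological experience of the environment.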


One of the key issues in cognition, in general, and enaction, in particular, is the importance of internal simulation in accelerating the scaffolding of this early developmentally acquired sensorimotor knowledge to provide a means to:

● predict future events;
● explain observed events (constructing a causal chain leading to that event);
● imagine new events.

Crucially, there is a need to focus on (re-)grounding predicted, explained, or imagined events in experience so that the system – the robot – can do something new and interact with the environment in a new way.

The dependence of a cognitive system's perceptions and knowledge on its history of coupling (or interaction) with the environment and on the very form or morphology of the system itself has an important consequence: there is no guarantee that the resultant cognition will be consistent with human cognition. This may not be a problem, as long as the system behaves as we would wish it to. On the other hand, if we want to ensure compatibility with human cognition, then we have to admit a stronger humanoid form of embodiment and adopt a domain of discourse that is the same as the one in which we live: one that involves physical movement, forcible manipulation, and exploration [35].

1.6 Challenges

The adoption of an embodied approach to the development of cognitive systems poses many challenges.
We highlight just a few in the following text.

The first challenge is the identification of the phylogenetic configuration and ontogenetic processes. Phylogeny – the evolution of the system configuration from generation to generation – determines the sensory-motor capabilities that a system is configured with at the outset and that facilitate the system's innate behaviours. Ontogenetic development – the adaptation and learning of the system during its lifetime – gives rise to the cognitive capabilities that we seek. To enable development, we must somehow identify a minimal phylogenetic state of the system. In practice, this means that we must identify and effect perceptuo-motor capabilities for the minimal behaviours that ontogenetic development will subsequently build on to achieve cognitive behaviour.

The requirements of real-time synchronous system–environment coupling and historical, situated, and embodied development pose another challenge. Specifically, the maximum rate of ontogenic development is constrained by the speed of coupling (i.e., the interaction) and not by the speed at which internal processing can occur [19]. Natural cognitive systems have a learning cycle measured in weeks, months, and years and, while it might be possible to condense these into minutes and hours for an artificial system because of increases in the rate of internal adaptation and change, it cannot be reduced below the timescale of the interaction. You cannot short-circuit ontogenetic development because it is the agent's own experience that defines its cognitive understanding of the world in which it is embedded. This has serious implications for the degree of cognitive development we can practically expect of these systems.

Development implies the progressive acquisition of predictive anticipatory capabilities by a system over its lifetime through experiential learning. Development depends crucially on motivations which underpin the goals of actions. The two most important motives that drive actions and development are social and explorative. There are at least two exploratory motives, one focussing on the discovery of novelty and regularities in the world, and one focussing on the potential of one's own actions. A challenge that faces all developmental embodied robotic cognitive systems is that of modelling these motivations and their interplay, and identifying how they influence action.

Enactive systems are founded on the principle that the system discovers or constructs for itself a world model that supports its continued autonomy, and makes sense of that world in the context of the system's own morphology-dependent coupling or interaction with that world.
The identification of such generative self-organising processes is pivotal to the future progress of the field. While much current research concentrates on generative processes that focus on sensorimotor perception–action invariances, such as learning affordances, it is not clear at present how to extend this work to generate the more abstract knowledge that will facilitate the prediction, explanation, and imagination that is so characteristic of a true cognitive system.

Finally, development in its fullest sense represents a great challenge for robotics. It is not just the state of the system that is subject to development but also the very morphology, physical properties, and structure of the system – the kinematics and dynamics – that develop and contribute to the emergence of embodied cognitive capabilities. To realise this form of development, we will need new adaptive materials and a new way of thinking to integrate them into our models of cognition.

1.7 Conclusions

Cognitivist systems are dualist, functionalist, and positivist. They are dualist in the sense that there is a fundamental distinction between the mind (the computational processes) and the body (the computational infrastructure and, where required, the devices that effect any physical interaction). They are functionalist in the sense that the actual instantiation and computational infrastructure is inconsequential: any instantiation that supports the symbolic processing is sufficient.
They are positivist in the sense that they assert a unique and absolute empirically accessible external reality that can be modelled and embedded in the system by a human designer. Consequently, embodiment of any type plays no necessary role.


On the mutual dependence of cognition and robotics 9

In the enactive paradigm, the situation is the reverse. The perceptual capacities are a consequence of an historic embodied development and, consequently, are dependent on the richness of the motoric interface of the cognitive agent with its world. That is, the action space defines the perceptual space and thus is fundamentally based in the frame-of-reference of the agent. Consequently, the enactive position is that cognition can only be created in a developmental, agent-centred manner, through interaction, learning, and co-development with the environment. It follows that, through this ontogenetic development, the cognitive system develops its own epistemology, i.e., its own system-specific knowledge of its world, knowledge that has meaning exactly because it captures the consistency and invariance that emerge from the dynamic self-organisation in the face of environmental coupling.

Despite the current emphasis on embodiment in cognitive systems research, many current approaches in cognitive and epigenetic robotics still adhere to the functionalist, dualist hardware/software distinction.
It is not yet clear that researchers have embraced the deep philosophical and scientific commitments of adopting an enactive approach to embodied robotic cognitive systems: the non-functionalist, non-dualist, and non-positivist stance of enaction. It is non-functionalist since the robot body plays a constitutive part in the cognitive process and is not just a physical input–output device. It is non-dualist since there is no distinction between body and mind in the dynamical system that constitutes a cognitive system. It is non-positivist since knowledge in an enactive system is phenomenological and not directly accessible; the best we can hope for is a common epistemology deriving from a shared history of experiences.

There are many challenges to be overcome in pushing back the boundaries of cognitive systems research, particularly in the area of enaction. Foremost among these is the difficult task of identifying the necessary phylogeny and ontogeny of an artificial cognitive system: the requisite cognitive architecture that facilitates both the system's autonomy (i.e., its self-organisation and structural coupling with the environment) and its capacity for development and self-modification. To allow true ontogenetic development, this cognitive architecture must be embodied in a way that allows the system the freedom to explore and interact, and to do so in an adaptive physical form that enables the system to expand its space of possible autonomy-preserving interactions.
This in turn creates a need for new physical platforms that offer a rich repertoire of perception–action couplings and a morphology that can be altered as a consequence of the system's own dynamics. In meeting these challenges, we move well beyond attempts to build cognitivist systems that exploit embedded knowledge and try to see the world the way we designers see it. We even move beyond learning and self-organising systems that uncover for themselves statistical regularities in their perceptions. Instead, we set our sights on building enactive, phenomenologically grounded systems that construct their own understanding of their world through adaptive embodied exploration and social interaction.


Acknowledgement

This work was supported by the European Commission, Project IST-004370 RobotCub, under Strategic Objective 2.3.2.4: Cognitive Systems.

References

1. T. Froese and T. Ziemke. Enactive artificial intelligence: Investigating the systemic organization of life and mind. Artificial Intelligence, 173:466–500, 2009.
2. E. Di Paolo, M. Rohde and H. De Jaegher. Horizons for the enactive mind: Values, social interaction, and play. In J. Stewart, O. Gapenne, and E. Di Paolo, eds., Enaction: Towards a New Paradigm for Cognitive Science. MIT Press, Cambridge, MA, 2008.
3. D. Vernon, G. Metta, and G. Sandini. A survey of artificial cognitive systems: Implications for the autonomous development of mental capabilities in computational agents. IEEE Transactions on Evolutionary Computation, 11(2):151–180, 2007.
4. D. Vernon. The space of cognitive vision. In H. I. Christensen and H. H. Nagel, eds., Cognitive Vision Systems: Sampling the Spectrum of Approaches, LNCS, pp. 7–26. Springer-Verlag, Heidelberg, 2006.
5. W. J. Freeman and R. Núñez. Restoring to cognition the forgotten primacy of action, intention and emotion. Journal of Consciousness Studies, 6(11–12):ix–xix, 1999.
6. J. R. Anderson, D. Bothell, M. D. Byrne, S. Douglass, C. Lebiere, and Y. Qin. An integrated theory of the mind. Psychological Review, 111(4):1036–1060, 2004.
7. P. Langley. An adaptive architecture for physical agents. IEEE/WIC/ACM International Conference on Intelligent Agent Technology, pp. 18–25. IEEE Computer Society Press, Compiegne, France, 2005.
8. F. J. Varela. Whence perceptual meaning?
A cartography of current ideas. In F. J. Varela and J.-P. Dupuy, eds., Understanding Origins – Contemporary Views on the Origin of Life, Mind and Society, Boston Studies in the Philosophy of Science, pp. 235–263. Kluwer Academic Publishers, Dordrecht, 1992.
9. A. Clark. Mindware – An Introduction to the Philosophy of Cognitive Science. Oxford University Press, New York, 2001.
10. Z. W. Pylyshyn. Computation and Cognition (2nd edition). Bradford Books, MIT Press, Cambridge, MA, 1984.
11. E. Thelen and L. B. Smith. A Dynamic Systems Approach to the Development of Cognition and Action. MIT Press/Bradford Books Series in Cognitive Psychology. MIT Press, Cambridge, MA, 1994.
12. J. A. S. Kelso. Dynamic Patterns – The Self-Organization of Brain and Behaviour (3rd edition). MIT Press, Cambridge, MA, 1995.


13. D. Marr. Artificial intelligence – A personal view. Artificial Intelligence, 9:37–48, 1977.
14. A. Newell and H. A. Simon. Computer science as empirical inquiry: Symbols and search. Communications of the Association for Computing Machinery, vol. 19, pp. 113–126, March 1976. Tenth Turing award lecture, ACM, 1975.
15. J. Haugeland. Semantic engines: An introduction to mind design. In J. Haugeland, ed., Mind Design: Philosophy, Psychology, Artificial Intelligence, pp. 1–34. Bradford Books, MIT Press, Cambridge, MA, 1982.
16. S. Pinker. Visual cognition: An introduction. Cognition, 18:1–63, 1984.
17. J. F. Kihlstrom. The cognitive unconscious. Science, vol. 237, pp. 1445–1452, September 1987.
18. E. Hollnagel and D. D. Woods. Cognitive systems engineering: New wine in new bottles. International Journal of Human-Computer Studies, 51:339–356, 1999.
19. T. Winograd and F. Flores. Understanding Computers and Cognition – A New Foundation for Design. Addison-Wesley Publishing Company, Inc., Reading, MA, 1986.
20. H. Maturana and F. Varela. The Tree of Knowledge – The Biological Roots of Human Understanding. New Science Library, Boston and London, 1987.
21. G. H. Granlund. The complexity of vision. Signal Processing, 74:101–126, 1999.
22. W. D. Christensen and C. A. Hooker. An interactivist-constructivist approach to intelligence: Self-directed anticipative learning. Philosophical Psychology, 13(1):5–45, 2000.
23. D. Vernon. Cognitive vision: The case for embodied perception.
Image and Vision Computing, in press:1–14, 2007.
24. E. Thelen. Time-scale dynamics and the development of embodied cognition. In R. F. Port and T. van Gelder, eds., Mind as Motion – Explorations in the Dynamics of Cognition, pp. 69–100. Bradford Books, MIT Press, Cambridge, MA, 1995.
25. T. Ziemke. Are robots embodied? In C. Balkenius, J. Zlatev, K. Dautenhahn, H. Kozima and C. Breazeal, eds., Proceedings of the First International Workshop on Epigenetic Robotics – Modeling Cognitive Development in Robotic Systems, Lund University Cognitive Studies, vol. 85, pp. 75–83, Lund, Sweden, 2001.
26. T. Ziemke. What's that thing called embodiment? In R. Alterman and D. Kirsh, eds., Proceedings of the 25th Annual Conference of the Cognitive Science Society, pp. 1134–1139. Lawrence Erlbaum, Mahwah, NJ, 2003.
27. J. L. Krichmar and G. M. Edelman. Principles underlying the construction of brain-based devices. In T. Kovacs and J. A. R. Marshall, eds., Proceedings of AISB '06 – Adaptation in Artificial and Biological Systems,


Symposium on Grand Challenge 5: Architecture of Brain and Mind, vol. 2, pp. 37–42. University of Bristol, Bristol, 2006.
28. H. Gardner. Multiple Intelligences: The Theory in Practice. Basic Books, New York, 1993.
29. H. Maturana. Biology of cognition. Research Report BCL 9.0, University of Illinois, Urbana, IL, 1970.
30. H. Maturana. The organization of the living: A theory of the living organization. International Journal of Man-Machine Studies, 7(3):313–332, 1975.
31. H. R. Maturana and F. J. Varela. Autopoiesis and Cognition – The Realization of the Living. Boston Studies on the Philosophy of Science. D. Reidel Publishing Company, Dordrecht, Holland, 1980.
32. F. Varela. Principles of Biological Autonomy. Elsevier North Holland, New York, 1979.
33. E. Thompson. Mind in Life: Biology, Phenomenology, and the Sciences of Mind. Harvard University Press, Boston, MA, 2007.
34. T. van Gelder and R. F. Port. It's about time: An overview of the dynamical approach to cognition. In R. F. Port and T. van Gelder, eds., Mind as Motion – Explorations in the Dynamics of Cognition, pp. 1–43. Bradford Books, MIT Press, Cambridge, MA, 1995.
35. R. A. Brooks. Flesh and Machines: How Robots Will Change Us. Pantheon Books, New York, 2002.
