IROS 2011 Awards - Lirmm

<strong>2011</strong> IEEE/RSJ International Conference<br />

on Intelligent Robots and Systems<br />

September 25-30, <strong>2011</strong><br />

San Francisco, USA<br />

Celebrating 50 Years of Robotics<br />

Co-Sponsoring Societies<br />

Corporate Sponsors


Foreword<br />

The spring of 1961 marks an important milestone in the history of robotics. The world's first<br />

working robot, the Unimate, developed by Devol and Engelberger, was deployed in the<br />

assembly line at General Motors for die casting handling. The centuries-old robot concept<br />

finally came to fruition in the form of a working machine in production. Fifty years later,<br />

robotics is today expanding beyond its success in manufacturing. Our field is entering a<br />

new era of human-centered development, attracting collaboration and interaction<br />

among diverse research communities from different disciplines.<br />

<strong>IROS</strong> <strong>2011</strong> is both a celebration and a renewal. Celebrating the achievements of the field,<br />

the <strong>2011</strong> edition focuses on human-centered robotics, bringing wide coverage of its subfields<br />

and diverse communities. <strong>IROS</strong> <strong>2011</strong> was designed to achieve the highest quality in<br />

technical content and to promote discussion and social interactions. A major feature of this<br />

conference is the interactive presentation model. Authors of selected contributions are<br />

given the opportunity to first outline their work during regular sessions and then to fully<br />

engage small groups of conference attendees, presenting and discussing their work using<br />

videos, slides, and simulators. With the interactive presentations, the number of parallel<br />

tracks is significantly reduced, thus promoting better interaction among the participants.<br />

The conference also features a significant number of special-topic symposia, each<br />

associated with an active research area of robotics introduced by an invited expert.<br />

In addition, the program includes robot demonstrations, thematic plenaries on design, biorobotics,<br />

and intelligent transportation, and four forums: (i) Robots, the Next Generation; (ii)<br />

Medical Robotics; (iii) Robots, the New Platforms; and (iv) Robotics: Beyond the Horizon.<br />

The program also includes a Town Hall Meeting on Robotics Conferences, 27 workshops<br />

and tutorials, technical tours, and a large exhibition showing some of the latest<br />

developments in the field.<br />

<strong>IROS</strong> <strong>2011</strong> received 2459 paper submissions. The reviews were carried out by the<br />

Conference Editorial Board (CEB), which involved 9 editors, 258 associate editors, and<br />

over 3500 reviewers coordinated by the Editor-in-Chief. At its meeting, the Senior Program<br />

Committee discussed the submissions and their corresponding reviews and CEB<br />

recommendations, and finally selected 790 papers for presentation at the conference,<br />

organized in 130 sessions in ten parallel tracks.<br />
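The figures above also determine a few derived statistics. As a small sketch, using only the numbers quoted in this foreword (2459 submissions, 790 accepted papers, 130 sessions) and nothing else:

```python
# Derived statistics from the review figures quoted in the foreword.
# Inputs are the numbers stated above; nothing beyond them is assumed.
submissions = 2459
accepted = 790
sessions = 130

acceptance_rate = accepted / submissions   # fraction of submissions accepted
papers_per_session = accepted / sessions   # average papers per session

print(f"acceptance rate: {acceptance_rate:.1%}")        # about 32.1%
print(f"papers per session: {papers_per_session:.1f}")  # about 6.1
```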

A vibrant social program includes four events: a Cruise on the San Francisco Bay, an<br />

Evening at the de Young Museum, a Welcome Reception, and a Farewell Reception. This program is<br />

designed to provide an atmosphere for fruitful discussions and meaningful interactions<br />

among the participants.<br />

We would like to take this opportunity to express our thanks and appreciation to all the<br />

members of the Organizing Committee and the Conference Editorial Board for the effort<br />

they devoted to the program and organization of this conference. Our special thanks go to<br />

the volunteers and staff for the long hours and hard work they have generously given to<br />

the conference. Above all, <strong>IROS</strong> <strong>2011</strong> is the result of its technical contributions, and we<br />

would like to thank all the authors, speakers, and participants for taking part in and<br />

contributing to this conference. We warmly welcome you to San Francisco!<br />

Oussama Khatib<br />
General Chair, <strong>IROS</strong> <strong>2011</strong><br />

Gaurav Sukhatme<br />
Program Chair, <strong>IROS</strong> <strong>2011</strong>



Table of Contents<br />

<strong>IROS</strong> Organization .…...………………………………………………………………..… ii<br />

<strong>IROS</strong> <strong>2011</strong> Organizing Committee .…………………………………………………….. iii<br />

<strong>IROS</strong> <strong>2011</strong> Conference Editorial Board .……………………………………………….. v<br />

<strong>IROS</strong> <strong>2011</strong> Reviewers .…………………………………………………………………… viii<br />

Plenary Sessions .………………………………………………………………………… xxii<br />

Workshops and Tutorials .………………………………………………………..……… xxv<br />

Demonstration Sessions .………………………………………………………………... xxviii<br />

Exhibition .………………………………………………………………………………..... xxxiv<br />

<strong>IROS</strong> <strong>2011</strong> Corporate Sponsors ..………………………………………………………. xxxvi<br />

<strong>IROS</strong> <strong>2011</strong> <strong>Awards</strong> ……………....………………………………………………………. xxxvii<br />

Social Events .…………………………………………………………………………….. xxxviii<br />

Map of Conference Venue .……………………………………………………………… xl<br />

Conference Overview .…………………………………………………………………… xli<br />

Technical Program at a Glance<br />

Monday, September 26 ..…………………………………………………………... xliii<br />

Tuesday, September 27 ..………………………………………………...………... xliv<br />

Wednesday, September 28 .……………………………………………………….. xlv<br />

Thursday, September 29 .....…………………………………………………..….... xlvi<br />

Technical Program Digest<br />

Monday, September 26 ..……………………………………...…………………… 1<br />

Tuesday, September 27.…..………………………………………………...……… 39<br />

Wednesday, September 28 ..….….………………………………………………... 93<br />

Thursday, September 29 …..………………………………………………..……… 167<br />

Author Index ..……………………………………………………………………………... 231<br />

Keyword Index ..…………………………………………………………………………… 261<br />

Corporate Sponsors Ads<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–i–


<strong>IROS</strong> Organization<br />

<strong>IROS</strong> Advisory/Steering Committee<br />

Advisory Council Founding Honorary Chair<br />

Fumio Harashima, Tokyo Metropolitan University, Japan<br />

Past Advisory Council Honorary Chair<br />

Tzyh-Jong Tarn, Washington University, USA<br />

Advisory Council Honorary Chair<br />

C.S. George Lee, Purdue University, USA<br />

Advisory Council Chair<br />

Toshio Fukuda, Nagoya University, Japan<br />

Advisory Council Vice Chair<br />

Shin'ichi Yuta, University of Tsukuba, Japan<br />


Co-sponsoring Societies<br />

IEEE Robotics and Automation Society (RAS)<br />

www.ieee-ras.org<br />

IEEE Industrial Electronics Society (IES)<br />

www.ieee-ies.org<br />

Robotics Society of Japan (RSJ)<br />

www.rsj.or.jp<br />

Society of Instrument and Control Engineers (SICE)<br />

www.sice.or.jp<br />

New Technology Foundation (NTF)<br />

www.ntf.or.jp<br />

Technical Co-sponsoring Society<br />

Institute of Control, Robotics and Systems (ICROS)<br />

www.icros.org


<strong>IROS</strong> <strong>2011</strong> Organizing Committee<br />

General Chair<br />

Oussama Khatib, Stanford University, USA<br />

Program Chair<br />

Gaurav Sukhatme, University of Southern California, USA<br />

Program Co-Chair<br />

Roland Siegwart, Swiss Federal Institute of Technology Zurich, Switzerland<br />

CEB Chair<br />

Nancy M. Amato, Texas A&M University, USA<br />

50 Years of Robotics Symposia<br />

Vijay Kumar (Chair), University of Pennsylvania, USA<br />

Henrik Christensen, Georgia Institute of Technology, USA<br />

Oliver Brock, Technische Universität Berlin, Germany<br />

Interactive Presentations<br />

Daniela Rus (Chair), Massachusetts Institute of Technology, USA<br />

Peter Corke, Queensland University of Technology, Australia<br />

<strong>Awards</strong><br />

Raja Chatila (Chair), Centre National de la Recherche Scientifique, France<br />

Shigeki Sugano, Waseda University, Japan<br />

Allison Okamura, The Johns Hopkins University, USA<br />

Workshops & Tutorials<br />

C.S. George Lee (Chair), Purdue University, USA<br />

Christian Laugier, INRIA Rhône-Alpes, France<br />

Shin’ichi Yuta, University of Tsukuba, Japan<br />

Local Arrangements<br />

Torsten Kroeger (Chair), Stanford University, USA<br />

Francois Conti, Stanford University, USA<br />



Finance Chair<br />

Xiaoping Yun, Naval Postgraduate School, USA<br />

Publications Chair<br />

Alessandro De Luca, Università di Roma "La Sapienza", Italy<br />

Exhibits<br />

Kurt Konolige (Chair), Willow Garage, USA<br />

Luis Sentis, University of Texas at Austin, USA<br />

Kai Oliver Arras, Albert-Ludwigs-Universität Freiburg, Germany<br />

Publicity<br />

Aude Billard (Chair), Swiss Federal Institute of Technology Lausanne, Switzerland<br />

Ken Goldberg, University of California Berkeley, USA<br />

Yoshi Nakamura, University of Tokyo, Japan<br />

Student Travel <strong>Awards</strong><br />

Dezhen Song (Chair), Texas A&M University, USA<br />

M. Ani Hsieh, Drexel University, USA<br />

Dylan Shell, Texas A&M University, USA<br />

Technical Tours<br />

Erin Rapacki, Adept Technology, Inc., USA<br />

Senior Program Committee<br />

Marcelo Ang, National University of Singapore, Singapore<br />

Wolfram Burgard, Albert-Ludwigs-Universität Freiburg, Germany<br />

Mark Cutkosky, Stanford University, USA<br />

Rüdiger Dillmann, Karlsruhe Institute of Technology, Germany<br />

Ken Goldberg, University of California Berkeley, USA<br />

Makoto Kaneko, Osaka University, Japan<br />

Lydia Kavraki, Rice University, USA<br />

Dong-soo Kwon, Korea Advanced Institute of Science and Technology, Korea<br />

Kevin Lynch, Northwestern University, USA<br />

Eduardo Nebot, University of Sydney, Australia<br />

Brad Nelson, Swiss Federal Institute of Technology Zurich, Switzerland<br />

Bruno Siciliano, Università degli Studi di Napoli Federico II, Italy<br />

Satoshi Tadokoro, Tohoku University, Japan<br />

Seth Hutchinson, University of Illinois at Urbana-Champaign, USA<br />

Stefan Schaal, University of Southern California, USA<br />



<strong>IROS</strong> <strong>2011</strong> Conference Editorial Board<br />

Editor-in-Chief<br />

Nancy M. Amato, Texas A&M University, USA<br />

Editors<br />

I-Ming Chen, Nanyang Technological University, Singapore<br />

Alessandro De Luca, Università di Roma "La Sapienza", Italy<br />

Chad Jenkins, Brown University, USA<br />

Danica Kragic, Royal Institute of Technology, Sweden<br />

Nikos Papanikolopoulos, University of Minnesota, USA<br />

Frank Park, Seoul National University, Korea<br />

Lynne Parker, University of Tennessee, USA<br />

Shigeki Sugano, Waseda University, Japan<br />

Frank van der Stappen, Utrecht University, Netherlands<br />

Associate Editors<br />

Pieter Abbeel, UC Berkeley, USA<br />

Farhad Aghili, Canadian Space Agency, Canada<br />

Aaron Ames, Texas A&M Univ., USA<br />

Yacine Amirat, Univ. Paris Est Créteil, France<br />

Monica Anderson, Univ. of Alabama, USA<br />

Juan Andrade-Cetto, CSIC-UPC, Spain<br />

Wei Tech Ang, Nanyang Tech. Univ., Singapore<br />

Adnan Ansar, JPL, USA<br />

Gianluca Antonelli, Univ. Cassino, Italy<br />

Hirohiko Arai, AIST, Japan<br />

Helder Araujo, Univ. Coimbra, Portugal<br />

Kai Oliver Arras, Univ. Freiburg, Germany<br />

Panagiotis Artemiadis, MIT, USA<br />

Tamim Asfour, KIT, Germany<br />

Olivier Aycard, Joseph Fourier Univ., France<br />

Jan Babic, Inst. Jozef Stefan, Slovenia<br />

Shaoping Bai, Aalborg Univ., Denmark<br />

Devin Balkcom, Dartmouth College, USA<br />

Patrick Beeson, TracLabs Inc, USA<br />

Michael Beetz, TU Munich, Germany<br />

Sven Behnke, Univ. Bonn, Germany<br />

Kostas Bekris, Univ. of Nevada-Reno, USA<br />

Maren Bennewitz, Univ. Freiburg, Germany<br />

Ryad Benosman, UPMC/ISIR, France<br />

Dmitry Berenson, CMU, USA<br />

Sarah Bergbreiter, Univ. of Maryland, USA<br />

Philippe Bidaud, UPMC, Paris 6, France<br />

Nathaniel Bird, Ohio Northern University, USA<br />

Christoph Borst, DLR, Germany<br />

Alan Bowling, Univ. of Texas Arlington, USA<br />

Tim Bretl, Univ. of Illinois Urbana, USA<br />

Michael Bruenig, CSIRO, Australia<br />

Etienne Burdet, Imperial College London, UK<br />


Wolfram Burgard, Univ. Freiburg, Germany<br />

Darius Burschka, TU Munich, Germany<br />

David Cappelleri, Stevens Inst. of Tech., USA<br />

Giuseppe Carbone, Univ. Cassino, Italy<br />

Stefano Carpin, Univ. of California Merced, USA<br />

Alicia Casals, UPC, Spain<br />

Jose Castellanos, Univ. Zaragoza, Spain<br />

Francois Chaumette, INRIA Rennes, France<br />

XiaoQi Chen, Univ. Canterbury, New Zealand<br />

Sonia Chernova, Worcester Polytech. Inst., USA<br />

Chee Meng Chew, National Univ., Singapore<br />

Kiyo Chinzei, AIST, Japan<br />

Sachin Chitta, Willow Garage Inc., USA<br />

Kyu-Jin Cho, Seoul National Univ., Korea<br />

Youngjin Choi, Hanyang Univ., Korea<br />

Howie Choset, CMU, USA<br />

Changmook Chun, KIST, Korea<br />

Timothy Chung, Naval Postgraduate School, USA<br />

Woojin Chung, Korea Univ., Korea<br />

Chris Clark, California Polytech. State Univ., USA<br />

Nikolaus Correll, Univ. of Colorado Boulder, USA<br />

Juan Cortes, LAAS, France<br />

Jacob Crandall, MIT/Masdar, USA<br />

Elizabeth Croft, Univ. of British Columbia, Canada<br />

Jian Dai, King's College London, UK<br />

Aveek Das, Sarnoff Corporation, USA<br />

Angel del Pobil, Univ. Jaume I, Spain<br />

Jorge Dias, Univ. Coimbra, Portugal<br />

Ryan Eustice, Univ. of Michigan, USA<br />

Manuel Ferre, UP Madrid, Spain<br />

Antoine Ferreira, Univ. Orleans, France<br />

Nicola Ferrier, Univ. of Wisconsin Madison, USA<br />

Paolo Fiorini, Univ. Verona, Italy


Robert Fitch, Univ. Sydney, Australia<br />

Thierry Fraichard, INRIA Rhone-Alpes, France<br />

Emilio Frazzoli, MIT, USA<br />

Eric Frew, Univ. of Colorado, USA<br />

Antonio Frisoli, Scuola Superiore Sant'Anna, Italy<br />

Nicholas Gans, Univ. of Texas Dallas, USA<br />

Roland Geraerts, Utrecht Univ., Netherlands<br />

Christopher Geyer, iRobot Corp., USA<br />

Bill Goodwine, Notre Dame Univ., USA<br />

Ambarish Goswami, Honda Research Inst., USA<br />

Giorgio Grisetti, Univ. Freiburg, Germany<br />

Josechu Guerrero, Univ. Zaragoza, Spain<br />

Eugenio Guglielmelli, Campus Bio-Medico, Italy<br />

Yi Guo, Stevens Inst. of Technology, USA<br />

Kamal Gupta, Simon Fraser Univ., Canada<br />

Li Han, Clark Univ., USA<br />

Kensuke Harada, AIST, Japan<br />

Kris Hauser, Indiana Univ., USA<br />

Shinichi Hirai, Ritsumeikan Univ., Japan<br />

Yasuhisa Hirata, Tohoku Univ., Japan<br />

Koh Hosoda, Osaka Univ., Japan<br />

Franz Hover, MIT, USA<br />

Stefan Hrabar, CSIRO, Australia<br />

M. Ani Hsieh, Drexel Univ., USA<br />

Wes Huang, iRobot Corp., USA<br />

Javier Ibanez-Guzman, Renault, France<br />

Tetsunari Inamura, Nat’l Inst. of Informatics, Japan<br />

Kazuo Ishii, Kyushu Inst. of Technology, Japan<br />

Volkan Isler, Univ. of Minnesota, USA<br />

Hiroyasu Iwata, Waseda Univ., Japan<br />

Michael Jakuba, Univ. Sydney, Australia<br />

Patric Jensfelt, Royal Inst. of Technology, Sweden<br />

Rolf Johansson, Lund Univ., Sweden<br />

Chris Jones, iRobot Corp., USA<br />

Ming-Shaun Ju, Nat’l Cheng Kung Univ., Taiwan<br />

Simon Julier, University College London, UK<br />

Takayuki Kanda, ATR, Japan<br />

Balajee Kannan, Carnegie Mellon Univ., USA<br />

Peter Kazanzides, Johns Hopkins Univ., USA<br />

Abderrahmane Kheddar, CNRS, France/Japan<br />

Ryo Kikuuwe, Kyushu Univ., Japan<br />

Xianwen Kong, Heriot-Watt Univ., UK<br />

Jana Kosecka, George Mason Univ., USA<br />

Hadas Kress-Gazit, Northwestern Univ., USA<br />

Venkat Krovi, Univ. Buffalo (SUNY), USA<br />

Naoyuki Kubota, Tokyo Metropolitan Univ., Japan<br />

Takashi Kubota, JAXA, Japan<br />

James Kuffner, CMU, USA<br />

Dana Kulic, Univ. Waterloo, Canada<br />

Daisuke Kurabayashi, Tokyo Inst. of Tech., Japan<br />

Hanna Kurniawati, SMART, Singapore<br />

Kostas Kyriakopoulos, NTU Athens, Greece<br />

Ville Kyrki, Lappeenranta Univ. of Tech., Finland<br />

Dongjun Lee, Univ. of Tennessee, USA<br />

Ji Yeong Lee, Hanyang Univ., Korea<br />

Weihua Li, Univ. of Wollongong, Australia<br />

Xingyan Li, Univ. Auckland, New Zealand<br />


Yangmin Li, Univ. of Macau, Macau<br />

Jyh-Ming Lien, George Mason Univ., USA<br />

Maxim Likhachev, CMU, USA<br />

Achim Lilienthal, Orebro Univ., Sweden<br />

Pedro Lima, Lisbon Technical Univ., Portugal<br />

Dikai Liu, Univ. of Technology Sydney, Australia<br />

Honghai Liu, Univ. Portsmouth, UK<br />

Peter X. Liu, Carleton Univ., Canada<br />

Xin-Jun Liu, Tsinghua Univ., China<br />

Savvas Loizou, Cyprus Univ. of Tech., Cyprus<br />

Meghann Lomas, Lockheed Martin, USA<br />

Manuel Lopes, INRIA, France<br />

Gonzalo Lopez-Nicolas, Univ. Zaragoza, Spain<br />

Yunjiang Lou, Harbin Inst. of Tech., China<br />

Ou Ma, New Mexico State Univ., USA<br />

Shugen Ma, Ritsumeikan Univ., Japan<br />

Yusuke Maeda, Yokohama National Univ., Japan<br />

Mohammad Mahoor, Denver Univ., USA<br />

Eric Marchand, INRIA Rennes, France<br />

Gian Luca Mariottini, Univ. Texas Arlington, USA<br />

Agostino Martinelli, INRIA Rhone-Alpes, France<br />

Philippe Martinet, Blaise Pascal Univ., France<br />

Alcherio Martinoli, EPFL, Switzerland<br />

Favio Masson, Univ. Nacional del Sur, Argentina<br />

Fumitoshi Matsuno, Kyoto Univ., Japan<br />

Claudio Melchiorri, Univ. Bologna, Italy<br />

Arianna Menciassi, Scuola Sup. Sant'Anna, Italy<br />

Giorgio Metta, IIT, Italy<br />

Javier Minguez, Univ. Zaragoza, Spain<br />

Sarthak Misra, Univ. Twente, Netherlands<br />

Mark Moll, Rice Univ., USA<br />

Katja Mombaur, Univ. Heidelberg, Germany<br />

Hyungpil Moon, Sungkyunkwan Univ., Korea<br />

Antonio Morales, Univ. Jaume I, Spain<br />

Marco Morales, Inst. Tecn. Autónomo, Mexico<br />

Taketoshi Mori, Univ. Tokyo, Japan<br />

Jun Morimoto, ATR, Japan<br />

El Mustapha Mouaddib, Univ. of Picardie, France<br />

Andreas Mueller, Univ. Duisburg-Essen, Germany<br />

Hiroki Murakami, Ishikawajima-Harima Ind., Japan<br />

Rafael Murrieta-Cid, CIMAT, Mexico<br />

Fawzi Nashashibi, INRIA, France<br />

Jose Neira, Univ. Zaragoza, Spain<br />

Jason O'Kane, Univ. of South Carolina, USA<br />

Tetsuya Ogata, Kyoto Univ., Japan<br />

Edwin Olson, Univ. of Michigan, USA<br />

Giuseppe Oriolo, Univ. Roma “La Sapienza”, Italy<br />

Koichi Osuka, Osaka Univ., Japan<br />

Erika Ottaviano, Univ. Cassino, Italy<br />

Erhan Oztop, ATR/NICT, Japan<br />

Taskin Padir, WPI, USA<br />

Evangelos Papadopoulos, NTU Athens, Greece<br />

Jong Hyeon Park, Hanyang Univ., Korea<br />

Ioannis Pavlidis, Univ. Houston, USA<br />

Janice Pearce, Berea College, USA<br />

Angelika Peer, TU München, Germany<br />

Jan Peters, Max Planck Inst., Germany


Thierry Peynot, Univ. Sydney, Australia<br />

Erion Plaku, Catholic Univ. of America, USA<br />

Erwin Prassler, Hoch. Bonn-Rhein-Sieg, Germany<br />

Joerg Raczkowsky, Univ. Karlsruhe, Germany<br />

Subramanian Ramamoorthy, Univ. Edinburgh, UK<br />

Yiannis Rekleitis, McGill Univ., Canada<br />

Patrick Rives, INRIA, France<br />

Cameron Riviere, CMU, USA<br />

Paolo Robuffo Giordano, Max Planck Inst., Germany<br />

Paolo Rocco, Politecnico di Milano, Italy<br />

Raul Rojas, Freie Universität Berlin, Germany<br />

Radu Bogdan Rusu, Willow Garage Inc., USA<br />

Paul Rybski, Carnegie Mellon Univ., USA<br />

Jee-Hwan Ryu, Korea Univ. Tech. & Edu., Korea<br />

Carlos Sagues, Univ. Zaragoza, Spain<br />

Srikanth Saripalli, Arizona State Univ., USA<br />

Paul Scerri, Carnegie Mellon Univ., USA<br />

Jim Schmiedeler, Univ. of Notre Dame, USA<br />

Wei-Min Shen, USC, USA<br />

Yantao Shen, Univ. of Nevada, Reno, USA<br />

Anton Shiriaev, Umea Univ., Sweden<br />

Thierry Simeon, LAAS-CNRS, France<br />

Reid Simmons, CMU, USA<br />

Metin Sitti, CMU, USA<br />

Ryan Smith, Queensland Univ. of Tech., Australia<br />

Jorge Solis, Waseda Univ., Japan<br />

Jae-Bok Song, Korea Univ., Korea<br />

Mohan Sridharan, Texas Tech, USA<br />

Siddhartha Srinivasa, Intel Labs, USA<br />

Cyrill Stachniss, Univ. Freiburg, Germany<br />

Mike Stilman, Georgia Tech, USA<br />

Kasper Stoy, Univ of Southern Denmark, Denmark<br />

Ethan Stump, ARL, USA<br />

Jianbo Su, Shanghai Jiao Tong Univ., China<br />

Attawith Sudsang, Chulalongkorn Univ., Thailand<br />

Il Hong Suh, Hanyang Univ., Korea<br />

Salah Sukkarieh, Univ. Sydney, Australia<br />

Dong Sun, City Univ. Hong Kong, China<br />

Yu Sun, Univ. Toronto, Canada<br />


Yukio Takeda, Tokyo Inst. of Technology, Japan<br />

Jindong Tan, MTU, USA<br />

Fang Tang, California State Polytech. Pomona, USA<br />

Bert Tanner, Univ. of Delaware, USA<br />

Adriana Tapus, ParisTech, France<br />

Juan Tardos, Univ. Zaragoza, Spain<br />

Ashley Tews, CSIRO, Australia<br />

Marc Toussaint, TU Berlin, Germany<br />

Jeff Trinkle, Rensselaer Polytechnic Inst., USA<br />

Nikolaos Tsagarakis, IIT, Italy<br />

Ales Ude, Jozef Stefan Inst., Slovenia<br />

Jun Ueda, Georgia Tech, USA<br />

Kazunori Umeda, Chuo Univ., Japan<br />

Pascal Vasseur, Univ. Rouen, France<br />

Manuela Veloso, CMU, USA<br />

Marilena Vendittelli, Univ. Roma “La Sapienza”, Italy<br />

Markus Vincze, Vienna Univ. of Tech., Austria<br />

Marsette Vona, Northeastern Univ., USA<br />

Richard Voyles, Denver Univ., USA<br />

Danwei Wang, Nanyang Techn. Univ., Singapore<br />

Fei-Yue Wang, Univ. of Arizona, USA<br />

Hesheng Wang, Shanghai Jiao Tong Univ., China<br />

Stefan Williams, Univ. Sydney, Australia<br />

Reg Wilson, JPL, USA<br />

Denis Wolf, Univ. Sao Paulo, Brazil<br />

Fengfeng Xi, Ryerson Univ., Canada<br />

Jing Xiao, UNCC, USA<br />

Yasushi Yagi, Osaka Univ., Japan<br />

Katsu Yamane, Disney Research, USA<br />

Guilin Yang, Inst. of Manuf. Tech., Singapore<br />

Anna Yershova, Duke Univ., USA<br />

Jingang Yi, Rutgers, USA<br />

Eiichi Yoshida, AIST, Japan<br />

Takashi Yoshimi, Shibaura Inst. of Tech., Japan<br />

Peijang Yuan, Beihang Univ., China<br />

Fumin Zhang, Georgia Tech, USA<br />

Hong Zhang, Univ. of Alberta, Canada<br />

Jiang-Yu Zheng, IUPUI, USA


<strong>IROS</strong> <strong>2011</strong> Reviewers<br />

Joel Abadie<br />

Gursel Alici<br />

Bilal Ahmed Arain<br />

Jose Baca<br />

Jake Abbott<br />

Hatem Alismail<br />

Nancy Arana-Daniel<br />

Abraham Bachrach<br />

Tariq Abdelrahman<br />

Ben Allan<br />

Joan Aranda<br />

Wael Bachta<br />

Mohamed Abderrahim<br />

Peter Allen<br />

Miguel Aranda<br />

Leonardo Badino<br />

Hamid Abdi<br />

River Allen<br />

Jumpei Arata<br />

Norm Badler<br />

Nichola Abdo<br />

Guillaume Allibert<br />

Helder Araujo<br />

Ji-Hun Bae<br />

Farzaneh Abdollahi<br />

James Allington<br />

Rui Araújo<br />

Joonbum Bae<br />

A. H. Abdul Hafez<br />

Benedetto Allotta<br />

Mario Arbulu<br />

Stanley Baek<br />

Alexandre Abellard<br />

Jose Almeida<br />

Laurent Arcese<br />

Mitra Bahadorian<br />

Leon Abelmann<br />

Luis Almeida<br />

James Archibald<br />

Alexander Bahr<br />

Pramod Abichandani<br />

Oscar Alonso<br />

Gustavo Arechavaleta<br />

He Bai<br />

Satoko Abiko<br />

Redwan Alqasemi<br />

Juan Carlos Arevalo<br />

Qadeer Baig<br />

Elias Abou Zeid<br />

Pablo Javier Alsina<br />

Adrian Arfire<br />

Tim Bailey<br />

Konstantinos Aboudolas<br />

Ron Alterovitz<br />

Brenna Argall<br />

John Baillieul<br />

Dino Accoto<br />

Salah Althloothi<br />

Sylvain Argentieri<br />

Andrea Bajo<br />

Jeffrey Ackerman<br />

Kaspar Althoefer<br />

Miguel Arias<br />

Max Bajracharya<br />

Carlos A. Acosta Calderon<br />

Daniel Althoff<br />

Hiroaki Arie<br />

Christopher R Baker<br />

Martin Adams<br />

Armando Alves Neto<br />

Keisuke Arikawa<br />

Benjamin Balaguer<br />

Antonio Adan<br />

Angelos Amanatiadis<br />

Masakazu Arima<br />

Carlos Balaguer<br />

Olaoluwa Adeniba<br />

Yoshiharu Amano<br />

Hichem Arioui<br />

Stephen Balakirsky<br />

Nagesh Adluru<br />

Josep Amat<br />

Christopher Armbrust<br />

Ravi Balasubramanian<br />

Lounis Adouane<br />

Nancy M. Amato<br />

Leopoldo Armesto<br />

Sivakumar Balasubramanian<br />

Erwin Aertbelien<br />

Cinzia Amici<br />

Jose Armingol<br />

Tucker Balch<br />

Zohaib Aftab<br />

Luis Ernesto Amigo<br />

Kai Oliver Arras<br />

Devin Balkcom<br />

Gabriel Agamennoni<br />

Francesco Amigoni<br />

Filippo Arrichiello<br />

Dana Ballard<br />

Priyanshu Agarwal<br />

Vahid Aminzadeh<br />

Marc Arsenault<br />

Hansjorg Baltes<br />

Ali-akbar Agha-mohammadi<br />

Farshid Amirabdollahian<br />

Marc Arsicault<br />

Haris Baltzakis<br />

Navid Aghasadeghi<br />

Mehdi Ammi<br />

Panagiotis Artemiadis<br />

Xiaojun Ban<br />

Mahdi Agheli<br />

Arvind Ananthanarayanan<br />

Jordi Artigas<br />

Jared Bancroft<br />

Farhad Aghili<br />

Margarita Anastassova<br />

Minoru Asada<br />

Antonio Bandera<br />

Noa Agmon<br />

Michael Skipper Andersen<br />

Ali Asadian<br />

Antonio Bandera Rubio<br />

Lucio Agostinho Rocha<br />

John Anderson<br />

Masoud Asadpour<br />

Juan Pedro Bandera Rubio<br />

Amer Agovic<br />

Monica Anderson<br />

Hajime Asama<br />

Tirthankar Bandyopadhyay<br />

Gabriel Aguirre-Ollinger<br />

Sterling Anderson<br />

Fumihiko Asano<br />

Mayank Bansal<br />

Carl Ahlberg<br />

Stuart Anderson<br />

Tamim Asfour<br />

Emilia I. Barakova<br />

Aamir Ahmad<br />

Sean Andersson<br />

Lars Asplund<br />

Adrien Baranes<br />

Mojtaba Ahmadi<br />

Franz Andert<br />

Seyed Farokh Atashzar<br />

Jose Barata<br />

Hossein Ahmadzadeh<br />

Takeshi Ando<br />

J. Alan Atherton<br />

Timothy Barfoot<br />

Nisar Ahmed<br />

Juan Andrade-Cetto<br />

Christopher Atkeson<br />

Stephen Barkby<br />

Sang Chul Ahn<br />

Henrik Andreasson<br />

Jason Atkin<br />

Adrien Barral<br />

SungHwan Ahn<br />

Nicolas Andreff<br />

Samuel Au<br />

Alejandra Barrera<br />

Omar Ait-Aider<br />

Alexander Andreopoulos<br />

Tsz-Chiu Au<br />

João P. Barreto<br />

Yasumichi Aiyama<br />

Claude Andriot<br />

Fernando Auat Cheein<br />

David Barrett<br />

Batu Akan<br />

Wei Tech Ang<br />

Pierre Avanzini<br />

Antonio Barrientos<br />

Adel Akbarimajd<br />

Jorge Angeles<br />

J. Gabriel Avina-Cervantes<br />

Jordi Barrio Gragera<br />

Abdullah Akce<br />

Adrien Angeli<br />

Carlo Alberto Avizzano<br />

Luis J. Barrios<br />

Erhan Akdogan<br />

Andreas Angerer<br />

Kean C. Aw<br />

Adrien Bartoli<br />

Srinivas Akella<br />

David A. Anisi<br />

Ben Axelrod<br />

Luca Bascetta<br />

Baris Akgun<br />

Roberta Annicchiarico<br />

Mustafa Ayad<br />

Cagatay Basdogan<br />

Aadeel Akhtar<br />

David Miguel Antunes<br />

Victor Ayala-Ramirez<br />

Nicola Basilico<br />

H. Levent Akin<br />

Salvatore Maria Anzalone<br />

Nora Ayanian<br />

Maxim Batalin<br />

Yoshitake Akiyama<br />

Shinya Aoi<br />

Olivier Aycard<br />

Adam Bates<br />

Samer Al Moubayed<br />

Takeshi Aoki<br />

Alper Aydemir<br />

Jorge Batista<br />

Sven Albrecht<br />

Yannick Aoustin<br />

Pedram Azad<br />

Gabriel Baud-Bovy<br />

Alin Albu-Schäffer<br />

Tadayoshi Aoyama<br />

Christine Azevedo Coste<br />

Marcus Baum<br />

Javier Adolfo Alcazar<br />

Sumeet S. Aphale<br />

Asma Azim<br />

Eric Baumgartner<br />

Feras Alchama<br />

Ilias Apostolopoulos<br />

Jose Raul Azinheira<br />

Enrique Bauzano<br />

David Aldavert<br />

Rosario Aragues<br />

Mahdi Azizian<br />

Eduardo-J. Bayro-Corrochano<br />

Alen Alempijevic<br />

Fumihito Arai<br />

Jose M. Azorin<br />

Matthew Bays<br />

Guillem Alenyà<br />

Takeshi Arai<br />

Y. Babazadeh Bedoustani<br />

Cabbar V. Baysal<br />

Jacopo Aleotti<br />

Tatsuo Arai<br />

Jan Babic<br />

Ali Bazaei<br />



Stephane Bazeille<br />

Jean-Charles Bazin<br />

Chris Beall<br />

Ozkan Bebek<br />

Israel Becerra<br />

Aaron Becker<br />

Brian C. Becker<br />

Jan Becker<br />

Marcelo Becker<br />

Randall Beer<br />

Kristopher Beevers<br />

Jounghoon Beh<br />

Aman Behal<br />

Evan Behar<br />

Laxmidhar Behera<br />

Michael Behrens<br />

Roland Behrens<br />

Maximilian Beinhofer<br />

Kostas E. Bekris<br />

Anna Belardinelli<br />

Rachid Belaroussi<br />

Karim Belharet<br />

Martin Bello<br />

Nicola Bellotto<br />

Tony Belpaeme<br />

Faiz Ben Amar<br />

Heni Ben Amor<br />

Fathi Ben Ouezdou<br />

Pinhas Ben-Tzvi<br />

Mehdi Benallegue<br />

Asher Bender<br />

Rodrigo Benenson<br />

Ben Benfold<br />

Beno Benhabib<br />

Mohamed Bensalah<br />

Darrin Bentivegna<br />

Ahmed Benzerrouk<br />

Dmitry Berenson<br />

Massimo Bergamasco<br />

Luis Miguel Bergasa<br />

Sarah Bergbreiter<br />

Christos Bergeles<br />

Cyrille Berger<br />

Marcel Bergerman<br />

Peter Berkelman<br />

Spring Berman<br />

Alexandre Bernardino<br />

Karsten Berns<br />

Christian Bersch<br />

Giovanni Berselli<br />

Jean-Marc Berthommé<br />

Sylvain Bertrand<br />

Eva Besada-Portas<br />

Ross Beveridge<br />

David Bevly<br />

Felix Beyeler<br />

Jürgen Beyerer<br />

Nicola Bezzo<br />

Amit Bhatia<br />

Rajankumar Bhatt<br />

Sourabh Bhattacharya<br />

Subhrajit Bhattacharya<br />

Luigi Biagiotti<br />

Joshua J Bialkowski<br />

Antonio Bicchi<br />

Manuele Bicego<br />

Estela Bicho<br />

Philippe Bidaud<br />

Julien Bidot<br />

Alexander Bierbaum<br />

Geoffrey Biggs<br />

Hakan Bilen<br />

Aude Billard<br />

Ghassan Bin Hammam<br />

Jonathan Binney<br />

Oliver Birbach<br />

Stan Birchfield<br />

Nathaniel Bird<br />

Andreas Birk<br />

Rachael Bis<br />

Horst Bischof<br />

Bradley Bishop<br />

Hannes Bistry<br />

Joydeep Biswas<br />

Sebastian Bitzer<br />

Lars Blackmore<br />

François Blais<br />

Jose-Luis Blanco<br />

M. Dolores Blanco<br />

Sebastian Blumenthal<br />

Antônio Padilha Lanari Bó<br />

Michael Boardman<br />

Botond Bocsi<br />

Jeannette Bohg<br />

Aude Bolopion<br />

Vanderlei Bonato<br />

Adrian Bonchis<br />

Gary Bone<br />

Ilian Bonev<br />

Marcello Bonfe<br />

Silvere Bonnabel<br />

Devin Bonnie<br />

Philippe Bonnifait<br />

Wayne Book<br />

Shaunak D. Bopardikar<br />

Mirko Bordignon<br />

Johann Borenstein<br />

Paulo Vinicius K. Borges<br />

Gianni Borghesan<br />

Per Henrik Borgstrom<br />

Ali Borji<br />

Branislav Borovac<br />

Michael Bosse<br />

Silvia Botelho<br />

Debora Botturi<br />

Samir Bouabdallah<br />

Patrick Michael Bouffard<br />

Abdelbaki Bouguerra<br />

Mehdi Boukallel<br />

Abdeslam Boularias<br />

Mohamed Bouri<br />

Karim Bouyarmane<br />

Alan Bowling<br />

Isil Bozma<br />

David Bradley<br />

David Bradley<br />

Gary Bradski<br />

Julia Braman<br />


Alexandre Santos Brandao<br />

David Brandt<br />

Michael Branicky<br />

Kusy Brano<br />

David J. Braun<br />

Cynthia Breazeal<br />

Sebastian Brechtel<br />

Andreas Breitenmoser<br />

Michael Brenner<br />

Jose Breñosa<br />

Timothy Bretl<br />

Reuben Brewer<br />

Adrien Briod<br />

Sébastien Briot<br />

Oliver Brock<br />

Roland Brockers<br />

Yury Brodskiy<br />

Graham Brooker<br />

Christopher Brooks<br />

Jonathan Brookshire<br />

Xavier Broquere<br />

H. Ben Brown<br />

Travis Brown<br />

Brett Browning<br />

Frank Broz<br />

Drazen Brscic<br />

James Robert Bruce<br />

Davide Brugali<br />

Alberto Brunete<br />

Emma Brunskill<br />

Herman Bruyninckx<br />

Adam Bry<br />

Mitch Bryson<br />

Dirk Buchholz<br />

Jonas Buchli<br />

Jennifer Buehler<br />

Heiko Buelow<br />

Jose M. Buenaposada<br />

Olivier Buffet<br />

Magdalena Bugajska<br />

Chris Burbridge<br />

Wolfram Burgard<br />

Xavier Burgos-Artizzu<br />

Antoni Burguera<br />

Arne Burisch<br />

Jenny Burke<br />

Burkhard Corves<br />

Darius Burschka<br />

Thomas Buschmann<br />

Antoine Bussy<br />

Evan Butler<br />

Zack Butler<br />

Jonathan Butzke<br />

Georg Bätz<br />

Berthold Bäuml<br />

Fernando Caballero<br />

Flavio Cabrera-Mora<br />

Gonçalo Cabrita<br />

Fabrizio Caccavale<br />

Cesar Dario Cadena Lerma<br />

Binghuang Cai<br />

Hongyuan Cai<br />

Yueri Cai<br />

Andrea Caiti<br />

Maya Cakmak<br />

Andrea Calanca<br />

Darwin G. Caldwell<br />

Timothy Caldwell<br />

Daniele Caligiore<br />

Sylvain Calinon<br />

Daniele Calisi<br />

Massimo Callegari<br />

Andrew Calway<br />

Stephen Cameron<br />

Jason Campbell<br />

Mark Campbell<br />

Domenico Campolo<br />

Jordi Campos<br />

Mario Montenegro Campos<br />

Salvatore Candido<br />

Angelo Cangelosi<br />

Giorgio Cannata<br />

Jiangtao Cao<br />

Jesus Capitan<br />

David Cappelleri<br />

Barbara Caputo<br />

Alexander Carballo<br />

Giuseppe Carbone<br />

Philippe Cardou<br />

Ricardo Carelli<br />

Anthony Carfang<br />

Craig Carignan<br />

Christophe Cariou<br />

Nicolas Carlési<br />

Luca Carlone<br />

Raffaella Carloni<br />

Tom Carlson<br />

Stéphane Caro<br />

Guillaume Caron<br />

Stefano Carpin<br />

Giorgio Carpino<br />

Marc Carreras<br />

Juan A. Carretero<br />

Henry Carrillo<br />

J. Carlos Mendes Carvalho<br />

Maura Casadio<br />

Giuseppe Casalino<br />

Gianni Castelli<br />

Virginia Castelli<br />

Claudio Castellini<br />

Pedro Castillo<br />

Eduardo Castillo-Castaneda<br />

Modesto Castrillón<br />

Glauco A. de Paula Caurin<br />

Albert Causo<br />

Adriano Cavalcanti<br />

Filippo Cavallo<br />

Gary Cavanough<br />

M. Cenk Cavusoglu<br />

Miguel Cazorla<br />

Enric Celaya<br />

Zhiwei Cen<br />

Andrea Censi<br />

Pablo Cerrada<br />

Enric Cervera<br />

Damien Chablat<br />

Marco A. Chacin Torres<br />

Brahim Chaib-draa<br />

Nicolas Chaillet<br />

Luiz Chaimowicz<br />

Nilanjan Chakraborty<br />

Punarjay Chakravarty<br />

Suman Chakravorty<br />

Hugo Chale-Gongora<br />

Maxime Chalon<br />

Lyle Chamberlain<br />

Thierry Chaminade<br />

Ambrose Chan<br />

Chen-Yu Chan<br />

Mervin Chandrapal<br />

Doyoung Chang<br />

Jen-Yuan (James) Chang<br />

Lillian Chang<br />

Richard Chang<br />

Yung-Jung Chang<br />

Crystal Chao<br />

Roland Chapuis<br />

Jean-Remy Chardonnet<br />

Marcela Charfuelan<br />

Francois Charpillet<br />

Thierry Chateau<br />

Dimitris Chatzigeorgiou<br />

Pratik Chaudhari<br />

Francois Chaumette<br />

Ricardo O. Chavez Garcia<br />

C. C. Cheah<br />

Paul Checchin<br />

Denis Chekhlov<br />

Ryad Chellali<br />

Ahmed Chemori<br />

Baojun Chen<br />

Chao Chen<br />

Haoyao Chen<br />

Heping Chen<br />

Hongjun Chen<br />

Ian Yen-Hung Chen<br />

Jian Chen<br />

Qijun Chen<br />

Qing Chen<br />

Weidong Chen<br />

Weihai Chen<br />

Wenjie Chen<br />

Wenjie Chen<br />

Xin Chen<br />

Xin Chen<br />

Yongquan Chen<br />

Yuren Chen<br />

Zhaopeng Chen<br />

Zheng Chen<br />

Zhiyong Chen<br />

Gordon Cheng<br />

Harry Cheng<br />

Victor H.L. Cheng<br />

Yang Cheng<br />

Joono Cheong<br />

Véronique Cherfaoui<br />

Anoop Cherian<br />

Sonia Chernova<br />

Andrea Cherubini<br />

Graziano Chesi<br />

Mohamed Chetouani<br />

Christine Chevallereau<br />

Chee Meng Chew<br />

Héctor Chiacchiarini<br />

Mu Chiao<br />

Ryosuke Chiba<br />

Cheng Chin<br />

Eris Chinellato<br />

Francesco Chinello<br />

Kiyoyuki Chinzei<br />

Gregory Chirikjian<br />

Hamidreza Chitsaz<br />

Han-Pang Chiu<br />

Margarita Chli<br />

Baek-Kyu Cho<br />

Jang Ho Cho<br />

Kyu-Jin Cho<br />

Han-Lim Choi<br />

Hee-Byoung Choi<br />

Jeong-Sik Choi<br />

Young-ho Choi<br />

Youngjin Choi<br />

Nikhil Chopra<br />

Safwan Choudhury<br />

Anders Lyhne Christensen<br />

David Johan Christensen<br />

Henrik Iskov Christensen<br />

Anna-Karin Christiansson<br />

Eftychios Christoforou<br />

Kang-Ching Chu<br />

Daisuke Chugo<br />

Changmook Chun<br />

Timothy H. Chung<br />

Wan Kyun Chung<br />

Matteo Cianchetti<br />

Anna Lisa Ciancio<br />

Grzegorz Cielniak<br />

Matei Ciocarlie<br />

Christian Cipriani<br />

Xavier Clady<br />

Mark Claffee<br />

Christopher M. Clark<br />

James Clark<br />

Jonathan Clark<br />

José Claver<br />

Kevin Cleary<br />

Charles Clercq<br />

Cédric Clévy<br />

Salvador Cobos Guzman<br />

José M. Cogollor Delgado<br />

Benjamin Cogrel<br />

Benjamin Cohen<br />

Paul Cohen<br />

A. Paulo Coimbra<br />

Michael Coleman<br />

Edward Colgate<br />

Alvaro Collet<br />

Christophe Collewet<br />

Roberto Colombo<br />

Julian Colorado<br />

Brian Coltin<br />

Frederic Comby<br />

Olivier Company<br />

Andrew Ian Comport<br />

Michele Conconi<br />

Shuang Cong<br />

David Conner<br />

Christian Pascal Connette<br />

Jorg Conradt<br />

Daniela Constantinescu<br />

Francois Conti<br />

Marco Controzzi<br />

Atlas F. Cook IV<br />

Joseph Cooper<br />

Jeremy Cooperstock<br />

Lennon Cork<br />

Peter Corke<br />

Alex Cornejo<br />

Jordi Cornella<br />

Nikolaus Correll<br />

Nick Corson<br />

Rui Cortesao<br />

Anna H. R. Costa<br />

Joao Paulo Costeira<br />

Sebastien Cotton<br />

Estelle Courtial<br />

Noah J. Cowan<br />

Anthony Cowley<br />

Eric Coyle<br />

John Crassidis<br />

Alessandro Crespi<br />

Vincent Creuze<br />

Vincent Crocher<br />

André Crosnier<br />

Charles R. Crowell<br />

Xavier Cufi<br />

Jinshi Cui<br />

Lei Cui<br />

Yanzhe Cui<br />

Mark Joseph Cummins<br />

Alexander Cunningham<br />

Robert Cupec<br />

Mark Cutkosky<br />

Filippo D'Ippolito<br />

Boubaker Daachi<br />

Torbjorn Dahl<br />

Christian Dahmen<br />

Xiaochen Dai<br />

Matthew N. Dailey<br />

Konstantinos Dalamagkidis<br />

Sebastien Dalibard<br />

Fabio DallaLibera<br />

Skyler Dalley<br />

Mohsen Dalvand<br />

Bruno Damas<br />

Patrick Danès<br />

Hao Dang<br />

Kostas Daniilidis<br />

Todd Danko<br />

Neil Dantam<br />

Thanh-Son Dao<br />

Behzad Dariush<br />

Colin Das<br />

Jnaneshwar Das<br />

Raj Dasgupta<br />

Hisashi Date<br />

Chandan Datta<br />

Philip David<br />

Duane Davis<br />

James Davis<br />

Andrew J Davison<br />

Alireza Davoodi<br />

Shameka Dawson<br />

Feras Dayoub<br />

Anibal de Almeida<br />

Raoul de Charette<br />

Jesús M de la Cruz<br />

Arturo de la Escalera<br />

Tinne De Laet<br />

Kathryn De Laurentis<br />

Alessandro De Luca<br />

Giuseppe De Maria<br />

A. A. de Menezes Pereira<br />

Elena De Momi<br />

Agostino De Santis<br />

Michael Defoort<br />

Sarah Degallier<br />

Amir Degani<br />

Ehsan Dehghan<br />

Marc Peter Deisenroth<br />

Jaime del Cerro<br />

Frédéric Delaunay<br />

Anne Delettre<br />

Frank Dellaert<br />

Babette Dellen<br />

Vivien Delsart<br />

Eric Demeester<br />

Emel Demircan<br />

Yiannis Demiris<br />

Cédric Demonceaux<br />

Oscar Déniz<br />

Jory Denny<br />

Anna Derbakova<br />

Jason Derenick<br />

Konstantinos Dermitzakis<br />

Ashish Deshpande<br />

Guilherme DeSouza<br />

Renaud Detry<br />

Carrick Detweiler<br />

Michel Devy<br />

Debadeepta Dey<br />

Travis Deyle<br />

Ezequiel Di Mario<br />

Rosen Diankov<br />

Xiumin Diao<br />

Jorge Dias<br />

M. Bernardine Dias<br />

M. Freddie Dias<br />

Iñaki Díaz<br />

James Diaz<br />

Jakob Lund Dideriksen<br />

Alexander Dietrich<br />

Charles Dietrich<br />

Michael Dille<br />

Eric D. Diller<br />

Rüdiger Dillmann<br />

Edward Dillon<br />

Haris Dindo<br />

Feng Ding<br />

HuaFeng Ding<br />

Liang Ding<br />

Xilun Ding<br />

Xu Chu Ding<br />

Thang Dinh<br />

Albert Diosi<br />

Gamini Dissanayake<br />

Warren Dixon<br />

Vladimir Djapic<br />

Dalila Djoudi<br />

Martin Do<br />

Phuong Anh Do Hoang<br />

Andrew Dobson<br />

Zachary Dodds<br />

Mehmet Remzi Dogar<br />

Nakju Doh<br />

Miwato Doi<br />

Christophe Doignon<br />

John M. Dolan<br />

Dmitri Dolgov<br />

Mihai Emanuel Dolha<br />

Irina Dolinskaya<br />

Aaron Dollar<br />

Etienne Dombre<br />

Paolo Domenici<br />

Peter Ford Dominey<br />

Haiwei Dong<br />

Hao Dong<br />

Jun Feng Dong<br />

Lixin Dong<br />

Shuonan Dong<br />

Wenjie Dong<br />

Strahinja Dosen<br />

Finale Doshi<br />

Bertrand Douillard<br />

Zoe Doulgeri<br />

Anca Dragan<br />

Andrew Drenner<br />

Paulo Drews Jr<br />

Evan Drumwright<br />

Sebastien Druon<br />

Haiping Du<br />

Noel E. Du Toit<br />

Steven Dubowsky<br />

Tom Duckett<br />

Gregory Dudek<br />

Elliot Duff<br />

Matthew David Dunbabin<br />

Claire Dune<br />

Matthew Dunnigan<br />

Tuan Duong<br />

Edmond DuPont<br />

Pierre Dupont<br />

Francesco Durante<br />

Christian Duriez<br />

Jason Durrie<br />

Alexander Duschau-Wicke<br />

Ethan Eade<br />

Matthew Earl<br />

Aaron Edsinger<br />

Magnus Egerstedt<br />

Ruwan Egoda Gamage<br />

Amy Eguchi<br />

Donald Eickstedt<br />

Robert Eidenberger<br />

Carl Henrik Ek<br />

Chinwe Ekenna<br />

Oussama El Hamzaoui<br />

Sahar El Khoury<br />

Ahmed K. El-Shenawy<br />

Imad Elhajj<br />

Haytham Elhawary<br />

Robert Ellenberg<br />

Jan Elseberg<br />

Jack Elston<br />

Emilie Phillips<br />

Gen Endo<br />

Takahiro Endo<br />

Felix Endres<br />

Erik Daniel Engeberg<br />

Brendan Englot<br />

Clemens Eppner<br />

Alina M. Eqtami<br />

Kemalettin Erbatur<br />

Tom Erez<br />

Lawrence H Erickson<br />

Stefan Ericson<br />

Gorkem Erinc<br />

Aydan Erkmen<br />

Ismet Erkmen<br />

Duygun Erol Barkana<br />

Adrien Escande<br />

Bernard Espiau<br />

Joel Esposito<br />

Claudia Esteves<br />

Carlos Estrada<br />

Simos Evangelou<br />

William C. Evans<br />

Jani Even<br />

Jacob Everist<br />

Friederike Eyssel<br />

Stephan Fabel<br />

Tibor Fabian<br />

Andrew Fagg<br />

Farbod Fahimi<br />

Nathaniel Fairfield<br />

Pietro Falco<br />

Riccardo Falconi<br />

Maurice Fallon<br />

Egidio Falotico<br />

Yongchun Fang<br />

Pietro Fanghella<br />

Isabelle Fantoni<br />

Gregory Faraut<br />

Alessandro Farinelli<br />

Ildar Farkhatdinov<br />

Shane Farritor<br />

Imraan Faruque<br />

Juan Fasola<br />

John Fasoulas<br />

Sergej Fatikow<br />

Abbas Fattah<br />

Ehsan Fazl-Ersi<br />

Pooyan Fazli<br />

Ronald Fearing<br />

Duc Fehr<br />

David Feil-Seifer<br />

Javier Felip<br />

Martin Felis<br />

Mirko Felisa<br />

Vicente Feliu<br />

Aaron Fenster<br />

Cornelia Fermuller<br />

Leandro Carlos Fernandes<br />

Eduardo Fernandez<br />

P. Fernández Alcantarilla<br />

M. Fernandez-Carmona<br />

Jesus Fernandez-Lozano<br />

J.-A. Fernandez-Madrigal<br />

C. Fernandez-Maloigne<br />

Vincenzo Ferrari<br />

Manuel Ferre<br />

Alexander Ferrein<br />

Antoine Ferreira<br />

João Filipe Ferreira<br />

Manuel Ferreira<br />

Gonzalo Ferrer<br />

Gianni Ferretti<br />

Giancarlo Ferrigno<br />

António Ferrolho<br />

Mark Fiala<br />

Gabor Fichtinger<br />

MaryAnne Fields<br />

Rafael Fierro<br />

Giorgio Figliolini<br />

Ioannis Filippidis<br />

David Filliat<br />

Holger Finger<br />

Allan Daniel Finistauri<br />

Jonathan Fink<br />

Bernd Finkemeyer<br />

Paolo Fiorini<br />

Gregory Fischer<br />

Jan Fischer<br />

Wolfgang Fischer<br />

Robert Charles Fitch<br />

Kevin Fite<br />

Fabrizio Flacco<br />

David Flavigné<br />

Daniel Montrallo Flickinger<br />

Dario Floreano<br />

Mihaela Cecilia Florescu<br />

Michele Focchi<br />

David Fofi<br />

Torea Foissotte<br />

Amalia Foka<br />

David Folio<br />

John Folkesson<br />

Romeo Fomena Tatsambon<br />

Josep Maria Font-Llagunes<br />

Marco Fontana<br />

Daniele Fontanelli<br />

James Richard Forbes<br />

Amir Ali Forough Nassiraei<br />

Per-Erik Forssen<br />

Clément Fouque<br />

Michael Fox<br />

Jan-Michael Frahm<br />

Juan-Carlos Fraile<br />

Philippe Fraisse<br />

Antonio Franchi<br />

Barbara Frank<br />

Heinz Frank<br />

Jörg Franke<br />

Michel Franken<br />

Friedrich Fraundorfer<br />

Emilio Frazzoli<br />

Luigi Freda<br />

Regina Frei<br />

Leonid Freidovich<br />

Vincent Fremont<br />

Udo Frese<br />

Eric W. Frew<br />

Georges Fried<br />

Werner Friedl<br />

Martin Friedmann<br />

Simone Frintrop<br />

Mario Fritz<br />

Anselmo Frizera Neto<br />

Emanuele Frontoni<br />

Dominic R. Frutiger<br />

Li-Chen Fu<br />

Michael J. Fu<br />

Karl Cheng-Heng Fua<br />

Ohmi Fuchiwaki<br />

Masakatsu G. Fujie<br />

Hideo Fujimoto<br />

Kikuo Fujimura<br />

Toshihiro Fujita<br />

Takanori Fukao<br />

Toshio Fukuda<br />

Kotaro Fukui<br />

Rui Fukui<br />

Yasuhiro Fukuoka<br />

Edwardo F. Fukushima<br />

Yosuke Fukushima<br />

Matteo Fumagalli<br />

Ryu Funase<br />

Tetsuro Funato<br />

Juan G. Victores<br />

Péter Galambos<br />

Ignacio Galiana<br />

Cipriano Galindo<br />

Juan Alvaro Gallego<br />

Kevin Galloway<br />

Stefano Galvan<br />

Dorian Galvez Lopez<br />

Javier Gamez Garcia<br />

Andrej Gams<br />

Dongming Gan<br />

Seng Keat Gan<br />

Jacques Gangloff<br />

Bingtuan Gao<br />

Haibo Gao<br />

Elena Garcia<br />

Fernando Garcia<br />

Pablo Garcia<br />

Rafael Garcia<br />

Sebastian García<br />

Alfonso García-Cerezo<br />

Simon Garnier<br />

Carlos Garre<br />

Anais Garrell<br />

Jose Gaspar<br />

Alessandro Gasparetto<br />

Andrea Gasparri<br />

Roger Gassert<br />

Philippe Gaussier<br />

Michael Gauthier<br />

Russell Gayle<br />

Yunjian Ge<br />

Anne-Lise Gehin<br />

Christopher Geib<br />

Andreas Geiger<br />

Peter Gemeiner<br />

Sebastien Gemme<br />

Michael Gennert<br />

Brian Gerkey<br />

Gregory J. Gerling<br />

Sven Gestegård Robertz<br />

Hartmut Geyer<br />

Robert Ghrist<br />

Michael Gienger<br />

Philippe Giguere<br />

Arturo Gil<br />

Stephanie Gil<br />

Jeremy Gillula<br />

Antonio Gimenez<br />

Luca Giona<br />

Xavier Giralt<br />

Nivedhitha Giri<br />

Dylan F. Glas<br />

Sebastien Glaser<br />

Guy Godin<br />

Roy Godzdanker<br />

Robert Goeddel<br />

Christian Goerick<br />

Martin Goerner<br />

Grigore Gogu<br />

Orcun Goksel<br />

Ken Goldberg<br />

Michael Goldfarb<br />

Fernando Gomez<br />

A. Gómez de Silva Garza<br />

Nelson Goncalves<br />

Paulo Gonçalves<br />

Tiago Gonçalves<br />

Javier Gonzalez<br />

Juan Pablo Gonzalez<br />

Víctor González<br />

Yolanda Gonzalez<br />

Luis E. González-Jiménez<br />

Bill Goodwine<br />

Faramarz Gordaninejad<br />

Francisco Gordillo<br />

Clement Gosselin<br />

Florian Gosselin<br />

Frederick P. Gosselin<br />

Ambarish Goswami<br />

Dip Goswami<br />

Michèle Gouiffès<br />

Stephen Gould<br />

François Goulette<br />

Marc Gouttefarde<br />

Sven Gowal<br />

Nuno Gracias<br />

Devin Grady<br />

Birgit Graf<br />

Christophe Grand<br />

Grzegorz Granosik<br />

Oscar G. Grasa<br />

Valdir Grassi Junior<br />

Antoni Grau<br />

Jan Tommy Gravdahl<br />

David Gravel<br />

Andrew Gray<br />

Steven Gray<br />

Markus Grebenstein<br />

Colin Green<br />

Keith Evan Green<br />

Robert D. Gregg<br />

Nicola Greggio<br />

Lorenzo Grespan<br />

Matthew Greytak<br />

Elena Gribovskaya<br />

Matthew Koichi Grimes<br />

Garrett Grindle<br />

Giorgio Grioli<br />

Ben Grocholsky<br />

Martin Groeger<br />

Daniel Grollman<br />

William Grosky<br />

Roderich Gross<br />

Thilo Grundmann<br />

Rod Grupen<br />

Slawomir Grzonka<br />

Jason Gu<br />

Yisheng Guan<br />

Corrado Guarino Lo Bianco<br />

Wail Gueaieb<br />

Eugenio Guglielmelli<br />

Mohamed Guiatni<br />

Jose Guivant<br />

James Gunderson<br />

Keith Gunura<br />

Chunzhao Guo<br />

Shuxiang Guo<br />

Yan Guo<br />

Yi Guo<br />

Rakesh Gupta<br />

Sabri Gurbuz<br />

Hakan Gurocak<br />

Jens-Steffen Gutmann<br />

Stephen J. Guy<br />

Felix Hackbarth<br />

Yassine Haddab<br />

Amir Haddadi<br />

Sami Haddadin<br />

Hicham Hadj-Abdelkader<br />

Tetsuji Haga<br />

Nicola Hagemeister<br />

Gregory Hager<br />

Norihiro Hagita<br />

Masaya Hagiwara<br />

Tamas Haidegger<br />

Adam Halasz<br />

Dogan Sinan Haliyo<br />

Aarne J. Halme<br />

Dan Halperin<br />

Masaki Hamamoto<br />

Tarek Hamel<br />

William R. Hamel<br />

Chang-Soo Han<br />

Gyu Chull Han<br />

Jianda Han<br />

Kyung min Han<br />

Li Han<br />

Long Han<br />

Ryo Hanai<br />

Ankur Handa<br />

Uwe D. Hanebeck<br />

Guangbo Hao<br />

Masayuki Hara<br />

Naoyuki Hara<br />

Susumu Hara<br />

Kanako Harada<br />

Kensuke Harada<br />

Takashi Harada<br />

Tatsuya Harada<br />

Frederick Harris<br />

Stephen Hart<br />

Robert Haschke<br />

Osamu Hasegawa<br />

Tadahiro Hasegawa<br />

Tsutomu Hasegawa<br />

Yasuhisa Hasegawa<br />

Kenji Hashimoto<br />

Koichi Hashimoto<br />

Minoru Hashimoto<br />

Shuji Hashimoto<br />

Takuya Hashimoto<br />

Naohisa Hashimoto<br />

Takeshi Hatanaka<br />

Naotaka Hatao<br />

Ross Hatton<br />

Sabine Hauert<br />

Helmut Hauser<br />

Kris Hauser<br />

Nicolas Hautière<br />

Nick Hawes<br />

Yoshikazu Hayakawa<br />

Mitsuhiro Hayashibe<br />

Jean-Bernard Hayet<br />

Galen Clark Haynes<br />

Vincent Hayward<br />

Klas Hedenberg<br />

Markus Hehn<br />

Hordur K Heidarsson<br />

Björn Hein<br />

Milton Heinen<br />

Achim Hekler<br />

Patrick Helmer<br />

Gustaf Hendeby<br />

Gregory Henderson<br />

Thomas C. Henderson<br />

Philipp Hennig<br />

Peter Henry<br />

Damitha Chandana Herath<br />

Evan Herbst<br />

Just Herder<br />

Andrei Herdt<br />

Bruno Herisse<br />

Sorin Herle<br />

Benoît Herman<br />

Alejandro Hernandez Arieta<br />

Daniel Hernández-Sosa<br />

Albert Hernansanz<br />

Micha Hersch<br />

Joachim Hertzberg<br />

Joel Hesch<br />

Tarcisio Hess-Coelho<br />

Matthew Scott Hester<br />

Todd Hester<br />

Clint Heyer<br />

Mitsuru Higashimori<br />

Masaru Higuchi<br />

Ulrich Hillenbrand<br />

Nick Hillier<br />

Volker Hilsenstein<br />

Salim Hima<br />

Lindsey Hines<br />

James Hing<br />

Tatsuya Hirahara<br />

Hiroaki Hirai<br />

Yasuharu Hirasawa<br />

Yasuhisa Hirata<br />

Heiko Hirschmüller<br />

Gerd Hirzinger<br />

Hisashi Handa<br />

Sean Ho<br />

Van Ho<br />

John Hoare<br />

Warren Hoburg<br />

Nico Hochgeschwender<br />

Antony Hodgson<br />

Michael Hofbaur<br />

Nicholas Hoff<br />

Alwin Hoffmann<br />

Gabriel Hoffmann<br />

Heiko Hoffmann<br />

Andrew Hogue<br />

Shinji Hokamoto<br />

Geoffrey Hollinger<br />

Patrick Hollis<br />

Rebecca Hollmann<br />

Dirk Holz<br />

Masaaki Honda<br />

Man Bok Hong<br />

Carl D. Hoover<br />

Ben Horan<br />

Toshio Hori<br />

Armin Hornung<br />

Satoshi Hoshino<br />

Koh Hosoda<br />

Shigeyuki Hosoe<br />

Feili Hou<br />

Jonathan How<br />

Ayanna Howard<br />

Matthew Howard<br />

Tom Howard<br />

Robert D. Howe<br />

Stefan Hrabar<br />

Kaijen Hsiao<br />

M. Ani Hsieh<br />

David Hsu<br />

John Hsu<br />

Bo Hu<br />

Chao Hu<br />

Chunhua Hu<br />

Guoqiang Hu<br />

Hongsheng Hu<br />

Jwu-Sheng Hu<br />

Tianjiang Hu<br />

Zhencheng Hu<br />

Changchun Hua<br />

Ana Huaman<br />

Albert S. Huang<br />

Cheng-Ming Huang<br />

Chih-Fang Huang<br />

Haibo Huang<br />

Han-Pang Huang<br />

Haomiao Huang<br />

Jian Huang<br />

Jian Huang<br />

Ke Huang<br />

Qiang Huang<br />

Ruining Huang<br />

Shoudong Huang<br />

Shuguang Huang<br />

Shuo Huang<br />

Tzu-Hao Huang<br />

Yazhou Huang<br />

Daniel Huber<br />

Manfred Huber<br />

Christian Hubicki<br />

Kai Huebner<br />

Joel C. Huegel<br />

Huihua Zhao<br />

Jan Huissoon<br />

Thomas Hulin<br />

James Sean Humbert<br />

Terry Huntsberger<br />

Jonathan Hurst<br />

Ammar Husain<br />

Islam Hussein<br />

Manfred Husty<br />

Seth Hutchinson<br />

Gilgueng Hwang<br />

Heeseon Hwang<br />

Jung-Hoon Hwang<br />

Kazuyuki Hyodo<br />

Sang-Ho Hyon<br />

Baro Hyun<br />

Heikki Sakari Hyyti<br />

Karl Iagnemma<br />

Ion Iancu<br />

Akihiko Ichikawa<br />

Romain Iehl<br />

Sio-Hoi Ieng<br />

Fumiya Iida<br />

Kojiro Iizuka<br />

Atsutoshi Ikeda<br />

Yoshito Ikemata<br />

Shuhei Ikemoto<br />

Yusuke Ikemoto<br />

Viorela Ila<br />

Boyko Iliev<br />

Jarmo Ilonen<br />

Atsushi Imiya<br />

Gokhan Ince<br />

Giovanni Indiveri<br />

Marina Indri<br />

Francois Felix Ingrand<br />

Kenji Inoue<br />

Takahiro Inoue<br />

Luca Iocchi<br />

Iulian Iordachita<br />

Ioannis Iossifidis<br />

Masatsugu Iribe<br />

Josep Isern-González<br />

Genya Ishigami<br />

Akio Ishiguro<br />

Hiroyuki Ishii<br />

Kazuo Ishii<br />

Jun Ishikawa<br />

Volkan Isler<br />

Daigoro Isobe<br />

Kazuyuki Ito<br />

Keiichiro Ito<br />

Tomotaka Itoh<br />

Iñaki Iturrate<br />

Serena Ivaldi<br />

Ioan Alexandru Ivan<br />

Yoshio Iwai<br />

Masami Iwase<br />

Yumi Iwashita<br />

Hiroyasu Iwata<br />

Yasushi Iwatani<br />

Martin Jagersand<br />

Leonard Jaillet<br />

Advait Jain<br />

Dominik Jain<br />

Ravi Kant Jain<br />

Michael Jakuba<br />

Doug L. James<br />

Rodrigo Jamisola<br />

Mo Jamshidi<br />

Farrokh Janabi-Sharifi<br />

Balazs Janko<br />

Pakpong Jantapremjit<br />

Alberto Jardon Huete<br />

Nathanaël Jarrassé<br />

Luc Jaulin<br />

C.V. Jawahar<br />

Graylin Jay<br />

Suhada Jayasuriya<br />

Chandimal Jayawardena<br />

Jagadeesan Jayender<br />

Changsoo Je<br />

Michael Jenkin<br />

Leif P. Jentoft<br />

Jeong hwan Jeon<br />

Jay Jeong<br />

Mun-Ho Jeong<br />

Seonghee Jeong<br />

Sabina Jeschke<br />

Nikolay Jetchev<br />

Sang Hoon Ji<br />

Zhengping Ji<br />

Yan-Bin Jia<br />

Zhaoyin Jia<br />

Guangying Jiang<br />

Shu Jiang<br />

Xin Jiang<br />

Yun Jiang<br />

Jiyong Jin<br />

Sungho Jo<br />

Rolf Johansson<br />

Aaron Johnson<br />

Andrew Johnson<br />

Collin Johnson<br />

Miles Johnson<br />

Matthew Johnson-Roberson<br />

Magnus Johnsson<br />

Dominik Joho<br />

Cyril Joly<br />

Edward Gil Jones<br />

Ben Jonker<br />

Brett Jordan<br />

Ajay Joshi<br />

Ming-Shaung Ju<br />

Zhaojie Ju<br />

Simon Justin Julier<br />

Hong-Gul Jun<br />

Boyoon Jung<br />

Changbae Jung<br />

Min Yang Jung<br />

Yoon C. Jung<br />

Michael Kaess<br />

Satoshi Kagami<br />

Lueder Alexander Kahrs<br />

Shuuji Kajita<br />

Mrinal Kalakrishnan<br />

Vinutha Kallem<br />

Pasi Johannes Kallio<br />

Marcelo Kallmann<br />

Tetsushi Kamegawa<br />

Mitsuhiro Kamezaki<br />

Akiya Kamimura<br />

Hiroshi Kaminaga<br />

Sören Kammel<br />

Roman Kamnik<br />

Stratis Kanarachos<br />

Takefumi Kanda<br />

Fumio Kanehiro<br />

Kenji Kaneko<br />

Makoto Kaneko<br />

Yeonsik Kang<br />

Balajee Kannan<br />

Dimitrios Kanoulas<br />

George Kantor<br />

Imin Kao<br />

Ankur Kapoor<br />

Sertac Karaman<br />

Michael Karg<br />

Nikhil Karnad<br />

Amir Karniel<br />

Joanna Karpinska<br />

George Karras<br />

Sisir Karumanchi<br />

Dusko Katic<br />

Naomi Kato<br />

Max Katsev<br />

Pantelis Katsiaris<br />

Dov Katz<br />

Lydia Kavraki<br />

Hideki Kawahara<br />

Tatsuya Kawahara<br />

Atsushi Kawakami<br />

Kazuya Kawamura<br />

Sadao Kawamura<br />

Hiroshi Kawano<br />

Hiroshi Kawasaki<br />

Hiroaki Kawashima<br />

Kenji Kawashima<br />

Hirohiko Kawata<br />

Kentaro Kayama<br />

Richard Kelley<br />

Jonathan Kelly<br />

Anssi Juhani Kemppainen<br />

Farid Kendoul<br />

Benjamin Kent<br />

Lychek Keo<br />

Mehrdad R. Kermani<br />

Serge Kernbach<br />

Kristian Kersting<br />

Samuel B. Kesner<br />

Saad Khan<br />

Piyush Khandelwal<br />

Abderrahmane Kheddar<br />

Kazuo Kiguchi<br />

Takehito Kikuchi<br />

Ryo Kikuuwe<br />

Byeong-Sang Kim<br />

Byoung-Ho Kim<br />

Chang Young Kim<br />

ChangHwan Kim<br />

Chanki Kim<br />

Chyon Hae Kim<br />

Doik Kim<br />

Dong Hwan Kim<br />

Donghyeon Kim<br />

H. Jin Kim<br />

Hee-Young Kim<br />

Hongho Kim<br />

Jinwook Kim<br />

Jiwoong Kim<br />

Jong-Hwan Kim<br />

Jong-Wook Kim<br />

Jonghoek Kim<br />

Jongwon Kim<br />

Joo H. Kim<br />

Jung Kim<br />

Jung-Yup Kim<br />

Junggon Kim<br />

Keehoon Kim<br />

Kyung-Joong Kim<br />

MinJun Kim<br />

Sangbae Kim<br />

Whee Kuk Kim<br />

Yongtae Kim<br />

Young J. Kim<br />

Youngmoo Kim<br />

Shinichi Kimura<br />

Chih-Hung King<br />

H. Hawkeye King<br />

Hitoshi Kino<br />

James Kinsey<br />

Tetsuya Kinugasa<br />

Zsolt Kira<br />

Frank Kirchner<br />

Nathan Kirchner<br />

Alexis Kirke<br />

Nobuyuki Kita<br />

Yasuyo Kita<br />

Bernd Kitt<br />

Gregor Klancar<br />

Ulrich Klank<br />

Alexander Kleiner<br />

Andrew Klesh<br />

Denis Klimentjew<br />

Stephen Kloder<br />

Markus Klotzbuecher<br />

Boris Kluge<br />

Pawel Kmiotek<br />

Laurent Kneip<br />

Ross A Knepper<br />

Heather Knight<br />

Alois Knoll<br />

George Knopf<br />

Jeremie Knuesel<br />

Kheng Lee Koay<br />

Etsuko Kobayashi<br />

Futoshi Kobayashi<br />

Hiroyuki Kobayashi<br />

Taizo Kobayashi<br />

Yo Kobayashi<br />

Yuichi Kobayashi<br />

Jens Kober<br />

Marin Kobilarov<br />

Sarath Kodagoda<br />

Kenri Kodaka<br />

Masanao Koeda<br />

James Koeneman<br />

Nathan Koenig<br />

Koichi Koganezawa<br />

Kiminao Kogiso<br />

Norihiro Koizumi<br />

Nigel Kojimoto<br />

Ewa Kolakowska<br />

Maxim Kolesnikov<br />

Thomas Kollar<br />

Andreas Kolling<br />

J. Zico Kolter<br />

Erik Komendera<br />

Haldun Komsuoglu<br />

Taku Komura<br />

Takashi Komuro<br />

Kazuyuki Kon<br />

Toshiyuki Kondo<br />

Fanyu Kong<br />

Xianwen Kong<br />

Rainer Konietschke<br />

Atsushi Konno<br />

Kurt Konolige<br />

Michail Kontitsis<br />

Ioannis Kontolatis<br />

Masashi Konyo<br />

Ig Mo Koo<br />

Ja Choon Koo<br />

Gert Kootstra<br />

Petar Kormushev<br />

G. Ayorkor Korsah<br />

David Kortenkamp<br />

Gabor Kosa<br />

Hatice Kose-Bagci<br />

Jana Kosecka<br />

Yoshihiko Koseki<br />

D. I. Kosmopoulos<br />

Kazuhiro Kosuge<br />

Shinya Kotosaka<br />

Navinda Kottege<br />

Mirko Kovac<br />

Gerhard Kraetzschmar<br />

Dirk Kraft<br />

Florian Kraft<br />

Danica Kragic<br />

Michael Krainin<br />

Bradley Kratochvil<br />

Frédéric Kratz<br />

Hermano Igo Krebs<br />

Philipp Kremer<br />

Ingo Kresse<br />

Henrik Kretzschmar<br />

Simon Kriegel<br />

Madhava Krishna<br />

Torsten Kroeger<br />

Oliver Kroemer<br />

Athanasios Krontiris<br />

Kristian Kroschel<br />

Volker Krueger<br />

Alexandre Krupa<br />

Sebastien Krut<br />

Maarja Kruusmaa<br />

Jiun-Yih Kuan<br />

Wilfried Kubinger<br />

Takashi Kubota<br />

Daniel Kubus<br />

Shunsuke Kudoh<br />

Bernhard Kuebler<br />

Rainer Kuemmerle<br />

James Kuffner<br />

Benjamin Kuipers<br />

Dana Kulic<br />

Ganesh Kumar<br />

Manish Kumar<br />

Prasanth Kumar<br />

Rajesh Kumar<br />

Vijay Kumar<br />

Yasuharu Kunii<br />

Yoshinori Kuno<br />

Tobias Kunz<br />

I Han Kuo<br />

Victor Kuo<br />

Kentarou Kurashige<br />

Ryo Kurazume<br />

Masamitsu Kurisu<br />

Yuichi Kurita<br />

Yoji Kuroda<br />

Haruhisa Kurokawa<br />

Yoshiaki Kuwata<br />

Dong-Soo Kwon<br />

Hyukseong Kwon<br />

Hyunki Kwon<br />

Ohung Kwon<br />

SangJoo Kwon<br />

Yong Moo Kwon<br />

Young-Sik Kwon<br />

Kolja Kühnlenz<br />

Ville Kyrki<br />

Erik Kyrkjebø<br />

Bruno L'Esperance<br />

Hung La<br />

Janne Laaksonen<br />

Ouiddad Labbani-Igbida<br />

Francisco Lacerda<br />

Bakir Lacevic<br />

Simon Lacroix<br />

Matteo Laffranchi<br />

Chin-Lun Lai<br />

Kevin Lai<br />

Yongjun Lai<br />

Sara Lal<br />

Thierry Laliberte<br />

Jean-Francois Lalonde<br />

Chi Pang Lam<br />

Raymond Lam<br />

Olivier Lambercy<br />

Florent Lamiraux<br />

Pierre Lamon<br />

Roberto Lampariello<br />

Christoph H. Lampert<br />

Chao-Chieh Lan<br />

David Lane<br />

Christian Lang<br />

Jochen Lang<br />

Tobias Lang<br />

Friedrich Lange<br />

Sascha Lange<br />

Jacob Langelaan<br />

Jørgen Larsen<br />

Amy Larson<br />

Cecilia Laschi<br />

Boris Lau<br />

Darwin Lau<br />

Manfred Lau<br />

Nuno Lau<br />

Tak Kit Lau<br />

Tim Laue<br />

Martin Lauer<br />

Christian Laugier<br />

Steven M. LaValle<br />

Brian Lawson<br />

Maria Teresa Lazaro<br />

Olivier Le Meur<br />

Jerome Le Ny<br />

Brigitte Le-Pévédic<br />

Chil-Woo Lee<br />

Daewon Lee<br />

Daniel D. Lee<br />

Dongheui Lee<br />

Doo Yong Lee<br />

Hyunglae Lee<br />

Hyunsuk Lee<br />

Jae Hoon Lee<br />

Jaewoo Lee<br />

Ji-Yeong Lee<br />

Jihong Lee<br />

Jonghyun Lee<br />

Ju-Jang Lee<br />

Juhyun Lee<br />

Kiju Lee<br />

Kyoung Mu Lee<br />

Loo Hay Lee<br />

Mark Lee<br />

Sanghoon Lee<br />

Sangyoon Lee<br />

Se-Jin Lee<br />

Seongsoo Lee<br />

Seung-Mok Lee<br />

Sukhan Lee<br />

Wen-Yo Lee<br />

Woosub Lee<br />

Yun-Jung Lee<br />

Benjamin Lefaudeux<br />

Dirk Lefeber<br />

Stéphanie Lefèvre<br />

Jie Lei<br />

Antonio C. Leite<br />

Sylvie Lelandais<br />

Roland Lenain<br />

Sebastien Lengagne<br />

Scott Lenser<br />

Tommaso Lenzi<br />

John Leonard<br />

Simon Leonard<br />

Matteo Leonetti<br />

Vincent Lepetit<br />

Ilkka M. Leppänen<br />

Frederic Lerasle<br />

Charles Lesire<br />

Adrian Leu<br />

Keith Yu Kit Leung<br />

Christopher Lewis<br />

Jeremy Lewis<br />

M. Anthony Lewis<br />

Michael Lewis<br />

Christian Lexcellent<br />

Sigrid Leyendecker<br />

Bin Li<br />

Changchun Li<br />

Cheng Gang Li<br />

Guangyong Li<br />

Haizhou Li<br />

Hui Li<br />

Jingliang Li<br />

Min Li<br />

Ming Li<br />

Peng Li<br />

Qinchuan Li<br />

Ruonan Li<br />

Shigang Li<br />

Tsai-Yen Li<br />

Xiang Li<br />

Xingyan Li<br />

Yangming Li<br />

Yi Li<br />

Yuan Yuan Li<br />

Yuanping Li<br />

Yuwen Li<br />

Zhibin Li<br />

Zhibin Li<br />

Zhijun Li<br />

Qiaokang Liang<br />

Xinwu Liang<br />

Yao Liang<br />

Hongen Liao<br />

Minas Liarokapis<br />

Somchaya Liemhetcharat<br />

Jyh-Ming Lien<br />

Neal Y Lii<br />

Pål Liljebäck<br />

Chee Wang Lim<br />

Jongwoo Lim<br />

Kyeong Bin Lim<br />

Sejoon Lim<br />

Ser-Nam Lim<br />

Wenbin Lim<br />

Bor-Shen Lin<br />

Chia-How Lin<br />

Kuen-Han Lin<br />

Lanny Lin<br />

Pei-Chun Lin<br />

Shih Chi Lin<br />

Wei Lin<br />

Yu Lin<br />

Yucong Lin<br />

Zhiyun Lin<br />

Zhuohua Lin<br />

Magnus Linderoth<br />

Laura Lindzey<br />

Qiang Ling<br />

Kai Lingemann<br />

Vittorio Lippi<br />

Vincenzo Lippiello<br />

Michael Lipsett<br />

Bingbing Liu<br />

Chao Liu<br />

Chenggang Liu<br />

Fangfang Liu<br />

Guanfeng Liu<br />

Guangjun Liu<br />

Hanwei Liu<br />

Hong Liu<br />

Hong Liu<br />

Hongbin Liu<br />

Honghai Liu<br />

Jie Liu<br />

Jindong Liu<br />

Jing Liu<br />

Jing-Sin Liu<br />

Jinguo Liu<br />

Karen Liu<br />

Lianqing Liu<br />

Minjie Liu<br />

Rongqiang Liu<br />

Tao Liu<br />

Tong Liu<br />

Xinyu Liu<br />

Yang Liu<br />

Yang Liu<br />

Yong Liu<br />

Yugang Liu<br />

Fernando Lizarralde<br />

Xavier Lladó<br />

Jorge Lobo<br />

Vo-Gia Loc<br />

Dario Lodi Rizzini<br />

Sauro Longhi<br />

Rosemarijn Looije<br />

Manuel Lopes<br />

Natalia Lopez Celani<br />

Carlos Al. López-Franco<br />

Ismael Lopez-Juarez<br />

Rigoberto Lopez-Padilla<br />

Antonio Loria<br />

Yunjiang Lou<br />

Miguel Lourenço<br />

Bryan Kian Hsiang Low<br />

K. H. Low<br />

Thomas Low<br />

Robert Lowe<br />

Rogelio Lozano<br />

Tomas Lozano-Perez<br />

Qi Lu<br />

Tien-Fu Lu<br />

Yan Lu<br />

Yanyan Lu<br />

Yi Lu<br />

Tomasz Lubecki<br />

Matthias Luber<br />

Carlos Luck<br />

Brandon Luders<br />

Greg R. Luecke<br />

Ching-Hu Luh<br />

Wen Lik Dennis Lui<br />

Henk Luinge<br />

Christopher Lum<br />

Ryan Luna<br />

Jun Luo<br />

Ren Luo<br />

Zhiqiang Luo<br />

Sergei Lupashin<br />

Alexis Lussier Desbiens<br />

Goran Lynch<br />

Kevin Lynch<br />

Daniel Lyons<br />

Ingo Lütkebohle<br />

Martin Lösch<br />

Hongbo Ma<br />

Jeremy Ma<br />

Shugen Ma<br />

Heiko Maass<br />

Bruce MacDonald<br />

D. Guimarães Macharet<br />

Doug MacKenzie<br />

Robert A. MacLachlan<br />

David MacNair<br />

Yasushi Mae<br />

Guilherme Jorge Maeda<br />

Shingo Maeda<br />

Andreas Maeder<br />

Shoichi Maeyama<br />

Gianantonio Magnani<br />

Martin Magnusson<br />

Aditya Mahadevan<br />

Ian Mahon<br />

Robert Mahony<br />

Mohammad Mahoor<br />

Mohsen Mahvash<br />

Daniel Maier<br />

Jim Mainprice<br />

Perla Maiolino<br />

Elmar Mair<br />

Jeremy Maitin-Shepard<br />

Andras Majdik<br />

Carmel Majidi<br />

Toshihiro Maki<br />

Masaaki Makikawa<br />

Satoshi Makita<br />

Michail Makrodimitris<br />

Alexis Maldonado<br />

Anthony Mallet<br />

Angelos Mallios<br />

Kenneth Mallory<br />

Abed Malti<br />

Noureddine Manamanni<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–xv–<br />

Kasra Manavi<br />

Ian Manchester<br />

Roberto Manduchi<br />

Olivier Mangin<br />

Kalyan Mankala<br />

Justin Mann<br />

Nicolas Mansard<br />

Zhi-Hong Mao<br />

Panos Marantos<br />

James Marble<br />

Eric Marchand<br />

Andrew Marchese<br />

Luca Marchetti<br />

Lorenzo Marconi<br />

Rebeca Marfil<br />

Luis Felipe Marin-Urias<br />

Dimitri Marinakis<br />

Alessandro Marino<br />

Gian Luca Mariottini<br />

Ali Marjovi<br />

Ivan Markovic<br />

Lino Marques<br />

Steven Marra<br />

Jose Luis Marroquin<br />

Joshua A. Marshall<br />

Sylvain Martel<br />

Anne Martin<br />

Patrick Martin<br />

Steven Colin Martin<br />

Ester Martinez<br />

Oscar Martinez Mozos<br />

Harold R. Martinez Salazar<br />

Ruben Martinez-Cantin<br />

J.R. Martínez-de Dios<br />

Simone Martini<br />

Alcherio Martinoli<br />

Eric Martinson<br />

Zoltan-Csaba Marton<br />

Noriaki Maru<br />

Hisataka Maruyama<br />

Ignacio Mas<br />

Ken Masamune<br />

Lorenzo Masia<br />

Julian Mason<br />

Ahmad A. Masoud<br />

Hiroyuki Masuta<br />

Maja Mataric<br />

Bogdan Matei<br />

T. William Mather<br />

George Mathews<br />

Fernando Matia<br />

Takamitsu Matsubara<br />

Osamu Matsumoto<br />

Yoshio Matsumoto<br />

Takayuki Matsuno<br />

Daisuke Matsuura<br />

Kenichiro Matsuzaki<br />

Dennis Matthews<br />

Larry Matthies<br />

Giuliana Mattiazzo<br />

Rosana Matuk Herrera<br />

Saburo Matunaga<br />

Christophe Maufroy<br />

Francesco Maurelli<br />

Iñaki Maurtua<br />

S. Mohammad Mavadati<br />

Constantinos Mavroidis<br />

Bruce Maxwell<br />

Stephen Maybank<br />

Jerome Maye<br />

Christoph Mayer<br />

Walterio Mayol<br />

Anirban Mazumdar<br />

Barbara Mazzarino<br />

Barbara Mazzolai<br />

Stefano Mazzoleni<br />

Lachlan McCalman<br />

Chris McCarthy<br />

Samuel McCarthy<br />

Erik McDermott<br />

Christopher McFarland<br />

Martin McGinnity<br />

Michael McHenry<br />

Gerard McKee<br />

David McKinnon<br />

T.W. McLain<br />

James McLurkin<br />

John J. McPhee<br />

Timothy McPherson<br />

Rafik Mebarki<br />

Lashika Medagoda<br />

Gustavo Medrano-Cerda<br />

Debbie Meduna<br />

Sanford Meek<br />

Wim Meeussen<br />

David Paul Meger<br />

Malika Meghjani<br />

Syed Bilal Mehdi<br />

Mehran Mehrandezh<br />

Siddhartha Mehta<br />

Christopher Mei<br />

Luis Mejias<br />

Claudio Melchiorri<br />

Daniel Mellinger<br />

Rochelle Mellish<br />

Arianna Menciassi<br />

Caio Mendes<br />

Ricardo F. Mendoza Garcia<br />

Emanuele Menegatti<br />

Felipe Meneguzzi<br />

Paulo Menezes<br />

Yan Meng<br />

Tekin Mericli<br />

Luis Merino<br />

Jean-Pierre Merlet<br />

Gregory Mermoud<br />

Torsten Merz<br />

Elena Messina<br />

Uwe Mettin<br />

Daniel Meyer-Delius<br />

Youcef Mezouar<br />

Krzysztof Mianowski<br />

Alain Micaelli<br />

Nathan Michael<br />

Andreas Michaels<br />

Francois Michaud<br />

Fabien Michel<br />

Bernard Michini<br />

Davide Migliore<br />

Matjaž Mihelj<br />

David Mikesell<br />

Michael J. Milford<br />

Isaac Miller<br />

James K. Mills<br />

Bojan Milosevic<br />

Adam Milstein<br />

Dejan Milutinovic<br />

Hyeun Jeong Min<br />

Mamoru Minami<br />

Aiguo Ming<br />

Nicoleta Minoiu Enache<br />

Mark Minor<br />

Sylvain Miossec<br />

Luiz Gustavo Mirisola<br />

Faraz Mirzaei<br />

Marcell Missura<br />

Michael Mistry<br />

Seiichi Mita<br />

Atsushi Mitani<br />

Kazuhisa Mitobe<br />

Nikos Mitsou<br />

Mamoru Mitsuishi<br />

Noriaki Mitsunaga<br />

Jun Miura<br />

Kanako Miura<br />

Toyomi Miyagawa<br />

Seiichi Miyakoshi<br />

Shuhei Miyashita<br />

Takahiro Miyashita<br />

Mehrdad Moallem<br />

Hiromi Mochiyama<br />

Joseph Modayil<br />

Birgit Moeller<br />

Richard Phillip Mohamed<br />

Mahmood Mohammadi<br />

Samer Mohammed<br />

Vishwanathan Mohan<br />

Kamran Mohseni<br />

Shahram Mohseni-vahed<br />

Rezia Molfino<br />

Mark Moll<br />

Vito Monaco<br />

Francesco Mondada<br />

Raúl Monroy<br />

Luis Montano<br />

Luis Montesano<br />

Eduardo Montijano<br />

Chang-bae Moon<br />

Hyungpil Moon<br />

Jae-Sung Moon<br />

Brian Moore<br />

Joseph Moore<br />

Richard J.D. Moore<br />

Frank Moosmann<br />

Andres Mora<br />

Hadi Moradi<br />

Antonio Morales<br />

Eduardo Morales<br />

Jesús Morales<br />

Marco Morales<br />

Luis Yoichi Morales Saiki<br />

Fabio Morbidi<br />

Guillaume Morel<br />

Yannick Morel<br />

Vassilios Morellas<br />

Jan Morén<br />

Juan Camilo Moreno<br />

Luis Moreno<br />

Francesc Moreno-Noguer<br />

Kristi Morgansen<br />

Takeshi Mori<br />

Taketoshi Mori<br />

Yasushi Morikawa<br />

Mitsuharu Morisawa<br />

Takuro Moriyama<br />

Nicholas Morizio<br />

Nicholas Morozovsky<br />

Daniel D. Morris<br />

Peter Morton<br />

Ryan Morton<br />

Mark Moseley<br />

Alejandro R. Mosteo<br />

El Mustapha Mouaddib<br />

Tetsuya Mouri<br />

Anastasios Mourikis<br />

Annan Mozeika<br />

Andreas Mueller<br />

Joerg Mueller<br />

Katharina Muelling<br />

Jonathan Mugan<br />

Ranjan Mukherjee<br />

Rudranarayan Mukherjee<br />

Albert Mukovskiy<br />

Enzo Mumolo<br />

Marko Munih<br />

Luis Miguel Munoz<br />

Victor Muñoz<br />

Riccardo Muradore<br />

Kenichi Murakami<br />

Kouji Murakami<br />

Toshiyuki Murakami<br />

Ana Cristina Murillo<br />

Todd Murphey<br />

Chris Murphy<br />

David Murray<br />

Richard Murray<br />

Rafael Murrieta-Cid<br />

Giovanni Muscato<br />

Shabbir K. Mustafa<br />

Bilge Mutlu<br />

George Mylonas<br />

Holger Mönnich<br />

Thomas Mörwald<br />

S. Nadubettu Yadukumar<br />

Hajime Nagahara<br />

Kiyoshi Nagai<br />

Takayuki Nagai<br />

Yukie Nagai<br />

Hiroyuki Nagamatsu<br />

Kenji Nagaoka<br />

Umashankar Nagarajan<br />

Kazuyuki Nagata<br />

Keiji Nagatani<br />

Radhika Nagpal<br />

Prathap Nair<br />

Michael D. Naish<br />

Eiichi Naito<br />

Zoran Najdovski<br />

Homayoun Najjaran<br />

Kazuhiro Nakadai<br />

Ryu Nakadate<br />

Masahiro Nakajima<br />

Akio Nakamura<br />

Ryoichi Nakamura<br />

Taro Nakamura<br />

Yutaka Nakamura<br />

Hiroaki Nakanishi<br />

Hiroki Nakanishi<br />

Jun Nakanishi<br />

Yuto Nakanishi<br />

Akira Nakashima<br />

Atsushi Nakazawa<br />

Alireza Nakhaei<br />

Roberto Naldi<br />

Akio Namiki<br />

Mehrzad Namvar<br />

Thrishantha Nanayakkara<br />

Jun Nango<br />

Kostas Nanos<br />

Ashley Napier<br />

Nils Napp<br />

Kenichi Narioka<br />

Oleg Naroditsky<br />

David Naso<br />

Sameh Nassar<br />

Ciro Natale<br />

Lorenzo Natale<br />

Shuvra Nath<br />

Robert Nawrocki<br />

Shahriar Negahdaripour<br />

José Neira<br />

Goldie Nejat<br />

Bradley J. Nelson<br />

Carl Nelson<br />

Bojan Nemec<br />

Dragomir Nenchev<br />

Esha Nerurkar<br />

Issa Nesnas<br />

Stephen Nestinger<br />

Eric Nettleton<br />

Jeremiah Neubert<br />

Peter Neuhaus<br />

Bradford Neuman<br />

Jan Neumann<br />

Teck Chew Ng<br />

Victor Ng-Thow-Hing<br />

Trung Dung Ngo<br />

Hai Nguyen<br />

Duy Nguyen-Tuong<br />

Kai Ni<br />

Marta Niccolini<br />

Monica Nicolescu<br />

Gunter Niemeyer<br />

Juan Nieto<br />

Carlos Nieto-Granda<br />

Misato Nihei<br />

Ryuma Niiyama<br />

Janosch Nikolic<br />

Nattee Niparnan<br />

Shun Nishide<br />

Yasutaka Nishioka<br />

Koichi Nishiwaki<br />

Ilana Nisky<br />

Verena Nitsch<br />

Tomoyuki Noda<br />

David Noelle<br />

Hiroshi Noguchi<br />

Masahiro Nohmi<br />

Scott Nokleby<br />

Narges Noori<br />

Toshiro Noritsugu<br />

Navid Nourani Vatani<br />

Illah Reza Nourbakhsh<br />

Walter Nowak<br />

Cameron Nowzari<br />

Shahin Nudehi<br />

Jesus Nuevo<br />

Urbano Nunes<br />

Pedro Núñez Trujillo<br />

Emmanuel Nuño<br />

Stephen Nuske<br />

Rehan O'Grady<br />

Keith O'Hara<br />

Jason O'Kane<br />

Damien O'Rourke<br />

Oliver Obst<br />

Tobias Obst<br />

Manuel Ocaña<br />

Mitsushige Oda<br />

Lael Odhner<br />

Jean-Marc Odobez<br />

Denny Oetomo<br />

Tsukasa Ogasawara<br />

Tetsuya Ogata<br />

Tetsuji Ogawa<br />

Koichi Ogawara<br />

Dimitri Ognibene<br />

Petter Ogren<br />

Paul Y. Oh<br />

Se-Young Oh<br />

Sehoon Oh<br />

Songhwai Oh<br />

Kenichi Ohara<br />

Kazunori Ohno<br />

Apollon Oikonomopoulos<br />

Lauro Ojeda<br />

Kei Okada<br />

Masafumi Okada<br />

Jun Okamoto<br />

Jun Okamoto Junior<br />

Allison M. Okamura<br />

Hiroshi G. Okuno<br />

Andrej Olenšek<br />

Nejat Olgac<br />

Miguel A. Olivares-Mendez<br />

Paulo Oliveira<br />

Tiago Roux Oliveira<br />

Gabriel A. Oliver<br />

Anibal Ollero<br />

Daniel Olmeda<br />

Bjorn Olofsson<br />

Clark Olson<br />

Edwin Olson<br />

Tomas Olsson<br />

Toru Omata<br />

Cagdas Denizel Onal<br />

Ahmet Onat<br />

Hiromu Onda<br />

Masahiro Ono<br />

Yizhar Or<br />

David Orin<br />

Agustin A. Ortega Jimenez<br />

Alberto Ortiz<br />

Daniel Ortiz Morales<br />

Tobias Ortmaier<br />

Fernando Osório<br />

James Ostrowski<br />

Koichi Osuka<br />

Jun Ota<br />

Masatsugu Otsuki<br />

Christian Ott<br />

Lionel Ott<br />

Michael W. Otte<br />

Mark Ottensmeyer<br />

Zhicai Ou<br />

Puren Ouyang<br />

Dai Owaki<br />

Jason Owens<br />

Eimei Oyama<br />

John O. Oyekan<br />

Ryuta Ozawa<br />

Onur Ozcan<br />

Ali Gürcan Özkil<br />

Erhan Oztop<br />

Taskin Padir<br />

Vincent Padois<br />

Ali Paikan<br />

Nicholas Paine<br />

Ana Lucia Pais<br />

Jordi Palacin<br />

Gianluca Palli<br />

Rainer Palm<br />

Luther R. Palmer III<br />

Matteo Claudio Palpacelli<br />

Jia Pan<br />

Qinxue Pan<br />

Ya-Jun Pan<br />

Dimitra Panagou<br />

Amit Kumar Pandey<br />

Shuo Pang<br />

Dejan Pangercic<br />

Giorgio Panin<br />

Athanasia Panousopoulou<br />

Julien Pansiot<br />

Caroline Pantofaru<br />

Evangelos Papadopoulos<br />

Georgios Papadopoulos<br />

Nikos Papanikolopoulos<br />

George J. Pappas<br />

Byungjae Park<br />

Edward J. Park<br />

Ga-Ram Park<br />

Hyongju Park<br />

Hyungmin Park<br />

Jae Byung Park<br />

Jaeheung Park<br />

Jong Jin Park<br />

Sangdeok Park<br />

Shinsuk Park<br />

Soonyong Park<br />

Sukho Park<br />

Sung-Kee Park<br />

Wooram Park<br />

Yong-Jai Park<br />

Youngjin Park<br />

Aaron Parness<br />

Vicente Parra Vega<br />

Martin Peter Parsley<br />

Federica Pascucci<br />

Anatol Pashkevich<br />

Carolina Passenberg<br />

Peter Pastor<br />

Maria Pateraki<br />

Kaustubh Pathak<br />

Volkan Patoglu<br />

Alexandru Patriciu<br />

Ugo Pattacini<br />

Alexander Patterson IV<br />

Rohan Paul<br />

Frederick Pauling<br />

Liam Paull<br />

Dietrich Paulus<br />

Marco Pavone<br />

Jonathan Paxman<br />

Shahram Payandeh<br />

Lina María Paz<br />

Roger Pearce<br />

Federico Pecora<br />

Luis Pedraza<br />

Angelika Peer<br />

Claude Pégard<br />

Yuanteng Pei<br />

Paulo Peixoto<br />

Pietro Peliti<br />

Ron Pelrine<br />

Virginia Pensabene<br />

Ivan Penskiy<br />

Romain Pepy<br />

Guilherme Pereira<br />

Jose Nuno Pereira<br />

Douglas Perrin<br />

Nicolas Yves Perrin<br />

Francesco Pescarmona<br />

Gustavo Pessin<br />

Steven Peters<br />

Henrik Gordon Petersen<br />

Klaus Petersen<br />

Cammy Peterson<br />

Kevin Peterson<br />

Lars Petersson<br />

Antoine Petit<br />

Florian Petit<br />

Clément Pêtrès<br />

Tadej Petric<br />

Ron Petrick<br />

Anna Petrovskaya<br />

Kathrin Eva Peyer<br />

Zachary Pezzementi<br />

Patrick Pfaff<br />

Friedrich Pfeiffer<br />

Frank Pfenning<br />

Max Pfingsthorn<br />

Martin Pfurner<br />

Minh Tu Pham<br />

Louis Phee<br />

Sylvie Philipp-Foliguet<br />

Roland Philippsen<br />

Johan Philips<br />

Mike Phillips<br />

Thanathorn Phoka<br />

Emmanuel Piat<br />

Stefano Piazza<br />

Brennand Pierce<br />

Vinay Pilania<br />

Luciano Pimenta<br />

Benny Ping Lai Lo<br />

Pedro Pinies<br />

Charles Pinto<br />

Peam Pipattanasomporn<br />

Tony Pipe<br />

Charles Pippin<br />

Gabriel Pires<br />

Hamed Pirsiavash<br />

Antonio Pistillo<br />

Benjamin Pitzer<br />

Mihail Pivtoraiko<br />

Oscar Pizarro<br />

Christian Plagemann<br />

Erion Plaku<br />

Robert Platt<br />

Patrick Plonski<br />

Frederic Plumet<br />

Paul G. Plöger<br />

Kishore Pochiraju<br />

Janez Podobnik<br />

Sameera Poduri<br />

Philippe Poignet<br />

Dragoljub Pokrajac<br />

Nancy S. Pollard<br />

Lorenzo Pollini<br />

Ilia G. Polushin<br />

Jorge Pomares<br />

Hasan Poonawala<br />

Dan Popa<br />

Josep M Porta<br />

Ingmar Posner<br />

Dave Post<br />

Alexis Potelle<br />

Veljko Potkonjak<br />

Andreas Pott<br />

Ioannis Poulakakis<br />

Paul Pounds<br />

Soha Pouya<br />

Nathan Powel<br />

Cedric Pradalier<br />

José Augusto Prado<br />

David Prasser<br />

Dilip Kumar Pratihar<br />

Mario Prats<br />

Jerry Pratt<br />

Domenico Prattichizzo<br />

Samuel Prentice<br />

Muriel Pressigout<br />

Edson Prestes<br />

Alberto Pretto<br />

Libor Preucil<br />

Carsten Preusche<br />

Andrzej Pronobis<br />

Nicolas Pronost<br />

Amanda Prorok<br />

Markus Przybylski<br />

Gustavo Armando Puerto<br />

Luis Puig<br />

Luca Pulina<br />

Bing Qiao<br />

Hong Qiao<br />

Tian Qiu<br />

Xingda Qu<br />

Giuseppe Quaglia<br />

Morgan Quigley<br />

Roger D. Quinn<br />

Annika Raatz<br />

Amina Radgui<br />

Katayon Radkhah<br />

Amir Rahmani<br />

Talal Rahwan<br />

Raja Jurdak<br />

Micky Rakotondrabe<br />

Srikumar Ramalingam<br />

Subramanian Ramamoorthy<br />

Amar Ramdane-Cherif<br />

Nacim Ramdani<br />

Arnau Ramisa<br />

Nadeesha Ranasinghe<br />

Ananth Ranganathan<br />

Pradeep Ranganathan<br />

Markus Rank<br />

Shrisha Rao<br />

Mohammad Rastgaar<br />

Ravi Kulan Rathnam<br />

Nathan Ratliff<br />

Banahalli Ratna<br />

Georg Rauter<br />

Barak Raveh<br />

Prabu Ravindran<br />

Konrad Rawlik<br />

Anjan Kumar Ray<br />

Francesco Rea<br />

Pierluigi Rea<br />

Christopher M. Reardon<br />

Brice Rebsamen<br />

Gabriel Recatala<br />

Signe Redfield<br />

Brooks Reed<br />

Kyle Brandon Reed<br />

Monica Reggiani<br />

Stéphane Régnier<br />

Rob Reilink<br />

Oscar Reinoso<br />

Luís Paulo Reis<br />

Ulrich Reiser<br />

Dustin Reishus<br />

Georgios Rekleitis<br />

C. David Remy<br />

Ping Ren<br />

Wei Ren<br />

Xiaofeng Ren<br />

Pierre Renaud<br />

Maria Joao Rendas<br />

Mark Rentschler<br />

Filoktimon Repoulias<br />

Ari Requicha<br />

David Ribas<br />

Isabel Ribeiro<br />

Rogerio Richa<br />

Arthur Richards<br />

Andrew Richardson<br />

Markus Rickert<br />

Jose-de-Jesus Rico<br />

Robert Riener<br />

Hala Rifai<br />

Ludovic Righetti<br />

Gerhard Rigoll<br />

Marcia Riley<br />

Helge Joachim Ritter<br />

Alejandro Rituerto<br />

Cameron Riviere<br />

Maximo A. Roa<br />

Richard Roberts<br />

Rodney Roberts<br />

Anders Robertsson<br />

Paolo Rocco<br />

Rui Paulo Rocha<br />

Eduardo Rocon<br />

Tobias Rodemann<br />

Adrian Rodriguez<br />

Alberto Rodriguez<br />

Carlos Rodriguez<br />

F. de Borja Rodríguez<br />

Marcela Patricia Rodriguez<br />

Samuel Rodriguez<br />

A. Rodríguez Tsouroukdissian<br />

F. Rodriguez y Baena<br />

Erick J. Rodriguez-Seda<br />

Arne Roennau<br />

John G. Rogers III<br />

Se-gon Roh<br />

John Roland<br />

Chris Roman<br />

Joseph M. Romano<br />

Eric Rombokas<br />

Lotfi Romdhane<br />

Antonio Romeo<br />

Javier Romero<br />

R. Ap. Francelin Romero<br />

M.-A. Romero-Ramirez<br />

Lluis Ros<br />

Carlos Rosales<br />

Jan Rosell<br />

Jacob Rosen<br />

Stephanie Rosenthal<br />

Robert Ross<br />

Claudio Rossi<br />

Juergen Rossmann<br />

Gerhard Roth<br />

Axel Rottmann<br />

Pierre Rouanet<br />

Stergios Roumeliotis<br />

Giannis Roussos<br />

Anthony Rowe<br />

Nicholas Roy<br />

Eric Royer<br />

Michael Rubenstein<br />

Martin Rufli<br />

Fabio Ruggiero<br />

Michael Ruhnke<br />

Andy Ruina<br />

Fabio Ruini<br />

Federico Ruiz<br />

Ubaldo Ruiz<br />

Javier Ruiz-del-Solar<br />

Wheeler Ruml<br />

Juergen Rummel<br />

Ian Rust<br />

Malcolm Ryan<br />

Paul E. Rybski<br />

Julian Ryde<br />

Jee-Hwan Ryu<br />

Jeha Ryu<br />

Thomas Röfer<br />

Thorsteinn Rögnvaldsson<br />

Selma Sabanovic<br />

Jose M. Sabater Navarro<br />

Lorenzo Sabattini<br />

Christophe Sabourin<br />

Parvaneh Saeedi<br />

Ryo Saegusa<br />

Alessandro Saffiotti<br />

Satoshi Saga<br />

Jody Alessandro Saglia<br />

Ranjana Sahai<br />

Erol Sahin<br />

Satoru Sakai<br />

Paolo Salaris<br />

Camille Salaun<br />

Septimiu E. Salcudean<br />

Antonino Salerno<br />

Curt Salisbury<br />

Roque Saltaren<br />

Giampiero Salvi<br />

Joaquim Salvi<br />

Gionata Salvietti<br />

Paul Samuel<br />

Siddharth Sanan<br />

Emilio Sánchez<br />

Abraham Sánchez López<br />

Gildardo Sanchez-Ante<br />

William Sandberg<br />

Arthur Sanderson<br />

Giulio Sandini<br />

Corina Sandu<br />

Alberto Sanfeliu<br />

Sang-Ho Hyon<br />

Kiattisak Sangpradit<br />

Ganesh Sankaranarayanan<br />

Pedro Santana<br />

Cristina Santos<br />

Veronica J. Santos<br />

Vitor Santos<br />

Amin Sarafraz<br />

Sezimaria F. P. Saramago<br />

Afsar Saranli<br />

Uluc Saranli<br />

Shigeru Sarata<br />

Irene Sardellitti<br />

Sanem Sariel-Talay<br />

Nilanjan Sarkar<br />

Hiroshi Saruwatari<br />

Yoko Sasaki<br />

Martin Saska<br />

Shivakumar Sastry<br />

R.M. Satava<br />

M. Sathia Narayanan<br />

Aykut Cihan Satici<br />

Massimo Satler<br />

Daisuke Sato<br />

Noritaka Sato<br />

Junaed Sattar<br />

Joe Saunders<br />

Guillaume Saupin<br />

Eric Sauser<br />

Jean-Philippe Saut<br />

Jesus Savage<br />

Ketan Savla<br />

Glauco Garcia Scandaroli<br />

Davide Scaramuzza<br />

Umberto Scarfogliero<br />

Paul Scerri<br />

Stefan Schaal<br />

Thomas Schauß<br />

Stefano Scheggi<br />

Michael Scheint<br />

Sebastian Scherer<br />

Paul Schermerhorn<br />

Alexis Scheuer<br />

Giuseppina Schiavone<br />

Andre Schiele<br />

Andreas Schierl<br />

Julian Schill<br />

Lars Schillingmann<br />

Joseph Schimmels<br />

Alexander Schlaefer<br />

Christian Schlegel<br />

C. Schmidt-Wetekam<br />

Alexander Schmitz<br />

Axel Schneider<br />

Sven Schneider<br />

Christian Schnier<br />

Jonathan Scholz<br />

Debra Schreckenghost<br />

Hans Peter Schrocker<br />

Derik Schroeter<br />

Björn Schuller<br />

Niclas Schult<br />

Ulrik Pagh Schultz<br />

Dirk Schulz<br />

Hannes Schulz<br />

Mac Schwager<br />

Maxim Schwartz<br />

Bernd Schäfer<br />

Enea Scioni<br />

Stephen Se<br />

Luís Seabra Lopes<br />

Jose Maria Sebastian<br />

Cristian Secchi<br />

Vlad Seghete<br />

Stephan Sehestedt<br />

Masahiro Sekimoto<br />

Claudio Semini<br />

Kei Senda<br />

Taku Senoo<br />

Jonathon Sensinger<br />

Luis Sentis<br />

TaeWon Seo<br />

Young-Woo Seo<br />

Joao Sequeira<br />

Fabrizio Sergi<br />

Andrea Serrani<br />

Fabien Servant<br />

Travis Service<br />

Andre Seyfarth<br />

Antonio Sgorbissa<br />

Robert Shade<br />

Azad Shademan<br />

Danelle Shah<br />

Shridhar Shah<br />

Hossein Shahbazi<br />

Azamat Shakhimardanov<br />

Elie Shammas<br />

Zeyong Shan<br />

Weiwei Shang<br />

Xiaowei Shao<br />

Zhufeng Shao<br />

Amir Shapiro<br />

Inna Sharf<br />

Rajnikant Sharma<br />

Dvijesh Shastri<br />

Cheng Yap Shee<br />

Raymond Ka-Man Sheh<br />

Amarda Shehu<br />

Dylan Shell<br />

Jinglin Shen<br />

Shuhan Shen<br />

Wei-Min Shen<br />

Xiangrong Shen<br />

Yanjun Shen<br />

Yantao Shen<br />

Apoorva Shende<br />

Weihua Sheng<br />

Jinbo Shi<br />

Qing Shi<br />

Mizuho Shibata<br />

Takanori Shibata<br />

Tomohiro Shibata<br />

Zvi Shiller<br />

David Hyunchul Shim<br />

Masahiro Shimizu<br />

Masayuki Shimizu<br />

Toshimi Shimizu<br />

Shingo Shimoda<br />

Makoto Shimojo<br />

Masamichi Shimosaka<br />

Dongjun Shin<br />

YongDeuk Shin<br />

Hiroyuki Shinoda<br />

Patrick Yuri Shinzato<br />

Masahiro Shiomi<br />

Shun Shiramatsu<br />

Bijan Shirinzadeh<br />

Babak Shirmohammadi<br />

Ming-Chiuan Shiu<br />

Alexander Shkolnik<br />

Florian Shkurti<br />

Steven Shladover<br />

Shraga Shoval<br />

Gabe Sibley<br />

Bruno Siciliano<br />

Candace Sidner<br />

Daniel Sidobre<br />

Roland Siegwart<br />

Basilio Sierra<br />

Olivier Sigaud<br />

Ricardo Silva Souza<br />

Geraldo Silveira<br />

David Silver<br />

Carlos Silvestre<br />

Rob Sim<br />

Nabil Simaan<br />

Olivier Simonin<br />

Richard Simpson<br />

Jivko Sinapov<br />

Amarjeet Singh<br />

Sanjiv Singh<br />

Surya Singh<br />

Edoardo Sinibaldi<br />

Ryan W. Sinnet<br />

Hebertt Sira-Ramírez<br />

Daniel Sirkett<br />

Shahin Sirouspour<br />

Emrah Akin Sisbot<br />

Metin Sitti<br />

Dimitrios Skarlatos<br />

Henrik Skibbe<br />

Danijel Skocaj<br />

Alexander Skoglund<br />

Goran Skorja<br />

William Smart<br />

Claes Christian Smith<br />

James Smith<br />

Joshua R. Smith<br />

Ryan Smith<br />

Stephen L. Smith<br />

Ruben Smits<br />

Jamie Snape<br />

Paolo Soda<br />

Joan Solà<br />

Massimiliano Solazzi<br />

Jorge Solis<br />

Hyoung Il Son<br />

Dan Song<br />

Dezhen Song<br />

Guangming Song<br />

Kai-Tai Song<br />

Qi Song<br />

Xuan Song<br />

Zhangjun Song<br />

Zhibin Song<br />

Olof Sornmo<br />

Domenico G. Sorrenti<br />

Edoardo Sotgiu<br />

Francois Soumis<br />

Cristóvão Sousa<br />

Giacomo Spampinato<br />

Diana F. Spears<br />

Stefanie Speidel<br />

Matthew Spenko<br />

Fabien Spindler<br />

Luciano Spinello<br />

John Spletzer<br />

Mark Spong<br />

Alexander Sproewitz<br />

Christoph Sprunk<br />

Mohan Sridharan<br />

Siddhartha Srinivasa<br />

Manoj Srinivasan<br />

Cyrill Stachniss<br />

Maciej Stachura<br />

Sergiu-Dan Stan<br />

Bogdan Stanciulescu<br />

Milos Stankovic<br />

Aaron Staranowicz<br />

Olivier Stasse<br />

Christoph Staub<br />

Bastian Steder<br />

Mark Steedman<br />

Paolo Stegagno<br />

Jochen J. Steil<br />

Gerald Steinbauer<br />

Aaron Steinfeld<br />

Erik Steltz<br />

Annett Stelzer<br />

Björn Stenger<br />

Benjamin Stephens<br />

Yiannis Stergiopoulos<br />

Bruno Steux<br />

Nicholas Stiffler<br />

Christoph Stiller<br />

Daniel Stilwell<br />

Serge Stinckwich<br />

Timothy Stirling<br />

Leo Stocco<br />

Dan Stoianovici<br />

Rustam Stolkin<br />

Andreas Stolt<br />

Kasper Stoy<br />

Danail Stoyanov<br />

Todor Stoyanov<br />

Alexander Stoytchev<br />

Stefano Stramigioli<br />

Morten Strandberg<br />

Hauke Strasdat<br />

Olivier Strauss<br />

Johannes H. Strom<br />

Freek Stulp<br />

Jürgen Sturm<br />

Peter Sturm<br />

Nathan Sturtevant<br />

Jörg Stückler<br />

Michael Styer<br />

Hao Su<br />

Francisco Suarez<br />

Raul Suarez<br />

Halit Bener Suay<br />

Ram Subramanian<br />

Ioan Alexandru Sucan<br />

Jozef Suchý<br />

Attawith Sudsang<br />

Hiromichi Suetani<br />

Yusuke Sugahara<br />

Taisuke Sugaiwa<br />

Thomas Sugar<br />

Tomomichi Sugihara<br />

Norikazu Sugimoto<br />

Yasuhiro Sugimoto<br />

Komei Sugiura<br />

Yuta Sugiura<br />

Masashi Sugiyama<br />

Osamu Sugiyama<br />

P.B. Sujit<br />

Gaurav Sukhatme<br />

Wael Suleiman<br />

Khan Suleman<br />

Cornel Sultan<br />

Yasushi Sumi<br />

Dong Sun<br />

Hanxu Sun<br />

Kuan-Chun Sun<br />

Kui Sun<br />

Qiao Sun<br />

Xiaoxun Sun<br />

Zhenglong Sun<br />

Joseph Sun de la Cruz<br />

YoonChang Sung<br />

Michael Suppa<br />

Kenji Suzuki<br />

Takashi Suzuki<br />

Tatsuya Suzuki<br />

Tsuyoshi Suzuki<br />

Mikhail Svinin<br />

Lee Sword<br />

Filip Szufnarowski<br />

Klementyna Szwaykowska<br />

Duc Anh Ta<br />

Seyed Nasr Tabatabaei<br />

Armando Tacchella<br />

Kenjiro Tadakuma<br />

Riichiro Tadakuma<br />

Satoshi Tadokoro<br />

Fabrizio Taffoni<br />

Hamid Taghirad<br />

Kenji Tahara<br />

Adnan Tahirovic<br />

Dave Tahmoush<br />

Omar Tahri<br />

Michel Taïx<br />

Kouichi Taji<br />

Keiki Takadama<br />

Motoki Takagi<br />

Junji Takahashi<br />

Masaki Takahashi<br />

Toru Takahashi<br />

Takeshi Takaki<br />

Atsuo Takanishi<br />

Wataru Takano<br />

Hidenori Takauji<br />

Leila Takayama<br />

Toshinobu Takei<br />

Hiroshi Takemura<br />

Yasunori Takemura<br />

Hiroki Takeuchi<br />

Hironori Takimoto<br />

Tomohito Takubo<br />

Takashi Takuma<br />

Mehdi Tale Masouleh<br />

Ali Talebi<br />

Levente Tamas<br />

Minija Tamosiunaite<br />

U-Xuan Tan<br />

Xiaobo Tan<br />

Youhua Tan<br />

Kanji Tanaka<br />

Masayuki Tanaka<br />

Yoshihiro Tanaka<br />

Yoshiyuki Tanaka<br />

Chaoquan Tang<br />

Chinpei Tang<br />

Hsiao-Wei Tang<br />

Xinyu Tang<br />

James Tangorra<br />

Kazuhiro Taniguchi<br />

Tadahiro Taniguchi<br />

Yong Tao<br />

Lydia Tapia<br />

Adriana Tapus<br />

Jean-Philippe Tarel<br />

Mihai Olimpiu Tatar<br />

Fujiura Tateshi<br />

Mahmoud Tavakoli<br />

Abdelhamid Tayebi<br />

Camillo Jose Taylor<br />

Matthew Taylor<br />

Yuichi Tazaki<br />

Krzysztof Tchon<br />

Russ Tedrake<br />

Bruno O.S. Teixeira<br />

Onur Tekdas<br />

Seth Teller<br />

Thomas Temple<br />

Ernesto H. Teniente Avilés<br />

Moritz Tenorth<br />

Kenji Terabayashi<br />

Carlos R. Tercero Villagran<br />

Alexander V. Terekhov<br />

Marco Henrique Terra<br />

Zdravko Terze<br />

Matthew Tesch<br />

Luka Teslic<br />

Celine Teuliere<br />

Evangelos Theodorou<br />

Dirk Thomas<br />

Federico Thomas<br />

Stephen James Thomas<br />

Andrea Lockerd Thomaz<br />

David Thompson<br />

Paul Thompson<br />

Blair Thornton<br />

Ivar Thorson<br />

Benoit Thuilot<br />

Jiang Tian<br />

David Tick<br />

Szu-Chi Tien<br />

Yung Ting<br />

Gian Diego Tipaldi<br />

Ethan Tira-Thompson<br />

Rashi Tiwari<br />

David Tlalolini<br />

Andreas Tobergte<br />

Selene Tognarelli<br />

Juan Marcos Toibero<br />

Pratap Tokekar<br />

Michael Thomas Tolley<br />

Federico Tombari<br />

Masahiro Tomono<br />

Lachlan Toohey<br />

Elin Anna Topp<br />

Carme Torras<br />

Abril Torres<br />

Fernando Torres<br />

Eduardo Torres-Jara<br />

Alexander Toshev<br />

Iwaki Toshima<br />

David S. Touretzky<br />

Marc Toussaint<br />

Benjamin Tovar<br />

Panos Trahanias<br />

Duc Trong Tran<br />

Aksel Andreas Transeth<br />

Alberto Traslosheros<br />

Alberto Traslosheros-M<br />

Peter Trautman<br />

V. Javier Traver<br />

Ana Luisa Trejos<br />

Philip Tresadern<br />

Alberto Trevisani<br />

Vito Trianni<br />

George Triantafyllou<br />

Jean Triboulet<br />

Rudolph Triebel<br />

Marco Trincavelli<br />

Chiara Troiani<br />

Nikolaos Tsagarakis<br />

Chia-Hung Tsai<br />

Dimitris Tsakiris<br />

Nikolaos Tsekos<br />

Panagiotis Tsiotras<br />

Manolis Tsogas<br />

John Tsotsos<br />

Antonios Tsourdos<br />

Sadayuki Tsugawa<br />

Toshiaki Tsuji<br />

Hideyuki Tsukagoshi<br />

Yuichi Tsumaki<br />

Mihran Tuceryan<br />

Stephen Tully<br />

Win Tun Latt<br />

Alessio Turetta<br />

Masaru Uchiyama<br />

Ryuichi Ueda<br />

Mitsunori Uemura<br />

Hiroshi Ueno<br />

Toshio Ueshiba<br />

Tsuyoshi Ueyama<br />

Emre Ugur<br />

Barkan Ugurlu<br />

Klaus Uhl<br />

Heinz Ulbrich<br />

Giovanni Ulivi<br />

Paul Umbanhowar<br />

James Patrick Underwood<br />

Alois Unterholzner<br />

Ben Upcroft<br />

Takateru Urakubo<br />

Alfonso Urquia<br />

Nobuhiro Ushimi<br />

Yuzuko Utsumi<br />

Nikolaus Vahrenkamp<br />

Ravi Vaidyanathan<br />

Pablo Valdivia y Alvarado<br />

Philip Valencia<br />

Luis M. Valentin-Coronado<br />

Alberto Valero-Gomez<br />

Heike Vallery<br />

Jaime Valls Miro<br />

Marco Valtorta<br />

Michaël Van Damme<br />

M.J.G. van de Molengraft<br />

Michiel van de Panne<br />

Joop van de Ven<br />

Jur van den Berg<br />

Ferdi van der Heijden<br />

H.F. Machiel Van der Loos<br />

Frank van der Stappen<br />

Ronald Van Ham<br />

Joshua Vander Hook<br />

Bram Vanderborght<br />

Paul Varnell<br />

Pablo Varona<br />

Panagiotis Vartholomeos<br />

Francisco Vasconcelos<br />

Eric Vasselin<br />

Pascal Vasseur<br />

Gabriele Vassura<br />

Shrihari Vasudevan<br />

Monica Vatteroni<br />

Richard Vaughan<br />

Marynel Vazquez<br />

Harini Veeraraghavan<br />

Prasanna Velagapudi<br />

Matteo Venanzi<br />

Marilena Vendittelli<br />

Jan Veneman<br />

Subramaniam Venkatraman<br />

Rodrigo Ventura<br />

Gentiane Venture<br />

Kevin Veon<br />

Alexander Verl<br />

Paul Vernaza<br />

John Vial<br />

Janis Viba<br />

Marco Vicentini<br />

Alessandro Correa Victorino<br />

Rene Vidal<br />

Fernando Vidal-Verdú<br />

José Vieira<br />

Diego Viejo<br />

Pierre Vieyres<br />

Bjørnar Vik<br />

Vishesh Vikas<br />

Carlos Villalpando<br />

Michael Villamizar Vergel<br />

Luigi Villani<br />

Markus Vincze<br />

Stephane Viollet<br />

Francesco Visentin<br />

Arnoud Visser<br />

Michael Vistein<br />

Nicola Vitiello<br />

Valentina Vitiello<br />

Marie-Aude Vitrani<br />

Iason Vittorias<br />

Michael Vitus<br />

Kostas Vlachos<br />

Christopher Vo<br />

Luca Vollero<br />

Marsette Vona<br />

Randolph Voorhies<br />

Philipp Vorst<br />

Stavros Vougioukas<br />

Trung-Dung Vu<br />

Sven Wachsmuth<br />

Masayoshi Wada<br />

Neeti Wagle<br />

Glenn Wagner<br />

Mattias Wahde<br />

Friedrich M. Wahl<br />

Keith Wait<br />

Shuichi Wakimoto<br />

Ian Walker<br />

Andrew M. Wallace<br />

Jennifer Walter<br />

Matthew Walter<br />

Michael Leonard Walters<br />

Chaoli Wang<br />

Chieh-Chih Wang<br />

Danwei Wang<br />

David Wang<br />

Fei-Yue Wang<br />

Han Wang<br />

Hao Wang<br />

Hesheng Wang<br />

Hongbo Wang<br />

Huadong Wang<br />

Hui Wang<br />

Huifang Wang<br />

Jianxun Wang<br />

Jing Wang<br />

Jingguo Wang<br />

Jiuguang Wang<br />

Junping Wang<br />

Kai Wang<br />

Kundong Wang<br />

Long Wang<br />

Lu Wang<br />

Meimei Wang<br />

Nianfeng Wang<br />

Qining Wang<br />

Shuguo Wang<br />

Shuo Wang<br />

Wei-Wen Wang<br />

Weifu Wang<br />

Wenhui Wang<br />

X. Rosalind Wang<br />

Xinqing Wang<br />

Yao-Dong Wang<br />

Yue Wang<br />

Zheng Wang<br />

Zhidong Wang<br />

Zhifeng Wang<br />

Zhijie Wang<br />

Zhongkui Wang<br />

Tim Wark<br />

Steven Lake Waslander<br />

John Wason<br />

Mutsumi Watanabe<br />

Tetsuyou Watanabe<br />

Jens Wawerla<br />

Stephen Leslie Scott Webb<br />

Sarah E. Webster<br />

Robert James Webster III<br />

Hongxing Wei<br />

Andrew P.H. Weightman<br />

Felipe A. Weilemann Belo<br />

Stephan Weiss<br />

Alfredo Weitzenfeld<br />

Uchechukwu C. Wejinya<br />

Kai Welke<br />

Briana Wellman<br />

Johannes Wendeberg<br />

Philippe Wenger<br />

Patrick Wensing<br />

Justin Werfel<br />

Moritz Werling<br />

E.P. Westebring – v.d. Putten<br />

Nicholas Wettels<br />

David Wettergreen<br />

Paul White<br />

Jacob Whitehill<br />

Eric Whitman<br />

Mark Albert Whitty<br />

Pierre-Brice Wieber<br />

Nicholas Wilkinson<br />

Volker Willert<br />

Richard Willgoss<br />

Aaron L. Williams<br />

Brian Patrick Williams<br />

Ian Williams<br />

Kjerstin Williams<br />

Mary-Anne Williams<br />

Stefan Bernard Williams<br />

Stephen Williams<br />

Reg Willson<br />

Thomas Wimboeck<br />

Alan Winfield<br />

Alexander Winkler<br />

Amos Greene Winter<br />

Raul Wirz Gonzalez<br />

Heinz Woern<br />

Walter Wohlkinger<br />

Hans Woithe<br />

Michael Wolf<br />

Sebastian Wolf<br />

Bryn Wolfe<br />

Carsten Wolff<br />

Dirk Wollherr<br />

Lok Sang Lawson Wong<br />

Tichakorn Wongpiromsarn<br />

John Wood<br />

Nathan Wood<br />

Robert Wood<br />

Oliver Woodford<br />

Oliver Woodman<br />

Matthew Woodward<br />

Craig Woolsey<br />

James Worcester<br />

Franz Wotawa<br />

Britta Wrede<br />

Sebastian Wrede<br />

Ban Wu<br />

Bing-Fei Wu<br />

Chao Wu<br />

Guanglei Wu<br />

Guosheng Wu<br />

Haiyuan Wu<br />

Huapeng Wu<br />

Jianxin Wu<br />

Po Wu<br />

Shandong Wu<br />

Wencen Wu<br />

Burkhard Wuensche<br />

Kai M. Wurm<br />

Gordon Wyeth<br />

Christian Wögerer<br />

FengFeng Xi<br />

Tian Xia<br />

Zhiyu Xiang<br />

Jizhong Xiao<br />

Jun Xiao<br />

Shunli Xiao<br />

Ming Xie<br />

Shaorong Xie<br />

Anqi Xu<br />

Bin Xu<br />

Changhai Xu<br />

De Xu<br />

Jijie Xu<br />

Kai Xu<br />

Qingsong Xu<br />

Tao Xu<br />

Yang Xu<br />

Yangsheng Xu<br />

Yiliang Xu<br />

Yunfei Xu<br />

Zhe Xu<br />

Zhixing Xue<br />

Takehisa Yairi<br />

Hiroya Yamada<br />

Katsuhiko Yamada<br />

Takayoshi Yamada<br />

Hiroaki Yamaguchi<br />

Masaki Yamakita<br />

Ikuo Yamamoto<br />

Motoji Yamamoto<br />

Yoko Yamanishi<br />

Mitsuhiro Yamano<br />

Natsuki Yamanobe<br />

Atsushi Yamashita<br />

Hiromasa Yamashita<br />

Junji Yamato<br />

Brian Yamauchi<br />

Tasuku Yamawaki<br />

Kimitoshi Yamazaki<br />

Gangfeng Yan<br />

Rujiao Yan<br />

Holly Yanco<br />

Chenguang Yang<br />

Guilin Yang<br />

Huizhen Yang<br />

Junho Yang<br />

Ming Yang<br />

Ruiguo Yang<br />

Sungwook Yang<br />

Tao Yang<br />

Woosung Yang<br />

Xiaoli Yang<br />

Yousheng Yang<br />

Zhenwang Yao<br />

Masahito Yashima<br />

Sergey Yatsun<br />

Jano Yazbeck<br />

M. J. Yazdanpanah<br />

Changlong Ye<br />

Hsin-Yi (Cindy) Yeh<br />

Je Sung Yeon<br />

Dmitry Yershov<br />

Anna Yershova<br />

Byung-Ju Yi<br />

Jianqiang Yi<br />

Jingang Yi<br />

Chih-Chen Yih<br />

Mark Yim<br />

Sehyuk Yim<br />

KangKang Yin<br />

John David Yoder<br />

Kazuhito Yokoi<br />

Yasuyoshi Yokokohji<br />

Naokazu Yokoya<br />

Kan Yoneda<br />

Hyun-Soo Yoon<br />

Jungwon Yoon<br />

Yong-San Yoon<br />

Eiichi Yoshida<br />

Morio Yoshida<br />

Takami Yoshida<br />

Kenji Yoshigoe<br />

Kazuyoshi Yoshii<br />

Tomoaki Yoshikai<br />

Taizo Yoshikawa<br />

Kitaro Yoshimitsu<br />

Jason Yosinski<br />

Kuu-young Young<br />

Shelley S.C. Young<br />

Ryuh Youngsun<br />

Karim Youssef<br />

Hongnian Yu<br />

Huili Yu<br />

Hung-Hsiu Yu<br />


Jiancheng Yu<br />

Jingjin Yu<br />

Jingjun Yu<br />

Ningbo Yu<br />

Qian Yu<br />

Shengwei Yu<br />

Son Cheol Yu<br />

Yong Yu<br />

Yue-Qing Yu<br />

Chunrong Yuan<br />

Jianjun Yuan<br />

Peijiang Yuan<br />

Dongwon Yun<br />

Seung-kook Yun<br />

Kris Zacny<br />

Spyros Zafeiropoulos<br />

Michael Friedrich Zäh<br />

Nazanin Zaker<br />

Alexey Zakharov<br />

Avideh Zakhor<br />

Eduardo Zalama<br />

Andrea Maria Zanchettin<br />

Damiano Zanotto<br />

René Zapata<br />

Michael M. Zavlanos<br />

Massimiliano Zecca<br />

Milos Zefran<br />

Said Zeghloul<br />

Garth J Zeglin<br />

John S. Zelek<br />

Andreas Zell<br />

Hendrik Zender<br />

Shuqing Zeng<br />

Wenwu Zeng<br />

Davide Zerbato<br />

Chengkun Zhang<br />

Dongjun Zhang<br />

Guoxuan Zhang<br />

Haihong Zhang<br />

Hong Zhang<br />

Hong Zhang<br />

Houxiang Zhang<br />

Jian Zhang<br />

Jiangbo Zhang<br />

Jianwei Zhang<br />

Junjie Zhang<br />

Li Zhang<br />

Li Zhang<br />

Liang Zhang<br />

Liangjun Zhang<br />

Mingjun Zhang<br />

Shiqi Zhang<br />

Tao Zhang<br />

Weizhong Zhang<br />

Wenzeng Zhang<br />

Xianmin Zhang<br />

Xuping Zhang<br />

Yan Zhang<br />

Yan Liang Zhang<br />

Yanwu Zhang<br />

Ying Zhang<br />

Yizhai Zhang<br />

Yong Zhang<br />

Yongsheng Zhang<br />

Yu (Tony) Zhang<br />

Yunong Zhang<br />

Zhao Zhang<br />

Dongbin Zhao<br />

Huijing Zhao<br />

Jianguo Zhao<br />

Jing Zhao<br />

Mingguo Zhao<br />

Tao Zhao<br />

Nanning Zheng<br />

Xiaoming Zheng<br />

Yu Zheng<br />

Debao Zhou<br />

Kai Zhou<br />

Lelai Zhou<br />

Quan Zhou<br />

Xiaobo Zhou<br />

Xun Zhou<br />

Yan Zhou<br />

Yu Zhou<br />

Zhi Zhou<br />

Wen-Hong Zhu<br />

Zhiwei Zhu<br />

Stefan Zickler<br />

Julius Ziegler<br />

Marc Ziegler<br />

Michael Zillich<br />

Primo Zingaretti<br />

Michael Zinn<br />

Zoran Zivkovic<br />

Alessandro Antonio Zizzari<br />

Robert Zlot<br />

M.H. Zokaei Ashtiani<br />

Johann Marius Zöllner<br />

Raoul Zöllner<br />

Loredana Zollo<br />

Qingze Zou<br />

Wei Zou<br />

Matthew Zucker<br />

Zhiyuan Zuo<br />

Matthijs Jan Zwinderman<br />


Plenary Sessions<br />

Plenary I: Design<br />

Tuesday, September 27, <strong>2011</strong>, 12:30-13:45, Continental Ballroom<br />

Moderator: Professor Bernard Roth, Stanford University<br />

The Design Plenary features three design pioneers (Hirose, Hirzinger, and Raibert)<br />

discussing their perspectives on various aspects of mechanism design. After initial<br />

presentations by the panelists, the chair (Roth) will moderate a panel discussion with the<br />

experts and the audience.<br />

Professor Shigeo Hirose was born in Tokyo in 1947. He received his B.Eng.<br />

degree with First Class Honors in Mechanical Engineering from Yokohama<br />

National University in 1971, and his M.Eng. and Dr.Eng. degrees in<br />

Control Engineering from the Tokyo Institute of Technology in 1973 and 1976,<br />

respectively. From 1976 to 1979 he was a Research Associate, and from<br />

1979 to 1992 an Associate Professor. Since 1992 he has been a Professor<br />

in the Department of Mechanical and Aerospace Engineering at the Tokyo<br />

Institute of Technology. He is a fellow of JSME and IEEE. He is engaged in creative design<br />

of robotic systems. Prof. Hirose has been awarded more than twenty prizes.<br />

Professor Gerd Hirzinger is director of DLR's Institute of Robotics and<br />

Mechatronics, one of the largest and most highly regarded<br />

institutes in the field worldwide. He was principal investigator of the space robot<br />

technology experiment ROTEX, the first remotely controlled robot in space,<br />

which flew onboard the shuttle Columbia in April 1993. He has published<br />

more than 600 papers in robotics. He has received numerous national and<br />

international awards, e.g., the Joseph Engelberger Award for<br />

achievements in robotic science in 1994, the Leibniz Award (the highest scientific<br />

award in Germany) in 1995, and the JARA (Japan Robot Association) Award. In 2005 he received<br />

the IEEE Robotics and Automation Society Pioneer Award, and in 2007 the IEEE<br />

Field Award in Robotics and Automation.<br />

Dr. Marc Raibert was Professor of Electrical Engineering and Computer<br />

Science at MIT and a member of the Artificial Intelligence Laboratory from<br />

1986 through 1995. He is co-founder and President of Boston Dynamics, Inc.<br />

(BDI), which is located near MIT in Cambridge. Raibert's research is devoted<br />

to the study of systems that move dynamically, including physical robots and<br />

animated creatures. Raibert received a B.S. degree in Electrical Engineering<br />

from Northeastern University in 1973, and a Ph.D. from the Massachusetts<br />

Institute of Technology in 1977. He is the author of Legged Robots That Balance, published by<br />

MIT Press, and is on the Editorial Board of the International Journal of Robotics Research.<br />

He is a fellow of the AAAI.<br />



Plenary II: BioRobotics<br />

Wednesday, September 28, <strong>2011</strong>, 12:30-13:45, Continental Ballroom<br />

Moderator: Professor Ruzena Bajcsy, University of California, Berkeley<br />

The BioRobotics Plenary features three biorobotics pioneers (Berthoz, Buelthoff, and<br />

Srinivasan) discussing their perspectives on various aspects of biorobotics and biomimetic<br />

robotics. After initial presentations by the panelists, the chair (Bajcsy) will moderate a panel<br />

discussion with the experts and the audience.<br />

Professor Alain Berthoz was born in 1939. He graduated as a civil engineer<br />

from the École des Mines in 1963 and received his Ph.D. in 1973. As a researcher at the CNRS (1966-<br />

1981), he established and coordinated the Neurosensory Physiology<br />

Laboratory (1981-1993). Since 1993, he has been a professor at the Collège de France<br />

and director of the UMR CNRS/Collège de France "Physiology of perception<br />

and action". He has over 200 scientific publications in international journals<br />

on the physiology of sensorimotor functions, and more specifically on the<br />

oculomotor system, the vestibular system, balance control, and movement perception. He<br />

has given about 90 invited talks around the world.<br />

Professor Heinrich Bülthoff is a scientific member of the Max Planck<br />

Society and director at the Max Planck Institute for Biological Cybernetics<br />

in Tübingen. He is head of the Department of Human Perception, Cognition<br />

and Action, in which a group of about 70 researchers investigates<br />

psychophysical and computational aspects of higher-level visual processes<br />

in object and face recognition, sensory-motor integration, spatial cognition,<br />

and perception and action in virtual environments. He holds a Ph.D. degree<br />

in the natural sciences from the Eberhard-Karls-Universität in Tübingen. He was Assistant,<br />

Associate, and Full Professor of Cognitive Science at Brown University in Providence from<br />

1988 to 1993 before becoming director at the Max Planck Institute for Biological Cybernetics.<br />

Professor Mandyam Srinivasan is at the Queensland Brain Institute and<br />

the School of Information Technology and Electrical Engineering of the<br />

University of Queensland. He holds an undergraduate degree in Electrical<br />

Engineering from Bangalore University, a Master's degree in Electronics<br />

from the Indian Institute of Science, a Ph.D. in Engineering and Applied<br />

Science from Yale University, a D.Sc. in Neuroethology from the Australian<br />

National University, and an Honorary Doctorate from the University of<br />

Zurich. Among his awards and honors are Fellowships of the Australian Academy of<br />

Science, of the Royal Society of London, and of the Academy of Sciences for the<br />

Developing World, an Inaugural Federation Fellowship, the 2006 Australia Prime Minister’s<br />

Science Prize, and the 2008 U.K. Rank Prize for Optoelectronics.<br />



Plenary III: Self-Driving Cars<br />

Thursday, September 29, <strong>2011</strong>, 13:30-14:30, Continental Ballroom<br />

Professor Sebastian Thrun, Stanford University/Google<br />

Abstract: Most of us use a car every day. But unlike airplanes, which have been flying on<br />

autopilots for decades, cars are still driven manually - just the way they were driven 100<br />

years ago. This talk will introduce the transformative concept of a self-driving car. Following<br />

early research in the 1990s in Germany and the US, and more recently the DARPA<br />

Challenges, this technology has now been advanced to a point where it is within reach of<br />

commercial realizations that may provide benefits to pretty much anyone who drives a car.<br />

At the core of this progress is a new generation of cutting-edge artificial intelligence, which<br />

enables a self-driving car to understand its environment and to interact with other traffic.<br />

The speaker will discuss the Google Self-Driving Car project, in which a fleet of self-driving<br />

cars navigated more than 160,000 miles on public roads in California and Nevada,<br />

including the downtowns of San Francisco and Los Angeles. The speaker will also discuss<br />

some of the societal implications of this new technology.<br />

Sebastian Thrun is a Professor of Computer Science at Stanford<br />

University and director of the Stanford Artificial Intelligence Laboratory<br />

(SAIL). He led the development of the robotic vehicle Stanley that won<br />

the 2005 DARPA Grand Challenge. His team also developed Junior,<br />

which placed second at the DARPA Urban Challenge in 2007. Thrun led<br />

the development of the Google self-driving car and is well known for his<br />

work on probabilistic programming techniques in robotics, with<br />

applications including robotic mapping. He was elected into the National<br />

Academy of Engineering and also into the German Academy of Sciences<br />

Leopoldina in 2007. In <strong>2011</strong>, he received the Max Planck Research Award and the<br />

inaugural AAAI Ed Feigenbaum Prize. Thrun received his Diplom (master's degree) in<br />

1993 and a PhD (summa cum laude) in 1995 in computer science and statistics from the<br />

University of Bonn. He was on the CMU faculty from 1995 to 2003. Since 2003 he has<br />

been on the faculty of the Stanford Computer Science department. He is also a Google<br />

Fellow.<br />



Workshops and Tutorials<br />

A total of 27 workshops and tutorials are scheduled on Sunday, Monday, and Friday of the<br />

conference week. They are listed here for your consideration. More detailed information<br />

and all workshop and tutorial proceedings can be found on the conference website<br />

www.iros<strong>2011</strong>.org/workshops-and-tutorials. All events take place in the Continental Ballroom<br />

and Parlor Rooms 1-9, and in the Golden Gate Rooms 6 and 7.<br />

Sunday, September 25, <strong>2011</strong>: Full-day workshops and tutorials<br />

ST1 Tutorial (Golden Gate Rooms 6-7): Motion Planning for Real Robots. Organizers: Sachin Chitta, Edward Gil Jones, Ioan Alexandru Sucan, Mark Moll*, Lydia Kavraki<br />

ST2 Tutorial (Continental Parlor 1): 3D Point Cloud Processing: PCL (Point Cloud Library). Organizers: Radu Bogdan Rusu*, Michael Dixon, Aitor Aldoma, Suat Gedikli<br />

ST3 Tutorial (Continental Parlor 2): DARwIn-OP: An Open Platform, Miniature Humanoid Robot Platform for Research, Education and Outreach. Organizers: Dennis Hong*, Daniel D. Lee<br />

SW4 Workshop (Continental Ballroom 4): 20 Years of Microrobotics: Progress, Challenges, and Future Directions. Organizers: Dan Popa*, Fumihito Arai<br />

SW5 Workshop (Continental Ballroom 6): Cognitive Neuroscience Robotics. Organizers: Kenichi Narioka*, Yukie Nagai, Minoru Asada, Hiroshi Ishiguro<br />

SW6 Workshop (Continental Parlor 3): Image-Guided Medical Robotic Interventions. Organizers: Sarthak Misra*, Jaydev P. Desai<br />

SW7 Workshop (Continental Parlor 7): Knowledge Representation for Autonomous Robots. Organizers: Michael Beetz, Rachid Alami, Joachim Hertzberg, Alessandro Saffiotti, Moritz Tenorth*<br />

SW8 Workshop (Continental Parlor 8): Autonomous Underwater Robotics for Intervention. Organizers: Junku Yuh, Giuseppe Casalino, Alessio Turetta*<br />

SW9 Workshop (Continental Parlor 9): Reconfigurable Modular Robotics: Challenges of Mechatronic and Bio-Chemo-Hybrid Systems. Organizers: Serge Kernbach*, Robert Charles Fitch<br />


Monday, September 26, <strong>2011</strong>: Half-day workshops and tutorials<br />

MT1 Tutorial (Continental Ballroom 5): Introduction to Rescue Robotics. Organizer: Robin Murphy*<br />

MW2 Workshop (Continental Parlor 2): Visual Control of Mobile Robots (ViCoMoR). Organizers: Youcef Mezouar, G. Lopez-Nicolas*<br />

MW3 Workshop (Continental Parlor 8): New and Emerging Technologies in Assistive Robotics. Organizers: Kazuyoshi Wada*, Machiel Van der Loos, Loredana Zollo<br />

MW4 Workshop (Continental Parlor 1): Visual Tracking and Omni-directional Vision. Organizers: El Mustapha Mouaddib*, Eric Marchand, João P. Barreto, Yasushi Yagi<br />

MW5 Workshop (Continental Parlor 3): Space Robotics Simulation. Organizers: Rainer Krenn*, Yves Gonthier<br />

MW6 Workshop (Continental Parlor 7): Current Directions in Marine Vehicle Autonomy Research. Organizers: Donald Eickstedt*, Mae Seto<br />

MW7 Workshop (Continental Ballroom 4): Active Semantic Perception and Object Search in the Real World. Organizers: Alper Aydemir, Andrzej Pronobis, Bhaskara Marthi, Patric Jensfelt*, Dirk Holz<br />

MW8 Workshop (Continental Ballroom 6): Sixth International Cognitive Vision Workshop (ICVW <strong>2011</strong>): Situated Vision vs. Internet Vision. Organizers: Barbara Caputo*, Fiora Pirri, Michael Zillich<br />

MW9 Workshop (Continental Parlor 9): Redundancy in Robot Manipulators and Multi-Robot Systems. Organizers: Dejan Milutinovic*, Jacob Rosen<br />


Friday, September 30, <strong>2011</strong>: Full-day workshops and tutorials<br />

FW1 Workshop (Continental Ballroom 4): European Efforts in Strengthening the Academia-Industry Collaboration. Organizers: Reinhard Lafrenz*, Alois Knoll, Bruno Siciliano, Rainer Bischoff, Anne Wendel<br />

FW2 Workshop (Continental Ballroom 6): Progress and Open Problems in Motion Planning. Organizers: Timothy Bretl*, Dan Halperin, Kostas E. Bekris<br />

FW3 Workshop (Continental Parlor 1): Current and Future Related Technologies for Robotic Automation in Micro/Nano Scale. Organizers: Masahiro Nakajima*, Toshio Fukuda<br />

FW4 Workshop (Continental Parlor 3): Metrics and Methodologies for Autonomous Robot Teams in Logistics (MMART-LOG). Organizers: Alexander Kleiner*, Rolf Lakaemper, Raj Madhavan<br />

FW5 Workshop (Continental Parlor 7): Methods for Safer Surgical Robotics Procedures. Organizers: Rainer Konietschke*, Stefan Joerg, Paolo Fiorini<br />

FW6 Workshop (Continental Parlor 9): Robotics for Neurology and Rehabilitation. Organizers: Gentiane Venture*, Philippe Fraisse, Mitsuhiro Hayashibe, Thierry Keller<br />

FW7 Workshop (Continental Parlor 2): Robotics for Environmental Monitoring. Organizers: Lino Marques*, Ryan Smith, Volkan Isler<br />

FW8 Workshop (Continental Parlor 8): Perception and Navigation for Autonomous Vehicles in Human Environment. Organizers: Philippe Martinet*, Christian Laugier, Urbano Nunes<br />

FW9 Workshop (Continental Ballroom 5): The PR2 Workshop: Results, Challenges and Lessons Learned in Advancing Robotics With a Common Platform. Organizers: William Smart*, Sachin Chitta, Caroline Pantofaru, Radu Bogdan Rusu<br />

* = Contact organizer<br />


Floorplan<br />

Demonstration Sessions<br />

Robotic demonstrations will be shown on the exhibition floor at the Hilton hotel (Golden Gate<br />

Room 8, Lobby level), next to the interactive sessions.<br />

Schedule<br />

Demonstrations run in parallel to the technical sessions from Tuesday, September 27 to<br />

Thursday morning, September 29, <strong>2011</strong>. Demonstrations marked with a star (*) have<br />

been selected for presentation in the special demonstration symposium (see next table).<br />



Demonstration spaces and sessions (Monday demonstration symposium, and morning/afternoon slots from Tuesday through Thursday):<br />

Space A: SP5 "Teleoperation" (Care-o-bot), SP3* "Service robot" (Care-o-bot), SP4* "Active perception" (Care-o-bot), SP6* "Mobile manipulation" (Care-o-bot), OR4* "Pixhawk MAV"<br />

Space B: SP7* "iTaSC framework" (PR2), SP9 "Manipulation tools" (PR2), SP10* "Haptic coupling" (PR2, youBot), SP8 "Do not touch" (PR2), OR9 "Ballbot Rezero", OR8 "The r-one"<br />

Space C: OR6* "Bio-inspired humanoid", OR7 "Needle steering", OR10 "Perception for manipulation", OR5 "Haptic rendering", SP12* "More cowbell" (Nao)<br />

Space D: OR3 "DARPA Arm", OR2* "DisplaySwarm", OR1* "Kilobot", SP13 "Visual servoing" (Nao), SP11* "Climbing stairs" (Nao)<br />

Space E (iCub space): SP1 "Cooperative game" (iCub), SP2* "Aquila" (iCub), and other iCub demos<br />

List of Demonstrations<br />

Standard Platform Demonstrations<br />

A number of standard platforms for mobility and manipulation in robotics research now exist,<br />

as both off-the-shelf commercial products and research platforms with wide distribution.<br />

Several manufacturers have made their platforms available for demonstration. There are two<br />

benefits to this arrangement: demonstrators do not have to transport hardware, and the<br />

demonstrations show the increasing use of standard hardware as a means of tackling<br />

advanced applications. The accepted demonstrations for this session are:<br />

SP1 – iCub Learning a Cooperative Game through Interaction with his Human Partner<br />

Stephane Lallee, Ugo Pattacini, Lorenzo Natale, Giorgio Metta, Peter Ford Dominey<br />

Stem Cell and Brain Research Institute, Bron, France, Italian Institute of Technology, Genoa, Italy<br />

Platform: iCub<br />

Abstract: A human can interact with the iCub by pointing and speaking in order to learn,<br />

name, and play with unknown objects on a table. The user can then ask for more complex actions and<br />

“program” small games by instructing the robot with a shared plan.<br />

SP2 – Aquila - The Cognitive Robotics Toolkit (*)<br />

Martin Peniak, Anthony F. Morse<br />

University of Plymouth, UK<br />

Platform: iCub<br />

Abstract: A live demonstration of Aquila, an open source cognitive robotics software toolkit for the<br />

iCub robot. This demonstration includes modeling sensorimotor learning in child development,<br />

action acquisition, and robot teleoperation.


SP3 – General Purpose Service Robot (*)<br />

Nico Hochgeschwender, Jan Paulus, Michael Reckhaus, Frederik Hegger, Christian A. Mueller,<br />

Sven Schneider, Paul G. Ploeger, Gerhard K. Kraetzschmar<br />

Bonn-Rhein-Sieg University of Applied Sciences, Bonn, Germany<br />

Platform: Care-O-Bot 3<br />

Abstract: In our demonstration we show a general purpose service robot which performs various<br />

tasks on demand.<br />

SP4 – Active Perception Planning on the Care-o-Bot platform (*)<br />

Robert Eidenberger, Michael Fiegert, Georg von Wichert, Gisbert Lawitzky<br />

Siemens, Munich, Germany<br />

Platform: Care-O-Bot 3<br />

Abstract: The Care-O-Bot platform localizes household objects in complex environments and<br />

grasps them. Environment perception and active perception planning are demonstrated and used for<br />

efficient scene modeling.<br />

SP5 – Semi-autonomous Tele-Operation Interface for Robotic Fetch and Carry Tasks<br />

Renxi Qiu 1, Alexandre Noyvirt 1, Nayden Chivarov 2, Rafael Lopez 3, Georg Arbeiter 4, Dayou Li 5<br />

1 Cardiff Univ., UK; 2 Bulgarian Academy of Sciences, Bulgaria; 3 Robotnik Automation, Spain;<br />

4 Fraunhofer IPA, Germany; 5 Univ. of Bedfordshire, UK<br />

Platform: Care-O-Bot 3<br />

Abstract: The demonstration focuses on a remote user interface, such as an Apple iPad, to tele-operate the<br />

robot semi-autonomously. It enables non-expert users to take charge of the robot around the home.<br />

SP6 – Using Mobile Manipulation to solve a Household Task (*)<br />

Florian Weisshardt, Alexander Bubeck, Jan Fischer, Ulrich Reiser<br />

Fraunhofer IPA, Stuttgart, Germany<br />

Platform: Care-O-Bot 3<br />

Abstract: The task of serving objects is highly scalable, ranging from tabletop<br />

manipulation of simple objects to retrieving objects that first require mobile manipulation of the<br />

environment.<br />

SP7 – Demonstration of iTaSC as a unified framework for task specification, control, and<br />

coordination for mobile manipulation (*)<br />

Dominick Vanthienen, Tinne De Laet, Markus Klotzbuecher, Ruben Smits, Wilm Decré, Koen Buys,<br />

Steven Bellens, Herman Bruyninckx, and Joris De Schutter<br />

Katholieke Universiteit Leuven, Belgium<br />

Platform: PR2<br />

Abstract: A mobile co-manipulation task between a PR2 and a human illustrates the potential of the iTaSC<br />

method and its Orocos implementation to specify sensor-based, multi-frame, partially specified<br />

robot tasks.<br />

SP8 – Please Do Not Touch the Robot<br />

Joseph M. Romano, Katherine J. Kuchenbecker<br />

University of Pennsylvania, Philadelphia, USA<br />

Platform: PR2<br />

Abstract: Natural human-robot interaction requires a tight coupling between sensing and control.<br />

Our PR2 demo uses high-bandwidth tactile signals to showcase dynamic interactive behaviors such<br />

as high-fives.<br />



SP9 – Interactive Manipulation Tools for the PR2 Robot<br />

Matei Ciocarlie, Kaijen Hsiao<br />

Willow Garage, Menlo Park, USA<br />

Platform: PR2<br />

Abstract: Interactive manipulation tools for the PR2: everything from low-level tools such as moving<br />

grippers around in Cartesian space, to high-level tools such as grasping segmented or recognized<br />

objects.<br />

SP10 – Haptic coupling with augmented feedback between the KUKA youBot and the PR2<br />

robot arms (*)<br />

Steven Bellens, Koen Buys, Nick Vanthienen, Tinne De Laet, Ruben Smits, Markus Klotzbuecher,<br />

Wilm Decré, Herman Bruyninckx, and Joris De Schutter<br />

Katholieke Universiteit Leuven, Belgium<br />

Platform: PR2 and youBot<br />

Abstract: We demonstrate haptic coupling between the KUKA youBot and the PR2 robot arms. A<br />

wearable display provides the operator with PR2 camera images, and the operator's head movements are<br />

coupled with the PR2's head.<br />

SP11 – Detecting and Climbing Stairs with a Laser-equipped Nao Humanoid (*)<br />

Stefan Osswald, Armin Hornung, Maren Bennewitz<br />

University of Freiburg, Germany<br />

Platform: Nao<br />

Abstract: We demonstrate autonomous stair climbing based on laser and vision data. After<br />

collecting a 3D laser scan, Nao detects steps of a staircase and climbs the staircase using visual<br />

information.<br />

SP12 – More cowbell! A musical ensemble with the NAO thereminist (*)<br />

Angelica Lim, Takeshi Mizumoto, Takuma Otsuka, Tatsuhiko Itohara, Kazuhiro Nakadai 1,2, Tetsuya<br />

Ogata, Hiroshi G. Okuno<br />

Kyoto University; 1 Honda Research Institute, Wako, Japan; 2 Tokyo Institute of Technology, Japan<br />

Platform: Nao<br />

Abstract: We present the first live performance of our interactive music robot: Nao plays the<br />

theremin and listens to humans to stay in sync. Demo attendees are invited to join by playing<br />

cowbell or maracas.<br />

SP13 – Model-based Visual Servoing Tasks on Nao Robot<br />

A. Abou Moughlbay 1, J. J. Sorribes 2, E. Cervera 2, P. Martinet 1<br />

1 LASMEA, Blaise Pascal University, France; 2 RobInLab, Jaume-I University, Spain<br />

Platform: Nao<br />

Abstract: Model-based visual servoing is implemented on the Nao robot. Based on geometric<br />

models, the robot performs localization, tracking and grasping of objects in a semi-structured<br />

environment.<br />

Open Research Demonstrations<br />

The goal of the open research demonstration session is to give participants an open forum<br />

to present their latest developments on robotic systems and software. The accepted<br />

demonstrations for this session are:<br />



OR1 – Kilobot: A Low Cost Scalable Robot System for Collective Behaviors (*)<br />

Michael Rubenstein, Radhika Nagpal<br />

Harvard University, Cambridge, USA<br />

Abstract: The Kilobot is a low-cost, easy-to-use robot designed for testing swarm<br />

algorithms on hundreds to thousands of robots. We will demonstrate robot operations and sample<br />

behaviors on 100 Kilobots.<br />

OR2 – DisplaySwarm: A robot swarm displaying images (*)<br />

Javier Alonso-Mora 1,2 , Andreas Breitenmoser 1 , Martin Rufli 1 , Stefan Haag, Gilles Caprari 3 , Roland<br />

Siegwart 1 , Paul Beardsley 2<br />

1 ETH Zurich; 2 Disney Research Zurich; 3 CGTronic Lugano, Switzerland<br />

Abstract: DisplaySwarm represents images in a novel way using small mobile robots with<br />

controllable colored illumination. The work addresses research questions in collision avoidance and<br />

pattern formation.<br />

OR3 – The DARPA ARM Robot: Available at Your Desk<br />

Gill Pratt 1 , Jim Pippine 2 , Andrew Mor 3 , Natalie Salaets 4<br />

1 DARPA DSO; 2 Golden Knight Technologies; 3 RE2; 4 System Planning Corporation, USA<br />

Abstract: The DARPA ARM demonstration features both the publicly available ARM robot simulator<br />

and a real remote robot. Visitors interact with the simulator to perform various tasks, then watch the<br />

real remote robot do the same tasks.<br />

OR4 – Flying the Pixhawk MAV - A Computer Vision Controlled Quadrotor (*)<br />

Lorenz Meier, Petri Tanskanen, Lionel Heng, Gim-Hee Lee, Friedrich Fraundorfer, Marc Pollefeys<br />

ETH Zurich, Switzerland<br />

Abstract: The Pixhawk micro air vehicle is an autonomous flying robot equipped with cameras and<br />

an onboard computer. It flies using onboard visual localization and performs object<br />

tracking and pattern recognition.<br />

OR5 – Proxy Method for Fast Haptic Rendering from Time Varying Point Clouds<br />

Fredrik Ryden, Sina Nia Kosari, Howard Jay Chizeck<br />

University of Washington, Seattle, USA<br />

Abstract: This demonstration will allow users to directly “feel” physical objects imaged by a Kinect.<br />

They will experience haptic rendering of both fixed and moving objects in real time.<br />

OR6 – Bio-inspired vertebral column, compliance and semi-passive dynamics in a<br />

lightweight humanoid robot (*)<br />

Olivier Ly, Matthieu Lapeyre, Pierre-Yves Oudeyer<br />

LABRI-INRIA, Talence, France<br />

Abstract: We demonstrate the use of compliance and a vertebral column in humanoid locomotion.<br />

OR7 – A Robotic System for Needle Steering<br />

Ann Majewicz 1 , John Swensen 2 , Tom Wedlick 2 , Kyle Reed 3 , Ron Alterovitz 4 , Vinutha Kallem 5 ,<br />

Wooram Park 6 , Gregory Chirikjian 2 , Ken Goldberg 7 , Animesh Garg 7 , Noah Cowan 2 ,<br />

and Allison Okamura 1<br />

1 Stanford Univ.; 2 Johns Hopkins Univ.; 3 Univ. of South Florida; 4 Univ. of North Carolina at<br />

Chapel Hill; 5 Univ. of Pennsylvania; 6 Univ. of Texas at Dallas; 7 UC Berkeley, USA<br />

Abstract: A live demonstration of robotic needle steering in artificial tissue, as well as videos and<br />

posters about models and simulations, path planners, controllers, and integration with medical<br />

imaging.<br />



OR8 – The r-one: A Low-Cost Robot for Research, Education, and Outreach<br />

A. Lynch, J. McLurkin<br />

Rice University, Houston, USA<br />

Abstract: We present the r-one: an advanced, low-cost robot for research and education. Come<br />

see autonomous multi-robot exploration, drive a mini-swarm, and write your own programs at our<br />

programming stations.<br />

OR9 – Ballbot Rezero<br />

Simon Doessegger, Peter Fankhauser, Corsin Gwerder, Jonathan Huessy, Jerome Kaeser,<br />

Thomas Kammermann, Lukas Limacher, Michael Neunert, Francis Colas, Cedric Pradalier, Roland<br />

Siegwart<br />

ETH Zurich, Switzerland<br />

Abstract: Ballbots are able to balance and drive on a single sphere. Designed for high agility,<br />

organic motion, and human interaction, Ballbot Rezero exploits its inherently unstable dynamics.<br />

OR10 – Generic Perception for Manipulation<br />

Manuel Brucker 1 , Chavdar Papazov 2 , Simon Leonard 3 , Darius Burschka 2 , Gregory D. Hager 3 , Tim<br />

Bodenmueller 1<br />

1 DLR, Germany; 2 TU Munich, Germany; 3 Johns Hopkins University, Baltimore, USA<br />

Abstract: Our scene parsing algorithm leverages physical constraints to enhance data fitting over<br />

time and space. Our demonstration will show a robot manipulating complex scenes composed of<br />

various objects.<br />

Symposium<br />

A special symposium, Robot Demonstrations, will highlight twelve of the best demo<br />

proposals, with short descriptions of the demos by the authors.<br />

When: Monday, September 26, <strong>2011</strong>, 16:00<br />

Where: Continental Ballroom 4 (Session MoBT4)<br />

List of talks<br />

SP2 – Aquila - The Cognitive Robotics Toolkit<br />

SP3 – General Purpose Service Robot<br />

SP4 – Active Perception Planning on the Care-o-Bot platform<br />

SP6 – Using Mobile Manipulation to solve a Household Task<br />

SP7 – Demonstration of iTaSC as a unified framework for task specification, control, and<br />

coordination for mobile manipulation<br />

SP10 – Haptic coupling with augmented feedback between the KUKA youBot and the PR2 robot<br />

arms<br />

SP11 – Detecting and Climbing Stairs with a Laser-equipped Nao Humanoid<br />

SP12 – More cowbell! A musical ensemble with the NAO thereminist<br />

OR1 – Kilobot: A Low Cost Scalable Robot System for Collective Behaviors<br />

OR2 – DisplaySwarm: A robot swarm displaying images<br />

OR4 – Flying the Pixhawk MAV - A Computer Vision Controlled Quadrotor<br />

OR6 – Bio-inspired vertebral column, compliance and semi-passive dynamics in a lightweight<br />

humanoid robot<br />



Floorplan<br />

Exhibition<br />

The Exhibition venue is the Golden Gate Room on the Lobby level of the Hilton hotel.<br />

Schedule<br />

Exhibitors will be active during the following days:<br />

Tuesday, September 27, <strong>2011</strong> 9:00 – 17:00<br />

Wednesday, September 28, <strong>2011</strong> 9:00 – 17:00<br />

Thursday, September 29, <strong>2011</strong> 9:00 – noon<br />



Exhibitors<br />

Platinum Corporate Sponsors exhibiting:<br />

Bosch Research and Technology Center www.boschresearch.com<br />

Honda Research Institute www.honda-ri.com/HRI_Us<br />

KUKA Robotics Corp. www.kuka.com<br />

SRI International www.sri.com<br />

Gold Corporate Sponsors exhibiting:<br />

Willow Garage, Inc. www.willowgarage.com<br />

Silver Corporate Sponsors exhibiting:<br />

Aldebaran Robotics www.aldebaran-robotics.com<br />

Google, Inc. research.google.com<br />

Intuitive Surgical, Inc. www.intuitivesurgical.com<br />

Schunk Intec, Inc. www.us.schunk.com<br />

Additional Exhibitors:<br />

Adept MobileRobots www.adept.com<br />

Ascending Technologies, Inc. www.asctec.de<br />

Barobo, Inc. www.barobo.com<br />

Barrett Technology, Inc. www.barrett.com<br />

BioRob GmbH www.biorob.de<br />

Butterfly Haptics, LLC www.butterflyhaptics.com<br />

Coroware, Inc. www.coroware.com<br />

DARPA ARM Outreach www.thearmrobot.com<br />

Festo Didactic www.festo.us<br />

Hokuyo Automatic Co., Ltd. www.hokuyo-aut.jp<br />

John Wiley & Sons www.wiley.com<br />

Kawada Industries, Inc. www.kawada.co.jp<br />

National Instruments Corporation www.ni.com<br />

RoadNarrows Robotics www.roadnarrows.com<br />

RobotCub Consortium (iCub) www.icub.org<br />

Robotis, Inc. www.robotis.com<br />

SimLab Co., Ltd www.simlab.co.kr<br />

Skybotix AG www.skybotix.com<br />

Springer Science+Business Media www.springer.com<br />



<strong>IROS</strong> <strong>2011</strong> Corporate Sponsors<br />

We acknowledge the support of the following Corporate Sponsors of the <strong>2011</strong> IEEE/RSJ<br />

International Conference on Intelligent Robots and Systems.<br />

Platinum<br />

www.boschresearch.com www.honda-ri.com/HRI_Us www.kuka.com www.sri.com<br />

Gold<br />

www.abb.com/robots www.willowgarage.com<br />

Silver<br />

www.aldebaran-robotics.com research.google.com www.intuitivesurgical.com www.us.schunk.com<br />

At the end of this Digest, you will find ads from the above Corporate Sponsors.<br />



<strong>IROS</strong> <strong>2011</strong> <strong>Awards</strong><br />

<strong>IROS</strong> Harashima Award<br />

This award honors Professor Fumio Harashima, the Honorary Founding Chair of the <strong>IROS</strong> conferences,<br />

by recognizing the outstanding contributions of an individual in the <strong>IROS</strong> community who has<br />

pioneered activities in robotics and intelligent systems.<br />

NTF Award for Entertainment Robots and Systems<br />

Started in 2007, this award encourages research and development of "entertainment<br />

robots and systems" and of new technologies for future entertainment. Sponsored by the<br />

New Technology Foundation; Certificate + USD 1,000.<br />

JTCF Novel Technology Paper Award for Amusement Culture<br />

This award recognizes practical technology contributing to toys, toy models, and amusement<br />

culture. Sponsored by the Japan Toy Culture Foundation; Certificate + JPY 100,000.<br />

RoboCup Best Paper Award<br />

Sponsored by The RoboCup Federation; Certificate + USD 500.<br />

<strong>IROS</strong> CoTeSys Cognitive Robotics Best Paper Award<br />

Started in 2010, this award promotes interdisciplinary research on cognition for<br />

technical systems (CoTeSys) and advances in cognitive robotics for industry, home<br />

applications, and daily life. Sponsored by the German Cluster of Excellence, CoTeSys;<br />

Certificate + USD 1,000.<br />

Best Application Paper Award<br />

Sponsored by the Institute of Control, Robotics, and Systems (ICROS); Certificate + USD<br />

1,000.<br />

<strong>IROS</strong> <strong>2011</strong> Best Paper Award<br />

To recognize the most outstanding paper presented at the Conference, based on the<br />

quality of the contribution, of the written paper, and of the oral presentation. Sponsored by<br />

RSJ and SICE; Certificate + USD 2,000.<br />

<strong>IROS</strong> <strong>2011</strong> Best Student Paper Award<br />

To recognize the most outstanding paper authored primarily and presented by a student at<br />

the Conference, based on the quality of the contribution, of the written paper and of the oral<br />

presentation. Sponsored by the <strong>IROS</strong> <strong>2011</strong> Conference; Certificate + USD 1,000.<br />

All award winners will be announced during the <strong>Awards</strong> Lunch on Thursday, September 29.<br />



Welcome Reception<br />

Social Events<br />

A welcome reception will be offered to all <strong>IROS</strong> <strong>2011</strong> participants on Monday,<br />

September 26, 18:30-20:00. After the end of the sessions, we will meet in the Continental<br />

Ballroom.<br />

Dinner Cruise on the Bay<br />

San Francisco's skyline is most spectacular when seen from the water. On Tuesday,<br />

September 27, we will take you on an afternoon and dinner cruise on the San Francisco<br />

Belle, a beautiful three-level, 292-foot sternwheeler with a unique Art Nouveau style.<br />

During the cruise, you will have ample opportunities to enjoy the views of the City and<br />

many of its most famous sights, such as Alcatraz, Angel Island, the Embarcadero,<br />

Fisherman's Wharf, Coit Tower, the skyscrapers of the Financial District, and – of course –<br />

the Golden Gate Bridge. Transfer buses will leave from the Hilton hotel at 16:00, and we<br />

expect to be back at the hotel at 22:00.<br />

Gold Lunch<br />

On Wednesday, September 28, at noon, a lunch for all Graduates of the Last Decade<br />

(GOLD) will be organized on the Continental Ballroom level. This luncheon was initiated<br />

within the RAS Technical Activities Board as a means of making graduates aware of what the<br />

society has to offer and of helping them network with each other. The opportunity is also used to<br />

present the structure of the society and to introduce the various Technical Committees that form the TAB.<br />

RAS members will have first priority. If you are not yet a member, you can sign up now at<br />

www.ieee-ras.org.<br />

Lunch with Leaders<br />

This event is organized by the IEEE Robotics and Automation Society (RAS) and will take<br />

place in parallel to the GOLD lunch on Wednesday, September 28, at noon. Lunch with<br />

Leaders (LwL) was initiated by the Student Activities Committee with the aim of providing<br />

students with an opportunity to get in contact with leaders and receive advice and mentoring on<br />

their careers and research. RAS members will have first priority.<br />

Evening at the Museum with Picasso<br />

This year, we have discovered a very special venue for the conference banquet. It will be<br />

held on the evening of Wednesday, September 28, in the de Young Museum. The<br />

museum is situated in Golden Gate Park, which extends all the way to Ocean Beach, and<br />

which features many other attractions such as the Japanese Tea Garden, the Botanical<br />

Garden, and the California Academy of Sciences, to name but a few. As a special highlight,<br />

we will have exclusive access to the Picasso Exhibition. Transfer buses will leave from the<br />

Hilton hotel at 18:00, and we expect to be back at the hotel at 22:00.<br />



<strong>Awards</strong> Ceremony<br />

All <strong>IROS</strong> <strong>2011</strong> participants are invited to attend the <strong>Awards</strong> Ceremony in the Continental<br />

Ballroom on Thursday, September 29, at 12:30, where the eight <strong>IROS</strong> <strong>2011</strong><br />

Conference <strong>Awards</strong> will be presented.<br />

Farewell Reception<br />

At the end of the conference, a reception will gather all <strong>IROS</strong> <strong>2011</strong> participants in the<br />

Continental Ballroom on Thursday, September 29, 18:30-20:00.<br />



Map of Conference Venue<br />



Conference Overview<br />

<strong>IROS</strong> <strong>2011</strong> features several types of sessions. In addition to the customary technical tracks<br />

of regular sessions, there are specially organized symposia, novel interactive sessions,<br />

and five forums.<br />

Regular Sessions<br />

Each regular session consists of eight papers: five are presented in a regular (15-minute) talk<br />

format, and three are allotted short (5-minute) presentations. Each of these “teaser” talks<br />

previews the corresponding paper and invites the audience to the full presentation of the<br />

paper at the subsequent (90-minute) interactive session in the special presentation area<br />

equipped with large screens.<br />

Symposia<br />

Symposia are invited sessions that are focused on a specific subfield. Organized by<br />

renowned researchers, they are a celebration of 50 years of robotics. All symposia consist<br />

of papers that have undergone the regular review process. In addition, a longer invited talk<br />

(semi-plenary) is included in some symposia.<br />

Interactive sessions<br />

These sessions allow a carefully selected number of papers to be presented in an<br />

interactive manner on large flat screens. Up to 30 papers are presented in parallel to small<br />

groups of interested attendees during each session, in order to foster a lively exchange of<br />

ideas between participants. Before being presented in an interactive session, every such<br />

paper is previewed in the form of a short (5-minute) “teaser” talk in a regular session. It is<br />

important to note that papers presented during interactive sessions were reviewed and are<br />

published exactly as regular papers. The Senior Program Committee selected those<br />

papers for interactive sessions that are most likely to benefit from the special presentation<br />

format.<br />

Industrial Forums<br />

<strong>IROS</strong> <strong>2011</strong> features three industrial forums, each one session long. Each forum includes<br />

representatives from robotics companies making short presentations and engaging in a<br />

moderated panel discussion among themselves and with the audience.<br />

Robots: The New Commercial Platforms<br />

This forum focuses on recent and emerging robotic platforms. Participants from industry<br />

discuss recent and projected advances in robotic technology with a commentary on<br />

emerging applications. The forum is moderated by Professor Rüdiger Dillmann.<br />



Robots: The Next Generation<br />

This forum focuses on next generation robotic platforms, new business models for robotics,<br />

and the role of open software. Participants from industry discuss advances in robotic<br />

technology with an emphasis on software and applications. The forum is moderated by<br />

Steve Cousins.<br />

Medical Robotics<br />

This forum focuses on the rapidly growing field of medical robotics. In synergy with the<br />

Symposia on medical robotics at <strong>IROS</strong> <strong>2011</strong>, this forum features industry experts<br />

discussing and presenting cutting edge commercial advances in the field. The forum is<br />

moderated by Professor Paolo Dario.<br />

Forums<br />

Robotics: Beyond the Horizon<br />

In addition to the industrial forums, <strong>IROS</strong> <strong>2011</strong> features a special ‘Blue Sky’ forum on the<br />

future of robotics. Participants from academia, government, and industry present their<br />

visions for the future of the field. The forum is moderated by Professor Hirochika Inoue.<br />

On Robotics Conferences: A Town Hall Meeting<br />

<strong>IROS</strong> <strong>2011</strong> features a town hall meeting forum. The forum is an opportunity for conference<br />

attendees to provide feedback about the conference, and for future conference organizers<br />

to briefly present their vision of the meetings they are planning. The forum is moderated by<br />

Professor Peter Corke.<br />



<strong>IROS</strong> <strong>2011</strong> Technical Program at a Glance: Monday, September 26, <strong>2011</strong><br />

Track T1 Track T2 Track T3 Track T4 Track T5 Track T6 Track T7 Track T8 Track T9 Track T10<br />

14:00-15:30<br />

MoAT1<br />

Continental<br />

Parlor 1<br />

Microsensing<br />

16:00-17:30<br />

MoBT1<br />

Continental<br />

Parlor 1<br />

Micromanipulation<br />

14:00-15:30<br />

MoAT2<br />

Continental<br />

Parlor 2<br />

Localization<br />

16:00-17:30<br />

MoBT2<br />

Continental<br />

Parlor 2<br />

Localization with<br />

Constraints<br />

14:00-15:30<br />

MoAT3<br />

Continental<br />

Parlor 3<br />

Symposium:<br />

Robot Audition:<br />

Active Audition<br />

16:00-17:30<br />

MoBT3<br />

Continental<br />

Parlor 3<br />

Symposium:<br />

Robot Audition:<br />

From Sound<br />

Source<br />

Localization to<br />

Automatic Speech<br />

Recognition<br />

14:00-15:30<br />

MoAT4<br />

Continental<br />

Ballroom 4<br />

Symposium:<br />

Telerobotics<br />

16:00-17:30<br />

MoBT4<br />

Continental<br />

Ballroom 4<br />

Symposium:<br />

Robot<br />

Demonstrations<br />

14:00-15:30<br />

MoAT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Bio-Inspired<br />

Robotics I<br />

16:00-17:30<br />

MoBT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Bio-Inspired<br />

Robotics II<br />


14:00-15:30<br />

MoAT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Field Robotics I<br />

16:00-17:30<br />

MoBT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Field Robotics II<br />

14:00-15:30<br />

MoAT7<br />

Continental<br />

Parlor 7<br />

Wheeled Robots<br />

16:00-17:30<br />

MoBT7<br />

Continental<br />

Parlor 7<br />

Teleoperation<br />

14:00-15:30<br />

MoAT8<br />

Continental<br />

Parlor 8<br />

Learning<br />

Parameterized<br />

Policies<br />

16:00-17:30<br />

MoBT8<br />

Continental<br />

Parlor 8<br />

Learning for<br />

Control<br />

14:00-15:30<br />

MoAT9<br />

Continental<br />

Parlor 9<br />

Novel Actuators I<br />

16:00-17:30<br />

MoBT9<br />

Continental<br />

Parlor 9<br />

Novel Actuators II<br />

16:00-17:30<br />

MoBPT10<br />

Golden Gate<br />

Room<br />

Interactive I


<strong>IROS</strong> <strong>2011</strong> Technical Program at a Glance: Tuesday, September 27, <strong>2011</strong><br />

Track T1 Track T2 Track T3 Track T4 Track T5 Track T6 Track T7 Track T8 Track T9 Track T10<br />

08:00-09:30<br />

TuAT1<br />

Continental<br />

Parlor 1<br />

Object<br />

Recognition,<br />

Segmentation,<br />

and Detection<br />

10:00-11:30<br />

TuBT1<br />

Continental<br />

Parlor 1<br />

Perception,<br />

Saliency and<br />

Novelty<br />

14:00-15:30<br />

TuCT1<br />

Continental<br />

Parlor 1<br />

Object Detection<br />

& Collision<br />

Avoidance<br />

08:00-09:30<br />

TuAT2<br />

Continental<br />

Parlor 2<br />

Simultaneous<br />

Localization and<br />

Mapping<br />

10:00-11:30<br />

TuBT2<br />

Continental<br />

Parlor 2<br />

Semantic SLAM &<br />

Loop Closure<br />

14:00-15:30<br />

TuCT2<br />

Continental<br />

Parlor 2<br />

Motion Estimation,<br />

Mapping & SLAM<br />

08:00-09:30<br />

TuAT3<br />

Continental<br />

Parlor 3<br />

Symposium:<br />

Microrobotics<br />

I<br />

10:00-11:30<br />

TuBT3<br />

Continental<br />

Parlor 3<br />

Symposium:<br />

Microrobotics<br />

II<br />

14:00-15:30<br />

TuCT3<br />

Continental<br />

Parlor 3<br />

Symposium:<br />

Microrobotics<br />

III<br />

08:00-09:30<br />

TuAT4<br />

Continental<br />

Ballroom 4<br />

Industrial Forum:<br />

Robots: The New<br />

Commercial<br />

Platforms<br />

10:00-11:30<br />

TuBT4<br />

Continental<br />

Ballroom 4<br />

Industrial Forum:<br />

Robots: The Next<br />

Generation<br />

14:00-15:30<br />

TuCT4<br />

Continental<br />

Ballroom 4<br />

Forum:<br />

Robotics: Beyond<br />

the Horizon<br />

08:00-09:30<br />

TuAT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Medical Robotics<br />

I<br />

10:00-11:30<br />

TuBT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Medical Robotics<br />

II<br />


08:00-09:30<br />

TuAT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Grasping and<br />

Manipulation:<br />

Control and<br />

Learning<br />

10:00-11:30<br />

TuBT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Grasping and<br />

Manipulation:<br />

Mechanics and<br />

Design<br />

12:30-13:45 TuPL<br />

Continental Ballroom<br />

Plenary I: Design<br />

14:00-15:30<br />

TuCT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Medical Robotics<br />

III<br />

14:00-15:30<br />

TuCT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Grasping and<br />

Manipulation:<br />

Grasp Planning<br />

and Quality<br />

08:00-09:30<br />

TuAT7<br />

Continental<br />

Parlor 7<br />

Software<br />

Architectures &<br />

Frameworks<br />

10:00-11:30<br />

TuBT7<br />

Continental<br />

Parlor 7<br />

Contact and<br />

Deformation<br />

14:00-15:30<br />

TuCT7<br />

Continental<br />

Parlor 7<br />

Mechanisms:<br />

Actuator Design<br />

08:00-09:30<br />

TuAT8<br />

Continental<br />

Parlor 8<br />

Bio-Inspired &<br />

Biomimetic Robots<br />

10:00-11:30<br />

TuBT8<br />

Continental<br />

Parlor 8<br />

Biomimetic<br />

Limbed Robots<br />

14:00-15:30<br />

TuCT8<br />

Continental<br />

Parlor 8<br />

Reptile & Fish-<br />

Inspired Robots<br />

08:00-09:30<br />

TuAT9<br />

Continental<br />

Parlor 9<br />

Probabilistic<br />

Exploration and<br />

Coverage<br />

10:00-11:30<br />

TuBT9<br />

Continental<br />

Parlor 9<br />

Perceptual<br />

Learning<br />

14:00-15:30<br />

TuCT9<br />

Continental<br />

Parlor 9<br />

Novel Sensors<br />

08:00-09:30<br />

TuAPT10<br />

Golden Gate<br />

Room<br />

Interactive II<br />

10:00-11:30<br />

TuBPT10<br />

Golden Gate<br />

Room<br />

Interactive III<br />

14:00-15:30<br />

TuCPT10<br />

Golden Gate<br />

Room<br />

Interactive IV


<strong>IROS</strong> <strong>2011</strong> Technical Program at a Glance: Wednesday, September 28, <strong>2011</strong><br />

Track T1 Track T2 Track T3 Track T4 Track T5 Track T6 Track T7 Track T8 Track T9 Track T10<br />

08:00-09:30<br />

WeAT1<br />

Continental<br />

Parlor 1<br />

HRI: Modeling<br />

Human Behavior<br />

10:00-11:30<br />

WeBT1<br />

Continental<br />

Parlor 1<br />

Socially Assistive<br />

Robots<br />

14:00-15:30<br />

WeCT1<br />

Continental<br />

Parlor 1<br />

Human-Robot<br />

Interaction and<br />

Cooperation<br />

16:00-17:30<br />

WeDT1<br />

Continental<br />

Parlor 1<br />

Human-Robot<br />

Collaboration<br />

08:00-09:30<br />

WeAT2<br />

Continental<br />

Parlor 2<br />

Models and<br />

Representation<br />

10:00-11:30<br />

WeBT2<br />

Continental<br />

Parlor 2<br />

Estimation &<br />

Sensor Fusion<br />

14:00-15:30<br />

WeCT2<br />

Continental<br />

Parlor 2<br />

Pose Estimation<br />

& Visual Tracking<br />

16:00-17:30<br />

WeDT2<br />

Continental<br />

Parlor 2<br />

Recognition &<br />

Prediction of<br />

Motion<br />

08:00-09:30<br />

WeAT3<br />

Continental<br />

Parlor 3<br />

Medical Robotics:<br />

Tracking &<br />

Detection<br />

10:00-11:30<br />

WeBT3<br />

Continental<br />

Parlor 3<br />

Surgical Robotics<br />

14:00-15:30<br />

WeCT3<br />

Continental<br />

Parlor 3<br />

Robot Safety<br />

16:00-17:30<br />

WeDT3<br />

Continental<br />

Parlor 3<br />

Haptic Rendering<br />

& Object<br />

Recognition<br />

08:00-09:30<br />

WeAT4<br />

Continental<br />

Ballroom 4<br />

Symposium: Haptics Interfaces for the Fingertip, Hand, and Arm<br />

08:00-09:30<br />

WeAT5<br />

Continental<br />

Ballroom 5<br />

Symposium: Robot Motion Planning: Achievements and Emerging Approaches<br />

10:00-11:30<br />

WeBT4<br />

Continental<br />

Ballroom 4<br />

Symposium:<br />

Hardware and<br />

Software Design<br />

for Haptic Systems<br />

14:00-15:30<br />

WeCT4<br />

Continental<br />

Ballroom 4<br />

Symposium:<br />

Haptic Feedback<br />

and System<br />

Evaluation<br />

16:00-17:30<br />

WeDT4<br />

Continental<br />

Ballroom 4<br />

Industrial Forum:<br />

Medical Robotics<br />

10:00-11:30<br />

WeBT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Foundations and<br />

Future Prospects<br />

of Sampling-Based<br />

Motion Planning<br />


08:00-09:30<br />

WeAT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Aerial Robotics:<br />

Estimation,<br />

Perception and<br />

Control<br />

10:00-11:30<br />

WeBT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Aerial Robotics:<br />

Control and<br />

Planning<br />

12:30-13:45 WePL<br />

Continental Ballroom<br />

Plenary II: BioRobotics<br />

14:00-15:30<br />

WeCT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Symbolic<br />

Approaches to<br />

Motion Planning<br />

and Control<br />

16:00-17:30<br />

WeDT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Robot Motion<br />

Planning: New<br />

Frameworks and<br />

High Performance<br />

14:00-15:30<br />

WeCT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Marine Robotics:<br />

Platforms and<br />

Applications<br />

16:00-17:30<br />

WeDT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Marine Robotics:<br />

Control and<br />

Planning<br />

08:00-09:30<br />

WeAT7<br />

Continental<br />

Parlor 7<br />

Robot Walking<br />

10:00-11:30<br />

WeBT7<br />

Continental<br />

Parlor 7<br />

Passive Walking<br />

& Leg-Wheeled<br />

Robots<br />

14:00-15:30<br />

WeCT7<br />

Continental<br />

Parlor 7<br />

Humanoid<br />

Control<br />

16:00-17:30<br />

WeDT7<br />

Continental<br />

Parlor 7<br />

Tracking & Gait<br />

Analysis<br />

08:00-09:30<br />

WeAT8<br />

Continental<br />

Parlor 8<br />

Networked Robots<br />

10:00-11:30<br />

WeBT8<br />

Continental<br />

Parlor 8<br />

Multirobot<br />

Systems:<br />

Rendezvous &<br />

Task Switching<br />

14:00-15:30<br />

WeCT8<br />

Continental<br />

Parlor 8<br />

Multirobot<br />

Planning<br />

16:00-17:30<br />

WeDT8<br />

Continental<br />

Parlor 8<br />

Multirobot<br />

Coordination &<br />

Modular Robots<br />

08:00-09:30<br />

WeAT9<br />

Continental<br />

Parlor 9<br />

Vision: From<br />

Features to<br />

Applications<br />

10:00-11:30<br />

WeBT9<br />

Continental<br />

Parlor 9<br />

Visual Servoing<br />

14:00-15:30<br />

WeCT9<br />

Continental<br />

Parlor 9<br />

Visual & Multi-<br />

Sensor Calibration<br />

16:00-17:30<br />

WeDT9<br />

Continental<br />

Parlor 9<br />

Calibration &<br />

Identification<br />

08:00-09:30<br />

WeAPT10<br />

Golden Gate<br />

Room<br />

Interactive V<br />

10:00-11:30<br />

WeBPT10<br />

Golden Gate<br />

Room<br />

Interactive VI<br />

14:00-15:30<br />

WeCPT10<br />

Golden Gate<br />

Room<br />

Interactive VII<br />

16:00-17:30<br />

WeDPT10<br />

Golden Gate<br />

Room<br />

Interactive VIII


<strong>IROS</strong> <strong>2011</strong> Technical Program at a Glance: Thursday, September 29, <strong>2011</strong><br />

Track T1 Track T2 Track T3 Track T4 Track T5 Track T6 Track T7 Track T8 Track T9 Track T10<br />

08:00-09:30<br />

ThAT1<br />

Continental<br />

Parlor 1<br />

Force and<br />

Stiffness Control<br />

10:00-11:30<br />

ThBT1<br />

Continental<br />

Parlor 1<br />

Control for<br />

Manipulation<br />

& Grasping<br />

14:45-16:15<br />

ThCT1<br />

Continental<br />

Parlor 1<br />

Manipulation<br />

08:00-09:30<br />

ThAT2<br />

Continental<br />

Parlor 2<br />

Range and<br />

RGB-D Sensing<br />

10:00-11:30<br />

ThBT2<br />

Continental<br />

Parlor 2<br />

Mapping<br />

14:45-16:15<br />

ThCT2<br />

Continental<br />

Parlor 2<br />

Industrial Robots<br />

16:45-17:45<br />

ThDT1<br />

Continental Ballroom<br />

Forum:<br />

On Robotics Conferences:<br />

A Town Hall Meeting<br />

08:00-09:30<br />

ThAT3<br />

Continental<br />

Parlor 3<br />

Discrete &<br />

Kinodynamic<br />

Planning<br />

10:00-11:30<br />

ThBT3<br />

Continental<br />

Parlor 3<br />

08:00-09:30<br />

ThAT4<br />

Continental<br />

Ballroom 4<br />

08:00-09:30<br />

ThAT5<br />

Continental<br />

Ballroom 5<br />

ThAT4: Symposium: Stochasticity in<br />

Robotics and Biological Systems I<br />

ThAT5: Symposium: Humanoid<br />

Robotics and Biped Locomotion<br />

ThBT3: Randomized Planning &<br />

Kinematic Control<br />

ThBT4: Symposium: Stochasticity in<br />

Robotics and Biological Systems II<br />

14:45-16:15<br />

ThCT3<br />

Continental<br />

Parlor 3<br />

Marine Systems<br />

I<br />

16:45-17:45<br />

ThDT3<br />

Continental<br />

Parlor 3<br />

Marine Systems<br />

II<br />

10:00-11:30<br />

ThBT4<br />

Continental<br />

Ballroom 4<br />

14:45-16:15<br />

ThCT4<br />

Continental<br />

Ballroom 4<br />

Symposium:<br />

(Self-)assembly<br />

from the Nano to<br />

the Macro Scale:<br />

State of the Art<br />

and Future<br />

Directions<br />

16:45-17:45<br />

ThDT4<br />

Continental<br />

Parlor 2<br />

Novel System<br />

Designs:<br />

Locomotion<br />

10:00-11:30<br />

ThBT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Humanoid<br />

Technologies<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–xlvi–<br />

08:00-09:30<br />

ThAT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Robots that<br />

Can See I<br />

10:00-11:30<br />

ThBT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Robots that<br />

Can See II<br />

13:30-14:30 ThPT11<br />

Continental Ballroom<br />

Plenary III: Self-Driving Cars<br />

14:45-16:15<br />

ThCT5<br />

Continental<br />

Ballroom 5<br />

Symposium:<br />

Humanoid<br />

Applications<br />

16:45-17:45<br />

ThDT5<br />

Continental<br />

Parlor 1<br />

Climbing &<br />

Brachiation<br />

14:45-16:15<br />

ThCT6<br />

Continental<br />

Ballroom 6<br />

Symposium:<br />

Computer Vision<br />

for Robotics<br />

16:45-17:45<br />

ThDT6<br />

Continental<br />

Parlor 9<br />

Novel System<br />

Designs: Sensing<br />

and Manipulation<br />

08:00-09:30<br />

ThAT7<br />

Continental<br />

Parlor 7<br />

Mechanisms: Joint<br />

Design<br />

10:00-11:30<br />

ThBT7<br />

Continental<br />

Parlor 7<br />

Medical Robots<br />

& Systems<br />

14:45-16:15<br />

ThCT7<br />

Continental<br />

Parlor 7<br />

Exoskeleton<br />

Robots & Gait<br />

Rehabilitation<br />

16:45-17:45<br />

ThDT7<br />

Continental<br />

Parlor 7<br />

Medical Robotics:<br />

Motion Planning &<br />

State Estimation<br />

08:00-09:30<br />

ThAT8<br />

Continental<br />

Parlor 8<br />

Autonomous<br />

Vehicles<br />

10:00-11:30<br />

ThBT8<br />

Continental<br />

Parlor 8<br />

Search & Rescue<br />

Robots<br />

14:45-16:15<br />

ThCT8<br />

Continental<br />

Parlor 8<br />

Aerial Robots:<br />

Navigation,<br />

Tracking &<br />

Landing<br />

16:45-17:45<br />

ThDT8<br />

Continental<br />

Parlor 8<br />

Aerial Robots<br />

08:00-09:30<br />

ThAT9<br />

Continental<br />

Parlor 9<br />

Towards<br />

Anthropomimetic<br />

Robots<br />

10:00-11:30<br />

ThBT9<br />

Continental<br />

Parlor 9<br />

Intelligent<br />

Transportation<br />

Systems<br />

(Automotive)<br />

14:45-16:15<br />

ThCT9<br />

Continental<br />

Parlor 9<br />

Swarms and<br />

Flocks<br />

08:00-09:30<br />

ThAPT10<br />

Golden Gate<br />

Room<br />

Interactive IX<br />

10:00-11:30<br />

ThBPT10<br />

Golden Gate<br />

Room<br />

Interactive X<br />

14:45-16:15<br />

ThCPT10<br />

Golden Gate<br />

Room<br />

Interactive XI<br />

16:45-17:45<br />

ThDPT10<br />

Golden Gate<br />

Room<br />

Interactive XII


14:00-15:30<br />

Technical Program Digest<br />

Session MoAT1 ⎯⎯ Microsensing<br />

Session MoAT2 ⎯⎯ Localization<br />

Session MoAT3 ⎯⎯ Symposium: Robot Audition: Active Audition<br />

Session MoAT4 ⎯⎯ Symposium: Telerobotics<br />

Session MoAT5 ⎯⎯ Symposium: Bio-inspired Robotics I<br />

Session MoAT6 ⎯⎯ Symposium: Field Robotics I<br />

Session MoAT7 ⎯⎯ Wheeled Robots<br />

Session MoAT8 ⎯⎯ Learning Parameterized Policies<br />

Session MoAT9 ⎯⎯ Novel Actuators I<br />

16:00-17:30<br />


Monday<br />

September 26, <strong>2011</strong><br />

Session MoBT1 ⎯⎯ Micromanipulation<br />

Session MoBT2 ⎯⎯ Localization with Constraints<br />

Session MoBT3 ⎯⎯ Symposium: Robot Audition: From Sound Source Localization to<br />

Automatic Speech Recognition<br />

Session MoBT4 ⎯⎯ Symposium: Robot Demonstrations<br />

Session MoBT5 ⎯⎯ Symposium: Bio-inspired Robotics II<br />

Session MoBT6 ⎯⎯ Symposium: Field Robotics II<br />

Session MoBT7 ⎯⎯ Teleoperation<br />

Session MoBT8 ⎯⎯ Learning for Control<br />

Session MoBT9 ⎯⎯ Novel Actuators II




Session MoAT1 Continental Parlor 1 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Microsensing<br />

Chair Koichi Hashimoto, Tohoku Univ.<br />

Co-Chair Fumihito Arai, Nagoya Univ.<br />

14:00–14:15 MoAT1.1<br />

Evaluation of Biological Clock Activity<br />

Capsulated by Lipid-mono-layer<br />

Masaru Kojima 1 , Masahiro Nakajima 2 , Kingo Takiguchi 3 ,<br />

Michio Homma 3 , Takao Kondo 3 and Toshio Fukuda 1,2<br />

1.Dept. of Micro-Nano Systems Eng., Nagoya Univ., Japan<br />

2.Center for Micro-Nano Mechatronics, Nagoya Univ., Japan<br />

3.Div. of biological science, Nagoya Univ., Japan<br />

• Toward applications, the components of<br />

the biological clock were reconstituted into<br />

phospholipid-coated microdroplets.<br />

• The clock activity in phospholipid-coated<br />

microdroplets was detected over a long<br />

period.<br />

• Localization of KaiB and KaiC proteins in<br />

droplets was observed using<br />

fluorescence microscopy.<br />

• The different distributions of KaiB and<br />

KaiC suggest that biased local<br />

concentrations produce the long period.<br />

Figure: clock activity (phosphorylation rate vs. time) in phospholipid-coated<br />

microdroplets compared with a control; scale bar 50 μm.<br />

14:30–14:45 MoAT1.3<br />

Temperature Measurement by Color Analysis of<br />

Fluorescent Spectrum Using Cell Investigation Tool<br />

Impregnated with Quantum Dot for<br />

Cell Measurement on a Microfluidic Chip<br />

Hisataka MARUYAMA, Kyohei TOMITA, Taisuke MASUDA<br />

Department of Micro-Nano Systems Engineering, Nagoya University, Japan<br />

Fumihito Arai<br />

Department of Micro-Nano Systems Engineering, Nagoya University, Japan<br />

Seoul National University, Korea.<br />

• Local temperature measurement using a<br />

hydrogel containing temperature-sensitive<br />

quantum dots was achieved.<br />

• Temperature was measured by color<br />

information instead of fluorescent intensity<br />

for long-lifetime temperature<br />

measurement.<br />

• The sensitivity of the temperature<br />

measurement was -1.3%/K, and the<br />

accuracy was ±0.3 K.<br />

• The lifetime of temperature measurement<br />

using color is much longer than that of<br />

measurement using fluorescent intensity.<br />

Fluorescent image of gel-tool<br />

containing Q-dots<br />

14:15–14:30 MoAT1.2<br />

Fast and adaptive auto-focusing algorithm<br />

for microscopic cell observation<br />

Takeshi Obara and Koichi Hashimoto<br />

Graduate School of Information Sciences, Tohoku University, Japan<br />

Yasunobu Igarashi<br />

Olympus Software Technology Corp., Japan<br />

• Auto-focusing algorithm to observe a<br />

freely motile cell, combining the two<br />

methods below:<br />

• Fast auto-focus: Depth from Diffraction<br />

(DFDi)<br />

• Adaptive auto-focus: wide scan using a 2D<br />

Laplacian to correct the DFDi parameter<br />

• A freely moving paramecium was auto-focused<br />

and tracked for 45 s (see figure)<br />


Time-lapse images of the freely<br />

moving paramecium<br />

14:45–14:50 MoAT1.4<br />

Tracking of Objects in Motion-Distorted Scanning<br />

Electron Microscope Images<br />

Christian Dahmen and Sergej Fatikow<br />

AMiR, University of Oldenburg, Germany<br />

• Motion of objects induces object<br />

appearance distortion due to the scanning<br />

nature of the scanning electron microscope<br />

• Adjacent lines have the closest temporal<br />

relation<br />

• For tracking, a template is decomposed<br />

into line templates, and the line templates<br />

are flexibly matched<br />

• This allows tracking of objects where<br />

standard algorithms may fail due to<br />

distortions.



14:50–14:55 MoAT1.5<br />

Cell Hardness Measurement by Using Two-fingered<br />

Microhand with Micro Force Sensor<br />

Daiki KAWAKAMI*, Kenichi OHARA*,<br />

Tomohito TAKUBO*, Yasushi MAE*, Akihiko ICHIKAWA**,<br />

Tamio TANIKAWA***, Tatsuo ARAI*<br />

* Graduate School of System Engineering, Osaka Univ., Japan<br />

** Dept. of Micro-Nano System Engineering, Nagoya Univ., Japan<br />

***Intelligent Systems Research Institute, AIST, Japan<br />

• Two-fingered microhand with micro force<br />

sensor was developed.<br />

• The microhand can realize cell operation<br />

and cell condition measurement.<br />

• Meyer hardness is adopted as a validation<br />

factor.<br />

• A calibration strategy relating measured<br />

sensor data to SI units is shown.<br />

• An algorithm to obtain Meyer hardness<br />

using the microhand is proposed.<br />

• Validation results are shown through<br />

experiments using fibroblast cells<br />

End-effector with micro force sensor<br />

Loading and unloading process<br />

for fibroblast cell<br />

15:00–15:15 MoAT1.7<br />

Nanoforce estimation with Kalman filtering<br />

applied to a force sensor based on diamagnetic<br />

levitation<br />

Emmanuel Piat and Joël Abadie and Stéphane Oster<br />

FEMTO-ST Institute, CNRS – UFC – ENSMM – UTBM, France<br />

• Nanoforce sensor based on diamagnetic<br />

levitation<br />

• Force/displacement transducer based on<br />

macroscopic seismic mass<br />

• Deconvolution of displacement using<br />

Kalman filtering to estimate the force<br />

• Uncertain a priori modelling of the<br />

unknown force<br />

• Trade-off between force resolution and<br />

sensor response time<br />


14:55–15:00 MoAT1.6<br />

An ultra-high precision, high bandwidth torque<br />

sensor for microrobotics applications<br />

Ben Finio and Robert Wood<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

Kevin Galloway<br />

Wyss Institute, Harvard University, USA<br />

• We present the analysis, design, fabrication, calibration, and use of a<br />

high-resolution, single-axis torque sensor<br />

• The sensor fulfills the need for torque measurements in the micro-Newton-meter<br />

range, which is not possible with commercially available sensors<br />

• As an example application, we measure open-loop pitch torques<br />

generated by wing motions of a robotic fly<br />

15:15–15:30 MoAT1.8<br />

Optimal Design of Non Intuitive Compliant<br />

Microgripper With High Resolution<br />

Aymen Grira and Christine Rotinat-Libersa<br />

CEA –LIST, Interactive Robotics Laboratory France<br />

Aymen Grira, Bernard Legrand, Estelle Mairiaux, Lionel Buchaillot<br />

IEMN, Nano and micro systems group, Lille, France<br />

• New high force-resolution microgripper with<br />

large jaw displacement, electrostatically<br />

actuated by a comb-drive and instrumented<br />

with an integrated differential capacitive<br />

displacement sensor.<br />

• This microgripper has been non-intuitively<br />

designed using a multi-objective optimization<br />

method, to reach the best compromise<br />

between chosen performance criteria.<br />

• It has been modeled and fabricated using<br />

silicon micro-technology.<br />

• The theoretical gripping force resolution<br />

obtained is 78 pN along the direction of the<br />

mobile tip displacement.<br />



SEM pictures of the fabricated microgripper.<br />

(a) Overview of the optimized microgripper<br />

combined with the comb-drive actuator and<br />

the differential capacitive displacement sensor.<br />

(b) A close-up view of the differential capacitive<br />

displacement sensor made of 30 fingers. Each<br />

one is 485 µm long and 4 µm wide. (c) A<br />

close-up view of the electrostatic comb-drive<br />

actuator made of 68 fingers. Each finger is 20<br />

µm long and 2 µm wide. The fingers are<br />

spaced by 2 µm and the overlapping distance<br />

between fixed and mobile fingers at rest is 10<br />

µm.


Session MoAT2 Continental Parlor 2 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Localization<br />

Chair Juan D. Tardos, Univ. de Zaragoza<br />

Co-Chair Antonio Franchi, Max Planck Inst. for Biological Cybernetics<br />

14:00–14:15 MoAT2.1<br />

Real-Time Loop Detection<br />

with Bags of Binary Words<br />

Dorian Gálvez-López and Juan D. Tardós<br />

I3A, University of Zaragoza, Spain<br />

• Real-time loop closure detection applying<br />

the bag-of-words + geometrical checking<br />

approach on a binary descriptor space.<br />

• The feature extraction time is reduced by<br />

using FAST points with Binary Robust<br />

Independent Elementary Features<br />

(BRIEF).<br />

• Geometrical checking is sped up by<br />

pre-filtering correspondence points with a<br />

direct index.<br />

• Extracting features, detecting loops in a<br />

database with 19K images and performing<br />

geometrical checking take less than 50<br />

milliseconds.<br />

Execution time with 2723 images.<br />

Precision: 100%, Recall: 57.86%<br />

14:30–14:45 MoAT2.3<br />

An Observability-Constrained<br />

Sliding Window Filter for SLAM<br />

Guoquan Huang, and Stergios Roumeliotis<br />

Dept. of Computer Science and Engineering, Univ. of Minnesota, USA<br />

Anastasios Mourikis<br />

Dept. of Electrical Engineering, Univ. of California, Riverside, USA<br />

• Sliding-Window Filter (SWF):<br />

A resource-adaptive smoother,<br />

better than filtering at dealing with<br />

nonlinearity<br />

• Inconsistency of standard SWF:<br />

rank of the Hessian is erroneously<br />

increased<br />

• Observability-Constrained (OC)-<br />

SWF selects linearization points to<br />

ensure correct rank of Hessian<br />

while minimizing linearization errors<br />

OC-SWF: better consistency and<br />

accuracy than competing approaches,<br />

performs closer to the batch Maximum-A-<br />

Posteriori (MAP) estimator in tests<br />

14:15–14:30 MoAT2.2<br />

Best-First Branch and Bound Search Method for<br />

Map Based Localization<br />

Jari Saarinen, Janne Paanajärvi and Pekka Forsman<br />

Automation and Systems technology, Aalto University, Finland<br />

• An efficient algorithm that calculates<br />

globally optimal pose estimate<br />

• The algorithm is based on best-first<br />

branch and bound search in the<br />

relative pose space between the data<br />

sets<br />

• Provides also an estimation of the<br />

accuracy<br />


The measurement is evaluated in<br />

the center of the cell, for which<br />

the score and upper bound<br />

are estimated<br />

14:45–14:50 MoAT2.4<br />

Corrective Gradient Refinement for<br />

Mobile Robot Localization<br />

Joydeep Biswas, Brian Coltin, Manuela Veloso<br />

School of Computer Science, Carnegie Mellon University, USA<br />

• Corrective Gradient Refinement uses<br />

state space derivatives of observation<br />

likelihood to refine proposal distributions<br />

• Efficient, Analytic computation using<br />

Vector Maps and 2D (Laser rangefinder)<br />

and 3D (Kinect Depth Camera) point cloud<br />

observation models<br />

• Experimentally shown to be more<br />

accurate, and require far fewer particles<br />

than Sampling / Importance Resampling<br />

Monte Carlo Localization<br />

• Long term trials (> 13km traversed indoors<br />

over 19 days) demonstrate reliability<br />


CGR efficiently samples<br />

directions of uncertainty based on<br />

observations



14:50–14:55 MoAT2.5<br />

A Monocular Vision-based System<br />

for 6D Relative Robot Localization<br />

Andreas Breitenmoser, Laurent Kneip and Roland Siegwart<br />

Autonomous Systems Laboratory, ETH Zurich, Switzerland<br />

• Novel relative localization system with<br />

two complementary modules:<br />

• Camera module: monocular vision<br />

• Target module: 4 active or passive<br />

markers<br />

• 6D relative localization in 3D space,<br />

based on target design, P3P-localization,<br />

blob extraction & pose prediction<br />

• Experimental validation on a quadrotor<br />

helicopter and on a team of e-puck robots<br />

• Accuracies of a few centimeters in<br />

position & a few degrees in orientation<br />

Relative localization modules<br />

installed on three e-puck robots<br />

15:00–15:15 MoAT2.7<br />

Loop-closure candidates selection by exploiting<br />

structure in vehicle trajectory<br />

J. Nieto, G. Agamennoni and T. Vidal-Calleja<br />

Australian Centre for Field Robotics, The University of Sydney, Australia<br />

• Detecting loops is a key problem in robot<br />

localisation.<br />

• Most of the current techniques use<br />

observations of the environment as main<br />

features to produce loop-hypotheses.<br />

• In this paper we investigate the feasibility<br />

of producing loop candidates from<br />

features of the robot trajectory.<br />

• We present an alignment likelihood<br />

function to measure similarity between<br />

trajectory sequences.<br />

• Experimental results with data collected in<br />

Sydney CBD are used to validate our<br />

approach.<br />

Estimated trajectory with loop<br />

candidates selected<br />

14:55–15:00 MoAT2.6<br />

Accurate Human Motion Capture in Large Areas<br />

by Combining IMU- and Laser-Based<br />

People Tracking<br />

Jakob Ziegler, Henrik Kretzschmar,<br />

Cyrill Stachniss, and Wolfram Burgard<br />

Dept. of Computer Science, University of Freiburg, Germany<br />

Giorgio Grisetti<br />

Dept. of Systems and Computer Science, Sapienza University of Rome, Italy<br />

• Estimate the globally aligned full body<br />

posture of a person<br />

• Combine an inertial motion capture suit<br />

with laser-based people tracking<br />

• Formulate a joint least squares<br />

optimization problem to compute a<br />

maximum likelihood trajectory estimate<br />

given all measurements<br />

• Experimental evaluation in large outdoor<br />

scenes<br />

15:15–15:30 MoAT2.8<br />

Optimized Motion Strategies for Localization in<br />

Leader-Follower Formations<br />

Xun S. Zhou*, Ke X. Zhou†, and Stergios I. Roumeliotis*<br />

*Dept. of Computer Science & Engineering, Univ. of Minnesota, USA<br />

†Dept. of Electrical & Computer Engineering, Univ. of Minnesota, USA<br />

• Main issue: Robot-to-robot pose is<br />

unobservable using distance- or bearing-only<br />

measurements, when moving in formation.<br />

• Key idea: Allow follower to deviate from the<br />

desired formation within a predefined region<br />

to minimize the localization uncertainty.<br />

• Formulated as non-convex optimization<br />

problems<br />

• Key contribution:<br />

(a) Analyze the trade-off between the<br />

formation and the estimation accuracy;<br />

(b) Analytically compute the global optimal<br />

motion strategy (by solving a system of<br />

multivariate polynomials).<br />

• Performance: indistinguishable from grid-based<br />

exhaustive search, and significantly<br />

better than state of the art.<br />


The follower (F) seeks the next<br />

best pose within a disk centered<br />

at the desired position p_o with<br />

radius r, such that the localization<br />

uncertainty is minimized.


Session MoAT3 Continental Parlor 3 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Robot Audition: Active Audition<br />

Chair Patrick Danès, Univ. de Toulouse ; LAAS-CNRS ; UPS ; F-31077<br />

Co-Chair Ian Lane, Carnegie Mellon Univ. Silicon Valley<br />

14:00–14:15 MoAT3.1*<br />

Semi-Plenary Invited Talk: An Overview of the<br />

BINAAHR Project<br />

Patrick Danès, Université de Toulouse ; LAAS-CNRS ; UPS ;<br />

F-31077<br />

Hiroshi G. Okuno, Kyoto University<br />

14:30–14:45 MoAT3.3<br />

Active Soft Pinnae for Robots<br />

Makoto Kumon and Yoshitaka Noda<br />

Graduate School of Science and Technology, Kumamoto University, Japan<br />

• A novel pinna system that can deform its<br />

shape actively is proposed.<br />

• The pinna has a soft deformable skin<br />

made of silicone rubber, and it is covered<br />

with fur.<br />

• Kinematic model of the pinna is<br />

investigated, and it is validated by<br />

experiments.<br />

• Measured acoustic characteristics show<br />

that the pinna can focus the sound coming<br />

from the front, and that it can encode the<br />

elevation of the sound source as a<br />

frequency cue.<br />

14:15–14:30 MoAT3.2<br />

Assessment of Single-channel<br />

Ego Noise Estimation Methods<br />

Gökhan Ince 1,2 , Kazuhiro Nakadai 1,2 , Tobias Rodemann 3 ,<br />

Jun-ichi Imura 2 , Keisuke Nakamura 1 and Hirofumi Nakajima 1<br />

1- Honda Research Institute Japan Co., Ltd. 2- Tokyo Institute of Tech., Japan<br />

3- Honda Research Institute Europe GmbH<br />

• Problem: Ego noise (Fan noise + motor<br />

noise generated during robot motions)<br />

• Method: Instantaneous estimation of ego<br />

noise using parameterized templates<br />

• Evaluation: Performance comparison<br />

with other state-of-the-art single-channel<br />

noise estimation methods<br />

• Results:<br />

1) Reduced estimation error<br />

2) Reduced log spectral distortion<br />

3) Improved signal-to-noise ratio<br />

4) Improved word correct rates<br />

during the arm and head motion of a<br />

humanoid robot in the presence of<br />

changing background noise<br />


Figure: noisy signal vs. enhanced signal.<br />

14:45–14:50 MoAT3.4<br />

Particle-filter Based Audio-visual Beat-tracking<br />

for Music Robot Ensemble with Human Guitarist<br />

Tatsuhiko Itohara, Takuma Otsuka, Takeshi Mizumoto<br />

Tetsuya Ogata, and Hiroshi G. Okuno<br />

Graduate School of Informatics, Kyoto University, Japan<br />

• Beat-tracking, the estimation of tempo<br />

and beat times, is critical to a high-<br />

quality musical ensemble.<br />

• Most conventional methods fail to<br />

adapt to features of guitar<br />

performance such as back-beats.<br />

• Our method integrates audio and<br />

visual information with a particle<br />

filter.<br />

• Experimental results show robustness<br />

to these guitar features and real-time<br />

performance.<br />

An ensemble by a thereminist<br />

robot and a human guitarist



14:50–14:55 MoAT3.5<br />

Optimizing a Reconfigurable Robotic<br />

Microphone Array<br />

Eric Martinson, Thomas Apker, Magda Bugajska<br />

Navy Center for Applied Research in Artificial Intelligence<br />

Naval Research Laboratory, USA<br />

Figure: omnidirectional camera and microphones with fiducials.<br />

• A distributed, robotic microphone array is deployed to localize sound<br />

sources in predetermined regions.<br />

• The array shape (relative microphone locations) impacts sound<br />

localization accuracy.<br />

• Random array shapes often perform poorly.<br />

• Optimizing robot positions for the region of interest improves sound<br />

localization accuracy by > 70%.<br />

15:00–15:15 MoAT3.7<br />

Acoustic Models and Kalman Filtering Strategies<br />

for Active Binaural Sound Localization<br />

Alban Portello † , Patrick Danès † , Sylvain Argentieri ‡<br />

† CNRS-LAAS, Université de Toulouse, France<br />

‡ ISIR-CNRS, UPMC, France<br />

• A generic framework for the localization of<br />

a moving sound source from a moving<br />

binaural sensor is presented.<br />

• An accurate model of binaural cues in the<br />

presence of motion is set up.<br />

• A multi-hypothesis estimation scheme is<br />

proposed, which can ensure a correct<br />

(non-overconfident) localization with no<br />

prior knowledge.<br />

• The strategy is shown to resolve front-back<br />

ambiguities and to provide range<br />

information about the source.<br />

• The approach is validated on simulated<br />

and experimental data.<br />

(a) Sensor motion and emitter<br />

position estimate in the world<br />

frame. (b) Range and azimuth<br />

estimates in the sensor frame<br />

over time<br />

14:55–15:00 MoAT3.6<br />

Incremental Learning for<br />

Ego Noise Estimation of a Robot<br />

Gökhan Ince 1,2 , Kazuhiro Nakadai 1,2 , Tobias Rodemann 3 ,<br />

Jun-ichi Imura 2 , Keisuke Nakamura 1 and Hirofumi Nakajima 1<br />

1- Honda Research Institute Japan Co., Ltd. 2- Tokyo Institute of Tech., Japan<br />

3- Honda Research Institute Europe GmbH<br />

• Problem: Template estimation suffers<br />

from constantly growing database size<br />

• Method: Incremental learning of the<br />

templates with an update mechanism<br />

• Results:<br />

1) Reduced database size<br />

2) Reduced log spectral distortion<br />

3) Reduced noise estimation error<br />

4) Improved signal-to-noise ratio<br />

5) Improved word correct rates<br />

during the arm and head motion of a<br />

humanoid robot in the presence of<br />

changing background noise<br />


Figure: number of templates in the database vs. number of motion<br />

repetitions, for standard template learning and incremental learning of templates.<br />

15:15–15:30 MoAT3.8<br />

Intelligent Sound Source Localization and Its<br />

Application to Multimodal Human Tracking<br />

Keisuke Nakamura, Kazuhiro Nakadai<br />

Futoshi Asano, and Gökhan Ince<br />

Honda Research Institute Japan Co., Ltd., Japan<br />

• The proposed human tracking system for<br />

real environments using intelligent sound<br />

source localization (SSL) achieves:<br />

1. Intelligent sound source tracking based<br />

on the type of sound source (e.g.,<br />

speech or music) by integrating<br />

SSL with sound source identification<br />

2. Highly-robust human tracking by<br />

multimodal integration of a microphone<br />

array, ToF camera, and thermo camera<br />

even when sensor information is<br />

missing or ambiguous due to occlusion<br />

or acoustic noise.<br />

Intelligent Human<br />

Tracking System


Session MoAT4 Continental Ballroom 4 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Telerobotics<br />

Chair Angelika Peer, Tech. Univ. München<br />

Co-Chair M. Cenk Cavusoglu, Case Western Res. Univ.<br />

14:00–14:15 MoAT4.1*<br />

Semi-Plenary Invited Talk: Development of VGO<br />

Robotic Telepresence System<br />

Thomas Ryden, VGO Communications<br />

14:30–14:45 MoAT4.3<br />

Hybrid Virtual-Proxy Based Control Framework<br />

for Passive Bilateral Teleoperation over Internet<br />

Ke Huang* and Dongjun Lee +<br />

* MABE Department, University of Tennessee - Knoxville, USA<br />

+ School of MAE, Seoul National University, South Korea<br />

• Novel hybrid control framework for bilateral<br />

teleoperation over packet-switched<br />

networks with varying delay, packet loss,<br />

data swapping, etc.<br />

• Key innovation is to exploit the often-<br />

overlooked hybrid nature of teleoperation<br />

(i.e., continuous robot, discrete comm.,<br />

and sampled-data control)<br />

• Closed-loop passivity guaranteed by a<br />

combination of discrete/adjustable virtual<br />

proxy (VP) damping and continuous/unadjustable<br />

device damping<br />

• The large device damping required by VP-less<br />

direct PD-coupling (IFAC'11) under<br />

severe comm. imperfections is removed<br />

System structure of VP based<br />

bilateral teleoperation<br />

14:15–14:30 MoAT4.2<br />

Semi-Plenary Invited Talk: Development of VGO<br />

Robotic Telepresence System<br />

Thomas Ryden, VGO Communications<br />

14:45–14:50 MoAT4.4<br />

Mutual Telexistence Surrogate System: TELESAR4<br />

- telexistence in real environments using<br />

autostereoscopic immersive display -<br />

Susumu Tachi, Kouichi Watanabe, Keisuke Takeshita,<br />

Kouta Minamizawa, Takumi Yoshida, and Katsunari Sato<br />

Graduate school of Media Design, Keio University, Japan<br />

• TELESAR4 affords a remote person the<br />

opportunity to virtually participate in some<br />

event, such as a party or a meeting.<br />

• The system provides the remote<br />

participant with a life size 360˚ wide-range<br />

stereo view of the event venue with local<br />

participants in real time.<br />

• The local participants at the event are able<br />

to see the face and expressions of the<br />

remote participant in real time.<br />

• The system also allows the remote<br />

participant to move freely about the venue<br />

and to perform some manipulatory tasks<br />

such as handshakes and gestures.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–9–<br />


(a) Remote participant virtually<br />

attending an event.<br />

(b) Local participants and<br />

surrogate robot in the venue.


Session MoAT4 Continental Ballroom 4 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Telerobotics<br />

Chair Angelika Peer, Tech. Univ. München<br />

Co-Chair M. Cenk Cavusoglu, Case Western Res. Univ.<br />

14:50–14:55 MoAT4.5<br />

Experiments of Passivity-Based Bilateral Aerial<br />

Teleoperation of a Group of UAVs with<br />

Decentralized Velocity Synchronization<br />

Paolo Robuffo Giordano * , Antonio Franchi * , Cristian Secchi ** , and<br />

Heinrich H. Bülthoff * ***<br />

* Max Planck Institute for Biological Cybernetics, Germany<br />

** DISMI, University of Modena and Reggio Emilia, Italy<br />

*** DBCE, Korea University, Korea<br />

• Decentralized Bilateral Teleoperation<br />

of a group of UAVs<br />

• Time-varying topology (arbitrary<br />

split/join) is handled in a stable (passive)<br />

way<br />

• Formal proof of steady-state velocity<br />

synchronization via a variable damping<br />

action<br />

• Experiments with 4 quadcopters (slave-side)<br />

and an Omega.3 force-feedback<br />

device (master-side)<br />

Experiments of 4 Quadcopters<br />

teleoperated in a cluttered<br />

environment<br />

15:00–15:15 MoAT4.7<br />

Network Representation and Passivity of<br />

Delayed Teleoperation Systems<br />

J. Artigas, C. Preusche, G. Hirzinger<br />

Institute of Robotics and Mechatronics, German Aerospace Center, Germany<br />

J-H. Ryu<br />

Biorobotics Laboratory, Korea University of Technology and Education, South Korea<br />

• General network-based<br />

representation, analysis and<br />

design.<br />

• A solution to the ambiguity of the<br />

channel causality (when no effort<br />

or flow signal is transmitted).<br />

• A framework to design<br />

teleoperation architectures:<br />

• Time-delay independence<br />

• For any channel architecture<br />

• Stability through Time Domain<br />

Passivity.<br />
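The Time Domain Passivity idea named in the last bullet can be illustrated with a generic observer/controller sketch in the spirit of Hannaford and Ryu's classic formulation; the sign convention, gains, and thresholds below are illustrative assumptions, not this paper's network representation:<br />

```python
def passivity_observer(forces, velocities, dt):
    """Running energy balance of a one-port network element.

    A non-negative balance at every sample means the element has behaved
    passively so far (Time Domain Passivity observer).
    """
    energy = 0.0
    history = []
    for f, v in zip(forces, velocities):
        energy += f * v * dt  # power flowing into the port, integrated
        history.append(energy)
    return history


def passivity_controller(f, v, energy, dt):
    """Adaptive damper that dissipates exactly the observed active energy.

    Impedance-type causality assumed: a damping term is added to the force
    output whenever the observer reports a negative energy balance.
    """
    if energy < 0.0 and abs(v) > 1e-9:
        alpha = -energy / (dt * v * v)  # just enough damping for passivity
        return f + alpha * v, 0.0       # modified force, balance reset
    return f, energy
```

The added dissipation alpha * v * v * dt cancels the observed active energy in one step; real implementations spread it over several samples to limit force spikes.<br />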

14:55–15:00 MoAT4.6<br />

Design of Single-Operator-Multi-Robot<br />

Teleoperation Systems with<br />

Random Communication Delays<br />

Yunyi Jia, Ning Xi and John Buether<br />

Electrical and Computer Engineering Department,<br />

Michigan State University, USA<br />

• Propose a system structure for<br />

designing Single-Operator-Multi-<br />

Robot Teleoperation Systems<br />

• Non-time based control is applied<br />

to cope with random<br />

communication delays<br />

• Convenient to change the multi-robot<br />

structure, like adding or<br />

removing robots<br />


Single-Operator-Multi-Robot<br />

Teleoperation System<br />

15:15–15:30 MoAT4.8<br />

Controlling Telerobotic Operations Adaptive to<br />

Quality of Teleoperator and Task Dexterity<br />

Yunyi Jia, Ning Xi, Fei Wang, Yunxia Wang and Xin Li<br />

Electrical and Computer Engineering Department,<br />

Michigan State University, USA<br />

• Propose a concept named Quality<br />

of Teleoperator (QoT)<br />

• Integrate the quality of<br />

teleoperator into teleoperation<br />

system<br />

• Adjust controller parameters<br />

adaptive to QoT and Task<br />

dexterity<br />

• Teleoperation efficiency and<br />

safety are enhanced<br />

Telerobotic systems adaptive to QoT and TDI


Session MoAT5 Continental Ballroom 5 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Bio-inspired Robotics I<br />

Chair Justin Seipel, Purdue<br />

Co-Chair Xinyan Deng, Purdue Univ.<br />

14:00–14:15 MoAT5.1*<br />

Semi-Plenary Invited Talk: Robustness in Animals as<br />

Inspiration for the Next Generation Robot<br />

Robert Full, University of California at Berkeley<br />

14:30–14:45 MoAT5.3<br />

A MATLAB Framework for Efficient Gait Creation<br />

C. David Remy and Roland Siegwart, Fellow, IEEE<br />

Autonomous Systems Lab, ETH Zurich, Switzerland<br />

Keith Buffinton<br />

Dept. of Mech Engr., Bucknell University, Lewisburg PA, USA<br />

• The paper introduces a framework for the<br />

creation of efficient gaits that can exploit<br />

natural dynamics.<br />

• The presented tools and methods are<br />

illustrated with the following three<br />

examples:<br />

• The stability analysis of a passive<br />

dynamic walker<br />

• An analysis of the efficiency (COT) of<br />

an actuated monopod hopper<br />

• The control of a bounding quadruped<br />

14:15–14:30 MoAT5.2*<br />

Semi-Plenary Invited Talk: Snake-inspired Robots<br />

Shigeo Hirose, Tokyo Institute of Technology<br />

14:45–14:50 MoAT5.4<br />

A Controller for Continuous Wave<br />

Peristaltic Locomotion<br />

Alexander S. Boxerbaum, Andrew D. Horchler, Roger D. Quinn<br />

Mechanical Engineering, Case Western Reserve University, USA<br />

Kendrick M. Shaw, Hillel J. Chiel<br />

Biology, Case Western Reserve University, USA<br />

• Recent successes with a continuously deformable wormlike robot have<br />

informed the investigation of novel soft-bodied control strategies<br />

• A dynamic analysis of peristalsis suggests ways to minimize slippage<br />

• A Wilson-Cowan model of<br />

neuronal populations is explored<br />

as a means to generate many<br />

different waveforms<br />

• Unlike CPG networks, this<br />

model can readily change its<br />

spatial and temporal<br />

frequencies, even coming to a<br />

complete stop<br />


A wormlike robot with a continuously<br />

deformable outer mesh
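The Wilson-Cowan population model named in the third bullet above can be sketched in a few lines; the coupling weights and inputs below are standard illustrative values from the literature, not the gains of this robot's wave generator:<br />

```python
import math

def wilson_cowan_step(E, I, dt, wEE=16.0, wEI=12.0, wIE=15.0, wII=3.0,
                      P=1.25, Q=0.0, tau=1.0):
    """One explicit-Euler step of the Wilson-Cowan mean-field equations."""
    S = lambda x: 1.0 / (1.0 + math.exp(-x))    # sigmoidal firing-rate function
    dE = (-E + S(wEE * E - wEI * I + P)) / tau  # excitatory population
    dI = (-I + S(wIE * E - wII * I + Q)) / tau  # inhibitory population
    return E + dt * dE, I + dt * dI

def simulate(steps=2000, dt=0.01):
    """Integrate from a small initial activity; activities stay in (0, 1)."""
    E, I = 0.1, 0.1
    traj = []
    for _ in range(steps):
        E, I = wilson_cowan_step(E, I, dt)
        traj.append((E, I))
    return traj
```

Because the external inputs P and Q modulate the oscillation directly, such a model can change its spatial and temporal frequencies, or stop, without the fixed limit cycle of a conventional CPG network.<br />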



14:50–14:55 MoAT5.5<br />

Energetics of Bio-Inspired Legged Robot<br />

Locomotion with Elastically-Suspended Loads<br />

Jeffrey Ackerman and Justin Seipel<br />

School of Mechanical Engineering, Purdue University, USA<br />

• Loads such as batteries, electronics, and<br />

fuel can be elastically-suspended from a<br />

robot<br />

• Elastically-suspended loads increase the<br />

energy efficiency of legged locomotion<br />

• Thus, a robot’s speed, operating time, or<br />

load carrying capacity could be increased<br />

Prototype robot with an<br />

elastically-suspended load<br />

15:00–15:15 MoAT5.7<br />

Descending Commands to an Insect Leg Controller<br />

Network Cause Smooth Behavioral Transitions<br />

Brandon Rutter, Brian Taylor, John Bender, Roy Ritzmann and<br />

Roger Quinn<br />

Mech. & Aero Eng., and Biology, Case Western Reserve Univ., USA<br />

William Lewinger<br />

Inst. Of Perception, Action & Behavior, U. of Edinburgh, UK<br />

Marcus Blümel<br />

Dept. of Animal Physiology, Inst. For Zoology, U. of Cologne, Germany<br />

• Leg Controller Network (LegConNet)<br />

controls forward stepping of a robotic<br />

model cockroach leg.<br />

• Hypothesized additional pathways allow<br />

the model to smoothly transition to insect-like<br />

turning behaviors.<br />

• The modification of these pathways and<br />

their strengths is equivalent to descending<br />

commands in the animal<br />

• Several experiments were run showing<br />

the operation of the system under differing<br />

timing of descending commands.<br />

14:55–15:00 MoAT5.6<br />

A Rapidly Reconfigurable Robot for Assistance<br />

in Urban Search and Rescue<br />

Alexander Hunt, Richard Bachmann and Roger Quinn<br />

Department of Mechanical and Aerospace Engineering<br />

Case Western Reserve University, USA<br />

Robin Murphy<br />

Department of Computer Science and Engineering, Texas A&M, USA<br />

• USAR Whegs™ successfully implements<br />

features desirable for search and rescue<br />

on a mobile robot<br />

• Relatively lightweight and compact<br />

• Four wheel-legs and differential steering<br />

• Carbon fiber wheel-legs for strength and<br />

reduction of rotary inertia by factor of eight<br />

• Quick change devices allow the robot to<br />

be changed to/from tracks and wheel-legs<br />

• Geosystems Zippermast for elevating<br />

camera field of view up to 8 feet (2.4<br />

meters)<br />


Robot with mast raised next to a<br />

74 inch (188 cm) tall man<br />

15:15–15:30 MoAT5.8<br />

Virtual Chassis for Snake Robots<br />

David Rollinson and Howie Choset<br />

The Robotics Institute, Carnegie Mellon University, USA<br />

• A new method for determining<br />

intuitive body coordinate frames for<br />

snake robots.<br />

• Separates internal motion due to the<br />

robot’s shape changes from external<br />

motion in the world.<br />

• A general method is presented for<br />

any shape configuration.<br />

• Additional refinements can be made<br />

in certain cases that exploit the<br />

symmetry of the robot’s shape.


Session MoAT6 Continental Ballroom 6 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Field Robotics I<br />

Chair Satoshi Tadokoro, Tohoku Univ.<br />

Co-Chair David A. Anisi, ABB<br />

14:00–14:15 MoAT6.1*<br />

Semi-Plenary Invited Talk: Rescue mobile robot<br />

Quince --Toward emergency response to the nuclear<br />

accident at Fukushima Daiichi Nuclear Power Plant on<br />

March <strong>2011</strong><br />

Keiji Nagatani, Tohoku University<br />

Tomoaki Yoshida, Chiba Institute of Technology<br />

14:30–14:45 MoAT6.3<br />

Perception for a River Mapping Robot<br />

Andrew Chambers, Supreeth Achar, Stephen Nuske, Jörn Rehder,<br />

Lyle Chamberlain, Justin Haines, Sebastian Scherer, Sanjiv Singh<br />

Robotics Institute, Carnegie Mellon University, USA<br />

Bernd Kitt<br />

The Institute of Measurement and Control Systems,<br />

Karlsruhe Institute of Technology, Germany<br />

• Perception system for mapping rivers with<br />

thick canopy<br />

• 360° obstacle detection with off-axis<br />

spinning laser<br />

• Adaptive appearance model for river<br />

detection and map generation<br />

• Graph-based global state estimation from<br />

visual odometry, inertial measurements,<br />

and intermittent GPS<br />

3D Point Cloud overlaid on 2km<br />

river traverse<br />

14:15–14:30 MoAT6.2<br />

Semi-Plenary Invited Talk: Rescue mobile robot<br />

Quince --Toward emergency response to the nuclear<br />

accident at Fukushima Daiichi Nuclear Power Plant on<br />

March <strong>2011</strong><br />

Keiji Nagatani, Tohoku University<br />

Tomoaki Yoshida, Chiba Institute of Technology<br />

14:45–14:50 MoAT6.4<br />

Real-world demonstration of sensor-based<br />

robotic automation in oil & gas facilities<br />

David A. Anisi, Erik Persson and Clint Heyer<br />

Department for Strategic R&D for Oil, Gas<br />

and Petrochemicals, ABB, Norway<br />

• On-site demonstration of autonomous<br />

valve manipulation and thermal inspection<br />

• Both operations were requested by the<br />

current site operators<br />

• Run amidst live hydrocarbon processes<br />

• The first system performing sensor-based,<br />

close-contact operations in a real<br />

operational environment (ATEX)<br />

• Uses a torque overload detection method to<br />

avoid over-tightening/loosening the valve<br />

• Relies on four independent layers of safety<br />

to meet the level of robustness, accuracy<br />

and reliability required by the industry<br />


Industrial robot certified for running<br />

in explosive atmospheres (ATEX)



14:50–14:55 MoAT6.5<br />

Offshore Robotics –<br />

Survey, Implementation, Outlook<br />

Kai Pfeiffer, Alexander Bubeck and Matthias Bengel<br />

Robot Systems Department, Fraunhofer Institute for<br />

Manufacturing Engineering and Automation (IPA), Germany<br />

• Survey of production operation tasks on<br />

offshore installations shows potential for<br />

robotic automation<br />

• MIMROex: the world's first ATEX-<br />

certifiable mobile inspection robot tested<br />

on an offshore installation<br />

• Extension of robot control system for<br />

assisted tele-manipulation for<br />

interaction with process equipment<br />

Mobile inspection robot MIMROex<br />

on an offshore installation in the<br />

South China Sea<br />

15:00–15:15 MoAT6.7<br />

Combining Radar and Vision for Self-Supervised<br />

Ground Segmentation in Outdoor Environments<br />

Annalisa Milella<br />

Institute of Intelligent Systems for Automation, CNR, Italy<br />

Giulio Reina<br />

Department of Engineering for Innovation, University of Salento, Italy<br />

James Underwood, and Bertrand Douillard<br />

Australian Centre for Field Robotics, University of Sydney, Australia<br />

• Self-supervised radar-vision classification<br />

system that allows an autonomous vehicle<br />

to automatically construct online a visual<br />

model of the ground and perform accurate<br />

ground segmentation.<br />

• Automatic online labeling based on a<br />

radar ground segmentation approach prior<br />

to image analysis to avoid time consuming<br />

manual labeling to construct the training<br />

set.<br />

• Adaptive ground model learning by<br />

retraining the classifier using the most<br />

recent radar scans.<br />

14:55–15:00 MoAT6.6<br />

Active camera control with obstacle avoidance for<br />

remote operations with industrial manipulators:<br />

Implementation and experimental results<br />

Magnus Bjerkeng and Kristin Pettersen<br />

Department of Engineering Cybernetics, NTNU, Norway<br />

Aksel Transeth, Erik Kyrkebø and Sigurd Fjerdingen<br />

SINTEF ICT Applied Cybernetics, Norway<br />

• A controller for dynamic surveillance of<br />

remote operations for enhanced<br />

remote awareness is presented.<br />

• A pseudoinverse controller based on<br />

the stereographic projection achieves<br />

real-time camera tracking and tunable<br />

camera distance.<br />

• Obstacle avoidance is achieved using<br />

artificial potential fields respecting joint<br />

limitations without oscillations.<br />

• Experiments using industrial<br />

manipulators are presented.<br />

15:15–15:30 MoAT6.8<br />

Assessing the Deepwater Horizon Oil Spill with<br />

the Sentry Autonomous Underwater Vehicle<br />

James C. Kinsey 1 , Dana R. Yoerger 1 , Michael V. Jakuba 2 ,<br />

Rich Camilli 1 , Charles R. Fisher 3 , and Christopher R. German 1<br />

1 Woods Hole Oceanographic Institution, USA<br />

2 Australian Centre for Field Robotics, Australia<br />

3 Pennsylvania State University, USA<br />

• The Sentry AUV is an autonomous<br />

underwater robot capable of operations to<br />

4500m<br />

• In 2010, Sentry was deployed on two<br />

cruises in response to the Deepwater<br />

Horizon Oil Spill<br />

• In June 2010, Sentry was equipped with<br />

the TETHYS mass spectrometer [Camilli<br />

and Duryea, 2009] and tracked an<br />

underwater oil plume up to 35km from the<br />

spill site<br />

• Another cruise in December 2010 did<br />

bathymetric and photo surveys to assess<br />

the impact of the spill on coral<br />

communities<br />


Top: Sentry AUV at the DWH Oil Spill.<br />

Bottom: Methane and water velocity<br />

measurements of the plume<br />

[Camilli et al., 2010]


Session MoAT7 Continental Parlor 7 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Wheeled Robots<br />

Chair Yasuhisa Hirata, Tohoku Univ.<br />

Co-Chair Matthew Spenko, Illinois Inst. of Tech.<br />

14:00–14:15 MoAT7.1<br />

SO(2) and SO(3), Omni-Directional Personal Mobility<br />

with Link-Driven Spherical Wheels<br />

Sungsuk Ok, Atsushi Kodama, Yuma Matsumura<br />

and Yoshihiko Nakamura<br />

Mechano-Informatics, The University of Tokyo, Japan<br />

• This paper proposes the Link-Driven<br />

Spherical Wheel (LDSW), composed of three<br />

serial links jointed to a spherical wheel.<br />

• The designed controller enables the LDSW<br />

to move in any direction.<br />

• The SO(3), which mounts three LDSWs, is<br />

developed for a hospital bed or personal<br />

mobility device, and achieves omni-directional<br />

motion with the designed controller.<br />

• The SO(2), which mounts two LDSWs, is<br />

developed for personal mobility and tested<br />

with a controller based on the dynamics<br />

and kinematics of the vehicle.<br />

The overview of Link-Driven<br />

Spherical Wheel & Mobilities<br />

( LDSW, SO(2) and SO(3) )<br />

14:30–14:45 MoAT7.3<br />

Wheel-Soil Interaction Model for Rover<br />

Simulation Based on Plasticity Theory<br />

Ali Azimi, Jozsef Kovecses, and Jorge Angeles<br />

Department of Mechanical Engineering<br />

and Centre for Intelligent Machines<br />

McGill University, Canada<br />

• A novel approach based on plasticity<br />

theory is introduced here, which is<br />

computationally efficient<br />

• The simulation results are verified with<br />

experimentally validated semiempirical<br />

models in steady-state<br />

operations<br />

• This approach can address some of<br />

the shortcomings of the semi-empirical<br />

models, like slip-sinkage phenomenon<br />

• Our approach is well suited for<br />

implementation in a multibody<br />

dynamics simulation environment like<br />

Vortex<br />

Sojourner Simulation in Vortex<br />

14:15–14:30 MoAT7.2<br />

Design and control of an active anti-roll system<br />

for a fast rover<br />

Mohamed Krid and Faiz Ben Amar<br />

Institut des Systèmes Intelligents et de Robotiques<br />

Université Pierre et Marie Curie UPMC Paris 6, CNRS UMR7222, France<br />

• Fast off-road rover with high clearance<br />

and large suspension displacement<br />

• Design of an innovative anti-roll active<br />

device and its kinematic modeling<br />

• Roll dynamics modeling<br />

• Stability metrics defined by Lateral Load<br />

Transfer<br />

• Design of a model predictive control MPC<br />

for roll control and optimal stability<br />

considering the path following control<br />
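The Lateral Load Transfer metric named above has a standard closed form: the normalized difference between left and right vertical tire loads, reaching ±1 when one wheel side lifts off. A minimal sketch under a rigid-vehicle, steady-state assumption (not this paper's MPC formulation):<br />

```python
def lateral_load_transfer(fz_left, fz_right):
    """LLT from measured vertical tire loads; |LLT| = 1 at wheel lift-off."""
    total = fz_left + fz_right
    if total <= 0.0:
        raise ValueError("vertical loads must sum to a positive value")
    return (fz_left - fz_right) / total

def llt_steady_state(a_y, h, track, g=9.81):
    """Rigid-vehicle steady-state LLT: lateral acceleration a_y [m/s^2]
    acting at CoM height h [m] over the track width [m], with suspension
    roll neglected (an assumption of this sketch)."""
    return 2.0 * a_y * h / (g * track)
```

A roll controller can then bound |LLT| below 1 by limiting lateral acceleration or, as here, by actively shifting load with the anti-roll device.<br />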



14:45–14:50 MoAT7.4<br />

Using Unmanned Ground Vehicle Performance<br />

Measurements as a Unique Method of Terrain<br />

Classification<br />

Siddharth Odedra<br />

Autonomous Systems Lab, Middlesex University, UK<br />

• An Unmanned Ground Vehicle's biggest<br />

obstacle is terrain, which can be<br />

unpredictable, unstructured and<br />

untraversable.<br />

• Systems with more knowledge about<br />

terrain will have greater success when<br />

operating in unforgiving conditions.<br />

• It is proposed that terrain can be classified<br />

by measuring its effect on vehicle<br />

performance.<br />

Vehicle platform used to<br />

measure the vehicle-terrain<br />

interaction.



14:50–14:55 MoAT7.5<br />

A Multi-tiered Robust Steering Controller Based<br />

on Yaw Rate and Side Slip Estimation<br />

Ming Xin and Mark Minor<br />

Mechanical Engineering Department, University of Utah, USA<br />

• Model considers curvature based vehicle<br />

kinematics, and slip angle and yaw rate<br />

dynamics.<br />

• Multi-tiered sliding mode control well<br />

matched to vehicle sensors, actuators,<br />

and dynamics.<br />

• High gain observer estimates slip angle<br />

and yaw rate for output feedback control<br />

• Simulations and experiment verify<br />

performance at high and low speeds<br />

The Experimental Platform –<br />

“Red Rover”<br />
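The high-gain observer named in the third bullet can be sketched for a double integrator with measured position; the gains, the small parameter eps, and the test signal are illustrative assumptions, not the paper's vehicle model with slip-angle and yaw-rate states:<br />

```python
def high_gain_observer(ys, dt, eps=0.05, a1=2.0, a2=1.0):
    """Estimate position x1 and velocity x2 from position samples ys.

    Double-integrator observer form: the output-injection gains scale as
    1/eps and 1/eps**2, so the estimation error decays quickly for small
    eps (at the cost of amplified measurement noise).
    """
    x1, x2 = ys[0], 0.0
    estimates = []
    for y in ys:
        e = y - x1                          # output estimation error
        x1 += dt * (x2 + (a1 / eps) * e)    # position estimate update
        x2 += dt * (a2 / (eps * eps)) * e   # velocity estimate update
        estimates.append((x1, x2))
    return estimates
```

For a ramp input (constant velocity), the estimated velocity converges to the true value, which is the property output-feedback sliding-mode controllers rely on.<br />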

15:00–15:15 MoAT7.7<br />

A Diameter-Dependent Terramechanics Model<br />

for Small-Wheeled UGVs on Deformable Terrain<br />

Gareth Meirion-Griffith and Matthew Spenko<br />

Mechanical, Materials and Aerospace Engineering Department, Illinois Institute<br />

of Technology, U.S.A.<br />

• In this paper a diameter-dependent<br />

pressure-sinkage model is introduced.<br />

• This model is used to derive a modified<br />

terramechanics framework for small-wheeled<br />

UGVs.<br />

• This framework is used to simulate vehicle<br />

performance on deformable terrains.<br />

• Field tests using an experimental UGV are<br />

used to validate this model.<br />

Elliptical distribution of normal<br />

stress at the wheel-soil interface<br />
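For context, the classical (diameter-independent) Bekker pressure-sinkage relation that such terramechanics models extend is p = (k_c/b + k_phi) z^n. The soil constants below are illustrative dry-sand-like values from the terramechanics literature, not parameters identified in this paper:<br />

```python
def bekker_pressure(z, b, k_c=0.99e3, k_phi=1528.43e3, n=1.1):
    """Normal pressure [Pa] under a plate of width b [m] at sinkage z [m]."""
    if z < 0.0:
        raise ValueError("sinkage must be non-negative")
    return (k_c / b + k_phi) * z ** n

def static_sinkage(load, contact_area, b, k_c=0.99e3, k_phi=1528.43e3, n=1.1):
    """Invert the pressure-sinkage relation for a given static load [N]."""
    pressure = load / contact_area
    return (pressure / (k_c / b + k_phi)) ** (1.0 / n)
```

Because the classical relation depends only on contact-patch width b, not wheel diameter, small wheels are poorly predicted, which is the gap a diameter-dependent model addresses.<br />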

14:55–15:00 MoAT7.6<br />

Differential flatness of a front-steered vehicle<br />

with tire-force control<br />

Steven Peters and Karl Iagnemma<br />

Dept. of Mechanical Engineering, MIT, USA<br />

Emilio Frazzoli<br />

Lab for Information and Decision Systems (LIDS), MIT, USA<br />

• The position of the front center of<br />

oscillation of a bicycle is a flat output<br />

• Applying tire force control allows this<br />

point to track desired trajectories,<br />

even at slip angles of 10-20 degrees<br />

• Structure in the yaw dynamics resembling a<br />

Liénard system enables nonlinear<br />

stability analysis of the third degree of<br />

freedom (the yaw dynamics)<br />

15:15–15:30 MoAT7.8<br />

Development of Passive type Double<br />

Wheel Caster Unit based on Analysis of<br />

Feasible Braking Force and Moment Set<br />

Masao Saida, Yasuhisa Hirata and Kazuhiro Kosuge<br />

Engineering, Tohoku University, Japan<br />

• We introduce passive mobile robots<br />

consisting of various wheels which are<br />

controlled by servo brakes.<br />

• We analyze a feasible braking<br />

force/moment necessary for controlling<br />

the robot with servo brakes.<br />

• We reveal the advantages and<br />

disadvantages of the feasible braking<br />

force/moment of each robot.<br />

• We propose a new passive type double<br />

wheel caster unit, called PDC.<br />


Feasible Braking Force and Moment<br />

Passive type Double Wheel Caster Unit<br />


Session MoAT8 Continental Parlor 8 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Learning Parameterized Policies<br />

Chair Jan Peters, Darmstadt Univ. of Tech.<br />

Co-Chair Petar Kormushev, Istituto Italiano di Tecnologia<br />

14:00–14:15 MoAT8.1<br />

Bipedal Walking Energy Minimization<br />

by Reinforcement Learning with<br />

Evolving Policy Parameterization<br />

Petar Kormushev, S. Calinon, N. G. Tsagarakis, D. G. Caldwell<br />

Advanced Robotics, Italian Institute of Technology (IIT), Italy<br />

Barkan Ugurlu<br />

Toyota Technological Institute (TTI), Japan<br />

• New reinforcement learning approach<br />

evolves the policy parameterization<br />

dynamically during the learning process<br />

• Advantages:<br />

• Faster convergence<br />

• Avoids local optima<br />

• Decreased computation time<br />

• Applications:<br />

• Function approximation task<br />

• Bipedal robot walking energy<br />

consumption minimization<br />

• Variable-CoM-height ZMP-based walk<br />

generator efficiently exploits the passive<br />

compliance of the robot<br />

Passively-compliant robot COMAN<br />

walking with varying CoM-height<br />

14:30–14:45 MoAT8.3<br />

Learning Anticipation Policies<br />

for Robot Table Tennis<br />

Zhikun Wang 1,2 , Christoph H. Lampert 3 , Katharina Mülling 1,2 ,<br />

Bernhard Schölkopf 1 , Jan Peters 1,2<br />

1 MPI for Intelligent Systems 2 TU Darmstadt 3 IST Austria<br />

• The robot learns to play table tennis against human players.<br />

• We develop an anticipation system that allows the robot to<br />

• perceive the opponent’s moves;<br />

• predict the opponent's intention;<br />

• react accordingly with optimized hitting plans.<br />

• The anticipation policies are learned using reinforcement learning.<br />

• Figure. The optimal hitting plan is chosen by the learned anticipation policies.<br />

14:15–14:30 MoAT8.2<br />

Learning Motion Primitive Goals<br />

For Robust Manipulation<br />

Freek Stulp, Evangelos Theodorou, Mrinal Kalakrishnan,<br />

Peter Pastor, Ludovic Righetti, Stefan Schaal<br />

Computational Learning and Motor Control Lab,<br />

University of Southern California, USA<br />

• Contributions:<br />

• Simplify the PI2 model-free<br />

reinforcement learning<br />

algorithm<br />

• Enable PI2 to simultaneously<br />

learn the shape and goal (i.e.<br />

end-point) of a motion primitive<br />

• Apply to learning motion<br />

primitives for manipulation that<br />

are robust to uncertainty in<br />

object position<br />

• Evaluated on an 11-DOF<br />

manipulation platform<br />

14:45–14:50 MoAT8.4<br />

Learning Elementary Movements<br />

Jointly with a Higher Level Task<br />

Jens Kober and Jan Peters<br />

Department of Empirical Inference, MPI for Intelligent Systems, Germany<br />

Intelligent Autonomous Systems Group, TU Darmstadt, Germany<br />

• Learning primitive movements and higher<br />

level strategies at the same time.<br />

• Lower level learned by Cost-regularized<br />

Kernel Regression.<br />

• Transition probabilities of the higher level<br />

estimated from the lower level.<br />

• Approach evaluated on a side-stall-style<br />

throwing game both in simulation and with<br />

a real BioRob<br />




14:50–14:55 MoAT8.5<br />

Learning Interaction Control Policies<br />

by Demonstration<br />

Vasiliki Koropouli, Dongheui Lee and Sandra Hirche<br />

Institute of Automatic Control Engineering,<br />

Technical University of Munich, Germany<br />

• Learning interaction forces to interact with compliant environments.<br />

• Modeling of forces by an autonomous control policy based on parallel<br />

force/position control.<br />

• Parameterization and learning of the policy by Locally Weighted<br />

Regression.<br />

• Implemented in virtual manipulation tasks using a 2-DOF haptic device.<br />
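Locally Weighted Regression, named in the third bullet, can be sketched as a Gaussian-weighted linear fit around each query point. This 1-D closed form with an assumed bandwidth is a generic illustration, not the authors' policy parameterization:<br />

```python
import math

def lwr_predict(xq, xs, ys, h=0.3):
    """Weighted linear fit y = a + b*x around query point xq (1-D)."""
    w = [math.exp(-((x - xq) ** 2) / (2.0 * h * h)) for x in xs]  # kernel
    sw = sum(w)
    mx = sum(wi * x for wi, x in zip(w, xs)) / sw   # weighted means
    my = sum(wi * y for wi, y in zip(w, ys)) / sw
    cov = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
    var = sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
    b = cov / var if var > 1e-12 else 0.0           # local slope
    return my + b * (xq - mx)
```

Because each prediction is a fresh local fit, adding demonstrations only changes the policy near the new data, a property that makes LWR convenient for learning from demonstration.<br />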

15:00–15:15 MoAT8.7<br />

Adapting Control Policies for Expensive Systems to<br />

Changing Environments<br />

Matthew Tesch, Jeff Schneider and Howie Choset<br />

Robotics Institute, Carnegie Mellon University, United States of America<br />

• Optimal control parameters can vary<br />

significantly from environment to<br />

environment<br />

• Optimizing expensive systems is<br />

restrictive in even a single environment<br />

• Control policies define a mapping from<br />

environment parameters to controller<br />

parameters<br />

• Learning good policies for expensive<br />

systems requires reasoning about related<br />

environments and careful experiment<br />

selection<br />

A predicted model of an objective<br />

function, with the corresponding<br />

optimal policy.<br />

14:55–15:00 MoAT8.6<br />

Motion Generation by Reference-Point-Dependent<br />

Trajectory HMMs<br />

Komei Sugiura, Naoto Iwahashi, and Hideki Kashioka<br />

NICT, Japan<br />

• Imitation learning method for object-manipulating motions such as<br />

“place X on Y” and “raise Z”<br />

• EM algorithm and cross-validation are used for estimating the optimal<br />

coordinate system and state number<br />

• Reference-point-dependent trajectory HMM is used for generating<br />

smooth trajectory<br />

Imitation learning of motion “throw-away”<br />

15:15–15:30 MoAT8.8<br />

Online movement adaptation<br />

based on previous sensor experiences<br />

Peter Pastor 1 , Ludovic Righetti 1 ,<br />

Mrinal Kalakrishnan 1 , and Stefan Schaal 1,2<br />

1 CLMC Lab, University of Southern California, USA<br />

2 Max Planck Institute for Intelligent Systems, Tübingen, Germany<br />

• Stereotypical movements make it possible to<br />

accumulate previous sensor<br />

experiences and generate predictive<br />

models for subsequent task executions<br />

• These sensor traces are used to adapt<br />

the movement plan online to react to<br />

unexpected perturbations<br />

• Tight feedback integration is achieved<br />

using Dynamic Movement Primitives<br />

and a compliant control architecture<br />

• Our approach combines reactive<br />

behaviors with stereotypical<br />

movements to create a rich set of<br />

possible motions that account for<br />

external perturbations and perception<br />

uncertainty to generate truly robust<br />

behaviors<br />
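The Dynamic Movement Primitives named in the third bullet can be sketched as an Ijspeert-style transformation system; here the learned forcing term is omitted so the primitive simply converges to the goal g, and all gains and time constants are illustrative assumptions:<br />

```python
def dmp_rollout(x0, g, tau=1.0, K=100.0, D=None, dt=0.001, steps=2000):
    """Roll out a minimal discrete DMP transformation system.

    tau * v' = K * (g - x) - D * v    (spring-damper pulled toward goal g)
    tau * x' = v
    A learned forcing term would be added to the spring-damper to shape
    the trajectory; without it the rollout is a smooth reach to g.
    """
    D = 2.0 * K ** 0.5 if D is None else D  # critical damping by default
    x, v = x0, 0.0
    traj = [x]
    for _ in range(steps):
        v += dt * (K * (g - x) - D * v) / tau
        x += dt * v / tau
        traj.append(x)
    return traj
```

Online adaptation of the kind described above can then be implemented by modulating g or the forcing term from the accumulated sensor traces during the rollout.<br />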



Session MoAT9 Continental Parlor 9 Monday, September 26, <strong>2011</strong>, 14:00–15:30<br />

Novel Actuators I<br />

Chair Han-Pang Huang, National Taiwan Univ.<br />

Co-Chair Stephen Mascaro, Univ. of Utah<br />

14:00–14:15 MoAT9.1<br />

Design of a New Variable Stiffness Actuator and<br />

Application for Assistive Exercise Control<br />

Tzu-Hao Huang, Jiun-Yih Kuan, and Han-Pang Huang*<br />

Department of Mechanical Engineering, National Taiwan University, Taiwan<br />

• Design concept of CCEA: antagonistically<br />

identical four-bar linkages with two<br />

extensional linear springs<br />

• A human-robot interaction model is<br />

proposed to investigate the properties of<br />

the system itself and its stability during<br />

physical human-robot interaction<br />

• Two control models are proposed to<br />

investigate the performance of assistive<br />

exercise control<br />

• CCEA exoskeleton is built for elbow<br />

rehabilitation. Both simulations and<br />

experiments are conducted to show some<br />

desired properties of the proposed CCEA<br />

system<br />


14:30–14:45 MoAT9.3<br />

Optimal energy density piezoelectric twisting<br />

actuators<br />

Ben Finio and Robert Wood<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

• Piezoelectric actuators are frequently<br />

useful in microrobotics applications due to<br />

their compact size<br />

• We present a torsional piezoelectric<br />

actuator<br />

• Modeling using laminate-plate theory is<br />

verified with experimental actuator<br />

performance testing<br />


14:15–14:30 MoAT9.2<br />

A New Variable Stiffness Actuator (CompAct-<br />

VSA): Design and Modelling<br />

Nikos G. Tsagarakis, Irene Sardellitti, Darwin G. Caldwell<br />

Department of Advanced Robotics, Istituto Italiano di Tecnologia, Italy<br />

• This work introduces the mechanics, the<br />

principle of operation and the model of a new<br />

variable stiffness actuator.<br />

• The actuator regulates the compliance<br />

using an amplification mechanism based<br />

on the lever arm principle with a<br />

continuously regulated pivot point.<br />

• The proposed principle permits the<br />

realization of an actuation unit with a wide<br />

range of stiffness and a fast stiffness<br />

regulation response.<br />

• Preliminary results are presented to<br />

demonstrate the fast stiffness regulation<br />

response and the wide range of stiffness<br />

achieved by the proposed design.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–19–<br />

The CompAct-VSA unit<br />
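The lever-arm principle with a movable pivot can be illustrated with a toy computation (a generic sketch, not the authors' actual CompAct-VSA model): a spring of stiffness k_s acts at distance b from the pivot and the load at distance a on the other side, so the stiffness seen at the load scales as k_s·(b/a)². Sliding the pivot continuously sweeps the output stiffness over a wide range.

```python
def output_stiffness(k_s, pivot, length=1.0):
    """Effective output stiffness of a lever with a movable pivot.

    A spring (stiffness k_s) acts at one end of a lever of total length
    `length`; the load acts at the other end; `pivot` is the distance
    from the spring-side end to the pivot (0 < pivot < length).
    Torque balance about the pivot gives k_out = k_s * (b / a)**2,
    where b = pivot (spring arm) and a = length - pivot (load arm).
    """
    b = pivot
    a = length - pivot
    return k_s * (b / a) ** 2

# Sweeping the pivot from the load end toward the spring end
# sweeps the apparent stiffness from very soft to very stiff.
for p in (0.1, 0.5, 0.9):
    print(p, output_stiffness(k_s=1000.0, pivot=p))
```

The quadratic dependence on the pivot position is what allows a small, fast pivot displacement to produce a large change in stiffness.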

14:45–14:50 MoAT9.4<br />

A Nonlinear Series Elastic Actuator for Highly<br />

Dynamic Motions<br />

• Called the HypoSEA<br />

• Designed specifically for force<br />

control of jumping robots<br />

• Nonlinear torque-angle<br />

relationship and 40J energy<br />

storage capability are optimized<br />

for hopping motions<br />

• Very robust against shock<br />

Ivar Thorson and Darwin Caldwell<br />

Department of Advanced Robotics,<br />

Istituto Italiano di Tecnologia, Italy<br />

The HypoSEA Actuator



14:50–14:55 MoAT9.5<br />

Development of a 3-DOF Inchworm Mechanism<br />

Organized by a Pair of Y-shaped Electromagnets<br />

and 6 Piezoelectric Actuators<br />

-Design, Principle, and Experiments of Translational Motions-<br />

O. Fuchiwaki, M. Yatsurugi, S. Omura, and K. Arafuka<br />

Yokohama National University<br />

Hodogaya, Yokohama, Kanagawa, Japan<br />

• Newly developed 3-DOF inchworm<br />

mechanism with 6 contact points<br />

on a surface<br />

• 3-DOF simple harmonic vibration<br />

model to get the input signals<br />

Improved positioning<br />

repeatability, especially when it<br />

carries a payload<br />

• A VTR demonstration of the<br />

omnidirectional motion is also<br />

available to show how it works<br />

Y-shaped<br />

electromagnet<br />

Piezoelectric<br />

actuator<br />

Newly-developed mechanism with<br />

better positioning repeatability<br />

15:00–15:15 MoAT9.7<br />

Dielectric Elastomer Bender Actuator Applied to<br />

Modular Robotics<br />

Paul White, Stella Latscha,<br />

Steve Schlaefer, and Mark Yim<br />

Mechanical Engineering and Applied Mechanics<br />

University of Pennsylvania, USA<br />

• Comparison of several actuation<br />

technologies with respect to important<br />

module metrics indicates dielectric<br />

elastomer actuation is a promising<br />

technology<br />

• Design, fabrication, and experimental<br />

characterization of a 30mm length scale<br />

dielectric elastomer actuated module<br />

• Six different modular robot configurations<br />

comprising two or three modules<br />

• Verified parallel actuation with two<br />

modules lifting twice the load a single<br />

module can<br />

0V<br />

3500V<br />

Single module (top); two module<br />

chain unactuated (middle) and<br />

actuated (bottom)<br />

14:55–15:00 MoAT9.6<br />

Wet Shape Memory Alloy Actuated Robotic<br />

Heart with Thermofluidic Feedback<br />

Matt Pierce and Stephen Mascaro<br />

Mechanical Engineering, University of Utah, USA<br />

• First ever successful implementation of a<br />

self-sustaining thermofluidically powered<br />

SMA pump<br />

• Distributes thermofluidic energy to arrays<br />

of SMA actuators that function as robotic<br />

muscles<br />

• Designed and optimized by varying key<br />

parameters through modeling, simulation,<br />

and experimentation<br />

• Capable of a net output of 66 mL/min<br />


Wet SMA Robotic Heart Prototype<br />

15:15–15:30 MoAT9.8<br />

Stretchable Circuits and Sensors<br />

For Robotic Origami<br />

Jamie K Paik, Rebecca K Kramer and Robert J Wood<br />

Microrobotics Laboratory,<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

• We present two types of stretchable<br />

circuits that are directly applicable to the<br />

low-profile, high-strain environment of<br />

robotic origami: copper-mesh circuit and<br />

eGaIn-filled PDMS micro channel circuit.<br />

• Both circuits are directly connected to the<br />

SMA actuator module that produces<br />

independent folding with a 0°–180° range of<br />

motion and 3.5 mNm torque.<br />

The copper-mesh circuit has strategic slits that<br />

are embedded within polymer to provide<br />

high conductivity during actuator motions.<br />

eGaIn circuit can produce strain up to<br />

160% before failure.<br />

Two flexible circuits integrated<br />

with robotic origami tile modules.<br />

The left shows the copper-mesh<br />

circuit and the right is the eGaIn<br />

circuit. Both circuits have trace<br />

widths of 1mm. For the eGaIn<br />

circuit, the curvature sensor is<br />

embedded to the right of the<br />

actuator.


Session MoBT1 Continental Parlor 1 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Micromanipulation<br />

Chair Tatsuo Arai, Osaka Univ.<br />

Co-Chair Jake Abbott, Univ. of Utah<br />

16:00–16:15 MoBT1.1<br />

Image-Based Magnetic Control of<br />

Paramagnetic Microparticles in Water<br />

Jasper D. Keuning, Jeroen de Vries, Leon Abelmann<br />

and Sarthak Misra<br />

University of Twente, The Netherlands<br />

• Implementation of a system for controlling<br />

the position of a 100μm paramagnetic<br />

microparticle using electromagnets.<br />

• The position of the particle is determined<br />

by processing microscopic images.<br />

• The particle is positioned with an accuracy<br />

of 8.4μm, achieving speeds of 235μm/s.<br />

• The particle can follow a circular or a<br />

figure-eight path.<br />

Image of the setup and results of the<br />

particle tracking a figure-eight path<br />
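The vision-in-the-loop positioning described above can be pictured with a toy proportional controller (a generic sketch with assumed gains and a two-coil-pair 2-D setup, not the paper's actual control law): the tracked position error is mapped to bounded coil currents that pull the particle toward its waypoint.

```python
import numpy as np

def coil_currents(position, target, gain=0.5, max_current=1.0):
    """Map a tracked particle's position error to currents for two
    orthogonal electromagnet pairs. `gain` and `max_current` are
    illustrative values; a paramagnetic particle in viscous fluid moves
    slowly toward the stronger field, so a clipped proportional law is
    a reasonable first sketch."""
    error = np.asarray(target, float) - np.asarray(position, float)
    currents = gain * error                        # proportional pull toward target
    return np.clip(currents, -max_current, max_current)

# One control step: particle tracked at (120, 80) um, waypoint at (100, 100) um.
print(coil_currents((120.0, 80.0), (100.0, 100.0)))  # -> [-1.  1.] (saturated)
```

Tracking a figure-eight path then amounts to stepping the waypoint along the curve as the image-processing loop reports new positions.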

16:30–16:45 MoBT1.3<br />


16:15–16:30 MoBT1.2<br />

Automated Micromanipulation of a Microhand<br />

with All-In-Focus Imaging System<br />

Chanh-Nghiem Nguyen, Kenichi Ohara, Ebubekir AVCI,<br />

Tomohito Takubo, Yasushi Mae and Tatsuo Arai<br />

Department of Systems Innovation, Graduate School of Engineering Science,<br />

Osaka University, Japan<br />

• In this paper, automated<br />

micromanipulation using a two-fingered<br />

microhand was<br />

performed<br />

• 3D positions of both end-effectors<br />

and target objects were calculated<br />

using All-In-Focus imaging system<br />

• Line-Type Pattern Matching is<br />

proposed to calculate 3D positions<br />

of end-effectors<br />

• Automated micromanipulation was<br />

demonstrated with a single pick-and-place<br />

task of microspheres<br />


Target microsphere<br />

Static microhand<br />

50µm<br />

Target is<br />

released<br />

Automated pick-and-place<br />

(clockwise from Top-Left)<br />

Target is<br />

picked up<br />

16:45–16:50 MoBT1.4<br />

Modeling and Design of Magnetic Sugar<br />

Particles Manipulation System for Fabrication of<br />

Vascular Scaffold<br />

Chengzhi Hu, Carlos Tercero, Seiichi Ikeda, Toshio Fukuda<br />

Department of Micro-nano Systems Engineering, Nagoya University, Japan<br />

Fumihito Arai<br />

Department of Mechanical Science and Engineering, Nagoya University, Japan<br />

Makoto Negoro<br />

Department of Neurosurgery, Fujita Health University, Japan<br />

• Proposing a method for developing<br />

magnetically controllable templates for<br />

particulate leaching by encapsulating<br />

magnetic microparticles inside sugar<br />

microspheres to realize 3D manipulation<br />

of magnetic particle used for structured<br />

porogen-based fabrication of vascular<br />

scaffolds.<br />

• Control methodology based on Maxwell<br />

coils and Helmholtz coils was illustrated<br />

and we theoretically analyzed the<br />

movement properties of MSP<br />

• A prototype magnetic motion control<br />

system for MSP/MSP cluster is presented



16:50–16:55 MoBT1.5<br />

Toward Intuitive Teleoperation of Micro/Nano-<br />

Manipulators with Piezoelectric Stick-Slip Actuators<br />

Manikantan Nambi, Aayush Damani, and Jake J. Abbott<br />

Mechanical Engineering, University of Utah, USA<br />

• Our goal is to enable intuitive open-loop<br />

teleoperated rate control of<br />

micro/nano-manipulators without<br />

any closed-loop feedback from the<br />

vision system.<br />

• Two algorithms are developed for<br />

rate control of stick-slip-actuator-based<br />

micro/nano-manipulators.<br />

• A method for incorporating open-loop<br />

models of the manipulator is<br />

developed.<br />

• Experimental results for a<br />

positioning task in the horizontal<br />

plane are presented.<br />

(top) Experimental setup.<br />

(bottom) Simulation results moving from initial<br />

to target position. Algorithm 1 shows<br />

undesirable drift at very low velocities.<br />

17:00–17:15 MoBT1.7<br />

Evaluation and Application of Thermoresponsive<br />

Gel Handling towards Manipulation of Single Cells<br />

Masaru Takeuchi, Masaru Kojima and Toshio Fukuda<br />

Department of Micro-Nano Systems Engineering, Nagoya University, Japan<br />

Masahiro Nakajima<br />

Center for Micro-nano Mechatronics, Nagoya University, Japan<br />

• Evaluation of the Thermoresponsive Gel<br />

probe (GeT probe) was conducted<br />

• The temperature at the probe tip was<br />

analyzed using Finite Element Method<br />

(FEM) analysis<br />

• Fixation force was measured using AFM<br />

cantilever<br />

• Two methods were proposed to handle<br />

microbeads in the pure water by the GeT<br />

probe<br />

GeT probe<br />

Microbead<br />

AFM cantilever<br />

Thermoresponsive gel 50 μm<br />

50 μm<br />

Fixation force measurement<br />

using AFM cantilever<br />

16:55–17:00 MoBT1.6<br />

Pairing and Moving Swarm of Micro Particles into<br />

Array with a Robot-tweezer Manipulation System<br />

Haoyao Chen<br />

Mechanical Engineering and Automation, HITSZ, P.R. China<br />

Dong Sun<br />

MEEM, City University of Hong Kong, Hong Kong<br />

• Present the batch manipulation of<br />

particles by using integrated robotics and<br />

holographic optical tweezers technologies<br />

• A controller is proposed to drive pairs of<br />

particles to the assigned regions which<br />

are centered at array points.<br />

• The potential field method is utilized to<br />

avoid collisions between particles.<br />

• Experiments on colloidal particles are<br />

performed to demonstrate the<br />

effectiveness of the proposed approach.<br />
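The collision-avoidance bullet above refers to the standard attractive/repulsive potential-field construction; a minimal gradient-descent step in the classic Khatib form (illustrative gains, not the paper's controller) looks like:

```python
import numpy as np

def potential_field_step(positions, goals, k_att=1.0, k_rep=0.05,
                         influence=2.0, dt=0.1):
    """One step of a basic attractive/repulsive potential field.
    Each particle is attracted to its assigned array point and repelled
    from neighbours closer than `influence` (all gains illustrative)."""
    positions = np.asarray(positions, float)
    goals = np.asarray(goals, float)
    force = k_att * (goals - positions)              # attraction to array points
    for i in range(len(positions)):
        for j in range(len(positions)):
            if i == j:
                continue
            d = positions[i] - positions[j]
            dist = np.linalg.norm(d)
            if 0 < dist < influence:                 # short-range repulsion
                force[i] += k_rep * (1/dist - 1/influence) * d / dist**3
    return positions + dt * force

# Two particles headed to two array points while avoiding each other.
pts = [[0.0, 0.0], [0.5, 0.0]]
goals = [[0.0, 2.0], [2.0, 0.0]]
print(potential_field_step(pts, goals))
```

In an optical-tweezers setting, each step's output would become the next set of trap positions rather than a direct velocity command.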

17:15–17:30 MoBT1.8<br />

Comparison on Experimental and Numerical<br />

Results for Helical Swimmers inside Channels<br />

Ahmet Fatih Tabak, Fatma Zeynep Temel,<br />

Serhat Yesilyurt<br />

Mechatronics Engineering, Sabanci University, Turkey<br />

• Magnetic swimmers with ~200 micron<br />

diameter Nd2Fe14B particles (body) and<br />

~2 mm long helix-shaped rigid wires (tail).<br />

• 1-mm diameter glass tube filled with<br />

glycerol at room temperature placed<br />

inside a Helmholtz coil arrangement.<br />

• Hydrodynamic model of the swimmer with<br />

external magnetic torque and wall effects.<br />

• Experimental data and simulation results<br />

agree well on the forward swimming<br />

motion inside the channel.<br />



Session MoBT2 Continental Parlor 2 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Localization with Constraints<br />

Chair Giuseppe Oriolo, Univ. di Roma<br />

Co-Chair Ethan Stump, US Army Res. Lab.<br />

16:00–16:15 MoBT2.1<br />

Mutual Localization using<br />

Anonymous Bearing Measurements<br />

Paolo Stegagno*, Marco Cognetti*<br />

Antonio Franchi**, Giuseppe Oriolo*<br />

*Università di Roma “La Sapienza”<br />

**Max Planck Institute for Biological Cybernetics, Germany<br />

• Decentralized mutual localization<br />

problem for multi-robot systems<br />

• Only anonymous relative bearing<br />

measurements are used (no global<br />

localization, no distances, no identity)<br />

• Novel algorithm for probabilistic<br />

multiple registration of these<br />

measurements<br />

• Extensive experimental validation<br />

and comparison with the case of<br />

bearing-plus-distance measurements<br />

experiments and simulations with<br />

4 ground robots<br />

16:30–16:45 MoBT2.3<br />

Monte Carlo Localization using 3D Texture Maps<br />

Yu Fu<br />

National Taiwan University of Science and Technology, Taiwan<br />

Stephen Tully, George Kantor, and Howie Choset<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• Monte Carlo Localization is<br />

adopted to localize a robot in a<br />

simplified 3D texture map<br />

• A simplified 3D texture map<br />

consists of vertical planes and<br />

their appearance<br />

• Particles converge faster using<br />

our method than with range-only<br />

MCL due to observing texture<br />

• Particles are more concentrated<br />

because the observed texture for<br />

a particle is generated from a<br />

texture map<br />

Localization in 3D texture map<br />
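The convergence claims above rest on standard Monte Carlo Localization; a minimal generic particle-filter step (with a stand-in 1-D range observation model instead of the paper's texture map) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

def mcl_step(particles, control, measurement, measure_fn, noise=0.1, sigma=0.5):
    """One generic Monte Carlo Localization step (illustrative sketch;
    `measure_fn` stands in for the observation model, which in the paper
    is a rendering of the 3D texture map). particles: (N, d) poses."""
    # 1. Predict: apply the control with added motion noise.
    particles = particles + control + rng.normal(0.0, noise, particles.shape)
    # 2. Weight: Gaussian likelihood of the measurement given each pose.
    predicted = np.array([measure_fn(p) for p in particles])
    w = np.exp(-0.5 * ((predicted - measurement) / sigma) ** 2)
    w /= w.sum()
    # 3. Resample: draw particles with probability proportional to weight.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy 1-D example: the robot measures its range to a wall at x = 10.
parts = rng.uniform(0.0, 10.0, size=(200, 1))
for _ in range(20):
    parts = mcl_step(parts, control=np.array([0.0]),
                     measurement=7.0, measure_fn=lambda p: 10.0 - p[0])
print(parts.mean())  # particles concentrate near x = 3
```

A richer observation model (such as rendered texture) makes the likelihood more peaked, which is exactly why the paper reports faster convergence than range-only MCL.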

16:15–16:30 MoBT2.2<br />

Robust Local Localization for Indoor<br />

Environments with Uneven Floors and<br />

Inaccurate Maps<br />

Peter Abeles<br />

Intelligent Automation Inc., USA<br />

• Fault-tolerant robotic localization for indoor<br />

environments using Laser Range Finder<br />

(LRF) .<br />

• Unlike past work, specifically designed to<br />

handle inaccurate maps and uneven<br />

floors.<br />

• Novel motion estimation equations without<br />

singularities in long hallways.<br />


Terrain at MOUT site<br />

16:45–16:50 MoBT2.4<br />

Active Target Localization<br />

for Bearing Based Robotic Telemetry<br />

Pratap Tokekar, Joshua Vander Hook and Volkan Isler<br />

Department of Computer Science,<br />

Univ. of Minnesota, USA<br />

• A novel robotic telemetry system<br />

for localizing radio-tagged invasive<br />

fish in frozen lakes<br />

• We present a technique for<br />

bearing-estimation from sparse<br />

measurements obtained from a<br />

directional radio antenna<br />

• Next, we study 3 active localization<br />

strategies to select sensing<br />

locations to minimize uncertainty in<br />

the location of a target<br />

• Our system can operate on frozen<br />

lakes and localize the target to<br />

within as little as one meter<br />

The Husky robot with radio tracking<br />

equipment (left top of chassis)<br />

and antenna during field testing<br />

on Lake Casey, MN



16:50–16:55 MoBT2.5<br />

Orientation Descriptors<br />

for Localization in Urban Environments<br />

Philip David and Sean Ho<br />

Army Research Laboratory, USA<br />

• Determine the 2D position and 3D<br />

orientation of a ground-level camera in<br />

urban environments<br />

• The urban model is a 2D building map<br />

generated from aerial imagery<br />

• Register building Footprint Orientation<br />

(FPO) descriptors calculated from the<br />

2D map and from omnidirectional<br />

ground imagery<br />

• Accuracy: 1 m in position and 1° in<br />

orientation<br />

17:00–17:15 MoBT2.7<br />

Monte Carlo Localization and Registration to<br />

Prior Data for Outdoor Navigation<br />

David Silver and Anthony Stentz<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• GPS provides positioning for outdoor<br />

navigation, but can be intermittent<br />

• GPS may not be properly aligned with<br />

prior sources of environmental data<br />

• A non-parametric sensor model can be<br />

learned between generic onboard and<br />

prior data<br />

• The learned model can be used to localize<br />

in the absence of GPS, or register GPS<br />

and prior data<br />

Comparison of GPS, odometry,<br />

and MCL localized paths<br />

16:55–17:00 MoBT2.6<br />

A Hybrid Estimation Framework for Cooperative<br />

Localization under Communication Constraints<br />

Esha D. Nerurkar + , Ke X. Zhou*, and Stergios I. Roumeliotis +<br />

+ Dept. of Comp. Sci. and Engineering, Univ. of Minnesota, USA<br />

*Dept. of Electrical and Comp. Engineering, Univ. of Minnesota, USA<br />

• Develop a novel hybrid estimation scheme for CL under severe bandwidth<br />

constraints (robot only allowed to communicate single bit per analog msr.)<br />

• Proposed scheme enables each robot to process quantized msrs. from<br />

other robots, as well as its own analog msrs.<br />

• Present an improved quantization rule by using more accurate estimates<br />

from the hybrid estimation framework<br />

• Hybrid scheme achieves higher estimation accuracy than existing<br />

estimators that process only quantized msrs.<br />
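The single-bit idea can be illustrated with the classic scalar sign-of-innovation update (a textbook scheme in the same spirit; the paper's hybrid estimator additionally fuses each robot's own analog measurements, which this sketch omits):

```python
import numpy as np
from math import pi, sqrt

def soi_update(x, P, bit, h=1.0, r=0.25):
    """Scalar sign-of-innovation update: process the single bit
    b = sign(z - h*x) instead of the analog measurement z. The estimate
    shifts toward the side the bit indicates by the conditional mean of
    the truncated Gaussian innovation; the variance shrinks by a factor
    2/pi of the usual Kalman reduction."""
    s = h * P * h + r                  # innovation variance
    k = P * h / sqrt(s)                # gain for the quantized measurement
    x = x + bit * sqrt(2.0 / pi) * k   # shift toward the indicated half-line
    P = P - (2.0 / pi) * k * k         # deterministic variance contraction
    return x, P

rng = np.random.default_rng(1)
x, P, true_x = 0.0, 4.0, 1.5
for _ in range(30):
    z = true_x + rng.normal(0.0, 0.5)    # analog measurement, never transmitted
    bit = 1.0 if z - x >= 0.0 else -1.0  # the single communicated bit
    x, P = soi_update(x, P, bit)
print(round(x, 2), P)
```

Despite communicating only one bit per measurement, the estimate converges toward the true value, which is the point of quantized cooperative localization under bandwidth constraints.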



Session MoBT3 Continental Parlor 3 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Symposium: Robot Audition: From Sound Source Localization to Automatic Speech Recognition<br />

Chair Kazuhiro Nakadai, Honda Res. Inst. Japan Co., Ltd.<br />

Co-Chair Hiroshi G. Okuno, Kyoto Univ.<br />

16:00–16:15 MoBT3.1*<br />

Semi-Plenary Invited Talk: Real-Time Large Vocabulary<br />

Conversational Speech Recognition for Robust<br />

Human-Robot Interaction<br />

Ian Lane, Carnegie Mellon University Silicon Valley<br />

16:30–16:45 MoBT3.3<br />

Variable Frame Rate Hierarchical Analysis for<br />

Robust Speech Recognition<br />

Jean ROUAT and Stéphane LOISELLE<br />

NECOTIS, GEGI, Univ. de Sherbrooke, Canada<br />

Stéphane MOLOTCHNIKOFF<br />

Dépt. biologie, Univ. de Montréal, Canada<br />

• Extraction of acoustical events<br />

corresponding to ONSETS, stable,<br />

ascending or descending 2nd-level<br />

complex features<br />

• Events are time markers on which a<br />

subsequent signal frame can be placed<br />

for speech recognition<br />

• The VFR yields more than a 50% increase<br />

in recognition rates on TI46-words corrupted<br />

with Aurora-2 noises at SNRs between 0<br />

and 20 dB<br />

• Robustness to reverberation is shown<br />

16:15–16:30 MoBT3.2*<br />

Semi-Plenary Invited Talk: Recent Advances on Noise<br />

Reduction and Source Separation Technology for<br />

Robot Audition<br />

Hiroshi Saruwatari, Nara Institute of Science and Technology<br />

16:45–16:50 MoBT3.4<br />

SLAM-based Online Calibration of<br />

Asynchronous Microphone Array for<br />

Robot Audition<br />

Hiroki Miura 1, Takami Yoshida 1,<br />

Keisuke Nakamura 2, Kazuhiro Nakadai 1,2<br />

1. Dept. of Mechanical and Environmental Informatics,<br />

Tokyo Institute of Technology, Japan<br />

2. Honda Research Institute Japan Co., Ltd, Japan<br />

• We present online calibration of an<br />

asynchronous microphone array in<br />

which a location and a clock delay of each<br />

microphone are unknown.<br />

• The calibration is done by handclaps of a<br />

person walking around the microphone<br />

array with EKF-SLAM which<br />

simultaneously estimates a location and a<br />

clock delay of each microphone and a<br />

location of a sound source (handclapping).<br />

• We showed that at most 14 handclaps<br />

are needed for calibration, after which a<br />

sound source is successfully localized<br />

with the calibrated microphone array.<br />


Mic locations and<br />

delay are unknown.<br />

Before calibration<br />

Calibration<br />

(EKF-SLAM)<br />

Sound<br />

location<br />

After calibration<br />

Spatial Spectrum of Beamforming



16:50–16:55 MoBT3.5<br />

HARK based Real-time Single Pane 3D Auditory<br />

Scene Visualizer Empowered by Speech Arrow<br />

Zheng Gong and Ichiro Hagiwara<br />

Tokyo Institute of Technology, Japan<br />

Kazuhiro Nakadai<br />

Honda Research Institute Japan Co., Ltd., Japan<br />

Tokyo Institute of Technology, Japan<br />

Hirofumi Nakajima<br />

Honda Research Institute Japan Co., Ltd., Japan<br />

• How to visually present information<br />

effectively to assist people in perception is<br />

a key issue for robot audition.<br />

• Speech Arrow is proposed to display<br />

recognized speech in animated<br />

arrow-shaped frames indicating sound source<br />

orientations.<br />

• A single pane auditory scene visualizer is<br />

implemented by combining visualization of<br />

several types of data in one window to<br />

increase robot auditory scene awareness.<br />

• The 3D vision technology is applied to<br />

enhance visual experiences.<br />

• The 3D sound is also employed to restore<br />

the original sound field.<br />

17:00–17:15 MoBT3.7<br />

A Scene-Associated Training Method for Mobile<br />

Robot Speech Recognition in Multisource<br />

Reverberated Environments<br />

Jindong Liu, Edward Johns and Guang-Zhong Yang<br />

The Hamlyn Centre, Imperial College London, United Kingdom<br />

• Aim: Train a speech recognition model<br />

working in varying reverberation<br />

environments<br />

• Measure room impulse<br />

response and group<br />

together similar auditory<br />

scenes to train a scene-dependent<br />

speech<br />

recognition model<br />

• Utilise vision to<br />

recognise scenes and<br />

call the corresponding<br />

recognition model<br />

16:55–17:00 MoBT3.6<br />

Multi-modal Front-end for Speaker Activity<br />

Detection in Small Meetings<br />

Jani Even, Panikos Heracleous, Carlos Ishi and Norihiro Hagita<br />

ATR Intelligent Robotics and Communication Laboratories, Kyoto, Japan<br />

• Use LRF based human tracker for<br />

monitoring the movement of the<br />

participants<br />

• Assign one audio stream to each of the<br />

participants using the position from the<br />

human tracker<br />

• Beamforming technique is used to reduce<br />

background noise and interference<br />

• Use a GMM trained beforehand to detect<br />

speaker activity and identify the active<br />

speakers<br />

17:15–17:30 MoBT3.8<br />

The effects of microphone array processing on<br />

pitch extraction in real noisy environment<br />

Carlos T. Ishi, Dong Liang, Hiroshi Ishiguro, Norihiro Hagita<br />

ATR Intelligent Robotics and Communication Labs, Kyoto, Japan<br />

• Pitch extraction is important for human-robot<br />

interaction, since pitch may carry<br />

information about intention, attitude or<br />

emotion expression<br />

• We propose a pitch extraction method<br />

by combining microphone array and<br />

auditory scene analysis technologies,<br />

and evaluate pitch extraction of multiple<br />

speakers in a robot’s real noisy<br />

environment<br />


Correlogram of baseline system<br />

Correlogram of proposed method<br />

Pitch of the target speech


Session MoBT4 Continental Ballroom 4 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Symposium: Robot Demonstrations<br />

Chair Oliver Brock, Tech. Univ. Berlin<br />

Co-Chair Kurt Konolige, Willow Garage<br />



Session MoBT5 Continental Ballroom 5 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Symposium: Bio-inspired Robotics II<br />

Chair K. H. Low, Nanyang Tech. Univ.<br />

Co-Chair Ravi Vaidyanathan, Imperial Coll. London<br />

16:00–16:15 MoBT5.1*<br />

Semi-Plenary Invited Talk: Robotics at Boston<br />

Dynamics<br />

Gabriel Nelson, Boston Dynamics<br />

16:30–16:45 MoBT5.3<br />

Design of a Miniature Integrated Multi-Modal<br />

Jumping and Gliding Robot<br />

Matthew A. Woodward and Metin Sitti<br />

Dept. of Mechanical Engineering, Carnegie Mellon University, USA<br />

• Multiple locomotion modes can<br />

enhance a system’s performance<br />

in unstructured terrain.<br />

• An integrated strategy for<br />

developing systems which exhibit<br />

multiple locomotion modes is<br />

proposed.<br />

• A highly integrated robot is<br />

developed which utilizes<br />

approximately 80% of the system<br />

mass for the jumping mode and<br />

67% for the gliding mode.<br />

• This robot is capable of jumping<br />

heights of 6 meters with a mass of<br />

under 100 grams.<br />

16:15–16:30 MoBT5.2*<br />

Semi-Plenary Invited Talk: Progress on 100 Milligram<br />

Robotic Insects<br />

Robert Wood, Harvard University<br />

16:45–16:50 MoBT5.4<br />

ScarlETH: Design and Control of a Planar<br />

Running Robot<br />

Marco Hutter, C. David Remy, Mark A. Höpflinger,<br />

Roland Siegwart<br />

Autonomous Systems Lab, ETH Zurich, Switzerland<br />

• Design of an articulated robotic leg<br />

based on high compliant series<br />

elastic actuation<br />

• Combination of versatility and<br />

efficiency:<br />

• precise torque control<br />

• fast position control<br />

• efficient energy storage<br />

• maximal range of motion<br />

• Virtual model control is applied for<br />

hopping in 2D:<br />

• virtual spring in vertical direction<br />

• angle of attack and force control<br />

in horizontal direction<br />
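The virtual-spring idea in the list above can be sketched with standard Jacobian-transpose virtual model control (generic textbook form with made-up link lengths and gains, not ScarlETH's actual controller): compute the Cartesian force a virtual spring-damper at the foot would exert, then map it to joint torques.

```python
import numpy as np

def leg_jacobian(q, l1=0.25, l2=0.25):
    """Foot-position Jacobian of a planar two-link leg (illustrative lengths)."""
    q1, q2 = q
    return np.array([
        [-l1*np.sin(q1) - l2*np.sin(q1+q2), -l2*np.sin(q1+q2)],
        [ l1*np.cos(q1) + l2*np.cos(q1+q2),  l2*np.cos(q1+q2)],
    ])

def virtual_spring_torques(q, z, zd, z0, k=2000.0, d=50.0):
    """Emulate a vertical spring-damper at the foot: compute the desired
    Cartesian force and map it to joint torques via tau = J^T f."""
    f = np.array([0.0, k * (z0 - z) - d * zd])  # [horizontal, vertical] force
    return leg_jacobian(q).T @ f

# Foot 5 cm below its virtual rest height: the spring pushes the body up.
tau = virtual_spring_torques(q=(0.6, -1.2), z=-0.40, zd=0.0, z0=-0.35)
print(tau)
```

Horizontal behavior (angle of attack and horizontal force) would add a nonzero first component to `f` in the same framework.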




16:50–16:55 MoBT5.5<br />

Modeling and Control on Hysteresis Nonlinearity<br />

in Biomimetic Undulating Fins<br />

Tianjiang Hu, Huayong Zhu, Han Zhou, Lincheng Shen<br />

College of Mechatronics and Automation, National University of Defense Technology (NUDT), China<br />

K. H. Low<br />

School of Mechanical and Aerospace Engineering, Nanyang Technological University (NTU), Singapore<br />

• Biomimetic undulating fins are considered<br />

with focus on their hysteresis nonlinearity,<br />

which causes effect on biomimetic<br />

investigation of fish swimming.<br />

• Hysteresis is confirmed with experimental<br />

data on RoboGnilos, and qualitative<br />

modeling of this nonlinearity is then<br />

achieved by using Preisach equations.<br />

• The developed iterative learning control is<br />

applied to eliminate hysteresis nonlinearity<br />

in biorobotic undulating fins.<br />

• Simulation and experiments show that the<br />

proposed control method is effective and<br />

feasible to improve the tracking<br />

performance of biorobotic fins.<br />

17:00–17:15 MoBT5.7<br />

Biologically Derived Models of the Sunfish for<br />

Investigations of Multi-Fin Swimming<br />

James Tangorra, Anthony Mignano, Gabe Carryon, and Jeff Kahn<br />

Department of Mechanical Engineering, Drexel University, USA<br />

• Biologically derived models of the<br />

sunfish, fin central pattern<br />

generators, and pectoral fin<br />

sensory systems were developed.<br />

• The biorobotic fish enables<br />

interdependencies of sub-systems<br />

to be investigated during multi-fin<br />

swimming.<br />

• Neural oscillator networks<br />

generate gaits for steady<br />

swimming and maneuvers, and<br />

makes stable transitions.<br />

• Relationships were determined<br />

between distributed sensory<br />

information and fin forces.<br />

Biorobotic sunfish, instrumented pectoral<br />

fins, and neural oscillator network.<br />

16:55–17:00 MoBT5.6<br />

Translational Damping on Flapping Cicada<br />

Wings<br />

Perry Parks, Bo Cheng, Zheng Hu, and Xinyan Deng<br />

School of Mechanical Engineering, Purdue University<br />

• Robotic insect thorax mechanism<br />

• Flaps at up to 65 Hz and weighs 2.86 g<br />

including the motor and wing<br />

• Translational damping when the body moves<br />

• Indirect measurement using a pendulum<br />

• Compared to analytical estimation from<br />

Blade-Element Analysis<br />

• Good agreement with estimation in<br />

forward-backward and left-right translation<br />

• Asymmetry in up-down translation<br />


Man-made (top) and cicada<br />

(bottom) wings mounted on the<br />

high frequency robotic thorax<br />

mechanism.<br />

17:15–17:30 MoBT5.8<br />

Dynamic Modeling of Robotic Fish and<br />

Its Experimental Validation<br />

Jianxun Wang, Freddie Alequin-Ramos and Xiaobo Tan<br />

Department of Electrical and Computer Engineering,<br />

Michigan State University, MI, USA<br />

• Model carangiform robotic fish<br />

motion by merging rigid-body<br />

dynamics with Lighthill’s large-amplitude<br />

elongated-body theory<br />

• Introduce an effective approach to<br />

the evaluation of time-varying drag<br />

force and moment coefficients<br />

• Show that the pressure force acting<br />

at the tail tip plays a significant role<br />

• Validate the proposed approach<br />

with experiments on a free-swimming<br />

robotic fish<br />

A robotic fish prototype with servo-driven<br />

rigid tail, used for model validation


Session MoBT6 Continental Ballroom 6 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Symposium: Field Robotics II<br />

Chair David A. Anisi, ABB<br />

Co-Chair Satoshi Tadokoro, Tohoku Univ.<br />

16:00–16:15 MoBT6.1*<br />

Semi-Plenary Invited Talk: Recent Developments in<br />

Robotics Technology and Flight Implementation at JPL<br />

Richard Volpe, Jet Propulsion Laboratory, Caltech<br />

16:30–16:45 MoBT6.3<br />

Path Planning and Evaluation for Planetary<br />

Rovers Based on Dynamic Mobility Index<br />

Genya Ishigami<br />

Japan Aerospace Exploration Agency, Japan<br />

Keiji Nagatani and Kazuya Yoshida<br />

Tohoku University, Japan<br />

• Three-step path planning and<br />

evaluation method is proposed.<br />

• Step 1: Path generation on a given<br />

terrain map with a basic planning<br />

algorithm.<br />

• Step 2: Dynamic simulation<br />

providing metrics for robotic<br />

mobility.<br />

• Step 3: Path evaluation based on<br />

dynamic mobility index.<br />

• Demonstrations for the path planning<br />

and evaluation are presented.<br />

Dynamic simulation profile for<br />

each candidate path<br />

16:15–16:30 MoBT6.2<br />

On On-Orbit Passive Object Handling by<br />

Cooperating Space Robotic Servicers<br />

Georgios Rekleitis and Evangelos Papadopoulos<br />

Department of Mechanical Engineering<br />

National Technical University of Athens, Athens, Greece<br />

• A technique for 3D on-orbit manipulation of a passive object by<br />

free-flying servicers is presented<br />

• On-off thrusters and proportional action<br />

manipulators used<br />

• Passive object trajectory<br />

tracking motion strategy and<br />

optimal end-effector contact point selection via a two-layer optimization<br />

• Novel, model based controller, adapted to the special requirements is<br />

presented. Performance and robustness are studied via simulations<br />

• Comparison with standard on-off control shows reduced fuel consumption<br />

16:45–16:50 MoBT6.4<br />

Control of a Passively Steered Rover<br />

using 3-D Kinematics<br />

Neal Seegmiller and David Wettergreen<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• 3-D kinematic control for<br />

passively-steered rovers<br />

• The passive-steering design is<br />

reliable and efficient, but<br />

challenging to control<br />

• The 3-D controller uses inertial &<br />

proprioceptive sensing for<br />

accurate steering on rough<br />

terrain<br />

• The controller was validated in<br />

both simulation and physical<br />

experiments<br />


Experiment, simulation, and diagram of the Zoë<br />

rover traversing an obstacle with 3-D control


Session MoBT6 Continental Ballroom 6 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Symposium: Field Robotics II<br />

Chair David A. Anisi, ABB<br />

Co-Chair Satoshi Tadokoro, Tohoku Univ.<br />

16:50–16:55 MoBT6.5<br />

Optical Flow Odometry with<br />

Robustness to Self-shadowing<br />

Neal Seegmiller and David Wettergreen<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• Efficient optical flow odometry for mobile<br />

robots using a single downward looking<br />

camera.<br />

• Robustness to self-shadowing errors,<br />

even in extreme cases when the<br />

shadow dominates the image.<br />

• Method: (1) prevent feature selection<br />

near shadow edges; (2) reject outliers<br />

using a rigidity constraint<br />

• Experimental results: the presented<br />

algorithm accurately estimates velocity<br />

when outlier rejection alone fails<br />

• Tested at 2 m/s (algorithm design permits<br />

higher speeds). Odometry error 2-3% of<br />

distance traveled.<br />

Tracked features in image<br />

dominated by the LATUV shadow<br />
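The shadow-robust odometry above rests on a simple rigidity argument: a single downward-looking camera over locally planar ground should see nearly identical flow at all tracked features, so vectors that disagree with the consensus (e.g. features locked onto the moving shadow edge) can be discarded. A minimal sketch of that consensus test; the flow values, threshold, and function name are illustrative, not the paper's implementation:

```python
import numpy as np

def estimate_velocity(flow, dt, threshold=0.25):
    """Estimate 2-D ground velocity from per-feature optical flow.

    Rigidity check: over locally planar ground seen by a downward
    camera, all flow vectors should roughly agree, so vectors far
    from the median flow (e.g. features stuck on a shadow edge)
    are rejected before averaging.
    """
    flow = np.asarray(flow, dtype=float)      # shape (N, 2), pixels/frame
    median = np.median(flow, axis=0)
    dev = np.linalg.norm(flow - median, axis=1)
    scale = np.linalg.norm(median) + 1e-9
    inliers = flow[dev <= threshold * scale]  # relative-deviation test
    return inliers.mean(axis=0) / dt          # pixels per second

# Mostly consistent flow plus two shadow-induced outliers.
flow = [(10.0, 0.1), (9.8, -0.1), (10.2, 0.0), (0.5, 4.0), (-3.0, 9.0)]
v = estimate_velocity(flow, dt=0.05)
```

Converting the pixel velocity to metric velocity would additionally require the camera height and intrinsics, which the sketch omits.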

17:00–17:15 MoBT6.7<br />

Time-Optimal Detumbling Maneuver<br />

along an Arbitrary Arm Motion<br />

during the Capture of a Target Satellite<br />

Tomohisa Oki, MDA Space Missions, Canada; Satoko Abiko, Tohoku University, Japan<br />

Hiroki Nakanishi, Japan Aerospace Exploration Agency, Japan; Kazuya Yoshida, Tohoku University, Japan<br />

• Time-optimal control method to<br />

stabilize a tumbling target satellite is<br />

proposed<br />

• Combined system dynamics between<br />

a free-floating robot system and a<br />

target satellite is presented<br />

• Detumbling operation along an<br />

arbitrary arm motion is parameterized<br />

by a single parameter<br />

• The concept of detumbling speed is<br />

proposed and is advantageous when<br />

handling parameter uncertainty in the<br />

target dynamics<br />

• Numerical simulation result verifies<br />

the proposed control scheme<br />

16:55–17:00 MoBT6.6<br />

Vision-based Space Autonomous Rendezvous:<br />

A Case Study<br />

Antoine Petit<br />

IRISA/INRIA Rennes, France<br />

Eric Marchand<br />

IRISA, Université de Rennes 1, France<br />

Keyvan Kanani<br />

Astrium, France<br />

• 3D model-based tracking for the<br />

navigation of the final phase of a space<br />

rendezvous.<br />

• Tests on a mock-up of a<br />

satellite with a 6-DOF<br />

robotic arm, in open loop.<br />

• Precise and real-time; robust to<br />

motion, orientation, and illumination changes.<br />

• 2 1/2 D visual servoing,<br />

using pose estimation,<br />

with satisfactory results.<br />

17:15–17:30 MoBT6.8<br />

3D SLAM for Planetary Worksite Mapping<br />

Chi Hay Tong and Timothy D. Barfoot<br />

Institute for Aerospace Studies, University of Toronto, Canada<br />

Erick Dupuis<br />

Space Exploration, Canadian Space Agency, Canada<br />

• 3D SLAM framework designed for the planetary worksite mapping task<br />

using a laser rangefinder mounted on a mobile rover platform<br />

• Utilizes a combination of sparse features and dense data for data<br />

association<br />

• Post-alignment automatic verification check to ensure map quality<br />

• Experimental validation at two planetary analogue test facilities<br />

Map of our planetary analogue test facility created by the 3D SLAM framework.<br />



Session MoBT7 Continental Parlor 7 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Teleoperation<br />

Chair Nikhil Chopra, Univ. of Maryland, Coll. Park<br />

Co-Chair Patrick van der Smagt, DLR<br />

16:00–16:15 MoBT7.1<br />

A Constrained Optimization Approach to Virtual<br />

Fixtures for Multi-Robot Collaborative Tele-op<br />

Tian Xia and Peter Kazanzides and Russell Taylor<br />

Computer Science, Johns Hopkins University, USA<br />

Ankur Kapoor<br />

National Institute of Health, USA<br />

• Developed a system for robotic-assisted<br />

manipulation using the<br />

da Vinci ® surgical robot<br />

• Used constrained optimization<br />

to implement multi-robot virtual<br />

fixtures for surgical knot<br />

positioning<br />

• Used virtual fixtures to maintain<br />

spatial and temporal<br />

relationships between robots to<br />

complete task<br />

• Used virtual fixtures to provide<br />

haptic feedback to user<br />

Control algorithm overview: two master/slave controller pairs exchange commanded forces (F_m^c, F_s^d) and desired slave motions (ΔX_sl^d, ΔX_sr^d) through a collaborative controller.<br />
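The constrained-optimization idea behind the virtual fixtures can be illustrated with the simplest case: an equality-constraint fixture applied to an incremental motion command. The helper below is a hypothetical stand-in (the function name and closed-form KKT solution are ours, not the paper's formulation, which handles richer constraint sets):

```python
import numpy as np

def apply_virtual_fixture(dx_desired, A, b):
    """Project a commanded incremental motion onto the virtual-fixture
    equality constraint A @ dx = b by solving
        min ||dx - dx_desired||^2  s.t.  A @ dx = b
    in closed form via the KKT conditions.
    """
    correction = A.T @ np.linalg.inv(A @ A.T) @ (A @ dx_desired - b)
    return dx_desired - correction

# Fixture: forbid motion along z (A selects the z component, b = 0).
A = np.array([[0.0, 0.0, 1.0]])
b = np.array([0.0])
dx = apply_virtual_fixture(np.array([1.0, 2.0, 3.0]), A, b)
```

The commanded z motion is removed while the in-plane components pass through unchanged, which is the haptic "guidance" effect a fixture provides.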

16:30–16:45 MoBT7.3<br />

Small Gain Design of Cooperative Teleoperator<br />

System with Projection-Based Force Reflection<br />

Ilia Polushin and Amir Takhmar and Rajni V. Patel<br />

Department of Electrical and Computer Engineering,<br />

University of Western Ontario, Canada<br />

• Developments to the small-gain approach<br />

to the design of networked cooperative<br />

force-reflecting teleoperators are<br />

presented<br />

• Explicit assumptions on the human<br />

dynamics are incorporated into the<br />

stability analysis<br />

• The conservatism of the small-gain design<br />

is eliminated using the projection-based<br />

force reflection principle<br />

• Samples of the experimental results are<br />

presented<br />

Cooperative teleoperator system<br />

16:15–16:30 MoBT7.2<br />

A Task-space Weighting Matrix Approach to<br />

Semi-autonomous Teleoperation Control<br />

Pawel Malysz and Shahin Sirouspour<br />

Department of Electrical and Computer Engineering<br />

McMaster University, Canada<br />

• A new general framework for shared teleoperation/autonomous control<br />

of mobile base and/or twin-armed manipulator robots<br />

• Hierarchical controller based on<br />

task-level coordinating velocity<br />

commands and low-level joint<br />

velocity control<br />

• A task-space weighting matrix<br />

adjusts relative priority between<br />

autonomous and human<br />

teleoperation control<br />

• Rigorous performance and<br />

stability analysis<br />

• Experimental evaluation in<br />

teleoperation of a twin-armed<br />

mobile slave robot<br />

Twin-armed mobile robot<br />

16:45–16:50 MoBT7.4<br />

An Enhanced Sliding-Mode Control for a<br />

Pneumatic-Actuated Teleoperation System<br />

M.Q. Le 1, M.T. Pham 1, M. Tavakoli 2, R. Moreau 1<br />

1 Laboratoire Ampère, UMR CNRS 5005, Université de Lyon, France<br />

2 Department of Electrical and Computer Engineering, University of Alberta, Canada<br />

• This paper presents an enhanced sliding mode control for<br />

pneumatic master-slave teleoperation systems with low-cost<br />

solenoid valves<br />

• By using 2 additional modes, the system’s dynamic performance<br />

is improved and the valves’ switching activity is reduced<br />

• The proposed control design is experimentally verified on a<br />

two-channel bilateral teleoperation architecture involving<br />

position-position, force-force, or force-position schemes<br />

• Good transparency and stability were achieved<br />


Experimental setup: pressure sensors, solenoid valves, force sensor, mass, position sensor, cylinder, and environment.<br />


Session MoBT7 Continental Parlor 7 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Teleoperation<br />

Chair Nikhil Chopra, Univ. of Maryland, Coll. Park<br />

Co-Chair Patrick van der Smagt, DLR<br />

16:50–16:55 MoBT7.5<br />

Subspace-oriented Energy Distribution for the<br />

Time Domain Passivity Approach<br />

Christian Ott, Jordi Artigas, and Carsten Preusche<br />

Institute of Robotics and Mechatronics,<br />

German Aerospace Center (DLR), Germany<br />

• Time Domain Passivity Approach applies<br />

a dissipative control action in order to<br />

preserve passivity.<br />

• Requires a criterion for the dissipation<br />

among multiple degrees of freedom.<br />

• We propose an energy based formulation<br />

for distribution of damping to task<br />

space and nullspace of a redundant<br />

manipulator.<br />

• The solution avoids disturbance of the<br />

task space, by prioritizing dissipation in<br />

the nullspace.<br />

• The method is based on an optimization<br />

formulation and allows constraints on<br />

the maximum torques and<br />

damping values to be considered.<br />

Diagram: a passivity observer drives the damping distribution between task-space damping and nullspace damping.<br />

17:00–17:15 MoBT7.7<br />

Semi-Autonomous Teleoperation in Task Space<br />

with Redundant Slave Robot<br />

Yen-Chen Liu and Nikhil Chopra<br />

Department of Mechanical Engineering, University of Maryland,<br />

College Park, USA<br />

• Difficult to achieve efficient task space<br />

teleoperation of complex robots operating<br />

in cluttered environments.<br />

• Redundancy in the slave robot can be<br />

utilized to autonomously achieve several<br />

sub-tasks.<br />

• Control algorithms developed to<br />

guarantee task space tracking in the<br />

presence of uncertainties and delays.<br />

• Algorithm is validated with sub-tasks such<br />

as singularity avoidance, joint limits, and<br />

collision avoidance.<br />

Configurations of the slave<br />

robot with sub-task control<br />

16:55–17:00 MoBT7.6<br />

EMG-Based Teleoperation and Manipulation<br />

with the DLR LWR-III<br />

Jörn Vogel, Claudio Castellini and Patrick van der Smagt<br />

Institute of Robotics and Mechatronics, German Aerospace Center, Germany<br />

• Decoding wrist position, orientation<br />

and grasp force from surface<br />

electromyography<br />

• Control of a lightweight robotic arm<br />

and hand using the decoded signals<br />

• No kinematic model of the user’s arm<br />

or precise positioning of sEMG electrodes<br />

needed<br />

• Possible applications in the field of<br />

variable stiffness robotics and<br />

rehabilitation<br />


Schematic overview<br />

of the system<br />

17:15–17:30 MoBT7.8<br />

Noninvasive Brain-Computer Interface-based<br />

Control of Humanoid Navigation<br />

Yongwook Chae 1, Jaeseung Jeong 2, and Sungho Jo 1<br />

1 Department of Computer Science, KAIST, Korea<br />

2 Department of Bio and Brain Engineering, KAIST, Korea<br />

Key Features<br />

1) Natural & Direct Brain-actuated Navigation<br />

Three natural motor intentions are mapped onto five low-level motion commands<br />

2) Direct-Control System<br />

Free of external cues and environmental restrictions<br />

3) Asynchronous BCI System<br />

The user can command the humanoid at any time<br />

4) Posture-Dependent Control<br />

Not based on environmental conditions (applicable to any environment)


Session MoBT8 Continental Parlor 8 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Learning for Control<br />

Chair Jun Nakanishi, Univ. of Edinburgh<br />

Co-Chair Jorg Conradt, Tech. Univ. München<br />

16:00–16:15 MoBT8.1<br />

Adding a Receding Horizon to Locally Weighted<br />

Regression for Learning Robot Control<br />

Christopher Lehnert and Gordon Wyeth<br />

Queensland University of Technology, Australia<br />

• Locally Weighted Regression (LWR)<br />

fails to learn a useful controller when<br />

the system is temporally dependent.<br />

• We define the concept of a horizon of<br />

temporal dependence, and illustrate<br />

why LWR fails to learn.<br />

• By introducing a receding horizon of<br />

future output states of the system, we<br />

show that sufficient constraint is<br />

applied to learn good solutions.<br />

• Our new method, Receding Horizon<br />

Locally Weighted Regression, is<br />

demonstrated through one-shot<br />

learning on a real Series Elastic<br />

Actuator controlling a pendulum.<br />

Control response of Series Elastic<br />

Actuator after 20 seconds of training<br />

using Receding Horizon Locally<br />

Weighted Regression.<br />
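For readers unfamiliar with the base learner, plain locally weighted regression fits a Gaussian-weighted linear model around each query point; the digest's contribution is the receding horizon added on top, which is not reproduced here. A minimal sketch of the base learner only, with bandwidth and data purely illustrative:

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.3):
    """Plain locally weighted regression: fit a Gaussian-weighted
    linear model around the query point and evaluate it there."""
    Xb = np.column_stack([np.ones(len(X)), X])        # add bias column
    q = np.concatenate([[1.0], np.atleast_1d(x_query)])
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))            # Gaussian kernel weights
    WX = Xb * w[:, None]
    beta = np.linalg.solve(Xb.T @ WX, WX.T @ y)       # weighted least squares
    return q @ beta

# Recover a smooth function from samples, querying one point.
X = np.linspace(0, 2 * np.pi, 50)[:, None]
y = np.sin(X).ravel()
pred = lwr_predict(X, y, np.array([1.0]))
```

Because each query fits its own local model, the learner adapts to nonstationary dynamics, which is what makes it popular for online robot model learning.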

16:30–16:45 MoBT8.3<br />

Learning Task-Space Control with Kernels<br />

D. Nguyen-Tuong 1 and J. Peters 1,2<br />

1 Max Planck Institute for Intelligent Systems, Germany<br />

2 Universität Darmstadt, Intelligent Autonomous Systems, Germany<br />

• Learning models for task-space control<br />

is an ill-posed problem, i.e., learning<br />

one-to-many mappings.<br />

• We propose a local learning approach<br />

appropriate for learning one-to-many<br />

mappings.<br />

• The proposed approach is based on the insight that despite being a<br />

globally ill-posed problem, such task-space control mapping is locally<br />

well-defined. We show in simulation the ability of the method for online<br />

model learning for task-space control of redundant robots.<br />

16:15–16:30 MoBT8.2<br />

Learning Inverse Kinematics<br />

with Structured Prediction<br />

1 Botond Bócsi, 2 Duy Nguyen-Tuong, 1 Lehel Csató,<br />

2 Bernhard Schölkopf, 2 Jan Peters<br />

1 Faculty of Mathematics and Informatics, Babes-Bolyai University, Romania<br />

2 Department of Empirical Inference, Max Planck Institute for Intelligent Systems, Tübingen, Germany<br />

Technische Universität Darmstadt, Intelligent Autonomous Systems Group,<br />

Germany<br />

• Inverse kinematics mappings are<br />

multivalued functions and are thus<br />

ill-defined<br />

• We model the joint input-output (task<br />

space position - joint space<br />

configuration) mapping and<br />

maximize over the output space<br />

• The maximization is possible using<br />

gradient search started from the<br />

current arm position<br />


Inverse kinematics<br />

prediction scheme<br />
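The start-from-the-current-posture idea also underlies classical gradient-based IK solvers: because the mapping is multivalued, initializing the search at the present configuration selects the nearby solution branch. A purely analytic sketch on a planar two-link arm (the paper instead maximizes a learned joint density; link lengths, gain, and iteration count here are illustrative):

```python
import numpy as np

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm."""
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

def ik_gradient(target, q0, lr=0.1, iters=2000):
    """Gradient search started from the current arm position, so the
    solver lands on the nearby branch of the multivalued IK mapping."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = fk(q) - target
        s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
        c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
        J = np.array([[-s1 - s12, -s12],     # Jacobian of fk
                      [ c1 + c12,  c12]])
        q -= lr * J.T @ err                  # descend the squared error
    return q

q = ik_gradient(target=np.array([1.0, 1.0]), q0=[0.1, 0.5])
```

Starting from a different initial posture (e.g. an elbow-down configuration) would converge to the other IK branch for the same target.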

16:45–16:50 MoBT8.4<br />

Learning to Control Planar Hitting Motions in a<br />

Minigolf-like Task<br />

K. Kronander, S. M. Khansari-Zadeh and A. Billard<br />

Learning Algorithms and Systems Laboratory (LASA),<br />

École Polytechnique Fédérale de Lausanne (EPFL), Switzerland<br />

• We present a robotic system for<br />

learning to play minigolf from<br />

human demonstrations.<br />

• First, the robot is taught a basic<br />

hitting motion using kinesthetic<br />

demonstrations.<br />

• Then, the robot learns a hitting<br />

motion adaptation model from<br />

examples provided by the<br />

teacher.<br />

• Neither a physical model of the<br />

field nor a reward function needs to<br />

be specified.<br />

The 7-dof Barrett WAM playing minigolf


Session MoBT8 Continental Parlor 8 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Learning for Control<br />

Chair Jun Nakanishi, Univ. of Edinburgh<br />

Co-Chair Jorg Conradt, Tech. Univ. München<br />

16:50–16:55 MoBT8.5<br />

Stiffness and Temporal Optimization in Periodic<br />

Movements: An Optimal Control Approach<br />

Jun Nakanishi, Konrad Rawlik and Sethu Vijayakumar<br />

IPAB, School of Informatics, University of Edinburgh, UK<br />

• Present a novel framework for stiffness<br />

and temporal optimization of periodic<br />

movements<br />

• Exploit the intrinsic passive dynamics to<br />

realize efficient actuation and control<br />

• Dynamical systems based representation<br />

of rhythmic movements<br />

• Propose a systematic methodology to<br />

optimize for control commands, temporal<br />

aspect of movements and time-varying<br />

stiffness profiles from first principles of<br />

optimality<br />

• Evaluations on a single pendulum and an<br />

underactuated two-link brachiating robot<br />

simulation<br />

Temporally optimized movement<br />

of the swing locomotion task<br />

17:00–17:15 MoBT8.7<br />

Behavioural Cloning for Driving Robots<br />

over Rough Terrain<br />

Raymond Sheh, Bernhard Hengst and Claude Sammut<br />

ARC Centre of Excellence for Autonomous Systems<br />

School of Computer Science and Engineering<br />

The University of New South Wales, Sydney, Australia<br />

• Controllers for driving robots over<br />

rough terrain need to consider robot-terrain<br />

interactions that are traditionally<br />

difficult to model (e.g. body contact).<br />

• We use Behavioural Cloning in<br />

simulation and reality to directly learn<br />

the control task without explicit system<br />

modeling.<br />

• We also develop an autonomous<br />

demonstrator to generate training data.<br />

• Generated policies found to transfer<br />

directly from simulation to reality.<br />

• On-task performance exceeds basic<br />

controllers and is comparable to the<br />

human expert.<br />

16:55–17:00 MoBT8.6<br />

Improving Operational Space Control of Heavy<br />

Manipulators via Open-Loop Compensation<br />

Guilherme Maeda, Surya Singh, David Rye<br />

Australian Centre for Field Robotics,<br />

The University of Sydney, Australia<br />

• Hydraulic manipulators present<br />

high friction and dynamic coupling<br />

making it difficult to obtain a global<br />

model with acceptable estimation<br />

error<br />

• Model-based compensation using<br />

such models has a negative<br />

impact on performance and may<br />

lead to instability<br />

• Open-loop commands generate<br />

stable partial compensation<br />

• In open-loop, the model can be<br />

local, requiring only dynamic<br />

knowledge in the neighborhood of<br />

the desired trajectory<br />



Session MoBT9 Continental Parlor 9 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Novel Actuators II<br />

Chair Cagdas Denizel Onal, Massachusetts Inst. of Tech.<br />

Co-Chair Ohmi Fuchiwaki, Yokohama National Univ. (YNU)<br />

16:00–16:15 MoBT9.1<br />

Sliding-Mode Control of<br />

Nonlinear Discrete-Input Pneumatic Actuators<br />

Sean Hodgson 1 , Minh Quyen Le 2 ,<br />

Mahdi Tavakoli 1 , Minh Tu Pham 2<br />

1 Electrical and Computer Engineering, University of Alberta, Canada<br />

2 Laboratoire Ampère, Université de Lyon, France<br />

• This paper proposes a sliding mode law<br />

for precise position control with minimal<br />

switching activity.<br />

• This control law is applied to 1 DOF<br />

pneumatic actuator utilizing on/off<br />

solenoid valves.<br />

• Utilizes 4 new modes of operation that use<br />

necessary and sufficient amounts of drive<br />

energy.<br />

• Simulated and experimental results<br />

demonstrate improved system<br />

performance.<br />

Pneumatic Actuator<br />
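At its core, a sliding-mode law for an on/off-valve actuator switches the discrete input on the sign of a sliding surface s = ė + λe. The toy simulation below shows only that core on a unit-mass plant; the paper's law with its four operating modes is considerably richer, and every constant here is illustrative:

```python
import numpy as np

def simulate(target=1.0, lambda_=5.0, u_max=4.0, dt=1e-3, steps=5000):
    """Bang-bang sliding-mode position control of a unit-mass plant
    with a discrete (on/off) input, loosely mimicking a solenoid-valve
    actuator driven at fixed force magnitude."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        s = v + lambda_ * (x - target)   # sliding surface: s = de + lambda*e
        u = -u_max * np.sign(s)          # discrete switching law
        v += u * dt                      # unit-mass dynamics
        x += v * dt
    return x

final = simulate()
```

Once the state reaches the surface s = 0, the error decays as e(t) ∝ exp(−λt); the residual chattering around the target is exactly the switching activity that mode-scheduling laws like the paper's try to reduce.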

16:30–16:45 MoBT9.3<br />

Development of<br />

a Miniature Foil Type Ultrasonic Motor<br />

Jun Okamoto<br />

Institute of Advanced Biomedical Engineering & Science,<br />

Tokyo Women's Medical University, Japan<br />

Shigeki Toyama<br />

Department of Mechanical Systems Engineering,<br />

Tokyo University of Agriculture and Technology, Japan<br />

• A traveling wave type miniature<br />

ultrasonic motor using a foil type stator<br />

is proposed<br />

• The main driving force is found to be<br />

the generated symmetric Lamb<br />

wave, which drives the foil surface in the<br />

forward direction<br />

• Rotational speed was found to be 6,380<br />

rpm and starting torque was 0.698 Nm<br />

when the ideal gap between rotor and<br />

stator was realized<br />

• The performance of the developed<br />

prototype was found to be quite good<br />

compared to earlier miniature motors<br />

FEM analysis, endurance test, and the 0.8 mm foil-type stator, with the wave direction indicated.<br />

16:15–16:30 MoBT9.2<br />

Trajectory Planning and Current Control Optimization<br />

of Three Degree-of-Freedom Spherical Actuator<br />

Liang Zhang, Weihai Chen, Liang Yan, and Jingmeng Liu<br />

School of Automation Science and Electrical Engineering,<br />

Beihang University, Beijing, China<br />

• An orientation representation<br />

method is proposed to<br />

facilitate the trajectory<br />

planning of rotor<br />

• A trajectory planning method<br />

which can generate smooth<br />

spline in two-step is utilized<br />

Working principle and trajectory planning<br />


Block diagram: desired torque, torque modeling, current control optimization, and the spherical actuator.<br />

• Optimization algorithm for current is developed to improve the power<br />

efficiency and fault tolerance capability<br />

• Simulations are carried out to validate the proposed method and algorithm<br />

16:45–16:50 MoBT9.4<br />

Soft Robot Actuators using Energy-Efficient<br />

Valves Controlled by<br />

Electropermanent Magnets<br />

Andrew D. Marchese, Cagdas D. Onal, and Daniela Rus<br />

Electrical Engineering and Computer Science, MIT, USA<br />

• Authors employ novel<br />

electropermanent magnet (EPM)<br />

valves to drive actuation of a soft,<br />

multi-segment robot.<br />

• When used in driving soft<br />

actuators, EPM valves can be<br />

compact, light weight, and energy<br />

efficient.<br />

• The forward locomotion of a soft<br />

robot consisting of six compliant<br />

fluidic actuators was controlled by<br />

EPM valves.<br />

EPM valves were used to<br />

drive a six segment soft<br />

rolling robot. Segments<br />

were placed around the<br />

perimeter of a cylinder.


Session MoBT9 Continental Parlor 9 Monday, September 26, <strong>2011</strong>, 16:00–17:30<br />

Novel Actuators II<br />

Chair Cagdas Denizel Onal, Massachusetts Inst. of Tech.<br />

Co-Chair Ohmi Fuchiwaki, Yokohama National Univ. (YNU)<br />

16:50–16:55 MoBT9.5<br />

Synthesis of a Non-Circular Cable Spool<br />

to Realize a Nonlinear Rotational Spring<br />

Nicolas Schmit and Masafumi Okada<br />

Dept. of Mech. Sciences and Eng., Tokyo Institute of Technology, Japan<br />

• Closed-form solution of the non-circular<br />

spool which synthesizes a nonlinear<br />

torque profile.<br />

• Realization of a constant force spring, an<br />

exponential softening spring and a cubic<br />

polynomial spring.<br />

• Average experimental error: 1.5%<br />

17:00–17:15 MoBT9.7<br />

Variable Impedance due to Electromechanical<br />

Coupling in Electroactive Polymer Actuators<br />

Sanjay Dastoor and Mark Cutkosky<br />

Department of Mechanical Engineering, Stanford University, USA<br />

• Electroactive polymers (EAPs) have<br />

inherent compliance and damping, making<br />

them attractive for bio-inspired systems<br />

• A novel control circuit can be used to<br />

change the electrical boundary conditions<br />

on the EAP<br />

• A quasi-viscoelastic model can be used to<br />

simply model the dynamics of EAPs<br />

• Through dynamics modeling and electrical<br />

control, open-loop passive impedance can<br />

be varied, increasing the performance of<br />

bio-inspired robots.<br />

EAP actuator, variable<br />

impedance trajectory, and<br />

perching application<br />

16:55–17:00 MoBT9.6<br />

Combining IBVS and PBVS to ensure the<br />

visibility constraint<br />

Olivier Kermorgant and François Chaumette<br />

Lagadic, INRIA Rennes-Bretagne Atlantique, France<br />

• Hybrid visual servoing based on a generic fusion approach<br />

• Pure PBVS is performed inside a safe image area<br />

• 2D information is added to keep the object in the image<br />

• Validation with simulation and experiments<br />


Visual servo with a 3D tracking<br />

The model nodes are used as 2D<br />

features when they approach the<br />

image border<br />
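Both component laws of the hybrid scheme share the same skeleton: drive a feature (or pose) error to zero with v = −λ L⁺ e. A generic sketch of that law; the interaction matrix and error values below are toy placeholders, not the paper's features:

```python
import numpy as np

def ibvs_velocity(L, e, gain=0.5):
    """Classical visual-servoing control law v = -gain * pinv(L) @ e,
    with L the interaction matrix of the chosen features and e the
    feature error.  A hybrid scheme switches which features (2D image
    points vs. 3D pose) populate L and e."""
    return -gain * np.linalg.pinv(L) @ e

# Toy decoupled example: each feature responds to one velocity axis.
L = np.array([[1.0, 0.0],
              [0.0, 1.0]])
e = np.array([0.2, -0.4])
v = ibvs_velocity(L, e)
```

With a real camera, L depends on the feature depths and the commanded v is a 6-DOF twist; the 2×2 identity case above only makes the sign and gain conventions visible.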

17:15–17:30 MoBT9.8<br />

Optimal Control of Multi-Input SMA Actuator<br />

Arrays Using Graph Theory<br />

Leslie Flemming, David E. Johnson and Stephen Mascaro<br />

Department of Mechanical Engineering, University of Utah, USA<br />

• Wet Shape Memory Alloy actuators are compact, high force-to-weight<br />

actuators that can be activated electrically or thermofluidically.<br />

• A scalable architecture of 2N switches/valves controls the delivery of<br />

thermal energy to an NxN array of actuators.<br />

• Expanding wavefront graph theory is used to identify optimal set of<br />

discrete control commands to operate the array.<br />

A partial graph of a 2x2 array of<br />

wet SMA actuators controlled by<br />

thermofluidic and electric inputs<br />

Legend: node, edge, hot fluid, cold fluid, electricity.<br />
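The "expanding wavefront" idea is essentially uniform-cost (Dijkstra-style) search over the discrete command graph: nodes are actuator-array states and weighted edges are switch/valve commands. A generic sketch on a toy two-actuator graph (the states, costs, and graph itself are illustrative, not the paper's thermofluidic model):

```python
import heapq

def expanding_wavefront(graph, start, goal):
    """Uniform-cost ('expanding wavefront') search over a discrete
    command graph.  Returns (total cost, node path) to the goal, or
    None if the goal is unreachable."""
    frontier = [(0, start, [start])]
    visited = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nxt, c in graph.get(node, []):
            if nxt not in visited:
                heapq.heappush(frontier, (cost + c, nxt, path + [nxt]))
    return None

# Toy two-actuator example; states encode which actuators are contracted.
graph = {'00': [('10', 2), ('01', 1)],
         '10': [('11', 2)],
         '01': [('11', 5)],
         '11': []}
cost, path = expanding_wavefront(graph, '00', '11')
```

The wavefront expands in order of accumulated cost, so the first time the goal state is popped its command sequence is optimal; for an NxN array the state space grows exponentially, which is why the paper's scalable 2N-input architecture matters.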




8:00-9:30<br />

Technical Program Digest<br />


Tuesday<br />

September 27, <strong>2011</strong><br />

Session TuAT1 ⎯⎯ Object Recognition, Segmentation, and Detection<br />

Session TuAT2 ⎯⎯ Simultaneous Localization and Mapping<br />

Session TuAT3 ⎯⎯ Symposium: Microrobotics I<br />

Session TuAT4 ⎯⎯ Industrial Forum: Robots: The New Commercial Platforms<br />

Session TuAT5 ⎯⎯ Symposium: Medical Robotics I<br />

Session TuAT6 ⎯⎯ Symposium: Grasping and Manipulation: Control and Learning<br />

Session TuAT7 ⎯⎯ Software Architectures & Frameworks<br />

Session TuAT8 ⎯⎯ Bio-Inspired & Biomimetic Robots<br />

Session TuAT9 ⎯⎯ Probabilistic Exploration and Coverage<br />

10:00-11:30<br />

Session TuBT1 ⎯⎯ Perception, Saliency and Novelty<br />

Session TuBT2 ⎯⎯ Semantic SLAM & Loop Closure<br />

Session TuBT3 ⎯⎯ Symposium: Microrobotics II<br />

Session TuBT4 ⎯⎯ Industrial Forum: Robots: The Next Generation<br />

Session TuBT5 ⎯⎯ Symposium: Medical Robotics II<br />

Session TuBT6 ⎯⎯ Symposium: Grasping and Manipulation: Mechanics and Design<br />

Session TuBT7 ⎯⎯ Contact and Deformation<br />

Session TuBT8 ⎯⎯ Biomimetic Limbed Robots<br />

Session TuBT9 ⎯⎯ Perceptual Learning<br />

(continues on next page)


14:00-15:30<br />


Tuesday<br />

September 27, <strong>2011</strong><br />

(continued)<br />

Session TuCT1 ⎯⎯ Object Detection & Collision Avoidance<br />

Session TuCT2 ⎯⎯ Motion Estimation, Mapping & SLAM<br />

Session TuCT3 ⎯⎯ Symposium: Microrobotics III<br />

Session TuCT4 ⎯⎯ Forum: Robotics: Beyond the Horizon<br />

Session TuCT5 ⎯⎯ Symposium: Medical Robotics III<br />

Session TuCT6 ⎯⎯ Symposium: Grasping and Manipulation: Grasp Planning and<br />

Quality<br />

Session TuCT7 ⎯⎯ Mechanisms: Actuator Design<br />

Session TuCT8 ⎯⎯ Reptile & Fish-inspired Robots<br />

Session TuCT9 ⎯⎯ Novel Sensors


Session TuAT1 Continental Parlor 1 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Object Recognition, Segmentation, and Detection<br />

Chair Trevor Darrell, UC Berkeley<br />

Co-Chair Ashley Desmond Tews, CSIRO<br />

08:00–08:15 TuAT1.1<br />

Robust Stereo-Vision Based 3D Modeling of<br />

Real World Objects for Assistive Robotic<br />

Applications<br />

S.K. Natarajan, D. Ristic-Durrant, A. Leu and A. Gräser<br />

Institute of Automation, University of Bremen, Germany<br />

• Novel stereo vision based segmentation of textured and uniformly<br />

colored objects<br />

• Robust closed-loop segmentation of uniformly colored segments in<br />

irregularly shaped object ROIs<br />

• Accurate 2D shape extraction for 3D object modeling<br />

• Needs no a priori knowledge of object geometry<br />

Block Diagram of the proposed 3D Object Reconstruction System<br />

08:30–08:45 TuAT1.3<br />

Integrate Multi-Modal Cues for Category-<br />

Independent Object Detection and Localization<br />

Jianhua Zhang, Junhao Xiao and Jianwei Zhang<br />

Dept. Informatics, University of Hamburg, Germany<br />

Houxiang Zhang<br />

Dept. Technology and Nautical Sciences, Aalesund University College, Norway<br />

Shengyong Chen<br />

College of Computer Science, Zhejiang University of Technology, China<br />

• To detect and localize object instances<br />

in a category-independent manner<br />

• Our study fuses multi-modal cues,<br />

including multi-scale saliency, superpixel<br />

straddling, intensity, depth, and global<br />

information, in a uniform Bayesian<br />

framework to obtain accurate detection<br />

and localization.<br />

• Experiments on the PASCAL VOC 08 dataset and our indoor-scene dataset show the promising performance of the proposed method.<br />

Some experimental results of our<br />

study<br />

08:15–08:30 TuAT1.2<br />

Practical 3-D Object Detection Using Category<br />

and Instance-level Appearance Models<br />

Kate Saenko, Sergey Karayev, Yangqing Jia, Alex Shyr, Allison<br />

Janoch, Jonathan Long, Mario Fritz*, Trevor Darrell<br />

EECS, UC Berkeley, USA, *Max-Planck-Institute for Informatics, Germany<br />

• Effective interaction requires both<br />

category (e.g. bottle) and instance<br />

(e.g. coca-cola) detection<br />

• Categories are often best described<br />

by overall shape, instances by<br />

distinct local patterns (e.g. logos)<br />

• We combine both cues in a detection<br />

system based on the Kinect sensor<br />

• We exploit depth cues to provide<br />

better segmentation and object size<br />

constraints<br />

• We propose several ways to improve<br />

the efficiency of the search<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–41–<br />

Figure labels: shape path, local texture path, size constraints; category: bottle, instance: coca-cola<br />

08:45–08:50 TuAT1.4<br />

Visual and Physical Segmentation<br />

of Novel Objects<br />

Amr Almaddah, Yasushi Mae, Kenichi Ohara<br />

Tomohito Takubo, Tatsuo Arai<br />

Department of System Innovation, Osaka University, Japan<br />

• Visual segmentation of novel objects from complicated scenes by exploiting multiple lights and object reflectivity.<br />

• Physical segmentation of novel objects using a static-electricity charge-sensing technique.<br />

• Using the extracted items' visual and physical aspects, an autonomous robot manipulator successfully performed a trash-separation task.<br />




08:50–08:55 TuAT1.5<br />

Knowing Your Limits - Self-Evaluation and<br />

Prediction in Object Recognition<br />

Michael Zillich, Johann Prankl, Thomas Mörwald, Markus Vincze<br />

Inst. of Automation and Control, Vienna Univ. of Technology, Austria<br />

• Incremental acquisition of 3D object<br />

models for open-ended learning scenarios<br />

• Probabilistic measures of observed<br />

detection success, predicted detection<br />

success, model completeness<br />

• Supports reasoning about when to extend the model and where to look next<br />

• Predict the probability of successful<br />

detection given the model learned so far<br />

Object, learned views and<br />

indication of model completeness<br />

09:00–09:15 TuAT1.7<br />

Generating Object Hypotheses in Natural<br />

Scenes through Human-Robot Interaction<br />

Niklas Bergström, Mårten Björkman and Danica Kragic<br />

CVAP/CSC, Royal Institute of Technology (KTH), Sweden<br />

• We present an interactive multi-label, real-time segmentation system for object modeling and tracking.<br />

• A human operator instructs the robot with simple commands such as “add segment” or “split segment”.<br />

• By additionally giving relational information about objects, the segmentation can be further refined.<br />

• We showed through experiments how the different interactive elements help to improve the segmentation.<br />

08:55–09:00 TuAT1.6<br />

Depth Kernel Descriptors for Object Recognition<br />

Liefeng Bo and Dieter Fox<br />

Department of Computer Science, University of Washington, USA<br />

Xiaofeng Ren<br />

Intel Labs Seattle, USA<br />

• Depth kernel descriptors are a general framework for generating rich 3D features<br />

• Five features over depth maps and 3D point clouds are proposed to capture size, 3D shape, and edge cues for recognition<br />

• Evaluated on the large RGB-D Object dataset (300 objects in 51 categories) for both instance and category recognition<br />

• The proposed features are diverse yet complementary, as shown in the figure.<br />

• Improves over state-of-the-art features<br />

such as spin images.<br />

09:15–09:30 TuAT1.8<br />

3D Payload Detection From 2D Range Scans<br />

Ash Tews<br />

Autonomous Systems Laboratory, CSIRO, Australia<br />

• Major challenge is how to use 2D scan<br />

segments to classify a 3D payload<br />

• No prior knowledge of the payload –<br />

scans obtained during vehicle travel<br />

• System selects highly discriminating target<br />

scan segments for the segment reference<br />

library<br />

• New scans are matched to the reference library, and a temporal confidence value is used for classification<br />

• Results demonstrated in classifying a<br />

smelter crucible<br />



Session TuAT2 Continental Parlor 2 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Simultaneous Localization and Mapping<br />

Chair Jaime Valls Miro, Univ. of Tech. Sydney<br />

Co-Chair Denis Fernando Wolf, Univ. of Sao Paulo<br />

08:00–08:15 TuAT2.1<br />

A Hierarchical RBPF SLAM for Mobile Robot<br />

Coverage in Indoor Environments<br />

Tae-kyeong Lee and Se-young Oh<br />

Department of Electrical Engineering, POSTECH, Korea<br />

Seongsoo Lee<br />

Future IT Convergence Laboratory, LG Electronics Corporation, Korea<br />

• This paper presents a SLAM framework<br />

that can be applied to floor cleaning robots<br />

equipped with sparse, short-range sensors<br />

• A line feature map is estimated based on<br />

RBPF, hierarchically by dividing the entire<br />

region into local maps<br />

• We improve filter performance by using an<br />

assumption of orthogonal environment<br />

• The proposed SLAM framework is<br />

combined with a coverage path planning<br />

algorithm, enabling online simultaneous<br />

coverage and SLAM<br />

(a) Map and trajectory by odometry<br />

(b) Map and trajectory by SLAM<br />

Experimental results in a real<br />

home environment<br />

08:30–08:45 TuAT2.3<br />

Multiple Robot Simultaneous Localization and<br />

Mapping<br />

Sajad Saeedi†, Liam Paull†, Michael Trentini‡ and Howard Li†<br />

†Electrical and Computer Engineering, University of New Brunswick, Canada<br />

‡Defence Research and Development Canada, Canada<br />

• In this research, a decentralized platform<br />

for simultaneous localization and mapping<br />

with multiple robots has been developed<br />

and verified.<br />

• Map fusion is achieved through a multistep<br />

process which is based on image<br />

processing techniques.<br />

• Maps are preprocessed by Canny edge<br />

detection, edge smoothing and image<br />

segmentation.<br />

• The relative transformation between the maps is then extracted using cross-correlation, the Radon transform, a similarity index and image entropy.<br />

The proposed algorithm<br />

for the map fusion<br />
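The cross-correlation step in the pipeline above can be illustrated with a minimal numpy sketch (an illustration of the general technique, not the authors' code; `estimate_shift` is a hypothetical helper). It recovers only the relative translation between two occupancy grids via phase correlation; rotation estimation (e.g., via the Radon transform) is omitted.

```python
import numpy as np

def estimate_shift(map_a, map_b):
    """Phase correlation: recover the integer (dy, dx) such that
    map_b equals map_a cyclically shifted by (dy, dx)."""
    cross = np.conj(np.fft.fft2(map_a)) * np.fft.fft2(map_b)
    # Whiten the cross-power spectrum so the inverse FFT is a sharp peak.
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = map_a.shape
    # Wrap shifts larger than half the grid into negative offsets.
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)

# Toy occupancy grids: a wall segment plus a point landmark,
# with the second map shifted by (3, -5) cells.
grid_a = np.zeros((64, 64))
grid_a[10:14, 20:40] = 1.0
grid_a[30, 5] = 1.0
grid_b = np.roll(grid_a, (3, -5), axis=(0, 1))
```

In a full fusion pipeline this alignment would run on the preprocessed (edge-detected, smoothed) maps rather than on raw grids.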

08:15–08:30 TuAT2.2<br />

Mapping of Multi-Floor Buildings: A Barometric<br />

Approach<br />

Ali Ozkil¹, Zhun Fan², Jizhong Xiao³, Jens Kristensen⁴,<br />

Steen Dawids¹, Kim Christensen⁴<br />

1: Technical University of Denmark; 2: Tongji University, China; 3: The City College of New York, USA; 4: Force Technology, Denmark<br />

• This paper presents a new method for<br />

mapping multi-floor buildings.<br />

• The method combines a laser range sensor for metric mapping with a barometric pressure sensor for detecting floor transitions and segmenting the map.<br />

• We exploit the fact that the barometric<br />

pressure is a function of the elevation,<br />

and it varies between different floors.<br />

• The method is tested with a real robot in<br />

a typical indoor environment, and<br />

physically consistent multi-floor<br />

representations are achieved.<br />
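The pressure-elevation relationship exploited above can be sketched with the standard hypsometric formula (a generic standard-atmosphere illustration, not the authors' implementation; both helper names are hypothetical):

```python
# Standard-atmosphere constants for the troposphere.
T0 = 288.15      # sea-level temperature, K
LAPSE = 0.0065   # temperature lapse rate, K/m
G = 9.80665      # gravitational acceleration, m/s^2
M = 0.0289644    # molar mass of dry air, kg/mol
R = 8.31447      # universal gas constant, J/(mol*K)

def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Elevation (m) above the reference pressure p0_hpa (hypsometric formula)."""
    return (T0 / LAPSE) * (1.0 - (p_hpa / p0_hpa) ** (R * LAPSE / (G * M)))

def floor_transitions(pressures_hpa, floor_height_m=3.0):
    """Indices where the elevation jump between consecutive pressure
    readings exceeds half a floor height: a crude floor-change detector."""
    alts = [altitude_from_pressure(p) for p in pressures_hpa]
    return [i for i in range(1, len(alts))
            if abs(alts[i] - alts[i - 1]) > floor_height_m / 2.0]
```

Near sea level a ~3.5 m floor corresponds to a pressure drop of only ~0.4 hPa, which is why floor transitions are detected as jumps rather than from absolute pressure.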

08:45–08:50 TuAT2.4<br />

Improving Occupancy Grid FastSLAM<br />

by Integrating Navigation Sensors<br />

Christopher Weyers<br />

Sensors Directorate, Air Force Research Laboratory, USA<br />

Gilbert Peterson<br />

Department of Electrical & Computer Eng, Air Force Institute of Tech, USA<br />

• Simultaneous Localization and Mapping<br />

(SLAM) problem of constructing local grid<br />

maps using only internal sensors<br />

• The Multiple Integrated Navigation Sensors (MINS) system combines IMU, odometry and stereo vision into one input path<br />

• The implementation cascades the MINS Kalman filter into a FastSLAM particle filter, using LIDAR to build 2D maps<br />

• The MINS FastSLAM implementation decreased error by 92% relative to the navigation path and by 79% relative to the odometry path<br />


Maps from Odometry, MINS,<br />

Odometry SLAM, MINS SLAM



08:50–08:55 TuAT2.5<br />

Efficient Information-Theoretic Graph Pruning for<br />

Graph-Based SLAM with Laser Range Finders<br />

Henrik Kretzschmar, Cyrill Stachniss<br />

Dept. of Computer Science, University of Freiburg, Germany<br />

Giorgio Grisetti<br />

Dept. of Systems and Computer Science, Sapienza University of Rome, Italy<br />

• Prune the pose graph in graph-based SLAM<br />

• Maximize the mutual information of laser<br />

measurements and the resulting occupancy<br />

grid map<br />

• Discard the least informative laser scans and<br />

marginalize out the corresponding pose<br />

nodes<br />

• Approximate marginalization based on<br />

Chow-Liu trees to avoid a dense pose graph<br />

• Allow for long-term map learning and<br />

any-space SLAM<br />

09:00–09:15 TuAT2.7<br />

Neural Network-based Multiple Robot<br />

Simultaneous Localization and Mapping<br />

Sajad Saeedi†, Liam Paull†, Michael Trentini‡ and Howard Li†<br />

†Electrical and Computer Engineering, University of New Brunswick, Canada<br />

‡Defence Research and Development Canada, Canada<br />

• In this research, a decentralized platform<br />

for simultaneous localization and mapping<br />

with multiple robots has been developed.<br />

• A novel occupancy grid map fusion<br />

algorithm is proposed. Map fusion is<br />

achieved through a multistep process.<br />

• The proposed method learns the maps using a Self-Organizing Map (SOM) neural network.<br />

• In the learning phase, the obstacles in the map are learned by the SOM, which makes further analysis of the map easier and faster.<br />

Two maps (top and middle) are fused by the proposed method (bottom)<br />

08:55–09:00 TuAT2.6<br />

An Incremental Scheme for Dictionary-based<br />

Compressive SLAM<br />

Tomomi Nagasaka and Kanji Tanaka<br />

Engineering, University of Fukui, Japan<br />

• Obtaining a compact representation of a large-size pointset map built<br />

by mapper robots is a critical issue for recent SLAM applications.<br />

• This “map compression” problem is explored in the paper from the novel perspective of dictionary-based map compression.<br />

• The primary contribution of the paper is the proposal of an incremental scheme for simultaneous mapping and map-compression applications.<br />

• An incremental map compressor is presented, employing a modified RANSAC map-matching scheme as well as compact-projection visual search.<br />

Incremental map-compression while<br />

simultaneously building the map<br />

09:15–09:30 TuAT2.8<br />

Conservative Sparsification for Efficient and Consistent<br />

Approximate Estimation<br />

John Vial and Tim Bailey<br />

Australian Centre for Field Robotics, University of Sydney, Australia<br />

Hugh Durrant-Whyte<br />

National ICT Australia<br />

• A new approach for sparse<br />

approximation.<br />

• Guaranteed to maintain<br />

consistency.<br />

• Maintains smaller Kullback-Leibler<br />

Divergence than existing<br />

consistent approaches.<br />

• Complexity is bounded by the size<br />

of the Markov blanket surrounding<br />

link removal.<br />

• No new nonzero links are added.<br />

• Elements outside Markov blanket<br />

area are unaffected.<br />


Graph before and after sparsification. Light-coloured link removed; no new fill-in. Complexity is bounded by the boxed area (may approach constant time for large sparse matrices).<br />
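The Kullback-Leibler divergence used above as the approximation-quality measure can be written out concretely for zero-mean Gaussians (a textbook sketch under the zero-mean assumption, not the paper's algorithm; `kl_gaussian` is a hypothetical helper):

```python
import numpy as np

def kl_gaussian(cov0, cov1):
    """KL( N(0, cov0) || N(0, cov1) ) between zero-mean Gaussians."""
    k = cov0.shape[0]
    inv1 = np.linalg.inv(cov1)
    return 0.5 * (np.trace(inv1 @ cov0) - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

# Example: approximating a correlated covariance by its diagonal
# (one crude form of sparsification) loses information that the
# divergence quantifies.
exact = np.array([[2.0, 0.6], [0.6, 1.0]])
sparsified = np.diag(np.diag(exact))
loss = kl_gaussian(exact, sparsified)
```

A sparsification method like the one above can then be compared against alternatives by the size of this divergence.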


Session TuAT3 Continental Parlor 3 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Microrobotics I<br />

Chair Sylvain Martel, Ec. Pol. de Montreal (EPM)<br />

Co-Chair David Cappelleri, Stevens Inst. of Tech.<br />

08:00–08:15 TuAT3.1*<br />

Semi-Plenary Invited Talk: History of Microrobotics<br />

and Vision for the Future: Microassembly and Beyond<br />

Toshio Fukuda, Nagoya University<br />

08:30–08:45 TuAT3.3<br />

Remote Microscale Teleoperation through<br />

Virtual Reality and Haptic Feedback<br />

Aude Bolopion¹, Christian Stolle², Robert Tunnell², Sinan Haliyo¹,<br />

Stéphane Régnier¹ and Sergej Fatikow²<br />

1: Institut des Systèmes Intelligents et de Robotique, Université Pierre et Marie Curie, CNRS UMR 7222, France<br />

2: Division Microrobotics and Control Engineering, Oldenburg University, Germany<br />

• Teleoperation of microspheres between France and Germany is<br />

performed<br />

• Haptic feedback and 3D reconstruction of the scene are provided to<br />

ensure intuitiveness<br />

• Vision algorithms are developed to compensate for the lack of force measurement in SEM manipulation<br />

• A modular software architecture and efficient data transmission enable a wide range of applications<br />

08:15–08:30 TuAT3.2<br />

Semi-Plenary Invited Talk: History of Microrobotics<br />

and Vision for the Future: Microassembly and Beyond<br />

Toshio Fukuda, Nagoya University<br />

08:45–08:50 TuAT3.4<br />

Miniature Ferromagnetic Robot Fish Actuated by<br />

a Clinical Magnetic Resonance Scanner<br />

F. Gosselin, D. Zhou, V. Lalande,<br />

M. Vonthron and S. Martel<br />

NanoRobotics Laboratory, Ecole Polytechnique Montreal, Canada<br />

• A new actuation principle for a swimming<br />

robot is presented<br />

• This actuation method uses the magnetic<br />

field produced by a clinical MRI scanner<br />

• The performance of the robot fish is<br />

studied by varying different parameters<br />

• Preliminary results suggest that the technique could potentially be scaled down for use in microrobots<br />

Photograph of the actual swimming robot compared with the new-generation MRI swimmer<br />




08:50–08:55 TuAT3.5<br />

Hybrid Microassembly of Chips on Low<br />

Precision Patterns Assisted by Capillary Self-alignment<br />

Bo Chang, Mirva Jääskeläinen and Quan Zhou<br />

Department of Automation and Systems Technology, Aalto University, Finland<br />

• Segmented patterns with jagged edges are fabricated to mimic real-world RFID antennas<br />

• Self-alignment occurs reliably despite variation in pattern size and edge jaggedness<br />

• The alignment accuracy is likely to be affected when the total size of the pattern differs from that of the chip<br />

• Edge jaggedness does not appear to degrade the accuracy, which is an interesting phenomenon<br />

09:00–09:15 TuAT3.7<br />

The Cellular Force Microscope (CFM):<br />

A microrobotic system for quantitating the growth<br />

mechanics of living, growing plant cells in situ<br />

Dimitrios Felekis, Simon Muntwyler,<br />

Felix Beyeler and Bradley J. Nelson<br />

Mechanical and Process Engineering, ETH Zurich, Switzerland<br />

• CFM is a system capable of performing<br />

automated mechanical characterization of<br />

living, growing cells in situ<br />

• CFM can be applied on organisms with<br />

changing and diverse morphology, from<br />

the sub-cellular to the whole organ level<br />

• SI-traceable calibration and<br />

characterization with uncertainty analysis<br />

is performed for the MEMS sensor<br />

• The results corroborate related work and quantify the stiffness difference between the tip and the shank of the cell<br />

CFM used to characterize single<br />

cell organisms<br />

08:55–09:00 TuAT3.6<br />

A Resonant Surface Sensitive to Out-of-plane Forces<br />

for the Indentation and Injection of Living Cells<br />

Denis Desmaële a,b and Mehdi Boukallel a<br />

a CEA, LIST, LISA, France<br />

Stéphane Régnier b<br />

b ISIR, Université Pierre et Marie Curie, France<br />

• This paper presents a monolithic structure<br />

where biological samples can be placed for<br />

manipulation.<br />

• Forces applied on samples can be<br />

estimated via frequency shifts of two<br />

resonators.<br />

• Static and dynamic behaviors of the force<br />

sensor are theoretically investigated.<br />

• The performance of a first prototype is reported. Tests on lobster eggs are also conducted.<br />

09:15–09:30 TuAT3.8<br />

Caging Grasps for Micromanipulation &<br />

Microassembly<br />

David J. Cappelleri, Michael Fatovic, and Zhenbo Fu<br />

Department of Mechanical Engineering, Stevens Institute of Technology, USA<br />

• Systematic method for defining caging grasps for micro-scale planar parts<br />

• Feature-Defined Cages: use<br />

of convex and nonconvex<br />

corner geometries<br />

• Form caging polygon to<br />

apply opposing force<br />

equivalents with probes<br />

• Theoretical error-bound analysis<br />

• Experimental validation of FD-<br />

Cages and error bounds<br />

• Combine with other primitives<br />

to execute sample assembly<br />

task<br />


Figure labels: micromanipulator, probes, caging forces Fc, caging polygon, 225 µm, [Start] to [End]<br />


Session TuAT4 Continental Ballroom 4 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Industrial Forum: Robots: The New Commercial Platforms<br />

Chair Rüdiger Dillmann, KIT Karlsruher Inst. für Tech.<br />

Co-Chair<br />



Session TuAT5 Continental Ballroom 5 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Medical Robotics I<br />

Chair Jaydev P. Desai, Univ. of Maryland<br />

Co-Chair Paolo Fiorini, Univ. of Verona<br />

08:00–08:15 TuAT5.1*<br />

Semi-Plenary Invited Talk: Future of Surgical Robotics:<br />

Confluence of Surgeon demands and Technology<br />

Paolo Fiorini, University of Verona<br />

08:30–08:45 TuAT5.3<br />

Design of a User Interface for<br />

Intuitive Colonoscope Control: the Grip<br />

Nicole Kuperij*, Rob Reilink*, Matthijs P. Schwartz⁺, Stefano<br />

Stramigioli*, Sarthak Misra* and Ivo A.M.J. Broeders*⁺<br />

University of Twente* and Meander Medical Center⁺, The Netherlands<br />

• Colonoscopy is a medical procedure<br />

(screening for polyps to prevent colon<br />

cancer)<br />

• Colonoscope control is difficult: the control handle is large and not ergonomic<br />

• Procedure efficiency can<br />

be improved by using “the<br />

Grip” that actuates a<br />

motorized control handle<br />

“The Grip” controls the colonoscope tip by<br />

simply aiming it in the desired direction<br />

• “The Grip”: an innovative user interface that provides<br />

• Single-handed control<br />

• Intuitive control: a 3D orientation sensor is used to map the orientation of the colonoscope shaft to the orientation of the tip<br />

• This study: colonoscope DOF and use, Grip prototype, testing on<br />

colonoscopy simulator (human subject experiment), promising results<br />

08:15–08:30 TuAT5.2<br />

A Prototype of Pneumatically-Driven Forceps<br />

Manipulator with Force Sensing Capability<br />

Using a Simple Flexible Joint<br />

Daisuke Haraguchi, Kotaro Tadano and Kenji Kawashima<br />

Precision and Intelligence Lab., Tokyo Institute of Technology, Japan<br />

• A highly simplified 2-DOF bending distal joint using a high-performance spring component is developed<br />

• Wire-tendon drive system is implemented<br />

with 4 pneumatic cylinders<br />

• External forces can be estimated using a<br />

disturbance observer (no sensors at the<br />

tip part)<br />


Design of the forceps manipulator<br />

08:45–08:50 TuAT5.4<br />

Robot for Ultrasound-Guided<br />

Prostate Imaging and Intervention<br />

Chunwoo Kim, Felix Schäfer, Doyoung Chang,<br />

Doru Petrisor, Misop Han, Dan Stoianovici<br />

Urology Robotics Laboratory, Johns Hopkins University, U.S.A.<br />

• A 4-degree-of-freedom robot positions and orients the transrectal ultrasound probe for image scanning and needle targeting<br />

• Designed to accommodate the constraints of clinical prostate intervention.<br />

• 2D image slices are tracked and mapped to a 3D volume<br />

• In-vitro studies on prostate mockups verify<br />

3D imaging capabilities.<br />

• Clinically used in robot-assisted<br />

laparoscopic radical prostatectomy for<br />

providing intraoperative ultrasound-based<br />

navigation for the surgeon.<br />

CAD model of the robot (top); postoperative 3D reconstruction of the prostate from intraoperative US images (bottom)<br />



08:50–08:55 TuAT5.5<br />

A Modular, Mechatronic Joint Design for a<br />

Flexible Access Platform for MIS<br />

David Noonan, Valentina Vitiello, Jianzhong Shang<br />

Christopher Payne, Guang-Zhong Yang<br />

Hamlyn Centre for Robotic Surgery, Imperial College London, UK<br />

• Mechatronic joint module with 1-DoF or 2-<br />

DoF featuring embedded actuation (micro<br />

motor and pre-tensioned tendon)<br />

• Analysis of tendon path-length variation, position repeatability and force-transmission characteristics is presented<br />

• 7-DoF articulated robot constructed from<br />

two 2-DoF modules and three 1-DoF<br />

modules<br />

• Demonstrated diagnostic peritoneoscopy<br />

using probe based confocal laser<br />

endomicroscopy during in-vivo porcine<br />

trials<br />

7-DoF robot with three internal<br />

channels<br />

09:00–09:15 TuAT5.7<br />

Design of an Endoscopic Stitching Device for<br />

Surgical Obesity Treatment Using a N.O.T.E.S<br />

Approach<br />

Kai Xu¹, Jiangran Zhao¹ and Minhua Zheng²<br />

UM-SJTU Joint Institute¹ and Department of Surgery²,<br />

Shanghai Jiao Tong University, China<br />

James Geiger³ and Albert J. Shih⁴<br />

Department of Surgery³ and Department of Mechanical Engineering⁴,<br />

University of Michigan, USA<br />

• This paper proposes the design of an endoscopic stitching device for gastroplasty using a N.O.T.E.S. (Natural Orifice Translumenal Endoscopic Surgery) approach, which performs stitching and resizing of the stomach from inside the stomach.<br />

• The major contribution of this paper is the proposal of fabricating suture needles from pre-curved super-elastic NiTi alloy to simplify the suturing motion.<br />

• A minor contribution is the presentation of one possible design to realize the proposed concept.<br />

Design of an endoscopic stitching device using<br />

needles made from pre-curved super-elastic<br />

NiTi alloy: (a) the folded configuration and (b) the unfolded working configuration.<br />

08:55–09:00 TuAT5.6<br />

Development of a “Steerable Drill” for ACL<br />

Reconstruction to Create the Arbitrary<br />

Trajectory of a Bone Tunnel<br />

Hiroki Watanabe, Kazuki Kanou<br />

Graduate school of Science and Engineering, Waseda University, Japan<br />

Yo Kobayashi, Masakatsu G. Fujie<br />

Faculty of Science and Engineering, Waseda University, Japan<br />

• Objective: design of a steerable drill for ACL (anterior cruciate ligament) reconstruction that avoids LCL (lateral collateral ligament) injury.<br />

• Requirements:<br />

– Sufficient flexibility to bend to avoid the LCL while cutting the bone.<br />

– Sufficient rigidity to avoid buckling at the base of the drill while cutting the bone.<br />

• Solution: adoption of a spring sheath whose stiffness varies between the top and base of the drill to satisfy the two requirements.<br />

• Result: the steerable drill achieved 3 times the bending deformation of a uniform-stiffness drill while preventing buckling.<br />


Steerable drill with nonuniform stiffness.<br />

09:15–09:30 TuAT5.8<br />

Active Bending Endoscope Robot System for<br />

Navigation through Sinus Area<br />

Hyun-Soo Yoon, Se Min Oh, Jin Hyeok Jeong, Seung Hwan Lee,<br />

Kyung Tae, and Byung-Ju Yi<br />

Hanyang University, Korea<br />

Kyoung-Chul Koh<br />

Sun Moon University, Korea<br />

• A new endoscope robot system for<br />

general sinus surgery is proposed.<br />

• The endoscope mechanism can be bent by 180 degrees with a small radius of curvature (1 cm).<br />

• The proposed system was successfully applied to inspection of a simulated maxillary sinus area through the nostril of a human silicone model without any collision.<br />

Endoscopic view<br />

Active Bending Endoscope Robot<br />

System


Session TuAT6 Continental Ballroom 6 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Grasping and Manipulation: Control and Learning<br />

Chair Matei Ciocarlie, Willow Garage<br />

Co-Chair Peter Allen, Columbia Univ.<br />

08:00–08:15 TuAT6.1*<br />

Semi-Plenary Invited Talk: Get a Grip -- Robotic<br />

Dexterous Manipulation from Finger Choreography to<br />

The Power Pinch<br />

Mark Cutkosky, Stanford University<br />

08:30–08:45 TuAT6.3<br />

Synergy Level Impedance Control for<br />

Multifingered Hands<br />

Thomas Wimböck 1 , Benjamin Jahn 1,2 , and Gerd Hirzinger 1<br />

1 Institute of Robotics and Mechatronics, DLR, Germany<br />

2 Control Engineering Group, Computer Science and Automation Faculty,<br />

Ilmenau University of Technology, Germany<br />

• Controller to imitate the behaviour of a synergistic (i.e., underactuated) hand.<br />

• Extension of the work on passivity based<br />

hand control at DLR.<br />

• Based on the analysis of the grasp<br />

database of the DLR Hand II.<br />

• Desired equilibrium in synergy coordinates<br />

and the corresponding synergy stiffness<br />

are setpoints/parameters.<br />

• The nullspace impedance allows adjustment of the mechanical coupling.<br />

• Implementation and verification on the DLR<br />

Hand II.<br />

DLR Hand II controlled by a<br />

synergy impedance controller<br />

08:15–08:30 TuAT6.2<br />

Semi-Plenary Invited Talk: Get a Grip -- Robotic<br />

Dexterous Manipulation from Finger Choreography to<br />

The Power Pinch<br />

Mark Cutkosky, Stanford University<br />

08:45–08:50 TuAT6.4<br />

Embodiment-Specific Representation of Robot<br />

Grasping using Graphical Models and Latent-Space<br />

Discretization<br />

Dan Song, Carl Henrik Ek, Kai Huebner and Danica Kragic<br />

School of Computer Science and Communications,<br />

Royal Institute of Technology, Sweden<br />

• We propose an embodiment-specific representation framework for robot grasping.<br />

• To evaluate the framework, we<br />

train two task models, one for<br />

the human hand and one for<br />

the Armar robot hand.<br />

• Both models successfully encode the semantic task requirements in the continuous observation spaces<br />

• Different hand kinematics affect both the Bayesian network structure and the conditional distributions over the modeled variables<br />




08:50–08:55 TuAT6.5<br />

Grasping Unknown Objects using an Early Cognitive<br />

Vision System for General Scene Understanding<br />

Mila Popović and Jimmy Alison Jørgensen and Norbert Krüger<br />

The Mærsk Institute, University of Southern Denmark, Denmark<br />

Gert Kootstra and Danica Kragic<br />

Computer Vision and Active Perception Lab, KTH, Sweden<br />

• Early Cognitive Vision system builds a<br />

hierarchical representation based on edge<br />

and texture information from real stereo<br />

images.<br />

• Edge-based and surface-based grasps<br />

are generated and tested in the simulation<br />

environment.<br />

• The method generates successful grasps, also for complex scenes; edge and surface information are complementary.<br />

• The system can be used as a benchmark for vision-based grasping.<br />

09:00–09:15 TuAT6.7<br />

Imitation Learning of Human Grasping Skills<br />

from Motion and Force Data<br />

Alexander M. Schmidts, Dongheui Lee, Angelika Peer<br />

Institute of Automatic Control Engineering<br />

Technische Universität München<br />

Germany<br />

• Imitation learning of precision grasps with multiple interaction points from<br />

motion and force data<br />

• Physical consistency between demonstrations and reproduction<br />

guaranteed by learning tensions compared to finger interaction forces<br />

• Enhanced generalisability by including force information in the learning<br />

process<br />

Teleoperation system used for demonstration. Using the data glove and motion tracker attached to the user's hand (right), the simulated Elumotion hand is controlled (left). Haptic force feedback is given to the user by means of a CyberGrasp.<br />

08:55–09:00 TuAT6.6<br />

Grasping of Unknown Objects via Curvature<br />

Maximization using Active Vision<br />

Berk Calli, Martijn Wisse and Pieter Jonker<br />

Biomechanical Engineering Department, Delft University of Technology,<br />

The Netherlands<br />

• In this paper, we propose a novel grasping<br />

algorithm that uses active vision as its basis.<br />

• The algorithm does not require a 3D<br />

model or any offline data.<br />

• The algorithm uses the curvature<br />

information obtained from the silhouette of<br />

the object.<br />

• The silhouette is modeled using elliptic<br />

Fourier descriptors (EFDs), and grasping<br />

points and an approach vector are obtained.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–51–<br />

The motion the robot<br />

goes through while grasping.<br />

09:15–09:30 TuAT6.8<br />

Internal force control with no object motion in<br />

compliant robotic grasps<br />

Monica Malvezzi and Domenico Prattichizzo<br />

Department of Information Engineering, University of Siena, Italy<br />

Department of Advanced Robotics, Istituto Italiano di Tecnologia, Genova, Italy<br />

• The control of internal forces is one of the<br />

key issues in grasping.<br />

• When the robotic hand is compliant, and<br />

the number of controlled variables is low, it<br />

is possible that the control of internal<br />

forces implies the motion of the<br />

manipulated object.<br />

• The paper studies the structural conditions<br />

for the control of internal forces which do<br />

not involve any motion of the grasped<br />

object.<br />

• The analysis is constructive and a<br />

controller of internal forces is proposed.


Session TuAT7 Continental Parlor 7 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Software Architectures & Frameworks<br />

Chair Geoffrey Biggs, National Inst. of AIST<br />

Co-Chair Roland Philippsen, Stanford Univ.<br />

08:00–08:15 TuAT7.1<br />

Intelligent System Architectures –<br />

Comparison by Translation<br />

Benjamin Dittes and Christian Goerick<br />

Honda Research Institute Europe GmbH, Germany<br />

• System architectures are important for<br />

constructing large software systems;<br />

scientific discourse requires comparability<br />

• We present translation to a common<br />

language, which we call ‘Systematica 2D’,<br />

as a way to allow comparison<br />

• The approach is verified on three recent,<br />

popular system architectures: CogX<br />

‘George’, ‘ALIS3’ and the iCub Cognitive<br />

Architecture<br />

• Resulting Systematica 2D diagrams allow<br />

analysis of common patterns, constraints<br />

and design decisions<br />

Main information flows in ‘George’,<br />

‘ALIS3’ and ‘iCub’ architectures,<br />

derived from Systematica 2D notations<br />

08:30–08:45 TuAT7.3<br />

Analysis of Software Connectors in Robotics<br />

Azamat Shakhimardanov, Nico Hochgeschwender,<br />

Michael Reckhaus, Gerhard K. Kraetzschmar<br />

Bonn-Rhine-Sieg University of Applied Sciences, Germany<br />

• Method to analyse quality attributes in component-oriented robotic<br />

software<br />

• Protocol stack view used for the analysis of distributed robotic<br />

software<br />

• Scalability experiments with ROS and ZeroMQ<br />

• Lessons learned in system design for large-scale robotic applications<br />

08:15–08:30 TuAT7.2<br />

Conductor: A Controller Development<br />

Framework for High Degree of Freedom Systems<br />

Robert Sherbert and Paul Oh<br />

Mechanical Engineering, Drexel University, USA<br />

• Conductor is a software development<br />

framework for building controllers for high<br />

degree of freedom systems.<br />

• Conductor simplifies porting controllers<br />

between robots by representing hardware<br />

in terms of state variables.<br />

• Conductor can achieve significant<br />

increases in bandwidth usage efficiency<br />

within complex systems.<br />


A time progression of Conductor<br />

running a walking trajectory on a<br />

high degree of freedom robot.<br />

08:45–08:50 TuAT7.4<br />

An Open Source Extensible Software Package to Create<br />

Whole-Body Compliant Skills in Personal Mobile Manipulators<br />

Roland Philippsen, Luis Sentis, Oussama Khatib<br />

Stanford Robotics and AI Lab, USA<br />

University of Texas at Austin, USA<br />

• Whole-body operational space<br />

control is a powerful compliant<br />

control approach for robots that<br />

physically interact with their<br />

environment<br />

• Open-source implementation<br />

with runtime configurability,<br />

ease of reuse and extension,<br />

and independence from specific<br />

middlewares or operating<br />

systems<br />

• Experiments on Dreamer and<br />

PR2 show the technical<br />

performance and portability.<br />

Visit http://stanford-wbc.sourceforge.net



08:50–08:55 TuAT7.5<br />

A component supervisor for RT-Middleware<br />

using supervision trees<br />

Geoffrey Biggs and Noriaki Ando and Tetsuo Kotoku<br />

Intelligent Systems Research Institute<br />

National Institute of Advanced Industrial Science and Technology<br />

Japan<br />

• Deployment and run-time management are<br />

an important part of a component-based<br />

software system.<br />

• We have developed a deployment and<br />

management tool for RT-Middleware.<br />

• The tool is based on the Supervision Tree<br />

concept from the Erlang programming<br />

language.<br />

• The tool shows the validity of the<br />

Supervision Tree approach, although<br />

further research is needed to adapt it to<br />

robotics.<br />
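The Supervision Tree idea the tool borrows from Erlang can be sketched in a few lines. The `Supervisor` class below is a hypothetical illustration of the one-for-one restart policy, not the authors' RT-Middleware tool:

```python
class Supervisor:
    """Minimal one-for-one supervisor: when a child component fails, it is
    re-created from its factory and retried, up to a restart limit."""

    def __init__(self, max_restarts=3):
        self.max_restarts = max_restarts
        self.factories = {}   # child name -> factory returning a runnable
        self.restarts = {}    # child name -> restarts consumed so far

    def supervise(self, name, factory):
        self.factories[name] = factory
        self.restarts[name] = 0

    def run(self, name):
        while True:
            child = self.factories[name]()   # (re-)create the component
            try:
                return child()
            except Exception:
                self.restarts[name] += 1
                if self.restarts[name] > self.max_restarts:
                    raise                    # give up: escalate the failure
```

In a real supervision tree, supervisors themselves are children of higher supervisors, so failures escalate level by level instead of crashing the whole system.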

09:00–09:15 TuAT7.7<br />

The Computing and Communication<br />

Architecture of the DLR Hand Arm System<br />

Stefan Jörg, Mathias Nickl, Alexander Nothhelfer,<br />

Thomas Bahls, and Gerd Hirzinger<br />

Inst. of Robotics and Mechatronics, German Aerospace Center, Germany<br />

• How to handle 52 motors and 430<br />

sensors?<br />

• How to achieve a convenient but high-<br />

performance interface to control the<br />

robot?<br />

• The presented solution uses a hierarchical<br />

net of computing nodes and combines the<br />

concepts of middleware and device<br />

drivers<br />

• First experiments with control applications<br />

(100 kHz / 3 kHz cycles) demonstrate the<br />

performance of the architecture<br />

08:55–09:00 TuAT7.6<br />


09:15–09:30 TuAT7.8<br />

Caliper: A Universal Robot Simulation<br />

Framework for Tendon-Driven Robots<br />

S. Wittmeier, M. Jäntsch, K. Dalamagkidis, M. Rickert and A. Knoll<br />

Faculty of Informatics, TU München, Germany<br />

H.G. Marques<br />

Department of Informatics, University of Zurich, Switzerland<br />

• Caliper is a newly developed simulation<br />

framework targeted to the emerging class<br />

of tendon-driven robots.<br />

• The framework uses a modular, CORBA-<br />

and component-based approach. This<br />

provides easy extensibility and tailoring to<br />

meet user requirements.<br />

• The simulation models are defined in the<br />

standardized XML-based COLLADA<br />

format.<br />

• Physics-engine extensions are included to<br />

simulate tendon-driven actuators.<br />

• Caliper also provides tools for real-time<br />

data acquisition and visualization.<br />


Caliper screenshot (top) and<br />

prototype of the anthropomimetic,<br />

tendon-driven robot ECCE-I<br />

(bottom)


Session TuAT8 Continental Parlor 8 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Bio-inspired & Biomimetic Robots<br />

Chair Robert Wood, Harvard Univ.<br />

Co-Chair Howie Choset, Carnegie Mellon Univ.<br />

08:00–08:15 TuAT8.1<br />

Using Response Surfaces and Expected<br />

Improvement to Optimize Snake Robot Locomotion<br />

Matthew Tesch, Jeff Schneider and Howie Choset<br />

Robotics Institute, Carnegie Mellon University, United States of America<br />

• Goal: Improve the speed and capabilities<br />

of various snake robot gaits<br />

• Problem: Calculating the performance of a<br />

gait is expensive, as accurate models are<br />

unavailable<br />

• Gait optimization requires evaluations on<br />

an actual robot, which are expensive in<br />

time and resources<br />

• Solution: Fit a response surface as a<br />

surrogate objective function to assist with<br />

experiment selection<br />
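Response-surface methods of this kind typically rank candidate gait parameters by expected improvement (EI). A generic EI acquisition function, assuming a surrogate that returns a predictive mean and standard deviation at a candidate (an illustration, not the authors' implementation):

```python
import math

def expected_improvement(mu, sigma, best, xi=0.0):
    """EI for maximization: EI = (mu - best - xi) * Phi(z) + sigma * phi(z),
    with z = (mu - best - xi) / sigma, under a Gaussian surrogate."""
    if sigma <= 0.0:
        return max(mu - best - xi, 0.0)
    z = (mu - best - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    return (mu - best - xi) * cdf + sigma * pdf

def next_candidate(candidates, surrogate, best):
    """Pick the gait parameters worth one expensive robot trial: the
    candidate whose surrogate prediction has the highest EI."""
    return max(candidates,
               key=lambda x: expected_improvement(*surrogate(x), best))
```

EI trades off exploiting a high predicted mean against exploring where the surrogate is uncertain, which is exactly why it suits expensive on-robot evaluations.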

08:30–08:45 TuAT8.3<br />

Snake-like Active Wheel Robot ACM-R4.1<br />

with Joint Torque Sensor and Limiter<br />

Shunichi Takaoka and Hiroya Yamada<br />

Mechanical and Aerospace Engineering, Tokyo Institute of Technology, Japan<br />

Shigeo Hirose<br />

Mechanical and Aerospace Engineering, Tokyo Institute of Technology, Japan<br />

• This paper proposes a snake-like active<br />

wheel robot named “ACM-R4.1”<br />

• ACM-R4.1 has a new mechanism which<br />

works as both a torque sensor and a<br />

torque limiter<br />

• We give a theoretical analysis of the<br />

torque sensor design and conducted<br />

torque measurement experiments<br />

• We also demonstrated terrain-adaptive<br />

motion using the torque sensors<br />

ACM-R4.1<br />

08:15–08:30 TuAT8.2<br />

State Estimation for Snake Robots<br />

David Rollinson, Austin Buchan and Howie Choset<br />

The Robotics Institute, Carnegie Mellon University, USA<br />

• We use an EKF to fuse<br />

proprioceptive data from the<br />

individual modules of a snake<br />

robot.<br />

• Using the Virtual Chassis<br />

instead of a fixed body frame<br />

allows more accurate estimates<br />

of pitch and roll.<br />

• State size is reduced by using<br />

gait parameters to approximate<br />

the shape of the robot during<br />

estimation.<br />
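The proprioceptive fusion step can be illustrated with a scalar Kalman filter. This is a toy one-dimensional sketch with made-up noise parameters; the paper's EKF runs on the full multi-module state expressed in the Virtual Chassis frame:

```python
def kf_step(x, P, u, z, q=0.01, r=0.1):
    """One predict/update cycle of a 1-D Kalman filter fusing a commanded
    rate u with a direct measurement z (dt = 1 for brevity). An EKF replaces
    the constant model terms with Jacobians of nonlinear models."""
    # Predict: integrate the motion model and inflate uncertainty.
    x_pred = x + u
    P_pred = P + q
    # Update: blend prediction and measurement by the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1.0 - K) * P_pred
    return x_new, P_new
```

Repeating the cycle drives the covariance P toward a steady state set by the process/measurement noise ratio.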

08:45–08:50 TuAT8.4<br />

Task-Space Control of Extensible Continuum<br />

Manipulators<br />

Apoorva Kapadia and Ian Walker<br />

Department of Electrical & Computer Engineering, Clemson University, USA<br />

• A new approach towards control of<br />

extensible continuum robots is presented.<br />

• Task-space control is considered for<br />

continuum manipulators for the first time.<br />

• Model-based control developed extending<br />

controllers for rigid-link robots.<br />

• Regulation of any location along the<br />

backbone of the robot as well as the tip.<br />


The Octarm Grasping a<br />

Cylindrical Object



08:50–08:55 TuAT8.5<br />

Novel Modal Approach for Kinematics of<br />

Multisection Continuum Arms<br />

Isuru S. Godage, Emanuele Guglielmino, David T. Branson,<br />

Gustavo A. Medrano-Cerda, and Darwin G. Caldwell<br />

Advanced Robotics Department,<br />

Istituto Italiano di Tecnologia, Italy.<br />

• Uses mode shape function (MSF) based<br />

approach to simplify and solve<br />

singularities<br />

• Introduces an intuitive way of deriving<br />

correct MSFs eliminating mode switching<br />

• Simulates forward kinematic spatial<br />

bending and pure elongation/contraction<br />

• Introduces inverse orientation kinematics<br />

for multisection continuum arms<br />

• Computes inverse position kinematics,<br />

with or without orientation, efficiently<br />

• Proposed method is applicable to a broad<br />

spectrum of continuum robotic arm<br />

designs<br />

Arm demonstrating inverse<br />

position with orientation<br />

kinematics in a simulated object<br />

inspection task<br />

09:00–09:15 TuAT8.7<br />

System identification and linear time-invariant modeling<br />

of an insect-sized flapping-wing micro air vehicle<br />

Ben Finio, Néstor Pérez-Arancibia and Robert Wood<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

• Dynamic modeling of flapping-wing MAVs<br />

is typically highly nonlinear<br />

• We present a linear time-invariant (LTI)<br />

model and compare it to an identified<br />

model derived using system identification<br />

• We show that the linear model can<br />

provide an accurate estimate of system<br />

behavior despite inherent nonlinearities<br />

• Implications for vehicle design and<br />

optimization are discussed<br />

New prototype of the Harvard<br />

Microrobotic Fly<br />
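Identifying an LTI model from recorded input/output data reduces to least squares. A toy first-order example of the general idea (a hypothetical scalar model, not the flapping-wing dynamics themselves):

```python
def fit_lti(u, y):
    """Least-squares fit of a first-order LTI model y[k+1] = a*y[k] + b*u[k]
    from recorded input u and output y, via the 2x2 normal equations."""
    Syy = Syu = Suu = Sy1y = Sy1u = 0.0
    for k in range(len(y) - 1):
        Syy += y[k] * y[k]
        Syu += y[k] * u[k]
        Suu += u[k] * u[k]
        Sy1y += y[k + 1] * y[k]
        Sy1u += y[k + 1] * u[k]
    # Solve [Syy Syu; Syu Suu] [a; b] = [Sy1y; Sy1u] by Cramer's rule.
    det = Syy * Suu - Syu * Syu
    a = (Sy1y * Suu - Sy1u * Syu) / det
    b = (Sy1u * Syy - Sy1y * Syu) / det
    return a, b
```

With a persistently exciting input (one that keeps the regressors independent), the fit recovers the true parameters from noiseless data.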

08:55–09:00 TuAT8.6<br />

Hardware in the Loop for<br />

Optical Flow Sensing in a Robotic Bee<br />

Pierre-Emile Duhamel, Judson Porter, Ben Finio,<br />

David Brooks, Gu-Yeon Wei, and Robert Wood<br />

School of Engineering and Applied Science, Harvard University, USA<br />

Geoffrey Barrows<br />

Centeye, Inc., USA<br />

• Successful autonomous robotic bees<br />

require optimized weight and power<br />

consumption to increase flight time.<br />

• Optimizing components is challenging<br />

due to interconnectedness and varying<br />

interdisciplinary component<br />

development timelines.<br />

• Hardware in the loop (HWIL) provides<br />

a modular combination of robotic bee<br />

hardware and software components.<br />

• We demonstrate the value of HWIL for<br />

design of an optical flow sensor.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–55–<br />

Robotic Bee Hardware in the Loop<br />

09:15–09:30 TuAT8.8<br />

The acquisition of intentionally indexed and<br />

object centered affordance gradients<br />

Martí Sánchez-Fibla, Armin Duff and Paul Verschure<br />

SPECS, Technology Department, Universitat Pompeu Fabra (UPF), Spain<br />

• We introduce Affordance Gradients<br />

(AGs): sensorimotor structures that<br />

allow prediction of the consequences of<br />

“pushing” actions.<br />

• We show results in two proposed<br />

benchmarks:<br />

• (I) Pushing an object along a<br />

predefined trajectory<br />

• (II) Placing an object at a target<br />

position and orientation<br />

• Figure (a) shows a learned AG of a<br />

square object in Webots simulation<br />

environment.<br />

• Figure (b) shows the trajectory<br />

displayed by triangular and l-shaped<br />

pieces of benchmark (II).<br />

(a)<br />

(b)


Session TuAT9 Continental Parlor 9 Tuesday, September 27, <strong>2011</strong>, 08:00–09:30<br />

Probabilistic Exploration and Coverage<br />

Chair S.K. Gupta, Univ. of Maryland<br />

Co-Chair Timothy Bretl, Univ. of Illinois at Urbana-Champaign<br />

08:00–08:15 TuAT9.1<br />

Exploration driven by Local Potential Distortions<br />

Edson Prestes and Paulo Engel<br />

Instituto de Informática<br />

Universidade Federal do Rio Grande do Sul (UFRGS)<br />

Brazil<br />

• We extend our exploration method, based on boundary value problems (BVP), to allow the robot to<br />

explore an unknown environment dynamically while considering<br />

environment preferences<br />

• These preferences generate distortions<br />

in the potential field that change locally<br />

the concavity/convexity of the potential<br />

• They can be thought of as a utility<br />

measure that provides an estimate of the<br />

environment information gain and/or the<br />

localization accuracy<br />

Exploration of an unknown environment<br />

with high (dark gray) and low (light gray)<br />

preference regions<br />

08:30–08:45 TuAT9.3<br />

Adaptive Look-Ahead for Robotic Navigation in<br />

Unknown Environments<br />

Greg Droge and Magnus Egerstedt<br />

Electrical and Computer Engineering<br />

Georgia Institute of Technology, USA<br />

• Schema-based behaviors allow for<br />

navigation based on sensor<br />

measurements alone.<br />

• We introduce a receding horizon approach<br />

for adaptively coordinating these<br />

behaviors.<br />

• We are operating in an unknown<br />

environment: it is not clear what the time<br />

horizon should be for the receding horizon<br />

problem.<br />

• We introduce a method for online<br />

adaptation of the time horizon based on<br />

how well we have predicted the state<br />

trajectory.<br />
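The online horizon adaptation in the last bullet can be sketched as a rule that shrinks the horizon when the predicted state trajectory diverged from the measured one, and grows it otherwise. The gains and bounds below are made up for illustration; this is not the authors' update law:

```python
def adapt_horizon(T, pred_error, err_tol, T_min=1.0, T_max=10.0, gain=0.5):
    """Shrink the receding-horizon length T when the prediction error over
    the last horizon exceeded the tolerance; grow it when prediction held."""
    if pred_error > err_tol:
        T = max(T_min, T * (1.0 - gain))   # environment surprised us: look less far ahead
    else:
        T = min(T_max, T * (1.0 + gain))   # predictions held: plan further ahead
    return T
```

The bounds keep the planner from degenerating into purely reactive behavior (T too small) or over-committing in unknown terrain (T too large).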

08:15–08:30 TuAT9.2<br />

Histogram Based Frontier Exploration<br />

Amir Mobarhani, Shaghayegh Nazari, Amir H. Tamjidi, Hamid D.<br />

Taghirad<br />

Advanced Robotics and Automated Systems Lab (ARAS), Electrical and<br />

Computer Engineering Department, K. N. Toosi University of Technology, Iran<br />

• Histogram based exploration is less<br />

dependent on thresholds.<br />

• It is a weighted combination of local<br />

search (for closest frontiers) and global<br />

search (for biggest frontiers)<br />

• The criterion for choosing the exploration<br />

sub-goal is simple to implement and<br />

flexible.<br />

• Experimental results show that with low<br />

computational cost we can achieve better<br />

performance compared to the original<br />

paradigm of frontier exploration.<br />
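The weighted local/global combination can be sketched as a scoring function over candidate frontiers, here given as (distance, size) pairs. This is a generic illustration of the idea, not the paper's histogram formulation:

```python
def score_frontiers(frontiers, w_local=0.5):
    """Pick the frontier index maximizing a weighted mix of the local
    criterion (closeness) and the global criterion (size), each
    normalized to [0, 1]. `frontiers` is a list of (distance, size)."""
    max_d = max(d for d, _ in frontiers) or 1.0
    max_s = max(s for _, s in frontiers) or 1.0

    def score(f):
        d, s = f
        closeness = 1.0 - d / max_d        # local search term
        bigness = s / max_s                # global search term
        return w_local * closeness + (1.0 - w_local) * bigness

    return max(range(len(frontiers)), key=lambda i: score(frontiers[i]))
```

Sweeping `w_local` from 1 to 0 moves the behavior from greedy nearest-frontier exploration to always chasing the largest unexplored boundary.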

08:45–08:50 TuAT9.4<br />

A receding horizon approach to generating<br />

dynamically feasible plans<br />

for vehicles that operate over large areas<br />

Daniel J. Stilwell and Aditya S. Gadre<br />

ECE Department, Virginia Tech, USA<br />

Andrew J. Kurdila<br />

ME Department, Virginia Tech, USA<br />

• Real-time planning of dynamically<br />

feasible trajectories for large operating<br />

environments.<br />

• Short-horizon trajectories are computed<br />

continuously. Global path is computed<br />

only when needed.<br />

• Formal guarantees that sequence of<br />

trajectories will converge to desired<br />

location<br />

• Suitable for vehicle systems where the<br />

kinematic model is a poor approximation<br />

of the vehicle dynamics.<br />




08:50–08:55 TuAT9.5<br />

Planning for Landing Site Selection in the Aerial<br />

Supply Delivery<br />

Aleksandr Kushleyev and Brian MacAllister<br />

University of Pennsylvania, USA<br />

Maxim Likhachev<br />

Carnegie Mellon University, USA<br />

• Our goal is to generate an optimal policy<br />

for a UAV that minimizes the expected cost<br />

of flying, sensing and landing<br />

• Availability of landing sites is unknown<br />

• Modeling uncertainty in the problem<br />

representation allows the planner to reason<br />

about missing information<br />

• Resulting policies are more realistic, but<br />

complexity of the problem increases<br />

exponentially with number of unknowns<br />

• Existing PPCP framework efficiently and<br />

optimally solves such problems if certain<br />

conditions hold (Clear Preferences)<br />

• Our implementation provides rigorous<br />

theoretical guarantees and runs on a<br />

custom quad-rotor helicopter<br />

Prior terrain data<br />

Landing sites and sample paths<br />

09:00–09:15 TuAT9.7<br />

Probably Approximately Correct Coverage<br />

for Robots with Uncertainty<br />

Colin Das 1 , Aaron Becker 2 , and Timothy Bretl 1<br />

1 Aerospace Eng., University of Illinois at Urbana-Champaign, USA<br />

2 Electrical and Computer Eng., University of Illinois at Urbana-Champaign, USA<br />

• With uncertainty, it may be<br />

impossible to guarantee that a<br />

robot covers all the free space<br />

in finite time — how should we<br />

measure performance?<br />

• New performance measure:<br />

probability 1−ε of covering a<br />

fraction 1−δ of free space<br />

P(C ≥ 1−δ) ≥ 1−ε, with ε, δ ∈ [0,1]<br />

• Our paper shows the practical<br />

utility of this new performance<br />

measure<br />


coverage with negligible uncertainty in<br />

sensing and actuation<br />

coverage with significant uncertainty<br />

in sensing and actuation<br />

How do we define “good performance” for coverage with uncertainty?<br />

08:55–09:00 TuAT9.6<br />

Trajectory Planning with Look-Ahead for<br />

Unmanned Sea Surface Vehicles to<br />

Handle Environmental Disturbances<br />

Petr Svec and Atul Thakur<br />

Department of Mechanical Engineering,<br />

University of Maryland, College Park, USA<br />

Maxim Schwartz<br />

Energetics Technology Center, USA<br />

Satyandra K. Gupta<br />

Department of Mechanical Engineering and Institute for Systems Research,<br />

University of Maryland, College Park, USA<br />

• Developed a look-ahead based<br />

algorithm for trajectory planning<br />

under motion uncertainty for<br />

USSVs<br />

• Motion uncertainty is modeled<br />

and explicitly used during the<br />

trajectory computation<br />

• Algorithm combines heuristic<br />

search with a variation of<br />

minimax game-tree search<br />

• Contingency plan is generated<br />

to efficiently handle sudden large<br />

deviations from the intended<br />

trajectory<br />


Trajectory A: Overly conservative trajectory<br />

Trajectory B: Nominal trajectory computed by the finite<br />

horizon look-ahead planner<br />

Trajectory C: Contingency trajectory<br />

Trajectory planning under motion<br />

uncertainty for USSVs<br />

09:15–09:30 TuAT9.8<br />

A Dynamic Sensor Placement Algorithm for<br />

Dense Sampling<br />

Vineet Bhatawadekar, Ravishankar Sivalingam<br />

and Nikolaos Papanikolopoulos<br />

Department of Computer Science, University of Minnesota Twin Cities, USA<br />

• Objective: Determine the next move of<br />

the robot based on the spatial sampling<br />

density of the sensor<br />

• Approach: A convex optimization<br />

framework for determining the next best move.<br />

• Design a convex constraint function<br />

based on the sampling density, and<br />

satisfy as many constraints as possible to<br />

obtain uniform and dense sampling<br />

Object search with our placement<br />

algorithm


Session TuBT1 Continental Parlor 1 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Perception, Saliency and Novelty<br />

Chair Nicholas Gans, Univ. Texas at Dallas<br />

Co-Chair Michael Zillich, Vienna Univ. of Tech.<br />

10:00–10:15 TuBT1.1<br />

Multimodal Saliency-based Attention for<br />

Object-based Scene Analysis<br />

Boris Schauerte 1 , Benjamin Kühn 1 , Kristian Kroschel 2<br />

and Rainer Stiefelhagen 1,2<br />

1 Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT), Germany<br />

2 Fraunhofer IOSB, Germany<br />

• Multimodal attention is important for robots<br />

to operate in complex environments and<br />

act as social, cognitive human partners<br />

• Overt attention: active saliency-driven<br />

sensor alignment to improve the perception<br />

• Isophote-based saliency map<br />

segmentation for visual proto-objects<br />

• Surprise-based auditory saliency<br />

• Parametric 3-D saliency model enables<br />

efficient audio-visual saliency fusion<br />

• Audio-visual object validation and<br />

hierarchical, knowledge-driven analysis<br />

• Object-based inhibition of return<br />

Auditory Surprise (bottom)<br />

Fitted Gaussian models (uses Isophote-based<br />

Segmentation; top) and<br />

sequence of attentional shifts (bottom)<br />

10:30–10:45 TuBT1.3<br />

Optimisation of Gaze Movement for Multitasking<br />

Using Rewards<br />

Cem Karaoguz 1,2 , Tobias Rodemann 2 and Britta Wrede 1<br />

1 ) Research Institute for Cognition and Robotics (CoR-Lab), Bielefeld<br />

University, Germany<br />

2 ) Honda Research Institute Europe GmbH, Offenbach, Germany<br />

• A framework that optimises active scene<br />

perception for modular systems is<br />

presented.<br />

• Individual visual processes were<br />

encapsulated in modules.<br />

• Our framework learns how to utilise these<br />

modules in time for different tasks.<br />

• Demonstrated in a multitasking scenario<br />

and performed better than pre-programming<br />

and saliency approaches.<br />

10:15–10:30 TuBT1.2<br />

Robots Looking for Interesting Things:<br />

Extremum Seeking Control on Saliency Maps<br />

Yinghua Zhang, Jinglin Shen and Nicholas Gans<br />

Dept. of Electrical Engineering, University of Texas at Dallas, USA<br />

Mario Rotea<br />

Dept. of Mechanical Engineering, University of Texas at Dallas, USA<br />

• We present a Simplex Guided<br />

Extremum Seeking Control<br />

algorithm applied on a 6-DOF<br />

eye-in-hand robot arm to<br />

optimize the visual stimuli.<br />

• We use saliency to represent<br />

the level of attractiveness of<br />

visual stimuli, which we take as<br />

a measurement of interest.<br />

• Simulations and experiments<br />

show the new algorithm is more<br />

likely to find a global maximum<br />

than the conventional<br />

Extremum Seeking Control.<br />
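A derivative-free ascent of the kind used in extremum seeking can be sketched with a one-dimensional pattern search. This is a rough stand-in for the simplex-guided variant, which operates on the 6-DOF arm's pose rather than a scalar:

```python
def seek_maximum(f, x, step=1.0, tol=1e-3):
    """Pattern-search ascent on an objective f (e.g. a saliency value read
    from the camera): probe both sides of the current point, move toward a
    strictly better probe, and shrink the step when neither side improves."""
    while step > tol:
        best = max((x - step, x + step), key=f)
        if f(best) > f(x):
            x = best            # climb toward the better probe
        else:
            step *= 0.5         # no improvement: refine the search scale
    return x
```

Because the search only needs objective evaluations, it suits eye-in-hand setups where the saliency map is sampled by physically moving the camera.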

10:45–10:50 TuBT1.4<br />

Novelty Detection Using Growing Neural Gas for<br />

Visuo-Spatial Memory<br />

Dmitry Kit and Dana Ballard<br />

Computer Science, University of Texas, USA<br />

Brian Sullivan<br />

Psychology, University of Texas, USA<br />

• Change detection is a mechanism that can<br />

be used as an online sensory input filter<br />

• A learned distribution of visuo-spatial<br />

features is used to detect changes<br />

• Trained a self-organizing map with color<br />

histograms and spatial data from a first-<br />

person camera<br />

• Online solution predicts color content for a<br />

viewpoint and localizes changes that<br />

deviate from priors<br />


Change detection allows for online<br />

filtering so that image parsing<br />

computation can be constrained to<br />

small image regions (black regions)
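Novelty detection against a learned codebook (the role played here by the self-organizing map) can be sketched as a nearest-prototype test plus an online update. This is illustrative only, since Growing Neural Gas additionally grows and rewires its nodes:

```python
def _dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def is_novel(sample, prototypes, radius):
    """A feature vector (e.g. a color histogram) is novel when no learned
    prototype lies within `radius` of it."""
    return min(_dist2(sample, p) for p in prototypes) > radius ** 2

def adapt(sample, prototypes, lr=0.1):
    """Pull the best-matching prototype toward the sample (online update)."""
    i = min(range(len(prototypes)), key=lambda k: _dist2(sample, prototypes[k]))
    prototypes[i] = [p + lr * (s - p) for p, s in zip(prototypes[i], sample)]
```

Samples that pass the novelty test are exactly the regions worth spending further image-parsing computation on; the rest are filtered out.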



10:50–10:55 TuBT1.5<br />

Coherent Spatial Abstraction and Stereo Line<br />

Detection for Robotic Visual Attention<br />

Kai Zhou, Andreas Richtsfeld, Michael Zillich and Markus Vincze<br />

Automation and Control Institute, Vienna University of Technology, Austria<br />

• Planar surfaces are estimated from 3D<br />

point cloud data for spatial reasoning<br />

• Matching edges in image pairs to compute<br />

stereo lines<br />

• Assessing planes and lines to compute<br />

significance values for probabilistic<br />

optimization<br />

• Refine the 2D saliency map by eliminating<br />

spurious attention regions according to<br />

spatial restrictions and enhancing correct<br />

ones where stereo lines are detected<br />

Top: robotic attention with 2D saliency<br />

Bottom: attention with combination of<br />

spatial and stereo line information<br />

11:00–11:15 TuBT1.7<br />

Visual Anomaly Detection under Temporal and<br />

Spatial Non-uniformity for News Finding Robot<br />

Takahiro Suzuki, Fumihiro Bessho<br />

Tatsuya Harada and Yasuo Kuniyoshi<br />

Department of Mechano-Informatics, Graduate School of Information<br />

Science and Technology, The University of Tokyo, Japan<br />

• Visual anomaly detection framework for a<br />

news finding robot system is proposed.<br />

• Non-uniform Hidden Markov Model is<br />

proposed to handle intermittent<br />

observation at each place.<br />

• The framework reuses information from<br />

similar places. Place similarity is<br />

automatically acquired.<br />

• Experimental results show that the robot<br />

can find news in the real world.<br />

A robot system based on<br />

the proposed anomaly detection.<br />

10:55–11:00 TuBT1.6<br />

Visual Machinery Surveillance<br />

for High-Speed Periodic Operations<br />

Idaku Ishii, Yao-dang Wang and Takeshi Takaki<br />

Robotics Laboratory, Hiroshima University, JAPAN<br />

• Machinery surveillance algorithm for<br />

abnormal behavior detection in long-term<br />

periodic machinery operations in factories<br />

• Automatic phase encoding using several<br />

significant pixels for periodic operations<br />

• Fast abnormity detection by differencing<br />

input images from their synchronized<br />

reference images.<br />

• Automated HFR video logging system<br />

using the machinery surveillance algorithm<br />

on a 1000 fps high-speed vision platform<br />

• Abnormal behavior of a sewing machine<br />

operating at 12 Hz was recorded in real<br />

time as a 512x512 pixel video at 1000 fps.<br />

Abnormal behavior in a sewing machine:<br />

HFR image sequence, selected reference<br />

images, and detected abnormal pixels<br />

((a) input image, (b) reference image,<br />

(c) difference image)<br />

11:15–11:30 TuBT1.8<br />

Representation of Manipulation-Relevant Object<br />

Properties and Actions<br />

Susanne Petsch and Darius Burschka<br />

Department of Informatics, Technische Universität München, Germany<br />

• Detection of changes in physical<br />

and functional properties of<br />

objects from unexpected human<br />

actions<br />

• Object-centric representation of<br />

manipulation constraints<br />

• Grounding of a-priori information<br />

to observed geometric and<br />

action attributes<br />

• Efficient monitoring system for<br />

detection of unexpected events<br />

in varying repetitive actions


Session TuBT2 Continental Parlor 2 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Semantic SLAM & Loop Closure<br />

Chair Wolfram Burgard, Univ. of Freiburg<br />

Co-Chair Javier Civera, Univ. de Zaragoza<br />

10:00–10:15 TuBT2.1<br />

Application of Locality Sensitive Hashing to<br />

Realtime Loop Closure Detection<br />

Hossein Shahbazi and Hong Zhang<br />

Department of Computing Science, University of Alberta, Canada<br />

• Using Euclidean Locality Sensitive<br />

Hashing (E2LSH) for finding similar visual<br />

features<br />

• Automatic parameter tuning considering<br />

only the distribution of data<br />

• Feature matching using the distance ratio<br />

criterion<br />

• Accurate loop closure detection by<br />

computing image similarities<br />

System Overview:<br />

individual features are<br />

looked up in the map using<br />

LSH<br />
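The core of E2LSH is a family of hash functions h(v) = ⌊(a·v + b)/w⌋ that tend to send nearby feature vectors to the same bucket. A minimal sketch of one such hash function, with the bucket width `w` and seed chosen purely for illustration:

```python
import random

def make_hash(dim, w, seed=0):
    """Build one Euclidean LSH hash function h(v) = floor((a.v + b) / w),
    with projection vector a drawn from a Gaussian and offset b uniform in
    [0, w). Nearby descriptors are likely to share a bucket; a full E2LSH
    index concatenates several such functions into composite keys."""
    rng = random.Random(seed)
    a = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    b = rng.uniform(0.0, w)

    def h(v):
        return int((sum(ai * vi for ai, vi in zip(a, v)) + b) // w)

    return h
```

Candidate matches found by bucket collision would then be verified with the distance-ratio criterion mentioned above.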

10:30–10:45 TuBT2.3<br />

Bathymetric SLAM with No Map Overlap using<br />

Gaussian Processes<br />

Stephen Barkby, Stefan Williams, Oscar Pizarro and Michael<br />

Jakuba<br />

Australian Centre for Field Robotics, University of Sydney, Australia<br />

• Navigation and Mapping for AUVs can<br />

be corrected by performing SLAM with<br />

multibeam sonar and a Rao-Blackwellized<br />

Particle Filter.<br />

• Memory requirements significantly<br />

reduced by storing maps as trajectories<br />

linked to common bathymetry.<br />

• Gaussian Process Regression allows<br />

loop closures to be performed even<br />

when no map overlap is present.<br />
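The Gaussian Process Regression step can be sketched in a few lines: a standard GP with a squared-exponential kernel predicts depth, with uncertainty, beyond the mapped area. Hyperparameters here are illustrative, not those of the paper:<br />

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length=1.0, sigma_f=1.0, noise=1e-3):
    """Plain GP regression: predict bathymetry (mean and std) at query
    positions from observed (position, depth) pairs."""
    def k(A, B):  # squared-exponential kernel
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sigma_f**2 * np.exp(-0.5 * d2 / length**2)
    K = k(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = k(X_test, X_train)
    Kss = k(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha                              # predicted depth
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)      # predictive covariance
    return mean, np.sqrt(np.clip(np.diag(cov), 0, None))
```

The predictive std grows toward the kernel prior away from observed swaths, which is what makes loop closure hypotheses possible without map overlap.<br />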

10:15–10:30 TuBT2.2<br />

BRIEF-GIST –<br />

Closing the Loop by Simple Means<br />

Niko Sünderhauf and Peter Protzel<br />

Department of Electrical Engineering and Information Technology,<br />

Chemnitz University of Technology, Germany<br />

• We propose BRIEF-Gist, a highly efficient<br />

scene descriptor for place recognition and<br />

loop closure detection.<br />

• We compare it to FAB-Map on two<br />

standard datasets. Despite its simplicity,<br />

BRIEF-Gist performs comparably well.<br />

• We perform city-scale SLAM using a novel<br />

optimization-based SLAM formulation that<br />

is robust against the few remaining false-positive<br />

loop closures.<br />
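The idea of a holistic binary scene descriptor can be sketched as follows; the downsampling size, bit count, and test-point pattern below are assumed for illustration, not taken from the paper:<br />

```python
import numpy as np

def brief_gist(image, n_bits=256, patch=60, seed=42):
    """BRIEF-Gist-style descriptor sketch: downsample the whole image, then
    take pairwise intensity comparisons as a binary signature."""
    img = np.asarray(image, dtype=float)
    # Crude block-average downsampling to patch x patch
    ys = np.linspace(0, img.shape[0], patch + 1).astype(int)
    xs = np.linspace(0, img.shape[1], patch + 1).astype(int)
    small = np.array([[img[ys[i]:ys[i+1], xs[j]:xs[j+1]].mean()
                       for j in range(patch)] for i in range(patch)])
    rng = np.random.default_rng(seed)   # fixed comparison pattern
    p = rng.integers(0, patch, (n_bits, 2))
    q = rng.integers(0, patch, (n_bits, 2))
    return small[p[:, 0], p[:, 1]] < small[q[:, 0], q[:, 1]]

def hamming(d1, d2):
    """Place similarity: smaller Hamming distance = more similar scenes."""
    return int(np.count_nonzero(d1 != d2))
```

Matching reduces to Hamming distances between bit vectors, which is why the descriptor is so cheap compared with feature-based alternatives such as FAB-Map.<br />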

10:45–10:50 TuBT2.4<br />

Place Recognition in 3D Scans Using a Combination of Bag<br />

of Words and Point Feature based Relative Pose Estimation<br />

Bastian Steder, Michael Ruhnke,<br />

Slawomir Grzonka, and Wolfram Burgard<br />

Dept. of Comp. Science, University of Freiburg, Germany<br />

• Place recognition based on 3D range scans.<br />

• A bag-of-words approach is used for a fast<br />

similarity pre-ordering of the scan pairs.<br />

• Uses NARFs (Normal Aligned<br />

Radial Features) to find relative<br />

transformations.<br />

• Experiments using publicly<br />

available datasets and<br />

comparison to other<br />

approaches.<br />


Detected loop closures for the Hanover2 dataset<br />

(Dataset courtesy of Oliver Wulf)


Session TuBT2 Continental Parlor 2 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Semantic SLAM & Loop Closure<br />

Chair Wolfram Burgard, Univ. of Freiburg<br />

Co-Chair Javier Civera, Univ. de Zaragoza<br />

10:50–10:55 TuBT2.5<br />

Adaptive Appearance Based Loop-Closing in<br />

Heterogeneous Environments<br />

András Majdik and Gheorghe Lazea<br />

Robotics Research Group, Technical University of Cluj-Napoca, Romania<br />

Dorian Gálvez-López and José A. Castellanos<br />

Instituto de Ingeniería de Aragón, University of Zaragoza, Spain<br />

• The paper concerns the problem of<br />

detecting loop-closure situations from a<br />

visual appearance-based perspective<br />

• A novel probabilistic on-line weight<br />

updating algorithm is proposed for the<br />

bag-of-words description of the gathered<br />

images<br />

• An intuitive measure of the ability of a<br />

certain word to contribute to the detection<br />

of a correct loop-closure is presented<br />

• The proposed strategy is extensively<br />

tested on challenging large-scale urban<br />

environments<br />

11:00–11:15 TuBT2.7<br />

Memory Management for Real-Time<br />

Appearance-Based Loop Closure Detection<br />

Mathieu Labbé and François Michaud<br />

Department of Electrical and Computer Engineering,<br />

Université de Sherbrooke, QC, Canada<br />

• Appearance-based loop closure detection<br />

approach that respects real-time<br />

constraints for large-scale SLAM<br />

• Memory management (Working Memory,<br />

Long-Term Memory) keeps computation<br />

time for each image acquired under a<br />

fixed time limit<br />

• Incremental approach: no prior knowledge<br />

of the environment is required<br />

• Results demonstrate the approach’s<br />

adaptability and scalability using four<br />

standard data sets<br />
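The working-memory / long-term-memory idea can be sketched in a toy form. The class and its API below are hypothetical illustrations of the mechanism, not the authors' implementation:<br />

```python
from collections import deque

class MemoryManagedDetector:
    """Toy sketch: keep per-image processing bounded by comparing only
    against a fixed-size working memory, moving the oldest locations
    to long-term memory."""
    def __init__(self, wm_capacity=100):
        self.wm = deque()       # working memory: locations compared each frame
        self.ltm = {}           # long-term memory: everything else
        self.wm_capacity = wm_capacity

    def process(self, image_id, signature, similar):
        # Compare the new image against working memory only -> bounded time
        matches = [loc for loc, sig in self.wm if similar(signature, sig)]
        # (Retrieval LTM -> WM of matched neighbourhoods is omitted here)
        self.wm.append((image_id, signature))
        # Transfer WM -> LTM once the working set exceeds its budget
        while len(self.wm) > self.wm_capacity:
            loc, sig = self.wm.popleft()
            self.ltm[loc] = sig
        return matches
```

The real system also retrieves LTM locations back into WM when their neighbours match, so long-ago loop closures remain detectable despite the bounded working set.<br />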

Flow chart of the appearance-based loop closure detection approach: rehearsal (STM), Bayesian filter update (WM), retrieval (LTM→WM), and transfer (WM→LTM); a loop closure is accepted or rejected depending on whether any hypothesis exceeds Tloop<br />

10:55–11:00 TuBT2.6<br />

SLAM with Learned Object Recognition and<br />

Semantic Data Association<br />

John G Rogers III, Alexander J.B. Trevor, Carlos Nieto-Granda,<br />

Henrik I. Christensen<br />

Robotics and Intelligent Machines, Georgia Institute of Technology, USA<br />

• Complex and structured landmarks like<br />

objects have many advantages over low-level<br />

image features for semantic mapping<br />

• Human environments contain many objects<br />

which can serve as suitable landmarks for<br />

robot navigation<br />

• Maps based on high level features which<br />

are identified by a learned classifier could<br />

better inform tasks such as semantic<br />

mapping and mobile manipulation<br />

• This paper presents a technique for<br />

recognizing door signs using a learned<br />

classifier and performing semantic reasoning<br />

for data association<br />


Large loop closed by recognizing<br />

and understanding semantic<br />

meaning of office door sign<br />

11:15–11:30 TuBT2.8<br />

Towards Semantic SLAM using a Monocular Camera<br />

Javier Civera, Dorian Gálvez-López, L. Riazuelo, Juan D. Tardós<br />

and J. M. M. Montiel<br />

University of Zaragoza, Spain<br />

• Monocular SLAM maps are usually composed of<br />

‘meaningless’ geometric entities (points or lines).<br />

• Our aim is to augment the high level information<br />

(semantics) in a monocular SLAM map.<br />

• We combine a geometric monocular SLAM<br />

algorithm, visual object recognition and<br />

registration to populate a meaningless point-based<br />

map with objects.<br />

• Our approach runs in real-time at 7 Hz for local<br />

maps and a small number of objects (6 in our<br />

experiments).<br />

Image sequence and<br />

tracked points and objects<br />

Estimated camera trajectory, point<br />

map and inserted object<br />

Two objects inserted at a later step<br />

in the sequence


Session TuBT3 Continental Parlor 3 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Microrobotics II<br />

Chair David Cappelleri, Stevens Inst. of Tech.<br />

Co-Chair Sylvain Martel, Ec. Pol. de Montreal (EPM)<br />

10:00–10:15 TuBT3.1*<br />

Semi-Plenary Invited Talk: Microrobotics for<br />

Biomedical Applications<br />

Paolo Dario, Scuola Superiore Sant'Anna<br />

10:30–10:45 TuBT3.3<br />

A MRI-based Integrated Platform for the<br />

Navigation of Microdevices and Microrobots<br />

Manuel Vonthron, Viviane Lalande, Gaël Bringout,<br />

Charles Tremblay and Sylvain Martel<br />

Nanorobotics Laboratory, École Polytechnique de Montréal, Canada<br />

• Magnetic Resonance Navigation (MRN)<br />

platform architecture with 3D Tracking<br />

and magnetic gradient-based navigation<br />

• 3D wireless navigation with a custom<br />

magnetic gradient coils set providing up<br />

to 460 mT/m<br />

• Experimental validation on a catheter<br />

with custom ferro-magnetic tip<br />

• Integrated platform enabling manual<br />

control or closed-loop control<br />

Superimposed deflections on a<br />

catheter obtained by varying a<br />

magnetic gradient inside a clinical MRI<br />

10:15–10:30 TuBT3.2<br />

Semi-Plenary Invited Talk: Microrobotics for<br />

Biomedical Applications<br />

Paolo Dario, Scuola Superiore Sant'Anna<br />

10:45–10:50 TuBT3.4<br />

Rotating Magnetic Micro-Robots for Versatile<br />

Non-Contact Fluidic Manipulation of Micro-Objects<br />

Eric Diller, Zhou Ye, and Metin Sitti<br />

Mechanical Engineering, Carnegie Mellon University, USA<br />

• Low Reynolds number micro-object<br />

manipulation by magnetic micro-robots<br />

• Rotational fluid flow induced to translate<br />

micro-objects<br />

• Position of microrobots is controlled by<br />

magnetic docks embedded in the<br />

substrate or by feedback control<br />

• High speed and precision manipulation<br />

is accomplished by regulating the<br />

spinning speed and separation distance<br />


Magnetic dock concept to create<br />

reconfigurable virtual flow channels. The<br />

array of docks is embedded in the surface<br />

and acts to trap spinning micro-robots at<br />

prescribed locations to achieve controlled<br />

parallel fluid manipulation of micro-objects<br />

along complex paths


Session TuBT3 Continental Parlor 3 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Microrobotics II<br />

Chair David Cappelleri, Stevens Inst. of Tech.<br />

Co-Chair Sylvain Martel, Ec. Pol. de Montreal (EPM)<br />

10:50–10:55 TuBT3.5<br />

MRI Magnetic Signature Imaging, Tracking and<br />

Navigation for Targeted Micro/Nano-capsule<br />

Therapeutics<br />

David Folio and Antoine Ferreira<br />

PRISME, ENSI Bourges, France<br />

Christian Dahmen, Tim Wortmann and Sergej Fatikow<br />

AMiR, University of Oldenburg, Germany<br />

M. Arif Zeeshan, Kaiyu Shou, Salvador Pané, Bradley J. Nelson<br />

ASRL, IRIS, ETH Zurich, Switzerland<br />

• MRI as a medical imaging device to be<br />

used for targeted drug delivery<br />

• Tailored nanoparticles are designed as<br />

drug carriers with magnetic moment for<br />

actuation<br />

• MRI gradient coils propel magnetic<br />

objects, position feedback by artifact<br />

imaging and tracking of artifacts<br />

• Path extraction using Frangi filter and<br />

FMM, Navigation planning with modeled<br />

uncertainty<br />

11:00–11:15 TuBT3.7<br />

Design and Fabrication of<br />

Air-Flow based Single Particle Dispensing System<br />

Tomohiro Kawahara 1, Shigeo Ohashi 1, Masaya Hagiwara 1,<br />

Yoko Yamanishi 1,2, and Fumihito Arai 1,3<br />

1 Graduate School of Engineering, Nagoya University, Japan<br />

2 PRESTO, JST, Japan<br />

3 Seoul National University, Korea<br />

• Design and fabrication to increase<br />

the success rate of single particle<br />

dispensing.<br />

• Disposable microchip which is<br />

composed of two capacitance<br />

sensors and air-flow based inkjet<br />

mechanism.<br />

• The developed system can eject 3<br />

particles/s, with a maximum flow<br />

velocity of 10 mm/s.<br />

• We succeeded in automatically dispensing<br />

a single particle (100 µm) from a<br />

biochip to a culture well in atmosphere,<br />

with a success rate of 50%.<br />

• Application to single swine oocyte<br />

dispensing.<br />

Concept<br />

Results<br />

10:55–11:00 TuBT3.6<br />

Tumor Targeting by Computer Controlled<br />

Guidance of Magnetotactic Bacteria Acting Like<br />

Autonomous Microrobots<br />

Ouajdi Felfoul, Mahmood Mohammadi, Louis Gaboury, and Sylvain Martel<br />

NanoRobotics Laboratory, Ecole Polytechnique Montreal, Canada<br />

• Magnetotactic Bacteria are controlled by<br />

computer as microrobots<br />

• A directional magnetic field is used to<br />

indicate the target to be reached by the<br />

bacteria<br />

• These bacteria have the capability to<br />

travel in the human microvasculature<br />

• As such, they could potentially deliver<br />

therapeutics inside tumors<br />

• Here we show that they can indeed<br />

penetrate solid tumors<br />


MC-1 bacterium, flagella, and directional system<br />

11:15–11:30 TuBT3.8<br />

Microrobotic Simulator for Assisted Biological<br />

Cell Injection<br />

Hamid Ladjal 1,2, Jean-Luc Hanus 2, Antoine Ferreira 2<br />

1 LIRIS CNRS UMR 5205, Université Claude Bernard Lyon 1, France<br />

2 Laboratoire PRISME, ENSI Bourges, France<br />

• ICSI simulator<br />

• Biomechanical models for cell<br />

injection:<br />

- Linear elastic FEM<br />

- Hyperelastic FEM<br />

• 3D real-time virtual reality based<br />

ICSI simulator:<br />

- Off-line computation<br />

- Real-time haptics-enable simulator<br />

-Virtual coupling for stability of the<br />

haptic rendering<br />

• Experiments and validation<br />

Visual comparison between<br />

FEM simulations and<br />

experimental data.


Session TuBT4 Continental Ballroom 4 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Industrial Forum: Robots: The Next Generation<br />

Chair Steve Cousins, Willow Garage, Inc<br />

Co-Chair<br />



Session TuBT5 Continental Ballroom 5 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Medical Robotics II<br />

Chair Paolo Fiorini, Univ. of Verona<br />

Co-Chair Jaydev P. Desai, Univ. of Maryland<br />

10:00–10:15 TuBT5.1*<br />

Semi-Plenary Invited Talk: Challenges in MRI-guided<br />

Interventions<br />

Jaydev P. Desai, University of Maryland<br />

10:30–10:45 TuBT5.3<br />

On the Design of an Interactive, Patient-Specific<br />

Surgical Simulator for Mitral Valve Repair<br />

Neil A. Tenenholtz, Robert J. Schneider, and Robert D. Howe<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

Peter E. Hammer and Nikolay V. Vasilyev<br />

Department of Cardiac Surgery, Children’s Hospital Boston, USA<br />

• A patient-specific surgical simulator<br />

was developed to assist in planning<br />

mitral valve repair.<br />

• Simulation occurred at 1 kHz, allowing for<br />

real-time haptic interaction with the<br />

virtual model.<br />

• The valve was simulated to closure in<br />

less than 1 s with sub-millimeter<br />

accuracy.<br />

• A cardiac surgeon used the system to<br />

repair 3 pathological valves with minimal<br />

training.<br />

10:15–10:30 TuBT5.2<br />

Investigation of Magnetic Guidance of Cochlear<br />

Implants<br />

James R. Clark, Lisandro Leon, and Jake J. Abbott<br />

Department of Mechanical Engineering, University of Utah, USA<br />

Frank M. Warren<br />

Department of Otolaryngology, Oregon Health & Science University, USA<br />

• Cochlear implant insertions are widely<br />

known to produce trauma, often leading to<br />

loss of residual hearing.<br />

• One of the goals for development of new<br />

electrode arrays is consistent atraumatic<br />

insertions.<br />

• We propose a concept in which a<br />

magnetically tipped cochlear-implant<br />

electrode array is guided during insertions.<br />

• In scaled in vitro studies, insertion forces<br />

were reduced by approximately 50% using<br />

magnetic guidance.<br />


Concept for magnetically guided<br />

cochlear implant surgery. Red<br />

wide arrows indicate the three<br />

controlled degrees of freedom.<br />

10:45–10:50 TuBT5.4<br />

Ergonomic and Gesture Performance of<br />

Robotized Instruments for Laparoscopic Surgery<br />

Benoît Herman 1,2, Ali Hassan Zahraee 1, Jérôme Szewczyk 1,<br />

Guillaume Morel 1, Christophe Bourdin 3, Jean-Louis Vercher 3<br />

and Brice Gayet 4<br />

1 ISIR, UPMC Univ Paris 06 - CNRS, France<br />

2 CEREM, Université catholique de Louvain - FNRS, Belgium<br />

3 ISM, Université de la Méditerranée - CNRS, France<br />

4 Department of Digestive Diseases, Montsouris Institute, France<br />

• New laparoscopic instrument is designed<br />

with robotized intra-abdominal DOFs<br />

and articulated handle<br />

• Two surgical tasks are implemented in a<br />

virtual reality simulator<br />

• A new RULA-based real-time ergonomic<br />

score is proposed<br />

• Surgeons and PhD students performed an<br />

experimental comparison of several<br />

instruments<br />

• The new instrument was more ergonomic<br />

• Design should be improved to increase<br />

precision


Session TuBT5 Continental Ballroom 5 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Medical Robotics II<br />

Chair Paolo Fiorini, Univ. of Verona<br />

Co-Chair Jaydev P. Desai, Univ. of Maryland<br />

10:50–10:55 TuBT5.5<br />

Laparoscopic Optical Biopsies: In Vivo<br />

Robotized Mosaicing with Probe-based Confocal<br />

Endomicroscopy<br />

Benoît Rosa 1, Benoît Herman 1,3, Jérôme Szewczyk 1,<br />

Brice Gayet 1,2, and Guillaume Morel 1<br />

1 ISIR, UPMC Univ Paris 06 - CNRS, France<br />

2 Department of Digestive Diseases, Montsouris Institute, France<br />

3 CEREM, Université Catholique de Louvain - FNRS, Belgium<br />

a – in vivo validation of the prototype<br />

b – computed mosaic from the acquired images<br />

• A minimally invasive device for performing optical biopsies<br />

• Microactuation using hydraulic microballoons<br />

• Mechanical passive physiological motion compensation<br />

• Preliminary in vivo validation on porcine model<br />

11:00–11:15 TuBT5.7<br />

Shape Estimation for Image-Guided Surgery with<br />

a Highly Articulated Snake Robot<br />

Stephen Tully 1 , George Kantor 2 , Marco Zenati 3 , Howie Choset 2<br />

1 Electrical and Computer Engineering, Carnegie Mellon University, USA<br />

2 Robotics Institute, Carnegie Mellon University, USA<br />

3 Harvard Medical School, Harvard University, USA<br />

• Using an EKF, we estimate the full shape<br />

of a surgical robot using a 5-DOF<br />

magnetic tracker measurement at the<br />

distal tip<br />

• For the prediction step of the filter, we<br />

introduce motion models for the HARP<br />

surgical snake robot<br />

• We analyze the observability of shape<br />

estimation for this system and show that<br />

given sufficient motion, the state is fully<br />

observable<br />

• To demonstrate our methods, we show<br />

results from an image-guidance<br />

experiment on a porcine subject<br />

Shape estimation for the HARP<br />

snake robot during an image-guidance<br />

experiment<br />
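The filtering machinery named in the abstract is the standard EKF predict/update cycle; the sketch below shows that generic step only, not the HARP-specific motion and measurement models:<br />

```python
import numpy as np

def ekf_step(x, P, f, F, h, H, Q, R, z):
    """One generic EKF step.
    f, h: nonlinear motion/measurement functions; F, H: their Jacobians
    evaluated at the current estimate; Q, R: noise covariances."""
    # Predict with the motion model
    x_pred = f(x)
    P_pred = F @ P @ F.T + Q
    # Update with the measurement (e.g. a 5-DOF magnetic tracker reading)
    y = z - h(x_pred)                       # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```

In the paper's setting the state encodes the full snake shape, and the observability analysis asks when tip-only measurements plus sufficient motion make that state recoverable.<br />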

10:55–11:00 TuBT5.6<br />

Sensor and Sampling-based Motion Planning<br />

for Minimally Invasive Robotic Exploration<br />

of Osteolytic Lesions<br />

Wen P. Liu, Blake C. Lucas, Kelleher Guerin<br />

Department of Computer Science, Johns Hopkins University, USA<br />

Erion Plaku<br />

Department of Electrical Engineering and Computer Science,<br />

Catholic University of America, USA<br />

• Paper develops planning framework to<br />

control a flexible cannula robot to<br />

explore osteolytic lesions<br />

• Framework effectively combines global<br />

and local planning with information<br />

gain<br />

• Overall goal is to assist orthopaedic<br />

surgeons in minimally invasive<br />

treatment of osteolysis<br />

• Osteolysis is a result of bearing<br />

material wear in total hip replacement<br />


Planner controls the snake-like<br />

cannula robot in minimally invasive<br />

treatment of osteolysis in order to<br />

effectively explore with its tip the<br />

osteolytic cavity<br />

11:15–11:30 TuBT5.8<br />

A Virtual Scalpel System for Computer-Assisted<br />

Laser Microsurgery<br />

Leonardo S. Mattos, Giulio Dagnino, Gabriele Becattini,<br />

Darwin G. Caldwell<br />

Advanced Robotics Dept., Italian Institute of Technology, Italy<br />

Massimo Dellepiane<br />

ENT Dept., UNIGE, Italy<br />

• Allows precise and safe laser<br />

control directly for live video of<br />

the surgical site<br />

• New motorized laser<br />

micromanipulator with 1μm<br />

accuracy<br />

• Intuitive intraoperative planning<br />

and automatic plan execution<br />

• Virtual features for increased<br />

safety during teleoperation<br />

• No operation training required<br />

• 50% reduction in laser aiming<br />

errors<br />

[Patient drawing from Hochman et al.]


Session TuBT6 Continental Ballroom 6 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Grasping and Manipulation: Mechanics and Design<br />

Chair Aaron Dollar, Yale Univ.<br />

Co-Chair Aaron Edsinger, Meka Robotics<br />

10:00–10:15 TuBT6.1*<br />

Semi-Plenary Invited Talk: Preliminary Results with<br />

NASA's Robonaut 2 Hand<br />

Robert Ambrose, NASA Johnson Space Center<br />

10:30–10:45 TuBT6.3<br />

FAS: A Flexible Antagonistic Spring element for a<br />

high-performance over-actuated hand<br />

Werner Friedl, Maxime Chalon, Jens Reinecke and Markus<br />

Grebenstein<br />

Institute of Robotics and Mechatronics, German Aerospace Center<br />

DLR Hand-Arm-System with<br />

opened forearm with 19 FAS<br />

• The FAS is an antagonistic spring element<br />

designed for the over-actuated hand of<br />

the DLR hand-arm system<br />

• Its design flexibility is demonstrated by<br />

adaptation to 38 different stiffness<br />

characteristics and tendon lengths<br />

• The developed magnetic sensor fulfils the<br />

design goals of compactness and<br />

precision.<br />

• Tests show a twofold performance<br />

increase over the previous testbed<br />

version.<br />

10:15–10:30 TuBT6.2<br />

Semi-Plenary Invited Talk: Preliminary Results with<br />

NASA's Robonaut 2 Hand<br />

Robert Ambrose, NASA Johnson Space Center<br />

10:45–10:50 TuBT6.4<br />

Varying spring preloads<br />

to select grasp strategies in an adaptive hand<br />

Daniel Aukes, Barrett Heyneman, and Mark Cutkosky<br />

Mechanical Engineering, Stanford University, USA<br />

Vincent Duchaine<br />

Mechanical Engineering, École de Technologie Supérieure, Canada<br />

• Variable spring preloads allow an<br />

underactuated robotic hand to<br />

change grasping poses,<br />

accommodating large and small<br />

objects.<br />

• Grasp stability can be ensured for<br />

a larger range of object sizes<br />

than with a highly underactuated<br />

hand.<br />

• Grasps can be adjusted for<br />

situations that require higher<br />

precision or more safety.<br />


The Seabed Rig Hand in a<br />

“power-pinch” grasp


Session TuBT6 Continental Ballroom 6 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Grasping and Manipulation: Mechanics and Design<br />

Chair Aaron Dollar, Yale Univ.<br />

Co-Chair Aaron Edsinger, Meka Robotics<br />

10:50–10:55 TuBT6.5<br />

A highly-underactuated robotic hand<br />

with force and joint angle sensors<br />

Long Wang*, Joseph DelPreto**,<br />

Sam Bhattacharyya*, Jonathan Weisz***,<br />

and Peter K. Allen***<br />

*Mechanical Engineering, **Electrical Engineering, ***Computer Science,<br />

Columbia University, USA<br />

• Columbia Hand – a three-fingered<br />

underactuated robotic hand is<br />

designed, prototyped, and tested<br />

• Highly-Underactuated – one<br />

motor actuates 9 DOF, while<br />

another actuates “Thumb” rotation<br />

• Position and Tactile Sensors –<br />

precise angle feedback and binary<br />

force feedback<br />

• Experiments – natural pre-shape<br />

stage; stable self-adaptive grasp<br />

for different shapes<br />

Columbia Hand<br />

with Position and Force sensors<br />

11:00–11:15 TuBT6.7<br />

Dynamic Nonprehensile Shaping<br />

of a Thin Rheological Object<br />

Tomoyuki Inahara, Mitsuru Higashimori,<br />

Kenjiro Tadakuma and Makoto Kaneko<br />

Department of Mechanical Engineering, Osaka University, Japan<br />

• A rheological object is remotely shaped on<br />

a plate attached at the tip of a bar.<br />

• A one-dimensional viscous model is<br />

introduced by focusing on the plastic<br />

deformation of the object.<br />

• Sufficient conditions for the plate<br />

accelerations to enlarge and to contract<br />

the object are derived.<br />

• Experimental results are shown,<br />

confirming the validity of the proposed<br />

shaping method.<br />

Nonprehensile shaping of a<br />

rheological object<br />

10:55–11:00 TuBT6.6<br />

Active Outline Shaping of a Rheological Object<br />

Based on Plastic Deformation Distribution<br />

Kayo Yoshimoto<br />

Department of Health Science, Osaka University, Japan<br />

Mitsuru Higashimori, Kenjiro Tadakuma, and Makoto Kaneko<br />

Department of Mechanical Engineering, Osaka University, Japan<br />

• A handling issue for a rheological object is<br />

discussed.<br />

• Based on plastic deformation distribution,<br />

the shaping method of the object’s outline<br />

by using a uniaxial gripper is proposed.<br />

• The desired shape outline is controlled by<br />

actively managing the integrated stress.<br />

• Experimental results using a real<br />

rheological object are shown, confirming<br />

the validity of the proposed method.<br />


Active shaping of a rheological<br />

object<br />

11:15–11:30 TuBT6.8<br />

Softness Effects on Manipulability and Grasp<br />

Stability<br />

Tetsuyou Watanabe<br />

School of Mechanical Engineering, Kanazawa University, Japan<br />

• Derivation of criterion index for<br />

manipulability including softness effect<br />

• Derivation of criterion index for grasp<br />

stability including softness effect<br />

• Softness effect on manipulability and<br />

grasp stability: the increase of the<br />

softness (decrease of stiffness)<br />

decreases the manipulability while it<br />

increases generable object wrench<br />

(grasp stability)<br />

Plot: manipulability index Im (m/s) and generable wrench index If (N) versus stiffness parameter ki (×10000)<br />


Session TuBT7 Continental Parlor 7 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Contact and Deformation<br />

Chair Marsette Vona, Northeastern Univ.<br />

Co-Chair Jeff Trinkle, Rensselaer Pol. Inst.<br />

10:00–10:15 TuBT7.1<br />

Combining Imitation and Reinforcement<br />

Learning to Fold Deformable Planar Objects<br />

Benjamin Balaguer and Stefano Carpin<br />

School of Engineering, University of California, Merced, USA<br />

• Learning algorithm allowing<br />

cooperative manipulators to perform<br />

towel folding tasks;<br />

• Human demonstrations are<br />

exploited in an imitation learning<br />

framework to find a good starting<br />

seed for reinforcement learning;<br />

• PoWER’s reinforcement learning<br />

algorithm is applied to refine the<br />

imitation learning process and<br />

converge to a successful action;<br />

• Strengths of the algorithm are its<br />

efficient processing, fast learning<br />

capabilities, absence of a<br />

deformable object model, and<br />

applicability to other problems<br />

exhibiting temporally incoherent<br />

parameter spaces.<br />


10:30–10:45 TuBT7.3<br />

Toward Simpler Models of Bending Sheet Joints<br />

Lael U Odhner and Aaron M Dollar<br />

Department of Mechanical Engineering, Yale University, USA<br />

• In this paper, we introduce a<br />

variational model that uses only five<br />

parameters to describe the three-dimensional<br />

shape of a sheet<br />

flexure joint<br />

• This model can be used to capture<br />

the parasitic twisting motions of the<br />

sheet.<br />

• We parameterize the twists on the<br />

backbone curve running along the<br />

length of the sheet<br />

• The Jacobians and Hessians of the<br />

joint kinematics can be computed in<br />

a straightforward manner, enabling<br />

classical robot modeling and control<br />

of these continuum structures<br />

10:15–10:30 TuBT7.2<br />

Bimanual Robotic Cloth Manipulation for<br />

Laundry Folding<br />

Christian Bersch, Benjamin Pitzer, Sören Kammel<br />

Robert Bosch Research and Technology Center North America, USA<br />

• Autonomous folding of garments from a<br />

random initial cloth configuration with a<br />

PR2 robot<br />

• Novel method for computing grasp poses<br />

accounting for cloth deformability<br />

• Automated learning of an evaluation<br />

function for assessment of grasp pose<br />

quality<br />


PR2 robot manipulates T-shirt<br />

10:45–10:50 TuBT7.4<br />

Bi-Manual Robotic Paper Manipulation<br />

Based on Real-Time Marker Tracking and<br />

Physical Modelling<br />

Christof Elbrechter, Robert Haschke and Helge Ritter<br />

Neuroinformatics Group, Bielefeld University, Germany<br />

• Bi-manual manipulation of paper<br />

using two Shadow dexterous<br />

robot hands (20 DOF)<br />

• Comparison between a purely<br />

mathematical approach and a<br />

physical modeling approach<br />

• Multi-camera real-time tracking in<br />

presence of occlusions using a<br />

new fiducial marker design


Session TuBT7 Continental Parlor 7 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Contact and Deformation<br />

Chair Marsette Vona, Northeastern Univ.<br />

Co-Chair Jeff Trinkle, Rensselaer Pol. Inst.<br />

10:50–10:55 TuBT7.5<br />

Understanding the difference between prox and<br />

complementarity formulations for simulation of<br />

systems with contact<br />

Thorsten Schindler<br />

INRIA Grenoble – Rhône-Alpes, France<br />

Binh Nguyen, Jeff Trinkle<br />

Rensselaer Polytechnic Institute, USA<br />

• Complementarity formulations are well<br />

known to represent constraints in robotics<br />

• We present the prox formulation as an<br />

alternative and mathematically equivalent<br />

representation offering new numerical<br />

solution strategies<br />

• A study of fixed-point solution schemes<br />

for prox function systems shows that<br />

convergence depends on an additional<br />

parameter<br />

• The paradox of Painlevé illustrates that<br />

fixed-point schemes can fail while pivoting<br />

methods for complementarity formulations<br />

succeed<br />

Two solutions in the paradox of<br />

Painlevé illustrate the fixed-point<br />

behavior of prox formulations<br />
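The prox reformulation can be shown on a scalar contact problem: the complementarity condition 0 ≤ λ ⊥ Aλ + b ≥ 0 becomes the fixed-point equation λ = max(0, λ − r(Aλ + b)). This toy example (my illustration, not the paper's benchmark) also shows the dependence on the auxiliary parameter r:<br />

```python
def prox_fixed_point(A, b, r=0.5, iters=200):
    """1D prox fixed-point iteration for 0 <= lam ⊥ A*lam + b >= 0.
    prox onto the nonnegative half-line is simply max(0, .).
    Whether the iteration converges depends on the relaxation parameter r."""
    lam = 0.0
    for _ in range(iters):
        lam = max(0.0, lam - r * (A * lam + b))
    return lam
```

With A = 2, b = −4 the complementarity solution is λ = 2: a moderate r converges to it, while an overly large r makes the iterates oscillate and never settle, which is the convergence caveat the abstract points to.<br />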

11:00–11:15 TuBT7.7<br />

Estimation of Unknown Curvature using a Coarse-<br />

Resolution Sensor and Contact Kinematics<br />

Tri Cong Phung, Hansang Chae, Min Jeong Kim, Dongmin Choi,<br />

Seung Hoon Shin, Hyungpil Moon, Ja Choon Koo<br />

and Hyouk Ryeol Choi<br />

School of Mechanical Engineering, Sungkyunkwan University, Korea<br />

• Active sensing of an unknown object using<br />

rolling and sliding motions of the fingertip<br />

• Applicable to an off-the-shelf sensor<br />

without very high resolution<br />

• Method for tracking unknown surface and<br />

estimating sliding motion is proposed<br />

• Simulations and experiments have been<br />

done to verify the proposed algorithm.<br />

Experimental setup<br />

10:55–11:00 TuBT7.6<br />

Curved Surface Contact Patches<br />

with Quantified Uncertainty<br />

Marsette Vona and Dimitrios Kanoulas<br />

College of Computer and Information Science, Northeastern University, USA<br />

• detailed models for 10 bounded curved-surface<br />

patch types for contact regions<br />

both in the environment and on a robot<br />

• minimal geometric parameterizations<br />

using the exponential map for spatial pose<br />

• fast nonlinear fitting algorithm including<br />

quantified uncertainty both in the input<br />

points and the output patch<br />

• experimental results using Kinect<br />

• all source code provided as the open-source<br />

surface patch library (SPL)<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–70–<br />

rock sampled with Kinect<br />

and 21 fitted surface patches<br />

11:15–11:30 TuBT7.8<br />

Singular surfaces and cusps in symmetric<br />

planar 3-RPR manipulators<br />

Michel Coste<br />

IRMAR, Université de Rennes I, France<br />

Philippe Wenger and Damien Chablat<br />

IRCCyN, CNRS - Ecole Centrale de Nantes, France<br />

• Moving triangle congruent to base<br />

triangle by indirect isometry<br />

• Coordinates reflecting the de-coupling<br />

of the DKP in two successive steps of<br />

degrees 3 and 2 are used<br />

• Precise description of the singularity<br />

surfaces and their cusp edges for all<br />

these manipulators<br />

• This allows sorting of assembly<br />

modes, used for motion planning in the<br />

joint space


Session TuBT8 Continental Parlor 8 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Biomimetic Limbed Robots<br />

Chair Elena Garcia, Centre for Automation and Robotics - CSIC-UPM<br />

Co-Chair Olivier Ly, INRIA / Lab. - Bordeaux Univ.<br />

10:00–10:15 TuBT8.1<br />

Arm-Hand Movement: Imitation of Human<br />

Natural Gestures with Tenodesis Effect<br />

Kien-Cuong Nguyen and Véronique Perdereau<br />

UPMC Univ Paris 06, UMR 7222, ISIR,<br />

F-75005, FRANCE<br />

• Decipher natural movements of the human arm-hand system based on mechanical constraints<br />

• Elaborate the notion of muscle comfort, which helps find natural human postures under redundancy<br />

Optimal choice of palm position in cylindrical grasp<br />

10:30–10:45 TuBT8.3<br />

Climbot: A Modular Bio-inspired Biped<br />

Climbing Robot<br />

Yisheng Guan, Li Jiang, Haifei Zhu, Xuefeng Zhou, Chuanwu Cai,<br />

Wenqiang Wu, Zhanchu Li, Hong Zhang and Xianmin Zhang<br />

Biomimetic and Intelligent Robotics Lab (BIRL)<br />

School of Mechanical and Automotive Engineering<br />

South China University of Technology, Guangzhou, China<br />

• A modular bio-inspired biped climbing robot, Climbot, is developed for high-rise work in agriculture, forestry and construction<br />

• The robot has both climbing and manipulating functions<br />

• The mechanical and control systems of the robot are introduced<br />

• Three climbing gaits – inchworm gait, swinging-around gait and flipping-over gait – are illustrated with experiments<br />

• An application is also demonstrated<br />

Climbot unscrewing a light-bulb<br />

10:15–10:30 TuBT8.2<br />

Bio-inspired vertebral column, compliance and semi-passive<br />

dynamics in a lightweight humanoid robot<br />

Olivier Ly<br />

INRIA / LaBRI – Bordeaux University - France<br />

and Matthieu Lapeyre and Pierre-Yves Oudeyer<br />

Flowers – INRIA Bordeaux - France<br />

• Compliance and semi-passive dynamics for locomotion of humanoid robots.<br />

• Robustness against unknown<br />

external perturbations.<br />

• The advantages of a bio-inspired<br />

multi-articulated vertebral column.<br />


Compliance of the Acroban Robot<br />

10:45–10:50 TuBT8.4<br />

Passive undulatory gaits enhance walking in a<br />

myriapod millirobot<br />

Katie L. Hoffman and Robert J. Wood<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

• Design and modeling of a<br />

centipede millirobot with a<br />

compliant body and passive<br />

intersegmental connections<br />

• Body undulations result from<br />

changing phase of stance between<br />

adjacent segments in simulation<br />

and experiments<br />

• These undulations enhance<br />

locomotion when compared to non-undulatory<br />

gaits<br />

• The model and millirobot can be<br />

used to study aspects of biological<br />

myriapod locomotion<br />

Photo of a 10-segment, 20-leg<br />

centipede millirobot
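The body undulations described above arise from a constant phase lag between the stance cycles of adjacent segments. A minimal sketch of that idea, with illustrative frequency and lag values that are assumptions rather than the paper's parameters:

```python
import math

# Hedged sketch: phase-offset stance cycles for a multi-segment
# millirobot. A fixed phase lag between adjacent segments produces a
# travelling body wave; frequency and lag are illustrative assumptions.

def leg_phases(n_segments, t, freq_hz=5.0, lag=math.pi / 4):
    """Phase of each segment's stance cycle at time t, with a constant
    intersegmental lag that makes the undulation travel down the body."""
    return [(2 * math.pi * freq_hz * t + i * lag) % (2 * math.pi)
            for i in range(n_segments)]

phases = leg_phases(10, t=0.0)  # 10 segments, as in the pictured robot
print(phases[1] - phases[0])    # adjacent segments differ by pi/4
```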



10:50–10:55 TuBT8.5<br />

Mechanical Design of a Tree Gripper for<br />

Miniature Tree-Climbing Robots<br />

Tin Lun Lam and Yangsheng Xu<br />

Department of Mechanical and Automation Engineering, The Chinese<br />

University of Hong Kong, Hong Kong, China<br />

• Development of a claw-based gripper for<br />

miniature tree-climbing robots<br />

• Capable of attaching to a wide variety of<br />

trees with a wide range of gripping<br />

curvature<br />

• Actuated by one actuator<br />

• Lightweight and simple in control<br />

• Allows zero energy consumption in static<br />

gripping<br />

A tree-climbing robot - Treebot<br />

11:00–11:15 TuBT8.7<br />

Neural-Body Coupling for Emergent<br />

Locomotion: a Musculoskeletal Quadruped<br />

Robot with Spinobulbar Model<br />

Yasunori Yamada, Satoshi Nishikawa, Kazuya Shida<br />

and Yasuo Kuniyoshi<br />

Grad. School of Info. Sci. & Tech., the Univ. of Tokyo, Japan<br />

Ryuma Niiyama<br />

Computer Science and Artificial Intelligence Lab, MIT, USA<br />

• A robot with biologically realistic features in its actuators and their configuration, sensors and nervous system<br />

• Quantifying the contribution of the<br />

morphology in structuring sensorimotor<br />

information via embodied interaction<br />

• Various and coordinated locomotion<br />

emerging from dynamic interaction with<br />

no pre-defined coordination circuit<br />

Robot and muscle configuration.<br />

10:55–11:00 TuBT8.6<br />

Bio-inspired Step Crossing Algorithm<br />

for a Hexapod Robot<br />

Ya-Cheng Chou, Wei-Shun Yu, Ke-Jung Huang, and Pei-Chun Lin<br />

Department of Mechanical Engineering, National Taiwan University, TAIWAN<br />

• The algorithm is inspired by the observation that the cockroach changes from the tripod gait to a special gait to cross high steps<br />

• The gait is composed of two stages:<br />

• “Rearing stage” to lift the front side of<br />

the body<br />

• “Lifting stage” to maneuver the body<br />

COM to pass the edge of the step<br />

• With the inclinometer feedback, the robot<br />

can automatically adjust its gait for<br />

crossing steps with different heights<br />

• The performance of the algorithm is<br />

experimentally evaluated<br />

11:15–11:30 TuBT8.8<br />

Design and development of a biomimetic leg<br />

using hybrid actuators<br />

Elena Garcia, Juan C. Arevalo, Fernando Sanchez, Javier F.<br />

Sarria and Pablo Gonzalez-de-Santos<br />

Centre for Automation and Robotics,<br />

Spanish National Research Council, Spain<br />

• The design, development and preliminary<br />

tests of a robotic leg for agile locomotion<br />

are described.<br />

• The multidisciplinary performance of the<br />

natural muscle is imitated by a combination<br />

of technologies.<br />

• The actuation system is based on the<br />

hybrid use of series elasticity and<br />

magneto-rheological dampers.<br />

• Experiments with the real leg prototype<br />

show a natural-looking motion following<br />

reference data from CGA.<br />


The HADE leg prototype


Session TuBT9 Continental Parlor 9 Tuesday, September 27, <strong>2011</strong>, 10:00–11:30<br />

Perceptual Learning<br />

Chair Danica Kragic, KTH<br />

Co-Chair Timothy Bretl, Univ. of Illinois at Urbana-Champaign<br />

10:00–10:15 TuBT9.1<br />

Learning spatial relations<br />

from functional simulation<br />

Kristoffer Sjöö and Patric Jensfelt<br />

Centre for Autonomous Systems,<br />

Royal Institute of Technology, Sweden<br />

• Understanding spatial relationships will be<br />

important for autonomous robots in<br />

complex environments<br />

• We submit that functional distinctions are<br />

crucial for conceptualizing spatial relations<br />

• 5 different functional distinctions are<br />

learned from data obtained through<br />

physics simulation<br />

• Resulting models can successfully predict<br />

qualitative action outcomes<br />

10:30–10:45 TuBT9.3<br />

Online Multiple Instance Learning Applied to<br />

Hand Detection in a Humanoid Robot<br />

Carlo Ciliberto 1 , Fabrizio Smeraldi 2<br />

Lorenzo Natale 1 , Giorgio Metta 1<br />

1) RBCS, Istituto Italiano di Tecnologia, Italy<br />

2) EECS, Queen Mary University of London, United Kingdom<br />

• We propose an online algorithm for the<br />

visual detection and localisation of the<br />

hand of the iCub robot that imposes low<br />

requirements on supervision.<br />

• Learning is achieved by online boosting<br />

using ‘Multiple Instance’ weak learners<br />

implemented as balls in feature space.<br />

• The weak learners are wrapped by<br />

‘selectors’ to allow for feature selection.<br />

• Coarse labeling strategies (e.g. motion)<br />

can be used for self-supervised<br />

learning.<br />

• To the best of our knowledge, this is the<br />

first application of online Multiple<br />

Instance Learning in robotics.<br />

Examples of localisation for the<br />

hand of the iCub humanoid robot<br />

10:15–10:30 TuBT9.2<br />

Multimodal Categorization<br />

by Hierarchical Dirichlet Process<br />

Tomoaki Nakamura, Takayuki Nagai<br />

Dept. of Elec. Eng., The Univ. of Electro-Communications, Japan<br />

Naoto Iwahashi<br />

NICT Knowledge Creating Communication Research Center, Japan<br />

• A nonparametric Bayesian framework for<br />

unsupervised object categorization by a<br />

robot<br />

• Multimodal (visual, audio and haptic)<br />

information is utilized for the<br />

categorization<br />

• Hierarchical Dirichlet Process (HDP) is<br />

extended to multimodal HDP<br />

It enables the system to estimate<br />

the number of categories<br />

• The proposed method provides a<br />

probabilistic framework for inferring<br />

object properties from limited<br />

observations<br />


Objects<br />

Multimodal<br />

Information<br />

Category1 Category2 Category3<br />

10:45–10:50 TuBT9.4<br />

Active learning using a Variational Dirichlet<br />

Process model for pre-clustering and<br />

classification of underwater stereo imagery<br />

Ariell Friedman, Daniel Steinberg,<br />

Oscar Pizarro and Stefan B. Williams<br />

Australian Centre for Field Robotics, The University of Sydney, Australia<br />

• Active learning using a Variational<br />

Dirichlet Process (VDP) model:<br />

- Pre-clustering & classification<br />

- Minimises human effort for training &<br />

improves classification accuracy<br />

- Utilises structure in unlabelled data<br />

• Features: 3D morphology combined with<br />

colour and image texture<br />

• Comparison: similar implementations<br />

using Expectation Maximisation (EM)<br />

and Naive Bayes classifier (NB)<br />

• Result: fewer labels are required to<br />

achieve a given level of accuracy<br />

Plot: least-confident sampling with different classifiers – mean accuracy and standard deviation vs. number of labeled instances (log scale) for EM, NB and VDP<br />
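The query strategy compared in the plot above, least-confident sampling, can be sketched directly: given per-sample class posteriors from any of the classifiers (EM, NB or the VDP model), query the labels the classifier is least sure about. The posterior values below are made-up illustrative numbers.

```python
# Hedged sketch: least-confident active-learning sample selection.
# Posteriors would come from the classifier; these are illustrative.

def least_confident(posteriors, n_queries=1):
    """Return indices of the n_queries samples with the smallest
    maximum class posterior, i.e. where the classifier is least sure."""
    ranked = sorted(range(len(posteriors)),
                    key=lambda i: max(posteriors[i]))
    return ranked[:n_queries]

posteriors = [
    [0.9, 0.05, 0.05],   # confident prediction
    [0.4, 0.35, 0.25],   # very uncertain -> queried first
    [0.6, 0.3, 0.1],     # moderately uncertain
]
print(least_confident(posteriors, n_queries=2))  # [1, 2]
```

Fewer labels for a given accuracy, the result the abstract reports, follows from concentrating human effort on exactly these uncertain samples.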



10:50–10:55 TuBT9.5<br />

Autonomous Acquisition of Multimodal Information<br />

for Online Object Concept Formation by Robots<br />

Takaya Araki, Tomoaki Nakamura and Takayuki Nagai<br />

Dept. of Elec. Eng., The Univ. of Electro-Communications, Japan<br />

Kotaro Funakoshi and Mikio Nakano<br />

Honda Research Institute Japan Co., Ltd., Japan<br />

Naoto Iwahashi<br />

NICT Knowledge Creating Communication Research Center, Japan<br />

• A novel framework for acquisition of<br />

multimodal information and concept<br />

formation by robots<br />

• The robot autonomously acquires haptic,<br />

audio and visual information by grasping,<br />

shaking and observing the target object<br />

• The algorithm of multimodal categorization<br />

using Gibbs sampling is extended to an<br />

online version<br />

Online Multimodal LDA<br />

• This framework enables inference, classification, and model updating simultaneously<br />

Figure: autonomous information acquisition and the resulting multimodal object concepts – Category 1 (words: plushie, soft, animal), Category 2 (cup, hard, handle), Category 3 (plastic bottle, hard, liquid)<br />

11:00–11:15 TuBT9.7<br />


10:55–11:00 TuBT9.6<br />

Learning Robot Grasping from 3-D Images with<br />

Markov Random Fields<br />

Abdeslam Boularias, Oliver Kroemer, Jan Peters<br />

Max-Planck Institute for Intelligent Systems, Tuebingen, Germany<br />

• Use a depth camera<br />

(Kinect, SwissRanger, etc.) to create a 3D<br />

image of an object (a point cloud).<br />

• Transform the point cloud to a k-nearest<br />

neighbor graph.<br />

• The graph defines a Markov Random<br />

Field, where the vertices are labeled as<br />

“good” or “bad” grasping points.<br />

• The parameters of the network are<br />

learned by using grasping examples<br />

provided by a human.<br />

• A new object is grasped by classifying<br />

its 3-D points, sampling a good<br />

grasping point, and using a heuristic for<br />

the orientation and the direction of the<br />

hand.<br />
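The first two steps above, turning a point cloud into the graph on which the Markov random field is defined, can be sketched with a brute-force k-nearest-neighbor construction. The tiny cloud and k are illustrative assumptions; a real implementation would use a spatial index.

```python
# Hedged sketch: k-nearest-neighbor graph over a point cloud, the
# structure the MRF is defined on. Brute force O(n^2) for clarity.

def knn_graph(points, k=2):
    """Return undirected edges (i, j) connecting each point to its
    k nearest neighbors by Euclidean distance."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    edges = set()
    for i, p in enumerate(points):
        others = sorted((j for j in range(len(points)) if j != i),
                        key=lambda j: dist2(p, points[j]))
        for j in others[:k]:
            edges.add((min(i, j), max(i, j)))  # store edges undirected
    return sorted(edges)

cloud = [(0, 0, 0), (0, 0, 1), (0, 1, 0), (5, 5, 5)]  # toy 3-D points
print(knn_graph(cloud, k=2))
```

Each vertex of the resulting graph would then carry a "good"/"bad" grasp label inferred from the learned MRF parameters.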

11:15–11:30 TuBT9.8<br />

Inverse Reinforcement Learning<br />

using Path Integrals<br />

Navid Aghasadeghi and Timothy W. Bretl<br />

University of Illinois at Urbana-Champaign, USA<br />

• Problem: Given an optimal trajectory<br />

generated by a continuous-time system,<br />

find the weights of a parameterized cost<br />

function that would explain it<br />

• Use the Path Integral formulation to find<br />

probability over trajectories,<br />

parameterized by θ:<br />

P(τ_m | θ*, λ) = e^(−θ*ᵀΦ(τ_m)/λ) / Σ_{k=1..K} e^(−θ*ᵀΦ(τ_k)/λ)<br />

• Solution involves two parts:<br />

• Maximum likelihood to find the weights of the<br />

parameterized cost function<br />

• Iteratively update the set of sampled<br />

trajectories, by solving a forward optimal<br />

control with respect to the current cost estimate<br />



True cost and optimal trajectory<br />

Estimate cost and trajectory
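The path-integral trajectory probability used above is a softmax over sampled trajectories: lower parameterized cost θᵀΦ(τ) means higher probability. A minimal numerical sketch, where the feature vectors Φ and the weights θ are illustrative assumptions:

```python
import math

# Hedged sketch: path-integral style probability over K sampled
# trajectories, P(tau_m) = exp(-theta^T phi(tau_m)/lambda) /
# sum_k exp(-theta^T phi(tau_k)/lambda). Features/weights are made up.

def trajectory_probs(features, theta, lam=1.0):
    """Softmax over trajectories: low cost -> high probability."""
    costs = [sum(t * f for t, f in zip(theta, phi)) for phi in features]
    weights = [math.exp(-c / lam) for c in costs]
    z = sum(weights)  # normalizing constant over the K samples
    return [w / z for w in weights]

phi = [[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]]  # per-trajectory features
theta = [1.0, 1.0]                           # candidate cost weights
probs = trajectory_probs(phi, theta)
print(probs)  # the two cheap trajectories share most probability mass
```

Maximum-likelihood estimation of θ then amounts to making the demonstrated trajectory as probable as possible under this distribution, alternating with the forward optimal-control step the abstract describes.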


Session TuCT1 Continental Parlor 1 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Object Detection & Collision Avoidance<br />

Chair Paolo Zani, Univ. of Parma, Italy<br />

Co-Chair Jyh-Ming Lien, George Mason Univ.<br />

14:00–14:15 TuCT1.1<br />

Real-time Bézier Trajectory Deformation for<br />

Potential Fields Planning Methods<br />

L. Hilario*, N. Montés*, M. C. Mora‡, A. Falcó*<br />

* Physics, Mathematics and Computation Sciences Dept., Cardenal Herrera<br />

University CEU, Spain<br />

‡ Mechanical Engineering and Construction Dept., Jaume I University, Spain<br />

• A novel technique for obtaining trajectories<br />

based on the deformation of Bézier curves<br />

(BTD) is presented.<br />

• The deformation is computed through vectors obtained with any artificial potential field method.<br />

• In the present paper, the BTD has been<br />

combined with the Potential Field<br />

Projection method (PFP).<br />

• Next image shows a PFP-BTD example in<br />

an environment with 5 mobile obstacles.<br />
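The core idea, deforming a Bézier trajectory by pushing its control points along potential-field vectors, can be sketched as follows. The toy repulsive field and gain are illustrative assumptions, not the PFP method from the paper.

```python
# Hedged sketch: Bezier trajectory deformation. Control points are
# displaced along vectors from an (assumed, toy) potential field and
# the curve is re-evaluated with de Casteljau's algorithm.

def bezier_point(ctrl, t):
    """De Casteljau evaluation of a Bezier curve at parameter t."""
    pts = [list(p) for p in ctrl]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return tuple(pts[0])

def deform(ctrl, field, gain=1.0):
    """Shift every control point by gain * field(point)."""
    return [tuple(c + gain * v for c, v in zip(p, field(p))) for p in ctrl]

ctrl = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]  # straight cubic
push_up = lambda p: (0.0, 0.5)  # toy repulsion away from an obstacle
new_ctrl = deform(ctrl, push_up)
print(bezier_point(new_ctrl, 0.5))  # midpoint lifted away from obstacle
```

Because the deformation acts on control points, the result stays a Bézier curve, which is what makes the real-time update cheap.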

14:30–14:45 TuCT1.3<br />

Positive and Negative Obstacle Detection<br />

using the HLD Classifier<br />

Ryan D. Morton and Edwin Olson<br />

Computer Science & Engineering, University of Michigan, USA<br />

• New Terrain Classifier<br />

• Detects both positive and negative<br />

obstacles<br />

• Handles partial observability<br />

• Gives designers ability to manage<br />

classification risk<br />

• Subsumes and outperforms previous<br />

methods<br />

14:15–14:30 TuCT1.2<br />

Fast and Robust 2D Minkowski Sum<br />

Using Reduced Convolution<br />

Evan Behar and Jyh-Ming Lien<br />

Department of Computer Science, George Mason University, USA<br />

• The M-sum is used for many tasks, such as C-space mapping and motion planning<br />

• The convolution of P and Q is the sum of P's edges with Q's vertices, and vice versa<br />

• The prevailing convolution-based M-sum method computes the arrangement of the full convolution, and then throws most of it away<br />

• Our method uses a reduced convolution and fast filters to avoid this wasted computation<br />

• It is shown experimentally to be faster than the full convolution method used by CGAL<br />


An example reduced convolution<br />

generated from a neuron with<br />

holes and a circle<br />
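For convex input, the Minkowski sum that the convolution method generalizes reduces to the convex hull of pairwise vertex sums. A minimal sketch of that convex base case (the squares are illustrative; the paper's contribution is the non-convex reduced-convolution algorithm):

```python
# Hedged sketch: Minkowski sum of two CONVEX polygons via the convex
# hull of pairwise vertex sums (Andrew's monotone chain hull).

def cross(o, a, b):
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def convex_hull(points):
    """Counter-clockwise hull; collinear points are dropped."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def minkowski_sum_convex(P, Q):
    return convex_hull([(p[0]+q[0], p[1]+q[1]) for p in P for q in Q])

unit = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(minkowski_sum_convex(unit, unit))  # a 2x2 square
```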

14:45–14:50 TuCT1.4<br />

Real-time Swept Volume and Distance<br />

Computation for Self Collision Detection<br />

Holger Täubig<br />

Safe & Secure Cognitive Systems, DFKI, Germany<br />

Berthold Bäuml<br />

DLR Institute of Robotics and Mechatronics, Germany<br />

Udo Frese<br />

Safe & Secure Cognitive Systems, DFKI, Germany<br />

• Continuous collision detection algorithm<br />

for industrial and humanoid robots<br />

• Computes swept volumes of all bodies<br />

and checks them pairwise for collisions –<br />

real-time (0.4ms)<br />

• Input: joint angle intervals – cover whole<br />

movement<br />

• Volume representation: sphere swept<br />

convex hulls – tight, efficient, numerically<br />

stable<br />

• Kinematic tree: operation set models<br />

different joints – allows for a trade-off<br />

between accuracy and computation time



14:50–14:55 TuCT1.5<br />

Visual Navigation With Obstacle Avoidance<br />

Andrea Cherubini and François Chaumette<br />

INRIA Rennes – Bretagne Atlantique<br />

• Visual navigation with<br />

obstacle avoidance is<br />

achieved<br />

• Obstacles are detected by a<br />

range scanner and modeled<br />

with potential fields<br />

• Collision avoidance and<br />

visual navigation are<br />

achieved concurrently by<br />

actuating the camera pan<br />

angle<br />

• The approach is validated in<br />

a series of real outdoor<br />

experiments<br />

15:00–15:15 TuCT1.7<br />

Clustering Obstacle Predictions to Improve<br />

Contingency Planning for Autonomous Road<br />

Vehicles in Congested Environments<br />

Jason Hardy and Mark Campbell<br />

Department of Mechanical and Aerospace Engineering,<br />

Cornell University, United States<br />

• Hierarchical clustering algorithm<br />

for improving the computational<br />

scaling of contingency planning<br />

• Contingency planning correctly<br />

handles mutually exclusive<br />

obstacle trajectory predictions<br />

• Increase the utility of the<br />

available contingency plans by<br />

maximizing dissimilarity between<br />

obstacle prediction clusters<br />

• Goal: capture the most important<br />

mutually exclusive obstacle<br />

decisions and uncertainties with a<br />

limited set of contingency clusters<br />

Application: autonomous road<br />

vehicles in congested environments<br />

14:55–15:00 TuCT1.6<br />

Stereo obstacle detection in challenging<br />

environments: the VIAC experience<br />

Alberto Broggi and Michele Buzzoni<br />

and Mirko Felisa and Paolo Zani<br />

VisLab, University of Parma, Italy<br />

• Stereo-based 3D world mapping using a<br />

parallel SGM approach<br />

• Obstacle points are clustered according<br />

to a spatial compatibility criterion<br />

• Very general approach, working reliably in<br />

a wide variety of scenarios<br />

• Tested on an intercontinental route from<br />

Parma, Italy, to Shanghai, China.<br />


(a)<br />

(b)<br />

(a) The test vehicle, and (b) the<br />

obstacle detection algorithm<br />

output: color encodes disparity<br />

15:15–15:30 TuCT1.8<br />

Time Parametrization of Prioritized Inverse<br />

Kinematics Based on Terminal Attractors<br />

Gerardo Jarquín, Gustavo Arechavaleta and Vicente Parra-Vega<br />

Robotics and Advanced Manufacturing Group, CINVESTAV-IPN, México<br />

• Terminal attractors are incorporated<br />

to the task-priority redundancy<br />

formalism.<br />

• Each prioritized reaching task is<br />

achieved according to a desired time.<br />

• The strategy guarantees smooth<br />

transitions between active and<br />

inactive tasks.


Session TuCT2 Continental Parlor 2 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Motion Estimation, Mapping & SLAM<br />

Chair Ryan Eustice, Univ. of Michigan<br />

Co-Chair Davide Scaramuzza, Univ. of Pennsylvania<br />

14:00–14:15 TuCT2.1<br />

Stereo Depth Map Fusion<br />

for Robot Navigation<br />

Christian Häne 1 , Christopher Zach 1 , Jongwoo Lim 2 ,<br />

Ananth Ranganathan 2 and Marc Pollefeys 1<br />

Department of Computer Science, ETH Zürich, Switzerland 1<br />

Honda Research Institute, Mountain View, CA, USA 2<br />

• Fusion of range data from stereo image<br />

pairs to a representation suitable for robot<br />

navigation<br />

• Reconstruction restricted to a two level<br />

height map representation with a floor and<br />

a ceiling for efficient processing<br />

• Regularization of height levels with a<br />

convex total variation energy functional to<br />

get globally optimal solutions<br />

Top: Left image from one input stereo pair<br />

Middle: Raw depth map from stereo matching<br />

Bottom: Final regularized two level reconstruction<br />

14:30–14:45 TuCT2.3<br />

Building Facade Detection, Segmentation, and<br />

Parameter Estimation for Mobile Robot<br />

Localization and Guidance<br />

Jeffrey A. Delmerico and Jason J. Corso<br />

Dept. of Computer Science and Engineering, SUNY at Buffalo, USA<br />

Philip David<br />

US Army Research Laboratory, USA<br />

• Method for segmenting and modeling building facades in stereo imagery.<br />

• Proposed application in mobile robotics is localization in urban<br />

environments by registration of façade orientations to aerial images.<br />

• 85% accuracy for segmentation into individual facades; 10° accuracy for<br />

modeling planar façade orientations.<br />

• New benchmark dataset – human-annotated stereo images of buildings.<br />

14:15–14:30 TuCT2.2<br />

An Embedded Stereo Vision Module<br />

for 6D Pose Estimation and Mapping<br />

Giacomo Spampinato, Jörgen Lidholm, Carl Ahlberg,<br />

Fredrik Ekstrand, Michael Ekström and Lars Asplund<br />

School of Innovation Design and Engineering,<br />

Mälardalen University, Sweden<br />

• The paper presents an embedded stereo<br />

vision system based on FPGA to perform<br />

6D vSLAM<br />

• The proposed architecture is both energy<br />

and cost efficient with high versatility<br />

• A real-time sparse visual feature detector<br />

is used as the only source of information<br />

• Experiments on small and large scale in<br />

non-flat scenarios are presented<br />


Large scale visual SLAM in<br />

industrial scenario<br />

14:45–14:50 TuCT2.4<br />

A Strategy for Efficient Observation Pruning in<br />

Multi-Objective 3D SLAM<br />

Jaime Valls Miro, Weizhen Zhou and Gamini Dissanayake<br />

Centre of Excellence for Autonomous Systems,<br />

Faculty of Engineering and IT,<br />

University of Technology Sydney (UTS),<br />

Australia<br />

• Efficient solution to feature-based 3D<br />

SLAM when competing objectives<br />

operate simultaneously<br />

• Formulation automatically exploits<br />

observations to best fulfil differing<br />

objectives: localisation, mapping,<br />

exploration, feature distribution, victim<br />

search, etc.<br />

• Tested in Search and Rescue scenario:<br />

21% of observations processed in 1/3 of<br />

time required for full SLAM, with little<br />

degradation in map size and quality<br />

2D projection of spatial<br />

coverage/map expansion (in a<br />

USAR environment), one of the<br />

competing objectives assessed<br />

by the proposed dynamic SLAM<br />

strategy



14:50–14:55 TuCT2.5<br />

Combined Visually and Geometrically<br />

Informative Link Hypothesis for Pose-graph<br />

Visual SLAM using Bag-of-Words<br />

Ayoung Kim* and Ryan Eustice †<br />

*Department of Mechanical Engineering<br />

† Department of Naval Architecture & Marine Engineering<br />

University of Michigan, USA<br />

• Expected information gain combined with<br />

visual saliency score is reported.<br />

• Geometrically and visually informative<br />

loop-closure candidates are selected.<br />

• Two different bag-of-words saliency<br />

metrics are introduced – global saliency<br />

and local saliency.<br />

• Global saliency measures the rarity.<br />

• Local saliency describes texture richness<br />

and usefulness for camera measurement.<br />

• Results with both indoor and underwater<br />

imagery are presented.<br />
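The global saliency above measures rarity of an image's visual words. In the spirit of that idea (though not necessarily the paper's exact scoring, which is an assumption here), a plain inverse-document-frequency average captures it:

```python
import math

# Hedged sketch: rarity-style global saliency for a bag-of-words image
# description. Scoring is a plain mean IDF, an illustrative assumption;
# rare visual words make an image a distinctive loop-closure candidate.

def global_saliency(image_words, corpus):
    """Mean inverse-document-frequency of the image's visual words."""
    n = len(corpus)
    def idf(w):
        df = sum(1 for doc in corpus if w in doc)
        return math.log(n / df) if df else math.log(n)
    words = set(image_words)
    return sum(idf(w) for w in words) / len(words)

# Toy corpus of three images described by visual-word sets
corpus = [{"wall", "pipe"}, {"wall", "sand"}, {"wall", "wreck"}]
print(global_saliency({"wall"}, corpus))       # 0.0 (ubiquitous word)
print(global_saliency({"wreck"}, corpus) > 0)  # True (rare -> salient)
```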

15:00–15:15 TuCT2.7<br />

3D Surveillance Coverage Using Maps Extracted<br />

by a Monocular SLAM Algorithm<br />

Lefteris Doitsidis, Department of Electronics, TEI Crete & CERTH/ITI, Greece<br />

Alessandro Renzaglia, INRIA, France<br />

Stephan Weiss, ETHZ, Switzerland<br />

Elias Kosmatopoulos, Dept. of ECE, DUTH & CERTH/ITI, Greece<br />

Davide Scaramuzza, University of Pennsylvania, USA<br />

Roland Siegwart, ETHZ, Switzerland<br />

• Surveillance Coverage over a terrain of<br />

arbitrary morphology.<br />

• Two-step centralized procedure for<br />

optimal alignment of the robot team.<br />

• A single robot constructs the map using a<br />

novel monocular-vision-based approach.<br />

• The optimal arrangement of the robot<br />

team is produced using a cognitive-based<br />

optimization approach and maximizes the<br />

monitored area.<br />

14:55–15:00 TuCT2.6<br />

RS-SLAM: RANSAC Sampling for Visual SLAM<br />

Gim Hee Lee, Friedrich Fraundorfer, and Marc Pollefeys<br />

Computer Vision and Geometry Laboratory, Department of Computer Science,<br />

ETH Zürich, Switzerland<br />

• RS-SLAM is a monocular FastSLAM<br />

framework that uses hypotheses from<br />

5-point RANSAC and image feature<br />

uncertainties as proposal distribution<br />

• Our proposal distribution is more<br />

robust than the commonly used<br />

constant velocity model which could<br />

be easily violated<br />

• We also showed results from the<br />

extension of our RS-SLAM for stereo<br />

camera<br />


(Top) Particles trajectories during loop closure<br />

(Bottom) 3D map generated from the particle<br />

with the highest weight<br />

15:15–15:30 TuCT2.8<br />

Adaptive Sampling Using Mobile Robotic<br />

Sensors<br />

Shuo Huang and Jindong Tan<br />

Electrical & Computer Engineering<br />

Michigan Technological University, USA<br />

• An adaptive sparse sampling approach<br />

based on mobile robotic sensors<br />

• Unknown sensing fields modeled as<br />

sparse signals in the Haar wavelet<br />

domain<br />

• Measurements determined and collected<br />

considering possible distribution of target<br />

sparse signals<br />

• Sensing information maximized with<br />

number of measurements substantially<br />

reduced<br />

Great Lakes surface temperature<br />

sensing and reconstruction
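The sparsity assumption above is easy to see with a single Haar analysis level: a piecewise-constant field yields mostly zero detail coefficients. The toy field below is an illustrative assumption.

```python
# Hedged sketch: one level of the Haar wavelet transform, the domain in
# which the sensing field is modeled as sparse. A piecewise-constant
# field produces zero detail coefficients almost everywhere.

def haar_level(signal):
    """One Haar analysis level: (averages, details), each half-length.
    Assumes an even-length input."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

field = [4.0, 4.0, 4.0, 4.0, 9.0, 9.0, 9.0, 9.0]  # piecewise constant
avg, det = haar_level(field)
print(avg)  # [4.0, 4.0, 9.0, 9.0]
print(det)  # [0.0, 0.0, 0.0, 0.0] -> sparse: few nonzero coefficients
```

Because most coefficients vanish, far fewer measurements than samples suffice to reconstruct the field, which is the measurement reduction the abstract claims.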


Session TuCT3 Continental Parlor 3 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Microrobotics III<br />

Chair Sylvain Martel, Ec. Pol. de Montreal (EPM)<br />

Co-Chair David Cappelleri, Stevens Inst. of Tech.<br />

14:00–14:15 TuCT3.1*<br />

Semi-Plenary Invited Talk: Artificial, Biomimetic and<br />

Biological Microrobotics<br />

Metin Sitti, Carnegie Mellon University<br />

14:30–14:45 TuCT3.3<br />

Chemotactic Behavior and Dynamics of<br />

Bacteria Propelled Microbeads<br />

Dongwook Kim, Albert Liu, and Metin Sitti<br />

Carnegie Mellon University, USA<br />

• Proposed chemotaxis-based steering<br />

control of swimming micro-robotic<br />

bodies propelled by attached<br />

flagellated S. marcescens bacteria<br />

• Proposed a method to create<br />

chemoattractant gradient in enclosed<br />

fluidic chambers using a mixture<br />

consisting of agar and<br />

chemoattractants<br />

• Demonstrated that chemical<br />

attractant directs the stochastic<br />

motion of bacteria propelled beads,<br />

which is promising as a steering<br />

control method<br />

Bead trajectories when there is<br />

no chemical attractant in the medium<br />

Bead trajectories when there is<br />

chemical gradient in the south direction<br />

14:15–14:30 TuCT3.2<br />

Semi-Plenary Invited Talk: Artificial, Biomimetic and<br />

Biological Microrobotics<br />

Metin Sitti, Carnegie Mellon University<br />

14:45–14:50 TuCT3.4<br />

First Leaps Toward Jumping Microrobots<br />

Wayne A. Churaman 1,3 , Aaron P. Gerratt 1 , and<br />

Sarah Bergbreiter 1,2<br />

1 Mechanical Engineering, University of Maryland, College Park, USA<br />

2 Institute for Systems Research, University of Maryland, College Park, USA<br />

3 Army Research Lab, USA<br />

• Two jumping microrobots demonstrated<br />

using both stored mechanical energy and<br />

stored chemical energy to provide thrust<br />

• Robot using stored mechanical energy is<br />

8 mg, 4 × 4 × 0.3 mm³ and has<br />

demonstrated jump heights of 32 cm<br />

when loaded and released with tweezers<br />

• Robot using stored chemical energy is<br />

300 mg and 4 × 7 × 4 mm³ but also<br />

includes power, sensing, and control on<br />

board in addition to chemical actuators –<br />

autonomous jumps 8 cm high<br />

demonstrated in response to light stimulus<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–79–<br />

Two jumping microrobots using<br />

(a) stored mechanical energy and<br />

(b) stored chemical energy for<br />

jump
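The quoted masses and jump heights can be sanity-checked with a simple ballistic model; neglecting drag and transduction losses is an illustrative assumption, not a claim from the paper:

```python
import math

def jump_energy(mass_kg, height_m, g=9.81):
    """Minimum stored energy for a ballistic jump to a given height: E = m*g*h."""
    return mass_kg * g * height_m

def launch_speed(height_m, g=9.81):
    """Takeoff speed needed to reach height_m: v = sqrt(2*g*h)."""
    return math.sqrt(2 * g * height_m)

# The 8 mg mechanical-energy robot jumping 32 cm needs roughly 25 microjoules
energy_j = jump_energy(8e-6, 0.32)
speed = launch_speed(0.32)
```

By this estimate the 32 cm jump implies about 25 µJ of delivered mechanical energy and a takeoff speed near 2.5 m/s.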


Session TuCT3 Continental Parlor 3 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Microrobotics III<br />

Chair Sylvain Martel, Ec. Pol. de Montreal (EPM)<br />

Co-Chair David Cappelleri, Stevens Inst. of Tech.<br />

14:50–14:55 TuCT3.5<br />

Micro-Scale Propulsion using<br />

Multiple Flexible Artificial Flagella<br />

John Singleton, Eric Diller, Tim Andersen, Metin Sitti<br />

Mechanical Engineering, Carnegie Mellon University, USA<br />

Stéphane Regnier<br />

Institute of Intelligent Systems and Robotics<br />

Univ. Pierre et Marie Curie, France<br />

• Multiple artificial flagella are used<br />

to increase propulsion of microscale<br />

swimmers<br />

• For simplicity of actuation and<br />

fabrication, all flagella rotate<br />

about common axis<br />

• Stiff and flexible helices and rods<br />

are tested and analyzed,<br />

suggesting increase in thrust<br />

over single flagella<br />

Concept sketch of the proposed<br />

micro-scale propulsion system<br />

with multiple flexible flagella<br />

protruding downwards from the<br />

perimeter of the cylindrical body<br />

15:00–15:15 TuCT3.7<br />

Precision Evaluation of Modular Multiscale<br />

Robots for Peg-in-Hole Microassembly Tasks<br />

Aditya N. Das and Dan O. Popa<br />

The University of Texas at Arlington, USA<br />

• Peg-in-hole microassembly is a generic<br />

problem, with yield dependent on both<br />

part tolerance and robot precision<br />

• We present simulation results evaluating<br />

the precision of several modular robotic<br />

configurations<br />

• Simulations are based on explicit<br />

uncertainty models and Monte-Carlo runs<br />

• Results confirm intuition that most precise<br />

robot designs include rotation DOFs at the<br />

end of two manipulator chains<br />

Rotational<br />

stages<br />

Angle adapter<br />

(c)<br />

Translational stages<br />

Stages<br />

End-effector<br />

Final pose<br />

Initial pose<br />

Evaluation of peg-in-hole<br />

microassembly yield<br />

with modular robotic stages<br />
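The Monte-Carlo yield evaluation mentioned in the bullets above can be illustrated with a minimal model. The per-axis Gaussian error model, the units, and all numbers here are hypothetical stand-ins, not the paper's explicit uncertainty models:

```python
import math
import random

def assembly_yield(robot_sigma, part_sigma, clearance, trials=100_000, seed=1):
    """Monte-Carlo estimate of peg-in-hole yield: an attempt succeeds when the
    combined radial error (robot precision + part tolerance, modelled as
    independent zero-mean Gaussians per axis) is below the hole clearance."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        ex = rng.gauss(0.0, robot_sigma) + rng.gauss(0.0, part_sigma)
        ey = rng.gauss(0.0, robot_sigma) + rng.gauss(0.0, part_sigma)
        if math.hypot(ex, ey) < clearance:
            ok += 1
    return ok / trials

# Tighter robots give higher yield (units arbitrary, e.g. micrometres)
y_precise = assembly_yield(1.0, 1.0, 3.0)
y_coarse = assembly_yield(2.0, 1.0, 3.0)
```

For this simple model the closed form is the Rayleigh CDF, 1 - exp(-c²/(2(σ_r² + σ_p²))), about 0.895 for the first case, so the simulation doubles as a self-check.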

14:55–15:00 TuCT3.6<br />

Manipulation of<br />

Multiple Bacteria-driven Microobjects<br />

Based on Bacterial Autonomous Movement<br />

Kousuke Nogawa 1 , Masaru Kojima 1 , Masahiro Nakajima 1 ,<br />

Michio Homma 2 , Fumihito Arai and Toshio Fukuda 1<br />

1 Institute for Advanced Research, Nagoya University, Japan<br />

2 Department of Micro-Nano Systems Engineering, Nagoya University, Japan<br />

3 Center For Micro-nano Mechatronics, Nagoya University, Japan<br />

4 Division of Biological Science, Nagoya University, Japan<br />

• We propose a method of<br />

smart manipulation of<br />

multiple bacteria-driven<br />

microobjects with single<br />

manipulated guiderobot<br />

based on bacterial<br />

autonomous movement,<br />

named “SMARTBOT”.<br />

• Fundamental capability of<br />

SMARTBOT is verified by<br />

using the micro/nano pipettes<br />

instead of the guiderobot.<br />


Signal<br />

Target<br />

source<br />

Global control methods<br />

Blood<br />

vessels<br />

Microrobots<br />

Bacteria-driven<br />

microrobots<br />

Fully manipulated multi-agents; fully autonomous by bacterial function<br />

Bacteria-driven<br />

microrobots<br />

Guide robot<br />

Smart manipulation of multiple agents with a single manipulated guide robot<br />

Autonomous mass manipulation of<br />

bacteria-driven microobjects with a<br />

single manipulated guide robot<br />

15:15–15:30 TuCT3.8<br />

Multipoint Sliding Probe Methods for In situ<br />

Electrical Transport Property Characterization of<br />

Individual Nanostructures<br />

Zheng Fan 1 , Xinyong Tao 2 , Xiaodong Li 3 , and Lixin Dong 1<br />

1 Dept. of Elec. & Comput. Engr., Michigan State University, USA<br />

2 College of Chem. Engr. & Mater. Science, Zhejiang Univ. of Tech., China<br />

3 Dept. of Mech. Engr., University of South Carolina, USA<br />

• Sliding probe methods are<br />

designed for the in situ electrical<br />

characterization of individual 1D<br />

nanostructures.<br />

• Contact force is controlled with a<br />

Cu-nanowire-tipped flexible probe.<br />

• Specimen-shape adapting probe is<br />

developed for keeping a constant<br />

contact area.<br />

• The technique has a higher<br />

resolution and simplicity in setup<br />

as compared with conventional<br />

two- and four-terminal methods,<br />

respectively.<br />

Sliding Probe Methods and TEM<br />

Images of Measurements


Session TuCT4 Continental Ballroom 4 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Forum: Robotics: Beyond the Horizon<br />

Chair Hirochika Inoue, AIST<br />

Co-Chair<br />



Session TuCT5 Continental Ballroom 5 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Medical Robotics III<br />

Chair Jaydev P. Desai, Univ. of Maryland<br />

Co-Chair Paolo Fiorini, Univ. of Verona<br />

14:00–14:15 TuCT5.1<br />

Spatial and Temporal Movement Characteristics<br />

after Robotic Training of Arm and Hand:<br />

A Case Study of a Person with<br />

Incomplete Spinal Cord Injury<br />

D.P. Eng, Dept of MEMS, Rice University, USA<br />

Z. Kadivar, Dept of PM&R, Baylor College of Medicine, USA<br />

J.L. Sullivan, A.U. Pehlivan, M.K. O’Malley<br />

Dept of MEMS, Rice University, USA<br />

N. Yozbatiran and G.E. Francisco, Dept of PM&R, UTHSC, USA<br />

• Application of RiceWrist robotic device for<br />

upper-limbs of a patient with spinal cord<br />

injury<br />

• Spatial and temporal movement aspects<br />

were determined before and after the<br />

training<br />

• Greater improvements were observed for<br />

temporal aspects of forearm and wrist<br />

movements<br />

14:30–14:45 TuCT5.3<br />

Simulating Prosthetic Devices with<br />

Human-Inspired Hybrid Control<br />

Ryan W. Sinnet, Huihua Zhao, and Aaron D. Ames<br />

Department of Mechanical Engineering, Texas A&M University, USA<br />

• A human walking experiment provides<br />

data from test subjects. Data from four<br />

of the subjects are considered to<br />

strengthen the result.<br />

• Human functions are designed which<br />

represent the most fundamental<br />

behaviors underlying human walking.<br />

• A simulation environment is described<br />

which attempts to reproduce human<br />

walking using human functions on a<br />

model with humanlike measurements.<br />

• A transfemoral prosthesis is substituted<br />

for one of the legs and simulation<br />

results in stable walking; see figure.<br />

Hybrid model of prosthesis (top),<br />

healthy gait with human-inspired<br />

control (middle), and prosthesis gait<br />

(bottom)<br />

14:15–14:30 TuCT5.2<br />

Robotic Rehabilitation System Using Human<br />

Hand Trajectory Generation Model<br />

in Virtual Curling Task<br />

Yoshiyuki Tanaka, Toru Sanemasa, Toshio Tsuji<br />

Graduate School of Engineering, Hiroshima University, Japan<br />

Nobuaki Imamura<br />

Department of Engineering, Hiroshima Kokusai University, Japan<br />

• This paper discusses the training<br />

methodology using hand trajectory<br />

generation model in virtual curling.<br />

• The hand motion by well-trained subject is<br />

expressed within the framework of the<br />

minimum-jerk model.<br />

• The model is utilized for teaching and<br />

assisting trainee's motion according to<br />

individual motor abilities.<br />

• Effectiveness of the proposed training<br />

approach was validated through<br />

preliminary training experiments with the<br />

beginners.<br />


Reference motion: a minimum-jerk model with task-related constraints<br />

(hand velocity vy [m/s] versus distance l [m]; house, release line, and stones marked)<br />
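The minimum-jerk reference used in this training system has a well-known closed form for rest-to-rest point-to-point motion; a minimal sketch, with the paper's task-related constraints omitted:

```python
def min_jerk_pos(x0, xf, T, t):
    """Minimum-jerk position: rest-to-rest motion from x0 to xf over duration T."""
    s = t / T
    return x0 + (xf - x0) * (10 * s**3 - 15 * s**4 + 6 * s**5)

def min_jerk_vel(x0, xf, T, t):
    """Corresponding velocity profile; peak 1.875*(xf - x0)/T at t = T/2."""
    s = t / T
    return (xf - x0) / T * (30 * s**2 - 60 * s**3 + 30 * s**4)
```

The quintic is the unique trajectory minimizing integrated squared jerk under zero boundary velocity and acceleration, which is why it serves as a model of skilled human hand motion.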

14:45–14:50 TuCT5.4<br />

Dual Predictive Control of Electrically Stimulated<br />

Muscle using Biofeedback for Drop Foot Correction<br />

Mitsuhiro Hayashibe, Qin Zhang, Christine Azevedo-Coste<br />

DEMAR INRIA Sophia Antipolis, and LIRMM, France<br />

• Development of an ES control strategy for<br />

drop foot correction with torque tracking<br />

performance.<br />

• ES-evoked EMG signal and joint torque<br />

were recorded for the model identification<br />

with Kalman filter with forgetting factor.<br />

• A dual predictive controller was proposed<br />

for explicit consideration of muscle activity.<br />

• The method was verified by generating<br />

control signal adaptively according to the<br />

current muscle condition.


Session TuCT5 Continental Ballroom 5 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Medical Robotics III<br />

Chair Jaydev P. Desai, Univ. of Maryland<br />

Co-Chair Paolo Fiorini, Univ. of Verona<br />

14:50–14:55 TuCT5.5<br />

Gait Support for Complete Spinal Cord Injury<br />

Patient by Synchronized Leg-Swing with HAL<br />

A. Tsukahara, Y. Hasegawa and Y. Sankai<br />

Center for Cybernics Research, University of Tsukuba, Japan<br />

• This paper proposes an estimation<br />

algorithm that infers the intention related<br />

to the forward leg-swing<br />

- Algorithm: Gait-intention estimator<br />

• The system including the algorithm infers<br />

the intention in synchronization with the<br />

center of ground reaction force<br />

- System: Hybrid Assistive Limb (HAL)<br />

• The effectiveness of the proposed<br />

algorithm was verified through a clinical<br />

trial<br />

- Clinical trial: 10m walking test on a treadmill<br />

Complete spinal cord injury<br />

patient wearing HAL<br />

15:00–15:15 TuCT5.7<br />

Knee Joint Movement Assistance Through<br />

Robust Control of an Actuated Orthosis<br />

Saber Mefoued, Samer Mohammed, and Yacine Amirat<br />

University of Paris Est Creteil - (UPEC)<br />

LISSI Laboratory, 94400<br />

Vitry Sur Seine, France<br />

• Design of an actuated orthosis intended to<br />

restore lower limb movements of dependent<br />

persons.<br />

• Dynamic modeling and parametric<br />

identification of the biomechanical-orthosis<br />

model.<br />

• Experimental implementation of a High<br />

Order Sliding Mode Controller (HOSMC).<br />

• Evaluation of controllers’ performances in<br />

terms of tracking errors and robustness<br />

against external perturbations.<br />

Actuated knee joint orthosis<br />

14:55–15:00 TuCT5.6<br />

Subject-based Motion Generation Model<br />

with Adjustable Walking Pattern<br />

for a Gait Robotic Trainer: NaTUre-gaits<br />

Ping Wang and K. H. Low<br />

School of Mechanical and Aerospace Engineering<br />

Nanyang Technological University (NTU), Singapore<br />

A. H. McGregor<br />

Charing Cross Hospital, Imperial College London, UK<br />

• The exoskeleton modules are mounted on<br />

the mobile platform and attached to the<br />

lower limbs and pelvis in parallel<br />

• The synchronized motion generation for<br />

the exoskeleton modules is provided by<br />

virtue of the inverse kinematic model<br />

analysis<br />

• The pelvic trajectory is predefined and ten<br />

points are specified within one gait cycle<br />

to obtain the foot trajectory from the<br />

designated step length and height<br />

• The trajectory is obtained via curve fitting<br />

based on these specified points<br />

• On the other hand, in order to keep the<br />

desired foot trajectory, the pelvic motion is<br />

compensated to accommodate hip and<br />

knee joint angles<br />


Ten-point definition as seen<br />

from a fixed external observer<br />

15:15–15:30 TuCT5.8<br />

Stairs-Ascending/Descending Assist for a Lower-<br />

Limb Power-Assist Robot Considering ZMP<br />

Yoshiaki Hayashi and Kazuo Kiguchi<br />

Dept. of Advanced Technology Fusion, Saga University, Japan<br />

• The perception-assist for a lower-limb power-assist robot<br />

has been studied.<br />

• In this paper, the stairs-ascending/descending assist is discussed<br />

to prevent the user from falling down.<br />

• In the proposed method, the perception-assist is carried out<br />

considering ZMP.<br />

• The effectiveness of the proposed method has been evaluated<br />

by performing experiments.<br />

Lower-limb power-assist<br />

exoskeleton robot.
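For reference, the zero-moment point (ZMP) used in the stairs-assist method above is commonly approximated for a multi-mass model by the textbook form below (neglecting the rotational inertia of individual links; this is not necessarily the paper's exact formulation):

```latex
x_{\mathrm{ZMP}} \;=\;
\frac{\sum_i m_i \, x_i \,(\ddot{z}_i + g) \;-\; \sum_i m_i \, z_i \,\ddot{x}_i}
     {\sum_i m_i \,(\ddot{z}_i + g)}
```

Keeping the ZMP inside the support polygon of the feet is the standard criterion for preventing the user-robot system from tipping over.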


Session TuCT6 Continental Ballroom 6 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Grasping and Manipulation: Grasp Planning and Quality<br />

Chair Jeff Trinkle, Rensselaer Pol. Inst.<br />

Co-Chair Elon Rimon, Tech. - Israel Inst. of Tech.<br />

14:00–14:15 TuCT6.1*<br />

Semi-Plenary Invited Talk: Sensorless Grasping with<br />

Multiple Contacts and its Computational Modeling<br />

Suguru Arimoto, Ritsumeikan University<br />

14:30–14:45 TuCT6.3<br />

Graspability Map:<br />

A Tool for Evaluating Grasp Capabilities<br />

Maximo A. Roa, Katharina Hertkorn, Franziska Zacharias,<br />

Christoph Borst and Gerd Hirzinger<br />

Institute of Robotics and Mechatronics, German Aerospace Center, Germany<br />

Graspability maps for a coffee cup with the DLR hand II and the DLR-HIT hand II<br />

• Novel representation of the positions and orientations that allow force<br />

closure precision grasps for a given object and mechanical hand<br />

• The maps allow the comparison of grasp capabilities for different<br />

mechanical hands<br />

• Potential applications of the maps include autonomous grasp and<br />

manipulation planning<br />

14:15–14:30 TuCT6.2<br />

The OpenGRASP Benchmarking Suite:<br />

An Environment for the Comparative Analysis of<br />

Grasping and Dexterous Manipulation<br />

Stefan Ulbrich, Daniel Kappler, Tamim Asfour,<br />

Nikolaus Vahrenkamp, Alexander Bierbaum, Markus Przybylski<br />

and Rüdiger Dillmann<br />

Institute for Anthropomatics, Karlsruhe Institute of Technology, Germany<br />

• A new software environment for<br />

comparative evaluation of algorithms for<br />

grasping and dexterous manipulation.<br />

• Allows the reproduction of well-defined<br />

experiments in real-life scenarios in every<br />

laboratory.<br />

• Extendable structure in order to include a<br />

wider range of benchmarks.<br />

• Real-life scenario featuring a humanoid<br />

robot acting in a kitchen with domestic<br />

everyday objects.<br />


Screenshot of the<br />

real-life scenario.<br />

14:45–14:50 TuCT6.4<br />

Experimental evaluation of Postural Synergies<br />

during Reach to Grasp with the UB Hand IV<br />

Fanny Ficuciello 1 , Gianluca Palli 2 , Claudio Melchiorri 2 and Bruno Siciliano 1 <br />

1 Dipartimento di Informatica e Sistemistica,<br />

Università degli studi di Napoli Federico II, Italy<br />

2 Dipartimento di Elettronica, Informatica e Sistemistica,<br />

Università di Bologna, Italy<br />

• The Postural Synergies configuration<br />

subspace given by the fundamental<br />

eigengrasps of the UB Hand IV is derived<br />

through experiments.<br />

• The study is based on the kinematic<br />

structure of the hand and on the taxonomy<br />

of the grasps of common objects.<br />

• Experimental results show that grasp<br />

synthesis for precision or power grasps is<br />

obtained by using only the first two<br />

eigengrasps.<br />

• The tasks are planned for reach and grasp<br />

phases, unique for each object/grasp<br />

combination, by suitably tuning the<br />

eigenposture weights.<br />

Power and Precision Grasps of<br />

Prismatic and Circular objects<br />

obtained using Postural<br />

Synergies


Session TuCT6 Continental Ballroom 6 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Grasping and Manipulation: Grasp Planning and Quality<br />

Chair Jeff Trinkle, Rensselaer Pol. Inst.<br />

Co-Chair Elon Rimon, Tech. - Israel Inst. of Tech.<br />

14:50–14:55 TuCT6.5<br />

Planning Grasps for Robotic Hands using a Novel<br />

Object Representation based on the Medial Axis<br />

Transform<br />

Markus Przybylski, Tamim Asfour and Rüdiger Dillmann<br />

KIT – Karlsruhe Institute of Technology, Germany<br />

• We present the novel grid of medial<br />

spheres object representation.<br />

• The grid of medial spheres contains<br />

symmetry properties of an object.<br />

• We present a grasp planning algorithm<br />

that operates on the grid of medial<br />

spheres.<br />

• We show experimental results with two<br />

robot hands on two sets of object models.<br />

15:00–15:15 TuCT6.7<br />

Prioritized Independent Contact Regions for<br />

Form Closure Grasps<br />

Robert Krug, Dimitar Dimitrov, Krzysztof Charusta and Boyko Iliev<br />

AASS Research Center, Oerebro University, Sweden<br />

• Independent Contact Regions (ICR)<br />

provide robustness to grasp uncertainty<br />

• We propose an Algorithm allowing for<br />

user-input regarding the shape of such<br />

regions<br />

• An extension of the Grasp Wrench Space<br />

(GWS) to sets of grasps is contributed<br />

• We introduce the concept of the Exertable<br />

Wrench Space (EWS) to evaluate grasp<br />

families<br />

14:55–15:00 TuCT6.6<br />

Exploiting potential energy storage for cyclic<br />

manipulation: An analysis for elastic dribbling<br />

with an anthropomorphic robot<br />

S. Haddadin, K. Krieger, M. Kunze, A. Albu-Schäffer<br />

Institute of Robotics and Mechatronics (DLR), Germany<br />

• Stable model based ball dribbling by<br />

proprioceptive feedback only, no vision<br />

feedback<br />

• Utilizing intrinsically compliant fingers for<br />

protecting the robot and enlarging contact<br />

time<br />

• Exploiting dynamic elastic energy storage<br />

and release for better peak power<br />

performance and robustness<br />

• Full 6DoF dribbling experiments with a<br />

Cartesian impedance controlled 7DoF<br />

anthropomorphic LWR<br />


Elastic dribbling: from modeling to experiment<br />

15:15–15:30 TuCT6.8<br />

Abort and Retry in Grasping<br />

Alberto Rodriguez, Matthew T. Mason, Siddhartha S. Srinivasa<br />

Matthew Bernstein, and Alex Zirbel<br />

The Robotics Institute, Carnegie Mellon University, USA<br />

• Early abort and retry in a bin-picking task<br />

• The system learns a probabilistic model of<br />

the instantaneous likelihood of success of<br />

the grasp<br />

Successful grasp Failed grasp<br />

• Markov chain model of the iterative process<br />

of abort and retry is used to optimize the<br />

expected time to completion of a successful<br />

grasp<br />

• Experiments with a simple hand show<br />

significant increase in time efficiency<br />

Bin picking scenario
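The expected-time optimization behind abort-and-retry can be written down directly for a simplified two-outcome model. The timings and success probability below are invented for illustration; the paper's model conditions on an instantaneous likelihood-of-success signal rather than a fixed probability:

```python
def expected_completion_time(p_success, t_success, t_failed_attempt):
    """Expected time until the first successful grasp when each attempt
    succeeds with probability p_success, a success takes t_success, and a
    failed attempt costs t_failed_attempt before retrying. From the
    renewal equation  E = p*t_s + (1-p)*(t_f + E):
        E = t_s + (1-p)/p * t_f
    """
    return t_success + (1.0 - p_success) / p_success * t_failed_attempt

# Aborting a doomed grasp after 1 s instead of running the full 5 s cycle
slow = expected_completion_time(0.5, 5.0, 5.0)   # no early abort
fast = expected_completion_time(0.5, 5.0, 1.0)   # abort failures at 1 s
```

Even this crude model shows why early abort pays: the cost of every failed attempt is multiplied by the expected number of retries, (1-p)/p.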


Session TuCT7 Continental Parlor 7 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Mechanisms: Actuator Design<br />

Chair I-Ming Chen, Nanyang Tech. Univ.<br />

Co-Chair Ravi Balasubramanian, Yale Univ.<br />

14:00–14:15 TuCT7.1<br />

Stiffness Adjustment of a Series Elastic Actuator<br />

in a Knee Prosthesis for Walking and Running:<br />

The Trade-off between Energy and Peak Power Optimization<br />

Martin Grimmer and André Seyfarth<br />

Lauflabor Locomotion Laboratory, University of Jena, Germany<br />

Modelling on human experimental data shows:<br />

• an SEA can greatly reduce peak power and<br />

energy for running and high speed walking<br />

• for preferred walking speeds only an energy<br />

reduction is possible<br />

• a benefit with constant stiffness values for<br />

running and walking at different speeds is<br />

possible<br />

Concept of an active knee prosthesis<br />

used to estimate peak power and<br />

energy requirements with a SEA in<br />

comparison to a direct drive<br />

14:30–14:45 TuCT7.3<br />

Performance of Serial Underactuated Mechanisms:<br />

Number of Degrees of Freedom and Actuators<br />

Ravi Balasubramanian and Aaron M. Dollar<br />

Department of Mechanical Engineering, Yale University, USA<br />

• Most underactuated mechanisms utilize<br />

only one actuator to actuate many<br />

degrees of freedom in a serial chain.<br />

• If additional actuators are to be used, what<br />

is the additional benefit that each actuator<br />

provides?<br />

• Metric used in paper: the number of<br />

contacts the mechanism makes with<br />

the environment.<br />

• What is the optimal way to route the<br />

additional actuators across the degrees of<br />

freedom?<br />

A three degree-of-freedom linear<br />

underactuated system a) with one<br />

actuator, and b) with two actuators. c)<br />

A schematic of the mechanism<br />

interacting with the environment.<br />

14:15–14:30 TuCT7.2<br />

Static and Dynamic Characteristics of<br />

McKibben Pneumatic Actuator for Realization<br />

of Stable Robot Motion<br />

Yasuhiro Sugimoto and Koichi Osuka<br />

Dept. of Mechanical Engineering, Osaka University, Japan<br />

Keisuke Naniwa<br />

Dept. of Mechanical Engineering, Kobe University, Japan<br />

• McKibben Pneumatic Actuator (MPA) has<br />

favorable properties on the stability of robot’s<br />

motion<br />

• The MPA’s tension-velocity dependency<br />

was measured through dynamic experiments<br />

• It was verified that the MPA’s static and<br />

dynamic properties contribute to stable robot<br />

motions<br />


Relation between MPA tension and<br />

contraction velocity at closed valve<br />

14:45–14:50 TuCT7.4<br />

Variable Radius Pulley Design Methodology for<br />

Pneumatic Artificial Muscle-based Antagonistic<br />

Actuation Systems<br />

Dongjun Shin 1 , Xiyang Yeh 2 , Oussama Khatib 1 <br />

1 Artificial Intelligence Laboratory, Stanford University, USA<br />

2 Mechanical Engineering, Stanford University, USA<br />

• Analysis of the limitations of<br />

conventional circular pulleys<br />

• A new design methodology for<br />

variable radius pulleys<br />

• Experimental comparisons to<br />

show performance<br />

improvement in position control<br />

in the enlarged workspace.<br />

Variable Radius Pulleys in Pneumatic<br />

Muscle-driven 1-DOF Testbed


Session TuCT7 Continental Parlor 7 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Mechanisms: Actuator Design<br />

Chair I-Ming Chen, Nanyang Tech. Univ.<br />

Co-Chair Ravi Balasubramanian, Yale Univ.<br />

14:50–14:55 TuCT7.5<br />


15:00–15:15 TuCT7.7<br />

Kinematic and Dynamic Analysis of a<br />

Novel 6-DOF Serial Manipulator for<br />

Underground Distribution Power Lines<br />

J.-F. Allan, S. Lavoie, S. Reiher, and G. Lambert<br />

Hydro-Quebec’s Research Institute (IREQ), Canada<br />

• New manipulator with five revolute joints<br />

and one prismatic joint<br />

• Manipulator architecture is intended to<br />

solve space constraint problems in<br />

confined environments<br />

• Presentation of the geometrical model,<br />

analytical solution for the inverse<br />

kinematic equations, and dynamic model<br />

• Simulations performed with MATLAB and<br />

CATIA, manipulator workspace<br />

• The only robot application in the world<br />

designed to perform operation tasks on<br />

equipment in vaults of an underground<br />

power distribution network<br />

Manipulator design in CATIA<br />

14:55–15:00 TuCT7.6<br />

High-Backdrivable Parallel-Link Manipulator<br />

with Continuously Variable Transmission<br />

Kenji Tahara, Shingo Iwasa, Shu Naba and Motoji Yamamoto<br />

Kyushu University, Japan<br />

• A novel parallel-link manipulator<br />

composed of several linear shaft motors is<br />

proposed.<br />

• A continuously variable transmission<br />

(CVT) mechanism is utilized.<br />

• Kinematics and dynamics of the 1 DOF<br />

parallel-link manipulator are modeled.<br />

• A control signal to regulate the arm's<br />

angle and the CVT mechanism is<br />

designed.<br />

• A static relation between the output force<br />

and the CVT mechanism is analyzed.<br />

• Experiments are conducted using a<br />

prototype.<br />


Effect of the continuously variable<br />

transmission mechanism<br />

15:15–15:30 TuCT7.8<br />

Design of a Static Balancing Mechanism with<br />

Unit Gravity Compensators<br />

Changhyun Cho<br />

Dept. of Control, Instruments, and Robot, Chosun University<br />

Sungchul Kang<br />

Center for Bionics, KIST<br />

• Design method of a static balancing<br />

mechanism using unit gravity<br />

compensators (e.g., 1-dof gravity<br />

compensator)<br />

• Determination of the number of springs<br />

and their locations by computing the<br />

mapping matrix between the joint space<br />

and gravity compensator space.<br />

• The number of rows of the mapping matrix<br />

indicates the number of unit gravity<br />

compensators and linear joint constraints<br />

representing locations of unit gravity<br />

compensators<br />

Mapping between the joint space and the gravity compensator space (Jθ = θg, θ = J⁻¹θg)


Session TuCT8 Continental Parlor 8 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Reptile & Fish-inspired Robots<br />

Chair Akio Ishiguro, Tohoku Univ.<br />

Co-Chair Fumitoshi Matsuno, Kyoto Univ.<br />

14:00–14:15 TuCT8.1<br />

Learning Fish-like Swimming with a CPG-based<br />

Locomotion Controller<br />

Y. Hu, W. Tian, J. Liang, and T. Wang<br />

Beihang University, China<br />

• Fish-like swimming can be acquired with<br />

an adaptive CPG network<br />

• Teaching signals are derived from fish<br />

kinematics with trajectory approximation<br />

method<br />

• A coupling scheme that eliminates the<br />

influence of afferent signals on amplitude<br />

of the oscillator is proposed<br />

• CPG parameters are converted into<br />

dynamical systems that evolve as part of<br />

the CPG network<br />
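A CPG of this kind is commonly realised as a chain of coupled phase oscillators; the minimal sketch below uses an illustrative coupling law and gains, not the adaptive network or teaching signals of the paper:

```python
import math

def simulate_cpg_chain(n=6, omega=2 * math.pi, w=4.0,
                       lag=math.pi / 3, dt=1e-3, steps=20_000):
    """Euler-integrate a chain of phase oscillators whose nearest-neighbour
    coupling drives the phase differences toward `lag`, producing a
    travelling body wave (the output of joint i would be sin(phi[i]))."""
    phi = [0.7 * i * i % 6.28 for i in range(n)]   # arbitrary initial phases
    for _ in range(steps):
        dphi = []
        for i in range(n):
            d = omega
            if i > 0:
                d += w * math.sin(phi[i - 1] - phi[i] + lag)
            if i < n - 1:
                d += w * math.sin(phi[i + 1] - phi[i] - lag)
            dphi.append(d)
        phi = [p + dt * d for p, d in zip(phi, dphi)]
    return phi

phases = simulate_cpg_chain()
```

After the transient, consecutive oscillators settle to the commanded phase lag regardless of initial conditions, which is the self-synchronising property that makes CPG controllers attractive for swimming robots.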

14:30–14:45 TuCT8.3<br />

Decentralized Control of Multi-articular<br />

Snake-like Robot for Efficient Locomotion<br />

Takeshi Kano 1 , Takahide Sato 1 ,<br />

Ryo Kobayashi 2,3 , and Akio Ishiguro 1,3<br />

1 Research Institute of Electrical Communication, Tohoku University, Japan<br />

2 Dept. of Mathematical and Life Sciences, Hiroshima University, Japan<br />

3 Japan Science and Technology Agency, CREST, Japan<br />

• We theoretically show that multiarticular<br />

muscles of snakes<br />

enhance locomotion efficiency<br />

• We propose a decentralized control<br />

scheme for efficient locomotion of a<br />

multi-articular snake-like robot on<br />

the basis of the theoretical result<br />

• The validity of the proposed control<br />

scheme is confirmed by simulations<br />

n � 1<br />

n � 2<br />

n � 3<br />

n � 4<br />

n � 5<br />

Direction of motion<br />

14:15–14:30 TuCT8.2<br />

A Self-tuning Multi-phase CPG Enabling the<br />

Snake Robot to Adapt to Environments<br />

Chaoquan Tang 1,2 , Shugen Ma 2 , Bin Li 1 , Yuechao Wang 1<br />

1 State Key Laboratory of Robotics, Shenyang Institute of Automation,<br />

Chinese Academy of Sciences, Shenyang, China<br />

2 Department of Robotics, College of Science and Engineering,<br />

Ritsumeikan University, Shiga-ken, Japan<br />

• A control method imitates the control<br />

strategy of natural snake’s movement in<br />

different environments, which enables<br />

the snake robot to move more quickly<br />

and naturally.<br />

• Through kinematic and dynamic analysis<br />

of snake robots, optimal control<br />

parameters are chosen for the decision<br />

strategy.<br />

• Due to the intrinsic property of the multi-phase<br />

CPG, this model can change the<br />

movement patterns and control<br />

parameters autonomously according to<br />

external information.<br />


The comparison of adaptability<br />

between multi-phase CPG control<br />

method (b) and the traditional control<br />

method (a).<br />

14:45–14:50 TuCT8.4<br />

A Snake-like Robot Driven by<br />

a Decentralized Control That Enables<br />

Both Phasic and Tonic Control<br />

Takahide Sato 1 , Takeshi Kano 1 , and Akio Ishiguro 1,2 <br />

1 Research Institute of Electrical Communication, Tohoku University, Japan<br />

2 Japan Science and Technology Agency, CREST, Japan<br />

• We have developed a two-dimensional<br />

snake-like robot employing a<br />

decentralized control scheme, in which<br />

phasic and tonic control are well<br />

coordinated.<br />

• A well-balanced coupling between the<br />

phasic and tonic control enables the robot<br />

to locomote on the upslope, which it<br />

cannot negotiate with only phasic control.<br />

Real physical serpentine<br />

robot developed


Session TuCT8 Continental Parlor 8 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Reptile & Fish-inspired Robots<br />

Chair Akio Ishiguro, Tohoku Univ.<br />

Co-Chair Fumitoshi Matsuno, Kyoto Univ.<br />

14:50–14:55 TuCT8.5<br />

A Lizard-Inspired Active Tail Enables Rapid Maneuvers<br />

and Dynamic Stabilization in a Terrestrial Robot<br />

Evan Chang-Siu, Thomas Libby, Masayoshi Tomizuka<br />

Dept. of Mechanical Engineering, University of California Berkeley, USA<br />

Robert J. Full<br />

Dept. of Integrative Biology, University of California Berkeley, USA<br />

• Bio-inspiration from leaping lizards<br />

• 177g active tailed robot<br />

• Active tail allows redirection of angular<br />

momentum from body to tail<br />

• Active tail enables perturbation mitigation<br />

• Active tail enables steering of the body<br />

angle<br />

• High speed video demonstrating behavior<br />

Figure: Bio-Inspiration and Tailbot; tail angle and body angle (deg) plotted vs. time (sec).<br />
15:00–15:15 TuCT8.7<br />

Multi-physics model of an electric fish-like robot:<br />

numerical aspects and<br />

application to obstacle avoidance<br />

Mathieu Porez*†, Vincent Lebastard†,<br />

Auke Jan Ijspeert* and Frédéric Boyer†<br />

*BioRob, EPFL, Switzerland - †IRCCyN, EMN, France<br />

• Modeling of a fish-like robot equipped with<br />

the electric sense, suited to the study of<br />

sensorimotricity<br />

• Study of interactions between body<br />

deformations and perception variables<br />

• Design and testing of an electric<br />

exteroceptive feedback loop based on a<br />

direct current measurement method<br />

14:55–15:00 TuCT8.6<br />

Moving Right Arm in the Right Place:<br />

Ophiuroid-inspired Omnidirectional Robot<br />

Driven by Coupled Dynamical Systems<br />

Wataru Watanabe, Shota Suzuki and Takeshi Kano<br />

Research Institute of Electrical Communication, Tohoku University, Japan<br />

Akio Ishiguro<br />

Research Institute of Electrical Communication, Tohoku University, Japan,<br />

and Japan Science and Technology Agency CREST, Japan<br />

• Discuss a decentralized control that<br />

exhibits the coordination of<br />

rhythmic and non-rhythmic<br />

movements.<br />

• Focus on the ophiuroid<br />

omnidirectional locomotion as a<br />

simple, representative example.<br />

• Use an active rotator model that<br />

can describe both oscillatory and<br />

excitatory properties.<br />

• Simulated robot exhibits the<br />

spontaneous role assignment of<br />

versatile arm movements and<br />

generates omnidirectional locomotion.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–89–<br />

Snapshots of the simulated<br />

locomotion of the ophiuroid robot.<br />

15:15–15:30 TuCT8.8<br />

Front-Unit-Following Control of a Snake-like<br />

Robot Using Screw Drive Mechanism Based on<br />

Past Velocity Commands<br />

Ryo Ariizumi, Hiroaki Fukushima and Fumitoshi Matsuno<br />

Dept. of Mechanical Engineering and Science, Graduate School of<br />

Engineering, Kyoto University, Japan<br />

• Present a front-unit-following control<br />

method for a snake-like robot using screw<br />

drive mechanism<br />

• Operators only have to command the first<br />

unit, then the other units automatically<br />

follow it<br />

• Proposed method estimates tracking<br />

errors expressed in Frenet frames using<br />

past velocity commands<br />

• Effectiveness of the method is<br />

demonstrated by computer simulations<br />

and laboratory experiments


Session TuCT9 Continental Parlor 9 Tuesday, September 27, <strong>2011</strong>, 14:00–15:30<br />

Novel Sensors<br />

Chair Dario Floreano, Ec. Pol. Federal, Lausanne<br />

Co-Chair Sanford Meek, Univ. of Utah<br />

14:00–14:15 TuCT9.1<br />

Contactless deflection sensor for soft robots<br />

Michal Karol Dobrzynski, Ramon Pericet-Camara<br />

and Dario Floreano<br />

Laboratory of Intelligent Systems,<br />

Ecole Polytechnique Federale de Lausanne, Switzerland<br />

• Deflection sensor capable of shape<br />

estimation with no impact on<br />

deflected substrate’s softness.<br />

• Prototype devices show a resolution<br />

of 1.29° with 400 Hz data acquisition.<br />

• Average agreement of 5% between<br />

experimental data and simulation.<br />

• Application to soft robotics domain is<br />

discussed.<br />

Aa<br />

B<br />

Ab<br />

Aa: prototype device consists of 12 contactless<br />

deflection sensors; Ab: shape visualization;<br />

B: operational principle for a single contactless<br />

deflection sensor based on light.<br />

14:30–14:45 TuCT9.3<br />

Characterizing the performance of an optical slip<br />

sensor for grip control in a prosthesis<br />

Hamidreza N. Sani and Sanford G. Meek<br />

Mechanical Engineering, University of Utah, USA<br />

• Capable of detecting slip on surfaces with<br />

various surface properties such as<br />

roughness, curvature, transparency,<br />

reflectivity, and hardness<br />

• All tests were done with a 0.5 mm clear<br />

plastic to simulate the cosmetic glove<br />

• Accurate displacement detection up to<br />

1mm away<br />

• Fails only in extreme cases of surface<br />

properties<br />

• Minimal to no disturbance from an<br />

external light source even under the sun<br />

The prototype optical slip detection sensor mounted between the fingers of the Motion Control prosthetic hand<br />

14:15–14:30 TuCT9.2<br />

Soft Curvature Sensors for<br />

Joint Angle Proprioception<br />

Rebecca K. Kramer, Carmel Majidi,<br />

Ranjana Sahai and Robert J. Wood<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

• Elastomer-based curvature<br />

sensors allow mechanically<br />

noninvasive measurement of<br />

human body motion and robot<br />

kinematics.<br />

• Stretch, pressure, and curvature<br />

sensing are combined to<br />

introduce a family of wearable soft<br />

(E ~ 1 MPa) and stretchable<br />

(>350%) sensors that detect<br />

bending and joint position.<br />

• Curvature sensors are composed<br />

of a thin, transparent elastomer<br />

film embedded with a<br />

microchannel of conductive liquid.<br />

• The sensors are demonstrated via<br />

testing on a human finger joint.<br />

14:45–14:50 TuCT9.4<br />

DLR VR-SCAN: A Versatile and Robust<br />

Miniaturized Laser Scanner for short range 3D-<br />

Modelling and Exploration in Robotics<br />

Simon Kielhöfer, Thomas Bahls, Franz Hacker, Tilo Wüsthoff and<br />

Michael Suppa<br />

Institute for Robotics and Mechatronics,<br />

German Aerospace Center (DLR), Germany<br />

• Miniaturized triangulation based laser scanner with a MEMS scan head<br />

• “Eye in hand” sensor attachable to the TCP<br />

• High quality 3D-modelling of objects e.g. for grasp-planning<br />

• Robust exploration of safe-for-motion (SFM) areas<br />

of the robot through confidence rating<br />




14:50–14:55 TuCT9.5<br />

Piezoelectric Self-Sensing Technique for<br />

Tweezer Style End-Effector<br />

Timothy McPherson and Jun Ueda<br />

George W. Woodruff School of Mechanical Engineering<br />

Georgia Institute of Technology, USA<br />

• Piezoelectric tweezers<br />

use rhombus strain<br />

amplification to achieve<br />

macro scale displacement<br />

• Self-sensing technique<br />

uses shunt resistance to<br />

measure discharged<br />

current<br />

• Technique is well suited to<br />

static force and<br />

displacement<br />

measurements<br />

• Experimental verification<br />

shows 0.15 mm max error<br />

vs. laser displacement<br />

sensor<br />

Piezoelectric Tweezers<br />

15:00–15:15 TuCT9.7<br />

Development of<br />

Omni-Directional and Fast-Responsive<br />

Net-Structure Proximity Sensor<br />

Kazuki Terada, Yosuke Suzuki, Hiroaki Hasegawa,<br />

Satoshi Sone, Aiguo Ming and Makoto Shimojo<br />

Graduate School of Informatics and Engineering, UEC, JAPAN<br />

Masatoshi Ishikawa<br />

Graduate School of Information Science and Technology, Tokyo Univ., JAPAN<br />

• All photo-transistors on the full surface<br />

are connected by a unique analog<br />

computing network circuit.<br />

• The output signal encodes the angle and<br />

the distance of a nearby object.<br />

• Response time is less than 1 ms, and<br />

the error of object angle detection<br />

is less than 3 deg.<br />

• The sensor is especially suitable for<br />

mobile robots to avoid collisions with<br />

rapidly approaching objects.<br />

“Ring-Shaped Proximity Sensor”<br />

14:55–15:00 TuCT9.6<br />

Development of a Low-Profile Sensor Using Electro-<br />

Conductive Yarns in Recognition of Slippage<br />

Van Anh Ho*, Daisuke Kondo*, Shima Okada*, Takahiho Araki**,<br />

Emi Fujita**, Masaaki Makikawa*, and Shinichi Hirai<br />

(*)Department of Robotics, Ritsumeikan University , Japan<br />

(**) Research and Development Division, Okamoto Corporation, Japan<br />

Upper row shows a complete template of a fabric slip sensor and construction of piles.<br />

Lower row plots the output of the sensor during a rubbing action by a human<br />

fingertip, and the application of the Discrete Wavelet Transform to slip detection.<br />

15:15–15:30 TuCT9.8<br />

Design of a Compact Camera-Orienting<br />

Mechanism with Flexural Pan and Tilt Axes<br />

Chao-Chieh Lan and Yi-Chiao Lee<br />

Dept. of Mechanical Eng., National Cheng Kung University, Taiwan<br />

Jinn-Feng Jiang, Yi-Jie Chen, and Hung-Yuan Wei<br />

Metal Industries Research and Development Centre, Taiwan<br />

• Design and a prototype of a camera-orienting mechanism are proposed.<br />

Pan-tilt motion is achieved by using parallel placed flexural beams to<br />

actuate a spherical 5-bar mechanism.<br />

• Flexible mechanism design can reduce the<br />

number of parts, avoid joint clearance, achieve<br />

linear input-output relation, and fit an irregular<br />

design domain.<br />

• Mechanism specifications<br />

- Size: 42×43×116 mm<br />

- Weight: 140 g<br />

- Pan range: ±35°<br />

- Tilt range: ±33°<br />

- Resolution: 0.2°<br />

- Max. speed: 150°/s<br />





8:00-9:30<br />

Technical Program Digest<br />


Wednesday<br />

September 28, <strong>2011</strong><br />

Session WeAT1 ⎯⎯ HRI: Modeling Human Behavior<br />

Session WeAT2 ⎯⎯ Models and Representation<br />

Session WeAT3 ⎯⎯ Medical Robotics: Tracking & Detection<br />

Session WeAT4 ⎯⎯ Symposium: Haptics Interfaces for the Fingertip, Hand, and Arm<br />

Session WeAT5 ⎯⎯ Symposium: Robot Motion Planning: Achievements and<br />

Emerging Approaches<br />

Session WeAT6 ⎯⎯ Symposium: Aerial Robotics: Estimation, Perception and Control<br />

Session WeAT7 ⎯⎯ Robot Walking<br />

Session WeAT8 ⎯⎯ Networked Robots<br />

Session WeAT9 ⎯⎯ Vision: From Features to Applications<br />

10:00-11:30<br />

Session WeBT1 ⎯⎯ Socially Assistive Robots<br />

Session WeBT2 ⎯⎯ Estimation & Sensor Fusion<br />

Session WeBT3 ⎯⎯ Surgical Robotics<br />

Session WeBT4 ⎯⎯ Symposium: Hardware and Software Design for Haptic Systems<br />

Session WeBT5 ⎯⎯ Symposium: Foundations and Future Prospects of Sampling-based<br />

Motion Planning<br />

Session WeBT6 ⎯⎯ Symposium: Aerial Robotics: Control and Planning<br />

Session WeBT7 ⎯⎯ Passive Walking & Leg-Wheeled Robots<br />

Session WeBT8 ⎯⎯ Multirobot Systems: Rendezvous & Task Switching<br />

Session WeBT9 ⎯⎯ Visual Servoing<br />

(continues on next page)


14:00-15:30<br />


Wednesday<br />

September 28, <strong>2011</strong><br />

(continued)<br />

Session WeCT1 ⎯⎯ Human-Robot Interaction and Cooperation<br />

Session WeCT2 ⎯⎯ Pose Estimation & Visual Tracking<br />

Session WeCT3 ⎯⎯ Robot Safety<br />

Session WeCT4 ⎯⎯ Symposium: Haptic Feedback and System Evaluation<br />

Session WeCT5 ⎯⎯ Symposium: Symbolic Approaches to Motion Planning and<br />

Control<br />

Session WeCT6 ⎯⎯ Symposium: Marine Robotics: Platforms and Applications<br />

Session WeCT7 ⎯⎯ Humanoid Control<br />

Session WeCT8 ⎯⎯ Multirobot Planning<br />

Session WeCT9 ⎯⎯ Visual & Multi-Sensor Calibration<br />

16:00-17:30<br />

Session WeDT1 ⎯⎯ Human-Robot Collaboration<br />

Session WeDT2 ⎯⎯ Recognition & Prediction of Motion<br />

Session WeDT3 ⎯⎯ Haptic Rendering & Object Recognition<br />

Session WeDT4 ⎯⎯ Industrial Forum: Medical Robotics<br />

Session WeDT5 ⎯⎯ Symposium: Robot Motion Planning: New Frameworks and High<br />

Performance<br />

Session WeDT6 ⎯⎯ Symposium: Marine Robotics: Control and Planning<br />

Session WeDT7 ⎯⎯ Tracking & Gait Analysis<br />

Session WeDT8 ⎯⎯ Multirobot Coordination & Modular Robots<br />

Session WeDT9 ⎯⎯ Calibration & Identification


Session WeAT1 Continental Parlor 1 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

HRI: Modeling Human Behavior<br />

Chair Kai Oliver Arras, Univ. of Freiburg<br />

Co-Chair Mario F. Montenegro Campos, Univ. Federal de Minas Gerais<br />

08:00–08:15 WeAT1.1<br />

Please do not disturb! Minimum Interference<br />

Coverage for Social Robots<br />

Gian Diego Tipaldi and Kai O. Arras<br />

Social Robotics Lab, University of Freiburg, Germany<br />

• We address the novel problem<br />

of human-aware coverage<br />

• We learn spatio-temporal<br />

distributions of human activities<br />

with spatial Poisson processes<br />

• We use a time-dependent TSP to plan<br />

paths that minimize the interference<br />

probability with humans<br />

• Experiments show that the proposed<br />

coverage planner significantly<br />

reduces interference in terms of<br />

duration and number of disturbances<br />

• Applications include, e.g., robotic vacuum<br />

cleaners that learn to avoid busy<br />

places at certain times of the day<br />
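As a reader's aside: the interference notion in this entry follows directly from the Poisson model — if human presence at a place has a learned rate λ, the probability of at least one disturbance during a visit of length t is 1 − exp(−λt). The rates, place names, and greedy ordering below are hypothetical stand-ins for the paper's learned spatio-temporal model and time-dependent TSP, not the authors' implementation.<br />

```python
import math

# Hypothetical learned rates: expected human encounters per minute
# at each place for a given hour (a spatial Poisson process, simplified).
RATES = {
    "kitchen": {8: 0.5, 13: 0.8, 22: 0.05},
    "hallway": {8: 0.3, 13: 0.2, 22: 0.02},
}

def interference_prob(place, hour, minutes):
    """P(at least one encounter) = 1 - exp(-lambda * t) for a Poisson process."""
    lam = RATES[place].get(hour, 0.1)
    return 1.0 - math.exp(-lam * minutes)

def coverage_order(places, hour, minutes_per_place):
    # Greedy stand-in for the paper's time-dependent TSP:
    # visit the currently least busy place first.
    return sorted(places, key=lambda p: interference_prob(p, hour, minutes_per_place))
```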

08:30–08:45 WeAT1.3<br />

Generalising Human Demonstration Data by Identifying<br />

Affordance Symmetries in Object Interaction Trajectories<br />

Jonathan Claassens<br />

Council of Industrial and Scientific Research (CSIR), South Africa<br />

Yiannis Demiris<br />

Intelligent Systems and Networks, Imperial College London, United Kingdom<br />

• When humans interact with objects, their<br />

hand trajectories often exhibit symmetries.<br />

• These symmetries may exist along edges<br />

of tables, around a cup or along the<br />

handle of an oven.<br />

• They typically arise from the affordance of<br />

the affected object.<br />

• A means to formally specify such<br />

symmetries and an algorithm to identify<br />

potential symmetries is proposed.<br />

• The performance of the algorithm is<br />

illustrated on data captured from a mocap<br />

system.<br />

A Cylindrical Affordance<br />

Symmetry<br />

08:15–08:30 WeAT1.2<br />

Shall We Dance? A Music-driven Approach for<br />

Mobile Robots Choreography<br />

Samuel de Sousa Jr and Mario Montenegro Campos<br />

Departamento de Ciência da Computação, UFMG, Brazil<br />

• We describe an approach that enables<br />

a robot to dance autonomously based<br />

on music.<br />

• A variability function for estimating the<br />

turbulence of a block of notes is<br />

defined which generates specific<br />

choreographic patterns.<br />

• Three dancing behaviors (phlegmatic,<br />

melancholic, and choleric) are defined<br />

and implemented according to the<br />

turbulence of a block of notes.<br />


Methodology<br />

08:45–08:50 WeAT1.4<br />

Human Preferences for<br />

Robot-Human Hand-over Configurations<br />

Maya Cakmak<br />

School of Interactive Computing, Georgia Institute of Technology, USA<br />

Siddhartha S. Srinivasa, Min Kyung Lee, Jodi Forlizzi, Sara Kiesler<br />

Robotics Institute & Human-Computer Interaction Institute,<br />

Carnegie Mellon University, USA<br />

• Hand-over configuration: Robot and object<br />

configuration at which the object is<br />

transferred from the robot to the human.<br />

• We present a user study that compares<br />

configurations planned using a kinematic<br />

model of the human and configurations<br />

learned from human examples.<br />

• We find that learned configurations are<br />

preferred (found to be more natural and<br />

appropriate), while planned configurations are<br />

more practical and reachable.<br />

• We identify three latent variables of preferred<br />

configurations: arm extension, consistency<br />

and naturalness. These can be used to plan<br />

better hand-over configurations.<br />

Planned (top) and learned (bottom)<br />

hand-over configurations



08:50–08:55 WeAT1.5<br />

Did you see it hesitate? – Empirically Grounded<br />

Design of Hesitation Trajectories for<br />

Collaborative Robots<br />

AJung Moon, Chris A. C. Parker, Elizabeth A. Croft<br />

H. F. Machiel Van der Loos<br />

Mechanical Engineering, University of British Columbia, Canada<br />

• Humans hesitate when an unexpected<br />

resource conflict occurs.<br />

• Based on survey validated human<br />

hesitation gestures, we developed<br />

characteristic hesitation gesture<br />

trajectories for a robot.<br />

• Our study empirically demonstrates that<br />

the communicative content of human<br />

hesitation gestures can be transferred<br />

onto a robot arm.<br />

• We present the characteristic acceleration<br />

profile of robotically generated hesitation<br />

gestures.<br />


09:00–09:15 WeAT1.7<br />

Towards an Understanding of Dancers'<br />

Coupled Body Dynamics for Waltz<br />

Hongbo Wang and Kazuhiro Kosuge<br />

Department of Bioengineering and Robotics, Tohoku University, Japan<br />

• To improve a dance partner robot, two<br />

waltz dancers’ coupled body dynamics are<br />

studied<br />

• Dancers are modeled as linear inverted<br />

pendulums, connected by a spring and a<br />

damper<br />

• Stability and optimal interaction are<br />

analyzed, under assumptions on<br />

synchronized, periodic motions<br />

• Simulations and human-robot experiments<br />

are conducted to validate our approach<br />

08:55–09:00 WeAT1.6<br />

VocaWatcher: Natural Singing Motion Generator<br />

for a Humanoid Robot<br />

Shuuji Kajita*, Tomoyasu Nakano**, Masataka Goto**,<br />

Yosuke Matsusaka*, Shin'ichiro Nakaoka*, and Kazuhito Yokoi*<br />

*Intelligent Systems Research Institute, AIST, Japan<br />

**Information Technology Research Institute, AIST, Japan<br />

• We propose a novel robot motion generator<br />

that enables a humanoid robot to sing with<br />

realistic facial expressions and naturally<br />

synthesized singing voices<br />

• Our generator uses the HRP-4C and<br />

consists of two subsystems: VocaWatcher<br />

for generating singing motions and<br />

VocaListener for synthesizing singing<br />

voices<br />

• VocaWatcher imitates a human singer by<br />

analyzing a video clip of a human singing,<br />

recorded by a single video camera<br />

• VocaListener synthesizes singing voices by<br />

imitating the pitch and dynamics of the<br />

human singing<br />

The original human singer and the robot singer, HRP-4C<br />

Video clips: http://staff.aist.go.jp/t.nakano/VocaWatcher/<br />

09:15–09:30 WeAT1.8<br />

Understanding human interaction<br />

for probabilistic autonomous navigation<br />

using Risk-RRT approach<br />

Jorge Rios-Martinez 1 , Anne Spalanzani 2 , Christian Laugier 1<br />

1 INRIA Rhone-Alpes, Grenoble, France<br />

2 UPMF-Grenoble 2 - INRIA - LIG, France<br />

• Robot navigation systems must be<br />

“aware” of the social conventions<br />

followed by people, such as respecting<br />

personal space and o-space<br />

• This paper proposes a risk-based<br />

navigation method including both the<br />

traditional notion of risk of collision and<br />

the notion of risk of disturbance<br />

• Results exhibit new emerging behavior<br />

showing how a robot takes into account<br />

social conventions in its navigation<br />

strategy<br />



Session WeAT2 Continental Parlor 2 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Models and Representation<br />

Chair Lynne Parker, Univ. of Tennessee<br />

Co-Chair Christian Schlegel, Univ. of Applied Sciences Ulm<br />

08:00–08:15 WeAT2.1<br />

Denoising of Range Images<br />

using a Trilateral filter and Belief Propagation<br />

Shuji Oishi, Ryo Kurazume,<br />

Yumi Iwashita, and Tsutomu Hasegawa<br />

Kyushu University, Japan<br />

• Two denoising techniques are<br />

proposed for a 3D laser scanner<br />

• We focus on laser reflectance<br />

obtained as a by-product of range<br />

image<br />

• New trilateral filter is proposed for<br />

suppressing noises in range images<br />

while preserving geometric features<br />

• New inpainting technique using belief<br />

propagation is also proposed to<br />

recover missing regions in range<br />

images<br />

08:30–08:45 WeAT2.3<br />

3D Crowd Surveillance and Analysis using<br />

Laser Range Scanners<br />

Xiaowei Shao, Ryosuke Shibasaki and Yun Shi<br />

Center for Spatial Information Science, University of Tokyo, Japan<br />

Huijing Zhao<br />

State Key Lab of Machine Perception, Peking University, China<br />

Kiyoshi Sakamoto<br />

Research and Development Center, East Japan Railway Company, Japan<br />

• 3D crowd surveillance by using<br />

swinging laser range scanner<br />

• Multiple clients are integrated by semi-automatic<br />

calibration<br />

• Automatic and efficient people<br />

extraction based on improved meanshift<br />

technique<br />

• Quantified crowd-density analysis in ROIs<br />

is performed.<br />

Top to bottom: swinging laser scanner;<br />

integrated 3D data; people extraction<br />

08:15–08:30 WeAT2.2<br />

Representing Actions with Kernels<br />

Guoliang Luo, Niklas Bergström,<br />

Carl Henrik Ek and Danica Kragic<br />

Computer Vision and Active Perception Lab<br />

Royal Institute of Technology (KTH), Sweden<br />

Approach<br />

• Actions as interaction<br />

• Structural representation<br />

• Define kernel induced feature<br />

representation<br />


Benefits<br />

• Robust to noise<br />

• Allows for principled inference<br />

08:45–08:50 WeAT2.4<br />

4-Dimensional Local Spatio-Temporal Features<br />

for Human Activity Recognition<br />

Hao Zhang and Lynne E. Parker<br />

Department of Electrical Engineering and Computer Science,<br />

University of Tennessee, Knoxville, USA<br />

• A 4-dimensional local spatio-temporal<br />

feature is proposed that combines both<br />

intensity and depth information<br />

• The proposed feature is used for human<br />

activity recognition with the latent Dirichlet<br />

allocation (LDA) model as a classifier<br />

• A new human activity database is created<br />

using a Kinect installed on a Pioneer robot<br />

• An average accuracy of 91.50% is<br />

achieved using the LDA model with the<br />

proposed feature<br />

Confusion matrix of human<br />

activity recognition using Kinect



08:50–08:55 WeAT2.5<br />

Conic Fitting based on Stochastic Linearization<br />

Marcus Baum and Uwe D. Hanebeck<br />

Intelligent Sensor-Actuator-Systems Lab (ISAS)<br />

Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany<br />

• Fitting conics, e.g., ellipses, to noisy measurements<br />

• Approximation of the implicit measurement equation with<br />

an explicit measurement equation<br />

• Gaussian filter<br />
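For context only: the entry above concerns fitting conics to noisy measurements. A plain algebraic least-squares fit — not the authors' stochastic-linearization Gaussian filter — can be sketched as follows on synthetic data: stack each point's monomials into a design matrix and take the smallest-singular-value right singular vector as the conic coefficients.<br />

```python
import numpy as np

# Synthetic noisy samples from an ellipse centered at (0.5, -1.0).
t = np.linspace(0.0, 2.0 * np.pi, 200)
rng = np.random.default_rng(0)
x = 3.0 * np.cos(t) + 0.5 + rng.normal(0.0, 0.02, t.shape)
y = 2.0 * np.sin(t) - 1.0 + rng.normal(0.0, 0.02, t.shape)

# Implicit conic: a x^2 + b xy + c y^2 + d x + e y + f = 0.
# The least-squares solution is the right singular vector of the
# design matrix with the smallest singular value.
D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
_, _, Vt = np.linalg.svd(D, full_matrices=False)
a, b, c, d, e, f = Vt[-1]

residual = np.abs(D @ Vt[-1]).mean()
is_ellipse = (b * b - 4.0 * a * c) < 0.0  # conic discriminant test
```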

09:00–09:15 WeAT2.7<br />

Managing Execution Variants in Task Coordination<br />

by Exploiting Design-Time Models at Run-Time<br />

Andreas Steck and Christian Schlegel<br />

Department of Computer Science,<br />

University of Applied Sciences Ulm, Germany<br />

• Models created at design-<br />

time are used by the<br />

robot at run-time to<br />

support the decision<br />

making process.<br />

• At design-time, purposefully left-open<br />

variation points are bound at<br />

run-time.<br />

• Real-world experiments<br />

http://youtu.be/xtLK-655v7k<br />

SmartTCL coordinates the whole<br />

system<br />

08:55–09:00 WeAT2.6<br />

Bootstrapping sensorimotor cascades:<br />

a group-theoretic perspective<br />

Andrea Censi and Richard M. Murray<br />

Control & Dynamical Systems<br />

California Institute of Technology, USA<br />

• In the bootstrapping problem we study<br />

agents that can learn to use unknown<br />

actuators and unknown sensors, starting<br />

from zero prior information.<br />

• For example, the figure shows the raw<br />

intensity values returned by a subset of<br />

the sensels in a range-finder and a<br />

camera (can you tell which is which?)<br />

• In this paper, we formulate the problem as<br />

a problem of group nuisance rejection.<br />

Invariance to group nuisances is our<br />

guiding principle for design.<br />

• Models are demonstrated on real-world<br />

data of cameras and range-finders.<br />


Figure axes: sensels vs. time.<br />

09:15–09:30 WeAT2.8<br />

The Mathematical Model and Control of Human-<br />

Machine Perceptual Feedback System<br />

Han Yoon and Seth Hutchinson<br />

Department of Electrical and Computer Engineering,<br />

University of Illinois, USA<br />

• Create interfaces that enable perceptual<br />

feedback control between a human user<br />

and a mobile robot.<br />

• Enhance the perception-to-behavior<br />

performance of an unskilled human user<br />

by warping the perceptual display<br />

• Improve the wellbeing of people suffering<br />

from the lack of manipulability<br />

• Protect people like drivers or pilots who<br />

are impaired by fatigue, boredom, or<br />

possible substance abuse


Session WeAT3 Continental Parlor 3 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Medical Robotics: Tracking & Detection<br />

Chair Stefano Stramigioli, Univ. of Twente<br />

Co-Chair Erion Plaku, Catholic Univ. of America<br />

08:00–08:15 WeAT3.1<br />

Three-Dimensional Pose Reconstruction of<br />

Flexible Instruments from Endoscopic Images<br />

Rob Reilink, Stefano Stramigioli, and Sarthak Misra<br />

University of Twente, The Netherlands<br />

• Position and orientation of flexible<br />

endoscopic instruments are measured<br />

without adding sensors<br />

• Feature points of endoscopic instrument<br />

are detected within the endoscopic<br />

image<br />

• The state of a kinematic model is<br />

adapted in order to match the feature<br />

points of the model to observed points in<br />

the image<br />

• RMS errors of 1.7, 1.2, and 3.6 mm in the<br />

horizontal (x), vertical (y), and<br />

longitudinal (z) directions, respectively,<br />

in a colon anatomical model.<br />

08:30–08:45 WeAT3.3<br />

Comparison of Several Image Features<br />

for WCE Video Abstract<br />

Baopu Li 1,2 , Max Q.-H. Meng 1,2<br />

1. Department of Electronic Engineering, The Chinese University of Hong Kong<br />

2. Shenzhen Institute of Advanced Technology,<br />

The Chinese Academy of Science, China<br />

• Wireless Capsule endoscopy (WCE) is a revolutionary technique to<br />

examine diseases in the small intestine<br />

• Reduction of the review time for a WCE video is needed<br />

• Several image features are compared to summarize a WCE<br />

• Preliminary experiments show that textural and motion features may be<br />

suitable for a WCE video abstract<br />

• Clinical validation is a part of future work<br />

08:15–08:30 WeAT3.2<br />

Detection of Curved Robots using 3D<br />

Ultrasound<br />

Hongliang Ren, Nikolay V. Vasilyev and Pierre E. Dupont<br />

Department of Cardiac Surgery, Children's Hospital Boston<br />

Harvard Medical School, USA<br />

• Goal is image-based continuum robot<br />

tracking and servoing using real-time 3D<br />

ultrasound<br />

• Existing detection algorithms apply to<br />

robotic instruments with straight shafts,<br />

but do not extend to continuum robots that<br />

curve along their length<br />

• Problem complexity is reduced by<br />

decomposing arc detection into sequence<br />

of plane and circle detection problems.<br />

• Proposed approach is parallelizable.<br />


3D Ultrasound volume and<br />

detected curved robot<br />

08:45–08:50 WeAT3.4<br />

Toward Development of 3D Surgical Mouse<br />

Paradigm<br />

Xiaochuan Sun and Shahram Payandeh<br />

Experimental Robotics and Imaging Laboratory<br />

School of Engineering Science<br />

Simon Fraser University, Canada<br />

• Monocular imaging systems are<br />

commonly utilized in minimally invasive<br />

surgery.<br />

• The lack of 3D visual perception poses a<br />

natural challenge and offers design<br />

opportunities for developing surgeon-computer<br />

interfaces.<br />

• Design of a robust monocular-based<br />

image tracking of the surgical instruments<br />

is an essential first step.<br />

• We introduce a novel 3D mouse paradigm<br />

and compare four different image tracking<br />

schemes.<br />

3D mouse conceptual diagram



08:50–08:55 WeAT3.5<br />

3D Thread Tracking for Robotic Assistance in<br />

Tele-surgery<br />

Nicolas Padoy, Gregory D. Hager<br />

The Johns Hopkins University, Baltimore, MD, USA<br />

{padoy,hager}@jhu.edu<br />

• Automatic third arm assistance<br />

• Thread modeling and tracking with nonuniform<br />

rational B-splines<br />

• Energy approximation and minimization<br />

using discrete MRF optimization<br />

• Automatic scissors command for the da<br />

Vinci robot<br />

09:00–09:15 WeAT3.7<br />

In-vitro Three Dimensional Vasculature Modeling<br />

Based on Sensor Fusion between Intravascular<br />

Ultrasound and Magnetic Tracker<br />

Chaoyang Shi, Carlos Tercero, Seiichi Ikeda, Toshio Fukuda<br />

Micro-Nano Systems Engineering, Nagoya University, Japan<br />

Kimihiro Komori and Kiyohito Yamamoto<br />

Graduate School of Medicine, Nagoya University, Japan<br />

• Proposed a new sensor fusion<br />

technique combining intravascular<br />

ultrasound (IVUS) and magnetic<br />

trackers for intravascular surgery<br />

• Integrated the hybrid probe and<br />

measured the disturbances between<br />

two sensors<br />

• Performed in-vitro experiments<br />

using a silicone model of the<br />

descending aorta, and reconstructed<br />

the corresponding 3D model<br />

Fig. a) Hybrid probe consisting of magnetic<br />

tracker and IVUS probe; b) Silicone model<br />

for in-vitro experiments; c) Reconstructed<br />

model.<br />

08:55–09:00 WeAT3.6<br />

Ultrasound image features of the wrist<br />

are linearly related to finger positions<br />

Claudio Castellini and Georg Passig<br />

Institute of Robotics and Mechatronics<br />

DLR - German Aerospace Research Center<br />

Oberpfaffenhofen, Germany<br />

• ultrasound images of the human wrist<br />

and finger positions are recorded using<br />

a commercial ultrasound machine and<br />

a dataglove<br />

• first-order local linear approximations of<br />

the grey levels are extracted from the<br />

images, and<br />

• a strong linear relationship is detected<br />

between them and finger positions<br />

• finger positions are then predicted in<br />

real time<br />

• this includes finger flexion/extension<br />

and thumb adduction/rotation<br />

• the system is robust to downsampling<br />


typical values of true and<br />

predicted pinkie position<br />

• main foreseen application is<br />

on transradial amputees:<br />

1.treatment of phantom limb pain<br />

2.visualisation of the imaginary limb<br />

3.control of a hand prosthesis<br />
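The first-order linear model described above can be fitted with ordinary least squares; in the sketch below, synthetic features and joint angles (made-up dimensions) stand in for the grey-level features and dataglove recordings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 frames of 12 image features, 5 finger angles.
W_true = rng.normal(size=(12, 5))        # hidden linear feature-to-angle map
features = rng.normal(size=(200, 12))    # grey-level features per frame
angles = features @ W_true + 0.01 * rng.normal(size=(200, 5))

# Fit the linear predictor with ordinary least squares.
W_hat, *_ = np.linalg.lstsq(features, angles, rcond=None)

# Real-time prediction is then one matrix-vector product per frame,
# which is why the scheme is cheap enough to run live.
pred = features @ W_hat
rmse = float(np.sqrt(np.mean((pred - angles) ** 2)))
```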

09:15–09:30 WeAT3.8<br />

Surgical Tools Pose Estimation for a Multimodal<br />

HMI of a Surgical Robotic Assistant<br />

Belén Estebanez, Enrique Bauzano and Víctor F. Muñoz<br />

System Engineering and Automation, University of Malaga, Spain<br />

• The recognition of a surgeon’s maneuver<br />

through Hidden Markov Models (HMM)<br />

requires the tracking of the surgical tools.<br />

• Minimization of the measured occluded<br />

areas is needed to control autonomous<br />

movements of a robotic assistant through<br />

gestures.<br />

• Location of each surgeon’s tool may be<br />

estimated with Multiple Extended Kalman<br />

Filters (MEKF), one for each gesture the<br />

maneuver is composed of.<br />

• Estimated location depends on the most<br />

probable maneuver the surgeon may be<br />

carrying out.
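A single predict/update cycle of the underlying filter can be sketched as follows. This is the linear Kalman special case on a toy 1-D tool track; the paper's arrangement of multiple EKFs over gestures is not reproduced here:

```python
import numpy as np

def kf_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    An EKF replaces F and H with Jacobians of the motion and
    measurement models evaluated at the current estimate."""
    x = F @ x                      # predict state
    P = F @ P @ F.T + Q            # predict covariance
    S = H @ P @ H.T + R            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)        # correct with measurement residual
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy 1-D tool track: constant-velocity motion, noisy position measurements.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[1e-2]])
rng = np.random.default_rng(1)
x, P = np.zeros(2), np.eye(2)
truth = np.array([0.0, 0.5])
for _ in range(100):
    truth = F @ truth
    z = H @ truth + rng.normal(scale=0.1, size=1)
    x, P = kf_step(x, P, z, F, H, Q, R)
err = abs(x[0] - truth[0])
```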


Session WeAT4 Continental Ballroom 4 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Haptics Interfaces for the Fingertip, Hand, and Arm<br />

Chair Katherine J. Kuchenbecker, Univ. of Pennsylvania<br />

Co-Chair Marcia O'Malley, Rice Univ.<br />

08:00–08:15 WeAT4.1*<br />

Semi-Plenary Invited Talk: Surface Haptics: Virtual<br />

Touch on Physical Surfaces<br />

Edward Colgate, Northwestern University<br />

08:30–08:45 WeAT4.3<br />

Weight and Friction Display Device<br />

by Controlling the Slip Condition of a Fingertip<br />

Yuichi Kurita<br />

Graduate School of Engineering, Hiroshima University, Japan<br />

Satoshi Yonezawa, Atsutoshi Ikeda and Tsukasa Ogasawara<br />

Graduate School of Information Science,<br />

Nara Institute of Science and Technology, Japan<br />

• Weight and friction illusions are generated<br />

by controlling the “eccentricity” of the contact<br />

surface between a fingertip and a rigid<br />

plate.<br />

• The eccentricity is a quantitative index of<br />

the slip condition calculated from image<br />

processing of captured images.<br />

• The desired eccentricity profile for the<br />

target weight and friction was obtained by<br />

modeling human grip/load force profiles.<br />

• Human experiments showed that the<br />

proposed device successfully presents the<br />

weight/friction illusions.<br />

08:15–08:30 WeAT4.2<br />

Semi-Plenary Invited Talk: Surface Haptics: Virtual<br />

Touch on Physical Surfaces<br />

Edward Colgate, Northwestern University<br />

08:45–08:50 WeAT4.4<br />

On-line Bio-impedance Identification of Fingertip<br />

Skin for Enhancement of Electrotactile Based Haptic<br />

Rendering<br />

John Gregory*, Yantao Shen**, and Ning Xi †<br />

*Wintek Electro-Optics Co., Ann Arbor, Michigan, USA<br />

** Dept. of Electrical and Biomedical Engr., University of Nevada, Reno, USA<br />

† Dept. of Electrical and Computer Engr., Michigan State University, USA &<br />

Dept. of Manufacturing Engineering and Engineering Management, The City<br />

University of Hong Kong, China<br />

• A new constant-voltage-driver<br />

(CVD) based electrotactile haptic<br />

rendering system is developed<br />

• The system identifies the bio-impedance<br />

of fingertip skin on-line<br />

for haptic/tactile<br />

preference tuning<br />

• The identification method is<br />

based on a discrete-time<br />

extended least squares iterative<br />

approach with forgetting factor<br />

• Experimental results<br />

demonstrate the performance of<br />

both the system and the<br />

identification method<br />


Adaptive fingertip skin bio-impedance<br />

identification in z-domain
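The recursive identification idea can be sketched as plain recursive least squares with a forgetting factor; the paper's extended variant and the actual skin-impedance model are not reproduced, and the parameters below are invented:

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive least-squares step with forgetting factor lam."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)   # gain vector
    e = y - (phi.T @ theta).item()          # prediction error
    theta = theta + k * e
    P = (P - k @ phi.T @ P) / lam
    return theta, P

# Identify a 2-parameter impedance-like model y = a*u + b*du
# (hypothetical values, not fingertip-skin data).
rng = np.random.default_rng(2)
a_true, b_true = 3.0, 0.5
theta = np.zeros((2, 1))
P = 1e3 * np.eye(2)
for _ in range(300):
    u, du = rng.normal(size=2)
    y = a_true * u + b_true * du + 0.01 * rng.normal()
    theta, P = rls_update(theta, P, np.array([u, du]), y)
```

The forgetting factor discounts old samples, which is what lets the estimate adapt as skin impedance drifts during use.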



08:50–08:55 WeAT4.5<br />

Design of an MRI Compatible Haptic Interface<br />

• A plastic, MRI-compatible, 1-axis,<br />

fiber-optic force sensor for robotic<br />

applications in the MRI environment<br />

• Its compact design employs a<br />

displacement amplifying compliant<br />

mechanism for more accurate sensing<br />

• Samples made of Delrin and ABS plastic<br />

were tested; both samples<br />

met MRI safety requirements<br />

• The Delrin sample is superior to ABS<br />

plastic in accuracy and uniaxial<br />

sensitivity<br />

Melih Turkseven and Jun Ueda<br />

George W. Woodruff School of Mechanical Engineering,<br />

Georgia Institute of Technology, USA<br />

The proposed MRI compatible<br />

force sensor<br />

09:00–09:15 WeAT4.7<br />

Wide-bandwidth Bilateral Control<br />

Using Two Stage Actuator Systems:<br />

Evaluation Results of a Prototype<br />

Saori Kokuryu, Masaki Izutsu, Norihiro Kamamichi and<br />

Jun Ishikawa<br />

Department of Robotics and Mechatronics, Tokyo Denki University, Japan<br />

• Developed a prototype of the proposed<br />

two-stage actuator system for wide-bandwidth<br />

bilateral control<br />

• Proposed a coordinated motion control via<br />

control bandwidth separation method<br />

• Successfully implemented a mechanical<br />

impedance control with a mass of<br />

0.01 kg and damping of 1.6 N/(m/s), without<br />

stiffness<br />

• Established a basis for wide-bandwidth<br />

bilateral control using lightly-moving<br />

actuator system<br />

Figure labels: first stage, second stage,<br />

ball screw, AC motor, encoder,<br />

laser displacement sensor<br />

Prototype using two inexpensive<br />

actuators, i.e., voice coil motor<br />

and AC motor with ball screw<br />

08:55–09:00 WeAT4.6<br />

Force Producibility Improvement of Redundant<br />

Parallel Mechanism for Haptic Applications<br />

Jumpei Arata, Norio Ikedo and Hideo Fujimoto<br />

Nagoya Institute of Technology, Japan<br />

• Parallel mechanisms are widely applied in<br />

haptic applications for benefits<br />

such as high rigidity, output force,<br />

and backdrivability.<br />

• In our past study, we proposed a new<br />

redundant parallel mechanism “DELTA-R”<br />

for a haptic device.<br />

• In this paper, we propose a method to<br />

improve force producibility by using the<br />

redundant DOF.<br />

• The forward kinematic model was<br />

modified by taking into account the<br />

geometric conditions of the mechanism.<br />

• Extremal Force Diagram (EFD) and<br />

experimental results using prototype<br />

clearly showed the effectiveness of the<br />

proposed method.<br />


Redundant PM DELTA-R<br />

and its improved force producibility<br />

09:15–09:30 WeAT4.8<br />

A New Generation of Ergonomic Exoskeletons –<br />

The High-Performance X-Arm-2<br />

André Schiele<br />

Telerobotics & Haptics Laboratory, ESA, Netherlands<br />

Gerd Hirzinger<br />

Institute of Robotics & Mechatronics, DLR, Germany<br />

• New, fully actuated and ergonomic<br />

haptic exoskeleton for space robotics<br />

Telepresence<br />

• Human-centered design<br />

• Low mass and high-power density<br />

implementation of actuators through<br />

combination of directly integrated and<br />

Bowden Cable relocated drives<br />

• Detailed performance characterization of<br />

mechatronic implementation through<br />

Bode plots and contact experiments<br />

X-Arm-2 Exoskeleton front (top) and<br />

back-side view (bottom)


Session WeAT5 Continental Ballroom 5 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Robot Motion Planning: Achievements and Emerging Approaches<br />

Chair Ron Alterovitz, Univ. of North Carolina at Chapel Hill<br />

Co-Chair Maxim Likhachev, Carnegie Mellon Univ.<br />

08:00–08:15 WeAT5.1*<br />

Semi-Plenary Invited Talk: 50 Years of Robotics:<br />

Motion Planning<br />

Tomas Lozano-Perez, MIT<br />

08:30–08:45 WeAT5.3<br />

Conflict-Free Route Planning<br />

in Dynamic Environments<br />

Adriaan W. ter Mors<br />

Delft University of Technology, The Netherlands<br />

• Optimal route planning on a roadmap for a single robot, in<br />

polynomial time<br />

• Prioritized approach: a robot plans around the routes of higher-priority<br />

robots<br />

• Unexpected delays for one robot can propagate to others<br />

• Pareto- and non-Pareto optimal plan repair mechanisms are<br />

compared<br />

Two robots traverse a shared roadmap<br />

of limited-capacity resources<br />

08:15–08:30 WeAT5.2<br />

Semi-Plenary Invited Talk: 50 Years of Robotics:<br />

Motion Planning<br />

Tomas Lozano-Perez, MIT<br />

08:45–08:50 WeAT5.4<br />

Kinodynamic Motion Planning with<br />

State Lattice Motion Primitives<br />

Mihail Pivtoraiko and Alonzo Kelly<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• We present an approach to automatic<br />

generation of motion primitives for<br />

kinodynamic motion planning;<br />

• State lattice motion primitives feature a<br />

number of advantages, including enabling<br />

reuse of computation (e.g. D*) in differentially<br />

constrained motion planning;<br />

• We demonstrate incremental planning for<br />

a dynamic system: a mobile robot<br />

moving on a slippery surface with significant<br />

drift.<br />




08:50–08:55 WeAT5.5<br />

Efficient Motion Planning for Manipulation Robots<br />

in Environments with Deformable Objects<br />

Barbara Frank, Cyrill Stachniss,<br />

Nichola Abdo, and Wolfram Burgard<br />

Department of Computer Science, University of Freiburg, Germany<br />

• Approach to online manipulator motion<br />

planning considering deformable objects<br />

• Efficient estimation of deformation cost<br />

along a trajectory using Gaussian process<br />

(GP) regression<br />

• GP training using a deformation simulation<br />

based on finite element methods<br />

• Experiments illustrate an accurate cost<br />

estimation and online planning capabilities<br />

Our robot deforming the leaves of<br />

a plant along a trajectory<br />
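Gaussian process regression of a scalar cost, as used for the deformation-cost estimate, can be sketched in a few lines of NumPy. The RBF kernel and 1-D synthetic training data below are illustrative stand-ins for the FEM-generated samples:

```python
import numpy as np

def rbf(a, b, ell=0.5):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / ell**2)

# Train on (configuration, cost) pairs; sin() stands in for simulated costs.
X = np.linspace(0.0, 3.0, 20)
y = np.sin(X)
K = rbf(X, X) + 1e-6 * np.eye(len(X))   # jitter for numerical stability
alpha = np.linalg.solve(K, y)

# Predictive mean at new configurations: k(X*, X) @ K^{-1} y.
# Each query is a single kernel-vector product, cheap enough for
# online cost evaluation along candidate trajectories.
Xs = np.array([0.5, 1.5, 2.5])
mean = rbf(Xs, X) @ alpha
```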

09:00–09:15 WeAT5.7<br />

A Simplified Model of RRT Coverage for<br />

Kinematic Systems<br />

Joel M Esposito<br />

Systems Engineering, United States Naval Academy<br />

• Rapidly-exploring Random Trees explore<br />

the state space, but how quickly?<br />

• We model state space coverage as a<br />

random discrete-time process<br />

• Under two simplifying assumptions we<br />

obtain a closed form solution for expected<br />

coverage as a function of number of<br />

nodes<br />

• Experimental results suggest the model is<br />

valid for holonomic systems in expansive<br />

configuration spaces.<br />

• The model can be used as a termination criterion,<br />

or as a measure of problem difficulty<br />
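As a toy illustration of how simplifying assumptions yield a closed form for expected coverage (this is not the paper's model): if each of n tree nodes lands in one of K equal cells uniformly at random, the expected covered fraction is 1 - (1 - 1/K)^n, which a Monte Carlo run confirms:

```python
import numpy as np

def expected_coverage(n, K):
    """Expected fraction of K uniform cells hit by n i.i.d. samples."""
    return 1.0 - (1.0 - 1.0 / K) ** n

# Monte Carlo check of the closed form.
rng = np.random.default_rng(3)
K, n, trials = 100, 250, 2000
hits = np.empty(trials)
for t in range(trials):
    cells = rng.integers(0, K, size=n)   # each node lands in a random cell
    hits[t] = np.unique(cells).size / K
mc = float(hits.mean())
cf = expected_coverage(n, K)
```

A formula of this shape is what lets expected coverage serve as a termination criterion: one can stop adding nodes once the predicted marginal gain falls below a threshold.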

08:55–09:00 WeAT5.6<br />

Learning Dimensional Descent planning for a<br />

highly-articulated robot arm<br />

Paul Vernaza and Daniel D. Lee<br />

GRASP Laboratory, University of Pennsylvania, USA<br />

• Learning Dimensional Descent (LDD)<br />

solves high-dimensional planning<br />

problems by solving a sequence of<br />

low-dimensional DP problems<br />

• Learning-based technique<br />

automatically identifies and exploits<br />

low-dimensional structure<br />

• Experiments with the PR2 robot<br />

demonstrate much higher-quality<br />

solutions than with sampling-based<br />

planners and smoothing<br />

• Open-source implementation available<br />


Illustration of one step of LDD algorithm.<br />

Solution generated in next step is<br />

minimum-cost path restricted to manifold<br />

of previous path swept along learned<br />

basis directions.<br />

09:15–09:30 WeAT5.8<br />

Space-Filling Trees: A New Perspective on<br />

Incremental Search for Motion Planning<br />

James J. Kuffner<br />

The Robotics Institute, Carnegie Mellon University, USA<br />

& Google Research, Google Inc., USA<br />

Steven M. LaValle<br />

Dept. of Computer Science, Univ. of Illinois at Urbana-Champaign, USA<br />

• The concept of “Space-Filling Trees” is<br />

introduced and analyzed in the context of<br />

sampling-based motion planning.<br />

• Space-filling trees are analogous to<br />

space-filling curves, but with a branching,<br />

tree-like structure that connects a single<br />

point in a continuous space to every other<br />

point by a path of finite length.<br />

• Rapidly-exploring Random Trees (RRTs)<br />

can be considered stochastic variants of<br />

space-filling trees, and compared in terms<br />

of overall efficiency of exploration.<br />

• Key open problems remain, such as<br />

applying space-filling trees to the design<br />

and analysis of sampling-based planners.<br />

Three iterations of the triangle<br />

and square space-filling trees.


Session WeAT6 Continental Ballroom 6 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Aerial Robotics: Estimation, Perception and Control<br />

Chair Nathan Michael, Univ. of Pennsylvania<br />

Co-Chair Pieter Abbeel, UC Berkeley<br />

08:00–08:15 WeAT6.1*<br />

Semi-Plenary Invited Talk: Do It Yourself Drones<br />

Chris Anderson, Wired Magazine<br />

08:30–08:45 WeAT6.3<br />

Multiple-Objective Motion Planning<br />

for Unmanned Aerial Vehicles (UAVs)<br />

Sebastian Scherer and Sanjiv Singh<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• Want to perform various missions (reach a goal, search for landing<br />

sites, approach LZs); however, the mission cannot be expressed as a<br />

goal-seeking behavior<br />

• Propose a consistent framework and algorithm to plan with a<br />

combination of multiple objectives<br />

• Show results from performing realistic missions in simulation<br />

08:15–08:30 WeAT6.2<br />

Semi-Plenary Invited Talk: Do It Yourself Drones<br />

Chris Anderson, Wired Magazine<br />

08:45–08:50 WeAT6.4<br />

Bilateral Teleoperation of Multiple UAVs with<br />

Decentralized Bearing-only Formation Control<br />

Antonio Franchi 1 , Carlo Masone 1<br />

Heinrich H. Bülthoff 1,2 , and Paolo Robuffo Giordano 1<br />

1 Max Planck Institute for Biological Cybernetics, Germany<br />

2 Department of Brain and Cognitive Engineering, Korea University, Korea<br />

• Decentralized<br />

multi-master/multi-slave<br />

bilateral teleoperation of<br />

groups of UAVs:<br />

1. Human controls collective motion<br />

2. UAVs autonomously control shape<br />

• Only relative bearing measures<br />

used (no global localization, no<br />

distances)<br />

• Rigorous analysis of minimally-rigid sets<br />

of 3D bearing-formations and associated<br />

decentralized formation control<br />

• Stable design of force feedback for the human operator in<br />

order to increase telepresence<br />


Human-hardware-in-the-loop<br />

simulations and experiments<br />

with quadrotors<br />

and 2 haptic devices



08:50–08:55 WeAT6.5<br />

Modeling and Decoupling Control<br />

of the CoaX Micro Helicopter<br />

Péter Fankhauser, Samir Bouabdallah,<br />

Stefan Leutenegger, Roland Siegwart<br />

Autonomous Systems Lab, ETH Zurich, Switzerland<br />

• A nonlinear model for a coaxial micro<br />

helicopter based on gray-box modeling<br />

and parameter identification is presented<br />

• The model accounts for hover and<br />

cruise flight situations and captures the<br />

off-axis dynamics and the dynamics of<br />

the stabilizer bar<br />

• A test bench and a vision based tracking<br />

system are used for parameter<br />

identification and model validation<br />

• A decoupling controller is developed and<br />

implemented, which is shown to increase<br />

the accuracy of flight trajectories<br />

09:00–09:15 WeAT6.7<br />

Deterministic Initialization of Metric<br />

State Estimation Filters for Loosely-Coupled<br />

Monocular Vision-Inertial Systems<br />

Laurent Kneip, Stephan Weiss, and Roland Siegwart<br />

Autonomous Systems Lab, ETH Zurich<br />

• Deterministic scale factor computation of monocular visual SLAM<br />

• Deterministic computation of orientation with respect to gravity direction<br />

• Performed via a derivation of delta-velocities from<br />

• Numerical differentiation of visual pose<br />

• Integration of inertial data<br />

• Robustness analysis for noise<br />

and degenerate motions<br />

• Successful results on simulated<br />

and real data<br />

• Performance tests on filter initialization<br />

08:55–09:00 WeAT6.6<br />

On Active Target Tracking and Cooperative<br />

Localization for Multiple Aerial Vehicles<br />

Fabio Morbidi, Gian Luca Mariottini<br />

ASTRA Robotics Laboratory<br />

Department of Computer Science & Engineering<br />

University of Texas at Arlington, USA<br />

• Cooperative active target-tracking<br />

for a team of aerial vehicles<br />

equipped with 3-D range-finding<br />

sensors<br />

• Gradient-based controllers are<br />

designed for each vehicle<br />

• The Kalman-Bucy filter is used for<br />

estimation fusion<br />

• Analytical derivation of lower and<br />

upper bounds on the target’s<br />

position uncertainty<br />

• Active Cooperative Localization<br />

and Multi-target Tracking (ACLMT)<br />

minimizes both the targets’ and<br />

the robots’ position uncertainty<br />


Four aerial vehicles cooperatively<br />

track a target to estimate its position<br />

with minimum uncertainty<br />

09:15–09:30 WeAT6.8<br />

Collaborative Stereo<br />

Markus W. Achtelik*, Stephan Weiss*, Margarita Chli*,<br />

Frank Dellaert † , and Roland Siegwart*<br />

*ETH Zurich, Autonomous Systems Lab, Switzerland<br />

† Georgia Institute of Technology, School of Interactive Computing, USA<br />

• Two MAVs, each equipped with a monocular camera and an IMU,<br />

form a variable baseline stereo rig<br />

• Extrinsic calibration estimation with absolute scale baseline by<br />

incorporating IMU information from both MAVs<br />

• Vision / IMU data fusion with EKF framework – only dependent on<br />

IMU and relative vision measurements


Session WeAT7 Continental Parlor 7 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Robot Walking<br />

Chair Luis Sentis, The Univ. of Texas at Austin<br />

Co-Chair Shinya Aoi, Kyoto Univ.<br />

08:00–08:15 WeAT7.1<br />

Self-stabilization Principle of Mechanical Energy<br />

Inherent in Passive Compass Gait<br />

Fumihiko Asano<br />

School of Information Science,<br />

Japan Advanced Institute of Science and Technology, Japan<br />

• The self-stabilization principle<br />

underlying passive compass gait is<br />

investigated from the mechanical<br />

energy point of view.<br />

• It is numerically shown that a passive<br />

compass gait is monotonically<br />

stabilized not by means of the state<br />

variables but by means of the total<br />

mechanical energy.<br />

• The linearized mechanical energy<br />

(LME) is defined and the mechanism<br />

of monotonic convergence is<br />

mathematically investigated.<br />

08:30–08:45 WeAT7.3<br />

A Walking Stability Controller with Disturbance<br />

Rejection Based on CMP Criterion and Ground<br />

Reaction Force Feedback<br />

Richard Beranek, Henry Fung and Mojtaba Ahmadi<br />

Department of Mechanical and Aerospace Engineering, Carleton University,<br />

Canada<br />

• Robustness to external disturbances in<br />

bipedal control remains limited<br />

• Centroidal Moment Pivot (CMP) with<br />

ground reaction force feedback is used to<br />

modify the reference COG trajectory to<br />

increase stability<br />

• Simulation results show the<br />

controller is more robust to a sagittal<br />

disturbance than a ZMP-based controller<br />

With ground reaction force<br />

feedback, CMP can be used to<br />

compensate for unbalanced<br />

moments acting on the system<br />

08:15–08:30 WeAT7.2<br />

Model-Based Velocity Control<br />

for Limit Cycle Walking<br />

Tuomas Haarnoja<br />

Smart Machines, VTT Technical Research Centre of Finland, Finland<br />

José-Luis Peralta-Cabezas and Aarne Halme<br />

Department of Automation and Systems Technology, Aalto University, Finland<br />

• Limit Cycle Walking (LCW) is an energy efficient but restrictive method<br />

of locomotion for legged robots<br />

• The paper proposes several ideas for controlling the velocity of a LCW<br />

robot<br />

• The methods are tested in simulations and the results are compared in<br />

the sense of their energy efficiency and achievable velocity range<br />

08:45–08:50 WeAT7.4<br />

Perturbation Theory to Plan Dynamic Locomotion<br />

in Very Rough Terrain<br />

Luis Sentis and Benito Fernandez<br />

The University of Texas at Austin, USA<br />

• Motivation<br />

• Planning<br />

• Prediction<br />

• Results<br />




08:50–08:55 WeAT7.5<br />

Generation of adaptive splitbelt treadmill<br />

walking by a biped robot using nonlinear<br />

oscillators with phase resetting<br />

Shinya Aoi 1 , Soichiro Fujiki 1 , Tsuyoshi Yamashita 1<br />

Takehisa Kohda 1 , Kei Senda 1 , and Kazuo Tsuchiya 2<br />

1 Kyoto University, Japan, 2 Doshisha University, Japan<br />

• We designed a biped robot, whose<br />

joints are controlled by nonlinear<br />

oscillators with phase resetting<br />

• Stable walking on splitbelt treadmill<br />

was generated despite various<br />

speed conditions of belts<br />

• Speed discrepancy caused shift of<br />

phase differences between leg<br />

movements and modulation of duty<br />

factors<br />

• This adaptability was due to<br />

modulation of locomotion rhythm and<br />

phase by phase resetting<br />

Biped robot walking on<br />

splitbelt treadmill<br />

09:00–09:15 WeAT7.7<br />

Multi-objective Parameter CPG Optimization for<br />

Gait Generation of a Quadruped Robot<br />

Considering Behavioral Diversity<br />

Miguel Oliveira, Cristina P. Santos,<br />

Vítor Matos, and Manuel Ferreira<br />

Industrial Electronics Department, Minho University , Portugal<br />

Lino Costa<br />

Production Systems Department, Minho University, Portugal<br />

• Multi-objective optimization system<br />

that combines bio-inspired Central<br />

Pattern Generators (CPGs) and a<br />

multi-objective evolutionary<br />

algorithm (NSGA-II)<br />

• CPGs are modeled as autonomous<br />

differential equations that generate the<br />

necessary limb movement<br />

• Four conflicting objectives are<br />

considered: vibration, velocity, wide<br />

stability margin and behavioral<br />

diversity<br />

08:55–09:00 WeAT7.6<br />

Experimental verification of hysteresis in gait<br />

transition of a quadruped robot driven by<br />

nonlinear oscillators with phase resetting<br />

Shinya Aoi 1 , Soichiro Fujiki 1 , Daiki Katayama 1 , Tsuyoshi<br />

Yamashita 1 , Takehisa Kohda 1 , Kei Senda 1 , Kazuo Tsuchiya 2<br />

1 Kyoto University, Japan, 2 Doshisha University, Japan<br />

• We designed a quadruped robot,<br />

whose legs are controlled by<br />

nonlinear oscillators with phase<br />

resetting<br />

• Walk and trot patterns were generated<br />

through interactions among robot<br />

dynamics, oscillator dynamics, and<br />

environment<br />

• Walk-trot transition is induced by<br />

changing the walking speed<br />

• Hysteresis appears in the gait<br />

transition<br />


Quadruped robot and<br />

hysteresis in walk-trot transition<br />

09:15–09:30 WeAT7.8<br />

A sparse model predictive control formulation<br />

for walking motion generation<br />

Dimitar Dimitrov and Alexander Sherikov<br />

Örebro University, Sweden<br />

Pierre-Brice Wieber<br />

INRIA-Grenoble, France<br />

• Walking motion generation with variable<br />

sampling time and CoM height<br />

• MPC formulation that does not depend<br />

on forming an objective offline<br />

• Faster QP solution


Session WeAT8 Continental Parlor 8 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Networked Robots<br />

Chair C. S. George Lee, Purdue Univ.<br />

Co-Chair Ron Lumia, Univ. of New Mexico<br />

08:00–08:15 WeAT8.1<br />

Mobility and Routing Joint Design for Lifetime<br />

Maximization in Mobile Sensor Networks<br />

Shengwei Yu and C. S. George Lee<br />

School of Electrical and Computer Engineering<br />

Purdue University, USA<br />

• Simultaneously design mobility and<br />

routing strategies of robotic sensor nodes<br />

to improve the lifetime of a mobile sensor<br />

network.<br />

• The non-convex problem was solved<br />

distributedly by a series of convex<br />

approximations and a novel algorithm.<br />

• Computer simulations demonstrated quick<br />

convergence to the solution, and the<br />

network lifetime was improved<br />

substantially.<br />

• The proposed joint design method has an<br />

edge over other methods in lifetime<br />

maximization of mobile sensor networks.<br />

08:30–08:45 WeAT8.3<br />

Optimal Maintenance Strategy<br />

in<br />

Fault-Tolerant Multi-Robot Systems<br />

Satoshi Hoshino and Hiroya Seki<br />

Chemical Resources Laboratory, Tokyo Institute of Technology, Japan<br />

Jun Ota<br />

Research into Artifacts, Center for Engineering (RACE), The University of<br />

Tokyo, Japan<br />

• Optimal maintenance strategy for multi-robot<br />

systems on the basis of reliability<br />

engineering is proposed<br />

• Robots are allowed to undergo preventive<br />

maintenance at optimal intervals<br />

• Fault tolerance of the system is ensured<br />

as the number of robots is increased<br />

• Highest performance of the system with<br />

many robots is successfully maintained<br />

Robot’s availability and failure<br />

rate curves based on shape<br />

parameters<br />

08:15–08:30 WeAT8.2<br />

Decentralized Multi-Vehicle Path Coordination<br />

under Communication Constraints<br />

Pramod Abichandani and Moshe Kam<br />

Electrical and Computer Engineering Department, Drexel University, USA<br />

Hande Benson<br />

Department of Decision Sciences, Drexel University, USA<br />

• Decentralized framework to generate<br />

time-optimal speed profiles for a group of<br />

path-constrained vehicle robots<br />

• Paths modeled as cubic splines<br />

• Problem modeled as a Receding Horizon<br />

Mixed Integer Nonlinear Programming<br />

problem (RH-MINLP)<br />

• Constraints on kinematics, dynamics,<br />

collision avoidance and communication<br />

connectivity<br />

• Results for up to ten (10) robots<br />

08:45–08:50 WeAT8.4<br />

Distributed Control of Multi–Robot Systems with<br />

Global Connectivity Maintenance<br />

Lorenzo Sabattini<br />

Department of Electronics, Computer Sciences and Systems<br />

University of Bologna, Italy<br />

Nikhil Chopra<br />

Department of Mechanical Engineering and Institute for Systems Research<br />

University of Maryland, College Park, MD, USA<br />

Cristian Secchi<br />

Department of Sciences and Methods of Engineering<br />

University of Modena and Reggio Emilia, Italy<br />

• Decentralized estimation of the algebraic<br />

connectivity<br />

• Connectivity maintenance is formally<br />

guaranteed<br />

• Formation control and rendezvous<br />

applications<br />




08:50–08:55 WeAT8.5<br />

RSSI-based Physical Layout Classification and<br />

Target Tethering in Mobile Ad-hoc Networks<br />

Prashant Reddy<br />

Machine Learning Department, Carnegie Mellon University, USA<br />

Manuela Veloso<br />

Computer Science Department, Carnegie Mellon University, USA<br />

• LANdroids are simple inexpensive robots<br />

equipped only with a Wi-Fi radio sensor.<br />

• Problem: Using LANdroids to maintain<br />

connectivity between a static Gateway<br />

and mobile Targets (a.k.a. Target<br />

Tethering).<br />

• Challenge: The Wi-Fi RSSI signal is a<br />

very poor indicator of physical distance;<br />

how can we better use the RSSI signal?<br />

• Approach: Use SVM-based classification<br />

of RSSI signals amongst multiple<br />

LANdroids to categorize their physical<br />

layout into known Cluster Geometries.<br />

Then, use Q-learning to learn tethering<br />

policies based on those Geometries.<br />
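The tethering policies in the last bullet come from tabular Q-learning. A generic one-step Q-learning backup looks like the sketch below; the states and actions are hypothetical stand-ins for the paper's cluster geometries and tethering moves:

```python
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """One tabular Q-learning backup:
    Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))."""
    best_next = max(Q[next_state].values()) if Q[next_state] else 0.0
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])
    return Q[state][action]

# hypothetical states (cluster geometries) and actions (tethering moves)
Q = {"linear": {"hold": 0.0, "move": 0.0},
     "clustered": {"hold": 1.0, "move": 0.0}}
q = q_update(Q, "linear", "move", reward=0.5, next_state="clustered")
```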

09:00–09:15 WeAT8.7<br />

Leader-Follower Formation Control of<br />

Nonholonomic Robots with Fuzzy Logic Based<br />

Approach for Obstacle Avoidance<br />

Jawhar Ghommam 1, Hasan Mehrjerdi 2, and Maarouf Saad 2<br />

1 Research Unit on Mechatronics and Autonomous Systems, ENIS, Sfax, Tunisia<br />

2 École de technologie supérieure, 1100 rue Notre-Dame Ouest, Montreal, Quebec<br />

• A combination of the virtual vehicle and<br />

trajectory tracking approach is used.<br />

• A virtual vehicle is steered in such a way that it stabilizes to a shifted reference position/heading defined by the leader<br />

• The velocity of the virtual vehicle is then used to design a control law for the follower that is independent of measurements of the leader's velocity.<br />

• Ensuring the safety of robots while moving<br />

in a dynamic environment.<br />

• Obstacle avoidance scheme is introduced<br />

using fuzzy logic.<br />

Leader follower based virtual<br />

vehicle approach<br />
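A minimal sketch of the virtual-vehicle idea, under unicycle kinematics with simple proportional steering (the paper's actual control law and fuzzy obstacle avoidance are not reproduced; gains are hypothetical): the virtual vehicle is driven toward a reference point shifted behind the leader.

```python
import math

def shifted_reference(leader_xy, leader_heading, offset=1.0):
    """Reference point a fixed distance behind the leader, along its heading."""
    x, y = leader_xy
    return (x - offset * math.cos(leader_heading),
            y - offset * math.sin(leader_heading))

def step_virtual_vehicle(pose, ref, k_v=1.0, k_w=2.0, dt=0.05):
    """One unicycle step steering toward the reference (proportional control)."""
    x, y, th = pose
    dx, dy = ref[0] - x, ref[1] - y
    err = math.atan2(dy, dx) - th
    err = math.atan2(math.sin(err), math.cos(err))  # wrap heading error to (-pi, pi]
    v = k_v * math.hypot(dx, dy)
    return (x + v * math.cos(th) * dt,
            y + v * math.sin(th) * dt,
            th + k_w * err * dt)

ref = shifted_reference((5.0, 0.0), 0.0)  # leader at (5, 0) heading east -> ref at (4, 0)
pose = (0.0, 0.0, 0.0)
for _ in range(200):
    pose = step_virtual_vehicle(pose, ref)
final_dist = math.hypot(ref[0] - pose[0], ref[1] - pose[1])
```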

08:55–09:00 WeAT8.6<br />

Heterogeneous Sensor Network<br />

for Prioritized Sensing<br />

R. Andres Cortez and John Wood<br />

Department of Mechanical Engineering, University of New Mexico, USA<br />

Rafael Fierro<br />

Department of Electrical & Computer Eng., University of New Mexico, USA<br />

• Heterogeneous sensor network:<br />

• sensing agents<br />

• communication relays<br />

• Derive communication constraints within<br />

the network that guarantee network<br />

connectivity<br />

• Develop an algorithm that allows for<br />

adding communication links to the minimal<br />

spanning tree of the heterogeneous<br />

proximity graph<br />

• Combine a prioritized search algorithm<br />

and the communication constraints to<br />

provide a decentralized prioritized sensing<br />

control algorithm<br />


Motion constraint set for a relay<br />

agent w.r.t. the network<br />

09:15–09:30 WeAT8.8<br />

On the Convergence of Braitenberg vehicle 3a<br />

Immersed in Parabolic Stimuli<br />

Iñaki Rañó<br />

Institut für Neuroinformatik, RUB, Germany<br />

• Braitenberg vehicle 3a models target<br />

seeking in animals.<br />

• Two sensors are directly connected to the<br />

wheels in a decreasing (-) way.<br />

• We perform the first theoretical analysis of<br />

the behavior.<br />

• The trajectory is described by non-linear<br />

differential equations.<br />

• For circular symmetry we identify the conditions under which the trajectory moves away from, approaches, or circles around the target.<br />

• For parabolic situations the vehicle<br />

approaches the target along a principal<br />

direction.<br />

Braitenberg vehicle 3a


Session WeAT9 Continental Parlor 9 Wednesday, September 28, <strong>2011</strong>, 08:00–09:30<br />

Vision: From Features to Applications<br />

Chair Jana Kosecka, George Mason Univ.<br />

Co-Chair Mohammad Mahoor, Univ. of Denver<br />

08:00–08:15 WeAT9.1<br />

Yield Estimation in Vineyards<br />

by Visual Grape Detection<br />

Stephen Nuske, Supreeth Achar,<br />

Srinivas Narasimhan, Sanjiv Singh<br />

Robotics Institute, Carnegie Mellon University, USA<br />

Terry Bates<br />

Lake Erie Research and Extension Laboratory, Cornell University, USA<br />

• Fine-grained knowledge of yields can allow better management of vineyards<br />

• Current industry practice for<br />

yield estimation is destructive<br />

and spatially sparse<br />

• We use computer vision to<br />

count grapes and predict yield<br />

• Our method can be scaled to provide non-destructive, high-resolution yield estimates across large vineyards<br />

• We can predict yield of<br />

individual vineyard rows to<br />

within 9.8% of actual harvest<br />

08:30–08:45 WeAT9.3<br />

A Rotation Invariant Feature Descriptor O-DAISY<br />

and its FPGA Implementation<br />

Jan Fischer, Alexander Ruppel, Florian Weißhardt<br />

and Alexander Verl<br />

Robot Systems Department, Fraunhofer Institute for<br />

Manufacturing Engineering and Automation (IPA), Germany<br />

• O-DAISY, a rotation-invariant extension of the DAISY feature descriptor<br />

• Outperforms other fast-to-compute<br />

descriptors like SURF and BRIEF<br />

• Description of an FPGA implementation to achieve real-time performance of the descriptor calculation<br />

• Pipelined FPGA architecture<br />

enabling the processing of one<br />

descriptor per clock at 125 MHz<br />

Matching accuracy as a function of the<br />

rotation angle for an artificially rotated<br />

wall image using the rotation variant<br />

original DAISY (black), and O-DAISY<br />

variants (red and magenta)<br />

08:15–08:30 WeAT9.2<br />

A Novel Learning-based Approach for Local<br />

Patch Recognition<br />

Ce Gao, Yixu Song, Peifa Jia<br />

Department of Computer Science & Technology<br />

Tsinghua University, China<br />

• We propose a novel learning-based feature matching approach for Robot Visual Learning.<br />

• An improved version of FAST-9<br />

is adopted to extract keypoints<br />

quickly.<br />

• It identifies keypoints belonging to different objects or background by color and texture representation.<br />

• A two-stage multilayer ferns classifier is trained to recognize the local patches.<br />


Matching result for BSD and PASCAL. The color<br />

points represent correct matching, and different<br />

colors represent different objects.<br />

08:45–08:50 WeAT9.4<br />

Adaptive Multi-Affine (AMA) Feature-Matching Algorithm<br />

and its Application to Minimally-Invasive Surgery Images<br />

G. Puerto 1, M. Adibi 2, J.A. Cadeddu 2, G.L. Mariottini 1<br />

1 Dept. of Comp. Science and Eng., Univ. of Texas at Arlington, USA<br />

2 Dept. of Urology, Univ. of Texas Southwestern Med. Center, USA<br />

• Existing methods match only a few points between images of a non-planar object<br />

• Our algorithm (AMA) matches a larger number of image features on the entire 3-D object surface<br />

• Multi-Affine transformations used<br />

to recover from any feature-tracking<br />

loss (e.g., occlusion)<br />

• Affine transformations are found<br />

adaptively to maximize no. of<br />

matched features.<br />

• Extensive experimental validation<br />

with Minimally-Invasive Surgery<br />

videos.<br />

Above, a state-of-the-art method matches few features (dots) and one affine transformation (indicated with an arrow). Below, our method matches more features distributed over the entire object surface.



08:50–08:55 WeAT9.5<br />

Video Stabilization Using SIFT-ME Features and<br />

Fuzzy Clustering<br />

Kevin L. Veon, Mohammad H. Mahoor, and Richard M. Voyles<br />

Department of Electrical and Computer Engineering<br />

University of Denver, Denver, CO, USA<br />

• The orientation of SIFT features is used to<br />

estimate rotation.<br />

• Fuzzy set theory is used to separate local<br />

motion and global motion.<br />

• Kalman filtering is used to estimate<br />

desired motion.<br />

• Scenarios with combinations of<br />

translation, rotation, local motion, and<br />

desired motion are presented.<br />

• Qualitative and quantitative measures are<br />

used to evaluate stabilization<br />

performance.<br />

Example Unstable (top) and<br />

stabilized (bottom) video frames.<br />
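One way to see how feature orientations can yield a frame-to-frame rotation estimate, as in the first bullet: take the change in orientation of each matched feature and use a robust statistic so that features on independently moving objects (local motion) do not corrupt the global estimate. The sketch below uses a median and invented values; it is an illustration, not the paper's SIFT-ME/fuzzy-clustering method.

```python
import math

def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def estimate_rotation(orient_prev, orient_curr):
    """Global rotation between frames as the median per-feature orientation change;
    the median rejects features carried by independent local motion."""
    deltas = sorted(wrap_angle(c - p) for p, c in zip(orient_prev, orient_curr))
    n = len(deltas)
    mid = n // 2
    return deltas[mid] if n % 2 else 0.5 * (deltas[mid - 1] + deltas[mid])

prev = [0.1, 0.5, 1.0, 2.0, -1.2]
curr = [p + 0.2 for p in prev]  # camera rotated by 0.2 rad between frames
curr[2] += 1.5                  # one feature sits on an independently moving object
rot = estimate_rotation(prev, curr)
```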

09:00–09:15 WeAT9.7<br />

A Learning Algorithm for Visual Pose Estimation<br />

of Continuum Robots<br />

Austin Reiter, Peter K. Allen, and Konstantinos Iliopoulos<br />

Dept. of Computer Science, Columbia University, USA<br />

Roger E. Goldman<br />

Dept. of Biomedical Engineering, Columbia University, USA<br />

Andrea Bajo and Nabil Simaan<br />

Dept. of Mechanical Engineering, Vanderbilt University, USA<br />

• Estimate the configuration of a continuum<br />

robot using vision<br />

• Learn a mapping of visual feature<br />

descriptors to ground truth configuration<br />

space variables<br />

• Interpolate a parametric manifold and<br />

estimate unknown poses using only<br />

feature descriptors<br />

• Rotational accuracy in the range of 1<br />

degree<br />

• Robust to occlusions and small training<br />

set size<br />

• Applicable to closed-loop control as<br />

accurate feedback sensory information<br />

[Top] Sample images of a single<br />

segment of a continuum robot<br />

with an occluder. [Bottom]<br />

Algorithm flow<br />

08:55–09:00 WeAT9.6<br />

Label propagation in videos indoors with an<br />

incremental non-parametric model update<br />

Jorge Rituerto, Ana Cristina Murillo<br />

DIIS – i3A, Universidad de Zaragoza, Spain<br />

Jana Košecká<br />

Computer Science Department, George Mason University, USA<br />

• Towards semantic interpretation of the robot environment<br />

• Goal: label background regions and separate them from the “foreground” along robot-acquired sequences.<br />

• Learning simple models from interest regions in the first frame, then updating and propagating them along the sequence.<br />

• Ingredients: superpixel segmentation, description, and EMD-based correspondences.<br />

Summary of presented propagation process<br />

09:15–09:30 WeAT9.8<br />

Multilayer real-time video image stabilization<br />

• Real-time video image<br />

stabilization system (VISS)<br />

primarily developed for aerial<br />

robots<br />

• Four independent stabilization<br />

layers<br />

• Low-cost and robust<br />

• VISS significantly improves the<br />

stability of shaky video images<br />

Jens Windau, Laurent Itti<br />

University of Southern California, USA<br />



Session WeBT1 Continental Parlor 1 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Socially Assistive Robots<br />

Chair Urbano Nunes, Univ. de Coimbra<br />

Co-Chair Eric Wade, Univ. of Southern California<br />

10:00–10:15 WeBT1.1<br />

Using Socially Assistive Robotics<br />

to Augment Motor Task Performance<br />

in Individuals Post-Stroke<br />

Eric Wade 1, Avinash Parnandi 2, and Maja J. Matarić 1<br />

1 University of Southern California, Los Angeles, CA, USA<br />

2 Texas A&M University, College Station, TX, USA<br />

• How can we influence performance measures using socially assistive<br />

robotics?<br />

10:30–10:45 WeBT1.3<br />

An Experience-Driven Robotic Assistant<br />

Acquiring Human Knowledge<br />

to Improve Haptic Cooperation<br />

Jose Ramon Medina, Martin Lawitzky, Alexander Mörtl,<br />

Dongheui Lee and Sandra Hirche<br />

Institute of Automatic Control Engineering,<br />

Technische Universität München, Germany<br />

Goal:<br />

Incremental learning of haptic<br />

primitives in complex scenarios<br />

during execution<br />

Proposed approach:<br />

• Combine incremental learning<br />

techniques with feedback<br />

control<br />

• Enhancement of autonomous<br />

segmentation, clustering,<br />

learning and execution<br />

performance by acquisition of<br />

meaningful semantic labels<br />

10:15–10:30 WeBT1.2<br />

The Impact of Different Competence Levels of<br />

Care-Receiving Robot on Children<br />

Madhumita Ghosh and Fumihide Tanaka<br />

Department of Intelligent Interaction Technologies,<br />

University of Tsukuba, Japan<br />

• New concept of robot use for education<br />

• Not like a teacher robot<br />

• But a “Care-Receiving Robot (CRR)”, which is taken care of (taught) by children<br />

• The goal is to achieve children’s<br />

spontaneous learning by teaching<br />

• Field trials at an English learning school<br />

for children were conducted<br />

• Investigation of two different types of CRR, with numerical data presented<br />

• The CRR induced behaviors in children that would reinforce their learning<br />

10:45–10:50 WeBT1.4<br />

Study on a Practical Robotic Follower<br />

to Support Home Oxygen Therapy Patients<br />

-Development and Control of a Mobile Platform-<br />

Atsushi Tani, Gen Endo, Edwardo F. Fukushima and Shigeo Hirose<br />

Dept. of Mechanical and Aerospace Engineering, Tokyo Institute of Technology, Japan<br />

Masatsugu Iribe<br />

The Faculty of Engineering, Osaka Electro-Communication University, Japan<br />

Toshio Takubo<br />

The First Department of Medicine, Tokyo Women’s Medical University, Japan<br />

• The patients suffering from severe lung<br />

diseases have to carry an oxygen tank<br />

when they go out.<br />

• We developed a low-cost mobile robot to<br />

carry the oxygen tank in an urban<br />

environment.<br />

• A tether interface is utilized to detect the relative position of the patient walking ahead.<br />

• The robot can negotiate a curb separating a<br />

sidewalk from a street.<br />

• The following algorithm is also improved to<br />

track the patient’s trajectory.<br />

• We successfully demonstrated the person-following experiment in an outdoor environment.<br />


Developed mobile robot to carry<br />

an oxygen tank



10:50–10:55 WeBT1.5<br />

Progress in Developing a Socially Assistive<br />

Mobile Home Robot Companion<br />

for Elderly with Mild Cognitive Impairment<br />

H.-M. Gross, C. Schroeter, S. Mueller, M. Volkhardt, E. Einhorn<br />

Neuroinformatics and Cognitive Robotics<br />

Ilmenau University of Technology, Germany<br />

A. Bley, Ch. Martin, T. Langner, M. Merten<br />

MetraLabs GmbH Ilmenau, Germany<br />

The paper presents<br />

• main system requirements derived from studies with the end-user target groups (elderly, relatives, caregivers)<br />

• consequences for the hardware design and functionality of the robot companion and its system architecture<br />

• a key technology for HRI in home environments – autonomous user tracking and searching<br />

• results of already conducted and ongoing functionality tests, with a particular focus on experimental results in autonomous user search<br />

An 89-year-old lady interacting with the mobile home robot companion developed in the European FP7 project CompanionAble<br />

11:00–11:15 WeBT1.7<br />

Whole-Body Contact Manipulation Using Tactile<br />

Information for the Nursing-Care Assistant Robot RIBA<br />

Toshiharu Mukai, Shinya Hirano, Morio Yoshida, and<br />

Hiromichi Nakashima<br />

RTC, RIKEN, Japan<br />

Shijie Guo<br />

SR Laboratory, Tokai Rubber Industries, Japan<br />

Yoshikazu Hayakawa<br />

Dept. of Mechanical Science and Engineering, Nagoya Univ., Japan<br />

• Whole-body contact manipulation<br />

method for nursing-care assistant robots<br />

in direct contact with patients is<br />

proposed.<br />

• This method preserves the conditions of<br />

surface contact, while manipulating the<br />

object’s position and orientation.<br />

• Information on the surface contact is<br />

obtained from tactile sensors mounted<br />

on the robot arms.<br />

• Results of basic experiments are also<br />

provided.<br />

10:55–11:00 WeBT1.6<br />

Wheelchair Navigation Assisted by Human-Machine<br />

Shared-control and P300-based Brain<br />

Computer Interface<br />

Ana C. Lopes, Gabriel Pires, Luís Vaz, and Urbano Nunes<br />

Institute for Systems and Robotics, University of Coimbra, Portugal<br />

• A Two-layer shared-control approach for<br />

assistive mobile robots, using a brain<br />

computer interface (BCI) is presented;<br />

• A P300-based paradigm that allows the<br />

selection of brain-actuated commands to<br />

steer the robotic wheelchair is proposed;<br />

• A virtual-constraint layer is responsible for<br />

validating user commands, based on<br />

context aware restrictions;<br />

• An intent-matching layer is responsible for<br />

determining the suitable steering<br />

command.<br />


Overview of Robchair: ISR-UC<br />

robotic wheelchair platform<br />

11:15–11:30 WeBT1.8<br />

Should Robots or People Do These Jobs?<br />

A Survey of Robotics Experts and Non-Experts<br />

About Which Jobs Robots Should Do<br />

Wendy Ju and Leila Takayama<br />

Willow Garage, USA<br />

• This study identifies occupational predictors for which jobs robots could<br />

or should do.<br />

• Surveyed robot experts and non-experts (N=392: 134 robotics experts and 258 non-experts)<br />

• Experts have a more nuanced and guardedly optimistic view of robot capabilities.<br />

• Experts are more likely to focus on an occupation’s perceptual<br />

requirements in determining robot capability.<br />

• Non-experts are more skeptical about the robot’s ability to memorize or<br />

interact sociably.


Session WeBT2 Continental Parlor 2 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Estimation & Sensor Fusion<br />

Chair Thierry Peynot, The Univ. of Sydney<br />

Co-Chair Agostino Martinelli, INRIA Grenoble-Rhone-Alpes<br />

10:00–10:15 WeBT2.1<br />

Vision-Aided Inertial Navigation: Closed-Form<br />

Determination of Scale, Speed and Attitude<br />

Agostino Martinelli, Chiara Troiani and Alessandro Renzaglia<br />

INRIA, Rhone Alpes, France<br />

• Algorithm based on a<br />

closed-form solution able to<br />

determine speed, attitude<br />

and absolute scale in a<br />

short time interval.<br />

• No prior knowledge or<br />

initialization required.<br />

• Only 4 camera poses and a single feature are necessary for this determination (or 2 features and 3 poses)<br />

• Implementation on a real<br />

quadrotor<br />

10:30–10:45 WeBT2.3<br />

Time-Varying Complementary Filtering for<br />

Attitude Estimation<br />

Evan Chang-Siu , Masayoshi Tomizuka<br />

Dept. of Mechanical Engineering, University of California Berkeley, USA<br />

Kyoungchul Kong<br />

Dept. of Mechanical Engineering, Sogang University, Korea<br />

• Robust method to estimate the posture angle of a single-degree-of-freedom rigid body.<br />

• Applies a complementary filter to a<br />

gyroscope and accelerometer for attitude<br />

estimation.<br />

• Adapts the cutoff frequency of the complementary filter to achieve improved performance over a standard complementary filter.<br />

• Discusses stability and provides<br />

experimental results.<br />

Diagram of TVCF method<br />
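A first-order complementary filter blends a high-passed gyro integral with a low-passed accelerometer angle; the "time-varying" idea is to move the cutoff depending on how much the accelerometer can be trusted. A minimal fixed-cutoff sketch (constants hypothetical, not the paper's adaptation law):

```python
import math

def complementary_alpha(cutoff_hz, dt):
    """Blend factor of a first-order complementary filter for a given cutoff frequency."""
    tau = 1.0 / (2.0 * math.pi * cutoff_hz)
    return tau / (tau + dt)

def cf_step(angle, gyro_rate, accel_angle, dt, cutoff_hz=1.0):
    """One step: trust the integrated gyro at high frequency, the accelerometer at low.
    A time-varying filter would adjust cutoff_hz online, e.g. during high accelerations."""
    a = complementary_alpha(cutoff_hz, dt)
    return a * (angle + gyro_rate * dt) + (1.0 - a) * accel_angle

# stationary body at 0.3 rad observed through a biased gyro: the accelerometer
# term removes the drift the gyro bias would otherwise accumulate
angle, dt = 0.0, 0.01
for _ in range(2000):
    angle = cf_step(angle, gyro_rate=0.05, accel_angle=0.3, dt=dt)
```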

10:15–10:30 WeBT2.2<br />

Attitude Determination Framework<br />

by Globally and Asymptotically Stable Bias Error Estimation<br />

with Disturbance Attenuation and Rejection<br />

Hideaki YAMATO and Takayuki FURUTA<br />

Future Robotics Technology Center, Chiba Institute of Technology, Japan<br />

Ken TOMIYAMA<br />

The Department of Advanced Robotics, Chiba Institute of Technology, Japan<br />

An attitude determination framework for cost-effective, small-size attitude sensor units is proposed, featuring<br />

• globally and asymptotically stable<br />

estimation of constant bias error of rate<br />

sensors by a quaternion feedback<br />

• disturbance attenuation as well as<br />

rejection by a disturbance evaluation<br />

scheme (existence and strength) and the<br />

conventional vector matching method<br />

• low computational effort suitable for cost-effective motion sensor units<br />

• fundamental improvement of attitude<br />

error property as shown in the figure.<br />


Comparison of error properties: (a) by gyroscope sensors, (b) by accelerometers & magnetometers, and (c) by the proposed estimation algorithm<br />

10:45–10:50 WeBT2.4<br />

A Sensor Fusion Approach to Angle and Angular<br />

Rate Estimation<br />

Daniel Kubus and Friedrich M. Wahl<br />

Institut fuer Robotik und Prozessinformatik, TU Braunschweig, Germany<br />

• Fusion of low-cost encoder and MEMS<br />

angular rate sensor measurements<br />

• Virtual increase of encoder resolution and<br />

elimination of rate sensor drift<br />

• Resolution gain of up to factor 10;<br />

outperforms standard Kalman filter<br />

• Easy to use, low computational<br />

requirements<br />

Sinusoidal trajectory: true angle, encoder output (dotted), and filter output (dashed).
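A minimal sensor-fusion sketch in the same spirit: predict the angle by integrating the rate sensor, then correct with the coarsely quantized encoder reading via a scalar Kalman update. The paper's filter differs in detail; all noise parameters below are hypothetical.

```python
def kalman_angle_step(x, P, gyro_rate, encoder_angle, dt,
                      q_rate=1e-4, r_enc=1e-2):
    """Predict the angle by integrating the rate sensor, then correct it with the
    coarse encoder reading using a scalar Kalman gain."""
    x_pred = x + gyro_rate * dt      # process model: angle integrates the rate
    P_pred = P + q_rate * dt         # prediction uncertainty grows
    K = P_pred / (P_pred + r_enc)    # scalar Kalman gain
    x_new = x_pred + K * (encoder_angle - x_pred)
    return x_new, (1.0 - K) * P_pred

# true angle ramps at 1 rad/s; the encoder quantizes to 0.1 rad steps
x, P, dt = 0.0, 1.0, 0.01
for i in range(1, 501):
    encoder = round(i * dt / 0.1) * 0.1  # coarse encoder measurement
    x, P = kalman_angle_step(x, P, gyro_rate=1.0, encoder_angle=encoder, dt=dt)
```

The effective resolution gain comes from the gyro carrying the estimate smoothly between encoder steps while the encoder pins down the drift.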



10:50–10:55 WeBT2.5<br />

Combining Multiple Sensor Modalities for a<br />

Localisation Robust to Smoke<br />

Christopher Brunner, Thierry Peynot and Teresa Vidal-Calleja<br />

ACFR, The University of Sydney, Australia<br />

• Camera localisation robust to smoke/fog<br />

• Visible and IR cameras used within a<br />

SLAM framework<br />

• Robust localisation in smoke is shown:<br />

• using IR camera only<br />

• with Visual and IR cameras<br />

• with Visual and IR cameras plus<br />

visual data quality evaluation (visual<br />

images are discarded when most<br />

affected by smoke, mitigating errors)<br />

• Experimental validation with outdoor<br />

ground robot<br />

• Results show accuracy of proposed<br />

methods by comparing multiple<br />

trajectories with dGPS/INS ground truth<br />

11:00–11:15 WeBT2.7<br />

An Improved Pedestrian Inertial Navigation<br />

System for Indoor Environments<br />

Sylvie Lamy-Perbal, Mehdi Boukallel and Nadir Castañeda<br />

CEA List, Sensory and Ambient Interfaces Laboratory, France<br />

• An improved shoe-mounted inertial<br />

navigation system with submeter accuracy<br />

• Zero Velocity Update (ZUPT) and Angular<br />

update algorithms (AUPT) for drift<br />

compensation<br />

• Efficient foot stance phase detection<br />

system without additional sensors<br />

• In situ experimentation and validation<br />
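The ZUPT idea can be sketched in a few lines: detect the stance phase from the inertial signals themselves (specific force near gravity, little rotation), and reset the velocity estimate to zero whenever the foot is judged stationary, which bounds the otherwise unbounded drift from integrated accelerometer bias. The thresholds and the toy gait below are hypothetical, not the paper's detector.

```python
def detect_stance(accel_mag, gyro_mag, g=9.81, accel_tol=0.4, gyro_tol=0.2):
    """Stance heuristic: specific force near gravity and almost no rotation."""
    return abs(accel_mag - g) < accel_tol and gyro_mag < gyro_tol

def zupt_step(velocity, accel, dt, stance):
    """Integrate acceleration; reset the velocity estimate during stance (ZUPT)."""
    v = velocity + accel * dt
    return 0.0 if stance else v

# a small accelerometer bias would drift the velocity without bound;
# each (hypothetical) 30%-stance gait cycle resets the accumulated error
v, bias, dt = 0.0, 0.05, 0.01
for step in range(300):
    v = zupt_step(v, accel=bias, dt=dt, stance=(step % 100) < 30)
```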

10:55–11:00 WeBT2.6<br />

Multisensor Data Fusion for Robust Pose<br />

Estimation of a Six-Legged Walking Robot<br />

Annett Chilian, Heiko Hirschmüller, Martin Görner<br />

DLR German Aerospace Center, Germany<br />

• Fusion of IMU data, 3D visual odometry<br />

and 3D leg odometry using an indirect<br />

feedback information filter<br />

• Error estimate of visual odometry<br />

measurements is computed and<br />

considered in the filtering process<br />

• State cloning is applied to correctly fuse<br />

relative odometry measurements<br />

• Fusion result is robust against failures of<br />

the visual odometry caused by poor<br />

visual conditions<br />

• Experimental results prove the<br />

robustness and accuracy of the data<br />

fusion process<br />


Fusion result compared to ground<br />

truth, visual and leg odometry under<br />

poor visual conditions<br />

11:15–11:30 WeBT2.8<br />

Learning the Delaunay Triangulation of<br />

Landmarks From a Distance Ordering Sensor<br />

Max Katsev and Steven M. LaValle<br />

Department of Computer Science,<br />

University of Illinois at Urbana-Champaign, USA<br />

• Limited sensing: for any two landmarks,<br />

the robot can detect which one is closer<br />

• No other information: no exact distance,<br />

no landmark locations, no odometry<br />

• Algorithms for the robot to navigate to<br />

certain points and to learn global<br />

topological information about landmarks<br />

• Can be used to solve simple tasks, e.g.,<br />

convex hull computation<br />

The environment is split into<br />

regions uniquely identified by the<br />

distance order of landmarks
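The "regions uniquely identified by the distance order of landmarks" can be made concrete: the sensor's entire output at a location is the ordering of landmarks by distance, so a location's signature is a permutation, and the bisectors between landmark pairs partition the plane into cells whose points all share one signature. A small sketch with invented coordinates:

```python
def distance_order(point, landmarks):
    """Signature of a location: landmark indices sorted from nearest to farthest.
    All points inside one cell of the bisector arrangement share this permutation."""
    px, py = point
    d2 = [(lx - px) ** 2 + (ly - py) ** 2 for lx, ly in landmarks]
    return tuple(sorted(range(len(landmarks)), key=lambda i: d2[i]))

landmarks = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
sig_a = distance_order((1.0, 0.5), landmarks)  # near landmark 0
sig_b = distance_order((3.5, 0.5), landmarks)  # near landmark 1
```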


Session WeBT3 Continental Parlor 3 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Surgical Robotics<br />

Chair Wan Kyun Chung, POSTECH<br />

Co-Chair Sarthak Misra, Univ. of Twente<br />

10:00–10:15 WeBT3.1<br />

A Bimanual Teleoperated System for Endonasal<br />

Skull Base Surgery<br />

Jessica Burgner1 , Philip J. Swaney1 , D. Caleb Rucker1 ,<br />

Hunter B. Gilbert1 , Scott T. Nill1 , Paul T. Russell III2 ,<br />

Kyle D. Weaver 3, and Robert J. Webster III 1,2<br />

1 Department of Mechanical Engineering, Vanderbilt University, USA<br />

2 Department of Otolaryngology, Vanderbilt University Medical Center, USA<br />

3 Department of Neurosurgery, Vanderbilt University Medical Center, USA<br />

• Robotic surgery through a single<br />

nostril<br />

• “Tentacle-Like” concentric tube<br />

continuum robot manipulators<br />

• Presentation covers:<br />

- Surgical requirements<br />

- Workspace from patient data<br />

- Optimal end effector design<br />

- Actuation unit design<br />

- Jacobian-based teleoperation<br />

• Cadaver experiment results<br />

10:30–10:45 WeBT3.3<br />

Evaluation of Command Modes of an Assistance<br />

Robot for Middle Ear Surgery<br />

Guillaume Kazmitcheff1 , Mathieu Miroir1 , Yann Nguyen1 , Charlotte<br />

Célérier1 , Stéphane Mazalaigue3 , Evelyne Ferrary1,2 , Olivier<br />

Sterkers 1,2, and Alexis Bozorg-Grayeli 1,2<br />

1 UMR-S 867 / Inserm / University Paris 7, France<br />

2 AP-HP Hôpital Beaujon, France<br />

3 Collin SA, France<br />

• Teleoperated system: RobOtol<br />

• New prototype: improved controller system, mechanical enhancements, and HMI concept<br />

• Two command modes: Position-Velocity and Position-Position<br />

• Robot evaluation through tasks specific to middle ear surgery<br />

10:15–10:30 WeBT3.2<br />

Automated Surgical Planning and Evaluation<br />

Algorithm for Spinal Fusion Surgery with Three-<br />

Dimensional Pedicle Model<br />

Jongwon Lee1 , Sungmin Kim2 , Young Soo Kim3 ,<br />

Wan Kyun Chung 1, and Minjun Kim 1<br />

1 Department of Mechanical Engineering, POSTECH, KOREA<br />

2Department of Biomedical Engineering, Hanyang Univ., KOREA<br />

3Department of Neurosurgery, School of Medicine, Hanyang Univ., KOREA<br />

• Autonomous preoperative planning<br />

framework for lumbar spinal fusion was<br />

proposed<br />

• It adds a novel functionality that suggests optimal insertion trajectories of pedicle screws and their sizes (w.r.t. operational safety and screw-vertebra interface strength)<br />

• It is based on 3-D accurate spinal pedicle<br />

data segmented from patient’s CT images<br />

• It has been tested on 68 spinal pedicles of 8 patients and successfully applied, resulting in a final safety margin of 2.11±0.17 mm<br />


Insertion trajectories of the pedicle<br />

screws computed by the proposed<br />

surgical planner<br />

10:45–10:50 WeBT3.4<br />

Towards validation of robotic surgery training<br />

assessment across training platforms<br />

Yixin Gao, Mert Sedef, Amod Jog, Peter Peng,<br />

Michael Choti, Gregory Hager, Jeff Berkley, Rajesh Kumar<br />

Computer Science, Johns Hopkins University, USA<br />

MIMIC Technologies, Inc. USA<br />

• Automated skill assessment in robotic surgery is required for unsupervised feedback and simulated task performance<br />

• First cross-platform analysis of motion<br />

data to distinguish novice and<br />

proficient user performances<br />

• Experiments with da Vinci dry-lab and<br />

dV-Trainer simulation environment with<br />

users of varying skills are reported<br />

The prototype cross-platform<br />

architecture



10:50–10:55 WeBT3.5<br />


11:00–11:15 WeBT3.7<br />

Mechanics of Needle-Tissue Interaction<br />

Roy J. Roesthuis, Youri R.J. van Veen,<br />

Alex Jahya and Sarthak Misra<br />

University of Twente, The Netherlands<br />

• Understanding the mechanics of<br />

needle-tissue interaction is required<br />

for pre-operative path planning and<br />

needle steering.<br />

• Needle-tissue interaction forces are<br />

identified for bevel-tipped needles<br />

inserted into a gelatin phantom.<br />

• The interaction forces are used to<br />

predict needle deflection using a<br />

mechanics-based model.<br />

• The average error between model<br />

and experiments for 100 mm<br />

insertion is 0.179 mm.<br />

Interaction forces on beveltipped<br />

needle during insertion<br />

Experimental setup to measure<br />

needle-tissue friction (top view)<br />

10:55–11:00 WeBT3.6<br />

An Analytical Model for Deflection of Flexible<br />

Needles During Needle Insertion<br />

Ali Asadian 1,2 , Mehrdad R. Kermani 1,2 , and Rajni V. Patel 1,2,3<br />

1 Canadian Surgical Technologies & Advanced Robotics (CSTAR),<br />

2 Dept. of Elect. & Comp. Eng., 3 Dept. of Surgery,<br />

University of Western Ontario, London, Ontario, Canada<br />

• The use of flexible needles in<br />

percutaneous interventions<br />

necessitates modeling of the<br />

generated curved trajectories.<br />

• A feasible model is essential<br />

for path planning (needle<br />

guidance) and simulation<br />

(clinician training).<br />

• Euler-Bernoulli beam theory<br />

and Green’s functions are<br />

used to obtain a needle’s<br />

planar deflection.<br />

• The impact of tissue elasticity<br />

and needle-tissue friction is<br />

studied.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–118–<br />

Effective insertion forces acting on the<br />

curved axis and the 2D needle deflection<br />

with respect to its insertion depth<br />

11:15–11:30 WeBT3.8<br />

Feasibility Study of an Optically Actuated<br />

MR-compatible Active Needle<br />

Seok Chang Ryu 1 , Pierre Renaud 2 , Richard J. Black 3 ,<br />

Bruce L. Daniel 4 , and Mark R. Cutkosky 1<br />

1 Center for Design Research, Stanford University, USA<br />

2 LSIIT, Strasbourg University-CNRS-INSA, France<br />

3 Intelligent Fiber Optic System Corporation, USA<br />

4 Radiology, Stanford University, USA<br />

• We propose a novel MR-compatible,<br />

optically steerable active needle to<br />

improve needle maneuverability, which<br />

features an active flexure near tip for<br />

direct tip orientation control<br />

• The active flexure can be bent with an<br />

optically heated SMA wire by laser<br />

delivered through embedded optical<br />

fibers. It also includes optical curvature<br />

and temperature sensors<br />

Fig. Inner stylet of the proposed<br />

active biopsy needle<br />

• Mechanical behavior of the proposed needle was evaluated, and about<br />

10° of tip bending angle could be achieved<br />

• Optical heating requirements were investigated and side heating<br />

methods were proposed for better needle dynamics


Session WeBT4 Continental Ballroom 4 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Hardware and Software Design for Haptic Systems<br />

Chair Yasuyoshi Yokokohji, Kobe Univ.<br />

Co-Chair Katherine J. Kuchenbecker, Univ. of Pennsylvania<br />

10:00–10:15 WeBT4.1<br />

A Friction Differential and Cable<br />

Transmission Design for a 3-DOF Haptic<br />

Device with Spherical Kinematics<br />

Reuben Brewer 1 , Adam Leeper 1 , and J. Kenneth Salisbury 1,2,3<br />

Departments of Mech. Engineering 1 , Computer Science 2 , and Surgery 3<br />

Stanford University, USA<br />

• New design for a 3-DOF haptic<br />

device with spherical kinematics<br />

(pitch, yaw, and prismatic radial).<br />

• All motors grounded to reduce<br />

inertia and increase compactness<br />

near user’s hand.<br />

• Aluminum-aluminum friction<br />

differential actuates pitch and<br />

yaw robustly.<br />

• Cable transmission routes<br />

through the center of the<br />

differential to actuate the<br />

prismatic, radial DOF.<br />

10:30–10:45 WeBT4.3<br />

Coaxial Needle Insertion Assistant<br />

for Epidural Puncture<br />

Yoshihiko Koseki and Kiyoyuki Chinzei<br />

Advanced Industrial Science and Technology (AIST), Japan<br />

Danilo De Lorenzo<br />

Politecnico di Milano, Bioengineering Department, Italy<br />

Allison M. Okamura<br />

Department of Mechanical Engineering, Stanford University, USA<br />

• The coaxial needle separates cutting force<br />

at the needle tip from shear friction on the<br />

needle shaft<br />

• A robotic assistant presents the cutting force<br />

in order to enhance the physician’s perception<br />

during epidural puncture<br />

• The ratio of successful to unsuccessful<br />

puncture detection was higher with the<br />

assistant than without<br />

• Users were more confident that they could<br />

perceive the moment of puncture<br />

Operator<br />

Operator<br />

Force<br />

Inner Needle<br />

Outer Needle<br />

Friction Force<br />

Cutting Force<br />

Actuator<br />

Assistant<br />

Force<br />

Epidural Space<br />

Spinal Cord<br />

Lumbar Spine<br />

10:15–10:30 WeBT4.2<br />

Hi5: a versatile dual-wrist device to study<br />

human-human interaction and bimanual control<br />

Alejandro Melendez-Calderon, Lorenzo Bagutti, Brice Pedrono<br />

and Etienne Burdet<br />

Department of Bioengineering, Imperial College London, United Kingdom<br />

• Human-human and bimanual haptic<br />

interactions involve complex control<br />

requiring coordination of redundant<br />

muscle systems.<br />

• Wrist flexion/extension movements<br />

simplify the study of muscle control and<br />

redundancy.<br />

• Hi5 allows measurement of kinematics,<br />

dynamics, and muscular activity, enabling<br />

systematic analysis of redundant<br />

strategies.<br />

• The simple design enables quick and<br />

easy reconfiguration of the interface to<br />

investigate symmetric or asymmetric<br />

tasks in both human-human and<br />

bimanual interaction experiments.<br />


Removable carbon fiber tube<br />

Adjustable position<br />

Human-human Bimanual<br />

10:45–10:50 WeBT4.4<br />

Two-Hands Are Better Than One: Assisting<br />

Users with Multi-Robot Manipulation Tasks<br />

Bennie Lewis and Gita Sukthankar<br />

Department of EECS, University of Central Florida, USA<br />

• User interface for reducing operator<br />

workload while controlling a team of<br />

robots with manipulators.<br />

• Robots leverage information from<br />

commands that the user has executed in<br />

the past.<br />

• Two new modes which can be invoked by<br />

the user or autonomously by the interface.<br />

• Shows statistically-significant<br />

improvements in the time required to<br />

perform foraging scenarios


Session WeBT4 Continental Ballroom 4 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Hardware and Software Design for Haptic Systems<br />

Chair Yasuyoshi Yokokohji, Kobe Univ.<br />

Co-Chair Katherine J. Kuchenbecker, Univ. of Pennsylvania<br />

10:50–10:55 WeBT4.5<br />

3-DOF Haptic Feedback for Assisted<br />

Driving of an Omnidirectional Wheelchair<br />

Quinton M. Christensen and Stephen Mascaro<br />

Department of Mechanical Engineering, University of Utah, USA<br />

• The mobility limitations of<br />

conventional wheelchairs have<br />

prompted development of<br />

omnidirectional wheelchairs.<br />

• The 3-DOF Haptic Joystick shown<br />

provides a natural and intuitive<br />

controller for these wheelchairs.<br />

• This joystick provides force feedback<br />

corresponding to all degrees of<br />

control input.<br />

• This controller can assist in obstacle<br />

avoidance and driver training and<br />

has other research applications.<br />

Functional Prototype of the<br />

3-DOF Haptic Joystick<br />

11:00–11:15 WeBT4.7<br />

Asynchronous Haptic Simulation of Contacting<br />

Deformable Objects with Variable Stiffness<br />

Igor Peterlik, Christian Duriez, Stephane Cotin<br />

INRIA Lille Nord Europe, France<br />

• In complex simulations, rapid force<br />

variations can arise due to stiff nonlinear<br />

deformations<br />

• We present an asynchronous technique for<br />

constraint-based haptic rendering of<br />

deformable or rigid objects in contact<br />

• The method is demonstrated with two<br />

deformable objects (volumetric and<br />

curved), each having different stiffness,<br />

simulated at different frequencies<br />

• The experiments show that our<br />

multirate method results in stable,<br />

coherent and precise haptic rendering of<br />

contacts between the objects simulated<br />

asynchronously<br />

Volumetric object simulated at low<br />

frequency (50 Hz), curved<br />

object (snap, thread) simulated<br />

at high frequency (1000 Hz)<br />

10:55–11:00 WeBT4.6<br />

Configuration-based Optimization for Six<br />

Degree-of-Freedom Haptic Rendering<br />

Using Sphere-trees<br />

Xin Zhang, Dangxiao Wang and Yuru Zhang<br />

the State Key Lab of Virtual Reality Technology and Systems<br />

Beihang University, China<br />

Jing Xiao<br />

The Department of Computer Science,<br />

University of North Carolina – Charlotte, USA<br />

• Treated the haptic interaction between a<br />

rigid tool and rigid objects in a virtual<br />

environment as a constrained optimization<br />

problem in the 6-D configuration space of<br />

the tool.<br />

• Modeled the tool and objects as sphere<br />

trees for fast detection of multiple<br />

collisions and formulation of contact<br />

constraints.<br />

• Applied the approach to dental surgery<br />

simulation. Results show stable interaction<br />

at an update rate of 1 kHz.<br />


Six DoF haptic rendering for<br />

dental surgery simulation<br />

11:15–11:30 WeBT4.8<br />

Proxy Method for Fast Haptic Rendering from<br />

Time Varying Point Clouds<br />

Fredrik Rydén, Sina Nia Kosari and Howard Jay Chizeck<br />

Department of Electrical Engineering, University of Washington, USA.<br />

• A novel algorithm for haptic rendering from<br />

time varying point clouds in real time.<br />

• The algorithm can run on a regular laptop<br />

(Core 2 Duo or better) and successfully<br />

render haptic forces at 1000 Hz.<br />

• The user can easily feel features that are<br />

less than 5 mm.<br />

• Algorithm is based on the extension of the<br />

haptic proxy idea to dynamic point clouds.<br />

• Potential applications in gaming and<br />

tele-robotics, including robot-assisted<br />

surgery.<br />

The algorithm running on a<br />

regular laptop. The size of the<br />

proxy is an important parameter<br />

in haptic interaction and can here<br />

be compared to an apple.
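The proxy idea underlying this last paper can be sketched in a few lines: keep a proxy point on the "surface" of the cloud and render a spring force pulling the haptic device toward it. The following is a simplified single-point sketch with assumed stiffness and radius values, not the paper's time-varying-cloud algorithm:

```python
import math

def proxy_force(hip, cloud, radius=0.01, k=500.0):
    """Spring force pushing the haptic device out of a point cloud.

    hip: haptic interaction point (device position, 3-tuple).
    cloud: list of 3-D surface points.
    If the device penetrates within `radius` of a cloud point, the proxy
    is held on the sphere around that point and F = k * (proxy - hip).
    """
    # nearest cloud point to the device position (brute force; real
    # implementations use spatial data structures to hit 1 kHz rates)
    nearest = min(cloud, key=lambda p: math.dist(p, hip))
    d = math.dist(nearest, hip)
    if d >= radius or d == 0.0:
        return (0.0, 0.0, 0.0)          # free motion: no force
    # place the proxy on the sphere surface along the outward direction
    scale = radius / d
    proxy = tuple(n + (h - n) * scale for n, h in zip(nearest, hip))
    return tuple(k * (p - h) for p, h in zip(proxy, hip))
```

For a dynamic point cloud, the nearest-neighbor query and proxy position are recomputed each haptic frame as the cloud updates.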


Session WeBT5 Continental Ballroom 5 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Foundations and Future Prospects of Sampling-based Motion Planning<br />

Chair Kostas E. Bekris, Univ. of Nevada, Reno<br />

Co-Chair Steven M LaValle, Univ. of Illinois<br />

10:00–10:15 WeBT5.1*<br />

Semi-Plenary Invited Talk: Sampling-based Algorithms<br />

for General Motion Control Problems: Recent<br />

Advances and New Challenges<br />

Emilio Frazzoli, Massachusetts Institute of Technology<br />

10:30–10:45 WeBT5.3<br />

An Obstacle-responsive Technique<br />

for the Management and Distribution<br />

of Local Rapidly-exploring Random Trees<br />

Nathan A. Wedge and Michael S. Branicky<br />

Department of Electrical Engineering and Computer Science,<br />

Case Western Reserve University, Cleveland, Ohio, USA<br />

• Previously-introduced PART (Path-length<br />

Annexed Random Tree) builds on RRT by<br />

addressing pathologically-difficult cases<br />

• PART adds threshold on path length to<br />

partition search into local trees, providing<br />

multiple neighbors for diverse expansion<br />

• APART (Adaptive PART) supplements<br />

PART by adjusting thresholds for each<br />

local tree based on obstacles<br />

• Adaptive mechanism focuses effort near<br />

presumed difficult regions and improves<br />

performance over RRT and PART on hard<br />

problems via apt threshold values<br />

APART’s search focuses on tricky<br />

areas like narrow obstacle edges<br />

10:15–10:30 WeBT5.2<br />

Semi-Plenary Invited Talk: Sampling-based Algorithms<br />

for General Motion Control Problems: Recent<br />

Advances and New Challenges<br />

Emilio Frazzoli, Massachusetts Institute of Technology<br />

10:45–10:50 WeBT5.4<br />

Finding Critical Changes in<br />

Dynamic Configuration Spaces<br />

Yanyan Lu and Jyh-Ming Lien<br />

Department of Computer Science, George Mason University, USA<br />

• Probabilistic methods plan in<br />

configuration-time space and do not<br />

reuse computation between<br />

configuration space slices.<br />

• Existing methods reuse the valid<br />

portion of the roadmap but only update<br />

it at fixed times.<br />

• Our method identifies critical changes<br />

in the configuration space and reuses<br />

valid edges and nodes, and updates<br />

roadmap only at the critical times.<br />

• Our planner provides better<br />

completeness and is at least 10 times<br />

faster than existing popular planners.<br />


Topological changes of configuration<br />

free space from t0 to t2


Session WeBT5 Continental Ballroom 5 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Foundations and Future Prospects of Sampling-based Motion Planning<br />

Chair Kostas E. Bekris, Univ. of Nevada, Reno<br />

Co-Chair Steven M LaValle, Univ. of Illinois<br />

10:50–10:55 WeBT5.5<br />

Toggle PRM:<br />

A Simultaneous Mapping of C<sub>free</sub> and C<sub>obstacle</sub><br />

-A Study in 2D-<br />

Jory Denny and Nancy M. Amato<br />

Computer Science and Engineering, Texas A&M University, USA<br />

• New paradigm for sampling-based<br />

PRM planners<br />

• A simultaneous, coordinated mapping<br />

of both C<sub>free</sub> and C<sub>obst</sub><br />

• Theoretical and experimental<br />

improvement for sampling in narrow<br />

passages<br />

• Improved efficiency in terms of number<br />

of validity checks<br />

Uniform OBPRM<br />

Toggle PRM Free Toggle PRM Block<br />

-Example map constructions- Toggle PRM<br />

has highest density in the passage<br />

11:00–11:15 WeBT5.7<br />

EG-RRT: Environment-Guided Random Trees<br />

for Kinodynamic Motion Planning<br />

with Uncertainty and Obstacles<br />

Léonard Jaillet*, Judy Hoffman+, Jur van den Bergº,<br />

Pieter Abbeel+, Josep M. Porta* and Ken Goldberg+<br />

* Institut de Robòtica i Informàtica Industrial, Barcelona, Spain<br />

+ University of California, Berkeley, USA<br />

º University of North Carolina, Chapel Hill, USA<br />

• EG-RRT is an Environment-Guided<br />

variant of RRT designed for<br />

kinodynamic systems<br />

• This hybrid planner integrates the<br />

local limitations of the system<br />

dynamics as well as the risk of leading<br />

to unproductive expansions<br />

• The method may be enhanced by a<br />

cost model based on the LQG-MP<br />

framework<br />

• It produces paths faster and with a<br />

lower probability of collision under<br />

uncertainty in control and sensing<br />

Comparison of four RRT-based<br />

planners. EG-RRT in the lower-right<br />

outperforms the other variants<br />

10:55–11:00 WeBT5.6<br />

Sampling Heuristics for Optimal Motion<br />

Planning in High Dimensions<br />

Baris Akgun and Mike Stilman<br />

The School of Interactive Computing, Georgia Institute of Technology, USA<br />

• Scenario: Allocated time for planning<br />

• Find an initial path quickly and iteratively<br />

decrease its cost fast<br />

• Introduce heuristics to RRT* algorithm to<br />

balance exploration and exploitation<br />

• Implement a bidirectional version for<br />

efficiency in high-dimensions<br />

• Local Biasing: Exploitation<br />

• Node rejection: Efficiency<br />


Trees produced in same duration<br />

Top: With plain RRT*<br />

Bottom: With combined heuristics<br />
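For orientation, the textbook RRT loop that APART, Toggle PRM, and these RRT* heuristics all build on can be sketched as follows. This is plain goal-biased 2D RRT in the unit square with assumed step and tolerance parameters, not any of the session's specific algorithms:

```python
import math
import random

def rrt(start, goal, is_free, step=0.05, goal_tol=0.05, iters=5000, seed=0):
    """Grow a tree from start; return a path reaching goal, or None."""
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        # goal-biased uniform sampling in the unit square
        sample = goal if rng.random() < 0.1 else (rng.random(), rng.random())
        i = min(range(len(nodes)), key=lambda j: math.dist(nodes[j], sample))
        near, d = nodes[i], math.dist(nodes[i], sample)
        if d == 0.0:
            continue
        # steer a fixed step from the nearest node toward the sample
        new = tuple(n + step * (s - n) / d for n, s in zip(near, sample))
        if not is_free(new):
            continue                    # collision: discard the extension
        parent[len(nodes)] = i
        nodes.append(new)
        if math.dist(new, goal) < goal_tol:
            path, j = [], len(nodes) - 1
            while j is not None:        # walk parents back to the root
                path.append(nodes[j])
                j = parent[j]
            return path[::-1]
    return None
```

The session's contributions modify different parts of this loop: APART partitions the tree by path-length thresholds, Toggle PRM also maps the obstacle space, and the RRT* heuristics bias sampling and reject unpromising nodes.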

11:15–11:30 WeBT5.8<br />

Planning Humanlike Actions in Blending Spaces<br />

Yazhou Huang, Mentar Mahmudi<br />

and Marcelo Kallmann<br />

School of Engineering, University of California Merced, USA<br />

• We introduce an approach for enabling<br />

sampling-based planners to compute<br />

motions with humanlike appearance.<br />

• The search space of our planner is<br />

defined on a space of blendable motion<br />

capture examples, generating solutions<br />

similar to the original mocap examples.<br />

• The method is combined with a<br />

locomotion planner in order to generate<br />

whole-body motions for generic action<br />

execution among obstacles.


Session WeBT6 Continental Ballroom 6 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Aerial Robotics: Control and Planning<br />

Chair Srikanth Saripalli, Arizona State Univ.<br />

Co-Chair Pieter Abbeel, UC Berkeley<br />

10:00–10:15 WeBT6.1<br />

UAV Rotorcraft in Compliant Contact:<br />

Stability Analysis and Simulation<br />

Paul Pounds and Aaron Dollar<br />

Department of Mechanical Engineering, Yale University, USA<br />

• End-effector compliance can allow for safe<br />

interactions of a hovering vehicle with<br />

objects, without structural contact<br />

knowledge<br />

• The 6-DOF dynamics of a helicopter with<br />

nonlinear PD control remain stable for a<br />

range of end-effector stiffnesses and<br />

displacements from aircraft center of<br />

gravity<br />

• Simulation of the system for archetypical<br />

end-effector configurations demonstrates<br />

stability performance predicted analytically<br />

Unforced position<br />

Helicopter coupled to ground<br />

via compliance<br />

10:30–10:45 WeBT6.3<br />

Flight Control for Target Seeking by<br />

13 gram Ornithopter<br />

Stanley S. Baek, Fernando L. Garcia Bermudez,<br />

and Ronald S. Fearing<br />

Dept. of EECS, University of California, Berkeley, United States<br />

• Autonomous flight control of a 13 gram<br />

ornithopter flying toward a target using<br />

onboard sensing and computation only<br />

• 1.0 gram control electronics including a<br />

microcontroller, inertial and visual<br />

sensors, wireless communication, and<br />

motor drivers.<br />

• Simplified aerodynamic model of<br />

ornithopter flight to reduce the order of<br />

the control system.<br />

• Dead-reckoning algorithm to recover<br />

from temporary loss of the target.<br />

• Landed within a radius of 0.5 m from the<br />

target with more than 85% success (N =<br />

20)<br />

target<br />


10:15–10:30 WeBT6.2<br />

Design, Modeling, Estimation and Control<br />

for Aerial Grasping and Manipulation<br />

Daniel Mellinger, Quentin Lindsey,<br />

Michael Shomin, and Vijay Kumar<br />

GRASP Lab, University of Pennsylvania, USA<br />

• We present the design of several grippers<br />

that allow quadrotors to pick up and<br />

transport payloads<br />

• We describe methods for estimating and<br />

adapting to the changes in system<br />

dynamics due to the grasped payloads<br />

• We present experimental results for the<br />

estimation methods and show<br />

considerable improvement in tracking<br />

performance with the inclusion of the<br />

estimated parameters<br />


Quadrotors Transporting Payloads<br />

10:45–10:50 WeBT6.4<br />

Embedded robust nonlinear control for a four-rotor<br />

rotorcraft: Validation in real-time with wind gusts<br />

Laura E. Muñoz, Pedro Castillo, Guillaume Sanahuja<br />

Heudiasyc UMR CNRS 6599, UTC France<br />

Omar Santos<br />

CITIS - Universidad Autónoma del Estado de Hidalgo, México<br />

• An embedded robust nonlinear controller to stabilize a four-rotor<br />

rotorcraft in the presence of bounded disturbances is<br />

proposed.<br />

• Stability of the closed-loop system is proved<br />

using Lyapunov theory.<br />

• A new prototype with an embedded control system has been<br />

developed.<br />

• The proposed algorithms were validated through simulation<br />

and real-time experiments.<br />


Session WeBT6 Continental Ballroom 6 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Aerial Robotics: Control and Planning<br />

Chair Srikanth Saripalli, Arizona State Univ.<br />

Co-Chair Pieter Abbeel, UC Berkeley<br />

10:50–10:55 WeBT6.5<br />

Differential Flatness Based Control of a<br />

Rotorcraft For Aggressive Maneuvers<br />

Jeff Ferrin, Robert Leishman and Tim McLain<br />

Mechanical Engineering, Brigham Young University, U.S.A.<br />

Randy Beard<br />

Electrical Engineering, Brigham Young University, U.S.A.<br />

• A differential flatness based control<br />

method for rotorcraft is presented<br />

• Feed forward term allows for improved<br />

path following performance with large<br />

pitch and roll angles<br />

• The control method also allows for<br />

independent yaw control while flying the<br />

path<br />

• Hardware results demonstrate substantial<br />

improvement over standard feedback<br />

control methods (no feed forward term)<br />

Pitch and roll angles of aircraft<br />

during aggressive flight<br />

11:00–11:15 WeBT6.7<br />

Magnetic Localization for<br />

Perching UAVs on Powerlines<br />

Joseph Moore and Russ Tedrake<br />

Computer Science and Artificial Intelligence Lab (CSAIL), MIT, USA<br />

• Goal: Perch on powerlines to recharge<br />

• Potential challenges<br />

• Unknown current phase leads to<br />

ambiguity in positions<br />

• Signal strength decreases like<br />

1/distance^3<br />

• Developed onboard magnetometer and<br />

estimation algorithm (using EKF)<br />

• Capable of producing sufficient estimates<br />

for our dynamic perching maneuvers,<br />

which start 4 m from the wire at 8 m/s.<br />
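The estimation step can be illustrated with a one-state EKF against the 1/distance³ field falloff the abstract mentions. This is a toy scalar sketch with assumed field constant and noise covariances, not the paper's full onboard filter:

```python
def ekf_update(x, P, z, c=1.0, R=1e-4):
    """One EKF measurement update for the model h(x) = c / x**3.

    x: distance estimate, P: its variance, z: measured field strength.
    """
    h = c / x**3
    H = -3.0 * c / x**4          # dh/dx, linearized at the current estimate
    S = H * P * H + R            # innovation covariance
    K = P * H / S                # Kalman gain
    return x + K * (z - h), (1.0 - K * H) * P

def estimate_distance(z, x0=1.5, P0=0.5, Q=0.01, steps=20):
    """Iterate predict/update with a constant measurement (assumed values)."""
    x, P = x0, P0
    for _ in range(steps):
        P = P + Q                # predict: random-walk process noise
        x, P = ekf_update(x, P, z)
    return x
```

With a measurement consistent with a true distance of 2 (z = 1/2³ = 0.125), repeated updates pull the estimate from 1.5 toward 2; the steep nonlinearity of 1/x³ is exactly what makes linearization (and hence the EKF) necessary here.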

10:55–11:00 WeBT6.6<br />

Robust Embedded Egomotion Estimation<br />

Rainer Voigt, Janosch Nikolic, Christoph Hürzeler,<br />

Stephan Weiss, Laurent Kneip and Roland Siegwart<br />

Autonomous Systems Lab, ETH Zürich, Switzerland<br />

• Robust and efficient egomotion estimation in<br />

challenging industrial environments<br />

• Tight coupling of visual and inertial cues to<br />

cope with poorly or repetitively textured<br />

environments<br />

• Real-time operation on low-power embedded<br />

computing platform<br />

11:15–11:30 WeBT6.8<br />

Persistent Surveillance with a Team of MAVs<br />

Nathan Michael and Kartik Mohta<br />

GRASP Laboratory, University of Pennsylvania, USA<br />

Ethan Stump<br />

Army Research Laboratory, USA<br />

• Consider the problem of deploying a team<br />

of aerial robots to persistently survey<br />

discrete locations of interest<br />

• Formulate the problem as a Vehicle<br />

Routing Problem with Time Windows and<br />

detail a real-time optimal solution<br />

methodology<br />

• Detail a system architecture capable of<br />

addressing the problem of persistent<br />

surveillance with a team of autonomous<br />

micro-aerial vehicles<br />

• Report simulation and experimental<br />

results for teams of quadrotors both<br />

indoors and outdoors<br />


Two quadrotors are deployed<br />

autonomously to survey discrete<br />

interest points


Session WeBT7 Continental Parlor 7 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Passive Walking & Leg-Wheeled Robots<br />

Chair Dong-Soo Kwon, KAIST<br />

Co-Chair Martin Buehler, iRobot<br />

10:00–10:15 WeBT7.1<br />

Nonlinear Structure of Escape-Times to Falls for<br />

a Passive Dynamic Walker in an Irregular Slope<br />

Hiromichi Suetani (A, B, C) and Aiko Ideta (A)<br />

(A) Department of Physics and Astronomy, Kagoshima University, Japan<br />

(B) Decoding and Controlling Brain Information, PRESTO, JST, Japan<br />

(C) RIKEN Advanced Science Institute, Japan<br />

Jun Morimoto (D)<br />

(D) Dept. of BRI, ATR Computational Neuroscience Laboratories, Japan<br />

• Passive dynamic walkers (PDWs) use<br />

“limit cycle attractors” for their gait, but<br />

the area of their basins is very small, i.e.,<br />

PDWs are very susceptible to noise.<br />

• A set of escape-times from walking to<br />

falling of PDW with noise exhibits a<br />

characteristic nonlinear structure in the<br />

state space, which can be considered as<br />

the “basin ruin” of the noiseless PDW.<br />

• Analyzing such a nonlinear structure<br />

using multi-class Support Vector<br />

Machines and Canonical Correlation<br />

Analysis<br />

• The nonlinear structure of these escape-times<br />

may be useful for reinforcement<br />

learning of PDWs.<br />

Escape-times from walking to<br />

falling on a Poincaré surface.<br />

Each color indicates the number<br />

of steps left before falling.<br />

10:30–10:45 WeBT7.3<br />

Optimal gait switching for legged locomotion<br />

B. Kersbergen, G.A.D. Lopes,<br />

T.J.J. van den Boom, B. De Schutter and R. Babuska<br />

Delft Center for Systems and Control,<br />

Delft University of Technology, The Netherlands<br />

• The gait space of multi-legged robots is<br />

combinatorial<br />

• We use switching max-plus linear models<br />

to describe legged locomotion<br />

• We present a systematic way to generate<br />

optimal gait transitions, in the sense of<br />

minimizing the foot velocity variation<br />

during stance<br />

• All gait switches are intrinsically safe<br />

Optimal gait switch for a<br />

hexapod robot<br />
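The max-plus machinery this abstract relies on is compact enough to sketch: in max-plus algebra, addition becomes max and multiplication becomes addition, so leg event times propagate as x_{k+1}[i] = max_j (A[i][j] + x_k[j]). The following is a toy two-leg example with assumed timings, illustrating the modeling tool rather than the paper's full switching gait model:

```python
NEG_INF = float("-inf")   # the max-plus "zero" element (absorbing)

def maxplus_matvec(A, x):
    """Max-plus matrix-vector product: (A ⊗ x)[i] = max_j (A[i][j] + x[j])."""
    return [max(a + xj for a, xj in zip(row, x)) for row in A]

# Toy alternating two-leg gait: each leg's next event fires 2.0 s after
# the other leg's last event (timings are assumptions).
A = [[NEG_INF, 2.0],
     [2.0, NEG_INF]]
x0 = [0.0, 1.0]                  # initial event times of legs 0 and 1
x1 = maxplus_matvec(A, x0)       # next cycle's event times
```

Gait switching in the paper corresponds to swapping the system matrix A between cycles; the max-plus structure keeps the leg synchronization constraints linear in this exotic algebra.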

10:15–10:30 WeBT7.2<br />

Increasing the Robustness of Acrobot walking<br />

control using compliant mechanisms<br />

Maziar Ahmad Sharbafi and Mohammad Javad Yazdanpanah<br />

and Majid Nili Ahmadabadi<br />

School of Electrical and Computer Engineering, University of Tehran, Iran<br />

• Two controllers are designed based on<br />

Hybrid Zero Dynamics for flat and inclined<br />

surfaces.<br />

• Calculating the difference in force<br />

generated by the two controllers on an<br />

inclined surface<br />

• Finding a nonlinear compliance to<br />

compensate for the required force<br />

• Evaluation of the controller on flat surface<br />

• Evaluation of the controller on inclined<br />

surface<br />

• Analyzing the controller performance<br />


Walking on a slope with the HZD-based<br />

controller designed for flat surfaces;<br />

a nonlinear compliance is inserted<br />

to enable walking on the slope.<br />

10:45–10:50 WeBT7.4<br />

Development and Experiment of a Kneed Biped Walking<br />

Robot Based on Parametric Excitation Principle<br />

Yoshihisa Banno, Kouichi Taji and Yoji Uno<br />

Mechanical Science and Engineering, Nagoya University, Japan<br />

Yuji Harata<br />

Mechanical Systems and Applied Mechanics, Hiroshima University, Japan<br />

• The parametric excitation walking with a<br />

simple periodic reference trajectory was<br />

realized.<br />

• A kneed biped robot based on the parametric<br />

excitation principle was developed.<br />

• In the experiment, sustained walking of<br />

about 17 steps on level ground was<br />

achieved.<br />

• Several angular coordinates of<br />

experimental results were very close to<br />

those of simulation results.<br />

Developed biped robot


Session WeBT7 Continental Parlor 7 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Passive Walking & Leg-Wheeled Robots<br />

Chair Dong-Soo Kwon, KAIST<br />

Co-Chair Martin Buehler, iRobot<br />

10:50–10:55 WeBT7.5<br />

Switchblade: An Agile Treaded Rover<br />

Nicholas Morozovsky, Christopher Schmidt-Wetekam,<br />

Thomas Bewley<br />

Mechanical and Aerospace Engineering Department, University of California<br />

San Diego, USA<br />

• Able to traverse rough terrain while<br />

retaining a small form factor for navigating<br />

confined spaces<br />

• Combination of a novel transforming<br />

mechanical design, capable onboard<br />

electronics, and advanced feedback<br />

control algorithms<br />

• Manipulates its center of mass to perform<br />

unique maneuvers<br />

• Well suited for a variety of socially-relevant<br />

applications, including<br />

reconnaissance, mine exploration, and<br />

search & rescue<br />

11:00–11:15 WeBT7.7<br />

Robust obstacle crossing of a wheel-legged<br />

mobile robot using minimax force distribution<br />

and self-reconfiguration<br />

Pierre Jarrault, Christophe Grand and Philippe Bidaud<br />

Institut des Systèmes Intelligents et de Robotique<br />

Université Pierre et Marie Curie, France<br />

• Control algorithm for obstacle<br />

crossing with a poly-articulated<br />

wheeled mobile robot<br />

• Stability criterion based on the<br />

smallest force sustainable at the<br />

wheel/ground contact<br />

• Optimization of both the force<br />

distribution and the center of mass<br />

location in order to maximize the<br />

criterion<br />

• Cross-ability analysis of an obstacle<br />

10:55–11:00 WeBT7.6<br />

Passive Dynamic Walking of Combined Rimless<br />

Wheel and Its Speeding-up by Adjustment of<br />

Phase Difference<br />

Ryosuke Inoue, Fumihiko Asano, Daiki Tanaka and Isao Tokuda<br />

School of Information Science,<br />

Japan Advanced Institute of Science and Technology, Japan<br />

• A combined rimless wheel (CRW) that<br />

consists of two identical 8-legged rimless<br />

wheels is introduced.<br />

• Numerical simulations show that the<br />

walking speed dramatically increases by<br />

adjustment of the phase difference.<br />

• The validity of the theoretical results is<br />

confirmed by using a prototype<br />

experimental CRW.<br />

• The mechanism of speeding-up is<br />

investigated from the potential barrier<br />

point of view.<br />

11:15–11:30 WeBT7.8<br />

Zero-Moment Point Feedforward Balance<br />

Control of Leg-Wheel Hybrid Structures by using<br />

Input/Output Linearization<br />

Sang-ik An and Dong-Soo Kwon<br />

Mechanical Engineering, KAIST, Korea<br />

• A balance control algorithm is proposed<br />

which maximizes the dynamic stability by<br />

controlling the ZMP to remain at the<br />

geometric center of the supporting<br />

polygon while the robot traverses the<br />

desired trajectory.<br />

• The ZMP feedforward constraint is<br />

inserted into the dynamically decoupled<br />

model to achieve maximum dynamic<br />

stability, but it generates internal<br />

dynamics.<br />

• Input/Output Linearization is utilized to<br />

address the problem of the internal<br />

dynamics.<br />


Leg-Wheel Hybrid Robot
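The ZMP quantity at the heart of the last abstract has a compact closed form for the standard planar cart-table (point-mass) model. This sketch states that textbook formula with illustrative numbers; the paper's contribution is embedding such a constraint as feedforward in a dynamically decoupled model:

```python
def zmp_x(x_com, x_com_ddot, z_com, g=9.81):
    """Planar ZMP of a point mass on flat ground (cart-table model).

    x_com: horizontal CoM position [m], x_com_ddot: its acceleration [m/s^2],
    z_com: constant CoM height [m].  Standard relation:
        x_zmp = x_com - (z_com / g) * x_com_ddot
    """
    return x_com - (z_com / g) * x_com_ddot

# Illustrative values: a CoM at 0.8 m accelerating forward shifts the
# ZMP backward, which is what a balance controller must keep inside
# the support polygon.
print(zmp_x(0.1, 1.0, 0.8))
```

Keeping this quantity at the geometric center of the support polygon, as the abstract describes, maximizes the margin before the robot tips.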


Session WeBT8 Continental Parlor 8 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Multirobot Systems: Rendezvous & Task Switching<br />

Chair Richard Vaughan, Simon Fraser Univ.<br />

Co-Chair Filippo Arrichiello, Univ. di Cassino<br />

10:00–10:15 WeBT8.1<br />

Bayesian Rendezvous for Mobile Robots<br />

Sven Gowal and Alcherio Martinoli<br />

DISAL, EPFL, Switzerland<br />

• We study rendezvous with differential-wheeled<br />

robots.<br />

• Robots have a limited and noisy<br />

perception of their neighboring<br />

teammates.<br />

• Rendezvous convergence properties are<br />

proven under noisy observations.<br />

• A Bayesian framework is exploited to<br />

improve the rendezvous performance.<br />

• Experiments are carried out on real<br />

Khepera III robots equipped with an IR-based<br />

range and bearing module.<br />

3 simulated Khepera III robots<br />

performing the Bayesian<br />

rendezvous with a particle filter.<br />
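The Bayesian update behind such a particle-filter rendezvous can be sketched in a few lines (a generic bootstrap filter on a 1-D relative position with invented noise figures, not the authors' implementation):<br />

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_update(particles, weights, z, sigma):
    """One Bayesian measurement update: reweight each particle by the
    Gaussian likelihood of the observation z, then resample with jitter."""
    weights = weights * np.exp(-0.5 * ((particles - z) / sigma) ** 2)
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    resampled = particles[idx] + rng.normal(0.0, 0.01, len(particles))
    return resampled, np.full(len(particles), 1.0 / len(particles))

# A teammate truly sits at x = 2.0; refine a broad prior with noisy
# relative observations (range/bearing reduced to 1-D for brevity).
particles = rng.uniform(-5.0, 5.0, 2000)
weights = np.full(2000, 1.0 / 2000)
for _ in range(10):
    z = 2.0 + rng.normal(0.0, 0.3)
    particles, weights = particle_update(particles, weights, z, sigma=0.3)
estimate = particles.mean()
```

After ten noisy observations the particle cloud concentrates around the teammate's true position.<br />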

10:30–10:45 WeBT8.3<br />

A decentralized controller-observer scheme<br />

for multi-robot weighted centroid tracking<br />

Gianluca Antonelli, Filippo Arrichiello, Alessandro Marino<br />

DAEIMI, University of Cassino, Italy<br />

Fabrizio Caccavale<br />

DIFA, University of Basilicata, Italy<br />

• The paper presents a decentralized<br />

controller-observer scheme for centroid<br />

tracking with a multi-robot system<br />

• Each robot uses information from a local<br />

observer to cooperatively track an<br />

assigned time-varying reference<br />

• Convergence of the scheme is proven for<br />

fixed and switching communication<br />

topologies, and for directed and<br />

undirected graphs<br />

• Numerical simulations relative to different<br />

case studies are illustrated<br />

10:15–10:30 WeBT8.2<br />

Power-Aware Rendezvous with Shrinking<br />

Footprints<br />

Hassan Jaleel and Magnus Egerstedt<br />

Electrical and Computer Engineering<br />

Georgia Institute of Technology, USA<br />

• Power consumption is linked to sensor<br />

footprint area in multi-agent networks.<br />

• The rendezvous problem is studied when<br />

the footprints shrink over time due to<br />

power consumption.<br />

• A nonlinear interaction law is proposed for<br />

ensuring that no interactions are lost<br />

despite the shrinking footprints.<br />

• Lyapunov-based techniques are used to<br />

ensure that connectivity is achieved for all<br />

times.<br />
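A nonlinear interaction law of this flavor can be illustrated with a consensus update whose edge gain grows as the inter-robot distance nears the current sensing radius (a sketch with invented gains and a hypothetical shrink rate, not the paper's controller):<br />

```python
import numpy as np

def rendezvous_step(x, radius, dt=0.05, eps=0.05):
    """One step of a connectivity-preserving consensus law: the weight on
    each edge blows up as the distance approaches the (shrinking) sensor
    radius, pulling neighbors back together before the link is lost."""
    n = len(x)
    dx = np.zeros_like(x)
    for i in range(n):
        for j in range(n):
            d = np.linalg.norm(x[j] - x[i])
            if i != j and d < radius:
                w = 1.0 / max(radius - d, eps)   # nonlinear edge weight
                dx[i] += w * (x[j] - x[i])
    return x + dt * dx / n

# Three robots; the footprint radius shrinks as power is consumed.
x = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 0.8]])
radius = 1.5
for _ in range(200):
    x = rendezvous_step(x, radius)
    radius -= 0.002
spread = np.max(np.linalg.norm(x - x.mean(axis=0), axis=1))
```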

10:45–10:50 WeBT8.4<br />

Market-based Coordination of<br />

Coupled Robot Systems<br />

Ling Xu and Anthony Stentz<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• We address the problem of<br />

environmental coverage with multiple<br />

robots<br />

• To solve this coupled problem, we<br />

introduce a parallel branch-and-bound<br />

approach that divides the computation<br />

among the robots<br />

• We also introduce an auctions<br />

framework, based on market-based<br />

techniques, which enables the robots to<br />

share the work and focus the search<br />

• Test results show that parallel branch-and-bound<br />

with auctions can lead to<br />

significant improvements for both search<br />

tree size and computation time<br />


Parallel branch-and-bound with<br />

auctions
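Market-based allocation of this kind can be illustrated with a minimal greedy auction (hypothetical costs; the paper's mechanism additionally interleaves the auctions with the parallel branch-and-bound search):<br />

```python
def auction_allocate(costs):
    """Greedy single-item auction: repeatedly award the task whose best
    bid is cheapest to the robot that bid it.
    costs[r][t] = robot r's cost estimate for task t."""
    n_robots, n_tasks = len(costs), len(costs[0])
    assignment = {}
    open_tasks = set(range(n_tasks))
    while open_tasks:
        # every robot bids on every remaining task; lowest bid wins
        bids = [(costs[r][t], r, t)
                for r in range(n_robots) for t in open_tasks]
        cost, winner, task = min(bids)
        assignment[task] = winner
        open_tasks.remove(task)
    return assignment

# Two robots, three coverage regions; a lower cost means a closer region.
costs = [[1.0, 4.0, 9.0],
         [5.0, 2.0, 3.0]]
assignment = auction_allocate(costs)
```

Here robot 0 wins region 0 and robot 1 wins regions 1 and 2; a fuller scheme would also balance load across robots.<br />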



10:50–10:55 WeBT8.5<br />

Event-driven Gaussian Process for Object<br />

Localization in Wireless Sensor Networks<br />

Jae Hyun YOO, Woojin Kim, and H.Jin KIM<br />

School of Mechanical & Aerospace Engineering,<br />

Seoul National University, South Korea<br />

• Wireless sensor networks suffer from<br />

excessive traffic between communicating<br />

nodes<br />

• In the error-bounded algorithm, each node<br />

decides whether or not to transmit data<br />

to a sink node<br />

• Gaussian process regression is<br />

employed for position estimation of an<br />

unknown object<br />

• RSSI (Received Signal Strength<br />

Indicator), IR (infrared) and<br />

magnetic sensors are used<br />

• Experimental results demonstrate the<br />

efficiency and accuracy of the proposed<br />

event-driven Gaussian process approach<br />

Experimental setup for the<br />

evaluation of the event-driven<br />

Gaussian process with thirty sensor<br />

nodes and one mobile robot<br />
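Gaussian process regression of this kind reduces to a few linear solves. The sketch below (synthetic 1-D data standing in for calibrated sensor readings, not the paper's model) returns the posterior mean and variance:<br />

```python
import numpy as np

def gp_predict(X_train, y_train, X_test, length=1.0, noise=0.1):
    """Plain GP regression with an RBF kernel: posterior mean and
    variance of the target at the test inputs."""
    def rbf(a, b):
        d = a[:, None] - b[None, :]
        return np.exp(-0.5 * (d / length) ** 2)

    K = rbf(X_train, X_train) + noise**2 * np.eye(len(X_train))
    Ks = rbf(X_train, X_test)
    alpha = np.linalg.solve(K, y_train)       # K^{-1} y
    mean = Ks.T @ alpha
    v = np.linalg.solve(K, Ks)
    var = rbf(X_test, X_test).diagonal() - np.sum(Ks * v, axis=0)
    return mean, var

# Toy example: recover a smooth reading-to-position map from samples.
X = np.linspace(0.0, 5.0, 25)
y = np.sin(X)                     # stand-in for calibrated readings
mean, var = gp_predict(X, y, np.array([2.5]))
```

The posterior mean at a well-sampled input closely matches the underlying function, and the variance quantifies the remaining uncertainty.<br />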

11:00–11:15 WeBT8.7<br />

Task Switching in Multirobot Learning<br />

through Indirect Encoding<br />

David D’Ambrosio and Joel Lehman and Sebastian Risi and<br />

Kenneth Stanley<br />

Department of Electrical Engineering and Computer Science,<br />

University of Central Florida, United States<br />

• Training multiple robots to work together<br />

to perform complementary tasks is difficult<br />

• In this paper, multiagent HyperNEAT<br />

exploits the regularities among different<br />

robots and different tasks<br />

• Teams of Khepera III robots cooperatively<br />

explore an area and then return home<br />

when called<br />

• Teams that exploit regularities between<br />

subtasks solve the problem more<br />

consistently than those that cannot<br />

Multiagent HyperNEAT exploits<br />

the relationship among robots<br />

10:55–11:00 WeBT8.6<br />

Multi-Robot Patrolling with Coordinated<br />

Behaviours in Realistic Environments<br />

Luca Iocchi 1 , Luca Marchetti 1,2 , Daniele Nardi 1<br />

1 Department of Computer and Systems Science,<br />

Sapienza, University of Rome, Italy.<br />

2 INRIA Sophia Antipolis - Méditerranée<br />

Sophia Antipolis, France.<br />

• Multi-Robot Patrolling needs more experiments in real and realistic<br />

simulated environments<br />

• Many strategies (devised and tested on abstract simulators) may act suboptimally<br />

or fail in real environments.<br />

• MRP needs on-line coordination in order to deal with un-modeled aspects<br />

of the problem during execution.<br />


Contributions<br />

• Realistic simulator (ROS)<br />

• Evaluation of standard MRP<br />

strategies under real conditions.<br />

• Definition of coordinated behaviors<br />

to improve MRP strategies in real<br />

environments.<br />

11:15–11:30 WeBT8.8<br />

Multi-Robot Coordination Methodology in<br />

Congested Systems with Bottlenecks<br />

Satoshi Hoshino<br />

Chemical Resources Laboratory, Tokyo Institute of Technology, Japan<br />

• A novel coordination methodology for<br />

autonomous mobile robots in congested<br />

systems with bottlenecks is proposed<br />

• A robot behavior control technique is<br />

improved to generate a more efficient<br />

interaction force among robots<br />

• An environmental rule is designed to exert<br />

an amplified interaction force on robots<br />

moving in congestion<br />

• The effectiveness of the behavior control<br />

technique and the environmental rule is<br />

shown through simulation experiments<br />

Simulation environment with three circuits, crossings and junctions (lane unit 20 m), and the velocity distribution in each lane<br />


Session WeBT9 Continental Parlor 9 Wednesday, September 28, <strong>2011</strong>, 10:00–11:30<br />

Visual Servoing<br />

Chair Francois Chaumette, INRIA Rennes-Bretagne Atlantique<br />

Co-Chair Seth Hutchinson, Univ. of Illinois<br />

10:00–10:15 WeBT9.1<br />

From Optimal Planning to Visual Servoing<br />

With Limited FOV<br />

Paolo Salaris, Lucia Pallottino and Antonio Bicchi<br />

Interdept. Research Center “Enrico Piaggio”, University of Pisa, Italy<br />

Seth Hutchinson<br />

Dept. of Electrical and Computer Engineering, University of Illinois, USA<br />

• This paper presents an optimal feedback<br />

control scheme to drive a vehicle toward a<br />

desired position<br />

• The vehicle is equipped with a limited field-of-view<br />

camera and must follow the shortest<br />

path keeping a given landmark in sight<br />

• Feedback control laws are defined<br />

exploiting geometric properties of the<br />

synthesis available from previous works<br />

Example of an optimal (shortest)<br />

path on the motion plane<br />

• A proof of stability is given in a slightly generalized setting, namely<br />

that of stability on a manifold<br />

• Simulations and experiments demonstrate the effectiveness of the<br />

proposed technique<br />

10:30–10:45 WeBT9.3<br />

Intensity-based visual servoing for non-rigid<br />

motion compensation of soft tissue structures<br />

due to physiological motion using 4D ultrasound<br />

Deukhee Lee and Alexandre Krupa<br />

INRIA Rennes-Bretagne Atlantique, IRISA, France<br />

• Intensity-based visual tracking of 3D non-rigid<br />

motion in real time<br />

− 3D non-rigid motion model using<br />

TPS<br />

− Intensity-based motion estimation of<br />

a set of 3D points<br />

• Visual servo control structure to stabilize<br />

3D non-rigid motion using 4D-US imaging<br />

− Global rigid motion extraction<br />

− Position-based visual servo control<br />

• Successful robotic experiments for motion<br />

compensation with deformable TM<br />

phantom<br />

Manipulator<br />

6-DOF robot<br />

4D ultrasound probe<br />

Abdominal US<br />

phantom<br />

Turning table<br />

Drawer<br />

Deformable phantom<br />

A 6-DOF robot holding a 4D-US<br />

probe compensates rigid<br />

motion (top) and non-rigid<br />

motion (bottom).<br />

10:15–10:30 WeBT9.2<br />

Constrained Manipulator Visual Servoing (CMVS):<br />

Rapid Robot Programming in Cluttered Workspaces<br />

Ambrose Chan and Elizabeth Croft<br />

Mechanical Engineering, University of British Columbia, Canada<br />

James Little<br />

Computer Science, University of British Columbia, Canada<br />

• Model-free visual servoing of eye-in-hand<br />

manipulators in cluttered environments<br />

• Exploits the natural by-products of the<br />

teach-by-showing process, to help the<br />

robot navigate non-convex spaces<br />

• Robot servos to previously untaught<br />

locations under challenging constraints:<br />

• whole-arm collisions<br />

• object occlusions<br />

• robot's joint + camera's sensing limits<br />

• Automatic extraction of relevant cost<br />

functions and constraints for servoing<br />

• Experimental verification with Barrett WAM<br />

7-DOF + Sony XC-HR70 camera<br />


Robot observes while the user<br />

obtains a “reference image” for<br />

visual servoing<br />

10:45–10:50 WeBT9.4<br />

Improving ultrasound intensity-based visual<br />

servoing: tracking and positioning tasks with<br />

2D and bi-plane probes<br />

Caroline Nadeau and Alexandre Krupa<br />

IRISA, INRIA Rennes-Bretagne Atlantique, France<br />

• Control of the in-plane and out-of-plane<br />

motions of an ultrasound probe<br />

• Intensity-based approach (no image<br />

processing or segmentation step)<br />

• On-line estimation of the current<br />

interaction matrix for positioning task with<br />

a classical 2D probe<br />

• Modelling of a bi-plane approach and<br />

experimental results with 2D and bi-plane<br />

ultrasound probes<br />

Validation of the method with a<br />

robotic arm using a realistic<br />

abdominal phantom



10:50–10:55 WeBT9.5<br />

Automatic landing on aircraft carrier by<br />

visual servoing<br />

Laurent Coutard and François Chaumette<br />

INRIA Rennes-Bretagne Atlantique and IRISA, France<br />

Jean-Michel Pflimlin<br />

Dassault Aviation, France<br />

• Alignment and landing tasks toward a<br />

moving aircraft carrier<br />

• 2D visual features, inspired by cues<br />

used by pilots, are related to aircraft<br />

state<br />

• Control with outputs feedback:<br />

•2D visual features<br />

•Inertial Central Unit<br />

• Validation with a 3D model-based tracker<br />

in a high fidelity simulator<br />

11:00–11:15 WeBT9.7<br />

Towards Vision-Based Control of Cable-Driven<br />

Parallel Robots<br />

Tej Dallej, Marc Gouttefarde, Nicolas Andreff, Micaël Michelin<br />

and Philippe Martinet<br />

(LASMEA/U.B.P - LIRMM - TECNALIA), FRANCE<br />

• Visual servoing methods are a good<br />

alternative for the control of cable-driven<br />

parallel robots.<br />

• The end-effector pose is measured using<br />

a camera and used for regulation.<br />

• The use of computer vision in the<br />

feedback simplifies the kinematic<br />

modeling.<br />

• The forward kinematic model is never<br />

used in the control.<br />

A photograph of the ReelAx8 mobile<br />

platform observed by a camera.<br />

10:55–11:00 WeBT9.6<br />

Combining IBVS and PBVS to ensure the<br />

visibility constraint<br />

Olivier Kermorgant and François Chaumette<br />

Lagadic, INRIA Rennes-Bretagne Atlantique, France<br />

• Hybrid visual servoing based on a generic fusion approach<br />

• Pure PBVS is performed inside a safe image area<br />

• 2D information is added to keep the object in the image<br />

• Validation with simulation and experiments<br />


Visual servo with a 3D tracking<br />

The model nodes are used as 2D<br />

features when they approach the<br />

image border<br />

11:15–11:30 WeBT9.8<br />

Time-Analysis of a Real-Time Sensor-Servoing<br />

System using Line-of-Sight Path Tracking<br />

Johannes Schrimpf 1 , Morten Lind 2 and Geir Mathisen 1,3<br />

1 Dept. of Engineering Cybernetics,<br />

Norwegian University of Science and Technology, Norway<br />

2 Dept. of Production and Quality Engineering,<br />

Norwegian University of Science and Technology, Norway<br />

3 Dept. of Applied Cybernetics, SINTEF ICT, Norway<br />

• Case system: Visual servoing<br />

for path-tracking with a real-time<br />

controlled robot.<br />

• Measurement and estimation of<br />

the delays in the control loop.<br />

• Stability analysis of a simple<br />

control model based on pure<br />

delays in the control loop.<br />

• Qualitative validation of the<br />

estimated delay by simulated<br />

and experimental data.


Session WeCT1 Continental Parlor 1 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Human-Robot Interaction and Cooperation<br />

Chair Weihua Sheng, Oklahoma State Univ.<br />

Co-Chair Rachid Alami, CNRS<br />

14:00–14:15 WeCT1.1<br />

A Path Planning Method for Human Tracking<br />

Agents using Variable-term Prediction based on<br />

dynamic k Nearest Neighbor Algorithm<br />

Noriko Takemura, Yutaka Nakamura, and Hiroshi Ishiguro<br />

Graduate School of Engineering Science, Osaka University, Japan<br />

• A path planning method for human<br />

tracking by multiple agents.<br />

• Agents’ paths are planned based on<br />

similarity between the humans’ positions<br />

and the agents’ FOV.<br />

• The horizon length of the planned path is<br />

determined according to the accuracy of<br />

predicted human positions.<br />

• Simulation experiments showed that<br />

agents can keep tracking human<br />

movements recorded in a real<br />

environment.<br />

Recorded human movements<br />

used for simulation experiment<br />

(Kyoto station)<br />

14:30–14:45 WeCT1.3<br />

Sound Source Localization Based on<br />

Time Difference Feature and Space Grid Matching<br />

Xiaofei Li, Hong Liu and Xuesong Yang<br />

Shenzhen Graduate School, Peking University, CHINA<br />

• A new spectral weighting GCC-PHAT<br />

method estimates robust time difference<br />

of arrival in real environments.<br />

• Space is divided into grids represented<br />

by Gaussian Mixture Model based on<br />

time difference feature.<br />

• Finding the maximum likelihood grid with<br />

the current time difference feature in<br />

localization step.<br />

• Decision tree reduces the number of<br />

matches greatly.<br />

The partition of horizontal space<br />

and the process of localization<br />
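The GCC-PHAT estimator at the core of this talk (and of WeCT1.7) can be sketched with NumPy FFTs — a generic illustration with synthetic signals, not the authors' spectral-weighting variant:<br />

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None):
    """Estimate the time difference of arrival between two microphone
    signals using generalized cross-correlation with PHAT weighting."""
    n = len(sig) + len(ref)                 # zero-pad to avoid wrap-around
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12                  # PHAT: keep phase, drop magnitude
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    if max_tau is not None:
        max_shift = min(int(fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    # peak location of the cross-correlation gives the delay in seconds
    return (np.argmax(np.abs(cc)) - max_shift) / fs

# Synthetic check: one channel lags the other by 5 samples at 16 kHz.
fs = 16000
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)
delayed = np.concatenate((np.zeros(5), x[:-5]))
tau = gcc_phat(delayed, x, fs)
```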

14:15–14:30 WeCT1.2<br />

Using Human Motion Estimation for<br />

Human-Robot Cooperative Manipulation<br />

Anand Thobbi, Ye Gu and Weihua Sheng<br />

Department of Electrical and Computer Engineering,<br />

Oklahoma State University, USA.<br />

• We propose a novel approach that allows<br />

a humanoid robot to determine its<br />

leader/follower role in a collaborative<br />

manipulation task<br />

• Behavior of the robot is governed by a<br />

weighted sum of proactive and reactive<br />

controllers<br />

• Reactive controller learnt using<br />

reinforcement learning<br />

• Design of the proactive controller is based<br />

on human motion prediction<br />

• Proposed framework is demonstrated on a<br />

joint table lifting task<br />


Experimental Setup and Results<br />

14:45–14:50 WeCT1.4<br />

Adapting Robot Team Behavior from Interaction<br />

with a Group of People<br />

P. Urcola and L. Montano<br />

Instituto de Investigación en Ingeniería de Aragón<br />

Universidad de Zaragoza, Spain<br />

• A team of robots cooperate to detect and track people using laser range<br />

finder sensors<br />

• Group motion and people entering and leaving the group are detected<br />

• The behavior of the team is adapted to the people and to the<br />

environment<br />

• The system is applied in a group guiding mission scenario<br />

• Simulation tests and experiments in a real environment are presented<br />

Group guiding mission experiment.<br />

The perception of the team is on top right corner



14:50–14:55 WeCT1.5<br />

Towards a Platform-Independent Cooperative<br />

HRI System: II. Perception, Execution and<br />

Imitation of Goal Directed Actions<br />

S. Lallée, U. Pattacini, JD. Boucher, S. Lemaignan, A. Lenz, C.<br />

Melhuish, L. Natale, S. Skachek, K. Hamann, J. Steinwender, EA<br />

Sisbot, G. Metta, R. Alami, M. Warnier, J. Guitton, F. Warneken,<br />

PF. Dominey<br />

• Action perceptual & motor<br />

representations linked within<br />

the same datastructure<br />

• Actions can be recognized,<br />

performed and used in<br />

Shared Plan in cooperation<br />

with a human<br />

• Robotic platform<br />

independence is guaranteed<br />

through the use of the<br />

EgoSphere abstraction layer<br />

15:00–15:15 WeCT1.7<br />

Improvement of Speaker Localization<br />

by Considering Multipath Interference of Sound Wave<br />

for Binaural Robot Audition<br />

Ui-Hyun Kim, Takeshi Mizumoto,<br />

Tetsuya Ogata, and Hiroshi G. Okuno<br />

Department of Intelligence Science and Technology,<br />

Graduate School of Informatics, Kyoto University, Japan<br />

• This paper presents an improved speaker<br />

localization method based on the GCC-PHAT<br />

method for binaural robot audition.<br />

• We propose a new time delay factor for<br />

the GCC-PHAT method to compensate<br />

multipath interference due to diffraction of<br />

the sound wave after assuming that the<br />

robot head is spherical.<br />

• Experiments conducted in the SIG-2<br />

humanoid robot show that the proposed<br />

method reduces localization errors, relative to<br />

conventional DOA estimation, by 17.8 degrees<br />

on average and by over 35 degrees in<br />

side directions.<br />

Multipath interference with spherical-head assumption<br />

RMSE of experimental results in speaker localization<br />

14:55–15:00 WeCT1.6<br />

Listening for people: Exploiting the spectral<br />

structure of speech to robustly perceive<br />

the presence of people<br />

Barbara Hilsenbeck<br />

Faculty of Mechanical and Mechatronic Engineering<br />

Karlsruhe University of Applied Sciences, Karlsruhe, Germany<br />

Nathan Kirchner<br />

Centre for Autonomous Systems, University of Technology, Sydney, Australia<br />

• As the desire to see robots<br />

ubiquitous in society grows, so<br />

does the need to provide robots<br />

with the means of building<br />

awareness of any humans with<br />

which they may be sharing the<br />

environment<br />

• This paper presents a system,<br />

suitable for real-world use, which<br />

enables robots to robustly<br />

perceive the presence of<br />

people acoustically<br />

15:15–15:30 WeCT1.8<br />

Robot Audition & Beat Identification in Noisy<br />

Environments<br />

David K. Grunberg, Daniel M. Lofaro, Paul Oh, and Y. E. Kim<br />

College of Engineering, Drexel University, USA<br />

• Vision: use beat tracking algorithms for<br />

enabling the Hubo robot to participate in<br />

musical ensembles.<br />

• Musical participation requires the ability to<br />

determine auditory beat locations despite<br />

acoustic noise.<br />

• We explore the use of adaptive filtering to<br />

eliminate acoustic noise for greater<br />

musical accuracy.<br />


The Hubo robot


Session WeCT2 Continental Parlor 2 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Pose Estimation & Visual Tracking<br />

Chair Evangelos Papadopoulos, National Tech. Univ. of Athens<br />

Co-Chair Wim Meeussen, Willow Garage inc.<br />

14:00–14:15 WeCT2.1<br />

Determination of Rigid-Body Pose from<br />

Imprecise Point Position Measurements<br />

Anastasia Tegopoulou and Evangelos Papadopoulos<br />

Department of Mechanical Engineering,<br />

National Technical University of Athens, Athens, Greece<br />

• Determine rigid-body position and<br />

orientation from the position of a number<br />

of its points<br />

• The novel proposed method (NM) uses<br />

normal vectors, by-passing frame<br />

definitions<br />

• The NM takes advantage of point<br />

interconnections and is computationally<br />

efficient<br />

• The NM yields reliable results both with<br />

absolute and relative position sensors<br />

and both with low and high noise levels<br />

Two consecutive time instants for<br />

three point position measurements<br />

during planar motion<br />

14:30–14:45 WeCT2.3<br />

Simultaneous Localization and Capture<br />

with Velocity Information<br />

Qilong Yuan, I-Ming Chen<br />

School of Mechanical and Aerospace Engineering,<br />

Nanyang Technological University, Singapore<br />

• This paper introduces a method to<br />

monitor and capture the velocity,<br />

location, and dynamic behavior of a<br />

human with a self motion capture suit.<br />

• Velocity from lower limb kinematics<br />

and the integration of accelerations<br />

are fused through Kalman Filters to<br />

achieve smooth, accurate and drift-free<br />

estimation results.<br />

• Capture of spatial motion for dynamic<br />

behaviors like running and jumping is<br />

realized.<br />

Spatial Motion for Jumping<br />
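The fusion step can be illustrated with a scalar Kalman filter (invented noise figures, not the authors' suit model): integrated acceleration drives the prediction, and the drift-free kinematic velocity corrects it:<br />

```python
import numpy as np

def fuse_velocity(accel, vel_meas, dt, q=0.5, r=0.05):
    """Scalar Kalman filter: predict velocity by integrating acceleration
    (accurate short-term, drifts long-term), then correct it with a
    kinematics-derived velocity measurement (noisy but drift-free)."""
    v, P = 0.0, 1.0
    estimates = []
    for a, z in zip(accel, vel_meas):
        v += a * dt                # predict by integrating acceleration
        P += q * dt                # process noise inflates uncertainty
        K = P / (P + r)            # Kalman gain
        v += K * (z - v)           # correct with kinematic velocity
        P *= 1.0 - K
        estimates.append(v)
    return np.array(estimates)

# True velocity is a constant 1 m/s; the accelerometer carries a
# 0.2 m/s^2 bias (pure integration drifts), the kinematic velocity
# measurement is unbiased but noisy.
rng = np.random.default_rng(3)
dt, n = 0.01, 500
accel = 0.2 + rng.normal(0.0, 0.1, n)
vel_meas = 1.0 + rng.normal(0.0, 0.1, n)
v_hat = fuse_velocity(accel, vel_meas, dt)
```

Despite the accelerometer bias, the fused estimate settles near the true 1 m/s.<br />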

14:15–14:30 WeCT2.2<br />

Pose Estimation from a Single Image using Tensor<br />

Decompositions and an Algebra of Circulants<br />

Randy C. Hoover<br />

Dept. of Electrical and Computer Engineering, SDSM&T, USA<br />

Karen S. Braman<br />

Dept. of Mathematics and Computer Science, SDSM&T, USA<br />

Ning Hao<br />

Dept. of Mathematics, Tufts, USA<br />

• New tensor SVD described using an<br />

algebra of circulants<br />

• Experimental results applied to<br />

image analysis and interpretation<br />

• Able to reconstruct image set<br />

information using a much lower<br />

subspace dimension<br />

• Accurate pose estimation for one<br />

dimensionally correlated sequence<br />

using a single t-eigenimage<br />

• Analyze image sequences containing<br />

multiple degrees of correlations<br />

(multi-modal analysis)<br />


14:45–14:50 WeCT2.4<br />

Outlet detection and pose estimation for robot<br />

continuous operation<br />

Victor Eruhimov<br />

Itseez, Russia<br />

Wim Meeussen<br />

Willow Garage, USA<br />

• Outlet detection algorithm uses a single<br />

monocular camera<br />

• It locates 6 holes of a 2x1 outlet structure<br />

• A new pose estimation algorithm finds<br />

the 3D coordinates of the outlet holes with<br />

sub-millimeter accuracy<br />

• It uses an image from a monocular<br />

camera and a wall normal orientation<br />

obtained from a stereo camera<br />

• The algorithm shows no false positives on<br />

a test set of 500+ outlet images<br />

• The plug-in function enabled continuous<br />

PR2 operation lasting 13 days



14:50–14:55 WeCT2.5<br />

Robust Tracking of Human Hand Postures<br />

for Robot Teaching<br />

Jonathan Maycock, Jan Steffan,<br />

Robert Haschke, and Helge Ritter<br />

Neuroinformatics, Bielefeld University, Germany<br />

• Three methods for real-time tracking<br />

of hands are presented<br />

• A dataglove approach and two<br />

VICON approaches (one direct<br />

calculation and inverse kinematics<br />

approach)<br />

• Qualitative and quantitative results are<br />

presented<br />

• The inverse kinematics approach was<br />

found to perform best for both<br />

power and precision grasps<br />

15:00–15:15 WeCT2.7<br />

Visual Tracking of Robots in Uncalibrated<br />

Environments<br />

Hesheng Wang and Weidong Chen<br />

Department of Automation, Shanghai Jiao Tong University, China<br />

Zhongli Wang<br />

Department of Automation, Beijing Jiao Tong University, China<br />

• Proposed a new adaptive controller for<br />

trajectory tracking of a number of feature<br />

points on a robot manipulator<br />

• Developed an adaptive algorithm to<br />

estimate the unknown parameters on-line<br />

• Proved asymptotic convergence of<br />

tracking errors by Lyapunov method and<br />

verified performance by experiments<br />

Visual tracking system<br />

14:55–15:00 WeCT2.6<br />

Visual Tracking Using the<br />

Sum of Conditional Variance<br />

Rogerio Richa, Raphael Sznitman, Russell Taylor, Gregory Hager<br />

Laboratory for Computational Science and Robotics (LCSR)<br />

Johns Hopkins University, Baltimore MD, USA<br />

• The SCV is a similarity metric used<br />

in the medical imaging domain for<br />

multi-modal image registration<br />

• Applied to visual tracking, the SCV<br />

is invariant to non-linear<br />

illumination variations and is<br />

computationally inexpensive<br />

• Compared to other similarity<br />

metrics, it requires significantly fewer<br />

iterations for convergence and has<br />

a larger convergence radius<br />
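The SCV itself is cheap to compute: quantize one patch, take the conditional mean of the other within each intensity bin, and sum the residual squared error. A minimal sketch on synthetic patches (an illustration of the metric, not the LCSR code):<br />

```python
import numpy as np

def sum_conditional_variance(template, current, bins=16):
    """SCV between two equally sized patches with intensities in [0, 1]:
    predict the template from the current frame via the conditional
    expectation E[template | current-intensity bin], then sum the
    squared residuals."""
    t = template.ravel().astype(float)
    labels = np.minimum((current.ravel() * bins).astype(int), bins - 1)
    expected = np.zeros_like(t)
    for b in range(bins):
        mask = labels == b
        if mask.any():
            expected[mask] = t[mask].mean()   # conditional mean per bin
    return float(np.sum((t - expected) ** 2))

# A nonlinear illumination change yields a much smaller SCV than the
# plain sum of squared differences on the same pair of patches.
rng = np.random.default_rng(2)
img = rng.random((32, 32))
remapped = 1.0 - img ** 2          # simulated illumination variation
scv = sum_conditional_variance(img, remapped)
ssd = float(np.sum((img - remapped) ** 2))
```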

15:15–15:30 WeCT2.8<br />

Hybrid Discriminative Visual Object Tracking<br />

with Confidence Fusion for Robotics<br />

Applications<br />

Ren C. Luo, Ching-Chung Kao, and Yen-Chang Wu<br />

Electrical Engineering Department, National Taiwan University, Taiwan<br />

• In this paper, we propose a hybrid visual tracking algorithm that<br />

combines two discriminative trackers<br />

• The two trackers collectively determine the new object location via a<br />

process called confidence fusion<br />

• One component tracker is aimed at discriminating the target from the<br />

background at the pixel level, while the other does so at the patch level<br />

• The resulting hybrid tracker is potentially more powerful, since the two<br />

component trackers complement each other's discriminative ability<br />



Session WeCT3 Continental Parlor 3 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Robot Safety<br />

Chair Richard Voyles, Univ. of Denver<br />

Co-Chair Irene Sardellitti, Italian Inst. of Tech.<br />

14:00–14:15 WeCT3.1<br />

Towards safe human-robot interaction in robotic<br />

cells: an approach based on visual tracking and<br />

intention estimation<br />

Luca Bascetta, Gianni Ferretti and Paolo Rocco<br />

Dipartimento di Elettronica e Informazione, Politecnico di Milano, Italia<br />

Håkan Ardö<br />

Department of Mathematics, Lund University, Sweden<br />

Herman Bruyninckx, Eric Demeester and Enrico Di Lello<br />

Department of Mechanical Engineering, Katholieke Universiteit Leuven, Belgium<br />

• Human-robot interaction requires the<br />

elimination of the safety fences<br />

• A new functionality of the safety<br />

controller – to detect and track the<br />

humans in the cell and to estimate their<br />

intentions – is required<br />

• The overall goal is to predict in the least<br />

possible time which area in the vicinity<br />

of the robot the human is heading to<br />

14:30–14:45 WeCT3.3<br />

Relaxing the Inevitable Collision State Concept to Address<br />

Provably Safe Mobile Robot Navigation with<br />

Limited Field-of-View in Unknown Dynamic Environments<br />

Sara Bouraine 1 , Thierry Fraichard 2 and Salhi Hassen 3<br />

1 CDTA , Algeria, 2 INRIA, France, 3 Blida University, Algeria<br />

• How to guarantee motion safety in partially known dynamic environments?<br />

• Absolute motion safety is impossible to guarantee<br />

• Settle for weaker motion safety: passive motion safety (PMS) = if a<br />

collision must take place, the robot is at rest<br />

• PMS obtained via Braking ICS, a relaxation of regular ICS (Inevitable<br />

Collision State)<br />

• Braking ICS checking algorithm designed and used to obtain provably<br />

passively safe navigation<br />

14:15–14:30 WeCT3.2<br />

Guaranteed Safe Online<br />

Learning of a Bounded System<br />

Jeremy Gillula<br />

Computer Science Dept., Stanford, USA<br />

Claire Tomlin<br />

EECS Dept., UC Berkeley, USA<br />

• Many machine learning algorithms are<br />

limited to situations where safety is not<br />

critical, since they lack guarantees about<br />

their stability and robustness<br />

• This can be solved by augmenting<br />

learning algorithms with reachability<br />

analysis, a general technique for making<br />

safety guarantees<br />

• We apply these ideas to a scenario in<br />

which an aerial robot is attempting to learn<br />

the dynamics of a ground vehicle<br />

• The resulting simulations show that by<br />

combining these two paradigms, one can<br />

create robotic systems that feature both<br />

high performance and guaranteed safety<br />


Simulated example application:<br />

tracking a ground target using an<br />

aerial vehicle.<br />

14:45–14:50 WeCT3.4<br />

Capacitive Skin Sensors<br />

for Robot Impact Monitoring<br />

Samson Phan, Zhan Fan Quek, Preyas Shah, Zubair Ahmed, and<br />

Mark Cutkosky<br />

Mechanical Engineering, Stanford University, USA<br />

Dongjun Shin and Oussama Khatib<br />

Computer Science, Stanford University, USA<br />

• Capacitive sensors are embedded in an<br />

energy-absorbing artificial skin<br />

• Sensor scanning provides the history of<br />

a collision, including force magnitudes<br />

and contact locations<br />

• Impact testing demonstrates ability to<br />

discern effects of velocity, collision<br />

angle, contact stiffness, etc.<br />

Test setup and sequence of sensor<br />

readings for a sliding contact


Session WeCT3 Continental Parlor 3 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Robot Safety<br />

Chair Richard Voyles, Univ. of Denver<br />

Co-Chair Irene Sardellitti, Italian Inst. of Tech.<br />

14:50–14:55 WeCT3.5<br />

Variable Radius Pulley Design Methodology for<br />

Pneumatic Artificial Muscle-based Antagonistic<br />

Actuation Systems<br />

Dongjun Shin 1 , Xiyang Yeh 2 , Oussama Khatib 1<br />

1 Artificial Intelligence Laboratory, Stanford University, USA<br />

2 Mechanical Engineering, Stanford University, USA<br />

• Analysis of the limitations of<br />

conventional circular pulleys<br />

• A new design methodology for<br />

variable radius pulleys<br />

• Experimental comparisons to<br />

show performance improvement in<br />

position control in the enlarged<br />

workspace.<br />

Variable Radius Pulleys in Pneumatic<br />

Muscle-driven 1-DOF Testbed<br />

15:00–15:15 WeCT3.7<br />

Online Data-Driven Fault Detection<br />

for Robotic Systems<br />

Raphael Golombek and Sebastian Wrede<br />

CoR-Lab, Bielefeld University, Germany<br />

Marc Hanheide<br />

School of Computer Science, University of Birmingham, UK<br />

Martin Heckmann<br />

Honda Research Institute Europe, Offenbach, Germany<br />

• Fault detection applicable for online as<br />

well as a-posteriori analysis<br />

• Exploits inter-component<br />

communication in order to model<br />

normal system behaviour<br />

• Directly applicable to systems built upon a<br />

data-driven or event-based robotics<br />

middle-ware concept<br />

• Flexible design allows to adapt to<br />

systems which do not fit the<br />

aforementioned architectural concepts<br />

Conceptual Overview of the<br />

Data-Driven Fault Detection<br />

Approach<br />
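A minimal sketch of the underlying idea, assuming a simple Gaussian timing model of inter-component messages; the paper's middleware-level model of normal system behaviour is richer than this.

```python
# Hedged sketch of data-driven fault detection on inter-component
# communication: learn a Gaussian model of message inter-arrival times
# during normal operation, then flag large deviations.
import statistics

def fit_normal_model(intervals):
    """Fit mean and standard deviation of nominal inter-arrival times."""
    return statistics.mean(intervals), statistics.stdev(intervals)

def is_fault(interval, model, k=3.0):
    """Flag intervals more than k standard deviations from normal behaviour."""
    mu, sigma = model
    return abs(interval - mu) > k * sigma

normal = [0.10, 0.11, 0.09, 0.10, 0.12, 0.10, 0.11, 0.09]
model = fit_normal_model(normal)
print(is_fault(0.10, model))   # False: nominal timing
print(is_fault(0.60, model))   # True: component likely stalled
```

The same thresholding applies online or a posteriori over recorded event logs.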

14:55–15:00 WeCT3.6<br />

Fast Computation of Wheel-Soil Interactions for<br />

Safe and Efficient Operation of Mobile Robots<br />

Zhenzhong Jia, William Smith and Huei Peng<br />

Ground Robotics Reliability Center (GRRC), University of Michigan, USA<br />

Department of Mechanical Engineering, University of Michigan, USA<br />

• Fast analytical models of wheel-soil<br />

interactions are important for mobile<br />

robots in off-road environments.<br />

• An accurate, fully-functional, closed-form<br />

wheel-terrain interaction model was<br />

developed by quadratic approximation of<br />

the stress distributions.<br />

• A non-iterative method was derived to<br />

estimate the entry angle when given the<br />

normal wheel load.<br />

• Mobility prediction and real-time vehicle<br />

control become possible by coupling the<br />

wheel-soil interaction model and the entry<br />

angle estimator.<br />
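The principle that wheel forces follow from integrating stress distributions over the contact patch can be checked numerically. The stress profile and parameters below are placeholders, not the paper's closed-form model.

```python
# Illustrative numeric check of the idea behind closed-form terramechanics
# models: wheel forces come from integrating stress distributions over the
# contact patch. The stress shape and all parameters are placeholders.
import math

R, B = 0.15, 0.1          # wheel radius and width (m), assumed
THETA_ENTRY = 0.5         # entry angle (rad), assumed

def sigma(theta, sigma_max=2.0e4):
    """Toy normal-stress profile peaking mid-patch (Pa)."""
    return sigma_max * math.sin(math.pi * theta / THETA_ENTRY)

def vertical_load(n=1000):
    """W = R*B * integral of sigma(theta)*cos(theta) dtheta (trapezoid rule)."""
    h = THETA_ENTRY / n
    total = 0.0
    for i in range(n + 1):
        t = i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * sigma(t) * math.cos(t)
    return R * B * h * total

print(vertical_load())    # vertical load in newtons
```

Replacing the numeric quadrature with a quadratic approximation of the stress profile is what makes a closed-form, non-iterative solution possible.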


Determine forces/torques by<br />

integrating the stress distributions<br />

15:15–15:30 WeCT3.8<br />

Containment indicator function construction via<br />

numerical conformal mapping<br />

Shuo Han and Richard M. Murray<br />

California Institute of Technology, Pasadena, USA<br />

• Numerical conformal mapping is used to<br />

obtain a smooth containment indicator<br />

function for optimal motion planning<br />

• The procedure can be approximated by<br />

solving a convex optimization problem<br />

• Several different 2D shapes have been<br />

tested<br />

• Can also be extended to modeling<br />

obstacles without much difficulty


Session WeCT4 Continental Ballroom 4 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Haptic Feedback and System Evaluation<br />

Chair Marcia O'Malley, Rice Univ.<br />

Co-Chair Yasuyoshi Yokokohji, Kobe Univ.<br />

14:00–14:15 WeCT4.1<br />

The sigma.7 haptic interface for MiroSurge:<br />

A new bi-manual surgical console<br />

Andreas Tobergte¹, Patrick Helmer², Ulrich Hagn¹,<br />

Patrice Rouiller², Sophie Thielmann¹, Sebastien Grange²,<br />

Alin Albu-Schäffer¹, Francois Conti², and Gerd Hirzinger¹<br />

German Aerospace Center (DLR) ¹<br />

Force Dimension²<br />

• Design of the haptic devices and the surgical console<br />

• Kinematics and dynamics<br />

• Control with force/torque sensor<br />

• Ergonomics of the console<br />

14:30–14:45 WeCT4.3<br />

Measuring an Operator’s Maneuverability<br />

Performance in the Haptic Teleoperation of<br />

Multiple Robots<br />

Hyoung Il Son 1 , Lewis L. Chuang 1 , Antonio Franchi 1 , Junsuk Kim 2 ,<br />

Dongjun Lee 3 , Seong-Whan Lee 2 , Heinrich H. Bülthoff 1,2 ,<br />

and Paolo Robuffo Giordano 1<br />

1 Max Planck Institute for Biological Cybernetics, Germany;<br />

2 DBCE, Korea University, Korea; 3 DMAE, Seoul National University, Korea<br />

• Maneuverability measure to assess the ease<br />

of the operator in maneuvering the remote<br />

multi-robot slave<br />

• Three haptic cues to perceive<br />

the state of the remote UAVs and their<br />

surrounding environments:<br />

1. Velocity cue<br />

2. Force cue<br />

3. Velocity+Force cue<br />

Haptic Device<br />

• Evaluation by using H 2 norm and<br />

bandwidth of the maneuverability measure<br />

• Results indicate that less control effort<br />

is required in achieving the same level of<br />

tracking accuracy with Force cue<br />

Block diagram labels: velocity command, tele-operator's maneuverability, haptic feedback, JND, multiple UAVs<br />

14:15–14:30 WeCT4.2<br />

Haptic coupling with augmented feedback between<br />

two KUKA LWR’s and the PR2 robot arms<br />

Koen Buys, Steven Bellens, Wilm Decre,<br />

Ruben Smits, Enea Scioni, Tinne De Laet,<br />

Joris De Schutter, and Herman Bruyninckx<br />

Department Mechanical Eng., KU Leuven, Belgium<br />

• Coupling between Orocos and ROS<br />

robotic frameworks<br />

• Haptic feedback setup with non-uniform<br />

robot coupling<br />

• Coupling of head movement<br />

• Augmented visual feedback in VR goggles<br />

• Quantitative force measurements in<br />

experiments<br />

14:45–14:50 WeCT4.4<br />

Bilateral Physical Interaction with a<br />

Robot Manipulator through a<br />

Weighted Combination of Flow Fields<br />

Antonio Pistillo, Sylvain Calinon and Darwin G. Caldwell<br />

Department of Advanced Robotics, Istituto Italiano di Tecnologia, Italy<br />

• Tasks demonstrated by kinesthetic<br />

teaching technique and encoded<br />

with a combination of local flow<br />

fields.<br />

• The weighting mechanism modifies<br />

standard GMM weights to be<br />

independent from each other.<br />

• Safe Physical Interaction between<br />

the user and the robot is achieved.<br />

• The robot is used as a bi-directional<br />

tangible interface in order to<br />

influence its actions.<br />

• The user can start, stop, pause,<br />

resume and select desired robot<br />

movements by physically guiding it<br />

during task executions.<br />




14:50–14:55 WeCT4.5<br />

Motion Control of a Semi-Mobile Haptic Interface<br />

for Extended Range Telepresence<br />

Antonia Pérez Arias and Uwe D. Hanebeck<br />

Intelligent Sensor-Actuator-Systems Laboratory (ISAS),<br />

Karlsruhe Institute of Technology (KIT), Germany<br />

• Prepositioning algorithm for haptic<br />

exploration of spatially unrestricted<br />

target environments<br />

• Key idea: Consideration of user’s<br />

direction of motion<br />

• Benefits:<br />

• Optimal utilization of available<br />

space in user environment<br />

• Robustness against noisy<br />

measurements of user position<br />

• Experimental validation under<br />

real-world conditions<br />

Semi-mobile haptic interface and<br />

operator in user environment.<br />

15:00–15:15 WeCT4.7<br />

Assistance or Challenge?<br />

Filling a Gap in User-Cooperative Control<br />

Georg Rauter 1 , Roland Sigrist 1 , Laura Marchal-Crespo 1 ,<br />

Heike Vallery 1,2 , Robert Riener 1 , and Peter Wolf 1<br />

1 SMS-Lab, Institute of Robotics and Intelligent Systems (IRIS), ETH Zurich &<br />

Medical Faculty, University of Zurich, Switzerland<br />

2 Biomedical Engineering, Khalifa University of Science, Technology &<br />

Research (KUSTAR), Abu Dhabi, UAE<br />

• Haptic feedback might be effective if the right amount of guidance or<br />

challenge is provided<br />

• We developed a user-cooperative controller that can be continuously<br />

modified from supportive to challenging mode<br />

• In a pilot study, the controller proved its applicability for motor learning<br />

studies<br />

Rowing simulator: Haptic device consists of a tendon-based parallel robot<br />

14:55–15:00 WeCT4.6<br />

An Objective Index that Substitutes for Subjective<br />

Quality of Vibrotactile Material-Like Textures<br />

Shogo Okamoto and Yoji Yamada,<br />

Dept. of Mechanical Science and Engineering, Nagoya University, Japan<br />

• For the delivery of tactile contents through the Internet, lossy data<br />

compression of vibrotactile textures is expected.<br />

• An objective index that evaluates the quality deterioration of lossy-compressed<br />

textures is necessary as an alternative to psychological<br />

experiments.<br />

• This study developed such<br />

an index based on the<br />

amplitude spectrum of<br />

vibrotactile stimuli.<br />

• The index was applied to<br />

wood and sandpaper<br />

textures and accounted for<br />

approximately 80% of the<br />

subjective quality indices.<br />


Figure: the developed objective index relates quality indices of compressed vibrotactile material-like textures to subjective indices acquired by psychological experiments<br />

15:15–15:30 WeCT4.8<br />

Design and Characterization of the ReHapticKnob, a<br />

Robot for Assessment and Therapy of Hand Function<br />

Jean-Claude Metzger, Olivier Lambercy,<br />

Dominique Chapuis, Roger Gassert<br />

Rehabilitation Engineering Laboratory, ETH Zurich, Switzerland<br />

• End-effector rehabilitation robot with<br />

unique sensing and actuation capabilities<br />

• 2 active DOF for training of hand<br />

opening/closing (88N) and forearm<br />

rotation (0.98Nm) tasks<br />

• Developed for precise and objective<br />

assessments of hand function and for high<br />

fidelity dynamic interaction<br />

• Preliminary grasping results with healthy<br />

subjects


Session WeCT5 Continental Ballroom 5 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Symbolic Approaches to Motion Planning and Control<br />

Chair Hadas Kress-Gazit, Cornell Univ.<br />

Co-Chair Calin Belta, Boston Univ.<br />

14:00–14:15 WeCT5.1<br />

LTL-Based Decentralized Supervisory Control of<br />

Multi-Robot Tasks Modelled as Petri Nets<br />

Bruno Lacerda and Pedro U. Lima<br />

Institute for Systems and Robotics, Instituto Superior Técnico, Portugal<br />

• Methodology to enforce coordination rules<br />

for multi-robot systems<br />

• Each robot is modelled as a Petri net and<br />

coordination rules are written in LTL<br />

• Decentralized approach, uses the LTL<br />

specifications to define the communication<br />

requirements between the robots<br />

• Greatly improves scalability when<br />

compared to centralized approach<br />

The inputs of the methodology:<br />

A Petri net model of a robot and<br />

an LTL formula for the robot to fulfill<br />

14:30–14:45 WeCT5.3<br />

Foundations of Formal Language for Humans<br />

and Artificial Systems Based on Intrinsic<br />

Structures in Spatial Behaviors<br />

Zhaodan Kong and Bernie Mettler<br />

Department of Aerospace Engineering and Mechanics<br />

University of Minnesota, USA<br />

• Interactions between vehicle<br />

dynamics and environment result in<br />

patterns in spatial behavior.<br />

• Preliminary results studying these<br />

patterns show that behavior exhibits<br />

well defined hierarchic structure.<br />

• Results can be used to create a<br />

formal language, which can be used<br />

to mitigate complexity in planning and<br />

representation.<br />

• Gained insight will also help<br />

understand spatial intelligence and<br />

help design adaptive guidance<br />

algorithms.<br />

Optimal Trajectories of Raven<br />

UAV in Downtown San Diego<br />

14:15–14:30 WeCT5.2<br />

Optimal Multi-Robot Path Planning with Temporal<br />

Logic Constraints<br />

Alphan Ulusoy<br />

Systems Eng., Boston University, USA<br />

Stephen L. Smith<br />

Electrical and Computer Eng., University of Waterloo, Canada<br />

Xu Chu Ding and Calin Belta<br />

Systems Eng., Boston University, USA<br />

Daniela Rus<br />

EECS, Massachusetts Institute of Technology, USA<br />

• A method for planning optimal paths for a<br />

group of robots that satisfy a common<br />

mission.<br />

• The mission is given as a Linear Temporal<br />

Logic formula.<br />

• In addition, an optimizing proposition must<br />

repeatedly be satisfied.<br />

• The goal is to minimize the maximum time<br />

between the satisfying instances of the<br />

optimizing proposition.<br />

• Method utilizes a timed automaton<br />

representation to capture asynchronous<br />

robot motions.<br />
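The optimization criterion can be illustrated in a reduced setting. Under the simplifying assumption that the cycle visits a single state satisfying the optimizing proposition (so the maximum inter-satisfaction time equals the cycle length), the problem reduces to the shortest cycle through such a state; the paper's timed-automaton product construction handles the general multi-robot case. The graph below is invented.

```python
# Hedged sketch of the optimization target only: on a weighted transition
# graph, the max time between visits to the optimizing proposition along a
# cycle through a single pi-state equals that cycle's length, so we search
# the shortest cycle through each pi-state.
import heapq

def dijkstra(graph, src):
    """Single-source shortest path distances on graph[u] = [(v, weight), ...]."""
    dist = {v: float('inf') for v in graph}
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

def best_cycle_cost(graph, pi_states):
    """Cheapest cycle passing through any proposition-satisfying state."""
    best = float('inf')
    for s in pi_states:
        dist = dijkstra(graph, s)
        for u in graph:                 # close the cycle via edges back into s
            for v, w in graph[u]:
                if v == s and dist[u] + w < best:
                    best = dist[u] + w
    return best

graph = {'a': [('b', 1)], 'b': [('c', 2)], 'c': [('a', 1), ('b', 5)]}
print(best_cycle_cost(graph, pi_states={'a'}))   # 4 (cycle a->b->c->a)
```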


Simulated team trajectories<br />

14:45–14:50 WeCT5.4<br />

Minimalist Multiple Target Tracking Using<br />

Directional Sensor Beams<br />

Leonardo Bobadilla, Oscar Sanchez, Justin Czarnowski<br />

Steven M. LaValle<br />

Department of Computer Science, University of Illinois, USA<br />

• Multiple-agent tracking is a fundamental<br />

problem of sensor networks with broad<br />

applications.<br />

• We propose simple and efficient<br />

algorithms for reconstructing paths of<br />

agents using weak detection sensors.<br />

• Designed and implemented a low-cost,<br />

low energy consumption hardware<br />

architecture to demonstrate the practicality<br />

of the proposed approach.<br />

a) Two randomly moving bodies<br />

b) Ground truth paths<br />

c) Reconstructed paths<br />

based on our algorithm



14:50–14:55 WeCT5.5<br />

Temporal Logic Control in Dynamic Environments<br />

with Probabilistic Satisfaction Guarantees<br />

Ana Medina Ayala, Sean Andersson and Calin Belta<br />

Mechanical Engineering, Boston University, United States<br />

• A robot with noisy sensors and<br />

actuators, moving in a dynamic<br />

environment, is considered.<br />

• The task is given as a specification in<br />

Probabilistic Computation Tree Logic<br />

(PCTL).<br />

• Algorithms for synthesizing a control<br />

policy to maximize the probability of<br />

satisfaction are derived.<br />

• Different levels of knowledge of<br />

environment dynamics are<br />

considered.<br />
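The core computation behind maximizing a probabilistic satisfaction guarantee can be sketched as value iteration for maximum reachability probability on a small MDP. The model below is invented; the paper's synthesis covers richer PCTL specifications and partially known environment dynamics.

```python
# Illustrative value iteration for maximizing the probability of reaching a
# goal set in an MDP -- the basic computation underlying PCTL control
# synthesis. The model and numbers are made up.
def max_reach_prob(mdp, goal, iters=100):
    """mdp[s][a] = list of (prob, next_state); returns max reach probabilities."""
    v = {s: (1.0 if s in goal else 0.0) for s in mdp}
    for _ in range(iters):
        for s in mdp:
            if s in goal:
                continue
            v[s] = max(
                sum(p * v[t] for p, t in outcomes)
                for outcomes in mdp[s].values()
            )
    return v

mdp = {
    's0': {'safe': [(0.9, 'goal'), (0.1, 'trap')],
           'fast': [(0.5, 'goal'), (0.5, 'trap')]},
    'goal': {},
    'trap': {'stay': [(1.0, 'trap')]},
}
print(max_reach_prob(mdp, goal={'goal'})['s0'])   # 0.9: action 'safe' is best
```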

15:00–15:15 WeCT5.7<br />

High-Level Control of Modular Robots<br />

Sebastian Castro, Sarah Koehler and Hadas Kress-Gazit<br />

Sibley School of Mechanical Engineering, Cornell University, USA<br />

• Generation of provably correct<br />

controllers from structured English<br />

specifications with reconfigurable<br />

modular robots.<br />

• Utilizes the LTLMoP framework with the<br />

addition of reconfiguration via descriptors<br />

called traits.<br />

• A configuration-gait-trait library<br />

automatically selects robot morphologies<br />

given a set of active traits.<br />

• Demonstrated through the CKBot<br />

platform and a 3D simulator built on a<br />

physics engine.<br />

Sample Workspace and Task<br />

Execution with the CKBot platform.<br />

14:55–15:00 WeCT5.6<br />

Computing Unions of Inevitable Collision States<br />

and Increasing Safety to Unexpected Obstacles<br />

Daniel Althoff 1 , Christoph N. Brand 1 ,<br />

Dirk Wollherr 1,2 and Martin Buss 1<br />

Institute of Automatic Control Engineering, TUM, Germany 1<br />

Institute for Advanced Study, TUM, Germany 2<br />

• Problem of computing unions of Inevitable<br />

Collision State (ICS) sets is addressed<br />

• Sequential computation of ICS is shown<br />

• Two novel ICS-Checker algorithms are<br />

presented allowing efficient computation<br />

• Concept of robot Maneuverability is introduced,<br />

reducing collision probability to unexpected<br />

obstacles<br />

• Simulation results validate these concepts<br />


Top: Union of single ICS sets<br />

Bottom: ICS set of union of<br />

obstacles<br />

15:15–15:30 WeCT5.8<br />

Synthesis of Feedback Controllers for Multiple<br />

Aerial Robots with Geometric Constraints<br />

Nora Ayanian, Vinutha Kallem and Vijay Kumar<br />

Department of Mechanical Engineering and Applied Mechanics<br />

University of Pennsylvania, USA<br />

• Algorithm for multi-robot navigation<br />

with convergence and safety<br />

guarantees<br />

• Kinematic and dynamic agents<br />

• Maintain communication and avoid<br />

collisions<br />

• Simultaneously decompose joint<br />

configuration space into polytopes<br />

and plan a discrete path using A*<br />

• Sequentially deploy navigation-function<br />

based local controllers on<br />

polytopes to reach goal<br />

• Implementations on teams of<br />

ground robots and quadrotors<br />

A quadrotor (red) and two ground robots<br />

(blue, green) deploy at left to cover an<br />

intersection in an urban environment<br />

while maintaining communication.


Session WeCT6 Continental Ballroom 6 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Symposium: Marine Robotics: Platforms and Applications<br />

Chair Ryan Smith, Queensland Univ. of Tech.<br />

Co-Chair Gianluca Antonelli, Univ. degli Studi di Cassino<br />

14:00–14:15 WeCT6.1*<br />

Semi-Plenary Invited Talk: Wiring an Interactive Ocean<br />

John Delaney, University of Washington<br />

14:30–14:45 WeCT6.3<br />

Towards Mixed-initiative, Multi-robot Field<br />

Experiments: Design, Deployment, and Lessons<br />

Learned<br />

Jnaneshwar Das, Gaurav S. Sukhatme<br />

Department of Computer Science, University of Southern California, USA<br />

Thom Maughan, Mike McCann, Mike Godin, Tom O’Reilly,<br />

Monique Messié, Fred Bahr, Kevin Gomes, Frédéric Py,<br />

James G. Bellingham, Kanna Rajan<br />

Monterey Bay Aquarium Research Institute, USA<br />

• Description of a month-long multi-institute<br />

field experiment with multiple<br />

heterogeneous robotic platforms.<br />

• Design, development, and use of an<br />

Oceanographic Decision Support System<br />

(ODSS) during the experiment.<br />

• Lessons learned and discussion of future<br />

roadmap.<br />

14:15–14:30 WeCT6.2<br />

Semi-Plenary Invited Talk: Wiring an Interactive Ocean<br />

John Delaney, University of Washington<br />

14:45–14:50 WeCT6.4<br />

Current-Sensitive Path Planning for an<br />

Underactuated Free-floating Ocean Sensorweb<br />

Kristen P. Dahl<br />

California Institute of Technology, USA<br />

David R. Thompson, David McLaren, Yi Chao, and Steve Chien<br />

Jet Propulsion Laboratory, California Institute of Technology, USA<br />

• This work investigates multiagent path<br />

planning in strong, dynamic currents using<br />

highly underactuated vehicles.<br />

• The Argo global network consists of over<br />

3000 ocean-observing floats able to<br />

control only depth.<br />

• Accurate current forecasts present the<br />

possibility of intentionally controlling floats’<br />

motion.<br />

• Simulations tested this method’s potential<br />

to improve coverage or direct floats to a<br />

desired location.<br />


A simulation in which five floats<br />

use current-driven planning to<br />

improve coverage.<br />

(yellow: non-adaptive;<br />

blue: adaptive)



14:50–14:55 WeCT6.5<br />

Toward Risk Aware Mission Planning for<br />

Autonomous Underwater Vehicles<br />

Arvind A. Pereira, Jonathan Binney, and Gaurav S. Sukhatme<br />

Department of Computer Science, University of Southern California, USA<br />

Burton H. Jones and Matthew Ragan<br />

Department of Biological Sciences, University of Southern California, USA<br />

• Creation of probabilistic occupancy<br />

maps from AIS data<br />

• Plan minimum risk paths for AUVs<br />

that avoid risky regions<br />

• Paths planned enforce AUV<br />

resurfacing within a maximum<br />

distance from previous waypoint<br />

• Significant surface collision risk<br />

reduction compared to shortest path<br />

and other methods<br />
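A stripped-down version of minimum-risk planning is Dijkstra search over a grid of per-cell risk values. The risk numbers below are toy values; the paper's planner additionally derives risk from AIS-based occupancy maps and enforces the resurfacing constraint.

```python
# Sketch of the planning idea only: minimum-risk grid search, where each
# visited cell adds its risk to the path cost. Risk values are invented.
import heapq

def min_risk_path(risk, start, goal):
    """Dijkstra over a grid; returns (path, accumulated risk)."""
    rows, cols = len(risk), len(risk[0])
    dist = {start: risk[start[0]][start[1]]}
    pq = [(dist[start], start)]
    prev = {}
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + risk[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    return [start] + path[::-1], dist[goal]

risk = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]          # high-risk shipping lane down the middle column
path, total = min_risk_path(risk, (0, 0), (0, 2))
print(total)                # 7: the path detours around the risky column
```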

15:00–15:15 WeCT6.7<br />

Obstacle Detection from Overhead Imagery<br />

using Self-Supervised Learning for<br />

Autonomous Surface Vehicles<br />

Hordur K. Heidarsson<br />

Department of Electrical Engineering, University of Southern California, USA<br />

Gaurav S. Sukhatme<br />

Department of Computer Science, University of Southern California, USA<br />

• Technique for generating an<br />

obstacle map from sonar data<br />

and overhead imagery<br />

• Pipeline consisting of<br />

• Feature extraction<br />

• AdaBoost classifier<br />

• MRF smoothing<br />

• Combining images from<br />

different times<br />

• Resulting map has 3 classes<br />

• Obstacles<br />

• Transient/suspect obstacles<br />

• Free space<br />

14:55–15:00 WeCT6.6<br />

AUV Docking on a Moving Submarine Using a<br />

K-R Navigation Function<br />

P. B. Sujit and Joao Sousa<br />

Department of Electrical and Computer Eng., University of Porto, Portugal<br />

Anthony J. Healey<br />

Naval Postgraduate School, Monterey, CA, USA<br />

• The objective is to design a controller for<br />

the AUV to dock onto a moving submarine<br />

• Developed a Navigation function approach<br />

that integrates path planning with collision<br />

avoidance<br />

• Modeled no-fly zones around stern to<br />

protect propulsor and mitigate against<br />

disturbances<br />

• Other no-fly zones included around the<br />

dock to control approach headings<br />

• Gain must be selected appropriately to<br />

avoid local minima<br />

• Simulation results show the desirability of<br />

the approach from different directions<br />
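The gain-selection remark can be illustrated with a navigation function in the Koditschek-Rimon style on an invented 2-D geometry: for a sufficiently large gain, following the negative gradient reaches the goal while skirting a circular no-fly zone. This is a sketch of the general construction, not the paper's AUV/dock controller.

```python
# Hedged sketch of a Koditschek-Rimon-style navigation function, illustrating
# that the gain kappa must be chosen to avoid spurious local minima. The
# goal/obstacle geometry is invented.
import math

GOAL = (0.0, 0.0)
OBS, OBS_R = (1.0, 0.0), 0.3        # one circular no-fly zone (assumed)

def nav(q, kappa):
    """phi = d_goal^2 / (d_goal^(2*kappa) + beta)^(1/kappa)."""
    dg2 = (q[0] - GOAL[0]) ** 2 + (q[1] - GOAL[1]) ** 2
    beta = (q[0] - OBS[0]) ** 2 + (q[1] - OBS[1]) ** 2 - OBS_R ** 2
    if beta <= 0.0:
        return 1.0                  # on or inside the obstacle: potential maximum
    return dg2 / (dg2 ** kappa + beta) ** (1.0 / kappa)

def descend(q, kappa, step=0.01, iters=5000, h=1e-4):
    """Follow the normalized negative gradient of the navigation function."""
    for _ in range(iters):
        gx = (nav((q[0] + h, q[1]), kappa) - nav((q[0] - h, q[1]), kappa)) / (2 * h)
        gy = (nav((q[0], q[1] + h), kappa) - nav((q[0], q[1] - h), kappa)) / (2 * h)
        n = math.hypot(gx, gy)
        if n < 1e-12:
            break
        q = (q[0] - step * gx / n, q[1] - step * gy / n)
    return q

q = descend((2.0, 0.1), kappa=4.0)
print(math.hypot(*q) < 0.1)         # the gradient flow reaches the goal
```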


AUV docking on a moving<br />

submarine from stern direction<br />

15:15–15:30 WeCT6.8<br />

Observability metric for the relative localization of AUVs<br />

based on range and depth measurements:<br />

theory and experiments<br />

Filippo Arrichiello and Gianluca Antonelli<br />

DAEIMI, University of Cassino, Italy<br />

Antonio Pedro Aguiar and Antonio Pascoal<br />

IST, Technical University Lisbon, Portugal<br />

• The work addresses the problem of<br />

observability of the relative motion of two<br />

AUVs equipped with velocity, range and<br />

depth sensors<br />

• We exploited nonlinear observability<br />

concepts to analyze some types of<br />

relative AUV motions<br />

• We derive a specific observability index<br />

(metric) and study its dependence on the<br />

types of motion commanded to the<br />

vehicles.<br />

• The results derived are validated<br />

experimentally in an equivalent single-<br />

beacon navigation scenario<br />

Autonomous surface vessel used<br />

for the experimental validation


Session WeCT7 Continental Parlor 7 Wednesday, September 28, <strong>2011</strong>, 14:00–15:30<br />

Humanoid Control<br />

Chair Eiichi Yoshida, National Inst. of AIST<br />

Co-Chair Alessandro De Luca, Univ. di Roma "La Sapienza"<br />

14:00–14:15 WeCT7.1<br />

Logic Programming with Simulation-based Temporal<br />

Projection for Everyday Robot Object Manipulation<br />

Lars Kunze, Mihai E. Dolha, Michael Beetz<br />

Intelligent Autonomous Systems<br />

Technische Universitaet Muenchen<br />

Germany<br />

• Robots must decide on appropriate action<br />

parameterizations based on the expected<br />

consequences of their actions<br />

• Logic programming with simulation-based<br />

temporal projection can predict the<br />

consequences of manipulation actions<br />

qualitatively<br />

• Hence, robots can employ these methods<br />

to infer action parameterizations that lead<br />

to desired outcomes<br />

Logical proof-tree of simulation-based<br />

temporal projections<br />

14:30–14:45 WeCT7.3<br />

Switching Multiple LQG Controllers<br />

Based on Bellman's Optimality Principle<br />

Norikazu Sugimoto 1 and Jun Morimoto 2<br />

1 National Institute of Communication Telecommunication, Kyoto, Japan<br />

2 ATR Computational Neuroscience Labs, Kyoto, Japan<br />

State estimator<br />

$\hat{x}_i(t+1 \mid t) = A_i \hat{x}_i(t) + B_i u(t) + c_i$<br />

$\hat{x}_i(t+1) = \hat{x}_i(t+1 \mid t) + K_i \left( y(t) - H_i \hat{x}_i(t+1 \mid t) \right)$<br />

Value function estimator<br />

$V_i(x(t)) = -\tfrac{1}{2} \left( \hat{x}_i(t) - x_i^v \right)^{T} P_i \left( \hat{x}_i(t) - x_i^v \right)$<br />

$0 = A_i^T P_i A_i - P_i - A_i^T P_i B_i R^{-1} B_i^T P_i A_i + Q$<br />

Cost function<br />

$r(t) = \tfrac{1}{2} x(t)^T Q x(t) + \tfrac{1}{2} u(t)^T R u(t)$<br />

Weighting signal predictor<br />

$\lambda_i(t) = p(x(t), r(t) \mid i) = \tfrac{1}{Z} \exp\!\left( -\tfrac{\delta_i(t)^2}{2\sigma^2} \right)$<br />

$\delta_i(t) = r(t) + V_i(x(t+1)) - V_i(x(t))$<br />

Deviation from Bellman's optimality principle<br />
Deviation from Bellman’s optimality principle<br />

14:15–14:30 WeCT7.2<br />

Experimental Evaluation of a Trajectory/Force<br />

Tracking Controller for a Humanoid Robot<br />

Cleaning a Vertical Surface<br />

Fuyuki Sato, Tatsuya Nishii, Jun Takahashi,<br />

Yuki Yoshida, Masaru Mitsuhashi and Dragomir Nenchev<br />

Mechanical Systems Engineering, Tokyo City University, Japan<br />

• The force tracking controller is based on<br />

CoM ground projection and ZMP position<br />

feedback<br />

• The hand trajectory is tracked via a PD<br />

controller<br />

• The trajectory/force tracking controller is<br />

implemented under a mixed<br />

position/torque control mode<br />

• The performance of the controller is<br />

experimentally evaluated with a miniature<br />

humanoid robot HOAP-2 cleaning a<br />

whiteboard under a prerecorded<br />

motion/force pattern<br />


A HOAP-2 humanoid robot<br />

cleaning a whiteboard<br />

14:45–14:50 WeCT7.4<br />

Adaptive Predictive Gaze Control<br />

of a Redundant Humanoid Robot Head<br />

Giulio Milighetti and Luca Vallone<br />

Fraunhofer Institute of Optronics, System Technologies and Image Exploitation,<br />

Germany<br />

Alessandro De Luca<br />

Dipartimento di Informatica e Sistemistica, Univ. di Roma “La Sapienza”, Italy<br />

• General concept for gaze control of<br />

redundant humanoid robot heads<br />

• Prediction of target next state by adaptive<br />

Kalman filter<br />

• Combination of proportional feedback and<br />

feedforward term for trajectory tracking<br />

control<br />

• Gain adaptation for improvement of<br />

system dynamics<br />

• Exploitation of inverse kinematics by<br />

means of weighted pseudoinverse<br />

• Addition of self-motions for human-like<br />

head postures<br />

ARMAR-III Head tracking a cup
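The predictive part of the approach can be sketched with a minimal constant-velocity Kalman filter in scalar form; the constant-velocity model is standard, but the noise values here are illustrative rather than the authors' adaptive tuning.

```python
# Minimal constant-velocity Kalman filter for target prediction (scalar
# position). The process/measurement noise levels q and r are illustrative.
def kalman_track(zs, dt=0.1, q=1e-3, r=0.05):
    """Filter position measurements zs; return the one-step-ahead prediction."""
    x, v = zs[0], 0.0                       # state: position and velocity
    P = [[1.0, 0.0], [0.0, 1.0]]            # state covariance
    for z in zs[1:]:
        # predict with F = [[1, dt], [0, 1]] and diagonal process noise q
        x, v = x + v * dt, v
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        # update with the position measurement z (H = [1, 0])
        s = p00 + r
        k0, k1 = p00 / s, p10 / s
        y = z - x
        x, v = x + k0 * y, v + k1 * y
        P = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return x + v * dt                       # predicted next gaze target

zs = [0.1 * i for i in range(20)]           # target at 1 m/s, sampled at 10 Hz
print(abs(kalman_track(zs) - 2.0) < 0.05)   # prediction is close to 2.0
```

Feeding the predicted target position to the gaze controller as a feedforward term is what lets tracking keep up with a moving target.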



14:50–14:55 WeCT7.5<br />

Dynamic Whole-Body Mobile Manipulation with<br />

a Torque Controlled Humanoid Robot via<br />

Impedance Control Laws<br />

Alexander Dietrich, Thomas Wimböck, and Alin Albu-Schäffer<br />

German Aerospace Center (DLR), Germany<br />

• Cartesian impedance is utilized for task<br />

execution<br />

• Incorporation of reactive subtasks like<br />

collision avoidance, singularity avoidance,<br />

mechanical end stops, null space damping<br />

• Experimental validation of the proposed<br />

whole-body control<br />

• A video demonstrates the performance of<br />

our approach<br />

Torque Controlled Humanoid<br />

Rollin’ Justin<br />

15:00–15:15 WeCT7.7<br />

A Computational Approach for Push Recovery in<br />

case of Multiple Non coplanar Contacts<br />

Darine Mansour and Alain Micaelli<br />

CEA, France<br />

Pierre Lemerle<br />

INRS, France<br />

• A push recovery approach based on a<br />

humanoid simple model is proposed<br />

• A dynamic stability indicator<br />

predicting whether a perturbed system<br />

can be stabilized is generated<br />

• An algorithm stabilizing the system<br />

under fixed contacts when feasible,<br />

is developed<br />

• An algorithm stabilizing the system<br />

through an optimal contact change<br />

when needed, is developed<br />

[1] T. Bretl and S. Lall, "Testing Static Equilibrium for Legged<br />

Robots," IEEE Transactions on Robotics, Aug. 2008<br />

Simplified Model of a humanoid<br />

The model consists of a point<br />

mass at the COM and n non-coplanar<br />

contact surfaces. It is an<br />

extension of the model in [1]<br />

14:55–15:00 WeCT7.6<br />

Gait Pattern Generation and Stabilization for<br />

Humanoid Robot Based on Coupled Oscillators<br />

Inyong Ha, Yusuke Tamura, and Hajime Asama<br />

Department of Precision Engineering, The University of Tokyo, Japan<br />

• Design of gait pattern generator<br />

based on coupled oscillator model<br />

divided into movement oscillator<br />

and balance oscillator<br />

• Walking stabilizer by feedback<br />

controller using oscillator<br />

parameters and sensor data<br />

• Tested on open humanoid robot<br />

platform DARwIn-OP for the<br />

evaluation<br />
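The coupled-oscillator scheme summarized above can be illustrated with a generic example. This is a textbook phase-oscillator sketch with illustrative parameters, not the authors' movement/balance oscillator design: two phase oscillators coupled through a sinusoidal term lock onto a desired phase offset.

```python
import math

def step(phi1, phi2, omega, K, offset, dt):
    """One Euler step of two phase oscillators whose coupling drives
    the phase difference phi2 - phi1 toward the desired offset."""
    d1 = omega + K * math.sin(phi2 - phi1 - offset)
    d2 = omega + K * math.sin(phi1 - phi2 + offset)
    return phi1 + d1 * dt, phi2 + d2 * dt

# Start with an arbitrary phase difference and integrate.
phi1, phi2 = 0.0, 0.3
offset = math.pi / 2          # desired phase lag between the oscillators
for _ in range(4000):
    phi1, phi2 = step(phi1, phi2, omega=2 * math.pi, K=2.0, offset=offset, dt=0.005)

# Wrap the phase error into (-pi, pi]; it is near zero once the pair locks.
err = (phi2 - phi1 - offset + math.pi) % (2 * math.pi) - math.pi
```

The error dynamics reduce to de/dt = -2K sin(e), so any initial offset in (-π, π) decays to zero, which is what makes such oscillator pairs attractive for gait generation.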

15:15–15:30 WeCT7.8<br />

Stretched knee walking with novel inverse<br />

kinematics for humanoid robots<br />

Przemyslaw Krycza<br />

Graduate School of Science and Engineering, Waseda University, Japan<br />

Kenji Hashimoto, Hideki Kondo and Aiman Omer<br />

Faculty of Science and Engineering, Waseda University, Japan<br />

Hunk-ok Lim<br />

Faculty of Engineering, Kanagawa University and Humanoid Research<br />

Institute, Waseda University, Japan<br />

Atsuo Takanishi<br />

Department of Modern Mechanical Engineering and Humanoid Research<br />

Institute, Waseda University, Japan<br />

• Structure modeled with highly redundant<br />

serial kinematic chains<br />

• Inverse kinematics (IK) problem solved<br />

including multiple tasks and their relative<br />

priorities<br />

• Human-like walk with heel-contact and<br />

toe-off as well as nearly stretched-knee<br />

phases<br />

• Contains IK equation descriptions,<br />

simulations and on-machine experiments<br />

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

WABIAN-2R


Session WeCT8 Continental Parlor 8 Wednesday, September 28, 2011, 14:00–15:30<br />

Multirobot Planning<br />

Chair Noel E. Du Toit, California Inst. of Tech.<br />

Co-Chair Spring Berman, Harvard Univ.<br />

14:00–14:15 WeCT8.1<br />

Multiple Agent Coordination for Stochastic<br />

Target Interception Using MILP<br />

Daniel J. Stilwell and Apoorva Shende<br />

ECE Department, Virginia Tech, USA<br />

Matthew J. Bays<br />

ME Department, Virginia Tech, USA<br />

• Multi-agent coordination control for<br />

multiple moving target interception<br />

formulated as a MILP.<br />

• Binary variables in MILP used to indicate<br />

interception.<br />

• Gaussian uncertainty exploited to<br />

formulate chance interception<br />

constraints. Cost adaptable to the<br />

specific target interception problem.<br />

• Proposed stochastic MILP shown to be<br />

no more computationally expensive than<br />

the corresponding deterministic MILP.<br />

• Suitable for multi-agent systems well<br />

approximated by Gaussian uncertainty.<br />
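The Gaussian chance-constraint idea can be sketched generically: a probabilistic requirement is commonly converted into a deterministic one by tightening the bound with the inverse normal CDF. The function below is a hypothetical illustration of that conversion, not the paper's exact MILP formulation.

```python
from statistics import NormalDist

def tightened_radius(r, sigma, p):
    """Conservative deterministic radius enforcing the 1-D chance
    constraint P(|agent - target| <= r) >= p when the target position
    is Gaussian with standard deviation sigma (one-sided form)."""
    return r - sigma * NormalDist().inv_cdf(p)

# With p = 0.5 the normal quantile is 0, so no tightening occurs.
no_margin = tightened_radius(5.0, 1.0, 0.5)
# Higher confidence shrinks the admissible interception region.
tight = tightened_radius(5.0, 1.0, 0.95)
```

In a MILP this tightened radius would appear in the linear constraints activated by the binary interception variables, which is why the stochastic program stays no harder than the deterministic one.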

14:30–14:45 WeCT8.3<br />

A Reciprocal Sampling Algorithm for<br />

Lightweight Distributed Multi-Robot Localization<br />

Amanda Prorok and Alcherio Martinoli<br />

Distributed Intelligent Systems and Algorithms Laboratory<br />

EPFL, Switzerland<br />

• Fully decentralized, real-time particle filter<br />

algorithm accommodating realistic robotic<br />

assumptions<br />

• Probabilistic robot detection model based<br />

on relative range and bearing measures<br />

• Novel collaborative reciprocal sampling<br />

method allows a drastic reduction in the<br />

number of particles needed for localization<br />

• Experimental validation on a team of 4<br />

real Khepera III robots equipped with IR-based<br />

range and bearing modules<br />

Multi-robot detection model<br />

14:15–14:30 WeCT8.2<br />

Using Minimal Communication to Improve<br />

Decentralized Conflict Resolution<br />

for Non-holonomic Vehicles<br />

Athanasios Krontiris and Kostas E. Bekris<br />

Computer Science and Engineering Department<br />

University of Nevada, Reno, USA<br />

• This work considers the problem of<br />

decentralized coordination between<br />

multiple non-holonomic vehicles, each<br />

navigating to a specified goal<br />

• Two automata are integrated<br />

− Lower level automaton: guarantees<br />

collision avoidance.<br />

− Higher level automaton: Updates<br />

desired direction based on a dynamic<br />

priority scheme<br />

• Minimalistic local communication is<br />

employed to avoid livelocks<br />

• Favorable experimental comparison<br />


Simulated aircraft in different modes as<br />

specified by the proposed method,<br />

and their desired direction of motion.<br />

The method allows one system at a time<br />

to make progress to its goal.<br />

14:45–14:50 WeCT8.4<br />

Localization Using Ambiguous<br />

Bearings from Radio Signal Strength<br />

Jason Derenick, Jonathan Fink, and Vijay Kumar<br />

GRASP Laboratory, University of Pennsylvania, USA<br />

• Explores the problem of localizing robots<br />

using ambiguous bearing estimates<br />

(π-periodic) drawn from a robot’s anisotropic,<br />

but symmetric, radiation profile.<br />

• Leveraging these bearings, odometry, and<br />

a compass, a MH-EKF is formulated that<br />

exploits motion to disambiguate state and<br />

recover formation scale.<br />

• Theoretical analysis shows that despite<br />

the combinatorial nature of the problem,<br />

only 2 initial hypotheses are needed.<br />

• Experimental results using mobile robots<br />

with Zigbee transceivers, and simulation<br />

results for a larger system demonstrate<br />

the feasibility of the approach.
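The motion-based disambiguation above can be illustrated with a toy example. The sketch assumes a known range to the beacon (the paper instead recovers formation scale from motion) and simply keeps the position hypothesis whose predicted π-folded bearing best matches a fresh measurement; all names and values are hypothetical.

```python
import math

def fold(angle):
    """Fold an angle into [-pi/2, pi/2): a pi-periodic bearing cannot
    distinguish theta from theta + pi."""
    return (angle + math.pi / 2) % math.pi - math.pi / 2

def pick_hypothesis(beacon_hyps, robot_pose, measured):
    """Choose the beacon-position hypothesis whose predicted (folded)
    bearing best matches a new pi-periodic measurement."""
    def residual(beacon):
        pred = math.atan2(beacon[1] - robot_pose[1], beacon[0] - robot_pose[0])
        return abs(fold(pred - measured))
    return min(beacon_hyps, key=residual)

# Initial measurement from the origin at range 5: theta or theta + pi.
r, theta = 5.0, 0.0
hyp_a = (r * math.cos(theta), r * math.sin(theta))      # beacon at (5, 0)
hyp_b = (-r * math.cos(theta), -r * math.sin(theta))    # beacon at (-5, 0)

# After moving to (0, 2) the folded bearings of the two hypotheses differ,
# so a single motion step disambiguates this toy case.
true_beacon = hyp_a
new_pose = (0.0, 2.0)
z = fold(math.atan2(true_beacon[1] - new_pose[1], true_beacon[0] - new_pose[0]))
best = pick_hypothesis([hyp_a, hyp_b], new_pose, z)
```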



14:50–14:55 WeCT8.5<br />


15:00–15:15 WeCT8.7<br />

Efficient and Complete<br />

Centralized Multi-Robot Path Planning<br />

Ryan Luna and Kostas E. Bekris<br />

Computer Science and Engineering, University of Nevada, Reno, USA<br />

• Approach uses intuitive single-robot primitives<br />

to efficiently compute multi-robot solutions<br />

• Computes a sequential set of paths to solve<br />

the path-planning problem<br />

• Algorithm is complete for all instances with<br />

two or more empty vertices<br />

• Method is fast and scalable to problems with<br />

hundreds of robots<br />

• Solution quality is highly competitive with<br />

modern coupled and decoupled methods<br />

• No dependence on tunable parameters or<br />

discretization assumptions<br />

Illustration of the Push and Swap primitives<br />

14:55–15:00 WeCT8.6<br />

M*: A complete Multirobot Path Planning<br />

Algorithm with Performance Bounds<br />

Glenn Wagner and Howie Choset<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• Subdimensional expansion<br />

dynamically generates a<br />

low-dimensional search space<br />

embedded in the full<br />

configuration space<br />

• M* is an implementation of<br />

subdimensional expansion for<br />

graph representations of the<br />

configuration space<br />

• M* and its variants are proven to<br />

be complete and either cost<br />

optimal or bounded suboptimal<br />


Plots of the percentage of trials in<br />

which each algorithm found a path<br />

within 5 minutes, and the 10th (red),<br />

50th (blue) and 90th (green)<br />

percentiles of times required to<br />

find a solution<br />

15:15–15:30 WeCT8.8<br />

ARMO: Adaptive Road Map Optimization for<br />

Large Robot Teams<br />

Alexander Kleiner, Dali Sun and Daniel Meyer-Delius<br />

Dept. of Computer Science, University of Freiburg, Germany<br />

• Autonomous robot teams in intralogistics<br />

dispatching transportation tasks<br />

• Multi-Robot Path Planning Problem<br />

• Automatic adaptation of the road map<br />

when the map or station load has changed<br />

• Hidden Markov Model for detecting<br />

changes in the map<br />

• Integer Linear Programming for<br />

optimizing the road map configuration


Session WeCT9 Continental Parlor 9 Wednesday, September 28, 2011, 14:00–15:30<br />

Visual & Multi-Sensor Calibration<br />

Chair Jonathan Kelly, USC<br />

Co-Chair Oliver Birbach, DFKI<br />

14:00–14:15 WeCT9.1<br />

Extrinsic Calibration of<br />

a Single Line Scanning Lidar and a Camera<br />

Kiho Kwak, Daniel F. Huber, Hernan Badino, and Takeo Kanade<br />

• We propose a robust-weighted<br />

extrinsic calibration algorithm<br />

that is easy to implement and<br />

has a small calibration error.<br />

• The extrinsic parameters are<br />

estimated by minimizing the<br />

distance error between<br />

corresponding features in the<br />

image.<br />

• The alignment accuracy of our<br />

method is twice that of the<br />

compared state-of-the-art<br />

algorithms.<br />

Carnegie Mellon University, USA<br />

14:30–14:45 WeCT9.3<br />

Optimization Based IMU Camera Calibration<br />

Elmar Mair and Oliver Ruepp and Darius Burschka<br />

Department of Informatics, Technische Universität München, Germany<br />

Michael Fleps and Michael Suppa<br />

Institute for Robotics and Mechatronics, German Aerospace Center, Germany<br />

• Spatial alignment estimation of IMU and<br />

camera by a nonlinear batch-optimization<br />

• Modeling of the sensors’ trajectories by B-splines<br />

• No approximations or sequential data<br />

processing as in Kalman filters<br />

• Experiments on simulated and real data<br />

show that the approach compares<br />

favorably to conventional methods<br />

14:15–14:30 WeCT9.2<br />

A Novel 2.5D Pattern for Extrinsic Calibration of<br />

ToF and Camera Fusion System<br />

Jiyoung Jung, Yekeun Jeong, Jaesik Park, Hyowon Ha,<br />

and In-So Kweon<br />

Department of Electrical Engineering, KAIST, South Korea<br />

James Dokyoon Kim<br />

Samsung Advanced Institute of Technology, South Korea<br />

• We present an extrinsic calibration<br />

method to estimate the pose of a color<br />

camera with respect to a 3D Time-of-Flight<br />

camera.<br />

• We use a 2.5D pattern so that the correct<br />

correspondences are obtained for both<br />

color and ToF cameras.<br />

• We refine the intrinsic parameters of the<br />

ToF camera to model its projection as a<br />

pinhole camera model.<br />

• A depth constraint restricting the depth<br />

measurement to lie on the pattern plane is<br />

incorporated into the LM optimization<br />

together with the reprojection errors.<br />


The 2.5D pattern and the checkerboard<br />

in color and ToF amplitude images<br />

A comparison of 3D rendering results<br />

14:45–14:50 WeCT9.4<br />

Rapid Development of Manifold-Based Graph Optimization<br />

Systems for Multi-Sensor Calibration and SLAM<br />

René Wagner, Oliver Birbach<br />

Safe and Secure Cognitive Systems, DFKI, Germany<br />

Udo Frese<br />

Safe and Secure Cognitive Systems, DFKI, Germany<br />

Real-Time Computer Vision Group, University of Bremen, Germany<br />

• The Manifold Toolkit for Matlab (MTKM)<br />

brings state-of-the-art generic graph<br />

optimization to Matlab.<br />

• Using MTKM, even complex multi-sensor<br />

cross-calibration problems can be solved<br />

with very little code.<br />

• Get MTKM and all presented calibration<br />

and SLAM examples as open source from<br />

http://openslam.org/MTK.html<br />

• Build upon MTKM to get your calibration<br />

tasks done quickly and experiment with<br />

SLAM in a convenient Matlab<br />

environment.<br />

MTKM applications:<br />

Hand-eye+stereo+IMU calibration on a<br />

humanoid service robot, vertical<br />

stereo calibration, 3D pose<br />

adjustment, RGB-D+accelerometer<br />

calibration, 2D pose adjustment,<br />

2D feature SLAM.



14:50–14:55 WeCT9.5<br />

Two-Phase Online Calibration for Infrared-based<br />

Inter-Robot Positioning Modules<br />

Sven Gowal, Amanda Prorok and Alcherio Martinoli<br />

DISAL, EPFL, Switzerland<br />

• A novel two-phase online calibration method<br />

is presented.<br />

• The combination of a Least Mean Square<br />

algorithm and Gaussian Process<br />

Regression allows accurate estimation for<br />

robots with limited computational power.<br />

• Online and on-board estimation is possible.<br />

• Experiments are carried out on 4 Khepera<br />

III robots equipped with an IR-based range<br />

and bearing module.<br />

Example of the two-phase<br />

calibration procedure.<br />

15:00–15:15 WeCT9.7<br />

High-Accuracy Hand-Eye Calibration<br />

from Motion on Manifolds<br />

Federico Vicentini, Nicola Pedrocchi, Matteo Malosio<br />

and Lorenzo Molinari Tosatti<br />

National Research Council CNR-ITIA, Italy<br />

• New hand-eye calibration procedure<br />

based on generation of constant<br />

geometrical entities<br />

• Robot motion is programmed for tracing<br />

manifolds with good interpolation/fitting<br />

properties<br />

• Geometrical solution with real robot<br />

implementation, replacing the solution of<br />

the quadratic AX = XB equation<br />

• Better accuracy and robustness against<br />

noise in hand-eye measurement<br />

Generation of circular manifolds for<br />

hand-eye transform computation<br />

14:55–15:00 WeCT9.6<br />

Self Calibration of a Vision System Embedded in<br />

a Visual SLAM Framework<br />

Cyril Joly and Patrick Rives<br />

ARobAS Team, INRIA Sophia Antipolis-Méditerranée, France<br />

• Self-calibration of the 3D transformation<br />

between a camera and the odometry<br />

during visual SLAM<br />

• Observability analysis based on the<br />

study of global properties of the system<br />

• Parameter estimation can be computed<br />

when the radius of curvature of the<br />

trajectory changes<br />

• Implementation of a simultaneous SLAM<br />

and calibration approach based on the<br />

Smoothing and Mapping (SAM) algorithm<br />

• Validation on real data with a camera<br />

mounted with a non-negligible rotational<br />

offset<br />


The translation and rotation<br />

between the camera and the<br />

robot have to be estimated<br />

15:15–15:30 WeCT9.8<br />

A Nonlinear Observer Approach for<br />

Concurrent Estimation of Pose, IMU Bias and<br />

Camera-to-IMU Rotation<br />

Glauco Garcia Scandaroli¹, Pascal Morin¹ and Geraldo Silveira²<br />

¹ INRIA Sophia Antipolis-Méditerranée, France<br />

² CTI Renato Archer – DRVC Division, Brazil<br />

• Novel observer designed for pose<br />

(orientation and position), IMU bias and<br />

Camera-to-IMU rotation estimation.<br />

• Exponential stability is obtained under<br />

an adequate observability condition.<br />

• Gyroscopes can be used to detect whether<br />

the observability condition is satisfied.<br />

• Experimental results for an inertial-visual<br />

sensor directly exploiting the image<br />

intensities support the effectiveness of the<br />

method.


Session WeDT1 Continental Parlor 1 Wednesday, September 28, 2011, 16:00–17:30<br />

Human-Robot Collaboration<br />

Chair Tetsunari Inamura, National Inst. of Informatics<br />

Co-Chair Elizabeth Croft, Univ. of British Columbia<br />

16:00–16:15 WeDT1.1<br />

Enhanced Visual Scene Understanding<br />

through Human-Robot Dialog<br />

M. Johnson-Roberson, J. Bohg, B. Rasolzadeh and D. Kragic<br />

CVAP/CAS, KTH Stockholm, Sweden<br />

G. Skantze, J.Gustafson and Rolf Carlson<br />

TMH, KTH Stockholm, Sweden<br />

• Goal: Robot should correctly<br />

segment objects in the scene from<br />

each other and from the background<br />

• Initial object hypotheses from<br />

active segmentation approach<br />

• Disambiguation through human-robot<br />

interaction using an incremental<br />

dialog system<br />

• Quantitative experimental<br />

evaluation on real data shows<br />

improved scene segmentation after<br />

human-robot interaction<br />

Top: Point Cloud of Scene. Middle: Initial<br />

Labeling. Bottom: Resegmented Scene<br />

after Human-Robot Interaction<br />

16:30–16:45 WeDT1.3<br />

Towards Safe Physical Human-Robot<br />

Collaboration: A Projection-based Safety System<br />

Christian Vogel, Maik Poggendorf, Christoph Walter<br />

and Norbert Elkmann<br />

Fraunhofer Institute for Factory Operation and Automation IFF,<br />

Magdeburg, Germany<br />

• Based on conventional projector and<br />

camera equipment<br />

• Projection of safety-spaces directly into<br />

the environment (e.g. onto the floor)<br />

• Cameras reliably detect violations<br />

whenever projection rays are disrupted<br />

• Reduced influence of changing<br />

environmental light conditions<br />

• Intrinsically safe system<br />

• Dynamic change of shape, size, position<br />

• Display additional information<br />

• Visible/ non-visible light<br />

16:15–16:30 WeDT1.2<br />

Intention-Based Coordination and Interface<br />

Design for Human-Robot Cooperative Search<br />

Dan Xie, Yun Lin, Roderic Grupen and Allen Hanson<br />

Department of Computer Science,<br />

University of Massachusetts Amherst, USA<br />

• A multi-agent search scheme is presented<br />

that supports the recognition of activities<br />

and, thus, learning methods for<br />

cooperative human-robot interaction.<br />

• The robot updates a probability<br />

distribution function of the target object<br />

using the observations and the estimated<br />

state of human peers to compensate for<br />

human behaviors.<br />

• An implicit interface design that reduces<br />

the cognitive workload: the robot infers<br />

the intention of the user and provides<br />

assistance autonomously.<br />

16:45–16:50 WeDT1.4<br />

Experimental Investigation of Human-<br />

Robot Cooperative Carrying<br />

Chris A. C. Parker and Elizabeth Croft<br />

Mechanical Engineering, University of British Columbia, Canada<br />

• Human behavior in a human-robot<br />

cooperative carrying domain was<br />

investigated.<br />

• 19 non-expert human subjects<br />

participated in this study.<br />

• The importance of visual and haptic<br />

feedback to human users was<br />

examined.<br />

• Humans also were found to apply<br />

significant tension to cooperatively<br />

carried loads under certain conditions.<br />


Experimental Setup of our<br />

HRI Study



16:50–16:55 WeDT1.5<br />

A Human-Centered Approach to Robot Gesture<br />

Based Communication<br />

within Collaborative Working Processes<br />

T. Ende, S. Haddadin, S. Parusel, T. Wüsthoff, M. Hassenzahl, A. Albu-Schäffer<br />

Institute of Robotics and Mechatronics (DLR), Germany<br />

Folkwang University of the Arts Essen, Germany<br />

• Identification of relevant gestures from<br />

human observations for human-robot<br />

co-working tasks<br />

• Transfer of gestures to robotic systems<br />

of increasing abstraction<br />

• Evaluation of human recognition rate for<br />

isolated human and robotic gestures<br />

• Robotic gesture lexicon and machine<br />

• Complex use-case with single arm<br />

manipulators in a drug preparation<br />

scenario<br />

Transferring gestures from human observation<br />

to human-robot collaborative tasks<br />

17:00–17:15 WeDT1.7<br />

Motion Coaching with Emphatic Motions and<br />

Adverbial Expression for Humans by a Robot<br />

Keisuke Okuno<br />

Department of Informatics, SOKENDAI, Japan<br />

Tetsunari Inamura<br />

National Institute of Informatics / SOKENDAI, Japan<br />

• Propose a method to bind Emphatic<br />

Motions and Adverbial Expressions.<br />

• Propose a method to control degree of<br />

emphasis of synthesized motions and<br />

adverbial expressions.<br />

• Realize the methods by using a single<br />

parameter in a phase space.<br />

• Show the feasibility with a motion<br />

coaching system and experiments of<br />

actual sport coaching.<br />

The parameter α binds and controls<br />

both emphatic motions and adverbial<br />

expressions<br />

16:55–17:00 WeDT1.6<br />

Human Workflow Analysis using 3D Occupancy<br />

Grid Hand Tracking in a Human-Robot<br />

Collaboration<br />

Claus Lenz, Alice Sotzek, Thorsten Röder,<br />

Helmuth Radrich and Alois Knoll<br />

Robotics and Embedded Systems, Technische Universität München, Germany<br />

Markus Huber and Stefan Glasauer<br />

Center for Sensorimotor Research Department of Neurology,<br />

Ludwig-Maximilians-Universität München, Germany<br />

• HMM-based human workflow<br />

analysis of an assembly task<br />

jointly performed by human<br />

and robot<br />

• Training of models without<br />

influence of a robot<br />

• Evaluations to find task-relevant and<br />

set-up independent<br />

• Transfer of workflow analysis<br />

models to a robotic system<br />

using a new three-dimensional<br />

occupancy grid tracking<br />

approach to estimate the hand<br />

positions as input data<br />

17:15–17:30 WeDT1.8<br />

A system for interactive learning<br />

in dialogue with a tutor<br />

D. Skočaj, M. Kristan, A. Vrečko, M. Mahnič, M. Janíček,<br />

G.J. Kruijff, M. Hanheide, N. Hawes, T. Keller, M. Zillich, K. Zhou<br />

University of Ljubljana, DFKI Saarbrücken, University of Birmingham,<br />

Albert-Ludwigs-Universität Freiburg, Vienna University of Technology<br />

• Representations and mechanisms that<br />

facilitate continuous learning of visual<br />

concepts in dialogue with a tutor.<br />

• Implemented in a robotic system.<br />

• Beliefs about the world are<br />

• created by processing visual and<br />

linguistic information<br />

• used for planning system behaviour.<br />

• Different kinds of learning initiated by the<br />

human tutor or by the system itself:<br />

• Tutor-driven learning<br />

• Situated tutor-assisted learning<br />

• Non-situated tutor-assisted learning<br />


Scenario setup and<br />

system overview


Session WeDT2 Continental Parlor 2 Wednesday, September 28, 2011, 16:00–17:30<br />

Recognition & Prediction of Motion<br />

Chair Sylvain Calinon, Italian Inst. of Tech.<br />

Co-Chair Sachin Chitta, Willow Garage Inc.<br />

16:00–16:15 WeDT2.1<br />

Realtime Recognition of Complex Daily<br />

Activities Using Dynamic Bayesian Network<br />

Chun Zhu and Weihua Sheng<br />

School of Electrical and Computer Engineering<br />

Oklahoma State University, USA<br />

• Daily complex activities are recognized<br />

using wearable motion sensors and<br />

location information<br />

• Adaptive gesture spotting is proposed<br />

to segment gestures for different<br />

environments and body activities<br />

• A dynamic Bayesian network is<br />

developed to recognize body activities<br />

and hand gestures simultaneously<br />

• Test in a mock apartment shows good<br />

recognition results<br />

The hardware platform<br />

Dynamic Bayesian network for<br />

complex activity recognition<br />

16:30–16:45 WeDT2.3<br />

Movement Segmentation using a<br />

Primitive Library<br />

Franziska Meier¹, Evangelos Theodorou¹, Freek Stulp¹<br />

and Stefan Schaal¹,²<br />

¹ CLMC Lab, University of Southern California<br />

² Max-Planck-Institute for Intelligent Systems, Tuebingen<br />

• The segmentation problem is reduced<br />

to online sequential movement<br />

recognition.<br />

• Original Dynamic Movement Primitives<br />

are reformulated to take the form of a<br />

linear dynamical system.<br />

• Based on this new formulation an EM<br />

algorithm is developed to estimate the<br />

duration and goal of<br />

partially observed motion.<br />

• Results on two applications of this new formulation, online movement<br />

recognition and movement segmentation, are presented.<br />

16:15–16:30 WeDT2.2<br />

Motion Data Retrieval Based on Statistic Correlation<br />

between Motion Symbol Space and Language<br />

Seiya Hamano, Wataru Takano and Yoshihiko Nakamura<br />

Department of Mechano-Informatics, The University of Tokyo, Japan<br />

• A motion retrieval method using words,<br />

based on Canonical Correlation Analysis<br />

between motion features and word<br />

features, is proposed.<br />

• The motion feature is represented by the<br />

proto-symbol space and the word feature<br />

is represented by the binary feature of<br />

word labels.<br />

• This method enables motion retrieval that<br />

considers the similarity of motion patterns<br />

and the closeness of word meanings.<br />

16:45–16:50 WeDT2.4<br />

Encoding the temporal and spatial constraints<br />

of a task in explicit-duration HMM<br />

Sylvain Calinon, Antonio Pistillo and Darwin G. Caldwell<br />

Advanced Robotics Dept, Italian Institute of Technology (IIT), Genova, Italy<br />

• Aim: Recovery from perturbations in<br />

physical human-robot interaction (pHRI)<br />

• We consider scenarios where the user<br />

momentarily perturbs the task by physically<br />

holding the robot (the robot is not aware of<br />

the type and place of the perturbation).<br />

• Kinesthetic teaching is used to learn the<br />

task, with non-linear motion encoded as a<br />

weighted combination of linear systems.<br />

• We focus here on the design of a weighting<br />

mechanism that can handle various types<br />

of task constraints.<br />

• We review existing methods and introduce<br />

Hidden Semi-Markov Model (HSMM) as a<br />

way to encapsulate the importance of time<br />

and space constraints in a parametric form.<br />

• Two pHRI experiments show that different<br />

behaviors can be obtained depending on the<br />

chosen HSMM weighting schemes.<br />




16:50–16:55 WeDT2.5<br />

Behavior Prediction from Trajectories in a House<br />

by Estimating Transition Model Using Stay Points<br />

Taketoshi Mori and Hiroshi Noguchi<br />

Dept. of Life Support Technology (Molten), The Univ. of Tokyo, Japan<br />

Shoji Tominaga, Masamichi Shimosaka,<br />

Rui Fukui and Tomomasa Sato<br />

Dept. of Mechano-Informatics, The Univ. of Tokyo, Japan<br />

• Prediction by preceding activities<br />

before the target behaviors<br />

• Segmentation of trajectory data into<br />

staying or moving and classification<br />

of the segments<br />

• Mining time-series association rules<br />

from transition events of each<br />

segmented trajectory<br />

• Validated by experiments using<br />

approximately two years of trajectory data.<br />

(Upper) Room Layout<br />

(Lower) Example of Transition<br />
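Stay-point extraction of the kind used above is often done with a simple distance-and-duration rule. The sketch below is a generic illustration with illustrative thresholds, not the paper's exact segmentation procedure.

```python
def _dist(p, q):
    """Euclidean distance between two 2-D points."""
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def segment_stays(points, radius, min_len):
    """Label each point 'stay' or 'move': a run of consecutive points
    that all remain within `radius` of the run's first point and lasts
    at least `min_len` samples counts as a stay segment."""
    labels = ['move'] * len(points)
    i = 0
    while i < len(points):
        j = i
        while j < len(points) and _dist(points[i], points[j]) <= radius:
            j += 1
        if j - i >= min_len:
            labels[i:j] = ['stay'] * (j - i)
            i = j
        else:
            i += 1
    return labels

# A trajectory that lingers near the origin, then moves away.
traj = [(0.0, 0.0), (0.1, 0.0), (0.05, 0.1), (0.1, 0.1),
        (1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]
labels = segment_stays(traj, radius=0.5, min_len=3)
```

Transition events between the resulting stay segments are then the natural input for mining time-series association rules.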

17:00–17:15 WeDT2.7<br />

Trajectory Prediction of Spinning Ball<br />

for Ping-Pong Player Robot<br />

Yanlong Huang, De Xu, Min Tan and Hu Su<br />

State Key Laboratory of Intelligent Control and Management of Complex<br />

Systems, Institute of Automation, Chinese Academy of Sciences, China<br />

• Analytic flying model: the flying ball’s<br />

self-rotational velocity is estimated<br />

and sufficiently taken into account in<br />

the model<br />

• Rebound Model: the relation<br />

between the velocities before and<br />

after rebound<br />

• Trajectory Prediction: initial trajectory<br />

fitting, initial velocities calculation,<br />

landing position prediction, rebound<br />

velocities calculation, striking position<br />

prediction<br />

The trajectory prediction: predicted with<br />

and without angular velocity vs. the actual<br />

trajectory (axes in mm)<br />
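Leaving out the spin and drag that this paper explicitly models, the skeleton of such a predictor is ballistic flight plus a restitution rebound. A minimal sketch with illustrative coefficients:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def landing_time(z0, vz0):
    """Time until a drag-free ball launched from height z0 with vertical
    velocity vz0 reaches the table plane z = 0."""
    return (vz0 + math.sqrt(vz0 ** 2 + 2 * G * z0)) / G

def rebound(vx, vz, e=0.85, mu=0.8):
    """Toy rebound model: vertical velocity reversed and scaled by a
    restitution coefficient e, horizontal velocity damped by mu
    (both coefficients are illustrative placeholders)."""
    return mu * vx, -e * vz

t = landing_time(z0=0.3, vz0=2.0)         # flight time to the table
vx1, vz1 = rebound(vx=3.0, vz=2.0 - G * t)  # velocities after the bounce
```

Striking-point prediction then repeats the ballistic step with the post-rebound velocities; the paper's contribution is to fold the estimated self-rotation into both the flight and rebound models.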

16:55–17:00 WeDT2.6<br />

Estimation and Prediction of Multiple Flying Balls<br />

Using Probability Hypothesis Density Filtering<br />

Oliver Birbach and Udo Frese<br />

DFKI, Bremen, Germany<br />

• Position and velocity estimation of flying<br />

balls using the recently developed GM-PHD<br />

filter<br />

• Detected flying balls represented as peaks<br />

in a Gaussian mixture, propagated by a<br />

UKF over time<br />

• Prior of typically pitched balls integrated<br />

into the GM-PHD for trajectory starting and<br />

increased robustness<br />

• Used as tracking component in a robotic<br />

ball-catching application<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–152–<br />

PHD as projection into image and<br />

onto the floor.<br />

17:15–17:30 WeDT2.8<br />

Invariant Trajectory Indexing for Real Time 3D<br />

Motion Recognition<br />

Jianyu Yang, Y.F. Li and Keyi Wang<br />

Dept of MEEM, Joint Advanced Research Center of<br />

University of Science and Technology of China and<br />

City University of Hong Kong, Suzhou, China<br />

• A dynamic indexing method is proposed<br />

for representing, retrieving and<br />

recognizing motion trajectories in real time.<br />

• Trajectory is segmented into basic<br />

segments by classifying the points, and<br />

we use the labels of the basic segments<br />

to title the trajectory.<br />

• All trajectories in the database are titled<br />

to form an indexing database, and a query<br />

trajectory is retrieved and recognized by<br />

matching titles.<br />

• As title matching is simple, this method<br />

is effective for motion retrieval and<br />

real-time recognition.<br />

A model of segmented trajectory<br />

(a) and a real case (b).


Session WeDT3 Continental Parlor 3 Wednesday, September 28, 2011, 16:00–17:30<br />

Haptic Rendering & Object Recognition<br />

Chair M. Cenk Cavusoglu, Case Western Res. Univ.<br />

Co-Chair Masashi Konyo, Tohoku Univ.<br />

16:00–16:15 WeDT3.1<br />

A Comparison of Encoding Schemes for Haptic<br />

Object Recognition using a Biologically<br />

Plausible Spiking Neural Network<br />

Sivalogeswaran Ratnasingam and T.M. McGinnity<br />

Intelligent Systems Research Centre, School of Computing and Intelligent<br />

Systems, Magee campus, University of Ulster, United Kingdom<br />

• A biologically inspired spiking neural<br />

network based haptic object recognition<br />

system is proposed.<br />

• A number of different encoding schemes<br />

to convert the haptic measurements into<br />

spike trains are proposed.<br />

• The spiking neural network was trained<br />

using a supervised training approach.<br />

• During training, the firing thresholds of the<br />

hidden-layer neurons were modified.<br />

• A robot hand with three fingers that have<br />

the same number of degrees of freedom<br />

as the human fingers was used for testing<br />

the system.<br />

• Test results show that the system can<br />

recognise 100% of 7 different objects.<br />

Figure. Robot hand grasping a ball<br />

16:30–16:45 WeDT3.3<br />

Effect of Visuo-Haptic Co-location on<br />

3D Fitts’ Task Performance<br />

Michael J. Fu, Andrew D. Hershberger, Kumiko Sano,<br />

and M. Cenk Cavusoglu<br />

Electrical Engineering and Computer Science,<br />

Case Western Reserve University, USA<br />

• Goal was to characterize human<br />

performance of point-to-point reaching<br />

for physical, co-located/non-colocated<br />

virtual environment (VE), and rotated<br />

VE visualization conditions<br />

• Existing results on benefits of visual<br />

and haptic workspace co-location in VE<br />

are conflicting<br />

• Key finding was the significant<br />

decrease in end-point error for tasks<br />

performed in a co-located virtual<br />

environment display modality<br />

Experimental setup using<br />

Fishtank VE display<br />

16:15–16:30 WeDT3.2<br />

Study on lower back electrotactile stimulation<br />

characteristics for prosthetic sensory feedback<br />

Konstantinos Dermitzakis and Alejandro Hernandez-Arieta<br />

Artificial Intelligence Laboratory, University of Zurich, Switzerland<br />

Monika Seps<br />

Institute of Neuroinformatics, Swiss Federal Institute of Technology Zurich<br />

(ETHZ), Zurich, Switzerland<br />

• Prosthetic devices lack rich<br />

sensory feedback to the user’s body<br />

• Such sensory feedback is important for<br />

the integration of artificial limbs into the<br />

body image<br />

• The lower back is a comfortable, low<br />

cognitive load feedback area<br />

• This study details the basic characteristics<br />

of electrical stimulation to the lower back<br />

• The lower back is found to be a viable<br />

location for providing such feedback<br />

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–153–<br />

Moving sensation through<br />

electrical stimulation<br />

16:45–16:50 WeDT3.4<br />

Determining Object Geometry with<br />

Compliance and Simple Sensors<br />

Leif P. Jentoft and Robert D. Howe<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

• Joint-angle sensors on compliant joints<br />

are robust and gentle<br />

• Methods are presented to<br />

• Detect finger contact with objects<br />

• Sweep space to determine object<br />

geometry and surrounding free space<br />

• Eliminates need for expensive sensors to<br />

determine object geometry for grasping in<br />

vision-limited settings


Session WeDT3 Continental Parlor 3 Wednesday, September 28, 2011, 16:00–17:30<br />

Haptic Rendering & Object Recognition<br />

Chair M. Cenk Cavusoglu, Case Western Res. Univ.<br />

Co-Chair Masashi Konyo, Tohoku Univ.<br />

16:50–16:55 WeDT3.5<br />

Usability of a virtual reality system based on a<br />

wearable haptic interface<br />

Ingo Kossyk and Konstantin Kondak<br />

Robotics and Mechatronics, German Aerospace Center (DLR), Germany<br />

Jonas Dörr and Lars Raschendörfer<br />

Robotics and Biology Laboratory, Technische Universität Berlin, Germany<br />

• We evaluate the usability of a wearable<br />

haptic interface with subjective and<br />

objective measures<br />

• A presence questionnaire with 18 test<br />

persons served as a subjective measure<br />

• A task performance experiment was<br />

conducted for an objective measure<br />

17:00–17:15 WeDT3.7<br />

Enhancement of Human Force Perception by<br />

Multi-point Tactile Stimulation<br />

Lope Ben Porquis, Masashi Konyo and Satoshi Tadokoro<br />

Graduate School of Information Science, Tohoku University, Japan<br />

• Multiple skin contacts are a natural<br />

consequence of gripping an<br />

object or a tool.<br />

• Strain energy density is aggregated<br />

by applying tensile and compressive<br />

forces.<br />

• Experimentally verified that multi-contact<br />

stimulation applied on a<br />

tactile interface can enhance force<br />

perception.<br />

• Vacuum stimuli can be used as<br />

complementary stimuli for inducing<br />

force sensation.<br />

Figure: tactile stimuli applied through a tool segment to the participant’s index finger (tensile and compressive forces, weight), producing strain energy density (SED) at the mechanoreceptors.<br />

16:55–17:00 WeDT3.6<br />

Teaching by Touching: Interpretation of<br />

Tactile Instructions for Motion Development<br />

Fabio DallaLibera, Fransiska Basoeki, Hiroshi Ishiguro<br />

Department of System Innovation, Osaka University, Japan<br />

Takashi Minato<br />

ATR Hiroshi Ishiguro Laboratory, ATR, Japan<br />

Emanuele Menegatti<br />

Department of Information Engineering, Padua University, Italy<br />

• Touch is an important means of<br />

communication among humans<br />

• Sport or dance instructors communicate<br />

much information using simple touches<br />

• To permit natural interaction, humanoid<br />

robots should interpret the meaning of<br />

tactile instructions<br />

• A prototype system for the development of<br />

robot motions is presented, and a<br />

preliminary analysis on how humans<br />

employ touch to express their intention is<br />

provided.<br />

17:15–17:30 WeDT3.8<br />

Enhancement of Vibrotactile Sensitivity:<br />

Effects of Stationary Boundary Contacts<br />

Tatsuma Sakurai, Masashi Konyo and Satoshi Tadokoro<br />

Graduate School of Information Sciences, Tohoku University, Japan<br />

• Simultaneous contact with vibratory and<br />

stationary surfaces enhances human<br />

vibrotactile sensitivity, which we call<br />

stationary-boundary-contact (SBC)<br />

enhancement.<br />

• SBC produces a sharp line sensation<br />

along the gap between the vibratory and<br />

stationary surfaces.<br />

• Psychophysical experiments showed that<br />

detection thresholds were lowered by more<br />

than a factor of three at lower frequencies.<br />

• The mechanism behind SBC<br />

enhancement is investigated by using a<br />

finite element model for the skin<br />


Schema of Stationary Boundary Contacts: the skin contacting a vibratory<br />

surface with (SBC) and without (NOT SBC) an adjacent stationary surface.<br />


Session WeDT4 Continental Ballroom 4 Wednesday, September 28, 2011, 16:00–17:30<br />

Industrial Forum: Medical Robotics<br />

Chair Paolo Dario, Scuola Superiore Sant'Anna<br />

Co-Chair<br />



Session WeDT5 Continental Ballroom 5 Wednesday, September 28, 2011, 16:00–17:30<br />

Symposium: Robot Motion Planning: New Frameworks and High Performance<br />

Chair Maxim Likhachev, Carnegie Mellon Univ.<br />

Co-Chair Ron Alterovitz, Univ. of North Carolina at Chapel Hill<br />

16:00–16:15 WeDT5.1<br />

Multi-resolution H-Cost Motion Planning: A New<br />

Framework for Hierarchical Motion Planning for<br />

Autonomous Mobile Vehicles<br />

Raghvendra V. Cowlagi and Panagiotis Tsiotras<br />

School of Aerospace Engineering,<br />

Georgia Institute of Technology, Atlanta, GA, USA<br />

• Computationally efficient<br />

hierarchical motion planner that<br />

incorporates vehicle dynamical<br />

constraints<br />

• Higher-level: Provably complete<br />

multi-resolution path planner using<br />

wavelet transform-based cell<br />

decompositions<br />

• History-based transition costs in<br />

cell decomposition graph for<br />

including vehicle dynamical<br />

constraints<br />

• Lower-level: Local trajectory<br />

generation algorithm based on<br />

geometric constructions and<br />

model predictive control<br />

Schematic illustration of the proposed<br />

motion planning framework<br />

16:30–16:45 WeDT5.3<br />

Massively Parallelizing the RRT and the RRT∗<br />

Joshua Bialkowski and Emilio Frazzoli<br />

Department of Aeronautics and Astronautics,<br />

Massachusetts Institute of Technology, USA<br />

Sertac Karaman<br />

Department of Electrical Engineering and Computer Science,<br />

Massachusetts Institute of Technology, USA<br />

• GPUs have exhibited recent performance<br />

growth and support for general purpose<br />

computation<br />

• There is potential for significant speed-up<br />

in sampling based motion planning<br />

implementations<br />

• Collision checking is the primary<br />

bottleneck in many implementations of<br />

motion planning problems<br />

• We investigate parallel collision checking<br />

for a multi-link robotic manipulator using<br />

GPU hardware<br />

An N-Link Robotic<br />

Manipulator<br />

16:15–16:30 WeDT5.2<br />

Multi-Scale LPA* with Low Worst-Case<br />

Complexity Guarantees<br />

Yibiao Lu and Xiaoming Huo<br />

H. Milton Stewart School of Industrial and System Engineering<br />

Oktay Arslan and Panagiotis Tsiotras<br />

D. Guggenheim School of Aerospace Engineering<br />

Georgia Institute of Technology, USA<br />

• Combine the incremental<br />

search with map abstraction<br />

• Utilize multiscale graphical<br />

structure constructed via<br />

beamlets<br />

• Improve both run-time<br />

efficiency and robustness at<br />

execution level<br />

• Theoretical proofs on<br />

computational complexity<br />


Replanning via m-LPA*<br />

16:45–16:50 WeDT5.4<br />

Deterministic Kinodynamic Planning<br />

with hardware demonstrations<br />

François Gaillard, Michaël Soulignac and Cédric Dinont<br />

CS Dept, ISEN Lille, France<br />

Philippe Mathieu<br />

SMAC-LIFL, University of Lille 1, France<br />

• DKP is a deterministic hybrid approach for<br />

trajectory planning<br />

• Incrementally builds a tree of spline<br />

trajectories<br />

• Guarantees on trajectory feasibility and<br />

reproducibility of the results<br />

• Explicit geometrical representation of local<br />

reachable space<br />

Reachable<br />

space<br />

Iterative planning within<br />

reachable spaces


Session WeDT5 Continental Ballroom 5 Wednesday, September 28, 2011, 16:00–17:30<br />

Symposium: Robot Motion Planning: New Frameworks and High Performance<br />

Chair Maxim Likhachev, Carnegie Mellon Univ.<br />

Co-Chair Ron Alterovitz, Univ. of North Carolina at Chapel Hill<br />

16:50–16:55 WeDT5.5<br />

Navigation Meshes for Realistic<br />

Multi-Layered Environments<br />

Wouter van Toll, Atlas F. Cook IV, Roland Geraerts<br />

Utrecht University, Netherlands<br />

• A multi-layered environment is<br />

a connected set of 2D floors.<br />

• Examples:<br />

multi-storey buildings<br />

train stations<br />

airports<br />

shopping malls<br />

• We implement a multi-layered<br />

navigation mesh and use this<br />

structure to perform path<br />

planning operations.<br />

17:00–17:15 WeDT5.7<br />

Adaptive Time Horizon for On-line Avoidance<br />

in Dynamic Environments<br />

Zvi Shiller 1 , Oren Gal 2 and Ariel Raz 1<br />

1 Dept. of Mechanical Engineering and Mechatronics, Ariel University Center, Israel<br />

2 Mechanical Engineering, Technion, Israel<br />

• On-line avoidance of static and<br />

moving obstacles based on Velocity<br />

Obstacles (VO)<br />

• VO truncated at an adaptive time<br />

horizon<br />

• Adaptive time horizon allows<br />

sufficient time for safe avoidance<br />

• VO conservatively but tightly<br />

approximates the set of inevitable<br />

collision states (ICS)<br />

• Planner generates near-time optimal<br />

trajectories<br />

16:55–17:00 WeDT5.6<br />

Solving Shortest Path Problems with Curvature<br />

Constraints Using Beamlets<br />

Oktay Arslan and Panagiotis Tsiotras<br />

D. Guggenheim School of Aerospace Engineering,<br />

Georgia Institute of Technology, USA<br />

Xiaoming Huo<br />

H. Milton Stewart School of Industrial and System Engineering,<br />

Georgia Institute of Technology, USA<br />

• A new multiscale graph representation of<br />

the environment which uses beamlets<br />

• Computation of paths with more feasibility<br />

guarantees (i.e., bounded heading angle<br />

on the path)<br />

• Searching on a larger space of curves by<br />

adopting the beamlet graph representation<br />

• Being able to compute complex curves<br />

(e.g., a self-intersecting path)<br />


The proposed approach can find a self-intersecting path,<br />

which may arise from the heading-angle constraint on the path<br />

17:15–17:30 WeDT5.8<br />

Spline Templates for Fast Path Planning in<br />

Unstructured Environments<br />

Marcel Häselich, Nikolay Handzhiysky,<br />

Christian Winkens and Dietrich Paulus<br />

Computer Sciences, University of Koblenz-Landau, Germany<br />

AGAS robotics, robots.uni-koblenz.de<br />

• Perception and representation of<br />

unstructured environments from multiple<br />

sensors<br />

• Efficient data structure with terrain<br />

negotiability<br />

• Path planning for an autonomous<br />

heavyweight outdoor robot<br />

• Fast algorithm based on precomputed<br />

spline templates (arrays of terrain cells)


Session WeDT6 Continental Ballroom 6 Wednesday, September 28, 2011, 16:00–17:30<br />

Symposium: Marine Robotics: Control and Planning<br />

Chair Fumin Zhang, Georgia Inst. of Tech.<br />

Co-Chair Mandar Chitre, National Univ. of Singapore<br />

16:00–16:15 WeDT6.1*<br />

Semi-Plenary Invited Talk: Applications of Marine<br />

Robotic Vehicles<br />

Junku Yuh, Korea Aerospace University<br />

16:30–16:45 WeDT6.3<br />

3D-Surface Reconstruction for Partially<br />

Submerged Marine Structures<br />

Using an Autonomous Surface Vehicle<br />

Georgios Papadopoulos 1 , Hanna Kurniawati 2 , Ahmed S. B. M.<br />

Shariff 3 , Liang Jie Wong 3 , Nicholas M. Patrikalakis 1<br />

1 Massachusetts Institute of Technology, USA<br />

2 SMART Center, Singapore<br />

3 National University of Singapore, Singapore<br />

• Proposed Robotic Platform<br />

• GPS-denied environment<br />

• Surface Reconstruction<br />

above water-line part<br />

• Both above and below<br />

water-line parts<br />

16:15–16:30 WeDT6.2<br />

Semi-Plenary Invited Talk: Applications of Marine<br />

Robotic Vehicles<br />

Junku Yuh, Korea Aerospace University<br />

16:45–16:50 WeDT6.4<br />

Path tracking: Combined path following<br />

and trajectory tracking<br />

for autonomous underwater vehicle<br />

Xianbo Xiang 1,2 , Lionel Lapierre 2 , Chao Liu 2 and Bruno Jouvencel 2<br />

1 Huazhong University of Science and Technology, China<br />

2 Department of Robotics, LIRMM/CNRS, France<br />

• This paper proposes a novel<br />

path tracking control for<br />

autonomous underwater<br />

vehicles (AUVs), which blends<br />

the conventional path following<br />

and trajectory tracking control<br />

in order to achieve both smooth<br />

spatial convergence and tight<br />

temporal performance.<br />



Session WeDT6 Continental Ballroom 6 Wednesday, September 28, 2011, 16:00–17:30<br />

Symposium: Marine Robotics: Control and Planning<br />

Chair Fumin Zhang, Georgia Inst. of Tech.<br />

Co-Chair Mandar Chitre, National Univ. of Singapore<br />

16:50–16:55 WeDT6.5<br />

Autonomous Data Collection from Underwater<br />

Sensor Networks using Acoustic Communication<br />

Geoffrey A. Hollinger and Gaurav S. Sukhatme<br />

Computer Science Department, University of Southern California, USA<br />

Urbashi Mitra<br />

Department of Electrical Engineering, University of Southern California, USA<br />

• Underwater sensor networks monitor<br />

phenomena such as seismic activity<br />

and environmental conditions<br />

• An Autonomous Underwater Vehicle<br />

(AUV) equipped with an acoustic<br />

modem can gather data from such<br />

networks<br />

• We propose and analyze approximation<br />

algorithms for coordinating the AUV<br />

based on probabilistic variants of TSP<br />

• Simulations using realistic acoustic<br />

communication modeling demonstrate<br />

substantial improvement versus prior<br />

techniques<br />

Example sensor deployment on the<br />

ocean floor to monitor environmental<br />

conditions. Such sensors remain in place<br />

for many months, and collecting data<br />

from them is a challenging problem.<br />

17:00–17:15 WeDT6.7<br />

Underwater SLAM with Robocentric Trajectory<br />

Using a Mechanically Scanned Imaging Sonar<br />

Antoni Burguera, Yolanda González, Gabriel Oliver<br />

Dept. Matemàtiques i Informàtica, Universitat de les Illes Balears, Spain<br />

• Underwater MSIS scan-based<br />

SLAM<br />

• Correction of motion induced<br />

scan distortions<br />

• Explicit trajectory representation<br />

relative to the robot (robocentric)<br />

• Mean position error of 1.15 m after<br />

an experiment of more than 600 m<br />

An experiment showing<br />

robocentric trajectory results and<br />

the ground truth (DGPS)<br />

16:55–17:00 WeDT6.6<br />

Modeling and Reactive Navigation<br />

of an Autonomous Sailboat<br />

Clement Petres, Miguel Romero, Frederic Plumet<br />

ISIR, UPMC Univ. Paris 06, France<br />

Bertrand Alessandrini<br />

Fluid Mechanics Laboratory, Ecole Centrale de Nantes, France<br />

• ASAROME project: Autonomous SAiling<br />

Robot for Oceanographic MEasurements<br />

• Experimental identification of a dynamic<br />

model of a sailboat → realistic simulator<br />

• Potential field based path planning<br />

method → real-time reactive navigation<br />

• Simulation of various navigation strategies<br />


ASAROME prototype<br />

17:15–17:30 WeDT6.8<br />

A Lower Bound On Navigation Error for Marine<br />

Robots Guided by Ocean Circulation Models<br />

Klementyna Szwaykowska and Fumin Zhang<br />

Electrical and Computer Engineering, Georgia Institute of Technology, USA<br />

• We analyze the effectiveness of using Ocean<br />

General Circulation Models (OGCMs) to<br />

guide autonomous marine robots<br />

• Controlled Lagrangian Particle Tracking<br />

(CLPT) used to study motion of<br />

autonomous agents in the ocean<br />

• We derive lower bound on error in<br />

predicted robot position, which depends<br />

on OGCM resolution<br />

• We show that long-term error in predicted<br />

robot position has bounded growth rate<br />

Error in robot position prediction<br />

during 2006 field experiment in<br />

Monterey Bay, CA.


Session WeDT7 Continental Parlor 7 Wednesday, September 28, 2011, 16:00–17:30<br />

Tracking & Gait Analysis<br />

Chair Gian Luca Mariottini, Univ. of Texas at Arlington<br />

Co-Chair Domenico Prattichizzo, Istituto Italiano di Tecnologia<br />

16:00–16:15 WeDT7.1<br />

Identification of Mobile Entities Based on<br />

Trajectory and Shape Information<br />

Zeynep Yücel, Tetsushi Ikeda, Takahiro Miyashita, Norihiro Hagita<br />

Advanced Telecommunications Research Institute International<br />

Japan<br />

• This study integrates motion characteristics and geometric shape<br />

models in object recognition from range samples.<br />

• The shape model is described by parameters of an elliptic fit and is<br />

utilized to obtain the prior probabilities.<br />

• The motion characteristics are described<br />

in terms of coherence quality (moving at<br />

the front, rear, side, or alone) and are<br />

utilized to obtain the posterior<br />

probabilities.<br />

• The method is shown to<br />

yield 92% recognition rate<br />

in an uncontrolled field<br />

experiment with more than<br />

500 participants.<br />

Three entities moving coherently: e<sub>i</sub> and e<sub>j</sub><br />

exhibit side-by-side motion, while e<sub>j</sub> and e<sub>k</sub><br />

are in following motion.<br />

16:30–16:45 WeDT7.3<br />

Adaptive Human Shape Reconstruction via<br />

3D Head Tracking for Motion Capture in<br />

Changing Environment<br />

Kazuhiko Murasaki*, Masamichi Shimosaka**,<br />

Taketoshi Mori** and Tomomasa Sato**<br />

*Nippon Telegraph and Telephone Corporation, Japan<br />

**The University of Tokyo, Japan<br />

• Markerless motion capture techniques with dynamic background<br />

scenarios including furniture movements and light condition changes<br />

• Robust human silhouette reconstruction by the following 2 techniques<br />

• Iterative graph-cut methods with head positions<br />

• Novel tree-based multi-class recognition as multi-view face detectors<br />

16:15–16:30 WeDT7.2<br />

Marathoner Tracking Algorithms for a High<br />

Speed Mobile Robot<br />

Eui-Jung Jung, Byung-Ju Yi, Il Hong Suh, and Si Tae Noh<br />

Department of Electronics and System Engineering, Hanyang University, Korea<br />

Jae Hoon Lee<br />

Graduate School of Science and Engineering, Ehime University, Japan<br />

Shin’ichi Yuta<br />

Institute of Engineering Mechanics and Systems, University of Tsukuba, Japan<br />

• Tracking algorithms of a mobile robot that<br />

follows a human body moving at high speed<br />

in an unstructured outdoor environment<br />

• The mobile robot is equipped with one laser<br />

range finder to perform high-speed human<br />

tracking and obstacle avoidance.<br />

• The waist is found to be the best part of<br />

the human body for robust<br />

tracking by a laser range finder.<br />

• An obstacle avoidance algorithm<br />

considering the relative velocity between<br />

the robot and obstacles is proposed.<br />


Marathoner Following Robot<br />

16:45–16:50 WeDT7.4<br />

Cooperative active target tracking for heterogeneous<br />

robots with application to gait monitoring<br />

Fabio Morbidi 1 , Christopher Ray 2 , Gian Luca Mariottini 1<br />

1 ASTRA Robotics Laboratory, Dept. of Computer Science & Engineering<br />

2 Dept. of Kinesiology<br />

University of Texas at Arlington, USA<br />

• Cooperative active target-tracking for a team of heterogeneous robots<br />

equipped with 3-D range-finding sensors<br />

• Gradient-based controllers are designed for each robot and the<br />

Kalman-Bucy filter is used for estimation fusion<br />

• Robots use relative-position measurements to localize themselves in<br />

GPS-denied environments<br />

• Validation on a team of aerial and ground robots tracking the motion of<br />

a human subject for a gait-monitoring task


Session WeDT7 Continental Parlor 7 Wednesday, September 28, 2011, 16:00–17:30<br />

Tracking & Gait Analysis<br />

Chair Gian Luca Mariottini, Univ. of Texas at Arlington<br />

Co-Chair Domenico Prattichizzo, Istituto Italiano di Tecnologia<br />

16:50–16:55 WeDT7.5<br />

Fast Visual People Tracking using a<br />

Feature-Based People Detector<br />

Achim Königs and Dirk Schulz<br />

Unmanned Systems Group,<br />

Fraunhofer Institute for Communication,<br />

Information Processing and Ergonomics FKIE, Germany<br />

• Visual people tracking that combines<br />

feature-based object detection with<br />

feature tracking.<br />

• Provides a fast tracking process and<br />

allows specialization to tracking a specific<br />

object.<br />

• The tracking process is based on tracking<br />

the features found by the detector.<br />

• Features are tracked by carrying out a<br />

correspondence search from frame to<br />

frame in combination with a voting<br />

scheme for determining a person’s center<br />

position in the image.<br />

Tracked person, distinguished<br />

from other similar person. White<br />

dots denote the tracked features.<br />

17:00–17:15 WeDT7.7<br />

Particle Filter Based Monocular Human Tracking<br />

with a 3D Cardbox Model and a Novel<br />

Deterministic Resampling Strategy<br />

Ziyuan Liu + , Dongheui Lee + and Wolfgang Sepp *<br />

+ Technische Universität München, Germany<br />

* German Aerospace Center, Germany<br />

• Particle filter based monocular human<br />

upper body tracking<br />

• A novel deterministic resampling strategy<br />

(DRS)<br />

• A new 3D articulated human upper body<br />

model with the name 3D cardbox model<br />

• Quantitative evaluation of the tracking<br />

results<br />

• Comparison between DRS and standard<br />

stratified resampling strategy (SRS)<br />

• Comparison with a commercial motion<br />

capturing system<br />

Tracking example: projected human<br />

model (top), silhouette matching<br />

(middle), raw image (bottom)<br />

16:55–17:00 WeDT7.6<br />

A Nonlinear Controller for People Guidance<br />

based on Omnidirectional Vision<br />

Flávio Garcia Pereira<br />

Department of Engineering and Computing, Federal University of Espírito<br />

Santo, Brazil.<br />

Milton César Paes Santos and Raquel Frizera Vassallo<br />

Department of Electrical Engineering, Federal University of Espírito Santo,<br />

Brazil.<br />

• Guide-tour robot;<br />

• Human’s movement;<br />

• Nonlinear final pose controller;<br />

• Lyapunov analysis;<br />

• Omnidirectional vision;<br />

• Human detection.<br />


A robot navigating and a person<br />

following it.<br />

17:15–17:30 WeDT7.8<br />

Non-Drifting Limb Angle Measurement Relative<br />

to the Gravitational Vector<br />

Andrew Petruska and Sanford Meek<br />

Mechanical Engineering, University of Utah, USA<br />

• Two accelerometers and one<br />

rate gyro are used to measure<br />

inclination for feedback control<br />

and gait analysis.<br />

• The estimate of inclination from<br />

the two accelerometers is not<br />

biased by angular motion.<br />

• The inclination is blended with<br />

the angular velocity using a<br />

Kalman filter.<br />

• RMS error for both sinusoidal<br />

and constant velocity inputs was<br />

measured to be less than 5° for<br />

gait-like frequencies.<br />

A comparison between the Kalman filter<br />

implemented with the dynamic angle<br />

estimation (solid) and without (dashed).


Session WeDT8 Continental Parlor 8 Wednesday, September 28, 2011, 16:00–17:30<br />

Multirobot Coordination & Modular Robots<br />

Chair Kasper Stoy, Univ. of Southern Denmark<br />

Co-Chair Stephen L. Smith, Univ. of Waterloo<br />

16:00–16:15 WeDT8.1<br />

Modeling Mutual Capabilities in<br />

Heterogeneous Teams for Role Assignment<br />

Somchaya Liemhetcharat and Manuela Veloso<br />

School of Computer Science, Carnegie Mellon University, USA<br />

• Agents in heterogeneous teams have different capabilities.<br />

• These capabilities are not only individual, but depend on the<br />

composition of the team and the role assignments.<br />

• We contribute the Mutual State Capability-Based Role Assignment<br />

(MuSCRA) model that models pairwise interactions between agents<br />

and their mutual state.<br />

• The MuSCRA model is applicable to dynamic environments where<br />

state affects the performance of agents in the team.<br />

16:30–16:45 WeDT8.3<br />

Evaluation of a Power Management System for<br />

Heterogeneous Modules<br />

in Self-Reconfigurable Multi-Module Systems<br />

Zhuowei Wang, Florian Cordes,<br />

Alexander Dettmann, and Roman Szczuka<br />

German Research Center for Artificial Intelligence<br />

Robotics Innovation Center, Germany<br />

• Power source and diverse<br />

functionality are separated from each<br />

other and encapsulated in different<br />

modules<br />

• Homogeneous power management<br />

system can support more than 250W<br />

power consumption in an established<br />

multi-module system<br />

• Evaluation of key features of the<br />

power management system<br />

Self-reconfiguration of<br />

heterogeneous modular system<br />

supported by the power<br />

management system<br />

16:15–16:30 WeDT8.2<br />

Collision Avoidance for Persistent Monitoring in<br />

Multi-Robot Systems with Intersecting Trajectories<br />

Daniel E. Soltero Stephen L. Smith<br />

EECS, MIT, USA ECE, Waterloo, Canada<br />

Daniela Rus<br />

EECS, MIT, USA<br />

• Group of robots equipped with sensors with<br />

finite footprints persistently monitor a<br />

changing environment.<br />

• Collision and dead-lock avoidance<br />

procedure is developed to treat intersecting<br />

trajectories.<br />

• Analysis of the effects of developed<br />

procedure on the stability of the persistent<br />

controller.<br />

• Implementation on ground vehicles and<br />

aerial vehicles.<br />


Four robots (yellow-filled circles)<br />

performing a persistent task where<br />

collision avoidance is essential<br />

due to intersecting paths.<br />

16:45–16:50 WeDT8.4<br />

Generalized Programming of Modular Robots<br />

through Kinematic Configurations<br />

M. Bordignon, K. Stoy, and U. Schultz<br />

Modular Robotics Lab, University of Southern Denmark, Denmark<br />

• Programmer models kinematics of<br />

each kind of module in the robot<br />

(~30 lines of geometric declarations)<br />

• Toolchain generates:<br />

– runtime support for computing<br />

forward kinematics<br />

– high-level primitives and low-level<br />

code for morphology-independent<br />

programming of robot<br />

– complete world descriptions for robot<br />

simulation<br />

• General programming language for<br />

modular robots<br />

MODEL<br />

PROGRAM


Session WeDT8 Continental Parlor 8 Wednesday, September 28, 2011, 16:00–17:30<br />

Multirobot Coordination & Modular Robots<br />

Chair Kasper Stoy, Univ. of Southern Denmark<br />

Co-Chair Stephen L. Smith, Univ. of Waterloo<br />

16:50–16:55 WeDT8.5<br />

Energy-Aware Coverage Control with<br />

Docking for Robot Teams<br />

Jason Derenick, Nathan Michael, and Vijay Kumar<br />

GRASP Laboratory, University of Pennsylvania, USA<br />

• A Centroidal Voronoi Tessellation<br />

(CVT)-based control law is formulated<br />

that leverages a weighting scheme that<br />

embeds an agent’s tradeoff to achieve<br />

its coverage mission and to maintain its<br />

energy reserve.<br />

• The controller is motivated by long-term<br />

persistent surveillance<br />

applications that may exceed the<br />

runtime of an agent.<br />

• As a robot’s energy-reserve depletes, it<br />

is drawn towards a docking station (for<br />

recharging) while still participating in<br />

coverage as the formation adjusts.<br />

• Stability of the distributed (among<br />

Voronoi-neighbors) control law is<br />

considered and simulation results are<br />

presented verifying the analysis.<br />

17:00–17:15 WeDT8.7<br />

Intent Inference and Strategic Escape in Multi-<br />

Robot Games with Physical Limitations<br />

Aris Valtazanos and Subramanian Ramamoorthy<br />

School of Informatics, University of Edinburgh, United Kingdom<br />

• Strategic decision-making framework for<br />

physical multi-robot games (e.g. robotic<br />

soccer)<br />

• Decouple sensory and strategic uncertainty<br />

• Sensory uncertainty: Reachable Set<br />

Particle Filter – combine probabilistic<br />

state estimation with dynamical constraints<br />

• Strategic uncertainty: Probabilistic intent<br />

templates fitted to the adversary and<br />

adjusted through data-driven estimation<br />

16:55–17:00 WeDT8.6<br />

Optimization of Personal Distribution<br />

for Evacuation Guidance based on Vector Field<br />

Masafumi Okada and Teruhisa Ando<br />

Dept. of Mechanical Sciences and Engineering, Tokyo TECH, Japan<br />

• This paper proposes an optimization<br />

method of personal distribution for<br />

evacuation guidance.<br />

• Based on the evacuation route, the<br />

intention of the evacuee is modeled by a<br />

vector field.<br />

• The action of the operator is also modeled<br />

by a vector field that accounts for the<br />

indicated direction.<br />

• Using autonomous mobile robots and a<br />

radio-controlled robot, the proposed<br />

method is evaluated in a quasi-human<br />

environment.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–163–<br />

Guidance experiment with<br />

a radio controlled robot<br />

17:15–17:30 WeDT8.8<br />

Optimisation Model and Exact Algorithm for<br />

Autonomous Straddle Carrier Scheduling at<br />

Automated Container Terminals<br />

Binghuang Cai, Shoudong Huang, Dikai Liu,<br />

Shuai Yuan, and Gamini Dissanayake<br />

Centre for Autonomous Systems, University of Technology, Sydney, Australia<br />

Haye Lau and Daniel Pagac<br />

Patrick Technology Systems, Sydney, Australia<br />

• An optimisation model is presented for<br />

Autonomous Straddle Carriers Scheduling<br />

(ASCS) at automated container terminals.<br />

• An exact algorithm based on Branch-and-<br />

Bound-with-Column-Generation (BBCG) is<br />

presented for solving the ASCS problem.<br />

• The BBCG algorithm is compared to<br />

Binary-integer-Programming-with-<br />

Dynamic-Programming (BPDP) and<br />

Exhaustive-Search-with-Permutation-and-<br />

Combination (ESPC) algorithms.<br />

• Simulation results and discussions<br />

demonstrate the effectiveness and<br />

efficiency of the presented model and<br />

algorithm.


Session WeDT9 Continental Parlor 9 Wednesday, September 28, <strong>2011</strong>, 16:00–17:30<br />

Calibration & Identification<br />

Chair Giorgio Metta, Istituto Italiano di Tecnologia (IIT)<br />

Co-Chair Jozsef Kovecses, McGill Univ.<br />

16:00–16:15 WeDT9.1<br />

Skin Spatial Calibration<br />

Using Force/Torque Measurements<br />

Andrea Del Prete, Lorenzo Natale,<br />

Francesco Nori and Giorgio Metta<br />

RBCS, Italian Institute of Technology, Italy<br />

Simone Denei, Fulvio Mastrogiovanni and Giorgio Cannata<br />

DIST, University of Genova, Italy<br />

• This paper deals with the problem of<br />

estimating the position of tactile elements<br />

(i.e. taxels) mounted on a robot body part<br />

• Taxel positions are estimated by measuring<br />

contact forces/torques applied to the<br />

sensorized part using an F/T sensor<br />

mounted on the robot’s kinematic chain<br />

• Unlike previous works, this<br />

procedure provides not only the network<br />

topology but also metric information<br />

• The method has been validated with<br />

experiments on the iCub humanoid robot,<br />

leading to a final average error of about 7<br />

mm<br />

Taxel position estimations.<br />

As reference the edges of the<br />

triangular skin modules are drawn.<br />

16:30–16:45 WeDT9.3<br />

Static Calibration of the DLR Medical Robot<br />

MIRO, a Flexible Lightweight Robot with<br />

Integrated Torque Sensors<br />

Julian Klodmann, Rainer Konietschke,<br />

Alin Albu-Schäffer and Gerd Hirzinger<br />

German Aerospace Center (DLR), Germany<br />

• Stepwise approach to calibrate<br />

robots in the assembled state<br />

• Planning of static poses in the<br />

entire workspace to obtain<br />

convergence<br />

16:15–16:30 WeDT9.2<br />

Muscle Strength and Mass Distribution<br />

Identification Toward Subject-Specific<br />

Musculoskeletal Modeling<br />

Mitsuhiro Hayashibe<br />

DEMAR INRIA and LIRMM, France<br />

Gentiane Venture<br />

Tokyo University of Agriculture and Technology, Japan<br />

Ko Ayusawa, Yoshihiko Nakamura<br />

University of Tokyo, Japan<br />

• Framework toward subject-specific<br />

musculoskeletal modeling,<br />

• Mass Distribution Identification to<br />

improve the joint torque estimation,<br />

• Muscle Strength Identification to<br />

improve the muscle force estimation<br />

accuracy,<br />

• This first result highlights that<br />

reliable muscle force estimation can be<br />

obtained after these identifications.<br />

16:45–16:50 WeDT9.4<br />

Simultaneous Calibration, Localization,<br />

and Mapping<br />

Rainer Kümmerle (1), Giorgio Grisetti (2,1), and Wolfram Burgard (1)<br />

(1) Dept. of Computer Science, University of Freiburg, Germany<br />

(2) Dept. of Systems and Computer Science, Sapienza University of Rome, Italy<br />

• The calibration parameters of a mobile<br />

robot play a substantial role in navigation<br />

tasks.<br />

• Simultaneously estimate a map of the<br />

environment, the position of the on-board<br />

sensors, and the robot’s kinematic parameters<br />

• Our approach performs online estimation<br />

of the calibration parameters and is able<br />

to adapt to non-stationary changes.<br />

• Tested in simulation and with a wide range<br />

of real-world data<br />


Effects of carrying a load on the<br />

wheel diameters and the odometry


Session WeDT9 Continental Parlor 9 Wednesday, September 28, <strong>2011</strong>, 16:00–17:30<br />

Calibration & Identification<br />

Chair Giorgio Metta, Istituto Italiano di Tecnologia (IIT)<br />

Co-Chair Jozsef Kovecses, McGill Univ.<br />

16:50–16:55 WeDT9.5<br />

Experimental Evaluation of New Methods for<br />

In-Situ Calibration of Attitude and Doppler<br />

Sensors for Underwater Vehicle Navigation<br />

Giancarlo Troni and Louis L. Whitcomb<br />

Department of Mechanical Engineering<br />

Johns Hopkins University, USA<br />

• Development and in-water experimental<br />

evaluation of two new methods for in-situ<br />

calibration of the alignment rotation matrix<br />

between Doppler sonar velocity sensors<br />

and gyrocompass attitude sensors.<br />

• We report alignment calibration methods<br />

employing only internal vehicle navigation<br />

sensors, e.g. for velocity, acceleration,<br />

attitude, and depth.<br />

• Results from laboratory experiments<br />

comparing these methods to previously<br />

reported techniques indicate satisfactory<br />

performance of the proposed methods.<br />

JHU ROV in the Johns Hopkins<br />

Hydrodynamic Test Facility<br />

17:00–17:15 WeDT9.7<br />

Relative Accuracy Enhancement System<br />

Based on Internal Error Range Estimation for<br />

External Force Measurement in<br />

Construction Manipulator<br />

Mitsuhiro Kamezaki, Hiroyasu Iwata, and Shigeki Sugano<br />

Department of Modern Mechanical Engineering, Waseda University, Japan<br />

• Our framework adopts a relative accuracy<br />

improvement strategy that, for practicality,<br />

does not correct the models, comprising:<br />

• i. Quantifying the internal error range (IER)<br />

by using the sum of the maximal measuring<br />

errors of static and dynamic friction forces<br />

• ii. Calculating the error force vector by using<br />

IER to select cylinders that have less error<br />

• iii. Outputting the front load vector by using<br />

the cylinders whose error force vector is<br />

minimum<br />

• Experimental results indicate that our<br />

framework can enhance the relative<br />

accuracy of external force measurement<br />

Instrumented hydraulic arm<br />

16:55–17:00 WeDT9.6<br />

New Method for Global Identification<br />

of the Joint Drive Gains of Robots<br />

using a Known Payload Mass<br />

M. Gautier (1) and S. Briot (2)<br />

(1) University of Nantes, IRCCyN, France<br />

(2) CNRS, IRCCyN, France<br />

• Off-line identification of robot dynamic parameters<br />

• Inverse Dynamic Identification Model and<br />

LS technique (IDIM-LS)<br />

• Identification of the total joint drive gains in one<br />

step<br />

• Using only the weighed mass of a payload, no<br />

need for CAD values<br />

• Using motor current references and joint positions,<br />

data available from the robot’s controller<br />

• Mixing data from trajectory without load +<br />

trajectory with load<br />

• Validation experiment, 4 first joints of a 6 dof<br />

industrial robot<br />


Staubli RX90 robot<br />

17:15–17:30 WeDT9.8<br />

Recursive State-Parameter Estimation<br />

of Haptic Robotic Systems<br />

Arash Mohtat, Kamran Ghaffari Toiserkan,<br />

and József Kövecses<br />

Centre for Intelligent Machines, McGill University, Canada<br />

• Simultaneous state-parameter estimation<br />

via unscented Kalman filtering (UKF).<br />

• Overcoming limitations: linearity of the<br />

models in the parameters; need for<br />

velocities and accelerations; and, need for<br />

differentiation and linearization.<br />

• The recursive nature allows for offline<br />

processing and online implementation.<br />

• Haptic Applications: State-parameter<br />

estimation of the haptic device;<br />

Environmental parameters identification.<br />

The identification experiment:<br />

The haptic device interacting with<br />

a virtual box.




8:00-9:30<br />

Technical Program Digest<br />


Thursday<br />

September 29, <strong>2011</strong><br />

Session ThAT1 ⎯⎯ Force and Stiffness Control<br />

Session ThAT2 ⎯⎯ Range and RGB-D Sensing<br />

Session ThAT3 ⎯⎯ Discrete & Kinodynamic Planning<br />

Session ThAT4 ⎯⎯ Symposium: Stochasticity in Robotics and Biological Systems I<br />

Session ThAT5 ⎯⎯ Symposium: Humanoid Robotics and Biped Locomotion<br />

Session ThAT6 ⎯⎯ Symposium: Robots that Can See I<br />

Session ThAT7 ⎯⎯ Mechanisms: Joint Design<br />

Session ThAT8 ⎯⎯ Autonomous Vehicles<br />

Session ThAT9 ⎯⎯ Towards Anthropomimetic Robots<br />

10:00-11:30<br />

Session ThBT1 ⎯⎯ Control for Manipulation & Grasping<br />

Session ThBT2 ⎯⎯ Mapping<br />

Session ThBT3 ⎯⎯ Randomized Planning & Kinematic Control<br />

Session ThBT4 ⎯⎯ Symposium: Stochasticity in Robotics and Biological Systems II<br />

Session ThBT5 ⎯⎯ Symposium: Humanoid Technologies<br />

Session ThBT6 ⎯⎯ Symposium: Robots that Can See II<br />

Session ThBT7 ⎯⎯ Medical Robots & Systems<br />

Session ThBT8 ⎯⎯ Search & Rescue Robots<br />

Session ThBT9 ⎯⎯ Intelligent Transportation Systems (Automotive)<br />

(continues on next page)


14:45-16:15<br />


Thursday<br />

September 29, <strong>2011</strong><br />

(continued)<br />

Session ThCT1 ⎯⎯ Manipulation<br />

Session ThCT2 ⎯⎯ Industrial Robots<br />

Session ThCT3 ⎯⎯ Marine Systems I<br />

Session ThCT4 ⎯⎯ Symposium: (Self-)assembly from the Nano to the Macro Scale:<br />

State of the Art and Future Directions<br />

Session ThCT5 ⎯⎯ Symposium: Humanoid Applications<br />

Session ThCT6 ⎯⎯ Symposium: Computer Vision for Robotics<br />

Session ThCT7 ⎯⎯ Exoskeleton Robots & Gait Rehabilitation<br />

Session ThCT8 ⎯⎯ Aerial Robots: Navigation, Tracking & Landing<br />

Session ThCT9 ⎯⎯ Swarms and Flocks<br />

16:45-17:45<br />

Session ThDT1 ⎯⎯ Forum: On Robotics Conferences: A Town Hall Meeting<br />

(in Continental Ballroom)<br />

Session ThDT3 ⎯⎯ Marine Systems II<br />

Session ThDT4 ⎯⎯ Novel System Designs: Locomotion (in Continental Parlor 2)<br />

Session ThDT5 ⎯⎯ Climbing & Brachiation (in Continental Parlor 1)<br />

Session ThDT6 ⎯⎯ Novel System Designs: Sensing and Manipulation<br />

(in Continental Parlor 9)<br />

Session ThDT7 ⎯⎯ Medical Robotics: Motion Planning & State Estimation<br />

Session ThDT8 ⎯⎯ Aerial Robots


Session ThAT1 Continental Parlor 1 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Force and Stiffness Control<br />

Chair Robert James Webster III, Vanderbilt Univ.<br />

Co-Chair Jonathan Hurst, Oregon State Univ.<br />

08:00–08:15 ThAT1.1<br />

Trajectory Tracking Controller With Dynamic<br />

Gains for Mobile Robots<br />

Cassius Resende<br />

Dept. of Electrical Engineering, Federal University of Espírito Santo, Brazil<br />

Felipe Espinosa, Ignacio Bravo<br />

Dept. of Electronics, University of Alcalá, Spain<br />

Mário Sarcinelli-Filho, Teodiano F. Bastos-Filho<br />

Dept. of Electrical Engineering, Federal University of Espírito Santo, Brazil<br />

• A Takagi-Sugeno fuzzy controller is<br />

proposed to guide a unicycle mobile<br />

robot during trajectory tracking<br />

• Stability of the closed loop system<br />

using such controller is proven, based<br />

on Lyapunov’s theory<br />

• The low complexity of the proposed<br />

controller makes it suitable for<br />

implementation in low-profile<br />

processors<br />

• Theoretical analysis and experimental<br />

results validate the proposed controller<br />

08:30–08:45 ThAT1.3<br />

Force Control for<br />

Planar Spring-Mass Running<br />

Devin Koepl and Jonathan Hurst<br />

Dynamic Robotics Laboratory, Oregon State University, USA<br />

• Presents a novel control<br />

strategy for spring-leg robot<br />

running.<br />

• Strategy makes good use of<br />

passive dynamics, conserving<br />

energy.<br />

• Incorporates active force control<br />

for robustness to disturbances.<br />

• Requires no external sensing,<br />

so it is practical for real-world spring-leg<br />

robots.<br />

Actuated spring-leg model, with<br />

series hip and leg elasticity.<br />

08:15–08:30 ThAT1.2<br />

Multi-Priority Control in Redundant Robotic Systems<br />

Hamid Sadeghian †, Luigi Villani ††, Mehdi Keshmiri †,<br />

Bruno Siciliano ††<br />

†Mechanical Eng. Dept., Isfahan Univ. of Tech.,Iran<br />

††DIS, Univ. di Napoli Federico II, Italy<br />

• Dynamic level multi-priority control algorithm<br />

as a general framework for the whole body<br />

control of robotic systems is presented.<br />

• Some of the previously developed results<br />

are easily formalized using this general<br />

framework.<br />

• Null-space impedance control is proposed<br />

as one of the main results and is evaluated<br />

by means of computer simulation.<br />

08:45–08:50 ThAT1.4<br />

Deflection-Based Force Sensing<br />

for Continuum Robots:<br />

A Probabilistic Approach<br />

D. Caleb Rucker and Robert J. Webster III<br />

Department of Mechanical Engineering, Vanderbilt University, USA<br />

Medical & Electromechanical Design Laboratory<br />

• Flexible Continuum Robots Can Be Used As Force Sensors<br />

• Pose Measurements + Mechanics-Based Model => Force Estimate<br />

• Extended Kalman Filter Used to Quantify Uncertainty in Estimates<br />



Session ThAT1 Continental Parlor 1 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Force and Stiffness Control<br />

Chair Robert James Webster III, Vanderbilt Univ.<br />

Co-Chair Jonathan Hurst, Oregon State Univ.<br />

08:50–08:55 ThAT1.5<br />

Optimality Principles in Stiffness Control<br />

The VSA Hammer<br />

Manolo Garabini†, Felipe Belo†, Andrea Passaglia†,<br />

Paolo Salaris†, and Antonio Bicchi†,‡<br />

†Interdept. Research Center “E. Piaggio”, University of Pisa, Italy<br />

‡Italian Institute of Technology, Genova, Italy<br />

• Importance of Variable Stiffness Actuators<br />

(VSA) in safety and performance of robots<br />

• Optimality principles regulating variation of<br />

stiffness in very dynamic tasks where<br />

impacts are maximized<br />

• New optimization problem: maximize<br />

velocity of a link at a given final position,<br />

e.g., for best hammer impact<br />

• Fixed stiffness case: there exists an optimal linear spring for given link and<br />

motor inertia<br />

• VSA case: varying spring stiffness during execution improves final<br />

performance of hammering task<br />

• Optimal control law obtained analytically and validated with experimental<br />

tests<br />
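The fixed-stiffness claim (an optimal linear spring exists for given link and motor inertias) can be explored numerically with a toy series-elastic model. The inertias, torque, strike position, and integration settings below are arbitrary illustrative values, not the paper's setup: sweep the spring constant and record the link velocity when it first reaches the strike position.

```python
def impact_velocity(k, T=3.0, dt=1e-3, Jm=1.0, Jl=0.5):
    """Toy series-elastic 'hammer': motor side driven at constant torque,
    link coupled through a linear spring of stiffness k. Returns the link
    velocity when the link first crosses the strike position q = 1
    (or its velocity at time T if it never gets there)."""
    qm = qdm = ql = qdl = 0.0
    tau = 1.0
    for _ in range(int(T / dt)):
        spring = k * (qm - ql)
        qdm += (tau - spring) / Jm * dt   # motor dynamics
        qdl += spring / Jl * dt           # link dynamics
        qm += qdm * dt
        ql += qdl * dt
        if ql >= 1.0:
            return qdl
    return qdl

# Sweeping k and comparing impact_velocity(k) exposes the dependence of
# impact speed on stiffness; the best spring need not be the stiffest.
velocities = [impact_velocity(k) for k in (1.0, 2.0, 5.0, 10.0, 20.0, 40.0)]
```

A VSA controller, as in the abstract, would additionally vary k during the swing rather than sweeping fixed values.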

09:00–09:15 ThAT1.7<br />

Optimal and Fault-Tolerant Torque Control of Servo<br />

Motors Subject to Voltage and Current Limits<br />

Farhad Aghili<br />

Canadian Space Agency (CSA), Canada<br />

• Optimal management of motor<br />

excitation currents can significantly<br />

increase the rated speed and<br />

torque of the motor in the face of<br />

the voltage and current limits of<br />

the drivers.<br />

• In the case of one phase failure,<br />

the controller optimally reshapes the<br />

stator currents of the remaining<br />

phases for continuing accurate<br />

torque production<br />

• A closed-form solution for the<br />

optimal phase currents rendering<br />

the control algorithm suitable for<br />

real-time implementation<br />

• Experimental results<br />

The architecture of the optimal<br />

and fault-tolerant torque<br />

controller<br />

08:55–09:00 ThAT1.6<br />

A novel stiffness node controller which enables<br />

simultaneous regulation of torque and stiffness<br />

in multi-muscle driven joints<br />

Salvatore Annunziata, Jan Paskarbeit and Axel Schneider<br />

Faculty of Technology, University of Bielefeld, Germany<br />

• Co-activation of two antagonistic<br />

muscles is in general considered to<br />

vary stiffness at joint level.<br />

• However, angular joint positions may<br />

exist for which the stiffness does not<br />

vary with co-activation.<br />

• At those positions any concurrent<br />

torque/stiffness controller is bound to<br />

fail.<br />

• A four-muscle based controller is<br />

presented that enables torque/stiffness<br />

control also at these positions.<br />


(a) 4-muscle driven joint<br />

(b) Torque/angle muscle setup<br />

(c) Stiffness node<br />

09:15–09:30 ThAT1.8<br />

Online Motion Selection for Semi-Optimal<br />

Stabilization using Reverse-Time Tree<br />

Chyon Hae Kim and Hiroshi Tsujino<br />

Honda Research Institute, Japan<br />

Shigeki Sugano<br />

Graduate School of Creative Science<br />

and Engineering Waseda University, Japan<br />

• We discuss a general method for<br />

creating an approximately optimal<br />

stabilization system.<br />

• The proposed system creates a<br />

reverse-time tree using RApid Semi-optimal<br />

MOtion-planning (RASMO).<br />

• We developed an online motion<br />

selection method that selects a<br />

semi-optimal motion according to the<br />

current state of a machine.<br />

• Calculation speed and optimality are<br />

validated in a simulation.<br />

Figure: the proposed system. Offline process: create the reverse-time tree with RASMO (input goal states, search states, prune paths, repeat until the tree depth is sufficient). Online process: path selection and feedback control subject to constraints.


Session ThAT2 Continental Parlor 2 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Range and RGB-D Sensing<br />

Chair Robert Zlot, CSIRO<br />

Co-Chair Kai Oliver Arras, Univ. of Freiburg<br />

08:00–08:15 ThAT2.1<br />

3D Object Recognition in Range Images Using<br />

Visibility Context<br />

Eunyoung Kim and Gerard Medioni<br />

Computer Science Department, University of Southern California, USA<br />

• Goal<br />

• Recognize and localize queried objects<br />

in cluttered environments using the<br />

Primesensor<br />

• Only utilize depth information for robots<br />

to work under bad or no illumination<br />

• Contributions<br />

• Propose a new point pair feature using<br />

visibility context that contains<br />

discriminative description<br />

• Improve a pose estimation method<br />

using point pair matches<br />

• Outperform two state-of-the-art methods<br />

in terms of recall & precision rate<br />

08:30–08:45 ThAT2.3<br />

Comparative Evaluation of Range Sensing<br />

Technologies for Underground Void Modeling<br />

Uland Wong, Aaron Morris, Colin Lea, James Lee,<br />

Chuck Whittaker, Ben Garney and Red Whittaker<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• 10 sensors evaluated across three<br />

underground tunnel environments<br />

and one lab environment<br />

• Technologies assessed include 5 ToF<br />

LIDAR sensors, flash LIDAR, FMCW,<br />

stereo vision, the Kinect and others.<br />

• Metrics of evaluation inspired by 3D<br />

modeling performance in field<br />

(density distribution, accuracy,<br />

coverage)<br />

• Test environments and methodology<br />

are sufficiently general for<br />

applicability of results in related<br />

domains<br />

Sensor modeling is performed in three<br />

representative tunnel environments.<br />

08:15–08:30 ThAT2.2<br />

Fast Plane Extraction in 3D Range Data<br />

Based on Line Segments<br />

Kristiyan Georgiev, Ross Creed, Rolf Lakaemper<br />

Department of Computer and Information Sciences<br />

Temple University, USA<br />

• Extract line segments from raw data.<br />

• Determine connected components,<br />

representing possible candidate sets of<br />

coplanar segments.<br />

• Perform region growing on candidate sets<br />

to establish planes from coplanar<br />

segments.<br />

• Region growing contains a fast O(1) plane<br />

update technique in its core, leading to a<br />

real time performance of the algorithm.<br />
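The O(1) plane update at the core of such region growing can be illustrated with running moments. This is a hypothetical sketch, not the authors' implementation: adding a point only updates three accumulators, and the plane is recovered on demand from an eigendecomposition.

```python
import numpy as np

class IncrementalPlane:
    """Running-sum plane fit: adding a point is O(1); the least-squares
    plane is recovered from the accumulated first and second moments, so
    a region-growing loop never has to refit from scratch."""

    def __init__(self):
        self.n = 0
        self.s = np.zeros(3)        # sum of points
        self.ss = np.zeros((3, 3))  # sum of outer products

    def add(self, p):
        p = np.asarray(p, dtype=float)
        self.n += 1
        self.s += p
        self.ss += np.outer(p, p)

    def fit(self):
        """Return (normal, centroid) of the current least-squares plane."""
        c = self.s / self.n
        cov = self.ss / self.n - np.outer(c, c)
        w, v = np.linalg.eigh(cov)   # eigenvalues in ascending order
        return v[:, 0], c            # smallest-eigenvalue direction = normal
```

Merging two candidate segments is equally cheap: just add their `n`, `s`, and `ss` accumulators.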

08:45–08:50 ThAT2.4<br />

Tracking a Depth Camera: Parameter Exploration<br />

for Fast ICP<br />

François Pomerleau, Stéphane Magnenat, Francis Colas,<br />

Ming Liu, and Roland Siegwart<br />

Department of Mechanical and Process Engineering, ETH Zurich, Switzerland<br />

• Modular ICP library openly<br />

available<br />

• Test datasets available with<br />

millimeter accuracy ground truth<br />

• Capability of 3D pose tracking at<br />

30 Hz with reasonable precision<br />

• Sound statistical evaluation over<br />

a range of parameters<br />

• Library accessible to embedded<br />

systems<br />

One path of depth camera tracked (red) with ICP at<br />

30 Hz vs. the measured ground truth (green)<br />

• Standalone C++ library:<br />

http://github.com/ethz-asl/libpointmatcher<br />

• ROS node:<br />

www.ros.org/wiki/modular_cloud_matcher<br />
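For readers unfamiliar with the loop being tuned here, a minimal point-to-point ICP in 2-D can be written in a few lines. This is an illustration of the general algorithm, not libpointmatcher's modular pipeline: it alternates brute-force nearest-neighbour matching with a closed-form Kabsch rigid update.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Minimal point-to-point ICP: nearest-neighbour matching plus an
    SVD (Kabsch) rigid update per iteration. Real pipelines add data
    filtering, outlier rejection, and kd-tree matching."""
    R, t = np.eye(2), np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest neighbours (a kd-tree in practice)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        match = dst[d2.argmin(axis=1)]
        # Kabsch: best rigid transform aligning cur -> match
        mu_s, mu_d = cur.mean(0), match.mean(0)
        H = (cur - mu_s).T @ (match - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
        dR = Vt.T @ D @ U.T
        dt = mu_d - dR @ mu_s
        cur = cur @ dR.T + dt
        R, t = dR @ R, dR @ t + dt   # compose the incremental transform
    return R, t
```

The parameters explored in the paper (matching strategy, filters, iteration count) all live inside this loop.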



Session ThAT2 Continental Parlor 2 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Range and RGB-D Sensing<br />

Chair Robert Zlot, CSIRO<br />

Co-Chair Kai Oliver Arras, Univ. of Freiburg<br />

08:50–08:55 ThAT2.5<br />

Watertight Surface Reconstruction of Caves<br />

from 3D Laser Data<br />

Claude Holenstein, Robert Zlot, and Michael Bosse<br />

Autonomous Systems Laboratory, ICT Centre<br />

CSIRO, Australia<br />

• Watertight surface reconstruction of<br />

caves and other closed environments<br />

from data collected on a mobile sensor<br />

• Developed algorithm is scalable<br />

(kilometers of cave data), and able to<br />

remove returns from dynamic<br />

obstacles occluding the sensor’s field of<br />

view<br />

• High resolution surface models have<br />

been produced for further scientific<br />

study<br />

Surface reconstruction of a section of<br />

Chifley cave. Returns from people on<br />

the mapping team are circled in red.<br />

09:00–09:15 ThAT2.7<br />

People Tracking in RGB-D Data With<br />

On-line Boosted Target Models<br />

Matthias Luber, Luciano Spinello and Kai O. Arras<br />

Social Robotics Lab, University of Freiburg, Germany<br />

• We present a 3D people tracking<br />

approach in RGB-D data<br />

• Using on-line learning, we train<br />

target-specific models with three types of<br />

RGB-D features and a confidence<br />

maximization search in 3D space<br />

• Integration into multi-hypothesis tracker<br />

with track interpretation feedback<br />

to avoid drift of on-line detectors<br />

and to fill gaps of misdetections<br />

• Results demonstrate clearly improved<br />

tracking performance: reduction of false<br />

negative tracks by 50% and reduction of<br />

track confusions by 24%<br />

3D trajectories of<br />

six persons<br />

08:55–09:00 ThAT2.6<br />

People Detection in RGB-D Data<br />

Luciano Spinello and Kai O. Arras<br />

Social Robotics Lab, University of Freiburg, Germany<br />

• We present Combo-HOD, a fast and<br />

robust RGB-D people detector<br />

• Probabilistic fusion of<br />

Histogram of Oriented Gradients (HOG)<br />

and Histogram of Oriented Depths (HOD)<br />

• Uses a 3-fold accelerated depth-informed<br />

scale-space search<br />

• In a comprehensive comparison with<br />

alternative methods, Combo-HOD<br />

outperforms all others with a 13% EER<br />

increase over vision-only HOG<br />

• 85% EER in a range up to 8m from<br />

sensor (Kinect)<br />

• GPU implementation runs at 30 Hz<br />

09:15–09:30 ThAT2.8<br />

‘Misspelled’ Visual Words in Unsupervised<br />

Range Data Classification<br />

Michael Firman and Simon Julier<br />

Department of Computer Science, University College London, UK<br />

• LiDAR scans can be affected by<br />

significant noise, which varies in<br />

different situations<br />

• We explore the effect noise has<br />

on the performance of<br />

unsupervised classifiers, using<br />

different local features<br />

The decrease in classification accuracy as noise is added to an<br />

artificial dataset; we also used real LiDAR data<br />


• Topic models are found to be<br />

less robust classifiers than<br />

k-means when noise is added<br />

• Spin images are shown to be<br />

more robust to noise than their<br />

angular derivative


Session ThAT3 Continental Parlor 3 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Discrete & Kinodynamic Planning<br />

Chair Edwin Olson, Univ. of Michigan<br />

Co-Chair Luigi Biagiotti, Univ. of Modena and Reggio Emilia<br />

08:00–08:15 ThAT3.1<br />

Improved Hierarchical Planner Performance<br />

Using Local Path Equivalence<br />

Ross A. Knepper and Matthew T. Mason<br />

The Robotics Institute, Carnegie Mellon University, USA<br />

• Motion planners face two types<br />

of choice: discrete decisions,<br />

and continuous optimization<br />

problems.<br />

• Using an equivalence relation<br />

among local paths, we separate<br />

the two types.<br />

• We classify path sets according<br />

to route among obstacles.<br />

• 2-stage path selection process:<br />

1) Select a route (set of paths)<br />

progressing toward the goal, and<br />

2) Execute the optimal (safest)<br />

path within the class.<br />

Figure: discrete decisions vs. continuous optimization.<br />

08:30–08:45 ThAT3.3<br />

Choosing Landmarks for Risky Planning<br />

Elizabeth Murphy and Peter Corke<br />

School of Engineering Systems, QUT, Australia<br />

Paul Newman<br />

Mobile Robotics Group, Oxford University, UK<br />

• Risky Planning is a path planning<br />

technique applicable to probabilistic<br />

costmaps that achieves 70% efficiency<br />

increases over normal heuristic search<br />

methods.<br />

• The risk of not returning the shortest path<br />

is quantified and used to leverage<br />

planning speed ups.<br />

• Risky Planning requires the<br />

precomputation of heuristics for A* type<br />

searches which use landmarks to<br />

precompute an informative lower bound<br />

between any two cells in the costmap<br />

• This paper recommends strategies for<br />

placing these landmarks so that risk<br />

bounds of risky planning are accurate and<br />

precomputation is minimized.<br />

An overhead image of an area to<br />

be planned over, and some<br />

strategically placed landmarks<br />

08:15–08:30 ThAT3.2<br />

Simplicial Dijkstra and A* Algorithms for<br />

Optimal Feedback Planning<br />

Dmitry S. Yershov and Steven M. LaValle<br />

Department of Computer Science, U of Illinois at Urbana-Champaign, USA<br />

• Considered Euclidean shortest path<br />

problem in n-dimensional configuration<br />

space among obstacles<br />

• Adaptation of Dijkstra’s and A* algorithms<br />

to compute approximate cost-to-go<br />

function over simplicial complex<br />

• Numerical methods are carefully<br />

designed, analyzed, and proven to<br />

converge<br />

• Methods are demonstrated to work for 2D<br />

and 3D examples. Test results show<br />

significant improvement of Simplicial A*<br />

over Simplicial Dijkstra’s<br />


Level sets of cost-to-go function<br />

in 2D and 3D environments<br />
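The cost-to-go computation can be seen in miniature on a grid. This sketch deliberately simplifies the paper's setting (Dijkstra over a 4-connected grid rather than over a simplicial complex with interpolation): it expands from the goal outward, so every free cell ends up with its shortest obstacle-avoiding distance.

```python
import heapq

def cost_to_go(grid, goal):
    """Dijkstra cost-to-go over a 4-connected grid; grid cells equal to 1
    are obstacles. Returns a table of shortest distances to `goal`."""
    rows, cols = len(grid), len(grid[0])
    INF = float("inf")
    dist = [[INF] * cols for _ in range(rows)]
    dist[goal[0]][goal[1]] = 0.0
    pq = [(0.0, goal)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist[r][c]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return dist
```

A* would add an admissible heuristic (e.g. Euclidean distance) to the queue key; the simplicial versions in the paper additionally interpolate the cost across simplex faces.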

08:45–08:50 ThAT3.4<br />

A Graph Traversal based Algorithm for Obstacle<br />

Detection using Lidar or Stereo<br />

Sujit Kuthirummal, Aveek Das, and Supun Samarasekera<br />

Vision and Robotics Lab, SRI International Sarnoff, USA<br />

• A computationally efficient approach to<br />

obstacle detection (OD) that is applicable<br />

to both structured and unstructured<br />

environments.<br />

• Instead of explicitly identifying obstacles,<br />

we detect scene regions that are<br />

traversable – safe for the robot to go to –<br />

from the current position.<br />

• The algorithm does not make any flat-world<br />

assumptions or need sensor pitch-roll<br />

compensation. It also accounts for<br />

overhanging structures.<br />

• We demonstrate our approach using both<br />

lidar and stereo sensors.<br />

Top: Lidar based OD<br />

Bottom: Stereo based OD


Session ThAT3 Continental Parlor 3 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Discrete & Kinodynamic Planning<br />

Chair Edwin Olson, Univ. of Michigan<br />

Co-Chair Luigi Biagiotti, Univ. of Modena and Reggio Emilia<br />

08:50–08:55 ThAT3.5<br />

Iterative Path Optimization for<br />

Practical Robot Planning<br />

Andrew Richardson and Edwin Olson<br />

Computer Science & Engineering, University of Michigan, USA<br />

• Two-stage path planning method to<br />

decouple path clearance from<br />

minimum distance path computation<br />

• Avoids some potential field pitfalls with<br />

partially-observed obstacle maps<br />

• Iterative optimization to maximize<br />

clearance (up to bound) and<br />

smoothness<br />

• Algorithm tested extensively on our<br />

indoor/outdoor 14-robot testbed<br />

Smoothed path (blue) maximizes<br />

clearance at A but not B, as<br />

clearance is sufficient. Original path<br />

in orange<br />
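A toy version of the idea above (invented gains `alpha` and `beta`, not the paper's algorithm): smooth interior waypoints toward their neighbours' midpoint while pushing them away from the nearest obstacle, but only while clearance is below a bound, so clearance is maximized "up to bound" as the summary states.

```python
import numpy as np

def smooth_path(path, obstacles, clearance_bound=1.0,
                alpha=0.3, beta=0.5, iters=100):
    """Iteratively smooth a 2-D waypoint path while enforcing clearance
    up to `clearance_bound` (illustrative sketch; gains are invented)."""
    p = np.asarray(path, dtype=float).copy()
    obs = np.asarray(obstacles, dtype=float)
    for _ in range(iters):
        for i in range(1, len(p) - 1):
            # smoothness: step toward the midpoint of the neighbours
            p[i] += alpha * (0.5 * (p[i - 1] + p[i + 1]) - p[i])
            # clearance: repel from the nearest obstacle, capped at the bound
            d = np.linalg.norm(obs - p[i], axis=1)
            j = np.argmin(d)
            if d[j] < clearance_bound:
                p[i] += beta * (clearance_bound - d[j]) * (p[i] - obs[j]) / d[j]
    return p
```

Once a waypoint's clearance exceeds the bound the repulsion term switches off, which is the behaviour the figure caption describes: clearance is maximized at A but left alone at B where it is already sufficient.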

09:00–09:15 ThAT3.7<br />

Geometric Maneuverability<br />

with Applications to Low Reynolds Number Swimming<br />

Ross L. Hatton and Howie Choset<br />

Robotics Institute, CMU, USA<br />

Lisa J. Burton and A. E. Hosoi<br />

Department of Mechanical Engineering, MIT, USA<br />

• Locomoting systems use shape changes,<br />

rather than vector thrust, to move through<br />

their environment<br />

• We introduce a geometric framework for<br />

describing their maneuverability<br />

• Local maneuverability is similar to<br />

manipulability for robot arms and is<br />

described by an ellipsoid<br />

• Cyclic maneuverability characterizes<br />

bulk motion over gait or stroke cycles and<br />

is described by a polygon<br />

• Examples are provided for low Reynolds<br />

number swimmers<br />

08:55–09:00 ThAT3.6<br />

Analysis of the Discontinuities in Prioritized<br />

Task-Space Control Under Discrete Task<br />

Scheduling Operations<br />

François Keith and Pierre-Brice Wieber<br />

INRIA Rhône-Alpes, France<br />

Nicolas Mansard<br />

CNRS-LAAS, France<br />

Abderrahmane Kheddar<br />

CNRS-UM2 LIRMM, France – CNRS-AIST JRL, UMI3218/CRT Japan<br />

• Overview of methods that define a control law through a hierarchy of tasks, and of their continuity properties.<br />

• Illustration of their discontinuous<br />

behavior during discrete events such<br />

as task insertion, removal, swap.<br />

• Definition of a method ensuring a<br />

continuous control law even during<br />

these events.<br />

• Experimentation on the HRP-2 robot.<br />

09:15–09:30 ThAT3.8<br />

Input Shaping via B-spline Filters<br />

for 3-D Trajectory Planning<br />

Luigi Biagiotti<br />

Department of Information Engineering<br />

University of Modena and Reggio Emilia, Italy<br />

Claudio Melchiorri<br />

Department of Electronics, Informatics and Systems<br />

University of Bologna, Italy<br />

• Uniform B-splines of degree p are<br />

equivalent to the output of a chain<br />

composed of p average filters<br />

• The frequency spectrum of B-spline<br />

trajectories is completely determined<br />

by the degree p and by time period<br />

T between the knots<br />

• These parameters are selected to suppress the residual vibrations that may arise from elastic phenomena affecting the robotic system<br />

• The effectiveness of the proposed approach is demonstrated on a Cartesian robot with elastic joints<br />
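The first bullet's filter interpretation of uniform B-splines can be sketched numerically (illustrative; the sample period `dt` and the discrete approximation are my own choices): hold each control point for one knot span of duration T, then pass the staircase through a chain of p moving-average filters of length T.

```python
import numpy as np

def bspline_trajectory(control_points, T, p, dt):
    """Approximate a uniform B-spline trajectory of degree p by feeding
    the staircase of control points through p moving-average filters of
    duration T (discrete-time sketch of the filter interpretation)."""
    n = int(round(T / dt))          # samples per knot span
    # staircase signal: each control point held for one knot span
    u = np.repeat(np.asarray(control_points, float), n)
    kernel = np.ones(n) / n         # moving average over one span
    for _ in range(p):
        u = np.convolve(u, kernel)  # one averaging stage per degree
    return u
```

Each averaging stage places spectral zeros at multiples of 1/T, which is why choosing p and T appropriately can cancel a resonance, and the output respects the convex hull of the control points.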



Session ThAT4 Continental Ballroom 4 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Stochasticity in Robotics and Biological Systems I<br />

Chair M. Ani Hsieh, Drexel Univ.<br />

Co-Chair Harry Asada, MIT<br />

08:00–08:15 ThAT4.1*<br />

Semi-Plenary Invited Talk: Set-based Control in<br />

Stochastic Dynamical Systems: Making Almost<br />

Invariant Sets more Invariant<br />

Ira Schwartz, US Naval Research Laboratory<br />

08:30–08:45 ThAT4.3<br />

Noise, Bifurcations, and Modeling of Interacting<br />

Particle Systems<br />

Luis Mier-y-Teran-Romero and Ira B. Schwartz<br />

Nonlinear Systems Dynamics Section, Naval Research Laboratory, USA<br />

Eric Forgoston<br />

Department of Mathematical Sciences, Montclair State University, USA<br />

• Modeled self-propelling particles<br />

interacting through pairwise forces in the<br />

presence of noise and time delay.<br />

• Identified parameter regions resulting in<br />

different coherent structures of the swarm.<br />

• Showed how stochasticity modifies the<br />

coherent structures and attractor<br />

sensitivity.<br />

Time snapshots of transitioning<br />

swarms<br />

08:15–08:30 ThAT4.2<br />

Semi-Plenary Invited Talk: Set-based Control in<br />

Stochastic Dynamical Systems: Making Almost<br />

Invariant Sets more Invariant<br />

Ira Schwartz, US Naval Research Laboratory<br />

08:45–08:50 ThAT4.4<br />

Probability of success in stochastic robot<br />

navigation with state feedback<br />

Shridhar Shah<br />

Department of Mechanical Engineering, University of Delaware, USA<br />

Chetan Pahlajani<br />

Department of Mathematical Sciences, University of Delaware, USA<br />

Herbert Tanner<br />

Department of Mechanical Engineering, University of Delaware, USA<br />

• Robots with stochastic dynamics need<br />

to navigate in constrained<br />

environments.<br />

• Collision avoidance cannot be<br />

guaranteed with bounded inputs.<br />

• A deterministic, bounded input<br />

controller has a given probability of<br />

success in steering to target region,<br />

under stochastic uncertainty.<br />

• Explicit analytic solutions possible for<br />

single integrator point robots in a<br />

constrained, spherical but obstacle-free environment.<br />


Two sample paths for stochastic<br />

navigation using gradient<br />

descent. The green path reaches<br />

the goal while the blue path hits the<br />

obstacle boundary.
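Since the entry above concerns the probability of reaching a target before hitting a boundary under stochastic dynamics, a 1-D Monte-Carlo stand-in (not the authors' analytic method; all parameters invented) illustrates the quantity being computed: for driftless noise started midway between two barriers, the success probability is 1/2, and adding drift toward the goal raises it.

```python
import random

def success_probability(x0=0.5, goal=1.0, obstacle=0.0,
                        drift=0.0, sigma=1.0, dt=1e-3,
                        trials=2000, seed=1):
    """Monte-Carlo estimate of the probability that a 1-D stochastic
    single integrator reaches `goal` before hitting `obstacle`
    (toy stand-in for the analytic computation)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = x0
        # Euler-Maruyama simulation until one boundary is crossed
        while obstacle < x < goal:
            x += drift * dt + sigma * (dt ** 0.5) * rng.gauss(0, 1)
        if x >= goal:
            wins += 1
    return wins / trials
```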


Session ThAT4 Continental Ballroom 4 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Stochasticity in Robotics and Biological Systems I<br />

Chair M. Ani Hsieh, Drexel Univ.<br />

Co-Chair Harry Asada, MIT<br />

08:50–08:55 ThAT4.5<br />

A Stochastic Approach to Dubins Feedback<br />

Control for Target Tracking<br />

Ross Anderson and Dejan Milutinović<br />

Applied Mathematics&Statistics, University of California, Santa Cruz, USA<br />

• Motivated by an Unmanned Aerial Vehicle<br />

maintaining a nominal distance from an<br />

unpredictable target<br />

• Target is assumed to perform a random<br />

walk<br />

• Dynamic programming on a Markov chain that<br />

is consistent with stochastic kinematics<br />

produces a feedback controller that can<br />

also be applied for deterministic targets<br />

• Effect of the level of uncertainty of target<br />

motion on the control law is examined<br />

Dubins vehicle tracking<br />

a random walk<br />

09:00–09:15 ThAT4.7<br />

Stochastic Tracking of Migrating Live Cells<br />

Interacting with 3D Gel Environment Using<br />

Augmented-Space Particle Filters<br />

Lee-Ling Ong 1 , Levi Wood 3 , Marcelo Ang Jr. 2 , H. Harry Asada 1,3<br />

1 Singapore-MIT Alliance for Research and Technology, Singapore<br />

2 Department of Mechanical Engineering, NUS, Singapore<br />

3 Department of Mechanical Engineering, MIT, Cambridge, MA, USA<br />

• Types of images obtained at discrete<br />

time steps using confocal microscopy of<br />

3D cell migration experiments:<br />

a) 3D images of stained cell nuclei and<br />

b) 2D images of the gel<br />

• Our approach is to track a joint state<br />

representation, using the concepts of<br />

Simultaneous Localization and Mapping<br />

(SLAM)<br />

• This allows mathematically consistent<br />

updates from both sources and<br />

augmentation of new cells and conduit<br />

slices to the state<br />

Bottom figure shows visualization<br />

of the extracted conduit and cells<br />

using IMARIS (from Bitplane)<br />

08:55–09:00 ThAT4.6<br />

Optimization of Stochastic Strategies for<br />

Spatially Inhomogeneous Robot Swarms:<br />

A Case Study in Commercial Pollination<br />

Spring Berman and Radhika Nagpal<br />

Dept. of Computer Science, Harvard University, USA<br />

Ádám Halász<br />

Dept. of Mathematics, West Virginia University, USA<br />

• Scalable approach to optimizing<br />

decentralized robot control policies that<br />

produce a target collective behavior<br />

• Can maintain system performance by<br />

incorporating robot feedback that relies<br />

on coarse-grained localization<br />

• Swarm abstracted to linear ODE model<br />

whose parameters are optimized and<br />

mapped onto robot controllers<br />

• Application: Achieve uniform and<br />

nonuniform pollination of a field with a<br />

swarm of robotic bees<br />

09:15–09:30 ThAT4.8<br />

Stochastic Dynamics of Bacteria Propelled<br />

Spherical Micro-Robots<br />

Veaceslav Arabagi*, Bahareh Behkam**, and Metin Sitti*<br />

*Dept. of Mechanical Engineering, Carnegie Mellon University, USA<br />

**Dept. of Mechanical Engineering, Virginia Tech, USA<br />

• Stochastic dynamic model<br />

of bacteria propelled<br />

micro-beads<br />

• Stokes flow superposition<br />

of flow fields<br />

• Investigate stiffness of<br />

bacteria flagellar hook<br />

through simulations<br />

• Bead radius sensitivity<br />

• Number of attached<br />

bacteria sensitivity<br />

• Studying the long time<br />

random walk behavior of<br />

micro-beads<br />



Session ThAT5 Continental Ballroom 5 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Humanoid Robotics and Biped Locomotion<br />

Chair Ambarish Goswami, Honda Res. Inst.<br />

Co-Chair Seung-kook Yun, Honda Res. Inst.<br />

08:00–08:15 ThAT5.1*<br />

Semi-Plenary Invited Talk: Angular Momentum-based<br />

Humanoid Robot Control<br />

Shuuji Kajita, National Inst. of AIST<br />

08:30–08:45 ThAT5.3<br />

Momentum-Based Reactive Stepping Controller on<br />

Level and Non-level Ground for Humanoid Robot<br />

Push Recovery<br />

Seung-kook Yun and Ambarish Goswami<br />

Honda Research Institute, USA<br />

• We present a momentum-based reactive<br />

stepping controller for humanoid robot<br />

push recovery<br />

• Regulation of combinations of linear and<br />

angular momenta allows the controller to<br />

selectively encourage the robot to recover<br />

its balance with or without taking a step<br />

• A reference stepping location called<br />

Generalized Foot Placement Estimator<br />

(GFPE), is computed by modeling the<br />

humanoid as a passive rimless wheel with<br />

two spokes<br />

• GFPE is calculated only once and it does<br />

not move<br />

• The approach works for both level and<br />

non-level grounds<br />

Humanoid takes a downhill step<br />

after a forward push<br />

08:15–08:30 ThAT5.2<br />

Semi-Plenary Invited Talk: Angular Momentum-based<br />

Humanoid Robot Control<br />

Shuuji Kajita, National Inst. of AIST<br />

08:45–08:50 ThAT5.4<br />

Time-Independent, Spatial<br />

Human Coordination For Humanoids<br />

Jean-Christophe PALYART LAMARCHE 1 , Olivier BRUNEAU 2<br />

and Jean-Guy FONTAINE 1<br />

1 Italian Institute of Technology (IIT), Italy<br />

2 Laboratoire d’Ingénierie des Systèmes de Versailles (LISV), France<br />

• Methodology to parameterize movements<br />

with 12 Cartesian gait descriptors, and<br />

one unique gait parameter<br />

• Gait coordination for kinematic simulation<br />

of anthropomorphic and NAO virtual<br />

models<br />

• Real NAO robot walking<br />



Session ThAT5 Continental Ballroom 5 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Humanoid Robotics and Biped Locomotion<br />

Chair Ambarish Goswami, Honda Res. Inst.<br />

Co-Chair Seung-kook Yun, Honda Res. Inst.<br />

08:50–08:55 ThAT5.5<br />

The Effect of Swing Leg Retraction on<br />

Running Energy Efficiency<br />

Matt Haberland and Sangbae Kim<br />

Dept. of Mechanical Engineering, Massachusetts Institute of Technology, USA<br />

J. G. Daniël Karssen and Martijn Wisse<br />

Fac. of Mechanical Engineering, Delft University of Technology, The<br />

Netherlands<br />

• Swing leg retraction (SLR):<br />

rearward rotation of airborne front<br />

leg before touchdown<br />

• SLR significantly improves<br />

energetic efficiency of prismatic-legged running robots<br />

• SLR reduces touchdown energy<br />

loss, impact forces, and foot<br />

slippage simultaneously<br />

• SLR is recommended as an element<br />

of any control strategy for running<br />

robots<br />

Model consists of point mass body, distributed mass legs,<br />

torque actuator, massless springs, and point feet<br />

09:00–09:15 ThAT5.7<br />

Angular Momentum: Insights into Walking and<br />

Its Control<br />

Bradford C. Bennett<br />

Orthopaedic Surgery, University of Virginia, USA<br />

Thomas Robert<br />

IFSTTAR, Université de Lyon, France<br />

Shawn D. Russell<br />

Mechanical and Aerospace Engineering, University of Virginia, USA<br />

• Nondimensional angular<br />

momentum organization was<br />

independent of walking speed.<br />

• The uncontrolled manifold (UCM)<br />

hypothesis revealed a control<br />

synergy for some angular<br />

momenta.<br />

• Control of bipedal robots may<br />

benefit from simulating the<br />

organization of human gait<br />


The whole body normalized angular<br />

momentum during a gait cycle for slow,<br />

comfortable, and fast walking.<br />

08:55–09:00 ThAT5.6<br />

Practical Bipedal Walking Control<br />

on Uneven Terrain<br />

Using Surface Learning and Push Recovery<br />

Seung-Joon Yi, Daniel D. Lee<br />

ESE, University of Pennsylvania, USA<br />

Byoung-Tak Zhang<br />

CSE, Seoul National University, Korea<br />

Dennis Hong<br />

ME, Virginia Tech, USA<br />

• Bipedal walking in human environments is<br />

made difficult by two sources of error: the<br />

surface unevenness and external<br />

disturbances.<br />

• Electrically compliant swing foot and<br />

onboard sensors are used to estimate the<br />

current surface gradient, and an online<br />

learning algorithm is used to learn the<br />

surface model from noisy estimates<br />

• Biomechanically motivated hierarchical<br />

push recovery controller is used to reject<br />

external disturbances<br />

• Implemented on the DARwIn-OP<br />

commercial humanoid robot<br />

09:15–09:30 ThAT5.8<br />

XoR: Hybrid Drive Exoskeleton Robot<br />

That Can Balance<br />

Sang-Ho Hyon, Jun Morimoto, Takamitsu Matsubara,<br />

Tomoyuki Noda and Mitsuo Kawato<br />

Computational Neuroscience Lab, ATR, Japan<br />

• A novel exoskeleton robot prototype is<br />

developed<br />

• Pneumatic muscles and electric motors<br />

are arranged in an optimal way to achieve<br />

weight-reduction and torque-controllability<br />

• Autonomous postural control using hybrid<br />

drive is presented<br />

• Experimental video is attached<br />



Session ThAT6 Continental Ballroom 6 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Robots that Can See I<br />

Chair José Neira, Univ. de Zaragoza<br />

Co-Chair Dieter Fox, Univ. of Washington<br />

08:00–08:15 ThAT6.1*<br />

Semi-Plenary Invited Talk: Vision at Nanoscales<br />

Yu Sun, University of Toronto<br />

08:30–08:45 ThAT6.3<br />

Visual Place Categorization in Maps<br />


08:15–08:30 ThAT6.2<br />

Semi-Plenary Invited Talk: Vision at Nanoscales<br />

Yu Sun, University of Toronto<br />

08:45–08:50 ThAT6.4<br />

CD SLAM - Continuous Localization and<br />

Mapping in a Dynamic World<br />

Katrin Pirker, Matthias Rüther, and Horst Bischof<br />

Institute for Computer Graphics and Vision, Graz University of Technology,<br />

Austria<br />

• We perform large-scale,<br />

continuous, monocular localization<br />

and mapping<br />

• We handle short- and long-term<br />

scene dynamics by adding visibility<br />

information to each map point<br />

• Efficient keyframe organization<br />

facilitates sliding window bundle<br />

adjustment and speeds up loop<br />

closure correction<br />

• Scalability and localization accuracy are measured in a long-term experiment in mixed indoor/outdoor scenery<br />



Session ThAT6 Continental Ballroom 6 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Symposium: Robots that Can See I<br />

Chair José Neira, Univ. de Zaragoza<br />

Co-Chair Dieter Fox, Univ. of Washington<br />

08:50–08:55 ThAT6.5<br />

Vision-based Mobile Robot’s SLAM and<br />

Navigation in Crowded Environments<br />

Hiroshi Morioka, Sangkyu Yi<br />

Department of Computational Intelligence and System Science,<br />

Tokyo Institute of Technology, Japan<br />

Osamu Hasegawa<br />

Imaging Science and Engineering Laboratory,<br />

Tokyo Institute of Technology, Japan<br />

• We propose a vision-based SLAM and<br />

navigation method that is effective even in<br />

crowded environments.<br />

• Proposed method extracts robust 3D<br />

feature points from sequential vision<br />

images and odometry.<br />

• We can eliminate unstable feature points<br />

extracted from dynamic objects.<br />

• We present experiments showing the<br />

utility of our approach in crowded<br />

environments.<br />

09:00–09:15 ThAT6.7<br />

Real-Time Photo-Realistic 3D Mapping<br />

for Micro Aerial Vehicles<br />

Lionel Heng, Gim Hee Lee,<br />

Friedrich Fraundorfer, and Marc Pollefeys<br />

Computer Vision and Geometry Lab, ETH Zürich, Switzerland<br />

• A stereo or Kinect camera generates<br />

a point cloud; a 3D occupancy grid is<br />

built.<br />

• We use IMU measurements to<br />

establish a 1-point RANSAC for<br />

visual odometry (VO) estimates.<br />

• The occupancy grid, images, and VO<br />

estimates are transmitted wirelessly<br />

to a ground control station (GCS).<br />

• The GCS incrementally constructs a<br />

3D textured map via a view-dependent projective texturing<br />

method.<br />

3D Textured Map from Kinect data<br />
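The point-cloud-to-occupancy-grid step in the first bullet can be sketched as a minimal voxelization (grid size, resolution, and origin are invented, and a real mapper also updates free space along sensor rays):

```python
import numpy as np

def build_occupancy_grid(points, resolution=0.1, shape=(64, 64, 64),
                         origin=(0.0, 0.0, 0.0)):
    """Insert a point cloud (e.g. from stereo or a Kinect) into a
    fixed-size 3-D occupancy grid by voxelizing each point
    (minimal sketch of the mapping step)."""
    grid = np.zeros(shape, dtype=bool)
    # map metric coordinates to integer voxel indices
    idx = np.floor((np.asarray(points, float) - np.asarray(origin))
                   / resolution).astype(int)
    # keep only points that fall inside the grid volume
    ok = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    grid[tuple(idx[ok].T)] = True
    return grid
```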

08:55–09:00 ThAT6.6<br />

Vision Based Attitude And Altitude Estimation<br />

For UAVs In Dark Environments<br />

Ashutosh Natraj<br />

MIS Laboratory, University of Picardie Jules Verne, France<br />

Cédric Demonceaux<br />

Le2I Laboratory, University of Burgundy, France<br />

Pascal Vasseur<br />

LITIS Laboratory, University of Rouen, France<br />

Peter Sturm<br />

INRIA Grenoble – Rhônes-Alpes, France<br />

• A new sensor based on a laser projector<br />

and a fish-eye camera is proposed for<br />

UAV applications.<br />

• Attitude and altitude can be estimated in<br />

dark environments with good accuracy and<br />

in real-time.<br />

• The laser projects a circle on the ground, which is then captured by the fish-eye camera.<br />

• The paper demonstrates the geometric<br />

relations between the ground, the laser circle,<br />

and the camera and proposes a simple<br />

and fast algorithm for parameter<br />

estimation<br />

09:15–09:30 ThAT6.8<br />

Stereo Visual Odometry for Pipe Mapping<br />

Peter Hansen 1 , Hatem Alismail 2 , Brett Browning 1,2<br />

and Peter Rander 2<br />

1 QRI8 Lab, Carnegie Mellon University, Qatar<br />

2 Robotics Institute/NREC, Carnegie Mellon University, USA<br />

• Generating appearance maps for<br />

inspection of Liquefied Natural Gas pipes.<br />

• Using imagery from a customized stereo<br />

camera mounted on a pipe crawling robot.<br />

• Camera pose and scene point estimates<br />

obtained using Perspective-n-Points (PnP)<br />

and Sparse Bundle Adjustment (SBA).<br />

• Accurate pose estimates with error less<br />

than 1% for distance traveled.<br />


Our stereo camera system,<br />

and an appearance map of the<br />

internal surface of a carbon<br />

steel pipe.


Session ThAT7 Continental Parlor 7 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Mechanisms: Joint Design<br />

Chair Irene Sardellitti, Italian Inst. of Tech.<br />

Co-Chair Jean-François Brethe, LE HAVRE Univ.<br />

08:00–08:15 ThAT7.1<br />

Robust Estimation of Variable Stiffness<br />

in Flexible Joints<br />

Fabrizio Flacco and Alessandro De Luca<br />

Dipartimento di Informatica e Sistemistica,<br />

Università di Roma “La Sapienza”, Rome, Italy<br />

Irene Sardellitti and Nikos G. Tsagarakis<br />

Advanced Robotics Lab, Italian Institute of Technology, Genova, Italy<br />

• Stiffness is estimated using only position<br />

measurements on the motor sides and<br />

motor dynamic parameters<br />

• Residual based method to estimate the<br />

joint flexibility torque<br />

• Modified kinematic Kalman filter to handle<br />

discretization and quantization errors<br />

• Enhanced recursive least squares that<br />

does not suffer from lack of persistent<br />

excitation<br />

• Simulations and Experiments with the IIT<br />

AwAS joint<br />

08:30–08:45 ThAT7.3<br />

A bio-inspired condylar hinge joint<br />

for mobile robots<br />

Appolinaire C. Etoundi, Stuart C. Burgess<br />

Mechanical Engineering, University of Bristol, UK<br />

Ravi Vaidyanathan<br />

Mechanical Engineering, Imperial College London, UK<br />

• A bio-inspired design of hinge<br />

joint for mobile robots based on<br />

the human knee<br />

• Key feature of the design is its<br />

simplicity and similarity to the<br />

mammalian knee joint<br />

• Prototype model mimics the<br />

curved profiles of the human<br />

knee joint in order to achieve<br />

the benefits of high conformity<br />

and high stiffness and strength<br />

Femur<br />

Bush assembly<br />

Cap<br />

Nylon cable<br />

Strut x2<br />

Tibia<br />

Prototype condylar joint<br />

08:15–08:30 ThAT7.2<br />

Cartesian stiffness matrix of manipulators with<br />

passive joints: analytical approach<br />

Anatol Pashkevich 1,2 , Alexandr Klimchik 1,2 ,<br />

Stephane Caro 2 and Damien Chablat 2<br />

1 Ecole des Mines de Nantes, France<br />

2 Institut de Recherches en Communications et en Cybernetique de Nantes<br />

• Analytical expressions for stiffness matrix<br />

computation of serial kinematic chain with<br />

passive joints (stiffness mapping for<br />

manipulators with passive joints)<br />

• Recursive procedure for sequential modification of the original stiffness matrix in accordance with the geometry of each passive joint.<br />

• Aggregation of the serial chain stiffness<br />

matrices with respect to any given<br />

reference point of the mobile platform<br />

• Ability to produce both singular and nonsingular<br />

stiffness matrices for serial and<br />

parallel manipulators<br />


Virtual Joint Models of kinematic<br />

chain and parallel manipulator<br />
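For context, the classic stiffness mapping that the entry above generalizes to passive joints can be written in a few lines (standard textbook formula for a serial chain with elastic actuated joints, not the paper's extension):

```python
import numpy as np

def cartesian_stiffness(J, joint_stiffness):
    """Classic stiffness mapping for a serial chain with elastic
    actuated joints: Cartesian compliance C = J K_q^{-1} J^T, and
    K_C = C^{-1}. A passive joint (stiffness -> 0) makes C singular
    along that joint's twist, which is exactly the case the analytical
    approach above is built to handle; this sketch covers only the
    all-actuated case."""
    Kq_inv = np.diag(1.0 / np.asarray(joint_stiffness, float))
    C = J @ Kq_inv @ J.T          # Cartesian compliance
    return np.linalg.inv(C)       # Cartesian stiffness
```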

08:45–08:50 ThAT7.4<br />

The Mechanism of the Linear Load-Sensitive<br />

Continuously Variable Transmission<br />

with the Spherical Driving Unit<br />

Kenjiro TADAKUMA 1 , Riichiro TADAKUMA 2 , Kazuki TERADA 3 ,<br />

Aiguo MING 3 , Makoto SHIMOJO 3 , Mitsuru Higashimori 1 , Makoto Kaneko 1<br />

1. Osaka University, 2. Yamagata University, 3. UEC, Japan<br />

• The proposed CVT mechanism consists of<br />

spherical drive, drive axis, motor housing,<br />

fixed bracket and linear sliding plate.<br />

• The mechanism changes the reduction ratio<br />

continuously by the inclination angle of the active<br />

rotational axis.<br />

• This linear mechanism has a load-sensitive<br />

function by changing the inclination of the active<br />

rotational axis in response to the load.<br />

• A prototype mechanical model of the linear load-sensitive continuously variable transmission has been developed, and the effectiveness of the proposed mechanism has been confirmed experimentally.<br />



Session ThAT7 Continental Parlor 7 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Mechanisms: Joint Design<br />

Chair Irene Sardellitti, Italian Inst. of Tech.<br />

Co-Chair Jean-François Brethe, LE HAVRE Univ.<br />

08:50–08:55 ThAT7.5<br />

Dynamic model of a Hyper-redundant Octopus-like Manipulator for Underwater Applications<br />

Rongjie Kang, Emanuele Guglielmino, David T. Branson<br />

and Darwin G. Caldwell<br />

Advanced Robotics Department, Italian Institute of Technology, Italy<br />

Asimina Kazakidi, Dimitris P. Tsakiris and John A. Ekaterinaris<br />

Foundation for Research and Technology – Hellas (FORTH), Greece<br />

• Octopus arm anatomy and morphology<br />

• Kinematics of an octopus arm inspired<br />

manipulator<br />

• Dynamic modeling for a single segment<br />

with parallel mechanism<br />

• Multiple-segment model<br />

• External forces due to the aquatic<br />

environment, such as buoyancy and<br />

hydrodynamic forces<br />

09:00–09:15 ThAT7.7<br />

Granular Stochastic Modeling of<br />

Robot Micrometric Precision<br />

Brethé Jean-François<br />

GREAH, Le Havre University, France<br />

• Modeling and quantifying robot precision when information from<br />

external sensors in the operating area is available<br />

• Pros and cons of usual precision criteria, especially repeatability<br />

and accuracy in this micrometric context<br />

• Computing the worst error position in the micrometric range<br />

• Granular stochastic modeling means holes and aggregates in the<br />

micrometric structure!<br />

• Comparison to other precision modeling (input and output error<br />

modeling, interval analysis)<br />

08:55–09:00 ThAT7.6<br />

Three Module Lumped Element model of a<br />

Continuum Arm Section<br />

Nivedhitha Giri and Ian D. Walker<br />

Department of Electrical and Computer Engineering, Clemson University,<br />

United States of America<br />

• A continuum arm section is modeled using<br />

lumped model elements (mass, springs<br />

and dampers)<br />

• The model parameters are chosen to<br />

match the physical model of Octarm VI<br />

• Principles of Lagrangian Dynamics are<br />

used to derive the generalized forces in<br />

the system<br />

• Numerical results are compared with<br />

measurements<br />

• The linear approximation model provides<br />

sufficient information about the section’s<br />

configuration with a slight tradeoff in<br />

accuracy<br />


Linearized Dynamic model


Session ThAT8 Continental Parlor 8 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Autonomous Vehicles<br />

Chair Karl Iagnemma, MIT<br />

Co-Chair William Smart, Washington Univ. in St. Louis<br />

08:00–08:15 ThAT8.1<br />

Avoiding steering actuator saturation in off-road mobile<br />

robot path tracking via predictive velocity control<br />

Oliver Hach, Roland Lenain<br />

Cemagref, France<br />

Benoit Thuilot, Philippe Martinet<br />

Ifma and Lasmea, Université Blaise Pascal, France<br />

• Highly accurate control for off-road mobile<br />

robots<br />

• Observer to estimate grip conditions<br />

• Predictive control to ensure tracking<br />

accuracy and limit velocity in order to<br />

preserve path achievability<br />

• Algorithm applied to avoid actuator<br />

saturation: the maximal admissible<br />

velocity is computed to limit the steering<br />

angle to its saturation<br />

Views of the experimental off-road mobile robot<br />

08:30–08:45 ThAT8.3<br />

Integrating Stereo Structure<br />

for Omnidirectional Trail Following<br />

Christopher Rasmussen, Yan Lu, and Mehmet Kocamaz<br />

Dept. Computer & Information Sciences, University of Delaware, USA<br />

• A system for autonomous hiking/mountain-biking trail following is<br />

discussed<br />

• Structural information derived from dense stereo is integrated with color<br />

appearance information to segment and track the trail<br />

• The utility of structural information is demonstrated on ground truth<br />

segmentations, and live run results are presented<br />

Height snakes overlaid on height map for trail hypothesis<br />

08:15–08:30 ThAT8.2<br />

Consistent Pile-Shape Quantification for<br />

Autonomous Wheel Loaders<br />

Martin Magnusson and Håkan Almqvist<br />

AASS, Örebro University, Sweden<br />

• Experimental comparison of<br />

several algorithms for quantifying<br />

pile shapes.<br />

• Evaluate consistency w.r.t.<br />

viewpoint and sensor<br />

configuration.<br />

• Extensions of previous algorithms.<br />

• Novel method for segmenting pile<br />

from background in point-cloud<br />

data.<br />

• Propose to use quadric fitting for<br />

estimating relevant pile quantities.<br />


Application shown above,<br />

variance of convexity measures<br />

illustrated below.<br />

08:45–08:50 ThAT8.4<br />

Performance Analysis and Odometry<br />

Improvement of an Omnidirectional Mobile<br />

Robot for Outdoor Terrain<br />

Genya Ishigami *1 , Elvine Pineda *2 , Jim Overholt *3 ,<br />

Greg Hudas *3 , and Karl Iagnemma *2<br />

*1 Japan Aerospace Exploration Agency, JAPAN<br />

*2 Massachusetts Institute of Technology, USA<br />

*3 U.S. Army TARDEC, USA<br />

• An omnidirectional mobile robot that<br />

possesses high mobility in rough<br />

terrain is presented.<br />

• The robot has four active split offset<br />

caster (ASOC) modules, enabling the<br />

omnidirectional mobility.<br />

• The agility of the robot is evaluated in<br />

various configurations.<br />

• An odometry method along with<br />

kinematic constraints of the robot<br />

configuration is proposed.<br />

ASOC-driven Omnidirectional<br />

Mobile Robot


Session ThAT8 Continental Parlor 8 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Autonomous Vehicles<br />

Chair Karl Iagnemma, MIT<br />

Co-Chair William Smart, Washington Univ. in St. Louis<br />

08:50–08:55 ThAT8.5<br />

Inertial Rotation Center Position Estimation for a<br />

Perching Treaded Vehicle<br />

Christopher Schmidt-Wetekam, Nicholas Morozovsky,<br />

and Thomas Bewley<br />

Dept. of Mechanical Engineering, UC San Diego, USA<br />

• A treaded vehicle, “Switchblade”, uses a<br />

combination of tread and body actuation in<br />

order to balance while in a perched<br />

configuration.<br />

• Identify stair edges without tactile or visual<br />

feedback by monitoring the position of the<br />

center of rotation<br />

• Use offset acceleration measurements to<br />

calculate rotation center position (RCP)<br />

• Variance of calculated RCP is inversely<br />

proportional to angular acceleration<br />

• RCP variance is minimized when<br />

accelerometers are equidistant and<br />

widely-spaced about the actual RCP<br />

The treaded rover, “Switchblade”,<br />

is intended to climb stairs using a<br />

perching maneuver.<br />

09:00–09:15 ThAT8.7<br />

An Automated Truck Platoon for Energy Saving<br />

Sadayuki Tsugawa<br />

Department of Information Engineering, Meijo University, Japan<br />

Shin Kato<br />

Institute of Intelligent Systems, AIST, Japan<br />

Keiji Aoki<br />

Department of ITS Research, JARI, Japan<br />

• Objectives: energy saving and CO2 emission reduction for road transportation<br />

in addition to safety<br />

• Platoon: 3 heavy (25 ton) trucks<br />

• Functions: fully automated driving of an<br />

autonomous type<br />

• Lateral control: lane marker detection with computer vision<br />

• Longitudinal control: gap measurement with radar and lidar, V2V<br />

communications<br />

• Experiments: driving at 80 km/h with 10 m gap on a test track (oval, 3.2<br />

km) and along an expressway (8 km)<br />

• Fuel consumption improvement: 14 % (measurement)<br />

• CO2 emission reduction: 2.1 % (40 % penetration, simulation)<br />
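The longitudinal loop described above (a measured gap regulated to 10 m at 80 km/h) can be sketched as a simple follower simulation. This is an illustrative PD controller, not the authors' system; the gains, acceleration limits, and time step are assumed values.

```python
# Illustrative sketch of longitudinal gap keeping in a platoon (assumed
# PD gains and limits, not the paper's controller). The follower reads
# the gap to the leader and accelerates to hold the target gap.

def simulate_follower(leader_speed=22.2, target_gap=10.0, dt=0.05,
                      steps=2000, kp=0.8, kd=1.5):
    gap, v = 30.0, 22.2                   # initial gap [m], follower speed [m/s]
    prev_err = gap - target_gap
    for _ in range(steps):
        err = gap - target_gap            # positive: follower too far behind
        accel = kp * err + kd * (err - prev_err) / dt
        accel = max(-3.0, min(1.5, accel))  # assumed comfort/actuator limits
        prev_err = err
        v += accel * dt
        gap += (leader_speed - v) * dt    # gap shrinks when follower is faster
    return gap, v
```

After 100 s of simulated driving the follower settles at the target gap while matching the leader's speed.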

08:55–09:00 ThAT8.6<br />

Improving near-to-near lateral control of<br />

platoons without communication<br />

Jano Yazbeck, Alexis Scheuer,<br />

Olivier Simonin and François Charpillet<br />

LORIA / INRIA, Nancy, France<br />

• Proposition of a decentralized local<br />

approach for secure platooning.<br />

• Generic method based on memorization<br />

of the predecessor's path to drastically<br />

reduce lateral deviation.<br />

• Use of a longitudinal controller to ensure<br />

collision avoidance.<br />

• Experiments on differential-wheeled<br />

Khepera III robots.<br />

09:15–09:30 ThAT8.8<br />

Context-Aware Video Compression<br />

for Mobile Robots<br />

Daniel A. Lazewatsky*, Bogumil Giertler**, Martha Witick**, Leah<br />

Perlmutter**, Bruce A. Maxwell**, and William D. Smart*<br />

*Dept of Computer Science & Engineering, Washington University in St. Louis, USA<br />

**Dept of Computer Science, Colby College, USA<br />

• Remote teleoperation<br />

often relies heavily on<br />

live video which travels<br />

over network(s) with<br />

unknown conditions<br />

• General-purpose video<br />

compression does not<br />

take into account<br />

conditions specific to<br />

robot operation<br />

• Integrating robot<br />

odometry into video<br />

compression pipeline can simultaneously reduce bandwidth and latency<br />

without affecting the robot operator’s task performance<br />



Session ThAT9 Continental Parlor 9 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Towards Anthropomimetic Robots<br />

Chair Michael Beetz, Tech. Univ. München<br />

Co-Chair Frank Kirchner, Univ. of Bremen<br />

08:00–08:15 ThAT9.1<br />

A model of reference trajectory adaptation<br />

for interaction with objects<br />

of arbitrary shape and impedance<br />

Chenguang Yang and Etienne Burdet<br />

Imperial College London, UK<br />

• In the presence of an obstacle, the trajectory<br />

drifts trial after trial<br />

• This tends to limit the interaction force<br />

• We have derived the dynamics of this<br />

human adaptation strategy and simulated it<br />

• This yields a stable trajectory adaptation<br />

behavior for robots<br />

Figure: reference and actual trajectories over trials 1, 5 and 9 against a stiff object.<br />

08:30–08:45 ThAT9.3<br />

Predictive Compliance for Interaction Control of<br />

Robot Manipulators<br />

José de Gea Fernández 1 and Frank Kirchner 1,2<br />

1 German Research Center for Artificial Intelligence (DFKI)<br />

Robotics Innovation Center<br />

2 Robotics Group, University of Bremen<br />

Bremen, Germany<br />

• Predictive Context-Based Adaptive<br />

Compliance (PCAC) architecture is<br />

presented<br />

• Context-based predictions for the<br />

selection and on-line modification of the<br />

compliance of a robot manipulator<br />

• A Bayesian predictor in the form of a<br />

Relevance Vector Machine combines<br />

the use of prior knowledge and<br />

expected sensory feedback to correct<br />

for an erroneous compliance in the<br />

case of a falsely-predicted context<br />

Robot performing the lifting<br />

experiment which makes use of<br />

context-based predictions to<br />

adjust the arm compliance<br />

08:15–08:30 ThAT9.2<br />

Generic Dynamic Motion Generation<br />

with Multiple Unilateral Constraints<br />

L. Saab, O. Ramos, N. Mansard and P. Souères<br />

LAAS, Université de Toulouse (UPS), France<br />

J.Y. Fourquet<br />

LGP, ENIT, France<br />

• Dynamic whole-body motion generation<br />

• Generic formulation of complex<br />

movements as stack of tasks including<br />

rigid contacts and unilateral constraints<br />

• Global resolution method involving a<br />

cascade of quadratic programming<br />

• Simulation based on the complete<br />

dynamic model of the humanoid robot<br />

HRP-2<br />


A sitting task for HRP-2 involving<br />

several non-coplanar contacts<br />

08:45–08:50 ThAT9.4<br />

Parameterizing Actions to have the<br />

Appropriate Effects<br />

Lorenz Mösenlechner and Michael Beetz<br />

Intelligent Autonomous Systems Group<br />

Technische Universität München, Germany<br />

• Performing complex tasks in human<br />

environments is hard and requires<br />

reasoning (qualitative and quantitative)<br />

• Plans use symbolic descriptions of<br />

locations, objects, … which need to be<br />

resolved at run-time<br />

• We present a fast and light-weight<br />

reasoning system that integrates<br />

Prolog, a physics engine and OpenGL<br />

to infer locations under the constraints<br />

of the current task


Session ThAT9 Continental Parlor 9 Thursday, September 29, <strong>2011</strong>, 08:00–09:30<br />

Towards Anthropomimetic Robots<br />

Chair Michael Beetz, Tech. Univ. München<br />

Co-Chair Frank Kirchner, Univ. of Bremen<br />

08:50–08:55 ThAT9.5<br />

Physics-based Modeling of an<br />

Anthropomimetic Robot<br />

Steffen Wittmeier, Michael Jäntsch,<br />

Konstantinos Dalamagkidis and Alois Knoll<br />

Faculty of Informatics, TU München, Germany<br />

• Control of highly-redundant, tendon-driven<br />

robots remains a very challenging task.<br />

• Here, the reverse-engineered derivation of<br />

a physics-based model of a hand-crafted,<br />

anthropomimetic robot arm is presented.<br />

• The skeleton model was derived from a<br />

total of 25 laser-scans and actuator as<br />

well as sensor models were implemented.<br />

• Moreover, a muscle control algorithm was<br />

implemented to demonstrate and validate<br />

the dynamics of the simulation model.<br />

• The developed model will subsequently be<br />

employed as an internal dynamics model<br />

for robot control.<br />

Prototype of the anthropomimetic<br />

robot ECCE-I.<br />

09:00–09:15 ThAT9.7<br />

A comparison between joint level torque<br />

sensing and proximal F/T sensor torque<br />

estimation: implementation on the iCub<br />

M. Randazzo, M. Fumagalli, F. Nori, L. Natale,<br />

G. Metta, G. Sandini<br />

Robotics, Brain and Cognitive Sciences Department,<br />

Istituto Italiano di Tecnologia, Italy<br />

• Two different approaches of<br />

achieving compliant control on a<br />

humanoid robotic arm are compared.<br />

• A model-based approach is used to<br />

estimate both the internal and the<br />

external wrenches using a single<br />

proximal six-axis force-torque sensor.<br />

• On the other hand, integrated torque<br />

sensors have been developed to<br />

directly measure the joint torques.<br />

• The obtained estimations and<br />

torques measurements are<br />

experimentally compared and the<br />

benefits / disadvantages of the two<br />

approaches are discussed.<br />

08:55–09:00 ThAT9.6<br />

Reexamining Lucas-Kanade Method for Real-Time<br />

Independent Motion Detection: An Application to<br />

the iCub Humanoid Robot<br />

C. Ciliberto, U. Pattacini, L. Natale, F. Nori and G. Metta<br />

RBCS, Istituto Italiano di Tecnologia, Italy<br />

• The motionCUT algorithm is presented to<br />

perform real-time independent motion<br />

detection for applications in robotics.<br />

• The canonical Lucas-Kanade method for<br />

optical flow is analyzed to derive a criterion<br />

that is apt to discriminate between the<br />

robot egomotion and independent motion<br />

in camera-acquired image streams.<br />

• Extensive experimental validation is<br />

carried out on the iCub humanoid robot,<br />

demonstrating that the proposed system<br />

responds smoothly to changes in the<br />

environmental conditions.<br />

• A task involving the stereo gaze tracking of<br />

a moving target is examined as a practical<br />

example of the motionCUT capabilities<br />


Stereo tracking of a person walking in<br />

a cluttered environment<br />

09:15–09:30 ThAT9.8<br />

Computation with mechanically coupled springs<br />

for compliant robots<br />

Hidenobu Sumioka, Helmut Hauser,<br />

and Rolf Pfeifer<br />

the Department of Informatics<br />

University of Zurich, Switzerland<br />

• We identify the computation that a<br />

compliant physical body can achieve.<br />

• Actuation of the springs around a joint is<br />

used as a computational device.<br />

• Only a linear and static readout is adapted<br />

to compute the output.<br />

• Computer simulation shows the spring<br />

network can emulate nonlinear<br />

combinations which need temporal<br />

integration.<br />

An overview of information<br />

processing in the system and the<br />

emulated trajectories .The system<br />

can emulate 2 nd order systems and<br />

10 th ones accurately.


Session ThBT1 Continental Parlor 1 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Control for Manipulation & Grasping<br />

Chair Jing Xiao, UNC-Charlotte<br />

Co-Chair Luigi Villani, Univ. di Napoli Federico II<br />

10:00–10:15 ThBT1.1<br />

Adaptive Sliding Mode Control of Grasped<br />

Object Slip for Prosthetic Hands<br />

Erik D. Engeberg<br />

Mechanical Engineering Department, The University of Akron, USA<br />

Sanford Meek<br />

Mechanical Engineering Department, The University of Utah, USA<br />

• The first robustly stable grasped object<br />

slip prevention algorithm for a prosthetic<br />

hand.<br />

• The adaptive integral sliding mode slip<br />

prevention controller also minimizes<br />

inadvertent object crushing when slip<br />

occurs<br />

• Slip is detected by band pass filtering the<br />

shear force derivative signal to amplify<br />

vibrations that occur<br />

• Five different values of object stiffness are<br />

tested with three different control<br />

algorithms<br />
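The slip-detection idea above (band-pass filtering the shear-force derivative to expose slip vibrations) can be sketched as follows. The EMA-based band-pass approximation, sampling rate, and threshold are illustrative assumptions, not the paper's filter design.

```python
# Illustrative slip-detection sketch (assumed filter and threshold, not
# the paper's exact method): slip is flagged when high-frequency energy
# in the shear-force derivative exceeds a threshold. A band-pass is
# approximated by the difference of a fast and a slow exponential
# moving average of the derivative.

def detect_slip(shear_force, dt=0.001, a_fast=0.5, a_slow=0.05, thresh=5.0):
    fast = slow = 0.0
    prev = shear_force[0]
    flags = []
    for f in shear_force:
        d = (f - prev) / dt          # shear-force derivative
        prev = f
        fast += a_fast * (d - fast)  # fast EMA keeps vibration content
        slow += a_slow * (d - slow)  # slow EMA tracks the slow trend
        flags.append(abs(fast - slow) > thresh)
    return flags
```

A smoothly increasing grip force yields no flags, while a vibration burst superimposed on the force signal is detected.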

10:30–10:45 ThBT1.3<br />

Self-tuning Cooperative Control of Manipulators<br />

with Position/Orientation Uncertainties in The<br />

Closed-Kinematic Loop<br />

Farhad Aghili<br />

Canadian Space Agency (CSA), Canada<br />

• Adaptive control of cooperative<br />

manipulators in the presence of<br />

uncertainties in the closed-loop<br />

kinematic loop<br />

• The kinematic parameters are<br />

estimated in real-time using two<br />

cascaded estimators<br />

• No force measurement is required to<br />

deal with kinematic uncertainties of<br />

interconnected robotic system.<br />

• Convergence and stability of the<br />

adaptive control process can be<br />

ensured if the direction of angular<br />

velocity of the object does not remain<br />

constant over time.<br />

• Experimental results are presented<br />

The cooperative dual-arm<br />

manipulator system<br />

10:15–10:30 ThBT1.2<br />

Cartesian Impedance Control For A Variable<br />

Stiffness Robot Arm<br />

Florian Petit and Alin Albu-Schäffer<br />

Institute of Robotics and Mechatronics, German Aerospace Center (DLR),<br />

Germany<br />

• Combination of an active Cartesian<br />

impedance controller and the variable<br />

stiffness robot.<br />

• An algorithm to optimize the passive and<br />

active Cartesian stiffness for increased<br />

tracking performance is presented.<br />

• The extended stiffness range provided by<br />

the algorithm is shown.<br />

• Implementation and experimental results<br />

on the DLR Hand Arm System are given.<br />


The DLR Hand Arm System<br />

10:45–10:50 ThBT1.4<br />

Kinematic Control with Force Feedback for a<br />

Redundant Bimanual Manipulation System<br />

F. Caccavale*, V. Lippiello + , G. Muscio*, F. Pierri*, F. Ruggiero + , L. Villani +<br />

*Dipartimento di Ingegneria e Fisica dell’Ambiente, Università della Basilicata, Italy<br />

+ Dipartimento di Informatica e Sistemistica, Università di Napoli Federico II, Italy<br />

• Kinematic model for a multi-fingered<br />

redundant bimanual manipulation system<br />

with elastic contacts, while grasping an<br />

object<br />

• Two-stages control scheme<br />

• First stage: inverse kinematics with<br />

redundancy resolution to compute joint<br />

references<br />

• Second stage: parallel force/position<br />

control to impose desired object<br />

motion and normal contact forces<br />

• Secondary tasks accomplished through a<br />

prioritized task-sequencing strategy


Session ThBT1 Continental Parlor 1 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Control for Manipulation & Grasping<br />

Chair Jing Xiao, UNC-Charlotte<br />

Co-Chair Luigi Villani, Univ. di Napoli Federico II<br />

10:50–10:55 ThBT1.5<br />

Robust Manipulation<br />

for Temporary Lack of Sensory Information<br />

by a Multi-Fingered Hand-Arm System<br />

Akihiro Kawamura, Kenji Tahara, Ryo Kurazume<br />

and Tsutomu Hasegawa<br />

Kyushu University, Japan<br />

• A new manipulation scheme against the<br />

temporary lack of sensory information is<br />

proposed.<br />

• According to the availability of sensory<br />

information, the controlled target frame of<br />

the object is switched from the real one to<br />

the virtual one, and vice versa.<br />

• A new virtual object frame based on the<br />

attitude of each fingertip is proposed.<br />

• A numerical simulation result shows that<br />

the proposed method realizes stable<br />

object manipulation even if sensory<br />

information is unavailable.<br />

Multi-fingered hand-arm system<br />

11:00–11:15 ThBT1.7<br />

Impedance control of a non-linearly coupled<br />

tendon driven thumb<br />

Maxime Chalon, Werner Friedl, Jens Reinecke, Thomas<br />

Wimboeck and Alin Albu-Schaeffer<br />

Institute of Robotics and Mechatronics, German Aerospace Center<br />

• The thumb nonlinear actuation structure<br />

provides improved torque and range of<br />

motion but creates new control challenges<br />

• A singular perturbation based controller is<br />

designed and implemented on the real<br />

system<br />

• Experiments and simulation are performed<br />

to verify the performance<br />

• Due to the absence of link-side sensors, a<br />

projected gradient method is implemented<br />

in real-time to estimate the joint position<br />

10:55–11:00 ThBT1.6<br />

Determining “Grasping” Configurations for a<br />

Spatial Continuum Manipulator<br />

Jinglin Li and Jing Xiao<br />

IMI Lab, Department of Computer Science<br />

University of North Carolina - Charlotte, USA<br />

• A continuum manipulator, such as a<br />

multi-section trunk/tentacle robot, is promising<br />

for deft manipulation of a wide range of<br />

objects of different shapes and sizes.<br />

• This paper presents a general and<br />

complete analysis to determine grasping<br />

configurations for any given 3D object by<br />

a spatial continuum manipulator with three<br />

constant-curvature sections; the<br />

curvature, length, and orientation of each<br />

section are continuous variables.<br />

• Our method finds all types of grasping<br />

configurations through solving intersection<br />

constraint equations. The<br />

implementation results show its efficiency<br />

for on-line operation.<br />


One grasping configuration,<br />

where sections 3 and 2 of the<br />

continuum manipulator wrap<br />

around the object.<br />

11:15–11:30 ThBT1.8<br />

Energy Shaping Control of Robot Manipulators<br />

in Explicit Force Regulation Tasks with Elastic<br />

Environments<br />

David Navarro-Alarcon, Peng Li, and Hiu Man Yip<br />

Department of Mechanical and Automation Engineering,<br />

The Chinese University of Hong Kong, Hong Kong S.A.R.<br />

• Physical interaction of a robot manipulator<br />

with a purely elastic environment<br />

• The closed-loop elastic energy is shaped<br />

based on the available force sensory<br />

feedback<br />

• Exact force regulation is achieved by<br />

means of a force-modulated “energy<br />

injection” to the system<br />

• Simple experimental results are presented<br />

to validate the approach<br />

Conceptual representation


Session ThBT2 Continental Parlor 2 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Mapping<br />

Chair Cyrill Stachniss, Univ. of Freiburg<br />

Co-Chair Mitch Bryson, Univ. of Sydney<br />

10:00–10:15 ThBT2.1<br />

Visual Mapping with Uncertainty<br />

for Correspondence-free Localization<br />

using Gaussian Process Regression<br />

T. Schairer 1 , B. Huhle 1 , P. Vorst 2 , A. Schilling 1 and W. Straßer 1<br />

1 Dept. of Graphical Interactive Systems, University of Tübingen, Germany<br />

2 Dept. of Cognitive Systems, University of Tübingen, Germany<br />

• Setting:<br />

• Appearance-based localization using<br />

low-resolution omnidirectional images<br />

• Gaussian Process regression for<br />

modeling image variation of the scene<br />

• Problem:<br />

• Time-consuming acquisition of reference<br />

images with exact position information<br />

• Our solution:<br />

• Use available odometry data<br />

• Learn a probabilistic map by fusing<br />

uncertain training data<br />

• Quality increases with every new run<br />

Reference images (dots), odometry<br />

(dotted), ground-truth (solid) and<br />

our approach (dashed).<br />

10:30–10:45 ThBT2.3<br />

Dense visual mapping of large scale<br />

environments for real-time localisation<br />

Maxime Meilland and Patrick Rives<br />

INRIA Sophia Antipolis Méditerrannée, France<br />

Andrew Ian Comport<br />

CNRS, I3S Laboratory, Université Nice Sophia Antipolis, France<br />

• Custom sensor for augmented visual<br />

sphere construction with a unique<br />

center of projection.<br />

• Calibration of a ring of divergent stereo<br />

cameras.<br />

• Automatic environment mapping using<br />

an ego-centered representation.<br />

• Accurate real-time robot localisation<br />

navigating locally within the learnt<br />

model.<br />

10:15–10:30 ThBT2.2<br />

Distributed Large Scale Terrain Mapping for Mining<br />

and Autonomous Systems<br />

Paul Thompson, Eric Nettleton and Hugh Durrant-Whyte<br />

The Australian Centre for Field Robotics (ACFR)<br />

The University of Sydney, Australia<br />

• A method for efficient fusion, estimation and distribution of large<br />

scale terrain<br />

• Based on sparse information smoothing, related to finite-elements<br />

(FEM) and Gaussian processes (GPs)<br />

• Results for efficient fusion and distribution<br />

• Applied to multiple terrain scanning lasers in mining<br />

10:45–10:50 ThBT2.4<br />

Hierarchies of Octrees for Efficient 3D Mapping<br />

K.M. Wurm 1 , D. Hennes 2 , D. Holz 3 , R.B. Rusu 4 , C. Stachniss 1 , K. Konolige 4 , W. Burgard 1<br />

1 Dep. of Computer Science, University of Freiburg, Germany<br />

2 Dep. of Knowledge Engineering, Maastricht University, The Netherlands<br />

3 Autonomous Intelligent Systems Group, University of Bonn, Germany<br />

4 Willow Garage Inc., CA, USA<br />

• Multi-resolution 3D representation<br />

organized as a tree of submaps<br />

• Each submap is updated and<br />

transformed individually<br />

• Models occupied and free space<br />

efficiently<br />

• Application to table-top<br />

manipulation and autonomous<br />

exploration<br />


Top: model of a table top scene.<br />

Bottom: illustration of the map hierarchy
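A minimal octree sketch can illustrate the multi-resolution idea behind this representation (not the authors' implementation): points are inserted to a fixed depth, each level halving the cell size, so coarse queries can stop at inner nodes. The bounds and depth below are assumed values.

```python
# Minimal octree sketch for multi-resolution 3D occupancy (illustrative
# only): each node covers a cube; insertion descends into one of eight
# child octants per level until a fixed leaf depth is reached.

class Octree:
    def __init__(self, center=(0.0, 0.0, 0.0), size=10.0, depth=4):
        self.center, self.size, self.depth = center, size, depth
        self.children = {}     # octant index (0-7) -> child Octree
        self.occupied = False

    def insert(self, p):
        self.occupied = True
        if self.depth == 0:    # reached leaf resolution
            return
        # Octant index: one bit per axis, set if p is above the center.
        idx = sum((1 << k) for k in range(3) if p[k] >= self.center[k])
        if idx not in self.children:
            half = self.size / 2
            c = tuple(self.center[k] + (half / 2 if p[k] >= self.center[k]
                                        else -half / 2) for k in range(3))
            self.children[idx] = Octree(c, half, self.depth - 1)
        self.children[idx].insert(p)

    def leaf_count(self):
        if not self.children:
            return 1 if self.occupied else 0
        return sum(c.leaf_count() for c in self.children.values())
```

Two nearby points fall into the same leaf cell, while a distant point creates a separate branch; only the occupied part of space is ever allocated.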


Session ThBT2 Continental Parlor 2 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Mapping<br />

Chair Cyrill Stachniss, Univ. of Freiburg<br />

Co-Chair Mitch Bryson, Univ. of Sydney<br />

10:50–10:55 ThBT2.5<br />

A Comparison of Feature and Pose-Based<br />

Mapping using Vision, Inertial and GPS on a UAV<br />

Mitch Bryson and Salah Sukkarieh<br />

Australian Centre for Field Robotics, University of Sydney, Australia<br />

• Presents and compares two different<br />

approaches to integrating IMU, GPS<br />

and monocular vision camera data<br />

from an Unmanned Aerial Vehicle for<br />

building large-scale 3D terrain<br />

reconstructions.<br />

• Novel pose-only non-linear<br />

least-squares formulation that optimises<br />

relative camera poses based on 1D<br />

epipolar constraints imposed by<br />

vision matches.<br />

• Results presented within an<br />

environmental mapping project over<br />

outback Australia<br />

11:00–11:15 ThBT2.7<br />

Occupancy Grid Rasterization in Large<br />

Environments for Teams of Robots<br />

Johannes Strom and Edwin Olson<br />

Computer Science and Engineering, University of Michigan, USA<br />

• We present an efficient algorithm for<br />

dynamic occupancy grid computation in<br />

large environments (220m x 170m and up)<br />

• Map update cost is proportional to the impact of<br />

any loop closures<br />

• Rasterization can be sped up by<br />

identifying redundant sensor data<br />

• Applicable in dynamic environments by<br />

not rasterizing transient objects<br />

Rasterized map using proposed<br />

method (100m by 160m)<br />
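The core of occupancy-grid rasterization is tracing each range beam through the grid. The sketch below uses a standard Bresenham traversal with an assumed log-odds update; the paper's incremental, loop-closure-aware method is more involved.

```python
# Minimal occupancy-grid rasterization of one range beam (illustrative,
# with assumed log-odds increments): cells traversed by the beam are
# updated as free, the endpoint cell as occupied.

def bresenham(x0, y0, x1, y1):
    """Integer grid cells on the line from (x0, y0) to (x1, y1)."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    cells, x, y = [], x0, y0
    while True:
        cells.append((x, y))
        if (x, y) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
    return cells

def raster_beam(grid, x0, y0, x1, y1, l_free=-0.4, l_occ=0.85):
    """Accumulate log-odds evidence along one beam into a sparse grid."""
    cells = bresenham(x0, y0, x1, y1)
    for c in cells[:-1]:
        grid[c] = grid.get(c, 0.0) + l_free   # beam passed through: free
    grid[cells[-1]] = grid.get(cells[-1], 0.0) + l_occ  # beam hit: occupied
```

Using a dictionary keeps the grid sparse, so only cells actually touched by sensor data are stored.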

10:55–11:00 ThBT2.6<br />

Autonomous Semantic Mapping for Robots<br />

Performing Everyday Manipulation Tasks<br />

in Kitchen Environments<br />

Nico Blodow, Lucian Cosmin Goron, Zoltan-Csaba Marton, Dejan<br />

Pangercic, Thomas Ruehr, Moritz Tenorth, and Michael Beetz<br />

Intelligent Autonomous Systems, Technische Universitaet Muenchen, Germany<br />

• Autonomous exploration including the selection of the next-best-view<br />

pose in order to actively explore unknown space using a visibility<br />

kernel and associated costmaps<br />

• Registration, segmentation, and interpretation of color point cloud data<br />

from low-cost devices<br />

• Segmentation of differences of point clouds through interaction using the<br />

robot manipulator<br />

11:15–11:30 ThBT2.8<br />

Autonomous mapping for inspection<br />

of 3D structures<br />

Mahmoud Tavakoli, Ricardo Faria,<br />

Lino Marques, Anibal T. de Almeida<br />

Institute for Systems and Robotics, University of Coimbra, Portugal<br />

• Multiple terrain robots map a<br />

structure which should be explored<br />

by a pole climbing robot<br />

• Roombas are equipped with a<br />

wide angle camera<br />

• Patterns are fixed on the climbing<br />

robot<br />

• ARToolkit library is used for pose<br />

estimation between each camera<br />

and each pattern<br />

• A leap-frog methodology is applied<br />

for the cooperative localization of<br />

the robots<br />



Session ThBT3 Continental Parlor 3 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Randomized Planning & Kinematic Control<br />

Chair Nancy Amato, Texas A&M Univ.<br />

Co-Chair Elie Shammas, American Univ. of Beirut<br />

10:00–10:15 ThBT3.1<br />

FIRM: Feedback Controller-Based<br />

Information-state RoadMap<br />

A framework for planning under uncertainty<br />

Ali-akbar Agha-mohammadi, Suman Chakravorty, and Nancy Amato<br />

Computer Science and Engineering Dept., Texas A&M University, USA<br />

Aerospace Engineering Dept., Texas A&M University, USA<br />

• Generalizing PRM to belief space<br />

preserving the edge independence<br />

and thus breaking the curse of history<br />

in POMDPs<br />

• Inducing reachable belief nodes<br />

• Incorporating obstacles in planning<br />

• Online planning and replanning<br />

capabilities<br />

• Providing a feedback law over belief<br />

space<br />

10:30–10:45 ThBT3.3<br />

Optimal Probabilistic Robot Path Planning with<br />

Missing Information<br />

Mohamad Ali Movafaghpour and Ellips Masehian<br />

Faculty of Engineering, Tarbiat Modares University, Iran<br />

• The missing information about the<br />

outcomes of a robot’s actions can be<br />

modeled with a Stochastic Markov<br />

Decision Process with Hidden Variables,<br />

the exact values of which are not known at<br />

the beginning of path planning.<br />

• A Stochastic Dynamic Programming<br />

solution approach for planning with<br />

missing information is formulated as a<br />

Linear Programming (LP) model.<br />

• A new algorithm called Incremental<br />

Assumptive Probabilistic Planning (IAPP)<br />

is proposed for incrementally incorporating<br />

active Hidden Variables into the<br />

belief-state space and solving an LP.<br />

• An optimality proof for the path generated<br />

by the IAPP and a method for treating<br />

large-scale LP models are also presented.<br />


The robot’s start and goal points<br />

are on A4 and F4, and the<br />

open/closed states of the cells B5<br />

and E4 are unknown at the<br />

beginning. These cells represent<br />

the Hidden Variables whose<br />

states affect the outcomes and<br />

costs of certain actions.<br />

10:15–10:30 ThBT3.2<br />

Computing Spanners of<br />

Asymptotically Optimal Probabilistic Roadmaps<br />

James D. Marble and Kostas E. Bekris<br />

University of Nevada, Reno, USA<br />

• Asymptotically optimal motion planning<br />

algorithms produce very dense<br />

roadmaps<br />

• Spanner graphs are sub-graphs with<br />

fewer edges and guarantees on<br />

resulting path length<br />

• This work combines these two<br />

concepts to produce an asymptotically<br />

near-optimal planner<br />

– Path degradation is bounded<br />

• Small sacrifice in path quality results in:<br />

• Large savings in graph density<br />

• Faster online query resolution<br />


Given a dense roadmap (above) the<br />

resulting sparse spanner (below)<br />

preserves path quality.<br />
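A standard greedy construction for a graph spanner can illustrate the idea of trading a bounded path-length increase for far fewer edges. This is a generic sketch of the classic greedy t-spanner, not necessarily the paper's algorithm: an edge is kept only if the spanner built so far cannot already connect its endpoints within t times the edge weight.

```python
import heapq

# Illustrative greedy t-spanner sketch (a standard technique; the
# paper's construction and guarantees may differ). Edges are a list of
# (weight, u, v) tuples; kept edges guarantee spanner distances are at
# most t times the original graph distances.

def spanner_dist(adj, src, dst, limit):
    """Dijkstra distance from src to dst, pruned beyond `limit`."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, x = heapq.heappop(pq)
        if x == dst:
            return d
        if d > dist.get(x, float("inf")) or d > limit:
            continue
        for y, w in adj[x]:
            nd = d + w
            if nd < dist.get(y, float("inf")):
                dist[y] = nd
                heapq.heappush(pq, (nd, y))
    return float("inf")

def greedy_spanner(nodes, edges, t=2.0):
    adj = {v: [] for v in nodes}
    kept = []
    for w, u, v in sorted(edges):            # process edges by weight
        if spanner_dist(adj, u, v, limit=t * w) > t * w:
            adj[u].append((v, w))            # edge needed: add it
            adj[v].append((u, w))
            kept.append((w, u, v))
    return kept
```

On a square of unit edges with one diagonal shortcut, the shortcut is dropped because the two unit edges already connect its endpoints within the stretch factor.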

10:45–10:50 ThBT3.4<br />

Asymptotically-optimal Path Planning for<br />

Manipulation using Incremental Sampling-based<br />

Algorithms<br />

Alejandro Perez, Alexander Shkolnik, Seth Teller, Matthew R. Walter<br />

Computer Science and Artificial Intelligence Laboratory, MIT<br />

Sertac Karaman, Emilio Frazzoli<br />

Laboratory for Information and Decision Systems, MIT<br />

• Optimal motion planning for manipulation<br />

based on the RRT*<br />

• Efficient sampling using Ball Trees<br />

• Rapid generation of low-cost solutions<br />

• Dual-arm planning on Willow Garage’s<br />

PR2 robot<br />

Time-lapse images of RRT and<br />

BT+RRT* trajectories after 2,000<br />

iterations (12 DOF)


Session ThBT3 Continental Parlor 3 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Randomized Planning & Kinematic Control<br />

Chair Nancy Amato, Texas A&M Univ.<br />

Co-Chair Elie Shammas, American Univ. of Beirut<br />

10:50–10:55 ThBT3.5<br />

Quasi-Static Motion Planning on Uneven Terrain<br />

for a Wheeled Mobile Robot<br />

Vijay Eathakota , Aditya Gattupalli and K Madhava Krishna<br />

Robotics Research Center, IIIT Hyderabad, India<br />

• We develop a motion planning algorithm<br />

based on the RRT, connecting the starting<br />

and the end goal positions of the Wheeled<br />

Mobile Robot on Uneven Terrain while<br />

minimizing wheel slip<br />

• The curves connecting the adjacent nodes of<br />

the RRT are developed using the forward<br />

motion problem based on Peshkin’s<br />

minimum energy principle<br />

• The result of this planner is a set of ordinary<br />

differential equations (ODEs) along with a set<br />

of Differential Algebraic Equations (DAEs)<br />

• Simulation Results prove the efficacy of the<br />

planner<br />

Wheeled Mobile Robot on<br />

Uneven Terrain<br />
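The RRT skeleton underlying this planner can be sketched generically. In the sketch below the steering function is a straight-line step, whereas the paper connects nodes with slip-minimizing curves from the forward motion problem; the step size, goal bias, workspace bounds, and collision check are illustrative assumptions.

```python
import math
import random

# Generic RRT skeleton (illustrative; the paper replaces straight-line
# steering with slip-minimizing curves on uneven terrain). `is_free`
# checks only the new node here, an assumed simplification.

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def steer(p, q, step):
    d = dist(p, q)
    if d <= step:
        return q
    return (p[0] + step * (q[0] - p[0]) / d,
            p[1] + step * (q[1] - p[1]) / d)

def rrt(start, goal, is_free, step=0.5, goal_tol=0.5, iters=5000, seed=1):
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(iters):
        # Goal-biased sampling in an assumed 10 x 10 workspace.
        sample = goal if rng.random() < 0.1 else (rng.uniform(0, 10),
                                                  rng.uniform(0, 10))
        near = min(range(len(nodes)), key=lambda i: dist(nodes[i], sample))
        new = steer(nodes[near], sample, step)
        if is_free(new):
            parent[len(nodes)] = near
            nodes.append(new)
            if dist(new, goal) < goal_tol:   # goal reached: backtrack path
                path, i = [], len(nodes) - 1
                while i is not None:
                    path.append(nodes[i])
                    i = parent[i]
                return path[::-1]
    return None
```

In an obstacle-free workspace the tree reaches the goal well within the iteration budget.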

11:00–11:15 ThBT3.7<br />

Exact Motion Planning Solution for Principally<br />

Kinematic Systems<br />

Elie Shammas<br />

Mechanical Engineering, American University of Beirut, Lebanon<br />

Mauricio de Oliveira<br />

Mechanical and Aerospace Eng’g, University of California San Diego, USA<br />

• Motion planning problem for principally<br />

kinematic systems is solved analytically.<br />

• The input of the motion planning problem<br />

is a planar curve and the output is the time<br />

evolution of the input shape variables.<br />

• The base or shape variables are solved<br />

for analytically so that the kinematic snake<br />

traverses exactly a planar trajectory.<br />

• The motion planning techniques are<br />

tested with serpenoid, sinusoidal, and<br />

cubic planar curves.<br />

Kinematic snake traversing a<br />

cubic spline curve.<br />

10:55–11:00 ThBT3.6<br />

Minimum-time trajectories for kinematic mobile<br />

robots and other planar rigid bodies<br />

Andrei Furtuna, Wenyu Lu,<br />

Weifu Wang, Devin Balkcom<br />

Computer Science, Dartmouth College, USA<br />

• A generalized model of mobile robots includes Dubins, Reeds-<br />

Shepp and other vehicles as special cases.<br />

• Pontryagin’s principle gives strong necessary conditions on the<br />

minimum-time trajectories.<br />

• We present an algorithm that efficiently searches for minimum-time<br />

trajectories.<br />

A minimum-time trajectory for an<br />

omnidirectional vehicle<br />

11:15–11:30 ThBT3.8<br />

Continuous-Curvature Kinematic Control for<br />

Path Following Problems<br />

Vicent Girbés and Leopoldo Armesto and<br />

Josep Tornero and J. Ernesto Solanes<br />

Institute of Design and Manufacturing, Technical University of Valencia, Spain<br />

• A smooth kinematic controller is introduced, both mathematically and algorithmically.<br />

• Kinematic and dynamic constraints are taken into account.<br />

• The kinematic controller can be combined with global path planners or vision systems as waypoint generators.<br />

• A vision-based industrial forklift is used as an AGV for testing the proposed method.<br />

• A benchmark evaluation shows better performance than Pure-Pursuit methods.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–192–<br />

AGV with a vision-based smooth<br />

kinematic controller


Session ThBT4 Continental Ballroom 4 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Stochasticity in Robotics and Biological Systems II<br />

Chair M. Ani Hsieh, Drexel Univ.<br />

Co-Chair Dejan Milutinovic, Baskin School of Engineering, UC Santa Cruz<br />

10:00–10:15 ThBT4.1<br />

A Trajectory-based Calibration Method for<br />

Stochastic Motion Models<br />

Ezequiel Di Mario, Gregory Mermoud, Massimo<br />

Mastrangeli and Alcherio Martinoli<br />

Distributed Intelligent Systems and Algorithms Laboratory,<br />

EPFL, Switzerland<br />

• Quantitative method based on the<br />

Correlated Random Walk (CRW) model<br />

• Minimize Kolmogorov-Smirnov distance<br />

between step length and step angle<br />

distributions<br />

• Validation on real trajectories of<br />

centimeter-sized water-floating robots<br />

• Sensitivity analysis on a multi-robot<br />

simulation: effect of parameter<br />

perturbation on the predicted<br />

distributions of self-assembled<br />

structures<br />
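The calibration idea in the digest above, choosing motion-model parameters that minimize the Kolmogorov-Smirnov distance between observed and model step distributions, can be sketched as follows. This is a minimal stand-in, not the authors' code: it calibrates only a step-length scale under an assumed exponential step model, whereas the paper matches both step-length and step-angle distributions of the full CRW model.

```python
import numpy as np
from scipy.stats import kstest
from scipy.optimize import minimize_scalar

def calibrate_step_scale(observed_steps):
    """Pick the exponential step-length scale that minimizes the
    Kolmogorov-Smirnov distance to the observed step distribution.
    (Illustrative stand-in for the paper's CRW calibration.)"""
    def ks_cost(scale):
        # KS statistic between the data and Expon(loc=0, scale)
        return kstest(observed_steps, "expon", args=(0.0, scale)).statistic
    res = minimize_scalar(ks_cost, bounds=(1e-3, 10.0), method="bounded")
    return res.x

# Stand-in for tracked robot step lengths (true scale 1.5, invented here)
rng = np.random.default_rng(0)
observed = rng.exponential(1.5, size=2000)
scale = calibrate_step_scale(observed)
```

With enough samples the recovered scale lands close to the generating value, which is the sanity check one would run before trusting the calibrated model in simulation.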

10:30–10:45 ThBT4.3<br />

Stochastic optimal control with variable<br />

impedance manipulators in presence of<br />

uncertainties and delayed feedback<br />

Bastien Berret, Serena Ivaldi, Francesco Nori and Giulio Sandini<br />

Robotics, Brain and Cognitive Sciences, Italian Institute of Technology, Italy<br />

• Recent findings suggest that co-contraction,<br />

or the human ability to change<br />

the intrinsic musculo-skeletal compliance,<br />

plays a crucial role when dealing with<br />

uncertainties and unpredictability<br />

• We use numerical methods for stochastic<br />

optimal control of movements with<br />

variable impedance manipulators in<br />

presence of uncertainties<br />

• We show that actively varying the system<br />

compliance can prevent the<br />

disadvantages of delayed feedback if<br />

coupled with a global and centralized<br />

feedforward motor plan which exploits<br />

muscle co-contraction to achieve<br />

(feedback free) disturbance rejection<br />

Example of two-joint arm moving<br />

in a divergent force field, using<br />

variable impedance control<br />

10:15–10:30 ThBT4.2<br />

Cooperative Multi-Agent Inference over<br />

Grid Structured Markov Random Fields<br />

Ryan Williams and Gaurav Sukhatme<br />

Departments of Electrical Engineering and Computer Science, University of<br />

Southern California, USA<br />

• Cooperative inference with distributed<br />

observations, uncertainty modeled by grid<br />

structured pairwise Markov Random Field<br />

(MRF)<br />

• Proposed Multi-Agent MRF (MAMRF)<br />

decomposes global inference into local<br />

belief exchanges and per-agent inference<br />

problems<br />

• Loopy belief propagation for approximate<br />

inference with synchronous and proposed<br />

region-of-influence message passing<br />

• Spatial mapping simulations over a real-world Moderate Resolution Imaging Spectroradiometer (MODIS) dataset illustrate promising performance<br />


Cooperative spatial mapping<br />

simulation over MODIS sampling<br />
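The local belief exchanges underlying the approach above can be illustrated with plain sum-product loopy belief propagation on a toy pairwise MRF. This is a generic single-process sketch, not the MAMRF decomposition itself; the node layout, potentials, and schedule below are invented for illustration.

```python
import numpy as np

def loopy_bp(nodes, edges, unary, pairwise, iters=50):
    """Sum-product loopy belief propagation on a pairwise MRF.
    nodes: node ids; edges: (u, v) pairs; unary[u]: length-K potential;
    pairwise: shared (K, K) edge potential indexed [x_u, x_v].
    Returns normalized per-node beliefs."""
    nbrs = {u: [] for u in nodes}
    for u, v in edges:
        nbrs[u].append(v)
        nbrs[v].append(u)
    # Messages m[(u, v)]: message from u to v, initialized uniform
    msg = {}
    for u, v in edges:
        K = len(unary[u])
        msg[(u, v)] = np.ones(K)
        msg[(v, u)] = np.ones(K)
    for _ in range(iters):
        new = {}
        for (u, v) in msg:
            # Product of u's unary and all incoming messages except v's
            prod = unary[u].copy()
            for w in nbrs[u]:
                if w != v:
                    prod = prod * msg[(w, u)]
            m = pairwise.T @ prod          # marginalize out x_u
            new[(u, v)] = m / m.sum()
        msg = new
    beliefs = {}
    for u in nodes:
        b = unary[u].copy()
        for w in nbrs[u]:
            b = b * msg[(w, u)]
        beliefs[u] = b / b.sum()
    return beliefs

# 2x2 grid, binary labels; node 0 prefers label 0, coupling is attractive
unary = {0: np.array([3.0, 1.0]), 1: np.array([1.0, 1.0]),
         2: np.array([1.0, 1.0]), 3: np.array([1.0, 1.0])}
pairwise = np.array([[2.0, 1.0], [1.0, 2.0]])
beliefs = loopy_bp([0, 1, 2, 3], [(0, 1), (0, 2), (1, 3), (2, 3)],
                   unary, pairwise)
```

On this attractive grid, node 0's preference for label 0 propagates through the loop, so the belief at the opposite corner also leans toward label 0.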

10:45–10:50 ThBT4.4<br />

Stochastic optimization of a chain sliding mode<br />

controller for mobile robot maneuvering<br />

A.V. Terekhov, J.-B. Mouret, C. Grand<br />

Institut des Systèmes Intelligents et de Robotique, UPMC-CNRS, France<br />

• Study of aggressive steering maneuvers<br />

• A chain sliding mode controller is proposed<br />

• Stochastic optimization is used for controller parameter tuning<br />

• Validation performed on the real robot



10:50–10:55 ThBT4.5<br />

Programmable 3D Stochastic Fluidic Assembly<br />

of cm-scale Modules<br />

Michael T. Tolley<br />

School of Engineering and Applied Sciences, Harvard University, USA<br />

Hod Lipson<br />

School of Mechanical and Aerospace Engineering, Cornell University, USA<br />

• Stochastic Fluidic Assembly (SFA)<br />

offers a potential approach to<br />

achieving programmable matter<br />

• We present a minimalistic<br />

component approach that has led to<br />

three experimental advances in cm-scale<br />

SFA:<br />

• Robust assembly<br />

• Parallel hierarchical assembly<br />

• Fully automated assembly using<br />

pressure sensor feedback<br />

• This biologically inspired approach<br />

could lead to batch manufacturing of<br />

small scale target structures<br />

Parallel Stochastic Fluidic Assembly<br />

of Non-Planar 3D Structures<br />

11:00–11:15 ThBT4.7<br />

Visual Localization in Fused Image and Laser<br />

Range Data<br />

Nicholas Carlevaris-Bianco and Anush Mohan<br />

Dept. Electrical Eng. & Computer Science, University of Michigan, U.S.A.<br />

James R. McBride<br />

Research and Innovation Center, Ford Motor Company, U.S.A.<br />

Ryan M. Eustice<br />

Dept. Naval Architecture & Marine Eng., University of Michigan, U.S.A.<br />

• Using a richly instrumented robotic platform, the algorithm builds a city-scale map of sparse visual features from co-registered LIDAR and image data.<br />

• Localization is performed using the map and a low-cost instrumented camera system.<br />

• Results presented on data<br />

collected over multiple years,<br />

showing strong robustness<br />

w.r.t. dynamic changes in the<br />

environment.<br />

10:55–11:00 ThBT4.6<br />

Hierarchical Congestion Control<br />

for Robotic Swarms<br />

Vinicius Graciano Santos and Luiz Chaimowicz<br />

VeRLab, UFMG, Brazil<br />

• One of the main challenges in swarm navigation is to avoid congestion<br />

• We propose the use of hierarchical abstractions in conjunction with traffic control rules to solve this problem<br />

• Simulated and real experiments are presented to validate the approach.<br />

• Results show that our method allows the swarm to navigate without congestion in a smooth and coherent fashion<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–194–<br />

Avoiding congestion between<br />

groups of robots<br />

11:15–11:30 ThBT4.8<br />

Mathematical Analysis of the Minimum Variance<br />

Model of Human-Like Reaching Movements<br />

Mikhail Svinin and Motoji Yamamoto<br />

Mechanical Engineering Department<br />

Faculty of Engineering<br />

Kyushu University, Japan<br />

• Open-loop optimal trajectory planning in a stochastic formulation (control signal-dependent noise).<br />

• Performance index: cumulative average positional variance over the interval (T, T+R).<br />

• The limiting case of infinite R is considered for several classes of models (stable plant, n-th order integrator, diagonalizable systems).<br />

• For the n-th order integrator and diagonalizable systems with zero eigenvalues, the limiting model is equivalent to the conventional minimum effort model<br />

• The conventional minimum jerk model is a special case of the minimum variance model.<br />

Post-movement covariance


Session ThBT5 Continental Ballroom 5 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Humanoid Technologies<br />

Chair Tamim Asfour, Karlsruhe Inst. of Tech. (KIT)<br />

Co-Chair Kazuhito Yokoi, National Inst. of AIST<br />

10:00–10:15 ThBT5.1*<br />

Semi-Plenary Invited Talk: Optimal control of motions<br />

of humanoid robots<br />

Katja Mombaur, University of Heidelberg<br />

10:30–10:45 ThBT5.3<br />

Hardware Improvement of Cybernetic Human<br />

HRP-4C<br />

for Entertainment Use<br />

Kenji Kaneko, Fumio Kanehiro, Mitsuharu Morisawa,<br />

Tokuo Tsuji, Kanako Miura, Shin’ichiro Nakaoka,<br />

Shuuji Kajita and Kazuhito Yokoi<br />

AIST, Japan<br />

• Hardware improvement of cybernetic human HRP-4C for entertainment use<br />

• New hands redesigned to realize a human-size hand with realistic skin, matching the average figure of a young Japanese female<br />

• New foot with an active toe joint for realizing human-like walking motion<br />

• New eye with a camera, both to provide visual information for operators and to realize motion in response to spectators<br />

Cybernetic human HRP-4C improved with new hands, an active toe joint, and a new eye with camera<br />

10:15–10:30 ThBT5.2<br />

Semi-Plenary Invited Talk: Optimal control of motions<br />

of humanoid robots<br />

Katja Mombaur, University of Heidelberg<br />

10:45–10:50 ThBT5.4<br />

Humanoid Robot HRP-4<br />

Humanoid Robotics Platform with Lightweight and Slim Body<br />

Kenji Kaneko, Fumio Kanehiro, Mitsuharu Morisawa<br />

AIST, Japan<br />

Kazuhiko Akachi, Go Miyamori,<br />

Atsushi Hayashi, Noriyuki Kanehira<br />

Kawada Industries, Inc., Japan<br />

• A new lightweight and slim body with a height of 151 cm and a weight of 39 kg, following the legacy of the HRP series<br />

• Total of 34 degrees of freedom, including<br />

7 degrees of freedom for each arm to<br />

facilitate object handling<br />

• Cost reduction by adoption of modularized<br />

units and common parts, reviewing the<br />

machining process, streamlining the basic<br />

specification, and so on<br />

• Improved research efficiency due to the<br />

adoption of OpenRTM-aist, which enables<br />

the use of domestic and international<br />

software assets<br />


HRP-4<br />

( Humanoid robotics platform – 4 )



10:50–10:55 ThBT5.5<br />

Weakly collision-free paths for continuous<br />

humanoid footstep planning<br />

Nicolas Perrin (1) and Olivier Stasse (2)<br />

Florent Lamiraux (1) and Eiichi Yoshida (2)<br />

(1) CNRS/LAAS, Université de Toulouse UPS, INSA, INP, ISAE, France<br />

(2) CNRS-AIST JRL, UMI3218/CRT, Tsukuba, Japan<br />

• New notion of collision-freeness<br />

• Equivalence between an<br />

example of discrete<br />

footstep planning and<br />

classical 2D continuous<br />

motion planning<br />

• Possibility to use PRM or<br />

RRT for footstep planning in<br />

a sound and complete way<br />

A continuous weakly collision-free path for the green<br />

shape is converted into a discrete sequence of steps<br />

11:00–11:15 ThBT5.7<br />

Bipedal walking control<br />

based on Capture Point dynamics<br />

Johannes Englsberger, Christian Ott, Maximo A. Roa,<br />

Alin Albu-Schäffer, and Gerhard Hirzinger<br />

Institute of Robotics & Mechatronics, German Aerospace Center (DLR)<br />

• Capture Point (CP) dynamics is an<br />

effective and simple tool to design<br />

feedback controllers for bipedal<br />

robots<br />

• CP control is very flexible and can<br />

be used for high-level control<br />

strategies like push recovery and<br />

online step adjustment<br />

• CP control allows the ZMP to be explicitly constrained to the support polygon<br />

• Robustness of CP feedback control is shown analytically, in simulation, and in experiments on the DLR Biped<br />

DLR Biped; shifting the CP from foot to foot with the ZMP<br />
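The Capture Point used in the digest above has a simple closed form under the linear inverted pendulum model, ξ = x + ẋ/ω₀ with ω₀ = √(g/z₀). The sketch below implements that standard textbook formula, not the paper's controller; the CoM height and velocity values are invented for illustration.

```python
import math

def capture_point(com_pos, com_vel, com_height, g=9.81):
    """Instantaneous Capture Point of the linear inverted pendulum:
    the ground point where stepping brings the robot to rest."""
    omega0 = math.sqrt(g / com_height)  # natural frequency of the LIPM
    return com_pos + com_vel / omega0

# CoM 0.9 m above ground, moving forward at 0.5 m/s over the ankle:
cp = capture_point(com_pos=0.0, com_vel=0.5, com_height=0.9)
```

A stationary CoM has its Capture Point directly beneath it; forward velocity pushes the point ahead of the CoM, which is what drives step adjustment and push recovery.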

10:55–11:00 ThBT5.6<br />

Using a Multi-Objective Controller to Synthesize<br />

Simulated Humanoid Robot Motion with<br />

Changing Contact Configurations<br />

Karim Bouyarmane and Abderrahmane Kheddar<br />

CNRS-AIST JRL UMI3218/CRT, AIST, Japan<br />

LIRMM, Montpellier 2 University, France<br />

• Non-gaited acyclic multi-contact closed-loop-based<br />

motion generation tool for a<br />

humanoid robot.<br />

• Input: sequence of static contact-transition<br />

postures, output: dynamically consistent<br />

motion between the postures.<br />

• Multi-objective control: CoM, end-effector,<br />

posture. Two kinds of objectives: set-point<br />

objective and target objective.<br />

• Objectives automatically decided by a<br />

finite-state machine.<br />

• Results in constraint-based dynamic<br />

simulation (non real-time).<br />

11:15–11:30 ThBT5.8<br />

Human-like Walking<br />

with Toe Supporting for Humanoids<br />

Kanako Miura, Mitsuharu Morisawa, Fumio Kanehiro<br />

Shuuji Kajita, Kenji Kaneko and Kazuhito Yokoi<br />

Intelligent Systems Research Institute,<br />

National Institute of Advanced Industrial Science and Technology, Japan<br />

• Three characteristics of human walking<br />

were added: single toe support, stretching<br />

knees, and foot swing motion.<br />

• Modifications to add these characteristics<br />

are made based on the conventional<br />

walking pattern generator.<br />

• The generated pattern is converted to a<br />

feasible pattern using a prioritized<br />

optimization technique.<br />

• The stabilizer is also improved to maintain the ZMP inside the tiny supporting area of the toe.<br />

• The intended walking motion was successfully demonstrated on HRP-4C.<br />



Session ThBT6 Continental Ballroom 6 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Symposium: Robots that Can See II<br />

Chair José Neira, Univ. de Zaragoza<br />

Co-Chair Dieter Fox, Univ. of Washington<br />

10:00–10:15 ThBT6.1*<br />

Semi-Plenary Invited Talk: Vision in the Field<br />

Peter Corke, QUT<br />

10:30–10:45 ThBT6.3<br />

Capturing City-level Scenes with a<br />

Synchronized Camera-Laser Fusion Sensor<br />

Yunsu Bok, Dong-Geol Choi, Yekeun Jeong, and In So Kweon<br />

RCV Lab., EE Dept., KAIST, Korea<br />

• A novel camera-laser fusion system: synchronized to be mounted on a fast-moving ground vehicle<br />

• Reconstruction by accumulating laser scans: motion estimation without the 2D assumption, and refinement by a few closed loops<br />

• Moving object detection: a solution for detecting moving objects that cannot be detected by cameras<br />

10:15–10:30 ThBT6.2*<br />

Semi-Plenary Invited Talk: Vision in Space<br />

Andrew Howard, Space Exploration Technologies (SpaceX)<br />

10:45–10:50 ThBT6.4<br />

Vehicle Detection and Tracking at Nighttime for<br />

Urban Autonomous Driving<br />

Hossein Tehrani, Koji Takahashi, Seiichi Mita,<br />

Smart Vehicle Research Center, Toyota Technological Institute, Japan<br />

David McAllester<br />

Toyota Technological Institute at Chicago, USA<br />

• Present a weighted deformable object model for detecting and tracking vehicles using a monocular infrared camera.<br />

• Filters are learned through a combination of a latent support vector machine and histograms of oriented gradients.<br />

• Detected vehicles are tracked by calculating maximum HOG feature compatibility for both root and parts.<br />

• Achieve an average vehicle detection and tracking rate of 83.78% with a very low false positive rate.<br />


Multi-vehicle detection and<br />

tracking at nighttime



10:50–10:55 ThBT6.5<br />

Dense Multi-Planar Scene Estimation from a<br />

Sparse Set of Images<br />

Alberto Argiles, Javier Civera and Luis Montesano<br />

I3A, Universidad de Zaragoza, Spain<br />

• 3D scene estimation from visual data is limited, in most cases, to a sparse<br />

set of salient points.<br />

• Robots need dense scene estimations in order to properly interact with a<br />

3D world.<br />

• We present an algorithm to densify a<br />

sparse point-based scene estimation for<br />

Manhattan-like environments.<br />

• The algorithm combines robust multi-model estimation with geometric and photometric tests against the visual data.<br />

Input images; dense multi-planar 3D scene estimation<br />

11:00–11:15 ThBT6.7<br />

Real-time Step Edge Estimation Using Stereo<br />

Images for Biped Robot<br />

Minami Asatani<br />

The Fundamental Technology Research Center, Honda R&D Co. Ltd, Japan.<br />

Shigeki Sugimoto and Masatoshi Okutomi<br />

Dept. of Mech. and Control Eng., Tokyo Institute of Technology, Japan.<br />

• A step edge estimation method using stereo images for biped robots capable of the overhanging footstep.<br />

• The overhanging footstep: advantageous<br />

in terms of smooth walking and<br />

relaxation of restrictions on gait planning.<br />

• Pre-processing: Occupancy grid<br />

segmentation & fast direct image<br />

alignment for estimating the plane<br />

parameters of each step.<br />

• Step edge estimation: By our image<br />

segmentation technique minimizing a<br />

cost composed of the sum of pixel value<br />

differences of stereo images.<br />

• Optional use of uncalibrated projector for<br />

texture-less surfaces.<br />

The overhanging<br />

footstep<br />

The overhanging footstep and<br />

edge estimation results.<br />

10:55–11:00 ThBT6.6<br />

Plenoptic Flow: Closed-Form Visual<br />

Odometry for Light Field Cameras<br />

Donald G. Dansereau, Ian Mahon,<br />

Oscar Pizarro and Stefan B. Williams<br />

Australian Centre for Field Robotics, University of Sydney, Australia<br />

• Classic approaches to visual odometry are iterative and/or complex<br />

• We present a closed-form, featureless solution to visual odometry for<br />

planar camera arrays, a.k.a. “light field cameras”<br />

• Three solutions capable of estimating a camera’s 6-DOF trajectory in a<br />

closed-form manner are presented<br />

• Computation is simple and results are competitive with SIFT+RANSAC<br />

11:15–11:30 ThBT6.8<br />

Exploiting Motion Priors in Visual Odometry<br />

for Vehicle-Mounted Cameras<br />

with Non-holonomic Constraints<br />

Davide Scaramuzza 1 , Andrea Censi 2 , and Kostas Daniilidis 1<br />

1 GRASP Lab, University of Pennsylvania<br />

2 California Institute of Technology<br />

• We present an algorithm (MOBRAS) for<br />

outlier removal which is not based on<br />

RANSAC<br />

• Contrary to RANSAC, motion hypotheses are<br />

not generated from randomly-sampled point<br />

correspondences, but from a “proposal<br />

distribution” that is built by exploiting the<br />

vehicle non-holonomic constraints<br />

• We show not only that the proposed algorithm is significantly faster than RANSAC, but also that the returned solution may be better, in that it favors the underlying motion model of the vehicle, thus overcoming the typical limitations of RANSAC<br />



Session ThBT7 Continental Parlor 7 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Medical Robots & Systems<br />

Chair Pierre Dupont, Children's Hospital Boston, Harvard Medical School<br />

Co-Chair Jung Kim, KAIST<br />

10:00–10:15 ThBT7.1<br />

Identification of a hybrid model for simulation of<br />

the instrument/trocar interaction force<br />

J. Verspecht, L. Catoire, S. Torfs and M. Kinnaert<br />

Control Engineering and System Analysis Department<br />

Université libre de Bruxelles (ULB), Belgium<br />

• Design of a simulator for the slave robot<br />

interacting with its environment in<br />

minimally invasive robotic surgery.<br />

• Short introduction to the Extended<br />

Maxwell-Slip (EMS) model, a 2-state<br />

hybrid friction model including a<br />

deformation and a sliding phase<br />

• Identification procedure for the EMS model using PWARX identification techniques, namely the clustering-based technique.<br />

• Identification and validation are achieved<br />

using experimental data recorded on a<br />

dedicated test setup.<br />

10:30–10:45 ThBT7.3<br />

Bio-inspired Active Soft Orthotic Device<br />

for Ankle Foot Pathologies<br />

Yong-Lae Park (1), Bor-rong Chen (2), Diana Young (1), Leia Stirling (1),<br />

Robert J. Wood (1,2), Eugene Goldfield (1,3) and Radhika Nagpal (1,2)<br />

(1) Wyss Institute for Biologically Inspired Engineering, Harvard University, USA<br />

(2) School of Engineering and Applied Science, Harvard University, USA<br />

(3) Children’s Hospital Boston, USA<br />

• Active soft ankle-foot<br />

orthotic device<br />

• Biologically inspired design based on the musculoskeletal system of the human foot and leg<br />

• Soft sensor and<br />

actuator materials with<br />

no rigid frame structure<br />

• For treating gait<br />

pathologies associated<br />

with neuromuscular<br />

disorders<br />

10:15–10:30 ThBT7.2<br />

A Fast Classification System For Decoding<br />

of Human Hand Configurations<br />

Using Multi-Channel sEMG Signals<br />

Myoung Soo Park, Keehoon Kim and Sang Rok Oh<br />

Center of Human-centered Interaction and Robotics<br />

Korea Institute of Science and Technology (KIST), Korea<br />

• We propose a novel fast classification system that allows real-time decoding of human movement intentions from multi-channel sEMG signals<br />

• For extracting low-dimensional and discriminative features, CA-PCA (Class-augmented PCA) is applied to RMS (root-mean-squared) sEMG signals<br />

• As a fast classifier, an ELM (Extreme Learning Machine) is selected and used<br />

• From 28,000 8-channel sEMG samples of 4 actions and rest, the features and classifier are learned in less than 10 s and can classify 280,000 test data with 91.59% accuracy<br />

10:45–10:50 ThBT7.4<br />

A Preliminary Experiment for Transferring<br />

Human Motion to a Musculoskeletal Robot<br />

― Decomposition of Human Running based on Muscular Coordination ―<br />

Taiki Iimura, Keita Inoue, Hang T.T. Pham,<br />

Hiroaki Hirai, and Fumio Miyazaki<br />

Mechanical Science and Bioengineering, Osaka University, Japan<br />

• The electromyographic signals of 8 leg<br />

muscles during human running were<br />

measured.<br />

• Two parameters related to kinematics or<br />

kinetics are defined from EMG data.<br />

• Principal Component Analysis<br />

decomposes human running into only two<br />

muscle coordination patterns.<br />

• The kinematic meanings of the extracted<br />

EMG patterns were visualized by a<br />

musculoskeletal leg robot.<br />


Robot movements corresponding<br />

to the extracted EMG patterns



10:50–10:55 ThBT7.5<br />

Heart Motion Simulator for Motion<br />

Compensation<br />

Renat Iskakov, Martin Groeger and Gerd Hirzinger<br />

Institute of Robotics and Mechatronics,<br />

German Aerospace Center (DLR), Germany<br />

• Heart motion simulator<br />

(HMS) to accurately<br />

simulate 3D translational<br />

motion trajectories of<br />

human beating heart<br />

• System description including design, kinematics, and closed-loop dynamics<br />

• Impedance control structure for accurate simulation and future<br />

interaction with medical tools<br />

• Experiments show the capability of the HMS to simulate different low-amplitude translational motion trajectories<br />

11:00–11:15 ThBT7.7<br />

New Approach for Abnormal Tissue Localization<br />

with Robotic Palpation and Mechanical Property<br />

Characterization<br />

Bummo Ahn, Yeongjin Kim, and Jung Kim<br />

Department of Mechanical Engineering, KAIST, Korea<br />

• Abnormal tissue localization:<br />

– Robotic palpation: scanning & sweeping<br />

– FEM-based mechanical property characterization<br />

New approach for abnormal tissue localization<br />

Palpation experiment<br />

Mechanical property characterization<br />

10:55–11:00 ThBT7.6<br />

MRI-Powered Actuators for Robotic Interventions<br />

Panagiotis Vartholomeos and Pierre E. Dupont<br />

Cardiovascular Surgery, Children’s Hospital Boston,<br />

Harvard Medical School, USA<br />

Lei Qin<br />

Department of Radiology, Brigham & Women’s Hospital, Harvard<br />

Medical School, USA<br />

• Novel actuation technology for<br />

robotically assisted MRI-guided<br />

interventional procedures<br />

• Actuators are both powered and<br />

controlled by the MRI scanner<br />

• Design concept and performance<br />

limits are described and derived<br />

analytically<br />

• Experiments in a clinical MR scanner<br />

are used to demonstrate the capability<br />

of the approach for needle biopsies<br />


MRI-powered<br />

actuator<br />

MRI-powered actuator performs<br />

needle biopsy in porcine heart.<br />

11:15–11:30 ThBT7.8<br />

Position Estimation of an Epicardial Crawling<br />

Robot on the Beating Heart by Modeling of<br />

Physiological Motion<br />

Nathan Wood and Cameron Riviere<br />

The Robotics Institute, Carnegie Mellon University, USA<br />

Diego Moral del Agua<br />

University of Valladolid, Spain<br />

Marco Zenati<br />

BHS Department of Cardiothoracic Surgery, Harvard Medical School, USA<br />

• HeartLander robot provides therapies to<br />

the heart in a minimally invasive manner<br />

• Modeling of cardiac and respiratory<br />

motion enables more accurate position<br />

estimation of robot on heart<br />

• Fourier series models of physiological<br />

motions are fit using an Extended Kalman<br />

Filter (EKF) framework<br />

• Improves localization over previous filtering techniques and enables locomotion synchronization with physiological cycles on a heart phantom
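The Fourier-series motion model at the core of the approach above can be sketched with a batch least-squares fit at known cardiac and respiratory rates. This is a simplified stand-in for the paper's online EKF-based fitting; the frequencies, amplitudes, and noise level below are invented for illustration.

```python
import numpy as np

def fourier_design(t, freq, n_harmonics):
    """Design matrix for a truncated Fourier series at a known base frequency."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(2 * np.pi * k * freq * t))
        cols.append(np.sin(2 * np.pi * k * freq * t))
    return np.column_stack(cols)

def fit_physiological_motion(t, pos, heart_hz, resp_hz, n=3):
    """Least-squares fit of cardiac + respiratory Fourier components.
    (Batch stand-in for the paper's EKF-based online fit.)"""
    A = np.hstack([fourier_design(t, heart_hz, n),
                   fourier_design(t, resp_hz, n)])
    coeffs, *_ = np.linalg.lstsq(A, pos, rcond=None)
    return A @ coeffs  # reconstructed motion

# Synthetic 1D position trace: 1.2 Hz cardiac + 0.25 Hz respiratory motion
t = np.linspace(0.0, 10.0, 1000)
truth = 2.0 * np.sin(2 * np.pi * 1.2 * t) + 5.0 * np.sin(2 * np.pi * 0.25 * t)
noisy = truth + 0.1 * np.random.default_rng(1).standard_normal(t.size)
recon = fit_physiological_motion(t, noisy, heart_hz=1.2, resp_hz=0.25)
rms_err = float(np.sqrt(np.mean((recon - truth) ** 2)))
```

Because the two physiological cycles sit at well-separated frequencies, the fit cleanly separates them, which is what makes synchronizing locomotion with either cycle possible.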


Session ThBT8 Continental Parlor 8 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Search & Rescue Robots<br />

Chair Timothy H. Chung, Naval Postgraduate School<br />

Co-Chair Geoffrey Hollinger, Univ. of Southern California<br />

10:00–10:15 ThBT8.1<br />


10:30–10:45 ThBT8.3<br />

Optimal Deployment of Robotic Teams for<br />

Autonomous Wilderness Search and Rescue<br />

Ashish Macwan, Goldie Nejat, and Beno Benhabib<br />

Department of Mechanical and Industrial Engineering, University of Toronto,<br />

Canada<br />

• An optimal re-deployment method for autonomous robotic Wilderness<br />

Search and Rescue (WiSAR) is presented<br />

• WiSAR challenges include: moving lost person, growing search area,<br />

and varying terrain<br />

• The proposed method addresses initial deployment as well as<br />

subsequent online re-deployment<br />

• The novel concept of iso-probability curves is used to predict and<br />

represent the lost person’s probable location<br />

• Optimal deployment / re-deployment is based on minimizing search time<br />

and maximizing probability of success<br />

• A simulated WiSAR search scenario demonstrates the method<br />

10:15–10:30 ThBT8.2<br />

Searching for Multiple Targets Using<br />

Probabilistic Quadtrees<br />

Stefano Carpin and Derek Burch<br />

School of Engineering, University of California - Merced, USA<br />

Timothy H. Chung<br />

Dept. of Systems Engineering, Naval Postgraduate School - Monterey, USA<br />

• We extend Probabilistic Quadtrees for<br />

search to deal with an arbitrary, unknown<br />

number of targets;<br />

• An information gain function is defined to<br />

guide the search;<br />

• An entropy based criterion terminates the<br />

search;<br />

• We theoretically show that in the limit the<br />

hierarchical representation converges to a<br />

uniform grid;<br />

• Extensive simulations show that hierarchical search largely outperforms uniform approaches.<br />
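The entropy-based termination criterion in the digest above can be sketched on a flat grid. This is a toy stand-in: the sensor rates, uniform prior, and greedy cell selection are assumptions, and the quadtree refinement itself is omitted.

```python
import numpy as np

def cell_entropy(p):
    """Binary entropy (bits) of per-cell target-presence probabilities."""
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return -(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p))

def search_until_certain(true_targets, p_detect=0.9, p_false=0.05,
                         entropy_threshold=0.1, max_looks=500, seed=0):
    """Repeatedly observe the most uncertain cell, update its target
    probability by Bayes' rule, and stop once the summed map entropy
    drops below the threshold."""
    rng = np.random.default_rng(seed)
    p = np.full(len(true_targets), 0.5)  # uninformative prior per cell
    for looks in range(max_looks):
        if cell_entropy(p).sum() < entropy_threshold:
            return p, looks
        i = int(np.argmax(cell_entropy(p)))          # most uncertain cell
        hit = rng.random() < (p_detect if true_targets[i] else p_false)
        like_t = p_detect if hit else 1.0 - p_detect  # P(obs | target)
        like_e = p_false if hit else 1.0 - p_false    # P(obs | empty)
        p[i] = like_t * p[i] / (like_t * p[i] + like_e * (1.0 - p[i]))
    return p, max_looks

targets = np.array([True, False, False, True, False])
posterior, looks = search_until_certain(targets)
```

The search stops as soon as every cell is confidently classified, so the number of looks adapts to how informative the sensor is rather than being fixed in advance.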

10:45–10:50 ThBT8.4<br />

Development of the high-strength retractable skin and the closed-type crawler vehicle<br />

Takeshi Aoki and Takahiro Karino<br />

Advanced robotics, Chiba Institute of Technology, Japan<br />

Hiroyuki Kuwahara<br />

SUSTAINable Robotics, Japan<br />

• Proposal and development of the high-strength retractable skin<br />

• Proposal of a new crawler vehicle concept with mechanisms covered by the skin<br />

• Development of the mono-crawler vehicle, named the “Slug Crawler Vehicle” (SCV)<br />

• The SCV is waterproof and dustproof thanks to its flexible crawler belt<br />

• Verification of its effectiveness through basic experiments on the prototype model<br />


The closed-type flexible<br />

crawler vehicle



10:50–10:55 ThBT8.5<br />

Human-in-the-Loop: MPC for Shared Control of a<br />

Quadruped Rescue Robot<br />

Rahul Chipalkatty, Hannes Daepp,<br />

Magnus Egerstedt*, and Wayne Book<br />

Mechanical Engineering, Electrical and Computer Engineering*,<br />

Georgia Institute of Technology, USA<br />

• Composition of human input with an<br />

automatic controller to ensure static<br />

rescue robot stability.<br />

• A hybrid control architecture used to<br />

implement the supporting gait.<br />

• Control applied to a rescue robot<br />

simulation with real-time human input from<br />

haptic joysticks.<br />

• Experimental results confirm the viability<br />
of the control approach, combining the strengths<br />
of both human and robot control.<br />

10:55–11:00 ThBT8.6<br />

Detection and Tracking of Road Networks in<br />

Rural Terrain Fusing Vision and LIDAR<br />

Michael Manz, Michael Himmelsbach, Thorsten Luettel<br />

and Hans-Joachim Wuensche<br />

Autonomous System Technology, University of the Bundeswehr Munich,<br />

Germany<br />

• Fusing 3D LIDAR data and camera<br />

images into an accumulated, colored 3D<br />

elevation map of the terrain<br />

• Model-based estimation of the local road<br />
network geometry using a particle filter<br />
and multiple features within the map<br />

• Supervised learning is applied to assess<br />

the likelihood of particles<br />

• Enables our robot MuCAR-3 to<br />

autonomously navigate on dirt-road<br />

networks<br />


Crossing detection


Session ThBT9 Continental Parlor 9 Thursday, September 29, <strong>2011</strong>, 10:00–11:30<br />

Intelligent Transportation Systems (Automotive)<br />

Chair Bernd Radig, TU Muenchen<br />

Co-Chair Juan Nieto, Univ. of Sydney, Australian Centre for Field Robotics<br />

10:00–10:15 ThBT9.1<br />


10:30–10:45 ThBT9.3<br />

Autonomous Intersection Management:<br />

Multi-Intersection Optimization<br />

Matthew Hausknecht, Tsz-Chiu Au, and Peter Stone<br />

Department of Computer Science, University of Texas at Austin, USA<br />

• In a future of autonomous vehicles,<br />

intelligent traffic management systems<br />

such as Autonomous Intersection<br />

Management (AIM) are crucial<br />

• Exploring AIM for the first time on road<br />

networks, we:<br />

• Examine vehicle navigation<br />
strategies<br />

• Observe an instance of Braess’<br />

Paradox<br />

• Dynamically reverse the direction of<br />

traffic along selected lanes<br />

• We quantify the improvements imparted<br />

by these agent-based traffic control<br />

methods<br />

Intersection in AIM simulator<br />

10:15–10:30 ThBT9.2<br />

Tracking Objects of Arbitrary Shape Using<br />

Expectation Maximization Algorithm<br />

Shuqing Zeng and Yuanhong Li<br />

Electrical and Controls Lab, GM R&D Center, USA<br />

Yantao Shen<br />

Department of Electrical and Biomedical Engineering, University of Nevada,<br />

USA<br />

• Addresses tracking of objects of arbitrary shape using rangefinders, a key module for an autonomous driving vehicle.<br />
• An Expectation Maximization (EM) algorithm with local matching is proposed for motion estimation between two consecutive range images.<br />
• The complexity of the algorithm is O(N), with N the number of scan points.<br />
• Quantitative performance evaluation of the algorithm using a benchmark vehicular data set.<br />

• Results of road tests show the<br />

effectiveness and efficiency of the system.<br />

10:45–10:50 ThBT9.4<br />

Evaluation of Different Approaches for Road<br />

Course Estimation using Imaging Radar<br />

Frederik Sarholz, Jens Mehnert, Jens Klappstein<br />

and Juergen Dickmann<br />

Group Research/Environment Perception, Daimler AG, Germany<br />

Bernd Radig<br />

Intelligent Autonomous Systems, Technische Universitaet Muenchen, Germany<br />

• Three approaches for road course<br />

estimation are evaluated based on<br />

several hundred kilometers of country<br />

roads:<br />

1. spatial derivative + lane free of radar<br />

echoes<br />

2. angle extraction of the road border +<br />

projection onto the vehicle’s course<br />

(performs up to 78% better than 1)<br />

3. usage of objects moving parallel to the<br />

road course<br />

(performs up to 209% better than 1)<br />

• The driven trajectory is taken as ground<br />

truth<br />


Left: estimation with the second approach. Right: estimation with the third approach; the dots are the measured positions of the followed vehicle.<br />



10:50–10:55 ThBT9.5<br />

A Car Transportation System Using<br />

Multiple Mobile Robots: iCART II<br />

K. Kashiwazaki 1 , N. Yonezawa 1 , M. Endo 2 , K. Kosuge 1 ,<br />

Y. Sugahara 1 , Y. Hirata 1 , T. Kanbayashi 3 , K. Suzuki 3 ,<br />

K. Murakami 3 and K. Nakamura 3<br />

1 Department of Bioengineering and Robotics, Tohoku University, Japan<br />

2 Department of Mechanical Engineering, Nihon University, Japan<br />

3 IHI Transport Machinery Co., Ltd., Japan<br />

• This paper proposes a car transportation<br />
system, iCART II, based on the “a-robot-for-a-wheel”<br />
concept.<br />

• A prototype system, MRWheel (a Mobile<br />

Robot for a Wheel), is designed.<br />

• We propose a decentralized control<br />

algorithm for car transportation in<br />

coordination by using a leader-follower<br />

type multiple robot system.<br />

• The proposed control algorithm is applied<br />

to the car transportation system and the<br />

experimental results are reported.<br />

MRWheel modules: mobile base module, connecting module, lifter module, omni-directional wheel<br />

11:00–11:15 ThBT9.7<br />

Intelligent Driver Drowsiness Detection System<br />

Using Uncorrelated Fuzzy Locality Preserving<br />

Analysis<br />

Rami N. Khushaba, Sarath Kodagoda, Gamini Dissanayake<br />

ARC Centre of Excellence for Autonomous Systems, Faculty of Engineering and<br />

Information Technology, University of Technology, Sydney (UTS), Australia.<br />

Sara Lal<br />

Department of Medical and Molecular Biosciences, Faculty of Science,<br />

University of Technology, Sydney (UTS), Australia.<br />

• Driver drowsiness is believed to account<br />

for 20% of all vehicle accidents on<br />

motorways and monotonous roads.<br />

• Problem: identify physiological<br />

associations between driver drowsiness<br />

and the corresponding patterns of EEG,<br />

EOG, and ECG.<br />

• Approach: A pattern recognition approach<br />

is adopted by proposing a novel feature<br />

extraction and reduction algorithm UFLPA.<br />

• Experiments on 31 participants provided a<br />

classification accuracy of ~95%.<br />

Driving simulation system<br />

10:55–11:00 ThBT9.6<br />

Probabilistic Road Geometry Estimation using a<br />

Millimetre-Wave Radar<br />

Andres Hernandez-Gutierrez, University of Sydney, Australian<br />

Centre for Field Robotics<br />

Juan Nieto, University of Sydney, Australian Centre for Field<br />

Robotics<br />

Tim Bailey, University of Sydney<br />

Eduardo Nebot, University of Sydney<br />

11:15–11:30 ThBT9.8<br />

An Efficient Control Strategy for the Traffic<br />

Coordination of AGVs<br />

Roberto Olmi<br />

Elettric80, Italy<br />

Cristian Secchi and Cesare Fantuzzi<br />

DISMI, University of Modena and Reggio Emilia, Italy<br />

• Coordination of AGVs over a zone-controlled layout<br />
• Coordination diagrams are used for representing possible collisions<br />
• A computationally efficient strategy for segment allocation is proposed<br />

• The traffic control strategy is tested over<br />

real industrial layouts<br />



Session ThCT1 Continental Parlor 1 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Manipulation<br />

Chair Mike Stilman, Georgia Tech.<br />

Co-Chair Ludovic Righetti, Univ. of Southern California<br />

14:45–15:00 ThCT1.1<br />


15:15–15:30 ThCT1.3<br />

Distributed Generalization of<br />

Learned Planning Models in Robot PbD<br />

Rainer Jäkel, Sven R. Schmidt-Rohr, Rüdiger Dillmann<br />

Institute for Anthropomatics, Karlsruhe Institute of Technology, Germany<br />

Pascal Meißner<br />

FZI - Research Center for Information Technologies, Germany<br />

• Manipulation knowledge can be learned<br />

efficiently from human observation<br />

• Problem: how to determine a reasonable<br />

set of learning features?<br />

• Approach: distributed simulation and<br />

constrained motion planning to generate<br />

statistics about constraint inconsistencies<br />

• Goal: determine a maximal subset of task<br />

constraints, which allows the robot to<br />

solve a set of test problems<br />

15:00–15:15 ThCT1.2<br />

Push Planning for Object Placement on<br />

Cluttered Table Surfaces<br />

Akansel Cosgun, Tucker Hermans, Victor Emeli and Mike Stilman<br />

Center for Robotics and Intelligent Machines<br />

Georgia Institute of Technology, USA<br />

• A planner to find a sequence of<br />

pushes of tabletop objects to make<br />

room for a target object<br />

• Implemented in dynamic simulation<br />

• Efficient and applicable to practical<br />

domains<br />

(a) Initial State<br />

• Placement pose found by convolving object kernel<br />

over table and sampling among the least<br />

overlapping poses<br />

• Iterative Deepening Breadth-First-Search (IDBFS)<br />

used to try different placement poses and gradually<br />

allow more complex plans<br />
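As an illustrative sketch of the kernel-convolution idea in this entry (a brute-force stand-in, not the authors' implementation), a placement pose can be scored by summing the occupied cells under the object footprint at every position and taking a least-overlapping pose:

```python
def best_placement(occupancy, kh, kw):
    """Score every axis-aligned placement of a kh x kw object footprint on
    the occupancy grid (1 = occupied) by total overlap, and return the
    top-left cell of a least-overlapping pose together with its overlap."""
    H, W = len(occupancy), len(occupancy[0])
    best = None
    for i in range(H - kh + 1):
        for j in range(W - kw + 1):
            overlap = sum(occupancy[i + di][j + dj]
                          for di in range(kh) for dj in range(kw))
            if best is None or overlap < best[1]:
                best = ((i, j), overlap)
    return best

# Toy 4x4 table with the left half occupied; place a 2x2 object
table = [[1, 1, 0, 0],
         [1, 1, 0, 0],
         [1, 1, 0, 0],
         [1, 1, 0, 0]]
pos, overlap = best_placement(table, 2, 2)   # a free pose on the right half
```

The paper samples among the least-overlapping poses rather than always taking the first minimum; this sketch returns a single best pose for brevity.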


(b) Plan (area 5 is cleared)<br />

15:30–15:35 ThCT1.4<br />

Learning Force Control Policies<br />

for Compliant Manipulation<br />

Mrinal Kalakrishnan * , Ludovic Righetti * ,<br />

Peter Pastor * , and Stefan Schaal *†<br />

* CLMC Lab, University of Southern California, USA<br />

† Max-Planck-Institute for Intelligent Systems, Tübingen, Germany<br />

• Compliant actuation and control are<br />

critical for safe operation of robots in<br />

household environments.<br />

• Manipulation involves exerting forces on<br />

the world through contact interaction.<br />

• We learn control policies for both positions<br />
and forces, using the PI² reinforcement<br />
learning algorithm.<br />

• This allows compliant execution with<br />

increased robustness as opposed to stiff<br />

position control.<br />

• We use this method to learn door-opening<br />
and pen-grasping tasks on a<br />
Barrett WAM arm.<br />




15:35–15:40 ThCT1.5<br />

Autonomous Indoor Aerial Gripping<br />

Using a Quadrotor<br />

Vaibhav Ghadiok, Jeremy Goldin and Wei Ren<br />

Electrical & Computer Engineering, Utah State University, USA<br />

• Custom-built quadrotor equipped with<br />

low-cost sensors: gyroscopes,<br />

accelerometers, camera, IR camera and<br />

sonar<br />

• Capabilities include visual SLAM,<br />

waypoint navigation, target search and<br />

aerial gripping while hovering<br />

• IR camera to locate target and<br />

underactuated passively compliant<br />

gripper for gripping<br />

• Nested PID controllers for attitude<br />

stabilization, altitude regulation,<br />

navigation and gripping<br />

15:45–16:00 ThCT1.7<br />


15:40–15:45 ThCT1.6<br />

Shooting Manipulation System<br />

with High Reaching Accuracy<br />

Tomofumi Hatakeyama and Hiromi Mochiyama<br />

Department of Intelligent Interaction Technologies<br />

University of Tsukuba, Japan<br />

• We propose a shooting manipulation<br />

system with high reaching accuracy for<br />

manipulating a distant object<br />

• The proposed system offers excellent performance in motion straightness, quickness, and reaching accuracy<br />
• The end-effector rests at the target point; we clarify the mechanics of this motion<br />
• Realizes quick capture of a falling object 700 mm away with a high success rate<br />


Quick capturing motion<br />

16:00–16:15 ThCT1.8<br />

A Novel Approach to the Manipulation of Body-Parts<br />

Ownership Using a Bilateral Master-Slave System<br />

Masayuki Hara, Akio Yamamoto, and Toshiro Higuchi<br />

Dept. of Precision Engineering, The University of Tokyo, Japan<br />

Giulio Rognini and Hannes Bleuler<br />

Institute of Microengineering, EPFL, Switzerland<br />

Nathan Evans and Olaf Blanke<br />

Brain Mind Institute, EPFL, Switzerland<br />

• A novel tactile rubber hand illusion (RHI)<br />

paradigm with haptics/robotics technology<br />

is proposed for manipulating body-parts<br />

ownership<br />

• A 3-DOF master-slave system is designed<br />

to enable RHI experiments under active<br />

self-touch condition<br />

• The applicability of the proposed system is validated by conducting a self-touch-enabled RHI experiment<br />
• The proposed approach may help answer current open questions about body representation in the human brain<br />

Self-touch-enabled RHI paradigm with a master-slave system: paintbrushes attached to the slave and master devices, a rubber hand, and the participant<br />


Session ThCT2 Continental Parlor 2 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Industrial Robots<br />

Chair Jun Ota, The Univ. of Tokyo<br />

Co-Chair Rolf Johansson, Lund Univ.<br />

14:45–15:00 ThCT2.1<br />

Assembly Strategy Modeling and Selection for<br />

Human and Robot Coordinated Cell Assembly<br />

Fei Chen 1 , Kosuke Sekiyama 1 , Hironobu Sasaki 1 , Jian Huang 3 ,<br />

Baiqing Sun 4 , Toshio Fukuda 1,2<br />

1 Department of Micro-nano Systems Engineering, Nagoya University, Japan.<br />

2 Center for Micro-nano Mechatronics, Nagoya University, Japan.<br />

3 Department of Control Science and Engineering,<br />

Huazhong University of Science and Technology, China.<br />

4 School of Electrical Engineering, Shenyang University of Technology, China.<br />

• A Dual Generalized Stochastic Petri Net (GSPN) model is built for a practical human-robot coordinated assembly task, taking possible human and robot behaviors into consideration.<br />
• Based on the GSPN, a Monte Carlo method is used to study the time cost and payment cost of different assembly subtask scheduling strategies between human and robot.<br />
• A cost-effectiveness analysis based on multiple-objective optimization (MOOP) is adopted to select the optimal assembly scheduling strategy.<br />

15:15–15:30 ThCT2.3<br />

Reusable Hybrid Force-Velocity controlled<br />

Motion Specifications with executable DSL<br />

Markus Klotzbuecher, Ruben Smits,<br />

Herman Bruyninckx, Joris De Schutter<br />

Department of Mechanical Engineering<br />

Katholieke Universiteit Leuven, Belgium<br />

• An approach for modeling reusable<br />

robotic tasks using Domain Specific<br />

Languages (DSL)<br />

• Models of force-velocity controlled<br />

motions are composed in finite state<br />

machines<br />

• DSLs are used to separate platform-independent from platform-specific aspects<br />

• Illustrated by an alignment task<br />

executed on a KUKA LWR and a PR2<br />

robot<br />

Alignment task executed by a<br />

KUKA LWR robot<br />

15:00–15:15 ThCT2.2<br />

Stereographic projection for industrial<br />

manipulator tasks: Theory and experiments<br />

Magnus Bjerkeng and Kristin Y. Pettersen<br />

Department of Engineering Cybernetics, NTNU, Norway<br />

Erik Kyrkebø<br />

SINTEF ICT Applied Cybernetics, Norway<br />

• A pseudoinverse controller using a<br />

task representation based on the<br />

stereographic projection, applied to<br />

a dynamic surveillance problem, is<br />

presented<br />

• A comparative analysis between<br />

similar task representations is<br />

provided<br />

• A qualitative analysis of the global<br />

dynamics is presented and the<br />

existence and uniqueness of<br />

singularities is determined<br />

• Experiments using industrial<br />
manipulators are presented.<br />

15:30–15:35 ThCT2.4<br />

Entropy-Based Motion Selection<br />

for Touch-Based Registration<br />

Using Rao-Blackwellized Particle Filtering<br />

Yuichi Taguchi, Tim K. Marks, and John R. Hershey<br />

Mitsubishi Electric Research Labs (MERL), Cambridge, MA, USA<br />

• 6-DOF robot-to-object registration using<br />

a sequence of touches<br />

• Efficient online estimation using Rao-<br />

Blackwellized Particle Filtering (RBPF)<br />

• Entropy-based motion selection to<br />

determine optimal next touch location<br />

• Comparison of several entropy<br />

measures for RBPF<br />

• Applications: Fully automatic registration<br />

and human-guided registration<br />
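A minimal sketch of entropy-based action selection over a particle set (illustrative only: the paper's RBPF method averages over possible touch outcomes, while this simplification reweights by a single hypothetical likelihood):

```python
import math

def entropy(weights):
    """Shannon entropy of a normalized discrete distribution."""
    return -sum(w * math.log(w) for w in weights if w > 0)

def select_touch(particles, actions, likelihood):
    """Pick the candidate touch that minimizes the entropy of the
    reweighted particle set.  `likelihood(action, pose)` is a stand-in
    for the touch measurement model (hypothetical here)."""
    best_action, best_h = None, float("inf")
    for a in actions:
        new_w = [w * likelihood(a, p) for p, w in particles]
        total = sum(new_w)
        if total == 0:
            continue
        h = entropy([w / total for w in new_w])
        if h < best_h:
            best_action, best_h = a, h
    return best_action

# Hypothetical example: two pose hypotheses and two candidate touches;
# touch "a" discriminates between them, touch "b" does not.
particles = [("pose1", 0.5), ("pose2", 0.5)]
def lik(action, pose):
    table = {("a", "pose1"): 0.9, ("a", "pose2"): 0.1,
             ("b", "pose1"): 0.5, ("b", "pose2"): 0.5}
    return table[(action, pose)]
chosen = select_touch(particles, ["a", "b"], lik)   # the informative touch
```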


Probe<br />

Particle Distribution<br />

on Object Model<br />

Force<br />

Sensor<br />

Object



15:35–15:40 ThCT2.5<br />

Manipulator System Selection Based on<br />

Evaluation of Task Completion Time and Cost<br />

Yanjiang Huang, Lounell B. Gueta and Jun Ota<br />

Research into Artifacts, Center for Engineering, The University of Tokyo, Japan<br />

Ryosuke Chiba<br />

Faculty of System Design, Tokyo Metropolitan University, Japan<br />

Tamio Arai<br />

Department of Precision Engineering, The University of Tokyo, Japan<br />

Tsuyoshi Ueyama<br />

DENSO WAVE INCORPORATED, Japan<br />

Masao Sugi<br />

Department of Mechanical Engineering and Intelligent Systems,<br />

University of Electro-Communications, Japan<br />

• Multiple objective particle swarm<br />

optimization is utilized to search for<br />

appropriate manipulator systems.<br />

• Location optimization and motion coordination are used to calculate the task completion time of a manipulator system.<br />
• Relative cost is used to calculate the cost of a manipulator system.<br />

• Particle swarm optimization and nearest<br />

neighborhood algorithm are employed in<br />

location optimization and motion<br />

coordination.<br />

Proposed method for manipulator system selection (flowchart): select a manipulator system s; optimize the location L of the positioning table; run motion coordination; calculate cost; iterate until the selection of L and s terminates.<br />

15:45–16:00 ThCT2.7<br />

Dynamics identification of industrial robots<br />

using contact force for the IDCS control<br />

Kengo AOKI, Gentiane VENTURE, Yasutaka TAGAWA<br />

Department of Mechanical Systems Engineering,<br />

Tokyo University of Agriculture and Technology, Tokyo, Japan<br />

• Application of robot identification using base-link information<br />
• The concept of the IDCS (Inverse Dynamics Compensation via ‘Simulation of feedback control systems’) controller<br />
• Simulation results comparing the performance of the IDCS controller with a PID controller<br />

Top: the controlled object (a weight with an added acceleration sensor, driven by a motor with an encoder). Bottom: cross-validation of the identification result.<br />

15:40–15:45 ThCT2.6<br />

Modeling and Control of a Piezo-Actuated<br />

High-Dynamic Compensation Mechanism for<br />

Industrial Robots<br />

B. Olofsson, O. Sörnmo, A. Robertsson & R. Johansson<br />

Dept. of Automatic Control, Lund University, Sweden<br />

U. Schneider & A. Puzik<br />

Fraunhofer Institute for Manufacturing and Engineering, Germany<br />

• Piezo-actuated compensation mechanism<br />

for usage together with industrial robots<br />

during machining operations<br />

• High-precision milling despite limited<br />

stiffness and accuracy of the robot<br />

• Modeling of the compensation mechanism<br />

using subspace-based identification<br />

methods and model-based control<br />

scheme<br />

• Experimental evaluations of the control<br />

scheme performed on a prototype of the<br />

mechanism are presented<br />


Industrial robot together with the<br />

compensation mechanism.<br />

16:00–16:15 ThCT2.8<br />

A Fast Distributed Auction and Consensus Process<br />

Using Parallel Task Allocation and Execution<br />

G.P. Das, T.M. McGinnity, S.A. Coleman and L. Behera<br />

Intelligent Systems Research Centre, University of Ulster, United Kingdom<br />

• Distributed auction and consensus for<br />

task allocation in a multi robot system<br />

• Task allocation is carried out in parallel to<br />

execution of already allocated tasks<br />

• Task allocation in highly dynamic<br />
scenarios is better than with other distributed<br />
auction-based approaches<br />

• Performance analysis<br />

• static cases where all robots and<br />

tasks are active from the start<br />

• dynamic case where new tasks are<br />

activated during the execution<br />

• Faster task allocation in both static and<br />

dynamic scenarios<br />

Figure: Task allocation module in<br />

a single robot. Similar modules<br />

run in each robot of the multi<br />

robot system.
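A toy single-round auction illustrates the idea of distributed bid-based allocation (an illustrative sketch, not the paper's parallel auction-and-consensus protocol; the cost function below is hypothetical):

```python
def auction_round(robots, task, cost):
    """One synchronous auction round: every robot bids its cost for the
    task; the lowest bidder wins, and broadcasting that result gives the
    consensus assignment."""
    bids = {r: cost(r, task) for r in robots}
    winner = min(bids, key=bids.get)
    return winner, bids[winner]

# Toy usage: bid = Manhattan distance from the robot to the task location
robots = {"r1": (0, 0), "r2": (5, 5)}
task = (1, 1)
winner, price = auction_round(
    robots, task,
    cost=lambda r, t: abs(robots[r][0] - t[0]) + abs(robots[r][1] - t[1]))
```

In the paper's setting such rounds run concurrently with the execution of already-allocated tasks, which is what yields the reported speedup.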


Session ThCT3 Continental Parlor 3 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Marine Systems I<br />

Chair Stephen Barkby, Univ. of Sydney<br />

Co-Chair Terry Huntsberger, Jet Propulsion Lab.<br />

14:45–15:00 ThCT3.1<br />

Flexible Formation Control of Realistic Underactuated<br />
Autonomous Surface Vessels<br />

Bradley Bishop, United States Naval Academy<br />

15:15–15:30 ThCT3.3<br />

Safe Maritime Navigation with COLREGS<br />

Using Velocity Obstacles<br />

Yoshiaki Kuwata, Michael T. Wolf, Dimitri Zarzhitsky,<br />

and Terrance L. Huntsberger<br />

Mobility and Robotics Systems, Jet Propulsion Laboratory,<br />

California Institute of Technology, USA<br />

• In maritime navigation, all vessels must obey COLREGS (“rules of the<br />
road”), which require specific types of maneuvers (e.g., the need to overtake<br />
from the starboard side).<br />

• Developed a motion planner that<br />

incorporates such logic for USVs<br />

(Unmanned Surface Vehicles).<br />

• Velocity Obstacles allow these<br />
constraints to be expressed naturally in the<br />
velocity space.<br />

• Demonstrated in simulation and<br />

on the water three COLREGS<br />

behaviors: crossing, overtaking,<br />

and head-on.<br />

USV in a crossing situation<br />
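A minimal sketch of the velocity-obstacle test underlying this planner (illustrative only, not the authors' implementation): a candidate velocity is unsafe when the relative velocity points into the collision cone subtended by the obstacle's safety circle.

```python
import math

def in_velocity_obstacle(p_own, v_own, p_obs, v_obs, radius):
    """Return True if the own velocity leads toward collision, i.e. the
    relative velocity lies inside the collision cone of the obstacle."""
    rx, ry = p_obs[0] - p_own[0], p_obs[1] - p_own[1]   # relative position
    vx, vy = v_own[0] - v_obs[0], v_own[1] - v_obs[1]   # relative velocity
    dist = math.hypot(rx, ry)
    if dist <= radius:                 # already inside the safety radius
        return True
    # Half-angle of the collision cone subtended by the safety circle
    half_angle = math.asin(radius / dist)
    angle_to_obs = math.atan2(ry, rx)
    angle_of_vel = math.atan2(vy, vx)
    diff = abs((angle_of_vel - angle_to_obs + math.pi) % (2 * math.pi) - math.pi)
    # Unsafe only if closing in and heading inside the cone
    return (vx * rx + vy * ry) > 0 and diff < half_angle
```

The COLREGS logic in the paper then restricts which of the remaining safe velocities may be chosen (e.g., passing on the correct side).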

15:00–15:15 ThCT3.2<br />

Toward Automatic Classification of Chemical<br />

Sensor Data from Autonomous Underwater Vehicles<br />

Michael V. Jakuba 1 , Daniel Steinberg 1 , James C. Kinsey 2 , Dana<br />

R. Yoerger 2 , Richard Camilli 2 , Oscar Pizarro 1 , Stefan B. Williams 1<br />

1 Australian Centre for Field Robotics, University of Sydney, Australia<br />

2 Woods Hole Oceanographic Institution, USA<br />

• Classification of nearly raw<br />

chemical sensor data from<br />

AUVs can aid follow-on<br />

analysis by both humans<br />

and machines.<br />

• The Variational Dirichlet<br />

Process (VDP) was used to<br />

cluster chemical sensor<br />

data from the Sentry AUV.<br />

• The VDP executes rapidly,<br />

accepts data from multiple<br />

sensors, and does not<br />

require operators to specify<br />

the number of classes a<br />

priori.<br />


Probability of membership in the plume class<br />

derived from chemical sensor data collected by<br />

the Sentry AUV. A plume is evident trailing off to<br />

the west from the Deepwater Horizon site in the<br />

northeast corner of the plot.<br />

15:30–15:35 ThCT3.4<br />

A Novel Propulsion Method<br />

of Flexible Underwater Robots<br />

Jun Shintake, Aiguo Ming, Makoto Shimojo<br />

Department of Mechanical Engineering and Intelligent Systems<br />

The University of Electro-Communications, Japan<br />

• Proposal of the new type of<br />

propulsion method<br />

• Development of prototypes<br />

• Experiments and analysis



15:35–15:40 ThCT3.5<br />

On Adaptive Underwater Object Detection<br />

David P. Williams<br />

NATO Undersea Research Centre, Italy<br />

• A new, fast algorithm for the detection of<br />

underwater objects in sonar imagery is<br />

proposed.<br />

• The algorithm also detects the presence<br />

of, and estimates the orientation of, sand<br />

ripples.<br />

• The method is adaptively tailored to the<br />

environmental characteristics of the<br />

sensed data that is collected in situ.<br />

• Ways to exploit the findings and adapt<br />

AUV surveys for optimized detection<br />

performance are suggested.<br />

Synthetic aperture sonar image<br />

with man-made objects<br />

and sand ripples.<br />

15:45–16:00 ThCT3.7<br />

Pitch and Roll Control Using Independent<br />

Movable Floats for Small Underwater Robots<br />

Norimitsu Sakagami*, Tomohiro Ueda**<br />

Mizuho Shibata*** and Sadao Kawamura**<br />

*Department of Navigation and Ocean Engineering, Tokai University, Japan<br />

**Department of Robotics, Ritsumeikan University, Japan<br />

***Department of Intelligent Mechanical Engineering, Kinki University, Japan<br />

• We propose a pitch and roll control<br />

system based on a float-shift mechanism<br />

• Two independent movable floats control<br />

the center of buoyancy of a robot<br />

• Numerical results show that the proposed<br />
system can change and maintain<br />
the attitude of the robot<br />

• Experiments were also conducted in a test tank<br />
• The experimental results show the performance of pitch and roll angle control<br />
Human-sized underwater robot equipped with an attitude control system based on a float-shift mechanism<br />

15:40–15:45 ThCT3.6<br />

Modeling, Simulation, and Performance of a<br />

Synergistically Propelled Ichthyoid<br />

Paul Strefling, Aren Hellum and Ranjan Mukherjee<br />

Department of Mechanical Engineering<br />

Michigan State University,<br />

East Lansing, Michigan, United States<br />

Images of the submersible captured over a period of one second immediately after starting from rest.<br />
• The Synergistically Propelled Ichthyoid is a submersible with a fluid-conveying flexible tail.<br />
• Propulsion is achieved by combining jet action and oscillatory tail motion induced by the jet.<br />
• Analytical and numerical methods are employed to solve two models of the Synergistically Propelled Ichthyoid.<br />
• Experiments were performed which qualitatively verify the models.<br />



Session ThCT4 Continental Ballroom 4 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Symposium: (Self-)assembly from the Nano to the Macro Scale: State of the Art and Future Directions<br />

Chair Nikolaus Correll, Univ. of Colorado at Boulder<br />

Co-Chair Mark Yim, Univ. of Pennsylvania<br />

14:45–15:00 ThCT4.1*<br />

(Self-)assembly from the nano to the macro scale<br />

Nikolaus Correll, University of Colorado at Boulder<br />

M. Ani Hsieh, Drexel University<br />

Mark Yim, University of Pennsylvania<br />

15:15–15:30 ThCT4.3<br />

Self-Assembly for Maximum Yields Under<br />

Constraints<br />

Michael J. Fox and Jeff S. Shamma<br />

School of Electrical and Computer Engineering, Georgia Institute of Technology,<br />
United States of America<br />

• We present an algorithm that, given<br />

any target tree, synthesizes<br />

reversible self-assembly rules that<br />

provide a maximum yield in the<br />

sense of stochastic stability.<br />

• If the reversibility constraint is<br />

relaxed then the same algorithm<br />

can be trivially modified so that it<br />

converges to a maximum yield<br />

almost surely.<br />

• We examine the conservatism of<br />

this technique by considering its<br />

implications for the internal states<br />

of the system.<br />

We use graph grammars to<br />

describe our algorithm.<br />
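The flavour of stochastic, reversible rule application can be sketched in a few lines of Python (a toy illustration only — the states, rule set, and reversal rate below are invented for the sketch and are not the authors’ synthesis algorithm):<br />

```python
import random

# Toy reversible "graph grammar": parts carry an integer state; the forward
# rule joins two free parts (state 0) into a two-part chain (states 1 and 2),
# and the reverse rule splits such a chain back into free parts.
FORWARD = {(0, 0): (1, 2)}
REVERSE = {v: k for k, v in FORWARD.items()}

def step(states, rng, p_reverse=0.1):
    """Pick a random ordered pair of parts and apply a matching rule."""
    i, j = rng.sample(range(len(states)), 2)
    rules = REVERSE if rng.random() < p_reverse else FORWARD
    if (states[i], states[j]) in rules:
        states[i], states[j] = rules[(states[i], states[j])]
    return states

def yield_fraction(n_parts=10, n_steps=2000, seed=1):
    """Fraction of parts assembled (state != 0) after n_steps random steps."""
    rng = random.Random(seed)
    states = [0] * n_parts
    for _ in range(n_steps):
        step(states, rng)
    return sum(s != 0 for s in states) / n_parts
```

Because the reverse rule fires rarely relative to the forward rule, the long-run distribution concentrates on high-yield configurations — the intuition behind maximizing yield in the sense of stochastic stability.<br />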

15:00–15:15 ThCT4.2<br />

Enhanced Directional Self-Assembly<br />

based on Active Recruitment and Guidance<br />

Nithin Mathews¹, Anders Lyhne Christensen²,<br />

Rehan O’Grady¹, Philippe Rétornaz³, Michael Bonani³,<br />

Francesco Mondada³, and Marco Dorigo¹<br />

¹IRIDIA, Université Libre de Bruxelles, Brussels, Belgium<br />

²Instituto de Telecomunicações, Lisbon, Portugal<br />

³EPFL-LSRO, Ecole Polytechnique Fédérale de Lausanne, Switzerland<br />

• Self-assembling robots<br />

• Grow specific morphologies<br />

• We propose a new mechanism<br />

to create directional<br />

connections between robots<br />

• Real-robot experiments<br />

15:30–15:35 ThCT4.4<br />




15:35–15:40 ThCT4.5<br />

Self-Assembly of Modular Robots from Finite<br />

Number of Modules Using Graph Grammars<br />

Vijeth Rai and Nikolaus Correll<br />

University of Colorado at Boulder, USA<br />

Anne van Rossum<br />

Almende Research and TiCC, Tilburg University, The Netherlands<br />

• Stochastic Self-Assembly of<br />

modular robots when the number of<br />

available building blocks is limited or<br />

exact<br />

• Stochastic approach provides<br />

scalability and robustness, while not<br />

requiring global localization<br />

• Combining assembly with<br />

disassembly leads to 100% yield<br />

• Broadcast communication<br />

accelerates assembly<br />

Hexapod assembly from<br />

“Replicator” modules<br />

15:45–16:00 ThCT4.7<br />

Structure synthesis on-the-fly in a modular robot<br />

Shai Revzen, Mohit Bhoite, Antonio Macasieb, and Mark Yim<br />

School of Engineering and Applied Science<br />

University of Pennsylvania, USA<br />

• Modular robots promise robots created<br />

on-the-fly to suit tasks not known in<br />

advance of deployment<br />

• We demonstrate a remote controlled mobile<br />

synthesizer capable of creating structural<br />

members for linking actuated modules into<br />

arbitrary custom robots<br />

• Material is a foam with 28× expansion,<br />

creating robots larger than the synthesizer cart<br />

• Applications to hazard disposal and<br />

rescue / urban warfare are also shown<br />

• Videos and more details at<br />

http://www.modlabupenn.org/foambot-videos<br />

Synthesizer cart & two mobile<br />

clusters (top), a quadruped and a<br />

snake<br />

15:40–15:45 ThCT4.6<br />

Constrained Task Partitioning for<br />

Distributed Assembly<br />

James Worcester, Josh Rogoff, and M. Ani Hsieh<br />

Scalable Autonomous Systems Lab<br />

Mechanical Engineering & Mechanics<br />

Drexel University, USA<br />

• We present an algorithm to partition 2-D<br />

and 3-D structures into N separate assembly<br />

tasks that can be executed in parallel.<br />

• The algorithm achieves a partitioning that<br />

minimizes the workload imbalance<br />

between the robots.<br />

• The algorithm consists of three phases: 1)<br />

an initial allocation, 2) workload balance,<br />

and 3) generation of parallel assembly<br />

plans.<br />

• We present simulation and experimental<br />

results.<br />


Two robots assembling the 3-D<br />

structure (bottom) in parallel.<br />

16:00–16:15 ThCT4.8<br />

Constraint-Aware Coordinated Construction<br />

of Generic Structures<br />

David Stein, T. Ryan Schoen, Daniela Rus<br />

Massachusetts Institute of Technology, USA<br />

• Constraint-aware decentralized approach to construction<br />

• Novel algorithm for partitioning sets of point masses in<br />

• Extension of previous work to conform to physical constraints of<br />

reachability and mid-construction stability<br />

Simulated parallel construction of a model airplane by three robots<br />

conforming to physical and reachability constraints


Session ThCT5 Continental Ballroom 5 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Symposium: Humanoid Applications<br />

Chair Rüdiger Dillmann, KIT Karlsruher Inst. für Tech.<br />

Co-Chair Shigeki Sugano, Waseda Univ.<br />

14:45–15:00 ThCT5.1*<br />

Semi-Plenary Invited Talk: Humanoid Robotics<br />

Research: Retrospection and Prospects<br />

Tamim Asfour, Karlsruhe Institute of Technology (KIT)<br />

15:15–15:30 ThCT5.3<br />

Actuation Requirements for Hopping and<br />

Running of the Musculoskeletal Robot<br />

BioBiped1<br />

Katayon Radkhah and Oskar von Stryk<br />

Department of Computer Science, Technische Universität Darmstadt, Germany<br />

• Approach to determine the actuation<br />

requirements for given motions for an<br />

electrically driven bipedal robot<br />

• Integration of human-like series-elastic<br />

structures to mimic the main human leg<br />

muscles<br />

• Step-by-step process to compute the<br />

control signals for computer-generated<br />

hopping and human running motions<br />

• Good agreement of the simulated ground<br />

reaction forces with the typical human<br />

single-humped patterns<br />

Rigid kinematic joint-link structure<br />

(left) versus compliant robot<br />

actuated by series elastic<br />

structures (right) performing<br />

hopping and running motions<br />

15:00–15:15 ThCT5.2<br />

Semi-Plenary Invited Talk: Humanoid Robotics<br />

Research: Retrospection and Prospects<br />

Tamim Asfour, Karlsruhe Institute of Technology (KIT)<br />

15:30–15:35 ThCT5.4<br />

Combined Intention, Activity, and Motion<br />

Recognition for a Humanoid Household Robot<br />

Dirk Gehrig¹, Peter Krauthausen¹, Lukas Rybok¹,<br />

Hildegard Kuehne¹, Uwe D. Hanebeck¹, Tanja Schultz¹,<br />

and Rainer Stiefelhagen¹,²<br />

¹ Institute for Anthropomatics,<br />

Karlsruhe Institute of Technology (KIT), Germany<br />

² Fraunhofer IOSB, Karlsruhe, Germany<br />

• Multi-Level approach to intention,<br />

activity, and motion recognition<br />

• Video-based on-line and real-time<br />

recognition for a humanoid robot<br />

• Key ideas<br />

• Complementary recognizers<br />

• Extensible model with consistent<br />

uncertainty treatment<br />

• Experimental validation on a real-world<br />

data set of complex kitchen<br />

tasks, e.g. Lay Table, Prepare Meal<br />

Architecture overview: sensors, actuators, state estimation, and robot control on the robot side, coupled to intention, activity, and motion recognition through a model of the human’s rationale and domain knowledge<br />



15:35–15:40 ThCT5.5<br />

An Expected Perception Architecture using 3D<br />

Reconstruction for a Humanoid Robot<br />

N. Moutinho¹, N. Cauli², E. Falotico², R. Ferreira¹, J. Gaspar¹,<br />

A. Bernardino¹, J. Santos-Victor¹, P. Dario², C. Laschi²<br />

¹ Institute for Systems and Robotics, IST/UTL, Portugal<br />

² The BioRobotics Institute, SSSA, Italy<br />

• Architecture: Anticipatory Visual<br />

Perception (AVP) performs Expected<br />

Perception (EP)<br />

• EP uses proprioception information to<br />

predict visual changes and detect<br />

unexpected visual events<br />

• EP allows saving computational<br />

resources by doing localized 3D<br />

reconstruction instead of full scene<br />

reconstruction<br />

15:45–16:00 ThCT5.7<br />

Development of Whole-body Humanoid "Pneumat-<br />

BS" with Pneumatic Musculoskeletal System<br />

Keita Ogawa<br />

Dept. of Adaptive Machine Systems, Osaka Univ., Japan<br />

Kenichi Narioka and Koh Hosoda<br />

Dept. of Multimedia Engineering, Osaka Univ., Japan<br />

• We developed whole-body<br />

musculoskeletal humanoid “Pneumat-BS”<br />

• The robot is driven by pneumatic<br />

artificial muscles<br />

• The robot has whole-body flexibility and<br />

whole-body coordination<br />

• We conducted a standing experiment<br />

using a simple feedback control and<br />

evaluated the stability<br />

• We conducted a walking experiment using<br />

a feedforward controller with human<br />

muscle activation patterns and confirmed<br />

that the robot was able to perform the<br />

dynamic task<br />

A musculoskeletal humanoid<br />

robot "Pneumat-BS"<br />

15:40–15:45 ThCT5.6<br />

Multilayer Control of Skiing Robot<br />

Tadej Petrič, Bojan Nemec, Jan Babič and Leon Žlajpah<br />

Department of Automation, Biocybernetics and Robotics,<br />

Jožef Stefan Institute, Slovenia<br />

• Novel method for ensuring stability of a<br />

skiing robot on a previously unknown ski<br />

slope.<br />

• Multilayer control where the primary task<br />

(stability) is only enforced when the<br />

secondary task (direction) would lead to<br />

instability.<br />

• The effectiveness of the proposed method<br />

is shown both in simulation in a virtual<br />

reality environment and on the real<br />

robot, i.e., hardware-in-the-loop.<br />

16:00–16:15 ThCT5.8<br />

Autonomous Climbing of Spiral Staircases<br />

with Humanoids<br />

Stefan Oßwald, Attila Görög,<br />

Armin Hornung, and Maren Bennewitz<br />

Department of Computer Science, University of Freiburg, Germany<br />

• Our Nao humanoid autonomously climbs<br />

up complex staircases<br />

• The robot uses a 2D laser range finder to<br />

globally localize itself in a 3D model<br />

• The pose is refined by comparing<br />

detected edges in monocular camera<br />

images to model edges<br />

• The robot accurately aligns with the next<br />

step and reliably climbs up a spiral<br />

staircase consisting of 10 steps<br />



Session ThCT6 Continental Ballroom 6 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Symposium: Computer Vision for Robotics<br />

Chair Dieter Fox, Univ. of Washington<br />

Co-Chair José Neira, Univ. de Zaragoza<br />

14:45–15:00 ThCT6.1*<br />

Semi-Plenary Invited Talk: Learning to Recognize<br />

Objects Despite Novel Environments and Sensors<br />

Trevor Darrell, UC Berkeley<br />

15:15–15:30 ThCT6.3<br />

RGB-D Object Discovery<br />

Via Multi-Scene Analysis<br />

Evan Herbst, Xiaofeng Ren, Dieter Fox<br />

Computer Science, University of Washington, USA<br />

Intel Research Seattle, Seattle, USA<br />

• Represent a geometric map as static<br />

background + movable objects<br />

• Point-based change detection algorithm<br />

for segmentation and matching<br />

• Improve on 2-D feature-based matching<br />

and on 3-D ICP-based matching<br />

15:00–15:15 ThCT6.2<br />

Semi-Plenary Invited Talk: Learning to Recognize<br />

Objects Despite Novel Environments and Sensors<br />

Trevor Darrell, UC Berkeley<br />

15:30–15:35 ThCT6.4<br />

Online Learning for Automatic<br />

Segmentation of 3D Data<br />

Federico Tombari, Luigi Di Stefano,<br />

Simone Giardino<br />

DEIS, University of Bologna, Italy<br />

• Semantic segmentation of indoor/outdoor<br />

scenes based on standard classifiers and<br />

a Markov Random Field-based grouping<br />

stage<br />

• Online learning approach: new samples<br />

are continuously selected from incoming<br />

data in an unsupervised way to improve<br />

the learning model<br />

• We experimentally demonstrate<br />

performance improvements with different<br />

classifiers (NN, SVM) as well as 3D data<br />

sources (CAD/synthetic, LIDAR, Kinect)<br />


Semantic segmentation of a<br />

complex scene acquired by a<br />

Microsoft Kinect sensor



15:35–15:40 ThCT6.5<br />

Shape-Based Depth Image to 3D Model Matching<br />

and Classification with Inter-View Similarity<br />

Walter Wohlkinger, Markus Vincze<br />

Vision4Robotics Group,<br />

Automation and Control Institute,<br />

Vienna University of Technology, Austria<br />

• 3D Shape based matching of depth<br />

images acquired with Kinect to 3D models<br />

from the Web<br />

• Matching against multiple views<br />

significantly boosts classification rate<br />

• Adaptation and optimization of shape<br />

descriptors enables real-time classification<br />

• Easy and fast learning of new categories<br />

Classification of everyday objects<br />

for robotic applications<br />

15:45–16:00 ThCT6.7<br />

Perception for the Manipulation of Socks<br />

Ping Chuan Wang, Stephen Miller, Mario Fritz,<br />

Trevor Darrell, Pieter Abbeel<br />

University of California, Berkeley, USA<br />

• Provide perceptual tools necessary<br />

for robotic sock manipulation<br />

• Visually infer configuration of<br />

possibly bunched or inside-out socks<br />

• Combine local texture features into<br />

global model<br />

• Visually match pairs of socks<br />

• Implemented on the PR2<br />

15:40–15:45 ThCT6.6<br />

Model for Unfolding Laundry using<br />

Interactive Perception<br />

Bryan Willimon, Stan Birchfield, and Ian Walker<br />

Department of Electrical and Computer Engineering,<br />

Clemson University, USA<br />

• We present an algorithm for automatically<br />

unfolding a piece of clothing.<br />

• Our results show that the algorithm<br />

flattens out a piece of cloth to within<br />

11.1% to 95.6% of the canonical position.<br />

• In this paper, we use peak ridges,<br />

continuity of a surface, and corner locations<br />

in our model to calculate grasp locations<br />

and movement orientations.<br />

16:00–16:15 ThCT6.8<br />



Session ThCT7 Continental Parlor 7 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Exoskeleton Robots & Gait Rehabilitation<br />

Chair Bernard Espiau, INRIA<br />

Co-Chair Constantinos Mavroidis, Northeastern Univ.<br />

14:45–15:00 ThCT7.1<br />

Design and Control of a Robotic Lower<br />

Extremity Exoskeleton for Gait Rehabilitation<br />

Ozer Unluhisarcikli, Maciej Pietrusinski,<br />

Brian Weinberg, Constantinos Mavroidis<br />

Dept. of Mechanical & Industrial Engineering, Northeastern University, USA<br />

Paolo Bonato<br />

Spaulding Rehabilitation Hospital, USA<br />

• Design and control of an active knee<br />

rehabilitation orthotic system (ANdROS) is<br />

presented.<br />

• ANdROS is designed as a wearable and<br />

portable assistive tool for gait<br />

rehabilitation<br />

• A corrective force field that reinforces a<br />

desired gait pattern is applied to the<br />

patient’s impaired leg around the knee<br />

joint via an impedance controlled<br />

exoskeleton.<br />

• The impedance controller is synchronized<br />

with the patient’s walking phase, which is<br />

estimated from the kinematic<br />

measurements of the healthy leg.<br />

15:15–15:30 ThCT7.3<br />

Ergonomic considerations for anthropomorphic<br />

wrist exoskeletons: A simulation study on<br />

the effects of joint misalignment<br />

M. Esmaeili, K. Gamage, E. Tan and D. Campolo<br />

School of Mechanical and Aerospace Engineering<br />

Nanyang Technological University, Singapore<br />

• We consider a 2-DOF model with non-<br />

intersecting joints for both the human wrist<br />

and the exoskeleton<br />

• A viscoelastic attachment between the<br />

human hand and the handle of the exo is<br />

modeled via 4 non-collinear springs to<br />

allow kinematics differences between<br />

human and exoskeleton<br />

• Discomfort is quantified as the amount of<br />

potential energy stored in the deformation<br />

of the viscoelastic attachment<br />

• From the known distribution of joint offset<br />

for humans and the simulated discomfort<br />

function, we compute the optimal joint<br />

offset for the exoskeleton by solving the<br />

‘one-size-fits-all’ problem<br />

2-DOF kinematic model of the wrist<br />

and exoskeleton with viscoelastic<br />

hand-handle attachment<br />

15:00–15:15 ThCT7.2<br />

Active Air Mat for Comfortable and Easy to Wear<br />

a Forearm Support System<br />

Yasuhisa Hasegawa, Munenori Tayama, Takefumi Saito and<br />

Yoshiyuki Sankai<br />

Department of Intelligent Interaction Technologies, University of Tsukuba,<br />

Japan<br />

• An active air mat installed in interface<br />

parts of the exoskeleton improves the<br />

comfort and ease of wearing an<br />

exoskeleton on a forearm.<br />

• One of the most important effects is to<br />

minimize constriction of blood flow by<br />

changing contacting areas with a<br />

human arm from the periphery to the<br />

trunk.<br />

• The active air mat is evaluated from the<br />

viewpoint of pressure distribution, blood<br />

flow, wearing time, releasing time,<br />

body-holding rigidity, and temperature<br />

and humidity of a user’s skin in resting<br />

and working states.<br />


Forearm support system and<br />

components of active air mat<br />

15:30–15:35 ThCT7.4<br />

The Development and Testing of a Human<br />

Machine Interface for a Mobile Medical<br />

Exoskeleton<br />

Katherine Strausser and H. Kazerooni<br />

Mechanical Engineering, University of California, Berkeley, USA<br />

• The Human Machine Interface (HMI)<br />

allows a user to control a mobile<br />

exoskeleton independently.<br />

• The HMI is implemented on eLEGS, a<br />

mobile exoskeleton for Spinal Cord Injury<br />

patients.<br />

• This method uses natural gestures to<br />

operate the machine easily and naturally.<br />

The Human Machine Interface is<br />

implemented on eLEGS.



15:35–15:40 ThCT7.5<br />

A Self-Adjusting Knee Exoskeleton<br />

for Robot-Assisted Treatment of Knee Injuries<br />

Mehmet Alper Ergin and Volkan Patoglu<br />

Faculty of Engineering and Natural Sciences<br />

Istanbul, Turkey<br />

• A novel active device for robot-assisted<br />

rehabilitation that accommodates<br />

translational movements of the knee joint<br />

as well as its rotation<br />

• 3 Degrees of Freedom device with 2<br />

translations and one rotation<br />

• Enables both passive and active<br />

alignment of the knee joint<br />

• Impedance controller design<br />

• Characterization and tracking<br />

performance of the device are presented<br />

15:40–15:45 ThCT7.6<br />

Alterations in Body Weight Shifting during<br />

Robotic Assisted Gait Rehabilitation<br />

using NaTUre-gaits<br />

H. B. Lim, Trieu Phat Luu, K. H. Hoon, Xingda Qu, and K. H. Low<br />

School of Mechanical and Aerospace Engineering,<br />

Nanyang Technological University, Singapore<br />

Adela Tow<br />

Department of Rehabilitation Medicine, Tan Tock Seng Hospital, Singapore<br />

• Pelvic assistance for robotic assisted gait<br />

rehabilitation<br />

• Investigation of body weight shifting of<br />

a human subject during gait rehabilitation<br />

with NaTUre-gaits<br />

• Study of the Center of Pressure (COP)<br />

trajectory during robotic assisted gait<br />

rehabilitation<br />

• In this study, COP trajectory is found to be<br />

affected by the robotic system<br />


Pelvic assistance mechanism, with left and right robotic arms and the contact surface<br />


Session ThCT8 Continental Parlor 8 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Aerial Robots: Navigation, Tracking & Landing<br />

Chair Stefan Hrabar, CSIRO ICT Centre<br />

Co-Chair Mark Minor, Univ. of Utah<br />

14:45–15:00 ThCT8.1<br />

Chasing a moving target from a flying UAV<br />

Céline Teulière and Laurent Eck<br />

Interactive Robotics Unit, CEA-LIST, France<br />

Eric Marchand<br />

Lagadic Unit, Université de Rennes 1, France<br />

• Robust visual tracking system:<br />

• Multiple kernel tracking in particle<br />

filtering framework<br />

• Initialisation/reinitialisation system<br />

• Hierarchical control scheme for UAV<br />

position and yaw control<br />

• Real-time chasing experiments robust to<br />

noisy video transmission, some complete<br />

occlusions, and scale and orientation<br />

changes.<br />

The UAV and the internal camera view<br />

15:15–15:30 ThCT8.3<br />

Autonomous Miniature Blimp Navigation<br />

with Online Motion Planning and Re-planning<br />

Jörg Müller, Norman Kohler, and Wolfram Burgard<br />

Department of Computer Science, University of Freiburg, Germany<br />

• Approach to autonomous navigation<br />

for miniature indoor blimps in mapped<br />

environments<br />

• Multi-stage planning algorithm for<br />

strongly goal-directed tree-based<br />

kinodynamic motion planning<br />

• A* path generation in a 4-dimensional<br />

subspace and path-guided sampling in<br />

the full 12-dimensional state space<br />

• Navigation on a round-trip using<br />

motion capture state estimation and an<br />

LQR controller<br />

• Implemented and evaluated on a real<br />

robotic indoor blimp<br />

Our indoor blimp navigating<br />

autonomously through a narrow<br />

passage between two obstacles.<br />
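The A*-based path generation stage can be sketched in miniature; this 2-D grid version is only a schematic stand-in for the 4-dimensional search described above (the occupancy grid, unit step costs, and Manhattan heuristic are assumptions of the sketch):<br />

```python
import heapq

def astar(grid, start, goal):
    """Minimal A* on a 4-connected occupancy grid (0 = free, 1 = occupied).

    Returns a list of cells from start to goal, or None if no path exists.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan heuristic
    open_set = [(h(start), 0, start, None)]  # (f, g, cell, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:          # already expanded with a better cost
            continue
        came_from[cur] = parent
        if cur == goal:               # reconstruct the path back to start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), cur))
    return None
```

In the pipeline described above, such a coarse path would then guide sampling in the full 12-dimensional state space.<br />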

15:00–15:15 ThCT8.2<br />

A Fast and Adaptive Method for Estimating UAV<br />

Attitude from the Visual Horizon<br />

Richard J. D. Moore, Saul Thurrowgood, Daniel Bland,<br />

Dean Soccol & Mandyam V. Srinivasan<br />

The Queensland Brain Institute, The University of Queensland, Australia<br />

• A classifier is used to<br />

segment wide FoV images<br />

into sky and ground<br />

regions<br />

• The classifier is trained<br />

continuously, allowing on-the-fly<br />

initialisation and the<br />

ability to adapt to changing<br />

environments<br />

• Matching the classified<br />

images against a precomputed<br />

database allows<br />

estimation of aircraft<br />

attitude to within 1.5° in<br />

2 ms at 1 GHz<br />


An illustration of the attitude estimation process. The<br />

input image (a) is classified into sky/ground regions<br />

(b), and then the row and column averages are<br />

matched against a pre-computed database (c) to<br />

retrieve the attitude, which is used to generate a<br />

training mask (d) for the classifier.<br />

15:30–15:35 ThCT8.4<br />

Efficient Target Geolocation by<br />

Highly Uncertain Small Air Vehicles<br />

Ben Grocholsky, Michael Dille and Stephen Nuske<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• Small UAVs have limited sensing and<br />

typically exhibit significant heading<br />

error resulting in poor target<br />

geolocation accuracy.<br />

• An estimation method is sought that<br />

can process complex nonlinear<br />

uncertainty using lightweight<br />

computing hardware.<br />

• We present an efficient parametric<br />

representation for nonlinear<br />

uncertainty estimates with significantly<br />

lower complexity than existing<br />

methods.<br />

• Visual target geolocation results are<br />

demonstrated on a small UAV platform.<br />

A typical small UAV platform and target<br />

location observation uncertainty



15:35–15:40 ThCT8.5<br />

Beyond Visual Range Obstacle Avoidance and<br />

Infrastructure Inspection by an<br />

Autonomous Helicopter<br />

Torsten Merz and Farid Kendoul<br />

ICT Centre, CSIRO, Australia<br />

• LIDAR-based perception and guidance<br />

system for autonomous helicopters<br />

operating at low altitude in unknown<br />

environments<br />

• Efficient methods for terrain following,<br />

goal-oriented obstacle avoidance, and<br />

close-range inspection<br />

• Implemented on board the CSIRO<br />

unmanned helicopter and flight tested in<br />

several different mission scenarios<br />

• Successfully completed 37 inspection<br />

missions including 2 missions beyond<br />

visual range without a backup pilot<br />

15:45–16:00 ThCT8.7<br />

Reactive Obstacle Avoidance for<br />

Rotorcraft UAVs<br />

Stefan Hrabar<br />

Australian Research Centre for Aerospace Automation (ARCAA) / CSIRO,<br />

Brisbane, Australia<br />

• A goal-oriented reactive avoidance<br />

technique suitable for Rotorcraft UAVs.<br />

• Uses a 3D occupancy map<br />

representation of sensed obstacles.<br />

• Cylindrical Safety Volume is tested for<br />

obstacles ahead of the UAV.<br />

• Efficient collision checking within the<br />

cylinder achieved by approximate<br />

nearest neighbor search in a Bkd-tree<br />

representation of occupied voxels.<br />

• Generates an ‘Escape Point’ waypoint<br />

providing a path around the obstacles.<br />

• Performs in real-time using on-board<br />

processing.<br />

• Successful obstacle avoidance results<br />

presented with an Unmanned<br />

Helicopter<br />

The CSIRO Unmanned Helicopter fitted<br />

with stereo and lidar sensors for<br />

obstacle avoidance.<br />
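The cylindrical safety-volume test reduces to a point-in-cylinder check per occupied voxel; the brute-force sketch below conveys the geometry (the real system accelerates this with an approximate nearest-neighbour query on a Bkd-tree, and the function name and parameters here are invented for the sketch):<br />

```python
def collision_ahead(voxels, pos, heading, length, radius):
    """Return True if any occupied voxel centre lies inside a cylinder.

    The cylinder extends from pos along the unit vector heading for the
    given length, with the given radius. Brute-force stand-in for an
    approximate nearest-neighbour query over occupied voxels.
    """
    hx, hy, hz = heading
    for vx, vy, vz in voxels:
        dx, dy, dz = vx - pos[0], vy - pos[1], vz - pos[2]
        along = dx * hx + dy * hy + dz * hz      # projection onto the axis
        if not 0.0 <= along <= length:           # outside the cylinder ends
            continue
        # squared distance from the cylinder axis (Pythagoras)
        rad2 = (dx * dx + dy * dy + dz * dz) - along * along
        if rad2 <= radius * radius:
            return True
    return False
```

When the check fires, the planner would fall back to searching for an ‘Escape Point’ waypoint around the obstacle.<br />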

15:40–15:45 ThCT8.6<br />

Coordinated Landing of a Quadrotor on a Skid-<br />

Steered Ground Vehicle in the Presence of Time<br />

Delays<br />

John Daly, Yan Ma, and Steven Waslander<br />

Department of Mechanical and Mechatronics Engineering, University of<br />

Waterloo, Canada<br />

• We present a control technique to<br />

autonomously land a quadrotor on a<br />

moving ground vehicle<br />

• Feedback linearizing controllers are used<br />

for the quadrotor and ground vehicle<br />

• A joint controller to stabilize the difference<br />

in position and the velocity is designed to<br />

accomplish a rendezvous<br />

• Time delay stability analysis is performed<br />

to determine the delay margin of the<br />

closed-loop system<br />

• Results verified through numerical<br />

simulation<br />

16:00–16:15 ThCT8.8<br />

Avian-Inspired Passive Perching Mechanism for<br />

Robotic Rotorcraft<br />

Courtney E. Doyle, Justin J. Bird, Taylor A. Isom,<br />

C. Jerald Johnson, Jason C. Kallman, Jason A. Simpson,<br />

Raymond J. King, Jake J. Abbott, and Mark A. Minor<br />

Mechanical Engineering, University of Utah, United States<br />

• Mechanism inspired by songbirds<br />

to enable perching on cylindrical<br />

surfaces.<br />

• Compliant, underactuated, tendon-driven<br />

gripper provides closure<br />

around perch.<br />

• Leg mechanism with compliant<br />

joints converts rotorcraft weight<br />

into actuation force for gripper.<br />


Passive mechanism for perching on cylindrical<br />

surfaces. LEFT: Mechanism in relaxed state;<br />

CENTER: Mechanism perching on 49mm rod;<br />

RIGHT: Mechanism perching on 33mm rod.


Session ThCT9 Continental Parlor 9 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Swarms and Flocks<br />

Chair Dylan Shell, Texas A&M Univ.<br />

Co-Chair Radhika Nagpal, Harvard Univ.<br />

14:45–15:00 ThCT9.1<br />

Communication assisted navigation in robotic<br />

swarms: self-organization and cooperation<br />

F. Ducatelle, G. A. Di Caro and L. M. Gambardella<br />

IDSIA, USI/SUPSI, Switzerland<br />

C. Pinciroli<br />

IRIDIA, ULB, Belgium<br />

F. Mondada<br />

LSRO, EPFL, Switzerland<br />

• Robots guide each other’s<br />

navigation by passing wireless<br />

messages over a range and<br />

bearing communication system<br />

• The algorithm supports efficient<br />

navigation while being robust to<br />

robot failures<br />

• Multiple robots moving between<br />

two targets can self-organize into<br />

a dynamic chain (see figure)<br />

• This self-organization supports<br />

cooperation and improves<br />

navigation efficiency<br />
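The guidance idea can be caricatured as hop-count gradient propagation over the communication graph (a schematic stand-in, not the IDSIA algorithm; the synchronous update rule and data structures are assumptions of the sketch):<br />

```python
def update_hops(hops, neighbours, target):
    """One synchronous round of hop-count propagation.

    Each robot sets its hop distance to the target to one more than its
    best neighbour's current value; the target itself stays at zero.
    hops: dict robot -> current hop count (float("inf") if unknown);
    neighbours: dict robot -> list of robots in communication range.
    """
    new = dict(hops)
    for r, nbrs in neighbours.items():
        if r == target:
            new[r] = 0
        else:
            best = min((hops[n] for n in nbrs), default=float("inf"))
            new[r] = min(hops[r], best + 1)
    return new
```

A robot seeking the target would then move toward its lowest-count neighbour, and failed robots simply vanish from the neighbour sets, which is why such schemes degrade gracefully.<br />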

15:15–15:30 ThCT9.3<br />

The Distributed Co-Evolution<br />

of an Embodied Simulator and Controller<br />

for Swarm Robot Behaviours<br />

Paul O'Dowd, Alan Winfield and Matthew Studley<br />

Bristol Robotics Laboratory, University of the West of England, UK<br />

• Each robot evolves a controller solution<br />

through an embodied simulator<br />

• Real world robot performance metric<br />

used to indirectly measure environment<br />

correspondence of the embodied<br />

simulator<br />

• Real world robot performance metric<br />

used to collectively and competitively<br />

evolve the environment model<br />

• Swarm of robots can adapt to<br />

environmental change<br />

Five processes on board each<br />

robot to co-evolve both robot<br />

controller and embodied<br />

simulator.<br />

15:00–15:15 ThCT9.2<br />

Effect of Sensor and Actuator Quality<br />

on Robot Swarm Algorithm Performance<br />

Nicholas Hoff, Robert Wood, and Radhika Nagpal<br />

Computer Science, Harvard University, United States<br />

• The performance of a swarm depends on<br />

the hardware quality of the robots.<br />

• We vary the quality of the sensors and<br />

actuators, and measure the effect on<br />

algorithm performance.<br />

• Large amounts of hardware error are<br />

required before performance appreciably<br />

decreases.<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–221–<br />

Varying number of communication<br />

receivers.<br />

15:30–15:35 ThCT9.4<br />



Session ThCT9 Continental Parlor 9 Thursday, September 29, <strong>2011</strong>, 14:45–16:15<br />

Swarms and Flocks<br />

Chair Dylan Shell, Texas A&M Univ.<br />

Co-Chair Radhika Nagpal, Harvard Univ.<br />

15:35–15:40 ThCT9.5<br />

Gas Source Localization in Indoor Environments<br />

using Multiple Inexpensive Robots and Stigmergy<br />

Maurizio Di Rocco<br />

DIA, Università degli Studi Roma Tre, Italy<br />

Matteo Reggente, Alessandro Saffiotti<br />

AASS, Orebro University, Sweden<br />

• Gas source localization is difficult, e.g.,<br />

because gas concentration does not have a<br />

smooth gradient due to turbulent diffusion<br />

• We generate an artificial gradient that<br />

combines gas concentration and distance,<br />

and leads to the likely location of the gas source<br />

• The gradient is stored in the environment<br />

(stigmergy) using a grid of RFID tags<br />

• Through stigmergy, robots only need<br />

minimal sensing and computing resources<br />

• Through stigmergy, cooperation among<br />

multiple robots emerges naturally<br />

Top: experimental set-up.<br />

Bottom: computed gas<br />

concentration map.<br />
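The stigmergic gradient described above can be sketched in a few lines. This is only an illustrative sketch: the cell-update rule, the `alpha` gain, and the dictionary-based RFID grid are assumptions, not the authors' actual formula.

```python
# Hedged sketch of a stigmergic gradient stored in the environment:
# each RFID cell holds a value combining hop distance and local gas
# concentration; robots descend the stored values with minimal sensing.

def update_tag(grid, cell, concentration, alpha=0.5):
    """Relax a cell toward min(neighbor) + 1, discounted by concentration."""
    x, y = cell
    neighbors = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
    best = min(grid.get(n, float("inf")) for n in neighbors)
    # Higher concentration lowers the stored value, pulling robots
    # toward the likely gas source.
    value = min(grid.get(cell, float("inf")),
                best + 1.0 - alpha * concentration)
    grid[cell] = value
    return value
```

A robot reading nearby tags would simply move toward the lowest stored value; cooperation emerges because every robot reads and refines the same shared grid.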

15:45–16:00 ThCT9.7<br />

Human Swarm Modeling in Exhibition Space<br />

and Space Design<br />

Masafumi Okada, Yuichi Motegi and Ko Yamamoto<br />

Department of Mechanical Sciences and Engineering,<br />

Tokyo TECH, Japan<br />

• The purposes of this research are layout<br />

optimization of exhibits to reduce<br />

congestion and amenity space design.<br />

• Human behavior including individual<br />

characteristics is modeled by a two-dimensional<br />

vector field and its multi-dimensional<br />

extension.<br />

• A layout of exhibits is optimized based on<br />

the proposed model by minimizing the<br />

collision-avoidance motion of individuals.<br />

• The proposed optimization method is<br />

verified by simulations and experiments<br />

using swarm robots.<br />

Experimental verification using<br />

swarm robots<br />

15:40–15:45 ThCT9.6<br />

Reynolds flocking in reality with fixed-wing robots:<br />

communication range vs. maximum turning rate<br />

Sabine Hauert, Severin Leven, Maja Varga, Jean-Christophe<br />

Zufferey and Dario Floreano<br />

Laboratory of Intelligent Systems, EPFL, Switzerland<br />

Fabio Ruini, Angelo Cangelosi<br />

Centre for Robotics and Neural Systems, University of Plymouth, UK<br />

• Demonstrated Reynolds flocking<br />

using swarms of 10 flying robots.<br />

• Results show tradeoff between<br />

communication range and flight<br />

dynamics on flocking performance.<br />

• Developed infrastructure enables<br />

aerial swarm experiments with a<br />

single operator.<br />


Swarm of flying robots used to<br />

perform Reynolds flocking.<br />
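Reynolds flocking, referenced above, combines three local steering rules: separation, cohesion, and alignment. A minimal 2-D sketch follows; the gains, radii, and all-pairs neighbor search are illustrative assumptions, not the fixed-wing controller studied in the paper.

```python
# Minimal 2-D sketch of Reynolds' three flocking rules. Each agent
# steers away from very close neighbors (separation), toward the
# local center of mass (cohesion), and toward the mean neighbor
# velocity (alignment).

def flocking_step(positions, velocities, radius=5.0, sep_dist=1.0,
                  w_sep=1.5, w_coh=0.1, w_ali=0.5):
    """Return one steering vector per agent from its neighbors."""
    steers = []
    for i, (px, py) in enumerate(positions):
        sep = [0.0, 0.0]; coh = [0.0, 0.0]; ali = [0.0, 0.0]; n = 0
        for j, (qx, qy) in enumerate(positions):
            if i == j:
                continue
            dx, dy = qx - px, qy - py
            d = (dx * dx + dy * dy) ** 0.5
            if d == 0.0 or d > radius:
                continue                     # outside communication range
            n += 1
            coh[0] += dx; coh[1] += dy       # pull toward neighbors
            ali[0] += velocities[j][0]; ali[1] += velocities[j][1]
            if d < sep_dist:                 # push away if too close
                sep[0] -= dx / d; sep[1] -= dy / d
        if n:
            coh = [c / n for c in coh]
            ali = [a / n for a in ali]
        steers.append((w_sep * sep[0] + w_coh * coh[0] + w_ali * ali[0],
                       w_sep * sep[1] + w_coh * coh[1] + w_ali * ali[1]))
    return steers
```

The paper's tradeoff appears directly in this sketch: shrinking `radius` (communication range) empties the neighbor lists, while a limited turning rate bounds how much of each steering vector a fixed-wing platform can actually follow.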

16:00–16:15 ThCT9.8<br />

ARGoS: a Modular, Multi-Engine Simulator for<br />

Heterogeneous Swarm Robotics<br />

Carlo Pinciroli, Vito Trianni, Rehan O’Grady, Giovanni Pini,<br />

Arne Brutschy, Manuele Brambilla, Nithin Mathews,<br />

Eliseo Ferrante and Marco Dorigo<br />

IRIDIA, CoDE, Université Libre de Bruxelles, Belgium<br />

Gianni di Caro, Frederick Ducatelle and Luca Maria Gambardella<br />

IDSIA, USI-SUPSI, Switzerland<br />

Timothy Stirling<br />

LIS, École Polytechnique Fédérale de Lausanne, Switzerland<br />

Álvaro Gutiérrez<br />

ETSI Telecomunicación, Universidad Politécnica de Madrid, Spain<br />

• A novel, open source multi-robot<br />

simulator targeted for large<br />

heterogeneous swarms of robots<br />

• ARGoS’ modular architecture is<br />

designed for scalability and<br />

extensibility<br />

• A unique feature of ARGoS is that it is<br />

possible to divide the space into subspaces<br />

managed by different physics<br />

engines<br />

• ARGoS’ architecture is multi-threaded<br />

and free of race conditions<br />

Results show that ARGoS can<br />

simulate 10,000 simple robots<br />

40% faster than real time


Session ThDT1 Continental Ballroom Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Forum: On Robotics Conferences: A Town Hall Meeting<br />

Chair Peter Corke, QUT<br />

Co-Chair<br />



Session ThDT3 Continental Parlor 3 Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Marine Systems II<br />

Chair Ioannis Rekleitis, McGill Univ.<br />

Co-Chair Shugen Ma, Ritsumeikan Univ.<br />

16:45–17:00 ThDT3.1<br />

ePaddle Mechanism: Towards The Development of<br />

A Versatile Amphibious Locomotion Mechanism<br />

Yi Sun and Shugen Ma<br />

Department of Robotics, Ritsumeikan University, Japan<br />

• An eccentric paddle locomotion mechanism (ePaddle) for amphibious<br />

tasks is proposed;<br />

• Three terrestrial and two aquatic gaits are introduced, including wheeled<br />

rolling, legged walking, hybrid wheeled, rotational paddling and oscillating<br />

paddling gaits;<br />

• Forward and inverse kinematic models for the above gaits are derived;<br />

• Prototype design, experiments and simulations are presented to verify<br />

the idea of the ePaddle mechanism.<br />

17:15–17:30 ThDT3.3<br />

MARE: Marine Autonomous Robotic Explorer<br />

Yogesh Girdhar, Anqi Xu, Bir Bikram Dey, Malika Meghjani,<br />

Florian Shkurti, Ioannis Rekleitis, and Gregory Dudek<br />

School of Computer Science, McGill University, Canada<br />

• An autonomous airboat robot suitable for<br />

exploration-oriented tasks in both closed<br />

and open water environments<br />

• A coverage-driven exploration strategy for<br />

systematically exploring a bounded region<br />

with arbitrary obstacles, based on a<br />

complete and optimal coverage algorithm<br />

• A surprise-driven exploration strategy for<br />

steering the robot on a path that might<br />

lead to potentially surprising observations<br />

• A programming tool and accompanying<br />

software management system for<br />

respectively specifying and regulating<br />

concurrent robot behaviors and activities<br />

Our MARE robot being deployed<br />

to visually explore a coastal coral<br />

reef region autonomously<br />

17:00–17:15 ThDT3.2<br />

Generic Architecture for Multi-AUV Cooperation<br />

Based on a Multi-Agent Reactive<br />

Organizational Approach<br />

Nicolas Carlési, Fabien Michel,<br />

Bruno Jouvencel and Jacques Ferber<br />

LIRMM, Univ. Montpellier 2, France<br />

• Objective:<br />

Design an architecture for<br />

specifying AUV flotillas at a high<br />

level of abstraction, regardless of<br />

mission context and AUV<br />

characteristics and skills.<br />

• Proposed approach:<br />

- An organizational model is used to<br />

ease and regulate interactions<br />

between heterogeneous AUVs.<br />

- A behavioral reactive approach is<br />

used to limit communication.<br />

• Simulation:<br />

Heterogeneous AUVs cooperate<br />

for exploring and mapping an area.<br />


REMORAS: a three-layer control architecture<br />

17:30–17:45 ThDT3.4<br />

State Estimation of an Underwater Robot Using<br />

Visual and Inertial Information<br />

F. Shkurti, I. Rekleitis, M. Scaccia, and G. Dudek<br />

School of Computer Science, McGill University, Canada<br />

• Single camera, depth sensor, and<br />

IMU used for underwater robot<br />

state estimation<br />

• Challenging lighting conditions<br />

• Unpredictable motion<br />

• EKF used to estimate the pose and<br />

velocities of the vehicle<br />

• Experimental validation using data<br />

sets collected over coral reefs<br />

Aqua swimming over the coral reef
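The entry above fuses camera, depth-sensor, and IMU data in an EKF. As a toy illustration of the underlying predict/update cycle, here is a scalar Kalman filter for a single state such as depth; the constant-state model and the noise variances `q` and `r` are assumptions for the sketch, not the paper's full visual-inertial filter.

```python
# Illustrative scalar Kalman filter: one state (e.g. depth), a
# constant-state process model, and a direct noisy measurement.

def kf_step(x, p, z, q=0.01, r=0.25):
    """One predict + update cycle; returns the new estimate and variance."""
    # Predict: state unchanged, uncertainty grows by process noise q
    p_pred = p + q
    # Update: blend prediction and measurement z via the Kalman gain
    k = p_pred / (p_pred + r)
    x_new = x + k * (z - x)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new
```

The full EKF replaces the scalar state with the vehicle's pose and velocities and linearizes the motion and camera models at each step, but the gain-weighted blend of prediction and measurement is the same.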


Session ThDT4 Continental Parlor 2 Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Novel System Designs: Locomotion<br />

Chair François Pierrot, CNRS - LIRMM<br />

Co-Chair Pei-Chun Lin, National Taiwan Univ.<br />

16:45–17:00 ThDT4.1<br />

Shape and Location Design of Supporting Legs<br />

for a New Water Strider Robot<br />

Wu Licheng 1 , Wang Shuhui 2 , Marco Ceccarelli 3 ,<br />

Yuan Haiwen 2 and Yang Guosheng 1<br />

1 School of Information Engineering, Minzu University of China, China<br />

2 School of Automation Science & Electrical Engineering, Beihang Univ., China<br />

3 Laboratory of Robotics and Mechatronics, University of Cassino, Italy<br />

• Water Strider Robot<br />

(WSR) stands and moves<br />

on water by surface<br />

tension.<br />

• Shape and location of<br />

supporting legs for<br />

enhancing the lift force are<br />

critical to a WSR.<br />

A Water Dancer II-a standing on the water.<br />

• A supporting leg is modeled as an Euler–Bernoulli elastic curved beam<br />

and a method for designing its optimal shape is proposed.<br />

• A method for properly locating supporting legs on the robot body is<br />

proposed by analyzing the influence of leg location on lift force and the<br />

relationship between the supporting legs and the robot’s roll-resistance capability.<br />

• The proposed methods are used and validated on a developed Water<br />

Dancer II-a, as shown in the figure.<br />

17:15–17:30 ThDT4.3<br />

HAMR³: An Autonomous 1.7 g Ambulatory Robot<br />

Andrew T. Baisch, Michael Karpelson, and Robert J. Wood<br />

School of Engineering and Applied Sciences, and the Wyss Institute for<br />

Biologically Inspired Engineering, Harvard University, USA<br />

Christian Heimlich<br />

Ecole Polytechnique Fédérale de Lausanne, Switzerland<br />

• Presentation of a 1.7 g, 4.7 cm<br />

hexapod robot as a platform for<br />

research on centimeter-scale<br />

walking systems.<br />

• Results during autonomous<br />

locomotion.<br />

• Description of onboard high<br />

voltage electronics to drive<br />

piezoelectric actuators.<br />

The third generation Harvard<br />

Ambulatory MicroRobot, HAMR³<br />

17:00–17:15 ThDT4.2<br />

Locomotion Approach of REMORA: a REconfigurable<br />

MObile Robot for manufacturing Applications<br />

Hai Yang and Cédric Baradat<br />

Tecnalia, France<br />

Sébastien Krut and François Pierrot<br />

LIRMM, CNRS-Université de Montpellier II, France<br />

• A quadruped uses eight actuators for<br />

achieving manufacturing and<br />

locomotion tasks in large workspaces<br />

• There are locking devices on some of the<br />

passive joints and clamping devices at<br />

the end of its limbs<br />

• The locking strategy of the robot is<br />

formulated as an optimization problem<br />

• The robot can achieve locomotion with<br />

respect to some extra constraints in the<br />

workspaces<br />

17:30–17:45 ThDT4.4<br />

Experimental Dynamics of Wing Assisted<br />

Running for a Bipedal Ornithopter<br />

Kevin Peterson and Ronald Fearing<br />

EECS, University of California, Berkeley, USA<br />

• The wing-assisted dynamics of quasi-static<br />

and dynamic gaits are examined by using an<br />

onboard accelerometer.<br />

• We compare accelerations for bipedal<br />

running with and without wings and find<br />

flapping wings result in a smoother, faster<br />

dynamic gait.<br />

• With wings, peak accelerations are ±22<br />

m/s² and the speed is 1.5 m/s; without, the<br />

accelerations peak at ±47 m/s² with speed<br />

0.35 m/s.<br />

• The energetic cost of running with and without wing<br />

assistance was measured to be 53.1 J/(kg·m) and<br />

89.7 J/(kg·m), respectively.<br />


The Bipedal Ornithopter for<br />

Locomotion Transitioning (BOLT)<br />

is a hybrid flying/crawling robot<br />

used to study the interactions<br />

between legs and wings, and<br />

terrestrial/aerial transitions.


Session ThDT5 Continental Parlor 1 Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Climbing & Brachiation<br />

Chair Fumiya Iida, ETH Zurich<br />

Co-Chair Motoji Yamamoto, Kyushu Univ.<br />

16:45–17:00 ThDT5.1<br />

CLASH: Climbing Loose Cloth<br />

Paul Birkmeyer, Ronald S. Fearing<br />

Department of Electrical Engineering and Computer Science<br />

University of California, Berkeley, United States<br />

Andrew G. Gillies<br />

Department of Mechanical Engineering<br />

University of California, Berkeley, United States<br />

• 10 cm, 15 gram hexapedal robot capable<br />

of climbing vertical loose cloth at 15 cm/s<br />

• Single drive actuator<br />

• Passive foot design allows easy<br />

engagement, disengagement of surface<br />

• Novel design manages body dynamics to<br />

improve climbing performance<br />

17:15–17:30 ThDT5.3<br />

Scaling Walls:<br />

Applying Dry Adhesives to the Real World<br />

Elliot W. Hawkes, John Ulmen, Noe Esparza,<br />

and Mark R. Cutkosky<br />

Department of Mechanical Engineering, Design Division,<br />

Stanford University, United States<br />

• We present two foot mechanisms<br />

inspired by the structures in gecko<br />

toes<br />

• Both allow almost 100% contact even<br />

with significant initial misalignment<br />

• The tendon-inspired design with large<br />

patches achieves adhesion equal to<br />

small aligned test samples<br />

• Scaling may allow a robot as large as<br />

100 kg to climb<br />

Stickybot III climbing a painted<br />

metal door with a tendon-based<br />

foot design.<br />

17:00–17:15 ThDT5.2<br />

Shaping Energetically Efficient Brachiation<br />

Motion for a 24-DOF Gorilla Robot<br />

Stepan Pchelkin 1, Anton Shiriaev 1, Uwe Mettin 1 and Leonid Freidovich 2<br />

1 Norwegian University of Science and Technology, Norway<br />

2 Umeå University, Sweden<br />

Tadayoshi Aoyama, Zhiguo Lu and Toshio Fukuda<br />

Nagoya University, Japan<br />

• Problem: Trajectory generation for a<br />

24-DOF gorilla robot with passive wrist<br />

• We present a numerically tractable<br />

procedure with iterative heuristics and<br />

rigorous tools of analytical mechanics<br />

to minimize the energetic cost of<br />

transport<br />

• (step 1) Blind Numerical Search with<br />

Different Assumptions on Geometrical<br />

Constraints<br />

• (step 2) Relaxed Numerical Search<br />

with Gradient and Hessian of Cost<br />

Function<br />

17:30–17:45 ThDT5.4<br />

A Climbing Robot Based on Hot Melt Adhesion<br />

Marc Osswald and Fumiya Iida<br />

Bio-Inspired Robotics Lab, ETH Zurich, Switzerland<br />

• Hot Melt Adhesion can provide up to<br />

150 N/cm² shear force for climbing robot<br />

locomotion<br />

• HMA can be used to climb on different<br />

materials in the environment<br />

• Investigation of HMA thermoplasticity and<br />

controllability for climbing robot locomotion<br />

• Design and control of a climbing robot<br />

platform (body size 16cm and weight<br />

0.6kg)<br />

• Demonstration and analysis of climbing<br />

robot locomotion at 0.17 m/min.<br />


The Climbing Robot Platform


Session ThDT6 Continental Parlor 9 Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Novel System Designs: Sensing and Manipulation<br />

Chair Raj Madhavan, UMD-CP/NIST<br />

Co-Chair Chang-Soo Han, Hanyang Univ.<br />

16:45–17:00 ThDT6.1<br />

Quadrocopter Ball Juggling<br />

Mark Müller, Sergei Lupashin and Raffaello D’Andrea<br />

Institute for Dynamic Systems and Control, ETH Zurich, Switzerland<br />

• Quadrocopters with attached rackets hitting a table<br />

tennis ball<br />

• Operating in a 10×10×10 m motion capture volume<br />

• Trajectory generation, ball state estimation<br />

• Adaptation by estimating the ball’s drag coefficient,<br />

racket characteristics and aiming bias<br />

• Three modes of operation:<br />

• Solo quadrocopter play<br />

• Quadrocopter/human play<br />

• Cooperative two quadrocopter play<br />

17:15–17:30 ThDT6.3<br />

Compact Design of a Torque Sensor using<br />

Optical Technique and its Fabrication for<br />

Wearable and Quadruped Robots<br />

Sarmad Shams, Dongik Shin, Ji Yeong Lee, Kyoosik Shin and<br />

Chang-Soo Han<br />

Dept. of Mechanical Engineering, Hanyang Univ., South Korea<br />

Jungsoo Han<br />

Dept. of Mechanical Systems Engineering, Hansung University, South Korea<br />

• We have developed a compact torque<br />

sensor for a wearable robot: dimension of<br />

Ø80×11 mm and a range of 100 Nm.<br />

• The deflection is measured by a photo-interrupter<br />

instead of strain gages.<br />

• A monolithic flexure mechanism made of<br />

titanium was designed and verified by FEM<br />

(deflection and stress).<br />

• Performance metrics such as linearity and<br />

repeatability were evaluated in<br />

experiments.<br />

Photo-interrupter-based<br />

torque sensor<br />

17:00–17:15 ThDT6.2<br />

2-DOF Contactless Distributed Manipulation<br />

Using Superposition of Induced Air Flows<br />

Anne Delettre, Guillaume J. Laurent and Nadine Le-Fort Piat<br />

Femto-ST Institute, UFC-ENSMM-UTBM-CNRS, Besançon, France<br />

• Contactless conveyor<br />

• Original aerodynamic<br />

traction principle<br />

• Induced air flow modeling<br />

• Two dimensions position<br />

control and trajectory<br />

tracking<br />

17:30–17:45 ThDT6.4<br />

Development of a High Efficiency and High Reliable<br />

Glass Cleaning Robot with a Dirt Detect Sensor<br />

Yoshio Katsuki, Takeshi Ikeda and Motoji Yamamoto<br />

Faculty of Engineering, Kyushu University, Japan<br />

• This research proposes a trajectory tracking<br />

control method for a glass cleaning robot.<br />

• The robot can follow a desired path for glass<br />

cleaning with the proposed control method.<br />

• This research also proposes a dirt detect sensor<br />

for the glass cleaning robot.<br />

• A motion control method to guarantee<br />

complete cleaning with the dirt detect<br />

sensor is proposed.<br />


Glass cleaning robot with a dirt detect<br />

sensor on a partly dirty glass


Session ThDT7 Continental Parlor 7 Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Medical Robotics: Motion Planning & State Estimation<br />

Chair Michael Zinn, Univ. of Wisconsin - Madison<br />

Co-Chair Allison M. Okamura, Stanford Univ.<br />

16:45–17:00 ThDT7.1<br />

Modeling Approach for Continuum Robotic<br />

Manipulators: Effects of Nonlinear Internal<br />

Device Friction<br />

Jinwoo Jung, Ryan Penning, Nicola Ferrier, and Michael Zinn,<br />

Department of Mechanical Engineering<br />

University of Wisconsin – Madison, USA<br />

• A new continuum robotics lumped-parameter<br />

modeling approach is<br />

presented<br />

• Effectively models internal friction from<br />

sources such as sliding control tendons<br />

and relative telescoping motion<br />

• Results demonstrate the advantage of<br />

new approach as compared to<br />

modeling with constant curvature<br />

assumption<br />

• Comparison with experimental results<br />

demonstrates efficacy of approach<br />

Nonlinear internal friction can<br />

significantly distort tip trajectory<br />

17:15–17:30 ThDT7.3<br />

Motion Planning for Concentric Tube Robots<br />

Using Mechanics-based Models<br />

Luis G. Torres and Ron Alterovitz<br />

Department of Computer Science<br />

University of North Carolina at Chapel Hill, USA<br />

• New motion planner for concentric tube<br />

robots for minimally invasive surgery<br />

• Uses mechanics-based model of robot<br />

shape and considers uncertainty in<br />

actuation and kinematics<br />

• Minimizes estimated probability of collision<br />

with anatomical obstacles<br />

• Extends sampling-based Rapidly-<br />

Exploring Roadmap with goal bias for fast<br />

performance<br />

• Apply to challenging skull base<br />

neurosurgery procedure<br />

Concentric tube robot executing<br />

motion plan to reach pituitary<br />

gland through nasal cavity<br />

17:00–17:15 ThDT7.2<br />

Inequality Constrained Kalman Filtering for the<br />

Localization and Registration of a Surgical Robot<br />

Stephen Tully<br />

Electrical and Computer Engineering, Carnegie Mellon University, USA<br />

George Kantor and Howie Choset<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• For image-guided surgery, preoperative<br />

surface models provide geometric<br />

constraints on the robot path<br />

• We introduce an uncertainty projection<br />

method for inequality constrained Kalman<br />

filtering<br />

• Nonlinear constraints are handled with a<br />

pseudo-measurement update process<br />

• We demonstrate localization and<br />

registration of the HARP surgical probe<br />

using constrained filtering<br />

• Registration parameters are automatically<br />

updated during image-guidance<br />

experiments<br />


Correcting snake robot localization<br />

for image-guided surgery<br />

17:30–17:45 ThDT7.4<br />

State Estimation and Feedforward Tremor Suppression<br />

for a Handheld Micromanipulator with a Kalman Filter<br />

Brian C. Becker, Robert A. MacLachlan, Cameron N. Riviere<br />

Robotics Institute, Carnegie Mellon University, USA<br />

• Active compensation of tremor in fully<br />

handheld micromanipulators is limited by<br />

latencies in actuation<br />

• Modeling tremor makes more effective<br />

cancellation possible by anticipating and<br />

correcting for future hand motion<br />

• A Kalman filter estimates state and<br />

models hand motion, providing tremor<br />

predictions for a feedforward model<br />

• Incorporating a feedforward model into the<br />

control loop of our Micron manipulator<br />

reduces tremor over 50% compared to the<br />

existing feedback-only control method<br />

Micron and experimental setup


Session ThDT8 Continental Parlor 8 Thursday, September 29, <strong>2011</strong>, 16:45–17:45<br />

Aerial Robots<br />

Chair Randy Beard, Brigham Young Univ.<br />

Co-Chair Mac Schwager, Univ. of Pennsylvania<br />

16:45–17:00 ThDT8.1<br />

Quadcopter Breaking the MAV Endurance<br />

Record with Laser Power Beaming<br />

Michael C. Achtelik, Jan Stumpf,<br />

Daniel Gurdan and Klaus-Michael Doth<br />

Ascending Technologies GmbH, Germany<br />

• Flexible Quadrocopter Design<br />

• Separate processor for experiments<br />

• SDK for different control levels<br />

• MATLAB/Simulink Interface<br />

• Position Control<br />

• High update rate of 1 kHz<br />

• Position estimation based on LED<br />

tracking and IMU measurements<br />

• Powered by Laser Power Beaming<br />

• Small Buffer Battery with Recharge in<br />

Flight Capability<br />

• World Endurance Record with a 1 kg UAV:<br />

• 12h 27min in 2010 in Everett, WA<br />

Laser powered MAV “AscTec Pelican”<br />

17:15–17:30 ThDT8.3<br />

Quadrocopter Performance Benchmarking<br />

Using Optimal Control<br />

Robin Ritz, Markus Hehn,<br />

Sergei Lupashin, and Raffaello D’Andrea<br />

Institute for Dynamic Systems and Control, ETH Zurich, Switzerland<br />

• Numerical method for computing<br />

quadrocopter maneuvers satisfying<br />

Pontryagin’s minimum principle for time-optimality.<br />

• Time-optimal maneuvers are a useful<br />

benchmark to compare other control<br />

strategies to.<br />

• Structure of time-optimal maneuvers is<br />

shown for various translations in two<br />

dimensions.<br />

• Experimental results demonstrate validity<br />

of computed trajectories.<br />

Time-Optimal Maneuvers for<br />

Vertical Displacements<br />

17:00–17:15 ThDT8.2<br />

Utilizing an Improved Rotorcraft Dynamic Model<br />

in State Estimation<br />

Robert Leishman, Jeff Ferrin, and Timothy McLain<br />

Mechanical Eng. Dept., Brigham Young University, USA<br />

John Macdonald, Stephen Quebe, and Randal Beard<br />

Elec. & Comp. Eng. Dept., Brigham Young University, USA<br />

• We present an improved dynamic<br />

model for small multirotor aircraft<br />

suitable for indoor flight.<br />

• The model enables more accurate<br />

IMU-based state estimates.<br />

• The benefit is reduced need for<br />

computationally expensive<br />

exteroceptive sensor updates.<br />

• The benefit is most helpful to payload<br />

constrained vehicles such as those<br />

flying indoors.<br />


This figure compares the truth to two<br />

estimates of pitch: based on our<br />

improved model (dotted line) & a more<br />

traditional approach (dashed line)<br />

17:30–17:45 ThDT8.4<br />

A Scalable Information Theoretic<br />

Approach to Distributed Robot Coordination<br />

Brian J. Julian*†, Michael Angermann‡,<br />

Mac Schwager*§, and Daniela Rus*<br />

*CSAIL, MIT, USA<br />

† Engineering Division, MIT Lincoln Laboratory, USA<br />

‡ DLR Oberpfaffenhofen, Germany<br />

§ GRASP Laboratory, UPenn, USA<br />

• Scalable information theoretic<br />

approach to infer an environment’s<br />

state by distributively controlling<br />

robots equipped with sensors.<br />

• Novel algorithm for approximating<br />

joint measurement probabilities<br />

using consensus and sampling.<br />

• Distributed controller proven to be<br />

convergent between consensus<br />

rounds and, under certain<br />

conditions, locally optimal.<br />

• Complexity constant w.r.t. number<br />

of robots, and linear w.r.t. sensor<br />

measurement and environment<br />

discretization cells.<br />

Five quad-rotor flying robots inferring<br />

the state of an environment




Author Index<br />

C = Chair, CC = Co-Chair, O = Organizer, * = interactive presentation (beside the “teaser” in regular sessions)<br />

A<br />

Abadie, Joel.............................................................. MoAT1.7 39<br />

Abbeel, Pieter........................................................... WeAT6 CC<br />

.................................................................................. WeAT6 O<br />

.................................................................................. WeBT5.7 2646<br />

.................................................................................. WeBT6 CC<br />

.................................................................................. WeBT6 O<br />

.................................................................................. ThCT6.7 4877<br />

Abbott, Jake.............................................................. MoBT1 CC<br />

.................................................................................. MoBT1.5 445<br />

.................................................................................. TuAPT10.2 *<br />

.................................................................................. TuBT5.2 1321<br />

.................................................................................. ThCT8.8 4975<br />

Abdo, Nichola ........................................................... WeAT5.5 2180<br />

.................................................................................. WeBPT10.14 *<br />

Abeles, Peter............................................................ MoBT2.2 475<br />

Abelmann, Leon ....................................................... MoBT1.1 421<br />

Abichandani, Pramod ............................................... WeAT8.2 2306<br />

Abiko, Satoko ........................................................... MoBT6.7 625<br />

Achar, Supreeth........................................................ MoAT6.3 227<br />

.................................................................................. WeAT9.1 2352<br />

Achtelik, Markus W................................................... WeAT6.8 2242<br />

Achtelik, Michael C................................................... ThDT8.1 5166<br />

Ackerman, Jeffrey..................................................... MoAT5.5 203<br />

.................................................................................. MoBPT10.14 *<br />

Adibi, Mehrad ........................................................... WeAT9.4 2371<br />

.................................................................................. WeBPT10.25 *<br />

Adorno, Bruno Vilhena ............................................. WeBT3.5 2545<br />

.................................................................................. WeCPT10.8 *<br />

.................................................................................. ThCT1.7 4658<br />

Agamennoni, Gabriel................................................ MoAT2.7 92<br />

Agha-mohammadi, Ali-akbar.................................... ThBT3.1 4284<br />

Aghasadeghi, Navid ................................................. TuBT9.8 1561<br />

Aghili, Farhad ........................................................... ThAT1.7 3784<br />

.................................................................................. ThBT1.3 4187<br />

Aguiar, António......................................................... WeCT6.8 3166<br />

Ahlberg, Carl............................................................. TuCT2.2 1626<br />

Ahmad Sharbafi, Maziar........................................... WeBT7.2 2723<br />

Ahmadi, Mojtaba....................................................... WeAT7.3 2261<br />

Ahmed, Zubair.......................................................... WeCT3.4 2992<br />

.................................................................................. WeDPT10.7 *<br />

Ahn, Bummo............................................................. ThBT7.7 4516<br />

Akachi, Kazuhiko...................................................... ThBT5.4 4400<br />

.................................................................................. ThCPT10.13 *<br />

Akgun, Baris ............................................................. WeBT5.6 2640<br />

.................................................................................. WeCPT10.15 *<br />

Alami, Rachid ........................................................... WeCT1 CC<br />

.................................................................................. WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Albu-Schäffer, Alin.................................................... TuCT6.6 1789<br />

.................................................................................. WeAPT10.15 *<br />

.................................................................................. WeCT4.1 3023<br />

.................................................................................. WeCT7.5 3199<br />

.................................................................................. WeDT1.5 3367<br />

.................................................................................. WeDT9.3 3708<br />

.................................................................................. WeDPT10.20 *<br />

.................................................................................. ThAPT10.2 *<br />

.................................................................................. ThBT1.2 4180<br />

.................................................................................. ThBT1.7 4215<br />

.................................................................................. ThBT5.7 4420<br />

Alequin-Ramos, Freddie........................................... MoBT5.8 588<br />

Alismail, Hatem......................................................... ThAT6.8 4020<br />

Allan, Jean-Francois................................................. TuCT7.7 1849<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–231–<br />

Allen, Peter ............................................................... TuAT6 CC<br />

.................................................................................. TuAT6 O<br />

.................................................................................. TuBT6.5 1380<br />

.................................................................................. TuCPT10.14 *<br />

.................................................................................. WeAT9.7 2390<br />

Almaddah, Amr......................................................... TuAT1.4 807<br />

.................................................................................. TuBPT10.1 *<br />

Almqvist, Håkan........................................................ ThAT8.2 4078<br />

Alterovitz, Ron .......................................................... WeAT5 C<br />

.................................................................................. WeAT5 O<br />

.................................................................................. WeDT5 CC<br />

.................................................................................. WeDT5 O<br />

.................................................................................. ThDT7.3 5153<br />

Althoff, Daniel ........................................................... WeCT5.6 3114<br />

.................................................................................. WeDPT10.15 *<br />

Amato, Nancy ........................................................... WeBT5.5 2632<br />

.................................................................................. WeCPT10.14 *<br />

.................................................................................. ThBT3 C<br />

.................................................................................. ThBT3.1 4284<br />

Ambrose, Robert ...................................................... TuBT6.1 *<br />

Ames, Aaron............................................................. TuCT5.3 1723<br />

Amirat, Yacine .......................................................... TuCT5.7 1749<br />

An, Sang-ik ............................................................... WeBT7.8 2759<br />

Andersen, Tim .......................................................... TuCT3.5 1687<br />

Anderson, Chris........................................................ WeAT6.1 *<br />

Anderson, Ross ........................................................ ThAT4.5 3917<br />

.................................................................................. ThBPT10.11 *<br />

Andersson, Sean ...................................................... WeCT5.5 3108<br />

.................................................................................. WeDPT10.14 *<br />

Ando, Noriaki ............................................................ TuAT7.5 1042<br />

.................................................................................. TuBPT10.17 *<br />

Ando, Teruhisa ......................................................... WeDT8.6 3673<br />

.................................................................................. ThAPT10.21 *<br />

Andreff, Nicolas ........................................................ WeBT9.7 2855<br />

Ang Jr, Marcelo H..................................................... ThAT4.7 3931<br />

Angeles, Jorge.......................................................... MoAT7.3 280<br />

Angermann, Michael................................................. ThDT8.4 5187<br />

Anisi, David A. .......................................................... MoAT6 CC<br />

.................................................................................. MoAT6.4 235<br />

.................................................................................. MoBT6 C<br />

.................................................................................. MoBPT10.16 *<br />

Annunziata, Salvatore .............................................. ThAT1.6 3776<br />

.................................................................................. ThBPT10.3 *<br />

Antonelli, Gianluca.................................................... WeBT8.3 2778<br />

.................................................................................. WeCT6 CC<br />

.................................................................................. WeCT6 O<br />

.................................................................................. WeCT6.8 3166<br />

.................................................................................. WeDT6 O<br />

Aoi, Shinya ............................................................... WeAT7 CC<br />

.................................................................................. WeAT7.5 2274<br />

.................................................................................. WeAT7.6 2280<br />

.................................................................................. WeBPT10.20 *<br />

.................................................................................. WeBPT10.21 *<br />

Aoki, Keiji.................................................................. ThAT8.7 4109<br />

Aoki, Kengo .............................................................. ThCT2.7 4710<br />

Aoki, Takeshi ............................................................ ThBT8.4 4550<br />

.................................................................................. ThCPT10.22 *<br />

Aoyama, Tadayoshi.................................................. ThDT5.2 5094<br />

Apker, Thomas ......................................................... MoAT3.5 125<br />

.................................................................................. MoBPT10.8 *<br />

Arabagi, Veaceslav................................................... ThAT4.8 3937<br />

Arafuka, Kazushi ...................................................... MoAT9.5 395<br />

.................................................................................. MoBPT10.26 *<br />
Arai, Fumihito ........................................................... MoAT1 CC<br />

.................................................................................. MoAT1.3 13<br />

.................................................................................. MoBT1.4 439<br />

.................................................................................. TuAPT10.1 *<br />

.................................................................................. TuBT3.7 1309<br />

.................................................................................. TuCT3.6 1693<br />

Arai, Tamio ............................................................... ThCT2.5 4698<br />

.................................................................................. ThDPT10.5 *<br />

Arai, Tatsuo .............................................................. MoAT1.5 25<br />

.................................................................................. MoBT1 C<br />

.................................................................................. MoBT1.2 427<br />

.................................................................................. MoBPT10.2 *<br />

.................................................................................. TuAT1.4 807<br />

.................................................................................. TuBPT10.1 *<br />

Araki, Takahiro ......................................................... TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Araki, Takaya............................................................ TuBT9.5 1540<br />

.................................................................................. TuCPT10.23 *<br />

Arata, Jumpei ........................................................... WeAT4.6 2145<br />

.................................................................................. WeBPT10.12 *<br />

Ardö, Håkan.............................................................. WeCT3.1 2971<br />

Arechavaleta, Gustavo ............................................. TuCT1.8 1612<br />

Arevalo, Juan Carlos ................................................ TuBT8.8 1507<br />

Argentieri, Sylvain..................................................... MoAT3.7 137<br />

Argiles, Alberto ......................................................... ThBT6.5 4448<br />

.................................................................................. ThCPT10.17 *<br />

Ariizumi, Ryo ............................................................ TuCT8.8 1907<br />

Arimoto, Suguru........................................................ TuCT6.1 *<br />

Armesto, Leopoldo ................................................... ThBT3.8 4335<br />

Arras, Kai Oliver ....................................................... WeAT1 C<br />

.................................................................................. WeAT1.1 1968<br />

.................................................................................. ThAT2 CC<br />

.................................................................................. ThAT2.6 3838<br />

.................................................................................. ThAT2.7 3844<br />

.................................................................................. ThBPT10.6 *<br />

Arrichiello, Filippo ..................................................... WeBT8 CC<br />

.................................................................................. WeBT8.3 2778<br />

.................................................................................. WeCT6.8 3166<br />

Arslan, Oktay............................................................ WeDT5.2 3507<br />

.................................................................................. WeDT5.6 3533<br />

.................................................................................. ThAPT10.12 *<br />

Artigas, Jordi............................................................. MoAT4.7 177<br />

.................................................................................. MoBT7.5 665<br />

.................................................................................. TuAPT10.17 *<br />

Asada, Harry............................................................. ThAT4 CC<br />

.................................................................................. ThAT4 O<br />

.................................................................................. ThAT4.7 3931<br />

.................................................................................. ThBT4 O<br />

Asadian, Ali............................................................... WeBT3.6 2551<br />

.................................................................................. WeCPT10.9 *<br />

Asama, Hajime ......................................................... WeCT7.6 3207<br />

.................................................................................. WeDPT10.21 *<br />

Asano, Fumihiko....................................................... WeAT7.1 2249<br />

.................................................................................. WeBT7.6 2747<br />

.................................................................................. WeCPT10.21 *<br />

Asano, Futoshi.......................................................... MoAT3.8 143<br />

Asatani, Minami........................................................ ThBT6.7 4463<br />

Asfour, Tamim .......................................................... TuCT6.2 1761<br />

.................................................................................. TuCT6.5 1781<br />

.................................................................................. WeAPT10.14 *<br />

.................................................................................. ThBT5 C<br />

.................................................................................. ThBT5 O<br />

.................................................................................. ThCT5 O<br />

.................................................................................. ThCT5.1 *<br />

Asplund, Lars............................................................ TuCT2.2 1626<br />

Au, Tsz-Chiu............................................................. ThBT9.3 4581<br />

Aukes, Daniel ........................................................... TuBT6.4 1373<br />

.................................................................................. TuCPT10.13 *<br />

Avci, Ebubekir........................................................... MoBT1.2 427<br />

Ayanian, Nora........................................................... WeCT5.8 3126<br />

Ayusawa, Ko............................................................. WeDT9.2 3701<br />

Azevedo Coste, Christine ......................................... TuCT5.4 1731<br />

.................................................................................. WeAPT10.10 *<br />

Azimi, Ali................................................... MoAT7.3 280<br />

B<br />

Babic, Jan................................................................. ThCT5.6 4832<br />

.................................................................................. ThDPT10.15 *<br />

Babuska, Robert....................................................... WeBT7.3 2729<br />

Bachmann, Richard J. .............................................. MoAT5.6 209<br />

.................................................................................. MoBPT10.15 *<br />

Badino, Hernan......................................................... WeCT9.1 3283<br />

Baek, Stanley ........................................................... WeBT6.3 2674<br />

Bagutti, Lorenzo ....................................................... WeBT4.2 2578<br />

Bahls, Thomas.......................................................... TuAT7.7 1055<br />

.................................................................................. TuCT9.4 1933<br />

.................................................................................. WeAPT10.22 *<br />

Bahr, Fred................................................................. WeCT6.3 3132<br />

Bailey, Tim................................................................ TuAT2.8 886<br />

.................................................................................. ThBT9.6 4601<br />

.................................................................................. ThCPT10.27 *<br />

Baisch, Andrew......................................................... ThDT4.3 5073<br />

Bajcsy, Ruzena......................................................... WePL C<br />

Bajo, Andrea............................................................. WeAT9.7 2390<br />

Balaguer, Benjamin .................................................. TuBT7.1 1405<br />

Balasubramanian, Ravi ............................................ TuCT7 CC<br />

.................................................................................. TuCT7.3 1823<br />

Balkcom, Devin......................................................... ThBT3.6 4321<br />

.................................................................................. ThCPT10.9 *<br />

Ballard, Dana............................................................ TuBT1.4 1194<br />

.................................................................................. TuCPT10.1 *<br />

Banno, Yoshihisa...................................................... WeBT7.4 2735<br />

.................................................................................. WeCPT10.19 *<br />

Baradat, Cédric......................................................... ThDT4.2 5067<br />

Barfoot, Timothy ....................................................... MoBT6.8 631<br />

Barkby, Stephen ....................................................... TuBT2.3 1242<br />

.................................................................................. ThCT3 C<br />

Barrows, Geoffrey..................................................... TuAT8.6 1099<br />

.................................................................................. TuBPT10.21 *<br />

Bascetta, Luca.......................................................... WeCT3.1 2971<br />

Basoeki, Fransiska ................................................... WeDT3.6 3480<br />

.................................................................................. ThAPT10.9 *<br />

Bastos-Filho, Teodiano............................................. ThAT1.1 3746<br />

Bates, Terry .............................................................. WeAT9.1 2352<br />

Baum, Marcus .......................................................... WeAT2.5 2050<br />

.................................................................................. WeBPT10.5 *<br />

Bäuml, Berthold ........................................................ TuCT1.4 1585<br />

.................................................................................. WeAPT10.1 *<br />

Bauzano, Enrique ..................................................... WeAT3.8 2121<br />

Bays, Matthew .......................................................... WeCT8.1 3227<br />

Beard, Randy............................................................ WeBT6.5 2688<br />

.................................................................................. WeCPT10.17 *<br />

.................................................................................. ThDT8 C<br />

.................................................................................. ThDT8.2 5173<br />

Becattini, Gabriele .................................................... TuBT5.8 1359<br />

Becker, Aaron........................................................... TuAT9.7 1160<br />

Becker, Brian C. ....................................................... ThDT7.4 5160<br />

Beetz, Michael .......................................................... WeCT7.1 3172<br />

.................................................................................. ThAT9 C<br />

.................................................................................. ThAT9.4 4141<br />

.................................................................................. ThBT2.6 4263<br />

.................................................................................. ThBPT10.25 *<br />

.................................................................................. ThCPT10.6 *<br />

Behar, Evan.............................................................. TuCT1.2 1573<br />

Behera, Laxmidhar ................................................... ThCT2.8 4716<br />

Behkam, Bahareh..................................................... ThAT4.8 3937<br />

Bekiroglu, Yasemin................................................... TuBT9.7 1554<br />

Bekris, Kostas E. ...................................................... WeBT5 C<br />

.................................................................................. WeBT5 O<br />

.................................................................................. WeCT8.2 3235<br />

.................................................................................. WeCT8.7 3268<br />
.................................................................................. ThBT3.2 4292<br />

Bellens, Steven......................................................... WeCT4.2 3031<br />

Bellingham, James G. .............................................. WeCT6.3 3132<br />

Belo, Felipe............................................................... ThAT1.5 3770<br />

.................................................................................. ThBPT10.2 *<br />

Belta, Calin ............................................................... WeCT5 CC<br />

.................................................................................. WeCT5 O<br />

.................................................................................. WeCT5.2 3087<br />

.................................................................................. WeCT5.5 3108<br />

.................................................................................. WeDPT10.14 *<br />

Ben Amar, Faiz......................................................... MoAT7.2 274<br />

Bender, John ............................................................ MoAT5.7 215<br />

Bengel, Matthias....................................................... MoAT6.5 241<br />

.................................................................................. MoBPT10.17 *<br />

Benhabib, Beno........................................................ ThBT8.3 4544<br />

Bennett, Bradford ..................................................... ThAT5.7 3969<br />

Bennewitz, Maren..................................................... ThCT5.8 4844<br />

Benson, Hande......................................................... WeAT8.2 2306<br />

Beranek, Richard...................................................... WeAT7.3 2261<br />

Bergbreiter, Sarah .................................................... TuCT3.4 1680<br />

.................................................................................. WeAPT10.7 *<br />

Bergström, Niklas ..................................................... TuAT1.7 827<br />

.................................................................................. WeAT2.2 2028<br />

Berkley, Jeff.............................................................. WeBT3.4 2539<br />

.................................................................................. WeCPT10.7 *<br />

Berman, Spring......................................................... WeCT8 CC<br />

.................................................................................. ThAT4.6 3923<br />

.................................................................................. ThBPT10.12 *<br />

Bernardes, Mariana Costa........................................ WeBT3.5 2545<br />

.................................................................................. WeCPT10.8 *<br />

Bernardino, Alexandre.............................................. ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Bernstein, Matthew................................................... TuCT6.8 1804<br />

Berret, Bastien.......................................................... ThBT4.3 4354<br />

Bersch, Christian ...................................................... TuBT7.2 1413<br />

Berthoz, Alain ........................................................... WePL.1 *<br />

Bessho, Fumihiro...................................................... TuBT1.7 1214<br />

Bewley, Thomas....................................................... WeBT7.5 2741<br />

.................................................................................. WeCPT10.20 *<br />

.................................................................................. ThAT8.5 4097<br />

.................................................................................. ThBPT10.23 *<br />

Beyeler, Felix............................................................ TuAT3.7 919<br />

Bhatawadekar, Vineet .............................................. TuAT9.8 1167<br />

Bhattacharyya, Samrat............................................. TuBT6.5 1380<br />

.................................................................................. TuCPT10.14 *<br />

Bhoite, Mohit............................................................. ThCT4.7 4797<br />

Biagiotti, Luigi ........................................................... ThAT3 CC<br />

.................................................................................. ThAT3.8 3899<br />

Bialkowski, Joshua J ................................................ WeDT5.3 3513<br />

Bicchi, Antonio.......................................................... WeBT9.1 2817<br />

.................................................................................. ThAT1.5 3770<br />

.................................................................................. ThBPT10.2 *<br />

Bidaud, Philippe........................................................ WeBT7.7 2753<br />

Bierbaum, Alexander................................................ TuCT6.2 1761<br />

Biggs, Geoffrey......................................................... TuAT7 C<br />

.................................................................................. TuAT7.5 1042<br />

.................................................................................. TuBPT10.17 *<br />

Billard, Aude ............................................................. MoBT8.4 710<br />

.................................................................................. TuAPT10.19 *<br />

Bin Mohd Shariff, Ahmed Shafeeq ........................... WeDT6.3 3551<br />

Binney, Jonathan...................................................... WeCT6.5 3147<br />

.................................................................................. WeDPT10.17 *<br />

Birbach, Oliver.......................................................... WeCT9 CC<br />

.................................................................................. WeCT9.4 3305<br />

.................................................................................. WeDT2.6 3426<br />

.................................................................................. WeDPT10.25 *<br />

.................................................................................. ThAPT10.6 *<br />

Birchfield, Stan ......................................................... ThCT6.6 4871<br />

.................................................................................. ThDPT10.18 *<br />

Bird, Justin................................................................ ThCT8.8 4975<br />

Birkmeyer, Paul ........................................................ ThDT5.1 5087<br />

Bischof, Horst ........................................................... ThAT6.4 3990<br />

.................................................................................. ThBPT10.16 *<br />

Biswas, Joydeep....................................................... MoAT2.4 73<br />

.................................................................................. MoBPT10.4 *<br />

Bjerkeng, Magnus..................................................... MoAT6.6 247<br />

.................................................................................. MoBPT10.18 *<br />

.................................................................................. ThCT2.2 4676<br />

Björkman, Mårten ..................................................... TuAT1.7 827<br />

Black, Richard J........................................................ WeBT3.8 2564<br />

Bland, Daniel Peter................................................... ThCT8.2 4935<br />

Blanke, Olaf .............................................................. ThCT1.8 4664<br />

Bleuler, Hannes ........................................................ ThCT1.8 4664<br />

Bley, Andreas ........................................................... WeBT1.5 2430<br />

Blodow, Nico............................................................. ThBT2.6 4263<br />

.................................................................................. ThCPT10.6 *<br />

Blumel, Marcus......................................................... MoAT5.7 215<br />

Bó, Antônio Padilha Lanari ....................................... ThCT1.7 4658<br />

Bo, Liefeng ............................................................... TuAT1.6 821<br />

.................................................................................. TuBPT10.3 *<br />

Bobadilla, Leonardo.................................................. WeCT5.4 3101<br />

.................................................................................. WeDPT10.13 *<br />

Bocsi, Botond ........................................................... MoBT8.2 698<br />

Bohg, Jeannette ....................................................... WeDT1.1 3342<br />

Bok, Yunsu ............................................................... ThBT6.3 4436<br />

Bolopion, Aude ......................................................... TuAT3.3 894<br />

Bonani, Michael ........................................................ ThCT4.2 4762<br />

Bonato, Paolo ........................................................... ThCT7.1 4893<br />

Book, Wayne ............................................................ ThBT8.5 4556<br />

.................................................................................. ThCPT10.23 *<br />

Bordignon, Mirko ...................................................... WeDT8.4 3659<br />

.................................................................................. ThAPT10.19 *<br />

Borges, Geovany Araujo .......................................... WeBT3.5 2545<br />

.................................................................................. WeCPT10.8 *<br />

Borst, Christoph........................................................ TuCT6.3 1768<br />

Bosse, Michael ......................................................... ThAT2.5 3830<br />

.................................................................................. ThBPT10.5 *<br />

Bouabdallah, Samir .................................................. WeAT6.5 2223<br />

.................................................................................. WeBPT10.17 *<br />

Boucher, Jean-David ................................................ WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Boukallel, Mehdi ....................................................... TuAT3.6 913<br />

.................................................................................. TuBPT10.9 *<br />

.................................................................................. WeBT2.7 2505<br />

Boularias, Abdeslam................................................. TuBT9.6 1548<br />

.................................................................................. TuCPT10.24 *<br />

Bouraine, Sara.......................................................... WeCT3.3 2985<br />

Bourdin, Christophe.................................................. TuBT5.4 1333<br />

.................................................................................. TuCPT10.10 *<br />

Bouyarmane, Karim.................................................. ThBT5.6 4414<br />

.................................................................................. ThCPT10.15 *<br />

Boxerbaum, Alexander ............................................. MoAT5.4 197<br />

.................................................................................. MoBPT10.13 *<br />

Boyer, Frédéric ......................................................... TuCT8.7 1901<br />

Bozorg Grayeli, Alexis .............................................. WeBT3.3 2532<br />

Braman, Karen ......................................................... WeCT2.2 2928<br />

Brambilla, Manuele................................................... ThCT9.8 5027<br />

Brand, Christoph Norbert.......................................... WeCT5.6 3114<br />

.................................................................................. WeDPT10.15 *<br />

Branicky, Michael ..................................................... WeBT5.3 2620<br />

Branson, David ......................................................... TuAT8.5 1093<br />

.................................................................................. TuBPT10.20 *<br />

.................................................................................. ThAT7.5 4054<br />

.................................................................................. ThBPT10.20 *<br />

Bravo, Ignacio........................................................... ThAT1.1 3746<br />

Breitenmoser, Andreas............................................. MoAT2.5 79<br />

.................................................................................. MoBPT10.5 *<br />

Brethe, Jean-François .............................................. ThAT7 CC<br />

.................................................................................. ThAT7.7 4066<br />

Bretl, Timothy ........................................................... TuAT9 CC<br>

.................................................................................. TuAT9.7 1160<br />

.................................................................................. TuBT9 CC<br />

.................................................................................. TuBT9.8 1561<br />

Brewer, Reuben........................................................ WeBT4.1 2570<br />

Bringout, Gael........................................................... TuBT3.3 1285<br />

Briot, Sébastien ........................................................ WeDT9.6 3728<br />

.................................................................................. ThAPT10.24 *<br />

Broeders, Ivo A. M. J................................................ TuAT5.3 937<br />

Broggi, Alberto.......................................................... TuCT1.6 1599<br />

.................................................................................. WeAPT10.3 *<br />

Brooks, David ........................................................... TuAT8.6 1099<br />

.................................................................................. TuBPT10.21 *<br />

Browning, Brett......................................................... ThAT6.8 4020<br />

Bruneau, Olivier........................................................ ThAT5.4 3951<br />

.................................................................................. ThBPT10.13 *<br />

Brunner, Christopher Joseph.................................... WeBT2.5 2489<br />

.................................................................................. WeCPT10.5 *<br />

Brutschy, Arne.......................................................... ThCT9.8 5027<br />

Bruyninckx, Herman ................................................. WeCT3.1 2971<br />

.................................................................................. WeCT4.2 3031<br />

.................................................................................. ThCT2.3 4684<br />

Bryson, Mitch............................................................ ThBT2 CC<br />

.................................................................................. ThBT2.5 4256<br />

.................................................................................. ThCPT10.5 *<br />

Bubeck, Alexander ................................................... MoAT6.5 241<br />

.................................................................................. MoBPT10.17 *<br />

Buchaillot, Lionel ...................................................... MoAT1.8 45<br />

Buchan, Austin ......................................................... TuAT8.2 1075<br />

Buehler, Martin ......................................................... WeBT7 CC<br />

Buelthoff, Heinrich H................................................. MoAT4.5 163<br />

.................................................................................. MoBPT10.11 *<br />

.................................................................................. WeAT6.4 2215<br />

.................................................................................. WeBPT10.16 *<br />

.................................................................................. WePL.1 *<br />

.................................................................................. WeCT4.3 3039<br />

Buether, John ........................................................... MoAT4.6 171<br />

.................................................................................. MoBPT10.12 *<br />

Buffinton, Keith ......................................................... MoAT5.3 190<br />

Bugajska, Magdalena............................................... MoAT3.5 125<br />

.................................................................................. MoBPT10.8 *<br />

Burch, Derek............................................................. ThBT8.2 4536<br />

Burdet, Etienne......................................................... WeBT4.2 2578<br />

.................................................................................. ThAT9.1 4121<br />

Burgard, Wolfram ..................................................... MoAT2.6 86<br />

.................................................................................. MoBPT10.6 *<br />

.................................................................................. TuBT2 C<br />

.................................................................................. TuBT2.4 1249<br />

.................................................................................. TuCPT10.4 *<br />

.................................................................................. WeAT5.5 2180<br />

.................................................................................. WeBPT10.14 *<br />

.................................................................................. WeDT9.4 3716<br />

.................................................................................. ThAPT10.22 *<br />

.................................................................................. ThBT2.4 4249<br />

.................................................................................. ThCT8.3 4941<br />

.................................................................................. ThCPT10.4 *<br />

Burgess, Stuart......................................................... ThAT7.3 4042<br />

Burgner, Jessica....................................................... WeBT3.1 2517<br />

Burguera, Antoni....................................................... WeDT6.7 3577<br />

Burschka, Darius ...................................................... TuBT1.8 1221<br />

.................................................................................. WeCT9.3 3297<br />

Burton, Lisa .............................................................. ThAT3.7 3893<br />

Buss, Martin.............................................................. WeCT5.6 3114<br />

.................................................................................. WeDPT10.15 *<br />

Butzke, Jonathan...................................................... WeCT8.5 3254<br />

.................................................................................. WeDPT10.23 *<br />

Buys, Koen ............................................................... WeCT4.2 3031<br />

Buzzoni, Michele ...................................................... TuCT1.6 1599<br />

.................................................................................. WeAPT10.3 *<br>

C<br>

Caccavale, Fabrizio.................................................. WeBT8.3 2778<br />

.................................................................................. ThBT1.4 4194<br />

.................................................................................. ThCPT10.1 *<br />

Cadeddu, Jeffrey A................................................... WeAT9.4 2371<br />

.................................................................................. WeBPT10.25 *<br />

Cai, Binghuang ......................................................... WeDT8.8 3686<br />

Cai, Chuanwu ........................................................... TuBT8.3 1473<br />

Cakmak, Maya.......................................................... WeAT1.4 1986<br />

.................................................................................. WeBPT10.1 *<br />

Caldwell, Darwin G. .................................................. MoAT8.1 318<br />

.................................................................................. MoAT9.2 378<br />

.................................................................................. MoAT9.4 390<br />

.................................................................................. MoBPT10.25 *<br />

.................................................................................. TuAT8.5 1093<br />

.................................................................................. TuBT5.8 1359<br />

.................................................................................. TuBPT10.20 *<br />

.................................................................................. WeCT4.4 3047<br />

.................................................................................. WeDT2.4 3413<br />

.................................................................................. WeDPT10.10 *<br />

.................................................................................. ThAT7.5 4054<br />

.................................................................................. ThAPT10.4 *<br />

.................................................................................. ThBPT10.20 *<br />

Calinon, Sylvain........................................................ MoAT8.1 318<br />

.................................................................................. WeCT4.4 3047<br />

.................................................................................. WeDT2 C<br />

.................................................................................. WeDT2.4 3413<br />

.................................................................................. WeDPT10.10 *<br />

.................................................................................. ThAPT10.4 *<br />

Calli, Berk ................................................................. TuAT6.6 995<br />

.................................................................................. TuBPT10.15 *<br />

Camilli, Richard ........................................................ MoAT6.8 261<br />

.................................................................................. ThCT3.2 4722<br />

Campbell, Mark ........................................................ TuCT1.7 1605<br />

Campolo, Domenico ................................................. ThCT7.3 4905<br />

Campos, Mario F. Montenegro................................. WeAT1 CC<br />

.................................................................................. WeAT1.2 1974<br />

Cangelosi, Angelo .................................................... ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Cannata, Giorgio ...................................................... WeDT9.1 3694<br />

Cappelleri, David ...................................................... TuAT3 CC<br />

.................................................................................. TuAT3 O<br />

.................................................................................. TuAT3.8 925<br />

.................................................................................. TuBT3 C<br />

.................................................................................. TuBT3 O<br />

.................................................................................. TuCT3 CC<br />

.................................................................................. TuCT3 O<br />

Carlési, Nicolas......................................................... ThDT3.2 5041<br />

Carlevaris-Bianco, Nicholas ..................................... ThBT4.7 4378<br />

Carlson, Rolf............................................................. WeDT1.1 3342<br />

Caro, Stéphane ........................................................ ThAT7.2 4034<br />

Carpin, Stefano......................................................... TuBT7.1 1405<br />

.................................................................................. ThBT8.2 4536<br />

Carryon, Gabe .......................................................... MoBT5.7 580<br />

Castellanos, Jose A.................................................. TuBT2.5 1256<br />

.................................................................................. TuCPT10.5 *<br />

Castellini, Claudio..................................................... MoBT7.6 672<br />

.................................................................................. TuAPT10.18 *<br />

.................................................................................. WeAT3.6 2108<br />

.................................................................................. WeBPT10.9 *<br />

Castillo, Pedro .......................................................... WeBT6.4 2682<br />

.................................................................................. WeCPT10.16 *<br />

Castro, Sebastian ..................................................... WeCT5.7 3120<br />

Catoire, Laurent........................................................ ThBT7.1 4477<br />

Cauli, Nino ................................................................ ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Cavusoglu, M. Cenk ................................................. MoAT4 CC<br />

.................................................................................. MoAT4 O<br />

.................................................................................. WeDT3 C<br />

.................................................................................. WeDT3.3 3460<br />

Ceccarelli, Marco...................................................... ThDT4.1 5061<br />

Célérier, Charlotte .................................................... WeBT3.3 2532<br>

Censi, Andrea........................................................... WeAT2.6 2056<br />

.................................................................................. WeBPT10.6 *<br />

.................................................................................. ThBT6.8 4469<br />

Chablat, Damien....................................................... TuBT7.8 1453<br />

.................................................................................. ThAT7.2 4034<br />

Chae, Hansang......................................................... TuBT7.7 1447<br />

Chae, Yongwook ...................................................... MoBT7.8 685<br />

Chaimowicz, Luiz...................................................... ThBT4.6 4372<br />

.................................................................................. ThCPT10.12 *<br />

Chakravorty, Suman................................................. ThBT3.1 4284<br />

Chalon, Maxime........................................................ TuBT6.3 1366<br />

.................................................................................. ThBT1.7 4215<br />

Chamberlain, Lyle..................................................... MoAT6.3 227<br />

Chambers, Andrew................................................... MoAT6.3 227<br />

Chan, Ambrose......................................................... WeBT9.2 2825<br />

Chang, Bo................................................................. TuAT3.5 907<br />

.................................................................................. TuBPT10.8 *<br />

Chang, Doyoung....................................................... TuAT5.4 943<br />

.................................................................................. TuBPT10.10 *<br />

Chang-Siu, Evan ...................................................... TuCT8.5 1887<br />

.................................................................................. WeAPT10.20 *<br />

.................................................................................. WeBT2.3 2474<br />

Chao, Yi.................................................................... WeCT6.4 3140<br />

.................................................................................. WeDPT10.16 *<br />

Chapuis, Dominique ................................................. WeCT4.8 3074<br />

Charpillet, Francois................................................... ThAT8.6 4103<br />

.................................................................................. ThBPT10.24 *<br />

Charusta, Krzysztof Andrzej..................................... TuCT6.7 1797<br />

Chaumette, Francois ................................................ MoBT9.6 768<br />

.................................................................................. TuAPT10.24 *<br />

.................................................................................. TuCT1.5 1593<br />

.................................................................................. WeAPT10.2 *<br />

.................................................................................. WeBT9 C<br />

.................................................................................. WeBT9.5 2843<br />

.................................................................................. WeBT9.6 2849<br />

.................................................................................. WeCPT10.26 *<br />

.................................................................................. WeCPT10.27 *<br />

Chen, Bor-rong ......................................................... ThBT7.3 4488<br />

Chen, Fei.................................................................. ThCT2.1 4670<br />

Chen, Haoyao........................................................... MoBT1.6 451<br />

.................................................................................. TuAPT10.3 *<br />

Chen, I-Ming............................................................. TuCT7 C<br />

.................................................................................. WeCT2.3 2935<br />

Chen, S.Y. ................................................................ TuAT1.3 801<br />

Chen, Weidong......................................................... WeCT2.7 2959<br />

Chen, Weihai............................................................ MoBT9.2 744<br />

Chen, Yi-Jie .............................................................. TuCT9.8 1962<br />

Cheng, Bo................................................................. MoBT5.6 574<br />

.................................................................................. TuAPT10.12 *<br />

Cherubini, Andrea..................................................... TuCT1.5 1593<br />

.................................................................................. WeAPT10.2 *<br />

Chiba, Ryosuke ........................................................ ThCT2.5 4698<br />

.................................................................................. ThDPT10.5 *<br />

Chiel, Hillel................................................................ MoAT5.4 197<br />

.................................................................................. MoBPT10.13 *<br />

Chien, Steve............................................................. WeCT6.4 3140<br />

.................................................................................. WeDPT10.16 *<br />

Chilian, Annett .......................................................... WeBT2.6 2497<br />

Chinzei, Kiyoyuki ...................................................... WeBT4.3 2584<br />

Chipalkatty, Rahul .................................................... ThBT8.5 4556<br />

.................................................................................. ThCPT10.23 *<br />

Chirikjian, Gregory.................................................... ThAT4 O<br />

.................................................................................. ThBT4 O<br />

Chitre, Mandar.......................................................... WeDT6 CC<br />

Chitta, Sachin ........................................................... WeDT2 CC<br />

Chizeck, Howard ...................................................... WeBT4.8 2614<br />

Chli, Margarita .......................................................... WeAT6.8 2242<br />

Cho, Changhyun....................................................... TuCT7.8 1857<br />

Choi, Dong-Geol ........................................................ ThBT6.3 4436<br>

Choi, Dongmin.......................................................... TuBT7.7 1447<br />

Choi, Hyouk Ryeol.................................................... TuBT7.7 1447<br />

Chopra, Nikhil ........................................................... MoBT7 C<br />

.................................................................................. MoBT7.7 679<br />

.................................................................................. WeAT8.4 2321<br />

.................................................................................. WeBPT10.22 *<br />

Choset, Howie .......................................................... MoAT5.8 221<br />

.................................................................................. MoAT8.7 357<br />

.................................................................................. MoBT2.3 482<br />

.................................................................................. TuAT8 CC<br />

.................................................................................. TuAT8.1 1069<br />

.................................................................................. TuAT8.2 1075<br />

.................................................................................. TuBT5.7 1353<br />

.................................................................................. WeCT8.6 3260<br />

.................................................................................. WeDPT10.24 *<br />

.................................................................................. ThAT3.7 3893<br />

.................................................................................. ThDT7.2 5147<br />

Choti, Michael........................................................... WeBT3.4 2539<br />

Chou, Ya-Cheng....................................................... TuBT8.6 1493<br />

.................................................................................. TuCPT10.21 *<br />

Christensen, Anders Lyhne ...................................... ThCT4.2 4762<br />

Christensen, Henrik Iskov......................................... TuBT2.6 1264<br />

.................................................................................. TuCPT10.6 *<br />

.................................................................................. ThPT11 C<br />

Christensen, Kim Hardam ........................................ TuAT2.2 847<br />

Christensen, Quinton................................................ WeBT4.5 2596<br />

.................................................................................. WeCPT10.11 *<br />

Christopher, Ray....................................................... WeDT7.4 3608<br />

Chuang, Lewis L....................................................... WeCT4.3 3039<br />

Chung, Timothy H..................................................... ThBT8 C<br />

.................................................................................. ThBT8.2 4536<br />

Chung, Wan Kyun .................................................... WeBT3 C<br />

.................................................................................. WeBT3.2 2524<br />

Churaman, Wayne.................................................... TuCT3.4 1680<br />

.................................................................................. WeAPT10.7 *<br />

Ciliberto, Carlo.......................................................... TuBT9.3 1526<br />

.................................................................................. ThAT9.6 4154<br />

.................................................................................. ThBPT10.27 *<br />

Ciocarlie, Matei......................................................... TuAT6 C<br />

.................................................................................. TuAT6 O<br />

Civera, Javier............................................................ TuBT2 CC<br />

.................................................................................. TuBT2.8 1277<br />

.................................................................................. ThBT6.5 4448<br />

.................................................................................. ThCPT10.17 *<br />

Claassens, Jonathan ................................................ WeAT1.3 1980<br />

Clark, James Romney .............................................. TuBT5.2 1321<br />

Cognetti, Marco ........................................................ MoBT2.1 469<br />

Colas, Francis........................................................... ThAT2.4 3824<br />

.................................................................................. ThBPT10.4 *<br />

Coleman, Sonya ....................................................... ThCT2.8 4716<br />

Colgate, Edward ....................................................... WeAT4.1 *<br />

Coltin, Brian .............................................................. MoAT2.4 73<br />

.................................................................................. MoBPT10.4 *<br />

Comport, Andrew Ian................................................ ThBT2.3 4242<br />

Conradt, Jorg............................................................ MoBT8 CC<br />

Conti, Francois ......................................................... WeCT4.1 3023<br />

Cook IV, Atlas F........................................................ WeDT5.5 3526<br />

.................................................................................. ThAPT10.11 *<br />

Cordes, Florian ......................................................... WeDT8.3 3653<br />

Corke, Peter ............................................................. ThAT3.3 3868<br />

.................................................................................. ThBT6.1 *<br />

Correll, Nikolaus ....................................................... ThCT4 C<br />

.................................................................................. ThCT4 O<br />

.................................................................................. ThCT4.1 *<br />

.................................................................................. ThCT4.5 4783<br />

.................................................................................. ThDPT10.11 *<br />

Corso, Jason ............................................................ TuCT2.3 1632<br />

Cortez, Andres.......................................................... WeAT8.6 2333<br />

.................................................................................. WeBPT10.24 *<br />

Cosgun, Akansel ...................................................... ThCT1.2 4627<br />

Costa, Lino ............................................................... WeAT7.7 2286<br>

Coste, Michel............................................................ TuBT7.8 1453<br />

Cotin, Stephane........................................................ WeBT4.7 2608<br />

Coutard, Laurent....................................................... WeBT9.5 2843<br />

.................................................................................. WeCPT10.26 *<br />

Cowlagi, Raghvendra ............................................... WeDT5.1 3501<br />

Cowley, Anthony....................................................... TuAT7.6 1048<br />

.................................................................................. TuBPT10.18 *<br />

.................................................................................. ThCT4.4 4776<br />

.................................................................................. ThDPT10.10 *<br />

Creed, Ross.............................................................. ThAT2.2 3808<br />

Croft, Elizabeth......................................................... WeAT1.5 1994<br />

.................................................................................. WeBT9.2 2825<br />

.................................................................................. WeBPT10.2 *<br />

.................................................................................. WeDT1 CC<br />

.................................................................................. WeDT1.4 3361<br />

.................................................................................. ThAPT10.1 *<br />

Csató, Lehel ............................................................. MoBT8.2 698<br />

Cutkosky, Mark......................................................... MoBT9.7 774<br />

.................................................................................. TuAT6.1 *<br />

.................................................................................. TuBT6.4 1373<br />

.................................................................................. TuCPT10.13 *<br />

.................................................................................. WeBT3.8 2564<br />

.................................................................................. WeCT3.4 2992<br />

.................................................................................. WeCT3.5 2998<br />

.................................................................................. WeDPT10.7 *<br />

.................................................................................. WeDPT10.8 *<br />

.................................................................................. ThDT5.3 5100<br />

Czarnowski, Justin.................................................... WeCT5.4 3101<br />

.................................................................................. WeDPT10.13 *<br />

D<br />

D'Ambrosio, David.................................................... WeBT8.7 2802<br />

D'Andrea, Raffaello .................................................. ThDT6.1 5113<br />

.................................................................................. ThDT8.3 5179<br />

Daepp, Hannes......................................................... ThBT8.5 4556<br />

.................................................................................. ThCPT10.23 *<br />

Dagnino, Giulio......................................................... TuBT5.8 1359<br />

Dahl, Kristen............................................................. WeCT6.4 3140<br />

.................................................................................. WeDPT10.16 *<br />

Dahmen, Christian.................................................... MoAT1.4 19<br />

.................................................................................. MoBPT10.1 *<br />

.................................................................................. TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Dalamagkidis, Konstantinos ..................................... TuAT7.8 1063<br />

.................................................................................. ThAT9.5 4148<br />

.................................................................................. ThBPT10.26 *<br />

DallaLibera, Fabio .................................................... WeDT3.6 3480<br />

.................................................................................. ThAPT10.9 *<br />

Dallej, Tej.................................................................. WeBT9.7 2855<br />

Daly, John Michael ................................................... ThCT8.6 4961<br />

.................................................................................. ThDPT10.24 *<br />

Damani, Aayush ....................................................... MoBT1.5 445<br />

.................................................................................. TuAPT10.2 *<br />

Danès, Patrick .......................................................... MoAT3 C<br />

.................................................................................. MoAT3 O<br />

.................................................................................. MoAT3.1 *<br />

.................................................................................. MoAT3.7 137<br />

.................................................................................. MoBT3 O<br />

Daniel, Bruce............................................................ WeBT3.8 2564<br />

Daniilidis, Kostas ...................................................... ThBT6.8 4469<br />

Dansereau, Donald Gilbert....................................... ThBT6.6 4455<br />

.................................................................................. ThCPT10.18 *<br />

Dario, Paolo.............................................................. TuBT3.1 *<br />

.................................................................................. ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Darrell, Trevor........................................................... TuAT1 C<br />

.................................................................................. TuAT1.2 793<br />

.................................................................................. ThCT6.1 *<br />

.................................................................................. ThCT6.7 4877<br />

Das, Aditya ............................................................... TuCT3.7 1699<br />

Das, Aveek ............................................................... ThAT3.4 3874<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–236–<br />

.................................................................................. ThBPT10.7 *<br />

Das, Colin ................................................................. TuAT9.7 1160<br />

Das, Gautham .......................................................... ThCT2.8 4716<br />

Das, Jnaneshwar...................................................... WeCT6.3 3132<br />

Dastoor, Sanjay ........................................................ MoBT9.7 774<br />

David, Philip.............................................................. MoBT2.5 494<br />

.................................................................................. TuAPT10.5 *<br />

.................................................................................. TuCT2.3 1632<br />

Davison, Andrew J.................................................... ThCT6 O<br />

Dawids, Steen .......................................................... TuAT2.2 847<br />

de Almeida, Anibal.................................................... ThBT2.8 4277<br />

de Gea Fernandez, Jose .......................................... ThAT9.3 4134<br />

De Laet, Tinne .......................................................... WeCT4.2 3031<br />

De Lorenzo, Danilo................................................... WeBT4.3 2584<br />

De Luca, Alessandro ................................................ WeCT7 CC<br />

.................................................................................. WeCT7.4 3192<br />

.................................................................................. WeDPT10.19 *<br />

.................................................................................. ThAT7.1 4026<br />

de Menezes Pereira, Arvind A.................................. WeCT6.5 3147<br />

de Menezes Pereira, Arvind A.................................. WeDPT10.17 *<br />

de Oliveira, Mauricio................................................. ThBT3.7 4329<br />

De Schutter, Bart ...................................................... WeBT7.3 2729<br />

De Schutter, Joris ..................................................... WeCT4.2 3031<br />

.................................................................................. ThCT2.3 4684<br />

de Vries, Jeroen ....................................................... MoBT1.1 421<br />

Debain, Christophe................................................... ThBT9.1 4569<br />

Decré, Wilm .............................................................. WeCT4.2 3031<br />

Del Prete, Andrea ..................................................... WeDT9.1 3694<br />

Delaney, John........................................................... WeCT6.1 *<br />

Delettre, Anne........................................................... ThDT6.2 5121<br />

Dellaert, Frank .......................................................... WeAT6.8 2242<br />

Dellepiane, Massimo ................................................ TuBT5.8 1359<br />

Delmerico, Jeffrey..................................................... TuCT2.3 1632<br />

DelPreto, Joseph ...................................................... TuBT6.5 1380<br />

.................................................................................. TuCPT10.14 *<br />

Demeester, Eric........................................................ WeCT3.1 2971<br />

Demiris, Yiannis........................................................ WeAT1.3 1980<br />

Demonceaux, Cédric ................................................ ThAT6.6 4006<br />

.................................................................................. ThBPT10.18 *<br />

Denei, Simone .......................................................... WeDT9.1 3694<br />

Deng, Xinyan ............................................................ MoAT5 CC<br />

.................................................................................. MoAT5 O<br />

.................................................................................. MoBT5 O<br />

.................................................................................. MoBT5.6 574<br />

.................................................................................. TuAPT10.12 *<br />

Denny, Jory .............................................................. WeBT5.5 2632<br />

.................................................................................. WeCPT10.14 *<br />

Derenick, Jason........................................................ WeCT8.4 3248<br />

.................................................................................. WeDT8.5 3667<br />

.................................................................................. WeDPT10.22 *<br />

.................................................................................. ThAPT10.20 *<br />

Dermitzakis, Konstantinos ........................................ WeDT3.2 3454<br />

Desai, Jaydev P........................................................ TuAT5 C<br />

.................................................................................. TuAT5 O<br />

.................................................................................. TuBT5 CC<br />

.................................................................................. TuBT5 O<br />

.................................................................................. TuBT5.1 *<br />

.................................................................................. TuCT5 C<br />

.................................................................................. TuCT5 O<br />

Desmaele, Denis ...................................................... TuAT3.6 913<br />

.................................................................................. TuBPT10.9 *<br />

Detry, Renaud .......................................................... TuBT9.7 1554<br />

Dettmann, Alexander................................................ WeDT8.3 3653<br />

Dey, Bir Bikram......................................................... ThDT3.3 5048<br />

Di Caro, Gianni A...................................................... ThCT9.1 4981<br />

.................................................................................. ThCT9.8 5027<br />

Di Lello, Enrico ......................................................... WeCT3.1 2971<br />

Di Mario, Ezequiel .................................................... ThBT4.1 4341<br />

Di Rocco, Maurizio ................................................... ThCT9.5 5007<br />

.................................................................................. ThDPT10.26 *<br />
Di Stefano, Luigi ....................................................... ThCT6.4 4857<br />

.................................................................................. ThDPT10.16 *<br />

Dickmann, Jürgen..................................................... ThBT9.4 4587<br />

.................................................................................. ThCPT10.25 *<br />

Dietrich, Alexander ................................................... WeCT7.5 3199<br />

.................................................................................. WeDPT10.20 *<br />

Dille, Michael ............................................................ ThCT8.4 4947<br />

.................................................................................. ThDPT10.22 *<br />

Diller, Eric D.............................................................. TuBT3.4 1291<br />

.................................................................................. TuCT3.5 1687<br />

.................................................................................. TuCPT10.7 *<br />

.................................................................................. WeAPT10.8 *<br />

Dillmann, Rüdiger..................................................... TuCT6.2 1761<br />

.................................................................................. TuCT6.5 1781<br />

.................................................................................. WeAPT10.14 *<br />

.................................................................................. ThCT1.3 4633<br />

.................................................................................. ThCT5 C<br />

Dimitrov, Dimitar Nikolaev ........................................ TuCT6.7 1797<br />

.................................................................................. WeAT7.8 2292<br />

Ding, Xu Chu ............................................................ WeCT5.2 3087<br />

Dinont, Cédric........................................................... WeDT5.4 3519<br />

.................................................................................. ThAPT10.10 *<br />

Dissanayake, Gamini................................................ TuCT2.4 1640<br />

.................................................................................. WeAPT10.4 *<br />

.................................................................................. WeDT8.8 3686<br />

.................................................................................. ThBT9.7 4608<br />

Dittes, Benjamin ....................................................... TuAT7.1 1015<br />

Dobrzynski, Michal Karol.......................................... TuCT9.1 1913<br />

Doitsidis, Lefteris ...................................................... TuCT2.7 1661<br />

Dolha, Mihai Emanuel .............................................. WeCT7.1 3172<br />

Dollar, Aaron............................................................. TuBT6 C<br />

.................................................................................. TuBT6 O<br />

.................................................................................. TuBT7.3 1420<br />

.................................................................................. TuCT6 O<br />

.................................................................................. TuCT7.3 1823<br />

.................................................................................. WeBT6.1 2660<br />

Dominey, Peter Ford ................................................ WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Dong, Lixin................................................................ TuCT3.8 1705<br />

Dorigo, Marco........................................................... ThCT4.2 4762<br />

.................................................................................. ThCT9.8 5027<br />

Dörr, Jonas............................................................... WeDT3.5 3474<br />

.................................................................................. ThAPT10.8 *<br />

Doth, Klaus-Michael ................................................. ThDT8.1 5166<br />

Douillard, Bertrand.................................................... MoAT6.7 255<br />

Doyle, Courtney........................................................ ThCT8.8 4975<br />

Droge, Greg.............................................................. TuAT9.3 1134<br />

Du Toit, Noel E. ........................................................ WeCT8 C<br />

Ducatelle, Frederick.................................................. ThCT9.1 4981<br />

.................................................................................. ThCT9.8 5027<br />

Duchaine, Vincent .................................................... TuBT6.4 1373<br />

.................................................................................. TuCPT10.13 *<br />

Dudek, Gregory ........................................................ ThDT3.3 5048<br />

.................................................................................. ThDT3.4 5054<br />

Duff, Armin................................................................ TuAT8.8 1115<br />

Duhamel, Pierre-Emile ............................................. TuAT8.6 1099<br />

.................................................................................. TuBPT10.21 *<br />

Dupont, Pierre .......................................................... WeAT3.2 2083<br />

.................................................................................. ThBT7 C<br />

.................................................................................. ThBT7.6 4508<br />

.................................................................................. ThCPT10.21 *<br />

Dupuis, Erick ............................................................ MoBT6.8 631<br />

Duriez, Christian....................................................... WeBT4.7 2608<br />

Durrant-Whyte, Hugh................................................ TuAT2.8 886<br />

.................................................................................. ThBT2.2 4236<br />

E<br />

Eathakota, Vijay........................................................ ThBT3.5 4314<br />

.................................................................................. ThCPT10.8 *<br />

Eck, Laurent ............................................................. ThCT8.1 4929<br />

Edsinger, Aaron........................................................ TuBT6 CC<br />


.................................................................................. TuBT6 O<br />

.................................................................................. TuCT6 O<br />

Egerstedt, Magnus ................................................... TuAT9.3 1134<br />

.................................................................................. WeBT8.2 2772<br />

.................................................................................. ThBT8.5 4556<br />

.................................................................................. ThCPT10.23 *<br />

Einhorn, Erik ............................................................. WeBT1.5 2430<br />

Ek, Carl Henrik ......................................................... TuAT6.4 980<br />

.................................................................................. TuBPT10.13 *<br />

.................................................................................. WeAT2.2 2028<br />

Ekaterinaris, John A. ................................................ ThAT7.5 4054<br />

.................................................................................. ThBPT10.20 *<br />

Ekstrand, Fredrik ...................................................... TuCT2.2 1626<br />

Ekström, Mikael ........................................................ TuCT2.2 1626<br />

Elbrechter, Christof................................................... TuBT7.4 1427<br />

.................................................................................. TuCPT10.16 *<br />

Elkmann, Norbert...................................................... WeDT1.3 3355<br />

Emeli, Victor ............................................................. ThCT1.2 4627<br />

Ende, Tobias ............................................................ WeDT1.5 3367<br />

.................................................................................. ThAPT10.2 *<br />

Endo, Gen ................................................................ WeBT1.4 2423<br />

.................................................................................. WeCPT10.1 *<br />

Endo, Mitsuru ........................................................... ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Eng, Dillon ................................................................ TuCT5.1 1711<br />

Engeberg, Erik Daniel............................................... ThBT1.1 4174<br />

Engel, Paulo ............................................................. TuAT9.1 1122<br />

Englsberger, Johannes............................................. ThBT5.7 4420<br />

Ergin, Mehmet Alper................................................. ThCT7.5 4917<br />

.................................................................................. ThDPT10.20 *<br />

Eruhimov, Victor ....................................................... WeCT2.4 2941<br />

.................................................................................. WeDPT10.4 *<br />

Esmaeili Malekabadi, Mohammad............................ ThCT7.3 4905<br />

Esparza, Noe............................................................ ThDT5.3 5100<br />

Espiau, Bernard........................................................ ThCT7 C<br />

Espinosa, Felipe ....................................................... ThAT1.1 3746<br />

Espinoza León, Judith .............................................. ThBT8.1 4528<br />

Esposito, Joel ........................................................... WeAT5.7 2192<br />

Estebanez, Belen ..................................................... WeAT3.8 2121<br />

Etoundi, Appolinaire C.............................................. ThAT7.3 4042<br />

Eustice, Ryan ........................................................... TuCT2 C<br />

.................................................................................. TuCT2.5 1647<br />

.................................................................................. WeAPT10.5 *<br />

.................................................................................. ThBT4.7 4378<br />

Evans, Nathan .......................................................... ThCT1.8 4664<br />

Even, Jani................................................................. MoBT3.6 536<br />

.................................................................................. TuAPT10.9 *<br />

F<br />

Falcó Montesinos, Antonio ....................................... TuCT1.1 1567<br />

Falotico, Egidio ......................................................... ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Fan, Zheng ............................................................... TuCT3.8 1705<br />

Fan, Zhun ................................................................. TuAT2.2 847<br />

Fankhauser, Peter .................................................... WeAT6.5 2223<br />

.................................................................................. WeBPT10.17 *<br />

Fantuzzi, Cesare ...................................................... ThBT9.8 4615<br />

Faria, Ricardo ........................................................... ThBT2.8 4277<br />

Fatikow, Sergej......................................................... MoAT1.4 19<br />

.................................................................................. MoBPT10.1 *<br />

.................................................................................. TuAT3.3 894<br />

.................................................................................. TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Fatovic, Michael........................................................ TuAT3.8 925<br />

Fearing, Ronald ........................................................ WeBT6.3 2674<br />

.................................................................................. ThDT4.4 5080<br />

.................................................................................. ThDT5.1 5087<br />

Felekis, Dimitrios ...................................................... TuAT3.7 919<br />

Felfoul, Ouajdi .......................................................... TuBT3.6 1304<br />

.................................................................................. TuCPT10.9 *<br />

Felisa, Mirko ............................................................. TuCT1.6 1599<br />
.................................................................................. WeAPT10.3 *<br />

Ferber, Jacques........................................................ ThDT3.2 5041<br />

Fernandez, Benito R................................................. WeAT7.4 2267<br />

.................................................................................. WeBPT10.19 *<br />

Ferrante, Eliseo ........................................................ ThCT9.8 5027<br />

Ferrary, Evelyne ....................................................... WeBT3.3 2532<br />

Ferreira, Antoine....................................................... TuBT3.5 1297<br />

.................................................................................. TuBT3.8 1315<br />

.................................................................................. TuCPT10.8 *<br />

Ferreira, Manuel ....................................................... WeAT7.7 2286<br />

Ferreira, Ricardo....................................................... ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Ferretti, Gianni.......................................................... WeCT3.1 2971<br />

Ferrier, Nicola ........................................................... ThDT7.1 5139<br />

Ferrin, Jeffrey ........................................................... WeBT6.5 2688<br />

.................................................................................. WeCPT10.17 *<br />

.................................................................................. ThDT8.2 5173<br />

Ficuciello, Fanny....................................................... TuCT6.4 1775<br />

.................................................................................. WeAPT10.13 *<br />

Fierro, Rafael............................................................ WeAT8.6 2333<br />

.................................................................................. WeBPT10.24 *<br />

Fine, Benjamin.......................................................... ThCT9.4 5001<br />

.................................................................................. ThDPT10.25 *<br />

Finio, Benjamin......................................................... MoAT1.6 31<br />

.................................................................................. MoAT9.3 384<br />

.................................................................................. MoBPT10.3 *<br />

.................................................................................. TuAT8.6 1099<br />

.................................................................................. TuAT8.7 1107<br />

.................................................................................. TuBPT10.21 *<br />

Fink, Jonathan.......................................................... WeCT8.4 3248<br />

.................................................................................. WeDPT10.22 *<br />

Fiorini, Paolo............................................................. TuAT5 CC<br />

.................................................................................. TuAT5 O<br />

.................................................................................. TuAT5.1 *<br />

.................................................................................. TuBT5 C<br />

.................................................................................. TuBT5 O<br />

.................................................................................. TuCT5 CC<br />

.................................................................................. TuCT5 O<br />

Firman, Michael David.............................................. ThAT2.8 3850<br />

Fischer, Jan .............................................................. WeAT9.3 2365<br />

Fisher, Charles ......................................................... MoAT6.8 261<br />

Fjerdingen, Sigurd Aksnes ....................................... MoAT6.6 247<br />

.................................................................................. MoBPT10.18 *<br />

Flacco, Fabrizio ........................................................ ThAT7.1 4026<br />

Flemming, Leslie ...................................................... MoBT9.8 780<br />

Fleps, Michael .......................................................... WeCT9.3 3297<br />

Floreano, Dario......................................................... TuCT9 C<br />

.................................................................................. TuCT9.1 1913<br />

.................................................................................. ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Folio, David............................................................... TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Fontaine, Jean-Guy.................................................. ThAT5.4 3951<br />

.................................................................................. ThBPT10.13 *<br />

Forgoston, Eric ......................................................... ThAT4.3 3905<br />

Forlizzi, Jodi.............................................................. WeAT1.4 1986<br />

.................................................................................. WeBPT10.1 *<br />

Forsman, Pekka ....................................................... MoAT2.2 59<br />

Fourquet, Jean-Yves ................................................ ThAT9.2 4127<br />

Fox, Dieter................................................................ TuAT1.6 821<br />

.................................................................................. TuBPT10.3 *<br />

.................................................................................. ThAT6 CC<br />

.................................................................................. ThBT6 CC<br />

.................................................................................. ThCT6 C<br />

.................................................................................. ThCT6 O<br />

.................................................................................. ThCT6.3 4850<br />

Fox, Michael ............................................................. ThCT4.3 4770<br />

Fraichard, Thierry ..................................................... WeCT3.3 2985<br />

Fraisse, Philippe....................................................... ThCT1.7 4658<br />

Franchi, Antonio ....................................................... MoAT2 CC<br />


.................................................................................. MoAT4.5 163<br />

.................................................................................. MoBT2.1 469<br />

.................................................................................. MoBPT10.11 *<br />

.................................................................................. WeAT6.4 2215<br />

.................................................................................. WeBPT10.16 *<br />

.................................................................................. WeCT4.3 3039<br />

Francisco, Gerald ..................................................... TuCT5.1 1711<br />

Frank, Barbara.......................................................... WeAT5.5 2180<br />

.................................................................................. WeBPT10.14 *<br />

Fraundorfer, Friedrich............................................... TuCT2.6 1655<br />

.................................................................................. WeAPT10.6 *<br />

.................................................................................. ThAT6.7 4012<br />

Frazzoli, Emilio ......................................................... MoAT7.6 298<br />

.................................................................................. MoBPT10.21 *<br />

.................................................................................. WeBT5.1 *<br />

.................................................................................. WeDT5.3 3513<br />

.................................................................................. ThBT3.4 4307<br />

.................................................................................. ThCPT10.7 *<br />

Freidovich, Leonid .................................................... ThDT5.2 5094<br />

Frese, Udo................................................................ TuCT1.4 1585<br />

.................................................................................. WeAPT10.1 *<br />

.................................................................................. WeCT9.4 3305<br />

.................................................................................. WeDT2.6 3426<br />

.................................................................................. WeDPT10.25 *<br />

.................................................................................. ThAPT10.6 *<br />

Friedl, Werner........................................................... TuBT6.3 1366<br />

.................................................................................. TuCT7.5 1836<br />

.................................................................................. WeAPT10.17 *<br />

.................................................................................. ThBT1.7 4215<br />

Friedman, Ariell ........................................................ TuBT9.4 1533<br />

.................................................................................. TuCPT10.22 *<br />

Fritz, Mario................................................................ TuAT1.2 793<br />

.................................................................................. ThCT6.7 4877<br />

Fu, Michael J. ........................................................... WeDT3.3 3460<br />

Fu, Yu ....................................................................... MoBT2.3 482<br />

Fu, Zhenbo ............................................................... TuAT3.8 925<br />

Fuchiwaki, Ohmi ....................................................... MoAT9.5 395<br />

.................................................................................. MoBT9 CC<br />

.................................................................................. MoBPT10.26 *<br />

Fujie, Masakatsu G................................................... TuAT5.6 955<br />

.................................................................................. TuBPT10.12 *<br />

Fujiki, Soichiro .......................................................... WeAT7.5 2274<br />

.................................................................................. WeAT7.6 2280<br />

.................................................................................. WeBPT10.20 *<br />

.................................................................................. WeBPT10.21 *<br />

Fujimoto, Hideo ........................................................ WeAT4.6 2145<br />

.................................................................................. WeBPT10.12 *<br />

Fujita, Emi................................................................. TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Fukuda, Toshio......................................................... MoAT1.1 1<br />

.................................................................................. MoBT1.3 433<br />

.................................................................................. MoBT1.4 439<br />

.................................................................................. MoBT1.7 457<br />

.................................................................................. TuAT3.1 *<br />

.................................................................................. TuAPT10.1 *<br />

.................................................................................. TuCT3.6 1693<br />

.................................................................................. WeAT3.7 2115<br />

.................................................................................. WeAPT10.9 *<br />

.................................................................................. ThCT2.1 4670<br />

.................................................................................. ThDT5.2 5094<br />

Fukui, Rui ................................................................. WeDT2.5 3419<br />

.................................................................................. ThAPT10.5 *<br />

Fukushima, Edwardo F............................................. WeBT1.4 2423<br />

.................................................................................. WeCPT10.1 *<br />

Fukushima, Hiroaki................................................... TuCT8.8 1907<br />

Full, Robert ............................................................... MoAT5.1 *<br />

.................................................................................. TuCT8.5 1887<br />

.................................................................................. WeAPT10.20 *<br />

Fumagalli, Matteo ..................................................... ThAT9.7 4161<br />

Funakoshi, Kotaro .................................................... TuBT9.5 1540<br />
.................................................................................. TuCPT10.23 *<br />

Fung, Henry.............................................................. WeAT7.3 2261<br />

Furtuna, Andrei......................................................... ThBT3.6 4321<br />

.................................................................................. ThCPT10.9 *<br />

Furuta, Takayuki....................................................... WeBT2.2 2466<br />

G<br />
Gaboury, Louis ......................................................... TuBT3.6 1304<br />

.................................................................................. TuCPT10.9 *<br />

Gadre, Aditya............................................................ TuAT9.4 1140<br />

.................................................................................. TuBPT10.22 *<br />

Gaillard, François ..................................................... WeDT5.4 3519<br />

.................................................................................. ThAPT10.10 *<br />

Gal, Oren.................................................................. WeDT5.7 3539<br />

Galloway, Kevin........................................................ MoAT1.6 31<br />

.................................................................................. MoBPT10.3 *<br />

Galvez Lopez, Dorian............................................... MoAT2.1 51<br />

.................................................................................. TuBT2.5 1256<br />

.................................................................................. TuBT2.8 1277<br />

.................................................................................. TuCPT10.5 *<br />

Gambardella, Luca ................................................... ThCT9.1 4981<br />

.................................................................................. ThCT9.8 5027<br />

Gans, Nicholas ......................................................... TuBT1 C<br />

.................................................................................. TuBT1.2 1180<br />

Gao, Ce .................................................................... WeAT9.2 2359<br />

Gao, Yixin................................................................. WeBT3.4 2539<br />

.................................................................................. WeCPT10.7 *<br />

Garabini, Manolo ...................................................... ThAT1.5 3770<br />

.................................................................................. ThBPT10.2 *<br />

Garcia, Elena............................................................ TuBT8 C<br />

.................................................................................. TuBT8.8 1507<br />

Garcia Bermudez, Fernando .................................... WeBT6.3 2674<br />

Garney, Ben ............................................................. ThAT2.3 3816<br />

Gaspar, Jose ............................................................ ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Gassert, Roger ......................................................... WeCT4.8 3074<br />

Gattupalli, Aditya ...................................................... ThBT3.5 4314<br />

.................................................................................. ThCPT10.8 *<br />

Gautier, Maxime ....................................................... WeDT9.6 3728<br />

.................................................................................. ThAPT10.24 *<br />

Gayet, Brice.............................................................. TuBT5.4 1333<br />

.................................................................................. TuBT5.5 1339<br />

.................................................................................. TuCPT10.10 *<br />

.................................................................................. TuCPT10.11 *<br />

Gehrig, Dirk .............................................................. ThCT5.4 4819<br />

.................................................................................. ThDPT10.13 *<br />

Geiger, James .......................................................... TuAT5.7 961<br />

Georgiev, Kristiyan ................................................... ThAT2.2 3808<br />

Geraerts, Roland ...................................................... WeDT5.5 3526<br />

.................................................................................. ThAPT10.11 *<br />

German, Christopher R. ........................................... MoAT6.8 261<br />

Gerratt, Aaron P. ...................................................... TuCT3.4 1680<br />

.................................................................................. WeAPT10.7 *<br />

Ghadiok, Vaibhav ..................................................... ThCT1.5 4645<br />

.................................................................................. ThDPT10.2 *<br />

Ghaffari Toiserkan, Kamran ..................................... WeDT9.8 3740<br />

Ghommam, Jawhar .................................................. WeAT8.7 2340<br />

Ghosh, Madhumita ................................................... WeBT1.2 2409<br />

Giardino, Simone...................................................... ThCT6.4 4857<br />

.................................................................................. ThDPT10.16 *<br />

Giertler, Bogumil....................................................... ThAT8.8 4115<br />

Gilbert, Hunter B....................................................... WeBT3.1 2517<br />

Gillies, Andrew G...................................................... ThDT5.1 5087<br />

Gillula, Jeremy.......................................................... WeCT3.2 2979<br />

Girbés, Vicent........................................................... ThBT3.8 4335<br />

Girdhar, Yogesh ....................................................... ThDT3.3 5048<br />

Giri, Nivedhitha......................................................... ThAT7.6 4060<br />

.................................................................................. ThBPT10.21 *<br />

Glasauer, Stefan....................................................... WeDT1.6 3375<br />

.................................................................................. ThAPT10.3 *<br />

Godage, Isuru S. ...................................................... TuAT8.5 1093<br />

.................................................................................. TuBPT10.20 *<br />

Godin, Mike .............................................................. WeCT6.3 3132<br />

Goerick, Christian ..................................................... TuAT7.1 1015<br />

Goerner, Martin ........................................................ WeBT2.6 2497<br />

.................................................................................. WeCPT10.6 *<br />

Goeroeg, Attila.......................................................... ThCT5.8 4844<br />

Goldberg, Ken .......................................................... WeBT5.7 2646<br />

Goldfield, Eugene ..................................................... ThBT7.3 4488<br />

Goldin, Jeremy ......................................................... ThCT1.5 4645<br />

.................................................................................. ThDPT10.2 *<br />

Goldman, Roger E.................................................... WeAT9.7 2390<br />

Golombek, Raphael.................................................. WeCT3.7 3011<br />

Gomes, Kevin ........................................................... WeCT6.3 3132<br />

Gong, Zheng............................................................. MoBT3.5 530<br />

.................................................................................. TuAPT10.8 *<br />

Gonzalez, Yolanda ................................................... WeDT6.7 3577<br />

Gonzalez de Santos, Pablo ...................................... TuBT8.8 1507<br />

Goron, Lucian Cosmin.............................................. ThBT2.6 4263<br />

.................................................................................. ThCPT10.6 *<br />

Gosselin, Frederick P. .............................................. TuAT3.4 901<br />

.................................................................................. TuBPT10.7 *<br />

Goswami, Ambarish ................................................. ThAT5 C<br />

.................................................................................. ThAT5 O<br />

.................................................................................. ThAT5.3 3943<br />

Goto, Masataka ........................................................ WeAT1.6 2000<br />

.................................................................................. WeBPT10.3 *<br />

Gouttefarde, Marc..................................................... WeBT9.7 2855<br />

Gowal, Sven ............................................................. WeBT8.1 2765<br />

.................................................................................. WeCT9.5 3313<br />

.................................................................................. WeDPT10.26 *<br />

Graciano Santos, Vinicius ........................................ ThBT4.6 4372<br />

.................................................................................. ThCPT10.12 *<br />

Grand, Christophe .................................................... WeBT7.7 2753<br />

.................................................................................. ThBT4.4 4360<br />

.................................................................................. ThCPT10.10 *<br />

Grange, Sebastien.................................................... WeCT4.1 3023<br />

Gräser, Axel.............................................................. TuAT1.1 786<br />

Grebenstein, Markus ................................................ TuBT6.3 1366<br />

Gregory, John........................................................... WeAT4.4 2133<br />

.................................................................................. WeBPT10.10 *<br />

Grimmer, Martin........................................................ TuCT7.1 1811<br />

Grira, Aymen ............................................................ MoAT1.8 45<br />

Grisetti, Giorgio......................................................... MoAT2.6 86<br />

.................................................................................. MoBPT10.6 *<br />

.................................................................................. TuAT2.5 865<br />

.................................................................................. TuBPT10.5 *<br />

.................................................................................. WeDT9.4 3716<br />

.................................................................................. ThAPT10.22 *<br />

Grocholsky, Ben ....................................................... ThCT8.4 4947<br />

.................................................................................. ThDPT10.22 *<br />

Groeger, Martin ........................................................ ThBT7.5 4502<br />

.................................................................................. ThCPT10.20 *<br />

Gross, Horst-Michael................................................ WeBT1.5 2430<br />

.................................................................................. WeCPT10.2 *<br />

Grunberg, David ....................................................... WeCT1.8 2916<br />

Grupen, Rod ............................................................. WeDT1.2 3349<br />

Grzonka, Slawomir ................................................... TuBT2.4 1249<br />

.................................................................................. TuCPT10.4 *<br />

Gu, Ye ...................................................................... WeCT1.2 2873<br />

Guan, Yisheng.......................................................... TuBT8.3 1473<br />

Guerin, Kelleher........................................................ TuBT5.6 1346<br />

.................................................................................. TuCPT10.12 *<br />

Gueta, Lounell B....................................................... ThCT2.5 4698<br />

.................................................................................. ThDPT10.5 *<br />

Guglielmino, Emanuele ............................................ TuAT8.5 1093<br />

.................................................................................. TuBPT10.20 *<br />

.................................................................................. ThAT7.5 4054<br />

.................................................................................. ThBPT10.20 *<br />

Guitton, Julien........................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *


Guo, Shijie................................................................ WeBT1.7 2445<br />

Gupta, S.K................................................................ TuAT9 C<br />

Gupta, Satyandra K.................................................. TuAT9.6 1154<br />

.................................................................................. TuBPT10.24 *<br />

Gurdan, Daniel ......................................................... ThDT8.1 5166<br />

Gustafson, Joakim.................................................... WeDT1.1 3342<br />

Gutierrez, Alvaro....................................................... ThCT9.8 5027<br />

H<br />

Ha, Hyowon.............................................................. WeCT9.2 3290<br />

Ha, Inyong ................................................................ WeCT7.6 3207<br />

.................................................................................. WeDPT10.21 *<br />

Haarnoja, Tuomas .................................................... WeAT7.2 2255<br />

Haberland, Matt........................................................ ThAT5.5 3957<br />

.................................................................................. ThBPT10.14 *<br />

Hach, Oliver.............................................................. ThAT8.1 4072<br />

Hacker, Franz........................................................... TuCT9.4 1933<br />

.................................................................................. WeAPT10.22 *<br />

Haddadin, Sami........................................................ TuCT6.6 1789<br />

.................................................................................. WeAPT10.15 *<br />

.................................................................................. WeDT1.5 3367<br />

.................................................................................. ThAPT10.2 *<br />

Hager, Gregory......................................................... WeAT3.5 2102<br />

.................................................................................. WeBT3.4 2539<br />

.................................................................................. WeBPT10.8 *<br />

.................................................................................. WeCT2.6 2953<br />

.................................................................................. WeCPT10.7 *<br />

.................................................................................. WeDPT10.6 *<br />

Hagita, Norihiro......................................................... MoBT3.6 536<br />

.................................................................................. MoBT3.8 550<br />

.................................................................................. TuAPT10.9 *<br />

.................................................................................. WeDT7.1 3589<br />

Hagiwara, Ichiro........................................................ MoBT3.5 530<br />

.................................................................................. TuAPT10.8 *<br />

Hagiwara, Masaya.................................................... TuBT3.7 1309<br />

Hagn, Ulrich.............................................................. WeCT4.1 3023<br />

Haines, Justin........................................................... MoAT6.3 227<br />

Halasz, Adam ........................................................... ThAT4.6 3923<br />

.................................................................................. ThBPT10.12 *<br />

Haliyo, Dogan Sinan................................................. TuAT3.3 894<br />

Halme, Aarne J......................................................... WeAT7.2 2255<br />

Hamann, Katharina................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Hamano, Seiya......................................................... WeDT2.2 3401<br />

Hammer, Peter ......................................................... TuBT5.3 1327<br />

Han, Chang-Soo....................................................... ThDT6 CC<br />

.................................................................................. ThDT6.3 5127<br />

Han, Jungsoo ........................................................... ThDT6.3 5127<br />

Han, Misop ............................................................... TuAT5.4 943<br />

.................................................................................. TuBPT10.10 *<br />

Han, Shuo................................................................. WeCT3.8 3017<br />

Han, Zhou................................................................. MoBT5.5 568<br />

.................................................................................. TuAPT10.11 *<br />

Handzhiyski, Nikolay ................................................ WeDT5.8 3545<br />

Häne, Christian......................................................... TuCT2.1 1618<br />

Hanebeck, Uwe D..................................................... WeAT2.5 2050<br />

.................................................................................. WeBPT10.5 *<br />

.................................................................................. WeCT4.5 3053<br />

.................................................................................. WeDPT10.11 *<br />

.................................................................................. ThCT5.4 4819<br />

.................................................................................. ThDPT10.13 *<br />

Hanheide, Marc ........................................................ WeCT3.7 3011<br />

.................................................................................. WeDT1.8 3387<br />

Hansen, Peter........................................................... ThAT6.8 4020<br />

Hanson, Allen ........................................................... WeDT1.2 3349<br />

Hanus, Jean Luc....................................................... TuBT3.8 1315<br />

Hao, Ning.................................................................. WeCT2.2 2928<br />

Hara, Masayuki......................................................... ThCT1.8 4664<br />

Harada, Tatsuya....................................................... TuBT1.7 1214<br />

Haraguchi, Daisuke .................................................. TuAT5.2 931<br />

Harata, Yuji............................................................... WeBT7.4 2735<br />

.................................................................................. WeCPT10.19 *<br />

Hardy, Jason ............................................................ TuCT1.7 1605<br />

Haschke, Robert....................................................... TuBT7.4 1427<br />

.................................................................................. TuCPT10.16 *<br />

.................................................................................. WeCT2.5 2947<br />

.................................................................................. WeDPT10.5 *<br />

Hasegawa, Hiroaki ................................................... TuCT9.7 1954<br />

Hasegawa, Osamu ................................................... ThAT6.5 3998<br />

.................................................................................. ThBPT10.17 *<br />

Hasegawa, Tsutomu................................................. WeAT2.1 2020<br />

.................................................................................. ThBT1.5 4201<br />

.................................................................................. ThCPT10.2 *<br />

Hasegawa, Yasuhisa................................................ TuCT5.5 1737<br />

.................................................................................. WeAPT10.11 *<br />

.................................................................................. ThCT7.2 4899<br />

Häselich, Marcel ....................................................... WeDT5.8 3545<br />

Hashimoto, Kenji ...................................................... WeCT7.8 3221<br />

Hashimoto, Koichi..................................................... MoAT1 C<br />

.................................................................................. MoAT1.2 7<br />

Hassan Zahraee, Ali ................................................. TuBT5.4 1333<br />

.................................................................................. TuCPT10.10 *<br />

Hassen, Salhi ........................................................... WeCT3.3 2985<br />

Hassenzahl, Marc..................................................... WeDT1.5 3367<br />

Hatakeyama, Tomofumi ........................................... ThCT1.6 4652<br />

.................................................................................. ThDPT10.3 *<br />

Hatton, Ross............................................................. ThAT3.7 3893<br />

Hauert, Sabine.......................................................... ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Hauser, Helmut......................................................... ThAT9.8 4168<br />

Hausknecht, Matthew ............................................... ThBT9.3 4581<br />

Hawes, Nick.............................................................. WeDT1.8 3387<br />

Hawkes, Elliot Wright................................................ ThDT5.3 5100<br />

Hayakawa, Yoshikazu .............................................. WeBT1.7 2445<br />

Hayashi, Atsushi....................................................... ThBT5.4 4400<br />

.................................................................................. ThCPT10.13 *<br />

Hayashi, Yoshiaki ..................................................... TuCT5.8 1755<br />

Hayashibe, Mitsuhiro ................................................ TuCT5.4 1731<br />

.................................................................................. WeAPT10.10 *<br />

.................................................................................. WeDT9.2 3701<br />

Healey, Anthony J. ................................................... WeCT6.6 3154<br />

.................................................................................. WeDPT10.18 *<br />

Heckmann, Martin .................................................... WeCT3.7 3011<br />

Hehn, Markus ........................................................... ThDT8.3 5179<br />

Heidarsson, Hordur K ............................................... WeCT6.7 3160<br />

Heimlich, Christian.................................................... ThDT4.3 5073<br />

Hellum, Aren............................................................. ThCT3.6 4749<br />

.................................................................................. ThDPT10.9 *<br />

Helmer, Patrick ......................................................... WeCT4.1 3023<br />

Heng, Lionel ............................................................. ThAT6.7 4012<br />

Hengst, Bernhard ..................................................... MoBT8.7 732<br />

Hennes, Daniel ......................................................... ThBT2.4 4249<br />

.................................................................................. ThCPT10.4 *<br />

Heracleous, Panikos................................................. MoBT3.6 536<br />

.................................................................................. TuAPT10.9 *<br />

Herbst, Evan............................................................. ThCT6.3 4850<br />

Herman, Benoît ........................................................ TuBT5.4 1333<br />

.................................................................................. TuBT5.5 1339<br />

.................................................................................. TuCPT10.10 *<br />

.................................................................................. TuCPT10.11 *<br />

Hermans, Tucker ...................................................... ThCT1.2 4627<br />

Hernandez Arieta, Alejandro .................................... WeDT3.2 3454<br />

Hernandez-Gutierrez, Andres................................... ThBT9.6 4601<br />

.................................................................................. ThCPT10.27 *<br />

Hershberger, Andrew D............................................ WeDT3.3 3460<br />

Hershey, John R....................................................... ThCT2.4 4690<br />

.................................................................................. ThDPT10.4 *<br />

Hertkorn, Katharina .................................................. TuCT6.3 1768<br />

Heyer, Clint............................................................... MoAT6.4 235<br />

.................................................................................. MoBPT10.16 *<br />

Heyneman, Barrett ................................................... TuBT6.4 1373<br />

.................................................................................. TuCPT10.13 *<br />

Higashimori, Mitsuru................................................. TuBT6.6 1386<br />

.................................................................................. TuBT6.7 1392<br />

.................................................................................. TuCPT10.15 *<br />

.................................................................................. ThAT7.4 4048<br />

.................................................................................. ThBPT10.19 *<br />

Higuchi, Toshiro........................................................ ThCT1.8 4664<br />

Hilario Pérez, Lucía .................................................. TuCT1.1 1567<br />

Hilsenbeck, Barbara ................................................. WeCT1.6 2903<br />

.................................................................................. WeDPT10.3 *<br />

Himmelsbach, Michael ............................................. ThBT8.6 4562<br />

.................................................................................. ThCPT10.24 *<br />

Hirai, Hiroaki............................................................. ThBT7.4 4496<br />

.................................................................................. ThCPT10.19 *<br />

Hirai, Shinichi............................................................ TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Hirano, Shinya.......................................................... WeBT1.7 2445<br />

Hirata, Yasuhisa ....................................................... MoAT7 C<br />

.................................................................................. MoAT7.8 311<br />

.................................................................................. ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Hirche, Sandra.......................................................... MoAT8.5 344<br />

.................................................................................. MoBPT10.23 *<br />

.................................................................................. WeBT1.3 2416<br />

Hirose, Shigeo.......................................................... MoAT5.2 *<br />

.................................................................................. TuAT8.3 1081<br />

.................................................................................. TuPL.1 *<br />

.................................................................................. WeBT1.4 2423<br />

.................................................................................. WeCPT10.1 *<br />

Hirschmüller, Heiko .................................................. WeBT2.6 2497<br />

.................................................................................. WeCPT10.6 *<br />

Hirzinger, Gerd ......................................................... MoAT4.7 177<br />

.................................................................................. TuAT6.3 973<br />

.................................................................................. TuAT7.7 1055<br />

.................................................................................. TuPL.1 *<br />

.................................................................................. TuCT6.3 1768<br />

.................................................................................. TuCT7.5 1836<br />

.................................................................................. WeAT4.8 2158<br />

.................................................................................. WeAPT10.17 *<br />

.................................................................................. WeCT4.1 3023<br />

.................................................................................. WeDT9.3 3708<br />

.................................................................................. ThBT5.7 4420<br />

.................................................................................. ThBT7.5 4502<br />

.................................................................................. ThCPT10.20 *<br />

Ho, Sean................................................................... MoBT2.5 494<br />

.................................................................................. TuAPT10.5 *<br />

Ho, Van..................................................................... TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Hochgeschwender, Nico........................................... TuAT7.3 1030<br />

Hodgson, Sean......................................................... MoBT9.1 738<br />

Hoepflinger, Mark ..................................................... MoBT5.4 562<br />

.................................................................................. TuAPT10.10 *<br />

Hoeppner, Hannes ................................................... TuCT7.5 1836<br />

.................................................................................. WeAPT10.17 *<br />

Hoff, Nicholas ........................................................... ThCT9.2 4989<br />

Hoffman, Judy .......................................................... WeBT5.7 2646<br />

Hoffman, Katie.......................................................... TuBT8.4 1479<br />

.................................................................................. TuCPT10.19 *<br />

Holenstein, Claude ................................................... ThAT2.5 3830<br />

.................................................................................. ThBPT10.5 *<br />

Hollinger, Geoffrey.................................................... WeDT6.5 3564<br />

.................................................................................. ThAPT10.14 *<br />

.................................................................................. ThBT8 CC<br />

Holz, Dirk.................................................................. ThBT2.4 4249<br />

.................................................................................. ThCPT10.4 *<br />

Homma, Michio......................................................... MoAT1.1 1<br />

.................................................................................. TuCT3.6 1693<br />

.................................................................................. WeAPT10.9 *<br />

Hong, Dennis............................................................ ThAT5.6 3963<br />

.................................................................................. ThBPT10.15 *<br />

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–241–<br />

Hoon, Kay Hiang ...................................................... ThCT7.6 4923<br />

.................................................................................. ThDPT10.21 *<br />

Hoover, Randy.......................................................... WeCT2.2 2928<br />

Horchler, Andrew...................................................... MoAT5.4 197<br />

.................................................................................. MoBPT10.13 *<br />

Hornung, Armin ........................................................ ThCT5.8 4844<br />

Hoshino, Satoshi ...................................................... WeAT8.3 2314<br />

.................................................................................. WeBT8.8 2810<br />

Hosoda, Koh............................................................. ThCT5.7 4838<br />

Hosoi, Anette ............................................................ ThAT3.7 3893<br />

Howard, Andrew ....................................................... ThBT6.2 *<br />

Howe, Robert D. ....................................................... TuBT5.3 1327<br />

.................................................................................. WeDT3.4 3468<br />

.................................................................................. ThAPT10.7 *<br />

Hrabar, Stefan .......................................................... ThCT8 C<br />

.................................................................................. ThCT8.7 4967<br />

Hsieh, M. Ani ............................................................ ThAT4 C<br />

.................................................................................. ThAT4 O<br />

.................................................................................. ThBT4 C<br />

.................................................................................. ThBT4 O<br />

.................................................................................. ThCT4 O<br />

.................................................................................. ThCT4.1 *<br />

.................................................................................. ThCT4.6 4790<br />

.................................................................................. ThDPT10.12 *<br />

Hu, Chengzhi............................................................ MoBT1.4 439<br />

.................................................................................. TuAPT10.1 *<br />

Hu, Tianjiang ............................................................ MoBT5.5 568<br />

.................................................................................. TuAPT10.11 *<br />

Hu, Yonghui.............................................................. TuCT8.1 1863<br />

Hu, Zheng................................................................. MoBT5.6 574<br />

.................................................................................. TuAPT10.12 *<br />

Huang, Guoquan ...................................................... MoAT2.3 65<br />

Huang, Han-Pang..................................................... MoAT9 C<br />

.................................................................................. MoAT9.1 372<br />

Huang, Jian .............................................................. ThCT2.1 4670<br />

Huang, Ke................................................................. MoAT4.3 149<br />

Huang, Ke Jung........................................................ TuBT8.6 1493<br />

.................................................................................. TuCPT10.21 *<br />

Huang, Shoudong..................................................... WeDT8.8 3686<br />

Huang, Shuo............................................................. TuCT2.8 1668<br />

Huang, Tzu-Hao ....................................................... MoAT9.1 372<br />

Huang, Yanjiang ....................................................... ThCT2.5 4698<br />

.................................................................................. ThDPT10.5 *<br />

Huang, Yanlong........................................................ WeDT2.7 3434<br />

Huang, Yazhou......................................................... WeBT5.8 2653<br />

Huber, Daniel............................................................ WeCT9.1 3283<br />

Huber, Markus .......................................................... WeDT1.6 3375<br />

.................................................................................. ThAPT10.3 *<br />

Hudas, Greg ............................................................. ThAT8.4 4091<br />

.................................................................................. ThBPT10.22 *<br />

Huebner, Kai............................................................. TuAT6.4 980<br />

.................................................................................. TuBPT10.13 *<br />

Huhle, Benjamin ....................................................... ThBT2.1 4229<br />

Huihua, Zhao ............................................................ TuCT5.3 1723<br />

Hunt, Alexander J ..................................................... MoAT5.6 209<br />

.................................................................................. MoBPT10.15 *<br />

Huntsberger, Terry ................................................... ThCT3 CC<br />

.................................................................................. ThCT3.3 4728<br />

Huo, Xiaoming .......................................................... WeDT5.2 3507<br />

.................................................................................. WeDT5.6 3533<br />

.................................................................................. ThAPT10.12 *<br />

Hurst, Jonathan ........................................................ ThAT1 CC<br />

.................................................................................. ThAT1.3 3758<br />

Hürzeler, Christoph................................................... WeBT6.6 2694<br />

.................................................................................. WeCPT10.18 *<br />

Hutchinson, Seth ...................................................... WeAT2.8 2070<br />

.................................................................................. WeBT9 CC<br />

.................................................................................. WeBT9.1 2817<br />

Hutter, Marco............................................................ MoBT5.4 562<br />

.................................................................................. TuAPT10.10 *<br />

Hyon, Sang-Ho......................................................... ThAT5.8 3975<br />

I<br />

Iagnemma, Karl ........................................................ MoAT7.6 298<br />

.................................................................................. MoBPT10.21 *<br />

.................................................................................. ThAT8 C<br />

.................................................................................. ThAT8.4 4091<br />

.................................................................................. ThBPT10.22 *<br />

Ichikawa, Akihiko ...................................................... MoAT1.5 25<br />

.................................................................................. MoBPT10.2 *<br />

Ideta, Aiko................................................................. WeBT7.1 2715<br />

Igarashi, Yasunobu................................................... MoAT1.2 7<br />

Iida, Fumiya .............................................................. ThDT5 C<br />

.................................................................................. ThDT5.4 5107<br />

Iimura, Taiki .............................................................. ThBT7.4 4496<br />

.................................................................................. ThCPT10.19 *<br />

Ijspeert, Auke............................................................ TuCT8.7 1901<br />

Ikeda, Atsutoshi........................................................ WeAT4.3 2127<br />

Ikeda, Seiichi ............................................................ MoBT1.4 439<br />

.................................................................................. TuAPT10.1 *<br />

.................................................................................. WeAT3.7 2115<br />

Ikeda, Takeshi .......................................................... ThDT6.4 5133<br />

Ikeda, Tetsushi ......................................................... WeDT7.1 3589<br />

Ikedo, Norio .............................................................. WeAT4.6 2145<br />

.................................................................................. WeBPT10.12 *<br />

Iliev, Boyko ............................................................... TuCT6.7 1797<br />

Iliopoulos, Konstantinos............................................ WeAT9.7 2390<br />

Imamura, Nobuaki .................................................... TuCT5.2 1717<br />

Imura, Jun-ichi .......................................................... MoAT3.2 106<br />

.................................................................................. MoAT3.6 131<br />

.................................................................................. MoBPT10.9 *<br />

Inahara, Tomoyuki.................................................... TuBT6.7 1392<br />

Inamura, Tetsunari ................................................... WeDT1 C<br />

.................................................................................. WeDT1.7 3381<br />

Ince, Gokhan ............................................................ MoAT3.2 106<br />

.................................................................................. MoAT3.6 131<br />

.................................................................................. MoAT3.8 143<br />

.................................................................................. MoBPT10.9 *<br />

Inoue, Keita .............................................................. ThBT7.4 4496<br />

.................................................................................. ThCPT10.19 *<br />

Inoue, Ryosuke......................................................... WeBT7.6 2747<br />

.................................................................................. WeCPT10.21 *<br />

Iocchi, Luca .............................................................. WeBT8.6 2796<br />

.................................................................................. WeCPT10.24 *<br />

Iribe, Masatsugu ....................................................... WeBT1.4 2423<br />

.................................................................................. WeCPT10.1 *<br />

Ishi, Carlos Toshinori................................................ MoBT3.6 536<br />

.................................................................................. MoBT3.8 550<br />

.................................................................................. TuAPT10.9 *<br />

Ishigami, Genya........................................................ MoBT6.3 601<br />

.................................................................................. ThAT8.4 4091<br />

.................................................................................. ThBPT10.22 *<br />

Ishiguro, Akio............................................................ TuCT8 C<br />

.................................................................................. TuCT8.3 1875<br />

.................................................................................. TuCT8.4 1881<br />

.................................................................................. TuCT8.6 1895<br />

.................................................................................. WeAPT10.19 *<br />

.................................................................................. WeAPT10.21 *<br />

Ishiguro, Hiroshi........................................................ MoBT3.8 550<br />

.................................................................................. WeCT1.1 2867<br />

.................................................................................. WeDT3.6 3480<br />

.................................................................................. ThAPT10.9 *<br />

Ishii, Idaku ................................................................ TuBT1.6 1208<br />

.................................................................................. TuCPT10.3 *<br />

Ishikawa, Jun............................................................ WeAT4.7 2151<br />

Ishikawa, Masatoshi ................................................. TuCT9.7 1954<br />

Iskakov, Renat.......................................................... ThBT7.5 4502<br />

.................................................................................. ThCPT10.20 *<br />

Isler, Volkan.............................................................. MoBT2.4 488<br />

.................................................................................. TuAPT10.4 *<br />

Isom, Taylor.............................................................. ThCT8.8 4975<br />

Ito, Masaki ................................................................ MoBT1.3 433<br />

Itohara, Tatsuhiko..................................................... MoAT3.4 118<br />

.................................................................................. MoBPT10.7 *<br />

Itti, Laurent................................................................ WeAT9.8 2397<br />

Ivaldi, Serena............................................................ ThBT4.3 4354<br />

Iwahashi, Naoto........................................................ MoAT8.6 350<br />

.................................................................................. MoBPT10.24 *<br />

.................................................................................. TuBT9.2 1520<br />

.................................................................................. TuBT9.5 1540<br />

.................................................................................. TuCPT10.23 *<br />

Iwasa, Shingo ........................................................... TuCT7.6 1843<br />

.................................................................................. WeAPT10.18 *<br />

Iwashita, Yumi .......................................................... WeAT2.1 2020<br />

Iwata, Hiroyasu......................................................... WeDT9.7 3734<br />

Izutsu, Masaki........................................................... WeAT4.7 2151<br />

J<br />

Jääskeläinen, Mirva.................................................. TuAT3.5 907<br />

.................................................................................. TuBPT10.8 *<br />

Jahn, Benjamin......................................................... TuAT6.3 973<br />

Jahya, Alex ............................................................... WeBT3.7 2557<br />

Jaillet, Leonard ......................................................... WeBT5.7 2646<br />

Jäkel, Rainer............................................................. ThCT1.3 4633<br />

Jakuba, Michael........................................................ MoAT6.8 261<br />

.................................................................................. TuBT2.3 1242<br />

.................................................................................. ThCT3.2 4722<br />

Jaleel, Hassan .......................................................... WeBT8.2 2772<br />

Janicek, Miroslav ...................................................... WeDT1.8 3387<br />

Janoch, Allison ......................................................... TuAT1.2 793<br />

Jäntsch, Michael....................................................... TuAT7.8 1063<br />

.................................................................................. ThAT9.5 4148<br />

.................................................................................. ThBPT10.26 *<br />

Jarquín, Gerardo ...................................................... TuCT1.8 1612<br />

Jarrault, Pierre .......................................................... WeBT7.7 2753<br />

Jens, Kristensen ....................................................... TuAT2.2 847<br />

Jensfelt, Patric .......................................................... TuBT9.1 1513<br />

Jentoft, Leif ............................................................... WeDT3.4 3468<br />

Jentoft, Leif P............................................................ ThAPT10.7 *<br />

Jeong, Jaeseung ...................................................... MoBT7.8 685<br />

Jeong, Jin Hyeok ...................................................... TuAT5.8 967<br />

Jeong, Yekeun.......................................................... WeCT9.2 3290<br />

.................................................................................. ThBT6.3 4436<br />

Jia, Peifa................................................................... WeAT9.2 2359<br />

Jia, Yangqing............................................................ TuAT1.2 793<br />

Jia, Yunyi .................................................................. MoAT4.6 171<br />

.................................................................................. MoAT4.8 184<br />

.................................................................................. MoBPT10.12 *<br />

Jia, Zhenzhong ......................................................... WeCT3.6 3004<br />

.................................................................................. WeDPT10.9 *<br />

Jiang, Jinn-Feng ....................................................... TuCT9.8 1962<br />

Jiang, Li .................................................................... TuBT8.3 1473<br />

Jo, Sungho ............................................................... MoBT7.8 685<br />

Joerg, Stefan ............................................................ TuAT7.7 1055<br />

Jog, Amod ................................................................ WeBT3.4 2539<br />

.................................................................................. WeCPT10.7 *<br />

Johansson, Rolf........................................................ ThCT2 CC<br />

.................................................................................. ThCT2.6 4704<br />

.................................................................................. ThDPT10.6 *<br />

Johns, Edward.......................................................... MoBT3.7 542<br />

Johnson, Chris.......................................................... ThCT8.8 4975<br />

Johnson, David......................................................... MoBT9.8 780<br />

Johnson-Roberson, Matthew.................................... WeDT1.1 3342<br />

Joly, Cyril .................................................................. WeCT9.6 3320<br />

.................................................................................. WeDPT10.27 *<br />

Jones, Burton ........................................................... WeCT6.5 3147<br />

.................................................................................. WeDPT10.17 *<br />

Jonker, Pieter ........................................................... TuAT6.6 995<br />

.................................................................................. TuBPT10.15 *<br />

Jørgensen, Jimmy Alison ......................................... TuAT6.5 987<br />

.................................................................................. TuBPT10.14 *<br />

Jouvencel, Bruno...................................................... WeDT6.4 3558<br />

.................................................................................. ThDT3.2 5041<br />

Ju, Wendy................................................................. WeBT1.8 2452<br />

Julian, Brian.............................................................. ThDT8.4 5187<br />

Julier, Simon Justin .................................................. ThAT2.8 3850<br />

Jung, Eui-jung........................................................... WeDT7.2 3595<br />

Jung, Jinwoo............................................................. ThDT7.1 5139<br />

Jung, Jiyoung ........................................................... WeCT9.2 3290<br />

K<br />

Kadivar, Zahra.......................................................... TuCT5.1 1711<br />

Kagami, Satoshi ....................................................... ThBT5 O<br />

.................................................................................. ThCT5 O<br />

Kahn, Jeff ................................................................. MoBT5.7 580<br />

Kajita, Shuuji............................................................. WeAT1.6 2000<br />

.................................................................................. WeBPT10.3 *<br />

.................................................................................. ThAT5.1 *<br />

.................................................................................. ThBT5.3 4392<br />

.................................................................................. ThBT5.8 4428<br />

Kalakrishnan, Mrinal................................................. MoAT8.2 325<br />

.................................................................................. MoAT8.8 365<br />

.................................................................................. ThCT1.4 4639<br />

.................................................................................. ThDPT10.1 *<br />

Kallem, Vinutha ........................................................ WeCT5.8 3126<br />

Kallman, Jason......................................................... ThCT8.8 4975<br />

Kallmann, Marcelo.................................................... WeBT5.8 2653<br />

Kam, Moshe ............................................................. WeAT8.2 2306<br />

Kamamichi, Norihiro ................................................. WeAT4.7 2151<br />

Kamezaki, Mitsuhiro ................................................. WeDT9.7 3734<br />

Kammel, Sören......................................................... TuBT7.2 1413<br />

Kanade, Takeo ......................................................... WeCT9.1 3283<br />

Kanani, Keyvan ........................................................ MoBT6.6 619<br />

.................................................................................. TuAPT10.15 *<br />

Kanbayashi, Takashi ................................................ ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Kanehira, Noriyuki .................................................... ThBT5.4 4400<br />

.................................................................................. ThCPT10.13 *<br />

Kanehiro, Fumio ....................................................... ThBT5.3 4392<br />

.................................................................................. ThBT5.4 4400<br />

.................................................................................. ThBT5.8 4428<br />

.................................................................................. ThCPT10.13 *<br />

Kaneko, Kenji ........................................................... ThBT5.3 4392<br />

.................................................................................. ThBT5.4 4400<br />

.................................................................................. ThBT5.8 4428<br />

.................................................................................. ThCPT10.13 *<br />

Kaneko, Makoto........................................................ TuBT6.6 1386<br />

.................................................................................. TuBT6.7 1392<br />

.................................................................................. TuCPT10.15 *<br />

.................................................................................. ThAT7.4 4048<br />

.................................................................................. ThBPT10.19 *<br />

Kang, Rongjie........................................................... ThAT7.5 4054<br />

.................................................................................. ThBPT10.20 *<br />

Kang, Sungchul ........................................................ TuCT7.8 1857<br />

Kano, Takeshi........................................................... TuCT8.3 1875<br />

.................................................................................. TuCT8.4 1881<br />

.................................................................................. TuCT8.6 1895<br />

.................................................................................. WeAPT10.19 *<br />

.................................................................................. WeAPT10.21 *<br />

Kanou, Kazuki .......................................................... TuAT5.6 955<br />

.................................................................................. TuBPT10.12 *<br />

Kanoulas, Dimitrios................................................... TuBT7.6 1439<br />

.................................................................................. TuCPT10.18 *<br />

Kantor, George......................................................... MoBT2.3 482<br />

.................................................................................. TuBT5.7 1353<br />

.................................................................................. ThDT7.2 5147<br />

Kao, Ching-Chung.................................................... WeCT2.8 2965<br />

Kapadia, Apoorva..................................................... TuAT8.4 1087<br />

.................................................................................. TuBPT10.19 *<br />

Kapoor, Ankur........................................................... MoBT7.1 639<br />

Kappler, Daniel......................................................... TuCT6.2 1761<br />

Karaman, Sertac....................................................... WeDT5.3 3513<br />

.................................................................................. ThBT3.4 4307<br />

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–243–<br />

.................................................................................. ThCPT10.7 *<br />

Karaoguz, Cem......................................................... TuBT1.3 1187<br />

Karayev, Sergey ....................................................... TuAT1.2 793<br />

Karino, Takahiro ....................................................... ThBT8.4 4550<br />

.................................................................................. ThCPT10.22 *<br />

Karpelson, Michael ................................................... ThDT4.3 5073<br />

Karssen, Daniël ........................................................ ThAT5.5 3957<br />

.................................................................................. ThBPT10.14 *<br />

Kashioka, Hideki....................................................... MoAT8.6 350<br />

.................................................................................. MoBPT10.24 *<br />

Kashiwazaki, Koshi................................................... ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Katayama, Daiki ....................................................... WeAT7.6 2280<br />

.................................................................................. WeBPT10.21 *<br />

Kato, Shin ................................................................. ThAT8.7 4109<br />

Katsev, Max.............................................................. WeBT2.8 2511<br />

Katsuki, Yoshio......................................................... ThDT6.4 5133<br />

Kavraki, Lydia ........................................................... ThCT1.1 4621<br />

Kawahara, Tomohiro ................................................ TuBT3.7 1309<br />

Kawakami, Daiki ....................................................... MoAT1.5 25<br />

.................................................................................. MoBPT10.2 *<br />

Kawamura, Akihiro ................................................... ThBT1.5 4201<br />

.................................................................................. ThCPT10.2 *<br />

Kawamura, Sadao .................................................... ThCT3.7 4756<br />

Kawashima, Kenji ..................................................... TuAT5.2 931<br />

Kawato, Mitsuo ......................................................... ThAT5.8 3975<br />

Kazakidi, Asimina ..................................................... ThAT7.5 4054<br />

.................................................................................. ThBPT10.20 *<br />

Kazanzides, Peter .................................................... MoBT7.1 639<br />

Kazerooni, Homayoon .............................................. ThCT7.4 4911<br />

.................................................................................. ThDPT10.19 *<br />

Kazmitcheff, Guillaume............................................. WeBT3.3 2532<br />

Keith, François.......................................................... ThAT3.6 3887<br />

.................................................................................. ThBPT10.9 *<br />

Keller, Thomas ......................................................... WeDT1.8 3387<br />

Kelly, Alonzo............................................................. MoAT6 O<br />

.................................................................................. MoBT6 O<br />

.................................................................................. WeAT5.4 2172<br />

.................................................................................. WeBPT10.13 *<br />

Kelly, Jonathan ......................................................... WeCT9 C<br />

Kendoul, Farid .......................................................... ThCT8.5 4953<br />

.................................................................................. ThDPT10.23 *<br />

Kermani, Mehrdad R. ............................................... WeBT3.6 2551<br />

.................................................................................. WeCPT10.9 *<br />

Kermorgant, Olivier................................................... MoBT9.6 768<br />

.................................................................................. TuAPT10.24 *<br />

.................................................................................. WeBT9.6 2849<br />

.................................................................................. WeCPT10.27 *<br />

Kersbergen, Bart ...................................................... WeBT7.3 2729<br />

Keshmiri, Mehdi........................................................ ThAT1.2 3752<br />

Keuning, Jasper David ............................................. MoBT1.1 421<br />

Khansari-Zadeh, Seyed Mohammad........................ MoBT8.4 710<br />

.................................................................................. TuAPT10.19 *<br />

Khatib, Oussama ...................................................... TuAT7.4 1036<br />

.................................................................................. TuBPT10.16 *<br />

.................................................................................. TuCT7.4 1830<br />

.................................................................................. WeAPT10.16 *<br />

.................................................................................. WeCT3.4 2992<br />

.................................................................................. WeCT3.5 2998<br />

.................................................................................. WeDPT10.7 *<br />

.................................................................................. WeDPT10.8 *<br />

Kheddar, Abderrahmane .......................................... ThAT3.6 3887<br />

.................................................................................. ThBT5.6 4414<br />

.................................................................................. ThBPT10.9 *<br />

.................................................................................. ThCPT10.15 *<br />

Khushaba, Rami ....................................................... ThBT9.7 4608<br />

Kielhöfer, Simon ....................................................... TuCT9.4 1933<br />

.................................................................................. WeAPT10.22 *<br />

Kiesler, Sara ............................................................. WeAT1.4 1986<br />

.................................................................................. WeBPT10.1 *<br />

Kiguchi, Kazuo.......................................................... TuCT5.8 1755<br />

Kim, Ayoung ............................................................. TuCT2.5 1647<br />

.................................................................................. WeAPT10.5 *<br />

Kim, Chunwoo .......................................................... TuAT5.4 943<br />

.................................................................................. TuBPT10.10 *<br />

Kim, Chyon Hae ....................................................... ThAT1.8 3792<br />

Kim, Dongwook ........................................................ TuCT3.3 1674<br />

Kim, Eunyoung ......................................................... ThAT2.1 3800<br />

Kim, H. Jin ................................................................ WeBT8.5 2790<br />

.................................................................................. WeCPT10.23 *<br />

Kim, James Dokyoon................................................ WeCT9.2 3290<br />

Kim, Jung.................................................................. ThBT7 CC<br />

.................................................................................. ThBT7.7 4516<br />

Kim, Junsuk.............................................................. WeCT4.3 3039<br />

Kim, Keehoon........................................................... ThBT7.2 4483<br />

Kim, Min Jeong......................................................... TuBT7.7 1447<br />

Kim, Minjun............................................................... WeBT3.2 2524<br />

Kim, Sangbae........................................................... ThAT5.5 3957<br />

.................................................................................. ThBPT10.14 *<br />

Kim, Sungmin ........................................................... WeBT3.2 2524<br />

Kim, Ui-Hyun ............................................................ WeCT1.7 2910<br />

Kim, Woojin .............................................................. WeBT8.5 2790<br />

.................................................................................. WeCPT10.23 *<br />

Kim, Yeongjin ........................................................... ThBT7.7 4516<br />

Kim, Young Soo........................................................ WeBT3.2 2524<br />

Kim, Youngmoo........................................................ WeCT1.8 2916<br />

King, Raymond......................................................... ThCT8.8 4975<br />

Kinnaert, Michel........................................................ ThBT7.1 4477<br />

Kinsey, James .......................................................... MoAT6.8 261<br />

.................................................................................. ThCT3.2 4722<br />

Kirchner, Frank......................................................... ThAT9 CC<br />

.................................................................................. ThAT9.3 4134<br />

Kirchner, Nathan....................................................... WeCT1.6 2903<br />

.................................................................................. WeDPT10.3 *<br />

Kit, Dmitry................................................................. TuBT1.4 1194<br />

.................................................................................. TuCPT10.1 *<br />

Kitt, Bernd................................................................. MoAT6.3 227<br />

Klappstein, Jens ....................................................... ThBT9.4 4587<br />

.................................................................................. ThCPT10.25 *<br />

Kleiner, Alexander .................................................... WeCT8.8 3276<br />

Klimchik, Alexandr.................................................... ThAT7.2 4034<br />

Klodmann, Julian...................................................... WeDT9.3 3708<br />

Klotzbuecher, Markus............................................... ThCT2.3 4684<br />

Kneip, Laurent .......................................................... MoAT2.5 79<br />

.................................................................................. MoBPT10.5 *<br />

.................................................................................. WeAT6.7 2235<br />

.................................................................................. WeBT6.6 2694<br />

.................................................................................. WeCPT10.18 *<br />

Knepper, Ross A ...................................................... ThAT3.1 3856<br />

Knoll, Alois................................................................ TuAT7.8 1063<br />

.................................................................................. WeDT1.6 3375<br />

.................................................................................. ThAT9.5 4148<br />

.................................................................................. ThAPT10.3 *<br />

.................................................................................. ThBPT10.26 *<br />

Ko, Hanseok............................................................. MoAT3 O<br />

.................................................................................. MoBT3 O<br />

Kobayashi, Ryo ........................................................ TuCT8.3 1875<br />

Kobayashi, Yo .......................................................... TuAT5.6 955<br />

.................................................................................. TuBPT10.12 *<br />

Kober, Jens .............................................................. MoAT8.4 338<br />

.................................................................................. MoBPT10.22 *<br />

Kocamaz, Mehmet.................................................... ThAT8.3 4084<br />

Kodagoda, Sarath..................................................... ThBT9.7 4608<br />

Kodama, Atsushi ...................................................... MoAT7.1 268<br />

Koehler, Sarah.......................................................... WeCT5.7 3120<br />

Koepl, Devin ............................................................. ThAT1.3 3758<br />

Koh, Kyoung-Chul .................................................... TuAT5.8 967<br />

Kohda, Takehisa....................................................... WeAT7.5 2274<br />

.................................................................................. WeAT7.6 2280<br />

.................................................................................. WeBPT10.20 *<br />

.................................................................................. WeBPT10.21 *<br />

Kohler, Norman ........................................................ ThCT8.3 4941<br />

Kojima, Masaru......................................................... MoAT1.1 1<br />

.................................................................................. MoBT1.3 433<br />

.................................................................................. MoBT1.7 457<br />

.................................................................................. TuCT3.6 1693<br />

.................................................................................. WeAPT10.9 *<br />

Kokuryu, Saori .......................................................... WeAT4.7 2151<br />

Komori, Kimihiro ....................................................... WeAT3.7 2115<br />

Kondak, Konstantin .................................................. WeDT3.5 3474<br />

.................................................................................. ThAPT10.8 *<br />

Kondo, Daisuke ........................................................ TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Kondo, Hayato.......................................................... WeCT6 O<br />

.................................................................................. WeDT6 O<br />

Kondo, Hideki ........................................................... WeCT7.8 3221<br />

Kondo, Takao ........................................................... MoAT1.1 1<br />

Kong, Kyoungchul .................................................... WeBT2.3 2474<br />

Kong, Zhaodan ......................................................... WeCT5.3 3093<br />

Konietschke, Rainer ................................................. WeDT9.3 3708<br />

Königs, Achim........................................................... WeDT7.5 3614<br />

.................................................................................. ThAPT10.17 *<br />

Konolige, Kurt ........................................................... ThBT2.4 4249<br />

.................................................................................. ThCPT10.4 *<br />

Konyo, Masashi ........................................................ WeDT3 CC<br />

.................................................................................. WeDT3.7 3488<br />

.................................................................................. WeDT3.8 3494<br />

Koo, Ja Choon.......................................................... TuBT7.7 1447<br />

Kootstra, Gert ........................................................... TuAT6.5 987<br />

.................................................................................. TuBPT10.14 *<br />

Kormushev, Petar..................................................... MoAT8 CC<br />

.................................................................................. MoAT8.1 318<br />

Koropouli, Vasiliki ..................................................... MoAT8.5 344<br />

.................................................................................. MoBPT10.23 *<br />

Kosecka, Jana .......................................................... WeAT9 C<br />

.................................................................................. WeAT9.6 2383<br />

.................................................................................. WeBPT10.27 *<br />

Koseki, Yoshihiko ..................................................... WeBT4.3 2584<br />

Kosmatopoulos, Elias ............................................... TuCT2.7 1661<br />

Kossyk, Ingo ............................................................. WeDT3.5 3474<br />

.................................................................................. ThAPT10.8 *<br />

Kosuge, Kazuhiro ..................................................... MoAT7.8 311<br />

.................................................................................. WeAT1.7 2008<br />

.................................................................................. ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Kotoku, Tetsuo ......................................................... TuAT7.5 1042<br />

.................................................................................. TuBPT10.17 *<br />

Kovecses, Jozsef...................................................... MoAT7.3 280<br />

.................................................................................. WeDT9 CC<br />

.................................................................................. WeDT9.8 3740<br />

Kraetzschmar, Gerhard ............................................ TuAT7.3 1030<br />

Kragic, Danica .......................................................... TuAT1.7 827<br />

.................................................................................. TuAT6.4 980<br />

.................................................................................. TuAT6.5 987<br />

.................................................................................. TuBT9 C<br />

.................................................................................. TuBT9.7 1554<br />

.................................................................................. TuBPT10.13 *<br />

.................................................................................. TuBPT10.14 *<br />

.................................................................................. WeAT2.2 2028<br />

.................................................................................. WeDT1.1 3342<br />

Kramer, Rebecca...................................................... MoAT9.8 414<br />

.................................................................................. TuCT9.2 1919<br />

Krauthausen, Peter................................................... ThCT5.4 4819<br />

.................................................................................. ThDPT10.13 *<br />

Kress-Gazit, Hadas .................................................. WeCT5 C<br />

.................................................................................. WeCT5 O<br />

.................................................................................. WeCT5.7 3120<br />

Kretzschmar, Henrik ................................................. MoAT2.6 86<br />

.................................................................................. MoBPT10.6 *<br />

.................................................................................. TuAT2.5 865<br />

.................................................................................. TuBPT10.5 *<br />

Krid, Mohamed ......................................................... MoAT7.2 274<br />

Krieger, Kai............................................................... TuCT6.6 1789<br />

.................................................................................. WeAPT10.15 *<br />

Krishna, Madhava..................................................... ThBT3.5 4314<br />

.................................................................................. ThCPT10.8 *<br />

Kristan, Matej............................................................ WeDT1.8 3387<br />

Kroemer, Oliver ........................................................ TuBT9.6 1548<br />

.................................................................................. TuCPT10.24 *<br />

Kronander, Klas........................................................ MoBT8.4 710<br />

.................................................................................. TuAPT10.19 *<br />

Krontiris, Athanasios................................................. WeCT8.2 3235<br />

Kroschel, Kristian...................................................... TuBT1.1 1173<br />

Krug, Robert ............................................................. TuCT6.7 1797<br />

Krüger, Norbert......................................................... TuAT6.5 987<br />

.................................................................................. TuBPT10.14 *<br />

Kruijff, Geert-Jan ...................................................... WeDT1.8 3387<br />

Krupa, Alexandre...................................................... WeBT9.3 2831<br />

.................................................................................. WeBT9.4 2837<br />

.................................................................................. WeCPT10.25 *<br />

Krut, Sebastien......................................................... ThDT4.2 5067<br />

Kryczka, Przemyslaw ............................................... WeCT7.8 3221<br />

Kuan, Jiun-Yih .......................................................... MoAT9.1 372<br />

Kubus, Daniel ........................................................... WeBT2.4 2481<br />

.................................................................................. WeCPT10.4 *<br />

Kuchenbecker, Katherine J. ..................................... WeAT4 C<br />

.................................................................................. WeAT4 O<br />

.................................................................................. WeBT4 CC<br />

.................................................................................. WeBT4 O<br />

.................................................................................. WeCT4 O<br />

Kuehn, Benjamin ...................................................... TuBT1.1 1173<br />

Kuehne, Hildegard.................................................... ThCT5.4 4819<br />

.................................................................................. ThDPT10.13 *<br />

Kuemmerle, Rainer................................................... WeDT9.4 3716<br />

.................................................................................. ThAPT10.22 *<br />

Kuffner, James ......................................................... WeAT5.8 2199<br />

Kumar, Rajesh.......................................................... WeBT3.4 2539<br />

.................................................................................. WeCPT10.7 *<br />

Kumar, Vijay ............................................................. WeBT6.2 2668<br />

.................................................................................. WeCT5.8 3126<br />

.................................................................................. WeCT8.4 3248<br />

.................................................................................. WeDT8.5 3667<br />

.................................................................................. WeDPT10.22 *<br />

.................................................................................. ThAPT10.20 *<br />

Kumon, Makoto ........................................................ MoAT3.3 112<br />

Kuniyoshi, Yasuo...................................................... TuBT1.7 1214<br />

.................................................................................. TuBT8.7 1499<br />

Kunze, Lars .............................................................. WeCT7.1 3172<br />

Kunze, Mirko............................................................. TuCT6.6 1789<br />

.................................................................................. WeAPT10.15 *<br />

Kuperij, Nicole .......................................................... TuAT5.3 937<br />

Kurazume, Ryo......................................................... WeAT2.1 2020<br />

.................................................................................. ThBT1.5 4201<br />

.................................................................................. ThCPT10.2 *<br />

Kurdila, Andrew ........................................................ TuAT9.4 1140<br />

.................................................................................. TuBPT10.22 *<br />

Kurita, Yuichi ............................................................ WeAT4.3 2127<br />

Kurniawati, Hanna .................................................... WeDT6.3 3551<br />

Kushleyev, Aleksandr............................................... TuAT9.5 1146<br />

.................................................................................. TuBPT10.23 *<br />

Kuthirummal, Sujit .................................................... ThAT3.4 3874<br />

.................................................................................. ThBPT10.7 *<br />

Kuwahara, Hiroyuki .................................................. ThBT8.4 4550<br />

.................................................................................. ThCPT10.22 *<br />

Kuwata, Yoshiaki...................................................... ThCT3.3 4728<br />

Kwak, Kiho................................................................ WeCT9.1 3283<br />

Kweon, In So ............................................................ WeCT9.2 3290<br />

.................................................................................. ThBT6.3 4436<br />

Kwon, Dong-Soo ...................................................... WeBT7 C<br />

.................................................................................. WeBT7.8 2759<br />

Kyrkjebø, Erik ........................................................... MoAT6.6 247<br />

.................................................................................. MoBPT10.18 *<br />

.................................................................................. ThCT2.2 4676<br />

L<br />

Labbé, Mathieu......................................................... TuBT2.7 1271<br />

Lacerda, Bruno ......................................................... WeCT5.1 3081<br />

Ladjal, Hamid............................................................ TuBT3.8 1315<br />

Lakaemper, Rolf ....................................................... ThAT2.2 3808<br />

Lal, Sara ................................................................... ThBT9.7 4608<br />

Lalande, Viviane ....................................................... TuAT3.4 901<br />

.................................................................................. TuBT3.3 1285<br />

.................................................................................. TuBPT10.7 *<br />

Lallée, Stéphane....................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Lam, Tin Lun............................................................. TuBT8.5 1487<br />

.................................................................................. TuCPT10.20 *<br />

Lambercy, Olivier...................................................... WeCT4.8 3074<br />

Lambert, Ghislain ..................................................... TuCT7.7 1849<br />

Lamiraux, Florent...................................................... ThBT5.5 4408<br />

.................................................................................. ThCPT10.14 *<br />

Lampert, Christoph H. .............................................. MoAT8.3 332<br />

Lamy-Perbal, Sylvie.................................................. WeBT2.7 2505<br />

Lan, Chao-Chieh ...................................................... TuCT9.8 1962<br />

Lane, Ian................................................................... MoAT3 CC<br />

.................................................................................. MoAT3 O<br />

.................................................................................. MoBT3 O<br />

.................................................................................. MoBT3.1 *<br />

Langner, Tim ............................................................ WeBT1.5 2430<br />

Lapeyre, Matthieu..................................................... TuBT8.2 1465<br />

Lapierre, Lionel......................................................... WeDT6.4 3558<br />

Laschi, Cecilia .......................................................... ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Latscha, Stella .......................................................... MoAT9.7 408<br />

Lau, Haye ................................................................. WeDT8.8 3686<br />

Laugier, Christian ..................................................... WeAT1.8 2014<br />

Laurent, Guillaume J. ............................................... ThDT6.2 5121<br />

LaValle, Steven M. ................................................... WeAT5.8 2199<br />

.................................................................................. WeBT2.8 2511<br />

.................................................................................. WeBT5 CC<br />

.................................................................................. WeBT5 O<br />

.................................................................................. WeCT5.4 3101<br />

.................................................................................. WeDPT10.13 *<br />

.................................................................................. ThAT3.2 3862<br />

Lavoie, Samuel......................................................... TuCT7.7 1849<br />

Lawitzky, Martin........................................................ WeBT1.3 2416<br />

Lazea, Gheorghe ...................................................... TuBT2.5 1256<br />

.................................................................................. TuCPT10.5 *<br />

Lazewatsky, Daniel................................................... ThAT8.8 4115<br />

Le, Minh-Quyen ........................................................ MoBT7.4 659<br />

.................................................................................. MoBT9.1 738<br />

.................................................................................. TuAPT10.16 *<br />

Le Fort-Piat, Nadine ................................................. ThDT6.2 5121<br />

Lea, Colin ................................................................. ThAT2.3 3816<br />

Lebastard, Vincent.................................................... TuCT8.7 1901<br />

Lee, C. S. George..................................................... WeAT8 C<br />

.................................................................................. WeAT8.1 2300<br />

Lee, Daniel D............................................................ WeAT5.6 2186<br />

.................................................................................. WeBPT10.15 *<br />

.................................................................................. ThAT5.6 3963<br />

.................................................................................. ThBPT10.15 *<br />

Lee, Deukhee ........................................................... WeBT9.3 2831<br />

Lee, Dongheui .......................................................... MoAT8.5 344<br />

.................................................................................. MoBPT10.23 *<br />

.................................................................................. TuAT6.7 1002<br />

.................................................................................. WeBT1.3 2416<br />

.................................................................................. WeDT7.7 3626<br />

Lee, Dongjun ............................................................ MoAT4.3 149<br />

.................................................................................. WeCT4.3 3039<br />

Lee, Gim Hee ........................................................... TuCT2.6 1655<br />

.................................................................................. WeAPT10.6 *<br />

.................................................................................. ThAT6.7 4012<br />

Lee, Jae Hoon .......................................................... WeDT7.2 3595<br />

Lee, James............................................................... ThAT2.3 3816<br />

Lee, Ji Yeong ........................................................... ThDT6.3 5127<br />

Lee, Jongwon ........................................................... WeBT3.2 2524<br />

Lee, Min Kyung......................................................... WeAT1.4 1986<br />

.................................................................................. WeBPT10.1 *<br />

Lee, Seong-Whan..................................................... WeCT4.3 3039<br />

Lee, Seongsoo ......................................................... TuAT2.1 841<br />

Lee, Seung Hwan..................................................... TuAT5.8 967<br />

Lee, Tae-kyeong....................................................... TuAT2.1 841<br />

Lee, Yi-Chiao............................................................ TuCT9.8 1962<br />

Leeper, Adam Eric.................................................... WeBT4.1 2570<br />

Legrand, Bernard...................................................... MoAT1.8 45<br />

Lehman, Joel............................................................ WeBT8.7 2802<br />

Lehnert, Christopher................................................. MoBT8.1 692<br />

Leishman, Robert ..................................................... WeBT6.5 2688<br />

.................................................................................. WeCPT10.17 *<br />

.................................................................................. ThDT8.2 5173<br />

Lemaignan, Séverin.................................................. WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Lemerle, Pierre......................................................... WeCT7.7 3213<br />

Lenain, Roland ......................................................... ThAT8.1 4072<br />

.................................................................................. ThBT9.1 4569<br />

Lenz, Alexander........................................................ WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Lenz, Claus............................................................... WeDT1.6 3375<br />

.................................................................................. ThAPT10.3 *<br />

Leon, Lisandro.......................................................... TuBT5.2 1321<br />

Leu, Adrian ............................................................... TuAT1.1 786<br />

Leutenegger, Stefan................................................. WeAT6.5 2223<br />

.................................................................................. WeBPT10.17 *<br />

Leven, Severin.......................................................... ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Lewinger, William ..................................................... MoAT5.7 215<br />

Lewis, Bennie ........................................................... WeBT4.4 2590<br />

.................................................................................. WeCPT10.10 *<br />

Li, Baopu .................................................................. WeAT3.3 2090<br />

Li, Bin........................................................................ TuCT8.2 1869<br />

Li, Howard ................................................................ TuAT2.3 853<br />

.................................................................................. TuAT2.7 880<br />

Li, Jinglin................................................................... ThBT1.6 4207<br />

.................................................................................. ThCPT10.3 *<br />

Li, Peng .................................................................... ThBT1.8 4222<br />

Li, Xiaodong.............................................................. TuCT3.8 1705<br />

Li, Xiaofei.................................................................. WeCT1.3 2879<br />

Li, Xin........................................................................ MoAT4.8 184<br />

Li, Y.F. ...................................................................... WeDT2.8 3440<br />

Li, Yuanhong ............................................................ ThBT9.2 4575<br />

Li, Zhanchu............................................................... TuBT8.3 1473<br />

Liang, Dong .............................................................. MoBT3.8 550<br />

Liang, Jianhong ........................................................ TuCT8.1 1863<br />

Libby, Tom................................................................ TuCT8.5 1887<br />

.................................................................................. WeAPT10.20 *<br />

Lidholm, Jörgen........................................................ TuCT2.2 1626<br />

Liemhetcharat, Somchaya........................................ WeDT8.1 3638<br />

Lien, Jyh-Ming .......................................................... TuCT1 CC<br />

.................................................................................. TuCT1.2 1573<br />

.................................................................................. WeBT5.4 2626<br />

.................................................................................. WeCPT10.13 *<br />

Likhachev, Maxim..................................................... TuAT9.5 1146<br />

.................................................................................. TuBPT10.23 *<br />

.................................................................................. WeAT5 CC<br />

.................................................................................. WeAT5 O<br />

.................................................................................. WeCT8.5 3254<br />

.................................................................................. WeDT5 C<br />

.................................................................................. WeDT5 O<br />

.................................................................................. WeDPT10.23 *<br />

Lim, Hun-ok .............................................................. WeCT7.8 3221<br />

Lim, Hup Boon.......................................................... ThCT7.6 4923<br />

.................................................................................. ThDPT10.21 *<br />

Lim, Jongwoo ........................................................... TuCT2.1 1618<br />

.................................................................................. ThAT6.3 3982<br />

Lima, Pedro .............................................................. WeCT5.1 3081<br />

Lin, Pei-Chun............................................................ TuBT8.6 1493<br />

.................................................................................. TuCPT10.21 *<br />

.................................................................................. ThDT4 CC<br />

Lin, Yun .................................................................... WeDT1.2 3349<br />

Lind, Morten.............................................................. WeBT9.8 2861<br />

Lindsey, Quentin....................................................... WeBT6.2 2668<br />

Lippiello, Vincenzo.................................................... ThBT1.4 4194<br />

.................................................................................. ThCPT10.1 *<br />

Lipson, Hod .............................................................. ThBT4.5 4366<br />

.................................................................................. ThCPT10.11 *<br />

Little, James J........................................................... WeBT9.2 2825<br />

.................................................................................. ThCT6.8 4885<br />

Liu, Albert ................................................................. TuCT3.3 1674<br />

Liu, Chao .................................................................. WeDT6.4 3558<br />

.................................................................................. ThAPT10.13 *<br />

Liu, Dikai................................................................... WeDT8.8 3686<br />

Liu, Hong .................................................................. WeCT1.3 2879<br />

Liu, Jindong .............................................................. MoBT3.7 542<br />

Liu, Jingmeng ........................................................... MoBT9.2 744<br />

Liu, Ming ................................................................... ThAT2.4 3824<br />

.................................................................................. ThBPT10.4 *<br />

Liu, Wen ................................................................... TuBT5.6 1346<br />

.................................................................................. TuCPT10.12 *<br />

Liu, Yen-Chen........................................................... MoBT7.7 679<br />

Liu, Ziyuan ................................................................ WeDT7.7 3626<br />

Lofaro, Daniel ........................................................... WeCT1.8 2916<br />

Loiselle, Stéphane .................................................... MoBT3.3 518<br />

Long, Jonathan......................................................... TuAT1.2 793<br />

Lopes, Ana ............................................................... WeBT1.6 2438<br />

.................................................................................. WeCPT10.3 *<br />

Lopes, Gabriel .......................................................... WeBT7.3 2729<br />

Low, K. H. ................................................................. MoAT5 O<br />

.................................................................................. MoBT5 C<br />

.................................................................................. MoBT5 O<br />

.................................................................................. MoBT5.5 568<br />

.................................................................................. TuCT5.6 1743<br />

.................................................................................. WeAPT10.12 *<br />

.................................................................................. ThCT7.6 4923<br />

.................................................................................. ThDPT10.21 *<br />

Lozano-Perez, Tomas .............................................. WeAT5.1 *<br />

Lu, Wenyu ................................................................ ThBT3.6 4321<br />

.................................................................................. ThCPT10.9 *<br />

Lu, Yan ..................................................................... ThAT8.3 4084<br />

Lu, Yanyan ............................................................... WeBT5.4 2626<br />

.................................................................................. WeCPT10.13 *<br />

Lu, Yibiao.................................................................. WeDT5.2 3507<br />

Lu, Zhiguo................................................................. ThDT5.2 5094<br />

Luber, Matthias......................................................... ThAT2.7 3844<br />

Lucas, Blake ............................................................. TuBT5.6 1346<br />

.................................................................................. TuCPT10.12 *<br />

Luettel, Thorsten....................................................... ThBT8.6 4562<br />

.................................................................................. ThCPT10.24 *<br />

Lumia, Ron ............................................................... WeAT8 CC<br />

.................................................................................. WeAT8.6 2333<br />

Luna, Ryan ............................................................... WeCT8.7 3268<br />

Luo, Guoliang ........................................................... WeAT2.2 2028<br />

Luo, Ren ................................................................... WeCT2.8 2965<br />

Lupashin, Sergei....................................................... ThDT6.1 5113<br />

.................................................................................. ThDT8.3 5179<br />

Luu, Trieu Phat ......................................................... ThCT7.6 4923<br />

.................................................................................. ThDPT10.21 *<br />

Ly, Olivier.................................................................. TuBT8 CC<br />

.................................................................................. TuBT8.2 1465<br />

M<br />

Ma, Shugen .............................................................. TuCT8.2 1869<br />

.................................................................................. ThDT3 CC<br />

.................................................................................. ThDT3.1 5035<br />

Ma, Yan .................................................................... ThCT8.6 4961<br />

.................................................................................. ThDPT10.24 *<br />

MacAllister, Brian...................................................... TuAT9.5 1146<br />

.................................................................................. TuBPT10.23 *<br />

Macasieb, Juan Antonio Miguel Lim......................... ThCT4.7 4797<br />

Macdonald, John ...................................................... ThDT8.2 5173<br />

MacLachlan, Robert A.............................................. ThDT7.4 5160<br />

Macwan, Ashish ....................................................... ThBT8.3 4544<br />

Madhavan, Raj ......................................................... ThDT6 C<br />

Mae, Yasushi............................................................ MoAT1.5 25<br />

.................................................................................. MoBT1.2 427<br />

.................................................................................. MoBPT10.2 *<br />

.................................................................................. TuAT1.4 807<br />

.................................................................................. TuBPT10.1 *<br />

Maeda, Guilherme Jorge.......................................... MoBT8.6 725<br />

.................................................................................. TuAPT10.21 *<br />

Magnenat, Stéphane ................................................ ThAT2.4 3824<br />

.................................................................................. ThBPT10.4 *<br />

Magnusson, Martin................................................... ThAT8.2 4078<br />

Mahmudi, Mentar...................................................... WeBT5.8 2653<br />

Mahnič, Marko.......................................................... WeDT1.8 3387<br />

Mahon, Ian................................................................ ThBT6.6 4455<br />

.................................................................................. ThCPT10.18 *<br />

Mahoor, Mohammad ................................................ WeAT9 CC<br />

.................................................................................. WeAT9.5 2377<br />

.................................................................................. WeBPT10.26 *<br />

Mair, Elmar ............................................................... WeCT9.3 3297<br />

Mairiaux, Estelle ....................................................... MoAT1.8 45<br />

Majdik, Andras.......................................................... TuBT2.5 1256<br />

.................................................................................. TuCPT10.5 *<br />

Majidi, Carmel........................................................... TuCT9.2 1919<br />

Makikawa, Masaaki .................................................. TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Malosio, Matteo ........................................................ WeCT9.7 3327<br />

Malvezzi, Monica...................................................... TuAT6.8 1008<br />

Malysz, Pawel........................................................... MoBT7.2 645<br />

Mansard, Nicolas...................................................... ThAT3.6 3887<br />

.................................................................................. ThAT9.2 4127<br />

.................................................................................. ThBPT10.9 *<br />

Mansour, Darine....................................................... WeCT7.7 3213<br />

Manz, Michael .......................................................... ThBT8.6 4562<br />

.................................................................................. ThCPT10.24 *<br />

Marble, James.......................................................... ThBT3.2 4292<br />

Marchal-Crespo, Laura............................................. WeCT4.7 3068<br />

Marchand, Eric ......................................................... MoBT6.6 619<br />

.................................................................................. TuAPT10.15 *<br />

.................................................................................. ThCT8.1 4929<br />

Marchese, Andrew.................................................... MoBT9.4 756<br />

.................................................................................. TuAPT10.22 *<br />

Marchetti, Luca......................................................... WeBT8.6 2796<br />

.................................................................................. WeCPT10.24 *<br />

Marino, Alessandro................................................... WeBT8.3 2778<br />

Mariottini, Gian Luca................................................. WeAT6.6 2229<br />

.................................................................................. WeAT9.4 2371<br />

.................................................................................. WeBPT10.18 *<br />

.................................................................................. WeBPT10.25 *<br />

.................................................................................. WeDT7 C<br />

.................................................................................. WeDT7.4 3608<br />

.................................................................................. ThAPT10.16 *<br />

Marks, Tim K. ........................................................... ThCT2.4 4690<br />

.................................................................................. ThDPT10.4 *<br />

Marques, Hugo......................................................... TuAT7.8 1063<br />

Marques, Lino........................................................... ThBT2.8 4277<br />

Martel, Sylvain.......................................................... TuAT3 C<br />

.................................................................................. TuAT3 O<br />

.................................................................................. TuAT3.4 901<br />

.................................................................................. TuBT3 CC<br />

.................................................................................. TuBT3 O<br />

.................................................................................. TuBT3.3 1285<br />

.................................................................................. TuBT3.6 1304<br />

.................................................................................. TuBPT10.7 *<br />

.................................................................................. TuCT3 C<br />

.................................................................................. TuCT3 O<br />

.................................................................................. TuCPT10.9 *<br />

Martin, Christian ....................................................... WeBT1.5 2430<br />

Martinelli, Agostino ................................................... WeBT2 CC<br />

.................................................................................. WeBT2.1 2460<br />

Martinet, Philippe...................................................... WeBT9.7 2855<br />

.................................................................................. ThAT8.1 4072<br />

Martinoli, Alcherio ..................................................... WeBT8.1 2765<br />

.................................................................................. WeCT8.3 3241<br />

.................................................................................. WeCT9.5 3313<br />

.................................................................................. WeDPT10.26 *<br />

.................................................................................. ThBT4.1 4341<br />

Martinson, Eric.......................................................... MoAT3.5 125<br />

.................................................................................. MoBPT10.8 *<br />

Marton, Zoltan-Csaba............................................... ThBT2.6 4263<br />

.................................................................................. ThCPT10.6 *<br />

Maruyama, Hisataka................................................. MoAT1.3 13<br />

Mascaro, Stephen .................................................... MoAT9 CC<br />

.................................................................................. MoAT9.6 402<br />

.................................................................................. MoBT9.8 780<br />

.................................................................................. MoBPT10.27 *<br />

.................................................................................. WeBT4.5 2596<br />

.................................................................................. WeCPT10.11 *<br />

Masehian, Ellips ....................................................... ThBT3.3 4299<br />

Mason, Matthew T. ................................................... TuCT6.8 1804<br />

.................................................................................. ThAT3.1 3856<br />

Masone, Carlo .......................................................... WeAT6.4 2215<br />

.................................................................................. WeBPT10.16 *<br />

Mastrangeli, Massimo............................................... ThBT4.1 4341<br />

Mastrogiovanni, Fulvio.............................................. WeDT9.1 3694<br />

Masuda, Taisuke ...................................................... MoAT1.3 13<br />

Mataric, Maja ............................................................ WeBT1.1 2403<br />

Mathews, Nithin ........................................................ ThCT4.2 4762<br />

.................................................................................. ThCT9.8 5027<br />

Mathieu, Philippe ...................................................... WeDT5.4 3519<br />

.................................................................................. ThAPT10.10 *<br />

Mathisen, Geir .......................................................... WeBT9.8 2861<br />

Matos, Vítor .............................................................. WeAT7.7 2286<br />

Matsubara, Takamitsu .............................................. ThAT5.8 3975<br />

Matsuno, Fumitoshi .................................................. TuCT8 CC<br />

.................................................................................. TuCT8.8 1907<br />

Matsusaka, Yosuke .................................................. WeAT1.6 2000<br />

.................................................................................. WeBPT10.3 *<br />

Mattos, Leonardo...................................................... TuBT5.8 1359<br />

Maughan, Thom ....................................................... WeCT6.3 3132<br />

Mavroidis, Constantinos ........................................... ThCT7 CC<br />

.................................................................................. ThCT7.1 4893<br />

Maxwell, Bruce ......................................................... ThAT8.8 4115<br />

Maycock, Jonathan................................................... WeCT2.5 2947<br />

.................................................................................. WeDPT10.5 *<br />

Mazalaigue, Stéphane.............................................. WeBT3.3 2532<br />

McAllester, David...................................................... ThBT6.4 4442<br />

.................................................................................. ThCPT10.16 *<br />

McBride, James........................................................ ThBT4.7 4378<br />

McCann, Mike........................................................... WeCT6.3 3132<br />

McGinnity, Martin...................................................... WeDT3.1 3446<br />

.................................................................................. ThCT2.8 4716<br />

McLain, T.W. ............................................................ WeBT6.5 2688<br />

.................................................................................. WeCPT10.17 *<br />

.................................................................................. ThDT8.2 5173<br />

McLaren, David ........................................................ WeCT6.4 3140<br />

.................................................................................. WeDPT10.16 *<br />

McPherson, Timothy................................................. TuCT9.5 1940<br />

.................................................................................. WeAPT10.23 *<br />

Medina Ayala, Ana Ivonne........................................ WeCT5.5 3108<br />

.................................................................................. WeDPT10.14 *<br />

Medina Hernandez, Jose Ramon ............................. WeBT1.3 2416<br>

Medioni, Gerard........................................................ ThAT2.1 3800<br />

Medrano-Cerda, Gustavo......................................... TuAT8.5 1093<br />

.................................................................................. TuBPT10.20 *<br />

Meek, Sanford .......................................................... TuCT9 CC<br />

.................................................................................. TuCT9.3 1927<br />

.................................................................................. WeDT7.8 3632<br />

.................................................................................. ThBT1.1 4174<br />

Meeussen, Wim........................................................ WeCT2 CC<br />

.................................................................................. WeCT2.4 2941<br />

.................................................................................. WeDPT10.4 *<br />

Mefoued, Saber........................................................ TuCT5.7 1749<br />

Meger, David Paul.................................................... ThCT6.8 4885<br />

Meghjani, Malika....................................................... ThDT3.3 5048<br />

Mehnert, Jens........................................................... ThBT9.4 4587<br />

.................................................................................. ThCPT10.25 *<br />

Mehrjerdi, Hasan ...................................................... WeAT8.7 2340<br />

Meier, Franziska....................................................... WeDT2.3 3407<br />

Meilland, Maxime...................................................... ThBT2.3 4242<br />

Meirion-Griffith, Gareth............................................. MoAT7.7 305<br />

Meißner, Pascal........................................................ ThCT1.3 4633<br />

Melchiorri, Claudio.................................................... TuCT6.4 1775<br />

.................................................................................. WeAPT10.13 *<br />

.................................................................................. ThAT3.8 3899<br />

Melendez-Calderon, Alejandro................................. WeBT4.2 2578<br />

Melhuish, Chris......................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Mellinger, Daniel....................................................... WeBT6.2 2668<br />

Menegatti, Emanuele................................................ WeDT3.6 3480<br />

.................................................................................. ThAPT10.9 *<br />

Meng, Max Q.-H. ...................................................... WeAT3.3 2090<br />

Mermoud, Gregory ................................................... ThBT4.1 4341<br />

Merten, Matthias....................................................... WeBT1.5 2430<br />

Merz, Torsten ........................................................... ThCT8.5 4953<br />

.................................................................................. ThDPT10.23 *<br />

Messié, Monique ...................................................... WeCT6.3 3132<br />

Metta, Giorgio........................................................... TuBT9.3 1526<br />

.................................................................................. WeCT1.5 2895<br />

.................................................................................. WeDT9 C<br />

.................................................................................. WeDT9.1 3694<br />

.................................................................................. WeDPT10.2 *<br />

.................................................................................. ThAT9.6 4154<br />

.................................................................................. ThAT9.7 4161<br />

.................................................................................. ThBPT10.27 *<br />

Mettin, Uwe............................................................... ThDT5.2 5094<br />

Mettler, Bernie .......................................................... WeCT5.3 3093<br />

Metzger, Jean-Claude .............................................. WeCT4.8 3074<br />

Meyer-Delius, Daniel ................................................ WeCT8.8 3276<br />

Micaelli, Alain............................................................ WeCT7.7 3213<br />

Michael, Nathan........................................................ WeAT6 C<br />

.................................................................................. WeAT6 O<br />

.................................................................................. WeBT6 O<br />

.................................................................................. WeBT6.8 2708<br />

.................................................................................. WeDT8.5 3667<br />

.................................................................................. ThAPT10.20 *<br />

Michaud, Francois .................................................... TuBT2.7 1271<br />

Michel, Fabien .......................................................... ThDT3.2 5041<br />

Michelin, Micaël........................................................ WeBT9.7 2855<br />

Mier-Y-Teran-Romero, Luis...................................... ThAT4.3 3905<br />

Mignano, Anthony..................................................... MoBT5.7 580<br />

Milella, Annalisa........................................................ MoAT6.7 255<br />

Milighetti, Giulio ........................................................ WeCT7.4 3192<br />

.................................................................................. WeDPT10.19 *<br />

Miller, Stephen.......................................................... ThCT6.7 4877<br />

Milutinovic, Dejan ..................................................... ThAT4.5 3917<br />

.................................................................................. ThBT4 CC<br />

.................................................................................. ThBPT10.11 *<br />

Minamizawa, Kouta .................................................. MoAT4.4 157<br />

.................................................................................. MoBPT10.10 *<br />

Minato, Takashi ........................................................ WeDT3.6 3480<br />

.................................................................................. ThAPT10.9 *<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–248–<br />

Ming, Aiguo............................................................... TuCT9.7 1954<br />

.................................................................................. ThAT7.4 4048<br />

.................................................................................. ThCT3.4 4735<br />

.................................................................................. ThDPT10.7 *<br />

Minor, Mark............................................................... MoAT7.5 292<br />

.................................................................................. MoBPT10.20 *<br />

.................................................................................. ThCT8 CC<br />

.................................................................................. ThCT8.8 4975<br />

Miroir, Mathieu.......................................................... WeBT3.3 2532<br />

Misra, Sarthak .......................................................... MoBT1.1 421<br />

.................................................................................. TuAT5.3 937<br />

.................................................................................. WeAT3.1 2076<br />

.................................................................................. WeBT3 CC<br />

.................................................................................. WeBT3.7 2557<br />

Mita, Seiichi .............................................................. ThBT6.4 4442<br />

.................................................................................. ThCPT10.16 *<br />

Mitra, Urbashi ........................................................... WeDT6.5 3564<br />

.................................................................................. ThAPT10.14 *<br />

Mitsuhashi, Masaru .................................................. WeCT7.2 3179<br />

Miura, Hiroki ............................................................. MoBT3.4 524<br />

.................................................................................. TuAPT10.7 *<br />

Miura, Kanako .......................................................... ThBT5.3 4392<br />

.................................................................................. ThBT5.8 4428<br />

Miyamori, Go ............................................................ ThBT5.4 4400<br />

.................................................................................. ThCPT10.13 *<br />

Miyashita, Takahiro .................................................. WeDT7.1 3589<br />

Miyazaki, Fumio........................................................ ThBT7.4 4496<br />

.................................................................................. ThCPT10.19 *<br />

Mizumoto, Takeshi ................................................... MoAT3.4 118<br />

.................................................................................. MoBPT10.7 *<br />

.................................................................................. WeCT1.7 2910<br />

Mobarhani, Amir ....................................................... TuAT9.2 1128<br />

Mochiyama, Hiromi................................................... ThCT1.6 4652<br />

.................................................................................. ThDPT10.3 *<br />

Mohammadi, Mahmood............................................ TuBT3.6 1304<br />

.................................................................................. TuCPT10.9 *<br />

Mohammed, Samer .................................................. TuCT5.7 1749<br />

Mohan, Anush .......................................................... ThBT4.7 4378<br />

Mohta, Kartik ............................................................ WeBT6.8 2708<br />

Mohtat, Arash ........................................................... WeDT9.8 3740<br />

Molinari Tosatti, Lorenzo .......................................... WeCT9.7 3327<br />

Molotchnikoff, Stéphane ........................................... MoBT3.3 518<br />

Mombaur, Katja ........................................................ ThBT5.1 *<br />

Mondada, Francesco................................................ ThCT4.2 4762<br />

.................................................................................. ThCT9.1 4981<br />

Montano, Luis ........................................................... WeCT1.4 2887<br />

.................................................................................. WeDPT10.1 *<br />

Montés Sánchez, Nicolás ......................................... TuCT1.1 1567<br />

Montesano, Luis ....................................................... ThBT6.5 4448<br />

.................................................................................. ThCPT10.17 *<br />

Montiel, J.M.M .......................................................... TuBT2.8 1277<br />

Moon, AJung ............................................................ WeAT1.5 1994<br />

.................................................................................. WeBPT10.2 *<br />

Moon, Hyungpil......................................................... TuBT7.7 1447<br />

Moore, Joseph.......................................................... WeBT6.7 2700<br />

Moore, Richard James Donald ................................. ThCT8.2 4935<br />

Mora, Marta Covadonga........................................... TuCT1.1 1567<br />

Moral del Agua, Diego .............................................. ThBT7.8 4522<br />

Morbidi, Fabio........................................................... WeAT6.6 2229<br />

.................................................................................. WeBPT10.18 *<br />

.................................................................................. WeDT7.4 3608<br />

.................................................................................. ThAPT10.16 *<br />

Moreau, Richard ....................................................... MoBT7.4 659<br />

.................................................................................. TuAPT10.16 *<br />

Morel, Guillaume ...................................................... TuBT5.4 1333<br />

.................................................................................. TuBT5.5 1339<br />

.................................................................................. TuCPT10.10 *<br />

.................................................................................. TuCPT10.11 *<br />

Mori, Taketoshi ......................................................... WeDT2.5 3419<br />

.................................................................................. WeDT7.3 3601<br>

.................................................................................. ThAPT10.5 *<br />

Morimoto, Jun........................................................... WeBT7.1 2715<br />

.................................................................................. WeCT7.3 3185<br />

.................................................................................. ThAT5.8 3975<br />

Morin, Pascal............................................................ WeCT9.8 3335<br />

Morioka, Hiroshi........................................................ ThAT6.5 3998<br />

.................................................................................. ThBPT10.17 *<br />

Morisawa, Mitsuharu ................................................ ThBT5.3 4392<br />

.................................................................................. ThBT5.4 4400<br />

.................................................................................. ThBT5.8 4428<br />

.................................................................................. ThCPT10.13 *<br />

Morozovsky, Nicholas............................................... WeBT7.5 2741<br />

.................................................................................. WeCPT10.20 *<br />

.................................................................................. ThAT8.5 4097<br />

.................................................................................. ThBPT10.23 *<br />

Morris, Aaron............................................................ ThAT2.3 3816<br />

Mörtl, Alexander ....................................................... WeBT1.3 2416<br />

Morton, Ryan............................................................ TuCT1.3 1579<br />

Mörwald, Thomas..................................................... TuAT1.5 813<br />

.................................................................................. TuBPT10.2 *<br />

Mösenlechner, Lorenz.............................................. ThAT9.4 4141<br />

.................................................................................. ThBPT10.25 *<br />

Motegi, Yuichi........................................................... ThCT9.7 5021<br />

Mouret, Jean-Baptiste .............................................. ThBT4.4 4360<br />

.................................................................................. ThCPT10.10 *<br />

Mourikis, Anastasios................................................. MoAT2.3 65<br />

Moutinho, Nuno ........................................................ ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Movafaghpour, Mohamad Ali.................................... ThBT3.3 4299<br />

Mueller, Joerg........................................................... ThCT8.3 4941<br />

Mueller, Steffen ........................................................ WeBT1.5 2430<br />

Muelling, Katharina................................................... MoAT8.3 332<br />

Mukai, Toshiharu...................................................... WeBT1.7 2445<br />

Mukherjee, Ranjan ................................................... ThCT3.6 4749<br />

.................................................................................. ThDPT10.9 *<br />

Müller, Mark Wilfried................................................. ThDT6.1 5113<br />

Muñoz, Victor............................................................ WeAT3.8 2121<br />

Muñoz Hernandez, Laura Elena............................... WeBT6.4 2682<br />

.................................................................................. WeCPT10.16 *<br />

Muntwyler, Simon..................................................... TuAT3.7 919<br />

Murakami, Kazunori.................................................. ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Murasaki, Kazuhiko .................................................. WeDT7.3 3601<br />

Murillo, Ana Cristina ................................................. WeAT9.6 2383<br />

.................................................................................. WeBPT10.27 *<br />

Murphy, Elizabeth..................................................... ThAT3.3 3868<br />

Murphy, Robin .......................................................... MoAT5.6 209<br />

Murray, Richard........................................................ WeAT2.6 2056<br />

.................................................................................. WeBPT10.6 *<br />

.................................................................................. WeCT3.8 3017<br />

Murrieta-Cid, Rafael ................................................. ThBT8.1 4528<br />

Muscio, Giuseppe..................................................... ThBT1.4 4194<br />

.................................................................................. ThCPT10.1 *<br>

N<br>

Naba, Shu................................................................. TuCT7.6 1843<br />

.................................................................................. WeAPT10.18 *<br />

Nadeau, Caroline...................................................... WeBT9.4 2837<br />

.................................................................................. WeCPT10.25 *<br />

Nagai, Takayuki........................................................ TuBT9.2 1520<br />

.................................................................................. TuBT9.5 1540<br />

.................................................................................. TuCPT10.23 *<br />

Nagasaka, Tomomi .................................................. TuAT2.6 872<br />

.................................................................................. TuBPT10.6 *<br />

Nagatani, Keiji .......................................................... MoAT6.1 *<br />

.................................................................................. MoBT6.3 601<br />

Nagpal, Radhika....................................................... ThAT4.6 3923<br />

.................................................................................. ThBT7.3 4488<br />

.................................................................................. ThBPT10.12 *<br />

.................................................................................. ThCT9 CC<br />

.................................................................................. ThCT9.2 4989<br />


Nakadai, Kazuhiro .................................................... MoAT3 O<br />

.................................................................................. MoAT3.2 106<br />

.................................................................................. MoAT3.6 131<br />

.................................................................................. MoAT3.8 143<br />

.................................................................................. MoBT3 C<br />

.................................................................................. MoBT3 O<br />

.................................................................................. MoBT3.4 524<br />

.................................................................................. MoBT3.5 530<br />

.................................................................................. MoBPT10.9 *<br />

.................................................................................. TuAPT10.7 *<br />

.................................................................................. TuAPT10.8 *<br />

Nakajima, Hirofumi ................................................... MoAT3.2 106<br />

.................................................................................. MoAT3.6 131<br />

.................................................................................. MoBT3.5 530<br />

.................................................................................. MoBPT10.9 *<br />

.................................................................................. TuAPT10.8 *<br />

Nakajima, Masahiro.................................................. MoAT1.1 1<br />

.................................................................................. MoBT1.3 433<br />

.................................................................................. MoBT1.7 457<br />

.................................................................................. TuCT3.6 1693<br />

.................................................................................. WeAPT10.9 *<br />

Nakamura, Keisuke .................................................. MoAT3.2 106<br />

.................................................................................. MoAT3.6 131<br />

.................................................................................. MoAT3.8 143<br />

.................................................................................. MoBT3.4 524<br />

.................................................................................. MoBPT10.9 *<br />

.................................................................................. TuAPT10.7 *<br />

Nakamura, Kenichi ................................................... ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Nakamura, Tomoaki ................................................. TuBT9.2 1520<br />

.................................................................................. TuBT9.5 1540<br />

.................................................................................. TuCPT10.23 *<br />

Nakamura, Yoshihiko ............................................... MoAT7.1 268<br />

.................................................................................. WeDT2.2 3401<br />

.................................................................................. WeDT9.2 3701<br />

Nakamura, Yutaka.................................................... WeCT1.1 2867<br />

Nakanishi, Hiroki....................................................... MoBT6.7 625<br />

Nakanishi, Jun .......................................................... MoBT8 C<br />

.................................................................................. MoBT8.5 718<br />

.................................................................................. TuAPT10.20 *<br />

Nakano, Mikio........................................................... TuBT9.5 1540<br />

.................................................................................. TuCPT10.23 *<br />

Nakano, Tomoyasu .................................................. WeAT1.6 2000<br />

.................................................................................. WeBPT10.3 *<br />

Nakaoka, Shin'ichiro................................................. WeAT1.6 2000<br />

.................................................................................. WeBPT10.3 *<br />

.................................................................................. ThBT5.3 4392<br />

Nakashima, Hiromichi............................................... WeBT1.7 2445<br />

Nambi, Manikantan................................................... MoBT1.5 445<br />

.................................................................................. TuAPT10.2 *<br />

Naniwa, Keisuke....................................................... TuCT7.2 1817<br />

Narasimhan, Srinivasa ............................................. WeAT9.1 2352<br />

Nardi, Daniele........................................................... WeBT8.6 2796<br />

.................................................................................. WeCPT10.24 *<br />

Narioka, Kenichi ....................................................... ThCT5.7 4838<br />

Natale, Lorenzo ........................................................ TuBT9.3 1526<br />

.................................................................................. WeCT1.5 2895<br />

.................................................................................. WeDT9.1 3694<br />

.................................................................................. WeDPT10.2 *<br />

.................................................................................. ThAT9.6 4154<br />

.................................................................................. ThAT9.7 4161<br />

.................................................................................. ThBPT10.27 *<br />

Natarajan, Saravana K. ............................................ TuAT1.1 786<br />

Natraj, Ashutosh ....................................................... ThAT6.6 4006<br />

.................................................................................. ThBPT10.18 *<br />

Navarro-Alarcon, David ............................................ ThBT1.8 4222<br />

Nazari, Shaghayegh ................................................. TuAT9.2 1128<br />

Nebot, Eduardo ........................................................ ThBT9.6 4601<br />

.................................................................................. ThCPT10.27 *<br />

Negoro, Makoto ........................................................ MoBT1.4 439<br>

.................................................................................. TuAPT10.1 *<br />

Neira, José ............................................................... ThAT6 C<br />

.................................................................................. ThAT6 O<br />

.................................................................................. ThBT6 C<br />

.................................................................................. ThBT6 O<br />

.................................................................................. ThCT6 CC<br />

Nejat, Goldie............................................................. ThBT8.3 4544<br />

Nelson, Bradley J. .................................................... TuAT3.7 919<br />

.................................................................................. TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Nelson, Gabriel......................................................... MoBT5.1 *<br />

Nemec, Bojan ........................................................... ThCT5.6 4832<br />

.................................................................................. ThDPT10.15 *<br />

Nenchev, Dragomir................................................... WeCT7.2 3179<br />

Nerurkar, Esha ......................................................... MoBT2.6 502<br />

.................................................................................. TuAPT10.6 *<br />

Nettleton, Eric........................................................... ThBT2.2 4236<br />

Newman, Paul .......................................................... ThAT3.3 3868<br />

.................................................................................. ThAT6 O<br />

.................................................................................. ThBT6 O<br />

Nguyen, Binh............................................................ TuBT7.5 1433<br />

.................................................................................. TuCPT10.17 *<br />

Nguyen, Chanh-Nghiem........................................... MoBT1.2 427<br />

Nguyen, Kien Cuong ................................................ TuBT8.1 1459<br />

Nguyen, Yann........................................................... WeBT3.3 2532<br />

Nguyen-Tuong, Duy ................................................. MoBT8.2 698<br />

.................................................................................. MoBT8.3 704<br />

Nia Kosari, Sina........................................................ WeBT4.8 2614<br />

Nickl, Mathias ........................................................... TuAT7.7 1055<br />

Nieto, Juan ............................................................... MoAT2.7 92<br />

.................................................................................. ThBT9 CC<br />

.................................................................................. ThBT9.6 4601<br />

.................................................................................. ThCPT10.27 *<br />

Nieto-Granda, Carlos................................................ TuBT2.6 1264<br />

.................................................................................. TuCPT10.6 *<br />

Niiyama, Ryuma ....................................................... TuBT8.7 1499<br />

Nikolic, Janosch........................................................ WeBT6.6 2694<br />

.................................................................................. WeCPT10.18 *<br />

Nili Ahmadabadi, Majid............................................. WeBT7.2 2723<br />

Nill, Scott T. .............................................................. WeBT3.1 2517<br />

Nishii, Tatsuya.......................................................... WeCT7.2 3179<br />

Nishikawa, Satoshi ................................................... TuBT8.7 1499<br />

Noda, Tomoyuki ....................................................... ThAT5.8 3975<br />

Noda, Yoshitaka ....................................................... MoAT3.3 112<br />

Nogawa, Kousuke .................................................... TuCT3.6 1693<br />

.................................................................................. WeAPT10.9 *<br />

Noguchi, Hiroshi ....................................................... WeDT2.5 3419<br />

.................................................................................. ThAPT10.5 *<br />

Noh, Si Tae............................................................... WeDT7.2 3595<br />

Noonan, David.......................................................... TuAT5.5 949<br />

.................................................................................. TuBPT10.11 *<br />

Nori, Francesco ........................................................ WeDT9.1 3694<br />

.................................................................................. ThAT9.6 4154<br />

.................................................................................. ThAT9.7 4161<br />

.................................................................................. ThBT4.3 4354<br />

.................................................................................. ThBPT10.27 *<br />

Nothhelfer, Alexander............................................... TuAT7.7 1055<br />

Nunes, Urbano ......................................................... WeBT1 C<br />

.................................................................................. WeBT1.6 2438<br />

.................................................................................. WeCPT10.3 *<br />

Nuske, Stephen........................................................ MoAT6.3 227<br />

.................................................................................. WeAT9.1 2352<br />

.................................................................................. ThCT8.4 4947<br />

.................................................................................. ThDPT10.22 *<br>

O<br>

O'Dowd, Paul Jason ................................................. ThCT9.3 4995<br />

O'Grady, Rehan........................................................ ThCT4.2 4762<br />

.................................................................................. ThCT9.8 5027<br />

O'Malley, Marcia....................................................... TuCT5.1 1711<br />

.................................................................................. WeAT4 CC<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–250–<br />

.................................................................................. WeAT4 O<br />

.................................................................................. WeBT4 O<br />

.................................................................................. WeCT4 C<br />

.................................................................................. WeCT4 O<br />

Obara, Takeshi ......................................................... MoAT1.2 7<br />

Odedra, Siddharth .................................................... MoAT7.4 286<br />

.................................................................................. MoBPT10.19 *<br />

Odhner, Lael............................................................. TuBT7.3 1420<br />

Ogasawara, Tsukasa................................................ WeAT4.3 2127<br />

Ogata, Tetsuya ......................................................... MoAT3.4 118<br />

.................................................................................. MoBPT10.7 *<br />

.................................................................................. WeCT1.7 2910<br />

Ogawa, Keita ............................................................ ThCT5.7 4838<br />

Oh, Paul Y. ............................................................... TuAT7.2 1022<br />

.................................................................................. WeCT1.8 2916<br />

Oh, Sang-Rok........................................................... ThBT7.2 4483<br />

Oh, Se Min................................................................ TuAT5.8 967<br />

Oh, Se-Young........................................................... TuAT2.1 841<br />

Ohara, Kenichi.......................................................... MoAT1.5 25<br />

.................................................................................. MoBT1.2 427<br />

.................................................................................. MoBPT10.2 *<br />

.................................................................................. TuAT1.4 807<br />

.................................................................................. TuBPT10.1 *<br />

Ohashi, Shigeo ......................................................... TuBT3.7 1309<br />

Oishi, Shuji ............................................................... WeAT2.1 2020<br />

Ok, Sungsuk ............................................................. MoAT7.1 268<br />

Okada, Masafumi ..................................................... MoBT9.5 762<br />

.................................................................................. TuAPT10.23 *<br />

.................................................................................. WeDT8.6 3673<br />

.................................................................................. ThAPT10.21 *<br />

.................................................................................. ThCT9.7 5021<br />

Okada, Shima........................................................... TuCT9.6 1946<br />

.................................................................................. WeAPT10.24 *<br />

Okamoto, Jun ........................................................... MoBT9.3 750<br />

Okamoto, Shogo....................................................... WeCT4.6 3060<br />

.................................................................................. WeDPT10.12 *<br />

Okamura, Allison M. ................................................. WeBT4.3 2584<br />

.................................................................................. ThDT7 CC<br />

Oki, Tomohisa .......................................................... MoBT6.7 625<br />

Okuno, Hiroshi G. ..................................................... MoAT3 O<br />

.................................................................................. MoAT3.1 *<br />

.................................................................................. MoAT3.4 118<br />

.................................................................................. MoBT3 CC<br />

.................................................................................. MoBT3 O<br />

.................................................................................. MoBPT10.7 *<br />

.................................................................................. WeCT1.7 2910<br />

Okuno, Keisuke ........................................................ WeDT1.7 3381<br />

Okutomi, Masatoshi.................................................. ThBT6.7 4463<br />

Oliveira, Miguel......................................................... WeAT7.7 2286<br />

Oliver, Gabriel A. ...................................................... WeDT6.7 3577<br />

Olmi, Roberto ........................................................... ThBT9.8 4615<br />

Olofsson, Bjorn ......................................................... ThCT2.6 4704<br />

.................................................................................. ThDPT10.6 *<br />

Olson, Edwin ............................................................ TuCT1.3 1579<br />

.................................................................................. ThAT3 C<br />

.................................................................................. ThAT3.5 3881<br />

.................................................................................. ThBT2.7 4271<br />

.................................................................................. ThBPT10.8 *<br />

Omer, Aiman ............................................................ WeCT7.8 3221<br />

Omura, Suguru ......................................................... MoAT9.5 395<br />

.................................................................................. MoBPT10.26 *<br />

Onal, Cagdas Denizel............................................... MoBT9 C<br />

.................................................................................. MoBT9.4 756<br />

.................................................................................. TuAPT10.22 *<br />

Ong, Lee-Ling Sharon .............................................. ThAT4.7 3931<br />

Oriolo, Giuseppe....................................................... MoBT2 C<br />

.................................................................................. MoBT2.1 469<br />

Osswald, Marc.......................................................... ThDT5.4 5107<br />

Osswald, Stefan ....................................................... ThCT5.8 4844<br />

Oster, Stéphane ....................................................... MoAT1.7 39<br>

Osuka, Koichi ........................................................... TuCT7.2 1817<br />

Ota, Jun.................................................................... WeAT8.3 2314<br />

.................................................................................. ThCT2 C<br />

.................................................................................. ThCT2.5 4698<br />

.................................................................................. ThDPT10.5 *<br />

Otsuka, Takuma ....................................................... MoAT3.4 118<br />

.................................................................................. MoBPT10.7 *<br />

Ott, Christian............................................................. MoBT7.5 665<br />

.................................................................................. TuAPT10.17 *<br />

.................................................................................. ThBT5.7 4420<br />

Oudeyer, Pierre-Yves............................................... TuBT8.2 1465<br />

Overholt, Jim ............................................................ ThAT8.4 4091<br />

.................................................................................. ThBPT10.22 *<br />

Özkil, Ali Gürcan....................................................... TuAT2.2 847<br />

O'Reilly, Tom ............................................................ WeCT6.3 3132<br>

P<br>

Paanajärvi, Janne Olavi............................................ MoAT2.2 59<br />

Padoy, Nicolas.......................................................... WeAT3.5 2102<br />

.................................................................................. WeBPT10.8 *<br />

Pagac, Daniel ........................................................... WeDT8.8 3686<br />

Pahlajani, Chetan ..................................................... ThAT4.4 3911<br />

.................................................................................. ThBPT10.10 *<br />

Paik, Jamie ............................................................... MoAT9.8 414<br />

Palli, Gianluca........................................................... TuCT6.4 1775<br />

.................................................................................. WeAPT10.13 *<br />

Pallottino, Lucia ........................................................ WeBT9.1 2817<br />

Palyart Lamarche, Jean Christophe ......................... ThAT5.4 3951<br />

.................................................................................. ThBPT10.13 *<br />

Pane, Salvador......................................................... TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Pangercic, Dejan ...................................................... ThBT2.6 4263<br />

.................................................................................. ThCPT10.6 *<br />

Papadopoulos, Evangelos........................................ MoBT6.2 595<br />

.................................................................................. WeCT2 C<br />

.................................................................................. WeCT2.1 2922<br />

Papadopoulos, Georgios.......................................... WeDT6.3 3551<br />

Papanikolopoulos, Nikos .......................................... TuAT9.8 1167<br />

Park, Jaesik.............................................................. WeCT9.2 3290<br />

Park, Myoung Soo.................................................... ThBT7.2 4483<br />

Park, Yong-Lae......................................................... ThBT7.3 4488<br />

Parker, Chris............................................................. WeAT1.5 1994<br />

.................................................................................. WeBPT10.2 *<br />

.................................................................................. WeDT1.4 3361<br />

.................................................................................. ThAPT10.1 *<br />

Parker, Lynne ........................................................... WeAT2 C<br />

.................................................................................. WeAT2.4 2044<br />

.................................................................................. WeBPT10.4 *<br />

Parks, Perry.............................................................. MoBT5.6 574<br />

.................................................................................. TuAPT10.12 *<br />

Parnandi, Avinash .................................................... WeBT1.1 2403<br />

Parra Vega, Vicente ................................................. TuCT1.8 1612<br />

Parusel, Sven ........................................................... WeDT1.5 3367<br />

.................................................................................. ThAPT10.2 *<br />

Pascoal, Antonio....................................................... WeCT6.8 3166<br />

Pashkevich, Anatol................................................... ThAT7.2 4034<br />

Paskarbeit, Jan......................................................... ThAT1.6 3776<br />

.................................................................................. ThBPT10.3 *<br />

Passaglia, Andrea .................................................... ThAT1.5 3770<br />

.................................................................................. ThBPT10.2 *<br />

Passig, Georg........................................................... WeAT3.6 2108<br />

.................................................................................. WeBPT10.9 *<br />

Pastor, Peter............................................................. MoAT8.2 325<br />

.................................................................................. MoAT8.8 365<br />

.................................................................................. ThCT1.4 4639<br />

.................................................................................. ThDPT10.1 *<br />

Patel, Rajnikant V..................................................... MoBT7.3 653<br />

.................................................................................. WeBT3.6 2551<br />

.................................................................................. WeCPT10.9 *<br />

Patoglu, Volkan ........................................................ ThCT7.5 4917<br />

.................................................................................. ThDPT10.20 *<br />

Patrikalakis, Nicholas ............................................... WeDT6.3 3551<br />

Pattacini, Ugo ........................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

.................................................................................. ThAT9.6 4154<br />

.................................................................................. ThBPT10.27 *<br />

Paull, Liam................................................................ TuAT2.3 853<br />

.................................................................................. TuAT2.7 880<br />

Paulus, Dietrich ........................................................ WeDT5.8 3545<br />

Payandeh, Shahram................................................. WeAT3.4 2096<br />

.................................................................................. WeBPT10.7 *<br />

Payne, Christopher................................................... TuAT5.5 949<br />

.................................................................................. TuBPT10.11 *<br />

Pchelkin, Stepan....................................................... ThDT5.2 5094<br />

Pedrocchi, Nicola...................................................... WeCT9.7 3327<br />

Pedrono, Brice.......................................................... WeBT4.2 2578<br />

Peer, Angelika .......................................................... MoAT4 C<br />

.................................................................................. MoAT4 O<br />

.................................................................................. TuAT6.7 1002<br />

Pehlivan, Utku .......................................................... TuCT5.1 1711<br />

Peng, Huei................................................................ WeCT3.6 3004<br />

.................................................................................. WeDPT10.9 *<br />

Peng, Peter............................................................... WeBT3.4 2539<br />

Penning, Ryan .......................................................... ThDT7.1 5139<br />

Peralta Cabezas, José Luis...................................... WeAT7.2 2255<br />

Perdereau, Véronique .............................................. TuBT8.1 1459<br />

Pereira, Flávio Garcia............................................... WeDT7.6 3620<br />

.................................................................................. ThAPT10.18 *<br />

Perez, Alejandro ....................................................... ThBT3.4 4307<br />

.................................................................................. ThCPT10.7 *<br />

Pérez Arias, Antonia................................................. WeCT4.5 3053<br />

.................................................................................. WeDPT10.11 *<br />

Perez-Arancibia, Nestor O........................................ TuAT8.7 1107<br />

Pericet-Camara, Ramon........................................... TuCT9.1 1913<br />

Perlmutter, Leah ....................................................... ThAT8.8 4115<br />

Perrin, Nicolas Yves ................................................. ThBT5.5 4408<br />

.................................................................................. ThCPT10.14 *<br />

Persson, Erik ............................................................ MoAT6.4 235<br />

.................................................................................. MoBPT10.16 *<br />

Peterlik, Igor ............................................................. WeBT4.7 2608<br />

Peters, Jan ............................................................... MoAT8 C<br />

.................................................................................. MoAT8.3 332<br />

.................................................................................. MoAT8.4 338<br />

.................................................................................. MoBT8.2 698<br />

.................................................................................. MoBT8.3 704<br />

.................................................................................. MoBPT10.22 *<br />

.................................................................................. TuBT9.6 1548<br />

.................................................................................. TuCPT10.24 *<br />

Peters, Steven .......................................................... MoAT7.6 298<br />

.................................................................................. MoBPT10.21 *<br />

Peterson, Gilbert....................................................... TuAT2.4 859<br />

.................................................................................. TuBPT10.4 *<br />

Peterson, Kevin ........................................................ ThDT4.4 5080<br />

Petit, Antoine ............................................................ MoBT6.6 619<br />

.................................................................................. TuAPT10.15 *<br />

Petit, Florian ............................................................. TuCT7.5 1836<br />

.................................................................................. WeAPT10.17 *<br />

.................................................................................. ThBT1.2 4180<br />

Pêtrès, Clément........................................................ WeDT6.6 3571<br />

.................................................................................. ThAPT10.15 *<br />

Petric, Tadej ............................................................. ThCT5.6 4832<br />

.................................................................................. ThDPT10.15 *<br />

Petrisor, Doru ........................................................... TuAT5.4 943<br />

.................................................................................. TuBPT10.10 *<br />

Petruska, Andrew ..................................................... WeDT7.8 3632<br />

Petsch, Susanne ...................................................... TuBT1.8 1221<br />

Pettersen, Kristin Y................................................... MoAT6.6 247<br />

.................................................................................. MoBPT10.18 *<br />

.................................................................................. ThCT2.2 4676<br />

Peynot, Thierry ......................................................... WeBT2 C<br />

.................................................................................. WeBT2.5 2489<br>

.................................................................................. WeCPT10.5 *<br />

Pfeifer, Rolf............................................................... ThAT9.8 4168<br />

Pfeiffer, Kai............................................................... MoAT6.5 241<br />

.................................................................................. MoBPT10.17 *<br />

Pflimlin, Jean-Michel................................................. WeBT9.5 2843<br />

.................................................................................. WeCPT10.26 *<br />

Pham, Hang.............................................................. ThBT7.4 4496<br />

.................................................................................. ThCPT10.19 *<br />

Pham, Minh Tu ......................................................... MoBT7.4 659<br />

.................................................................................. MoBT9.1 738<br />

.................................................................................. TuAPT10.16 *<br />

Phan, Samson.......................................................... WeCT3.4 2992<br />

.................................................................................. WeCT3.5 2998<br />

.................................................................................. WeDPT10.7 *<br />

.................................................................................. WeDPT10.8 *<br />

Philippsen, Roland.................................................... TuAT7 CC<br />

.................................................................................. TuAT7.4 1036<br />

.................................................................................. TuBPT10.16 *<br />

Phung, Tri Cong ....................................................... TuBT7.7 1447<br />

Piat, Emmanuel ........................................................ MoAT1.7 39<br />

Pierce, Matthew........................................................ MoAT9.6 402<br />

.................................................................................. MoBPT10.27 *<br />

Pierri, Francesco ...................................................... ThBT1.4 4194<br />

.................................................................................. ThCPT10.1 *<br />

Pierrot, François ....................................................... ThDT4 C<br />

.................................................................................. ThDT4.2 5067<br />

Pietrusinski, Maciej................................................... ThCT7.1 4893<br />

Pinciroli, Carlo .......................................................... ThCT9.1 4981<br />

.................................................................................. ThCT9.8 5027<br />

Pineda, Elvine Philip................................................. ThAT8.4 4091<br />

.................................................................................. ThBPT10.22 *<br />

Pini, Giovanni ........................................................... ThCT9.8 5027<br />

Pires, Gabriel............................................................ WeBT1.6 2438<br />

.................................................................................. WeCPT10.3 *<br />

Pirker, Katrin............................................................. ThAT6.4 3990<br />

.................................................................................. ThBPT10.16 *<br />

Pistillo, Antonio......................................................... WeCT4.4 3047<br />

.................................................................................. WeDT2.4 3413<br />

.................................................................................. WeDPT10.10 *<br />

.................................................................................. ThAPT10.4 *<br />

Pitzer, Benjamin ....................................................... TuBT7.2 1413<br />

Pivtoraiko, Mihail ...................................................... WeAT5.4 2172<br />

.................................................................................. WeBPT10.13 *<br />

Pizarro, Oscar........................................................... TuBT2.3 1242<br />

.................................................................................. TuBT9.4 1533<br />

.................................................................................. TuCPT10.22 *<br />

.................................................................................. ThBT6.6 4455<br />

.................................................................................. ThCT3.2 4722<br />

.................................................................................. ThCPT10.18 *<br />

Plaku, Erion .............................................................. TuBT5.6 1346<br />

.................................................................................. TuCPT10.12 *<br />

.................................................................................. WeAT3 CC<br />

Plumet, Frederic ....................................................... WeDT6.6 3571<br />

.................................................................................. ThAPT10.15 *<br />

Poggendorf, Maik ..................................................... WeDT1.3 3355<br />

Poignet, Philippe....................................................... WeBT3.5 2545<br />

.................................................................................. WeCPT10.8 *<br />

Pollefeys, Marc......................................................... TuCT2.1 1618<br />

.................................................................................. TuCT2.6 1655<br />

.................................................................................. WeAPT10.6 *<br />

.................................................................................. ThAT6.7 4012<br />

Polushin, Ilia G. ........................................................ MoBT7.3 653<br />

Pomerleau, Francois ................................................ ThAT2.4 3824<br />

.................................................................................. ThBPT10.4 *<br />

Popa, Dan................................................................. TuCT3.7 1699<br />

Popovic, Mila ............................................................ TuAT6.5 987<br />

.................................................................................. TuBPT10.14 *<br />

Porez, Mathieu ......................................................... TuCT8.7 1901<br />

Porquis, Lope Ben.................................................... WeDT3.7 3488<br />

Porta, Josep M. ........................................................ WeBT5.7 2646<br>

Portello, Alban .......................................................... MoAT3.7 137<br />

Porter, Judson .......................................................... TuAT8.6 1099<br />

.................................................................................. TuBPT10.21 *<br />

Pounds, Paul ............................................................ WeBT6.1 2660<br />

Prankl, Johann.......................................................... TuAT1.5 813<br />

.................................................................................. TuBPT10.2 *<br />

Prattichizzo, Domenico............................................. TuAT6.8 1008<br />

.................................................................................. WeDT7 CC<br />

Prestes, Edson ......................................................... TuAT9.1 1122<br />

Preusche, Carsten.................................................... MoAT4.7 177<br />

.................................................................................. MoBT7.5 665<br />

.................................................................................. TuAPT10.17 *<br />

Prorok, Amanda........................................................ WeCT8.3 3241<br />

.................................................................................. WeCT9.5 3313<br />

.................................................................................. WeDPT10.26 *<br />

Protzel, Peter............................................................ TuBT2.2 1234<br />

Przybylski, Markus.................................................... TuCT6.2 1761<br />

.................................................................................. TuCT6.5 1781<br />

.................................................................................. WeAPT10.14 *<br />

Puerto, Gustavo Armando ........................................ WeAT9.4 2371<br />

.................................................................................. WeBPT10.25 *<br />

Puzik, Arnold ............................................................ ThCT2.6 4704<br />

.................................................................................. ThDPT10.6 *<br />

Py, Frederic .............................................................. WeCT6.3 3132<br>

Q<br>

Qin, Lei ..................................................................... ThBT7.6 4508<br />

.................................................................................. ThCPT10.21 *<br />

Qu, Xingda................................................................ ThCT7.6 4923<br />

.................................................................................. ThDPT10.21 *<br />

Quebe, Stephen ....................................................... ThDT8.2 5173<br />

Quek, Zhan Fan........................................................ WeCT3.4 2992<br />

.................................................................................. WeCT3.5 2998<br />

.................................................................................. WeDPT10.7 *<br />

.................................................................................. WeDPT10.8 *<br />

Quinn, Roger D. ........................................................ MoAT5.4 197<br />

.................................................................................. MoAT5.6 209<br />

.................................................................................. MoAT5.7 215<br />

.................................................................................. MoBPT10.13 *<br />

.................................................................................. MoBPT10.15 *<br />

R<br />

Radig, Bernd............................................................. ThBT9 C<br />

.................................................................................. ThBT9.4 4587<br />

.................................................................................. ThCPT10.25 *<br />

Radkhah, Katayon .................................................... ThCT5.3 4811<br />

Radrich, Helmuth...................................................... WeDT1.6 3375<br />

.................................................................................. ThAPT10.3 *<br />

Ragan, Matthew ....................................................... WeCT6.5 3147<br />

.................................................................................. WeDPT10.17 *<br />

Rai, Vijeth ................................................................. ThCT4.5 4783<br />

.................................................................................. ThDPT10.11 *<br />

Raibert, Marc ............................................................ TuPL.1 *<br />

Rajan, Kanna............................................................ WeCT6.3 3132<br />

Ramamoorthy, Subramanian.................................... WeDT8.7 3679<br />

Ramos, Oscar........................................................... ThAT9.2 4127<br />

Randazzo, Marco ..................................................... ThAT9.7 4161<br />

Rander, Peter ........................................................... ThAT6.8 4020<br />

Ranganathan, Ananth............................................... TuCT2.1 1618<br />

.................................................................................. ThAT6.3 3982<br />

Rañó, Iñaki ............................................................... WeAT8.8 2346<br />

Raschendörfer, Lars ................................................. WeDT3.5 3474<br />

.................................................................................. ThAPT10.8 *<br />

Rasmussen, Christopher .......................................... ThAT8.3 4084<br />

Rasolzadeh, Babak .................................................. WeDT1.1 3342<br />

Ratnasingam, Sivalogeswaran ................................. WeDT3.1 3446<br />

Rauter, Georg........................................................... WeCT4.7 3068<br />

Rawlik, Konrad ......................................................... MoBT8.5 718<br />

.................................................................................. TuAPT10.20 *<br />

Ray, Christopher....................................................... ThAPT10.16 *<br />

Raz, Ariel .................................................................. WeDT5.7 3539<br />

Reckhaus, Michael ................................................... TuAT7.3 1030<br />


Reddy, Prashant....................................................... WeAT8.5 2327<br />

.................................................................................. WeBPT10.23 *<br />

Reggente, Matteo..................................................... ThCT9.5 5007<br />

.................................................................................. ThDPT10.26 *<br />

Régnier, Stéphane.................................................... TuAT3.3 894<br />

.................................................................................. TuAT3.6 913<br />

.................................................................................. TuBPT10.9 *<br />

.................................................................................. TuCT3.5 1687<br />

.................................................................................. WeAPT10.8 *<br />

Rehder, Joern........................................................... MoAT6.3 227<br />

Reiher, Stephane...................................................... TuCT7.7 1849<br />

Reilink, Rob .............................................................. TuAT5.3 937<br />

.................................................................................. WeAT3.1 2076<br />

Reina, Giulio............................................................. MoAT6.7 255<br />

Reinecke, Jens......................................................... TuBT6.3 1366<br />

.................................................................................. ThBT1.7 4215<br />

Reiter, Austin............................................................ WeAT9.7 2390<br />

Rekleitis, Georgios ................................................... MoBT6.2 595<br />

Rekleitis, Ioannis ...................................................... ThDT3 C<br />

.................................................................................. ThDT3.3 5048<br />

.................................................................................. ThDT3.4 5054<br />

Remy, C. David ........................................................ MoAT5.3 190<br />

.................................................................................. MoBT5.4 562<br />

.................................................................................. TuAPT10.10 *<br />

Ren, Hongliang......................................................... WeAT3.2 2083<br />

Ren, Wei................................................................... ThCT1.5 4645<br />

.................................................................................. ThDPT10.2 *<br />

Ren, Xiaofeng........................................................... TuAT1.6 821<br />

.................................................................................. TuBPT10.3 *<br />

.................................................................................. ThCT6.3 4850<br />

Renaud, Pierre ......................................................... WeBT3.8 2564<br />

Renzaglia, Alessandro.............................................. TuCT2.7 1661<br />

.................................................................................. WeBT2.1 2460<br />

Resende, Cassius .................................................... ThAT1.1 3746<br />

Rétornaz, Philippe .................................................... ThCT4.2 4762<br />

Revzen, Shai ............................................................ ThCT4.7 4797<br />

Riazuelo, Luis........................................................... TuBT2.8 1277<br />

Richa, Rogerio.......................................................... WeCT2.6 2953<br />

.................................................................................. WeDPT10.6 *<br />

Richardson, Andrew ................................................. ThAT3.5 3881<br />

.................................................................................. ThBPT10.8 *<br />

Richier, Mathieu........................................................ ThBT9.1 4569<br />

Richtsfeld, Andreas .................................................. TuBT1.5 1201<br />

.................................................................................. TuCPT10.2 *<br />

Rickert, Markus......................................................... TuAT7.8 1063<br />

Riener, Robert .......................................................... WeCT4.7 3068<br />

Righetti, Ludovic....................................................... MoAT8.2 325<br />

.................................................................................. MoAT8.8 365<br />

.................................................................................. ThCT1 CC<br />

.................................................................................. ThCT1.4 4639<br />

.................................................................................. ThDPT10.1 *<br />

Rimon, Elon.............................................................. TuCT6 CC<br />

Rios-Martinez, Jorge ................................................ WeAT1.8 2014<br />

Risi, Sebastian.......................................................... WeBT8.7 2802<br />

Ristic-Durrant, Danijela............................................. TuAT1.1 786<br />

Ritter, Helge Joachim ............................................... TuBT7.4 1427<br />

.................................................................................. TuCPT10.16 *<br />

.................................................................................. WeCT2.5 2947<br />

.................................................................................. WeDPT10.5 *<br />

Rituerto, Jorge.......................................................... WeAT9.6 2383<br />

.................................................................................. WeBPT10.27 *<br />

Ritz, Robin................................................................ ThDT8.3 5179<br />

Ritzmann, Roy Earl................................................... MoAT5.7 215<br />

Rives, Patrick............................................................ WeCT9.6 3320<br />

.................................................................................. WeDPT10.27 *<br />

.................................................................................. ThBT2.3 4242<br />

Riviere, Cameron...................................................... ThBT7.8 4522<br />

.................................................................................. ThDT7.4 5160<br />

Roa, Maximo A......................................................... TuCT6.3 1768<br />

.................................................................................. ThBT5.7 4420<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />


Robert, Thomas........................................................ ThAT5.7 3969<br />

Robertsson, Anders.................................................. ThCT2.6 4704<br />

.................................................................................. ThDPT10.6 *<br />

Robuffo Giordano, Paolo .......................................... MoAT4.5 163<br />

.................................................................................. MoBPT10.11 *<br />

.................................................................................. WeAT6.4 2215<br />

.................................................................................. WeBPT10.16 *<br />

.................................................................................. WeCT4.3 3039<br />

Rocco, Paolo ............................................................ WeCT3.1 2971<br />

Rodemann, Tobias ................................................... MoAT3.2 106<br />

.................................................................................. MoAT3.6 131<br />

.................................................................................. MoBPT10.9 *<br />

.................................................................................. TuBT1.3 1187<br />

Röder, Thorsten........................................................ WeDT1.6 3375<br />

.................................................................................. ThAPT10.3 *<br />

Rodriguez, Alberto.................................................... TuCT6.8 1804<br />

Roesthuis, Roy ......................................................... WeBT3.7 2557<br />

Rogers III, John G. ................................................... TuBT2.6 1264<br />

.................................................................................. TuCPT10.6 *<br />

Rognini, Giulio .......................................................... ThCT1.8 4664<br />

Rogoff, Joshua ......................................................... ThCT4.6 4790<br />

.................................................................................. ThDPT10.12 *<br />

Rollinson, David........................................................ MoAT5.8 221<br />

.................................................................................. TuAT8.2 1075<br />

Romero-Ramirez, Miguel-Angel ............................... WeDT6.6 3571<br />

.................................................................................. ThAPT10.15 *<br />

Rosa, Benoît............................................................. TuBT5.5 1339<br />

.................................................................................. TuCPT10.11 *<br />

Rotea, Mario ............................................................. TuBT1.2 1180<br />

Roth, Bernard ........................................................... TuPL C<br />

Rotinat, Christine ...................................................... MoAT1.8 45<br />

Rouat, Jean .............................................................. MoBT3.3 518<br />

Roumeliotis, Stergios................................................ MoAT2.3 65<br />

.................................................................................. MoAT2.8 98<br />

.................................................................................. MoBT2.6 502<br />

.................................................................................. TuAPT10.6 *<br />

Rucker, Caleb........................................................... WeBT3.1 2517<br />

.................................................................................. ThAT1.4 3764<br />

.................................................................................. ThBPT10.1 *<br />

Ruehr, Thomas......................................................... ThBT2.6 4263<br />

.................................................................................. ThCPT10.6 *<br />

Ruepp, Oliver............................................................ WeCT9.3 3297<br />

Ruggiero, Fabio ........................................................ ThBT1.4 4194<br />

.................................................................................. ThCPT10.1 *<br />

Ruhnke, Michael....................................................... TuBT2.4 1249<br />

.................................................................................. TuCPT10.4 *<br />

Ruini, Fabio .............................................................. ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Ruppel, Alexander .................................................... WeAT9.3 2365<br />

Rus, Daniela ............................................................. MoBT9.4 756<br />

.................................................................................. TuAPT10.22 *<br />

.................................................................................. WeCT5.2 3087<br />

.................................................................................. WeDT8.2 3645<br />

.................................................................................. ThCT4.8 4803<br />

.................................................................................. ThDT8.4 5187<br />

Russell, Paul T. ........................................................ WeBT3.1 2517<br />

Russell, Shawn......................................................... ThAT5.7 3969<br />

Rusu, Radu Bogdan ................................................. ThBT2.4 4249<br />

.................................................................................. ThCPT10.4 *<br />

Rüther, Matthias ....................................................... ThAT6.4 3990<br />

.................................................................................. ThBPT10.16 *<br />

Rutter, Brandon ........................................................ MoAT5.7 215<br />

Rybok, Lukas............................................................ ThCT5.4 4819<br />

.................................................................................. ThDPT10.13 *<br />

Rydén, Fredrik .......................................................... WeBT4.8 2614<br />

Ryden, Thomas ........................................................ MoAT4.1 *<br />

Rye, David ................................................................ MoBT8.6 725<br />

.................................................................................. TuAPT10.21 *<br />

Ryu, Jee-Hwan ......................................................... MoAT4.7 177<br />

Ryu, Seok Chang ..................................................... WeBT3.8 2564<br />


S<br />

Saab, Layale............................................................. ThAT9.2 4127<br />

Saad, Maarouf.......................................................... WeAT8.7 2340<br />

Saarinen, Jari Pekka ................................................ MoAT2.2 59<br />

Sabattini, Lorenzo..................................................... WeAT8.4 2321<br />

.................................................................................. WeBPT10.22 *<br />

Sadeghian, Hamid.................................................... ThAT1.2 3752<br />

Saeedi Gharahbolagh, Sajad ................................... TuAT2.3 853<br />

.................................................................................. TuAT2.7 880<br />

Saenko, Kate............................................................ TuAT1.2 793<br />

Saffiotti, Alessandro.................................................. ThCT9.5 5007<br />

.................................................................................. ThDPT10.26 *<br />

Sahai, Ranjana......................................................... TuCT9.2 1919<br />

Saida, Masao............................................................ MoAT7.8 311<br />

Saito, Takefumi......................................................... ThCT7.2 4899<br />

Sakagami, Norimitsu ................................................ ThCT3.7 4756<br />

Sakamoto, Kiyoshi.................................................... WeAT2.3 2036<br />

Sakurai, Tatsuma ..................................................... WeDT3.8 3494<br />

Salaris, Paolo ........................................................... WeBT9.1 2817<br />

.................................................................................. ThAT1.5 3770<br />

.................................................................................. ThBPT10.2 *<br />

Salisbury, Kenneth ................................................... WeBT4.1 2570<br />

Samarasekera, Supun.............................................. ThAT3.4 3874<br />

.................................................................................. ThBPT10.7 *<br />

Sammut, Claude....................................................... MoBT8.7 732<br />

Sanahuja, Guillaume ................................................ WeBT6.4 2682<br />

.................................................................................. WeCPT10.16 *<br />

Sanchez, Fernando .................................................. TuBT8.8 1507<br />

Sanchez Fibla, Marti................................................. TuAT8.8 1115<br />

Sanchez Plazas, Oscar ............................................ WeCT5.4 3101<br />

.................................................................................. WeDPT10.13 *<br />

Sandini, Giulio .......................................................... ThAT9.7 4161<br />

.................................................................................. ThBT4.3 4354<br />

Sanemasa, Toru....................................................... TuCT5.2 1717<br />

Sangkyu, Yi .............................................................. ThAT6.5 3998<br />

.................................................................................. ThBPT10.17 *<br />

Sani, Hamidreza....................................................... TuCT9.3 1927<br />

Sankai, Yoshiyuki ..................................................... TuCT5.5 1737<br />

.................................................................................. WeAPT10.11 *<br />

.................................................................................. ThCT7.2 4899<br />

Sano, Kumiko ........................................................... WeDT3.3 3460<br />

Santos, Cristina ........................................................ WeAT7.7 2286<br />

Santos, Milton César Paes....................................... WeDT7.6 3620<br />

.................................................................................. ThAPT10.18 *<br />

Santos Sanchez, Omar Jacobo................................ WeBT6.4 2682<br />

.................................................................................. WeCPT10.16 *<br />

Santos-Victor, José .................................................. ThCT5.5 4826<br />

.................................................................................. ThDPT10.14 *<br />

Sarcinelli-Filho, Mario ............................................... ThAT1.1 3746<br />

Sardellitti, Irene......................................................... MoAT9.2 378<br />

.................................................................................. WeCT3 CC<br />

.................................................................................. ThAT7 C<br />

.................................................................................. ThAT7.1 4026<br />

Sarholz, Frederik ...................................................... ThBT9.4 4587<br />

.................................................................................. ThCPT10.25 *<br />

Saripalli, Srikanth...................................................... WeAT6 O<br />

.................................................................................. WeBT6 C<br />

.................................................................................. WeBT6 O<br />

Sarria, Javier Francisco............................................ TuBT8.8 1507<br />

Saruwatari, Hiroshi ................................................... MoBT3.2 *<br />

Sasaki, Hironobu ...................................................... ThCT2.1 4670<br />

Sato, Fuyuki.............................................................. WeCT7.2 3179<br />

Sato, Katsunari......................................................... MoAT4 O<br />

.................................................................................. MoAT4.4 157<br />

.................................................................................. MoBPT10.10 *<br />

Sato, Takahide ......................................................... TuCT8.3 1875<br />

.................................................................................. TuCT8.4 1881<br />

.................................................................................. WeAPT10.19 *<br />

Sato, Tomomasa ...................................................... WeDT2.5 3419<br />

.................................................................................. WeDT7.3 3601<br />


.................................................................................. ThAPT10.5 *<br />

Scaccia, Milena ........................................................ ThDT3.4 5054<br />

Scandaroli, Glauco Garcia........................................ WeCT9.8 3335<br />

Scaramuzza, Davide ................................................ TuCT2 CC<br />

.................................................................................. TuCT2.7 1661<br />

.................................................................................. ThBT6.8 4469<br />

Schaal, Stefan .......................................................... MoAT8.2 325<br />

.................................................................................. MoAT8.8 365<br />

.................................................................................. WeDT2.3 3407<br />

.................................................................................. ThCT1.4 4639<br />

.................................................................................. ThDPT10.1 *<br />

Schäfer, Felix............................................................ TuAT5.4 943<br />

.................................................................................. TuBPT10.10 *<br />

Schairer, Timo .......................................................... ThBT2.1 4229<br />

Schauerte, Boris ....................................................... TuBT1.1 1173<br />

Scherer, Sebastian ................................................... MoAT6.3 227<br />

.................................................................................. WeAT6.3 2207<br />

Scheuer, Alexis......................................................... ThAT8.6 4103<br />

.................................................................................. ThBPT10.24 *<br />

Schiele, Andre .......................................................... WeAT4.8 2158<br />

Schilling, Andreas..................................................... ThBT2.1 4229<br />

Schindler, Thorsten .................................................. TuBT7.5 1433<br />

.................................................................................. TuCPT10.17 *<br />

Schlaefer, Steven ..................................................... MoAT9.7 408<br />

Schlegel, Christian.................................................... WeAT2 CC<br />

.................................................................................. WeAT2.7 2064<br />

Schmidt-Rohr, Sven R.............................................. ThCT1.3 4633<br />

Schmidt-Wetekam, Christopher................................ WeBT7.5 2741<br />

.................................................................................. WeCPT10.20 *<br />

.................................................................................. ThAT8.5 4097<br />

.................................................................................. ThBPT10.23 *<br />

Schmidts, Alexander Markus.................................... TuAT6.7 1002<br />

Schmit, Nicolas......................................................... MoBT9.5 762<br />

.................................................................................. TuAPT10.23 *<br />

Schneider, Axel ........................................................ ThAT1.6 3776<br />

.................................................................................. ThBPT10.3 *<br />

Schneider, Jeff.......................................................... MoAT8.7 357<br />

.................................................................................. TuAT8.1 1069<br />

Schneider, Robert..................................................... TuBT5.3 1327<br />

Schneider, Ulrich ...................................................... ThCT2.6 4704<br />

.................................................................................. ThDPT10.6 *<br />

Schoelkopf, Bernhard ............................................... MoAT8.3 332<br />

.................................................................................. MoBT8.2 698<br />

Schoen, Timothy Ryan ............................................. ThCT4.8 4803<br />

Schrimpf, Johannes.................................................. WeBT9.8 2861<br />

Schroeter, Christof.................................................... WeBT1.5 2430<br />

Schultz, Tanja........................................................... ThCT5.4 4819<br />

.................................................................................. ThDPT10.13 *<br />

Schultz, Ulrik Pagh ................................................... WeDT8.4 3659<br />

.................................................................................. ThAPT10.19 *<br />

Schulz, Dirk .............................................................. WeDT7.5 3614<br />

.................................................................................. ThAPT10.17 *<br />

Schwager, Mac......................................................... ThDT8 CC<br />

.................................................................................. ThDT8.4 5187<br />

Schwartz, Ira............................................................. ThAT4.1 *<br />

.................................................................................. ThAT4.3 3905<br />

Schwartz, Matthijs P. ................................................ TuAT5.3 937<br />

Schwartz, Maxim ...................................................... TuAT9.6 1154<br />

.................................................................................. TuBPT10.24 *<br />

Scioni, Enea ............................................................. WeCT4.2 3031<br />

Secchi, Cristian......................................................... MoAT4 O<br />

.................................................................................. MoAT4.5 163<br />

.................................................................................. MoBPT10.11 *<br />

.................................................................................. WeAT8.4 2321<br />

.................................................................................. WeBPT10.22 *<br />

.................................................................................. ThBT9.8 4615<br />

Sedef, Mert ............................................................... WeBT3.4 2539<br />

.................................................................................. WeCPT10.7 *<br />

Seegmiller, Neal Andrew .......................................... MoBT6.4 607<br />

.................................................................................. MoBT6.5 613<br />


.................................................................................. TuAPT10.13 *<br />

.................................................................................. TuAPT10.14 *<br />

Seipel, Justin ............................................................ MoAT5 C<br />

.................................................................................. MoAT5 O<br />

.................................................................................. MoAT5.5 203<br />

.................................................................................. MoBT5 O<br />

.................................................................................. MoBPT10.14 *<br />

Seki, Hiroya .............................................................. WeAT8.3 2314<br />

Sekiyama, Kosuke.................................................... ThCT2.1 4670<br />

Senda, Kei................................................................ WeAT7.5 2274<br />

.................................................................................. WeAT7.6 2280<br />

.................................................................................. WeBPT10.20 *<br />

.................................................................................. WeBPT10.21 *<br />

Sentis, Luis............................................................... TuAT7.4 1036<br />

.................................................................................. TuBPT10.16 *<br />

.................................................................................. WeAT7 C<br />

.................................................................................. WeAT7.4 2267<br />

.................................................................................. WeBPT10.19 *<br />

Sepp, Wolfgang........................................................ WeDT7.7 3626<br />

Seps, Monika............................................................ WeDT3.2 3454<br />

Seyfarth, Andre......................................................... TuCT7.1 1811<br />

Shah, Preyas............................................................ WeCT3.4 2992<br />

.................................................................................. WeDPT10.7 *<br />

Shah, Shridhar.......................................................... ThAT4.4 3911<br />

.................................................................................. ThBPT10.10 *<br />

Shahbazi, Hossein.................................................... TuBT2.1 1228<br />

Shakhimardanov, Azamat ........................................ TuAT7.3 1030<br />

Shamma, Jeff ........................................................... ThCT4.3 4770<br />

Shammas, Elie ......................................................... ThBT3 CC<br />

.................................................................................. ThBT3.7 4329<br />

Shams, Sarmad........................................................ ThDT6.3 5127<br />

Shang, Jianzhong..................................................... TuAT5.5 949<br />

.................................................................................. TuBPT10.11 *<br />

Shao, Xiaowei........................................................... WeAT2.3 2036<br />

Shaw, Kendrick......................................................... MoAT5.4 197<br />

.................................................................................. MoBPT10.13 *<br />

Sheh, Raymond Ka-Man .......................................... MoBT8.7 732<br />

Shell, Dylan .............................................................. ThCT9 C<br />

.................................................................................. ThCT9.4 5001<br />

.................................................................................. ThDPT10.25 *<br />

Shen, Jinglin............................................................. TuBT1.2 1180<br />

Shen, Lincheng......................................................... MoBT5.5 568<br />

.................................................................................. TuAPT10.11 *<br />

Shen, Yantao............................................................ WeAT4.4 2133<br />

.................................................................................. WeBPT10.10 *<br />

.................................................................................. ThBT9.2 4575<br />

Shende, Apoorva...................................................... WeCT8.1 3227<br />

Sheng, Weihua......................................................... WeCT1 C<br />

.................................................................................. WeCT1.2 2873<br />

.................................................................................. WeDT2.1 3395<br />

Sherbert, Robert....................................................... TuAT7.2 1022<br />

Sherikov, Aleksander................................................ WeAT7.8 2292<br />

Shi, Chaoyang.......................................................... WeAT3.7 2115<br />

Shi, Yun.................................................................... WeAT2.3 2036<br />

Shibasaki, Ryosuke.................................................. WeAT2.3 2036<br />

Shibata, Mizuho........................................................ ThCT3.7 4756<br />

Shida, Kazuya .......................................................... TuBT8.7 1499<br />

Shih, Albert J. ........................................................... TuAT5.7 961<br />

Shiller, Zvi................................................................. WeDT5.7 3539<br />

Shimojo, Makoto....................................................... TuCT9.7 1954<br />

.................................................................................. ThAT7.4 4048<br />

.................................................................................. ThCT3.4 4735<br />

.................................................................................. ThDPT10.7 *<br />

Shimosaka, Masamichi............................................. WeDT2.5 3419<br />

.................................................................................. WeDT7.3 3601<br />

.................................................................................. ThAPT10.5 *<br />

Shin, Dongik ............................................................. ThDT6.3 5127<br />

Shin, Dongjun........................................................... TuCT7.4 1830<br />

.................................................................................. WeAPT10.16 *<br />

.................................................................................. WeCT3.4 2992<br />

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems<br>

–255–<br />

.................................................................................. WeCT3.5 2998<br />

.................................................................................. WeDPT10.7 *<br />

.................................................................................. WeDPT10.8 *<br />

Shin, Kyoosik............................................................ ThDT6.3 5127<br />

Shin, Seung Hoon .................................................... TuBT7.7 1447<br />

Shintake, Jun............................................................ ThCT3.4 4735<br />

.................................................................................. ThDPT10.7 *<br />

Shiriaev, Anton ......................................................... ThDT5.2 5094<br />

Shkolnik, Alexander.................................................. ThBT3.4 4307<br />

.................................................................................. ThCPT10.7 *<br />

Shkurti, Florian ......................................................... ThDT3.3 5048<br />

.................................................................................. ThDT3.4 5054<br />

Shomin, Michael ....................................................... WeBT6.2 2668<br />

Shou, Kaiyu .............................................................. TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Shyr, Alex ................................................................. TuAT1.2 793<br />

Siciliano, Bruno......................................................... TuCT6.4 1775<br />

.................................................................................. WeAPT10.13 *<br />

.................................................................................. ThAT1.2 3752<br />

Siegwart, Roland ...................................................... MoAT2.5 79<br />

.................................................................................. MoAT5.3 190<br />

.................................................................................. MoBT5.4 562<br />

.................................................................................. MoBPT10.5 *<br />

.................................................................................. TuAPT10.10 *<br />

.................................................................................. TuCT2.7 1661<br />

.................................................................................. WeAT6.5 2223<br />

.................................................................................. WeAT6.7 2235<br />

.................................................................................. WeAT6.8 2242<br />

.................................................................................. WeBT6.6 2694<br />

.................................................................................. WeBPT10.17 *<br />

.................................................................................. WeCPT10.18 *<br />

.................................................................................. ThAT2.4 3824<br />

.................................................................................. ThBPT10.4 *<br />

Sigrist, Roland .......................................................... WeCT4.7 3068<br />

Silveira, Geraldo ....................................................... WeCT9.8 3335<br />

Silver, David ............................................................. MoBT2.7 510<br />

Simaan, Nabil ........................................................... WeAT9.7 2390<br />

Simonin, Olivier ........................................................ ThAT8.6 4103<br />

.................................................................................. ThBPT10.24 *<br />

Simpson, Jason ........................................................ ThCT8.8 4975<br />

Singh, Sanjiv............................................................. MoAT6.3 227<br />

.................................................................................. WeAT6.3 2207<br />

.................................................................................. WeAT9.1 2352<br />

Singh, Surya ............................................................. MoBT8.6 725<br />

.................................................................................. TuAPT10.21 *<br />

Singleton, John......................................................... TuCT3.5 1687<br />

.................................................................................. WeAPT10.8 *<br />

Sinnet, Ryan W......................................................... TuCT5.3 1723<br />

Sirouspour, Shahin ................................................... MoBT7.2 645<br />

Sisbot, Emrah Akin ................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Sitti, Metin................................................................. MoBT5.3 556<br />

.................................................................................. TuBT3.4 1291<br />

.................................................................................. TuCT3.1 *<br />

.................................................................................. TuCT3.3 1674<br />

.................................................................................. TuCT3.5 1687<br />

.................................................................................. TuCPT10.7 *<br />

.................................................................................. WeAPT10.8 *<br />

.................................................................................. ThAT4.8 3937<br />

Sivalingam, Ravishankar .......................................... TuAT9.8 1167<br />

Sjöö, Kristoffer .......................................................... TuBT9.1 1513<br />

Skachek, Sergey ...................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Skantze, Gabriel ....................................................... WeDT1.1 3342<br />

Skocaj, Danijel.......................................................... WeDT1.8 3387<br />

Skourup, Charlotte.................................................... MoBPT10.16 *<br />

Smart, William .......................................................... ThAT8 CC<br />

.................................................................................. ThAT8.8 4115<br />

Smeraldi, Fabrizio..................................................... TuBT9.3 1526<br />

Smith, Ryan .............................................................. WeCT6 C<br>

.................................................................................. WeCT6 O<br />

.................................................................................. WeDT6 O<br />

Smith, Stephen L...................................................... WeCT5.2 3087<br />

.................................................................................. WeDT8 CC<br />

.................................................................................. WeDT8.2 3645<br />

Smith, William........................................................... WeCT3.6 3004<br />

.................................................................................. WeDPT10.9 *<br />

Smits, Ruben............................................................ WeCT4.2 3031<br />

.................................................................................. ThCT2.3 4684<br />

Soccol, Dean ............................................................ ThCT8.2 4935<br />

Solanes, Ernesto ...................................................... ThBT3.8 4335<br />

Soltero, Daniel E....................................................... WeDT8.2 3645<br />

Son, Hyoung Il.......................................................... WeCT4.3 3039<br />

Sone, Satoshi ........................................................... TuCT9.7 1954<br />

Song, Dan................................................................. TuAT6.4 980<br />

.................................................................................. TuBPT10.13 *<br />

Song, Yixu ................................................................ WeAT9.2 2359<br />

Sornmo, Olof ............................................................ ThCT2.6 4704<br />

.................................................................................. ThDPT10.6 *<br />

Sotzek, Alice............................................................. WeDT1.6 3375<br />

.................................................................................. ThAPT10.3 *<br />

Soueres, Philippe ..................................................... ThAT9.2 4127<br />

Soulignac, Michaël ................................................... WeDT5.4 3519<br />

.................................................................................. ThAPT10.10 *<br />

Sousa, João.............................................................. WeCT6.6 3154<br />

.................................................................................. WeDPT10.18 *<br />

Sousa Junior, Samuel Felix de................................. WeAT1.2 1974<br />

Spalanzani, Anne ..................................................... WeAT1.8 2014<br />

Spampinato, Giacomo.............................................. TuCT2.2 1626<br />

Spenko, Matthew...................................................... MoAT7 CC<br />

.................................................................................. MoAT7.7 305<br />

Spinello, Luciano ...................................................... ThAT2.6 3838<br />

.................................................................................. ThAT2.7 3844<br />

.................................................................................. ThBPT10.6 *<br />

Srinivasa, Siddhartha ............................................... TuCT6.8 1804<br />

.................................................................................. WeAT1.4 1986<br />

.................................................................................. WeBPT10.1 *<br />

Srinivasan, Mandyam............................................... WePL.1 *<br />

.................................................................................. ThCT8.2 4935<br />

Stachniss, Cyrill........................................................ MoAT2.6 86<br />

.................................................................................. MoBPT10.6 *<br />

.................................................................................. TuAT2.5 865<br />

.................................................................................. TuBPT10.5 *<br />

.................................................................................. WeAT5.5 2180<br />

.................................................................................. WeBPT10.14 *<br />

.................................................................................. ThBT2 C<br />

.................................................................................. ThBT2.4 4249<br />

.................................................................................. ThCPT10.4 *<br />

Stanley, Kenneth ...................................................... WeBT8.7 2802<br />

Stasse, Olivier .......................................................... ThBT5.5 4408<br />

.................................................................................. ThCPT10.14 *<br />

Steck, Andreas ......................................................... WeAT2.7 2064<br />

Steder, Bastian......................................................... TuBT2.4 1249<br />

.................................................................................. TuCPT10.4 *<br />

Steffen, Jan .............................................................. WeCT2.5 2947<br />

.................................................................................. WeDPT10.5 *<br />

Stegagno, Paolo....................................................... MoBT2.1 469<br />

Stein, David .............................................................. ThCT4.8 4803<br />

Steinberg, Daniel...................................................... TuBT9.4 1533<br />

.................................................................................. TuCPT10.22 *<br />

.................................................................................. ThCT3.2 4722<br />

Steinwender, Jasmin ................................................ WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Stelzer, Annett.......................................................... WeCPT10.6 *<br />

Stentz, Anthony ........................................................ MoBT2.7 510<br />

.................................................................................. WeBT8.4 2784<br />

.................................................................................. WeCPT10.22 *<br />

Sterkers, Olivier........................................................ WeBT3.3 2532<br />

Stiefelhagen, Rainer................................................. TuBT1.1 1173<br />

.................................................................................. ThCT5.4 4819<br />


.................................................................................. ThDPT10.13 *<br />

Stilman, Mike ............................................................ WeBT5.6 2640<br />

.................................................................................. WeCPT10.15 *<br />

.................................................................................. ThCT1 C<br />

.................................................................................. ThCT1.2 4627<br />

Stilwell, Daniel .......................................................... TuAT9.4 1140<br />

.................................................................................. TuBPT10.22 *<br />

.................................................................................. WeCT8.1 3227<br />

Stirling, Leia.............................................................. ThBT7.3 4488<br />

Stirling, Timothy........................................................ ThCT9.8 5027<br />

Stoianovici, Dan........................................................ TuAT5.4 943<br />

.................................................................................. TuBPT10.10 *<br />

Stolle, Christian ........................................................ TuAT3.3 894<br />

Stone, Peter.............................................................. ThBT9.3 4581<br />

Stoy, Kasper ............................................................. WeDT8 C<br />

.................................................................................. WeDT8.4 3659<br />

.................................................................................. ThAPT10.19 *<br />

Stramigioli, Stefano .................................................. TuAT5.3 937<br />

.................................................................................. WeAT3 C<br />

.................................................................................. WeAT3.1 2076<br />

Strasser, Wolfgang ................................................... ThBT2.1 4229<br />

Strausser, Katherine................................................. ThCT7.4 4911<br />

.................................................................................. ThDPT10.19 *<br />

Strefling, Paul ........................................................... ThCT3.6 4749<br />

.................................................................................. ThDPT10.9 *<br />

Strom, Johannes H................................................... ThBT2.7 4271<br />

Studley, Matthew ...................................................... ThCT9.3 4995<br />

Stulp, Freek .............................................................. MoAT8.2 325<br />

.................................................................................. WeDT2.3 3407<br />

Stump, Ethan............................................................ MoBT2 CC<br />

.................................................................................. WeBT6.8 2708<br />

Stumpf, Jan Carsten................................................. ThDT8.1 5166<br />

Sturm, Peter ............................................................. ThAT6.6 4006<br />

.................................................................................. ThBPT10.18 *<br />

Su, Hu....................................................................... WeDT2.7 3434<br />

Sucan, Ioan Alexandru ............................................. ThCT1.1 4621<br />

Suetani, Hiromichi .................................................... WeBT7.1 2715<br />

Sugahara, Yusuke .................................................... ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Sugano, Shigeki ....................................................... WeDT9.7 3734<br />

.................................................................................. ThAT1.8 3792<br />

.................................................................................. ThBT5 O<br />

.................................................................................. ThCT5 CC<br />

.................................................................................. ThCT5 O<br />

Sugi, Masao.............................................................. ThCT2.5 4698<br />

Sugimoto, Norikazu .................................................. WeCT7.3 3185<br />

Sugimoto, Shigeki..................................................... ThBT6.7 4463<br />

Sugimoto, Yasuhiro .................................................. TuCT7.2 1817<br />

Sugiura, Komei ......................................................... MoAT8.6 350<br />

.................................................................................. MoBPT10.24 *<br />

Suh, Il Hong.............................................................. WeDT7.2 3595<br />

Sujit, P.B................................................................... WeCT6.6 3154<br />

.................................................................................. WeDPT10.18 *<br />

Sukhatme, Gaurav.................................................... WeCT6.3 3132<br />

.................................................................................. WeCT6.5 3147<br />

.................................................................................. WeCT6.7 3160<br />

.................................................................................. WeDT6.5 3564<br />

.................................................................................. WeDPT10.17 *<br />

.................................................................................. ThAPT10.14 *<br />

.................................................................................. ThBT4.2 4348<br />

Sukkarieh, Salah ...................................................... ThBT2.5 4256<br />

.................................................................................. ThCPT10.5 *<br />

Sukthankar, Gita....................................................... WeBT4.4 2590<br />

.................................................................................. WeCPT10.10 *<br />

Sullivan, Brian........................................................... TuBT1.4 1194<br />

.................................................................................. TuCPT10.1 *<br />

Sullivan, Jenny ......................................................... TuCT5.1 1711<br />

Sumioka, Hidenobu .................................................. ThAT9.8 4168<br />

Sun, Baiqing ............................................................. ThCT2.1 4670<br />

Sun, Dali ................................................................... WeCT8.8 3276<br>

Sun, Dong................................................................. MoBT1.6 451<br />

.................................................................................. TuAPT10.3 *<br />

Sun, Xiaochuan ........................................................ WeAT3.4 2096<br />

.................................................................................. WeBPT10.7 *<br />

Sun, Yi...................................................................... ThDT3.1 5035<br />

Sun, Yu..................................................................... ThAT6.1 *<br />

Sünderhauf, Niko...................................................... TuBT2.2 1234<br />

Suppa, Michael......................................................... TuCT9.4 1933<br />

.................................................................................. WeAPT10.22 *<br />

.................................................................................. WeCT9.3 3297<br />

Suzuki, Kouki............................................................ ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Suzuki, Shota ........................................................... TuCT8.6 1895<br />

.................................................................................. WeAPT10.21 *<br />

Suzuki, Takahiro....................................................... TuBT1.7 1214<br />

Suzuki, Yosuke......................................................... TuCT9.7 1954<br />

Svec, Petr................................................................. TuAT9.6 1154<br />

.................................................................................. TuBPT10.24 *<br />

Svinin, Mikhail........................................................... ThBT4.8 4386<br />

Swaney, Philip.......................................................... WeBT3.1 2517<br />

Szczuka, Roman ...................................................... WeDT8.3 3653<br />

Szewczyk, Jérôme.................................................... TuBT5.4 1333<br />

.................................................................................. TuBT5.5 1339<br />

.................................................................................. TuCPT10.10 *<br />

.................................................................................. TuCPT10.11 *<br />

Sznitman, Raphael ................................................... WeCT2.6 2953<br />

.................................................................................. WeDPT10.6 *<br />

Szwaykowska, Klementyna...................................................... WeDT6.8 3583<br>

T<br>

Tabak, Ahmet Fatih .................................................. MoBT1.8 463<br />

Tachi, Susumu.......................................................... MoAT4.4 157<br />

.................................................................................. MoBPT10.10 *<br />

Tadakuma, Kenjiro ................................................... TuBT6.6 1386<br />

.................................................................................. TuBT6.7 1392<br />

.................................................................................. TuCPT10.15 *<br />

.................................................................................. ThAT7.4 4048<br />

.................................................................................. ThBPT10.19 *<br />

Tadakuma, Riichiro................................................... ThAT7.4 4048<br />

.................................................................................. ThBPT10.19 *<br />

Tadano, Kotaro......................................................... TuAT5.2 931<br />

Tadokoro, Satoshi .................................................... MoAT6 C<br />

.................................................................................. MoAT6 O<br />

.................................................................................. MoBT6 CC<br />

.................................................................................. MoBT6 O<br />

.................................................................................. WeDT3.7 3488<br />

.................................................................................. WeDT3.8 3494<br />

Tae, Kyung ............................................................... TuAT5.8 967<br />

Tagawa, Yasutaka.................................................... ThCT2.7 4710<br />

Taghirad, Hamid....................................................... TuAT9.2 1128<br />

Taguchi, Yuichi......................................................... ThCT2.4 4690<br />

.................................................................................. ThDPT10.4 *<br />

Tahara, Kenji ............................................................ TuCT7.6 1843<br />

.................................................................................. WeAPT10.18 *<br />

.................................................................................. ThBT1.5 4201<br />

.................................................................................. ThCPT10.2 *<br />

Taji, Kouichi.............................................................. WeBT7.4 2735<br />

.................................................................................. WeCPT10.19 *<br />

Takahashi, Jun ......................................................... WeCT7.2 3179<br />

Takahashi, Koji......................................................... ThBT6.4 4442<br />

.................................................................................. ThCPT10.16 *<br />

Takaki, Takeshi ........................................................ TuBT1.6 1208<br />

.................................................................................. TuCPT10.3 *<br />

Takanishi, Atsuo....................................................... WeCT7.8 3221<br />

Takano, Wataru........................................................ WeDT2.2 3401<br />

Takaoka, Shunichi.................................................... TuAT8.3 1081<br />

Takayama, Leila ....................................................... WeBT1.8 2452<br />

Takemura, Noriko..................................................... WeCT1.1 2867<br />

Takeshita, Keisuke ................................................... MoAT4.4 157<br />

.................................................................................. MoBPT10.10 *<br />

Takeuchi, Masaru..................................................... MoBT1.7 457<br />

2011 IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

Takhmar, Amir .......................................................... MoBT7.3 653<br />

Takiguchi, Kingo ....................................................... MoAT1.1 1<br />

Takubo, Tomohito..................................................... MoAT1.5 25<br />

.................................................................................. MoBT1.2 427<br />

.................................................................................. MoBPT10.2 *<br />

.................................................................................. TuAT1.4 807<br />

.................................................................................. TuBPT10.1 *<br />

Takubo, Toshio......................................................... WeBT1.4 2423<br />

.................................................................................. WeCPT10.1 *<br />

Takumi, Yoshida....................................................... MoAT4.4 157<br />

.................................................................................. MoBPT10.10 *<br />

Tamjidi, Amir Hossein............................................... TuAT9.2 1128<br />

Tamura, Yusuke ....................................................... WeCT7.6 3207<br />

.................................................................................. WeDPT10.21 *<br />

Tan, Jindong............................................................. TuCT2.8 1668<br />

Tan, Min.................................................................... WeDT2.7 3434<br />

Tan, Sia Nguan Eugene ........................................... ThCT7.3 4905<br />

Tan, Xiaobo .............................................................. MoBT5.8 588<br />

Tanaka, Daiki............................................................ WeBT7.6 2747<br />

.................................................................................. WeCPT10.21 *<br />

Tanaka, Fumihide..................................................... WeBT1.2 2409<br />

Tanaka, Kanji............................................................ TuAT2.6 872<br />

.................................................................................. TuBPT10.6 *<br />

Tanaka, Yoshiyuki .................................................... TuCT5.2 1717<br />

Tang, Chaoquan....................................................... TuCT8.2 1869<br />

Tangorra, James ...................................................... MoBT5.7 580<br />

Tani, Atsushi............................................................. WeBT1.4 2423<br />

.................................................................................. WeCPT10.1 *<br />

Tanikawa, Tamio ...................................................... MoAT1.5 25<br />

.................................................................................. MoBPT10.2 *<br />

Tanner, Herbert G. ................................................... ThAT4.4 3911<br />

.................................................................................. ThBPT10.10 *<br />

Tao, Xinyong ............................................................ TuCT3.8 1705<br />

Tardos, Juan D. ........................................................ MoAT2 C<br />

.................................................................................. MoAT2.1 51<br />

.................................................................................. TuBT2.8 1277<br />

Täubig, Holger .......................................................... TuCT1.4 1585<br />

.................................................................................. WeAPT10.1 *<br />

Tavakoli, Mahdi ........................................................ MoBT7.4 659<br />

.................................................................................. MoBT9.1 738<br />

.................................................................................. TuAPT10.16 *<br />

Tavakoli, Mahmoud .................................................. ThBT2.8 4277<br />

Tayama, Munenori.................................................... ThCT7.2 4899<br />

Taylor, Brian ............................................................. MoAT5.7 215<br />

Taylor, Camillo Jose ................................................. TuAT7.6 1048<br />

.................................................................................. TuBPT10.18 *<br />

.................................................................................. ThCT4.4 4776<br />

.................................................................................. ThDPT10.10 *<br />

Taylor, Russell H. ..................................................... MoBT7.1 639<br />

.................................................................................. WeCT2.6 2953<br />

.................................................................................. WeDPT10.6 *<br />

Tedrake, Russ .......................................................... WeBT6.7 2700<br />

Tegopoulou, Anastasia............................................. WeCT2.1 2922<br />

Tehrani Nik Nejad, Hossein...................................... ThBT6.4 4442<br />

.................................................................................. ThCPT10.16 *<br />

Teller, Seth ............................................................... ThBT3.4 4307<br />

.................................................................................. ThCPT10.7 *<br />

Temel, Fatma Zeynep .............................................. MoBT1.8 463<br />

Tenenholtz, Neil........................................................ TuBT5.3 1327<br />

Tenorth, Moritz ......................................................... ThBT2.6 4263<br />

.................................................................................. ThCPT10.6 *<br />

ter Mors, Adriaan W.................................................. WeAT5.3 2166<br />

Terada, Kazuki ......................................................... TuCT9.7 1954<br />

.................................................................................. ThAT7.4 4048<br />

.................................................................................. ThBPT10.19 *<br />

Tercero Villagran, Carlos Rafael .............................. MoBT1.4 439<br />

.................................................................................. TuAPT10.1 *<br />

.................................................................................. WeAT3.7 2115<br />

Terekhov, Alexander V. ............................................ ThBT4.4 4360<br />

.................................................................................. ThCPT10.10 *<br />

Tesch, Matthew ........................................................ MoAT8.7 357<br />

.................................................................................. TuAT8.1 1069<br />

Teuliere, Celine ........................................................ ThCT8.1 4929<br />

Tews, Ashley Desmond............................................ TuAT1 CC<br />

.................................................................................. TuAT1.8 834<br />

Thakur, Atul .............................................................. TuAT9.6 1154<br />

.................................................................................. TuBPT10.24 *<br />

Theodorou, Evangelos ............................................. MoAT8.2 325<br />

.................................................................................. WeDT2.3 3407<br />

Thielmann, Sophie Charlotte Franziska ................... WeCT4.1 3023<br />

Thobbi, Anand .......................................................... WeCT1.2 2873<br />

Thompson, David ..................................................... WeCT6.4 3140<br />

.................................................................................. WeDPT10.16 *<br />

Thompson, Paul ....................................................... ThBT2.2 4236<br />

Thorson, Ivar ............................................................ MoAT9.4 390<br />

.................................................................................. MoBPT10.25 *<br />

Thrun, Sebastian ...................................................... ThPT11.1 *<br />

Thuilot, Benoit........................................................... ThAT8.1 4072<br />

.................................................................................. ThBT9.1 4569<br />

Thurrowgood, Saul ................................................... ThCT8.2 4935<br />

Tian, Weicheng......................................................... TuCT8.1 1863<br />

Tipaldi, Gian Diego................................................... WeAT1.1 1968<br />

Tobergte, Andreas.................................................... WeCT4.1 3023<br />

Tokekar, Pratap........................................................ MoBT2.4 488<br />

.................................................................................. TuAPT10.4 *<br />

Tokuda, Isao............................................................. WeBT7.6 2747<br />

.................................................................................. WeCPT10.21 *<br />

Tolley, Michael Thomas............................................ ThBT4.5 4366<br />

.................................................................................. ThCPT10.11 *<br />

Tombari, Federico .................................................... ThCT6.4 4857<br />

.................................................................................. ThDPT10.16 *<br />

Tominaga, Shoji........................................................ WeDT2.5 3419<br />

.................................................................................. ThAPT10.5 *<br />

Tomita, Kyohei.......................................................... MoAT1.3 13<br />

Tomiyama, Ken ........................................................ WeBT2.2 2466<br />

Tomizuka, Masayoshi............................................... TuCT8.5 1887<br />

.................................................................................. WeAPT10.20 *<br />

.................................................................................. WeBT2.3 2474<br />

Tomlin, Claire ........................................................... WeCT3.2 2979<br />

Tong, Chi Hay........................................................... MoBT6.8 631<br />

Torfs, Serge.............................................................. ThBT7.1 4477<br />

Tornero, Josep ......................................................... ThBT3.8 4335<br />

Torres, Luis G........................................................... ThDT7.3 5153<br />

Tow, Adela................................................................ ThCT7.6 4923<br />

.................................................................................. ThDPT10.21 *<br />

Toyama, Shigeki....................................................... MoBT9.3 750<br />

Transeth, Aksel Andreas .......................................... MoAT6.6 247<br />

.................................................................................. MoBPT10.18 *<br />

Tremblay, Charles .................................................... TuBT3.3 1285<br />

Trentini, Michael ....................................................... TuAT2.3 853<br />

.................................................................................. TuAT2.7 880<br />

Trevor, Alexander J B............................................... TuBT2.6 1264<br />

.................................................................................. TuCPT10.6 *<br />

Trianni, Vito .............................................................. ThCT9.8 5027<br />

Trinkle, Jeff............................................................... TuBT7 CC<br />

.................................................................................. TuBT7.5 1433<br />

.................................................................................. TuCT6 C<br />

.................................................................................. TuCPT10.17 *<br />

Troiani, Chiara.......................................................... WeBT2.1 2460<br />

Troni, Giancarlo ........................................................ WeDT9.5 3722<br />

.................................................................................. ThAPT10.23 *<br />

Tsagarakis, Nikolaos ................................................ MoAT8.1 318<br />

.................................................................................. MoAT9.2 378<br />

.................................................................................. ThAT7.1 4026<br />

Tsakiris, Dimitris ....................................................... ThAT7.5 4054<br />

.................................................................................. ThBPT10.20 *<br />

Tsiotras, Panagiotis.................................................. WeDT5.1 3501<br />

.................................................................................. WeDT5.2 3507<br />

.................................................................................. WeDT5.6 3533<br />

.................................................................................. ThAPT10.12 *<br />


Tsuchiya, Kazuo ....................................................... WeAT7.5 2274<br />

.................................................................................. WeAT7.6 2280<br />

.................................................................................. WeBPT10.20 *<br />

.................................................................................. WeBPT10.21 *<br />

Tsugawa, Sadayuki .................................................. ThAT8.7 4109<br />

Tsuji, Tokuo .............................................................. ThBT5.3 4392<br />

Tsuji, Toshio ............................................................. TuCT5.2 1717<br />

Tsujino, Hiroshi......................................................... ThAT1.8 3792<br />

Tsukahara, Atsushi................................................... TuCT5.5 1737<br />

.................................................................................. WeAPT10.11 *<br />

Tully, Stephen........................................................... MoBT2.3 482<br />

.................................................................................. TuBT5.7 1353<br />

.................................................................................. ThDT7.2 5147<br />

Tunnell, Robert ......................................................... TuAT3.3 894<br />

Turkseven, Melih ...................................................... WeAT4.5 2139<br />

.................................................................................. WeBPT10.11 *<br />

U<br />

Ueda, Jun ................................................................. TuCT9.5 1940<br />

.................................................................................. WeAT4.5 2139<br />

.................................................................................. WeAPT10.23 *<br />

.................................................................................. WeBPT10.11 *<br />

Ueda, Tomohiro........................................................ ThCT3.7 4756<br />

Ueyama, Tsuyoshi.................................................... ThCT2.5 4698<br />

.................................................................................. ThDPT10.5 *<br />

Ugurlu, Barkan.......................................................... MoAT8.1 318<br />

Ulbrich, Stefan .......................................................... TuCT6.2 1761<br />

Ulmen, John ............................................................. ThDT5.3 5100<br />

Ulusoy, Alphan ......................................................... WeCT5.2 3087<br />

Underwood, James Patrick....................................... MoAT6.7 255<br />

Unluhisarcikli, Ozer................................................... ThCT7.1 4893<br />

Uno, Yoji ................................................................... WeBT7.4 2735<br />

.................................................................................. WeCPT10.19 *<br />

Urcola, Pablo ............................................................ WeCT1.4 2887<br />

.................................................................................. WeDPT10.1 *<br />

V<br />

Vahrenkamp, Nikolaus ............................................. TuCT6.2 1761<br />

Vaidyanathan, Ravi .................................................. MoAT5 O<br />

.................................................................................. MoBT5 CC<br />

.................................................................................. MoBT5 O<br />

.................................................................................. ThAT7.3 4042<br />

Vallery, Heike ........................................................... WeCT4.7 3068<br />

Vallone, Luca............................................................ WeCT7.4 3192<br />

.................................................................................. WeDPT10.19 *<br />

Valls Miro, Jaime ...................................................... TuAT2 C<br />

.................................................................................. TuCT2.4 1640<br />

.................................................................................. WeAPT10.4 *<br />

Valtazanos, Aris........................................................ WeDT8.7 3679<br />

van den Berg, Jur ..................................................... WeBT5.7 2646<br />

van den Boom, Ton .................................................. WeBT7.3 2729<br />

Van der Loos, H.F. Machiel ...................................... WeAT1.5 1994<br />

.................................................................................. WeBPT10.2 *<br />

van der Smagt, Patrick ............................................. MoBT7 CC<br />

.................................................................................. MoBT7.6 672<br />

.................................................................................. TuAPT10.18 *<br />

van Rossum, Anne ................................................... ThCT4.5 4783<br />

.................................................................................. ThDPT10.11 *<br />

van Toll, Wouter ....................................................... WeDT5.5 3526<br />

.................................................................................. ThAPT10.11 *<br />

van Veen, Youri ........................................................ WeBT3.7 2557<br />

Vander Hook, Joshua ............................................... MoBT2.4 488<br />

.................................................................................. TuAPT10.4 *<br />

Varga, Maja .............................................................. ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Vartholomeos, Panagiotis......................................... ThBT7.6 4508<br />

.................................................................................. ThCPT10.21 *<br />

Vasilyev, Nikolay ...................................................... TuBT5.3 1327<br />

.................................................................................. WeAT3.2 2083<br />

Vassallo, Raquel Frizera .......................................... WeDT7.6 3620<br />

.................................................................................. ThAPT10.18 *<br />

Vasseur, Pascal........................................................ ThAT6.6 4006<br />

.................................................................................. ThBPT10.18 *<br />

Vaughan, Richard..................................................... WeBT8 C<br />

Vaz, Luís................................................................... WeBT1.6 2438<br />

.................................................................................. WeCPT10.3 *<br />

Veloso, Manuela....................................................... MoAT2.4 73<br />

.................................................................................. MoBPT10.4 *<br />

.................................................................................. WeAT8.5 2327<br />

.................................................................................. WeBPT10.23 *<br />

.................................................................................. WeDT8.1 3638<br />

Venture, Gentiane .................................................... WeDT9.2 3701<br />

.................................................................................. ThCT2.7 4710<br />

Veon, Kevin .............................................................. WeAT9.5 2377<br />

.................................................................................. WeBPT10.26 *<br />

Vercher, Jean-Louis ................................................. TuBT5.4 1333<br />

.................................................................................. TuCPT10.10 *<br />

Verl, Alexander......................................................... WeAT9.3 2365<br />

Vernaza, Paul........................................................... WeAT5.6 2186<br />

.................................................................................. WeBPT10.15 *<br />

Verschure, Paul........................................................ TuAT8.8 1115<br />

Verspecht, Jonathan................................................. ThBT7.1 4477<br />

Vial, John.................................................................. TuAT2.8 886<br />

Vicentini, Federico.................................................... WeCT9.7 3327<br />

Vidal-Calleja, Teresa A............................................. MoAT2.7 92<br />

.................................................................................. WeBT2.5 2489<br />

.................................................................................. WeCPT10.5 *<br />

Vijayakumar, Sethu .................................................. MoBT8.5 718<br />

.................................................................................. TuAPT10.20 *<br />

Villani, Luigi .............................................................. ThAT1.2 3752<br />

.................................................................................. ThBT1 CC<br />

.................................................................................. ThBT1.4 4194<br />

.................................................................................. ThCPT10.1 *<br />

Vincze, Markus......................................................... TuAT1.5 813<br />

.................................................................................. TuBT1.5 1201<br />

.................................................................................. TuBPT10.2 *<br />

.................................................................................. TuCPT10.2 *<br />

.................................................................................. ThCT6.5 4865<br />

.................................................................................. ThDPT10.17 *<br />

Vitiello, Valentina...................................................... TuAT5.5 949<br />

.................................................................................. TuBPT10.11 *<br />

Vogel, Christian ........................................................ WeDT1.3 3355<br />

Vogel, Joern ............................................................. MoBT7.6 672<br />

.................................................................................. TuAPT10.18 *<br />

Voigt, Rainer............................................................. WeBT6.6 2694<br />

.................................................................................. WeCPT10.18 *<br />

Volkhardt, Michael.................................................... WeBT1.5 2430<br />

Volpe, Richard.......................................................... MoBT6.1 *<br />

von Stryk, Oskar....................................................... ThCT5.3 4811<br />

Vona, Marsette ......................................................... TuBT7 C<br />

.................................................................................. TuBT7.6 1439<br />

.................................................................................. TuCPT10.18 *<br />

Vonthron, Manuel ..................................................... TuAT3.4 901<br />

.................................................................................. TuBT3.3 1285<br />

.................................................................................. TuBPT10.7 *<br />

Vorst, Philipp ............................................................ ThBT2.1 4229<br />

Voyles, Richard ........................................................ WeAT9.5 2377<br />

.................................................................................. WeBPT10.26 *<br />

.................................................................................. WeCT3 C<br />

Vrecko, Alen ............................................................. WeDT1.8 3387<br />

W<br />

Wade, Eric................................................................ WeBT1 CC<br />

.................................................................................. WeBT1.1 2403<br />

Wagner, Glenn ......................................................... WeCT8.6 3260<br />

.................................................................................. WeDPT10.24 *<br />

Wagner, René .......................................................... WeCT9.4 3305<br />

.................................................................................. WeDPT10.25 *<br />

Wahl, Friedrich M. .................................................... WeBT2.4 2481<br />

.................................................................................. WeCPT10.4 *<br />

Walker, Ian ............................................................... TuAT8.4 1087<br />

.................................................................................. TuBPT10.19 *<br />

.................................................................................. ThAT7.6 4060<br />


.................................................................................. ThBPT10.21 *<br />

.................................................................................. ThCT6.6 4871<br />

.................................................................................. ThDPT10.18 *<br />

Walter, Christoph...................................................... WeDT1.3 3355<br />

Walter, Matthew........................................................ ThBT3.4 4307<br />

.................................................................................. ThCPT10.7 *<br />

Wang, Dangxiao ....................................................... WeBT4.6 2602<br />

.................................................................................. WeCPT10.12 *<br />

Wang, Fei ................................................................. MoAT4.8 184<br />

Wang, Hesheng........................................................ WeCT2.7 2959<br />

Wang, Hongbo.......................................................... WeAT1.7 2008<br />

Wang, Jianxun.......................................................... MoBT5.8 588<br />

Wang, Keyi ............................................................... WeDT2.8 3440<br />

Wang, Long .............................................................. TuBT6.5 1380<br />

.................................................................................. TuCPT10.14 *<br />

Wang, Ping ............................................................... TuCT5.6 1743<br />

.................................................................................. WeAPT10.12 *<br />

Wang, Ping Chuan ................................................... ThCT6.7 4877<br />

Wang, Shuhui ........................................................... ThDT4.1 5061<br />

Wang, Tianmiao ....................................................... TuCT8.1 1863<br />

Wang, Weifu ............................................................. ThBT3.6 4321<br />

.................................................................................. ThCPT10.9 *<br />

Wang, Yao-Dong ...................................................... TuBT1.6 1208<br />

.................................................................................. TuCPT10.3 *<br />

Wang, Yuechao ........................................................ TuCT8.2 1869<br />

Wang, Yunxia ........................................................... MoAT4.8 184<br />

Wang, Zhikun ........................................................... MoAT8.3 332<br />

Wang, Zhongli .......................................................... WeCT2.7 2959<br />

Wang, Zhuowei......................................................... WeDT8.3 3653<br />

Warneken, Felix........................................................ WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Warnier, Matthieu ..................................................... WeCT1.5 2895<br />

.................................................................................. WeDPT10.2 *<br />

Warren, Frank........................................................... TuBT5.2 1321<br />

Waslander, Steven Lake .......................................... ThCT8.6 4961<br />

.................................................................................. ThDPT10.24 *<br />

Watanabe, Hiroki ...................................................... TuAT5.6 955<br />

.................................................................................. TuBPT10.12 *<br />

Watanabe, Kouichi ................................................... MoAT4.4 157<br />

.................................................................................. MoBPT10.10 *<br />

Watanabe, Tetsuyou ................................................ TuBT6.8 1398<br />

Watanabe, Wataru.................................................... TuCT8.6 1895<br />

.................................................................................. WeAPT10.21 *<br />

Weaver, Kyle D......................................................... WeBT3.1 2517<br />

Webster III, Robert James........................................ WeBT3.1 2517<br />

.................................................................................. ThAT1 C<br />

.................................................................................. ThAT1.4 3764<br />

.................................................................................. ThBPT10.1 *<br />

Wedge, Nathan......................................................... WeBT5.3 2620<br />

Wei, Gu-Yeon ........................................................... TuAT8.6 1099<br />

.................................................................................. TuBPT10.21 *<br />

Wei, Hung-Yuan ....................................................... TuCT9.8 1962<br />

Weinberg, Brian........................................................ ThCT7.1 4893<br />

Weiss, Stephan ........................................................ TuCT2.7 1661<br />

.................................................................................. WeAT6.7 2235<br />

.................................................................................. WeAT6.8 2242<br />

.................................................................................. WeBT6.6 2694<br />

.................................................................................. WeCPT10.18 *<br />

Weisshardt, Florian................................................... WeAT9.3 2365<br />

Weisz, Jonathan ....................................................... TuBT6.5 1380<br />

.................................................................................. TuCPT10.14 *<br />

Welihena Gamage, Kumudu Chalaka Gamage .................. ThCT7.3 4905<br />

Wenger, Philippe ...................................................... TuBT7.8 1453<br />

Wettergreen, David................................................... MoBT6.4 607<br />

.................................................................................. MoBT6.5 613<br />

.................................................................................. TuAPT10.13 *<br />

.................................................................................. TuAPT10.14 *<br />

Weyers, Christopher................................................. TuAT2.4 859<br />

.................................................................................. TuBPT10.4 *<br />

Whitcomb, Louis....................................................... WeDT9.5 3722<br />

.................................................................................. ThAPT10.23 *<br />

White, Paul ............................................................... MoAT9.7 408<br />

Whittaker, Chuck ...................................................... ThAT2.3 3816<br />

Whittaker, William..................................................... ThAT2.3 3816<br />

Wieber, Pierre-Brice ................................................. WeAT7.8 2292<br />

.................................................................................. ThAT3.6 3887<br />

.................................................................................. ThBPT10.9 *<br />

Williams, David......................................................... ThCT3.5 4741<br />

.................................................................................. ThDPT10.8 *<br />

Williams, Ryan.......................................................... ThBT4.2 4348<br />

Williams, Stefan Bernard.......................................... TuBT2.3 1242<br />

.................................................................................. TuBT9.4 1533<br />

.................................................................................. TuCPT10.22 *<br />

.................................................................................. ThBT6.6 4455<br />

.................................................................................. ThCT3.2 4722<br />

.................................................................................. ThCPT10.18 *<br />

Willimon, Bryan......................................................... ThCT6.6 4871<br />

.................................................................................. ThDPT10.18 *<br />

Wimboeck, Thomas.................................................. TuAT6.3 973<br />

.................................................................................. WeCT7.5 3199<br />

.................................................................................. WeDPT10.20 *<br />

.................................................................................. ThBT1.7 4215<br />

Windau, Jens............................................................ WeAT9.8 2397<br />

Winfield, Alan............................................................ ThCT9.3 4995<br />

Winkens, Christian.................................................... WeDT5.8 3545<br />

Wisse, Martijn........................................................... TuAT6.6 995<br />

.................................................................................. TuBPT10.15 *<br />

.................................................................................. ThAT5.5 3957<br />

.................................................................................. ThBPT10.14 *<br />

Witick, Martha........................................................... ThAT8.8 4115<br />

Wittmeier, Steffen..................................................... TuAT7.8 1063<br />

.................................................................................. ThAT9.5 4148<br />

.................................................................................. ThBPT10.26 *<br />

Wohlkinger, Walter ................................................... ThCT6.5 4865<br />

.................................................................................. ThDPT10.17 *<br />

Wolf, Denis Fernando............................................... TuAT2 CC<br />

Wolf, Michael............................................................ ThCT3.3 4728<br />

Wolf, Peter................................................................ WeCT4.7 3068<br />

Wollherr, Dirk............................................................ WeCT5.6 3114<br />

.................................................................................. WeDPT10.15 *<br />

Wong, Liang Jie........................................................ WeDT6.3 3551<br />

Wong, Uland............................................................. ThAT2.3 3816<br />

Wood, John .............................................................. WeAT8.6 2333<br />

Wood, Levi................................................................ ThAT4.7 3931<br />

Wood, Nathan........................................................... ThBT7.8 4522<br />

Wood, Robert ........................................................... MoAT1.6 31<br />

.................................................................................. MoAT9.3 384<br />

.................................................................................. MoAT9.8 414<br />

.................................................................................. MoBT5.2 *<br />

.................................................................................. MoBPT10.3 *<br />

.................................................................................. TuAT8 C<br />

.................................................................................. TuAT8.6 1099<br />

.................................................................................. TuAT8.7 1107<br />

.................................................................................. TuBT8.4 1479<br />

.................................................................................. TuBPT10.21 *<br />

.................................................................................. TuCT9.2 1919<br />

.................................................................................. TuCPT10.19 *<br />

.................................................................................. ThBT7.3 4488<br />

.................................................................................. ThCT9.2 4989<br />

.................................................................................. ThDT4.3 5073<br />

Woodward, Matthew................................................. MoBT5.3 556<br />

Worcester, James..................................................... ThCT4.6 4790<br />

.................................................................................. ThDPT10.12 *<br />

Wortmann, Tim......................................................... TuBT3.5 1297<br />

.................................................................................. TuCPT10.8 *<br />

Wrede, Britta............................................................. TuBT1.3 1187<br />

Wrede, Sebastian..................................................... WeCT3.7 3011<br />

Wu, Licheng.............................................................. ThDT4.1 5061<br />

Wu, Wenqiang.......................................................... TuBT8.3 1473<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–260–<br />

Wu, Yen-Chang ........................................................ WeCT2.8 2965<br />

Wuensche, Hans J ................................................... ThBT8.6 4562<br />

.................................................................................. ThCPT10.24 *<br />

Wurm, Kai M............................................................. ThBT2.4 4249<br />

.................................................................................. ThCPT10.4 *<br />

Wüsthoff, Tilo............................................................ TuCT9.4 1933<br />

.................................................................................. WeAPT10.22 *<br />

.................................................................................. WeDT1.5 3367<br />

.................................................................................. ThAPT10.2 *<br />

Wyeth, Gordon ......................................................... MoBT8.1 692<br />

X<br />

Xi, Ning ..................................................................... MoAT4.6 171<br />

.................................................................................. MoAT4.8 184<br />

.................................................................................. MoBPT10.12 *<br />

.................................................................................. WeAT4.4 2133<br />

.................................................................................. WeBPT10.10 *<br />

Xia, Tian ................................................................... MoBT7.1 639<br />

Xiang, Xianbo ........................................................... WeDT6.4 3558<br />

.................................................................................. ThAPT10.13 *<br />

Xiao, Jing.................................................................. WeBT4.6 2602<br />

.................................................................................. WeCPT10.12 *<br />

.................................................................................. ThBT1 C<br />

.................................................................................. ThBT1.6 4207<br />

.................................................................................. ThCPT10.3 *<br />

Xiao, Junhao............................................................. TuAT1.3 801<br />

Xie, Dan.................................................................... WeDT1.2 3349<br />

Xin, Ming................................................................... MoAT7.5 292<br />

.................................................................................. MoBPT10.20 *<br />

Xu, Anqi .................................................................... ThDT3.3 5048<br />

Xu, De....................................................................... WeDT2.7 3434<br />

Xu, Kai ...................................................................... TuAT5.7 961<br />

Xu, Ling .................................................................... WeBT8.4 2784<br />

.................................................................................. WeCPT10.22 *<br />

Xu, Yangsheng ......................................................... TuBT8.5 1487<br />

.................................................................................. TuCPT10.20 *<br />

Y<br />

Yamada, Hiroya........................................................ TuAT8.3 1081<br />

Yamada, Yasunori .................................................... TuBT8.7 1499<br />

Yamada, Yoji ............................................................ WeCT4.6 3060<br />

.................................................................................. WeDPT10.12 *<br />

Yamamoto, Akio ....................................................... ThCT1.8 4664<br />

Yamamoto, Kiyohito ................................................. WeAT3.7 2115<br />

Yamamoto, Ko.......................................................... ThCT9.7 5021<br />

Yamamoto, Motoji..................................................... TuCT7.6 1843<br />

.................................................................................. WeAPT10.18 *<br />

.................................................................................. ThBT4.8 4386<br />

.................................................................................. ThDT5 CC<br />

.................................................................................. ThDT6.4 5133<br />

Yamanishi, Yoko....................................................... TuBT3.7 1309<br />

Yamashita, Tsuyoshi ................................................ WeAT7.5 2274<br />

.................................................................................. WeAT7.6 2280<br />

.................................................................................. WeBPT10.20 *<br />

.................................................................................. WeBPT10.21 *<br />

Yamato, Hideaki ....................................................... WeBT2.2 2466<br />

Yan, Liang ................................................................ MoBT9.2 744<br />

Yang, Chenguang..................................................... ThAT9.1 4121<br />

Yang, Guang-Zhong ................................................. MoBT3.7 542<br />

.................................................................................. TuAT5.5 949<br />

.................................................................................. TuBPT10.11 *<br />

Yang, Guosheng....................................................... ThDT4.1 5061<br />

Yang, Hai.................................................................. ThDT4.2 5067<br />

Yang, Jianyu............................................................. WeDT2.8 3440<br />

Yang, Xuesong ......................................................... WeCT1.3 2879<br />

Yatsurugi, Manabu ................................................... MoAT9.5 395<br />

.................................................................................. MoBPT10.26 *<br />

Yazbeck, Jano .......................................................... ThAT8.6 4103<br />

.................................................................................. ThBPT10.24 *<br />

Yazdanpanah, M. J................................................... WeBT7.2 2723<br />

Ye, Zhou ................................................................... TuBT3.4 1291<br />

.................................................................................. TuCPT10.7 *<br />

Yeh, Xiyang .............................................................. TuCT7.4 1830<br />

.................................................................................. WeAPT10.16 *<br />

Yershov, Dmitry........................................................ ThAT3.2 3862<br />

Yesilyurt, Serhat ....................................................... MoBT1.8 463<br />

Yi, Byung-Ju ............................................................. TuAT5.8 967<br />

.................................................................................. WeDT7.2 3595<br />

Yi, Seung Joon ......................................................... ThAT5.6 3963<br />

.................................................................................. ThBPT10.15 *<br />

Yim, Mark ................................................................. MoAT9.7 408<br />

.................................................................................. ThCT4 CC<br />

.................................................................................. ThCT4 O<br />

.................................................................................. ThCT4.1 *<br />

.................................................................................. ThCT4.7 4797<br />

Yip, Hiu Man............................................................. ThBT1.8 4222<br />

Yoerger, Dana .......................................................... MoAT6.8 261<br />

.................................................................................. ThCT3.2 4722<br />

Yokoi, Kazuhito......................................................... WeAT1.6 2000<br />

.................................................................................. WeBPT10.3 *<br />

.................................................................................. ThBT5 CC<br />

.................................................................................. ThBT5.3 4392<br />

.................................................................................. ThBT5.8 4428<br />

Yokokohji, Yasuyoshi ............................................... WeAT4 O<br />

.................................................................................. WeBT4 C<br />

.................................................................................. WeBT4 O<br />

.................................................................................. WeCT4 CC<br />

.................................................................................. WeCT4 O<br />

Yonezawa, Naoaki.................................................... ThBT9.5 4593<br />

.................................................................................. ThCPT10.26 *<br />

Yonezawa, Satoshi................................................... WeAT4.3 2127<br />

Yoo, Jae Hyun.......................................................... WeBT8.5 2790<br />

.................................................................................. WeCPT10.23 *<br />

Yoon, Han................................................................. WeAT2.8 2070<br />

Yoon, Hyun-Soo ....................................................... TuAT5.8 967<br />

Yoshida, Eiichi.......................................................... WeCT7 C<br />

.................................................................................. ThBT5.5 4408<br />

.................................................................................. ThCPT10.14 *<br />

Yoshida, Kazuya....................................................... MoBT6.3 601<br />

.................................................................................. MoBT6.7 625<br />

Yoshida, Morio.......................................................... WeBT1.7 2445<br />

Yoshida, Takami....................................................... MoBT3.4 524<br />

.................................................................................. TuAPT10.7 *<br />

Yoshida, Tomoaki..................................................... MoAT6.1 *<br />

Yoshida, Yuki............................................................ WeCT7.2 3179<br />

Yoshimoto, Kayo ...................................................... TuBT6.6 1386<br />

.................................................................................. TuCPT10.15 *<br />

Young, Diana............................................................ ThBT7.3 4488<br />

Yozbatiran, Nuray..................................................... TuCT5.1 1711<br />

Yu, Shengwei ........................................................... WeAT8.1 2300<br />

Yu, Wei-Shun ........................................................... TuBT8.6 1493<br />

.................................................................................. TuCPT10.21 *<br />

Yuan, Haiwen ........................................................... ThDT4.1 5061<br />

Yuan, Qilong............................................................. WeCT2.3 2935<br />

Yuan, Shuai.............................................................. WeDT8.8 3686<br />

Yucel, Zeynep........................................................... WeDT7.1 3589<br />

Yue, Tao................................................................... MoBT1.3 433<br />

Yuh, Junku................................................................ WeDT6.1 *<br />

Yuma, Matsumura .................................................... MoAT7.1 268<br />

Yun, Seung-kook...................................................... ThAT5 CC<br />

.................................................................................. ThAT5 O<br />

.................................................................................. ThAT5.3 3943<br />

Yuta, Shinichi ............................................................ WeDT7.2 3595<br />

Z<br />

Zach, Christopher..................................................... TuCT2.1 1618<br />

Zacharias, Franziska ................................................ TuCT6.3 1768<br />

Zani, Paolo ............................................................... TuCT1 C<br />

.................................................................................. TuCT1.6 1599<br />

.................................................................................. WeAPT10.3 *<br />

Zarzhitsky, Dimitri..................................................... ThCT3.3 4728<br />

Zeeshan, Arif Muhammad ........................................ TuBT3.5 1297<br />


.................................................................................. TuCPT10.8 *<br />

Zemiti, Nabil.............................................................. WeBT3.5 2545<br />

.................................................................................. WeCPT10.8 *<br />

Zenati, Marco............................................................ TuBT5.7 1353<br />

.................................................................................. ThBT7.8 4522<br />

Zeng, Shuqing .......................................................... ThBT9.2 4575<br />

Zhang, Byoung-Tak .................................................. ThAT5.6 3963<br />

.................................................................................. ThBPT10.15 *<br />

Zhang, Fumin ........................................................... WeCT6 O<br />

.................................................................................. WeDT6 C<br />

.................................................................................. WeDT6 O<br />

.................................................................................. WeDT6.8 3583<br />

Zhang, Hao............................................................... WeAT2.4 2044<br />

.................................................................................. WeBPT10.4 *<br />

Zhang, Hong............................................................. TuBT2.1 1228<br />

.................................................................................. TuBT8.3 1473<br />

Zhang, Houxiang ...................................................... TuAT1.3 801<br />

Zhang, Jianhua......................................................... TuAT1.3 801<br />

Zhang, Jianwei ......................................................... TuAT1.3 801<br />

Zhang, Liang............................................................. MoBT9.2 744<br />

Zhang, Qin................................................................ TuCT5.4 1731<br />

.................................................................................. WeAPT10.10 *<br />

.................................................................................. ThAPT10.13 *<br />

Zhang, Xianmin ........................................................ TuBT8.3 1473<br />

Zhang, Xin ................................................................ WeBT4.6 2602<br />

.................................................................................. WeCPT10.12 *<br />

Zhang, Yinghua ........................................................ TuBT1.2 1180<br />

Zhang, Yuru.............................................................. WeBT4.6 2602<br />

.................................................................................. WeCPT10.12 *<br />

Zhao, Huijing ............................................................ WeAT2.3 2036<br />

Zhao, Jiangran.......................................................... TuAT5.7 961<br />

Zheng, Minhua.......................................................... TuAT5.7 961<br />

Zhou, David .............................................................. TuAT3.4 901<br />

.................................................................................. TuBPT10.7 *<br />

Zhou, Kai .................................................................. TuBT1.5 1201<br />

.................................................................................. TuCPT10.2 *<br />

.................................................................................. WeDT1.8 3387<br />

Zhou, Ke ................................................................... MoAT2.8 98<br />

.................................................................................. MoBT2.6 502<br />

.................................................................................. TuAPT10.6 *<br />

Zhou, Quan............................................................... TuAT3.5 907<br />

.................................................................................. TuBPT10.8 *<br />

Zhou, Weizhen ......................................................... TuCT2.4 1640<br />

.................................................................................. WeAPT10.4 *<br />

Zhou, Xuefeng .......................................................... TuBT8.3 1473<br />

Zhou, Xun ................................................................. MoAT2.8 98<br />

Zhu, Chun................................................................. WeDT2.1 3395<br />

Zhu, Haifei ................................................................ TuBT8.3 1473<br />

Zhu, Huayong ........................................................... MoBT5.5 568<br />

.................................................................................. TuAPT10.11 *<br />

Ziegler, Jakob ........................................................... MoAT2.6 86<br />

.................................................................................. MoBPT10.6 *<br />

Zillich, Michael .......................................................... TuAT1.5 813<br />

.................................................................................. TuBT1 CC<br />

.................................................................................. TuBT1.5 1201<br />

.................................................................................. TuBPT10.2 *<br />

.................................................................................. TuCPT10.2 *<br />

.................................................................................. WeDT1.8 3387<br />

Zinn, Michael ............................................................ ThDT7 C<br />

.................................................................................. ThDT7.1 5139<br />

Zirbel, Alex................................................................ TuCT6.8 1804<br />

Zlot, Robert............................................................... ThAT2 C<br />

.................................................................................. ThAT2.5 3830<br />

.................................................................................. ThBPT10.5 *<br />

Zufferey, Jean-Christophe ........................................ ThCT9.6 5015<br />

.................................................................................. ThDPT10.27 *<br />

Ž<br />

Žlajpah, Leon............................................................ ThCT5.6 4832<br />

.................................................................................. ThDPT10.15 *<br />



Keyword Index<br />

A<br />

Adaptive Control MoAT1.2, MoAT2.8, MoAT5.4,<br />

MoAT8.7, MoBPT10.13, MoBT7.7,<br />

ThAT9.1, ThBT1.1, ThCT2.7, ThCT4.2,<br />

TuAT9.3, TuBT1.2, TuCT2.8, TuCT8.2,<br />

TuCT8.4, WeAPT10.19, WeBT2.2,<br />

WeBT7.1, WeCT7.4, WeDPT10.19<br />

Aerial Robotics MoAT4.5, MoBPT10.11, MoBT5.6,<br />

ThAT4.5, ThAT6.6, ThAT6.7,<br />

ThBPT10.11, ThBPT10.18, ThBT2.5,<br />

ThCPT10.5, ThCT1.5, ThCT8.1,<br />

ThCT8.2, ThCT8.3, ThCT8.4, ThCT8.5,<br />

ThCT8.6, ThCT8.7, ThCT9.6,<br />

ThDPT10.2, ThDPT10.22,<br />

ThDPT10.23, ThDPT10.24,<br />

ThDPT10.27, ThDT4.4, ThDT6.1,<br />

ThDT8.1, ThDT8.2, ThDT8.3,<br />

TuAPT10.12, TuAT9.5, TuBPT10.23,<br />

TuCT2.7, WeAT6.1, WeAT6.3,<br />

WeAT6.4, WeAT6.5, WeAT6.6,<br />

WeAT6.7, WeAT6.8, WeAT9.8,<br />

WeBPT10.16, WeBPT10.17,<br />

WeBPT10.18, WeBT6.1, WeBT6.2,<br />

WeBT6.3, WeBT6.4, WeBT6.5,<br />

WeBT6.6, WeBT6.7, WeBT6.8,<br />

WeBT9.5, WeCPT10.16, WeCPT10.17,<br />

WeCPT10.18, WeCPT10.26, WeCT3.2,<br />

WeCT5.3<br />

Agent-Based Systems ThAPT10.21, ThAT4.3, ThBT9.3,<br />

ThCT9.2, ThCT9.7, ThDT3.2,<br />

WeBT4.4, WeCPT10.10, WeDT8.6<br />

AI Reasoning Methods ThAT9.4, ThBPT10.25, ThBT3.3,<br />

ThBT7.2, TuCT1.7, WeCT7.1,<br />

WeDT2.2, WeDT2.3<br />

Animation and Simulation ThBT5.6, ThCPT10.15, ThCT9.8,<br />

TuBT3.8, TuBT7.5, TuCPT10.17,<br />

WeBT4.7, WeBT5.8<br />

Automation in Life Sciences: Biotechnology, Pharmaceutical and Health Care MoAT1.1, MoAT1.2, MoAT1.3, MoBT1.6, TuAPT10.3, TuAT3.7, TuAT5.7, TuBT3.7, TuCT3.6, WeAPT10.9, WeAT3.3, WeAT4.4, WeBPT10.10, WeDT2.1<br />

Autonomous Agents ThAPT10.16, ThAPT10.21, ThAT4.3,<br />

ThAT4.5, ThAT4.6, ThAT8.6, ThAT9.4,<br />

ThBPT10.11, ThBPT10.12,<br />

ThBPT10.24, ThBPT10.25, ThBT2.7,<br />

ThBT8.2, ThBT9.3, ThCT9.7, TuAT7.1,<br />

TuBT8.7, TuCT2.8, TuCT3.6,<br />

WeAPT10.9, WeAT5.3, WeAT5.4,<br />

WeAT8.6, WeBPT10.13, WeBPT10.24,<br />

WeBT5.8, WeBT6.3, WeCT5.3,<br />

WeDT7.4, WeDT8.1, WeDT8.6,<br />

WeDT8.7<br />

B<br />

Behaviour-Based Systems ThAPT10.2, ThCT9.4, ThDPT10.25,<br />

TuAT8.8, TuBT8.2, WeAT2.7,<br />

WeDT1.5<br />

Biologically-Inspired Robots MoAT1.6, MoAT3.1, MoAT3.3, MoAT5.1, MoAT5.2, MoAT5.4, MoAT5.5, MoAT5.6, MoAT5.7, MoAT5.8, MoAT8.7, MoAT9.6, MoBPT10.3, MoBPT10.13, MoBPT10.14, MoBPT10.15, MoBPT10.27, MoBT1.8, MoBT3.3, MoBT5.1, MoBT5.2, MoBT5.3, MoBT5.4, MoBT5.5, MoBT5.6, MoBT5.7, MoBT5.8, MoBT9.7, ThAT1.3, ThAT1.4, ThAT1.6, ThAT4.6, ThAT4.7, ThAT4.8, ThAT5.5, ThAT7.3, ThAT7.6, ThAT9.1, ThAT9.3, ThAT9.5, ThAT9.8, ThBPT10.1, ThBPT10.3, ThBPT10.12, ThBPT10.14, ThBPT10.21, ThBPT10.26, ThBT4.3, ThBT4.5, ThBT7.3, ThCPT10.11, ThCT1.6, ThCT3.6, ThCT5.3, ThCT5.7, ThCT8.8, ThCT9.2, ThDPT10.3, ThDPT10.9, ThDT4.1, ThDT4.3, ThDT4.4, TuAPT10.10, TuAPT10.11, TuAPT10.12, TuAT3.4, TuAT7.8, TuAT8.1, TuAT8.2, TuAT8.3, TuAT8.4, TuAT8.6, TuAT8.7, TuBPT10.7, TuBPT10.19, TuBPT10.21, TuBT1.3, TuBT8.2, TuBT8.3, TuBT8.4, TuBT8.6, TuBT8.7, TuBT8.8, TuCPT10.19, TuCPT10.21, TuCT3.3, TuCT3.4, TuCT3.5, TuCT5.4, TuCT7.1, TuCT8.1, TuCT8.2, TuCT8.3, TuCT8.4, TuCT8.5, TuCT8.6, TuCT9.8, WeAPT10.7, WeAPT10.8, WeAPT10.10, WeAPT10.19, WeAPT10.20, WeAPT10.21, WeAT3.6, WeAT7.5, WeAT7.6, WeAT8.8, WeBPT10.9, WeBPT10.20, WeBPT10.21, WeCT7.6, WeDPT10.21, WePL.1<br />

Biomimetics MoAT1.1, MoAT3.1, MoAT9.6,<br />

MoBPT10.27, MoBT5.3, MoBT5.5,<br />

MoBT5.7, MoBT9.7, ThAT1.6,<br />

ThAT4.1, ThAT5.4, ThAT7.5,<br />

ThBPT10.3, ThBPT10.13, ThBPT10.20,<br />

ThBT4.8, ThCT3.4, ThDPT10.7,<br />

ThDT5.3, TuAPT10.11, TuAT8.5,<br />

TuAT8.7, TuAT8.8, TuBPT10.20,<br />

TuBT8.1, TuCT7.2, TuCT8.3, TuCT8.4,<br />

TuCT8.6, TuCT8.7, WeAPT10.19,<br />

WeAPT10.21, WeAT3.7, WePL.1<br />

Brain Machine Interface MoBT7.8, ThBT9.7, WeAT3.6,<br />

WeBPT10.9, WeBT1.6, WeCPT10.3,<br />

WeDT3.2<br />

C<br />

Calibration and Identification MoAT1.7, MoBT1.5, MoBT3.4, ThAPT10.22, ThAPT10.23, ThAPT10.24, ThAT2.3, ThAT2.4, ThBPT10.4, ThBT4.1, ThCT2.7, TuAPT10.2, TuAPT10.7, TuBT3.8, WeAT4.4, WeBPT10.10, WeCT9.1, WeCT9.2, WeCT9.3, WeCT9.4, WeCT9.5, WeCT9.6, WeCT9.7, WeDPT10.25, WeDPT10.26, WeDPT10.27, WeDT9.1, WeDT9.2, WeDT9.3, WeDT9.4, WeDT9.5, WeDT9.6, WeDT9.8<br />

Cellular and Modular Robots MoAT9.7, ThAPT10.19, ThBT4.5, ThCPT10.11, ThCT4.5, ThCT4.7, ThDPT10.11, WeCT5.7, WeDT8.3, WeDT8.4<br />

Climbing robots ThAT8.5, ThBPT10.23, ThBT2.8, ThDT5.1, ThDT5.3, ThDT5.4, ThDT6.4, TuBT8.3, TuBT8.5, TuCPT10.20, WeBT7.7<br />

Cognitive Human-Robot Interaction MoAT8.6, MoBPT10.24, MoBT3.3, ThAPT10.3, ThBT9.7, TuAT1.7, TuAT5.3, TuBT1.5, TuBT1.8, TuBT9.1, TuCPT10.2, TuCT5.8, WeAT1.3, WeAT1.5, WeAT2.8, WeAT3.8, WeBPT10.2, WeBT1.1, WeBT1.2, WeBT1.3, WeBT1.8, WeBT4.4, WeCPT10.10, WeCT1.2, WeCT1.5, WeDPT10.2, WeDT1.1, WeDT1.6, WeDT1.7, WeDT1.8, WeDT2.1, WeDT3.3<br />

Collision Avoidance MoAT6.6, MoBPT10.18, ThAT3.1, ThAT3.2, ThAT3.5, ThAT4.4, ThBPT10.8, ThBPT10.10, ThBT3.1, ThBT4.6, ThBT9.4, ThCPT10.12, ThCPT10.25, ThCT3.3, ThCT8.5, ThDPT10.23, TuCT1.1, TuCT1.2, TuCT1.3, TuCT1.4, TuCT1.5, TuCT1.6, TuCT1.7, TuCT2.1, TuCT9.7, WeAPT10.1, WeAPT10.2, WeAPT10.3, WeAT8.7, WeBT4.5, WeBT5.4, WeBT9.2, WeCPT10.11, WeCPT10.13, WeDT1.3, WeDT5.7, WeDT5.8, WeDT8.2<br />

Compliance and Impedance Control MoAT9.4, MoBPT10.25, MoBT8.5, ThAT1.5, ThAT1.6, ThAT7.1, ThAT9.3, ThBPT10.2, ThBPT10.3, ThBT1.7, ThBT1.8, ThBT4.3, ThCT1.4, ThDPT10.1, TuAPT10.20, TuAT6.3, TuAT6.8, TuAT7.4, TuBPT10.16, TuCT6.6, WeAPT10.15, WeBT4.2, WeCT3.5, WeCT7.5, WeDPT10.8, WeDPT10.20<br />

Compliant Assembly MoBT9.4, TuAPT10.22<br />

Computer Vision MoAT1.2, MoAT1.4, MoBPT10.1,<br />

MoBT3.7, ThAPT10.6, ThAPT10.18,<br />

ThAT2.2, ThAT2.8, ThAT6.3, ThAT6.4,<br />

ThAT6.5, ThAT6.6, ThAT6.7, ThAT6.8,<br />

ThAT9.6, ThBPT10.16, ThBPT10.17,<br />

ThBPT10.18, ThBPT10.27, ThBT4.7,<br />

ThBT6.3, ThBT6.4, ThBT6.5, ThBT6.6,<br />

ThBT6.7, ThBT6.8, ThCPT10.16,<br />

ThCPT10.17, ThCPT10.18, ThCT5.4,<br />

ThCT6.1, ThCT6.3, ThCT6.4, ThCT6.5,<br />

ThCT6.6, ThCT6.7, ThCT6.8, ThCT8.2,<br />

ThDPT10.13, ThDPT10.16,<br />

ThDPT10.17, ThDPT10.18, TuAT1.1,<br />

TuAT1.2, TuAT1.3, TuAT1.6, TuAT1.7,<br />

TuAT6.5, TuAT8.6, TuBPT10.3,<br />

TuBPT10.14, TuBPT10.21, TuBT1.1,<br />

TuBT1.2, TuBT1.3, TuBT1.4, TuBT1.5,<br />

TuBT1.6, TuBT1.7, TuBT2.2, TuBT2.7,<br />

TuBT2.8, TuBT7.6, TuBT9.3,<br />

TuCPT10.1, TuCPT10.2, TuCPT10.3,<br />

TuCPT10.18, TuCT1.6, TuCT2.3,<br />

TuCT2.6, WeAPT10.3, WeAPT10.6,<br />

WeAT2.1, WeAT2.4, WeAT3.1,<br />

WeAT3.3, WeAT3.4, WeAT3.5,<br />

WeAT9.2, WeAT9.3, WeAT9.4,<br />

WeAT9.5, WeAT9.6, WeAT9.7,<br />

WeBPT10.4, WeBPT10.7, WeBPT10.8,<br />

WeBPT10.25, WeBPT10.26,<br />

WeBPT10.27, WeBT9.7, WeCT2.2,<br />

WeCT2.4, WeCT2.6, WeCT2.8,<br />

WeCT9.1, WeCT9.6, WeDPT10.4,<br />

WeDPT10.6, WeDPT10.27, WeDT1.1,<br />

WeDT2.6, WeDT2.8, WeDT7.3,<br />

WeDT7.6<br />

Contact Modelling MoAT7.3, ThAT9.2, ThBT1.8, ThBT7.1,<br />

TuBT6.6, TuBT7.5, TuBT7.6,<br />

TuCPT10.15, TuCPT10.17,<br />

TuCPT10.18, WeBT4.7<br />


Control Architectures and<br />

Programming<br />

MoBT9.1, ThAPT10.18, ThAT3.6,<br />

ThBPT10.9, ThCT2.3, TuAT7.1,<br />

TuAT7.2, TuAT7.3, TuAT7.5,<br />

TuBPT10.17, WeAT9.8, WeCT5.7,<br />

WeDT7.6<br />

Cooperating Robots MoAT2.8, MoBT2.1, ThAPT10.16,<br />

ThAPT10.20, ThBT2.8, ThBT4.2,<br />

ThBT8.3, ThBT9.5, ThCPT10.26,<br />

ThCT2.8, ThCT9.3, ThCT9.5, ThCT9.6,<br />

ThDPT10.26, ThDPT10.27, ThDT3.2,<br />

TuCT2.7, WeAT6.4, WeAT6.6,<br />

WeAT8.4, WeAT8.5, WeAT8.7,<br />

WeBPT10.16, WeBPT10.18,<br />

WeBPT10.22, WeBPT10.23, WeBT1.8,<br />

WeBT8.6, WeBT8.7, WeBT8.8,<br />

WeCPT10.24, WeCT1.4, WeCT5.1,<br />

WeCT8.2, WeCT8.4, WeDPT10.1,<br />

WeDPT10.22, WeDT1.2, WeDT7.4,<br />

WeDT8.1, WeDT8.5<br />

Cooperative Manipulators MoAT6.6, MoBPT10.18, ThBT1.3,<br />

TuBT7.1<br />

D<br />

Demining Systems TuAT9.7<br />

Dexterous Manipulation ThAT1.2, ThAT7.5, ThAT7.7,<br />

ThBPT10.20, ThBT1.4, ThCPT10.1,<br />

ThDT6.1, TuBT6.7, TuBT7.4, TuBT7.7,<br />

TuBT8.1, TuCPT10.16, TuCT6.2,<br />

TuCT6.7, TuCT6.8, WeBT3.1,<br />

WeCT2.5, WeDPT10.5<br />

Distributed Robot Systems MoAT3.5, MoAT4.5, MoBPT10.8,<br />

MoBPT10.11, MoBT2.1, ThAPT10.19,<br />

ThAT4.6, ThBPT10.12, ThBT1.1,<br />

ThBT4.1, ThBT4.2, ThBT4.6, ThBT9.5,<br />

ThCPT10.12, ThCPT10.26, ThCT2.8,<br />

ThCT4.2, ThCT4.3, ThCT4.5, ThCT4.6,<br />

ThCT4.8, ThCT9.1, ThCT9.2, ThCT9.3,<br />

ThCT9.4, ThCT9.5, ThCT9.6, ThCT9.8,<br />

ThDPT10.11, ThDPT10.12,<br />

ThDPT10.25, ThDPT10.26,<br />

ThDPT10.27, ThDT8.4, TuAT7.3,<br />

TuCT8.3, TuCT8.6, WeAPT10.21,<br />

WeAT8.1, WeAT8.3, WeAT8.4,<br />

WeAT8.5, WeBPT10.22, WeBPT10.23,<br />

WeBT8.1, WeBT8.3, WeBT8.6,<br />

WeBT8.8, WeCPT10.24, WeCT4.2,<br />

WeCT5.1, WeCT8.1, WeCT8.2,<br />

WeCT8.3, WeCT9.5, WeDPT10.26, WeDT8.3, WeDT8.4<br />

Domestic Robots and Home Automation ThCT6.6, ThCT9.5, ThDPT10.18,<br />

ThDPT10.26, TuAT1.2, TuAT1.4,<br />

TuAT2.1, TuBPT10.1, TuBT1.5,<br />

TuBT1.8, TuCPT10.2, WeAT1.1,<br />

WeAT9.3, WeBT1.5, WeCPT10.2,<br />

WeCT1.6, WeCT2.4, WeCT3.4,<br />

WeCT3.7, WeDPT10.3, WeDPT10.4,<br />

WeDPT10.7<br />

Dynamics MoAT7.3, MoAT7.5, MoAT7.6,<br />

MoAT7.7, MoBPT10.20, MoBPT10.21,<br />

MoBT5.6, MoBT5.8, MoBT6.7,<br />

ThAPT10.24, ThAT1.8, ThAT4.3,<br />

ThAT7.5, ThAT7.6, ThAT8.1, ThAT9.2,<br />

ThAT9.5, ThBPT10.20, ThBPT10.21,<br />

ThBPT10.26, ThCT4.3, ThDT8.2,<br />

TuAPT10.12, TuAT8.7, TuBT7.5,<br />

TuCPT10.17, TuCT7.2, WeAT7.1,<br />

WeAT7.4, WeBPT10.19, WeBT6.5,<br />

WeBT7.1, WeBT7.5, WeBT7.6,<br />

WeCPT10.17, WeCPT10.20,<br />

WeCPT10.21, WeDT9.2, WeDT9.6<br />

E


Education Robotics WeBT1.2<br />

Entertainment Robotics MoAT3.4, MoBPT10.7, ThDT6.1,<br />

WeAT1.2, WeAT1.6, WeAT1.7,<br />

WeBPT10.3, WeCT1.8<br />

Evolutionary Robotics ThBT4.4, ThCPT10.10, ThCT9.3,<br />

WeAT7.7, WeBT8.7<br />

F<br />

Failure Detection and Recovery WeAT8.3, WeCT3.7<br />

Field Robots MoAT6.1, MoAT6.3, MoAT6.4,<br />

MoAT6.7, MoAT6.8, MoAT7.7,<br />

MoBPT10.16, MoBT2.2, MoBT2.4,<br />

MoBT2.7, MoBT6.3, MoBT6.5,<br />

MoBT6.8, MoBT8.6, ThAT2.5,<br />

ThAT3.4, ThAT6.8, ThAT8.2, ThAT8.3,<br />

ThAT8.4, ThAT8.5, ThBPT10.5,<br />

ThBPT10.7, ThBPT10.22, ThBPT10.23,<br />

ThBT2.7, ThBT6.1, ThBT8.4, ThBT8.6,<br />

ThBT9.1, ThBT9.6, ThCPT10.22,<br />

ThCPT10.24, ThCPT10.27, ThCT3.2,<br />

ThCT3.7, ThCT8.6, ThDPT10.24,<br />

ThDT5.4, TuAPT10.4, TuAPT10.14,<br />

TuAPT10.21, TuCT1.3, TuCT7.7,<br />

WeAT6.3, WeAT9.1, WeBT2.5,<br />

WeBT7.5, WeCPT10.5, WeCPT10.20,<br />

WeCT3.6, WeCT6.1, WeCT6.3,<br />

WeDPT10.9, WeDT6.3, WeDT8.8,<br />

WeDT9.7<br />

Flexible Arms MoAT9.2, ThAT1.4, ThAT3.8, ThAT7.2,<br />

ThBPT10.1, ThBT1.2, ThBT1.6,<br />

ThBT1.7, ThCPT10.3, ThCT1.6,<br />

ThDPT10.3, ThDT7.1, TuAT8.5,<br />

TuBPT10.20, TuBT6.3, TuCT6.6,<br />

TuCT7.5, TuCT7.6, TuCT9.1,<br />

WeAPT10.15, WeAPT10.17,<br />

WeAPT10.18, WeBT3.6, WeCPT10.9,<br />

WeDT9.3<br />

Force and Tactile Sensing MoAT1.6, MoAT1.7, MoAT7.4,<br />

MoBPT10.3, MoBPT10.19, ThAPT10.7,<br />

ThAPT10.9, ThAT1.4, ThAT9.7,<br />

ThBPT10.1, ThBT7.7, ThBT8.5,<br />

ThCPT10.23, ThCT2.4, ThDPT10.4,<br />

TuAT5.2, TuBT9.7, TuCT9.2, TuCT9.3,<br />

TuCT9.6, WeAPT10.24, WeAT4.5,<br />

WeAT4.6, WeBPT10.11, WeBPT10.12,<br />

WeBT4.5, WeCPT10.11, WeDT3.1,<br />

WeDT3.2, WeDT3.4, WeDT3.6,<br />

WeDT3.7, WeDT9.1, WeDT9.7<br />

Force Control MoAT7.6, MoAT9.4, MoBPT10.21,<br />

MoBPT10.25, MoBT5.4, ThAT1.3,<br />

ThBT1.2, ThBT1.8, ThCT7.1,<br />

TuAPT10.10, TuAT3.7, TuBT7.7,<br />

TuCT9.3, TuCT9.6, WeAPT10.24,<br />

WeBT3.6, WeCPT10.9, WeCT4.1,<br />

WeCT7.2<br />

G<br />

Gesture, Posture, Social<br />

Spaces and Facial<br />

Expressions<br />

MoAT2.6, MoBPT10.6, ThAPT10.2,<br />

WeAT1.5, WeAT1.6, WeBPT10.2,<br />

WeBPT10.3, WeDT1.5<br />

Grasping MoAT8.8, ThAPT10.7, ThBT1.6,<br />

ThCPT10.3, ThCT8.8, TuAT6.1,<br />

TuAT6.3, TuAT6.4, TuAT6.5, TuAT6.6,<br />

TuAT6.7, TuAT6.8, TuBPT10.13,<br />

TuBPT10.14, TuBPT10.15, TuBT1.8,<br />

TuBT6.1, TuBT6.4, TuBT6.6, TuBT6.8,<br />

TuBT7.2, TuBT8.1, TuBT9.6, TuBT9.7,<br />

TuCPT10.13, TuCPT10.15,<br />

TuCPT10.24, TuCT6.1, TuCT6.2,<br />

TuCT6.3, TuCT6.4, TuCT6.5, TuCT6.7,<br />

TuCT6.8, TuCT9.3, WeAPT10.13, WeAPT10.14, WeBT6.2, WeCT2.5, WeDPT10.5, WeDT3.4<br />

H<br />

Haptics and Haptic Interfaces MoAT4.3, MoAT4.7, MoBT7.2, MoBT7.3, MoBT7.4, MoBT7.5, ThAPT10.8, ThCT1.8, TuAPT10.16, TuAPT10.17, TuAT6.7, TuBT5.3, TuCT9.6, WeAPT10.24, WeAT4.1, WeAT4.3, WeAT4.4, WeAT4.5, WeAT4.6, WeAT4.7, WeAT4.8, WeBPT10.10, WeBPT10.11, WeBPT10.12, WeBT4.1, WeBT4.3, WeBT4.5, WeBT4.6, WeBT4.7, WeBT4.8, WeCPT10.11, WeCPT10.12, WeCT4.1, WeCT4.2, WeCT4.3, WeCT4.5, WeCT4.6, WeCT4.7, WeCT4.8, WeDPT10.11, WeDPT10.12, WeDT3.1, WeDT3.2, WeDT3.3, WeDT3.5, WeDT3.7, WeDT3.8, WeDT9.8<br />

Human detection & tracking MoAT3.8, MoBT3.6, ThAPT10.5, ThAPT10.16, ThAPT10.17, ThAT2.6, ThAT2.7, ThBPT10.6, TuAPT10.9, WeAT2.3, WeBT1.4, WeCPT10.1, WeCT1.4, WeCT1.6, WeCT2.3, WeCT3.1, WeCT5.4, WeDPT10.1, WeDPT10.3, WeDPT10.13, WeDT2.5, WeDT7.1, WeDT7.2, WeDT7.3, WeDT7.4, WeDT7.5, WeDT7.7, WeDT7.8<br />

Human Performance Augmentation MoAT9.1, ThAPT10.1, ThBT8.5, ThCPT10.23, TuBT5.4, TuCPT10.10, TuCT5.5, WeAPT10.11, WeAT2.8, WeCT1.8, WeDT1.4, WeDT3.7<br />

Humanoid and Bipedal Locomotion MoAT8.1, ThAT5.1, ThAT5.3, ThAT5.4, ThAT5.5, ThAT5.6, ThAT5.7, ThAT7.3, ThBPT10.13, ThBPT10.14, ThBPT10.15, ThBT5.1, ThBT5.6, ThBT5.7, ThBT5.8, ThCPT10.15, ThCT5.1, ThCT5.3, ThDT5.2, TuBT8.2, TuCT5.3, TuCT7.1, WeAT7.1, WeAT7.3, WeAT7.4, WeAT7.5, WeAT7.8, WeBPT10.19, WeBPT10.20, WeBT7.1, WeBT7.2, WeBT7.4, WeCPT10.19, WeCT2.3, WeCT7.6, WeCT7.8, WeDPT10.21<br />

Humanoid Robots MoBT7.8, ThAPT10.9, ThAT3.6,<br />

ThAT5.1, ThAT5.3, ThAT5.4, ThAT5.6,<br />

ThAT5.7, ThAT5.8, ThAT9.2, ThAT9.6,<br />

ThAT9.7, ThBPT10.9, ThBPT10.13,<br />

ThBPT10.15, ThBPT10.27, ThBT4.3,<br />

ThBT5.1, ThBT5.3, ThBT5.4, ThBT5.5,<br />

ThBT5.6, ThBT5.8, ThBT6.7,<br />

ThCPT10.13, ThCPT10.14,<br />

ThCPT10.15, ThCT5.1, ThCT5.4,<br />

ThCT5.5, ThCT5.6, ThCT5.7, ThCT5.8,<br />

ThDPT10.13, ThDPT10.14,<br />

ThDPT10.15, TuAT7.1, TuAT7.2,<br />

TuAT7.7, TuBT1.1, TuBT9.3, TuCT1.4,<br />

TuCT5.3, TuCT6.5, WeAPT10.1,<br />

WeAPT10.14, WeAT1.6, WeAT7.3,<br />

WeBPT10.3, WeCT1.2, WeCT1.7,<br />

WeCT1.8, WeCT7.2, WeCT7.3,<br />

WeCT7.5, WeCT7.6, WeCT7.7,<br />

WeCT9.4, WeDPT10.20, WeDPT10.21,<br />

WeDPT10.25, WeDT1.7, WeDT2.7, WeDT3.6<br />

Hydraulic/Pneumatic Actuators MoBT7.4, MoBT9.1, MoBT9.4,<br />

ThBT7.3, ThCT5.7, TuAPT10.16,<br />

TuAPT10.22, TuBT5.5, TuCPT10.11,


TuCT7.2, TuCT7.4, WeAPT10.16,<br />

WeCT3.5, WeDPT10.8<br />

I<br />

Industrial Robots MoBT9.1, ThAT1.7, ThAT7.2, ThBT9.8,<br />

ThCT2.1, ThCT2.2, ThCT2.4, ThCT2.5,<br />

ThCT2.6, ThCT2.7, ThDPT10.4,<br />

ThDPT10.5, ThDPT10.6, ThDT4.2,<br />

WeAT8.3, WeBT2.4, WeBT9.8, WeCPT10.4, WeCT2.2, WeCT8.8<br />

Intelligent Transportation Systems ThAT3.4, ThBPT10.7, ThBT6.4,<br />

ThBT9.2, ThBT9.3, ThBT9.4, ThBT9.5,<br />

ThBT9.6, ThBT9.7, ThBT9.8,<br />

ThCPT10.16, ThCPT10.25,<br />

ThCPT10.26, ThCPT10.27, TuCT1.6,<br />

TuCT2.2, WeAPT10.3, WeAT2.3,<br />

WeCT8.8, WeDT8.8<br />

Intrusion Detection, Identification and Security MoBT3.6, TuAPT10.9<br />

J<br />

Joint/Mechanism MoBT9.2, MoBT9.6, ThAPT10.24,<br />

ThAT1.7, ThAT7.3, ThCT2.6,<br />

ThDPT10.6, ThDT3.1, ThDT5.3,<br />

TuAPT10.24, TuAT5.5, TuAT8.3,<br />

TuBPT10.11, TuBT6.3, TuBT7.3,<br />

TuCT7.4, TuCT7.5, TuCT7.8, TuCT9.8,<br />

WeAPT10.16, WeAPT10.17, WeBT7.5,<br />

WeCPT10.20, WeDT9.6<br />

K<br />

Kinematics MoAT5.8, MoAT7.5, MoAT9.5,<br />

MoBPT10.20, MoBPT10.26, MoBT1.8,<br />

MoBT8.2, ThAPT10.10, ThAT3.7,<br />

ThAT7.7, ThAT8.4, ThBPT10.22,<br />

ThBT3.6, ThBT3.7, ThBT3.8, ThBT9.1,<br />

ThCPT10.9, ThCT2.2, ThCT7.3,<br />

ThDT3.1, TuAT8.5, TuBPT10.20,<br />

TuBT7.3, TuCT1.8, TuCT7.7, TuCT8.2,<br />

TuCT9.1, WeAT3.1, WeCT2.1,<br />

WeCT7.8, WeDT5.4<br />

L<br />

Learning and Adaptive Systems MoAT8.1, MoAT8.2, MoAT8.3, MoAT8.4, MoAT8.5, MoAT8.6,<br />

MoAT8.7, MoBPT10.22, MoBPT10.23,<br />

MoBPT10.24, MoBT2.7, MoBT5.5,<br />

MoBT7.6, MoBT8.1, MoBT8.2,<br />

MoBT8.3, MoBT8.4, MoBT8.7,<br />

ThAPT10.3, ThAPT10.4, ThAT5.6,<br />

ThAT9.1, ThAT9.8, ThBPT10.15,<br />

ThCT1.3, ThCT1.4, ThDPT10.1,<br />

TuAPT10.11, TuAPT10.18,<br />

TuAPT10.19, TuAT1.5, TuAT1.8,<br />

TuAT6.4, TuAT6.7, TuAT8.1, TuAT9.8,<br />

TuBPT10.2, TuBPT10.13, TuBT1.3,<br />

TuBT1.4, TuBT7.1, TuBT9.1, TuBT9.3,<br />

TuBT9.6, TuBT9.7, TuBT9.8,<br />

TuCPT10.1, TuCPT10.24, TuCT8.1,<br />

WeAT2.2, WeAT2.6, WeAT2.8,<br />

WeAT3.6, WeAT5.6, WeAT8.5,<br />

WeAT9.6, WeBPT10.6, WeBPT10.9,<br />

WeBPT10.15, WeBPT10.23,<br />

WeBPT10.27, WeCT1.2, WeCT3.2,<br />

WeCT4.4, WeCT6.7, WeDPT10.10,<br />

WeDT1.6, WeDT1.8, WeDT2.2,<br />

WeDT2.3, WeDT2.4, WeDT8.7<br />

Legged Robots MoAT5.3, MoAT5.7, MoBT5.4,<br />

ThAT1.3, ThAT5.5, ThAT5.8,<br />

ThBPT10.14, ThBT5.5, ThBT5.7,<br />

ThCPT10.14, ThCT5.6, ThDPT10.15,<br />

ThDT3.1, ThDT4.2, ThDT4.3, ThDT4.4,<br />

ThDT5.1, ThDT6.3, TuAPT10.10,<br />

TuBT8.4, TuBT8.6, TuBT8.8,<br />


TuCPT10.19, TuCPT10.21, TuCT5.3,<br />

WeAT7.1, WeAT7.2, WeAT7.3,<br />

WeAT7.5, WeAT7.6, WeAT7.7,<br />

WeBPT10.20, WeBPT10.21, WeBT2.6,<br />

WeBT7.3, WeBT7.4, WeBT7.6,<br />

WeBT7.7, WeBT7.8, WeCPT10.6,<br />

WeCPT10.19, WeCPT10.21, WeCT7.7<br />

Localization MoAT2.1, MoAT2.2, MoAT2.3,<br />

MoAT2.4, MoAT2.5, MoAT2.6,<br />

MoAT2.7, MoAT2.8, MoAT3.5,<br />

MoAT3.7, MoAT3.8, MoBPT10.4,<br />

MoBPT10.5, MoBPT10.6, MoBPT10.8,<br />

MoBT2.1, MoBT2.2, MoBT2.3,<br />

MoBT2.4, MoBT2.5, MoBT2.6,<br />

MoBT2.7, MoBT6.5, ThAT2.4,<br />

ThBPT10.4, ThBT2.1, ThBT2.3,<br />

ThBT3.1, ThBT7.7, ThCT2.4, ThCT2.5,<br />

ThDPT10.4, ThDPT10.5, ThDT3.4,<br />

ThDT7.2, TuAPT10.4, TuAPT10.5,<br />

TuAPT10.6, TuAPT10.14, TuBT2.2,<br />

TuBT5.7, TuCT2.3, TuCT2.6,<br />

WeAPT10.6, WeAT2.5, WeBPT10.5,<br />

WeBT2.5, WeBT2.7, WeBT8.5,<br />

WeCPT10.5, WeCPT10.23, WeCT1.3,<br />

WeCT1.7, WeCT2.3, WeCT8.3,<br />

WeCT8.4, WeDPT10.22, WeDT6.8<br />

M<br />

Manipulation and Compliant Assembly ThAPT10.2, ThBT1.6, ThCPT10.3,<br />

ThDT6.2, TuCT3.7, TuCT6.6,<br />

WeAPT10.15, WeDT1.5<br />

Manipulation Planning MoBT9.8, ThBT1.5, ThBT3.4,<br />

ThCPT10.2, ThCPT10.7, ThCT1.2,<br />

ThCT1.3, TuAT6.1, TuAT6.6, TuAT6.8,<br />

TuAT8.8, TuBPT10.15, TuBT6.1,<br />

TuBT6.6, TuBT6.7, TuBT7.2,<br />

TuCPT10.15, TuCT6.3, TuCT6.4,<br />

WeAPT10.13, WeAT1.3, WeCT7.1<br />

Mapping MoAT2.2, MoAT6.3, MoBT3.4,<br />

ThAPT10.22, ThAT2.2, ThAT2.3,<br />

ThAT2.5, ThAT6.3, ThAT6.7,<br />

ThBPT10.5, ThBT2.1, ThBT2.2,<br />

ThBT2.3, ThBT2.4, ThBT2.5, ThBT2.6,<br />

ThBT2.7, ThBT2.8, ThBT4.2, ThBT6.3,<br />

ThBT6.5, ThCPT10.4, ThCPT10.5,<br />

ThCPT10.6, ThCPT10.17, ThCT6.3,<br />

ThPT11.1, TuAPT10.7, TuAT2.2,<br />

TuAT2.5, TuAT2.6, TuAT9.1,<br />

TuBPT10.5, TuBPT10.6, TuBT2.3,<br />

TuBT2.6, TuBT2.8, TuCPT10.6,<br />

WeBT2.8, WeCT2.5, WeCT8.5,<br />

WeDPT10.5, WeDPT10.23, WeDT9.4<br />

Marine Robotics MoAT6.8, MoBT5.8, ThAPT10.13,<br />

ThAPT10.14, ThAPT10.15,<br />

ThAPT10.23, ThCT3.2, ThCT3.3,<br />

ThCT3.4, ThCT3.5, ThCT3.6, ThCT3.7,<br />

ThDPT10.7, ThDPT10.8, ThDPT10.9,<br />

ThDT3.2, ThDT3.3, ThDT3.4, TuAT9.4,<br />

TuAT9.6, TuBPT10.22, TuBPT10.24,<br />

TuBT2.3, TuBT9.4, TuCPT10.22,<br />

WeCT6.1, WeCT6.3, WeCT6.5,<br />

WeCT6.6, WeCT6.7, WeCT6.8,<br />

WeDPT10.17, WeDPT10.18, WeDT6.1,<br />

WeDT6.3, WeDT6.4, WeDT6.5,<br />

WeDT6.6, WeDT6.7, WeDT6.8,<br />

WeDT9.5<br />

Mechanism Design MoAT1.8, MoAT3.3, MoAT5.5,<br />

MoAT9.1, MoAT9.2, MoAT9.4,<br />

MoAT9.5, MoBPT10.14, MoBPT10.25,<br />

MoBPT10.26, MoBT5.3, MoBT9.5,<br />

ThAT7.4, ThBPT10.19, ThBT7.5,


ThBT8.4, ThCPT10.20, ThCPT10.22, ThCT3.4, ThCT7.6, ThDPT10.7, ThDPT10.21, ThDT5.1, ThDT5.4, TuAPT10.23, TuAT5.3, TuAT5.5, TuAT5.7, TuAT5.8, TuBPT10.11, TuBT6.1, TuCT7.3, TuCT7.4, TuCT7.5, TuCT7.6, TuCT7.8, TuCT9.5, TuCT9.8, TuPL.1, WeAPT10.16, WeAPT10.17, WeAPT10.18, WeAPT10.23, WeAT4.7, WeBT4.1, WeBT4.2, WeBT7.6, WeCPT10.21<br />

Medical Robots and Systems MoBT1.1, MoBT1.4, MoBT7.1,<br />

MoBT9.3, ThBT7.5, ThBT7.6, ThBT7.7,<br />

ThBT7.8, ThCPT10.20, ThCPT10.21,<br />

ThCT7.2, ThCT7.4, ThDPT10.19,<br />

ThDT7.1, ThDT7.2, ThDT7.3, ThDT7.4,<br />

TuAPT10.1, TuAT3.4, TuAT5.1,<br />

TuAT5.2, TuAT5.3, TuAT5.4, TuAT5.5,<br />

TuAT5.6, TuAT5.7, TuAT5.8,<br />

TuBPT10.7, TuBPT10.10, TuBPT10.11,<br />

TuBPT10.12, TuBT3.3, TuBT3.6,<br />

TuBT5.1, TuBT5.2, TuBT5.3, TuBT5.4,<br />

TuBT5.5, TuBT5.6, TuBT5.7, TuBT5.8,<br />

TuCPT10.9, TuCPT10.10,<br />

TuCPT10.11, TuCPT10.12, TuCT3.5,<br />

TuCT5.7, WeAPT10.8, WeAT3.1,<br />

WeAT3.2, WeAT3.4, WeAT3.5,<br />

WeAT3.7, WeAT3.8, WeAT4.5,<br />

WeAT9.4, WeAT9.7, WeBPT10.7,<br />

WeBPT10.8, WeBPT10.11,<br />

WeBPT10.25, WeBT3.1, WeBT3.2,<br />

WeBT3.3, WeBT3.4, WeBT3.5,<br />

WeBT3.6, WeBT3.7, WeBT3.8,<br />

WeBT4.3, WeBT9.3, WeCPT10.7,<br />

WeCPT10.8, WeCPT10.9, WeCT4.1,<br />

WeDT7.8, WeDT9.3<br />

Micro-manipulation MoAT1.3, MoAT1.5, MoBPT10.2,<br />

MoBT1.2, MoBT1.3, MoBT1.4,<br />

MoBT1.5, MoBT1.6, MoBT1.7,<br />

ThAT7.7, ThDT7.4, TuAPT10.1,<br />

TuAPT10.2, TuAPT10.3, TuAT3.3,<br />

TuAT3.5, TuAT3.6, TuAT3.7, TuAT3.8,<br />

TuBPT10.8, TuBPT10.9, TuBT3.4,<br />

TuBT3.8, TuBT5.5, TuCPT10.7,<br />

TuCPT10.11, TuCT3.3, TuCT3.7,<br />

TuCT3.8<br />

Micro/Nano Robots MoAT1.1, MoAT1.3, MoAT1.4,<br />

MoAT1.8, MoAT9.8, MoBPT10.1,<br />

MoBT1.1, MoBT1.3, MoBT1.7,<br />

MoBT1.8, ThAT3.7, ThAT4.8, ThAT6.1,<br />

ThBT4.5, ThCPT10.11, ThCT4.1,<br />

ThDT4.3, TuAT3.1, TuAT3.4, TuAT3.5,<br />

TuAT3.6, TuAT3.8, TuAT8.6,<br />

TuBPT10.7, TuBPT10.8, TuBPT10.9,<br />

TuBPT10.21, TuBT3.1, TuBT3.3,<br />

TuBT3.4, TuBT3.5, TuBT3.6, TuBT3.7,<br />

TuBT7.3, TuCPT10.7, TuCPT10.8,<br />

TuCPT10.9, TuCT3.1, TuCT3.3,<br />

TuCT3.4, TuCT3.5, TuCT3.6, TuCT3.7,<br />

TuCT3.8, WeAPT10.7, WeAPT10.8,<br />

WeAPT10.9<br />

Mining Robotics MoBT8.6, ThBT2.2, ThBT9.6,<br />

ThCPT10.27, TuAPT10.21<br />

Mobile Manipulation MoAT8.8, ThAPT10.7, ThBT2.4,<br />

ThCPT10.4, ThCT1.1, ThCT1.4,<br />

ThCT1.5, ThCT4.4, ThDPT10.1,<br />

ThDPT10.2, ThDPT10.10, TuAT7.4,<br />

TuBPT10.16, TuBT3.4, TuCPT10.7,<br />

WeBT4.4, WeBT6.1, WeCPT10.10,<br />

WeCT4.5, WeCT7.5, WeDPT10.11,<br />


WeDPT10.20, WeDT3.4<br />

Motion and Path Planning MoBT6.3, MoBT9.8, ThAPT10.10,<br />

ThAPT10.11, ThAPT10.12,<br />

ThAPT10.15, ThAT1.5, ThAT1.8,<br />

ThAT3.1, ThAT3.2, ThAT3.3, ThAT3.5,<br />

ThAT4.4, ThBPT10.2, ThBPT10.8,<br />

ThBPT10.10, ThBT3.1, ThBT3.2,<br />

ThBT3.3, ThBT3.4, ThBT3.6, ThBT3.7,<br />

ThBT4.4, ThBT4.8, ThBT5.5, ThBT5.8,<br />

ThBT8.1, ThCPT10.7, ThCPT10.9,<br />

ThCPT10.10, ThCPT10.14, ThCT1.1,<br />

ThCT3.3, ThCT8.3, ThDT3.3, ThDT5.2,<br />

ThDT7.3, ThDT8.3, TuAT2.1, TuAT9.2,<br />

TuAT9.6, TuAT9.7, TuBPT10.24,<br />

TuBT3.5, TuBT5.6, TuBT8.6, TuBT9.8,<br />

TuCPT10.8, TuCPT10.12,<br />

TuCPT10.21, TuCT1.1, TuCT1.2,<br />

TuCT1.7, TuCT1.8, TuCT2.8, TuCT8.8,<br />

WeAT1.2, WeAT1.5, WeAT1.8,<br />

WeAT5.1, WeAT5.3, WeAT5.4,<br />

WeAT5.5, WeAT5.6, WeAT5.7,<br />

WeAT5.8, WeAT6.3, WeAT7.4,<br />

WeAT8.2, WeBPT10.2, WeBPT10.13,<br />

WeBPT10.14, WeBPT10.15,<br />

WeBPT10.19, WeBT3.5, WeBT5.1,<br />

WeBT5.3, WeBT5.4, WeBT5.5,<br />

WeBT5.6, WeBT5.7, WeBT5.8,<br />

WeBT6.5, WeCPT10.8, WeCPT10.13,<br />

WeCPT10.14, WeCPT10.15,<br />

WeCPT10.17, WeCT1.1, WeCT3.8,<br />

WeCT5.3, WeCT5.5, WeCT5.6,<br />

WeCT5.7, WeCT6.5, WeCT6.6,<br />

WeCT8.1, WeCT8.5, WeCT8.6,<br />

WeCT8.7, WeDPT10.14, WeDPT10.15,<br />

WeDPT10.17, WeDPT10.18,<br />

WeDPT10.23, WeDPT10.24, WeDT5.1,<br />

WeDT5.2, WeDT5.3, WeDT5.4,<br />

WeDT5.5, WeDT5.6, WeDT5.7,<br />

WeDT5.8, WeDT6.6, WeDT7.2<br />

Motion Control MoAT5.3, MoAT6.6, MoAT7.2,<br />

MoAT7.8, MoAT9.5, MoBPT10.18,<br />

MoBPT10.26, MoBT1.4, MoBT6.7,<br />

MoBT7.6, MoBT8.4, MoBT8.6,<br />

ThAPT10.20, ThAT1.1, ThAT1.8,<br />

ThAT3.6, ThAT3.8, ThAT4.5, ThAT4.8,<br />

ThAT5.3, ThAT5.7, ThAT8.1, ThAT8.6,<br />

ThBPT10.9, ThBPT10.11, ThBPT10.24,<br />

ThBT1.3, ThBT4.8, ThBT5.7, ThCT2.2,<br />

ThCT5.6, ThDPT10.15, ThDT7.4,<br />

ThDT8.3, TuAPT10.1, TuAPT10.18,<br />

TuAPT10.19, TuAPT10.21, TuAT8.4,<br />

TuBPT10.19, TuCT5.7, TuCT8.1,<br />

WeAT4.7, WeAT5.7, WeAT7.7,<br />

WeAT7.8, WeAT8.8, WeBT2.2,<br />

WeBT2.4, WeBT3.3, WeBT6.2,<br />

WeBT7.2, WeBT8.1, WeBT9.1,<br />

WeBT9.8, WeCPT10.4, WeCT4.5,<br />

WeCT5.2, WeCT5.8, WeCT7.7,<br />

WeDPT10.11, WeDT8.2, WeDT8.5<br />

Multifingered Hands ThBT1.4, ThBT1.5, ThCPT10.1,<br />

ThCPT10.2, TuAT6.3, TuBT6.3,<br />

TuBT6.4, TuBT6.5, TuBT6.8, TuBT7.4,<br />

TuBT7.7, TuCPT10.13, TuCPT10.14,<br />

TuCPT10.16, TuCT6.3, TuCT6.4,<br />

TuCT6.5, WeAPT10.13, WeAPT10.14<br />

N<br />

Navigation MoAT2.2, MoAT2.7, MoAT6.8,<br />

ThAPT10.11, ThAPT10.23, ThAT1.1,<br />

ThAT3.1, ThAT3.2, ThAT3.4, ThAT4.4,<br />

ThAT6.5, ThBPT10.7, ThBPT10.10,


ThBPT10.17, ThBT2.4, ThBT6.6,<br />

ThBT8.3, ThCPT10.4, ThCPT10.18,<br />

ThCT8.3, ThCT8.7, ThCT9.1, TuAT2.2,<br />

TuAT2.4, TuAT9.1, TuAT9.2, TuAT9.3,<br />

TuBPT10.4, TuCT1.3, TuCT2.5,<br />

TuCT8.7, WeAPT10.5, WeAT1.8,<br />

WeBT1.6, WeBT2.3, WeBT2.7,<br />

WeBT2.8, WeCPT10.3, WeCT3.3,<br />

WeCT5.2, WeCT6.6, WeDPT10.18,<br />

WeDT5.1, WeDT5.5, WeDT5.8,<br />

WeDT6.8, WeDT9.5<br />

Networked Robots ThAPT10.20, ThBT4.6, ThCPT10.12,<br />

ThCT4.6, ThCT9.1, ThDPT10.12,<br />

ThDT8.4, TuAT2.3, TuAT2.7, WeAT8.1,<br />

WeAT8.4, WeAT8.6, WeBPT10.22,<br />

WeBPT10.24, WeBT8.1, WeBT8.3,<br />

WeCT8.4, WeDPT10.22, WeDT8.2,<br />

WeDT8.5<br />

Networked Teleoperation MoAT4.4, MoAT4.6, MoAT4.7,<br />

MoAT4.8, MoBPT10.10, MoBPT10.12,<br />

MoBT7.3, MoBT7.7, ThAT8.8,<br />

WeCT4.2, WeCT4.3<br />

Neural and Fuzzy Control ThBT7.2, TuAT2.7, WeBT8.7,<br />

WeCT1.1<br />

Neurorobotics TuBT8.7, TuCT5.2, TuCT5.4,<br />

WeAPT10.10<br />

New Actuators for Robotics MoAT9.2, MoAT9.7, MoBT9.2,<br />

MoBT9.4, MoBT9.8, ThAT1.5,<br />

ThAT1.7, ThAT7.4, ThBPT10.2,<br />

ThBPT10.19, ThCT2.6, ThDPT10.6, TuAPT10.22<br />

Nonholonomic Motion Planning ThAPT10.10, ThAPT10.12, ThBT3.5,<br />

ThBT3.8, ThCPT10.8, WeAT5.4,<br />

WeAT8.2, WeBPT10.13, WeBT5.7,<br />

WeBT9.1, WeDT5.1, WeDT5.4,<br />

WeDT5.6<br />

O<br />

Omnidirectional Vision MoBT2.3, MoBT2.5, ThBT2.1,<br />

ThCT8.2, TuAPT10.5<br />

P<br />

Parallel Robots ThAT7.2, ThBT7.5, ThCPT10.20,<br />

ThCT7.5, ThDPT10.20, ThDT4.2,<br />

TuBT7.8, TuCT7.6, WeAPT10.18,<br />

WeAT4.6, WeBPT10.12, WeBT9.7,<br />

WeCT4.7<br />

Parts Feeding and Fixturing TuBT6.7<br />

Path Planning for Manipulators ThAT3.8, ThBT3.4, ThCPT10.7, ThCT1.1, ThCT1.2, ThCT2.5, ThDPT10.5, TuBT7.8, WeAT5.5, WeAT5.6, WeBPT10.14, WeBPT10.15, WeBT5.6, WeCPT10.15, WeDT5.3<br />

Path Planning for Multiple Mobile Robots or Agents MoBT1.6, ThAPT10.12, ThBT9.8, ThCT9.7, TuAPT10.3, TuCT1.1, WeAT8.2, WeBT5.4, WeBT5.6, WeBT8.4, WeCPT10.13, WeCPT10.15, WeCPT10.22, WeCT1.1, WeCT5.2, WeCT6.4, WeCT8.1, WeCT8.2, WeCT8.5, WeCT8.6, WeCT8.7, WeCT8.8, WeDPT10.16, WeDPT10.23, WeDPT10.24, WeDT5.6, WeDT5.7<br />

Personal Robots MoAT7.1, ThBT2.6, ThCPT10.6, ThCT6.3, ThCT6.7, WeAT1.4, WeBPT10.1, WeBT1.5, WeBT1.8, WeCPT10.2<br />

Physical Human-Robot Interaction MoAT7.8, MoAT8.6, MoAT9.1, MoBPT10.24, ThAPT10.1, ThAPT10.9, ThAT7.1, ThCT1.7, ThCT1.8, ThCT7.1, ThCT7.2, ThCT7.3, ThCT7.4, ThCT7.5, ThDPT10.19, ThDPT10.20, TuCT5.5, TuCT5.8, WeAPT10.11, WeAT1.1, WeAT1.3, WeAT1.4, WeAT1.7, WeAT4.8, WeBPT10.1, WeBT1.3, WeBT1.7, WeBT4.2, WeBT9.2, WeCT1.5, WeCT4.4, WeCT4.7, WeCT4.8, WeDPT10.2, WeDPT10.10, WeDT1.3, WeDT1.4, WeDT3.6<br />

Planning, Scheduling and Coordination MoBT6.2, ThAPT10.14, ThAT3.5, ThBPT10.8, ThBT3.3, ThBT3.5, ThCPT10.8, ThCT1.2, ThCT2.1, ThCT2.8, ThCT4.6, ThCT4.8, ThCT8.6, ThDPT10.12, ThDPT10.24, TuAT9.2, TuAT9.3, TuAT9.4, TuAT9.5, TuBPT10.22, TuBPT10.23, WeAT1.1, WeBT8.8, WeCT8.6, WeDPT10.24, WeDT6.5, WeDT8.1, WeDT8.8<br />

Programming Environment TuAT7.3, TuAT7.6, TuBPT10.18,<br />

WeCT7.1<br />

R<br />

Range Sensing MoAT9.8, ThAT2.2, ThAT2.3, ThAT2.5,<br />

ThAT2.6, ThAT2.8, ThAT8.2,<br />

ThBPT10.5, ThBPT10.6, ThBT2.6,<br />

ThBT9.2, ThCPT10.6, ThCT6.5,<br />

ThDPT10.17, TuAT1.8, TuBT2.4,<br />

TuBT7.6, TuCPT10.4, TuCPT10.18,<br />

TuCT9.4, TuCT9.7, WeAPT10.22,<br />

WeAT2.1, WeCT9.5, WeDPT10.26, WeDT7.1<br />

Reactive and Sensor-Based Planning MoBT8.7, ThAPT10.15, ThCT8.5,<br />

ThCT8.7, ThDPT10.23, TuAT9.7,<br />

WeBT1.7, WeBT2.8, WeCT5.1,<br />

WeCT5.5, WeDPT10.14, WeDT5.2,<br />

WeDT6.6, WeDT8.7<br />

Recognition MoBT3.5, ThAT2.1, ThAT2.8, ThCT5.4,<br />

ThCT6.4, ThCT6.5, ThCT6.8,<br />

ThDPT10.13, ThDPT10.16,<br />

ThDPT10.17, TuAPT10.8, TuAT1.1,<br />

TuAT1.2, TuAT1.3, TuAT1.4, TuAT1.5,<br />

TuAT1.6, TuAT1.8, TuBPT10.1,<br />

TuBPT10.2, TuBPT10.3, TuBT1.1,<br />

TuBT1.4, TuBT2.4, TuBT2.7, TuBT9.2,<br />

TuBT9.5, TuCPT10.1, TuCPT10.4,<br />

TuCPT10.23, WeAT2.2, WeAT2.4,<br />

WeAT3.3, WeAT3.8, WeAT9.2,<br />

WeAT9.6, WeBPT10.4, WeBPT10.27,<br />

WeCT2.2, WeCT2.4, WeDPT10.4,<br />

WeDT2.1, WeDT2.3, WeDT2.8,<br />

WeDT3.1, WeDT7.1<br />

Redundant Robots MoAT5.8, MoBT7.5, MoBT7.7,<br />

MoBT8.2, MoBT8.3, ThAT1.2,<br />

ThAT9.8, ThBT1.3, TuAPT10.17,<br />

TuAT8.2, TuAT8.4, TuBPT10.19,<br />

TuCT1.8, WeBT7.7, WeBT7.8,<br />

WeCT7.4, WeCT7.8, WeDPT10.19<br />

Rehabilitation Robotics MoBT7.6, ThAT5.8, ThBT1.1, ThBT7.2,<br />

ThBT7.3, ThBT7.4, ThCPT10.19,<br />

ThCT1.7, ThCT7.1, ThCT7.2, ThCT7.3,<br />

ThCT7.5, ThCT7.6, ThDPT10.20,<br />

ThDPT10.21, TuAPT10.18, TuBT5.2,<br />

TuCT5.1, TuCT5.2, TuCT5.4, TuCT5.5,<br />

TuCT5.6, TuCT5.7, TuCT5.8, TuCT7.1,<br />

WeAPT10.10, WeAPT10.11,<br />

WeAPT10.12, WeBT1.1, WeBT1.4, WeCPT10.1, WeCT4.8, WeDT9.2<br />

Robot Companions and Social Human-Robot Interaction ThAPT10.17, WeAT1.4, WeAT1.8,<br />

WeBPT10.1, WeBT1.1, WeBT1.2,<br />

WeCT1.4, WeCT1.5, WeCT3.4,<br />

WeDPT10.1, WeDPT10.2,<br />

WeDPT10.7, WeDT1.2, WeDT1.7,<br />

WeDT7.5


Robot Safety MoAT6.4, MoBPT10.16, ThAT8.1,<br />

ThBT3.8, ThBT9.1, TuCT7.8,<br />

WeCT3.1, WeCT3.2, WeCT3.3,<br />

WeCT3.4, WeCT3.5, WeCT5.6,<br />

WeDPT10.7, WeDPT10.8, WeDPT10.15<br />

Robotic Software, Middleware and Programming Environments ThAPT10.19, ThCT2.3, ThCT4.4, ThCT9.8, ThDPT10.10, TuAT7.2, TuAT7.4, TuAT7.5, TuAT7.6, TuAT7.8, TuBPT10.16, TuBPT10.17, TuBPT10.18, TuCT6.2, WeAT2.7, WeDT8.4<br />

Robotics in Agriculture and Forestry ThBT2.5, ThCPT10.5, TuBT8.3, TuBT8.5, TuCPT10.20, WeAT9.1<br />

Robotics in Construction ThAT8.2, ThCT4.8, WeDT9.7<br />

Robotics in Hazardous Fields MoAT5.6, MoAT6.4, MoAT6.5, MoBPT10.15, MoBPT10.16, MoBPT10.17, ThCT4.7<br />

S<br />

Search and Rescue Robots MoAT5.6, MoBPT10.15, MoBT8.7, ThBT8.1, ThBT8.2, ThBT8.3, ThBT8.4,<br />

ThBT8.5, ThCPT10.22, ThCPT10.23,<br />

TuAT8.3, TuCT2.4, TuCT8.5, TuCT8.8,<br />

WeAPT10.4, WeAPT10.20, WeBT2.3,<br />

WeDT1.2<br />

Sensor Fusion MoAT2.3, MoAT3.8, MoAT6.7,<br />

MoAT7.4, MoBPT10.19, MoBT3.7,<br />

MoBT9.6, ThAPT10.6, ThAT2.6,<br />

ThAT4.7, ThAT8.5, ThBPT10.6,<br />

ThBPT10.23, ThBT2.2, ThBT6.3,<br />

ThBT8.6, ThBT9.2, ThCPT10.24,<br />

ThDT3.4, ThDT6.3, ThDT8.1, ThDT8.2,<br />

TuAPT10.24, TuAT2.4, TuAT2.8,<br />

TuAT8.2, TuBPT10.4, TuBT5.7,<br />

TuBT9.2, TuBT9.5, TuCPT10.23,<br />

TuCT2.1, TuCT9.7, WeAT2.5,<br />

WeAT3.7, WeAT6.7, WeAT6.8,<br />

WeBPT10.5, WeBT2.1, WeBT2.2,<br />

WeBT2.3, WeBT2.4, WeBT2.5,<br />

WeBT2.6, WeBT6.6, WeCPT10.4,<br />

WeCPT10.5, WeCPT10.6,<br />

WeCPT10.18, WeCT9.1, WeCT9.2,<br />

WeCT9.3, WeCT9.8, WeDT2.6,<br />

WeDT7.8<br />

Sensor Networks MoAT3.5, MoBPT10.8, MoBT2.6,<br />

ThAPT10.5, ThAPT10.14, ThDT8.4,<br />

TuAPT10.6, TuAT2.8, TuCT9.2,<br />

WeAT6.6, WeAT8.1, WeAT8.6,<br />

WeBPT10.18, WeBPT10.24, WeBT8.2,<br />

WeBT8.5, WeCPT10.23, WeCT5.4,<br />

WeCT6.1, WeCT6.4, WeDPT10.13,<br />

WeDPT10.16, WeDT2.5, WeDT6.5<br />

Service Robots ThAPT10.18, ThBT1.2, ThCT6.6,<br />

ThDPT10.18, TuAT1.1, TuAT1.4,<br />

TuBPT10.1, TuBT7.1, TuBT7.2,<br />

TuBT9.1, WeAT2.7, WeAT9.3,<br />

WeBT1.6, WeBT1.7, WeCPT10.3,<br />

WeDT7.6<br />

SLAM MoAT2.1, MoAT2.3, MoAT2.7,<br />

MoBT3.4, MoBT6.8, ThAPT10.22,<br />

ThAT6.4, ThAT6.5, ThAT6.8,<br />

ThBPT10.16, ThBPT10.17, ThBT6.5,<br />

ThCPT10.17, ThCT1.5, ThDPT10.2,<br />

ThDT7.2, ThPT11.1, TuAPT10.7,<br />

TuAT2.1, TuAT2.2, TuAT2.3, TuAT2.4,<br />

TuAT2.5, TuAT2.6, TuAT2.8,<br />

TuBPT10.4, TuBPT10.5, TuBPT10.6,<br />

TuBT2.1, TuBT2.2, TuBT2.3, TuBT2.4,<br />

TuBT2.5, TuBT2.6, TuBT2.7, TuBT2.8,<br />

TuCPT10.4, TuCPT10.5, TuCPT10.6, TuCT2.4, TuCT2.5, TuCT2.6, WeAPT10.4, WeAPT10.5, WeAPT10.6, WeCT9.4, WeCT9.6, WeDPT10.25, WeDPT10.27, WeDT6.7, WeDT9.4<br />

<strong>2011</strong> IEEE/RSJ International Conference on Intelligent Robots and Systems<br />

–269–<br />

Smart Actuators MoAT1.8, MoAT9.3, MoAT9.6,<br />

MoAT9.8, MoBPT10.27, MoBT9.3,<br />

MoBT9.7, TuBT8.8, TuCT9.5,<br />

WeAPT10.23<br />

Sonars WeCT1.7, WeCT6.7, WeDT6.7<br />

Space Robotics MoBT6.1, MoBT6.2, MoBT6.3,<br />

MoBT6.4, MoBT6.5, MoBT6.6,<br />

MoBT6.7, MoBT6.8, ThBT6.2,<br />

ThCT1.6, ThDPT10.3, TuAPT10.13,<br />

TuAPT10.14, TuAPT10.15, WeAT4.8,<br />

WeCT3.6, WeDPT10.9<br />

Surveillance Systems ThBT8.2, ThCT8.4, ThDPT10.22,<br />

TuBT1.6, TuBT1.7, TuCPT10.3,<br />

TuCT2.7, WeAT2.3, WeAT9.8,<br />

WeBT8.6, WeCPT10.24, WeCT3.1,<br />

WeCT5.4, WeDPT10.13, WeDT1.3,<br />

WeDT7.3<br />

T<br />

Telerobotics MoAT4.1, MoAT4.3, MoAT4.4,<br />

MoAT4.5, MoAT4.6, MoAT4.7,<br />

MoAT4.8, MoBPT10.10, MoBPT10.11,<br />

MoBPT10.12, MoBT1.5, MoBT7.1,<br />

MoBT7.2, MoBT7.3, MoBT7.4,<br />

MoBT7.5, ThAT8.8, ThBT7.1, ThCT3.7,<br />

ThCT4.7, TuAPT10.2, TuAPT10.16,<br />

TuAPT10.17, TuAT3.3, TuBT5.8,<br />

WeAT3.5, WeAT6.4, WeBPT10.8,<br />

WeBPT10.16, WeBT3.1, WeBT3.4,<br />

WeCPT10.7, WeCT4.3<br />

Tendon/Wire Mechanism MoBT5.7, MoBT9.5, ThBT1.7,<br />

ThCT8.8, TuAPT10.23, TuAT5.2<br />

U<br />

Ubiquitous Robotics ThAPT10.5, WeDT2.5<br />

Underactuated Robots MoBT6.4, ThAT3.7, ThAT7.6, ThAT9.5,<br />

ThBPT10.21, ThBPT10.26, ThBT3.7,<br />

ThCT3.6, ThCT4.3, ThDPT10.9,<br />

ThDT5.2, TuAPT10.13, TuBT6.4,<br />

TuBT6.5, TuCPT10.13, TuCPT10.14,<br />

TuCT7.3, TuCT8.5, WeAPT10.20,<br />

WeBT6.4, WeBT7.2, WeBT7.4,<br />

WeBT7.8, WeCPT10.16, WeCPT10.19<br />

V<br />

Virtual Reality and Interfaces MoAT2.6, MoAT4.4, MoBPT10.6, MoBPT10.10, MoBT3.5, ThAPT10.8, ThCT1.8, TuAPT10.8, TuBT5.3, TuBT5.4, TuBT5.8, TuCPT10.10, WeBT3.4, WeBT4.6, WeBT4.8, WeCPT10.7, WeCPT10.12, WeDT3.3, WeDT3.5<br />

Visual Learning ThAT6.3, ThBT6.1, ThCT1.3, ThCT6.4,<br />

ThCT6.8, ThDPT10.16, TuAT1.3,<br />

TuAT1.5, TuAT2.6, TuAT9.8,<br />

TuBPT10.2, TuBPT10.6, TuBT1.7,<br />

TuBT9.2, TuBT9.5, TuCPT10.23,<br />

WeAT2.2, WeAT2.6, WeAT9.2,<br />

WeAT9.7, WeBPT10.6, WeDT1.8<br />

Visual Navigation MoAT2.1, MoAT2.5, MoAT6.7,<br />

MoBPT10.5, MoBT2.5, ThAT6.4,<br />

ThAT8.3, ThAT8.8, ThBPT10.16,<br />

ThBT2.3, ThBT4.7, ThBT6.1, ThBT6.6,<br />

ThBT6.8, ThCPT10.18, ThDT3.3,<br />

ThPT11.1, TuAPT10.5, TuBT2.1,<br />

TuBT2.5, TuBT2.6, TuCPT10.5,<br />

TuCPT10.6, TuCT1.5, TuCT2.1,<br />

TuCT2.2, TuCT2.3, TuCT2.4, TuCT2.5,<br />

WeAPT10.2, WeAPT10.4, WeAPT10.5,


WeAT6.7, WeBT2.6, WeBT6.4,<br />

WeBT6.6, WeCPT10.6, WeCPT10.16,<br />

WeCPT10.18, WeCT9.3<br />

Visual Servoing MoBT6.6, MoBT9.6, ThAT6.1,<br />

ThBT1.5, ThBT6.2, ThCPT10.2,<br />

ThCT8.1, TuAPT10.15, TuAPT10.24,<br />

TuAT6.6, TuBPT10.15, TuBT1.2,<br />

TuCT1.5, WeAPT10.2, WeBT9.1,<br />

WeBT9.2, WeBT9.3, WeBT9.4,<br />

WeBT9.5, WeBT9.6, WeBT9.7,<br />

WeBT9.8, WeCPT10.25, WeCPT10.26,<br />

WeCPT10.27, WeCT2.7<br />

Visual Tracking MoAT1.4, MoAT2.5, MoAT3.4,<br />

MoAT8.3, MoBPT10.1, MoBPT10.5,<br />

MoBPT10.7, MoBT6.6, ThAPT10.3,<br />

ThAPT10.6, ThAPT10.17, ThAT2.4,<br />

ThAT2.7, ThAT4.7, ThAT6.1, ThAT8.3,<br />

ThBPT10.4, ThBT4.7, ThBT6.2,<br />

ThBT6.4, ThBT8.6, ThCPT10.16,<br />

ThCPT10.24, ThCT8.1, ThCT8.4,<br />

ThDPT10.22, ThDT8.1, TuAPT10.15,<br />

TuAT3.3, TuBT1.6, TuBT3.5, TuBT7.4,<br />

TuCPT10.3, TuCPT10.8, TuCPT10.16,<br />

WeAT2.5, WeAT3.4, WeAT9.4,<br />

WeBPT10.5, WeBPT10.7, WeBPT10.25, WeBT6.3, WeBT9.3, WeBT9.5, WeCPT10.26, WeCT2.7, WeCT2.8, WeCT7.4, WeCT9.7, WeDPT10.19, WeDT1.6, WeDT2.6, WeDT2.8, WeDT7.5<br />

Voice, Speech Synthesis and Recognition MoAT3.1, MoAT3.2, MoAT3.3, MoAT3.6, MoBPT10.9, MoBT3.1, MoBT3.2, MoBT3.3, MoBT3.5, MoBT3.6, MoBT3.7, MoBT3.8, TuAPT10.8, TuAPT10.9, WeCT1.6, WeDPT10.3, WeDT1.1


Alan is looking for new colleagues!<br />

Bosch Research and Technology Center<br />

www.boschresearch.com




Service robots. Human-robot interaction. Edutainment.<br />

The age of service robotics has long since dawned.<br />

In close cooperation with renowned universities and institutes<br />

around the world, KUKA Laboratories is developing new<br />

solutions and setting new technological trends. Today, the product portfolio of KUKA Laboratories ranges from<br />

robotics technologies for medical applications to the entertainment<br />

sector; from motion simulators to technologies for<br />

patient positioning or medical imaging; and from production<br />

assistants that collaborate with humans to reference products<br />

for research and education. KUKA Laboratories is shaping the<br />

age of service robotics with its high-tech solutions.<br />

Capture this QR Code with your mobile phone<br />

and learn more about KUKA Laboratories:<br />

www.kuka-labs.com


Redefining the Design<br />

Space for Robotics<br />

• Taurus fieldable dexterous telemanipulation<br />

• Wall climbing robots and new grasping solutions<br />

• KARTO mapping capability<br />

• And more…<br />

About SRI International<br />

Visit Us in the<br />

Exhibit Area<br />

to Learn about<br />

Our Latest<br />

Robotics<br />

Innovations<br />

Silicon Valley-based SRI International, a nonprofit research and development<br />

organization, performs sponsored R&D for governments, businesses, and<br />

foundations. SRI brings its innovations to the marketplace through technology<br />

licensing, new products, and spin-off ventures. Commemorating its 65th<br />

anniversary in <strong>2011</strong>, SRI is known for world-changing innovations in computing<br />

and robotics, health and pharmaceuticals, chemistry and materials, sensing,<br />

energy, education, and national defense.<br />

www.sri.com/robotics


Transform your workforce with ABB robotics.<br />

Jumpstart your business with ABB robotics.<br />

ABB Inc.<br />

Discrete Automation and Motion<br />

Robotics<br />

Auburn Hills, MI<br />

248-391-9000<br />

The reality of the current business climate presents an<br />

incredible opportunity. Make good decisions now and you will<br />

emerge as a leader in your industry. With the most versatile<br />

and extensive lineup of top performing robots available, ABB<br />

will give you the automation edge to outsmart the competition.<br />

Examples include picking, packing and palletizing robots with better cycle times and higher payloads; FlexArc® welding systems; precise, flexible and safe foundry robots for forging, casting, machining and material removal; high-speed, compact robots to assemble delicate solar cells; and large, long-reach robots to handle complete solar modules.<br />

ABB Robotics - the competitive edge you need!<br />

www.abb.com/robotics


It’ll only cost you an arm.<br />

Announcing the PR2 SE for under $200,000.<br />

As you can see, there’s something quite distinct about this<br />

model. The PR2 SE is a single-armed robot with an updated<br />

sensor suite.<br />

Particularly in light of such new programs as the NSF<br />

National Robotics Initiative, we’ve been encouraged to offer<br />

our personal robot platform at a more affordable price. The<br />

PR2 SE is priced at $285,000, but a 30% discount is offered<br />

to individuals with a proven track record of contributions to<br />

the open source community.<br />

While it doesn’t enable two-handed tasks, the PR2 SE is able<br />

to leverage the work of the PR2 community on navigation,<br />

perception, manipulation and grasping. And if you find<br />

you need a second arm, you can add one later. For more<br />

information, email us at pr2info@willowgarage.com or visit<br />

willowgarage.com.<br />

©<strong>2011</strong> Willow Garage.


Join the 450 prestigious universities already in the NAO community!<br />

SOON OPEN SOURCE<br />

www.aldebaran-robotics.com<br />

americas@aldebaran-robotics.com<br />

Intuitive Surgical leads in the development and commercialization of medical robotic technology. The da Vinci® Surgical System enables surgeons to perform delicate and complex operations minimally invasively, with increased vision, precision, dexterity and control. State-of-the-art robotic technology allows the surgeon’s hand movements to be scaled, filtered and translated into precise movements of the instruments working inside the patient’s body. The end result: a breakthrough in surgical capabilities.<br />

Taking surgical precision beyond the limits of the human hand<br />


SCHUNK service robotics advertisement<br />
