
Session WedET2 Fenix 2 Wednesday, October 10, 2012, 15:00–16:00

Emotion Detection and Expression

Chair: C. S. George Lee, Purdue Univ.
Co-Chair: Ren Luo, National Taiwan Univ.

15:00–15:15 WedET2.1
An NARX-Based Approach for Human Emotion Identification
Rami Alazrai and C. S. George Lee
School of Electrical and Computer Engineering, Purdue University, U.S.A.

• Propose an NARX-based approach to capture the spatial and temporal dynamics of facial expressions (see the sketch after this entry).
• Temporal phases of facial expressions are identified using the proposed MIBDIA algorithm.
• The proposed human-emotion recognition system achieved a 91.5% average recognition rate on the CK+ dataset.
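A NARX model (nonlinear autoregressive model with exogenous inputs) predicts the next output from a window of past outputs and past exogenous inputs, y(t) = f(y(t-1), ..., y(t-p), u(t-1), ..., u(t-q)). The digest gives no details of the authors' network, so the following is a minimal sketch, assuming facial-feature vectors as the exogenous input u and an emotion-state vector as the output y; the nonlinear map f (typically a neural network in NARX approaches) is replaced here by a linear least-squares fit to keep the example self-contained. All dimensions and data below are stand-ins.

```python
import numpy as np

def narx_features(y, u, p=2, q=2):
    """Build NARX regressor vectors from past p outputs and q inputs.

    y: (T, dy) output sequence (e.g. emotion-state estimates)
    u: (T, du) exogenous inputs (e.g. facial-landmark features)
    Returns X of shape (T-k, p*dy + q*du) and targets Y of shape
    (T-k, dy), where k = max(p, q).
    """
    k = max(p, q)
    X, Y = [], []
    for t in range(k, len(y)):
        past_y = y[t - p:t].ravel()   # y(t-p) ... y(t-1)
        past_u = u[t - q:t].ravel()   # u(t-q) ... u(t-1)
        X.append(np.concatenate([past_y, past_u]))
        Y.append(y[t])
    return np.array(X), np.array(Y)

# Stand-in data: a smooth 3-dim output driven by 6-dim facial features.
rng = np.random.default_rng(0)
T, dy, du = 200, 3, 6
u = rng.normal(size=(T, du))
y = np.cumsum(rng.normal(size=(T, dy)), axis=0) * 0.01

# Fit a linear stand-in for the nonlinear map f via least squares.
X, Y = narx_features(y, u)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
y_pred = X @ W                        # one-step-ahead predictions
```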

15:15–15:30 WedET2.2
A Design Methodology for Expressing Emotion on Robot Faces
Mohammad Shayganfar, Charles Rich and Candace L. Sidner
Computer Science Dept., Worcester Polytechnic Institute, USA

• Methodology is grounded in the psychological literature (Ekman's FACS).
• Four steps: (i) assign action units to robot DOF, (ii) apply the mapping from basic action units to emotions (see the sketch after this entry), (iii) predict confusions, (iv) add optional action units and cartoon ideas to reduce confusion.
• Demonstrated and evaluated by applying the methodology to a recent humanoid robot (see figure).
• The experimentally observed emotion confusion matrix agrees qualitatively with the predictions of the methodology.
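Step (ii) relies on a mapping from combinations of FACS action units (AUs) to basic emotions. The digest does not reproduce the paper's exact mapping, so the sketch below uses commonly cited EMFACS-style prototypes and computes, for a hypothetical robot, what fraction of each emotion's prototype AUs its degrees of freedom can actuate; emotions with missing AUs are natural inputs to step (iii), predicting confusions.

```python
# Commonly cited FACS prototypes for Ekman's six basic emotions.
# The paper's exact mapping is not given in the digest; these AU
# combinations are textbook EMFACS-style approximations.
EMOTION_AUS = {
    "happiness": {6, 12},
    "sadness":   {1, 4, 15},
    "surprise":  {1, 2, 5, 26},
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},
}

def expressible(robot_aus: set[int]) -> dict[str, float]:
    """Fraction of each emotion's prototype AUs the robot can actuate.

    Emotions scoring below 1.0 lack some AUs and are candidates for
    confusion prediction and for the optional cartoon-inspired cues.
    """
    return {emo: len(aus & robot_aus) / len(aus)
            for emo, aus in EMOTION_AUS.items()}

# Hypothetical robot face with brows, lids, and mouth corners only.
print(expressible({1, 2, 4, 5, 12, 15, 26}))
```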

15:30–15:45 WedET2.3
Development of Expressive Robotic Head for Bipedal Humanoid Robot
Tatsuhiro Kishi, Takuya Otani, Nobutsuna Endo, Przemyslaw Kryczka, Kenji Hashimoto, Kei Nakata and Atsuo Takanishi
Faculty of Science and Engineering, Waseda University, Japan

• A robotic head was developed to increase the facial-expression ability of a bipedal humanoid robot.
• Representative facial expressions for the six basic emotions were designed by cartoonists.
• Compact mechanisms were developed to build the robotic head as small as a Japanese female's head.
• Evaluations with pictures and videos show that the robotic head has extensive facial-expression ability.

15:45–16:00 WedET2.4
Confidence Fusion Based Emotion Recognition of Multiple Persons for Human-Robot Interaction
Ren C. Luo, Pei Hsien Lin, Li Wen Chang
Center for Intelligent Robotics and Automation Research, National Taiwan University

• We propose an integrated system that can track multiple users at one time, recognize their facial expressions, and identify the indoor ambient atmosphere.
• Our facial-expression recognition scheme fuses the Feature Vectors based Approach (FVA) and the Differential-Active Appearance Model Features based Approach (DAFA) to obtain not only apposite positions of feature points but also more information about texture and appearance (a fusion sketch follows this entry).
• With our system, our intelligent robot with vision systems is able to acquire ambient-atmosphere information and interact with people properly.

Figure: a surprise situation of human-robot interaction.
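Confidence-based fusion typically weights each recognizer's class posteriors by a per-observation confidence before combining them. The paper's exact rule is not given in the digest, so this is a minimal sketch, assuming FVA and DAFA each output a posterior over six emotion classes plus a scalar confidence; the room-level "ambient atmosphere" is illustrated as a simple average over the tracked people.

```python
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def fuse(p_fva, c_fva, p_dafa, c_dafa):
    """Confidence-weighted fusion of two recognizers' posteriors.

    p_*: posterior over emotion classes (sums to 1)
    c_*: scalar confidence in [0, 1] for that recognizer
    The weighted-sum rule is an assumption; the paper's exact
    fusion scheme may differ.
    """
    w = np.array([c_fva, c_dafa], dtype=float)
    w /= w.sum() + 1e-12                        # normalize confidences
    fused = w[0] * np.asarray(p_fva) + w[1] * np.asarray(p_dafa)
    return fused / fused.sum()

def ambient_atmosphere(per_person_posteriors):
    """Average fused posteriors over all tracked people to summarize
    the room's emotional atmosphere (an illustrative aggregate)."""
    return np.mean(per_person_posteriors, axis=0)

# Example: two tracked people, each seen by both recognizers.
p1 = fuse([.1, .05, .05, .6, .1, .1], 0.9, [.2, .1, .1, .4, .1, .1], 0.5)
p2 = fuse([.5, .1, .1, .1, .1, .1], 0.7, [.4, .2, .1, .1, .1, .1], 0.8)
room = ambient_atmosphere([p1, p2])
print(EMOTIONS[int(np.argmax(room))])
```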

2012 IEEE/RSJ International Conference on Intelligent Robots and Systems


