
Session WedAT1 Pegaso A Wednesday, October 10, 2012 ... - Lirmm


Session WedBT1, Pegaso A, Wednesday, October 10, 2012, 09:30–10:30

Sensor Fusion

Chair: Reid Simmons, Carnegie Mellon Univ.
Co-Chair: Magnus Jansson, KTH Royal Inst. of Tech.

09:30–09:45 WedBT1.1

Ground Plane Feature Detection in Mobile Vision-Aided Inertial Navigation

Ghazaleh Panahandeh, Nasser Mohammadiha, Magnus Jansson
KTH Royal Institute of Technology, Sweden

• The hardware of the mobile system consists of a monocular camera mounted on an inertial measurement unit (IMU).
• Exploiting the complementary nature of the IMU-camera sensor fusion system for estimating the camera translation and rotation, the developed algorithm consists of two parts:
  I. Homography-based outlier rejection
  II. Normal-based outlier rejection
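The homography-based rejection step (part I) can be sketched as follows. This is a minimal illustration, assuming normalized image coordinates and a known plane normal and distance; the function names and the error threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def homography_from_motion(R, t, n, d):
    """Plane-induced homography H = R + (t n^T) / d for a plane with
    unit normal n at distance d from the first camera center."""
    return R + np.outer(t, n) / d

def ground_plane_inliers(pts1, pts2, R, t, n, d, tol=1e-2):
    """Flag correspondences consistent with the ground-plane homography.

    pts1, pts2: (N, 2) normalized image coordinates in two views.
    R, t: relative camera rotation/translation (e.g. propagated from the IMU).
    Returns a boolean mask: True where the feature maps through H onto
    its match, i.e. it is assumed to lie on the ground plane.
    """
    H = homography_from_motion(R, t, n, d)
    ones = np.ones((pts1.shape[0], 1))
    p1h = np.hstack([pts1, ones])        # homogeneous coordinates
    proj = (H @ p1h.T).T                 # map view-1 points through H
    proj = proj[:, :2] / proj[:, 2:3]    # de-homogenize
    err = np.linalg.norm(proj - pts2, axis=1)
    return err < tol
```

Features whose transfer error exceeds the threshold are rejected as off-plane (the outliers); the surviving set feeds the motion estimate.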

10:00–10:15 WedBT1.3

Gaussian Process for Lens Distortion Modeling

Pradeep Ranganathan and Edwin Olson
Computer Science and Engineering, University of Michigan, USA

Contributions:
• Incorporate a GP model into a factor graph inference framework optimized for Gaussian factor potentials.
• Model evaluation based on a test image set.
• GP distortion models achieve accuracy comparable to the best parametric model.
• GP models provide an implicit but rigorous framework for automatically determining distortion model complexity.
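The core idea of modeling distortion nonparametrically can be sketched as plain GP regression on distortion residuals (one GP per image axis). The kernel, hyperparameters, and function names below are illustrative assumptions; the paper's factor-graph integration is not reproduced here.

```python
import numpy as np

def rbf_kernel(A, B, length=0.5, var=1.0):
    """Squared-exponential kernel between point sets A (N, 2) and B (M, 2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_fit_predict(X, y, Xq, noise=1e-4):
    """GP posterior mean at query points Xq, given training pairs (X, y).

    X: ideal (undistorted) pixel coordinates from a calibration target.
    y: observed distortion residuals (observed minus ideal) on one axis.
    """
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xq, X)
    return Ks @ np.linalg.solve(K, y)
```

Unlike a fixed radial-polynomial model, the effective complexity here is governed by the kernel hyperparameters, which is one way to read the "implicit but rigorous" model-complexity claim above.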

09:45–10:00 WedBT1.2

Sensor Fusion for Human Safety in Industrial Workcells

Paul Rybski, Peter Anderson-Sprecher, Daniel Huber, Chris Niessl, and Reid Simmons
The Robotics Institute, Carnegie Mellon University, USA

• We present a sensor-based approach for ensuring the safety of people in proximity to robots.
• Our approach fuses data from multiple 3D sensors into an evidence grid.
• People are surrounded by a safety zone and robots by a danger zone.
• Impending intersections between safety and danger zones are identified and the robot is stopped.
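The zone-intersection logic can be sketched on a boolean 3D evidence grid as follows. The grid representation, dilation radii, and function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def dilate(grid, r):
    """Grow occupied cells by r steps (Manhattan ball), a crude stand-in
    for inflating a zone around an object. np.roll wraps at the grid
    border, which a real implementation would guard against."""
    out = grid.copy()
    for _ in range(r):
        grown = out.copy()
        for axis in range(out.ndim):
            grown |= np.roll(out, 1, axis) | np.roll(out, -1, axis)
        out = grown
    return out

def must_stop(person_cells, robot_cells, safety_r=2, danger_r=1):
    """True if the person's safety zone intersects the robot's danger zone."""
    return bool((dilate(person_cells, safety_r) &
                 dilate(robot_cells, danger_r)).any())
```

In use, the occupancy grids would be refreshed from the fused 3D sensor data each cycle, and a True result would trigger the robot stop.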

2012 IEEE/RSJ International Conference on Intelligent Robots and Systems

The person’s position is registered by the 3D sensors (shown in green) and is compared against the 3D volume filled by the robot (shown in red).

10:15–10:30 WedBT1.4

Distributed Altitude and Attitude Estimation from Multiple Distance Measurements

Maximilian Kriegleder, Raymond Oung and Raffaello D’Andrea
Institute for Dynamic Systems and Control, ETH Zurich, Switzerland

• Distance sensors are attached to known positions on a rigid body.
• Attitude and altitude may be estimated directly when all sensor measurements are centrally available.
• For distributed sensor networks, this approach is modified into a scalable, distributed scheme.
• In the limit of sharing information between sensor nodes, the estimates approach those obtainable in a centralized system.

Each module of the Distributed Flight Array obtains a distance measurement and communicates with its neighbours.
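The centralized baseline referred to above can be sketched as a least-squares plane fit to the distance measurements. The sign conventions and small-angle attitude extraction below are illustrative assumptions.

```python
import numpy as np

def altitude_attitude(positions, distances):
    """Centralized least-squares fit of a plane h(x, y) = a*x + b*y + c
    to downward distance measurements at known body-frame positions.

    positions: (N, 2) sensor locations on the rigid body (body frame).
    distances: (N,) measured distances to the ground.
    Returns (altitude, roll, pitch); altitude is the height at the body
    origin, and the angles use a small-angle sign convention assumed here.
    """
    A = np.hstack([positions, np.ones((len(distances), 1))])
    (a, b, c), *_ = np.linalg.lstsq(A, distances, rcond=None)
    roll = np.arctan(b)    # rotation about body x (assumed convention)
    pitch = np.arctan(-a)  # rotation about body y (assumed convention)
    return c, roll, pitch
```

The distributed scheme in the paper approaches this solution by exchanging information between neighbouring modules rather than collecting all measurements at one node.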
