VVV08 Pointing People

From Wiki for iCub and Friends
Revision as of 19:01, 28 July 2008 by Vvv08 (talk | contribs)

The Pointing Group members:

  • Giacomo Spigler
  • Jasmin Steinwender
  • Stephane Lallee
  • Frank Förster

Task idea: Imagine a human and the robot playing together. There are objects in the world and the human points towards one of them. The robot then has to guess which target object the human has selected, asking for example: "Are you pointing to the red circle?".

To make things easy, the objects are attached to a wall and the human is standing next to the wall.

Armpointing original.png

Future Tasks:

  • 3D localization and object segmentation
  • Human-robot interaction with instructor-learner roles, using visual attention techniques to find the object being pointed at

Developmental stages:

  • Deciding and discussing the world configuration
  • Setting up the constraints (position, background, configuration)
  • Detecting the blob target and the arm (originally by subtracting the background from the current picture; update: background subtraction is no longer needed, movement detection is used instead)
  • Classifying the arm as arm and the blobs as targets (with the help of blob.h)
  • Fitting a line through the arm to determine the pointing direction (Hough transform)
  • If the line crosses a previously defined object that was not classified as arm, this is the target the human is pointing to.
  • Then the target properties (color, shape) are determined.
  • Text-to-speech is used to let the robot ask: "Are you pointing at the green circle?"
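The line-fit and intersection steps above can be sketched as follows. This is a simplified stand-in, not the actual VVV08 code: it uses a least-squares fit through the segmented arm pixels instead of a Hough transform, and all names (`fit_line`, `crosses`, the example blobs) are hypothetical.

```python
def fit_line(points):
    """Fit y = a*x + b through (x, y) arm pixels (least squares)."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def crosses(box, a, b):
    """True if the line y = a*x + b passes through the bounding box
    (xmin, ymin, xmax, ymax) of a target blob."""
    xmin, ymin, xmax, ymax = box
    y0, y1 = a * xmin + b, a * xmax + b
    lo, hi = min(y0, y1), max(y0, y1)
    return not (hi < ymin or lo > ymax)

if __name__ == "__main__":
    arm = [(0, 0), (10, 5), (20, 10), (30, 15)]   # segmented arm pixels
    blobs = {"red circle": (50, 20, 60, 30),      # target bounding boxes
             "blue square": (50, 60, 60, 70)}
    a, b = fit_line(arm)
    hits = [name for name, box in blobs.items() if crosses(box, a, b)]
    print(hits)   # -> ['red circle']
```

The same intersection test works unchanged with a line produced by a Hough transform, which is what the pipeline above actually uses.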

Armpointing armdetected.png Armpointing linefit.png Armpointing blobdetect.png Armpointer nobackground.png Armpointer direction.png

Magic pointer

The idea is to have the robot point to the same object as the human. To do this, after the pointed object has been detected, we must calculate joint angles so that the robot's arm points to that object.

We will do this with a neural network trained on 'natural' pointing postures.
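A minimal sketch of that idea, assuming a toy setup: a one-hidden-layer network (numpy, plain batch gradient descent) that maps a target position (x, y, z) to a pair of shoulder angles. The training data here is synthetic; in the real project it would be the database of recorded natural pointing postures, and the joint layout is a deliberate simplification of the iCub arm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pointing postures": targets in front of the robot, joint
# angles a smooth function of the target (stand-in for recorded data).
X = np.column_stack([rng.uniform(0.5, 1.5, 200),    # target x (depth)
                     rng.uniform(-1.0, 1.0, 200),   # target y
                     rng.uniform(-1.0, 1.0, 200)])  # target z
Y = np.column_stack([np.arctan2(X[:, 1], X[:, 0]),  # shoulder yaw
                     np.arctan2(X[:, 2], X[:, 0])]) # shoulder pitch

W1 = rng.normal(0, 0.5, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 2)); b2 = np.zeros(2)

def forward(x):
    h = np.tanh(x @ W1 + b1)        # one hidden layer, tanh activation
    return h, h @ W2 + b2

mse0 = float(((forward(X)[1] - Y) ** 2).mean())
lr = 0.05
for _ in range(2000):               # plain batch gradient descent
    h, pred = forward(X)
    err = pred - Y                  # gradient of mean squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(((forward(X)[1] - Y) ** 2).mean())
print(f"MSE before/after training: {mse0:.3f} / {mse:.3f}")
```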

We have already written a program that opens a port which acts as a neural network; you can train it on whatever you want.
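The real module exposes a YARP port; as a dependency-free stand-in, this sketch (all names hypothetical) serves predictions over a plain TCP socket instead: the client sends a whitespace-separated input vector, the server replies with the network's output in the same format.

```python
import socket, threading, time

def predict(vec):
    # Placeholder for the trained network; here just a fixed linear map.
    return [2.0 * v + 1.0 for v in vec]

def serve_once(host="127.0.0.1", port=9021):
    """Answer one query: read a whitespace-separated vector from the
    connection, reply with the network output."""
    srv = socket.create_server((host, port))
    conn, _ = srv.accept()
    vec = [float(t) for t in conn.recv(1024).decode().split()]
    conn.sendall(" ".join(f"{v:g}" for v in predict(vec)).encode())
    conn.close()
    srv.close()

if __name__ == "__main__":
    t = threading.Thread(target=serve_once)
    t.start()
    time.sleep(0.2)                       # let the server start listening
    cli = socket.create_connection(("127.0.0.1", 9021))
    cli.sendall(b"1 2 3")
    print(cli.recv(1024).decode())        # -> 3 5 7
    cli.close()
    t.join()
```

With YARP, the same request/reply pattern maps naturally onto a Bottle read/write on an RPC-style port.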

The second (current) step is to generate the natural-pointing database; we plan to do this by posing the simulator manually. (If anyone has a better idea... ^^' )
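One way the database could be generated without hand-posing the simulator is to sample joint angles and compute where the arm points on the wall. This is only a hedged sketch: the 2-DOF yaw/pitch model and the wall-at-fixed-depth assumption are toy simplifications, not the iCub's actual kinematics.

```python
import math, random

def point_on_wall(yaw, pitch, d=1.0):
    """Intersect the pointing ray with a wall at x = d (toy forward model)."""
    dx = math.cos(pitch) * math.cos(yaw)   # ray direction from yaw (about z)
    dy = math.cos(pitch) * math.sin(yaw)   # and pitch (about y)
    dz = math.sin(pitch)
    t = d / dx                             # scale so the ray reaches the wall
    return (d, t * dy, t * dz)

def make_database(n=1000, seed=0):
    """Sample joint angles in a 'natural' range and record
    (target position, joint angles) pairs for training."""
    rnd = random.Random(seed)
    db = []
    for _ in range(n):
        yaw = rnd.uniform(-0.6, 0.6)
        pitch = rnd.uniform(-0.4, 0.4)
        db.append((point_on_wall(yaw, pitch), (yaw, pitch)))
    return db

db = make_database(5)
print(len(db), db[0][0][0])   # 5 entries, every target on the wall at x = 1.0
```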