Groups 2011

From Wiki for iCub and Friends

Now that the basic tutorials are done, we would like to invite people to pick a project idea and start working on it. Ideally this should be done in groups, since that is more fun, but individual projects are fine too.

You can add your group/project idea below. Include a short description of the project and the list of people who will be working on it. For inspiration, take a look at last year's projects.

CHRIS Group

This group is part of the FP7 project CHRIS (see http://www.chrisfp7.eu/). The focus is on human-robot interaction: the human supervises the robot using verbal commands and teaches the robot to recognize and name new objects.

  • Ugo
  • Stephane
  • Carlo
  • Andrea Takke
  • Vadim
  • Rujiao
  • Rizki

Update: The modules are in place; basic spoken interaction allows the iCub to learn new objects and to point to a specific object when we ask for it.

EFAA Group

See http://efaa.upf.edu/ - "The Experimental Functional Android Assistant (EFAA) project will contribute to the development of socially intelligent humanoids by advancing the state of the art in both single human-like social capabilities and in their integration in a consistent architecture." (from http://icub.org)

  • Maxime
  • Ilaria
  • Marti
  • Uriel
  • Miguel

Status Update: Modules are working with both the simulator and the real iCub (pmp, touchingTable, blobFinder). Integration of the modules is nearly done (blobFinder still missing).

Current Step: Test HandPoint with the real iCub after integrating blobFinder, and do something actually useful with attentionControl.

Force Controlled Crawling

The idea behind this group is to improve the current crawling demo, initially developed in collaboration with the biorobotics laboratory at EPFL. The first step is to implement a Cartesian impedance module that simulates a Cartesian spring-like behaviour, with a higher stiffness in the heading direction and a softer compliance in the direction normal to the ground. At the moment we have some preliminary code for simulating Cartesian stiffness.

  • Francesco
  • Marco
  • Jorhabib
  • Ashwini
  • Gauss
  • Aurelien

The final video demonstrates Cartesian impedance control on the iCub.
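The direction-dependent spring-like behaviour described above can be sketched as a Cartesian spring-damper law. This is a minimal illustration in Python/NumPy, not the actual iCub module; the axis assignment, gain values, and function name are assumptions:

```python
import numpy as np

def cartesian_impedance_force(x, x_des, xdot,
                              k_heading=500.0, k_lateral=200.0, k_normal=50.0,
                              damping_ratio=0.7):
    """Virtual Cartesian spring-damper force (illustrative sketch).

    Axes (assumption): x = heading direction, y = lateral, z = normal to
    the ground. Stiffness is higher along the heading and lower normal to
    the ground, as described for the crawling demo.
    """
    K = np.diag([k_heading, k_lateral, k_normal])  # anisotropic stiffness
    D = 2.0 * damping_ratio * np.sqrt(K)           # near-critical damping
    return -K @ (x - x_des) - D @ xdot             # restoring force
```

The same displacement then produces a much larger restoring force along the heading than normal to the ground, which is exactly the stiff-forward / soft-vertical behaviour the demo needs.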

Skin Group

This group is working to integrate the tactile feedback from the skin with the force control.

  • Andrea
  • Serena
  • Marco
  • Seungsu
  • ...

Eye contact and contingency group

This group is working on detecting contingency in interaction through patterns of eye contact. Can we develop a program that enables an iCub to distinguish between interacting with a live, responsive companion iCub and a companion iCub whose behaviour is a recorded, unresponsive copy of a previous interaction? Infants can do this from a very young age. I'm concentrating on eye contact to mediate the interaction, but other aspects, such as vocalisations, would be fun too.

  • Nick Wilkinson
  • Kiril
  • ...
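One toy way to operationalise contingency, assuming gaze events are reduced to timestamps: a live partner should respond to the robot's gaze shifts within a short latency window, while a replayed recording's timing is decoupled from the current interaction. The function name, window size, and event representation below are all hypothetical:

```python
def contingency_score(own_events, partner_events, window=1.0):
    """Fraction of the robot's gaze events followed by a partner event
    within `window` seconds -- a toy measure of behavioural contingency.

    Assumption: events are lists of timestamps in seconds. A live,
    responsive partner should score high; a replayed recording should
    score near chance level.
    """
    if not own_events:
        return 0.0
    hits = sum(1 for t in own_events
               if any(t < p <= t + window for p in partner_events))
    return hits / len(own_events)
```

Comparing the score against the score obtained with shuffled timestamps would then give a simple live-versus-replay discriminator.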

Interaction on a table

Aims: Let humans interact with iCub on a table using different input cues (Kinect/in-eye cameras), including (virtual) object manipulation.

  • Anna
  • Patrick
  • Nikolas


iCub Remote Control / Motion Tracking Group

Aims: Integrate a motion capture device (Polhemus Liberty) into YARP in order to track or remotely control iCub movements.

  • Gabriel
  • Mike Ciaraldi
  • Dalia

The emo group

Do we need emotional robots? Some people say that emotions help not only with communication but also with managing limited resources in an unpredictable environment. I have a simple model, inspired by human emotion mechanisms, which should guide the robot to efficiently manage its energy and task performance in the famous grasp-the-ball scenario. Who wants to join me to play with the robot and improve the model?

  • Kiril
  • Dev
  • ...
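A minimal sketch of the kind of arbitration such a model might perform, assuming a single scalar drive per behaviour; the names, weighting, and thresholds here are hypothetical illustrations, not the actual model:

```python
def select_action(battery_level, task_urgency):
    """Toy emotion-inspired arbitration between recharging and working.

    Assumption: the emotion-like mechanism reweights the task drive by
    the remaining energy (both in [0, 1]), so a depleted robot abandons
    the task and seeks the charger.
    """
    recharge_drive = 1.0 - battery_level       # grows as energy drops
    task_drive = task_urgency * battery_level  # discounted when tired
    return "recharge" if recharge_drive > task_drive else "do_task"
```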

Head Stabilization Group

Aims: Keep the iCub's head stabilized even when the torso is being randomly moved.

  • Nuno Moutinho
  • ...
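Under a small-angle assumption, the simplest feed-forward version of this is to counter-rotate the neck by the measured torso rotation, so the head's orientation in the world stays fixed. This is only an illustrative sketch; a real implementation would close the loop on the head's inertial sensor rather than rely on torso encoders alone:

```python
def neck_compensation(torso_rpy, head_ref_rpy=(0.0, 0.0, 0.0)):
    """Feed-forward neck command that cancels torso rotation (toy sketch).

    Assumption: angles are small enough that roll, pitch, and yaw can be
    compensated independently, in radians.
    """
    return tuple(ref - torso for ref, torso in zip(head_ref_rpy, torso_rpy))
```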

MACSi Group

Trying to make the robot interested (Fabien) and competent at seeing (Natalia) and lifting (Serena, Alain) a cardboard box.

  • Serena
  • Alain
  • Natalia
  • Fabien


Gaze Coordination Group

Coordination of gaze and action. The robot needs to decide where to look depending on rewards and the task it is performing.

  • Jose
  • Umar
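One simple way to make such a decision is a softmax over per-target expected rewards, which trades off staring at the best target against occasionally checking the others. The reward representation, temperature, and function name below are illustrative assumptions:

```python
import math
import random

def pick_gaze_target(expected_rewards, temperature=0.5):
    """Softmax gaze-target selection (illustrative sketch).

    Assumption: `expected_rewards` maps candidate gaze targets to the
    expected reward of looking there given the current task; a lower
    temperature makes the choice greedier.
    """
    names = list(expected_rewards)
    weights = [math.exp(expected_rewards[n] / temperature) for n in names]
    return random.choices(names, weights=weights)[0]
```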