Now that the basic tutorials are done we would like to invite people to pick a project idea and start working on it. Ideally this should be done in groups so it is more fun, but individual projects are also fine.
You can add your group/project idea below. Add a short description of the project and the list of people who will be working on it. For ideas, take a look at last year's projects.
This group is part of the FP7 project CHRIS (see http://www.chrisfp7.eu/). The focus is on human-robot interaction. The human supervises the robot using verbal commands and teaches the robot to recognize and name new objects.
- Andrea Takke
Update: The modules are in place; basic spoken interaction allows the iCub to learn new objects and to point to a specific object when we ask for it.
See http://efaa.upf.edu/ - "The Experimental Functional Android Assistant (EFAA) project will contribute to the development of socially intelligent humanoids by advancing the state of the art in both single human-like social capabilities and in their integration in a consistent architecture." (from http://icub.org)
Status Update: Modules working with the simulator and the real iCub (pmp, touchingTable, blobFinder). Integration of the modules is nearly done (blobFinder missing).
Current Step: Test HandPoint with the real iCub after integrating blobFinder, and do something actually useful with attentionControl.
Force Controlled Crawling
The idea behind this group is to improve the current crawling demo, initially developed in collaboration with the Biorobotics Laboratory at EPFL. The first step will be to implement a Cartesian impedance module simulating a Cartesian spring-like behaviour, with a stiffer response in the heading direction and a more compliant one in the direction normal to the ground. At the moment we have some preliminary code for simulating Cartesian stiffness.
The final video demonstrating Cartesian impedance control on the iCub
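The spring-like behaviour described above can be sketched as a diagonal stiffness matrix with different gains per axis. This is a minimal illustration, not the group's actual module: the axis convention (heading along x, ground normal along z) and the stiffness values are assumptions for the example.

```python
import numpy as np

def cartesian_spring_force(x, x_des, k_heading=500.0, k_lateral=200.0, k_normal=50.0):
    """Virtual Cartesian spring F = K (x_des - x) with direction-dependent
    stiffness: stiff along the heading direction (x), compliant along the
    ground normal (z). Gains in N/m are illustrative placeholders."""
    K = np.diag([k_heading, k_lateral, k_normal])
    return K @ (np.asarray(x_des) - np.asarray(x))

# The same 1 cm position error yields very different restoring forces per axis.
f = cartesian_spring_force([0.0, 0.0, 0.0], [0.01, 0.01, 0.01])
# f → [5.0, 2.0, 0.5] N
```

Making the normal-direction gain small is what lets the limb yield softly on ground contact while still tracking the forward motion stiffly.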
This group is working to integrate the tactile feedback from the skin with the force control.
Eye contact and contingency group
Working on detecting contingency in interaction through patterns of eye contact. Can we develop a program that enables an iCub to distinguish between interacting with a live, responsive companion iCub and a companion iCub whose behaviour is a recorded, unresponsive copy of a previous interaction? Infants can do this from a very young age. I'm concentrating on eye contact to mediate the interaction, but other aspects, such as vocalisations, would be fun too.
- Nick Wilkinson
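One simple way to operationalize contingency, sketched below, is to check how often the partner's gaze events follow the robot's own within a short latency window. This is only an assumed toy measure for illustration, not the group's actual method; the function name and window length are hypothetical.

```python
def contingency_score(own_events, partner_events, window=1.0):
    """Fraction of the robot's gaze events that are 'answered' by a
    partner event within `window` seconds. Both arguments are sorted
    lists of event timestamps in seconds."""
    if not own_events:
        return 0.0
    answered = sum(
        1 for t in own_events
        if any(t < p <= t + window for p in partner_events)
    )
    return answered / len(own_events)

# A live partner reacts shortly after each event; a replayed recording
# is uncorrelated with the current interaction.
live = contingency_score([1.0, 5.0, 9.0], [1.4, 5.3, 9.6])     # → 1.0
replay = contingency_score([1.0, 5.0, 9.0], [2.8, 6.9, 11.5])  # → 0.0
```

A threshold on this score (compared against a chance baseline) would give the binary live-versus-recording decision the project description asks for.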
Interaction on a table
Aims: Let humans interact with the iCub on a table using different input cues (Kinect / in-eye cameras), including manipulation of (virtual) objects.
iCub Remote Control / Motion Tracking Group
Aims: Integrate a motion capture device (Polhemus Liberty) into YARP to track or remote-control iCub movements.
- Mike Ciaraldi
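Whatever the transport into YARP looks like, tracker poses must be mapped from the Polhemus frame into the robot's frame via a calibrated rigid transform. A minimal sketch of that mapping is below; the function name and the example rotation/translation are assumptions, not a real calibration.

```python
import numpy as np

def tracker_to_robot(p_tracker, R, t):
    """Map a point from the Polhemus frame into the robot root frame
    using the rigid transform p_robot = R @ p_tracker + t.
    R (3x3 rotation) and t (3-vector) would come from a one-off
    calibration routine; the values below are placeholders."""
    return R @ np.asarray(p_tracker, dtype=float) + np.asarray(t, dtype=float)

# Example: tracker axes aligned with the robot's, origin offset by 1 m in x.
R = np.eye(3)
t = np.array([1.0, 0.0, 0.0])
p = tracker_to_robot([0.2, 0.0, 0.5], R, t)  # → [1.2, 0.0, 0.5]
```

In a YARP setup this transform would sit in the module that reads the tracker and publishes robot-frame poses on a port for the controller to consume.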
The emo group
Do we need emotional robots? Some people say that emotions could help not only with communication but also with managing limited resources in an unpredictable environment. I have a simple model, inspired by human emotion mechanisms, which should guide the robot to efficiently manage its energy and task performance in the famous grasp-the-ball scenario. Who wants to join me in playing with the robot and improving the model?
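The energy-versus-task trade-off can be illustrated with a tiny emotion-like arbitration rule: a low battery raises an "anxiety" signal that eventually overrides the drive to keep working. This is a toy sketch under assumed names and gains, not the model the group is actually developing.

```python
def choose_action(energy, task_urgency, anxiety_gain=2.0):
    """Toy emotion-like arbitration between working and recharging.
    energy and task_urgency are normalized to [0, 1]; anxiety_gain
    controls how early the robot abandons the task to recharge."""
    anxiety = anxiety_gain * (1.0 - energy)
    return "recharge" if anxiety > task_urgency else "work"

choose_action(energy=0.9, task_urgency=0.8)  # → "work"
choose_action(energy=0.3, task_urgency=0.8)  # → "recharge"
```

The point of routing the decision through a scalar "emotion" rather than a fixed battery threshold is that the same signal can modulate many behaviours at once, which is the resource-management argument given above.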
Head Stabilization Group
Aims: Keep the iCub's head stabilized even when the torso is being randomly moved.
Nuno Moutinho ...
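At its simplest, stabilization means commanding the neck to counter-rotate against the measured torso motion so the head's orientation in the world stays fixed. The sketch below shows that idea for pitch and yaw only, ignoring joint limits and dynamics; the function name and angle convention are assumptions for illustration.

```python
def stabilizing_head_angles(torso_pitch, torso_yaw, gaze_pitch=0.0, gaze_yaw=0.0):
    """Neck command (in degrees) that cancels the torso rotation so the
    head keeps pointing at the desired gaze direction in world frame.
    Joint limits and torso roll are ignored in this sketch."""
    return gaze_pitch - torso_pitch, gaze_yaw - torso_yaw

stabilizing_head_angles(10.0, -5.0)  # → (-10.0, 5.0)
```

On the real robot this kinematic feed-forward would typically be combined with inertial (VOR-like) feedback, since torso encoder readings alone lag the actual motion.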
Trying to make the robot interested (Fabien) and competent at seeing (Natalia) and lifting (Serena, Alain) a cardboard box.
Gaze Coordination Group
Coordination of gaze and action. The robot needs to decide where to look depending on rewards and the task it is doing.
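The reward-dependent gaze decision can be sketched as picking the fixation target with the highest expected reward for the current task. This is a minimal greedy illustration under assumed names and reward values, not the group's actual policy.

```python
def pick_gaze_target(expected_rewards):
    """Greedy gaze selection: expected_rewards maps each candidate
    target name to its expected reward under the current task, and
    the highest-reward target wins the fixation."""
    return max(expected_rewards, key=expected_rewards.get)

# During a reaching task the hand is briefly more informative than the ball.
pick_gaze_target({"ball": 0.7, "hand": 0.9, "face": 0.4})  # → "hand"
```

A fuller treatment would make the rewards task-dependent and time-varying, so the same mechanism shifts gaze between targets as the action unfolds.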