Projects vvv2014

From Wiki for iCub and Friends
Revision as of 12:10, 26 July 2014 by Nkuppuswamy (talk | contribs) (Whole body related tasks)

Please populate this page with the name of the project, the name of the people that participate in the project and a short description of what the project aims to do and what individual people are planning to do within the group. A group can be made by an individual person or more.

It is important that everybody works towards something, no matter how simple or complex.

WYSIWYD EU Project (What You Say Is What You Did)

See here or here for a complete list of tasks.


  • Stephane, Matej: sensorimotor contingencies using SOMs that match other SOMs (e.g. see subtask 2.5 here)
  • Grégoire Pointeau: learning through language and semantic bootstrapping (see subtask 2.2). Detecting the object of focus and, through language, associating several names to one object or several objects to one name, and learning to use them according to the context.
  • Martina: online predictions of motion (see subtask 2.3)
  • Maxime: Reactable game for semantic bootstrapping (see subtask 2.3)
  • ...

Exploration of tools and objects affordances (in the context of Poeticon++)

People: Afonso, Lorenzo, Tanis, Vadim

Goal: Perform a set of actions on objects using tools, recording visual descriptors of tools, objects, and effects of the action.

Incremental Learning applications on iCub using the GURLS++ package

People: Raffaello

The main goals of the project are the following:

  1. Implement a numerically stable incremental RLS update rule (cholupdate) in the GURLS++ framework. The functionality has already been tested in MATLAB, replicating the work by Gijsberts et al.: ‘Incremental Learning of Robot Dynamics using Random Features’
  2. Design and develop a simple prototypical YARP module to serve as an interface with the robot
  3. Run an on-line inverse dynamics learner on the real robot
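
As a language-neutral illustration of goal 1, here is a minimal Python sketch of an incremental RLS learner that maintains an upper-triangular Cholesky factor and applies a rank-1 update (the `cholupdate` operation) per sample. Names and the regularisation value are illustrative, not the GURLS++ API:

```python
import numpy as np

def cholupdate(R, x):
    # Rank-1 update via Givens rotations: returns R' with R'^T R' = R^T R + x x^T.
    # Numerically stable, O(n^2), unlike re-factorizing from scratch.
    R, x = R.copy(), x.copy()
    n = len(x)
    for k in range(n):
        r = np.hypot(R[k, k], x[k])
        c, s = R[k, k] / r, x[k] / r
        R[k, k] = r
        if k + 1 < n:
            R[k, k + 1:], x[k + 1:] = (c * R[k, k + 1:] + s * x[k + 1:],
                                       -s * R[k, k + 1:] + c * x[k + 1:])
    return R

class IncrementalRLS:
    """Regularized least squares, updated one sample at a time."""
    def __init__(self, dim, lam=1e-3):
        self.R = np.sqrt(lam) * np.eye(dim)  # Cholesky factor of X^T X + lam*I
        self.b = np.zeros(dim)               # running X^T y

    def update(self, x, y):
        self.R = cholupdate(self.R, x)
        self.b += y * x

    def weights(self):
        # Solve R^T R w = b with two triangular solves
        z = np.linalg.solve(self.R.T, self.b)
        return np.linalg.solve(self.R, z)

# fit y = 2*x0 - x1 incrementally
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([2.0, -1.0])
model = IncrementalRLS(2)
for xi, yi in zip(X, y):
    model.update(xi, yi)
```

In the inverse-dynamics setting of Gijsberts et al., `x` would be a random-feature mapping of joint positions/velocities/accelerations and `y` a measured torque component.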


Optimal control for COM motion

People involved: Yue, thanks to Francesco for the idea, Silvio and Daniele for the dynamics explanation and help

The project addresses task 1.10. The objective is to use optimal control to generate optimized reference control inputs (torques) for the iCub robot so that it performs a desired motion at the COM (such as swinging left and right while keeping balance). Reaching this goal requires forward dynamics, and a simple version based on the Whole Body Interface is being developed.
The optimal control problem is solved with the software package MUSCOD-II from the University of Heidelberg.
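
As a self-contained illustration of the kind of problem involved, the sketch below regulates a linearised COM model (a double integrator, a strong simplification of the iCub dynamics) with a finite-horizon discrete LQR solved by backward Riccati recursion. This is not the MUSCOD-II formulation; the model, weights, and initial state are invented for the example:

```python
import numpy as np

dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])   # COM position/velocity, double integrator
B = np.array([[0.0], [dt]])
Q = np.diag([100.0, 1.0])               # penalize COM deviation from the reference
R = np.array([[0.01]])                  # penalize control effort (torque proxy)

# Finite-horizon discrete LQR via backward Riccati recursion
N = 500
P = Q.copy()
gains = []
for _ in range(N):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains.reverse()

# Regulate the COM back to the reference (origin) with u = -K x
x = np.array([0.05, 0.0])               # start 5 cm off the reference
traj = []
for K in gains:
    u = -K @ x
    x = A @ x + B @ u
    traj.append(x[0])
```

A real whole-body formulation would replace the double integrator with the full forward dynamics and add constraints, which is what MUSCOD-II handles.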

Learning action representations from experience (in the context of the Xperience EU project)

Goal: create a simple demo of learning action representations on the iCub. See here for a complete list of tasks.


  • Vadim: generate symbolic action/state descriptions from the iCub database (subtask 6.1). Data should look something like this:

(AND (HANDEMPTY LEFT) (INHAND nesquik RIGHT) (ROBOTAT TABLE) (ON multivitaminjuice TABLE) (ON popcorn TABLE) )

  • Kira: learn planning representations from the action/state descriptions in offline batch mode (subtask 6.2) and incrementally online (subtask 6.3). Results should look something like this:

(:action PUTDOWN
 :parameters (?X1 ?X2 ?X3 )
 :precondition (AND (ROBOTAT ?X2) (INHAND ?X3 ?X1) )
 :effect (AND (HANDEMPTY ?X1) (ON ?X3 ?X2) ))

(:action GRASP
 :parameters (?X1 ?X2 ?X3 )
 :precondition (AND (HANDEMPTY ?X1) (ROBOTAT ?X2) (ON ?X3 ?X2) )
 :effect (INHAND ?X3 ?X1) )
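
A toy version of the learning step in subtasks 6.2/6.3 can be sketched by extracting add/delete effects and a conservative precondition from before/after state pairs. This works on ground literals only; a real learner would also lift the literals to variables such as ?X1:

```python
# States are sets of ground literals, e.g. ("ON", "popcorn", "TABLE").
def learn_action(examples):
    """examples: list of (state_before, state_after) set pairs for one action."""
    # effects: literals consistently gained or lost across all executions
    adds = set.intersection(*[after - before for before, after in examples])
    dels = set.intersection(*[before - after for before, after in examples])
    # conservative precondition: literals true before every observed execution
    pre = set.intersection(*[before for before, _ in examples])
    return {"precondition": pre, "add": adds, "delete": dels}

before = {("HANDEMPTY", "LEFT"), ("ROBOTAT", "TABLE"), ("ON", "nesquik", "TABLE")}
after = {("INHAND", "nesquik", "LEFT"), ("ROBOTAT", "TABLE")}
grasp = learn_action([(before, after)])
```

Running the learner incrementally (subtask 6.3) amounts to intersecting each new example's add/delete sets with the current hypothesis instead of batching all examples.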

Eye Tracking for Human Robot Interaction

People involved: Oskar Palinko

Face detection, face tracking, feature extraction, gaze tracking.



YARP/ROS Integration with MoveIt!

People involved: Miguel

Plan the arm trajectories of the iCub in MoveIt!, and populate the planning environment in ROS with information from the iCub stereo vision.
Also get the control boards of Vizzy (a robot from my lab) simulated in Gazebo, building on the previous iCub/Gazebo work.

3D reconstruction of an object from multiple views

Contributors: Alessio Mauro Franchi, Evgenii Koriagin, Ilaria Gori, Ugo Pattacini

The aim of the project is to build a 3D object model by registering multiple views.

Whole body related tasks

Floating base rigid body dynamics estimation

Contributors: Jorhabib Eljaik, Francesco Nori

We will consider the problem of estimating the dynamics of a floating base system. We will start with the simple problem of estimating the dynamics of a single rigid body with distributed force/torque sensors, accelerometers and gyroscopes. Adopted techniques will include Kalman filtering.
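
A minimal linear Kalman filter along these lines, fusing a noisy tilt measurement (as from an accelerometer) with a noisy rate measurement (as from a gyroscope) for a single rigid body. The model, sensors, and noise levels are illustrative, not the CoDyCo estimator:

```python
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])   # state [angle, rate], constant-rate model
H = np.eye(2)                           # measure angle (accel tilt) and rate (gyro)
Qn = 1e-5 * np.eye(2)                   # process noise covariance
Rn = np.diag([0.05**2, 0.01**2])        # measurement noise covariance

x = np.zeros(2)                         # state estimate
P = np.eye(2)                           # estimate covariance

rng = np.random.default_rng(1)
true_angle, true_rate = 0.0, 0.5        # ground truth: constant 0.5 rad/s rotation
for _ in range(1000):
    true_angle += true_rate * dt
    z = np.array([true_angle + 0.05 * rng.normal(),   # noisy tilt
                  true_rate + 0.01 * rng.normal()])   # noisy gyro
    # predict
    x = F @ x
    P = F @ P @ F.T + Qn
    # update
    S = H @ P @ H.T + Rn
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P
```

The full problem adds distributed force/torque sensors and a 6-DOF floating-base state, but the predict/update structure is the same.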

Balancing on a single foot

Contributors: Daniele Pucci, Silvio Traversaro

Test the balancing demo developed for the first year of the CoDyCo project with the robot standing on a single foot.

Graphical tools for ZMP and stability region visualizers

Contributors: Morteza Azad.

This module will be a graphical visualization tool for ZMP, balance etc. It will essentially be a 2D plot visualizing the feet (shape can be extracted from CAD drawings), the support polygon, the stability region, the ZMP, the COP, etc. It will integrate information provided by the dynamics modules, the skin information, etc.
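
One small building block for such a visualizer is the stability test itself: checking whether the ZMP lies inside the convex support polygon. A sketch in plain Python, with a made-up rectangular foot outline standing in for the CAD-derived shape:

```python
def inside_convex_polygon(point, vertices):
    """True if point lies inside a convex polygon given in counter-clockwise order."""
    px, py = point
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        # for a CCW polygon the cross product must be non-negative on every edge
        if (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1) < 0:
            return False
    return True

# rough single-foot support polygon in metres, CCW (illustrative numbers)
foot = [(-0.05, -0.03), (0.10, -0.03), (0.10, 0.03), (-0.05, 0.03)]
```

For double support, the same test applies to the convex hull of both foot polygons.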

Matlab API to the whole body interface

Contributors: Naveen Kuppuswamy, Jorhabib Eljaik

The WBI-Toolbox offers a Simulink interface to the whole body interface (WBI). Certain parts of the WBI can offer important functionality directly in Matlab. As an example, the wholeBodyModel should have a Matlab interface to allow inverse and forward dynamics computations in Matlab. This would allow numerical integration of the forward dynamics components in Matlab, outside Simulink. It would potentially also allow porting to GNU Octave, giving the WBI-Toolbox a FOSS alternative.

Proposed interface example:

[mass_matrix] = mexWholeBodyModel('mass_matrix',q,q_dot);

[lowerJointLimits, higherJointLimits] = mexWholeBodyModel('joint_limits');

[JacobianCoM] = mexWholeBodyModel('Jacobian','CoM',q,q_dot);

Floating base forward dynamics

Contributors: Naveen Kuppuswamy, Daniele Pucci, Jorhabib Eljaik

In this task we aim at computing the floating base forward dynamics. The idea is to provide some preliminary implementations that assume a floating base system subject to a number of rigid constraints. The implementation could make use of the formulae presented by Aghili (2005) and reported in formula (6.18) of Andrea Del Prete's PhD thesis.

The second aim is to achieve numerical integration using ode15s to allow high-speed torque control prototyping. For instance, one demo scenario is to input torques from the dancing iCub demo (moving the CoM position while balancing) and store the output of the integration as a timeseries, which can then be replayed on the robot visualisation for benchmarking. This allows benchmarking of novel torque controllers free of unknown errors such as unmodelled dynamics and software bugs in the complex architecture.
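
The integrate-and-replay idea can be sketched on a trivial single-link stand-in for the whole-body forward dynamics, using a fixed-step RK4 integrator in place of ode15s. The model, controller gains, and torque law are invented for the example:

```python
import numpy as np

def pendulum_dynamics(x, tau, m=1.0, l=0.5, g=9.81):
    # single-link stand-in for the floating-base forward dynamics:
    # x = [q, qdot]; returns xdot for a commanded torque tau
    q, qd = x
    qdd = (tau - m * g * l * np.sin(q)) / (m * l**2)
    return np.array([qd, qdd])

def rk4_step(f, x, tau, dt):
    # classic 4th-order Runge-Kutta step (stand-in for ode15s)
    k1 = f(x, tau)
    k2 = f(x + 0.5 * dt * k1, tau)
    k3 = f(x + 0.5 * dt * k2, tau)
    k4 = f(x + dt * k3, tau)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# simulate a torque controller and log the state timeseries for later replay
dt = 0.001
x = np.array([0.1, 0.0])                # start 0.1 rad from upright
timeseries = []
for _ in range(5000):
    # PD controller with gravity compensation (illustrative gains)
    tau = -50.0 * x[0] - 10.0 * x[1] + 1.0 * 9.81 * 0.5 * np.sin(x[0])
    x = rk4_step(pendulum_dynamics, x, tau, dt)
    timeseries.append(x.copy())
```

Swapping `pendulum_dynamics` for the constrained floating-base dynamics and logging `timeseries` to a file would give exactly the replay-for-benchmarking workflow described above.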

Integrating the ISIR whole-body controller with WBI and iCubGazebo

Contributors: Darwin Lau, Mingxing Liu, Ryan Lober

The whole-body controller developed at ISIR has shown promising results as a whole-body torque-based controller in our simulation software using iCub models. Our goal is to integrate this controller with the Whole-Body-Interface (WBI) of the iCub in order to begin testing the controller with different simulators and more importantly in real world implementations.

"Emergence" of Behaviors

Contributors (alphabetical order): Jimmy Baraglia, Joshua Pepneck, Takuji Yoshida

The tasks are:

- Build a robot self action model
- Recognize simple object directed actions
- Program a self-supervised learning for Prediction-error minimization
- Control the robot to perform simple actions

See here for more details.

Object Manipulation, grasping and exploration tasks

Visually guided grasping

Contributors: Mihai Gansari

Use stereo vision to detect object(s) and their location. Plan the positioning of the hand to bring it close to the object and grasp/push it, using already implemented software or developing new techniques. The short-term goal is to make the robot grab/push the object and pass it to the other hand and vice versa. The long-term goal is a playing robot that throws an object from one hand to the other continuously.

Merging point clouds

Contributors: Evgenii Koriagin

Receive point clouds of the object seen by the robot from different view points and merge this information into one coherent spatial representation using ICP (Iterative Closest Point).
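
A minimal ICP sketch along these lines: brute-force nearest-neighbour correspondences plus a Kabsch (SVD) rigid-alignment step, run on synthetic clouds. A real pipeline would use a k-d tree and outlier rejection; everything here is illustrative:

```python
import numpy as np

def best_rigid_transform(src, dst):
    # Kabsch: least-squares rotation R and translation t mapping src -> dst
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=20):
    cur = src.copy()
    for _ in range(iters):
        # nearest-neighbour correspondences (brute force for the sketch)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur

# toy "views": the same points seen under a small rigid displacement
rng = np.random.default_rng(2)
cloud_a = rng.uniform(-1, 1, size=(100, 3))
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
cloud_b = cloud_a @ Rz.T + np.array([0.05, -0.02, 0.01])
merged = icp(cloud_a, cloud_b)
```

After convergence, `merged` sits on top of `cloud_b`, so concatenating the aligned clouds yields the single coherent representation the project targets.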