
Groups page

From Wiki for iCub and Friends

Each group should add an entry here.


The NERD Poets Society (related to POETICON++ EU Project)

Participants:

Vadim Tikhanoff; Lorenzo Jamone; Giovanni Saponaro; Afonso Gonçalves; Alexandre Antunes; Paschalis Veskos; Beata Grzyb

Description:

Our tasks during the summer school are mainly related to the EU Project POETICON++.

In particular, we focus on two tasks:

- learning affordances through autonomous exploration ("tool" and "grasp" affordances);

- updating the PRAXICON (i.e. a language-based knowledge structure) with information recorded during the exploration of tools and objects.


In more practical terms, this will lead to the development of the following YARP/iCub-based software modules:

- motor modules to realize simple actions (push, pull, tap, grasp reflex) using different tools (stick, rake, hand) -- Vadim Tikhanoff & Lorenzo Jamone & Afonso Gonçalves & Beata Grzyb;

- visual module to compute tools/objects descriptors -- Giovanni Saponaro;

- communication module that dumps information collected during the robot exploration (e.g. visual descriptors of tools and objects, effects detected on the environment, motor plans) to update the PRAXICON -- Alexandre Antunes & Paschalis Veskos.


Possible scenario for a final demo:

- the iCub explores different tools (stick, rake) performing simple actions (push, pull, tap) on different objects (cubes, balls);

- the iCub performs tapping with different parameters (e.g. movement speed, approaching direction, hand initial configuration) on different objects, trying to exploit a grasp reflex to grasp the object;

- visual, motor, tactile information will be collected and used to i) learn generic tool affordances, ii) learn grasp affordances, iii) update the PRAXICON (i.e. ground the language-based knowledge into the real-robot world).

CoDyCo

Foot step and CoM planner

Enrico, Jorhabib

CoM controller RL

Participants: Thibaut, Alex

CoM controller MPC

Participants: Francesco R., Riccardo

CoM controller

CoM controller QP

Participants: Francesco N., Alessio, Alessandro.

Presentation slide

In this task we want to implement a controller for the center-of-mass trajectory, given the output of the footstep planner. Possible approaches include adapting the available open-source controllers.

  • Convert the output of the CoM planner group into an iCub controller (either torques or joint velocities).
  • Try to use available packages (e.g. SOT and SOTH by Nicolas Mansard, or the Sentis whole-body control framework).
  • Handle additional constraints such as relative foot positions and joint limits.
  • ComStepper
  1. Output should include the ZMP
  2. Input: CoM position and velocity (com, dcom)
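The ComStepper input/output relation above can be sketched with the linear inverted pendulum model (LIPM). This is only an illustration under the LIPM assumption (point mass at constant height), not the group's actual controller:

```python
# LIPM sketch of the ComStepper interface: input CoM state, output ZMP.
# The LIPM assumption (constant CoM height z, point mass) is ours, not
# necessarily the group's model.

G = 9.81  # gravity [m/s^2]

def zmp_from_com(com_x, ddcom_x, com_height):
    """ZMP (x component) under LIPM: zmp = com - (z/g) * ddcom."""
    return com_x - (com_height / G) * ddcom_x

def integrate_lipm(com_x, dcom_x, zmp_x, com_height, dt):
    """One Euler step of the LIPM dynamics: ddcom = (g/z) * (com - zmp)."""
    ddcom_x = (G / com_height) * (com_x - zmp_x)
    com_x += dcom_x * dt
    dcom_x += ddcom_x * dt
    return com_x, dcom_x
```

With the CoM directly above the ZMP the acceleration is zero, so the state does not drift; placing the ZMP behind the CoM accelerates it forward.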

Torque controller

Andrea, Daniele, Silvio, Roberto

Reaching while balancing

Serena, Mingxing, Erwan

Using Bayesian Optimisation (and the BayesOpt library), we want to give the iCub the ability to reach an object that is out of range on a table. It has to learn where to place a third contact (the hand) on the table to stay balanced and extend its reach. Some interesting questions are:

  • How can the robot estimate its own stability?
  • How can we estimate the quality of the reaching behaviour?
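One common (and here purely illustrative) answer to the stability question is a support-polygon check: the ground projection of the CoM should stay inside the convex hull of the contact points. A minimal sketch, with hypothetical foot coordinates:

```python
def inside_convex_polygon(point, polygon):
    """True if `point` lies inside the convex `polygon`
    (list of (x, y) vertices in counter-clockwise order)."""
    px, py = point
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # the cross product must be >= 0 for every edge if the point is inside
        if (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1) < 0:
            return False
    return True

# support polygon spanned by the two feet (hypothetical coordinates, metres)
support = [(-0.05, -0.10), (0.05, -0.10), (0.05, 0.10), (-0.05, 0.10)]
```

Adding the hand as a third contact enlarges this polygon, which is exactly what lets the robot lean further over the table.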

Tactile Servoing

Participants

Description

We intend to use the iCub's tactile sensors and force-torque sensor to implement a tactile servoing algorithm. Such a skill could be useful for exploring 3D surfaces, e.g. following an edge or surface with the robot's palm to explore a 3D shape.

  • As a first step, the force-torque sensor can be used to measure the pressure of the end-effector on the surface. Admittance control can be used to lower the arm to a table and keep constant pressure on the table.
  • In a second step, we can use the robot's skin (tactile sensors) to localize the edge or point contact where pressure is applied. We want to control the pose of the end effector, such that we can e.g. center a point or edge contact. Ideally, we would control both the position and orientation of the palm, combining information from tactile- and force-torque sensing.
  • By adding an external velocity tangential to the surface we could follow the table's edge in this manner.
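The first step above (lowering the arm until a constant pressure is held) can be sketched as a one-dimensional admittance law. The gains and the spring contact model below are assumptions for illustration, not the actual iCub controller:

```python
# Admittance-control sketch: command a velocity proportional to the force
# error, so the end effector descends until the desired pressure is held.
# Contact is modelled as a spring (assumed stiffness), gains are hypothetical.

def simulate_constant_pressure(steps=3000, dt=0.001):
    z = 0.05            # end-effector height above the table [m]
    table_z = 0.0       # table surface height [m]
    f_des = 5.0         # desired contact force [N]
    k_contact = 1000.0  # assumed contact stiffness [N/m]
    k_adm = 0.02        # admittance gain [m/(N*s)]
    for _ in range(steps):
        # measured force: non-zero only while pressing into the table
        f_meas = k_contact * max(0.0, table_z - z)
        # admittance law: descend while the pressure is below the setpoint
        v = -k_adm * (f_des - f_meas)
        z += v * dt
    f_final = k_contact * max(0.0, table_z - z)
    return z, f_final
```

The loop converges to the height where the contact force equals the setpoint; superposing a tangential velocity on top of this normal-force regulation gives the edge-following behaviour of the last bullet.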

Video impression:

tactile servoing: https://www.youtube.com/watch?v=TcWipks3qJ0

unknown object fine manipulation: http://www.youtube.com/watch?feature=player_embedded&v=UgCv5ESAYfc


Structure From Motion

Sean Ryan Fanello; Vadim Tikhanoff; Riccardo Spica;

Design, develop, and implement robust camera pose estimation for the computation of a reliable disparity map.

http://wiki.icub.org/images/9/91/SFM2013.pdf
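Once the relative camera pose is estimated and the images rectified, disparity converts to depth by stereo triangulation, Z = f·B/d. A minimal sketch; the focal length and baseline values are hypothetical, not the iCub's calibration:

```python
# Stereo triangulation: depth Z = f * B / d for rectified cameras.
# Default focal length (pixels) and baseline (metres) are made-up examples.

def depth_from_disparity(disparity_px, focal_px=400.0, baseline_m=0.068):
    """Depth [m] from disparity [px]: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")   # zero disparity: point at infinity
    return focal_px * baseline_m / disparity_px
```

This is why robust pose estimation matters: rectification errors shift the disparity d, and the error propagates into depth quadratically at long range.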


SWoOZ

Participant

Guillaume Gibert

Description

Real-time control of the iCub hand from a confederate's hand movements: develop a module that mimics a confederate's arm/hand shape on the iCub in real time.

Subtasks:

  • Retrieve the full articulation and position of the hand using the "3D Hand Tracking" library (http://cvrlcode.ics.forth.gr/handtracking/)
  • Create an RFModule that mimics the hand/arm articulation on the robot

Source code

Driving an enhanced iCub head with jaw and lips control for speech

A new iCub head version will be available at this summer school, including a mobile jaw and 4 degrees of freedom for the lips. We will try to get it controlled through captured speech data, synchronized with the corresponding audio recording.

See the first examples of the screen-based talking head and the iCub head being driven in parallel. The final demo will involve real-time speech replay, not just semi-static jaw and lip sliding.

  • Driving both talking heads
  • Another articulation set

Presentation slides:

Frédéric Elisei, Marco Randazzo

Order by Sentences

Commanding the iCub to perform a task, using speech and vision together to reduce ambiguity.

We start with a simple task with three objects (red, green and blue) and three actions (grasping, indexing and touching).
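With three actions and three object colours, a command is unambiguous exactly when the sentence names one of each. A toy parser illustrating this idea (the vocabulary is taken from the task description; the function itself is our sketch, not the group's module):

```python
# Toy command parser: a sentence is accepted only if it names exactly one
# action and exactly one object colour; anything else is ambiguous.

ACTIONS = ("grasp", "index", "touch")
OBJECTS = ("red", "green", "blue")

def parse_command(sentence):
    """Return (action, object) or raise ValueError if ambiguous/unknown."""
    words = sentence.lower().split()
    actions = [a for a in ACTIONS if any(w.startswith(a) for w in words)]
    objects = [o for o in OBJECTS if o in words]
    if len(actions) != 1 or len(objects) != 1:
        raise ValueError("ambiguous command: " + sentence)
    return actions[0], objects[0]
```

In the full task, vision would resolve the cases this parser rejects, e.g. a pointing gesture disambiguating "touch that one".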

EFAA

The group is divided into 3 main tasks:

  Reactive dance based on gesture recognition
  Cooperative DJ task based on reactable input
  Pong game

[Image: EfaaQuick.png]

Kinaesthetic teaching for low-dimensional grasping

Kinaesthetic teaching experiments are carried out on the iCub in order to collect data to bootstrap a grasping algorithm based on grasp synergies.

Validation of the algorithm will be carried out.

Data may also be collected with the aim of building a Meccano assembly.

Participants: Giuseppe Cotugno
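Grasp synergies are typically extracted as the principal components of the recorded joint-angle data. A pure-Python sketch of finding the first synergy by power iteration, purely illustrative and not the group's actual pipeline:

```python
# First postural synergy = dominant eigenvector of the joint-angle
# covariance matrix, found by power iteration (illustrative sketch).

def first_synergy(samples, iters=100):
    """samples: list of joint-angle vectors from kinaesthetic teaching."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[j] for s in samples) / n for j in range(d)]
    centered = [[s[j] - mean[j] for j in range(d)] for s in samples]
    # sample covariance matrix
    cov = [[sum(c[i] * c[j] for c in centered) / n for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

Projecting commands onto the first few synergies is what makes the grasping problem low-dimensional: a 9+ DoF hand can be driven by one or two synergy coefficients.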
