Each group should add an entry here.
Our tasks during the summer school are mainly related to the EU project Poeticon++.
In particular, we focus on two tasks:
- learning affordances through autonomous exploration ("tool" and "grasp" affordances);
- updating the PRAXICON (i.e. a language-based knowledge structure) with information recorded during the exploration of tools and objects.
In more practical terms, this will lead to the development of the following [YARP/iCub]-based software modules:
- visual module to compute tools/objects descriptors -- Giovanni Saponaro;
- communication module that dumps information collected during the robot exploration (e.g. visual descriptors of tools and objects, effects detected on the environment, motor plans) to update the PRAXICON -- Alexandre Antunes & Paschalis Veskos.
Possible scenario for a final demo:
- the iCub explores different tools (stick, rake) performing simple actions (push, pull, tap) on different objects (cubes, balls);
- the iCub performs tapping with different parameters (e.g. movement speed, approaching direction, hand initial configuration) on different objects, trying to exploit a grasp reflex to grasp the object;
- visual, motor, tactile information will be collected and used to i) learn generic tool affordances, ii) learn grasp affordances, iii) update the PRAXICON (i.e. ground the language-based knowledge into the real-robot world).
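As a sketch of what the communication module might dump for each exploration trial, the record below bundles tool, action, target, visual descriptors and observed effect into a JSON document; all field names and values are illustrative assumptions, not the actual PRAXICON schema:

```python
import json

def make_affordance_record(tool, action, target, descriptors, effect):
    """Bundle one exploration trial into a record for the PRAXICON update.
    Field names are illustrative, not the real PRAXICON schema."""
    return {
        "tool": tool,                       # e.g. "rake"
        "action": action,                   # e.g. "pull"
        "target": target,                   # e.g. "ball"
        "visual_descriptors": descriptors,  # e.g. shape features of the tool
        "effect": effect,                   # e.g. observed object displacement [m]
    }

record = make_affordance_record(
    tool="rake", action="pull", target="ball",
    descriptors={"elongation": 0.82, "convexity": 0.64},
    effect={"displacement": 0.15},
)
print(json.dumps(record, sort_keys=True))
```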
Footstep and CoM planner
CoM controller RL
Participants: Thibaut, Alex
CoM controller MPC
Participants: Francesco R., Riccardo
CoM controller QP
In this task we want to implement a controller for the center-of-mass trajectory given the output of the footstep planner. One possibility is to adapt the available open-source controllers.
- Convert the output of the CoM planner group into an iCub controller (either torques or joint velocities).
- Try to use available packages (e.g. SOT and SOTH by Nicolas Mansard, or the Sentis whole-body control framework).
- Handle additional constraints such as relative foot positions and joint limits.
- Inputs: CoM position and velocity (com, dcom).
- Outputs should include the ZMP.
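Under the linear inverted pendulum model commonly used for this kind of controller, the ZMP output follows directly from the CoM state; a minimal sketch, assuming a constant CoM height z_c (the numbers below are illustrative):

```python
G = 9.81  # gravity [m/s^2]

def zmp_from_com(com, ddcom, z_c):
    """ZMP (x, y) from CoM position and acceleration under the linear
    inverted pendulum model: p = c - (z_c / g) * c_ddot (horizontal parts)."""
    return (com[0] - (z_c / G) * ddcom[0],
            com[1] - (z_c / G) * ddcom[1])

# Static CoM (zero acceleration): the ZMP coincides with the CoM ground projection.
print(zmp_from_com((0.02, 0.0), (0.0, 0.0), z_c=0.55))  # -> (0.02, 0.0)
```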
Andrea, Daniele, Silvio, Roberto
Reaching while balancing
Serena, Mingxing, Erwan
Using Bayesian optimisation (and the BayesOpt library), we want to give the iCub the ability to reach an object on a table that is out of range. It has to learn where to place a third contact (the hand) on the table to stay balanced and extend its reach. Some interesting questions are:
- How can the robot estimate its own stability?
- How can it estimate the quality of the reaching behaviour?
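The optimisation loop can be sketched as follows; the cost function is a made-up stand-in for the real stability and reaching measurements, and plain random search stands in for BayesOpt's GP-guided sampling (the interface would be the same):

```python
import random

def trial_cost(contact_x, contact_y):
    """Hypothetical cost of one trial: penalise instability, reward reach.
    On the robot these terms would come from the stability estimate and
    the measured reaching error; here they are illustrative stand-ins."""
    stability_penalty = (contact_x - 0.30) ** 2 + (contact_y - 0.10) ** 2
    reach_gain = -0.5 * contact_x   # contacts farther forward extend the reach
    return stability_penalty + reach_gain

def optimise(n_trials=50, seed=0):
    """Placeholder for the BayesOpt loop: random search over hand-contact
    positions on the table, keeping the best trial."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(n_trials):
        x, y = rng.uniform(0.2, 0.5), rng.uniform(-0.2, 0.2)
        c = trial_cost(x, y)
        if c < best_cost:
            best, best_cost = (x, y), c
    return best, best_cost

best, cost = optimise()
print(best, cost)
```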
Tactile Servoing
We intend to use the iCub's tactile sensors and force/torque sensors to implement a tactile servoing algorithm. Such a skill could be useful for exploring 3D surfaces, e.g. following an edge or surface with the robot's palm to explore a 3D shape.
- As a first step, the force-torque sensor can be used to measure the pressure of the end-effector on the surface. Admittance control can be used to lower the arm to a table and keep constant pressure on the table.
- In a second step, we can use the robot's skin (tactile sensors) to localize the edge or point contact where pressure is applied. We want to control the pose of the end effector, such that we can e.g. center a point or edge contact. Ideally, we would control both the position and orientation of the palm, combining information from tactile- and force-torque sensing.
- By adding an external velocity tangential to the surface we could follow the table's edge in this manner.
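The first step (admittance control of the contact force) can be sketched as a simulated vertical loop, modelling the table as a stiff spring; the gains and stiffness are illustrative values, not tuned for the iCub:

```python
def admittance_contact(f_des, k_surface=500.0, gain=0.02, dt=0.01, steps=2000):
    """Simulated vertical admittance loop: lower the hand until the surface
    reaction force (stiff-spring model) matches f_des, then hold it.
    The force error drives a vertical velocity command."""
    z = 0.05                                 # hand height above the surface [m]
    for _ in range(steps):
        f_meas = max(0.0, -k_surface * z)    # contact force once z < 0
        z += gain * (f_meas - f_des) * dt    # too little force -> move down
    return z, max(0.0, -k_surface * z)

z_end, f_end = admittance_contact(f_des=2.0)
print(round(f_end, 3))  # -> 2.0 (converges to the desired contact force)
```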
tactile servoing: https://www.youtube.com/watch?v=TcWipks3qJ0
unknown object fine manipulation: http://www.youtube.com/watch?feature=player_embedded&v=UgCv5ESAYfc
Structure From Motion
Design, develop and implement robust camera position estimation for the computation of a robust disparity map.
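One reason the camera pose estimate must be robust: with a rectified stereo pair, depth follows from disparity, focal length and baseline, so any error in the estimated baseline propagates directly into the depth map. A minimal sketch with illustrative (not iCub-calibrated) values:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Z = f * B / d for a rectified stereo pair; disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# 40 px disparity, 400 px focal length, 6.8 cm baseline (illustrative numbers).
print(round(depth_from_disparity(40.0, 400.0, 0.068), 3))  # -> 0.68 [m]
```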
Real-time control of the iCub hand from a confederate's hand movements: develop a module that mimics a confederate's arm/hand shape on the iCub in real time.
- Retrieves the full articulation and position of the hand using the "3D Hand Tracking" library (http://cvrlcode.ics.forth.gr/handtracking/)
- Creates an RFModule that mimics the hand/arm articulation on the robot
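The mimicry step can be sketched as a retargeting function that clamps the tracked joint angles into the robot's joint ranges; the joint names and limits below are assumptions, not the tracker's actual output format:

```python
ICUB_LIMITS = {                 # illustrative (min, max) joint limits [deg]
    "thumb_proximal": (0.0, 90.0),
    "index_proximal": (0.0, 90.0),
    "index_distal":   (0.0, 90.0),
}

def retarget(tracked_angles):
    """Clamp each tracked angle into the corresponding iCub joint range;
    joints the robot does not expose are ignored."""
    cmd = {}
    for joint, angle in tracked_angles.items():
        if joint in ICUB_LIMITS:
            lo, hi = ICUB_LIMITS[joint]
            cmd[joint] = min(max(angle, lo), hi)
    return cmd

cmd = retarget({"index_proximal": 120.0, "thumb_proximal": -5.0, "wrist": 10.0})
print(cmd)
```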
Driving an enhanced iCub head with jaw and lips control for speech
A new version of the iCub head will be available at this summer school, including a mobile jaw and four degrees of freedom for the lips. We will try to control it using captured speech data, synchronized with the corresponding audio recording.
See the first examples of the screen-based talking head and the iCub head being driven in parallel. The final demo will involve real-time speech replay, not just semi-static jaw and lip control.
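A minimal sketch of the replay idea: a phoneme-to-viseme lookup that yields a jaw opening and a 4-DOF lip posture per phoneme. The table values are invented placeholders, not calibrated iCub positions:

```python
VISEMES = {                 # phoneme -> (jaw_open [deg], lips [4 x deg])
    "a": (25.0, [10.0, 10.0, 5.0, 5.0]),
    "m": (2.0,  [0.0, 0.0, 12.0, 12.0]),
    "o": (18.0, [15.0, 15.0, 8.0, 8.0]),
}

def lip_trajectory(phonemes):
    """Turn a phoneme sequence into per-frame jaw/lip targets, one frame
    per phoneme; a real replay would time-align these with the audio."""
    return [VISEMES.get(p, (0.0, [0.0] * 4)) for p in phonemes]

traj = lip_trajectory(["m", "a"])
print(traj)
```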
Order by Sentences
Commanding the iCub to perform a task, using speech and vision to reduce ambiguity.
We start with a simple task with three objects (red, green and blue) and three actions (grasping, indexing and touching).
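A minimal sketch of the speech side: parse an utterance into an (action, colour) pair over the three objects and three actions; the vocabulary and phrasing are assumptions about the final grammar:

```python
ACTIONS = {"grasp", "index", "touch"}
COLOURS = {"red", "green", "blue"}

def parse_command(utterance):
    """Return (action, colour) if both are recognised, else None, signalling
    that vision (or a follow-up question) must resolve the ambiguity."""
    words = utterance.lower().split()
    action = next((w for w in words if w in ACTIONS), None)
    colour = next((w for w in words if w in COLOURS), None)
    if action and colour:
        return action, colour
    return None

print(parse_command("please grasp the red ball"))  # -> ('grasp', 'red')
print(parse_command("grasp the ball"))             # -> None (ambiguous)
```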
The group is divided into three main tasks:
- Reactive dance based on gesture recognition
- Cooperative DJ task based on Reactable input
- Pong game
Kinaesthetic teaching for low-dimensional grasping
Kinaesthetic teaching experiments are carried out on the iCub in order to collect data to start up a grasping algorithm based on grasping synergies.
Validation of the algorithm will be carried out.
Data may also be collected with the aim of building a Meccano assembly.
Participants: Giuseppe Cotugno
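The synergy idea behind the low-dimensional grasping can be sketched as reconstructing a hand posture from a mean posture plus a weighted sum of synergy vectors (the principal components of the recorded data); the vectors below are made-up stand-ins for synergies extracted from the recordings:

```python
MEAN = [10.0, 20.0, 20.0, 15.0]        # mean joint angles [deg], illustrative
SYNERGIES = [
    [1.0, 1.0, 1.0, 1.0],              # synergy 1: overall hand closure
    [1.0, -1.0, 0.0, 0.0],             # synergy 2: thumb vs. finger opposition
]

def posture(coeffs):
    """q = q_mean + sum_i a_i * s_i -- grasp control in few dimensions."""
    q = list(MEAN)
    for a, s in zip(coeffs, SYNERGIES):
        for j in range(len(q)):
            q[j] += a * s[j]
    return q

print(posture([5.0, 2.0]))  # -> [17.0, 23.0, 25.0, 20.0]
```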