Each group should add an entry here.
- 1 POETICON++
- 2 CoDyCo
- 3 Tactile Servoing
- 4 Structure From Motion
- 5 SWoOZ
- 6 Driving an enhanced iCub head with jaw and lips control for speech
- 7 Order by Sentences
- 8 EFAA
Footstep and CoM planner
CoM controller RL
Participants: Thibaut, Alex
CoM controller MPC
Participants: Francesco R., Riccardo
CoM controller QP
Participants: Francesco N., Alessio, Alessandro.
In this task we want to implement a controller for the center-of-mass trajectory, given the output of the footstep planner. One possibility is to adapt an available open-source controller.
Participants: Andrea, Daniele, Silvio, Roberto
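The task above can be illustrated with the simplest such model: a minimal 1-D sketch based on the Linear Inverted Pendulum Model (LIPM) and the Divergent Component of Motion (DCM). The gains, step timing, CoM height, and footstep positions below are illustrative assumptions, not parameters of any of the controllers mentioned.

```python
import numpy as np

G = 9.81                     # gravity [m/s^2]
Z_COM = 0.5                  # assumed constant CoM height [m]
OMEGA = (G / Z_COM) ** 0.5   # LIPM natural frequency [1/s]

def simulate_com(footsteps, step_time=0.8, dt=0.01, k_dcm=3.0):
    """Track a piecewise-constant DCM reference placed on each footstep.

    DCM dynamics: xi_dot = OMEGA * (xi - p).  Choosing the ZMP
    p = xi + (k_dcm / OMEGA) * (xi - target) gives
    xi_dot = -k_dcm * (xi - target), so the DCM error decays
    exponentially, and the CoM follows the stable dynamics
    x_dot = OMEGA * (xi - x).
    """
    x = xi = footsteps[0]                 # start on the first footstep
    trajectory = []
    for target in footsteps[1:]:
        for _ in range(int(step_time / dt)):
            p = xi + (k_dcm / OMEGA) * (xi - target)  # ZMP command
            xi += dt * OMEGA * (xi - p)               # DCM update
            x += dt * OMEGA * (xi - x)                # CoM update
            trajectory.append(x)
    return np.array(trajectory)

footsteps = [0.0, 0.1, 0.2, 0.3]   # hypothetical 1-D footstep positions [m]
com = simulate_com(footsteps)
```

An MPC or QP formulation would replace the closed-form ZMP law with an optimization over a preview horizon, but the underlying dynamics are the same.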
Reaching while balancing
Participants: Serena, Mingxing, Erwan
We intend to use the iCub's tactile sensors and force-torque sensors to implement a tactile servoing algorithm. Such a skill could be useful for exploring 3D surfaces, e.g. following an edge or surface with the robot's palm to trace an object's shape.
- As a first step, the force-torque sensor can be used to measure the pressure of the end-effector on the surface. Admittance control can be used to lower the arm to a table and keep constant pressure on the table.
- In a second step, we can use the robot's skin (tactile sensors) to localize the edge or point contact where pressure is applied. We want to control the pose of the end effector so that we can, e.g., center a point or edge contact on the palm. Ideally, we would control both the position and orientation of the palm, combining information from tactile and force-torque sensing.
- By adding an external velocity tangential to the surface we could follow the table's edge in this manner.
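The first step above (keeping constant pressure via admittance control) can be sketched in one dimension. The gain, control period, and the spring model of the table below are illustrative assumptions, not iCub-specific values.

```python
# Minimal 1-D admittance-control sketch: the measured normal force from
# the force-torque sensor is compared to a desired contact force, and
# the error drives a vertical end-effector velocity.

F_DESIRED = 2.0       # desired normal force on the table [N]
K_ADMITTANCE = 0.002  # [m/s per N] maps force error to velocity (assumed gain)
DT = 0.01             # control period [s]
K_TABLE = 1000.0      # assumed table stiffness [N/m]

def contact_force(z, z_table=0.0):
    """Spring model of the table: force grows as the hand presses down."""
    penetration = max(0.0, z_table - z)
    return K_TABLE * penetration

z = 0.05  # hand starts 5 cm above the table
for _ in range(2000):
    f = contact_force(z)
    v = K_ADMITTANCE * (f - F_DESIRED)  # too little force -> move down
    z += v * DT
```

Adding a constant tangential velocity on top of this vertical admittance loop gives the edge-following behavior described in the last bullet.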
tactile servoing: https://www.youtube.com/watch?v=TcWipks3qJ0
unknown object fine manipulation: http://www.youtube.com/watch?feature=player_embedded&v=UgCv5ESAYfc
Structure From Motion
Design, develop and implement robust camera position estimation for the computation of a reliable disparity map.
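The core of such a camera position estimation step can be sketched with the normalized 8-point algorithm for the essential matrix. The synthetic scene, camera poses, and point count below are illustrative assumptions standing in for real iCub image correspondences.

```python
import numpy as np

def essential_from_correspondences(x1, x2):
    """8-point algorithm: x1, x2 are (N, 2) normalized image coordinates."""
    A = np.column_stack([
        x2[:, 0] * x1[:, 0], x2[:, 0] * x1[:, 1], x2[:, 0],
        x2[:, 1] * x1[:, 0], x2[:, 1] * x1[:, 1], x2[:, 1],
        x1[:, 0], x1[:, 1], np.ones(len(x1)),
    ])
    _, _, vt = np.linalg.svd(A)
    E = vt[-1].reshape(3, 3)          # null vector of A, row-major
    # Enforce the essential-matrix constraint: two equal singular values.
    u, s, vt = np.linalg.svd(E)
    return u @ np.diag([1.0, 1.0, 0.0]) @ vt

# Synthetic scene: random 3D points in front of two cameras.
rng = np.random.default_rng(0)
X = rng.uniform([-1.0, -1.0, 4.0], [1.0, 1.0, 6.0], size=(20, 3))
theta = 0.1                                       # small yaw between views
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.1, 0.0, 0.0])                     # assumed baseline [m]
Xc2 = (X - t) @ R.T                               # points in camera-2 frame
x1 = X[:, :2] / X[:, 2:]                          # view-1 projections
x2 = Xc2[:, :2] / Xc2[:, 2:]                      # view-2 projections
E = essential_from_correspondences(x1, x2)
```

Decomposing `E` yields the relative camera rotation and (scale-free) translation used to rectify the image pair before disparity computation; in practice a RANSAC loop around this estimator provides the robustness the task calls for.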
Real-time control of the iCub hand from a confederate's hand movements.
Driving an enhanced iCub head with jaw and lips control for speech
A new iCub head version will be available at this summer school, including a mobile jaw and 4 degrees of freedom for the lips. We will try to control it with captured speech data, synchronized with the corresponding audio recording.
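One simple way to drive the jaw from recorded speech is to map the short-time energy (RMS envelope) of the audio to a jaw-opening angle. The frame size, joint range, linear mapping, and the synthetic test signal below are assumptions for illustration, not the actual head's control interface.

```python
import numpy as np

SAMPLE_RATE = 16000        # [Hz]
FRAME = 320                # 20 ms frames
JAW_MAX_DEG = 15.0         # assumed maximum jaw opening [deg]

def jaw_trajectory(audio):
    """One jaw-angle command per 20 ms frame, from the audio RMS envelope."""
    n = len(audio) // FRAME
    frames = audio[:n * FRAME].reshape(n, FRAME)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    envelope = rms / (rms.max() + 1e-9)   # normalize to [0, 1]
    return JAW_MAX_DEG * envelope

# Synthetic "speech": a 150 Hz tone with a slow amplitude modulation.
t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
speech = np.sin(2 * np.pi * 150 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t))
angles = jaw_trajectory(speech)
```

The resulting command stream can be played back in sync with the audio; the lip degrees of freedom would need a richer mapping (e.g. from phoneme or formant features).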
Order by Sentences
Commanding the iCub to perform a task, using speech and vision to reduce ambiguity.
We start with a simple task with three objects (red, green and blue) and three actions (grasping, indexing and touching).
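Resolving a spoken order into an (action, object) pair for this three-by-three scenario can be sketched as below; the keyword lists and the ambiguity check are illustrative assumptions about how the speech side might work.

```python
# Map spoken keywords to the three actions and three objects above.
ACTIONS = {"grasp": "grasping", "index": "indexing", "touch": "touching"}
OBJECTS = {"red", "green", "blue"}

def parse_order(sentence):
    """Return (action, object), or None when the order is ambiguous."""
    words = sentence.lower().split()
    actions = [v for k, v in ACTIONS.items()
               if any(w.startswith(k) for w in words)]
    objects = [o for o in OBJECTS if o in words]
    if len(actions) != 1 or len(objects) != 1:
        return None   # ambiguous: vision (e.g. gaze or pointing) could disambiguate
    return actions[0], objects[0]
```

When `parse_order` returns None, the vision channel would be queried to fill in the missing or conflicting element instead of rejecting the order outright.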
The group is divided into 3 main tasks:
- Reactive dance based on gesture recognition
- Cooperative DJ task based on Reactable input
- Pong game