Add project, group, or tutorial ideas here. If there's one you're interested in, please add a note to it to say so.
iCub kinematics and interfaces (tutorial)
Go through the basics of how to model and control the robot.
- DONE, Lorenzo did this yesterday, see ICub Tutorials for material.
- Federico T.
YARP basics (tutorial)
Covers the basics of YARP. Does anyone need this?
- NOT DONE, but see http://eris.liralab.it/yarpdoc/tutorials.html for step-by-step YARP tutorials.
Blender and YARP
Could anyone give a tutorial on using Blender as a robot simulation environment with YARP?
It is shown here: http://www.youtube.com/watch?v=OT1Ck_RE5Wg
- NOT DONE, but Paul is preparing to do it. Blender for Robotics
Two-handed manipulation (Group)
People interested in reaching, grasping and visual tracking can join here.
- Alexis Maldonado (reaching, grasping)
- Federico Ruiz (reaching, grasping)
- Fabrizio Smeraldi (visual tracking)
- Boris Duran (reaching, grasping)
- Theo Jacobs
- Julian Schill (reaching, grasping, perhaps contact detection)
- Federico Tombari (visual tracking, object recognition)
Automatically deploying iCub Software (Automatic Build/Installation)
We will look at the problem of how to deploy software in a lab.
The following approaches will be presented:
- gar: Lars
- ROS: Alexis
GAR Article: http://www.lnx-bbc.com/garticle.html
- There are existing GAR modules for building iCub software on Linux (tested on Ubuntu).
- I will explain the ideas behind GAR and how to create GAR build files.
- We will test the existing modules and refine them.
Who is interested in having a basic speech recognition setup?
- Grammar based
- English or German
YARP Interfaces for Neural Networks and Multi-Modal Convergence Maps
I'd like to define some standard I/O for using neural networks in YARP. I also want to refine a model of a concept map that I have worked on in recent months. It's based on Kohonen's SOM and a multi-modal approach.
Note: there is already something done in this direction, it might be worth looking into this: http://eris.liralab.it/iCub/dox/html/classiCub_1_1contrib_1_1learningmachine_1_1IMachineLearner.html
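To make the SOM part concrete, here is a minimal sketch of a 1-D Kohonen SOM trained on 2-D input vectors. All parameter values (number of units, learning-rate and neighbourhood schedules) are illustrative choices, not anything prescribed by the proposal or by the learningmachine interface linked above:

```python
import numpy as np

def train_som(data, n_units=10, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 1-D Kohonen SOM on 2-D input vectors (minimal sketch)."""
    rng = np.random.default_rng(seed)
    weights = rng.random((n_units, data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)              # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5  # decaying neighbourhood width
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
            dist = np.abs(np.arange(n_units) - bmu)  # distance to BMU on the 1-D grid
            h = np.exp(-dist**2 / (2 * sigma**2))    # Gaussian neighbourhood function
            weights += lr * h[:, None] * (x - weights)
    return weights
```

Mapping each incoming sample to its best-matching unit index then yields a discrete code that could, for instance, be written to a YARP port as the "concept" output of one modality.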
Decision Module for iCub Cognitive Architecture (group)
Put together existing modules (ball tracking, face tracking, attention system, etc.) under the iCub Cognitive Architecture framework. Understand and improve the communication and data format of current modules; create a behaviour-based action selector with inhibition, habituation and associative memory mechanisms.
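A behaviour-based selector with habituation and winner-take-all inhibition could be sketched as below. The behaviour names, update rates, and dynamics are illustrative assumptions, not part of the iCub Cognitive Architecture framework:

```python
class ActionSelector:
    """Winner-take-all action selection with habituation (sketch).

    Behaviour names and rate constants are illustrative, not the iCub API.
    """
    def __init__(self, behaviours, habituation_rate=0.2, recovery_rate=0.05):
        self.behaviours = list(behaviours)
        self.habituation = {b: 0.0 for b in behaviours}  # per-behaviour "fatigue"
        self.hab_rate = habituation_rate
        self.rec_rate = recovery_rate

    def select(self, saliences):
        """Pick the behaviour with the highest salience minus habituation."""
        effective = {b: saliences.get(b, 0.0) - self.habituation[b]
                     for b in self.behaviours}
        winner = max(effective, key=effective.get)
        # the winner habituates (gets "bored"); inhibited losers slowly recover
        for b in self.behaviours:
            if b == winner:
                self.habituation[b] += self.hab_rate
            else:
                self.habituation[b] = max(0.0, self.habituation[b] - self.rec_rate)
        return winner
```

With constant saliences, the habituation term makes the selected behaviour alternate over time instead of one module monopolising the robot.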
Learning by imitation
Having the iCub learn some behavior by imitating a human (with vision)
An article I found: http://robotic.media.mit.edu/pdfs/journals/Breazeal-AL05.pdf
Motor Babbling and Autonomous Discovery of Kinematics
Applying some machine learning methods to motor babbling, where the robot generates an internal model of its own motor function as a baby/child would.
Here is also some reading that may be good for brainstorming:
- http://www.brainpowerlabs.com/temp/VanHulle_kMER_algorithm.pdf - describes Van Hulle's kMER (kernel-based Maximum Entropy learning Rule), similar to Kohonen's SOM but with nicer properties (e.g., no wasted units). This paper is a little heavy if you're not deep into this area, but it might be interesting to some people.
- MMCM framework
- The paper this is based on is under review; if you're interested, you can put your e-mail here and I will send it to you. The paper is about learning a forward kinematic model and applying that model to control a robotic system. I extended it to a 4-DOF system in space instead of a planar 3-DOF one.
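As a toy version of the motor-babbling idea, the sketch below babbles random joint commands on a simulated planar 2-DOF arm and learns a forward model from the (command, observed position) pairs with a simple k-nearest-neighbour regressor. The arm, the regressor choice, and all dimensions are illustrative assumptions, not the method from the paper mentioned above:

```python
import numpy as np

def fk(q, l1=1.0, l2=0.8):
    """Ground-truth forward kinematics of a planar 2-DOF arm (stands in for the robot)."""
    x = l1 * np.cos(q[..., 0]) + l2 * np.cos(q[..., 0] + q[..., 1])
    y = l1 * np.sin(q[..., 0]) + l2 * np.sin(q[..., 0] + q[..., 1])
    return np.stack([x, y], axis=-1)

rng = np.random.default_rng(0)
Q = rng.uniform(-np.pi, np.pi, size=(2000, 2))  # random motor-babbling commands
X = fk(Q)                                       # "observed" hand positions

def predict(q_new, k=5):
    """Learned forward model: average the hand positions of the k nearest babbled postures."""
    d = np.linalg.norm(Q - q_new, axis=1)
    idx = np.argpartition(d, k)[:k]
    return X[idx].mean(axis=0)
```

On the real robot the `fk` call would be replaced by executing the command and observing the hand, e.g. visually, but the learning loop stays the same.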
Implementing a (working) walking algorithm (Group)
Realizing stable walking with the iCub by exploiting the mobility of the torso and arms to track and control the iCub's center of mass.
Some basic C++ classes for Cartesian and joint-space control are already available. They use the orocos-kdl library.
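The core bookkeeping for this project is computing the whole-body center of mass and checking its ground projection against the support polygon. A minimal sketch, with made-up link masses and positions rather than real iCub values, and the support polygon approximated as an axis-aligned box under the feet:

```python
import numpy as np

# Illustrative link masses (kg) and 3-D positions (m); NOT real iCub values.
masses = np.array([4.0, 1.5, 1.5, 2.0, 2.0])            # torso, two arms, two legs
positions = np.array([[0.00, 0.0, 0.50],
                      [0.15, 0.1, 0.45],
                      [-0.15, 0.1, 0.45],
                      [0.05, 0.0, 0.20],
                      [-0.05, 0.0, 0.20]])

def center_of_mass(masses, positions):
    """Mass-weighted average of the link positions."""
    return (masses[:, None] * positions).sum(axis=0) / masses.sum()

def com_stable(com_xy, support_min, support_max):
    """Static stability check: ground projection of the COM inside the
    support polygon (here approximated as an axis-aligned box)."""
    return bool(np.all(com_xy >= support_min) and np.all(com_xy <= support_max))
```

A controller would then move the torso and arm joints so that `center_of_mass(...)` stays inside the support region while the legs step; the link positions themselves would come from the forward kinematics (e.g. via orocos-kdl).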
Walking pattern generation of iCub
Performing a walking task, even in the air, would make an impact, since the iCub has not yet officially performed such a task. We are looking for brilliant brains interested in walking combined with vision intelligence. To make the task feasible for this summer school, the proposal is organized as follows.
- Learning the forward and inverse kinematics of the iCub, favouring learning-based approaches over analytical ones;
- Matlab simulation for generating various walking patterns, outputting a database;
- Vision-based detection of commands from a human commander, such as start, pause, and resume;
- Communication with the real iCub robot via YARP, to perform aerial walking.
Proposer: Zhibin Li. If interested, you can either talk to me or sign here =)
Do the robot
Making the iCub do "the robot" dance.
Here is a video of how it works.
So we need some rhythm detection and some trajectories, and we synchronize the trajectories to the music.
Perhaps someone wants to implement a learning algorithm which watches the video and learns the movements? ;)
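For the rhythm-detection part, a simple energy-based onset detector is probably enough to lock the trajectories to the beat. A minimal sketch, tested here on a synthetic 120 BPM click track; frame sizes and the threshold are arbitrary starting values:

```python
import numpy as np

def detect_beats(signal, sr, frame=1024, hop=512, threshold=2.0):
    """Energy-based beat detector: flag frames whose energy jumps above
    `threshold` times the locally averaged energy. Minimal sketch."""
    energies = np.array([np.sum(signal[i:i + frame] ** 2)
                         for i in range(0, len(signal) - frame, hop)])
    local_mean = np.convolve(energies, np.ones(8) / 8, mode="same") + 1e-12
    beat_frames = np.where(energies > threshold * local_mean)[0]
    return beat_frames * hop / sr   # beat times in seconds

# Synthetic 120 BPM click track: a short burst every 0.5 s
sr = 8000
signal = np.zeros(sr * 2)
for beat in np.arange(0.0, 2.0, 0.5):
    i = int(beat * sr)
    signal[i:i + 200] = 1.0
```

The detected beat times would then drive the phase of the dance trajectories so the moves land on the music.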
Need to burn those extra pounds of pizza and pasta? This is your opportunity, but we need to know as soon as possible so we can book the place. Bring beer... and money.