VVV09 Locomotion group
From Wiki for iCub and Friends
This proposal evolves as the project goes on.
- A Matlab simulation that generates various walking patterns and outputs them to a database;
- Vision-based detection of commands (such as start, pause, and resume) from a human commander;
- Communication with the real iCub robot via YARP to perform walking in the air.
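The first stage of the pipeline above can be sketched in Python. The file format (one time-stamped row of joint angles per line, exported from Matlab) and the function name are assumptions for illustration, not the team's actual database:

```python
# Hypothetical sketch of the planned pipeline: walking patterns are
# precomputed in Matlab and exported as rows of joint angles; a reader
# loads them so they can later be streamed to the robot over YARP.
# The CSV-style format below is an assumption, not the actual database.

def load_walking_pattern(lines):
    """Parse 'time, q1, q2, ...' rows into (time, [angles]) tuples."""
    pattern = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        fields = [float(f) for f in line.split(",")]
        pattern.append((fields[0], fields[1:]))
    return pattern

# Example: a two-sample pattern for three joints (hip, knee, ankle).
demo = [
    "# t, hip, knee, ankle",
    "0.00, 0.0, 10.0, -5.0",
    "0.02, 1.5, 12.0, -5.5",
]
trajectory = load_walking_pattern(demo)
```

Each `(time, angles)` tuple could then be written to a YARP port at the right instant to drive the joints.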
Team members
- Nieves Pavón
- Manfred Kroehnert
- Fan ZHANG
- Zhi Bin LI
Updates
- Currently our group is able to make the robot walk in the air. The initial test on the real iCub is scheduled for 16:00, 23rd July.
- Fan ZHANG is trying to link Matlab to the iCub simulator.
- Day 5: Fan ZHANG is now able to link Matlab Simulink to the iCub simulator. A new test of iCub stretched-knee walking is scheduled for 15:00.
- Day 7: Nieves is able to make the robot recognize a green block by filtering out the background. If the block is up on the left, iCub must start to walk. If the block is up on the right, iCub must stop. If the block is down on the left, iCub must pause, and finally, if the block is down on the right, iCub resets. If the block is very far away or simply not seen by iCub, the robot keeps doing the same thing according to the last command. It would be very nice to use gestures made with a hand, but there is too little time to get a good result, and it is better to get all the locomotion modules working together in a simple way.
- Tuesday (before the last day): Nieves is able to make the robot understand commands using blocks. The vision module has been put together with the rest of the project... and it is working (more or less) :P
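The Day 7 quadrant logic above can be sketched as a small Python function. The normalized image coordinates (origin at the top-left) and the function signature are assumptions for illustration, not the team's actual vision module:

```python
# Sketch of the Day 7 command logic: the detected green block's position
# in the image selects a command, and losing sight of the block keeps the
# last command active. Coordinates are assumed normalized to [0, 1] with
# the origin at the top-left corner; this is illustrative only.

def block_to_command(block, last_command="stop"):
    """Map a detected block position to a locomotion command.

    block: (x, y) tuple in normalized image coordinates, or None when
    the block is too far away or not detected at all.
    """
    if block is None:
        return last_command  # robot keeps obeying the last command
    x, y = block
    if y < 0.5:                       # upper half of the image
        return "start" if x < 0.5 else "stop"
    else:                             # lower half of the image
        return "pause" if x < 0.5 else "reset"

cmd = block_to_command((0.2, 0.3))              # up-left -> "start"
cmd = block_to_command(None, last_command=cmd)  # block lost -> keep "start"
```

Keeping the last command when the block disappears gives the commander a simple way to "let go" without interrupting the robot.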
Big Events
iCub is walking in VVV09 summer school
The test of iCub was performed as scheduled. The whole control framework is explained on our wiki page. First, the dynamic simulation was performed in the iCub simulator environment. Second, we switched the communication from the simulator to the real iCub robot. Then iCub performed the Moon Walk!
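The simulator-to-robot switch mentioned above can be very small in practice: iCub's YARP port names conventionally start with /icubSim for the simulator and /icub for the real robot, so the controller mostly needs a different port prefix. The helper below is a hypothetical sketch of that idea, not the team's code:

```python
# Hedged sketch of the simulator-to-robot switch: iCub's YARP ports
# conventionally use the /icubSim prefix in simulation and /icub on the
# real robot, so switching targets can amount to changing the prefix.
# The part name and port suffix below are illustrative.

def control_port(part, use_simulator=True):
    """Build the YARP rpc port name for a robot part (e.g. 'left_leg')."""
    prefix = "/icubSim" if use_simulator else "/icub"
    return "{}/{}/rpc:i".format(prefix, part)

sim_port = control_port("left_leg")                         # simulator target
robot_port = control_port("left_leg", use_simulator=False)  # real-robot target
```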
Special thanks go to Manfred Kroehnert, the VC++ expert on our team, who turned the simulation into a real application. Our next step is to introduce vision intelligence: iCub will be able to receive hand-posture commands from a human operator! =)