
Deliverable 3.6 update

From Wiki for iCub and Friends


Software implementation of the phylogenetic abilities... (version 2.0)

This is the addendum to Deliverable 3.6, the second release of this part of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation deriving from the work in WP3 (sensorimotor coordination models) is a collection of applications built from YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation; the modules are also described in the iCub modules documentation.


Some of these applications and videos are shared with Deliverable 2.2, which is the final collection of the iCub software (version 2.0).

Important note

  • Please note that browsers often fail to display/embed the videos correctly because of codec and/or other player incompatibilities. In that case, we recommend downloading the videos to your computer and playing them with your favorite media player.



The iCub attention system

the attention system
  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed here
  • Paper:
    • Ruesch, J., Lopes, M., Hornstein, J., Santos-Victor, J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008, pp. 962-967. -PDF-
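As a rough illustration of the bottom-up saliency principle behind the attention system, the sketch below combines several normalized feature maps (e.g., visual contrast and motion) into a single saliency map and selects the winning location. This is not the iCub code, just the general technique; all maps and weights here are invented.

```python
# Illustrative sketch of saliency-map combination (not the actual iCub code).
# Feature maps are normalized to [0, 1], combined as a weighted sum, and the
# most salient location wins (winner-take-all).

def normalize(feature_map):
    """Scale a 2D map to the range [0, 1]."""
    flat = [v for row in feature_map for v in row]
    lo, hi = min(flat), max(flat)
    span = hi - lo or 1.0
    return [[(v - lo) / span for v in row] for row in feature_map]

def combine(maps, weights):
    """Weighted sum of equally sized 2D feature maps."""
    rows, cols = len(maps[0]), len(maps[0][0])
    saliency = [[0.0] * cols for _ in range(rows)]
    for m, w in zip(maps, weights):
        norm = normalize(m)
        for r in range(rows):
            for c in range(cols):
                saliency[r][c] += w * norm[r][c]
    return saliency

def winner_take_all(saliency):
    """Return the (row, col) of the most salient location."""
    best = max((v, r, c)
               for r, row in enumerate(saliency)
               for c, v in enumerate(row))
    return best[1], best[2]

# Two toy 3x3 feature maps: visual contrast and motion.
visual = [[0, 1, 0], [0, 5, 0], [0, 0, 0]]
motion = [[0, 0, 0], [0, 0, 0], [0, 0, 9]]
sal = combine([visual, motion], [0.7, 0.3])
print(winner_take_all(sal))  # the high-contrast cell wins with these weights
```

In the full framework the maps come from different modalities (vision, audition) and the weights set their relative influence on where the robot looks.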



The iCub reaching/grasping controller

the Cartesian controller
  • The main Doxygen documentation of the reaching controller is available as the iCub application called armCartesianController and can be browsed here. The implementation also includes the multi-referential approach used for learning the body schema (see below).
  • Papers:
    • None, but details have been included in the progress report -PDF-
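A Cartesian controller has to solve an inverse-kinematics problem at every control step. As a hedged illustration of one standard technique for this (damped least squares; not necessarily what armCartesianController uses), the sketch below drives a two-link planar arm to a target; link lengths, gains and damping are invented.

```python
import math

# Hedged sketch of Cartesian reaching via damped least squares on a
# two-link planar arm (illustrative values, not the iCub controller).

L1, L2 = 0.3, 0.25  # link lengths in meters (made up)

def forward(q1, q2):
    """Forward kinematics: joint angles -> hand position."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def jacobian(q1, q2):
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def dls_step(q, target, damping=0.05, gain=0.5):
    """One update: dq = J^T (J J^T + damping^2 I)^-1 * error."""
    x, y = forward(*q)
    ex, ey = target[0] - x, target[1] - y
    J = jacobian(*q)
    # A = J J^T + damping^2 I is 2x2, so invert it in closed form.
    a = J[0][0]**2 + J[0][1]**2 + damping**2
    b = J[0][0]*J[1][0] + J[0][1]*J[1][1]
    d = J[1][0]**2 + J[1][1]**2 + damping**2
    det = a * d - b * b
    wx = ( d * ex - b * ey) / det
    wy = (-b * ex + a * ey) / det
    dq1 = J[0][0] * wx + J[1][0] * wy
    dq2 = J[0][1] * wx + J[1][1] * wy
    return [q[0] + gain * dq1, q[1] + gain * dq2]

q = [0.3, 0.5]
target = (0.35, 0.25)
for _ in range(200):
    q = dls_step(q, target)
x, y = forward(*q)
print(round(x, 3), round(y, 3))  # close to the target
```

The damping term keeps the update well-behaved near kinematic singularities, at the cost of slightly slower convergence.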



Crawling

the iCub crawling
  • The main Doxygen documentation of the crawling controller is available as the iCub application called missing_application and can be browsed [http:// here]
  • Videos:
    • crawling.wmv, a few steps with the crawling controller (old video).
  • Papers (and more):
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • Degallier, S. and Ijspeert, A.J., Modeling Discrete and Rhythmic Movement through Motor Primitives: A Review. 2009. Submitted to Biological Cybernetics. -PDF-
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-
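The crawling controller generates rhythmic joint trajectories from motor primitives. As a minimal illustration of a rhythmic primitive of the kind reviewed in the papers above, the sketch below integrates a Hopf limit-cycle oscillator; the real controller couples many such units (one per joint), and all parameters here are invented.

```python
import math

# Illustrative rhythmic motor primitive: a Hopf oscillator whose trajectory
# converges to a limit cycle of amplitude sqrt(mu), whatever the start state.
# Parameters are made up for this toy example.

def hopf_step(x, y, mu=1.0, omega=2.0 * math.pi, alpha=5.0, dt=0.001):
    r2 = x * x + y * y
    dx = alpha * (mu - r2) * x - omega * y
    dy = alpha * (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

x, y = 0.1, 0.0  # start near the unstable fixed point at the origin
for _ in range(20000):  # integrate 20 s with simple Euler steps
    x, y = hopf_step(x, y)

amplitude = math.sqrt(x * x + y * y)
print(round(amplitude, 2))  # converges to sqrt(mu) = 1.0
```

Because the limit cycle is an attractor, the generated trajectory recovers smoothly from perturbations, which is the property that makes such oscillators attractive for locomotion.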

Drumming uses the same modules as crawling

the iCub drumming
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed here
  • Papers (and more):
    • -Deliverable 3.4-
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-



Force control on the iCub

the iCub v2.0 (torque control)
  • The main Doxygen documentation of the force control demonstration is available as the iCub application called forceControl and can be browsed [http:// here]
  • Videos:
    • A preliminary video of some related research (iCub v2.0) is available here
  • Papers:
    • Parmiggiani, A., Randazzo, M., Natale, L., Metta, G., Sandini, G. Joint torque sensing for the upper-body of the iCub humanoid robot. IEEE-RAS International Conference on Humanoid Robots. (2009). -PDF-
    • Fumagalli M., Gijsberts, A., Ivaldi, S., Jamone, L., Metta, G., Natale, L., Nori, F., Sandini, G. Learning to Exploit Proximal Force Sensing: a Comparison Approach. From Motor Learning to Interaction Learning in Robots. (2010). -PDF-
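Joint torque sensing is what makes joint-level torque control laws possible. As an illustration of the idea (not the iCub implementation), the sketch below simulates a simple impedance controller on a single joint; inertia, gains and time step are invented, and the real controllers are considerably more sophisticated (e.g., gravity compensation, dedicated hardware).

```python
# Hedged sketch of joint-level impedance control: the commanded torque is a
# virtual spring-damper around the desired angle. All values are made up.

inertia = 0.05    # joint inertia [kg m^2], illustrative
K, D = 2.0, 0.4   # stiffness and damping gains, illustrative
dt = 0.001

q, dq = 0.0, 0.0  # joint angle and velocity
q_des = 0.8       # desired joint angle [rad]

for _ in range(5000):  # 5 s of simulation
    tau = K * (q_des - q) - D * dq   # impedance control law
    ddq = tau / inertia
    dq += dt * ddq
    q += dt * dq

print(round(q, 2))  # settles near the desired angle
```

Lowering K makes the joint compliant to external pushes, which is the point of torque control for safe interaction.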



Body schema learning

iCub reaching
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed here
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link
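As a toy illustration of online body-schema adaptation, the sketch below estimates an unknown arm-segment length from observed hand positions by stochastic gradient descent; the published method adapts a full kinematic chain online, and all values here are invented.

```python
import math
import random

# Hedged sketch of online body-schema learning: adapt an estimate of a
# single segment length L from noisy observations of the hand position
# (x, y) = L (cos theta, sin theta). Everything here is illustrative.

random.seed(0)
L_true = 0.32   # true segment length, unknown to the learner
L_hat = 0.20    # initial guess
eta = 0.5       # learning rate

for _ in range(2000):
    theta = random.uniform(-math.pi, math.pi)      # commanded joint angle
    # Observed hand position (e.g., from vision), with a little noise.
    x = L_true * math.cos(theta) + random.gauss(0, 0.002)
    y = L_true * math.sin(theta) + random.gauss(0, 0.002)
    # Predicted position under the current body schema.
    xh, yh = L_hat * math.cos(theta), L_hat * math.sin(theta)
    # Gradient of the squared prediction error w.r.t. L_hat.
    grad = (xh - x) * math.cos(theta) + (yh - y) * math.sin(theta)
    L_hat -= eta * grad

print(round(L_hat, 2))  # approaches the true length
```

The same prediction-error-driven update, applied to every parameter of the kinematic chain, lets the robot keep its internal model calibrated as its body (or its cameras) change.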



Predictive gaze

The iCub smooth pursuit
  • The main Doxygen documentation of the predictive gaze is available as the iCub application called missing link and can be browsed [http:// here]
  • Papers:
    • Falotico E., Taiana M., Zambrano A., Bernardino A., Santos-Victor J., Laschi C., Dario P., Predictive tracking across occlusions on the iCub robot. 9th IEEE-RAS International Conference on Humanoid Robots, 7-10th December 2009, Paris, France. -PDF-
    • Zambrano D., Falotico E., Manfredi L., Laschi C., A model of the smooth pursuit eye movement with prediction and learning. Applied Bionics and Biomechanics. 2009 (accepted).
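The key idea in predictive tracking is to keep an internal model of the target running while it is occluded. The sketch below illustrates this with a simple constant-velocity (alpha-beta style) filter that extrapolates through a one-second occlusion; the published models use learned predictors, and all parameters here are invented.

```python
# Hedged sketch of tracking across occlusions: a constant-velocity internal
# model keeps predicting the target when no measurement arrives.
# Gains and time step are illustrative.

dt = 0.1
alpha, beta = 0.85, 0.5  # blending gains for position and velocity

pos_est, vel_est = 0.0, 0.0

def step(measurement):
    """One filter update; measurement is None while the target is occluded."""
    global pos_est, vel_est
    pred = pos_est + vel_est * dt          # predict forward one step
    if measurement is None:
        pos_est = pred                     # occluded: trust the prediction
    else:
        resid = measurement - pred
        pos_est = pred + alpha * resid
        vel_est = vel_est + beta * resid / dt

# Target moves at 1 unit/s; it is hidden between t = 2 s and t = 3 s.
for k in range(50):
    t = k * dt
    true_pos = 1.0 * t
    visible = not (2.0 <= t < 3.0)
    step(true_pos if visible else None)

print(round(pos_est, 2), round(vel_est, 2))  # still on target after occlusion
```

Because the velocity estimate is maintained through the gap, the gaze can stay on the target's predicted position and reacquire it when it reappears.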



Bimanual coordination

iCub bimanual reaching
  • The main Doxygen documentation of the bimanual coordination experiments is available as the iCub application called missing link and can be browsed [http:// here]
  • Papers:
    • Mohan, V., Morasso, P., Metta, G., Sandini, G. A biomimetic, force-field based computational model for motion planning and bimanual coordination in humanoid robots. Autonomous Robots. (in press). pp.1-46 (2009). -PDF-
  • Videos:
    • ICubAutRobs.wmv, a video showing bimanual coordination in reaching and manipulation of simple objects.
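As a loose illustration of force-field based coordination, the sketch below pulls each hand toward its own goal with a virtual attractive field while a shared time base makes both hands arrive together regardless of how far they travel; this toy model and its parameters are invented, not the published formulation.

```python
# Hedged sketch of bimanual coordination via virtual force fields with a
# shared time base (terminal-attractor style gain). Illustrative only.

dt = 0.01
T = 1.0  # common movement duration for both hands

def gamma(t):
    """Shared time-base gain that grows as t approaches T."""
    s = min(t / T, 0.999)
    return (1.0 / (1.0 - s)) / T

left, right = [0.0, 0.0], [0.5, -0.2]       # initial hand positions
goal_l, goal_r = [0.3, 0.4], [0.1, 0.3]     # per-hand goals

t = 0.0
while t < T:
    g = gamma(t)
    for hand, goal in ((left, goal_l), (right, goal_r)):
        for i in range(2):
            hand[i] += dt * g * (goal[i] - hand[i])  # attractive field
    t += dt

print([round(v, 2) for v in left], [round(v, 2) for v in right])
```

Because both fields are scheduled by the same time base, the two reaches finish simultaneously, which is one simple way to obtain coordinated bimanual behaviour.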



Active stereo matching

binocular disparity processing
  • The main Doxygen documentation of the active stereo matching is available as the following iCub modules:
  • Papers:
    • Martinez, H., Lungarella, M. and Pfeifer, R., Influence of the sensor morphology in the generation of coordinated behavior as a function of information structure., Technical Report, AI-Lab, University of Zurich, Switzerland, November 2009. -PDF-
  • Videos:
    • logpolar.wmv, this video shows how zero-disparity tracking keeps the robot looking at the object of interest, which enables the matching process.
    • tracking.wmv, a video showing generic tracking using zero disparity filtering and use of disparity data.
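As a toy illustration of the disparity computation behind zero-disparity filtering, the sketch below matches patches between a left and a right signal: a fixated object yields zero disparity, an unfixated one a non-zero shift. The real modules work on images (log-polar here); the 1-D signals and values below are invented.

```python
# Hedged sketch of disparity estimation by patch matching: find the
# horizontal shift that minimizes the sum of absolute differences (SAD)
# between left and right signals. Toy 1-D version, illustrative only.

def best_disparity(left, right, max_d=4):
    """Return the shift of `right` that best matches `left`."""
    n = len(left)
    best, best_sad = 0, float("inf")
    for d in range(-max_d, max_d + 1):
        sad, count = 0.0, 0
        for i in range(n):
            j = i + d
            if 0 <= j < n:
                sad += abs(left[i] - right[j])
                count += 1
        sad /= count  # normalize so shorter overlaps are not favored
        if sad < best_sad:
            best, best_sad = d, sad
    return best

scene = [0, 0, 1, 5, 9, 5, 1, 0, 0, 0, 0, 0]
left = scene
right_shifted = scene[2:] + [0, 0]           # object off the fixation point
print(best_disparity(left, right_shifted))   # non-zero: not fixated
print(best_disparity(left, left))            # zero: target is fixated
```

Zero-disparity tracking uses exactly this signal: by steering the eyes so the disparity of the attended object stays at zero, the robot keeps it fixated and makes the stereo matching around it easy.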