Deliverable 2.2 v2

From Wiki for iCub and Friends
Revision as of 18:10, 3 January 2010 by Pasa (talk | contribs)


Software Implementation of the iCub Cognitive Architecture (version 2.0)

This is the addendum to Deliverable 2.2, which is the second release of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation is a collection of applications comprising YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation. The modules are also described in the iCub modules documentation.
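In this setup, an application is typically described by an XML file that the YARP application manager uses to start the modules and connect their ports. The fragment below is only an illustrative sketch of that format; the module names, nodes, and port names are invented for the example and are not taken from any of the demos listed here.

```xml
<application>
  <name>exampleApplication</name>

  <!-- modules launched by the manager (names and nodes are illustrative) -->
  <module>
    <name>exampleGrabber</name>
    <node>icub-head</node>
  </module>
  <module>
    <name>exampleViewer</name>
    <node>console</node>
  </module>

  <!-- connections between the modules' YARP ports -->
  <connection>
    <from>/exampleGrabber/img:o</from>
    <to>/exampleViewer/img:i</to>
    <protocol>tcp</protocol>
  </connection>
</application>
```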


The final goal is to have a single application which instantiates all the modules in the iCub Cognitive Architecture and realizes the behaviours encapsulated in the Empirical Investigations. The current release is still a collection of behaviours and cannot yet run as a single entity. Following the implementation plan, we have realized a number of live demonstrations, which include:

  • Reaching, grasping, affordance understanding and imitation (using results from WP3 and WP4);
  • Human-robot interaction (using results from WP5N);
  • Crawling (using results from WP3);

And a number of ancillary live demonstrations such as:

  • Gazing, memory and prediction (using results from WP2 and WP3);
  • Force control on the iCub (using results described in WP3);

And a few more demonstrations on video/teleconferencing:

  • Predictive gaze (using results from WP3);
  • Human imitation (using results from WP5N).


Important note

  • Please note that the browser often won't display/embed the videos correctly because of codec and/or other player incompatibilities. In that case, we recommend downloading them to your computer and playing them with your favorite media player software.



Affordances: reaching, grasping and imitation

the iCub grasping an object
  • This demo integrates basic sensorimotor skills (reaching and grasping) with affordance learning (offline) and exploitation (imitation game).
  • The main Doxygen documentation of the demonstration is the iCub application called demoAffv2 and can be browsed here
  • Paper:
    • Luis Montesano, Manuel Lopes, Alexandre Bernardino, Jose Santos-Victor, Learning Object Affordances: From Sensory Motor Maps to Imitation, IEEE Transactions on Robotics, Special Issue on Bio-Robotics, Vol 24(1) Feb 2008. -PDF-
    • See also the latest progress report (M49-M65) -PDF-
  • Additional information:
    • The preparation of the demo can be followed here.

The affordance demo uses the attention system

the attention system
  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed here
  • Paper:
    • Ruesch J., Lopes, M., Hornstein J., Santos-Victor J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008. pp. 962-967. -PDF-
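The framework in the paper above builds saliency maps bottom-up from multimodal feature contrasts. As a toy illustration of the center-surround contrast idea only (box blurs stand in for the multi-scale filtering of the real system; all sizes are illustrative, not the iCub implementation):

```python
import numpy as np

def center_surround_saliency(img, center=1, surround=4):
    """Toy bottom-up saliency: rectified difference between a fine (center)
    and a coarse (surround) box blur of an intensity image, normalized."""
    def box_blur(x, r):
        # separable box blur with edge padding
        k = 2 * r + 1
        p = np.pad(x, r, mode="edge")
        h = np.stack([p[:, i:i + x.shape[1]] for i in range(k)]).mean(axis=0)
        return np.stack([h[i:i + x.shape[0], :] for i in range(k)]).mean(axis=0)
    sal = np.abs(box_blur(img, center) - box_blur(img, surround))
    return sal / sal.max() if sal.max() > 0 else sal

# a bright blob on a dark background should be the most salient region
img = np.zeros((32, 32))
img[14:18, 14:18] = 1.0
sal = center_surround_saliency(img)
peak = np.unravel_index(np.argmax(sal), sal.shape)
```

In the full system, maps like this one are computed per modality (vision, audition) and fused before driving the gaze.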

The affordance demo relies also on the reaching/grasping controller

the Jacobian-based Cartesian controller
  • The main Doxygen documentation of the reaching controller is available as the iCub application called armCartesianController and can be browsed here
  • Paper:
    • None, but details have been included in the progress report -PDF-


Body schema (WP3 and WP5)

iCub reaching
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed here
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link

Crawling (WP3)

the iCub crawling
  • The main Doxygen documentation of the crawling controller is available as the iCub application called missing_application and can be browsed here
  • Videos:
  • Paper (and more):
    • -Deliverable 3.4-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF of submitted paper-
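The modular architecture described in the paper above generates rhythmic movement from coupled nonlinear oscillators. As a minimal sketch of the kind of limit-cycle oscillator involved (a single Hopf oscillator with illustrative parameters, not the iCub gains or coupling):

```python
import math

def hopf_step(x, y, dt, mu=1.0, omega=2.0 * math.pi):
    """One Euler step of a Hopf oscillator: it converges to a stable limit
    cycle of radius sqrt(mu) at frequency omega, from almost any start."""
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

# integrate from a small perturbation; the amplitude settles near sqrt(mu)
x, y = 0.1, 0.0
dt = 0.001
for _ in range(20000):
    x, y = hopf_step(x, y, dt)
amplitude = math.hypot(x, y)
```

The stability of the limit cycle is what makes such oscillators attractive for locomotion: perturbations (e.g. ground contact) are absorbed and the rhythm resumes.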

Drumming (WP3)

the iCub drumming
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed here
  • Paper (and more):
    • -Deliverable 3.4-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF of submitted paper-

Cartesian control (WP3)

the Jacobian-based Cartesian controller
  • The main Doxygen documentation of the Cartesian controller is available as the iCub application called armCartesianController and can be browsed here. The implementation also includes the trajectory generation implemented for the learning of the body schema (see above).
  • Paper:
    • Not yet!
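The documentation of armCartesianController linked above is the authoritative source. As a hedged illustration of the Jacobian-based idea only, here is a toy damped-least-squares Cartesian step for a planar 2-link arm; the link lengths, gains, damping, and the solver itself are illustrative and not the iCub implementation:

```python
import math

def fk(q1, q2, l1=1.0, l2=1.0):
    """Forward kinematics of a planar 2-link arm (illustrative link lengths)."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

def jacobian(q1, q2, l1=1.0, l2=1.0):
    """2x2 analytic Jacobian d(x,y)/d(q1,q2)."""
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def dls_step(q, target, gain=0.5, damp=0.01):
    """One damped-least-squares update: dq = J^T (J J^T + damp*I)^-1 err."""
    q1, q2 = q
    x, y = fk(q1, q2)
    ex, ey = target[0] - x, target[1] - y
    J = jacobian(q1, q2)
    # A = J J^T + damp*I is 2x2; solve A w = err in closed form
    a = J[0][0] ** 2 + J[0][1] ** 2 + damp
    b = J[0][0] * J[1][0] + J[0][1] * J[1][1]
    d = J[1][0] ** 2 + J[1][1] ** 2 + damp
    det = a * d - b * b
    wx = (d * ex - b * ey) / det
    wy = (a * ey - b * ex) / det
    # dq = J^T w
    dq1 = J[0][0] * wx + J[1][0] * wy
    dq2 = J[0][1] * wx + J[1][1] * wy
    return (q1 + gain * dq1, q2 + gain * dq2)

# iterate toward a reachable Cartesian target
q = (0.3, 0.3)
target = (1.2, 0.8)
for _ in range(200):
    q = dls_step(q, target)
err = math.hypot(fk(*q)[0] - target[0], fk(*q)[1] - target[1])
```

The damping term keeps the step well-behaved near kinematic singularities, where a plain pseudo-inverse would demand very large joint velocities.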

Affordances (WP4 & WP5)

the iCub running the affordance modules
  • The main Doxygen documentation of the affordance experiment is available as the iCub application called missing application and can be browsed here
  • Video, an initial video of the affordance experiment on the iCub:
  • Paper (and more):
    • -Deliverable 4.1-
    • Lopes, M., Melo, F., and Montesano, L. Affordance-Based Imitation Learning in Robots. IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, USA, October 2007. -PDF of submitted paper-

Interaction histories (WP6)

the Peekaboo game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called iha_manual and can be browsed here
  • Video:
    • iha.wmv, full video of the experiment.
  • Paper (and more):
    • -Deliverable 6.4-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., and te Boekhorst, R. 2005. Using sensory-motor phase-plots to characterise robot-environment interactions. In Proc. of 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation. -PDF of submitted paper-