Deliverable 5N.2

From Wiki for iCub and Friends
Revision as of 00:48, 4 January 2010 by Pasa (talk | contribs)


Software Implementation of the iCub Cognitive Architecture (version 2.0)

This is the addendum to Deliverable 2.2, the second release of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation is a collection of applications comprising YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation; the modules are also described in the iCub modules documentation.


The final goal is a single application which instantiates all the modules in the iCub Cognitive Architecture and realizes the behaviours encapsulated in the Empirical Investigations. The current release is still a collection of behaviours and cannot yet run as a single entity. As per the implementation plan, we have realized a number of demonstrations, which include (live):

  • Reaching, grasping, affordance understanding and imitation (using results from WP3 and WP4);
  • Human-robot interaction (using results from WP5N);
  • Crawling (using results from WP3);

And a number of ancillary live demonstrations such as:

  • Gazing, memory and prediction (using results from WP2 and WP3);
  • Force control on the iCub (using results described in WP3);

And a few more demonstrations on video/teleconferencing:

  • Predictive gaze (using results from WP3);
  • Human imitation (using results from WP5N).


Important note

  • Please note that your browser may not display/embed the videos correctly because of codec and/or other player incompatibilities. In that case, we recommend downloading them to your computer and then playing them using your favorite media player software.



Affordances: reaching, grasping and imitation

the iCub grasping an object
  • This demo integrates basic sensorimotor skills (reaching and grasping) with affordance learning (offline) and exploitation (imitation game).
  • The main Doxygen documentation of the demonstration is the iCub application called demoAffv2 and can be browsed here
  • Papers:
    • Luis Montesano, Manuel Lopes, Alexandre Bernardino, Jose Santos-Victor, Learning Object Affordances: From Sensory Motor Maps to Imitation, IEEE Transactions on Robotics, Special Issue on Bio-Robotics, Vol 24(1) Feb 2008. -PDF-
    • See also the latest progress report (M49-M65) -PDF-
  • Additional information:
    • The preparation of the demo can be followed here.
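
As a rough illustration of the idea behind the demo, affordances can be represented as learned conditional probabilities linking actions, object properties and effects (Montesano et al. model these with a Bayesian network). The sketch below is a hypothetical, simplified stand-in using plain frequency counts; it is not the actual iCub implementation:

```python
from collections import Counter, defaultdict

# Illustrative sketch only (not the iCub code): affordances modelled as
# conditional probabilities P(effect | action, object_shape), estimated by
# counting discretized (action, shape, effect) interaction episodes, loosely
# in the spirit of the Bayesian-network model of Montesano et al.
class AffordanceTable:
    def __init__(self):
        self.counts = defaultdict(Counter)  # (action, shape) -> effect counts

    def observe(self, action, shape, effect):
        self.counts[(action, shape)][effect] += 1

    def p_effect(self, action, shape, effect):
        c = self.counts[(action, shape)]
        total = sum(c.values())
        return c[effect] / total if total else 0.0

    def imitate(self, shape, desired_effect, actions):
        # Pick the action most likely to reproduce the demonstrated effect.
        return max(actions, key=lambda a: self.p_effect(a, shape, desired_effect))

table = AffordanceTable()
for _ in range(8):
    table.observe("tap", "ball", "rolls")   # balls usually roll when tapped
table.observe("tap", "ball", "stays")
for _ in range(9):
    table.observe("tap", "box", "stays")    # boxes do not

best = table.imitate("ball", "rolls", ["tap", "grasp"])
```

In the imitation game, the demonstrated effect plays the role of `desired_effect`: the robot selects the action whose learned affordance best explains what it just observed.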

The affordance demo uses the attention system

the attention system
  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed here
  • Paper:
    • Ruesch J., Lopes, M., Hornstein J., Santos-Victor J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008. pp. 962-967. -PDF-
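
The gist of the framework can be sketched as follows: per-modality saliency maps are normalized, combined on a common egocentric map as a weighted sum, and the most salient location drives the gaze. This is only an illustrative sketch (map sizes, weights and the winner-take-all rule here are assumptions, not the attention_distributed code):

```python
# Illustrative sketch only (not the iCub code): bottom-up multimodal
# attention in the style of Ruesch et al. -- per-modality saliency maps are
# normalized, combined as a weighted sum on a shared egocentric grid, and
# the location with the highest combined saliency wins the gaze.
def normalize(m):
    peak = max(max(row) for row in m)
    return [[v / peak for v in row] for row in m] if peak else m

def combine(maps, weights):
    rows, cols = len(maps[0]), len(maps[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for m, w in zip(maps, weights):
        n = normalize(m)
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * n[r][c]
    return out

def most_salient(m):
    # Return (row, col) of the winning location.
    return max(((r, c) for r in range(len(m)) for c in range(len(m[0]))),
               key=lambda rc: m[rc[0]][rc[1]])

visual = [[0.0, 0.2], [0.9, 0.1]]    # hypothetical visual saliency map
auditory = [[0.5, 0.0], [0.0, 0.1]]  # hypothetical auditory saliency map
salmap = combine([visual, auditory], [0.6, 0.4])
target = most_salient(salmap)
```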

The affordance demo relies also on the reaching/grasping controller

the Jacobian-based Cartesian controller
  • The main Doxygen documentation of the reaching controller is available as the iCub application called armCartesianController and can be browsed here. The implementation also includes the trajectory generation implemented for the learning of the body schema (see below).
  • Papers:
    • None, but details have been included in the progress report -PDF-
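
To give a flavour of what "Jacobian-based" means here, the sketch below runs a generic Jacobian-transpose iteration on a hypothetical planar two-link arm. It illustrates the principle only and is unrelated to the actual armCartesianController code:

```python
import math

# Illustrative sketch only (not armCartesianController): a generic
# Jacobian-transpose iteration driving a planar 2-link arm's end-effector
# toward a Cartesian target -- the basic idea behind any Jacobian-based
# reaching controller.
L1, L2 = 1.0, 1.0  # link lengths (hypothetical)

def forward(q1, q2):
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def jacobian(q1, q2):
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def reach(target, q=(0.3, 0.3), gain=0.1, steps=2000):
    q1, q2 = q
    for _ in range(steps):
        x, y = forward(q1, q2)
        ex, ey = target[0] - x, target[1] - y   # Cartesian error
        J = jacobian(q1, q2)
        # dq = gain * J^T * error: descend the squared-error landscape
        q1 += gain * (J[0][0] * ex + J[1][0] * ey)
        q2 += gain * (J[0][1] * ex + J[1][1] * ey)
    return forward(q1, q2)

x, y = reach((1.2, 0.8))
```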



Human-robot interaction

iCub interaction game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called ihaNew and can be browsed at missing link
  • Video:
    • No videos yet.
  • Papers:
    • Broz, F., Kose-Bagci, H., Nehaniv, C.L., Dautenhahn, K., Learning behavior for a social interaction game with a childlike humanoid robot, Social Learning in Interactive Scenarios Workshop, Humanoids 2009, Paris, France, 7 December, 2009. here

Human-robot interaction uses the interaction histories architecture

the Peekaboo game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called iha_manual and can be browsed here
  • Video:
    • iha.wmv, full video of the experiment.
  • Papers (and more):
    • -Deliverable 6.4-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., and te Boekhorst, R. 2005. Using sensory-motor phase-plots to characterise robot-environment interactions. In Proc. of 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation. -PDF-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., te Boekhorst, R., Developing Social Action Capabilities in a Humanoid Robot using an Interaction History Architecture. Proc. IEEE-RAS Humanoids 2008. -PDF-



Crawling

the iCub crawling
  • The main Doxygen documentation of the crawling controller is available as the iCub application called missing_application and can be browsed [http:// here]
  • Videos:
    • crawling.wmv, a few steps with the crawling controller (old video).
  • Papers (and more):
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • Degallier, S. and Ijspeert, A.J., Modeling Discrete and Rhythmic Movement through Motor Primitives: A Review. 2009. Submitted to Biological Cybernetics. -PDF-
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-
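
The rhythmic part of such CPG-based controllers is typically built from coupled oscillators. As a minimal sketch (a single Euler-integrated Hopf oscillator with assumed parameters, not the actual crawling gains):

```python
import math

# Illustrative sketch only (not the iCub crawling code): a single Hopf
# oscillator of the kind used as a rhythmic motor primitive in CPG-based
# controllers such as Degallier & Ijspeert's. Its state converges to a
# stable limit cycle of radius sqrt(mu) regardless of initial conditions.
def hopf_step(x, y, mu=1.0, omega=2 * math.pi, a=5.0, dt=0.001):
    r2 = x * x + y * y
    dx = a * (mu - r2) * x - omega * y   # radial attraction + rotation
    dy = a * (mu - r2) * y + omega * x
    return x + dx * dt, y + dy * dt

x, y = 0.1, 0.0               # start far from the limit cycle
for _ in range(20000):        # integrate 20 s of simulated time
    x, y = hopf_step(x, y)
radius = math.hypot(x, y)     # should settle near sqrt(mu) = 1
```

The convergence to a limit cycle from almost any initial state is what makes such primitives robust building blocks: perturbations to a joint trajectory are smoothly absorbed back into the rhythmic pattern.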

Drumming

the iCub drumming
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed here
  • Papers (and more):
    • -Deliverable 3.4-
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-



Gaze, memory and prediction (cognitive gaze)

  • The main Doxygen documentation of the cognitive gaze demonstration is available as the iCub application called cognitiveGaze and can be browsed [http:// here]
  • Videos:
    • None as yet.
  • Papers:
    • The Cognitive Gaze is described in great length in the D2.1, please see section 15.6.9: -PDF-



Force control on the iCub

the iCub v2.0 (torque control)
  • The main Doxygen documentation of the force control demonstration is available as the iCub application called forceControl and can be browsed [http:// here]
  • Videos:
    • A preliminary video of some related research (iCub v2.0) is available here
  • Papers:
    • Parmiggiani, A., Randazzo, M., Natale, L., Metta, G., Sandini, G. Joint torque sensing for the upper-body of the iCub humanoid robot. IEEE-RAS International Conference on Humanoid Robots. (2009). -PDF-
    • Fumagalli M., Gijsberts, A., Ivaldi, S., Jamone, L., Metta, G., Natale, L., Nori, F., Sandini, G. Learning to Exploit Proximal Force Sensing: a Comparison Approach. From Motor Learning to Interaction Learning in Robots. (2010). -PDF-



Imitation learning, refinement and reuse

imitation learning on the iCub
  • The main Doxygen documentation of the imitation learning application is available as the iCub application called Imitation learning, refinement and reuse and can be browsed here
  • Videos:
    • iCub_LasaImitation.avi: this video shows the imitation learning procedure as well as the process of policy refinement and reuse.
  • Papers:
    • Brenna D. Argall, Eric L. Sauser and Aude G. Billard. Tactile Guidance for Policy Refinement and Reuse. under preparation (2010) -PDF-

Body schema learning

iCub reaching
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed here
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link



Predictive gaze

  • The main Doxygen documentation of the predictive gaze is available as the iCub application called missing link and can be browsed [http:// here]
  • Papers:
    • Falotico E., Taiana M., Zambrano A., Bernardino A., Santos-Victor J., Laschi C., Dario P., Predictive tracking across occlusions on the iCub robot. 9th IEEE-RAS International Conference on Humanoid Robots, 7-10th December 2009, Paris, France. -PDF-
    • Zambrano D., Falotico E., Manfredi L., Laschi C., A model of the smooth pursuit eye movement with prediction and learning. Applied Bionics and Biomechanics. 2009 (accepted).
  • Videos:
    • No videos as yet.
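
The core idea of predictive tracking across occlusions can be sketched as follows: the robot maintains an internal estimate of target position and velocity and, while the target is occluded, pursues the prediction instead of the (missing) measurement. The alpha-beta filter below is a hypothetical stand-in for the model of Falotico et al., not their implementation:

```python
# Illustrative sketch only (not the iCub code): predictive tracking through
# occlusions. An internal position/velocity estimate (here a simple
# alpha-beta filter, an assumed stand-in for the published model) keeps the
# gaze moving when the target measurement is unavailable.
def track(measurements, dt=0.1, alpha=0.85, beta=0.5):
    pos, vel = measurements[0], 0.0
    gaze = []
    for z in measurements[1:]:
        pos += vel * dt            # predict one step ahead
        if z is None:              # occlusion: trust the prediction
            gaze.append(pos)
            continue
        r = z - pos                # innovation (measurement residual)
        pos += alpha * r           # correct position estimate
        vel += beta * r / dt       # correct velocity estimate
        gaze.append(pos)
    return gaze

# Target moves at 1 unit/s, sampled at 10 Hz, occluded for samples 10-14.
truth = [0.1 * k for k in range(20)]
meas = [None if 10 <= k < 15 else truth[k] for k in range(20)]
gaze = track(meas)
```

During the occluded interval the gaze stays on the extrapolated trajectory, so pursuit can resume immediately when the target reappears.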