
Deliverable 2.2 v2

From Wiki for iCub and Friends

Software Implementation of the iCub Cognitive Architecture (version 2.0)


This is the on-line addendum to Deliverable 2.2, the second release of a suite of software based on the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The majority of the iCub Cognitive Architecture modules have now been implemented, albeit some of them in skeleton form. Nonetheless, they represent an almost-complete instantiation of the cognitive architecture as planned for and envisaged in Deliverable D2.1: A Roadmap for the Development of Cognitive Capabilities in Humanoid Robots. The most recent documentation for the individual implemented iCub YARP modules, as well as the XML specification of the cognitive architecture application, may be found on the ICub_Cognitive_Architecture wiki page.


That said, the current version of the cognitive architecture falls far short of subsuming all of the functionality and behaviours produced in the other work-packages WP3-WP6. These outstanding behaviours and functionality are still encapsulated individually as a collection of discrete iCub XML-specified applications (i.e. self-contained networks of iCub modules).


Ultimately, the cognitive architecture is still only a framework (although an almost completely implemented one); much work remains to be done to integrate into it all of the iCub behaviours instantiated as applications from other work-packages.


This is as one would expect: the RobotCub project (in the broad sense of the word) does not end simply because the contract has ended. Indeed, it is arguable that it would have been truly unrealistic to solve all the problems we identified at the outset, even in a five-year period. What is important is that (a) the work will carry on regardless of the end of the project, and (b) the cognitive architecture provides a solid framework to guide future developments (as it has guided past development in WP3-WP6) and to consolidate and integrate all of these past and future efforts.


Thus, D2.2 is a suite of software comprising iCub YARP modules and a collection of distinct XML-specified applications. Each application realizes a given behaviour and runs independently on the iCub. The applications are available from the Repository of iCub Applications and the constituent modules are described in the Repository of iCub Modules.
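To make the notion of an XML-specified application concrete, the sketch below shows the general shape of such a description: a list of modules to launch (each on a given node) and the port connections to establish between them. This is only an illustrative fragment; the module, node, and port names here are invented, and the exact schema is the one defined by the iCub application manager.

```xml
<application>
  <name>exampleDemo</name>
  <!-- Modules to be launched, each on a named machine of the iCub cluster -->
  <module>
    <name>camCalib</name>
    <parameters>--name /icub/camCalib</parameters>
    <node>icub-node1</node>
  </module>
  <module>
    <name>viewer</name>
    <node>icub-console</node>
  </module>
  <!-- Port-to-port connections forming the module network -->
  <connection>
    <from>/icub/cam/left</from>
    <to>/icub/camCalib/in</to>
    <protocol>tcp</protocol>
  </connection>
</application>
```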


The ultimate long-term goal is to have a unified iCub XML application which instantiates all the modules in the Repository of iCub Modules and which realizes the behaviours encapsulated in Empirical Investigations.


The following is a list of live demonstrations which, as per the implementation plan, include:

  • Reaching, grasping, affordance understanding and imitation (using results from WP3 and WP4)
  • Human-robot interaction (using results from WP5N)
  • Crawling (using results from WP3)

and a number of ancillary live demonstrations including:

  • Gazing, memory and prediction (using results from WP2 and WP3);
  • Force control on the iCub (using results described in WP3);

Additionally, there are two further demonstrations, which will be conducted by video/teleconferencing:

  • Predictive gaze (using results from WP3);
  • Human imitation (using results from WP5N).


Important note

  • Please note that browsers often fail to display/embed the videos correctly because of codec and/or other player incompatibilities. In that case, we recommend downloading them to your computer and then playing them using your favourite media player software.



Affordances: reaching, grasping and imitation

the iCub grasping an object
  • This demo integrates basic sensorimotor skills (reaching and grasping) with affordance learning (offline) and exploitation (imitation game).
  • The main Doxygen documentation of the demonstration is available as the iCub application called demoAffv2 and can be browsed here
  • Papers:
    • Luis Montesano, Manuel Lopes, Alexandre Bernardino, Jose Santos-Victor, Learning Object Affordances: From Sensory Motor Maps to Imitation, IEEE Transactions on Robotics, Special Issue on Bio-Robotics, Vol 24(1) Feb 2008. -PDF-
    • See also the latest progress report (M49-M65) -PDF-
  • Additional information:
    • The preparation of the demo can be followed here.


Note that the approach taken to modelling affordances is consistent with the iCub cognitive architecture in general, and the procedural memory in particular. Specifically, the affordance triplet (O, A, E), where O is an object, A is an action performed on that object, and E is the effect of that action, maps directly to the procedural memory perception-action-perception triplet of associations (Pi, Aj, Pk). For further discussion, see Procedural Memory and Affordances.
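The triplet mapping above can be illustrated with a minimal sketch: an (object, action) → effect look-up table supports both effect prediction and action selection for imitation. The object, action, and effect labels below are purely illustrative and are not the actual iCub vocabulary or implementation.

```python
# Learned (off-line) object-action -> effect associations, i.e. (O, A) -> E,
# mirroring the procedural-memory association (Pi, Aj, Pk).
affordances = {
    ("ball", "tap"): "rolls",
    ("cube", "tap"): "slides",
    ("ball", "grasp"): "lifted",
}

def predict_effect(obj, action):
    """Prediction: given O and A, retrieve the expected effect E (or None)."""
    return affordances.get((obj, action))

def select_action(obj, desired_effect):
    """Imitation: given O and a desired effect E, search for an action A."""
    for (o, a), e in affordances.items():
        if o == obj and e == desired_effect:
            return a
    return None
```

For example, `select_action("cube", "slides")` recovers the action "tap" from the learned table, which is the exploitation step used in the imitation game.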

The affordance demo uses the attention system

the attention system
  • The main Doxygen documentation of the attention system is available as the iCub application called attentionDistributed and can be browsed here
  • Paper:
    • Ruesch J., Lopes, M., Hornstein J., Santos-Victor J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008. pp. 962-967. -PDF-


It is worth remarking that this attention system is incorporated directly in the iCub cognitive architecture, the only addition being the Endogenous Salience Module.
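The core idea of the multimodal bottom-up attention scheme, combining per-modality saliency maps into a single map and attending to the most salient location, can be sketched as follows. This toy version uses plain lists as 2-D maps; the map sizes, weights, and the winner-take-all selection are illustrative simplifications of the actual system.

```python
def combine_saliency(maps, weights):
    """Weighted sum of equally sized 2-D saliency maps (one per modality)."""
    rows, cols = len(maps[0]), len(maps[0][0])
    combined = [[0.0] * cols for _ in range(rows)]
    for m, w in zip(maps, weights):
        for r in range(rows):
            for c in range(cols):
                combined[r][c] += w * m[r][c]
    return combined

def winner(saliency):
    """Return (row, col) of the most salient location (winner-take-all)."""
    best, best_rc = float("-inf"), (0, 0)
    for r, row in enumerate(saliency):
        for c, v in enumerate(row):
            if v > best:
                best, best_rc = v, (r, c)
    return best_rc
```

Changing the per-modality weights shifts which stimulus wins: a strong auditory event can capture attention away from a visually salient region, which is the behaviour the multimodal framework is designed to produce.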

The affordance demo relies also on the reaching/grasping controller

the Cartesian controller
  • The main Doxygen documentation of the reaching controller is available as the iCub application called armCartesianController and can be browsed here. The implementation also includes the multi-referential approach used for learning the body schema (see below).
  • Papers:
    • None, but details have been included in the progress report -PDF-

This is one of the phylogenetic abilities/skills defined in the Cognitive Architecture and developed as part of WP3.



Human-robot interaction

iCub interaction game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called ihaNew and can be browsed here
  • Papers:
    • Broz, F., Kose-Bagci, H., Nehaniv, C.L., Dautenhahn, K., Learning behavior for a social interaction game with a childlike humanoid robot, Social Learning in Interactive Scenarios Workshop, Humanoids 2009, Paris, France, 7 December, 2009. here

Human-robot interaction uses the interaction histories architecture

the Peekaboo game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called iha and can be browsed here
  • Video:
    • iha.wmv, full video of the experiment.
  • Papers (and more):
    • -Deliverable 6.4-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., and te Boekhorst, R. 2005. Using sensory-motor phase-plots to characterise robot-environment interactions. In Proc. of 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation. -PDF-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., te Boekhorst, R., Developing Social Action Capabilities in a Humanoid Robot using an Interaction History Architecture. Proc. IEEE-RAS Humanoids 2008. -PDF-



Crawling

the iCub crawling
  • The main Doxygen documentation of the crawling controller is available as the iCub application called Crawling Demo and can be browsed here
  • Videos:
    • crawling.wmv, a few steps with the crawling controller (old video).
  • Papers (and more):
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • Degallier, S. and Ijspeert, A.J., Modeling Discrete and Rhythmic Movement through Motor Primitives: A Review. 2009. Submitted to Biological Cybernetics. -PDF-
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-

Drumming

the iCub drumming
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed here
  • Papers (and more):
    • -Deliverable 3.4-
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-



Gaze, memory and prediction (cognitive gaze)

  • The main Doxygen documentation of the cognitive gaze demonstration is available as the iCub application called cognitiveGaze and can be browsed here (work in progress)
  • Videos:
    • None as yet.
  • Papers:
    • The Cognitive Gaze is described at length in D2.1; please see section 15.6.9: -PDF-
  • Modules:
    • attention system: documentation for application
    • attentionSelection: documentation for module
    • controlGaze2: documentation for module
    • endogenousSalience: documentation for module
    • episodicMemory: documentation for module and application
    • proceduralMemory: documentation for module
    • crossPowerSpectrumVergence: documentation for module and documentation
    • actionSelection (work in progress)
    • affectiveState (work in progress)



Force control on the iCub

the iCub v2.0 (torque control)
  • The main Doxygen documentation of the force control demonstration is available as the iCub application called forceControl and can be browsed here
  • Videos:
    • A preliminary video of some related research (iCub v2.0) is available here
  • Papers:
    • Parmiggiani, A., Randazzo, M., Natale, L., Metta, G., Sandini, G. Joint torque sensing for the upper-body of the iCub humanoid robot. IEEE-RAS International Conference on Humanoid Robots. (2009). -PDF-
    • Fumagalli M., Gijsberts, A., Ivaldi, S., Jamone, L., Metta, G., Natale, L., Nori, F., Sandini, G. Learning to Exploit Proximal Force Sensing: a Comparison Approach. From Motor Learning to Interaction Learning in Robots. (2010). -PDF-



Imitation learning, refinement and reuse

imitation learning on the iCub
  • The main Doxygen documentation of the imitation learning application is available as the iCub application called Imitation learning, refinement and reuse and can be browsed here
  • Videos:
    • iCub_LasaImitation.avi: this video shows the imitation learning procedure as well as the process of policy refinement and reuse.
  • Papers:
    • Brenna D. Argall, Eric L. Sauser and Aude G. Billard. Tactile Guidance for Policy Refinement and Reuse. under preparation (2010) -PDF-

Body schema learning

iCub reaching
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed here
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link



Predictive gaze

The iCub smooth pursuit
  • The main Doxygen documentation of the predictive gaze is available as the iCub application called missing link and can be browsed here
  • Papers:
    • Falotico E., Taiana M., Zambrano A., Bernardino A., Santos-Victor J., Laschi C., Dario P., Predictive tracking across occlusions on the iCub robot. 9th IEEE-RAS International Conference on Humanoid Robots, 7-10th December 2009, Paris, France. -PDF-
    • Zambrano D., Falotico E., Manfredi L., Laschi C., A model of the smooth pursuit eye movement with prediction and learning. Applied Bionics and Biomechanics. 2009 (accepted).
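The essential idea behind predictive tracking across occlusions, keeping an internal estimate of the target's motion so that gaze can continue when the target disappears, can be sketched with a simple constant-velocity predictor. The published work uses a learned predictive model of smooth pursuit; this linear extrapolation is only an illustrative stand-in, and the class and method names are invented for the sketch.

```python
class ConstantVelocityTracker:
    """Toy tracker: extrapolates target position when measurements are missing."""

    def __init__(self, pos, vel=0.0):
        self.pos, self.vel = pos, vel

    def update(self, measurement=None, dt=1.0):
        """Predict one step ahead; correct with the measurement if visible."""
        predicted = self.pos + self.vel * dt
        if measurement is None:
            # Occlusion: no measurement, so trust the prediction
            self.pos = predicted
        else:
            # Target visible: re-estimate velocity and adopt the measurement
            self.vel = (measurement - self.pos) / dt
            self.pos = measurement
        return self.pos
```

During an occlusion the estimate keeps advancing at the last observed velocity, so the gaze controller can stay ahead of the target and reacquire it when it reappears.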