Software Implementation of the iCub Cognitive Architecture (version 1.99)
[Image: Icubijcai.jpg]

This is the addendum to Deliverable 2.2, the second release of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation is a collection of applications comprising YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation (http://wiki.icub.org/iCub/dox/html/group__icub__applications.html). The modules are also described in the iCub modules documentation (http://wiki.icub.org/iCub/dox/html/group__icub__module.html).
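As a rough illustration of what one such module looks like in code, here is a minimal, self-contained sketch written against YARP's standard yarp::os::RFModule class; the module name and port name (DemoModule, /demoModule/in) are made up for the example and are not part of the released software.

    // Minimal sketch of a standalone YARP module (names are hypothetical).
    // Each application is a collection of such processes wired together via ports.
    #include <cstdio>
    #include <yarp/os/Network.h>
    #include <yarp/os/ResourceFinder.h>
    #include <yarp/os/RFModule.h>
    #include <yarp/os/BufferedPort.h>
    #include <yarp/os/Bottle.h>

    class DemoModule : public yarp::os::RFModule {
        yarp::os::BufferedPort<yarp::os::Bottle> inPort;
    public:
        bool configure(yarp::os::ResourceFinder &rf) override {
            // Open the module's input port; other modules connect to it.
            return inPort.open("/demoModule/in");
        }
        double getPeriod() override { return 0.1; }  // call updateModule every 100 ms
        bool updateModule() override {
            if (yarp::os::Bottle *b = inPort.read(false))  // non-blocking read
                std::printf("received: %s\n", b->toString().c_str());
            return true;  // returning false stops the module
        }
        bool close() override { inPort.close(); return true; }
    };

    int main(int argc, char *argv[]) {
        yarp::os::Network yarp;       // connects the process to the YARP name server
        yarp::os::ResourceFinder rf;
        rf.configure(argc, argv);
        DemoModule mod;
        return mod.runModule(rf);     // blocks until the module terminates
    }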


The final goal is to have an application which instantiates all the modules in the iCub Cognitive Architecture and which realizes the behaviours encapsulated in the Empirical Investigations. The current release is still a collection of behaviours and cannot yet run as a single entity (a sketch of what such an application description could look like follows the lists below). As per the implementation plan, we have realized a number of demonstrations, which include the following live demonstrations:

  • Reaching, grasping, affordance understanding and imitation (using results from WP3 and WP4);
  • Human-robot interaction (using results from WP5N);
  • Crawling (using results from WP3);

And a number of ancillary live demonstrations such as:

  • Gazing, memory and prediction (using results from WP2 and WP3);
  • Force control on the iCub (using results described in WP3);

And a few more demonstrations on video/teleconferencing:

  • Predictive gaze (using results from WP3);
  • Human imitation (using results from WP5N).
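To make the notion of an "application" concrete, the following is a minimal, hypothetical sketch of a yarpmanager application description of the kind used to instantiate modules and wire their ports together; the application name, node names and port names are illustrative assumptions, not those of the released software.

    <!-- Hypothetical application description: starts two modules and
         connects one output port to one input port. -->
    <application>
        <name>cognitiveArchitectureDemo</name>
        <module>
            <name>attentionSelection</name>
            <node>icub-console</node>
        </module>
        <module>
            <name>controlGaze2</name>
            <node>icub-console</node>
        </module>
        <connection>
            <from>/attentionSelection/gazeTarget:o</from>
            <to>/controlGaze2/target:i</to>
        </connection>
    </application>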


Important note

  • Please note that browsers often won't display/embed the videos correctly because of codec and/or other player incompatibilities. In that case, we recommend downloading them to your computer and playing them with your favorite media player software.



Affordances: reaching, grasping and imitation

[Image: the iCub grasping an object]
  • This demo integrates basic sensorimotor skills (reaching and grasping) with affordance learning (offline) and exploitation (imitation game).
  • The main Doxygen documentation of the demonstration is the iCub application called demoAffv2 and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__demoAffv2__app.html
  • Videos:
    • Older video: affordances.wmv (http://wiki.icub.org/misc/icubvideos/affordances.wmv)
    • icub_grasps.wmv (http://wiki.icub.org/misc/icubvideos/icub_grasps_sponges.wmv)
  • Papers:
    • Luis Montesano, Manuel Lopes, Alexandre Bernardino, Jose Santos-Victor, Learning Object Affordances: From Sensory Motor Maps to Imitation, IEEE Transactions on Robotics, Special Issue on Bio-Robotics, Vol 24(1) Feb 2008. -PDF-
    • See also the latest progress report (M49-M65) -PDF-
  • Additional information:
    • The preparation of the demo can be followed here.

The affordance demo uses the attention system

[Image: the attention system]
  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__attentionDistributed.html
  • Video: iCub-attention.wmv (http://wiki.icub.org/misc/icubvideos/iCub-attention-may14-2008.wmv)
  • Paper:
    • Ruesch J., Lopes, M., Hornstein J., Santos-Victor J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008. pp. 962-967. -PDF-

The affordance demo also relies on the reaching/grasping controller

[Image: the Cartesian controller]
  • The main Doxygen documentation of the reaching controller is available as the iCub application called armCartesianController and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__armCartesianController.html. The implementation also includes the multi-referential approach used for learning the body schema (see below). A brief usage sketch of this controller is given after the papers below.
  • Videos:
    • icub_grasps.wmv (http://wiki.icub.org/misc/icubvideos/icub_grasps_sponges.wmv)
  • Papers:
    • None, but details have been included in the progress report -PDF-
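As a usage sketch for this controller: client programs typically talk to it through YARP's Cartesian interface. The snippet below assumes the standard cartesiancontrollerclient device and the conventional /icub/cartesianController/right_arm server port; both the port names and the target coordinates are assumptions that may differ on a given setup.

    // Sketch: asking the Cartesian controller to reach a point (assumed setup).
    #include <yarp/os/Network.h>
    #include <yarp/os/Property.h>
    #include <yarp/dev/PolyDriver.h>
    #include <yarp/dev/CartesianControl.h>
    #include <yarp/sig/Vector.h>

    int main() {
        yarp::os::Network yarp;

        // Connect to the (assumed) Cartesian controller server for the right arm.
        yarp::os::Property opt;
        opt.put("device", "cartesiancontrollerclient");
        opt.put("remote", "/icub/cartesianController/right_arm");
        opt.put("local", "/demo/cartesianClient/right_arm");

        yarp::dev::PolyDriver driver(opt);
        if (!driver.isValid()) return 1;

        yarp::dev::ICartesianControl *icart = nullptr;
        driver.view(icart);

        // Target position in the robot root frame, in meters
        // (on the iCub, negative x points in front of the robot).
        yarp::sig::Vector xd(3);
        xd[0] = -0.3; xd[1] = 0.0; xd[2] = 0.1;

        icart->goToPositionSync(xd);      // command the reach
        icart->waitMotionDone(0.1, 5.0);  // poll every 0.1 s, give up after 5 s

        driver.close();
        return 0;
    }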



Human-robot interaction

[Image: iCub interaction game]
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called ihaNew and can be browsed at missing link
  • Video:
    • No videos yet.
  • Papers:
    • Broz, F., Kose-Bagci, H., Nehaniv, C.L., Dautenhahn, K., Learning behavior for a social interaction game with a childlike humanoid robot, Social Learning in Interactive Scenarios Workshop, Humanoids 2009, Paris, France, 7 December, 2009. here

Human-robot interaction uses the interaction histories architecture

[Image: the Peekaboo game]
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called iha_manual and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__iha.html
  • Video:
    • iha.wmv (http://wiki.icub.org/misc/icubvideos/iha.wmv), full video of the experiment.
  • Papers (and more):
    • -Deliverable 6.4-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., and te Boekhorst, R. 2005. Using sensory-motor phase-plots to characterise robot-environment interactions. In Proc. of 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation. -PDF-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., te Boekhorst, R., Developing Social Action Capabilities in a Humanoid Robot using an Interaction History Architecture. Proc. IEEE-RAS Humanoids 2008. -PDF-



Crawling

[Image: the iCub crawling]
  • The main Doxygen documentation of the crawling controller is available as the iCub application called missing_application and can be browsed here (link not yet available)
  • Videos:
    • crawling.wmv (http://wiki.icub.org/misc/icubvideos/first_crawl.wmv), a few steps with the crawling controller (old video).
  • Papers (and more):
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • Degallier, S. and Ijspeert, A.J., Modeling Discrete and Rhythmic Movement through Motor Primitives: A Review. 2009. Submitted to Biological Cybernetics. -PDF-
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-

Drumming

[Image: the iCub drumming]
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__drummingEPFL.html
  • Videos, various videos of the robot drumming (old videos):
    • drumming1.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart1.wmv)
    • drumming2.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart2.wmv)
    • drumming3.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart3.wmv)
    • drumming4.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart4.wmv)
    • automatica08.wmv (http://wiki.icub.org/misc/icubvideos/automatica08-edited.wmv)
  • Papers (and more):
    • -Deliverable 3.4-
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-



Gaze, memory and prediction (cognitive gaze)

  • The main Doxygen documentation of the cognitive gaze demonstration is available as the iCub application called cognitiveGaze and can be browsed here (work in progress; link not yet available)
  • Videos:
    • None as yet.
  • Papers:
    • The Cognitive Gaze is described at great length in D2.1; please see section 15.6.9: -PDF-
  • Modules:
    • attention system: documentation for the application (http://wiki.icub.org/iCub/dox/html/group__icub__attentionDistributed.html)
    • attentionSelection: documentation for the module (http://wiki.icub.org/iCub/dox/html/group__icub__attentionSelection.html)
    • controlGaze2: documentation for the module (http://wiki.icub.org/iCub/dox/html/group__icub__controlGaze2.html)
    • endogenousSalience: documentation for the module (http://wiki.icub.org/iCub/dox/html/group__icub__endogenousSalience.html)
    • episodicMemory: documentation for the module (http://wiki.icub.org/iCub/dox/html/group__icub__episodicMemory.html) and the application (http://wiki.icub.org/iCub/dox/html/group__icub__episodicMemoryApplication.html)
    • proceduralMemory: documentation for the module (http://wiki.icub.org/iCub/dox/html/group__icub__proceduralMemory.html)
    • crossPowerSpectrumVergence: documentation for the module (http://wiki.icub.org/iCub/dox/html/group__icub__crossPowerSpectrumVergence.html) and the application (http://wiki.icub.org/iCub/dox/html/group__icub__crossPowerSpectrumVergenceApplication.html)
    • actionSelection (work in progress)
    • affectiveState (work in progress)



Force control on the iCub

[Image: the iCub v2.0 (torque control)]
  • The main Doxygen documentation of the force control demonstration is available as the iCub application called forceControl and can be browsed here (link not yet available)
  • Videos:
    • A preliminary video of some related research (iCub v2.0) is available at http://wiki.icub.org/misc/icubvideos/torqueControl.wmv
  • Papers:
    • Parmiggiani, A., Randazzo, M., Natale, L., Metta, G., Sandini, G. Joint torque sensing for the upper-body of the iCub humanoid robot. IEEE-RAS International Conference on Humanoid Robots. (2009). -PDF-
    • Fumagalli M., Gijsberts, A., Ivaldi, S., Jamone, L., Metta, G., Natale, L., Nori, F., Sandini, G. Learning to Exploit Proximal Force Sensing: a Comparison Approach. From Motor Learning to Interaction Learning in Robots. (2010). -PDF-



Imitation learning, refinement and reuse

[Image: imitation learning on the iCub]
  • The main Doxygen documentation of the imitation learning application is available as the iCub application called Imitation learning, refinement and reuse and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__lasaImitation.html
  • Videos:
    • iCub_LasaImitation.avi (http://wiki.icub.org/misc/icubvideos/iCub_LasaImitation.avi): this video shows the imitation learning procedure as well as the process of policy refinement and reuse.
  • Papers:
    • Brenna D. Argall, Eric L. Sauser and Aude G. Billard. Tactile Guidance for Policy Refinement and Reuse. under preparation (2010) -PDF-

Body schema learning

[Image: iCub reaching]
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__lasabodyschema.html
  • Videos (older):
    • bodyschema.avi (http://wiki.icub.org/misc/icubvideos/body_schema_sim.avi), this video shows the learning procedure in simulation; also available as a .wmv file (http://wiki.icub.org/misc/icubvideos/body_schema.wmv).
    • fingerreach.avi (http://wiki.icub.org/misc/icubvideos/icub_finger_s.avi) or fingerreach.wmv (http://wiki.icub.org/misc/icubvideos/icub_finger_s.wmv), learning to reach using a different effector (a different finger as the end-point).
    • gazing.avi (http://wiki.icub.org/misc/icubvideos/icub_gazing_s.avi) or gazing.wmv (http://wiki.icub.org/misc/icubvideos/icub_gazing.wmv), learning to gaze appropriately (head-eye coordination).
    • reaching.avi (http://wiki.icub.org/misc/icubvideos/icub_reaching_s.avi) or reaching.wmv (http://wiki.icub.org/misc/icubvideos/iCub-reach-epfl.wmv), learning to reach (whole body).
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link



Predictive gaze

[Image: the iCub smooth pursuit]
  • The main Doxygen documentation of the predictive gaze is available as the iCub application called missing link and can be browsed here (link not yet available)
  • Videos (older):
    • video1.avi (http://wiki.icub.org/misc/icubvideos/SmoothPursuitAndOcclusion.avi), this video shows tracking across occlusions
    • video2.avi (http://wiki.icub.org/misc/icubvideos/Part_A_SmoothPursuit.avi), same as above from a different point of view
    • video3.avi (http://wiki.icub.org/misc/icubvideos/Part_B_SmoothPursuit.avi), ditto
  • Papers:
    • Falotico E., Taiana M., Zambrano A., Bernardino A., Santos-Victor J., Laschi C., Dario P., Predictive tracking across occlusions on the iCub robot. 9th IEEE-RAS International Conference on Humanoid Robots, 7-10 December 2009, Paris, France. -PDF- (http://www.robotcub.org/misc/review5/papers/09_Falotico_Taiana_etal.pdf)
    • Zambrano D., Falotico E., Manfredi L., Laschi C., A model of the smooth pursuit eye movement with prediction and learning. Applied Bionics and Biomechanics. 2009 (accepted).