Deliverable 2.2


Latest revision as of 10:50, 20 June 2013

Software Implementation of the iCub Cognitive Architecture (version 1.0)
[Image: Icubcover.jpg]

This is the addendum to Deliverable 2.2, the first release of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation is a collection of applications comprising YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation (http://wiki.icub.org/iCub/dox/html/group__icub__applications.html). The modules are also described in the iCub modules documentation (http://wiki.icub.org/iCub/dox/html/group__icub__module.html).
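The modules in each application communicate over named ports and are wired together by connecting an output port to an input port. The stand-alone Python sketch below mimics that write/connect/read pattern in-process; it does not use the YARP API itself, and the port names and message format are invented for illustration.

```python
import queue

class Port:
    """A named connection point, loosely mimicking an output port."""
    def __init__(self, name):
        self.name = name
        self.subscribers = []

    def connect(self, other):
        """Wire this output to an input port (like connecting two modules)."""
        self.subscribers.append(other)

    def write(self, msg):
        """Deliver a message to every connected input port."""
        for sub in self.subscribers:
            sub.buffer.put(msg)

class InputPort(Port):
    """A buffered input port owned by a receiving module."""
    def __init__(self, name):
        super().__init__(name)
        self.buffer = queue.Queue()

    def read(self):
        return self.buffer.get_nowait()

# Two independent "modules": a camera source and an attention consumer.
# The port names are hypothetical.
cam_out = Port("/icub/cam/left")
att_in = InputPort("/attention/image:i")
cam_out.connect(att_in)

cam_out.write({"frame": 1})
print(att_in.read())   # -> {'frame': 1}
```

Because each module only knows its own port names, modules can be started, stopped, and rewired independently, which is the property the applications above rely on.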


The final goal is to have an application which instantiates all the modules in the iCub Cognitive Architecture and which realizes the behaviours encapsulated in Empirical Investigations. At that point, the software implementation will be redesignated version 2.0.


Each application is described below, organized by workpackage.

Important note

  • Please note that browsers often fail to display/embed the videos correctly because of codec and/or player incompatibilities. In that case, we recommend downloading them to your computer and playing them from there.

Generic (WP7 & WP8)

[Image: the iCub hands]
  • These are older videos to show the functionality of the robot and they are typical mechanical stress-tests.
  • The main Doxygen documentation of the basic control modules is available as the iCub application called demoy3 and can be browsed at http://wiki.icub.org/iCub/dox/html/group__appdemos.html
  • Videos:
    • icub_yoga.wmv (http://wiki.icub.org/misc/icubvideos/icub_exercising.wmv)
    • icub_hands.wmv (http://wiki.icub.org/misc/icubvideos/iCub_Oct07.3.wmv)
  • Paper:
    • G. Metta, G. Sandini, D. Vernon, L. Natale, F. Nori. The iCub humanoid robot: an open platform for research in embodied cognition. In PerMIS: Performance Metrics for Intelligent Systems Workshop. Aug 19-21, 2008, Washington DC - USA -PDF-

Attention system (WP3)

[Image: the attention system]
  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__attention__distributed.html
  • Video: iCub-attention.wmv (http://wiki.icub.org/misc/icubvideos/iCub-attention-may14-2008.wmv)
  • Paper:
    • Ruesch J., Lopes, M., Hornstein J., Santos-Victor J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008. pp. 962-967. -PDF-
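The framework described in the paper fuses modality-specific saliency maps into a single map whose peak drives the robot's gaze. A minimal sketch of that fusion step in pure Python (the map values and weights are illustrative, not taken from the attention_distributed code):

```python
def normalize(m):
    """Scale a 2-D map to [0, 1]; flat maps become all zeros."""
    lo = min(min(row) for row in m)
    hi = max(max(row) for row in m)
    if hi == lo:
        return [[0.0] * len(row) for row in m]
    return [[(v - lo) / (hi - lo) for v in row] for row in m]

def combine(maps, weights):
    """Weighted sum of normalized saliency maps."""
    norm = [normalize(m) for m in maps]
    rows, cols = len(maps[0]), len(maps[0][0])
    return [[sum(w * n[i][j] for w, n in zip(weights, norm))
             for j in range(cols)] for i in range(rows)]

def most_salient(m):
    """Coordinates of the saliency peak: the attended location."""
    return max(((i, j) for i in range(len(m)) for j in range(len(m[0]))),
               key=lambda ij: m[ij[0]][ij[1]])

visual = [[0, 1, 0], [0, 3, 0], [0, 0, 0]]   # e.g. a colour/motion map
audio  = [[0, 0, 0], [0, 0, 0], [0, 0, 5]]   # e.g. a sound-source map
s = combine([visual, audio], [0.6, 0.4])
print(most_salient(s))   # -> (1, 1): the visual peak wins at these weights
```

Changing the weights shifts which modality dominates, which is the bottom-up mechanism the paper exploits.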

Body schema (WP3 and WP5)

[Image: iCub reaching]
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed at http://wiki.icub.org/iCub/dox/html/group__lasabodyschema.html
  • Videos:
    • bodyschema.avi (http://wiki.icub.org/misc/icubvideos/body_schema_sim.avi), the learning procedure in simulation; also available as a .wmv file (http://wiki.icub.org/misc/icubvideos/body_schema.wmv).
    • fingerreach.avi (http://wiki.icub.org/misc/icubvideos/icub_finger_s.avi) or fingerreach.wmv (http://wiki.icub.org/misc/icubvideos/icub_finger_s.wmv), learning to reach using a different effector (a different finger as the end-point).
    • gazing.avi (http://wiki.icub.org/misc/icubvideos/icub_gazing_s.avi) or gazing.wmv (http://wiki.icub.org/misc/icubvideos/icub_gazing.wmv), learning to gaze appropriately (head-eye coordination).
    • reaching.avi (http://wiki.icub.org/misc/icubvideos/icub_reaching_s.avi) or reaching.wmv (http://wiki.icub.org/misc/icubvideos/iCub-reach-epfl.wmv), learning to reach (whole body).
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link
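Online learning of the body schema amounts to refining the robot's own kinematic model from observed (joint angles, end-effector position) pairs. As a toy illustration of that idea only (a planar 2-link arm whose link lengths are recovered by stochastic-gradient updates; this is not the lasaBodySchema algorithm):

```python
import math
import random

def fkin(l1, l2, q1, q2):
    """Forward kinematics of a planar 2-link arm."""
    return (l1 * math.cos(q1) + l2 * math.cos(q1 + q2),
            l1 * math.sin(q1) + l2 * math.sin(q1 + q2))

def online_update(est, sample, lr=0.1):
    """One stochastic-gradient step on the squared position error."""
    l1, l2 = est
    (q1, q2), (x_obs, y_obs) = sample
    x, y = fkin(l1, l2, q1, q2)
    ex, ey = x - x_obs, y - y_obs
    # Partial derivatives of (x, y) with respect to the link lengths.
    g1 = ex * math.cos(q1) + ey * math.sin(q1)
    g2 = ex * math.cos(q1 + q2) + ey * math.sin(q1 + q2)
    return (l1 - lr * g1, l2 - lr * g2)

random.seed(0)
true_l = (0.22, 0.16)        # the "real" (unknown) link lengths
est = (0.30, 0.30)           # deliberately wrong initial model
for _ in range(2000):
    q = (random.uniform(-1.5, 1.5), random.uniform(0.2, 2.0))
    est = online_update(est, (q, fkin(*true_l, *q)))
print(est)                   # converges close to the true lengths
```

The same principle scales up in the cited work: each new self-observation nudges the kinematic parameters, so the schema adapts online, including when the effector changes (as in the finger-reaching video).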

Crawling (WP3)

[Image: the iCub crawling]
  • The main Doxygen documentation of the crawling controller is available as the iCub application called missing_application (no documentation link is available yet)
  • Videos:
    • crawling.wmv (http://wiki.icub.org/misc/icubvideos/first_crawl.wmv), a few steps with the crawling controller.
  • Paper (and more):
    • -Deliverable 3.4-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF of submitted paper-
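The crawling (and drumming) controllers are built on central pattern generators: coupled dynamical systems whose stable limit cycles produce rhythmic joint trajectories. A minimal sketch of the idea using a Hopf oscillator, a standard CPG building block (the parameters here are illustrative, not taken from the iCub controller):

```python
import math

def hopf_step(x, y, mu=1.0, omega=2 * math.pi, dt=0.001):
    """One Euler step of a Hopf oscillator.

    The system has a stable limit cycle of radius sqrt(mu) and
    angular frequency omega: whatever the starting state, the
    trajectory settles onto the same rhythmic pattern.
    """
    r2 = x * x + y * y
    dx = (mu - r2) * x - omega * y
    dy = (mu - r2) * y + omega * x
    return x + dt * dx, y + dt * dy

# Start far from the limit cycle and integrate 20 s of simulated time.
x, y = 0.1, 0.0
for _ in range(20000):
    x, y = hopf_step(x, y)
print(round(math.hypot(x, y), 2))   # radius has settled near sqrt(mu) = 1
```

This self-stabilizing property is what makes CPGs attractive for locomotion: perturbations decay back onto the cycle, and modulating mu or omega reshapes the gait smoothly.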

Drumming (WP3)

[Image: the iCub drumming]
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__drummingEPFL.html
  • Videos, various videos of the robot drumming:
    • drumming1.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart1.wmv)
    • drumming2.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart2.wmv)
    • drumming3.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart3.wmv)
    • drumming4.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart4.wmv)
    • automatica08.wmv (http://wiki.icub.org/misc/icubvideos/automatica08-edited.wmv)
  • Paper (and more):
    • -Deliverable 3.4-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF of submitted paper-

Cartesian control (WP3)

[Image: the Cartesian controller]
  • The main Doxygen documentation of the Cartesian controller is available as the iCub application called armCartesianController and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__armCartesianController.html. The implementation also includes the multi-referential approach used for learning the body schema (see above).
  • Video, the Cartesian controller in action: icub_guicontrolled.avi (http://wiki.icub.org/misc/icubvideos/icub_guicontrolled.avi) or icub_guicontroller.wmv (http://wiki.icub.org/misc/icubvideos/icub_guicontroller.wmv); the iCub arm receives commands from a GUI developed in MATLAB, and the complete pose (position+orientation) is controlled.
  • Paper:
    • Not yet!
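A Cartesian controller turns a desired end-effector pose into joint displacements, which requires solving the inverse kinematics. One standard generic technique is damped-least-squares (DLS) iteration; the sketch below applies it to a planar 2-link arm. Everything here is an assumption made for illustration (link lengths, damping, and the 2-D position-only simplification); it is not the armCartesianController implementation.

```python
import math

L1, L2 = 0.3, 0.25          # illustrative link lengths (metres)

def fk(q1, q2):
    """Forward kinematics: joint angles -> hand position."""
    return (L1 * math.cos(q1) + L2 * math.cos(q1 + q2),
            L1 * math.sin(q1) + L2 * math.sin(q1 + q2))

def jacobian(q1, q2):
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return [[-L1 * s1 - L2 * s12, -L2 * s12],
            [ L1 * c1 + L2 * c12,  L2 * c12]]

def dls_ik(target, q=(0.3, 0.3), lam=0.05, iters=200):
    """Damped least squares: dq = J^T (J J^T + lam^2 I)^-1 * error."""
    q1, q2 = q
    for _ in range(iters):
        x, y = fk(q1, q2)
        ex, ey = target[0] - x, target[1] - y
        J = jacobian(q1, q2)
        # A = J J^T + lam^2 I is symmetric 2x2; invert it in closed form.
        a = J[0][0] ** 2 + J[0][1] ** 2 + lam * lam
        b = J[0][0] * J[1][0] + J[0][1] * J[1][1]
        d = J[1][0] ** 2 + J[1][1] ** 2 + lam * lam
        det = a * d - b * b
        wx = ( d * ex - b * ey) / det
        wy = (-b * ex + a * ey) / det
        q1 += J[0][0] * wx + J[1][0] * wy   # dq = J^T w
        q2 += J[0][1] * wx + J[1][1] * wy
    return q1, q2

q = dls_ik((0.35, 0.2))
print([round(v, 3) for v in fk(*q)])   # hand ends up at the target
```

The damping term lam keeps the step bounded near singular arm configurations (straight or folded arm), where a plain Jacobian inverse would blow up.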

Affordances (WP4 & WP5)

[Image: the iCub running the affordance modules]
  • The main Doxygen documentation of the affordance experiment is available as the iCub application called missing application and can be browsed at http://wiki.icub.org/iCub/dox/html/
  • Video, an initial video of the affordance experiment on the iCub:
    • affordances.wmv (http://wiki.icub.org/misc/icubvideos/affordances.wmv)
  • Paper (and more):
    • -Deliverable 4.1-
    • Lopes, M., Melo, F., and Montesano, L. Affordance-Based Imitation Learning in Robots. IEEE/RSJ International Conference on Intelligent Robots and Systems, San Diego, USA, October 2007.-PDF of submitted paper-
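Affordance models relate actions, object properties, and observed effects, and can then be inverted to pick the action most likely to achieve a desired effect. A toy frequency-count version of that idea follows; the interaction data and action names are invented for illustration, whereas the cited work learns the relations (as a Bayesian network) from the robot's own interaction data.

```python
from collections import Counter, defaultdict

# Toy interaction log: (action, object shape, observed effect).
log = [
    ("tap",   "ball", "rolls"), ("tap",   "ball", "rolls"),
    ("tap",   "box",  "slides"),
    ("grasp", "ball", "lifted"), ("grasp", "box", "lifted"),
    ("tap",   "box",  "stays"),
]

# Count effects per (action, object) pair.
counts = defaultdict(Counter)
for action, obj, effect in log:
    counts[(action, obj)][effect] += 1

def p_effect(action, obj, effect):
    """Empirical P(effect | action, object) from the log."""
    c = counts[(action, obj)]
    total = sum(c.values())
    return c[effect] / total if total else 0.0

def choose_action(obj, desired_effect, actions=("tap", "grasp")):
    """Invert the model: pick the action most likely to cause the effect."""
    return max(actions, key=lambda a: p_effect(a, obj, desired_effect))

print(choose_action("ball", "rolls"))   # -> tap
```

This inversion step (from desired effect back to action) is what lets the affordance model support imitation: observing an effect suggests which action to reproduce it with.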

Interaction histories (WP6)

[Image: the Peekaboo game]
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called iha_manual and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__iha__app.html
  • Video:
    • iha.wmv (http://wiki.icub.org/misc/icubvideos/iha.wmv), full video of the experiment.
  • Paper (and more):
    • Deliverable 6.4 (http://www.robotcub.org/index.php/robotcub/content/download/1144/4009/file/d6.4.pdf)
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., and te Boekhorst, R. 2005. Using sensory-motor phase-plots to characterise robot-environment interactions. In Proc. of the 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation. PDF of submitted paper: http://www.robotcub.org/misc/papers/05_Mirza_Nehaniv_Dautehahn_teBoekhorst.pdf
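The interaction-histories work characterizes ongoing experience with information-theoretic measures over sensorimotor variables. One measure used in this line of work is the information distance H(X|Y) + H(Y|X) between two variables; a small sketch estimating it from two discrete time series (an illustration of the metric, not the iha_manual code):

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (bits) of the empirical distribution of values."""
    c = Counter(values)
    n = sum(c.values())
    return -sum((k / n) * math.log2(k / n) for k in c.values())

def information_distance(xs, ys):
    """H(X|Y) + H(Y|X), estimated from two equally long sequences.

    Zero for identical sequences; grows as the variables become
    less informative about each other.
    """
    hx, hy = entropy(xs), entropy(ys)
    hxy = entropy(list(zip(xs, ys)))   # joint entropy H(X, Y)
    return (hxy - hy) + (hxy - hx)     # H(X|Y) + H(Y|X)

identical = [0, 1, 0, 1, 1, 0, 1, 0]
print(information_distance(identical, identical))            # -> 0.0
print(information_distance([0, 0, 1, 1], [0, 1, 0, 1]))      # -> 2.0
```

Computing such distances between windows of the robot's sensorimotor history lets the system judge how similar the current interaction (e.g. a round of peekaboo) is to past ones.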