Deliverable 3.1

From Wiki for iCub and Friends
Latest revision as of 10:49, 20 June 2013

This is the addendum to Deliverable 3.1, which consists of the report proper (available here) and a set of demonstrations. An addendum document on the main website points to this webpage. The demonstrations are provided here as videos.

Important note

  • Please note that browsers will often fail to display or embed these videos correctly because of codec or player incompatibilities. In that case, we recommend downloading them to your computer and playing them from there.

Attention system (WP3)

  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__attention__distributed.html
  • Video: iCub-attention.wmv (http://wiki.icub.org/misc/icubvideos/iCub-attention-may14-2008.wmv)
  • Paper:
    • Ruesch J., Lopes M., Hornstein J., Santos-Victor J., Pfeifer R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008, pp. 962-967. (PDF)

Body schema (WP3)

  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed at http://wiki.icub.org/iCub/dox/html/group__lasabodyschema.html
  • Videos:
    • bodyschema.avi (http://wiki.icub.org/misc/icubvideos/body_schema_sim.avi), showing the learning procedure in simulation; also available as a .wmv file (http://wiki.icub.org/misc/icubvideos/body_schema.wmv).
    • fingerreach.avi (http://wiki.icub.org/misc/icubvideos/icub_finger_s.avi) or fingerreach.wmv (http://wiki.icub.org/misc/icubvideos/icub_finger_s.wmv): learning to reach using a different effector (a different finger as the end-point).
    • gazing.avi (http://wiki.icub.org/misc/icubvideos/icub_gazing_s.avi) or gazing.wmv (http://wiki.icub.org/misc/icubvideos/icub_gazing.wmv): learning to gaze appropriately (head-eye coordination).
    • reaching.avi (http://wiki.icub.org/misc/icubvideos/icub_reaching_s.avi) or reaching.wmv (http://wiki.icub.org/misc/icubvideos/iCub-reach-epfl.wmv): learning to reach (whole body).
  • Paper:
    • M. Hersch, E. Sauser and A. Billard. Online Learning of the Body Schema. International Journal of Humanoid Robotics, 2008. (PDF)
    • M. Hersch. Adaptive Sensorimotor Peripersonal Space Representation and Motor Learning for a Humanoid Robot. PhD Thesis, EPFL, 2009. (PDF)

Crawling (WP3)

  • The main Doxygen documentation of the crawling controller is available as the iCub application called missing_application and can be browsed [http:// here]
  • Videos:
    • crawling.wmv (http://wiki.icub.org/misc/icubvideos/first_crawl.wmv): a few steps with the crawling controller.
  • Paper (and more):
    • -Deliverable 3.4-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the Second IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. (PDF of submitted paper)

Drumming (WP3)

  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__drummingEPFL.html
  • Videos, various videos of the robot drumming:
    • drumming1.wmv (http://wiki.icub.org/misc/icubvideos/iCubDrumPart1.wmv)
    • drumming2.wmv (http://wiki.icub.org/misc/icubvideos/iCubDrumPart2.wmv)
    • drumming3.wmv (http://wiki.icub.org/misc/icubvideos/iCubDrumPart3.wmv)
    • drumming4.wmv (http://wiki.icub.org/misc/icubvideos/iCubDrumPart4.wmv)
    • automatica08.wmv (http://wiki.icub.org/misc/icubvideos/automatica08-edited.wmv)
  • Paper (and more):
    • -Deliverable 3.4-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the Second IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. (PDF of submitted paper)

Cartesian control (WP3)

  • The Cartesian Interface for controlling the iCub limbs in task space relies on the iKin library (http://wiki.icub.org/iCub/dox/html/group__iKin.html) and is embedded within YARP (http://wiki.icub.org/wiki/YARP) as a motor device driver; its documentation can be found at http://wiki.icub.org/yarpdoc/dd/de6/classyarp_1_1dev_1_1ICartesianControl.html.

An example application that uses the Cartesian Interface is armCartesianController, whose online documentation can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__armCartesianController.html.
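To make the interface concrete, the following is a minimal C++ sketch of driving one arm in task space through YARP's ICartesianControl, as documented above. The local port name (/client/right_arm) and the target pose are illustrative; the remote port assumes the iCub simulator's standard /icubSim/cartesianController/right_arm server is running.

```cpp
#include <cmath>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/CartesianControl.h>
#include <yarp/sig/Vector.h>

int main() {
    yarp::os::Network yarp;  // initialize the YARP network

    // Open a client view of the Cartesian controller server
    yarp::os::Property option;
    option.put("device", "cartesiancontrollerclient");
    option.put("remote", "/icubSim/cartesianController/right_arm");
    option.put("local", "/client/right_arm");  // illustrative local port

    yarp::dev::PolyDriver driver(option);
    if (!driver.isValid())
        return 1;

    yarp::dev::ICartesianControl *icart = nullptr;
    driver.view(icart);

    icart->setTrajTime(1.0);  // point-to-point trajectory time [s]

    // Target position [m] in the robot root frame (illustrative values)
    yarp::sig::Vector xd(3), od(4);
    xd[0] = -0.3; xd[1] = 0.1; xd[2] = 0.1;
    // Target hand orientation as axis-angle (ax, ay, az, theta)
    od[0] = 0.0; od[1] = 0.0; od[2] = 1.0; od[3] = M_PI;

    icart->goToPose(xd, od);  // move the hand to the target pose

    bool done = false;
    while (!done)
        icart->checkMotionDone(&done);  // poll until the motion ends

    driver.close();
    return 0;
}
```

The same client can control gaze and arm together by opening a second device for the gaze controller, which is how the Arm+Gaze video below was obtained.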

  • Videos, Cartesian controller in action:
    • Grasping Sponges (http://wiki.icub.org/misc/icubvideos/icub_grasps_sponges.wmv): iCub successfully grasps some sponges lying on a table.
    • Arm+Gaze Control (http://wiki.icub.org/misc/icubvideos/reaching_IIT_ISR.wmv): iCub tracks a red ball with gaze and hand, exploiting 8 DOFs.
  • Paper:
    • Not yet!