Deliverable 3.6 update


Software implementation of the phylogenetic abilities... (version 2.0)
[Image: Icubijcai.jpg]

This is the addendum to Deliverable 3.6, the second release of this part of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation deriving from the work in WP3 (sensorimotor coordination models) is a collection of applications composed of YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation (http://wiki.icub.org/iCub/dox/html/group__icub__applications.html). The modules are also described in the iCub modules documentation (http://wiki.icub.org/iCub/dox/html/group__icub__module.html).
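
To give a concrete flavour of this structure, the sketch below shows the standard YARP RFModule skeleton that such applications are typically built from. It is an illustration only: the class, port name, and message content are invented and do not come from the deliverable.

    #include <yarp/os/Network.h>
    #include <yarp/os/RFModule.h>
    #include <yarp/os/ResourceFinder.h>
    #include <yarp/os/BufferedPort.h>
    #include <yarp/os/Bottle.h>

    // Skeleton of a behaviour module in the style of the iCub applications.
    // Class name, port name and message content are illustrative only.
    class ExampleBehaviour : public yarp::os::RFModule {
        yarp::os::BufferedPort<yarp::os::Bottle> out;
    public:
        bool configure(yarp::os::ResourceFinder &rf) override {
            // Each module exposes YARP ports; applications wire them together.
            return out.open("/exampleBehaviour/state:o");
        }
        double getPeriod() override { return 0.1; }      // updateModule() at 10 Hz
        bool updateModule() override {
            yarp::os::Bottle &b = out.prepare();
            b.clear();
            b.addString("running");
            out.write();                                 // publish on the network
            return true;                                 // keep the module alive
        }
        bool close() override { out.close(); return true; }
    };

    int main(int argc, char *argv[]) {
        yarp::os::Network yarp;                          // reach the YARP name server
        if (!yarp.checkNetwork()) return 1;
        ExampleBehaviour module;
        yarp::os::ResourceFinder rf;
        rf.configure(argc, argv);
        return module.runModule(rf);                     // enter the periodic loop
    }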


Some of these applications and videos are shared with Deliverable 2.2, which is the final collection of iCub software (version 2.0).

Important note

  • Please note that the browser often won't display/embed the videos correctly because of codec and/or other player incompatibilities. In that case, we recommend downloading them to your computer and playing them with your favorite media player software.



The iCub attention system

[Image: attention.jpg | the attention system]
  • The main Doxygen documentation of the attention system is available as the iCub application called attention_distributed and can be browsed here: http://wiki.icub.org/iCub/dox/html/group__icub__attentionDistributed.html (an illustrative saliency sketch follows this section)
  • Video: iCub-attention.wmv (http://wiki.icub.org/misc/icubvideos/iCub-attention-may14-2008.wmv)
  • Paper:
    • Ruesch J., Lopes, M., Hornstein J., Santos-Victor J., Pfeifer, R. Multimodal Saliency-Based Bottom-Up Attention - A Framework for the Humanoid Robot iCub. International Conference on Robotics and Automation, Pasadena, CA, USA, May 19-23, 2008. pp. 962-967. -PDF-
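
The cited paper builds a master saliency map from several modality-specific feature maps and attends to its maximum. The toy C++/OpenCV sketch below illustrates only that combination step; the map sizes, weights, and stimuli are invented, and this is not the attention_distributed code.

    #include <opencv2/core.hpp>
    #include <iostream>

    int main() {
        // Stand-ins for modality-specific conspicuity maps (e.g. colour, motion).
        cv::Mat colour = cv::Mat::zeros(240, 320, CV_32F);
        cv::Mat motion = cv::Mat::zeros(240, 320, CV_32F);
        colour.at<float>(120, 200) = 0.8f;   // invented colour pop-out
        motion.at<float>(60, 100)  = 1.0f;   // invented moving stimulus

        // Master saliency map as a weighted sum of the feature maps.
        cv::Mat saliency = 0.4f * colour + 0.6f * motion;

        // Winner-take-all: the map maximum becomes the next fixation target.
        double maxVal = 0.0;
        cv::Point maxLoc;
        cv::minMaxLoc(saliency, nullptr, &maxVal, nullptr, &maxLoc);
        std::cout << "attend to pixel (" << maxLoc.x << ", " << maxLoc.y
                  << "), saliency " << maxVal << "\n";
        return 0;
    }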



The iCub reaching/grasping controller

[Image: cart.jpg | the Cartesian controller]
  • The main Doxygen documentation of the reaching controller is available as the iCub application called armCartesianController and can be browsed here: http://wiki.icub.org/iCub/dox/html/group__icub__armCartesianController.html. The implementation also includes the multi-referential approach used for learning the body schema (see below; a usage sketch of the Cartesian interface follows this section).
  • Videos:
    • icub_grasps.wmv (http://wiki.icub.org/misc/icubvideos/icub_grasps_sponges.wmv)
  • Papers:
    • None, but details have been included in the progress report -PDF-
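
As a usage illustration, the sketch below commands a reach through YARP's Cartesian client interface, which is how the armCartesianController is normally driven from client code. The remote/local port names (right arm on the simulator) and the target pose are assumptions.

    #include <yarp/os/Network.h>
    #include <yarp/os/Property.h>
    #include <yarp/dev/PolyDriver.h>
    #include <yarp/dev/CartesianControl.h>
    #include <yarp/sig/Vector.h>

    int main() {
        yarp::os::Network yarp;
        yarp::os::Property opt;
        opt.put("device", "cartesiancontrollerclient");  // client side of the controller
        opt.put("remote", "/icubSim/cartesianController/right_arm");  // assumed server
        opt.put("local",  "/reachExample/right_arm");

        yarp::dev::PolyDriver driver(opt);
        if (!driver.isValid()) return 1;

        yarp::dev::ICartesianControl *icart = nullptr;
        driver.view(icart);                              // acquire the interface

        yarp::sig::Vector x(3), o(4);
        x[0] = -0.3; x[1] = 0.1; x[2] = 0.1;             // assumed target [m], root frame
        o[0] = 0.0; o[1] = 0.0; o[2] = 1.0; o[3] = 3.14; // axis-angle orientation
        icart->goToPose(x, o);                           // command the reach
        icart->waitMotionDone(0.1, 10.0);                // poll at 10 Hz, 10 s timeout
        driver.close();
        return 0;
    }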



Crawling

[Image: the iCub crawling]
  • The main Doxygen documentation of the crawling controller is available as an iCub application (the application name and link are still missing; an illustrative oscillator sketch follows this section)
  • Videos:
    • crawling.wmv (http://wiki.icub.org/misc/icubvideos/first_crawl.wmv), a few steps with the crawling controller (old video).
  • Papers (and more):
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • Degallier, S. and Ijspeert, A.J., Modeling Discrete and Rhythmic Movement through Motor Primitives: A Review. 2009. Submitted to Biological Cybernetics. -PDF-
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-
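
The controllers described in these papers are built from coupled oscillators (motor primitives). As a minimal illustration of the rhythmic primitive only, and not of the actual crawling controller, the sketch below integrates a single Hopf-style limit-cycle oscillator; all gains and the frequency are arbitrary.

    #include <cstdio>

    int main() {
        double x = 0.01, y = 0.0;            // oscillator state
        const double mu    = 1.0;            // squared target amplitude
        const double omega = 6.2831853;      // angular frequency: 1 Hz
        const double gamma = 5.0;            // attraction gain to the limit cycle
        const double dt    = 0.001;          // Euler step [s]

        for (int i = 0; i < 2000; ++i) {     // simulate 2 s
            double r2 = x * x + y * y;
            double dx = gamma * (mu - r2) * x - omega * y;
            double dy = gamma * (mu - r2) * y + omega * x;
            x += dx * dt;
            y += dy * dt;
            if (i % 200 == 0)
                std::printf("t=%.1f s  rhythmic setpoint=%.3f\n", i * dt, x);
        }
        return 0;
    }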

Drumming uses the same modules as crawling

[Image: drumming.jpg | the iCub drumming]
  • The main Doxygen documentation of the drumming controller is available as the iCub application called drummingEPFL and can be browsed here: http://wiki.icub.org/iCub/dox/html/group__icub__drummingEPFL.html (a primitive-superposition sketch follows this section)
  • Videos, various videos of the robot drumming (old videos):
    • drumming1.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart1.wmv)
    • drumming2.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart2.wmv)
    • drumming3.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart3.wmv)
    • drumming4.wmv (http://wiki.icub.org/misc/icubvideos/icubDrumPart4.wmv)
    • automatica08.wmv (http://wiki.icub.org/misc/icubvideos/automatica08-edited.wmv)
  • Papers (and more):
    • -Deliverable 3.4-
    • -Deliverable 3.8-
    • A presentation on the controller structure: presentation.pdf
    • S. Degallier, L. Righetti, L. Natale, F. Nori, G. Metta and A. Ijspeert. A modular bio-inspired architecture for movement generation for the infant-like robot iCub. In Proceedings of the second IEEE RAS / EMBS International Conference on Biomedical Robotics and Biomechatronics (BioRob), 2008. -PDF-
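
The architecture cited above generates drumming by superimposing a discrete primitive (moving the limb to the drumming posture) and a rhythmic primitive (the strokes). The toy sketch below illustrates that superposition with invented constants; it is not the drummingEPFL implementation.

    #include <cmath>
    #include <cstdio>

    int main() {
        const double dt = 0.001;
        // Discrete primitive: critically damped 2nd-order system moving the
        // oscillation centre g towards the drumming posture gd (no overshoot).
        double g = 0.0, gdot = 0.0;
        const double gd = 0.5, k = 20.0;
        // Rhythmic primitive: sinusoidal strokes around the centre.
        double phi = 0.0;
        const double amp = 0.3, omega = 6.2831853 * 2.0;   // 2 Hz strokes

        for (int i = 0; i < 3000; ++i) {                   // simulate 3 s
            double gddot = k * (k / 4.0 * (gd - g) - gdot);
            gdot += gddot * dt;
            g    += gdot  * dt;
            phi  += omega * dt;
            double joint = g + amp * std::sin(phi);        // superposed setpoint
            if (i % 500 == 0)
                std::printf("t=%.1f s  joint=%.3f (centre %.3f)\n", i * dt, joint, g);
        }
        return 0;
    }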



Force control on the iCub

[Image: the iCub v2.0 (torque control)]
  • The main Doxygen documentation of the force control demonstration is available as the iCub application called forceControl (link still missing; a torque-interface sketch follows this section)
  • Videos:
    • A preliminary video of some related research (iCub v2.0) is available here: http://wiki.icub.org/misc/icubvideos/torqueControl.wmv
  • Papers:
    • Parmiggiani, A., Randazzo, M., Natale, L., Metta, G., Sandini, G. Joint torque sensing for the upper-body of the iCub humanoid robot. IEEE-RAS International Conference on Humanoid Robots. (2009). -PDF-
    • Fumagalli M., Gijsberts, A., Ivaldi, S., Jamone, L., Metta, G., Natale, L., Nori, F., Sandini, G. Learning to Exploit Proximal Force Sensing: a Comparison Approach. From Motor Learning to Interaction Learning in Robots. (2010). -PDF-
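
For a flavour of how joint torques are accessed in software, the sketch below uses YARP's standard ITorqueControl interface. The robot/part names and the joint index are assumptions, and the actual force control application may work differently.

    #include <yarp/os/Network.h>
    #include <yarp/os/Property.h>
    #include <yarp/dev/PolyDriver.h>
    #include <yarp/dev/ControlBoardInterfaces.h>
    #include <cstdio>

    int main() {
        yarp::os::Network yarp;
        yarp::os::Property opt;
        opt.put("device", "remote_controlboard");
        opt.put("remote", "/icub/left_arm");        // assumed robot and part
        opt.put("local",  "/torqueExample/left_arm");

        yarp::dev::PolyDriver driver(opt);
        if (!driver.isValid()) return 1;

        yarp::dev::ITorqueControl *itrq = nullptr;
        yarp::dev::IControlMode   *imode = nullptr;
        driver.view(itrq);
        driver.view(imode);

        const int joint = 3;                            // assumed: elbow
        imode->setControlMode(joint, VOCAB_CM_TORQUE);  // switch to torque control
        double tau = 0.0;
        itrq->getTorque(joint, &tau);                   // measured joint torque [Nm]
        std::printf("measured torque: %.3f Nm\n", tau);
        itrq->setRefTorque(joint, 0.1);                 // small assumed reference [Nm]
        driver.close();
        return 0;
    }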



Body schema learning

[Image: reach.jpg | iCub reaching]
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed here: http://wiki.icub.org/iCub/dox/html/group__icub__lasabodyschema.html (a toy adaptation sketch follows this section)
  • Videos (older):
    • bodyschema.avi (http://wiki.icub.org/misc/icubvideos/body_schema_sim.avi), this video shows the learning procedure in simulation; also available as a .wmv file (http://wiki.icub.org/misc/icubvideos/body_schema.wmv).
    • fingerreach.avi (http://wiki.icub.org/misc/icubvideos/icub_finger_s.avi) or fingerreach.wmv (http://wiki.icub.org/misc/icubvideos/icub_finger_s.wmv), learning to reach using a different effector (a different finger as the end-point).
    • gazing.avi (http://wiki.icub.org/misc/icubvideos/icub_gazing_s.avi) or gazing.wmv (http://wiki.icub.org/misc/icubvideos/icub_gazing.wmv), learning to gaze appropriately (head-eye coordination).
    • reaching.avi (http://wiki.icub.org/misc/icubvideos/icub_reaching_s.avi) or reaching.wmv (http://wiki.icub.org/misc/icubvideos/iCub-reach-epfl.wmv), learning to reach (whole body).
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics, (2008). -PDF-
    • M. Hersch, Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). link
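
The cited work adapts the robot's kinematic model online from observed reaching errors. As a caricature of that idea only, and not the lasaBodySchema algorithm, the sketch below estimates a single link length by stochastic gradient descent on the hand-position prediction error; all constants are invented.

    #include <cmath>
    #include <cstdio>
    #include <cstdlib>

    int main() {
        const double trueL = 0.25;     // actual link length [m], hidden from the model
        double L = 0.15;               // initial, wrong, model estimate
        const double eta = 0.1;        // learning rate

        for (int t = 0; t < 200; ++t) {
            double q = (double(std::rand()) / RAND_MAX) * 3.14159;  // random joint angle
            // Observed hand x-position with a little sensor noise.
            double noise = 0.005 * (double(std::rand()) / RAND_MAX - 0.5);
            double xObs  = trueL * std::cos(q) + noise;
            double xPred = L * std::cos(q);                         // model prediction
            // Stochastic gradient step on the squared prediction error.
            L -= eta * 2.0 * (xPred - xObs) * std::cos(q);
            if (t % 50 == 0)
                std::printf("iter %3d: estimated length %.3f m\n", t, L);
        }
        std::printf("final estimate %.3f m (true %.3f m)\n", L, trueL);
        return 0;
    }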



Predictive gaze

[Image: the iCub smooth pursuit]
  • The main Doxygen documentation of the predictive gaze is available as an iCub application (the application name and link are still missing; a toy predictor sketch follows this section)
  • Videos (older):
    • video1.avi (http://wiki.icub.org/misc/icubvideos/SmoothPursuitAndOcclusion.avi), this video shows tracking across occlusions.
    • video2.avi (http://wiki.icub.org/misc/icubvideos/Part_A_SmoothPursuit.avi), same as above from a different point of view.
    • video3.avi (http://wiki.icub.org/misc/icubvideos/Part_B_SmoothPursuit.avi), ditto.
  • Papers:
    • Falotico E., Taiana M., Zambrano A., Bernardino A., Santos-Victor J., Laschi C., Dario P., Predictive tracking across occlusions on the iCub robot. 9th IEEE-RAS International Conference on Humanoid Robots, 7-10th December 2009, Paris, France. -PDF-
    • Zambrano D., Falotico E., Manfredi L., Laschi C., A model of the smooth pursuit eye movement with prediction and learning. Applied Bionics and Biomechanics. 2009 (accepted).
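
The cited smooth pursuit model keeps tracking through occlusions by predicting target motion. The toy sketch below shows the concept with a simple alpha-beta (position/velocity) tracker that extrapolates while the target is hidden; the gains, rates, and target motion are invented.

    #include <cstdio>

    int main() {
        const double dt = 0.05;               // 20 Hz control loop
        double xTrue = 0.0, vTrue = 0.2;      // target moving at 0.2 m/s
        double xEst  = 0.0, vEst  = 0.0;      // tracker state
        const double alpha = 0.4, beta = 0.2; // alpha-beta filter gains

        for (int k = 0; k < 60; ++k) {        // 3 s; occlusion between 1 s and 2 s
            xTrue += vTrue * dt;
            bool visible = (k < 20 || k >= 40);
            xEst += vEst * dt;                // predict with the velocity estimate
            if (visible) {                    // correct only when the target is seen
                double r = xTrue - xEst;      // innovation
                xEst += alpha * r;
                vEst += beta * r / dt;
            }
            if (k % 10 == 0)
                std::printf("t=%.2f s  %s  gaze=%.3f  target=%.3f\n",
                            k * dt, visible ? "visible " : "occluded", xEst, xTrue);
        }
        return 0;
    }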



Bimanual coordination

[Image: iCub bimanual reaching]
  • The main Doxygen documentation of the bimanual coordination experiments is available as an iCub application (the application name and link are still missing; a force-field sketch follows this section)
  • Papers:
    • Mohan, V., Morasso, P., Metta, G., Sandini, G. A biomimetic, force-field based computational model for motion planning and bimanual coordination in humanoid robots. Autonomous Robots. (in press). pp.1-46 (2009). -PDF-
  • Videos:
    • ICubAutRobs.wmv (http://wiki.icub.org/misc/icubvideos/ICubAutRobs.wmv), a video showing bimanual coordination in reaching and manipulation of simple objects.
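
The cited model plans motion by letting a virtual force field attract the end-effectors towards their goals. The sketch below illustrates the core mechanism on a single planar 2-link arm, mapping the virtual force to joint motion through the Jacobian transpose; the link lengths, stiffness, and unit admittance are invented, and the real model covers the whole body and both arms.

    #include <cmath>
    #include <cstdio>

    int main() {
        double q1 = 0.3, q2 = 0.5;             // joint angles [rad]
        const double l1 = 0.30, l2 = 0.25;     // link lengths [m]
        const double xd = 0.35, yd = 0.25;     // goal for the hand [m]
        const double K = 1.0, dt = 0.01;       // field stiffness, time step

        for (int i = 0; i <= 400; ++i) {
            // Forward kinematics of the planar 2-link arm.
            double x = l1 * std::cos(q1) + l2 * std::cos(q1 + q2);
            double y = l1 * std::sin(q1) + l2 * std::sin(q1 + q2);
            if (i % 100 == 0)
                std::printf("t=%.1f s  hand=(%.3f, %.3f)\n", i * dt, x, y);
            // Virtual elastic force pulling the hand towards the goal.
            double fx = K * (xd - x), fy = K * (yd - y);
            // Map the force to joint space through the Jacobian transpose.
            double j11 = -l1 * std::sin(q1) - l2 * std::sin(q1 + q2);
            double j12 = -l2 * std::sin(q1 + q2);
            double j21 =  l1 * std::cos(q1) + l2 * std::cos(q1 + q2);
            double j22 =  l2 * std::cos(q1 + q2);
            q1 += (j11 * fx + j21 * fy) * dt;  // qdot = J^T f (unit admittance)
            q2 += (j12 * fx + j22 * fy) * dt;
        }
        return 0;
    }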



Active stereo matching

[Image: disparity.jpg | binocular disparity processing]
  • The main Doxygen documentation of the active stereo matching is available as the following iCub module (a generic block-matching sketch follows this section):
    • DisparityMapModule (http://wiki.icub.org/iCub/dox/html/group__icub__DisparityMapModule.html)
  • Papers:
    • Martinez, H., Lungarella, M. and Pfeifer, R., Influence of the sensor morphology in the generation of coordinated behavior as a function of information structure. Technical Report, AI-Lab, University of Zurich, Switzerland, November 2009. -PDF-
  • Videos:
    • logpolar.wmv (http://wiki.icub.org/misc/icubvideos/zd_tracking.mp4), this video shows how the zero disparity tracking keeps the robot looking at the interesting object, which enables the matching process.
    • tracking.wmv (http://wiki.icub.org/misc/icubvideos/StereoTrackerWithDisparityData.wmv), a video showing generic tracking using zero disparity filtering and the use of disparity data.
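
As a generic stand-in for dense disparity computation (not the DisparityMapModule implementation), the sketch below runs OpenCV's block matcher on a rectified stereo pair; the file names and matcher parameters are assumptions.

    #include <opencv2/imgcodecs.hpp>
    #include <opencv2/calib3d.hpp>
    #include <cstdio>

    int main() {
        // Assumed file names; the pair must be rectified and grayscale.
        cv::Mat left  = cv::imread("left.png",  cv::IMREAD_GRAYSCALE);
        cv::Mat right = cv::imread("right.png", cv::IMREAD_GRAYSCALE);
        if (left.empty() || right.empty()) {
            std::fprintf(stderr, "could not load the stereo pair\n");
            return 1;
        }
        // 64 disparity levels, 15x15 matching window (assumed parameters).
        cv::Ptr<cv::StereoBM> bm = cv::StereoBM::create(64, 15);
        cv::Mat disp16;
        bm->compute(left, right, disp16);           // fixed-point disparities (x16)
        cv::Mat disp;
        disp16.convertTo(disp, CV_32F, 1.0 / 16.0); // true pixel disparities
        std::printf("disparity map: %d x %d\n", disp.cols, disp.rows);
        return 0;
    }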