Latest revision as of 10:48, 20 June 2013


Software Implementation of the iCub Imitation and Communication skills (version 2.0)
(Image: Icubijcai.jpg)

This is the addendum to Deliverable 5N.2, the second release of part of the iCub Cognitive Architecture. A placeholder document has been added to the RobotCub.org website with a pointer to this page. The demonstrations are provided here as videos.


The software implementation is a collection of applications composed of YARP modules: each application realizes a given behaviour and runs independently on the iCub. The applications are available from the iCub applications documentation (http://wiki.icub.org/iCub/dox/html/group__icub__applications.html); the modules are also described in the iCub modules documentation (http://wiki.icub.org/iCub/dox/html/group__icub__module.html).
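Since each behaviour is packaged as a self-contained application of YARP modules, a minimal sketch of such an application description is shown below. It assumes the standard yarpmanager XML format; the application, module, node, and port names are illustrative only, not taken from the actual deliverable applications.

```xml
<application>
  <!-- Human-readable name shown by yarpmanager -->
  <name>exampleBehaviourApp</name>

  <!-- Each <module> entry launches one YARP module on a given node -->
  <module>
    <name>exampleGrabber</name>
    <parameters>--from grabber.ini</parameters>
    <node>icub-head</node>
  </module>
  <module>
    <name>exampleBehaviourModule</name>
    <parameters>--from behaviour.ini</parameters>
    <node>icub-lab</node>
  </module>

  <!-- Port connections wired up when the application is started -->
  <connection>
    <from>/exampleGrabber/img:o</from>
    <to>/exampleBehaviourModule/img:i</to>
    <protocol>tcp</protocol>
  </connection>
</application>
```

With a YARP name server running, a file like this can be loaded into yarpmanager, which starts the listed modules on their nodes and makes the port connections; each application can therefore be run and stopped independently of the others.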


Important note

  • Please note that your browser may not display/embed the videos correctly because of codec and/or player incompatibilities. In that case, we recommend downloading them to your computer and playing them with your favourite media player software.



Human-robot interaction

iCub interaction game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called ihaNew and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__iha2.html
  • Video:
    • ihaNew_short_web.mov (http://wiki.icub.org/misc/icubvideos/ihaNew_short_web.mov)
    • ihaNew_short2_web.mov (http://wiki.icub.org/misc/icubvideos/ihaNew_short2_web.mov)
  • Papers:
    • Broz, F., Kose-Bagci, H., Nehaniv, C.L., Dautenhahn, K., Learning behavior for a social interaction game with a childlike humanoid robot, Social Learning in Interactive Scenarios Workshop, Humanoids 2009, Paris, France, 7 December, 2009. here

Human-robot interaction uses the interaction histories architecture

the Peekaboo game
  • The main Doxygen documentation of the interaction histories experiment is available as the iCub application called iha_manual and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__iha.html
  • Video:
    • iha.wmv (http://wiki.icub.org/misc/icubvideos/iha.wmv), full video of the experiment.
  • Papers (and more):
    • -Deliverable 6.4-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., and te Boekhorst, R. 2005. Using sensory-motor phase-plots to characterise robot-environment interactions. In Proc. of 6th IEEE International Symposium on Computational Intelligence in Robotics and Automation. -PDF-
    • Mirza, N. A., Nehaniv, C. L., Dautenhahn, K., te Boekhorst, R., Developing Social Action Capabilities in a Humanoid Robot using an Interaction History Architecture. Proc. IEEE-RAS Humanoids 2008. -PDF-



Imitation learning, refinement and reuse

imitation learning on the iCub
  • The main Doxygen documentation of the imitation learning application is available as the iCub application called Imitation learning, refinement and reuse and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__lasaImitation.html
  • Videos:
    • iCub_LasaImitation.avi (http://wiki.icub.org/misc/icubvideos/iCub_LasaImitation.avi): this video shows the imitation learning procedure as well as the process of policy refinement and reuse.
  • Papers:
    • Brenna D. Argall, Eric L. Sauser and Aude G. Billard. Tactile Guidance for Policy Refinement and Reuse. under preparation (2010) -PDF-

Body schema learning

iCub reaching
  • The main Doxygen documentation of the body schema is available as the iCub application called lasaBodySchema and can be browsed at http://wiki.icub.org/iCub/dox/html/group__icub__lasabodyschema.html
  • Videos (older):
    • bodyschema.avi (http://wiki.icub.org/misc/icubvideos/body_schema_sim.avi), the learning procedure in simulation; also available as a .wmv file (http://wiki.icub.org/misc/icubvideos/body_schema.wmv).
    • fingerreach.avi (http://wiki.icub.org/misc/icubvideos/icub_finger_s.avi) or fingerreach.wmv (http://wiki.icub.org/misc/icubvideos/icub_finger_s.wmv), learning to reach using a different effector (a different finger as the end-point).
    • gazing.avi (http://wiki.icub.org/misc/icubvideos/icub_gazing_s.avi) or gazing.wmv (http://wiki.icub.org/misc/icubvideos/icub_gazing.wmv), learning to gaze appropriately (head-eye coordination).
    • reaching.avi (http://wiki.icub.org/misc/icubvideos/icub_reaching_s.avi) or reaching.wmv (http://wiki.icub.org/misc/icubvideos/iCub-reach-epfl.wmv), learning to reach (whole body).
  • Papers:
    • M. Hersch, E. Sauser and A. Billard. Online learning of the body schema. International Journal of Humanoid Robotics (2008). PDF: http://www.robotcub.org/misc/papers/08_Hersch_Sauser_Billard.pdf
    • M. Hersch. Adaptive sensorimotor peripersonal space representation and motor learning for a humanoid robot. PhD thesis (2009). http://library.epfl.ch/theses/?nr=4289