ICub Cognitive Architecture
- The correct title of this article is iCub Cognitive Architecture. The initial letter is shown capitalized due to technical restrictions.
The iCub cognitive architecture is the result of a detailed design process founded on the developmental psychology and neurophysiology of humans, so that it encapsulates what is currently known about the neuroscience of action, perception, and cognition. This process and its final outcome are documented in Deliverable D2.1: A Roadmap for the Development of Cognitive Capabilities in Humanoid Robots.
The architecture itself comprises a set of YARP executables, typically connected by YARP ports. Early prototypes were developed at a RobotCub project meeting at the University of Hertfordshire in July 2007 as an exercise in consolidating the software development effort of the project partners. Several subsequent versions were produced at the RobotCub Summer School 2007 VVV '07. These prototypes were developed in parallel with the Roadmap effort mentioned above. These two strands of design effort converged in the cognitive architecture shown below (version 1.0). Previous versions can be accessed via the links at the end of the page.
The immediate purpose in developing the software architecture is to create a core software infrastructure for the iCub so that it will be able to exhibit a set of target behaviours for Experimental Investigation 1.
- Removed the
- attentionSelection is likely to expand (and/or sub-divide) into, first, automaticActionSelection, and then to
egoSphere is likely to expand (and/or sub-divide) significantly in the future. First, it might become a full expression of peripersonal space, encompassing proprioceptive and exteroceptive perception in a temporally stable fashion. This stability would be maintained despite re-orientation by the iCub. Second, it might then become some form of allocentric mechanism, effected possibly by some mutual association of many egocentric representations. This allocentric mechanism would be stable despite movement of the iCub around its environment.
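The temporal stability described above amounts to compensating stored directions for the iCub's own head movements. The following is a minimal sketch of that idea for the simplest case, a pure yaw rotation; the function names, the azimuth/elevation convention, and the restriction to yaw are illustrative assumptions, not the actual egoSphere implementation.

```python
import math

def spherical_to_cart(az, el):
    # Head-centric azimuth/elevation (radians) -> unit vector.
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))

def cart_to_spherical(x, y, z):
    # Unit vector -> head-centric azimuth/elevation (radians).
    return (math.atan2(y, x), math.asin(z))

def compensate_yaw(az, el, head_yaw):
    """Re-express a stored head-centric direction after the head has
    rotated by head_yaw radians, so the ego-sphere entry keeps pointing
    at the same external location despite the re-orientation."""
    x, y, z = spherical_to_cart(az, el)
    # Rotate the vector by -head_yaw about the vertical axis: an object
    # fixed in the world shifts by -head_yaw in head coordinates.
    c, s = math.cos(-head_yaw), math.sin(-head_yaw)
    xr, yr = c * x - s * y, s * x + c * y
    return cart_to_spherical(xr, yr, z)
```

A full peripersonal-space representation would of course need the complete head/neck kinematic chain rather than a single yaw angle, but the stabilization principle is the same.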
The controlGaze -> salience -> attentionSelection circuit is a fast retinotopic circuit. It is one of the circuits by which motoric state modulates attentional capacity.
soundLocalization at present implies simply a binaural localization of the direction of arrival of a sound, expressed in head-centric spherical coordinates. It should ultimately effect some sensori-motor capability that acts to re-orient the iCub towards a localized sound.
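For the binaural case, the standard far-field approximation maps an interaural time difference (ITD) to a head-centric azimuth via sin(az) = c·ITD/d. The sketch below illustrates that mapping; the microphone spacing, function name, and sign convention (positive ITD = left microphone leads = source to the left) are assumptions for illustration, not parameters of the actual soundLocalization module.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degrees C
MIC_SPACING = 0.14      # m; assumed distance between the iCub's microphones

def itd_to_azimuth(itd):
    """Convert an interaural time difference (seconds) to a head-centric
    azimuth (radians) using the far-field approximation sin(az) = c*itd/d."""
    x = SPEED_OF_SOUND * itd / MIC_SPACING
    # Measurement noise can push |x| slightly past 1; clamp before asin.
    x = max(-1.0, min(1.0, x))
    return math.asin(x)
```

In the architecture this azimuth (together with an elevation estimate, which binaural timing alone cannot provide) would feed the head-centric spherical coordinates mentioned above.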
faceLocalization would ideally be implemented as a 'three blob' detection; for the immediate future it is intended to utilize an OpenCV face detector.
The reaching -> salience -> egoSphere -> attentionSelection -> controlGaze circuit is the primary circuit for achieving visually-guided reaching.
Timbre, in the sense of the time-frequency characteristics of a signal, typically focussing on the pattern of growth and decay of harmonics.
Allocentric, in the sense of addressing aspects of the iCub’s sensory world that have been experienced but are not presently within its sensory compass.
At some point, we need to figure out what segmentation means, where it fits in this architecture, and how it is effected.
The same applies to object recognition.
The following links are to early versions of the iCub "software architecture", a design for an iCub application (i.e. a set of YARP modules) that approximated some of the elementary aspects of the iCub cognitive architecture which now supersedes them. These early versions have all now been deprecated, as has the title "software architecture" in this context. Software Architecture now refers, as it originally did, to the YARP system.
- iCub software architecture version 0.1
- iCub software architecture version 0.2
- iCub software architecture version 0.3
- iCub software architecture version 0.4