iCub Cognitive Architecture
The iCub cognitive architecture is the result of a detailed design process founded on the developmental psychology and neurophysiology of humans, capturing much of what is known about the neuroscience of action, perception, and cognition. This process and its final outcome are documented in Deliverable D2.1: A Roadmap for the Development of Cognitive Capabilities in Humanoid Robots.
The architecture itself is realized as a set of YARP executables, typically connected by YARP ports. Early prototypes were developed at a RobotCub project meeting at the University of Hertfordshire in July 2007 as an exercise in consolidating the software development effort of the project partners. Several subsequent versions were produced at the RobotCub Summer School 2007 VVV '07. These prototypes were developed in parallel with the D2.1 Roadmap effort mentioned above. These two strands of design effort converged in the cognitive architecture shown below (version 1.1). Previous versions can be accessed via the links at the end of the page.
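As an illustration of this module-and-port structure, the following minimal sketch shows one YARP process opening an output port, connecting it to another module's input port, and writing a single message. The port names and message content are hypothetical, chosen only to show the mechanism; in a deployed iCub application the connections between modules are normally made externally (e.g. with yarp connect or an application script) rather than in the module's own code.

```cpp
#include <yarp/os/all.h>

using namespace yarp::os;

int main() {
    Network yarp;  // initialize YARP and register with the name server

    // Output port such as a salience module might expose.
    // The port names are illustrative, not those of the real iCub application.
    BufferedPort<Bottle> out;
    out.open("/salience/peak:o");

    // Connect to a hypothetical gaze-control input port (done here in code
    // only for brevity).
    Network::connect("/salience/peak:o", "/gazeControl/target:i");

    // Publish one message: the image coordinates of the most salient point.
    Bottle& b = out.prepare();
    b.clear();
    b.addInt(160);
    b.addInt(120);
    out.write();

    return 0;
}
```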
The immediate purpose of developing the cognitive architecture is to create a core software infrastructure for the iCub so that it will be able to exhibit a set of target behaviours for the Empirical Investigations.
Differences from previous version
- Removed the tracker (should be handled by attention/salience sub-system)
- Removed the face localization (should be handled by attention/salience sub-system)
- Removed the hand localization (should be handled by attention/salience sub-system)
- Removed the sound localization (should be handled by salience module)
- Removed the attention selection
- Added Exogenous Salience and Endogenous Salience
- Added Locomotion
- Added Matching
- Added Auto-associative memory
- Added Hetero-associative procedural memory
- Added Affective state
- Added Action selection
Notes
- Gaze implies 7 DoF: head and eyes
- Locomotion paradigm: “go where you are looking”
- Reaching paradigm: “reach where you are looking”
- Endogenous and exogenous salience implies salience based on internal and external events, respectively
- Gaze, reaching, and locomotion motor activities condition endogenous salience: i.e. motor states condition attention
- Sensory inputs condition exogenous salience, which in turn drives gaze and action selection: i.e. attention conditions motor states
- Auto-associative memory storage is conditioned by poor matching and high salience (see the storage-gating sketch after these notes)
- Procedural memory is defined to mean perception-action event sequence
- Procedural memory recall (see the procedural memory sketch after these notes):
  - Event A & Event D inputs recall the sequence of intermediate events
  - Event A input alone recalls Event B (the subsequent event)
- Affective state is a competitive network of three motives (see the affective state and action selection sketch after these notes):
  - Distraction (exogenous salience prevalent)
  - Curiosity / Exploration (endogenous salience prevalent)
  - Social engagement (exogenous and endogenous salience balanced)
- Action selection is not a winner-take-all process: one or more actions are disinhibited
- The developmental drive is to construct a procedural memory that improves prediction
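The storage-gating note above can be sketched as a simple condition: a percept is written to the auto-associative memory only when it matches stored content poorly (i.e. it is novel) and is highly salient. The function name and threshold values below are assumptions made for illustration, not parameters of the architecture.

```cpp
// Sketch of the storage-gating rule: store a percept in the auto-associative
// memory only when matching is poor (novelty) and salience is high.
// The thresholds are illustrative assumptions.
bool shouldStore(double matchScore, double salience) {
    const double matchThreshold    = 0.3;  // below this the percept is considered novel
    const double salienceThreshold = 0.7;  // above this it is considered worth storing
    return matchScore < matchThreshold && salience > salienceThreshold;
}
```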
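The two recall modes of the hetero-associative procedural memory can be sketched as follows. The class interface and the use of string event labels are illustrative assumptions; they are not part of the iCub software.

```cpp
#include <cstddef>
#include <string>
#include <vector>

// Minimal sketch of a hetero-associative procedural memory: stored
// perception-action event sequences, with the two recall modes listed in
// the notes above.
class ProceduralMemory {
public:
    // Store one perception-action event sequence, e.g. {"A", "B", "C", "D"}.
    void store(const std::vector<std::string>& sequence) {
        sequences.push_back(sequence);
    }

    // Recall mode 1: given the current event, return the subsequent event
    // (input A recalls B).
    std::string recallNext(const std::string& event) const {
        for (const auto& seq : sequences)
            for (std::size_t i = 0; i + 1 < seq.size(); ++i)
                if (seq[i] == event)
                    return seq[i + 1];
        return "";  // no prediction available
    }

    // Recall mode 2: given a start and an end event, return the intermediate
    // events (inputs A and D recall B, C).
    std::vector<std::string> recallBetween(const std::string& start,
                                           const std::string& end) const {
        for (const auto& seq : sequences) {
            std::vector<std::string> intermediate;
            bool inside = false;
            for (const auto& e : seq) {
                if (e == start) { inside = true; continue; }
                if (inside && e == end) return intermediate;
                if (inside) intermediate.push_back(e);
            }
        }
        return {};  // the pair of events was not found in any stored sequence
    }

private:
    std::vector<std::vector<std::string>> sequences;
};
```

For a stored sequence A, B, C, D, recallNext("A") returns "B", and recallBetween("A", "D") returns the intermediate events B and C.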
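The affective state and action selection notes can be sketched together: a simple competitive scheme over the three motives, driven by the balance of exogenous and endogenous salience, followed by a threshold-based disinhibition step that can release more than one action at a time. The formulae and threshold values are assumptions made for illustration, not the mechanisms used in the iCub architecture.

```cpp
#include <algorithm>
#include <cmath>
#include <string>
#include <utility>
#include <vector>

// The three competing motives of the affective state. Which one dominates
// depends on the balance of exogenous and endogenous salience.
struct AffectiveState {
    double distraction;       // exogenous salience prevalent
    double curiosity;         // endogenous salience prevalent
    double socialEngagement;  // exogenous and endogenous salience balanced
};

// A simple competitive scheme: each motive's activation reflects how far the
// salience balance lies in its favour (illustrative formulae only).
AffectiveState competeMotives(double exogenous, double endogenous) {
    double diff = exogenous - endogenous;
    AffectiveState s;
    s.distraction      = std::max(0.0,  diff);
    s.curiosity        = std::max(0.0, -diff);
    s.socialEngagement = std::max(0.0, 1.0 - std::fabs(diff));
    return s;
}

// Action selection is not winner-take-all: every action whose activation
// exceeds the threshold is disinhibited, so several actions may be released
// at once. The threshold value is an assumption.
std::vector<std::string> disinhibit(
        const std::vector<std::pair<std::string, double>>& actionActivations,
        double threshold = 0.5) {
    std::vector<std::string> released;
    for (const auto& a : actionActivations)
        if (a.second > threshold)
            released.push_back(a.first);
    return released;
}
```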
A more detailed description of the behaviour of each module and circuit in this architecture will be added in due course (both here on the iCub wiki and in Deliverable D2.1: A Roadmap for the Development of Cognitive Capabilities in Humanoid Robots).
Links
The following links are to early versions of the iCub "software architecture": a design for an iCub application (i.e. a set of YARP modules) that approximated some of the elementary aspects of the iCub cognitive architecture, which now supersedes them. These early versions have all been deprecated, as has the title "software architecture" in this context. Software Architecture now refers, as it originally did, to the YARP system.
- iCub software architecture version 0.1
- iCub software architecture version 0.2
- iCub software architecture version 0.3
- iCub software architecture version 0.4
- See also the current draft iCub YARP module specifications
- iCub brain - current source code documentation