Cheng abstract

From Wiki for iCub and Friends

In this presentation, we describe an architecture that supports distributed processing for the integration of multi-sensory inputs in humanoid robot control. Motivated by findings in neurophysiology, we derived a framework in which scalable, modular processing units, through their interconnections, realise progressively higher levels of processing sophistication. At this initial stage of development, we have used this framework to build a number of sub-systems for controlling our humanoid robots, ranging from visual attention and ocular-motor responses to real-time markerless hand tracking. Further investigations are now under way to realise a full coupling between perception and action for humanoid robots.
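The idea of modular units whose interconnections yield increasing sophistication can be sketched as a simple processing pipeline. This is an illustrative toy only, assuming a generic module abstraction; the names (`Module`, `connect`) and the example chain are hypothetical and not part of the framework described in the abstract.

```python
# Illustrative sketch only: composable processing units chained into a
# pipeline. All names here are hypothetical, not the actual framework.

class Module:
    """A processing unit that transforms its input and can be chained."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def __call__(self, data):
        return self.fn(data)

def connect(*modules):
    """Compose modules so that each unit's output feeds the next."""
    def pipeline(data):
        for m in modules:
            data = m(data)
        return data
    return pipeline

# Toy "attention" chain built from three simple units: normalise a list
# of saliency values, keep only the strong ones, then count survivors.
normalise = Module("normalise", lambda xs: [x / max(xs) for x in xs])
threshold = Module("threshold", lambda xs: [x for x in xs if x > 0.5])
count     = Module("count", lambda xs: len(xs))

attention = connect(normalise, threshold, count)
```

The point of the sketch is that each unit stays small and interchangeable, while the `connect` step, mirroring the interconnections described above, is what produces the more sophisticated aggregate behaviour.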