Talking Robots

How can we create a man-made device that can display human language behavior?

Resources

 * An extensive robotics links page by Noel Sharkey.

Past Research

 * Sharkey, N.E. and Ziemke, T. (1998) A consideration of the biological and psychological foundations of autonomous robotics. Connection Science, 10, 361-391.

Abstract: The new wave of robotics aims to provide robots with the capacity to learn, develop and evolve in interaction with their environments using biologically inspired techniques.


 * Luc Steels

The basic idea behind this work is that a community of language users (further called agents) can be viewed as a complex adaptive system which collectively solves the problem of developing a shared communication system.

Steels, L. and Vogt, P. (1997) Grounding adaptive language games in robotic agents. In Harvey, P. and Husbands, P. (eds.), Proceedings of the 4th European Conference on Artificial Life (ECAL 97).

Abstract. The paper addresses the question of how a group of physically embodied robotic agents may originate meaning and language through adaptive language games. The main principles underlying the approach are sketched, as well as the steps needed to implement these principles on physical agents. Some experimental results based on this implementation are presented. (PDF)
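The collective dynamics described above can be illustrated with a minimal "naming game" simulation, in the spirit of Steels' adaptive language games. This is a sketch under simplifying assumptions, not the paper's actual robotic implementation: agents are abstract (no embodiment or perception), and the agent count, object count, and update rules are illustrative choices.

```python
import random

def naming_game(num_agents=10, num_objects=5, rounds=5000, seed=0):
    """Minimal naming game: a population of agents converges on
    shared names for objects through repeated pairwise interactions."""
    rng = random.Random(seed)
    # Each agent keeps a vocabulary: object -> set of candidate names.
    agents = [{obj: set() for obj in range(num_objects)}
              for _ in range(num_agents)]
    invented = 0
    for _ in range(rounds):
        # Pick a random speaker-hearer pair and a topic object.
        speaker, hearer = rng.sample(range(num_agents), 2)
        obj = rng.randrange(num_objects)
        if not agents[speaker][obj]:
            # Speaker has no name for this object yet: invent one.
            invented += 1
            agents[speaker][obj].add(f"w{invented}")
        name = rng.choice(sorted(agents[speaker][obj]))
        if name in agents[hearer][obj]:
            # Communicative success: both agents discard competing names.
            agents[speaker][obj] = {name}
            agents[hearer][obj] = {name}
        else:
            # Failure: the hearer adopts the speaker's name.
            agents[hearer][obj].add(name)
    return agents

if __name__ == "__main__":
    population = naming_game()
    for obj in range(5):
        shared = set.intersection(*(a[obj] for a in population))
        print(f"object {obj}: globally shared names = {sorted(shared)}")
```

The key point, matching the "complex adaptive system" framing, is that no agent dictates the lexicon: a shared vocabulary emerges purely from local invent/adopt/prune interactions.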

Current Research

 * MirrorBot: Biomimetic multimodal learning in a mirror neuron-based robot. Mirror neuron areas correspond to cortical areas related to human language centres (e.g. Broca's area) and could provide a cortical substrate for the integration of vision, language and action.
 * Cornelius Weber

Towards multimodal neural robot learning. Robotics and Autonomous Systems, 47 (2004), 171–175.

Abstract. Learning by multimodal observation of vision and language offers a potentially powerful paradigm for robot learning. Recent experiments have shown that 'mirror' neurons are activated when an action is performed, perceived, or verbally referred to. Different input modalities are processed by distributed cortical neuron ensembles for leg, arm and head actions. In this overview paper we consider this evidence from mirror neurons by integrating motor, vision and language representations in a learning robot.