11th Annual Conference of the International Speech Communication Association

Makuhari, Chiba, Japan
September 26-30, 2010

Learning Naturally Spoken Commands for a Robot

Anja Austermann (1), Seiji Yamada (1), Kotaro Funakoshi (2), Mikio Nakano (2)

(1) Sokendai, Japan
(2) Honda Research Institute Japan Co. Ltd., Japan

Enabling a robot to understand naturally spoken commands is a key challenge in human-robot interaction: it must be solved before novice users can interact with robots smoothly and intuitively. We propose a method that lets a robot learn how its user utters commands, so that it can adapt to individual differences in speech usage. The method combines a stimulus-encoding phase, which uses hidden Markov models to encode speech sounds into units that model similar utterances, with a stimulus-association phase based on classical conditioning, which associates these models with their symbolic representations. Using this method, the robot learns how its user utters parameterized commands, such as "Please put the book in the bookshelf" or "Can you clean the table for me?", through situated interaction with the user.
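The abstract's stimulus-association phase is based on classical conditioning. As the paper does not spell out the update rule here, the following is a minimal hypothetical sketch using the standard Rescorla-Wagner model of classical conditioning: association strengths between encoded utterance units (stimuli) and a command symbol grow with each co-occurrence, with the prediction error shared across the stimuli present in a trial. All names and parameters below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: Rescorla-Wagner-style associative learning between
# encoded utterance units (stimuli) and a command symbol. This illustrates
# generic classical conditioning, not the authors' exact algorithm.

def rescorla_wagner_update(strengths, present_stimuli, reward, alpha=0.1, lam=1.0):
    """Update association strengths for the stimuli present in one trial.

    strengths: dict mapping stimulus unit -> association strength with the symbol
    present_stimuli: utterance units observed in this interaction
    reward: 1.0 if the command symbol co-occurred in this trial, else 0.0
    alpha: learning rate; lam: asymptote of learning
    """
    # Prediction is the summed strength of all stimuli present in the trial.
    prediction = sum(strengths.get(s, 0.0) for s in present_stimuli)
    # Each present stimulus shares the same prediction error (lam*reward - prediction).
    for s in present_stimuli:
        strengths[s] = strengths.get(s, 0.0) + alpha * (reward * lam - prediction)
    return strengths

strengths = {}
# Repeated pairings of the units "clean" and "table" with a CLEAN command
# drive their combined association strength toward the asymptote lam = 1.0.
for _ in range(20):
    rescorla_wagner_update(strengths, ["clean", "table"], reward=1.0)
print(strengths)
```

Because both stimuli appear in every trial, they accumulate equal strengths and their sum converges toward 1.0; a stimulus paired with the symbol only rarely would end up with a correspondingly weaker association.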


Bibliographic reference. Austermann, Anja / Yamada, Seiji / Funakoshi, Kotaro / Nakano, Mikio (2010): "Learning naturally spoken commands for a robot", in INTERSPEECH-2010, 2506-2509.