ISCA Archive Interspeech 2006

Developing speech dialogs for multimodal HMIs using finite state machines

Silke Goronzy, Raquel Mochales, Nicole Beringer

We present a tool for model-based development of multimodal interfaces. The HMI model captures all involved modalities, thus ensuring highly consistent interfaces. In this paper we focus on the development of speech dialogs. These are specified using state machines, in contrast to the traditional approach of using flow-charts. Using state machines makes it possible to fully specify the HMI, so that the model contains enough information both for complete simulation, without the need to connect any target applications, and for automatic target code generation. Due to these extensive simulation capabilities, usability evaluations can be conducted at very early design stages. We further explain how different dialog strategies for different user types can be developed with the help of the user modelling plug-in. The tool thus supports the whole development chain, from design studies through specification, development, and testing to usability studies and target implementation.
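To illustrate the core idea of specifying a speech dialog as a state machine rather than a flow-chart, here is a minimal sketch. The states, events, and prompts below are hypothetical examples (a destination-entry dialog), not the paper's actual model; the point is only that an explicit transition table makes the dialog fully enumerable and simulatable.

```python
class DialogFSM:
    """Minimal finite-state speech dialog.

    transitions maps (state, event) -> (next_state, prompt).
    All behavior lives in this table, so the dialog can be
    simulated or inspected without any target application.
    """

    def __init__(self, transitions, start):
        self.transitions = transitions
        self.state = start
        self.log = []  # record of (state, event, next_state) for simulation/testing

    def handle(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            # Unknown input: stay in the current state and re-prompt
            # (one possible error-handling policy, chosen for illustration).
            self.log.append((self.state, event, self.state))
            return "Sorry, I did not understand."
        next_state, prompt = self.transitions[key]
        self.log.append((self.state, event, next_state))
        self.state = next_state
        return prompt


# Hypothetical destination-entry dialog (illustrative, not from the paper).
transitions = {
    ("idle", "speech:navigate"): ("ask_city", "Which city?"),
    ("ask_city", "speech:city"): ("ask_street", "Which street?"),
    ("ask_street", "speech:street"): ("confirm", "Start guidance?"),
    ("confirm", "speech:yes"): ("idle", "Starting guidance."),
    ("confirm", "speech:no"): ("ask_city", "Which city?"),
}

fsm = DialogFSM(transitions, start="idle")
print(fsm.handle("speech:navigate"))  # -> "Which city?"
print(fsm.handle("speech:city"))      # -> "Which street?"
```

Because every transition is an explicit table entry, the full dialog can be traversed exhaustively in simulation, which reflects the property the abstract attributes to state-machine specifications.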

doi: 10.21437/Interspeech.2006-491

Cite as: Goronzy, S., Mochales, R., Beringer, N. (2006) Developing speech dialogs for multimodal HMIs using finite state machines. Proc. Interspeech 2006, paper 1544-Wed2WeS.3, doi: 10.21437/Interspeech.2006-491

@inproceedings{goronzy06_interspeech,
  author={Silke Goronzy and Raquel Mochales and Nicole Beringer},
  title={{Developing speech dialogs for multimodal HMIs using finite state machines}},
  year=2006,
  booktitle={Proc. Interspeech 2006},
  pages={paper 1544-Wed2WeS.3},
  doi={10.21437/Interspeech.2006-491}
}