We present the design of a rich multimodal interface for mobile route guidance. The application provides public transport information in Finland, including support for pedestrian guidance when the user transfers between modes of transportation. The range of input and output modalities includes speech synthesis, speech recognition, a fisheye GUI, haptics, contextual text input, physical browsing, physical gestures, non-speech audio, and global positioning information. Together, these modalities provide an interface that is accessible to a wide range of users, including persons with various levels of visual impairment. In this paper, we describe the functional aspects and the interface design of our publicly available prototype system.
Bibliographic reference. Turunen, Markku / Hakulinen, Jaakko / Kainulainen, Anssi / Melto, Aleksi / Hurtig, Topi (2007): "Design of a rich multimodal interface for mobile spoken route guidance", In INTERSPEECH-2007, 2193-2196.