Towards an Autarkic Embedded Cognitive User Interface

Frank Duckhorn, Markus Huber, Werner Meyer, Oliver Jokisch, Constanze Tschöpe, Matthias Wolff


With this paper we present an overview of an autarkic embedded cognitive user interface. It is realized as an integrated device that communicates with the user via speech and gesture recognition, speech synthesis, and a touch display. Semantic processing and cognitive behaviour control support intuitive interaction and help to control arbitrary electronic devices. To ensure user privacy and to operate independently of network access, all information processing is performed on the device.


Cite as: Duckhorn, F., Huber, M., Meyer, W., Jokisch, O., Tschöpe, C., Wolff, M. (2017) Towards an Autarkic Embedded Cognitive User Interface. Proc. Interspeech 2017, 3435-3436.


@inproceedings{Duckhorn2017,
  author={Frank Duckhorn and Markus Huber and Werner Meyer and Oliver Jokisch and Constanze Tschöpe and Matthias Wolff},
  title={Towards an Autarkic Embedded Cognitive User Interface},
  year={2017},
  booktitle={Proc. Interspeech 2017},
  pages={3435--3436}
}