Authors
Michael Braun, Sarah Theres Völkel, Gesa Wiegand, Thomas Puls, Daniel Steidl, Yannick Weiß, Florian Alt
Publication date
2018/11/25
Book
Proceedings of the 17th International Conference on Mobile and Ubiquitous Multimedia
Pages
383-389
Description
The control of user interfaces while driving is a textbook example of driver distraction. Modern in-car interfaces are growing in complexity and visual demand, yet they need to stay simple enough to handle while driving. One common approach to this problem is multimodal interfaces, incorporating, e.g., touch, speech, and mid-air gestures for the control of distinct features. This allows cognitive resources to be used more efficiently and can relieve the driver of potential overload. We introduce a novel modality for in-car interaction: our system allows drivers to use facial expressions to control a music player.
The results of a user study show that both implicit emotion recognition and explicit facial expressions are applicable for music control in cars. Subconscious emotion recognition could decrease distraction, while explicit expressions can be used as an alternative input modality. A simple smiling gesture showed …
Total citations
Scholar articles
M Braun, ST Völkel, G Wiegand, T Puls, D Steidl… - Proceedings of the 17th International Conference on …, 2018