Article ID | Journal | Published Year | Pages | File Type
---|---|---|---|---
723263 | IFAC Proceedings Volumes | 2007 | 6 Pages | 
Abstract
A multimodal user interface (MMUI) is an emerging technology that aims to provide a more intuitive and natural way for people to operate and control a computer or a machine. An MMUI allows users to control a computer using various input modalities, including speech, touch, gestures and handwriting. It has the potential to minimise a user's cognitive load when performing complex tasks. In this paper we present our work in building an MMUI research platform for intelligent transport system applications, and our attempt to evaluate a user's cognitive load based on analysis of his or her multimodal behaviours and physiological measurements.
Related Topics
Physical Sciences and Engineering
Engineering
Computational Mechanics