Perceptual user interfaces: multimodal interfaces that process what comes naturally
- 1 March 2000
- journal article
- Published by Association for Computing Machinery (ACM) in Communications of the ACM
- Vol. 43 (3), 45-53
- https://doi.org/10.1145/330534.330538
Abstract
During natural multimodal communication, we speak, gesture, gaze and move in a powerful flow of communication that bears little resemblance to the discrete keyboard and mouse clicks entered sequentially in a graphical user interface (GUI). A profound shift is occurring toward embracing users' natural behavior as the center of the human-computer interface. Multimodal interfaces are being developed that permit our highly skilled and coordinated communicative behaviors to control system interactions in a more transparent interface experience than ever before. Our voice, hands, and whole body together, once augmented by sensors such as microphones and cameras, now are the ultimate transparent and mobile multimodal input devices.
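To make the idea of coordinated multimodal input concrete, here is a minimal, hypothetical sketch (not taken from the paper) of late fusion in the spirit of the "Put-that-there" reference below: deictic words in a spoken command are bound to pointing gestures that occur close to the utterance in time. The class names, the `fuse` function, and the fixed time-window heuristic are all illustrative assumptions, not the architecture described in the article.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical event records for two input channels: recognized speech
# and pointing gestures, each timestamped so they can be aligned in time.
@dataclass
class SpeechEvent:
    text: str          # e.g. "put that there"
    timestamp: float   # seconds since session start

@dataclass
class PointEvent:
    x: float
    y: float
    timestamp: float

def fuse(speech: SpeechEvent,
         points: List[PointEvent],
         window: float = 1.5) -> Optional[dict]:
    """Late-fusion sketch: bind each deictic word ('that', 'there') in the
    utterance to a pointing gesture, in temporal order, provided the gesture
    falls within `window` seconds of the utterance."""
    deictics = [w for w in speech.text.lower().split() if w in ("that", "there")]
    nearby = sorted(
        (p for p in points if abs(p.timestamp - speech.timestamp) <= window),
        key=lambda p: p.timestamp,
    )
    if len(nearby) < len(deictics):
        return None  # not enough gestures to resolve every deictic reference
    referents: List[Tuple[str, Tuple[float, float]]] = [
        (word, (p.x, p.y)) for word, p in zip(deictics, nearby)
    ]
    return {"command": speech.text, "referents": referents}

# Example: the classic "put that there" utterance with two pointing gestures.
print(fuse(SpeechEvent("put that there", 10.2),
           [PointEvent(120, 80, 10.4), PointEvent(400, 300, 11.1)]))
```

Systems in the reference list integrate modalities with richer machinery (for example, unification-based or statistical fusion) rather than a fixed time window; this sketch only illustrates the basic binding problem.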
References
- Designing the User Interface for Multimodal Speech and Pen-Based Gesture Applications: State-of-the-Art Systems and Future Research Directions. Human–Computer Interaction, 2000
- Ten myths of multimodal interaction. Communications of the ACM, 1999
- Mutual disambiguation of recognition errors in a multimodal architecture. Published by Association for Computing Machinery (ACM), 1999
- Manual and gaze input cascaded (MAGIC) pointing. Published by Association for Computing Machinery (ACM), 1999
- Multimodal integration - a statistical view. IEEE Transactions on Multimedia, 1999
- Editorial. Speech Communication, 1998
- Multimodal Interactive Maps: Designing for Human Performance. Human–Computer Interaction, 1997
- Unification-based multimodal integration. Published by Association for Computational Linguistics (ACL), 1997
- QuickSet. Published by Association for Computing Machinery (ACM), 1997
- "Put-that-there". ACM SIGGRAPH Computer Graphics, 1980