Toward an affect-sensitive multimodal human-computer interaction
- 8 September 2003
- Journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in Proceedings of the IEEE
- Vol. 91 (9), 1370-1390
- https://doi.org/10.1109/jproc.2003.817122
Abstract
The ability to recognize the affective states of a person we are communicating with is the core of emotional intelligence. Emotional intelligence is a facet of human intelligence that has been argued to be indispensable, and perhaps the most important, for successful interpersonal social interaction. This paper argues that next-generation human-computer interaction (HCI) designs need to include the essence of emotional intelligence - the ability to recognize a user's affective states - in order to become more human-like, more effective, and more efficient. Affective arousal modulates all nonverbal communicative cues (facial expressions, body movements, and vocal and physiological reactions). In face-to-face interaction, humans detect and interpret these interactive signals of their communicator with little or no effort. Yet the design and development of an automated system that accomplishes the same tasks is rather difficult. This paper surveys past work on solving these problems by computer and provides a set of recommendations for developing the first part of an intelligent multimodal HCI: an automatic, personalized analyzer of a user's nonverbal affective feedback.
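The analyzer described in the abstract must combine evidence from several nonverbal channels (face, body, voice, physiology) into a single estimate of the user's affective state. As a purely illustrative sketch, and not the method proposed in the paper, the Python snippet below shows one common strategy for this step: decision-level fusion by weighted averaging of per-modality class probabilities. The label set, modality names, weights, and the `fuse_modalities` function are assumptions made for the example.

```python
# Illustrative sketch only (not the paper's implementation): decision-level
# fusion of affect estimates, assuming each modality-specific classifier
# already outputs a probability distribution over a small set of affect labels.
from typing import Dict

# Hypothetical label set chosen for the example.
AFFECT_LABELS = ("neutral", "joy", "anger", "sadness")


def fuse_modalities(
    scores_per_modality: Dict[str, Dict[str, float]],
    weights: Dict[str, float],
) -> Dict[str, float]:
    """Weighted average of per-modality posteriors, renormalized to sum to 1."""
    fused = {label: 0.0 for label in AFFECT_LABELS}
    total_weight = sum(weights[m] for m in scores_per_modality)
    for modality, scores in scores_per_modality.items():
        w = weights[modality] / total_weight
        for label in AFFECT_LABELS:
            fused[label] += w * scores.get(label, 0.0)
    norm = sum(fused.values()) or 1.0
    return {label: value / norm for label, value in fused.items()}


if __name__ == "__main__":
    # Hypothetical outputs of a facial-expression and a vocal-prosody classifier.
    face = {"neutral": 0.2, "joy": 0.6, "anger": 0.1, "sadness": 0.1}
    voice = {"neutral": 0.4, "joy": 0.3, "anger": 0.2, "sadness": 0.1}
    fused = fuse_modalities(
        {"face": face, "voice": voice},
        weights={"face": 0.6, "voice": 0.4},
    )
    print(max(fused, key=fused.get), fused)
```

A personalized analyzer, as called for in the abstract, could adapt the per-modality weights to the individual user (for instance, relying more on vocal cues for a user whose facial expressions are hard to read); the fixed weights above are only a placeholder for that adaptation.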