Baby Ears: a recognition system for affective vocalizations
- 27 November 2002
- Conference paper
- Published by the Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 2 (ISSN 1520-6149), pp. 985-988
- https://doi.org/10.1109/icassp.1998.675432
Abstract
We collected more than 500 utterances from adults talking to their infants. We automatically classified 65% of the strongest utterances correctly as approval, attentional bids, or prohibition. We used several pitch and formant measures, and a multidimensional Gaussian mixture-model discriminator to perform this task. As previous studies have shown, changes in pitch are an important cue for affective messages; we found that timbre or cepstral coefficients are also important. The utterances of female speakers, in this test, were easier to classify than were those of male speakers. We hope this research will allow us to build machines that sense the "emotional state" of a user.
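The abstract describes the classifier only in outline: pitch, formant, and cepstral (timbre) measures feeding a multidimensional Gaussian mixture-model discriminator over three affect classes. The sketch below is not the authors' implementation; it is a minimal illustration of that kind of pipeline, assuming librosa and scikit-learn in place of whatever tools the original study used, with log-F0 plus MFCCs standing in for the paper's full feature set, and with the label names, mixture size, and file layout invented for the example.

```python
# Illustrative sketch only -- not the BabyEars system described in the paper.
import numpy as np
import librosa
from sklearn.mixture import GaussianMixture

def utterance_features(path, sr=16000):
    """Frame-level features: log-F0 plus 13 MFCCs, a rough stand-in for the
    pitch/formant/timbre measures mentioned in the abstract."""
    y, sr = librosa.load(path, sr=sr)
    f0, _, _ = librosa.pyin(y, fmin=75, fmax=600, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)      # shape (13, n_frames)
    n = min(mfcc.shape[1], len(f0))
    f0 = np.nan_to_num(f0[:n], nan=1.0)                     # unvoiced frames -> log-F0 of 0
    return np.column_stack([np.log(f0), mfcc[:, :n].T])     # shape (n, 14)

def train_gmms(files_by_label, n_components=8):
    """Fit one diagonal-covariance GMM per affect class on pooled feature frames."""
    models = {}
    for label, files in files_by_label.items():
        frames = np.vstack([utterance_features(f) for f in files])
        models[label] = GaussianMixture(n_components=n_components,
                                        covariance_type="diag",
                                        random_state=0).fit(frames)
    return models

def classify(path, models):
    """Assign an utterance to the class whose GMM gives the highest
    mean per-frame log-likelihood."""
    frames = utterance_features(path)
    return max(models, key=lambda label: models[label].score(frames))
```

With this sketch, train_gmms takes a dictionary mapping each affect label (e.g. "approval", "attention", "prohibition") to a list of training WAV paths, and classify then scores a held-out utterance against all class models and returns the best-matching label.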
References
- Experiments in syllable-based recognition of continuous speech. Published by the Institute of Electrical and Electronics Engineers (IEEE), 2005
- Automatic spoken affect classification and analysis. Published by the Institute of Electrical and Electronics Engineers (IEEE), 2002
- Approval and Disapproval: Infant Responsiveness to Vocal Affect in Familiar and Unfamiliar Languages. Child Development, 1993
- An Introduction to the Bootstrap. Published by Springer Nature, 1993
- Intonation and Communicative Intent in Mothers' Speech to Infants: Is the Melody the Message? Child Development, 1989
- Mixture Densities, Maximum Likelihood and the EM Algorithm. SIAM Review, 1984
- An improved endpoint detector for isolated word recognition. IEEE Transactions on Acoustics, Speech, and Signal Processing, 1981