Lipreading and audio-visual speech perception
- 29 January 1992
- journal article (review)
- Published by The Royal Society in Philosophical Transactions of the Royal Society B: Biological Sciences
- Vol. 335 (1273), 71-78
- https://doi.org/10.1098/rstb.1992.0009
Abstract
This paper reviews progress in understanding the psychology of lipreading and audio-visual speech perception. It considers four questions. What distinguishes better from poorer lipreaders? What are the effects of introducing a delay between the acoustical and optical speech signals? What have attempts to produce computer animations of talking faces contributed to our understanding of the visual cues that distinguish consonants and vowels? Finally, how should the process of audio-visual integration in speech perception be described; that is, how are the sights and sounds of talking faces represented at their conflux?