Does audiovisual speech perception use information about facial configuration?
- 1 July 2001
- journal article
- Published by Taylor & Francis in The European Journal of Cognitive Psychology
- Vol. 13 (3), 395-407
- https://doi.org/10.1080/09541440126006
Abstract
No abstract available