Audio–visual matching of speech and non-speech oral gestures in patients with aphasia and apraxia of speech
- 29 August 2005
- Research article
- Published by Elsevier in Neuropsychologia
- Vol. 44 (4) , 546-555
- https://doi.org/10.1016/j.neuropsychologia.2005.07.002
Abstract
No abstract available