Integration of Visual and Linguistic Information in Spoken Language Comprehension
- 16 June 1995
- journal article
- Published by American Association for the Advancement of Science (AAAS) in Science
- Vol. 268 (5217), 1632-1634
- https://doi.org/10.1126/science.7777863
Abstract
Psycholinguists have commonly assumed that as a spoken linguistic message unfolds over time, it is initially structured by a syntactic processing module that is encapsulated from information provided by other perceptual and cognitive systems. To test the effects of relevant visual context on the rapid mental processes that accompany spoken language comprehension, eye movements were recorded with a head-mounted eye-tracking system while subjects followed instructions to manipulate real objects. Visual context influenced spoken word recognition and mediated syntactic processing, even during the earliest moments of language processing.