Eye-Tracking
- 1 December 1996
- journal article
- research article
- Published by Taylor & Francis in Language and Cognitive Processes
- Vol. 11 (6), 583-588
- https://doi.org/10.1080/016909696386971
Abstract
Participants following spoken instructions to touch or move either real objects or objects on a computer screen make saccadic eye movements (to the objects) that are closely time-locked to relevant information in the speech stream. Monitoring eye movements using a head-mounted eye-camera allows one to use the locations and latencies of fixations to examine spoken word recognition during continuous speech in natural contexts. Preliminary work using this paradigm provides striking evidence for the continuous and incremental nature of comprehension, as well as clear effects of visual context on the earliest moments of linguistic processing. We review the eye-movement paradigm and refer to recent experiments applying the paradigm to issues of spoken word recognition (e.g. lexical competitor effects), syntactic processing (e.g. the interaction of referential context and ambiguity resolution), reference resolution (disambiguating temporarily ambiguous referential phrases), focus (modulating the salience of certain objects via contrastive stress), as well as issues in cross-modality integration that are central to evaluating the modularity hypothesis.
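The dependent measure the paradigm rests on is the latency from the onset of a spoken target word to the first fixation on the corresponding object. Below is a minimal sketch, not taken from the article, of how such a latency could be computed from fixation records; the data structures and field names (`Fixation`, `object_id`, `onset_ms`) are hypothetical.

```python
# Hypothetical sketch: latency from spoken-word onset to the first fixation
# on the target object, the kind of time-locked measure the paradigm uses.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Fixation:
    object_id: str   # which object the eye landed on (assumed label)
    onset_ms: float  # fixation start time, relative to trial onset

def first_fixation_latency(fixations: List[Fixation],
                           target_object: str,
                           word_onset_ms: float) -> Optional[float]:
    """Latency (ms) from word onset to the first fixation on the target,
    or None if the target is never fixated after word onset."""
    for fix in sorted(fixations, key=lambda f: f.onset_ms):
        if fix.onset_ms >= word_onset_ms and fix.object_id == target_object:
            return fix.onset_ms - word_onset_ms
    return None

# Example: instruction names the target at 1200 ms into the trial; the eye
# first visits a competitor ("candy") before settling on the target ("candle").
trial = [Fixation("candy", 1350.0), Fixation("candle", 1480.0)]
print(first_fixation_latency(trial, "candle", 1200.0))  # 280.0 ms
```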