Symbolic gestures and spoken language are processed by a common neural system
- Open Access
- Published: 8 December 2009
- Research article
- Proceedings of the National Academy of Sciences, Vol. 106 (49), 20664–20669
- https://doi.org/10.1073/pnas.0909197106
Abstract
Symbolic gestures, such as pantomimes that signify actions (e.g., threading a needle) or emblems that facilitate social transactions (e.g., finger to lips indicating “be quiet”), play an important role in human communication. They are autonomous, can fully take the place of words, and function as complete utterances in their own right. The relationship between these gestures and spoken language remains unclear. We used functional MRI to investigate whether these two forms of communication are processed by the same system in the human brain. Responses to symbolic gestures, to their spoken glosses (expressing the gestures' meaning in English), and to visually and acoustically matched control stimuli were compared in a randomized block design. General Linear Model (GLM) contrasts identified shared and unique activations, and functional connectivity analyses delineated regional interactions associated with each condition. Results support a model in which bilateral modality-specific areas in superior and inferior temporal cortices extract salient features from vocal-auditory and gestural-visual stimuli, respectively. However, both classes of stimuli activate a common, left-lateralized network of inferior frontal and posterior temporal regions in which symbolic gestures and spoken words may be mapped onto common, corresponding conceptual representations. We suggest that these anterior and posterior perisylvian areas, identified since the mid-19th century as the core of the brain's language system, are not in fact committed to language processing, but may function as a modality-independent semiotic system that plays a broader role in human communication, linking meaning with symbols whether these are words, gestures, images, sounds, or objects.