Auditory Cortex Tracks Both Auditory and Visual Stimulus Dynamics Using Low-Frequency Neuronal Phase Modulation
Open Access
- Published: 10 August 2010
- Research article
- Published by Public Library of Science (PLoS) in PLoS Biology
- Vol. 8 (8) , e1000445
- https://doi.org/10.1371/journal.pbio.1000445
Abstract
Integrating information across sensory domains to construct a unified representation of multi-sensory signals is a fundamental characteristic of perception in ecological contexts. One provocative hypothesis deriving from neurophysiology suggests that there exists early and direct cross-modal phase modulation. We provide evidence, based on magnetoencephalography (MEG) recordings from participants viewing audiovisual movies, that low-frequency neuronal information lies at the basis of the synergistic coordination of information across auditory and visual streams. In particular, the phase of the 2–7 Hz delta and theta band responses carries robust (in single trials) and usable information (for parsing the temporal structure) about stimulus dynamics in both sensory modalities concurrently. These experiments are the first to show in humans that a particular cortical mechanism, delta-theta phase modulation across early sensory areas, plays an important "active" role in continuously tracking naturalistic audio-visual streams, carrying dynamic multi-sensory information, and reflecting cross-sensory interaction in real time.
Author Summary
When faced with ecologically relevant stimuli in natural scenes, our brains need to coordinate information from multiple sensory systems in order to create accurate internal representations of the outside world. Unfortunately, we currently have little information about the neuronal mechanisms for this cross-modal processing during online sensory perception under natural conditions. Neurophysiological and human imaging studies are increasingly exploring the response properties elicited by natural scenes. In this study, we recorded magnetoencephalography (MEG) data from participants viewing audiovisual movie clips. We developed a phase coherence analysis technique that captures—in single trials of watching a movie—how the phase of cortical responses is tightly coupled to key aspects of stimulus dynamics.
Remarkably, auditory cortex not only tracks auditory stimulus dynamics but also reflects dynamic aspects of the visual signal. Similarly, visual cortex mainly follows the visual properties of a stimulus, but also shows sensitivity to the auditory aspects of a scene. The critical finding is that cross-modal phase modulation appears to lie at the basis of this integrative processing. Continuous cross-modal phase modulation may permit the internal construction of behaviorally relevant stimuli. Our work therefore contributes to the understanding of how multi-sensory information is analyzed and represented in the human brain.
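The phase coherence analysis described above can be illustrated with a minimal sketch. This is not the authors' code; it assumes repeated presentations of the same movie stored as a trials × time array for one MEG sensor, band-passes the responses to the delta-theta (2–7 Hz) range, extracts instantaneous phase with the Hilbert transform, and computes inter-trial phase coherence (the length of the mean unit phase vector across trials). The function name, filter order, and simulated data are illustrative choices, not details from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def delta_theta_itc(trials, fs, band=(2.0, 7.0)):
    """Inter-trial phase coherence of band-limited responses.

    trials : (n_trials, n_samples) array, one sensor's responses
    fs     : sampling rate in Hz
    Returns an (n_samples,) array; 1 means identical phase on every trial,
    values near 0 mean phases are uniformly scattered across trials.
    """
    # 4th-order Butterworth band-pass restricted to the delta-theta band
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, trials, axis=1)
    # Instantaneous phase from the analytic (Hilbert) signal
    phase = np.angle(hilbert(filtered, axis=1))
    # Length of the trial-averaged unit phase vector at each time point
    return np.abs(np.mean(np.exp(1j * phase), axis=0))


# Usage: trials phase-locked to a 4 Hz "stimulus" yield ITC near 1.
fs = 200.0
t = np.arange(int(5 * fs)) / fs
rng = np.random.default_rng(0)
locked = np.sin(2 * np.pi * 4 * t) + 0.3 * rng.standard_normal((20, t.size))
itc = delta_theta_itc(locked, fs)
print(itc.mean())  # close to 1 for phase-locked trials
```

In the actual study the same logic is applied to responses evoked by a continuous movie rather than a simulated sinusoid, so high coherence at a given time point indicates that the low-frequency phase reliably tracks the stimulus dynamics at that moment.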