Neural Oscillations Carry Speech Rhythm through to Comprehension
Open Access
- 1 January 2012
- journal article
- review article
- Published by Frontiers Media SA in Frontiers in Psychology
- Vol. 3, 320
- https://doi.org/10.3389/fpsyg.2012.00320
Abstract
A key feature of speech is the quasi-regular rhythmic information contained in its slow amplitude modulations. In this article we review the information conveyed by speech rhythm, and the role of ongoing brain oscillations in listeners’ processing of this content. Our starting point is the fact that speech is inherently temporal, and that rhythmic information conveyed by the amplitude envelope contains important markers for place and manner of articulation, segmental information, and speech rate. Behavioral studies demonstrate that amplitude envelope information is relied upon by listeners and plays a key role in speech intelligibility. Extending behavioral findings, data from neuroimaging – particularly electroencephalography (EEG) and magnetoencephalography (MEG) – point to phase locking by ongoing cortical oscillations to low-frequency information (~4–8 Hz) in the speech envelope. This phase modulation effectively encodes a prediction of when important events (such as stressed syllables) are likely to occur, and acts to increase sensitivity to these relevant acoustic cues. We suggest a framework through which such neural entrainment to speech rhythm can explain effects of speech rate on word and segment perception (i.e., that the perception of phonemes and words in connected speech is influenced by preceding speech rate). Neuroanatomically, acoustic amplitude modulations are processed largely bilaterally in auditory cortex, with intelligible speech resulting in differential recruitment of left-hemisphere regions. Notable among these is lateral anterior temporal cortex, which we propose functions in a domain-general fashion to support ongoing memory and integration of meaningful input. Together, the reviewed evidence suggests that low-frequency oscillations in the acoustic speech signal form the foundation of a rhythmic hierarchy supporting spoken language, mirrored by phase-locked oscillations in the human brain.
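The two quantities at the heart of the abstract – the slow amplitude envelope of a signal and the degree of phase locking between that envelope and an oscillation – can be made concrete with a small numerical sketch. The code below is illustrative only (it is not from the reviewed article): it builds a toy "speech-like" signal amplitude-modulated at 5 Hz (within the ~4–8 Hz band discussed above), recovers the envelope via an FFT-based Hilbert transform, and computes a phase-locking value (PLV) against a simulated, hypothetical "neural" oscillation entrained at the same rate.

```python
import numpy as np

fs = 1000                      # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)    # 2 s of signal
am_rate = 5                    # syllable-rate modulation (Hz), in the theta band

# Toy "speech" signal: a 200 Hz carrier amplitude-modulated at the syllable rate.
carrier = np.sin(2 * np.pi * 200 * t)
envelope_true = 0.5 * (1 + np.sin(2 * np.pi * am_rate * t))
signal = envelope_true * carrier

def analytic_signal(x):
    """FFT-based analytic signal (equivalent to a Hilbert transform)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    h[1:n // 2] = 2            # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1          # Nyquist bin for even-length signals
    return np.fft.ifft(X * h)

# Amplitude envelope = magnitude of the analytic signal.
envelope = np.abs(analytic_signal(signal))

# Hypothetical "neural" oscillation, phase-shifted but entrained at the same rate.
neural = np.sin(2 * np.pi * am_rate * t + 0.3)

# Phase-locking value: magnitude of the mean phase-difference vector.
# A constant phase lag (perfect entrainment) gives a PLV near 1; random
# phase relationships give a PLV near 0.
phase_env = np.angle(analytic_signal(envelope - envelope.mean()))
phase_neural = np.angle(analytic_signal(neural))
plv = np.abs(np.mean(np.exp(1j * (phase_env - phase_neural))))
print(plv)                     # close to 1.0 here (edge effects aside)
```

In this construction the phase difference between envelope and oscillation is essentially constant, so the PLV is high; real MEG/EEG entrainment analyses apply the same idea to band-pass-filtered cortical recordings rather than a synthetic sinusoid.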