Auditory-Visual Speech Perception and Aging
- 1 October 2002
- Research article
- Published by Wolters Kluwer Health in Ear & Hearing
- Vol. 23 (5), 439-449
- https://doi.org/10.1097/00003446-200210000-00006
Abstract
Objective: This experiment was designed to assess the integration of auditory and visual information for speech perception in older adults. The integration of place and voicing information was assessed across modalities using the McGurk effect. The following questions were addressed: 1) Are older adults as successful as younger adults at integrating auditory and visual information for speech perception? 2) Is successful integration of this information related to lipreading performance?

Design: The performance of three groups of participants was compared: young adults with normal hearing and vision, older adults with normal to near-normal hearing and vision, and young controls whose hearing thresholds were shifted with noise to match those of the older adults. Each participant completed a lipreading test as well as auditory and auditory-plus-visual identification of syllables with conflicting auditory and visual cues.

Results: The results show that, on average, older adults are as successful as young adults at integrating auditory and visual information for speech perception at the syllable level. The number of fused responses did not differ for the CV tokens across the ages tested. Although there were no significant differences between groups in integration at the syllable level, there were differences in the response alternatives chosen. Young adults with normal peripheral sensitivity often chose an auditory alternative, whereas older adults and control participants leaned toward visual alternatives. In addition, older adults demonstrated poorer lipreading performance than their younger counterparts; this was not related to successful integration of information at the syllable level.

Conclusions: Based on the findings of this study, when auditory and visual integration of speech information fails to occur, producing a nonfused response, participants select an alternative response from the modality with the least ambiguous signal.