Combining Sensory Information: Mandatory Fusion Within, but Not Between, Senses
- 22 November 2002
- journal article
- Published by the American Association for the Advancement of Science (AAAS) in Science
- Vol. 298 (5598), 1627-1630
- https://doi.org/10.1126/science.1075396
Abstract
Humans use multiple sources of sensory information to estimate environmental properties. For example, the eyes and hands both provide relevant information about an object's shape. The eyes estimate shape using binocular disparity, perspective projection, etc. The hands supply haptic shape information by means of tactile and proprioceptive cues. Combining information across cues can improve estimation of object properties but may come at a cost: loss of single-cue information. We report that single-cue information is indeed lost when cues from within the same sensory modality (disparity and texture gradients in vision) are combined, but not when different modalities (vision and haptics) are combined.
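The estimation benefit the abstract refers to is usually formalized as reliability-weighted (maximum-likelihood) cue combination. The sketch below illustrates that rule under the standard assumptions of independent, Gaussian cue noise; the function name and the example numbers are illustrative, not taken from the paper.

```python
import numpy as np

def combine_cues(estimates, sigmas):
    """Reliability-weighted average of single-cue estimates.

    Under independent Gaussian noise, cue i gets weight
    w_i = (1/sigma_i^2) / sum_j (1/sigma_j^2), and the combined
    estimate has variance 1 / sum_j (1/sigma_j^2), which is never
    larger than the smallest single-cue variance.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliabilities = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    weights = reliabilities / reliabilities.sum()
    combined = weights @ estimates
    combined_sigma = np.sqrt(1.0 / reliabilities.sum())
    return combined, combined_sigma

# Hypothetical example: a disparity cue and a texture cue to surface
# slant (degrees); the more reliable disparity cue dominates.
slant, sigma = combine_cues(estimates=[32.0, 28.0], sigmas=[2.0, 4.0])
print(f"combined slant = {slant:.1f} deg, sd = {sigma:.2f} deg")
# combined slant = 31.2 deg, sd = 1.79 deg
```

The cost named in the abstract follows from this rule: once the single-cue estimates are fused into one weighted average, the individual cue values can no longer be recovered from it. The paper's finding is that this fusion is obligatory for within-modality cues but not for cross-modal ones.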