Integration of Proprioceptive and Visual Position-Information: An Experimentally Supported Model
- 1 March 1999
- research article
- Published by American Physiological Society in Journal of Neurophysiology
- Vol. 81 (3), pp. 1355-1364
- https://doi.org/10.1152/jn.1999.81.3.1355
Abstract
To localize one’s hand, i.e., to find out its position with respect to the body, humans may use proprioceptive information or visual information or both. It is still not known how the CNS combines simultaneous proprioceptive and visual information. In this study, we investigate in what position in a horizontal plane a hand is localized on the basis of simultaneous proprioceptive and visual information and compare this to the positions in which it is localized on the basis of proprioception only and vision only. Seated at a table, subjects matched target positions on the table top with their unseen left hand under the table. The experiment consisted of three series. In each of these series, the target positions were presented in three conditions: by vision only, by proprioception only, or by both vision and proprioception. In one of the three series, the visual information was veridical. In the other two, it was modified by prisms that displaced the visual field to the left and to the right, respectively. The results show that the mean of the positions indicated in the condition with both vision and proprioception generally lies off the straight line through the means of the other two conditions. In most cases the mean lies on the side predicted by a model describing the integration of multisensory information. According to this model, the visual information and the proprioceptive information are weighted with direction-dependent weights, the weights being related to the direction-dependent precision of the information in such a way that the available information is used very efficiently. Because the proposed model also can explain the unexpectedly small sizes of the variable errors in the localization of a seen hand that were reported earlier, there is strong evidence to support this model. The results imply that the CNS has knowledge about the direction-dependent precision of the proprioceptive and visual information.
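The abstract describes weighting the visual and proprioceptive estimates by their direction-dependent precision. As an illustration only, the sketch below shows one standard way to formalize such precision-weighted (inverse-variance) integration of two 2-D position estimates; the function, coordinates, and covariance values are hypothetical and are not taken from the paper.

```python
import numpy as np

def integrate_estimates(x_vis, cov_vis, x_prop, cov_prop):
    """Combine a visual and a proprioceptive 2-D position estimate by
    weighting each with the inverse of its covariance (its precision).
    Directions in which a modality is more precise receive more weight."""
    w_vis = np.linalg.inv(cov_vis)    # precision of the visual estimate
    w_prop = np.linalg.inv(cov_prop)  # precision of the proprioceptive estimate
    cov_comb = np.linalg.inv(w_vis + w_prop)
    x_comb = cov_comb @ (w_vis @ x_vis + w_prop @ x_prop)
    return x_comb, cov_comb

# Hypothetical numbers: vision relatively precise in azimuth but coarse in
# depth; proprioception roughly the opposite.
x_vis = np.array([0.02, 0.30])        # metres, table-top coordinates
cov_vis = np.array([[0.0001, 0.0],
                    [0.0,    0.0025]])
x_prop = np.array([-0.01, 0.27])
cov_prop = np.array([[0.0020, 0.0],
                     [0.0,    0.0004]])

x_comb, cov_comb = integrate_estimates(x_vis, cov_vis, x_prop, cov_prop)
print(x_comb)    # lies off the straight line joining the two single-cue estimates
print(cov_comb)  # variance in each direction is smaller than in either input
```

Because the weights are direction-dependent matrices rather than a single scalar, the combined estimate generally falls off the straight line between the vision-only and proprioception-only estimates, which is the qualitative prediction the experiment tests.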