A geometric feature relation graph formulation for consistent sensor fusion
- 4 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
A generic framework for sensor fusion is presented. A sensor-independent, feature-based relational model, the geometric feature relation graph (GFRG), is developed for representing sensory information acquired by various sensors. Sensor fusion is accomplished through the consistent integration of multiple irregular GFRGs, acquired by different sensors, into a regular GFRG. For the integration process, an effective, robust procedure for identifying coincident measurements of features based on uncertain geometric information and topological constraints, and a nonlinear programming formulation for the maintenance of consistency, are presented. The identification procedure is implemented by the algorithm IDENT in a knowledge-fusing mechanism using the Dempster-Shafer theory of belief functions. Computer simulations verify the validity and performance of the framework.
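The knowledge-fusing mechanism the abstract mentions rests on Dempster's rule of combination, which merges belief masses from independent sources and renormalizes away conflicting evidence. The sketch below is a minimal, generic implementation of that rule, not the paper's IDENT algorithm; the sensor names and the "same"/"diff" frame are hypothetical illustrations.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic mass assignments via Dempster's rule.

    m1, m2: dicts mapping frozensets (subsets of the frame of
    discernment) to masses summing to 1. Intersecting focal elements
    reinforce each other; empty intersections accumulate as conflict,
    which is removed by renormalization.
    """
    combined = {}
    conflict = 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        a = b & c
        if a:
            combined[a] = combined.get(a, 0.0) + mb * mc
        else:
            conflict += mb * mc
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully disagree")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

# Hypothetical example: two sensors weigh evidence that two measured
# features coincide ("same") versus the full frame (ignorance).
theta = frozenset({"same", "diff"})
m_vision = {frozenset({"same"}): 0.6, theta: 0.4}
m_tactile = {frozenset({"same"}): 0.7, theta: 0.3}
fused = dempster_combine(m_vision, m_tactile)
# fused[{"same"}] = 0.88, fused[theta] = 0.12: agreement sharpens belief.
```

Note the characteristic effect: two sources each only moderately committed to "same" (0.6 and 0.7) yield a combined mass of 0.88, which is the kind of evidence pooling a coincident-feature identification step can exploit.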
This publication has 5 references indexed in Scilit:
- Integrating Vision and Touch for Object Recognition Tasks. The International Journal of Robotics Research, 1988.
- Combining Sonar and Infrared Sensors for Mobile Robot Navigation. The International Journal of Robotics Research, 1988.
- Dynamic multi-sensor data fusion system for intelligent robots. IEEE Journal on Robotics and Automation, 1988.
- On the Representation and Estimation of Spatial Uncertainty. The International Journal of Robotics Research, 1986.
- Sensor fusion and object localization. Published by Institute of Electrical and Electronics Engineers (IEEE), 1986.