Dynamic multi-sensor data fusion system for intelligent robots

Abstract
The authors' objective is to develop an intelligent robot workstation capable of integrating data from multiple sensors. The investigation is based on a Unimation PUMA 560 robot and various external sensors, including overhead vision, eye-in-hand vision, proximity, tactile array, position, force/torque, cross-fire, overload, and slip-sensing devices. Efficient fusion of data from these different sources will enable the machine to respond promptly when dealing with the 'real world'. Towards this goal, a general paradigm of a sensor data fusion system has been developed, and simulation results, as well as results from the actual implementation of certain sensor data fusion concepts, have been demonstrated.
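The abstract does not describe the specific fusion rule used in the workstation; purely as an illustration of the general idea of combining redundant sensor readings, the sketch below shows a textbook inverse-variance weighted fusion of two estimates of the same quantity. The sensor names, variances, and measurements are hypothetical and are not taken from the paper.

```python
import numpy as np

def fuse_estimates(measurements, variances):
    """Fuse redundant estimates of the same quantity by inverse-variance
    weighting (a standard static sensor-fusion rule, not the authors' method)."""
    measurements = np.asarray(measurements, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = 1.0 / variances                        # more certain sensors count more
    fused = (weights[:, None] * measurements).sum(axis=0) / weights.sum()
    fused_variance = 1.0 / weights.sum()             # fused estimate is more certain
    return fused, fused_variance

# Hypothetical example: object position (x, y in mm) seen by two vision sensors
overhead_cam = [512.0, 230.0]   # overhead vision, assumed variance 4.0
hand_cam = [509.5, 231.5]       # eye-in-hand vision, assumed variance 1.0
fused_pos, fused_var = fuse_estimates([overhead_cam, hand_cam], [4.0, 1.0])
print(fused_pos, fused_var)
```

In this sketch the more accurate eye-in-hand reading dominates the fused estimate, and the fused variance is smaller than either individual variance, which is the basic benefit a fusion system seeks when combining overlapping sensor modalities.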
