Omnidirectional visual information for navigating a mobile robot
- 1993
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 799-804, vol. 1
- https://doi.org/10.1109/robot.1993.292075
Abstract
A method for acquiring the relative positions between observation points using omnidirectional views (ODVs) is presented. The method does not require any internal sensor data from the mobile robot, so the robot can move through the environment without constraints on where observations are taken. A method for obtaining absolute positions and a global map by fusing local maps acquired from omnidirectional range information is also proposed. The global map obtained, however, is not precise enough for recognizing the environment; obtaining a more precise global map will require using other visual information contained in the ODVs, such as the color and shape of objects.
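The relative-position step rests on triangulating from azimuth bearings measured in the omnidirectional views. As a rough illustration only, not the paper's algorithm, the sketch below triangulates a landmark in the plane from the bearings observed at two observation points; it assumes the bearings have already been extracted from the ODVs and that the baseline between the two points is known, whereas the paper recovers relative positions without such prior knowledge.

```python
import numpy as np

def triangulate_landmark(p1, p2, bearing1, bearing2):
    """Locate a landmark in the plane by intersecting two bearing rays.

    p1, p2     -- 2D positions of the observation points
    bearing1/2 -- azimuths (radians, counter-clockwise from +x) of the
                  landmark as seen in each omnidirectional view
    """
    d1 = np.array([np.cos(bearing1), np.sin(bearing1)])
    d2 = np.array([np.cos(bearing2), np.sin(bearing2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2.
    A = np.column_stack((d1, -d2))
    b = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    t1, _ = np.linalg.solve(A, b)
    return np.asarray(p1, dtype=float) + t1 * d1

# Two observation points one metre apart along the x axis, each
# reporting the azimuth of the same landmark in its ODV.
p1, p2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
landmark = triangulate_landmark(p1, p2, np.deg2rad(60.0), np.deg2rad(120.0))
print(landmark)  # ~[0.5, 0.866]: equilateral-triangle geometry
```

Repeating such bearing intersections over many landmarks and observation points is one way the local range maps described above could be accumulated and fused into a global map.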