Acting under uncertainty: discrete Bayesian models for mobile-robot navigation
- 24 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 2, 963-972
- https://doi.org/10.1109/iros.1996.571080
Abstract
Discrete Bayesian models have been used to model uncertainty for mobile-robot navigation, but the question of how actions should be chosen remains largely unexplored. This paper presents the optimal solution to the problem, formulated as a partially observable Markov decision process. Since solving for the optimal control policy is intractable in general, the paper goes on to explore a variety of heuristic control strategies. The control strategies are compared experimentally, both in simulation and in runs on a robot.
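The discrete Bayesian model behind this line of work maintains a belief (a probability distribution over a finite set of robot states) and updates it after each action and observation; a simple heuristic control strategy then picks an action from that belief, for example by acting as if the robot were in its most likely state. The sketch below illustrates this in general terms; the state space, the transition and observation matrices, and the `most_likely_state_action` heuristic are illustrative assumptions, not the paper's specific models or strategies.

```python
import numpy as np

def belief_update(belief, action, observation, T, O):
    """One step of a discrete Bayes filter (illustrative sketch).

    belief: prior over states, shape (S,)
    T[a]:   transition matrix, T[a][s, s'] = P(s' | s, a)
    O[a]:   observation model, O[a][s', z] = P(z | s', a)
    """
    predicted = belief @ T[action]                   # prediction step
    updated = predicted * O[action][:, observation]  # correction by the observation
    return updated / updated.sum()                   # renormalize to a distribution

def most_likely_state_action(belief, policy_for_state):
    """Heuristic control: act as if the robot were surely in its
    most likely state (one of several possible heuristics)."""
    return policy_for_state[int(np.argmax(belief))]

# Hypothetical two-state example: uncertain start, one motion, one sensor reading.
T = {0: np.array([[0.9, 0.1],
                  [0.1, 0.9]])}
O = {0: np.array([[0.8, 0.2],
                  [0.3, 0.7]])}
belief = belief_update(np.array([0.5, 0.5]), action=0, observation=0, T=T, O=O)
action = most_likely_state_action(belief, ["turn_left", "turn_right"])
```

The optimal POMDP policy would instead be computed over the whole belief space, which is what makes it intractable; heuristics like the one above trade optimality for cheap, belief-driven action selection.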
This publication has 11 references indexed in Scilit:
- High resolution maps from wide angle sonar. Published by Institute of Electrical and Electronics Engineers (IEEE), 2005
- Unsupervised learning of probabilistic models for robot navigation. Published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- Learning policies for partially observable environments: Scaling up. Published by Elsevier, 1995
- Markov Decision Processes. Published by Wiley, 1994
- A survey of algorithmic methods for partially observed Markov decision processes. Annals of Operations Research, 1991
- A survey of solution techniques for the partially observed Markov decision process. Annals of Operations Research, 1991
- Mobile robot localization by tracking geometric beacons. IEEE Transactions on Robotics and Automation, 1991
- The Complexity of Markov Decision Processes. Mathematics of Operations Research, 1987
- The Optimal Control of Partially Observable Markov Processes over a Finite Horizon. Operations Research, 1973
- Use of the Hough transformation to detect lines and curves in pictures. Communications of the ACM, 1972