Robust localization using context in omnidirectional imaging

Abstract
This work presents a method to recover and utilize visual context in panoramic images. Omnidirectional imaging has recently become an efficient basis for robot navigation. The proposed Bayesian reasoning over local image appearances enables the rejection of false hypotheses that do not fit the structural constraints in the corresponding feature trajectories. Using real image data from an office robot, the methodology is shown to dramatically increase localization performance in the presence of severe occlusion effects, particularly in noisy environments, and to recover rotational information on the fly.
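The abstract describes Bayesian reasoning that rejects location hypotheses inconsistent with accumulated appearance evidence. A minimal sketch of that idea, assuming a discrete set of place hypotheses and illustrative appearance likelihoods (all names and values below are hypothetical, not the paper's actual model):

```python
import numpy as np

# Hypothetical appearance likelihoods: rows = candidate place hypotheses,
# columns = successive observations along a feature trajectory.
likelihoods = np.array([
    [0.9, 0.8, 0.85],   # hypothesis consistent with every observation
    [0.7, 0.1, 0.05],   # hypothesis contradicted after the first observation
    [0.4, 0.5, 0.45],   # weakly supported hypothesis
])

def bayes_filter(likelihoods, prior=None, prune_below=0.05):
    """Sequentially update a discrete posterior over place hypotheses,
    rejecting (zeroing out) hypotheses whose posterior falls below a
    threshold -- a simple stand-in for structural-constraint rejection."""
    n = likelihoods.shape[0]
    belief = np.full(n, 1.0 / n) if prior is None else prior.copy()
    for t in range(likelihoods.shape[1]):
        belief *= likelihoods[:, t]          # Bayes update with new evidence
        belief /= belief.sum()               # renormalize to a distribution
        belief[belief < prune_below] = 0.0   # reject implausible hypotheses
        belief /= belief.sum()
    return belief

posterior = bayes_filter(likelihoods)
```

After three observations the contradicted hypothesis is pruned to zero probability, while the consistent one dominates the posterior.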