Improvement of panorama-based annotation overlay using omnidirectional vision and inertial sensors

Abstract
Annotation overlay on live video frames is an essential feature of augmented reality (AR) and a well-suited application for wearable computers. A novel method of annotation overlay and its real-time implementation are presented. The method uses a set of panoramic images, captured by omnidirectional vision at various points in the environment, with annotations attached to those images. It overlays the annotations according to the image alignment between the input frames and the panoramic images. Inertial sensors are used not only to make the image registration robust but also to improve processing throughput and reduce delay.
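The abstract describes using inertial sensors to aid image alignment between input frames and stored panoramas. A minimal sketch of that idea, under assumptions not stated in the abstract (a 1-D intensity profile per panorama, a gyro-predicted yaw, and normalized cross-correlation restricted to a narrow window around that prediction; the function name and parameters are hypothetical):

```python
import numpy as np

def align_yaw(pano_profile, frame_profile, gyro_yaw_deg, search_deg=10.0):
    """Locate the frame's yaw within a 360-degree panorama profile.

    Instead of searching all orientations, only offsets within
    +/- search_deg of the inertial (gyro) prediction are scored,
    which illustrates how an inertial prior cuts cost and avoids
    spurious matches far from the true pose.
    """
    n = len(pano_profile)
    deg_per_bin = 360.0 / n
    w = len(frame_profile)
    center = int(round(gyro_yaw_deg / deg_per_bin)) % n
    half = int(round(search_deg / deg_per_bin))
    f = frame_profile - frame_profile.mean()
    best_score, best_yaw = -np.inf, None
    for off in range(-half, half + 1):
        start = (center + off) % n
        idx = (np.arange(w) + start) % n   # wrap around the panorama
        p = pano_profile[idx]
        p = p - p.mean()
        denom = np.linalg.norm(f) * np.linalg.norm(p)
        score = float(f @ p) / denom if denom else -np.inf
        if score > best_score:
            best_score, best_yaw = score, start * deg_per_bin
    return best_yaw, best_score
```

With the yaw recovered, an annotation anchored at a panorama coordinate can be drawn at the corresponding position in the live frame; a full system would align in 2-D (or over full rotations) rather than yaw only.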