Self-Orienting with On-Line Learning of Environmental Features

Abstract
Recent neurophysiological experiments on freely moving rats have revealed that the firing of the head-direction cell ensemble predicts future head direction in response to vestibular input, and that visual cues strongly influence shifts in the tuning curve represented by the ensemble's firing. In this article, we investigate the possibility of using learned landmark features to self-orient an autonomous agent in a partially known environment. We propose a model that incorporates an artificial head-direction system emulating the behavior of head-direction cell ensembles in biological systems, a lattice-based dynamic cell structure for categorizing and classifying environmental features, and an expectancy-based learning mechanism that learns to associate each head direction with a particular environmental feature. Our experimental results show that the model can correct the drift in orientation estimates obtained by dead reckoning.