Recognition of head gestures using hidden Markov models

Abstract
This paper explores the use of hidden Markov models (HMMs) for the recognition of head gestures. A gesture corresponds to a particular pattern of head movement. The facial plane is tracked using a parameterized model, and the temporal sequence of three image rotation parameters is used to describe four gestures. A dynamic vector quantization scheme was implemented to transform the parameters into suitable input data for the HMMs. Each model was trained by the iterative Baum-Welch procedure using 28 sequences taken from 5 persons. Experimental results on a different data set (33 new sequences from 6 other persons) demonstrate the effectiveness of this approach.
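The training pipeline described above — vector-quantized rotation parameters fed as discrete symbols to an HMM re-estimated with Baum-Welch — can be illustrated with a minimal sketch. This is not the paper's implementation: the model sizes, parameter values, and observation sequence below are invented for illustration, standing in for the quantized head-rotation symbols.

```python
def forward(pi, A, B, obs):
    """Forward algorithm: alpha[t][i] = P(o_1..o_t, state_t = i)."""
    N = len(pi)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(N)]]
    for t in range(1, len(obs)):
        alpha.append([sum(alpha[t - 1][j] * A[j][i] for j in range(N))
                      * B[i][obs[t]] for i in range(N)])
    return alpha

def backward(A, B, obs, N):
    """Backward algorithm: beta[t][i] = P(o_{t+1}..o_T | state_t = i)."""
    T = len(obs)
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        for i in range(N):
            beta[t][i] = sum(A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j]
                             for j in range(N))
    return beta

def baum_welch_step(pi, A, B, obs):
    """One EM re-estimation of (pi, A, B) from a single symbol sequence."""
    N, M, T = len(pi), len(B[0]), len(obs)
    alpha, beta = forward(pi, A, B, obs), backward(A, B, obs, N)
    like = sum(alpha[-1])  # P(obs | model)
    # gamma[t][i]: posterior probability of being in state i at time t
    gamma = [[alpha[t][i] * beta[t][i] / like for i in range(N)]
             for t in range(T)]
    # xi[t][i][j]: posterior probability of transition i -> j at time t
    xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t + 1]] * beta[t + 1][j] / like
            for j in range(N)] for i in range(N)] for t in range(T - 1)]
    new_pi = gamma[0][:]
    new_A = [[sum(xi[t][i][j] for t in range(T - 1))
              / sum(gamma[t][i] for t in range(T - 1))
              for j in range(N)] for i in range(N)]
    new_B = [[sum(gamma[t][i] for t in range(T) if obs[t] == k)
              / sum(gamma[t][i] for t in range(T))
              for k in range(M)] for i in range(N)]
    return new_pi, new_A, new_B

# Toy 2-state, 3-symbol model and one quantized observation sequence
# (illustrative values only, not taken from the paper).
pi = [0.6, 0.4]
A = [[0.7, 0.3], [0.4, 0.6]]
B = [[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]
obs = [0, 1, 2, 2, 1, 0]

before = sum(forward(pi, A, B, obs)[-1])
pi, A, B = baum_welch_step(pi, A, B, obs)
after = sum(forward(pi, A, B, obs)[-1])
# EM guarantees the sequence likelihood does not decrease across iterations.
```

In the paper's setting one such model would be trained per gesture, and a new sequence classified by the model giving the highest forward likelihood.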
