Using gesture and speech control for commanding a robot assistant
- 25 June 2003
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Commanding a mobile robot assistant still typically requires classical user interfaces. Verbal and gesture commands offer a more intuitive alternative. In this article, we present new approaches and enhancements to established methods in use in our laboratory. Our aim is to interact with a robot through natural, direct communication techniques so that it can perform simple tasks robustly. We first describe the robot's vision and speech recognition systems, then present the robot control scheme that selects the appropriate robot reaction for solving basic manipulation tasks.
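The abstract describes combining speech and gesture input to select a robot action. A minimal illustrative sketch of such multimodal fusion is shown below; all names and the fusion rule (spoken verb supplies the action, a pointing gesture resolves the target when no object is named) are assumptions for illustration, not details taken from the paper.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch: fuse a speech recognition result with a
# pointing-gesture result into one robot command. The structure is
# illustrative only and does not reflect the authors' implementation.

@dataclass
class SpeechResult:
    verb: str                    # recognized action word, e.g. "pick up"
    object_name: Optional[str]   # explicitly spoken object, if any

@dataclass
class GestureResult:
    pointed_object: Optional[str]  # object resolved from pointing direction

def fuse_commands(speech: SpeechResult,
                  gesture: GestureResult) -> Optional[str]:
    """Combine speech and gesture into a single command string.

    Speech supplies the action; the target comes from speech if an
    object was named explicitly, otherwise from the pointing gesture.
    Returns None when no target can be resolved.
    """
    target = speech.object_name or gesture.pointed_object
    if target is None:
        return None
    return f"{speech.verb} {target}"

# Example: saying "pick up" while pointing at a cup yields "pick up cup".
command = fuse_commands(SpeechResult("pick up", None), GestureResult("cup"))
```

In this sketch an ambiguous input (no spoken object and no recognized pointing target) yields no command, leaving room for the robot to ask a clarifying question rather than act on a guess.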