A gesture interface for human-robot-interaction
- Conference paper
- Published by the Institute of Electrical and Electronics Engineers (IEEE), 1998
- pp. 546-551
- https://doi.org/10.1109/afgr.1998.671005
Abstract
The authors present a person-independent gesture interface implemented on a real robot that allows the user to give simple commands, e.g., how to grasp an object and where to put it. The gesture analysis relies on real-time tracking of the user's hand and a refined analysis of the hand's shape in the presence of varying complex backgrounds.
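To give a concrete flavor of the kind of processing the abstract alludes to, the sketch below shows a minimal hand-localization step based on skin-color segmentation with OpenCV. This is an illustrative stand-in only, not the authors' method (the paper relies on tracking plus a refined hand-shape analysis); the `cv2` pipeline, the `locate_hand` helper, and the YCrCb thresholds are all assumptions introduced here for illustration.

```python
# Illustrative sketch only: naive skin-color hand localization, NOT the
# method described in the paper. Assumes OpenCV (cv2) and NumPy are
# available; the YCrCb thresholds are common heuristics, not values
# taken from the publication.
import cv2
import numpy as np

# Rough skin-tone bounds in YCrCb space (assumed, scene-dependent).
SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)


def locate_hand(frame_bgr):
    """Return the (x, y) centroid of the largest skin-colored blob, or None."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    # Suppress small speckles before looking for connected components.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # [-2] keeps this compatible with both OpenCV 3.x and 4.x return tuples.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default camera
    ok, frame = cap.read()
    if ok:
        print("hand centroid:", locate_hand(frame))
    cap.release()
```

Such a color-based localizer would only cover the tracking stage; robust shape classification against complex backgrounds, as the abstract emphasizes, requires a more refined analysis of the segmented hand.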