Automatic learning of assembly tasks using a DataGlove system
- 19 November 2002
- Proceedings article
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 1, pp. 1-8
- https://doi.org/10.1109/iros.1995.525767
Abstract
This paper deals with the automatic programming of robots. We present here a method in which a robot can learn new assembly tasks by monitoring the performance of a human operator wearing a DataGlove. In particular, the system records the motions of the DataGlove as it is used to manipulate actual objects and, by using geometric reasoning, deduces the assembly task that is being performed. Subsequently, the assembly steps thus deduced are translated automatically by a task planner into a robot manipulation program. In experimental demonstrations of this system, the robot can now learn assembly tasks involving pickups, putdowns and various mating operations.
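The abstract describes an observe-then-plan pipeline: record glove motion, deduce assembly steps by geometric reasoning, then translate those steps into a robot program. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's implementation: it segments a recorded glove trajectory into grasp and release events, labels each as a pick-up, put-down, or mating step, and emits a toy command list. All names (`GloveFrame`, `segment_steps`, `emit_program`), thresholds, and the fixture-proximity test for mating are assumptions made for illustration.

```python
# Hypothetical sketch of an observe-then-plan pipeline for glove-based task
# learning. Names and thresholds are illustrative, not taken from the paper.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class GloveFrame:
    t: float                                  # timestamp in seconds
    palm_xyz: Tuple[float, float, float]      # palm position in the workspace frame
    finger_closure: float                     # 0.0 = open hand, 1.0 = fully closed

@dataclass
class AssemblyStep:
    action: str                               # "pickup", "putdown", or "mate"
    at: Tuple[float, float, float]

GRASP_THRESHOLD = 0.6                         # closure above this is treated as a grasp

def segment_steps(frames: List[GloveFrame],
                  fixture_xyz: Tuple[float, float, float],
                  mate_radius: float = 0.05) -> List[AssemblyStep]:
    """Deduce assembly steps from a recorded glove trajectory.

    A rising edge of finger closure is read as a pick-up at the current palm
    position; a falling edge is a put-down, or a mating operation if the
    release happens within `mate_radius` of a known fixture location.
    """
    steps: List[AssemblyStep] = []
    grasping = False
    for f in frames:
        closed = f.finger_closure >= GRASP_THRESHOLD
        if closed and not grasping:
            steps.append(AssemblyStep("pickup", f.palm_xyz))
        elif not closed and grasping:
            dist = sum((a - b) ** 2 for a, b in zip(f.palm_xyz, fixture_xyz)) ** 0.5
            action = "mate" if dist <= mate_radius else "putdown"
            steps.append(AssemblyStep(action, f.palm_xyz))
        grasping = closed
    return steps

def emit_program(steps: List[AssemblyStep]) -> List[str]:
    """Translate deduced steps into a toy manipulator command list."""
    program = []
    for s in steps:
        program.append(f"MOVE {s.at}")
        program.append("CLOSE_GRIPPER" if s.action == "pickup" else "OPEN_GRIPPER")
    return program

if __name__ == "__main__":
    demo = [
        GloveFrame(0.0, (0.10, 0.20, 0.05), 0.1),
        GloveFrame(0.5, (0.10, 0.20, 0.05), 0.8),   # grasp the part
        GloveFrame(1.0, (0.40, 0.20, 0.05), 0.8),   # carry it to the fixture
        GloveFrame(1.5, (0.40, 0.20, 0.05), 0.2),   # release near the fixture
    ]
    for line in emit_program(segment_steps(demo, fixture_xyz=(0.40, 0.20, 0.05))):
        print(line)
```

The threshold-based grasp detection and single-fixture proximity check stand in for the geometric reasoning the paper applies to actual object geometry; a real system would match the grasped and supporting objects' surfaces to identify the specific mating operation.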