Real-time tracking meets online grasp planning

Abstract
This paper describes a synergistic integration of a grasping simulator and a real-time visual tracking system that work in concert to (1) find an object's pose, (2) plan grasps and movement trajectories, and (3) visually monitor task execution. Starting with a CAD model of the object to be grasped, the system finds the object's pose through vision, which synchronizes the state of the robot workcell with GraspIt, an online, model-based grasp planning and visualization system we have developed. GraspIt then plans a stable grasp for the object and directs the robotic hand system to execute it. It also generates trajectories for the movement of the grasped object, which the visual control system uses to monitor the task and compare the actual grasp and trajectory with the planned ones. We present experimental results on typical grasping tasks.
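The pipeline in the abstract (pose estimation, grasp planning, execution monitoring) can be sketched as a simple control loop. This is a minimal illustrative sketch only; every function and threshold below is hypothetical and does not reflect the actual GraspIt or tracker APIs.

```python
# Hypothetical sketch of the vision/planning/monitoring pipeline.
# All names, data shapes, and tolerances are illustrative assumptions,
# not the paper's actual implementation.
from dataclasses import dataclass


@dataclass
class Pose:
    position: tuple      # (x, y, z), assumed metres
    orientation: tuple   # quaternion (w, x, y, z)


def estimate_pose(cad_model: str) -> Pose:
    """Stand-in for the vision system: localize the object from its CAD model."""
    return Pose((0.40, 0.00, 0.05), (1.0, 0.0, 0.0, 0.0))


def plan_grasp(pose: Pose) -> dict:
    """Stand-in for the grasp planner: pick a stable grasp for the posed object."""
    return {"approach": pose.position, "quality": 0.82}


def execution_matches_plan(planned: dict, observed: dict, tol: float = 0.1) -> bool:
    """Stand-in for visual monitoring: compare planned vs. tracked grasp quality."""
    return abs(planned["quality"] - observed["quality"]) < tol


pose = estimate_pose("mug_model")          # (1) find the object's pose
grasp = plan_grasp(pose)                   # (2) plan a stable grasp
observed = {"quality": 0.80}               # tracker's estimate during execution
ok = execution_matches_plan(grasp, observed)  # (3) monitor task execution
print(ok)  # → True: execution stayed within tolerance of the plan
```

The point of the sketch is the data flow: the pose estimate feeds the planner, and the planner's output becomes the reference the tracker checks against at runtime.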