Automatic visual station keeping of an underwater robot

Abstract
This paper presents a method for drift-free station keeping of an underwater robot using computer vision. The sensing problem is simplified by assuming an active control system can be used to keep positional errors small. Robot position is obtained by tracking texture features using image filtering and correlation. Errors in four degrees of freedom (three-axis translation and yaw) are determined in real time and are fed into a robot control system to accomplish the task of station keeping. Experimental results demonstrating sensing quality and robot station keeping are presented.
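As a rough illustration of the correlation step described above, the sketch below (not the paper's implementation) locates a reference texture patch in the current frame with OpenCV's normalized cross-correlation and reports its pixel offset, which could serve as a translational error signal; the track_patch helper and patch-selection strategy are illustrative assumptions.

```python
import cv2


def track_patch(reference_frame, current_frame, patch_rect):
    """Return the (dx, dy) pixel displacement of a texture patch.

    patch_rect is (x, y, w, h) in the reference frame. The helper name and
    search strategy are hypothetical, chosen only to illustrate
    correlation-based tracking.
    """
    x, y, w, h = patch_rect
    template = reference_frame[y:y + h, x:x + w]

    # Normalized cross-correlation of the reference patch against the
    # current frame; the peak location gives the patch's new position.
    response = cv2.matchTemplate(current_frame, template, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(response)

    dx = max_loc[0] - x
    dy = max_loc[1] - y
    return dx, dy, max_val  # max_val can be used to reject weak matches
```

Tracking several such patches spread across the image would, in principle, allow yaw to be estimated alongside translation, as the abstract's four-degree-of-freedom error suggests.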
