Remote-brained ape-like robot to study full-body mobile behaviors based on simulated models and real-time vision

Abstract
We present a new type of robot that has two arms and two legs like an ape and is intended for studying a variety of behaviors based on models and vision. The robot is designed as a remote-brained robot: it does not carry its own brain within its body, but instead leaves the brain in the mother environment and communicates with it over radio links. The brain software is raised in the mother environment and inherited over generations. In this framework the robot system can have a powerful modeling system and vision-processing system in the brain environment. We have applied this approach to the formation of model- and vision-based behaviors for a multi-limbed mobile robot. In this paper we present an ape-like robot built within the remote-brained environment and describe model-based motions and vision-based experiments performed by the ape-like robot.
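The remote-brained division of labor described above (an on-body side that only senses and actuates, and an off-board brain that holds the models and vision processing) can be sketched as a control loop over a link. The sketch below is purely illustrative, assuming a hypothetical API; the radio link is modeled with in-memory queues, and the `RemoteBrain`, `RobotBody`, and `SensorFrame` names are inventions, not the paper's software.

```python
from dataclasses import dataclass
from queue import Queue

# Illustrative sketch only: the robot body streams sensor data to an
# off-board "brain" over a link (modeled here as in-memory queues) and
# receives motor commands back. All class and field names are assumptions.

@dataclass
class SensorFrame:
    camera: list          # raw image data from the on-body camera
    joint_angles: list    # current joint configuration

@dataclass
class MotorCommand:
    joint_targets: list   # desired joint configuration

class RemoteBrain:
    """Off-board side: would hold the modeling and vision systems."""
    def decide(self, frame: SensorFrame) -> MotorCommand:
        # Placeholder policy: move each joint halfway toward zero.
        return MotorCommand([a * 0.5 for a in frame.joint_angles])

class RobotBody:
    """On-body side: only sensing, actuation, and the radio link."""
    def __init__(self, uplink: Queue, downlink: Queue):
        self.uplink, self.downlink = uplink, downlink
        self.joint_angles = [0.4, -0.2, 0.8]

    def send_sensors(self):
        self.uplink.put(SensorFrame(camera=[], joint_angles=list(self.joint_angles)))

    def apply_command(self):
        self.joint_angles = self.downlink.get().joint_targets

def run_one_cycle(body: RobotBody, brain: RemoteBrain,
                  uplink: Queue, downlink: Queue):
    """One control cycle: sense -> uplink -> brain -> downlink -> act."""
    body.send_sensors()
    frame = uplink.get()           # brain receives sensor data
    downlink.put(brain.decide(frame))  # brain replies with a command
    body.apply_command()

uplink, downlink = Queue(), Queue()
body, brain = RobotBody(uplink, downlink), RemoteBrain()
run_one_cycle(body, brain, uplink, downlink)
print(body.joint_angles)
```

In a real system the two sides would run on separate machines, with the brain process free to use far more computation than the body could carry.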
