Remote-brained ape-like robot to study full-body mobile behaviors based on simulated models and real-time vision
- 1 January 1996
- journal article
- Published by Taylor & Francis in Advanced Robotics
- Vol. 11 (6), 653-668
- https://doi.org/10.1163/156855397x00137
Abstract
We present a new type of robot which, like an ape, has two arms and two legs, and which is aimed at studying a variety of behaviors based on models and vision. The robot is designed as a remote-brained robot: it does not carry its own brain within the body. It leaves the brain in the mother environment and talks with it over radio links. The brain software is raised in the mother environment and inherited over generations. In this framework the robot system can have a powerful modeling system and vision processing system in the brain environment. We have applied this approach to the formation of model- and vision-based behaviors of a multi-limbed mobile robot. In this paper we present an ape-like robot with its remote-brained environment and describe model-based motions and vision-based experiments performed by the robot.
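The remote-brained framework described above separates the robot body from its brain process, which runs in the mother environment and exchanges sensor data and motor commands over a wireless link. A minimal sketch of that split, not the authors' implementation, is below; the radio link is stood in for by a local UDP socket, and all names (`brain_policy`, `body_step`, the port number) are hypothetical.

```python
# Sketch of a remote-brained control loop: the body carries no brain; it
# streams sensor readings to a brain process in the "mother environment"
# and receives motor commands back. A localhost UDP socket stands in for
# the radio link of the paper; the toy policy is purely illustrative.
import json
import socket
import threading

BRAIN_ADDR = ("127.0.0.1", 9999)  # assumed port for the stand-in link


def brain_policy(sensors):
    """Toy 'brain': map each joint angle to a corrective command."""
    return {joint: -0.5 * angle for joint, angle in sensors["joints"].items()}


def serve_one_request(sock):
    """Brain side: receive one sensor packet, reply with motor commands."""
    data, addr = sock.recvfrom(4096)
    commands = brain_policy(json.loads(data.decode()))
    sock.sendto(json.dumps(commands).encode(), addr)


def body_step(sensors):
    """Body side: ship sensors over the link, wait for commands."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(1.0)
        s.sendto(json.dumps(sensors).encode(), BRAIN_ADDR)
        reply, _ = s.recvfrom(4096)
    return json.loads(reply.decode())


if __name__ == "__main__":
    brain = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    brain.bind(BRAIN_ADDR)
    t = threading.Thread(target=serve_one_request, args=(brain,))
    t.start()
    cmds = body_step({"joints": {"left_knee": 0.2, "right_knee": -0.4}})
    t.join()
    brain.close()
    print(cmds)
```

One consequence of this design, echoed in the abstract, is that the brain side can run arbitrarily heavy modeling and vision software, since only compact sensor and command packets cross the link.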