Superior Generalization Capability of Hardware-Learning Algorithm Developed for Self-Learning Neuron-MOS Neural Networks
- 1 February 1995
- journal article
- Published by IOP Publishing in Japanese Journal of Applied Physics
- Vol. 34 (2S)
- https://doi.org/10.1143/jjap.34.1066
Abstract
We have investigated the learning performance of the hardware backpropagation (HBP) algorithm, a hardware-oriented learning algorithm developed for the self-learning architecture of neural networks constructed using neuron MOS (metal-oxide-semiconductor) transistors. The task of finding a mirror-symmetry axis in a 4×4 binary pixel array was tested by computer simulation based on the HBP algorithm. Despite the inherent restrictions imposed on the hardware-learning algorithm, HBP exhibits learning performance equivalent to that of the original backpropagation (BP) algorithm when all the pertinent parameters are optimized. Most importantly, we have found that HBP has a superior generalization capability over BP; namely, HBP exhibits higher performance in solving problems that the network has not yet learnt.
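The abstract's benchmark can be made concrete with a small sketch. The HBP update rules and the paper's network configuration are not given here, so the following shows only the task (detecting a vertical mirror-symmetry axis in a 4×4 binary pixel array) and a plain-BP baseline; the 16-8-1 network size, learning rate, and dataset construction are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_mirror_symmetric(img):
    """True if a 4x4 array equals its left-right mirror image."""
    return np.array_equal(img, img[:, ::-1])

def random_symmetric():
    """Build a guaranteed-symmetric pattern by mirroring a random left half."""
    left = rng.integers(0, 2, size=(4, 2))
    return np.concatenate([left, left[:, ::-1]], axis=1)

# Balanced dataset: half deliberately symmetric, half fully random.
# Labels are always recomputed from the pattern itself.
imgs = [random_symmetric() for _ in range(256)]
imgs += [rng.integers(0, 2, size=(4, 4)) for _ in range(256)]
y = np.array([is_mirror_symmetric(im) for im in imgs], dtype=float)
X = np.stack(imgs).reshape(len(imgs), 16).astype(float)

# Tiny 16-8-1 multilayer perceptron trained with full-batch BP
# on a squared-error loss (an assumed baseline configuration).
W1 = rng.normal(0.0, 0.5, (16, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1));  b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1 + b1)            # hidden activations
    return h, sigmoid(h @ W2 + b2)[:, 0]  # scalar output per pattern

h, out = forward(X)
initial_loss = 0.5 * np.mean((out - y) ** 2)

lr = 1.0
for _ in range(3000):
    h, out = forward(X)
    d_out = (out - y) * out * (1 - out)          # output-layer delta
    d_h = (d_out[:, None] @ W2.T) * h * (1 - h)  # hidden-layer delta
    W2 -= lr * h.T @ d_out[:, None] / len(X); b2 -= lr * d_out.mean()
    W1 -= lr * X.T @ d_h / len(X);            b1 -= lr * d_h.mean(axis=0)

h, out = forward(X)
final_loss = 0.5 * np.mean((out - y) ** 2)
print(f"loss {initial_loss:.3f} -> {final_loss:.3f}")
```

The paper's point is that the hardware-constrained HBP variant matches this kind of BP training in learning performance and generalizes better to unseen patterns; the constrained update rules themselves would replace the weight updates in the loop above.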