An analysis on the performance of silicon implementations of backpropagation algorithms for artificial neural networks
- 1 January 1991
- journal article
- Published by the Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Computers
- Vol. 40 (12), 1380-1389
- https://doi.org/10.1109/12.106223
Abstract
The effects of silicon implementation on the backpropagation learning rule in artificial neural systems are examined. The effects on learning performance of limited weight resolution, range limitations, and the steepness of the activation function are considered. A minimum resolution of about 20 to 22 bits is generally required, but this figure can be reduced to about 14 to 15 bits by properly choosing the learning parameter η, which attains good performance in the presence of limited resolution. This performance can be further improved by using a modified batch backpropagation rule. Theoretical analysis is compared with ad hoc simulations, and the results are discussed in detail.
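To make the resolution effect concrete, the following is a minimal sketch, not the paper's experimental setup: a tiny XOR network trained with batch backpropagation whose weights are rounded onto a uniform b-bit grid after every update. Once η times a gradient component falls below half the grid step, the update rounds back to the stored value and learning stalls. The function names, the weight range, and the bit widths below are illustrative assumptions; the paper's 20-22 and 14-15 bit thresholds come from its own benchmarks and analysis, not from this toy problem.

```python
import numpy as np

def quantize(w, bits, w_max=4.0):
    """Round w to the nearest level of a uniform bits-bit grid on [-w_max, w_max]."""
    step = 2.0 * w_max / (2**bits - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def train_xor(bits, eta=0.7, epochs=5000, seed=0):
    """Batch backpropagation on XOR with a 2-2-1 sigmoid network.
    Weights are re-quantized to `bits` bits after every update
    (bits=None keeps full floating-point precision)."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
    y = np.array([[0], [1], [1], [0]], float)
    Xb = np.hstack([X, np.ones((4, 1))])      # bias input column
    W1 = rng.normal(0, 0.5, (3, 2))           # input -> hidden (bias row included)
    W2 = rng.normal(0, 0.5, (3, 1))           # hidden -> output (bias row included)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(Xb @ W1)
        hb = np.hstack([h, np.ones((4, 1))])
        out = sig(hb @ W2)
        d2 = (out - y) * out * (1 - out)      # output-layer delta
        d1 = (d2 @ W2[:2].T) * h * (1 - h)    # hidden-layer delta
        W2 -= eta * hb.T @ d2                 # batch gradient steps
        W1 -= eta * Xb.T @ d1
        if bits is not None:
            # Updates smaller than half the grid step round back to the
            # old stored value, so learning stalls at low resolution.
            W1, W2 = quantize(W1, bits), quantize(W2, bits)
    h = sig(Xb @ W1)
    out = sig(np.hstack([h, np.ones((4, 1))]) @ W2)
    return float(np.mean((out - y) ** 2))

for bits in (None, 16, 10, 6):
    print(f"bits={bits}: final MSE = {train_xor(bits):.4f}")
```

With full precision or a generous bit width the error typically keeps decreasing, while at very low resolution the quantization step exceeds η times most gradient components and the error plateaus. Where exactly the crossover falls depends on the task, η, and the weight range, which is the trade-off the paper quantifies.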