Weight perturbation: an optimal architecture and learning technique for analog VLSI feedforward and recurrent multilayer networks
- 1 January 1992
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 3 (1) , 154-157
- https://doi.org/10.1109/72.105429
Abstract
Previous work on analog VLSI implementation of multilayer perceptrons with on-chip learning has mainly targeted the implementation of algorithms such as back-propagation. Although back-propagation is efficient, its implementation in analog VLSI requires excessive computational hardware. It is shown that using gradient descent with direct approximation of the gradient instead of back-propagation is more economical for parallel analog implementations. It is shown that this technique (which is called 'weight perturbation') is suitable for multilayer recurrent networks as well. A discrete-level analog implementation showing the training of an XOR network as an example is presented.
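The abstract's central idea, approximating the gradient directly rather than computing it with a backward pass, can be illustrated in software. The sketch below is an assumed reconstruction (not the authors' circuit or code): each weight is perturbed in turn, the resulting change in the network error gives a finite-difference estimate of that weight's gradient, and an ordinary gradient-descent step follows. The 2-3-1 network size, learning rate, and perturbation size are illustrative choices.

```python
# Weight perturbation sketch (hypothetical reconstruction): the gradient is
# measured directly as dE/dw_i ~ (E(w + pert*e_i) - E(w)) / pert,
# so no backward pass (and hence no back-propagation hardware) is required.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set, as in the paper's example
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(w, x):
    """2-3-1 sigmoid MLP with all weights flattened into one vector w."""
    W1, b1 = w[:6].reshape(2, 3), w[6:9]
    W2, b2 = w[9:12], w[12]
    h = sigmoid(x @ W1 + b1)
    return sigmoid(h @ W2 + b2)

def error(w):
    return np.mean((forward(w, X) - T) ** 2)

w = rng.normal(0.0, 1.0, size=13)
pert, lr = 1e-4, 0.5          # perturbation size and learning rate
E0 = error(w)                 # error before training

for _ in range(20000):
    E = error(w)
    grad = np.empty_like(w)
    for i in range(w.size):   # perturb one weight at a time
        w[i] += pert
        grad[i] = (error(w) - E) / pert
        w[i] -= pert          # restore the weight before moving on
    w -= lr * grad            # ordinary gradient descent step
```

In hardware the appeal is that the same forward circuit used for inference also measures the gradient; only a weight-addressing mechanism and an error comparison are added, rather than a full backward signal path.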