Hardware-backpropagation learning of neuron MOS neural networks
- 1 January 1992
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- ISSN 0163-1918, pp. 435-438
- https://doi.org/10.1109/iedm.1992.307395
Abstract
This paper describes the design and architecture of a neural network with an on-chip hardware-learning capability, in which a functional transistor called the neuron MOSFET (neuMOS or νMOS) is used as the key element. To implement the learning algorithm on the chip, a new hardware-oriented backpropagation learning algorithm has been developed by modifying and simplifying the original backpropagation algorithm. In addition, a six-transistor synapse cell that is free from standby power dissipation and can represent both positive and negative weights (excitatory and inhibitory synapse functions) under a single 5 V power supply has been developed for use on a self-learning chip.
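The abstract does not spell out how the original backpropagation algorithm was simplified for hardware. One common hardware-oriented simplification, shown below purely as an illustrative sketch (not the authors' documented method), replaces the multiply-heavy gradient-descent update with a fixed-increment, sign-based weight change plus an error dead zone, both of which map naturally onto comparators and fixed charge-injection steps in analog circuitry. All names and constants here (`train_sign_bp`, the step size, the dead zone) are hypothetical.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_sign_bp(samples, epochs=1000, step=0.05, dead_zone=0.3):
    """Illustrative hardware-oriented update: each weight moves by a fixed
    increment opposite the sign of its error gradient, and no update is made
    when the output error is already inside a small dead zone (as a hardware
    comparator would decide). This is an assumption, not the paper's circuit."""
    w = [0.0, 0.0]  # two synaptic weights
    b = 0.0         # threshold (bias) weight
    for _ in range(epochs):
        for x, t in samples:
            y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
            err = y - t
            if abs(err) < dead_zone:
                continue  # error small enough: skip the update entirely
            delta = err * y * (1.0 - y)  # backprop delta for a sigmoid unit
            for i in range(2):
                g = delta * x[i]  # gradient w.r.t. weight i
                if g > 0:
                    w[i] -= step  # fixed-size step, only the sign matters
                elif g < 0:
                    w[i] += step
            b += -step if delta > 0 else step
    return w, b

# Train a single neuron on the OR function as a toy demonstration.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_sign_bp(data)
preds = [int(sigmoid(w[0] * x[0] + w[1] * x[1] + b) > 0.5) for x, _ in data]
```

Because only the sign of the gradient is used, each synapse needs a comparator and a fixed charge-injection step rather than an analog multiplier, which is the kind of circuit-level saving a hardware-learning chip is after.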
References
- A functional MOS transistor featuring gate-level weighted sum and threshold operations. IEEE Transactions on Electron Devices, 1992
- Characteristics of floating gate device as analogue memory for neural networks. Electronics Letters, 1991
- Optimum design of dual-control gate cell for high-density EEPROM's. IEEE Transactions on Electron Devices, 1985