Synaptic weight noise during multilayer perceptron training: fault tolerance and training improvements
- 1 July 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 4 (4), 722-725
- https://doi.org/10.1109/72.238328
Abstract
The authors develop a mathematical model of the effects of synaptic arithmetic noise in multilayer perceptron training. Predictions are made regarding enhanced fault-tolerance and generalization ability and improved learning trajectory. These predictions are subsequently verified by simulation. The results are perfectly general and have profound implications for the accuracy requirements in multilayer perceptron (MLP) training, particularly in the analog domain.
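The training-time perturbation the abstract describes can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' code: zero-mean Gaussian noise is drawn fresh on every pass and added to the weights used in the forward and backward computations, while the stored "clean" weights receive the updates. The noise amplitude `noise_std`, the XOR task, and the network size are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a standard MLP sanity check (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One-hidden-layer MLP with sigmoid units.
W1 = rng.normal(0, 0.5, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
noise_std = 0.05  # assumed noise amplitude; treated as a free parameter here

for epoch in range(5000):
    # Inject fresh zero-mean Gaussian noise into the weights for this pass
    # only; the stored (clean) weights are what gets updated below.
    W1n = W1 + rng.normal(0, noise_std, W1.shape)
    W2n = W2 + rng.normal(0, noise_std, W2.shape)

    # Forward pass through the noisy weights.
    h = sigmoid(X @ W1n + b1)
    out = sigmoid(h @ W2n + b2)

    # Backward pass (squared error); gradients use the noisy weights,
    # mimicking arithmetic noise in an analog implementation.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2n.T) * h * (1 - h)

    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out.ravel(), 2))  # should approach [0, 1, 1, 0]
```

Because the network never sees the same weight vector twice, it is pushed toward solutions that tolerate weight perturbation, which is the mechanism behind the fault-tolerance and generalization gains the abstract predicts.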