Noise injection into inputs in back-propagation learning
- 1 January 1992
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Systems, Man, and Cybernetics
- Vol. 22 (3), 436-440
- https://doi.org/10.1109/21.155944
Abstract
Back-propagation can be considered a nonlinear regression technique that allows a nonlinear neural network to acquire an input/output (I/O) association from a limited number of samples chosen from a population of input and output patterns. A crucial problem with back-propagation is its generalization capability: a network successfully trained on given samples is not guaranteed to provide the desired associations for untrained inputs as well. Concerning this problem, some authors have shown experimentally that the generalization capability can be remarkably enhanced by training the network with noise-injected inputs. The author mathematically explains why and how noise injection into the inputs has such an effect.
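The training scheme the abstract analyzes can be sketched as follows. This is a minimal illustration only: the toy sine-regression task, network size, learning rate, and noise level `sigma` are all assumptions for demonstration and are not taken from the paper. The key step is perturbing each training input with fresh zero-mean Gaussian noise on every pass before the ordinary back-propagation update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (assumed for illustration): learn y = sin(x)
# from a small sample, as in the limited-sample setting the abstract describes.
X = rng.uniform(-2.0, 2.0, size=(32, 1))
y = np.sin(X)

# One-hidden-layer tanh network; sizes and initialization are arbitrary choices.
W1 = rng.normal(0.0, 0.5, size=(1, 16))
b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, size=(16, 1))
b2 = np.zeros(1)

lr = 0.05      # learning rate (assumed)
sigma = 0.1    # standard deviation of the injected input noise (assumed)

for epoch in range(5000):
    # Noise injection: corrupt the inputs with fresh Gaussian noise each pass.
    Xn = X + rng.normal(0.0, sigma, size=X.shape)

    # Forward pass on the noisy inputs.
    h = np.tanh(Xn @ W1 + b1)
    pred = h @ W2 + b2

    # Backward pass for mean squared error.
    err = pred - y
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h**2)   # tanh derivative
    grad_W1 = Xn.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)

    # Gradient-descent update.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

# Evaluate on the clean (noise-free) training inputs.
mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Because a different noise realization is drawn every epoch, the network effectively sees a smeared-out neighborhood of each training point rather than the point itself, which is the smoothing effect the paper sets out to explain mathematically.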