The effect of initial weights on premature saturation in back-propagation learning
- 9 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. I, pp. 765-770
- https://doi.org/10.1109/ijcnn.1991.155275
Abstract
The critical drawback of the backpropagation learning algorithm is its slow error convergence. The major reason for this is premature saturation, a phenomenon in which the error of a neural network stays almost constant for some period of time during learning. It is known to be caused by an inappropriate set of initial weights. The probability that output nodes are incorrectly saturated at the beginning of learning is derived as a function of the range of the initial weights, the number of nodes in each layer, and the maximum slope of the sigmoidal activation function. This is verified by Monte Carlo simulation.
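The saturation effect the abstract describes can be illustrated empirically. The sketch below is not the paper's derivation; it is a hypothetical Monte Carlo experiment (assumed layer sizes, uniform weight initialization in [-w0, w0], and a 0.05 saturation threshold are all illustrative choices) that estimates how often output nodes of a freshly initialized two-layer sigmoid network land near 0 or 1, as the initial-weight range grows.

```python
import math
import random

def sigmoid(x):
    # Standard logistic activation, clamped to avoid math.exp overflow.
    if x < -500.0:
        return 0.0
    if x > 500.0:
        return 1.0
    return 1.0 / (1.0 + math.exp(-x))

def saturation_rate(w0, n_in=30, n_hidden=20, n_out=5,
                    trials=500, eps=0.05, seed=0):
    """Estimate the fraction of output nodes whose activation lies within
    eps of 0 or 1 at initialization, with all weights drawn uniformly
    from [-w0, w0] and inputs drawn uniformly from [0, 1].
    Network sizes and eps are illustrative assumptions, not the paper's."""
    rng = random.Random(seed)
    saturated = 0
    total = 0
    for _ in range(trials):
        x = [rng.uniform(0.0, 1.0) for _ in range(n_in)]
        w1 = [[rng.uniform(-w0, w0) for _ in range(n_in)]
              for _ in range(n_hidden)]
        w2 = [[rng.uniform(-w0, w0) for _ in range(n_hidden)]
              for _ in range(n_out)]
        # Forward pass through one hidden layer to the outputs.
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w1]
        y = [sigmoid(sum(w * hi for w, hi in zip(row, h))) for row in w2]
        for yi in y:
            total += 1
            if yi < eps or yi > 1.0 - eps:
                saturated += 1
    return saturated / total

# Wider initial-weight ranges drive more output nodes into saturation.
for w0 in (0.1, 1.0, 5.0):
    print(f"w0 = {w0}: saturation rate = {saturation_rate(w0):.3f}")
```

Under these assumptions the estimated saturation rate rises monotonically with the weight range w0, consistent with the abstract's claim that an inappropriate (too wide) set of initial weights causes premature saturation.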