Speeding Up Backpropagation Algorithms by Using Cross-Entropy Combined with Pattern Normalization
- 1 April 1998
- journal article
- Published by World Scientific Pub Co Pte Ltd in International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems
- Vol. 6 (2), 117-126
- https://doi.org/10.1142/s0218488598000100
Abstract
This paper demonstrates how the backpropagation algorithm (BP) and its variants can be accelerated significantly while the quality of the trained nets increases. Two modifications are proposed: first, instead of the usual quadratic error we use cross-entropy as the error function, and second, we normalize the input patterns. The first modification eliminates the so-called sigmoid prime factor from the update rule for the output units. The second balances the dynamic range of the inputs. The combination of both modifications is called CEN–Optimization (Cross Entropy combined with Pattern Normalization). As our simulation results show, CEN–Optimization improves not only online BP but also RPROP, the most sophisticated BP variant known today. Even though RPROP usually yields much better results than online BP, the performance gap between CEN–BP and CEN–RPROP is smaller than that between the standard versions of these algorithms. With CEN–RPROP, an error of zero on the training set is nearly guaranteed. Simultaneously, the generalization performance of the trained nets can be increased, because less complex networks suffice to fit the training set. Compared to the usual SSE (summed squared error), lower training errors can be achieved with fewer weights.
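The two modifications can be illustrated briefly. For a sigmoid output unit, the SSE gradient with respect to the unit's net input carries the sigmoid derivative y(1 - y), which vanishes when the unit saturates; under cross-entropy this factor cancels, leaving the plain error y - t. The following is a minimal sketch of both ideas, not code from the paper; the function names and the zero-mean/unit-variance form of the normalization are illustrative assumptions, as the paper's exact normalization scheme is not given in the abstract.

```python
# Hypothetical sketch (not from the paper): output-unit deltas for a
# sigmoid unit under SSE vs. cross-entropy, plus input normalization.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# SSE:  E = 0.5 * (t - y)^2   =>   dE/dz = (y - t) * y * (1 - y)
# The factor y * (1 - y) is the "sigmoid prime"; it vanishes as y
# approaches 0 or 1, so a saturated-but-wrong unit learns very slowly.
def delta_sse(y, t):
    return (y - t) * y * (1.0 - y)

# Cross-entropy:  E = -[t*log(y) + (1 - t)*log(1 - y)]   =>   dE/dz = y - t
# The sigmoid prime cancels, so the update stays large whenever the
# unit's output is far from its target, even when the unit is saturated.
def delta_cross_entropy(y, t):
    return y - t

# Pattern normalization (assumed zero-mean/unit-variance variant):
# rescale each input component over the training set so that all
# inputs have a balanced dynamic range.
def normalize_patterns(X):
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / np.where(sigma > 0, sigma, 1.0)
```

For example, with y = 0.99 and t = 0, delta_sse gives roughly 0.0098 while delta_cross_entropy gives 0.99, showing why removing the sigmoid prime factor speeds up training on badly saturated output units.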