A norm selection criterion for the generalized delta rule
- 1 January 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 2 (1), 125-130
- https://doi.org/10.1109/72.80298
Abstract
The derivation of a supervised training algorithm for a neural network implies the selection of a norm criterion which gives a suitable global measure of the particular distribution of errors. The author addresses this problem and proposes a correspondence between the error distribution at the output of a layered feedforward neural network and L(p) norms. The generalized delta rule is investigated to determine how its structure can be modified to perform minimization in the generic L(p) norm. The particular case of the Chebyshev norm is developed and tested.
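As an informal illustration of the idea summarized in the abstract, the sketch below shows how the output-layer error term of the generalized delta rule changes when the quadratic cost is replaced by a generic L(p) cost, with large p approaching the Chebyshev (L-infinity) criterion. This is a minimal assumption-laden sketch, not the paper's actual derivation; the cost form E = (1/p) * sum(|t - o|^p), the function names, and the training loop are illustrative choices.

```python
import numpy as np

def lp_delta(output, target, p=2.0):
    """Output-layer error term for an assumed L_p cost E = (1/p) * sum(|t - o|**p).

    For p = 2 this reduces to the familiar (target - output) delta of the
    standard generalized delta rule; larger p weights large errors more
    heavily, tending toward a Chebyshev-like (L_inf) criterion as p grows.
    """
    err = target - output
    return np.sign(err) * np.abs(err) ** (p - 1.0)  # -dE/d(output)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy example: a single sigmoid unit trained on one pattern with p = 4
rng = np.random.default_rng(0)
w = rng.normal(size=3)          # weights for 3 inputs (hypothetical setup)
x = np.array([0.2, -0.5, 1.0])  # one training pattern
t = 0.9                         # target output
lr = 0.1                        # learning rate

for _ in range(100):
    o = sigmoid(w @ x)
    delta = lp_delta(o, t, p=4.0) * o * (1.0 - o)  # chain rule through the sigmoid
    w += lr * delta * x                            # gradient-descent weight update
```

Under these assumptions, only the error term fed into the backpropagated delta changes; the rest of the weight-update machinery of the generalized delta rule is left untouched.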