Gradient descent learning in perceptrons: A review of its possibilities
- 1 August 1995
- Review article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 52 (2), pp. 1958–1967
- https://doi.org/10.1103/physreve.52.1958
Abstract
We present a streamlined formalism which reduces the calculation of the generalization error for a perceptron, trained on random examples generated by a teacher perceptron, to a matter of simple algebra. The method is valid whenever the student perceptron can be identified as the unique minimum of a specific cost function. The asymptotic generalization error is calculated explicitly for a broad class of cost functions, and a specific cost function is singled out that leads to a generalization error extremely close to that of the Bayes classifier.

Received 28 November 1994; published in the issue dated August 1995. © 1995 The American Physical Society.
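The teacher-student setup described in the abstract is straightforward to simulate. The sketch below is a minimal illustration, not the paper's method: it assumes Gaussian inputs, trains a student perceptron by gradient descent on a simple hinge-type cost (one member of the class of costs the paper considers, chosen here purely for concreteness), and estimates the generalization error from the teacher-student overlap R via the standard geometric formula ε = arccos(R)/π.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100    # input dimension
P = 2000   # number of training examples (alpha = P/N = 20)

# Teacher perceptron with random weights, normalized to |B|^2 = N
B = rng.standard_normal(N)
B *= np.sqrt(N) / np.linalg.norm(B)

# Random Gaussian examples labeled by the teacher
X = rng.standard_normal((P, N))
y = np.sign(X @ B)

# Student trained by plain gradient descent on a hinge-type cost
# V(lam) = max(0, 1 - lam), where lam is the aligned local field
# ("stability"). This cost is illustrative only; the paper analyzes
# a broad class of costs and singles out a near-Bayes-optimal one.
J = np.zeros(N)
eta = 0.1
for epoch in range(1000):
    stab = y * (X @ J) / np.sqrt(N)   # stabilities of all examples
    mask = stab < 1.0                 # examples still below the margin
    grad = -(y[mask, None] * X[mask]).sum(axis=0) / np.sqrt(N)
    J -= eta * grad / P               # average-gradient step

# For Gaussian inputs the generalization error depends only on the
# teacher-student overlap R: eps = arccos(R) / pi
R = (J @ B) / (np.linalg.norm(J) * np.linalg.norm(B))
eps = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi
print(f"overlap R = {R:.3f}, generalization error ~ {eps:.3f}")
```

As the training set grows (larger alpha = P/N), the overlap R approaches 1 and the measured error decays toward the asymptotic rates the paper computes analytically for each cost function.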