Learning from noisy data: An exactly solvable model
- 1 November 1995
- research article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 52 (5) , R4624-R4627
- https://doi.org/10.1103/physreve.52.r4624
Abstract
Exact results are derived for the learning of a linearly separable rule with a single-layer perceptron. We consider two sources of noise in the training data: the random inversion of the example outputs and weight noise in the teacher network. In both scenarios, we investigate on-line learning schemes that utilize only the latest in a sequence of uncorrelated random examples for an update of the student weights. We study Hebbian learning as well as on-line algorithms that achieve an optimal decrease of the generalization error. The latter realize an asymptotic decay of the generalization error that coincides, apart from prefactors, with the one found for off-line schemes.
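The output-noise scenario described above can be illustrated with a small simulation: a teacher perceptron defines the linearly separable rule, each example output is inverted with some probability, and the student is trained by the on-line Hebbian update. The following sketch assumes Gaussian inputs and uses the standard identity that the generalization error of a sign-perceptron equals arccos of the teacher–student overlap divided by pi; all parameter values (dimension, noise level, number of examples) are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100          # input dimension (illustrative)
steps = 20000    # number of on-line examples
flip_prob = 0.1  # probability of inverting a teacher output (output noise)

B = rng.standard_normal(N)   # teacher weights defining the rule
B /= np.linalg.norm(B)
J = np.zeros(N)              # student weights

def gen_error(J, B):
    """Generalization error for sign-perceptrons with Gaussian inputs:
    eps = arccos(overlap) / pi, where overlap = J.B / (|J||B|)."""
    nJ = np.linalg.norm(J)
    if nJ == 0.0:
        return 0.5  # random guessing before any update
    overlap = np.clip(J @ B / nJ, -1.0, 1.0)
    return np.arccos(overlap) / np.pi

for t in range(steps):
    x = rng.standard_normal(N)      # fresh, uncorrelated random example
    sigma = np.sign(B @ x)          # teacher output
    if rng.random() < flip_prob:    # noisy training data: random inversion
        sigma = -sigma
    J += sigma * x / N              # on-line Hebbian update

print(f"generalization error after {steps} examples: {gen_error(J, B):.3f}")
```

Even with inverted outputs the Hebbian student aligns with the teacher on average, since the noise only rescales the mean of the update; this is why the generalization error still decays, albeit with noise-dependent prefactors.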