Generalization in a linear perceptron in the presence of noise
- 7 March 1992
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 25 (5), pp. 1135-1147
- https://doi.org/10.1088/0305-4470/25/5/020
Abstract
The authors study the evolution of the generalization ability of a simple linear perceptron with N inputs which learns to imitate a 'teacher perceptron'. The system is trained on p = alpha N example inputs drawn from some distribution, and the generalization ability is measured by the average agreement with the teacher on test examples drawn from the same distribution. The dynamics can be solved analytically and exhibit a phase transition from imperfect to perfect generalization at alpha = 1 when there are no errors (static noise) in the training examples. If the examples are produced by an erroneous teacher, overfitting is observed, i.e. the generalization error starts to increase after a finite training time. It is shown that a weight decay of the same size as the variance of the noise (errors) on the teacher improves generalization and suppresses the overfitting. The generalization error as a function of time is calculated numerically for various values of the parameters. Finally, dynamic noise in the training is considered. White noise on the input corresponds on average to a weight decay and can thus improve generalization, whereas white noise on the weights or the output degrades generalization. Generalization is particularly sensitive to noise on the weights (for alpha < 1), where it makes the error increase steadily with time, but this effect is also shown to be damped by a weight decay. Weight noise and output noise act similarly above the transition at alpha = 1.
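The overfitting and weight-decay claims lend themselves to a quick numerical check. Below is a minimal simulation sketch, not taken from the paper: a linear student is trained by gradient descent on p = alpha N Gaussian examples labelled by a noisy linear teacher, once without weight decay and once with a decay strength set near the output-noise variance. All variable names, the normalization of the teacher, and the exact decay strength are illustrative assumptions; the paper's conventions may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

N, alpha = 100, 0.8            # input dimension, load p = alpha * N
p = int(alpha * N)
sigma2 = 0.2                   # variance of the noise on the teacher's output
eta, steps = 0.02, 5000        # learning rate, number of gradient steps

# Teacher with unit-norm weights (on average); noisy training targets.
w_teacher = rng.normal(size=N) / np.sqrt(N)
X = rng.normal(size=(p, N))
y = X @ w_teacher + rng.normal(scale=np.sqrt(sigma2), size=p)

def train(weight_decay):
    """Gradient descent on mean squared error plus an optional weight decay."""
    w = np.zeros(N)
    eps_g = []
    for _ in range(steps):
        grad = (X.T @ (X @ w - y)) / p + weight_decay * w
        w -= eta * grad
        # For unit-variance Gaussian inputs, the generalization error
        # (mean squared disagreement with the noiseless teacher) is ||w - w*||^2.
        eps_g.append(np.sum((w - w_teacher) ** 2))
    return np.array(eps_g)

err_plain = train(0.0)      # expected to overfit: error rises after its minimum
err_decay = train(sigma2)   # decay comparable to the noise variance (assumed scale)

print(f"no decay   : min={err_plain.min():.4f}, final={err_plain[-1]:.4f}")
print(f"decay={sigma2}: min={err_decay.min():.4f}, final={err_decay[-1]:.4f}")
```

With these settings the run without decay should show its final error climbing back above the minimum reached earlier in training (the overfitting described in the abstract), while the run with decay keeps the late-time error near its minimum.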
This publication has 15 references indexed in Scilit:
- Generalization performance of Bayes optimal classification algorithm for learning a perceptron, Physical Review Letters, 1991
- Learning from examples in large neural networks, Physical Review Letters, 1990
- Exhaustive Learning, Neural Computation, 1990
- The Langevin method in the statistical dynamics of learning, Journal of Physics A: Mathematical and General, 1990
- Inference of a rule by a neural network with thermal noise, Physical Review Letters, 1990
- On the ability of the optimal perceptron to generalise, Journal of Physics A: Mathematical and General, 1990
- The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning, Neural Computation, 1989
- Phase transitions in simple learning, Journal of Physics A: Mathematical and General, 1989
- Three unfinished works on the optimal storage capacity of networks, Journal of Physics A: Mathematical and General, 1989
- What Size Net Gives Valid Generalization?, Neural Computation, 1989