Bounds for Predictive Errors in the Statistical Mechanics of Supervised Learning
- 13 November 1995
- Research article, Physical Review Letters (American Physical Society)
- Vol. 75 (20), 3772-3775
- https://doi.org/10.1103/physrevlett.75.3772
Abstract
Within a Bayesian framework, by generalizing inequalities known from statistical mechanics, we calculate general upper and lower bounds for a cumulative entropic error, which measures the success in the supervised learning of an unknown rule from examples. Both bounds match asymptotically when the number m of observed data grows large. We find that the information gain from observing a new example decreases universally like d/(2m). Here d is a dimension that is defined from the scaling of small volumes with respect to a distance in the space of rules.
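A brief sketch of how the two asymptotic statements in the abstract fit together; the symbols m (number of examples) and d (dimension) follow the abstract, while writing the cumulative error as a sum of per-example gains is an illustrative assumption here:

```latex
% Per-example information gain at the m-th example (universal rate):
\Delta(m) \;\approx\; \frac{d}{2m}

% Summing the gains over the first m examples, the cumulative entropic
% error then grows logarithmically (harmonic sum \approx \ln m):
E(m) \;=\; \sum_{k=1}^{m} \Delta(k) \;\approx\; \frac{d}{2}\,\ln m \;+\; O(1)
```

On this reading, upper and lower bounds on E(m) that both behave like (d/2) ln m for large m would match asymptotically, as the abstract states.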