Scaling laws in learning of classification tasks
- 17 May 1993
- Research article
- Published by the American Physical Society (APS) in Physical Review Letters
- Vol. 70 (20), 3167-3170
- https://doi.org/10.1103/physrevlett.70.3167
Abstract
The effect of the structure of the input distribution on the complexity of learning a pattern classification task is investigated. Using statistical mechanics, we study the performance of a winner-take-all machine at learning to classify points generated by a mixture of K Gaussian distributions (“clusters”) in R^N with intercluster distance u (relative to the cluster width). In the separation limit u≫1, the number of examples required for learning scales as NKu^{-p}, where the exponent p is 2 for zero-temperature Gibbs learning and 4 for the Hebb rule.
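The paper derives these scaling laws analytically. As a purely illustrative sketch of the setup the abstract describes (not code from the paper), the following Python snippet simulates Hebb learning and winner-take-all classification on a mixture of K unit-width Gaussian clusters in R^N and measures the generalization error; the model sizes (N = 50, K = 4, u = 6) and sample counts are arbitrary assumptions chosen for the demo.

```python
import numpy as np

# Hypothetical illustration of the abstract's setup: K unit-width Gaussian
# clusters in R^N with intercluster distance ~u, learned by the Hebb rule
# and classified by a winner-take-all machine. All numbers are assumptions.
rng = np.random.default_rng(0)
N, K, u = 50, 4, 6.0            # input dimension, clusters, separation
P_train, P_test = 400, 2000     # training / test sample sizes

# Random cluster centers scaled so pairwise distances concentrate around u
# (for i.i.d. Gaussian centers, E||c_i - c_j||^2 = 2N * sigma^2 = u^2).
centers = rng.standard_normal((K, N)) * (u / np.sqrt(2 * N))

def sample(P):
    """Draw P points from the mixture: pick a cluster, add unit Gaussian noise."""
    labels = rng.integers(0, K, size=P)
    return centers[labels] + rng.standard_normal((P, N)), labels

X_train, y_train = sample(P_train)
X_test, y_test = sample(P_test)

# Hebb rule: each unit's weight vector is the mean of its class's examples.
weights = np.stack([X_train[y_train == k].mean(axis=0) for k in range(K)])

# Winner-take-all: assign each test point to the unit with the highest score
# (overlap with a norm correction, i.e. nearest-prototype classification).
scores = X_test @ weights.T - 0.5 * np.sum(weights**2, axis=1)
error = np.mean(scores.argmax(axis=1) != y_test)
print(f"u = {u}, P = {P_train}: generalization error ~ {error:.3f}")
```

Sweeping P_train and u in this sketch would show the qualitative trend the abstract states: as the separation u grows, far fewer examples are needed before the generalization error drops.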