Statistical Mechanics of Support Vector Networks
- 5 April 1999
- Research article
- Published by the American Physical Society (APS) in Physical Review Letters
- Vol. 82 (14), 2975–2978
- https://doi.org/10.1103/physrevlett.82.2975
Abstract
Using methods of statistical physics, we investigate the generalization performance of support vector machines (SVMs), which have been recently introduced as a general alternative to neural networks. For nonlinear classification rules, the generalization error saturates on a plateau when the number of examples is too small to properly estimate the coefficients of the nonlinear part. When trained on simple rules, we find that SVMs overfit only weakly. The performance of SVMs is strongly enhanced when the distribution of the inputs has a gap in feature space.
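The setting studied in the abstract can be illustrated numerically. The sketch below is not the paper's method; it is a minimal, hypothetical experiment assuming a linear "teacher" rule and a max-margin-style linear SVM trained by Pegasos-type stochastic subgradient descent on the hinge loss. It estimates the generalization error for two training-set sizes, showing the error shrinking as more examples are seen:

```python
import random

N = 20  # input dimension (assumed for this toy experiment)

def teacher(x):
    # Simple target rule: a fixed linear teacher perceptron.
    return 1 if sum(x) > 0 else -1

def make_data(p, rng):
    xs = [[rng.gauss(0, 1) for _ in range(N)] for _ in range(p)]
    return xs, [teacher(x) for x in xs]

def train_svm(xs, ys, lam=0.01, epochs=200):
    # Pegasos-style subgradient descent on the regularized hinge loss:
    #   L(w) = (lam/2)||w||^2 + (1/p) * sum_i max(0, 1 - y_i w.x_i)
    w = [0.0] * N
    t = 0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            t += 1
            eta = 1.0 / (lam * t)  # standard Pegasos step size
            margin = y * sum(wj * xj for wj, xj in zip(w, x))
            w = [(1 - eta * lam) * wj for wj in w]
            if margin < 1:  # hinge active: push w toward the example
                w = [wj + eta * y * xj for wj, xj in zip(w, x)]
    return w

def gen_error(w, trials, rng):
    # Monte Carlo estimate of the generalization error vs. the teacher.
    errs = 0
    for _ in range(trials):
        x = [rng.gauss(0, 1) for _ in range(N)]
        pred = 1 if sum(wj * xj for wj, xj in zip(w, x)) > 0 else -1
        errs += pred != teacher(x)
    return errs / trials

rng = random.Random(0)
errors = {}
for p in (10, 100):  # number of training examples
    xs, ys = make_data(p, rng)
    errors[p] = gen_error(train_svm(xs, ys), trials=2000, rng=rng)
    print(f"p={p}: estimated generalization error {errors[p]:.3f}")
```

With a simple (realizable linear) rule, the error decays smoothly with the number of examples, consistent with the weak overfitting the abstract reports; the plateau effect for nonlinear rules would require a nonlinear (kernelized) student, which this toy sketch omits.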