Generalization properties of finite-size polynomial support vector machines
- 1 November 2000
- Research article
- Published by the American Physical Society (APS) in Physical Review E
- Vol. 62 (5), 7092-7099
- https://doi.org/10.1103/physreve.62.7092
Abstract
The learning properties of finite-size polynomial support vector machines are analyzed in the case of realizable classification tasks. The normalization of the high-order features acts as a squeezing factor, introducing a strong anisotropy in the pattern distribution in feature space. As a function of the training set size, the corresponding generalization error presents a crossover, more or less abrupt depending on the distribution's anisotropy and on the task to be learned, between a rapidly decreasing and a slowly decreasing regime. This behavior corresponds to the stepwise decrease found by Dietrich et al. [Phys. Rev. Lett. 82, 2975 (1999)] in the thermodynamic limit. The theoretical results are in excellent agreement with the numerical simulations.
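The crossover described in the abstract can be probed numerically. Below is a minimal sketch (not the authors' code) using scikit-learn's polynomial-kernel SVC as a stand-in for the maximal-margin machine analyzed in the paper; the teacher rule, input dimension N, kernel degree, and training-set sizes are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

N = 10       # input dimension (illustrative; the paper studies finite N)
DEGREE = 2   # polynomial kernel degree (assumed for this sketch)
N_TEST = 5000

def quad_features(X):
    """Degree-2 monomials x_i * x_j (i <= j) appended to the linear terms."""
    n = X.shape[1]
    quad = np.stack([X[:, i] * X[:, j] for i in range(n) for j in range(i, n)],
                    axis=1)
    return np.hstack([X, quad])

# Realizable task: a random "teacher" acting linearly on the quadratic
# features labels every pattern (an assumed, simple polynomial rule).
n_feat = quad_features(np.zeros((1, N))).shape[1]
teacher_w = rng.standard_normal(n_feat)

def labels(X):
    return np.sign(quad_features(X) @ teacher_w)

X_test = rng.standard_normal((N_TEST, N))
y_test = labels(X_test)

# Sweep the training set size P; alpha = P / N as in the statistical-
# mechanics literature. A large C approximates the hard-margin machine.
for alpha in [2, 4, 8, 16, 32]:
    P = alpha * N
    X_train = rng.standard_normal((P, N))
    y_train = labels(X_train)
    clf = SVC(kernel="poly", degree=DEGREE, coef0=1.0, C=1e6)
    clf.fit(X_train, y_train)
    eps = np.mean(clf.predict(X_test) != y_test)
    print(f"alpha = P/N = {alpha:3d}   generalization error ~ {eps:.3f}")
```

Plotting the estimated error against alpha should, qualitatively, show a fast initial decrease followed by a slower tail, the finite-size counterpart of the stepwise behavior found by Dietrich et al. in the thermodynamic limit.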
References
- Statistical Mechanics of Support Vector Networks, Physical Review Letters, 1999
- Learning of higher-order perceptrons with tunable complexities, Journal of Physics A: General Physics, 1998
- Supervised Learning from Clustered Input Examples, Europhysics Letters, 1995
- Learning with a Temperature-Dependent Algorithm, Europhysics Letters, 1995
- The Nature of Statistical Learning Theory, published by Springer Nature, 1995
- Properties of neural networks storing spatially correlated patterns, Journal of Physics A: General Physics, 1992
- On the ability of the optimal perceptron to generalise, Journal of Physics A: General Physics, 1990
- Learning times of neural networks: Exact solution for a PERCEPTRON algorithm, Physical Review A, 1988
- Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition, IEEE Transactions on Electronic Computers, 1965