Generalization properties of finite-size polynomial support vector machines

Abstract
The learning properties of finite-size polynomial support vector machines are analyzed in the case of realizable classification tasks. The normalization of the high-order features acts as a squeezing factor, introducing a strong anisotropy in the distribution of patterns in feature space. As a function of the training set size, the corresponding generalization error presents a crossover, more or less abrupt depending on the anisotropy of the distribution and on the task to be learned, between a rapidly decreasing and a slowly decreasing regime. This behavior corresponds to the stepwise decrease found by Dietrich et al. [Phys. Rev. Lett. 82, 2975 (1999)] in the thermodynamic limit. The theoretical results are in excellent agreement with the numerical simulations.
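The following is a minimal numerical sketch, not the paper's own code, of the setup summarized above: a realizable task learned by a maximum-margin classifier acting on explicitly normalized quadratic features, with the generalization error estimated for several training set sizes. All specific values (input dimension N, polynomial degree, the normalization prefactor, the random teacher, the choice of scikit-learn as the solver) are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N = 10          # input dimension (finite size); illustrative value
P_test = 2000   # test patterns used to estimate the generalization error

def quadratic_features(X, squeeze=1.0 / np.sqrt(10)):
    """Map inputs x to the feature vector (x, squeeze * x_i x_j).

    The prefactor on the second-order monomials plays the role of the
    normalization ("squeezing") factor that makes the distribution of
    patterns in feature space strongly anisotropic.
    """
    quad = np.einsum('pi,pj->pij', X, X).reshape(len(X), -1)
    return np.hstack([X, squeeze * quad])

# Teacher: a fixed rule in feature space, so the task is realizable.
w_teacher = rng.standard_normal(N + N * N)

def labels(X):
    return np.sign(quadratic_features(X) @ w_teacher)

X_test = rng.standard_normal((P_test, N))
y_test = labels(X_test)

for P in (20, 50, 100, 200, 500, 1000):      # training set sizes
    X_train = rng.standard_normal((P, N))
    y_train = labels(X_train)
    # Hard-margin limit approximated by a large C; linear kernel applied
    # to the explicit (normalized) polynomial features.
    clf = SVC(kernel='linear', C=1e6).fit(quadratic_features(X_train), y_train)
    eps_g = np.mean(clf.predict(quadratic_features(X_test)) != y_test)
    print(f"alpha = P/N = {P / N:6.1f}   generalization error ~ {eps_g:.3f}")
```

Plotting the estimated error against P/N for different values of the squeezing factor is one way to visualize the crossover between the rapidly and slowly decreasing regimes discussed in the abstract.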