New Support Vector Algorithms
- Research article, published 1 May 2000
- Neural Computation (MIT Press), Vol. 12 (5), 1207-1245
- https://doi.org/10.1162/089976600300015565
Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
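The role of ν described in the abstract can be illustrated with scikit-learn's `NuSVC` and `NuSVR`, which implement the ν-parameterized formulations (this is a hedged sketch using a third-party library, not the authors' own code; the synthetic data and the RBF kernel choice are assumptions). In ν-classification, ν replaces C and acts as an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors; in ν-regression, ν replaces the tube width ε while C remains.

```python
# Illustration of nu controlling the support-vector count, using
# scikit-learn's NuSVC / NuSVR (assumed available; data is synthetic).
import numpy as np
from sklearn.svm import NuSVC, NuSVR

rng = np.random.RandomState(0)
X = rng.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Classification: nu lower-bounds the fraction of support vectors,
# so raising nu tends to raise the support-vector count.
for nu in (0.1, 0.5):
    clf = NuSVC(nu=nu, kernel="rbf").fit(X, y)
    frac = clf.support_.size / len(X)
    print(f"nu={nu}: support-vector fraction = {frac:.2f}")

# Regression: nu replaces the epsilon-tube width; C is still a parameter.
y_reg = np.sin(X[:, 0]) + 0.1 * rng.randn(200)
reg = NuSVR(nu=0.5, C=1.0).fit(X, y_reg)
print("regression support vectors:", reg.support_.size)
```

Running the classification loop with increasing ν should show the support-vector fraction tracking ν from below, which is the practical content of the paper's bounds on ν.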
This publication has 11 references indexed in Scilit:
- On a Kernel-Based Method for Pattern Recognition, Regression, Approximation, and Operator Inversion. Algorithmica, 1998
- Structural risk minimization over data-dependent hierarchies. IEEE Transactions on Information Theory, 1998
- An Equivalence Between Sparse Approximation and Support Vector Machines. Neural Computation, 1998
- Nonlinear Component Analysis as a Kernel Eigenvalue Problem. Neural Computation, 1998
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network. IEEE Transactions on Information Theory, 1998
- A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery, 1998
- Comparing support vector machines with Gaussian kernels to radial basis function classifiers. IEEE Transactions on Signal Processing, 1997
- Support-vector networks. Machine Learning, 1995
- Network information criterion: determining the number of hidden units for an artificial neural network model. IEEE Transactions on Neural Networks, 1994
- Nonparametric estimation of residual variance revisited. Biometrika, 1993