The analysis of decomposition methods for support vector machines
- 1 July 2000
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 11 (4), 1003-1008
- https://doi.org/10.1109/72.857780
Abstract
The support vector machine (SVM) is a promising technique for pattern recognition. It requires the solution of a large dense quadratic programming problem, so traditional optimization methods cannot be applied directly because of memory restrictions. To date, few methods can cope with this memory problem; an important one is the "decomposition method," for which no convergence proof has been available so far. We connect this method to projected gradient methods and provide a theoretical convergence proof for one version of decomposition methods. An extension to the bound-constrained formulation of the SVM is also provided. We then show that the convergence proof remains valid for general decomposition methods as long as their working set selection satisfies a simple requirement.
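The abstract describes the decomposition approach only at a high level. The sketch below, added for illustration, shows what a decomposition iteration on the standard SVM dual looks like with a working set of size two (the SMO special case) and a maximal-violating-pair working set selection. It is a minimal sketch rather than the paper's algorithm; the names `decomposition_svm`, `rbf_kernel`, `C`, `gamma`, and `tol` are illustrative assumptions.

```python
import numpy as np


def rbf_kernel(X, gamma=0.5):
    # Full kernel matrix; fine for a small illustrative problem.
    sq = np.sum(X ** 2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))


def decomposition_svm(X, y, C=1.0, gamma=0.5, tol=1e-3, max_iter=10_000):
    """Size-2 working-set decomposition (SMO-style) for the SVM dual:
        min  0.5 * a'Qa - e'a   s.t.  0 <= a_i <= C,  y'a = 0,
    where Q_ij = y_i * y_j * K(x_i, x_j)."""
    n = len(y)
    Q = (y[:, None] * y[None, :]) * rbf_kernel(X, gamma)
    alpha = np.zeros(n)
    grad = -np.ones(n)                        # gradient of the dual at alpha = 0

    for _ in range(max_iter):
        # Working set selection: the maximal violating pair (i, j).
        up = ((alpha < C) & (y > 0)) | ((alpha > 0) & (y < 0))
        low = ((alpha < C) & (y < 0)) | ((alpha > 0) & (y > 0))
        m_up = np.where(up, -y * grad, -np.inf)
        m_low = np.where(low, -y * grad, np.inf)
        i, j = int(np.argmax(m_up)), int(np.argmin(m_low))
        if m_up[i] - m_low[j] < tol:          # KKT conditions met to tolerance
            break

        # Two-variable subproblem: move along d_i = y_i*t, d_j = -y_j*t,
        # which preserves y'a = 0, and take the unconstrained minimizer t.
        quad = max(Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j], 1e-12)
        t = (m_up[i] - m_low[j]) / quad
        # Clip t so both variables stay inside the box [0, C].
        t = min(t, C - alpha[i] if y[i] > 0 else alpha[i])
        t = min(t, alpha[j] if y[j] > 0 else C - alpha[j])
        d_i, d_j = y[i] * t, -y[j] * t
        alpha[i] += d_i
        alpha[j] += d_j
        grad += Q[:, i] * d_i + Q[:, j] * d_j  # only two columns of Q touched

    return alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)), rng.normal(1.0, 0.5, (20, 2))])
    y = np.array([-1.0] * 20 + [1.0] * 20)
    alpha = decomposition_svm(X, y)
    print("nonzero multipliers (support vectors):", int(np.sum(alpha > 1e-6)))
```

Each iteration optimizes only the two selected variables while the others stay fixed, so a step needs only two columns of Q; in a large-scale implementation the kernel matrix would be computed column by column rather than precomputed as done here for brevity.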