The Generalized LASSO
- 6 February 2004
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 15 (1), 16-28
- https://doi.org/10.1109/tnn.2003.809398
Abstract
In the last few years, the support vector machine (SVM) method has motivated new interest in kernel regression techniques. Although the SVM has been shown to exhibit excellent generalization properties in many experiments, it suffers from several drawbacks, both of a theoretical and a technical nature: the absence of probabilistic outputs, the restriction to Mercer kernels, and the steep growth of the number of support vectors with increasing size of the training set. In this paper, we present a different class of kernel regressors that effectively overcome the above problems. We call this approach generalized LASSO regression. It has a clear probabilistic interpretation, can handle learning sets that are corrupted by outliers, produces extremely sparse solutions, and is capable of dealing with large-scale problems. For regression functionals which can be modeled as iteratively reweighted least-squares (IRLS) problems, we present a highly efficient algorithm with guaranteed global convergence. This defines a unified framework for sparse regression models within the very rich class of IRLS models, including various types of robust regression models and logistic regression. Performance studies on many standard benchmark datasets demonstrate the advantages of this model over related approaches.
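The core idea referenced in the abstract, casting an L1-penalized (LASSO) fit as an iteratively reweighted least-squares problem, can be sketched as follows. This is an illustrative toy implementation of the generic IRLS-for-LASSO trick (replacing the penalty weight on each coefficient by lam/|w_j| from the previous iterate), not the paper's exact algorithm; the function name, defaults, and stopping rule are assumptions for the example.

```python
import numpy as np

def irls_lasso(X, y, lam=0.5, n_iter=100, eps=1e-8):
    """Toy LASSO fit via iteratively reweighted least squares (IRLS).

    Each iteration solves a weighted ridge problem in which the
    penalty weight lam / |w_j| locally approximates the L1 term,
    so coefficients driven toward zero are penalized ever harder
    and collapse to (numerical) zero -- yielding a sparse solution.
    Illustrative sketch only, not the algorithm from the paper.
    """
    w = np.linalg.lstsq(X, y, rcond=None)[0]  # start at least squares
    for _ in range(n_iter):
        # reweighted ridge step: (X'X + diag(lam/|w_old|)) w = X'y
        D = np.diag(lam / (np.abs(w) + eps))
        w = np.linalg.solve(X.T @ X + D, X.T @ y)
    w[np.abs(w) < 1e-6] = 0.0  # tiny coefficients are numerically zero
    return w
```

Because the reweighting diverges as a coefficient approaches zero, irrelevant coefficients are suppressed completely rather than merely shrunk, which is the mechanism behind the "extremely sparse solutions" claimed in the abstract.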