DASSO: Connections Between the Dantzig Selector and Lasso
- 24 July 2008
- journal article
- Published by Oxford University Press (OUP) in Journal of the Royal Statistical Society Series B: Statistical Methodology
- Vol. 71 (1), 127-142
- https://doi.org/10.1111/j.1467-9868.2008.00668.x
Abstract
We propose a new algorithm, DASSO, for fitting the entire coefficient path of the Dantzig selector with a computational cost similar to that of the least angle regression algorithm that is used to compute the lasso. DASSO efficiently constructs a piecewise linear path through a sequential simplex-like algorithm, which is remarkably similar to the least angle regression algorithm. Comparison of the two algorithms sheds new light on the question of how the lasso and Dantzig selector are related. In addition, we provide theoretical conditions on the design matrix X under which the lasso and Dantzig selector coefficient estimates will be identical for certain tuning parameters. As a consequence, in many instances, we can extend the powerful non-asymptotic bounds that have been developed for the Dantzig selector to the lasso. Finally, through empirical studies of simulated and real-world data sets, we show that in practice, when the bounds hold for the Dantzig selector, they almost always also hold for the lasso.
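The equivalence described in the abstract can be illustrated in a simple special case. The sketch below (a hypothetical illustration, not the DASSO algorithm itself) uses an orthonormal design, where X^T X = I; there both the lasso, which minimizes 0.5·||y − Xβ||² + λ||β||₁, and the Dantzig selector, which minimizes ||β||₁ subject to ||X^T(y − Xβ)||∞ ≤ λ, reduce to soft-thresholding of X^T y, so the two estimates coincide for the same tuning parameter. The Dantzig selector is solved here as a linear program via `scipy.optimize.linprog`.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, p, lam = 20, 5, 0.5

# Orthonormal design: columns of Q from a reduced QR decomposition.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.8])
y = X @ beta_true + 0.1 * rng.standard_normal(n)
b = X.T @ y  # with X^T X = I, both problems depend on X, y only through b

# Lasso solution in the orthonormal case: soft-threshold b at lam.
beta_lasso = np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

# Dantzig selector: min ||beta||_1 s.t. ||b - beta||_inf <= lam.
# Standard LP reformulation with beta = u - v, u >= 0, v >= 0.
I = np.eye(p)
res = linprog(
    c=np.ones(2 * p),                      # objective: sum(u) + sum(v) = ||beta||_1
    A_ub=np.vstack([np.hstack([I, -I]),    #  beta - b <= lam
                    np.hstack([-I, I])]),  #  b - beta <= lam
    b_ub=np.concatenate([b + lam, lam - b]),
    bounds=(0, None),
)
beta_ds = res.x[:p] - res.x[p:]

# In this orthonormal setting the two estimates agree.
print(np.max(np.abs(beta_ds - beta_lasso)))
```

For a general design matrix X the two estimators need not coincide; the paper's contribution is to characterize conditions on X under which they do, and DASSO traces the full Dantzig selector path rather than solving one LP per tuning parameter as above.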
Funding Information
- National Science Foundation (DMS-0705312, DMS-0806030)