Simultaneous analysis of Lasso and Dantzig selector
- Open Access
- Journal article, published 1 August 2009
- Published by the Institute of Mathematical Statistics in The Annals of Statistics
- Vol. 37 (4), 1705–1732
- https://doi.org/10.1214/08-aos620
Abstract
We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk in the general nonparametric regression model, as well as bounds on the ℓp estimation loss for 1 ≤ p ≤ 2 in the linear model when the number of variables can be much larger than the sample size.
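The sparsity scenario described above can be illustrated numerically. The sketch below implements the Lasso half of the comparison as plain cyclic coordinate descent on a simulated sparse linear model; the sample size, dimension, penalty level `lam`, and all function names are illustrative assumptions, not taken from the paper, and the Dantzig selector (a linear program) is omitted for brevity.

```python
import random

random.seed(0)
n, p = 50, 8
beta_true = [3.0, 0.0, 0.0, -2.0, 0.0, 0.0, 0.0, 0.0]  # sparse truth

# Gaussian design and noisy responses y = X @ beta_true + eps
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * beta_true[j] for j in range(p)) + random.gauss(0, 0.1)
     for i in range(n)]

def soft(x, t):
    """Soft-thresholding operator, the scalar update behind the Lasso."""
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def lasso_cd(X, y, lam, sweeps=200):
    """Minimize 0.5*||y - X b||_2^2 + lam*||b||_1 by cyclic coordinate descent."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    z = [sum(X[i][j] ** 2 for i in range(n)) for j in range(p)]  # column norms
    for _ in range(sweeps):
        for j in range(p):
            # correlation of column j with the partial residual (b_j held out)
            rho = sum(X[i][j] * (y[i] - sum(X[i][k] * b[k]
                                            for k in range(p) if k != j))
                      for i in range(n))
            b[j] = soft(rho, lam) / z[j]
    return b

beta_hat = lasso_cd(X, y, lam=5.0)
support = {j for j, bj in enumerate(beta_hat) if abs(bj) > 1e-8}
print("estimated support:", sorted(support))
```

In this toy setting the penalty zeroes out the inactive coordinates while the active ones are recovered up to the usual shrinkage bias of roughly `lam` divided by the column norm, which is the kind of behavior the oracle inequalities in the paper quantify.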