Iterated Hard Shrinkage for Minimization Problems with Sparsity Constraints
- 1 January 2008
- journal article
- Published by Society for Industrial & Applied Mathematics (SIAM) in SIAM Journal on Scientific Computing
- Vol. 30 (2) , 657-683
- https://doi.org/10.1137/060663556
Abstract
A new iterative algorithm is proposed for the solution of minimization problems in infinite-dimensional Hilbert spaces which involve sparsity constraints in the form of $\ell^{p}$-penalties. In contrast to the well-known algorithm considered by Daubechies, Defrise, and De Mol, it uses hard instead of soft shrinkage. It is shown that the hard shrinkage algorithm is a special case of the generalized conditional gradient method. Convergence properties of the generalized conditional gradient method with quadratic discrepancy term are analyzed. This leads to strong convergence of the iterates with convergence rates $\mathcal{O}(n^{-1/2})$ and $\mathcal{O}(\lambda^n)$ for $p=1$ and $1<p\leq 2$, respectively.
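The iteration described in the abstract can be sketched in a finite-dimensional setting: a gradient step on the quadratic discrepancy $\|Ax-b\|^2$ followed by hard shrinkage, which zeros small coefficients instead of shifting them toward zero as soft shrinkage does. This is a minimal illustrative sketch, not the paper's infinite-dimensional algorithm or its generalized conditional gradient step-size rules; the function names and the parameters `tau` and `step` are assumptions of this sketch.

```python
import numpy as np

def hard_shrink(x, tau):
    # Hard shrinkage: keep entries whose magnitude exceeds tau, zero the rest.
    # (Soft shrinkage would instead subtract tau from the surviving magnitudes.)
    return np.where(np.abs(x) > tau, x, 0.0)

def iterated_hard_shrinkage(A, b, tau, step, n_iter=200):
    # Iterate x <- H_tau(x - step * A^T (A x - b)):
    # a gradient step on ||Ax - b||^2 followed by hard shrinkage.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = hard_shrink(x - step * A.T @ (A @ x - b), tau)
    return x

# Toy demo: with A = I the iteration converges to the hard-thresholded data,
# suppressing the small (noise-like) component of b.
A = np.eye(3)
b = np.array([2.0, 0.05, -1.5])
x = iterated_hard_shrinkage(A, b, tau=0.1, step=1.0)
```

For this orthonormal toy case the fixed point is simply `hard_shrink(b, tau)`; the interest of the method lies in non-trivial operators `A`, where the convergence analysis of the paper applies.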
This publication has 18 references indexed in Scilit:
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization, Applied and Computational Harmonic Analysis, 2007
- Stable recovery of sparse overcomplete representations in the presence of noise, IEEE Transactions on Information Theory, 2005
- Decoding by Linear Programming, IEEE Transactions on Information Theory, 2005
- Signal Recovery by Proximal Forward-Backward Splitting, Multiscale Modeling & Simulation, 2005
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint, Communications on Pure and Applied Mathematics, 2004
- A Fast Spectral Method for Active 3D Shape Reconstruction, Journal of Mathematical Imaging and Vision, 2004
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ¹ minimization, Proceedings of the National Academy of Sciences, 2003
- Atomic Decomposition by Basis Pursuit, SIAM Journal on Scientific Computing, 1998
- Convergence Rates for Conditional Gradient Sequences Generated by Implicit Step Length Rules, SIAM Journal on Control and Optimization, 1980
- Rates of Convergence for Conditional Gradient Algorithms Near Singular and Nonsingular Extremals, SIAM Journal on Control and Optimization, 1979