Abstract
We consider inexact linear equations y ≈ Φx, where y is a given vector in ℝⁿ, Φ is a given n Ɨ m matrix, and we wish to find x0,ε as sparse as possible while obeying ‖y āˆ’ Φx0,ε‖2 ≤ ε. In general, this requires combinatorial optimization and so is considered intractable. On the other hand, the ā„“1-minimization problem

    (P1,ε)    min ‖x‖1 subject to ‖y āˆ’ Φx‖2 ≤ ε

is convex and is considered tractable. We show that for most Φ, if the optimally sparse approximation x0,ε is sufficiently sparse, then the solution x1,ε of the ā„“1-minimization problem is a good approximation to x0,ε.

We suppose that the columns of Φ are normalized to unit ā„“2-norm, and we place uniform measure on such Φ. We study the underdetermined case where m ~ τn and τ > 1, and prove the existence of ρ = ρ(τ) > 0 and C = C(ρ, τ) so that for large n and for all Φ except a negligible fraction, the following approximate sparse solution property of Φ holds: for every y having an approximation ‖y āˆ’ Φx0‖2 ≤ ε by a coefficient vector x0 ∈ ℝᵐ with fewer than ρ Ā· n nonzeros,

    ‖x1,ε āˆ’ x0‖2 ≤ C Ā· ε.

This has two implications. First, for most Φ, whenever the combinatorial optimization result x0,ε would be very sparse, x1,ε is a good approximation to x0,ε. Second, suppose we are given noisy data obeying y = Φx0 + z, where the unknown x0 is known to be sparse and the noise obeys ‖z‖2 ≤ ε. For most Φ, noise-tolerant ā„“1-minimization will stably recover x0 from y in the presence of the noise z. We also study the barely determined case m = n and reach parallel conclusions by slightly different arguments. Proof techniques include the use of almost-spherical sections in Banach space theory and concentration of measure for eigenvalues of random matrices.
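As an informal numerical illustration (not part of the paper), the noise-tolerant ā„“1-minimization (P1,ε) is a convex program that can be solved with any standard convex-optimization toolkit. The sketch below uses Python with the cvxpy library; the choice of library, the dimensions n and m, the sparsity level, and the noise level ε are all arbitrary assumptions made for demonstration only.

    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    n, m, k = 50, 100, 5               # underdetermined case: m ~ tau*n with tau = 2
    Phi = rng.standard_normal((n, m))
    Phi /= np.linalg.norm(Phi, axis=0)  # normalize columns to unit l2-norm

    # sparse ground truth x0 with k nonzeros, plus noise z with ||z||_2 <= eps
    x0 = np.zeros(m)
    x0[rng.choice(m, k, replace=False)] = rng.standard_normal(k)
    eps = 0.01
    z = rng.standard_normal(n)
    z *= eps / np.linalg.norm(z)
    y = Phi @ x0 + z

    # (P_{1,eps}): minimize ||x||_1 subject to ||y - Phi x||_2 <= eps
    x = cp.Variable(m)
    prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                      [cp.norm(y - Phi @ x, 2) <= eps])
    prob.solve()
    x1 = x.value

    # stability: the recovery error should be on the order of C * eps
    print("||x1 - x0||_2 =", np.linalg.norm(x1 - x0))

In this toy setup the ā„“1 near-solution x1 typically lands within a small multiple of ε of the sparse vector x0, which is the qualitative behavior the approximate sparse solution property describes.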
