Multitask Compressive Sensing
Top Cited Papers
- 19 September 2008
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Signal Processing
- Vol. 57 (1) , 92-106
- https://doi.org/10.1109/tsp.2008.2005866
Abstract
Compressive sensing (CS) is a framework whereby one performs N nonadaptive measurements to constitute a vector v ∈ ℝᴺ, used to recover an approximation û ∈ ℝᴹ of a desired signal u ∈ ℝᴹ, with N ≪ M; this is performed under the assumption that u is sparse in the basis represented by the matrix Ψ ∈ ℝᴹˣᴹ. It has been demonstrated that, with appropriate design of the compressive measurements used to define v, the decompressive mapping v → û may be performed with error ‖u − û‖₂² having asymptotic properties analogous to those of the best adaptive transform-coding algorithm applied in the basis Ψ. The mapping v → û constitutes an inverse problem, often solved using ℓ₁ regularization or related techniques. In most previous research, if L > 1 sets of compressive measurements {vᵢ}ᵢ₌₁,L are performed, each of the associated {ûᵢ}ᵢ₌₁,L is recovered one at a time, independently. In many applications the L "tasks" defined by the mappings vᵢ → ûᵢ are not statistically independent, and it may be possible to improve the performance of the inversion if statistical interrelationships are exploited. In this paper, we address this problem within a multitask learning setting, wherein the mapping v → û for each task corresponds to inferring the parameters (here, wavelet coefficients) associated with the desired signal, and a shared prior is placed across all of the L tasks. Under this hierarchical Bayesian modeling, data from all L tasks contribute toward inferring a posterior on the hyperparameters, and once the shared prior is thereby inferred, the data from each of the L individual tasks are then employed to estimate the task-dependent wavelet coefficients. An empirical Bayesian procedure for the estimation of hyperparameters is considered; two fast inference algorithms extending the relevance vector machine (RVM) are developed. Example results on several data sets demonstrate the effectiveness and robustness of the proposed algorithms.
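The single-task inverse problem v → û described above can be illustrated with a minimal sketch. This is not the paper's multitask Bayesian algorithm; it solves one ℓ₁-regularized least-squares (lasso) problem by iterative soft thresholding (ISTA), a standard CS recovery technique. All names (`ista`, `Phi`, `lam`) and the problem sizes are illustrative assumptions, and Ψ is taken to be the identity so u itself is sparse:

```python
import random

def matvec(A, x):
    """Multiply matrix A (a list of rows) by vector x."""
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def soft_threshold(z, t):
    """Elementwise soft thresholding: the proximal operator of the l1 norm."""
    return [max(abs(zi) - t, 0.0) * (1.0 if zi >= 0 else -1.0) for zi in z]

def ista(Phi, v, lam=0.05, iters=3000):
    """Recover a sparse x from v = Phi x via iterative soft thresholding,
    minimizing 0.5 * ||v - Phi x||_2^2 + lam * ||x||_1."""
    N, M = len(Phi), len(Phi[0])
    # Step size 1/L, with L a Frobenius-norm bound on the Lipschitz constant.
    L = sum(a * a for row in Phi for a in row)
    t = 1.0 / L
    x = [0.0] * M
    for _ in range(iters):
        residual = [ri - vi for ri, vi in zip(matvec(Phi, x), v)]
        grad = [sum(Phi[i][j] * residual[i] for i in range(N)) for j in range(M)]
        x = soft_threshold([xj - t * gj for xj, gj in zip(x, grad)], t * lam)
    return x

random.seed(1)
M, N = 12, 8                        # signal length M, measurements N << M
u = [0.0] * M
u[2], u[9] = 1.0, -0.7              # a 2-sparse signal (Psi = identity)
Phi = [[random.gauss(0.0, 1.0) / N ** 0.5 for _ in range(M)] for _ in range(N)]
v = matvec(Phi, u)                  # N noiseless compressive measurements
u_hat = ista(Phi, v)                # l1-based approximation of u from v
```

With noiseless measurements and a small `lam`, the recovered `u_hat` concentrates on the true support {2, 9}; the paper's contribution is to couple L such inversions through a shared hierarchical prior rather than solving each independently as done here.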