Exact low-rank matrix completion via convex optimization
- 1 September 2008
- Conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Suppose that one observes an incomplete subset of entries selected uniformly at random from a low-rank matrix. When is it possible to complete the matrix and recover the entries that have not been seen? We show that, in very general settings, one can perfectly recover all of the missing entries from a sufficiently large random subset by solving a convex programming problem. This program finds the matrix with the minimum nuclear norm that agrees with the observed entries. The techniques used in this analysis draw upon parallels in the field of compressed sensing, demonstrating that objects other than signals and images can be perfectly reconstructed from very limited information.
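The convex program described in the abstract minimizes the nuclear norm (the sum of the singular values) subject to agreement with the observed entries. A minimal sketch of that formulation, written with the generic convex solver cvxpy rather than the authors' own code, is given below; the matrix size, rank, and sampling rate are illustrative assumptions, not values from the paper.

```python
import numpy as np
import cvxpy as cp

# Synthetic rank-2 ground truth and a uniformly random set of observed entries
# (dimensions and sampling rate are illustrative assumptions).
rng = np.random.default_rng(0)
n, r = 20, 2
M = rng.standard_normal((n, r)) @ rng.standard_normal((r, n))
observed = (rng.random((n, n)) < 0.5).astype(float)  # 1.0 where an entry is seen

# Nuclear-norm minimization: find the matrix of minimum nuclear norm
# that matches M on the observed entries.
X = cp.Variable((n, n))
problem = cp.Problem(
    cp.Minimize(cp.normNuc(X)),
    [cp.multiply(observed, X) == observed * M],
)
problem.solve()

print("relative recovery error:",
      np.linalg.norm(X.value - M) / np.linalg.norm(M))
```

When the random subset of observed entries is large enough, the recovered X.value agrees with M up to numerical precision, which is the exact-recovery phenomenon the paper analyzes.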