A Generalized Divergence Measure for Nonnegative Matrix Factorization
- 1 March 2007
- journal article
- Published by MIT Press in Neural Computation
- Vol. 19 (3), 780-791
- https://doi.org/10.1162/neco.2007.19.3.780
Abstract
This letter presents a general parametric divergence measure that includes the quadratic error and the Kullback-Leibler divergence as special cases. A parametric generalization of the two multiplicative update rules for nonnegative matrix factorization introduced by Lee and Seung (2001) is shown to lead to locally optimal solutions of the nonnegative matrix factorization problem under this new cost function. Numerical simulations demonstrate that the new update rule may improve convergence speed over the quadratic-distance rule. A proof of convergence is given that, as in Lee and Seung, uses an auxiliary function known from the expectation-maximization framework.
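The abstract does not reproduce the divergence or the update rule themselves. A closely related family with exactly the properties described is the beta-divergence, which recovers the (half) quadratic error at beta = 2 and the generalized Kullback-Leibler divergence in the limit beta -> 1, and whose multiplicative updates interpolate between the two Lee-Seung rules. The sketch below is a minimal NumPy illustration of that family, not the paper's own notation; the function names, the default `beta=1.5`, `n_iter`, `eps`, and the random initialization are all assumptions for the example.

```python
import numpy as np

def beta_divergence(V, WH, beta):
    """Beta-divergence D_beta(V || WH), summed over all entries.

    beta = 2 gives half the squared Euclidean error; the limit
    beta -> 1 gives the generalized Kullback-Leibler divergence.
    """
    if beta == 1:
        # 0 * log(0) is treated as 0 for zero entries of V.
        kl = np.where(V > 0, V * np.log(V / WH), 0.0)
        return np.sum(kl - V + WH)
    return np.sum(
        (V**beta + (beta - 1) * WH**beta - beta * V * WH**(beta - 1))
        / (beta * (beta - 1))
    )

def nmf_beta(V, rank, beta=1.5, n_iter=200, eps=1e-9, seed=0):
    """Multiplicative-update NMF under the beta-divergence.

    For beta in [1, 2] these updates monotonically decrease the cost
    and interpolate between the two Lee-Seung update rules.
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (WH**(beta - 2) * V)) / (W.T @ WH**(beta - 1) + eps)
        WH = W @ H + eps
        W *= ((WH**(beta - 2) * V) @ H.T) / (WH**(beta - 1) @ H.T + eps)
    return W, H

# Example usage on a random nonnegative matrix.
V = np.random.default_rng(1).random((50, 40))
W, H = nmf_beta(V, rank=5, beta=1.5)
print(beta_divergence(V, W @ H + 1e-9, beta=1.5))
```

The small `eps` terms guard against division by zero and zero-locking of factor entries, a common practical safeguard rather than part of the analyzed update rule.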
References
- Preintegration Lateral Inhibition Enhances Unsupervised Learning. Neural Computation, 2002
- Learning the parts of objects by non-negative matrix factorization. Nature, 1999
- Backpropagation Applied to Handwritten Zip Code Recognition. Neural Computation, 1989
- Neural Networks, Principal Components, and Subspaces. International Journal of Neural Systems, 1989
- Maximum Likelihood from Incomplete Data via the EM Algorithm. Journal of the Royal Statistical Society, Series B: Statistical Methodology, 1977