Blind Separation of Positive Sources by Globally Convergent Gradient Search
- 1 September 2004
- Research article
- Published by MIT Press in Neural Computation
- Vol. 16 (9), pp. 1811-1825
- https://doi.org/10.1162/0899766041336413
Abstract
The instantaneous noise-free linear mixing model in independent component analysis is largely a solved problem under the usual assumptions of independent nongaussian sources and a full-column-rank mixing matrix. However, with some prior information on the sources, such as positivity, new analysis and perhaps simplified solution methods may yet become possible. In this letter, we consider the task of independent component analysis when the independent sources are known to be nonnegative and well grounded, meaning that they have a nonzero pdf in the region of zero. It can be shown that in this case the solution method is conceptually simple: an orthogonal rotation of the whitened observation vector into nonnegative outputs gives a positive permutation of the original sources. We propose a cost function whose minimum coincides with nonnegativity and derive the gradient algorithm under the whitening constraint, under which the separating matrix is orthogonal. We further prove that in the Stiefel manifold of orthogonal matrices, the cost function is a Lyapunov function for the matrix gradient flow, implying global convergence. Thus, the algorithm is guaranteed to find the nonnegative well-grounded independent sources. The analysis is complemented by a numerical simulation, which illustrates the algorithm.
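As a concrete illustration of the approach the abstract describes, the sketch below separates synthetic nonnegative sources. It is a minimal reading under stated assumptions, not the paper's exact method: the cost is taken as J(W) = ½ E[||min(0, Wz)||²], which vanishes exactly when all outputs are nonnegative, and the update is a plain Euclidean gradient step re-orthogonalized by SVD projection rather than the paper's gradient flow on the Stiefel manifold. The exponential sources, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 3 nonnegative, well-grounded sources (exponential
# densities are nonzero at the origin), mixed by a random full-rank matrix.
n, T = 3, 10000
S = rng.exponential(scale=1.0, size=(n, T))
A = rng.normal(size=(n, n))
X = A @ S

# Whiten the covariance while keeping the mean, so that a pure rotation
# of z can still produce nonnegative outputs.
C = np.cov(X)
d, E = np.linalg.eigh(C)
V = E @ np.diag(d ** -0.5) @ E.T       # symmetric whitening matrix
Z = V @ X                              # cov(Z) = I, mean preserved (rescaled)

# Gradient search over orthogonal W for the negative-part cost
#   J(W) = 1/2 * E[ || min(0, W z) ||^2 ],
# whose minimum (zero) is reached exactly when all outputs are nonnegative.
# Assumption: a Euclidean gradient step followed by projection onto the
# nearest orthogonal matrix (via SVD), standing in for the paper's
# Stiefel-manifold gradient flow.
W = np.linalg.qr(rng.normal(size=(n, n)))[0]
mu = 0.5                               # step size (illustrative choice)
for _ in range(500):
    Y = W @ Z
    Yneg = np.minimum(Y, 0.0)          # negative parts of the outputs
    grad = Yneg @ Z.T / T              # ambient-space gradient of J
    U, _, Vt = np.linalg.svd(W - mu * grad)
    W = U @ Vt                         # re-orthogonalize

Y = W @ Z
print("final cost J(W):", 0.5 * np.mean(np.sum(np.minimum(Y, 0.0) ** 2, axis=0)))
```

One design point worth noting: the whitening above normalizes the covariance but deliberately keeps the data mean, since strictly zero-mean outputs could never be nonnegative except in the degenerate all-zero case. At the minimum, the rows of W recover the whitened sources up to a positive permutation, consistent with the abstract's claim.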