Natural Gradient Learning for Over- and Under-Complete Bases in ICA
- 1 November 1999
- journal article
- Published by MIT Press in Neural Computation
- Vol. 11 (8), 1875-1883
- https://doi.org/10.1162/089976699300015990
Abstract
Independent component analysis (ICA), or blind source separation, is a technique for extracting independent signals from their mixtures. It is applicable even when the number of independent sources is unknown and is larger or smaller than the number of observed mixture signals. This article extends the natural gradient learning algorithm to these overcomplete and undercomplete cases. The observed signals are assumed to be whitened by preprocessing, so that the natural Riemannian gradient in Stiefel manifolds is used.
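To make the abstract's idea concrete, the following is a minimal sketch (not the paper's code) of a natural-gradient ICA update on the Stiefel manifold. It assumes whitened observations, a demixing matrix `W` with orthonormal rows, and a tanh score function; the key step is projecting the Euclidean gradient `G` to the natural Riemannian gradient `G - W G^T W`, which is tangent to the manifold, so the update preserves orthonormality to first order.

```python
import numpy as np

rng = np.random.default_rng(0)

def natural_gradient_step(W, x, lr=0.01):
    """One update of the demixing matrix W (rows orthonormal: W W^T = I).

    The Euclidean gradient G of a likelihood-style contrast is replaced by
    the natural (Riemannian) gradient on the Stiefel manifold, G - W G^T W,
    which is tangent to the manifold at W, so W stays approximately
    row-orthonormal after the step. The tanh score is an illustrative choice.
    """
    y = W @ x                            # extracted components
    G = np.tanh(y) @ x.T / x.shape[1]    # Euclidean gradient estimate
    nat = G - W @ G.T @ W                # natural gradient on the Stiefel manifold
    return W - lr * nat, nat

# Undercomplete demo: extract 2 components from 4 whitened channels.
n_obs, n_src, T = 4, 2, 500
W = np.linalg.qr(rng.standard_normal((n_obs, n_src)))[0].T  # orthonormal rows
x = rng.standard_normal((n_obs, T))     # stand-in for whitened observations
W_new, nat = natural_gradient_step(W, x)
```

Because the natural gradient is tangent to the Stiefel manifold (`nat @ W.T + W @ nat.T = 0` when `W W^T = I`), the updated matrix deviates from orthonormality only at second order in the learning rate, which is what makes this geometry-aware update preferable to a plain Euclidean step here.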