Blind source separation using Renyi's mutual information
- 1 June 2001
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Signal Processing Letters
- Vol. 8 (6), 174-176
- https://doi.org/10.1109/97.923043
Abstract
A blind source separation algorithm is proposed that minimizes Renyi's mutual information by means of nonparametric probability density function (PDF) estimation. The two-stage process consists of spatial whitening followed by a series of Givens rotations, and it yields a cost function consisting only of marginal entropies. This formulation avoids both the PDF inaccuracy caused by truncating a series expansion and the estimation of joint PDFs in high-dimensional spaces given the typical paucity of data. Simulations illustrate the superior efficiency, in terms of required data length, of the proposed method compared to fast independent component analysis (FastICA), Comon's (1994) minimum mutual information, and Bell and Sejnowski's (1995) Infomax.
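The two-stage process described above can be sketched for the two-source case. This is a minimal illustration under stated assumptions, not the authors' implementation: it estimates the quadratic (order-2) Renyi entropy of each output with a Parzen window of Gaussian kernels (the kernel width `sigma` is an assumed value, not one from the paper) and replaces the series of Givens rotations with a grid search over the single rotation angle that exists in two dimensions, minimizing the sum of marginal entropies after whitening.

```python
import numpy as np

def whiten(X):
    """Stage 1, spatial whitening: zero-mean data with identity sample covariance."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    return (E @ np.diag(d ** -0.5) @ E.T) @ X

def renyi_quadratic_entropy(y, sigma=0.25):
    """Parzen-window estimate of the quadratic Renyi entropy
    H2(y) = -log \int p(y)^2 dy, via pairwise Gaussian kernel sums.
    `sigma` is an assumed kernel width for this sketch."""
    diff = y[:, None] - y[None, :]
    V = np.mean(np.exp(-diff ** 2 / (4 * sigma ** 2))) / np.sqrt(4 * np.pi * sigma ** 2)
    return -np.log(V)

def separate_2d(X, n_angles=180):
    """Stage 2 for two sources: search the Givens rotation angle that
    minimizes the sum of marginal Renyi entropies of the rotated outputs."""
    Z = whiten(X)
    best_cost, best_Y = np.inf, Z
    for theta in np.linspace(0.0, np.pi / 2, n_angles, endpoint=False):
        c, s = np.cos(theta), np.sin(theta)
        Y = np.array([[c, -s], [s, c]]) @ Z  # Givens rotation of the whitened data
        cost = renyi_quadratic_entropy(Y[0]) + renyi_quadratic_entropy(Y[1])
        if cost < best_cost:
            best_cost, best_Y = cost, Y
    return best_Y
```

For more than two sources, the grid search would be replaced by sweeps over the sequence of pairwise Givens rotations; the recovered sources are, as usual in blind separation, determined only up to permutation and sign.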
References
- Fast and robust fixed-point algorithms for independent component analysis. IEEE Transactions on Neural Networks, 1999.
- Blind signal separation: statistical principles. Proceedings of the IEEE, 1998.
- Adaptive Online Learning Algorithms for Blind Separation: Maximum Entropy and Minimum Mutual Information. Neural Computation, 1997.
- An Information-Maximization Approach to Blind Separation and Blind Deconvolution. Neural Computation, 1995.
- Independent component analysis, a new concept? Signal Processing, 1994.