Abstract
We utilise an information-theoretic criterion for exploratory projection pursuit (EPP) and show that maximisation, by natural gradient ascent, of the divergence of a multivariate distribution from normality, with negentropy as the distance measure, yields a generalised independent component analysis (ICA). By considering a Gram-Charlier approximation of the latent probability density functions (PDFs), we develop a generalised neuron nonlinearity which can be viewed as a conditional mean estimator of the underlying independent components. The unsupervised learning rule developed is shown to exhibit the Bussgang property asymptotically, and as such it produces output data with independent components, irrespective of whether the latent variables are sub-gaussian or super-gaussian. Improved convergence speeds are reported when momentum terms are introduced into the learning.
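The abstract does not reproduce the update rule itself. As an illustration only, the sketch below implements a generic natural-gradient ICA update with a heavy-ball momentum term, of the kind the abstract describes. The switching nonlinearity shown is the well-known extended-infomax form (a stand-in, not the paper's Gram-Charlier-derived nonlinearity), and the function name, defaults, and stability criterion are assumptions for this sketch.

```python
import numpy as np

def natural_gradient_ica(X, lr=1e-2, momentum=0.9, n_iter=500, rng=None):
    """Illustrative square ICA by natural-gradient ascent with momentum.

    X : (n_samples, n_signals) whitened observations.
    Returns an unmixing matrix W such that Y = X @ W.T has
    (approximately) independent components.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    W = np.eye(d) + 0.1 * rng.standard_normal((d, d))
    V = np.zeros_like(W)          # momentum buffer
    I = np.eye(d)
    for _ in range(n_iter):
        Y = X @ W.T               # current source estimates
        # Per-component moment-based switch (extended-infomax style):
        # positive -> treat as super-gaussian, negative -> sub-gaussian.
        k = (np.mean(1.0 / np.cosh(Y) ** 2, axis=0) * np.mean(Y**2, axis=0)
             - np.mean(np.tanh(Y) * Y, axis=0))
        s = np.sign(k)
        # Switching nonlinearity: y + tanh(y) (super) or y - tanh(y) (sub),
        # so the same rule separates both source types, as the abstract
        # claims for the Gram-Charlier-derived nonlinearity.
        phi = Y + s * np.tanh(Y)
        # Natural-gradient update: dW = (I - E[phi(y) y^T]) W
        G = (I - (phi.T @ Y) / n) @ W
        V = momentum * V + lr * G  # momentum term for faster convergence
        W = W + V
    return W
```

In this sketch the momentum buffer `V` accumulates past natural-gradient steps, which is one standard way to realise the convergence speed-up the abstract reports; the paper's own formulation may differ.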
