Entropy manipulation of arbitrary nonlinear mappings
- 22 November 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
We discuss an unsupervised learning method driven by an information-theoretic criterion. The method differs from previous work in that it extends to a feed-forward multilayer perceptron with an arbitrary number of layers and makes no assumptions about the underlying PDF of the input space. We show a simple unsupervised method by which multidimensional signals can be nonlinearly transformed onto a maximum-entropy feature space, yielding statistically independent features.
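The paper itself gives no code, but the idea it builds on, maximizing the entropy of a sigmoid-squashed output so the features become statistically independent, can be sketched with the single-layer infomax rule of Bell and Sejnowski (reference below). The data, step size, and iteration count here are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy observations: two super-Gaussian (Laplace) sources, linearly mixed.
n = 2000
A = np.array([[1.0, 0.6], [0.2, 1.0]])  # arbitrary mixing matrix (assumption)
X = A @ rng.laplace(size=(2, n))        # shape (2, n)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def infomax_objective(W, X):
    # Output entropy up to the constant H(x):
    #   log|det W| + E[ sum_i log y_i (1 - y_i) ]
    Y = sigmoid(W @ X)
    return np.log(abs(np.linalg.det(W))) + np.mean(
        np.sum(np.log(Y * (1 - Y) + 1e-12), axis=0)
    )

W = np.eye(2)
eta = 0.01
obj_start = infomax_objective(W, X)
for _ in range(200):
    Y = sigmoid(W @ X)
    # Bell-Sejnowski entropy gradient: (W^T)^{-1} + (1 - 2y) x^T
    grad = np.linalg.inv(W.T) + (1.0 - 2.0 * Y) @ X.T / n
    W += eta * grad
obj_end = infomax_objective(W, X)
print(obj_start, obj_end)  # the entropy objective should increase
```

Gradient ascent on this objective drives the outputs toward a uniform (maximum-entropy) joint density, which for an invertible map implies independent components; the paper's contribution is extending this kind of criterion beyond a single layer to an arbitrary multilayer perceptron.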
References
- An Information-Theoretic Approach to Neural Computing. Published by Springer Nature, 1996
- An Information-Maximization Approach to Blind Separation and Blind Deconvolution. Neural Computation, 1995
- How to Generate Ordered Maps by Maximizing the Mutual Information between Input and Output Signals. Neural Computation, 1989
- Self-organization in a perceptual network. Computer, 1988
- Self-Organization and Associative Memory. Published by Springer Nature, 1988
- On Estimation of a Probability Density Function and Mode. The Annals of Mathematical Statistics, 1962