Entropy manipulation of arbitrary nonlinear mappings

Abstract
We discuss an unsupervised learning method driven by an information-theoretic criterion. The method differs from previous work in that it is extensible to a feed-forward multilayer perceptron with an arbitrary number of layers and makes no assumption about the underlying probability density function (PDF) of the input space. We present a simple unsupervised method by which multidimensional signals can be nonlinearly transformed onto a maximum-entropy feature space, resulting in statistically independent features.
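
To make the idea of entropy-maximizing nonlinear mappings concrete, the following is a minimal illustrative sketch using a related, well-known technique: Bell and Sejnowski's Infomax rule for a single bounded (sigmoid) layer, which adapts the weights so that the joint output entropy increases and the outputs become approximately statistically independent. This is not the algorithm of the present paper; names such as learn_rate, n_steps, and the Laplacian toy sources are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent super-Gaussian (Laplacian) sources, linearly mixed.
n_samples = 20000
sources = rng.laplace(size=(2, n_samples))
mixing = np.array([[1.0, 0.6],
                   [0.4, 1.0]])
x = mixing @ sources                      # observed mixed signals

# Single sigmoid layer y = g(W x); Infomax adapts W to maximize H(y).
W = np.eye(2)
learn_rate = 1e-3
batch = 100
n_steps = 3000

for step in range(n_steps):
    idx = rng.integers(0, n_samples, size=batch)
    xb = x[:, idx]
    u = W @ xb                            # linear part of the mapping
    y = 1.0 / (1.0 + np.exp(-u))          # bounded nonlinearity
    # Natural-gradient Infomax update: dW = (I + (1 - 2y) u^T) W, averaged over the batch
    dW = (np.eye(2) + (1.0 - 2.0 * y) @ u.T / batch) @ W
    W += learn_rate * dW

# After adaptation the recovered components u = W x are close to independent,
# and the sigmoid outputs approach a maximum-entropy (uniform) distribution.
u = W @ x
print("correlation of recovered components:\n", np.corrcoef(u))
```

In this sketch the sigmoid squashes each output onto a bounded interval, so maximizing the joint output entropy drives the outputs toward a uniform, factorized distribution; the paper's contribution, by contrast, is a criterion that extends such entropy manipulation to multilayer perceptrons without assuming a particular input PDF.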