Learning from examples with quadratic mutual information
- 27 November 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- p. 155-164
- https://doi.org/10.1109/nnsp.1998.710645
Abstract
This paper discusses an algorithm to train nonlinear mappers with information-theoretic criteria (entropy or mutual information) directly from a training set. The method is based on a Parzen window estimator and uses Renyi's quadratic definition of entropy and a distance measure based on the Cauchy-Schwarz inequality. We apply the algorithm to the difficult problem of vehicle pose estimation in synthetic aperture radar (SAR) imagery, with very good results.
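The abstract only names the estimator's ingredients, so the sketch below illustrates, under the standard definitions, how a Gaussian Parzen window reduces Renyi's quadratic entropy and a Cauchy-Schwarz distance between two sample sets to sums of pairwise kernel evaluations. The function names, kernel width, and sample data are illustrative assumptions and are not taken from the paper itself.

```python
import numpy as np

def gaussian_kernel(diff, sigma):
    """Isotropic multivariate Gaussian kernel evaluated at each row of `diff`."""
    d = diff.shape[1]
    norm = (2.0 * np.pi * sigma**2) ** (d / 2.0)
    return np.exp(-np.sum(diff**2, axis=1) / (2.0 * sigma**2)) / norm

def information_potential(x, y, sigma):
    """Parzen-window estimate of the integral of p_x(u) * p_y(u) du.

    With Gaussian windows of width `sigma`, the convolution of two kernels is
    a Gaussian of width sqrt(2)*sigma evaluated at the pairwise differences,
    so the integral becomes an average over all sample pairs.
    """
    diffs = x[:, None, :] - y[None, :, :]            # (N, M, d) pairwise differences
    k = gaussian_kernel(diffs.reshape(-1, x.shape[1]), np.sqrt(2.0) * sigma)
    return k.mean()

def renyi_quadratic_entropy(x, sigma):
    """H_2(X) = -log integral p(u)^2 du, estimated with a Parzen window."""
    return -np.log(information_potential(x, x, sigma))

def cauchy_schwarz_distance(x, y, sigma):
    """Cauchy-Schwarz based distance between the densities underlying x and y.

    D_CS = -log( (integral p q)^2 / (integral p^2 * integral q^2) ) >= 0,
    with equality when the two Parzen estimates coincide.
    """
    v_xy = information_potential(x, y, sigma)
    v_xx = information_potential(x, x, sigma)
    v_yy = information_potential(y, y, sigma)
    return -np.log(v_xy**2 / (v_xx * v_yy))

if __name__ == "__main__":
    # Toy usage on two 2-D Gaussian clouds (illustrative data only).
    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, size=(200, 2))
    b = rng.normal(1.5, 1.0, size=(200, 2))
    print("H2(a)      :", renyi_quadratic_entropy(a, sigma=0.5))
    print("D_CS(a, b) :", cauchy_schwarz_distance(a, b, sigma=0.5))
```

Because these quantities are smooth functions of pairwise sample differences, they can be differentiated with respect to a mapper's outputs, which is what makes them usable as training criteria for a nonlinear mapper.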