Divergence measures based on the Shannon entropy
- 1 January 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 37 (1), 145-151
- https://doi.org/10.1109/18.61115
Abstract
A novel class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error is established in terms of bounds. These bounds are crucial in many applications of divergence measures. The measures are also well characterized by the properties of nonnegativity, finiteness, semiboundedness, and boundedness.
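The class of measures introduced here includes what is now widely known as the Jensen-Shannon divergence. As an illustrative sketch (not code from the paper itself), the snippet below contrasts it with the Kullback-Leibler divergence on two distributions with disjoint support: the Kullback divergence is infinite because absolute continuity fails, while the entropy-based measure remains finite and bounded.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence in nats; infinite when q assigns
    zero probability to an outcome that p does not (absolute continuity fails)."""
    if any(qi == 0 and pi > 0 for pi, qi in zip(p, q)):
        return math.inf
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js(p, q):
    """Jensen-Shannon divergence: average KL divergence of p and q
    to their equal-weight mixture m = (p + q) / 2."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Distributions with disjoint support: KL blows up, JS stays finite.
p = [1.0, 0.0]
q = [0.0, 1.0]
print(kl(p, q))  # inf
print(js(p, q))  # log 2, the maximum value (in nats)
```

Because each argument is always absolutely continuous with respect to the mixture, the measure is defined for any pair of distributions and never exceeds log 2 (in nats), illustrating the finiteness and boundedness properties noted in the abstract.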