About distances of discrete distributions satisfying the data processing theorem of information theory
- 1 July 1997
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 43 (4), 1288-1293
- https://doi.org/10.1109/18.605597
Abstract
The distances of discrete probability distributions are considered. Necessary and sufficient conditions for validity of the data processing theorem of information theory are established. These conditions are applied to the Burbea-Rao (1982) divergences and Bregman (1967) distances.
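The data processing theorem referred to in the abstract states that a distance between distributions should not increase when both are passed through the same channel (stochastic matrix). As a minimal illustration, the following sketch checks this numerically for the Kullback-Leibler divergence, one distance for which the theorem is known to hold; the distributions and channel matrix here are made up for the example.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def apply_channel(p, W):
    """Push distribution p through a row-stochastic matrix W: (pW)_j = sum_i p_i W_ij."""
    return [sum(p[i] * W[i][j] for i in range(len(p))) for j in range(len(W[0]))]

# Example 3-point distributions and a 3x3 stochastic channel (rows sum to 1).
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
W = [[0.7, 0.2, 0.1],
     [0.1, 0.8, 0.1],
     [0.2, 0.3, 0.5]]

before = kl(p, q)
after = kl(apply_channel(p, W), apply_channel(q, W))
# Data processing theorem: processing through W cannot increase the divergence.
assert after <= before + 1e-12
```

The paper characterizes exactly which distances (including members of the Burbea-Rao and Bregman families) satisfy this monotonicity for all stochastic matrices W; the KL divergence above is the classical case.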
This publication has 13 references indexed in Scilit; those listed include:
- Maximum entropy and related methods, published by Institute of Electrical and Electronics Engineers (IEEE), 2002
- Elements of Information Theory, published by Wiley, 2001
- Generalized cutoff rates and Renyi's information measures, IEEE Transactions on Information Theory, 1995
- Majorization, monotonicity of relative entropy, and stochastic matrices, published by American Mathematical Society (AMS), 1993
- Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems, The Annals of Statistics, 1991
- Information-theoretic asymptotics of Bayes methods, IEEE Transactions on Information Theory, 1990
- Goodness-of-Fit Statistics for Discrete Multivariate Data, published by Springer Nature, 1988
- The Strong Ergodic Theorem for Densities: Generalized Shannon-McMillan-Breiman Theorem, The Annals of Probability, 1985
- Robust Statistics, published by Wiley, 1981
- The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming, USSR Computational Mathematics and Mathematical Physics, 1967