Axiomatic characterization of the directed divergences and their linear combinations
- 1 November 1979
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 25 (6) , 709-716
- https://doi.org/10.1109/tit.1979.1056113
Abstract
The directed divergences of two probability densities p and q are given by $\int p(x) \log (p(x)/q(x))\,dx$ and by the same expression with p and q interchanged; the divergence is the sum of the directed divergences. These quantities have applications in information theory and to the problem of assigning prior probabilities subject to constraints. It is shown that the directed divergences and their positive linear combinations, including the divergence, are characterized by axioms of *positivity*, *additivity*, and *finiteness*, which are fundamental in work on prior probabilities. In the course of the proof, the latter two are shown to imply yet another axiom: *linear invariance*.
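As a concrete illustration of the quantities defined in the abstract, the following sketch computes the two directed divergences and their sum for discrete probability distributions (the discrete analogue of the integral above); the function names are illustrative, not from the paper:

```python
import math

def directed_divergence(p, q):
    """Discrete directed divergence: sum_i p_i * log(p_i / q_i).

    Terms with p_i == 0 contribute nothing (the 0 * log 0 convention).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def divergence(p, q):
    """The (symmetric) divergence: the sum of the two directed divergences."""
    return directed_divergence(p, q) + directed_divergence(q, p)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

# Positivity: both directed divergences are nonnegative,
# and zero exactly when p == q.
print(directed_divergence(p, q))   # > 0
print(directed_divergence(p, p))   # 0.0
print(divergence(p, q))            # symmetric in p and q
```

Note that `divergence(p, q) == divergence(q, p)` by construction, whereas each directed divergence alone is asymmetric.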
This publication has 7 references indexed in Scilit:
- Comments on "Prior probability and uncertainty" by Kashyap, R. L., IEEE Transactions on Information Theory, 1979
- A comparison of the Shannon and Kullback information measures, Journal of Statistical Physics, 1973
- On Shannon's entropy, directed divergence and inaccuracy, Probability Theory and Related Fields, 1972
- On directed divergence and inaccuracy, Probability Theory and Related Fields, 1972
- Prior probability and uncertainty, IEEE Transactions on Information Theory, 1971
- Prior Probabilities, IEEE Transactions on Systems Science and Cybernetics, 1968
- Maximum Entropy for Hypothesis Formulation, Especially for Multidimensional Contingency Tables, The Annals of Mathematical Statistics, 1963