Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
Open Access
- 1 January 1980
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 26 (1), 26-37
- https://doi.org/10.1109/tit.1980.1056144
Abstract
Within the framework of discrete probabilistic uncertain reasoning, a large literature exists justifying the maximum entropy inference process, $\operatorname{\mathbf{ME}}$, as optimal in the context of a single agent whose subjective probabilistic knowledge base is consistent. In particular, Paris and Vencovská completely characterised the $\operatorname{\mathbf{ME}}$ inference process by means of an attractive set of axioms which an inference process should satisfy. More recently, the second author extended the Paris-Vencovská axiomatic approach to inference processes in the context of several agents whose subjective probabilistic knowledge bases, while individually consistent, may be collectively inconsistent. In particular, he defined a natural multi-agent extension of the inference process $\operatorname{\mathbf{ME}}$ called the social entropy process, $\operatorname{\mathbf{SEP}}$. However, while $\operatorname{\mathbf{SEP}}$ has been shown to possess many attractive properties, those which are known are almost certainly insufficient to characterise it uniquely. It is therefore of particular interest to study those Paris-Vencovská principles valid for $\operatorname{\mathbf{ME}}$ whose immediate generalisations to the multi-agent case are not satisfied by $\operatorname{\mathbf{SEP}}$. One of these principles is the Irrelevant Information Principle, a powerful and appealing principle which very few inference processes satisfy even in the single-agent context. In this paper we investigate whether $\operatorname{\mathbf{SEP}}$ can satisfy an interesting modified generalisation of this principle.
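As a concrete illustration of the maximum entropy inference process $\operatorname{\mathbf{ME}}$ discussed in the abstract, the following minimal sketch (not taken from the paper; the four-atom knowledge base and its constraint values are hypothetical) computes the probability distribution of maximal Shannon entropy satisfying a toy set of linear constraints, using scipy.

```python
# A minimal illustrative sketch: computing ME(K) for a hypothetical knowledge
# base K of linear constraints over four atoms by maximising Shannon entropy.
import numpy as np
from scipy.optimize import minimize

def neg_entropy(w):
    """Negative Shannon entropy; a small epsilon guards against log(0)."""
    eps = 1e-12
    return float(np.sum(w * np.log(w + eps)))

# Toy knowledge base K (hypothetical): probabilities sum to 1, and the first
# two atoms together carry probability 0.7.
constraints = [
    {"type": "eq", "fun": lambda w: np.sum(w) - 1.0},
    {"type": "eq", "fun": lambda w: w[0] + w[1] - 0.7},
]
bounds = [(0.0, 1.0)] * 4
w0 = np.full(4, 0.25)  # start from the uniform distribution

result = minimize(neg_entropy, w0, method="SLSQP",
                  bounds=bounds, constraints=constraints)
print(result.x)  # approximately [0.35, 0.35, 0.15, 0.15]
```

As the output suggests, $\operatorname{\mathbf{ME}}$ distributes probability as uniformly as the constraints allow within each constrained group of atoms.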