Entropy and Uncertainty
- 1 December 1986
- journal article
- research article
- Published by Cambridge University Press (CUP) in Philosophy of Science
- Vol. 53 (4), 467–491
- https://doi.org/10.1086/289336
Abstract
This essay is, primarily, a discussion of four results about the principle of maximizing entropy (MAXENT) and its connections with Bayesian theory. Result 1 provides a restricted equivalence between the two: where the Bayesian model for MAXENT inference uses an “a priori” probability that is uniform, and where all MAXENT constraints are limited to 0–1 expectations for simple indicator-variables. The other three results report on an inability to extend the equivalence beyond these specialized constraints. Result 2 establishes a sensitivity of MAXENT inference to the choice of the algebra of possibilities even though all empirical constraints imposed on the MAXENT solution are satisfied in each measure space considered. The resulting MAXENT distribution is not invariant over the choice of measure space. Thus, old and familiar problems with the Laplacian principle of Insufficient Reason also plague MAXENT theory. Result 3 builds upon the findings of Friedman and Shimony (1971; 1973) and demonstrates the absence of an exchangeable, Bayesian model for predictive MAXENT distributions when the MAXENT constraints are interpreted according to Jaynes's (1978) prescription for his (1963) Brandeis Dice problem. Lastly, Result 4 generalizes the Friedman and Shimony objection to cross-entropy (Kullback-information) shifts subject to a constraint of a new odds-ratio for two disjoint events.
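The Brandeis Dice problem referenced in the abstract is the standard illustration of MAXENT inference under an expectation constraint. As a minimal sketch (not the paper's own method), assuming Jaynes's usual reading of the constraint as a fixed expected face value of 4.5 for a six-sided die, the MAXENT distribution takes the exponential-family form p_i ∝ exp(λ·i), with λ chosen to satisfy the constraint; bisection on λ is one simple way to solve for it:

```python
import math

def maxent_die(target_mean, faces=tuple(range(1, 7)), tol=1e-12):
    """Maximum-entropy distribution over die faces subject to a fixed
    expected face value. Uses the exponential-family form
    p_i proportional to exp(lam * i) and bisection on lam."""
    def mean(lam):
        # Expected face value under the tilted distribution for this lam.
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -10.0, 10.0  # mean(lam) is increasing in lam on this bracket
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

# Brandeis Dice: constrain the mean to 4.5 (rather than the fair-die 3.5).
probs = maxent_die(4.5)
```

The solution skews probability toward the high faces (the uniform distribution would give mean 3.5), which is exactly the kind of predictive MAXENT distribution whose Bayesian-exchangeable modeling the abstract's Result 3 addresses.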
References
- On the convexity of some divergence measures based on entropy functions. IEEE Transactions on Information Theory, 1982
- Direct Inference and Confirmational Conditionalization. Philosophy of Science, 1981
- Properties of cross-entropy minimization. IEEE Transactions on Information Theory, 1981
- Bayesian Conditionalisation and the Principle of Minimum Information. The British Journal for the Philosophy of Science, 1980
- Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory, 1980
- Why I am not an objective Bayesian; some reflections prompted by Rosenkrantz. Theory and Decision, 1979
- On the Kullback information measure as a basis for information theory: Comments on a proposal by Hobson and Chang. Journal of Statistical Physics, 1973
- Comment on the interpretation of inductive probabilities. Journal of Statistical Physics, 1973
- Restoring with Maximum Likelihood and Maximum Entropy. Journal of the Optical Society of America, 1972
- Information Theory and Statistical Mechanics. Physical Review, 1957