Information Theory and Statistical Mechanics
- 15 May 1957
- journal article
- research article
- Published by the American Physical Society (APS) in Physical Review
- Vol. 106 (4), 620-630
- https://doi.org/10.1103/physrev.106.620
Abstract
Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available.
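The computational rule the abstract points to can be made concrete. Below is a minimal LaTeX sketch of the standard Lagrange-multiplier derivation the abstract alludes to; the symbols $p_i$, $E_i$, $\lambda$, and $Z$ are conventional notation introduced here for illustration, not quoted from this page.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Maximize the Shannon entropy  S = -\sum_i p_i \ln p_i
% subject to normalization and one known expectation value:
\begin{align}
  \sum_i p_i &= 1, & \sum_i p_i E_i &= \langle E \rangle.
\end{align}
% Requiring the Lagrangian to be stationary in each p_i
% yields the canonical (maximum-entropy) distribution:
\begin{align}
  p_i &= \frac{e^{-\lambda E_i}}{Z(\lambda)}, &
  Z(\lambda) &= \sum_i e^{-\lambda E_i}.
\end{align}
% The partition function Z enters as the normalization constant;
% the multiplier \lambda is fixed by the constraint via
% \langle E \rangle = -\partial \ln Z / \partial \lambda.
\end{document}
```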