Algorithmic randomness and physical entropy
- 1 October 1989
- research article
- Published by American Physical Society (APS) in Physical Review A
- Vol. 40 (8), 4731-4751
- https://doi.org/10.1103/physreva.40.4731
Abstract
Algorithmic randomness provides a rigorous, entropylike measure of the disorder of an individual, microscopic, definite state of a physical system. It is defined by the size (in binary digits) of the shortest message specifying the microstate uniquely up to the assumed resolution. Equivalently, algorithmic randomness can be expressed as the number of bits in the smallest program for a universal computer that can reproduce the state in question (for instance, by plotting it with the assumed accuracy). In contrast to the traditional definitions of entropy, algorithmic randomness can be used to measure disorder without any recourse to probabilities. Algorithmic randomness is typically very difficult to calculate exactly but relatively easy to estimate. In large systems, probabilistic ensemble definitions of entropy (e.g., the coarse-grained entropy of Gibbs, Boltzmann's entropy H = ln W, and Shannon's information-theoretic entropy) provide accurate estimates of the algorithmic entropy of an individual system or of its average value for an ensemble. One is thus able to rederive much of thermodynamics and statistical mechanics in a setting very different from the usual one.

Physical entropy, I suggest, is a sum of (i) the missing information, measured by Shannon's formula, and (ii) the algorithmic information content, the algorithmic randomness, present in the available data about the system. This definition of entropy is essential in describing the operation of thermodynamic engines from the viewpoint of information-gathering and -using systems. These Maxwell-demon-type entities are capable of acquiring and processing information and can therefore "decide", on the basis of the results of their measurements and computations, the best strategy for extracting energy from their surroundings. From their internal point of view the outcome of each measurement is definite.
The limits on thermodynamic efficiency arise not from ensemble considerations, but rather reflect basic laws of computation. Thus the inclusion of algorithmic randomness in the definition of physical entropy allows one to formulate thermodynamics from the Maxwell demon's point of view.
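The abstract's central quantities can be illustrated numerically. Exact algorithmic randomness is uncomputable, but any lossless compressor gives an upper bound: the compressed size of a recorded microstate is the length of one program that reproduces it. The sketch below (an illustration assuming `zlib` as the stand-in compressor, not the paper's method) compares that per-symbol bound for a disordered sequence of fair coin flips against an ordered sequence of the same length, alongside the Shannon ensemble entropy H = -Σ p log₂ p; for the disordered state the compressed bound approaches the Shannon value of 1 bit per flip, while the ordered state falls far below it.

```python
import math
import random
import zlib

def shannon_bits_per_symbol(p):
    """Shannon entropy H = -sum_i p_i log2 p_i of a discrete distribution."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def compressed_bits(microstate: bytes) -> int:
    """Upper bound on the algorithmic randomness of one definite microstate:
    the size in bits of a compressed (hence reconstructible) description."""
    return 8 * len(zlib.compress(microstate, 9))

random.seed(0)
n = 100_000
# A "thermal" microstate: n fair coin flips recorded as '0'/'1' characters.
disordered = bytes(random.choice(b"01") for _ in range(n))
# A highly ordered microstate of the same length and letter frequencies.
ordered = b"01" * (n // 2)

print(shannon_bits_per_symbol([0.5, 0.5]))   # ensemble entropy: 1 bit/flip
print(compressed_bits(disordered) / n)       # near the Shannon value
print(compressed_bits(ordered) / n)          # far below it
```

This mirrors the estimate described above: for a typical member of the ensemble the probabilistic entropy is a good proxy for the algorithmic quantity, whereas for an atypically regular microstate the two diverge, which is exactly the gap a Maxwell-demon-type observer can exploit.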