Abstract
The task of parametric model selection is cast in terms of a statistical mechanics on the space of probability distributions. Using the techniques of low-temperature expansions, we arrive at a systematic series for the Bayesian posterior probability of a model family that significantly extends known results in the literature. In particular, we arrive at a precise understanding of how Occam's Razor, the principle that simpler models should be preferred until the data justify more complex models, is automatically embodied by probability theory. These results require a measure on the space of model parameters, and we derive and discuss an interpretation of Jeffreys' prior distribution as a uniform prior over the distributions indexed by a family. Finally, we derive a theoretical index of the complexity of a parametric family relative to some true distribution, which we call the {\it razor} of the model. The form of the razor immediately suggests several interesting questions in the theory of learning that can be studied using the techniques of statistical mechanics.
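As a brief illustrative sketch (notation such as $I$, $J$, $\hat\theta$, $d$, and $N$ is ours, not quoted from the paper): for a $d$-parameter family, Jeffreys' prior is the normalized Riemannian volume element induced by the Fisher information matrix $I(\theta)$,
\[
w(\theta) \;=\; \frac{\sqrt{\det I(\theta)}}{\int d^d\theta'\,\sqrt{\det I(\theta')}},
\]
which is what makes it a uniform measure over the distributions indexed by the family rather than over the parameters themselves. Assuming standard regularity conditions, a Laplace (low-temperature) expansion of the log-evidence for $N$ data points then takes the familiar Occam-penalized form
\[
-\ln P(D \mid M) \;\approx\; -\ln P(D \mid \hat\theta) \;+\; \frac{d}{2}\ln\frac{N}{2\pi} \;+\; \ln \int d^d\theta\,\sqrt{\det I(\theta)} \;+\; \frac{1}{2}\ln\frac{\det J(\hat\theta)}{\det I(\hat\theta)},
\]
where $\hat\theta$ is the maximum-likelihood point and $J(\hat\theta)$ is the empirical Hessian of the per-sample negative log-likelihood. The terms beyond the first grow with the model's dimension and parameter volume, so complex families are penalized automatically; this is the sense in which Occam's Razor is embodied by probability theory.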