Statistical mechanics of hypothesis evaluation
- 21 May 1994
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 27 (10), 3355-3363
- https://doi.org/10.1088/0305-4470/27/10/010
Abstract
Following ideas of Gull, Skilling and MacKay (1992), we develop and explore a statistical-mechanics framework through which one may assign values to the parameters of a model for a 'rule' (instanced here by the noisy linear perceptron), on the basis of data instancing the rule. The 'evidence' which the data offer in support of a given assignment is likened to the free energy of a system with quenched variables (the data): the most probable (MAP) assignments of parameters are those which minimize this free energy; tracking the free-energy minimum may lead to 'phase transitions' in the preferred assignments. We explore the extent to which the MAP assignments lead to optimal performance.
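The evidence-as-free-energy picture described in the abstract can be made concrete for Gaussian linear models, where the log evidence has a closed form. The sketch below is not taken from the paper; it uses the standard evidence formula for Bayesian linear regression (a noisy linear perceptron with Gaussian weight prior of precision `alpha` and Gaussian noise of precision `beta`), and shows that the 'free energy' (minus log evidence) is lower at hyperparameter assignments that match how the data were generated. All variable names here are illustrative.

```python
import numpy as np

def log_evidence(X, y, alpha, beta):
    """Log evidence ln P(D | alpha, beta) for y = X w + noise,
    with prior w ~ N(0, I/alpha) and noise precision beta.
    Minus this quantity plays the role of the free energy."""
    n, d = X.shape
    A = alpha * np.eye(d) + beta * X.T @ X        # posterior precision matrix
    m = beta * np.linalg.solve(A, X.T @ y)        # posterior mean (MAP weights)
    # Regularized data misfit evaluated at the posterior mean
    E = beta / 2 * np.sum((y - X @ m) ** 2) + alpha / 2 * m @ m
    _, logdet = np.linalg.slogdet(A)
    return (d / 2 * np.log(alpha) + n / 2 * np.log(beta)
            - E - logdet / 2 - n / 2 * np.log(2 * np.pi))

# Synthetic data: a teacher perceptron corrupted by noise of std 0.1
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=50)

# The assignment whose noise precision matches the data (beta ~ 100)
# yields higher evidence, i.e. lower free energy, than a mismatched one.
well_matched = log_evidence(X, y, alpha=1.0, beta=100.0)
mismatched = log_evidence(X, y, alpha=1.0, beta=1.0)
print(well_matched > mismatched)
```

Tracking the maximum of this evidence as the data vary is the discrete analogue of tracking the free-energy minimum in the paper; when two hyperparameter assignments exchange which one has the higher evidence, the preferred assignment jumps, which is the kind of 'phase transition' the abstract refers to.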
This publication has 8 references indexed in Scilit:
- Stochastic linear learning: exact test and training error averages. Neural Networks, 1993
- Learning and generalization in a linear perceptron stochastically trained with noisy data. Journal of Physics A: Mathematical and General, 1993
- Optimal Learning with a Neural Network. Europhysics Letters, 1993
- Bayesian Interpolation. Neural Computation, 1992
- Statistical mechanics of learning from examples. Physical Review A, 1992
- Generalization in a linear perceptron in the presence of noise. Journal of Physics A: Mathematical and General, 1992
- Phase transitions in simple learning. Journal of Physics A: Mathematical and General, 1989
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. Published by the Institute of Electrical and Electronics Engineers (IEEE), 1984