The statistical mechanics of learning a rule
- 1 April 1993
- journal article
- research article
- Published by the American Physical Society (APS) in Reviews of Modern Physics
- Vol. 65 (2), 499-556
- https://doi.org/10.1103/revmodphys.65.499
Abstract
A summary is presented of the statistical mechanical theory of learning a rule with a neural network, a rapidly advancing area which is closely related to other inverse problems frequently encountered by physicists. By emphasizing the relationship between neural networks and strongly interacting physical systems, such as spin glasses, the authors show how learning theory has provided a workshop in which to develop new, exact analytical techniques.
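A standard setting in this literature is the teacher-student scenario: a fixed "teacher" perceptron defines the rule, a "student" network is trained on examples labeled by the teacher, and the generalization error follows from the teacher-student overlap. The following Python sketch is an illustrative example of that scenario, not code from the paper; the names (`teacher`, `student`), the dimensions `N` and `P`, and the use of the classical perceptron algorithm are assumptions made here for concreteness. For isotropic random inputs, the generalization error is arccos(R)/π, where R is the normalized overlap between student and teacher.

```python
# Minimal teacher-student perceptron sketch (illustrative only; not from the paper).
# A "teacher" weight vector defines the rule; a "student" perceptron is trained on
# P random examples labeled by the teacher, and the generalization error is
# estimated from the teacher-student overlap R via eps = arccos(R) / pi.
import numpy as np

rng = np.random.default_rng(0)
N = 200            # input dimension (assumed value)
P = 1000           # number of training examples, i.e. alpha = P / N = 5 (assumed)

teacher = rng.standard_normal(N)
teacher /= np.linalg.norm(teacher)

X = rng.standard_normal((P, N))          # random input patterns
y = np.sign(X @ teacher)                 # labels defined by the teacher rule

# Train the student with the classical perceptron algorithm.
student = np.zeros(N)
for _ in range(100):                     # sweeps over the training set
    updated = False
    for xi, label in zip(X, y):
        if label * (student @ xi) <= 0:  # misclassified (or on the boundary)
            student += label * xi
            updated = True
    if not updated:
        break                            # training set perfectly classified

# Generalization error from the normalized teacher-student overlap.
R = (student @ teacher) / np.linalg.norm(student)
eps = np.arccos(np.clip(R, -1.0, 1.0)) / np.pi
print(f"overlap R = {R:.3f}, estimated generalization error ~ {eps:.3f}")
```

As the number of examples per weight grows, the overlap R approaches 1 and the estimated generalization error falls, which is the kind of learning curve the statistical mechanical analysis characterizes analytically.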