Microscopic Equations and Stability Conditions in Optimal Neural Networks
- 1 May 1995
- journal article
- Published by IOP Publishing in Europhysics Letters
- Vol. 30 (4), 245-250
- https://doi.org/10.1209/0295-5075/30/4/010
Abstract
Using the cavity method I derive the microscopic equations and their stability condition for information learning in neural networks, optimized with arbitrary performance functions in terms of the aligning fields of the examples. In the thermodynamic limit the aligning fields are well-defined functions of the cavity fields. Iterating the microscopic equations provides a general algorithm for network learning, supported by simulations in the maximally stable perceptron and the committee tree.
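As a rough illustration of the kind of iterative, field-based learning the abstract describes, the sketch below trains a maximally stable perceptron with the classical MinOver rule (repeatedly reinforcing the example with the smallest aligning field). This is a related standard algorithm shown for context, not the paper's cavity-equation iteration; all names and the toy data are illustrative.

```python
# Sketch: maximally stable perceptron via the MinOver rule.
# At each step, find the example with the smallest aligning field
# y * (w . x) and nudge the weights toward it. For linearly separable
# data this converges to the maximum-stability solution.
# Illustrative only -- not the paper's cavity algorithm.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def minover(examples, labels, epochs=500):
    n = len(examples[0])
    w = [0.0] * n
    for _ in range(epochs):
        # aligning field of each example under the current weights
        fields = [y * dot(w, x) for x, y in zip(examples, labels)]
        worst = min(range(len(fields)), key=fields.__getitem__)
        y, x = labels[worst], examples[worst]
        # reinforce the least-aligned example
        w = [wi + y * xi / n ** 0.5 for wi, xi in zip(w, x)]
    return w

# toy linearly separable data (hypothetical)
X = [(1.0, 2.0), (2.0, 1.0), (-1.0, -1.5), (-2.0, -0.5)]
Y = [1, 1, -1, -1]
w = minover(X, Y)
print(all(y * dot(w, x) > 0 for x, y in zip(X, Y)))
```

After training, every example has a positive aligning field, i.e. all patterns are stored with positive stability, the quantity the paper's performance functions are expressed in.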