On the capacity of neural networks with binary weights
- 1 January 1992
- journal article
- Published by IOP Publishing in Journal of Physics A: Mathematical and General
- Vol. 25 (2), 367–380
- https://doi.org/10.1088/0305-4470/25/2/017
Abstract
The authors study the critical capacity (α_c) of multilayered networks with binary couplings. They show that, for any network with a tree-like architecture after the first hidden layer, no fixed internal representation is required. Building on Gardner's calculations, they apply statistical mechanics to the simplest network with two layers of adaptive weights. Following the same approach as for the binary perceptron, they find from the zero-entropy point a critical capacity α_c = 0.92. They discuss the validity of this result in light of exhaustive-search simulations on small networks.
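The abstract mentions exhaustive-search simulations on small networks. A minimal sketch of that kind of experiment, for the single-layer binary perceptron (all function names and parameters here are illustrative assumptions, not the authors' code): enumerate every ±1 weight vector and check what fraction of random pattern sets can be stored exactly.

```python
# Hedged sketch of an exhaustive search over binary (+/-1) couplings,
# in the spirit of the small-network simulations mentioned in the abstract.
# Names (stores_all, can_store) are illustrative, not from the paper.
import itertools
import random

def stores_all(weights, patterns, labels):
    """True if sign(w . x) reproduces every label (zero threshold).
    With an odd number of +/-1 inputs and weights, w . x is never zero."""
    for x, y in zip(patterns, labels):
        s = sum(w * xi for w, xi in zip(weights, x))
        if (1 if s > 0 else -1) != y:
            return False
    return True

def can_store(n, p, trials=20, seed=0):
    """Fraction of random pattern sets (p patterns of n +/-1 inputs,
    random +/-1 labels) stored by at least one of the 2^n binary
    weight vectors, found by brute-force enumeration."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        patterns = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(p)]
        labels = [rng.choice((-1, 1)) for _ in range(p)]
        if any(stores_all(w, patterns, labels)
               for w in itertools.product((-1, 1), repeat=n)):
            hits += 1
    return hits / trials
```

As the load α = p/n crosses the critical capacity, the storable fraction drops sharply; for the binary perceptron α_c ≈ 0.83, while the two-layer network studied in the paper gives α_c = 0.92. The enumeration cost grows as 2^n, which is why such direct checks are limited to small networks.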
This publication has 9 references:
- Finite-size effects and bounds for perceptron models. Journal of Physics A: Mathematical and General, 1991
- Storage Capacity of a Multilayer Neural Network with Binary Weights. Europhysics Letters, 1991
- Statistical mechanics of a multilayered neural network. Physical Review Letters, 1990
- Capacity of neural networks with discrete synaptic couplings. Journal of Physics A: Mathematical and General, 1990
- Critical storage capacity of the J = ±1 neural network. Journal of Physics A: Mathematical and General, 1989
- Storage capacity of memory networks with binary couplings. Journal de Physique, 1989
- Optimal storage properties of neural network models. Journal of Physics A: Mathematical and General, 1988
- The space of interactions in neural network models. Journal of Physics A: Mathematical and General, 1988
- Stability of the Sherrington-Kirkpatrick solution of a spin glass model. Journal of Physics A: Mathematical and General, 1978