Statistical physics estimates for the complexity of feedforward neural networks
- 1 April 1995
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 51 (4), 3613-3618
- https://doi.org/10.1103/physreve.51.3613
Abstract
Using simple information-theoretic inequalities, a lower bound on the Vapnik-Chervonenkis (VC) complexity of neural networks is investigated. The bound is expressed through the average entropy used in the statistical mechanics approach to the network's generalization problem. Within the annealed theory, exact bounds on the VC dimension or the storage capacity can be calculated explicitly, without using the replica method. For the parity machine, the capacity estimates asymptotically match known upper bounds as the number of hidden units grows large.
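To make the annealed counting argument concrete, here is a minimal illustrative sketch, assuming the simplest toy case of an Ising perceptron with N binary weights storing p = αN random patterns (this is an assumed example, not the paper's multilayer derivation):

```latex
% Illustrative sketch (assumption: single-layer Ising perceptron with N binary
% weights, random input-output pairs). A fixed weight vector reproduces each
% random output with probability 1/2, so the annealed average of the number of
% weight configurations compatible with p patterns is
\[
  \langle \mathcal{N}(p) \rangle \;=\; 2^{N}\, 2^{-p},
\]
% and the annealed entropy per weight, with \alpha = p/N,
\[
  s_{\mathrm{ann}}(\alpha) \;=\; \frac{1}{N}\ln \langle \mathcal{N}(p) \rangle
  \;=\; (1-\alpha)\ln 2,
\]
% becomes negative for \alpha > 1, giving the annealed storage-capacity bound
\[
  \alpha_c \;\le\; 1
\]
% without any replica calculation.
```

In this toy case the bound coincides with the known annealed result for the binary perceptron; for multilayer architectures such as the parity machine, the analogous annealed entropy yields capacity bounds that grow with the number of hidden units, which is the regime discussed in the abstract.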