Learning and generalization in a two-layer neural network: The role of the Vapnik-Chervonenkis dimension
- 28 March 1994
- journal article
- research article
- Published by American Physical Society (APS) in Physical Review Letters
- Vol. 72 (13), 2113-2116
- https://doi.org/10.1103/physrevlett.72.2113
Abstract
Bounds for the generalization ability of neural networks based on Vapnik-Chervonenkis (VC) theory are compared with statistical mechanics results for the case of the parity machine. For fixed phase space dimension, the VC dimension can grow arbitrarily large by increasing the number K of hidden units. Generalization is impossible up to a critical number of training examples that grows with the VC dimension. The asymptotic decrease of the generalization error, however, comes out independent of K and is strongly overestimated by the VC bounds. This shows that phase space dimension and VC dimension can play independent and different roles in the generalization process.
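For orientation, the architecture being compared against the VC bounds is the parity machine: K hidden perceptrons, each reading its own block of the N-dimensional input, whose ±1 outputs are multiplied to give the final label, so that the phase-space dimension N can stay fixed while K (and with it the VC dimension) grows. The Python snippet below is only a minimal sketch of that architecture under the assumption of non-overlapping receptive fields (the tree layout); the block splitting, function name, and random inputs are illustrative choices, not the paper's construction.

```python
import numpy as np

def parity_machine_output(w, x):
    """Output of a K-hidden-unit parity machine (illustrative sketch).

    w : (K, N_per_unit) array of hidden-unit weight vectors
        (non-overlapping receptive fields, tree layout assumed).
    x : (K, N_per_unit) array, the N-dimensional input split into K blocks.
    Returns the product of the hidden units' sign outputs (+1 or -1).
    """
    hidden = np.sign(np.einsum("kn,kn->k", w, x))  # K perceptron outputs
    return int(np.prod(hidden))

# Illustrative usage: fixed phase-space dimension N, K hidden units
rng = np.random.default_rng(0)
N, K = 120, 3                          # N chosen divisible by K here
w = rng.standard_normal((K, N // K))   # hypothetical teacher/student weights
x = rng.standard_normal((K, N // K))   # one random input example
print(parity_machine_output(w, x))
```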