The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning
Open Access
- 1 September 1989
- journal article
- Published by MIT Press in Neural Computation
- Vol. 1 (3), 312-317
- https://doi.org/10.1162/neco.1989.1.3.312
Abstract
When feasible, learning is a very attractive alternative to explicit programming. This is particularly true in areas where the problems do not lend themselves to systematic programming, such as pattern recognition in natural environments. The feasibility of learning an unknown function from examples depends on two questions: 1. Do the examples convey enough information to determine the function? 2. Is there a speedy way of constructing the function from the examples? These questions contrast the roles of information and complexity in learning. While the two roles share some ground, they are conceptually and technically different. In the common language of learning, the information question is that of generalization and the complexity question is that of scaling. The work of Vapnik and Chervonenkis (1971) provides the key tools for dealing with the information issue. In this review, we develop the main ideas of this framework and discuss how complexity fits in.
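For orientation, the framework the review develops answers the information question with the growth function, the VC dimension, and a uniform-convergence bound. The following is a minimal sketch of one standard form of these statements; the notation (hypothesis set $\mathcal{H}$, growth function $m_{\mathcal{H}}$, sample size $N$, tolerance $\epsilon$) is assumed here and does not appear in the abstract itself.

```latex
% Notation assumed for illustration: hypothesis set \mathcal{H}, sample
% size N, tolerance \epsilon; none of these symbols come from the abstract.
% Growth function: the most dichotomies \mathcal{H} can induce on N points.
\[
  m_{\mathcal{H}}(N) \;=\; \max_{x_1,\dots,x_N}
  \bigl|\{\, (h(x_1),\dots,h(x_N)) : h \in \mathcal{H} \,\}\bigr|
\]
% VC dimension: the largest N that \mathcal{H} can still shatter.
\[
  d_{\mathrm{VC}}(\mathcal{H}) \;=\; \max\{\, N : m_{\mathcal{H}}(N) = 2^{N} \,\}
\]
% One standard form of the VC bound: over N i.i.d. examples, the in-sample
% error E_in and out-of-sample error E_out are uniformly close over \mathcal{H}.
\[
  \Pr\Bigl[\, \sup_{h \in \mathcal{H}}
  \bigl| E_{\mathrm{in}}(h) - E_{\mathrm{out}}(h) \bigr| > \epsilon \,\Bigr]
  \;\le\; 4\, m_{\mathcal{H}}(2N)\, e^{-\epsilon^{2} N / 8}
\]
```

When $d_{\mathrm{VC}}$ is finite, $m_{\mathcal{H}}(N)$ grows only polynomially in $N$, so the right-hand side vanishes as $N$ grows; this is the sense in which the examples convey enough information to pin down the function.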
References
- Baum, E. B., and Haussler, D. What Size Net Gives Valid Generalization? Neural Computation, 1989
- Judd, J. S. On the complexity of loading shallow neural networks. Journal of Complexity, 1988
- Valiant, L. G. A theory of the learnable. Communications of the ACM, 1984
- Vapnik, V. N., and Chervonenkis, A. Ya. On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities. Theory of Probability and Its Applications, 1971
- Cover, T. M. Geometrical and Statistical Properties of Systems of Linear Inequalities with Applications in Pattern Recognition. IEEE Transactions on Electronic Computers, 1965