Results on learnability and the Vapnik-Chervonenkis dimension
- 1 January 1988
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
The problem of learning a concept from examples in a distribution-free model is considered. The notion of dynamic sampling, wherein the number of examples examined can increase with the complexity of the target concept, is introduced. This method is used to establish the learnability of various concept classes with an infinite Vapnik-Chervonenkis (VC) dimension. An important variation on the problem of learning from examples, called approximating from examples, is also discussed. The problem of computing the VC dimension of a finite concept set defined on a finite domain is considered.
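For the finite case mentioned in the last sentence of the abstract, the VC dimension can be computed directly from the definition: it is the size of the largest subset of the domain that the concept class shatters (realizes every dichotomy of). The following is a minimal brute-force sketch, not the algorithm studied in the paper; the example concept class (intervals over a small domain) is chosen purely for illustration.

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class.

    domain: list of points; concepts: iterable of sets (subsets of domain).
    A set S is shattered if every subset of S arises as C & S for some
    concept C; the VC dimension is the size of the largest shattered set.
    """
    best = 0
    for k in range(1, len(domain) + 1):
        shattered_some = False
        for cand in combinations(domain, k):
            s = frozenset(cand)
            traces = {frozenset(c & s) for c in concepts}
            if len(traces) == 2 ** k:  # every dichotomy of s is realized
                shattered_some = True
                break
        if shattered_some:
            best = k
        else:
            # subsets of shattered sets are shattered, so if no set of
            # size k is shattered, no larger set can be either
            break
    return best

# Example: closed intervals over {1, 2, 3, 4} have VC dimension 2
domain = [1, 2, 3, 4]
intervals = [set(range(a, b + 1)) for a in domain for b in domain if a <= b]
intervals.append(set())
print(vc_dimension(domain, intervals))  # 2
```

The early exit relies on the standard monotonicity fact that any subset of a shattered set is itself shattered; the full search is exponential in the domain size, which is consistent with the abstract treating the computation as a nontrivial problem.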