Abstract
Simple information-theoretic inequalities are used to derive a lower bound on the Vapnik-Chervonenkis (VC) complexity of neural networks. The bound is expressed in terms of the average entropy that appears in the statistical mechanics approach to the network's generalization problem. Within the annealed theory, exact bounds on the VC dimension or the storage capacity can be calculated explicitly, without recourse to the replica method. For the parity machine, the resulting capacity estimates asymptotically match known upper bounds as the number of hidden units grows large.