Bounds on the number of samples needed for neural learning
- 1 January 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 2 (6), 548-558
- https://doi.org/10.1109/72.97932
Abstract
The relationship between the number of hidden nodes in a neural network, the complexity of a multiclass discrimination problem, and the number of samples needed for effective learning is discussed. Bounds on the number of samples needed for effective learning are given. It is shown that Omega(min(d, n)M) boundary samples are required for successful classification of M clusters of samples using a two-hidden-layer neural network with d-dimensional inputs and n nodes in the first hidden layer.
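As a rough illustration (not taken from the paper itself), the stated lower bound can be evaluated for concrete values; the sketch below simply computes min(d, n)·M, the order of the Omega(min(d, n)M) bound with constant factors omitted, and the function name and example values are hypothetical.

```python
# Minimal sketch of the paper's asymptotic lower bound on boundary samples.
# For d-dimensional inputs, n nodes in the first hidden layer, and M clusters,
# the number of boundary samples required grows as Omega(min(d, n) * M).
# Constant factors are omitted; this gives only the order of the bound.

def boundary_sample_lower_bound(d: int, n: int, M: int) -> int:
    """Order of the Omega(min(d, n) * M) lower bound (constants omitted)."""
    return min(d, n) * M

# Example: 16-dimensional inputs, 8 first-hidden-layer nodes, 10 clusters
# give a lower bound on the order of min(16, 8) * 10 = 80 boundary samples.
print(boundary_sample_lower_bound(16, 8, 10))  # -> 80
```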