Network information criterion-determining the number of hidden units for an artificial neural network model
- 1 January 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (6), 865-872
- https://doi.org/10.1109/72.329683
Abstract
The problem of model selection, i.e., determining the number of hidden units, can be approached statistically by generalizing Akaike's information criterion (AIC) to apply to unfaithful (i.e., unrealizable) models with general loss criteria, including regularization terms. The relation between the training error and the generalization error is studied in terms of the number of training examples and the complexity of the network, which reduces to the number of parameters in the ordinary statistical theory of AIC. This relation leads to a new network information criterion that is useful for selecting the optimal network model based on a given training set.
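As an illustration of the style of model selection the abstract describes, the following sketch applies the classical AIC (which the paper's network information criterion generalizes to unrealizable models and regularized losses) to choose among polynomial models of increasing complexity. The data, model family, and noise level are hypothetical and not from the paper; the criterion balances training error against the number of parameters.

```python
import numpy as np

# Hypothetical example: select a polynomial degree by classical AIC,
# AIC = N * ln(RSS / N) + 2k, where k is the number of parameters.
# The paper's NIC plays the analogous role for neural network models.
rng = np.random.default_rng(0)
N = 50
x = np.linspace(-3, 3, N)
y = np.sin(x) + 0.1 * rng.standard_normal(N)  # assumed toy data

def aic(degree):
    # Fit the polynomial (training) and score it: training error alone
    # always improves with complexity, so AIC adds a 2k penalty.
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    k = degree + 1
    return N * np.log(rss / N) + 2 * k

scores = {d: aic(d) for d in range(1, 9)}
best = min(scores, key=scores.get)
print("selected degree:", best)
```

The selected model is the one minimizing the criterion; a straight line (degree 1) underfits the sinusoid and is penalized through its large residual sum of squares, while very high degrees are penalized through the parameter-count term.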