Estimation of generalization capability by combination of new information criterion and cross validation
- 9 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. ii, 1-6
- https://doi.org/10.1109/ijcnn.1991.155303
Abstract
The authors propose a novel method of selecting the optimal neural network structure with maximum generalization capability. By expanding Akaike's information criterion, they propose a new information criterion that can estimate generalization capability without the maximum likelihood estimator of synaptic weights. The cross validation method is used to calculate the new information criterion. Computer simulation shows that the proposed information criterion can accurately predict the generalization capability of multilayer perceptrons, and thus the optimal number of hidden units can be determined.
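The paper itself is not reproduced on this page, but the general idea the abstract describes, scoring candidate model complexities by a criterion estimated via cross validation and picking the minimizer, can be sketched. The snippet below is a hypothetical illustration only: it uses polynomial degree as a stand-in for the number of hidden units and plain k-fold mean squared error as the score, not the authors' new information criterion (for comparison, Akaike's classical criterion being extended is AIC = -2 ln L + 2k).

```python
import numpy as np

# Hypothetical illustration: choose model complexity by minimizing a
# cross-validated error estimate. Polynomial degree stands in for the
# number of hidden units; this is NOT the authors' criterion.
rng = np.random.default_rng(0)
n = 200
x = np.linspace(-1.0, 1.0, n)
# Synthetic data whose true model is a degree-2 polynomial plus noise.
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0.0, 0.1, n)

def cv_error(x, y, degree, k=5):
    """Mean squared test error of a degree-`degree` polynomial, k-fold CV."""
    folds = np.array_split(rng.permutation(len(x)), k)
    errs = []
    for fold in folds:
        train = np.ones(len(x), dtype=bool)
        train[fold] = False
        coeffs = np.polyfit(x[train], y[train], degree)   # fit on k-1 folds
        resid = y[fold] - np.polyval(coeffs, x[fold])     # score on held-out fold
        errs.append(float(np.mean(resid**2)))
    return float(np.mean(errs))

scores = {d: cv_error(x, y, d) for d in range(1, 8)}
best = min(scores, key=scores.get)  # complexity with lowest estimated error
```

Underfit models (degree 1 here) show a large cross-validated error, while overly complex ones gain little or degrade, so the minimizer tracks the generalization-optimal complexity, which is the role the abstract assigns to the proposed criterion for hidden-unit selection.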
This publication has 7 references indexed in Scilit:
- Learning from examples in large neural networks, Physical Review Letters, 1990
- The Vapnik-Chervonenkis Dimension: Information versus Complexity in Learning, Neural Computation, 1989
- What Size Net Gives Valid Generalization?, Neural Computation, 1989
- On the approximate realization of continuous mappings by neural networks, Neural Networks, 1989
- Consistent inference of probabilities in layered networks: predictions and generalizations, published by Institute of Electrical and Electronics Engineers (IEEE), 1989
- Capabilities of three-layered perceptrons, published by Institute of Electrical and Electronics Engineers (IEEE), 1988
- A new look at the statistical model identification, IEEE Transactions on Automatic Control, 1974