The dependence identification neural network construction algorithm
- 1 January 1996
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 7 (1) , 3-15
- https://doi.org/10.1109/72.478388
Abstract
An algorithm for constructing and training multilayer neural networks, called dependence identification, is presented in this paper. Its distinctive features are that (i) it transforms the training problem into a set of quadratic optimization problems that are solved by systems of linear equations; (ii) it constructs an appropriate network to meet the training specifications; and (iii) the resulting network architecture and weights can be further refined with standard training algorithms, such as backpropagation. This gives a significant speedup in the development time of the neural network and decreases the amount of trial and error usually associated with network development.
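The abstract's central idea, replacing iterative gradient descent with quadratic optimization problems solved through linear equations, can be illustrated with a minimal sketch. This is not the paper's exact procedure; it only shows, under simplifying assumptions, how a single sigmoid layer's weights can be obtained in closed form by mapping target outputs through the inverse sigmoid and solving the resulting linear least-squares (quadratic) problem. All function names and data here are illustrative.

```python
import numpy as np

def fit_sigmoid_layer(X, Y, eps=1e-6):
    """Illustrative only: solve a quadratic problem via linear equations.

    Targets Y in (0, 1) are mapped through the logit (inverse sigmoid),
    then the weights W minimizing ||X_aug @ W - logit(Y)||^2 are found
    with a single linear least-squares solve -- no gradient iterations.
    """
    Y = np.clip(Y, eps, 1 - eps)                       # keep logit finite
    T = np.log(Y / (1 - Y))                            # inverse-sigmoid targets
    X_aug = np.hstack([X, np.ones((X.shape[0], 1))])   # append bias column
    W, *_ = np.linalg.lstsq(X_aug, T, rcond=None)      # closed-form solve
    return W

def forward(X, W):
    """Apply the fitted layer: sigmoid(X_aug @ W)."""
    X_aug = np.hstack([X, np.ones((X.shape[0], 1))])
    return 1.0 / (1.0 + np.exp(-(X_aug @ W)))

# Toy usage on a linearly separable problem (hypothetical data).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
Y = ((X[:, 0] + X[:, 1] > 0).astype(float) * 0.98 + 0.01).reshape(-1, 1)
W = fit_sigmoid_layer(X, Y)
acc = np.mean((forward(X, W) > 0.5) == (Y > 0.5))
```

A full multilayer construction, as the abstract describes, would grow the network and repeat such solves to meet the training specifications; the sketch above covers only the single-layer linear-equation step.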
This publication has 11 references indexed in Scilit:
- Neural network training via quadratic optimization. Published by IEEE, 2003.
- Learning distributed representations of concepts using linear relational embedding. IEEE Transactions on Knowledge and Data Engineering, 2001.
- The Upstart Algorithm: A Method for Constructing and Training Feedforward Neural Networks. Neural Computation, 1990.
- A Convergence Theorem for Sequential Learning in Two-Layer Perceptrons. Europhysics Letters, 1990.
- Constructive approximations for neural networks by sigmoidal functions. Proceedings of the IEEE, 1990.
- Approximation by superpositions of a sigmoidal function. Mathematics of Control, Signals, and Systems, 1989.
- Learning in feedforward layered networks: the tiling algorithm. Journal of Physics A: General Physics, 1989.
- Multilayer feedforward networks are universal approximators. Neural Networks, 1989.
- On the capabilities of multilayer perceptrons. Journal of Complexity, 1988.
- Counterpropagation networks. Applied Optics, 1987.