A neural-network learning theory and a polynomial time RBF algorithm
- 1 November 1997
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 8 (6), 1301-1313
- https://doi.org/10.1109/72.641453
Abstract
This paper presents a new learning theory (a set of principles for brain-like learning) and a corresponding algorithm for the neural-network field. The learning theory defines computational characteristics that are much more brain-like than those of classical connectionist learning. Robust and reliable learning algorithms would result if these learning principles were followed rigorously when developing neural-network algorithms. This paper also presents a new algorithm for generating radial basis function (RBF) nets for function approximation. The design of the algorithm is based on the proposed set of learning principles. The net generated by this algorithm is not a typical RBF net, but a combination of "truncated" RBF and other types of hidden units. The algorithm uses random clustering and linear programming (LP) to design and train this "mixed" RBF net. Polynomial time complexity of the algorithm is proven, and computational results are provided for the well-known Mackey-Glass chaotic time series problem, the logistic map prediction problem, various neuro-control problems, and several time series forecasting problems. The algorithm can also be implemented as an online adaptive algorithm.
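The two ingredients named in the abstract, random clustering to place RBF centers and linear programming to train the output weights, can be illustrated with a minimal sketch. This is not the paper's algorithm: center selection here is simply a random subset of the training inputs, a single hand-picked Gaussian width replaces the paper's "truncated" mixed units, and the LP shown is a generic Chebyshev (minimum maximum-error) fit, all of which are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy 1-D function-approximation data (a stand-in for samples such as
# those from the Mackey-Glass series used in the paper).
X = np.linspace(0.0, 1.0, 40).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X[:, 0])

# Step 1: "random clustering" -- here reduced to drawing a random subset
# of the training inputs to serve as RBF centers (an assumption; the
# paper's clustering procedure is more elaborate).
n_centers = 8
centers = X[rng.choice(len(X), size=n_centers, replace=False)]
width = 0.2  # shared Gaussian width, chosen by hand for this toy example

# Gaussian design matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2 width^2))
d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
Phi = np.exp(-d2 / (2.0 * width ** 2))

# Step 2: train the output weights w by LP -- minimize the maximum
# absolute residual t subject to |Phi @ w - y| <= t (a Chebyshev fit).
# Decision variables are [w_1 ... w_n, t].
n = n_centers
c = np.zeros(n + 1)
c[-1] = 1.0                                  # objective: minimize t
A_ub = np.block([[Phi, -np.ones((len(y), 1))],    #  Phi w - y <= t
                 [-Phi, -np.ones((len(y), 1))]])  # -(Phi w - y) <= t
b_ub = np.concatenate([y, -y])
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * n + [(0, None)])
w, max_err = res.x[:n], res.x[-1]
print(f"max |residual| = {max_err:.4f}")
```

The LP above has one variable per hidden unit plus one slack for the worst-case error, so its size grows polynomially with the number of samples and centers, which is the kind of guarantee the abstract's polynomial-time claim refers to.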