A Sequential Learning Scheme for Function Approximation Using Minimal Radial Basis Function Neural Networks
- 1 February 1997
- journal article
- Published by MIT Press in Neural Computation
- Vol. 9 (2), 461-478
- https://doi.org/10.1162/neco.1997.9.2.461
Abstract
This article presents a sequential learning algorithm for function approximation and time-series prediction using a minimal radial basis function neural network (RBFNN). The algorithm combines the growth criterion of the resource-allocating network (RAN) of Platt (1991) with a pruning strategy based on the relative contribution of each hidden unit to the overall network output. The resulting network leads toward a minimal topology for the RBFNN. The performance of the algorithm is compared with RAN and the enhanced RAN algorithm of Kadirkamanathan and Niranjan (1993) on the following benchmark problems: (1) hearta from the benchmark problems database PROBEN1, (2) the Hermite polynomial, and (3) the Mackey-Glass chaotic time series. For these problems, the proposed algorithm is shown to realize RBFNNs with far fewer hidden neurons at the same or better accuracy.
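The pruning strategy described above removes hidden units whose contribution to the network output is negligible relative to the most active unit. A minimal sketch of that idea is given below, assuming Gaussian hidden units; the function names, the normalization by the largest contribution, and the threshold `delta` are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def rbf_contributions(x, centers, widths, weights):
    """Per-unit contributions w_k * exp(-||x - c_k||^2 / sigma_k^2)
    of each Gaussian hidden unit to the network output."""
    d2 = np.sum((centers - x) ** 2, axis=1)
    return weights * np.exp(-d2 / widths ** 2)

def prune_candidates(contribs, delta):
    """Normalize each unit's absolute contribution by the largest one;
    units whose relative contribution falls below delta are candidates
    for pruning (in the paper, only after remaining small for a window
    of consecutive observations)."""
    r = np.abs(contribs) / np.max(np.abs(contribs))
    return r < delta
```

For example, a unit centered far from the current input contributes almost nothing and is flagged, while the dominant unit is kept.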
References
- Minimal Topology for a Radial Basis Functions Neural Network for Pattern Classification, Digital Signal Processing, 1994
- A Function Estimation Approach to Sequential Learning with Neural Networks, Neural Computation, 1993
- Hybrid learning algorithm for Gaussian potential function networks, IEE Proceedings D Control Theory and Applications, 1993
- On the training of radial basis function classifiers, Neural Networks, 1992
- A Resource-Allocating Network for Function Interpolation, Neural Computation, 1991
- Fast Learning in Networks of Locally-Tuned Processing Units, Neural Computation, 1989