Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks
- 1 September 1999
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 10 (5) , 1239-1243
- https://doi.org/10.1109/72.788663
Abstract
This paper presents a two-level learning method for radial basis function (RBF) networks. A regularized orthogonal least squares (ROLS) algorithm is employed at the lower level to construct RBF networks, while the two key learning parameters, the regularization parameter and the RBF width, are optimized using a genetic algorithm (GA) at the upper level. Nonlinear time series modeling and prediction is used as an example to demonstrate the effectiveness of this hierarchical learning approach.
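The hierarchical scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the lower level here fits RBF weights by plain regularized least squares on a fixed grid of centres (a stand-in for the ROLS subset-selection step), and the upper level is a simple truncation-selection GA over the two hyperparameters; all function names, parameter ranges, and the toy time-series task are assumptions.

```python
# Hypothetical two-level sketch: a GA searches over (regularization
# parameter, RBF width); the lower level fits an RBF model by regularized
# least squares and reports validation error as the GA fitness.
import numpy as np

rng = np.random.default_rng(0)

def rbf_design(X, centres, width):
    """Gaussian RBF design matrix for 1-D inputs."""
    d2 = (X[:, None] - centres[None, :]) ** 2
    return np.exp(-d2 / (2.0 * width ** 2))

def lower_level_fit(Xtr, ytr, Xval, yval, lam, width, centres):
    """Lower level: regularized least-squares weights; return validation MSE."""
    P = rbf_design(Xtr, centres, width)
    w = np.linalg.solve(P.T @ P + lam * np.eye(P.shape[1]), P.T @ ytr)
    resid = yval - rbf_design(Xval, centres, width) @ w
    return float(np.mean(resid ** 2))

def ga_optimize(Xtr, ytr, Xval, yval, centres, pop=20, gens=30):
    """Upper level: GA over (log10 lambda, log10 width)."""
    lo, hi = np.array([-6.0, -2.0]), np.array([0.0, 1.0])  # assumed search box
    P = rng.uniform(lo, hi, size=(pop, 2))
    for _ in range(gens):
        fit = np.array([lower_level_fit(Xtr, ytr, Xval, yval,
                                        10 ** p[0], 10 ** p[1], centres)
                        for p in P])
        elite = P[np.argsort(fit)[: pop // 2]]       # truncation selection
        # crossover: average random parent pairs, then Gaussian mutation
        pa = elite[rng.integers(0, len(elite), pop)]
        pb = elite[rng.integers(0, len(elite), pop)]
        P = np.clip((pa + pb) / 2 + rng.normal(0, 0.1, (pop, 2)), lo, hi)
        P[0] = elite[0]                              # elitism
    lam, width = 10 ** P[0][0], 10 ** P[0][1]
    return lam, width, lower_level_fit(Xtr, ytr, Xval, yval, lam, width, centres)

# Toy nonlinear time-series task: predict x[t] from x[t-1]
x = np.sin(np.linspace(0, 20, 400)) + 0.05 * rng.normal(size=400)
X, y = x[:-1], x[1:]
Xtr, ytr, Xval, yval = X[:300], y[:300], X[300:], y[300:]
centres = np.linspace(X.min(), X.max(), 15)
lam, width, mse = ga_optimize(Xtr, ytr, Xval, yval, centres)
print(f"lambda={lam:.2e}  width={width:.3f}  val MSE={mse:.4f}")
```

The design point this illustrates is the division of labour: each GA fitness evaluation is cheap because the inner problem is linear in the weights once the regularization parameter and width are fixed, so the expensive combinatorial work (centre selection via ROLS in the paper) sits inside a fast closed-form or greedy solve.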