Kernel orthonormalization in radial basis function neural networks
- 1 September 1997
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 8 (5), 1177-1183
- https://doi.org/10.1109/72.623218
Abstract
This paper deals with optimization of the computations involved in training radial basis function (RBF) neural networks. The main contribution of the reported work is a method for calculating the network weights, in which the key idea is to transform the RBF kernels into an orthonormal set of functions using standard Gram-Schmidt orthogonalization. This significantly reduces the computing time when a training scheme that adds one kernel hidden node at a time to improve network performance is adopted. Another property of the method is that, after the RBF network weights are computed, the original network structure can be restored. A further strength is that the proposed computing task can be decomposed into a number of parallel subtasks, yielding additional savings in computing time. The proposed weight calculation technique also has low storage requirements. These features make the method attractive for hardware implementation. The paper presents a detailed derivation of the proposed weight calculation procedure and demonstrates its validity for RBF network training on a number of data classification and function approximation problems.
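The paper's derivation is not reproduced on this page, but the scheme the abstract describes (Gram-Schmidt orthonormalization of the kernel outputs, one-node-at-a-time growth, and recovery of the original network weights by back-substitution) can be sketched as follows. This is a minimal NumPy illustration, assuming Gaussian kernels, a squared-error criterion, and an illustrative stopping tolerance; the function names (`train_rbf_orthonormal`, `gaussian_kernel`) and all parameter choices are hypothetical, not taken from the paper.

```python
import numpy as np


def gaussian_kernel(X, center, width):
    """Activation of one Gaussian RBF hidden node over all samples in X."""
    return np.exp(-np.sum((X - center) ** 2, axis=1) / (2.0 * width ** 2))


def train_rbf_orthonormal(X, y, centers, width, tol=1e-3):
    """Grow an RBF network one kernel node at a time in an orthonormal basis.

    Each candidate kernel column is orthonormalized against the columns
    already accepted (modified Gram-Schmidt), so its weight in the new
    basis is a single dot product and the residual-error update is cheap.
    Returns the accepted centers and the weights of the original,
    non-orthonormal network, recovered by back-substitution.
    """
    Q = []        # orthonormal basis vectors, one per accepted node
    R_cols = []   # columns of the upper-triangular change-of-basis matrix
    g = []        # target projections (weights in the orthonormal basis)
    kept = []     # indices of accepted centers
    residual = y.astype(float).copy()

    for i, c in enumerate(centers):
        phi = gaussian_kernel(X, c, width)      # new kernel column
        v = phi.copy()
        col = []
        for q in Q:                             # strip components along Q
            coef = q @ v
            v -= coef * q
            col.append(coef)
        norm = np.linalg.norm(v)
        if norm < 1e-12:                        # kernel linearly dependent
            continue
        col.append(norm)
        q_new = v / norm
        g_new = q_new @ residual                # weight in orthonormal basis
        Q.append(q_new)
        R_cols.append(col)
        g.append(g_new)
        kept.append(i)
        residual -= g_new * q_new               # incremental error update
        if np.linalg.norm(residual) <= tol * np.linalg.norm(y):
            break

    # Restore the original network structure: solve R w = g, where
    # Phi[:, kept] = Q R and w are the weights on the original kernels.
    m = len(Q)
    R = np.zeros((m, m))
    for k, col in enumerate(R_cols):
        R[: k + 1, k] = col
    w = np.linalg.solve(R, np.array(g))         # back-substitution step
    return [centers[i] for i in kept], w


# Toy usage (illustrative): approximate sin(x) on [0, 2*pi].
X = np.linspace(0, 2 * np.pi, 200).reshape(-1, 1)
y = np.sin(X).ravel()
centers, w = train_rbf_orthonormal(X, y, X[::10], width=0.5)
y_hat = sum(wk * gaussian_kernel(X, c, 0.5) for wk, c in zip(w, centers))
print("max error:", np.abs(y - y_hat).max())
```

Because each accepted column is orthonormal to its predecessors, adding a node never disturbs the weights already found, which is what makes the incremental growth (and a parallel split of the candidate evaluations) inexpensive.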