On Gaussian radial basis function approximations: interpretation, extensions, and learning strategies
- 11 November 2002
- proceedings article
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 2, 618-621
- https://doi.org/10.1109/icpr.2000.906151
Abstract
In this paper we focus on an interpretation of Gaussian radial basis functions (GRBF) which motivates extensions and learning strategies. Specifically, we show that GRBF regression equations naturally result from representing the input-output joint probability density function by a finite mixture of Gaussians. Corollaries of this interpretation are: some special forms of GRBF representations can be traced back to the type of Gaussian mixture used; previously proposed learning methods based on input-output clustering have a new meaning; finally, estimation techniques for finite mixtures (namely the EM algorithm, and model selection criteria) can be invoked to learn GRBF regression equations.
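The abstract's central observation can be sketched numerically: if the joint density of (x, y) is a finite Gaussian mixture with diagonal covariances, the conditional mean E[y | x] reduces to a normalized Gaussian RBF expansion, with the mixture's input means as centers and its output means as output weights. The parameters below are illustrative stand-ins, not values learned by the paper's method:

```python
import numpy as np

# Illustrative parameters of a 3-component diagonal Gaussian mixture
# on the joint (x, y) space; in practice these would be fitted by EM.
weights = np.array([0.5, 0.3, 0.2])    # mixing proportions w_k
mu_x    = np.array([-1.0, 0.0, 2.0])   # input means (RBF centers)
mu_y    = np.array([1.0, -0.5, 3.0])   # output means (RBF output weights)
sig_x   = np.array([0.5, 0.8, 0.6])    # input standard deviations

def grbf_predict(x):
    """E[y | x] under the diagonal mixture: a normalized Gaussian RBF."""
    x = np.atleast_1d(x)[:, None]                    # shape (n, 1)
    # w_k * N(x; mu_x_k, sig_x_k^2), up to a common constant factor
    act = weights * np.exp(-0.5 * ((x - mu_x) / sig_x) ** 2) / sig_x
    # Responsibility-weighted average of the component output means
    return (act * mu_y).sum(axis=1) / act.sum(axis=1)

print(grbf_predict([-1.0, 2.0]))  # each prediction pulled toward the
                                  # mu_y of the nearest component in x
```

The normalization by the summed kernel activations is what distinguishes this form from a plain GRBF expansion; it corresponds to the special GRBF forms the abstract says can be traced back to the type of mixture used.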