Shared kernel models for class conditional density estimation
- 1 September 2001
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 12 (5), 987-997
- https://doi.org/10.1109/72.950129
Abstract
We present probabilistic models which are suitable for class conditional density estimation and can be regarded as shared kernel models, where sharing means that each kernel may contribute to the estimation of the conditional densities of all classes. We first propose a model that constitutes an adaptation of the classical radial basis function (RBF) network (with full sharing of kernels among classes) whose outputs represent class conditional densities. At the opposite extreme lies the separate mixtures model, in which the density of each class is estimated using a separate mixture density (no sharing of kernels among classes). We then present a general model that allows intermediate cases to be expressed, with the degree of kernel sharing specified through an extra model parameter; this general model encompasses both of the above models as special cases. In all proposed models the training process is treated as a maximum likelihood problem, and expectation-maximization (EM) algorithms have been derived for adjusting the model parameters.
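
As a rough illustration of the fully shared (RBF-like) special case described above, the sketch below runs EM for a model of the form p(x | k) = Σ_j π_{kj} N(x; μ_j, σ_j² I), where the Gaussian kernels are shared across all classes and only the mixing weights π_{kj} are class specific. This is a minimal sketch under assumptions of our own (spherical covariances, random initialization, the function and variable names); it is not the paper's implementation.

```python
import numpy as np

def fit_shared_kernel_model(X, y, n_kernels=5, n_classes=None, n_iter=100, seed=0):
    """EM for a fully shared-kernel class-conditional density model.

    p(x | k) = sum_j pi[k, j] * N(x; mu[j], sigma2[j] * I)
    The kernels (mu, sigma2) are shared by all classes; only the mixing
    weights pi are class specific (the RBF-like special case).
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    K = n_classes or int(y.max()) + 1
    M = n_kernels

    # Initialise kernel centres from random data points, uniform weights.
    mu = X[rng.choice(N, M, replace=False)].copy()
    sigma2 = np.full(M, X.var())
    pi = np.full((K, M), 1.0 / M)

    for _ in range(n_iter):
        # E-step: responsibility of kernel j for point n, using the
        # mixing weights of that point's own class label.
        diff = X[:, None, :] - mu[None, :, :]                  # (N, M, d)
        sq = (diff ** 2).sum(-1)                               # (N, M)
        log_phi = -0.5 * (sq / sigma2 + d * np.log(2 * np.pi * sigma2))
        log_w = np.log(pi[y] + 1e-12) + log_phi                # (N, M)
        log_w -= log_w.max(axis=1, keepdims=True)
        r = np.exp(log_w)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: class-specific weights, then the shared kernels.
        for k in range(K):
            mask = (y == k)
            pi[k] = r[mask].sum(axis=0) / max(mask.sum(), 1)
        Nj = r.sum(axis=0) + 1e-12
        mu = (r.T @ X) / Nj[:, None]
        sq = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        sigma2 = (r * sq).sum(axis=0) / (d * Nj) + 1e-8

    return pi, mu, sigma2
```

Classification with such a model would follow Bayes' rule, p(k | x) ∝ P(k) p(x | k), with class priors P(k) estimated from the training frequencies.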