Maximum smoothed likelihood density estimation
- 1 January 1995
- journal article
- research article
- Published by Taylor & Francis in Journal of Nonparametric Statistics
- Vol. 4 (3), 211-222
- https://doi.org/10.1080/10485259508832613
Abstract
We interpret the kernel estimator for the density as the solution to a maximum smoothed likelihood problem, and show under suitable conditions that it converges to the true density in the sense that the Kullback-Leibler information number of the estimator relative to the true density converges to 0 in probability. We also consider the convergence in probability of the naive estimator for the entropy number of the density. The conditions under which all this happens say essentially that the true density as well as the smoothed true density have finite entropy. The significance of the maximum (smoothed) likelihood set-up is that the relevant random variables are nonnegative, and all we have to do is show that their expected values converge to 0.
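The two estimators the abstract discusses can be sketched numerically. The following is a minimal illustration, not the paper's specification: it uses a Gaussian kernel and a rule-of-thumb bandwidth (both assumptions), forms the kernel density estimate, and evaluates the naive plug-in estimator of the entropy by averaging the negative log of the estimated density at the sample points.

```python
import numpy as np

def kernel_density(x, data, h):
    """Gaussian kernel density estimate f_h(x) built from the sample `data`.

    f_h(x) = (1/(n*h)) * sum_i K((x - X_i)/h), with K the standard
    normal density. Kernel choice is an illustrative assumption.
    """
    u = (x - data[:, None]) / h                      # shape (n, len(x))
    k = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)     # Gaussian kernel values
    return k.mean(axis=0) / h

def naive_entropy(data, h):
    """Naive plug-in estimator of the entropy -E[log f]:
    H_n = -(1/n) * sum_i log f_h(X_i)."""
    f_hat = kernel_density(data, data, h)
    return -np.mean(np.log(f_hat))

rng = np.random.default_rng(0)
sample = rng.standard_normal(2000)
# Silverman's rule-of-thumb bandwidth (an assumption, for illustration only)
h = 1.06 * sample.std() * sample.size ** (-1 / 5)

H = naive_entropy(sample, h)
# For a standard normal the true entropy is 0.5*log(2*pi*e), about 1.42;
# H should be close to this, up to smoothing and sampling error.
```

The plug-in estimator converges here because both the true density and its smoothed version have finite entropy, which is exactly the kind of condition the paper requires.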
This publication has 9 references indexed in Scilit:
- Asymptotic Analysis of Penalized Likelihood and Related Estimators. The Annals of Statistics, 1990
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation. Probability Theory and Related Fields, 1990
- Estimation of entropy and other functionals of a multivariate density. Annals of the Institute of Statistical Mathematics, 1989
- On Kullback-Leibler Loss and Density Estimation. The Annals of Statistics, 1987
- Density-free convergence properties of various estimators of entropy. Computational Statistics & Data Analysis, 1987
- Entropy and the Central Limit Theorem. The Annals of Probability, 1986
- Bayes Estimation. Published by Elsevier, 1983
- On the Estimation of a Probability Density Function by the Maximum Penalized Likelihood Method. The Annals of Statistics, 1982
- Nonparametric Roughness Penalties for Probability Densities. Biometrika, 1971