A nonparametric estimation of the entropy for absolutely continuous distributions (Corresp.)
- 1 May 1976
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 22 (3) , 372-375
- https://doi.org/10.1109/tit.1976.1055550
Abstract
Let F(x) be an absolutely continuous distribution having a density function f(x) with respect to the Lebesgue measure. The Shannon entropy is defined as H(f) = -\int f(x) \ln f(x)\, dx. In this correspondence we propose, based on a random sample X_{1}, \cdots, X_{n} generated from F, a nonparametric estimate of H(f) given by \hat{H}(f) = -(1/n) \sum_{i=1}^{n} \ln \hat{f}(X_{i}), where \hat{f}(x) is the kernel estimate of f due to Rosenblatt and Parzen. Regularity conditions are obtained under which the first and second mean consistencies of \hat{H}(f) are established. These conditions are mild and easily satisfied. Examples, such as the Gamma, Weibull, and normal distributions, are considered.
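The estimator described in the abstract can be sketched in a few lines of NumPy: build the Rosenblatt–Parzen kernel density estimate \hat{f}, evaluate it at the sample points, and average the negative log-densities. This is a minimal illustration, not the authors' code; the Gaussian kernel and the Silverman rule-of-thumb bandwidth are assumptions made here for concreteness, since the correspondence leaves the kernel and bandwidth general (subject to its regularity conditions).

```python
import numpy as np

def kernel_density(x, data, h):
    """Rosenblatt-Parzen estimate f_hat(x) with a Gaussian kernel and bandwidth h."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2.0 * np.pi))

def entropy_estimate(data, h):
    """Resubstitution estimate H_hat(f) = -(1/n) * sum_i ln f_hat(X_i)."""
    f_hat = kernel_density(data, data, h)
    return -np.mean(np.log(f_hat))

rng = np.random.default_rng(0)
n = 2000
sample = rng.standard_normal(n)               # X_1, ..., X_n from N(0, 1)
h = 1.06 * sample.std() * n ** (-1 / 5)       # Silverman rule of thumb (assumed choice)
H_hat = entropy_estimate(sample, h)
H_true = 0.5 * np.log(2.0 * np.pi * np.e)     # exact entropy of N(0, 1), about 1.4189
```

For the standard normal (one of the examples the correspondence considers), the estimate H_hat should land close to the closed-form value 0.5 ln(2πe); the residual gap reflects the finite-sample bias of the resubstitution estimator.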
This publication has 4 references indexed in Scilit:
- Some moments of an estimate of Shannon's measure of information — Communications in Statistics - Theory and Methods, 1974
- On Estimation of a Probability Density Function and Mode — The Annals of Mathematical Statistics, 1962
- On a Statistical Estimate for the Entropy of a Sequence of Independent Random Variables — Theory of Probability and Its Applications, 1959
- Remarks on Some Nonparametric Estimates of a Density Function — The Annals of Mathematical Statistics, 1956