Estimation of the entropy and information of absolutely continuous random variables
- 1 January 1989
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 35 (1) , 193-196
- https://doi.org/10.1109/18.42194
Abstract
A method is proposed for estimating the entropy and the mutual information of absolutely continuous random vectors, and an upper bound on the mean risks of the proposed estimators under strong mixing conditions is given.
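The paper's own construction is not reproduced here, but the general plug-in idea it belongs to is easy to illustrate: estimate the unknown density f, then estimate the differential entropy H(X) = -E[log f(X)] by averaging -log f_hat(X_i) over the sample, and obtain mutual information as I(X;Y) = H(X) + H(Y) - H(X,Y). The following is a minimal Python sketch of that generic approach using a Gaussian kernel density estimate; it is not the paper's estimator, and the function names are chosen for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

def entropy_plugin(samples: np.ndarray) -> float:
    """Plug-in (resubstitution) estimate of differential entropy H(X).

    samples : (n, d) array of n observations of a d-dimensional vector.
    A Gaussian KDE stands in for the unknown density f; the estimate
    is the sample mean of -log f_hat(X_i).
    """
    x = np.asarray(samples, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    kde = gaussian_kde(x.T)   # gaussian_kde expects shape (d, n)
    dens = kde(x.T)           # f_hat evaluated at the sample points
    return float(-np.mean(np.log(dens)))

def mutual_information_plugin(x: np.ndarray, y: np.ndarray) -> float:
    """I(X; Y) = H(X) + H(Y) - H(X, Y), each term estimated by plug-in."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    joint = np.hstack([x, y])
    return entropy_plugin(x) + entropy_plugin(y) - entropy_plugin(joint)

if __name__ == "__main__":
    # Correlated bivariate Gaussian: true I(X;Y) = -0.5 * log(1 - rho^2)
    rng = np.random.default_rng(0)
    rho = 0.8
    data = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=5000)
    est = mutual_information_plugin(data[:, 0], data[:, 1])
    true = -0.5 * np.log(1 - rho**2)
    print(f"estimated I = {est:.3f}, true I = {true:.3f}")
```

In the Gaussian test case the plug-in estimate can be checked against the closed-form value; the paper's contribution concerns the risk of such estimators under strong mixing (i.e., weakly dependent) observations, which this i.i.d. sketch does not address.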
This publication has 7 references indexed in Scilit:
- Entropy-Based Tests of Uniformity. Journal of the American Statistical Association, 1981
- A Nonparametric Estimation of the Entropy for Absolutely Continuous Distributions (Corresp.). IEEE Transactions on Information Theory, 1976
- On the Estimation of Functionals of the Probability Density and Its Derivatives. Theory of Probability and Its Applications, 1974
- Some Moments of an Estimate of Shannon's Measure of Information. Communications in Statistics - Theory and Methods, 1974
- On a Statistical Estimate for the Entropy of a Sequence of Independent Random Variables. Theory of Probability and Its Applications, 1959
- Remarks on Some Nonparametric Estimates of a Density Function. The Annals of Mathematical Statistics, 1956
- A Central Limit Theorem and a Strong Mixing Condition. Proceedings of the National Academy of Sciences, 1956