Estimating mutual information
- Research article (open access), published 23 June 2004 by the American Physical Society (APS) in Physical Review E
- Vol. 69 (6), 066138
- https://doi.org/10.1103/physreve.69.066138
Abstract
We present two classes of improved estimators for mutual information M(X,Y), from samples of random points distributed according to some joint probability density μ(x,y). In contrast to conventional estimators based on binnings, they are based on entropy estimates from k-nearest neighbor distances. This means that they are data efficient (with k=1 we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias. Indeed, the bias of the underlying entropy estimates is mainly due to nonuniformity of the density at the smallest resolved scale, giving typically systematic errors which scale as functions of k/N for N points. Numerically, we find that both families become exact for independent distributions, i.e. the estimator M̂(X,Y) vanishes (up to statistical fluctuations) if μ(x,y) = μ(x)μ(y). This holds for all tested marginal distributions and for all dimensions of x and y. In addition, we give estimators for redundancies between more than two random variables. We compare our algorithms in detail with existing algorithms. Finally, we demonstrate the usefulness of our estimators for assessing the actual independence of components obtained from independent component analysis (ICA), for improving ICA, and for estimating the reliability of blind source separation.
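The first of the two estimator families reduces, per sample point, to counting neighbors: find the distance ε(i) to the k-th nearest neighbor of point i in the joint (x,y) space under the max-norm, count the points n_x(i) and n_y(i) lying strictly within ε(i) in each marginal space, and average, giving I(X,Y) ≈ ψ(k) + ψ(N) − ⟨ψ(n_x+1) + ψ(n_y+1)⟩ with ψ the digamma function. A minimal Python sketch of this recipe follows; it assumes NumPy and SciPy, and the function name `ksg_mi` and the small tolerance used to enforce the strict inequality in the neighbor counts are choices of this sketch, not prescribed by the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """k-nearest-neighbor MI estimate:
    I(X,Y) ~ psi(k) + psi(N) - < psi(n_x+1) + psi(n_y+1) >."""
    x = np.asarray(x, float).reshape(len(x), -1)
    y = np.asarray(y, float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])

    # Distance to the k-th nearest neighbor in the joint space
    # (max-norm); the query point itself sits at index 0.
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]

    # Count marginal points strictly inside eps(i); the tolerance
    # approximates the strict inequality, and -1 drops the point itself.
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    n_x = [len(tree_x.query_ball_point(pt, r - 1e-12, p=np.inf)) - 1
           for pt, r in zip(x, eps)]
    n_y = [len(tree_y.query_ball_point(pt, r - 1e-12, p=np.inf)) - 1
           for pt, r in zip(y, eps)]

    return (digamma(k) + digamma(n)
            - np.mean(digamma(np.array(n_x) + 1) + digamma(np.array(n_y) + 1)))

if __name__ == "__main__":
    # Correlated Gaussians: exact MI is -0.5*log(1 - rho**2) ~ 0.223 nats.
    rng = np.random.default_rng(0)
    rho, n = 0.6, 2000
    x = rng.standard_normal(n)
    y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
    print(ksg_mi(x, y, k=3))
```

Correlated Gaussians, for which the mutual information is known in closed form, make a convenient consistency check; they are also among the test distributions examined in the paper.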