Estimation of mutual information using kernel density estimators
- 1 September 1995
- research article
- Published by American Physical Society (APS) in Physical Review E
- Vol. 52 (3), 2318-2321
- https://doi.org/10.1103/physreve.52.2318
Abstract
Mutual information is useful for investigating the dependence between two experimental time series. It is often used to establish an appropriate time delay in phase-portrait reconstruction from time-series data. A histogram-based approach has so far been used to estimate the probabilities. It is shown here that kernel density estimation of the probability density functions needed to estimate the average mutual information across two coordinates can be more effective than the histogram method of Fraser and Swinney [Phys. Rev. A 33, 1134 (1986)].
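The idea of replacing histogram probabilities with kernel density estimates can be sketched as follows. This is a minimal illustration, not the paper's exact estimator: it uses Gaussian kernels with SciPy's default (Scott's rule) bandwidth via `gaussian_kde`, and approximates the average mutual information by evaluating the log density ratio at the sample points themselves.

```python
import numpy as np
from scipy.stats import gaussian_kde

def mutual_information_kde(x, y):
    """Estimate I(X;Y) in nats as the sample average of
    log[p(x,y) / (p(x) p(y))], with each density replaced by a
    Gaussian kernel density estimate (default bandwidth)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Joint density estimated from the 2 x N stacked sample.
    p_xy = gaussian_kde(np.vstack([x, y]))(np.vstack([x, y]))
    # Marginal densities estimated from each coordinate separately.
    p_x = gaussian_kde(x)(x)
    p_y = gaussian_kde(y)(y)
    return float(np.mean(np.log(p_xy / (p_x * p_y))))
```

For delay selection in phase-portrait reconstruction, one would evaluate `mutual_information_kde(s[:-tau], s[tau:])` for a scalar series `s` over a range of delays `tau` and take the first local minimum, following the Fraser-Swinney prescription.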
This publication has 16 references indexed in Scilit:
- Direct dynamical test for deterministic chaos and optimal embedding of a chaotic time series. Physical Review E, 1994.
- Predicting physical variables in time-delay embedding. Physical Review E, 1994.
- Kernel flood frequency estimators: Bandwidth selection and kernel choice. Water Resources Research, 1993.
- Mutual information, strange attractors, and the optimal estimation of dimension. Physical Review A, 1992.
- Mutual information functions versus correlation functions. Journal of Statistical Physics, 1990.
- The weather attractor over very short timescales. Nature, 1988.
- Independent coordinates for strange attractors from mutual information. Physical Review A, 1986.
- Nearly one dimensional dynamics in an epidemic. Journal of Theoretical Biology, 1985.
- Low-Dimensional Chaos in a Hydrodynamic System. Physical Review Letters, 1983.
- Observation of a strange attractor. Physica D: Nonlinear Phenomena, 1983.