Statistical validation of mutual information calculations: Comparison of alternative numerical algorithms
- 22 June 2005
- journal article (research article)
- Published by American Physical Society (APS) in Physical Review E
- Vol. 71 (6), 066208
- https://doi.org/10.1103/physreve.71.066208
Abstract
Given two time series X and Y, their mutual information, I(X,Y) = I(Y,X), is the average number of bits of X that can be predicted by measuring Y, and vice versa. In the analysis of observational data, calculation of mutual information occurs in three contexts: identification of nonlinear correlation, determination of an optimal sampling interval (particularly when embedding data), and the investigation of causal relationships with directed mutual information. In this contribution, a minimum description length argument is used to determine the optimal number of elements to use when characterizing the distributions of X and Y. However, even when using partitions of the X and Y axes indicated by minimum description length, mutual information calculations performed with a uniform partition of the XY plane can give misleading results. This motivated the construction of an algorithm for calculating mutual information that uses an adaptive partition. This algorithm also incorporates an explicit test of the statistical independence of X and Y in a calculation that returns an assessment of the corresponding null hypothesis. The previously published Fraser-Swinney algorithm for calculating mutual information includes a sophisticated procedure for local adaptive control of the partitioning process. When the Fraser-Swinney algorithm and the algorithm constructed here are compared, they give very similar numerical results (less than 4% difference in a typical application). Detailed comparisons are possible when X and Y are correlated jointly Gaussian distributed, because an analytic expression for I(X,Y) can be derived for that case. Based on these tests, three conclusions can be drawn. First, the algorithm constructed here has an advantage over the Fraser-Swinney algorithm in providing an explicit calculation of the probability of the null hypothesis that X and Y are independent. Second, the Fraser-Swinney algorithm is marginally the more accurate of the two algorithms when large data sets are used. With smaller data sets, however, the Fraser-Swinney algorithm reports structures that disappear when more data are available. Third, the algorithm constructed here requires about 0.5% of the computation time required by the Fraser-Swinney algorithm.
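For orientation, the sketch below is a minimal plug-in estimator of I(X,Y) in bits on a uniform partition of the XY plane, paired with a standard Pearson chi-squared test of the independence null hypothesis. It is not the paper's adaptive-partition algorithm: the function name `mutual_information_uniform` and the fixed bin count `n_bins` (standing in for the MDL-selected partition size) are illustrative assumptions, and the chi-squared test here is the generic textbook version rather than the paper's exact procedure. For correlated jointly Gaussian X and Y with correlation coefficient ρ, the standard analytic value I(X,Y) = -(1/2) log₂(1 - ρ²) bits provides the reference used in the usage example.

```python
import numpy as np
from scipy.stats import chi2

def mutual_information_uniform(x, y, n_bins=16):
    """Plug-in MI estimate (bits) on a uniform n_bins x n_bins partition,
    plus the p-value of a Pearson chi-squared test of independence."""
    counts, _, _ = np.histogram2d(x, y, bins=n_bins)
    n = counts.sum()
    pxy = counts / n
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x, shape (n_bins, 1)
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y, shape (1, n_bins)
    pxpy = px @ py                        # joint distribution under independence
    nz = pxy > 0
    mi_bits = np.sum(pxy[nz] * np.log2(pxy[nz] / pxpy[nz]))

    # Pearson chi-squared statistic over cells with nonzero expected count.
    expected = n * pxpy
    pos = expected > 0
    stat = np.sum((counts[pos] - expected[pos]) ** 2 / expected[pos])
    dof = (n_bins - 1) ** 2
    p_null = chi2.sf(stat, dof)           # probability under the independence null
    return mi_bits, p_null

# Correlated jointly Gaussian test case; the analytic value is
# I(X,Y) = -0.5 * log2(1 - rho**2) ≈ 1.198 bits for rho = 0.9.
rng = np.random.default_rng(0)
rho = 0.9
x = rng.standard_normal(100_000)
y = rho * x + np.sqrt(1 - rho ** 2) * rng.standard_normal(100_000)
mi, p = mutual_information_uniform(x, y)
print(f"estimated MI = {mi:.3f} bits, P(independence) = {p:.2e}")
```

Rerunning the sketch with different values of `n_bins` makes the abstract's point concrete: the uniform-partition estimate drifts with the partition size, which is the sensitivity that the MDL partition-size argument and the adaptive partition are meant to remove.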
References
- Comparative study of embedding methods, Physical Review E, 2003
- Comment on “Performance of different synchronization measures in real data: A case study on electroencephalographic signals”, Physical Review E, 2003
- Estimation of mutual information using kernel density estimators, Physical Review E, 1995
- Embedology, Journal of Statistical Physics, 1991
- Information and entropy in strange attractors, IEEE Transactions on Information Theory, 1989
- Interdependence of EEG signals: Linear vs. nonlinear associations and the significance of time delays and phase shifts, Brain Topography, 1989
- Generalized dimensions and entropies from a measured time series, Physical Review A, 1987
- Ergodic theory of chaos and strange attractors, Reviews of Modern Physics, 1985
- Investigating Causal Relations by Econometric Models and Cross-spectral Methods, Econometrica, 1969
- Some Methods for Strengthening the Common χ² Tests, Published by JSTOR, 1954