Measuring Information Transfer
- Research article, published 10 July 2000 by the American Physical Society (APS) in Physical Review Letters
- Vol. 85, No. 2, pp. 461-464
- https://doi.org/10.1103/physrevlett.85.461
Abstract
An information-theoretic measure is derived that quantifies the statistical coherence between systems evolving in time. The standard time-delayed mutual information fails to distinguish information that is actually exchanged from shared information due to common history and input signals. In our new approach, these influences are excluded by appropriate conditioning of transition probabilities. The resulting transfer entropy effectively distinguishes driving from responding elements and detects asymmetry in the interaction of subsystems.
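The conditioning described in the abstract can be illustrated with a minimal plug-in estimator for discrete time series: transfer entropy T(Y -> X) compares how well x(t+1) is predicted from (x(t), y(t)) versus from x(t) alone. The sketch below assumes history length one in each variable and a simple frequency-count estimate of the transition probabilities; it is an illustration, not the paper's own numerical procedure.

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate of T(Y -> X) in bits for discrete sequences,
    using history length 1 for both the source and the target."""
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # counts of (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # counts of (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # counts of (x_{t+1}, x_t)
    singles = Counter(x[:-1])                       # counts of x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n                                  # p(x_{t+1}, x_t, y_t)
        p_cond_full = c / pairs_xy[(x0, y0)]             # p(x_{t+1} | x_t, y_t)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{t+1} | x_t)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy example: X copies Y with a one-step lag, so information flows Y -> X.
y = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
x = [0] + y[:-1]
te_yx = transfer_entropy(x, y)  # T(Y -> X): large, Y drives X
te_xy = transfer_entropy(y, x)  # T(X -> Y): smaller
```

Unlike time-delayed mutual information, the estimate is asymmetric: for the lagged-copy example above, T(Y -> X) exceeds T(X -> Y), identifying Y as the driver.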