Predictability and Information Theory. Part I: Measures of Predictability
Open Access
- 1 October 2004
- journal article
- Published by American Meteorological Society in Journal of the Atmospheric Sciences
- Vol. 61 (20), 2425-2440
- https://doi.org/10.1175/1520-0469(2004)061<2425:paitpi>2.0.co;2
Abstract
This paper gives an introduction to the connection between predictability and information theory, and derives new connections between these concepts. A system is said to be unpredictable if the forecast distribution, which gives the most complete description of the future state based on all available knowledge, is identical to the climatological distribution, which describes the state in the absence of time lag information. It follows that a necessary condition for predictability is for the forecast and climatological distributions to differ. Information theory provides a powerful framework for quantifying the difference between two distributions that agrees with intuition about predictability. Three information theoretic measures have been proposed in the literature: predictive information, relative entropy, and mutual information. These metrics are discussed with the aim of clarifying their similarities and differences. All three metrics have attractive properties for defining predictability, i...
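For reference, the three measures named in the abstract have standard information-theoretic definitions. The LaTeX sketch below records them in generic notation, with p(x | i) denoting the forecast distribution given initial condition i and p(x) the climatological distribution; this notation is assumed for illustration and is not copied verbatim from the paper.

```latex
% Sketch of the standard definitions of the three predictability measures
% discussed in the abstract. Notation (p(x|i), p(x), p(i)) is assumed here,
% not taken from the paper itself.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

Predictive information (entropy of climatology minus entropy of the forecast):
\begin{equation}
  P_i = -\int p(x)\,\log p(x)\,dx \;+\; \int p(x \mid i)\,\log p(x \mid i)\,dx .
\end{equation}

Relative entropy (Kullback--Leibler divergence of the forecast distribution
from the climatological distribution):
\begin{equation}
  R_i = \int p(x \mid i)\,\log\frac{p(x \mid i)}{p(x)}\,dx .
\end{equation}

Mutual information between the initial condition and the future state,
which equals the average of $R_i$ over initial conditions:
\begin{equation}
  M = \iint p(x, i)\,\log\frac{p(x, i)}{p(x)\,p(i)}\,dx\,di
    = \int p(i)\,R_i\,di .
\end{equation}

\end{document}
```

All three quantities vanish exactly when the forecast and climatological distributions coincide, consistent with the abstract's statement that a necessary condition for predictability is that these two distributions differ.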