Statistical information and discrimination
- 1 May 1993
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 39 (3), 1036-1039
- https://doi.org/10.1109/18.256536
Abstract
In analogy with the definition of Shannon information, M.H. DeGroot (1962) defined statistical information as the difference between the prior and posterior risk of a statistical decision problem. Relations are studied between this statistical information and the discrimination functions of information theory known as f-divergences. Using previous results, it is shown that every f-divergence I_f(P,Q) is an average statistical information of a decision problem with dichotomic parameter, 0-1 loss function, and corresponding observation distributions P and Q. The average is taken over a distribution on the parameter's prior probability, and this distribution is uniquely determined by the function f. The main result is that every f-divergence is the statistical information of some properly chosen statistical decision problem and, conversely, that every piece of statistical information is an f-divergence. This provides a new representation of the discrimination functions figuring in signal detection, data compression, coding, pattern classification, cluster analysis, etc.
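The abstract compresses two definitions and one identity; the following is a minimal restatement in notation that is assumed here, not taken from the paper itself. For a convex function $f$ with $f(1) = 0$, the f-divergence of distributions $P$ and $Q$ is

$$ I_f(P,Q) = \int f\!\left(\frac{\mathrm{d}P}{\mathrm{d}Q}\right) \mathrm{d}Q, $$

and DeGroot's statistical information for the dichotomy $\{P, Q\}$ under 0-1 loss and prior probability $\pi$ of $P$ is the prior Bayes risk minus the posterior Bayes risk,

$$ \Delta I_\pi(P,Q) = \min(\pi,\, 1-\pi) \;-\; \int \min\!\big(\pi\,\mathrm{d}P,\; (1-\pi)\,\mathrm{d}Q\big). $$

The averaging result described in the abstract then reads $I_f(P,Q) = \int_0^1 \Delta I_\pi(P,Q)\, \mathrm{d}\gamma_f(\pi)$ for a measure $\gamma_f$ on $(0,1)$ uniquely determined by $f$.

One direction of the correspondence can be checked directly: $\Delta I_\pi$ is itself an f-divergence with $f_\pi(t) = \min(\pi, 1-\pi) - \min(\pi t,\, 1-\pi)$, since $\int \min(\pi\,\mathrm{d}P,\,(1-\pi)\,\mathrm{d}Q) = \int \min(\pi\,\mathrm{d}P/\mathrm{d}Q,\; 1-\pi)\,\mathrm{d}Q$. Below is a short numerical sketch of this identity on a finite alphabet; the function names and example distributions are illustrative, not from the paper:

```python
import numpy as np

def statistical_information(p, q, prior):
    """DeGroot statistical information for the dichotomy {P, Q} with
    0-1 loss: prior Bayes risk minus posterior Bayes risk."""
    prior_risk = min(prior, 1.0 - prior)
    posterior_risk = np.sum(np.minimum(prior * p, (1.0 - prior) * q))
    return prior_risk - posterior_risk

def f_divergence(p, q, f):
    """I_f(P, Q) = sum_x q(x) f(p(x) / q(x)), assuming q > 0 everywhere."""
    return np.sum(q * f(p / q))

# Two strictly positive distributions on a four-letter alphabet.
p = np.array([0.4, 0.3, 0.2, 0.1])
q = np.array([0.1, 0.2, 0.3, 0.4])
pi = 0.3  # prior probability of P

# f_pi is convex with f_pi(1) = 0, as required of an f-divergence.
f_pi = lambda t: min(pi, 1.0 - pi) - np.minimum(pi * t, 1.0 - pi)

assert np.isclose(statistical_information(p, q, pi),
                  f_divergence(p, q, f_pi))  # both equal 0.05 here
```

The converse direction, and the construction of the mixing measure $\gamma_f$ from $f$, are the substance of the paper itself.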
References
- Speech coding based upon vector quantization, IEEE Transactions on Acoustics, Speech, and Signal Processing, 1980
- Divergence measures based on the Shannon entropy, IEEE Transactions on Information Theory, 1991
- Information-theoretic asymptotics of Bayes methods, IEEE Transactions on Information Theory, 1990
- Quantization for decentralized hypothesis testing under communication constraints, IEEE Transactions on Information Theory, 1990
- General entropy criteria for inverse problems, with applications to data compression, pattern classification, and cluster analysis, IEEE Transactions on Information Theory, 1990
- A lower bound on average codeword length of variable length error-correcting codes, IEEE Transactions on Information Theory, 1990
- Divergenzen von Wahrscheinlichkeitsverteilungen — Integralgeometrisch Betrachtet [Divergences of probability distributions from an integral-geometric viewpoint], Acta Mathematica Hungarica, 1981
- Information-theoretical considerations on estimation problems, Information and Control, 1971
- The Divergence and Bhattacharyya Distance Measures in Signal Selection, IEEE Transactions on Communications, 1967
- Uncertainty, Information, and Sequential Experiments, The Annals of Mathematical Statistics, 1962