Information-theoretic asymptotics of Bayes methods
- 1 May 1990
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 36 (3), 453-471
- https://doi.org/10.1109/18.54897
Abstract
In the absence of knowledge of the true density function, Bayesian models take the joint density function for a sequence of n random variables to be an average of densities with respect to a prior. The authors examine the relative entropy distance D_n between the true density and the Bayesian density and show that the asymptotic distance is (d/2)(log n) + c, where d is the dimension of the parameter vector. Therefore, the relative entropy rate D_n/n converges to zero at rate (log n)/n. The constant c, which the authors explicitly identify, depends only on the prior density function and the Fisher information matrix evaluated at the true parameter value. Consequences are given for density estimation, universal data compression, composite hypothesis testing, and stock-market portfolio selection.
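The expansion behind the abstract's statement can be written out as a worked equation. The sketch below quotes the form of the constant c as this result is usually stated; the notation w (prior density), I(θ) (Fisher information matrix), and m_n (Bayes mixture density) is assumed here rather than taken from the abstract, and the exact regularity conditions are given in the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Asymptotic expansion of D_n = D(p_\theta^n \| m_n), the relative entropy
% between the true n-fold density and the Bayes mixture, as this result is
% usually quoted. Assumed notation (not in the abstract): w is the prior
% density and I(\theta) the Fisher information matrix, both evaluated at
% the true parameter; the paper states the precise conditions.
\[
  D_n = \frac{d}{2}\log\frac{n}{2\pi e}
      + \frac{1}{2}\log\det I(\theta)
      - \log w(\theta) + o(1),
\]
% which matches the abstract's D_n = (d/2) log n + c with
% c = -(d/2) log(2\pi e) + (1/2) log det I(\theta) - log w(\theta),
% so that the rate D_n / n tends to zero like (log n)/n.
\end{document}
```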