Abstract
The second-order asymptotic theory of predictive distributions is investigated. It is shown that estimative distributions based on asymptotically efficient estimators can be improved upon by predictive distributions that do not belong to the model, which is assumed to be a multidimensional curved exponential family. Such predictive distributions are constructed by shifting estimative distributions in a direction orthogonal to the model. The average Kullback-Leibler divergence from the true distribution to a predictive distribution is represented as the sum of two components: one depends on the choice of the estimative distribution and the other on the shift orthogonal to the model. The optimal orthogonal shift is obtained. A quantity that can be regarded as the mixture mean curvature of the model in the space of all probability distributions is introduced: the difference in average Kullback-Leibler divergence between an estimative distribution and the optimal predictive distribution obtained by shifting it equals this mixture mean curvature. An asymptotic expression for Bayesian predictive distributions is also obtained.
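As a rough illustration of the decomposition described above, consider the following schematic expansion (a minimal sketch in generic notation; the symbols $d$, $n$, $C_{\mathrm{est}}$, and $C_{\perp}$ are illustrative and not taken from the paper):

$$
E_{\theta}\!\left[\,D\!\left(p_{\theta}\,\|\,\hat{p}_{n}\right)\right]
= \frac{d}{2n} + \frac{1}{2n^{2}}\left(C_{\mathrm{est}} + C_{\perp}\right) + o\!\left(n^{-2}\right),
$$

where $D(p\,\|\,q)$ denotes the Kullback-Leibler divergence from the true distribution $p_{\theta}$ to a predictive distribution $\hat{p}_{n}$ based on $n$ observations, and $d$ is the model dimension. In this sketch, $C_{\mathrm{est}}$ depends only on the choice of asymptotically efficient estimator, while $C_{\perp}$ depends only on the shift orthogonal to the model; minimizing $C_{\perp}$ over orthogonal shifts yields the optimal predictive distribution, and the resulting reduction relative to the unshifted estimative distribution corresponds to the mixture mean curvature term.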
