Convergence of Best Entropy Estimates

Abstract
Given a finite number of moments of an unknown density $\bar x$ on a finite measure space, the best entropy estimate—that nonnegative density $x$ with the given moments which minimizes the Boltzmann–Shannon entropy $I(x) := \int x \log x$—is considered. A direct proof is given that $I$ has the Kadec property in $L_1$: if $y_n$ converges weakly to $\bar y$ and $I(y_n)$ converges to $I(\bar y)$, then $y_n$ converges to $\bar y$ in norm. As a corollary, it is obtained that, as the number of given moments increases, the best entropy estimates converge in $L_1$ norm to the best entropy estimate of the limiting problem, which is simply $\bar x$ in the determined case. Furthermore, for classical moment problems on intervals with $\bar x$ strictly positive and sufficiently smooth, error bounds and uniform convergence are actually obtained.
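By way of illustration (this sketch is not from the paper), a best entropy estimate with moment functions $b_j$ can be computed numerically through the standard concave dual $\max_\lambda \{\lambda \cdot \mu - \int \exp(\lambda \cdot b(t) - 1)\,dt\}$, whose maximizer $\lambda$ yields the primal solution $x = \exp(\lambda \cdot b - 1)$. In the minimal Python sketch below, the quadrature grid, the test density, and names such as `true_density` and `n_moments` are illustrative assumptions, not the paper's setup.

```python
# A minimal sketch (assumed setup, not from the paper): recover a density
# on [0, 1] from its first n power moments by maximizing the concave dual
# of the Boltzmann-Shannon entropy problem.
import numpy as np
from scipy.optimize import minimize
from scipy.integrate import trapezoid

t = np.linspace(0.0, 1.0, 2001)                    # quadrature grid on [0, 1]
true_density = 1.0 + 0.5 * np.cos(2 * np.pi * t)   # stand-in for the unknown \bar x
n_moments = 6
B = np.vstack([t**j for j in range(n_moments)])    # moment functions b_j(t) = t^j
mu = trapezoid(B * true_density, t, axis=1)        # the given moments of \bar x

def neg_dual(lam):
    # Negative of the dual objective  lam.mu - \int exp(lam.b(t) - 1) dt.
    x = np.exp(lam @ B - 1.0)
    return trapezoid(x, t) - lam @ mu

def neg_dual_grad(lam):
    # Gradient: moments of exp(lam.b - 1) minus the given moments mu.
    x = np.exp(lam @ B - 1.0)
    return trapezoid(B * x, t, axis=1) - mu

res = minimize(neg_dual, np.zeros(n_moments), jac=neg_dual_grad, method="BFGS")
x_hat = np.exp(res.x @ B - 1.0)                    # the best entropy estimate

print("L1 error:", trapezoid(np.abs(x_hat - true_density), t))
```

Increasing `n_moments` should drive the reported $L_1$ error toward zero, consistent with the convergence result stated above.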
