A Lower Bound for the Risk in Estimating the Value of a Probability Density

Abstract
This article considers estimation of the value f(0) of a probability density function f satisfying a Lipschitz condition in a neighborhood of 0. A nonstandard use of the Cramér-Rao inequality yields numerical lower bounds on the minimax squared-error risk of any estimator. These bounds are then compared with the minimax risk of the asymptotically optimal kernel-type estimator. The asymptotic bounds obtained (as the sample size n → ∞) are not quite as good as those of Donoho and Liu (in press a, b), but bounds are also presented here for finite values of n, and those papers contain no such bounds for this problem. The numerical results reported in Table 1 show that the asymptotically optimal kernel estimator performs within a factor of 3 of the minimax bound, even for sample size n = 30. As n increases, the relative performance improves toward its limiting value, although the convergence is fairly slow.
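The abstract does not specify the optimal kernel or bandwidth, so the following is only a rough sketch of a generic kernel-type estimate of f(0), assuming an Epanechnikov kernel and a bandwidth of order n^(-1/3), the usual order for a density that is Lipschitz near the point of estimation; the kernel, the bandwidth constant, and the simulated normal data are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def kernel_estimate_at_zero(x, bandwidth, kernel=None):
    """Generic kernel estimate of f(0): (1/(n*h)) * sum_i K(x_i / h).

    This is an illustrative kernel-type estimator, not the specific
    asymptotically optimal kernel/bandwidth analyzed in the paper.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    if kernel is None:
        # Epanechnikov kernel (an assumed choice; the optimal kernel may differ).
        kernel = lambda u: 0.75 * np.maximum(1.0 - u ** 2, 0.0)
    return kernel(x / bandwidth).sum() / (n * bandwidth)

# Illustration on simulated data whose density is Lipschitz near 0.
rng = np.random.default_rng(0)
n = 30
sample = rng.normal(loc=0.0, scale=1.0, size=n)

# Bandwidth of order n**(-1/3); the leading constant 1.0 is arbitrary here.
h = 1.0 * n ** (-1.0 / 3.0)
print(kernel_estimate_at_zero(sample, h))  # true f(0) = 1/sqrt(2*pi) ≈ 0.399
```

With a Lipschitz condition, the bias of such an estimator is of order h and its variance of order 1/(nh), so balancing the two gives h of order n^(-1/3) and a squared-error risk of order n^(-2/3), which is the scale at which the minimax comparisons in the abstract are made.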
