On the Convergence Rates of IPA and FDC Derivative Estimators

Abstract
We show that under the (sufficient) conditions usually given for infinitesimal perturbation analysis (IPA) to apply for derivative estimation, a finite-difference scheme with common random numbers (FDC) has the same order of convergence, namely O(n^{-1/2}), provided that the size of the finite-difference interval converges to zero fast enough. This holds for both one- and two-sided FDC, for variants of IPA such as certain versions of smoothed perturbation analysis (SPA), which is based on conditional expectation, and, under some ergodicity assumptions, for the estimation of steady-state performance measures by truncated-horizon estimators. Our developments do not involve monotonicity, but are based on continuity and smoothness. We give examples and numerical illustrations showing that the actual difference in mean square error (MSE) between IPA and FDC is typically negligible. We also obtain the order of convergence of that difference, which converges to zero faster than the MSE itself.
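To illustrate the two estimators being compared, the following is a minimal sketch (not the paper's actual examples) of IPA and two-sided FDC for a hypothetical sample performance Y(θ) = θU + U², with U uniform on (0,1), so that dE[Y]/dθ = E[U] = 1/2. With common random numbers, both points of the finite difference reuse the same draw of U; all function names and the choice of performance measure are illustrative assumptions.

```python
import random

def performance(theta, u):
    """Hypothetical sample performance Y(theta) = theta*u + u^2,
    so dE[Y]/dtheta = E[u] = 1/2."""
    return theta * u + u * u

def fdc_estimate(theta, h, n, seed=1):
    """Two-sided finite difference with common random numbers:
    the same u is used at theta+h and theta-h."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        u = rng.random()  # common random number for both evaluation points
        total += (performance(theta + h, u) - performance(theta - h, u)) / (2 * h)
    return total / n

def ipa_estimate(theta, n, seed=1):
    """IPA: differentiate the sample performance directly,
    dY/dtheta = u for this example."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(n)) / n
```

Because Y is linear in θ here, the FDC estimator with common random numbers coincides exactly with the IPA estimator for any h; in general the two differ, and the paper's result is that this difference vanishes faster than the MSE itself when h shrinks fast enough.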
