Conditional limit theorems under Markov conditioning

Abstract
Let $X_1, X_2, \dots$ be independent identically distributed random variables taking values in a finite set $X$, and consider the conditional joint distribution of the first $m$ elements of the sample $X_1, \dots, X_n$, under the condition that $X_1 = x_1$ and the sliding-block sample average of a function $h(\cdot, \cdot)$ defined on $X^2$ exceeds a threshold $\alpha > Eh(X_1, X_2)$. For $m$ fixed and $n \to \infty$, this conditional joint distribution is shown to converge to the $m$-step joint distribution of a Markov chain started in $x_1$ which is closest to $X_1, X_2, \dots$ in Kullback-Leibler information divergence among all Markov chains whose two-dimensional stationary distribution $P(\cdot, \cdot)$ satisfies $\sum P(x, y) h(x, y) \geq \alpha$, provided some distribution $P$ on $X^2$ having equal marginals satisfies this constraint with strict inequality. Similar conditional limit theorems are obtained when $X_1, X_2, \dots$ is an arbitrary finite-order Markov chain and more general conditioning is allowed.
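In symbols, the limiting chain can be read off from the following minimization; this is only a sketch in the abstract's notation, and the symbols $Q$, $\mathcal{M}$, and $P^{*}$ are introduced here for readability and do not appear in the paper:

% Sketch of the variational problem described in the abstract (assumes amsmath).
% Q(x,y) = Pr(X_1 = x) Pr(X_2 = y) is the pair law of the i.i.d. source;
% the names Q, \mathcal{M}, and P^{*} are ours, not the paper's.
\[
P^{*} \;=\; \operatorname*{arg\,min}_{P \in \mathcal{M}} \; D(P \,\|\, Q),
\qquad
\mathcal{M} \;=\; \Bigl\{ P \text{ on } X^{2} :
\sum_{x} P(x, y) = \sum_{x} P(y, x) \ \text{for all } y,\;
\sum_{x, y} P(x, y)\, h(x, y) \geq \alpha \Bigr\},
\]
where $D(\cdot \| \cdot)$ denotes Kullback-Leibler information divergence. The equal-marginals condition expresses that $P$ is the stationary pair distribution of some Markov chain on $X$; the limiting chain of the theorem is the one whose pair distribution is $P^{*}$, started in $x_1$.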
