Convergence of adaptive mixtures of importance sampling schemes
Open Access
- 1 February 2007
- journal article
- Published by Institute of Mathematical Statistics in The Annals of Statistics
- Vol. 35 (1) , 420-448
- https://doi.org/10.1214/009053606000001154
Abstract
In the design of efficient simulation algorithms, one is often beset with a poor choice of proposal distributions. Although the performance of a given simulation kernel can clarify a posteriori how adequate this kernel is for the problem at hand, a permanent on-line modification of kernels causes concerns about the validity of the resulting algorithm. While the issue is most often intractable for MCMC algorithms, the equivalent version for importance sampling algorithms can be validated quite precisely. We derive sufficient convergence conditions for adaptive mixtures of population Monte Carlo algorithms and show that Rao–Blackwellized versions asymptotically achieve an optimum in terms of a Kullback divergence criterion, while more rudimentary versions do not benefit from repeated updating.
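The adaptive scheme the abstract describes can be illustrated by a minimal sketch of a D-kernel population Monte Carlo sampler with Rao–Blackwellized weights: samples are drawn from a mixture of fixed kernels, importance weights use the full mixture density in the denominator, and the mixture weights are updated toward the components that best fit the target. All names and the choice of Gaussian kernels and a Gaussian target here are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def gaussian_pdf(x, sigma):
    """Density of a centered normal with standard deviation sigma."""
    return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def rao_blackwellized_pmc(target_pdf, sigmas, n_samples=5000, n_iter=10, seed=0):
    """Sketch of D-kernel PMC with Rao-Blackwellized weight updates.

    sigmas: scales of the fixed Gaussian proposal kernels.
    Returns the adapted mixture weights over the kernels.
    """
    rng = np.random.default_rng(seed)
    D = len(sigmas)
    alpha = np.full(D, 1.0 / D)          # start from the uniform mixture
    sigmas = np.asarray(sigmas, dtype=float)
    for _ in range(n_iter):
        # Draw component labels, then samples from the selected kernels.
        comps = rng.choice(D, size=n_samples, p=alpha)
        x = rng.normal(0.0, sigmas[comps])
        # Rao-Blackwellized importance weight: the denominator is the
        # full mixture density, not the density of the sampled component.
        dens = np.stack([gaussian_pdf(x, s) for s in sigmas])  # shape (D, n)
        mix = alpha @ dens
        w = target_pdf(x) / mix
        w /= w.sum()
        # Update each alpha_d as the weighted posterior probability that
        # a sample came from kernel d; this drives the mixture toward the
        # kernels closest (in Kullback divergence) to the target.
        post = alpha[:, None] * dens / mix
        alpha = post @ w
        alpha /= alpha.sum()
    return alpha

# With a standard normal target and kernels of scale 1 and 5, adaptation
# should concentrate the mixture weight on the well-matched scale-1 kernel.
alpha = rao_blackwellized_pmc(lambda x: gaussian_pdf(x, 1.0), [1.0, 5.0])
```

The key point mirrored from the abstract is the Rao–Blackwellization: because every sample is weighted against the whole mixture, the weight update can only improve the Kullback divergence criterion across iterations, whereas weighting each sample against its own component alone gives the "rudimentary" scheme that does not benefit from repeated updating.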
This publication has 18 references indexed in Scilit:
- Limit theorems for weighted samples with applications to sequential Monte Carlo methods, The Annals of Statistics, 2008
- Minimum variance importance sampling via Population Monte Carlo, ESAIM: Probability and Statistics, 2007
- Sequential Monte Carlo Samplers, Journal of the Royal Statistical Society Series B: Statistical Methodology, 2006
- Population Monte Carlo, Journal of Computational and Graphical Statistics, 2004
- Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference, The Annals of Statistics, 2004
- Self-regenerative Markov chain Monte Carlo with adaptation, Bernoulli, 2003
- Weak convergence and optimal scaling of random walk Metropolis algorithms, The Annals of Applied Probability, 1997
- Rates of convergence of the Hastings and Metropolis algorithms, The Annals of Statistics, 1996
- Weighted Average Importance Sampling and Defensive Mixture Distributions, Technometrics, 1995
- Markov Chains for Exploring Posterior Distributions, The Annals of Statistics, 1994