Dynamic Conditional Independence Models and Markov Chain Monte Carlo Methods

Abstract
In dynamic statistical modeling situations, observations arise sequentially, causing the model to expand by progressive incorporation of new data items and new unknown parameters. For example, in clinical monitoring, patients and data arrive sequentially, and new patient-specific parameters are introduced with each new patient. Markov chain Monte Carlo (MCMC) might be used for continuous updating of the evolving posterior distribution, but it would need to be restarted from scratch at each expansion stage; MCMC methods are therefore often too slow for real-time inference in dynamic contexts. By combining MCMC with importance resampling, we show how real-time sequential updating of posterior distributions can be effected. The proposed dynamic sampling algorithms use posterior samples from previous updating stages and exploit conditional independence between groups of parameters, so that samples of parameters that are no longer of interest (for example, those of a patient who has died or been discharged) can be discarded. We apply the ...
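To make the reweighting idea concrete, the following is a minimal sketch in notation of our own choosing (none of these symbols appear in the abstract, and the paper's actual algorithms may differ in detail). Suppose that at stage $t$ a sample $\theta_t^{(1)},\dots,\theta_t^{(n)}$ is available, distributed approximately according to the current posterior $\pi_t(\theta_t) = p(\theta_t \mid D_t)$, where $D_t$ denotes the data observed so far. When new data $y_{t+1}$ and new parameters $\phi_{t+1}$ arrive, and assuming $y_{t+1}$ depends on the past only through $(\theta_t,\phi_{t+1})$, each draw can be extended by sampling $\phi_{t+1}^{(i)} \sim p(\phi_{t+1} \mid \theta_t^{(i)})$ and assigned the importance weight

\[
  w^{(i)} \;\propto\; p\bigl(y_{t+1} \mid \theta_t^{(i)}, \phi_{t+1}^{(i)}\bigr),
\]

since the updated posterior factorizes as

\[
  \pi_{t+1}(\theta_t,\phi_{t+1}) \;\propto\; p\bigl(y_{t+1} \mid \theta_t,\phi_{t+1}\bigr)\, p\bigl(\phi_{t+1} \mid \theta_t\bigr)\, \pi_t(\theta_t).
\]

Resampling with probabilities proportional to $w^{(i)}$, followed by a few MCMC moves targeting $\pi_{t+1}$ to restore sample diversity, then yields an approximate sample from the updated posterior; components of each draw corresponding to parameters no longer of interest can simply be dropped.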