Simulated annealing for maximum a posteriori parameter estimation of hidden Markov models
- 1 May 2000
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 46 (3), 994-1004
- https://doi.org/10.1109/18.841176
Abstract
Hidden Markov models are mixture models in which the populations from one observation to the next are selected according to an unobserved finite state-space Markov chain. Given a realization of the observation process, our aim is to estimate the parameters of both the Markov chain and the mixture model in a Bayesian framework. In this paper, we present an original simulated annealing algorithm which, like the EM (Expectation-Maximization) algorithm, relies on data augmentation, and is based on stochastic simulation of the hidden Markov chain. This algorithm is shown to converge toward the set of Maximum A Posteriori (MAP) parameters under suitable regularity conditions.
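The idea of maximizing the posterior of the HMM parameters by simulated annealing can be sketched as follows. This is not the paper's exact data-augmentation algorithm; it is a generic simulated-annealing sketch, assuming a hypothetical two-state Gaussian HMM with unit variances, flat priors on the transition probabilities, and wide Gaussian priors on the state means. The log-posterior is evaluated with the standard forward algorithm in log space, and parameters are explored by a Metropolis random walk under a logarithmic cooling schedule.

```python
import math
import random

def log_gauss(x, mu, sigma=1.0):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def log_likelihood(obs, p, q, mu):
    """Forward algorithm in log space for a 2-state HMM.

    Transition matrix [[p, 1-p], [1-q, q]]; chain started from its
    stationary distribution.
    """
    pi0 = (1 - q) / (2 - p - q)          # stationary probability of state 0
    A = [[p, 1 - p], [1 - q, q]]
    la = [math.log(pi0) + log_gauss(obs[0], mu[0]),
          math.log(1 - pi0) + log_gauss(obs[0], mu[1])]
    for x in obs[1:]:
        new = []
        for j in range(2):
            terms = [la[i] + math.log(A[i][j]) for i in range(2)]
            m = max(terms)                # log-sum-exp for stability
            new.append(m + math.log(sum(math.exp(t - m) for t in terms))
                       + log_gauss(x, mu[j]))
        la = new
    m = max(la)
    return m + math.log(sum(math.exp(t - m) for t in la))

def log_posterior(obs, p, q, mu):
    """Log-posterior up to a constant: flat priors on (p, q) in (0, 1),
    N(0, 100) priors on the state means (both priors are assumptions)."""
    return log_likelihood(obs, p, q, mu) + sum(-m * m / 200.0 for m in mu)

def sa_map(obs, iters=2000, seed=0):
    """Simulated annealing toward the MAP parameters: Metropolis random
    walk on (p, q, mu) with temperature T_k = 1 / log(1 + k)."""
    rng = random.Random(seed)
    p, q, mu = 0.5, 0.5, [-1.0, 1.0]
    cur = log_posterior(obs, p, q, mu)
    best = (cur, p, q, list(mu))
    for k in range(1, iters + 1):
        T = 1.0 / math.log(1 + k)        # logarithmic cooling schedule
        pp = min(max(p + rng.gauss(0, 0.05), 1e-3), 1 - 1e-3)
        qq = min(max(q + rng.gauss(0, 0.05), 1e-3), 1 - 1e-3)
        mm = [m + rng.gauss(0, 0.1) for m in mu]
        new = log_posterior(obs, pp, qq, mm)
        # always accept uphill moves; accept downhill with prob exp(dE / T)
        if new > cur or rng.random() < math.exp((new - cur) / T):
            p, q, mu, cur = pp, qq, mm, new
            if cur > best[0]:
                best = (cur, p, q, list(mu))
    return best
```

A usage example: simulate observations from a two-state chain with means -2 and 2, then run `sa_map(obs)`; the returned tuple holds the best log-posterior found and the corresponding `(p, q, mu)`. The paper's algorithm differs in that it augments the data with a simulated hidden chain rather than proposing raw parameter moves, which is what yields the stated convergence guarantee.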
This publication has 26 references indexed in Scilit:
- Simulation-based methods for blind maximum-likelihood filter identification, Signal Processing, 1999
- Understanding the Metropolis-Hastings Algorithm, The American Statistician, 1995
- Markov Chains for Exploring Posterior Distributions, The Annals of Statistics, 1994
- Recursive Estimation in Switching Autoregressions with a Markov Regime, Journal of Time Series Analysis, 1994
- Bayes regression with autoregressive errors, Journal of Econometrics, 1993
- Simulated annealing process in general state space, Advances in Applied Probability, 1991
- A Monte Carlo Implementation of the EM Algorithm and the Poor Man's Data Augmentation Algorithms, Journal of the American Statistical Association, 1990
- Simulated Annealing: Theory and Applications, published by Springer Nature, 1987
- Convergence and finite-time behavior of simulated annealing, Advances in Applied Probability, 1986
- Baum's forward-backward algorithm revisited, Pattern Recognition Letters, 1985