EM algorithms without missing data
- 1 February 1997
- journal article
- research article
- Published by SAGE Publications in Statistical Methods in Medical Research
- Vol. 6 (1), 38-54
- https://doi.org/10.1177/096228029700600104
Abstract
Most problems in computational statistics involve optimization of an objective function such as a loglikelihood, a sum of squares, or a log posterior function. The EM algorithm is one of the most effective algorithms for maximization because it iteratively transfers maximization from a complex function to a simple, surrogate function. This theoretical perspective clarifies the operation of the EM algorithm and suggests novel generalizations. Besides simplifying maximization, optimization transfer usually leads to highly stable algorithms with well-understood local and global convergence properties. Although convergence can be excruciatingly slow, various devices exist for accelerating it. Beginning with the EM algorithm, we review in this paper several optimization transfer algorithms of substantial utility in medical statistics.
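As a concrete illustration of the optimization-transfer idea described in the abstract (not an example taken from the paper itself), the sketch below applies a majorizing quadratic surrogate to the non-smooth problem of minimizing a sum of absolute deviations, i.e. computing a sample median; this is the mirror image of the minorize-then-maximize step underlying EM. The function name `mm_median`, the starting value, and the convergence settings are illustrative assumptions.

```python
import numpy as np

def mm_median(x, n_iter=50, eps=1e-8):
    """Approximate the median of x by optimization transfer (MM).

    The non-smooth objective f(m) = sum_i |x_i - m| is majorized at the
    current iterate m_k by the quadratic surrogate
        g(m | m_k) = sum_i (x_i - m)**2 / (2 |x_i - m_k|) + const,
    which touches f at m_k.  Minimizing the surrogate yields a weighted
    mean, so each iteration drives f downhill -- the descent property
    that makes optimization-transfer algorithms so stable.
    """
    x = np.asarray(x, dtype=float)
    m = x.mean()                                    # starting value (assumption)
    for _ in range(n_iter):
        w = 1.0 / np.maximum(np.abs(x - m), eps)    # surrogate weights, guarded near zero
        m = np.sum(w * x) / np.sum(w)               # exact minimizer of the surrogate
    return m

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1001) * 2.0 + 5.0
    print(mm_median(x), np.median(x))               # the two values agree closely
```

The maximization version works the same way: a minorizing surrogate is constructed at the current iterate and then maximized, which is exactly how the E- and M-steps of EM transfer the work from the loglikelihood to a simpler function.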