Statistical Physics, Mixtures of Distributions, and the EM Algorithm
- 1 March 1994
- journal article
- Published by MIT Press in Neural Computation
- Vol. 6 (2), 334-340
- https://doi.org/10.1162/neco.1994.6.2.334
Abstract
We show that there are strong relationships between approaches to optimization and learning based on statistical physics and those based on mixtures of experts. In particular, the EM algorithm can be interpreted as converging either to a local maximum of the mixture model or to a saddle point solution of the statistical physics system. An advantage of the statistical physics approach is that it naturally gives rise to a heuristic continuation method, deterministic annealing, for finding good solutions.
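To make the abstract's subject concrete, here is a minimal sketch of the EM algorithm for a two-component 1-D Gaussian mixture. The data, initialization, and iteration count are illustrative assumptions, not taken from the paper; the E-step responsibilities correspond to the Gibbs distribution of the statistical-physics view at temperature T = 1, and each iteration is guaranteed not to decrease the mixture log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two Gaussians (illustrative values).
x = np.concatenate([rng.normal(-2.0, 0.5, 200), rng.normal(2.0, 0.5, 200)])

# Hypothetical starting parameters.
pi = np.array([0.5, 0.5])   # mixing weights
mu = np.array([-1.0, 1.0])  # component means
var = np.array([1.0, 1.0])  # component variances

def loglik(x, pi, mu, var):
    # Mixture log-likelihood: sum_n log sum_k pi_k N(x_n | mu_k, var_k)
    p = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    return np.log(p.sum(axis=1)).sum()

lls = []
for _ in range(50):
    # E-step: posterior responsibilities of each component for each point.
    p = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    r = p / p.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from the responsibility-weighted data.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    lls.append(loglik(x, pi, mu, var))
```

Deterministic annealing, as discussed in the paper, would modify the E-step by sharpening or flattening the responsibilities with an inverse-temperature parameter and gradually lowering the temperature; the sketch above is plain EM at fixed T = 1.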
This publication has 7 references indexed in Scilit:
- Adaptive Mixtures of Local Experts. Neural Computation, 1991
- Parallel Distributed Approaches to Combinatorial Optimization: Benchmark Studies on Traveling Salesman Problem. Neural Computation, 1990
- Generalized Deformable Models, Statistical Physics, and Matching Problems. Neural Computation, 1990
- Reading the structure of brains. Network: Computation in Neural Systems, 1990
- An Analysis of the Elastic Net Approach to the Traveling Salesman Problem. Neural Computation, 1989
- An analogue approach to the travelling salesman problem using an elastic net method. Nature, 1987
- Another interpretation of the EM algorithm for mixture distributions. Statistics &amp; Probability Letters, 1986