Complete-data spaces and generalized EM algorithms

Abstract
Expectation-maximization (EM) algorithms have been applied extensively for computing maximum-likelihood and penalized-likelihood parameter estimates in signal processing applications. Intrinsic to each EM algorithm is a complete-data space (CDS): a hypothetical set of random variables that is related to the parameters more naturally than the measurements are. The authors describe two generalizations of the EM paradigm: (i) allowing the relationship between the CDS and the measured data to be nondeterministic, and (ii) using a sequence of alternating complete-data spaces. These generalizations are motivated in part by the influence of the CDS on the convergence rate, a relationship that is formalized through a data-processing inequality for Fisher information. These concepts are applied to the problem of estimating superimposed signals in Gaussian noise, and it is shown that the new space-alternating generalized EM algorithm converges significantly faster than the ordinary EM algorithm.
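
The superimposed-signals application lends itself to a short sketch. The following Python illustration is not taken from the paper: the two-sinusoid model, the noise level, and the grid-search M-step are assumptions chosen here for concreteness. It contrasts the classical EM decomposition, which splits the residual among the components and re-fits them in parallel, with a SAGE-style sweep that updates one component at a time against the full current residual.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated measurements: two superimposed complex sinusoids in white
# Gaussian noise (an assumed instance of the general problem).
N = 64
n = np.arange(N)
true_w = [0.9, 1.1]                       # angular frequencies
true_a = [1.0, 0.8]                       # complex amplitudes
y = sum(a * np.exp(1j * w * n) for a, w in zip(true_a, true_w))
y = y + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

K = 2
grid = np.linspace(0.5, 1.5, 2000)        # M-step frequency search grid

def signal(a_k, w_k):
    return a_k * np.exp(1j * w_k * n)

def fit_one(x):
    """ML fit of a single sinusoid to a complete-data estimate x:
    periodogram peak for the frequency, projection for the amplitude."""
    c = np.exp(-1j * np.outer(grid, n)) @ x / N
    i = np.argmax(np.abs(c))
    return c[i], grid[i]

def em_sweep(a, w, beta):
    """Classical EM: the E-step splits the residual among the components
    (beta sums to 1); the M-step re-fits all components in parallel."""
    resid = y - sum(signal(a[k], w[k]) for k in range(K))
    for k in range(K):
        a[k], w[k] = fit_one(signal(a[k], w[k]) + beta[k] * resid)

def sage_sweep(a, w):
    """SAGE-style sweep: each component in turn absorbs the *entire*
    current residual, which is recomputed after every update."""
    for k in range(K):
        others = sum(signal(a[j], w[j]) for j in range(K) if j != k)
        a[k], w[k] = fit_one(y - others)

a = np.zeros(K, dtype=complex)
w = np.array([0.8, 1.2])                  # initial frequency guesses
for _ in range(30):
    sage_sweep(a, w)                      # or: em_sweep(a, w, np.full(K, 1/K))
print("estimated frequencies:", w)
```

Because each SAGE update conditions on the current values of the other components, its complete-data space is smaller and carries less Fisher information, which is the mechanism behind the faster convergence claimed in the abstract.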
