Abstract
Probabilistic automata (PA), or stochastic sequential machines, are widely used because they are general enough to approximate an arbitrary process while possessing the attractive property of conditional independence, which simplifies their application. It is therefore important to develop an algorithm for approximating an arbitrary stochastic process with a PA. Standard fitting techniques are not robust and are difficult to implement because of the large number of model parameters. The EM algorithm has proved robust for maximum likelihood parameter estimation from experimental data for special cases of PA. We demonstrate that the EM algorithm can be used to fit a PA to an arbitrary stochastic process if the Kullback-Leibler discrimination information is used as a distance between the processes. We show that the EM algorithm can also be used to calculate derivatives, which allows us to combine it with other optimization algorithms. We also develop modified forms of the algorithm for fitting Markov-modulated Bernoulli and Poisson processes. Computational aspects of applying the EM algorithm are addressed in the Appendix.
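For reference, the Kullback-Leibler discrimination information invoked above as the distance between processes is presumably the standard quantity; a minimal statement, assuming discrete distributions $P$ and $Q$ over a common alphabet $\mathcal{X}$ (the notation $\mathcal{X}$ is ours, not taken from the paper):
\[
  D(P \,\|\, Q) \;=\; \sum_{x \in \mathcal{X}} P(x) \, \log \frac{P(x)}{Q(x)} .
\]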