Representing and computing regular languages on massively parallel networks

Abstract
A general method is proposed for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing a unified representation of stochastic and syntactic pattern constraints. The authors' approach establishes the formal connection between such rules and Chomsky grammars, and generalizes Shannon's original work on the encoding of rule-based channel sequences to maximum-entropy Markov chains. This maximum-entropy probabilistic view leads to Gibbs representations with potentials whose number of minima grows at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. From this coupling, fully parallel stochastic cellular automata can be derived that generate samples from the rule-based constraint sets. The production rules and the neighborhood state structure of the language of sequences directly determine the connection structure required of the parallel computing surface. Representations of this type have been mapped to the DAP-510 massively parallel processor, which consists of 1024 mesh-connected bit-serial processing elements, to perform automated segmentation of electron-micrograph images.
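As a minimal sketch of the sampling idea, the code below uses Shannon's classic constrained-channel example rather than anything taken from the paper itself: binary strings with no two adjacent 1s, which form a regular language. The pair potential, the penalty weight LAMBDA, the inverse temperature BETA, and the checkerboard update schedule are all illustrative assumptions; the minima of the energy coincide exactly with the legal strings, and the parallel even/odd heat-bath updates stand in for the stochastic cellular automaton.

```python
import numpy as np

rng = np.random.default_rng(0)

# Regular-language constraint (Shannon's classic example): binary strings
# with no two adjacent 1s.  The pair potential below is zero exactly on
# legal strings, so the minima of the Gibbs energy coincide with the language.
LAMBDA = 4.0   # hypothetical penalty weight; any positive value works
BETA = 2.0     # inverse temperature of the Gibbs law exp(-BETA * E)

def energy(x):
    """Sum of pair potentials penalizing adjacent 1s."""
    return LAMBDA * np.sum(x[:-1] * x[1:])

def checkerboard_sweep(x, beta):
    """One parallel sweep: heat-bath updates on even sites, then odd sites.

    Sites of the same parity are conditionally independent given their
    neighbours, so each half-sweep can be performed simultaneously.
    """
    n = len(x)
    for parity in (0, 1):
        idx = np.arange(parity, n, 2)
        left = np.where(idx > 0, x[np.maximum(idx - 1, 0)], 0)
        right = np.where(idx < n - 1, x[np.minimum(idx + 1, n - 1)], 0)
        # Conditional energy of setting each site to 1 given its neighbours.
        e1 = LAMBDA * (left + right)
        p1 = np.exp(-beta * e1) / (1.0 + np.exp(-beta * e1))
        x[idx] = (rng.random(len(idx)) < p1).astype(int)
    return x

x = rng.integers(0, 2, size=32)
for _ in range(200):
    x = checkerboard_sweep(x, BETA)

print("sample:", "".join(map(str, x)), "energy:", energy(x))
```

Because same-parity sites are conditionally independent, each half-sweep can update all of its sites at once, which mirrors the fully parallel cellular-automaton updates described in the abstract; at large BETA the samples concentrate on the zero-energy strings of the language.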
