Abstract
How much separate information about two random binary sequences is needed in order to tell, with small probability of error, in which positions the two sequences differ? If the sequences are the outputs of two correlated memoryless binary sources, then in some cases the rate of this information may be substantially less than the joint entropy of the two sources. This result is implied by the solution, for a special class of source distributions, of the source coding problem with two separately encoded side information sources.
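A worked example may help fix ideas; the specific model below (a doubly symmetric binary source with parameter $p$) is an illustrative assumption and is not spelled out in the abstract itself. Let $X \sim \mathrm{Bernoulli}(1/2)$ and $Y = X \oplus N$, where $N \sim \mathrm{Bernoulli}(p)$ is independent of $X$. The positions in which the two sequences differ are exactly the support of $Z = X \oplus Y = N$, so

$$H(Z) = h(p), \qquad H(X,Y) = H(X) + H(N) = 1 + h(p),$$

where $h(\cdot)$ denotes the binary entropy function. A known linear-coding scheme, in which both encoders apply the same parity-check map to their sequences, allows the decoder to recover $Z^n$ from separate descriptions of total rate $2h(p)$, and

$$2h(p) < 1 + h(p) = H(X,Y) \quad \text{whenever } p \neq \tfrac{1}{2},$$

so in this case the difference pattern can be communicated at a rate strictly below the joint entropy of the two sources.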
