Abstract
Consider the problem of discriminating between two Gaussian signals using only a finite number of linear observables. Choosing the set of n observables that minimizes the error probability P_e is a difficult problem. Because H, the Hellinger integral, and H^2 form an upper and a lower bound for P_e, we minimize H instead. We find that the set of observables that minimizes H is a set of coefficients of the simultaneously orthogonal expansions of the two signals. The same set of observables also maximizes the Hájek J-divergence.
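For orientation, a minimal sketch of the kind of bound invoked here, assuming equal prior probabilities for the two hypotheses with densities p_0 and p_1 (the specific constants below are the usual Bhattacharyya-type ones and are not stated in the abstract itself):

\[
H = \int \sqrt{p_0(x)\,p_1(x)}\,dx,
\qquad
\tfrac{1}{4}\,H^{2} \;\le\; P_e \;\le\; \tfrac{1}{2}\,H .
\]

Under these assumptions, minimizing H over the choice of observables simultaneously drives down both the upper and the lower bound on P_e, which is what motivates using H as a surrogate for the error probability.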