On the best finite set of linear observables for discriminating two Gaussian signals
- 1 April 1967
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 13 (2) , 278-284
- https://doi.org/10.1109/tit.1967.1054013
Abstract
Consider the problem of discriminating two Gaussian signals by using only a finite number of linear observables. How to choose the set of n observables to minimize the error probability P_e is a difficult problem. Because H, the Hellinger integral, and H^2 form an upper and a lower bound for P_e, we minimize H instead. We find that the set of observables that minimizes H is a set of coefficients of the simultaneously orthogonal expansions of the two signals. The same set of observables maximizes the Hájek J-divergence as well.
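The bounding relationship between the Hellinger integral H and the error probability P_e can be illustrated numerically in a simple case: two univariate Gaussians with equal variance and equal priors, where both H (the Bhattacharyya coefficient) and the Bayes error have closed forms. This is a minimal sketch, not the paper's construction; the exact constants in the paper's bounds may differ, and the commonly cited form H^2/4 <= P_e <= H/2 is used here.

```python
import math

def hellinger_affinity(mu1, mu2, sigma):
    # H = integral of sqrt(p*q) dx for N(mu1, sigma^2) vs. N(mu2, sigma^2);
    # for equal variances this has the closed form below.
    return math.exp(-((mu1 - mu2) ** 2) / (8 * sigma ** 2))

def error_probability(mu1, mu2, sigma):
    # Bayes error for equal priors: P_e = Q(|mu1 - mu2| / (2*sigma)),
    # where Q is the Gaussian tail probability.
    d = abs(mu1 - mu2) / (2 * sigma)
    return 0.5 * math.erfc(d / math.sqrt(2))

H = hellinger_affinity(0.0, 2.0, 1.0)
Pe = error_probability(0.0, 2.0, 1.0)
# Assumed (common) form of the bounds; the paper's constants may differ.
assert H ** 2 / 4 <= Pe <= H / 2
print(f"H={H:.4f}, Pe={Pe:.4f}, bounds=({H**2/4:.4f}, {H/2:.4f})")
```

Because H is available in closed form while P_e generally is not, minimizing H is the tractable surrogate the abstract describes.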