On the converse theorem in statistical hypothesis testing

Abstract
Simple statistical hypothesis testing is investigated by means of the divergence-geometric method. The asymptotic behavior of the minimum value of the error probability of the second kind is studied under the constraint that the error probability of the first kind is bounded above by exp(-rn), where r is a given positive number. If r is greater than the divergence of the two probability measures, the so-called converse theorem holds. It is shown that the condition under which the converse theorem holds can be divided into two separate cases by analyzing the geodesic connecting the two probability measures, and, as a result, an explanation is given for the Han-Kobayashi linear function f_T(X̃).
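
The following is a minimal sketch of the standard formulation behind the abstract, written in the usual notation P_1, P_2, alpha_n, beta_n for the two hypotheses and the two error probabilities; these symbols, and the form of the converse statement, are assumptions about the conventional setting rather than the paper's own notation.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Standard setting (notation assumed, not taken from the paper itself).
Let $P_1$ and $P_2$ be the probability measures under the two simple
hypotheses, and for a test with acceptance region $A_n \subset \mathcal{X}^n$
put
\[
  \alpha_n = P_1^n(A_n^{c}), \qquad \beta_n = P_2^n(A_n),
\]
the error probabilities of the first and second kind. The quantity whose
asymptotics are studied is the minimum second-kind error under an exponential
constraint on the first-kind error,
\[
  \beta_n^{*}(r) = \min\bigl\{\beta_n : \alpha_n \le e^{-rn}\bigr\},
  \qquad r > 0 .
\]
The converse regime referred to in the abstract is the case
\[
  r > D(P_1 \,\|\, P_2) = \int \log\frac{dP_1}{dP_2}\, dP_1 ,
\]
in which $\beta_n^{*}(r)$ tends to one, so that the remaining question is the
exponential speed at which $1-\beta_n^{*}(r)$ vanishes.
\end{document}
```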
