Abstract
A three-layer feedforward neural network (NN) that implements the optimal Neyman-Pearson (N-P) classifier is described. This NN is useful whenever it is appropriate to characterize (1) input classes as multivariate random variables, and (2) input data vectors as realizations of one of those multivariate random variables. The purpose of the NN is thus simply to compute the conditional likelihoods required by the N-P classifier. Because the N-P classifier is optimal, the classification performance of the NN is optimal as well; consequently, three-layer feedforward NN classifiers can equal, but not exceed, the performance of the N-P classifier. The optimal N-P classifier requires the multivariate probability density functions (PDFs) characterizing the input classes. These class PDFs are approximated (arbitrarily closely) by mixtures of multivariate Gaussian PDFs. Supervised training of the class PDFs from input data vectors is thus equivalent to training the NN. Maximum-likelihood training of the PDFs is performed by the EM algorithm (or by any other suitable optimization method).
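The scheme in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-learn's GaussianMixture (whose fit method runs the EM algorithm) on synthetic two-class data; the names pdf0, pdf1, and np_classifier, the component counts, and the data parameters are all assumptions of this sketch.

```python
# Hypothetical sketch of the abstract's scheme (not the paper's code):
# approximate each class PDF by a Gaussian mixture trained with EM,
# then classify via a Neyman-Pearson likelihood-ratio test.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Two synthetic classes, each treated as a multivariate random variable;
# the training vectors are realizations of those variables.
X0 = rng.normal(loc=0.0, scale=1.0, size=(500, 2))  # class H0
X1 = rng.normal(loc=2.5, scale=1.0, size=(500, 2))  # class H1

# Mixtures of multivariate Gaussians approximate the class PDFs;
# GaussianMixture.fit performs maximum-likelihood training via EM.
pdf0 = GaussianMixture(n_components=2, random_state=0).fit(X0)
pdf1 = GaussianMixture(n_components=2, random_state=0).fit(X1)

def np_classifier(x, threshold=0.0):
    """Neyman-Pearson test: decide H1 when the log-likelihood ratio
    log p(x|H1) - log p(x|H0) exceeds the threshold."""
    llr = pdf1.score_samples(x) - pdf0.score_samples(x)
    return (llr > threshold).astype(int)

# Classify held-out realizations of each class.
test = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
                  rng.normal(2.5, 1.0, (100, 2))])
labels = np_classifier(test)
accuracy = (labels == np.r_[np.zeros(100), np.ones(100)]).mean()
```

Sweeping the threshold trades false-alarm rate against detection rate, which is the defining degree of freedom of the N-P formulation.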