Abstract
Feed-forward neural networks—also known as multi-layer perceptrons—are now widely used for regression and classification. In parallel, but slightly earlier, a family of methods for flexible regression and discrimination was developed in multivariate statistics, and tree-induction methods have been developed in both machine learning and statistics. We expound and compare these approaches in the context of a number of examples.