Neural networks and flexible regression and discrimination
- 1 January 1994
- Research article
- Published by Taylor & Francis in Journal of Applied Statistics
- Vol. 21 (1-2), 39-57
- https://doi.org/10.1080/757582967
Abstract
Feed-forward neural networks—also known as multi-layer perceptrons—are now widely used for regression and classification. In parallel, but slightly earlier, a family of methods for flexible regression and discrimination was developed in multivariate statistics, and tree-induction methods have been developed in both machine learning and statistics. We expound and compare these approaches in the context of a number of examples.
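To make the opening sentence concrete, here is a minimal sketch (not code from the paper) of a one-hidden-layer feed-forward network used for regression: predictions are a linear combination of tanh hidden units, and the weights are fitted by full-batch gradient descent on squared error. The architecture, hyperparameters, and toy data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: a noisy sine curve.
X = np.linspace(-3, 3, 50).reshape(-1, 1)
y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

H = 10                                    # number of hidden units (illustrative)
W1 = rng.standard_normal((1, H)) * 0.5    # input-to-hidden weights
b1 = np.zeros(H)
W2 = rng.standard_normal((H, 1)) * 0.5    # hidden-to-output weights
b2 = np.zeros(1)

def forward(X):
    """Return hidden activations and network predictions."""
    Z = np.tanh(X @ W1 + b1)
    return Z, Z @ W2 + b2

_, pred0 = forward(X)
mse0 = float(np.mean((pred0 - y) ** 2))   # error before training

lr = 0.05
for step in range(2000):
    Z, pred = forward(X)
    err = pred - y                        # gradient of 0.5*(pred - y)^2 w.r.t. pred
    # Backpropagate the squared-error gradient through both layers.
    gW2 = Z.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dZ = (err @ W2.T) * (1 - Z**2)        # tanh'(u) = 1 - tanh(u)^2
    gW1 = X.T @ dZ / len(X)
    gb1 = dZ.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(X)
mse = float(np.mean((pred - y) ** 2))     # error after training
```

The same fitted function plays the role that a projection pursuit or spline fit would in the statistical methods the paper compares it with; only the basis (tanh ridge functions) and the fitting procedure differ.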