Functional approximation by feed-forward networks: a least-squares approach to generalization
- 1 May 1994
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 5 (3) , 363-371
- https://doi.org/10.1109/72.286908
Abstract
This paper considers a least-squares approach to function approximation and generalization. The particular problem addressed is one in which the training data are noiseless and the requirement is to define a mapping that approximates the data and that generalizes to situations in which data samples are corrupted by noise in the input variables. The least-squares approach produces a generalizer that has the form of a radial basis function network for a finite number of training samples. The finite sample approximation is valid provided that the perturbations due to noise on the expected operating conditions are large compared to the sample spacing in the data space. In the other extreme of small noise perturbations, a particular parametric form must be assumed for the generalizer. It is shown that better generalization will occur if the error criterion used in training the generalizer is modified by the addition of a specific regularization term. This is illustrated by an approximator that has a feedforward architecture and is applied to the problem of point-source location using the outputs of an array of receivers in the focal-plane of a lens.
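The abstract's central idea can be illustrated in code: fit a radial basis function (RBF) network to noiseless samples by least squares, then compare a plain fit against one trained with an added regularization term, evaluated on noise-perturbed inputs. This is a minimal sketch, not the paper's formulation: the centre placement, Gaussian width, and the Tikhonov-style weight penalty below are illustrative assumptions standing in for the specific regularization term the paper derives.

```python
import numpy as np

def rbf_design(x, centres, width):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 width^2))."""
    d = x[:, None] - centres[None, :]
    return np.exp(-0.5 * (d / width) ** 2)

def fit_rbf(x, y, centres, width, lam=0.0):
    """Least-squares output weights; lam > 0 adds a weight-decay penalty."""
    phi = rbf_design(x, centres, width)
    if lam > 0.0:
        # Augmented system: minimises ||Phi w - y||^2 + lam ||w||^2.
        phi = np.vstack([phi, np.sqrt(lam) * np.eye(len(centres))])
        y = np.concatenate([y, np.zeros(len(centres))])
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return w

def predict(x, centres, width, w):
    return rbf_design(x, centres, width) @ w

# Noiseless training data from a smooth target, as in the paper's setting.
x_train = np.linspace(0.0, 1.0, 40)
y_train = np.sin(2.0 * np.pi * x_train)
centres = np.linspace(0.0, 1.0, 10)   # assumed centre placement
width = 0.12                          # assumed basis width

w_plain = fit_rbf(x_train, y_train, centres, width, lam=0.0)
w_reg = fit_rbf(x_train, y_train, centres, width, lam=1e-3)

# Operating regime considered in the paper: inputs perturbed by noise.
rng = np.random.default_rng(0)
x_noisy = x_train + rng.normal(0.0, 0.05, size=x_train.shape)
err_plain = np.mean((predict(x_noisy, centres, width, w_plain) - y_train) ** 2)
err_reg = np.mean((predict(x_noisy, centres, width, w_reg) - y_train) ** 2)
```

In a finite-sample RBF form like this, the constant `lam` plays the role the paper assigns to the added regularization term in the training criterion; the particular value used here is arbitrary.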
This publication has 11 references indexed in Scilit:
- Information theoretic derivation of network architecture and learning algorithms. Institute of Electrical and Electronics Engineers (IEEE), 2002
- Point-source location using a millimetre wave focal-plane array radar. IEE Proceedings F Radar and Signal Processing, 1991
- Exploiting prior knowledge in network optimization: an illustration from medical prognosis. Network: Computation in Neural Systems, 1990
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks. Science, 1990
- Constructing a generalizer superior to NETtalk via a mathematical theory of generalization. Neural Networks, 1990
- The optimised internal representation of multilayer classifier networks performs nonlinear discriminant analysis. Neural Networks, 1990
- Neural networks in noisy environment: a simple temporal higher order learning for feed-forward networks. Institute of Electrical and Electronics Engineers (IEEE), 1990
- A benchmark for how well neural nets generalize. Biological Cybernetics, 1989
- An Analysis of the Total Least Squares Problem. SIAM Journal on Numerical Analysis, 1980
- Density Estimation for Statistics and Data Analysis. Springer Nature, 1986