A general regression neural network
- 1 January 1991
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 2 (6), 568-576
- https://doi.org/10.1109/72.97934
Abstract
A memory-based network that provides estimates of continuous variables and converges to the underlying (linear or nonlinear) regression surface is described. The general regression neural network (GRNN) is a one-pass learning algorithm with a highly parallel structure. It is shown that, even with sparse data in a multidimensional measurement space, the algorithm provides smooth transitions from one observed value to another. The algorithmic form can be used for any regression problem in which an assumption of linearity is not justified.
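The estimator behind the GRNN is a kernel-weighted average of stored training targets, evaluated directly from the samples in a single pass. The sketch below illustrates that idea with a Gaussian kernel and a single smoothing parameter `sigma`; the function and variable names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """Kernel-weighted regression estimate in the GRNN style.

    y_hat(x) = sum_i y_i * exp(-D_i^2 / (2 sigma^2)) / sum_i exp(-D_i^2 / (2 sigma^2)),
    where D_i^2 is the squared Euclidean distance from x to stored sample x_i.
    `sigma` is an assumed smoothing (bandwidth) parameter.
    """
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    X_query = np.asarray(X_query, dtype=float)

    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances to stored samples
        w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))  # weighted average of stored targets
    return np.array(preds)

# Usage: noisy samples of a nonlinear function, evaluated at two query points
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
print(grnn_predict(X, y, np.array([[0.0], [1.5]]), sigma=0.4))
```

Because all training samples are simply stored and reused at query time, "training" is a single pass over the data, and each kernel evaluation is independent, which is the highly parallel structure the abstract refers to.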