On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- 31 December 1994
- Journal article
- Published by Elsevier in Neural Networks
- Vol. 7 (4), 609-628
- https://doi.org/10.1016/0893-6080(94)90040-x
Abstract
No abstract available.
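For orientation only, since no abstract is indexed: the title relates two estimators whose standard textbook forms are given below. These are the conventional definitions of kernel regression and an RBF network, not text excerpted from this article, and the symbols (kernel K, bandwidth h, centers c_j, width sigma) are the usual generic choices.

```latex
% Standard definitions (textbook forms; not taken from this article).
% Kernel regression (Nadaraya-Watson) estimate from data (X_i, Y_i),
% with kernel K and bandwidth h:
\[
  \hat{f}_n(x) \;=\;
    \frac{\sum_{i=1}^{n} Y_i \, K\!\left(\frac{x - X_i}{h}\right)}
         {\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right)}
\]
% Radial basis function (RBF) network with centers c_j, weights w_j,
% and receptive field size (width) sigma:
\[
  f(x) \;=\; \sum_{j=1}^{m} w_j \,
    \phi\!\left(\frac{\lVert x - c_j \rVert}{\sigma}\right)
\]
```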
This publication has 24 references indexed in Scilit:
- Convergence Rates of Approximation by Translates. Defense Technical Information Center (DTIC), 1992
- Neural Networks and the Bias/Variance Dilemma. Neural Computation, 1992
- Orthogonal least squares learning algorithm for radial basis function networks. IEEE Transactions on Neural Networks, 1991
- Networks and the best approximation property. Biological Cybernetics, 1990
- Layered Neural Networks with Gaussian Hidden Units as Universal Approximations. Neural Computation, 1990
- An equivalence theorem for L1 convergence of the kernel regression estimate. Journal of Statistical Planning and Inference, 1989
- Distribution-Free Pointwise Consistency of Kernel Regression Estimate. The Annals of Statistics, 1984
- On the Almost Everywhere Convergence of Nonparametric Regression Function Estimates. The Annals of Statistics, 1981
- Probability Inequalities for Sums of Bounded Random Variables. Journal of the American Statistical Association, 1963
- Probability Inequalities for the Sum of Independent Random Variables. Journal of the American Statistical Association, 1962