Polynomial splines and nonparametric regression
- 1 January 1991
- journal article
- research article
- Published by Taylor & Francis in Journal of Nonparametric Statistics
- Vol. 1 (1–2), 143–156
- https://doi.org/10.1080/10485259108832516
Abstract
Let (X, Y) ∈ [0, 1]^d × ℝ be a random vector and let the conditional distribution of Y given X = x have mean θ(x) and satisfy an appropriate moment condition. It is assumed that the density function of X is bounded away from zero and infinity on [0, 1]^d. Suppose that θ(x) is known to be a general d-dimensional smooth function of x. Consider an estimator of θ having the form of a polynomial spline with simple knots at equally spaced grid points over [0, 1]^d, where the coefficients are determined by the method of least squares based on a random sample of size n from the distribution of (X, Y). It is shown that this estimator achieves the optimal rates of convergence for nonparametric regression estimation, as defined in Stone [14], under the L₂ norm and the sup norm, respectively.
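For a concrete picture of the estimator the abstract describes, the following is a minimal one-dimensional sketch (d = 1) in Python: a cubic polynomial spline with simple knots at equally spaced points in [0, 1], fitted by least squares. The regression function `theta`, the noise level, the sample size, and the number of knots are illustrative assumptions, not values from the paper; in particular, the paper's rate results require the number of knots to grow with the sample size n in a way this sketch does not reproduce.

```python
# Minimal sketch: least-squares cubic spline with equally spaced simple knots on [0, 1].
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(0)
n = 500
x = np.sort(rng.uniform(0.0, 1.0, size=n))          # design density bounded away from 0 and infinity
theta = lambda t: np.sin(2 * np.pi * t)              # hypothetical smooth regression function (assumption)
y = theta(x) + 0.3 * rng.standard_normal(n)          # responses with conditional mean theta(x)

interior_knots = np.linspace(0.0, 1.0, 12)[1:-1]     # equally spaced simple knots in (0, 1)
spline = LSQUnivariateSpline(x, y, interior_knots, k=3)   # coefficients chosen by least squares

grid = np.linspace(0.0, 1.0, 201)
sup_error = np.max(np.abs(spline(grid) - theta(grid)))    # empirical sup-norm error on a grid
l2_error = np.sqrt(np.mean((spline(grid) - theta(grid)) ** 2))  # empirical L2-type error
print(f"sup-norm error: {sup_error:.3f}, L2 error: {l2_error:.3f}")
```

The multivariate case in the paper uses tensor-product splines over a grid of knots on [0, 1]^d; the one-dimensional fit above is only meant to illustrate the knot placement and the least-squares step.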
References
- On Density Estimation in the View of Kolmogorov's Ideas in Approximation Theory. The Annals of Statistics, 1990.
- Monotone Regression Splines in Action. Statistical Science, 1988.
- Nonparametric Regression Analysis of Longitudinal Data. Published by Springer Nature, 1988.
- Spectral bounds for ∥A⁻¹∥∞. Journal of Approximation Theory, 1986.
- Additive Regression and Other Nonparametric Models. The Annals of Statistics, 1985.
- Multivariate Smoothing Spline Functions. SIAM Journal on Numerical Analysis, 1984.
- Optimal Global Rates of Convergence for Nonparametric Regression. The Annals of Statistics, 1982.
- Asymptotic Integrated Mean Square Error Using Least Squares and Bias Minimizing Splines. The Annals of Statistics, 1980.
- A Practical Guide to Splines. Published by Springer Nature, 1978.
- Probability Inequalities for Sums of Bounded Random Variables. Journal of the American Statistical Association, 1963.