Abstract
Let (X, Y) ∈ [0, 1]^d × R be a random vector and let the conditional distribution of Y given X = x have mean θ(x) and satisfy an appropriate moment condition. It is assumed that the density function of X is bounded away from zero and infinity on [0, 1]^d. Suppose that θ(x) is known to be a general d-dimensional smooth function of x only. Consider an estimator of θ having the form of a polynomial spline with simple knots at equally spaced grids over [0, 1]^d, where the coefficients are determined by the method of least squares based on a random sample of size n from the distribution of (X, Y). It is shown that this estimator achieves the optimal rates of convergence for nonparametric regression estimation, as defined in Stone [14], under the L_2 norm and the sup norm, respectively.
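The estimator described above can be illustrated with a minimal one-dimensional sketch (d = 1): a polynomial spline with simple, equally spaced interior knots, represented here via the truncated power basis, with coefficients found by ordinary least squares. The function names, the knot count, and the test function `sin(2πx)` are illustrative choices, not part of the paper.

```python
import numpy as np

def truncated_power_basis(x, degree, knots):
    # Design matrix for a degree-p spline with simple knots:
    # columns 1, x, ..., x^p, then (x - t)_+^p for each interior knot t.
    cols = [x**j for j in range(degree + 1)]
    cols += [np.clip(x - t, 0.0, None)**degree for t in knots]
    return np.column_stack(cols)

def fit_spline(x, y, degree=3, num_knots=5):
    # Equally spaced interior knots on [0, 1], as in the abstract.
    knots = np.linspace(0.0, 1.0, num_knots + 2)[1:-1]
    B = truncated_power_basis(x, degree, knots)
    # Least-squares coefficients, as in the estimator considered.
    coef, *_ = np.linalg.lstsq(B, y, rcond=None)
    return lambda xq: truncated_power_basis(np.asarray(xq), degree, knots) @ coef

# Example: recover a smooth regression function from noisy samples.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 500)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, x.size)
est = fit_spline(x, y)
```

For d > 1 the paper's setting uses splines over a d-dimensional grid (tensor-product construction); the one-dimensional case shown is only the simplest instance.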
