Abstract
For a regression model $y_i = \theta(x_i) + \varepsilon_i$, the unknown function $\theta$ is estimated by least squares on a subspace $\Lambda_m = \operatorname{span}\{\psi_1, \psi_2, \ldots, \psi_m\}$, where the basis functions $\psi_i$ are predetermined and $m$ is varied. Assuming that the design is suitably approximated by an asymptotic design measure, a general method is presented for approximating the bias and variance in a scale of Hilbertian norms natural to the problem. The general theory is illustrated with two examples: truncated Fourier series regression and polynomial regression. For these examples, we give rates of convergence of derivative estimates in (weighted) $L_2$ norms and establish consistency in the supremum norm.
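As a minimal illustration of the estimator the abstract describes (not taken from the paper; the true function, noise level, and sample size below are hypothetical), the following Python sketch fits $y_i = \theta(x_i) + \varepsilon_i$ by least squares on the span of a truncated Fourier basis and varies the truncation level $m$:

```python
import numpy as np

def fourier_design(x, m):
    """Design matrix whose m columns are the constant function and
    cos/sin pairs cos(2*pi*k*x), sin(2*pi*k*x) for k = 1, 2, ..."""
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < m:
        cols.append(np.cos(2 * np.pi * k * x))
        if len(cols) < m:
            cols.append(np.sin(2 * np.pi * k * x))
        k += 1
    return np.column_stack(cols)

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 1.0, n)                       # design points
theta = lambda t: np.exp(np.sin(2 * np.pi * t))    # hypothetical smooth theta
y = theta(x) + 0.1 * rng.standard_normal(n)        # noisy observations

for m in (3, 7, 15):
    Psi = fourier_design(x, m)                     # basis evaluated at the design
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None) # least squares onto span{psi_1..psi_m}
    rmse = np.sqrt(np.mean((Psi @ coef - theta(x)) ** 2))
    print(f"m = {m:2d}   empirical L2 error = {rmse:.4f}")
```

Increasing $m$ shrinks the approximation bias while inflating the variance of the fit; the paper's bias and variance approximations quantify this trade-off in a scale of Hilbertian norms.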
