Approximation of Least Squares Regression on Nested Subspaces
Open Access
- 1 June 1988
- journal article
- Published by Institute of Mathematical Statistics in The Annals of Statistics
- Vol. 16 (2), 713-732
- https://doi.org/10.1214/aos/1176350830
Abstract
For a regression model $y_i = \theta(x_i) + \varepsilon_i$, the unknown function $\theta$ is estimated by least squares on a subspace $\Lambda_m = \operatorname{span}\{\psi_1, \psi_2, \cdots, \psi_m\}$, where the basis functions $\psi_i$ are predetermined and $m$ is varied. Assuming that the design is suitably approximated by an asymptotic design measure, a general method is presented for approximating the bias and variance in a scale of Hilbertian norms natural to the problem. The general theory is illustrated with two examples: truncated Fourier series regression and polynomial regression. For these examples, we give rates of convergence of derivative estimates in (weighted) $L_2$ norms and establish consistency in supremum norm.
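The following is a minimal sketch, not taken from the paper, of the setup the abstract describes: least-squares estimation of $\theta$ on nested subspaces $\Lambda_m$ spanned by the first $m$ elements of a truncated Fourier basis, with $m$ varied. The function names, the example regression function $\theta$, the noise level, and the design points are all illustrative assumptions.

```python
# Sketch of least-squares regression on nested subspaces Lambda_m (illustrative only).
import numpy as np

def fourier_design(x, m):
    """Design matrix whose columns span {1, cos(2*pi*x), sin(2*pi*x), ...}, truncated at m columns."""
    cols = [np.ones_like(x)]
    k = 1
    while len(cols) < m:
        cols.append(np.cos(2 * np.pi * k * x))
        if len(cols) < m:
            cols.append(np.sin(2 * np.pi * k * x))
        k += 1
    return np.column_stack(cols)

def fit_on_subspace(x, y, m):
    """Least-squares coefficients of the regression of y on the m-dimensional subspace."""
    Psi = fourier_design(x, m)
    coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)
    return coef

# Noisy observations y_i = theta(x_i) + eps_i on an equally spaced design (assumed example).
rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)
theta = np.exp(-3 * x) * np.sin(4 * np.pi * x)      # illustrative true regression function
y = theta + 0.1 * rng.standard_normal(n)

# Fit on nested subspaces Lambda_3 < Lambda_7 < Lambda_15 and report empirical L2 error.
for m in (3, 7, 15):
    fit = fourier_design(x, m) @ fit_on_subspace(x, y, m)
    print(m, np.sqrt(np.mean((fit - theta) ** 2)))
```

Increasing $m$ reduces the approximation bias of the projection onto $\Lambda_m$ while increasing the variance of the fitted coefficients, which is the bias-variance trade-off the paper's approximation theory quantifies.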