Abstract
This paper is mainly concerned with deriving finite-sample properties of least squares estimators for the regression function in a nonparametric regression setting, under some simplifying assumptions such as normally distributed errors with a common known variance. The selection of basis functions used to construct an estimator may be regarded as a smoothing problem and will usually be done in a data-dependent way. A straightforward application of a result by P. J. Kempthorne shows that, under squared error loss, all selection procedures are admissible. Furthermore, the minimax approach yields an interpolating estimator, which is often impractical. Therefore, within a certain class of selection procedures, an optimal one is determined using the minimax regret principle. It can be seen to behave similarly to the procedure minimizing an unbiased risk estimator or, equivalently, the Cp-criterion.
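To make the final comparison concrete, the following display is a sketch, under the stated assumptions (known error variance), of the standard form of the unbiased risk estimator and the Cp-criterion for a projection-type least squares estimator; the notation (observations y = f + e with e ~ N(0, s^2 I_n), projection P_S onto the span of a selected basis subset S of size |S|) is introduced here for illustration and is not taken from the paper itself.

\[
  \hat{R}(S) \;=\; \|y - P_S y\|^2 + 2\sigma^2 |S| - n\sigma^2,
  \qquad
  C_p(S) \;=\; \frac{\|y - P_S y\|^2}{\sigma^2} + 2|S| - n .
\]

Under these assumptions, \(\hat{R}(S)\) is an unbiased estimator of the risk \(\mathbb{E}\|P_S y - f\|^2\), and since \(C_p(S) = \hat{R}(S)/\sigma^2\) when \(\sigma^2\) is known, both criteria are minimized by the same subset S.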