Additive nonparametric regression on principal components

Abstract
Nonparametric regression smoothing in high dimensions faces the problem of data sparseness. Additive regression models alleviate this problem by fitting a sum of one-dimensional smooth functions. A common approach to dimension reduction in multivariate statistics is to replace the original high-dimensional predictor variables by their dominant principal components. In this paper we consider an additive nonparametric regression model on principal components. A three-stage procedure is proposed to decide how many and which components should be included in such an additive model. In the first step, the predictor variables are orthogonalized by the principal component transformation. The second step determines the number and order of components to include. In the third step, the additive regression model is fitted by the kernel method. The asymptotic distribution of this regression estimate is given. The practical performance is investigated via a simulation study.
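The three-stage procedure outlined in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the component-selection step is simplified here to keeping the leading principal components by explained variance (the paper proposes a data-driven rule for the number and order of components), and the additive kernel fit is implemented by backfitting with a Nadaraya-Watson smoother. All variable names and the bandwidth choice are illustrative assumptions.

```python
import numpy as np

def nw_smooth(x, y, h):
    """Nadaraya-Watson smoother with a Gaussian kernel and bandwidth h."""
    d = (x[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * d**2)
    return w @ y / w.sum(axis=1)

def additive_fit(Z, y, h=0.5, n_iter=20):
    """Fit y ~ mu + sum_j f_j(Z[:, j]) by backfitting kernel smoothers."""
    n, p = Z.shape
    f = np.zeros((n, p))
    mu = y.mean()
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with the j-th component removed
            resid = y - mu - f.sum(axis=1) + f[:, j]
            fj = nw_smooth(Z[:, j], resid, h)
            f[:, j] = fj - fj.mean()  # center for identifiability
    return mu, f

rng = np.random.default_rng(0)
n, d = 300, 5
X = rng.normal(size=(n, d))
X[:, 1] += 0.8 * X[:, 0]  # correlated predictors

# Stage 1: orthogonalize predictors via the principal component transformation
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt.T  # principal component scores (orthogonal columns)

# Stage 2 (simplified): keep the k leading components
k = 2
Zk = Z[:, :k]

# Simulated additive response on the selected components
y = np.sin(Zk[:, 0]) + 0.5 * Zk[:, 1]**2 + 0.1 * rng.normal(size=n)

# Stage 3: additive kernel regression on the selected components
mu, f = additive_fit(Zk, y)
yhat = mu + f.sum(axis=1)
print("in-sample correlation:", round(np.corrcoef(y, yhat)[0, 1], 3))
```

Because the principal component scores are uncorrelated, the backfitting loop behaves well; in practice the bandwidth `h` and the number of components `k` would be chosen by a data-driven criterion such as cross-validation.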
