Sparse On-Line Gaussian Processes
- 1 March 2002
- journal article
- Published by MIT Press in Neural Computation
- Vol. 14 (3), 641-668
- https://doi.org/10.1162/089976602317250933
Abstract
We develop an approach for sparse representations of Gaussian process (GP) models (which are Bayesian types of kernel machines) in order to overcome their limitations for large data sets. The method is based on a combination of a Bayesian on-line algorithm together with a sequential construction of a relevant subsample of the data that fully specifies the prediction of the GP model. By using an appealing parameterization and projection techniques in a reproducing kernel Hilbert space, recursions for the effective parameters and a sparse Gaussian approximation of the posterior process are obtained. This allows for both a propagation of predictions and Bayesian error measures. The significance and robustness of our approach are demonstrated on a variety of experiments.
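The on-line construction described in the abstract can be illustrated with a minimal sketch for GP regression with Gaussian noise: each arriving point is scored for novelty by projecting it, in the RKHS, onto the span of the current basis-vector set; low-novelty points update only the effective parameters, while sufficiently novel points extend the basis. This is an illustrative reimplementation under stated assumptions (an RBF kernel, a `tol` novelty threshold, and all class/variable names are choices made here, not the paper's notation), not the paper's reference code.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # Squared-exponential kernel between two 1-D input arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

class SparseOnlineGP:
    """Sketch of sparse on-line GP regression: only inputs whose
    geometric novelty gamma exceeds `tol` join the basis-vector (BV)
    set that represents the posterior process."""

    def __init__(self, noise=0.1, ls=1.0, tol=1e-3):
        self.noise2 = noise ** 2
        self.ls = ls
        self.tol = tol
        self.bv = np.empty(0)          # basis-vector inputs
        self.alpha = np.empty(0)       # posterior mean coefficients
        self.C = np.empty((0, 0))      # posterior covariance correction
        self.Qinv = np.empty((0, 0))   # inverse Gram matrix of the BV set

    def update(self, x, y):
        k = rbf(self.bv, np.array([x]), self.ls)[:, 0]
        kstar = 1.0                                # k(x, x) for the RBF kernel
        mean = k @ self.alpha
        var = kstar + k @ self.C @ k
        q = (y - mean) / (var + self.noise2)       # first-order update scalar
        r = -1.0 / (var + self.noise2)             # second-order update scalar
        e = self.Qinv @ k                          # projection onto the BV span
        gamma = kstar - k @ e                      # geometric novelty of x
        if gamma < self.tol and len(self.bv) > 0:
            # Low novelty: project onto the existing BV set, no growth.
            s = self.C @ k + e
            self.alpha = self.alpha + q * s
            self.C = self.C + r * np.outer(s, s)
        else:
            # Novel input: extend the BV set and all recursions.
            s = np.append(self.C @ k, 1.0)
            self.alpha = np.append(self.alpha, 0.0) + q * s
            n = len(self.bv)
            Cn = np.zeros((n + 1, n + 1)); Cn[:n, :n] = self.C
            self.C = Cn + r * np.outer(s, s)
            Qn = np.zeros((n + 1, n + 1)); Qn[:n, :n] = self.Qinv
            eh = np.append(e, -1.0)
            self.Qinv = Qn + np.outer(eh, eh) / gamma
            self.bv = np.append(self.bv, x)

    def predict(self, x):
        # Predictive mean and variance (including observation noise).
        k = rbf(self.bv, np.array([x]), self.ls)[:, 0]
        return k @ self.alpha, 1.0 + k @ self.C @ k + self.noise2
```

Because exact repeats of a basis vector have novelty near zero, re-presenting the same inputs never grows the BV set; only the coefficients `alpha` and the correction matrix `C` are refined, which is what makes the representation sparse.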