An Expectation-Maximization Approach to Nonlinear Component Analysis
- 1 March 2001
- journal article
- Published by MIT Press in Neural Computation
- Vol. 13 (3), 505-510
- https://doi.org/10.1162/089976601300014439
Abstract
The proposal of considering nonlinear principal component analysis as a kernel eigenvalue problem has provided an extremely powerful method of extracting nonlinear features for a number of classification and regression applications. Whereas the utilization of Mercer kernels makes the problem of computing principal components in, possibly, infinite-dimensional feature spaces tractable, there are still the attendant numerical problems of diagonalizing large matrices. In this contribution, we propose an expectation-maximization approach for performing kernel principal component analysis and show this to be a computationally efficient method, especially when the number of data points is large.
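The efficiency claim in the abstract can be illustrated with a minimal sketch. If the feature-space loading matrix of EM-for-PCA (in the style of Roweis) is parameterized as A = ΦW, with Φ the mapped data, both EM steps reduce to algebra on the n × n Gram matrix at O(n²q) cost per iteration, avoiding the O(n³) eigendecomposition. This is an illustrative reconstruction under that parameterization, not the paper's exact derivation; the function name `em_kernel_pca`, the linear-kernel demo, and the test data are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def em_kernel_pca(K, q, n_iter=50):
    """EM-style iteration for the leading q-dimensional principal
    subspace in feature space, using only the n x n Gram matrix K.

    Writing the feature-space loadings as A = Phi @ W lets both EM
    steps be expressed through K alone, at O(n^2 q) per iteration
    instead of the O(n^3) of diagonalizing K directly.
    """
    n = K.shape[0]
    W = rng.standard_normal((n, q))  # random initial coefficients
    for _ in range(n_iter):
        # E-step: latent coordinates X = (A^T A)^{-1} A^T Phi,
        # which with A = Phi W becomes (W^T K W)^{-1} W^T K.
        X = np.linalg.solve(W.T @ K @ W, W.T @ K)
        # M-step: A_new = Phi X^T (X X^T)^{-1}, i.e. W = X^T (X X^T)^{-1}.
        W = np.linalg.solve(X @ X.T, X).T
    return W

# Demo on a linear kernel so the exact spectrum is easy to verify;
# the Gram matrix of any Mercer kernel (centered in feature space
# for true kernel PCA) can be used in its place.
scales = np.array([4.0, 2.0, 1.0, 0.3, 0.3, 0.3])
Y = rng.standard_normal((300, 6)) * scales
K = Y @ Y.T

W = em_kernel_pca(K, q=3)

# The recovered subspace should match the span of the top-3
# eigenvectors of K: all singular values of the overlap matrix
# (cosines of the principal angles) should be ~1.
evals, evecs = np.linalg.eigh(K)
B, _ = np.linalg.qr(K @ W)
overlap = np.linalg.svd(evecs[:, -3:].T @ B, compute_uv=False)
print(np.allclose(overlap, 1.0, atol=1e-6))  # → True
```

Each iteration maps span(W) to span(KW), so the scheme behaves like subspace iteration on the Gram matrix and converges to the leading kernel principal subspace without ever forming an eigendecomposition.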