A FAST NEURAL ALGORITHM FOR PRINCIPAL COMPONENT ANALYSIS AND SINGULAR VALUE DECOMPOSITION
- 1 January 1991
- journal article
- Published by World Scientific Pub Co Pte Ltd in International Journal of Neural Systems
- Vol. 2 (1), 147-155
- https://doi.org/10.1142/s0129065791000145
Abstract
We propose a fast neural algorithm to perform Principal Component Analysis (PCA) of a set of examples. It is obtained by simplifying current neural learning rules for PCA. First, we use a single binary neuron to extract a given component with a Hebb-type learning rule (Self-Organized Perceptron). This rule rapidly yields the first principal component; moreover, since the neuron is binary, convergence is easily interpreted in terms of geometry and trajectories. Successive components are then obtained by projecting the examples onto the subspace complementary to the already-learnt components. This avoids mixing the components, as happens, for example, in the “Subspace Method” recently proposed by Oja [11]. We have tested this approach on a Gaussian distribution of examples: the quality of the results is identical to that obtained with methods which diagonalize the correlation matrix computed from the set of examples. A variant of the algorithm is also proposed to perform the Singular Value Decomposition (SVD), which “diagonalizes” an asymmetric matrix; its performance is as satisfactory as for PCA. The complexity and performance of a VLSI implementation are then estimated from the specifications of the neural VLSI developed in our laboratory. Comparison with hardware implementations of non-neural SVD approaches clearly favors the neural approach: the expected speedup is at least two orders of magnitude for a 100-dimensional SVD calculation.
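The extract-then-deflate scheme described in the abstract (learn one component with a Hebbian neuron, then project the examples onto the subspace complementary to what has been learnt) can be sketched as follows. This is a minimal illustration using a generic normalized Hebbian rule, not the paper's binary-neuron Self-Organized Perceptron; all function names and parameter values here are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_component(X, n_epochs=100, lr=0.002):
    # Estimate the leading principal component of (centered) examples X
    # with a normalized Hebbian update: w += lr * (w.x) * x, then renormalize.
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_epochs):
        for x in X:
            w += lr * (w @ x) * x      # Hebb: output y = w.x times input x
            w /= np.linalg.norm(w)     # keep the weight vector on the unit sphere
    return w

def pca_by_deflation(X, k):
    # Extract k components one at a time. After each, project the examples
    # onto the subspace complementary to the learnt component, so that the
    # next neuron cannot re-learn a direction that is already extracted.
    X = X - X.mean(axis=0)
    components = []
    for _ in range(k):
        w = hebbian_component(X)
        components.append(w)
        X = X - np.outer(X @ w, w)     # deflation: remove the learnt direction
    return np.array(components)

# Gaussian examples with a diagonal covariance, so the principal axes
# are (close to) the coordinate axes, in decreasing order of variance.
X = rng.normal(size=(500, 5)) * np.array([5.0, 2.0, 1.0, 0.5, 0.2])
W = pca_by_deflation(X, 2)
```

As a check in the spirit of the abstract's comparison, the rows of `W` should agree, up to sign, with the leading eigenvectors obtained by diagonalizing the correlation matrix of the examples (e.g. via `np.linalg.eigh(np.cov(X, rowvar=False))`).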