Global convergence of Oja's subspace algorithm for principal component extraction
- 1 January 1998
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 9 (1) , 58-67
- https://doi.org/10.1109/72.655030
Abstract
Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information in time series. This paper undertakes a thorough investigation of the convergence properties of Oja's algorithm. The asymptotic convergence rates of the algorithm are established, and the dependence of the algorithm on its initial weight matrix and on the singularity of the data covariance matrix is comprehensively addressed.
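The abstract names Oja's principal subspace algorithm without reproducing its update rule. For readers unfamiliar with it, the following is a minimal NumPy sketch of the standard subspace rule; the data model, dimensions, and learning rate are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Minimal sketch of Oja's subspace algorithm (standard form):
#   y = W^T x
#   W <- W + eta * (x y^T - W y y^T)
# Under suitable conditions, the n x p weight matrix W converges to an
# orthonormal basis of the p-dimensional principal subspace of the data.

rng = np.random.default_rng(0)
n, p = 5, 2                      # data dimension, subspace dimension (illustrative)
eta, steps = 0.01, 20_000        # learning rate, iteration count (illustrative)

# Zero-mean synthetic data with covariance C = L L^T; the principal
# subspace is spanned by the first two coordinate axes.
L = np.diag([3.0, 2.0, 1.0, 0.5, 0.2])
W = 0.1 * rng.standard_normal((n, p))   # random initial weight matrix

for _ in range(steps):
    x = L @ rng.standard_normal(n)      # one sample of the time series
    y = W.T @ x                         # network output
    W += eta * (np.outer(x, y) - W @ np.outer(y, y))  # Oja's subspace update

# At convergence, W^T W should be close to the identity (orthonormal
# columns) and the columns of W should span the top-p eigenspace of C.
C = L @ L.T
top = np.linalg.eigh(C)[1][:, -p:]      # exact principal subspace basis
print("W^T W =\n", np.round(W.T @ W, 3))
print("projector gap:", np.linalg.norm(W @ W.T - top @ top.T))
```

With a small constant learning rate the iterates hover near the principal subspace rather than converging exactly; a decaying step size, as assumed in typical stochastic-approximation analyses, would drive the projector gap to zero.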