A neural-type parallel algorithm for fast matrix inversion
- 9 December 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- p. 108-113
- https://doi.org/10.1109/ipps.1991.153764
Abstract
The paper introduces the orthogonalized back-propagation algorithm (OBA), a training procedure for adjusting the weights of a neural-type network used for matrix inversion. In this framework the adjustable weights correspond to the estimate of the inverse of the matrix. The algorithm is iterative, in the sense that an initial estimate of the solution is chosen and then updated according to an error measure. However, it is also a direct algorithm, since it guarantees exact convergence after n steps, independent of the initial estimate, where n is the dimension of the matrix to be inverted. The method can also be applied directly to solving linear equations and to computing the pseudoinverse of matrices with full row or column rank. From an optimization point of view, the OBA is shown to be an optimal algorithm for minimizing a quadratic least-squares cost functional.
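The abstract gives no pseudocode for the OBA itself. As a hedged illustration of the general idea of orthogonalization-based direct inversion that terminates in exactly n steps, the following sketch uses a classical Gram-Schmidt QR factorization (A = QR, so A⁻¹ = R⁻¹Qᵀ); it is not the paper's algorithm, and the function name is hypothetical.

```python
import numpy as np

def invert_via_gram_schmidt(A):
    """Illustrative sketch (not the paper's OBA): invert a nonsingular
    n-by-n matrix by orthogonalizing its columns with classical
    Gram-Schmidt.  One column is processed per step, so the
    factorization finishes in exactly n steps, regardless of the input.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    Q = np.zeros((n, n))
    R = np.zeros((n, n))
    for k in range(n):                      # step k handles column k
        v = A[:, k].copy()
        for j in range(k):                  # subtract projections onto earlier q_j
            R[j, k] = Q[:, j] @ A[:, k]
            v -= R[j, k] * Q[:, j]
        R[k, k] = np.linalg.norm(v)
        Q[:, k] = v / R[k, k]
    # A = QR with Q orthogonal, so A^{-1} = R^{-1} Q^T.
    return np.linalg.solve(R, Q.T)
```

The same factorization extends to the linear-system and full-rank pseudoinverse cases mentioned in the abstract, since Ax = b reduces to back-substitution on Rx = Qᵀb.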