A neural-type parallel algorithm for fast matrix inversion

Abstract
The paper introduces the orthogonalized back-propagation algorithm (OBA), a training procedure for adjusting the weights of a neural-type network used for matrix inversion. In this framework the adjustable weights correspond to the estimate of the inverse of the matrix. The algorithm is iterative, in the sense that an initial estimate of the solution is chosen and then updated according to an error measure. However, it is also a direct algorithm, since it guarantees exact convergence after n steps, independent of the initial estimate, where n is the dimension of the matrix to be inverted. The method can also be applied directly to solving linear equations and to computing the pseudoinverse of matrices with full row or column rank. From an optimization point of view, it is shown that the OBA is an optimal algorithm for minimizing a quadratic least-squares cost functional.
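The abstract does not give the OBA update rule itself, but the property it describes, an iterative scheme with orthogonalized directions that nevertheless reaches the exact answer in n steps, is shared by conjugate-direction methods. The sketch below is not the authors' algorithm; it illustrates the same "iterative yet direct" behavior by inverting a symmetric positive-definite matrix column by column with conjugate gradient, which converges in at most n steps in exact arithmetic. The helper names `cg_column` and `invert` are hypothetical.

```python
import numpy as np

def cg_column(A, b, n):
    # Conjugate gradient on an SPD system A x = b. The search
    # directions are mutually A-orthogonal, so in exact arithmetic
    # the exact solution is reached after at most n iterations.
    x = np.zeros(n)
    r = b - A @ x          # residual of the initial estimate
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(n):
        Ap = A @ p
        alpha = rs / (p @ Ap)   # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-12:
            break
        p = r + (rs_new / rs) * p  # A-orthogonalize the next direction
        rs = rs_new
    return x

def invert(A):
    # Invert A by solving A x_i = e_i for each unit vector e_i;
    # the solution columns assemble the inverse estimate.
    n = A.shape[0]
    return np.column_stack([cg_column(A, e, n) for e in np.eye(n)])

A = np.array([[4.0, 1.0], [1.0, 3.0]])  # small SPD test matrix
X = invert(A)
print(np.allclose(A @ X, np.eye(2)))  # True
```

Solving A x = e_i for each column also shows why the same machinery handles general linear systems A x = b directly, as the abstract notes.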
