The solution of sparse linear equations by the conjugate gradient method

Abstract
The convergence properties of the conjugate gradient method are discussed in relation to those of relaxation methods and Chebyshev-accelerated Jacobi iteration when applied to the solution of large sets of linear equations with a sparse, symmetric and positive definite coefficient matrix. The conclusion is reached that its convergence rate is unlikely to be much worse than that of these methods, and may be considerably better. The conjugate gradient method may be applied either to the basic equations, unscaled or scaled, or to various transformed equations. Preconditioning, block elimination and partial elimination methods of transforming the equations are considered, and some comparative tests are given for six problems.
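As an illustration only (not taken from the paper), the following minimal Python sketch shows the basic conjugate gradient iteration for a symmetric positive definite system A x = b; the dense matrix A here merely stands in for a sparse one, since the method requires nothing beyond matrix-vector products and inner products. The function name and tolerances are assumptions for the sake of the example.

    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
        """Solve A x = b for a symmetric positive definite matrix A."""
        n = len(b)
        if max_iter is None:
            max_iter = n
        x = np.zeros(n)
        r = b - A @ x          # residual
        p = r.copy()           # initial search direction
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)       # step length along p
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:       # converged on residual norm
                break
            p = r + (rs_new / rs_old) * p   # next A-conjugate direction
            rs_old = rs_new
        return x

    # Example: a small SPD tridiagonal system (a toy stand-in for a sparse problem)
    A = (np.diag(np.full(5, 2.0))
         + np.diag(np.full(4, -1.0), 1)
         + np.diag(np.full(4, -1.0), -1))
    b = np.ones(5)
    x = conjugate_gradient(A, b)

Scaling or preconditioning, as discussed in the paper, amounts to applying the same iteration to a transformed system whose coefficient matrix has a more favourable eigenvalue distribution.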