Global Convergence Properties of Conjugate Gradient Methods for Optimization

Abstract
This paper explores the convergence of several nonlinear conjugate gradient methods implemented without regular restarts and with practical line searches, and asks under what conditions they are globally convergent for general smooth nonlinear functions. The analysis covers two classes of methods that are globally convergent on smooth, nonconvex functions. Some properties of the Fletcher-Reeves method play an important role in the first family, whereas the second family shares an important property with the Polak-Ribière method. The analysis highlights differences among various conjugate gradient methods and suggests new implementations. Numerical experiments are presented.
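To make the two families named in the abstract concrete, the sketch below shows a minimal nonlinear conjugate gradient loop in Python with the standard Fletcher-Reeves beta and a non-negative Polak-Ribière beta, max(beta_PR, 0), a truncation associated with this line of convergence analysis. This is an illustrative sketch under stated assumptions, not the paper's implementation: the function name nonlinear_cg, the Rosenbrock test problem, and the fixed-step fallback are ours, and scipy's default Wolfe line-search constants are not tuned to the conditions required by the theory.

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, variant="PR+", tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient without restarts (illustrative sketch).

    variant: "FR" (Fletcher-Reeves) or "PR+" (Polak-Ribiere with the
    non-negativity truncation beta = max(beta_PR, 0)).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search via scipy; fall back to a small fixed step
        # if the search fails to return a step length.
        alpha = line_search(f, grad, x, d)[0]
        if alpha is None:
            alpha = 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        if variant == "FR":
            beta = (g_new @ g_new) / (g @ g)             # Fletcher-Reeves
        else:
            beta = max((g_new @ (g_new - g)) / (g @ g), 0.0)  # PR+
        d = -g_new + beta * d  # conjugate gradient direction update
        x, g = x_new, g_new
    return x

# Usage on the 2-D Rosenbrock function; the minimizer is (1, 1).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

The two branches differ only in the beta formula, which is the point of contrast between the families: the Fletcher-Reeves choice depends only on gradient norms, while the Polak-Ribière choice also uses the change in the gradient and is truncated at zero here.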
