This paper studies the behaviour of a family of conjugate gradient optimization algorithms, the best known of which is probably that introduced in 1964 by Fletcher & Reeves. This family has the property that, on a quadratic function, every member generates the same set of conjugate directions, provided that an exact line search is performed at each iteration. In this paper a modification is introduced that enables this set of conjugate directions to be generated without any accurate line searches, so that the minimum of a quadratic function is found in at most (n+2) gradient evaluations. As the modification requires the storage of only two additional n-vectors, the storage advantage of conjugate gradient algorithms vis-à-vis variable metric algorithms is maintained. Finally, a numerical study is reported in which the performance of the new method is compared with that of various members of the unmodified family.
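
For concreteness, the following is a minimal sketch of the Fletcher-Reeves member of the unmodified family applied to a quadratic, with the exact line search the abstract refers to; it illustrates the quadratic-termination property, not the paper's modified method. The function name and test problem are illustrative, not from the paper.

```python
import numpy as np

def fletcher_reeves_quadratic(A, b, x0, tol=1e-10):
    """Fletcher-Reeves CG on f(x) = 0.5 x'Ax - b'x with exact line searches.

    On an n-dimensional positive-definite quadratic, exact line searches
    make the search directions A-conjugate, so the minimum is reached in
    at most n iterations (in exact arithmetic).
    """
    x = x0.astype(float)
    g = A @ x - b          # gradient of the quadratic
    d = -g                 # initial direction: steepest descent
    for _ in range(len(b)):
        if np.linalg.norm(g) < tol:
            break
        # Exact line search: alpha minimising f(x + alpha*d) in closed form.
        alpha = -(g @ d) / (d @ A @ d)
        x = x + alpha * d
        g_new = A @ x - b
        # Fletcher-Reeves beta; other members of the family differ only here,
        # yet generate the same directions when the line search is exact.
        beta = (g_new @ g_new) / (g @ g)
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative usage: a 5-dimensional quadratic is minimised in at most 5 steps.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)   # symmetric positive definite
b = rng.standard_normal(5)
x = fletcher_reeves_quadratic(A, b, np.zeros(5))
print(np.allclose(A @ x, b))  # True: the minimiser satisfies Ax = b
```

The paper's contribution is to recover this same set of conjugate directions while dispensing with the exact line search step above, at the cost of storing two additional n-vectors.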