A modified form of the Quasi-Newton family of variable metric algorithms used in function minimization is proposed that has quadratic termination without requiring linear searches. Most members of the Quasi-Newton family rely for quadratic termination on the fact that, with accurate linear searches, the directions generated form a conjugate set when the function is quadratic. For some members of the family the convergence of the sequence of approximate inverse Hessian matrices to the true inverse Hessian is also stable. With the proposed modification the same sequence of matrices and the same set of conjugate directions are generated without accurate linear searches. On a quadratic function the proposal is also related to a suggestion by Hestenes that generates the same set of conjugate directions without accurate linear searches. Both methods therefore find the minimum of an n-dimensional quadratic function in at most n+2 function and gradient evaluations. On non-quadratic functions the proposal retains the main advantages claimed for both the stable Quasi-Newton and Hestenes approaches, and it shows promise in being competitive with the most efficient unconstrained optimization algorithms currently available.
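For orientation, the following is a minimal sketch of the standard variable metric background assumed above; the symbols $H_k$, $g_k$, $\delta_k$, $\gamma_k$ and $\alpha_k$ are illustrative notation and not necessarily that of the paper. A Quasi-Newton iteration takes
\[
d_k = -H_k g_k, \qquad x_{k+1} = x_k + \alpha_k d_k, \qquad H_{k+1}\gamma_k = \delta_k,
\]
where $g_k$ is the gradient at $x_k$, $\delta_k = x_{k+1} - x_k$ and $\gamma_k = g_{k+1} - g_k$. For a quadratic $f(x) = \tfrac{1}{2}x^{T}A x - b^{T}x$ minimized with accurate linear searches (so that $g_{k+1}^{T} d_k = 0$), the commonly used members of the family generate steps satisfying the conjugacy conditions
\[
\delta_i^{T} A\, \delta_j = 0 \quad (i \neq j),
\]
and, provided the steps are linearly independent, $H_n = A^{-1}$ after $n$ steps, so the minimum is reached. The modification described here is claimed to reproduce the same matrices and conjugate directions without the accurate searches.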