Abstract
Significant advances have been made over the past decade in the development of powerful function minimization methods. Although the essential structure of each of these methods is fixed, several auxiliary features invariably remain to be selected by the user in any actual implementation, and the effectiveness of the methods can be greatly influenced by the choices made in this regard. The sensitivity of a variety of methods to three of these features is examined through a series of computational experiments. These features are (1) the type of gradient information used (exact or approximated), (2) the precision requested in the solution of the line search subproblem, and (3) the superposition, on the basic algorithm, of a policy of periodic reinitialization.
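
To make the three auxiliary features concrete, the following Python sketch (not taken from the paper; all names, defaults, and the choice of a Fletcher-Reeves conjugate-gradient scheme are illustrative assumptions) shows where each choice enters a typical implementation: the gradient may be supplied exactly or approximated by forward differences, the line search accepts a tolerance, and accumulated direction information may be discarded at a fixed restart period.

# Illustrative sketch only: a Fletcher-Reeves conjugate-gradient minimizer
# exposing the three user-selected features discussed in the abstract.
import numpy as np
from scipy.optimize import minimize_scalar

def finite_difference_gradient(f, x, h=1e-6):
    """Forward-difference approximation of the gradient of f at x."""
    g = np.empty_like(x)
    fx = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

def cg_minimize(f, x0, grad=None, line_search_tol=1e-4,
                restart_every=None, max_iter=500, gtol=1e-6):
    """Fletcher-Reeves CG with a configurable gradient source (feature 1),
    line-search precision (feature 2), and periodic restarts (feature 3)."""
    x = np.asarray(x0, dtype=float)
    # Feature 1: exact gradient if provided, otherwise a finite-difference estimate.
    gradient = grad if grad is not None else (lambda z: finite_difference_gradient(f, z))
    g = gradient(x)
    d = -g                                    # initial direction: steepest descent
    for k in range(max_iter):
        if np.linalg.norm(g) < gtol:
            break
        # Feature 2: solve the line search subproblem to the requested tolerance.
        phi = lambda a: f(x + a * d)
        res = minimize_scalar(phi, bounds=(0.0, 10.0), method="bounded",
                              options={"xatol": line_search_tol})
        x = x + res.x * d
        g_new = gradient(x)
        # Feature 3: periodic reinitialization to the steepest-descent direction.
        if restart_every is not None and (k + 1) % restart_every == 0:
            d = -g_new
        else:
            beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        g = g_new
    return x

if __name__ == "__main__":
    rosenbrock = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    print(cg_minimize(rosenbrock, [-1.2, 1.0], restart_every=2))

Varying grad (exact versus None), line_search_tol, and restart_every in a harness like this reproduces, in miniature, the kind of comparison the computational experiments carry out.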