Abstract
A generalized descent algorithm theory is developed for unconstrained minimization problems. Here a descent algorithm is defined as a computational procedure in which, at each iteration, a descent direction is determined and a single-dimensional search is made for the minimum along that direction. The theory is shown to be a generalization of the three most common descent algorithms: gradient, conjugate gradient, and Fletcher-Powell. Execution of the single-dimensional search can be computationally time-consuming. Two additional algorithms are presented which reduce or eliminate single-dimensional search time. The first is a modification of Davidon's Variance Algorithm and requires only a minimal single-dimensional search. The second is a direct method for minimizing a special class of quadratic functions.
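To make the iteration structure concrete, the following is a minimal sketch of a generic descent step of the kind the abstract describes: pick a descent direction, then perform a single-dimensional search along it. It is not the paper's algorithm; the steepest-descent direction, the use of scipy's scalar minimizer for the line search, and all names and tolerances are illustrative assumptions.

```python
# Sketch of a generic descent iteration (assumed illustration, not the paper's method).
import numpy as np
from scipy.optimize import minimize_scalar

def descent_minimize(f, grad, x0, iters=100, tol=1e-8):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # descent direction (gradient method here; CG or Fletcher-Powell would choose differently)
        # single-dimensional search for the minimum along d
        alpha = minimize_scalar(lambda a: f(x + a * d)).x
        x = x + alpha * d
    return x

if __name__ == "__main__":
    # Quadratic test function f(x) = 0.5 x'Qx - b'x, minimized at Q^{-1} b
    Q = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    f = lambda x: 0.5 * x @ Q @ x - b @ x
    grad = lambda x: Q @ x - b
    print(descent_minimize(f, grad, np.zeros(2)))
```

The line-search step is the computationally expensive part the abstract refers to; the two additional algorithms it presents are motivated by reducing or eliminating that cost.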
