Computing Gradients in Large-Scale Optimization Using Automatic Differentiation

Abstract
The accurate and efficient computation of gradients of partially separable functions, that is, functions expressible as a sum of element functions each of which depends on only a few variables, is central to the solution of large-scale optimization problems, because this structure is ubiquitous in large-scale applications. We describe two approaches for computing gradients of partially separable functions via automatic differentiation. In our experiments we employ the ADIFOR (automatic differentiation of Fortran) tool and the SparsLinC (sparse linear combination) library. We use applications from the MINPACK-2 test problem collection to compare the numerical reliability and computational efficiency of these approaches with hand-coded derivatives and with approximations based on differences of function values. We conclude that automatic differentiation is the method of choice: it yields code for the efficient computation of the gradient without the need for tedious hand-coding.
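
Although the experiments in the paper use ADIFOR-generated Fortran, the underlying idea is easy to illustrate with a minimal sketch in a modern automatic differentiation framework. The example below assumes JAX; the chained element functions f_i(x) = (x_{i+1} - x_i^2)^2 are an illustrative stand-in, not a problem from the MINPACK-2 collection.

    # Minimal sketch (not the paper's ADIFOR/Fortran code): reverse-mode
    # automatic differentiation of a partially separable function in JAX.
    import jax
    import jax.numpy as jnp

    def f(x):
        # Partially separable: f(x) = sum_i f_i(x), where each element
        # function f_i(x) = (x[i+1] - x[i]**2)**2 depends on only two
        # adjacent components of x (a chained, Rosenbrock-like term).
        return jnp.sum((x[1:] - x[:-1] ** 2) ** 2)

    grad_f = jax.grad(f)   # exact gradient via automatic differentiation

    x = jnp.linspace(0.0, 2.0, 1000)
    g = grad_f(x)          # g[i] = df/dx_i, with no hand-coded derivatives

Because each element function involves only two components of x, each element gradient is sparse; exploiting that sparsity when accumulating the element gradients is, broadly, the role the SparsLinC library plays in the Fortran setting described above.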
