Abstract
The problem considered in this dissertation is that of finding local minimizers of a function subject to general nonlinear inequality constraints, when first and perhaps second derivatives are available. The methods studied belong to the class of SQP algorithms exemplified by the code NPSOL, which was developed at the Systems Optimization Laboratory, Stanford University. The goal of the dissertation is to develop SQP algorithms that allow some flexibility in their design. Specifically, we are interested in introducing modifications that enable the algorithms to solve large-scale problems efficiently. The following issues are considered in detail. Instead of trying to obtain the search direction as a minimizer of the QP subproblem, the solution process is terminated after a limited number of iterations. Suitable termination criteria are defined that ensure convergence for an algorithm that uses a quasi-Newton approximation to the full Hessian. Theorems concerning the rate of convergence are also given. For many problems the reduced Hessian is considerably smaller than the full Hessian; consequently, there are considerable practical benefits to be gained by requiring an approximation only to the reduced Hessian. Theorems are proved concerning the convergence and rate of convergence of an algorithm that uses a quasi-Newton approximation to the reduced Hessian when early termination of the QP subproblem is enforced.
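The central idea of early termination can be illustrated with a minimal sketch. Everything below is a hypothetical toy, not code from the dissertation or from NPSOL: a two-variable problem with one linear inequality constraint, a BFGS approximation to the full Hessian, and a QP subproblem that is deliberately solved only approximately, by a fixed small number of projected-gradient iterations, rather than to optimality.

```python
import numpy as np

def solve_qp_approx(B, g, a, r, n_iters=5):
    """Approximately minimize 0.5 d'Bd + g'd subject to a'd <= r,
    using a limited number of projected-gradient iterations
    (early termination of the QP subproblem)."""
    d = np.zeros_like(g)
    step = 1.0 / np.linalg.norm(B, 2)        # 1/L step for the quadratic model
    for _ in range(n_iters):
        d = d - step * (B @ d + g)           # gradient step on the QP objective
        viol = a @ d - r
        if viol > 0:                         # exact projection onto the halfspace
            d = d - (viol / (a @ a)) * a
    return d

def sqp(x0, max_outer=50, tol=1e-8):
    # Toy problem (invented for illustration):
    #   minimize (x0 - 2)^2 + (x1 - 2)^2  subject to  x0 + x1 <= 2,
    # whose minimizer is x* = (1, 1).
    a, b = np.array([1.0, 1.0]), 2.0
    grad = lambda x: 2.0 * (x - np.array([2.0, 2.0]))
    x = np.asarray(x0, dtype=float)
    B = np.eye(2)                            # quasi-Newton approximation to the Hessian
    for _ in range(max_outer):
        g = grad(x)
        d = solve_qp_approx(B, g, a, b - a @ x)
        if np.linalg.norm(d) < tol:          # approximate QP step is negligible: stop
            break
        s = d
        x = x + d                            # full step (no line search, for brevity)
        y = grad(x) - g
        if s @ y > 1e-12:                    # standard BFGS update of B
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
    return x
```

On this toy problem the iteration reaches the constrained minimizer even though no QP subproblem is ever solved exactly, which is the practical point of the termination criteria studied in the dissertation: the inner work per outer iteration is bounded, an important property for large-scale problems.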